According to Jack Weast, senior principal engineer and the chief systems architect of Intel’s Autonomous Driving Group, “People are downright scared of robot cars.” At least they are until they gain familiarity with what autonomous vehicles can do. The Intel Trust Interaction Study identified seven areas of concern:
Now, for the first time, a team of scientists led by Professor Simon Schultz and Dr Luca Annecchino at Imperial College London has developed a robot and computer programme that can guide tiny measuring devices called micropipettes to specific neurons in the brains of live mice and record electrical currents, all without human intervention. It is the first fully automated platform reported to do so.
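By way of illustration only: automated patch-clamp systems commonly detect a nearby cell from a sudden rise in pipette resistance as the tip advances through tissue. The Python sketch below simulates that detection loop; the SimulatedRig class, its thresholds, and the resistance model are invented stand-ins for a real micromanipulator and amplifier, not the Imperial team's software.

```python
# Minimal sketch of resistance-guided pipette "hunting".
# All hardware is simulated; a real rig would drive a motorised
# micromanipulator and read a patch amplifier instead.
import random

class SimulatedRig:
    """Stand-in for a pipette stage plus amplifier (hypothetical)."""

    def __init__(self, cell_depth_um=80.0):
        self.depth_um = 0.0
        self.cell_depth_um = cell_depth_um
        self.baseline_mohm = 5.0

    def step_down(self, um=2.0):
        """Advance the pipette tip deeper into the tissue."""
        self.depth_um += um

    def read_resistance_mohm(self):
        """Resistance rises sharply once the tip presses on a membrane."""
        past_cell = max(0.0, self.depth_um - self.cell_depth_um)
        return self.baseline_mohm + 3.0 * past_cell + random.gauss(0.0, 0.1)

def hunt_for_neuron(rig, max_depth_um=200.0, jump_mohm=1.5):
    """Step downward until resistance exceeds baseline + jump_mohm."""
    baseline = rig.read_resistance_mohm()
    while rig.depth_um < max_depth_um:
        rig.step_down()
        if rig.read_resistance_mohm() > baseline + jump_mohm:
            return rig.depth_um  # likely contact with a cell
    return None

if __name__ == "__main__":
    depth = hunt_for_neuron(SimulatedRig())
    print(f"Contact at {depth:.1f} um" if depth else "No cell detected")
```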
But the Robby is loaded with advanced technology, ranging from video cameras to ultrasonic sensors to inertial motion devices, and uses artificial intelligence not only to find its way around city streets, but to learn.

Rui Li, one of the co-founders of Robby Technologies, a start-up based in a Palo Alto house not far from the one where Mark Zuckerberg developed Facebook, said the devices use "advanced AI technology so it can react differently to different situations, and learn. It becomes smarter and smarter."

When does it achieve self-awareness?
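Robby's software is proprietary, but the reactive layer of such a robot can be pictured as a loop that reads its range sensors and picks a motion command. The toy sketch below, with simulated ultrasonic readings and invented thresholds, illustrates that idea only; it is not the company's implementation.

```python
# Toy obstacle-avoidance loop for a sidewalk robot.
# Sensor values are simulated; a real robot would read its
# ultrasonic rangefinders and cameras instead.
import random

def read_ultrasonic_cm():
    """Simulated left/center/right range readings in centimeters."""
    return {d: random.uniform(20, 300) for d in ("left", "center", "right")}

def choose_action(ranges, stop_cm=40, slow_cm=100):
    """Pick a motion command from the nearest obstacle in each sector."""
    if ranges["center"] < stop_cm:
        # Blocked ahead: turn toward whichever side has more clearance.
        return "turn_left" if ranges["left"] > ranges["right"] else "turn_right"
    if ranges["center"] < slow_cm:
        return "slow"
    return "forward"

if __name__ == "__main__":
    for _ in range(5):
        ranges = read_ultrasonic_cm()
        print({k: round(v) for k, v in ranges.items()}, "->", choose_action(ranges))
```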
The Mako System allows doctors to map out the surgery using a virtual three-dimensional model based on the patient's CT scan and to make adjustments during surgery to provide a better-fitting knee replacement.
Chinese technology firm Qihan launched the Sanbot Nano on Thursday, equipped with the Amazon voice assistant featured in its Echo speaker.

The 2.7-foot robot will go on sale in October for $2,800 and will be available in English and German, the languages Alexa currently understands.

Qihan's Sanbot Nano is equipped with 50 sensors that help it avoid obstacles, hear voices, and recognize when someone enters the room. Users can access all the features that Alexa enables, such as controlling lights in the home or ordering pizza.
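Qihan has not published the Nano's internals, so purely as a generic sketch of how readings from many sensors might be reduced to the high-level events described above (someone entering the room, an obstacle ahead, a voice), consider the following; the sensor names and thresholds are hypothetical, not Qihan's actual sensor suite or API.

```python
# Generic sketch: folding raw sensor readings into high-level events.
# The field names and thresholds are invented for illustration.

def detect_events(readings):
    """Map a dict of raw readings to a list of high-level events."""
    events = []
    if readings.get("pir_motion", 0) > 0.5:       # passive-infrared motion
        events.append("person_entered_room")
    if readings.get("sonar_front_cm", 999) < 50:  # close obstacle ahead
        events.append("obstacle_ahead")
    if readings.get("mic_level_db", 0) > 60:      # loud enough to be speech
        events.append("voice_detected")
    return events

if __name__ == "__main__":
    sample = {"pir_motion": 0.9, "sonar_front_cm": 35, "mic_level_db": 72}
    print(detect_events(sample))  # all three events fire
```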
Recently, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have gotten closer to making this type of request easier: in a new paper, they present an Alexa-like system that allows robots to understand a wide range of commands that require contextual knowledge about objects and their environments. They've dubbed the system "ComText," for "commands in context."
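Conceptually, the system gives the robot a memory of previously stated facts that later commands can refer back to. The sketch below is a toy version of that idea, assuming a trivial "pick up X" command form; it is an illustration of the concept, not the CSAIL team's code.

```python
# Sketch of "commands in context": remember declarative facts about
# objects, then use them to resolve references in later commands.

class ContextMemory:
    def __init__(self):
        self.facts = {}  # label -> object name, e.g. "my snack" -> "apple"

    def tell(self, label, obj):
        """Store a declarative fact: 'the apple is my snack'."""
        self.facts[label] = obj

    def resolve(self, phrase):
        """Ground a referring phrase against remembered facts."""
        return self.facts.get(phrase)

def execute(command, memory):
    # Extremely simplified parser: only handles "pick up <phrase>".
    phrase = command.removeprefix("pick up ").strip()
    obj = memory.resolve(phrase) or phrase  # fall back to the literal phrase
    return f"grasping the {obj}"

if __name__ == "__main__":
    mem = ContextMemory()
    mem.tell("my snack", "apple")            # declarative fact
    print(execute("pick up my snack", mem))  # -> grasping the apple
```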