They say humans and robots see the world differently.
That makes sense: to a robot, a soda can and a human can look like the same kind of object.
And that is exactly the problem.
Via the BBC.
Robotics: How machines see the world
http://www.bbc.com/future/story/20140822-the-odd-way-robots-see-the-world
Autonomous bots like self-driving cars don’t see the world like us. Frank Swain discovers why this could be a problem.
Can you tell the difference between a human and a soda can? For most of us, distinguishing an average-sized adult from a five-inch-high aluminium can isn’t a difficult task. But to an autonomous robot, they can both look the same. Confused? So are the robots.
Last month, the UK government announced that self-driving cars would hit the roads by 2015, following in the footsteps of Nevada and California. Soon autonomous robots of all shapes and sizes – from cars to hospital helpers – will be a familiar sight in public. But in order for that to happen, the machines need to learn to navigate our environment, and that requires a lot more than a good pair of eyes.
Robots like self-driving cars don’t only come equipped with video cameras for seeing what we can see. They can also have ultrasound – already widely used in parking sensors – as well as radar, sonar, laser, and infrared. These machines are constantly sending out flashes of invisible light and sound, and carefully studying the reflections to see their surroundings – such as pedestrians, cyclists and other motorists. You’d think that would be enough to get a comprehensive view, but there’s a big difference between seeing the world and understanding it. …
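The "sending out flashes and studying the reflections" the article describes is the time-of-flight principle: the sensor times how long a pulse takes to bounce back and converts that delay into a distance. A minimal sketch, assuming idealized sensors; the two wave speeds are standard physical constants, and the function name and example delays are illustrative, not from any real sensor API.

```python
# Time-of-flight ranging: distance from the round-trip delay of a pulse.
# Speeds are physical constants; everything else here is illustrative.

SPEED_OF_SOUND_M_S = 343.0          # sound in air at ~20 °C (ultrasound/sonar)
SPEED_OF_LIGHT_M_S = 299_792_458.0  # light in vacuum (radar/laser/infrared)

def range_from_echo(round_trip_s: float, wave_speed_m_s: float) -> float:
    """One-way distance to the reflector.

    The pulse travels out and back, so the range is half the
    total path covered during the round-trip delay.
    """
    return wave_speed_m_s * round_trip_s / 2.0

# An ultrasonic parking sensor hearing an echo after 6 ms:
print(range_from_echo(0.006, SPEED_OF_SOUND_M_S))   # ≈ 1.03 m

# A laser rangefinder seeing a return after 100 ns:
print(range_from_echo(100e-9, SPEED_OF_LIGHT_M_S))  # ≈ 15 m
```

Note the asymmetry this implies: sound-based sensors can use leisurely millisecond timing but only work at short range, while light-based sensors must resolve nanoseconds to measure even a few metres — which is part of why these robots carry several different sensor types at once.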