Let’s Build Robots That Are as Smart as Babies

Let’s face it: Robots are dumb. At best they are idiot savants, ideal for doing one thing really well. Generally, even those robots require specific environments in which to do that one thing. This is why autonomous cars and robots for home health care are so hard to build: they will need to react to an uncountable number of situations, and they’ll need a generalized understanding of the world in order to navigate them all.
 
Babies as young as two months already perceive that an unsupported object will fall, and five-month-olds know that materials such as sand and water pour from a container rather than plopping out as a single chunk. Robots lack these understandings, which hampers them when they try to navigate the world without a prescribed task and set of movements.
 
However, we could see robots with a generalized knowledge of the world (and the processing power required to wield it) thanks to the video-game industry. Researchers are bringing physics engines — the software that provides real-time physical interactions in complex video-game worlds — to robotics. The goal is to give robots an understanding that lets them learn about the world the way babies do.
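At its simplest, the prediction a physics engine supplies is just forward integration of Newtonian dynamics. The sketch below is a minimal, plain-Python illustration of the "unsupported objects fall" intuition described above; it has no physics-engine dependency, and the gravity constant and time step are illustrative choices, not parameters from any system mentioned in this article:

```python
# Minimal forward simulation of an unsupported object under gravity.
# A production physics engine does this with full rigid-body dynamics,
# collision detection, and friction; this sketch keeps only the core
# idea: predict where an object will end up before acting.

GRAVITY = -9.81  # m/s^2, acting along the vertical axis
DT = 0.01        # integration time step in seconds

def predict_fall(z0, v0=0.0, floor=0.0):
    """Step a 1-D falling object until it reaches the floor.

    Returns the predicted time (in seconds) at which it lands.
    Uses semi-implicit Euler integration: velocity first, then position.
    """
    z, v, t = z0, v0, 0.0
    while z > floor:
        v += GRAVITY * DT  # update velocity from gravity
        z += v * DT        # update position from the new velocity
        t += DT
    return t

# A robot holding a block 1 m above the floor can predict that, if
# released, the block lands in roughly 0.45 s -- computed ahead of time,
# before committing to any action.
t_land = predict_fall(z0=1.0)
```

A real engine wraps the same idea in full rigid-body dynamics and contact resolution, but the payoff is identical: the robot can anticipate a physical outcome before it commits to a movement.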
 
Giving robots a baby’s sense of physics helps them navigate the real world and can even save on computing power, according to Lochlainn Wilson, the CEO of SE4, a Japanese company developing robots that could operate on Mars. SE4 plans to sidestep the latency caused by the distance between Earth and Mars by building robots that can operate on their own for a few hours before receiving more instructions from Earth.
 
Wilson says his company uses simple physics engines, such as PhysX, to help build more-independent robots. He adds that if you can tie a physics engine to a coprocessor on the robot, the real-time basic physics intuitions won’t take compute cycles away from the robot’s main processor, which is often focused on a more complicated task.
 
Wilson’s firm occasionally still turns to a traditional graphics engine, such as Unity or the Unreal Engine, to handle the demands of a robot’s movement. In some instances, however — such as a robot accounting for friction or understanding force — you really need a robust physics engine rather than a graphics engine that merely simulates a virtual environment, Wilson says. For his projects, he often turns to the open-source Bullet physics engine built by Erwin Coumans, who is now an employee at Google.
 
Bullet is a common physics-engine option, but it isn’t the only one out there. Nvidia Corp., for example, has realized that its gaming and physics engines are well placed to handle the computing demands essential to robots. In a lab in Seattle, Nvidia is working with teams from the University of Washington to build kitchen robots, fully articulated robot hands, and more, all equipped with Nvidia’s tech.
 
Such a robot could also understand that less pressure is needed to grasp something like a cardboard box of Cheez-It crackers than something more durable like an aluminum can of tomato soup. Nvidia’s silicon has already helped advance the fields of artificial intelligence and computer vision by making it possible to process multiple decisions in parallel. It’s possible that the company’s new focus on virtual worlds will help advance the field of robotics and teach robots to think like babies.
 
