A participant in the IAEA robotics challenge looks at her robot challenger – photo by Leslie Overs, Data61

Machines are blind when they’re born – we create them with only enough sensory capacity to do what they need to do. Your blender, for instance, can’t sense touch. Your washing machine (probably) can’t see, and your GoPro can’t sense heat.

As the development of mechanical robots (machines capable of performing physical tasks automatically) expands, it's becoming increasingly necessary to imbue these mobile, semi-autonomous, lightweight robots with the ability to soak up high-fidelity information about the world through a range of devices and processing techniques.

Our team at Data61 has been working on equipping robots with sensory payloads that are designed for specific tasks, such as inspection, or providing information to the system that’s guiding the robot’s movements.

Nuclear robot wars

Recently, our Queensland Centre for Advanced Technologies (QCAT) hosted the International Atomic Energy Agency (IAEA) robotics challenge. Teams from nine countries competed to test the autonomy and sensing capabilities of the machines they'd brought with them – navigating complex, changing environments through adaptive decision-making, and detecting small changes in their surroundings.

Some bots mapped rooms using Light Detection and Ranging (LIDAR), and others floated on a shimmering pool of water above a simulation of nuclear control rods. It'll be a while before the results are assessed and the winners announced, but the challenge was a brilliant demonstration of the extremely high standards of sensing required in modern robots. These machines are highly specialised and finely tuned, and will be used to complement the skills of human inspectors in scenarios such as inspecting nuclear waste or managing active nuclear power plants.
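
To give a feel for what 'mapping a room with LIDAR' involves at the lowest level: the sensor reports a distance for each beam at a known angle, and the first step in building a map is converting those polar measurements into Cartesian points. Here's a minimal Python sketch of that conversion – the function and parameter names are illustrative, not drawn from any team's software.

```python
import numpy as np

def scan_to_points(ranges, angle_min, angle_increment):
    """Convert one 2D LIDAR sweep (polar ranges) into Cartesian points.

    ranges          -- distances in metres, one per beam (inf = no return)
    angle_min       -- angle of the first beam, in radians
    angle_increment -- angular step between beams, in radians
    """
    ranges = np.asarray(ranges, dtype=float)
    angles = angle_min + angle_increment * np.arange(len(ranges))
    valid = np.isfinite(ranges)  # discard beams that saw nothing
    return np.column_stack((ranges[valid] * np.cos(angles[valid]),
                            ranges[valid] * np.sin(angles[valid])))

# Accumulating such scans from known sensor poses yields a
# floor-plan-style point cloud of the room.
```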

The Martian

Zee awakens, via the Data61 Autonomous Systems Lab

Our hexapods walk on six legs, each with spider-like joints, and serve as a platform for a streaming camera and a real-time 3D scanning LIDAR. This slightly adorable robot can change the direction of the camera using the motors in each leg joint – making it perfectly suited to inspection tasks.
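
To illustrate the idea (with made-up numbers – this isn't Zee's real geometry or control code), here's a small Python sketch of how a hexapod can aim a body-mounted camera: tilt the body by adjusting how far each leg extends while the feet stay planted.

```python
import numpy as np

# Hypothetical hip positions on the body, in metres (x forward, y left,
# z up). These values are invented for illustration only.
HIP_POSITIONS = np.array([
    [ 0.25,  0.15, 0.0], [ 0.25, -0.15, 0.0],   # front pair
    [ 0.00,  0.20, 0.0], [ 0.00, -0.20, 0.0],   # middle pair
    [-0.25,  0.15, 0.0], [-0.25, -0.15, 0.0],   # rear pair
])

def leg_extensions(pitch, roll, stance_height=0.18):
    """Vertical hip-to-foot distance per leg so the body (and a camera
    fixed to it) tilts by `pitch`/`roll` radians while the feet stay put.

    Positive pitch tips the nose down, so the front legs shorten;
    positive roll lifts the left side, so the left legs lengthen.
    """
    dz = (-HIP_POSITIONS[:, 0] * np.sin(pitch)
          + HIP_POSITIONS[:, 1] * np.sin(roll))
    return stance_height + dz
```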

It can also traverse completely novel environments with relative ease. As a demonstration, the Data61 robotics team brought Zee to the Powerhouse Museum during the 2017 SPARK festival, and took some time to plant it in their simulated Mars environment. The video below features Ryan Steindl from the Queensland robotics team, chatting about Zee’s capabilities and energy management:

Flying eyes and a SLAM dunk

Much of the way we humans soak up information from the world is taken for granted – our biological senses evolved over millions of years, and for the most part this information is collected and processed by our brains without conscious strain or effort. For the robotic tools we're designing to perform dangerous, dirty and undesirable tasks, we also need to imbue them with some degree of autonomy, to ensure they're capable of dealing with situations where they're disconnected from their human operators.

Our autonomous aerial vehicle carries a 3D LIDAR mapping payload and uses a technique called 'Simultaneous Localisation and Mapping' (SLAM) to create a 3D point cloud of its environment in real time, which feeds back into the vehicle's navigation system. You can launch this system – dubbed Hovermap – in an area with no GPS coverage, and it can navigate and analyse without human intervention.
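
To make the 'simultaneous' part a little more concrete: the core operation in LIDAR SLAM is scan matching – estimating how the sensor has moved by aligning a new scan against earlier ones, so every scan can be stitched into one consistent point cloud. The Python sketch below implements Iterative Closest Point (ICP), a classic scan-matching algorithm, in 2D with NumPy. It illustrates the principle only; it is not Hovermap's actual pipeline, and the names are our own.

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Minimal 2D Iterative Closest Point: align point cloud `src`
    to `dst`, returning rotation R (2x2) and translation t (2,)
    such that src @ R.T + t approximately overlays dst."""
    R, t = np.eye(2), np.zeros(2)
    cur = src.copy()
    for _ in range(iters):
        # 1. Match each point to its nearest neighbour (brute force).
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # 2. Best-fit rigid transform for these matches (Kabsch/SVD).
        mu_c, mu_m = cur.mean(0), matched.mean(0)
        U, _, Vt = np.linalg.svd((cur - mu_c).T @ (matched - mu_m))
        dR = (U @ Vt).T
        if np.linalg.det(dR) < 0:          # guard against reflections
            Vt[-1] *= -1
            dR = (U @ Vt).T
        dt = mu_m - dR @ mu_c
        # 3. Apply the increment and accumulate the total transform.
        cur = cur @ dR.T + dt
        R, t = dR @ R, dR @ t + dt
    return R, t

# Example: dst is src rotated 10 degrees and shifted; ICP recovers that.
rng = np.random.default_rng(0)
src = rng.random((100, 2))
a = np.radians(10)
true_R = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
dst = src @ true_R.T + np.array([0.5, -0.2])
R, t = icp_2d(src, dst)
```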

Hovermap recently completed the world's first autonomous underground flight inside a mine – a feat covered by the Wall Street Journal. If it sounds familiar, you might be thinking of the scene in Prometheus, Ridley Scott's prequel to his film Alien, in which a remote drone is used to map mysterious alien tunnels:

3D mapping from the movie Prometheus.

Finding the right balance between human intervention and real-time autonomous decision-making in these new machines is a tricky challenge – too much autonomy and the risk of mistakes rises; too little, and humans end up performing menial, dangerous tasks that a machine could otherwise handle.

Higher quality information about the state of the world around these machines makes finding that balance easier – and it's something we keep a close eye on at Data61.