The Australian National University NICTA
InsectBot - Research


Current InsectBot Research

3D Docking - Chris McCarthy - 2008

2D docking at a 67-degree angle

3D Centring - Luke Cole - 2008

Figures: 3D centring with no ceiling (top view); 3D centring; 3D centring (mapped flow on sphere)

Navigating a robot through confined three-dimensional spaces, such as a helicopter flying within a building containing rooms and corridors, presents obvious difficulties. To address some of them, this work experiments with a biologically inspired technique that uses optical flow to perform three-dimensional centring in corridor-like environments. The experiments are performed on an omni-directional mobile robot with vertical motion, on which two fish-eye cameras are mounted to provide almost 360-degree vision.
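The biologically inspired centring idea can be sketched in a few lines: like a bee flying down a corridor, the robot moves so that the average optical-flow magnitude on opposite sides of its view stays balanced. This is a minimal illustrative sketch, not the project's implementation; all function names and the gain parameter are assumptions.

```python
def centring_command(left_flow, right_flow, gain=1.0):
    """Lateral velocity command from side flow magnitudes.

    left_flow, right_flow: mean optical-flow magnitudes (pixels/frame)
    measured in the left and right halves of the view. Flow is larger on
    the nearer side, so a positive command moves the robot to the right,
    away from a close left wall.
    """
    total = left_flow + right_flow
    if total == 0.0:
        return 0.0  # no flow, no correction
    # Normalised flow imbalance: +1 when all flow is on the left.
    imbalance = (left_flow - right_flow) / total
    return gain * imbalance


def centring_command_3d(left, right, bottom, top, gain=1.0):
    """Apply the same balance vertically to obtain 3D centring."""
    return (centring_command(left, right, gain),    # lateral command
            centring_command(bottom, top, gain))    # vertical command
```

Extending the lateral balance to the floor/ceiling pair is what makes the behaviour three-dimensional, which is why the project uses cameras with near-spherical coverage.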

Teleoperation - Dr. Robert Mahony, Dr. Felix Schill, Dr. Peter Corke, Luke Cole - 2008

Figures: InsectBot with the Falcon haptic joystick; teleoperation using the Falcon haptic joystick

This work proposes the use of optical flow from a moving robot to provide force feedback to an operator's joystick, facilitating collision-free teleoperation. Optical flow is measured by a pair of wide-angle cameras on board the vehicle and used to generate a virtual environmental force that is reflected to the user through the joystick, as well as fed back into the control of the vehicle. We show that the proposed control is dissipative and prevents the vehicle from colliding with the environment, while also giving the operator a natural feel for the remote environment. Experimental results are provided on the InsectBot holonomic vehicle platform.
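The coupling described above can be sketched as follows: since optical flow scales inversely with distance, large flow on one side signals a nearby obstacle, and a virtual repulsive force derived from the flow imbalance is both reflected to the joystick and mixed into the velocity command. This is a hedged sketch under simplified 1D assumptions; the function names, gains, and damping term are illustrative, not the authors' controller.

```python
def virtual_force(left_flow, right_flow, k_env=0.5):
    """Repulsive lateral force from the flow imbalance.

    Positive pushes the vehicle (and the operator's hand) to the right,
    away from a near left-side obstacle.
    """
    return k_env * (left_flow - right_flow)


def teleop_step(stick_input, left_flow, right_flow, velocity,
                k_env=0.5, damping=0.2):
    """One control step: returns (joystick feedback force, velocity command).

    The same environmental force is felt by the operator and applied to
    the vehicle; the damping term only removes energy, which is the
    intuition behind the dissipativity claim in the text.
    """
    f_env = virtual_force(left_flow, right_flow, k_env)
    feedback = f_env - damping * velocity            # reflected to joystick
    command = stick_input + f_env - damping * velocity  # sent to vehicle
    return feedback, command
```

With balanced flow and zero velocity the operator feels nothing and the stick command passes through unchanged; as the vehicle nears a wall, the growing flow imbalance pushes back on both the hand and the vehicle.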

Real-time motion recovery - Rebecca Dengate

The focus of this project is to develop assistive devices for individuals with severe and profound vision impairment resulting from diseases such as Age-related Macular Degeneration and Retinitis Pigmentosa. We describe focus groups being conducted to understand these users' needs. To assist with tasks such as navigation and obstacle avoidance for an individual who is walking, knowledge of self-motion is essential. In this context we present a new implementation of a wide-angle camera visual motion recovery algorithm suitable for use on a low-cost, low-power, light-weight wearable sensing device. For wearable sensing, camera paths are far more erratic than for ground-based vehicles such as wheeled robots or cars. The weight of computing, cameras and batteries is also a major issue.
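The core computation in motion recovery can be illustrated with a toy version of the problem: assuming a calibrated camera, pure translation, and known point depths, each optical-flow vector is linear in the camera translation (tx, ty, tz), so a least-squares fit recovers it. This is a simplified sketch, not the project's algorithm, which must also handle rotation, unknown depth, and erratic wide-angle camera paths.

```python
import numpy as np

def recover_translation(points, depths, flows):
    """Least-squares camera translation from optical flow.

    points: list of (x, y) normalised image coordinates
    depths: list of scene depths Z for each point
    flows:  list of measured flow vectors (u, v), where for pure
            translation u = (x*tz - tx)/Z and v = (y*tz - ty)/Z.
    Returns the estimated (tx, ty, tz).
    """
    rows, rhs = [], []
    for (x, y), z, (u, v) in zip(points, depths, flows):
        # Each flow component contributes one linear equation in t.
        rows.append([-1.0 / z, 0.0, x / z]); rhs.append(u)
        rows.append([0.0, -1.0 / z, y / z]); rhs.append(v)
    t, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return t
```

Two or more points already over-determine the three unknowns; in practice many noisy flow vectors are fit at once, which is why a low-power implementation of the linear algebra matters for a wearable device.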

Visually guided homing - John Lim

Coming Soon

Future InsectBot Research

  • Graze Landing