This is a low-cost, self-contained add-on module for existing third-party standard power wheelchairs. It can learn the layout of a building and thereafter autonomously navigate from any location in the building to another without user assistance. In buildings where it has not been taught the layout, it can still assist the user: driving autonomously through doorways, travelling down long corridors, docking near a desk or table, or being summoned to the user's bedside from its charger location. An easy-to-use graphical user interface (GUI) allows the user to control the chair manually (as usual) or in one of five autonomous and assistive modes:
1. Autonomous Navigation: The chair is taught the layout of the building by driving it around once. Thereafter it can go autonomously from any location to any other at the user's request, for example when the user tires of manual control or wishes to engage in other activities such as texting or reading.
2. Doorway assistance: No map or prior set-up is required, so this mode can be used anywhere in any building to help the user pass through doors. It takes the chair smoothly through the door even under extreme conditions where the chair is not properly lined up with the doorway, just as seen in the autonomous navigation video.
3. Corridor assistance: No map or prior set-up is required, so this mode can be used anywhere in any building to help the user travel along long corridors. It takes the chair smoothly through the corridor and adapts its speed to the flow of traffic.
4. Summon: The chair can self-navigate from its charging station to the user's bedside, or to another location, when the user is ready.
5. Docking: When approaching a desk or table, the chair can be asked to maneuver into a safe position relative to it without user intervention.
Throughout this video demo, the chair is placed under a series of challenging conditions.
The navigation module is custom-packaged to the aesthetic and functional preferences of each wheelchair manufacturer; this demo video shows the generic "un-packaged" engineering development version. Prior to making this video, the chair was manually guided through all the corridors and rooms of the building to create and save a map in its memory. The map can be seen on the GUI displayed on the MacBook laptop in the video. Normally, this GUI would appear on a touch-screen tablet of the OEM's choice.
The home screen of the GUI provides the five self-driving options explained above. From a computational perspective, the most challenging of these is autonomous navigation to an arbitrary distant location in a dynamic environment with moving objects. In this video the chair is directed to use Autonomous Navigation Mode to go from the workshop to the end of a series of corridors. The robot's progress is displayed in the GUI, which shows its location on the map. Once the robot reaches its destination, the GUI again displays the whole map of the building floor in both 2D and 3D so that the user can choose a new destination or return to the home screen for more options. The GUI is always customized to the requirements of each OEM, since preferences for its arrangement vary widely.
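To illustrate the kind of route planning that navigating a learned map involves (the actual Cyberworks planner is not described in this document; the function name and the sample map below are purely hypothetical), here is a minimal A* search over a 2D occupancy grid:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 2D occupancy grid (0 = free, 1 = obstacle).

    A simplified illustration only; a real wheelchair planner would also
    handle the chair's footprint, kinematics, and moving obstacles.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, [start])]  # (f, g, node, path so far)
    visited = set()
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(
                    open_set,
                    (g + 1 + h((nr, nc)), g + 1, (nr, nc), path + [(nr, nc)]),
                )
    return None  # no route between start and goal

# Example: plan a route around a wall in a toy 4x4 map
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
route = astar(grid, (0, 0), (3, 3))
```

In a real system the occupancy grid would come from the map built during the walkthrough, and the resulting route would be handed to a local controller that tracks it while avoiding moving obstacles.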
Cyberworks has developed the first realistic 3D virtual-reality simulator for self-driving wheelchairs. It allows Cyberworks to achieve extraordinary levels of robustness by testing its navigation systems in hundreds of real-world environments, using real sensory data, without leaving the lab. The Sim uses data from real buildings (generated by walkthroughs with an inexpensive RGB-D sensor) and models the actual mechanics and kinematics of existing wheelchairs, allowing Cyberworks to duplicate the slippage, turning radius, etc. of those machines. The Cyberworks Sim runs custom versions of the core autonomous navigation software adapted to the specific kinematics of each simulated wheelchair, which will accelerate the Cyberworks R&D program many fold.
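As a sketch of the kind of kinematic model such a simulator is built on (the specific Cyberworks models are not described here; the function and parameter values below are hypothetical), a basic unicycle-model pose update for a differential-drive chair looks like this:

```python
import math

def diff_drive_step(x, y, theta, v, omega, dt):
    """Advance a differential-drive pose (x, y, heading) by one timestep.

    Idealized unicycle kinematics: v is forward speed (m/s), omega is
    turn rate (rad/s). A simulator matching a real chair would add its
    measured wheelbase, wheel slippage, and turning-radius limits.
    """
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Drive straight for 1 s at 0.5 m/s, then turn in place 90 degrees
x, y, th = 0.0, 0.0, 0.0
x, y, th = diff_drive_step(x, y, th, 0.5, 0.0, 1.0)
x, y, th = diff_drive_step(x, y, th, 0.0, math.pi / 2, 1.0)
```

Stepping this model with the real chair's control inputs is what lets a simulator reproduce the trajectories the physical machine would drive.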
Credits: In cooperation with our valued partners at the University of Toronto Institute for Aerospace Studies and the Université de Sherbrooke, and NSERC Grants #491326-15 and #485345-15.