Autonomous navigation

Wheelchairs that map and drive

 

Overview

This is a low-cost, self-contained add-on module for existing third-party standard power wheelchairs. It learns the layout of a building and can then autonomously navigate from any location in that building to any other without user assistance. In buildings where it has not been taught the layout, it can still assist the user: driving through doorways, travelling down long corridors, docking near a desk or table, or being summoned to the user’s bedside from its charging station. An easy-to-use graphical user interface (GUI) lets the user control the chair manually (as usual) or in one of five autonomous and assistive modes:

  1. Autonomous Self-Navigation from Current Location to Any Other:

    In this mode, the chair is taught the layout of the building by driving it around once. Thereafter it can autonomously travel from any location to any other at the user’s request, for example if the user tires of manual control or wishes to engage in other activities such as texting or reading.

  2. Door Traversal Mode:

    In this mode, there is no map or prior set-up, so it can be used anywhere in any building to help the user go through doors. It takes the chair smoothly through the door even under extreme conditions where the chair is not properly oriented in line with the door, just as seen in the autonomous navigation video.

  3. Corridor Mode:

    In this mode, there is no map or prior set-up, so it can be used anywhere in any building to help the user go through long corridors. It takes the chair smoothly through the corridor and adapts its speed to the flow of traffic in the corridor.

  4. Summon Mode:

    The chair can self-navigate from its charging station to the user’s bedside or another location when the user summons it.

  5. Self Docking Mode:

    When approaching a desk or table, the chair can be requested to maneuver into a safe position relative to the desk or table without user intervention.
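The teach-then-navigate idea behind mode 1 can be illustrated with a minimal sketch (all class, method, and waypoint names here are hypothetical, not the product’s actual software): the chair records waypoints during a single teaching drive, then plans a shortest route between any two taught locations using Dijkstra’s algorithm over the recorded graph.

```python
import heapq
import math

class TeachAndNavigate:
    """Hypothetical sketch: record waypoints during one teaching drive,
    then plan routes between any two named locations."""

    def __init__(self):
        self.positions = {}   # waypoint name -> (x, y) in metres
        self.edges = {}       # waypoint name -> {neighbour: distance}

    def record(self, name, x, y, linked_to=None):
        # Called as the chair is driven through the building once.
        self.positions[name] = (x, y)
        self.edges.setdefault(name, {})
        if linked_to is not None:
            d = math.dist(self.positions[linked_to], (x, y))
            self.edges[name][linked_to] = d
            self.edges[linked_to][name] = d

    def route(self, start, goal):
        # Dijkstra's shortest path over the taught waypoint graph.
        dist = {start: 0.0}
        prev = {}
        pq = [(0.0, start)]
        while pq:
            d, node = heapq.heappop(pq)
            if node == goal:
                break
            if d > dist.get(node, math.inf):
                continue
            for nbr, w in self.edges[node].items():
                nd = d + w
                if nd < dist.get(nbr, math.inf):
                    dist[nbr] = nd
                    prev[nbr] = node
                    heapq.heappush(pq, (nd, nbr))
        # Walk predecessors back from the goal to reconstruct the path.
        path, node = [], goal
        while node != start:
            path.append(node)
            node = prev[node]
        path.append(start)
        return path[::-1]
```

After a teaching drive that recorded, say, a workshop, a corridor, and an office, `route("workshop", "office")` returns the waypoint sequence for the chair to follow; the real system would of course operate on a dense metric map rather than a handful of named points.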

  Demo Video

    Credits1

    Navigation Demo:

    Throughout this video demo, the chair is placed under a series of challenging conditions:

    1. Getting through the door smoothly, without any jitter or wobble
       
    2. Getting through the door when there is an unexpected occlusion (a passing object)
       
    3. Once through the door, the chair is facing a blank wall. This would confound most vision-based navigation systems, since there are no features on the wall to lock on to for guidance, but the chair performs properly.
       
    4. Once in the corridor, the camera operator repeatedly walks in front of the chair or close to its side, forcing the chair to make evasive maneuvers while continuing smoothly and safely towards its destination.
       

    General:

    The navigation module is custom packaged to the aesthetic and functional preferences of each wheelchair manufacturer. This demo video shows the generic “un-packaged” engineering development version. Prior to making this video, the chair was manually guided through all the corridors and rooms of the building to create and save a map in its memory. The map can be seen on the GUI displayed on the MacBook laptop in the video. Normally, this GUI would appear on a touch-screen tablet of the OEM’s choice.

    Graphical User Interface:

    The home screen of the GUI provides the five self-driving options explained above. From a computational perspective, the most challenging of these is autonomous navigation to an arbitrary distant location in a dynamic environment with moving objects. In this video the chair is directed to use Autonomous Navigation Mode to go from the workshop to the end of a series of corridors. The robot’s progress is displayed in the GUI, which shows its position on the map. Once the robot reaches its destination, the GUI again displays the whole map of the building floor in both 2D and 3D, so that the user can choose a new destination or return to the home screen for more options. This GUI is always customized to the requirements of each OEM, since OEMs differ widely in how they want it arranged.

    New VR Simulator Allows Rapid Prototyping and Testing

    Cyberworks has developed the first realistic 3D Virtual Reality Simulator for self-driving wheelchairs. This allows Cyberworks to achieve extraordinary levels of robustness by testing its navigation systems in hundreds of real-world environments using real sensory data, without leaving the lab. The Sim uses data from real buildings (generated by walkthroughs with an inexpensive RGBD sensor) and models the actual mechanics and kinematics of existing wheelchairs (allowing Cyberworks to duplicate the slippage, turning radius, etc. of those machines). The Cyberworks Sim runs custom versions of its core autonomous navigation software adapted to the specific kinematics of each simulated wheelchair. This allows Cyberworks to accelerate its R&D program many fold.
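    To illustrate the kind of wheelchair kinematics such a simulator must reproduce, here is a minimal differential-drive pose-update sketch (the function name and the simple slip factor are illustrative assumptions, not Cyberworks’ actual model):

```python
import math

def step_pose(x, y, theta, v_left, v_right, wheel_base, dt, slip=0.0):
    """One integration step of a differential-drive wheelchair.

    v_left / v_right are wheel ground speeds (m/s), wheel_base is the
    distance between the drive wheels (m), and slip scales down the
    effective wheel speeds (0.0 = ideal traction). Returns the new pose.
    """
    v_l = v_left * (1.0 - slip)
    v_r = v_right * (1.0 - slip)
    v = (v_l + v_r) / 2.0              # forward speed of chassis centre
    omega = (v_r - v_l) / wheel_base   # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

    Feeding a real chair’s wheel-base and a measured slip factor into a loop of such steps is one simple way a simulator can reproduce the turning radius and drift of a specific machine; a production simulator would use a finer motion and traction model.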

    Watch a brief history of Cyberworks here

    There are over 4.4 million power wheelchairs in the USA alone. Cyberworks aims to automate wheelchairs to improve the quality of life of wheelchair users across the globe, through sales and installation of add-on navigation systems for existing equipment in the field as well as to OEMs for new equipment.

    In 1984, Cyberworks demonstrated the world’s first fully autonomous robot, one that could navigate and work in unknown areas without any a priori teaching or map learning.

    In the late 1980s, Cyberworks also deployed, across Europe and in Japan, the world’s first commercial industrial autonomous cleaning robot.

    In 1991, Cyberworks developed the world’s first robotically-assisted electric scooter for the disabled.

    In the late 1980s and 1990s, Cyberworks published some of the earliest peer-reviewed scientific papers on autonomous vehicles, in leading international journals.

    Today we work with global OEMs to provide next-generation, low-cost add-on autonomous navigation modules for their existing and future products.

    We also fund research partnerships with several top tier universities to ensure we remain on the leading edge of autonomous vehicle technology.

    Credits1: In cooperation with our valued partners at the University of Toronto Institute for Aerospace Studies, Université de Sherbrooke, and NSERC Grants #491326-15 and #485345-15