By Tyler Irving | Jul 27, 2017 @ 06:49
The technology could assist people with limited mobility due to multiple sclerosis, Parkinson’s or other conditions.

A new academic-industry collaboration at U of T’s Faculty of Applied Science & Engineering is harnessing improved sensors and artificial intelligence to make electric wheelchairs self-driving.
The technology could greatly simplify the lives of more than 5 million power wheelchair users across North America, and millions more worldwide.
Since electric wheelchairs were pioneered by inventors such as George Klein in the 1950s, the fundamental technology has remained much the same. Most are controlled by joysticks, which may seem simple to use but can be frustratingly cumbersome for many everyday tasks – from docking at a desk to traversing a narrow door frame.
“Imagine parking a car in a tight space using only a tiny joystick,” says Jonathan Kelly, an assistant professor at the University of Toronto Institute for Aerospace Studies who is leading the new collaboration. “That would be annoying for anyone.”
The problem is compounded for users with multiple sclerosis, amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease) or spinal cord injuries, who often lack upper body control, and for those with Parkinson’s disease, who often have hand tremors. Some of these users employ devices known as sip-and-puff (SNP) controllers, issuing commands by sipping or puffing air through a plastic straw. These are an alternative to joysticks, but they can make complex tasks even more overwhelming.
Several groups around the world are working on self-driving wheelchairs, but most rely on high-end sensors that are priced out of reach of a typical consumer. Kelly, an expert in robotic sensing and perception, believes that the task could be accomplished for much less, thanks to a recent explosion in mass-produced sensor technology.
He points to the Microsoft Kinect, which contains both a visible-spectrum camera and an infrared laser to detect distances.
“Sensors like that used to cost thousands of dollars, but now you can buy them for less than $200,” says Kelly. “It has been a game-changer for robotics.”
Automation could also help with tasks that are less complex but more routine. For example, an autonomous wheelchair could use its sensors to map a space and tag key locations, such as the kitchen or bedroom. The user could then navigate to those locations with a single command.
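To make the idea concrete, here is a minimal sketch of single-command waypoint navigation: named locations are tagged on a 2D occupancy grid, and a command such as "kitchen" resolves to a goal cell and a planned route. This is illustrative only – the grid, the waypoint names and the breadth-first planner are assumptions for the example, not the team's actual software.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search on a 2D occupancy grid (0 = free, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Reconstruct the path by walking predecessors back to start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable

# Named waypoints the user tagged on the map during setup (hypothetical).
waypoints = {"kitchen": (0, 3), "bedroom": (2, 0)}

grid = [
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]

# A single command ("kitchen") resolves to a goal cell and a planned route.
route = plan_route(grid, start=(2, 3), goal=waypoints["kitchen"])
```

A real chair would plan over a sensor-built map and track the route with a motion controller, but the waypoint-lookup-plus-planner structure is the same.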
Two years ago, Kelly was approached by Vivek Burhanpurkar, the CEO of Cyberworks Robotics, Inc. The company has a long history in autonomous robotics, including self-driving industrial cleaning machines, but Burhanpurkar saw an opportunity to move into assistive devices.
“It’s only in the past five years that we’ve reached a critical inflection point, allowing us to achieve unprecedented levels of autonomous behaviours at consumer-level price points,” says Burhanpurkar, who started Cyberworks after studying under U of T Engineering Professor Emeritus K.C. Smith. “Jonathan’s group was a natural partner for us because they have the same set of altruistic values and goals as we do. We share a common vision for the future, which is a rare thing between academia and industry.”
Rather than designing a new chair from scratch, the team – composed of Cyberworks engineers and U of T engineering researchers – focused on retrofitting existing chairs using inexpensive sensors, controllers and a small computer. Over the past two years, the team wrote software and developed algorithms to deal with many common situations, including driving down narrow corridors and avoiding obstacles. Another key collaborator on the project was François Michaud and his team at the Université de Sherbrooke, who came on board in 2016.
Rapid developments in sensor technology make this an ideal platform for aspiring engineers to gain research experience: much of the work on the U of T engineering side has been carried out by undergraduates. Last summer, the team also included Maya Burhanpurkar, Vivek’s daughter, who worked on the project during a gap year before starting her undergraduate degree at Harvard this fall.
Maya Burhanpurkar developed an algorithm that enables the wheelchair to traverse a doorway with as little as 6.5 centimetres of clearance on either side. This month, she will present her work at the International Conference on Rehabilitation Robotics in London, U.K.
Kelly says that the next step will be to test the wheelchair in controlled environments under the supervision of occupational therapists. He has lined up potential collaborators and is currently applying for ethics approval and funding.
“Once we have the user study data, the product is essentially ready for commercialization,” he says. “It wasn’t always easy, but I’ve been really surprised to see how far we’ve come in two years. We’ve had so many talented people working on the project, and now when I see it operating it always brings a smile to my face. I’m super excited about it now.”
- Keywords: self-driving, wheelchairs, robotic retrofit, artificial intelligence, sensor technology, rehabilitation robotics, autonomous.
- Source: University of Toronto
- Image: University of Toronto