Disability, specifically losing the use of one’s limbs, has a profound impact on quality of life. Improving tools designed to enable paralyzed individuals to regain independence is a major focus of modern-day medicine, and recent advances in computational science have created new opportunities in this field. Researchers from Iraq explain the inner workings of their recently developed motorized wheelchair, which can be controlled by the user’s facial movements.
The majority of motorized wheelchairs are controlled by a hand-operated joystick – a feature that limits their use to individuals with functional upper limbs. For quadriplegic individuals and others who rely on movements of their head, neck, tongue and eyes, a better solution is needed. This need motivated a recent study published in Bio-Algorithms and Med-Systems, which proposes a technique that may help this group of patients control a motorized wheelchair more easily.
Assuming that an individual has the ability to tilt their head up, down, right, and left, this kind of facial movement can be used to direct the movement of a motorized wheelchair. The components required to realize this goal include a wheelchair equipped with two motors attached to the back wheels, a laptop with a webcam, and a microcontroller that sends signals to the wheel motors.
The user sits in the wheelchair facing a webcam fixed in front of their head, positioned so that the camera can detect and monitor the tilt of their face. The video stream from this camera is then analyzed by a newly developed algorithm to determine the direction in which the user’s face is tilted.
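The paper itself does not include source code, but the laptop-side pipeline described here can be sketched roughly as follows. The snippet below is a minimal illustration, not the authors’ implementation: it assumes OpenCV for webcam capture and pyserial for the link to the microcontroller, and the serial port name, baud rate, command bytes and the placeholder classify_tilt function are all hypothetical.

```python
# Minimal sketch of the laptop-side control loop (not the authors' code).
# Assumptions: OpenCV (cv2) for webcam capture, pyserial for the link to
# the microcontroller. Port name, baud rate and command bytes are hypothetical.
import cv2
import serial

COMMANDS = {"up": b"F", "down": b"S", "left": b"L", "right": b"R", "none": b"S"}

def classify_tilt(reference, frame):
    """Placeholder for the paper's tilt-detection algorithm.

    Compares the current frame against the reference image and returns
    one of: "up", "down", "left", "right", "none".
    """
    return "none"  # an illustrative version is sketched in a later listing

def main():
    cap = cv2.VideoCapture(0)                   # webcam fixed in front of the user
    link = serial.Serial("/dev/ttyUSB0", 9600)  # microcontroller driving the wheel motors
    ok, reference = cap.read()                  # first frame: head held straight
    while ok:
        ok, frame = cap.read()
        if not ok:
            break
        direction = classify_tilt(reference, frame)
        link.write(COMMANDS[direction])         # one byte per recognized tilt
    cap.release()
    link.close()

if __name__ == "__main__":
    main()
```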

Four types of face tilt are translated into movements of the motorized wheelchair: right, left, down (stop) and up (forward). © De Gruyter; illustration taken from the original manuscript in Bio-Algorithms and Med-Systems
The video stream, a sequence of images, is organized into an image matrix in which a reference image is used for comparison against each subsequent image. This allows the algorithm to calculate the degree and direction of facial tilt at any given moment.
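As a rough illustration of this reference-image comparison, the sketch below locates the face in both the reference frame and the current frame with a standard OpenCV Haar-cascade detector and classifies the tilt from the shift of the face center. The detector choice and the pixel threshold are assumptions made for illustration, not necessarily the method used in the paper.

```python
# Illustrative tilt estimation by comparing the current frame against a
# reference frame (head held straight). The Haar-cascade face detector and
# the pixel threshold are assumptions, not the paper's exact method.
import cv2

FACE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_center(image):
    """Return the (x, y) center of the largest detected face, or None."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = FACE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    return (x + w // 2, y + h // 2)

def classify_tilt(reference, frame, threshold=40):
    """Compare the face position in `frame` with the reference image and
    return "up", "down", "left", "right" or "none".

    The shift of the face center is used here as a simple proxy for the
    degree and direction of tilt calculated in the paper.
    """
    ref, cur = face_center(reference), face_center(frame)
    if ref is None or cur is None:
        return "none"
    dx, dy = cur[0] - ref[0], cur[1] - ref[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return "none"
    if abs(dx) > abs(dy):                 # horizontal shift dominates
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"     # image y grows downwards
```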
Once the algorithm recognizes the direction in which the user’s face is tilted, it sends a signal to the microcontroller. The microcontroller in turn signals the relevant motors in the rear wheels, switching them on or off. The researchers found that four distinct facial movements were sufficient for the user to move the wheelchair forward, turn it right or left, and stop it.
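The article does not specify exactly how each recognized movement maps onto the two rear-wheel motors. One plausible differential-drive mapping, shown below purely as an assumption rather than the authors’ scheme, switches both motors on to drive forward, both off to stop, and a single motor on to turn.

```python
# Hypothetical mapping of the four commands to the two rear-wheel motors.
# A differential-drive scheme is assumed: turning is done by switching
# only one motor on. The command bytes mirror the earlier sketch.
MOTOR_STATES = {
    b"F": (True, True),    # tilt up: forward, both motors on
    b"S": (False, False),  # tilt down: stop, both motors off
    b"L": (False, True),   # tilt left: right motor only -> turn left
    b"R": (True, False),   # tilt right: left motor only -> turn right
}

def apply_command(command, set_left_motor, set_right_motor):
    """Switch the two motors according to a received command byte.

    `set_left_motor` / `set_right_motor` stand in for the microcontroller's
    actual motor-driver calls, which are hardware specific.
    """
    left_on, right_on = MOTOR_STATES.get(command, (False, False))
    set_left_motor(left_on)
    set_right_motor(right_on)
```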
The researchers state that their algorithm performed excellently in distinguishing facial movements and controlling the motorized wheelchair. They also highlight future avenues of research, particularly how the algorithm can be adapted to allow the user to control other aspects of their wheelchair’s movement, such as speed.
Read the original article here: