Visually Impaired Athletes At The Paralympics
Watching the recent Tokyo Paralympics was truly inspiring. I always learn something new from watching a Paralympics event, whether it's the different classifications of disabilities or the modifications and assistance required to allow the athletes to compete in their sports. One thing that caught my attention this time was the events for athletes with a visual impairment. The modifications for the visually impaired can be in the rules, in the equipment, or in having assistants or guides. For example, in Football Five-a-side, the rules are similar to Futsal but with a few modifications. On each side, the goalkeeper is sighted while the other four players have a visual impairment, and they all wear eyeshades to ensure fairness. The football has bells in it so the players can hear where the ball is. Each team also has three guides: the goalkeeper plus two guides standing outside the pitch to give feedback to the players. The players therefore rely mainly on hearing and touch – focusing on the sound of the ball, listening to and feeling their environment, and trusting the voices and directions of their guides. Spectators have to be quiet during play so that the players are not distracted by other noises.
Sports With Guides
There are other summer sports where visually impaired athletes also have a guide, including archery, cycling, swimming, triathlon and athletics. For sports like archery and swimming, the guides are stationary and simply provide some form of feedback: the spotter in archery tells the archer where the shots landed, and the swimmer's guide (typically the coach) taps the swimmer's head with a soft padded stick to let them know they are reaching the end of the lap. Then there are sports like cycling, triathlon and running where the guides actually have to ride, swim and run with the visually impaired athlete. For cycling events, the guide rides with the athlete on a tandem bike, whereas for running and the triathlon swim, the guide is tethered to the athlete with a rope. In this case, the guide plays the important role of warning the athlete about any obstacles, giving feedback about timing, the distance remaining and the conditions of the race, and offering encouragement to keep going.
It was while watching the Tokyo Paralympics T12 Marathon that it struck me that being a running guide is not easy. It got me thinking: what if the running guide picked up an injury or struggled to keep up with the athlete? What if there were technologies that could provide the same guidance – keeping the runner on the right path, and giving feedback on running pace and the distance left? Is it possible for a visually impaired runner to run independently with the support of assistive technologies?
Running Assistive Tech
After doing a bit of research, I came across a number of developments in academia and the commercial space aimed at helping visually impaired runners run without being physically tethered to a human guide. In all cases, the technologies have some way of sensing or capturing data about the environment or the runner's location. They then process that data into feedback that is useful for the runner, and finally deliver that feedback through the sense of touch, hearing or both. A general trend is to create a virtual hallway or track that the runner should follow, with feedback whenever they approach or cross a virtual boundary. Here, we are going to cover a number of technologies and related stories that I have found.
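To make the "virtual hallway" idea concrete, here is a minimal sketch (the thresholds and cue names are made up for illustration) of how a device might map the runner's lateral offset from the corridor centreline to a feedback cue:

```python
def corridor_feedback(lateral_offset_m, half_width_m=0.5, warn_ratio=0.6):
    """Map a runner's lateral offset from the corridor centreline to a cue.
    Negative offsets mean the runner has drifted left, positive means right."""
    if abs(lateral_offset_m) <= half_width_m * warn_ratio:
        return "clear"            # inside the safe zone: no cue
    side = "left" if lateral_offset_m < 0 else "right"
    if abs(lateral_offset_m) <= half_width_m:
        return f"warn_{side}"     # nearing the boundary: gentle cue
    return f"cross_{side}"        # crossed the boundary: strong cue
```

Each of the systems below is essentially a different way of measuring that offset and delivering those cues.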
Google’s Project Guideline (Computer Vision)
In 2019, Thomas Panek, an avid runner and President & CEO of Guiding Eyes for the Blind, was at a Google hackathon where he posed the question: "can we help a blind runner navigate (without a guide dog or running guide)?". He wasn't expecting more than a discussion about the possibilities, but the team at Google built a rough demo on the same day, using a smartphone camera to track a line on the ground and providing audio feedback for the user to walk along the line. From there, they moved on to test the concept on an indoor track. Thomas wore a running belt with a phone strapped to it so that the rear camera pointed forward. An app on the phone ran computer vision algorithms to track a line on the track and provided audio cues to Thomas via bone-conducting headphones. It worked really well indoors in a controlled environment.
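As a rough illustration of the idea (a toy sketch, not Google's actual implementation), an app could locate the line in a binarised camera row and turn the offset into a stereo audio cue; the deadband value and the choice of which ear to play the cue in are assumptions here:

```python
def line_offset(mask_row):
    """Given one binarised image row (1 = line pixel, 0 = background),
    return the line centre as a normalised offset in [-1, 1] from the
    frame centre, or None if no line is visible."""
    cols = [i for i, v in enumerate(mask_row) if v]
    if not cols:
        return None
    centre = sum(cols) / len(cols)
    half = (len(mask_row) - 1) / 2
    return (centre - half) / half

def stereo_cue(offset, deadband=0.2):
    """Map the offset to (left_gain, right_gain): silence while roughly on
    the line, otherwise a cue in the ear on the side the line has drifted
    to, prompting the runner to steer back towards it."""
    if offset is None or abs(offset) <= deadband:
        return (0.0, 0.0)
    gain = min(1.0, (abs(offset) - deadband) / (1 - deadband))
    return (gain, 0.0) if offset < 0 else (0.0, gain)
```

In the real system, an on-device machine learning model does the hard part: producing a reliable line mask from messy outdoor frames.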
The challenge came when they took it outdoors, because there are many more variables for the line detection algorithm: lines on roads vary in thickness and density, there are leaves and dirt on the road, and changing lighting conditions affect the algorithms. The Google engineers didn't want to rely on location data or cloud processing, so they had to train a machine learning model that would run locally on the phone and accurately detect different lines on roads. After months of building the on-device model, they managed to pull it off, and Thomas was even able to run a 5k event in Central Park all by himself. The solution is not quite done yet, and the Google Accessibility team is still doing more research and working on improvements. Anyone who is interested should fill in this form (link) to get in touch with them. [Original story link]
Simon Wheatcroft – Pushing Running Boundaries With Tech
Simon Wheatcroft suffers from a degenerative eye disease called retinitis pigmentosa. He was legally blind at 17, and his vision worsened going into his 20s. At one point, when he was not working, running gave him something to do and in a way kept him going. He learned to adapt and rely on his other senses – feeling the slight hump of the lines on the road, and his memory of routes. He also used technology, like the RunKeeper app, which gave him audio feedback on the distance he had covered. He ventured from a football pitch to a closed road, to the open road and then to ultramarathons. Besides the RunKeeper app, he has also worked with a few technology companies and products. These are some of those he has used:
Google Glass (GPS, Camera + Remote Guidance – Aira)
Google Glass came out around 2013, and the initial goal was for it to be a heads-up display that allows the wearer to look up information hands-free. It has a camera and voice activation, which lets the wearer take a picture almost instantly (with a voice command) without having to reach for their smartphone. But it also became an opportunity for Simon, who could rely on Google Glass's camera, voice activation, GPS and audio feedback to direct him on his runs. In the US, another blind runner, Erich Manser, also used Google Glass combined with a service called Aira to guide him through the Boston Marathon. Erich still had a guide running beside him, but he was mainly getting audio guidance from a remote Aira agent viewing the run through Erich's Google Glass camera from a computer. The agent was able to tell Erich exactly what was in front of him and where he should go. Sadly, the Google Glass project switched gears and no longer focuses on the consumer market. On the other hand, it proved that remote guidance (from a service provider like Aira) and a connected camera can be very useful for visually impaired runners.
IBM’s Guidance App – eAscot (GPS)
Through IBM's Bluemix Garage in London, Simon worked with a team of developers to build a guidance app called "eAscot", named after Simon's guide dog Ascot. The app uses GPS and a planned route to navigate Simon along a path, and if he deviates from that path, it plays different beeping sounds to correct his course: high-pitched beeping if he veers to the right and low-pitched beeping if he veers to the left. Simon took this to the 4 Deserts Namibia race, and although he had to stop at the 100-mile mark due to injury, it proved that the technology was useful in keeping him on the right path, even in the desert.
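Conceptually, a GPS guidance app like eAscot has to compute how far the runner has drifted from the planned route and map the side of the drift to a beep pitch. A simplified sketch of that computation, using a flat-earth approximation that is reasonable over short route segments (the 2 m tolerance is a made-up value, not from the actual app):

```python
import math

def cross_track_m(lat, lon, a, b):
    """Signed cross-track distance (metres) of point (lat, lon) from the
    route segment a -> b.  Positive = left of the direction of travel,
    negative = right.  Flat-earth approximation."""
    m_per_deg = 111_320.0                      # metres per degree of latitude
    cos_lat = math.cos(math.radians(a[0]))
    dx = (b[1] - a[1]) * m_per_deg * cos_lat   # segment vector, local metres
    dy = (b[0] - a[0]) * m_per_deg
    px = (lon - a[1]) * m_per_deg * cos_lat    # point vector from segment start
    py = (lat - a[0]) * m_per_deg
    # 2-D cross product / segment length = perpendicular distance
    return (dx * py - dy * px) / math.hypot(dx, dy)

def beep_for(offset_m, tolerance_m=2.0):
    """High-pitched beep when veering right, low-pitched when veering left
    (the mapping described above); silence inside the tolerance band.
    Remember: positive offset means left of the path here."""
    if abs(offset_m) <= tolerance_m:
        return None
    return "low" if offset_m > 0 else "high"
```

A real app would also advance through the route's segments as the runner progresses and smooth out GPS jitter before cueing.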
Wayband (GPS & Magnetometer)
Simon found out about the technology WearWorks was building and decided to reach out to them. WearWorks had developed a wristband (known as the Wayband) that provides haptic feedback based on the location and orientation of the wearer's smartphone, with reference to the destination and route the wearer is trying to follow. The concept is similar to the eAscot app in its reliance on GPS; the difference is that, besides GPS, the Wayband utilises the smartphone's sensors to identify the user's orientation. The Wayband app also lets the user plan the route and set multiple intermediate goal-points, so when the user reaches each goal-point, they know they are on the right path. Different haptic feedback (vibration patterns) from the wristband then informs the wearer when they are off course.
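The underlying computation is roughly: work out the compass bearing to the next goal-point, compare it with the wearer's heading, and vibrate on one side when the difference grows. Here is a hedged sketch – the deadband and the left/right mapping are my assumptions, not WearWorks' actual design:

```python
import math

def bearing_deg(a, b):
    """Initial compass bearing from point a to point b, in degrees
    (0 = north, 90 = east), using the standard great-circle formula."""
    lat1, lat2 = math.radians(a[0]), math.radians(b[0])
    dlon = math.radians(b[1] - a[1])
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def haptic_cue(heading_deg, target_bearing_deg, deadband_deg=15):
    """Compare the wearer's heading (e.g. from the magnetometer) with the
    bearing to the next goal-point; vibrate on the side to turn towards."""
    # wrap the error into (-180, 180] so 350° vs 10° reads as a 20° turn
    err = (target_bearing_deg - heading_deg + 180) % 360 - 180
    if abs(err) <= deadband_deg:
        return None               # on course: no vibration
    return "right" if err > 0 else "left"
```

The wrap-around in `haptic_cue` matters: without it, a runner heading 350° with a goal at 10° would be told to make a 340° left turn instead of a 20° right one.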
In Simon's case, he wanted to use the Wayband to guide him through the New York Marathon. For that, there were a number of challenges the WearWorks team had to overcome to ensure things worked properly. Firstly, the marathon route crosses two massive metal bridges, and metal affects the magnetometer and inertial sensor readings. Fortunately, as the bridge paths are straight, the workaround was to rely on the GPS signal alone. Secondly, they had to identify areas where GPS is affected by New York's skyscrapers and come up with corrections. They also had to fix issues like the smartphone's position affecting the orientation reading, and tune the intensity of the haptic vibration so that it could be felt during a run. Lastly, they fitted Simon with an ultrasonic proximity sensor so he would know when people were in front of him.
Even with all that preparation, there were still some unexpected glitches, such as cell traffic from tens of thousands of runners and spectators interfering with the Wayband's signal. The rain also broke the proximity sensor, so Simon only managed to do 17 miles of the marathon fully on his own before the running guide stepped in to assist him to the finish. In the end, it was still considered a positive outcome for a first marathon trial. It demonstrated what could be possible, and with improved GPS chips and design refinements in the future, the tracking and feedback will potentially get better.
Developments In Research
Besides the products and solutions I found in the commercial space, there are a number of developments in academic research, and even exploratory projects, that aim to help visually impaired runners run independently. Here are a few interesting ones:
Running Tracks With Guidance
BLINDTRACK (Local Positioning System)
BLINDTRACK was a project initiated by the European Union (EU). The goal was to develop a system that can be installed on a standard 8-lane 400m athletics track so that visually impaired runners can run around the track with guidance to stay within their lanes. The BLINDTRACK system consists of a locating system and a tactile belt worn by the runner. The locating system (BlackFIR) is made up of four antennas placed around the track and transponders/tags worn by the runners. As the runner runs, each antenna tracks the distance to the runner's tag, and the Central Control Unit works out the exact position of the tag/runner and transmits feedback via the tactile belt so that the runner stays within their lane and can even avoid stationary and dynamic obstacles. On top of that, the system is able to store the athlete's lap times, training times and related metrics. They currently have a working prototype; interested parties can reach out to them here (link) or read their research paper here (link).
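Turning four antenna-to-tag distances into a position is a classic trilateration problem. As a sketch of what a central control unit might compute (this is the textbook method, not necessarily BlackFIR's algorithm), the circle equations can be linearised against the first antenna and solved by least squares:

```python
def locate_tag(antennas, distances):
    """Estimate a 2-D tag position from ranges to fixed antennas.
    Subtracting the first antenna's circle equation from the others
    yields linear equations A.[x, y] = b, solved here via the 2x2
    least-squares normal equations."""
    x0, y0 = antennas[0]
    d0 = distances[0]
    A, b = [], []
    for (xi, yi), di in zip(antennas[1:], distances[1:]):
        A.append((2 * (xi - x0), 2 * (yi - y0)))
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # normal equations: (A^T A) p = A^T b, expanded by hand for 2 unknowns
    sxx = sum(r[0] * r[0] for r in A)
    sxy = sum(r[0] * r[1] for r in A)
    syy = sum(r[1] * r[1] for r in A)
    bx = sum(r[0] * v for r, v in zip(A, b))
    by = sum(r[1] * v for r, v in zip(A, b))
    det = sxx * syy - sxy * sxy
    return ((syy * bx - sxy * by) / det, (sxx * by - sxy * bx) / det)
```

Using four antennas rather than the minimum three over-determines the system, which helps average out ranging noise.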
Track With Electromagnetic Field & Sensor
Researchers from UNIVPM looked into using electromagnetic fields to create boundaries on a running track, equipping the athlete with an electromagnetic sensor to determine their position with respect to the field. In the study, two wires were laid along a running track across three lanes, and the electromagnetic signals were set up so that they cancel each other out in the middle (between the two wires). So when the runner runs right down the middle with an electromagnetic sensor strapped to a waist belt, the sensor detects no signal, and that is considered the safe zone. When the runner drifts closer to the right or left "boundary", they are notified via vibration bands strapped to their left and right arms: vibration feedback on the right arm means they have veered right, and vibration on the left arm means they have veered left. More details can be found in their papers here: part I and part II.
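The cueing logic reduces to a differential measurement: the field from one wire dominates as the runner drifts towards it, so the signed difference between the two sensed signals roughly tracks lateral offset. A simplified model (the threshold value is an assumption for illustration):

```python
def em_guidance(left_signal, right_signal, threshold=0.1):
    """Simplified model of the UNIVPM cue: the two wire fields cancel at
    the lane centre, so the differential signal grows as the runner
    drifts towards either wire.  The vibration band fires on the side
    the runner has drifted towards."""
    diff = right_signal - left_signal   # > 0: closer to the right wire
    if abs(diff) <= threshold:
        return None                     # in the null zone: safe, no cue
    return "right_arm" if diff > 0 else "left_arm"
```

The elegance of this scheme is that the "safe zone" needs no positioning infrastructure at all – it is defined physically by where the fields cancel.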
Hyper-directional Sound
In Sweden, a sound design agency called Efterklang (previously known as Lexter) wanted to help a young blind runner, Oscar Widegren, run independently using sound. They had previously used directional speakers to create unique sound experiences: these speakers generate sound waves in a narrow beam, so they can be targeted in a specific direction and won't be heard beyond that beam. What they did was place two directional speakers, one targeting each side of a straight running-track lane, forming two "walls" of sound. The left "wall" plays a continuous sequence of "bip"s while the right "wall" plays a continuous sequence of "whuip"s, so when Oscar runs down the straight, he is guided by "bip"s on his left and "whuip"s on his right. They ran a successful test and are going to continue exploring how this technology can help more visually impaired people. More information can be found on their website (link).
Drones As Guides (Proof Of Concept)
The robotics research lab at the University of Nevada experimented with drones and how they could guide visually impaired runners. The idea was to have a drone fly 3m in front of the runner, who would follow the sound of the drone's rotors. The researchers ran a study on an indoor straight track where two visually impaired participants were asked to walk towards the sound of the flying drone, and found that the participants were able to accurately locate and follow the drone in front of them. Although they didn't manage to do actual running tests, their proof of concept showed that it is possible.
Another research group in Melbourne, the Exertion Games Lab, developed a prototype jogging-partner drone called the Joggobot. The Joggobot was designed and programmed to follow the runner by tracking a marker on the runner's shirt. It can be programmed to simply follow the runner, or to act as a pacer so the runner tries to keep up with the drone. Although it was not intended for visually impaired runners, it could potentially be used the way described by the research group mentioned earlier: a preprogrammed route on the drone, with camera tracking to maintain a safe distance from the runner. There are still some risks to running with drones – strong winds, trees, overhanging objects – but overall there are interesting possibilities to explore.
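The "maintain a safe distance" behaviour can be sketched as a simple proportional controller on the camera-measured gap between drone and runner. This is a toy illustration of the follow behaviour under assumed gains, not Joggobot's actual control code:

```python
def drone_speed(distance_m, target_m=3.0, gain=0.8, max_speed=5.0):
    """Proportional controller for a guide-drone's forward speed: speed
    up when the runner closes the gap, slow down (or drift back) when
    the gap grows, holding roughly target_m metres ahead."""
    error = target_m - distance_m   # > 0: runner too close, pull ahead
    speed = gain * error
    # clamp to the drone's speed limits
    return max(-max_speed, min(max_speed, speed))
```

A real controller would add damping and smoothing of the vision-based distance estimate, but the core idea is just this feedback loop running many times per second.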
Assistive technologies can definitely play a part in helping visually impaired runners run independently. Not only does this give them a sense of independence and freedom, it also provides all the benefits of physical activity, helping them stay fit and healthy.
A number of the technologies mentioned are either one-off efforts or still in development and not yet available to the masses, and I think that's because there are multiple challenges to work out in making the solutions more robust. For example, when the Wayband was tested at the New York Marathon, the team found a number of limitations due to environmental factors and had to devise workarounds; they even had to add an ultrasonic sensor to help detect other runners in front. So although not all visually impaired runners would want to run a marathon independently, testing the tech at a marathon is a great stress test to see what could break and what else could be improved.
The Google Glass and Aira remote guidance seemed like a pretty good solution: having a connected camera and a person describing exactly what is in front is almost as good as having a running guide. The main barrier is probably the cost of the device, and possibly the cost of a monthly subscription to the service. And even if cost is not a factor, the Glass is no longer available to the public. One likely hack would be to have a video call with someone using just a smartphone, strapped to a running belt similar to Google's Project Guideline.
We also saw a few examples of embedding technologies into the sporting environment (a locating system, electromagnetic field or sound) to create a virtual hallway for the runner to "sense" the correct pathway. Going forward, it would be great to see some form of collaboration between the designers of facilities and wearable-device or app developers – perhaps a more universal "sensing" mechanism that integrates with the smartphone apps and smartwatches most people already use.
We may not be at a stage where we can rely on the tech fully, because there are still limitations and many variables in play (e.g. environmental factors, internet connectivity, etc.). But if more people are willing to give the tech a go, researchers and developers can continue to work towards more robust and usable solutions.
Lastly, creating more assistive tech for visually impaired people could benefit the wider population as well. WearWorks, who are about to launch the Wayband, noted that the wristband could be helpful to sighted tourists exploring a new city: instead of looking down at their smartphones for directions, they could focus on the sights and sounds while the wristband informs them whenever they deviate from their planned path. And besides running and walking, the technologies we have covered could also be applicable to keeping distracted drivers safe.
And that's all I've got for this article. Hopefully it was somewhat informative, or perhaps even inspiring. If there is an impactful piece of research or technology that I left out, please leave a comment or feedback below, or if you would like to chat about any of these, drop me a message here (link). With that, thanks for reading!