
Can phones successfully replace one of our natural senses? Exploring the case of vision

Machine vision is a field of artificial intelligence that has seen significant advancements in the last few years and has had a beneficial impact on the world. From security to retail to insurance, industries depend heavily on visual input from cameras: to automate physical security, enable checkout-free shopping, and price and underwrite policies accurately.

In healthcare, computer vision has accelerated medical research and supports doctors in identifying illnesses through imaging analysis.

Simply put, this technology allows cameras to understand and map the world around them in real time, making them, in some cases, more accurate than the human eye.

In daily life, the first community that comes to mind as a beneficiary of these advancements in computer vision is the blind community.


Using their phones... WAIT, blind people use phones?


YES!! Check out this video by the fantastic blind YouTuber James Rath. He says: "Just because we are blind, doesn't mean we don't have jobs to go to or errands to run... the smartphone offers so much independence... it's such a great time to be blind."


Anyway, using computer vision software or an app on their phones, blind people can hear text and scene descriptions. They can hear anything their phone's camera sees.
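To make this concrete, here is a minimal sketch of how an iOS app might read text aloud, using Apple's Vision framework for on-device text recognition and AVSpeechSynthesizer to speak the result. It illustrates the general technique, not MyFinder's actual code.

```swift
import Vision
import AVFoundation

// A minimal sketch of the "hear what the camera sees" idea:
// run Apple's Vision text recognizer on a captured frame and
// speak the result aloud. Error handling is kept to a minimum.
let synthesizer = AVSpeechSynthesizer()

func speakText(in image: CGImage) {
    // Request on-device OCR; .accurate favours quality over speed.
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the top candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ". ")))
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```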


A startup with grand ambitions in this field is MyFinder.

MyFinder does just about what its name suggests: identifying and locating objects in 3D space, enabling visually impaired people to get support through human-centred AI and augmented reality. Their software turns a simple iPhone into the only assistive technology a blind person needs to navigate new environments. Both the iPhone 13 Pro and iPhone 13 Pro Max feature three 12MP lenses and a LiDAR scanner with a 6x optical zoom range that can capture the whorls of a fingerprint or the precise grain of a leaf - no end to the possibilities. It is a clear sign of how far consumer hardware has come.
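As an illustration of what that LiDAR scanner exposes to developers, here is a rough sketch of reading per-pixel distances through ARKit's scene-depth API. This is a generic example of the hardware capability, not MyFinder's implementation.

```swift
import ARKit

// A sketch of reading per-pixel distances from the LiDAR scanner
// via ARKit's scene-depth API (available on LiDAR-equipped iPhones).
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Ask ARKit for the LiDAR-derived depth map, if the device supports it.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        // Sample the distance (in metres) at the centre of the frame.
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        if let base = CVPixelBufferGetBaseAddress(depthMap) {
            // The depth buffer stores one Float32 distance per pixel.
            let centreRow = base.advanced(by: (height / 2) * rowBytes)
                .assumingMemoryBound(to: Float32.self)
            let centreDistance = centreRow[width / 2]
            print("Obstacle ahead at \(centreDistance) m")
        }
    }
}
```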


Once the app is opened, the user starts hearing the objects in the scene; by double-tapping on one, they launch the 3D navigation towards that object. The app also has a digital cane feature, which uses haptic feedback to warn the user of an upcoming obstacle in their path or an uneven surface ahead.
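A haptic "digital cane" of this kind could be built on Apple's Core Haptics framework. The sketch below maps obstacle distance to tap intensity; the 3-metre threshold and the linear mapping are illustrative assumptions, not MyFinder's actual behaviour.

```swift
import CoreHaptics

// A sketch of the "digital cane" idea: map obstacle distance to a
// haptic tap whose intensity grows as the obstacle gets closer.
final class DigitalCane {
    private var engine: CHHapticEngine?

    init() {
        engine = try? CHHapticEngine()
        try? engine?.start()
    }

    func obstacleUpdated(distanceInMetres: Float) {
        // Ignore obstacles beyond an illustrative 3 m threshold.
        guard distanceInMetres < 3.0 else { return }
        // Closer obstacle -> stronger tap (intensity in 0...1).
        let intensity = CHHapticEventParameter(
            parameterID: .hapticIntensity,
            value: 1.0 - distanceInMetres / 3.0)
        let sharpness = CHHapticEventParameter(
            parameterID: .hapticSharpness, value: 0.5)
        let tap = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [intensity, sharpness],
            relativeTime: 0)
        if let pattern = try? CHHapticPattern(events: [tap], parameters: []),
           let player = try? engine?.makePlayer(with: pattern) {
            try? player.start(atTime: 0)
        }
    }
}
```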

Even with such powerful software, the founder recognised the even more powerful impact of human connection. The app can also connect the user to human volunteers worldwide in moments: as soon as the feature is enabled, a volunteer receives a notification and can see the blind user's camera input and describe what they see.



As a founder in assistive tech, I often find myself explaining to people that the visually impaired can use phones. Apple has done a fantastic job of providing us, as developers, with the machine learning frameworks and the hardware to build life-changing apps like MyFinder that make the smartphone an even more powerful tool to support the independence of the communities it benefits most. Technology is here to successfully replace our senses when nature fails us. - Ghita El Haitmy, CEO of MyFinder

Ghita El Haitmy

CEO and Co-founder of MyFinder, with a BS in Psychology, an MSc in Marketing, CAPM, NEF+ (2021 Cohort), and 6+ years of international experience.