My love of shopping started as a teenager. What began as weekend shopping trips with my friends evolved into a fully fledged passion for fashion.
I spent hours watching fashion videos and reading blogs, researching the latest trends and styles. This wasn’t as straightforward as it sounds: I have a degenerative eye condition called retinitis pigmentosa, which was diagnosed when I was six. To browse online I use screen-reading software, which played a key role in helping me when I decided to launch my own blog, My Blurred World, in 2015. It began as a way to explore accessibility in beauty and fashion, but now touches more widely on my experiences of living with an impairment.
As a vision impaired person, I have to alter the way I shop to make it accessible to me. Some elements of the in-person shopping experience have always been veiled in a layer of inaccessibility: labels with small print are the main culprit. Other factors include cluttered store layouts with little space to navigate between displays, poor lighting and a lack of staff training. All of this means I lose autonomy in choosing what I buy.
This is where assistive technology comes into play. My phone’s home screen is populated with a selection of apps I use in my day-to-day life – the main ones convert visual information into the spoken word.
I recently had the opportunity to try new features in Seeing AI, a Microsoft app that is designed to narrate the world around you.
The Seeing AI app guides the user towards the product’s barcode and then narrates its information
Armed with this new technology in the palm of my hand, I headed to a pharmacy in Llandudno, Wales, and stood in the aisle anticipating a dose of independence.
The new Seeing AI feature is the result of a collaboration between Microsoft and Haleon, a company that owns many healthcare brands, such as Sensodyne and Centrum. Using the app’s short text feature, which reads out snippets of text picked up by the phone’s camera, I was able to identify some brand names as I scanned the shelves. When I picked up a Sensodyne toothpaste from the throng and hovered it in front of my phone’s camera, the device emitted a series of beeps that grew in intensity as I neared the barcode. Then, ping: once it had locked on, the product information was available in less than a second.
Having this accurate information all in one place was a gamechanger for me. I hardly ever buy a new product without doing my research first, even if it’s a simple everyday item such as toothpaste. With a couple of allergies, I can never be too careful about the products I use, so access to an ingredients list is vital. Getting this information would previously have meant a time-consuming internet search, or asking a family member or friend to read it out to me.
It was therefore refreshing to be able to access everything I needed to know so quickly and easily. From the product description to usage instructions, I could make an informed choice and a purchase in-store just like anyone else.
Vision impaired people deserve autonomy in the choices we make, and apps such as Seeing AI make that a reality.
The app has other functions that are helpful in my everyday life. I regularly use its ability to read documents, such as letters, out loud, as well as its option to identify money.
Williams’ PenFriend is invaluable for makeup
Other tech for independence
A myriad of other aids and technologies feature in my day-to-day life to make tasks more accessible to me.
I’m a workaholic and tools such as screen-reading software make it possible for me to go about my work independently. A screen-reader will read whatever is shown on the screen of both my laptop and phone, so long as the website or document is accessible.
I’m a keen concertgoer and, for me, a slash of lipstick is a must to frame the words as I sing along at gigs. But I want to know I’m using the right colours for the occasion. So accompanying the essentials in my makeup bag is a PenFriend: a microphone-like device that enables me to record audio messages, store them on sticky labels, and play them back by pointing the PenFriend at the label. I use the labels to describe different shades of lipstick or eyeshadow and I find them incredibly useful when I pop on makeup. The recordings can be as long or as short as you like.
A liquid level indicator attached to cups beeps when the liquid reaches the top
When venturing outside independently, my white cane is always in hand. I had a love/hate relationship with my cane from the moment I first picked it up at the age of eight. But I’ve since realised what a vital part it plays in my independence. It alerts me to obstacles, while also letting others know that I’m vision impaired, ultimately giving me the confidence I need when navigating the world on my own.
Back inside the house, activities such as baking are made feasible with the help of talking scales, and I always place a liquid level indicator on the side of the cup when making a hot drink. The device sounds a series of beeps as the liquid rises towards the top, so, yes, there’s even an accessible way of making a cup of tea.
These are just a selection of the items I use in my day-to-day life. Advancements in technology have helped increase my own independence, making a visual world accessible in ways it has never been before.
Find out how Haleon is making healthcare labelling accessible for all. Download the Microsoft Seeing AI app for free from the Apple App Store