The Accessible QR code helps blind and partially sighted users to find product information. I designed the first version of the app and the CMS that enables users to access all the information inputted.
Lead designer responsible for the design of the app (V1) and CMS, discovery and ideation, competitor analysis, journey mapping, wireframes, user testing, interaction design and stakeholder interviews.
The World Health Organisation estimates that 2.2 billion people have near or distant vision impairment (source: WHO). Globally, 284 million people are partially sighted and 39 million are registered blind (source: Blindlook), which makes finding relevant product information, both in store and at home, a significant everyday challenge.
The Zapvision app and SDK have been designed to enhance the accessibility of product information for this community, providing greater independence when purchasing and searching for products. Zapvision has been developed with guidance from the Royal National Institute of Blind People (RNIB) and has been optimised through the RNIB’s expert assessment process and guided user testing research with the blind and partially sighted community.
What is Zapvision?
Zapvision is an integrated system composed of an App, SDK, CMS and accessible QR code, enabling brands to provide important product information to blind and partially sighted users who are unable to access it from the packaging design.
QR codes are now used in a wide variety of contexts, including both commercial tracking applications and convenience-oriented applications aimed at mobile-phone users.
Zapvision's key innovation is the ability to add an extra line of data to a QR code, so that the same code can serve different content depending on what scans it. Our D3 (dot-dot-dash) accessible QR solution augments the design of the QR code on pack, preserving its destination for standard camera apps whilst unlocking accessible content for blind and partially sighted users when scanned with a Zapvision-enabled app.
This means brands don't need to clutter their packaging with multiple codes, a long-discussed problem because of the visual noise it creates on pack.
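The routing idea above can be sketched in code. The real D3 encoding is not public, so this is purely illustrative: it models one code that behaves as a normal URL for any camera app, while a Zapvision-enabled scanner also reads the extra accessibility layer and routes to the product's accessible content. The `zapvision://` scheme and field names are assumptions, not Zapvision's actual implementation.

```python
# Illustrative sketch only: models the idea of one QR code serving two
# audiences, not the proprietary D3 (dot-dot-dash) format itself.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScannedCode:
    url: str                      # destination seen by any standard camera app
    accessible_id: Optional[str]  # extra D3 layer; None for a plain QR code

def resolve(code: ScannedCode, zapvision_enabled: bool) -> str:
    """Return the content a given scanner should open."""
    if zapvision_enabled and code.accessible_id is not None:
        # Hypothetical deep link for accessible product content.
        return f"zapvision://product/{code.accessible_id}"
    return code.url  # standard camera behaviour is preserved

# One promotional URL, with and without the accessibility layer.
plain = ScannedCode(url="https://brand.example/promo", accessible_id=None)
d3 = ScannedCode(url="https://brand.example/promo", accessible_id="sku-123")
```

A plain camera, or a plain code, always resolves to the original URL; only the combination of an enabled app and a D3 code unlocks the accessible route.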
I worked on both sides of the product: the end-user experience in the app for customers, and the Content Management System (CMS) serving clients and goods providers.
My role involved close collaboration with the technology developers to ensure that the Minimum Viable Product (MVP) requirements were successfully met. Additionally, I gathered valuable insights from reputable organisations such as Unilever, RNIB and Microsoft's Seeing AI.
MVP – first draft
Zappar's developers had already created a proof of concept for the technology, which enabled us to scan a code, determine the distance, and receive a message. In order to carry out testing, I created the initial, basic user journey to determine the end-to-end user interaction.
The MVP provides audible instructions, with adjustable playback speed, that guide users through the following steps:
Turn on the app with the camera activated
Detect the presence of the code
Detect the distance from the code
Access information on the packaging
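The steps above can be sketched as a small guidance loop. This is a hypothetical model, not the shipped implementation: the stage names, the 30 cm reading range, and the prompt wording are all assumptions chosen to illustrate how the app might move a user from searching, to closing the distance, to hearing the product information.

```python
# Minimal sketch of the MVP's audible guidance loop (illustrative only).
from enum import Enum, auto

class Stage(Enum):
    SEARCHING = auto()  # camera on, no code detected yet
    GUIDING = auto()    # code detected; guide the user closer
    READING = auto()    # close enough; read product info aloud

def guidance(code_visible: bool, distance_cm: int, read_range_cm: int = 30):
    """Return the current stage and the prompt to speak aloud."""
    if not code_visible:
        return Stage.SEARCHING, "Move the camera slowly across the product."
    if distance_cm > read_range_cm:
        return Stage.GUIDING, f"Code found. Move about {distance_cm - read_range_cm} cm closer."
    return Stage.READING, "Reading product information."
```

Each camera frame would feed detection results into `guidance`, and the spoken prompt changes only when the stage does, so users are not overwhelmed by repeated announcements.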
This version focuses on assisting users in identifying the correct product and providing warnings about ingredients or dietary information. For instance, Persil Bio and Non-Bio products may feel identical to the touch, but the code distinguishes between the two, allowing users to make an informed decision. Through observing users interacting with their phones, we determined that voice commands were not a priority, so they were not included in the MVP. Additionally, we researched apps that were specifically designed for the blind to guarantee familiarity and easy adoption.
Several user testing sessions were carried out in collaboration with RNIB. These sessions involved placing the packaging in rooms that simulated a shop environment, as well as in-home cupboards. Blind and partially sighted users were then tasked with navigating these simulated environments to locate items and access crucial information such as warnings and allergens.
The tests were predominantly successful and it was gratifying to witness the positive impact Zapvision was going to have on users' daily lives. During the testing phase, we recognised issues with the app's ability to register multiple codes at once, whether from the same product or different products.
Multi-code handling had originally been designed but was deprioritised to speed up development of the MVP. To address the issue, we incorporated categories, and the developers devised a method to cluster codes that share an identifier. This streamlined the information presented to users, resulting in a more user-friendly experience.
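The clustering fix can be illustrated with a short sketch. This is an assumption of how such de-duplication might work, not the developers' actual method: detections sharing a product identifier are collapsed, so the user hears one announcement per product rather than one per visible code.

```python
# Illustrative sketch: collapse multiple detected codes that share a
# product identifier into a single announcement, preserving scan order.
from collections import OrderedDict

def cluster_detections(detections):
    """detections: list of (product_id, product_name) pairs in scan order."""
    clusters = OrderedDict()
    for product_id, name in detections:
        clusters.setdefault(product_id, name)  # keep first sighting per product
    return list(clusters.values())

# Two codes on the same Persil Bio pack plus one Non-Bio pack
# yield two announcements, not three.
seen = [("u-001", "Persil Bio"), ("u-001", "Persil Bio"),
        ("u-002", "Persil Non-Bio")]
```

Grouping by identifier rather than by visual position means the behaviour is the same whether the duplicate codes sit on one pack or across several identical packs on a shelf.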
The Content Management System (CMS)
As the lead designer for Zapvision, I was responsible for shaping the CMS to be an integral part of the overall user experience.
Working collaboratively with Unilever, I designed the CMS around strong user-centric principles, ensuring that it seamlessly linked the App/SDK and merchants.
Key features included:
Seamless Product Input: The goal was to create an intuitive interface that allowed merchants to effortlessly input product information. This information is what makes the accessible QR codes truly valuable to visually impaired users, and my design aimed to ensure a straightforward input process.
Layered Information: Recognising the complexity of product details, the CMS was designed to support multiple layers of data. This approach ensured that the QR codes could provide in-depth insights, from ingredients to usage instructions.
Editable Duplicates: To streamline management and maintain consistency, I incorporated a feature that enables merchants to generate editable duplicates of product information. This makes handling similar products more efficient and coherent.
Organised Categorisation: The categorisation options include the use of tagging identifiers like European Article Number (EAN), Global Trade Item Number (GTIN), and QR URLs. These features are crucial in assisting merchants to organise their products effectively, making it easier to manage various Stock Keeping Units (SKUs) and ensuring clear product presentation.
Multilingual Adaptability: The CMS enables merchants to provide product details and descriptions in multiple languages, making the information accessible globally. Merchants can create the data table in a master language and duplicate it with the option of auto-translation to almost any language. When using auto-translate, the UX ensures they review the translations before publishing.
Accurate Pronunciation: The product information includes a feature that ensures accurate pronunciation of words. This demonstrates a commitment to inclusivity and attention to detail. One possible use case is for acronyms that are displayed in writing but should be read as full words, such as TNA for Tree Nut Allergy.
Personalised AQR Codes: The CMS empowers merchants to generate individual Accessible QR codes for each SKU, allowing for a tailored experience that matches specific products.
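Several of the features above can be summarised in one hypothetical data model. The field names and structure here are illustrative assumptions, not Zapvision's actual schema: each SKU carries layered information, per-language variants flagged for review after auto-translation, and pronunciation overrides so acronyms are read aloud as full words.

```python
# Hypothetical sketch of a CMS product record (not Zapvision's schema).
from dataclasses import dataclass, field

@dataclass
class LanguageVariant:
    language: str
    layers: dict                  # layered info, e.g. name / ingredients / usage
    auto_translated: bool = False
    reviewed: bool = True         # set False for fresh auto-translations

    @property
    def publishable(self) -> bool:
        # Auto-translated variants must be reviewed before publishing.
        return self.reviewed or not self.auto_translated

@dataclass
class ProductRecord:
    sku: str
    ean: str                                             # European Article Number
    pronunciations: dict = field(default_factory=dict)   # "TNA" -> "Tree Nut Allergy"
    variants: list = field(default_factory=list)

    def spoken_text(self, text: str) -> str:
        """Expand written acronyms so screen readers speak the full words."""
        for written, spoken in self.pronunciations.items():
            text = text.replace(written, spoken)
        return text

record = ProductRecord(sku="PB-500", ean="5011111111111",
                       pronunciations={"TNA": "Tree Nut Allergy"})
record.variants.append(LanguageVariant("fr", {}, auto_translated=True,
                                       reviewed=False))
```

Modelling the review flag on the variant itself is what lets the UX block publishing of unreviewed auto-translations while leaving the master-language content untouched.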
Throughout the design process, collaboration with Unilever played a pivotal role in refining the CMS's functionality and usability. The result is a comprehensive system that brings together innovative technology and thoughtful design, with a strong emphasis on enhancing accessibility and inclusivity.
The Zapvision project, with the CMS at its core, showcases the power of collaboration and design to create meaningful solutions for real-world challenges.