Delivering mobile app accessibility in cross-platform apps

January 16th, 2023 by Jane Yao

Delivering great mobile app experiences means building for everyone, including people living with a disability. According to the World Health Organization, 15% of the global population lives with some form of disability, yet thousands of mobile apps still lack the accessibility features that would make them usable. A recent research paper from University of Washington Ph.D. student Anne Spencer Ross showed that out of 10,000 apps reviewed, 23% lacked the accessibility metadata required by screen readers.

During my co-op term at BitBakery, I worked on the accessibility features for the BrainFit - Free Habit Tracker app for the Women's Brain Health Initiative. In this post, I'll cover what accessible design is, why it's essential, and how BrainFit uses accessible design to help everyone improve their brain health.

What is accessible design?

When we talk about accessibility in software, we're talking about making products and services usable by as many people as possible. Common examples include screen readers that describe on-screen content, adjustable font sizes on phone screens, and closed captions or described video.

Why is accessible design important?

Accessible technology is inclusive technology. As software designers and developers, we must produce tech that everyone can use, not just a specific group of people. That means taking into consideration people who live with disabilities in various forms. Creating software that is accessible to all people not only benefits users living with disabilities but also opens up markets that otherwise wouldn't be able to use your products or services. 

Supporting screen reader tools

Working with the team at WBHI, we focused on building accessibility into the design and development from day one. During my co-op term, I worked on implementing aria-label attributes on elements throughout the app. 

The aria-label attribute adds an accessible name to elements in the app that don't have visible text, such as a button that closes a modal dialog with only an X or another visual cue to indicate its function. Screen-reading software then reads the aria-label text aloud as the user navigates the app.

The aria-label attribute will also override any text already assigned to an element. For example, a "+" button given aria-label="add new task" is announced as "add new task", not as "+".
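To make the override behaviour concrete, here is a simplified model in TypeScript of how a screen reader picks an element's accessible name. This is an illustrative sketch, not the full W3C accessible-name computation, and the names `UiElement` and `accessibleName` are my own:

```typescript
// Simplified model: an explicit aria-label wins over visible text.
interface UiElement {
  ariaLabel?: string;
  textContent: string;
}

function accessibleName(el: UiElement): string {
  // aria-label overrides whatever text the element displays
  return el.ariaLabel ?? el.textContent;
}

// A close button that shows only an "X" still gets a meaningful name:
console.log(accessibleName({ ariaLabel: "close dialog", textContent: "X" }));
// -> "close dialog"

// Without an aria-label, the visible text is used:
console.log(accessibleName({ textContent: "Save" })); // -> "Save"
```

This is why a well-chosen aria-label matters: whatever string you put there is exactly what the user hears, in place of anything on screen.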

The order in which elements receive focus is defined by a numeric value called the tabindex. Once an element is in focus, its aria-label is read to the user.
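As a rough sketch of how sequential focus navigation orders elements, the snippet below models the standard rule: elements with a positive tabindex come first in ascending order, then tabindex=0 elements in document order. The `Focusable` type and `focusOrder` helper are hypothetical, for illustration only:

```typescript
// Hypothetical model of keyboard/switch focus order, not a DOM API.
interface Focusable {
  label: string;    // the aria-label announced when the element gets focus
  tabIndex: number; // positive values first (ascending), then 0 in document order
}

function focusOrder(elements: Focusable[]): string[] {
  const positive = elements
    .filter(e => e.tabIndex > 0)
    .sort((a, b) => a.tabIndex - b.tabIndex);
  const natural = elements.filter(e => e.tabIndex === 0);
  return [...positive, ...natural].map(e => e.label);
}

const screen: Focusable[] = [
  { label: "close dialog", tabIndex: 0 },
  { label: "add new task", tabIndex: 1 },
  { label: "task list", tabIndex: 2 },
];
console.log(focusOrder(screen));
// -> ["add new task", "task list", "close dialog"]
```

In practice, relying on the natural document order (tabindex of 0) is usually preferable to assigning positive values by hand.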

Challenges with aria-labels and screen readers

The first major challenge that I encountered was related to testing. Each change to an element or the introduction of a new element meant changing our test plan to ensure the labels were read out as intended. 

I quickly discovered that testing on a browser was very inconsistent. The various screen reader tools picked up on different things and prioritized rules differently. Since BrainFit is a cross-platform app, we had to account for multiple screen readers and how they handle screen content. 

A potential solution could be to fix it on a case-by-case basis, but there are too many variations between the tools for that to be manageable. For example, the screen reader on iOS only describes interactive elements like buttons and links; it did not read out aria-labels for non-interactive elements such as plain text.

After researching online and learning about other ways to manage this issue, we decided to use an abstract rule called the text rule. This way, both iOS and Android devices would read aria-labels for interactive and non-interactive elements.
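The exact configuration of that rule is specific to our tooling, but the general idea can be sketched with a small hypothetical helper that builds accessibility props in a React Native style, so that non-interactive text elements are also exposed to the screen reader. The helper name and the `Role` type are my own; only the `accessible`, `accessibilityLabel`, and `accessibilityRole` prop names follow React Native's conventions:

```typescript
// Hypothetical helper (illustrative, not from the BrainFit codebase):
// mark an element focusable by assistive tech and attach a label, so
// both interactive and non-interactive elements are announced.
type Role = "button" | "link" | "text";

interface A11yProps {
  accessible: true;            // element is focusable by assistive technology
  accessibilityLabel: string;  // text the screen reader announces
  accessibilityRole: Role;
}

function a11yProps(label: string, role: Role = "text"): A11yProps {
  return { accessible: true, accessibilityLabel: label, accessibilityRole: role };
}

// Interactive element:
console.log(a11yProps("add new task", "button"));
// Plain text element still gets a label, so iOS reads it too:
console.log(a11yProps("weekly progress summary"));
```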

Integrating third-party libraries

Another significant challenge was finding third-party libraries and tools that had accessibility features. Unfortunately, many of these tools have little or no accessibility features built-in. We used one of these libraries to produce graphs in the app, but it didn't allow screen readers to read the graph text to the user.

My initial idea was to use the graph label to feed text to the screen reader. I implemented an onClick function on that label so that each click updated it with the next entry on the graph, which the screen reader would then read out. Another idea we looked into was superimposing divs across the chart. Each div could then be tab-focused, allowing the user to navigate the graph by moving between them.
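The first workaround can be sketched as a small closure that steps through the chart's data points and returns the text the label should display and announce on each activation. The names here (`DataPoint`, `makeChartNarrator`, and the label format) are illustrative, not the actual BrainFit code:

```typescript
// Sketch of the click-to-advance label workaround.
interface DataPoint {
  day: string;
  value: number;
}

function makeChartNarrator(points: DataPoint[]) {
  let index = -1;
  // Each activation (click/tap on the label) advances to the next data
  // point, wrapping around, and returns the text to display and announce.
  return function onLabelClick(): string {
    index = (index + 1) % points.length;
    const p = points[index];
    return `${p.day}: ${p.value} habits completed`;
  };
}

const next = makeChartNarrator([
  { day: "Monday", value: 3 },
  { day: "Tuesday", value: 5 },
]);
console.log(next()); // -> "Monday: 3 habits completed"
console.log(next()); // -> "Tuesday: 5 habits completed"
console.log(next()); // wraps around to "Monday: 3 habits completed"
```

Because the label's text changes, the screen reader re-announces it, effectively letting the user step through the graph one value at a time.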

What's next?

I learned a lot about accessibility through navigating these development and testing challenges. More importantly, I learned more about how challenging it is to be a person living with a disability who has to use technologies like screen readers to be able to navigate apps. As I continue my studies and work toward my career as a software developer, I'm focused on designing with accessibility in mind—especially at the beginning stages of project planning.

Laying down a good foundation that allows accessibility features to be implemented and improved on is critical. I now ask questions early to make it easier to add and improve accessibility features: how do people with visual, auditory, or motor impairments navigate and interact with the app? Keeping these questions in mind improves apps not only for people living with disabilities but for everyone.

BrainFit - Free Habit Tracker is available today on the Apple App Store and the Google Play store.
