How can Microsoft Copilot transform digital accessibility?
May 16 is Global Accessibility Awareness Day (GAAD), an annual observance designed to kick-start important conversations about digital accessibility and inclusion. As a leading tech provider, Avanade is focused on embedding accessibility as a core requirement in every solution we create – and on always seeking new ways to innovate for a more inclusive digital world. One of the most exciting frontiers for digital accessibility is generative AI assistants such as Microsoft Copilot. In honor of GAAD, our Chief Inclusion & Talent Officer Hallam Sargeant sat down with Principal AI Researcher Diana Wolfe – who has spent the past year researching AI’s impact on the future of work – to ask how Copilot can accelerate accessibility in the workplace.
There are more than one billion people with disabilities across the globe – and yet a 2020 report found that 98% of web homepages contain at least one accessibility error, from low-contrast text to empty links. How can AI tools like Copilot help tackle these barriers to digital accessibility?
A good place to start thinking about this question is, “Where can Copilot help lift some of the onus off the individual?” In our internal research on Copilot for Microsoft 365, we found that AI tools reduce the occurrence of common errors by about 50%. That’s really helpful when you think about it through the lens of accessibility standards. One of the ways Copilot for Microsoft 365 has been successful on this front is by automating the detection of accessibility issues during the writing or design process – things like color contrast, font size or missing alternative text – and suggesting fixes. You can even link to your choice of accessibility guidelines, and Copilot for Microsoft 365 will abide by them and check your work. AI is really good with parameters. Give it clear parameters during the design process and it can take on work that previously meant running through an accessibility checklist by hand – often in ways that surprise us.
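To make those checks concrete, here is a minimal Python sketch of two of them: the WCAG 2.1 contrast-ratio calculation and a scan for images with no alt attribute. This is an illustration of what automated accessibility checking involves, not Copilot’s actual implementation; the thresholds follow the public WCAG guidance, and the class names and sample HTML are made up for the example.

```python
# A minimal sketch of two automated accessibility checks: the WCAG 2.1
# contrast ratio and a scan for <img> tags missing an alt attribute.
# Standard library only; not tied to how Copilot implements its checks.
from html.parser import HTMLParser


def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) tuple of 0-255 values, per WCAG 2.1."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio; 4.5:1 is the minimum for normal-size body text."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)


class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "<unknown>"))


if __name__ == "__main__":
    # Light grey text on a white background: roughly 2.96:1, below the 4.5:1 minimum.
    print(round(contrast_ratio((150, 150, 150), (255, 255, 255)), 2))

    checker = MissingAltChecker()
    checker.feed('<p><img src="chart.png"><img src="logo.png" alt="Company logo"></p>')
    print(checker.missing)  # ['chart.png']
```

Rules like these are the “clear parameters” Diana describes: once they are explicit, an assistant can apply them continuously instead of a person working through a checklist by hand.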
The way you’re describing the tool makes it sound almost like a personal assistant, which is often how we talk about Copilot at Avanade. What are some of the ways AI assistants can shape a more inclusive working environment for people with disabilities?
I’ve been gathering many stories out of Microsoft, and here’s one of my favorites. There was an employee who is visually impaired, and she was able to design and edit slides in PowerPoint for the first time in her life by using Copilot for Microsoft 365. The tool allowed her to use dictation instead of going to another team member or designer. That is a revolutionary shift for that worker! Even one gain like this is a massive milestone for accessibility. It shows the potential for these tools to transform the work of individuals across the board, no matter who they are.

Another benefit is in the way Copilot affects team dynamics and communication. Initially, one of our biggest concerns with Copilot was that it might lead to more isolation for users – you won’t need to go to your coworker and say, “Hey, where’s that document you sent?” because you can just go to your Copilot. Surprisingly, team communication and collaboration were not significantly negatively impacted, and in some cases their scores went up because people were asking more strategic questions of their teammates. When mundane tasks are taken away, you can focus on what the other person is really good at. The person becomes more important, and that contributes to belongingness.
I'm also curious to learn more about how Copilot can make technology more accessible for the neurodiverse community. Can you talk us through an example?

You know, one of the standout features of Copilot for M365 that we've seen make a huge difference, especially for our neurodiverse users, is its ability to simplify daily information overload. For instance, think about how exhausting it can be to keep up with the back and forth of a long day filled with calls. Copilot steps in to transcribe these conversations, which is incredibly beneficial not just for helping with verbal processing but also for auditing the transcriptions later, when your mind is fresher. It’s such a relief, right?

Another major win with Copilot for Microsoft 365 is how it tackles the sheer volume of information we all face every day. You can just hop into the M365 chat and ask, "What are three things I said I would do today?" This feature really highlights the essence of what it means to lighten the executive functioning load. And it’s not just about individuals who might need extra support, like those with ADHD – this is something that benefits all of us. By focusing on these aspects, Copilot not only supports specific needs but improves work life for everyone.

Here's something else I've heard from a user that really brings this home: “For individuals like me, who face challenges in processing large volumes of information and learning new tools, bite-sized learning sessions and visual aids within Copilot for Microsoft 365 would be immensely helpful." This shows us how Copilot is not just a tool but part of the learning journey, making the adoption of new technology an exciting and enriching experience.

And here’s the kicker: when we design these tools with everyone in mind, they don’t just help those who need them most – they improve everyone’s experience. Taking some of the load off isn't just a necessity for some; it’s a benefit for all. This universal design approach in technology isn’t just a good practice; it’s a game changer.
How does Avanade work to eliminate potential gaps in accessibility when we're building solutions for clients?

I’m proud to say we’ve been researching Copilot for more than a year at this point, which is wild. We have a ton of knowledge now, and it’s really helpful to understand more about how Copilot impacts variables like belongingness, trust and neurodiversity. But some of the gaps remain hidden to us. An example is if you prompt Copilot and say, “I'm visually impaired, please give me dictation of this message,” and it replies with something like, “I'm sorry you're blind.” That kind of response shows bias from the tool. How do we train that out of it? Whose responsibility is it to do that?

Any tool we create needs to be trained. There are norms embedded in the data sets these tools operate from, and they can start to create outputs that look like bias, so it is our job to audit this. It's our job to keep a constant, iterative feedback loop with the AI tools whose adoption we promote within our organization and with our clients. We have been proactive in many ways with our comprehensive Responsible AI program.

But it comes back to listening. Right? If we go to a client and say this tool is going to solve everything, and then it makes someone feel less included or marginalized, what are we really solving? Whose life are we making easier? So we have to be transparent and honestly say this is a work in progress, and show all the work we’ve done so far to understand where AI tools can go right and where they can go wrong. We tell our clients the truth when we say we have some of the answers, but there are times when we will be able to lead and others when we will be walking beside you on your journey to adopting AI in your workplace.
I know many of us are passionate about creating a more inclusive digital environment. As a user, what are some of the ways I can deploy Copilot to make sure my content is accessible for everyone?

In the early days of Copilot, we thought if you could give a prompt perfectly, you’d get a perfect result. But that proved untenable – it has to be a back-and-forth conversation with the tool. You can say, “Make this deck adhere to these accessible design standards,” but if you don’t go back and audit its work, there’s harm that can be done. We have to stay vigilant and always keep a human in the loop when we’re designing for human needs.

I also want to say that I see a lot of fear around AI. It’s true across all populations, but individuals with disabilities may feel like, “I need additional assistance at times, so is a tool that’s more efficient at doing some of my functions threatening to my job?” Having open conversations about fear of AI is so important. Just know that if you have fears about these tools, you are not alone.

I asked some of our users the following question: “What advice would you offer to peers using Copilot for M365?” Here are their responses:

“To my neurodiverse peers stepping into the world of Copilot, my advice is to embrace its capabilities gradually. Start by integrating it with the tools you use the most.”

“If you struggle to stay focused, especially in long meetings or trainings, the transcription feature of Copilot for M365 could be a big help. It's been a game-changer for me, helping me stay on track by providing clear summaries of what's being discussed. This turns potential distractions into productive moments.”

"Learning a new tool like Copilot for M365 can be daunting, but don't let the initial fear overwhelm or hold you back. It might take some time to sort and organize information in a way that makes sense to you, but the payoff is worth it.”

I highly recommend you experiment with an AI tool in your daily life. One of the things that’s really helpful for getting past fear is finding something you really don’t like doing – for instance, I really hate creating a to-do list – and having an AI assistant do it for you. See if it makes your life a little bit easier and gives you more time to focus on purpose-driven work.