Visual Studio Core Team's Accessibility Lab Tour Activity

What's the Accessibility Lab?

The Accessibility Technology Group on campus has a lab with numerous licensed versions of Assistive Technologies (ATs), including Motion Trackers, Screen Readers, Screen Magnifiers, Braille Displays, Speech Recognition, and so forth. Their lab is open to any team or person on the Microsoft campus interested in using Assistive Technology. Using their lab is so straightforward that I was a little embarrassed I had asked Mike, their lab manager, to "hold my hand" for the first few OS installs. But Mike is cool and he didn't mind.

Why did we go there?

I wanted my team to use different Assistive Technology devices to perform basic Visual Studio tasks, both to increase accessibility awareness and to show how Assistive Technologies and Visual Studio work together.

What did we do?

We focused mainly on Screen Readers and Screen Magnifiers, plus a couple of Motion Trackers to really open people's minds. Due to the nature of our product, Visual Studio's targeted disability persona is someone who is blind or has low vision. We're a visual product, hence "Visual" Studio. We don't rely exclusively on sound to convey information: sure, you might hear a beep when there's a build error, but you'll also see a build error message box, squiggles, and errors in the task list. A person who is blind or has low vision usually uses the keyboard exclusively, so by supporting them we're already working on the support for people who cannot use a mouse or who require a specialized keyboard.

I installed Whidbey on 4 different screen reader stations, 2 motion tracker stations, and 2 screen magnifier stations, and asked people to form teams of two at each station. Their task was to create an accessible application using Assistive Technologies. They were given hard copies of the MSDN whitepaper Walkthrough: Creating an Accessible Windows Application to better simulate a real user experience. Some reported that the walkthrough was useful; others said they preferred the unstructured exploration. The important part was that everyone worked with an Assistive Technology device.
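
If you're curious what that walkthrough-style exercise boils down to in code, here's a minimal sketch of the kind of settings it has you make on a WinForms form: accessible names and descriptions, a sensible tab order, and keyboard mnemonics. The control names and strings below are my own, not the whitepaper's, and the actual walkthrough covers more than this.

    using System;
    using System.Windows.Forms;

    public class ExerciseForm : Form
    {
        public ExerciseForm()
        {
            // Give the form itself something meaningful to announce.
            Text = "Accessibility Exercise";
            AccessibleDescription = "Sample form for the accessibility lab exercise";

            Label nameLabel = new Label();
            nameLabel.Text = "&Customer name:";   // Alt+C moves focus to the next control in tab order
            nameLabel.UseMnemonic = true;
            nameLabel.TabIndex = 0;
            nameLabel.Location = new System.Drawing.Point(10, 15);

            TextBox nameBox = new TextBox();
            nameBox.AccessibleName = "Customer name";
            nameBox.AccessibleDescription = "Name of the customer placing the order";
            nameBox.TabIndex = 1;                 // a logical tab order is what keyboard-only users rely on
            nameBox.Location = new System.Drawing.Point(130, 12);

            Controls.Add(nameLabel);
            Controls.Add(nameBox);
        }
    }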

By the way, the folks who worked with screen readers had 5 minutes to get oriented, and then I came by and turned off their monitors.

Creating a Hello World app without a mouse or keyboard

While we waited for the rest of the participants to arrive, one of our teammates, Boyan, demoed how to create a WinForms application with a button on the form that, when pressed, displayed a "Hello World" message box. He did all of this using a Motion Tracker device. You might wonder what the big deal is, but if you've never worked with one, a motion tracker is designed for someone who is paralyzed or partially paralyzed. To use this particular motion tracker, you wear a silver circular sticker on your nose (so you don't forget to take it off). The Motion Tracker follows the sticker, and the cursor basically goes where your nose goes. To click, you hover over a floating tool window on the desktop that has buttons for Single Click, Double Click, Left Button, Right Button, Scroll, and so on. If you hover for a second or two, the corresponding button on the floating tool window is pressed, so you can simulate that mouse action. There's also an on-screen keyboard that lets you type by using the same hover technique to press each key. According to Boyan, the most difficult part was typing. As people came into the lab and gathered behind him, you could see everyone slowly start swaying, trying to get him to turn his head ever so slightly. A lot of people were amazed to see this technology used with our product.
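
For reference, the app Boyan built is about as small as a WinForms app gets. Here's a rough reconstruction of it (my own sketch, not his actual code); the impressive part wasn't the code, it was that every step, from dropping the button onto the form to clicking it, was driven by the motion tracker.

    using System;
    using System.Windows.Forms;

    public class HelloForm : Form
    {
        public HelloForm()
        {
            Button helloButton = new Button();
            helloButton.Text = "&Say Hello";           // Alt+S presses the button from the keyboard
            helloButton.AccessibleName = "Say Hello";  // what a screen reader announces for the button
            helloButton.Click += new EventHandler(OnHelloClicked);
            Controls.Add(helloButton);

            AcceptButton = helloButton;                // Enter also presses the button
        }

        private void OnHelloClicked(object sender, EventArgs e)
        {
            MessageBox.Show("Hello World");
        }

        [STAThread]
        static void Main()
        {
            Application.Run(new HelloForm());
        }
    }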

Our Guest Speaker

Almost an hour later, after everyone had a chance to really work with an Assistive Technology and try to perform whatever task they were working on, we had a guest speaker stop by. Mia, a tester from the Accessibility Technology Group, uses screen readers to perform her job. She sat down with us and I walked her through the same exercise that Boyan had completed earlier with the motion tracker. Whenever Mia wasn't sure where focus was in the IDE, I explained where she was, and I asked her what sort of information we could provide in those states. For example, our previous out-of-box state put focus on Solution Explorer. Since Solution Explorer is an empty tool window in that state, because no project is opened by default out of the box, the screen reader only said, "Solution Explorer." Mia immediately tried to tab around, but we all knew nothing was going to happen. I looked at my team shaking their heads, like, "Wow, we have some work to do." Whenever I could, I glanced at the audience. Some had looks of surprise, like, "How can she listen to her screen reader that quickly?" Others had looks of embarrassment. And a few had looks of determination to make our IDE more accessible. Mia had a captive audience the entire time. After 30 minutes, I had to end my first semi-usability study, because it seemed people would have kept Mia there for another hour just asking questions.
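
At the WinForms level, the kind of fix Mia's feedback points at is giving a window something useful to say about its empty state. This is only a sketch of the general idea using a plain Panel as a hypothetical stand-in for an empty tool window; it is not how the Visual Studio shell actually exposes Solution Explorer, and whether a particular screen reader reads the description aloud is up to the AT.

    using System.Windows.Forms;

    // Hypothetical stand-in for an empty tool window.
    public class EmptyToolWindowPanel : Panel
    {
        protected override AccessibleObject CreateAccessibilityInstance()
        {
            return new EmptyStateAccessibleObject(this);
        }

        private class EmptyStateAccessibleObject : Control.ControlAccessibleObject
        {
            public EmptyStateAccessibleObject(Control owner) : base(owner) { }

            // Instead of announcing only the window name, explain why the
            // window is empty and what the user can do next.
            public override string Description
            {
                get { return "No project is open. Open or create a project to see its contents here."; }
            }
        }
    }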

Did the activity have an effect?

While I was walking down the hallway back to my office, I heard one of our devs who had attended the activity explaining to another dev who hadn't how he needed to improve the accessibility of his feature.

Thank-Yous

A big thank you to Mia for being our guest speaker. Your demo was the highlight of the activity.

Another big thank you to Mike, the lab manager, for his help setting up the lab.

And one more big thank you to Matt, from the Whitehorse team, for being the first one from our division to organize such a tour for all the Accessibility QA Representatives. If Matt hadn't taken the initiative, there's a good chance this activity wouldn't have come together.