Can anyone who's using the new update explain what Narrator can and can't do? Does it let you use the built-in apps and apps from the Windows Store, and does it offer gestures for easy navigation? I am totally blind and use an iPhone as my main phone due to its accessibility. I bought an old Nexus 7 to test Android accessibility and wonder if it is worth buying a Lumia 520.
Without a phone in hand, information seems kind of sparse. The Narrator in desktop Windows doesn't seem all that impressive (wouldn't want to kill the 3rd-party screen reader market, I guess).
From a development point of view, devs still need to add accessibility labels to their apps and use the other accessibility APIs available. I have no idea how easy or hard that is on Windows Phone. iOS is easy enough that a dev would have to be borderline negligent not to type a few descriptive labels while they're doing the UI. Android isn't too bad, but it doesn't have as much infrastructure as iOS for gestures and the like (I'll admit to being a much weaker Android dev than iOS dev, so maybe I've missed something).
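To make the labeling point concrete: on iOS it's basically one accessibilityLabel per control, and Android's rough equivalent is contentDescription. A minimal Android sketch, with made-up layout, view ID, and string resource names:

    // Labeling an icon-only button so TalkBack doesn't announce it as an
    // unlabeled button. All R.* identifiers here are hypothetical.
    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.ImageButton;

    public class PlayerActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_player);

            // The Android analogue of an iOS accessibility label.
            ImageButton playButton = (ImageButton) findViewById(R.id.play_button);
            playButton.setContentDescription(getString(R.string.play_description));
        }
    }

The same thing can be done declaratively with android:contentDescription in the layout XML, so it really is the kind of label a dev can add while laying out the UI.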
Point being, even if the WinPhone accessibility APIs are fully fleshed out, 3rd-party apps still need to support them, and I haven't seen a big push from Microsoft to do so. Contrast that with iOS, where devs get the accessibility spiel at WWDC and Apple seems to push it elsewhere. Android, meh, seems kind of tacked on IMO, though I guess it works well enough (fully-sighted user here, so I'm not the most qualified to say, and I'd be interested to know how the Nexus 7 works out for you).
Sometime in the next week I should get a chance to play with a friend's phone that I'm sure he'll have upgraded to 8.1. Again, I'm fully sighted but have been the "accessibility guy" across several companies (including a stint as accessibility lead at MSFT), so I at least know what to look for. I'll ping you if I have anything to report.
I'd appreciate any info once you get a chance to play with your friend's phone. The Nexus 7 is a bit below where the iPhone 3GS was when it was released. I'm not sure how much of that is me having a harder time with larger-screen devices, though; I find an iPad much more difficult to use than my iPhone. I'm probably going to get a used Nexus 4 dirt cheap and will actually try using it as my primary phone for a while. I should really see if the newest version of Android Studio is still completely inaccessible. One nice thing about Android is that it is quite easy to program for when blind using the Eclipse-based tools.
Wow, I just tried the latest build of Android Studio (0.5.something) out of idle curiosity. Even on Mac OS X, which from my POV has great accessibility at the OS level, you just get big containers without any visibility into the items inside. As a sighted user, this annoys me as well, because it (usually) means there are no keyboard shortcuts to work within those containers. Contrast that with Xcode, where there's a shortcut for everything and VoiceOver seems to be much more informative.
For all my complaints about Eclipse, you're right, it seems to be much better for accessibility than Android Studio. Too bad sometime down the road Google is going to switch to Android Studio (granted, Android Studio is worlds better in a lot of other respects). Though at the rate they're going, it should be a while. :-)
I don't have the hardware, but Narrator works in the emulator. A link to the manual still isn't available, so I don't know the full list of features. There are two modes of operation: in one of them it activates controls in the same places where you touch them; in the other it represents the content of the screen as a tree for navigation, where swiping left/right activates the previous/next control and swiping up/down changes the level of the tree. The problem is that I don't understand what triggers the switch between modes; I did it in the emulator unintentionally. In either mode it reads the content of the activated control; a double tap anywhere on the screen uses the active control, and some controls have an additional action on triple tap. It's possible to use two-finger gestures to scroll vertically and horizontally, and it announces the percentage after performing the gesture. Scrolling works at a specific level of the navigation tree, which can sometimes be confusing.
Common controls are all accessible.
The on-screen keyboard is shitty: every key must be activated before use, so typing is probably very slow.
In the system tray area it could read every icon (network signal level, battery, clock...), but I could not find a way to open Action Centre.
The application switcher is bad; I could not find a way to use it properly. The button for closing apps works, but it's announced as a generic button without text.
Accessible standard apps: Alarms, Data Sense, Internet Explorer, Maps, Messaging, People hub, Phone dialer, Photo hub, Store.
Inaccessible standard apps: Battery Saver, Calculator, Calendar, Office, OneNote, SkyDrive, Skype, Storage Sense.
Standard apps where many buttons are unnamed, but blind usage is probably possible: Camera, FM Radio, Games hub, Xbox Music.
The emulator contains old versions of everything, so a real phone could be different.