The Aural User Interface

My name is Glenn English. A couple of years ago, a friend took a job at a school for the blind and visually impaired, working with ... blind students. He mentioned that it was very hard and frustrating for the kids to learn to use the computers at the school, even with the high-tech screen reader they have for them. Well, of course it's hard — the computers run a Graphical User Interface (GUI), and these kids can't see. Duh!

It seems to me that a screen reader wouldn't be much help to people who've never seen a GUI, because it's describing something they've never experienced and, therefore, don't understand. A screen reader describing the contents of a GUI to a blind user is much like someone trying to describe to me what a shark senses when it's tracking a fish by the prey's electrical fields — there's no way I can understand what sensing an electrical field is like, no matter how good the description is.

What the kids can sense, most of them quite well, is sound. Therefore, to this sound engineer turned computer programmer, an Aural User Interface (AUI) seemed like it might be the solution.

I'm a retired computer geek, and it makes me sad that a significant portion of humanity has such a tough time playing with these delightful toys. And I have a good friend with a problem that I could fill some of my "Golden Years" trying to help solve. So that's what I'm doing.

AUI is very far from finished, and there are many omissions and, dare I say it, bugs. Its design and construction are very modular, and it's being written in such a way that another programmer could add functionality to it. If you're interested, download the source code tarball (an Apple Xcode 3.1 project folder) and/or contact us at

Here's what it can do and how it works, so far:

Real Soon Now, it's going to send and receive email, surf the web, create and edit text files, tell stories, and read books.

The point of all this isn't that AUI is a fancy computer — it's far from that. The point is that it talks to a blind user. It's not trying to describe what'd be available if the user could see; it interacts with the user in such a way that vision isn't necessary. It's an Aural User Interface.
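To make the idea concrete, here's a minimal sketch of what "talks to the user instead of describing a screen" means in code. This is not AUI's actual source — the command names and replies are invented for illustration, and `speak()` just prints what would really be handed to a text-to-speech engine (on a Mac, something like the `say` command):

```python
def speak(text):
    """Stand-in for a text-to-speech call; a real AUI would
    hand this string to a speech synthesizer, not a screen."""
    print(text)
    return text

def aui_loop(commands):
    """Respond aurally to each command the user gives.
    There is no display to describe -- every response is
    something the program says out loud."""
    responses = []
    for cmd in commands:
        if cmd == "mail":
            reply = "You have no new mail."
        elif cmd == "read":
            reply = "Which book would you like me to read?"
        else:
            reply = "I don't know how to " + cmd + " yet."
        responses.append(speak(reply))
    return responses
```

The design point the sketch illustrates: nothing here translates a visual layout into words; the program's only output channel is speech, so vision never enters into it.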