Thursday, April 28, 2022

The Second Edition of Personal Power

Hello everyone. It's been a long and strange couple of years. A whole lot of people out there seem to have found the first edition, with all of its many imperfections, to be of some help. Given the very positive feedback and so many earnest hopes expressed that I would update this guide, I have indeed done so.

For any newcomers, welcome. My objective was to create an ebook to teach totally blind owners of iPhones how to make the most out of their devices. I go right from the beginning with a section on setting up your device even if you're totally blind and have never felt an iPhone before. From there, I explain the VoiceOver screen reader built into every iPhone. I also explain the iOS operating system and much else besides. One major objective was to point beginners to more accessible apps so that their starting experience would be less full of frustrations. Hopefully, I've managed to offer something which will be informative and enjoyable.

As before, the guide is completely free to all. I encourage you to share it with whomever you wish. Provided the guide remains unaltered and freely given, I have no issue with this. If anyone is willing to make the guide available in other formats, such as DAISY, Braille, etc., you have my sincere thanks for making that effort. I would request that you share that effort with me so I can make the files available to everyone along with the other versions of the guide.

I have placed copies of the second edition of the guide in four different formats. For most readers, I recommend the EPUB version. It has an interactive table of contents and should be the best in most cases. Many people requested a version in Microsoft Word, so I've put the DOCX version in the directory. I've also created a PDF version thanks to that option being available in Pages. It works well in the Voice Dream Reader app, but I think EPUB is still better. The final version is a plain text Markdown format version of the guide. In it, there is sadly no table of contents. However, you can search for headings denoted by one or more number signs or hashes. This could be helpful for people who find that the EPUB version fails to work for them. I also hope that people will find ways to use this, or the other files, in the creation of versions with active tables of contents, or other advantageous format aids.

For ease of use, I have placed these files in a shared Google Drive folder. I hope this method of distribution will prove less troublesome for people than prior attempts. Here are direct links to the various files:

Thanks to the efforts of Harrison Tu, we now have a zipped MP3 audio version of the guide. It's recorded using an ElevenLabs female voice at a speed which is easily understandable. The link to the zipped file is:

DAISY audio format: This is for people using DAISY players. It is read by synthetic speech at a slow rate, which people can increase using the player controls if they wish.

EPUB version: This has an active table of contents and is recommended for most people.

Microsoft Word format: People find this useful for making other formats.

PDF format: This seems to work reasonably well and has an active table of contents.

Markdown plain text: This version has no table of contents but has number signs or hash marks indicating headings.

Braille .brl UEB format: This should preserve formatting and is likely the best for Braille readers at present.

Braille .brf format: This doesn't preserve formatting but might be preferred by people who don't like UEB.

Single web page .mh format: This will open as a single page with headings.

A TouchScreen Unseen: This was a lecture I presented at the 2020 Connecting The Dots conference.

The link to the folder is:

Wednesday, March 6, 2019

The Camera Which Helped The Blind:

What Is This?

Identifying Objects, People, and More with Apps and The Camera in Your iOS Device:

Handkerchiefs are splendid things to have handy when you need to wipe off messy fingers so you can then use your iPhone without getting guck all over it. They have made my day more bearable on many occasions when I couldn't come up with facial tissue when allergies got the better of me. They are, however, the most unbelievably frustrating objects on Earth to try finding once dropped. They make no sound at all when landing on apartment carpet, or even hard floor for that matter. I had been given a dozen handkerchiefs and have slowly lost most of them over the past six or seven years. Eventually, I would step on the thing or else vacuum it up by mistake and have to dig it out of the fluff, if I even had the good fortune to realize this had happened. Of course, there was always the possibility that it had fallen on hard floor and was now a hazard awaiting my unwary footfall to send me slipping or sprawling headlong into an apartment wall. At least, this time around, I was pretty certain that the handkerchief was lost somewhere in the apartment. I couldn't be certain how many days ago it had fallen from my pocket, though, so it could have been in any number of places.

There are, of course, time tested methods for dealing with searching for dropped objects. The grid pattern search is the most thorough and laborious. I resigned myself to goodness knows how many minutes or hours of doing such a search of my entire apartment by feel. And then… it dawned on me.

Around the time I got my first iPhone, my father had gotten an iPad. We had chatted using the FaceTime app, and I remembered that I could switch it to use the rear-facing camera. Rather than fruitlessly crawling around the floor searching, I called my father on FaceTime. He began telling me where to point the camera so he could see the apartment floor and the furniture where the handkerchief could have fallen. It took him no more than a couple of minutes to look over the floor through my phone's camera and tell me where to reach down and retrieve it. He didn't even have to leave his living room to help with finding things or reading printed documents. A few times, he even helped me restore my obstinately silent computer to working status by telling me where to point the camera and what was on the screen.

Finding my lost handkerchief was my first mind-blowing experience of how very useful the tiny camera in the corner of my iPhone could be to blind people. Over time, it became habit to FaceTime my father whenever I needed his eyes for tasks requiring working vision. For quite a while, that was where things basically stood regarding useful applications of the camera. It took time for the technology to get better and for Apple to give enough camera control to app developers before the really awesome stuff started. Slowly, the first specially built apps began to appear, designed specifically to help blind and visually impaired people through use of the camera. These apps made use of artificial intelligence, big data, and the ability to connect people in need of sighted help with sighted people willing to help when called. In this section, we'll look at the most popular apps which attempt to bring aspects of sight to blind people through the technology built into every iOS device.

Object Identification:

Sadly, not all blind people have sighted friends or family in their lives who are willing to lend their eyes when needed. There are also times when, for instance, due to the lateness of the hour, it would be inadvisable to call upon one's friends just to quickly find out what kind of cake you're pulling from the fridge for a midnight snack. Being able to have the benefits of sight for crucial moments here and there, without inconveniencing people, has always been a strongly held desire among blind people. Tapping into the capabilities built into your iOS device, app developers have now largely made this dream a reality.

Digit-Eyes:

Apps such as Digit-Eyes attempt to identify objects using the bar codes most products are now adorned with. The user simply moves the camera over the surfaces of an item until the software detects the bar code. The app then consults a database, usually online, where the bar code is matched to an entry, and the blind person is informed what the object is. Being managed online, the database of bar codes an app like Digit-Eyes can tap into is always kept up to date and takes up no storage space on your device.

Although still available in the app store, Digit-Eyes hasn't been updated in over two years. It has existed for nearly a decade and was the first app of its kind that I experienced. The database of codes which the current version can identify stands at over thirty-seven million. The ability to purchase and/or create labels to attach to objects, and to record or type text which is spoken when the labels are encountered, might well make this somewhat older system a good choice in some circumstances. Since the text or recordings you create are stored on your device, this system can work even when you're not connected to the Internet. This approach may also be less intimidating and easier to master for people less skilled with technology or fearful of giving away information to large corporations. It is, however, quite slow by the standards of today's apps.

Tap Tap See:

Another approach is used by apps such as Tap Tap See. Rather than relying on a bar code, which can be hard to find on some objects, the app assists blind people in taking a good enough picture of the object to allow artificial intelligence or a human assistant to identify it. The user is then informed, usually within seconds of taking the picture, what the object is. The app can also detect when an object is clearly in focus and emit a short beep to let the blind person know. To enable this feature, go into the "more" tab and then double-tap the "settings" button to access the app's configuration area. This can help tremendously if you have never had occasion to take a picture before. Tap Tap See can also analyze up to ten seconds of video in order to identify objects in the camera view.

Tap Tap See was king of the proverbial block for quite some time. It was once a paid service where you had to keep re-stocking the number of pictures you could have described for you. It has long since become free for all users to use as much as they like, and it still comes in handy every now and then.

For my wife Sara and me, Tap Tap See was well worth paying for photos while that was actually necessary. To be able to pull out one's phone, snap a picture of a bottle of beer, a box of crackers, or a cup of noodles and find out what kind it was within seconds: that was absolute, unadulterated magic. No longer was it necessary to label so many things that would only be used once. No longer did we have to be so very careful about where we put which soup cans. Provided we had our iPhones, we could make informed choices without having to open something to find out what it was.

KNFB Reader:

A Reading Revolution In Your Pocket:

Being able to read print easily wherever you might happen to encounter it has been another long-standing, ardent wish of many blind people. My second truly mind-blowing experience with the camera on my iPhone happened on September 18, 2014, the same day as the Scottish referendum. Thanks to the generosity of John Morgan, a philanthropist whom I've had the honour and good fortune to call a friend, I felt able to afford the most expensive app I've ever acquired. For quite a while, I had been hearing about a remarkable new app called KNFB Reader. An early demonstration of its capabilities involved someone taking a picture of a screen at the front of a large auditorium and having the print on the screen read out perfectly. This app could also help guide your hand as you positioned your iPhone above a sheet of paper to get the optimum shot for best recognition results. It was one of those things that felt either too good to be true or too revolutionary and life-changing to be available to ordinary people like me.

I was using an iPhone 5S by this point, so I had the capability to run it. People were almost too ecstatic about what it could do. I heard stories of people effortlessly photographing even large poster-sized documents and getting nearly perfect OCR results. People talked of snapping pictures of signs on the street and learning what they said. It all sounded too good to be true. The price of over $100 Canadian certainly gave me pause. I could buy a good portion of a month's groceries or a lot of Kindle books with that sort of money. I would, in all likelihood, have ultimately made the purchase anyway. However, what decided me that day was an email and donation from John Morgan. He had heard of this remarkable app and wondered if I'd be interested in taking it for a spin and telling him what I thought of it. Naturally, at that point, I agreed. I purchased the app from the app store without difficulty, and it installed without any issues.

I opened the app and read through the instructions. For an experienced user like me, or even a beginner, it all struck me as very simple and well thought through. I could use a feature called the "field of view report" to get a sense of how well a document was in the camera's view. Once I had it in good position, I could take a picture and the document would be read out to me, apparently within seconds of taking the picture. Testing it out on some smaller flyers I had gotten in the mail, I happily found that the app was as good and easy to use as people said it was. To really put my new app through the wringer, I found a large poster-sized paper that the Ontario government had sent me. I had used my older version of Kurzweil 1000 and the OCR scanner on my desk to partially scan the document. It wouldn't all fit in the scanner, but I had gotten enough read out to know it wasn't something I needed to wrestle with in order to hear the rest. However, if this new app worked like people had raved, it wouldn't be nearly so hard and time-consuming to get the full document readable.

I laid out the large sheet on my dining room table and made certain the room light was on. For once, it would actually be helpful to me rather than to the occasional sighted visitors I had. Using the field of view feature, as well as vibrations which gave me a sense of how tilted my iPhone was, it took a number of attempts before I found a position which let the camera take in the whole paper. Moving my iPhone farther above the paper and then getting another report, I was able to home in on the perfect position. Then, I carefully double-tapped the "take picture" button. The camera sound played and I waited, hoping I hadn't disturbed the position of the iPhone by tapping too hard. It took around ten seconds. I was just beginning to wonder if I had crashed the app when a synthetic voice began reading the document. The contents were about as deadly dull and unimportant to my life as humanly possible. Nevertheless, I was absolutely spellbound. The paper was read out perfectly. I had never read anything like this without a number of OCR recognition errors. You'd encounter them at least once every couple of sentences: a few nonsense characters, or letters instead of numbers. It had been this way since my days in secondary school when I had gotten my very first scanner. That still happens even with KNFB Reader, but it's a lot less frequent. On that day, however, the stars must have been right. I stood there at the table, utterly amazed, waiting for a mistake that never came as perfect sentence followed perfect sentence. I couldn't help but think back to the weekend I had spent scanning a copy of The Elements of Style so I'd have it in time for my creative writing class in university. A good portion of my ongoing, perhaps unfair, seething hatred of that book can be traced back to the wrecked weekend of effort it took to get a far from perfect but useable copy scanned into my laptop one page at a time. How much easier and less painful it would have been with an app like KNFB Reader.

The implications for today's students are absolutely profound. It might actually be useful for them to go to a library, borrow a book, and find the information they need in it. They could read through forms and merely need sighted help to fill them in properly. All of the work of character recognition happens on your own device. This means that your data is kept absolutely private and doesn't need to travel anywhere for processing. You can also export the text to other apps and share documents when you want to. The vibration feedback and the field of view report features make the process of learning to take good pictures of sheets a lot easier to master than with apps which cost less.

Compared to other apps in the app store, KNFB Reader is one of the most expensive purchases you're likely to make, even if you manage to get it on sale. However, consider this more carefully. I needed government funding to obtain the Kurzweil 1000 software I had been using for over a decade. Each time I wanted the latest update, I needed to pay over $100 to have it sent to me. The government agency spent over $1000 initially to purchase my user license for the software. Yet for the cost of a single Kurzweil 1000 update, I had gotten an app which consistently yielded results as good as or better than Kurzweil 1000. These results were obtained in a fraction of the time that software and scanner took even to scan a sheet, let alone interpret what it saw. Rather than a bulky scanner taking up desk space, I could pull my iPhone out of my pocket and read print anywhere. I've never had to pay once for updates to the KNFB Reader app. They just keep coming every so often. While there may be less overall need to read print as more documents become electronically available, I'm still very happy to have this app for when I need a full and accurate scan of something in print. I use this app every day to read my physical mail and notes from the staff of my apartment building. The KNFB Reader app was and still is an absolute game changer.

App Store Expedition:

Prizmo Go: A Cheaper Alternative Reading Solution:

While KNFB Reader was certainly the first really noteworthy print-reading app, it wasn't the only kid on the block for very long. Other apps appeared attempting to offer OCR capabilities more affordably. These days, unlike when I picked up the app, KNFB Reader has some good competition. By far the most successful of these is one called Prizmo Go. It was designed to cater more to the sighted user. However, Creaceed SPRL, the app developer, took great care to incorporate support for VoiceOver and special guidance to help blind users orient their cameras to get good pictures of text. You can try the app free of charge to get a sense of whether it would suit your needs. Should you need more of its capabilities, you can purchase some outright, such as the ability to export and share text. Other capabilities take advantage of cloud-based processing to offer enhanced accuracy above and beyond the already superb performance of the built-in OCR. Prizmo Go also offers language translation. If you hesitate to pay an ongoing subscription, you can instead pay for a limited number of cloud-enhanced accurate scans and/or translations. Otherwise, simply pay a one-time fee to unlock the exporting capabilities of Prizmo Go and you'll have yourself a very robust and portable OCR solution.

This illustrates the power of large-scale economics. While you won't find quite the same level of intuitiveness as with KNFB Reader, Prizmo Go offers very comparable OCR results for people who are comfortable and proficient with using the camera of their device. The "scene description" button at the bottom right offers guidance similar to the "field of view" report feature of KNFB Reader. It also gives a sense of the number of lines of text in view. There are VoiceOver hints throughout the app, as well as other help offered via the "app settings" button found at the top left. For a lot of people who feel comfortable using the camera, Prizmo Go will be more than sufficient to meet their OCR needs and will cost them a whole lot less money. For plenty of others, the more intuitive feedback and guidance that only an app designed from the ground up for blind users provides will be well worth the extra one-time expense. I'm happy that we as blind people can now regularly make such choices in the iOS ecosystem.

Seeing AI:

Putting It All Together:

To really propel things forward takes a partnership: in this case, between blind people who own iOS devices and need to know about things in their lives, and a gigantic, globe-spanning company working on cutting-edge artificial intelligence with mind-boggling resources. One summer, the blind community was absolutely stunned when the Seeing AI app from Microsoft appeared in the app store. As a means of conducting ongoing research into accessibility and artificial intelligence, Microsoft had managed to leverage its artificial intelligence, massive image database, and computing power to come up with what is still the current must-have app for blind people. This app is the Swiss army knife of handy tools making use of the camera. It has a number of channels which each perform different tasks. The channel you start on is for reading short text as the camera sees it. Touch the channel selector in the bottom right and flick up or down to get to other channels. More are added as new features are deemed ready for public experimentation. Other available channels include ones for reading larger documents, identifying objects or people, identifying currency, detecting the level of ambient light, describing scenes, and even reading handwriting. Some of these features are still in what is called "preview" status, which basically means they're available for you to try but not yet judged fully developed. The app is available free from the app store, and using it incurs no charges other than possibly cellular data.

It's impossible to properly convey the tremendous usefulness of the Seeing AI app. Nor can its impact on the social-media-using blind community be overstated. All through the first months the app was available, there seemed to be torrents of tweets, Facebook posts, and podcasts about this small but monumental app. Nothing could knock Seeing AI out of the five most recently recommended app slots on the AppleVis home page, and this despite the fact that users in the UK weren't included initially. The envy from across the pond was thick enough to cut.

When you first run the app, you are presented with tutorials which pop up as you explore. They include videos produced by the app developers at Microsoft which explain how the various features work. Once you've gone through those, the app will open into the default channel for reading short text. Touch the channel selector on the bottom right and flick up or down with one finger to change to different channels. The next two channels above short text reading are for reading whole pages and product identification via bar codes.

Keep in mind that Seeing AI is a research project. Basically, this means that you're paying for the assistance you receive with the data you're generating; you are the product. As Microsoft's artificial intelligence assists you by reading text or describing people or objects, it learns from the images. Human developers working on Seeing AI and other Microsoft products might use the image of a coffee cup you photographed and had described to improve the ability of products using artificial intelligence to recognize coffee cups. A blind person might well have taken a picture of a coffee cup from a different, unusual angle compared with most images in an image library. I feel pretty safe using it for most things, but I wouldn't use it to read the security code of my credit card or anything sensitive like that. Always be aware of where images and data are going, and be mindful of the motivations of the people or companies who have access to them.

Speaking of being aware of who gets data, consider the case of my mother-in-law. Soon after my wife Sara and I began using the Seeing AI app, we visited her family and showed them the app. With her eager permission, we decided to try the person description channel and take a picture of my mother-in-law. Upon examining that picture, my iPhone calmly informed her that she was a good couple of decades older than was actually the case. She was less than thrilled with that description and jokingly warned me not to leave my iPhone unattended in her presence.

The Eyes Have It:

Bringing Willing Sighted Help Where It’s Needed:

Let's face it: there are times when no amount of fancy artificial intelligence will do the trick. You need to hunt for something, complete a process, figure out where to write on a form, and so on. There are problems which require an ongoing dialogue with a sighted person to be solved efficiently. With FaceTime, you must have someone in your family or circle of friends willing to help when needed. Not everyone has an iOS device. Nor are the sighted people in our lives always free or knowledgeable enough to help. There are currently a number of apps which seek to remove these limitations. We'll focus on two very popular apps which attempt to connect those who need sighted help with people who are willing and able to assist.

Be My Eyes:

The Be My Eyes app connects you, via a video call similar to FaceTime, with the first volunteer who has made him or herself available to assist. The app shows the volunteer what the camera on the back of your device sees. The real utility of Be My Eyes is its ability to connect people needing help with volunteers who have time to offer assistance at any given moment. This is great for finding lost objects or picking out clothes that look good on you.

Keep in mind that these are volunteers. All they have to do is complete a simple tutorial about how the app works. They aren't vetted for security and are unpaid, so don't use them for anything requiring sensitive information. While it is unlikely that people would volunteer with sinister motives, you never know. Someone could, for instance, see and make use of a credit card number you had the volunteer read to you. There are certainly ways in which this app and the blind people who use it might be abused by unscrupulous people.

To write this section of the guide, I wanted to test out the Be My Eyes app for myself. I found signing up to be very easy. You do need to agree to the terms of use and abide by them. Volunteers are never responsible for your safety, nor can the developers of Be My Eyes be held liable for any misuse of the app. Once I had signed up, I placed a call by finding and double-tapping a button called "Call first available volunteer". I have a collector's coin which I got after participating in a documentary called Get Lamp, which is available on YouTube and is about text adventure games. I asked the volunteer to describe each side of the coin. She did a nice job of it and told me how to move my iPhone so she could read me the small writing on the coin as well as describe the pictures. She had time for a brief chat and was curious about the documentary. I asked what she had to do in order to volunteer, and she described the simple tutorial she had completed.

The experience was very simple and unhurried. However, one is certainly conscious of taking up a volunteer's time. Personally, I wouldn't want to use them for anything too lengthy. That call certainly gave me an idea of the ability of the camera on my iPhone 7 to focus on small images and what feels to me like tiny writing on the coin faces.


Aira:

Competent, Secure, and On-Demand Visual Help For Rent:

Aira, pronounced "eye-ra", puts everything on a business footing. Aira hires agents who help blind people with everyday tasks. These agents are vetted for security, so you can feel safer about sensitive information. Also, you as a blind client pay a subscription for a certain number of minutes each month. You can use them for whatever you like without feeling guilty for taking up somebody's time. The software used by Aira agents is more sophisticated and allows them to pull up maps or tap into your social media to recognize the faces of people you know. Because agents have been vetted for security, they can be trusted with your data, and this makes Aira potentially much more useful depending on your individual needs. Aira can be used in conjunction with smart glasses, giving agents a head-level, hands-free view more similar to natural eyesight. Imagine walking down a street with a voice in your ear telling you about a restaurant sign coming up ahead. The agent could also tell you that your friend John is approaching from the left. That's what happens in one of the promotional videos you'll find on their web site. Because these agents can access your location and other information, they can help you make travel decisions. During the setup process, Aira makes it very clear that they aren't responsible for your safety. They won't help you while you're travelling unless you are using a mobility aid like a guide dog or cane. They can, however, describe your surroundings and suggest different routes. They could tell you about the signs of businesses you're passing, where a taxi stand or bus stop is, or any number of details which might prove useful. They could help you operate an otherwise inaccessible ticket machine or read information from a restaurant's display.

While I was writing this guide, Aira offered a free trial for everyone which gave you thirty minutes that had to be used within seven days once your trial started. I took advantage of this to take Aira for a casual spin. Some of my thirty minutes were taken up with going through the rules and answering some questions to set up my profile. After that, the agent asked what she could help me with.

In advertising the trial, people were encouraged to knock something off their bucket list. During the years I've lived in my apartment, I had often been told that I had a "good view" from my balcony. I decided to have the agent who took my call describe the view from my apartment balcony for me. I figured this would give me an idea of how wide an area could be seen through my iPhone's camera, as well as a sense of how well Aira trained its agents in the art of description. She spent a few minutes describing what she saw and answering some questions I asked. I was impressed by how much could be seen and by the detailed description I was given. She agreed to type up some descriptive text and attach it to a picture she took and shared with my iPhone. Unfortunately, you need to be a paid subscriber to the service to access such pictures, so I couldn't get at this little souvenir. However, I at least got an idea of what it was like to stand and look out from my balcony on a snowy afternoon.

As things currently stand, using the Aira service is an expense your bank account will feel, even for people with steady working incomes. As of March 2019, a standard subscription without smart glasses cost around $100 US per month, giving you 120 minutes of Aira time which can be used for nearly anything you want to do with the assistance of an Aira agent. This doesn't cover any costs you'd pay for the cellular data used. Those could be prohibitive if you don't have a high or unlimited amount of data in your monthly plan.

Putting things on a business footing has a more profound psychological effect than you might think. While you might hesitate to use a volunteer's time for anything lengthy or complex, the mindset changes when you're paying for a service. As fate would have it, a friend of mine posted a perfect example of this on Facebook while I was working on this section of the guide. Michelle McQuigge doesn't consider herself particularly handy with tools or good at assembling things. However, she had bought a heavy-duty laundry cart and decided to use an Aira agent's help to try assembling the cart without help from her more mechanically inclined sighted friends. The agent was able to find the assembly instructions on the Internet and then talk Michelle through the process of putting her new cart together. That just floored me. I can't count the times in life when I've sat still or stood out of the way while someone sighted tired him or herself out putting some piece of furniture together for me. An experience common to blind people is that sense of being more than willing to do the work oneself if a sighted person would just patiently say what needed to be done and where things were. For Michelle, Aira made that dream come true.

The idea of attending an event and being able to get to your seat or know what’s around at a festival independently is very compelling. You could tour a convention hall or have paintings in a gallery described to you. Aira really pushes this active and engaged lifestyle in its advertising. The company is doing a lot to try and lower costs and make Aira useful for people who can’t afford to subscribe to one of their plans. Businesses can sign up so that blind people can be helped by Aira agents while in their establishments at no cost other than cellular data. Airports and other places can be designated free for Aira use by anybody. Also, people can use Aira for job hunting tasks at no charge and without losing minutes if they’re subscribed. There’s an entry-level plan for $30 US per month for 30 minutes. That might be useful for a few quick tasks around the home but you wouldn’t want to go on an expedition with that plan.

Camera Conclusions:

Over the past eight years, I’ve continued to be delighted by the many ways in which the camera in my iPhone has proved a vital part of my everyday life. As the costs of artificial intelligence, computing power, internet connectivity and camera technology go down, the possibilities for even those of us on low incomes will continue to grow. Thankfully, ways are being found to tap willing people and the vast resources of companies to improve the lives not only of blind people, but of everyone who gains from artificial intelligence made smarter about identifying things for us.

These are still very early days. Things are moving quite quickly. This section of the guide may very well be the first to become obsolete due to changes in this area. I’m fascinated to see where both the charitable and business efforts to put the cameras on iOS devices in the service of improving the lives of blind people end up going over the next decade. There’s quite a lot of potential for amazing helpful ideas. What really excites me is how all of this is done through a piece of technology commonly used by all sorts of people, be they blind, sighted, disabled in other ways, or fully able-bodied. This brings large-scale economics to bear on problems which formerly relied on smaller niche markets and special charity or government economics. That in itself is a massive game changer whose implications are still very much in their infancy.

Sadly, I think there is some cause for worry about an increased divide between blind people who succeed in finding steady enough employment to tap into what paid services like Aira offer and those who aren’t as successful. On lower income levels, you quickly hit a catch-22 where you spend so much on the service that you have no money to go places to put it to good use. However, I think the costs will fall over time as more businesses join as sponsors or partners. I’m also hopeful that these services will have a trickle-down effect through changing public attitudes about the possibilities for blind participation in spheres of life.

The fear surrounding what governments and massive corporations will do with the data we generate is still, I believe, a limiting force on what’s possible. Personally, I think a lot of that fear is misplaced. People aren’t really weighing up the benefits and hesitate to reach for the ready help that artificial intelligence can provide. For blind people, I see more of a danger from the one-on-one connections between them and the sighted people they connect to. It’s far more likely that an unscrupulous person would take advantage of a bank account number accidentally revealed while trying to bring a package of oatmeal on a cluttered countertop into focus, or something of that nature. In such a case though, is that very slight risk more or less dangerous than not knowing whether that package of oatmeal contains ingredients you’re allergic to? Might it be worth trusting a volunteer or paid agent to help you use that otherwise inaccessible vending machine to get a snack or pay for a ticket you need in a hurry? In exchange for a bit of mindful, sensible trust, that help is now available through your iOS device.

Above my discussion of each app covered in this section, you’ll find a link to the web site about it. There are all kinds of podcasts, videos and other information for each app so you can get a sense of whether it suits your particular needs. A simple Google search can yield more independent reviews and demonstrations of these apps. People tend to talk a lot about things which have positive impacts on their lives. Take advantage of that to get a good sense of what’s possible and how you might use that little round camera in the corner of your iOS device.

Remember to think out of the box. Nothing stops you as a blind person from using the camera for the same social purposes as a sighted person. From pictures posted to online dating sites to Facebook photos, blind people are getting up to all kinds of social activity. There are even blind YouTube stars who have lots of sighted followers. We’ll cover that in greater detail in the section dealing with social media. There are lots of implications for making friends and sharing experiences with that little camera and other technology in your iOS device. New possibilities are emerging all the time so keep an ear out for them.

Monday, December 31, 2018

Year End Reflections

Year End Reflections for 2018

Hello everyone. I haven’t done a personal blog entry in ages. The Personal Power guide I’ve been working on plus segments I’m doing for Kelly and Company have scratched that blogger’s itch. The rest of life certainly hasn’t been uneventful. I just don’t have the same need to document everything as I once did. However, I feel a need to take stock today at year’s end.

Where to begin? Married life seems a good place. Sara and I are enjoying a largely stress-free and happy marriage. We’re both pretty good communicators and overall positive people. We certainly have disagreements but have typically managed to talk through them without inflicting pain on each other. Both of us have our own separate spheres of life and share enough interests to value our time together. It’s hard to believe we’re in our fourth year of marriage.

The work I’m doing for Kelly and Company on AMI Audio has proved to be very rewarding and also quite challenging at times. Cramming all the information I want to impart into fifteen-minute segments isn’t always easy. However, it has helped organize my thinking when it comes to working on Personal Power: the iOS Edition. Increasingly though, it’s harder to make the two projects dovetail as the guide enters what I hope will be the last stretch of work. It’s been a very long haul and my initial enthusiasm for the guide is fatigued. There’s still quite a lot to get done before it’s ready for release. I’ve done way too much work to cut corners or just walk away from the project. It’s got to get done right.

Financially, AMI Audio has continued to compensate me for the segments I prepare and I’m ever so thankful for this opportunity for paid work. It still feels precarious despite this being my second year of producing segments. I’ve just passed the 100 segment mark. I didn’t imagine it would last so long and there’s still the niggling dread that one day, it’ll just snap off like a switch and vanish.
That keeps not happening though. Because of this income, I’ve been able to acquire a lot more apps and try things that I simply wouldn’t have felt free to without it. The guide will be a far better resource due to this. My library of legally owned books and audio dramas has certainly gotten larger. So far there haven’t been any issues with ODSP or Peel Housing. I don’t think the upcoming changes to social assistance will cause us any trouble. However, I have a distrust for Conservatives and their tendency to dismiss and devalue people who haven’t managed to become self-sufficient. I worry tremendously about our new premier and his habit of ignoring science in favour of ideology. It’s a case of scrapping every bit of long-term thinking the other side did regardless of merit. There’s already a stupid amount of that going on south of the border.

Greed and self-centred short-term thinking is sadly winning the day right when the planet needs the exact opposite. I don’t think Trudeau has much of a chance of surviving the next election. Too many mistakes made and too many badly needed initiatives promised which take god damned time that the other side will wipe out before they can realistically have any long-term results. That deeply frustrates me. This year’s politics really leaves me angry and wanting drastic world change which seems impossible right now. It feels like it’ll take some catastrophic disasters to really get people to change and ditch this us versus them crap. It sounds like those might even occur in my lifetime if the UN scientists are right. I hope we don’t waste all twelve of these next apparently crucial years we have to get at least some kind of grip on climate change. It should be the kick we need to start doing something about societal equality, but people keep finding ways of dismissing or fighting against that kind of thing. I keep thinking there must be more good news out there that I’m just not coming across.
Part of that has to do with how long working on this guide has dragged on. It makes me feel more stale than seems warranted given all the interesting apps and things I’ve learned and found while working on it.

Another big milestone was that I’ve reached a point where I’ve gone completely legal in terms of the books I have. I now own every book I’ve wanted to own since my teenage years. I’ve done this without breaking the bank thanks largely to Kindle and Audible. Knowing how very hard it is to write the stories which have done so much to shape me over the years, this feels tremendously good. Once my guide is finished, I might well take another stab at writing a short story collection. Alternatively, thanks to the Voice Dream Reader app and sites like StoryBundle, I have a substantial collection of books about creative writing and game development. Those will help whether I pursue RPG creation or story writing. But first, there’s the guide to finish before it drags me under, creatively speaking. I still feel that it’ll fill a gap in the help that’s available for people who have or contemplate getting iOS devices. That hasn’t changed.

I’m still using my iPhone 7. Unless my carrier offers me a super deal, it’ll stay that way for potentially the next couple of years. Sara upgraded to an iPhone XR at far less expense than I had thought possible. So far, she’s pretty happy with it. Understandable coming from a 5S. It’ll be interesting to see what she does with so much more room and other advancements. The guide will be a better informed document for taking in her experience of these advancements.

I’ve invested in some new peripherals. A great new Bluetooth speaker called a Fugu Tough has proved to be an excellent addition to my travel kit. Sadly, the Go Duo speakers didn’t last as long as I had hoped when I backed them. The Tough seems rugged and even has speech prompts about charge levels and connection status. The other major acquisition was another successful Kickstarter.
It’s a Hexgears X1 mechanical keyboard. It’s being used to type this entry. The key feedback feels very nice and it’s far more comfortable to type on than the Microsoft Universal Mobile keyboard I’ll still use and deeply appreciate when travelling. However, there’s a noticeable lag and occasional missed keystrokes. I think this is a Bluetooth issue and wouldn’t be surprised if these issues went away in time. I’m still getting used to the new keyboard and its quirks. It’ll certainly stand up to the amount of typing I do. The keys are rated for seventy million clicks.

The Lifepack Hustle has been a very nice travel pack for longer expeditions. It just got another bit of use on our Christmas visit to Sara’s family. Plenty of room for a home base away from home and I love the shock protection. It should last ages with the possible exception of the internal mesh pockets. I can see those tearing eventually although they haven’t at all yet.

The apartment continues to be a good home for us. No serious issues at all to deal with. There are maintenance projects and that sort of thing but they really take care of the building and us renters quite nicely. Sara knows far more people here than I do these days. That’s largely due to her guide dog Aladdin. I really have to get out more this Spring and get into the habit of walking more like I used to. It’s less fun with hearing loss and the hearing aids being rendered useless by too much wind or background noise. Still, I’ll need that motion and different place to help deal with the race to finish this guide in the time frame I’d like to.

I’m a deacon at my church now. That happened in September and so far, I think I’ve done alright in that role. I’m still learning a lot as I go but people seem largely happy with me. It’s a three-year commitment that I mean to see through barring any unexpected life-changing opportunities. Going to different churches continues to be a source of interesting talk and reflection for Sara and me.
It hasn’t proved anywhere near as divisive as I once would have expected. She remains the choir director at her church. I frankly never thought of myself as deacon material but when the call came, I couldn’t turn it down. My church has been there when I’ve needed them as much as that kind of organization can be. It’s only right that I do likewise. In a few hours, our New Year’s party will begin. I think it’ll be a good one. I haven’t seen as much of any of my friends as I should have. I plan to work on that a bit more this year. 2019 should be a more social and less isolated year if I can manage that. Hopefully, it’ll also see a new creative chapter begin. I’m at last close enough to finishing this guide that I can realistically hope to get to the end before iOS changes yet again. Just not as far before that happens as I would have liked. Well I think I’ll leave you here for the moment. Perhaps, I’ll get back into more of a blogging groove. Only time will tell. Have a happy new year, everyone.

Thursday, December 20, 2018

Maps, Taps and GPS Apps

Maps, Taps and GPS Apps;

Getting Around The Real World:

Independent travel and gaining awareness of what’s nearby have been tremendous stumbling blocks for me in life. A lot of that difficulty is due to what caused my blindness. I was born prematurely. Doctors followed what was then a common practice of providing extra oxygen to keep me alive. This oxygen certainly had that beneficial effect. However, it also destroyed my retinas and damaged the part of my brain responsible for geospatial awareness. Solving spatial puzzles, comprehending geography and geometry, and keeping track of routes and mental maps have all proven impossible for me beyond a certain level. Things are now complicated further by moderate hearing loss requiring me to use hearing aids. They’re very helpful but don’t give me a reliable sense of distance to sounds. Even walking through relatively quiet parkland, I’ve had many occasions where people have suddenly appeared in front of me when it was too late to move around them. They weren’t standing still or walking especially quietly either. Too much wind renders my hearing aids utterly ineffective. All of this is very unsettling when you’ve been accustomed to hearing everything effortlessly for most of your life. Crowded noisy places are now things I’m likely to avoid, as too much noise can render me unable to hear what people say or hear moving hazards like bikes or cars.

I thought it was vital that I explain these circumstances to you before discussing the navigational options your iOS device makes available so you can put my thoughts in proper context. I have used one of these apps called BlindSquare quite extensively and have done more limited exploratory testing of other apps I’ll discuss here. Just be aware that this is one section of this guide where I’ll be relying more on what the app makers say the apps can do and on what I’ve heard other people’s experiences have been like. For a number of my friends, these apps are all they need to feel very confident in exploring their surroundings and going to new places completely on their own. For me, they are more helpful in finding out what’s around me than in actually getting to places. Using them, I can at least be confident of eventually getting home unassisted if necessary. That in itself is quite a marvellous relief.

Having access to GPS navigation doesn’t solve all of the mobility and orientation problems for any blind person. Nor does it replace the need for a cane or guide dog and good mobility skills. These apps don’t use your device’s built-in camera to gather information and aren’t aware of what’s happening around you. They won’t warn you of oncoming cars, bikes, or other hazards. They receive a GPS signal from satellites in medium Earth orbit and match what’s detected to information stored on your device or retrieved from online sources. The information may not always be fully up to date. Businesses close or change locations. Any objects such as benches, garbage cans, etc., that you might put into the information as personal points of interest to keep on track could be moved. GPS signals can be blocked by structures, cloud cover, and other things. Defence department regulations don’t allow civilian GPS receivers to be as accurate as technically possible. This is to prevent them being used to precisely guide weapons. For all these reasons, it’s not a good idea to rely completely on GPS, as some drivers have done who suddenly found themselves approaching a large body of water rather than the route they presumed was there. Use your own senses and common sense.

To help offset these difficulties, GPS apps may draw data from more than one source. Your iOS device can also tap into information received via the Internet and cell towers to help figure out where you are. Provided you are connected to WiFi, these apps are even useful to owners of iPads which don’t come with GPS receivers included. These receivers can be bought separately so it’s perfectly possible, if somewhat more awkward, to use these apps with iPads while moving. While stationary, iPad owners can examine maps and virtually explore areas prior to going there exactly like they could on iPhones. In fact, some people may find the larger size of an iPad helpful when exploring maps.

Using these navigational apps takes a toll on your battery. As you move, the app constantly tracks your position indicated by the received satellite signals and checks for things to notify you of against available information. When necessary, it will check for and download online information presuming this is possible. All of these activities require processing power. You may also incur data charges if you go over what your cellular data plan allows for. To minimize this, download any maps, points of interest, etc., to your iOS device while you’re connected to WiFi. Check the settings for the apps you use for possible ways to govern the circumstances under which data is downloaded. Investing in an external power bank is also a very good idea for people making frequent use of GPS apps.

Another thing to consider investing in is a means of hearing information conveyed by these apps while on the move. Some people hold their iPhones in their hands while travelling. This can be useful as it allows you to point the top edge of your device in directions of interest and make use of the “look around” or “geobeam” features common to GPS apps designed for blind people. This tells you what lies in the direction your device is pointing to. Most people that I talk to prefer to have their iPhones in their pockets and use small Bluetooth speakers, earbuds or bone conduction headsets to keep informed while their hands are free for other duties. Some apps offer support for use with Braille displays. I’ve never attempted this but would presume these displays would be small and light enough to wear around the neck or over the shoulder for easy access while on the move.

There are two types of apps we’ll be discussing here. The first kind are apps which are made for use by the general public and are also made accessible for blind people. There are a great many choices here. We’ll look at the two most popular ones. These are the Maps app which comes included in iOS, and Google Maps, its primary competition available free from the app store. We’ll also look at some apps for the general public which seek to aid their users with aspects of travel. For instance, there are apps which focus specifically on travel via public transit systems. Other apps, such as Uber, attempt to facilitate travel by connecting would-be passengers with people willing to take them in their personal cars for a fee. Still other apps help with more long-range expeditions, booking flights and planning itineraries.

There are also a number of GPS apps designed specifically for blind people. These apps try to offer extra information which is helpful for blind users as well as facilities to help make orientation easier. Again, we’ll focus on the two most popular of these apps, in North America at least. Neither of these apps is free. In fact, one costs over $100 Canadian. It takes extra effort and expertise to make apps as thoroughly built around blind users as these are. They typically combine functions of two or more apps into a more seamless single app, provide information tailored to be maximally helpful for blind people, and are designed with efficient accessibility from the ground up. It can be a lot easier to master the use of one of these apps than to juggle two or more apps designed for sighted people in order to gain similar information that is by nature more minimal.

One thing you should always keep in mind when using any of these apps is that they are designed around car travel. Points of interest included in the maps and data these apps draw from will be located at parking lot entrances or driveways to places. When an app tells you that you’ve arrived, you’ll still need to find the actual entrance to a place. Nothing stops you from creating a point of interest precisely at the doorway or path you need to be at. However, it is up to you to perform this task and create the position markers helpful to you. It’s a good habit to get into. Just be ready to still possibly have to search a little if there’s GPS interference. Don’t ever presume exact precision. Also, keep in mind that when you’re told that a destination is a distance away at such and such o’clock, that’s as the crow flies. In other words, it’s a straight line to your destination that doesn’t take into account obstacles you’ll have to cross or get around.
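To make the “as the crow flies” idea concrete, here is a small illustrative sketch. This is not code from any of the apps discussed; the function names and the simple rounding to the nearest clock hour are my own assumptions. It shows how an app might compute the straight-line distance between two coordinates using the standard haversine formula, and how a compass bearing can be turned into the kind of o’clock announcement these apps give:

```python
import math

def crow_flies_distance_m(lat1, lon1, lat2, lon2):
    """Straight-line ("as the crow flies") distance in metres between two
    latitude/longitude points, using the haversine great-circle formula.
    Note: this ignores streets, buildings, and any obstacles in between."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def clock_position(bearing_deg, heading_deg=0.0):
    """Convert a compass bearing to a destination (0-360 degrees) into an
    o'clock position relative to the direction the user is facing.
    Each clock hour covers 30 degrees; straight ahead is 12 o'clock."""
    relative = (bearing_deg - heading_deg) % 360
    hour = round(relative / 30) % 12
    return 12 if hour == 0 else hour
```

For example, a destination bearing 90 degrees (due east) while you face north would be announced as “3 o’clock”, and the distance reported is the direct line regardless of how far you’d actually have to walk to get around what’s in the way.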

You might wonder what kinds of special facilities and capabilities navigation apps designed specifically for blind people might have. Users of BlindSquare who wear earbuds or headsets will notice that the indicator sounds which immediately precede being told about points of interest are directionally positioned. You’ll hear the indicator beep in the direction the point is located from your position as much as this is possible. Another often used example is upcoming intersections. Apps designed for blind people will typically give more detailed information about these before a user reaches an intersection so he or she has a better idea how to safely cross it. Those are just the tip of the iceberg.

Rather than going step by step through the features and operation of each app I discuss, I have chosen to concentrate more on what these apps make possible and when they might be advantageous. There is excellent help available for each of these apps already. I see no advantage in reinventing the wheel. You can also find audio and video reviews and demos. I feel more comfortable leaving you in the hands of this expertly written help than trying to explain what I haven’t used extensively. Your safety might hinge on being familiar enough with an app’s capabilities that you have the mental space to focus more on your surroundings. Presuming you’ve read previous sections on using VoiceOver, browsing the web, etc, you’ll have the skills needed to master these navigation apps using the help provided.

Going Mainstream;

Using Apps Designed for Sighted Users:

Thin, light and portable, it’s no wonder that iPhones and iPads have been turned into powerful navigation aids by clever app developers. Many apps designed for sighted users have been made accessible to blind people using VoiceOver. This has been done thoughtfully. However, blind people simply aren’t the core focus for these apps. The information these apps provide is designed to give maximum aid to drivers and other people with sight who can quickly look at the screen while on the move. They have been set up to be operated as easily as possible by people with sight. In contrast, apps designed especially for blind people are carefully crafted to give extra information and easy control by touch gestures or audio menus rather than tools found at a glance. This can make quite a difference. For a lot of people, the Maps app or Google Maps will be quite sufficient for their needs as blind travellers. One major difference is that mainstream apps don’t volunteer information. You need to use VoiceOver and seek out what’s nearby or find out how far away places might be. Other than directions to destinations, sighted people typically wouldn’t appreciate being told about everything they’re passing as they move along. You’ll need to have a good grasp of and proficiency with VoiceOver to make effective use of these apps while travelling. You’ll need to take your iOS device out frequently if you want to consult a mainstream app for information other than the spoken directions that comprise a route.

Apple Maps:

The Maps app is designed by Apple and comes already on your device as part of the iOS operating system. It taps directly into data maintained and collected by Apple. It isn’t supported by ad revenue of any kind. It is a service meant to enhance the value of Apple products. For people concerned about privacy, this may be a better fit. You can view maps of areas and get directions to places. Apple is also integrating the Maps app with other apps like Uber and Lyft to provide maximum convenience. It also supports Apple Pay and Siri. I recently heard a video which pointed out that the pricing information and ride booking procedures were better in the Apple Maps app than in Google Maps. In an effort to rival Google Maps, Apple is investing a lot of resources into collecting geographical data. This new data is slowly being added to the Maps app as it is ready. This includes indoor maps of popular places like shopping centres and airports.

Apple Maps lives up to its name. The screen is dominated by an interactive map. Blind people are actually able to explore this map by dragging their fingers along roads. They can also use the VoiceOver rotor to flick up or down between points of interest, double-tapping on an item to open an information card about it. You can choose how much of the screen is taken up by these information cards. You also have current local weather, a tracking button and a settings button.

The information cards have quite a bit of material on them drawn from various apps as well as the address. You will also find buttons to get directions, call or visit the web site of a place. If an app is associated with a place, there’s a button to get that. Apple really plays to that strength of integration. The information cards are where that’s most evident.

Moving your finger around the screen allows you to explore the area around you and even follow streets. You can also find out about any nearby points of interest via the rotor by turning it to the “points of interest” setting. Flick up or down between points of interest and double-tap on any which interest you. It is also possible to mark points of interest on the map.

The Maps app is a springboard for many other apps which can incorporate aspects of it into travel aids. This might allow an app focussed on restaurants to offer the ability to give guided directions to a restaurant or show a map of where it is. It also allows Siri to give guided directions to places of interest when asked. You might never go into the Maps app itself but will still very likely have made use of it without even being aware.

To find help using the Maps app, the first place to look is inside the user guide for your iOS device. You can get this for the Books app. I’ve given instructions on how to do this in the “Quickstart” section of this guide.

Google Maps

Google is and has always been the information king. This is leveraged heavily to provide maximum contextual knowledge about places. Apple initially was going to simply tap into Google’s data and pay for using the information on its products. However, the companies had a falling out and are now competing with each other in this particular sphere of interest. The map is almost secondary to the information in Google Maps. When you enter it, you’ll find a “Menu” button, a search field, and a bunch of further options including checking traffic, getting directions, and entering compass mode. Below, you’ll find a heading called Explore followed by the name of your local area. In my case, it reads Explore Mississauga. Flicking right below that heading brings you to buttons which are categories of places. Double-tapping on one will show you local places in that category. For instance, double-tapping on “coffee” will bring up places where you can enjoy coffee which are nearby. When you drill down like this, there will always be a “back” button to get up to a higher level.

If you double-tap on a location in Google Maps, Google finds and displays everything from the web site, buttons for guidance to the location, a button to call a place, and all kinds of reviews provided by users of the app or from other places. You can learn a great deal about restaurants and places of business just from what comes up on a place’s information screen. Don’t forget to scroll down with a three-finger swipe to the left. There are usually a number of screens full of information. More than you’d get on Apple Maps.

Google Maps has a lot of features including a compass mode as well as the ability to download a local map for offline use. If you use this latter option, be aware that some features depend on having a data connection and you’ll miss out on those if you’re completely offline. All of this is explained in the app itself. To find the extensive help available from within the app, double-tap on the “menu” button at the top left. Next, flick right until you come to “help and feedback”. Double-tap on that and then flick right until you get to “help” and double-tap this. You will arrive at a web page with extensive help and instructions.

If you use other apps in the Google ecosystem, you may find that they can interact with Google Maps when this is advantageous. This certainly includes any Google searches you’d perform with the Google app. This app is supported by businesses and advertising. You may encounter ads while using Google Maps if you explore the information provided thoroughly. Also, keep in mind that Google and Apple have different philosophies when it comes to privacy and sharing data. With Google, the data you generate is a product for businesses and other interested parties who have agreements with Google.

Google Maps also has facilities to make sharing information such as possible locations easier using social media, messaging, or email. This can make working out what restaurant to meet up with friends at easier. People can look at reviews, visit the web site and access the menus that restaurants make available online. This can be extra helpful for blind people since it provides a menu that you can examine without someone sighted having to read it to you. This way, when your friends are ready to order, you can be too.

Optimal Perspective:

Using Apps Specifically Designed For Blind Travellers:

While it’s certainly quite possible to make good use of the Apple and Google Maps options, there are alternatives which have been designed from the start with blind users in mind. They tap into the same sources of data as other mainstream GPS apps. However, they present information in ways to maximize the benefit to a blind traveller. For one thing, they announce nearby points of interest and other information automatically. You don’t have to constantly interact with the app to find out what’s around or how close you are to an important landmark. What’s more, the interfaces of these apps have been thought through very carefully to make them as easy as possible to use from the perspective of blind people. This can make a very big difference.

There are numerous GPS apps designed specifically for blind users. Unlike the mainstream apps we examined previously, they all cost money. There are a number of reasons for this disparity. For one thing, there are research and development costs associated with making these apps as easy to use and beneficial as they are. The potential user base for these apps is much smaller than for apps designed for sighted users. This makes them less attractive to advertisers and to the other revenue sources that enable mainstream apps to be free to their users. There are also often fees which developers of these apps must pay to make use of the geographical data they tap into. Rather than having consumers pay an ongoing subscription, many developers choose to charge a higher price for their app up front and absorb the ongoing fees.

We’ll look at the two most popular apps in this category. Remember that there are other choices out there. We’ll briefly examine two of these in an app store expedition later on.

BlindSquare costs $54 Canadian in the app store. It’s a very popular option with a loyal following and frequent updates. Meanwhile, a more recent arrival from APH called Nearby Explorer bills itself as the premium navigation app. It can be yours from the app store for $109 Canadian. I won’t be going too deeply into how to operate these apps. In both cases, you can find a very detailed user guide right from within the app. The guides are also available online at the web sites for BlindSquare and Nearby Explorer. On the BlindSquare site, you’ll find all kinds of help including podcasts demonstrating the app, frequently asked questions, a link to contact the developers, and the user guide. You can visit the American Printing House web site to get similar resources for Nearby Explorer.


BlindSquare is an app that leverages data from the FourSquare database. FourSquare is a social app and game which lets people check into places they visit in the real world, telling people where they are. They can earn badges for visiting places often or for visiting many places in an area, being certain to check in using the Swarm or FourSquare apps when they’re present. They can also rate and review places. BlindSquare taps this data to find points of interest, prioritizing more highly rated and popular locations when they’re in your area. It also draws data from the Open Street Maps service which provides information about streets, paths, intersections, etc. Combining these two sources gives a very useful picture of your surroundings which is constantly updated by people checking into places and uploading GPS coordinates.

Rather than taking up space on your iOS device with pre-loaded maps and a geographical database, BlindSquare frequently checks for data as you move, and nothing is stored on your device other than points of interest which you create. As a result, BlindSquare takes up only a small amount of storage space, around 100 MB, on your device. This small footprint makes BlindSquare quite manageable even on devices with low storage capacities, especially considering what you’re getting in terms of capability. However, you absolutely need cellular data to make use of this app while travelling and not connected to WiFi. It regularly checks for new points of interest to report to you as you’re moving around.

Using BlindSquare, blind people will be alerted to points of interest which come within a radius and category of interest which they can specify. It is possible to filter what BlindSquare announces so that there is time to hear more of what’s around you that actually interests you. For instance, you could have it announce only restaurants within your search radius. By default, all categories are active and BlindSquare tries to tell you about the most popular and closest places. Being able to focus in on what you want is a key capability.

Another special capability of BlindSquare is its ability, if you use a headset, to use 3D sound to sonically indicate the direction of points of interest it tells you about. As you walk along, you might hear a short beep sounding like it’s ahead and to the right, immediately followed by an announcement of a donut shop which has been detected. There are many other short audio indicators which can clue you in to where things are, whether or not you use a headset and perceive the 3D positioning. This has come in very handy for me when navigating the path around the man-made lake near my apartment. I have added benches, large rocks, and other points of interest which make good landmarks, and hearing the direction they’re in as I approach has enabled me to find them more easily after I’ve become disoriented.

You can also access and control most capabilities via an audio menu that you access with the play/pause button on your headset. This allows you to have your phone safe in a pocket and still control most of the tools BlindSquare offers. To access this menu, simply press the play/pause button of your headset or earbuds. A menu of options will then be cycled through and announced one by one. You merely use the play/pause button again to indicate your choice. This simple and consistent interface lets you easily adjust the radius, activate sleep mode while you stop and talk with someone, find out what’s around you, announce where you are, etc.

You can also use voice commands to control BlindSquare. This is similar to asking Siri or another digital assistant to do something. There is a list of commands specific to BlindSquare which you can find out by asking the app for help. Using this feature costs you command credits which you must purchase from within the BlindSquare app.

The app has been designed for maximum ease of operation using VoiceOver and was extensively tested by blind people. It can also give information from beacons which may be placed in or outside of venues. The CNIB community hub in Toronto has such a beacon. BlindSquare also has a “look around” feature which lets you point the top edge of your phone in a direction and find out what’s there. There are options to get weather information about a place, an option called “what’s around me” to announce nearby points of interest, a “nearby intersections” listing option, and many other options. For instance, when visiting a restaurant, you can call the place, get directions via a third-party app, view the menu, and much more. You get at these options by double-tapping on the location whether it’s in your favourites or in a list of search results. BlindSquare has also been designed to work with Braille displays. You would presumably wear a small Braille display in a sling bag having it on your chest or over a shoulder for easy access.

BlindSquare packs everything onto one screen and into menus accessed from that one screen. At the top is a toolbar featuring buttons to let you access tools, settings and other features. Beneath that row of buttons at the right edge of the screen is a radius adjustment slider. This lets you quickly increase or decrease the area around you being checked for landmarks or that will be used during searches. Below this are a plethora of category search options as well as a button giving access to announcement filtering. This lets you fine tune what is announced as you move around. The more familiar you are with the layout of this screen, the better your experience will be while on the move. It is absolutely possible to flick through all the options but so much quicker if you have a rough idea where they are and can touch a point on the screen that is close or right on the option. At the bottom of the screen are other options including the “Sleep mode” button near the bottom right. This lets you put BlindSquare to sleep during a conversation with someone or while you don’t want it checking for information and announcing things.

This approach means that you would double-tap the “Tools” button and then flick through the options it reveals to reach ones such as “Look Around”. If you’re wearing a headset, you would more likely take advantage of the audio menu by using the “play/pause” button on your headset. Options would then be spoken and you’d just hit the button again when the one you wanted was spoken.

BlindSquare cannot plan routes and give turn by turn directions on its own. However, it has been designed to interact with other apps, operating in the background while you use another app like Waze, Google Maps, or the Maps app which comes on your iOS device. These apps can be given coordinates by BlindSquare and can then plan routes and give directions. BlindSquare can also hook up with transit apps like Transit or Moovit and give information such as bus stops and arrival times. Tying into these apps, BlindSquare can provide quite a comprehensive navigation service, making things as easy as possible for blind people. That’s because it can run in the background while other apps have focus. You need to give permission for BlindSquare to be able to do this.

Example of Travel Using BlindSquare:

I find that BlindSquare is what I turn to when I’m walking around my local area. Once I had the app, I got a mobility instructor to walk with me around the path which encircles the man-made lake near the building I live in. Around the path, there are a number of benches, garbage cans, paths leading in other directions, etc. As we came to suitable landmarks, I stood as near as possible to them and added them to BlindSquare. I didn’t set them as destinations so they don’t clutter up my menu of those. However, they are announced automatically as I approach them. I am therefore warned of both bridges on the path well before I come to them. These are somewhat narrow crossings of small creeks, so I know to slow down and make certain there’s room for me to cross safely. There are enough landmarks recorded that I can tell quite quickly when I go off the path. My apartment is also marked in BlindSquare so I can determine how to head towards it and return home.

July 1st is Canada Day, and there are fireworks set off in a park on the path around the lake. Prior to getting my first GPS system, I couldn’t attend these events on my own since the chance of getting lost was too great. However, I knew that with BlindSquare, I could find my way home even without sighted help. I had the confidence to head out into the night using BlindSquare, which announced the landmarks I was passing while I walked along. I wore my Aftershokz bone conduction headset so I was alerted to landmarks while still being able to listen for people and other things in my environment. The night was quite enjoyable and I had many interesting conversations. During these, I used the audio menu and activated sleep mode so that BlindSquare wouldn’t keep speaking while I was trying to engage in conversation. When it was time to move, I merely turned off sleep mode and quickly began receiving information from BlindSquare. When it was time to head home, I set BlindSquare to track the entrance to my apartment complex off of the path around the lake. It periodically announced how far away it was and in what direction as I walked. This information was enough to help me get back home and avoid straying off the circular path around the man-made lake.

When I have friends over, I often take them to a Symposia Cafe which is a restaurant in a local mall. BlindSquare announces the many landmarks along the route as I walk from my apartment. Once I’m there, I can also access the restaurant’s menu from inside BlindSquare. While this menu isn’t always kept as current as might be wished, it gives a good idea of what kinds of things are available. I can then put BlindSquare to sleep while having my meal and then wake it up when I leave the restaurant so it can help me navigate home.

Nearby Explorer:

The American Printing House for the Blind, APH, financed the development of this navigation app. It has been available on Android devices for quite some time and had garnered quite a good reputation before making its way to iOS. In order to offer better assistance, it takes a very different approach from BlindSquare. When users run the app for the first time, they are asked to download maps and geographical data for their area. So far, things are divided up by country. Nearby Explorer works in the US and Canada. The geographical data for Canada takes up around 2.5 GB of your device’s storage space once it is decompressed after being downloaded. You definitely want to be connected to WiFi when making this large download. However, you don’t need to do this very often as this massive database isn’t updated too frequently. There are millions of points of interest and associated information about them in this database. Navteq, the company responsible for maintaining this data, is highly regarded and its data is widely used by GPS apps. This is one major reason why Nearby Explorer is twice the price of BlindSquare. This data can be used even when you don’t have cellular data or a WiFi connection. It’s always there.

Because all of this data is on your device, Nearby Explorer is able to offer route planning from within the same app rather than piggybacking from other apps. It provides very good information about upcoming intersections in timely fashion drawing on this data. Transit information and indoor exploration facilities via beacons are also made available from within the same app. This means that everything is easily accessed in a consistent manner.

The Nearby Explorer interface differs greatly from that of BlindSquare. At the top and bottom of the home screen are two toolbars with frequently used options. The top toolbar contains buttons for pause, compass, geobeam, radius, and level adjustment. The geobeam feature is like the “Look around” feature in BlindSquare. The bottom toolbar has buttons for streets, search, favourites and transit information. In between these toolbars are a number of indicators which can be set to automatically announce information or not as desired. For instance, you can have street numbers announced or not. All of these options have context menus providing even greater control of when they are spoken or not. This gives you the ability to quickly tailor the feedback from Nearby Explorer to best suit your current situation without ever leaving the home screen of the app.

There are four tabs across the bottom of Nearby Explorer for accessing less frequently used features like settings, help, and an accessible map view. This map view comes from integrating the Apple Maps view into the Nearby Explorer interface. This means that you can use Nearby Explorer’s home screen settings to determine what gets announced on the map. You can also do things like simulate being in a location, or turn on a “watch” on a location on the map and then scroll around while hearing where you are relative to the marked location. You could also use features like the geobeam to explore areas before actually going there in person. People might well prefer Nearby Explorer’s tabbed approach, which more thoroughly separates those less frequently used options, giving them separate areas of focus. It can be a lot easier for people to master more features when things are done consistently in one app. Procedures are the same, and the way things are presented is too. When you deal with different apps, be prepared for different philosophies of what’s important and how things are accessed. APH has thought through every piece of this app very carefully to maximize its usefulness for blind people specifically. That makes a big difference, especially for people who might not be experienced enough to easily deal with too many different apps.

Reflections on Nearby Explorer:

The higher price of Nearby Explorer forces one to pause and ask whether it’s worth the money and the hefty chunk of storage space the Navteq data takes up on your device. I think the answer ultimately boils down to personal preference, how much you pay for cellular data, and where you’re travelling. Personally, BlindSquare is powerful enough to meet my current needs. However, if I had to go to an unfamiliar city and find my way somewhere, I would appreciate having the Navteq data and other features of Nearby Explorer at my disposal. Beginners may find an app like this to be a bit overwhelming. You’ll want to spend time reviewing the instructions and examining the options before making serious use of this powerful navigation tool. I don’t find that this app does as well with off-road travel such as around a pedestrian path or in a park. I entered a number of landmarks but could never get them to be called out as I passed them. Nearby Explorer tries to minimize the amount of chatter and doesn’t take the same approach to places which aren’t destinations that BlindSquare does. However, according to a friend who uses it extensively, it is able to give better information about intersections. If you needed to do a lot of urban travel in unfamiliar cities, Nearby Explorer would definitely shine and prove its worth. You would never be without geographical data even while offline. That’s a potential downfall of apps which don’t store geodata on your device. Nearby Explorer also makes it far easier to quickly change which information is spoken, having everything you’d likely want to adjust right on its home screen. This includes such things as streets, transit information, and much more.

Don’t worry if, like me, you aren’t able to make good use of the virtual exploration capabilities. This app has a tremendous number of navigation tools. Find what helps you the most and master those options.

App Store Expedition:

Other Navigation Options:

For a couple of cheaper navigation apps designed specifically for blind people, consider Ariadne GPS or Lodestone. Ariadne features a map which is totally accessible with VoiceOver and can be explored by touch. This app has been around for a long time but hasn’t been updated recently. Another alternative which is still being updated as of 2018 is Lodestone. It is produced by blind developers and allows local information to be downloaded for offline use. It is much cheaper than the two apps I focused on above and was originally developed for Android smartphones. One particular advantage it offers is the ability to be far more specific about the categories and geographic regions you choose to store on your device. For devices with lower storage space, this is a very attractive capability, giving the best of both worlds and making certain you’re never without local information.

Other apps worth obtaining are not made specifically for blind people but may help with travel. TripIt is an app for managing the details of a trip. You give it information such as flights and hotel bookings, and everything is kept track of in that app. It offers numerous perks to frequent travellers and is said to be accessible for blind users.

For more local travel, consider obtaining the Uber app. Many taxi companies also have apps. These can help with ordering rides, make payment easier and more secure, and much more. These apps generally work well with VoiceOver and can be very useful.

Final Thoughts on Navigation:

It’s incredible to think of how much choice we have in terms of our approach to getting around. Even the more expensive options are cheaper by a long shot than the dedicated devices designed for blind people that I’ve heard of. If you take the time to get confident with using VoiceOver, you can have very thoughtfully designed accessible apps which do just as much as those more expensive devices. That iPhone in your pocket can be a life saver if you get turned around out there. BlindSquare has certainly helped me get back home when I’ve gotten disoriented walking outdoors. GPS apps aren’t perfect, but they open up a lot of possibilities for blind people.

When you’re not using GPS apps for a while, it’s best to close them so they don’t continue needlessly using data and resources in the background. I’ve had more than one occasion when I discovered I hadn’t done this and therefore had less remaining battery power than I thought.

While travelling, please be mindful of how much hearing blockage and/or distraction you’re incurring. Earbuds and over-ear headsets block your natural hearing to a high degree. There are plenty of reports of fully sighted people wearing these and failing to hear oncoming cars and other sometimes lethal hazards. Personally, I use a bone conduction headset while travelling. I keep the volume as low as possible while still being able to reliably hear information. I never play music while walking. The only things I want to hear besides my environment and the people around me are the navigational announcements from my GPS app of choice.

Some people worry that their iOS devices might interrupt the announcements at a critical moment with something unrelated to travel, like a notification that somebody tweeted you. I usually set my iPhone to do not disturb while I’m travelling so nothing else intrudes on the announcements and operation of my GPS app. This mode has become very flexible in iOS 12. You can add contacts to your favourites list so you won’t miss a message or call from people who are important to you even when in do not disturb mode. Don’t forget that while in this mode, your iOS device can still receive information. Just check the notification centre when you get where you’re going and find out anything you missed on the way. Normally, in the event that I need to make or answer a call, I’ll move to one side of the path and stand still until the conversation finishes. I take as much responsibility for my own safety and that of others as I possibly can. Safe travels, everyone.

Tuesday, June 12, 2018

Going Dotty: Refreshable Braille on iOS Devices:

I never really appreciated, while receiving my own education, what a gift to organized thought learning to read Braille was. Braille was always bulky and heavy. The army surplus backpack I carried through the halls earned me the nickname of Fifty. People thought that it either had fifty things in it or else it weighed fifty pounds. If they were too close behind me while I turned a corner in the hallways of my school, they were liable to get crushed up against a wall. I doubt any of them suspected that the Braille volumes which added most of the weight to that pack were mere fractions of the text books and novels they could easily carry whole under an arm or in a pocket. In early grade school, the class of blind students I was in made use of a copy of the "American Vest Pocket Dictionary". It was comprised of seven thick volumes despite the pages being double-sided. Each volume was thicker than a phone book and the whole dictionary completely filled a long shelf stretching across a wall. For years, I thought the title was someone's idea of a joke. Eventually, on a pure whim, I asked to feel a dictionary carried by one of my sighted classmates. You couldn't quite stick it casually inside a pocket, but it was light and easily carried in hand. For the first time in my memory, I was brought up against the reality of what a profound and massive difference having eye sight could actually make in one's life.

There's also the cost of producing Braille to consider. I walked around grade school with a solid metal contraption somewhat like a typewriter. It was a Perkins Brailler which weighed around fifteen pounds and cost at least fifteen hundred dollars. Braille embossers designed for mass production are even more expensive. This has drastically restricted what is made available for blind people to read. The paperback book you can buy for under $10 would cost hundreds to produce in Braille. Until audio books and Ebooks recently hit their stride, I was quite restricted in my reading choices compared to a person who had eye sight.

I'm part of a generation which learned Braille naturally as part of the school experience while there was really no other credible alternative. However, we have now been liberated from the cost, bulk and weight of Braille by the advent of synthetic speech and more widespread accessibility of mainstream Ebooks and computing. In everyday life, now that I know how to read, I haven't felt the need to constantly use Braille. Quite the opposite in fact. While reading for entertainment and even when referring to books as references, speech output has proved more than sufficient and ever so convenient.

While I can fully appreciate why one might think Braille was no longer needed, I would contend that mastering the art of reading, be that print or Braille, is essential to everyone's education. If sighted parents faced the prospect of their children not being taught to read and write due to a lack of resources, they'd be horrified and never stand for it. Parents of blind children should feel no qualms about insisting on Braille literacy. The many lessons I learned while gaining literacy have served me well in countless ways. Like riding a bicycle, literacy is one of those things you never forget even if you don't read Braille beyond signs and labels for years. The lessons it teaches about proper use of punctuation, sentence structure and other aspects of writing have stayed with me. As a result, I have been able to use my writing and language skills to help others and express my thoughts clearly and with confidence.

I was never a particularly fast Braille reader and don't feel I've lost what speed I achieved. The same applies to writing. I'm far faster on a QWERTY keyboard than I ever was on a Braille one. In most circumstances, text-to-speech access has proved far superior in terms of portability and actual access to books. Other than my spelling having deteriorated over the years, I don't feel that the absence of Braille has done me much harm. And yet, I'm profoundly thankful that I was taught Braille reading and writing. The lack of actual literacy would have had a strongly negative impact on my quality of life. Now that you have an idea where I sit in the great Braille debate, let's continue.

Who would have thought that a device with a smooth surface might prove to be an amazing conduit for Braille? Apparently, Apple did. Right from the start when VoiceOver first appeared, there has been support for Braille displays. I never thought much about it when I got my iPhone 4. Speech was so intuitive and easy to use, and the capabilities of the platform made learning how to use Braille on it far less attractive than it would be currently. Before embarking on writing this guide, I hadn't given the implications a whole lot of thought. As I've taken the time to dig into what iOS offers in terms of Braille support, the implications have become very apparent.

You may wonder why I've chosen to give the Braille experience its own somewhat lengthy section. I have several reasons for doing this. First of all, if people don't intend to use Braille displays, all of the extra commands are out of their way. Those who wish to learn how to use refreshable Braille will find all they need in this section which isn't covered elsewhere. Experiencing iOS through a refreshable Braille display is markedly different from experiencing it via the touchscreen and speech output. Rather than a whole screen surface which can be explored with a finger plus immediate speech feedback, using a Braille display may change your approach. You can explore the screen in a similar way with one hand operating the iOS device and the other on the Braille display. Alternatively, you could operate entirely from the Braille display, using all the key commands to navigate. That will feel very different and be more similar to using a traditional screen reader. People who struggle with using a touchscreen may well find this mode of operation preferable. In either case, you will find that the Braille display gives you a window in the form of a line of characters whose length depends on that of your display. The position in exact focus, such as the current character in a document being written, is shown by two dots on the bottom of the cell which tick up and down repeatedly.

Another advantage to separating the Braille-specific information in this way is that people can more easily grasp how much support there is and how integral Braille can be if you wish or need it to be. Those who might presume that Apple has paid mere lip service to Braille support do Apple and themselves a serious injustice. There's a whole lot of ground to cover, so let's begin connecting the dots.

Over time, Apple has gone to considerable lengths to support the use of refreshable Braille with its devices. In fact, it's possible to purchase a Braille display from the Apple Store app which you can obtain for your iOS device. If you already have a Braille display or a Braille notetaker capable of being used as a display and connecting via Bluetooth, you can pair it with your iOS device. VoiceOver has been designed to allow complete access via Braille throughout the operating system. While you can't completely avoid using the touchscreen, you can certainly minimize the need to. Doing this requires learning commands which make use of key combinations or other buttons your Braille display may have.

iOS has support which allows far more than simple Braille input and output. Similar to an ordinary Bluetooth keyboard, you can take full control of your device using only your Braille display. There are key commands to do everything you can do with gestures. If you can memorize the commands, you can have excellent and accurate control of your device. This includes things like summoning Siri, controlling the volume, and much more, all without lifting your fingers from your Braille display.

The catch is that there are a heaping ton of commands to know if you want that kind of complete control. People may find that these commands feel less intuitive and easy to learn than the touchscreen gestures they are designed to replace. Personally, I find a middle of the road approach works best where I still use the onscreen gestures but learn the commands of particular use to me.

Byebye Braille Book Bulk!

Ebook sellers are starting to get on board, making certain that the apps everyone uses to read their books offer support for accessibility. What this means for someone with an iOS device and Braille display is that they are completely liberated in their choice of reading. Braille books used to take hours and hours to translate and were very costly to produce. They also weighed quite a bit and took up a lot of space. I mentioned a vest pocket dictionary earlier which serves as a perfect example. Now, that same dictionary would take up a tiny fraction of the data storage available on even the cheapest iPhone. A Braille display which you could comfortably carry in one hand would let you access that dictionary and thousands of other books on that iPhone in perfectly readable and translated Braille.

One thing to keep in mind is that the apps you'll use to read these books are designed for people who can see. They're fully accessible, but things are done in such a way as to maximize reading pleasure for people who can take in a lot of a page at once. Current Braille displays only present one line of text at a time. There may occasionally be slight problems as the apps and book formats are updated over time. Also, it may be easier to use the touchscreen when accessing menus and other functions which reading apps have. For instance, it's far quicker to learn the locations of tabs across the bottom of the screen or use menus which appear when you double-tap on the screen. In exchange for putting up with these small issues, you can read damned near anything you want as soon as it's published and at the same cost as anybody else who buys an Ebook.

Always a Catch;

Pitfalls to Consider with Braille and iOS:

There are some possible trouble spots for those who choose to acquire a Braille display and iOS device rather than opt for a more traditional Braille notetaker or other solution made especially for blind people. There may be times when your Braille display will disconnect since the Bluetooth software is always trying to save battery power. If this happens, simply lock the screen with the power button on your device and then unlock it again. This should result in your display reconnecting. Some displays can be more problematic and require more steps to get them reconnected. This kind of thing also happens with other Bluetooth devices such as keyboards or even Apple's own AirPods. The AirPods are designed to reconnect quite quickly and do so automatically nearly all of the time. There are so many different Braille displays that having the same kind of Bluetooth reliability would be impossible. This is very similar to my having to press a button on my Aftershokz Trekz Titaniums to make them reconnect if I stop hearing things through them.

We saw an instance of another potentially major pitfall when iOS 11 was released. People who updated suddenly found that they couldn't enter text quickly on Braille displays. Words would simply be lost and not be recorded in the document or edit field on the iOS device. Because people's fingers were busy typing in Braille, they couldn't immediately realise there was a problem if they weren't also using speech. You can't read Braille while you're in the act of typing it. Muting speech while using a Braille display is a very common practice. The issue was reported by testers but not addressed prior to the release of the update. Braille display users make up a very small percentage of the overall number of users of iOS. Every so often, their issues won't be dealt with in time and may take a while to address.

This happens with other things as well. It's not just a problem for blind people. At one point, an iOS update was released which resulted in iPhones being unable to make phone calls. This problem was addressed extremely quickly, as you might well imagine. However, it can be especially devastating if you rely completely on having Braille input since issues in that area won't be regarded as being so dire. Apple tries to only release updates when enough improvements have accumulated that receiving the updates will be noticeably helpful to a good portion of users. This practice can leave things hanging for periods of time. It took around two months for things to be fixed so that people could type productively on their displays once again. That kind of delay could be especially inconvenient for students and employees who rely on having Braille access for input and output.

I can also use speech so Braille isn't absolutely essential for me currently. However, if you're utterly reliant on Braille working correctly, keep in mind that there may be periods of time where things don't go smoothly. iOS is a very complex operating system. The more unusual your particular needs are, the higher the chance that problems like the example above will be encountered for hopefully short periods of time. Apple has learned from these mistakes and has introduced public testing of upcoming iOS updates in an attempt to catch major issues.

When looking for pitfalls, I spent some time putting myself in the worst possible case. For people who are deaf and blind, Braille had better work because there's no other way. If you absolutely can't hear speech or see enlarged print, touch is your only pipeline of information. In such circumstances, you may well want to look for alternatives or have a backup plan such as sighted assistance if things go wrong.

There are some things which might require sighted assistance to resolve. Certainly, you'd need such help to set up your device and then connect your Braille display initially. I don't believe there's any way to get at the Braille settings in VoiceOver until after the setup process is complete. Another thing which might be troublesome at first is entering your passcode in order to unlock your device. I have an older Focus40 display so more current displays may simply allow you to type in the passcode on them. Mine didn't so I had to enter it using standard typing mode on the touchscreen. I had one hand on the Braille display feeling which number was highlighted as I used my other hand to find and then enter the numbers using the split tap method. This is the best way I've found for making certain you enter the passcode correctly. You can, of course, find the delete key at the bottom right of the virtual number pad on the touchscreen and get rid of mistakes. I found this process somewhat nerve-racking without speech output but I think it would become second nature after a while. There's really no avoiding the need to enter the passcode every so often and after any time you shut down and turn on the device. Once that's been done, you can then use easier methods to unlock your device most of the time like Touch ID or Face ID.

While Braille focus will jump to where new messages are displayed, other things such as choices or controls may need to be more actively searched for by Braille users. Everything will be reachable but, without such proactive exploration on your part, you might not realise there are choices or controls present in some apps. The ability to actively explore the screen is a key part of iOS accessibility which works differently from other screen readers that may look for and announce more things automatically.

Mainstream Economics and Wider Horizons;

The Advantages of iOS Over A Traditional Braille Solution Tailored For Blind People:

There are several advantages that iOS devices bring to the table for blind people wanting to make use of Braille in daily life. Braille notetakers work extremely reliably and typically have excellent battery longevity. However, they are also very limited in what they can do. They offer a set number of highly perfected functions which work flawlessly, but they don't offer much ability to grow beyond those. On the other hand, iOS gives you very good but sometimes imperfect access to an ever expanding ecosystem of apps, ebook markets, and other things available to sighted people. Even though only a fraction of the total apps available for iOS are accessible to blind people, that still far surpasses anything you'll find elsewhere other than perhaps on Android devices. No Braille notetaker will let you do your shopping and banking with apps designed by the same banks and grocery companies used by potentially millions of sighted customers. This access to the same apps used by sighted people could be very helpful socially to blind students and other Braille users. It's an option they've never really had before the iPhone gained its VoiceOver screen reader. You could read news articles in an app or on a web site in Braille while talking about them with friends. And then, there are the specialised apps which take advantage of hardware built into your iOS device. Presuming you opt for a small portable Braille display, you could use an app like KNFB Reader to take pictures of the pages of a restaurant menu and then read it in Braille while conversing with your dining companions. There are all kinds of situations like that where you want access to information but also want your ears free.

Braille is especially useful when it comes to the study of mathematics. It can be tremendously hard to picture how an equation is laid out while using synthetic speech. A Braille display supported by software such as the VoiceOver screen reader will let blind students and others feel the positions of the parts of equations. This can make solving them a far easier process. There are other instances where having the ability to feel the position of information is critical, such as when examining charts or tables. How practical this is will depend on the length of your Braille display; you may need to feel a chart one-handed while the other hand reads the display.

Braille displays are very costly items and are built to last. I've had the same Focus40 display since I got my first iPhone in 2010 and it still works great in 2018 with my current iPhone 7. Barring disaster, I expect my display to see me through potentially five to ten more years. The core of my system is my iPhone and I can have the latest features without upgrading the far more expensive Braille component. Also, I can use my display with more than one device. If something happens to my iPhone, I could still use the display with my iPad for instance. If your notetaker breaks down, you need to repair or replace a very expensive device and be without all of its functions while you're taking care of that. If my display breaks, I still have my iOS device which can be used with speech.

If you already have a Braille notetaker, fear not. Most of them can connect via Bluetooth and act as Braille displays for iOS. You have the best of both worlds. Some notetakers are designed to integrate with apps on iOS devices making them an even more powerful combination.

Attention Please;

When Notifications Pop Up:

You're reading along when all of a sudden, the Weather Gods app decides to reveal that it's raining heavily outside. Perhaps, a friend has chosen to send a message asking how you're doing. Unless you have your device on "Do Not Disturb" or have notifications turned off, you will eventually be interrupted from whatever you might be doing by a notification from another app running in the background. In such a case, the same sort of thing happens with the focus of Braille as happens with speech. The interrupting notification automatically gets focus for a short time which you can determine before focus returns to what you were doing. Also, if you pan through the notification while it has focus, you should be able to read it in an unhurried manner. Remember that if they do disappear on you, you can always find them in the Notification Centre.

Commonly Used Commands:

This guide won't go through every single command. The set of commands available to you depends on which Braille display you're using. The place you want to reach is a page called Braille Displays Supported by iPhone, iPad, and iPod Touch. Below that heading, you'll find a series of links to specific Braille display command lists. Additionally, below this, you'll find a link to a set of universal commands which should work on any display. At the time of this writing, this helpful resource can be found at:

The lists of Braille Commands are comprehensive well-organized lists divided into headings and tables. Apple might change one or more of these commands at any time so it's best to get them directly from Apple's own documentation. However, this subsection should go through the commands you'll need to start trying things out.

Most Braille displays come equipped with some basic control buttons or other things like wheels or rocker switches. These minimise the need to take your hands off the display in order to control the computer it is connected to. Pretty much all displays include a Braille keyboard to facilitate input. Presuming you've gotten your display paired, you should feel Braille pop up as you move your finger over the screen. The controls on your display should do what the instructions which came with your display indicate. For instance, panning buttons will move left or right through text. Navigation rocker switches and advance bars should behave in logical ways. To start finding out what all the buttons do, you can use the keyboard help command. That's the space bar plus the letter K, dots 1 and 3. Think of the space bar like a control key. Once you've entered keyboard help mode, try other space bar and character combinations. You will be told via speech what they are. This also applies to any other controls on your particular Braille display. VoiceOver has full support for at least 70 different Braille displays at the time this guide is being written in 2018. Even if your display only has a Braille keyboard, there will be enough space bar key combinations for you to control your iOS device with reasonable proficiency.

Navigating Important iOS Areas:

Any time you want to reach the home screens, just use the space bar and letter h [dots 1, 2, and 5]. This should work from anywhere in iOS. While on the home screen, you can start typing in the name of an app you want to get to and matching items will appear in a list which can be quickly scrolled through. You can scroll through this list or through all apps on the current home screen in order via the space bar plus dot 1 or 4 for previous and next item respectively.

Braille Settings in VoiceOver:

Within the settings for VoiceOver inside Accessibility settings, you'll find a subgroup simply called "Braille". The VoiceOver screen reader is doing all the thinking while your ultra-expensive Braille display simply moves dots up and down in perfect obedience. Your overall experience should be similar regardless of which display you use. This certainly holds true for the group of settings we'll discuss now. However, be aware that there is another group of settings called "Braille commands", which we will discuss later, that allows for total customisation of what the buttons and key combinations on your Braille display will do.

These settings let you set things such as the particular code of Braille to be used with input and output. For instance, you might want six-dot uncontracted Braille for input and contracted Braille for output. You may not want to use UEB Braille if you aren't yet familiar with it. That's perfectly possible. Also, you can choose whether you want word wrap on or off. Word wrap determines whether lines end with the last possible entire word or whether they can contain the beginning of a word completed on the next line. You can also decide whether or not to have the panning buttons automatically proceed to the next page when you reach a page boundary and pan further.

Connecting a Braille Display:

The first thing to do is pair your Braille display with your iOS device. Make certain the Braille display is ready to be paired via Bluetooth and then flick right through the Braille settings until you come to "choose a Braille display." You should then find a list of any detected Braille displays. Be careful since it may think ordinary keyboards are Braille displays. When you come to the name of your display, double-tap on it to initiate pairing. You may be asked to enter a PIN using the Braille display to help secure the pairing and make certain input coming from the display is recognised as such. There may be other Bluetooth devices connected to your device or operating close by. The number you enter sets up a secure and easily identifiable connection. Once a display has been paired, you shouldn't have to go through this process again in normal circumstances. If your display loses connection, simply locking the screen and unlocking it again should restore the connection.

After you have successfully paired your display, the Braille on it should change as you move your finger around the screen. It will be showing the labels of apps or information on the screen as Braille when you touch it.

Braille Screen Input:

You need not have a Braille display for Braille to be a part of your iOS experience. Braille screen input lets you use a virtual Braille keyboard by positioning your fingers on the touchscreen as if you were writing on a Braille writer. You need to enable the option in the VoiceOver rotor settings. The dot position can be calibrated to your natural finger positions on the screen surface. This kind of input can be very useful and people may find it easier than dealing with the ordinary onscreen keyboard. The Braille screen input setting lets you customize whether you want contracted or uncontracted Braille. You can also decide to reverse the positions of the outer dots so dots three and six are closest to the imaginary space bar rather than have the dot numbers increase the farther away from the space bar you get as they traditionally do.

Status Cells:

You can choose to have a cell on your display be used for showing status information rather than a character of normal output. Each dot in that cell indicates something such as that your battery is low, there is more text on the current line, a message awaits your attention, etc. You can choose whether the status cell is on the left or right side of the display. You also may choose whether it shows general information like I described above or text information such as format, font, etc for the current character. This would be useful when writing a document. If you're using a status cell, you can turn the rotor to a status cell setting and flick up or down to find out what each dot on the cell means.

Math and Equations:

There is a setting where you can choose whether Nemeth code, designed to represent mathematics in Braille, is used for equations. Increasingly, ebooks and other documents which contain mathematical equations are accessible through reading apps and VoiceOver, so it's worth knowing this choice is there if you encounter equations while using a Braille display.

Word Wrap:

Because iOS has full control of formatting what is sent to your Braille display, it can decide when lines end. The word wrap setting lets you choose whether words wrap neatly onto lines or whether a line can end with a partial word that's completed on the next line when you pan over. The first choice may help to clarify all words encountered at the end of lines. However, the other choice, which allows lines to contain all possible text that can be accommodated by your Braille display, may allow for faster reading.

Crossing Page Boundaries:

Another setting lets you choose whether panning over a page boundary automatically moves onto the next or previous page. I have it automatically advance but can appreciate the utility of having panning stop at page boundaries. Keep in mind that some Ebooks don't always give you printed page position. Kindle books provide a location number which you can use to instantly jump to a precise position if you know the number.

Input, Output, and Braille Codes:

There are different styles of Braille much as there are different forms of writing. Depending on when and how people learn Braille, their needs and comfort with the various forms will be different. The input and output settings let you choose whether you want six dot, eight dot, contracted or uncontracted Braille. iOS can support any of those choices quite well. You can also switch between these modes as needed. You might, for instance, wish to read in contracted six-dot Braille but write in uncontracted Braille.

In addition to the type of Braille, there is a separate setting from the input and output settings that lets you choose the overall code of Braille to be used. You can choose between US, UK, and UEB Braille codes. The UEB Braille code is the recently introduced code of Braille which Braille libraries all over the English speaking world are now using to produce books. Thanks to the Marrakesh Treaty, this will allow institutions and library patrons to take advantage of books already produced elsewhere provided the countries have ratified the treaty. By eliminating the need to duplicate work already done elsewhere, institutions belonging to countries which have signed this treaty will free both time and money to broaden their selection of books. The ability to use this new code or, at the user's preference, the older US and UK Braille codes, lowers the bar for people who may not be familiar with the newer UEB code. They will still be able to read and enjoy the latest books in Braille provided they can afford to purchase them.

Hiding the Onscreen Keyboard:

If you're not using the onscreen keyboard, you are able to hide it. This is useful if you're using a Bluetooth keyboard or a Braille display which typically has a built-in physical keyboard. A setting lets you choose whether the onscreen keyboard is shown or hidden. If you don't want the onscreen keyboard shown, that space will be re-purposed and used to display more of whatever is on the screen such as a document or page.

Braille Commands:

Taking Full Control Using a Braille Display

Controlling two separate devices at once can be taxing on the brain and on productivity. In iOS 11, Apple eliminated the need for this for people using Braille displays. There is a somewhat hidden group of settings which lets you customize what all of the key combinations and extra buttons on your Braille display will do. We will now explore this group of settings Apple has chosen to call Braille Commands.

To reach these options, you need to go into the VoiceOver settings and into the "Braille" subgroup of settings. Next, flick right through until you reach the "Choose a Braille Display" area. Flick right until you reach the "more info" button to the right of the name of your display. Double-tap on that. The very first button you come to in the "More info" area will be "Braille commands". Double-tap that and you'll have found your way in.

This seems like a strange place to stick such a powerful bunch of settings. However, it fits with how Apple has chosen to handle other Bluetooth devices. Beside any connected device in Bluetooth settings, there's a similar "More Info" button. Any connected Braille displays are Bluetooth devices so they're keeping to an established pattern. This means that you can have a different series of commands for any additional Braille displays you might need to connect with. For instance, you might have a different Braille display to use at work versus at home. In such a case, the correct set of Braille commands will be ready when you need them with no extra effort on your part.

There are seven categories of commands each with their own button. There are a good many commands and most are rather self-explanatory. Rather than exhaustively going through each one, we'll take a quick tour of each of these areas. I'll give you an idea of what you'll find and why you might want to use what's there.


The Braille area is where you can set commands that relate directly to Braille control. For instance, you can set a specific key command like space bar and dots 5 and 6 to let you change quickly between output modes. This would be useful if you wanted to quickly go into uncontracted mode to feel how something was spelled and then go back into contracted mode. You may want to set a command to turn word wrap on and off depending on what you're reading. For quick progression through a novel, contracted Braille with word wrap off might be the best way to go. However, when editing a document, you may want uncontracted Braille with word wrap on. Setting up commands in this area lets you do that from wherever you may happen to be. Such key combinations save you having to go all the way into Braille settings any time you want to do this.


Let's say you're sitting on a bus listening to some nifty tunes when someone sits down beside you and says something. You could be rude and ignore him or her. Alternatively, you could pause the music. A third, possibly preferable, option might be to use the keys on your Braille display to turn down the volume on the music so you can hear both it and the person near you. That's the kind of thing the commands in this category are for. It's where you go to customize commands letting you control your iOS device.

You can set the command which simulates pressing the Home button. Another might take you to the control centre, the Notifications area, or summon Siri. There are commands which simulate rotating your device left and right. Others would let you easily adjust the volume using only your Braille display. All the while, your iPhone is safely tucked into your pocket.


Sometimes, especially while using Braille, you don't want to take your hands off the display but need to perform a gesture such as a double-tap. This area has commands which let you simulate doing things like a single or double-tap. They allow you to come up with commands on your display which eliminate the need to touch your iOS device to do simple things like a long press, use 3D Touch, etc. In normal circumstances, I find it easier to just touch my device and use the normal touchscreen gestures. However, if people struggle with using the touchscreen or have other hand mobility issues, these commands might make the difference between being able to use an iOS device or not. They offer a kind of precision that only a keyboard and numerous key commands can deliver. Given the relatively short time I've used a touchscreen extensively compared to the decades during which I dealt with my computer via such key combos, I'm gobsmacked at how positively old-school this now feels.


This area lets you set commands to perform special things such as selecting text, copying, cutting, deleting, etc. Options to perform these tasks would normally be present on the virtual keyboard. This lets you access these options right on your Braille display using commands which you choose yourself. Many apps include extensive toolbars with these sorts of options. 


This area lets you set up commands that help you move around. There are commands for moving to the next or previous line, paragraph, app, message, and much more. If you have an iPad, there's even a command letting you switch between apps running on the same screen. If you want to get somewhere without having to use the touchscreen, this is definitely the category to visit and make use of.


The rotor is so important that it has a separate area from the VoiceOver area right beside it. There are just five settable commands here: next rotor option, previous rotor option, rotor up, rotor down, and speak current rotor item. This will be especially welcome news for Braille users who have trouble using the VoiceOver rotor gestures. People can use similar commands on a normal Bluetooth keyboard to control the rotor.


In this last category, you can set commands letting you make use of VoiceOver functions. This includes turning the screen curtain on or off, opening VoiceOver settings, speaking hints, muting speech, and many more. These commands can be key combinations or use extra buttons on your display if this is more advantageous to you.

Making It Work:

Controlling Apps With Your Braille Display:

Now that we've theoretically covered how to set everything up, we'll discuss what it's like to control your iOS device using a Braille display. To get a proper sense of how things work, we'll examine the Google News app and make use of it completely through the Braille display. This isn't how I normally operate. Ordinarily, I would use speech output or else use the touchscreen to control apps and my Braille display for reading. However, approaching things completely through the Braille display demonstrates the possibilities for taking full control should that be necessary or preferable for you. This is the only time in the guide where this will be demonstrated. All other instructions for using apps will presume that speech output and the touchscreen are used. The techniques demonstrated here should be sufficient to help Braille users figure out how to make use of the majority of other accessible apps. The app I have chosen is available in the app store and has stood the test of time. It is a third-party app developed by people who have used Apple's accessibility tools to include blind people. Google News is very highly regarded by blind users.

You will need to make use of the app store to acquire Google News. Searching the app store and obtaining apps are covered in detail elsewhere in this guide. One easy and quick approach is to invoke Siri and say "find Google News in the app store". You would then purchase the app by using the "get" button and then completing the identification process that occurs whenever things are acquired in the iOS ecosystem. This is explained more fully later in the guide. Let's proceed with the assumption that the Google News app is acquired and present on your iOS device.

Using Google News With A Braille Display:

Google has produced a number of very useful apps for the iOS operating system. It has made certain that these apps offer support for users of the VoiceOver screen reader built into iOS. At the time this guide was being written in 2018, Canadians were still not able to make use of the News app produced by Apple. Alternatives are quite plentiful but they don't come pre-installed with iOS. The Google News app should prove useful regardless of which country you happen to live in. It is also free to download and use. The more you use the Google News app, the more it learns about what you're interested in. This will affect the contents of your personal briefing, which is the section of the app that you start in upon opening it. To open Google News, find your way to the app and double-tap on it. Alternatively, tell Siri to "launch Google News".

Panning Left And Right:

You start out on the personal briefing screen. Try using the panning buttons on your display. One thing which becomes immediately apparent, especially if you're using speech output as well as Braille, is that more is read back to you via speech than is obvious via Braille. The length of your display dictates how much is displayed at once. After the app has opened, it focusses on a title line which also happens to be a heading. This line indicates that you're in the personal briefing section and gives the user's name. That briefing contains the top five stories that Google believes will be of particular interest to you at the moment and also has the current weather. That's too much information for most Braille displays. To read the rest of what you would hear and continue through the briefing, use the right panning button or key combination. You'll be using the left and right panning keys a great deal to look around. If you pan left as far as possible, you'll end up finding a "search" button which wasn't spoken as the app opened up. It pays to explore. Spend some time on this initial screen getting the hang of panning around. Next, do the same with using the next and previous item button or key combinations. With the standard set of Braille key commands, these are space bar plus dot 1 to go to a previous item or space bar plus dot 4 to go to the next item. This is much faster than panning since items may contain text requiring several presses of the panning button to go past. It also brings you to things such as "read more" links, buttons, and other things which might not be obviously functional given their text.

Getting Quickly To the Bottom or Top of Things:

Let's suppose you want to get quickly to the top or bottom of a screen to then start exploring from that end of it. You'll eventually want to do this to get back to the top of a story you've read to reach the "back" button in order to leave that story. You may want to reach the bottom of a screen to locate tabs so you can quickly get to another tab in the app. To do this, use the commands for reaching the top or bottom of screens. These are space bar plus dots 1, 2, 3 for "top" or space bar plus dots 4, 5, 6 for "bottom". Note the logical progression from the earlier previous and next item key combinations. There are helpful patterns such as this in the default set of commands which will aid you in mastering them.

During your explorations, you'll have come across a number of elements such as "read more" links, buttons like the "search" button near the top of the app, and tabs leading to different areas. The Google News app is chock full of things like this making it an excellent app to practice exploring with. To activate an item, use the key combination of the space bar plus dots 3 and 6. This is the "activate" command and will cause any of the buttons, links etc, to be interacted with provided you're directly upon them. Now, you can explore to any depth you wish when it comes to items discovered in the app. There will be plenty of news and articles of interest for you to read and explore.

Search Me!:

Remember that little search button we found while panning around? That packs some serious finding power. We are, after all, talking about an app made by Google, which is world famous for its search capabilities. Make your way to that search button and then use the space bar plus dots 3 and 6 combination to activate it. A new dialogue will appear. You will automatically be placed within an edit field where you can type in whatever words you wish. You can use the space bar plus dots 3 and 6 activation command when you're finished typing to proceed to results matching your terms. You can then use the commands for panning and moving between next and previous items to look through these results. Use the activation key combination to select a result and you will be taken to it. When finished, use the "back" button to return to the list of search results. Pretty simple, isn't it?

Writing and Editing Using Braille:

While you're typing in search terms, feel what's there after you've written a word or two. Below a cell in what you've written, you'll notice a pulsing pair of dots, presuming your display has eight-dot cells. These pulsing dots beneath a character indicate precisely where your cursor is so that you can edit effectively. The commands to edit a piece of writing comprise the rotor commands to move around and the delete key for removing what you don't want. The delete key in the default Braille command set is space bar plus the letter d, which is dots 1, 4 and 5. In iOS, remember that the character which is deleted is the one to the left of the cursor, so move one character to the right of whatever letter you wish to remove.
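For technically inclined readers curious how dot numbers like "dots 1, 4 and 5" relate to actual characters, the mapping is simple arithmetic. This small Python sketch is purely illustrative, and nothing VoiceOver requires you to know: in Unicode, Braille patterns start at code point U+2800, and each raised dot n simply sets bit n minus 1 of the offset.

```python
def braille_cell(*dots):
    """Return the Unicode Braille character for a set of raised dots.

    Unicode Braille patterns begin at U+2800; raised dot n (1-8)
    sets bit n-1 of the offset from that base code point.
    """
    offset = 0
    for dot in dots:
        offset |= 1 << (dot - 1)
    return chr(0x2800 + offset)

# Dots 1, 4 and 5 -- the pattern behind the "delete" chord
# (space bar plus dots 1, 4, 5) -- form the letter d.
print(braille_cell(1, 4, 5))  # prints the d cell: ⠙
```

This is why the default command set can describe chords by letter: the dots for d really are the dots in the delete chord.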

Rotor Turning by Remote Control:

While editing larger documents or doing many other things, you'll make use of options on your rotor. A set of commands lets you turn the rotor without ever touching your iOS device. While editing and writing text, you must use the rotor to move around by various amounts through text. The rotor also gives you options for selecting and operating on blocks of text. The Braille commands to turn the rotor are as follows:

1. Turn rotor left to previous item: space bar plus dots 2 and 3.

2. Turn rotor right to next item: space bar plus dots 5 and 6.

3. Flick up: space bar plus dot 3

4. Flick down: space bar plus dot 6.

By default, there is no command to speak or display the currently selected rotor item but you could set one up for yourself in the "Braille Commands" settings.

Concluding Thoughts About Refreshable Braille:

Exploring the iOS operating system through the lens of its support for Braille has been an interesting journey. I came into this not knowing about most of what I discovered. Adding to my struggles was my lack of speed in both reading and writing Braille. I kept having a sense that things should have been quicker. Indeed, they are much quicker for people who are used to operating Braille displays with other screen readers or using Braille notetakers. People who are more proficient than I at reading and writing Braille will find a lot to like and take advantage of in terms of efficiency. I don't think they would find learning the commands and what various options did to be anywhere near as frustrating as I did.

While doing research for this section of the guide, I felt that it was important to approach people who used Braille with their iOS devices on a regular basis. I wanted to see if my intuitions on when and how they used Braille were correct, and what I had doubtless not thought of at all. I found more than one helpful forum thread on the Applevis site. Many people have found using Braille displays very advantageous when they need their ears free for participating in conversations while operating their iOS devices. Also, in noisy environments where VoiceOver was hard to hear, Braille proved to be a very useful alternative.

Typing in Braille on a display was also quite often cited as a distinct advantage over using the onscreen keyboard. One lady mentioned that she used text adventure games to improve her Braille typing in a less stressful way than doing homework and fretting over every mistake. Provided one is proficient enough, Braille can at times be even quicker than speech when it comes to reading or getting things done. Also, it leaves your ears free for listening to music while reading a good book.

Proof-reading is another excellent use for Braille displays. Now that word processing has become practical on iOS, people are able to read their own writing, which can give a different sense of it than having it read aloud. Formatting information can be conveyed without breaking the flow of reading by means of the status cell.

Braille is especially useful when it comes to the study of mathematics. It can be tremendously hard to picture how an equation is laid out while using synthetic speech. A Braille display supported by software such as the VoiceOver screen reader will let blind students and others feel the positions of the parts of equations. This can make solving them a far easier process. There are other instances where the ability to feel the position of information is critical, such as when examining charts or tables. How practical that is will depend on the length of your Braille display, or on your ability to feel a table one-handed while the other hand reads the display.

There are use cases, and then there are well-designed Braille display cases which let you hang the display so it's in front of your chest. You can then make use of it more easily while on the move or even while standing. My wife Sara has a very old Braille notetaker which has such a case, letting her direct the choir of her church while referring to notes. Presuming you've mastered the necessary commands, you could operate apps on an iPhone in your pocket without needing to hear it speak, with your display hanging at chest level leaving your hands free for reading or other tasks. GPS apps might be operated using Braille in this manner while on the move. While recording the lecture series which accompanies this guide, I made use of a large 40-cell display on my lap along with some notes on my iPhone to keep me on track. Braille displays can be a powerful advantage when it comes to public speaking and presentations.

Overall, Apple has developed a remarkable and powerful platform for Braille users. This is especially true now that you have the ability to customise what the buttons and key combinations on your Braille display will do. With that power comes the danger of making a real mess of your interface. You might make so many changes that you forget what they all are, and then discover that you can't remember what the original default options were. The only solution I've found which gets you back to square one is to use the "forget this device" button in the "More Info" area for your Braille display. You could then re-pair it, and the settings would all return to the defaults since your customisations for the display would have been forgotten. At that point, you could begin to set custom commands again. The good news is that once you have a command set that you like, it should stay there until you use the "forget this device" button.

People may feel overwhelmed by the need to learn all of the various commands and options. I certainly did. Don't forget that it's never an either/or choice. You are always free to use the touchscreen gestures you may be far more comfortable and familiar with. I think the real strength of combining refreshable Braille and iOS comes when you use some of the commands but mostly stick with the touchscreen gestures. You don't have to memorise and use all of the commands unless they work better for you.

One thing which I have tried and can't recommend is using a Bluetooth keyboard as well as a Braille display. I had a number of instances where this seemed to cause some confusion. I found that the only way to fix this was to forget and then re-pair the Braille display with my iPhone. Thankfully, I hadn't customised a whole bunch of commands. Had I done so, this would have been quite frustrating. The majority of Braille displays have Braille keyboards included, and I suggest using those exclusively to avoid this setback.

Braille literacy isn't something which should just be thrown away. However, people who advocate for it need to make certain they come at the problem from realistic angles. One thing which I fervently hope happens is that opportunities to have fun using Braille are as strongly encouraged as possible. I learned how to type and use access technology largely by playing games. iOS presents some unique opportunities in that apps which support VoiceOver are also perfectly accessible to sighted people. Look for games which are text-based and not dependent on visual hand-eye coordination. I'll make some recommendations in the section of this guide which is dedicated to games and their benefits.

There was a series of game books which did a lot to encourage young teens to read more than they were inclined to. It was called Fighting Fantasy and combined reading with choices and dice rolling. So far, the apps which are bringing these games into the modern digital age do not include support for VoiceOver. This shuts blind people out of games which they could otherwise play quite easily. I would dearly love to see this situation change, and perhaps have an organisation fund the addition of accessibility to these games and other similar apps which are currently inaccessible. The excuse most frequently given for this state of affairs is that the app developers lack the funds to make this economically viable. If people really want to see Braille thrive, steps like making these sorts of games accessible could really help. I don't have many fun memories of Braille other than the occasional enjoyable book. That needs to change. I can easily envision multi-player party adventure, board and card games played on equal terms by sighted and blind players. iOS certainly allows for this, but nobody has yet taken up the challenge extensively. This should and could easily change. Over time, more companies will make their apps accessible to VoiceOver and hence to Braille users.
Rather than having this Braille access be an accidental by-product, I would very much like to see such efforts funded and requested by blindness organisations on the lookout for opportunities of this kind. There are many circumstances in both work and play where having our ears free to focus on what others are saying can be absolutely crucial. Those sorts of circumstances are where Braille can really have a meaningful impact, even for people like myself who are used to speech output.

I have high hopes for greater Braille literacy thanks to the journey I've taken with Apple and Braille. It has drastically widened the scope of possibilities when it comes to the circumstances in which Braille might be used in modern life. We're still in the early stages of exploration. There are two sources of momentum which must come into play. The piece of this puzzle which Apple does not control is affordable Braille displays. As I write this guide, serious efforts are underway to lower the financial barrier to refreshable Braille. Initiatives like the Orbit Reader and Braille Me hope to dramatically lower the cost to individuals of reliable refreshable Braille. The second piece of the puzzle is somewhat under Apple's influence. It can take measures to facilitate and strongly encourage app developers to support VoiceOver accessibility when creating apps and to keep Braille users in mind. As more apps are made intentionally accessible to users of speech output and Braille, awareness will spread more widely and things will get better. Future generations won't be driven away from using Braille for lack of portability and convenience. The case for inclusion has at last reached a kind of critical mass. There is a long way to go, and in the case of Braille, the brakes have been on for quite a while now. However, technology has now taken off those brakes, started the car and got us moving again. There is the potential to take Braille to some very innovative new places.