Wednesday, March 6, 2019

The Camera Which Helped The Blind;

What Is This?

Identifying Objects, People, and More with Apps and The Camera in Your iOS Device:

Handkerchiefs are splendid things to have handy when you need to wipe off messy fingers so you can then use your iPhone without getting guck all over it. They have made my day more bearable on many occasions when I couldn’t come up with facial tissue when allergies got the better of me. They are, however, the most unbelievably frustrating objects on Earth to try finding once dropped. They make no sound at all when landing on apartment carpet, or even on hard floor for that matter. I had been given a dozen handkerchiefs and have slowly lost most of them over the past six or seven years. Eventually, I would step on the missing one or else vacuum it up by mistake and have to dig it out of the fluff, if I even had the good fortune to realize this had happened. Of course, there was always the possibility that it had fallen on hard floor and was now a hazard awaiting my unwary footfall to send me slipping or sprawling headlong into an apartment wall. At least, this time around, I was pretty certain that the handkerchief was lost somewhere in the apartment. I couldn’t be certain how many days ago it had fallen from my pocket, though, so it could have been in any number of places.

There are, of course, time-tested methods for finding dropped objects. The grid-pattern search is the most thorough and laborious. I resigned myself to goodness knows how many minutes or hours of doing such a search of my entire apartment by feel. And then… it dawned on me.

Around the time I got my first iPhone, my father had gotten an iPad. We had chatted using the FaceTime app, and I remembered that I could switch it to use the rear-facing camera. Rather than fruitlessly crawling around the floor searching, I called my father on FaceTime. He began telling me where to point the camera so he could see the apartment floor and furniture where the handkerchief could have fallen. It took him no more than a couple of minutes to look over the floor through my phone’s camera and tell me where to reach down and retrieve it. He didn’t even have to leave his living room to help with finding things or reading printed documents. A few times, he even helped me restore my obstinately silent computer to working status by telling me where to point the camera and what was on the screen.

Finding my lost handkerchief was my first mind-blowing experience of how very useful the tiny camera in the corner of my iPhone could be to blind people. Over time, it became habit to FaceTime my father whenever I needed his help with tasks requiring working eyes. For quite a while, that was where things basically stood regarding useful applications of the camera. It took time for the technology to improve and for Apple to give app developers enough control over the camera before the really awesome stuff started. Slowly, the first apps began to appear which were designed specifically to help blind and visually impaired people through use of the camera. These apps made use of artificial intelligence, big data, and the ability to connect people in need of sighted help with sighted people willing to help when called. In this section, we’ll look at the most popular apps which attempt to bring aspects of sight to blind people through the technology built into every iOS device.

Object Identification:

Sadly, not all blind people have sighted friends or family who are willing to lend their eyes when needed. There are also times when, due to the lateness of the hour for instance, it would be inadvisable to call upon one’s friends just to find out what kind of cake you’re pulling from the fridge for a midnight snack. Being able to have the benefits of sight for crucial moments here and there, without inconveniencing people, has always been a strongly held desire among blind people. Tapping into the capabilities built into your iOS device, app developers have now largely made this dream a reality.

Digit-Eyes:

Apps such as Digit-Eyes attempt to identify objects using the bar codes most products now carry. The user simply moves the camera over the surfaces of an item until the software detects the bar code. The app then taps into a database, usually online, where the bar code is matched to an entry, and the blind person is informed what the object is. Because it is managed online, the database of bar codes an app like Digit-Eyes can tap into is always kept up to date and takes no storage space on your device.

Although still available in the app store, this app hasn’t been updated in over two years. It has existed for nearly a decade and was the first app of its kind that I experienced. The database of codes which the current version can identify stands at over thirty-seven million. The ability to purchase and/or create labels to attach to objects, along with text or recordings that are spoken when those labels are encountered, might well make this somewhat older system a good choice in some circumstances. Since the text or recordings you create are stored on your device, this system can work even when you’re not connected to the Internet. This approach may also be less intimidating and easier to master for people less skilled with technology or fearful of giving away information to large corporations. It is, however, quite slow by the standards of today’s apps.

Tap Tap See:

https://taptapseeapp.com

Another approach is used by apps such as Tap Tap See. Rather than relying on a bar code, which can be hard to find on some objects, the app assists blind people in taking a good enough picture of the object to allow artificial intelligence or a human assistant to identify it. The user is then informed, usually within seconds of taking the picture, what the object is. The app can detect when an object is clearly in focus and emit a short beep to let the blind person know. To enable this feature, go into the “more” tab and then double-tap the “settings” button to access the app’s configuration area. This can help tremendously for people who have never had occasion to take a picture before. Tap Tap See can also analyze up to ten seconds of video in order to identify objects in the camera view.

Tap Tap See was king of the proverbial block for quite some time. It was once a paid service where you had to keep re-stocking the number of pictures you could have described for you. It has long since become free for all users to use as much as they like, and it still comes in handy every now and then.

For my wife Sara and me, Tap Tap See was well worth paying for photos while that was actually necessary. To be able to simply pull out one’s phone, snap a picture of a bottle of beer, a box of crackers or a cup of noodles, and find out what kind it was within seconds: that was absolute, unadulterated magic. No longer was it necessary to label so many things that would only be used once. No longer did we have to be so very careful about where we put which soup cans. Provided we had our iPhones, we could make informed choices without having to open something to find out what it was.

KNFB Reader;

A Reading Revolution In Your Pocket: https://knfbreader.com

Being able to read print easily wherever you might happen to encounter it has been another long-standing, ardent wish for many blind people. My second truly mind-blowing experience with the camera on my iPhone happened on September 18, 2014, the same day as the Scottish referendum. Thanks to the generosity of John Morgan, a philanthropist whom I’ve had the honour and good fortune to call a friend, I felt able to afford the most expensive app I’ve ever acquired. For quite a while, I had been hearing about a remarkable new app called KNFB Reader. An early demonstration of its capabilities involved someone taking a picture of a screen at the front of a large auditorium and having the print on the screen read out perfectly. This app could also help guide your hand as you positioned your iPhone above a sheet of paper to get the optimum shot for best recognition results. It was one of those things that felt either too good to be true or too revolutionary and life-changing to be available to ordinary people like me.

I was using an iPhone 5S by this point, so I had the capability to run it. People were almost too ecstatic about what it could do. I heard stories of people effortlessly photographing even large, poster-sized documents and getting nearly perfect OCR results. People talked of snapping pictures of signs on the street and learning what they said. It all sounded too good to be true. The price of over $100 Canadian certainly gave me pause. I could buy a good portion of a month’s groceries or a lot of Kindle books with that sort of money. I would, in all likelihood, have ultimately made the purchase. However, what decided me that day was an email and donation from John Morgan. He had heard of this remarkable app and wondered if I’d be interested in taking it for a spin and telling him what I thought of it. Naturally, at that point, I agreed. I purchased the app from the app store without difficulty, and it installed without any issues.

I opened the app and read through the instructions. For an experienced user like me, or even a beginner, it all struck me as very simple and well thought through. I could use a feature called “field of view report” to get a sense of how well a document was framed in the camera’s view. Once I had it in a good position, I could take a picture and the document would be read out to me, apparently within seconds of taking the picture. Testing it out on some smaller flyers I had gotten in the mail, I happily found that the app was as good and easy to use as people said it was. To really put my new app through the wringer, I found a large, poster-sized paper that the Ontario government had sent me. I had used my older version of Kurzweil 1000 and the OCR scanner on my desk to partially scan the document. It wouldn’t all fit in the scanner, but I had gotten enough read out to know it wasn’t something I needed to wrestle with in order to hear the rest. However, if this new app worked like people had raved, it wouldn’t be nearly so hard and time-consuming to get the full document readable.

I laid out the large sheet on my dining room table and made certain the room light was on. For once, it would actually be helpful to me rather than to the occasional sighted visitors I had. Using the field of view feature, as well as vibrations which gave me a sense of how tilted my iPhone was, it took a number of attempts before I found a position which let the camera take in the whole paper. Moving my iPhone farther above the paper and then getting another report, I was able to home in on the perfect position. Then, I carefully double-tapped the “take picture” button. The camera sound played and I waited, hoping I hadn’t disturbed the position of the iPhone by tapping too hard. It took around ten seconds. I was just beginning to wonder if I had crashed the app when a synthetic voice began reading the document. The contents were about as deadly dull and unimportant to my life as humanly possible. Nevertheless, I was absolutely spellbound. The paper was read out perfectly. I had never read anything like this without there being a number of OCR recognition errors. You’d encounter them at least once every couple of sentences: a few nonsense characters, or letters instead of numbers. It had been this way since my days in secondary school when I had gotten my very first scanner. That still certainly happens even with KNFB Reader, but it’s a lot less frequent. On that day, however, the stars must have been right. I stood there at the table, utterly amazed, waiting for a mistake that never came as perfect sentence followed perfect sentence. I couldn’t help but think back to the weekend I had spent scanning a copy of The Elements of Style so I’d have it in time for my creative writing class in university. A good portion of my ongoing, perhaps unfair, seething hatred of that book can be traced back to the wrecked weekend of effort it took to get a far from perfect but useable copy scanned into my laptop one page at a time. How much easier and less painful it would have been with an app like KNFB Reader.

The implications for today’s students are absolutely profound. It might actually be useful for them to go to a library, borrow a book, and find the information they need in it. They could read through forms and merely need sighted help to fill them in properly. All of the work of character recognition happens on your own device. This means that your data is kept absolutely private and doesn’t need to travel anywhere for processing. You can also export the text to other apps and share documents when you want to. The vibration feedback and the field of view report feature make the process of learning to take good pictures of sheets a lot easier to master than with apps which cost less.

Compared to other apps in the app store, KNFB Reader is one of the most expensive purchases you’re likely to make, even if you manage to get it on sale. However, consider this more carefully. I needed government funding to obtain the Kurzweil 1000 software I had been using for over a decade. Each time I wanted the latest update, I had to pay over $100 to have it sent to me. The government agency initially spent over $1000 to purchase my user license for the software. However, for the cost of a single Kurzweil 1000 update, I had gotten an app which consistently yielded results as good as or better than Kurzweil 1000. These results were obtained in a fraction of the time that software and scanner took to even scan a sheet, let alone interpret what they saw. Rather than a bulky scanner taking up desk space, I could pull my iPhone out of my pocket and read print anywhere. I’ve never had to pay for updates to the KNFB Reader app; they just keep coming every so often. While there may be less overall need to read print as more documents become available electronically, I’m still very happy to have this app for when I need a full and accurate scan of something in print. I use it every day to read my physical mail and notes from the staff of my apartment building. The KNFB Reader app was, and still is, an absolute game changer.

App Store Expedition:

Prizmo Go; A Cheaper Alternative Reading Solution:

https://creaceed.com/prizmogo

While KNFB Reader was certainly the first really noteworthy print-reading app, it wasn’t the only kid on the block for very long. Other apps appeared, attempting to offer OCR capabilities more affordably. These days, unlike when I picked up KNFB Reader, it has some good competition. By far the most successful of these competitors is an app called Prizmo Go. It was designed to cater more to the sighted user. However, Creaceed SPRL, the app developer, took great care to incorporate support for VoiceOver and special guidance to help blind users orient their cameras to get good pictures of text. You can try the app free of charge, enough to get a sense of whether it would suit your needs. Should you need more of its capabilities, you can purchase some outright, such as the ability to export and share text. Other capabilities take advantage of cloud-based processing to offer enhanced accuracy above and beyond the already superb performance of the built-in OCR. Prizmo Go also offers language translation. If you hesitate to pay an ongoing subscription, you can instead pay for a limited number of cloud-enhanced scans and/or translations. Otherwise, simply pay a one-time fee to unlock the exporting capabilities of Prizmo Go and you’ll have yourself a very robust and portable OCR solution.

This illustrates the power of large-scale economics. While you won’t find quite the same level of intuitiveness as with KNFB Reader, Prizmo Go offers very comparable OCR results for people who are comfortable and proficient with the camera on their device. The “scene description” button at the bottom right offers guidance similar to the “field of view” report feature of KNFB Reader. It also gives a sense of the number of lines of text in view. There are VoiceOver hints throughout the app, as well as other help offered via the “app settings” button found at the top left. For a lot of people who feel comfortable using the camera, Prizmo Go will be more than sufficient to meet their OCR needs and will cost them a whole lot less money. For plenty of others, the more intuitive feedback and guidance that only an app designed from the ground up for blind users can provide will be well worth the extra one-time expense. I’m happy that we as blind people can now regularly make such choices in the iOS ecosystem.

Seeing AI:

Putting It All Together:

www.seeingai.com

To really propel things forward takes a partnership: in this case, between blind people who own iOS devices and need to know about things in their lives, and a gigantic, globe-spanning company working on cutting-edge artificial intelligence with mind-boggling resources. One summer, the blind community was absolutely stunned when the Seeing AI app from Microsoft appeared in the app store. As a means of conducting ongoing research into accessibility and artificial intelligence, Microsoft had managed to leverage its artificial intelligence, massive image database and computing power to come up with what is still the current must-have app for blind people. Seeing AI is the Swiss army knife of handy tools making use of the camera. It has a number of channels which each perform different tasks. The channel you start on is for reading short text as the camera sees it. Touch the channel selector in the bottom right and flick up or down to get to other channels. More are added as new features are deemed ready for public experimentation. Other available channels include ones for reading larger documents, identifying objects or people, identifying currency, detecting the level of ambient light, describing scenes and even reading handwriting. Some of these features are still in what is called “preview” status, which basically means they’re available for you to try but not yet judged fully developed. This app is available free from the app store, and using it is free of any charges other than possibly cellular data.

It’s impossible to properly convey the tremendous usefulness of the Seeing AI app, nor can its impact on the social-media-using blind community be overstated. All through the first months during which the app was available, it seemed there were torrents of tweets, Facebook posts and podcasts about this small but monumental app. Nothing could knock Seeing AI off of the five most recently recommended app slots on the AppleVis home page, and this despite the fact that users in the UK weren’t included initially. The envy from across the pond was thick enough to cut.

When you first run the app, you are presented with tutorials which pop up as you explore. They include videos produced by the app developers at Microsoft which explain how the various features work. Once you’ve gone through those, the app opens into the default channel for reading short text. Touch the channel selector at the bottom right and flick up or down with one finger to change channels. The next two channels above short text reading are for reading whole pages and for product identification via bar codes.

Keep in mind that Seeing AI is a research project. Basically, this means that you’re paying for the assistance you receive with the data you’re generating: you are the product. As Microsoft’s artificial intelligence assists you by reading text or describing people or objects, it learns from the images. Human developers working on Seeing AI and other Microsoft products might use the image of a coffee cup you photographed and had described in order to improve the ability of products using artificial intelligence to recognize coffee cups. A blind person might well have taken a picture of a coffee cup from a different, more unusual angle than most images in an image library. I feel pretty safe using it for most things, but I wouldn’t use it to get the security number of my credit card or anything sensitive like that. Always be aware of where images and data are going, and be mindful of the motivations of the people or companies who have access to them.

Speaking of being aware of who gets data, consider the case of my mother-in-law. Soon after my wife Sara and I began using the Seeing AI app, we visited her family and showed them the app. With her eager permission, we decided to try the person description channel and take a picture of my mother-in-law. Upon examining that picture, my iPhone calmly informed her that she was a good couple of decades older than was actually the case. She was less than thrilled with that description and jokingly warned me not to leave my iPhone unattended in her presence.

The Eyes Have It;

Bringing Willing Sighted Help Where It’s Needed:

Let’s face it: there are times when no amount of fancy artificial intelligence will do the trick. You need to hunt for something, complete a process, figure out where to write on a form, and so on. Some problems require an ongoing dialogue with a sighted person to be solved efficiently. With FaceTime, you must have someone in your family or circle of friends willing to help when needed. Not everyone has an iOS device, nor are the sighted people in our lives always free or knowledgeable enough to help. There are currently a number of apps which seek to remove these limitations. We’ll focus on two very popular apps which attempt to connect those who need sighted help with people who are willing and able to assist.

Be My Eyes:

www.bemyeyes.com

The Be My Eyes app connects you, via a video call similar to FaceTime, with the first volunteer who is available to assist. The app shows the volunteer what the camera on the back of your device sees. The real utility of Be My Eyes is its ability to connect people needing help with volunteers who have time to offer assistance at any given moment. This is great for finding lost objects or picking out clothes that look good on you.

Keep in mind that these are volunteers. All they have to do is complete a simple tutorial about how the app works. They aren’t vetted for security and are unpaid. Don’t use them for anything requiring sensitive information. While it is unlikely that people would volunteer with sinister motives, you never know. Someone could, for instance, see and make use of a credit card number you had a volunteer read to you. There are certainly ways in which this app and the blind people who use it might be abused by unscrupulous people.

To write this section of the guide, I wanted to test out the Be My Eyes app for myself. I found signing up to be very easy. You do need to agree to the terms of use and use the app in accordance with them. Volunteers are never responsible for your safety, nor can the developers of Be My Eyes be held liable for any misuse of the app. Once I had signed up, I placed a call by finding and double-tapping a button called “Call first available volunteer”. I have a collector’s coin which I got after participating in a documentary called Get Lamp; it’s available on YouTube and is about text adventure games. I asked the volunteer to describe each side of the coin. She did a nice job of it and told me how to move my iPhone so she could read me the small writing on the coin as well as describe the pictures. She had time for a brief chat and was curious about the documentary. I asked what she had to do in order to volunteer, and she described the simple tutorial she had completed.

The experience was very simple and unhurried. However, one is certainly conscious of taking up a volunteer’s time. Personally, I wouldn’t want to use a volunteer for anything too lengthy. That call certainly gave me an idea of the ability of the camera on my iPhone 7 to focus on small images and on what feels to me like tiny writing on the coin faces.

Aira;

Competent, Secure and On-Demand Visual Help For Rent: www.aira.io

Aira, pronounced “eye-ra”, puts everything on a business footing. Aira hires agents who help blind people with everyday tasks. These agents are vetted for security, so you can feel safer about sensitive information, and that makes Aira potentially much more useful depending on your individual needs. Also, you as a blind client pay a subscription for a certain number of minutes each month, which you can use for whatever you like without feeling guilty for taking up somebody’s time. The software used by Aira agents is more sophisticated and allows them to pull up maps or tap into your social media to recognize the faces of people. Aira can be used in conjunction with smart glasses, giving agents a head-level, hands-free view more similar to eyesight. Imagine walking down a street and a voice in your ear telling you about a restaurant sign coming up ahead. The agent could also tell you that your friend John is approaching from the left. That’s what happens in one of the promotional videos you’ll find on their web site. Because agents can access your location and other information, they can help you make travel decisions. During the setup process, Aira makes it very clear that they aren’t responsible for your safety. They won’t help you while you’re travelling unless you are using a mobility aid like a guide dog or cane. They can, however, describe your surroundings and suggest different routes. They could tell you about the signs of businesses you’re passing, where a taxi stand or bus stop is, or any number of details which might prove useful. They could help you operate an otherwise inaccessible ticket machine or read information from a restaurant’s display.

While I was writing this guide, Aira offered everyone a free trial which gave you thirty minutes that had to be used within seven days of the trial starting. I took advantage of this to take Aira for a casual spin. Some of my thirty minutes were taken up with going through the rules and answering some questions to set up my profile. After that, the agent asked what she could help me with.

In advertising the trial, people were encouraged to knock something off their bucket list. During the years I’ve lived in my apartment, I had often been told that I had a “good view” from my balcony. I decided to have the agent who took my call describe that view for me. I figured this would give me an idea of how wide an area could be seen through my iPhone’s camera, as well as a sense of how well Aira trained its agents in the art of description. She spent a few minutes describing what she saw and answering some questions I asked. I was impressed by how much could be seen and by the detailed description I was given. She agreed to type up some descriptive text and attach it to a picture she took and shared with my iPhone. Unfortunately, you need to be a paid subscriber to the service to access such pictures, so I couldn’t get at this little souvenir. However, I at least got an idea of what it was like to stand and look out on a snowy afternoon from my balcony.

As things currently stand, using the Aira service is an expense your bank account will feel, even for people with steady working incomes. As of March 2019, a standard subscription without smart glasses costs around $100 US per month, giving you 120 minutes of Aira time which can be used for nearly anything you want to do with the assistance of an Aira agent. This doesn’t cover the cellular data used, which could be prohibitive if you don’t have a high or unlimited amount of data in your monthly plan.

Putting things on a business footing has a more profound psychological effect than you might think. While you might hesitate to use a volunteer’s time for anything lengthy or complex, the mindset changes when you’re paying for a service. As fate would have it, a friend of mine posted a perfect example of this on Facebook while I was working on this section of the guide. Michelle McQuigge doesn’t consider herself particularly handy with tools or good at assembling things. However, she had bought a heavy-duty laundry cart and decided to use an Aira agent’s help to try assembling it without help from her more mechanically inclined sighted friends. The agent was able to find the assembly instructions on the Internet and then talk Michelle through the process of putting her new cart together. That just floored me. I can’t count the times in life where I’ve sat still or stood out of the way while someone sighted has tired him or herself out putting some piece of furniture together for me. An experience common to blind people is that sense of being more than willing to do the work oneself, if only a sighted person would patiently say what needed to be done and where things were. For Michelle, Aira made that dream come true.

The idea of attending an event and being able to get to your seat, or to know what’s around at a festival, independently is very compelling. You could tour a convention hall or have paintings in a gallery described to you. Aira really pushes this active and engaged lifestyle in its advertising. The company is doing a lot to try to lower costs and make Aira useful for people who can’t afford to subscribe to one of its plans. Businesses can sign up so that blind people can be helped by Aira agents while in their establishments at no cost other than cellular data. Airports and other places can be designated free for Aira use by anybody. Also, people can use Aira for job-hunting tasks at no charge and without losing minutes if they’re subscribed. There’s an entry-level plan at $30 US per month for 30 minutes. That might be useful for a few quick tasks around the home, but you wouldn’t want to go on an expedition with that plan.

Camera Conclusions:

Over the past eight years, I’ve continued to be delighted by the many ways in which the camera in my iPhone has proved a vital part of my everyday life. As the costs of artificial intelligence, computing power, Internet connectivity and camera technology go down, the possibilities for even those of us on low incomes will continue to grow. Thankfully, ways are being found to tap willing people and the vast resources of companies to improve the lives not only of blind people, but of everyone who gains from artificial intelligence made smarter at identifying things for us.

These are still very early days, and things are moving quite quickly. This section of the guide may very well be the first to become obsolete due to changes in this area. I’m fascinated to see where both the charitable and the business efforts to put the cameras on iOS devices in the service of improving the lives of blind people end up going over the next decade. There’s quite a lot of potential for amazing, helpful ideas. What really excites me is how all of this is done through a piece of technology commonly used by all sorts of people, be they blind, sighted, disabled in other ways, or fully able-bodied. This brings large-scale economics to bear on problems which formerly relied on smaller niche markets and special charity or government economics. That in itself is a massive game changer whose implications are still very much in their infancy.

Sadly, I think there is some cause for worry about an increased divide between blind people who succeed in finding steady enough employment to tap into what paid services like Aira offer and those who aren’t as successful. At lower income levels, you quickly hit a catch-22 where you spend so much on the service that you have no money left to go places and put it to good use. However, I think the costs will fall over time as more businesses join as sponsors or partners. I’m also hopeful that these services will have a trickle-down effect by changing public attitudes about the possibilities for blind participation in all spheres of life.

The fear surrounding what governments and massive corporations will do with the data we generate is still, I believe, a limiting force on what’s possible. Personally, I think a lot of that fear is misplaced. People aren’t really weighing up the benefits, and they hesitate to reach for the ready help that artificial intelligence can provide. For blind people, I see more of a danger in the one-on-one connections between them and the sighted people who assist them. It’s far more likely that an unscrupulous person would take advantage of a bank account number accidentally revealed while you were trying to bring a package of oatmeal on a cluttered countertop into focus, or something of that nature. In such a case, though, is that very slight risk more or less dangerous than not knowing whether that package of oatmeal contains ingredients you’re allergic to? Might it be worth trusting a volunteer or paid agent to help you use that otherwise inaccessible vending machine to get a snack, or pay for a ticket you need in a hurry? In exchange for a bit of mindful, sensible trust, that help is now available through your iOS device.

Above my discussion of each app covered in this section, you’ll find a link to the web site about it. There are all kinds of podcasts, videos and other information for each app so you can get a sense of whether it suits your particular needs. A simple Google search can yield more independent reviews and demonstrations of these apps. People tend to talk a lot about things which have positive impacts on their lives. Take advantage of that to get a good sense of what’s possible and how you might use that little round camera in the corner of your iOS device.

Remember to think outside the box. Nothing stops you as a blind person from using the camera for the same social purposes as a sighted person. From pictures posted to online dating sites to Facebook photos, blind people are getting up to all kinds of social activity. There are even blind YouTube stars who have lots of sighted followers. We’ll cover that in greater detail in the section dealing with social media. There are lots of implications for making friends and sharing experiences with that little camera and other technology in your iOS device. New possibilities are emerging all the time, so keep an ear out for them.