Dynamic text galore!
As we all know, Apple products get fairly regular software updates.
I had read and heard a few things about iOS 10, but I intended to wait and experience it before making comments.
A few instant differences: my text seems larger. I already had larger text enabled via the Accessibility settings, and with this some apps that support dynamic text also enlarge their text. In Settings my text seems clearer, larger and bolder, which is very positive, though text size is not yet consistent across all apps.
There was a lot of chat about a new iMessage.
I couldn't imagine how it could be improved. Instantly noticeable are three new symbols. The 'camera' symbol was instantly recognisable; however, the other two symbols I had to scrutinise! After some time zooming in I discovered the middle one is 'digital touch,' like on Apple Watch, though this isn't quite as simple. The bottom half of the screen is black, and changing the colour of the pen is simple, but the icons used to illustrate digital touch and video are so tiny. This confused me: after pressing the video icon the camera appears, and it's now possible to doodle before or during recording a video. This doesn't particularly interest me, though I found that after exiting the camera (via a well-contrasted small cross in the left corner) the black digital touch screen appears in full-screen mode. This certainly makes the screen more accessible, giving more room to send your heartbeat or 'sketch,' whatever you'd like to do with digital touch.
The third icon: my initial thought was 'A' for App Store; however, on pressing it I made an interesting discovery of 'memes,' images of text, and then the option to send across a song from your music library.
To me this seemed like a strange selection under 'A'; I did not find it immediately obvious, nor the thinking behind it. After more scrutiny I realised it is possible to send apps, though I didn't find this until I discovered four grey dots in the bottom corner that navigate to four more options, one being '+', which directs you to the App Store itself.
It took me some considerable searching and fiddling!
Going back to the first icon in iMessage, the camera: though I understood exactly what this one was, I couldn't at first figure out how to put the camera on full screen before sending a picture. To expand on this: once you press the camera icon, your camera roll appears next to a minimised camera view. Moving my phone around I realised the camera was active, and indeed you can take a picture and send it right across without being redirected to the Camera app.
This may seem easy for some, but for somebody like myself who uses the camera as a 'seeing' tool, to zoom in and take pictures to access detail, having the camera feature small without the ability to enable full screen isn't helpful. However, after investigating further I found that swiping from the left side of the 'mini' camera reveals a slim, tall grey box with a faded grey arrow that then indicates 'camera' and 'photo library'; this does then take you into the camera on full screen. It took me a considerable time to discover! Grey on grey, dare I say, is incredibly inaccessible!
For as long as I can remember I have used my iPhone with large text. This is always helpful and of course is how I can read my messages without straining, though one thing that is noticeable with larger text is that the icons don't enlarge. For instance, the 'send' button has changed in iOS 10: it's now an arrow, which is small, and I had to search around for it a few times with my finger.
The sizing of the icons in comparison to my large text can make my screen seem out of proportion and the smaller icons harder to find.
Inside the message window, 'iMessage' is written in faded grey. Not only is this poor contrast, but because of it the text box can be hard to find, as the overall backdrop is white, making it difficult to differentiate the conversation from the text box.
Keyboard-wise, I noticed the emojis have also enlarged ever so slightly, allowing fewer 'faces' on screen at a time before swiping across to see the rest. They now have more of a 3D effect, so I can start to understand what each emoji stands for.
When I'm in another app and receive a message, the iMessage/text message banner that appears is grey. Overlapping other apps, it doesn't sit well visually; I find it hard to see the banner to either press on it or dismiss it.
Before this update the banner was black, and by pulling it down from the top of the screen the backdrop was black and the text was white. I loved this! Visually it was soothing and did not put any additional stress on my eyes. I'm sad to see it gone with the new update.
Moving on to Apple Music: since having my new and treasured hearing aids, the ReSound Linx2, I can stream music, and rarely a day goes by without me listening to it. One of the first things I noticed after updating was the ability to load lyrics, again with large text enabled. This made my day!
Being a deaf person, growing up I always enjoyed music even though I couldn't access the lyrics; now I love it even more. I would always Google the lyrics and sing/read along with the song in an attempt to fully understand and appreciate it.
In recent years I've found the lyric websites to be poor in contrast and a struggle to read even with zoom features. Having lyrics loaded up in Apple Music means I don't need to search for accessible lyrics; they're right there, and in dynamic text. Perfect.
Another pleasant improvement I noticed: not only are the time and date now displayed bolder than before on the lock screen, it's also nice to have the ability to swipe down and see any notifications I may have missed or need to see.
The new update has changed the text and colours. Much better! The boxes for 'up next' or 'Siri app suggestions' are off-white/grey with bold, larger text (dynamic text!), and I can access these without straining. I struggled before and often had to fiddle with zoom features to access them, which took a lot of time.
Maps is an app I use a lot, both on my iPhone and Apple Watch. Glancing at the map I instantly saw that text and locations are larger and clearer, with 'start' and 'end' displayed much more clearly than before (also green and red, to clearly indicate without being lost on the screen in poor contrast), and there are fewer steps to get where you need than previously.
Simplicity makes all the difference, and it's certainly less tiring and less hassle to navigate!
What a brilliant idea, since I rely heavily on my camera to zoom in on things. Having it set as a shortcut via triple-clicking the home button is really helpful; however, I could already swipe up on the lock screen and be on the camera, which is quicker!
Camera: one movement rather than three speedy, fiddly clicks. Once in Magnifier there is a line with a dial along the bottom of the camera view that enables you to zoom. The bottom of the screen, where the zoom controls sit, is great: black with a yellow dial to scroll across to zoom. These colours make it easy for me visually to see exactly where to zoom. As an alternative for those who struggle to see the scrolling feature, fingers can be used to pinch the screen to zoom, just like in the Camera app.
At the bottom right corner is a feature to alter brightness/contrast or even invert colours. This is a great add-on; I can see it being brilliant for reading menus in restaurants or any text material.
Another great way of using mainstream tech as opposed to expensive specialist equipment made for the blind /visually impaired.
I got really excited to see this and straight away enabled 'colour filters' to see what I could do to reduce glare and brightness without making the screen seem dark. There are five options with various colour combinations which can filter the display to whatever is most comfortable.
After experimenting I opted for 'colour tint,' the very last option. This immediately made the screen yellowish, and below I could increase or decrease the intensity or hue. I loved having the manual power to adjust to whatever best suits my needs. I have also enabled 'reduce white point,' as any white or glare on screen gives me a headache after a while.
The only criticism I would have here is that when adjusting 'intensity,' 'hue' or even 'reduce white point,' each has a line with a button that can scroll either way, and the colouring of this is quite difficult to work with. Whites and greys are a difficult contrast for me when determining where the button is; I'd like to see these colours and contrasts used less frequently, as they are difficult.
From an accessibility view there are a few changes for the better, especially the more consistent dynamic text across the many applications, and the clarity and better-contrasted boxes in 'notification centre,' or what some refer to as 'glances.' However, some iMessage features, such as the icons and the struggle to find the camera, have made life a little tougher.
I rely on my iPhone for many things; it helps me to access the world. For me each software update brings new possibilities. iOS 10 is on the whole good, but there is a little room for improvement!
Taking a photo to me is not just a picture; it's capturing a moment, capturing sight I no longer have.
I take a picture and then take it apart to put it back together in my mind. I zoom in, mentally analyse and take in each little part, put it together like a jigsaw puzzle and make a full picture.
Slowly but surely I use my tiny amount of vision to form a mental picture that I can reproduce in art.
Whilst registered blind (legally blind in the US), I do have 5 degrees of vision left in my right eye.
I believe that being born deaf is why I’m a visual person, with such visual strategies, even though I’m now deafblind.
Lipreading is all part of hearing with your eyes when you are deaf, but now that I'm deafblind I really struggle with lipreading, so I'm now sort of transitioning to other coping strategies.
I now try to look at eyes as opposed to lips, as I would automatically start trying to lipread, which is very tiring for deaf people but absolutely exhausting with damaged eyes like mine.
The small window of vision I have can be pretty useless; however, I have found that using my iPhone camera to take a photo of something like a menu or label makes that tiny print all of a sudden accessible. Sounds funny, right? Using my iPhone to take a picture means I can use my fingers in a pinch motion to zoom in and move the screen around to view what I originally couldn't see. Simple, though unfortunately it doesn't always work.
Depending on the quality of the picture, or if it's a picture of something moving, the quality isn't always good enough to make sense of.
I have taken a fair few videos of Unis when out on a 'free-run,' and as a hobby I often like to edit them and slow them down. I also learnt that whilst viewing a video on my iPhone 6 I can zoom in. The quality isn't always great if I have since edited the picture, but it enables me to get a little closer to Unis's flapping ears in the wind, or her wagging tail. I experience that moment slightly differently.
I take a lot of photos. Yes I take a fair few selfies too! (cringe, I know!).
I store all of these and go over them regularly. I zoom in, I view them on my Mac, I might edit them a few times over, and each time I see something different. Very strangely, I notice small differences even though I'm blind!
Being someone who's creative and has often liked to fiddle with colours and doodles, I guess this is fairly natural.
I also see the benefit of this with my condition.
I attend lots of events, and I nearly always take pictures if I can. I always take more pictures than I need to. When I am taking pictures I only have to focus on one thing: my phone screen, and the section with the camera 'button' (obviously, iPhones being touch screen, it's not a literal button).
I feel less stressed focusing on the one screen and taking photos. Whilst I experience the whole atmosphere on a holiday or at an event, I am also simultaneously capturing it. I love taking panoramic pictures too.
Later I will zoom in, review, delete and edit.
Taking a picture and zooming in is almost the way I see a person’s face now.
When I look at someone's face I now only see one eye and maybe an eyebrow; the rest is a blur on a good day, and I might have the occasional blind spot too.
In order to view a whole face I move my eyes around, scanning across to the other eye and eyebrow, up to the hairline, down the nose to the lips and all around to capture the hair. It's quite exhausting just typing it, but that's the process I go through to see now, and yes, it's exhausting. Once I've scanned once or twice, that face is memorised in my mind. I will have remembered certain features, not always all, depending on lighting and positioning.
However, if I take a picture of a face using my iPhone, I zoom in, split the picture, view the eyes, the eyebrows, the nose and so on, and piece it all together.
During my A Levels, I painted a fair few portraits to try and interpret how I view a face. My main pictorial outcome for people to understand in an abstract sense was a jigsaw puzzle. I see a face, but just the one piece at a time. In my mind I am piecing together the jigsaw puzzle of someone’s face.
Taking a picture allows this. Photos allow me to interpret what my eyes alone cannot.
On holiday, amongst the smells, various textures, tastes and aural surroundings, the pictures help me complete the experience at the end of the day or holiday.
A selfie helps me complete my make up!
A recent experience was San Francisco; more on this in another blog:
San Francisco was amazing. I fell in love with the city, all my coping strategies kicked in, and along with Mum and my sister I was able to quickly adapt.
The people were friendly and once again I was looked after using my cane wherever I went. The weather was beautiful. We went on a tour bus and ventured around to ‘see’ as much of SF as possible. I felt happiest I had been in a long time. I took full advantage of my camera on my iPhone.
I have always been creative, and though photography was not always a strong suit of mine, taking photos has always been a strategy that lets me zoom up close after taking a picture.
Thanks to my iPhone I actually did see the Golden Gate bridge, all 1.5 miles of it via pictures and zoom.
I take a fair few panorama pictures. At first I wasn't sure why, except that I liked them and what they allowed me to see when I zoomed in.
It only occurred to me recently that taking a panoramic picture involves standing viewing one spot and gradually rotating around, creating one long stretch of a view. With my restricted field of vision, this is very beneficial.
My lack of peripheral vision means I cannot take in a landscape all at once. Using my fingers to pinch in and out of a landscape (like I do with any picture) means I am breaking up the image to fit into my small 5 degrees, moving about gradually to fully view it.
Panoramics artistically create a long view, two perspectives in one. For the detail, I zoom in to fully appreciate it; I almost feel like I have peripheral vision again.
Two perspectives on one screen create more of a memory for me of these amazing experiences.
Pictures mean a lot to me.
Pictures capture more than a moment, they become treasured memories that Usher Syndrome cannot take away.
When I wrote my Applewatch blog back in April this year, I had no idea of the interest it would generate, nor the amazing people or companies it would lead me to.
I felt so proud that my blog led to many people with Usher Syndrome, deafblind, blind or deaf, considering buying the Applewatch, and that so many have bought it and, like me, enjoy its fantastic features.
Thank you to all who have sent me such positive feedback.
I was shocked by the interest from all around the world and flattered by the amount of media interest and the many who contacted me direct, curious about Usher Syndrome and accessibility.
However, for me personally it brought something very special: a company full of fantastic people and a product that, together with my Applewatch and iPhone, has completely changed my life, the Linx2.
GN ReSound came into my life as a result of my Applewatch blog. Until then I had never heard of the company and knew nothing of their amazing Linx2 hearing aids.
For me they came to life on Twitter, where I saw their advert promoting the Linx2 as fully compatible and connected with both iPhone and Applewatch.
I researched further and, I guess, as they say, the rest is history.
Since being fitted with the Linx2, my life has changed so much.
I love that I can adjust my hearing aids myself, to suit the environment, to suit me. I have complete control over what I hear and what I don't. For the first time in my life, deafness and environment do not dictate what I can and cannot do, what sound I can or cannot access.
The telephone is something I'd struggled with over the years. Feedback made even trying to communicate on the phone a complete nightmare, so I had made use of either text or FaceTime to connect with others, two useful forms of communication open to deaf people, but not in a work environment.
Those limitations are now gone thanks to Linx2. Not only can I use a telephone, I have Bluetooth connectivity, which means I'm able to pair my hearing aids with my iPhone (and lots of other things too). I feel a phone call on my wrist thanks to taptics, press my Applewatch to connect, and I hear clear sound directly in my ears. I can stream music directly into my ears, I can alter bass and treble, I can vary so many things in the ReSound app on my Applewatch, and I am safe.
I have worried about my iPhone being snatched from my hand on a busy street full of people I cannot see, but not any more; my iPhone stays safely tucked away in my bag.
My confidence has grown and I'm able to venture to new places using this incredible technology.
Seeing danger is virtually impossible for me these days, but now I can hear it. I know where sounds are coming from, and as a result I feel safer, which makes me feel so much more able.
You completed the picture for me, by allowing me to access the incredible Linx2 hearing aids.
I feel both grateful and very humbled that you have not only taken an interest in me but also such an interest in Usher Syndrome and the work I do raising awareness of the condition.
We are a group of people who often feel overlooked and misunderstood, and yet, with the right understanding, support and equipment, we are very capable; our biggest obstacle is often accessing the necessary equipment!
Since being fitted with my Linx2 hearing aids in May this year I have developed a fantastic relationship with the team in Bicester, and I was flabbergasted to be invited to be part of their recent roadshow. It was an ideal platform for me to raise awareness of Usher Syndrome and of my charity, The Molly Watt Trust, and to demonstrate exactly how life-changing their products are.
I am no longer isolated by my deafness. I am still deaf but the enhancement I experience every day with Linx2 has been truly overwhelming.
When, like me, you are down to only 5% of useful vision with no cure in sight (excuse the pun), the best available technology to enhance hearing should be a necessity for the deafblind.
So thank you ReSound, your technology is fantastic, I cannot imagine life without Linx2 now and I know things will only get better and better.
Thank you for all you have done and continue to do to support me and my charity, I will be eternally grateful.
I recently read about somebody I know who has Usher Syndrome and had got into a very scary situation.
With Usher Syndrome, this sort of thing is very easily done.
An accident as simple as getting on the wrong bus in the dark, being put off the bus with directions to safety that you didn't understand, and finding yourself all alone in the dark (bearing in mind you are completely blind in the dark): terrifying.
Being deafblind is very disorientating at the best of times, and lots of us experience dizzy spells or vertigo; it often seems part and parcel of the condition. Being lost, or feeling lost, adds to the anxiety many of us can feel when out and about, particularly in unfamiliar areas.
Something similar happened to me before I had guide dog Unis, and I was petrified. Fortunately I had my cane; I must have looked lost, and I was helped to a bus stop and onto the correct bus home. To say it knocked my confidence is an understatement: I didn't attempt to go out alone for several months, isolating myself rather than face the possibility of getting lost again.
The day that happened to me I did not have a smartphone or the technology I am lucky enough to enjoy today; I just did my best with what I had.
I am pretty confident today, and I know that confidence and independence come from incredible technology and of course guide dog Unis, who has saved my life on more than one occasion, as back then I struggled to see or hear traffic even with my old hearing aids.
I have blogged about my new Linx2 hearing aids before, but after four months with this incredible technology, along with my iPhone and Applewatch, I can say my safety has improved and my feelings of vulnerability have lessened substantially.
Now when I am out and about with Unis I have the ability to change my hearing aid settings to block out certain sounds, so that I can not only hear traffic but also identify the direction of the sound, something I have never ever been able to do. Now that I see so little, I can trust my hearing, even though I'm deaf and hear very little without hearing aids. Get this: "My deaf ears compensate for my dodgy eyes!"
I am now 21 years old and in four months I've learnt so much more about sound than I ever knew.
I hear sounds I've never heard. I've corrected my own speech, things I've said wrongly for years simply because I couldn't hear the sounds properly. I've "overheard" conversations, a really new concept for me. I can speak and hear well in small groups. I hear so much more that it's hard for me to explain; it's quite an "eye-opener, and I'm blind!"
My confidence in my own hearing has improved my vocabulary; yes, even at 21 I'm learning new vocabulary. I'm no longer mishearing, which was often my biggest frustration.
This last week I did something I never thought I'd be capable of doing without help: I took my first ever conference call. Yes, not Skype, not FaceTime; I totally relied on technology to hear, and this is how:
The ReSound Linx2 connects to both my iPhone and my Applewatch via Bluetooth. When a call comes in I can answer via my watch; clear speech goes directly into my ears, with no background noise or interference.
I cannot describe my elation at being able to access a three way conversation, to hear clearly two unfamiliar voices and to make plans for an upcoming event.
I'm sure lots of people are thinking it's no big deal, but it really is, because using a telephone is something most take for granted, and yet people with Usher Syndrome who use hearing aids often cannot, and as a result they struggle, particularly in the workplace. And yet it's possible, if only this up-to-date technology were available to them.
These "Smart Aids" (the first to be fully compatible with the Applewatch), the watch and an iPhone together work out to be very expensive; however, when you consider what this kit enables a person to do, it makes complete sense in my opinion.
I feel very humbled to have access to this technology, it absolutely makes me me.
I am not a tech expert, an expert on Usher Syndrome, or anything else for that matter, but knowing that this sort of technology exists and what it can do to enhance the lives of those with such challenges, it has to be viable.
Speaking of Siri, I have noticed it is far better on my Applewatch than on my iPhone. I'm curious to know if it will improve on iPhone with the new operating system; either way, I will continue to use Siri on my watch so I have the security of leaving my phone out of view and safe while I am out and about with Unis.
I'm still a huge fan of taptics but am now finding Siri so useful. When Siri talks to me the sound goes straight into my ears, so I hear clearly thanks to these amazing Linx2 hearing aids, and if Siri cannot help then there's almost certainly an app that will.
It's great to have so much independence via technology that I can access so easily.
I was asked if Siri understands "deaf voices" well, it understands mine is all I can say.
I'm also looking forward to finding out which "native apps" work on the new operating system for Apple Watch, and, just out of curiosity, to seeing if there is any safety element there.
Along with the excitement of so much new assistive technology available comes the frustration of knowing so many people who would benefit won't because they cannot afford it.
There is so much advancement in technology; surely funding the right equipment, as opposed to the cheapest equipment, makes absolute sense.