I love accessibility features because they might be the last features developed solely with the benefit of the user in mind. So many other app/os features are designed to steal your attention or gradually nerf usefulness.
I often use them to get around bad UI/UX (like Reduce Motion) or to make devices more useful (a red Color Filter for nighttime use).
Beyond that, even able-bodied folks can be temporarily disabled by illness, surgery, injury, etc. So it's great to see Apple continuing to invest in accessibility.
Using the red color filter outdoors to preserve night vision is a great tip. Some apps have this built in, but it's much better to have the OS apply it everywhere.
Recommend creating a Shortcut to toggle this setting.
Not just for outside, but also at public gatherings like concerts! I went to a concert last month and used a red color filter to record a couple short videos without being a big distraction to the audience behind me.
Dim backlight + Red color filter can make the screen almost invisible to those around you.
I made a Shortcut that drops the white point, turns on the red filter, zeroes the audio and turns down the brightness. On wake-up it reverses these.
It’s attached to an automation for the Sleep and Wake Focus modes, and it works really well. I added an explicit run confirmation for wake so that I could sleep in without getting blasted in the face with white. It shows a notification with a run button, which I can wait to hit until I’m truly up.
Thanks for the reminder. Wish I had remembered this feature during the aurora photography I was doing. I set the phone brightness to the lowest setting but the red filter would have helped even more.
The issue I've seen when an app offers its own red filter is that OS-native widgets the app invokes, like the keyboard, don't get filtered. The system-level accessibility feature does filter those. I'd almost rather the app's setting simply enable the OS filter, but I can understand why that might not be possible.
I think you can just add it to Control Centre, no need for shortcuts.
I made an app for reading in the dark, minimising the amount of light hitting your eyeballs, but I'm still using the red color filter every night.
Kinda feels like you could’ve done more of a cursory glance to see what functionality was actually being talked about before going for the “Android already does this!!” comment.
Kinda feels like you could've been more like [0] instead of being an ass, while totally missing what I was surprised about: the lack of automagic, not the feature itself.
I don't love that solid UX gets pushed under the accessibility rug, as an option you might never find.
I don't care how cynical it sounds, user experience became user exploitation a long time ago. Big Tech have been running that gimmick at too-big-to-fail scale for the last decade or so.
Let's say you're a developer at a big software company (not necessarily Apple, this happens everywhere) and you want to add a new optional setting.
The bar is pretty high. There are already hundreds of settings. Each one adds cost and complexity. So even if it's a good idea, the leadership might say "no".
Now let's say this same setting makes a big difference in usability for some people with disabilities. You just want to put it in accessibility settings. It won't clutter the rest of the UI.
The iPhone has a hidden accessibility setting where you can map a double and/or triple tap on the back of your phone to a handful of actions. I use this to trigger Reachability (the feature that brings the entire UI halfway down the screen so you can reach buttons at the top), because phone screens are so damn big that I can't reach the opposite top corner with my thumb, even on my 13 mini, without hand gymnastics. And the normal Reachability gesture has been super unreliable to trigger ever since they got rid of the front Touch ID home button.
Double tap is reachability for me and triple tap is to make the display very dim so that at night at the lowest brightness setting, I can get it even lower. It resets after a while so even if I forget to switch it off my screen won’t stay dim for the next few days while I wonder why it’s so damn dark.
Perhaps I should rephrase: it's hidden within the Accessibility settings, not an accessibility setting that is furthermore hidden.
Most people don't go into that menu to look around for things they might want to use, because features that almost everyone could benefit from sit alongside settings for people with visual and hearing impairments.
Unironically calling this feature "hidden" is why things are the way they are now. It's not hidden! You can find it if you go through the Settings app! But because it isn't in your face all day, every now and then people talk about this "super secret feature," and then a PM somewhere has to build a nag feature to advertise it.
Accessibility benefits everyone, but on the basics you’re right. Too many simple, straightforward options now live strictly inside Accessibility. At least on the Apple side.
And don’t get me started on hidden command line settings.
You're getting a lot of agreement from other HN users, but I'm not sure it's fair to criticize Apple for putting these kinds of features under Accessibility.
There's nothing that inherently "locks out" people who don't have a recognized disability from exploring these features. Furthermore, most of Apple's "accessibility" features are related to Vision/Hearing/etc (and categorized as such), so I think it's reasonable to consider them accessibility features.
Clearly based on other comments here, plenty of people discover these features and find them useful.
> Too many simple straightforward options are now strictly inside accessibility
From outside, it feels like these are the only people with the freedom to improve the user experience at all. So they have to hide their work in the Accessibility preferences.
Yes they can, but I don't and I still hate needless animations and turn them off. The point is, why is it in "accessibility" when it should be more visible?
not OP, but macOS has a ton of options available with arcane commands. my favorites are the auto-hide dock speed setting, the third hidden window minimise animation, and the hidden system accent colours usually locked to the colourful iMacs
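For anyone curious, these are the kinds of commands being described. The exact key names below are from memory and circulate in community write-ups rather than official docs, so treat them as a sketch and verify before relying on them:

```shell
# Speed up (or remove) the Dock auto-hide animation and delay
defaults write com.apple.dock autohide-time-modifier -float 0.3
defaults write com.apple.dock autohide-delay -float 0

# Enable the hidden third minimise animation ("suck", alongside genie/scale)
defaults write com.apple.dock mineffect -string suck
killall Dock   # restart the Dock so the changes take effect

# Pretend to be one of the colourful M1 iMacs to unlock its accent colour
# (the integer selects the enclosure colour; takes effect after re-login)
defaults write NSGlobalDomain NSColorSimulateHardwareAccent -bool true
defaults write NSGlobalDomain NSColorSimulatedHardwareEnclosureNumber -int 3
```

Deleting the key (`defaults delete com.apple.dock autohide-time-modifier` etc.) restores the stock behaviour.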
Some of them are, at least on Apple's side, but it's always for a good technical reason. Screen recognition is only available on devices that have a neural chip, things that require lidar don't work on devices that don't have lidar and so on.
Google is worse at this, Talkback multi-finger gestures used to be Pixel and Samsung exclusive for a while, even though there was no technical reason for it.
Apple has a different problem, many accessibility features aren't internationalized properly. Screen recognition still has issues on non-english systems, so do image descriptions. Voice Over (especially on Mac) didn't include voices for some of the less-popular languages until very recently, even though Vocalizer, their underlying speech engine, has supported them for years. Siri has the same problem.
iOS 17 audio image descriptions for blind people via the Magnifier app should work on all iPhones, but do not work on the iPhone SE3 and iPhone 11 Pro. Audio image descriptions do work on the iPhone 12 Pro. Lidar on the 12 Pro increases accuracy, but should not be mandatory. Hopefully this is a bug that can be fixed, since text descriptions were still functional on the lower-end devices.
Source: purchased devices until finding one that worked, since Apple docs indicated the feature should work on all iPhones that can run iOS 17.
Edit: audio descriptions in Magnifier are non-functional on iPad Air, working on M2 iPad Pro.
That's because the ADA has no enforcement mechanism other than lawsuits, isn't it? Our whole legal disability rights infrastructure is designed to be driven by lawsuits, and sits inert if nobody sues.
That's actually a good thing, especially if there are people specializing in bringing these lawsuits en-masse.
Over here in Europe, where there's no lawsuit culture, some laws (not necessarily accessibility-related, our legislation in that area is far weaker) are violated almost without repercussions. When the government is the only party that can bring charges and the government is complacent / ineffective, nobody actually gets charged and nothing gets done.
There's also the problem of incentives, if you can get a lot of money from a lawsuit, you have a far better incentive to sue and find a competent lawyer. You may even deliberately look for and sue violators as a moneymaking scheme, some companies in the US do this. This puts pressure on businesses to comply with the law. Even if you're piss-poor and never sue anybody, you still benefit.
If this lawsuit culture doesn't exist, the best you can do is write a report to the government and hope they actually act on it. Many people don't know how to write these, and since there's no money in it, getting a lawyer to do it for you is an expense that nobody is going to help you recoup.
The people handling these reports don't help either, they're usually 9-to-5 salaried employees who aren't judged too hard on performance, so they have far less of an incentive to actually pursue cases.
Every attention thief is absolutely thrilled at the idea of tracking your eyes. Let’s all imagine the day where the YouTube free tier pauses ads when you’re not actively looking at them.
Shit. I’m turning into one of those negative downers. I’m sorry. I’ve had too much internet today.
Until people complain that Apple is being anti-competitive by not making vision tracking open, or allowing third-party eye-tracking controls, etc. etc.
System one is, but advertisers could always roll their own and see if they can get away with "you can only view this content if you give us permission to use your camera".
At least on iOS, I can't imagine that happening - apps are not allowed to demand you grant permissions unrelated to the actual functionality. From App Review Guidelines (5.1.1) Data Collection and Storage, (ii) Access:
> Apps must respect the user’s permission settings and not attempt to manipulate, trick, or force people to consent to unnecessary data access. For example, apps that include the ability to post photos to a social network must not also require microphone access before allowing the user to upload photos.
Lots of iOS apps today really want to mine your address book, and constantly spam you with dialogs to enable it, but they don't go as far as disabling other features until you grant them access, because they'd get rejected once someone noticed.
Fortunately apps in the EU don’t have to pass though Apple’s anticompetitive review process, so developers are free to ignore that rule if they simply distribute the app via an alternative store.
Unfortunately, poor Americans cannot taste the freedom that Europeans have to be abused by developers.
It's not that hard to come up with ways to circumvent system restrictions; after all, advertisers are a fierce adversary and have shown many clever ways of invading users' privacy in web browsers and mobile apps. In the case of eye tracking, I could see a situation where the system feeds the "malicious" app a hint of which widget the user is currently gazing at. You could then just build a giant grid of invisible widgets covering the whole app window and use that to reconstruct all the eye tracking happening inside your app.
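To make the attack concrete, here's a toy sketch. Everything is hypothetical (there is no real API that delivers such a hint): it just shows that a "widget N is gazed at" signal plus a dense invisible grid recovers near-pixel gaze coordinates.

```python
# Hypothetical attack sketch: a 100x50 grid of invisible 10x10-point
# widgets tiles a 1000x500-point window. If the OS reports only which
# widget has gaze focus, the app can invert that into coordinates.

GRID_COLS, GRID_ROWS = 100, 50   # invisible widgets tiling the window
WINDOW_W, WINDOW_H = 1000, 500   # window size in points

def widget_index_to_gaze(index):
    """Map a 'widget at `index` is gazed at' hint back to (x, y)."""
    col = index % GRID_COLS
    row = index // GRID_COLS
    cell_w = WINDOW_W / GRID_COLS   # 10 points per cell here
    cell_h = WINDOW_H / GRID_ROWS
    # The centre of the gazed cell approximates the true gaze point,
    # so the error is at most half a cell in each direction.
    return (col * cell_w + cell_w / 2, row * cell_h + cell_h / 2)
```

The finer the grid, the better the reconstruction, which is presumably why a system would have to rate-limit or coarsen such hints rather than rely on apps behaving.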
This is not about technical restrictions and finding weird legal loopholes, Apple's guidelines don't work that way. It's the spirit that matters, not the letter.
If you look at the current things the EU has forced them to open up (app distribution and NFC payments), both of them are things that Apple was already actively monetizing.
To compare this to a non-monetized accessibility feature is a bit disingenuous.
The whole notion of ‘fairness’ is that gatekeepers should allow others to compete on even footing. That can be fulfilled by granting EITHER everybody OR nobody access to the market (but not: only yourself).
Fifteen Million Merits is maybe the Black Mirror episode least about the future. It reads best as entirely a comment on the society we already have.
A big clue is that the world in it doesn’t make a ton of internal sense and the episode makes absolutely no effort to smooth that over. The questions it raises and leaves open without even attempting an answer are on purpose. You’re supposed to go “god, bicycling to make little pellets? Just to buy useless trash? God, why? It’s so pointless,” or, “they truly look down on and are mean to the people who get demoted, even though they’re basically guaranteed to end up that way someday, when their health or good fortune run out? Just cruelty for no reason that’s highly likely to hit them some day, too? That’s insane!”
… because actually it’s about now (or, the year it came out) and the point is to get you to connect the dots and realize we’re doing the same crap and just don’t notice. It’s only “sci fi” as some sleight of hand to give you an alien’s eye view of our own actual society as a way to prompt reflection. The core story of the corruption of a “revolutionary” to serve the purposes of the system is some bog-standard media studies stuff pertaining to today, not sci fi. The reveal that they could just fucking stop all this and go outside, it’s not some alien world or a post apocalypse after all is yet another thing that’s supposed to make you go “what’s wrong with them? Oh. Right. It’s us.”[1]
Long winded way to say: we can’t “end up” at Fifteen Million Merits future-world, because it’s just our own world we’re already in.
[1] Some read the final scene as depicting yet another viewscreen, but this is such a wildly weaker reading as far as how effective the scene is that I can’t believe it’s intended that way.
Wouldn't a lot of the companies that build in accessibility do it from a viewpoint of gaining an even wider reach and/or a better public image?
I don't see optimizing for that as bad. If they think we'll love the product more by making it better for a given audience, especially if I'm in that audience, I'm happy. Does that mean this company now gets richer? Perhaps, and that's fine by me
This will happen. These features are always ushered in as ways to make someone's life easier, and often that is exactly what it does, for a time, before some product manager figures out how they can maximize profit with it.