Also, as if it weren't enough that GNOME is trash, KDE is slowly trying, and the command line is mainly for developers: when a user installs Linux and needs assistive technology, like Orca, they can't just enable it and go on their way. They have to check a box in settings to "enable" assistive technologies. That's a huge barrier, and it shouldn't exist. But it does. Another roadblock. Why do these exist in a supposedly welcoming community? Why do these exist if Linux is open to all? Why? If FOSS is communal, why are blind people, due to the huge barrier to entry, shut out of the FOSS OS? These are hard questions we should be working through. Why does the GUI require assistive technology support to be enabled in order for Orca to work with many apps? Why can't it be enabled by default? Does it slow things down? If so, why? And should we have to live with a slower OS because we're blind?
@devinprater Setup should START with a screen reader! And there should be an option to turn it OFF, not ON.
I'm playing devil's advocate here... have you tried suggesting that to the GNOME people? I've been able to see all my life, and these problems might seem obvious to some, but definitely not to me.
I would say that Linux has the ability to receive contributions from a far larger audience (and hence a larger audience of visually impaired people), but we need these people to be willing to contribute, not act all pissed off on some random social network.
@dyamon @devinprater Here, created an issue with the folks @ GNOME: https://gitlab.gnome.org/GNOME/gnome-shell/-/issues/4423
I have to partially reconsider my position here. I still think that opening an issue was the right move but:
1 - one person in the issue comments tried to say that having the screen reader on by default "would be awkward", despite you explaining the reasoning behind it. This is like saying that you don't want ramps next to stairs because you might feel awkward walking at an angle.
2 - the issue was marked as a duplicate of a two-year-old issue.
@foreverxml Definitely. Or at least a spoken message that a screen reader can be turned on, with the keyboard command given. But no: on all mainstream distros, a blind user must figure out from somewhere that Super+Alt+S or Super+Alt+O turns on Orca, if Orca is even in the installer. If the installer is even accessible.
I think you are right. And there is something you can do about it: provide a checklist for barrier-free UI design. Most devs who don't need assistive support don't have a clue what this could involve. They just don't know about it, because it's beyond their perception.
Just TELL them what to do!
One of the things Microsoft is good at is funding UI research to see how people use computers, and to meet national procurement laws they put the hard work into accessibility.
I watched https://emacsconf.org/2019/talks/08/ to try to understand how to use a computer without vision, but it could also help to have examples of use.
Someone once posted a recording of a screen reader going through an emoji-heavy post, to make it very clear how annoying that is.
@devinprater @alienghic @wauz As early as 2001 there was a complete Linux distro for the blind that booted with voice support and ran early GNOME with Orca by default. It was sponsored by the American Foundation for the Blind (AFB) when Janina Sajka was their CTO. Several of their major donors, including Microsoft, objected, and demanded that they not only stop those projects but fire her and others, or they would stop donating. So they did.
@tychosoft @wauz @alienghic Oh, if you get a chance, check out the Braille Plus. Not the Braille Plus 18, but the original. It was a beautiful, small PDA for the blind, running, I believe, Alpine Linux, with apps built by APH, the American Printing House for the Blind. That was about the best assistive technology ever made. Now we're stuck with stupid Android toys that can't even show basic text formatting in braille, like older devices could.
Not everyone is bowing to that bullying pressure. Thanks for that IRC link, I'll follow up on it. Here's a cool story about a state government switching, and I think that since the city of Munich gets to provide accessibility services to people, and they've all switched away from Microsoft, there might be something larger (an entire city, in this case) steering development towards a better-designed userland.
@devinprater @alienghic @wauz Every year in the early 2000s Janina would come to speak at the Libre Software Meeting in France and run a workshop for the blind, as well as show off the distro they sponsored, and other projects like the Linux portable DAISY reader. That's why I know about it, as I was presenting Bayonne. I did a joint presentation with her one year.
@devinprater @alienghic @wauz Out of that, and my trip to Macedonia, came the GNU Alexandria project, which adapted GNU Bayonne to be a DAISY-book provider of e-government services for the blind. The AFB sponsored a pilot project of that, funded by the US SSA, to read SSA documents over the phone to blind citizens. This project, too, was cancelled because of said AFB donors. @bob
@wauz @tychosoft @alienghic @bob You know, donors and sponsors sure have a lot of power. Like, the NFB conference is sponsored by Microsoft and Google. Google, for heaven's sake! Android isn't exactly as bad as GNOME, but it's not nearly as great as iOS. And Microsoft does a bit more than Google, but a lot of people who go to the NFB conference won't even be able to run Windows 11.
@tychosoft @devinprater @alienghic
When I think back, the first computers I used (Apple II, C64) I could actually use "headless", because you had to know all the commands by heart. Then, with Windows 3, there was a kind of GUI, but it wasn't what a GUI is now. The description "desktop" fit, because it was just a surface where you could put your items - 1/3
just where you wanted them.
I organized my desk in a bunch of "drawers", folders with launchers.
I think we should go back to the idea of a desk.
Everyone uses desks, but not in the same way. A desktop for the blind must be a non-visual UI.
I think it's serious business to think through how that could work. Is a mouse involved? Hover-over with speech output?
Maybe we should start a hashtag to have a wider discussion @tychosoft @devinprater @alienghic - 2/3
@tychosoft @devinprater @alienghic @wauz It's a shame that the Linux/BSD command line is so terribly designed, with Anglocentrism everywhere, hostile documentation (when it even exists; if it does, throw 1d6 to find out whether it's in a manpage, a website, --help, a README, or what), no design consistency whatsoever, the fact that bash is still the default instead of fish, unnecessary abbreviations to cover up for poor history and completion (unless you run fish), and so on.
It makes me sad to think how much better a computing environment we would have with a command line that was actually good. There's nothing about CLIs that makes them inherently non-discoverable, arcane, or aggressive; the one we have has just had poor design work (or, one might say, no design work; it's as if the whole thing was designed by programmers). If the interface had good design and the documentation good writing, it would be trivial to support audio-based interaction, minority-language speakers, and, you know, people who are not programmers in general.
I'm a professional sysadmin who used bash for more than 15 years, along with zsh, scsh, and tcsh for kicks. Today I use fish; it is superior to all sh- and C-style shells in every regard, including daily scripting, programming, one-liners, and keyboard efficiency.
It is also much easier for non-technical people to use, better documented, more consistent, has good error messages, and is a lot more convenient and prettier.
this is what good design buys you; it's what linux folk can't wrap their head around, the fact that so-called "users" aren't inferior or $ableist_slur-er than programmers. that's the only reason why bash is still the default when it's worse in all regards.
@devinprater @tychosoft @wauz @alienghic
I don't know about vision accessibility issues but if you run into problems with fish please ping me, I'm willing to write patches and I'm sure the devs would take them.
(sadly I think most of the problems in using the command-line currently are from the (lack of) design in the utilities, a shell can't solve that...)
When we are talking about assistive systems, we should start with the details. We need to explore this as widely as we can.
Unixoids are modular. And the very first user interface is a tty terminal. It's not assistive at all, but it doesn't put up special barriers. Originally, this UI was meant to be a separate piece of hardware (a teletype!).
That is also the first @ramona @devinprater @tychosoft - 1/3
stage where we could put up assistive systems.
Then we have a kind of transmission layer, e.g. the X server. The next stage for assistive systems?
Then, we have window managers.
Then software that has its own UI (vim and emacs, just as examples). Some of those defy the modular principle, like Emacs (no judgement, just a fact).
We should think over which assistive technology should be @seachaint @ramona @devinprater @tychosoft - 2/3
@ramona @wauz @devinprater @alienghic The irony is that the GNU GPL was not written simply as a means to enable a separate class of developers to share and exchange code among themselves, but rather to radically empower everyone, and especially what we so often call end users. So I appreciate the way you phrase this.
Yeah, it's a thing I often think about. I feel like CLI interfaces have a lot more potential than generally assumed (think of all the Google stuff like "13:00 in Chennai in New York" or "etymology protest" or "(4*128) usd in eur"; that's a command line). And I think the GUI selling point, that GUIs are easier to use because they're more discoverable, is an accident of history rather than anything inherent to CLIs. A discoverable CLI is conceivable to me; it would just be a lot of work to adapt all the tools to adhere to a consistent design, common conventions, better docs, etc. But there's a freedesktop standard, isn't there? People see the benefits of coding for it.
at the end of the day a CLI is just language, it's communicating with words, and this is something people can do generally, not just tech people. plus there's a lot of synergy with voice assistants as voice recognition gets better, and it's good for low-cost devices for people without access to expensive gear, and so on…
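To make the "a CLI is just language" point concrete, here's a toy version of that currency query in shell. This is a sketch only: the exchange rates and function names are invented for illustration, and a real tool would fetch live rates.

```shell
# Toy "CLI as language" example: handle a query shaped like
# "(4*128) usd in eur". Rates below are invented constants.

rate_to_eur() {
  case "$1" in
    usd) echo 0.92 ;;   # made-up rate
    gbp) echo 1.17 ;;   # made-up rate
    eur) echo 1.0 ;;
    *)   echo "" ;;     # unknown currency
  esac
}

convert() {
  # usage: convert "(4*128)" usd eur
  amount=$(( $1 ))                 # let the shell do the arithmetic
  from=$(rate_to_eur "$2")
  to=$(rate_to_eur "$3")
  if [ -z "$from" ] || [ -z "$to" ]; then
    echo "unknown currency"
    return 1
  fi
  awk -v a="$amount" -v f="$from" -v t="$to" -v cur="$3" \
      'BEGIN { printf "%.2f %s\n", a * f / t, cur }'
}

convert "(4*128)" usd eur
```

Everything a GUI would put in a dropdown is instead a word the user types, which is exactly why this style composes so well with speech.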
the Linux/BSD GUI is the best we have, but only cos the bar is so low.
@tychosoft @devinprater @alienghic @wauz (Emacs is a bit of an inspiration here too, I wouldn't call it exactly "not arcane", but you can see M-x as a CLI, and if you use it like I do—I tend to type commands in Emacs a lot more often than resorting to keyboard shortcuts—then this CLI does discoverability and consistent design a lot better than the Linux CLI.)
@devinprater How usable is Emacspeak? When someone was asking about blind-accessible free software, that was my best guess for where to point them.
Now if we could only get Emacs to be more repetitive-strain-injury accessible, we might be in a better place.
@alienghic Setting it up is not like any other package. You have to download the release archive, or clone from GitHub.com/tvraman/emacspeak/. Then, on Linux, if you want the best experience, you'll have to buy a proprietary speech synthesizer called Voxin from https://oralux.org, make Emacspeak, make the speech engine you want to use (so, make outloud for Voxin; it used to be called IBM ViaVoice), and then set, in your .profile, something like DTK_PROGRAM=outloud. Then it'd probably work. But yeah, after that you've got one of the best blind-friendly systems out there. It's not easy to learn; it's basically living in Emacs with a few API-scraping utilities for gathering stuff like news or weather. But my goodness, it's better than anything a screen reader hooking into a general operating system can match.
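For anyone trying to follow along, here's the rough shape of those steps as shell commands. This is only a sketch based on my reading of the post: the exact make targets and paths are assumptions, so check the Emacspeak README before running any of it.

```shell
# Sketch of the Emacspeak-with-Voxin setup described above.
# Targets and paths are assumptions; consult the Emacspeak docs.

# 1. Get the source (release archive, or clone).
git clone https://github.com/tvraman/emacspeak.git
cd emacspeak

# 2. Build Emacspeak itself.
make config
make emacspeak

# 3. Build the speech server for the engine you bought
#    (outloud is for Voxin, formerly IBM ViaVoice, sold at oralux.org).
make outloud

# 4. Tell Emacspeak which speech server to use, in ~/.profile:
export DTK_PROGRAM=outloud
```

After that, loading Emacspeak from your Emacs init file should bring up speech, but treat every line above as a starting point, not gospel.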
@devinprater I was going to ask if there's a good free-software tool to start with, but now I'm also curious where the cost for Voxin goes. They say they're non-profit, and the high-quality voice samples are really nice.
Also out of curiosity is voice control at all useful for blind users?
I wonder if the voxin voices could be plugged into one of the free software voice assistants like mycroft or almond.
@alienghic It comes from a company, part of which was bought by Microsoft, while the speech part was spun off into its own company. So I'm not even sure what it's called now. But yeah, it comes from licensing that. And IBM TTS is abandonware that this company just won't let go of.
@devinprater Interesting... Thank you.
Not wanting to "favorite" a company keeping abandoned assistive technology under wraps.
@alienghic Yeah, it sucks. I mean, there is eSpeak, but it sounds pretty bad, and its speech server in Emacspeak... could use some work. It was pretty laggy when I last tried it.
@ramona @tychosoft @devinprater @alienghic @wauz sorry to fly by and drop a link in here, but this is the closest i’ve seen on a mac to an attempt to center a CLI and make it a real Thing for a non programming audience https://www.alfredapp.com and i was like. obsessed when it came out 😅
might be interesting from a design perspective
@alienghic @ramona @tychosoft @wauz A program I wrote uses the CLI. Students who had never touched a CLI in their lives were able to use it, because it tells them what they can do in simple language: press 1 for this, 2 for that, 3 for something else, or just press Enter for the option everyone's probably gonna use anyway, or q for another option, and then press Enter. Something like that, but better, since I wasn't just waking up when I wrote it into the program lol.
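A minimal sketch of that kind of self-describing menu, in shell. The menu items here are invented for illustration; the point is that every choice is spelled out in plain language, and just pressing Enter does the common thing.

```shell
# A self-describing CLI menu: every keypress is explained up front,
# and plain Enter runs the most common action.
# Menu items are invented for this sketch.

show_menu() {
  echo "Press 1 to check your assignments"
  echo "Press 2 to submit your work"
  echo "Press 3 to read the course news"
  echo "Press q to quit"
  echo "Or just press Enter to do the usual thing"
}

pick_option() {
  # $1 is what the user typed before hitting Enter ("" = just Enter)
  case "$1" in
    "") echo "Running the default action (what most people would want)" ;;
    1)  echo "Option 1: check your assignments" ;;
    2)  echo "Option 2: submit your work" ;;
    3)  echo "Option 3: read the course news" ;;
    q)  echo "Goodbye!" ;;
    *)  echo "Sorry, press 1, 2, 3, q, or just Enter." ;;
  esac
}

show_menu
pick_option ""
```

In a real program you'd wrap this in a `read` loop, but the structure is the same: describe first, then act.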