#ScreenReader
I added 4 new natural voices to #Windows #Narrator; however, none of them is working for me. When I down-arrow to choose any of the 4 new voices, Narrator goes completely silent. #Blind #Blindness #Accessibility #ScreenReader
OK, got the #Windows 11 September 26th update. I'm playing with #copilot now, and there are some #ScreenReader issues. One example: some of the buttons say "unavailable" when in fact they are clickable. I'm using #JAWS, by the way.
Quite a while ago (one year+ maybe) I saw a post with a sound file demonstrating how hard it is to follow a post when someone uses #emojis excessively and you are using a #ScreenReader. I searched left and right and could not find it anymore.
I'm grateful for any hints!
Thanks!
I broke my poll, so trying this again!
Another #Poll for anyone with #Disability and #Accessibility perspectives. This one is specifically about #ScreenReader language.
If some instructions read, “Click the Submit button to complete the application,” which of these is most correct? (This assumes the screen reader can read something different than what’s on screen.) And feel free to share your reasons in a reply.
Boost for more responses if you feel so inclined.
I've been happily using @IceCubesApp for a few days, and I have a few suggestions.
1. The ability for emoji characters in usernames to be stripped out for users of the #VoiceOver #ScreenReader (a rough sketch of the idea follows this list).
2. A fix for a bug that sometimes causes VoiceOver not to announce something I type.
3. The option to have content warnings auto expanded.
4. Support for other translation services, such as #GoogleTranslate and the translator built into #iOS.
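On point 1, here is a rough sketch of what stripping emoji from a display name before it is handed to #VoiceOver might look like. It's written in Python purely for illustration rather than in the app's own language, and the code point ranges and the example name are my own assumptions, not a complete emoji definition:

```python
import re

# Illustrative only: a few common emoji blocks, not a complete emoji definition.
EMOJI_PATTERN = re.compile(
    "["
    "\U0001F300-\U0001F5FF"   # misc symbols & pictographs
    "\U0001F600-\U0001F64F"   # emoticons
    "\U0001F680-\U0001F6FF"   # transport & map symbols
    "\U0001F900-\U0001FAFF"   # supplemental symbols & pictographs
    "\u2600-\u27BF"           # misc symbols, dingbats
    "]+"
)

def strip_emoji(display_name: str) -> str:
    """Return the display name with emoji removed and whitespace collapsed."""
    return " ".join(EMOJI_PATTERN.sub(" ", display_name).split())

print(strip_emoji("Jay 🌈🦜 the Parrot"))  # prints "Jay the Parrot" (made-up example name)
```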
However, I've come to the realisation that I can't make this a thread. I'd need a "content warning" on the follow-up comment that would be the continuation, but you can't add a summary (which is what serves as a content warning on #Mastodon) to a comment on #Hubzilla. And without it, some mobile users will have a wall of text of at least 25,000 characters in their federated timelines with nothing rolling it up or covering it up.
So I either have to drop everything into one post, which won't be done until Sunday evening. I expect it to grow beyond 60,000 characters with all the #ImageDescriptions in it. Downside: I don't know the character limit for posts on this hub or on the one where I have my clone. If it's 60,000 characters or lower, I'm doomed.
Or I have to make it two separate posts with the second one referencing the first one. The second one will therefore include a link to the first one. Downside: The link will most likely open in a Web browser. Also, I know for a fact that Hubzilla articles don't work in a #ScreenReader, so I guess neither do Hubzilla posts which renders the image descriptions largely moot.
Or I have to make it two separate, unrelated posts with no reference to the first one. Downside: The image descriptions in the second post would grow much, much longer because I'd have to repeat a whole lot of stuff from the first post.
I've only got one spaceship left to describe, and this can be fairly short because that vessel is hard to "paint with words" due to its unusual design. And then there will be a few words about the surrounding scenery which I usually add last.
But I still haven't written the descriptions of the four nebulae, three of which I need for this image description. I can describe spaceships halfway decently, but I don't think I can do the same at the same level of detail with nebulae. And I absolutely need descriptions for them in order not to let non-sighted users down and come across as some ableist swine.
Maybe I should #AskFedi to do that or even resort to #Alt4Me. (The nebulae in question would be IC 434 including the Horsehead Nebula as well as NGC 2023 and the other surroundings, the Crab Nebula, NGC 604 and, irrelevant for this first picture, the Lagoon Nebula.)
Also, I might actually split the post I've planned in two. I currently estimate the whole thing including a triple image description to go well beyond 60,000 characters. I'm not even sure if that's still within the character limit of the hubs where I have my channel. It would definitely exceed 50,000 characters.
The downside would be that the double image description in the second post would have to use that in the first post as a reference. I'm not sure if and how this will work for #Mastodon users, especially those with a #ScreenReader. The alternative would be to put the whole 4,200-character non-image-specific lead-in into both #ImageDescriptions.
Question for folks who use a screen reader: is the recent change (at least on mastodon.art) that lets hashtags appear separate from the post itself actually making screen readers skip those tags? Gonna add a few tags after this to test, thank you!!!
So for easier discoverability: How does a #ScreenReader handle words or sentences in all-caps?
My #Fairphone arrived yesterday. Very cool concept. Unfortunately, it wasn't possible to set it up #blind without help, and the speakers sound like a non-stop phone call from the 90s. The microphones sounded quite good at first. But if every interaction with the device via #Screenreader causes earache, I'm afraid all I can do is send it back.
Let's say you write a #Bildbeschreibung (image description). Not just a short, concise note on what's in an image and what matters for the post, but a detailed description of the image. You explain things, too. So your image description contains things that are evident neither from the post nor, for mere mortals, from the image itself.
Then it doesn't belong in the alt text.
The reason is that there are also sighted people who can't get at the alt text, for example because a physical disability prevents them from hovering a mouse pointer over the image. For those people, any information that only exists in the alt text is out of reach.
For visually impaired people with a #ScreenReader, very long alt text is impractical as well, because they can't navigate it like normal post text. They can't jump back to a specific spot in the alt text and have it read again from there. They can only go all the way back to the beginning of the alt text. The closer you get to the 1,500-character mark, the longer the screen reader needlessly rambles on at you. That's another reason why longer #Bildbeschreibungen shouldn't go into the alt text.
On #Mastodon, of course, this is awkward. In the toot itself you only have 500 characters for the actual toot, for hashtags, for any content warnings and then also for the image description. In the alt text you'd have 1,500 characters just for the image description. And a thread isn't particularly accessible.
Practically everywhere else in the #Fediverse this is easier, because there you have at least several thousand characters per post. Whether the alt text is then limited to 1,500 characters or not no longer matters.
@accidentlyAnton welcome!
One thing I'm glad I learned early about #mastodon is that hashtags are welcome in the text. The feed is not algorithmic, so we need to discover similar interests through hashtags.
Another thing is to stay mindful about the #accessibility of your posts. Add alt text to images whenever possible, and use CamelCase in hashtags so folks with a #ScreenReader can hear the individual words.
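Not how any actual screen reader is implemented (that part is my assumption), but here is a tiny Python sketch of why CamelCase helps: capitalized word boundaries are machine-recoverable, while an all-lowercase tag is just one undifferentiated blob.

```python
import re

def split_camel_case(hashtag: str) -> list[str]:
    """Split a CamelCase hashtag into the words a screen reader could announce separately."""
    return re.findall(r"[A-Z][a-z]+|[A-Z]+(?![a-z])|[a-z]+|\d+", hashtag.lstrip("#"))

print(split_camel_case("#ScreenReader"))   # ['Screen', 'Reader']  word boundaries recoverable
print(split_camel_case("#screenreader"))   # ['screenreader']      one undifferentiated blob
```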
@vowe As is usually the case, #Accessibility wasn't part of the thinking at Bluesky from the start either and has to be requested by users after the fact. If at some point there's a client that can be used properly with a #Screenreader, I'm sure I'll be over there more too.
While the #MK1 hype train is moving, I thought I'd share this really cool site. Want to get stories, bios, move lists etc for virtually every #MortalKombat game ever? Why not give this a try! https://www.angelfire.com/va3/mk/
An #accessibility notice: if you're using a #ScreenReader, press Alt + Down Arrow on the combo boxes and hit either Tab or Enter to choose an item; otherwise, the page will auto-refresh when you try to select an item the normal way.
A question for people who use a #screenreader, for example, or who generally rely on #Alternativtext (alt text) and #Bildbeschreibung (image descriptions):
Is it useful for you when links are included in the image description?
I try to do that especially when I post memes here. I like to put a link to knowyourmeme into the image description, because they usually provide a very good image description as well as additional context for the image.
But can tools like screen readers actually process those links in a useful way, so that you can navigate to them?
#technica11y #Barrierefreiheit #Inklusion
Question that just crossed my mind for people using #screenreaders and generally for the #accessibility and #WebDev crowd: Is it useful for folx consuming the web with accessibility tools, such as a #screenreader, when links are included in the alt text of images?
Especially when posting memes here on mastodon, I like to include a link to knowyourmeme, cause they usually have a) a very good image description and b) some context to go with it, so it doesn't just feel like I'm describing a (visual) inside joke.
But can tools like screenreaders even handle those links in a useful way? 🤔
#Barrierefreiheit (German in the next reply)
WordPress Accessibility Meetup starts in 30 minutes! Join us to watch Alex Stine do a live accessibility audit of a WordPress plugin. Live #captioned.
https://us02web.zoom.us/webinar/register/3416950799213/WN_m6H2TWVGTHGTXxS9GOZo3A
@thomasfuchs 6. Not having ads completely hijack your #ScreenReader.
Apex is an #opensource #PHP based framework designed for efficient and robust web #development. Following a "simple is good" philosophy, Apex utilizes standardized methodologies via #PSRs to provide an easy learning curve and the flexibility to develop in any style or design you prefer. Version 2.0 integrates #ChatGPT and is apparently #accessible with a #ScreenReader. More info here on #reddit https://reddit.com/r/Blind/comments/16jgawx/apex_v20_released_with_chat_gpt_integration/
A question for those who use a #screenReader
How much detail should we put in the alt text for images? Is it better with a general description or should one try to paint the picture in words?
#genuinQuestion #altTextUsage #wantToDoRight
@krautspace@chaos.social
Hi, which operating systems do you use?
#ubuntu, for example, ships with the #ScreenReader #Orca by default
https://chaos.social/@krautspace/111062258367863805
#a11y #Barrierefreiheit
At our kids' crafting session, we'd like to talk with the children about barriers in devices. Among other things, we're looking for a #Screenreader for that. Does anyone of you have access to one and could make it available to us?
#Barrierefreiheit
Three things are certain: designing a new #screenReader for every new #accessible #videogame is neither a sustainable nor an effective strategy. But by the time it becomes broadly accepted that other parties should've been doing the heavy lifting all along, companies will be too invested in their 27 separate, inferior efforts to push for better. Meanwhile, there will be very little crossover between the people deciding not to hold the Microsofts, Steams and Sonys of this world to account and the group of users actually relying on the accessibility affordances. Happy playing.
#PSA | When you post images or video, please provide at least a basic description in #AltText. Some in the Fediverse rely on them for a complete experience, using a #ScreenReader.
Thanks.
Good evening mastofolks. Today, I performed my first #accessibility test on a mobile app that's currently available as a stable release.
I tested my first website in 2018, but up until now, I have only tested beta-stage websites and apps. So today was a major breakthrough.
Currently, I test sites and apps on Windows and Android only. I mainly focus on #UX (user experience) testing, to ensure that websites and apps are ACE!
A: Accessible for #ScreenReader users.
C: Clear and transparent.
E: Easy to navigate.
So, if anybody wants their website or app tested, please feel free to send me a direct message.
Seems to be fully #accessible for #Screenreader users. It even has support for #Talkback actions. https://mastodon.social/@pachli/111030607167438186
Hello, people who use a #screenreader or similar tools,
who have a #sehbehinderung (visual impairment), or who have texts read aloud to you for other reasons:
I'd like to start a distance-learning course and would also like to have the PDFs read aloud to me. Which free iPhone apps can you recommend?
Is it okay to fill the chat with dozens of messages during a Zoom meeting or other videoconference?
https://beehaw.org/post/7716703
#A11y #Accessibility #Disability #Blind #ScreenReader #Zoom #Videoconference #Jaws #NVDA #VoiceOver #Meetings
So what do I do with this #BlueSky account now? Crossposting everything from here makes no sense. And as long as reading the TL with a #Screenreader is so time-consuming, I only rarely open the app. But I have 2 invites again. If you'd like one, let me know.
If you absolutely want to write a detailed #ImageDescription, or you have to (yes, there can be reasons for that), and you're on something that isn't #Mastodon, don't put it into the #AltText.
Sure, the alt-text culture largely came from Mastodon. And most #Fediverse users have probably been taught what alt-text is while on Mastodon, and if not, then by Mastodon users. And yes, Mastodon gives you much more space in alt-text than in the post text.
But all the other Fediverse projects don't.
Mastodon gives you 500 characters save for modified instances. But:
#Misskey gives you 3,000 characters.
#Firefish gives you 3,000 characters at default settings.
#Pleroma gives you 5,000 characters at default settings.
#Akkoma gives you 5,000 characters at default settings.
#Friendica, #Hubzilla and #Streams all give you a practically unlimited amount of characters.
So "there's no room in the post" isn't true in most cases. Mastodon's limitations only apply to Mastodon. Everywhere else, you're free from them.
Besides, the longer an alt-text is, the more inconvenient it becomes for #ScreenReader users. A screen reader can only read alt-text in one chunk. It can't rewind to one specific point in the alt-text and re-read from there. If screen reader users want something re-read, they'll have to go all the way back to the beginning of the alt-text.
Lastly, you should never put any information that's only available in the image description into the alt-text. There are actually people who cannot access alt-text, for example due to a physical disability that prevents them from hovering a mouse cursor over an image (some of us sighted people aren't on mobile). Any information that's only in the alt-text is inaccessible and therefore lost to them. They can only read what's in the post text itself.
Yes, a full-blown image description in the post text doesn't look good. Yes, posts look neater with the image description tucked away in the alt-text. Yes, some Mastodon users protest against posts with over 500 characters in their federated timelines. But there are also Mastodon users who protest against having to write alt-text because they can't be bothered.
And there are disabled Mastodon users who'd protest against detailed image descriptions in alt-text that they'd like to read, but can't, if they knew that such image descriptions exist underneath particular images.
I've actually conducted a test and demonstration with a very long image description:
in the alt-text, where it was cut off at the 1,500-character mark by Mastodon and other projects, as expected
in a separate article, which is not only inconvenient again, but also only available to #Hubzilla users and people with websites of their own
in the post text, which blew the initially very short post way out of proportion, but which was by far the most accessible method
As far as I can see, putting the image description into the post text is the preferred way.
Hi, MastoFolk! I'm still here! I got going with a couple of fantastic bi communities on Discord. The app is usable with a screen-reader, but could use improvements. IMO, worth it to be a part of these communities I love. #bisexual #screenReader #Discord #Accessibility
I thought it couldn't get any worse, but the Instagram website, specifically the direct messaging chats list, has gone even further down the accessibility toilet. As of now, my screen reader cannot see chats that I know exist. Particularly frustrating as fellow students and student groups use Instagram direct messaging religiously.
#Instagram #Accessibility #A11Y #ScreenReader #Blind #LowVision #VisuallyImpaired #BlindMastodon #BlindMasto #BlindFedi
Bit of an odd #a11y question: is there any #screenreader that outputs everything it reads/says to a text file somewhere? Want to try some bits in #CSS and other browser features and would be good to have backup copies of what the screen readers said before and after the changes to my page.
Is #YouTube ever going to fix the bug where #ScreenReader users can't read the names of shorts when viewing a specific channel, at least on #iOS?
I have a question to those who use screen readers.
How do posts with a CW (content warning) work for you?
I wanted to post what we saw in the forest behind a CW, but I wasn’t sure whether you could skip the post if you wanted to. (Sometimes, Mother Nature can be cruel.)
Thank you.
Beginning to get messages in my various #cricket chat places like "I've got spare tickets for the #ODI at #TheOval on the 13th". By an 𝘢𝘮𝘢𝘻𝘪𝘯𝘨 coincidence, look what the weather forecast says. Also, if any of you use a #ScreenReader could you tell me whether that italicisation of the word "amazing" worked well? #blind
New app added to the Accessible Apps directory by Wang Ice from Prudence Interactive: Prudence Screen Reader. New release | Break information barriers and lead an intelligent life for the visually impaired: the overseas version of the “Prudence Screen Reader” for Android mobile is officially on Google Play! https://accessibleandroid.com/app/prudence-screen-reader/ #Android #App #ScreenReader
We are very pleased to announce that version 2023.2 of NVDA, the free screen reader for Microsoft Windows, is now available for download. We encourage all users to upgrade to this version.
This release introduces the Add-on Store, new braille features, commands, & display support, new input gestures for OCR & flattened object navigation & much more!
Please visit https://www.nvaccess.org/post/nvda-2023-2/ for the full info, & to download the new version! #NVDA #ScreenReader #Accessibility #A11y

@jntrnr This is a great blog post about #Nushell. And coming up: even better #ScreenReader support. It slices and dices #DATA and probably even juliennes fries.
And now, how much would you pay?
#A11Y #Accessibility
#Business #Pitfalls
Don’t fake bold and italic text with Unicode · It’s detrimental to accessibility, findability, and predictability https://ilo.im/14wzwi
_____
#SocialMedia #SEO #Findability #Accessibility #ScreenReader #Unicode
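To make the Unicode point concrete, here is a small Python sketch (my own illustration, not something from the linked article). Letters from the Mathematical Alphanumeric Symbols block look italic, but to software they are a completely different string, which is why search, spell-check and some screen readers can stumble over them; compatibility normalization (NFKC) shows what the plain word would have been:

```python
import unicodedata

# "amazing" spelled with Mathematical Sans-Serif Italic letters, one common way
# people fake italics; other math alphabets behave the same way.
fake = "\U0001D622\U0001D62E\U0001D622\U0001D63B\U0001D62A\U0001D62F\U0001D628"
real = "amazing"

print(fake == real)                          # False: it's a different string entirely
print(unicodedata.name(fake[0]))             # MATHEMATICAL SANS-SERIF ITALIC SMALL A
print(unicodedata.normalize("NFKC", fake))   # "amazing": normalization recovers the plain word
```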
I have a #blind musician nephew willing to learn more about #Reaper used with #nvda and #osara. However, he lives in Tunisia and mainly speaks Arabic. Are there blind Reaper users out there who speak Arabic and could talk with him and help? Or any forum or group where Arabic is used on this topic? Thanks! #a11y #screenreader #accessibility
testing my about-me webpage with a #ScreenReader worked decently until the paragraph in Welsh 😅
(and also the phrases "XCOM 2", which it's pronouncing as individual letters, and "DevOps", which it's pronouncing as one word. I could put a zero-width space there to make it "X Com", but I'm not sure that's best practice?)
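For what it's worth, here is a minimal sketch of that zero-width-space idea. Whether a given screen reader actually changes its pronunciation because of U+200B is not guaranteed (that part is speculation on my side), and on a web page an aria-label on the surrounding element may be the more conventional way to control what gets announced:

```python
# U+200B is invisible but still changes the underlying string, which is part of
# why it may not be best practice: search, copy/paste and spell-check all see a
# different word than what the reader sees.
ZWSP = "\u200b"

plain = "XCOM"
spaced = "X" + ZWSP + "COM"

print(spaced)                    # renders as "XCOM"
print(len(plain), len(spaced))   # 4 5
print(plain == spaced)           # False
```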
First of all, "Mastodon" does not stand for the whole #Fediverse. The length of this post should be a give-away that I might not even be on Mastodon myself. Because I'm not.
Next, allow me to elaborate.
Look at all those many articles on the Web that explain what alt-text is and what it should look like. Unless they were explicitly written for Mastodon, they don't take Mastodon or the Fediverse in general into consideration.
If they speak of "social media", what they mean are #𝕏 with a limit of 500 characters in alt-text and #Facebook with a limit of only 100 characters. Also, they take into account that older #ScreenReaders are limited to 200 characters, and #blind or #VisuallyImpaired #ScreenReader users are normally the only target audience for alt-text.
Thus, these articles say that alt-text should be short and concise and limited to what matters in the image within the context of the webpage or article or post it is part of.
Then came Mastodon. With it came a new culture of #inclusion and more or less voluntarily-granted #accessibility, also because it became a safe haven for #disabled people who had fled from the rampant social Darwinism on #Twitter.
Not only that, but Mastodon raised its alt-text character limit from 450 to 1,500 characters. For each image, not for all of them together, as far as I know.
The next thing that happened was that people suggested that image descriptions in alt-text should be more detailed. Instead of being limited to what's important, they should fully describe images. For one, a detailed image description could help even sighted people understand an image better, for example, if it contains something technical.
Besides, since most Fediverse users mostly or exclusively use the Fediverse on mobile phones, another use-case for image descriptions emerged, namely as a replacement for the image proper when the network is so weak that the image doesn't load.
And then people started writing detailed image descriptions. Yes, in alt-text. It felt only natural to do so because most people who started describing images in alt-text had never written an #ImageDescription before. And alt-text is what describes an image, right?
On Mastodon, it was and still is actually fully justified: You have 1,500 characters for alt-text per image. For the post itself, you only have 500 characters. And these 500 characters have to include the post text, extra hashtags and even the content warnings. You may end up with fewer than 100 characters for an image description. Alt-text grants you 1,400 characters more.
However, just because it's what everyone does on Mastodon, where they don't have a choice, that doesn't mean it's the right thing to do. Especially not if you do have a choice.
(To be continued...)
People of the world who need #alttext because you use a #screenreader, can you tell me some good practices or do's and don'ts when adding it? 🙏 #accessibility (please boost if you can, really curious about this so I can do it better!)
Hi everyone,
In-Process for the 11th August is out, featuring all the news on NVDA 2023.2 Beta 2, tips on reporting errors in NVDA, some thoughts on reporting accessibility issues on other programs, and NV Access get even more social! All in: https://www.nvaccess.org/post/in-process-11th-august-2023/ #NVDA #ScreenReader #A11y #bug
I don't write my #ImageDescriptions only for #blind or #VisuallyImpaired users. I write them for everyone for whom they may be of use. I write them for people who wouldn't be able to identify and/or understand what's in the pictures I post. Seeing as the content in my pictures tends to be very unusual, I almost always have to go into detail a lot.
Now, I'm not on #Mastodon. I'm not bound to 500 characters for the post plus the content warning, and I'm only bound to 1,500 characters for the #AltText because that's the length at which Mastodon, #MissKey, #Firefish and probably other #Fediverse projects cut off longer alt-text. I practically don't have any character limit whatsoever.
This means that I've actually got much, much, much more space for an #ImageDescription in the post itself where it should actually belong than in the alt-text. And I really need that space to describe the images I usually post. So the image description goes into the post, and the alt-text briefly mentions the image plus that its description is in the post.
It does have a few downsides, though: Sighted Mastodon users might have to scroll past an enormous wall of text before they reach the image that's described. This multiplies with each image, should I put more than one into a post.
Besides, I have to add a #ContentWarning for a long post, also mentioning how long the post is. To my best knowledge, a long post warning should be issued for anything over 280 characters or five lines on Mastodon's default Web interface, whichever is reached first, and my posts with image descriptions tend to exceed that by magnitudes.
It's interesting to see that about one out of six blind or visually-impaired users prefers image descriptions that are "as detailed as possible" although "possible" can be stretched to oblivion. And frankly, not only have I yet to encounter someone who complains about my image descriptions being too long, but at least some people have declared they like the way, as they say, I "paint with words".
Still, this poll makes me wonder what the other five out of six think about extensive and detailed image descriptions, especially when they're in the post text itself and not in the alt-text where they can be hard for #ScreenReader users to navigate.