Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post

Replies

Boosts

Views

Activity

Custom Keyboard Extension Not Showing in Settings for Activation
Hi everyone, I’m developing a React Native iOS app that includes a custom keyboard extension for sending stickers across apps. The project builds successfully, and the main app installs fine on my test device. However, I’m not seeing the keyboard extension appear under Settings → General → Keyboard → Keyboards → Add New Keyboard, which means I can’t activate it or grant access. At this point, I’m not even sure if the extension is actually being installed on the device along with the main app.

Here’s what I’ve done so far: I created a Keyboard Extension target in Xcode, set the correct bundle identifiers and provisioning profiles, and enabled “Requests Open Access” in the extension’s Info.plist. I built and installed the app on a physical device rather than the simulator to ensure proper testing.

My main questions are: how can I confirm that the extension is being installed on the device, and if it isn’t, what might prevent it from installing even though the build completes successfully? Any insights, troubleshooting steps, or guidance would be greatly appreciated.
0
0
868
Nov ’25
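A first diagnostic for the question above (is the extension installed at all?): a minimal Swift sketch that lists the .appex bundles embedded in the installed app at runtime. The helper name is hypothetical; the APIs used (builtInPlugInsURL, FileManager) are standard Foundation. An empty result usually points at the app target's embed-extensions build phase rather than at code signing.

```swift
import Foundation

// List any app-extension bundles actually embedded in the installed app.
// If the keyboard extension was built but never embedded, nothing appears here.
func embeddedExtensionNames() -> [String] {
    guard let plugInsURL = Bundle.main.builtInPlugInsURL,
          let contents = try? FileManager.default.contentsOfDirectory(
              at: plugInsURL,
              includingPropertiesForKeys: nil) else {
        return []
    }
    return contents
        .filter { $0.pathExtension == "appex" }
        .map { $0.lastPathComponent }
}

// Call this early, e.g. in application(_:didFinishLaunchingWithOptions:):
// print("Embedded extensions:", embeddedExtensionNames())
```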
VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app mainly contains documents/books that users can read.

Issue: VoiceOver skips lines that are handed to it with leading spaces. We have observed this in different languages. It only happens with line granularity; other granularities seem to work as expected.

Implementation: We use the following UIAccessibilityReadingContent APIs to provide line content to VoiceOver: accessibilityPageContent, accessibilityFrameForLineNumber, and accessibilityContentForLineNumber. We create UIAccessibilityElement objects to pass to VoiceOver, and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use accessibilityNextTextNavigationElement and accessibilityPreviousTextNavigationElement to cross element boundaries for all granular navigations.

We want to know whether skipping a line that has leading spaces is expected behavior or a bug in UIKit.
1
0
481
Nov ’25
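For comparison with the implementation described above, a minimal sketch of a per-page element adopting UIAccessibilityReadingContent. The lines/lineFrames storage is a hypothetical stand-in for the app's real layout data; one variable worth isolating when reproducing the skipping issue is whether a line stored with leading spaces is skipped while its trimmed counterpart is not.

```swift
import UIKit

// One accessibility element per page, serving its text to VoiceOver line by line.
final class PageAccessibilityElement: UIAccessibilityElement, UIAccessibilityReadingContent {
    var lines: [String] = []          // hypothetical per-line content
    var lineFrames: [CGRect] = []     // matching frames in screen coordinates

    func accessibilityLineNumber(for point: CGPoint) -> Int {
        lineFrames.firstIndex(where: { $0.contains(point) }) ?? NSNotFound
    }

    func accessibilityContent(forLineNumber lineNumber: Int) -> String? {
        guard lines.indices.contains(lineNumber) else { return nil }
        // Worth testing for the bug above: does trimming leading whitespace
        // here stop VoiceOver from skipping the line?
        return lines[lineNumber]
    }

    func accessibilityFrame(forLineNumber lineNumber: Int) -> CGRect {
        lineFrames.indices.contains(lineNumber) ? lineFrames[lineNumber] : .zero
    }

    func accessibilityPageContent() -> String? {
        lines.joined(separator: "\n")
    }
}
```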
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this.

The Concept: Skeleton-based Normalization. Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input.
Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers).
Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific correction (e.g., "Raise your elbow 10 degrees").

Why this approach?
Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
Privacy: We are processing coordinates, not video streams.
Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
1
0
830
Dec ’25
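On the comparison step in the proposal above, a rough sketch under stated assumptions: reference poses are stored as joint-name-to-position maps captured from native signers (that storage format and the joint subset are assumptions for illustration, not part of ARKit). ARSkeleton3D supplies the per-joint model transforms.

```swift
import ARKit
import simd

// Sum of per-joint distances between the live skeleton and a stored
// reference pose. Positions are taken from the joint model transforms
// (relative to the body root), so the metric ignores where the user stands.
func poseDistance(bodyAnchor: ARBodyAnchor,
                  referencePose: [String: SIMD3<Float>]) -> Float {
    // A small illustrative subset; ARSkeleton3D tracks many more joints.
    let joints: [ARSkeleton.JointName] = [.head, .leftShoulder, .rightShoulder,
                                          .leftHand, .rightHand]
    var total: Float = 0
    for joint in joints {
        guard let transform = bodyAnchor.skeleton.modelTransform(for: joint),
              let reference = referencePose[joint.rawValue] else { continue }
        let position = SIMD3<Float>(transform.columns.3.x,
                                    transform.columns.3.y,
                                    transform.columns.3.z)
        total += simd_distance(position, reference)
    }
    return total
}
```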
Accessibility voice command recording does not start on Apple Vision Pro
Is the voice command recording accessibility feature available on the Apple Vision Pro? It does not start on my device, which is on 26.1. Regular single voice commands work on the Apple Vision Pro, and recording commands worked on other devices (iPad and iPhone).
2
0
823
Dec ’25
pairedUUIDsDidChangeNotification never fires, even with MFi hearing aids paired
Hi everyone — I’m implementing the new Hearing Device Support API described here: https://developer.apple.com/documentation/accessibility/hearing-device-support

I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I’ve added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration). My entitlements file contains com.apple.developer.hearing.aid.app with the value xxxxx, but the app won't even compile with this entitlement.

Problem: my NotificationCenter.default.addObserver(...) call for pairedUUIDsDidChangeNotification never fires — not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids. Because the notification never triggers, calls like HearingDeviceSession.shared.pairedDevices always return an empty list.

What I expected: according to the docs, the notification should be posted whenever the paired device UUIDs change, and the session should expose those devices — but nothing happens.

Questions: Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file? Is there a way to verify that iOS is actually honoring this entitlement? Has anyone successfully received this notification on a real device? Any help or confirmation would be greatly appreciated.
1
0
714
Dec ’25
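For anyone comparing notes on the post above, here is the observer shape it describes, written with the symbol names exactly as given there. I have not verified HearingDeviceSession or the notification's namespace against the SDK, so treat the scoping (static member versus free constant) and the module import as assumptions to check against the linked documentation.

```swift
import Foundation
import Accessibility  // assumed module for the Hearing Device Support API

final class HearingDeviceObserver {
    private var token: NSObjectProtocol?

    func start() {
        // Notification name as given in the post; its exact namespace is assumed.
        token = NotificationCenter.default.addObserver(
            forName: HearingDeviceSession.pairedUUIDsDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            // Re-query the session whenever the paired set changes.
            print("Paired devices:", HearingDeviceSession.shared.pairedDevices)
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}
```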
Icon labels missing
Since the last beta upgrade for iPad to 26.3, labels have disappeared. In Settings → Accessibility, the toggle setting for labels makes no difference whether on or off; labels are permanently missing.
1
0
1.2k
Jan ’26
Voice Control evaluation questions: "Stop Recording" command failure & Item numbers on non-interactive web elements
Hello everyone, I am currently evaluating my app's accessibility features to accurately display the "Accessibility" information on the App Store. I have encountered two specific issues regarding Voice Control testing and would appreciate any guidance.

Voice Command for "Stop Recording": According to the evaluation criteria, if an app supports audio recording or dictation, users must be able to start and stop recording using only their voice. Behavior: I can successfully trigger the recording using the command "Start Recording". However, I cannot find a command to stop it; commands like "Stop Recording" or "Stop" are not recognized by the system. Question: Is there a specific standard voice command intended for stopping a recording?

Item Number Overlays on Non-Interactive Web Elements (WKWebView): I noticed an inconsistency between native views and web content regarding Voice Control item numbering. Behavior: When testing web content within the app (WKWebView) or in Safari, Voice Control displays item number overlays on non-interactive text elements (such as standard, non-interactive HTML text tags), whereas in native views static labels do not receive item numbers. Question: Is this expected behavior for web content? Since these elements are not interactive, I am unsure whether this should be considered a bug (fail) or an acceptable exception for the accessibility evaluation.

Has anyone experienced similar issues, or does anyone know the correct criteria for these cases? Thank you.
1
0
1.6k
Feb ’26
Square mouse and lack of transparency
After the 26.3 beta update, my mouse has been having major problems with transparency. I have to keep going to reset colors in Display settings, but it doesn't hold. Anyone else?
1
0
1.1k
Feb ’26
VoiceOver with Swift Charts summaries
I had a VoiceOver user point out an issue with my app that I’ve definitely known about but have never been able to fix. I thought I had filed feedback for it, but it looks like I didn’t. Before I do, I’m hoping someone has some insight. With Swift Charts, when I tap part of a chart it summarizes the three hours, and then you can swipe vertically to hear it read out details of each hour. For example, the Y-axis is the amount of precipitation for the hour and the X-axis is the hours of the day. The units aren't read in the summary, but they are for individual hours when you swipe vertically. The summary says something such as "varies between 0.012 and 0.082". In the AXChartDescriptor I’ve tried everything I can think of, including adding a label to the Y-axis in the DataPoint, but nothing seems to work to get that summary to include units. With a vertical swipe it seems to just use my accessibility label and value (as I would expect).
0
0
319
Feb ’26
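On the units question above, the hook most likely to influence spoken values is the axis descriptor's valueDescriptionProvider, which formats every value the axis speaks. Below is a sketch built around the precipitation example (the numbers and the unit string are illustrative). Whether VoiceOver's automatic range summary picks up this formatting is exactly what the post is probing, so this is something to test rather than a confirmed fix.

```swift
import Accessibility

// An AXChartDescriptor whose Y axis attaches units to every spoken value.
// Returned from the chart view's accessibilityChartDescriptor (AXChart).
func makeChartDescriptor() -> AXChartDescriptor {
    // Y axis: format each value with its unit.
    let yAxis = AXNumericDataAxisDescriptor(
        title: "Precipitation",
        range: 0...0.1,
        gridlinePositions: []
    ) { value in
        "\(value) inches"
    }

    // X axis: hour of day.
    let xAxis = AXNumericDataAxisDescriptor(
        title: "Hour of day",
        range: 0...23,
        gridlinePositions: []
    ) { value in
        "\(Int(value)):00"
    }

    let series = AXDataSeriesDescriptor(
        name: "Hourly precipitation",
        isContinuous: false,
        dataPoints: [AXDataPoint(x: 9, y: 0.012), AXDataPoint(x: 12, y: 0.082)]
    )

    return AXChartDescriptor(
        title: "Precipitation",
        summary: nil,
        xAxis: xAxis,
        yAxis: yAxis,
        additionalAxes: [],
        series: [series]
    )
}
```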
Apple Pay and third-party app installation not working
I'm writing this post to get more attention: on March 6 I submitted a feedback report (updated today, March 18) through the Feedback app installed on my iPhone. I ask anyone who works at Apple, especially the software engineers responsible for the accessibility features of iOS 26, to review this feedback so that the accessibility options for Apple users can be expanded even further. I'm leaving the Feedback ID below; thank you very much for the work you do. FB22142615
1
0
434
3w
Left-flick and right-flick gestures with VoiceOver and UIAccessibilityReadingContent
Hi, I have an app that displays lines of text that I want to make accessible with VoiceOver. It's based on a UITextView. I have implemented the UIAccessibilityReadingContent protocol, following the instructions in https://developer.apple.com/videos/play/wwdc2019/248, and now users can explore the screen line by line by moving their fingers on the screen. That works fine. However, users would also like to be able to use left-flick and right-flick to move to the previous or next line on the screen, and I haven't been able to make this work. I can see that left-flick triggers accessibilityPreviousTextNavigationElement and right-flick triggers accessibilityNextTextNavigationElement, but I don't understand what these properties should contain.
1
0
1.2k
1w
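A guess at the missing wiring, inferred from the line-granularity post earlier on this page, which uses the same two properties to cross element boundaries: each property appears to hold the accessibility element VoiceOver should move to when a flick runs past the current element's content. The pageElements array is hypothetical, and the exact property declarations are unverified, so check them against the UIKit headers.

```swift
import UIKit

// Chain per-block accessibility elements (each implementing
// UIAccessibilityReadingContent) so left/right flicks can cross between them.
func linkTextNavigation(_ pageElements: [UIAccessibilityElement]) {
    for (index, element) in pageElements.enumerated() {
        element.accessibilityNextTextNavigationElement =
            index + 1 < pageElements.count ? pageElements[index + 1] : nil
        element.accessibilityPreviousTextNavigationElement =
            index > 0 ? pageElements[index - 1] : nil
    }
}
```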
actionGroupCell Y coordinates are corrupted; the 'Save to Files' overlay doesn't expose valid IDs for testability
I am using the Maestro framework for testing iOS. For our tests we need to save PDFs to Files. When opening the 'QLOverlayDefaultActionButtonAccessibilityIdentifier' overlay, the buttons are way off; the coordinates are corrupted. When opening the Files app, the next screen has no coordinates for testing. I checked with the Xcode inspector and the same issue persists.
1
0
113
16h