The future of AI in disability care isn’t about making businesses more efficient—it’s about empowering people. For individuals with complex communication needs, cognitive disabilities, and fluctuating conditions, a custom, person-centered LLM could be the difference between life and death. Yet, so much of AI in disability services is built around business needs—not actual human needs.
AI in disability care shouldn’t be about optimizing bureaucratic processes or making it easier for agencies to cut costs. It should be about proactive, person-centered support—LLMs fine-tuned to the specific needs of the individual, trained on their communication styles, routines, and unique support requirements.
Valmar Case: One Example of How a Custom LLM Could Have Helped
Take the Valmar case, where a man with disabilities choked to death because his dietary needs weren’t properly communicated to support staff. The NDIS Commission found that crucial information about his texture-modified diet was missed.
Now, imagine if a custom LLM trained specifically on his support needs had been in place. A support worker could have snapped a simple photograph of the meal, and an AI-powered dietary alert system, checking it against his documented diet, could have flagged in real time that the food was unsafe, triggering an immediate response before it was too late.
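To make that scenario concrete, here is a minimal sketch of what such a check might look like. It is an illustration under assumptions, not a real system: DietaryProfile, describe_meal_from_photo, and alert_support_team are hypothetical names standing in for a person's documented mealtime plan, whichever vision model a provider chooses, and whatever alert channel support staff already use.

```python
# Hypothetical sketch only. `describe_meal_from_photo` and `alert_support_team`
# are stand-ins for whatever vision model and alert channel a provider already uses;
# nothing here references a real product or the actual case records.

from dataclasses import dataclass


@dataclass
class DietaryProfile:
    name: str
    texture_level: str              # e.g. "pureed (texture-modified)"
    prohibited_textures: list[str]  # textures that must never be served
    notes: str                      # free-text guidance from the mealtime plan


def describe_meal_from_photo(photo_path: str) -> list[str]:
    """Stand-in for an image model that labels the textures visible in a meal photo."""
    # A real implementation would call a vision model here; hard-coded for the sketch.
    return ["firm", "crusty"]


def alert_support_team(message: str) -> None:
    """Stand-in for the provider's existing alert channel (SMS, app notification, pager)."""
    print(f"ALERT: {message}")


def check_meal(profile: DietaryProfile, photo_path: str) -> bool:
    """Return True only if the photographed meal matches the person's documented diet."""
    detected = describe_meal_from_photo(photo_path)
    unsafe = [t for t in detected if t in profile.prohibited_textures]
    if unsafe:
        alert_support_team(
            f"Meal for {profile.name} appears to contain {', '.join(unsafe)} textures; "
            f"their plan specifies {profile.texture_level}. Do not serve until reviewed."
        )
        return False
    return True


if __name__ == "__main__":
    profile = DietaryProfile(
        name="J.",
        texture_level="pureed (texture-modified)",
        prohibited_textures=["whole", "firm", "crusty"],
        notes="All meals must follow the mealtime management plan.",
    )
    check_meal(profile, "lunch.jpg")  # prints an alert with the stubbed labels
```

The shape matters more than the code: the safety rule travels with the person's profile and is checked at the moment of serving, not buried in a report a new worker may never read.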
This is the future we should be building.
Why Person-Centered LLMs Matter
A custom LLM for a person with disabilities could:
✅ Identify risks before they happen – analyzing behavioral data, dietary restrictions, and health patterns to proactively prevent crises.
✅ Communicate in a way that makes sense – fine-tuned to an individual’s communication style, whether that’s emojis, AAC, or simplified language.
✅ Adapt and learn – tracking real-world changes in someone’s support needs, rather than relying on outdated reports and rigid bureaucratic reviews.
✅ Reduce the burden on caregivers – offering real-time insights and reminders instead of forcing families to constantly re-explain everything to new support workers. (A rough sketch of what such a person-centered profile could look like follows this list.)
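Here is a rough sketch of the kind of structured, person-centered profile such a model would draw on. Every field name is illustrative, invented for this example rather than taken from any existing NDIS schema or product.

```python
# Hypothetical sketch only. Field names are illustrative, not an existing NDIS schema;
# the point is that the model's context is structured around the person and kept
# current by the people who know them.

import json

support_profile = {
    "communication": {
        "preferred_mode": "AAC device with symbol board",
        "understands_best": "short sentences, one instruction at a time",
        "avoid": ["abstract idioms", "rushed questions"],
    },
    "health_and_safety": {
        "diet": "texture-modified (pureed); no bread, no whole fruit",
        "choking_risk": "high",
        "escalation_contact": "on-call nurse",
    },
    "routines": {
        "morning": "needs a quiet environment before 9am",
        "known_stressors": ["loud unfamiliar voices", "unannounced schedule changes"],
    },
}

# The profile becomes the model's standing context, however it is delivered
# (fine-tuning, retrieval, or simply a structured prompt like this one).
system_context = (
    "You are a support assistant for one specific person. "
    "Always follow this profile:\n" + json.dumps(support_profile, indent=2)
)
```

Whether that context reaches the model through fine-tuning, retrieval, or a plain structured prompt, the substance is the same: the starting point is the person, not the paperwork.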
Building a Future That Puts People First
AI shouldn’t be another tool for corporate efficiency or government cost-cutting. It should be about real-world safety and autonomy for people with disabilities. Instead of agencies training AI to detect fraud or reduce funding, we should be training AI to understand individual needs and prevent avoidable harm.
The technology exists. The only question is whether we build it to serve people—or to serve the system.
#AIForPeople
#PersonCenteredAI
#CustomLLMs
#DisabilityRights
#NDIS