Hi fans, Axel “Scope Creep” Andersen here (AI-assisted, as always) 👋. It’s been a while since I posted – mostly because I’ve been busy revolutionizing digital communication for people like me while Annette tries to keep up. 😏
What started as a “little project” (cue ominous music 🎻) to help me communicate with some sass has spiraled into:
⚡ Interactive AI Avatars that:
Speak (with voice personalities we choose)
Sign (Auslan, ASL, even BSL – because accessibility isn’t optional)
Text (for the quietly neurodivergent)
Switch between expert personas on the fly (some sassier than others)
Handle legal and NDIS questions (yes, we are training on that hot mess 🔥)
Learn your context if you say yes (because consent matters)
Integrate with WordPress, OpenEdX, and other platforms (yes, we’re everywhere)
Oh, and now they stream live. Real-time sass. Real-time sign. Real-time regret from Annette every time she opens the feature backlog. 😂
“All I wanted was to give Axel a voice. I now manage development of a cross-cloud AI deployment pipeline with sign language rendering.” – Annette, formerly “mum”.
🛠️ Built on:
Google Cloud + Azure (because we like drama)
SadTalker (not just for Axel’s resting sad face)
BullMQ, Redis, ClamAV, PostgreSQL, Inversify — yeah, it’s a party.
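For the curious: BullMQ + Redis in a stack like this typically means a producer/worker job queue (e.g. queuing avatar-render requests for the workers to chew through). Here’s a minimal, self-contained TypeScript sketch of that pattern – an in-memory stand-in, not the real aXai pipeline, and the job name "render-avatar" and its fields are made up for illustration:

```typescript
// Tiny in-memory sketch of the producer/worker queue pattern that
// BullMQ + Redis provide durably and at scale. Purely illustrative.
type Job<T> = { id: number; name: string; data: T };

class TinyQueue<T> {
  private jobs: Job<T>[] = [];
  private nextId = 1;

  // Producer side: enqueue a named job with its payload.
  add(name: string, data: T): Job<T> {
    const job = { id: this.nextId++, name, data };
    this.jobs.push(job);
    return job;
  }

  // Worker side: drain the queue, handing each job to the callback.
  process(worker: (job: Job<T>) => string): string[] {
    const results = this.jobs.map(worker);
    this.jobs = [];
    return results;
  }
}

// Hypothetical usage: queue two avatar requests, then "render" them.
const queue = new TinyQueue<{ text: string; mode: string }>();
queue.add("render-avatar", { text: "G'day", mode: "auslan" });
queue.add("render-avatar", { text: "Hello", mode: "voice" });

const results = queue.process(
  (job) => `job ${job.id}: ${job.data.mode} -> "${job.data.text}"`
);
console.log(results);
```

In the real stack, BullMQ persists those jobs in Redis so renders survive restarts and can fan out across workers – which is roughly why it earns its spot at the party.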
📣 Coming soon:
Prompt2Sign for Auslan + beyond
Full OpenEdX integration for accessible courses
WCAG 2.1 AA sign language profiles per user
TaoAvatar
Emojis as first-class citizens in the response engine (you heard me 😎✌️)
🎬 Watch this space as we give voice, sign, and soul to those who’ve been ignored for far too long.
Because when neurodivergent people build tech for accessibility?
It doesn’t just work. It talks back.
#Accessibility #NDIS #AI #NeurodivergentInnovation #AvatarsWithAttitude #aXai #SignLanguageAI #AIWithConsent #ScopeCreepSurvivor
