Sensory Sims




Inspiration
As designers and developers, we’re constantly building products for users—but we rarely get to step into their shoes. We realized how easy it is to overlook the lived experiences of people with sensory impairments when we’ve never truly felt what they go through. This project was born from a desire to bridge that gap—not just to design for empathy, but to design through empathy by letting people experience what it’s actually like.
What it does
Sensory Sims is a Flutter-based mobile app that lets users experience simulated sensory impairments—from blurred vision and color blindness to tinnitus and muffled hearing. It uses:
- **Flutter** for cross-platform UI
- **just_audio** for realistic auditory simulations
- **camera** for live video filters simulating visual conditions
- **colorfilter_generator** for creating visual impairment simulations by applying color transformations
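The visual simulations boil down to rendering every camera frame through a color transformation. A minimal sketch of the idea, using Flutter's built-in `ColorFiltered` widget and the `camera` plugin (the matrix values are a commonly cited approximation of deuteranopia, not the app's exact coefficients):

```dart
import 'package:camera/camera.dart';
import 'package:flutter/material.dart';

// Approximate 4x5 color matrix simulating deuteranopia (green-blindness).
// These are widely circulated approximation values, used here for illustration.
const List<double> deuteranopiaMatrix = [
  0.625, 0.375, 0.0,   0.0, 0.0,
  0.700, 0.300, 0.0,   0.0, 0.0,
  0.0,   0.300, 0.700, 0.0, 0.0,
  0.0,   0.0,   0.0,   1.0, 0.0,
];

/// Wraps the live camera feed so every frame is rendered
/// through the impairment matrix.
class ImpairedCameraView extends StatelessWidget {
  final CameraController controller;
  const ImpairedCameraView({super.key, required this.controller});

  @override
  Widget build(BuildContext context) {
    return ColorFiltered(
      colorFilter: const ColorFilter.matrix(deuteranopiaMatrix),
      child: CameraPreview(controller),
    );
  }
}
```

Packages like colorfilter_generator essentially produce matrices of this shape, so swapping conditions is just swapping the 20 coefficients.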
Challenges we ran into
One of the biggest hurdles was simulating sensory impairments both accurately and accessibly. Creating animated avatars for different conditions was particularly tough due to the lack of free, high-quality resources. We had to experiment with different tools and platforms—like Hailuo, ElevenLabs, and PixVerse—to find the right combination that gave us natural, empathetic output.
Another major challenge was merging accessibility-focused simulations (vision and hearing) with real-time processing in Flutter, especially since much of the required tech isn’t natively supported. Altering camera feeds and images to simulate impaired vision involved complex filtering and layering techniques, which pushed the limits of what Flutter could handle efficiently.
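The layering technique can be illustrated with Flutter primitives: stacking a `BackdropFilter` over the feed blurs whatever is painted beneath it, approximating blurred vision. This is a minimal sketch of the approach, not the app's exact pipeline:

```dart
import 'dart:ui' as ui;
import 'package:flutter/material.dart';

/// Layers a blur over any child (e.g. a CameraPreview) to
/// approximate blurred vision; [sigma] controls severity.
Widget blurredVision({required Widget child, double sigma = 6.0}) {
  return Stack(
    fit: StackFit.expand,
    children: [
      child,
      // BackdropFilter blurs everything already painted beneath it
      // within the bounds of its (transparent) child.
      BackdropFilter(
        filter: ui.ImageFilter.blur(sigmaX: sigma, sigmaY: sigma),
        child: Container(color: Colors.transparent),
      ),
    ],
  );
}
```

Because `BackdropFilter` re-filters the backdrop on every frame, combining several such layers over a live feed gets expensive quickly—which is exactly where Flutter's limits showed.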
Accomplishments that we're proud of
We’re proud of how we took an abstract, empathy-driven idea and turned it into a working prototype—in just two days. Learning how animations and filters work in Flutter, and successfully applying them to simulate real-world impairments, was a huge technical and creative win.
One of the highlights was creating engaging AI avatars that made the experience feel personal and emotionally resonant. It was exhilarating to see how these assets brought the simulations to life in a way that was both educational and enjoyable.
Ultimately, we built a tool that has the potential to educate, train, and inspire empathy among healthcare workers, designers, and families—bridging the gap between technology and human experience.
What we learned
We learned how to push the limits of Flutter’s animation and filtering capabilities to simulate real sensory impairments through live camera feeds and visual effects. It gave us a much deeper appreciation for the technical challenges behind building accessible experiences.
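On the auditory side, a minimal tinnitus simulation with just_audio can be as simple as looping a tone asset. This sketch assumes a hypothetical asset path (`assets/audio/tinnitus_tone.mp3`); the app's actual audio layering is more involved:

```dart
import 'package:just_audio/just_audio.dart';

/// Plays a looping high-pitched tone to approximate tinnitus.
/// The asset path below is a placeholder, not the app's real asset.
Future<AudioPlayer> startTinnitusSimulation() async {
  final player = AudioPlayer();
  await player.setAsset('assets/audio/tinnitus_tone.mp3');
  await player.setLoopMode(LoopMode.one); // repeat indefinitely
  await player.setVolume(0.4);            // audible but hearing-safe
  player.play();                          // runs until stop() is called
  return player;
}
```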
We also realized just how powerful AI has become as a creative partner—not just for code generation, but also for debugging, design ideation, and content creation. Instead of opening 10 tabs for research, we centralized our workflow with AI tools, which dramatically accelerated development without compromising quality.
Most importantly, we saw how tech—when paired with the right mindset—can bridge the empathy gap and help build experiences that are inclusive, educational, and impactful.
What's next
With more time, we’d refine the avatars’ gestures and body language to feel more natural and expressive through improved prompt engineering.
We’d love to add interactive activities for each impairment—simple tasks users typically perform on their phones, like filling out a form, reading a message, or navigating a menu. These simulations would help users truly understand the everyday challenges faced by people with sensory impairments.
We also aim to build tools that let designers and researchers evaluate their own designs under these conditions, promoting more inclusive and accessible products.
Finally, we hope to add compatibility with AR headsets (like Apple Vision Pro or HoloLens) for immersive empathy training.
Where to try it out
Mobile App: https://github.com/Wiza-Munthali/sensory_sim
Backend API: Request access from Wiza (for security reasons)










