My team and I worked with VW's Future Center Europe for 3 years, developing numerous interaction concepts and prototypes around shared autonomous vehicles.
I was the interaction design lead and team lead for these projects; the team included 3 technologists and 1 other designer.
With an eye on inclusivity, we focused most of our concepts on people with vision impairments, believing that these vehicles can offer the freedom of mobility to the people who need it most.
There were 6 phases of work, covering different parts of the experience of being driven in an autonomous vehicle. The example below is from Phase 5 of my work, which concentrated on a critical experiential gap that needed to be thought through and designed for.
Phase 5: Audio wayfinding for people with vision impairments getting to a driverless car (2019)
Finding the right taxi or vehicle is an incredibly difficult task for people with vision impairments. They face issues of orientation, recognition, navigation to the vehicle, and even understanding the vehicle itself (How do I open the door? Which door should I head to?). Something people without impairments take for granted can be incredibly tough, compounded further by possible poor behaviour from drivers.
The video below showcases some of the issues faced by VI users.
Problems of finding and getting to a vehicle, explained by the VI people we spoke to.
We wanted to build a technological solution for wayfinding: helping people find the vehicle, so that the autonomous vehicle, which can offer independence to people unable to drive, can also act as a wayfinding beacon. Our solution used audio, voice and haptic feedback, delivered through interactions on the user's mobile device, Bluetooth beacons and speakers on the vehicle. We worked by constantly testing our ideas and prototypes in real-world contexts, gathering feedback, and tying everything together into a coherent multi-modal interaction framework.
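To give a flavour of the kind of logic behind the beacon-driven cues, here is a minimal sketch, not the shipped prototype: it assumes a standard log-distance path-loss model for Bluetooth RSSI, and the transmit power, distance thresholds and cue names are all placeholder assumptions.

```python
# A rough sketch of beacon-driven proximity cues. The transmit power,
# path-loss exponent, thresholds and cue names are placeholder assumptions,
# not values from the actual prototype.

TX_POWER_DBM = -59   # assumed RSSI measured at 1 m from the beacon
PATH_LOSS_N = 2.0    # assumed path-loss exponent (free space ~2)

def estimate_distance_m(rssi_dbm: float) -> float:
    """Rough distance estimate from a Bluetooth beacon's RSSI,
    using a log-distance path-loss model."""
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_N))

def choose_cue(distance_m: float) -> dict:
    """Map estimated distance to a multi-modal cue: sparse audio far away,
    denser haptics and speech as the user gets close to the car."""
    if distance_m > 20:
        return {"audio": "slow ping", "haptic": None, "speech": None}
    if distance_m > 5:
        return {"audio": "faster ping", "haptic": "light pulse", "speech": None}
    return {"audio": "continuous tone", "haptic": "strong pulse",
            "speech": "The car is a few steps ahead; the door handle is on your right."}

if __name__ == "__main__":
    for rssi in (-85, -70, -55):
        d = estimate_distance_m(rssi)
        print(f"RSSI {rssi} dBm -> ~{d:.1f} m -> {choose_cue(d)}")
```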
The entire experience we prototyped can be seen in the video below.
Our prototype concept, along with some context and words from the people we designed it for.
A walkthrough of the experience seen in the video, broken down into 4 principal interactive elements.
Some process & thoughts over 3 months
Prototyping with sensors & computer vision to accurately position the vehicle and the user in context (a small illustrative sketch of the positioning maths follows below).
UX framework for the entire experience, looking at all the modes of interaction - voice, audio icons, speech and phone UI.
Prototype built and shipped to VW, for use in experiential installations.
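As a rough illustration of the positioning work mentioned above, here is a minimal sketch of the sensor side only (GPS fixes plus the phone's compass heading); the computer-vision refinement is not shown, and the coordinates, heading and wording of the spoken instruction are assumptions for illustration.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (degrees) from the
    user's GPS fix to the vehicle's GPS fix, via the haversine formula."""
    R = 6371000.0  # Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

def spoken_direction(bearing_deg, heading_deg):
    """Turn the difference between the bearing to the car and the phone's
    compass heading into a simple spoken instruction."""
    diff = (bearing_deg - heading_deg + 540) % 360 - 180  # range -180..180
    if abs(diff) < 20:
        return "The car is straight ahead."
    side = "right" if diff > 0 else "left"
    return f"Turn about {abs(round(diff))} degrees to your {side}."

if __name__ == "__main__":
    # Made-up coordinates and heading, purely for illustration.
    dist, bearing = distance_and_bearing(52.5163, 13.3777, 52.5165, 13.3781)
    print(f"~{dist:.0f} m away; bearing {bearing:.0f} degrees")
    print(spoken_direction(bearing, heading_deg=10.0))
```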
The design of our wayfinding system was patented in 2020 - LINK