Most of my work is confidential, but some recent work can be seen on the Google Pixel Fold, especially around Postures and dual screen translate.
I was the Android design lead for its dual screen translate feature, released at Google I/O 2023. We took the concept from early stages to a feature-readiness prototype in four months, interfacing across the Android, Translate and hardware teams.
The feature is launched via Google Assistant's interpreter mode and by holding up the Fold in certain postures. Our aim was to use the Pixel Fold's unique hardware capabilities to aid interpersonal communication across language divides, with minimal user effort.
These postures and their interactivity were defined in consultation with ML and Google Translate feature experts.
Constant testing via prototyping to answer key questions around postural use, privacy and trust.
Definition of multimodal micro-interactions to build trust in, and understanding of, the feature.
Launched at Google I/O 2023.
‘Postures’ is a set of experiments using the onboard sensors (e.g. accelerometers) in the Pixel Fold to initiate novel interactions based on the device’s positioning and hinge-flex angle. The interactions follow a flow: understanding context > invoking personalised UI via sensor data > transitioning meaningfully between interfaces.
An example, seen below, shows the Fold used as a bedtime companion…
A postural language could be constructed, explored through prototypes across multiple contexts such as home, office and cooking, helping people interact with the device through physicality.
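To make the sensor-to-posture idea concrete, here is a minimal sketch of how hinge-angle data might be mapped to named postures. The thresholds, posture names and orientation heuristic are illustrative assumptions, not the Pixel Fold's actual logic; on Android, a real implementation would read the hinge angle from `Sensor.TYPE_HINGE_ANGLE` or Jetpack WindowManager's `FoldingFeature`.

```java
// Hypothetical sketch: classifying a foldable "posture" from a hinge angle.
// Thresholds and posture names are illustrative, not Pixel Fold internals.
public class PostureClassifier {

    public enum Posture { CLOSED, TABLETOP, BOOK, FLAT }

    /**
     * Classify posture from the hinge angle in degrees (0 = fully closed,
     * 180 = fully open) and whether the fold axis is horizontal (landscape).
     */
    public static Posture classify(float hingeAngleDegrees, boolean landscape) {
        if (hingeAngleDegrees < 10f)  return Posture.CLOSED;
        if (hingeAngleDegrees > 170f) return Posture.FLAT;
        // A partially open hinge reads as "tabletop" when the fold axis is
        // horizontal, and as "book" when it is vertical.
        return landscape ? Posture.TABLETOP : Posture.BOOK;
    }

    public static void main(String[] args) {
        System.out.println(classify(5f, true));    // CLOSED
        System.out.println(classify(90f, true));   // TABLETOP
        System.out.println(classify(90f, false));  // BOOK
        System.out.println(classify(175f, false)); // FLAT
    }
}
```

In practice the classifier would also debounce readings and combine accelerometer orientation with the hinge angle, so that transient angles during folding do not trigger spurious UI changes.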