The UX of a Conversation

TEAM: Daniela Navaes | Patrick Bull | Ivy Wong | Reina Yuan

For the second and third weeks, the brief was: the user experience of a conversation. What constitutes a conversation? What are the entities and elements involved? Is it possible to have a conversation between a human and a non-human entity?

The goal was to design a conversation between a human and an artificially intelligent system.

Project Direction

We decided to explore the design of a non-verbal conversation. The UX/UI community is very excited about voice-based assistants, since being able to speak to a machine by voice alone is a very attractive idea. But what about those who cannot hear and/or speak?

The deaf community is somewhat invisible in society. Since deafness is less outwardly apparent than blindness, for example, deaf people are also easily overlooked.

For that reason, we asked ourselves: what if machines could learn to read and interpret body language and sign language to anticipate someone’s needs? We read several articles on the topic and found that many technologies enabling this are constantly being developed and tested.

A Highly Proactive Smart Home  

We decided to explore the concept of how a smart home could interact with a person who is deaf/mute. The house would be proactive, suggesting and making changes to the environment according to what it interpreted from its human.

Research Methods

We used two distinct research methods: AEIOU and Speed Dating.

The AEIOU framework consists of listing the following elements for the research subject: Activities, Environment, Interactions, Objects, and Users.

As for the Speed Dating method, we drew storyboards for a series of interactions between the user and our system and showed them to our colleagues in order to get feedback on the interaction – criticism, suggestions for improvement, and comments on language, realism, relatability, etc.



To get a real user perspective on the experience of deafness, we contacted a few organisations, of which only DeafPlus got back to us and arranged a meeting for the following week. Although the meeting took place during the second week of the project (the design week), it gave us valuable insights that helped with the final outcome.

Research Findings
After presenting our findings, we got feedback that even though we had done a lot of research about interaction between deaf people and AI, our results were vague and largely based on assumptions rather than real data.

Final Design

We then deliberated and decided to change direction: we would design a (still) wordless conversation between someone who has trouble sleeping and their smart bedroom.

For that we ran short interviews with college students about their sleeping habits and came up with the final concept: an app that would connect with other apps in order to act as a sleep assistant. The conversation happens in two stages: first between the user and their phone, through a chatbot that defines their preferences, and later through gestures only.


The biggest learning outcome from this project is that it’s very easy to make assumptions about how someone with a disability might experience the world, especially when we are fully able-bodied. Although the intention is good, the road to truly empathetic design requires much more than reading articles online and conducting interviews. We also learned that the deaf community sees itself more as a culture than as a group of disabled people. As for the final design outcome, we were satisfied with it, especially considering it was all done in two days (new research and design).
