Is Artificial Intelligence really smart?

These days, Artificial Intelligence (AI) is present in many moments of our day. Take the self-checkout counters at the supermarket, for example. They’re fast, convenient, practical. But they’re also a perfect example of how it’s likely to be a long time before we can fully rely on them for shopping. Although the experience runs smoothly most of the time, the machines often make mistakes, and there is always a human on hand to correct them. I’d venture that the people watching over the self-checkout machines are there not only because the company is legally required to keep a certain number of employees. Maybe supermarket employees are always observing the machines because they expect them to make mistakes.

The above scenario is a brief picture of our current relationship with AI: we know it exists and roughly what it can do, but we don’t expect it to solve all our problems. In fact, we expect it to create problems at some point, and that’s why we know we need to stay vigilant.

We don’t feel safe enough yet to give machines full control. One might argue that this is a healthy relationship, although a one-sided one. One party knows that the other has severe limitations, but still chooses to engage, not expecting it to be the bearer of their happiness. The other can only read zeroes and ones. And that is probably okay. Relationships are never 100% equal.

Of course, there are far more intelligent AI systems. Voice-based assistants like Alexa, for example, can recognise human speech, cope with accents (sometimes), and respond accordingly to many commands. That requires an enormous amount of intelligence: quickly searching a massive amount of data, interpreting it, and giving back an appropriate response in seconds (or an inappropriate one, like a laugh or a cheeky joke).

But what Alexa fails to understand is human complexity, the nuances of speech, and implied meaning. It can give me the weather report if I say “Alexa, tell me the weather”. But I wonder if it would understand if I said “Alexa, should I wear a hat outside today?”. Is Alexa capable of understanding my question as a desire to know whether it’s going to be sunny and hot enough to make me want to wear a hat?

[Screenshot: siri 1 — Siri’s response to the hat question]

I did a little experiment with Siri. Yes, it interpreted my question as a desire to know the weather, but it didn’t really answer my question. It is sunny, but not hot enough to wear a hat, as one will not get sunburnt at 10 degrees. One might say it’s cold enough to wear a hoodie or a beanie, but that’s not really what I asked.
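To make that limitation concrete, here is a toy sketch in Python. It is entirely my own invention and nothing like how Siri is actually built; it simply shows the shape of the problem: a keyword match can route my question to a generic “weather” intent, but the decision I actually wanted (is it warm enough for a hat?) lives outside anything the matcher represents.

    # Toy illustration only: a keyword-based "assistant" that reduces any
    # question containing weather-related words to a single generic intent.
    WEATHER_KEYWORDS = {"weather", "hat", "sunny", "rain", "temperature"}

    def classify_intent(question: str) -> str:
        words = set(question.lower().strip("?!.").split())
        if words & WEATHER_KEYWORDS:
            return "get_weather"   # the nuance of the question is discarded here
        return "web_search"        # anything unrecognised falls back to a search

    def respond(question: str, forecast_celsius: int = 10, sunny: bool = True) -> str:
        if classify_intent(question) == "get_weather":
            # Reports the forecast instead of deciding whether the
            # forecast actually calls for a hat.
            sky = "sunny" if sunny else "cloudy"
            return f"It is {sky} and {forecast_celsius} degrees."
        return "Here is what I found on the web..."

    print(respond("Should I wear a hat outside today?"))
    # -> "It is sunny and 10 degrees."  True, but not an answer to the question.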

Here’s what I got after asking Siri if I should wear a beanie today… 

[Screenshot: siri 3 — Siri’s response to the beanie question]

This is another great example of how important it is that we recognise the limitations of AI and keep our expectations in check. First of all, it didn’t hear “beanie” but “bikini”. Second, I’m shocked that on my second try Siri pointed me to articles about body insecurity, which is something I wasn’t even thinking about. Not only did it make a mistake in recognising my words, it suggested content that could be damaging to one’s self-esteem and body image. I’m aware, however, that this was not a conscious choice by Siri. It simply searched the question on Google and showed me the most popular results.

Also, do we really want to establish that kind of codependent relationship with our voice-based assistants? Is it healthy to expect them to become more and more intelligent in order to fulfil our every need? Isn’t it better to ask a friend those kinds of questions? Are we really that lonely?
