“HAL is the pop-culture example that we most often go to when we think about artificial intelligence and our relationship to artificial intelligence,” says Stephen D. As the malevolent spacecraft antagonist of *2001: A Space Odyssey*, HAL had an ominous quality that conveyed a sense of danger and evil despite sounding good-natured and chipper. He sent a chill down the spines of audiences, making them ponder humanity’s relationship to technology at a time when computers were still mysterious. All this, despite having no physical presence onscreen beyond a red light. “So it’s an inadvertent Canadian element in artificial intelligence.”

In films like these, the voice is a crucial vehicle through which robots express a persona. Smart assistant developers adopted this idea of building persona through voice after recognising the value of getting consumers to identify with their products.

Apple’s Siri (2010), Microsoft’s Cortana (2014), Amazon’s Echo (2015) and Google Assistant (2016) were all introduced with female voice actors. Big tech companies strategically selected these female voices to create positive associations. They were the antithesis of the menacing male and monstrous mother cinematic robot archetypes.

But while these friendly voices could steer consumers away from thinking of smart assistants as dangerous surveillant machines, the use of female-by-default voices has been criticised. Smart assistants have been described as “wife replacements” and “domestic servants”. Even UNESCO has warned that smart assistants risk entrenching gender bias.

Perhaps it is for this reason that the newest smart voice is the BBC’s Beeb, which speaks with a male northern English accent. Its designers say this accent makes their robot more human-like. It also echoes traditional media practices that use the masculine voice of authority.

Smart assistants are programmed to be culturally competent in their relevant market: the Australian version of Google Assistant knows about pavlova and galahs, and uses Australian slang expressions. Gentle humour, too, plays a significant role in humanising the artificial intelligence behind these devices. When asked, “Alexa, are you dangerous?”, she replies calmly, “No, I am not dangerous.”

Smart assistants resemble the humanoid robots of latter-day pop culture – sometimes nearly indistinguishable from humans themselves. With voices that are apparently natural, transparent and depoliticised, the assistants give only one brief answer to each question and draw these responses from a small range of sources. This gives the tech companies significant “soft power” in their potential to influence consumers’ feelings, thoughts and behaviour.

Smart assistants may soon play an even more intrusive role in our everyday affairs. Google’s experimental technology Duplex, for instance, allows users to ask the assistant to make phone calls on their behalf to perform tasks such as booking a hair appointment.

By positioning smart assistants as innocuous through their voice characteristics – far from the menacing males and monstrous mothers of the cinema screen – consumers can be lulled into a false sense of security. If it/she can pass as “human”, this might further risk manipulating consumers and obscuring the implications of surveillance, soft power and global monopoly.