Our prejudices must be very deeply ingrained if we even stereotype robots. From an interesting new paper:
Previous research on gender effects in robots has largely ignored the role of facial cues. We fill this gap in the literature by experimentally investigating the effects of facial gender cues on stereotypical trait and application ascriptions to robots. As predicted, the short-haired male robot was perceived as more agentic than was the long-haired female robot, whereas the female robot was perceived as more communal than was the male counterpart. Analogously, stereotypically male tasks were perceived more suitable for the male robot, relative to the female robot, and vice versa. Taken together, our findings demonstrate that gender stereotypes, which typically bias social perceptions of humans, are even applied to robots. (source, source)
If we can’t even manage to treat inanimate robots without sexism and prejudice, what hope is there for our fellow human beings of the other gender?
Interestingly, the complaint seems to go both ways. Robots, in the general sense of the word, have been known to exhibit sexism. Siri and Google, for example, are said to favor “male terms” and solutions when autocorrecting or suggesting phrases. Some examples:
Obviously, prejudice in robots and in software, to the extent that it exists, only reflects the prejudice of their makers.
More posts in this series are here.