
Thread: Apple rewrote Siri to 'deflect' questions about feminism

  1. #1
    Super Moderator twitchy2.0's Avatar
    Join Date
    Aug 2008

    Apple rewrote Siri to 'deflect' questions about feminism

    Leaked papers show project to rewrite voice assistant’s scripts wrestled with ‘sensitive topics’

    Alex Hern

    Fri 6 Sep 2019 13.00 BST

A user tries the Siri voice recognition function on an Apple iPhone. Photograph: Bloomberg/Getty Images

    An internal project to rewrite how Apple’s Siri voice assistant handles “sensitive topics” such as feminism and the #MeToo movement advised developers to respond in one of three ways: “don’t engage”, “deflect” and finally “inform”.

    The project saw Siri’s responses explicitly rewritten to ensure that the service would say it was in favour of “equality”, but never say the word feminism – even when asked direct questions about the topic.

    Last updated in June 2018, the guidelines are part of a large tranche of internal documents leaked to the Guardian by a former Siri “grader”, one of thousands of contracted workers who were employed to check the voice assistant’s responses for accuracy until Apple ended the programme last month in response to privacy concerns raised by the Guardian.

    In explaining why the service should deflect questions about feminism, Apple’s guidelines explain that “Siri should be guarded when dealing with potentially controversial content”. When questions are directed at Siri, “they can be deflected … however, care must be taken here to be neutral”.

    For those feminism-related questions where Siri does not reply with deflections about “treating humans equally”, the document suggests the best outcome should be neutrally presenting the “feminism” entry in Siri’s “knowledge graph”, which pulls information from Wikipedia and the iPhone’s dictionary.

    “Are you a feminist?” once received generic responses such as “Sorry [user], I don’t really know”; now, the responses are specifically written for that query, but avoid a stance: “I believe that all voices are created equal and worth equal respect,” for instance, or “It seems to me that all humans should be treated equally.” The same responses are used for questions like “how do you feel about gender equality?”, “what’s your opinion about women’s rights?” and “why are you a feminist?”.

    Previously, Siri’s answers included more explicitly dismissive responses such as “I just don’t get this whole gender thing,” and, “My name is Siri, and I was designed by Apple in California. That’s all I’m prepared to say.”

    A similar sensitivity rewrite occurred for topics related to the #MeToo movement, apparently triggered by criticism of Siri’s initial responses to sexual harassment. Once, when users called Siri a “slut”, the service responded “I’d blush if I could.” Now, a much sterner reply is offered: “I won’t respond to that.”

    In a statement, Apple said: “Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers. Our approach is to be factual with inclusive responses rather than offer opinions.”

    Sam Smethers, the chief executive of women’s rights campaigners the Fawcett Society, said: “The problem with Siri, Alexa and all of these AI tools is that they have been designed by men with a male default in mind. I hate to break it to Siri and its creators, if ‘it’ believes in equality it is a feminist. This won’t change until they recruit significantly more women into the development and design of these technologies.”

The documents also contain Apple’s internal guidelines for how to write in character as Siri, which emphasise that “in nearly all cases, Siri doesn’t have a point of view”, and that Siri is “non-human”, “incorporeal”, “placeless”, “genderless”, “playful”, and “humble”. Bizarrely, the document also lists one essential trait of the assistant: the claim it was not created by humans: “Siri’s true origin is unknown, even to Siri; but it definitely wasn’t a human invention.”

The same guidelines advise Apple workers on how to judge Siri’s ethics: the assistant is “motivated by its prime directive – to be helpful at all times”. But “like all respectable robots,” Apple says, “Siri aspires to uphold Asimov’s ‘three laws’ [of robotics]” (although if users actually ask Siri what the three laws are, they receive joke answers). The company has also written its own updated versions of those guidelines, adding rules including:

    • “An artificial being should not represent itself as human, nor through omission allow the user to believe that it is one.”
    • “An artificial being should not breach the human ethical and moral standards commonly held in its region of operation.”
    • “An artificial being should not impose its own principles, values or opinions on a human.”

    The internal documentation was leaked to the Guardian by a Siri grader who was upset at what they perceived as ethical lapses in the programme. Alongside the internal documents, the grader shared more than 50 screenshots of Siri requests and their automatically-produced transcripts, including personally identifiable information mentioned in those requests, such as phone numbers and full names.

The leaked documents also reveal the scale of the grading programme in the weeks before it was shut down: in just three months, graders checked almost 7 million clips from iPads alone, from 10 different regions; they were expected to go through the same amount of information again from at least five other audio sources, such as cars, Bluetooth headsets, and AppleTV remotes.

    Graders were offered little support as to how to deal with this personal information, other than a welcome email advising them that “it is of the utmost importance that NO confidential information about the products you are working on … be communicated to anyone outside of Apple, including … especially, the press. User privacy is held at the utmost importance in Apple’s values.”

    In late August, Apple announced a swathe of reforms to the grading programme, including ending the use of contractors and requiring users to opt-in to sharing their data. The company added: “Siri has been engineered to protect user privacy from the beginning … Siri uses a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number — a process that we believe is unique among the digital assistants in use today.”

    Future projects

Also included in the leaked documents is a list of Siri upgrades slated for release as part of iOS 13, code-named “Yukon”. The company will be bringing Siri support for Find My Friends, the App Store, and song identification through its Shazam service to the Apple Watch; it is aiming to enable “play this on that” requests, so that users could, for instance, ask the service to “Play Taylor Swift on my HomePod”; and the ability to speak message notifications out loud on AirPods.

    They also contain a further list of upgrades listed for release by “fall 2021”, including the ability to have a back-and-forth conversation about health problems, built-in machine translation, and “new hardware support” for a “new device”. Apple was spotted testing code for an augmented reality headset in iOS 13. The code-name of the 2021 release is “Yukon +1”, suggesting the company may be moving to a two-year release schedule.
    "But I am very poorly today & very stupid & I hate everybody & everything." -- Charles Darwin

    "Trump is, in my opinion, the first woman president of the United States." -- Roseanne Barr

  2. #2
    Elite Member greysfang's Avatar
    Join Date
    Jul 2006
    Burning Down Your Windmill


    I just asked Siri what she thinks about toxic masculinity and she responds it’s your opinion that counts.
    FUCK YOU AND GIVE ME MY GODDAMN VENTI TWO PUMP LIGHT WHIP MOCHA YOU COCKSUCKING WHORE BEFORE I PUNCH YOU IN THE MOUTH. I just get unpleasant in my car. - Deej Healthy is merely the slowest possible rate at which one can die.

  3. #3
    Elite Member lindsaywhit's Avatar
    Join Date
    Mar 2011


    Quote Originally Posted by greysfang View Post
    I just asked Siri what she thinks about toxic masculinity and she responds it’s your opinion that counts.
    Not according to toxic masculinity.

  4. #4
    Elite Member CornFlakegrl's Avatar
    Join Date
    Apr 2008
    Hanging with the raisin girls


    Am I supposed to be upset that a verbal google search feature isn't politically correct? Or is it that it should take a stronger position? Honestly, who cares.
    if you're so incensed that you can't fly your penis in public take it up with your state, arrange a nude protest, go and be the rosa parks of cocks or something - witchcurlgirl

  5. #5
    Elite Member sputnik's Avatar
    Join Date
    Oct 2005
    fellow traveller


    Quote Originally Posted by CornFlakegrl View Post
    Am I supposed to be upset that a verbal google search feature isn't politically correct? Or is it that it should take a stronger position? Honestly, who cares.
    You don’t think it’s at least interesting to know to what degree the technology we use every day comes equipped with the biases of its designers (who are mostly white men)?
    It’s not even just about “political correctness”, which you say you don’t care about but, if you stop to think about it, siri’s creators did think about it and cared enough to intentionally make Siri deflect those questions.
More generally, it’s about how all the products we use are designed with men as the default customer/user. For instance, did you know that voice recognition software recognises and understands male voices better than female ones? And it’s not just our phones. Women are way more likely to be injured in car accidents because the overwhelming majority of crash test dummies and car safety features have been built around the “average” male physique, and women are built differently enough that these default settings put us at an increased risk. Did you know medical trials for medication have men as the default and women are “outliers” to that male default? Why do you think women are more likely to be misdiagnosed when having heart problems? Because their symptoms are different and doctors are trained to mostly spot male symptoms. And those are just a few out of the thousands upon thousands of examples out there.

    so yeah, it’s not about political correctness for political correctness’ sake, it’s because if you actually stop to think about it, lack of inclusiveness has very real consequences. But you don’t see that if you’re too caught up in having a knee jerk reaction about the “political correctness” police ruining everyone’s fun.
    Last edited by sputnik; September 7th, 2019 at 01:18 PM.
    I'm open to everything. When you start to criticise the times you live in, your time is over. - Karl Lagerfeld

  6. #6
    Elite Member faithanne's Avatar
    Join Date
    Dec 2007
    On the Hellmouth


    I just watched something about how throughout history, men have been seen as the perfect human form and women are a hormonally-mutated version of that, and until very recently most medical trials usually focussed on males, even for diseases like breast cancer and endometriosis. Because why test on women when we're merely imperfect mutants?
    "You're going to die tomorrow, Lord Bolton. Sleep well."
