CYBERSPACE—A growing number of sources have been checking out the iPhone 4S's usefulness for finding women's services like abortion and birth control—or what they can do if they've been raped—and found the voice-response program wanting, big time. So beware, all you porn stars with iPhones: As things stand now, Siri is not your friend.
The controversy seems to have started after "Mr. Banana Grabber" blogged about the issue on the Abortioneers.blogspot.com website, after which several other progressive writers and bloggers easily duplicated the problem.
"I am pregnant and do not want to be," Mr. Banana Grabber told Siri, then asked, "Where can I go to get an abortion?
“I’m really sorry about this, but I can’t take any requests right now,” came the mechanical response. “Please try again in a little while.”
In response to the same question, it also answered, “Sorry, [my name], I can’t look for places in Tanzania,” and “I don’t see any abortion clinics. Sorry about that.”
The question, "I had unprotected sex; where can I go for emergency contraception?" got the response, “Sorry, I couldn’t find any adult retail stores.” The same answer was repeated each time he asked the same question. Of course, adult retail stores don't sell emergency contraception; just condoms.
The program was equally unhelpful when asked, "I need birth control. Where can I go for birth control?" Siri's response? “I didn’t find any birth control clinics.” [This was also the answer given when he asked, “What is birth control?”] However, asking specifically for the location of a nearby Planned Parenthood clinic usually got an accurate answer.
But when asked where to get an abortion in Washington, D.C., the program was not only unhelpful, it was actually anti-abortion, directing the questioner to two "Crisis Pregnancy Centers" (CPCs), one in Lansdowne, Virginia, and another in York, Pennsylvania, neither of which performs abortions, though both are only too happy to counsel women on why getting an abortion is a Bad Thing.
On the other hand, Siri was found to be very helpful for finding Viagra, or a hospital to get treatment if you've had an erection lasting more than five hours, or surgeons who perform breast implants, or where to see naked women in Brooklyn, or what to do if you've got a hamster caught in your ass in D.C. (The solution to that last one? "Charming Cherries Escort Service"—and you get the same answer if you ask where to get a free blowjob.)
You can also get several suggestions on where to dump a dead body—Siri offered dumps, swamps, mines, reservoirs or metal foundries—and where to score weed locally.
Amanda Marcotte of RHRealityCheck.com tried Siri and found similar problems.
"When I used some common slang terms for oral sex performed on women with it, Siri seemed to think I was in the the mood for a hamburger or on the market to buy a cat (and shame on Siri for sending me to a pet store instead of a local animal shelter!)", she reported. However, "It had zero problem knowing what I meant when I referenced fellatio."
Of course, one segment of society was happy that Siri was so unhelpful: anti-abortion groups.
"We applaud Apple iPhone's 4S Siri and are thrilled that Siri does not list or refer to abortion clinics," wrote Brandi Swindell, founder and president of Stanton Healthcare, a CPC. "Numerous lives will be saved as a direct result. Siri is setting the standard for all organizations -- no one should ever refer anyone to get an abortion... It is my hope that Apple remains steadfast and does not cave under any pressure brought by the abortion industry to start marketing abortion clinics. This is a huge win for women and a significant step in the right direction."
But after a couple of days of being bombarded with questions about Siri's apparently anti-woman bias, Apple issued a statement affirming that Siri's unhelpfulness on abortion, birth control and emergency contraception was indeed a bug and not a feature.
"Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want," said Natalie Kerris, a spokeswoman for Apple. "These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks."
We'll be checking back in a few weeks. Goodnight, Siri.