Apple Fixes “Asian” Adult Content Filter, but We Need More


Photo: Victoria Song/Gizmodo

In an ideal world, I wouldn't have had to write a story about how Apple's adult content filter on iOS blocks users from searching the word "Asian" in any browser. In that world, I also wouldn't have to write this follow-up: the issue is now fixed, but the timing, shortly after a mass shooting in Atlanta that left eight people dead, six of whom were Asian women, is deeply problematic.

According to Mashable, the latest iOS 14.5 beta removes the "Asian" filter. Gizmodo has independently confirmed that this is the case. This is objectively a good thing. What's not good is that this issue, according to Mashable, was reported to Apple on Dec. 7, 2019, by iOS developer Steven Shen. After more than a year of inaction, Shen took to Twitter in early February to raise public awareness of the issue, which Gizmodo and other news media covered. Gizmodo's requests to Apple at the time for a statement went unanswered. Shen also reportedly told Mashable that while Apple never officially responded to his initial warning, an Apple employee did verify the problem on Twitter and filed the issue internally.

What this means is that this filter was in place before the first public reports of covid-19 in Wuhan, China. It means it was brought to Apple's attention well before the phrases "kung flu" or "China virus" were ever uttered. It was in place as the AAPI community tried to raise awareness of a spike in hate crimes fueled by the pandemic. It was functional well after the issue finally started picking up steam in the media following the killing of Vicha Ratanapakdee, an 84-year-old Thai immigrant who was murdered in San Francisco, a city about 50 minutes from Cupertino. It is still technically functional today, a day when yet another elderly Asian-American woman was attacked in New York City as bystanders did nothing. You need to have downloaded the iOS 14.5 beta, after all, to get the fix.

Since publicly rolling out in September, iOS 14 has received several updates; since the issue gained public attention, there have been two. It's still unclear how exactly Apple's adult filters were determined, and whether there was human oversight or if this is an example of AI's inherent weaknesses in tagging and filtering content. Apple has yet to comment on or address why "Asian" was the sole racial search term filtered for adult content, or whether fixing this issue was a priority once the company became aware of it. Perhaps this is an extremely difficult thing to fix, requiring several of Apple's most brilliant minds, and the timing was unfortunate given the uptick in anti-Asian hate crimes. I don't know. I've reached out to Apple for more context and a statement but have yet to hear back. That said, my gut feeling is that this isn't a difficult task. It just wasn't deemed a particularly important or urgent one.

I wish I could say that, as an Asian-American tech journalist who frequently reviews Apple products, the thought makes me angry. The reality is I am just so sad, and so tired.

It especially hurts in the wake of Atlanta because, as Gizmodo originally reported, this filter wasn't even effective. If the intent was to block inappropriate content, it could be easily outsmarted with a few workarounds. You could search "Japanese schoolgirls" and see several images that parents would object to. But searching "Asian-American history" or "Stop Asian Hate" would turn up a message that you were trying to access inappropriate content. Ostensibly, this filter was meant to protect children from seeing pornography. What I can't stop thinking about is how a child could innocently search for facts about an Asian elephant and then see a message that plants the idea that somehow anything Asian is adult content.
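To be clear, Apple hasn't said how the filter actually worked. But the behavior described above is consistent with a naive keyword blocklist, and a purely hypothetical sketch in Python (the term list and matching logic here are my assumptions, not Apple's code) reproduces both failure modes: flagging innocent searches that happen to contain a blocked word, while letting trivially rephrased ones sail through.

    # Hypothetical sketch of a single-term substring blocklist.
    # This is NOT Apple's implementation; it only illustrates why
    # blunt keyword matching overblocks and underblocks at once.
    BLOCKED_TERMS = {"asian"}  # assumed blocklist for illustration

    def is_blocked(query: str) -> bool:
        """Flag a search query if it contains any blocked term."""
        q = query.lower()
        return any(term in q for term in BLOCKED_TERMS)

    # Overblocking: innocent searches get flagged.
    print(is_blocked("Asian elephant facts"))       # True
    print(is_blocked("Stop Asian Hate"))            # True
    print(is_blocked("Asian-American history"))     # True

    # Underblocking: a rephrased search slips right past.
    print(is_blocked("Japanese schoolgirls"))       # False

A filter this crude blocks by vocabulary, not by content, which is exactly the pattern Gizmodo observed.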

This is most certainly not what Apple intended, but that doesn't erase the sting. It doesn't change the fact that the way Apple handled this mirrors how Asian-American pleas to "Stop Asian Hate" went unheard for over a year. It underscores just how normal it is to hypersexualize Asian women, which, to be clear, is both racist and misogynist. It only magnifies the normalized racism and misogyny of the Atlanta shooter, who targeted Asian massage parlors to remove the temptation of his alleged sex addiction.

What's done is done. Apple can't go back in time, wave a magic wand, and pretend this never happened. Apple should've fixed this issue when it was first raised, but saying so feels empty and hollow. Lots of people and companies in positions of power and influence should have and could have done something sooner. They didn't, and bemoaning that does nothing but rub salt in the wound.

The one thing Apple absolutely should not do is stay silent in the hope that quietly fixing the issue will limit how many people know it even happened. The AAPI community has been gaslit enough. Instead, Apple could own up to its mistake and outline how it intends to ensure this sort of thing never happens again. Perhaps I'm wrong, but this doesn't seem like a huge ask. Then again, neither did fixing the filter.




