Apple Censors URLs Containing “Asian” with Adult Filters

What AI and search trends can tell us about deep-rooted misogyny

A recent tweet by iOS developer Steven Shen has resurfaced problems with Apple's "Limit Adult Websites" feature, which blocks URLs associated with adult content while it is enabled. One unintended consequence is that any URL containing a keyword Apple has associated with sexual content, such as "Asian" or "teen," is caught by the filter. Other racial and ethnic terms are unaffected; "Black," "white," and "Arab," for example, can be searched without issue. The choice of censored keywords is questionable in its own right: what if a teenager were searching for "teen mental health support," or any of the myriad other queries that include such a broad term? This is likely an instance of artificial intelligence (AI) gone wrong, and it has been a problem since the feature debuted in 2019. Apple's silence speaks volumes about its complacency on the issue.
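
Apple's implementation is proprietary, but the reported behavior is consistent with a naive substring blocklist. The Python sketch below, using a purely hypothetical keyword list, shows how that kind of filter over-blocks legitimate pages:

```python
# A minimal sketch of substring-based URL filtering, assuming Apple's
# filter works roughly this way (its implementation is not public).
# BLOCKED_KEYWORDS is purely hypothetical, based on reported behavior.
BLOCKED_KEYWORDS = {"asian", "teen"}

def is_blocked(url: str) -> bool:
    """Block a URL if any blocklisted keyword appears anywhere in it."""
    lowered = url.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

# Legitimate pages are swept up because the match is a bare substring test:
print(is_blocked("https://example.org/teen-mental-health-support"))  # True
print(is_blocked("https://example.org/asian-american-history"))      # True
print(is_blocked("https://example.org/black-history-month"))         # False
```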

There have been several occasions where AI has reinforced cultural prejudice, the classic example being the word-embedding analogy "man is to computer programmer as woman is to homemaker." Modern AI is built on machine learning and neural networks, which draw connections from the data they are fed. Problems arise when that data carries biases, intentional or not, because a properly functioning model will faithfully reproduce the biases in the data from which it learned. AI is a powerful tool, but it has sharp limitations when used carelessly: fed garbage, such as incorrect information or biased data, it will output garbage drawn from that data. A viral example from 2016 is Tay, the Microsoft chatbot meant to mimic the tweeting habits of a teenage girl. Human engagement was the main source of the bot's learning, and it took less than 24 hours for it to start spewing slurs and racist conspiracy theories. These failures highlight the importance of identifying and mitigating bias in AI, as well as the risks of aggregating training data from unfiltered public interactions, considerations Apple clearly did not take seriously.
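
The "computer programmer / homemaker" analogy comes from research on biased word embeddings (Bolukbasi et al., 2016). A rough sketch of that probe, assuming gensim and its downloadable Google News word2vec model, looks like this:

```python
# A sketch of the embedding-analogy probe behind "man is to computer
# programmer as woman is to homemaker" (Bolukbasi et al., 2016).
# Assumes gensim is installed; loading the pretrained Google News
# word2vec model downloads roughly 1.6 GB on first use.
import gensim.downloader

vectors = gensim.downloader.load("word2vec-google-news-300")

# Analogy arithmetic: vector("computer_programmer") - vector("man")
# + vector("woman") ~= ? The nearest neighbors reflect whatever gender
# associations the underlying news corpus contained.
print(vectors.most_similar(
    positive=["woman", "computer_programmer"],
    negative=["man"],
    topn=3,
))  # Bolukbasi et al. report "homemaker" among the top results
```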

Apple's mistake, however, exhibits not only the limitations of AI but also another important issue: the fetishization and objectification of Asian people. That the word "Asian" was censored is no coincidence; it reflects browsing habits and the composition of online content. It is no secret that Asian women are treated as a post-colonial conquest by white men, prized for neotenous features and stereotyped as "submissive." These stereotypes extend to the LGBT+ Asian community, where femininity is projected onto Asians of all genders: gay Asian men are portrayed as submissive and asexual, and Asian lesbians face similar stereotypes. The fetishization of Asians is so prominent that "Asian," a qualifier for over half the world's population, became an automatically banned keyword. Nor is it just Apple's services that reveal the problem; Googling "Asian" alongside the name of any popular social media platform brings up pages that gained popularity by exploiting Asian women.

Fetishization is harmful for the obvious reasons: it leads to objectification, misogyny, and prejudice against marginalized genders, and those harms have tangible manifestations. The saturation of sexualized Asian content drowns out genuine spaces and accounts devoted to serious diasporic discussion, the kind that promotes activism within communities and celebrates Asian culture. Moreover, Asian women are the only group of women more likely to face violence from offenders outside their race than from within it, a symptom of rampant objectification and negative stereotyping.

Women hold up half the sky. Apple made a mistake that it continues to ignore, but that mistake is also a learning opportunity: search trends and AI can expose cultural and systemic prejudices that run deep in our society and are rarely discussed. Organizations such as Asian Dawn and NextShark report on current events in the Asian diaspora. The fetishization and dehumanization of Asian people is a serious problem, and the ease with which a simple Google search uncovers it only underscores its gravity.