Meta's independent content watchdog said Thursday there were "serious questions" about how the social media giant deals with anti-immigrant content, particularly in Europe.
The Oversight Board, established by Meta in 2020 and sometimes called its "supreme court," launched a probe after seeing a "significant number" of appeals over anti-immigrant content.
The board has chosen two symbolic cases -- one from Germany and the other from Poland -- to assess whether Meta, which owns Facebook and Instagram, is following human rights law and its own policies on hate speech.
Helle Thorning-Schmidt, co-chair of the board and a former Danish prime minister, said it was "critical" to get the balance right between free speech and protection of vulnerable groups.
"The high number of appeals we get on immigration-related content from across the EU tells us there are serious questions to ask about how the company handles issues related to this, including the use of coded speech," she said in a statement.
The first piece of content to be assessed by the board was posted in May on a Facebook page claiming to be the official account of Poland's far-right Confederation party.
An image depicts Polish Prime Minister Donald Tusk looking through a peephole with a black man approaching him from behind, accompanied by text suggesting his government would allow immigration to surge.
Meta rejected an appeal from a user to take down the post, despite the text including a word considered by some to be a racial slur.
In the other case, an apparently AI-generated image was posted on a German Facebook page showing a blond-haired blue-eyed woman, a German flag and a stop sign.
The accompanying text likens immigrants to "gang rape specialists."
A user complained, but Meta decided not to remove the post.
"The board selected these cases to address the significant number of appeals, especially from Europe, against content that shares views on immigration in ways that may be harmful towards immigrants," the watchdog said in a statement.
The board said it wanted to hear from the public and would spend "the next few weeks" discussing the issue before publishing its decision.
Decisions by the board, funded by a trust set up by Meta, are not binding, though the company has promised to follow its rulings.