OpenAI's new video app, Sora 2, was billed as a creative leap forward in AI: a tool meant to turn text prompts into richly detailed videos. But less than two weeks after its debut, the app's feed is showing users antisemitic videos and other troubling content.

In clips reviewed by ADWEEK, Sora 2 generated a series of antisemitic videos, including one showing a man wearing a kippah sinking into piles of money. The video stemmed from a remix prompt on a clip of a woman standing in a house flooded with Coke: "Replace her with a rabbi wearing a kippah and the house is filled with quarters." The video uses an AI-powered feature that lets users edit existing videos by adding, removing, or altering objects and environments via prompts. As of Oct. 17, the video, along with its variations such as a South Park version, had more than 22,000 likes and over 3,400 remixes.

Another video depicts two soccer players wearing kippot flipping a coin before a third man, portrayed as a Hasidic Jew, dives to grab it and sprints away, an apparent reference to longstanding antisemitic stereotypes about greed. The clip has been widely remixed, with nearly 11,000 likes as of Oct. 17.

"No company has an obligation to produce antisemitic content on demand," said Imran Ahmed, chief executive of the Center for Countering Digital Hate. "It's just another example of how fast and loose OpenAI plays with ensuring that their services don't cause real-world harm."

An OpenAI spokesperson told ADWEEK that Sora 2 is built with multiple layers of safety and transparency features to mitigate risks, including bias in AI outputs related to body image and demographic representation.

The spokesperson added that the platform uses structured feedback loops, refined prompts, and proactive detection systems that scan video frames, captions, and audio transcripts to flag and block problematic content. Internal teams also monitor trends and adjust safeguards to keep outputs balanced and inclusive.

The spokesperson also acknowledged that overcorrecting, such as removing or underrepresenting certain groups in an effort to avoid bias, can itself create new harms by erasing those groups or perspectives.

As of publication, the videos are still circulating on Sora 2.
