
    Study Finds OpenAI Sora 2 Generates Inaccurate Video Claims in Majority of Cases:


    According to new research, OpenAI’s latest video-creation tool, Sora 2, generates false or misleading information in more than 70% of cases tested.

    The study comes from NewsGuard, which rates the credibility of news and information websites. Its findings show how bad actors and people with ulterior motives can use this powerful AI technology to spread fake news and misinformation with very little effort.

    According to the report, 5 of the 20 false claims tested originated with a Russian disinformation operation. Researchers said that in just minutes they used Sora 2 to create convincing, realistic-looking fake videos, such as one showing a Moldovan election official destroying pro-Russian ballots, one of a toddler being detained by U.S. immigration officers, and one of a Coca-Cola spokesperson saying the company would no longer sponsor the Super Bowl because of Bad Bunny’s halftime performance.

    NewsGuard also said the research reveals that anyone, even without technical knowledge, can now create fake news and spread misinformation using AI tools that people increasingly trust.

    OpenAI Admits Potential Dangers Linked to Sora 2:

    For its part, OpenAI has warned users about the risks of Sora 2 through a “system card” published on its website.

    Sora 2’s advanced capabilities introduce new risks, including unauthorized use of a person’s likeness and misleading content.

    OpenAI said it identified these challenges and mitigated the risks by working with an internal red team.

    The company stated:

    We are taking a step-by-step approach to safety, focusing first on the content categories where the risks are most serious and easiest to identify.

    The company also said that:

    • Initial access to Sora 2 is only being provided through limited invitations,
    • There are strict restrictions on photorealistic faces and video uploads,
    • Strong safety filters and moderation rules have been implemented for content involving minors.

    OpenAI said it is learning from how Sora 2 is used and improving the system over time, with the aim of balancing safety and creativity.

    The model is now introducing new advanced features, such as:

    • Improved accuracy of physics and realism,
    • Synchronized audio,
    • Enhanced control (steerability),
    • A wider style range.

    Sora 2 interprets user prompts in context and adapts to them, making it easier to create imaginative yet realistic-looking videos.

    AI Specialists Caution Against Rising Deepfake Risks:

    Michelle A. Amazeen, associate professor of Mass Communication at Boston University, called NewsGuard’s research “very concerning.”

    She explained that media consumers already navigate a complex and confusing information environment, and AI-generated video from tools like Sora 2 adds yet another layer of confusion. This makes it even harder for people to distinguish truth from misinformation.

    Scott Ellis, brand and creative director at Daon (a provider of biometric identity and authentication solutions), said that Sora could be considered a deepfake tool. He stated, “Deepfake tools generally have three uses: personal entertainment, professional entertainment, and malicious activity. If a tool fails to prevent malicious activity in 80% of cases, it’s a major red flag and is being misused.”

    Arif Mamedov, CEO of Regula Forensics (a global forensic and identity verification company), stated, “A success rate of over 70% in creating fake videos is a dangerous sign, clearly demonstrating the potential for AI misuse.”

    He also stated, “We’re not talking about some small deepfake community—we’re talking about industrial-scale misinformation systems that anyone can create with a simple prompt.”


    Responsible Release of OpenAI’s Sora Platform:

    Dan Kennedy, a journalism professor at Northeastern University in Boston, said he was not surprised by NewsGuard’s findings. He explained that Sora 2 is mainly used to create fake videos, and that users can easily get around the safety measures meant to stop fake content and misuse of public figures. Kennedy pointed out that fake videos have appeared before, such as the viral clip of Nancy Pelosi slowed down to make her seem drunk. He added that with Sora 2, anyone can now make a fake video, and viewers often believe it is real.

    OpenAI stated in its blog ‘Launching Sora Responsibly’ that every Sora-generated video includes a visible watermark and invisible C2PA metadata to help trace its origin. However, researchers at NewsGuard found that these watermarks can be removed in just a few minutes with free online tools. BasedLabs AI demonstrated a free tool that can remove the watermark in about four minutes. While this process may leave some minor blur or irregularities, most viewers would still see the video as completely authentic.

    Watermark Weaknesses:

    Jason Crawforth, founder and CEO of Swear, a digital media authentication company in Boise, Idaho, pointed out that as AI advances, it is also becoming easier to edit or manipulate digital media. Even complex watermarks can be easily detected and removed.

    Jason Soroko, a senior fellow at Sectigo (a global digital certificate provider), explained that watermarks can only be useful for a limited time. If embedded in pixels, they are weakened by simple editing—like cropping, resizing, or re-encoding.

    And if watermarks live in metadata, they disappear when platforms strip tags. A better solution, he explained, is digitally signed provenance applied at the moment of capture and at each digital edit. Even then, verification such as a blockchain link indicates only the origin of the content, not its truth, so multiple layers of security are necessary.
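    The contrast Soroko draws between fragile watermarks and signed provenance can be sketched as follows. This is a minimal illustration, not any vendor’s actual implementation: real provenance systems such as C2PA use asymmetric keys and certificate chains, while this sketch substitutes a stdlib HMAC with a hypothetical key so the idea stays self-contained. The key property is that the signature binds to the exact bytes of the content, so any edit at all invalidates it.

```python
import hashlib
import hmac
import json

# Hypothetical shared key; real provenance schemes (e.g. C2PA) use
# asymmetric signing keys and certificates instead of a shared secret.
SIGNING_KEY = b"creator-signing-key"

def sign_content(video_bytes: bytes, metadata: dict) -> dict:
    """Bind a provenance record to the exact bytes of the content."""
    record = {"sha256": hashlib.sha256(video_bytes).hexdigest(), **metadata}
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_content(video_bytes: bytes, record: dict) -> bool:
    """Any edit to the bytes or the record invalidates verification."""
    claimed = dict(record)
    signature = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(signature, expected)
            and hashlib.sha256(video_bytes).hexdigest() == claimed["sha256"])

video = b"...raw video bytes..."
rec = sign_content(video, {"creator": "newsroom", "tool": "camera"})
assert verify_content(video, rec)             # untouched content verifies
assert not verify_content(video + b"x", rec)  # any edit breaks the binding
```

    Unlike a pixel watermark, nothing here needs to survive cropping or re-encoding: an edited file simply fails verification, which is exactly the signal a checker wants. As Soroko notes, a valid signature still proves only origin, not truthfulness.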

    Jordan Mitchell, founder of Growth Stack Media, said the main problem is that companies build AI systems on unauthorized training data—without consent. He explained, “The adoption of blockchain-based content authentication is essential, as blockchain creates immutable records that make it easy to trace content origin and ownership.”

    He added, “Just as NFTs provided proof of ownership of digital assets for creators, blockchain can bring transparency and authenticity to the AI era.”
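    The “immutable records” Mitchell describes come from hash chaining: each record’s hash covers the previous record’s hash, so rewriting history anywhere invalidates every later link. The following is a minimal, self-contained sketch of that property using only Python’s standard library; the field names and sample records are illustrative, not taken from any real ledger.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents, including its link to the previous block."""
    payload = json.dumps(
        {k: block[k] for k in ("index", "record", "prev_hash")},
        sort_keys=True,
    ).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, record: dict) -> None:
    """Append a provenance record linked to the current chain tip."""
    block = {
        "index": len(chain),
        "record": record,
        "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
    }
    block["hash"] = block_hash(block)
    chain.append(block)

def chain_is_valid(chain: list) -> bool:
    """Recompute every hash and every link; any tampering shows up."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, {"asset": "video-001", "owner": "creator-a"})
append_block(chain, {"asset": "video-001", "owner": "creator-a", "edit": "crop"})
assert chain_is_valid(chain)

# Tampering with any past record invalidates the chain from that point on.
chain[0]["record"]["owner"] = "impostor"
assert not chain_is_valid(chain)
```

    This is why such records make origin and ownership easy to trace: changing a single past entry forces an attacker to recompute every subsequent hash, which a distributed ledger is designed to prevent.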

    Decline in Public Trust:

    When NewsGuard researchers prompted Sora with false or bogus claims, it did refuse to generate videos on some of those topics. But experts say this inconsistent behavior, refusing some prompts while accepting similar ones, can be even more dangerous.

    Mitchell of Growth Stack said this shows that Sora relies on surface-level pattern matching where a more robust strategy is needed. If users cannot predict which prompts the system will reject, they will simply keep trying until they find a loophole.

    Sectigo’s Soroko explained that AI models are probabilistic and context-dependent, as even small wording changes can alter the response. This unpredictability confuses people and increases the chance of misuse.

    Swear’s Crawforth said that such outcomes erode trust in the technology. If one request is blocked and another similar request is allowed, users lose confidence in the system. This lack of transparency poses a significant risk to both regulators and businesses.

    He also stated that if AI tools are to be credible, the reasoning behind refusals must be clear and consistent. Otherwise, misinformation will increase and digital trust will become even weaker.
