Why OpenAI Really Shut Down Sora: The Facial Data Collection Story Behind AI Video's Most Prominent Closure
Six months after its public launch, OpenAI has discontinued Sora. The official explanation pointed to cost and product focus — but a deeper look at the platform's data practices reveals it had enabled users to submit facial images for training-adjacent purposes, raising questions that OpenAI was not prepared to answer publicly.

D.O.T.S AI Newsroom
AI News Desk
OpenAI's decision to shut down Sora, the AI video generation platform it launched to considerable fanfare in late 2025, has been attributed publicly to resource prioritisation. The company is consolidating its consumer product surface around ChatGPT. Sora, which required substantial infrastructure and generated limited revenue relative to its operational cost, was a natural candidate for discontinuation. That much is true.
But TechCrunch's reporting this week surfaced a more specific concern that appears to have factored into the timing and finality of the shutdown: the platform had enabled users to submit facial data, and the precise scope of how that data was used, or was intended to be used, was never clearly disclosed to users or regulators.
What Sora's Facial Feature Did
Sora had introduced a feature that allowed users to upload images of real people, including their own faces, to influence generated video output — enabling a form of personalised avatar creation. The capability was presented as a creative tool. But in practice, submitted facial images represent some of the most sensitive biometric data a consumer platform can collect, and the regulatory exposure for such collection is significant, particularly in the European Union under GDPR and in US states like Illinois under BIPA.
OpenAI had not provided public documentation adequate to the sensitivity of the data being collected. When TechCrunch began asking detailed questions about the facial data pipeline, the answers it received were incomplete.
The Timing Problem
The shutdown announcement came within weeks of growing regulatory scrutiny of AI video platforms in the EU and of parallel reporting on facial recognition and biometric data collection practices across consumer AI tools. Whether the regulatory pressure was the proximate cause or one accelerant among several, the result is the same: Sora, which was among the most technically impressive generative AI products of 2025, has been discontinued less than a year after public launch.
What It Means for AI Video
The Sora closure is a data point in a broader pattern. Runway, Pika, and Kling have all grown their user bases while navigating the same questions about training data provenance, likeness rights, and biometric collection. None of those companies has faced the public scrutiny that OpenAI has, partly because they are smaller and partly because their products never commanded the cultural attention that Sora did at launch.
The industry lesson from Sora's shutdown may be less about the viability of AI video as a product category — which remains strong — and more about the governance requirements for any consumer AI platform that collects biometric data at scale. Those requirements are evolving rapidly, and companies that are not ahead of them are increasingly exposed.