In 2023, the Entertainment Software Rating Board, along with digital identity company Yoti and “youth marketing solutions” provider SuperAwesome, filed a proposal with the FTC for a new “verifiable parental consent mechanism” called Privacy-Protective Facial Age Estimation. The FTC has now issued its response to that proposal, and the answer is “no”—for now.
The ESRB’s proposed technology ruffled feathers almost immediately, and understandably so: The idea of having to essentially submit a selfie to prove to a machine that you’re old enough to play GTA 6 is inherently intrusive, and that’s before you even get into questions of technological bias and whether the thing would work well enough to justify the headaches that would inevitably erupt for at least some users.
The ESRB moved quickly to reassure the public that the system was not meant to identify individuals but simply to estimate age, and that it would not store any data after the analysis was complete. It was also not intended to enforce the ESRB’s age ratings, but rather to ensure compliance with COPPA—the Children’s Online Privacy Protection Act—a US privacy law that requires “verifiable parental consent” before companies are allowed to collect or share data from children under the age of 13.
Unlike the ESRB rating system, which is voluntary, COPPA is legally binding, and breaking that law can be awfully expensive. In 2022, for instance, Epic Games agreed to pay a $275 million penalty for COPPA violations, while in 2023 Microsoft ate a $20 million fine for violations of its own on Xbox Live. So you can understand why companies might be eager to find a low-effort system that enables them to at least say, “Hey, we tried.”
But for now, the ESRB’s proposed solution isn’t going to be it. In a ruling issued on March 29, the FTC said that after receiving more than 350 comments on the proposal, it voted unanimously to deny the application. The denial was issued without prejudice, meaning the ESRB and its associates can resubmit the application in the future.
That reflects the reason for the denial, which came not because the FTC had concerns about millions of people submitting selfies to an AI-powered machine in the name of government oversight, but because it’s not clear how (or, I suppose, whether) the thing would work.
In a letter sent to the ESRB group, the FTC noted that Yoti had submitted a “facial age estimation model” to the National Institute of Standards and Technology in September 2023, but that NIST’s evaluation of it has not yet been delivered. On March 22 the ESRB asked for a 90-day delay on any ruling to give that report time to arrive, but because there’s no indication it will show up within that time frame, the FTC has simply decided to toss the whole thing.
Frankly I think the world would be a better place if this whole idea went away completely, but that seems unlikely. In its denial, the FTC said it “is taking no position on the merits of the application,” and effectively encouraged a resubmission once the NIST report is finished, “when the Commission anticipates that additional information will be available to assist the Commission and the public in better understanding age verification technologies and the application.”
I’ve reached out to the ESRB for comment on the ruling, and to ask whether it intends to resubmit the Privacy-Protective Facial Age Estimation proposal once the NIST report is complete, and will update if I receive a reply.