In April 2018, a coalition of more than 20 privacy groups filed a complaint with the FTC, claiming that YouTube had repeatedly and knowingly violated the Children's Online Privacy Protection Act (COPPA). The complaint alleged that YouTube knew users under the age of 13 regularly accessed video content on the platform, and that it collected their viewing history to make recommendations and serve targeted advertisements, all without parental consent.
Since the complaint was first filed, the FTC has launched a broader investigation into online companies and child privacy. That effort included a fine against TikTok for collecting personal information from children that it was not legally permitted to gather. This week, the original complaint was resolved, with YouTube fined $170 million for the violations. The settlement is a joint action between the FTC and the state of New York: $34 million will go to the state, while the rest will go to the FTC.
In addition to the fine, YouTube has agreed to change its operating procedures. Videos uploaded to the service will need to be marked as safe for children. This will be an opt-in self-identification by content creators, meaning that by default content will not be marked as child safe. The company will also begin obtaining parental consent before collecting children's data, something it was always legally required to do, and will stop using any such data collected previously, whether or not consent was given.
This move is another indication that the FTC is concerned about child safety online, and it shows that the agency holds Silicon Valley to the same standard as everyone else: if you violate laws or regulations, you will be held accountable, no matter how big you are. FTC Chairman Joe Simons and Commissioner Christine Wilson said in a statement:
"This settlement achieves a significant victory for the millions of parents whose children watch child-directed content on YouTube. It also sends a strong message to children's content providers and to platforms."
It's a timely penalty, as more companies have begun targeting online content at children. Services like Snapchat serve adult users but are unquestionably popular with younger ones. No longer will the claim that users must be 13 to sign up serve as a valid defense in child protection cases.