The Great YouTube COPPA Panic

EDIT: 11/25/2019 – This entire situation is, apparently, far worse than I originally thought –See this follow up: https://www.xadara.com/2019/11/i-think-i-misjudged-the-youtube-coppa-situation/

A couple of weeks ago, right on the heels of the paranoia over fears that YouTube was going to terminate accounts that aren’t profitable (something many people still believe is the case), came another thing for all those who perpetually complain about YouTube yet never leave the site to panic over: COPPA.

This isn’t actually anything new, in effect. COPPA, the Children’s Online Privacy Protection Act, is a United States federal law that, in simple terms, is designed to restrict the collection of information from minors on the internet.

It became effective in 2000, nearly 20 years ago, so as I said, it’s nothing new. When I started Xadara in 2007, I discovered this law as a side effect of the forum software I was using, which mentioned it and had sections set up for handling the registration of members aged 13 or under. Not wanting to deal with that, I just disabled registration for that age group and called it a day.

Of course, a company like Google / YouTube can’t just do what I did, not in any practical way, and apparently Google hasn’t been playing by the rules as well as the FTC would like.

The end result of this is that YouTube is now treating content that is “made for kids” differently from normal uploads, primarily for the purposes of advertising and statistics tracking.

Understand that Google logs every bit of data they can on everyone, in every way they can. I don’t say this here to be menacing, just factual — they are a company that makes their money off of user data, passively at least via advertising. Knowing who watches what content gives that content value — an advertiser who wants to sell a product to a certain type of person will know that having an ad play before a certain video will get that product or service into the eyes of those most likely to take interest in that product or service.

This, of course, means collecting information on the people viewing said content, regardless of what it is. If Google is involved, they collect data.

With YouTube there’s an interesting problem in that there is actually quite a bit of content specifically made with kids in mind. It’s honestly an absolutely titanic market, and has been incredibly profitable for people making such content as ads play on it — especially given that a kid handed a cell phone isn’t exactly going to have adblock enabled.

So, this means, in effect, that to serve ads, YouTube (and thus Google) would be collecting information on kids viewing these videos, if only passively. The thing is, this is still a violation of COPPA, and as mentioned above, there is no practical way for YouTube to know who’s actually a minor and who’s not. To comply with the law, they have taken another route: passing the onus to video creators. Yep, it’s your responsibility to comply, as ownership of a channel is being treated the same as ownership of a website. Interesting, as this could set a precedent that one could use against YouTube proper in a case over the right to use the service versus YouTube’s right to remove channels, but I’m no legal expert…

Not going to lie, this is yet another thing I feel the YouTube community somewhat brought on themselves.

Anyway, I’ll have to look a bit more deeply into just what litigation against YouTube spawned this change, as it would obviously explain just what the hell went on and why this wound up being the best option.

So the basics of it are this: You, the uploader, now have to select what audience your video is for. Videos intended for kids will not have targeted ads enabled. This doesn’t mean they won’t have ads at all, or that people won’t make money; they just won’t have ads based on information about the viewers who, logically, would be children, and thus gathering information on them would violate COPPA. These “generic” ads won’t pay out as much, and many creators fear yet another “adpocalypse” as a side effect, but again, I don’t see it affecting much more than the already massive earnings of those toy unboxing and nursery rhyme channels.

Comments and similar features will also be disabled on such videos, again to comply with limits on data collection from probable minors. A possible side effect is that, since this limits actual interaction with the videos, their visibility will suffer, but given how these videos get marathoned by kids, I don’t think it’s going to be much of an issue in the grand scheme.

[EDIT 11/23/2019] I failed to address originally that “for kids” videos also cannot be added to playlists, that notifications won’t show when such videos are uploaded, and that generally any real interaction which might leave traceable data about the video will be unavailable. This clearly does affect the visibility of content and will be quite a hit to people’s ability to get content seen, so more damage will be done in this regard alone than I originally thought.

Of course, there are penalties for not selecting this option if your video is indeed targeted at children, up to absurd fines from the FTC, and as one can imagine, everyone on YouTube now thinks that if they upload anything a kid might watch, they are suddenly going to get sued for $40,000.

This is, logically, far from the case and would only happen to major offenders: people who would do everything they could to circumvent what YouTube has already put in place and still throw kid-oriented content on there without marking it as such, which just seems absurd as a situation in and of itself.

YouTube will also be putting systems in place to detect content made for kids. This is a bit concerning to people, and I can see why, given YouTube’s history of terrible automated detection: many fear that any mention of a cartoon, for example, or showing a character, or discussing something animated, regardless of its target audience (South Park, for example), may set off a false positive.

I can’t say for sure how this will play out, but I can say for certain that unless you, as a creator, are indeed making content for children you have nothing to worry about. It’s pretty obvious when content is made for a child and when it’s not, and if there’s doubt to be had, I’d think you’re fine.

Honestly, I don’t think this will hurt many creators at all — if you aren’t making content targeting people under 13, then you’re fine. If your content is possibly of such a nature, you may want to reconsider your approach or just go all in and accept this as a reality that should have always been the case. Again, this has been a legal fact since before YouTube even existed and isn’t something up for debate, nor is it “going to destroy the site.” It’s just the next step in what some people still consider renegade media being completely mainstream and being treated like such.

Protip: It’s been illegal to collect data from minors since 2000. This isn’t anything new.

For the rest of us, it’s business as usual. If you’re afraid this may impact your earnings, might I suggest as so many did to us small-time content creators who lost monetization, that you get a real job. 😉

I’ll be covering much more of this as time passes. I know, I’m already late getting to this aspect of it but it’s been a busy week for me. More to come, as always.

For those curious, relevant links shared in the email sent out regarding this program can be found here:

https://support.google.com/youtube/answer/9527654

https://support.google.com/youtube/answer/9528076

Oh, and as a disclaimer, nothing here is legal advice. Just an analysis from someone who actually tries to understand what’s going on and the logic of things.

Lastly, as a bonus read, consider this: https://prokopetz.tumblr.com/post/189179831042/i-dont-imagine-this-is-going-to-make-a-huge
