Kids online are at risk from YouTube's violations of federal child privacy laws


Posts: 5954
Joined: Sun May 29, 2005 7:06 am
Location: Cleveland, Ohio

Kids online are at risk from YouTube's violations of federal child privacy laws

Post by jserraglio » Wed Jun 19, 2019 2:52 pm


YouTube is under investigation over children's privacy

The U.S. government is in the late stages of an investigation into YouTube for its handling of children’s videos, according to four people familiar with the matter, a probe that threatens the company with a potential fine and already has prompted the tech giant to reevaluate some of its business practices.
The Federal Trade Commission launched the investigation after numerous complaints from consumer groups and privacy advocates, according to the four people, who requested anonymity because such probes are supposed to be confidential. The complaints contended that YouTube, which is owned by Google, failed to protect kids who used the streaming-video service and improperly collected their data.
As the investigation has progressed, YouTube executives in recent months have accelerated internal discussions about broad changes to how the platform handles children’s videos, according to a person familiar with the company’s plans. That includes potential changes to its algorithm for recommending and queuing up videos for users, including kids, part of an ongoing effort at YouTube over the past year and a half to overhaul its software and policies to prevent abuse.
A spokeswoman for YouTube, Andrea Faville, declined to comment on the FTC probe. In a statement, she emphasized that not all discussions about product changes come to fruition. “We consider lots of ideas for improving YouTube and some remain just that: ideas,” she said. “Others, we develop and launch, like our restrictions on minors live-streaming or updated hate speech policy.”
The FTC declined to comment, citing its policy against confirming or denying nonpublic investigations.
The Wall Street Journal first reported Wednesday that YouTube was considering moving all children’s content off the service into a separate app, YouTube Kids, to better protect younger viewers from problematic material — a change that would be difficult to implement because of the sheer volume of content on YouTube, and potentially could be costly to the company in lost advertising revenue. A person close to the company said that option was highly unlikely, but that other changes were on the table.
YouTube Kids gets a tiny fraction of YouTube’s audience, which tops 1.9 billion users logging in each month. Kids tend to switch from YouTube Kids to the main platform around the age of seven, Bloomberg reported this week.
The internal conversations come after years of complaints by consumer advocates and independent researchers that YouTube had become a leading conduit for political disinformation, hate speech, conspiracy theories and content threatening the well-being of children. The prevalence of preteens and younger children on YouTube has been an open secret within the technology industry and repeatedly documented by polls even as the company insisted that the platform complied with a 1998 federal privacy law that prohibits the tracking and targeting of those under 13.
The FTC has been investigating YouTube over its treatment of kids based on multiple complaints it received dating back to 2015, arguing that both YouTube and YouTube Kids violate federal laws, according to the people familiar with the investigation. The exact nature and status of the inquiry are not known, but one of the sources said it is in advanced stages, suggesting that a settlement, and possibly a fine depending on what the FTC determines, could be forthcoming.
“Google has been violating federal child privacy laws for years,” said Jeffrey Chester, executive director of the Center for Digital Democracy, one of the groups that has repeatedly complained about YouTube.
Major advertisers also have pushed YouTube and others to clean up their content amid waves of controversies over the past two years.
A report last month by PwC, a consulting group, said that Google had an internal initiative called Project Unicorn that sought to make company products comply with the federal child privacy law, the Children’s Online Privacy Protection Act, known by its acronym COPPA.
The company that commissioned the PwC report, SuperAwesome, helps technology companies provide services without violating COPPA or European child-privacy legal restrictions against the tracking of children.
“YouTube has a huge problem,” said Dylan Collins, chief executive of SuperAwesome. “They clearly have huge amounts of children using the platform, but they can’t acknowledge their presence.”
He said the steps being considered by YouTube would help, but “they’re sort of stuck in a corner here, and it’s hard to engineer their way out of the problem.”
Earlier this month, YouTube made its biggest change yet to its hate speech policies — banning direct assertions of superiority against protected groups, such as women, veterans and minorities, and banning users from denying that well-documented violent events took place. Previously, the company prohibited users from making direct calls for violence against protected groups, but stopped short of banning other forms of hateful speech, including slurs. The changes were accompanied by a purge of thousands of channels featuring Holocaust denial and content by white supremacists.
The company also recently disabled comments on videos featuring minors and banned minors from live-streaming video without an adult present in the video. Executives have also moved to limit the company’s own algorithms from recommending content in which a minor is featured in a sexualized or violent situation, even if that content does not technically violate the company’s policies.
Tony Romm is a technology policy reporter at The Washington Post. He has spent nearly ten years covering the ways that tech companies like Apple, Facebook and Google navigate the corridors of government, and the regulations that sometimes result.
Elizabeth Dwoskin joined The Washington Post as Silicon Valley correspondent in 2016, becoming the paper's eyes and ears in the region and in the wider world of tech. Before that, she was the Wall Street Journal's first full-time beat reporter covering big data, artificial intelligence, and the impact of algorithms on people's lives.
Craig Timberg is a national technology reporter for The Washington Post. Since joining The Post in 1998, he has been a reporter, editor and foreign correspondent, and he contributed to The Post’s Pulitzer Prize-winning coverage of the National Security Agency.
Democracy Dies in Darkness
© 1996-2019 The Washington Post


Re: Kids on YouTube are at risk from Google violating federal child privacy laws

Post by jserraglio » Wed Jun 19, 2019 2:57 pm


How 20 years of child online protection law went wrong

Two decades after Congress tried to wall off the worst of the Internet in hopes of protecting the privacy and innocence of children, the ramparts lie in ruins.
Sex, drugs, violence, hate speech, conspiracy theories and blunt talk about suicide rarely are more than a few clicks away. Even when children are viewing benign content, they face aggressive forms of data collection that allow tech companies to gather the names, locations and interests of young users.
The federal government’s efforts to thwart these rising threats have been weakened by court rulings, uneven enforcement and the relentless pace of technological change. Surveys show that four out of five American preteens use some form of social media, with YouTube being the most popular but Instagram, Facebook and Snapchat also widely used — even though all four services officially prohibit users younger than 13.
Other popular online offerings — such as the game Fortnite, which has proven so engrossing to preteen boys that parents worry about addiction — maintain they are “not directed” at children. But the services also don’t ask users how old they are. This tactic, lawyers say, helps the companies sidestep the Children’s Online Privacy Protection Act, a 1998 law known by the acronym COPPA that restricts the tracking and targeting of those under 13 but requires “actual knowledge” of a user’s age as a key trigger to enforcement.
Consumer and privacy advocates have alleged rampant COPPA violations by leading technology companies, including in a highly detailed, 59-page complaint against YouTube last year. Even when federal authorities take action, the penalties typically come many years after the violations began and make little dent in corporate profit margins, the advocates say.
“We’ve got a crisis of enforcement,” said Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, an advocacy group based in Boston. “Right now we are incentivizing companies to not know that children are on their sites. . . . They’ve literally been rewarded for pretending that there are no children on their sites.”
As researchers and consumer advocates spotlight the weaknesses of federal protections for children, some members of Congress are pushing to toughen the federal privacy law and to impose legal restrictions on what can be shown to children online. But such efforts are struggling to advance in a Congress consumed by partisan battles.
Safeguards by industry, which for years argued that “self-regulation” was an effective alternative to government’s heavy hand, also have proved weak. Even content that nearly everyone agrees should be off-limits to children, such as pornography or sites celebrating drug, tobacco and alcohol consumption, can be seen by underage users who enter fake birth dates or tap online buttons that allow them to claim to be adults. Rarely do sites or apps employ systems to routinely verify ages.
This leaves parents with few choices for helping kids navigate an online world in which the borders between sites for adults and those for children have all but disappeared. Short of round-the-clock vigilance — in play rooms, on school buses, wherever children gather with their ever-present mobile devices — there are few effective ways to shield them from corporate data collection or from encountering content traditionally kept from young eyes.
“There has been a complete and utter failure to protect children in this entire society by the Washington infrastructure,” said James Steyer, chief executive of Common Sense, a San Francisco-based advocacy group pushing for several new laws to make the Internet safer for children. “It’s a disgrace. And the losers have been children, parents and their families.”
Shocking discoveries
One children’s website featured educational games, cartoon characters in togas and a decidedly adult advertisement along the bottom, a Princeton researcher recently found. In the ad a dark-haired woman in a low-cut dress smiled warmly just above the words “Ashley Madison,” with a link to the online dating service whose slogan is “Life is short. Have an affair.”
Minutes later, on a different children’s math site, another Ashley Madison ad appeared, only this time the woman’s hair was curlier and the dress more revealing.
Delivering the ads on both occasions was Google, the world’s largest digital advertising company, which acknowledged in a statement that the ads violated company policy.
“I was shocked,” said Gunes Acar, the researcher for Princeton’s Center for Information Technology Policy who discovered the ads. “I definitely was not expecting to find that on a child-directed site.”
The shocks would keep coming for Acar during several weeks last winter reviewing the ads on children’s websites.
Acar was studying the effectiveness of COPPA, which was once hailed as a landmark in protecting kids online. But recent work by Acar and other researchers has demonstrated significant limits in the reach and enforcement of COPPA, suggesting that the law has been overrun by the very industry it was supposed to regulate.
In addition to the Ashley Madison ads, Acar’s survey of websites labeled as “child-directed” found a Google ad for a dating service featuring Qatari women and another touting pictures of “Hot Survivor Contestants.” Some ads served by Google offered downloads that included malicious software. Another Google ad caused his computer to emit a high-pitched alarm as a robotic voice announced an “Important security message” and urged him to call a number for tech support — all signs of a likely online scam.
All of these ads complied with COPPA, meaning they didn’t track or target children. But the law also had another apparent effect, one not intended by its creators: By barring personalized advertising, COPPA can prompt advertising companies to deliver a hodgepodge of untargeted ads on kids sites, resulting in a mix that can be curiously adult in nature.
Acar’s survey involved repeatedly visiting children’s websites while he was not signed into any Google service, so that he could see what advertising appeared. He also collected Google’s explanations of why it displayed certain ads, to make sure that the factors weren’t particular to his browsing history or anything else that might indicate an adult user. Google’s explanations of the Ashley Madison ads, for example, indicated that they were not personalized to Acar but were displayed for general reasons, such as his location, in Princeton, N.J., and the time of day.
Google said that it has policies against ads delivering malicious software and that it does not allow adult advertising on kids sites.
“Our policies prohibit serving personalized ads to children. We also restrict the kind of ad content that is served on primarily child-directed sites and in this case, our systems didn’t work as intended and we have removed those ads,” Google spokeswoman Shannon Newberry said.
John Barth, director of digital marketing for IXL Learning, which owns the children’s math sites that showed the Ashley Madison ads, wrote in an email that the sites are flagged to Google as “child-directed” and run ads that comply with COPPA. When provided with images of the ads that Acar found, Barth said, “This is concerning . . . I plan to investigate this issue.” The sites are no longer online.
Ashley Madison, meanwhile, expressed frustration that some of its ads had reached audiences unlikely to be in the market for its core service of helping people find sexual relationships outside their marriages.
“Unless there’s been a sudden surge of affairs at recess, this is not in our interest,” said Paul Keable, chief strategy officer for Ashley Madison’s parent company, ruby Inc.
Prohibitions have little effect
COPPA passed with bipartisan support in 1998 before much of today’s digital ecosystem existed. Google had just been founded, but many staples of contemporary online life — the iPhone, YouTube, Facebook — had not yet been invented.
The law sought to protect children from websites that targeted children, gathered their personal data and delivered advertising based on this information. This became illegal under COPPA, unless parents specifically permitted it. Few did.
But the law’s architects, who found themselves negotiating against powerful industry interests while seeking support in Congress, agreed to a key loophole: So long as online sites didn’t explicitly target children and didn’t have “actual knowledge” that a particular user was under 13, COPPA’s restrictions didn’t apply.
“We saw COPPA as a first step,” said Kathryn Montgomery, a retired American University professor who lobbied for the law in 1998 and now is research director for the Center for Digital Democracy. “But it’s very alarming to see that this industry, particularly those who are targeting children and making a lot of money on children, is not taking into consideration the welfare of children.”
A similarly named law, the Child Online Protection Act, also passed in 1998, backed mainly by political conservatives as part of a broader campaign against pornography. The law sought to keep children from seeing content deemed “harmful to minors” but immediately came under successful legal attack from civil liberties groups, which argued that it infringed on constitutionally protected speech. The law never took effect.
That left COPPA, with its focus on privacy alone, as the primary federal statute seeking to protect children online.
The early years of COPPA enforcement actions by the Federal Trade Commission focused on companies that knew children were using their services, often because the sites asked users for their ages. Ohio Art Company, the maker of Etch A Sketch, for example, reached a $35,000 settlement with the FTC in 2002 over allegations that its website was collecting personal data on children, including their birth dates. Sony BMG Music settled a $1 million case with the FTC in 2008 for allegedly collecting data, including dates of birth, on young visitors to the fan sites of pop artists.
Lawyers for technology companies began warning their clients to avoid requesting such information and, instead, to write terms of service prohibiting users under the age of 13, legal experts say. Some of the online services that surveys show are most popular among younger children, including YouTube, Instagram and Snapchat, officially bar them from the platform.
Children’s advocates say such prohibitions, often explained in fine print rarely read by young users or their parents, do little to keep kids off these services, an assertion backed by survey data.
Common Sense, the advocacy group led by Steyer, found in 2017 that 83 percent of children aged 10 to 12 use at least one social media site. Their favorites were YouTube, Facebook, Instagram and Snapchat — all of which have terms of service barring users that young. A poll last year by the Pew Research Center echoed the finding, showing that even among kids 11 or younger, 81 percent had watched YouTube at least once and 34 percent did so regularly, according to their parents.
Many of the most popular channels on YouTube, as ranked by research firm Social Blade, appear to be directed toward kids, by featuring nursery rhymes, simple cartoons or videos of children playing games or unwrapping packages with toys inside. Such issues, which have been evident for several years on YouTube, are at the heart of last year’s COPPA complaint to the FTC.
‘A crisis of enforcement’
Epic Games, maker of the massively popular online game Fortnite, has taken a legal position similar to that of the social media sites. It does not specifically prohibit users under 13 but says that it does “not direct” its games toward children or “intentionally collect personal information from children,” according to the company’s privacy policy. Many parents, however, say that children in elementary and middle schools spend hours a day playing the game, which collects a range of personal data from its users.
“COPPA doesn’t work,” said Marc Groman, a former White House and Federal Trade Commission lawyer who explored the law’s many problems in a podcast he co-hosts, called “Their Own Devices,” on children and technology. “The fact is it doesn’t protect children under 13 in many circumstances.”
Epic and Snapchat declined to comment. Newberry, the Google spokeswoman, said of YouTube, which is a subsidiary of Google: “Parents want their children to be safe online and we’re committed to providing tools and safeguards to help them.”
Instagram spokeswoman Stephanie Otway said: “People under the age of 13 are not allowed to use Instagram. When we find an underage account, we will restrict access to that account and ask the account holder to prove their age. If they are unable to do so, we will delete the account from Instagram.”
Instagram declined to say how many accounts it’s deleted for this reason.
Even when COPPA does apply, enforcement of the law often is slow and inconsistent, consumer advocates say.
The FTC in February fined the maker of the popular lip-syncing app TikTok $5.7 million, a record penalty for COPPA violations, settling allegations that the app had collected personal data on children illegally since 2014 under a previous name. But critics of COPPA enforcement said the agency moved much too slowly to discourage other companies from doing the same. TikTok’s Chinese parent company, ByteDance, which acquired the app in 2017, is now one of the richest start-ups in the world.
The FTC, which enforces COPPA, defended its record and said the law has been successful in protecting children from collection of their personal data and targeted advertising. The agency has settled allegations of COPPA violations in 30 cases and used its rulemaking power to update enforcement of the law in 2013 by expanding the definition of personal data to include videos, photos or voice clips.
“We think that the statute is effective, and we think that our enforcement efforts are effective,” said Andrew Smith, head of the FTC’s bureau of consumer protection.
A push to strengthen
Sen. Edward J. Markey (D-Mass.), one of the original authors of COPPA, has proposed a bill this year to strengthen it. His COPPA update, co-sponsored by Sen. Josh Hawley (R-Mo.), would set a broader standard requiring compliance if there is substantial evidence that children are using a website or app. The bill also would bring the United States closer to the children’s privacy standards in the European Union by raising the age of those covered by COPPA to include anyone under 16. European law prohibits collecting personal data from children under 16 in most cases.
“We believe that parents in the United States want the same protection for their children as Europeans want for their children,” Markey said.
He also is developing a bill that would implement new standards for children’s online content, echoing previous generations’ rules for kids’ shows on broadcast television. Markey said the portability of mobile devices makes it harder than ever for parents to monitor what their children are watching — or receiving through advertising.
While Markey predicted bipartisan support for his bills, they already are generating resistance from some in the technology industry, whose lobbying corps is among the largest and best funded in Washington.
Markey’s legislative push, they warn, will inhibit innovation and push developers toward offering fewer free services online, limiting access to poorer families.
“I think it will push us further in this direction of limiting the availability of services and apps [for] kids,” said Daniel Castro, the vice president at the Information Technology & Innovation Foundation, whose board includes Apple and Microsoft executives. “It’s going to raise the cost of compliance. It’s going to make it so you have more paid apps.”
‘The violations are rampant’
When researchers from the University of California at Berkeley tested 5,855 apps marketed to children and their parents on the “Designed for Families” portion of Google’s Play store, they found 57 percent showed signs that they may be violating COPPA, including its limits on collecting the personal data of young users. The researchers published their findings in a peer-reviewed journal last year and furnished the list of apps to Google, hoping it would address the claims.
A year later, many of the identified apps still are available on the Play store in the “Designed for Families” section, the researchers say. New Mexico Attorney General Hector Balderas has sued Google in federal court for alleged COPPA violations, including for failing to address problems found by the researchers.
Google also was among the companies whose online trackers were likely collecting children’s data. Independent developers installed trackers from Google and other advertising companies into their apps, allowing them to collect such data as user locations and interests, based on what apps they used or websites they visited.
This helped app makers understand their audiences better while also providing the data necessary to attract more lucrative targeted ads, but on kids apps, this kind of tracking likely violated COPPA, said Serge Egelman, one of the researchers and director of the Berkeley Laboratory for Usable and Experimental Security.
“The platforms have an incentive to not investigate,” Egelman said. “The violations are rampant.”
Google has argued in its response to the New Mexico attorney general’s lawsuit that “the app developer bears sole responsibility for ensuring COPPA compliance.” But the FTC also has broad authority to investigate and institute enforcement actions when companies engage in “deceptive practices.”
Georgetown University law professor Angela J. Campbell, who represents a coalition of consumer advocates and privacy groups that filed an FTC complaint in December against Google for alleged COPPA violations, argues that the company should not characterize apps as family friendly when company executives have been alerted to apparent privacy intrusions.
“They are deceiving the public in presenting these apps as being appropriate for children,” Campbell said.
Newberry, the Google spokeswoman, said, “Parents want their children to be safe online and we work hard to protect them.” She added that the “Designed for Families” section of the Play store requires developers to have privacy policies and to adhere to privacy laws, including COPPA. The Play store added new rules last month requiring app developers to declare the ages of their intended audiences and to ensure that apps geared towards adults aren’t designed in ways that “unintentionally” appeal to children.
Newberry did not comment on Egelman’s research, on the complaint to the FTC alleging deceptive practices, or on why apps Egelman had identified as likely violating COPPA remained in the Play store’s “Designed for Families” section.
Facebook also collected data from hundreds of apps marketed to children and their parents, the researchers found. When they alerted the company that app makers appeared to be using Facebook tracking technology to collect information on children, Facebook contacted the app makers about the apparent COPPA violations.
But in dozens of cases, according to Egelman, apps that appear to be aimed at a children’s audience still are sending personal data to Facebook. That includes games that have cartoonish graphics and, in some cases, the word “kids” in titles or descriptions.
Facebook said it has limited power to evaluate potential COPPA violations when other companies use its tracking technology.
“We require all developers engaging with our platform to be COPPA compliant. We followed up on the researcher’s report and suspended all apps that did not guarantee their compliance. In order to protect a competitive developer ecosystem, the law does not give large companies the authority to determine COPPA compliance,” said Antigone Davis, Facebook’s head of global safety.
The state of COPPA enforcement is such that even the federal government has struggled to consistently comply. Researchers from the Danish privacy compliance company Cookiebot discovered recently that a site for the U.S. Government Publishing Office aimed at children as young as 4 appears to have violated the law’s prohibition on tracking children.
The site — called Ben’s Guide and featuring a cartoon image of Founding Father Benjamin Franklin — has games and lessons about U.S. history and government explicitly aimed at schoolchildren. It also had a Google data tracker on it that collected information about what devices children were using and what other websites they visited, Cookiebot found. The Government Publishing Office placed the Google tracker on the site but did not collect the resulting personal data; the data went to Google.
“We do our best to ensure compliance with COPPA and do periodic reviews of the software,” said Gary Somerset, a spokesman for the Government Publishing Office. “If there are any system modifications needed to improve upon security or privacy, our technical personnel will correct.”
Google declined to comment.
Tony Romm, Alice Crites, Julie Tate and Emily Guskin contributed to this report.
Craig Timberg is a national technology reporter for The Washington Post. Since joining The Post in 1998, he has been a reporter, editor and foreign correspondent, and he contributed to The Post’s Pulitzer Prize-winning coverage of the National Security Agency.
