TikTok bans children from the popular lip-syncing app after $5.7 million fine


A lip-syncing app is paying big for misusing data from its younger users.

Musical.ly, a video-based social-media platform, agreed on Wednesday to pay $5.7 million in the Federal Trade Commission’s largest civil penalty ever under the Children’s Online Privacy Protection Act (COPPA) for collecting personal information from young users without parental consent. The app, which was renamed TikTok in August 2018, has been downloaded by more than 200 million people, including 65 million in the U.S.

The FTC charged the app with failing to provide notice on its site about the information it collected from children, failing to obtain parental consent before collecting that data, and failing to delete children’s information at the request of parents. Although this is the largest fine for a violation of the law yet, Musical.ly is just one of many apps collecting a large amount of data on children, according to Jim Steyer, chief executive officer of tech policy advocacy group Common Sense Media.

“It is no secret that tech companies are illegally and knowingly collecting personal information from children,” he said. “Musical.ly wasn’t the first company and they won’t be the last, which is why we need the FTC to continue to regularly enforce the Children’s Online Privacy Protection Act and hold companies accountable in a big way.”

On the app, user accounts were previously public by default, allowing anybody to see where users lived, their age or grade in school, their profile picture, and their username. The company raised the age requirement to 13 in 2017, but did not remove existing younger users from the app after the change, the FTC said. The FTC also said there is evidence of adults using the app to contact children.

Since 2014, Musical.ly had received thousands of complaints from parents of registered users under the age of 13 asking it to delete their children’s accounts, the FTC said. The company deleted the children’s accounts but did not delete the data collected on them from its servers.

In response to the settlement, TikTok will release a separate app that introduces additional safety and privacy protections for young children. The new app will not permit the sharing of personal information.

“It’s our priority to create a safe and welcoming experience for all of our users, and as we developed the global TikTok platform, we’ve been committed to creating measures to further protect our user community — including tools for parents to protect their teens and for users to enable additional privacy settings,” the company said in a statement.

Initially, TikTok tried to argue that its app was not targeting children and, therefore, not subject to COPPA. The FTC investigation found that a “significant percentage” of users were under 13 and that the app had many features appealing to young users, including popular child celebrities and visual content. These factors led the FTC to conclude that the app was, in fact, subject to COPPA.

The FTC said that in the future it will similarly examine a site’s “look and feel” to determine whether it is meant to appeal to children.

“The primary message for other sites and services is to think twice before concluding, ‘We’re not covered by COPPA,’” the FTC said. “According to COPPA, whether a company intends — or doesn’t intend — to have a site directed to kids isn’t what controls the analysis.”
