Back in 2015, Google released the YouTube Kids app to let children browse channels and playlists containing shows, music, and learning activities. Although the app had gone through several changes since its release, last week it was revamped with new features and improved security. At the time of the original release, Google warned that despite several safeguards, some mature content inappropriate for children might slip through into the app. A week after the latest update, the application has been criticized by several high-profile publications for showing inappropriate content.
While the problem is not widespread, a report from The New York Times has pointed out that the app is showing videos that are knockoffs of well-known characters. The report noted that the problematic videos likely slipped through the filtering process, as the application relies on algorithms to choose which videos are appropriate for children. At the time of the update, Google advised parents to review videos on a regular basis and report anything that seems inappropriate.
As the company suspected, only a handful of videos made it through the filtering process; over the last 30 days, less than 0.005 percent of the platform's videos were removed. Addressing the latest issue, a YouTube spokesperson said, “We use a combination of machine learning, algorithms and community flagging to determine content in the app as well as which content runs ads. We agree this content is unacceptable and are committed to making the app better every day.”
Despite these claims, parents and children’s groups remain angry about the situation, and we will have to wait and see whether the problem persists in the coming weeks.
Image: Flickr/Esther Vargas