FTC Finds Explicit Content In The Virtual Worlds: Are Children Protected?

Parents can no longer trust child-oriented virtual worlds to be safe havens for their children online.  Federal officials recently declared that parents must play a more active role in their kids’ virtual lives, saying that a majority of these worlds fail to adequately protect children from explicit sexual and violent content.

The study was congressionally mandated, with Congress tasking the FTC to look into the matter.

The resulting report, released by the Federal Trade Commission (FTC), analyzed the policies and content of 27 virtual worlds.  The Commission found that 70% of the virtual worlds allowed access to objectionable material.

Researchers working on behalf of the agency registered in these virtual worlds as children, teens, and adults.  They then rated the explicitness of the content they found as low, moderate, or heavy.

The FTC found at least one instance of violent or sexually explicit content in 19 of the 27 virtual worlds it investigated.  Five of the worlds were rated as displaying a heavy amount of explicit content, four a moderate amount, and ten a low amount.

The Commission also noted that while some of the virtual worlds used language filters to keep objectionable language out of text communications, these filters proved largely ineffective.

Along with its findings, the FTC offered several suggestions to help virtual world operators reduce the risk of exposing young users to explicit content.  These included putting age-screening mechanisms in place so that minors interact only with their peers and view only age-appropriate content, as well as re-examining the strength of language filters.

The agency also noted that certain online worlds have taken precautions by deploying advanced age-screening mechanisms, such as blocking a user who was initially rejected for being too young from re-registering on the same computer with an older age.  These worlds also restricted young users from entering adult-only sections.

However, the FTC found that a large share of virtual worlds have insufficient rules and systems in place.  In those worlds, its researchers were able to register again from the same computer immediately after being rejected for failing to meet the minimum age requirement.

The FTC also recommended employing moderators specially trained to deal with conduct violations, who can guide other users to report objectionable content and flag members who violate the virtual world’s terms of behavior.  Segmenting members by age, the agency added, could help reduce the risk of children interacting with older users.

FTC Chairman Jon Leibowitz said, “It is far too easy for children and young teens to access explicit content in some of these virtual worlds.  The time is ripe for these companies to grow up and implement better practices to protect kids.”

After the research was completed, the Commission sent letters of inquiry to six virtual-world operators requesting information on the age-screening techniques they employ.

Credit: diybookgirl (via Flickr)

Although the responsibility for restricting content by age lies with virtual world operators, the FTC also stressed the importance of parents and children alike becoming better educated about the risks and benefits of young children and teens participating in online worlds that contain explicit content.

The general opinion among internet users seems to favor better moderation of chat channels to filter offensive material.  In some virtual worlds, even though a system for reporting users exists, requests for moderation fall on deaf ears.  The consensus is that programs allowing user-driven moderation could help alleviate this problem.

While the FTC works on cleaning up the virtual worlds and making them safer, it remains the parents’ responsibility to monitor their children’s online activities.
