Thursday, December 10, 2009
FTC's milestone report on virtual worlds
This is pioneering stuff on the part of the US government. The Federal Trade Commission today sent Congress its close study of 27 online virtual worlds (VWs) – 14 aimed at children under 13 and 13 aimed at teens and adults – examining how much sexually explicit and violent content they contain and what the worlds are doing to protect children from it. When reading the study, or just the highlights here, I think it's important for parents to keep in mind that "content" in virtual worlds means user-generated content (which is why, in "Online Safety 3.0," we put so much stress on viewing children as stakeholders in their own well-being online and on teaching them to be good citizens in their online and offline communities). Here are some key findings:
The FTC found at least one instance of either sexually or violently explicit content in 19 of the 27 worlds – heavy (sex or violence) in five of them, moderate in four, and "only a low amount in the remaining 10 worlds in which explicit content was found."
Of the 14 VWs for kids under 13, 7 contained no explicit content, 1 had a moderate amount, and 6 had a low amount.
Nearly all the explicit content found in the kids' VWs "appeared in the form of text posted in chat rooms, on message boards, or in discussion forums."
The Commission found more explicit content in the VWs aimed at teens or adults: 12 of the 13 in this category contained it, with a heavy amount in 5, a moderate amount in 3, and a low amount in 4.
Not just text: Half the explicit content found in the teen- and adult-oriented virtual worlds was text-based, while the other half appeared as graphics, occasionally with accompanying audio.
The report goes into the measures these 27 VWs take to keep minors away from explicit content, including "age screens" designed to keep minors from registering below a site's minimum age (what the FTC calls "only a threshold measure"); "adults only" sections requiring subscriptions or age verification (see "'Red-light district' makes virtual world safer"); abuse reporting and other flagging of inappropriate content; human moderation; and some filtering technology. "The report recommends that parents and children become better educated about online virtual worlds" and that virtual-world "operators should ensure that they have mechanisms in place to limit youth exposure to explicit content in their online virtual worlds." In the two pages of Appendix A (of the full 23-page report plus appendices), you'll find a chart of all the virtual worlds the FTC reviewed. [See also my VW news roundup last week and "200 virtual worlds for kids."]
This is a great start. As purely user-driven media, virtual worlds are a frontier for research on online behavior. The FTC was charged by Congress "merely" with determining the level of harmful content, not behavior – I suspect because adults continue to think in a binary, either-or way about extremely fluid environments that are mashups of content and behavior. Where is it really just one or the other, what counts as "content" in social media, and how do we define "harmful"? We also need to define "virtual worlds": some of these properties are largely avatar chat, some are games (with quests), and some are worlds that contain games but no quests. Still, we've got some great talking points and very useful data to build on.
Labels: age verification, FTC, online safety, Second Life, social media, social Web, virtual worlds