Inappropriate content "flagged" by users, news items ranked by users, online sellers rated by users, encyclopedia articles written by users, and tacit agreements among users on socially acceptable behaviour in online communities…

In the age of Web 2.0, self-regulation and governance among users have become integral to our participation in the online world, and growing up online requires a complex set of personal competences. Participation in online communities is part of many children's and young people's everyday lives today.

Here it is of key importance to be able to interpret and enact the social and cultural norms of a given community, and to understand and use the self-regulatory tools available on the various community sites.

For organisations working with child protection in the age of the social internet, where user empowerment is fundamental, many questions arise. Two worth singling out are: How can we support an ethical culture among young internet users, and which tools can we provide to sustain the cultures of self-regulation that already exist?

The Whole World is Watching
Many forms of self-regulation among users characterise the online world today – some already an integrated part of our online identities, others still being negotiated explicitly in our respective online communities. Often such self-regulation is defined by a set of values and ideas shared among the users of a particular internet community – not formalised, but based on tacit agreements.

One implicit form of self-regulation is our increasing awareness of the public nature of our online identities. In Denmark, the internet was introduced into the average Dane's home in the mid-1990s, and the public sphere has gradually become part of our private spheres. At first we were perhaps not so cautious about our online activities, arguably resulting in the "private" tone of the "personal diary", the weblog, for example. However, many of us have gradually become accustomed to cultivating and designing the online persona that shows up in a Google search. This is one form of user self-regulation. As Professor Joshua Meyrowitz recently put it at the seminar ‘Media and Mobility’ in Copenhagen: “the idea that the whole world is watching leads to a sense of caution”.

Negotiating Social Norms in Online Communities
Other forms of self-regulation are defined by cultural norms within the respective online communities in which we participate, but are negotiated more explicitly by the users themselves. This is often seen in the user comments on YouTube videos, where a user with a ‘misplaced’ comment is put in their place by other users. The implicit debate here is: ‘What is the most socially appropriate way of interacting on this community site?’

Attempts have been made to create more formalised "social codes of conduct" for online interaction. But these are most often met with great resistance from internet users who fiercely defend the free and independent nature of the internet. We saw this earlier this year when Jimmy Wales, the man behind Wikipedia, and Tim O’Reilly, father of the term "Web 2.0", argued for establishing a formal "Bloggers' Code of Conduct", causing a fierce debate among bloggers – a debate that is still taking place online at this very moment.

Supportive Tools for Self-regulation
Another side of self-regulation among internet users involves the tools provided by service providers to help users govern their online communities. For example, practically all social networking sites have a set of guidelines for posting content and interacting with other users. Other examples of tools that sustain a self-regulating culture among users are the "flag" function on YouTube, which users can use to report "inappropriate" content; the rating system on eBay, with users' positive or negative feedback on sellers; and the user-written online encyclopedia Wikipedia.org, where users are provided with systems to catch vandalism as well as systems to review and improve articles.

Online User Governance and Self-regulation among Young People
Most online communities used by children and young people today are characterised by a combination of centralised control by the service provider and decentralised cultures of self-regulation among the users themselves, sustained by tools that the provider makes available.

Take, for example, the social networking sites that children and young people use to sustain networks with peers through the creation of so-called "profiles" with "friend lists", "guest books", "image galleries" and many other functions. In Denmark, we have examples of such portals where administrators manually review all images before allowing them to be posted, technical word filters are used to control the chat, and the service provider reserves the right to close accounts whose owners infringe the regulations set out by the provider. In addition, users are provided with tools to self-regulate – that is, to govern their communities – such as the possibility of blocking other users from their profiles, setting a profile to ‘private’ or reporting bad behaviour to the administrators of the site. But perhaps most significantly, research shows that the young users have created a set of cultural norms by which they tend to live their online lives on these communities.

A Culture of Truth
The concept of the “faker” – a term used by the young users of social networking sites in Denmark – is a good example of the tacit social norms that many young people live by when participating in online communities. As pointed out by Malene Charlotte Larsen, a Danish researcher of young people’s use of Arto.dk, one of the most popular Danish social networking sites for young people, if a user creates a “fake profile” with an obviously fake picture, the other users will very quickly point it out as fake via comments on the profile. In many cases users will even have researched the profile, finding the original source of the profile image or, in some cases, the actual person behind the profile. The commitment to keeping the community truthful is strong among the young users. It is simply not acceptable to be a “faker” online.

The administrators of the site have responded to this user culture with a so-called “IRL guarantee” (In Real Life guarantee), with which users can guarantee, so to speak, that they are who they say they are on their profiles – either through peer-to-peer recommendation, a social security number check or by showing a picture of themselves holding a sign with their user name. The system is voluntary. However, a quick spot check among the user profiles on the site shows that many users have taken up the IRL guarantee – perhaps in order to signal to fellow users that they are indeed not ‘fakers’. User governance is supported by service providers, who are inspired by users’ behaviour, and user governance is in turn shaped by the tools the service providers make available. These are the conditions in today’s online environments and, importantly, some of the conditions under which children and young people live their everyday lives.
