Next month the OECD is holding its annual ministerial meeting in Seoul on The Future of the Internet Economy. I've been invited to speak on "Trust in Online Communities" at the meeting. My extended abstract and slides are below — any comments are welcome, and might even make it into my presentation!
Trust in online communities
Social networking sites such as Facebook and MySpace have attracted huge numbers of Internet users over the last five years. Dutton and Helsper (2007) recently found that 42% of all UK students had created a social networking profile, while Datamonitor found in August 2007 that North Americans spent almost 450m hours in total visiting these sites each month.
Other types of online communities are equally popular. Massively Multiplayer Online Role Playing Games (MMORPGs) like World of Warcraft and virtual worlds such as Second Life collectively boast tens of millions of active users. Many more specialised communities also exist online, linking for example physicians or those coping with diseases such as breast cancer or HIV (Bray, Croxson, Dutton & Konsynski, 2008).
What public policy interests do online communities implicate? Most generally, they provide a mechanism by which citizens can build social capital by strengthening their networks of emotional and practical support. They also present a means to reduce geographical barriers to participation in society, providing rural users with new means to interact and transact with those in more urban areas – particularly useful in countries such as the UK where urban/rural divides in broadband connectivity are now being overcome (OFCOM, 2008).
Recent social psychology research suggests a number of ways in which community designers can increase the production of these social goods by improving user trust in sites and in other community members. Green (2007) suggested the provision of non-verbal cues, links to mutual acquaintances and mechanisms to allow users to verify information. Riegelsberger, Sasse and McCarthy (2007) advised that sites should aim to increase users’ temporal, social and institutional embeddedness. These features can already be seen in social networking sites, such as Facebook’s lists of shared friends, institutional networks and links to external websites.
Reputation measures are also popular within online communities. These show how extensively community members have interacted with each other and with users’ own friends, sometimes with explicit ratings from peers. They can also identify users with similar activities and interests, a key predictor of trust (Jensen, Davies & Farnham, 2002).
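As an illustration only (this is a hypothetical sketch, not any site's actual algorithm), the two trust signals just described, mutual acquaintances and overlapping interests, could be combined into a single score using Jaccard set overlap; the function names and the 0.6/0.4 weights are my own assumptions:

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two sets, from 0.0 (disjoint) to 1.0 (identical)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def trust_signals(user: dict, other: dict) -> float:
    """Combine mutual-friend and shared-interest overlap into one score.

    `user` and `other` are dicts with 'friends' and 'interests' sets;
    the 0.6/0.4 weights are arbitrary, chosen purely for illustration.
    """
    mutual = jaccard(user["friends"], other["friends"])
    shared = jaccard(user["interests"], other["interests"])
    return 0.6 * mutual + 0.4 * shared

alice = {"friends": {"bob", "carol"}, "interests": {"photography", "hiking"}}
dave = {"friends": {"carol", "erin"}, "interests": {"hiking", "chess"}}
print(trust_signals(alice, dave))  # one shared friend and one shared interest
```

Real systems would of course weight explicit peer ratings and interaction histories as well, but even this toy version shows why such scores can act as a proxy for the similarity that Jensen et al. identify as a predictor of trust.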
Lacohee, Phippen and Furnell (2006) found that education, openness and tools for experimentation increase trust. Users need to feel in control, and be provided with restitution when problems occur.
Because it is difficult to reverse harms caused by the revelation of private information, community members need to be supported in setting appropriate limits on access to their personal data. Their trust can be seriously damaged by unintended releases of this information (Adams & Sasse, 2001). Facebook has seen a number of these incidents due to over-permissive sharing defaults (enabling Oxford University, for example, to fine students whose photographs of friends taking part in banned exam celebrations had been shared more widely than intended). Facebook's insecure application platform also allows unauthorised access to user data (Brown, Edwards & Marsden, 2007). The crude binary nature of “friend” relationships in social networks does not match the subtle manner in which individuals share different aspects of their lives with partners, friends, family members, managers, staff, acquaintances and strangers (Boyd, 2004).
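To make the final point concrete, here is a minimal sketch (hypothetical, not modelled on any existing site's code) of how named audience tiers could replace the binary friend flag, so each item of personal data is limited to the circle it was meant for:

```python
# Hypothetical audience tiers, ordered from widest to most private.
AUDIENCES = ["public", "acquaintances", "friends", "family", "only_me"]

def can_view(item_audience: str, viewer_tier: str) -> bool:
    """A viewer sees an item only if their relationship to the owner is at
    least as close as the audience the item was shared with."""
    return AUDIENCES.index(viewer_tier) >= AUDIENCES.index(item_audience)

photo = {"caption": "exam celebrations", "audience": "friends"}
print(can_view(photo["audience"], "family"))         # closer tier: allowed
print(can_view(photo["audience"], "acquaintances"))  # wider tier: denied
```

A binary model collapses this ordering into a single yes/no, which is exactly why photographs intended for close friends can end up visible to university authorities.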
Governments have two general policy levers at their disposal to encourage the development of trustworthy online communities. They can use competition law to drive up the quality of community sites – for example, by mandating interoperability for dominant players to reduce the winner-takes-all nature of markets based on networks of individuals. Privacy law can be used to enforce users’ control over their personal information, particularly in jurisdictions that have implemented the OECD's 1980 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data.
Given the popularity of online communities with teenagers and younger children, governments have also come under pressure to intervene to reduce potential harms to young users. A recent review for the UK Department for Children, Schools and Families suggested that community sites targeted at young people should be encouraged to subscribe to a code of conduct that includes commitments to provide safety advice to users and to promptly take down “harmful and inappropriate” content when notified of its presence (Byron, 2008). Byron also suggested that children be encouraged to talk with their friends, siblings and parents about e-safety; that parents install filtering software on home PCs; and that Internet Service Providers be required to block access to illegal content.
Adams A & Sasse MA (2001) Privacy in Multimedia Communications: Protecting Users, Not Just Data. In Blandford A, Vanderdonkt J & Gray P (Eds.) People and Computers XV - Interaction without frontiers. Joint Proceedings of HCI2001 and ICM2001, Lille, Sept. 2001. pp. 49-64
Boyd DM (2004) Friendster and publicly articulated social networking. Computer-Human Interaction ‘04, pp. 1279-1282
Bray, DA, Croxson K, Dutton WH & Konsynski B (2008) Sermo: A Community-Based, Knowledge Ecosystem. Distributed Problem-Solving Networks Conference, Oxford. Available from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1016483 (last accessed 30 May 2008)
Brown I, Edwards L & Marsden C (2007) Stalking 2.0: privacy protection in a leading Social Networking Site. GikII 2 — law, technology and popular culture, London. Available from http://www.law.ed.ac.uk/ahrc/gikii/docs2/edwards.pdf (last accessed 30 May 2008)
Byron T (2008) Safer Children in a Digital World (Department for Children, Schools and Families) p. 64
Dutton WH & Helsper E (2007) Oxford Internet Survey 2007 Report: The Internet in Britain (Oxford Internet Institute) p. 52
Green MC (2007) Trust and social interaction on the Internet. In Joinson et al. (eds) The Oxford Handbook of Internet Psychology (Oxford University Press) pp. 43-52
Jensen C, Davies J & Farnham S (2002) Finding Others Online: Reputation Systems for Social Online Spaces. Computer-Human Interaction ‘02, 4(1) p. 449
Lacohee H, Phippen AD & Furnell SM (2006) Risk and Restitution: Assessing how users establish online trust. Computers & Security 25(7) pp. 486-493
OFCOM (2008) The Nations & Regions Communications Market. Available from http://www.ofcom.org.uk/research/cm/cmrnr08/ (last accessed 30 May 2008)
Riegelsberger J, Sasse MA & McCarthy JD (2007) Trust in mediated interactions. In Joinson et al. (eds) The Oxford Handbook of Internet Psychology (Oxford University Press) pp. 53-70