In my last post I took a look at what permissions granularity is and how it might affect user behaviour. The short version of that post's conclusion: if permissions granularity is not transparent - easy to understand and easy to use - most people will fall back on whatever the site defaults are. Of course, the incentive to use restrictions in the first place depends on understanding that 1. the stuff you are putting out is searchable and accessible to the general public, and 2. there are people in that category you don't necessarily want to see your stuff.
I remember an audience member at a conference I attended last year who was outraged that a potential employer might Google her and then base a judgment about her on her personal activity. And I've seen school kids squirm in horror as their Bebo and YouTube pages were looked at by teachers and parents. It's increasingly common for recruiters, universities and other authoritative gatekeepers to use public social network information to fill in candidates' 'other interests': goodbye fervent interest in hang gliding and Byzantine pottery; hello getting drunk and pinching road signs.
It also seems fair to say that a large number of people depend on fairly flimsy strategies to avoid managing their data (or having to work out any permissions granularity). These include counting on the fact that their name is a fairly common one, simply playing the odds in the face of the sheer amount of information everyone else is putting out, and imagining that their social networking service is one that no one they feel uncomfortable with would possibly use.
Way back in 2002, Katz and Rice described the internet as a panopticon. Those of you who've flirted with Foucault or are interested in architecture will remember that the key characteristic of Bentham's prison design is that people keep themselves in line, because the possibility of being observed is always present. The panopticon encouraged self-policing since inmates were aware they could be seen (and subsequently punished) at all times. While web 2.0 community sites have no realistic alternative to encouraging self-regulation through a participatory panopticism, the internet has not turned out to be a hotbed of self-denial and careful self-regulation. One of the conclusions of Pew's Digital Footprints report in December 2007 was that “Most internet users are not concerned about the amount of information available about them online, and most do not take steps to limit that information”.
Partly this can be attributed to the charmed circle people believe themselves to be positioned in - the imaginary frameworks of space and place that allow for the fun interchange of information, the subjective psychogeographic environment alluded to in my title.
There's a gap in perception between what many users believe to be the context and audience that they are writing for – a closed group of friends – and the numbers of people actually able to view their information. Many users are unaware that the information they have posted may be publicly available, and able to be searched for and read by a much wider audience than their group of friends. Acquisti and Gross (2006) characterise social networking services as "imagined communities" in recognition of the gap between users' perceptions of a private, closed network and the reality of who can access their information.
Additionally, there is the issue of time. Embarrassing or inappropriate stuff may still be around in a few years' time. We don't yet know the full consequences of a generation which has grown up online, or the future implications of new types of search - for example social search, which aggregates information from across a range of social networking sites by your name or email address, or of the development of facial recognition search software.
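To make the aggregation point concrete, here's a minimal sketch of the basic move behind social search - collecting publicly visible profile fragments and joining them on an email address. The service names and data are hypothetical, and this isn't any particular search tool's implementation:

```python
from collections import defaultdict

# Hypothetical sample of publicly visible profile fragments from
# several (invented) social networking services. A real social search
# engine would gather these by crawling or via public APIs.
public_profiles = [
    {"service": "ExampleBook", "email": "a.person@example.com",
     "name": "A. Person", "interests": ["hang gliding"]},
    {"service": "PhotoShare", "email": "a.person@example.com",
     "name": "A Person", "photos_tagged": 214},
    {"service": "MicroBlog", "email": "a.person@example.com",
     "name": "aperson83", "recent_post": "great night out"},
]


def aggregate_by_email(profiles):
    """Group profile fragments by email address, building one combined
    picture of a person from what each service exposes publicly."""
    combined = defaultdict(list)
    for profile in profiles:
        combined[profile["email"]].append(profile)
    return combined


if __name__ == "__main__":
    for email, fragments in aggregate_by_email(public_profiles).items():
        print(email)
        for fragment in fragments:
            details = {k: v for k, v in fragment.items()
                       if k not in ("service", "email")}
            print("  ", fragment["service"], "->", details)
```

The point of the sketch is how little the join requires: a shared identifier (name or email) is enough to pull activity out of its separate silos and into a single, searchable record.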
I've been working quite a bit around e-safety and digital literacy, so my thinking in this area is largely around presence issues - not just how we keep ourselves safe online but also how our online activity represents us to the rest of the online world. It's becoming increasingly easy to track people's unprotected conversations, and the rise of social search pretty much demolishes any illusory protection that acting within a silo might offer. The current tidal wave of lifestream apps further puts paid to the notion of the public internet being a series of discrete islands.
I agree wholeheartedly with the argument that any good service should ensure members can get all of their data out both easily and meaningfully (i.e. in some useful format that can be recognised and repurposed by other tools and services). However – we also need to recognise that a lot of people who use the web don't care about data portability. In fact, some of them even use services precisely because they seem closed and hard to get information out of, and when they do stumble across their data outside of its original context, it sometimes comes as a shock to them.
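As an illustration of what a 'meaningful' export might look like, here's a small sketch of a service handing a member their posts as structured JSON rather than a flat page dump. The data and function names are my own assumptions, not any particular service's API:

```python
import json
from datetime import datetime, timezone

# Hypothetical internal representation of a member's posts; a real
# service would pull these from its own datastore.
posts = [
    {"title": "Permissions granularity", "body": "Post text here",
     "created": datetime(2008, 3, 12, tzinfo=timezone.utc)},
    {"title": "Psychogeography of the web", "body": "Post text here",
     "created": datetime(2008, 4, 15, tzinfo=timezone.utc)},
]


def export_posts(posts):
    """Serialise posts to JSON with ISO 8601 dates, so other tools can
    recognise and repurpose the data without screen-scraping."""
    portable = [
        {"title": p["title"], "body": p["body"],
         "created": p["created"].isoformat()}
        for p in posts
    ]
    return json.dumps(portable, indent=2)


if __name__ == "__main__":
    print(export_posts(posts))
```

The design choice that matters here is the open, self-describing format: once the data is in something like this, any other tool can pick it up - which is exactly the portability most users never ask for, and occasionally get surprised by.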
And recontextualisation isn't just about taking information from one place and replanting it in another - it can be about someone from outside of the charmed circle you imagine yourself addressing reading your stuff. This doesn't mean that we shouldn't be pressing hard to open up services – it means we need to be mindful of the importance of context, and the value of closedness/closeness, to people using services.
Fascinating post Josie. I love the internet as panopticon idea. I also think you're right that it's all about context. At the moment the tools we have to define who can see what are pretty basic - these will inevitably become much more sophisticated as time goes on, as will people's natural navigation around these issues.
Social norms will emerge as time goes on - I guess we're the guinea pigs at the moment! It's fascinating to see it all panning out in front of our eyes, especially right now with Twitter.
Education has a huge role to play too, as you say. My question is: who's going to do it?
Posted by: Antonio Gould | 16 April 2008 at 13:11
Love the way you think, Josie. I'm wondering, as the amount of information (refined and "spontaneous") that emerges about individuals increases, does the threshold at which a particular piece of information about someone is considered "salacious" change? In other words, are we getting more tolerant of deviance (pinching road signs) amongst our associates, as we have evidence of such deviance by more and more of them?
Posted by: Craig A. Cunningham | 24 April 2008 at 18:39
Great question Craig. I think that things will carry on pretty much as they are: if you are my friend, and in most other ways sane, I'd probably be prepared to overlook some salacious behaviour - particularly if it didn't impact on me or my ethics directly. If I was interviewing you for a job in the police force, I'd take it very differently. There is some evidence to suggest a degree of context being taken into consideration - i.e. marginally acceptable or socially dubious behaviour being regarded as more excusable in young people than in older people. If you're asking whether the online discussion of this kind of activity waters down the general ethical/moral standards of the online population, I'd have to say it's far more complex than that, and further clouded by the cyclical, generational change which is recurrently mystifying or horrifying to an older generation (think of the outrage surrounding the behaviour of flappers in the 1920s, for one of many examples).
Posted by: Josie Fraser | 24 April 2008 at 22:21
To answer your question Antonio - I can only really talk within the UK policy context, but digital literacy needs to be policy led and financially supported. It needs to be on the national agenda in a way that it isn't at the moment. Right now, digital literacy is addressed in a disparate range of generally piecemeal and poorly funded and supported initiatives. It needs to be ON a national agenda - education & skills, industry and culture, and the Home Office are all ministries which could do with thinking about what we might mean by digital literacy and how we are going to support it nationally. Sure, it's an important issue for institutions, teachers and parents, but they need support in contributing to the national discussion. Unfortunately we don't seem to have focused leadership in this area.
Posted by: Josie Fraser | 24 April 2008 at 22:31
Interesting post, Josie. I'm speaking as someone who definitely doesn't fit into the category of hoping I can hide with a common name!
In some ways, though, I see this ability to get information as a way of going back a few generations, when typically people didn't move far from an area, and everyone knew their neighbours' business. The difference is that now that "village" is so much bigger; and whereas in a village you were likely to know who was a part of it (and who was the biggest gossip, hence who to hide from when possible), now it's just not possible to know who is looking at you.
We've gone from openness, to the privacy of the net curtains, to openness behind the net curtains.
And we have to learn how to manage that.
Posted by: Emma | 27 April 2008 at 18:11