Foocamp 2010: lovely, expectant, reflective

This has been sitting in my edit queue for too long… here goes.

So, I was invited to Foocamp this year. Back in February, I attended KiwiFoo at the invitation of Nat Torkington. I realize now (sorry, Nat!) that I never blogged about going there. I have some rough notes around that I may try to edit down.

I’ll avoid any direct comparisons for a moment, and just talk about what it was like to dive into a slice of Silicon Valley, dislocated into the countryside for a weekend.

When I arrived, I felt immediately at home, welcomed and appreciated. Sara Winge was so helpful and easy-going about the Tesla coil Josh brought. (Thanks for the ride, Josh!) I spent much of the evening educating people about not touching the sparks (it may be a pretty toy, but it is in fact a dangerous one!) and playing around with a metal glove rigged up so you could get a bit closer to the coil.

I spent a couple of delightful hours considering the Claude glass with Roberta Cairney. Over and over, I absorbed the positive energy tumbling out of Sumana. Once again, I spoke with Kiwis whose humor and affectionate swearing reminded me of home.

I held one session – about forgetting: the ethics of shifting our culture to assume that everything may be on your permanent record, and the ways in which people try to opt out or game the system. I was surprised and overwhelmed by the turnout. Rita King, Scott Berkun, Biella Coleman and danah boyd were all part of the discussion, and I was able to talk about ideas that have been tumbling around in my head for years. I believe sysadmins and devops people must have conversations about the ethics of developers' default choices around configuration and the long-term management of log data. People asked provocative questions, and we had a real debate about the ethics. It was a wonderful experience, and I came away with at least one good idea that I hope Jesse Robbins and I are going to act on together.

I also ran into several people who are just starting to work on user group and community issues in their geographic areas. Seattle came up over and over – and I’m looking forward to helping Ben Huh and the open gov folks who want to do grassroots organizing in their tech communities. I also met some amazing women there, and hope that we’re able to continue our discussions about business and tech in the future.

I camped in a tent under the stars and the early-morning fog. I enjoyed running into a fellow PostgreSQL community member, Paul Ramsey, among the early risers.

Apart from the immediate things to collaborate on, and an incredibly long list of new ideas and connection points, I came away inspired and mentally refreshed. I relished the relative lack of device obsession. The people that I wanted to have conversations with tended to have put their devices away for a few hours, and were focusing on the people in front of them.

The Ignite talks were my favorites. Jake Appelbaum's meditation on WikiLeaks was particularly inspiring, reminding me to seek out opportunities to change the world for the better.

Forgetting: Logging as an ethical choice

I have kind of a weird idea for a database person.

Forgetting should be built into our applications by default. I just spent the weekend at FooCamp, and I held a session to discuss this idea, and some of the possible consequences if it were implemented.

To explain why I think this, I’m going to take an extreme stance for a moment and argue a position that I’d like to see rebutted. So, please have at it! :)

For too long we have allowed decisions made by developers – default application settings – to determine what ultimately becomes our level of surveillance.

There are notable counter examples: 4chan intentionally expires postings every few days. Riseup keeps no logs. The EFF documents what we do and do not legally need to keep. These, however, are the efforts of a tiny minority when considered against the rest of the web.
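As a sketch of what "default expire" could look like in code – the class, names, and three-day TTL here are my own illustration, not how 4chan or Riseup actually implement it – the key inversion is that keeping data forever requires an explicit choice, while forgetting is what happens when you do nothing:

```python
import time

class ExpiringStore:
    """A tiny key-value store that forgets by default.

    Entries vanish after `ttl_seconds` unless explicitly pinned --
    the opposite of the usual keep-everything default.
    """

    def __init__(self, ttl_seconds=3 * 24 * 3600):  # forget after ~3 days
        self.ttl = ttl_seconds
        self._data = {}  # key -> (value, expires_at or None)

    def put(self, key, value, keep_forever=False):
        # Keeping forever must be asked for; expiry is the default.
        expires = None if keep_forever else time.time() + self.ttl
        self._data[key] = (value, expires)

    def get(self, key):
        value, expires = self._data.get(key, (None, None))
        if expires is not None and time.time() > expires:
            del self._data[key]  # lazily forget on access
            return None
        return value
```

The design point is small but deliberate: the developer who wants indefinite retention has to write `keep_forever=True` somewhere visible, which turns a silent default into a reviewable decision.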

Over time, our conception of what is reasonable has changed around logging and accounting for vast periods of our activities. Never before would a silly recording made by a 15-year-old be stored indefinitely, and then documented as a watershed event because of how many times it was viewed on a vast global network, rather than for the content of the cultural artifact itself. The log of views itself became the cultural artifact, and it is celebrated.

Fading away isn't evil. But we act like it is when we pipe what was once ephemeral into archive.org's indefinite storage.

Why have we decided to participate in this social experiment? It really wasn’t a collective decision. Some software developers and investors decided that archival on a massive scale was important or profitable. We started calling these things “part of history” and just storing them without thinking about it. Saving became default.

I’m not saying that archiving the internet, search robots or “opting in” are bad things. But those who least understand archiving’s effect on personal privacy may be the ones most likely to suffer in the future.

The ripple effects of the decision to move from “default expire” to “default save” are vast. Consider for a moment if we were to call the ability to intentionally forget on the internet a human right.

Instead, what we’ve done is to say to millions of people – you do not have the right to forget. Companies will take your locations and status updates, and never delete them. And privacy is rapidly becoming a privilege of those who can afford to buy it.

For the sake of argument, consider the difference between narrative historical documentation and collections of “facts.” The narrative is an aggregation, full of embellishments and forgetting and kernels of truth. Facts are collected, supposedly objectively. Both approaches to capturing historical thought suffer from the fallacy that historical “fact” is fixed and doesn’t evolve based on the viewer and reteller over time. How much worse is this effect when our collections of facts are now ballooning to include every blog post, photo, tweet and web access log you’ve ever made?

The point is not that individuals wish to change history or even obscure events which may reflect poorly on them. (Even though we all do!)

We need to give people a real choice – not a set of ACLs and rules. Choice about what is archived about them, control over that process and a clear delineation between personal artifact and public property.

Kathy Sierra deleted her Twitter stream and was accused of removing a piece of history – and of possibly the worst internet offense: taking away conversations. But taken at face value, isn't that the point of conversation? That it is ephemeral?

Conversations leave echoes in changed thoughts, and light or deep impressions in the minds of the participants. Just because Twitter has chosen by default to retain these conversations indefinitely doesn't change the nature of conversation itself. No one would argue that just because we share our thoughts, we are obligated to share every thought.

In the same way, we are not obligated to maintain a record of our sharing. And if we do maintain and share a record of our own end of a conversation, we still have the right to ultimately destroy it.

Once shared, of course, an artifact of a conversation can’t be taken away from those that have copies. But authors and owners of the original work must always retain the right to destroy.

So, that brings me to what is ethical in our applications. When we say: “we’re keeping your data forever” and “delete means your account will still be here when you come back”, application developers and companies are making an ethical choice. They are saying, “your shared thoughts aren’t your own – to remember or forget. We are going to remember all of this for you, and you no longer have the right to remove them.”

Connectedness is not the same as openness. Storing vast logs of data related to individuals which connect thousands of facts over the course of their lives should be presented as the ethical choice it is, rather than a technical choice about “defaults”. Picking what we decide to log and store is an ethical and political decision. And it should also be possible for it to be a personal decision.
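One way to surface that decision, as a sketch – the names here are hypothetical, not any real framework's API – is to make retention a required, readable declaration rather than a buried default:

```python
from dataclasses import dataclass

@dataclass
class RetentionPolicy:
    """Makes the ethical choice explicit: every field must be filled in."""
    what: str              # which data this covers, e.g. "access_logs"
    keep_days: int         # "forever" has to be spelled out as a number
    user_can_delete: bool  # does delete actually destroy the record?

# The application declares its choices where anyone can read them,
# instead of hiding them in scattered configuration defaults.
POLICIES = [
    RetentionPolicy(what="access_logs", keep_days=30, user_can_delete=False),
    RetentionPolicy(what="status_updates", keep_days=365, user_can_delete=True),
]
```

Because `dataclass` fields without defaults are required, a developer cannot construct a policy without answering the retention and deletion questions – which is exactly the shift from technical default to visible, debatable choice.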