Facebook sure is getting beaten up recently. There's even a crowd-funded initiative to replace it with something open, called Diaspora -- everyone on Facebook is talking about it.
Yet it wasn't even two full years ago that Facebook was the darling of the ditherati. For a while it seemed as if nearly everything Facebook did was hailed as the future of messaging, perhaps the future of the Internet -- or maybe the Internet didn't matter anymore, except for Facebook. Even obvious scams got VC funding, so long as they were on Facebook. But with just a few missteps -- which they appear to believe were nothing more than misunderstandings -- everything's changed.
The first tipping point, it seems, was Facebook founder Mark Zuckerberg's statement that he doesn't believe in privacy, with the obvious implication that he therefore doesn't have to concern himself with it. But as researcher danah boyd responded, privacy isn't just about whether or not you share things -- it's about whether you have any control over what you share, and who you've shared it with. And besides, it doesn't matter whether Zuckerberg believes in privacy. Facebook's users still do.
The problem Facebook has created for themselves is not that nobody wants to share information about themselves; it's quite obvious that there's a lot of sharing going on. It's that with each new feature, Facebook changes the social dynamic. Before, Facebook users felt and believed that they were sharing with their friends, and with particular networks they'd chosen; it was a closed environment, with borders that were clearly defined and understood. And if a few advertisers got to peek in, well, that was the price of admission. But now, after many changes, much of that same information is entirely public -- unless each user individually goes through a set of confusing privacy settings to opt out.
It's akin to sending email to a private mailing list, only to have it forwarded to a reporter and published. Sure, that's always possible, but we can either trust each other to abide by the social contract not to do such things, or we can't trust each other at all.
Facebook's staff may well have thought to themselves that if they asked users to opt in to having their data shared with a much broader audience, the users would decline -- and they were probably right. But by ignoring that insight and making it opt-out instead, they showed a severe lack of respect for their users. Without that respect, there can be no trust. Without that respect, your users will turn on you -- because they were never really "your" users to begin with.
Once trust is lost, what do you do? Can it ever be regained? Or will trustworthy behavior have to be forced upon them by regulators?
We at CAUCE have pondered this same question over the years in terms of companies that used to send spam, and have since learned not to. Some people will never forgive them, no matter what they do. Others won't see what the big deal is, because the spam never affected them personally. But most will fall somewhere in the middle, never quite trusting the company not to spam them again.
That middle area is the most Facebook can hope for at this point, and the way to gain it is to start viewing everything in terms of "What do users think is going on?" rather than "What do we want users to think is going on?" More than anything else, they have to ask themselves: "Are we being respectful towards our users? Are we allowing them the choice and control they believe they already have?"
It sounds like they're already thinking in this direction, or at least they want us to think they are -- but that doesn't mean users' perceptions will change overnight. Facebook won't be forgiven that easily, especially if their PR tactic is essentially "oh, you just didn't understand what a wonderful thing we're doing." They'll have to patiently explain their thinking in an honest way, keeping corporate doublespeak to a minimum -- and stay consistently respectful for a very long time.
Even then, success may not be measured by a decrease in angry blog postings. It also won't be measured by a decrease in the number of people deleting their accounts. Trust is much more nebulous than that. If anything, it'll be measured by whether anyone's willing to try new features when given the opportunity to opt in -- and that, too, could take a long time.
If they stick with it, and they're open and transparent about the change, then they could continue to be the largest and most successful proprietary social network that has ever existed... at least until some other tipping point occurs.