Facebook makes it official: You have no say


Late on Wednesday, just as Americans were taking off for the Thanksgiving holiday, Facebook announced its intention to change the feedback process for the policies which govern use of its service.

For the last few years, as I mentioned in Wired a few months ago, Facebook has held sham elections in which people could ostensibly vote on its policy changes. Despite substantial participation (the most recent Site Governance vote drew far more participants than the secession petitions on the White House website drew signatures), Facebook never promoted these policy discussions to users, and public feedback has never had a substantive impact on site governance.

Now, Facebook is following the path of most tyrants, quietly moving from sham elections to an official policy that gives users no vote at all in site governance.

Intentions

I’d like to give Facebook the benefit of the doubt on this change to site governance, since the company pinkie-swears that it will listen to users now. Facebook has even offered up its well-intentioned Chief Privacy Officer, Erin Egan, to lead conversations designed to engage the public on site policies:

As a result of this review, we are proposing to restructure our site governance process. We deeply value the feedback we receive from you during our comment period. In the past, your substantive feedback has led to changes to the proposals we made. However, we found that the voting mechanism, which is triggered by a specific number of comments, actually resulted in a system that incentivized the quantity of comments over their quality. Therefore, we’re proposing to end the voting component of the process in favor of a system that leads to more meaningful feedback and engagement.

But if Facebook believed in this move, and thought users would embrace it as positive, it wouldn’t have made the announcement late on the day before Thanksgiving, with a deadline for responses just a few days later. No matter how earnestly Egan wants to hear from the public, this effort is structured in a way that makes public feedback on site governance almost inevitably futile.

Copy and Paste Panic

Though this policy change attracted very little attention, as was clearly intended by its release just before a holiday weekend, a separate panic about Facebook’s terms of service reared its head in the last few days. A copy-and-paste meme inspired thousands of Facebook users to post a message to their profiles asserting copyright over their content, with explicit demands that Facebook not exploit their posted data; the perceived threat was heightened by Facebook’s recent debut as a publicly traded company.

Facebook offered a terse refutation of the meme, explaining correctly that users retain copyright on their works by default and thus have no need to post such a declaration. (Especially since posting this message to a Facebook wall would be ineffective for that purpose anyway.)

But Facebook’s one-paragraph response to hundreds of thousands, perhaps millions, of users expressing concern about their personal data and content shows exactly why its site governance process is unacceptable.

A brief “fact check” about the site’s copyright policy assumes that correcting the factual error in the premise of these postings solves the issue being raised. One can almost hear the condescension behind Facebook’s response (“they spelled ‘Berne Convention’ wrong!”), but there’s a glaring absence of any effort to address the emotional motivation behind so many users crying out. Facebook sees a large-scale user protest as a problem to be solved, rather than as an opportunity to serve its community. As a result, it offers a curt legal and technical dismissal of users’ concerns over their content, rather than empowering them with simple, clear controls over the way their information is used.

What Facebook Could Do

Facebook and its apologists will say, “but we already have good privacy controls!” and will point to their settings page, which, to their credit, has been admirably simplified.

Now imagine if, instead of posting a “fact check”, Facebook had responded to the rapidly spreading cries for control over intellectual property on the site by leading and guiding its community in a way that was better for users and better for the web.

The same brief explanation that users retain copyright on their content could be followed by two simple controls, the first reiterating the site’s existing privacy settings:

[Screenshot: Facebook’s existing privacy controls]

And then a second control (this is just a quick, silly mockup I made) could default to the existing rights and protections, but offer a simple interface for applying a Creative Commons or similar license to shared content.

[Mockup: a proposed content-rights control with Creative Commons licensing options]

“But wait!” you cry. “Isn’t this much more complicated? Isn’t it bad UX to force a choice on a user?” To which I reply: not when a desire for control is exactly what users are expressing.

Because the emotional underpinning of the hue and cry over copyright and permissions on Facebook isn’t some newly discovered, weird mass fixation on intellectual property rights. It’s a simple expression of distrust: Facebook’s status as a publicly traded company makes users feel that the service answers to shareholders seeking to maximize value, not to their preferences on privacy and permissions.

Think about the feelings behind an ordinary Facebook user updating their status to say, in part, “By the present communiqué, I notify Facebook that it is strictly forbidden to disclose, copy, distribute, disseminate, or take any other action against me on the basis of this profile and/or its contents”. They’re expressing the fear that Facebook is going to disclose their personal thoughts, and exploit them for commercial gain.

You don’t solve that level of concern by offering occasional web chats with a Chief Privacy Officer.

Being Of Service

Facebook needs to change its culture to one where it’s determined to be of service to users who are worried, even if those users have some misunderstandings about technical details or esoteric legal concepts. Because the fundamental issues of trust and vulnerability are legitimate, and users deserve to have them addressed.

There’s also a huge long-term liability for Facebook if these issues of trust aren’t addressed. Companies face the wrath of regulators and the vagaries of policy changes not just because of lobbyists and wrangling in the corridors of government, but often because ordinary people have a gut sense that these huge corporations aren’t working in their interests. That sentiment can express itself in a million different ways, all of which slow down a company’s innovation, limit its opportunities to reach new audiences, and eventually cripple its relevance in the market. Microsoft should offer Facebook a sobering example of a company that, in addition to breaking the law (which Facebook seems on course to do within a few years, given its constantly shifting policies), had separately earned such mistrust and animosity from the industry and from users that decisive legal action against it was all but inescapable.

When Facebook went public, Mark Zuckerberg wrote a great letter to investors, which began with a simple statement:

Facebook was not originally created to be a company. It was built to accomplish a social mission — to make the world more open and connected.

We hope to strengthen how people relate to each other.

When Mark says that, I believe him. I sincerely think that’s what he intends to do, and what he hopes his employees will do. But from the micro-level decisions, where a panic over content rights is handled in a perfunctory, dismissive way, to the macro level, where the fundamental negotiation with users over their empowerment as the lifeblood of the service has been abandoned, Facebook has made it obvious that it’s not culturally ready to meet the mission Mark has laid before it.

It’s not too late to change, but Facebook has an obligation to truly embrace empathy for its users. Right now, it’s sneaking out warnings in the dark of night that things are going to get worse before they get better.