So close, yet so far

I saw the above in a parking lot near Uluru.
And here's Uluru.
And here's a different Uluru.
These are the dog days of summer (northern hemisphere). This post has nothing to do with philanthropy, other than that I took the top two photos while I was in Australia, working on digital governance in philanthropy. I'll have more to write on what I learned when I stop procrastinating by looking at photos.

Data, ethics, psychology, and the census


I've been in Australia for several weeks, leading workshops on digital civil society and data governance in nonprofits, meeting with philanthropists, corporate leaders, and various government bodies, and searching for potential scholarly collaborators. I'll be writing a reflection piece on this work soon. 

But I've mostly been thinking about the census. In December 2015 the Australian department that manages the census announced it would be collecting and storing real names with the census data. Penalties would be levied against anyone who didn't file a form or who used a false name. The Christmas Eve announcement went largely unnoticed at the time, but the (mandatory) census date of August 9 brought this issue back to everyone's attention during my time in the country.

At dinner one night in Sydney I sat next to a woman who was telling me how she, always law-abiding and even professionally dependent on the census findings, found herself contemplating obfuscation as she reviewed the form. The fact that her name would be attached and stored shed a new light on the questions being asked about religion, income, and family structure. 

Filling out the census is mandatory, everyone is set to file on a single day, and the push is to get most Australians to file online. There are immediate penalties that accrue daily for not filing or for omitting your name (making lying on the form or boycotting it altogether another example of privacy becoming a luxury item). The more I imagined myself facing down such choices the more the psychological tradeoffs bounced about in my head.

So I wonder, will the new census approach reveal that Australia is now home to a million "Mickey Mouses," a million followers of the R2D2 faith, or several million people who simply make up all kinds of information about themselves?

I wonder – in addition to considering the utility, the ethics, and the security issues of attaching names to census data - did the folks at the Australian Bureau of Statistics consider the psychological calculations inevitably being made by those filling out the forms? By compelling the citizens to file and name themselves has the government created a situation in which a last grasp for privacy (and dignity) outweighs a civic obligation for accuracy? 

How do our psychological needs, our civic responsibilities, our political attitudes toward government (or nonprofits or corporations) interact with the digital demands for information we face constantly? Will digital data demands make liars of us all? How does your nonprofit or foundation take these variables into account when you ask people for their data?


First Facebook ate the news, are nonprofits next?


(Photo: http://www.brookings.edu/blogs/fixgov/posts/2015/05/12-improving-media-capacity-series-intro-kamarck)

The news business has been through quite a bit over the last two decades. A year or so ago, Facebook made itself into a key distribution channel for magazine, newspaper, and broadcast outlets - creating tools like Instant Articles, inking deals with news organizations to use Facebook Live, and jiggering its news feed to favor major outlets. Nowadays, news organizations pretty much depend on Facebook to get their stories in front of readers.

So it's a big deal when Facebook then turns around and says, never mind, we're changing our algorithms (again) so Facebook users see less news content and more of their friends' stuff.

By now, we should all realize - it's their platform, their algorithm, their rules. News organizations know this, but Facebook's reach is so great they clearly decided they needed to play in the company's sandbox, regardless of the rules.

Nonprofits shouldn't make the same mistake.

The same week that Facebook announced it was pulling the rug out from under news companies' content it announced it was luring nonprofits to the platform to do their fundraising. Again.

Just as it promised the news companies, Facebook's pitch to nonprofits is about scale. Facebook may have 1+ billion users, but that doesn't mean they're all going to suddenly care about your organization.

Shifting your fundraising over to the platform may get you a few dollars in the short term. It may make it easier for an especially eager volunteer to run a fundraising event for you.

But be wary. If someone came to you and offered: "Why not hold all your events in our house, we'll manage all your invitations, process all the gifts, follow up with everyone who attends" you'd ask yourself, "What's in it for them?" Ask yourself the same question of Facebook (or any tech platform that you don't control). The answer is easy - they get all of the data on who, what, when, and how much. They own your fundraising data. And if they decide to change the rules on how their tools work (or close the doors of the metaphorical house) they can. If history is any guide, they will.

Nonprofits give up a bit of their independence, a bit of their donors' and constituents' privacy, and a lot of control with these arrangements. It may raise a little money in the short term. But in the long term it sells out the sector. Just ask your local newspaper publisher. If you still have one.


A reformation of digital and democracy literacy led by civil society

Once upon a time, the codes that guided society were the province of a few. The word of God was read and interpreted by priests and men of the church who told the people what the book said, what the codes for a good life were. And the people did as they were told.


Then more people learned to read, printing technology changed, and still more people learned to read. Violations of power by those who had controlled the code were exposed, and religious reformation was called for. New technologies and more public interrogation of the "code" began, and many took over what had been the tightly-held purview of a few.
(The above is a deliberately oversimplified analysis of 15-18th century western European historical canon, minus all the power struggles, racism, sexism, extra-Christian turmoil, and colonialism. I'm trying to make a point.)

Once upon a time, software code drove devices used mostly by those who could read and write software code. Then these devices and the networks they powered were opened to most (not quite all). And all became dependent on software-powered gadgetry and digital networks. But the code remained the province of a few, even as some led movements for open source, open data, open algorithms, open governance. Still, the many had no understanding of the nature of the code, its limitations or bounds.

As this code and its disciples brought their tools, which were designed around the efficiency of market forces, into other realms of life - the household, political systems, civil society - there was tension. The value of efficiency, coded into the software, didn't always fit smoothly with the values of the household (privacy), of governing systems (participation and representation), or of civil society (justice, equity, beauty).

And so there was a clash of values, a clash of codes. And the priests of software code and the priests of governance found themselves at odds. And the people - to whom open data was given - were not equipped to use it.

And some of the nonprofits and foundations and associations that constitute civil society interrogated some of the software code. Over and over again they pointed out ways in which the code was misaligned with the task to which it was being applied. Examples of racial discrimination. Of algorithmic bias. Of new divides and new versions of exclusionary practice. Civil society served one of its most important functions - checking and re-checking the power of governments and markets (and their digital tools).

Some of the answer lay in applying existing democratic process technologies - such as due process - to applications of algorithmic decision making. And this was good.

Still, a more fundamental structural divide remained. Call it a linguistic divide: between those "fluent" in digital and those "fluent" in democracy and civil society. A reformation in access and capacity and understanding was needed.

And this is where we are today. 

Codes - software and legal - are sets of values, written and enforced. When digital technologies are used within democratic systems or for democratic purposes, the values embodied in the two kinds of codes need to align. Each needs to be able to interrogate the other, and the people need to understand what is meant, what is captured in the code, and what is being promoted or enforced by the collective set of rules and tools.

The common term for helping individuals, non-techies, understand digital data and systems, codes and algorithms, is data literacy. This is not my favorite term, but let's use it for now, recognizing that it is not a one-way street. People need to better understand how digital systems and codes work, and software coders need to understand the priorities and principles of democratic practice - both "literacies" are needed.

All involved - not just the priests with the books (not just the software coders) but the people, our agents, our elected officials, our judges - need to be able to understand the code.

Algorithmic accountability, open data, and machine learning must be designed by and with those who understand the principles of democratic governance. It is the job of those who understand these systems to teach those who understand algorithms, and vice versa.

In other words, digital technologists and tacticians must teach and learn from democratic theorists and tacticians. All who are citizens, all in civil society, all in public agencies - these are the democracy tacticians I'm talking about.

We need both codes - the codes of democracy and software code - to be written, used, held accountable, and procedurally applied and interrogated together.

One does not have the solutions for the other; they must build solutions together - if for no other reason than that we (in democratic societies at least) are all dependent on both democracy and software. There is no "they," we are we.

Thoughts on democracy from the U.S. capital

I've been thinking...

Suing news outlets with which you don't agree is not philanthropy. Wealthy individuals litigating an agenda by themselves (and secretly) is different from "impact litigation" led by public interest groups (even when financed by a few individuals). (See below * on associational power.)

The arc of platform consolidation built on the back of personal data that has contributed to the collapse of independent journalism is a story line we may see repeated in the nonprofit sector writ large.

Community-governed, small, independent associations - which de Tocqueville noted as core to American democracy - are threatened by homogenizing pushes for scale, efficiency, short-term metrics, and earned revenue.

These associations are key to what scholars call social capital, political wonks call civic engagement, and neighbors recognize as community. We overlook these roles of nonprofits and associations at our peril.

They are bulwarks against both economic and political monoculturalism. Otherwise known as inequality and tyranny.

Associations fill this role in at least two ways. First, they provide support for a diversity of views.
* Second, their governance structure is intended to involve multiple people as a form of public accountability and a mechanism by which power can be scrutinized. Toward this end, associations face transparency and public reporting requirements (which sit in tension with anonymity). We're fooling ourselves if we think concentrated wealth or power is any less threatening in a nonprofit or philanthropic guise.

Pluralism requires a diversity of options, in associational life and digital space, with distributed governance.

There is no independent sector in digital space.

Creative Commons, Wikipedia, Mozilla, the Electronic Frontier Foundation, and the Internet Archive are our first models of civil society organizations purpose-built for the digital age. We all manage digital resources now. We need new institutional forms.

We need local, community-led associations - distributed, fragmented, pluralistic, and contentious - equipped to help us dedicate our private resources - time, money, and data - to public benefit.