Recap: Fairness, Accountability and Transparency Conference 2018

The first step is recognizing that tech organizations have a problem. The second is realizing that tech solutions won’t solve it.

Ryan M Harrison
Apr 12, 2018 · 3 min read
Image: geralt, Creative Commons

On 23/24 February 2018, roughly 200 people (about three-quarters from academia, a quarter from industry, plus a handful from government) convened for two rainy days at NYU Law to discuss how black-box algorithms and powerful, unaccountable tech companies have broken the Internet. The program and session recordings are available online.

Stand-out talks

We have a problem

While the first step is recognizing you have a problem — and indeed mass-affluent wage-slaves representing tech giants such as Google and Microsoft were in attendance — the subsequent steps will be significantly more difficult.

Technological solutions alone will not solve the problems we face with the lack of fairness, accountability and transparency; rather, technology organisations themselves must change.

The conference highlighted that we are at a crossroads: who will benefit, and who will suffer, from how technology is harnessed?

Solutions abound

The solutions themselves are straightforward, and several were called out at the conference. They are questions of ethics, governance, ownership and distribution.

Ethics

  • Independent ethical review boards
  • Differential impact assessments, to ensure technology does not cause differential harm to marginalised groups (a rough sketch of one such check follows below)
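
On the second point, here is a minimal, purely illustrative sketch (not something presented at the conference) of one quantitative input a differential impact assessment might use: comparing a model’s error rate across demographic groups and flagging groups that fare markedly worse. The function names, the sample data and the 1.25x tolerance are all hypothetical.

    from collections import defaultdict

    def error_rates_by_group(records):
        """records: iterable of (group, prediction, actual) tuples."""
        errors, totals = defaultdict(int), defaultdict(int)
        for group, predicted, actual in records:
            totals[group] += 1
            if predicted != actual:
                errors[group] += 1
        return {g: errors[g] / totals[g] for g in totals}

    def flag_differential_harm(rates, tolerance=1.25):
        """Flag groups whose error rate exceeds the best-off group's by more than `tolerance` times."""
        baseline = min(rates.values())
        return [g for g, r in rates.items()
                if (baseline > 0 and r / baseline > tolerance)
                or (baseline == 0 and r > 0)]

    # Hypothetical audit data: (group, model prediction, actual outcome)
    sample = [("A", 1, 1), ("A", 0, 0), ("A", 1, 0),
              ("B", 1, 0), ("B", 0, 1), ("B", 1, 1)]
    rates = error_rates_by_group(sample)
    print(rates)                          # {'A': 0.33..., 'B': 0.67...}
    print(flag_differential_harm(rates))  # ['B']

A real assessment would go far beyond a single metric, but even a toy check like this makes the question concrete: whose errors is the system making?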

Governance

  • A legal framework for privacy; the General Data Protection Regulation (GDPR) is a reasonable start
  • Board representation of all stakeholders, not just capital: workers and users/community
  • Freedom of Information Act-style requests applied to companies

Ownership

  • Of the data, by each individual
  • Of the hosting/platform/entity, by users collectively
  • Of the underlying software, by the commons (FOSS)

Distribution

  • Of profits to all stakeholders

Implementation requires the will to change

Purported solutions that do not address the root causes of the lack of fairness, accountability and transparency are at most half-measures against a tide that will determine our collective fate: either a humane post-scarcity world in which every individual has the substantive ability to self-determine, with access to our collective material abundance, or artificial scarcity, with opulence for the affluent and no solace for the rest.

It’s one thing for an academic, even an academic in collaboration with an industry researcher, to reveal how algorithmic bias causes differential harm to minority groups. It’s another entirely to shake the foundation of capital accumulation that the whole rotten system is built on when tens of trillions of dollars of financial capital are at stake.

So long as an ethos of the domination of capital reigns.

So long as technology is governed by the interests of capital accumulation, with the interests of workers, communities and the environment treated as costs to be externalized and commons to be enclosed.

So long as ownership is restricted to a narrow capitalist class that owns the increasingly technological means of production.

So long as the great material abundance of our society flows to the few instead of the many.

We will have technology used to further the ends of capital, alongside exploitation and oppression that burden the many and benefit the few.

The harnessing of technology towards the aims of capital is not a natural law. Just as the legal, economic and social systems that harness technology for the benefit of the few have been made by men, so these systems may be unmade.

Resources

Algorithmic Accountability: A Primer by Data & Society Institute

Footnotes

PS. Word on the grapevine: Microsoft offered to sponsor the event in exchange for the keynote. Thankfully, the conference organizers called bullshit. Shame on you, Microsoft.

Ryan M Harrison

Software for health IT and life-sciences. Basic Income (UBI).