Recap: Fairness, Accountability and Transparency Conference 2018
The first step is recognising that tech organisations have a problem. The second is realising that tech solutions won't solve it.
On 23–24 February 2018, roughly 200 people, about three-quarters from academia and a quarter from industry, plus a handful from government, convened for two rainy days at NYU Law to discuss how black-box algorithms and powerful, unaccountable tech companies have broken the Internet. The program and recordings are available online.
- Discrimination in Online Personalization: A Multidisciplinary Inquiry [Video]: It is rare to have a lawyer and a computer scientist take the stage together, but when they do, humour and insight flow. I for one hadn't grasped the implications of Section 230 of the Communications Decency Act not applying to content providers, or the mismatch in liability between Section 704 (Employment) and Section 8 (Housing).
- Fairness in Machine Learning: Lessons from Political Philosophy [Video]: The moral philosophy of discrimination is nuanced. How do the powerful, here the owners of technology, ignore moral responsibility, egalitarianism, costly rescue, and the interplay of choice and luck? How might fairer choices be made if those ascribed power looked a little longer in the mirror before judging another?
We have a problem
While the first step is recognising you have a problem — and indeed mass-affluent wage-slaves representing tech giants such as Google and Microsoft were in attendance — the subsequent steps will be significantly more difficult.
Technological solutions alone will not solve the problems we face with the lack of fairness, accountability and transparency; rather, technology organisations themselves must change.
The conference highlighted that we are at a crossroads. Who will benefit and who will suffer from harnessing technology?
The solutions themselves are straightforward, and several were called out at the conference. They are questions of ethics, governance, ownership and distribution:
- Independent ethical review boards
- Differential impact assessments, to ensure technology does not cause differential harm to marginalised groups
- A legal framework for privacy: the General Data Protection Regulation (GDPR) is a reasonable start
- Board representation of all stakeholders, not just capital: workers and users/community
- Freedom-of-information-act-style requests applied to companies:
  - Of the data, by each individual
  - Of the hosting/platform/entity, by users collectively
  - Of the underlying software, by the commons (FOSS)
  - Of profits, to all stakeholders
Implementation requires the will to change
Purported solutions that do not address the root causes of the lack of fairness, accountability and transparency are at most half-measures against a tide that will determine our collective fate: a humane post-scarcity world in which every individual has the substantive ability to self-determine, with access to our collective material abundance, versus artificial scarcity, with opulence for the affluent and no solace for the rest.
It's one thing for an academic, even an academic in collaboration with an industry researcher, to reveal how algorithmic bias causes differential harm to minority groups. It's another thing entirely to shake the foundation of capital accumulation that the whole rotten system is built on, when tens of trillions of dollars of financial capital are at stake.
So long as an ethos of the domination of capital reigns.
So long as technology is governed by the interests of capital accumulation, in which the interests of workers, community and environment are ignored as costs to be externalised and commons to be enclosed.
So long as ownership is restricted to a narrow capitalist class that owns the increasingly technological means of production.
So long as the great material abundance of our society flows to the few instead of the many.
We will have technology used to further the ends of capital, alongside the exploitation and oppression that inure to the benefit of the few at the expense of the many.
The harnessing of technology towards the aims of capital is not a natural law. Just as the legal, economic and social systems that harness technology for the benefit of the few have been made by men, so these systems may be unmade.
PS. Word on the grapevine: Microsoft offered to sponsor the event in exchange for the keynote. Thankfully, the conference organisers called bullshit. Shame on you, Microsoft.