In the darkening Finnish autumn of 2018 I decided to take part in a seminar on Digital Ethics at Aalto University. It proved to be a good decision: as you probably gather from these web pages, human relations with other computational processes are something I am always curious about, and the course provided an interesting perspective grounded in sociology, design, law and ethics. It was also a lot of fun to engage in discussions with the other participants.
One term which kept coming back in the articles I read for the course was "governance". It is a very broad term, especially popular in circles discussing super-intelligent AI, but equally applicable to the wider spectrum of digital technology discussed during the seminar. Other terms that kept returning were "accountability", "fairness", "transparency", and so on. These are terms about technology performance, but performance not of a task, but in society.
I found that quite interesting, because as a species we humans are very dependent on our tools and technology. On a large scale, governance is about trying to steer that entwined and inseparable dynamic system of matter and thought.
On the largest scale it is about playing an infinite game.
But on the scale of that seminar, of digital ethics, it is about those terms: how to build technology for today, and for next year, which is fair, transparent, accountable, and so on. This is equal portions business and engineering, as much a judicial matter as a choice of design.
So, for my course project I decided to look at the governance of technology intended to supplement or replace human functions (a traditional reason for developing new technology). The result was a mini-review, in which I used proposed approaches to governance to identify concerns of performance, and finally discussed how performance along those axes may shape whether a technology adapts to or disrupts society.
I presented the poster at the FCAI days 2018. It has now been almost a year since I presented the work, which somehow reminded me to put it up here.