Ethical questions should not be an afterthought. The hard problems are never the
technical ones. The stuff we build has an impact. A real, incredibly huge impact.
As "software is eating the world" we have to consider what sort of world it is that we
want. Is it a world increasingly controlled by corporations? Where power lies in the
hands of a select few individuals, mostly male and predominantly white?
Or can we strive for something better? Where great software is great because it betters
the lives of everyone, not because it managed to obtain venture capital funding? Where
the "genius" tech startups are "genius" because they strive to help us all improve, not
because they found creative ways to avoid paying their workers.
Can we strive for a world where the disconnect between "users" of technology and its
creators has been bridged; where users are creators who look at a piece of technology and
can understand not only how it works but also how to improve it? A world where everyone's
needs and desires can be met in an equitable and fair way? A world where we can start to
redress the harms we have done unto each other and unravel the complex webs of
discrimination which have been constructed around us? Where everyone can receive access to
an education which seeks not to control them, but empower and enable them to make their
world better (also having universal education in the first place would be good).
We all have some part to play in achieving this. As those who write the script for the
machines which are coming to increasingly dominate our society, we bear a great
responsibility. Will the machines of tomorrow be like the machines of today –
furtherers of the oppressive inequality endemic to the society we live in?
Or will they be something else? Something truly magnificent, something breathtaking? (Also, if
we actually used anthropometric data that included women when designing screen sizes,
that would be good.) Much of the power to decide this lies in our hands, and although this
is very cliché – it cannot be said enough: with great power comes great responsibility. This
is not to say that we should have a brief think and get back to our way of working - we
should have an extended think; this extended think should be integrated into how we design
and build systems.
We can no longer pretend that we are not morally culpable. When the stuff we build
causes severe harm to people, provides a platform to those who wish to direct violent
hate towards others and starts to split our society down its fault lines it is not the
time to bluster and pretend that the problem is not ours. We cannot pretend that it is
"society" which is bad and that it is us who are merely giving people what they "want."
When the technology we build is helping militaries to more effectively kill people,
the rich to become richer at the expense of the poor, and further an agenda of racism
and sexism, the correct response is to look within. What is it about our code that
causes these problems? How can we improve our systems to make them more humane,
equitable and fair? The incorrect response is to attempt to reassure shareholders and
those in power that our systems will not hurt them.
The people suffering the most are invariably those with the least power. We should not
be kicking people when they are down. Down often because our ancestors colonised their
countries and forced them into generations of slavery. Down because we plundered their
land and used the wealth to build ourselves up. We did not choose to be born into our
position of power. Likewise, others have not chosen to be born into the position in
which they stand today.
Taking responsibility is often hard, but it is the right thing to do. Unravelling our
position of power is a tricky and often uncomfortable experience. But it is necessary.
Claims are often made that tech is something new, something revolutionary we have never
seen before. In many ways the tech companies of today are just a reformulation of what
has stood before. A reformulation of systems and processes which have caused misery and
pain to huge numbers of people.
So-called "disruptive" technologies are not necessarily good. They follow the adage of "out
with the old, in with the new." Ripping out tried and tested systems to replace them with
the latest and shiniest is rarely a good idea. Tech companies whose success is predicated
upon acquiring huge market share will have a massive impact on the lives of millions. And yet
we are happy to let developers who haven't considered the effects of their products
unleash untried and untested software – developed in toxic echo chambers which eschew
diversity and inclusion, on timetables too tight to leave enough time for
proper review and assessment.
Some industries have a term for this: criminal negligence. In most jurisdictions, unleashing
an unapproved drug onto patients would result in serious jail time for the perpetrators. But
for some reason, it's acceptable for tech entrepreneurs to unveil software they know has
bugs and isn't ready for production use onto vulnerable people. Furthermore, it's acceptable
for them to deploy algorithms they don't understand, can't explain, and have no way of
being sure won't produce wildly different results on two slightly
different images which look (to a human) identical.
The systems we have in place to prevent this sort of behaviour are failing us. Politicians
are overtaken by a desire to have the latest and greatest; to be "cutting-edge." As the
people who program the machines, we have to step up. Write to your lawmaker, your
judiciary, tell your tech companies that what they're doing isn't acceptable. Hold them
accountable for the decisions that they make. Let's make sure they know we think they're
culpable for their actions, even if they don't.
My name is Teymour Aldridge, and you've just read some of my ramblings :)
Feel free to check out code I've written. If you're interested in more of my ramblings, you can take a look at my blog.
If you'd like to contact me, you can drop a line to