Ethical questions should not be an afterthought. The hard problems are never the
technical ones. The stuff we build has an impact. A real, incredibly huge impact.
As software "eats the world," we have to consider what sort of world it is that we
want. Is it a world increasingly controlled by corporations? Where power lies in the
hands of a select few individuals, mostly male and predominantly white?
Or can we strive for something better? Where great software is great because it betters
the lives of everyone, not because it managed to obtain venture capital funding? Where
the "genius" tech startups are "genius" because they strive to help us all improve, not
because they avoid paying their workers?
Can we strive for a world where the disconnect between "users" of technology and its
creators has been bridged? A world where everyone is equally important. A world where we
can redress the harms we have done unto each other. Where everyone can receive an
education without having to fear for their life.
We all have some part to play in achieving this. As those who write the script for the
machines which are coming to increasingly dominate our society, we bear a great
responsibility. Will the machines of tomorrow be like the machines of today:
furtherers of the oppressive inequality endemic to our model of capitalism? Or will they
be something else? Something truly magnificent, something breathtaking? A lot of the
power to decide this rests in our hands.
We can no longer pretend that we are not morally culpable. When the stuff we build
causes severe harm to people, provides a platform to those who wish to direct violent
hate towards others, and starts to split our society down its fault lines, it is not the
time to bluster and pretend that the problem is not ours. We cannot pretend that it is
"society" which is bad and that we are merely giving people what they "want."
When the technology we build is helping militaries to kill people more effectively,
helping the rich to become richer at the expense of the poor, and furthering an agenda
of racism and sexism, the correct response is to be introspective. What is it about our
code that causes these problems? How can we improve our systems to make them more
humane, equitable and fair? The incorrect response is to attempt to reassure
shareholders and those in power that our systems will not hurt them.
The people suffering the most are invariably those with the least power. We should not
be kicking people when they are down. Down often because our ancestors colonised their
countries and forced them into generations of slavery. Down because we plundered their
land and used the wealth to build ourselves up. We did not choose to be born into our
position of power. Likewise, others have not chosen to be born into the position in
which they stand today.
Taking responsibility is often hard, but it is the right thing to do. Unravelling our
position of power is a tricky and often uncomfortable experience. But it is necessary.
Claims are often made that tech is something new, something revolutionary we have never
seen before. In many ways the tech companies of today are just a reformulation of what
has stood before. A reformulation of systems and processes which have caused misery and
pain to huge numbers of people.
So-called "disruptive" technologies are not necessarily good. Disruption follows the
adage of "out with the old, in with the new." Ripping out tried and tested systems to
replace them with the latest and shiniest is rarely a good idea. Tech companies whose
success is predicated upon acquiring huge market share will have a massive impact on the
lives of millions. And yet we are happy to let developers who have not considered the
effects of their products unleash untried and untested software: built in toxic
echo-chambers which eschew diversity and inclusion, on timetables too tight to leave
enough time for proper review and assessment.
Some industries have a term for this: criminal negligence. In most jurisdictions, unleashing
an unapproved drug onto patients would result in serious jail time for the perpetrators. But
for some reason, it's acceptable for tech entrepreneurs to unveil software they know has
bugs and isn't ready for production use onto vulnerable people. Furthermore, it's acceptable
for them to deploy algorithms they don't understand, can't explain, and can't be sure won't
produce erratically different results on two slightly different images which look (to a
human) identical.
The systems we have in place to prevent this sort of behaviour are failing us. Politicians
are overtaken by a desire to have the latest and greatest; to be "cutting-edge." As the
people who program the machines, we have to step up. Write to your lawmaker; tell
your tech companies that what they're doing isn't acceptable. Hold them accountable for the
decisions that they make. Let's make sure they know we think they're culpable for their
actions, even if they don't.
My name is Teymour Aldridge, and you've just read some of my ramblings :)
Feel free to check out code I've written. If you're interested in more of my ramblings,
you can take a look at my blog.
If you'd like to contact me, you can drop a line to