Changing the rhythm of Algorithmic Bias

Image credit: https://i.ytimg.com/vi/0xKklLplngs/maxresdefault.jpg

You’re a pretty sweet coding dude (or dudette), and you just finished your 1000th pretty sweet algorithm for your pretty sweet program.

That sounds pretty sweet, but have you put in any critical thinking time regarding bias? It turns out your pretty sweet algorithm might actually be a total drag.

In all seriousness, before this week I had little to no knowledge of how profound bias in algorithms can be. The weight of the subject hadn’t really crossed my mind, other than the uproar over Facebook’s politically driven algorithmic choices. Cathy O’Neil (Blind Faith, 2017) raises a revealing and alarming point: “…algorithms automate the status quo.” The thing is, I don’t think it’s possible to totally eliminate bias. In our ever-changing society, there will always be a seed that can proliferate.

There needs to be some sort of regulation to keep bias in check, but how to do that is a difficult question. At some point regulation becomes unmanageable and may completely stunt progress: who (or what) is going to enforce it, and is a balance between equality and progress even achievable?

Facebook is a very recent example of bias across political, racial, and social divides. The company has said it has no plans to change its algorithms to combat bias, and from a business standpoint this makes sense: segregation has historically been more profitable than inclusion.

Individually, however, each of us can make a conscious decision to think about our own code and the bias we may seed into it. Simply stopping to think is the first step toward breaking up the rhythm of bias and turning it into something better.
