In The Algorithm Will See You Now, artist Chris Combs uses interactive electronics to ask pressing questions about algorithmic injustice and surveillance capitalism.
Machine learning, or “artificial intelligence (AI),” tools are popping up like mushrooms after a rain. They are trained, like a puppy, on massive datasets of human decisions: this photo depicts a cat; we granted parole to a person with this name and this criminal record.
In this way, countless individual decisions made by people are laundered into an “algorithm” that is empowered to make decisions itself. Although these decisions are automated and carried out by a machine, nothing inherently makes them objective. An AI’s training is rarely disclosed to the public. What are the implications of this, and what can we do about it?
– Chris Combs