Friday, September 9, 2016

Let’s Pause Before Bowing at the Altar of Algorithms


For several weeks this summer, in certain cars of the New York City subway, Spotify, the music-streaming service, bought up all the ad space to promote one of its newer features: Discover Weekly, which sends users a personalized playlist of music every Monday.
New-music discovery is one of the fiercest battlegrounds on which competing music-streaming services fight, and as they do, they have a tactical decision to make: should humans or computers take the lead?
In other words, will that delightful new song that pops up in your playlist do so primarily because a music expert, after considering beat and vocals and instrumentation, decided you’d like it, or because an algorithm churning through vast troves of data, personal and global, arrived at the conclusion that you would?
If the song’s good, it probably doesn’t matter by which method it was selected. But what was interesting about Spotify’s ad campaign is how much it emphasized its use of algorithms, presenting them as somehow less impersonal than, say, a nameless song analyst in a snazzy tech-startup office somewhere.
“The Discover Weekly playlist on Spotify really is like unwrapping a birthday present every Monday. Algorithms, you get me like no other,” read one tweet that later appeared in a subway ad.
“It’s scary how well @Spotify Discover Weekly playlists know me. Like former-lover-who-lived-through-a-near-death-experience-with-me well,” crooned another.
[Image: Spotify Discover Weekly subway ad - Algorithms]
Apparently in the age of Big Data, our relationships with algorithms can be more intimate than our relationships with people.
If this is creepy — and it is creepy — that’s because intimacy was the exact thing computers were supposed to remove from our daily interactions. After all, another promise of the Discover Weekly service is that it works equally for everyone, unlike its predecessor, the record-store clerk, who blew past as you flipped through Nickelback to suggest the next hippest artist to the cute girl browsing the selection of Nirvana.
Music is a special case. It’s a place where we want bias. We want the record-store clerk to linger by us a bit longer because he likes us and our taste more than he does that of the other shoppers. Perhaps this is why we so eagerly hope that the algorithms that select our music favor us, even when we know algorithms aren’t supposed to favor anyone.
In her new book Weapons of Math Destruction, Cathy O’Neil argues we should take the conclusions of many other algorithms just as personally. Bias, she laments, has tainted a whole host of other algorithms, ones we rely on to make decisions for us that we expect — that we assume — will be fair to everyone evaluated. In a wide range of realms including criminal justice, job evaluation and hiring, and political and product messaging, we count on algorithms to make judgments that for decades were made by people whose whims influenced outcomes. The promise is that the algorithms do this fairly.
A judge may be acculturated to believe that black males are more likely to reoffend, but when an algorithm suggests the sentence he gives, gone are the charges of race being a factor. A teacher may be the principal’s poker buddy, but when his students’ test scores are crunched by a computer and out comes a dismal rating of his ability, it’s time for him to go.
In the age of algorithms, all a decider has to do is feed in the raw data and wait for a number to pop out. Gone is the trying deliberation of casting judgment, of weighing each of those variables, calling on a limited set of prior experience, and trying to push emotion aside. An algorithm knows everything, feels nothing, and never wakes up on the wrong side of the bed. The decider doesn’t even have to know how the thing works. That’s the era when magic is spelled STEM.
So why, then, do so many of the algorithmic outcomes O’Neil documents seem to recreate the racial biases and arbitrary decision-making of that earlier, personal era? Why are our algorithms handing down harsher sentences to black males and firing prized teachers?
Her answer isn’t mind-blowing: algorithms are powered by math, not magic. The equations behind them must be constructed by people, and the inputs must be measured and transformed into numbers, again in a process dictated by people. The biases of our culture are thus recreated in the system. Nor will the book’s construction surprise readers of the popular-science genre: it’s a series of case studies, each reduced to the bite-size conclusion of whether or not the particular case represents a Weapon of Math Destruction, or WMD (a groaner, indeed, but at least we all know what she means).
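To see how that happens in practice, consider a deliberately tiny sketch. The data, weights, and variable names below are invented for illustration; they come from no real scoring system and are not drawn from O’Neil’s book. The point is only that a risk score can never look at race and still reproduce bias through a proxy input its human designers chose:

    # Toy "recidivism risk" score. The weights are human choices; the
    # neighborhood arrest rate is a human-chosen input shaped by past
    # policing decisions, so it smuggles history's bias into the math.
    def risk_score(prior_convictions, age, neighborhood_arrest_rate):
        return (2.0 * prior_convictions
                - 0.05 * age
                + 10.0 * neighborhood_arrest_rate)

    # Two defendants with identical records, differing only in where they live.
    defendant_a = {"prior_convictions": 1, "age": 30, "neighborhood_arrest_rate": 0.05}
    defendant_b = {"prior_convictions": 1, "age": 30, "neighborhood_arrest_rate": 0.40}

    for label, d in (("A, lightly policed area:", defendant_a),
                     ("B, heavily policed area:", defendant_b)):
        print(label, round(risk_score(**d), 2))
    # A, lightly policed area: 1.0
    # B, heavily policed area: 4.5

Strip away the jargon and the “algorithm” is just that: weights somebody picked, applied to inputs somebody picked, fit to a history somebody recorded.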
What’s fierce about O’Neil’s book is her authority and how specifically she can diagnose the problem and proclaim a bold solution. Far from a yesteryear nostalgist, O’Neil, who earned her professional credentials working in the very tech-mathematics combine she criticizes, doesn’t see all algorithms as pernicious. The problem, she finds, is that so many of the algorithms that dictate our lives are based on junk assumptions, operate in the dark with little lay understanding (and often little professional understanding) of their mechanics, and, unlike the racist judges and nepotistic functionaries of the past, offer no one to complain to. Moreover, algorithms often lack self-correcting feedback mechanisms; instead, their errors are treated as the cost of doing business, and because we’ve fallen for the seeming magic of algorithms, they are rarely interrogated as they should be.
More worrisome, O’Neil shows that it’s becoming increasingly difficult to determine when an algorithm may be giving objectionable results — that I, adjudged a good credit risk, may only be seeing internet ads for low-interest auto loans, while you, determined to be a poor one, are bombarded with ads for usurious payday loans, without either of us knowing what’s on the other’s screen. Call this the pandering-politician problem: if a candidate tells each voter exactly what they want to hear — a capability political firms wielding algorithms are ever working to hone — and is able to do it in private, who’s to know when they’ve broken their campaign promises? Who’s to stop them from offering each of us personally tailored lies?
The answer seems to be no one, and it’s here, when offering a solution, that O’Neil turns most to the past. Looking back at how the nation confronted the ills of previous technological change, O’Neil returns with a simple answer: regulation. In neoliberal Silicon Valley such a proposal may be heretical, but you can’t STEM your way out of everything.
Until then, perhaps we need a new subway ad: “Algorithms, you get me like no other, even though I don’t get you.”