Something is wrong on the internet by James Bridle
He tells an interesting, and weird, and interesting-because-it's-weird true story of how we got to mass-produced semi-nonsense animated youtube videos that span a range from "maybe educational" to "nightmare fuel." Youtube, he argues, has become not only a victim of platform abuse, but a participant in widespread child abuse*. But most importantly, near the end:
Bridle: "The asides I’ve kept in parentheses throughout, if expanded upon, would allow one with minimal effort to rewrite everything I’ve said to be not about child abuse, but about white nationalism, about violent religious ideologies, about fake news, about climate denialism, about 9/11 conspiracies."
Youtube is complicit in ISIS recruitment, in the spread of Breitbart, in the Birthers and the 9/11 Truthers.
I'm not sure what the answer is. But the pattern is troubling:
1. some Bad Guy discovers a way to beat The System.
2. Bad Guy does Bad Thing X; some harm gets done.
3. company that runs The System patches it and says "with our new algorithms, nobody will ever do this Bad Thing X again."
This hasn't been catastrophic in the past, because the harm in step 2 is something like "spam gets sent out" or "some people with old janky computers get virused." But it's been getting worse: "Nazis congregated on our platform," "ISIS recruited new people," and even "we elected the wrong guy."
So this is going to end apocalyptically, right? (rather, it's already begun - or is Trump not a bad enough moment for you?)
Tim O'Reilly in an email newsletter I subscribe to:
"Facebook’s fake news problem shows that even present-day algorithmic systems can optimize for the wrong thing. By telling its systems to show people more of what they liked and shared, Facebook thought that it would encourage deeper social connections and build a great advertising business. It didn’t intend to amplify hyper-partisanship and the development of fake news. And when economists told companies that the only social obligation of business is to make money for shareholders, they thought they would make businesses more efficient. They didn’t mean to increase income inequality, hollow out the US economy, and create an opioid epidemic. But these were some of the unintended consequences when we told our companies to optimize relentlessly for corporate profit and treat humans as a cost to be eliminated. This is why I say we’ve already had 'our Skynet moment.'"
Scary as this is, it vaguely points towards a way out. There's no reason we have to be in this upward spiral of speed and complexity, besides ad clicks and shareholder value. I don't know the exact measures by which we can turn Facebook and Twitter and Youtube into pro-social tools instead of hotbeds of anger and sensationalism, but I'm pretty confident that as long as the ads are paying the bills, the social networks aren't going to figure it out themselves.
(*aside, back on the Youtube-freaking-out-kids topic: Geoff Manaugh talks about how the problem is not that these videos will freak kids out - it's just that it'll make them think really weirdly. Like, of course these things are connected (say, Aladdin characters and the kid from Despicable Me) - why not? Everything is connected to everything else. I ... I dunno, I kinda think this would just be great? This kind of free-association is sort of my jam; it's what I love about Labyrinth or the Phantom Tollbooth or Twin Peaks or Mitch Hedberg or Janelle Shane)