Algorithmic
On Monday morning, I went to grab a cup of coffee at a little cafe by my work. I overheard two young guys having a business meeting at a table by the window. One, in a motorcycle jacket and stylishly slashed jeans, was pitching to the other, talking about creating a decision engine for our leisure-time activities. The rest of their conversation drifted in and out of earshot, but I couldn't get that phrase -- decision engine -- out of my head. I paid for my coffee, dumped in some cream, and made for the door. As I walked back to the office, cutting down the alley, I thought over that choice of words; a techie, slangy take on an abstract concept that also sounds fundamentally dumb. Break it down for a minute: decisions are rational thoughts that generate human actions and behavior, while engines are big, brainless brutes that generate mechanical action in support of human behavior. So, do we really want an engine spewing out decisions like so much horsepower?
Since the mid-nineties, Americans have greedily slurped up every hop, skip, and leap forward in technology, from flat-screen TVs to broadband internet to the now-ubiquitous mobile web. I'm speaking to you now on a platform that has revolutionized people's ability to self-publish while actually reaching an audience. Lots of things have fallen by the wayside, from pets.com to Napster -- one might call them the inevitable casualties of forward progress. One fundamental success of Web 2.0 and beyond has been the rise of all of these engines, generating associations by plugging your eminently trackable web behavior into algorithms.
Amazon comes up with recommendations based on books you've bought, books you've browsed, and books other people have bought. Netflix analyzes what you've watched, what you've rated, and what's in your queue to come up with absurdly specific categories of movies you might be interested in. Pandora builds a library of music upon your diligence in clicking on a thumbs-up or thumbs-down icon. Google has begun to filter and customize your search results based on your previous searches and cookies embedded in your browser, especially if you use Chrome. Google Maps uses algorithms to determine the best route, based on a matrix of speed limits, lane widths, and number of traffic lights.
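The core trick behind most of these engines is simpler than it sounds. Here's a toy sketch of the "people who bought X also bought Y" idea -- hypothetical data and a deliberately naive counting approach, not Amazon's actual algorithm -- just to show how association can be generated from tracked behavior alone:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories: each set is one customer's basket.
baskets = [
    {"Dune", "Neuromancer", "Snow Crash"},
    {"Dune", "Neuromancer"},
    {"Neuromancer", "Snow Crash"},
    {"Dune", "Foundation"},
]

# Count how often each ordered pair of items shows up in the same basket.
co_occurrence = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_occurrence[(a, b)] += 1
        co_occurrence[(b, a)] += 1

def recommend(item, top_n=2):
    """Return the items most often bought alongside `item`."""
    companions = Counter(
        {b: n for (a, b), n in co_occurrence.items() if a == item}
    )
    return [b for b, _ in companions.most_common(top_n)]

print(recommend("Dune"))  # Neuromancer co-occurs with Dune most often
```

Note that nothing in there knows anything about books, taste, or you; it only knows what co-occurred. That gap between correlation and understanding is exactly where the trouble discussed below comes in.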
Lots of these engines have undoubtedly improved our lives. Netflix, Hulu, Pandora, and Amazon have introduced me to dozens of shows, movies, authors, and artists that I was unfamiliar with. Google Maps has been a lifesaver in unfamiliar cities, as well as a powerful tool for architects, democratizing once-expensive satellite site imagery. Hell, even the New York Times has gotten into the game, snaring me with their "most e-mailed" and "recommended for you" columns on the right-hand side of the page.
However, there are myriad dangers here. The first, and most immediate, is the routine and thoughtless abdication of responsibility. Can you remember the last time you went past the first page of Google results when you searched for something? Google will say this is a function of their supreme efficiency at collecting all the information on the web, and they'll partly be right. My sense is that Google has, rather effortlessly, short-circuited our natural curiosity. Once you find what you're looking for in the first or second result a dozen times or so, you never move past that again. Try it sometime -- you'll be amazed what turns up four or five pages deep.
Second, we find ourselves outsourcing our brains to the algorithm and, by extension, the crowd. While the crowd and the algorithm may be very wise, they aren't always right, and we chip away at our ability to think critically the longer we use these decision engines. I recently read a fascinating interview in Wired with one of my favorite artists, DJ Shadow, where he talks at length about his ambivalence toward the internet:
I just think that the internet has been sold to us as our savior. As a means to create a new economy, as our spiritual salvation, whatever . . . But what I think people have lost sight of — and I don’t think the internet has done a good job of self-evaluation in this respect — is the massive shift between the brave new internet world of the late ’90s and now . . . A decade later, everything is corporate-owned, advertising is incessant, and the diverse opinions of internet commentary are often shouted down. Now there’s much more online groupthink . . . The internet was supposed to democratize communication, but the opposite seems to have happened.
Mr. Shadow brings me around to my third point: all of these engines are little robots of commerce. This is not necessarily a bad thing. I'm sure DJ Shadow, for one, doesn't mind when the Amazon recommendation engine puts his music in the hands of someone who had never heard of him. However, that little voice on Amazon is pretending to be your college buddy, turning you on to a sweet new band and passing you another beer, when, in fact, it is just a sales driver. Amazon doesn't give a shit if you like the music or not, they're just happy you parted with a couple more dollars in their house!
Designers, to paraphrase our former president, are deciders. As soon as designers abdicate deciding, they cease to be critics of the world and of the flaws of our designed environment, and they become useless as problem solvers.
It takes a lot of discipline and intellectual energy to bend the web to your will, instead of letting it bend you. I'm just trying to maintain composure, folks.