5.4 Pitchfork, 1995–present



In anticipation of Issue 13, “Machine Politics,” we’ve decided to fork over the full text of Richard Beck’s review of Pitchfork from Issue 12. Today is also the last day to subscribe and receive the issue straight from the printer, so whether you’re a new reader or a reluctant renewer, today’s your day. In the meantime, enjoy. —Eds.
One day in early 2010, the internet message board I Love Music began discussing the Pazz and Jop poll, which the Village Voice had recently published on its website. The Voice has conducted Pazz and Jop annually since 1971. Hundreds of music critics submit lists ranking their favorite albums and singles, and the Voice compiles two master lists identifying the year’s best music. It is the main event in American popular music criticism. On I Love Music, the Pazz and Jop thread chugged slowly along for a few hours. Then Scott Plagenhoef, editor-in-chief of the music website Pitchfork, began posting under the name “scottpl,” and things picked up speed. “11 of the top 13 LPs and five of the top six singles are shared between this and the Pitchfork list,” Plagenhoef wrote. “For what it’s worth.”
The suggestion that the nation’s music critics had copied their end-of-the-year charts from a website just over a decade old was a clear provocation, and twenty-four hours and hundreds of posts later the conversation was no longer about Pazz and Jop. “Man, Pitchfork circa 2000 and 2001 vs now is night and day,” Plagenhoef wrote. “The size of the site now utterly dwarfs the site then, and certainly the way it’s run and decisions are made are different.” Plagenhoef did his best to maintain a modest pose (“I don’t beg for credit or claim to be responsible for things”), but in the end it was hard to resist a triumphal note. “We’ve succeeded at a time when nobody else has,” he wrote. “We reach more people right now than Spin or Vibe ever did, even if you use the bs print mag idea that ‘every copy is read by 2.5 people’ . . . hell, I should stop caring, get back to work, and let people keep underestimating us.” Then he posted two more times. Then he wrote, “Alright, I will get out of this thread.” Then he posted eighteen more times.
He may have been bragging, but Plagenhoef was right. In the last decade, no organ of music criticism has wielded as much influence as Pitchfork. It is the only publication, online or print, that can have a decisive effect on a musician or band’s career. This has something to do with the site’s diligently cultivated readership: no genre’s fans are more vulnerable to music criticism than the educated, culturally anxious young people who pay close attention to indie rock. Other magazines and websites compete for these readers’ attention, of course, but they come and go, one dissolving into the next, while Pitchfork keeps on gathering strength. Everyone acknowledges this. And yet everyone also acknowledges something else: whatever attracts people to Pitchfork, it isn’t the writing. Even writers who admire the site’s reviews almost always feel obliged to describe the prose as “uneven,” and that’s charitable. Pitchfork has a very specific scoring system that grades albums on a scale from 0.0 to 10.0, and that accounts for some of the site’s appeal, but it can’t just be the scores. I could start a website with scores right now, and nobody would care. So what is it? How has Pitchfork succeeded where so many other websites and magazines have not? And why is that success depressing?

Ryan Schreiber launched Pitchfork in November 1995 from his parents’ house in a suburb of Minneapolis. Because the domain name www.pitchfork.com belonged to a company selling livestock out of Butte Falls, Oregon, Schreiber had to settle for www.pitchforkmedia.com. The name, he told BusinessWeek in 2008, was meant to suggest “an angry mob mentality” toward the music industry. He was 19, a recent high school graduate working part-time jobs and going to indie rock concerts in downtown Minneapolis. Hindsight makes it easy to see Schreiber as a 1990s digital visionary, but the truth is more prosaic: he wanted to write about music, and money was scarce. “I thought it would be really cool to meet the people in bands I liked and talk to them,” he told an interviewer. “I loved reading print fanzines . . . but at the same time, their distribution was limited and there was a lot of overhead to cover . . . so it was a case of publish on the internet or don’t publish at all.” In its earliest days, Pitchfork was written almost entirely by Schreiber—reviews, a little news, and interviews with groups passing through Minneapolis. His parents, Schreiber says, got angry about the phone bills he racked up contacting record labels in New York, but as the free CDs began to roll in, Schreiber realized that he had found his calling: “Once I heard about the promos, I was like, ‘Oh my God, unbelievable! Unbelievable!’”
Schreiber’s youthful enthusiasm is everywhere in Pitchfork’s early reviews. (Although most of them have been taken down from the site, some can still be found on the Internet Archive’s Wayback Machine.) Today, as a matter of practice if not explicit principle, Pitchfork grants the vast majority of its perfect scores—10.0—to reissues, albums that have weathered the storms of popular taste for a decade or two. Before 2000 the site allowed its writers far more license, awarding 10.0s to Walt Mink’s El Producto, 12 Rods’ Gay?, and Amon Tobin’s Bricolage, albums that almost nobody remembers today. The writing that accompanied these scores was sweet and hyperbolic. Opening his 10.0 review of Bonnie “Prince” Billy’s I See A Darkness (1999), Samir Khan, one of the site’s early reviewers, wrote, “Music is a wounded, corrupted, vile, half-breed mutt that begs for attention as it scratches at your door.” I don’t know what that sentence means, but the sentiment is clear enough. Untroubled by knowledge, wide-eyed, drunk on enthusiasm: this is what hearing pop music as a teenager feels like.
Even Pitchfork’s obsession with numbers, which eventually came to seem pedantic and sometimes cruel, was initially more like an adolescent game. Since Pitchfork’s writers had just graduated from high school, a place where the difference between an 89 and a 91 could not have been more consequential, they were well positioned to distinguish between an 8.3 album and an 8.7. The pseudoscientific precision—101 possible ratings instead of the usual four or five stars—was a large part of the appeal.
In order for Pitchfork to justify the proliferation of ratings, the reviews were going to need to proliferate as well. Although Schreiber initially chose a web format because it was cheaper than photocopying, one of the internet’s advantages over print media soon began to reveal itself: its archival potential. Old reviews didn’t go anywhere (at least not until Pitchfork began to airbrush its history); they accumulated. Schreiber counted the number of albums Pitchfork had covered, and by early 1999 he could announce, at the bottom of the site’s front page, “1,512 REVIEWS SINCE TH.FEB.01.96.”
In addition to removing the overhead costs associated with print publishing, the internet, in a happy coincidence, had also solved the problems of information that plagued zine culture. The bands you liked—or, rather, the bands you might potentially like, if you ever managed to hear about them—were not on the radio. They would come to your local indie rock venue eventually (in Minneapolis the main one was First Avenue), but usually you had to be 18 to get in, and anyway what were you going to do, hang out at First Avenue every night of the week? It wasn’t going to be cheap. Zines helped to keep people informed by publishing interviews, concert write-ups, and an enormous number of record reviews; but they were also hard to find, hard to keep track of, and, even if you hoarded the issues, hard to search through. As Schreiber began publishing writers other than himself, and as the archive grew, Pitchfork began to look like something other than a collection of off-the-cuff record reviews by teenagers. It began to look like a project with a mission, however accidental: an encyclopedia of contemporary tastes in rock and roll, accessible to everyone, everywhere.1
Early Pitchfork’s narrow focus on indie rock wasn’t a conscious decision—indie rock just happened to be the kind of music that most of Ryan Schreiber’s friends liked. Even as the site began poking around in other genres, it was not hard to figure out where the writers had come from. Reading through the archive, watching Pitchfork begin to discover thoughtful, politically liberal rap groups like A Tribe Called Quest and Jurassic 5, I felt a shock of white suburban recognition. In 1998, Lang Whitaker gave a 7.1 to the Black Eyed Peas, speculating that with “a line-up that looks straight out of a Benetton ad,” maybe the group could “assume their mantle as hip-hop’s street saviors.” One year later, in a review of The Roots’ Things Fall Apart, Samir Khan congratulated the group on featuring “an intelligent rapper.” Other genres were treated with the same endearing bewilderment. Schreiber, in particular, fell head over heels in love with 1960s jazz. On Thelonious Monk: “The man could play a piano like it was a goddamn video game.” And on John Coltrane, recorded live at the Village Vanguard: “‘Trane takes it to heaven and back with some style, man. Some richness, daddy. It’s a sad thing his life was cut short by them jaws o’ death.” It’s easy to complain about this kind of thing, but it isn’t racist. It isn’t even insensitive, really. It’s just oblivious. What could be less anxious, less self-conscious, than a white 21-year-old writing about John Coltrane in the voice of an old black man? It must have been nice to write on the internet and feel that the only people paying attention were your friends. This initial three-year period was Pitchfork’s Edenic phase, and it came to an end in June 1999, when Shawn Fanning, a student at Northeastern University, launched Napster.
People think that Napster liberated recorded music by making it available at no monetary cost, but that is not exactly right. Napster allowed you to keep your money, but it also told you there was something shameful in keeping your musical tastes to yourself. In its default setting, the service made each user’s collection available for everyone’s perusal. Fanning spent his teenage years immersed in the open-source culture of internet message boards, where circulating your own hacks and helping to improve the work of others was just good citizenship, and this collaborative ethos migrated smoothly into the realm of digital music. You could change Napster’s default setting and keep your own files hidden, but then everyone would resent you—it was called file sharing for a reason. Although Napster was shut down by the Ninth Circuit Court of Appeals in July 2001, its ethic lives on in the language of music downloading. On sites using the BitTorrent file-sharing protocol, someone who makes a torrent available to others is called a “seeder.” Someone who only downloads material made available by others, on the other hand, is a “leech.” 
Back when people still had to pay for music, money served to limit and define consumption. You could only afford so many records, so you bought what you could, listened to the radio or watched MTV, and ignored everything else. Those select few who did manage to hear everything—record store clerks, DJs, nerds with personal warehouses—could use this rare knowledge to terrorize their social or sexual betters, as in the pre-internet-era film High Fidelity. Napster made all of that obsolete. Today, almost every person I know has more music on his computer than he could ever know what to do with. You don’t need to care about music to end up like this—the accumulation occurs naturally and unconsciously. My iTunes library, for example, contains forty-seven days of music. According to the column that counts the number of times I’ve played each song, roughly a sixth of that music has never been listened to at all. In the 21st century, we are all record store clerks. 
Music began to register the overabundance of supply almost immediately by inventing a new subgenre of dance song that made it possible to listen to your whole music library at once: the mashup. In 2001, a DJ called Freelance Hellraiser laid the vocals from Christina Aguilera’s “Genie in a Bottle” over the instrumental from “Hard to Explain” by The Strokes. What distinguished the mashup was precisely how obvious it was, how celebratory and stupid. You recognized both songs (usually a warm, classic-rock instrumental and a raunchy rap lyric) and appreciated the faux-unlikeliness of their juxtaposition. In the years following Napster’s heyday, dozens of other mashup albums and mixtapes arrived: Uneasy Listening Vol. 1 by DJ Z-Trip and DJ P, Never Scared by Hollertronix, and most famously, Danger Mouse’s The Grey Album, which coupled songs from Jay-Z’s The Black Album with The Beatles’ White Album. The biggest misconception about the mashup was that it was meant for dancing. But the mashup’s ideal listener is seated at his desk, bathed in the computer screen’s aquarium glow, Googling any songs he didn’t pick up the first time around.
Most mashups are quite crudely put together. What made the mashup DJ impressive was the breadth of his iTunes library, his mastery of it, and his taste. This made him an ironically heroic figure, because while using Napster for the first time was exciting, it also forced you to confront your own inadequacy. Everyone on the internet had better music than you did, or at least more of it. You could spend whole evenings downloading to close the gap, but what were you actually supposed to do with all these new songs? Listen to them? That could take years, and all the while you’d be downloading more music. 

In the middle of 2000, having raised a few thousand dollars selling records on eBay, Schreiber moved Pitchfork to an office in the Wicker Park neighborhood of Chicago. His stable of writers—people willing to produce long reviews for very little money and a CD—was growing, and on the site’s Staff page, these writers posted irreverent profiles of themselves. Matt LeMay, who still works for the site as a senior contributor, put up a picture of a cactus instead of his face, and wrote, “Matt LeMay wasn’t like the other cacti.” On the Advertising page, the site bragged about a readership that had “tripled since January, 2000! Pitchfork is now receiving over 130,000 visits every month, more than any of our print competitors.” This looks like a dubious claim—Rolling Stone’s circulation was well over one million—but Pitchfork wasn’t thinking that big yet. They meant other indie rock zines, none of which survive today. “[Our writers] genuinely care about music,” the pitch read, “unlike some of the big time playaz that’re just in the business for the bling bling.” Charging $300 per month for a banner ad, Pitchfork attracted indie bands with new releases, upstart record labels, and online music stores like InSound, eMusic, and Spun.com. One good reason to advertise with Pitchfork was that Schreiber was now publishing four reviews a day, five days a week, and thus gradually turning his small audience into a loyal one. Best of all, though, was the fact that Pitchfork had a five-year head start on rating all the music that Napster had recently made available for free: “Our massive record review archive is one of the most comprehensive indie-based review archives on the web with more than 3,000 reviews spanning the past five years.”
In the same month that Schreiber established Pitchfork in Chicago, Radiohead released their fourth album, Kid A. In the past, record reviews were for helping people decide whether to buy something. They were shunted to the back of the magazine, they were frequently brief, and they were secondary to the illustrated profiles and interviews that moved issues off the newsstand. For years, America’s most intelligent rock critic, Robert Christgau, wrote a weekly review column for the Voice with the not-quite-tongue-in-cheek title “Christgau’s Consumer Guide.” If a record clocked in at less than half an hour, he mentioned it, the idea being that you needed value for your money. But when Kid A was released that October, it had been online for three weeks, for free, and readers clicking on Brent DiCrescenzo’s Pitchfork review were not looking for a consumer guide. So what were they looking for? Because Kid A was the best album Pitchfork had ever reviewed, and because Radiohead was one of the first bands to build up an internet fan apparatus, DiCrescenzo’s answer to that question would have important consequences for Pitchfork’s future.
The strategy DiCrescenzo decided on had two parts. First, he pushed Pitchfork’s fondness for rhetorical extravagance as far as possible. “The experience and emotions tied to listening to Kid A are like witnessing the stillborn birth of a child while simultaneously having the opportunity to see her play in the afterlife on Imax,” he wrote. “Comparing this to other albums is like comparing an aquarium to blue construction paper.” This, in itself, was no major innovation: rhetorical excess had long been one of the hallmarks of rock writing, a natural if usually doomed attempt to translate the music’s depraved energies into prose. The next move—giving the record a 10.0—didn’t seem like an innovation either. The site had awarded perfect scores before, including a 10.0 for Radiohead’s previous album, OK Computer. But these prior scores were in the spirit of nerdy fun: if a record was awesome, you gave it a 10.0, and you didn’t think too hard about it. In Kid A, by contrast, it was clear that Pitchfork saw an important event in pop music history. The 10.0 meant they were trying to be equal to it. Pitchfork, having built up credibility by making minute distinctions between one album (7.4) and another (7.3), had begun to understand that one of the things you can do with cultural capital is spend it on something extravagant (10.0). And since a well-placed 10.0 would elicit sustained bursts of praise and criticism from different corners of the internet, the expenditure was an investment in the site’s own future.
Faced with an album this new and this great, DiCrescenzo paid it the highest compliment he could think of: he made a list of Radiohead’s influences. In just over 1,200 words, he managed to mention the John Coltrane album Olé, C. S. Lewis, the Warp Records label, Terry Gilliam’s animations for Monty Python, David Bowie and Brian Eno, Aphex Twin, Björk, and, finally, the White Album. “It’s clear that Radiohead must be the greatest band alive,” he wrote, “if not the best since you know who.” (He means the Beatles.) “It was a watershed moment for us,” Schreiber later said of the Kid A review. “We got linked from all the Radiohead fan sites, which were really big. We got this huge flood of traffic, like five thousand people in a day checking out that one review. We had never seen anything like that. Web boards were talking about our review.” Of course, the review told you little about Radiohead’s music that you couldn’t have heard on your own, but it told you everything about what kind of cultural company Radiohead was meant to keep. This technique became Pitchfork’s signature style.

Although the term “indie rock” didn’t gain widespread use until the early ’90s, the music began to emerge from the wreckage of punk rock in the mid-’80s, and was variously referred to as “college rock,” “post-punk,” and “DIY” (for Do It Yourself). “Indie” referred specifically to the independent record labels that cropped up around the country in that decade: Dischord in Washington DC; SubPop in Seattle; Matador in New York; Merge in Chapel Hill, North Carolina; and Touch and Go in Chicago. The music these labels released was diverse, to be sure, but at the core was a dominant strain of direct, hard-edged, bitterly sarcastic music by bands like Minor Threat, Black Flag, Dinosaur Jr., Fugazi, The Butthole Surfers, and The Pixies. What held these bands together—however loosely—was a belief in the cynicism and greed of the major labels, which would not have signed them anyway. It is worth noticing that “indie rock” is one of the few musical genre names that doesn’t refer to a musical aesthetic: the genre was founded on an ethic of production.
This ethic got complicated in 1991, when Nirvana, a three-man band out of Aberdeen, Washington, became one of the most popular musical acts in the world. Nirvana started out on SubPop, but in 1990 the group signed with the major label DGC Records. By 1992, Nirvana’s second album, Nevermind, had landed at number one on the Billboard chart, beating out new albums by Garth Brooks and Michael Jackson. In 1994, Kurt Cobain killed himself with a shotgun.
It is not an exaggeration to say this changed everything for indie music. While opportunistic rock bands spent the 1990s doing their best to follow in Nirvana’s musical and commercial footsteps, more serious indie bands took a different course. After Nirvana, they made music with the knowledge that although mainstream success and adoration remained on balance unlikely, they were not impossible. This knowledge took on an ominous quality in light of Cobain’s suicide, and some indie musicians went looking for ways to inoculate themselves against the potential consequences of fame. For years, a certain strain of indie rock had taken a nearly perverse satisfaction in making inscrutable jigsaw puzzles out of the old verse-chorus song form. Now, with “alternative music” at the top of the charts, obscure lyrics and scrambled song structures came to the fore. It wasn’t long before Stephen Malkmus, who fronted the now-famous Pavement, was telling Spin that he wrote lyrics from “the perspective of a guy who, inebriated at a party, is saying a great many things he doesn’t mean.” Groups like Sebadoh and The Mountain Goats began recording on cheap, low-fidelity equipment (think: a boombox), as though to make sure that even if some slick record executive took a shine to their music, it wouldn’t sound good enough for radio. It is not hard to see these as defensive gestures. Even as the mainstream was beginning to pretend to embrace something other than itself, bands like Pavement could wear their inscrutability like armor. They would remain, at least temporarily, unconsumed and uncorrupted by the industry that had begun to accept them, and that they, warily, had begun to accept in turn.
Then that industry began to collapse. In 2000, the pop group ’N Sync sold 2.42 million copies of No Strings Attached in its first week of release. Eight years later, it took Lil Wayne six months to sell the same number of copies of Tha Carter III. Each was the best-selling album of its respective year. The period in between amounted to a Napster-induced, industry-wide panic. One of the music industry’s first efforts to address this crisis emerged in the music itself. In the first years of the decade, a number of rock bands trading in blatant formal nostalgia became both commercial and critical favorites: The White Stripes, The Strokes, The Hives, and The Vines all played direct, sexy versions of by-the-book rock and roll, and were relatively untroubled by the ethical concerns that had dogged successful indie bands. Some of this music was pretty good, but Rolling Stone’s impulse to refer to The Strokes as “saviors of rock” had more to do with denying the existence of a new world. Pitchfork, appropriately, was more skeptical of this new music, but eventually they decided to like it, and it’s hard to blame them—it was so likable! “It’s Christ and Prometheus,” Ryan Schreiber and Dan Kilian wrote in a review of The White Stripes’ White Blood Cells, “eternally dying and rising again. . . . They don’t innovate rock; they embody it.” The frightening implications of this—whether the body of rock was really a corpse, for one—were not addressed.
Most of these bands began to fade within a year or two, and as indie bands watched the industry’s collapse, the envy and contempt they had traditionally felt toward major labels stopped making sense. What was there to envy anymore? Wasn’t it obvious that indie bands, with their devoted networks of fans, critics, and performance venues, had it better? Not only were the major labels soul-sucking money machines, they couldn’t even make you rich! This made early indie’s militancy and paranoia look silly, and the hard lines began to soften. In 2001, The Shins, a band from Albuquerque, New Mexico, released a gentle, nostalgic song called “New Slang,” and within a year it could be heard not only on college radio stations but also in a McDonald’s commercial, and eventually in the movie Garden State. The Shins were on SubPop, and when the Seattle Times asked one of the label’s creative directors about the band’s reaction to the McDonald’s offer, he recalled, “They were like, well we don’t really think this is compromising, someone wants to pay us to do what we do.” This turned the usual critique of selling out precisely on its head. “It’s a way for a band that doesn’t get signed for huge advances to be able to quit their day jobs for a while,” the creative director continued, “and concentrate on making music.” By 2003, Fox was making indie rock a staple on the soundtrack to The O.C., its biggest hit at the time. In an early episode Seth, one of the show’s main characters, even went so far as to name-check the Seattle band Death Cab for Cutie.

Pitchfork, too, began to shift from “angry mob” to kingmaker. In 2003, the site introduced a “Best New Music” designation to its review scheme, and the following year Ryan Schreiber took his first trip to New York City. “Ryan spent a whole day in Times Square,” one contributor recalled. “He was so happy.” Schreiber’s trip had musical justifications as well. Amid a wave of nostalgia for the 1980s, New York’s last decade of musical excitement, the city’s indie musicians had taken an interest in post-punk and dance music, and Schreiber was paying close attention. A group called LCD Soundsystem had recently released the song “Losing My Edge,” in which an aging James Murphy lamented how much cooler those younger than him were becoming: “I’m losing my edge to the internet seekers who can tell me every member of every good group from 1962 to 1978,” Murphy sang. “I’m losing my edge to the art-school Brooklynites in little jackets and borrowed nostalgia for the unremembered Eighties.” (“I hear you have a compilation of every good song ever made by anybody.”) For Schreiber, ascendant king of the “internet seekers,” this was like a dance-music gateway drug. In fall 2003, another dance-punk band called The Rapture released Echoes, and Schreiber was addicted for good. “Finally,” Schreiber wrote, in a 9.0 Best New Music review, “we are shaking off the coma of the stillborn slacker ’90s and now there is movement. Arms uncross, faces snap to attention, and clarity hits like religion.” Schreiber praised the new indie rock for “cultivating a new loathing and defiance for tired hipster poses.” It was time to dance: 
You people at shows who don’t dance, who don’t know a good time, who can’t have fun, who sneer and scoff at the supposed inferior—it’s you this music strikes a blow against. We hope you die bored.
Pitchfork, of course, did not become a site that promoted only dance music, but it had begun to move away from the detached skepticism that characterized so much of ’90s indie rock. More and more, it found itself promoting music that celebrated a certain kind of emotionalism. Earlier that year, Schreiber had written about diving into boxes of promotional records and coming to the surface with a Toronto band called Broken Social Scene between his teeth. Broken Social Scene was in perfect keeping with indie’s developing emphasis on nostalgia, collaboration, and empathy. One popular track on You Forgot It In People, for example, was called “Anthems for a Seventeen Year Old Girl.” Played in a warm major key, the song featured a small harmonic circle that endlessly repeated as the instruments swelled, as though the group were playing not a pop song but a round or lullaby. The band’s members, ranging from six to nearly twenty in number, referred to themselves as a “collective,” and they highlighted their own cooperative largeness by playing, in addition to the standard rock instruments, woodwinds, horns, and strings. But in his 1,100-word rave, Schreiber spent more time listing influences, DiCrescenzo-style: he compared the group to five other bands from Toronto, as well as Dinosaur Jr., Jeff Buckley, Spoon, Electric Light Orchestra, electronic musician Ekkehard Ehlers, The Notwist, and (here they come again) Sgt. Pepper’s Lonely Hearts Club Band. “You just have to hear it for yourself,” Schreiber concluded. “Oh my god, you do. You just really, really do.” Schreiber gave the album a 9.2, and suddenly Broken Social Scene, whose record had essentially disappeared from stores between its 2002 release and Pitchfork’s review, began selling out concerts. The idea that Pitchfork could wield the kind of influence that causes money to circulate—in other words, the kind of influence that still matters, even in our file-sharing times—started to take hold.
One year later, the debut album by a Montreal band called Arcade Fire received a 9.7. David Moore began his review with a historical excursus:
How did we get here? Ours is a generation overwhelmed by frustration, unrest, dread, and tragedy. Fear is wholly pervasive in American society, but we manage nonetheless to build our defenses in subtle ways—we scoff at arbitrary, color-coded “threat” levels; we receive our information from comedians and politicians. Upon the turn of the 21st century, we have come to know our isolation well. . . . In our buying and selling of personal pain, or the cynical approximation of it, we feel nothing.
Like Broken Social Scene, Arcade Fire was an enormous band—at some concerts they would put as many as fifteen musicians on stage. Members played antiquated instruments like the accordion and the hurdy-gurdy, and singer Régine Chassagne would perform wearing lace gloves. But if the instrumentation and stage costumes suggested a Victorian, semi-handmade aesthetic, the music itself embraced unembarrassed emotionalism. Lead singer Win Butler yelled his lyrics as often as he sang them—these were often laments for a damaged childhood—and he relished the moments when his voice would crack from the strain. The band also infused its music with a sense of spiritual grandeur by using a pedal point, a sustained bass tone that’s usually provided by an organ in church music. (The band recorded its second album, Neon Bible, in an actual church.) Expressiveness and sentimentality are what Moore liked best about Arcade Fire. Near the end of his review, he credited Funeral with “completely and successfully restoring the tainted phrase ‘emotional’ to its true origin,” as strong an endorsement of indie rock’s sincerity as Pitchfork would ever make. By the end of the following year, Arcade Fire had appeared on the Late Show With David Letterman and performed in Central Park.

Arcade Fire ended debates about whether Pitchfork mattered. The site’s job now was to learn how to handle its influence. As late as 2004, a journalist writing for the East Bay Express could accurately remark that many of Pitchfork’s reviewers “have no prior rock crit experience or interest in making a career out of this.” This wasn’t sloppy hiring on Pitchfork’s part; it had a lot to do with what the site thought (and continues to think) about the value of an individual writer. Although Pitchfork advertised its reviews as personal and idiosyncratic, it refused to let a writer get so good or so famous that he could have any kind of a following. Writers who didn’t like it were eventually shown the door, or found it themselves. Brent DiCrescenzo, easily the site’s best-known critic in the early days, quit Pitchfork in a 2004 Beastie Boys review. “The little number at the top of this piece”—it was a 7.9—“reflects little of my personal relation to the record,” he wrote. “It’s an arbitrary guide.” He went on: “This process has become unexciting and routine, which is why I bid the world of music writing farewell.” In the final paragraph, he came very close to giving away the great secret of Pitchfork’s signature brand of spasmodic reviewing. “I could continue to crank out divisive pieces of writing here until I go gray,” he wrote, and of course he could. Anyone could. Today, DiCrescenzo edits the music section at Time Out Chicago.
He left just in time. Pitchfork was taking major steps toward professionalization, and DiCrescenzo’s writing would have had more and more trouble fitting in. In 2004, Schreiber hired Chris Kaskie away from the Onion to handle business operations, and Scott Plagenhoef, who had worked as a music and sports writer in Chicago, signed on to oversee editorial content. For years the site had gotten along just fine publishing hopeless, funny teenagers; now it began to publish writers who had worked paid jobs at other publications (Tom Breihan, Village Voice), or whose tone made them well-suited for future professionalization (Nitsuh Abebe, New York). Over the next year or two, the site would introduce a number of columns, providing longer articles on particular genres as well as more abstract meditations on topics like talking about music with your friends or using Last.fm for the first time. With these changes, plus an attractive professional redesign in 2005—the first time anyone but Schreiber had designed the site—Pitchfork began to read less like an elaborate blog and more like an online glossy magazine.
This makeover had a number of consequences for the site’s reviewing practices. It meant no more reviewing albums by inventing a dialogue between a venue manager and a band, as Nick Sylvester did in a 2003 review of Jet’s Get Born: “Fuck you, we’re Jet!” says Jet. “Wherever we play people sleep with us.” (The album got a 3.7.) Also out of the question was reviewing a Tool album via a made-up essay by a teenage Tool fan: “I feel like this record was made just for me by super-smart aliens or something.” (A humiliating 1.9 for Lateralus.) But as the writing took on a more formal tone, and the site began to submit itself to a more rigorous editorial process, Schreiber maintained that Pitchfork’s animating principles had not changed. “You have to be completely honest in a review,” he told a reporter. “If it gets sacrificed or tempered at all for the sake of not offending somebody . . . that’s so the opposite of what criticism is supposed to be.”
Pitchfork’s grandest experiment in unfiltered honesty took place in 2004. In the 1990s, one of Pitchfork’s favorite bands had been the Washington DC rock group The Dismemberment Plan. But when the lead singer, Travis Morrison, released his first solo album, Travistan, Chris Dahlen gave it a 0.0. “I’ve never heard a record more angry, frustrated, and even defensive about its own weaknesses,” he wrote, “or more determined to slug those flaws right down your throat.” The record sold terribly as a result, and Morrison’s concerts, when they weren’t empty, became awkward. “I don’t think it occurred to [Pitchfork] that the review could have a catastrophic effect,” Morrison told a reporter. “Up until the day of the review, I’d play a solo show, and people would be like ‘That’s our boy.’ . . . Literally, the view changed overnight.” As Morrison’s career went into a tailspin, the opening line of Dahlen’s review, “Travis Morrison got his ass kicked,” began to sound self-congratulatory, as though Dahlen enjoyed being the one who did the kicking. Travis Morrison subsequently retired from music, returning only for a scattered batch of Dismemberment Plan reunion concerts in 2011. Today, the front page of Morrison’s website is dominated by the word “RETIRED!” in an enormous font, with a smaller note below: “Befriend me on Facebook. I’m so nice!”
Like all rock nerds, Pitchfork’s writers had always come off as snobs, but the Travistan review in particular struck bloggers and critics as an abuse of the site’s ever expanding power. Further evidence of snobbery was seen in Pitchfork’s refusal to allow comments—which really was unusual, given that Spin, Rolling Stone, and other magazines were rushing to let readers append their own record reviews to the professional ones. Perhaps Schreiber sensed that because Pitchfork’s reviewers were themselves amateurs—in another context, commenters—a commenting feature would have threatened the fragile suspension of disbelief that powered the Pitchfork machine.
Pitchfork was commonly accused at this time of “hipster cynicism,” but the charge was hard to square with the music Pitchfork liked. One of the site’s favorite musicians around 2005 was Sufjan Stevens, a sensitive young composer and songwriter from Detroit. After announcing his plan to make an album about each of the fifty states, Stevens went on to release Seven Swans, an album of mellow, explicitly Christian folk-pop. When he finally got back to the Fifty States Project and released Illinois, Pitchfork made it their Album of the Year. Amanda Petrusich’s review declared that Stevens’s beautifully orchestrated music (he really did use a small orchestra) made it hard to know “whether it’s best to grab your party shoes or a box of tissues,” and she approvingly quoted this lyric, from the song “Chicago”: “If I was crying / In the van with my friend / It was for freedom / From myself and from the land.” “At its best,” Petrusich wrote, “the album makes America feel very small and very real.” This new interest in pastoral nationalism seemed like a strange fit for indie rock; or at least it made plain that indie rock was in the hands of a new and different generation of fans. At the height of the Iraq war, college graduates poured into cities and took internships at magazines, nonprofits, and internet startup firms. They found themselves drawn, for some reason, to adorable music that openly celebrated our national heritage. They dressed like stylish lumberjacks and watched Sufjan perform dressed as a Boy Scout, and they remembered a disappeared world of the small and the tangible.
Around this time, Pitchfork also began championing a woman who wasn’t an indie rocker at all, and who would go on to become one of the decade’s most important musicians. As Sufjan Stevens was doing his best to imagine a time when the internet didn’t exist, British rapper M.I.A. seemed to pull her entire aesthetic off her wifi connection. Originally a designer and visual artist, M.I.A. dressed like a Myspace page, overlaying brightly patterned neon spandex with piles of fake bling. She designed her website to look like websites from the ’90s, and on an album cover she obscured her face with the bars that show how much of a video has played on YouTube. In her music, she translated the musical ideas behind mashups into a vague but appealing third-world cultural militancy. Her first mixtape was called Piracy Funds Terrorism Vol. 1, as in online music piracy. Mashup DJs like Girl Talk had begun redeeming mainstream pop songs by playing them all at the same time, a perfect party soundtrack for the listener who, though he didn’t actually like Fall Out Boy or Gwen Stefani, needed at least to know about them. M.I.A.’s politics worked in the same way, making the particular brutalities of oppression in Liberia or Sri Lanka danceable by lumping them into a vague condition of sexy global distress. “She’s not exploring subcultures so much as visiting them,” Scott Plagenhoef wrote in a review of her first album, Arular, “grabbing souvenirs and laying them out on acetate.” Plagenhoef didn’t see anything wrong with this, although in true Pitchfork style, he made sure to let you know that some people might object: “An in-depth examination of demonizing The Other, the relationship between the West and developing nations, or the need to empathize with one’s enemies would likely make for a pretty crappy pop song.” Around the same time, a contributor reviewing M.I.A.’s live concert defended her politics in a similar vein. “Maybe that’s how brilliantly innocuous Arular actually is,” he wrote. “It subtly imprints manifestoes in the brain, inspiring the masses to pull up the poor, without ever really teaching how or why.” Reading these strained, convoluted efforts to justify the cultural exploitation of global violence, I began to wonder why Pitchfork’s writers had such trouble saying the things they knew to be true.2 Maybe it was because they felt the truth would make for crappy pop songs, and that therefore the best thing would be to ignore it.
In 2007, Ryan Schreiber moved to Brooklyn, which had become one of indie music’s vital hubs. Just as the internet had weakened the major labels and the music magazines that Pitchfork would eventually see as its competition, so did it decimate the weekly newspapers that had supported indie rock throughout the 1990s. As the alt-weeklies went into decline, regional music scenes began to weaken, and indie bands began gravitating toward New York, the city with the greatest hype-generating media apparatus in the world. Priced out of lower Manhattan by the ’90s real estate boom, these bands lived in the Brooklyn neighborhoods of Williamsburg and Greenpoint, where promoters like Todd Patrick—better known as Todd P—had begun to organize DIY concerts in 2001 for acts like Yeasayer, the Dirty Projectors, and Dan Deacon. Although the main body of Pitchfork’s editorial operations stayed behind in Chicago, Schreiber almost immediately became a fixture at concerts. In that same year, the site also finally managed to buy its rightful URL from the livestock company. Pitchfork has made a home at www.pitchfork.com ever since.
One Brooklyn band that Pitchfork began investing serious energy in praising was Animal Collective. The four musicians comprising Animal Collective had grown up and met around Baltimore, but they didn’t start making a career out of music until after college, in New York. Between 2003 and 2009, Animal Collective released five full-length albums, each receiving a higher score on Pitchfork than the last (with a cumulative average of 9.1). If Broken Social Scene and Arcade Fire reminisced about childhood and adolescence in their lyrics, Animal Collective went further. They sounded like actual children. Pitchfork’s writers immediately latched onto the band’s blend of sonic eccentricity and emotional innocence. An early review referred to the group’s work as “fairy-tale music.” Another imagined the band’s members “dancing like children around the crackling fire among the pines.” Another: “It’s a child’s lack of self-conscience and ‘common sense’ that makes them holy. . . . Wisdom is wasted on the old.” Another: “There’s a romantic sense of longing, an air of celebration, but also tinges of doubt, loss, and acceptance.” In 2009, Pitchfork consummated its love affair with Animal Collective by giving Merriweather Post Pavilion the highest score the site had awarded to new music since Arcade Fire’s Funeral. “It’s of the moment and feels new,” Mark Richardson wrote, “but it’s also striking in its immediacy and comes across as friendly and welcoming.” In that album’s first single, “My Girls,” Noah Lennox sang, “I don’t mean to seem like I care about material things / Like our social stats.” You can call this inventive and complicated music many derogatory things, if you’re in the mood. The band’s signature is the combination of simple musical forms—the kinds you learn singing nursery rhymes as a child—and harsh, lacerating bouts of electronic noise. It is the kind of music, in other words, that is “not for everyone.” But you cannot call Animal Collective’s music insincere. One reason Pitchfork liked that line about social stats so much was that they saw their own project’s ambivalence reflected in it.

Did these bands suck? Was there something that Pitchfork had missed? Although Broken Social Scene, Arcade Fire, Sufjan Stevens, M.I.A., and Animal Collective all produced sophisticated, intelligent music, it’s also true that they focused their sophistication and intelligence on those areas where the stakes were lowest. Instead of striking out in pursuit of new musical forms, they tweaked or remixed the sounds of earlier music, secure in the knowledge that pedantic blog writers would magnify these changes and make them seem daring. Instead of producing music that challenged and responded to that of other bands, they complimented one another in interviews, each group “doing its own thing” and appreciating the efforts of others. So long as they practiced effective management of the hype cycle, they were given a free pass by their listeners to lionize childhood, imitate their predecessors, and respond to the Iraq war with dancing. The general mood was a mostly benign form of cultural decadence. It would be nice to say that Pitchfork missed something important, that some undiscovered radical alternative was out there waiting to be found. But Pitchfork’s writers are nothing if not diligent. They had it pretty much covered. 
By 2010, by almost any metric, Pitchfork was a tremendous success. In addition to its enormous daily output—five record reviews, a dozen or so news items, and frequent interviews with musicians and other celebrities—the site was publishing fourteen columns, from a regular report on the techno scene to “Why We Fight,” in which Nitsuh Abebe discussed the discussions that surround popular music. Pitchfork hosts music and video clips, and on Pitchfork.tv you can watch recordings of live concerts from any of six camera angles. The company remains wholly owned by Ryan Schreiber (official title: Founder/CEO), and still occupies its Chicago office in Wicker Park. Advertising is Pitchfork’s sole source of revenue, and if the flashy animated ads for Toyota, Apple, and American Apparel are any indication, the revenue is flowing. “We’ve succeeded at a time when nobody else has,” Scott Plagenhoef wrote in 2010. As Nick Denton competed with Conde Nast by running Gawker like a high-end magazine (albeit one with almost no publishing costs), Ryan Schreiber defeated print magazines by moving the whole operation online. By keeping user-generated comments off the site, Pitchfork has behaved more like a magazine than the magazines have. The only major modification Schreiber made to the print template—putting reviews, not interviews or features, at the center—was an ingenious adaptation to the dynamics of internet buzz: interviews may sell the rock and roll lifestyle, but reviews are what blogs will link to and argue about. Finally, of course, there is the archive. By constantly updating and adjusting its archive—whether by deleting early reviews or by writing up a reissued album for a second time—Pitchfork has become the only music publication to attempt an account of what it felt like to be a music fan in the last fifteen years. You cannot write the history of contemporary rock without acknowledging Pitchfork’s contribution.
And yet not all contributions are positive. The strangest thing about Pitchfork is that, for all its success, it hasn’t produced a single significant critic in fifteen years. Brent DiCrescenzo was a bit of a star for a while, but even with him the entertainment value almost always exceeded the insight. There are other magazines that subordinate the writer’s individual voice to an institutional voice—the New Yorker, for starters—but it’s strange for a rock magazine to do so, and even the New Yorker occasionally lets writers sound like themselves. Pitchfork couldn’t develop intelligence on the individual level because the site’s success depended largely on its function as a kind of opinion barometer: a steady, reliable, unsurprising accretion of taste judgments. Fully developed critics have a tendency to surprise themselves, and also to argue with one another, and not just over matters of taste—they fight about the real stuff. This would have undermined Pitchfork’s project.
That project—an ever evolving, uncontroversial portrait of contemporary tastes in popular music—addressed one problem surrounding music in the file-sharing era to the exclusion of all others. Faced with readers who wanted to know how to be fans in the internet age, Pitchfork’s writers became the greatest, most pedantic fans of all, reconfiguring criticism as an exercise in perfect cultural consumption. Pitchfork’s endless “Best Of” lists should not be read as acts of criticism, but as fantasy versions of the Billboard sales charts. Over the years, these lists have (ominously) expanded, from fifty songs to 100 or 200, and in 2008 the site published a book called The Pitchfork 500: Our Guide to the Greatest Songs from Punk to the Present. Similarly, Pitchfork’s obsession with identifying bands’ influences seems historical, but isn’t. When a pop critic talks about influences, he’s almost never talking about the historical development of musical forms. Instead, he’s talking about his record collection, his CD-filled binders, his external hard drive—he is congratulating himself, like James Murphy in “Losing My Edge,” on being a good fan. While Pitchfork may be invaluable as an archive, it is worse than useless as a forum for insight and argument. 

What did we do to deserve Pitchfork? 
The answer lies in indie rock itself. 
In the last thirty years, no artistic form has made cultural capital so central to its identity, and no musical genre has better understood how cultural capital works. Disdaining the reserves of actual capital that were available to them through the major labels, indie musicians sought a competitive advantage in acquiring cultural capital instead. As indie’s successes began following one another in increasingly rapid succession, musicians working in other genres began to take notice. Hip-hop is an illustrative foil. As indie bands in the ’90s did everything they could to avoid the appearance of selling out, rappers tried to get as rich as possible. The really successful ones stopped rapping—or at least outsourced the work of writing lyrics—and opened clothing lines and record labels. But for all their corporate success, rappers knew where the real cultural capital lay. When Jay-Z decided, as an obscenely wealthy entertainment mogul, that he wanted finally to leave his drug-dealer persona behind, he got himself seen at a Grizzly Bear concert in Williamsburg. “What the indie rock movement is doing right now is very inspiring,” he said to a reporter. One year later, his memoirs were published by Spiegel & Grau. 
Pitchfork has fully absorbed and adopted indie rock’s ideas about the uses of cultural capital, and the results have been disastrous. Indie rock is based on the premise that it’s possible to disdain commercial popularity while continuing to make rock and roll, the last half century’s most popular kind of commercial music. Sustaining this premise has almost always involved suppressing or avoiding certain kinds of knowledge. For indie bands, this meant talking circles around the fact that eventual success was not actually improbable or surprising. For indie rock’s critics, it meant refusing to acknowledge that writing criticism is an exercise in power. In 2006, two years after Arcade Fire should have made this kind of claim implausible, Schreiber tried to downplay Pitchfork’s importance in a newspaper interview: “So I think we maybe have this sort of snobbish reputation. But we’re just really honest, opinionated music fans.” Four years later, he said the same thing to the New York Times: “I don’t think that we see ourselves as anointers.” Nine months after that, Arcade Fire won the Grammy for Album of the Year—the first indie band to be so honored.
Indie’s self-deception has had consequences for fans as well. One kind of fan, at least originally, was the lower-middle-class white person, frequently a college dropout, who got by on bartending or other menial work and tried to save enough money to move out of his parents’ house. This kind of person got involved in indie rock to acquire cultural capital that he’d otherwise lack. A pretty good example of this kind of indie rock fan is Ryan Schreiber. In the last decade, however, indie rock has classed up, steadily abandoning these lower-class fans (along with the midsized cities they live in) for the young, college-educated white people who now populate America’s major cities and media centers. For these people, indie rock has offered a way to ignore the fact that part of what makes your dead-end internship or bartending job tolerable is the fact that you can leave and go to law school whenever you like. A pretty good example of this kind of indie rock fan is me. In the two years since I graduated from college, I’ve had a pretty good time being “broke” in New York and drinking “cheap” beer with my friends. But sometimes I remind myself that the beer I’m drinking is not actually cheap, and that furthermore I am not actually broke: if I married someone who made the same salary I make, our household income would be slightly above the national median, which is also true of almost every person I spend my free time with. The truth is that I inherited expensive tastes and moved to an expensive city, and sometimes I get cranky about not being able to buy what I want. But when I don’t feel like reminding myself of these things, I can listen to indie music. In Sufjan Stevens, indie adopted precious, pastoral nationalism at the Bush Administration’s exact midpoint. In M.I.A., indie rock celebrated a musician whose greatest accomplishment has been to turn the world’s various catastrophes into remixed pop songs. This is a kind of music, in other words, that’s very good at avoiding uncomfortable conversations. Pitchfork has imitated, inspired, and encouraged indie rock in this respect. It has incorporated a perfect awareness of cultural capital into its basic architecture. A Pitchfork review may ignore history, aesthetics, or the basic technical aspects of tonal music, but it will almost never fail to include a detailed taxonomy of the current hype cycle and media environment. This is a small, petty way of thinking about a large art, and as indie bands have both absorbed and refined the culture’s obsession with who is over- and underhyped, their musical ambitions have been winnowed down to almost nothing at all.
It’s usually a waste of time to close-read rock lyrics; a lot of great rock musicians just aren’t that good with words. What you can do with a rock lyric, though, is note the kinds of phrasing that come to mind when a musician is trying to fill a particular rhythmic space with words. You can see what kind of language comes naturally, and some of the habits and beliefs that the language reveals. This makes it worth pausing, just for a moment, over Animal Collective’s most famous lyric: “I don’t mean to seem like I care about material things.” The ethical lyric to sing would be, “I don’t want to be someone who cares about material things,” but in indie rock today the worst thing would be just to seem like a materialistic person. You can learn a lot about indie rock, its fans, and Pitchfork from the words “mean to seem like.”
I sometimes have the utopian thought that in a better world, pop music criticism simply wouldn’t exist. What justification could there be for separating the criticism of popular music from the criticism of all other kinds? Nobody thinks it’s weird that the New York Review of Books doesn’t include an insert called the New York Review of Popular Books. One of pop music criticism’s most important functions today is to perpetuate pop music’s favorite myth about itself—that it has no history, that it was born from nothing but drugs and “revolution” sometime in the middle of the 20th century. But the story of The Beatles doesn’t begin with John, Paul, George, and Ringo deplaning at JFK. It begins with Jean-Philippe Rameau’s 1722 Treatise on Harmony, which began to theorize the tonal system that still furnishes the building blocks for almost all pop music. Or, if you like, it goes back to the 16th century, when composers began to explore the idea that a song’s music could be more than just a setting for the lyrical text—that it could actually help to express the words as well. Our very recent predecessors have done many important and wonderful things with their lives, but they did not invent the musical universe all by themselves. The abolition of pop criticism as a separate genre would help pop writers to see the wider world they inhabit.
Most of all, though, we need new musical forms. We need a form that doesn’t think of itself as a collection of influences. We need musicians who know that music can take inspiration not only from other music but from the whole experience of life. Pitchfork and indie rock are currently run by people who behave as though the endless effort to perfect the habits of cultural consumption is the whole experience of life. We should leave these things behind, and instead pursue and invent a musical culture more worth our time.
1 Pitchfork was not the only music archive emerging on the internet at this time. In 1992, an ex–folk musician named Michael Erlewine began publishing a print volume called the All Music Guide, and in 1995 he took the Guide online. Where Pitchfork’s coverage was limited by the relatively narrow tastes of its tiny staff, Allmusic.com aspired to true comprehensiveness, with separate sections dedicated to Pop/Rock, Country, Latin, Reggae, Classical, R&B, Jazz, and other genres. The site rated albums on a five-star scale, and accompanied reviews with bulleted lists identifying a record’s “Moods” (“Austere,” “Suffocating,” “Intense,” “Nocturnal”) and “Themes” (“Feeling Blue,” “Late Night,” “The Creative Side”). With a three-year head start and funding that Schreiber could only dream about, Allmusic initially seemed on its way to internet dominance, but the site’s rather bland comprehensiveness would turn out, somewhat counterintuitively, to be limiting. Erlewine’s error can be found in the assumption that what people wanted was an encyclopedic survey of music itself. Pitchfork knew that what people actually wanted was an encyclopedia of musical taste.
2 Lynn Hirschberg, writing for the New York Times Magazine in May 2010, finally made some of these points in a profile called “M.I.A.’s Agitprop Pop,” but the best critique of M.I.A. wasn’t made by a critic. It appeared in the lyrics to a song by Vampire Weekend, in which frontman Ezra Koenig sings about a young woman attending what it seems obvious to me is an M.I.A. concert: “A vegetarian since the invasion / She’d never seen the word ‘Bombs’ / She’d never seen the word ‘Bombs’ / blown up to ninety-six-point Futura / She’d never seen an A.K. / In a yellowy DayGlo display / A T-shirt so lovely it turned all the history books gray.”
This article has been corrected since its publication to reflect that Merge Records was based in Chapel Hill, not Durham, in the 1990s, and that Ralph’s Corner was in Fargo-Moorhead, not Minneapolis.
