If you google “the death of the album” you’ll find it’s something both well- and ill-informed pundits have been banging on about for a few years now. Without getting into the merits or otherwise of these articles, I think we can agree that all years are not equal when it comes to great albums. Some years are just better than others. But what of 2012? We could argue that for hours, but it seems fitting that in the year that gave us Nate Silver and gave Moneyball an Oscar nomination we should turn to the stats.
Step forward Pitchfork, the site that seems to be most committed to the album format, with over 1,200 album reviews every year, each one 500-800 words long. They may be too in thrall to the long-form record to openly comment on its demise, but surely with their super-anal scoring technique, where each album is given a mark out of 10 specific to one decimal place, we can tell how 2012 fared. Excuse me while I crunch some numbers…
Well, that was interesting. Although they haven’t awarded a new album a top score of 10 in the last two years*, on average, albums are scoring the same across the board. If you were to look at these reviews you would have very little clue that the quality of albums ever changed at all. Here are some stats based on reviews in 2012 compared to the previous two years.
- Total: Pitchfork reviewed 1243 albums in 2012 – slightly more than were reviewed in 2010 and 2011.†
- Brilliant: Less than 1% of albums in each year were awarded a score of 9.0 or above.
- Average: In 2012 the mean score awarded for an album was 7.1 – the same as in 2011 (7.0 in 2010).
- Well Below Average: Almost exactly 25% of the albums in each year were awarded a score of 6.4 or less.‡
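The number crunching behind those bullets is simple enough. Here is a minimal sketch in Python, using a short made-up list of scores (the real dataset would be the ~1,200 reviews scraped for each year):

```python
# Hypothetical review scores; the real data would be a full year of reviews.
scores = [7.8, 5.4, 9.1, 6.4, 7.0, 8.2, 6.3, 7.5]

mean = sum(scores) / len(scores)
brilliant = sum(s >= 9.0 for s in scores) / len(scores)     # share scoring 9.0 or above
below_cutoff = sum(s <= 6.4 for s in scores) / len(scores)  # share at 6.4 or less

print(f"reviewed: {len(scores)}")
print(f"mean score: {mean:.1f}")
print(f"9.0 or above: {brilliant:.0%}")
print(f"6.4 or below: {below_cutoff:.0%}")
```

The same three lines, run over each year’s reviews, give the Total, Brilliant, Average and Well Below Average figures quoted above.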
Pitchfork’s Top 50 albums critics’ list closely tallies with their highest rated albums of the year, but there are always a few albums that fall out of favour or are bumped higher up the list:
- Timing: For 2012, the best time of year to release an album was October, followed by April: 17 of the top 50 end-of-year albums were reviewed in those two months. For 2010 it was May and September (15 out of 50); for 2011, June and January (14 out of 50).
- Grower: Grimes’ Visions was the album judged to have improved the most this year – leapfrogging 13 albums that scored higher on initial review.
- Unlucky: Whereas Lambchop’s Mr M was judged to have aged poorly, finishing behind 13 albums that scored lower.
- Permanence: Cloud Nothings’ Attack on Memory had the most staying power – the only album reviewed in January to make it inside the top 20 at year end.
- Waning Charms: Twin Shadow’s Confess and Converge’s All We Love We Leave Behind were the biggest losers – scoring higher than 33 albums that did make the top 50 but failing to place.
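The Grower/Unlucky comparison works by setting each album’s review score against its year-end rank. A sketch of the idea, with hypothetical data (the names, scores and ranks are invented for illustration):

```python
# Invented (score, year_end_rank) pairs; lower rank = higher placing.
albums = {
    "A": (8.5, 1),  # finished first despite a lower score than C: a "grower"
    "B": (8.8, 2),
    "C": (9.0, 3),  # outscored A and B on review but finished behind them
}

def leapfrogged(name):
    """How many higher-scoring albums this one finished ahead of."""
    score, rank = albums[name]
    return sum(1 for s, r in albums.values() if r > rank and s > score)

def aged_poorly(name):
    """How many lower-scoring albums finished ahead of this one."""
    score, rank = albums[name]
    return sum(1 for s, r in albums.values() if r < rank and s < score)
```

Run over the real top 50, `leapfrogged` gives Visions its 13 and `aged_poorly` gives Mr M its 13; the Waning Charms count is the same comparison applied to high scorers that missed the list entirely.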
Personally I love the Lambchop album and find the praise heaped on Visions baffling. You can draw your own conclusions from the rest of the stats, I’ll simply add that I’m going to go out on a limb and predict almost exactly the same results come the end of 2013. An average score of 7.1 and 25% of albums getting a score of <6.5. Plus, I reckon they’ll hand out another 10 this year, either to The Knife, a hip hop artist, or a debut album by a bedroom producer.
*Kanye West’s My Beautiful Dark Twisted Fantasy.
†Pitchfork review five albums a day each weekday excluding American public holidays and none during the industry down time of the last 2-3 weeks of December when practically no new albums are released. Occasionally, albums in a box set will get individual scores, hence the slight disparity in each year’s review numbers.
‡ I call a score of <6.5 ‘The Everett True mark of failure’ after the music critic who ranted about “a world full of music critics lazily and cravenly praising everything in their path … for if they don’t, their editors won’t run the review or feature or article. Look around you. It’s already happened. How many reviews graded below 6.5 stars do you think Pitchfork runs?” His opposition to what Pitchfork does is so obsessive that he just assumes they rarely give less than glowing scores.