Beyond our own borders
As I threatened slash promised a few days ago that I would do, I want to elaborate fairly briefly (for me, at least, about this particular subject, anyway) on some of the alternatives to the inglorious hash that is the SN chart. Let me say upfront that I’m not describing these other schemes in some idealistic frenzy of hopefulness that sg will end up emulating any one of these methods. Frankly, experience being a guide, there’s no reason to believe anything structural or foundational is going to change about the SN charts anytime soon, Ken Kirksey’s bloviating notwithstanding. Instead, I offer these by way of suggesting that there is a world beyond the insularity and provincial myopia of sg, and that this other world grew up from the seeds of innovation, creativity, conflict, and discordance that are common to … well, just about every industry, system, or economy.

Also, I should say how deeply indebted I am to a variety of informal tutors, experienced radio hands, and music executives for giving me a crash course in radio charts over the last few weeks and months. These generous experts have given of their knowledge and time pretty amazingly, and my only regret in having the conversations I have had with them is that the hostile and retributive climate in which the SN does bidness prevents me from thanking them openly and by name (after I published the Hierline No. 1 spot in advance of the “official” release to the public, Ken Kirksey had the temerity to ask me to tell him who was giving me information about the SN chart, a request which I ignored because I assume he didn’t want to just call these folks up and invite them to high tea). Needless to say, any errors or inaccuracies are totally my own and shouldn’t be attributed to my teachers (though of course that’s not going to be a problem, since you don’t know who they are). So here we go.
For the sake of ease and because of proximity, I’m going to focus on four chart formats that are used in Christian music. Notice I said “chart formats” rather than charts, because our first lesson of the day is that there are different methodologies behind different charts depending on their format and different reasons for their existences. The cynic may dismiss variety as so much diversified greed (“all charts are designed as sales tools for record companies”), and there may be some truth to that opinion, since most charts - even glacially slow ones like the SN - purport to reflect “current” airplay for “current” singles and so give labels and promoters a hook to sell more product and buy advertisements in publications like the SN. Still, there are substantive differences among various charting methods, the understanding of which can help us make sense of where sg is or is not headed with the SN charting the course.
1. Billboard. By far the supreme commander of radio charts, Billboard is everywhere. So much so that I’m going to let the good folks over at wikipedia do my ’splainin’ for me. There’s also some other Billboard-related info here, especially about the way in which Billboard uses a mix of sales and spins to compile its charts (though the precise nature of the formula is not clear to me). Also, note that this year, Billboard began incorporating sales data from new-media content providers like iTunes into some of its charts.
2. The PDAdvisor. PDA describes itself as “an ePublication focused on the needs of Christian radio.” What that means is that “subscribers receive The PDAdvisor via a weekly HTML email on Tuesdays, featuring complete AC, CHR, and Inspirational airplay charts, detailed information on new and developing songs, and trend indicators for each chart.” But more to the point, the PDA takes spin counts for each song in the chart and weights them according to a station’s market size, so that songs accrue points instead of spins. Thus the PDA trades in the currency of points per market. This category-based weighting system generates a weekly index of the hottest songs across the country and/or in regions. So the target audience for PDA tends to be labels and artists themselves (and, I suppose, retailers trying to figure out what titles to carry to guarantee good sales).
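To make the points-per-market idea concrete, here’s a minimal sketch of how that kind of weighting might work. The market tiers and weight values are purely my own illustrative assumptions; PDA’s actual formula is not public, so far as I know.

```python
# Hypothetical sketch of a PDAdvisor-style weighted chart.
# The market tiers and their weights are invented for illustration;
# they are NOT PDA's actual formula.

MARKET_WEIGHTS = {"large": 3.0, "medium": 2.0, "small": 1.0}

def weighted_points(reports):
    """reports: list of (song, market_size, spins) tuples from stations."""
    points = {}
    for song, market, spins in reports:
        # Spins become points, scaled by the reporting station's market size.
        points[song] = points.get(song, 0.0) + spins * MARKET_WEIGHTS[market]
    # Rank songs by total points, highest first.
    return sorted(points.items(), key=lambda kv: kv[1], reverse=True)

chart = weighted_points([
    ("Song A", "large", 20),  # 20 spins in a large market -> 60 points
    ("Song B", "small", 50),  # 50 spins in a small market -> 50 points
])
# Song A outranks Song B despite getting fewer raw spins.
```

The point of the sketch is just the trade-off: a weighted chart lets a handful of big-market spins outrank a pile of small-market ones.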
3. Radio & Records. Never mind how R&R describes itself; it’s a bunch of corporatese and PR pablum that could just as easily have been written for Hostess Cupcakes, give or take a few words. The basic R&R methodology relies on spin count alone. Tally up all the times a song was played this week, locate the songs that were spun the most, and there’s your chart. By emphasizing spin count, this kind of chart can help democratize song rankings. A spin is a spin is a spin, no matter how big or small the station. What this method doesn’t handle so well is the problem of market penetration. A spin in Nashville is vastly different from a spin in, say, Rootwad, Alabama (that is, while the Rootwad station may reach a higher percentage of the potential listeners in its market, the Nashville station is reaching far and away exponentially greater numbers of people; nevertheless R&R treats a spin in Rootwad the same as a spin in Nashville). R&R does try to ameliorate this problem by imposing minimum Arbitron ratings on its charting stations, which attempts to ensure that charting stations meet a baseline standard of success in their market, but the underlying problem persists nevertheless.
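A raw spin-count chart with a ratings floor can be sketched in a few lines. The specific ratings threshold here is an assumption of mine, not R&R’s actual cutoff:

```python
# Sketch of an R&R-style spin-count chart with a minimum-ratings filter.
# MIN_RATING is a hypothetical Arbitron floor, not R&R's real number.

MIN_RATING = 1.0

def spin_chart(reports):
    """reports: list of (song, station_rating, spins) tuples."""
    totals = {}
    for song, rating, spins in reports:
        if rating < MIN_RATING:
            continue  # stations below the ratings floor don't get counted
        totals[song] = totals.get(song, 0) + spins  # a spin is a spin
    # Rank by total spins, highest first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

Notice that once a station clears the floor, its spins count exactly the same as anyone else’s: that is both the democratizing virtue and the market-penetration flaw described above.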
4. Christian Radio Weekly. CRW bases its chart on audience impressions. As I understand it, this means that CRW takes a reporting station’s Average Quarter-Hour listeners (or AQH, the broadcast metric that measures the average number of people listening to a specific station for at least 5 minutes during any 15-minute period of any given daypart) and multiplies it by weekly spins to determine total audience impressions for a song. Add up all the station impressions for a song and rank them accordingly, and there’s your CRW chart. This kind of impressionistic charting, if you will, is useful if the goal is to identify how many people are being exposed to a particular song. The major drawback is that large-market stations and broadcast networks command the chart. One major-market station in a metropolis or a single national network picks up a song and boom … the song is halfway up the chart the first week. Doesn’t mean the song is any good. Doesn’t even mean listeners like it or that it has tested well. It just means that one music director somewhere big decided to play the song and it got a lot of exposure that way.
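The impressions arithmetic itself is simple: AQH listeners times weekly spins, summed across stations. The listener and spin figures below are invented to show how a single big-market station swamps several small ones:

```python
# Sketch of a CRW-style audience-impressions chart (AQH x weekly spins).
# All listener counts and spin totals here are made up for illustration.

def impressions_chart(reports):
    """reports: list of (song, aqh_listeners, weekly_spins) tuples."""
    totals = {}
    for song, aqh, spins in reports:
        totals[song] = totals.get(song, 0) + aqh * spins  # impressions
    # Rank by total impressions, highest first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

chart = impressions_chart([
    ("Song A", 40_000, 15),  # one big-market station: 600,000 impressions
    ("Song B", 1_500, 30),   # small station #1: 45,000 impressions
    ("Song B", 2_000, 25),   # small station #2: 50,000 impressions
])
# Song A dominates on the strength of a single large station.
```

Which is exactly the drawback described above: one major-market add can rocket a song up the chart regardless of whether anyone actually likes it.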
Lastly, there’s the important business of where and how these various charts (and most others like them) get their data. Charts like these don’t typically monitor airplay themselves. Instead, they farm out the monitoring to third parties. In most cases, charts usually get the raw spin-count for songs from a service called Broadcast Data Systems (or one like it; the market for this kinda thing is, I gather, growing). BDS uses a form of audio fingerprinting to track what songs are played where, by whom, and how frequently. This tracking data is then compiled in reports that go out to subscribers, usually on a weekly basis. BDS has a central monitoring facility in each market (markets being determined in most genres, I assume, by Arbitron’s delineation of broadcast sectors). This means there’s no equipment per se that stations have to have in order for their airplay to be monitored. What is required is that BDS or some other monitoring system be willing to track the format of music you play and want to chart.
I’ve offered this not-exhaustive survey in an attempt to show how important it is to clearly understand the purpose of a chart. The SN chart purports to be a gauge of new sg music, yet the way it’s operated could make one wonder. For instance: Artists like Ricky Skaggs and the Oak Ridge Boys are not included in the SN chart, even though they often record and have a great deal of success with sg music. Nor are CCM artists that cover sg tunes tracked by the SN. (And, though I’m going purely from my own eye-witness account here, I have never seen a song that’s dropped out of the Top 40 come back on to the SN charts, even though popular songs can have revivals and pretty steady popularity in the bottom third or quarter of the chart after initial fluctuations. If the SN does have a rule banning encore appearances in the Top 40 - and let me hasten to add that I do not know of any official rule - this would only complicate things more. Can anyone confirm this one way or another?) All of these things (and perhaps others?) combine with the bribability of charting stations to create a chart that measures virtually nothing meaningful … there are simply too many thumbs on the scale.
These problems, I suspect, are symptomatic of deeper contradictions within the SN itself about the chart. While Ken Kirksey repeatedly and publicly insists that stations are “supposed to report the twenty songs receiving the most airplay on your station” and that charting stations “should consider no other factor when you create the report you send” to the SN, Maurice Templeton is on record with at least one radio-station manager as saying that “If the DJ’s are doing as instructed, the charts are not made up from programmers or DJ preferences but from listeners request.” I suppose one could say that Templeton was arguing from expediency (since in this case he was trying to rationalize why the SN refuses to recognize internet radio). Or one could, I guess, construe his remarks to mean that chart affidavits are a direct reflection of listener requests, but there’s very little evidence to suggest that such a correlation actually exists (why this is so is another matter altogether). Certainly, Ken Kirksey doesn’t seem to be saying this at all. In any event, Kirksey’s and Templeton’s statements at least raise the possibility that the SN chart is hobbled as much by its own internal contradictions and the cross-purposes of its owners as anything else. The guy who owns the thing thinks we’re getting a gauge of listener preference; the guy running the thing on a day-to-day basis pretends we’re getting spin-counts, except when he doesn’t bother to pretend that at all and just bemoans how people take advantage of the system. 
And in either case, the arguments don’t hold up under scrutiny: as long as the chart is built on unverifiable affidavits, the SN has to admit - as Kirksey repeatedly has, in order (one assumes) to prove his get-tough bona fides to Templeton - that no matter what it was intended to measure when it was set up, the chart finally can be said with confidence to reflect only the popularity of a given song among a handful of radio station managers and music directors, some of whom are obviously under the sway of their own self-regard or the wiles of unscrupulous promoters.
If we were to invite our cynic back out on stage for a minute, we might be reminded that perhaps profit is the unifying goal in the SN’s case and that the SN chart does a fine job of achieving that goal. I would like to believe that’s not the only or primary factor here, but I confess that I find myself pretty unavoidably reversing my earlier, overly glib claim that “it wouldn’t be that difficult” to change the chart. While in theory it ought to be easy, the intransigent reality of sg is much more impervious to change than I was initially willing to admit. It’s partly a question of money, yes … and it’s also a question of the current business culture within sg. Even if the establishment were interested in truly reforming or fixing the SN charts by adopting common music-industry standards for charting, it’s not at all clear that sg is taken seriously enough by outfits like BDS to attract or support the infrastructure necessary for the improvements. That is, sg has to be willing to give up the fiercely guarded prerogatives of feudal lords reigning over a dwindling empire. But really, it comes down to the fact that the SN chart and its flawed methodology persist because lotsa people like it that way.