Sunday, April 29, 2007

A Way to Extend Your Music Test

Do you have a listener email database?

First, if you don't, I'd suggest that you consider building one. At the very least, make "joining" an option for listeners who visit your Web site. I have found a number of stations that require an email address in exchange for access to their streaming audio. I'm not certain that isn't overdoing it, but in my own case I haven't felt it was too much to ask.

A station that I "signed up" with sends me email on a regular, but not irritating, basis. Two of their recent emails are worth noting.

Shortly after completing an AMT/library music test, this station sent an email out to the database.

We were informed that the station had just completed a huge music survey. We were asked to listen to the station critically over the next two weeks. We were given an email address for our comments, good or bad.

Actually, it was very well written. But to respect the station's privacy and creativity, I won't share the exact wording (I've hidden it in a safe place).

A few days later, I received a second email, again asking for feedback. This time there was a specific date by which comments should be sent. I liked the sense of urgency that created.

I also think it made it sound much more serious and important because the station established a deadline.

My first impression is that this is a very nice extension of the music testing process. And why not alert some of the people who are most likely to notice your changes? Make sure those changes pass the giggle test.

Do you agree?

I would suggest just one more thing, given that a deadline was set for comments. I'd enter everybody in a random drawing (making it clear that there is no penalty for negative comments). The prize doesn't need to be expensive. But it should have the station's name on it. And of course everybody should receive a "thank you."

Wednesday, April 18, 2007

What Do You Do With A Song That Everybody Likes?

I have a question. I'm not asking it here for the first time. I'm in the middle of a discussion with one of the best programmers in the world about this. Perhaps you, dear reader of this blog, could help us out.

Here is the issue: A song came back from a music test with only 5% of the respondents giving it a "love it" rating. That put it in a tie for 532nd place. On the other hand, the song was ranked #6 for number of "like it" responses. And there were exactly zero "don't like it" responses.

So, perhaps we could say that nobody disliked it, nearly everybody liked it, and only a few people were passionate about it. Or can we put it that way?

One other bit of information: The song's "type" was a perfect match with the center sound of the station. Its music fit rank was #5, meaning that it was classified by the music test respondents as being the same kind of music as the music at the very center of the format.

I don't want to bias you just yet by giving our opinions on this so far. But it is an important question, because it makes us consider:

1. How important is "love" or "favorite" versus "like it"?

2. How much are we to be guided by playing favorite songs, versus playing songs that nobody dislikes? (Yes, I agree that is a variation on question 1).

3. If a song is the "right kind" of music, do you give it extra credit? And if so, just how do you do it?

4. With such a song, what do you think you'd do with it?

5. What factors matter most? What might contribute most to deciding whether it is a power, a secondary, an occasional spice/filler song, or a song that isn't played at all?
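Question 2, the trade-off between playing favorites and playing songs nobody dislikes, can be made concrete with a toy scoring sketch. The song names, percentages, and weights below are purely hypothetical and are not from any real music test or scoring methodology; the point is only that the same data can produce very different rankings depending on how much "love it" is weighted against "like it" and "don't like it":

```python
# Hypothetical music-test data: percentages of respondents giving each
# rating. "Song A" mirrors the song discussed above: few loves, many
# likes, zero dislikes. The other songs are invented for contrast.
songs = {
    "Song A": {"love": 5,  "like": 80, "dislike": 0},   # broadly liked
    "Song B": {"love": 40, "like": 30, "dislike": 20},  # polarizing favorite
    "Song C": {"love": 10, "like": 50, "dislike": 5},
}

def score(s, w_love, w_like, w_dislike):
    """Collapse the three ratings into one number under chosen weights."""
    return w_love * s["love"] + w_like * s["like"] + w_dislike * s["dislike"]

# A ranking that rewards passion ("love it") heavily:
by_passion = sorted(songs, key=lambda n: score(songs[n], 3, 1, -2), reverse=True)

# A ranking that rewards broad acceptability and punishes dislikes:
by_acceptance = sorted(songs, key=lambda n: score(songs[n], 1, 1, -5), reverse=True)

print(by_passion)     # the polarizing favorite wins
print(by_acceptance)  # the broadly liked song wins
```

Under the passion weighting, Song A ranks behind the polarizing favorite; under the acceptance weighting it ranks first. Which weighting is "right" is exactly the judgment call the questions above are asking about.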

Email me: scasey@upyourratings.com with your opinion. Please help me write the follow-up to this post.

Thanks!

Monday, April 09, 2007

For Radio, Passion Exists Within a Context

Passion is a word used often in radio programming, and often used incorrectly.

Okay, perhaps not so much incorrectly as incompletely. Passion exists within a context.

If somebody loves a song they hear on your station, but they generally dislike the rest of your music, there is some "passion" there, but it probably won't translate into loyalty to your station, or into spending more time with it.

If somebody hears 6 songs in a row that they like, and they find they generally like the other music on your station, there may be no one song that generated tremendous "passion". But they are likely to feel passion for your station, and it will translate to spending more time listening to you in the future.

As obvious as this is, many radio programmers insist on looking - in isolation - at the "favorite" rating for a song, and making a judgment as to how much airplay it should receive. But very few songs generate "favorite" ratings of more than 33%. So for most songs, more than two-thirds of the listeners are not expressing this particular type of "passion". You can't build a successful station by playing enough songs with enough favorites to generate continuing passion. You must generate passion for the station itself. It is one of those simple (but perhaps difficult to accomplish) truths.

Whether we want to admit it or not, the game we play is "don't leave". We are usually a background for a listener's life. Other things are going on. People are not breathlessly waiting, with passion, to hear the song that makes them dance with joy. No, they are listening for "their" kind of music, and mostly they expect it to be familiar and comfortable. They want it on as a soundtrack behind what they're doing. Violate that; jump out and grab their attention with too many things that stand out as odd, or that don't meet their expectations, and you'll see passion. You'll see a passionate turn of the knob to "off". Or to another station.

My thought today is that we should remember that listeners don't hear our stations as a sequence of songs. They hear an environment. Within that environment, the songs must work together to reinforce why the listener chose to listen to us. There is a center to the position our programming occupies and to what we represent, and a circle of music around that center.

My work is focused on this reality. A modern music test should allow your music to work better to reinforce your programming vision. The chief tool I use to achieve that is the music fit analysis built into MusicVISTA. Regardless of who gathered your music opinion data, I would be happy to show you how your listeners organize the songs to determine what best qualifies as their kind of music.