Stop Using Views To Measure YouTube Success

Measurement is a hot theme right now. Lots of smart people are writing about it regularly and Joseph Thornley (also a smart guy) is even organizing a social media measurement roundtable.

Here’s a measurement issue that’s bugged me for a while.

I keep seeing and hearing people citing video views as a critical success measure.

For example, Dan Ackerman-Greenberg pushed views as his key success measure with his (*shudder*) "viral" YouTube strategies.

On the other (i.e. right) side of the ethical fence, Shel Holtz and Neville Hobson talked about the number of views of Microsoft’s YouTube videos on a recent episode of FIR.

Isn’t there a better way to measure the success of videos?

It really seems like people take the path of least resistance, using an easily available bit of data to measure success without considering whether it actually demonstrates success.

  • Does the video change people’s knowledge, perceptions or behaviour?
  • Do viewers get the message?
  • Do they go to your website after viewing the video?
  • Do they buy your product/service after seeing your promo?
  • Do they take whatever other action you want them to take?

Views don’t answer any of these questions. Sure, they’re nice to know and a large number of viewers may well be better than a small number, but not necessarily.

  • Are all those people your intended audience?
  • Are they influential in their field?

If not, then all those views may mean nothing.

Do several hundred thousand views of Microsoft’s new videos mean they’ve succeeded? Maybe. On the other hand, the negative comments would seem to indicate otherwise. Only more useful measures will tell.

Sure, gathering more useful stats would take more work, but what’s the point of measuring if you don’t?

I’m no measurement expert so I put the question out to you: What’s a better way to do this?

6 Responses to Stop Using Views To Measure YouTube Success

  • Dead simple – what’s the call to action? If you make a video you expect to do well, put in a unique URL and track to see how many people take an action from it.

    Ultimately, what’s the goal? A sold t-shirt? A vote? A funded loan?

    Views and traffic are just the top of the sales funnel.
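The unique-URL idea above is easy to sketch. This is a minimal illustration, not anyone’s actual tracking setup: the view count, action count, and URL are all hypothetical.

```python
# Sketch of tracking a call to action via a unique URL shown in a video.
# Compare raw views against the actions actually taken.
# All numbers and the URL are hypothetical.

def conversion_rate(views: int, tracked_actions: int) -> float:
    """Fraction of viewers who took the call to action."""
    if views == 0:
        return 0.0
    return tracked_actions / views

views = 250_000    # raw YouTube view count
actions = 1_200    # hits on the unique URL, e.g. example.com/video-offer

rate = conversion_rate(views, actions)
print(f"{rate:.2%} of viewers acted")  # the view count alone hides this
```

The point of the sketch: a quarter-million views sounds impressive, but the conversion rate is the number that answers “did they take the action we wanted?”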

  • I’ve always liked Kirkpatrick’s rubric: Do they like it? Do they remember it? Does it impact the way they go about their lives? Does it impact the bottom line?

    Level 1 is easy – the star rating. Level 2 might be some measure of “viral-ness”, and views might be part of that.

    Level 3 might be a little difficult to quantify, but perhaps spawning blog posts or remixes might be a measure. See Souljah Boy, Dramatic Hamster, and Star Wars Kid for examples. For Level 4 you’re back to regression analyses on market surveys that include the YT video on the list of “where did you hear about this?”

  • Making it worse is the Autoplay feature, which distorts views, since many people click on and then off again quickly.

    And the nature of the audience – who seem predominantly like small-brained children who reject anything that doesn’t fit a narrow band of entertainment criteria.

    An intelligent combination of all the measures – watched to the end, visited site, favourited/liked, linked to, subscribed, embedded, honours. Last of all ratings and comments. Star ratings can be skewed by dumb kids and gamed easily by companies or competitors. Comments on YouTube are ridiculous – the haters there are absurdly rampant, not helped by anonymity.

    These are all things that a less lazy videomaker can work on improving.

    As a combination, these things are obviously not as easy to assess or reward as simple views (though that ease is worth little, since views are mostly meaningless) – but they could be quantified by people who understand the scene. If they have the motivation. And it doesn’t seem many people do at the moment.

    So we’re probably stuck with views, at least for ’08.

    Meanwhile the battle between pre-roll and overlay ads will rumble on ignored. Most viewers will put up with pre-rolls, I think – but Google are determined to build ads right into the frame throughout. Which will provide a whole bunch of other interactions to measure. We’ll see how quickly that becomes as irritating and useless as website banner ads.
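The “intelligent combination of all the measures” this commenter describes could be sketched as a weighted score. Everything here is hypothetical: the metric names, the weights (which deliberately put ratings and comments last, as the comment suggests), and the sample numbers would all need real calibration.

```python
# A hedged sketch of combining engagement signals into one score.
# Weights are hypothetical; ratings and comments are weighted lowest
# because they are the easiest to game.

WEIGHTS = {
    "watched_to_end": 5.0,
    "visited_site":   4.0,
    "subscribed":     3.0,
    "embedded":       2.0,
    "favourited":     2.0,
    "linked_to":      2.0,
    "rating":         0.5,
    "comments":       0.5,
}

def engagement_score(metrics: dict) -> float:
    """Weighted sum of per-viewer engagement rates (each 0.0 to 1.0).

    Metrics absent from the dict count as zero.
    """
    return sum(WEIGHTS[name] * metrics.get(name, 0.0) for name in WEIGHTS)

# Hypothetical video: 60% watch to the end, 5% visit the site, 2% subscribe.
video = {"watched_to_end": 0.6, "visited_site": 0.05, "subscribed": 0.02}
print(round(engagement_score(video), 2))  # prints 3.26
```

Two videos with identical view counts can land far apart on a score like this, which is the commenter’s point: the combination is harder to compute than views, but it actually says something.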

  • I think views can be a success measure if one of your goals is simply awareness. For example, as part of my day job I’ve been looking at the number of views of Dove’s “Onslaught” video as one of the ways to gauge overall awareness of the Campaign For Real Beauty. Now, did people who viewed the videos change their attitudes on concepts of beauty, which is the intention of the campaign? Well, that’s harder to measure, but some of the other things being tracked are comments on video sharing sites and blogs, as well as use and downloads of the various tools & docs on the Dove site that are meant to promote girls’ self-esteem. All these measures, taken together, will be used to gauge the overall effectiveness of the campaign, but I don’t think views can be discounted as an important measure.

    Or, put another way, if no one is watching your video they don’t have the opportunity to form any kind of opinion!

  • I think we should look at views the same way we look at Ad Value Equivalencies. Both measure how many people may or may not have viewed your video/article/whatever, but really do little to measure any opinions from the viewer. Eyeballs do not equal impact!

    I don’t put much stock in the star rating either, since it really reflects the quality of the video more than the message behind it, but at the very least it can measure how many people cared enough about the video to rate it. It will only measure viewers who log in to YouTube though, which isn’t nearly the whole viewership.

    I agree with Christopher Penn on this one: where are the hard results? Views are great if the video is the end product, but if the video is the promotion, views really don’t mean much.