Stats Corner: Juries who think alike

[Image: Montenegro's vote, Eurovision 2016]

With the jurors for this year’s Eurovision announced during the week, I’ve been thinking a bit about what they get up to. For the last couple of years, we’ve had access after the contest to the full jury vote, so we can see exactly who voted for whom within each five-member panel. I don’t necessarily think this leads to greater fairness in the vote, but it gives me some numbers to play with.

Many fans have been struck by how uniform some countries’ votes have been. There are certain sets of jurors who seem to have remarkably similar tastes. That may just be coincidence, or down to broadcasters selecting jurors with similar backgrounds, or it may be something more pernicious. I thought I’d look at how the points are given out in a more scientific way than simply eyeballing certain countries and saying “that looks funny”.

Each juror ranks all the songs in the final (other than their own country’s) from first to last. For both the 2014 and 2015 contests, I’ve taken each set of five placings a jury has given to a song in the final and computed the standard deviation of those placings. This gives a measure of how widely the marks are spread: the higher the standard deviation, the more varied the jury’s opinion of that song.

I then average these per-song standard deviations for each national jury, giving one overall figure per jury per year, and average again to get a single figure over the two-year period my sample covers. This is a very small sample, so the results are vulnerable to one unusual set of jurors skewing the picture; it’s entirely possible my figures simply reflect chance variation. However, this is all the data we’ve got, and the results are quite striking.

[Graph: Eurovision jury variance – average standard deviation of placings by national jury]

Everyone sticks pretty solidly to an average deviation of about 3 or 4 places. At the bottom, though, Azerbaijan and Montenegro are by some distance the juries with the least variety of opinion.

It should probably be pointed out at this juncture that Montenegro’s result comes from an even more limited sample than most. They’re one of the countries whose jury vote was thrown out in 2015. One wonders how much more uniform it got.

So what happened to the other countries who have had a jury vote disqualified? This is probably the most interesting part to me. FYR Macedonia (jury vote disallowed in 2015) and Georgia (2014) are at the other extreme of my graph: the years for which I do have non-disqualified data from these countries show the two most diverse sets of jury opinions of the lot. When this transparency regime was set up, I pointed out that countries could try to arrange their votes to show enough variety to avoid suspicion. Is it possible that countries are trying this approach, but overdoing it? Perhaps a standard deviation of over 5 places is so abnormally spread that it should itself arouse suspicion.

I’ll update this graph after this year’s contest, and it will be interesting to see whether these outlying nations shuffle back into the pack. Such a move might just be an innocent coincidence evening out. If a country is cheating, though, it could instead be that the fixers are using the EBU’s transparency policy to figure out what “normal” looks like. If I were in charge of rigging a jury, that’s what I’d be doing.
