Monday, March 2, 2020

Does the relationship between critics' scores differ by wine type?

A few weeks ago I asked: Are wine scores from different reviewers correlated with each other? I looked at wine-quality scores from two Swedish review sites, Vinbanken and BKWine, and concluded that the answer is “about half of the time”.

An obvious follow-up question is whether this answer holds for all wine types, or whether there are notable differences between types. This new blog post answers that question: in this case, there are definitely detectable differences.

As in my previous post, the wines come from the Swedish alcohol retail monopoly Systembolaget. These wines are tasted by various media commentators shortly before their release. In my sample, there were 1,034 wines scored by both review sources during 2019. The two critics do not actually taste together in the same room. Both score sources use a 20-point scale.


The first source is Jack Jakobsson at BKWine Magazine. I deleted the data for beer, cider, sake, fortified wines and spirits, leaving the reds, whites, rosés and sparkling wines. The points are provided in 0.5-point increments. The second source is Johan Edström at Vinbanken. The scores are reported separately for reds and whites, with rosés included among the whites. The points are usually provided in 0.5-point increments, although occasionally finer divisions appear.
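For anyone who wants to replicate that data-preparation step, here is a minimal sketch. The file name and the 'category' column (and its labels) are hypothetical, not taken from either source, so adjust them to match your own data layout:

    import pandas as pd

    # Hypothetical raw listing: one row per product, with a 'category' label.
    raw = pd.read_csv("bkwine_2019_raw.csv")

    # Keep only the wines, dropping the other product categories.
    drop = ["beer", "cider", "sake", "fortified wine", "spirits"]
    wines = raw[~raw["category"].isin(drop)]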

I subdivided the 1,034 wines into four groups: reds (612 wines), whites (319), rosés (37) and sparkling wines (66). For each group, I calculated the average difference in points between the paired scores, as well as the correlation. The average quality scores are shown in the histogram. As you can see, both reviewers award the sparkling wines higher points than the other wine types, followed by the reds, then the whites, and finally the rosé wines.

Vinbanken and BKWine quality scores for different wine types.

Also, as I noted last time, the BKWine scores average 0.57 points lower than the Vinbanken scores. However, this average difference is smaller for the sparkling wines (0.41 points) and red wines (0.47 points) than for the white wines (0.75 points) and rosé wines (1.51 points). Clearly, the whites and rosés are downgraded at BKWine, compared to Vinbanken.
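These per-type numbers are straightforward to compute. Here is a minimal sketch, assuming a table with one row per wine and hypothetical column names ('type', 'vinbanken', 'bkwine'):

    import pandas as pd

    # Hypothetical input: one row per wine, with its type and the two scores.
    wines = pd.read_csv("wines_2019.csv")

    for wine_type, group in wines.groupby("type"):
        mean_diff = (group["vinbanken"] - group["bkwine"]).mean()
        r = group["vinbanken"].corr(group["bkwine"])  # Pearson correlation
        print(f"{wine_type}: n={len(group)}, "
              f"Vinbanken - BKWine = {mean_diff:.2f} points, r^2 = {r ** 2:.1%}")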

The consistent differences among the wine types cannot be disregarded, especially for the rosé wines, which score significantly lower with both reviewers. We cannot expect reviewers to use exactly the same wine-quality scales, but we might at least hope that they are consistent across the various wines that they evaluate.

The possible cause of this variation among wine types may not be too hard to detect, as I discussed in the introduction to a previous post (What's all this fuss about red versus white wine quality scores?). Bob Henry, commenting on that previous post, noted this about many non-red wines:
Citing Robert Parker's 1989 interview with Wine Times magazine (later rebranded as Wine Enthusiast): such wines do not improve with age in the bottle, and hence they garner none of the “bonus” points that place a wine somewhere between 91 and 100 points. Consequently, they bump up against a “glass ceiling” of 90 points.
In this case, 90 out of 100 points would be roughly equivalent to 16 or 17 out of 20 points; and this seems to be the fate of the rosé wines in particular. As you can see in the next graph (where each dot represents one or more wines), neither reviewer gave the rosé wines anything like 16 points, even at their best. Furthermore, the Vinbanken score almost always exceeded the BKWine score for the same wine (the dashed line represents equality).
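As a rough guide to that conversion, here is one plausible linear mapping (not an official one), assuming that the 100-point scale effectively starts at 50, as is conventional for Parker-style scoring, and that the 20-point scale is used over its full range:

    def pts100_to_pts20(score100):
        # One plausible linear mapping, assuming the 100-point scale
        # effectively runs from 50 to 100 and the 20-point scale from 0 to 20.
        return (score100 - 50.0) / 50.0 * 20.0

    print(pts100_to_pts20(90))  # 16.0 -- the "glass ceiling" in 20-point terms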

Vinbanken and BKWine quality scores for rosé wines.
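For reference, a scatterplot like the one above can be sketched as follows, reusing the hypothetical 'wines' table and column names from the earlier snippets:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical columns: type, vinbanken, bkwine.
    wines = pd.read_csv("wines_2019.csv")
    roses = wines[wines["type"] == "rosé"]

    plt.scatter(roses["bkwine"], roses["vinbanken"])
    plt.plot([10, 20], [10, 20], linestyle="--")  # dashed line of equality
    plt.xlabel("BKWine score (out of 20)")
    plt.ylabel("Vinbanken score (out of 20)")
    plt.title("Vinbanken vs BKWine scores for rosé wines")
    plt.show()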

This does not mean that rosé wines cannot be aged, of course. One classic example is the Fondo Antico “Memorie Rosato”, from Sicily, which is fermented and matured in oak barrels, and is not even released until it is five years old. Not unexpectedly, it is unlike most other rosés (even the drink-now ones from the same winery); and I can recommend it, if you ever get the chance to try it.

The correlations between the two critics' quality scores also differ for the different wine types (the numbers represent the percentage of variance shared between the two sets of scores, i.e. r²; see the short sketch after the table):
    Total        54.5%
    Sparkling    75.4%
    Red          49.3%
    White        55.5%
    Rosé         13.5%
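Since the table reports shared variance (r²), the underlying Pearson correlation coefficients are simply its square roots:

    import math

    # Shared variance (r^2) values from the table above.
    shared = {"Total": 0.545, "Sparkling": 0.754, "Red": 0.493,
              "White": 0.555, "Rosé": 0.135}

    for wine_type, r2 in shared.items():
        print(f"{wine_type}: r^2 = {r2:.1%}, r = {math.sqrt(r2):.2f}")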

Clearly, our reviewers agreed much more on the quality of the sparkling wines than they did for the reds and whites; and they had very little agreement at all about the rosé wines (as is obvious in the scatterplot above). So, the rosé wines get short shrift in all ways.

As a final point, these are the three rosé wines that the two reviewers did agree were the best of the selection (represented by the points at the top right of the scatterplot):
    Domaine de la Mordorée “La Reine des Bois” 2018 (Tavel), US$25
    Marimar Estate “Rosaleda Rosé of Pinot Noir” 2018 (Sonoma), US$25
    Dominio del Águila “Pícaro” 2016 (Ribera del Duero), US$30
Even these wines received relatively poor scores, given that their prices match those of many of the red and white wines.

1 comment:

  1. I invite David's readers to visit this spirited debate, from 2015, on rosés and ratings/scores, on the blog of Steve Heimoff, the retired California wines editor of Wine Enthusiast magazine:

    "An appreciation of rosé and a call for changing the rules of wine criticism"
    Steve Heimoff wine blog - posted April 28, 2015

    URL: http://www.steveheimoff.com/index.php/2015/04/28/an-appreciation-of-rose-and-a-call-for-changing-the-rules-of-wine-criticism/
