Monday, June 18, 2018

Not all retailers cherry-pick the best critic scores for their wines

I noted in an earlier post that wine retailers often cherry-pick the critics' quality scores that they quote when advertising their wines (Laube versus Suckling — their scores differ, but what does that mean for us?). They can do this because the quality scores from different critics often do not agree, and there is a pecuniary advantage to the retailer for any given wine to appear to be as good as it can be.


However, not all wine advertising is like this. For example, Wine Rack Media, a company offering services to online wine retailers, makes available the scores (for any given wine) from a range of critics, and these scores are sometimes quoted in full by retailers. For example, both Fisher's Discount Liquor Barn and Incredible Wine Store have web pages "powered by Wine Rack Media", and they both use that company's wine database.

Let's take a specific example of a well-known wine, to see what information we get: Caymus 'Special Selection' Cabernet Sauvignon. Checking online, we are given wine descriptions (including quality scores) for most of the vintages from 1978 to 2005, many of them from multiple sources: Wine Spectator, Wine Enthusiast, Wine and Spirits, Stephen Tanzer, Connoisseurs' Guide to California Wine, and Vintage Tastings.

What is more interesting, though, is that we are often given multiple scores from each of these sources, rather than a single score. Most reputable sources of wine-quality scores will have tasted the wines on several occasions, often with different tasters, so that multiple scores do exist. Most wine retailers report only the highest of these scores for each wine (ie. a cherry-picked score); but not in this example.

The first graph shows all of the quality scores we are given from the Wine Spectator magazine, with the vintages listed horizontally and the scores vertically. Where there are multiple scores for a particular vintage, I have connected them with a line.

Wine Spectator quality scores for Caymus 'Special Selection' Cabernet Sauvignon

As you can see, the repeated scores tend to differ from each other by at least a couple of points, although one pair differs by 16 points (82 versus 98) and some others differ by 10 points and 9 points. Clearly, cherry-picking a score would be very effective here. Indeed, all of the scores below 90 points have a paired score that is much higher — we could, if we were so inclined, easily claim that the Caymus Cabernet is consistently better than a 90-point wine!
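If you want to see how much difference this makes, here is a minimal sketch (in Python) of what cherry-picking the best of several repeated scores does to a wine's apparent quality. The score pairs below are made-up placeholders, not the actual Wine Spectator data discussed above.

```python
# Minimal sketch: cherry-picking the best of several repeated scores.
# The scores below are hypothetical placeholders, not the real data.

scores = {
    1990: [82, 98],   # a 16-point gap between two tastings
    1994: [88, 97],
    1997: [86, 95],
    2001: [91, 93],
}

for vintage, tastings in sorted(scores.items()):
    cherry_picked = max(tastings)          # what a retailer would quote
    spread = max(tastings) - min(tastings)
    print(f"{vintage}: scores {sorted(tastings)}, "
          f"advertised as {cherry_picked}, spread {spread} points")

# Every vintage can be advertised as a 90+ wine, even though several of
# the individual tastings fall well below 90.
```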

For comparison, the second graph shows some of the other quality scores, as well.

Several critics' quality scores for Caymus 'Special Selection' Cabernet Sauvignon

Note that the Tanzer scores are almost all lower than the other scores for the same vintage, and the Connoisseurs' Guide scores are also usually lower. Indeed, the only vintages for which there is good agreement are 2000 and 2004 (where all of the scores are within 1 point).

With this sort of presentation of the quality scores, we can see at a glance where we stand with regard to the variability among the critics' scores. If only all wine retailing were this honest about the multitude of data available.

Monday, June 11, 2018

Actual wine consumption versus the recommended maximum

Many, if not most, countries have an "official" recommended maximum level of wine consumption per week for adults. However, the recommended value differs rather a lot between countries. Moreover, the observed weekly consumption of wine also differs between countries. I have therefore wondered how these two values compare — the actual consumption versus the recommended maximum.


For the recommended wine-intake values for each country, I have used the data from the International Alliance for Responsible Drinking (IARD), which were last updated in January 2018. The values given are in grams of alcohol per week (for each adult), with separate values for males and females, if they differ. We know that one 750 ml bottle of 12.5% wine equals 74 g alcohol, so we can use this to convert the values to bottles of wine per week.

For the actual wine-intake values for each country, I have used the Annual Per Capita Wine Consumption 2014-16 list from the AAWE Facebook page. The values given are in liters of wine per person per year. To convert these to "per adult", I have used the World Bank data for the percentage of each population aged 15-64 years (this is apparently a standard age group for "adults"). I then converted these new values to bottles of wine per week.

This means that I now have two figures for bottles of wine per week per adult, one estimating actual consumption and one describing the recommended maximum, for a range of countries. Clearly, the value for actual consumption does not take into account what proportion of the population actually drinks wine.
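For those who like to check the arithmetic, here is a minimal sketch (in Python) of the conversions involved, using made-up example numbers rather than the actual IARD, AAWE and World Bank figures.

```python
# Sketch of the unit conversions described above, using made-up example
# numbers rather than the actual IARD / AAWE / World Bank figures.

GRAMS_ALCOHOL_PER_BOTTLE = 74.0   # 750 ml at 12.5% alcohol
LITERS_PER_BOTTLE = 0.75
WEEKS_PER_YEAR = 52.0

def recommended_bottles_per_week(grams_per_week):
    """Convert an official limit in grams of alcohol/week to bottles/week."""
    return grams_per_week / GRAMS_ALCOHOL_PER_BOTTLE

def actual_bottles_per_week(liters_per_capita_per_year, adult_fraction):
    """Convert per-capita consumption (liters/year) to bottles/week per adult."""
    liters_per_adult = liters_per_capita_per_year / adult_fraction
    return liters_per_adult / LITERS_PER_BOTTLE / WEEKS_PER_YEAR

# Hypothetical country: 40 L of wine per person per year, 65% of the
# population aged 15-64, and a recommended maximum of 140 g alcohol/week.
actual = actual_bottles_per_week(40.0, 0.65)
limit = recommended_bottles_per_week(140.0)
print(f"actual: {actual:.2f} bottles/week, limit: {limit:.2f} bottles/week, "
      f"ratio: {100 * actual / limit:.0f}%")
```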

The only complication is that the observed consumption values combine both males and females, whereas the recommendations for the sexes sometimes differ — where they do differ, the value for females is typically one-half or two-thirds of the value for males. Interestingly, there has been a recent trend for countries to lower the recommended limit for wine consumption by men to the same value used for women. Apparently, we are moving away from the James Bond / Humphrey Bogart hard-drinking lifestyle, at least as a medical recommendation for men (see my blog post James Bond, alcoholic).

Anyway, the data in the table show the actual consumption (bottles of wine per week per adult) as a percentage of the recommended limit for men. Most countries have a recommended limit for men of 1.9-2.3 bottles per week.

Country          Consumption as % of limit
Croatia          90%
France           90%
Netherlands      88%
Switzerland      83%
Portugal         75%
UK               74%
Belgium-Lux.     63%
Italy            60%
Denmark          53%
Austria          49%
Australia        47%
Sweden           45%
New Zealand      45%
Uruguay          44%
Germany          39%
Georgia          39%
Chile            36%
Greece           36%
Bulgaria         31%
Ireland          31%
Argentina        30%
Romania          27%
Finland          26%
Canada           20%
USA              16%
Spain            15%

Note that most of the countries have wine intake that is less than 50% of the recommended maximum wine consumption. In theory, this should make the medical people happy. However, there are two ways to be at the top of this list: (i) have a high consumption, and (ii) have a low recommended maximum consumption.

The countries with limits lower than 1.9 bottles per week include Bulgaria and the United Kingdom (1.5 bottles), Chile (1.3), and the Netherlands (1.0). This explains why the UK and the Netherlands are near the top of the list, even though their consumption is not particularly high.

The other top countries are there because of their high wine consumption. Indeed, Croatia, Portugal, France and Switzerland each consume 25% more wine per adult than does their nearest rival (Italy).

The countries with the highest recommended limits include Argentina, Canada and the USA (2.7 bottles per week), Greece (2.8), Romania (3.0), and Spain (3.8). This explains why these countries occupy the bottom places on the list — they set high limits, and so their people's consumption gets nowhere near that limit.

Note that Chile and Bulgaria have low recommended limits but even lower wine consumption.

Finally, it is worth noting that those countries with wine-consumption values exceeding 50% are likely to have average consumptions that exceed the recommended value for females, since these are often half that of the values for males. This is true for Croatia, Switzerland and Portugal. Also, since the value for actual consumption does not take into account what proportion of the population actually drinks wine, there will be many people of both sexes who exceed the recommended limit, possibly by a great margin.

Monday, June 4, 2018

What is a "most admired" wine brand?

The short answer is: It depends on what year it is!

Each year, the April edition of Drinks International magazine contains a supplement with a survey called The World’s Most Admired Wine Brands. A group of people are asked to vote for the wine brands they "most admire" based on the criteria that each brand should:
  • be of consistent and / or improving quality
  • reflect its region or country
  • be well marketed and packaged
  • respond to the needs and tastes of the target audience
  • have broad appeal among wine consumers.
So, while "admiration" is admitted to be an intangible thing, there are clearly a few pointers here.


The people polled are drawn from "a broad spectrum of the global wine community", which apparently includes: masters of wine, sommeliers, commercial wine buyers, wine importers and retailers, wine journalists, wine consultants and analysts, wine educators, and other wine professionals. There were only 60 people involved back in 2012, but there are now more than 200.

The people could originally vote for up to six wine brands, but apparently they are now asked for only three choices. Furthermore, they are provided with a list of previous winners, including "a list of more than 80 well-known brands and producers, but as usual we also encourage the option of free choices". David Williams has presented his own take on what it means to take part in this poll.

The polls

I have compiled the poll results for the years 2011-2018 inclusive. Each of the published lists contains only the results for the top 50 ranked wine brands in that year — all we know about the other brands is that they were ranked lower than 50th place in that year.

Across the 8 years, 98 brands have appeared at least once in the lists. However, only 15 of these brands appeared in all 8 lists, with a further 10 brands appearing in 7 of the 8 lists. There were 20 brands that appeared only once each. There is thus a great deal of variability in "admiration" from year to year.
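The tabulation itself is simple enough — a sketch like the following (in Python) would do it, given one set of top-50 brand names per year. The brand lists shown are just placeholders for the published names.

```python
from collections import Counter

# Placeholder input: one top-50 list of brand names per year, 2011-2018.
# In practice these would be the 50 names published in each year's supplement.
lists_by_year = {
    2011: ["Torres", "Concha y Toro", "Penfolds"],    # ... 47 more names
    2012: ["Torres", "Penfolds", "E. Guigal"],        # ... 47 more names
    # ... and so on for 2013-2018
}

appearances = Counter()
for year, brands in lists_by_year.items():
    appearances.update(set(brands))       # count each brand once per year

n_years = len(lists_by_year)
ever_listed = len(appearances)
in_every_list = sum(1 for count in appearances.values() if count == n_years)
only_once = sum(1 for count in appearances.values() if count == 1)

print(f"{ever_listed} brands appeared at least once; "
      f"{in_every_list} appeared in all {n_years} lists; "
      f"{only_once} appeared only once")
```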

The first graph shows the yearly data for 9 of the wine brands that appeared in all 8 lists, with the vertical axis showing their ranking from 1st on down to 50th. As you can see, even for these brands the results varied dramatically from year to year. Indeed, only the top three brands have shown any consistency at all — these are Torres (from Spain), Concha y Toro (from Chile), and Penfolds (from Australia). In fact, Torres was ranked either 1st or 2nd every year, which must make it the most admired brand of all.

The "most admired" wine brands from 2011-2018

To this list of consistent brands we can also add E. Guigal (from France) and Ridge (from the USA), both of which missed the 2011 list but had relatively consistent ranks thereafter (mostly in the top 10).

The second graph focuses on the wine brands from Bordeaux, including those that did not make it onto all 8 lists — this is far and away the best-represented wine region in the lists, with 10% of the brands.

The "most admired" Bordeaux wine brands from 2011-2018

These are all top wine chateaus, of course, with the most expensive wines. The most successful of them seems to be Château Margaux, but even it varies in rank from 7 to 29 across the years. Château Latour and Château d'Yquem are the only ones to get into the top 5 in at least one year, but these two chateaus then missed the list entirely in other years. Clearly, Bordeaux does not engender unmitigated admiration in the wine world.

As far as countries are concerned, France hosts 20% of the admired wine brands:
Region        Brands
Bordeaux      10
Rhône          3
Burgundy       2
Languedoc      2
Beaujolais     1
Provence       1
General        1
You will note the very poor showing from Burgundy — there are not many large wine brands (only Louis Latour makes most of the lists), but instead a host of smaller brands marketing very expensive wine (Domaine de la Romanée-Conti makes it onto two lists).

The remaining countries include:
Country          Brands
Australia        13
Spain            13
USA              11
New Zealand       8
Chile             7
Italy             6
South Africa      6
Portugal          5   [Porto 4]
Germany           3
Argentina         2
Canada            1
China             1
Hungary           1
Lebanon           1
Even though the Porto region hosts 4 of Portugal's admired wine brands (Dow's, Graham's, Sandeman, Taylor's), these have only ever made it onto two of the lists (2016 and 2017).

The brands

I have provided a summary of the data relating to the individual brands in the following network. It displays all of those brands that appeared in at least 50% of the lists (ie. 4 out of 8). This is a form of multivariate data summary, as described in my post Summarizing multi-dimensional wine data as graphs, Part 2: networks. [Technical details: this is a neighbor-net based on the Gower distance.]

Each brand is represented by a dot in the network. Brands that are closely connected in the network are similar to each other based on their ranks across the 8 polls, and those that are further apart are progressively more different from each other. So, for example, the three top-ranked brands (Torres, Concha y Toro, Penfolds) are together at the top of the diagram, followed by the next pair (E. Guigal and Ridge). From there, the network progresses down to the less-admired wine brands at the bottom.
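For anyone wanting to reproduce the distance part of this analysis, here is a rough sketch (in Python) of a Gower distance matrix computed from rank data, treating years in which a brand fell outside the published top 50 as missing values; the neighbor-net itself would then be constructed from such a matrix (eg. in a program such as SplitsTree). The rank values shown are placeholders, not the actual poll results.

```python
import numpy as np

# Rough sketch of a Gower distance matrix for brands ranked across 8 polls.
# np.nan marks years in which a brand fell outside the published top 50.
# The rank values here are placeholders, not the actual poll results.
ranks = {
    "Torres":        [1, 2, 1, 1, 2, 1, 1, 2],
    "Concha y Toro": [2, 1, 2, 3, 1, 2, 3, 1],
    "Penfolds":      [4, 3, 3, 2, 3, 4, 2, 3],
    "Oyster Bay":    [np.nan, 20, 25, 18, np.nan, 30, 22, 28],
}

brands = list(ranks)
data = np.array([ranks[b] for b in brands], dtype=float)

# Per-poll ranges, ignoring missing values, for Gower's normalization.
ranges = np.nanmax(data, axis=0) - np.nanmin(data, axis=0)
ranges[ranges == 0] = 1.0     # avoid division by zero

n = len(brands)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        both = ~np.isnan(data[i]) & ~np.isnan(data[j])   # usable polls only
        d = np.mean(np.abs(data[i, both] - data[j, both]) / ranges[both])
        dist[i, j] = dist[j, i] = d

print(np.round(dist, 2))
```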


Note that Guigal and Ridge are at the head of a bunch of 13 wine brands, all of which are relatively highly admired. Then there is a group of 7 intermediate brands (the network area from Oyster Bay down to Pétrus), plus Marqués de Riscal on its own — the latter is isolated in the network only because it inexplicably missed the 2017 list.

You will also note the proximity in the network of Yellowtail to Château Mouton-Rothschild and Cheval Blanc! This should make you wonder about the criteria for "brand admiration".

The basic issue with these lists

There are potentially at least three things wrong with the "best of" type of list: (i) there is rarely any clear idea of what "best" is supposed to mean; (ii) the list is of arbitrary length (eg. Top-10 or Top-50 only); and (iii) the ranking does not reflect the differences in the original scores. In this instance, we have some idea of what "most admired" is supposed to mean; but the other two issues definitely apply here.

More importantly, there is an issue with interpretation that is rarely mentioned. To say, as many of the media have, that "In the 2018 poll the industry voted Torres the most-admired wine brand" is wrong, because there is no evidence that the people polled did any such thing. Indeed, we do not know how many people actually did put Torres (or any other brand) on their own personal list of three brands. All we know is that Torres is listed as the no. 1 brand because more people put it on their list than did so for any other brand — it may have been a lot of people or it may not.


To provide a concrete example of what I mean, I will refer to my previous discussion of a similar situation for the "Greatest Films Poll" produced every decade by Sight & Sound magazine, which lists the top films as voted by selected film critics. Each critic is asked to list 10 films; and in the most recent poll Alfred Hitchcock's film Vertigo came out on top. However, the vast majority of the critics (77%) said that this film doesn't even belong in the top 10 (ie. it was not on their personal list), let alone first. It is nevertheless listed as the no. 1 film, because more critics (23%) put it on their list than did so for any other film. Similarly, 91% of the critics said The Searchers should not be in the top 10, and yet it is ranked no. 7. So, the rank order of the films is simply that — a rank order; it does not tell you how many critics think highly of each film.
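The distinction between "ranked first" and "named by most voters" is easy to demonstrate with a small sketch (in Python); the vote counts here are hypothetical, not taken from either poll.

```python
# Sketch of why a no. 1 ranking does not imply majority support.
# Vote counts are hypothetical: 200 voters, each naming 3 brands.
n_voters = 200
votes = {"Brand A": 46, "Brand B": 41, "Brand C": 35, "Brand D": 30}

for rank, (brand, count) in enumerate(
        sorted(votes.items(), key=lambda kv: -kv[1]), start=1):
    share = 100 * count / n_voters
    print(f"{rank}. {brand}: named by {count} of {n_voters} voters ({share:.0f}%)")

# Brand A tops the list, yet 77% of the voters did not name it at all.
```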

This is the same point that I am making for the wine brands, although in this case I do not have the detailed information to say exactly how many people listed each brand.

In a similar vein, anyone who knows anything about banking will know that "the most trusted bank" is nothing more than "the least mistrusted bank". The distinction is not trivial for the consumer.

This knowledge does not stop the misuse of these sorts of lists, of course. For example, Voxy recently noted: "Villa Maria [was] named the world’s most admired New Zealand wine brand by Drinks International". This is, strictly speaking, true, but Villa Maria did go down from 4th rank overall last year to 8th this year!

In a future post I might look at some of the other industry awards, such as The World Ranking of Wines and Spirits and The Drinks Business Awards.