It was back in 2012 that Andrew Jefford presented a keynote address to a bloggers conference entitled "The Death of the Wine Writer". There has been a lot of discussion of this topic since then, but very little actual data demonstrating any such thing, at least as far as wine blogs are concerned.
I once presented a post about The rise and fall of wine blogs, and other things, using data on the frequency of Google searches. However, this was not targeted specifically at individual blogs, because there were not enough data for most of them. This time, I intend to be more specific. I have been told that "it’s a wine blog wasteland on the internet, with countless blogs that haven’t been touched for years" (Wine Turtle), but to what extent is this really true?
Back in 2013, it was reported that "there are about 1,450 wine blogs today, of which about 1,000 are nonprofessional endeavors", and that "only 18% of bloggers have been blogging for more than six years." This is a lot of blogs to check, so I need to subsample them.
There are only three places I could write about with any confidence. Of these, the USA has far too many wine blogs to study (probably up to 1,000, with c. 350 people attending the annual Wine Bloggers Conference), while Sweden has far too few (less than a dozen active). That leaves Australia (with many more than a dozen extant), whose wine blogs I will discuss here. There seems to be no reason to assume that the situation for wine blogging in Australia is different to anywhere else, other than in the actual number of blogs.
There have been a few commentaries discussing the fate of Australian wine blogs, as there have been elsewhere. For example, in January 2015 Anthony Madigan posted in The Week That Was (a newsletter from Australia's Wine Business Magazine) the question: "Whatever happened to all the wine bloggers out there?" Andrew Graham, from the Australian Wine Review blog, responded that it wasn't that bad (Where have all the Australian wine bloggers gone?); and he updated his comments in his 10th anniversary review earlier this year.
Some data
To check the current situation, I have tried to compile a list of those wine blogs based in Australia that have been active at some time during the past 10 years, excluding blogs consisting mostly of industry news and announcements, or wine sales. My list is likely to be comprehensive but not exhaustive — I have checked every blog whose existence I heard of during my search, but I cannot exclude the possibility of missed blogs. In each case, I recorded the number of posts for each month of the blog's active period, up to and including May this year. [See the footnote.]
My final list has 63 blogs on it, with the total number of posts varying from 18 (from a 1-person blog) to more than 39,000 (from a 3-person blog). Of these blogs, 23 posted in May 2018, which is more than one-third of them.
The longevity of the blogs is shown in the first graph, with the horizontal axis counting the number of months from the first post to the last (inclusive), and the vertical axis counting the number of blogs. Three of the blogs are listed as ">149": The Real Review (167 months), The Wine Front (201 months), and Chris Shanahan (314 months). I have also plotted separately that subset of the blogs that have not yet posted in 2018 ("Finished blogs").
It is interesting to note the dip in the number of blogs in the "50-59 months" category, indicating that the blogs have tended to last either <4 years or >5 years (technically: a bimodal distribution). Of the no-longer-posting blogs, most made it to at least 3-4 years' worth of posts, although 30% died out at about that point. Those that made it past that time tended to die out after 5-6 years (another 30%). However, 40% of the blogs have lasted longer than this, which is more than double the estimate (18%) reported at the top of this post.
This seems to be the answer to the question posed in the post's title.
The second graph shows which years the blogs started (remember: these are only the wine-related blogs where something was posted during 2008 or later). Half of the blogs started during 2009-2011, with only 3 starting after 2013. Obviously, there was a "boom time" for wine blogging in Australia, which is now long gone.
The same graph shows the year in which the last blog post occurred, with 25 of the 63 blogs (40%) having posted in 2018. This is a lot more extant Australian wine blogs than I think people realize. They are listed at the bottom of this post.
However, it is rather difficult to know whether a wine blog is merely moribund or is actually dead. Only one of the bloggers explicitly noted that his blog was ending (and, sadly, he died shortly afterwards). Most of the other bloggers just stopped posting regularly, although in several cases the blog was actually taken offline (and I accessed it through the Internet Archive's Wayback Machine).
It has been noted that often "bloggers allow weeks, months, even years to go by without posting a thought". As an example, the third graph shows two of the Australian blogs where posts kept appearing sporadically for several years after the regular blog posts stopped, indicating that they may not yet be dead, even now. So, barring the known death of the blogger, or the blog itself being taken offline, we can't definitively say that any of the blogs are actually "finished".
The final graph shows the activity of the blogs, with the horizontal axis showing the average number of posts per month during their lifetime (excluding the most prolific blog, with an average of 195 posts per month!). The most common activity levels are once per fortnight, once per week, twice per week and three times per week, with half of the bloggers sticking to 4 posts or fewer per month. This rate of posting indicates that the posts are mainly about wine-related topics, rather than being short reports of wines tasted.
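For anyone wanting to repeat this sort of tally, the arithmetic behind the graphs is simple. Here is a minimal Python sketch, using a couple of invented blogs rather than my actual data set, that derives a blog's longevity (months from first to last post, inclusive) and its average number of posts per month from a per-month post count.

```python
# Hypothetical data: per-blog lists of (year, month, number of posts that month).
# The real data set covered 63 Australian wine blogs, up to May 2018.
blogs = {
    "Example Blog A": [(2009, 3, 5), (2009, 4, 2), (2012, 11, 1)],
    "Example Blog B": [(2015, 1, 8), (2016, 6, 4), (2018, 5, 3)],
}

def months_inclusive(first, last):
    """Calendar months from the first posting month to the last, inclusive."""
    (y1, m1), (y2, m2) = first, last
    return (y2 - y1) * 12 + (m2 - m1) + 1

for name, counts in blogs.items():
    active = sorted((y, m) for y, m, n in counts if n > 0)
    lifetime = months_inclusive(active[0], active[-1])
    rate = sum(n for _, _, n in counts) / lifetime   # average posts per month
    extant = active[-1] >= (2018, 1)                 # posted sometime during 2018
    print(f"{name}: {lifetime} months, {rate:.1f} posts/month, extant in 2018: {extant}")
```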
However, the seven most prolific blogs in the graph (having averaged >20 posts per month) do consist mostly of wine reviews. In order, they are: Wine Will Eat Itself, Qwine, Tyson Stelzer, Australian Wine Journal, Australian Wine Review, Wino Sapien, and Grape Observer; and to this list we can add The Wine Front, with its massive average of 6.5 posts per day (from three people). I might write about these two distinct types of wine blogs in a future post.
Conclusion
It seems that 25 of the 63 wine-related blogs that have been active in Australia during the past decade have posted sometime during 2018, which is not an insignificant number (see the list at the bottom of the post). Nevertheless, Anthony Madigan's January 2015 comment was spot on. During the previous 5 years, 19 blogs had stopped posting (32% of those in existence at the time), and no new ones had been started during the previous year. He was quite right to ask his question at that particular moment. [Mind you, Madigan's own blog (Country Wine) has the smallest average number of posts per month (0.4), and he has not posted since May last year.]
So, irrespective of any perceived image problem, it seems that announcements of the death of wine blogging are greatly exaggerated, at least in Australia.
Compiling the data for this post was relatively straightforward for the blogs hosted by Blogger, because the default setup is to have a post archive actually listing the number of posts for each month. However, WordPress blogs have no such default. In fact, only one WordPress blog provided the information in any easy-to-access manner. In every other case, I had to count the posts manually. So, you WordPress bloggers — I hate you all!
Only three blogs defeated my attempts to count their posts manually, all produced by wine critics: The Wine Front (39,000 posts since 16 September 2001), The Real Review (2,700 posts since 1 July 2004), and Tyson Stelzer (2,600 posts since 1 April 2010).
Wine-related blogs from Australia that have posted at least once so far during 2018:
Australian Wine Review
Australian Wine Reviews - and Beyond
Best Wines Under $20
Chris Shanahan
Drinkster
Grape Observer
Happy Wine Woman
The Inquisitive Palate
The Intrepid Wino
More Red Sir!
People of Wine
Que Syrah
Qwine
The Real Review
The Tasting Glass
Travelling Corkscrew
Tyson Stelzer
Vino Notebook
The Vinsomniac
The Wine Front
The Wine Wankers
Winemusing (formerly The Wine Muse)
Wino Sapien
Winsor Dobbin Wine of the Week
Winsor's Choice
Note that this list excludes blogs that consist mostly of industry news and announcements, or wine sales, as well as personal web pages without a distinct blog component.
Monday, June 18, 2018
Not all retailers cherry-pick the best critic scores for their wines
I noted in an earlier post that wine retailers often cherry-pick the critics' quality scores that they quote when advertising their wines (Laube versus Suckling — their scores differ, but what does that mean for us?). They can do this because the quality scores from different critics often do not agree, and there is a pecuniary advantage to the retailer for any given wine to appear to be as good as it can be.
However, not all wine advertising is like this. For example, Wine Rack Media, a company offering services to online wine retailers, makes available the scores (for any given wine) from a range of critics, and these scores are sometimes quoted in full by retailers. For example, both Fisher's Discount Liquor Barn and Incredible Wine Store have web pages "powered by Wine Rack Media", and they both use that company's wine database.
Let's take a specific example of a well-known wine, to see what information we get: Caymus 'Special Selection' Cabernet Sauvignon. Checking online, we are given wine descriptions (including quality scores) for most of the vintages from 1978 to 2005, many of them from multiple sources: Wine Spectator, Wine Enthusiast, Wine and Spirits, Stephen Tanzer, Connoisseurs' Guide to California Wine, and Vintage Tastings.
What is more interesting, though, is that we are often given multiple scores from each of these sources, rather than a single score. Most reputable sources of wine-quality scores are likely to have tasted the wines on different occasions, often by different tasters, so that we do have multiple scores available. Most wine retailers report only the highest of these scores, for each wine (ie. a cherry-picked score); but not in this example.
The first graph shows all of the quality scores we are given from the Wine Spectator magazine, with the vintages listed horizontally and the scores vertically. Where there are multiple scores for a particular vintage, I have connected them with a line.
As you can see, the repeated scores tend to differ from each other by at least a couple of points, although one pair differs by 16 points (82 versus 98) and some others differ by 10 points and 9 points. Clearly, cherry-picking a score would be very effective here. Indeed, all of the scores below 90 points have a paired score that is much higher — we could, if we were so inclined, easily claim that the Caymus Cabernet is consistently better than a 90-point wine!
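The effect is easy to quantify once all of the repeat scores are laid out together. As a rough illustration, here is a small Python sketch (using invented scores, not the actual Wine Spectator numbers) comparing the full spread of scores per vintage with the single cherry-picked score that a retailer would normally quote.

```python
# Invented repeat scores per vintage (not the real Wine Spectator data).
scores = {
    1994: [91, 93],
    1995: [82, 98],        # the sort of 16-point gap mentioned above
    1996: [88, 90, 95],
    1997: [96],
}

for vintage, points in sorted(scores.items()):
    quoted = max(points)                  # what a cherry-picking retailer would show
    spread = max(points) - min(points)    # disagreement between tasting occasions
    print(f"{vintage}: all scores {points}, quoted {quoted}, spread {spread}")

# If only the best score is used, every vintage here looks like a 90+ point wine.
share = sum(max(p) >= 90 for p in scores.values()) / len(scores)
print(f"Vintages advertisable as 90+ points: {share:.0%}")
```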
For comparison, the second graph shows some of the other quality scores, as well.
Note that the Tanzer scores are almost all lower than the other scores for the same vintage, while the Connoisseurs' Guide scores are also usually lower. Indeed, the only vintages for which there is good agreement are 2000 and 2004 (where all of the scores are within 1 point).
With this sort of presentation of the quality scores, we can thus see at a glance where we stand with regard to the variability among critics' scores. If only all wine retailing were this honest about the multitude of data available.
Monday, June 11, 2018
Actual wine consumption versus the recommended maximum
Many, if not most, countries have an "official" recommended maximum level of wine consumption per week for adults. However, the recommended value differs rather a lot between the countries. Moreover, the observed weekly consumption of wine also differs between countries. I have therefore wondered how these two values compare — the actual consumption versus the recommended maximum one.
For the recommended wine-intake values for each country, I have used the data from the International Alliance for Responsible Drinking (IARD), which were last updated in January 2018. The values given are in grams of alcohol per week (for each adult), with separate values for males and females, if they differ. We know that one 750 ml bottle of 12.5% wine equals 74 g alcohol, so we can use this to convert the values to bottles of wine per week.
For the actual wine-intake values for each country, I have used the Annual Per Capita Wine Consumption 2014-16 list from the AAWE facebook page. The values given are liters of wine per person per year. To convert these to "per adult", I have used the World Bank data for the percent of each population aged 15-64 years (this is apparently a standard age group for "adults"). I then converted these new values to bottles of wine per week.
This means that I now have two figures for bottles of wine per week per adult, one estimating actual consumption and one describing the recommended maximum, for a range of countries. Clearly, the value for actual consumption does not take into account what proportion of the population actually drinks wine.
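For anyone wanting to check the arithmetic, the two conversions are straightforward. Here is a minimal Python sketch, using a pair of made-up national figures rather than the full IARD and AAWE tables.

```python
GRAMS_PER_BOTTLE = 74.0     # 750 ml at 12.5% alcohol is roughly 74 g of alcohol
LITRES_PER_BOTTLE = 0.75
WEEKS_PER_YEAR = 52.18

def limit_bottles_per_week(grams_alcohol_per_week):
    """Recommended maximum: g of alcohol per week -> bottles of wine per week."""
    return grams_alcohol_per_week / GRAMS_PER_BOTTLE

def actual_bottles_per_week(litres_per_capita_per_year, pct_aged_15_64):
    """Observed intake: litres per person per year -> bottles per adult per week,
    treating the population aged 15-64 as the 'adults'."""
    litres_per_adult = litres_per_capita_per_year / (pct_aged_15_64 / 100.0)
    return litres_per_adult / LITRES_PER_BOTTLE / WEEKS_PER_YEAR

# Made-up example country: a 150 g/week limit for men, 30 litres/person/year consumed,
# and 65% of the population aged 15-64.
limit = limit_bottles_per_week(150)
actual = actual_bottles_per_week(30, 65)
print(f"Limit {limit:.1f} bottles/week; actual {actual:.1f} bottles/week; "
      f"that is {actual / limit:.0%} of the recommended maximum")
```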
The only complication is that the observed consumption values combine both males and females, whereas the recommendations for the sexes sometimes differ — where they do differ, the value for females is typically one-half or two-thirds of the value for males. Interestingly, there has been a recent trend for countries to lower the recommended limit for wine consumption by men to the same value used for women. Apparently, we are moving away from the James Bond / Humphrey Bogart hard-drinking lifestyle, at least as a medical recommendation for men (see my blog post James Bond, alcoholic).
Anyway, the data in the table show the actual consumption (bottles of wine per week per adult) as a percentage of the recommended limit for men. Most countries have a recommended limit for men of 1.9-2.3 bottles per week.

Croatia 90%
France 90%
Netherlands 88%
Switzerland 83%
Portugal 75%
UK 74%
Belgium-Lux. 63%
Italy 60%
Denmark 53%
Austria 49%
Australia 47%
Sweden 45%
New Zealand 45%
Uruguay 44%
Germany 39%
Georgia 39%
Chile 36%
Greece 36%
Bulgaria 31%
Ireland 31%
Argentina 30%
Romania 27%
Finland 26%
Canada 20%
USA 16%
Spain 15%
Note that most of the countries have wine intake that is less than 50% of the recommended maximum wine consumption. In theory, this should make the medical people happy. However, there are two ways to be at the top of this list: (i) have a high consumption, and (ii) have a low recommended maximum consumption.
The countries with limits lower than 1.9 bottles per week include Bulgaria and the United Kingdom (1.5 bottles), Chile (1.3), and the Netherlands (1.0). This explains why the UK and the Netherlands are near the top of the list, even though their consumption is not particularly high.
The other top countries are there because of their high wine consumption. Indeed, Croatia, Portugal, France and Switzerland each consume 25% more wine per adult than does their nearest rival (Italy).
The countries with the highest recommended limits include Argentina, Canada and the USA (2.7 bottles per week), Greece (2.8), Romania (3.0), and Spain (3.8). This explains why these countries occupy the bottom places on the list — they set high limits, and so their people's consumption gets nowhere near that limit.
Note that Chile and Bulgaria have low recommended limits but even lower wine consumption.
Finally, it is worth noting that those countries with wine-consumption values exceeding 50% are likely to have average consumptions that exceed the recommended value for females, since these are often half that of the values for males. This is true for Croatia, Switzerland and Portugal. Also, since the value for actual consumption does not take into account what proportion of the population actually drinks wine, there will be many people of both sexes who exceed the recommended limit, possibly by a great margin.
Monday, June 4, 2018
What is a "most admired" wine brand?
The short answer is: It depends on what year it is!
Each year, the April edition of Drinks International magazine contains a supplement with a survey called The World’s Most Admired Wine Brands. A group of people are asked to vote for the wine brands they "most admire" based on the criteria that each brand should:
- be of consistent and / or improving quality
- reflect its region or country
- be well marketed and packaged
- respond to the needs and tastes of the target audience
- have broad appeal among wine consumers.
The people polled are drawn from "a broad spectrum of the global wine community", which apparently includes: masters of wine, sommeliers, commercial wine buyers, wine importers and retailers, wine journalists, wine consultants and analysts, wine educators, and other wine professionals. There were only 60 people involved back in 2012, but there are now more than 200.
The people could originally vote for up to six wine brands, but apparently they are now asked for only three choices. Furthermore, they are provided with a list of previous winners, including "a list of more than 80 well-known brands and producers, but as usual we also encourage the option of free choices". David Williams has presented his own take on what it means to take part in this poll.
The polls
I have compiled the poll results for the years 2011-2018 inclusive. Each of the published lists contains only the results for the top 50 ranked wine brands in that year — all we know about the other brands is that they were ranked lower than 50th place in that year.
Across the 8 years, 98 brands have appeared at least once in the lists. However, only 15 of these brands appeared in all 8 lists, with a further 10 brands appearing in 7 of the 8 lists. There were 20 brands that appeared only once each. There is thus a great deal of variability in "admiration" from year to year.
The first graph shows the yearly data for 9 of the wine brands that appeared in all 8 lists, with the vertical axis showing their ranking from 1st on down to 50th. As you can see, even these brands' results varied dramatically from year to year. Indeed, only the top three brands have shown any consistency at all — these are Torres (from Spain), Concha y Toro (from Chile), and Penfolds (from Australia). Torres was ranked either 1st or 2nd every year, which must make it the most admired brand of all.
To this list of consistent brands we can also add E. Guigal (from France) and Ridge (from the USA), both of which missed the 2011 list but had relatively consistent ranks thereafter (mostly in the top 10).
The second graph focuses on the wine brands from Bordeaux, including those that did not make it onto all 8 lists — this is far and away the best-represented wine region in the lists, with 10% of the brands.
These are all top wine chateaus, of course, with the most expensive wines. The most successful of them seems to be Château Margaux, but even it varies in rank from 7 to 29 across the years. Château Latour and Château d'Yquem are the only ones to get into the top 5 in at least one year, but these two chateaus then missed the list entirely in other years. Clearly, Bordeaux does not engender unmitigated admiration in the wine world.
As far as countries are concerned, France hosts 20% of the admired wine brands:

Bordeaux 10
Rhône 3
Burgundy 2
Languedoc 2
Beaujolais 1
Provence 1
General 1
You will note the very poor showing from Burgundy — there are not many large wine brands (only Louis Latour makes most of the lists), but instead a host of smaller brands marketing very expensive wine (Domaine de la Romanée-Conti makes it onto two lists).
The remaining countries include:

Australia 13
Spain 13
USA 11
New Zealand 8
Chile 7
Italy 6
South Africa 6
Portugal 5 (of which Porto 4)
Germany 3
Argentina 2
Canada 1
China 1
Hungary 1
Lebanon 1
Even though the Porto region hosts 4 of Portugal's admired wine brands (Dow's, Graham's, Sandeman, Taylor's), these have only ever made it onto two of the lists (2016 and 2017).
The brands
I have provided a summary of the data relating to the individual brands in the following network. It displays all of those brands that appeared in at least 50% of the lists (ie. 4 out of 8). This is a form of multivariate data summary, as described in my post Summarizing multi-dimensional wine data as graphs, Part 2: networks. [Technical details: this is a neighbor-net based on the Gower distance.]
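The network itself was produced with specialist software, but the distance calculation underlying it is easy to sketch. The Python fragment below computes Gower distances between brands from their yearly ranks, using invented ranks and treating absence from a year's top-50 list as missing data (both of those details are assumptions for illustration, not a description of the exact data handling).

```python
# Invented yearly ranks (2011-2018) for three brands; None = not in that year's top 50.
ranks = {
    "Brand A": [2, 1, 1, 2, 1, 1, 2, 1],
    "Brand B": [5, 8, 6, 10, 7, 9, 6, 8],
    "Brand C": [None, 44, None, 38, 50, None, 41, 47],
}
RANK_RANGE = 50 - 1   # ranks in each poll can run from 1st down to 50th

def gower(x, y):
    """Gower distance: mean range-scaled rank difference over the years where both
    brands have a rank (years with a missing value are simply skipped)."""
    shared = [(a, b) for a, b in zip(x, y) if a is not None and b is not None]
    return sum(abs(a - b) / RANK_RANGE for a, b in shared) / len(shared)

names = list(ranks)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(f"{a} vs {b}: {gower(ranks[a], ranks[b]):.3f}")
```

A matrix of such pairwise distances is what a neighbor-net program (such as SplitsTree) then turns into the network diagram.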
Each brand is represented by a dot in the network. Brands that are closely connected in the network are similar to each other based on their ranks across the 8 polls, and those that are further apart are progressively more different from each other. So, for example, the three top-ranked brands (Torres, Concha y Toro, Penfolds) are together at the top of the diagram, followed by the next pair (E. Guigal and Ridge). From there, the network progresses down to the less-admired wine brands at the bottom.
Note that Guigal and Ridge are at the head of a bunch of 13 wine brands, all of which are relatively highly admired. Then there is a group of 7 intermediate brands (the network area from Oyster Bay down to Pétrus), plus Marqués de Riscal on its own — the latter is isolated in the network only because it inexplicably missed the 2017 list.
You will also note the proximity in the network of Yellowtail to Château Mouton-Rothschild and Cheval Blanc! This should make you wonder about the criteria for "brand admiration".
The basic issue with these lists
There are potentially at least three things wrong with the "best of" type of list: (i) there is rarely any clear idea of what "best" is supposed to mean; (ii) the list is of arbitrary length (eg. Top-10 or Top-50 only); and (iii) the ranking does not reflect the differences in the original scores. In this instance, we have some idea of what "most admired" is supposed to mean; but the other two issues definitely apply here.
More importantly, there is an issue with interpretation that is rarely mentioned. To say, as many of the media have, that "In the 2018 poll the industry voted Torres the most-admired wine brand" is wrong, because there is no evidence that the people polled did any such thing. Indeed, we do not know how many people actually did put Torres (or any other brand) on their own personal list of three brands. All we know is that Torres is listed as the no. 1 brand because more people put it on their list than did so for any other brand — it may have been a lot of people or it may not.
To provide a concrete example of what I mean, I will refer to my previous discussion of a similar situation for the "Greatest Films Poll" produced every decade by Sight & Sound magazine, which lists the top films as voted by selected film critics. The critics are asked to each list 10 films; and in the most recent poll Alfred Hitchcock's film Vertigo topped the overall poll. However, the vast majority of the critics (77%) said that this film doesn't even belong in the top 10 (ie. it was not on their personal list), let alone first. However, it is listed as the no. 1 film, because more critics (23%) put it on their list than did so for any other film. Similarly, 91% of the critics said The Searchers should not be in the top 10, and yet it is ranked no. 7. So, the rank order of the films is simply that — a rank order; it does not tell you how many critics think highly of each film.
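A toy tally makes the distinction concrete. In the following Python sketch (with entirely invented ballots), the "winner" is named by only a minority of the voters, and yet it still tops the aggregated list, just as Vertigo did.

```python
from collections import Counter

# Invented ballots: each voter names three brands, unranked.
ballots = [
    ["Brand A", "Brand B", "Brand C"],
    ["Brand A", "Brand D", "Brand E"],
    ["Brand F", "Brand G", "Brand H"],
    ["Brand I", "Brand J", "Brand K"],
    ["Brand L", "Brand M", "Brand N"],
]

counts = Counter(brand for ballot in ballots for brand in ballot)
winner, votes = counts.most_common(1)[0]
print(f"Top of the list: {winner}, named by {votes} of {len(ballots)} voters "
      f"({votes / len(ballots):.0%}), which is a plurality, not a majority")
```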
This is the same point that I am making for the wine brands, although in this case I do not have the detailed information to say exactly how many people listed each brand.
In a similar vein, anyone who knows anything about banking will know that "the most trusted bank" is nothing more than "the least mistrusted bank". The distinction is not trivial for the consumer.
This knowledge does not stop the misuse of these sorts of lists, of course. For example, Voxy recently noted: "Villa Maria [was] named the world’s most admired New Zealand wine brand by Drinks International". This is, strictly speaking, true, but Villa Maria did go down from 4th rank overall last year to 8th this year!
In a future post I might look at some of the other industry awards, such as The World Ranking of Wines and Spirits and The Drinks Business Awards.