Monday, July 24, 2017

Wine tastings: the winning wine is often the least-worst wine

At organized wine tastings, the participants often finish by putting the wines in some sort of consensus quality order, from the wine most-preferred by the tasting group to the least-preferred. This is especially true of wine competitions, of course, but trade and home tastings are often organized this way, as well.

The question is: how do we go about deciding upon a winning wine? Perhaps the simplest way is for each person to rank the wines, and then to find a consensus ranking for the group. This is not necessarily as straightforward as it might seem.
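
To make this concrete, here is a minimal sketch, in Python, of perhaps the simplest aggregation rule: a Borda-style sum of ranks. The wines and ballots below are invented purely for illustration; real tasting groups may well tally their votes differently.

    # A Borda-style consensus: sum each wine's rank across all tasters,
    # then order the wines by their rank sums (lower = more preferred).
    # These ballots are invented, not data from any real tasting.
    from collections import defaultdict

    ballots = [
        ["Wine A", "Wine B", "Wine C", "Wine D"],   # taster 1, best to worst
        ["Wine B", "Wine A", "Wine D", "Wine C"],   # taster 2
        ["Wine A", "Wine D", "Wine B", "Wine C"],   # taster 3
    ]

    rank_sums = defaultdict(int)
    for ballot in ballots:
        for rank, wine in enumerate(ballot, start=1):
            rank_sums[wine] += rank

    for wine in sorted(rank_sums, key=rank_sums.get):
        print(wine, rank_sums[wine])   # Wine A 4, Wine B 6, Wine D 9, Wine C 11

Note that even this simple rule embodies the effect discussed below: a wine that is never ranked first can still end up with the lowest rank sum.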

To illustrate this idea, I will look at some data from two separate blind tastings, held in late 1995, of California cabernets (including blends) from the 1992 vintage. The first tasting had 18 wines and 17 tasters, and the second had 16 wines and 16 tasters. In both cases the tasters were asked, at the end of the tasting, to put the wines in their order of preference (i.e. a rank order, with ties allowed).

The first tasting produced results with a clear "winner", no matter how this is defined. The first graph shows how many of the 17 tasters ranked each wine in first place (vertically) compared to how often that wine was ranked in the top three places (horizontally). Each point represents one of the 18 wines.

Results of first tasting

Clearly, 15 of the 18 wines appeared in the top 3 ranks at least once, so that only 3 of the wines did not particularly impress anybody. Moreover, 6 of the wines got ranked in first place by at least one of the tasters — that is, one-third of the wines stood out to at least someone. However, by consensus, one of the wines (from Screaming Eagle, as it turns out) stood out head and shoulders above the others, and can be declared the "winner".

However, this situation might be quite rare. Indeed, the second tasting seems to be more typical. The next graph shows how many of the 16 tasters ranked each wine in first place (vertically) compared to how often that wine was ranked in the top five places (horizontally). Each point represents one of the 16 wines.

Results of second tasting

In this case, the tasters' preferences are more evenly spread among the wines. For example, every wine was ranked in the top 3 at least once, and in the top 4 at least twice, so that each of the wines was deemed worthy of recognition by at least one person. Furthermore, 10 of the 16 wines got ranked in first place by at least one of the tasters — that is, nearly two-thirds of the wines stood out to at least someone.

One of these wines, the Silver Oak (Napa Valley) cabernet, looks like it could be the winner, since it was ranked first 3 times and in the top five 7 times. However, the Flora Springs (Rutherford Reserve) wine appeared in the top five 10 times, even though it was ranked first only twice; so it is also a contender. Indeed, if we take all of the 16 ranks into account (not just the top few), then the latter wine is actually the "winner", and it is shown in pink in the graph. Its worst ranking was tenth, so that no-one disliked it, whereas the Silver Oak wine was ranked last by 2 of the tasters.

We can conclude from this that being ranked first by a lot of people will not necessarily make a wine the top-ranked wine of the evening. "Winning" the tasting seems to be more about being the least-worst wine! That is, winning is as much about not being last for any taster as it is about being first.
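
A small invented example (not taken from the tastings above) shows the arithmetic behind this. Under a sum-of-ranks rule, a polarizing wine that collects several first places but also several last places can easily lose to a wine that everyone ranks comfortably in the middle:

    # Invented ranks for two hypothetical wines, from five tasters
    # ranking a field of 10 wines (1 = best, 10 = worst).
    polarizing = [1, 1, 1, 10, 10]   # three firsts, but also two last places
    consistent = [3, 2, 4, 3, 2]     # never first, but never below fourth

    # Under a sum-of-ranks rule, the lower total wins:
    print(sum(polarizing))   # 23
    print(sum(consistent))   # 14 -- the "least-worst" wine comes out on top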

This situation is not necessarily unusual. For example, on my other blog I have discussed the 10-yearly movie polls conducted by Sight & Sound magazine. In the 2012 poll Alfred Hitchcock's film Vertigo was ranked top, displacing Citizen Kane from the top spot for the first time in 50 years; and yet 77% of the critics polled did not even list this film in their personal top 10. Nevertheless, more critics (23%) put Vertigo on their top-10 list than did so for any other film, and so this earns Vertigo the top spot overall. From these data, we cannot conclude that Vertigo is "the best movie of all time", but merely that it is chosen more often than the other films (albeit by less than one-quarter of the people). Preferences at wine tastings seem to follow this same principle.

Finally, we can compare the seven wines that were common to the two tastings discussed above. Did these wines appear in the same rank order at both tastings?

In this case, we can calculate a consensus ranking for each tasting by summing points across the participants' votes: 3 points for each first-place vote, 2 points for each second, and 1 point for each third. The result of this calculation is shown in the third graph, where each point represents one of the seven wines, and the axes indicate the rankings from the two tastings.
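
As a sketch of this tally (with invented votes, not the actual tasting data), the calculation looks like this:

    # The 3-2-1 tally described above: 3 points for each first-place vote,
    # 2 for each second, 1 for each third (higher total = better consensus rank).
    # The votes here are invented for illustration.
    POINTS = {1: 3, 2: 2, 3: 1}

    votes = [                      # each entry is one taster's (wine, place) vote
        ("Wine A", 1), ("Wine B", 2), ("Wine C", 3),   # taster 1
        ("Wine A", 1), ("Wine D", 2), ("Wine B", 3),   # taster 2
    ]

    scores = {}
    for wine, place in votes:
        scores[wine] = scores.get(wine, 0) + POINTS[place]

    for wine, points in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(wine, points)        # Wine A 6, Wine B 3, Wine D 2, Wine C 1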

Comparison of the two tastings

The two groups of tasters agree on the bottom three wines in their rankings. However, they do not agree on the "winning" wine among these seven. More notably, they disagree quite strongly about the Silver Oak cabernet. In the second tasting this wine received 3 firsts and 2 thirds (from the 16 tasters), while in the first tasting it received only a single third-place vote (out of 17 people). The consensus ranking of this wine thus differs quite markedly between the tastings. This may reflect differences in the participants at the two tastings, with a broader range of wine expertise among the second group.

2 comments:

  1. Speaking as the organizer of the first of the tastings discussed above, I can report that it was conducted "single blind": one participant brown-bagged the bottles to make them appear as anonymous as possible, and a second participant "randomly" numbered the wines 1 through 19, which became their pour order. The wines were grouped in flights of five. "Top 3 Preference Votes" were taken after each flight. All wines remained on the table, in glasses in front of the participants, throughout the event, to facilitate retasting and comparison of the wines. At the conclusion of the tasting, an "Overall Top 3 Preference Vote" was taken. That last vote is the one cited by David and represented in the first exhibit.

    I can inform you that the debut vintage 1992 Screaming Eagle totally captivated the assembled tasters. It evinced a beguiling aroma of violets that perfumed the room. The fruit was ripe (but not "stewed" or "raisined"), the mouthfeel was plush, the finish long, and there was no hint of high alcohol (which has since become a hallmark of contemporary California Cabernets and Cab-blends).

    One of the greatest young red wines I have tasted from the New or Old World.

    A wine that Jancis Robinson, MW, described as follows in her website piece titled "Is California dreaming? -- an extraordinary assessment of the cult wines that cost more than Bordeaux's first growths":

    "1992 Screaming Eagle: 19.5 points [out of 20] and still improving

    "Very very deep colour, Lively, edgy nose. Round palate but with revitalising edge of acidity. Very slightly drying at the end but markedly long (the longest-lasting wine of the tasting) and elegant. Sweet and appetising (the bullseye) right through to the end."

    (A wine that even impressed Hugh Johnson and Michael Broadbent at the tasting they attended with Robinson.)

    In its hedonism, it is surpassed in my drinking experience only by a bottle of 1990 Château Rayas Réserve Châteauneuf-du-Pape.

    (Which Decanter magazine declared a "wine legend":
    http://www.decanter.com/learn/wine-legend-chateau-rayas-reserve-1990-334903/)

    Kudos to winemaker Heidi Barrett and Screaming Eagle winery owner Jean Phillips for crafting a modern-day masterpiece.

  4. "Naming names" . . .

    Let me supplement the comment above by listing the wines that made up my 1992-vintage California Cabernet and Cab-blend tasting.

    FLIGHT 1 OF 4 :

    # 1: Z. D. "Reserve"
    # 2: Stag's Leap "Cask 23"
    # 3: Silver Oak "Napa"
    # 4: Screaming Eagle
    # 5: Joseph Phelps "Insignia"

    FLIGHT 2 OF 4 :

    # 6: Montelena
    # 7: Mt. Eden "Old Vines"
    # 8: Laurel Glen
    # 9: La Jota "11th Anniversary"
    # 10: Kendall-Jackson "Reserve"

    FLIGHT 3 OF 4 :

    # 11: Judd's Hill
    # 12: Hess Collection
    # 13: ( WINE NOT SUBMITTED )
    # 14: Flora Springs "Reserve"
    # 15: Fisher "Wedding Vyd."

    FLIGHT 4 OF 4 :

    # 16: Dalla Valle "Napa"
    # 17: Caymus "Special Selection"
    # 18: Beringer "Reserve"
    # 19: Araujo "Eisele Vyd."

    [No, I am not triskaidekaphobic. Wine submission # 13 never showed up at the venue, so I made a 12th-hour substitution.]
