A major trend in wine sales over the last couple of decades has been the increasing reliance on point scores to sell wines. Retail shops make sure to have shelf talkers hanging by wines that have scored well. Ads proclaim “90 point wine by Robert Parker!” or “91 point wine by Wine Spectator!” When a customer goes into a store to choose from a vast selection of wines, he or she often feels that, lacking any other means to judge a wine, a point score is the best way to decide.

One-hundred-point scoring was introduced into the wine world by Robert Parker in the late ‘70s; before that, if point scoring was done at all, it was done on a 20-point scale. The Department of Viticulture and Enology at the University of California at Davis had a 20-point system that awarded or subtracted points according to the presence or absence of specific flaws. That system may or may not have provided a guide for consumers as to how “good” a wine was. Harry Waugh used a 20-point system that was simply Harry saying to himself, “this tastes like a 17-point wine.” Many years ago, I developed a 100-point system that awarded points for particular attributes such as “Varietal Character,” “Character, Interestingness,” and “Elegance, Breed, Finesse.” There was plenty of room at the top of each category for wines that in my imagination exceeded anything that I had yet tasted. The highest score any wine achieved on that system was an 83, for a 1961 La Tâche from the Domaine de la Romanée-Conti. I gave up that system sometime in the ‘80s. As near as one can tell, the various 100-point scoring systems in use today follow the Harry Waugh method of the taster saying, “I know what a 90-point wine tastes like, and this wine tastes like it’s 90 points.”

Why 100 points? Robert Parker explained in his early issues (quoting here from issue VI, in 1979, the earliest issue I can find in my files): “The WINE ADVOCATE’S rating system employs a 50-100 point quality scale. It is my belief that various twenty (20) point rating systems do not provide enough flexibility and often result in inflated wine ratings. The WINE ADVOCATE will take a hard, almost over critical look at wine, since it would prefer to underestimate the wine’s quality than to overestimate it. The wine industry seems to be a leader in over inflated propaganda about the virtues of particular wines and vintages. This is evidenced by the ratings of the 1973 Red Bordeaux, itemized in Volume I, which as a vintage the WINE ADVOCATE would rate 68. This is considerably lower than the ratings given by other wine authorities who, more often than not, represent the viewpoint of the wine wholesaler, distributor, and retailer, rather than the wine consumer.”

The first sentences of this explanation are actually remarkably little changed from the current explanation on Robert Parker’s web site, but there has been some grade inflation in the intervening 32 years. When Parker began, he explained that a wine rated 80-89 was “a very good wine displaying considerable finesse and character with no noticeable flaws,” while a wine rated 75-79 was “an above average wine that offers delightful drinking with some complexity and charm.” Today Parker explains that a wine he rates 80-89 is “A barely above average to very good wine displaying various degrees of finesse and flavor as well as character with no noticeable flaws.” A wine rated 70-79 is “An average wine with little distinction except that it is soundly made. In essence, a straightforward, innocuous wine.”
That Volume VI from which the earlier quotes were drawn rated the 1975 Bordeaux, giving, for instance, Haut-Bages Liberal a score of 86 but also a “highly recommended.” Pichon Lalande was rated 84 and “recommended,” as was First Growth Mouton Rothschild. Imagine a First Growth today being “recommended” with a score of 84. (We tasted the 1975 Mouton when it first came out, by the way, thought it was excellent, and bought some.)

The other big gun of wine scoring is Wine Spectator. Its 100-point system explains that a wine rated 85-89 is “Very good: a wine with special qualities.” As a practical matter, of course, a wine rated 85-89, unless it is unusually inexpensive, has been given the kiss of death. Why buy a wine costing $25 that is rated 87, when the wine next to it has been rated 90 for the same price?

Wine Spectator explains that when it rates wines, it conducts tastings “in private, under controlled conditions. Wines are always tasted blind, in flights organized by varietal, appellations or region. Bottles are bagged and coded. Tasters are told only the general type of wine (varietal or region) and vintage. Price is not taken into account.” Some reviewers, such as the New York Times, use a panel of tasters, and the published ratings are something of an average of the group. Wine Spectator’s ratings generally reflect the views of an individual. In the case of California wines, for example, that individual has long been James Laube, who has been with Wine Spectator since 1981. So when you read that Wine Spectator has rated a California wine a 92, that is really James Laube rating the wine.

So are ratings a good guide to how much you will like a particular wine? We decided to put the ratings to the test. How close would a group come to rating wines the same as Wine Spectator? For this experiment I used six California Pinot Noirs from the 2007 vintage (which is generally considered a particularly good vintage for that wine). Five of the six wines were “single vineyard” wines—that is, for example, instead of the Saintsbury “Carneros” Pinot Noir, I used the Saintsbury “Stanly Ranch” Carneros Pinot Noir. It is very common in California (and many other places) for a winery to release an “entry level” wine, often a blend from several sources, and one or more “high end” wines sourced from single vineyards. The high-end wines are often the ones submitted to Wine Spectator and the other rating publications, so one must be careful when reading that a “Saintsbury” Pinot Noir scored a 92 from Wine Spectator—it was actually the Saintsbury “Stanly Ranch” Pinot Noir that scored the 92.

I carefully selected six 2007 California Pinot Noirs, each with a different Wine Spectator score ranging from 87 to 92. The great majority of wines that are rated (at least in the print edition) by Wine Spectator fall between 87 and 92—higher or lower scores are rarer. Thus, for most wines for which we know the scores, a 100-point scale is in reality a six-point scale.

In any event, the wines were:

1. 2007 Sanford, Sta. Rita Hills (Santa Barbara County), Wine Spectator score 87
2. 2007 Foxen, Julia’s Vineyard, Santa Maria Valley (Santa Barbara County), Wine Spectator score 88
3. 2007 Foxen, Sanford & Benedict Vineyard, Sta. Rita Hills (Santa Barbara County), Wine Spectator score 89
4. 2007 A.P. Vin, Keefer Ranch Vineyard, Russian River Valley (Sonoma County), Wine Spectator score 90
5. 2007 A.P. Vin, Clos Pepe Vineyard, Sta. Rita Hills (Santa Barbara County), Wine Spectator score 91
6. 2007 Saintsbury, Stanly Ranch, Carneros (Napa/Sonoma), Wine Spectator score 92

Of these wines, I was most familiar with the two Foxens. They came from my cellar, and I was not concerned that they would show poorly despite having “only” 88 and 89 scores. The rest I obtained just for this tasting.

The wines were presented to my regular tasting group, most of whom have been tasting together, mostly blind, for more than two decades, so they know the drill and are experienced tasters. I poured each wine into a carafe with a label on the bottom indicating wine 1, 2, 3, etc., so that in the end we would know which wine was which. On the side of each carafe was another label, and another taster randomly labeled the wines A, B, C, etc., so that during the tasting neither I nor anyone else would know which wine was which. Following the Wine Spectator rules, the tasters were told only that the wines were 2007 California Pinot Noirs. Deviating somewhat from the Wine Spectator rules, the tasters were instructed that their sole job was to assign each wine a score of 87, 88, 89, 90, 91, or 92—no other scores, and no tie scores.

When most people in a wine store see scores between 87 and 92, they would probably accept that the difference between a 91- and a 92-rated wine was trivial, but they would think that the difference between an 87- and a 92-rated wine was significant. And they would probably believe that the relative point scores would tell them which wine they would like better.

What we found, however, was that none of the ten tasters agreed with Wine Spectator, and no two tasters agreed with each other. Only one taster agreed with Wine Spectator that the Saintsbury Stanly Ranch was the highest rated, and none agreed that the Sanford was the lowest rated. Each of the six wines had at least one taster rating it the highest, and four of the six had at least one taster rating it the lowest. Interestingly, the A.P. Vin Keefer Ranch had the highest average score, while the A.P. Vin Clos Pepe had the lowest. The averages, however, mask the tremendous variation in scores among tasters. For those who care about such things, the Foxen Julia’s Vineyard had the highest standard deviation (2.04), while the Sanford and the A.P. Vin Keefer Ranch had the lowest (1.26 and 1.27, respectively). So the Julia’s Vineyard drew the widest range of preferences: three others and I gave it the highest score, while two gave it the lowest.
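For those who want to see the bookkeeping, here is a minimal sketch of the blind coding and the score tabulation. The carafe-labeling scheme and the no-ties, 87-to-92 rule come from the tasting itself; the individual score sheets below are invented for illustration, since I am reporting only summary numbers here, and the sketch assumes the sample (n - 1) standard deviation formula.

    import random
    import statistics

    # Double-blind coding: the pourer knows wine -> number (the label on
    # the bottom of each carafe); a second taster shuffles the letters
    # for the side labels, so during the tasting nobody knows which wine
    # is which.
    wine_numbers = ["1", "2", "3", "4", "5", "6"]
    letters = list("ABCDEF")
    random.shuffle(letters)
    blind_code = dict(zip(wine_numbers, letters))  # revealed only at the end

    # Each taster hands out the scores 87-92 exactly once: no other
    # scores, no ties. Hypothetical columns for two of the six wines,
    # ten tasters each (illustration only, not the actual score sheets):
    scores = {
        "Foxen Julia's Vineyard": [92, 87, 92, 89, 92, 87, 92, 91, 88, 90],
        "Sanford":                [88, 90, 88, 92, 89, 88, 90, 89, 90, 88],
    }

    for wine, ratings in scores.items():
        mean = statistics.mean(ratings)
        spread = statistics.stdev(ratings)  # sample standard deviation (n - 1)
        print(f"{wine}: mean {mean:.2f}, standard deviation {spread:.2f}")

Keep in mind that forcing every score into a six-point band limits how large the standard deviation can possibly get, so a spread above 2 points, as with the Julia’s Vineyard, reflects tasters clustered at opposite ends of the scale.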
Several caveats may be in order here. First, most of the tasters remarked that all of the wines were pretty darn good. Had they not been instructed to assign six separate scores from 87 to 92, most would probably have rated all of them in the 91 to 93 point range. Given that the wines averaged around $50 in price, this may not be surprising.

Second, we tasted the 2007 wines more than a year after James Laube of Wine Spectator would have tasted them. It would not be surprising if the wines had changed relative to one another in that year. But since these were high-end wines not meant to be consumed right away (and arguably not ready even now), it is a useful cautionary tale that wines rated right at release may change in relative deliciousness over time.

Third, storage and transportation conditions may affect wines even in the first year. Although all the wines were stored together in my basement for two weeks before the tasting, the two Foxen wines were shipped more or less directly to me from the winery. The A.P. Vins and the Sanford came from reputable Washington, D.C. stores (MacArthur’s and Calvert-Woodley, respectively). The Saintsbury came from the online wine source Wines Til Sold Out, so there is no telling what its story was.

Fourth, different conditions (time of day, number of wines tasted at a sitting, temperature, for example) affect one’s perception of how a wine tastes. We could not duplicate the exact conditions under which James Laube initially rated these wines.

But against these caveats is the indisputable fact that ten tasters tasting these six wines under the same conditions had wildly different perceptions of how much they liked the wines relative to one another. And you are not likely to know in advance whose preferences you would agree with if you were to taste these wines. Would you agree with James Laube that the Saintsbury Stanly Ranch was the best of these and the Sanford the least good, or would you agree with me and taster “K” that the Foxen Julia’s Vineyard was the best and the Saintsbury Stanly Ranch the least good? Perhaps you’d agree with tasters “B” and “L” that the A.P. Vin Keefer Ranch was the best. Perhaps you’d have different preferences altogether.

So, if point scores from Robert Parker, Wine Spectator, or any other source are not necessarily a reliable guide to how much you will like a wine, how are you to choose which wines to buy? There are several options. First, you can find a reputable wine shop where you come to know the wine person and, more importantly, she comes to know you and your tastes. Buy wines that she recommends and report back to her about how well you liked them. Over time, she will be able to refine her recommendations to suit your particular tastes.

Second, go to wine tastings, whether at a store, at a winery, at special tasting events, or in a tasting group of friends. That way, you get to “try before you buy.” Pay no attention when the pourer says, “Robert Parker gave this wine 92 points” (and if this is the general sales pitch of your retail wine person, find another). It’s how you like it that counts. You may come to find that you like most offerings of a particular winery and can come to trust that you’ll like future releases. For example, I have been to the tasting room at Foxen a few times and am generally satisfied that I’ll like whatever they’ll sell me in the future. (I’ll note here that although I liked the Foxen Julia’s Vineyard best in this tasting, I had no idea while tasting it that it was the Foxen.)

Finally, come to accept the notion that, these days, there is really not that much terrible wine out there. It is not a tragedy of epic proportions if the wine you’re drinking turns out to be “87-ish” to your taste rather than “92-ish.” Most wines (OK, most wines above jug-wine status) have at least some elements of interest if you pay attention. Don’t obsess over drinking only “the best” wines. Enjoy each wine for what it is, and constantly try new wines. You may very well change your notion of what you like.