Often on fine wines you see ratings from Wine Advocate and/or Wine Spectator, and usually they're within a point or two of each other. But I've seen two wines recently where there was a 10-point spread - how is this possible?
Answer From Expert Roger Bohmrich MW
This is a great question, and it touches on one of the enduring hot topics in the world of wine - particularly, though not only, in the US, where such ratings carry considerable weight. Your question - "how is it possible" that different reviewers would score the same wine very differently - seems to assume that there is a uniform, objective measure which all tasters apply. In reality, no such universal yardstick exists. Many question whether wine quality can be quantified at all, except through some type of arbitrary scoring system, and even then individual tasters inevitably apply any such system in their own way. That is why wine competitions usually rely on panels of judges and a methodology to "even out" subjective extremes, such as discarding the highest and lowest scores before averaging.

Coming back to your question: it is true that trained professional tasters (of whom there are, in truth, far fewer than you might think) tend to share standards of assessment and a vocabulary. Nevertheless, they may reach different conclusions, especially when they express their personal preferences as a numerical score.
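For readers curious how that "evening out" step works in practice, here is a minimal sketch of one common approach, a trimmed average. The judge scores below are invented purely for illustration; they are not taken from any actual competition or reviewer.

```python
# Minimal sketch of the panel-scoring idea described above: drop the panel's
# single highest and single lowest score for a wine, then average the rest.
# All names and numbers here are hypothetical examples.

def panel_score(scores: list[float]) -> float:
    """Average a panel's scores after discarding the one highest and one
    lowest value (requires at least three judges)."""
    if len(scores) < 3:
        raise ValueError("Need at least three scores to trim high and low")
    trimmed = sorted(scores)[1:-1]      # drop one low and one high score
    return sum(trimmed) / len(trimmed)

# Example: seven hypothetical judges rating the same wine on a 100-point scale.
judges = [88, 90, 91, 92, 93, 95, 98]
print(round(panel_score(judges), 1))    # -> 92.2, less swayed by the 88 and the 98
```

The point of the trim is simply that one enthusiastic (or unimpressed) taster cannot pull the published score very far on their own, which is exactly the subjective extreme a solo critic's rating has no way to smooth out.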