In days long gone, a handful of powerful publications were the gatekeepers of wine reviews. They passed out scores, and wineries either benefited from them or didn’t. In today’s world, these magazines are no longer the gatekeepers for reviews. Wineries and retailers are.
Things started to change around 15-20 years ago with the introduction of blogging. Wineries could now get attention from somewhere other than the big magazines. Independent reviewers, who were reviewing wines with more intention and devotion, also emerged.
In two high profile examples, Antonio Galloni established the Piedmont Report in 2004. Jeb Dunnuck established The Rhône Report in 2008. Both were eventually brought under the umbrella of The Wine Advocate (TWA), Galloni in 2006 and Dunnuck in 2013. (I established Washington Wine Report in 2004 and rebranded to Northwest Wine Report in 2022.)
Then something interesting happened. Galloni left TWA in 2013 and started a new review site, Vinous. Similarly, Dunnuck left TWA and founded JebDunnuck.com in 2017.
No publication has been immune from the trend of reviewers going independent. James Suckling founded JamesSuckling.com after leaving Wine Spectator. (I left Wine Enthusiast and started focusing fully on my own site in 2022.)
With technology providing the equivalent of a modern-day printing press, more independent review sites have launched from people not formerly associated with the major magazines. Additionally, wine competitions have started handing out generous point scores instead of just medals-for-all.
This has led to a tectonic change in the wine reviewing landscape. Wineries and retailers no longer need scores from legacy publications to help sell wine. Instead, they have a buffet of reviews to choose from.
One could refer to this as the democratization of wine reviewing. It was.
However, previously, reviewers were vetted (usually). Regional experience was considered paramount. Reviewers also typically had extended tenures, allowing wineries and consumers to better understand their palates.
Reviewers provided not just a service to consumers but also a check to wineries when they missed the mark. The importance of reviews came from both the publication with which it was associated as well as a reviewer’s own experience and credibility.
None of these things are the case any longer.
Today, the score is all-important, not the publication or the reviewer. There is no standard whatsoever for reviewer experience at some independent publications and even some legacy ones. Reviewers often have little experience and no regional expertise. Reviews are, generally speaking, nothing but effusively positive.
There have, no doubt, been some benefits from these changes. Wineries are no longer beholden to a small handful of publications and their predominantly white male reviewers to get attention. People interested in reviewing wine can self-start, as I did. Consumers have more varied voices.
Unfortunately, the problems are manifold. Reviewers are strongly incentivized to give high scores to rise above the fray. Wineries are similarly incentivized to promote the highest score a wine receives, regardless of its source.
In doing so, wineries and retailers bestow credibility on the reviewer and outlet by lending some of their own. They elevate those voices. That is a fundamental change from the days when the credibility of the review was coming from the publication itself.
This might seem like a victimless problem. However, giving wineries and retailers control over the legitimacy of reviewers and reviews is deeply conflicted. Both are in the business of selling something.
The reality is everyone loses. Consumers buy wines based on scores that have little credibility. Many realize it and start ignoring reviews. Wineries and retailers hurt their credibility by promoting scores that aren’t based in reality and that they don’t even necessarily believe.
Wine criticism takes the worst hit. These changes have turbo-charged score inflation. As reviewing outlets multiply, one sure-fire way to get wineries to promote scores (and hence your brand) is to score at the upper reaches of the scale.
Top scores, once rare beasts reserved for the best wines from the best vintages (as I believe they still should be), are now so commonplace they have little meaning. When reading articles of all of the 100-point scores publications gave last year, I didn’t know whether to laugh or cry.
Ultimately, these changes will usher in the end of the 100-point system. That day is fast approaching. Scores have already reached unsustainably high levels where they are less and less impactful.
Some will surely rejoice when that happens. However, consumers still need ways to find wines they are more likely to enjoy from the thousands of available options. Wineries similarly need ways to elevate their wines in an increasingly competitive environment.
Wine reviewing, for better or worse, has long served as one of the mechanisms to do these things. Once that arrow gets removed from the quiver, all the points in the world won’t affect people’s buying habits.
Perhaps something better will rise from the ashes. However, as the wine industry sails into a near perfect storm of headwinds, losing one of the primary tools used for decades to sell wine is unlikely to be anything to celebrate.
Northwest Wine Report is wholly subscriber funded. Please subscribe to support continued independent content and reviews on this site. It is the only way that the site will be able to continue.
Sean,
Thank you for another outstanding article and one that needed to be written. You are spot on with your analysis.
One of the many things that differentiates you from the pack of uninformed “inflators” is your in-depth discussion of everything that makes the wine. You cast a wide net and provide unique insights into things that matter to those of us who love Washington wine.
I have, for many years, respected your balanced and well-informed approach to your reviewing and scoring. Hopefully you continue to talk about how to better educate the wine consumer and get them to see the big picture rather than a number that is increasingly meaningless.
As you say, the environment is shifting daily and consumers want intelligent information about wine!
I’m glad you wrote this; we’ve talked about it a few times. I agree with much of this, but would make a few points.
-Wine magazines also sell wine; if they didn’t, they wouldn’t receive wines to review, they wouldn’t book ads selling wine, producers wouldn’t pay to be a part of their events, etc. So I think the relationship there is much more complicated than you’ve made it seem above. If a wine magazine, for example, only published its 88 and below scores, it would go out of business, and everyone has a Top 100 or 50, or whatever.
-I have seen several wineries post only the top scores they receive from a reviewer, me, for instance. I may have scored several of their wines 89, 90, or 91, and if one or two received a 93 or above, those are the ones they post on their site or social media. What they do is their prerogative, of course, and I don’t blame them. It reminds me of those “My Child Is an Honor Student at XYZ School” bumper stickers. I never saw those for kids who got Cs.
-On the last point, and we still have a long way to go on progress here, those are old, out-of-shape white guys, for the most part, staffing those mags. If we want to see more women and people of color make their way into these mainstream magazines, they are likely getting their start in blogs (as we both did) or on Instagram, and I think that part of it is 100% positive.
I do agree that regional familiarity and reviewing experience are important, as well as some training and palate calibration.
Clive,
I agree that wine magazines selling wine and making money by having wineries attend events presents complications and conflicts of interest. That said, I can say that, at least during my time at Wine Enthusiast, the sales and event side of the operation had literally zero influence on how I scored my wines. Did they ever talk to me about scores? Not one single time in nine and a half years. So there are, in some cases at least, dividing lines between those two sides of the business.
It is absolutely a winery’s prerogative to use whatever scores they like. My main point here is that who has control over the influence of scores has fundamentally changed, from the publications to wineries. By promoting certain reviews and ignoring others, *wineries* raise the voices and give credibility. That is a fundamental change.
While it is a winery’s prerogative to use whatever reviews they like, there is a downstream effect of using reviews that have no or little credibility. That effect is ultimately score inflation and reviews becoming increasingly less effective in selling wine. So ultimately, it’s penny wise and pound foolish.
In an ideal world, wineries would consider a) the publication b) the person’s experience with wine and c) the person’s expertise or lack thereof with a certain region. However, we don’t live in an ideal world, and that’s a lot to ask. Many wineries also lack objectivity about the quality of their wines, so when someone – anyone – has positive things to say, they run with it regardless of who that someone is or what their experience is.
With reviews that have little validity now being liberally promoted, it ultimately pushes the onus onto consumers to decide which reviews are meaningful and which reviews aren’t. I would guess that less than 1% of wine drinkers are likely to do that. Ultimately, it has built a house of cards which will fall.
Finally, as an old, out-of-shape white guy, I wouldn’t discriminate based on age or physical fitness. That said, I would love to see more diversity in reviewers and the wine industry more generally. Thanks for the thoughts!
Great article. What do you think should replace the 100 point system if this era is coming to an end?
Todd, it’s a great question. Long-time readers will remember that, for many years, I used a five-point scale that ultimately became a five-star scale, with half-star increments. Especially with the hegemony of that system due to its use by Amazon, rideshares, etc., that is the most likely candidate. I know a variety of apps already use that system for wine. That said, I expect it will be used more for crowd-sourced reviews than by individual reviewers.
For at least 15 years, I’ve tried to think about other systems and possibilities. Many have tried other things. None of those things have been successful and they won’t be unless wineries, distributors, and retailers adopt them and promote them. Whatever it might be, it will take a long time to wean the industry off of the 100-point system, even long after its impacts have ceased.
Sean,
Regarding the prevalence of the 100-point rating, the problem is that the curve has not been adjusted.
I’m one of those people who knows that the 100-point rating scale was a great innovation, that it works, and that consumers benefit from this system. So it can still work just fine.
I think we have to consider though that wines have gotten very, very good over the past two decades. Technology and knowledge have increased. Proper matching of grape to terroir has increased. More really great wines are being produced. BUT, the ratings curve simply hasn’t been adjusted, so that today we have tremendous numbers of 98, 99, and 100 point scores.
What’s necessary is for reviewers to be brave enough to adjust their expectations for wines. There needs to be a new expectation among critics: If the wine is very, very good, then it really only deserves an 85. To get a 90+ score it must be magnificent. It’s worth noting that under this scenario, the 85-point wine of today would have received a 90 or 92 point score in 2000.
Equally important, it’s time for reviewers to start focusing attention on wines and wineries that simply fail to make very good wine. They need to be called out.
Tom,
I do believe that the increase in wine quality due to a wide variety of factors has increased scores. That said, the biggest issue I see is that many reviewers (most?) lack transparency or standardization in the ways that they review wines. There is also no standard for experience.
Some wines are reviewed while being feted by the winemaker (those wines always score higher and often score higher than they should), some wines are reviewed in cattle call tastings with 120 other wines (those wines cover a range of scores but are almost always lower than those tasted with the winemaker), and some are tasted at home, either with sets from the same winery or perhaps other wines.
Those are all fundamentally different things and lead to different scores. That is to say, I could take the exact same wine and put it in those three different settings and it would receive radically different scores from the exact same reviewer. How do I know this? Because I’ve done the experiment! There’s a reason that I am rigorous in the procedures I use to taste wines, where every wine is tasted the exact same way.
Unfortunately, most reviewers are opaque about the setting in which they review a wine. Even when I was tasting for Washington Wine Report, the original name of this site, you could look in my review database and see: a) did I purchase the wine or did I receive it as a sample, and b) did I taste the wine at a winery or at home. Why? Because I thought those things might matter. I still believe that, but you’d never know the answer looking at most sites.
On your last point, I am firmly of the belief that wine reviews should serve the function of letting wineries (and consumers) know when a winery has hit the mark or missed it. The latter is particularly important in an industry where everyone thinks their wine is great.
Unfortunately, most reviewers today shy away from giving scores that might be deemed critical, as they fear that the submissions would stop coming. As a result, wineries miss out on an important piece of information, and consumers do too. All we hear is the praise, which is often greater than it should be.
Thanks for the thoughts!