Same Old Playbook: Is Rotten Tomatoes Trying to Shield Lightyear?

Rotten Tomatoes, once lauded as a pro-consumer movie review website that let fans and critics submit their own scores for movies and TV episodes, changed its tune when a succession of critically praised films received dismal audience reviews: Ghostbusters (2016), Captain Marvel, and The Last Jedi. To combat what it called “review bombing” or “trolling,” Rotten Tomatoes has “improved” its audience score system several times. But no matter how much money a studio invests in a film, some people will genuinely dislike it.

 

ThatParkPlace argues that it must be very tempting for Rotten Tomatoes, which depends on revenue from movie studios, to mislabel honest negative reviews as “trolling” and “review bombing”:

Surely everyone must love what the corporations and the mass media love, right?

Want some evidence of this happening?

Rotten Tomatoes Hides the Eternals During Poor Reception

Doctor Strange in the Multiverse of Madness Audience Scores Vary Wildly on Rotten Tomatoes and Metacritic

 

We’ve also covered Rotten Tomatoes tweaking a movie’s audience score before, with The Rise of Skywalker. ThatParkPlace continues:

 

The problem here is that Rotten Tomatoes does not exist in a vacuum. It faces competition. Other sites provide audience scores in almost the identical way, sometimes even in a superior way. Normally, the audience scores for the big three aggregators line up remarkably well: Rotten Tomatoes, Metacritic, and IMDb usually post numbers that match closely. But sometimes, especially when a big movie launches to poor reception, one of these sites is not like the others. Let’s take a look, for example, at the audience scores for Lightyear, a movie that has grossed less than half of what it was expected to make just two weeks ago:

 

That’s IMDb with an audience score of 5.3: in other words, very mediocre.

 

And here is Metacritic’s audience score, which comes in within 1.0 of IMDb’s.

 

Okay, so from those two sites we can see that sample size doesn’t seem to budge the needle. They have different systems for measuring audience scores: one uses 114 user reviews from select accounts, the other many thousands of reviews from random users. It all comes out to the same mediocre level. And Metacritic even gives us a breakdown of how polarizing the movie is: 58 negative reviews, 48 positive, and only 8 mixed.

So how does Rotten Tomatoes see it?

It’s funny how a film that is coming in as Pixar’s worst miss against original projections is just hunky-dory with audiences, eh? According to only one review aggregator out there, the public loves it more than the critics do. It’s just another example of the same old thing we’ve been tracking with Rotten Tomatoes for years now.

 

I’m convinced that Rotten Tomatoes is keeping its thumb on the scale for the bigger studios like Disney. The evidence is circumstantial, yes, but a pattern is emerging. What do you think?

 

