tellcall
4 Messages
•
160 Points
Adjusting movie ratings to give users real feedback
I'm pretty sure this comes up quite often. But I did go through ALL of the ratings FAQ, and it all sums up to this: some movies' ratings (the recent, high-budget ones) are systematically faked, and it shows.
Let's take a practical example: the latest Avengers movie. I decided to watch it because its rating on the site was 9/10. I watched it, and couldn't believe this thing held such high value in the eyes of so many.
So I did some checking and maths on the ratings and comments. I won't claim I went through all of them, there were too many, but I did read a lot, and I even combed through them with the filters for relevance, time posted, and stars awarded, to make sure I didn't miss anything of importance.
Without bothering you, dear reader, with boring numbers, I'm just going to say this: when I look at the 9/10 and 10/10 reviews posted before May 1st, they are all justified with generic superlatives. None of them actually talks about the movie; they just say how beautiful and life-changing an experience it was to watch. Most of those accounts had commented on only one to three movies in total, and had been created within the last five years. After that date, this kind of rating starts answering the previous lower-rated opinions, but still none of them gives an actual opinion on what happens in the movie.
When looking at the 5-to-7-rated comments, they all say how the movie was flawed, generally citing specific issues and weighing what was good against what was bad, and pretty much all agree that it was disappointing: they were expecting better. Many of these accounts actually have a few reviews written for other movies, and are of various "account ages".
When looking at the 1-to-4-rated comments, they generally say either that they hated it, or that it would deserve a better rating but that they are trying to bring down an overall score that is too high and obviously inflated. I didn't look much into these profiles, but I'm among the former, having rated this movie 3* because it was, in my opinion, really disappointing and badly executed.
Now, add to this a semantic analysis of the sentences, and the fact that some reviews get 2,000+ "useful" votes in less than two weeks (for comparison, mine got 5 out of 7 :D ), and it becomes obvious that the movie's average rating was boosted with semi-automatic tools.
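To make "combed through them" a bit more concrete, here is a rough sketch, in Python, of the kind of check I did by hand. The field names are mine, since I obviously don't have IMDb's raw data, only what is visible on each review:

```python
from statistics import mean

# Each review is a dict built from what is visible on the site (field names are mine):
#   rating            stars awarded (1-10)
#   account_age_days  age of the account when the review was posted
#   reviews_total     how many reviews that account has written in total
#   useful_votes      the "X found this helpful" counter
#   days_online       days since the review was posted

def looks_suspicious(review):
    """Flag the pattern I kept seeing on the 9/10 and 10/10 reviews."""
    young_account = review["account_age_days"] < 5 * 365   # created in the last few years
    barely_active = review["reviews_total"] <= 3            # one to three reviews in total
    vote_burst = (review["days_online"] <= 14
                  and review["useful_votes"] >= 2000)       # 2000+ "useful" votes in < 2 weeks
    return (young_account and barely_active) or vote_burst

def summarize(reviews):
    """Split reviews into the three bands described above and count the flags."""
    bands = {"9-10": [], "5-7": [], "1-4": []}
    for r in reviews:
        key = "9-10" if r["rating"] >= 9 else "5-7" if r["rating"] >= 5 else "1-4"
        bands[key].append(r)
    for name, group in bands.items():
        if group:
            flagged = sum(looks_suspicious(r) for r in group)
            print(f"{name}: {len(group)} reviews, "
                  f"average {mean(r['rating'] for r in group):.1f}, "
                  f"{flagged} flagged")
```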
Here is how I see it: in year X, people or bots create accounts that will be used at some point a few years later. The accounts sit almost dormant, with little to no activity. Then, when a movie comes out in year X+(3-5), somebody pays for those accounts to come back to life (possibly through a bot) and post generic comments that could apply to pretty much any movie. At the same time, the accounts vote each other up in terms of usefulness, all to lend some credibility to the evaluation.
For the duration of the contract, whenever the overall rating drops below a given value, some new comments are randomly generated to reestablish the "proper" value.
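To put numbers on that last point: assuming a plain arithmetic mean (which is not what IMDb uses, their weighting is undisclosed, but it gives the order of magnitude), here is how many fresh 10/10 votes it takes to drag a score back up to a given value:

```python
import math

def tens_needed(current_mean, current_votes, target_mean):
    """Extra 10/10 votes needed to lift a plain arithmetic mean up to target_mean.

    Solves (current_mean * n + 10 * k) / (n + k) >= target_mean for k.
    This is NOT IMDb's formula (they don't reveal it); it only gives the scale.
    """
    if not current_mean < target_mean < 10:
        raise ValueError("target must lie between the current mean and 10")
    k = current_votes * (target_mean - current_mean) / (10 - target_mean)
    return math.ceil(k)

# Example with made-up but realistic numbers: 360,000 votes sitting at 8.7,
# and somebody wants the badge to read 8.9 again.
print(tens_needed(8.7, 360_000, 8.9))  # 65455
```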
I took a recent movie as an example, but this can easily be seen by running a statistical check on older blockbusters: systematically, there is a huge wave of "good" ratings the moment the movie hits theatres, and then, a few months later, a systematic drop of several points as the reviews from real spectators come in.
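That check is easy to reproduce if you keep dated snapshots of a title's rating (IMDb doesn't offer a rating-history endpoint, so you have to collect them yourself; the numbers below are purely illustrative):

```python
from datetime import date

def release_window_drop(snapshots, release, window_days=30):
    """Compare the average rating during the release window with everything after it.

    snapshots: list of (date, rating) pairs, e.g. collected weekly.
    release:   theatrical release date.
    """
    early = [r for d, r in snapshots if (d - release).days <= window_days]
    late = [r for d, r in snapshots if (d - release).days > window_days]
    if not early or not late:
        raise ValueError("need snapshots both inside and after the release window")
    early_avg = sum(early) / len(early)
    late_avg = sum(late) / len(late)
    return early_avg, late_avg, round(early_avg - late_avg, 2)

# Illustrative trajectory only (not real data): a burst of high scores at release,
# then a slow slide once regular viewers catch up with the movie.
history = [
    (date(2019, 4, 30), 8.9),
    (date(2019, 5, 7), 8.8),
    (date(2019, 5, 10), 8.7),
    (date(2019, 8, 1), 8.5),
    (date(2019, 12, 1), 8.4),
]
print(release_window_drop(history, release=date(2019, 4, 26)))
```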
The problem in all this: the ratings are no longer trustworthy. I can't say for sure whether I would have watched this movie if its rating had been 6 instead of 10. But it's possible that I would have chosen a different one and waited for this one to come out on VOD, for example.
It would be easy to create a "trusted group" system, where users are selected for their objectivity, and to run a "parallel" rating alongside the current one. The first would stay open to all, as it is today; the second would count only people who have identified themselves, filled in a small questionnaire, and are themselves rated on their comments, to make sure they aren't fakes. Or something else along those lines. In any case, something should be done so that the rating service becomes useful once again.
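A toy sketch of what I mean by a "parallel" score, just two numbers published side by side (the thresholds and field names are mine, only there to illustrate the idea):

```python
def parallel_scores(votes, min_reviews=10, min_account_age_days=2 * 365):
    """Compute the open score (everyone) and a 'trusted' score next to it.

    votes: list of dicts with 'rating', 'reviews_total', 'account_age_days' and
    'identified' (True if the user went through whatever verification is chosen).
    The thresholds are arbitrary; the point is only that both numbers are shown.
    """
    def avg(group):
        return round(sum(v["rating"] for v in group) / len(group), 1) if group else None

    trusted = [v for v in votes
               if v["identified"]
               and v["reviews_total"] >= min_reviews
               and v["account_age_days"] >= min_account_age_days]
    return {"open": avg(votes), "trusted": avg(trusted)}
```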
Sorry for the long text.
ACT_1
8.5K Messages
•
176.2K Points
6 years ago
tellcall
Joined community on May 11, 2019
Let's take a practical example: the latest Avengers movie
- - -
See:
https://getsatisfaction.com/imdb/topics/top-250-movies-waiting-period-for-new-movies
TOP 250 movies waiting period for new movies
by Aaron
Posted 2 weeks ago
- - -
https://getsatisfaction.com/imdb/topics/top-250-movies-waiting-period-for-new-movies?topic-reply-lis...
https://www.imdb.com/chart/top
Tuesday, April 30, 2019
Top 25 of the Top 250 as rated by IMDb Users
Rank & Title - IMDb Rating
7 . Avengers: Endgame (2019) 8.9
Tuesday, May 7, 2019
12. Avengers: Endgame (2019) 8.8
- - -
Friday, May 10, 2019
13. Avengers: Endgame (2019) 8.7
https://www.imdb.com/title/tt4154796/reviews - 6,504 Reviews
https://www.imdb.com/title/tt4154796/ratings
360,634 IMDb users have given a weighted average vote of 8.9 / 10
tellcall
4 Messages
•
160 Points
6 years ago
Dear Ed and Jeorj,
"If he had really read all the replies on this topic, he would have not brought this up at all."
Ed, I like the way you cast doubt. In fact, it is exactly the kind of doubt that makes my point valid:
If the
"formula which they will not reveal, that takes the very points you make into account when determining the "Weighted Average" that is applied to all titles.
It is to combat all the points you make."
were really effective, then we should see a different result.
The result we actually see is proof that said formula just doesn't work as intended.
Quod Erat Demonstrandum.
See? power of maths! ;)
Also, I didn't say I read all the replies on this topic; just that I read all of the FAQ on the IMDb website before writing this. And I wrote this because said FAQ insists, as you do, that said formula gives a proper result.
But it doesn't.
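For reference, the only weighting IMDb has ever published is the Bayesian one that used to sit under the Top 250 chart; the formula behind the headline "weighted average" on a title page is the one they keep secret, so take this purely as an illustration of the general idea (the m and C values are the ones IMDb quoted back then):

```python
def top250_weighted_rating(R, v, m=25_000, C=7.0):
    """Bayesian weighted rating of the form IMDb used to publish for its Top 250.

    R: the title's raw mean rating, v: its number of votes,
    m: minimum votes required to enter the chart, C: mean rating across all titles.
    """
    return (v / (v + m)) * R + (m / (v + m)) * C

# With a third of a million votes, the prior C hardly moves the needle any more:
print(round(top250_weighted_rating(R=8.9, v=360_634), 2))  # 8.78
```

Once the vote count is that large, shrinkage of this kind changes almost nothing, so whatever protection exists has to come from the per-vote weighting they refuse to describe.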
I took this movie as an example, but there are hundreds like it; it's becoming (or rather, it has become) a common situation, and people have got used to it, but it's just wrong.
What I'm actually pointing out is that some smart people out there have found a way to circumvent the protection offered by this "hidden anti-tampering formula".
And that a deep analysis of ratings & reviews shows it plainly.
As for the "idea" thing, no, it wasn't one. I was just trying to show that there are far better solutions than simply denying an issue while hiding behind a "we've got a hidden thing that does just that, you don't see it, but trust us, it works".
Well, no: given the evidence at hand, that trust is gone. And if the people at IMDb want it back, they ought to do something that produces a better result in the end.
That was the core of my message.
I was hoping to reach someone responsible at IMDb, but I couldn't find a contact address, so I just posted it here, since they say they do read this board. Then again, they also say the thing works correctly, so, who knows.
Thanks for reading, btw.
bderoes
Champion
•
5K Messages
•
118.3K Points
6 years ago
I'm not sure how your suggestion of a "trusted group system" differs from the extant Top 1000 voters.
We could probably use a bigger group than 1000; IMDb has so many users by now that perhaps the top 100,000 raters should form a group.
The Top 1000 score is visible on the ratings breakdown page, accessed by clicking the total number of raters on the title page. Starting from the title page, you get this:
https://www.imdb.com/title/tt4154796/ratings?ref_=tt_ov_rt
And at the bottom left of that page is the Top 1000 voters.
Click on the total voters, and you get the breakdown of their votes:
https://www.imdb.com/title/tt4154796/ratings?demo=top_1000_voters
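If you want to compare the two figures for several titles without clicking around, both URLs follow the same pattern for any title ID (this only builds the links; the pages themselves are regular HTML, not an API):

```python
def ratings_urls(title_id):
    """Build the overall and Top-1000-voters ratings URLs for an IMDb title ID."""
    base = f"https://www.imdb.com/title/{title_id}/ratings"
    return {"all_voters": base, "top_1000_voters": f"{base}?demo=top_1000_voters"}

print(ratings_urls("tt4154796"))
```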