tellcall's profile

4 Messages

 • 

160 Points

Saturday, May 11th, 2019 1:31 AM

2

Adjusting movie ratings to give users real feedback


I'm pretty sure this comes up quite often. But I did go through ALL of the ratings FAQ, and it all sums up to this: some movies' ratings (the recent, high-budget ones) are systematically faked, and it shows.


Let's take a practical example: the last Avengers movie. I decided to watch it because its rating on the site was 9/10. I watched it, and couldn't believe this thing had such high value in the eyes of so many.

So I went and did some checking and maths on the ratings and comments. I won't claim I went through all of them, there were too many, but I did read a lot, and even combed through them with the filters for relevance, time posted, and stars awarded, to make sure I didn't miss anything of importance.

Without bothering you, dear reader, with boring numbers, I'm just going to say this: when I look at the 9/10 and 10/10 ratings given before May 1st, they are all explained with generic superlatives. None of them actually speaks about the movie; they just say how beautiful and life-changing an experience it was to watch. And most of those accounts had commented on one to three movies in total, and were created within the last five years. After that date, these kinds of ratings start answering previous lower-rated opinions, but none of them actually gives an opinion on what happens in the movie.


When looking at the comments rated 5 to 7, they all say how the movie was flawed, generally citing specific issues, separating what was good from what was bad, and pretty much all agree that it was disappointing: they were expecting better. Many of these accounts actually have a few reviews written for other movies, and are of various "server ages".

When looking at the comments rated 1 to 4, they generally say either that they hated it, or that it would deserve a better rating but they are trying to bring down an overall average that is too high and obviously inflated. I didn't look much into these profiles, but I'm among the former, having rated this movie 3 stars because it was really disappointing and badly executed, in my opinion.

Now, add to this a semantic analysis of the sentences, and the fact that some reviews get 2000+ "useful" votes in less than two weeks (for comparison, mine got 5 out of 7 :D ), and it becomes obvious that the movie's average rating was boosted with semi-automatic tools.
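For the curious, here is a rough sketch (in Python) of the kind of triage I did by hand. Everything in it is hypothetical: IMDb exposes no such API, so assume a scraped dataset with these made-up fields.

```python
from datetime import date
from statistics import mean

# Hypothetical scraped review records (field names are invented).
reviews = [
    {"stars": 10, "posted": date(2019, 4, 27),
     "account_created": date(2016, 3, 1), "prior_reviews": 1},
    {"stars": 6, "posted": date(2019, 5, 2),
     "account_created": date(2008, 7, 9), "prior_reviews": 42},
    {"stars": 3, "posted": date(2019, 5, 5),
     "account_created": date(2012, 1, 15), "prior_reviews": 7},
    # ... a real pass would use thousands of rows
]

def band(stars):
    # Roughly the three bands discussed above (8s lumped with the middle).
    return "9-10" if stars >= 9 else "5-8" if stars >= 5 else "1-4"

buckets = {}
for r in reviews:
    buckets.setdefault(band(r["stars"]), []).append(r)

for name, rows in sorted(buckets.items()):
    ages = [(r["posted"] - r["account_created"]).days / 365 for r in rows]
    print(f"{name}: n={len(rows)}, "
          f"mean account age={mean(ages):.1f}y, "
          f"mean prior reviews={mean(r['prior_reviews'] for r in rows):.1f}")
```

If the pattern I describe is real, the 9-10 bucket should show younger accounts and fewer prior reviews than the middle bucket.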


Here is how I see it: in year X, people or bots create accounts that will be used at some point a few years later. The accounts stay almost dormant, with little to no activity. Then, when a movie comes out in year X+3 to X+5, somebody pays for those people to return to said accounts (possibly through a bot) and post generic comments that could apply to pretty much any movie. At the same time, the accounts vote each other's reviews as useful, all to lend credibility to the evaluations.

For the duration of the contract, whenever the overall rating drops below a given value, new comments are randomly generated to re-establish the "proper" value.
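To make that concrete, here is a toy simulation of such a maintenance loop. All the numbers are invented, purely to illustrate the mechanism.

```python
import random

# Toy model of the alleged scheme: a launch-day wave of top marks, then
# injected 10s whenever organic votes drag the average under a floor.
random.seed(1)
votes = [10] * 5000          # launch-day wave of top marks
FLOOR, BATCH = 8.5, 200      # contractual floor, size of each top-up

for day in range(1, 91):
    votes += [random.randint(3, 8) for _ in range(500)]  # organic votes
    while sum(votes) / len(votes) < FLOOR:
        votes += [10] * BATCH                            # "maintenance" top-up
    if day % 30 == 0:
        print(f"day {day}: avg {sum(votes)/len(votes):.2f}, {len(votes)} votes")
```

The average never dips below the floor, no matter how the organic votes trend; that is exactly the flat, too-stable curve I suspect.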

I took the example of a recent movie, but this can easily be seen by running a statistical check on older blockbusters: systematically, there is a huge wave of "good" ratings the moment the movie hits theaters, and then, a few months later, a systematic drop of several points, driven by the successive ratings that real spectators leave.

The problem in all this: ratings are no longer trustworthy. Personally, I can't say whether I would have watched this movie if its rating had been 6 instead of 10. But it's possible that I would have chosen a different one, and waited for this one to reach VOD, for example.

It would be easy to create a "trusted group" system, where users are actually selected for their objectivity, and build a "parallel" rating system: the first, open to all, as it is today; the second, restricted to people who have actually identified themselves, filled in a small questionnaire, and are themselves rated on their comments, so as to make sure they aren't fakes. Or something else along those lines. In any case, something should be done to make the rating service useful once again.

Sorry for the long text.



8.5K Messages

 • 

176.2K Points

6 years ago

tellcall
Joined community on May 11, 2019
Let's take a practical example: the last Avengers movie
- - -

See:

https://getsatisfaction.com/imdb/topics/top-250-movies-waiting-period-for-new-movies
TOP 250 movies waiting period for new movies
by Aaron
Posted 2 weeks ago
- - -

https://getsatisfaction.com/imdb/topics/top-250-movies-waiting-period-for-new-movies?topic-reply-lis...

https://www.imdb.com/chart/top
Tuesday, April 30, 2019
Top 25 of the Top 250 as rated by IMDb Users

Rank & Title - IMDb Rating
7 . Avengers: Endgame (2019)  8.9

Tuesday, May 7, 2019
12. Avengers: Endgame (2019)  8.8
- - -

Friday, May 10, 2019
13. Avengers: Endgame (2019)  8.7

https://www.imdb.com/title/tt4154796/reviews - 6,504 Reviews

https://www.imdb.com/title/tt4154796/ratings
360,634 IMDb users have given a weighted average vote of 8.9 / 10

.

10.7K Messages

 • 

225.4K Points

Relevance?

8.5K Messages

 • 

176.2K Points

Jeorj Euler: Relevance?

? ?

https://getsatisfaction.com/imdb/topics/top-250-movies-waiting-period-for-new-movies
TOP 250 movies waiting period for new movies
I was hoping not to see Avengers: Endgame on the top 250 the day after it was released, but sure enough it's number six on the list right now.

Anyone who pays attention to the top 250 list will know that this movie will go down and eventually be nowhere near the top ten on the list.

What I would propose to stop new movies from continuing to disrupt the top 250...

by Aaron
Posted 2 weeks ago
- - -

3 Months is not enough.
I'd like to propose a variation on your idea...
The Top 250 list gets new additions once a title reaches 1 year old.
by Ed Jones (XLIX)

 .

10.7K Messages

 • 

225.4K Points

Okay. Maybe.

10.7K Messages

 • 

225.4K Points

6 years ago

This is more of an "idea" style topic than a "problem" style topic.

4 Messages

 • 

160 Points

6 years ago


dear Ed and Jeorj,


"If he  had really read all the replies on this topic, he would have not brought this up at all."


Ed, I like your way of casting doubt. In fact, it's exactly the kind of reasoning that makes my point valid:


If the


"formula which they will not reveal, that takes the very points you make into account when determining the "Weighted Average" that is applied to all titles.
It is to combat all the points you make."


was really effective,

then

we should see a different result.


The result we actually see is proof that said formula just doesn't work as intended.

Quod Erat Demonstrandum.

See? The power of maths! ;)
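For context: IMDb has never revealed the weighting it applies on title pages; the only formula it ever published is the Bayesian average historically used for the Top 250 chart. A quick sketch in Python (m and C below are values from older published versions of that chart, so treat them as approximations):

```python
def weighted_rating(R, v, m=25000, C=6.9):
    # Bayesian shrinkage: with few votes the score is pulled toward the
    # site-wide mean C; with many votes it converges to the raw mean R.
    # R = the title's raw mean rating, v = its vote count,
    # m = votes needed for a stable estimate, C = site-wide mean rating.
    return (v / (v + m)) * R + (m / (v + m)) * C

print(weighted_rating(R=8.9, v=360_634))  # ~8.77: 360k votes barely move it
print(weighted_rating(R=8.9, v=5_000))    # ~7.23: a small sample is heavily shrunk
```

Which is precisely my point: once an attacker controls hundreds of thousands of votes, shrinkage of this kind does nothing to stop them.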

Also, I didn't say I read all the replies on this topic; just that I read the whole FAQ on the IMDb website before writing this. And I wrote this because said FAQ insists, as you do, that said formula gives a proper result.

But it doesn't.

I took this movie as an example, but there are actually hundreds like it; it's becoming (or rather, it has become) a common situation, and people have got used to it, but it's just wrong.

What I'm actually pointing out is that some smart people out there have found a way to circumvent the security provided by this "hidden anti-tampering formula".

And that a deep analysis of ratings & reviews shows it plainly.

As of the "idea" thing, no, it wasn't one. I was just trying to show that there are a lot of better solution than simply denying an issue hiding behind a "we got a hidden thing that does just that, you don't see it, but trust us, it works".

Well, no: given the evidence at hand, that trust is forfeit. And if the people at IMDb want it back, they ought to do something that produces a better result in the end.

That was the core of the message.

I was hoping to reach some responsible people at IMDb, but couldn't find a mailing address, so I just posted it here, since they say they do read this board. Then again, they also say the thing works correctly, so who knows.

Thanks for reading, btw.



Champion

 • 

5K Messages

 • 

118.3K Points

6 years ago

tellcall - 
I'm not sure how your suggestion of a "trusted group system" differs from the extant Top 1000 voters.
We could probably use a bigger group than 1000; by now IMDb has so many users that perhaps the top 100,000 raters should form a group.

The Top 1000 voters' score is visible on the ratings breakdown page, accessed by clicking the total number of raters on the title page. Starting there, you get this:
https://www.imdb.com/title/tt4154796/ratings?ref_=tt_ov_rt


And at the bottom left of that page is the Top 1000 voters' score.


Click on the total voters, and you get the breakdown of their votes:
https://www.imdb.com/title/tt4154796/ratings?demo=top_1000_voters



8.5K Messages

 • 

176.2K Points

bderoes, Champion
We could probably use a bigger group than 1000;
by now IMDb has so many users,
perhaps the top 100,000 raters should be a group. 
- - -

102,830,000 Users now
some are gone, some do nothing here
about 40,000 New Users each day

jominju
IMDb member since May 11 2019
https://www.imdb.com/user/ur102830000

- - -

IMDb could add how many titles are rated (and by how many users):

https://www.imdb.com/pressroom/stats/
Titles: 5,980,614

User reviews: 4,211,785

.

8.5K Messages

 • 

176.2K Points



darkshyne
IMDb member since May 2005
https://www.imdb.com/user/ur5447903/
https://www.imdb.com/user/ur5447903/reviews -  1 Review
https://www.imdb.com/user/ur5447903/ratings - 17,625 titles

Horst_In_Translation
IMDb member since September 2004
https://www.imdb.com/user/ur3914439/
https://www.imdb.com/user/ur3914439/reviews - 9,325 Reviews
https://www.imdb.com/user/ur3914439/ratings - 22,099 titles 

MartinHafer
IMDb member since June 2003
https://www.imdb.com/user/ur2467618/
https://www.imdb.com/user/ur2467618/reviews - 22,939 Reviews
https://www.imdb.com/user/ur2467618/ratings - 22,655 titles

rlw-43180
IMDb member since December 2017
https://www.imdb.com/user/ur82873564/
https://www.imdb.com/user/ur82873564/reviews - 0 Reviews
https://www.imdb.com/user/ur82873564/ratings - 23,584 titles

Keester
IMDb member since April 2009
https://www.imdb.com/user/ur21010397/
https://www.imdb.com/user/ur21010397/reviews - 3 Reviews
https://www.imdb.com/user/ur21010397/ratings - 24,235 titles

cutegirlbutts
IMDb member since November 2012
https://www.imdb.com/user/ur37715747/
https://www.imdb.com/user/ur37715747/reviews - 8 Reviews
https://www.imdb.com/user/ur37715747/ratings - 27,850 titles

Bronco46
IMDb member since May 1999
https://www.imdb.com/user/ur0105327/
https://www.imdb.com/user/ur0105327/reviews - 212 Reviews
https://www.imdb.com/user/ur0105327/ratings - 28,085 titles

sarge-19
IMDb member since October 2000
https://www.imdb.com/user/ur0204988/
https://www.imdb.com/user/ur0204988/reviews - 0 Reviews
https://www.imdb.com/user/ur0204988/ratings - 28,205 titles

a-l-e-s-s-a-n-d-r-o93
IMDb member since July 2012
https://www.imdb.com/user/ur35354941/
https://www.imdb.com/user/ur35354941/reviews - 1 Review
https://www.imdb.com/user/ur35354941/ratings - 28,351 titles

boblipton
IMDb member since Feb 2002
https://www.imdb.com/user/ur1617546/
https://www.imdb.com/user/ur1617546/reviews - 5,970 Reviews
https://www.imdb.com/user/ur1617546/ratings - 32,998 titles


Col Needham, Founder and CEO of IMDb
IMDb member since Oct 1990
https://www.imdb.com/user/ur1000000/
https://www.imdb.com/user/ur1000000/reviews - 2 Reviews
https://www.imdb.com/user/ur1000000/ratings - 11,726 titles
.

4 Messages

 • 

160 Points


Hey Bderoes,

I'm not sure how your suggestion of a "trusted group system" differs from the extant Top 1000 voters.

Honestly, I did not give much structure to that thought; it was only for illustrative purposes.

But ok, let's give it a try.

Say that they find a group of people, statistically relevant (so of various origins, lifestyles, etc., roughly balanced to correspond to what we have IRL), who volunteer to be verified as such (which would include agreeing to disclose some personal information to the website, just enough to know that they are who they claim to be). All having in common that they watch movies every now and then.

Then these people could form a "core" group whose reviews (when they do go to see a movie) are set apart and used for comparison with the "mass" reviews posted today. Whenever the two diverge, it would be safe to assume that the "core" reviews are more accurate.
Then it would be easy to show, on the movie page, beside the score we see today, the "core group" score, and let everyone form their own opinion based on that.

Said group ought to be big and dynamic. Big, because of course we can't expect everyone to see every movie. Dynamic, because people who are very active today may not be tomorrow, and also because it is important to have some "weighting" based on personal tastes (which would be part of said personal information). It could also be reserved for big-budget movies only, as those are the ones with the most distorted ratings.

This group could also be formed passively, just by analysing existing reviews: in a nutshell, a person who has written several appropriate reviews and whose ratings are in sync with the average movie ratings most of the time could be added to the core group for the kinds of movies they usually rate. But that would be less efficient.

In short, we can't say who is right or who is wrong, but when a difference appears, we would have a "guaranteed real people with no personal interest in this rated it this much" option.
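To illustrate, here is a minimal sketch of what that dual score could look like, with made-up vote lists for both groups (the 1.0-point flag threshold is just an arbitrary example):

```python
from statistics import mean

def dual_score(mass_votes, core_votes, flag_gap=1.0):
    # Show both averages side by side and flag suspicious divergence.
    mass, core = mean(mass_votes), mean(core_votes)
    return {"mass": round(mass, 1), "core": round(core, 1),
            "flagged": abs(mass - core) >= flag_gap}

# Invented votes: the open rating vs the verified "core" group's rating.
print(dual_score(mass_votes=[10, 10, 9, 10, 3, 6], core_votes=[6, 7, 5, 6]))
# -> {'mass': 8.0, 'core': 6.0, 'flagged': True}
```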

10.7K Messages

 • 

225.4K Points

6 years ago

Hi, tellcall. You've not actually proved, let alone quantified, the extent of the supposed ballot stuffing.