3 Messages • 90 Points

Wednesday, March 18th, 2026

Possible coordinated and policy-violating reviews on Desert Warrior

Hi,

I would like to report a concerning pattern of user reviews on Desert Warrior:

https://m.imdb.com/title/tt13570066

The film has not yet been released; however, a significant number of 1-star reviews have already been submitted. Many of these reviews appear to be based on assumptions rather than actual viewing.

In addition, several patterns raise concerns about policy violations:

  • Multiple reviews appear to be duplicated or highly similar in structure and wording

  • A number of accounts seem newly created and used immediately for posting reviews

  • Several reviews include statements suggesting the users have not seen the film

More importantly, a number of reviews contain language that appears to target casting choices based on ethnicity or race, for example:

  • questioning why an “African actor” is cast

  • statements rejecting actors based on skin color or origin

  • repeated claims that certain ethnicities should not represent specific roles

This type of content appears to go beyond normal critique and may fall under IMDb’s guidelines regarding inappropriate or discriminatory content.

Given the combination of:

  • pre-release reviewing

  • repeated / coordinated patterns

  • and potentially guideline-violating language

Could this title please be reviewed for:

  • inauthentic or coordinated activity

  • possible review removal where guidelines are violated

  • and rating stabilization prior to release

Thank you for your time and attention.


Employee • 7.5K Messages • 78.5K Points

1 month ago

Hi ahmetyavuz-

Thank you for bringing this to our attention.

We are aware that some people vote for the sole purpose of inflating or deflating a movie's rating. We have several safeguards in place to automatically detect and defeat this type of ballot stuffing: although we count and display all unaltered votes in the rating breakdown, we apply countermeasures against attempts to skew the weighted rating you see displayed on the site. We will review the title in question to ensure that the weighted average is displaying as intended.

For more information on our abuse detection methods, or details about IMDb ratings and weighted averages, please refer to our Ratings FAQ.

Cheers!


Hi Maya,

Thank you for the response, really appreciate you taking a look.

I just wanted to add a bit more specific context that may help with the review:

  • A number of reviews appear to be duplicated almost word-for-word (including repeated paragraphs and identical phrasing across different accounts)

  • Several accounts posted reviews immediately after creation within the same short time window

  • Some reviews explicitly indicate the users have not seen the film, yet are rating it 1/10

  • There are also instances of the same review text being posted multiple times by the same user

Additionally, some reviews include language targeting casting choices based on ethnicity (e.g. rejecting actors based on origin or skin tone), which may fall outside standard review guidelines.

Given these more specific patterns, it may be worth a closer manual review of:

  • duplicate reviews

  • newly created account activity clusters

  • and reviews that do not reflect actual viewing

Happy to share specific examples if helpful.

Thanks again



Hi Maya,

Thank you for the action taken on this title in March. The review cleanup was appreciated, and I understand the effort that went into it.

However, I'm following up because the resolution appears incomplete, and the title's rating integrity has not been restored.

What happened:

After my report, the title was locked for approximately five weeks (mid-March through April 24). During that time, 179 user reviews were removed. When the title was reopened on release day, the written reviews were gone, but all the star ratings from the same period were restored unchanged, including those from accounts whose reviews were deleted for policy violations.

The title reopened on April 24 with a 1.3/10 rating based on approximately 4,700 pre-release votes. It now sits at 1.9/10 with over 5,200 votes. The underlying data tells a very clear story.

The rating histogram is not consistent with organic behavior:

- 73.1% of all votes are 1/10
- 13.8% are 2/10
- Combined, 3 through 9 stars account for only 4.1% of votes
- Then 10/10 jumps to 4.9%

No film, regardless of quality, produces a distribution where 87% of votes are 1 or 2 stars and only 4% fall between 3 and 9. This is a textbook manipulation pattern. The near-total absence of middle ratings is the clearest indicator.
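The concentration is simple to quantify from the reported shares (a quick tally, using the percentages exactly as listed above):

```python
# Tally of the reported histogram shares (percent of all votes).
# These are the figures quoted above, nothing more.
low = 73.1 + 13.8      # 1- and 2-star votes combined
middle = 4.1           # 3- through 9-star votes combined
top = 4.9              # 10-star votes
print(f"1-2 stars: {low:.1f}%  |  3-9 stars: {middle:.1f}%  |  10 stars: {top:.1f}%")
```

Nearly 87% of all votes sit in the bottom two buckets, with the middle of the scale almost empty.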

The geographic concentration is extreme:

74% of all votes originate from a single country, averaging exactly 1.2/10, while the film had limited or no theatrical release in that market during the voting period. This is the same pre-release voting pattern I flagged originally.

External ratings confirm the anomaly:

I'm not claiming the film is great. Every other platform reflects a mixed-to-below-average film:

- Rotten Tomatoes: 33% critics / 63% audience
- Metacritic: 41/100
- Letterboxd: 2.6/5

IMDb's own Metacritic page (https://www.imdb.com/title/tt13570066/criticreviews/) shows critic scores ranging from 20 to 70. Not a single professional critic rated it anywhere near 1/10. There are also 12 external reviews listed on IMDb's own external reviews page (https://www.imdb.com/title/tt13570066/externalreviews/). The consensus across all platforms and professional reviewers places this film in the 3-5/10 range.

A 1.9/10 on IMDb against a 63% audience score on Rotten Tomatoes and 2.6/5 on Letterboxd is a gap too large to explain by platform differences. It can only be explained by the pre-release vote manipulation that was already acknowledged.
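Normalizing the other platforms' scores to a common 10-point scale makes the gap concrete (a rough conversion; audience percentages and star scales are not strictly equivalent, so treat these as ballpark figures):

```python
# Rough normalization of the scores above to a 10-point scale.
# Conversions are approximate and for comparison only.
scores = {
    "Rotten Tomatoes audience": 63 / 10,  # 63% -> ~6.3/10
    "Metacritic": 41 / 10,                # 41/100 -> 4.1/10
    "Letterboxd": 2.6 * 2,                # 2.6/5 -> 5.2/10
    "IMDb (current)": 1.9,
}
for platform, score in scores.items():
    print(f"{platform}: {score:.1f}/10")
```

Every other platform lands in the 4-6 range; IMDb alone sits below 2.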

Post-release data from actual viewers supports this:

Since the title was reopened, new votes from markets where the film is actually playing in theaters are averaging around 6/10. These are people who bought tickets and watched the movie. Their votes are being mathematically overwhelmed by the 4,700 pre-release manipulation votes that were restored.
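A back-of-the-envelope check is consistent with this (assuming roughly 4,700 restored pre-release votes averaging about 1.3 and roughly 500 post-release votes averaging about 6, per the figures above; both inputs are approximations):

```python
# Blending ~500 post-release votes (~6/10) into ~4,700 restored
# pre-release votes (~1.3/10). Inputs are the approximate figures
# quoted above, not exact data.
pre_votes, pre_avg = 4700, 1.3
new_votes, new_avg = 500, 6.0

blended = (pre_votes * pre_avg + new_votes * new_avg) / (pre_votes + new_votes)
print(f"blended raw average: {blended:.2f}/10")
```

The blend comes out around 1.75/10, close to the 1.9 currently displayed: the restored pre-release votes dominate the average no matter how actual viewers rate the film.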

The inconsistency:

If the written reviews from these accounts were deemed manipulative enough to warrant deletion, the star ratings submitted alongside those same reviews should not have been restored. The reviews and ratings came from the same accounts, during the same coordinated campaign, as part of the same behavior. Removing one but keeping the other doesn't address the actual problem. The rating is the part that affects the title page, search rankings, and audience perception.

What I'm asking:

Could the team please review the star rating data with the same scrutiny that was applied to the written reviews? Specifically:

- Ratings from accounts whose reviews were previously removed
- The statistical anomaly in the histogram distribution
- The geographic concentration from a single market during the pre-release window

The current weighted average does not appear to be correcting for this level of coordinated activity. The film has real problems, and a fair rating would likely land in the 4-5/10 range. But 1.9/10 is not a reflection of audience opinion. It's a reflection of the campaign that was already identified and partially addressed.

Thank you for your time.