Recommendation rating

Post by Chao » Sat Jan 22, 2005 1:07 pm

A possible solution to people asking "what are the best X AMVs" would be to introduce a recommendation rating alongside the average rating, à la Rotten Tomatoes: an opinion rating of 7 or higher counts as a recommendation, as does 4 or 5 stars. This way you could see not only which AMVs are generally considered the best, but also which are considered good by a lot of people. It would help differentiate the medium-scored AMVs that some like and some love from the ones that everyone thinks are reasonable.

Post by Kalium » Sat Jan 22, 2005 3:08 pm

Well, there's the Top 10%, the Top Star Scale, and the Top Favorite Videos.

What doesn't that cover? Bear in mind that the Top 10% is filterable in multiple ways.

Post by Chao » Sat Jan 22, 2005 4:19 pm

The Top Favourite Videos is probably the closest to what I mean, but it still only compiles what most people think stands out. It's probably best explained with an example:

Video 1 and 2 have both been seen by 100 people.

Video 1 has quite a split fan base: a lot of highs and a lot of lows. It averages out with a high-medium score (say around 7) but makes 20 people's top favourites.

Video 2 gets high-medium scores from a lot of people. It makes only 2 people's top favourites, but almost no one gives it a low rating.

Neither would stand out on the top 10% or the star scale. The first is loved by 20% of the people who have seen it, but gets swamped by the popular videos on the top favourites; it would still come out a lot higher than Video 2 there. Video 2 comes out low on all the top lists, yet almost everyone thinks it's pretty good. A Rotten Tomatoes-inspired system would give it around 90%, because 90% or so of the people who have watched it would say it is pretty good. It measures everyone's opinion of a video, rather than just their top picks, and so is the best way to answer questions like "which Naruto vids would most people say are worth watching": you just filter the RT search for Naruto and find the vids liked by most people. Not especially amazing, but universal.
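
To make the arithmetic concrete, here is a minimal Python sketch of the two scoring methods side by side. The individual scores are invented to match the example above, and the cut-off of 7 is the one suggested in the first post:

# Hypothetical 1-10 opinion scores for the two videos in the example.
video_1 = [10]*20 + [9]*20 + [8]*10 + [7]*10 + [5]*10 + [4]*15 + [3]*15  # split fan base
video_2 = [8]*25 + [7]*65 + [5]*8 + [3]*2                                # solid, few lows

def average(scores):
    return sum(scores) / len(scores)

def rt_score(scores, cutoff=7):
    # Fraction of viewers whose opinion counts as a recommendation.
    return sum(1 for s in scores if s >= cutoff) / len(scores)

for name, scores in (("Video 1", video_1), ("Video 2", video_2)):
    print(name, round(average(scores), 2), format(rt_score(scores), ".0%"))
# prints: Video 1 6.85 60%
#         Video 2 7.01 90%

The averages are nearly identical, but the RT-style scores (60% vs 90%) separate the split-opinion video from the one almost everyone found worth watching.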

Post by Kalium » Sat Jan 22, 2005 4:36 pm

I can see the huge weaknesses already.

There are a lot of videos with only a few opinions, each rating them very highly. Result? A nasty signal-to-noise ratio.
Chao wrote:It measures everyone's opinion of a video, rather than just their top picks
Uh.... Last I checked, the top 10% works by averaging, so I'm not entirely sure what the heck you're talking about here.

Besides, last I knew, there was a 'sort by score' option in the search. For donators, I think.

Anyway, opinions tend on the high side (I could go into my thoughts on why, but I won't), so the good ones tend to balance out the bad ones.

In short, aside from increasing load on the database server (which, if you haven't noticed, has been running a little slow recently), I don't see huge benefits to this.

Post by Chao » Sat Jan 22, 2005 11:45 pm

Kalium wrote:Uh.... Last I checked, the top 10% works by averaging, so I'm not entirely sure what the heck you're talking about here.
Then you really haven't understood the system I'm talking about. It is completely different from averaging; the example I gave even explained this. Two completely different videos could get a similar result from averaging, one by having quite split opinions, the other by having most of its opinions around that medium level. This sort of score would be based on how many people consider a video good: not "great or amazing", as the top favourites shows, nor "considered great by a lot of people", as the top 10% does, but "everyone who's viewed it considered it worth seeing".

An average takes into account how highly everyone rates a video; this just takes each person's good-versus-bad opinion of it.

Opinions being on the high side isn't that important; just shift the cut-off for a good-versus-bad opinion. On RT the cut-off for games is higher than for films because most game reviews give higher scores anyway; it's an arbitrary cut-off.

It wouldn't be a massive increase in server load either (the major load on the server recently is bandwidth-based; check the query times on the super search and it's obvious the CPU load is minimal). It would just be another stat like the overall score, updated only when a new opinion is added, not something computed dynamically every time a query is run.
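
For what it's worth, a sketch of that bookkeeping (Python; the class and field names here are invented, and on the real site these would presumably be columns updated in the same transaction that saves the opinion):

class VideoStats:
    # Per-video counters kept alongside the existing overall score.
    def __init__(self, cutoff=7):
        self.cutoff = cutoff          # a score at or above this counts as "good"
        self.n_opinions = 0
        self.n_recommended = 0

    def add_opinion(self, score):
        # Runs once per new opinion -- nothing is recalculated at query time.
        self.n_opinions += 1
        if score >= self.cutoff:
            self.n_recommended += 1

    def rt_rating(self):
        return self.n_recommended / self.n_opinions if self.n_opinions else None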

And check Rotten Tomatoes to see the difference in the score type compared to the average (which they also mention per film). It's a completely different sorting type that gives a very different result, yet, IMHO, one that is very useful.

Post by Zarxrax » Sun Jan 23, 2005 1:10 pm

The org doesn't calculate scores by simple averaging anymore. I think it uses some sort of Bayesian averaging or something. Much more 'accurate' than a simple average.
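
For readers unfamiliar with the term: one common form of Bayesian averaging pulls a video's raw mean toward a site-wide prior until it has collected enough opinions, which is one way to handle the "few very high opinions" noise problem Kalium mentions above. The prior values below are illustrative; the org's actual formula and constants aren't stated in this thread:

def bayesian_average(scores, prior_mean=7.0, prior_weight=10):
    # Acts like prior_weight phantom opinions sitting at the site-wide mean.
    n = len(scores)
    return (prior_weight * prior_mean + sum(scores)) / (prior_weight + n)

print(bayesian_average([10, 10]))  # 7.5 -- two perfect scores don't yield a 10.0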

Post by Chao » Sun Jan 23, 2005 1:46 pm

Bayesian averaging is still averaging (as opposed to a plain mean; as a student I've had that distinction ingrained into me). But while it's accurate, the sort I'm proposing is completely different. As I said before, check the RT site (which a lot of well-known film critics have joined, so they obviously consider it a better system than just an average) and see for yourself.

The difference is that this takes into account how many people think it's good and how many think it's bad, so the spread of results is as important as their average.

Once you get into the range of about 8.5 down to about 5.5, there is a HUGE number of videos, and there is no good way to get useful information about opinions of them to decide which might be worth getting. For that range, knowing which videos scored "all reasonable" versus which are love-it-or-hate-it videos would be a very good way to sort through them. You'd see a video at 7.0 with a very high RT rating and know that, while it probably won't be great, you're almost certain to enjoy it; if it had an RT rating nearer 50%, you could decide whether to gamble your download time on one you might really like but equally might dislike.

Post by AbsoluteDestiny » Sun Jan 23, 2005 5:02 pm

Pfft, it's just numbers and opinions. No matter how many mathematical systems you put in place to judge the goodness of a video you are still going to get people saying "uhh no I think that sucks actually".

The best thing we have on the org at the moment for people who want to find good videos is the suggestion query: it gives you videos liked by people who rated the videos you've seen the same way you did, matching videos by taste rather than by the "this video is rated highly by many, therefore you will obviously like it" system.
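
As a rough illustration of that taste-matching idea (not the org's actual query, which isn't described here; the data, the 1-10 scale, and the similarity measure are all made up):

# Users who rated shared videos the way you did get more weight.
ratings = {
    "you":   {"vid_a": 9, "vid_b": 3},
    "alice": {"vid_a": 9, "vid_b": 2, "vid_c": 10},  # rates like you
    "bob":   {"vid_a": 2, "vid_b": 9, "vid_c": 4},   # rates unlike you
}

def similarity(a, b):
    # Closer ratings on shared videos -> similarity nearer 1 (1-10 scale).
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return 1 - sum(abs(a[v] - b[v]) for v in shared) / (9.0 * len(shared))

def suggest(user, video):
    weighted = [(similarity(ratings[user], r), r[video])
                for name, r in ratings.items()
                if name != user and video in r]
    total = sum(w for w, _ in weighted)
    return sum(w * s for w, s in weighted) / total if total else None

print(round(suggest("you", "vid_c"), 1))  # 8.6 -- pulled toward alice's 10, not bob's 4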

Post by AbsoluteDestiny » Sun Jan 23, 2005 5:05 pm

Oh, and also: Rotten Tomatoes and the org are vastly different. Movie critics review the films they have to review; people on the org leave opinions on videos they feel like leaving opinions on, and the majority of opinions are people saying how much they liked a certain video. You just don't get the same kind of balance on the org, and no amount of number crunching can remove that bias.

Understood

Post by DarkAlex » Sun Jan 23, 2005 5:05 pm

Ok, I'm pretty sure I understand what you're suggesting. Basically it would be a system that just used standard deviation to rate the risk level of a given AMV. So here's an example:

First we have AMV 1: 10 people have starred it, 5 gave it 1 star and 5 gave it 5 stars. The standard deviation for this AMV would be pretty high because the scores are so different. This is the risky AMV from your previous example.

Next is AMV 2: 10 people have starred it and they all gave it 4 stars. The standard deviation for this AMV is zero because all the scores are the same. This would be a solid AMV.
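
A quick check of those two cases in Python (statistics.pstdev is the population standard deviation):

from statistics import mean, pstdev

amv_1 = [1]*5 + [5]*5   # split: five 1-star and five 5-star ratings
amv_2 = [4]*10          # solid: ten 4-star ratings

for name, stars in (("AMV 1", amv_1), ("AMV 2", amv_2)):
    print(name, mean(stars), round(pstdev(stars), 2))
# AMV 1: mean 3, std dev 2.0  -> risky
# AMV 2: mean 4, std dev 0.0  -> solid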

If this suggestion is to be implemented, it would almost certainly have to use star averages, because otherwise there would, as has been mentioned before, be too poor a signal-to-noise ratio. It is my impression that more people give star ratings because they're easier. Thus you collect many more opinions over a less precise scale (10 points for opinions vs 5 stars) and get a more reliable prediction.
This concept is not theoretically weak, but with all the other user rating systems this site already has, I don't know how much this idea would add.
