/lit/ - Literature

>> No.19614198
File: 233 KB, 883x1299, sffgAnalysis.png

>>19614091
>>19614074
Just gave it a try, using two separate averaging methods, and the results were... interesting, to say the least.
The first method pushes the average towards 3 (the theoretical midpoint of a 1-5 scoring system) based on how many (or rather how few) reviews a book has; the second pushes it towards the Goodreads score (actually towards the average of the GR score and 3), more as a thought experiment than anything else.
The first method relies purely on /sffg/ scores and favours books with lots of reviews, while the second gives books with fewer reviews but a high GR score a higher spot.

Pic related shows the top results for both methods, Standard Weighted on top and Goodreads Weighted on the bottom.

In case anyone is wondering, the weighted-average system I used is based on https://steamdb.info/blog/steamdb-rating/ , as I really like the weighting used on that site.
The exact formulae used were
W Average =B2-(B2-3)*2^(-LOG(C2+1,3))
GR W.Average =B2-(B2-AVERAGE(3,I2))*2^(-LOG(C2+1,3))
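
For anyone who would rather read the spreadsheet formulas as code, here is a rough Python sketch of what those two columns compute. I'm assuming B2 is the raw /sffg/ average, C2 the review count and I2 the Goodreads average; the function names are just made up for the example.

import math

def weighted_avg(sffg_avg, num_reviews, pull_target=3.0):
    # Pull the raw average towards pull_target; the fewer reviews, the stronger the pull.
    # Same shape as the SteamDB-style formula B2-(B2-3)*2^(-LOG(C2+1,3)),
    # with log base 3 and 3 as the midpoint of a 1-5 scale.
    return sffg_avg - (sffg_avg - pull_target) * 2 ** (-math.log(num_reviews + 1, 3))

def gr_weighted_avg(sffg_avg, num_reviews, gr_avg):
    # Same pull, but towards the midpoint between 3 and the Goodreads score,
    # matching B2-(B2-AVERAGE(3,I2))*2^(-LOG(C2+1,3)).
    return weighted_avg(sffg_avg, num_reviews, pull_target=(3.0 + gr_avg) / 2)

# e.g. a book rated 4.5 by only 2 anons, 4.10 on Goodreads
print(weighted_avg(4.5, 2))           # 3.75, pulled hard towards 3
print(gr_weighted_avg(4.5, 2, 4.10))  # 4.025, pulled towards (3 + 4.10) / 2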

Would be interesting to do this with an expanded dataset
