Could you please revisit the review scoring system used in the JED? At the moment it is, not to put too fine a point on it, unfair: my core product scores "worse" than many of my less developed competitors, despite the level of work we put into it being significantly higher.
Let's take a look at how things are now.
http://extensions.joomla.org/extensions ... ns/booking
Jomres is listed on the JED under Vertical Markets - Booking. Currently it is placed 8th in the scores, which seems odd given the amount of work we put into it and the reviews it has received. Let's look at some of the competition:
Roombooker. 2nd.
Added 05/04/2010
Last updated 05/01/2011.
10 reviews.
http://extensions.joomla.org/extensions ... king/12082
Forum http://www.joomplace.com/forum/joomla-c ... ooker.html
Jongman. 5th.
Added 01/10/2010
Last updated 24/12/2010.
10 reviews.
http://extensions.joomla.org/extensions ... king/14221
Forum http://www.joomlant.org/forum/6-jongman ... ystem.html
Table boss. 7th.
Added 22/09/2007
Last updated 08/01/2011.
7 reviews.
http://extensions.joomla.org/extensions ... oking/3110
Forum http://tableboss.com/component/option,c ... 1/lang,en/
Jomres. 8th.
Added 05/03/2006
Last updated 30/01/2012
88 reviews.
http://extensions.joomla.org/extensions ... ooking/335
Forum http://www.jomres.net/forum
Jomres aside, none of them have been updated in the last year.
I've linked to these products' forum pages. From them you can see that there's almost no activity in them; at least two of them look dead. In comparison, our forum alone has had a dozen posts this morning, and that's not including the emails we've responded to via the ticket system.
Since these extensions were last updated, Jomres has seen 43 new versions. It's probably one of the most heavily developed Joomla components, with two people supporting and developing it full time. Indeed, if you look at the reviews you'll see that we pride ourselves on our speedy and accurate support.
Yet, somehow, Jomres is still showing 8th in the list. Why is that? I believe it's the number of reviews we've got that is actually holding us back from getting a better score. With 80-odd reviews, a handful of less-than-perfect ratings drags our average down, while a competitor with just seven or ten glowing reviews keeps a near-perfect score. We have more users, whereas these stagnant projects aren't gaining new users, so their scores aren't changing. It's their inactivity that's hurting Jomres' score and, more importantly, potential users' impressions of Jomres. In a community that agrees that software under active development is going to be of higher quality than code that's simply been ignored for the last few years, surely this is anathema? It's actually to my competitors' benefit that they don't update their products and don't encourage users to review them on the JED. If that's not a bit mad, I don't know what is.
Naturally, this is very frustrating to me, and I think it's jolly unfair.
How can we improve the scoring system on the JED?
Well, I've done some back-of-a-fag-packet calculations. My numbers might be slightly off, but if they are, it's not by much.
First, let's look at the existing data.
Review score values (sum and count of reviews)
Code: Select all
Roombooker Score 48, reviews 10.
Jongman Score 44, reviews 10.
Table boss Score 35, reviews 7.
Jomres Score 366, reviews 81*.
Months since last update
Code: Select all
Roombooker 0
Jongman 13
Table boss 12
Jomres 0
JED ratings (How JED currently scores extensions)
Code: Select all
Roombooker 4.8
Jongman 4.57
Table boss 4.36
Jomres 4.23
Simple averages score
Code: Select all
Table boss 5
Roombooker 4.8
Jomres 4.51
Jongman 4.4
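If it helps, here's that calculation as a minimal Python sketch. The variable names are mine, not the JED's, and the data is copied from the tables above. (Python rounds where I truncated, so the last digit may differ slightly.)
Code: Select all
# Simple average: total of all review scores / number of reviews.
extensions = {
    "Table boss": (35, 7),    # (score sum, review count)
    "Roombooker": (48, 10),
    "Jomres":     (366, 81),
    "Jongman":    (44, 10),
}

for name, (score_sum, review_count) in extensions.items():
    print(f"{name}: {score_sum / review_count:.2f}")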
Weighted averages, ((n1 * N1) + (n2 * N2) + ...) / (n1 + n2 + ...), where n is the number of reviews at each star rating N, are a minor improvement, but they don't take an extension's inactivity into account. Let's see the numbers (there's a sketch of the calculation after them):
Simple weighted averages
Code: Select all
Table boss (7x5)/7 = 35/7 = 5
Icebooking* ((22x5)+(1x3))/(22+1) = 113/23 = 4.91
Jomres ((63x5)+(9x4)+(1x1))/(63+9+1) = 352/73 = 4.82
Roombooker ((6x5)+(2x4))/(6+2) = 38/8 = 4.75
Jongman ((4x5)+(6x4))/(4+6) = 44/10 = 4.4
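Here's the same weighted average as a Python sketch, with each extension's reviews written as (count, stars) pairs exactly as in the sums above. Again, the names are my own invention, not anything from the JED's code:
Code: Select all
# Weighted average: sum(count * stars) / sum(count).
def weighted_average(ratings):
    """ratings: a list of (review_count, star_rating) pairs."""
    total = sum(count * stars for count, stars in ratings)
    reviews = sum(count for count, _ in ratings)
    return total / reviews

# Jomres: 63 five-star, 9 four-star and 1 one-star review.
print(round(weighted_average([(63, 5), (9, 4), (1, 1)]), 2))  # 4.82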
Now Table boss, which has the fewest reviews but is the second oldest after Jomres and is obviously ignored by its developer, is ranked the highest, which demonstrates that this formula isn't working. Is there anything we can do to improve this?
Yes, we can throw the number of months since the last update into the mix, subtracting it from the review score total before dividing by the number of reviews.
Weighted averages including last update
Code: Select all
Jomres ((63x5)+(9x4)+(1x1) - 0)/(63+9+1) = 352/73 = 4.82
Roombooker ((6x5)+(2x4) - 0)/(6+2) = 38/8 = 4.75
Icebooking* ((22x5)+(1x3) - 6)/(22+1) = 107/23 = 4.65
Table boss ((7x5) - 12)/7 = 23/7 = 3.28
Jongman ((4x5)+(6x4) - 13)/(4+6) = 31/10 = 3.1
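And the whole proposal in one Python sketch: knock the months since the last update off the score total before dividing. This is just an illustration of the idea, not a claim about how the JED would have to implement it:
Code: Select all
# Proposed score: (sum(count * stars) - months_since_update) / sum(count).
def proposed_score(ratings, months_since_update):
    total = sum(count * stars for count, stars in ratings)
    reviews = sum(count for count, _ in ratings)
    return (total - months_since_update) / reviews

print(round(proposed_score([(63, 5), (9, 4), (1, 1)], 0), 2))  # Jomres: 4.82
print(round(proposed_score([(4, 5), (6, 4)], 13), 2))          # Jongman: 3.1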
Sure, some people might think that massaging their changelog will be enough to keep their extension's rating up, but isn't that really the point: to get people updating their products?
Remember, I've based these calculations on the information that's currently available in the JED's db. There's no point in throwing other weighting factors into the mix, such as the frequency of updates, if we haven't captured that information. The JED might have done, but there's no evidence to say that it has, so we'll assume it's not available and therefore can't be used.
NB. Before you mention it, yes, I know that all of the extensions mentioned (except Icebooking) are J1.5 only and will be unpublished on the 1st of April, but that's not my point. My point is that the scoring system used actually penalises me for having more reviews.