Here are the categories that I think should be ranked:
1) Packaging - An apk file, plus a readme file, a stock list, and a small jpg under 50 KB.
2) Instructions - Route and version, author's name and email, objective of the activity, and helpful hints; not all in upper case or centre justified.
3) Stock List - Only non-default stock listed, with file IDs or server links, and instructions regarding any coupling or brake adjustments necessary. Fewer downloads = higher rating, and no points for raw Activity Analysis listings.
4) Installation - Activity installs without any hitches and passes the Route-Riter/Route Control analysis tests.
5) Running - The activity runs flawlessly, with good pop-up instructions where necessary and accurate work orders or achievable station times.
So for each category above we would assign up to 5 grades or stars, by entering the activity's file ID and then picking a ranking in each category. Based on the total score, the server would assign icons against activities as follows:
5 green signals - Outstanding Activity (23 - 25 points)
4 greens - 1 red - Excellent Activity (20 - 22 points)
3 greens - 2 reds - Good Activity (16 - 19 points)
2 greens - 3 reds - Fair Activity (10 - 15 points)
1 green - 4 reds - Poor Activity (5 - 9 points)
5 red signals - Not worth the bother (0 - 4 points)
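Just to make the scoring concrete, here's a rough sketch of how the server could turn a total score into one of the signal ratings above (the function name and wording are just for illustration):

```python
# Illustrative sketch only: map an activity's total score (0-25)
# to the proposed signal icon and label.
def rate_activity(total_points):
    # (minimum points, rating label) for each band, highest first
    bands = [
        (23, "5 green signals - Outstanding Activity"),
        (20, "4 greens - 1 red - Excellent Activity"),
        (16, "3 greens - 2 reds - Good Activity"),
        (10, "2 greens - 3 reds - Fair Activity"),
        (5,  "1 green - 4 reds - Poor Activity"),
        (0,  "5 red signals - Not worth the bother"),
    ]
    for floor, label in bands:
        if total_points >= floor:
            return label
    raise ValueError("total_points must be between 0 and 25")

# e.g. rate_activity(21) gives "4 greens - 1 red - Excellent Activity"
```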
Users would be identified by their user ID so that each could vote only once and couldn't flood the rankings, something like the way eBay manages its feedback.
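The one-vote-per-user check could be as simple as keying votes on the (user ID, activity ID) pair; here's a rough sketch (all names hypothetical):

```python
# Illustrative sketch: one vote per user ID per activity,
# eBay-feedback style. Keyed on (user_id, activity_id).
votes = {}  # (user_id, activity_id) -> list of five category scores

def cast_vote(user_id, activity_id, scores):
    # Five categories, each scored 0-5 stars (assumption from the proposal).
    if len(scores) != 5 or not all(0 <= s <= 5 for s in scores):
        raise ValueError("need five category scores, each 0-5")
    key = (user_id, activity_id)
    if key in votes:
        return False  # already voted - can't flood the ranking
    votes[key] = scores
    return True
```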
This would remove the frustration that new MSTS users experience when they have to jig stock and activities to get them to run, and might encourage our activity authors to higher achievement. We could even have a gala awards night for the best!
Any opinions?
Bob