I think we all have our own way of evaluating how stable a module is. Sometimes we just know/trust the developer and have had good experiences with other code that they work on. Sometimes it's critical mass (if 100k sites are using it, it's got to be stable, right?).

Following from #2185511-9: Highlight User Contributions & What They Can Do Next in Issue Queue, I think we should start exploring ways of defining what that means and ensuring that we have an automated means of evaluating it. With that we can start rewarding modules that follow those best practices.

Modules that are well maintained should be featured on the d.o site (possibly on the home page, but there's an option to do this in lots of other places too). The DA knows what the high-profile pages are, and having your module featured in this list should be a big deal. It should drive users, but also clients & contributors.

I do think it would also be nice to have some sort of visual acknowledgement on the project page that this module meets a set of agreed upon rules about what it means to be a well supported module.

Partly this is a matter of determining if the project has critical mass.

So I'm going to throw out a few ideas. I hope some of these will be shot down and that folks will come up with others.

  1. No bugs in the issue queue open & active for more than a year. Bonus points may be given if feature requests are submitted & fixed.
  2. RTBC'd items wait no longer than 1 month before getting into the git repository.
  3. If there is activity on dev, see that a stable release is made every year (possibly based on commits in git / issues fixed since the last stable release)
  4. If there are open bugs, see that there is activity in git every 6 months
  5. Clean response from a Coder Review
  6. Only rank modules with over 500 sites using it.
  7. A solid documentation page (or more)
  8. Not having an alpha or beta release sit in the recommended release for more than 2 months
  9. Making sure that there is a proper recommended release
  10. Not having a security update (although perhaps this would pose a security concern, with people hiding security issues so that they don't lose status)
  11. Bonus points for supporting both official Core releases
  12. Providing a dev version of the D8 release (when in alpha).
  13. Number & activity of maintainers - let's not forget about the bus factor
  14. Bonus points for Under active development. Negative points for Seeking new maintainer or No further development. Lose all points if Abandoned or Obsolete.
  15. Bonus points for having an image on the project page
  16. Including SimpleTests or Unit Tests in the code where applicable
  17. Provide appropriately long and meaningful release notes with each new stable release identifying what changes have occurred. Issue queues should be referenced and there should be at least 1 line of description for 100 lines of code.
  18. Enable automated testing in the Project.
  19. Is there a README.txt & INSTALL.txt file? No other alternates will be credited; consistency matters.
  20. Is there a CHANGELOG.txt and does it change with every stable release?
  21. Is there some evidence of use of translation? Any t() strings?
  22. Modules should get points for being around for a while. Seniority matters too.
  23. Is there a teaser in the project description so that it is easier to understand the project's purpose?
  24. Is the project description at least 500 characters?
  25. Are there headings for: Overview, Features, Requirements, Screenshots, Known problems, Links to Documentation, Pledges, Credits, Recommended modules & Similar projects?
  26. Is the ratio of downloads to installed sites more than 5:1?
  27. Are there translations available?
  28. Are users who contribute patches recognized in the git commits?

I don't know how many of these are actually testable.

If we could test all of these things, we'd probably have a pretty small list of modules included.
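Several of the checks above could, at least in principle, be automated against data drupal.org already tracks. As a rough illustration only (the field names and the sample record below are made up for this sketch, not the real drupal.org data model), a few of the criteria expressed as boolean predicates:

```python
from datetime import datetime, timedelta

# Hypothetical project record; field names are illustrative assumptions,
# not drupal.org's actual schema.
project = {
    "downloads": 12000,
    "installs": 2000,
    "last_stable_release": datetime(2013, 1, 15),
    "oldest_open_bug": datetime(2012, 6, 1),
}

# Fixed "now" so the example is reproducible.
now = datetime(2014, 2, 24)

def stable_release_within_a_year(p):
    """Criterion 3: a stable release within the last 12 months."""
    return now - p["last_stable_release"] < timedelta(days=365)

def no_bug_open_over_a_year(p):
    """Criterion 1: no bug open & active for more than a year."""
    return now - p["oldest_open_bug"] < timedelta(days=365)

def healthy_download_ratio(p):
    """Criterion 26: downloads-to-installs ratio above 5:1."""
    return p["installs"] > 0 and p["downloads"] / p["installs"] > 5

print(stable_release_within_a_year(project))  # False: release is over a year old
print(no_bug_open_over_a_year(project))       # False: a bug has sat open since 2012
print(healthy_download_ratio(project))        # True: 12000/2000 = 6
```

Predicates like these are the easy part; the harder question, as the thread discusses, is which of the 28 items have reliable data behind them at all.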

But if the bar was set right and was transparent it could be a great motivator. Drupal shops might be willing to put in the extra effort to see that their logos were prominently displayed on the project page, and to give their staff extra time to ensure that the module they maintain is properly maintained.

Thanks @tim.plunkett for these links to existing best practice documents:

The ultimate indication of mastery is what we do. Our modules/themes need to highlight that, but there should be enough incentives so that people and organizations have the resources to do it well: https://groups.drupal.org/node/410503


Comments

ar-jan’s picture

I think something along these lines is a good idea, both to motivate the maintainers to keep/achieve a healthy module status, and to guide new users towards recommendable modules.

I believe there is a proposal somewhere about including project activity graphs to indicate project health, anyone have the link?

Some thoughts.
6. As a site-builder evaluating a module, number of installs is something I definitely look at. But I'm wondering how good those numbers are, especially in the lower ranges. E.g. BOA disables the update status module for performance reasons. I also remember reading about a module maintainer spoofing installation stats (though I suppose this is monitored). Still, this looks like a good, and one of the easiest, metrics. There could also be a mention for e.g. over 100k installs (or whatever number is a good indicator of top-usage).

7. Documentation: this would certainly require human judgment, since it's quality that matters? Also some modules are so simple they hardly need documentation. (I also like projects that link to a README.txt, because those docs are always up-to-date with the module version - at least in principle).

11. Or whether it supports the latest core version. E.g. Commerce is D7-only "because entity", but that is actually a positive sign.

mgifford’s picture

From #1182998: Provide a community metrics "dashboard" just bringing over this example from Mozilla:

http://eaves.ca/2011/04/07/developing-community-management-metrics-and-t...

From this (and Drupalized), some of the features or other dashboards to include:

  • a code-review dashboard that would show how long contributors have been waiting for code-review, and how long before their patches get pushed. This could be a powerful way to identify how to streamline processes and make the experience of participating in open source communities better for users.
  • a semantic analysis of issue queues. This could allow us to flag threads that have become unwieldy or where people are behaving inappropriately so that module owners can better moderate or problem solve them
  • a dashboard that, based on your past bugs and some basic attributes (e.g. skillsets) informs newbies and experienced contributors which outstanding bugs could most use their expertise
  • Ultimately I’d like to see at least 3 core dashboards – one for contributors, one for module owners and one for overall projects – emerge and, as well as user generated dashboards developed using project metrics data.
  • Access to all the data in the issue queue so the contributors, module owners, researchers and others can build their own dashboards – they know what they need better than we do
mgifford’s picture

Also, would be nice to be able to assess the module description. Great ideas in

http://growingventuresolutions.com/blog/module-owners-how-make-your-modu...
https://drupal.org/node/997024

mgifford’s picture

Here's a patch that adds some of the functionality described.

Scoring based on:
when the module was first released
if updated in the last year
if there is a long description
if there are headings
if there are docs
if there are screenshots
if the project status shows signs of warning
If there is a D7 or D8 branch as the default
If there are 3 or more maintainers
if the ratio of downloads to installs is more than 5:1
for having a lot of installs
if there are translations

No score is available if the module is obviously abandoned or has fewer than 500 sites using it.
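A minimal sketch of how signals like the ones listed above might be combined into a single number, expressed as a percentage of the maximum possible score. The weights, signal names, and cutoffs here are illustrative assumptions; the actual patch's scoring formula is not reproduced in this thread.

```python
# Illustrative weights only; the real patch's weights are not shown here.
WEIGHTS = {
    "released_over_a_year_ago": 10,   # seniority
    "updated_in_last_year": 15,
    "long_description": 10,
    "has_headings": 5,
    "has_docs": 10,
    "has_screenshots": 5,
    "no_status_warnings": 15,
    "d7_or_d8_default_branch": 10,
    "three_or_more_maintainers": 10,
    "download_ratio_over_5": 5,
    "has_translations": 5,
}
MAX_SCORE = sum(WEIGHTS.values())  # 100 with these example weights

def health_score(signals, installs, abandoned=False):
    """Return a 0-100 health percentage, or None when there is too
    little data to judge (the cutoffs described in the comment above)."""
    if abandoned or installs < 500:
        return None
    raw = sum(w for name, w in WEIGHTS.items() if signals.get(name))
    return round(100 * raw / MAX_SCORE)

example = {
    "updated_in_last_year": True,
    "has_docs": True,
    "no_status_warnings": True,
    "d7_or_d8_default_branch": True,
    "download_ratio_over_5": True,
}
print(health_score(example, installs=12000))  # 55
print(health_score(example, installs=120))    # None: below the 500-site cutoff
```

Reporting a percentage rather than a raw total also answers the "87 out of what?" question raised later in the thread.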

Still more to do, but I've not made enough progress re-using the existing data or pulling it out of the database.

EDIT: Oh ya, and it's installed http://search_api-drupal.redesign.devdrupal.org/project/ctools

mgifford’s picture

Status: Active » Needs review
klonos’s picture

This is a great start!

I'm not sure though whether the "Automated Scorecard" communicates what that number represents. Perhaps it could use a better name, plus an explanatory popup linked to from a "?" like launchpad's bug heat indicator. There are other minor things like whether it should be more prominent, or making it clear what the highest possible score could be ("Ah, this project has an Automated Scorecard of 87. Out of what?"), but all these are details that can be sorted out as we perfect this. The algorithm behind the calculated number should be public, and long-standing/unanswered open issues (especially bugs) should also be taken into account.

PS: How about if we call this something like the projects' "Health indicator" and it was a percentage?

klonos’s picture

...also, I would like to see the score assigned for the "long" description instead be calculated by some more intuitive means, because a long description does not necessarily mean a good-quality one. People being allowed to vote (5 stars - eBay feedback style) on "How useful was this project's description" would be great.

klonos’s picture

Title: Highlight Modules that follow Best Practices » Highlight projects that follow Best Practices.

...and on a related (UX-wise) matter, #1434450: Add the ability to follow projects like we follow issues should be a priority too because as project pages are updated people would need a way to be notified (so they can review and perhaps change their rating).

PS: this is not only about modules. Themes could use it too for example.

klonos’s picture

...and on the maintainers' side, they should have some sort of "What can I do to improve my project's rating?" checklist available that also breaks down their current score so to let them know which areas need their attention. This will also motivate them by providing a "reach-that-goal" game.

klonos’s picture

I just filed #2191947: Display project page screenshots in a lightbox(y) way. btw - thought you might find it interesting.

mgifford’s picture

Thanks @klonos - Yes, this is definitely just a start.

Health indicator. I like that better. My initial thoughts were that "automated" might make it clear that there are going to be errors. I like "scorecards" but I also like indicators.

I think it might be interesting to actually make this visible first to just contributors. Then make it visible to all users. Finally make it visible to everyone.

This is going to upset people. Nobody likes getting an unpleasant report card or for that matter seeing someone else get a higher grade than they deserve. Having a disclaimer and more details would be important.

Totally there should be a link for more information with all of the details about what that number actually represents. The more detail that is provided the better. It has to be clear that it's hard to compare and that this isn't going to be perfect.

I do like drupalmodules.com and the ranking options to give folks the opportunity to vote. It's been too long since I saw John Forsythe, who I do sometimes bump into here in Ottawa. Lots to be learned from that site (which he set up ages ago because he didn't think changes to drupal.org would happen fast enough for what he wanted).

Modules, themes, distributions. There would probably need to be slightly different assumptions in the calculations. I haven't done anything with that.

Totally like the idea about following projects. Especially if that could be done through drush somehow. I have these 40 modules installed - tell me what's going on in those modules.... Not sure how that would work.

Like the idea of being explicit about what the calculations are and helping to suggest ways that projects could work to improve their scores. People will game the system. To some extent that's going to be expected. However, if folks are exploiting it there should be ways to address that.

Lots of these stats probably could be delivered more effectively through some sort of lightbox approach.

ps. I generally edit documents when I have small revisions. Often with an "EDIT" prefix. There are lots of ways to approach this. Ultimately it would be nice if it was just more intuitively added on.

zengenuity’s picture

I like this idea, but I think you have to be careful about the formula you use. I don't disagree with the data points you listed above, but they need to be combined in smart ways. In particular, certain modules might inherently have less use or activity than others. (Specific payment processors for Commerce, for example.) They may never get to 500 installs and there may not be much activity or maintenance required on them, but they could still be totally stable, secure and well-written. I see that you said you might provide no scores for sites with less than 500 installs, but users might still interpret that as a negative depending on how it's shown.

Regardless of the scoring, I'm totally on-board for your encouragement of maintainers to release and maintain stable versions. I teach training classes to novice users and it's really confusing for them that they sometimes need to install the -dev version of a module to have something that works. (e.g. Commerce Donate) There's really no way for them to figure that out without me telling them just to download -dev or them banging their heads against the issue queue. I don't think users care as much about beta or unstable labels, but they expect that the "green one" works, and it's frustrating when it doesn't. I don't know how you automatically score that, though.

mgifford’s picture

@zengenuity - Absolutely. The numbers have to be meaningful and the formula I used was just a first hack at what data I could find. There has to be faith in the scoring, but also understanding about its limitations. There are a lot of modules in contrib that are being managed in many different ways.

You can score the number of open bugs tied to the latest stable release. If there have been X number of open issues that haven't been addressed in say 3 months... Or maybe if more folks are downloading the -dev version than the stable release?

We'll have to look at that and see what we can come up with. It also needs to be done in conjunction with the existing module maintainers.

mgifford’s picture

I love this example of how to send out a new Beta release in a popular module from the good folks in https://drupal.org/project/memcache

It's definitely the best example of how to put out a Beta of any project I've seen in a long while.

Dave Reid’s picture

If we're talking best practices, having a non-official (alpha/beta/etc) release in between minor versions of a module has always been kind of a no-no, in that it means you've probably changed too much and should have made a new branch (which is why those kinds of releases won't show up on the project page downloads).

mgifford’s picture

Thanks @Dave Reid! So if they'd made a release, but made sure it wasn't a Recommended release that would be better?

I can't say what they have changed but really like that leading up to the 1.1 release they have dates & the history:
7.x-1.1-beta3 ready for testing (Feb 17, 2014)
7.x-1.1-beta2 (Feb 14, 2014)
7.x-1.1-beta1 (Jan 12, 2014)

Most folks don't want to download a git branch to test a new module. You'll get wider adoption if there's a release folks can download.

tim.plunkett’s picture

Looking at the ideas in the OP:
1) What if it is a completely ridiculous feature request with no volunteers to work on it?
3) and 4) are only relevant to a module that actually needs to change. What if it is just done and working?
5) Coder review has many many known false positives. This isn't reasonable until that module is "perfect".
6) How is this a "best practice"? Completely beyond the control of a maintainer.
8) Why are you recommending an alpha or beta anyway?
9) What if you're still in beta and don't have one yet? Why would you be punished for that?
10) This is really not a problem. Having security updates is not a problem; it's having unfixed security bugs that is.
13) This is vague, not sure what you're implying
15) How is this relevant to a module like https://drupal.org/project/entitycache? Or even Views?
16) Only D8 has support for unit tests.

The rest of these are codified in https://drupal.org/node/7765 and its subpages.
If a module doesn't have a stable recommended release, or documentation, or automated testing, that is plainly evident from the project page itself. Nothing else needs to happen past that.

And I agree with @Dave Reid in #15, what memcache is doing is not a best practice.

mgifford’s picture

Issue summary: View changes

@tim.plunkett - thanks!

OP?

1) We could rank bugs & features differently. Features could be completely discounted.

3) & 4) Ultimately it comes down to the community and the judgements of the community. If there are a few patches sitting at RTBC though, then maybe it's time for a maintenance release. If there's activity on a dev branch but it's been over a year since the stable release.... I really don't think there's an automated way to check if a maintainer is telling folks to simply use the -dev version (however this does happen). It could simply be a matter of what % of downloads are the dev version vs the stable release. If it's weighted in favour of the dev builds then it's probably time for a new stable release, as there just aren't that many active developers and using dev code on a production site shouldn't be a common practice.

5) Again, lots of little factors weighted differently. I'm all for making Coder better. Maybe in D8 we'll have a version that we have real confidence in. Until then, it's good to know that a maintainer will have used Coder and cleaned up all of the stupid spaces and formatting nonsense. Hopefully they would have also caught the more serious issues. Still, fewer messages is pretty much always a good thing. It's just an indicator though.

6) In this case it's just a matter of ensuring that there is enough data to judge. Frankly it's also a matter of critical mass. If I know 10k sites are using a given module I have much more confidence in it than if there are only 10 sites. It's not a perfect test, but I am pretty sure that there is merit in it. Critical mass is really important in any community project. Without enough users & developers even a perfect module won't last for long. It's also pretty important to figure out some way to put things into categories. Any module with 100k sites using it is likely going to have more folks posting things in the issue queue than a module with only 100.

8. I've just seen way, way too many projects lately that have put out an alpha, beta or RC and then just let it sit there for over a year. One of the things I liked about memcache was that it was pretty clear that they were using those pre-releases as temporary releases. In D7 it's just become commonplace to use any release as the recommended release.

9. If there's a new beta release every two months, great. These are just ideas after all. A simple quick brainstorm. I would want to be able to trust in the longevity of a full stable 1.0 release over a beta7. This might just be perception, but betas are much more likely to change in order to adjust for new features. I do think that a full stable release should be given more points than a beta. Obviously, this is just an indication amongst many factors we'd want to consider.

10. Ya.. That's fair. We want to encourage folks to fix security bugs. So should we reward projects for issuing security releases? Probably best to just drop it.

13. If you're the only developer on a project and you get hired by a non-Drupal shop, get pregnant, win the lottery or get hit by a bus, the project will flail and probably have a long period where it shouldn't be trusted, as nobody knows the details about how it works. A project with 2-3 active maintainers is going to be more sustainable than a project with just 1.

15. The Views UI would be easy. There are definitely modules though where a photo wouldn't be easy. However, there are folks who upload photos that are frankly just fun https://drupal.org/project/magic - but ya.. Let's assume that if this gets in there it really isn't worth that much.

16. SimpleTests. But heck, if this only applies to D8, that's just fine. We want to encourage folks to have a D8 release with Unit Tests anyway, don't we?

This was just a brainstorm. Lots can be added/removed from this list.

mgifford’s picture

Issue summary: View changes
rooby’s picture

The point about the intermediate betas is that if the scope of the work requires a beta phase then you should be going to a new major version, so whether or not it is the recommended release is not really relevant.

1) I agree that feature requests should be handled differently.
3 & 4) I also agree with Tim's point. There are modules that never need to be updated, so modules shouldn't be penalised for this unless you are also taking into account the number of open bugs and such. I think the making-new-releases point is only really an issue if the dev version is not the same as the latest stable release. When you have a bunch of bug fixes in dev that never make it into a stable release is when you get problems.
5) I don't really like this idea because of the false positives and nit-pickiness. A module could be fantastic in every other way except for a few really minor coder review issues. If people open issues for coding standards fixes and they stay open for a long time then it should get picked up by other points on this list.
6) I agree this shouldn't be used to penalise. There are modules used by a lot of people that are in my opinion not very well written and don't follow these types of best practices, and there are excellent modules that are pretty much flawless, with great maintainers, that only have a small number of users because they provide less commonly used features or similar. I don't think this really relates at all to best practices.
10) I agree this should be dropped.
11) This could have grey areas sometimes, if for example a module is providing some feature that leverages underlying functionality only available in the newer of the two current releases.
12) This really depends on the status of the current drupal core dev. I would go as far as to say it is irresponsible for people to be wasting time on making a drupal 8 dev version too early in the development of core (depending on what their module provides).
13) Hard to accuse someone of not adhering to best practices because no one else is willing to help out with maintenance. It might be a different story if people are asking to help maintain and are being rejected. Small projects or projects with active maintainers don't need a bunch of people to help, in which case only bus factor is relevant, but it's not relevant to best practices.
14) Maybe expand on this. Which maintainer & development statuses are good or bad as far as best practices go, and why? I'm not sure it's so relevant in most cases.
15) Some modules don't really need an image so they shouldn't be penalised for it. Maybe gaining points for having one though?
17) I think this one is important, although hard to quantify further than just having notes or not, because you might only need a short note.

mgifford’s picture

Issue summary: View changes

CHANGELOG
1) I changed it to focus on bugs. Made feature fixes a possible bonus.

3) & 4) I put in conditional statements. If we look for open bugs we can see if there should be activity in git. If we see that there is development on a dev branch, then after a while we can see that there is a need for a new release.

5) Maybe a Coder Lite could be developed, but at this point I've struck it.

6) I've struck it for now. It occurred to me that it's really a matter of comparing modules with similar user bases.
10) Struck.

11) Just made it a bonus. And yes, this whole thing is grey and there are going to be lots of places where it doesn't apply 100% of the time. Life is rough but frankly nobody has enough time to know or compare all 20k modules on Drupal.org so we have to make some compromises.

12) I clarified this so that it applies when core is in Alpha. Should it be Beta? I don't really know. Maybe Alpha4. It could be based on some arbitrary time when Core developers just think that everything is stable enough for contrib maintainers to start mass-upgrading their modules. This doesn't have to be set in stone and it isn't going to be perfect. Mind you, it's also a dev release or a git branch, so it really isn't like it needs to be fully functioning.

13) This shouldn't be seen as something that is about a personal failing of the maintainer. This needs to be worded in a way that adds some fun & provides some elements of competition to encourage better practices. It's a piece of gamification. There are going to be winners & losers just as there are now. However, the criteria won't just be how many sites have it installed; projects can be rewarded and acknowledged for good work in a range of things. Dealing with co-maintainers is essentially about human resources. Our projects probably need some project management. It's not the fault of a kick-ass developer that they don't have a project manager to help. However, we can start to open up new roles and have more roles so that people can find new ways to contribute. This is about evaluating the project and not the great developer who started the project. It's as much about inspiring the 10k sites that depend on module X to step up to the plate and find a way to contribute something. Small projects that have fewer developers are also susceptible to the "bus factor". When evaluating projects I want to know that it's not all riding on one whiz kid who might disappear after writing the most amazing code on d.o.

14) I added some more specifics from https://drupal.org/node/1066982 - ideally everything is Actively Maintained & Under active development. I think that #1134450: Automatically degrade maintenance and development status of projects over time would also help: as much as it could benefit the community to add elements from games, we don't want to have folks trying to game the system.

15) Yup. Bonus points! Maybe they are actually scored differently so that there isn't just one number but 2 or 3.

17) I'm just making up numbers, but I think that it would be fair to ask for a 1-line description for every 100 lines of code. The more changes there are, the more should be in the release notes. Having a long patch with a one-line "misc bug fixes" note really isn't helpful for anyone.

mgifford’s picture

Issue summary: View changes
mgifford’s picture

Issue summary: View changes
Issue tags: +gamification
DamienMcKenna’s picture

Title: Highlight projects that follow Best Practices. » Highlight projects that follow Best Practices
Issue tags: -gamification

This isn't a competition.

Step 1: Improve the existing documentation on what best practices are.

Step 2: Promote it via webinars, blog posts, etc.

Step 3: Help maintainers achieve this collaboratively, mostly by submitting patches and volunteering to co-maintain.

mgifford’s picture

I don't claim to be an expert on gamification or any other form of social engineering. Certainly it's clear that there are people who are annoyed by that term and are reacting to it. You can find slightly more people who seem to think that "Gamification is Bad" than "Gamification is Good", but what is clear is that it isn't going away. It also seems to be working well for some, such as http://stackoverflow.com

I've got no problem with Steps 1, 2 & 3, but I think that's been seen as the be-all for approaching things in the Drupal community and frankly I don't think it's enough.

Over & over again I hear that module maintainers just don't have enough time. Steps 1, 2 & 3 are necessary for onboarding new people, but frankly they aren't enough. We need more incentives to bring more people into the community. As you know, I've suggested a lot of other ideas. Some will work, others won't.

You're absolutely right that this isn't a competition, but something has to change so that the Drupal Community can reverse the trend in contributions - https://drupal.org/metrics

Doing more of the same just doesn't seem like a good solution.

Danny Englander’s picture

There was a somewhat related conversation to this a year ago, linking for reference: Unplugging the On-Ramp

rooby’s picture

My 2 cents on the newer ones:

21. Not using t() when you should is definitely bad practice; however, not having any t() calls isn't necessarily bad, as you might not have any translatable strings.
22. The length the module has been around doesn't really have anything to do with best practices.
25. Not all of these are relevant to all but generally speaking it is nice to have the ones that are.
26. I don't think this is as useful as it seems. There are lots of reasons people might download without ever installing - e.g. trying a module out but then not choosing it because it is badly written or doesn't work. You might be choosing between similar modules but one fits your needs better than the other one. I also very frequently download tarballs just to open, not to save, just so I can read some code because I forgot a function name or for some other minor random reason, never intending to install it after that download.
27. Unfortunately a lot of module maintainers don't know other languages and can't really have much influence over translations.
28. I think this one is really good. It makes new contributors happy to see their work being recognised. At least I think so anyway.

DamienMcKenna’s picture

IIRC there's also a separate (aging) discussion on adding reviews to modules & themes.

mgifford’s picture

Issue summary: View changes

@rooby - striking some things out.
#25 - We've got to have some room for bonuses. Maybe we just give points if someone has half of them.
#26 - I'm not suggesting that this is a perfect ratio. Just that if people are successful in using a module they try out, they end up installing it, and that's a good thing. If there are a bunch to choose from, people will probably have tried a few of them; generally the one with the best ratio will be the easiest to install and gets people what they were looking for. There are lots of reasons that folks do all kinds of things and we can't even begin to guess why. However, if we've got 10,000 downloads vs 200 installs we can begin to wonder about the health of the module. A great description only gets you so far. I'm fine setting the ratio quite high though.
#27 - This is not about evaluating the maintainer. That has nothing to do with it. It has to do with the module & its maturity. If there are translations, it's just an indication of the adoption of the project around the world. That's a good thing.
#28 - Folks have complained that they aren't listed.... I think most probably don't know either way. Now with #2203341: Automating "thank you" messages on d.o for participation we'd have a way of ensuring that there was at least an alert.

@DamienMcKenna - There must have been. DrupalModules.com must have inspired that discussion long ago somewhere on d.o.

mgifford’s picture

Referenced in #2183843-3: Roadmap for a New Stable Release

EDIT: Also mentioned in Google+ Thread

Phil Ward - 24 Feb 2014

I think this is a great idea. Maybe even a user generated review of the module? (Could spell disaster too)

Prime example of this scenario is the fivestar module. I want to add a review system to my ecommerce sites and fivestar is the dominant module in this category, however it hasn't seen love in ages. Makes me wonder if there will be a port to D8 and also makes me wonder how many patches I'll have to go dig up just to use it in its current state... Needless to say, I'm still holding out.

Steve Turnbull - 24 Feb 2014+2

Field Complete. It's superb, and is an excellent example of coding standards and documentation.

Written by ... someone ... can't remember who ... oh ... wait ... (me).

Peter Sohal - Yesterday 22:36+1

Excellent idea. Hope to see some good debate on this and a nice resolution. 

mgifford’s picture

A form of this was discussed earlier in the Prairie Initiative - Projects quality initiative.

mgifford’s picture

From old project page redesign initiative:
https://undpaul.notableapp.com/posts/64bb90be1ebd4ed73304352a5b7577614fc...

"Screenshot is mandatory. Is filed as a bug if non-existent in old projects. Gallery possible."

There was no clarification from Thomas Moseler as to why.

rooby’s picture

Re #32:

Maybe if you don't have anything to take a screenshot of you can take one of your code :)

mgifford’s picture

If there's a UI, great. I do think that many modules would really benefit from a small one. Drupal.org is just so bloody text heavy.

Actually, what would be really cool would be photos of the team that work on the module. Get a bunch of folks who contribute together at a DrupalCon and show the people who make the code. That would be neat.

If not that then animated gifs... :)

EDIT: Adding related GDO discussions https://groups.drupal.org/node/142779 & https://groups.drupal.org/node/144604

mgifford’s picture

It occurs to me that we should probably start by making this type of report visible only to the module maintainers. It would still be feedback to the maintainers, which is important, but it's less likely to undermine intrinsic motivation.

More related discussions from GDO:
https://groups.drupal.org/node/314018
https://groups.drupal.org/node/312828

joachim’s picture

> If there is activity on dev, see that a stable release is made every year.

I think this needs to be more precise, and tied to the number of commits in git / issues fixed since the last stable release.

It's a real PITA as a user to download the latest stable, find bugs, search for them, and see that the fixes were committed months ago. You're left with either having to apply the patches, or run the -dev version.

We can obviously bikeshed what's the right amount of time and commits, but my hunch for these would be 3 months or 20 commits -- if you get to either of these since the last stable release, time for a new stable release.
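joachim's "3 months or 20 commits, whichever comes first" rule is simple enough to automate. A sketch under those assumptions (function name and thresholds are illustrative, not an agreed policy):

```python
from datetime import date

def stable_release_due(last_stable, commits_since, today=None,
                       max_days=90, max_commits=20):
    """True when a new stable release looks overdue: enough commits have
    landed, or enough time has passed, since the last stable release."""
    today = today or date.today()
    return (commits_since >= max_commits
            or (today - last_stable).days >= max_days)

# 25 commits since a stable release two weeks ago -> due on commit count alone
print(stable_release_due(date(2014, 2, 1), 25, today=date(2014, 2, 15)))
```

Either trigger alone flips the flag, which matches the "whichever comes first" framing.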

mgifford’s picture

@joachim - Thanks!

Ultimately, if folks find it easier to run a dev version than the latest stable release, Drupal's security infrastructure just gets bypassed. Drupal doesn't do security updates for dev releases, and probably shouldn't.

If we know that dev releases are being used on production sites, that's a security problem for the community. If there isn't a stable release you can rely on, or especially if the developer is telling you to use the dev release, that's bad security practice for the community.

I don't think we need to commit now to how we determine whether a stable release is needed. I'm happy to experiment with a commit/bug-fix ratio. It could even be the length of time a bug in the latest release has been marked RTBC. Certainly once a major issue, or a few minor issues, is marked RTBC, it's worth considering whether it's time for a new release. Right now it often takes someone opening an issue to ask for a new release before one happens.

Anyways, we can play with the calculations and see if it drives the desired result in the community. If not, we will need to adjust it accordingly.

mgifford’s picture

Issue summary: View changes
drumm’s picture

Status: Needs review » Needs work

I think simply adding a number will cause confusion. I imagine plenty of questions like "What does this number mean?" "Why is my module a 3?" "What can I do to change my number?"

I'd like to see more design work go into clarifying the parts of the page we know are important. For example, "Maintenance status: Unsupported" gets a warning icon, but no other explanation. You should be able to quickly scan the page and make the assessment, not have an opaque number.

mgifford’s picture

I do think it is important to have module maintainers at the very least be asking themselves, "What can I do to make my module better?" Wouldn't that be a good thing? Particularly if we provided some mechanisms to make that easier?

Whether that's in the form of a number or a thermometer or whatever, having a goal is a good thing. Making sure that module maintainers know how well they do compared to other modules is also a good thing. Even if it's just in relative terms.

I do think it would be good to start with having some scoring mechanism visible only to the maintainers of the module.

You're right, though, that the focus has to be away from the number or points. Any system can be manipulated, and any score will have its problems.

If we had an automated means to determine when a module is in trouble like #1134450: Automatically degrade maintenance and development status of projects over time then this would be more helpful. However, at this stage there seem to be a lot of modules which say they are under Active Development which haven't been touched in a year.

If a module's maintainer leaves the community how do we know?

Maybe the UI could simply watch the log files and adjust the status accordingly. However, if it's all up to the maintainer's discretion to set these, then it doesn't mean a lot, nor does it act as any incentive to change behavior.

Leeteq’s picture

Cross-linking / FYI:

#2192181: Allow users to review and vote on various aspects of projects.

"Posted by klonos on February 9, 2014 at 6:25pm

In #2186377: Highlight projects that follow Best Practices we aim to introduce an automatically calculated indicator based on various measurable factors from data gathered. This issue here is to request to allow users to vote on various project aspects that we cannot literally measure per se like:

- accuracy of the project's description
- completeness of documentation (both in the module's README.txt/INSTALL.txt as well as its documentation pages in d.o)
- fast replies in the queue (not necessarily by the maintainers)
- an overall "feeling" based on how useful the module is"

mgifford’s picture

Pierre.Vriens’s picture

This is an amazing post, one I've been searching for for many months ... I haven't digested it fully yet (sorry), but I plan to do so soon.

I just updated the Comparison of charting modules, by adding a draft version (work in progress!) of such scorecards for all Native charting modules ... My way of contributing back for this amazing post.

It'd be interesting to know the outcome once this scorecard gets completed (here is how to do so ...). It could then help people decide which charting-related module to pick in Drupal-land. From my own experience (over the past 6 to 12 months), I already "know" what I think the answer is. But by using these 28-5=23 criteria, it should become obvious for anybody who wonders, "I need some charting-related module; which one should I go for?"

Be aware: for about 6 months now I've been the (new) co-maintainer (and the only one ...) of the 2 native charting modules with the highest reported installs. I'm trying to someday achieve my goal as described in #2368793: Chart 2.0. It's challenging, and hard, but we're ('I' am?) getting there, step by step, bit by bit. Part of what I'll be working on is having those modules meet as many as possible (all?) of the criteria in this scorecard ... For that I'm going to create an extra community documentation page for each of these modules, as a kind of checklist, which I might call something like 'Compliance with Best Practices' (any better suggestions?).

sonicthoughts’s picture

Thanks to @pierre.vriens for pointing out this post. I've seen / commented on related activity on stackexchange:
http://drupal.stackexchange.com/questions/87971/get-more-drupal-org-proj...
http://drupal.stackexchange.com/questions/160386/best-way-to-get-module-...

Capturing a scorecard for module health would be fantastic. I would also like to see it summarized as an "index", but IMO the index would need to be weighted by the consumer. Some may value few bugs, while others value more commits (or D8 support). I think the first challenge is to make the underlying data available, and then to decide how to summarize it.
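A consumer-weighted index along those lines could look roughly like this. The metric names and weights below are made up purely for illustration; the only assumption is that each raw metric has already been normalized to a 0..1 scale:

```python
def weighted_index(metrics, weights):
    """Combine normalized per-module metrics (each in 0..1) into one
    score, weighted by what this particular consumer cares about."""
    total = sum(weights.values())
    return sum(metrics[name] * w for name, w in weights.items()) / total

metrics = {"few_bugs": 0.9, "commit_activity": 0.4, "d8_support": 1.0}
# A consumer who weights D8 support three times as heavily:
weights = {"few_bugs": 1, "commit_activity": 1, "d8_support": 3}
print(round(weighted_index(metrics, weights), 2))  # -> 0.86
```

The same underlying data produces a different ranking for each user, which is the point: publish the metrics, let consumers pick the weights.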

lolandese’s picture

Just posting a link to another list of criteria for evaluating project quality in the D.O. docs. It mostly overlaps, but seems to add:

  • Is it covered by the security advisory policy? This also has its own shield logo.
  • Is there a demo?

I like the points in the criteria mentioned at the top of this issue. Well thought out. Glad I found it. I added a link to it in the mentioned docs.