Run at least three sessions to test the usability of a typical Drupal blog or small community site for anonymous users.
To do this you will need three or more volunteers who will evaluate Drupal 6. It is important that everyone understands that Drupal is being tested, not the evaluators (your volunteers). An evaluator cannot pass or fail.
Ask each of the volunteers to browse a fairly default Drupal 6 site anonymously, comment on it, search for the answer to a question hidden in an old content item, find a specific article, find out some information about a user, register, confirm their account, and comment again. Provide them with a scenario and website that will inspire them to behave realistically, e.g. "Your soccer club has a website and you need to...".
You should write and provide a more detailed scenario and more detailed tasks for them to complete. See the UMN formal usability testing plan for ideas on how to do this. Your scenario and tasks should give the evaluator a clear goal and, for example, help inspire creativity in writing (which incidentally is not being tested, but is required to complete the evaluator's tasks).
To familiarize yourself with the tasks and usability tests, it is useful to do your own test and report before running the tests with evaluators. This will help you gain confidence in finding issues and taking notes on them. It's also important to thoroughly test permissions before running a test, so that the website you provide allows the evaluator to do what the tasks require.
For bonus points (not required for task completion), create an install profile for your usability-testing site. This may be re-used or further developed for the other usability testing tasks.
The evaluator must be able to do all the tasks through Drupal's UI, without writing any code or changing any files.
While observing new users, take note of:
- what the evaluator wants to do first
- where the evaluator gets lost or confused
- what the evaluator expected
- where the evaluator spends their time in the first 30 minutes of the session
- where the evaluator spends their time in the first few seconds of each new UI / page
- when and where they search for help
Perhaps the most valuable information from a usability test is knowing what the user expected. This makes it easier to discover usability bugs and suggest solutions. You should spend some time immediately after each test (while it's still fresh in the evaluator's mind) debriefing the evaluator to find out their answers to the above questions. You might find that you misinterpreted their behavior. Some evaluators find this difficult and begin to feel like they are being tested. If this is the case, don't pressure them to give you better feedback; instead help them to relax, remind them that no answer is right or wrong, and ask simpler questions about how they felt emotionally about the tasks they found difficult. If the evaluator can't give you good feedback, don't persist. You still have notes from watching their behavior, right?
Write a report that summarizes your findings. We're looking for a level of detail and format similar to Factory Joe's usability report on Drupal 6 beta 1. See also the reports from GHOP tasks #8 and #7.
There are two completed GHOP tasks that are usability tests like this one: #7 (d.o) and #8 (d.o). Those tasks focused on Drupal installation. This task focuses on site configuration and customization.
Before planning your usability tests read about how to do usability testing:
- http://www.uxmatters.com/MT/archives/000183.php
- http://openusability.org/
- http://keycontent.org/tiki-index.php?page=Usability+Tools
- http://factoryjoe.pbwiki.com/FeedbackForDrupal6
Deliverables: This task is complete when the report has been submitted by the student, and reviewed and approved by the mentor or another appropriate Drupal community member. The report should be made available in a widely available format like plain text, HTML or PDF.
You can include screenshots for bonus points. These could be annotated using Flickr's annotation tool. (Tag them with drupalui if using Flickr.)
Bevan is the owner / mentor of this task.
Comments
Comment #1
aclight commented:
It sounds like quite a bit of work to get a site set up like this which the student can use for testing. The devel generate module won't work for this since lorem ipsum text isn't suitable. Did you have some idea in mind other than just creating a lot of fake content? If the student creates just a few pages for testing, I don't know how valid the test results will be.
Comment #2
Bevan commented:
Ah yes. That was something I started thinking about, then I forgot to finish thinking about...
I had considered that they could use my D6 blog, but the content on it is rather Drupal-centric and technical, and probably uninteresting for most folk. Right now I don't have a better idea, though. Unless someone has an ideal site they don't mind us duplicating for this purpose? The tricky thing about that is that the site has to be pretty default: not too much contrib, Drupal 6 and the Garland theme. I don't know of any sites like that other than my blog...
Ideas?
Comment #3
aclight commented:
Webchick's blog at webchick.net also runs D6 and is relatively simple, but again it's mostly about Drupal. I can't think of any other suggestions off the top of my head.
Comment #4
Bevan commented:
Her content is certainly more appropriate for this task, although it's still Drupal-centric and not 'generic'.
I'm gonna send her an email.
Comment #5
webchick commented:
I hemmed and hawed over this, but finally after enough mulling it over w/ folks in #drupal, I'm OK with providing an anonymized dump of webchick.net for this task, should it be claimed. The information there is public and could be manually populated, but this would just be a way to shorten up that process.
Comment #6
Bevan commented:
Thanks Angie! We don't need to make the database public. You could email it directly to the claimant. It's not part of the student's deliverables.
Comment #7
Bevan commented:
And on that note, here's a revised version:
Run usability tests on site browsing, commenting and registration
Run at least three sessions to test the usability of a typical Drupal blog or small community site for anonymous users.
To do this you will need three or more volunteers who will evaluate Drupal 6. It is important that everyone understands that Drupal is being tested, not the evaluators (your volunteers). An evaluator cannot pass or fail.
Ask each of the volunteers to browse a fairly default Drupal 6 site anonymously, comment on it, search for the answer to a question hidden in an old content item, find a specific article, find out some information about a user, register, confirm their account, and comment again. Provide them with a scenario and website that will inspire them to behave realistically, e.g. "Your soccer club has a website and you need to...".
You should write and provide a more detailed scenario and more detailed tasks for them to complete. See the UMN formal usability testing plan for ideas on how to do this. Your scenario and tasks should give the evaluator a clear goal and, for example, help inspire creativity in writing (which incidentally is not being tested, but is required to complete the evaluator's tasks).
You will be provided with an anonymized copy of the database of a website, with content, settings and permissions suitable as a starting point for the website you'll run the testing on. You'll need to thoroughly test permissions before running tests so that the website you provide allows the evaluator to do what the tasks require.
To familiarize yourself with the tasks and usability tests, and to check the website and tasks, it is useful to do your own test and report before running the tests with evaluators. This will help you gain confidence in finding issues and taking notes on them.
The evaluator must be able to do all the tasks through Drupal's UI, without writing any code or changing any files.
While observing new users, take note of:
- what the evaluator wants to do first
- where the evaluator gets lost or confused
- what the evaluator expected
- where the evaluator spends their time in the first 30 minutes of the session
- where the evaluator spends their time in the first few seconds of each new UI / page
- when and where they search for help
Perhaps the most valuable information from a usability test is knowing what the user expected. This makes it easier to discover usability bugs and suggest solutions. You should spend some time immediately after each test (while it's still fresh in the evaluator's mind) debriefing the evaluator to find out their answers to the above questions. You might find that you misinterpreted their behavior. Some evaluators find this difficult and begin to feel like they are being tested. If this is the case, don't pressure them to give you better feedback; instead help them to relax, remind them that no answer is right or wrong, and ask simpler questions about how they felt emotionally about the tasks they found difficult. If the evaluator can't give you good feedback, don't persist. You still have notes from watching their behavior, right?
Write a report that summarizes your findings. We're looking for a level of detail and format similar to Factory Joe's usability report on Drupal 6 beta 1. See also the reports from GHOP tasks #8 and #7.
There are two completed GHOP tasks that are usability tests like this one: #7 (d.o) and #8 (d.o). Those tasks focused on Drupal installation. This task focuses on site configuration and customization.
Before planning your usability tests read about how to do usability testing:
- http://www.uxmatters.com/MT/archives/000183.php
- http://openusability.org/
- http://keycontent.org/tiki-index.php?page=Usability+Tools
- http://factoryjoe.pbwiki.com/FeedbackForDrupal6
Deliverables: This task is complete when the report has been submitted by the student, and reviewed and approved by the mentor or another appropriate Drupal community member. The report should be made available in a widely available format like plain text, HTML or PDF.
You can include screenshots for bonus points. These could be annotated using Flickr's annotation tool. (Tag them with drupalui if using Flickr.)
Bevan is the owner / mentor of this task.
Comment #8
aclight commented:
Typo:
"so that the website you provide allows the evaluator to do what the tasks [?]."
You still have the paragraph about the soccer club in, but then you talk about providing webchick's site. I assume you can now take out part of the third paragraph.
You might also want to specify that the student knows how to import a database from a SQL dump, and that the student uses MySQL (I doubt this second part will be a problem, however).
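For reference, that import step can be sketched as below. This is a minimal sketch only; the database name, MySQL user, and dump filename are placeholders I've made up, not names from the task:

```shell
# Create an empty database for the test site (all names are placeholders).
mysql -u drupaluser -p -e "CREATE DATABASE usability_test CHARACTER SET utf8"

# Load the anonymized dump into the new database.
mysql -u drupaluser -p usability_test < anonymized_dump.sql
```

After importing, the student would point the site's settings.php at the new database and re-check permissions before the first session.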
I'm keeping this as CNW, but when you fix these things go ahead and create an official task with this--I don't think it needs to come back here for another review first.
Comment #9
Bevan commented:
Google code: http://code.google.com/p/google-highly-open-participation-drupal/issues/...
Work location: http://drupal.org/node/211814
Comment #10
aclight commented:
Comment #11
Bevan commented:
He he! Did you know that an anonymous drupal-bot comes along after two weeks and closes these automatically? That's why I leave them on 'fixed'. :)
Comment #12
aclight commented:
Yep, I know about the bot. But in general, it just keeps the list of issues pages less cluttered when we close tasks that are no longer needed. For actual patches it's sometimes nice to see the progress made by looking at fixed issues, but since each closed task idea essentially generates another issue for tracking the progress of the student, we just get too many issues.