Monthly Archives: July 2009

Usability testing gets easier and easier

Your product is useless if no one can use it.  But how do you know how usable it is?  You can try it yourself, but you are not the user. (“You are not the user” is practically a mantra here at HCII.)  One set of methods, usability inspection, provides a framework to systematically evaluate a system yourself or with the help of others versed in the methods.  But really, you want to know how actual users experience the system.

Usability testing is a systematic way to see how real users get along with your design.  There is much literature around usability testing, which you may want to read.  Or you can just dive in using some of the easy tools that are available these days.

If you have access to users and can put them in front of your computer, there is software that can record both the screen and a webcam at the same time so that you can go back later to watch and listen to the user while they’re using your software. The best is Silverback, which is pretty slick but only for Mac.  (For Windows, keep looking.)   Because it’s recorded, you can skip over boring stuff and replay interesting stuff over and over.  You don’t even need to be there to record it, if other people are helping you.

Then there’s remote testing. If you’re testing a web application, your users don’t even have to leave their computer.  Today I came across these remote testing services and they’re pretty great:

userfly is impressive and also the cheapest.  You add a JavaScript snippet to your page, and then everything the user does on the page is recorded.  Then they play it back for you, showing the mouse moving around, clicks, and keypresses.  It has some difficulties updating with the results of AJAX calls, but they’re working on that.  They offer 10 recordings per month free, so it’s worth trying.
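I haven’t seen userfly’s actual embed code, but the underlying technique — capturing timestamped interaction events for later playback — is simple enough to sketch in plain JavaScript.  The `SessionRecorder` class and event names below are purely illustrative, not userfly’s API:

```javascript
// A minimal sketch of session recording, in the spirit of userfly.
// Events are captured with timestamps relative to a start time, so a
// player could later reproduce them in order and with correct timing.
// (Illustrative only; userfly's real script and API are not shown here.)

class SessionRecorder {
  constructor() {
    this.events = [];
    this.start = Date.now();
  }

  // Record one interaction event (e.g. "click", "keypress", "mousemove").
  record(type, detail) {
    this.events.push({ t: Date.now() - this.start, type, detail });
  }

  // Replay recorded events through a callback, in recorded order.
  replay(handler) {
    for (const ev of this.events) handler(ev);
  }
}

// In a browser you would wire this up to real DOM events, roughly:
//   document.addEventListener("click", e =>
//     rec.record("click", { x: e.clientX, y: e.clientY }));

const rec = new SessionRecorder();
rec.record("click", { x: 120, y: 45 });
rec.record("keypress", { key: "a" });

const seen = [];
rec.replay(ev => seen.push(ev.type));
console.log(seen.join(","));  // click,keypress
```

The recorded event list is just data, so it can be shipped to a server with a single request and replayed anywhere — which is presumably why a one-line script embed is all these services need.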

UserTesting.com is like a mix of Silverback and userfly.  For $29, they provide the user for you and record a video of their screen, with an audio track of them thinking aloud using your website.  Apparently the user also provides a written summary of the problems they found.  I haven’t tried it since it costs money, but I wonder who the users are.  I expect they’re experienced evaluators, which helps in some ways, but can also be detrimental.  If you’re targeting a special population, then you probably want to find the users yourself.

Chalkmark has a very specific purpose: seeing where people click on an image when given an instruction.  For example, you upload an image of your web site and tell them “click on the link to update your settings”.  Wherever they click gets recorded, along with everyone else’s clicks on the task, to create a heat map on the image.  I guess it’s useful if you’re carefully testing the layout of a site across a large population, but usually a small sample suffices and you would get better data with the other tools.
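The aggregation step behind such a heat map is straightforward: bin each participant’s click coordinates into a coarse grid and count clicks per cell.  Here is a minimal sketch under my own assumptions (this is not Chalkmark’s actual code):

```javascript
// A sketch of Chalkmark-style click aggregation: divide the image into
// cellSize x cellSize bins and count how many clicks land in each bin.
// A renderer would then color each cell by its count to draw the heat map.
// (Illustrative only; not Chalkmark's implementation.)

function heatMap(clicks, width, height, cellSize) {
  const cols = Math.ceil(width / cellSize);
  const rows = Math.ceil(height / cellSize);
  const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const { x, y } of clicks) {
    // Clamp to the last cell so clicks on the far edge still count.
    const col = Math.min(cols - 1, Math.floor(x / cellSize));
    const row = Math.min(rows - 1, Math.floor(y / cellSize));
    grid[row][col] += 1;
  }
  return grid;
}

// Three participants attempting "click the settings link" on an 800x600 image:
const clicks = [{ x: 700, y: 20 }, { x: 710, y: 25 }, { x: 100, y: 580 }];
const grid = heatMap(clicks, 800, 600, 100);
console.log(grid[0][7]);  // 2 clicks landed in the top-right cell
```

With only a handful of participants the “heat map” degenerates into a few isolated dots, which is exactly why this kind of tool pays off mainly at larger sample sizes.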

Remember, you are not the user and your assessments of the usability of what you made are likely way off.  Test with real users.  These tools make it easy.

Conference deadline extension statistics

Anyone who publishes in conferences knows the pain of making sacrifices to meet a deadline, only to find at the last moment that it has been extended.  Year after year of these extensions and people come to expect them, making them all the more inevitable.  It reminds me of people who set their clocks fast so they’ll be late less often, only to adjust to the fast clock, be late again, and have to set their clocks ever earlier.  At what point do conference deadlines lose their credibility?

I’d be interested to know how many people believe a conference’s deadline, as a function of how often that conference extends its deadlines.  I imagine that younger researchers are more credulous, but simply because they haven’t yet learned about all the extensions.

Are there data available on deadline extensions?  If not, why not collect them?  Each time a deadline is extended, there is surely at least one person who is frustrated.  What if there were a place they could go and vent their frustration by adding the occurrence to some listing/database?  Well, here is the place:

To add your data, or to vent, depending on how you see it, just fill out this form:

I hope this helps. I just added AERA, which had been due today.

Plone, Drupal, Moodle and ATutor

I’m overdue for an update here on what I’ve been working on.  My current project is a web application for collaboratively sharing, critiquing and improving question items for homeworks and quizzes. Basically a wiki question bank, but more socially-oriented.

In my last post, I was working on Plone and the ECQuiz module of eduComponents.  After a few months I abandoned Plone and never looked back. Plone’s going through a big transition right now and it’s hard to be a newcomer to its scene.  I wish Plone and eduComponents developers well.

I switched my platform to Drupal and its Quiz module.  Drupal has an amazing community.  It’s hard to measure a community, but a handy data point is that the Drupal group on Facebook has 3500 members, compared to Plone’s 500.  The Quiz module has an active forum, an IRC channel, and a longish history.

While the Quiz community is strong, its design is lacking for my purposes.  It began three years ago as a simple module and has been pulled and contorted over the years to suit different needs.  This is arguably the best way for an open source module to evolve.  Thanks to major contributions from Matt Butcher, the module got some big improvements in Quiz 3.0.  For example, there is now an object-oriented type system for question types, so new ones can be added more easily.  (Unfortunately, Drupal data schemas don’t have inheritance like PHP objects do, so data properties of the base class have to be included in each subclass.  Unless someone wants to hack around that.)  And now Sivaji is making yet more improvements for Quiz 4.x, as part of his winning GSoC proposal.  Quiz 4.0 will be a polished set of improvements at the end of this summer.

I’ve been exploring the potential for Quiz to take a leap forward by drawing in code from PHP-based learning management systems.  I began with Moodle and hammered its import/export code into Quiz to allow it to handle many more formats.  I was happy to have demonstrated the possibility of re-use in open source, but overall I was pretty turned off by Moodle’s spaghetti codebase.  Maybe if you’re a longtime Moodle developer it all seems clean and clear, but that wasn’t my experience.  So I’ve kept looking.

ATutor looks promising.  It has a leaner codebase and so far looks to be cleanly designed.  It also has much better support for standards, which is important if my question bank is to interoperate with other systems.  After skimming the source code I realized I would need the db schemas to wrap my head around it, so I installed the whole thing on my laptop.  Wow, that was easy.  I just moved the folder into my MAMP htdocs, navigated to it in my browser, and the rest was clicking through web forms.  (There was one step where I had to make a directory manually, but the directions were explicit enough for anybody.)  Thanks, ATutor developers, and Happy Canada Day.

I may post again with an assessment.  Please reply in the comments if you’d like to read it (and telling me why would also help).