Carnegie Mellon University has a new master’s program in edtech with an emphasis on learning science. Seems like a great way to train for a growing field:
The Professional Master’s in Educational Technology and Applied Learning Science (METALS) is a one-year interdisciplinary master’s program, jointly taught by the Human-Computer Interaction Institute and the Psychology Department. The program trains students to design, develop, and evaluate evidence-based programs for learning in settings that range from schools to homes, workplaces to museums, and online to offline learning environments. Graduates will be prepared to take key positions in corporations, universities, and schools as designers, developers, and evaluators of educational technologies, as well as learning engineers, curriculum developers, learning technology policy-makers, and even chief learning officers.
Students with backgrounds in psychology, education, computer science, design, information technology, or business are encouraged to apply.
Academics and professionals go to a lot of conferences. I recently returned from iSLC 2010, which, like all conferences, had badges, and, like most conferences, badges that could be frustrating. Unlike most conferences, this one is run by graduate students such as myself, and I figure that by posting these notes I can make a difference for iSLC 2011 and beyond. (For what it’s worth, other than the badges, I thought this was the best run iSLC conference yet. Practice makes perfect.)
This isn’t the first post on designing conference badges but I’ll try to make it one of the most succinct. Here is the main principle to bear in mind when designing a badge: The purpose of the badge is to help people connect. To that end, here are some graphic design tips (with help from my friend and colleague Ruth Wylie):
Participants’ names should be the largest text on the badge, followed by affiliation.
First names above and bigger than last names.
Duplicate front on back for when it flips. (Can print one-sided and fold.)
That’s it. Notice there’s nothing about the conference itself, since that’s the least important part of the badge. Everyone knows where they are. In general, anything that sets people apart should be larger than anything that is the same.
The MacArthur Foundation is sponsoring a competition for innovative Digital Media and Learning applications. I learned of it through my colleague Derek Lomas, who won last year for his Playpower project.
This year the applications are posted online with open commenting. The word limit on applications is 300 words (not just the abstract, the whole application), which makes it easy for anyone to read through them and give feedback. It also makes applications easier to write, and there were over 1,000 submissions last I checked. Judges will select which entries advance to the second phase, for which a demo video is required.
With research partners I proposed Qrumbs, a system for social collaborative learning around any web resource. I’m glad to be working with Connexions, Curriki, and the PSLC DataShop, leaders in open educational resources and educational data mining. Below the fold is the full text of the proposal, which springs from my work in question authoring. With so few words to work with, I used a narrated scenario to communicate concisely how the system works. I encourage you to leave comments either here or, even better, on the application’s page.
The TED Talks lecture series is a wonderful intellectual and cultural resource. I’ve been a fan for years and on this long weekend relished the opportunity to catch up on some I’ve had in my queue. One of my favorites is by Ken Robinson, because it highlights both a goal and a challenge of my research.
First, the challenge:
I have an interest in education — actually, what I find is everybody has an interest in education. Don’t you? I find this very interesting. If you’re at a dinner party, and you say you work in education — actually, you’re not often at dinner parties, frankly, if you work in education. (Laughter) You’re not asked. And you’re never asked back, curiously. That’s strange to me. But if you are, and you say to somebody, you know, they say, “What do you do?” and you say you work in education, you can see the blood run from their face. They’re like, “Oh my God,” you know, “Why me? My one night out all week.” (Laughter)
Education, to many, is a very dry topic. That’s because, for many, the practice of education is very dry. Robinson posits that if you asked a space alien what education is for, they’d say it’s training to be a university professor. That’s the pinnacle of the process, right? But it takes much more than university professors to make the world go round, of course. How can our system of education support the diversity of needs and foster the diversity of talents of a rich and dynamic society? That’s the goal.
I ask “how” not “if” because I don’t see it as a choice. Globalization, mechanization, intelligent machines… these demand that we develop human resources, fostering the gifts of each person however they may manifest. Motivated by the example of Gillian Lynne (starting at 2:30 in the talk), Robinson argues that we should increase the amount of education in the arts, even specifically teaching dance to everyone. Such sentiments underestimate the constraints on students’ time, money, and attention. I would imagine Robinson is not a fan of intelligent tutoring systems, given how they involve sitting at a desk and developing, most often, a laser-focused set of skills.
But I think research in intelligent tutoring systems and other computer supported learning can help address the diversity he speaks of and even help students learn to dance, if that’s something a community values. Firstly, they can help increase the efficiency with which “maths” (he’s British) and other “left brain” skills are learned, leaving more time for other enrichment. Secondly, and this is a goal of my research, computers can help personalize instruction to each student’s needs and interests. The computer can devote its unblinking attention to the individual, drawing on a wealth of data about their prior interactions with the system, and those of other learners like them, to deliver an effective and engaging learning experience. Technology isn’t the answer, but it is part of the best solution.
Thanks for reading. I encourage you to watch the whole talk as it has many more gems than I’ve noted here, and several laugh-out-loud moments.
Today I rolled out a big update to QCommons, implementing support for many more content areas than Chemistry.
The key new feature is groups. Each group has its own set of content, its own forums, and its own classification terms. (Because “rational” doesn’t mean the same thing in Economics as it does in Algebra.) Each group can also have its own permissions system and administrators to support more private uses, such as school district curriculum committees. If you would like to host a new group, please contact firstname.lastname@example.org.
To keep a handle on the growth of the site, I’ve switched registration to require administrator approval. Please register and sign up for the mailing list. You’ll go into a queue that I’ll process regularly to allow more users.
In a nutshell, QCommons is a platform for assessment resources that promotes sharing and collaborative improvement. It is open source and designed to support many different topic domains. To track developments with QCommons please fill out the subscription form currently on the front page of the site.
QCommons: Chemistry is the first site built on the platform. In developing sharing sites, there is a bootstrapping issue: people are unlikely to contribute to a site that doesn’t already offer something. This makes starting out tough, but once it gets rolling the network effects help it grow faster and faster. (Wikipedia being a prime example.) Fortunately, the ChemEd DL and the Journal of Chemical Education have donated hundreds and hundreds of quality assessment items to get the ball rolling. Anyone can come and browse through the items for ones useful in their formative or summative assessments. All the items are from the JCE QBank and are used in General Chemistry courses at major universities. These were previously available only to vetted teachers, but now the questions (without the answers) are available to everyone. (At this point, to see the answers you will still need to prove you are a teacher. I welcome feedback on how to improve this.)
This is just the beginning. There are many topic areas, new features and usability improvements to come. In the spirit of Google’s long “beta” cycle, I’m releasing the site as it is to let people start using it and give feedback. It is ready to use and relatively bug free. As I add new features I intend to keep it that way. Just bear in mind that it’s a work in progress and any suggestions you have could strongly influence how it develops.
So again, all comments, criticisms and suggestions are welcome. You can post them here on this blog or in the forums on the site.
Educators enjoy certain greater freedoms under copyright law’s Fair Use Doctrine, and as such the culture of exchange in education is fairly lax. In a recent workshop in which I discussed QCommons (more on that later this week), the teachers said they were reluctant to upload their materials because they’d lost track of what was original and what they’d copied from copyrighted sources. They knew they had the right as teachers to use the materials in their courses, but did not know whether they could publish them online, and thus were effectively silenced by the intimidating complexity of copyright law.
Creative Commons has set out to make copyright and licensing terms easier for everyone to understand and use. CCLearn is trying to do the same for education, which I argue is an even greater task. Today they put up a survey to get a sense of how people understand the copyright rules and how they work with them. Please take 5-10 minutes to fill it out and help advance access to and sharing of educational resources. It closes August 31, so go ahead and do it now. :)
“Because most content remains “all-rights-reserved” under the traditional rules of copyright, it is often the case that the creators and producers of OER must confront questions as to when and if it is permissible to use content created by others when it is not offered under an open license. For example, an OER creator may want to incorporate a clip from a film into a lesson about film techniques, or an animated video illustrating a biological process into a lesson about that process. However, if the film clip or animation is protected by “all-rights-reserved” copyright, then the OER creator may be unsure how to proceed, or may wish to rely on some exception to copyright law that may apply under such circumstances.
It is our goal to develop a deeper awareness of the degree to which OER practitioners and users grapple with copyright law issues, and whether those issues pose barriers to the creation, dissemination, and reuse of OER. We hope that this initial survey will form the basis of a larger international study led by ccLearn.”
Your product is useless if no one can use it. But how do you know how usable it is? You can try it yourself, but you are not the user. (“You are not the user” is practically a mantra here at HCII.) One set of methods, usability inspection, provides a framework to systematically evaluate a system yourself or with the help of others versed in the methods. But really, you want to know how actual users experience the system.
Usability testing is a systematic way to see how real users get along with your design. There is much literature around usability testing, which you may want to read. Or you can just dive in using some of the easy tools that are available these days.
If you have access to users and can put them in front of your computer, there is software that can record both the screen and a webcam at the same time so that you can go back later to watch and listen to the user while they’re using your software. The best is Silverback, which is pretty slick but only for Mac. (For Windows, keep looking.) Because it’s recorded, you can skip over boring stuff and replay interesting stuff over and over. You don’t even need to be there to record it, if other people are helping you.
Then there’s remote testing. If you’re testing a web application, your users don’t even have to leave their computer. Today I came across these remote testing services and they’re pretty great:
UserTesting.com is like a mix of Silverback and userfly. For $29, they provide the user for you and record a video of their screen, with an audio track of them thinking aloud using your website. Apparently the user also provides a written summary of the problems they found. I haven’t tried it since it costs money, but I wonder who the users are. I expect they’re experienced evaluators, which helps in some ways, but can also be detrimental. If you’re targeting a special population, then you probably want to find the users yourself.
Chalkmark has a very specific purpose: seeing where people click on an image when given an instruction. For example, you upload an image of your web site and tell participants to “click on the link to update your settings.” Wherever they click gets recorded, along with everyone else’s clicks on the task, to create a heat map over the image. I guess it’s useful if you’re carefully testing the layout of a site across a large population, but usually a small sample suffices and you would get richer data with the other tools.
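Under the hood, a tool like this essentially just bins each participant’s click coordinates into grid cells and counts them; the denser the cell, the hotter that region of the image. Here is a minimal sketch in Python (the click data, cell size, and function names are all made up for illustration):

```python
from collections import Counter

# Hypothetical data: (x, y) pixel coordinates of each participant's
# click for one task on an 800x600 screenshot.
clicks = [(120, 45), (125, 50), (118, 47), (400, 300), (122, 44)]

CELL = 50  # bin clicks into 50x50-pixel cells

def heat_map(clicks, cell=CELL):
    """Count clicks per grid cell; denser cells render as hotter."""
    counts = Counter()
    for x, y in clicks:
        counts[(x // cell, y // cell)] += 1
    return counts

hot = heat_map(clicks)
# The cell with the highest count is where most participants clicked.
hottest_cell, n = max(hot.items(), key=lambda kv: kv[1])
```

A renderer would then overlay these counts on the screenshot as translucent color. The interesting design choice is the cell size: too small and every click is its own dot, too large and distinct targets blur together.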
Remember, you are not the user and your assessments of the usability of what you made are likely way off. Test with real users. These tools make it easy.
Anyone who publishes in conferences knows the pain of making sacrifices to meet a deadline, only to find at the last moment that it has been extended. After year upon year of these extensions, people come to expect them, making them all the more inevitable. It reminds me of people who set their clocks fast so they’ll be late less often, only to start compensating for the offset, run late again, and have to set their clocks ever earlier. At what point do conference deadlines lose their credibility?
I’d be interested to know how many people believe a conference’s deadline as a function of how often that conference extends its deadlines. I imagine that younger researchers are more credulous, though simply because they haven’t yet witnessed all the extensions.
Are there data available on deadline extensions? If not, why not collect them? Each time a deadline is extended, there is surely at least one person who is frustrated. What if there were a place they could go and vent their frustration by adding the occurrence to some listing/database? Well, here is the place:
To add your data, or vent depending on how you see it, just fill out this form: