Soliloquy on Learning Analytics

9 Aug

Learning Analytics is a hot topic these days, but at the moment it also seems to some like a kind of Google Wave: a neat concept, but with few concrete examples. I think it has a bright future, so I thought I would write about it.

Learning Analytics for me is simply analytics applied to learning: analysis of any data generated in the course of learning that can shed light on how people learn. Imagine those great data visualizations you see in the New York Times–only representing, not red states and blue states, but heretofore invisible patterns in the behaviors of learners.

It’s distinct for me from Business Analytics–the real-time analysis of data generated by organizations, designed to inform decision-making, also known as the famous “dashboard” to which all good leaders aspire. And it’s distinct from what my friend Ganesan Ravishanker and others call Academic Analytics (see his ECAR Research Bulletin on the topic): the analysis of data from the business operations of academic activities like enrollment, majors, tuition payments, faculty counts, and so on. These are wonderful and important activities; the difference in Learning Analytics is that it aims to focus on learning. Or, to make that slightly less esoteric, it aims to analyze the records of behavior occurring, and the artifacts produced, during learning.

What am I talking about? Imagine all the things that happen when you are taking a class. Reading the syllabus. Listening and talking in class. Reading the homework assignments. Taking notes, both in class and while doing those readings. Writing paper drafts. Using particular sorts of software. Engaging in online discussions. Texting classmates. Contributing to a course backchannel on Twitter. Giving feedback and engaging in peer review. Taking surveys. Filling out evaluation forms. Posting at RateMyProfessors.com. And so on. Increasingly this stuff is electronic or happens in an electronic medium, so it’s in theory collectable. If we can collect it, we can analyze it. Suddenly things that were invisible are no longer invisible. As Johann Larusson and Brandon White say in Detecting the “Point of Originality” in Student Writing:

The continuing migration of more and more teaching materials to digital venues, and thus a machine-readable form, has the supplementary benefit of making a student’s day-to-day learning activities more transparent in each stage of the teaching process.
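
To make “collectable” a bit more concrete, here is a minimal sketch, in Python, of what one record of learning activity might look like once it has been pulled into a common machine-readable form. The field names and sample values are my own invention for illustration, not an established standard or anyone’s actual system.

```python
# A minimal, hypothetical shape for "collectable" learning activity.
# Field names and values are invented for illustration only.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LearningEvent:
    learner_id: str     # pseudonymous identifier, not a real name
    course_id: str
    activity: str       # e.g. "blog_post", "discussion_reply", "peer_review"
    timestamp: datetime
    artifact_text: str  # the text produced, if any: a post, a comment, a draft

# Once blog posts, forum replies, and peer reviews share one shape like this,
# the "invisible" day-to-day activity becomes something we can actually query.
events = [
    LearningEvent("s042", "ECON101", "blog_post",
                  datetime(2011, 9, 14), "My first take on supply and demand..."),
    LearningEvent("s042", "ECON101", "discussion_reply",
                  datetime(2011, 9, 15), "I disagree, because..."),
]

activity_counts = {}
for event in events:
    activity_counts[event.activity] = activity_counts.get(event.activity, 0) + 1
print(activity_counts)  # {'blog_post': 1, 'discussion_reply': 1}
```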

Okay, if we collected all this stuff, what sorts of things would we see? Basically we’d see better what people thought during the evolution of the course. About the content in the course, about the ideas in the course, about conversation in the course, about assignments. We’d see their work evolve–through drafts, through feedback. We’d see their feedback evolve. We’d see how people responded to evolved feedback. We’d see the conversation evolve. In other words, we would see the way people developed over the course of the semester. And we’d be able to understand a bit better what learning is and what parts of our learning environments were most beneficial. The main way we see at all into this area now is basically through the intuition of the teachers and learners engaged in the course–their intuition informed by their own sometimes semi-conscious collection and analysis of data–which is of course a wonderfully valuable and reliable tool. Learning Analytics would just supplement it.

How about an example? Here’s one: in their “Point of Originality” work (see link above), Johann Larusson and Brandon White created a tool that uses established linguistic theories to measure the originality of student writing, drawing on student blogs as its data source. In a course where students blog regularly, their tool can give a sense of when students are engaged and thoughtful about the topics discussed, both in the aggregate and for individual students. A professor can use this new tool to complement intuition or other assessments, and can start to make connections that help improve the learning environment. Are there patterns to the engagement? Does a particular topic excite more originality? Did the teaching method have an effect? Does originality correlate with performance on exams? Do I sense an engagement in class that corresponds to the activity in the blogs? And so on.
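
To give a flavor of what a tool in this family might compute, here is a rough sketch, emphatically not Larusson and White’s actual method: it scores each new blog post by how little its vocabulary overlaps (in TF-IDF terms) with the student’s earlier posts, treating low overlap as a crude stand-in for novelty. Their tool rests on established linguistic theory, which this sketch does not attempt to reproduce; the function name and sample texts are invented.

```python
# A crude novelty score, NOT Larusson and White's method: compare a new blog
# post against the student's earlier posts and treat low TF-IDF similarity
# as a rough proxy for originality.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def novelty_score(new_post, earlier_posts):
    """Return 1.0 minus the highest cosine similarity to any earlier post."""
    if not earlier_posts:
        return 1.0  # nothing to compare against yet
    vectors = TfidfVectorizer().fit_transform(earlier_posts + [new_post])
    similarities = cosine_similarity(vectors[-1], vectors[:-1])
    return 1.0 - float(similarities.max())

earlier = [
    "Supply and demand set prices in a competitive market.",
    "Prices fall when supply rises and demand stays flat.",
]
print(novelty_score("Externalities complicate the usual supply and demand story.", earlier))
```

No professor would read numbers like these as grades; the point is only that a per-post, per-student signal of this general kind is the sort of thing such a tool can surface alongside intuition.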

The above is what you might call the human perspective on Learning Analytics. There is also a structural perspective, and it has potential too. I’m indebted to Greg Crane for this key insight (see my post from last Spring). We have collections of data from systems, be they computer systems or systems of activity, that can shed light, at a large scale, on how people define and shape learning containers (if you will) and learning pathways or sequences. What systems do I mean? Collections of syllabi, course descriptions, definitions of course learning objectives, course reserve reading lists, records of textbooks used by courses, course catalogs, departmental or program descriptions, course prerequisites, transcripts, credit transfer agreements, and on and on: every school could generate a vast amount. It’s relatively easy to imagine these kinds of materials collected and organized, even on a national scale. Analyzing these data would start to give us a kind of linguistic understanding of the grammar of learning sequences. We could start to see how people shape learning, what subjects are considered learn-worthy (if you will), what sequences of subjects are considered appropriate, even what information resources are associated with what subjects. You could see how these structures changed over time or differed across cultures.

You might even start to conceive of a dynamic learning advising tool that could give a self-directed learner lots of options based on analysis of these large-scale patterns of learning construction. “Welcome, David. You’re interested in learning Economics? Based on our analysis of all world Economics courses, the first step generally consists of activities like these . . . and uses information sources like these . . . and assessments like these . . . Given your personality and past successes, we recommend activities with a strong experiential flavor . . . we’re now generating a syllabus for you. Also, based on your projected learning path, three professors have made a bid to serve as your personal guide at reasonable rates. We think you’ll like Dr. Jones. Etc.”
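
As a small illustration of what that structural analysis could look like, here is a toy sketch on invented data: given course-catalog records from several schools, it counts which courses most often serve as the prerequisite-free entry point to a subject, one tiny piece of the “grammar” of learning sequences. The record layout, school names, and course titles are all hypothetical.

```python
# Toy structural analysis on invented catalog data: which courses most often
# act as the prerequisite-free "first step" into a subject across schools?
from collections import Counter

catalogs = [
    # (school, course, subject, prerequisites)
    ("School A", "Principles of Economics", "Economics", []),
    ("School A", "Intermediate Microeconomics", "Economics", ["Principles of Economics"]),
    ("School B", "Introduction to Economics", "Economics", []),
    ("School B", "Econometrics", "Economics", ["Introduction to Economics"]),
    ("School C", "Principles of Economics", "Economics", []),
]

entry_points = Counter(
    course
    for (_school, course, subject, prereqs) in catalogs
    if subject == "Economics" and not prereqs
)
print(entry_points.most_common())
# [('Principles of Economics', 2), ('Introduction to Economics', 1)]
```

With enough catalogs, counts like these begin to describe how a field expects newcomers to start, which is the raw material an advising tool like the one imagined above would draw on.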

I’m perhaps getting a little carried away here with this Greg-Crane-inspired advising machine, which reminds me of what I think we should and should not do with knowledge derived from Learning Analytics. I don’t think the point of analyzing learning is to create a machine that replaces teachers and automates assessment, nor do I think the data collected should be used in secret to make some vast, nefarious decisions about people and programs (these are two of the initial worries people feel, I think, when they start to think about Learning Analytics). I think the point of Learning Analytics is instead simply to give back to learners and teachers more information about how they are doing and about what is working. The ultimate arbiters for me are still the learner and the teacher, who, with Learning Analytics, will have more tools at their disposal as they try to understand how to help themselves or others learn. Learning Analytics should contribute to the kind of ongoing, embedded, formative, transparent assessment hoped for by people everywhere.

And so ends my soliloquy. If you’re interested in learning more, look for an upcoming webinar co-hosted by the New Media Consortium and the NorthEast Regional Learning Analytics group (NERLA) to appear this Fall, take a look at my friend Malcolm Brown’s excellent ELI Brief on the topic, or review the proceedings of the Banff conference from last Spring, which had some wonderful presentations. And if you have an idea or a project in the works, consider presenting on it at the first ever NERLA Learning Analytics Symposium, January 2012 in Norwood, Massachusetts: the call for papers is open through September 26, 2011.
