Audio from the session
To download the MP3, click here
This session was a workshop rather than a formal presentation and panel debate, and was led by a project team who have worked on projects to elicit students’ ideas and experiences of the technologies they use in their studies. The group activities included trying out some of the elicitation techniques and feeding back on how effective the methodologies are.
Sarah Knight, programme manager for this strand, welcomed all participants to the session, part of a series of sessions intended to gain a better understanding of students’ experiences of using technology. The e-learning programme gets the first session – Listening to Learners. It’s intended to be a dynamic session with lots of interaction, and she introduced and welcomed the session presenters/facilitators.
Rhona Sharpe: The session is structured in two parts. First, we’ll be talking about methodology and sharing the ways in which learners’ experiences have been collected, and we’ll be giving participants the chance to try the methods out. Second, we’ll be talking about the findings, what the research projects have been finding, and what participants might like to find out from their own learners.
First, I’ll give you a quick introduction to the projects and what they’ve been doing. First – there’s an investigative aim – how learners experience technology and participate in learning. We’re asking the questions – What’s the impact on our particular product/service? How do we develop methodologies to better understand the learner experience?
Phase 1 of the programme resulted in the Lex methodology report and showed that we need to look at learners’ lives holistically to understand their motivations. We were working with an open question – what is the learner experience? The research questions for Phase 2 were much more specific – how do specific groups experience learning? How do students personalise their technology tools? We have various projects running across the country, and there are representatives of these projects attending today’s session. These projects are not yet finished, so we have no results yet – when we discuss findings, they’ll be tentative preliminary findings.
Out of Phase 1, we made methodological recommendations. Methods should be longitudinal – they can’t be snapshots. We need to gauge and record actual use of technologies, not perceived use or retrospective memories of use. We should purposively sample specific groups of learners – maybe groups who are failing, ones who are working part-time, community learners. It’s difficult to generalise across learners, so we should sample carefully. We should use guided recall or at-the-time techniques, and triangulate data to improve the validity of results. Rich audio/video data needs to be collected and stored to strict ethical guidelines. Some of the projects have done large-scale surveys, and there is an emphasis on interviews, and interviews plus an artefact – students bring something to talk through their use. Focus groups, video diaries and audio logs have been used – students have phoned in to record their technology use, for example, and students have been given cameras to talk into. ‘Penpals’ at Thema has established a long-term relationship between researcher and researched, allowing them to elicit rich data.
We’ll look now at some examples of video diaries in a montage called How students use technology to support learning, which is a series of edited clips from the Stroll project. “I very rarely leave my computer unless it’s to attend a lecture…or go out to eat,” says one student. Another commented on the usefulness of broadcast/podcast lectures, when lecture notes simply aren’t enough and students have to miss sessions. Another talked about the value of free online courses. One student talked about the university’s virtual campus on Second Life, and praised the university for trying to “keep up with modern technology”. She also talked about companies doing interviews via Second Life.
The diaries surface all sorts of interesting things that may be difficult to elicit from students the rest of the time. Throwaway comments can be unpicked. There is also a website available with further information about the data collection methods, and ELESIG is a community for those involved in the evaluation of learners’ experiences of e-learning.
Rob Howe, University of Northampton: I am involved in the E4L project, which is scheduled over two years, completing in Feb 2009. At the outset, we thought, “What do we need to do?” We were asked to identify effective e-learners – what do we do, and can we actually pick out the effective e-learners? We went for “proficient e-communicators” as a variant – a logical step and something we could work with within our timescale. A methodology needed to be devised, and we used the VLE for that purpose, setting up a number of tasks; the students who reached level 3 in those tasks were taken on to the project itself. It wasn’t a case of getting students into a room and asking them questions – we had a collection of different ways we interacted with them. Tutors were involved from the outset; they were important. We went into lectures and gave out info cards with KitKats – it was something they remembered, and it gave them the opportunity to interact with us. We asked them to complete some of the initial stages on the VLE. In terms of the interaction itself, we needed to keep them engaged with the interviewing process. We did try video diaries but didn’t find that very successful. Once we’d got the learners in, we videoed the whole session, and we went through a number of stages with those learners. One key theme was transitions – looking between adult community learning/further education/higher education. We broadened that out to years and courses. We tried to get their educational flowchart out, so talked to them about their educational experience and where they wanted to go. We wanted to engage where possible with a natural dialogue. We then talked to them about the questions we were going to ask, so there were no surprises, and we tried to phrase them in such a way that we didn’t mention what we were looking for.
Once the camera started rolling, we gave them a card sort – a useful process in terms of the technologies they were engaging with, ranking them in order of importance and talking about the reasons why they ranked things as they did. We allowed them a few blank cards, so they could write down technologies we hadn’t thought of, and allowed them to talk about those. That had an unintended outcome – we used the learners’ blank cards in subsequent card sorts, and the students would ask questions about some of the technologies they didn’t know about. They’d go away and find out about them, and then tell us afterwards how useful they were. One of the best ways to appreciate the method is to try it out.
Handover to the floor: Participants were then divided into groups and given an envelope containing a number of cards, with instructions to discuss and rank the technologies inside before coming back to feed back.
Rob Howe called the session back together.
RH: We’re not interested in how you ranked them, we want to find out how you found the card sort and discussion as a method of getting people to talk. How good was it as a method of talking about your own experiences?
Group 1: Laptop computer was ranked as the top item. We enjoyed talking about what we used and the process of shuffling cards around was entertaining and interesting. We also had a sense of commonality – we do things in the same way, though there is some personal and institutional variation.
Group 2: Our top item is the same – laptop computer, though also email and mobile phones. It worked as a method of getting discussion going. We wondered to what extent age affects these things – whether hardware is a technology you have to have. Younger people might not talk about the hardware, but about the things they did with it.
Group 3: We didn’t talk about laptops and things, we assumed we would have those. We didn’t have any cards so we wrote our own and then tried to sort them into related things – email and podcasting got the most votes. What was difficult about it was that we all have very different roles in the ways we interact with students. We didn’t really have time to explore that, but it was clear from what was coming out that we interacted with students and interpreted the brief in different ways.
Group 4: We are addicted to work email, unfortunately. It was effective at drawing out the differences; people who didn’t agree with the ranking might not get a voice.
Group 5: Computers were at the top. There was a cluster between direct experience of the curriculum as opposed to broader experience and social networking. We had a slight problem with definition, but it was an informative exercise and got everyone talking.
Group 6: Like everyone, we had laptop at the top. We enjoyed throwing ideas out. We didn’t really try to achieve a consensus, just discussed individual differences.
Ellen Lessner has been collating the groups’ feedback on a flipchart.
RS: Learners are led by the prompts that we give them. One of the groups assumed that laptops were already included. We need to say some things explicitly, because otherwise people make assumptions about what has already been said.
(Janet Finlay: Given the fact that it would be difficult to use some of the technologies without the hardware, it’s at a different level. It’s not an interesting thing to find out, that people need laptops. We focused on the software and the platforms, because you need the hardware.)
(Q1: Maybe asking people whether they used laptops or pens more often might be interesting.)
RH: Some people ranked mobile phones as their top technology. We need to talk to learners and draw that mobility out. We need to break down the assumptions we have. We did a session in Bradford called Back to Basics, where we considered everything from the students’ perspective. Put yourself in a student’s position, and you come up with different assumptions and experiences. Some groups asked me whether they should be doing the exercise as a student, a practitioner or an institutional manager.
(JF: Comparing how much you use a laptop with how much you use Blogger is comparing different things. Did you separate them out?)
RH: We used the card sort as a way to get learners talking. It’s not necessarily the top one or two that matter, it’s the cluster at the top. You need to be dynamic in the interview setting – you can’t predict what’s going to come out. The trends across the seven projects are what Rhona and the teams are trying to pick out.
(Q2: Have you detected any age-related differences? Are there any gender-related differences, or differences between sectors?)
RS: Age differences are not necessarily what you would assume. Younger students use more technology but not necessarily in ways which help them with their study. Older students need to be more strategic in their technology use, and that’s similar with adult and community learning.
RH: We found some of our older learners quite self-deprecating. They didn’t appreciate how good they were at using the technology. One lady talked about how technology has transformed her learning – there are no boundaries for her now. She’s overtaken her children with regard to technology usage. But she said at the start, “I’m not a big technology user.” There may be an age difference, but older people may be self-deprecating.
(Q3: We found people from a workplace background were competent in their use of technology and they applied that to their study. They knew what they had to do, and they had tools to approach it. They were focused and driven.)
(Q4: We’re working with work-based learners. They’ll use the technologies you use in providing a learning experience for them, but in terms of social networking, they don’t.)
RS: It’s more about efficiency.
(Q4: Yes, it’s email and Google and that’s about it, really.)
RS: One more point – outlier or consensus. When you’re trying to get learners’ experience, which do you want? Sometimes we do want to look at the outliers, and we’re really interested in that – keeping up with the learners and institutions and having a flexible way of organising IT services. We want the students who know about things we’re not aware of.
RS: We have 15 to 20 minutes left, and in this last bit, we want you to imagine the impact of whatever it is you do on learner experience. In order to get you thinking about that and broadly about students’ lives, I’m going to run through some of the areas that we’ve been looking at in our programme of work. Hopefully that should spark off some ideas. We’re interested in the issues you’re looking at, and that will help us look for themes in our data.
Different projects have been looking at different research questions; the University of Edinburgh’s study, for example, has looked at access to technologies. When we look at usage of social networking, we want to know how students are using it – as a forum for discussion, transferring existing groups into Facebook, or setting up specific groups led by enthusiastic students. The use changes over time – when they first arrive at university, there’s a splurge, and it trails off after that. Thema produced a report for us about students’ use of social networking. Some students used it to stay in touch with people at home, or in organising meetings at university, but if you dig deeper there are different views – it’s a waste of time, I joined because everyone else did, it’s too addictive. We find a whole range of different views once we start to dig deeper.
Rob mentioned something about effective e-learners, and in Phase 1 we found that effective learners tend to be skilled networkers, using different technologies when needed. The Phase 2 projects have had difficult tasks unpicking what we mean by “effective e-learners”. E4L’s “proficient communicators” might be one definition; another project, LexDis, looks at those who are “agile technology users”, focusing on students with disabilities.
We’re finding that if we could characterise effective learners, they are using what they know and have to hand in ways to support their study. They use their knowledge and networks to enhance their environment. Some people find multi-tasking distracting. Some people set up alternative forums, so not the ones provided by the course, and some use less well-known technologies and software to help them.
If we go back to the themes I pointed out – access; preference; personalisation; beliefs and expectations; effective e-learners; social software; change and transition; specific learners and contexts; institutional level practices; course level practices – let’s talk now in groups about how what you do impacts on them.
RS: We’ve run over time, but will stay on for another 15 minutes to conclude the session. We’d like it if you could pass forward the papers you’ve been working on – we can look at the areas of learner experience you’ve pinpointed. There are two important points I want to finish on – change and transition. Things change. Learners change their skills as they move through their learning experience. There are lessons here for the jobs that we have. Learners are malleable, and we have a role to play in understanding more and finding ways to help them become more effective. We have to see this as a developmental process, and we have the opportunity to impact on that process.
These projects were funded to provide a window into experience, but also to develop research skills and to support the development of skills and strategies to find out what’s appropriate for students’ learning in the digital age.