Session 1: Community engagement and a typology of barriers

Audio from the session
To download the MP3 click here

Alex Voss (e-Uptake project), National Centre for e-Social Science.

In this session the e-Uptake project, from the JISC User Engagement programme, looks at current barriers to the uptake of e-Infrastructure in academia. This project is one of three working closely together to get to grips with the issues.
The e-Uptake project takes an in-depth, multi-layered approach to handle such a complex field of work.
There are many ways to tackle the technical issues involved, which are described in this session.

But how far can you go in describing the issues without getting stuck in the human aspects of cultural change?

The session was split into four parts:

  • Literature and fieldwork review
  • A typology of barriers
  • Looking at e-infrastructures and understanding what they are
  • Scaling up to a wider context

Key points raised were that:

  • More collaboration is needed – between designers and users, and between projects and programmes
  • Sustainability issues need to be addressed
  • Ways of mapping community engagement are needed – e.g. what different researchers are doing – to facilitate collaboration

The full PowerPoint is available at the end.

Introduction to community engagement in the e-infrastructure project

These three projects collaborate to form a community engagement project.

  • eIUS – looks at how people use e-infrastructure
  • e-Uptake – looks at barriers and enablers to uptake; the project’s remit is to widen uptake of e-research across all disciplines
  • Engage – addresses the specific problems that people are facing

Literature review and fieldwork

Issues identified in the literature

Some key points were (more details are available in the PowerPoint presentation):

  • What constitutes e-infrastructure? How do people see it?
  • Data
  • Who is the e-science community?
  • Funding: Attracting funding for multi-disciplinary research
  • Ethical and policy issues associated with e-research
  • Legal issues
  • Managing local autonomy
  • Measuring success of research and recording it

e-uptake project

The e-Uptake project research set out to find out more about the barriers to the uptake of e-research. Fifty interviews were conducted to build a body of evidence and a typology of findings, with the aim of eventually making these findings available online. There was also an analysis of training requirements. They found:

  • A clear need for education in using e-research
  • More training is needed
  • Interventions need to be tailored to the community

Q and A

Shirley Williams – University of Reading

Are you looking to get communities or individuals engaged?

We are interviewing individuals as representatives of scientific communities.

Mark Baker – University of Reading

What community will be addressed in future to obtain a representative sample? Don’t we need to ask completely unbiased communities?

Snowballing – people will be asked to recommend others who haven’t been involved in e-infrastructure.

The question remained: is this unbiased enough? These people still have a connection to e-research. Is there no unbiased community that can be used?

Julian Beckton – University of Lincoln

Did the pie chart categories emerge from the data or were they pre-imposed?

They came from the data.

What were people asked about in terms of e-infrastructure? What definition was used?

Advanced tools. However, “advanced” means different things to different people. The conversational nature of the interviews means the interviewer can guide interviewees away from generic examples such as Excel and the web.

The Next steps

What is the e-uptake project doing with the data? The three community engagement projects are already interacting. The next steps are:

  • Interviews will be transcribed, and segments of speech will be coded in terms of barriers to uptake.
  • The aim is a mineable and sustainable research resource.
  • SQUAD is being considered for the rest of the project.
  • Advanced tools are being looked at – to practise what they preach!
  • The process is being reviewed – comments are welcome.

Q and A

You said the interviews were “conversational”. What were the interviewers’ instructions in terms of structure?

Each interview was semi-structured. We asked interviewees whether and how they use certain JISC services, and asked them to talk about barriers at certain stages and how those barriers were overcome.

Are the questions available?


Did you reflect your findings back to interviewees?

We are not at the stage where we can do that systematically yet. We hope to follow up with some interviewees who face specific barriers. Feedback of the body of findings hasn’t happened yet – the project isn’t finished. The repository will be somewhere for people to receive feedback.

Is this a missed opportunity? Isn’t going back to the people already contacted a way of engaging them more, as the project aims to do?

Neil Chue Hong from Engage responded: this feedback is happening in the Engage project, which is trying to engage directly with individuals. Work is shared – what would be a bias in the e-Uptake project can be used by the Engage project.

Andy Jordan: Confidentiality. Will the transcripts be anonymised?

Yes – university and project names will be removed to protect identity. This is a limitation, but eIUS can take individual cases up. That is part of working together.

Yvonne Howard: Marking up interviews using SQUAD – how long did an hour’s worth of transcript take to code?

About 1.5 hours. However, it will take longer in the analysis process.

Martin Edney, Durham: Will any weighting be applied? E.g. whether a barrier was identified strongly or to a lesser degree?

Not yet. People have mentioned this in interviews. It was agreed that this is important and will be looked into.

Fostering infrastructures

  • How to embed e-research practices.
  • E-infrastructures are fostered, not built.
  • Changing social infrastructures requires interventions not traditionally associated with engineering and design.
  • E-infrastructure requires social and technical support to be sustained.

Scaling up to a wider context

Infrastructures should be fostered, rather than taking the approach of “build it and they will come”.

Users can use technology in all sorts of ways that designers never intended and researchers need to pick up on this.

Mutual learning is needed. Designers and users need to learn from each other and engage meaningfully on the development of technology.

How can collaboration between designers and users be achieved?

  • Training
  • Boundary spanning
  • Facilitation
  • Shared practice

Paths to adoption

The challenge is to find paths to adoption that bridge general tools (e.g. Web 2.0) and specific tools (e.g. bespoke functionality). To do this we need a triage service – to put users in touch with the relevant person at the relevant time. The engagement has to be continuous to maintain interest.

Community

We need a baseline idea of what the “community” is. Collecting information about e-science projects, and mapping them, would be helpful.