Monday 20 February 2012

Narrative as an element of communication in cultural heritage organisations (by Paula Goodale)

After attending a few of the Information School’s monthly discussion group sessions, and generally being impressed by the talks given by fellow PhD researchers, I foolishly offered to talk about my own less well-developed PhD ideas. I hoped there might only be a small audience, it being the second Thursday in January (just after the Christmas break).  I was wrong, and there were staff too, including the head of my research group. I hoped that I would present my ideas with clarity and conviction and that questions would be benign – not a chance! I hoped I might get the killer insight or advice that has eluded me - instead I came away with more questions than answers, and confirmation (of what I already knew) that there is much work to do, ruthless focusing needed and that it will definitely NOT be easy. With these caveats in mind, I will now describe my fledgling PhD research project and try to summarise some of the points raised.

My core idea is related to the fact that narrative has long been a key element of communication in cultural heritage organisations, as a means of providing access to and interaction with collections, a device for sense-making and informal learning, and a way of approaching social policy goals such as community and inclusion. However, in digital projects (whether they are social media, web sites or libraries) this element of narrative is most often missing or under-developed, thereby losing a key opportunity to engage users. My goal is to look at why this element of narrative is missing online, and to explore ways of addressing this, firstly through the use of the PATHS system (which constitutes my day job), and then through other online media as appropriate. My research objectives (probably too broad) and methods are to:
  • Find out more about general users’ information behaviour, using survey and observational methods.
  • Establish the current state of play on the use of narrative, using desk research and interviews with museum personnel.
  • Explore which are the most promising opportunities for incorporating narrative into user engagement, using open-ended and creative tasks in different systems (including PATHS and one or more social media sites), both individually and collaboratively.
  • Evaluate the outcomes of the task-based activities giving consideration to qualitative measures such as satisfaction with the results, perceived outcomes, quality of the user experience, etc.
  • Analyse the content of the narratives produced via task-based activities to understand more about the nature of narratives produced by general end users compared with those produced by experts.
What is very clear from the comments that followed is that ‘narrative’ is a confusing and multi-faceted construct, and that a tight definition of my interpretation is needed. I’d been challenged on this previously by one of my supervisors and another staff member, so this was no surprise. Aligned with this is the question of whether narrative is data or method – my current thinking is that it might well be both, in that I am analysing the content of narratives, I’m interested in it as a phenomenon, and I’m also investigating ‘narrative inquiry’ as a research method.

Further discussions with fellow researchers last week at the second LIS DREaM (Developing Research Excellence & Methods) workshop in London (which I’ve blogged about here) provided greater clarity and perhaps a degree of focus. One angle that I am interested in following is the development of narrative use for engagement in cultural heritage over time, particularly the changes in ownership (and content) of narrative from experts to end-users, and onto more collaborative approaches to interpretation. I recognise that this is still a mammoth task, but hope to make it more PhD-sized following plenty of grilling by my supervisors, extensive deep thought (perhaps supported via rich pictures or some other mind-mapping technique), and occasional more serendipitous tangential conversational encounters.

Friday 10 February 2012

Surveying online survey tools (by Angharad Roberts)

Following on from Andrew's blogpost about online surveys, yesterday's session provided an excellent opportunity to discuss people's experiences of different online survey tools. I presented a document (which can be viewed below or here) describing a range of online tools, evaluated according to four criteria which seemed important to me: compliance, compatibility, clarity and cost.

Compliance relates in part to the important issue of data protection, raised at the end of last month's meeting. EU data protection laws say data shouldn't be transferred to countries outside the European Economic Area unless the destination country has equivalent laws. US laws are not regarded as providing the same level of protection (9 countries which do are listed here), but the US is home to some of the biggest data-storing companies, including Google and SurveyMonkey. There is something called the US-EU Safe Harbor framework, which enables US-based companies to show that their data protection procedures meet EU standards. Google complies with this, as do many big online survey companies. Another compliance issue is the accessibility of the survey - for example, does it work with the screen reader technology that may be used by people with visual impairments?
Compatibility relates to the options a survey tool provides for how data can be exported - can it be downloaded directly into SPSS, or would it be available as an Excel spreadsheet?
Clarity - for me, this mostly relates to question types and particularly so-called skip logic questions. I have a number of different potential target audiences and although I want to ask most of them the same questions, there are some questions I only want to ask one particular group. Skip logic allows for different pathways through the same survey, depending on answers to particular questions.
Cost - there are lots of free versions of survey tools, but these often have very limited functionality. I've set these out in the limitations column of my document. For example, SurveyMonkey allows just 10 questions and 100 responses in each free survey. Export options may be limited in some free tools as well. So I may need a paid-for survey tool, which means it's helpful to know what the range of potential costs could be, including potential discounts for academic / research use (SurveyGizmo offers a free student account, but badges these surveys with a SurveyGizmo student research logo - which may not be the image I want to project). It also raises the question: are there survey tools in use within the department which it might be useful to know about?
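The skip-logic idea described above can be sketched in a few lines of code. This is a minimal illustration only, not how any particular survey tool implements it; the question IDs and the "librarian" branch are entirely hypothetical:

```python
# Sketch of skip logic: the next question shown depends on an
# earlier answer, so different audiences take different pathways
# through the same survey. All IDs and branches are hypothetical.

def next_question(current_id, answer):
    """Return the ID of the next question to show, or None at the end."""
    # Q1 is the audience question; only one group sees the Q2 branch.
    if current_id == "Q1":
        if answer == "librarian":
            return "Q2"   # branch asked only of this target group
        return "Q3"       # all other respondents skip ahead
    # Otherwise proceed linearly through the remaining questions.
    order = ["Q1", "Q2", "Q3", "Q4"]
    i = order.index(current_id)
    return order[i + 1] if i + 1 < len(order) else None

# A librarian is routed to Q2; everyone else jumps straight to Q3.
print(next_question("Q1", "librarian"))  # Q2
print(next_question("Q1", "student"))    # Q3
```

Real tools express the same idea through a branching rule attached to each question rather than code, but the underlying routing is as simple as this.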

This was followed by a very valuable group discussion about some of these issues. In response to a question from Alex Schauer, I clarified that most of these survey tools allowed for surveys in "all languages" or 40+ languages. I'm not sure whether these terms are used interchangeably; no survey tool appeared to list more than 59 supported languages, and these figures seem to be related to the Unicode standards for supported language character sets. Bristol Online Surveys only supports 10 languages in addition to English (at extra cost); the free version of QuestionPro has no multi-language support (a point omitted from the version of the document I presented yesterday, but included in the copy linked to from this post). Barbara described problematic experiences with attempting to export data from survey tools, and suggested experimenting with the export process before choosing a tool to run the actual survey. Liz Chapman described the different approach to data protection displayed in one recent US survey ("we can't promise anything about your data...") and some of the limitations of SurveyMonkey, which may be addressed in one of the more expensive subscription versions. Mark Hall talked about his experiences of using Lime Survey, and the exciting prospect of an in-house survey tool, developed in the iSchool, which would potentially give iSchool researchers complete control of their survey data. Paula also described some of the more powerful features offered by QuestionPro.

You can view, download and print the summary document here or view this on Scribd:

Online Survey Tools - summary sheet

Thursday 9 February 2012

Surveying Universities: A Modest Proposal

Apologies to anyone looking for biting Swiftian satire, but this genuinely is a modest proposal - though on the subject of surveying universities rather than on the merits of babies as a food source.

The background
I am drawing near to the end of a project to look at the information behaviour of students at various stages of education; beginning with Key Stage 3 (11 to 14 year olds), and going all the way up to postgraduates.

A key part of the project involved surveying universities in the Midlands and the North of England.  Between us, my colleague (Mary Crowder) and I approached 12 universities with a view to asking them to circulate our survey amongst students and staff.  Responses to our request varied.  All too often, however, we got one of two answers.
1) We were told that there was nobody in particular responsible for posting surveys and that we could try Computing Services, Students' Union, Marketing, Student Administration, or various local equivalents; or
2) They already had numerous questionnaires generated by their own staff and students and were concerned that people would get survey fatigue. 

In the end, we got responses from students and staff at five universities.  As an inducement, we offered to enter respondents into a prize draw, with the opportunity to win a £50 Amazon voucher.  At each university, two vouchers were offered to students, and one to lecturers.  The project therefore paid £750 in prizes.  Since the response rate at one university was very low, some students and staff had an extremely high chance of winning.

The modest proposal
I presume that we are not alone in wishing to learn about the views of university staff and students across the UK.  We are also not the only project to offer the inducement of a prize draw.  I suggest therefore, that UK research councils with an interest in educational research should consider setting up a central site on which RCUK-funded researchers can post surveys.  Completion of a survey would qualify a student to enter a draw with prizes provided from Research Council funds.  To enter the site, it would be necessary to log on with an .ac.uk email address.

So far (according to the blogger statistics) this blog has been viewed 3000 times by readers in 16 countries.  If anyone has knowledge of such a scheme within their country, or can suggest ways to elicit opinions of students and staff across Higher Education, I would very much appreciate receiving their comments on this subject.