JISC Developing Digital Literacies – Evaluation REALITY-CHECK Task

The questions below are intended as a reality-check exercise, for us to use as a project team in clarifying and ratifying our evaluation design, and for us to review so we can provide effective support and valuable sets of evidence across the programme. Reflection on where we are with evaluation should also help us all prepare for a productive webinar on Friday 13th April, and your responses can be updated and used in your interim report.

Q1. Briefly describe your approach to evaluating your DL project. * What’s needed here is not a lot of detail or tables, just an outline of your overall methodology/design and/or the stage you feel you are at.

The early aim of the DIAL project was to run small, defined mini projects within a wider DIAL project/programme. Our approach was therefore to evaluate each pilot project group individually at the beginning of its project. This early, localised evaluation helped to define the mini projects’ aims and objectives and improve the overall focus of the DIAL project. Five pilot DIAL projects were identified at the beginning of the project. Follow-up evaluation meetings are planned for all the mini projects in May/June 2012.

Q2. In what ways have you used the DL synthesis framework or evaluation templates offered by the programme, and/or created your own tools/approaches? * For example, you may have developed your evaluation design or plan in each of the six focus areas (strategy/policy, infrastructure, support, practice, expertise, stakeholders), some areas may be more relevant to your project and its evaluation, or you may have extended or adapted these tools or those from other programmes, institutional audits, skills/competence/graduate frameworks, etc..

The DIAL project employed Duna Sabri, a freelance evaluator, who consulted with project members to develop a framework of questions to structure the evaluation process. The following factors were taken into consideration:

1. Each of the community contacts is likely to be doing their own ongoing formative evaluation, drawing on JISC instruments, doing both systematic data collection and picking up evaluative comments informally.

2. Duna compiled an overview of data: first by setting up meetings with the five project group leaders, and second by drawing together the data they collected, sometime in June or July, for formative purposes. The purpose of this overview will be to see the differences as well as the consistencies between the projects; each will be evaluated according to its own aims. Among the questions Duna asked were:

– what are you hoping to achieve through participation in DIAL?
– how did you come to decide to take part?
– what approach are you planning to take in evaluating how your project goes?
– have you seen the JISC tools?  If so, how useful/appropriate were they?  Do you need help adapting them?
– what sort of evaluation data would you like to have by the end of the year?

Q3. What are your ‘big picture’ questions in relation to your intended changes/enhancements? * You should refer to your project aims, objectives & impact descriptions in your most recent project plan. Clarify what and who are you aiming to change or enhance? You may need to distinguish questions for your different stakeholders (who is most interested/concerned about these areas?).

  • How is the project engaging with students?
  • How is the project engaging with industry?
  • How to evaluate impact on staff development?
  • How do we measure/evaluate big picture institutional and cultural change?

Q4. How far along are you in identifying what ‘success’ looks like in relation to those intended outcomes? * Please say where you are with creating/sharing a clear set of key indicators and impact measures at interim and final stages. For instance, what kinds of early evidence of progress towards outcomes do you expect and have plans to gather; what processes do you have in place to capture anecdotal/unintended outcomes (blogs, user diaries, observation, focus groups, note taking at stakeholder events), and how are you engaging with stakeholders to review & respond and come to conclusions about emerging findings, benefits and impact measures that are interesting and convincing to them?

Institutional baseline

Following the JISC baseline, the DIAL project board has agreed to support the development of an official institutional baseline report for its digital provision across all colleges and departments.

Surveys and focus groups

We’re currently running an online DIAL survey. The survey was circulated after the JISC baseline submission, given our late project start. It was promoted to staff only and has received over 120 responses. The responses are very detailed and have produced valuable data – http://tinyurl.com/dial-survey

Six students in six colleges. The DIAL project is piloting different feedback methods:

Student field worker pilot: We’ve used this method to gain ‘sound bite’ feedback from interviews with UAL staff and students in all six colleges.

Six students were employed, one from each college, through our in-house employment agency artstemps. Each student was asked to spend at least 4 hours interviewing at least 6 individuals (staff or students) in their college.

They were asked to note the time and place of each interview (café, canteen, studio, media suite, social spaces). The students were each given the questions below as guidance; they were not required to ask all of the questions, the interviewee could choose which question(s) they would like to answer, and students could create new questions specific to their college if necessary.

  • Where and how do students/staff want to access information?
  • What equipment and skills do staff/students need to deliver blended, social and mobile learning?
  • What online digital resources would you like to use?
  • Would you like to engage with tutors online and see online resources?
  • Could students create online learning resources, if so how?
  • What support do students and staff need to help create resources?
  • Should UAL evaluate digital skill levels of staff and students?
  • Which digital skill sets are important to you?
  • Do you feel connected: to other UAL colleges and individuals?
  • Do you feel connected: to industry and professionals in your specialism?
  • How can UAL improve online participation and networks?

Focus Groups & workshops





Q5. How well do you feel your evaluation activities integrate with other project activities and gather data about uptake and engagement levels? * E.g. Do you have workpackage tasks that detail specific evaluation activities, including the above as well as other impact measures such as usage/participation/satisfaction data. Does your timeline show clearly the linkages/opportunities to integrate evaluation with baselining, development work, implementation/pilots, stakeholder consultation/dissemination etc. ?  

DIAL project or programme?

DIAL was always going to be bigger than a single project. By acknowledging this and looking at DIAL as a potential UAL programme, we can better build a case for developing a UAL-wide digital strategy and sustainability plans to develop and maintain progressive digital practice at UAL. DIAL will therefore run as a programme and do its best to acknowledge as wide a spectrum of issues as possible, although it cannot address everything. The DIAL project groups are trialling different methods of developing and documenting their projects. We are developing a ‘Work package brief’ template with some of the pilot project groups in order to improve communication and alignment with the wider programme aims and objectives.

Summary of DIAL group projects activities to date:

  1. Open education at UAL – three more discussion groups to come at WCA, CCA, and LCC. The overall aim is to produce a handbook (a need identified by the stakeholders), ‘Getting started/how to do Open Education at UAL’, including teaching resources, individual and institutional recommendations for the future, and an open education sustainability plan for UAL. All up-to-date info is here – http://dial.myblog.arts.ac.uk/category/open-education/ We have set up a content/resource cluster group: http://process.arts.ac.uk/category/project-groups/open-and-flexible-learning
  2. The Presentation skills group. All up-to-date info is here – https://docs.google.com/document/d/1qVtyAbhABw2b2QoP7eDutiqOEoI0hcBCBpJo097IWjw/edit
    We have set up a content/resource cluster group (Laura will be posting soon) – http://process.arts.ac.uk/category/project-groups/presentation-skills
  3. The Information literacies group – just starting their workshops with staff; students are supporting staff in the project development (they will be setting up a blog for their resources soon). All up-to-date info is here – https://docs.google.com/document/d/1rtSSx2o4CUkIxdNkWxdMTn9j5QDK8cx058sGqlmzUgE/edit
  4. The Online Reflective Practice group is beginning to take shape, and Lindsay is due to post some updates before the end of the month. We have set up a content/resource cluster group – http://process.arts.ac.uk/category/project-groups/cltad-teaching-development-projects
  5. We also have some new groups forming:
    Developing your online professional identities – https://docs.google.com/document/d/1SbfigfrUYanJjEyZ8_zCpggW3w56ba3wu-XkAaV7wW0/edit – We have set up a content/resource cluster group – http://process.arts.ac.uk/category/project-groups/enterprise-and-employability-curriculum
    Two successful 2012 Teaching & Professional Fellowship Awards – both projects are aligned closely with DIAL’s objectives and we look forward to working together – http://dial.myblog.arts.ac.uk/2012/04/12/teaching-professional-fellowship-awards-2012/
    UAL Community of Practice (CoP) awards – The Learning Studio – http://process.arts.ac.uk/category/project-groups/learning-studio and Drupal UAL – http://process.arts.ac.uk/category/project-groups/drupal-ual

Some initial (very rough) ideas on the common features that Duna, our project evaluator, has observed in the five pilot projects:

1. The aspirations for each of the projects have their roots in group leaders’ reflections on their own roles and emerge from long-standing challenges that they want to tackle but perhaps, until now, have not had the time or resources to attend to. These are issues that can’t be dealt with in a piecemeal, day-to-day way but benefit from the injection of attention and resource that DIAL brings.

2. There’s a substantial affective dimension to almost all the projects that is related to the particular kind of digital literacy that they seek to develop.  The relationship between the affective and the technological varies from project to project.  For example, Lindsay’s online reflective practice project is dealing with teachers’ fear of learning in public, the open and flexible education project is dealing with the discomfort of making curricular resources public, and Laura’s digital presentation skills project is dealing with anxieties relating to presenting oneself.

3. All of the projects are aiming to make some use of the relationships between different members of their target communities.  Some plan to get more literate/confident members to work with less literate/confident members, others aim to bring together opposing views, staff working on similar courses from different colleges, or students from different cohorts.  The configurations vary depending on how the projects define their particular hurdles.

Q6. What further support do you feel you need or would benefit from in relation to designing, planning out or undertaking your project’s evaluation? * For instance, preference for wiki resources/toolkit, webinars, cluster sessions, individual skype conferences, identifying external evaluators.

More group debates and workshops. Nicely presented, easy-viewing case studies would be good, including videos.

Q7. Are there any other points you’d like to raise about evaluating your project? Anything that hasn’t come up in these questions.


Q8. Finally, reading through your answers above, what are the critical elements you feel you may need to pay closer attention to? (I suggest you put some ideas down now, and revisit this as an action point after the webinar.) * For instance, you might wish to adjust the scope or focus/priorities of your evaluation questions, review evaluation milestones or timeframes for identifying/gathering different kinds of quantitative and qualitative evidence, consider alternative methodologies & data gathering or analysis needs, align evaluation with other stakeholder engagement activities as opportunities to ask about types of evidence that would be relevant and convincing to them, discuss early findings, review project effectiveness & communication, consult on emerging outcomes and final impact measures).

I would like to get feedback on, and plan for, the additional big-picture overall project/programme evaluation, e.g. are we doing enough, and how best can we prepare for the next phase of the project and prepare an adapted evaluation plan in advance?




1 Response to JISC Developing Digital Literacies – Evaluation REALITY-CHECK Task

  1. My Notes:
    • Invite the rector and senior managers to comment on and add posts to the DIAL blog
    • Indicate/list which committees we are involved in
    • What evidence do we have?
    • Conversations with senior management – add to the evaluation plan
    • Evaluation at two levels
    • Capture change – it’s not just about tools; they will change and people will use them anyway
    • Agile evaluation
    • Roles changing and transitioning through digital impact
    • Indicators of awareness

    Strand sites, links / evaluation designs to note:

    InStePP Evaluation Plan on the JISC wiki – https://jiscsupport-developingdigitalliteracies.pbworks.com/w/page/52729010/InStePP%20Evaluation%20Plan

    Project evaluation: evidence, change and reflections on a webinar – http://diglitpga.jiscinvolve.org/wp/2012/04/13/evaluation-webinar/
