|Version 13 (modified by mastermanl, 6 years ago) (diff)|
Phoebe Phase 2 Evaluation Plan
- Document history
- Research questions for the Phase 2 evaluation
- Specific questions to ask (stemming from the research questions)
- Evaluators for Phase 2
- Evaluation methods
- Phase 2 evaluation schedule
Document history
- In progress: 03/10/07
Research questions for the Phase 2 evaluation
Phase 1 focused on evaluating Phoebe as a proof-of-concept tool; this time we are testing it in more authentic environments, in order to address the following questions (adapted from Phase 1):
- Is Phoebe a tool that practitioners in post-compulsory education find usable, helpful and relevant to their needs, whether they are
a) beginning or experienced teachers looking to use technology in their teaching for the first time, or
b) familiar with e-learning but looking for new ideas re technology and pedagogy?
Specifically, does it encourage practitioners to think about their practice in a structured way?
- Can one tool address all sectors of post-16 education, or are separate tools required for, say, ACL, WBL and FE versus HE?
- Is Phoebe suitable as a tool for teacher education and/or a tool for supporting everyday practice?
- Is there a perceived need for Phoebe to be customised to meet the needs of individual organisations (i.e. to function as a community-owned artefact)?
Specific questions to ask (stemming from the research questions)
Key aspects of usability
- Ease of learning to use
- What "getting started" guidance is needed?
- Ease of use once learned
- Does Phoebe work?
- What does Phoebe do well?
- What does Phoebe do badly?
- Productivity/usefulness/effectiveness (task accomplishment) (also covered in practice-related issues below)
- Affective response (enjoy using it/choose to use it/recommend to others)
- Sustainability? (Not just new features, but also projections re effort likely to be involved in maintaining currency of content and examples.)
Integration into individual pedagogic practice
- Mapping to their established practice (processes, functionality)
- Usefulness at different levels: course/scheme of work planning vs topic planning vs session planning
- Relevance to their established practice (content) (focus on generic vs examples vs institution-specific examples)
- Extent to which they feel they have learned something new/been inspired to experiment/innovate in their own practice
- Relationship between their use of Phoebe and their continued use of existing tools: Do practitioners combine the use of Phoebe's planning functionality with their familiar tools?
- Would practitioners want to use Phoebe in their normal work?
- If yes, what role would Phoebe play?
- If no, why not and what, if anything, would induce them to use it again?
- Do they consider Phoebe to be useful to others? At what stage of career/at what level of experience wrt e-learning?
Integration into pedagogic practice of the institution (community dimension)
- Staff trainers: does it map to the models of pedagogy which they are trying to communicate to their trainees?
- If no, how much effort is required to make it map? Could simple customisation options (e.g. ability to edit "default" guidance, add own case studies and examples, components of plan, terminology?) work, or would they need a totally customised tool?
- Can the tool be used by practitioners working alone or does it work best in an institutional setting, as a community artefact?
- To what extent does Phoebe facilitate or, conversely, impede established practice relating to session planning in the user's institution?
Evaluators for Phase 2
- E-learning experts:
- Teaching staff who are already familiar with the use of technology in their teaching/learning
- Staff developers with responsibility for promoting e-learning, and tutors in ITT who are actively promoting e-learning
- IT support officers and learning technologists who are responsible for disseminating the use of technology in their institution/department and/or are involved in specific 'projects' (e.g. creating an image bank; redesigning a F2F course to run in a blended or fully online environment)
- E-learning novices:
- Teachers undergoing CPD aimed at introducing them to e-learning
- Teaching staff who are new to technology but not participating in a CPD e-learning programme
- Students undergoing ITT (NB they may or may not be experienced in using digital technologies)
Possible sources of evaluators:
NB This is just a list of 'possibles': none of these groups has yet been approached.
- Members of JISC Pedagogy Experts group
- Selected participants from the Learning Design Tools project (mailing list is still open)
- PIs and/or their colleagues (esp. where originally approached to take part in 'embedding' workshops)
- Serendipitous contacts, including those who find out about Phoebe through third-party connections or Web searches (see also "Friends of Phoebe" and list of people who contacted us for user accounts in Phase 1)
- Participants at the University of Greenwich staff conference (September 2007) who attended a demonstration of Phoebe and expressed interest in a further workshop
We envisage recruiting only a small minority of evaluators from the D4L programme itself as we feel that the evaluation will be more fruitful (and objective?) if evaluators are from outside the D4L "community."
Evaluation methods
We will use methods that focus on capturing a rich set of qualitative data from a small group of (experienced) people.
The proposed methods are shown in the following table:
|Instrument||Subjects||Purpose||Qual/Quant data?||Notes & issues|
|Heuristic evaluation||Project team||Optimise usability of interface so as to eradicate unwanted factors in evaluation of pedagogical aspects of tools.||Qual + Quant||Base on user-centred and learner-centred principles; modify, extend and/or replace the Nielsen and Sharples & Beale guidelines. (See other lit on UI design in e-learning, incl. Hedberg et al. paper.)|
|Observation, optionally with recorded think-alouds or short semi-structured interviews||Evaluators||Affective responses to experience; Usability problems||Qual||Feed into case studies|
|Semi-structured group discussions||Evaluators||Affective responses to experience (reflective)||Qual||Feed into case studies|
|Session plans created during workshops?||Evaluators||Capture examples of output - components used in different plans, structures (ordering of components)||Qual||Feed into case studies|
|Questionnaires||Evaluators||Affective responses to Phoebe; other attitudinal data||Qual + Quant||Draw for inspiration from questionnaires written for LAMS evaluation, LD Tools project; feed into case studies?|
|Webserver request logs||Evaluators||Investigate users' behaviour re the planning and reference aspects of Phoebe||Quant||From Dave B. 28/08/07: "with the guidance and functionality running as independent websites, correlating users in their logs will be necessary if we wish to examine patterns connecting the two parts of Phoebe."|
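The log correlation described in the last row could be sketched roughly as follows. This is a minimal illustration, not project code: it assumes the two sites write standard Apache "combined" format access logs, and it uses the IP address plus user-agent string as a crude visitor key (file names and the matching heuristic are assumptions for the sake of the example; a shared cookie or login ID would be more reliable if available).

```python
import re
from collections import defaultdict

# Assumed Apache "combined" log format, e.g.:
# 1.2.3.4 - - [28/Aug/2007:10:00:00 +0100] "GET /guidance HTTP/1.1" 200 100 "-" "Mozilla/5.0"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" \d+ \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def visitors(log_path):
    """Map a crude visitor key (IP + user agent) to the set of paths requested."""
    seen = defaultdict(set)
    with open(log_path) as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if m:
                seen[(m.group("ip"), m.group("agent"))].add(m.group("path"))
    return seen

def correlate(guidance_log, planner_log):
    """Visitor keys seen in BOTH logs, i.e. users of both parts of Phoebe."""
    g, p = visitors(guidance_log), visitors(planner_log)
    return {key: (g[key], p[key]) for key in g.keys() & p.keys()}
```

In practice the IP/user-agent pairing can both over-merge (shared institutional proxies) and under-merge (dynamic IPs), so the counts would only indicate broad patterns of movement between the guidance and planning sites.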
Phase 2 evaluation schedule
The Phase 2 project plan identifies five key 'events' in the evaluation schedule: workshops involving trainee teachers, in-service training (CPD), a workshop for experienced e-learners, and workshops to evaluate the customisation features of Phoebe and its support for collaborative planning. On reflection, some of these can be combined (e.g. experienced e-learners and customisation) or may prove infeasible to run as "live" workshops (e.g. collaborative planning). Following consultation with Glenaffric, we now also propose to capture feedback from practitioners who ask us for a Phoebe account (30+ in Phase 1) through an online survey. However, we have yet to ascertain the likely value of such data in relation to the "live" face-to-face events.
|Timing||Event||Factor to Evaluate||Questions to Address||Method(s)||Measure of Success|
|Oct-Nov 2007||Internal usability tests||UI and functionality of prototype tool||Consistency, usability, error-free functioning||Walkthrough by project team members||Number of issues identified and resolved|
|16 Nov 2007; Jan 2008||In-service training: staff development workshops at Oxford Brookes University||Usability; integration into personal pedagogic practice; integration into institutional practice|| ||Observation, discussion, session plans, questionnaires (? to capture current practice?), Web request logs||Levels of satisfaction and acceptability expressed by participants|