Evaluation of Phoebe
Phoebe Evaluation Plan 25/08/06
- Aims of project (a recap)
- Concept underlying Phoebe and its rationale (a recap)
- Research questions
- Specific questions to ask (stemming from RQs)
- Evaluation techniques
- Evaluation schedule
(Subject to ongoing revision)
Aims of project (a recap)
- Develop a prototype online planning tool to guide practitioners working in post-16 and higher education in designing effective and pedagogically sound learning activities.
- User-test the planning tool for functionality and usability.
- Investigate the feasibility of further development and the integration of the planning tool into pedagogic practice by:
a) Linking the planning tool to specific guidance, models of practice, case studies, learning designs and other appropriate support material;
b) Embedding use of the planning tool into specific contexts for piloting and evaluation, e.g. continuing professional development, initial teacher training
Concept underlying Phoebe and its rationale (a recap)
Basic concept: An online planning tool to guide practitioners working in post-16 and higher education in designing effective and pedagogically sound learning activities.
(Do we need to elaborate on what we mean by 'pedagogically sound'?)
Specific instantiation of that concept: A tool that propagates the principles of effective practice to as wide an audience as possible, by allowing them to develop new pedagogical approaches while still using the planning tools with which they are familiar.
Rationale: We believe that successful innovations in IT reflect, and build on, the ways in which users actually work, rather than requiring them to adapt their practices. A planning tool such as Phoebe should therefore take as its starting-point the tools and processes in current use. By meeting practitioners on their 'home ground', we can then introduce them to new, more effective tools and processes, and thereby lead them to adopt the emergent technologies where these are appropriate to their situations. While acknowledging the power and potential of the new generation of learning design tools, we note that a) they constitute only one of a repertoire of tools at teachers' disposal and b) they still have only a limited user base, while a large number of practitioners need assistance in getting started with e-learning.
Research questions
- Is Phoebe, as an embodiment of the concept outlined above, a tool that practitioners in post-compulsory education find usable, helpful and relevant to their needs, whether they are a) beginning or experienced teachers and b) using technology in their teaching for the first time or familiar with e-learning but looking for new ideas re technology and pedagogy?
- Can one tool address all sectors of post-16 education, or are separate tools required for, say, ACL and WBL, or FE versus HE? (Reasons: the different nature of planning in these sectors; differences in terminology and linguistic style.)
- Is Phoebe suitable as a tool for teacher education and/or a tool for supporting everyday practice?
- What additional features and functionality are required to turn Phoebe from a PoC prototype into a tool for general use?
- What are the potential issues of sustainability and how might these be resolved?
Critical point: the brief is to design a proof-of-concept tool, which means that the vision underlying the design is on trial as much as the design itself. Of course, all designs embody their underlying vision, but in a PoC tool that vision is necessarily much larger than its embodiment, and so we have to project this to our evaluators.
Specific questions to ask (stemming from RQs)
Note: AT: Xxxxx refers to a question derived from the Activity Theory analysis of the LD Tools data (section 4.3.2 of the report).
Key aspects of usability
- Ease of learning to use
- Ease of use once learned
- Productivity/usefulness/effectiveness (task accomplishment) (covered in practice-related issues below)
- Affective response (enjoy using it/choose to use it/recommend to others)
- Sustainability? (Not just new features, but also projections re effort likely to be involved in maintaining currency of content and examples.)
Integration into individual pedagogic practice
- Mapping to their established practice (processes, functionality)
- Relevance to their established practice (content) (focus on generic vs examples vs institution-specific examples)
- Extent to which they feel they have learned something new/been inspired to experiment/innovate in their own practice
- Relationship between their use of Phoebe and their continued use of existing tools: Do practitioners use Phoebe's notebook to start creating their plans in Phoebe, or do they use it purely for reference? (AT: Subject, Tools)
- Use again in own practice? If no, why not? What, if anything, would?
- Useful to others? At what stage of career/level of experience wrt e-learning?
Integration into pedagogic practice of the institution (community dimension)
- Staff trainers: does it map to the models of pedagogy which they are trying to communicate to their trainees? (AT: Rules) If no, how much effort is required to make it map? Could simple customisation options (e.g. ability to edit "default" guidance, add own case studies and examples, components of plan, terminology?) work, or would they need a totally customised tool?
- Can the tool be used by practitioners working alone or does it work best in an institutional setting, as a community artefact? (AT: Subject, Community)
- To what extent does Phoebe facilitate or, conversely, impede established practice relating to session planning in the user's institution? (AT: Rules)
Proof of concept
- Do evaluators endorse the underlying concept?
- Does the tool operationalise the concept?
- What would be useful/desirable/essential for its (further) acceptance? (Give list of possible features + invite user to suggest others)
We envisage that three groups of users will be represented:
- TS: The 'broad church' of teaching staff
- TS-F: Teaching staff who are already familiar with the use of technology in their teaching/learning
- TS-N: Teaching staff who are new to technology
- TD: Teaching staff responsible for the dissemination of technology in pedagogic practice:
- TD-T: Those with teacher-training or staff-development roles
- TD-S: IT support officers and learning technologists who are responsible for disseminating the use of technology in their institution/department and/or are involved in specific 'projects' (e.g. creating an image bank; redesigning a F2F course to run in a blended or fully online environment)
- PDT: Teachers and students undergoing CPD and ITT respectively (NB It is not clear at present how this group might be involved, since we have dropped the proposal (at the meeting with JISC and Glenaffric, 11/08/06) to evaluate embedding the planner in training contexts through workshops with trainees. However, if they are comparatively IT-capable, they might be invited to the main evaluation workshops.)
The PoC evaluation requires a specific type of evaluator: one who can see past the present limitations/imperfections of the tool to the tool as it might be: i.e. the tool is not the object of the evaluation; rather, it serves to mediate the evaluator's mental projections. In this respect, novice users of e-learning technology may not be the most appropriate evaluators: as Rogers and Scaife suggest in their original work with informants, it can be more productive to work with people who are a few stages above the target audience as they can recall, and reflect critically on, their own experiences as novices. [Note: Veronica @ Glenaffric has a questionnaire that might be useful for this purpose.]
Possible sources for recruitment of expert evaluators
NB This is just a list of 'possibles': none of these groups has yet been approached.
- Members of JISC Pedagogy Experts group
- Selected participants in LD Tools project
- Colleagues of PIs (esp. where originally approached to take part in "embedding" workshops)
Evaluation techniques
We will use techniques that focus on capturing a rich set of qualitative data from a small group of (experienced) people. The proposed techniques are shown in the following table.
|Instrument||Subjects||Purpose||Qual/Quant data?||Notes & issues|
|Questionnaires||All||Affective responses to Phoebe; other attitudinal data||Qual + Quant|| |
|Observation, optionally with recorded think-alouds||PIs||Affective responses to experience; usability problems||Qual|| |
|Semi-structured group discussions||Teachers, Learners||Affective responses to experience (reflective)||Qual|| |
|Session plans created during workshops?||TS; TD||????||Qual|| |
|Usage data||TS; TD||Determine patterns of usage during workshops||Quant|| |
|Heuristic evaluation||Project team; PIs||Optimise usability of the interface so as to eliminate unwanted factors in the evaluation of the pedagogical aspects of the tool||Qual + Quant||Base on user-centred and learner-centred principles; modify, extend and/or replace the Nielsen and Sharples & Beale guidelines. (See other lit. on UI design in e-learning, incl. the Hedberg et al. paper.)|
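As an illustration of how the quantitative "usage data" strand might be analysed, the sketch below tallies page views from a workshop usage log. The log format, column names ("timestamp", "user", "page") and page names are purely illustrative assumptions; Phoebe's actual logging format is not specified here.

```python
import csv
import io
from collections import Counter

# Hypothetical usage log: one row per page view recorded during a workshop.
# The columns and page names are assumptions for illustration only.
SAMPLE_LOG = """timestamp,user,page
2006-11-02T10:01,ts01,notebook
2006-11-02T10:03,ts01,guidance/assessment
2006-11-02T10:04,ts02,notebook
2006-11-02T10:07,ts02,notebook
2006-11-02T10:09,td01,guidance/tools
"""

def usage_by_page(log_text: str) -> Counter:
    """Tally total page views per page, to show which parts of the
    tool were used most heavily during a workshop session."""
    reader = csv.DictReader(io.StringIO(log_text))
    return Counter(row["page"] for row in reader)

def users_per_page(log_text: str) -> dict:
    """Count distinct users per page: distinguishes one heavy user
    from broad uptake across the group."""
    seen = {}
    for row in csv.DictReader(io.StringIO(log_text)):
        seen.setdefault(row["page"], set()).add(row["user"])
    return {page: len(users) for page, users in seen.items()}

if __name__ == "__main__":
    print(usage_by_page(SAMPLE_LOG).most_common())
    print(users_per_page(SAMPLE_LOG))
```

Separating total views from distinct users matters for the workshop setting: a page with many views from a single participant tells a different story about uptake than the same count spread across the group.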
Evaluation schedule
The project plan identifies four key 'events' in the evaluation schedule, the first of which was the programme of interviews with practitioner-informants; the others are the walkthroughs by PIs, the practitioner workshops and the embedding sessions. In retrospect, however, the material collected from the PIs falls more into the 'requirements gathering' activity, and so the interviews are now excluded from the evaluation.
|Timing||Factor to Evaluate||Questions to Address||Method(s)||Measure of Success|
|Aug/Sept? 2006||UI and functionality of prototype tool||Consistency, usability, error-free functioning||Walkthrough by project team members||Number of issues identified and resolved|
|Oct/Nov? 2006||Usability and usefulness of prototype tool||Consistency, usability, mapping to real-world task||Walkthroughs by practitioner-informants; give feedback via questionnaires or interviews (TBD)||Quality of feedback and suggestions for improvement; minimal number of bugs and UI/functionality issues|
|26 Oct 2006||Acceptability of proposed tool to wider user community||Proposed overall vision and functionality of tool||Set of activities at JISC ELP Experts' Group meeting||Extent to which the tool is judged to be of use to the community|
|Nov 2006||Usability of revised prototype||Usability, error-free functioning||Walkthrough by project team members||Extent to which issues raised in evaluation with PIs have been resolved satisfactorily|
|Jan 2007||Embedding: Acceptability of tool as support for design for learning (also usability issues)||Mapping to real-world task||2 workshops with max. 10 (expert) practitioners at each. Feedback via questionnaires and interviews. Some observation may also be carried out.||Quality of feedback re usefulness/acceptability|
|27? Jan 2007||Acceptability of tool to wider user community||Overall vision and functionality of tool||Demonstrate tool at Design for Learning programme meeting or Pedagogy experts group||Extent to which the tool is judged to be of use to the community|
|Jan 2007||Embedding (and sustainability)||Suitability for embedding in staff development and/or initial teacher-training context; suitability across sectors and domains||TBD: Individual meetings with representatives from HEA, Becta, ACLearn||Quality of feedback re suitability|