Phoebe Phase 1 Evaluation Report
Liz Masterman and Marion Manton
- Data collection
- Q1. First impressions
- Q2. Intended users and purpose
- Q3. Advantages and beneficiaries
- Q4. Support and training issues
- Q5. Technical issues
- Q6. Organisational issues
- Q7. Improvements and revisions
- Q8. Other ideas
- Findings and Recommendations
This report outlines the approach taken in the evaluation of the Phoebe prototype pedagogic planning tool carried out between May 2006 and February 2007 and funded under the JISC Design for Learning (D4L) programme. It summarises the findings and makes recommendations for the revision of the tool envisaged during the period May 2007 to February 2008.
The aim of the Phoebe project is to develop an online planning tool that guides practitioners working in post-compulsory education in designing effective and pedagogically sound learning activities. Its guiding principle is to propagate effective practice to as wide an audience as possible, by allowing users to develop new pedagogical approaches while still using the planning tools with which they are familiar.
The approach taken in this evaluation differed substantially from the initial plan, for two reasons: a) slippage in the project schedule and b) the prospect of a 12-month extension to the project. Together, these led to the postponement of our core evaluation “event” (viz. the two practitioner workshops) and to the insertion of a design review at the behest of JISC to provide input into its decision-making regarding the extension. Following approval of the extension, the periods May 2006 to February 2007 and March 2007 to February 2008 have been designated Phase 1 and Phase 2 respectively.
Phase 1 focused on producing a proof-of-concept prototype planner tool, which means that the vision underlying the design was on trial as much as the design itself. In this respect, the project team wanted to know:
- Do evaluators endorse the underlying concept?
1.1 Would the tool be viable in their organisation for planning at the lesson level?
1.2 In what contexts could they envisage its use?
1.3 Could the tool function as a “community artefact,” owned and customised by individual institutions or departments?
- Does the tool operationalise the concept?
2.1 How easy is it to learn to use the tool?
2.2 How easy is the tool to use once the basics have been grasped?
- What would be useful/desirable/essential for its wider acceptance?
3.1 What shortcomings need to be overcome?
3.2 What additional features or functions would be helpful?
A proof-of-concept evaluation requires a specific type of evaluator: one who can see past the tool’s present limitations and imperfections to the tool as it might be. The tool is thus not the sole object of the evaluation; rather, it also serves to mediate the evaluator’s mental projections. It was therefore desirable to involve evaluators who combined considerable experience in the use of e-learning with a clear appreciation of the problems that novice teachers and “technology-reticent” teachers might encounter.
Evaluators were drawn from the following sources:
- The original practitioner-informants (PIs), in keeping with their role of providing input in all phases of the project
- D4L programme participants (not envisaged in the original plan, but a consequence of the JISC-organised design review)
- Experts in (e-)learning not involved in the D4L programme (invited by JISC)
- Experienced teachers, both with and without e-learning experience
The evaluation comprised the following activities:
1. Design review (23 Jan 2007)
The design review took the form of a walk-through of the prototype planner tool and a question-and-answer session at the D4L programme meeting at the Aston Business School. It was organised by the JISC D4L programme management team together with the evaluation project team from Glenaffric Ltd. A parallel review of the pedagogic planner tool being developed by the London Knowledge Lab was conducted at the same time. The review was attended by volunteer members of projects in the programme, one member of the Phoebe project team, and a number of individuals from outside the D4L programme who had been specifically invited by the JISC programme management. Participants were asked to complete a set of open-ended questions prepared by Glenaffric Ltd and return them to the programme manager before the close of the meeting.
2. Meetings with practitioner-informants (Feb-Apr 2007)
We re-interviewed practitioner-informants in February and March 2007 to obtain their feedback on the prototype tool which we had developed partly on the basis of their original input. Although conducted in an open manner in order to elicit interviewees’ unprompted reactions and suggestions, the interviews also included the same set of questions that had been posed in the design review, modified slightly to suit the different context.
The interviews were tape-recorded and subsequently written up using the same technique as in the original meetings. Responses to the structured set of questions were then collated with the responses to the corresponding questions in the design review.
3. Staff development workshop, University of Greenwich (22 Feb 2007)
This workshop, organised by PI07, was intended for trainee teachers in the FE sector to explore Phoebe as a tool for their own planning. However, the invitation was subsequently extended to academic teaching staff within the university. The two-hour workshop consisted of an introductory presentation and demonstration of Phoebe, followed by hands-on exploration and a closing discussion of participants’ experiences.
We also designed an online evaluation of the tool to be carried out in place of the original workshops; however, the striking consistency of feedback regarding the usability of Phoebe obtained from the review meeting and our interviews with PIs made it clear that this exercise would add little to the data already collected.
1. Design review
Responses were collected from 9 reviewers. They were de-identified and collated into a summary document by Glenaffric, so we have no means of distinguishing between D4L programme members and invited external reviewers.
2. PI Interviews
We met with 5 PIs: PI01, PI02, PI04, PI05 and PI06. The remaining PIs were omitted partly through lack of time, and partly because enough common points had already emerged from the earlier interviews and the review meeting for us to feel that little more would be gained from additional meetings, especially as lengthy travel would have been involved.
A further PI, PI07, had been involved in the design review and thus was not interviewed separately.
3. Staff development workshop
No student teachers and only 3 experienced teachers attended this workshop (in addition to PI07, the organiser). The paucity of data collected limited the extent to which we could collate their feedback with data from the design reviewers and PIs.
As already noted, Glenaffric collated the feedback from the design reviewers in a summary document. We extended this document by inserting the responses of PIs to each of the questions posed (see attached document), and then analysed the combined data.
The next few sections briefly report the responses to each question. For the final question, we also include additional data from the interviews, together with criticisms and suggestions made by participants in the staff development workshop.
Q1. First impressions
Question asked: What are your first impressions of the planner?
All 13 respondents were positive; for example:
"Great tool, help and guide for new tutors and tutors that are studying."
"Looks good. The types of content - not come across anything that does that type of thing before, things that people do need to think about but don’t always do it."
"I found the tool to be easily accessible at the top level, providing a really helpful structure to guide the design process. This guidance is combined with very high level of 'personalisability' with its free form note making tool."
However, 5 also expressed reservations about its complexity and possible usability issues; e.g.
"Potentially very powerful and flexible but also perhaps confusing, especially for the novice user. Could function at many levels. Highly theoretical, perhaps too much so for 'rank and file' teachers. Needs some judicious editing and would benefit from some simple guidelines. Overall a potentially useful tool (or set of tools)."
"The potential is huge. It’s just tempered by getting started."
Q2. Intended users and purpose
Question asked: Who do you think could/should make use of the planner, and for what purposes?
| Intended users | No. of responses |
| --- | --- |
| Practising lecturers, incl. those wishing to explore e-learning | 6 |
| Staff developers, trainers | 5 |
| Student teachers; new lecturers | 5 |
| People interested in Learning Design and/or theory, incl. developers of Learning Design systems | 3 |
Examples of comments:
"...anyone who is motivated to consider alternatives or needs inspiration."
"Anyone who needs to understand the theoretical underpinning of learning design."
"[Staff developers & student/new teachers] – to help formulate pedagogically sound designs – to see types of tools/activities available
[Lecturers & learning technologists] - to help design new modules/courses etc or revise existing ones"
However, one person expressed concern about the cross-sector applicability of the tool as it stands:
"I have concerns over the proposed breadth of audience. I think the level of choice presupposes an understanding of the options which would apply better to HE than an ACL or FE audience."
Q3. Advantages and beneficiaries
Question asked: What in your view are the advantages of using the planner, and who will benefit?
Perceived advantages include:
- Simplicity; step-by-step approach
- Structure, incl. making available the "whole picture" of what needs to be considered in planning a lesson
- Flexibility: can accommodate planning at multiple levels from curriculum -> learning objects
- Support for reflection
- Range of materials; clearly written
- Provenance ("safe")
- Online environment
Perceived beneficiaries were much the same as for question 2, plus "communities of practice":
"I like the proposal of Phoebe supporting communities of practice rather than being left open ended. Its use could then be introduced & supported by an informed person who is a trusted or significant others for non-informed members."
Q4. Support and training issues
Question asked: What, if anything, could help the intended users make effective use of the planner (e.g. training, guidance, time...)?
| Details | No. of suggestions |
| --- | --- |
| Training; integration into staff development programme; pedagogical support | 6 |
| "Getting started" guide with animations, e.g. in CamStudio or Wink | 1 |
Other suggestions related more to the functionality of Phoebe, and thus are included in the results of question 7.
Q5. Technical issues
Question asked: Do you foresee any technical limitations in using the planner?
Issues related to inherent restrictions of the technology:
- Web-based technology may cause difficulties in ACL/FE environments with limited internet access.
- Failure to work in IE6:
"We’re still on IE6. It’s an issue for a lot of organisations. Not likely to update to IE7 soon (only went onto XP last year)."
- Text-based wiki militates against visual design and limits incorporation of visual forms of representation.
- Inability to link to documents on the local computer (and possibly the network).
Issues stemming from conceptual design and its implementation in the chosen environment:
- Tension between the wealth of information provided and the efficiency of navigation through it; tool could become unwieldy
- Planning needs to be a seamless process: should either do all planning in Phoebe or somewhere else (i.e. just using Phoebe for reference)
Q6. Organisational issues
Question asked: What (if any) organisational issues do you foresee?
The principal perceived "needs" are:
- Integration of the tool into practice through staff development
- Getting design for learning practised in the first place
- Responsibility for the tool: e-learning teams, staff development teams, VLE support staff...?
- Ability for institutions to include their own learning design templates, examples, learning approaches etc. in Phoebe
Q7. Improvements and revisions
Question asked: Please give your views on the future development plans outlined by the projects. What (if any) improvements or revisions would you like to see?
The following is a summary of the suggestions for improvements and revisions to the Phoebe prototype made by the reviewers and evaluators:
a) Content (structure + text)
| Details | No. of suggestions |
| --- | --- |
| Need for graphics as well as text | 3 |
| Less theory, more practical examples | 1 |
| Links should include identifying info about the site being linked to, so if a link is broken users can use Google to find it | 1 |
| Include a mechanism on each page for reporting broken links | 1 |
| Sort out terminology | 1 |
| Use a structured vocabulary | 1 |
| Export content to other wikis | 1 |
b) Functionality

| Details | No. of suggestions |
| --- | --- |
| Overview of how components are related (gestalt view), esp. graphical [NB map was envisaged in Phase 1] | |
| Improve navigation (general request: no specific method suggested) | 3 |
| Visual (graphical) design interface | 1 |
| Explicit (visual) links between components: activities <-> learning outcomes; location <-> tools | |
| Insert institution-specific templates and exemplars | 1 |
| Upload learning designs in any format | 1 |
| Upload documents into the learning design | 1 |
| “Preview” designs (as per LAMS?) | 1 |
| Map output to IMS LD | 2 |
| Output to Word documents (NB these are in customised tabular format) | 1 |
| Mailto: link so that users can email support staff, room bookings etc. from within Phoebe (this is possible – just insert a blank “mailto” code) | 1 |
| Improve functionality for creating a new design | 2 |
| Links into/out of course and student management systems, e.g. log into Phoebe and see a list of the designs one needs to create (with basic info, e.g. title, cohort, date, location) | |
| Need for multiple representations of a learning design: what you need to run the session, what the students need to see, and what management needs to see for audit purposes | 1 |
| Distinguish between “planning” pages and “info only” pages | 1 |
| Support for sharing and re-using/repurposing designs | 2 |

On the need for a gestalt view, evaluators commented: “The atomisation/fragmentation of keeping notes seems a bit wearisome - a different ‘page’ for name, time, location, etc.” and “At the moment the design is not the central piece. The central piece is the structure of the contents. And then the designs have to be brought together from the notes by the user.” See also the ReMath planner tool.
c) Help and support system
| Details | No. of suggestions |
| --- | --- |
| Dynamic getting started guide (Wink) | 1 |
| “Wizard” for creation of basic plans | 2 |
| Help system (esp. for tagging) | 4 |
d) Technical issues

| Details | No. of suggestions |
| --- | --- |
| Browser support: IE6 | 4 |
Q8. Other ideas
Question asked (design reviewers only): Do you have any other ideas or views about piloting or promoting the pedagogic planning tool?
The principal suggestions were:
- Trial the tool in teacher-training or staff-development sessions (two reviewers expressed interest in doing so themselves).
- Explore synergies with other JISC development projects.
- Use the tool to "[make] explicit the learning designs inherent in learning objects."
Findings and Recommendations
In summary, the principal findings of the evaluation are:
- The Phoebe tool can potentially play a valuable role in initial teacher training and staff development programmes, and as such is a resource to be included in such programmes rather than a self-teaching aid for “lone” practitioners who wish to explore D4L.
- The guidance and examples already incorporated into Phoebe appear to meet practitioners’ needs; however, in its present form it functions better as a resource with a note-taking facility than as a usable and useful tool for creating lesson plans. Resources in Phase 2 should therefore be concentrated on this aspect of its functionality, although considerable effort is still needed to develop the content.
- There is considerable interest in the potential of Phoebe as a customisable community-owned tool.