DEHubResearchProject/Charles Sturt University/Evaluation



Managing institutional change through distributive leadership approaches: Engaging academics and teaching support staff in blended and flexible learning.

Evaluator activities

Two evaluation approaches: The evaluator adopted two distinct approaches to the evaluation.

1. An empirical approach to the operational domain of the research, i.e. reporting, communication, timelines, budget, documentation, etc. This strand will be factual and summative, and predominantly based on artefacts. This information will be reported in the mid-year and final reports.

2. A constructivist approach to the research process (Lueddeke, 1999). This strand will consider individual and collective meaning-making around identified critical incidents in the implementation of the research, and the way these critical incidents shape the research methodology, findings and recommendations. It will be interpretive, explorative and formative, and predominantly based on reflective conversations. This information will be used internally, as part of the research process.



Evaluator Reports

Mid-Project Report, January 2012

Final Report, July 2012

Project activities

Evaluation plan

The following questions guide the evaluation in each domain, based on the systematic evaluation process recommended by the ALTC.


A. Operational Evaluation (factual and summative)

All members of the research team are invited to contribute to this part of the evaluation. It is expected that the Chief Investigator and the Project Manager will provide most of the "hard" data.

1. Project Clarification: What is the nature of the project?
• What is the focus of the project?
• What is the scope of the project?
• What are the intended outcomes?
• What are the operational processes developed to achieve the outcomes?

Documentation requested so far:
• Role clarification (flow chart of roles and responsibilities in the project - diagram)
• Communication flows
• Budget

What is the conceptual and theoretical framework underpinning the project?
• What is the context of the project?
• What key values drive the project?
• Ethical approval/s
• Timelines for deliverables
• Timeliness of project reports
• Quality of communication amongst the research team
• Adjustment and achievement of the project goals
• Appropriateness of the literature review and force field analysis
• Appropriateness and accessibility of the final report
• Dissemination strategies
• How do the various virtual strategies (public and private) support the research and the partnership?

3. Who are the stakeholders for the project and the audiences for the evaluation information?
• Stakeholders - Who has an interest or stake in the project and/or its outcomes, and in the evaluation of the project?
• Audiences - Who will be interested in the results of the study, and what types of information do they expect from the evaluation?

B. Process Evaluation (interpretive, explorative and formative)

All members of the research team are invited to participate in this part of the evaluation.

A critical incident approach will be adopted (Angelides, 2001). It will focus on incidents where decisions made have the potential to impact the whole project. Each of the incidents identified to date reflects a different activity level of the research, i.e. research implementation, governance and desired states.

i.) Adjustments to methodology and conceptual clarification: How did they transform the project? How did the research team respond? How did the team orient itself to the adjustments? (research implementation)

ii.) The decision to combine the two reference groups (CSU- and Massey-led projects): What were the pragmatic and conceptual reasons? What are the outcomes? Which problems were solved, and which were created? (governance)

iii.) The intention to grow the CSU/Massey partnership as a project outcome (desired state): How did the DEHub principles (Keppel, 2010) inform the development of the research and the partnership?


A second set of critical incidents will be identified and interpreted for the final evaluation report.

Reference documentation

Angelides, P. (2001). The development of an efficient technique for collecting and analyzing qualitative data: The analysis of critical incidents. Qualitative Studies in Education, 14(3), 429-442.

Lueddeke, G. R. (1999). Toward a constructivist framework for guiding change and innovation in higher education. Journal of Higher Education, 70(3).

This project uses Delicious for reference documentation.