Evaluation and Online Learning and Teaching at UniSQ
The Evaluation theme is focused on educational technology (outcomes and impacts) and the evaluation process – including data collection and analytics.
There are two stages to consider within evaluation: evaluating tools as part of selection, and evaluating the success of the broader learning environment. Below you will find resources for each.
Evaluating EdTech Tools
When deciding whether to trial a new piece of EdTech, there are many considerations, including your own needs, your student cohort, course design, and accessibility.
If you have identified a tool you might like to trial, the most commonly discussed model for the uptake of new EdTech is Fred D. Davis’s seminal Technology Acceptance Model (TAM).
Davis identifies two factors that you could consider:
- What is the perceived usefulness of the tool?
- What is its ease of use?
Davis defines perceived usefulness as “the degree to which a person believes that using a particular system would enhance his or her job performance” (320) and perceived ease of use as “the degree to which a person believes that using a particular system would be free of effort” (320).
Measurement scales for both categories are included as appendices to the article. These scales step you through the facets of each category, helping you decide whether or not to trial the tool (a brief scoring sketch follows the article link below).
Access the article here.
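To illustrate how the TAM scales might be applied in practice, the sketch below summarises Likert responses into a mean score per construct. It assumes a 7-point scale and uses hypothetical item wordings standing in for Davis's published items, so treat it as a rough template rather than the validated instrument.

```python
# Minimal sketch: summarising TAM-style responses for a candidate EdTech tool.
# Assumes a 7-point Likert scale (1 = strongly disagree, 7 = strongly agree);
# the item wordings below are hypothetical stand-ins for Davis's published scales.

from statistics import mean

responses = {
    "perceived_usefulness": {
        "Using the tool would improve my teaching": 6,
        "Using the tool would make marking quicker": 5,
        "Overall, I would find the tool useful": 6,
    },
    "perceived_ease_of_use": {
        "Learning to operate the tool would be easy": 4,
        "The tool's interface is clear and understandable": 3,
        "Overall, the tool would be easy to use": 4,
    },
}

for construct, items in responses.items():
    score = mean(items.values())
    print(f"{construct}: {score:.1f} / 7")
# In TAM terms, a tool scoring well on both constructs is a stronger candidate for a trial.
```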
Lauren Anstey and Gavan Watson’s Rubric for Evaluating E-Learning Tools in Higher Education is more detailed than TAM and very apt for our context.
It takes into consideration attributes such as functionality and accessibility, as well as elements of TAM and Community of Inquiry. It is a great place to start when evaluating EdTech tools prior to implementation.
You can access the rubric and read more about its design by clicking here.
If you haven’t already selected an EdTech tool, organisations such as EDUCAUSE provide you with the inspiration you might need.
Click here to access the EDUCAUSE website.
Evaluating Learning Environments
A number of models exist within learning and teaching praxis for the evaluation of your learning environment. These can be used to assess the environment itself, or students' use of it.
As well as the models below, peer feedback is an excellent way to support reflection on and evaluation of your learning environment.
Our Learning and Teaching Quality Partners are here to support you in this process. Contact them by clicking here.
A seminal online learning framework, the Community of Inquiry (CoI) has been operationalised, with an instrument developed in 2008 by J.B. Arbaugh, Martha Cleveland-Innes, Sebastian R. Diaz, D. Randy Garrison, Philip Ice, Jennifer C. Richardson, and Karen P. Swan.
- Access the original article by clicking here.
- A later article, which clarifies the scale, can be found here.
Arbaugh et al. created a 34-item Likert-scale instrument, completed by students, which allows assessment of the construction of the environment and its use of social presence, cognitive presence, and teaching presence.
Much research has been conducted using this framework. It additionally provides an excellent foundation for self-assessment of your learning environment.
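As an illustration of how responses to the CoI instrument could be summarised, the sketch below aggregates a class's answers into mean scores for the three presences. The item groupings and the sample data are assumptions for demonstration only; check the groupings against the published survey before relying on them.

```python
# Minimal sketch: aggregating Community of Inquiry (CoI) survey responses
# into mean scores per presence. Item groupings follow a commonly used layout
# of the 34-item survey and should be checked against the published instrument;
# the sample responses are invented for illustration.

import random

# Hypothetical item grouping (1-indexed item numbers).
PRESENCES = {
    "teaching_presence": range(1, 14),    # items 1-13
    "social_presence": range(14, 23),     # items 14-22
    "cognitive_presence": range(23, 35),  # items 23-34
}

# Simulated class of 30 students, each answering 34 items on a 1-5 Likert scale.
random.seed(1)
class_responses = [
    {item: random.randint(1, 5) for item in range(1, 35)} for _ in range(30)
]

for presence, items in PRESENCES.items():
    scores = [student[item] for student in class_responses for item in items]
    print(f"{presence}: mean {sum(scores) / len(scores):.2f} across {len(scores)} responses")
```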
ASCILITE has developed a set of standards to evaluate online learning packages with a specific focus on supporting Australian and New Zealand academics and learning designers.
The TELAS (Technology Enhanced Learning Accreditation Standards) framework can be used before, during or after the development of your learning package, utilising the self-assessment document and associated resources.
For a small fee you can become an accredited certifier through ASCILITE TELAS workshops and assessments, which includes the awarding of a digital badge. Your learning packages can be assessed and certified by an accredited assessor, with badging from Diamond through to Bronze.
More information can be found on the TELAS website or by contacting TechDems Learning and Teaching Quality Partner Dr Lisa Jacka, as she is currently involved in the delivery of this service.
Access the framework, and all supporting material, by clicking here.
Another useful model is the Diversity, Equity and Inclusion Course Design Rubric created by Kim DeBacco and Nick Mattos.
This model brings to the fore important considerations relating to accessibility within your learning environment.
You can access the rubric, support materials, and the pre-existing rubrics from which it was drawn by clicking here.
If you are interested in using data analytics to evaluate how your learning environment is used, as opposed to how it is designed, the following links connect you to UniSQ data and analytics sites.
Each of these requires a UniSQ login: