When I first obtained my CMALT qualification I submitted three videos to evidence my appraisal of the benefits and constraints of certain technologies in achieving a technical or teaching aim. The videos were as follows:
Firstly, how to make a ‘Narrated PowerPoint’ and achieve a polished technical finish. This video outlined the options available at the time:
This thinking has evolved, and the field for creating content at UWE is now more developed (and cluttered) than ever. I summarised the different approaches in 2019 here –
Secondly, I submitted a video I had made outlining how we would address a teaching need using the technologies available at the University.
I still try to use my more in-depth knowledge of systems to plot clever workarounds and workflows that combine the different elements of our technology ecology, such as this summary of a more up-to-date digitisation of a face-to-face OSCEs event –
Thirdly, I submitted a video on the systems in use for multi-modal feedback on assessments, through which we explored the practice of giving audio and video feedback.
At my level, I delegate much of the product research to those I manage. It is an excellent personal development exercise for people new to being a learning technologist. I manage three positions that are designed as entry-level routes into this part of the industry. They have reviewed the following:
- 2017 Audience Response systems (Krystina Selley and Naomi Beckett)
- 2018 Audio only hosting (James Rawlings)
- 2018 Video enriched branching learning scenarios (James Rawlings not linked)
- 2018 Web cameras (Husna Ahmed)
The evidence above is not meant to be highly polished. These are summations of the work we did year in and year out. We had a policy of open note-taking: when we learned something, we shared the output so others could learn too. Being open means less emphasis on high-end production and more on getting ideas out there quickly.
Delegating actually left very little for me to do in the way of research on my own terms; instead, I acted to set the parameters for the assessment of constraints and benefits. The more senior I became, the fewer of these decisions I got to take beyond suggesting what needed to be in the mix, while the larger decisions that did require my involvement, such as those about the institutional toolkit, sporadically stalled at my institution for long periods. Part of being a manager is guiding people to be the best they can be, but part of it is also giving away all the fun projects.
What did change was what needs to be considered when assessing the benefits of systems. Wide-scale legislative changes came in across the 2018–2019 period that pushed the more socially responsible aspects of systems used in digital learning higher up the evaluation criteria. This will be explored later in this portfolio, but it is heartening that we now have informed conversations about accessibility and data protection that we were mumbling our way through less than a decade ago. Knowledge of copyright legislation, by contrast, is still woefully behind.
For a long time I worked in a large HEI where the digital learning support services were very dislocated, with a postcode lottery of support services per Faculty. So the big piece of work for myself and other senior stakeholders in digital learning was to create an environment more sustainable for technology commissioning, with a larger centrally supported toolset that allows for innovation while managing these new higher-priority risks better. By this I mean guidance for those going off piste to analyse not just the constraints and benefits of certain technology-enhanced approaches to their teaching, but also how they comply with, or exceed, the expectations of the institution from a legal and socially progressive standpoint.
It is certainly less fun than the previous policy-light and system-support-light environment, but it is a necessary step. In 2019 we commissioned and decided on a new polling tool for the university. This helped us manage the risks of the previous wild-west collection of ‘whatever we had seen at a conference’ product selection for in-class polling. We had limited choice and functionality, BUT we at least know our practitioners will not be sending who knows what data here, there and everywhere.
What will I do differently in the future? As discussed in 2.b, I have recently been reborn into seeking out the student/service-user voice. Until this summer I had always treated people who say ‘we need the student voice’ as people who really wanted to say something but didn’t have anything to say. Many of the decisions we make are functional and don’t need much discussion; they just need agreement and work. However, too often we have not focused on analysis after commissioning, or brought students into conversations over tool selection. My new outlook is more open to this, and I will endeavour to make space for the student voice in these decisions.