The National Forum for the Enhancement of Teaching and Learning in Higher Education hosted a symposium on Learning Analytics (LA) last week (http://www.teachingandlearning.ie). The day was informative and useful; a sincere thank you to the organisers who ensured it was a success. Presentation topics included first principles, ethics, the student experience, applications of LA, original research, existing platforms and IT integration. This blog post highlights and expands on some of the main issues discussed last Thursday.
The 2016 Horizon Report describes learning analytics as “an educational application of web analytics aimed at learner profiling, a process of gathering and analyzing details of individual student interactions in online learning activities.” Learning Analytics therefore involves the measurement, collection, analysis and reporting of data about learners: their activity and accomplishments, with a view to optimising performance and developing an understanding and interpretation of the learning taking place. The learners in this instance are students both on and off campus, but by this definition learners can also be faculty and administrative staff – anyone who affects learning and teaching in Higher Education.
In principle, the data available to learning analytics is anything the instruments can measure. It need not be limited to simple quantifiable elements; it can be as rich as we prescribe. Effective resources, and communication between systems that can talk to one another in intelligent ways, are necessary. So too are early intervention and uniform definitions of terms and metrics – this is vital. Measuring achievement, for example, on a single absolute scale is not as simple as it sounds. Data access and system integration are currently far from seamless. There is a need for clear values and conditions that span the higher education sector, and for a shared understanding of expectations around concerns such as QA, IT infrastructure, administration and academic data requests.
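To make the point about a single absolute scale concrete, here is a minimal sketch (all names, grading schemes and conversion values are invented assumptions, not a real institutional mapping): before achievement from different modules can sit on one metric, each local grading scheme needs an explicit conversion rule, and those rules are exactly where the uniform definitions above are needed.

```python
# Hypothetical sketch: two modules report achievement differently,
# so mapping both onto a common 0-1 scale needs per-scheme rules.
# The letter-grade mapping below is an assumed convention.
LETTER_TO_FRACTION = {"A": 0.90, "B": 0.75, "C": 0.60, "D": 0.45, "F": 0.20}

def normalise(record):
    """Map a heterogeneous (scheme, value) grade record onto a 0-1 scale."""
    scheme, value = record
    if scheme == "percent":
        return value / 100
    if scheme == "letter":
        return LETTER_TO_FRACTION[value]
    raise ValueError(f"unknown grading scheme: {scheme}")

grades = [("percent", 68), ("letter", "B")]
print([normalise(g) for g in grades])
```

The design choice worth noting is that the conversion table is data, not code: agreeing its values across institutions is the hard, human part of the problem.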
A lot of Learning Analytics work relates to student retention, or ‘student engagement’ as colleagues from UL so nicely put it. With regard to retention there have been many efforts at performance prediction – exploring how admission criteria can be used as predictors of student performance and progression. This metric of success is often determined by student progression and the institution's associated funding.
Higher education is a critical element of our society, with many audiences, but the most important audience is our students. We must share what we learn with our students, helping them to better understand and navigate the educational system, which after all has their success as its sole purpose. The student experience, and giving students the same access to the analytics that we as educators have, was emphasised by Mary Loftus in her PhD work at NUI Galway (@).
The opportunity that digitally mediated education provides is incontestable. Analysis of this data can give us an evidence base for decision-making and serve as the foundation for systemic change. That said, we must recognise that each learner in each environment is unique, and that effective education must always strive to be sensitive and responsive to difference. Prof. Sarah Moore (@SarahMooreTL) stated in her concluding remarks that we have to be clear that Learning Analytics is not a panacea for all the challenges in Higher Education. The sector should be data informed, not data driven.
In developing the scope of learning analytics for higher education we merge the fields of education, computing and mathematics to understand and optimise the student experience. There are genuinely strong opportunities here, and a wonderful nexus between teaching and research!