AI for Teachers, An Open Textbook: Edition 1

Learning Analytics and Educational Data Mining

By Azim Roussanaly, Anne Boyer, Jiajun Pan, LORIA/Université de Lorraine

What is learning analytics?

More and more organizations use data analysis to solve problems and improve decisions about their activities. The world of education is no exception: with the generalization of virtual learning environments (VLEs) and learning management systems (LMSs), massive amounts of learning data are now available, generated by learners' interactions with these tools.

We then speak of Learning Analytics (LA), a disciplinary field defined as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (Long-2011).

Four types of analytics are generally distinguished according to the question to solve:
- Descriptive Analytics: What happened in the past?
- Diagnostic Analytics: Why did something happen in the past?
- Predictive Analytics: What is most likely to happen in the future?
- Prescriptive Analytics: What actions should be taken to affect those outcomes?
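
To make these four questions concrete, here is a minimal Python sketch applied to invented data (five weeks of VLE logins and quiz scores for a single learner); the numbers, the naive trend projection and the threshold rule are purely illustrative.

```python
# Purely illustrative data: five weeks of VLE logins and quiz scores
# for one hypothetical learner.
import statistics

weekly_logins = [12, 9, 7, 4, 3]
weekly_scores = [85, 80, 70, 60, 55]

# Descriptive: what happened? -> summarize past activity and results
print("mean logins:", statistics.mean(weekly_logins))
print("mean score :", statistics.mean(weekly_scores))

# Diagnostic: why did it happen? -> scores drop together with activity
corr = statistics.correlation(weekly_logins, weekly_scores)  # Python 3.10+
print("login/score correlation:", round(corr, 2))

# Predictive: what is likely to happen? -> naive linear trend on scores
slope = (weekly_scores[-1] - weekly_scores[0]) / (len(weekly_scores) - 1)
print("projected next score:", weekly_scores[-1] + slope)

# Prescriptive: what action to take? -> a simple, invented threshold rule
if weekly_logins[-1] < 5 and weekly_scores[-1] < 60:
    print("suggested action: contact the learner and propose extra exercises")
```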

What is it?

The educational tools based on LA are very diverse, ranging from dashboards for data visualization to recommender systems. Research in this area is currently very active. We will limit ourselves to summarizing the issues most frequently addressed in the literature. Each of these issues leads to a family of tools targeting mainly learners or teachers, who represent most of the end users of LA-based applications.

Predict and enhance students' learning outcomes

One of the emblematic applications of LA is the prediction of failure. Learning indicators are automatically computed from the digital traces and can be accessed directly by learners so that they can adapt their own learning strategies. One of the first experiments was conducted at Purdue University (USA) with a mobile application designed as a traffic-light dashboard (Arnold-2012). Each student can monitor their own progress indicators. A screenshot of the dashboard is shown in fig#1.
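
As an illustration of how such a dashboard can be fed, the sketch below maps a few indicators computed from the traces (current grade, weekly VLE logins, on-time submission rate) to a green/yellow/red signal. It is not the actual Course Signals algorithm; the indicator names, weights and thresholds are invented for the example.

```python
def traffic_light(grade_pct: float, logins_per_week: float, on_time_rate: float) -> str:
    """Map three hypothetical indicators to a green/yellow/red signal.

    grade_pct       -- current course grade, 0-100
    logins_per_week -- average VLE logins per week
    on_time_rate    -- share of assignments submitted on time, 0-1
    """
    # Weighted risk score between 0 and 1; weights are invented for the example.
    risk = (
        0.5 * (1 - grade_pct / 100)
        + 0.3 * (1 - min(logins_per_week / 5, 1))
        + 0.2 * (1 - on_time_rate)
    )
    if risk < 0.25:
        return "green"    # on track
    if risk < 0.50:
        return "yellow"   # some warning signs
    return "red"          # probably needs help

print(traffic_light(grade_pct=82, logins_per_week=6, on_time_rate=0.9))   # green
print(traffic_light(grade_pct=55, logins_per_week=2, on_time_rate=0.4))   # red
```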

Indicators can also be addressed to teachers, as in an early warning system (EWS). This is the choice made by the French national centre for distance learning (CNED) in an ongoing study (BenSoussia-2022). The objective of an EWS is to alert the tutors responsible for monitoring the students as early as possible, so that they can implement appropriate remedial actions without delay.
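
The predictive core of an EWS can be sketched as a classifier trained on the traces of past cohorts and applied early in the semester to current students, as below on synthetic data; the features, the labelling rule and the alert threshold are assumptions made for the example, not those of the CNED study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic past cohort: [logins per week, share of activities completed]
X_past = rng.uniform(low=[0.0, 0.0], high=[10.0, 1.0], size=(200, 2))
# Invented ground truth: learners with a low overall activity score failed
activity = 0.05 * X_past[:, 0] + 0.5 * X_past[:, 1]
y_past = (activity < 0.35).astype(int)  # 1 = failed

model = LogisticRegression().fit(X_past, y_past)

# Early in the semester, score the current students and alert the tutors
current = np.array([[1.5, 0.3],    # not very active so far
                    [7.0, 0.9]])   # active and on track
for name, p in zip(("student A", "student B"),
                   model.predict_proba(current)[:, 1]):
    status = "ALERT the tutor" if p > 0.5 else "no alert"
    print(f"{name}: estimated risk {p:.2f} -> {status}")
```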

Analyze student learning process

LA techniques can help to model the learning behavior of a learner or a group of learners (i.e. a class). The model can be used to display learning processes in LA applications, providing additional information that enables teachers to detect deficiencies and thus improve training materials and methods. Moreover, analyzing the learning process is a way of observing learner engagement. For example, in the e-FRAN METAL project, the indicators were brought together in a dashboard co-designed with a team of secondary school teachers, as shown in fig#2 (Brun-2019).
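
One simple way to model the behavior of a group of learners is to cluster their activity profiles, so that a dashboard can show the teacher which engagement profiles exist in the class. The sketch below illustrates this idea on invented interaction counts; it is not the method used in the METAL project.

```python
import numpy as np
from sklearn.cluster import KMeans

# One row per learner: [forum posts, videos watched, exercises attempted]
traces = np.array([
    [12, 30, 25], [10, 28, 22], [11, 35, 27],   # steady, active learners
    [ 1,  5,  3], [ 0,  4,  2], [ 2,  6,  4],   # largely disengaged learners
    [ 3, 25,  5], [ 4, 30,  6],                 # watch videos but rarely practise
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(traces)
for label in range(3):
    members = np.flatnonzero(kmeans.labels_ == label)
    centre = kmeans.cluster_centers_[label].round(1)
    print(f"profile {label}: learners {members.tolist()}, mean activity {centre}")
```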

Personalise learning paths

The personalisation of learning paths can occur in recommendation or adaptive learning systems. Recommender systems aim to suggest to each learner the resources or behaviors most likely to help them effectively achieve the educational objectives.

Some systems focus on keeping the teacher in the loop by first presenting proposed recommendations for their validation. Adaptive learning systems allow the learner to develop skills and knowledge in a more personalized, self-paced way by constantly adjusting the learning path to the learner's experience.
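
As a sketch of the recommendation idea, the example below applies user-based collaborative filtering to an invented learner-resource interaction matrix: resources used by learners with a similar history, but not yet by the target learner, receive the highest scores. Real educational recommenders also take the pedagogical structure of the course, and possibly the teacher's validation, into account.

```python
import numpy as np

resources = ["video_1", "quiz_1", "reading_1", "quiz_2", "video_2"]
# 1 = the learner has successfully used the resource, 0 = not used (invented)
interactions = np.array([
    [1, 1, 0, 1, 0],   # learner 0
    [1, 1, 1, 1, 0],   # learner 1
    [0, 0, 1, 0, 1],   # learner 2
    [1, 0, 0, 1, 0],   # learner 3 <- target: what should they work on next?
])

target = 3
# Cosine similarity between the target learner and every learner
norms = np.linalg.norm(interactions, axis=1)
sims = interactions @ interactions[target] / (norms * norms[target] + 1e-9)
sims[target] = 0.0  # ignore self-similarity

# Score each resource by the activity of the most similar learners,
# then discard resources the target learner has already used.
scores = sims @ interactions
scores[interactions[target] == 1] = -1.0
print("recommended resource:", resources[int(np.argmax(scores))])
```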

Does it work?

In the publications, feedback focuses mainly on students (and on higher education). Observations generally tend to show improved learner performance (for example, a 10% increase in A and B grades at Purdue University). For teachers, the impact of LA is more complex to assess. Studies based on the Technology Acceptance Model (TAM) suggest that teachers have a positive perception of the use of LA tools. It is interesting to note the final Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis in one of these studies, which we reproduce here (Mavroudi-2021) (see fig#3).

Some of the points of attention listed under Threats and Weaknesses form the basis of reflection for the Society for Learning Analytics Research (SoLAR) community, which recommends an ethics-by-design approach for LA applications (Drachsler-2016). The recommendations are summarized in a checklist of eight keywords: Determine, Explain, Legitimate, Involve, Consent, Anonymize, Technical, External (DELICATE).

References

P. Long, G. Siemens: 1st International Conference on Learning Analytics and Knowledge (LAK 2011), Banff, Alberta, February 27–March 1, 2011.
K. Arnold, M. Pistilli: Course Signals at Purdue: Using Learning Analytics to Increase Student Success. LAK 2012, ACM International Conference Proceeding Series (2012).
A. Ben Soussia, A. Roussanaly, A. Boyer: Toward an Early Risk Alert in a Distance Learning Context. ICALT (2022).
A. Brun, G. Bonnin, S. Castagnos, A. Roussanaly, A. Boyer: Learning Analytics Made in France: The METAL Project. IJILT (2019).
A. Mavroudi: Teachers' Views Regarding Learning Analytics Usage Based on the Technology Acceptance Model. TechTrends 65 (2021).
H. Drachsler, W. Greller: Privacy and Analytics: It's a DELICATE Issue. A Checklist for Trusted Learning Analytics. LAK 2016, ACM (2016).
