Learning Analytics

A Critical Overview

At a glance

Learning Analytics (LA) is a data-driven approach to help understand and “optimize learning and environments in which it occurs”1. A learning analytics project involves:

  • computational methods to measure, collect, analyze, and report data about learners and their contexts1.
  • applying and informing educational theory, practice and/or policy.

Who to contact

Learning Innovation and Faculty Engagement – Gemma Henderson

Usage scenarios

The following scenarios are generated from implementations at various levels across the University of Miami.

Learning Platforms

While learning data is captured through the multiple learning platforms faculty and students engage with, it is often difficult to bring this data together and take meaningful action on it. Some platforms provide limited dashboards to view engagement with course resources:

  • For course-level analytics, the Performance Dashboard in Blackboard Learn shows all activity in your course, including discussion analytics. The ‘Retention Center’ requires an instructor to set up rules that define student engagement and participation in a course (deadlines, grades, course activity, course access). Once set up, the dashboard identifies students who have not met the engagement rules and provides a way to contact individual or multiple students.
  • For file-level analytics, the collaborative platform Google Drive recently introduced the Activity Dashboard, which shows who viewed a file and when. To see who has edited a file, the Version History can help identify collaborators during peer-editing exercises and track a student’s progress during writing assignments.
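Rule-based flagging of the kind the Retention Center performs can be sketched in a few lines of Python. The rules, thresholds, and student fields below are hypothetical stand-ins, not Blackboard’s actual data model:

```python
from datetime import date, timedelta

# Hypothetical engagement rules, similar in spirit to Blackboard's
# Retention Center: flag students by grade, recency of access, and activity.
RULES = {
    "low_grade": lambda s: s["grade"] < 70,
    "no_recent_access": lambda s: (date.today() - s["last_access"]).days > 7,
    "low_activity": lambda s: s["logins_this_week"] < 2,
}

def flag_students(students):
    """Return {student name: [names of rules violated]} for at-risk students."""
    flags = {}
    for s in students:
        violated = [name for name, rule in RULES.items() if rule(s)]
        if violated:
            flags[s["name"]] = violated
    return flags

students = [
    {"name": "A. Ibis", "grade": 65,
     "last_access": date.today() - timedelta(days=10), "logins_this_week": 0},
    {"name": "B. Cane", "grade": 88,
     "last_access": date.today(), "logins_this_week": 5},
]
print(flag_students(students))
# {'A. Ibis': ['low_grade', 'no_recent_access', 'low_activity']}
```

The list of violated rules doubles as the message content when contacting flagged students.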

Advising

SSC Campus is the current advising platform at the University of Miami, developed by the Education Advisory Board (EAB). SSC Campus tracks student progress using historical data, research, and predictive analytics to facilitate the identification of students who may require further support to complete their degree. It is a university-wide platform used to enhance our coordinated approach to student success through the following resources:

  • Advisors – access current student data (GPA, credit accumulation, course milestones, enrolled courses, student success markers) to efficiently consult with students. Advisors are able to communicate with students, schedule appointments, document consultations, and make referrals to other resources across campus.
  • Faculty – view class rolls, track attendance, communicate directly with students, post and manage assignment information, issue alerts, and respond to University-wide progress report campaigns.
  • Administrators – proactively target students with personalized communication.
  • Students – conveniently schedule meetings with an academic advisor or a peer tutor.

SSC Campus is available to all schools and colleges at UM and is currently focused on supporting undergraduate students. Each school/college has designated an SSC Campus Specialist who is dedicated to improving the collaborative success effort. The data captured will be used to inform academic policy and improve strategies to support students.

Graduate Courses

At the University of Miami, learning analytics is introduced within the following graduate programs and courses.

Research Methods

Nam Ju Kim, Assistant Professor in the Department of Teaching and Learning at the School of Education and Human Development, previously engaged in learning analytics research at Utah State University. In one project, Professor Kim:

  • evaluated the effectiveness of computer-based scaffolding in the context of problem-based learning (PBL).
  • applied a Bayesian meta-analysis to the collected data, a statistical methodology that ‘combines the results and synthesize information from multiple individual studies’2.
  • identified how particular scaffolding strategies are more suitable for problem-based learning.
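The core idea behind a Bayesian meta-analysis (pooling effect estimates from several studies into one posterior) can be illustrated with a much-simplified fixed-effect, normal-normal sketch. The cited study fits a far richer model; the effect sizes and standard errors here are invented:

```python
import math

# Hypothetical per-study effect sizes and standard errors. A fixed-effect
# Bayesian update with a weakly informative normal prior combines them via
# precision weighting into a single pooled posterior estimate.
studies = [(0.45, 0.10), (0.30, 0.15), (0.52, 0.08)]  # (effect, std. error)

prior_mean, prior_sd = 0.0, 10.0          # vague prior on the true effect
precision = 1.0 / prior_sd ** 2
weighted_sum = prior_mean * precision

for effect, se in studies:
    w = 1.0 / se ** 2                      # each study weighted by its precision
    precision += w
    weighted_sum += effect * w

post_mean = weighted_sum / precision       # posterior mean of the pooled effect
post_sd = math.sqrt(1.0 / precision)       # posterior standard deviation
print(f"pooled effect = {post_mean:.3f} +/- {post_sd:.3f}")
```

Note how the most precise study (smallest standard error) pulls the pooled estimate hardest, and how the posterior uncertainty shrinks below that of any single study.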

Through his experience, Dr. Kim expressed that learning analytics can help us evaluate students’ pathways through variables such as GPA, course evaluations, class format, or online interactions in a virtual learning environment. Dr. Kim is enthusiastic about the prospect of pursuing learning analytics within his research and teaching practice at the University of Miami.

References

    1. Siemens, G., & Gašević, D. (2012). Special Issue on Learning and Knowledge Analytics. Educational Technology & Society, 15(3), 1–163.
    2. Kim, N., Belland, B., & Walker, A. (2017). Effectiveness of Computer-Based Scaffolding in the Context of Problem-Based Learning for STEM Education: Bayesian Meta-analysis. Educational Psychology Review, 30(2), 397-429. doi: 10.1007/s10648-017-9419-
    3. Gašević, D., Dawson, S., & Siemens, G. (2014). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71. doi: 10.1007/s11528-014-0822-x
    4. Stephen Hutt, Margo Gardener, Donald Kamentz, Angela L. Duckworth, and Sidney K. D'Mello. 2018. Prospectively predicting 4-year college graduation from student applications. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK '18). ACM, New York, NY, USA, 280-289. DOI: https://doi.org/10.1145/3170358.3170395
    5. Vitomir Kovanović, Srećko Joksimović, Negin Mirriahi, Ellen Blaine, Dragan Gašević, George Siemens, and Shane Dawson. 2018. Understand students' self-reflections through learning analytics. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK '18). ACM, New York, NY, USA, 389-398. DOI: https://doi.org/10.1145/3170358.3170374
    6. SHEILA. (2018). Retrieved from http://sheilaproject.eu/
    7. Marcelo Worsley. 2018. (Dis)engagement matters: identifying efficacious learning practices with multimodal learning analytics. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK '18). ACM, New York, NY, USA, 365-369. DOI: https://doi.org/10.1145/3170358.3170420
    8. Yi-Shan Tsai and Dragan Gasevic. 2017. Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK '17). ACM, New York, NY, USA, 233-242. DOI: https://doi.org/10.1145/3027385.3027400, p. 2
    9. Wong, B. (2017). Learning analytics in higher education: an analysis of case studies. Asian Association of Open Universities Journal, 12(1), 21-40. doi: 10.1108/aaouj-01-2017-0009
    10. Yi-Shan Tsai and Dragan Gasevic. 2017. Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK '17). ACM, New York, NY, USA, 233-242. DOI: https://doi.org/10.1145/3027385.3027400, p. 3

Research Team

Gemma Henderson: Senior Instructional Designer, Academic Technologies

Cameron Riopelle: Librarian Assistant Professor, Data Services

What is it?

Grounded in educational theory, research, and practice, Learning Analytics involves studying data about key stakeholders in education (students, educators, researchers, and administrators) to better understand and enrich the learning process. LA projects can range from predicting the problems students may encounter along the way (financial aid, lack of prerequisites, low grades) to analyzing how they interact with learning systems (Blackboard Learn, Canvas, Moodle, SSC Campus).

How does it work?

There are multiple pathways to begin exploring learning analytics. Some questions to consider:

Where will you look for available data? Identify and collect multiple data sets (open, internal, cross-institutional), clean and aggregate the data, if possible, and make data open (while protecting privacy) for further analysis. For example, learning management systems collect data on grades, discussion content, and videos or resources accessed; using this data, or merging it with other sources such as administrative and enrollment data, can help provide a more comprehensive picture of students at the university.
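The merging step described above is essentially a key-join on a student identifier. A minimal standard-library sketch, with invented LMS and registrar fields:

```python
# Hypothetical LMS activity records and registrar enrollment records.
# Joining them on student_id yields a more complete per-student picture.
lms = [
    {"student_id": 1, "logins": 42, "forum_posts": 7},
    {"student_id": 2, "logins": 3, "forum_posts": 0},
]
registrar = [
    {"student_id": 1, "major": "Biology", "credits": 78},
    {"student_id": 2, "major": "History", "credits": 30},
]

def merge_records(left, right, key):
    """Inner-join two lists of dicts on `key`."""
    index = {row[key]: row for row in right}
    return [{**row, **index[row[key]]} for row in left if row[key] in index]

profiles = merge_records(lms, registrar, "student_id")
print(profiles[0])
# {'student_id': 1, 'logins': 42, 'forum_posts': 7, 'major': 'Biology', 'credits': 78}
```

In practice the same join is usually done with a data-frame library or in SQL; the inner-join semantics (keep only students present in both sources) are the design choice to watch, since it silently drops students missing from either system.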

Who will collaborate with you? As experts in data science explore the potential of learning analytics, educators and educational researchers are vital collaborators in grounding a project in educational theory and analysis, and in taking meaningful action on learning data.

What will the data answer? Address the purpose or research question behind a project, the ethical impact of pursuing a LA project, and the principles that will govern the collection of data. Constructing a data analysis process that measures students in their own particular contexts, as well as in comparison across differing scales such as courses, majors/minors, and departments, may provide new insights.
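Measuring a student against differing scales can be as simple as computing z-scores within different groupings. A sketch with invented grade data (the course and department fields are hypothetical):

```python
import statistics

# Hypothetical final grades. Z-scores place each student relative to two
# different scales: their own course and the department overall.
grades = [
    {"student": "A", "course": "BIO101", "dept": "Biology", "grade": 91},
    {"student": "B", "course": "BIO101", "dept": "Biology", "grade": 78},
    {"student": "C", "course": "BIO220", "dept": "Biology", "grade": 85},
    {"student": "D", "course": "BIO220", "dept": "Biology", "grade": 70},
]

def zscores(rows, scale):
    """Z-score each row's grade within the group defined by `scale`."""
    groups = {}
    for r in rows:
        groups.setdefault(r[scale], []).append(r["grade"])
    stats = {g: (statistics.mean(v), statistics.pstdev(v)) for g, v in groups.items()}
    return {r["student"]: (r["grade"] - stats[r[scale]][0]) / stats[r[scale]][1]
            for r in rows}

print(zscores(grades, "course"))   # each student relative to their course
print(zscores(grades, "dept"))     # each student relative to the department
```

A student can look average within a hard course yet well above the departmental mean, which is exactly why comparing across scales can surface new insights.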

How will you share, communicate and visualize results? Reports and summaries generated through learning analytics methodologies should tell a story about the learning process and motivate instrumental change. These stories should follow strict guidelines about the ethics of collecting, storing, and using data in which vulnerabilities and sensitive information might be identifiable.

Who’s doing it?

Learning analytics projects are diverse in both scale and subject matter – exciting and daunting for individuals new to the field. In recent years, focus has transitioned from student retention to establishing ethical frameworks and policies. Furthermore, as researchers experiment with learning analytics, projects are often temporary in nature, yet they may have lasting effects on the field3 – such as ‘Course Signals,’ the discontinued early warning system to improve student retention at Purdue. To raise awareness of the practice of learning analytics, JISC (2016) shares eleven case studies of institutions deploying learning analytics, while Ebner et al. (2017) provide a detailed analysis of LA literature. In summary, some initial themes, descriptions, and recent developments are shared below.

  • Predicting Student Performance involves analyzing various datasets to predict student performance and subsequent behaviors over time. Collaborators from the University of Colorado Boulder, Columbia University, the University of Pennsylvania, and Character Lab leveraged a large and “unique national dataset of 41,359 college applications to prospectively predict 4-year bachelor's graduation in a generalizable manner”4.
  • Textual Analysis may include analyzing the contents of student writing (blog posts, forums, journals) submitted through virtual learning environments or systems. Researchers from the University of South Australia, the University of Texas at Arlington, the University of Edinburgh / Monash University, and Stanford University recently shared their study on the development of an analytics system for the assessment of reflective writing5.
  • Ethical Frameworks and Policies: In 2014, Slade & Prinsloo shared ethical and privacy principles for learning analytics; in 2015, JISC released a Code of Practice for Learning Analytics, informed by institutions such as the Open University and the University of Edinburgh. Between 2016 and 2018, the University of Edinburgh partnered with European universities in the cross-institutional research project SHEILA to “build a policy development framework that promotes formative assessment and personalized learning, by taking advantage of direct engagement of stakeholders in the development process.”6
  • Multimodal Analysis can involve analyzing multiple types of learner-generated data, for instance non-verbal cues of learners (gestures, speech, electrodermal activity (EDA)) captured through video. At Northwestern University, Worsley (2018) employed in-class video capture of students who completed engineering design tasks in pairs, analyzing gesture, speech, and electrodermal activation data7.

Why is it significant?

Monitoring Student Progress: Educational institutions have been embracing learning analytics as a powerful methodology for understanding educational outcomes, with most interest still residing in “monitoring or measuring student progress”8 rather than in other components such as “predicting learning success or prescribing intervention strategies”8.

Actionable Data: At its best, learning analytics uses a variety of sophisticated approaches to data analysis (statistics, machine learning, and qualitative methods) to provide key information that administrators, researchers, and possibly even students themselves can use to better understand the educational process as a whole, from the basics of student enrollment to complex topics of success and remedial actions9.

Alternative Evaluation Methods: With appropriately scaled and staffed learning analytics units that operate campus-wide and in keeping with the missions of higher education, it is possible to imagine a future in which teachers are aware of the effectiveness of their instruction, struggling students are provided with immediate and timely administrative intervention, and measures of success are approached holistically, taking into account both “objective” measures such as GPA and job placement and more affective qualities such as student satisfaction and courses meeting learning goals.

What are the downsides?

Scalability: The landscape of learning analytics is still uneven, with different institutions approaching the topic with often vastly different methodologies and goals. A common problem is the question of scale: if only certain colleges or departments within a given university adopt learning analytics platforms, but others do not, the endeavor can be weakened as a whole, and generalizations (both statistical and otherwise) are consequently limited.

Surveillance: Part of the difficulty in campus-wide adoption of learning analytics platforms might well be that more humanistic departments could view with hostility the idea that students are to be surveilled through for-profit means and subsequently assessed. These concerns are both relevant and eye-opening; in fact, with any study of “success,” one of the key questions is: when it comes to measuring outcomes, who gets to define success? The instructor, student, or administrators?

Strategic Vision: In their 2017 review of learning analytics at universities, Yi-Shan Tsai and Dragan Gasevic discuss the challenges that learning analytics methodologies face in today’s educational climates. Challenges include shortages of leadership, unequal engagement with different stakeholders, shortages of pedagogy-based approaches to removing learning barriers, insufficient training opportunities, a limited number of empirical studies to validate impact, and a limited availability of policies to address issues of privacy and ethics10. The shortage of pedagogy-based approaches is especially important in that learning analytics platforms often try to improve themselves using technical and IT-based solutions, rather than taking into account the goals of instructors10. For the challenge of privacy and ethics, as well as the challenge of unequal engagement, policies need to be in place that present a strategic vision in which all users of learning analytics platforms are seen as not just “users” but also stakeholders.

Where is it going?

Open communities: Due to the various skills needed, researchers and educators from diverse fields are continuing to work together to explore how analytics can inform teaching and learning. The collaborative nature of learning analytics projects has seen the development of multiple interdisciplinary networks, within and outside institutions, including Society for Learning Analytics Research (SoLAR), Learning Analytics and Knowledge Conference, LINK Research Lab, and Learning Analytics Research Network (NYU-LEARN).

Personalized learning: To provide further guidance in large classes, the OnTask project aims to “improve the academic experience of students through the delivery of timely, personalised and actionable student feedback throughout their participation in a course.” Designed by a team of leading researchers in the field of learning analytics, OnTask aims to provide all stakeholders (learners, faculty, administrators) with greater access to data.

Libraries: Libraries are key partners in learning analytics initiatives – the learning data captured by the library (resources, spaces, services) relates to multiple stakeholders at the institution. In 2018, the Association of Research Libraries (ARL) delivered a SPEC Survey on Learning Analytics to ARL members to explore data management practices and ethics commitments within existing learning analytics initiatives.

Enterprise solutions: As institutions research sustainable solutions to manage, analyze, and act upon the production of learning data, corporate entities are innovating quickly in this area. For example, in association with Cambridge English (part of the University of Cambridge), Write & Improve provides a free tool for learners of English, analyzing submissions and providing automated feedback.

What are the implications for teaching and learning?

Curriculum: As the discipline expands, higher education institutions are developing programs and courses in Learning Analytics. The University of Miami introduces learning analytics within its M.S.Ed. in Applied Learning Sciences and Ph.D. in Teaching and Learning – Science, Technology, Engineering and Mathematics (STEM) Education; Teachers College at Columbia University delivers a graduate program in Learning Analytics; and the University of Edinburgh’s MSc in Digital Education program offers a fully online course co-taught by leading researchers in the field.

Ethics: The ethical implications of using learning data are broad and important to keep in mind. There is often a barrier to access for users of learning analytics platforms, due to the differing ways the environments are used by researchers, instructors, and students. Students are often not given access to the data that is being collected and analyzed to describe them and, perhaps more importantly, to measure their outcomes. Another key ethical problem is the focus on student retention instead of thinking about different modes of the learning process: what exactly are we measuring? For example, an analytical process that seeks to measure the effectiveness of discussion groups might differ greatly from one that seeks to measure student retention. Finally, as we use third-party learning tools such as Google Drive, we are often unable to access the data produced; their terms of agreement may allow any data housed on their systems to be used for purposes such as product development and marketing.