Learning Analytics

14,000,000 Leading Edge Experts on the ideXlab platform

The Experts below are selected from a list of 26,562 Experts worldwide ranked by the ideXlab platform

Dirk Ifenthaler - One of the best experts on this subject based on the ideXlab platform.

  • Utilising Learning Analytics to support study success in higher education: a systematic review
    Educational Technology Research and Development, 2020
    Co-Authors: Dirk Ifenthaler
    Abstract:

Study success includes the successful completion of a first degree in higher education to the largest extent, and the successful completion of individual Learning tasks to the smallest extent. Factors affecting study success range from individual dispositions (e.g., motivation, prior academic performance) to characteristics of the educational environment (e.g., attendance, active Learning, social embeddedness). Recent developments in Learning Analytics, which is a socio-technical data mining and analytic practice in educational contexts, show promise in enhancing study success in higher education through the collection and analysis of data from learners, Learning processes, and Learning environments in order to provide meaningful feedback and scaffolds when needed. This research reports a systematic review focusing on empirical evidence, demonstrating how Learning Analytics have been successful in facilitating study success in the continuation and completion of students’ university courses. Using standardised steps of conducting a systematic review, an initial set of 6220 articles was identified. The final sample includes 46 key publications. The findings obtained in this systematic review suggest that there are a considerable number of Learning Analytics approaches which utilise effective techniques in supporting study success and students at risk of dropping out. However, rigorous, large-scale evidence of the effectiveness of Learning Analytics in supporting study success is still lacking. The tested variables, algorithms, and methods collected in this systematic review can be used as a guide in helping researchers and educators to further improve the design and implementation of Learning Analytics systems.

  • Change Management for Learning Analytics
    Advances in Analytics for Learning and Teaching, 2020
    Co-Authors: Dirk Ifenthaler
    Abstract:

Learning Analytics draw on an eclectic set of methodologies and data to provide summative, real-time, and predictive insights for improving Learning, teaching, organisational efficiency, and decision-making. The implementation of Learning Analytics at higher education institutions may have broad implications for the organisation and its stakeholders (e.g. students, academic staff, administrators), including changes in Learning culture and educational decision-making. Hence, change management seems to be an essential prerequisite for implementing Learning Analytics; change management encompasses approaches to prepare and support organisations and their stakeholders in making sustainable and beneficial organisational change. This chapter presents two case studies which exemplify the staff and technological change management processes required for the successful implementation of Learning Analytics. Implications of the case studies include insights into functioning implementation strategies, highlighting the importance of open communication structures, transparency of decision-making, and systems thinking approaches.

  • Features students really expect from Learning Analytics
    Computers in Human Behavior, 2018
    Co-Authors: Clara Schumacher, Dirk Ifenthaler
    Abstract:

    More and more Learning in higher education settings is being facilitated through online Learning environments. Students' ability to self-regulate their Learning is considered a key factor for success in higher education. Learning Analytics offer a promising approach to better support and understand students' Learning processes. The purpose of this study is to investigate students' expectations towards features of Learning Analytics systems and their willingness to use these features for Learning. A total of 20 university students participated in an initial qualitative exploratory study. They were interviewed about their expectations of Learning Analytics features. The findings of the qualitative study were complemented by a quantitative study with 216 participating students. The findings show that students expect Learning Analytics features to support their planning and organization of Learning processes, provide self-assessments, deliver adaptive recommendations, and produce personalized analyses of their Learning activities. Highlights: investigating students' expectations toward features of Learning Analytics systems; illustrating 15 Learning Analytics features which support Learning processes; relating Learning Analytics features to phases of self-regulation.

  • Are Higher Education Institutions Prepared for Learning Analytics?
    TechTrends, 2017
    Co-Authors: Dirk Ifenthaler
    Abstract:

    Higher education institutions and involved stakeholders can derive multiple benefits from Learning Analytics by using different data Analytics strategies to produce summative, real-time, and predictive insights and recommendations. However, are institutions and academic as well as administrative staff prepared for Learning Analytics? A Learning Analytics benefits matrix was used for this study to investigate the current capabilities of Learning Analytics at higher education institutions, explore the importance of data sources for a valid Learning Analytics framework, and gain an understanding of how important insights from Learning Analytics are perceived. The findings reveal that there is a lack of staff and technology available for Learning Analytics projects. We conclude that it will be necessary to conduct more empirical research on the validity of Learning Analytics frameworks and on expected benefits for Learning and instruction to confirm the high hopes this promising emerging technology raises.

  • Student perceptions of privacy principles for Learning Analytics
    Educational Technology Research and Development, 2016
    Co-Authors: Dirk Ifenthaler, Clara Schumacher
    Abstract:

    The purpose of this study was to examine student perceptions of privacy principles related to Learning Analytics. Privacy issues for Learning Analytics include how personal data are collected and stored as well as how they are analyzed and presented to different stakeholders. A total of 330 university students participated in an exploratory study confronting them with Learning Analytics systems and associated issues of control over data and sharing of information. Findings indicate that students expect Learning Analytics systems to include elaborate adaptive and personalized dashboards. Further, students are rather conservative in sharing data for Learning Analytics systems. On the basis of the relationship between the acceptance and use of Learning Analytics systems and privacy principles, we conclude that all stakeholders need to be equally involved when Learning Analytics systems are implemented at higher education institutions. Further empirical research is needed to elucidate the conditions under which students are willing to share relevant data for Learning Analytics systems.

Yiwei Cao - One of the best experts on this subject based on the ideXlab platform.

  • Go-Lab Releases of the Learning Analytics, Scaffolding Services, and Add-on Services – Final
    2015
    Co-Authors: Sven Manske, Yiwei Cao
    Abstract:

    This deliverable describes the final releases of the Learning Analytics, scaffolding, and add-on services of Go-Lab, following the final specification reported in D4.6 (M33). It also considers users’ feedback and reviewers’ evaluation of the initial release in D4.4 (M24). This deliverable consists of two major parts: (i) the Learning Analytics and scaffolding services and (ii) the add-on services, which are specified in two components, i.e. the booking system and the tutoring platform. The Learning Analytics and scaffolding services consist of a rich backend and several options to develop new Learning Analytics apps according to the specified infrastructure and architecture. To support the two main stakeholders of Learning Analytics in Go-Lab, namely teachers and learners, a teacher dashboard and several apps, e.g., to support learners’ reflection, will be presented in this deliverable. Participatory design activities, particularly framed by the Go-Lab Summer School 2015, have been used to evaluate initial versions of the Learning Analytics apps and to gather useful feedback aligned to the stakeholders. The booking system (http://www.golabz.eu/) offers Go-Lab remote labs an appropriate booking service using a calendar managed by lab owners. It consists of the front-end user interfaces for both lab owners and teachers, as well as the backend services to validate the booking information through the Go-Lab Smart Gateway. The tutoring platform (http://tutoring.golabz.eu/) helps build up a virtual community where teachers can share their expertise in inquiry Learning with online labs and help each other grow their teaching skills. Both provide the add-on services for the Go-Lab Portal. The releases of the Learning Analytics, scaffolding, and add-on services facilitate a technical framework with additional indispensable services for the Go-Lab Portal (cf. D5.6). They are collaborative research results of WP4 and WP5, taking into account the feedback from WP3.
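    The calendar-based booking logic described above can be illustrated with a minimal sketch. This is a hypothetical illustration only: the function name, data layout, and overlap rule are assumptions, not taken from the Go-Lab deliverable.

    ```python
    from datetime import datetime

    # Hypothetical existing bookings for one remote lab: (start, end) pairs.
    bookings = [
        (datetime(2015, 6, 1, 9), datetime(2015, 6, 1, 10)),
        (datetime(2015, 6, 1, 13), datetime(2015, 6, 1, 15)),
    ]

    def slot_is_free(bookings, start, end):
        """A calendar-style availability check: the requested slot is free
        only if it overlaps none of the existing bookings."""
        return all(end <= b_start or start >= b_end for b_start, b_end in bookings)

    print(slot_is_free(bookings, datetime(2015, 6, 1, 10), datetime(2015, 6, 1, 12)))  # -> True
    print(slot_is_free(bookings, datetime(2015, 6, 1, 14), datetime(2015, 6, 1, 16)))  # -> False
    ```

    A backend service validating a booking request (as the Go-Lab Smart Gateway is said to do) would run a check of this shape before confirming the reservation.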

  • Go-Lab Specifications of the Learning Analytics, Scaffolding Services, and Add-on Services – Final
    2015
    Co-Authors: Sven Manske, Yiwei Cao
    Abstract:

    This deliverable describes the final specification of the Learning Analytics, scaffolding, and add-on services of Go-Lab. It follows the initial specification reported in D4.2 (M18) and reflects the development and evaluation of the release in D4.4 (M24). It serves as a guideline for the final release of the Learning Analytics, scaffolding, and add-on services in D4.8 (M36). This deliverable consists of two major parts: (i) the Learning Analytics and scaffolding services and (ii) the add-on services, including a booking system and a tutoring platform (which was called the Bartering Platform in D4.2). The Learning Analytics and scaffolding services consist of a rich backend and several options to develop new Learning Analytics apps according to the specified infrastructure and architecture. To support the two main stakeholders of Learning Analytics in Go-Lab, namely teachers and learners, a teacher dashboard and several apps, e.g. to support learners’ reflection, will be presented in this deliverable. Participatory design activities, particularly framed by the Go-Lab Summer School 2015, have been used to evaluate initial versions of the Learning Analytics apps and to gather useful feedback aligned to the stakeholders. The booking system offers Go-Lab remote labs an appropriate booking service using a calendar managed by lab owners. The tutoring platform helps build up a virtual community where teachers can share their expertise in inquiry Learning with online labs and help each other grow their teaching skills.

  • Releases of the Learning Analytics, Scaffolding Services, and Add-on Services – initial
    2014
    Co-Authors: Tobias Hecking, Yiwei Cao
    Abstract:

    This deliverable describes and demonstrates the initial release of the Learning Analytics, scaffolding, and add-on services of Go-Lab. The main product of this deliverable is the set of software prototypes of the Learning Analytics, scaffolding, and add-on services, many of which are accessible online. Currently, we have realised the first prototypes specified in the initial specification of D4.2. This deliverable serves as documentation of the development progress of the first prototypes and also reports early evaluation results. It consists of two major parts: (i) the Learning Analytics and scaffolding services and (ii) the add-on services, including a booking system and a tutoring platform (which was called the Bartering Platform in D4.2). The initial release of the Learning Analytics and scaffolding services comprises the prototypical implementation of the action logging, backend services, and guidance mechanisms that have been specified in D4.2. The add-on services prototype of the booking system includes a set of mock-up components, while the first prototype of the tutoring platform is accessible at http://tutoring.golabz.eu. The prototypes show that they are able to support the federation of online labs via the booking mechanism and to motivate the user community to interact actively on the portal via skills bartering. Based on the experiences and prototypes described in this deliverable and our future work, the specifications of the Learning Analytics, scaffolding, and add-on services will be updated and finalised in D4.6 (M33). Furthermore, the evaluation of the first prototypes will contribute to the improvement of the final release in D4.8 (M36).

  • Specifications of the Learning Analytics, Scaffolding Services, and Add-on Services – initial
    2014
    Co-Authors: Tobias Hecking, Yiwei Cao
    Abstract:

    This deliverable describes the initial version of the specifications of the Learning Analytics, scaffolding, and add-on services of Go-Lab. All these services provide additional functionality to teachers, students, and lab owners using the Go-Lab Portal (see D5.2). This deliverable consists of two major parts: (i) the Learning Analytics and scaffolding services and (ii) the add-on services. Learning Analytics aims to collect and analyse user activities to make Learning and Learning environments more effective and efficient. The Go-Lab Learning Analytics services provide means to track user activities and analyse the tracked data. This provides the foundation for guidance mechanisms for students through scaffolding, as well as intelligent decision support for teachers and lab owners. More specifically, the Learning Analytics services provide support for recommendations, intelligent feedback for students, and analytical reports that help to design better inquiry-based Learning scenarios and spaces. This deliverable describes the architecture of the Learning Analytics services in detail. Furthermore, it explains how this service integrates with the Go-Lab Portal, and a mechanism, controlled by the teacher, that enables privacy of the tracked data. We regard the Learning Analytics service as an enabler of scaffolding applications, and thus the Learning Analytics services and their feedback loop together provide the scaffolding services. The add-on services consist of the bartering platform and the lab booking system, which support the Go-Lab Portal in different aspects. The bartering platform offers teachers peer assistance through a tutor social platform for expertise sharing related to online labs and inquiry Learning spaces. Teachers are motivated to help other teachers and to share their skills and knowledge about online labs on the bartering platform.
Furthermore, the bartering platform also attempts to make the Go-Lab Portal sustainable and usable by many schools through active interactions among teacher communities and a credit system, ranging from social rating to payment mechanisms. Since remote labs can only be used by a limited number of users at the same time, the Go-Lab Portal needs services to arrange which users can use a lab at a given time. Therefore, a calendar-based Go-Lab booking system is offered to manage remote lab booking tasks. In the booking system, three Go-Lab booking schemes are specified in order to make both Go-Lab and external remote labs ready for use by Go-Lab users. In this document, usage scenarios, requirements, and initial component specifications are described and contextualised with existing research. The specifications in this deliverable will be updated and finalised in D4.6 (M33). Furthermore, a first version of the implemented prototypes will be delivered in D4.4 (M24) and the final implementation will be described in D4.8 (M36). As a preview of D4.4, Appendix B briefly describes the implementation efforts achieved up to M18.

Ulrik Schroeder - One of the best experts on this subject based on the ideXlab platform.

  • Supporting action research with Learning Analytics
    Learning Analytics and Knowledge, 2013
    Co-Authors: Anna Lea Dyckhoff, Mohamed Amine Chatti, Vlatko Lukarov, Arham Muslim, Ulrik Schroeder
    Abstract:

    Learning Analytics tools should be useful, i.e., they should be usable and provide the functionality for reaching the goals attributed to Learning Analytics. This paper seeks to unite Learning Analytics and action research. On this basis, we investigate how the multitude of questions that arise during technology-enhanced teaching and Learning can systematically be mapped to sets of indicators. We examine which questions are not yet supported and propose concepts for indicators that have a high potential of positively influencing teachers' didactical considerations. Our investigation shows that many questions of teachers cannot be answered with currently available research tools. Furthermore, few Learning Analytics studies report on measuring impact. We describe which effects Learning Analytics should have on teaching and discuss how this could be evaluated.

  • Design and implementation of a Learning Analytics toolkit for teachers
    Educational Technology & Society, 2012
    Co-Authors: Anna Lea Dyckhoff, Mohamed Amine Chatti, Dennis Zielke, Mareike Bultmann, Ulrik Schroeder
    Abstract:

    Introduction Learning Management Systems (LMS) or Virtual Learning Environments (VLE) are widely used and have become part of the common toolkit of educators (Schroeder, 2009). One of the main goals of integrating traditional teaching methods with technology enhancements is to improve the quality of teaching and Learning in large university courses with many students. But does utilizing a VLE automatically improve teaching and Learning? In our experience, many teachers just upload existing files, such as lecture slides, handouts, and exercises, when starting to use a VLE. This improves the availability of Learning resources. For improving teaching and Learning, it could be helpful to create more motivating, challenging, and engaging Learning materials and, e.g., collaborative scenarios to improve Learning among large groups of students. Teachers could, e.g., use audio and video recordings of their lectures or provide interactive, demonstrative multimedia examples and quizzes. If they put effort into the design of such online Learning activities, they need tools that help them observe the consequences of their actions and evaluate their teaching interventions. They need appropriate access to data to assess the changing behaviors and performances of their students and to estimate the level of improvement that has been achieved in the Learning environment. With the establishment of TEL, a new research field, called Learning Analytics, is emerging (Elias, 2011). This research field borrows and synthesizes techniques from different related fields, such as Educational Data Mining (EDM), Academic Analytics, Social Network Analysis, or Business Intelligence (BI), to harness them for converting educational data into useful information and thereupon to motivate actions, such as self-reflecting on one's previous teaching or Learning activities, to foster improved teaching and Learning. 
The main goal of BI is to turn enterprise data into useful information for management decision support. However, Learning Analytics, Academic Analytics, as well as EDM more specifically focus on tools and methods for exploring data coming from educational contexts. While Academic Analytics takes a university-wide perspective, including, e.g., organizational and financial issues (Campbell & Oblinger, 2007), Learning Analytics as well as EDM focus specifically on data about teaching and Learning. Siemens (2010) defines Learning Analytics as "the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on Learning." It can support teachers and students in taking action based on the evaluation of educational data. However, the technology to deliver this potential is still very young, and research on understanding the pedagogical usefulness of Learning Analytics is still in its infancy (Johnson et al., 2011b; Johnson et al., 2012). It is a current goal at RWTH Aachen University to enhance its VLE--the Learning and teaching portal L2P (Gebhardt et al., 2007)--with user-friendly tools for Learning Analytics, in order to equip its teachers and tutors with means to evaluate the effectiveness of TEL within their instructional design and courses offered. These teachers still face difficulties that deter them from integrating cyclical reflective research activities, comparable to Action Research, into everyday practice. Action Research is characterized by "a continuing effort to closely interlink, relate and confront action and reflection, to reflect upon one's conscious and unconscious doings in order to develop one's actions, and to act reflectively in order to develop one's knowledge" (Altrichter et al., 2005, p. 6). 
A pre-eminent barrier is the additional workload, originating from tasks of collecting, integrating, and analyzing raw data from log files of their VLE (Altenbernd-Giani et al., 2009). To tackle these issues, we have developed the "exploratory Learning Analytics Toolkit" (eLAT). …
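    The workload described above, turning raw VLE log files into indicators a teacher can interpret, can be illustrated with a minimal sketch. This is a hypothetical illustration, not eLAT's actual implementation: the log format, function name, and indicator are assumptions.

    ```python
    # Hypothetical VLE log entries: (timestamp, student_id, action).
    logs = [
        ("2012-05-02T10:15", "s1", "view_slides"),
        ("2012-05-02T10:20", "s2", "view_slides"),
        ("2012-05-03T09:00", "s1", "submit_exercise"),
        ("2012-05-04T11:30", "s3", "view_slides"),
    ]

    def activity_indicator(logs, action):
        """Count distinct students who performed a given action --
        a simple usage indicator a teacher might inspect in a dashboard."""
        return len({student for _, student, a in logs if a == action})

    print(activity_indicator(logs, "view_slides"))     # -> 3
    print(activity_indicator(logs, "submit_exercise"))  # -> 1
    ```

    An analytics toolkit automates exactly this kind of aggregation so that teachers can monitor course activity without handling raw log files themselves.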

  • A reference model for Learning Analytics
    International Journal of Technology Enhanced Learning, 2012
    Co-Authors: Mohamed Amine Chatti, Anna Lea Dyckhoff, Ulrik Schroeder, Hendrik Thüs
    Abstract:

    Recently, there is an increasing interest in Learning Analytics in Technology-Enhanced Learning (TEL). Generally, Learning Analytics deals with the development of methods that harness educational datasets to support the Learning process. Learning Analytics (LA) is a multi-disciplinary field involving machine Learning, artificial intelligence, information retrieval, statistics and visualisation. LA is also a field in which several related areas of research in TEL converge. These include academic Analytics, action Analytics and educational data mining. In this paper, we investigate the connections between LA and these related fields. We describe a reference model for LA based on four dimensions, namely data and environments (what?), stakeholders (who?), objectives (why?) and methods (how?). We then review recent publications on LA and its related fields and map them to the four dimensions of the reference model. Furthermore, we identify various challenges and research opportunities in the area of LA in relation to each dimension.
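    The four dimensions of the reference model can be sketched as a small data structure used to classify a study. This is a hypothetical illustration; the class, field names, and example values are assumptions, not taken from the paper.

    ```python
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class LAReferenceModelCase:
        """One Learning Analytics study mapped onto the four dimensions
        of the reference model: what, who, why, and how."""
        data_environments: List[str]  # what? data sources and environments
        stakeholders: List[str]       # who? e.g. learners, teachers, institutions
        objectives: List[str]         # why? e.g. monitoring, prediction, reflection
        methods: List[str]            # how? e.g. statistics, data mining, visualisation

    # Example: classifying a hypothetical dropout-prediction study.
    case = LAReferenceModelCase(
        data_environments=["LMS activity logs"],
        stakeholders=["teachers", "institutions"],
        objectives=["prediction", "intervention"],
        methods=["classification", "visualisation"],
    )
    print(case.objectives)
    ```

    Mapping each reviewed publication onto such a record is one way the paper's survey of the field along the four dimensions could be operationalised.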

Simon Buckingham Shum - One of the best experts on this subject based on the ideXlab platform.

  • Educational data mining meets Learning Analytics
    Learning Analytics and Knowledge, 2012
    Co-Authors: Ryan S Baker, Erik Duval, John C Stamper, David Wiley, Simon Buckingham Shum
    Abstract:

    This panel is proposed as a means of promoting mutual Learning and continued dialogue between the Educational Data Mining and Learning Analytics communities. EDM has been developing as a community for longer than the LAK conference has existed, so what, if anything, makes the LAK community different, and where is the common ground?

  • Learning dispositions and transferable competencies: pedagogy, modelling and Learning Analytics
    Learning Analytics and Knowledge, 2012
    Co-Authors: Simon Buckingham Shum, Ruth Deakin Crick
    Abstract:

    Theoretical and empirical evidence in the Learning sciences substantiates the view that deep engagement in Learning is a function of a complex combination of learners' identities, dispositions, values, attitudes and skills. When these are fragile, learners struggle to achieve their potential in conventional assessments, and, critically, are not prepared for the novelty and complexity of the challenges they will meet in the workplace, and the many other spheres of life which require personal qualities such as resilience, critical thinking and collaboration skills. To date, the Learning Analytics research and development communities have not addressed how these complex concepts can be modelled and analysed, and how more traditional social science data analysis can support and be enhanced by Learning Analytics. We report progress in the design and implementation of Learning Analytics based on a research-validated multidimensional construct termed "Learning power". We describe, for the first time, a Learning Analytics infrastructure for gathering data at scale, managing stakeholder permissions, the range of Analytics that it supports from real-time summaries to exploratory research, and a particular visual analytic which has been shown to have demonstrable impact on learners. We conclude by summarising the ongoing research and development programme and identifying the challenges of integrating traditional social science research with Learning Analytics and modelling.

  • Social Learning Analytics: five approaches
    Learning Analytics and Knowledge, 2012
    Co-Authors: Rebecca Ferguson, Simon Buckingham Shum
    Abstract:

    This paper proposes that Social Learning Analytics (SLA) can be usefully thought of as a subset of Learning Analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social Learning, it takes into account both formal and informal educational environments, including networks and communities. The paper introduces the broad rationale for SLA by reviewing some of the key drivers that make social Learning so important today. Five forms of SLA are identified, including those which are inherently social, and others which have social dimensions. The paper goes on to describe early work towards implementing these Analytics on SocialLearn, an online Learning space in use at the UK's Open University, and the challenges that this is raising. This work takes an iterative approach to Analytics, encouraging learners to respond to and help to shape not only the Analytics but also their associated recommendations.

  • Social Learning Analytics
    Educational Technology and Society, 2012
    Co-Authors: Simon Buckingham Shum, Rebecca Ferguson
    Abstract:

    We propose that the design and implementation of effective Social Learning Analytics (SLA) present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the Learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers. Online social Learning is emerging as a significant phenomenon for a variety of reasons, which we review, in order to motivate the concept of social Learning. The second challenge is to identify different types of SLA and their associated technologies and uses. We discuss five categories of analytic in relation to online social Learning; these Analytics are either inherently social or can be socialised. This sets the scene for a third challenge, that of implementing Analytics that have pedagogical and ethical integrity in a context where power and control over data are now of primary importance. We consider some of the concerns that Learning Analytics provoke, and suggest that Social Learning Analytics may provide ways forward. We conclude by revisiting the drivers and trends, and consider future scenarios that we may see unfold as SLA tools and services mature.

  • LAK - Educational data mining meets Learning Analytics
    Proceedings of the 2nd International Conference on Learning Analytics and Knowledge - LAK '12, 2012
    Co-Authors: Ryan S Baker, Erik Duval, John C Stamper, David Wiley, Simon Buckingham Shum
    Abstract:

    This panel is proposed as a means of promoting mutual Learning and continued dialogue between the Educational Data Mining and Learning Analytics communities. EDM has been developing as a community for longer than the LAK conference, so what, if anything, makes the LAK community different, and where is the common ground?

Ryan S Baker - One of the best experts on this subject based on the ideXlab platform.

  • Challenges for the future of educational data mining: the Baker Learning Analytics Prizes
    Educational Data Mining, 2019
    Co-Authors: Ryan S Baker
    Abstract:

    Learning Analytics and educational data mining have come a long way in a short time. In this article, a lightly edited transcript of a keynote talk at the Learning Analytics and Knowledge Conference in 2019, I present a vision for some directions I believe the field should go: towards greater interpretability, generalizability, transferability, applicability, and clearer evidence for effectiveness. I pose these potential directions as a set of six contests, with concrete criteria for what would represent successful progress in each of these areas: the Baker Learning Analytics Prizes (BLAP). Solving these challenges will bring the field closer to achieving its full potential of using data to benefit learners and transform education for the better.

  • Educational data mining and Learning Analytics
    The Cambridge Handbook of the Learning Sciences, 2014, ISBN 978-1-107-62657-7, pp. 253-274
    Co-Authors: Ryan S Baker, Paul Salvador Inventado
    Abstract:

    In recent years, two communities have grown around a joint interest in how big data can be exploited to benefit education and the science of Learning: Educational Data Mining and Learning Analytics. This article discusses the relationship between these two communities, and the key methods and approaches of educational data mining. The article discusses how these methods emerged in the early days of research in this area, which methods have seen particular interest in the EDM and Learning Analytics communities, and how this has changed as the field matures and has moved to making significant contributions to both educational research and practice.
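
    One family of methods central to the EDM work this chapter surveys is predictive modelling of student outcomes, such as flagging students at risk of dropping out. The sketch below is purely illustrative: the activity features, weights, and student records are all invented for the example, and a real EDM model would learn its weights from historical completion data (e.g., via logistic regression) rather than setting them by hand.

    ```python
    import math

    # Toy weekly activity features per student (hypothetical data):
    # (logins, assignments submitted, forum posts).
    students = {
        "s1": (12, 5, 8),
        "s2": (2, 1, 0),
        "s3": (7, 3, 2),
    }

    # Illustrative hand-set weights: more activity lowers the risk score.
    WEIGHTS = (-0.25, -0.6, -0.15)
    BIAS = 3.0

    def dropout_risk(features):
        """Logistic score in [0, 1]; higher means greater estimated risk."""
        z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
        return 1 / (1 + math.exp(-z))

    at_risk = {sid: dropout_risk(f) for sid, f in students.items()}
    # Flag students whose estimated risk exceeds 0.5 for early intervention.
    flagged = [sid for sid, p in at_risk.items() if p > 0.5]
    ```

    The design point is that the model's output is actionable: a flagged student can trigger the kind of meaningful feedback and scaffolding that the study-success literature above argues for, rather than the prediction being an end in itself.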

  • Educational data mining meets Learning Analytics
    Learning Analytics and Knowledge, 2012
    Co-Authors: Ryan S Baker, Erik Duval, John C Stamper, David Wiley, Simon Buckingham Shum
    Abstract:

    This panel is proposed as a means of promoting mutual Learning and continued dialogue between the Educational Data Mining and Learning Analytics communities. EDM has been developing as a community for longer than the LAK conference, so what, if anything, makes the LAK community different, and where is the common ground?
