Browsing by Author "Ramaswami G"
- Effectiveness of a Learning Analytics Dashboard for Increasing Student Engagement Levels (Society for Learning Analytics Research (SoLAR), 2023-12-22) Ramaswami G; Susnjak T; Mathrani A
  Learning Analytics Dashboards (LADs) are gaining popularity as a platform for providing students with insights into their learning behaviour patterns in online environments. Existing LAD studies are mainly centred on displaying students’ online behaviours with simplistic descriptive insights. Only a few studies have integrated predictive components, and none can explain how their predictive models work or how they arrive at specific conclusions for a given student. A further gap exists in existing LADs with respect to prescriptive analytics that generate data-driven feedback to students on how to adjust their learning behaviour. The LAD in this study addresses this gap by integrating a full spectrum of current analytics technologies for sense-making while anchoring them within theoretical educational frameworks. This study’s LAD (SensEnablr) was evaluated for its effectiveness in improving learning in a student cohort at a tertiary institution. Our findings demonstrate that student engagement with learning technologies and course resources increased significantly immediately following interactions with the dashboard. Results also showed that the dashboard boosted the respondents’ learning motivation levels and that the novel insights drawn from predictive and prescriptive analytics were beneficial to their learning. This study therefore has implications for future research investigating student outcomes and optimizing student learning using LAD technologies.
- Learning analytics dashboard: a tool for providing actionable insights to learners (BioMed Central Ltd, 2022-02-14) Susnjak T; Ramaswami G; Mathrani A
  This study investigates current approaches to learning analytics (LA) dashboarding and highlights challenges faced by education providers in operationalizing them. We analyze recent dashboards for their ability to provide actionable insights that prompt informed responses by learners in adjusting their learning habits. Our study finds that most LA dashboards merely employ surface-level descriptive analytics, while only a few go beyond and use predictive analytics. In response to the identified gaps in recently published dashboards, we propose a state-of-the-art dashboard that not only leverages descriptive analytics components but also integrates machine learning in a way that enables both predictive and prescriptive analytics. We demonstrate how emerging analytics tools can be used to enable learners to adequately interpret predictive model behavior, and more specifically to understand how a predictive model arrives at a given prediction. We highlight how these capabilities build trust and satisfy emerging regulatory requirements surrounding predictive analytics. Additionally, we show how data-driven prescriptive analytics can be deployed within dashboards to provide concrete advice to learners, thereby increasing the likelihood of triggering behavioral changes. Our proposed dashboard is the first of its kind in terms of the breadth of analytics it integrates, and is currently deployed for trials at a higher education institution.
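A note on the explainability step this dashboard describes: one open-source way to produce the kind of per-prediction explanation a learner could read is the anchors method (also named explicitly in the prescriptive-analytics paper further down this list), for example via the `alibi` library. The sketch below is illustrative only: the dataset, features, and classifier are synthetic placeholders, not the dashboard's actual model or implementation.

```python
# Sketch: rule-based per-prediction explanations via the anchors method,
# using the open-source alibi library. All data, features, and the model
# are synthetic stand-ins, not the dashboard's actual implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from alibi.explainers import AnchorTabular

rng = np.random.default_rng(0)
feature_names = ["logins", "forum_posts", "avg_quiz_score", "video_minutes"]
X = rng.random((500, 4))
y = (X[:, 2] + 0.3 * X[:, 0] < 0.6).astype(int)  # toy "at risk" label

clf = RandomForestClassifier(n_estimators=100).fit(X, y)

explainer = AnchorTabular(clf.predict, feature_names)
explainer.fit(X)  # learns per-feature value distributions for perturbation

# Explain one student's prediction as a human-readable IF rule.
explanation = explainer.explain(X[0], threshold=0.95)
print("IF", " AND ".join(explanation.anchor),
      "THEN predict", clf.predict(X[:1])[0])
print("precision:", explanation.precision)
```

An anchor is a small set of IF-style feature conditions that locks in the model's prediction with high precision, which maps naturally onto learner-facing explanations of "why the model flagged you".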
- On Developing Generic Models for Predicting Student Outcomes in Educational Data Mining (MDPI (Basel, Switzerland), 2022-01-07) Ramaswami G; Susnjak T; Mathrani A; Cowling M; Jha M
  Poor academic performance of students is a concern in the educational sector, especially if it leads to students being unable to meet minimum course requirements. With timely prediction of students’ performance, however, educators can detect at-risk students and enable early interventions to support them in overcoming their learning difficulties. Most studies to date have developed individual prediction models that each target a single course, tailored to the specific attributes of that course amongst a very diverse set of possibilities. While this approach can yield accurate models in some instances, it has limitations: overfitting can occur when course data is small or when new courses are devised, and maintaining a large suite of per-course models is a significant overhead. These issues can be tackled by developing a generic, course-agnostic predictive model that captures more abstract patterns and can operate across all courses, irrespective of their differences. This study demonstrates how such a generic predictive model can be developed to identify at-risk students across a wide variety of courses. Experiments were conducted using a range of algorithms, with the generic model producing effective accuracy. The findings showed that the CatBoost algorithm performed best on our dataset across the F-measure, ROC (receiver operating characteristic) curve, and AUC scores; it is therefore an excellent candidate algorithm in this domain given its ability to seamlessly handle categorical and missing data, which are frequent features of educational datasets.
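The abstract above singles out CatBoost for its native handling of categorical and missing values. As a hedged illustration only (not the authors' pipeline; the input file, feature names, and label below are hypothetical), a course-agnostic at-risk classifier evaluated with F-measure and AUC might look like this:

```python
# Minimal sketch of a generic, course-agnostic at-risk classifier with
# CatBoost. Hypothetical file, features, and label -- not the paper's code.
import pandas as pd
from catboost import CatBoostClassifier, Pool
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score, roc_auc_score

df = pd.read_csv("lms_activity.csv")  # hypothetical pooled multi-course export
features = ["course_code", "logins", "forum_posts", "avg_quiz_score"]
cat_features = ["course_code"]        # CatBoost handles categoricals natively

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["at_risk"], test_size=0.2, stratify=df["at_risk"]
)

model = CatBoostClassifier(iterations=500, learning_rate=0.05, verbose=False)
model.fit(Pool(X_train, y_train, cat_features=cat_features))

# Score on held-out students; the paper reports F-measure and AUC.
proba = model.predict_proba(Pool(X_test, cat_features=cat_features))[:, 1]
print("F1 :", f1_score(y_test, proba > 0.5))
print("AUC:", roc_auc_score(y_test, proba))
```

Pooling rows from many courses and keeping the course identifier as a single categorical feature is what makes the model generic: one model serves all courses instead of one model per course.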
- Perspectives on the challenges of generalizability, transparency and ethics in predictive learning analytics (Elsevier Ltd, 2021-11-20) Mathrani A; Susnjak T; Ramaswami G; Barczak A
  Educational institutions need a well-established, data-driven plan to get long-term value from their learning analytics (LA) strategy. By tracking learners’ digital traces and measuring learners’ performance, institutions can discern consequential learning trends through predictive models and enhance their instructional services. However, questions remain as to whether a proposed LA system is suitable, meaningful, and justifiable. In this concept paper, we examine the generalizability and transparency of the internals of predictive models, alongside the ethical challenges of using learners’ data to build predictive capabilities. Model generalizability, or transferability, is hindered by inadequate feature representation, small and imbalanced datasets, concept drift, and contextually unrelated domains. Additional challenges relate to the trustworthiness and social acceptance of these models, since algorithm-driven models are difficult to interpret by themselves. Further, ethical dilemmas arise in engaging with learners’ data while developing and deploying LA systems at an institutional level. We propose methodologies for addressing these challenges by establishing efforts to manage transferability and transparency, and by assessing the ethical standing of justifiable use of an LA strategy. This study showcases the underlying relationships between constructs pertaining to learners’ data and the predictive model. We suggest the use of appropriate evaluation techniques and the setting up of research ethics protocols, since without proper controls in place, model outcomes will not be portable, transferable, trustworthy, or admissible as responsible outcomes. This concept paper has theoretical and practical implications for future inquiry in the burgeoning field of learning analytics.
- Supporting Students’ Academic Performance Using Explainable Machine Learning with Automated Prescriptive Analytics (MDPI (Basel, Switzerland), 2022-12) Ramaswami G; Susnjak T; Mathrani A
  Learning Analytics (LA) refers to the use of students’ interaction data within educational environments to enhance teaching and learning. To date, the major focus in LA has been on descriptive and predictive analytics. Prescriptive analytics, however, is now seen as a future area of development and the next step towards increasing LA maturity, leading to proactive decision-making for improving students’ performance. It aims to provide data-driven suggestions to students who are at risk of non-completion or other sub-optimal outcomes. These suggestions are based on what-if modeling, which leverages machine learning to determine the minimal changes to a student’s behavioral and performance patterns that would be required to realize a more desirable outcome. The results of the what-if modeling lead to precise suggestions that can be converted into evidence-based advice to students. Existing studies in the educational domain have, until now, predicted students’ performance without taking the further steps of explaining the predictive decisions or generating prescriptive modeling. Our proposed method extends much of the work performed in this field to date. Firstly, we demonstrate the use of model explainability using anchors, which provide the reasoning behind a predictive model’s decisions and thereby make predictive models transparent. Secondly, we show how prescriptive analytics based on what-if counterfactuals can be used to automate student feedback.
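For the what-if counterfactual step described above, one open-source option is the DiCE library (`dice-ml`); the paper's own tooling may differ. The sketch below uses toy data and a placeholder model to show how minimal feature changes that flip an "at risk" prediction can be generated as raw material for automated advice:

```python
# Sketch of "what-if" counterfactual feedback with the open-source DiCE
# library (dice-ml). Data, features, and model are hypothetical stand-ins
# for the paper's automated prescriptive-analytics step.
import pandas as pd
import dice_ml
from sklearn.ensemble import RandomForestClassifier

# Toy engagement data: outcome 1 = pass, 0 = at risk of failing.
df = pd.DataFrame({
    "logins": [3, 25, 8, 30, 5, 22],
    "forum_posts": [0, 9, 2, 12, 1, 7],
    "avg_quiz_score": [35, 80, 50, 90, 40, 75],
    "pass": [0, 1, 0, 1, 0, 1],
})
clf = RandomForestClassifier().fit(df.drop(columns="pass"), df["pass"])

data = dice_ml.Data(
    dataframe=df,
    continuous_features=["logins", "forum_posts", "avg_quiz_score"],
    outcome_name="pass",
)
model = dice_ml.Model(model=clf, backend="sklearn")
explainer = dice_ml.Dice(data, model, method="random")

# For one at-risk student, find minimal feature changes that flip the
# prediction to "pass" -- the raw material for prescriptive advice.
query = df.drop(columns="pass").iloc[[0]]
cfs = explainer.generate_counterfactuals(query, total_CFs=2,
                                         desired_class="opposite")
cfs.visualize_as_dataframe(show_only_changes=True)
```

Each returned counterfactual row (e.g. "raise avg_quiz_score to 55, post in the forum twice") can be templated into the kind of evidence-based advice the abstract describes.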
- Use of Predictive Analytics within Learning Analytics Dashboards: A Review of Case Studies (Springer Nature BV, 2023-09-01) Ramaswami G; Susnjak T; Mathrani A; Umer R
  Learning analytics dashboards (LADs) provide educators and students with a comprehensive snapshot of the learning domain. Visualizations showcasing students’ learning behavioral patterns can help students gain greater self-awareness of their learning progression, and at the same time assist educators in identifying students who may be facing learning difficulties. While LADs have gained popularity, existing LADs still lag far behind when it comes to incorporating predictive analytics into their designs. Our systematic literature review reveals limitations in the utilization of predictive analytics tools among existing LADs. We find that studies leveraging predictive analytics go only as far as identifying at-risk students and do not employ model interpretation or explainability capabilities. This limits the ability of LADs to offer data-driven prescriptive advice that can guide students towards appropriate learning adjustments. Further, published studies have mostly described LADs that are still at the prototype stage; hence, robust evaluations of how LADs affect student outcomes have not yet been conducted. Evaluations to date have been limited to LAD functionalities and usability rather than effectiveness as a pedagogical treatment. We conclude by making recommendations for the design of advanced dashboards that take fuller advantage of machine learning technologies while using suitable visualizations to project only relevant information. Finally, we stress the importance of developing dashboards that are ultimately evaluated for their effectiveness.