Sunday, January 21, 2018

(Draft) Blueprint for a 1-week data analytics course - for Individual Faculty and Institutional Stakeholders

Purpose

To analyse, and be able to draw on, real-time and predictive data about student and trainee participation, engagement, and milestone, performance and competency gains.

Outline

Day 1 - meaningful data: milestone and performance metrics, logs, and portfolio data and artefacts
Day 2 - what data we already have
Day 3 - what we need to clean up
Day 4 - dashboards and feedback loops
Day 5 - implementation-review-refine: PDSA and rapid prototyping loops

Outcomes

Participants will be able to analyse, and draw on, real-time and predictive data about student and trainee participation, engagement, and milestone, performance and competency gains.


Faculty

Clinician-Educator
Medical Educator
Instructional Designer
Data Scientist
Educational Technologist
Institutional MedEd Administrator/Leader


Pre-session Survey and Needs Analysis

Who are the stakeholders?
What are meaningful metrics?

Familiarity with data analytics concepts and tools - R, Python, Excel

https://datainformedelearning.blogspot.sg/2018/01/bite-or-byte-size-tips-for-using.html

https://edition.cnn.com/2018/03/20/opinions/facebook-privacy-blockchain-opinion-parker/index.html


http://scale.nus.edu.sg/programmes/edp/public-courses/data-sciences.html

     http://scale.nus.edu.sg/documents/EDP/Data-Analytics-Begins-With-Me.pdf

     http://scale.nus.edu.sg/documents/EDP/Data-Science-AI.pdf

     http://scale.nus.edu.sg/documents/EDP/Data-Analytics-for-Senior-IT-Managers.pdf


https://www.edx.org/course/subject/data-analysis-statistics

https://www.coursera.org/browse/data-science/data-analysis?languages=en


https://www.sp.edu.sg/wps/portal/vp-spws/pace.short.course.details?WCM_GLOBAL_CONTEXT=/lib-pace/internet/courses/analytics+for+educators+i

https://www.edx.org/course/data-analytics-learning-utarlingtonx-link5-10x

https://www.tableau.com/solutions/education-analytics

https://www.tableau.com/solutions/education-higher-ed-analytics

https://www.analyticsvidhya.com/learning-paths-data-science-business-analytics-business-intelligence-big-data/tableau-learning-path/

https://www.udemy.com/tableau-data-analytics-must-see-introduction-to-analytics/


https://analytics.jiscinvolve.org/wp/2014/10/22/learning-analytics-using-business-intelligence-systems/

"In terms of evaluation of learners, assessment should be in-process, not at the conclusion of a course in the form of an exam or a test. Let’s say we develop semantically-defined learning materials and ways to automatically compare learner-produced artifacts (in discussions, texts, papers) to the knowledge structure of a field. Our knowledge profile could then reflect how we compare to the knowledge architecture of a domain — i.e. “you are 64% on your way to being a psychologist” or “you are 38% on your way to being a statistician.”
above quote from George Siemens in interview below
https://www.oreilly.com/ideas/education-data-analytics-learning

https://www.crcpress.com/Data-Analytics-Applications-in-Education/Vanthienen-Witte/p/book/9781498769273

https://www.wiley.com/en-us/Data+Mining+and+Learning+Analytics%3A+Applications+in+Educational+Research-p-9781118998236

Martin, Florence and Ndoye, Abdou, Using Learning Analytics to Assess Student Learning in Online Courses, Journal of University Teaching & Learning Practice, 13(3), 2016. Available at: http://ro.uow.edu.au/jutlp/vol13/iss3/7

https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v2_0.pdf


Google image search "machine learning vs deep learning"

https://www.zendesk.com/blog/machine-learning-and-deep-learning/

https://www.digitaltrends.com/cool-tech/deep-learning-vs-machine-learning-explained/


http://www.straitstimes.com/singapore/engineering-the-future-of-medicine

https://www.comscore.com/Insights/Blog/Mobile-Matures-as-the-Cross-Platform-Era-Emerges

Saturday, January 20, 2018

Bite or Byte-size Tips for Using Data (analytics) to Inform (e)Teaching and (e)Learning from workshop facilitators

Bite or Byte-size Ideas from the workshop facilitators

7 byte-size ideas from Poh-Sun

and
5 (more) short takeaways from Poh-Sun

1 - Using data (analytics) to inform eTeaching and eLearning
Arguably the aim (outcome) of any learning session is to be able to THINK about (have a conversation), DO, and have a FEEL (opinion) about something, at the end of the educational and training process. Or gain Knowledge, Skills, and (form or change) Attitudes (KSA).
For this topic, when starting out, the material and terminology will feel unfamiliar. With further reading, review of more material, repeatedly coming back to the topic, discussing and working through the ideas with other participants, colleagues and collaborators, and starting (and continuing) to USE these ideas, you will progressively increase your insight, understanding and skill in this area. Keep going. One step at a time. Iteratively. The more one thinks about and uses ideas and strategies, the more familiar with, and confident in, a topic or area of professional practice one becomes.

above first presented at workshop @ 15th APMEC 2015, workshop blog link below


"We can manage better (to iteratively improve, share and build on) what is visible, what we can see – directly, through data, and data dashboards-visual data maps and illustrations. Data and observations, big data and small or rich data, quantitative and qualitative research (mixed methods research) gives us insights as educators into our teaching practice, its effectiveness and impact. Just as we blend the best features of traditional and online/eLearning/Technology enhanced learning in our teaching practice, we can “blend” and take advantage of “big data” or online data analytics, which when added to traditional classroom observations and measures-indicators of learning effectiveness and impact, can give us a more complete, comprehensive, and rounded picture of individual, and group learning."

above from

The Society for Learning Analytics Research (SoLAR) defines learning analytics as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" (Long, Siemens, Conole, & Gašević, 2011).

'...Digital "footprints" (or trace data) about user interactions with technology have been recorded since the very introduction of the Internet and web-based software systems ... ...Over time, the value of such digital traces has been recognized as a promising source of data about student learning ...'
above quote from
Gašević, D., Dawson, S., & Siemens, G. (2015). Let's not forget: Learning analytics are about learning. TechTrends, 59 (1), 64-71. https://doi.org/10.1007/s11528-014-0822-x
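As a purely illustrative example of turning such trace data into something usable, the short Python sketch below (not from the article above) aggregates a hypothetical LMS activity log into per-student engagement indicators; the file name and column names are assumptions, not taken from any particular platform.

```python
# Illustrative only: aggregate a hypothetical LMS activity log into per-student
# engagement indicators. File name and columns (student_id, timestamp, event_type,
# resource) are assumptions, not from any particular platform.
import pandas as pd

logs = pd.read_csv("lms_activity_log.csv", parse_dates=["timestamp"])

engagement = (
    logs.groupby("student_id")
        .agg(
            total_events=("event_type", "size"),                       # overall activity volume
            active_days=("timestamp", lambda s: s.dt.date.nunique()),  # spread of activity over time
            resources_viewed=("resource", "nunique"),                  # breadth of content touched
            last_seen=("timestamp", "max"),                            # recency, useful for follow-up
        )
        .reset_index()
)

# The most and least active students, as a starting point for conversation, not a verdict.
print(engagement.sort_values("total_events", ascending=False).head())
print(engagement.sort_values("total_events").head())
```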

"As learning analytics data provides a snapshot of how engaged students are and how they are performing, this could be considered a useful indication of where excellent teaching is taking place"
above quote from
From Bricks to Clicks - The Potential of Data and Analytics in Higher Education, report by the Higher Education Commission, published on 26 January 2016.

"Key to adopting Learning Analytics is the growing use of Technology Enhanced Learning (TEL)
within Higher Education. Alongside the often cited benefits of facilitating access to course materials for students, and supporting personalised learning experiences, the use of TEL also allows the learner journey to be captured and understood. Drawing upon experiences gained through e-commerce and the related customer journey to enhance the retail experience, educational establishments can gain a much richer understanding of the patterns of engagement. This leads not only to an understanding of how students are progressing, but can also determine points of intervention when students may stop engaging and be considered a risk in relation to retention."
above quote from Preface to

"Learning is a long-lasting change in a person's knowledge attributable to experience."
and
"Effective instruction with technology seeks to help the learner engage in appropriate cognitive processing during learning without overloading her or his cognitive system."
above quotes from
page 195, Chapter 8, Learning with Technology by Richard Mayer
in The Nature of Learning: Using Research to Inspire Practice, OECD 2010

"In terms of evaluation of learners, assessment should be in-process, not at the conclusion of a course in the form of an exam or a test. Let’s say we develop semantically-defined learning materials and ways to automatically compare learner-produced artifacts (in discussions, texts, papers) to the knowledge structure of a field. Our knowledge profile could then reflect how we compare to the knowledge architecture of a domain — i.e. “you are 64% on your way to being a psychologist” or “you are 38% on your way to being a statistician.”
above quote from George Siemens in interview below
https://www.oreilly.com/ideas/education-data-analytics-learning
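One deliberately crude way to make the idea above concrete is to compare learner-produced text against a small reference corpus for a domain. The sketch below uses TF-IDF cosine similarity as a stand-in for the semantic matching Siemens describes; the corpus, the learner text and the resulting "percentage" are invented for illustration only.

```python
# A crude stand-in for the idea in the quote: compare learner-produced text against
# a small reference corpus for a domain using TF-IDF cosine similarity. The corpus,
# the learner text, and the resulting "percentage" are invented and illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

domain_corpus = [  # hypothetical reference texts standing in for a domain's "knowledge structure"
    "randomised controlled trials estimate treatment effects and control for confounding",
    "sensitivity, specificity and likelihood ratios describe diagnostic test performance",
]
learner_text = "the study compared outcomes between treatment and control groups"

vectoriser = TfidfVectorizer(stop_words="english")
domain_matrix = vectoriser.fit_transform(domain_corpus)
learner_vector = vectoriser.transform([learner_text])

# Mean similarity to the reference documents, expressed as a rough percentage.
overlap = cosine_similarity(learner_vector, domain_matrix).mean()
print(f"Learner text overlaps the reference corpus by roughly {overlap:.0%}")
```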


2 - Start with your goal or objective as a teacher or educator. What are your learning objective(s)? How will you know these are (being) achieved? What data, evidence, and observations inform you (that this has been accomplished)?


"We can manage better (to iteratively improve, share and build on) what is visible, what we can see – directly, through data, and data dashboards-visual data maps and illustrations. Data and observations, big data and small or rich data, quantitative and qualitative research (mixed methods research) gives us insights as educators into our teaching practice, its effectiveness and impact. Just as we blend the best features of traditional and online/eLearning/Technology enhanced learning in our teaching practice, we can “blend” and take advantage of “big data” or online data analytics, which when added to traditional classroom observations and measures-indicators of learning effectiveness and impact, can give us a more complete, comprehensive, and rounded picture of individual, and group learning."

above from

"Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time. The adoption of learning analytics too must be informed not only by what can be measured but also by what cannot. There will be limits in what learning analytics can do. In this vein, Siemens and Long have appropriately acknowledged that learning "is messy" and have warned that with learning analytics, "we must guard against drawing conclusions about learning processes based on questionable assumptions that misapply simple models to a complex challenge."5 The message here is important: not every aspect of learning can be captured by the powerful tool that analytics promises to be. Sometimes learning is ineffable! Therefore, multiple methods for assessing learning should be employed, including assessments that function as learning opportunities to support students' deep integration of knowledge, their personal development, and (hopefully!) their transformation over time."
and
"Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. As this principle states, assessment alone changes very little; likewise, learning analytics cannot act alone in radically disrupting and transforming education. Assessment (when done well) is about the authentic and deep understanding and improvement of teaching and learning. Analytics is about using the power of information technology to see patterns of success (or failure) in learning. Combining the two might actually produce the seeds of transformation—a powerful inquiry into what supports authentic, deep, transformative learning for students."
above quotes from
https://er.educause.edu/articles/2012/7/learning-analytics-the-new-black


3 - Increasingly education, learning and training will be technology mediated, technology facilitated, and technology enhanced. Take advantage of your delivery and engagement platform and process to gather this data and evidence. In real time. To inform, and further customise your educational offering. Similar to what you are familiar with in a traditional classroom and training setting.

"Learning analytics (LAcan be combined with data from Assessment (both formative and summative), faculty and peer/student observations and feedback, to inform both eTeaching (for faculty), and (e)Learning (for students). LA can provide insight and visualisation of the learning journey undertaken by students and trainees, from attendance, through engagement, (active) interaction with the learning content and training process, as well as collaborative learning with fellow students and trainees, toward step-wise and progressive achievement of learning milestones, learning objectives and outcomes."
- Poh Sun
(each and every highlighted word is hyperlinked to an online illustration/resource)
above first presented at symposium @ 15th APMEC 2015, symposium blog link below

What do "good" students and trainees do? 
Are the "best" students and trainees following different learning paths?
Can we predict, and assist students and trainees who are struggling with the course content or training process? (a minimal sketch follows these questions)
What are predictors, and indicators of poor, good, and outstanding outcomes?
How can we use (anonymised) cohort data, engagement and outcome metrics to assist both students and trainees (for feedback), and teachers to improve learning and training outcomes?
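One minimal, hedged sketch of the "can we predict who is struggling?" question above: fit a simple classifier on engagement features from a previous, anonymised cohort, then use it to flag current students for early, human follow-up. The data file and column names are hypothetical, and logistic regression is only one of many reasonable model choices.

```python
# Hedged sketch for "can we predict who is struggling?": fit a simple classifier on
# engagement features from a previous (anonymised) cohort. The file and column names
# (logins, videos_watched, forum_posts, formative_quiz_avg, passed) are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

cohort = pd.read_csv("previous_cohort.csv")
features = ["logins", "videos_watched", "forum_posts", "formative_quiz_avg"]

X_train, X_test, y_train, y_test = train_test_split(
    cohort[features], cohort["passed"], test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# How well do simple engagement features separate passing from struggling students?
# Any flag raised by such a model should prompt early, human follow-up, not a verdict.
print(classification_report(y_test, model.predict(X_test)))
```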


4 - Customisation and personalisation (personal learning) may simply require offering a "buffet" of (digital) content, and training scenarios, to suit individual preference and choice, for a specific purpose, at a specific time. Or use a "well stocked" digital repository for instructor facilitated, and specifically curated teaching and learning.

Novices, and early stage learners and trainees require guidance, and a curated limited offering of educational and training material.

More advanced learners and trainees often require a wider range of educational and training materials and options, customised to specific practice needs. Often in practice settings. To look up. Refresh memory. Check guidelines. Do some background reading. Go (a little) deeper with a limited, or more extensive literature review. To then engage with peers to have an informed discussion on diagnostic and therapeutic options.
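A purely illustrative sketch of the "buffet" versus curated-offering idea above: the same digital repository can be served in full to more advanced learners, or filtered down to a guided, limited set for novices. The repository entries, topics and stage labels below are invented.

```python
# Purely illustrative: the same repository served as a full "buffet" for advanced
# learners, or filtered down to a guided, limited set for novices. Entries, topics
# and stage labels are invented.
repository = [
    {"title": "Chest X-ray basics",        "topic": "radiology",  "stage": "novice"},
    {"title": "Advanced CT protocols",     "topic": "radiology",  "stage": "advanced"},
    {"title": "ECG interpretation primer", "topic": "cardiology", "stage": "novice"},
]

def curate(items, topic, stage=None):
    """Return items for a topic; restrict to a stage for a curated, limited offering."""
    return [item for item in items
            if item["topic"] == topic and (stage is None or item["stage"] == stage)]

print(curate(repository, "radiology", stage="novice"))  # guided set for early-stage learners
print(curate(repository, "radiology"))                  # the full buffet for advanced learners
```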


5 - Technology enhanced Learning (TeL) makes what we teach with, and assess on visible, accessible, and assessable. We can "show what we teach with, and assess on". For students, trainees, peers, colleagues, administrators. In fact all potential stakeholders. To make this (data) available as evidence of scholarly activity, and educational scholarship (the Scholarship of Teaching and Learning, or SOTL).

Showing, and demonstrating (evidence of) value (add), impact, costs (effectiveness) of educational and training efforts and initiatives. This will be required for educational scholarship, course and program evaluation, benchmarking, external accreditation and review, quality improvement, quality assurance, as well as accountability to stakeholder exercises and initiatives.

           

3 byte-size ideas from Sergio

1 - Big data is only as good as the analysis we perform on it.
2 - Big data and small data are like partners in a dance: both need to be considered to generate insights.
3 - Not all projects require machine learning.


4 byte-size ideas from Wee Khee

1. Begin with the End in Mind – set goals before considering analytics and measurement.
2. Design with learners at heart – use human-centred design principles in learning journey design.
3. Test, measure and experiment – use analytics to guide experimentation (a minimal example follows this list).
4. Take action with insights – analytics are only useful if you can make decisions and changes.
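As a minimal illustration of point 3 above (test, measure and experiment), the sketch below compares made-up quiz scores from a current and a revised version of a learning activity using a Welch t-test; both the scores and the choice of test are illustrative assumptions, not a prescription.

```python
# Minimal illustration of "test, measure and experiment": compare made-up quiz scores
# from students who received the current versus a revised learning activity.
# The numbers are invented; a Welch t-test is just one reasonable choice of comparison.
from scipy import stats

current_activity = [62, 70, 58, 75, 66, 71, 69, 64]
revised_activity = [68, 74, 66, 80, 72, 77, 70, 69]

mean_gain = sum(revised_activity) / len(revised_activity) - sum(current_activity) / len(current_activity)
t_stat, p_value = stats.ttest_ind(revised_activity, current_activity, equal_var=False)

print(f"Mean difference: {mean_gain:.1f} marks")
print(f"Welch t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```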






References

Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, Harris P, Glasgow NJ, Campbell C, Dath D, Harden RM, Iobst W, Long DM, Mungroo R, Richardson DL, Sherbino J, Silver I, Taber S, Talbot M, Harris KA. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638-45. doi:10.3109/0142159X.2010.501190.

Goh, P.S. Learning Analytics in Medical Education. MedEdPublish. 2017 Apr; 6(2), Paper No:5. Epub 2017 Apr 4. https://doi.org/10.15694/mep.2017.000067

Goh, P.S., Sandars, J. Insights from the culinary arts for medical educators. MedEdPublish. 2017 Jan; 6(1), Paper No:10. Epub 2017 Jan 18.

Goh, P.S. A proposal for a grading and ranking method as the first step toward developing a scoring system to measure the value and impact of viewership of online material in medical education - going beyond “clicks” and views toward learning. MedEdPublish. 2016 Oct; 5(3), Paper No:62. Epub 2016 Dec 9.

Goh, P.S. The value and impact of eLearning or Technology enhanced learning from one perspective of a Digital Scholar. MedEdPublish. 2016 Oct; 5(3), Paper No:31. Epub 2016 Oct 18.

Goh, P.S. A series of reflections on eLearning, traditional and blended learning. MedEdPublish. 2016 Oct; 5(3), Paper No:19. Epub 2016 Oct 14.

Goh, P.S. Using a blog as an integrated eLearning tool and platform. Med Teach. 2016 Jun;38(6):628-9. Epub 2015 Nov 11.

Goh, P.S. eLearning or Technology enhanced learning in medical education - Hope, not Hype. Med Teach. 2016 Sep; 38(9): 957-958, Epub 2016 Mar 16

Gašević, D., Dawson, S., & Siemens, G. (2015). Let's not forget: Learning analytics are about learning. TechTrends, 59 (1), 64-71. https://doi.org/10.1007/s11528-014-0822-x

Gašević, D., Dawson, S. and Pardo, A. (2016), “How do we start? State and directions of learning analytics adoption”, International Council for Open and Distance Education, available at: https://icde.memberclicks.net/assets/RESOURCES/dragan_la_report%20cc%20licence.pdf (accessed 25 January 2018).

Lockyer, L & Heathcote, Elizabeth & Dawson, Shane. (2013). Informing Pedagogical Action Aligning Learning Analytics With Learning Design. American Behavioral Scientist. 57. 1439-1459. 10.1177/0002764213479367.

Long, P. D., Siemens, G., Conole, G., & Gašević, D. (Eds.). (2011). Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK'11). New York, NY, USA: ACM.

Mackintosh-Franklin C. An evaluation into the impact of undergraduate nursing students' classroom attendance and engagement with online tasks on overall academic achievement. Nurse Educ Today. 2017 Nov 20;61:89-93. doi:10.1016/j.nedt.2017.11.017. [Epub ahead of print] Review. PubMed PMID: 29195212.

Martin, Florence and Ndoye, Abdou, Using Learning Analytics to Assess Student Learning in Online Courses, Journal of University Teaching & Learning Practice, 13(3), 2016.

Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. A reference model for learning analytics. International Journal of Technology Enhanced Learning (IJTEL), Vol. 4, No. 5/6, 2012. https://www.inderscienceonline.com/doi/abs/10.1504/IJTEL.2012.051815

Papamitsiou, Z. and Economides, A.A. (2014), “Learning analytics and educational data mining in practice: a systematic literature review of empirical evidence”, Educational Technology & Society, Vol. 17 No. 4, pp. 49-64.

Wong, BTM, (2017) "Learning analytics in higher education: an analysis of case studies", Asian Association of Open Universities Journal, Vol. 12 Issue: 1, pp.21-40, https://doi.org/10.1108/AAOUJ-01-2017-0009 
http://www.emeraldinsight.com/doi/full/10.1108/AAOUJ-01-2017-0009
















"In three studies, we found that students who took notes on laptops performed worse on conceptual questions than students who took notes longhand. We show that whereas taking more notes can be beneficial, laptop note takers’ tendency to transcribe lectures verbatim rather than processing information and reframing it in their own words is detrimental to learning"
above quote from abstract of article below