Saturday, April 21, 2018

Learning Analytics Short Course




Message from the workshop organisers:

Dear participants,

Thank you for participating in this Learning Analytics (LA) Short Course.

To take full advantage of this opportunity, we recommend the following process.

1. The session objectives, pre-reading, and pre-workshop reflection exercise are outlined in the course blog, which was emailed to you one week before the session.

https://datainformedelearning.blogspot.com/2018/04/learning-analytics-short-course.html

Please allocate one hour to the pre-session reflection exercise, using the resources of the course blog as a content buffet, to start your thinking process.

Please make a record of your pre-session reflections, which we request each participant email back to the organisers by 8am on Monday, 11 June 2018, one day before the session.

2. During the interactive group activity on Tuesday, 11 June, we will work on your reflection exercise individually and as a group, before sharing the results amongst the participants.

3. The organisers envisage that this process will form the foundation for the next stage in the planning and implementation of LA use at NUHS.

__________________

Session Objectives

1. The definition of learning analytics

2. Types of learning analytics and their purpose in education

3. Tools that support learning analytics

4. How data-driven decision making differs from traditional decision making, and the potential future implications of this transition

5. How to draw on real-time and predictive data about students and trainees


PRE-SESSION ACTIVITY

1. Read the articles listed below, after the session takeaways.


2. Prepare the following, BEFORE the workshop.

Reflect on your roles in medical education and clinical training. Make notes on the following, which we will add to during the "live" session.

What data do you currently have about students and trainees?
What data do you feel we should collect?

What analysis are we currently doing with this data?
What could we be doing?

How can data, and analysis of this data, improve our education and training efforts?
How could we do this?


Session takeaways:

7 byte-size ideas from Poh-Sun

and
5 (more) short takeaways from Poh-Sun

1 - Arguably the aim (outcome) of any learning session is to be able to THINK about (have a conversation), DO, and have a FEEL (opinion) about something, at the end of the educational and training process. Or gain Knowledge, Skills, and (form or change) Attitudes (KSA).
For this topic, the material and terminology will feel unfamiliar at the beginning. With further reading, review of more material, repeatedly coming back to the topic, discussing and working through the ideas with other participants, colleagues and collaborators, and starting and continuing to USE these ideas, you will progressively increase your insight, understanding and skill in this area. Keep going. One step at a time. Iteratively. The more one thinks about, and uses, ideas and strategies, the more one becomes familiar with, and confident in, a topic or area of professional practice.
2 - Start with your goal or objective as a teacher or educator. What are your learning objectives? How will you know these are (being) achieved? What data, evidence, and observations inform you (that this has been accomplished)?
3 - Increasingly, education, learning and training will be technology mediated, technology facilitated, and technology enhanced. Take advantage of your delivery and engagement platform and process to gather this data and evidence. In real time. To inform, and further customise, your educational offering. Similar to what you are familiar with in a traditional classroom and training setting.
4 - Customisation and personalisation (personal learning) may simply require offering a "buffet" of (digital) content and training scenarios, to suit individual preference and choice, for a specific purpose, at a specific time. Or using a "well stocked" digital repository for instructor-facilitated, specifically curated teaching and learning.
5 - Technology enhanced Learning (TeL) makes what we teach with, and assess on, visible, accessible, and assessable. We can "show what we teach with, and assess on" for students, trainees, peers, colleagues, administrators; in fact, for all potential stakeholders. This makes the data available as evidence of scholarly activity and educational scholarship (the Scholarship of Teaching and Learning, or SoTL).

Chan, T., Sebok-Syer, S., Thoma, B., Wise, A., Sherbino, J. and Pusic, M. Learning Analytics in Medical Education Assessment: The Past, The Present and The Future. AEM Education and Training, April 2018. https://onlinelibrary.wiley.com/doi/abs/10.1002/aet2.10087

Matt M. Cirigliano, Charlie Guthrie, Martin V. Pusic, Anna T. Cianciolo, Jennifer E. Lim-Dunham, Anderson Spickard III & Valerie Terry (2017) “Yes, and …” Exploring the Future of Learning Analytics in Medical Education, Teaching and Learning in Medicine, 29:4, 368-372, DOI: 10.1080/10401334.2017.1384731

Goh, P.S. Learning Analytics in Medical Education. MedEdPublish. 2017; 6(2), Paper No:5. https://doi.org/10.15694/mep.2017.000067


Norman, G. (2012). Medical education: past, present and future. Perspectives on Medical Education, 1(1), 6–14. http://doi.org/10.1007/s40037-012-0002-7


Wong, B.T.M. (2017) "Learning analytics in higher education: an analysis of case studies", Asian Association of Open Universities Journal, Vol. 12 Issue: 1, pp. 21-40. https://doi.org/10.1108/AAOUJ-01-2017-0009

______________

above from
Goh, P.S. Learning Analytics in Medical Education. MedEdPublish. 2017 Apr; 6(2), Paper No:5. Epub 2017 Apr 4. https://doi.org/10.15694/mep.2017.000067




Abstract of paper accepted for presentation at the upcoming 50th JSME meeting, 3 August 2018.





Norman, G. (2012). Medical education: past, present and future. Perspectives on Medical Education, 1(1), 6–14. http://doi.org/10.1007/s40037-012-0002-7

"Understanding basic theory using a few illustrative examples. Mastering a topic by exposure to and experience with many examples
Typical examples or real-life scenarios can be used to illustrate theory, and help students understand fundamental principles. Mastering a topic usually requires exposure to and experience with many examples, both typical and atypical, common to uncommon including subtle manifestations of a phenomenon. The traditional method of doing this is via a long apprenticeship, or many years of practice with feedback and experience. A digital collection of educational scenarios and cases can support and potentially shorten this educational and training process. Particularly if a systematic attempt is made to collect and curate a comprehensive collection of all possible educational scenarios and case-based examples, across the whole spectrum of professional practice. Online access to key elements, parts of and whole sections of these learning cases; used by students with guidance by instructors under a deliberate practice and mastery training framework, can potentially accelerate the educational process, and deepen learning."
above from
Goh, P.S. A series of reflections on eLearning, traditional and blended learning. MedEdPublish. 2016 Oct; 5(3), Paper No:19. Epub 2016 Oct 14.  http://dx.doi.org/10.15694/mep.2016.000105

Goh, P.S. Learning Analytics in Medical Education. MedEdPublish. 2017; 6(2), Paper No:5. https://doi.org/10.15694/mep.2017.000067

Matt M. Cirigliano, Charlie Guthrie, Martin V. Pusic, Anna T. Cianciolo, Jennifer E. Lim-Dunham, Anderson Spickard III & Valerie Terry (2017) “Yes, and …” Exploring the Future of Learning Analytics in Medical Education, Teaching and Learning in Medicine, 29:4, 368-372, DOI: 10.1080/10401334.2017.1384731

Chan, T., Sebok-Syer, S., Thoma, B., Wise, A., Sherbino, J. and Pusic, M. Learning Analytics in Medical Education Assessment: The Past, The Present and The Future. AEM Education and Training, April 2018. https://onlinelibrary.wiley.com/doi/abs/10.1002/aet2.10087


see also below
https://www.cnbc.com/2018/05/09/zuckerberg-invests-in-blockchain-to-keep-facebook-relevant.html

https://www.wired.com/story/whats-the-deal-with-facebook-and-the-blockchain/

http://fortune.com/2018/05/09/facebook-blockchain-team/

https://www.theverge.com/2018/5/8/17332894/facebook-blockchain-group-employee-reshuffle-restructure-david-marcus-kevin-weil

https://www.cnet.com/news/facebook-is-reportedly-starting-a-blockchain-team/

https://e27.co/blockchain-future-data-privacy-20180525/

Blockchain and individuals’ control over personal data in European data protection law (by Roberta Filippone, Thesis for Master of Law, 2017)

"The holy grail for privacy engineers is to create a semi-permeable membrane between generators and users of data that allows general inferences to be drawn from our digital footprint without personal identification leaking out."
quoted from article below

Sunday, January 21, 2018

(Draft) Blueprint for a 1-week data analytics course - for Individual Faculty and Institutional Stakeholders

Purpose

To analyse, and be able to draw on, real-time and predictive data about student and trainee participation, engagement, and milestone, performance and competency gains.

Outline

Day 1 - meaningful data, milestone and performance metrics, logs and portfolio data and artefacts
Day 2 - what we have
Day 3 - what we need to clean up
Day 4 - dashboards, feedback loops
Day 5 - implementation-review-refine PDSA and rapid prototyping loops
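
As a deliberately minimal illustration of what Days 1 and 4 might build toward, the sketch below aggregates a hypothetical LMS activity export into simple per-learner engagement metrics that could feed a dashboard or feedback loop. The file name and column names (learner_id, event, timestamp) and the 14-day inactivity threshold are assumptions for illustration only, not a reference to any specific institutional system.

```python
# Minimal sketch (assumed CSV export with columns: learner_id, event, timestamp).
# Aggregates raw activity logs into simple per-learner engagement metrics
# that could feed a dashboard or feedback loop.
import pandas as pd

log = pd.read_csv("lms_activity_log.csv", parse_dates=["timestamp"])  # assumed export

metrics = (
    log.groupby("learner_id")
       .agg(total_events=("event", "count"),
            active_days=("timestamp", lambda s: s.dt.date.nunique()),
            last_seen=("timestamp", "max"))
       .reset_index()
)

# Simple feedback signal: flag learners with no recorded activity in the last 14 days.
cutoff = log["timestamp"].max() - pd.Timedelta(days=14)
metrics["inactive_flag"] = metrics["last_seen"] < cutoff

print(metrics.sort_values("total_events", ascending=False).head())
```

The same kind of table, refreshed on a schedule, is what a dashboard tool (Tableau, Excel, R) would visualise for the implementation-review-refine loops of Day 5.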

Outcomes

To analyse, and be able to draw on, real-time and predictive data about student and trainee participation, engagement, and milestone, performance and competency gains.


Faculty

Clinician-Educator
Medical Educator
Instructional Designer
Data Scientist
Educational Technologist
Institutional MedEd Administrator/Leader


Pre-session Survey and Needs Analysis

Who are the stakeholders?
What are meaningful metrics?

Familiarity with data analytics concepts and tools - R, Python, Excel

https://datainformedelearning.blogspot.sg/2018/01/bite-or-byte-size-tips-for-using.html

https://edition.cnn.com/2018/03/20/opinions/facebook-privacy-blockchain-opinion-parker/index.html


http://scale.nus.edu.sg/programmes/edp/public-courses/data-sciences.html

     http://scale.nus.edu.sg/documents/EDP/Data-Analytics-Begins-With-Me.pdf

     http://scale.nus.edu.sg/documents/EDP/Data-Science-AI.pdf

     http://scale.nus.edu.sg/documents/EDP/Data-Analytics-for-Senior-IT-Managers.pdf


https://www.edx.org/course/subject/data-analysis-statistics

https://www.coursera.org/browse/data-science/data-analysis?languages=en


https://www.sp.edu.sg/wps/portal/vp-spws/pace.short.course.details?WCM_GLOBAL_CONTEXT=/lib-pace/internet/courses/analytics+for+educators+i

https://www.edx.org/course/data-analytics-learning-utarlingtonx-link5-10x

https://www.tableau.com/solutions/education-analytics

https://www.tableau.com/solutions/education-higher-ed-analytics

https://www.analyticsvidhya.com/learning-paths-data-science-business-analytics-business-intelligence-big-data/tableau-learning-path/

https://www.udemy.com/tableau-data-analytics-must-see-introduction-to-analytics/


https://analytics.jiscinvolve.org/wp/2014/10/22/learning-analytics-using-business-intelligence-systems/

"In terms of evaluation of learners, assessment should be in-process, not at the conclusion of a course in the form of an exam or a test. Let’s say we develop semantically-defined learning materials and ways to automatically compare learner-produced artifacts (in discussions, texts, papers) to the knowledge structure of a field. Our knowledge profile could then reflect how we compare to the knowledge architecture of a domain — i.e. “you are 64% on your way to being a psychologist” or “you are 38% on your way to being a statistician.”
above quote from George Siemens in interview below
https://www.oreilly.com/ideas/education-data-analytics-learning

https://www.crcpress.com/Data-Analytics-Applications-in-Education/Vanthienen-Witte/p/book/9781498769273

https://www.wiley.com/en-us/Data+Mining+and+Learning+Analytics%3A+Applications+in+Educational+Research-p-9781118998236

Martin, Florence and Ndoye, Abdou, Using Learning Analytics to Assess Student Learning in Online Courses, Journal of University Teaching & Learning Practice, 13(3), 2016. Available at:http://ro.uow.edu.au/jutlp/vol13/iss3/7

https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v2_0.pdf


Google image search "machine learning vs deep learning"

https://www.zendesk.com/blog/machine-learning-and-deep-learning/

https://www.digitaltrends.com/cool-tech/deep-learning-vs-machine-learning-explained/


http://www.straitstimes.com/singapore/engineering-the-future-of-medicine

https://www.comscore.com/Insights/Blog/Mobile-Matures-as-the-Cross-Platform-Era-Emerges

Saturday, January 20, 2018

Bite or Byte-size Tips for Using Data (analytics) to Inform (e)Teaching and (e)Learning from the workshop facilitators

Bite or Byte-size Ideas from the workshop facilitators

7 byte-size ideas from Poh-Sun

and
5 (more) short takeaways from Poh-Sun

1 - Using data (analytics) to inform eTeaching and eLearning
Arguably the aim (outcome) of any learning session is to be able to THINK about (have a conversation), DO, and have a FEEL (opinion) about something, at the end of the educational and training process. Or gain Knowledge, Skills, and (form or change) Attitudes (KSA).
For this topic, the material and terminology will feel unfamiliar at the beginning. With further reading, review of more material, repeatedly coming back to the topic, discussing and working through the ideas with other participants, colleagues and collaborators, and starting and continuing to USE these ideas, you will progressively increase your insight, understanding and skill in this area. Keep going. One step at a time. Iteratively. The more one thinks about, and uses, ideas and strategies, the more one becomes familiar with, and confident in, a topic or area of professional practice.

above first presented at workshop @ 15th APMEC 2015, workshop blog link below


"We can manage better (to iteratively improve, share and build on) what is visible, what we can see – directly, through data, and data dashboards-visual data maps and illustrations. Data and observations, big data and small or rich data, quantitative and qualitative research (mixed methods research) gives us insights as educators into our teaching practice, its effectiveness and impact. Just as we blend the best features of traditional and online/eLearning/Technology enhanced learning in our teaching practice, we can “blend” and take advantage of “big data” or online data analytics, which when added to traditional classroom observations and measures-indicators of learning effectiveness and impact, can give us a more complete, comprehensive, and rounded picture of individual, and group learning."


The Society for Learning Analytics Research (SoLAR) defines learning analytics as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" (Long, Siemens, Conole, & Gašević, 2011).

'...Digital "footprints" (or trace data) about user interactions with technology have been recorded since the very introduction of the Internet and web-based software systems ... ...Over time, the value of such digital traces has been recognized as a promising source of data about student learning ...'
above quote from
Gašević, D., Dawson, S., & Siemens, G. (2015). Let's not forget: Learning analytics are about learning. TechTrends, 59 (1), 64-71. https://doi.org/10.1007/s11528-014-0822-x

"As learning analytics data provides a snapshot of how engaged students are and how they are performing, this could be considered a useful indication of where excellent teaching is taking place"
above quote from
From Bricks to Clicks - The Potential of Data and Analytics in Higher Education, report by the Higher Education Commission, 26 January 2016.

"Key to adopting Learning Analytics is the growing use of Technology Enhanced Learning (TEL)
within Higher Education. Alongside the often cited benefits of facilitating access to course materials for students, and supporting personalised learning experiences, the use of TEL also allows the learner journey to be captured and understood. Drawing upon experiences gained through e-commerce and the related customer journey to enhance the retail experience, educational establishments can gain a much richer understanding of the patterns of engagement. This leads not only to an understanding of how students are progressing, but can also determine points of intervention when students may stop engaging and be considered a risk in relation to retention."
above quote from Preface to

"Learning is a long-lasting change in a person's knowledge attributable to experience."
and
"Effective instruction with technology seeks to help the learner engage in appropriate cognitive processing during learning without overloading her or his cognitive system."
above quotes from
page 195, Chapter 8, Learning with Technology by Richard Mayer
in The Nature of Learning: Using Research to Inspire Practice, OECD 2010

"In terms of evaluation of learners, assessment should be in-process, not at the conclusion of a course in the form of an exam or a test. Let’s say we develop semantically-defined learning materials and ways to automatically compare learner-produced artifacts (in discussions, texts, papers) to the knowledge structure of a field. Our knowledge profile could then reflect how we compare to the knowledge architecture of a domain — i.e. “you are 64% on your way to being a psychologist” or “you are 38% on your way to being a statistician.”
above quote from George Siemens in interview below
https://www.oreilly.com/ideas/education-data-analytics-learning
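
A toy sketch of the "knowledge profile" idea in the quote above: it checks what fraction of a hand-made, hypothetical domain concept list appears in a learner-produced artifact. The concept list and essay text are invented for illustration; a real system would use semantic matching or concept extraction rather than plain keyword overlap, but the shape of the calculation is the same.

```python
# Toy illustration of the "knowledge profile" idea quoted above: what fraction
# of a (hypothetical) domain concept list appears in a learner-produced artifact?
# Real systems would use semantic matching, not plain keyword overlap.

domain_concepts = {"sensitivity", "specificity", "pretest probability",
                   "likelihood ratio", "bayes theorem", "roc curve"}

learner_essay = """
To interpret this scan I considered the pretest probability of disease,
then weighed the sensitivity and specificity of the test before reporting.
""".lower()

covered = {c for c in domain_concepts if c in learner_essay}
coverage = 100 * len(covered) / len(domain_concepts)

print(f"Concepts covered: {sorted(covered)}")
print(f"You are {coverage:.0f}% of the way through this concept map.")
```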


2 - Start with your goal or objective as a teacher or educator. What are your learning objectives? How will you know these are (being) achieved? What data, evidence, and observations inform you (that this has been accomplished)?


"We can manage better (to iteratively improve, share and build on) what is visible, what we can see – directly, through data, and data dashboards-visual data maps and illustrations. Data and observations, big data and small or rich data, quantitative and qualitative research (mixed methods research) gives us insights as educators into our teaching practice, its effectiveness and impact. Just as we blend the best features of traditional and online/eLearning/Technology enhanced learning in our teaching practice, we can “blend” and take advantage of “big data” or online data analytics, which when added to traditional classroom observations and measures-indicators of learning effectiveness and impact, can give us a more complete, comprehensive, and rounded picture of individual, and group learning."


"Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time. The adoption of learning analytics too must be informed not only by what can be measured but also by what cannot. There will be limits in what learning analytics can do. In this vein, Siemens and Long have appropriately acknowledged that learning "is messy" and have warned that with learning analytics, "we must guard against drawing conclusions about learning processes based on questionable assumptions that misapply simple models to a complex challenge."5 The message here is important: not every aspect of learning can be captured by the powerful tool that analytics promises to be. Sometimes learning is ineffable! Therefore, multiple methods for assessing learning should be employed, including assessments that function as learning opportunities to support students' deep integration of knowledge, their personal development, and (hopefully!) their transformation over time."
and
"Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. As this principle states, assessment alone changes very little; likewise, learning analytics cannot act alone in radically disrupting and transforming education. Assessment (when done well) is about the authentic and deep understanding and improvement of teaching and learning. Analytics is about using the power of information technology to see patterns of success (or failure) in learning. Combining the two might actually produce the seeds of transformation—a powerful inquiry into what supports authentic, deep, transformative learning for students."
above quotes from
https://er.educause.edu/articles/2012/7/learning-analytics-the-new-black


3 - Increasingly, education, learning and training will be technology mediated, technology facilitated, and technology enhanced. Take advantage of your delivery and engagement platform and process to gather this data and evidence. In real time. To inform, and further customise, your educational offering. Similar to what you are familiar with in a traditional classroom and training setting.
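
As a minimal sketch of gathering such data in real time from a platform you control, the snippet below appends timestamped engagement events to a log file. The event names, identifiers and file name are placeholders; most learning management systems already record equivalent trace data that can simply be exported.

```python
# Minimal sketch of capturing engagement events in (near) real time from a
# delivery platform you control. Event names and the log file are placeholders;
# most LMS platforms already record equivalent trace data for export.
import csv
from datetime import datetime, timezone

LOG_FILE = "engagement_events.csv"

def record_event(learner_id: str, resource_id: str, event_type: str) -> None:
    """Append one timestamped engagement event to the log."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            learner_id,
            resource_id,
            event_type,   # e.g. "viewed", "completed", "quiz_attempted"
        ])

# Example: called by the page or quiz handler when a learner opens a resource.
record_event("student_042", "radiology_case_17", "viewed")
```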

"Learning analytics (LAcan be combined with data from Assessment (both formative and summative), faculty and peer/student observations and feedback, to inform both eTeaching (for faculty), and (e)Learning (for students). LA can provide insight and visualisation of the learning journey undertaken by students and trainees, from attendance, through engagement, (active) interaction with the learning content and training process, as well as collaborative learning with fellow students and trainees, toward step-wise and progressive achievement of learning milestones, learning objectives and outcomes."
- Poh Sun
(each and every highlighted word is hyperlinked to an online illustration/resource)
above first presented at symposium @ 15th APMEC 2015, symposium blog link below

What do "good" students and trainees do? 
Are the "best" students and trainees following different learning paths?
Can we predict, and assist students and trainees who are struggling with the course content or training process?
What are predictors, and indicators of poor, good, and outstanding outcomes?
How can we use (anonymised) cohort data, engagement and outcome metrics to assist both students and trainees (for feedback), and teachers to improve learning and training outcomes?
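
The prediction question above can be illustrated with a deliberately simple sketch: a logistic regression trained on made-up engagement features (logins per week, resources viewed, quiz average) and a made-up "struggling" label. The features, numbers and relationship are assumptions for illustration only; real use would need institutional data, anonymisation, and careful validation before any intervention.

```python
# Minimal sketch of the prediction question above, using made-up engagement
# features and a made-up "struggling" label. Illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.poisson(5, n),          # logins per week
    rng.poisson(20, n),         # resources viewed
    rng.normal(70, 10, n),      # quiz average (%)
])
# Invented relationship: lower engagement and scores -> more likely to struggle.
risk = 1 / (1 + np.exp(0.4 * X[:, 0] + 0.05 * X[:, 1] + 0.08 * X[:, 2] - 7))
y = rng.binomial(1, risk)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print("Held-out accuracy:", round(model.score(X_test, y_test), 2))
print("Estimated risk for a low-engagement learner:",
      round(model.predict_proba([[1, 3, 55]])[0, 1], 2))
```

The point is not the particular model: any such score is only a prompt for a human conversation with the student or trainee, not a verdict.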


4 - Customisation and personalisation (personal learning) may simply require offering a "buffet" of (digital) content and training scenarios, to suit individual preference and choice, for a specific purpose, at a specific time. Or using a "well stocked" digital repository for instructor-facilitated, specifically curated teaching and learning.

Novices, and early stage learners and trainees require guidance, and a curated limited offering of educational and training material.

More advanced learners and trainees often require a wider range of educational and training materials and options, customised to specific practice needs. Often in practice settings. To look up. Refresh memory. Check guidelines. Do some background reading. Go (a little) deeper with a limited, or more extensive literature review. To then engage with peers to have an informed discussion on diagnostic and therapeutic options.


5 - Technology enhanced Learning (TeL) makes what we teach with, and assess on, visible, accessible, and assessable. We can "show what we teach with, and assess on" for students, trainees, peers, colleagues, administrators; in fact, for all potential stakeholders. This makes the data available as evidence of scholarly activity and educational scholarship (the Scholarship of Teaching and Learning, or SoTL).

Showing, and demonstrating (evidence of) value (add), impact, costs (effectiveness) of educational and training efforts and initiatives. This will be required for educational scholarship, course and program evaluation, benchmarking, external accreditation and review, quality improvement, quality assurance, as well as accountability to stakeholder exercises and initiatives.

           

3 byte-size ideas from Sergio

1 - Big data is only as good as the analysis we perform on it.
2 - Big data and small data are like partners in a dance. They both need to be considered to generate insights.
3 - Not all projects require machine learning.


4 byte-size ideas from Wee Khee

1. Begin with the End in Mind - Set goals before considering analytics and measurement.
2. Design with learners at heart - Use human-centred design principles in learning journey design.
3. Test, Measure and Experiment - Use analytics to guide experimentation (see the sketch after this list).
4. Take actions with insights - Analytics are only useful if you can make decisions and changes.
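
To make point 3 concrete, here is a minimal sketch comparing post-session quiz scores from two hypothetical tutorial formats with a two-sample t-test. The scores are invented purely for illustration; a real experiment would also need an appropriate design, sample size, and ethics considerations.

```python
# Minimal sketch of point 3 above: compare post-session quiz scores from two
# (hypothetical) tutorial formats before deciding which to roll out.
# Scores below are invented for illustration only.
from scipy import stats

scores_format_a = [62, 70, 65, 74, 68, 71, 66, 73, 69, 64]  # e.g. lecture + handout
scores_format_b = [71, 78, 74, 80, 69, 76, 82, 75, 73, 79]  # e.g. case-based eLearning

t_stat, p_value = stats.ttest_ind(scores_format_a, scores_format_b)
print(f"Mean A = {sum(scores_format_a)/len(scores_format_a):.1f}, "
      f"Mean B = {sum(scores_format_b)/len(scores_format_b):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```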






References

Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, Harris P, Glasgow NJ, Campbell C, Dath D, Harden RM, Iobst W, Long DM, Mungroo R, Richardson DL, Sherbino J, Silver I, Taber S, Talbot M, Harris KA.
Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638-45. doi:10.3109/0142159X.2010.501190.

Goh, P.S. Learning Analytics in Medical Education. MedEdPublish. 2017 Apr; 6(2), Paper No:5. Epub 2017 Apr 4. https://doi.org/10.15694/mep.2017.000067

Goh, P.S., Sandars, J. Insights from the culinary arts for medical educators. MedEdPublish. 2017 Jan; 6(1), Paper No:10. Epub 2017 Jan 18.

Goh, P.S. A proposal for a grading and ranking method as the first step toward developing a scoring system to measure the value and impact of viewership of online material in medical education - going beyond “clicks” and views toward learning. MedEdPublish. 2016 Oct; 5(3), Paper No:62. Epub 2016 Dec 9.

Goh, P.S. The value and impact of eLearning or Technology enhanced learning from one perspective of a Digital Scholar. MedEdPublish. 2016 Oct; 5(3), Paper No:31. Epub 2016 Oct 18.

Goh, P.S. A series of reflections on eLearning, traditional and blended learning. MedEdPublish. 2016 Oct; 5(3), Paper No:19. Epub 2016 Oct 14.

Goh, P.S. Using a blog as an integrated eLearning tool and platform. Med Teach. 2016 Jun;38(6):628-9. Epub 2015 Nov 11.

Goh, P.S. eLearning or Technology enhanced learning in medical education - Hope, not Hype. Med Teach. 2016 Sep; 38(9): 957-958, Epub 2016 Mar 16

Gašević, D., Dawson, S., & Siemens, G. (2015). Let's not forget: Learning analytics are about learning. TechTrends, 59 (1), 64-71. https://doi.org/10.1007/s11528-014-0822-x

Gašević, D., Dawson, S. and Pardo, A. (2016), “How do we start? State and directions of learning analytics adoption”, International Council for Open and Distance Education, available at: https://icde.memberclicks.net/assets/RESOURCES/dragan_la_report%20cc%20licence.pdf (accessed 25 January 2018).

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57, 1439-1459. doi:10.1177/0002764213479367.

Long, P. D., Siemens, G., Conole, G., & Gašević, D. (Eds.). (2011). Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK'11). New York, NY, USA: ACM.

Mackintosh-Franklin C. An evaluation into the impact of undergraduate nursing students' classroom attendance and engagement with online tasks on overall academic achievement. Nurse Educ Today. 2017 Nov 20;61:89-93. doi:10.1016/j.nedt.2017.11.017. [Epub ahead of print] Review. PubMed PMID: 29195212.

Martin, Florence and Ndoye, Abdou, Using Learning Analytics to Assess Student Learning in Online Courses, Journal of University Teaching & Learning Practice, 13(3), 2016.

Mohamed Amine Chatti; Anna Lea Dyckhoff; Ulrik Schroeder; Hendrik Thüs. A reference model for learning analytics. International Journal of Technology Enhanced Learning (IJTEL), Vol. 4, No. 5/6, 2012. https://www.inderscienceonline.com/doi/abs/10.1504/IJTEL.2012.051815

Papamitsiou, Z. and Economides, A.A. (2014), "Learning analytics and educational data mining in practice: a systematic literature review of empirical evidence", Educational Technology & Society, Vol. 17 No. 4, pp. 49-64.

Wong, BTM, (2017) "Learning analytics in higher education: an analysis of case studies", Asian Association of Open Universities Journal, Vol. 12 Issue: 1, pp.21-40, https://doi.org/10.1108/AAOUJ-01-2017-0009 
http://www.emeraldinsight.com/doi/full/10.1108/AAOUJ-01-2017-0009

"In three studies, we found that students who took notes on laptops performed worse on conceptual questions than students who took notes longhand. We show that whereas taking more notes can be beneficial, laptop note takers’ tendency to transcribe lectures verbatim rather than processing information and reframing it in their own words is detrimental to learning"
above quote from abstract of article below