Research on E-Learning and ICT in Education: Technological, Pedagogical, and Instructional Perspectives

This book comprises research-based chapters developed from selected full papers presented at the 12th Pan-Hellenic and International Conference “ICT in Education”, held in Greece in 2021.


Table of contents:
Contents
About the Editor
About the Book
Evaluating Digital Learning Objects in Science Education with the “Science Learning Objects Evaluation Tool”
1 Introduction
1.1 Digital Learning Objects in Science Education
1.2 Evaluation Tools of Digital Learning Objects
2 Systematic Review on the Evaluation of DLOs for Science Education
2.1 Selected Databases and Search Algorithm
2.2 Inclusion and Exclusion Criteria
2.3 Review Process
2.4 Findings
3 Methodology
3.1 The Evaluation Tool SciLOET
3.2 Sample
3.3 Procedure
4 Results
4.1 Results from Elementary Science Teachers’ Study
Descriptive Statistics
Measurement Model
Structural Model
4.2 Results from Secondary Science Teachers’ Study
Descriptive Statistics
Measurement Model
Structural Model
5 Discussion and Conclusions
References
Digital Games as Learning Tools: Mapping the Perspectives and Experience of Student Teachers in Greek Universities
1 Introduction
2 Methodology
2.1 Participants
3 Results
3.1 The Students’ Gaming Profile: What and How They Play
3.2 The Perceptions and Attitudes of Students Towards GBL (LO and BI)
3.3 Gaming Profile and Attitudes Towards Games (LO and BI)
4 Discussion and Conclusions
References
Understanding Primary School Students’ Desire to Play Games on Smart Mobile Devices in their Leisure Time
1 Introduction
2 Theoretical Framework
3 Research Model and Hypotheses
4 Method
4.1 Participants and Procedure
4.2 Measures
4.3 Analysis
5 Results
5.1 Cronbach Alpha Reliabilities and Descriptive Analysis
5.2 Pearson Correlation Analysis
5.3 Hypothesis Testing
6 Discussion and Conclusions
References
Peer Evaluation Literacy in Teacher Education: Mapping Student Teachers as Reviewees and Reviewers
1 Introduction
1.1 The Usefulness of Peer Evaluation to Reviewers and Reviewees
1.2 The Barriers to Reviewers Composing and Reviewees Taking Up Peer Evaluation
1.3 Peer Evaluation Literacy
2 Peer Evaluation Practices in Teacher Education
3 Methods
3.1 Context and Participants
3.2 Materials: PeerLAND
3.3 Research Design
3.4 Procedure
4 Results
4.1 The Usefulness of Peer Evaluation
4.2 The Barriers Faced in Peer Evaluation
4.3 The Student Teachers’ Potential as Learning Design Reviewers
5 Discussion and Conclusions
References
Teachers’ Preferences for Having Guidance from Digital Tools in Authoring Learning Designs
1 Introduction
2 Background
3 Methods
3.1 Setting and Participants
3.2 Materials
3.3 Research Scope and Research Design
3.4 Procedure
4 Results
4.1 Quantitative and Qualitative Results
4.2 Mixed-Method Results
5 Discussion
6 Conclusions
References
Investigating TPACK Integration in the Designing and Implementation of Educational Activities Using ICT by Prospective Early Childhood Teachers
1 Introduction
2 Scope and Description of the Research
2.1 Purpose
2.2 Context
2.3 The TPACK Analysis Model
3 Data Analysis
4 Conclusions
References
Evaluating the Usability of Mobile-Based Augmented Reality Applications for Education: A Systematic Review
1 Introduction
1.1 Augmented Reality
1.2 Usability
2 Previous Studies
2.1 Contribution Over Existing Surveys
3 Methodology
3.1 Identification
Search Strings
Databases
3.2 Screening and Filtering
Basic Screening
Advanced Screening and Quality Assessment
3.3 Data Extraction and Synthesis
4 Results
4.1 Information of Selected Papers
4.2 What are the Common Domains, AR Types and Settings in Usability Studies for Educational Mobile-Based AR Applications?
4.3 What Type of Data and Experimental Design Have Been Used in Usability Studies for Educational Mobile-Based AR Applications?
4.4 What is the Sample of Participants in Usability Studies for Educational Mobile-Based AR Applications?
4.5 Which Usability Metrics Have Been Measured in Usability Studies for Educational Mobile-Based AR Applications?
4.6 Which Data Collection Methods and Instruments Have Been Used in Usability Studies for Educational Mobile-Based AR Applications?
4.7 What Kind of Usability Data, Metrics and Instruments Are Used in Each Educational Level to Evaluate Mobile-Based AR Applications?
5 Discussion
5.1 RQ1. What Are the Common Domains, AR Types and Settings in Usability Studies for Educational Mobile-Based AR Applications?
5.2 RQ2. What Type of Data and Experimental Design Have Been Used in Usability Studies for Educational Mobile-Based AR Applications?
5.3 RQ3. What is the Sample of Participants in Usability Studies for Educational Mobile-Based AR Applications?
5.4 RQ4. Which Usability Metrics Have Been Measured in Usability Studies for Educational Mobile-Based AR Applications?
5.5 RQ5. Which Data Collection Methods and Instruments Have Been Used in Usability Studies for Educational Mobile-Based AR Applications?
5.6 RQ6. What Kind of Usability Data, Metrics and Instruments Are Used in Each Educational Level to Evaluate Mobile-Based AR Applications?
5.7 Limitations
6 Conclusion
References
Augmented Reality Smart Glasses: Why Do Students Prefer to Use Them in Learning?
1 Introduction
2 Previous Empirical Research on Students’ Acceptance of AR and/or Mobile Technology Devices
3 Previous Empirical Research on Smart Glasses Acceptance in Education
4 Theoretical Background
5 Methodology
5.1 Sample
5.2 Data Collection and Procedure
5.3 Analysis
6 Results
6.1 Perceived Enjoyment
6.2 Perceived Ease of Use
6.3 Perceived Usefulness
6.4 Perceived Relative Advantage
7 Conclusions and Discussion
8 Limitations and Future Research
References
Collaborative Digital Storytelling via the StoryLogicNet Tool During COVID-19 School Closure
1 Introduction
2 Theoretical Background
2.1 Multiliteracy Education Pedagogy Via Digital Storytelling
2.2 The StoryLogicNet Tool
3 Method
3.1 Research Design
3.2 Participants
3.3 Implementation of the Pilot Project
4 Results
5 Engaging Young Learners with Collaborative Digital Storytelling: Exploring Implications for Second/Foreign Language Learning
5.1 Collaborative Digital Storytelling for Communicative Skills Development in a Second/Foreign Language
5.2 Collaborative Digital Storytelling for Multiliteracy Competences Development
6 Conclusion
References
Teaching Humanities Through Digital Tools in Secondary Education
1 Introduction
2 Research Methodology
3 Findings
4 Conclusions
References
Digital Humanities and Digital Narrative
1 Digital Humanities
2 Storytelling Goes Digital
3 And History Is Going Digital
4 Research
5 Educational Scenario
6 Research Results
7 Conclusions
References
Stimulation of Executive Functions with Embedded Preliteracy Skills in High Ability Preschoolers: An Educational Software
1 Introduction
2 The Interrelation Between EFs and Phonological Awareness
3 Executive Functions in High Ability Preschool Children
4 Preliteracy Skills in High Ability Preschool Children
5 EFs Computer-Assisted Stimulation
6 Cogni-PreLit App
7 Technical Specifications
8 Application Interface
9 Gameplay
10 The Implementation of Cogni-Prelit in a Group of High Ability Preschool Children
10.1 Participants and Procedure
10.2 Measures
Phonological Awareness
Verbal Short-Term Memory
Verbal Working Memory
Inhibitory Control
Cognitive Flexibility
Statistical Analysis
Results
11 Discussion
12 Limitations and Future Research Directions
13 Conclusion
References
Computational Thinking and Problem-Solving Skills in Preschool Education: Children’s Engagement and Response to a Sequence of Programming Activities with Scratch Jr
1 Introduction
2 Literature Review
3 Aim of the Study and Research Questions
4 Design of the Educational Intervention
5 Research Method
5.1 Research Context and Participants
5.2 Procedure and Data Sources
6 Results
7 Conclusions
References
Emergency Remote Teaching in K-12 Education During COVID-19 Pandemic: A Systematic Review of Empirical Research in Greece
1 Introduction
2 Methodology
3 Results
3.1 RQ1: Which Are the Main Characteristics of the Research on ERT in K-12 Education?
3.2 RQ2 – How Was ERT Implemented in School Education?
3.3 RQ3 – What Were the Impact/Benefits?
3.4 RQ4 – What Obstacles/Difficulties Were Recorded?
4 Discussion
5 Conclusion
Appendix: List of Master Theses
References
Analysis of Teachers’ Community Activity Within a Connectivist MOOC for Professional Development
1 Introduction
2 MOOC Design and Implementation
3 Aim and Research Questions
4 Research Methodology
4.1 Participants
4.2 Research Data and Teachers’ Contributions
4.3 Social Network Analysis
5 Results
6 Conclusions
References
Educational Continuity and Distance Learning, at the European Level. A Multi-perspective Examination
1 Introduction
2 Materials and Methods
3 Results
3.1 Questionnaire No 1 – Teachers
3.2 Questionnaire No 2 – Children
3.3 Questionnaire No 3 – Parents
4 Discussion
References
Index

Tharrenos Bratitsis, Editor

Research on E-Learning and ICT in Education: Technological, Pedagogical, and Instructional Perspectives

Editor
Tharrenos Bratitsis
School of Humanities and Social Studies
University of Western Macedonia
Florina, Greece

ISBN 978-3-031-34290-5    ISBN 978-3-031-34291-2 (eBook)
https://doi.org/10.1007/978-3-031-34291-2

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2023

This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland.

To the people bearing with those who pursue knowledge and academic outbreaks, bringing equilibrium to their lives. Especially to you for supporting me as such! “If we teach today as we taught yesterday, we rob our children of tomorrow.”  — John Dewey

Contents

Evaluating Digital Learning Objects in Science Education with the “Science Learning Objects Evaluation Tool” (G. K. Zacharis and T. A. Mikropoulos), 1
Digital Games as Learning Tools: Mapping the Perspectives and Experience of Student Teachers in Greek Universities (Iro Voulgari, Konstantinos Lavidas, and Vassilis Komis), 21
Understanding Primary School Students’ Desire to Play Games on Smart Mobile Devices in their Leisure Time (George Koutromanos), 39
Peer Evaluation Literacy in Teacher Education: Mapping Student Teachers as Reviewees and Reviewers (Kyparisia Papanikolaou, Eleni Zalavra, and Maria Tzelepi), 57
Teachers’ Preferences for Having Guidance from Digital Tools in Authoring Learning Designs (Eleni Zalavra, Kyparisia Papanikolaou, Yannis Dimitriadis, and Cleo Sgouropoulou), 75
Investigating TPACK Integration in the Designing and Implementation of Educational Activities Using ICT by Prospective Early Childhood Teachers (Aggeliki Tzavara and Vassilis Komis), 93
Evaluating the Usability of Mobile-Based Augmented Reality Applications for Education: A Systematic Review (Filippos Tzortzoglou and Alivisos Sofos), 105
Augmented Reality Smart Glasses: Why Do Students Prefer to Use Them in Learning? (Georgia Kazakou and George Koutromanos), 137
Collaborative Digital Storytelling via the StoryLogicNet Tool During COVID-19 School Closure (Eleni Korosidou and Tharrenos Bratitsis), 155
Teaching Humanities Through Digital Tools in Secondary Education (Eleni Bekiari and Maria Xesternou), 173
Digital Humanities and Digital Narrative (Alexandros Kapaniaris and Anna Dimitriou), 183
Stimulation of Executive Functions with Embedded Preliteracy Skills in High Ability Preschoolers: An Educational Software (Eleni Rachanioti, Anastasia Alevriadou, Tharrenos Bratitsis, and Garyfalia Charitaki), 195
Computational Thinking and Problem-Solving Skills in Preschool Education: Children’s Engagement and Response to a Sequence of Programming Activities with Scratch Jr (Ourania Gaki and Athanassios Jimoyiannis), 221
Emergency Remote Teaching in K-12 Education During COVID-19 Pandemic: A Systematic Review of Empirical Research in Greece (Apostolos Kostas, Vasilis Paraschou, Dimitris Spanos, and Alivizos Sofos), 235
Analysis of Teachers’ Community Activity Within a Connectivist MOOC for Professional Development (Nikolaos Koukis, Panagiotis Tsiotakis, and Athanassios Jimoyiannis), 261
Educational Continuity and Distance Learning, at the European Level. A Multi-perspective Examination (Tharrenos Bratitsis, Ana Barroca, Marina Bessi, and Stefania Guccione), 275
Index, 291

About the Editor

Tharrenos Bratitsis is a Full Professor at the Early Childhood Education Department, University of Western Macedonia, Greece, and Director of the Creativity, Innovation and Technology in Education (CrInTE) Laboratory. His teaching and research concern digital and global competences and innovative actions in the educational sector through technology, at all levels and types of education (formal, non-formal, informal). He holds a BSc in Electrical Engineering and Computer Science from the University of Patras and a PhD in Technology Enhanced Learning from the University of the Aegean. His research interests include Technology-Enhanced Learning, Game-based Learning, Digital Storytelling, Design Thinking, STEAM Education, Educational Robotics, Computer-Supported Collaborative Learning, Gamification and Learning Analytics.


About the Book

This book is an essential text for researchers and academics seeking the most comprehensive and up-to-date coverage of all aspects of e-learning and ICT in education. It provides expanded peer-reviewed content from research presented at the 12th Pan-Hellenic and International Conference “ICT in Education”, held in Greece in 2021. The volume includes papers covering technical, pedagogical, organizational, instructional, as well as policy aspects of ICT in education and e-learning. Special emphasis is given to applied research relevant to educational practice, guided by the educational realities in schools, colleges, universities and informal learning organizations. The volume encompasses current trends, perspectives and approaches determining e-learning and ICT integration in practice, including learning and teaching, curriculum and instructional design, learning media and environments, and teacher education and professional development. As expected, chapters focusing on the COVID-19 pandemic period and ICT application in education are included. The book is based on research work originally presented at the conference, but the call for chapters was open and disseminated to the international community, also attracting international contributions.


Evaluating Digital Learning Objects in Science Education with the “Science Learning Objects Evaluation Tool”

G. K. Zacharis and T. A. Mikropoulos

G. K. Zacharis (*)
School of Early Childhood Education, Aristotle University of Thessaloniki, Thessaloniki, Greece
e-mail: [email protected]

T. A. Mikropoulos
Educational Approaches to Virtual Reality Lab, University of Ioannina, Ioannina, Greece

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
T. Bratitsis (ed.), Research on E-Learning and ICT in Education, https://doi.org/10.1007/978-3-031-34291-2_1

1 Introduction

Digital Learning Objects (DLOs) can be defined as autonomous structural educational resources concerning a learning environment and contributing to the implementation of learning objectives (Topali & Mikropoulos, 2018). DLOs are characterized as easily identifiable, reusable, disaggregated, accessible, interoperable, adaptable, durable, creative, and manageable entities (Gürer, 2013). We adopt the definition proposed by Topali and Mikropoulos (2018), according to which a DLO is a “small, autonomous, reusable and pedagogically integrated learning content structure” (p. 257). This definition considers a DLO not just as an entity that includes information, but as an entity that adds specific educational value. The main purpose of a DLO should be to provide meaningful communication between the learner and the content of the DLO, in which the learner reflects on their knowledge and previous experiences (de Almeida Pacheco et al., 2019).

1.1 Digital Learning Objects in Science Education

Science Education addresses an organized body of knowledge that comprises abstract concepts, magnitudes and phenomena usually far from everyday experience, which are often unclear and difficult for students.


Science Education aims to cultivate skills such as observation, experimentation, organization and planning, evaluation, problem solving, and abstract and logical reasoning (Fernandes et al., 2019). DLOs in the field of Science Education have been identified as valuable tools to support the achievement of positive learning outcomes and motivation to learn (Barak & Dori, 2011; Lin & Dwyer, 2010; Nguyen et al., 2012; Sudatha et al., 2018). The integration of DLOs in teaching and learning processes seems to enhance the perception of phenomena (Gould et al., 2005) and to contribute to the development of reasoning for a deeper understanding of scientific concepts (De Jong & Van Joolingen, 2008). DLOs also offer the possibility to interact with models and simulations, overcoming to a certain extent the limitations of a real laboratory (Zacharia & Olympiou, 2011).

1.2 Evaluation Tools of Digital Learning Objects

Over the last twenty years, efforts have been made to develop methods, frameworks and tools for the evaluation of DLOs. Researchers and organizations have developed tools for evaluating DLOs and digital Open Educational Resources (OERs) in general. Evaluation tools that have been developed by organizations are the SREB-SCORE (Southern Regional Education Board) (SREB-SCORE, 2007), HEODAR (Herramienta de Evaluación de Objetos de Aprendizaje) (Muñoz et al., 2009, 2012) and MERLOT (Multimedia Educational Resource for Learning and Online Teaching) (Cafolla, 2002). Evaluation tools developed by researchers are the LOES-S (Learning Object Evaluation Scale for Students) (Kay & Knaack, 2009), the LOES-T (Learning Object Evaluation Scale for Teachers) (Kay & Knaack, 2008b), LORI 1.5 (Learning Object Review Instrument) (Leacock & Nesbit, 2009), LOEM (Learning Object Evaluation Metric) (Kay & Knaack, 2008a), the WBLT (Web-Based Learning Tools) Evaluation Scale (Kay, 2011), and the Learning Objects Reusability Effectiveness Metric (LOREM) (Sultan et al., 2014).

MERLOT was designed to assess the content quality, ease of use, and potential learning benefits of using DLOs in educational practice. It consists of three axes, and the scores on the instrument’s questions are reported on a five-point scale. SREB-SCORE was designed to assess the quality and effectiveness of DLOs at both secondary and tertiary levels. It consists of eight axes and uses a two-point scale together with comments for the evaluator’s control. HEODAR consists of three axes on a five-point scale; the tool is implemented within the University of Salamanca framework and was initially integrated with Moodle. LORI 1.5 can be used by educators and designers of instructional materials; it is structured on ten axes and uses a five-point scale accompanied by feedback. LOEM is addressed to students and teachers, is structured in four axes, and the evaluation is accomplished through a three-level scale. LOES-T and LOES-S have been developed by the same authors as LOEM, are structured in three axes, and are addressed to teachers and students respectively. LOES-T items are scored on a seven-point Likert scale, while LOES-S uses a five-point Likert scale. The


WBLT Evaluation Scale is a revised version of the LOES-S and LOES-T. It is based on three axes and is addressed to students and teachers, while the evaluation is done through a five-level scale. LOREM is an evaluation tool for students, teachers, and designers of educational materials. It consists of eight categories with different types of scales (single choice, five-point scale) per axis. It is noteworthy that in all questions there is a minimum threshold, below which the evaluator is asked to document their score.

The above evaluation tools are independent of the subject of the DLOs, i.e., they refer to all disciplines and educational levels. This implies certain difficulties in targeted evaluations, as for example in the field of Science Education. MERLOT is an exception, because it offers different editions for different disciplines and is aimed at pupils, students and teachers. However, MERLOT evaluates educational resources in general, without considering the features of DLOs.

The nine evaluation tools previously mentioned comprise between three (MERLOT, LOES-S, LOES-T, LOEM, WBLT Evaluation Scale) and ten (SREB-SCORE, LORI 1.5) factors, as presented in Table 1. They all include three common factors, namely content quality, learning objectives, and usability. SREB-SCORE, LOES, LOEM, LORI 1.5 and HEODAR have separate factors regarding motivation, feedback, and presentation of DLOs. These factors are also involved in various parts of the other tools; for example, feedback is among the four criteria of the “Ease of Use” factor of MERLOT. SREB-SCORE and LORI 1.5 refer to three of the characteristics of DLOs, which are reusability, accessibility and standards. SREB-SCORE, LORI 1.5 and LOREM are the only tools that investigate the factor of reusability. LOREM, in turn, includes the axes of “Gender” and “Metadata”, which are not included in any of the other assessment tools. Only the SREB-SCORE checklist is copyrighted.

2 Systematic Review on the Evaluation of DLOs for Science Education

The lack of DLO evaluation tools addressed to Science Education led to a systematic review that investigates the evaluation of DLOs for Science Education. The purpose of the review was to locate the factors that are important for the evaluation of DLOs for science in primary and secondary education. To conduct the systematic review and minimize possible errors, the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) model was used (Moher et al., 2009).


Table 1  Evaluation tools of Digital Learning Objects

MERLOT | Developer: Sonoma State University | Year: 1997 | Scale: five-point
Factors: 1. Content quality; 2. Potential effectiveness; 3. Ease of use

SREB-SCORE | Developer: Southern Regional Education Board | Year: 2007 | Scale: two-point (Yes/No), with comments from the evaluator
Factors: 1. Content quality; 2. Learning goal alignment; 3. Interface; 4. Usability; 5. Feedback; 6. Motivation; 7. Presentation design; 8. Reusability; 9. Standards compliance

LORI 1.5 | Developer: Leacock & Nesbit | Year: 2007 | Scale: five-point, with comments from the evaluator
Factors: 1. Content quality; 2. Learning goal alignment; 3. Interaction; 4. Usability; 5. Feedback and adaptation; 6. Motivation; 7. Presentation design; 8. Accessibility; 9. Reusability; 10. Standards compliance

LOEM | Developer: Kay & Knaack | Year: 2008 | Scale: three-point
Factors: 1. Interactivity; 2. Design; 3. Engagement; 4. Usability

LOES-S | Developer: Kay & Knaack | Year: 2008 | Scale: five-point Likert
Factors: 1. Quality; 2. Learning; 3. Engagement

LOES-T | Developer: Kay & Knaack | Year: 2008 | Scale: seven-point Likert
Factors: 1. Quality; 2. Learning; 3. Engagement

HEODAR | Developer: Muñoz, Conde & Peñalvo | Year: 2009 | Scale: five-point Likert, with comments
Factors: 1. Motivation & attention; 2. Professional competency; 3. Level of difficulty; 4. Interactivity; 5. Creativity

WBLT Evaluation Scale | Developer: Kay | Year: 2011 | Scale: five-point Likert
Factors: 1. Learning; 2. Design; 3. Engagement

LOREM | Developer: Sultan, Nasr & Amin | Year: 2014 | Scale: five-point (with a minimum threshold) & single choice
Factors: 1. Reusability; 2. Gender; 3. Accessibility; 4. Appropriateness; 5. Content quality; 6. Metadata; 7. Motivation; 8. Usability

2.1 Selected Databases and Search Algorithm

The systematic review was based on a search in ERIC, SCOPUS, Science Direct, Elsevier, Springer Link, Wiley Interscience, ACM, IEEE, JSTOR, ProQuest, Web of Science and Google Scholar.

A Boolean search string was compiled using the core terms and other terms similar to them, in order to get all the relevant results: (“digital learning object” OR “web-based tool”) AND (“science education” OR “science” OR “physics” OR “chemistry” OR “biology” OR “geography” OR “environment”) AND (“evaluation model” OR “evaluation framework” OR “evaluation tool” OR “evaluation standards” OR “validation”) AND (“education” OR “primary education” OR “secondary education” OR “high school” OR “K-12 education”).
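For readers who wish to reuse or adapt the query, the Boolean string can be assembled programmatically from its four term groups. The following is an illustrative Python sketch (the terms are those listed above; the script itself is not part of the authors' methodology):

```python
# Assemble the review's Boolean search string from its four term groups
# (terms copied from the text above; the assembly function is illustrative).
object_terms = ["digital learning object", "web-based tool"]
domain_terms = ["science education", "science", "physics", "chemistry",
                "biology", "geography", "environment"]
evaluation_terms = ["evaluation model", "evaluation framework",
                    "evaluation tool", "evaluation standards", "validation"]
level_terms = ["education", "primary education", "secondary education",
               "high school", "K-12 education"]

def or_group(terms):
    """Quote each term, join with OR, and wrap the group in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = " AND ".join(or_group(g) for g in
                     [object_terms, domain_terms, evaluation_terms, level_terms])
print(query)
```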

2.2 Inclusion and Exclusion Criteria

Articles published in English in peer-reviewed scientific journals or conference proceedings between the years 2000 and 2022 were included. The year 2000 was chosen as the reference year for the instructional use of DLOs (Wiley, 2002). The articles had to be empirical studies presenting results regarding the evaluation of DLOs by students and/or teachers. Studies that were not written in English, were published only as a summary, or did not refer to the field of Science Education were excluded.
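As a rough illustration only, the criteria above can be read as a single screening predicate over bibliographic records; the field names below are hypothetical, not from the paper:

```python
# Hypothetical record fields; the rules mirror the criteria in Sect. 2.2.
def meets_criteria(rec: dict) -> bool:
    return (
        rec["language"] == "English"
        and 2000 <= rec["year"] <= 2022
        and rec["venue"] in {"peer-reviewed journal", "conference proceedings"}
        and rec["is_empirical_dlo_evaluation"]   # results on DLO evaluation by students/teachers
        and not rec["summary_only"]              # excluded: published only as a summary
        and rec["field"] == "Science Education"  # excluded: outside Science Education
    )
```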


Fig. 1  A PRISMA flow diagram illustrating the review process

2.3 Review Process

The initial search results in all databases produced a total of 2114 articles. Based on duplicates, publication date, title and abstract, 2074 studies were excluded. The remaining 40 articles were further examined; twenty-eight of them did not meet the inclusion criteria and were excluded. The PRISMA procedure followed is illustrated in Fig. 1.
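As a quick consistency check on these counts, the screening funnel can be tabulated as follows (numbers from the text; the stage labels are ours):

```python
# PRISMA-style screening funnel with the counts reported in the text.
stages = [
    ("records identified in all databases", 2114, 0),
    ("after removing duplicates and screening date/title/abstract", 2114, 2074),
    ("after full-text eligibility check", 40, 28),
]
for label, before, excluded in stages:
    print(f"{label}: {before} - {excluded} = {before - excluded}")
# 2114 - 2074 = 40; 40 - 28 = 12 articles in the final dataset (Sect. 2.4).
```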

2.4 Findings

The 12 articles that composed the final dataset were included in the systematic review (Table 2). According to Table 2, it seems that the most used evaluation tools are LOES-S and LOES-T (Kay & Knaack, 2007, 2008b, 2009; Kay et al., 2009; Kay, 2012). It is noteworthy that in two studies (Pejuan et al., 2016; Schibeci et al., 2008) the researchers developed their own evaluation tool, without giving it any name.


Table 2  Synthesis of the studies under review

1. Kay and Knaack (2007) | Tool(s): LOES-S, LOES-T | Validation: Yes | Evaluators: 111 students & 19 teachers | Level: Secondary | DLOs: 3 | Discipline: Physics, Chemistry, Biology
2. Schibeci et al. (2008) | Tool(s): no name | Validation: No | Evaluators: 134 students & 6 teachers | Level: Primary & Secondary | DLOs: 40 | Discipline: Science
3. Akpinar (2008) | Tool(s): LORI | Validation: Yes | Evaluators: 507 students & 24 teachers | Level: Primary & Secondary | DLOs: 24 | Discipline: Physics, Chemistry, Biology
4. Kay and Knaack (2008a) | Tool(s): LOEM | Validation: Yes | Evaluators: 1113 students & 33 teachers | Level: Primary & Secondary | DLOs: 44 | Discipline: Physics, Chemistry, Biology, Mathematics, History
5. Kay and Knaack (2008b) | Tool(s): LOES-S, LOES-T | Validation: No | Evaluators: 262 students & 8 teachers | Level: Primary & Secondary | DLOs: 10 | Discipline: Physics, Chemistry, Biology, Mathematics
6. Kay and Knaack (2009) | Tool(s): LOES-S | Validation: Yes | Evaluators: 1113 students & 33 teachers | Level: Primary & Secondary | DLOs: 48 | Discipline: Physics, Chemistry, Biology, Mathematics, History
7. Kay and Knaack (2009) | Tool(s): LOES-S, LOES-T | Validation: Yes | Evaluators: 503 students & 15 teachers | Level: Primary & Secondary | DLOs: 21 | Discipline: Science
8. Kay et al. (2009) | Tool(s): LOES-T | Validation: Yes | Evaluators: 1113 students & 33 teachers | Level: Primary & Secondary | DLOs: 48 | Discipline: Physics, Chemistry, Biology, Mathematics
9. Kay (2011) | Tool(s): LOES-S, LOES-T | Validation: No | Evaluators: 371 students & 11 teachers | Level: Primary & Secondary | DLOs: 12 | Discipline: Physics, Chemistry, Biology
10. Turel and Gürol (2011) | Tool(s): Course Interest Survey (CIS) & Attitude Survey | Validation: No | Evaluators: 78 students | Level: Secondary | DLOs: 98 | Discipline: Science
11. Kay (2012) | Tool(s): WBLT Evaluation Scale for Students & WBLT Evaluation Scale for Teachers | Validation: No | Evaluators: 333 students & 8 teachers | Level: Primary & Secondary | DLOs: 12 | Discipline: Physics, Mathematics
12. Pejuan et al. (2016) | Tool(s): no name | Validation: Yes | Evaluators: 4 teachers | Level: Primary, Secondary & Higher Education | DLOs: 5 | Discipline: Physics

Pejuan and colleagues referred to dynamic simulations, namely physics applets. In the category “underlying learning conception” of their evaluation tool, the authors ask whether the tool advances the “learning process through interactive discovery”, a generic question relevant to an instructional model used in Science Education. Schibeci and colleagues’ evaluation tool was not designed for DLOs referring to science; the authors simply used it with “science learning objects” because only such learning objects “were available for use at the time of the pre-pilot”. Turel and Gürol (2011) used a set of questions to evaluate students’ motivation, attitude, and achievement, as well as students’ and teachers’ perceptions of DLOs.

In the majority of the studies, the reliability and validity of the evaluation tool were measured, the evaluators being both primary and secondary school students and teachers. Kay and Knaack (2007) and Turel and Gürol (2011) referred only to secondary education.

The review shows that there are few tools for the evaluation of DLOs concerning Science Education. Moreover, these tools hardly refer to instructional approaches used in Science Education, such as inquiry-based learning. Thus, there is a need for a reliable and valid evaluation tool addressed to Science Education.

3 Methodology

The aim of the present work was the validation of the “Science Learning Objects Evaluation Tool – SciLOET” developed by Mikropoulos and Papachristos (2021).

3.1 The Evaluation Tool SciLOET

The DLO evaluation tool for Science Education, “Science Learning Objects Evaluation Tool – SciLOET”, is oriented toward use in educational design and teaching practice. SciLOET does not include items referring to certain characteristics of DLOs, such as granularity, generativity, durability and manageability, which are not teachers’ main concern when selecting and using a DLO (Mikropoulos & Papachristos, 2021). SciLOET consists of four factors: content quality, teaching effectiveness, design, and documentation (Mikropoulos & Papachristos, 2021). The tool is composed of


12 questions. “Content Quality” includes three questions referring to the subject under study and the pedagogical value of the DLO. “Teaching effectiveness” consists of six questions on the contribution of the DLO to certain aspects of instructional design such as multiple representations, discovery or exploratory activities, and reflection. The questions offer short examples for concepts like inquiry-based learning, reflection and didactic situations. “Design” includes two questions, which refer to the appearance and interface of the DLO. The fourth factor “Documentation” of the tool refers to instructions for using the DLO.
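To make the structure of the instrument concrete, the factor/item layout can be sketched as a simple data structure. The item identifiers follow Table 3; the mean-based factor score shown is one plausible aggregation, not one prescribed by the authors:

```python
# SciLOET: four factors, 12 items, each rated on a four-point Likert scale
# (1 = Strongly Disagree ... 4 = Strongly Agree); item IDs as in Table 3.
SCILOET = {
    "Content quality": ["CQ1", "CQ2", "CQ3"],
    "Teaching effectiveness": ["TE1", "TE2", "TE3", "TE4", "TE5", "TE6"],
    "Design": ["DE1", "DE2"],
    "Documentation": ["DOC1"],
}

def factor_score(answers: dict, factor: str) -> float:
    """Mean rating over the items of one factor (illustrative aggregation)."""
    items = SCILOET[factor]
    return sum(answers[item] for item in items) / len(items)

# Example: one respondent's ratings
ratings = {"CQ1": 4, "CQ2": 3, "CQ3": 4, "TE1": 3, "TE2": 3, "TE3": 4,
           "TE4": 3, "TE5": 4, "TE6": 3, "DE1": 4, "DE2": 3, "DOC1": 4}
print(factor_score(ratings, "Content quality"))  # ~3.67
```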

3.2 Sample

The validation of SciLOET was conducted through two empirical studies: the first was addressed to 102 elementary science teachers and the second to 118 secondary education science teachers.

In the first study, 75 out of 102 elementary science teachers were women (73.5%) and 27 were men, while their mean age was 35.7 years (STD = 11.4). Eighty-six of the participants were teachers in public schools (84.3%), and the others in private schools. In terms of years of work experience, 52 participants (51%) had worked between one and five years, 46 over 10 years, while only four had six to 10 years of experience.

In the second study, 62 out of 118 secondary education science teachers were women (52.5%) and 48 were men, most of them aged between 45 and 54 years old. Of the participants, 112 were teachers in public schools (94.9%), while 6 (5.1%) taught in private schools. In terms of years of work experience, 11 participants had worked less than two years, 10 from three to eight years, 26 from nine to 14 years, 34 from 15 to 20 years, while 37 (31.4%) had worked more than 20 years. In addition, 66 teachers (55.9%) were Physicists, 26 Chemists, 23 Biologists, while three were Geologists. Also, 38 (32.2%) taught in a Gymnasium (junior high school), 40 in a Lyceum (senior high school), while 40 taught simultaneously in both.

3.3 Procedure

A questionnaire was developed and distributed through Google Forms; its link was sent to teachers via e-mail. The questionnaire was structured in two parts. The first part included eight questions of demographic interest (gender, age, years of service, ICT training, and knowledge and use of the national repository of DLOs “Photodentro”, https://photodentro.edu.gr/lor/). The second part was divided into the four factors of SciLOET, with answers on a four-point Likert scale (1 = Strongly Disagree, 4 = Strongly Agree). The DLOs under evaluation were developed by the team of one of the authors and can be found at the


Fig. 2  DLO-1: “Magnet and electric circuit – Electron flow”. (http://photodentro.edu.gr/lor/r/8521/8575)

Greek national repository for digital educational resources “Photodentro”. DLO-1 and DLO-2 concerned primary education and are addressed to students aged 9–15 years, while DLO-3, DLO-4, and DLO-5 were for secondary education.

DLO-1 “Magnet and electric circuit – Electron flow” (Fig. 2) is a dynamic visual representation that allows interaction between a compass and the magnetic field a simple electrical circuit creates. The objective of the learning object is for students to investigate the magnetic field around the conductor, to reflect on how the compass is oriented around the conductor, and to connect the oriented electron flow with the creation of a magnetic field.

DLO-2 “Sound is a wave” (Fig. 3) is a visualization of the waveform of a simple sound. Users may change the basic characteristics of the wave, such as its amplitude and frequency, and observe the changes in the sound.

DLO-3 “The human eye – myopia and presbyopia” (Fig. 4) is a visualization of the function of a normal, a myopic and a presbyopic human eye. The objectives are for students to understand the effects of myopia and presbyopia on human vision, as well as to correct a myopic or a presbyopic eye by using the appropriate lens.

DLO-4 “Measuring the pH of solutions of acids, bases and salts” (Fig. 5) is a virtual experiment. Students have the possibility to measure the pH of various solutions and study the acidity of aqueous solutions.

DLO-5 “Linear motion with constant velocity” (Fig. 6) is a simulation-based visualization. Students may divide the motion of a car into three segments (by changing the x-t diagram) and track the car’s motion in the three successive time intervals. Students may also observe the velocity-time diagram at the same time.

SPSS v21.0 was used to extract the descriptive statistical measures, to check reliability, and to perform the factor analysis into principal components. The descriptive statistical


Fig. 3  DLO-2: “Sound is a wave”. (http://photodentro.edu.gr/lor/r/8521/8461?locale=el)

Fig. 4  DLO-3: “The human eye – myopia and presbyopia”. (http://photodentro.edu.gr/lor/r/8521/6176?locale=el)

measures (mean value and standard deviation) of the tool’s questions for the two DLOs that the teachers evaluated were calculated. In addition, the research tool was evaluated for its internal reliability through Cronbach’s alpha and Kendall’s tau correlation (τ). An exploratory factor analysis was also performed through Principal Component Analysis to investigate the interrelationships among the variables.
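The same three analyses can be reproduced outside SPSS. Below is a minimal Python sketch, assuming a pandas DataFrame `responses` whose 12 columns are the SciLOET items (CQ1 ... DOC1, as in Table 3) scored 1-4 by each teacher; it illustrates the statistics used, not the authors' actual workflow:

```python
import pandas as pd
from scipy.stats import kendalltau
from sklearn.decomposition import PCA

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

def analyze(responses: pd.DataFrame) -> None:
    # Descriptive statistics (M and STD per item, as in Table 3)
    print(responses.agg(["mean", "std"]).T)
    # Internal reliability
    print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
    # Kendall's tau for one item pair (repeated over the pairs within each factor)
    tau, p = kendalltau(responses["CQ1"], responses["CQ2"])
    print(f"tau(CQ1, CQ2) = {tau:.2f}, p = {p:.3f}")
    # Exploratory look at the component structure via Principal Component Analysis
    pca = PCA(n_components=4).fit(responses)
    print("Explained variance ratios:", pca.explained_variance_ratio_)
```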


Fig. 5  DLO-4: “Measuring the pH of solutions of acids, bases and salts”. (http://photodentro.edu.gr/lor/r/8521/10473?locale=el)

Fig. 6  DLO-5: “Linear motion with constant velocity”. (http://photodentro.edu.gr/lor/r/8521/1580?locale=el)

4 Results

The results of the evaluations are presented according to educational level.

4.1 Results from Elementary Science Teachers’ Study

Descriptive Statistics

Table 3 shows the mean values (M) and standard deviations (STD) for the two DLOs. The data in Table 3 reveal that both DLOs show high mean values per question, with some variations that are probably due to whether each one of the DLOs concerns a topic that is difficult to be taught without the support of digital technology and can be used in more than one didactic situation.


Table 3  Descriptive statistics for both DLOs evaluated with SciLOET (values are M ± STD on the four-point scale)

A. Content quality
CQ1. The DLO refers to a topic for which students have difficulties or misconceptions | DLO-1: 3.21 ± 0.67 | DLO-2: 3.25 ± 0.64
CQ2. The DLO refers to a topic that is difficult to be taught without the support of digital technology | DLO-1: 2.66 ± 0.86 | DLO-2: 3.22 ± 0.68
CQ3. The DLO adds pedagogical value to the topic under study | DLO-1: 3.17 ± 0.61 | DLO-2: 3.26 ± 0.54
Comments (open field)

B. Teaching effectiveness
TE1. The DLO involves feedback that promotes knowledge construction | DLO-1: 3.18 ± 0.60 | DLO-2: 3.12 ± 0.60
TE2. The DLO implies the design of learning activities | DLO-1: 3.23 ± 0.60 | DLO-2: 3.06 ± 0.61
TE3. The DLO uses multiple representations for the concepts, magnitudes, and phenomena (if required) | DLO-1: 3.09 ± 0.63 | DLO-2: 3.09 ± 0.60
TE4. The DLO promotes discovery or inquiry learning (e.g., inductive reasoning, hypothesis testing, problem solving, metacognitive skills acquisition) | DLO-1: 3.25 ± 0.58 | DLO-2: 3.17 ± 0.81
TE5. The DLO advances reflection (e.g., connection of existing knowledge with new information for problem solving) | DLO-1: 3.26 ± 0.56 | DLO-2: 3.18 ± 0.59
TE6. The DLO can be used in more than one situation (e.g., different goals, other discipline) | DLO-1: 3.27 ± 0.56 | DLO-2: 3.08 ± 0.68
Comments (open field)

C. Design
DE1. Graphics are suitable for the age of the students | DLO-1: 3.19 ± 0.59 | DLO-2: 3.14 ± 0.68
DE2. Multimedia elements effectively attribute the magnitudes and phenomena | DLO-1: 3.10 ± 0.68 | DLO-2: 3.09 ± 0.62
Comments (open field)

D. Documentation
DOC1. There are user’s instructions | DLO-1: 3.29 ± 0.64 | DLO-2: 3.38 ± 0.60
Comments (open field)

Measurement Model

The reliability and consistency of SciLOET were verified, as the values of the Cronbach’s alpha index for the set of questions for both DLOs were acceptable, exceeding the 0.70 threshold (DeVellis, 2016). Specifically, the value of Cronbach’s alpha was 0.824 for DLO-1 and 0.866 for DLO-2. In addition, Kendall’s tau (τ) coefficient was also studied, due to the small amount of data and the large number of tied observations (high mean values), as shown in Table 3. The indicator shows that, for both DLOs, among the questions that make up


each one of the four factors there is a statistically significant correlation (p