Article

Online Professional Development on Educational Neuroscience in Higher Education Based on Design Thinking

by Stylianos Mystakidis 1,2,*, Athanasios Christopoulos 3, Maria Fragkaki 4 and Konstantinos Dimitropoulos 5

1 School of Natural Sciences, University of Patras, 26504 Rion, Greece
2 School of Humanities, Hellenic Open University, 26335 Patras, Greece
3 Turku Research Institute for Learning Analytics, University of Turku, FI-20014 Turku, Finland
4 Department of Educational Sciences and Early Childhood Education, University of Patras, 26504 Rion, Greece
5 Department of Computer Engineering and Informatics, University of Patras, 26504 Rion, Greece
* Author to whom correspondence should be addressed.
Information 2023, 14(7), 382; https://doi.org/10.3390/info14070382
Submission received: 15 May 2023 / Revised: 14 June 2023 / Accepted: 30 June 2023 / Published: 3 July 2023

Abstract: Higher education teaching staff members need to build a scientifically accurate and comprehensive understanding of the function of the brain in learning, as neuroscience evidence can constitute a way to optimize teaching and achieve learning excellence. An international consortium applied design thinking to develop a six-module professional development course on educational neuroscience and an online community of practice. A mixed methods research design was employed to investigate the attitudes of thirty-two (N = 32) participating academics using a survey comprising eleven closed and open questions. Data analysis methods included descriptive statistics, correlation, a generalized additive model and grounded theory. The overall evaluation demonstrated a notable satisfaction level with regard to the quality of the course. Given the power of habits, mentoring and peer interactions are recommended to ensure the effective integration of theoretical neuroscientific evidence into teaching practice.

1. Introduction

Most higher education institutions worldwide prioritize research and development in their academic strategy, pursued by winning competitive national and international funding programs and publishing studies in high-impact venues [1]. In this teaching-research nexus in higher education, student teaching is often seen by university and college faculty members as a necessity, sometimes even as a distraction from truly meaningful work [2]. Career advancement of tenured academic faculty is determined by research visibility, publication output and impact rather than teaching quality [1]. However, teaching and learning is a critical process that directly influences the prospects of economic growth and societal wellbeing through the knowledge, competencies and values of students and graduates within academic and professional communities. The quality of teaching and learning depends on several factors, one of them being the perceptions of teaching staff regarding what constitutes effective teaching and learning [3].
Neuropedagogy, educational neuroscience or neuroeducation is the application of cognitive neuroscience to teaching and learning [4]. Neuropedagogy is the meeting point where science and education combine; its scientific goals are to learn how to stimulate new zones of the brain and create neural connections. Despite its rich history, educational neuroscience has not yet had the envisioned impact on education, partly because neuroscientific laboratory experiments are conceptually and practically different from attendance-based teaching practice [5]. Therefore, to bridge this gap, new initiatives are needed to translate neuroscientific evidence into tangible models, frameworks, recommendations and practices. One such initiative focused on the essential skills of educators: Koehler and Mishra formulated the TPACK model, suggesting that educators should construct technological, pedagogical and content knowledge and skills [6].
Tokuhama-Espinosa expanded this model, providing a holistic vision for the contemporary educator as a learning professional and scientist [7]. She advocates that teachers at all levels of service should be literate in neuroeducation. This notion is reflected in an enriched TPACK model that she proposed, featuring a new area in educational neuroscience (mind, brain and education science) [7].
Are educators and higher education teachers informed about brain-based learning practices? There is an acute upskilling need in educational neuroscience for professionals at all levels of education, especially higher education lecturers and professors who shape the next generations of teachers and scientists [8]. This need is further amplified by the alarmingly high number of persistent misconceptions (neuromyths) regarding the brain and its role in learning [9]. Teacher professional development is essential for grasping the affordances of new media and for formulating pedagogy-informed teaching and learning practices. For these purposes, online learning is an essential practice for the rapid and wide professional development of teachers and lecturers [10,11]. Research findings from educational neuroscience provide opportunities for transformative learning in distance education settings [12].
Design thinking is a creative, flexible methodology for tackling complex problems. It is based on a human-centered mindset for problem-solving through iterative experimentation [13]. Design thinking features a series of steps that are revisited and repeated as often as necessary. In the context of education, these steps are described as follows: Discovery, Interpretation, Ideation, Experimentation and Evolution [14]. In the first phase, the target audience is specified and data are detected, collected and analyzed around a challenge. Then, data are interpreted in the quest for sense-making and meaning around a set of commonly identified themes. In the ideation step, ideas are generated, prioritized, assessed and refined. Then, during experimentation, these ideas are implemented into prototypes. Finally, in the evolution phase, feedback on the prototypes is collected to track lessons learned and to address remaining shortcomings and open improvement issues. Design thinking techniques can be integrated into the user-centered design of electronic systems and artifacts to increase users’ creativity and innovation capacity [15].
In this study, we investigated the attitudes of academics towards a specialized course centered on Neuropedagogy utilizing a mixed methods approach. The primary objective was to offer a comprehensive understanding of the discipline, while focusing on equipping participants with the necessary skills to integrate this knowledge into their instructional practices.

2. Materials

2.1. Platform Design Process

This study took place in the framework of the Neuropedagogy Erasmus+ project. Its prime aim was to train higher education teaching staff in innovative methods based on neuroscience through an online platform, in which a community of higher education lecturers was created to facilitate the development of transversal communication competences [16]. The platform was conceived to apply neuroscience to teaching and to offer professional development opportunities to interested educators so they could expand their knowledge and skills in this scientific domain. Experiences and lessons learned from previous professional development were incorporated to optimize the learning design regarding pedagogy, technology and aesthetics [11].
The design thinking methodology was applied throughout all phases to ensure an empathetic implementation approach, as illustrated in Figure 1. First, extensive data collection from targeted users took place. Quantitative and qualitative user data from academic faculty members revealed a strong interest and knowledge gaps in the subject matter [17]. Then, a literature review described previous works and principles of neuroeducation [18]. Selected case studies, narratives, projects, toolkits and resources were compiled in a good practice guide. Then, elaborate ideation took place with the objective of outlining the platform’s structure. First, a desk study was conducted to identify the best possible software systems. A blended online brainstorming/ideation session followed to define the desirable features and structure of the platform. The ideation was based on objective data from the industry and literature, subjective data from project partners’ experiences and, finally, their preferences and needs regarding the project’s educational platform.
The designed educational online platform consisted of two hybrid subsystems. The first, a learning management system (LMS), hosted all developed training content and was aimed at training higher education lecturers in the neuroscience-based didactic method. The second was a collaborative environment for an online community of practice that allows communication, sharing and mutual peer learning among academic faculty members. Both parts can be utilized to facilitate formal and informal learning experiences.
In the first phase of the ideation, desk research was conducted to compare software systems that were suitable for this project’s platform implementation. After this process, two main candidate systems emerged: WordPress and Moodle. Both systems are open source, can be installed for free and are world-leading in their categories. WordPress is a popular, versatile open-source content management system with a wide user base, many installations and a large variety of plugins and extensions. Moreover, WordPress has greater potential for social online learning [19]. Moodle is a solid open-source solution for e-learning, with millions of installations in educational institutions worldwide [20]. WordPress could thus provide the basis for the community part, while Moodle could host the online content and courses. However, the interplay between them was not seamless. Ensuring a single sign-on system, where users would sign up once and gain access to both subsystems (e-learning courses and the community of practice platform), emerged as a top priority. This was a problem that was solved later on.
In the second phase, the most important and desired features for the educational platform and community were discussed. System requirements were collected in a blended, two-stage structured brainstorming process. In the first stage, a collaborative sheet was provided online in Google Drive to record the most important/desired features for the educational platform and community. The sheet had the following fields/columns: Importance (Critical, Essential, Good to have, Avoid), Component (E-learning platform or Community—web 2.0), Category (Aesthetics, Communication, Content, Functionality), Title—Description, Usefulness for the project, Source, Screenshot or link, Connection to Neuroscience (theory/method/technique) and Reference/s. Each partner had its own separate sheet but had open access and could read the other partners’ entries. Sources of evidence for suggestions could be an already existing platform, the instructional design for the e-learning module under development or the application of specific Neuropedagogy principles. A sketch of this record structure is given below.
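Purely as an illustration, one row of the collaborative requirements sheet could be modeled as a typed record. The enumeration values mirror the fields listed above; the class and attribute names are hypothetical and are not part of the project’s artifacts.

```python
# A minimal sketch of one entry in the partners' feature-suggestion sheet.
# Enum values mirror the fields described in the text; all identifiers are illustrative.
from dataclasses import dataclass
from enum import Enum

class Importance(Enum):
    CRITICAL = "Critical"
    ESSENTIAL = "Essential"
    GOOD_TO_HAVE = "Good to have"
    AVOID = "Avoid"

class Component(Enum):
    ELEARNING = "E-learning platform"
    COMMUNITY = "Community (web 2.0)"

class Category(Enum):
    AESTHETICS = "Aesthetics"
    COMMUNICATION = "Communication"
    CONTENT = "Content"
    FUNCTIONALITY = "Functionality"

@dataclass
class FeatureSuggestion:
    importance: Importance
    component: Component
    category: Category
    title_description: str
    usefulness_for_project: str
    source: str                    # e.g., an existing platform or a module's instructional design
    screenshot_or_link: str
    neuroscience_connection: str   # theory/method/technique
    references: str
```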
The second stage took place during an online workshop using a collaborative tool (Miro) to discuss and synthesize everyone’s suggestions. First, partners briefly presented their proposals to ensure that each suggestion or concept was clear to all without misconceptions. Second, all suggestions were inserted as virtual post-it notes into the two-dimensional space and organized for (i) the e-learning (LMS) and (ii) the community subsystems across four main categories: Communication, Content, Functionality and Aesthetics (Figure 2). Then, all items in each category for both systems were summarized and discussed one by one to see if there were any disagreements or diverging opinions. Through this process, a consensus was reached, which is presented in the next section. Finally, additional notes were created whenever prerequisite issues and open questions were identified. These open questions allowed the discussion to reach deeper levels and connect with the project’s rationale in more fundamental ways. However, the problem of the seamless interplay between WordPress and Moodle had not yet been solved.
One open issue that ultimately pointed to a solution was the ambition to have systems informed by Neuropedagogy that could act as good practice for practitioners and differentiate the project from contemporary practices. This particular aspect led us to widen the search and consider other systems. This process, which drew on the results of the desk research on LMSs along with the project requirements and partners’ experiences, led to LearnDash [21]. Finally, after a thorough comparison between the two LMSs, Moodle and LearnDash, and based on the suggested features and essential functions of the LMS, the decision was taken to adopt LearnDash over Moodle, as it was considered more appropriate for this particular case. Its main advantage was its seamless integration into WordPress as a plugin, ensuring full interoperability as well as improved organization and aesthetics through smaller “chunks” of content.

2.2. Neuropedagogy Course Outline

Administered via an online platform, the course was structured into six modules, each delving into distinct aspects of the subject under investigation as follows:
  • Introduction to Neuropedagogy: this inaugural module contains five sections:
    • The definition of neuropedagogy and how it was born.
    • What does neuropedagogy bring to teachers?
    • Anatomical organization of the brain.
    • How does the brain learn?
    • Twelve general principles for classroom application.
  • Neuromyths in Education: this module explores the problematic proliferation of inaccurate beliefs regarding the brain and its role in learning. It presents the five most common neuromyths and debunks them with scientific evidence. Its structure is as follows:
    • What is a Neuromyth?
    • Why do Neuromyths persist in schools and colleges?
    • Neuromyth Examples in Higher Education.
    • Other Neuromyths.
    • How to Spot Neuromyths.
It also contains a practical tool for learners to self-assess potential biases towards neuromyths.
  • Engagement: the neural mechanisms underlying engagement are explored along with recommended strategies including cooperation and gamification. Specifically, the following topics are reviewed:
    • Introduction to educational neuroscience regarding engagement.
    • Approach responses and avoidance responses.
    • Strategies that trigger the reward system and engagement.
    • Classroom actions.
  • Concentration and Attention: this module follows the same structure, presenting the definition of attention, the neural processes behind it and indicative application methods for classroom instruction.
    • Introduction to educational neuroscience regarding concentration and attention.
    • Attention… what’s in a name?
    • How does attention work in the brain?
    • Alerting network.
    • Orienting network.
    • Executive attention network.
    • Classroom actions.
  • Emotions: this is the most extensive module, starting with a definition and classification of emotions. It explores the emotional and social aspects of the brain and contains several ideas to improve emotional responses in education. Its structure is the following:
    • Where do emotions come from? The emotional brain.
      1.1 The system responsible for the generation of emotions.
      1.2 The emotional brain: emotion creation, processing and transmission.
      1.3 Neurotransmitters.
    • Classification of emotions.
      2.1 Basic Emotions.
      2.2 Secondary Emotions.
      2.3 Positive and Negative Emotions.
    • Emotions and the social brain.
      3.1 Levels of “social”.
      3.2 Social Neural Networks.
    • Suggestions and ideas on how to address emotions and emotional responses in the classroom.
  • Associative Memory: the last module explores the function of memory in learning and the different associative memory types, such as semantic and episodic memory, featuring the following sections:
    • Definition of Memory.
    • Types of Memory.
    • What is Associative Memory?
    • Types of Associative Memory.
    • Impact of Associative Memory.
    • Pitfalls of Associative Memory.
These modules (as well as the platform interface) were fully translated into five languages (Dutch, English, Greek, Polish, Spanish) and available for flexible self-directed study. They encompassed a diverse range of learning and evaluation activities, including diagnostic questionnaires, associative exercises, self-assessment quizzes, explanatory schemata and reflective questions.
User navigation within each module is free and without any restrictions; hence, there are no forced linear learning paths. Adhering to the principles of adult learning, learners are entrusted to exercise their agency to navigate back and forth to consolidate their learning progress and in-depth understanding, engaging with activities of their choice [22]. This accommodates the needs of some advanced users for quick access to the learning content, allowing them to skip the self-evaluative and reflective activities. Screenshots from various facets of the Neuropedagogy platform and course are available in Appendix A.

3. Method

Upon completion of the course, participants were requested to complete a psychometric survey (Table 1), comprising 18 items, which was aimed at evaluating their learning experience perceptions. Our hypothesis was that overall satisfaction would be above average. These items encompassed various dimensions including: (i) overall assessment of the course, (ii) relevance, currency and efficiency of the educational materials, (iii) suitability of the learning activities, (iv) perceived value of individual modules and the identification of modules requiring revision, (v) relevance of the assessment tasks to the course objectives and, finally, (vi) perceived usefulness of the platform in view of technical challenges encountered during the course delivery.
For the analysis of the primary data, we followed the guidelines provided by Pallant [23]. To establish the reliability (internal consistency) of the survey items, Cronbach’s alpha was calculated. Subsequently, the Shapiro-Wilk test was employed to examine normality. Due to the violation of normal distribution and in view of the small sample size, non-parametric tests were selected [24].
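As an illustrative aside, the reliability and normality checks described above could be reproduced in Python as sketched below. The study itself followed Pallant’s SPSS-oriented guidelines [23], so this is only an equivalent sketch; the file name and the column labels (q1 to q4) are hypothetical placeholders for the four Likert items.

```python
# Illustrative reliability (Cronbach's alpha) and normality (Shapiro-Wilk) checks.
# File and column names are hypothetical; the study used SPSS-based procedures.
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

responses = pd.read_csv("survey.csv")          # hypothetical data export
likert = responses[["q1", "q2", "q3", "q4"]]   # the four Likert-scale items

print(f"Cronbach's alpha: {cronbach_alpha(likert):.2f}")

# Shapiro-Wilk per item; p < 0.05 suggests non-normality,
# motivating the switch to non-parametric tests.
for col in likert:
    w, p = stats.shapiro(likert[col])
    print(f"{col}: W = {w:.3f}, p = {p:.3f}")
```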
Pertaining to the statistical analysis, descriptive statistics offered an initial overview of the dataset, facilitating the identification of patterns and trends. This analysis provided insights into the central tendencies, dispersion and distribution of the data. The frequencies of responses were calculated for the multiple selection items. Accordingly, multiple Spearman’s rank correlation analyses were conducted to determine the strength and the direction of the relationships between specific variables [25]. To further validate these findings, a Generalized Additive Model was created using the overall evaluation of the course as the dependent variable. This allowed us to identify potential predictors of the factors that influenced participants’ ratings [26].
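A minimal sketch of the descriptive step follows, under the same assumptions (hypothetical file and column names); treating the multiple-selection responses as comma-separated strings is likewise an assumption about the export format, not a documented property of the instrument.

```python
# Descriptive statistics for the Likert items and frequencies for a
# multiple-selection item; all names are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("survey.csv")  # hypothetical data export

likert = responses[["q1", "q2", "q3", "q4"]]
summary = likert.agg(["mean", "median", "std", "var", "kurt", "skew"]).T
summary["mode"] = likert.mode().iloc[0]  # first mode per item
print(summary.round(2))

# Frequencies for a multi-select item stored as "Emotions, Neuromyths, ..."
module_choices = responses["q5"].str.split(",").explode().str.strip()
print(module_choices.value_counts(normalize=True).mul(100).round(2))
```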
Finally, in order to process the feedback received in the open-ended questions, we adopted the principles of Grounded Theory [27,28]. This approach is particularly well-suited to studies that aim to understand phenomena for which there is not yet a strong pre-existing theoretical base, as is the case in the present study. By starting from the data and working upwards, this inductive methodology allows new theories and ideas to emerge directly from the respondents’ words and experiences, rather than being constrained or predetermined by existing theories or hypotheses. In addition, it allows for flexibility in data analysis, especially when the researchers expect a diverse range of responses. This iterative method allows for the constant comparison, coding and categorizing of data, which is advantageous when dealing with complex and multifaceted perspectives. Moreover, Grounded Theory is highly effective in detecting emerging patterns that might be overlooked with other methods. Our goal was to identify shared experiences and opinions across respondents. Finally, it is widely used and accepted in qualitative research, ensuring the scientific rigor and validity of our study.

4. Results

In total, thirty-two (N = 32) participants, higher education faculty members, took part in the study and anonymously evaluated the platform and its contents between February and April 2023. The decision to keep the survey anonymous was taken to facilitate the objective and impartial validation of the course contents and the platform’s functionality. The participants were primarily university teaching staff and researchers who had expressed their interest in the project’s topic and provided information about their prior knowledge and training needs [17].
The Cronbach’s alpha coefficient for our Likert Scale items is 0.86, a value indicative of good internal consistency [25]. This value results from an analysis that groups the four items assuming a single underlying construct. Although such a blanket approach might seem oversimplified, it is justified by the common thread that runs through all four items that examine participants’ experiences and perceptions related to the course and its materials. This unifying aspect positions them as markers of a general course ‘evaluation’ or ‘satisfaction’ factor. Nevertheless, while the items may revolve around a central theme, each item has a distinct focus and probes into a unique aspect of the participants’ experiences and perceptions, thus adding depth and complexity to the overall ‘satisfaction’ or ‘course evaluation’ construct. Therefore, although our Cronbach’s alpha value does reflect a certain degree of consistency in responses, its interpretation should incorporate an understanding that it reflects a holistic form of satisfaction emerging from the various facets of participants’ experiences with the course and its materials.
In the remainder of this section, we delve deeper into the key findings that emerged from the statistical analyses and discuss their implications for instructional design and Neuropedagogy.

4.1. Descriptive Statistics

The descriptive statistics of the Likert scale items can be found in Table 2, while the frequency of the multi-choice responses is illustrated in Table 3. The positive ratings across the various aspects of the course, such as its overall quality (M = 4.56, SD = 0.6), the relevance of the learning resources (M = 4.53, SD = 0.55), the efficiency of the educational materials (M = 4.5, SD = 0.7) and the perceived usefulness of the platform (M = 4.46, SD = 0.74), indicate that participants found the course to be effective in achieving its intended learning objectives. Participants also unanimously agreed on the relevance of the assessment questions for each module (Q8.1–Q8.6) and the appropriateness of the learning activities (Q9), with mean, median and mode values consistently at 1. This consensus highlights the importance of aligning the learning activities with the assessment tasks and the course objectives. On the negative side, a small portion of participants encountered technical difficulties during the training program (M = 0.25, SD = 0.43). While this number may be relatively low, addressing this issue is crucial to further improve the user experience and ensure that the stakeholders can access and benefit from the course without hindrances.
Concerning the multiple selection items, the analysis revealed that participants found the modules “Emotions” (56.25%), “Concentration and Attention” (46.85%) and “Associative Memory” (40.6%) to be the most useful, likely due to their direct impact on teaching practices. The popularity of these modules highlights the importance of addressing real-world classroom challenges and providing educators with tools and techniques to increase learner engagement and improve learning outcomes. In contrast, participants perceived the “Introduction to Neuropedagogy” (31.25%) and the “Engagement in the learning process” (31.25%) modules as less appealing, suggesting that they may have found the content too abstract or that they may have been looking for more concrete, practical examples, strategies and actionable insights that could be easily applied in diverse teaching contexts.
Regarding suggestions for improvement, the majority of participants (81.25%) believed that none of the modules required any revisions or improvements. This further consolidates the general satisfaction with the course content and structure, reflecting the effectiveness of the modules in addressing participants’ interests. However, a portion of participants identified some of the modules as requiring revision, with “Introduction to Neuropedagogy” (9.4%), “Emotions” (12.5%) and “Neuromyths” (9.4%) at the top of the list. Interestingly, despite “Emotions” being the most highly valued module, it still received feedback for improvement. This suggests that participants are highly interested in the topic and seek further development or expansion in this area to better support their teaching.

4.2. Correlations

Spearman’s rank correlations in Table 4 revealed a strong positive correlation between participants’ general evaluation of the course and their perception of the relevance and recency of the learning materials (ρ = 0.86, p < 0.01). Our study’s findings align with previous research emphasizing the significance of offering current and pertinent content for successful online learning experiences (for example, [29,30]). Additionally, we noticed a strong positive correlation between the overall course assessment and the platform’s usefulness (ρ = 0.67, p < 0.01). Earlier studies have also shown that factors such as the usability of the platform, its accessibility and the overall design significantly impact learners’ satisfaction and engagement in online courses (for example, [31,32]). Moreover, we found a strong positive correlation between the general evaluation of the course and the effectiveness of the educational materials (ρ = 0.73, p < 0.01). Several studies have also highlighted the critical role of effective instructional design in fostering successful online learning outcomes (e.g., [11,33]).
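For readers wishing to reproduce a correlation matrix like Table 4, a hedged sketch follows; the variable names are placeholders rather than the authors’ actual labels.

```python
# Spearman rank correlations across the Likert-scale variables;
# column names are hypothetical placeholders.
import pandas as pd
from scipy import stats

responses = pd.read_csv("survey.csv")  # hypothetical data export
cols = ["course_eval", "material_relevance",
        "platform_usefulness", "material_effectiveness"]

# Full rho matrix, as in Table 4.
print(responses[cols].corr(method="spearman").round(2))

# scipy additionally reports the p-value for a single pair.
rho, p = stats.spearmanr(responses["course_eval"], responses["material_relevance"])
print(f"rho = {rho:.2f}, p = {p:.4f}")
```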

4.3. Generalized Additive Model

The Generalized Additive Model analysis confirmed the aforementioned findings, indicating that the evaluation of the course is significantly predicted by several factors with the relevance (smooth function = 1.45, p < 0.01) and efficacy (smooth function = 1.12, p < 0.01) of the learning materials being among the strongest predictors. These findings are in agreement with the relevant literature, further confirming the importance of ensuring that the course content is up to date [34,35]. Although to a lesser extent, the efficiency of the learning activities (smooth function = 0.89, p < 0.01) and the relevance of the assessment tasks (smooth function = 0.67, p < 0.05) also influenced participants’ attitudes toward the course. This outcome is also in alignment with recent research, suggesting that the learning activities should be designed to enhance learners’ understanding of the course material and the assessment tasks should be closely aligned with the educational content [36,37]. Finally, the usability of the platform utilized to deliver the educational activities and respective content does not seem to have influenced participants’ ratings as much (smooth function = 0.28, p < 0.01), possibly due to the sample consisting of academics who may be more adept at navigating various platforms. However, providing learners with a user-friendly platform remains an important aspect that should not be overlooked as it can contribute to a positive learning experience [38].
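In rough outline, the GAM step could look as follows in Python. The use of the pygam library is an assumption, as the paper does not name the GAM software, and the predictor names are the same hypothetical placeholders used above.

```python
# A sketch of a Generalized Additive Model with one smooth term per predictor;
# pygam is an assumed tool, not the authors' documented choice.
import pandas as pd
from pygam import LinearGAM, s

responses = pd.read_csv("survey.csv")  # hypothetical data export
X = responses[["material_relevance", "material_effectiveness",
               "activity_efficiency", "assessment_relevance",
               "platform_usefulness"]].to_numpy()
y = responses["course_eval"].to_numpy()

gam = LinearGAM(s(0) + s(1) + s(2) + s(3) + s(4)).fit(X, y)
gam.summary()  # reports effective degrees of freedom and p-values per smooth term
```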

4.4. Open-Ended Questions

Participants’ responses to the open-ended questions complemented the quantitative data and offered additional insights. Academics found each module to be useful (S5.1) in its own way, with many of them noting multiple modules. The reasons cited for usefulness varied, but were mostly related to the content with comments such as “the content is informative”, “the material is packed with useful and essential guidelines” and “the course is well-structured and comprehensive”. Some participants highlighted the practical and applied elements of the materials, stating that “the material was well organized” and that the content was “very timely and closely linked to real problems”. They also appreciated the research-based way neuromyths were deconstructed, with one participant commenting on the “great job debunking myths with solid evidence”. The module concerning “Emotions” seemed to be an area of interest for some participants who noted the under-researched and underestimated nature of emotions in the learning process. This was further supported by additional feedback emphasizing the importance of engagement and positive attitudes in the learning process, as well as the need for attention to be paid to students’ emotions and their connection to the acquisition of knowledge and skills. Other participants praised the course for its clear concepts that could be applied in everyday situations, its ability to broaden their knowledge of associative memory and the interesting content that contains very useful resources and literature. Finally, one participant even described a module as “the most interesting” they had encountered. Overall, the added feedback underscores the value of the course in addressing prevalent misconceptions and providing a comprehensive understanding of crucial aspects of the learning process.
While most participants did not offer suggestions for enhancing or modifying the modules (S6.1), a few provided valuable feedback on areas that could be improved. Specifically, one participant suggested that the modules dealing with “Emotions” and “Neuromyths” might benefit from more detailed explanations and increased clarity, while another mentioned that certain topics “could be covered at greater depth”. Participants also raised concerns regarding the linguistic aspects of the modules, as well as the presentation and formatting style. Such remarks underline the need to find a balance between content richness and accessibility to satisfy individual preferences. Lastly, a few participants underlined the importance of ongoing development and adaptation.
The feedback received with regard to the training material (S7) was diverse. Some participants felt that they lacked the necessary experience to identify areas for improvement, while others had no comments or suggestions. Nevertheless, a few participants proposed incorporating more illustrations, images and videos to enrich the learning experience and make the content more dynamic. Additionally, one participant pointed out that certain visuals needed higher resolution, as they were either small or blurry. In terms of textual content, one respondent recommended that the amount of text should be reduced. Other participants observed that the assessment questions were more challenging than the training material itself, necessitating a deeper understanding of the content to answer them. Relatedly, one participant noted that, in terms of assessments, the platform did not accept some correct answers as valid due to language mixing or slight differences in phrasing. They suggested that open-ended tests should not be machine-graded to avoid such issues. In the same vein, one participant advised awarding partial points for correct answers in quizzes with multiple correct answers, arguing that marking the entire answer as incorrect, when only one part was wrong, was neither helpful nor fair. To further support learning, one participant suggested that qualitative feedback should be provided alongside the correct answer. Finally, some respondents recommended including more examples of good practices, especially when addressing problematic or difficult students.
The last open-ended question (S11) sought to gather supplementary feedback from the participants, encompassing general remarks and suggestions. The majority of the responses painted a favorable picture of the educational program, featuring accolades such as “congratulations on the developed course!” and expressing keen interest. Intriguingly, one participant regarded the Neuropedagogy training program as both captivating and advantageous, further disclosing their plan to adapt their teaching approach in accordance with the program’s principles. On the other hand, some feedback identified areas in need of enhancement. For instance, one participant experienced technical difficulties during the course (e.g., challenges in navigating back to the homepage and sluggish page loading times), while other respondents encountered issues with the registration process, noting that the registration button was not working properly and that the platform was not very intuitive. Another area of concern was the translation quality of the learning material (from English into the participants’ native languages). These observations underscore the significance of addressing the technical components and translation quality of educational programs to guarantee their seamless operation, efficacy and overall user experience.

5. Discussion

In the context of this study, we sought to investigate the perspectives of Higher Education instructors on a Neuropedagogy course to identify its strong points and potential areas for enhancement. This endeavor is of high importance to increase and improve open educational resources and courses and to build evidence-based bridges between neuroscience research and teaching practice in physical, blended and digital online settings [5]. Our research not only contributes to the broader literature by showcasing methods to enrich teaching and learning practices [39], but also supports the growing evidence highlighting the significance of educators’ professional development [40,41]. The insights obtained from this study can inform course and instructional designers in designing and developing engaging and effective online learning experiences. Concretely, design thinking allowed researchers to initiate a value- and learner-centric process to accommodate the requirements of all involved stakeholders.
Our findings confirmed our hypothesis, revealing a predominantly favorable attitude toward the course, characterized by high ratings in the relevance and efficacy of the educational resources. These insights can be harnessed to fine-tune the design and delivery of future online courses by: (a) prioritizing the provision of timely and pertinent materials [42], (b) ensuring that the assessment tasks align well with the content of the course modules [43] and (c) placing a strong focus on developing a user-friendly and technically sound platform [44,45]. Participants also unanimously commended the design of the learning activities and the congruence of the assessment tasks with the course’s intended learning outcomes [46,47]. All of these factors are crucial in delivering effective and engaging learning experiences [48,49].
Despite the overall notable satisfaction level with regard to the quality of the course, it is important to consider the following limitations when interpreting these findings. One limitation is the exclusive reliance on self-reported data from a psychometric instrument, which may be subject to social desirability bias and may not fully capture participants’ actual experiences [50]. A more comprehensive assessment of the course’s impact on educators’ professional development could be obtained by incorporating alternative methods, such as longitudinal studies evaluating the application of Neuropedagogy principles in participants’ teaching practices. To obtain a more in-depth understanding of participants’ experiences and identify potential areas for improvement, future research should incorporate additional qualitative data collection sources, such as interviews or focus groups.
Furthermore, we did not collect background information on participants. Future work should investigate the influence of participant characteristics, including gender, age, prior experience with online courses, academic discipline, teaching experience and so on. Understanding these relationships could help instructional designers and course developers to better target both the course content and the delivery methods.
Another significant limitation is the sample size, which restricts the generalizability of the findings to a broader population of academics. Small sample sizes may lead to biased estimates or reduced statistical power, thus making it challenging to detect true effects or relationships. To address this limitation, future research should increase the sample size by recruiting a more diverse and larger pool of participants from various academic disciplines, institutions and backgrounds. This approach would enhance the generalizability of the findings and provide more accurate insights into the participants’ experiences and perceptions.
Concerning the evaluation of the particular topics, we identified specific modules that call for further enhancement and consideration. The continuous evaluation and revision of courses are crucial to maintaining their ongoing relevance and efficacy [51]. Moreover, the technical obstacles reported by participants resonate with the prevalent issues often faced in online education, thus meriting careful attention [52].
In addition, the findings highlight the potential advantages of integrating diverse instructional strategies and multimedia resources into the course design [53]. Such an approach can accommodate varied learning preferences and foster cognitive flexibility, thereby creating a more engaging and inclusive learning environment that caters to the diverse needs of learners [37]. Lastly, our research underscores the significance of providing personalized feedback and guidance to learners as a means to help them better comprehend their strengths and weaknesses [54].

6. Conclusions

In light of the study’s findings, we present the following implications for stakeholders involved in designing and delivering online professional development courses. Instructional designers and course content creators specializing in professional development initiatives should pay special attention to the structure of the learning activities, ensuring that the learning material is relevant and current and that the assessments are authentic [55,56]. In addition, the diverse learning preferences of the audience should be considered to increase the incentives for engagement [57]. Software developers need to invest in robust technical infrastructure and support systems in order to ensure that the final product will deliver a smooth and seamless learning experience [36,58].
Higher Education institutions and policymakers need to recognize the potential of online courses to foster professional growth among teachers [59], as well as their role in the overall teaching optimization process. As academics are subject experts in their fields, recommendations for change in their established teaching practices could be met with skepticism, even resistance. The sustainable transformation of teaching practices requires repeated reinforcement of new knowledge, the freedom to make responsible decisions and the verification of valid interpretations through mentoring and trusted peer interactions.
To this end, senior managerial staff are urged to cultivate a culture of ongoing quality-focused professional development by providing access to online courses and encouraging their faculty to engage in lifelong learning [60]. Embracing this approach will not only enhance the quality of teaching, but will also contribute to the overall success of the educational institution [61].

Author Contributions

Conceptualization, S.M.; methodology, S.M. and M.F.; software, K.D.; validation, S.M. and A.C.; formal analysis, A.C.; investigation, A.C.; resources, K.D.; data curation, S.M.; writing—original draft preparation, S.M. and A.C.; writing—review and editing, A.C. and S.M.; visualization, S.M. and K.D.; supervision, M.F.; project administration, S.M.; funding acquisition, M.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Erasmus+ program Neuropedagogy of the European Union, grant number 2020-1-PL01-KA203-081740.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of the University of Patras (protocol code 6823/9 October 2020).

Data Availability Statement

The data that support the findings of this study are available on request from the corresponding author, S.M.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Snapshots from the Neuropedagogy e-learning platform and online course on educational neuroscience.
Figure A1. Homepage with embedded Twitter feed.
Figure A2. Neuropedagogy course grid featuring six (6) modules.
Figure A3. Focus mode reducing cognitive load and improving learner attention.
Figure A4. Embedded YouTube video (module 6—Emotions).
Figure A5. Quiz-matching activity.
Figure A6. Explanatory schema (module 4—concentration and attention).
Figure A7. User profile in the Neuropedagogy community of practice.

References

1. Youn, T.I.K.; Price, T.M. Learning from the Experience of Others: The Evolution of Faculty Tenure and Promotion Rules in Comprehensive Institutions. J. High. Educ. 2009, 80, 204–237.
2. Malcolm, M. A Critical Evaluation of Recent Progress in Understanding the Role of the Research-Teaching Link in Higher Education. High. Educ. 2014, 67, 289–301.
3. Entwistle, N.J.; Peterson, E.R. Conceptions of Learning and Knowledge in Higher Education: Relationships with Study Behaviour and Influences of Learning Environments. Int. J. Educ. Res. 2004, 41, 407–428.
4. Ansari, D.; de Smedt, B.; Grabner, R.H. Neuroeducation—A Critical Overview of An Emerging Field. Neuroethics 2012, 5, 105–117.
5. Cui, Y.; Zhang, H. Educational Neuroscience Training for Teachers’ Technological Pedagogical Content Knowledge Construction. Front. Psychol. 2021, 12, 792723.
6. Koehler, M.J.; Mishra, P. What Is Technological Pedagogical Content Knowledge (TPACK)? Contemp. Issues Technol. Teach. Educ. 2009, 9, 60–70.
7. Tokuhama-Espinosa, T. Bringing the Neuroscience of Learning to Online Teaching: An Educator’s Handbook; Teachers College Press: New York, NY, USA, 2021; ISBN 9780807765531.
8. Wilcox, G.; Morett, L.M.; Hawes, Z.; Dommett, E.J. Why Educational Neuroscience Needs Educational and School Psychology to Effectively Translate Neuroscience to Educational Practice. Front. Psychol. 2021, 11, 618449.
9. Grospietsch, F.; Lins, I. Review on the Prevalence and Persistence of Neuromyths in Education—Where We Stand and What Is Still Needed. Front. Educ. 2021, 6, 665752.
10. Bragg, L.A.; Walsh, C.; Heyeres, M. Successful Design and Delivery of Online Professional Development for Teachers: A Systematic Review of the Literature. Comput. Educ. 2021, 166, 104158.
11. Mystakidis, S.; Fragkaki, M.; Filippousis, G. Ready Teacher One: Virtual and Augmented Reality Online Professional Development for K-12 School Teachers. Computers 2021, 10, 134.
12. Doukakis, S.; Alexopoulos, E.C. Online Learning, Educational Neuroscience and Knowledge Transformation Opportunities for Secondary Education Students. J. High. Educ. Theory Pract. 2021, 21, 49–57.
13. Patrício, R.; Moreira, A.C.; Zurlo, F. Enhancing Design Thinking Approaches to Innovation through Gamification. Eur. J. Innov. Manag. 2021, 24, 1569–1594.
14. Gachago, D.; Morkel, J.; Hitge, L.; van Zyl, I.; Ivala, E. Developing ELearning Champions: A Design Thinking Approach. Int. J. Educ. Technol. High. Educ. 2017, 14, 30.
15. Gonzalez, C.S.G.; Gonzalez, E.G.; Cruz, V.M.; Saavedra, J.S. Integrating the Design Thinking into the UCD’s Methodology. In Proceedings of the IEEE EDUCON 2010 Conference, Madrid, Spain, 14–16 April 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 1477–1480.
16. Dimitropoulos, K.; Mystakidis, S.; Fragkaki, M. Bringing Educational Neuroscience to Distance Learning with Design Thinking: The Design and Development of a Hybrid E-Learning Platform for Skillful Training. In Proceedings of the 2022 7th South-East Europe Design Automation, Computer Engineering, Computer Networks and Social Media Conference (SEEDA-CECNSM), Ioannina, Greece, 23–25 September 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–6.
17. Fragkaki, M.; Mystakidis, S.; Dimitropoulos, K. Higher Education Faculty Perceptions and Needs on Neuroeducation in Teaching and Learning. Educ. Sci. 2022, 12, 707.
18. Fragkaki, M.; Mystakidis, S.; Dimitropoulos, K. Higher Education Teaching Transformation with Educational Neuroscience Practices. In Proceedings of the 15th Annual International Conference of Education, Research and Innovation, Seville, Spain, 7–9 November 2022; pp. 579–584.
19. Krouska, A.; Troussas, C.; Virvou, M. Comparing LMS and CMS Platforms Supporting Social E-Learning in Higher Education. In Proceedings of the 2017 8th International Conference on Information, Intelligence, Systems & Applications (IISA), Larnaca, Cyprus, 27–30 August 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–6.
20. Al-Ajlan, A.; Zedan, H. Why Moodle. In Proceedings of the 2008 12th IEEE International Workshop on Future Trends of Distributed Computing Systems, Kunming, China, 21–23 October 2008; pp. 58–64.
21. Natriello, G.; Chae, H.S. Taking Project-Based Learning Online. In Innovations in Learning and Technology for the Workplace and Higher Education; Guralnick, D., Auer, M.E., Poce, A., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 224–236.
22. Mystakidis, S.; Lympouridis, V. Immersive Learning. Encyclopedia 2023, 3, 396–405.
23. Pallant, J. SPSS Survival Manual: A Step by Step Guide to Data Analysis Using IBM SPSS; Allen & Unwin: Crows Nest, Australia, 2020; ISBN 9781000252521.
24. Field, A. Discovering Statistics Using IBM SPSS Statistics; SAGE Publications: Thousand Oaks, CA, USA, 2018; ISBN 9781526419521.
25. Hauke, J.; Kossowski, T. Comparison of Values of Pearson’s and Spearman’s Correlation Coefficients on the Same Sets of Data. Quaest. Geogr. 2011, 30, 87–93.
26. Hastie, T.; Tibshirani, R.; Friedman, J.H. The Elements of Statistical Learning: Data Mining, Inference, and Prediction; Springer Series in Statistics; Springer: Berlin/Heidelberg, Germany, 2009; ISBN 9780387848846.
27. Walker, D.; Myrick, F. Grounded Theory: An Exploration of Process and Procedure. Qual. Health Res. 2006, 16, 547–559.
28. Strauss, A.; Corbin, J.M. Grounded Theory in Practice; Sage: Newbury Park, CA, USA, 1997.
29. Moore, M.G.; Kearsley, G. Distance Education: A Systems View of Online Learning; Cengage Learning: Boston, MA, USA, 2011; ISBN 9781133715450.
30. Keshavarz, M.; Ghoneim, A. Preparing Educators to Teach in a Digital Age. Int. Rev. Res. Open Distrib. Learn. 2021, 22, 221–242.
31. Al-Azawei, A.; Parslow, P.; Lundqvist, K. Investigating the Effect of Learning Styles in a Blended E-Learning System: An Extension of the Technology Acceptance Model (TAM). Australas. J. Educ. Technol. 2017, 33, 1–23.
32. Sun, P.-C.; Tsai, R.J.; Finger, G.; Chen, Y.-Y.; Yeh, D. What Drives a Successful E-Learning? An Empirical Investigation of the Critical Factors Influencing Learner Satisfaction. Comput. Educ. 2008, 50, 1183–1202.
33. Rienties, B.; Brouwer, N.; Lygo-Baker, S. The Effects of Online Professional Development on Higher Education Teachers’ Beliefs and Intentions towards Learning Facilitation and Technology. Teach. Teach. Educ. 2013, 29, 122–131.
34. Khalil, R.; Mansour, A.E.; Fadda, W.A.; Almisnid, K.; Aldamegh, M.; Al-Nafeesah, A.; Alkhalifah, A.; Al-Wutayd, O. The Sudden Transition to Synchronized Online Learning during the COVID-19 Pandemic in Saudi Arabia: A Qualitative Study Exploring Medical Students’ Perspectives. BMC Med. Educ. 2020, 20, 285.
35. Zhou, T.; Huang, S.; Cheng, J.; Xiao, Y. The Distance Teaching Practice of Combined Mode of Massive Open Online Course Micro-Video for Interns in Emergency Department During the COVID-19 Epidemic Period. Telemed. e-Health 2020, 26, 584–588.
36. Preece, J.; Sharp, H.; Rogers, Y. Interaction Design: Beyond Human-Computer Interaction; Wiley: Hoboken, NJ, USA, 2015; ISBN 9781119020752.
37. Yu, Z. A Meta-Analysis and Bibliographic Review of the Effect of Nine Factors on Online Learning Outcomes across the World. Educ. Inf. Technol. 2022, 27, 2457–2482.
38. Rodrigues, H.; Almeida, F.; Figueiredo, V.; Lopes, S.L. Tracking E-Learning through Published Papers: A Systematic Review. Comput. Educ. 2019, 136, 87–98.
39. Darby, F.; Lang, J.M. Small Teaching Online: Applying Learning Science in Online Classes; Wiley: Hoboken, NJ, USA, 2019; ISBN 9781119619093.
40. Christopoulos, A.; Sprangers, P. Integration of Educational Technology during the Covid-19 Pandemic: An Analysis of Teacher and Student Receptions. Cogent Educ. 2021, 8, 1964690.
41. Fernández-Batanero, J.M.; Montenegro-Rueda, M.; Fernández-Cerero, J.; García-Martínez, I. Digital Competences for Teacher Professional Development. Systematic Review. Eur. J. Teach. Educ. 2020, 45, 513–531.
42. Bozkurt, A.; Sharma, R.C. Emergency Remote Teaching in a Time of Global Crisis Due to CoronaVirus Pandemic. Asian J. Distance Educ. 2020, 15, 1–6.
43. Zlatkin-Troitschanskaia, O.; Pant, H.A.; Coates, H. Assessing Student Learning Outcomes in Higher Education: Challenges and International Perspectives. Assess. Eval. High. Educ. 2016, 41, 655–661.
44. Christopoulos, A.; Conrad, M.; Shukla, M. Increasing Student Engagement through Virtual Interactions: How? Virtual Real. 2018, 22, 353–369.
45. König, J.; Jäger-Biela, D.J.; Glutsch, N. Adapting to Online Teaching during COVID-19 School Closure: Teacher Education and Teacher Competence Effects among Early Career Teachers in Germany. Eur. J. Teach. Educ. 2020, 43, 608–622.
46. Vanslambrouck, S.; Zhu, C.; Lombaerts, K.; Philipsen, B.; Tondeur, J. Students’ Motivation and Subjective Task Value of Participating in Online and Blended Learning Environments. Internet High. Educ. 2018, 36, 33–40.
47. Nancekivell, S.E.; Shah, P.; Gelman, S.A. Maybe They’re Born with It, or Maybe It’s Experience: Toward a Deeper Understanding of the Learning Style Myth. J. Educ. Psychol. 2020, 112, 221–235.
48. Darling-Hammond, L.; Flook, L.; Cook-Harvey, C.; Barron, B.; Osher, D. Implications for Educational Practice of the Science of Learning and Development. Appl. Dev. Sci. 2020, 24, 97–140.
49. Doo, M.Y.; Bonk, C.; Heo, H. A Meta-Analysis of Scaffolding Effects in Online Learning in Higher Education. Int. Rev. Res. Open Distrib. Learn. 2020, 21, 60–80.
50. Podsakoff, P.M.; MacKenzie, S.B.; Lee, J.-Y.; Podsakoff, N.P. Common Method Biases in Behavioral Research: A Critical Review of the Literature and Recommended Remedies. J. Appl. Psychol. 2003, 88, 879–903.
51. Sadaf, A.; Wu, T.; Martin, F. Cognitive Presence in Online Learning: A Systematic Review of Empirical Research from 2000 to 2019. Comput. Educ. Open 2021, 2, 100050.
52. Li, X.; Odhiambo, F.A.; Ocansey, D.K.W. The Effect of Students’ Online Learning Experience on Their Satisfaction during the COVID-19 Pandemic: The Mediating Role of Preference. Front. Psychol. 2023, 14, 1095073.
53. Clark, R.C.; Mayer, R.E. E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning; Wiley Desktop Editions; Wiley: Hoboken, NJ, USA, 2011; ISBN 9781118047262.
54. Van der Kleij, F.M.; Feskens, R.C.W.; Eggen, T.J.H.M. Effects of Feedback in a Computer-Based Learning Environment on Students’ Learning Outcomes. Rev. Educ. Res. 2015, 85, 475–511.
55. Rao, K. Inclusive Instructional Design: Applying UDL to Online Learning. J. Appl. Instr. Des. 2021, 10, 1–10.
56. Al-Samarraie, H.; Saeed, N. A Systematic Review of Cloud Computing Tools for Collaborative Learning: Opportunities and Challenges to the Blended-Learning Environment. Comput. Educ. 2018, 124, 77–91.
57. Laurillard, D. The Educational Problem That MOOCs Could Solve: Professional Development for Teachers of Disadvantaged Students. Res. Learn. Technol. 2016, 24, 29369.
58. Lidolf, S.; Pasco, D. Educational Technology Professional Development in Higher Education: A Systematic Literature Review of Empirical Research. Front. Educ. 2020, 5, 35.
59. Carrillo, C.; Flores, M.A. COVID-19 and Teacher Education: A Literature Review of Online Teaching and Learning Practices. Eur. J. Teach. Educ. 2020, 43, 466–487.
60. Mystakidis, S.; Berki, E.; Valtanen, J.-P. The Patras Blended Strategy Model for Deep and Meaningful Learning in Quality Life-Long Distance Education. Electron. J. e-Learn. 2019, 17, 66–78.
61. Rahrouh, M.; Taleb, N.; Mohamed, E.A. Evaluating the Usefulness of E-Learning Management System Delivery in Higher Education. Int. J. Econ. Bus. Res. 2018, 16, 162.
Figure 1. Online learning platform design based on five design thinking stages.
Figure 2. Online collaborative ideation environment.
Table 1. The data collection instrument.

Item | Measurement | Coding
Q1: What is your general evaluation of the course/learning material? | Likert scale | 1 = Very poor, 3 = Average, 5 = Very good
Q2: Did you find the learning materials relevant and up-to-date? | Likert scale | 1 = Very irrelevant and out-of-date, 3 = Neither relevant nor irrelevant, somewhat up to date, 5 = Very relevant and up-to-date
Q3: What is your general impression of the platform? | Likert scale | 1 = Very poor, 3 = Average, 5 = Very good
Q4: Were the educational materials sufficient and effective in achieving the learning objectives of the course? | Likert scale | 1 = Not efficient, 3 = Neither efficient nor inefficient, 5 = Very efficient
Q5: Which module(s) did you find particularly useful? | Multiple selection | 1 = Introduction to Neuropedagogy, 2 = Engagement in the learning process, 3 = Neuromyths, 4 = Concentration/Attention, 5 = Emotions, 6 = Associative memory, 7 = None of them
S5.1: Please explain why you found the module(s) you selected in the previous question as useful. | Open-ended | N/A
Q6: Which module(s) do you think need revision or improvement? | Multiple selection | 1 = Introduction to Neuropedagogy, 2 = Engagement in the learning process, 3 = Neuromyths, 4 = Concentration/Attention, 5 = Emotions, 6 = Associative memory, 7 = None of them
S6.1: Please explain why you think this/these specific module(s) should be improved. | Open-ended | N/A
S7: Is there anything in the learning material that you would like to change or improve? Please provide your comments and suggestions. | Open-ended | N/A
Q8.1: Were the assessment questions in the Introduction to Neuropedagogy module based on the respective educational content? | Binary | 0 = No, 1 = Yes
Q8.2: Were the assessment questions in the Neuromyths module based on the respective educational content? | Binary | 0 = No, 1 = Yes
Q8.3: Were the assessment questions in the Concentration and Attention module based on the respective educational content? | Binary | 0 = No, 1 = Yes
Q8.4: Were the assessment questions in the Associative memory module based on the respective educational content? | Binary | 0 = No, 1 = Yes
Q8.5: Were the assessment questions in the Engagement module based on the respective educational content? | Binary | 0 = No, 1 = Yes
Q8.6: Were the assessment questions in the Emotions module based on the respective educational content? | Binary | 0 = No, 1 = Yes
Q9: Were the learning activities appropriate for the content? | Binary | 0 = No, 1 = Yes
Q10: Did you encounter any technical difficulties in your progress through the training programme? | Binary | 0 = No, 1 = Yes
S11: Comments and Recommendations | Open-ended | N/A
Table 2. Descriptive statistics of participant responses for course evaluation and related factors.

Item | M | Med | Mo | SD | VAR | Kurt | Skew
Q1: Evaluation of the course | 4.56 | 5 | 5 | 0.6 | 0.37 | 0.32 | −1.12
Q2: Relevance of the learning materials | 4.53 | 5 | 5 | 0.55 | 0.31 | −0.51 | −0.69
Q3: Perceived usefulness of the platform | 4.46 | 5 | 5 | 0.74 | 0.56 | −0.37 | −1.05
Q4: Efficiency of the educational materials | 4.5 | 5 | 5 | 0.7 | 0.5 | 3.44 | −1.67
Q10: Technical difficulties (platform) | 0.25 | 0 | 0 | 0.43 | 0.18 | −0.57 | 1.21
Table 3. Frequencies of responses in the multiple selection items.

Module | Q5: Perceived Usefulness of Modules (Freq / Percent) | Q6: Modules That Should Be Revised (Freq / Percent)
Introduction to Neuropedagogy | 10 / 31.25% | 3 / 9.4%
Engagement in the learning process | 10 / 31.25% | 1 / 3.1%
Neuromyths | 12 / 37.5% | 3 / 9.4%
Concentration and Attention | 15 / 46.85% | 1 / 3.1%
Emotions | 18 / 56.25% | 4 / 12.5%
Associative memory | 13 / 40.6% | 1 / 3.1%
None of them | 0 / 0% | 26 / 81.25%
Note: Each percentage represents the proportion of participants who chose each module.
Table 4. Spearman’s correlations across the Likert scale variables.

 | Evaluation of the Course | Relevance and Recency of Learning Materials | Usefulness of the Platform | Effectiveness of Educational Materials
Evaluation of the course | 1 | | |
Relevance and recency of learning materials | 0.86 | 1 | |
Usefulness of the platform | 0.73 | 0.67 | 1 |
Effectiveness of educational materials | 0.79 | 0.83 | 0.73 | 1