﻿<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD with MathML3 v1.2 20190208//EN" "http://dtd.nlm.nih.gov/publishing/3.0/journalpublishing3.dtd">
<article
    xmlns:mml="http://www.w3.org/1998/Math/MathML"
    xmlns:xlink="http://www.w3.org/1999/xlink" dtd-version="3.0" xml:lang="en" article-type="article">
  <front>
    <journal-meta>
      <journal-id journal-id-type="publisher-id">OJER</journal-id>
      <journal-title-group>
        <journal-title>Open Journal of Educational Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">2770-5552</issn>
      <issn pub-type="ppub"></issn>
      <publisher>
        <publisher-name>Science Publications</publisher-name>
      </publisher>
    </journal-meta>
    <article-meta>
      <article-id pub-id-type="doi">10.31586/ojer.2022.273</article-id>
      <article-id pub-id-type="publisher-id">OJER-273</article-id>
      <article-categories>
        <subj-group subj-group-type="heading">
          <subject>Article</subject>
        </subj-group>
      </article-categories>
      <title-group>
        <article-title>
          Weekly Quizzes Reinforce Student Learning Outcomes and Performance in Biomedical Sciences in-course Assessments
        </article-title>
      </title-group>
      <contrib-group>
<contrib contrib-type="author">
<name>
<surname>El-Hashash</surname>
<given-names>Ahmed</given-names>
</name>
<xref rid="af1" ref-type="aff">1</xref>
<xref rid="cr1" ref-type="corresp">*</xref>
</contrib>
      </contrib-group>
<aff id="af1"><label>1</label>Zhejiang University-University of Edinburgh Joint College of Biomedicine (ZJU-UoE Institute), Zhejiang University Intl. campus, Haining, Zhejiang 314400, PRC</aff>
<author-notes>
<corresp id="c1">
<label>*</label>Corresponding author at: Zhejiang University-University of Edinburgh Joint College of Biomedicine (ZJU-UoE Institute), Zhejiang University Intl. campus, Haining, Zhejiang 314400, PRC
</corresp>
</author-notes>
      <pub-date pub-type="epub">
        <day>30</day>
        <month>06</month>
        <year>2022</year>
      </pub-date>
      <volume>2</volume>
      <issue>4</issue>
      <history>
        <date date-type="received">
          <day>30</day>
          <month>06</month>
          <year>2022</year>
        </date>
        <date date-type="rev-recd">
          <day>30</day>
          <month>06</month>
          <year>2022</year>
        </date>
        <date date-type="accepted">
          <day>30</day>
          <month>06</month>
          <year>2022</year>
        </date>
        <date date-type="pub">
          <day>30</day>
          <month>06</month>
          <year>2022</year>
        </date>
      </history>
      <permissions>
        <copyright-statement>&#xa9; Copyright 2022 by authors and Trend Research Publishing Inc. </copyright-statement>
        <copyright-year>2022</copyright-year>
        <license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
          <license-p>This work is licensed under the Creative Commons Attribution International License (CC BY). http://creativecommons.org/licenses/by/4.0/</license-p>
        </license>
      </permissions>
      <abstract>
        Studies have highlighted the benefits of frequent quizzing in class. Frequent quizzing can promote student attendance, engagement, practice and review, and achievement. Conversely, opponents of frequent quizzing suggest that too-frequent testing might hinder learning by frustrating anxious students and inhibiting larger units of instructional material. Notably, most studies have used degree examinations to evaluate the impact of quizzes on student learning and performance, yet little is known about whether quizzes can reinforce student performance in in-course assessments (ICAs), despite the importance of ICAs in student learning. The present study aimed to test the hypothesis that the administration of weekly MCQ quizzes can enhance the learning outcomes and performance of biomedical science students in assessment methods, such as essay and oral presentation, that can directly measure and provide information about student learning. It was therefore limited to in-course assessments. We found that the performance of the weekly quiz student group was remarkably better than that of the control student group in both the essay and oral presentation ICAs, which are two measures and indicators of student learning, suggesting improved student learning outcomes and performance after administering weekly MCQ quizzes, which also promoted student attendance in classrooms. The findings of this research study have implications for students, teachers, and curriculum designers in higher education.
      </abstract>
      <kwd-group>
        <kwd>In-course Assessments</kwd>
        <kwd>Quizzes</kwd>
        <kwd>Class Attendance</kwd>
        <kwd>Learning Outcome</kwd>
        <kwd>Biomedical Science</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec1">
<title>Introduction</title><p>There are different approaches and techniques to enhance student learning, performance, and skills. These approaches include asking questions and applying active learning strategies, constructing mechanisms and concept maps, developing critical thinking skills, using self-reflection, asking for feedback, and using a range of learning resources. They also involve practicing learning by using simulation, applying learnt knowledge to new problems and service learning, using summative and formative assessment, applying differentiated instruction, applying the reversed learning model, and restructuring teaching methods (Azer et al., 2013; Philips, 2017; Kampen, 2021)[
<xref ref-type="bibr" rid="R1">1</xref>,<xref ref-type="bibr" rid="R2">2</xref>,<xref ref-type="bibr" rid="R3">3</xref>]. Quizzes have been assumed to be tools that enhance student learning. Brown (2004)[
<xref ref-type="bibr" rid="R4">4</xref>] described them as &#x201c;the information that washes back to students in the form of useful diagnoses of strengths and weaknesses&#x201d; (p. 29).</p>
<p>Studies have highlighted the benefits of frequent quizzing in class. A study among dental science students showed that frequent quizzes are important tools that can significantly influence students&#x2019; learning performance (Geist and Soehren, 1997)[
<xref ref-type="bibr" rid="R5">5</xref>]. Indeed, student learning and performance can be further enhanced by increasing the number of quizzes students receive, suggesting an influential autonomous effect of quizzing on student learning (Geist and Soehren, 1997)[
<xref ref-type="bibr" rid="R5">5</xref>]. In addition, online quizzing is a powerful and preferable tool for both teaching and student learning in biological courses (Culbert, 2020)[
<xref ref-type="bibr" rid="R6">6</xref>].</p>
<p>Frequent quizzing and testing can enhance students&#x2019; class attendance (Jones, 1984; Wilder et al., 2001; Clump et al., 2003)[
<xref ref-type="bibr" rid="R7">7</xref>,<xref ref-type="bibr" rid="R8">8</xref>,<xref ref-type="bibr" rid="R9">9</xref>]. Quizzes can motivate students to attend classes (Zarei, 2008)[
<xref ref-type="bibr" rid="R10">10</xref>]. Interestingly, frequent quizzing can also enhance student retention of the material presented during lectures or prepare them for high-stakes examinations (Johnson and Kiviniemi, 2009)[
<xref ref-type="bibr" rid="R11">11</xref>], and students taking weekly quizzes can perform better in the final achievement tests (Gholami and Moghaddam, 2013)[
<xref ref-type="bibr" rid="R12">12</xref>]. Indeed, a recent study has demonstrated that the use of weekly quizzes can promote students&#x2019; attendance, engagement, and achievement (AlBahadli, 2020)[
<xref ref-type="bibr" rid="R13">13</xref>]. Furthermore, weekly quizzes are an assessment tool that can help students retrieve their knowledge and receive immediate feedback on their performance (Heise et al., 2020)[
<xref ref-type="bibr" rid="R14">14</xref>]. This assessment tool can be further used as a prediction method of student outcomes and an important opportunity for early interventions to support at-risk students since there is a potential correlation between the performance of students on weekly quizzes and on unit examinations (Heise et al., 2020)[
<xref ref-type="bibr" rid="R14">14</xref>].</p>
<p>Similarly, online quizzing can help students maintain a regular reading and study schedule (, 2008)[
<xref ref-type="bibr" rid="R15">15</xref>], while the use of random quizzes led to a 10% increase in student attendance in a psychology course (Wilder, 2001)[
<xref ref-type="bibr" rid="R8">8</xref>]. Online quizzes can also improve students&#x2019; preparation outside of class, reinforce their learning, and enhance class participation (, 2008; et al., 2021)[
<xref ref-type="bibr" rid="R15">15</xref>,<xref ref-type="bibr" rid="R16">16</xref>].</p>
<p>More recent research on student perceptions of online quizzes and interview pretest implementations in pre-practicum activities has shown that online quiz pretests were preferred by students as a pre-practicum activity since they are not time-consuming and help students prepare well before conducting a practicum (Atmajati et al., 2020)[
<xref ref-type="bibr" rid="R17">17</xref>].</p>
<p>Another study has developed evidence-based recommendations that facilitate the programming and implementation of quizzes by lecturers and instructors to promote students&#x2019; success in higher education and improve their behavior within their courses (et al., 2020)[
<xref ref-type="bibr" rid="R18">18</xref>].</p>
<p>Other benefits of quizzes have also been highlighted in other research studies. Quizzes such as pop quizzes have a positive effect not only on student learning, but also on prospective teachers when used as integrating activities in teacher training programs (Gull et al., 2015)[
<xref ref-type="bibr" rid="R19">19</xref>]. In addition, adaptive quizzes may enhance student engagement, motivation, and learning outcomes in a first-year accounting unit (Ross et al., 2018)[
<xref ref-type="bibr" rid="R20">20</xref>]. Moreover, an evaluation study of the impact of &#x201c;feedback quizzes&#x201d; on the learning and performance of undergraduates has found them very useful (Hennig et al., 2019)[
<xref ref-type="bibr" rid="R21">21</xref>]. Similarly, multimodal quizzes are an efficient teaching-learning tool for both assessing and teaching engineering courses (Gamage et al., 2019)[
<xref ref-type="bibr" rid="R22">22</xref>]. Furthermore, administration of summative or formative feedback quizzes (in class or outside class) can increase students&#x2019; learning and performance, as well as satisfaction with some clinical courses (Hennig et al., 2019)[
<xref ref-type="bibr" rid="R21">21</xref>]. Recent studies have also uncovered the effects of quizzes on exam grades using a difference-in-differences approach (Latif and Miles, 2020)[
<xref ref-type="bibr" rid="R23">23</xref>]. Similarly, the scheduling of student assessments and exams has been shown recently to have an impact on student learning and performance (Sofoklis and Megalokonomou, 2020)[
<xref ref-type="bibr" rid="R24">24</xref>].</p>
<p>Measuring student learning can be achieved by effective and direct assessment methods such as examinations and/or in-course assessments (ICAs). Examinations, such as degree examinations, can be defined as a written demonstration of a certain level of theoretical knowledge gained by students based on achievement of the course learning objectives. Assessments (e.g., ICAs) are a practical form of measuring students&#x2019; competences by evaluating knowledge, skills, and attitude, three major factors associated with the course learning objectives (Stassen, 2001)[
<xref ref-type="bibr" rid="R25">25</xref>].</p>
<p>Most studies have used degree examinations to evaluate the impact of quizzes on student learning and performance, yet little is known about whether quizzes can reinforce student performance in ICAs despite the importance of ICAs in students&#x2019; learning. Herein, we test the hypothesis that administration of weekly MCQ quizzes can enhance the learning outcomes and performance of biomedical science students in ICA methods, such as essay and oral presentation, that directly measure and provide information about student learning. We found a remarkable improvement in student learning outcomes and performance, as well as class attendance, after the administration of weekly MCQ quizzes.</p>
</sec><sec id="sec2">
<title>Materials and Methods</title><title>2.1. Overview of the Course &amp; Assessment</title><p>Since 2018, I have designed the Human Disease: From Research to Clinic 3A (HDRC3A) course to use the understanding of disease pathways to explore how biomedical research continues to give rise to knowledge and understanding of pathophysiology and the treatment of disease. In addition, I designed the HDRC3A teaching and learning activities to use in-depth analysis of specific disease examples from a range of biomedical disciplines, covering several major body systems. These examples are chosen to introduce the most common current research techniques and approaches, and include teaching on the theoretical principles, practical applications, and limitations of these techniques. These examples are also used to explore hypothesis generation and experimental design. The learning outcomes of the HDRC3A course are for students to be able to describe and discuss a range of common human disease states and how human diseases can be treated and prevented, and to apply their understanding of current research techniques and approaches in biomedical sciences to problems in disease treatment or prevention.</p>
<p>The course covers 13 human diseases: one disease per week. Each disease/module is delivered through 3 lectures (each for one hour) and one 3-hour workshop every week. Both lectures and workshops are designed to be interactive and, therefore, all students are expected to contribute to an active discussion. The course is assessed by in-course assessment (ICA) that consists of a summative essay and a summative oral presentation, and a formative essay plan assessment.</p>
<p>In the academic year (AY) 2020-2021, I incorporated frequent multiple-choice quizzes, an innovative teaching&#x2013;learning tool, every week to motivate students to attend HDRC3A classes.</p>
<title>2.2. Participants and Study Design</title><p>Consistent with other studies, our initial plan was to split students into two groups: a control group that would not receive any quizzes and an experimental group that would receive quizzes every week. Consequently, the HDRC3A course would run in two module streams, one assigned as the &#x201c;control student group&#x201d; and the other as the quizzes (experimental) group. The same lecturers would teach both streams, and identical lecture contents and teaching materials would be used to ensure comparability. However, such splitting would require strong justification, since it could adversely affect a subset of students, and we would typically be expected to allow students to choose which option to take. Therefore, we decided to carry out this research with all students in the HDRC3A course in AY 2020-2021 and then compare the results with student performance in the same course in the previous year, in which there were no quizzes (i.e., AY 2019-2020).</p>
<title>2.3. Participants</title><p>As shown in the flowchart, participants of this study were 63 students and 54 students in AY 2019-2020 and AY 2020-2021, respectively. The researchers of this study were the course organizer (the article&#x2019;s author) and the instructors of the different classes. Participant students formed two groups: AY 2020-2021 HDRC3A course students, who served as the experimental group and received weekly MCQ quizzes, and AY 2019-2020 HDRC3A course students, who served as the control group and did not receive weekly quizzes. Intact classes were used in this study.</p>
<title>2.4. Instruments &amp; Procedures</title><p>The <italic>instruments</italic> used in this study were the course organizer- and instructor-made MCQ quizzes.</p>
<p>The study employed an experimental design. The experimental student group took weekly MCQ quizzes for the whole course duration, while the control student group did not take any MCQ quizzes. Experimental students received 2 questions per week, delivered at the end of lectures; a total of 25-26 in-class MCQs were set across the course. Students were asked to answer these MCQ quizzes within the constraints of a 50-minute lecture to encourage lecture attendance, and the quizzes were not available outside of class. For formative practice, we prepared several example questions and made them available to students on Blackboard Learn. Students were asked to sit all <italic>MCQ quizzes</italic>. We performed a t-test to investigate the effect of quizzes on the learning outcomes and performance of students in ICA methods such as essay and oral presentation, which can directly measure and provide information about student learning.</p>
<title>2.5. Statistical analyses</title><p>We compared the data of the experimental and control student groups using an unpaired <italic>t</italic> test. Differences between groups were considered significant at a <italic>P</italic> value of &lt;0.05. Statistical analyses were performed with both Excel and GraphPad Prism (GraphPad Software, Inc., San Diego, CA).</p>
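As a minimal illustration (not the authors' original Excel/GraphPad workflow), the unpaired t tests reported in the Results can be reproduced from the group summary statistics alone (means, SDs, and group sizes from Tables 1 and 2) using SciPy:

```python
# Sketch: pooled-variance unpaired t tests recomputed from the summary
# statistics reported in Tables 1 and 2. Assumes SciPy is available.
from scipy.stats import ttest_ind_from_stats

# Essay ICA: experimental (quizzes) vs. control (no quizzes)
t_essay, p_essay = ttest_ind_from_stats(
    mean1=87.611, std1=3.790, nobs1=54,   # experimental group (Table 1)
    mean2=86.698, std2=4.361, nobs2=63,   # control group (Table 1)
    equal_var=True)                       # classic Student's t, df = 115

# Oral presentation ICA
t_pres, p_pres = ttest_ind_from_stats(
    mean1=87.692, std1=5.691, nobs1=52,   # experimental group (Table 2)
    mean2=86.421, std2=6.812, nobs2=63,   # control group (Table 2)
    equal_var=True)                       # df = 113

print(f"essay: t = {t_essay:.4f}, p = {p_essay:.4f}")        # ≈ 1.198, ≈ 0.233
print(f"presentation: t = {t_pres:.4f}, p = {p_pres:.4f}")   # ≈ 1.072, ≈ 0.286
```

Small differences in the last decimal place versus the published t values are expected, since the published means and SDs are themselves rounded.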
</sec><sec id="sec3">
<title>Results</title><title>3.1. Effect of weekly quizzes on student learning outcomes and performance in ICA methods</title><p>To investigate the effect of weekly quizzes on student learning outcomes and performance in ICA methods, we used a clustered bar chart to compare the relative frequency of student scores in each ICA. As shown in Figure <xref ref-type="fig" rid="fig1">1</xref>, the percentage of essay ICA scores clearly increased in the 90-100 interval but was apparently unchanged in the 80-90 interval in the experimental (quizzes) group compared to the control student group (no quizzes). Conversely, the percentage of essay ICA scores clearly decreased in the 70-80 interval in the experimental (quizzes) group compared to the control student group (no quizzes) (Figure 1). Hence, there was almost a 50% decrease in the percentage of students who scored 70-80% in the essay ICA in the experimental group, while a 17% increase in the percentage of students who scored 90-100% was found in the experimental group compared to the control student group (Figure 1).</p>
<fig id="fig1">
<label>Figure 1</label>
<caption>
<p><b>Clustered bar chart showing a comparison of the relative frequency of student scores in the essay ICA</b>. Note the increase in the percentage of essay ICA scores in the 90-100 interval after administering MCQ quizzes in the experimental (quizzes) group compared to the control student group (no quizzes). Note also that the percentage of essay ICA scores decreased in the 70-80 interval in the experimental group compared to the control student group.</p>
</caption>
<graphic xlink:href="273.fig.001" />
</fig><p>Similarly, the percentage of oral presentation ICA scores markedly increased in the 90-100 interval but apparently decreased in the 80-90 interval in the experimental (quizzes) group compared to the control student group (no quizzes; Figure <xref ref-type="fig" rid="fig2">2</xref>). Hence, there was almost a 51% increase in the percentage of students who scored 90-100% in the presentation ICA in the experimental group, while a 19% decrease in the percentage of students who scored 80-90% was found in the experimental group compared to the control student group (Figure 2).</p>
<fig id="fig2">
<label>Figure 2</label>
<caption>
<p><b>Clustered bar chart showing a comparison of the relative frequency of student scores in the oral presentation ICA</b>. Note that the percentage of presentation ICA scores markedly increased in the 90-100 interval but apparently decreased in the 80-90 interval in the experimental (quizzes) group compared to the control student group (no quizzes).</p>
</caption>
<graphic xlink:href="273.fig.002" />
</fig><p>Notably, a decrease of almost 25% was also noted in the percentage of students who scored 70-80% in the presentation ICA in the experimental group, compared to the control student group. Three percent of control students scored in the 60-70 interval, while none of the experimental students scored in this interval (Figure 2).</p>
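The clustered bar charts above are relative-frequency comparisons per 10-point score interval. As a sketch of how such interval percentages can be computed (the score lists below are illustrative placeholders, not the study's raw data):

```python
# Sketch: relative frequency (%) of scores per 10-point interval, as used in
# the clustered bar charts. Score lists are made up for illustration only.
def interval_percentages(scores, edges=(60, 70, 80, 90, 100)):
    """Return {"60-70": pct, ...}. Each score is counted in [lo, hi),
    except the last interval, which also includes the top edge (100)."""
    out = {}
    for lo, hi in zip(edges, edges[1:]):
        n = sum(1 for s in scores
                if lo <= s < hi or (hi == edges[-1] and s == hi))
        out[f"{lo}-{hi}"] = 100.0 * n / len(scores)
    return out

control = [72, 75, 78, 83, 85, 88, 91, 95, 66, 79]        # illustrative
experimental = [81, 84, 86, 88, 90, 92, 93, 95, 97, 78]   # illustrative

for name, grp in [("control", control), ("experimental", experimental)]:
    print(name, interval_percentages(grp))
```

Plotting the two resulting dictionaries side by side per interval yields the clustered bar comparison described in the text.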
<p>Furthermore, we conducted an unpaired-samples t-test to determine whether the two sets of essay ICA data differed significantly. As shown in Table <xref ref-type="table" rid="tab1">1</xref>, there was no significant difference in the essay ICA scores between the experimental student group, which received the weekly MCQ quizzes (M=87.611; SD=3.790), and the control student group with no quizzes (M=86.698; SD=4.361); t(115)=1.1981, p=0.2334. Similarly, there was no significant difference in the presentation ICA scores between the experimental student group (M=87.692; SD=5.691) and the control student group (M=86.421; SD=6.812); t(113)=1.0722, p=0.2859. Descriptive and test statistics for the oral presentation ICA are summarized in Table <xref ref-type="table" rid="tab2">2</xref>.</p>
<table-wrap id="tab1">
<label>Table 1</label>
<caption>
<p><b>Table 1.</b> Descriptive statistics for the unpaired-samples <i>t</i>-test for the essay ICA</p>
</caption>
<table> <tr>  <td>  <p><b >Student Group</b></p>  </td>  <td>  <p><b >Number (N)</b></p>  </td>  <td>  <p><b >Mean</b></p>  </td>  <td>  <p><b >Std. Deviation (SD)</b></p>  </td>  <td>  <p><b >Std. Error Mean (SEM)</b></p>  </td> </tr> <tr>  <td>  <p>experimental (quizzes)</p>  </td>  <td>  <p>54</p>  </td>  <td>  <p>87.611</p>  </td>  <td>  <p>3.790</p>  </td>  <td>  <p>0.516</p>  </td> </tr> <tr>  <td>  <p>control (no quizzes)</p>  </td>  <td>  <p>63</p>  </td>  <td>  <p>86.698</p>  </td>  <td>  <p>4.361</p>  </td>  <td>  <p>0.549</p>  </td> </tr></table>
</table-wrap>
<table-wrap-foot>
<fn>
The two-tailed P value= 0.2334 (t= 1.1981; df= 115; standard error of difference= 0.762)
</fn>
</table-wrap-foot><table-wrap id="tab2">
<label>Table 2</label>
<caption>
<p><b>Table 2.</b> Descriptive statistics for the unpaired-samples <i>t</i>-test for the oral presentation ICA</p>
</caption>
<table> <tr>  <td>  <p><b >Student Group</b></p>  </td>  <td>  <p><b >Number (N)</b></p>  </td>  <td>  <p><b >Mean</b></p>  </td>  <td>  <p><b >Std. Deviation (SD)</b></p>  </td>  <td>  <p><b >Std. Error Mean (SEM)</b></p>  </td> </tr> <tr>  <td>  <p>experimental (quizzes)</p>  </td>  <td>  <p>52</p>  </td>  <td>  <p>87.692</p>  </td>  <td>  <p>5.691</p>  </td>  <td>  <p>0.789</p>  </td> </tr> <tr>  <td>  <p>control (no quizzes)</p>  </td>  <td>  <p>63</p>  </td>  <td>  <p>86.421</p>  </td>  <td>  <p>6.812</p>  </td>  <td>  <p>0.858</p>  </td> </tr></table>
</table-wrap>
<table-wrap-foot>
<fn>
The two-tailed P value= 0.2859 (t=1.0722; df= 113; standard error of difference= 1.1)
</fn>
</table-wrap-foot>
<title>3.2. Impact of MCQ quizzes on student attendance</title><p>In AY 2019-2020, we noted that the number of students attending HDRC3A classes dramatically decreased, particularly in the last several weeks of the course. Therefore, one major goal of administering weekly MCQ quizzes was to enhance student attendance and alleviate &#x2018;selective negligence&#x2019;, whereby students determine what needs to be done to pass the assessments and do this, but only this, resulting in poor engagement. Indeed, we observed a remarkable increase in classroom attendance of HDRC3A after using weekly MCQ quizzes in AY 2020-2021.</p>
<p>Taken together, there is a shift of student scores in both ICAs, with an increased number of experimental students achieving high scores in the 90-100 interval and a decreased number of students with low scores (i.e., in the 70-80 and 60-70 intervals), compared to control students. This suggests that the administration of weekly MCQ quizzes can improve the learning outcomes and performance of students in ICA methods such as essay and oral presentation, which directly measure and provide information about student learning. However, there is no evidence that using MCQ quizzes in class has a significant effect on the average student scores in either ICA. Furthermore, weekly MCQ quizzes may motivate students to attend classes.</p>
</sec><sec id="sec4">
<title>Discussion</title><p>Most published studies have used degree examinations, rather than ICAs, to evaluate the impact of quizzes on student learning and performance, despite the importance of ICAs in student learning (Latif and Miles, 2020)[
<xref ref-type="bibr" rid="R23">23</xref>]. Therefore, in this study, we tested the hypothesis that administration of weekly MCQ quizzes can enhance the learning outcomes and performance of biomedical science students in ICA methods, such as essay and oral presentation, that directly measure and provide information about student learning.</p>
<p>We found that the performance of the weekly quiz (experimental) student group was remarkably better than that of the control student group in both the essay and oral presentation ICAs, which are two measures and indicators of student learning, suggesting improved student learning outcomes and performance. This conclusion was supported by increased number of high score-achieving students (with a score between 90% and 100%), and reduced number of low score-achieving students with a score between 70% and 80% (or 60%-70%), after administration of weekly MCQ quizzes.</p>
<p>The findings of the current research are consistent with previous studies showing that weekly and online quizzes can enhance student performance (Geist and Soehren, 1997; Ballard and Johnson, 2004; Gholami and Moghaddam, 2013; Culbert, 2020)[
<xref ref-type="bibr" rid="R5">5</xref>,<xref ref-type="bibr" rid="R6">6</xref>,<xref ref-type="bibr" rid="R12">12</xref>,<xref ref-type="bibr" rid="R26">26</xref>]. Quizzing has been found to be a powerful and preferable tool for both teaching and student learning in biological courses (Culbert, 2020)[
<xref ref-type="bibr" rid="R6">6</xref>]. Similarly, students who received weekly quizzes achieved higher scores than students who did not receive quizzes during the course, suggesting improved student performance (Martin and Srikameswaran, 1974; Kamuche, 2005)[
<xref ref-type="bibr" rid="R27">27</xref>,<xref ref-type="bibr" rid="R28">28</xref>]. However, a study by Haberyan (2003)[
<xref ref-type="bibr" rid="R29">29</xref>] demonstrated that the difference in class performance between the weekly quiz student group and the no-quiz control students was not significant. Conversely, a recent meta-analysis showed that students who were quizzed over class materials (at least once a week) performed better on both midterm and final examinations compared to control students who did not take quizzes (Lukas et al., 2020)[
<xref ref-type="bibr" rid="R30">30</xref>]. Interestingly, the quiz format probably influences both student performance and student answer-changing behavior on common formative assessments (Sherman et al., 2021)[
<xref ref-type="bibr" rid="R31">31</xref>].</p>
<p>The impact of frequent testing, such as weekly quizzes, on improving student learning outcomes and performance, compared to infrequent testing, could be due to promoting student attendance, which is critical for improving institutional outcomes (Bergin and Ferrara, 2019)[
<xref ref-type="bibr" rid="R32">32</xref>]. One of the major goals of the current research was to incorporate frequent MCQ quizzes, an innovative teaching&#x2013;learning tool, to motivate students to attend classes and enhance class attendance, which notably declined in the last several weeks of the HDRC3A course in AY 2019-2020 (when no quizzes were used), and to alleviate the &#x2018;selective negligence&#x2019; and attendance problem, whereby students determine what needs to be done to pass the assessments and do this, but only this, resulting in poor engagement.</p>
<p>Indeed, we found that classroom attendance of HDRC3A remarkably increased after using weekly MCQ quizzes in AY 2020-2021. Our findings are consistent with previous studies showing that weekly quizzes can encourage students to come to class consistently, since there is a positive relationship between frequent testing of students and their classroom attendance (Wilder et al., 2001; Zarei, 2008)[
<xref ref-type="bibr" rid="R8">8</xref>,<xref ref-type="bibr" rid="R10">10</xref>]. This can provide more opportunities for students to learn in the classroom, leading to improving their overall course grades (Wilder et al., 2001; Clump et al., 2003)[
<xref ref-type="bibr" rid="R8">8</xref>,<xref ref-type="bibr" rid="R9">9</xref>]. Other studies on college students supported this conclusion and showed that frequent quizzing can encourage students to attend classes at a higher rate (Gokcora and DePaulo, 2018)[
<xref ref-type="bibr" rid="R33">33</xref>]. Indeed, a recent study has further supported the positive effect of using weekly quizzes on student attendance, engagement, and achievement (AlBahadli, 2020)[
<xref ref-type="bibr" rid="R13">13</xref>].</p>
<p>Another effect of frequent testing on student learning outcomes and performance is improving student retention of information and lecture materials. This retention improvement could be due to frequent testing-based motivation of students to do both pre-class readings and preparations (Standlee and Popham, 1960; Dustin, 1971)[
<xref ref-type="bibr" rid="R33">33</xref>,<xref ref-type="bibr" rid="R34">34</xref>] and extra work in the class (Martin and Srikameswaran, 1974)[
<xref ref-type="bibr" rid="R27">27</xref>]. Indeed, online quizzes can also improve students&#x2019; preparation outside of class, reinforce their learning, and enhance class participation (, 2008; et al., 2021)[
<xref ref-type="bibr" rid="R15">15</xref>,<xref ref-type="bibr" rid="R16">16</xref>].</p>
<p>The timing of quizzes given to students during learning can impact their retention of teaching material (Healy et al., 2017)[
<xref ref-type="bibr" rid="R36">36</xref>]. Interrupting students&#x2019; learning with quiz questions may have several benefits by enhancing student engagement (Healy et al., 2017)[
<xref ref-type="bibr" rid="R36">36</xref>]. In addition, a remarkable study by Case and Kennedy (2020)[
<xref ref-type="bibr" rid="R37">37</xref>] investigated the impact of frequent in-class quizzing sequence, both post-lecture and pre-lecture quizzes, on students&#x2019; lesson preparation, in-class participation, and knowledge retention. They reported that quiz sequence (pre-lecture versus post-lecture quizzes) has a significant impact on student in-class engagement and knowledge retention (Case and Kennedy, 2020)[
<xref ref-type="bibr" rid="R37">37</xref>].</p>
<p>Recent studies showed that weekly quizzes are also a beneficial assessment tool that can help students retrieve their knowledge and receive immediate feedback on their performance (Heise et al., 2020)[
<xref ref-type="bibr" rid="R14">14</xref>]. Sotola and Crede (2020)[
<xref ref-type="bibr" rid="R30">30</xref>] reported that the immediate instructor feedback associated with quizzing can have a positive impact on student performance. Weekly quizzes can therefore also serve as a predictor of student outcomes and an important opportunity for early intervention to support at-risk students, since student performance on weekly quizzes potentially correlates with performance on unit examinations (Heise et al., 2020)[
<xref ref-type="bibr" rid="R14">14</xref>].</p>
<p>Other reasons for the positive effect of frequent testing on information retention are that it promotes more student-teacher interaction, discussion, and engagement (Fitch et al., 1951; Selakovich, 1962; Farhady et al., 1994)[
<xref ref-type="bibr" rid="R38">38</xref>,<xref ref-type="bibr" rid="R39">39</xref>,<xref ref-type="bibr" rid="R40">40</xref>], and that it facilitates and supports the systematic learning of lecture materials (Fulkerson and Martin, 1981)[
<xref ref-type="bibr" rid="R41">41</xref>]. Since small amounts of lecture materials are tested through frequent testing, students can process these materials more meticulously and deeply, leading to more efficient learning (Standlee and Popham, 1960)[
<xref ref-type="bibr" rid="R34">34</xref>]. Similarly, students preferred online quiz pretests as a pre-practicum activity because they are not time-consuming and help students prepare well before conducting a practicum (Atmajati et al., 2020)[
<xref ref-type="bibr" rid="R17">17</xref>].</p>
<p>Furthermore, frequent quizzing can create more extrinsic motivation for students to score a high grade in their assessments and exams, leading them to spend more time and effort on their study and preparation for quizzes and assessments (Standlee and Popham, 1960; Dustin, 1971)[
<xref ref-type="bibr" rid="R34">34</xref>,<xref ref-type="bibr" rid="R35">35</xref>]. However, in his study on learnability, Zarei (2008)[
<xref ref-type="bibr" rid="R10">10</xref>] argued that &#x0201c;motivation is not always the cause of good grades; it may well be the result of them&#x0201d;. Frequent quizzing can also expose students to teaching materials regularly, making them more familiar with teachers&#x02019; instructional expectations, question types, and the methodology of both assessments and exams (Farhady et al., 1994)[
<xref ref-type="bibr" rid="R40">40</xref>]. Other types of quizzes, such as pub quizzes (Gull et al., 2015)[
<xref ref-type="bibr" rid="R19">19</xref>], adaptive quizzes (Ross et al., 2018)[
<xref ref-type="bibr" rid="R20">20</xref>], feedback quizzes (Hennig et al., 2019)[
<xref ref-type="bibr" rid="R21">21</xref>], multimodal quizzes (Gamage et al., 2019)[
<xref ref-type="bibr" rid="R22">22</xref>], also have a positive effect on the learning and performance of students and/or prospective teachers. Notably, quiz sequence (pre-lecture versus post-lecture quizzes) has a significant impact on student motivation (Case and Kennedy, 2020)[
<xref ref-type="bibr" rid="R37">37</xref>].</p>
<p>The findings of this research study have implications for students, teachers, and curriculum designers, since they suggest that administering weekly MCQ quizzes can improve the learning outcomes and performance of students in ICA methods, such as essays and oral presentations, that directly measure and provide information about student learning.</p>
</sec><sec id="sec5">
<title>Acknowledgments</title><p>The author gratefully acknowledges the support of colleagues and collaborators at the institute.</p>
<p><bold>Declaration of interest statement</bold></p>
<p>No potential conflict of interest was reported by the author.</p>
<p><bold>Funding</bold> (Not applicable)</p>
<p><bold>Availability of data and material </bold>(Not applicable)</p>
<p><bold>Code availability</bold> (Not applicable)</p>
<p><bold>Authors' contributions </bold>(Not applicable)</p>
<p><bold>Ethical Approval: </bold>described</p>
</sec>
  </body>
  <back>
    <ref-list>
      <title>References</title>
      
<ref id="R1">
<label>[1]</label>
<mixed-citation publication-type="other">Azer SA, Guerrero APS, Walsh A (2013). Enhancing learning approaches: Practical tips for students and teachers. Medical Teacher 35(6); 433-443.
</mixed-citation>
</ref>
<ref id="R2">
<label>[2]</label>
<mixed-citation publication-type="other">Philips, J (2017). 5 tips to improve student learning outcome. https://www.bookwidgets.com/blog/2017/05/5-tips-to-improve-student-learning-outcome
</mixed-citation>
</ref>
<ref id="R3">
<label>[3]</label>
<mixed-citation publication-type="other">Kampen, M (2021). 36 Cutting-Edge Teaching Strategies &#x00026; Techniques for 2021 Learning. https://www.prodigygame.com/main-en/blog/teaching-strategies/
</mixed-citation>
</ref>
<ref id="R4">
<label>[4]</label>
<mixed-citation publication-type="other">Brown, HD. (2004). Language assessment: Principles and classroom Practices. US. Longman.
</mixed-citation>
</ref>
<ref id="R5">
<label>[5]</label>
<mixed-citation publication-type="other">Geist JR, Soehren SE. (1997). The Effect of Frequent Quizzes on Short- and Long-Term Academic Performance. Journal of Dental Education, 61(4); 339-345.
</mixed-citation>
</ref>
<ref id="R6">
<label>[6]</label>
<mixed-citation publication-type="other">Culbert PD (2020). Supplementing Forestry Field Instruction with Video and Online Dynamic Quizzing. Nat Sci Educ. 49; e20015.
</mixed-citation>
</ref>
<ref id="R7">
<label>[7]</label>
<mixed-citation publication-type="other">Jones, CH (1984). Interaction of absences and grades in a college course. Journal of Psychology. 116; 133-136.
</mixed-citation>
</ref>
<ref id="R8">
<label>[8]</label>
<mixed-citation publication-type="other">Wilder, DA, Flood WA, Stromsnes W. (2001). The use of random extra credit quizzes to increase student attendance. Journal of Instructional Psychology. Retrieved March 7, 2010, From http://findarticles.com/p/articles/mi
</mixed-citation>
</ref>
<ref id="R9">
<label>[9]</label>
<mixed-citation publication-type="other">Clump MA, Bauer H, Alex W. (2003). To attend or not to attend: Is that a good question? The Internet TESOL Journal, 8(12), Retrieved March 12, 2010; from http://findarticles.com/p/articles/mi.
</mixed-citation>
</ref>
<ref id="R10">
<label>[10]</label>
<mixed-citation publication-type="other">Zarei, AA (2008). On the Learnability of three categories of Idioms by Iranian EFL learners. Journal of Humanities of the University of Kerman, 2(2); 82-100.
</mixed-citation>
</ref>
<ref id="R11">
<label>[11]</label>
<mixed-citation publication-type="other">Johnson BC, Kiviniemi MT (2009). The effect of online chapter Quizzes on Exam performance in an undergraduate social psychology course. Teach Psychology. 36(1); 33-37.
</mixed-citation>
</ref>
<ref id="R12">
<label>[12]</label>
<mixed-citation publication-type="other">Gholami V and Moghaddam MM. (2013). The Effect of Weekly Quizzes on Students' Final Achievement Score. J. Modern Education and Computer Science, 2013(1); 36-41.
</mixed-citation>
</ref>
<ref id="R13">
<label>[13]</label>
<mixed-citation publication-type="other">AlBahadli KH (2020). The effect of weekly quizzes on EFL college students' achievement, engagement, and attendance. International Journal of Interdisciplinary &#x00026; Multidisciplinary Research (2456-4567).
</mixed-citation>
</ref>
<ref id="R14">
<label>[14]</label>
<mixed-citation publication-type="other">Heise N, Meyer CA, Garbe BA et al. (2020). Table Quizzes as an Assessment Tool in the Gross Anatomy Laboratory. Journal of Medical Education and Curricular Development 7; 1-10.
</mixed-citation>
</ref>
<ref id="R15">
<label>[15]</label>
<mixed-citation publication-type="other">Marcell, M. (2008). Effectiveness of Regular Online Quizzing in Increasing Class Participation and Preparation. International Journal for the Scholarship of Teaching and Learning 2(1); 1-9.
</mixed-citation>
</ref>
<ref id="R16">
<label>[16]</label>
<mixed-citation publication-type="other">Agrawal N, Rathi S, Gupta N et al. (2021). The use of anonymous pop-quizzes as an innovative teaching-learning tool to reinforce learning among undergraduate dental students. SRM Journal of Research in Dental Sciences 12(2); 74.
</mixed-citation>
</ref>
<ref id="R17">
<label>[17]</label>
<mixed-citation publication-type="other">Atmajati ED, Kristanto YD, Panuluh AH (2020). Student Perception of Online Quizzes and Interview Pretest Implementation in Pre-Practicum Activity. Companion Proceedings of the SEADRIC 2019 (2020); 150-154. https://usd.ac.id/seadr
</mixed-citation>
</ref>
<ref id="R18">
<label>[18]</label>
<mixed-citation publication-type="other">Glodowski K, Thompson RH, Asuncion EA (2020). Evidence-Based Recommendations for Programming Quizzes to Improve College Student Behavior in Residential Courses. Journal of Behavioral Education. 29(8); 543-570.
</mixed-citation>
</ref>
<ref id="R19">
<label>[19]</label>
<mixed-citation publication-type="other">Gull F, Shaheen F, Rana R (2015). Using "Pub Quiz" to Promote Participation and Active Learning in Prospective Teachers. Journal Pendidikan Malaysia 40(2); 119-128.
</mixed-citation>
</ref>
<ref id="R20">
<label>[20]</label>
<mixed-citation publication-type="other">Ross, B., Chase, AM., Robbie, D. et al. Adaptive quizzes to increase motivation, engagement and learning outcomes in a first-year accounting unit. Int J Educ Technol High Educ 15, 30 (2018). https://doi.org/10.1186/s41239-018-0113-2
</mixed-citation>
</ref>
<ref id="R21">
<label>[21]</label>
<mixed-citation publication-type="other">Hennig S, Staatz CE, Bond JA, et al. (2019). Quizzing for success: Evaluation of the impact of feedback quizzes on the experiences and academic performance of undergraduate students in two clinical pharmacokinetics courses. Currents in Pharmacy Teaching and Learning. 11(7); 742-749. https://doi.org/10.1016/j.cptl.2019.03.014.
</mixed-citation>
</ref>
<ref id="R22">
<label>[22]</label>
<mixed-citation publication-type="other">Gamage SH, Ayres JR, Behrend MB, Smith EJ (2019). Optimizing Moodle quizzes for online assessments. International Journal of STEM Education 6(27); 1-14.
</mixed-citation>
</ref>
<ref id="R23">
<label>[23]</label>
<mixed-citation publication-type="other">Latif, E, Miles S (2020). The Impact of Assignments and Quizzes on Exam Grades: A Difference-in-Difference Approach, Journal of Statistics Education, 28(3); 289-294.
</mixed-citation>
</ref>
<ref id="R24">
<label>[24]</label>
<mixed-citation publication-type="other">Sofoklis GS and Megalokonomou R (2020). The effects of exam scheduling on academic performance. https://voxeu.org/article/effects-exam-scheduling-academic-performance
</mixed-citation>
</ref>
<ref id="R25">
<label>[25]</label>
<mixed-citation publication-type="other">Stassen MLA (2001). COURSE-Based Review and Assessment: Methods for Understanding Student Learning. Massachusetts, USA: University of Massachusetts Amherst Press.
</mixed-citation>
</ref>
<ref id="R26">
<label>[26]</label>
<mixed-citation publication-type="other">Ballard, C. L. &#x00026; Johnson, M. F. (2004). Basic Math Skills and Performance in an Introductory Economics Class. Journal of Economic Education, 35(1), 3-24.
</mixed-citation>
</ref>
<ref id="R27">
<label>[27]</label>
<mixed-citation publication-type="other">Martin, RR and Srikameswaran, K. (1974). Correlation between frequent testing and student performance. Journal of Chemical Education, 51(7), 485-486.
</mixed-citation>
</ref>
<ref id="R28">
<label>[28]</label>
<mixed-citation publication-type="other">Kamuche, F. U. (2005). Do weekly quizzes improve student performance? Academic Exchange Quarterly, 9(3), 188-193.
</mixed-citation>
</ref>
<ref id="R29">
<label>[29]</label>
<mixed-citation publication-type="other">Haberyan, K. A. (2003). Do Weekly Quizzes Improve Student Performance on General Biology Exams? American Biology Teacher, 65(2), 110-114.
</mixed-citation>
</ref>
<ref id="R30">
<label>[30]</label>
<mixed-citation publication-type="other">Sotola LK, Crede M (2020). Regarding Class Quizzes: A Meta-analytic Synthesis of Studies on the Relationship Between Frequent Low-Stakes Testing and Class Performance. Educational Psychology Review. DOI: 10.1007/s10648-020-09563-9
</mixed-citation>
</ref>
<ref id="R31">
<label>[31]</label>
<mixed-citation publication-type="other">Sherman, TJ, Harvey TM, Royse EA, et al. (2021). Effect of quiz format on student performance and answer-changing behavior on formative assessments, Journal of Biological Education, 55(3); 306-320.
</mixed-citation>
</ref>
<ref id="R32">
<label>[32]</label>
<mixed-citation publication-type="other">Bergin J and Ferrara L (2019). How student attendance can improve institutional outcomes. Educause Review. https://er.educause.edu/blogs/sponsored/2019/4/how-student-attendance-can-improve-institutional-outcomes
</mixed-citation>
</ref>
<ref id="R33">
<label>[33]</label>
<mixed-citation publication-type="other">Gokcora D and DePaulo D (2018). Frequent Quizzes and Student Improvement of Reading: A Pilot Study in a Community College Setting. SAGE Open 2018: 1-9
</mixed-citation>
</ref>
<ref id="R34">
<label>[34]</label>
<mixed-citation publication-type="other">Standlee, L. S. &#x00026; Popham, W. J. (1960). Quizzes' contribution to learning. Journal of Educational Psychology, 51(6), 322-325.
</mixed-citation>
</ref>
<ref id="R35">
<label>[35]</label>
<mixed-citation publication-type="other">Dustin, D. S. (1971). Some effects of exam frequency. The Psychological Record, 21(3), 409-414
</mixed-citation>
</ref>
<ref id="R36">
<label>[36]</label>
<mixed-citation publication-type="other">Healy AF, Jones M, Lalchandani LA, Tack LA. (2017). Timing of quizzes during learning: Effects on motivation and retention. J Exp Psychol Appl, 23(2), 128-137.
</mixed-citation>
</ref>
<ref id="R37">
<label>[37]</label>
<mixed-citation publication-type="other">Case, JJ and Kennedy DK (2020). Using Quizzes Effectively: Understanding the Effects of Quiz Timing on Student Motivation and Knowledge Retention (a paper completed and submitted in partial fulfillment of the Master Teacher Program, a 2-year faculty professional development program conducted by the Center for Teaching Excellence, United States Military Academy, West Point, NY, 2020).
</mixed-citation>
</ref>
<ref id="R38">
<label>[38]</label>
<mixed-citation publication-type="other">Fitch, M. L., Drucker, A. J., &#x00026; Norton, J. R. (1951). Frequent testing as a motivating factor in large lecture classes. The Journal of Educational Psychology, 42(1), 1-20.
</mixed-citation>
</ref>
<ref id="R39">
<label>[39]</label>
<mixed-citation publication-type="other">Selakovich, D. (1962). An experiment attempting to determine the effectiveness of frequent testing as an aid to learning in beginning college courses in American government. The Journal of Educational Research, 55(4), 178-180.
</mixed-citation>
</ref>
<ref id="R40">
<label>[40]</label>
<mixed-citation publication-type="other">Farhady, H., Jafarpur, A., &#x00026; Birjandi, P. (1994). Testing language skills: From theory to practice. Tehran: SAMT Publication.
</mixed-citation>
</ref>
<ref id="R41">
<label>[41]</label>
<mixed-citation publication-type="other">Fulkerson, F. F. and Martin, G. (1981). Effects of exam frequency on student performance, evaluations of instructor, and test anxiety. Teaching of Psychology, 8(2), 90-93.
</mixed-citation>
</ref>
    </ref-list>
  </back>
</article>