Introduction
The COVID-19 pandemic has dramatically changed the demographics of online students. Previously, almost all students engaged in online learning elected the online format, starting with individual online courses in the mid-1990s through today’s robust online degree and certificate programs. These students prioritize convenience, flexibility, and the ability to work while studying, and are older than traditional college-age students (Harris and Martin, 2012; Levitz, 2016). These students also find asynchronous elements of a course more useful than synchronous elements (Gillingham and Molinari, 2012). In contrast, students who choose to take courses in person prioritize face-to-face instruction and connection with others and skew considerably younger (Harris and Martin, 2012). This leaves open the question of whether students who prefer to learn in person but are forced to learn remotely will prefer synchronous or asynchronous methods. One study of student preferences following the switch to remote learning during the COVID-19 pandemic indicates that students enjoy synchronous course elements more than asynchronous ones and find them more effective (Gillis and Krull, 2020). Now that millions of traditionally in-person courses have transitioned online, our survey expands the data on student preferences and explores whether those preferences align with pedagogical best practices.
An extensive body of research has explored what instructional methods improve student learning outcomes (Fink, 2013). Considerable evidence indicates that active-learning or student-centered approaches result in better learning outcomes than passive-learning or instructor-centered approaches, both in-person and online (Freeman et al., 2014; Chen et al., 2018; Davis et al., 2018). Active-learning approaches include student activities or discussion in class, whereas passive-learning approaches emphasize extensive exposition by the instructor (Freeman et al., 2014). Constructivist learning theories argue that students must be active participants in creating their own learning, and that listening to expert explanations is seldom sufficient to trigger the neurological changes necessary for learning (Bostock, 1998; Zull, 2002). Some studies conclude that, while students learn more via active learning, they may report greater perceptions of their learning and greater enjoyment when passive approaches are used (Deslauriers et al., 2019). We examine student perceptions of remote learning experiences in light of these previous findings.
In this study, we administered a survey focused on student perceptions of remote learning in late May 2020 through the social media account of @unjadedjade to a global population of English-speaking undergraduate students representing institutions across 95 countries. We aim to explore how students were being taught, the relationship between pedagogical methods and student perceptions of their experience, and the reasons behind those perceptions. Here we present an initial analysis of the results and share our data set for further inquiry. We find that positive student perceptions correlate with synchronous courses that employ a variety of interactive pedagogical techniques, and that students overwhelmingly suggest behavioral and pedagogical changes that increase social engagement and interaction. We argue that these results support the importance of active learning in an online environment.
Materials and Methods
Participant Pool
Students were recruited through the Instagram account @unjadedjade. This social media platform, run by influencer Jade Bowler, focuses on education, effective study tips, an ethical lifestyle, and a positive mindset. For this reason, the audience is presumably academically inclined and interested in self-improvement. The survey was posted to her account and received 10,563 responses within the first 36 h. Here we analyze the 4,789 of those responses that came from undergraduates. While we did not collect demographic or identifying information, we suspect that women are overrepresented in these data, as followers of @unjadedjade are 80% women. A large minority of respondents were from the United Kingdom, as Jade Bowler is a British influencer. Specifically, 43.3% of participants attend United Kingdom institutions, followed by 6.7% attending university in the Netherlands, 6.1% in Germany, 5.8% in the United States, and 4.2% in Australia. Ninety additional countries are represented in these data (see Supplementary Figure 1).
Survey Design
The purpose of this survey is to learn about students’ instructional experiences following the transition to remote learning in the spring of 2020.
This survey was initially created for a student assignment for the undergraduate course Empirical Analysis at Minerva Schools at KGI. That version served as a robust pre-test and allowed for identification of the primary online platforms used, and the four primary modes of learning: synchronous (live) classes, recorded lectures and videos, uploaded or emailed materials, and chat-based communication. We did not adapt any open-ended questions based on the pre-test survey to avoid biasing the results and only corrected language in questions for clarity. We used these data along with an analysis of common practices in online learning to revise the survey. Our revised survey asked students to identify the synchronous and asynchronous pedagogical methods and platforms that they were using for remote learning. Pedagogical methods were drawn from literature assessing active and passive teaching strategies in North American institutions (Fink, 2013; Chen et al., 2018; Davis et al., 2018). Open-ended questions asked students to describe why they preferred certain modes of learning and how they could improve their learning experience. Students also reported on their affective response to learning and participation using a Likert scale.
The revised survey also asked whether students had responded to the earlier survey. No significant differences were found between responses of those answering for the first and second times (data not shown). See Supplementary Appendix 1 for survey questions. Survey data were collected from 5/21/20 to 5/23/20.
Qualitative Coding
We applied a qualitative coding framework adapted from Gale et al. (2013) to analyze student responses to open-ended questions. Four researchers read several hundred responses and noted themes that surfaced. We then developed a list of themes inductively from the survey data and deductively from the literature on pedagogical practice (Garrison et al., 1999; Zull, 2002; Fink, 2013; Freeman et al., 2014). The initial codebook was revised collaboratively based on feedback from researchers after coding 20–80 qualitative comments each. Before coding their assigned questions, alignment was examined through coding of 20 additional responses. Researchers aligned in identifying the same major themes. Discrepancies in terms identified were resolved through discussion. Researchers continued to meet weekly to discuss progress and alignment. The majority of responses were coded by a single researcher using the final codebook (Supplementary Table 1). All responses to questions 3 (4,318 responses) and 8 (4,704 responses), and 2,512 of 4,776 responses to question 12 were analyzed. Valence was also indicated where necessary (i.e., positive or negative discussion of terms). This paper focuses on the most prevalent themes from our initial analysis of the qualitative responses. The corresponding author reviewed codes to ensure consistency and accuracy of reported data.
Statistical Analysis
The survey included two sets of Likert-scale questions. The first consisted of six statements about students’ perceptions of their experiences following the transition to remote learning (Table 1). For each statement, students indicated their level of agreement on a five-point scale ranging from 1 (“Strongly Disagree”) to 5 (“Strongly Agree”). The second set asked students to respond to the same statements, but about their retroactive perceptions of their experiences with in-person instruction before the transition to remote learning. This set was not the subject of our analysis but is present in the published survey results. To explore correlations among student responses, we used CrossCat analysis to calculate the probability of dependence between Likert-scale responses (Mansinghka et al., 2016).
Table 1. Likert-scale questions.
Mean values are calculated based on the numerical scores associated with each response. Measures of statistical significance for comparisons between different subgroups of respondents were calculated using a two-sided Mann-Whitney U-test, and p-values reported here are based on this test statistic. We report effect sizes in pairwise comparisons using the common-language effect size, f, which is the probability that the response from a random sample from subgroup 1 is greater than the response from a random sample from subgroup 2. We also examined the effects of different modes of remote learning and technological platforms using ordinal logistic regression. With the exception of the mean values, all of these analyses treat Likert-scale responses as ordinal-scale, rather than interval-scale data.
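As a concrete illustration of these statistics, the sketch below computes the Mann-Whitney U statistic and the common-language effect size f for two small sets of Likert responses. The response lists are invented for the example and are not drawn from the survey data.

```python
from itertools import product

def mann_whitney_u(a, b):
    """U statistic for sample a vs. sample b: the number of pairs (x, y)
    with x from a, y from b, and x > y, counting ties as 1/2."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x, y in product(a, b))

def common_language_effect_size(a, b):
    """Common-language effect size f: the probability that a random
    response from a exceeds a random response from b (ties count 1/2).
    Equivalently, U normalized by the total number of pairs."""
    return mann_whitney_u(a, b) / (len(a) * len(b))

# Invented 5-point Likert responses (1 = Strongly Disagree ... 5 = Strongly Agree)
synchronous = [4, 5, 3, 4, 5, 4, 2, 5]
asynchronous = [3, 2, 4, 3, 2, 3, 4, 1]

u = mann_whitney_u(synchronous, asynchronous)
f = common_language_effect_size(synchronous, asynchronous)
print(f"U = {u}, f = {f:.3f}")  # f > 0.5 means the first group tends to score higher
```

In practice, a library routine such as scipy.stats.mannwhitneyu would also supply the two-sided p-value; the normalization f = U/(n1·n2) matches the effect sizes reported in the tables, and the computation treats the Likert responses as ordinal, comparing ranks rather than interval-scale differences.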
Results
Students Prefer Synchronous Class Sessions
Students were asked to identify their primary mode of learning given four categories of remote course design that emerged from the pilot survey and across literature on online teaching: live (synchronous) classes, recorded lectures and videos, emailed or uploaded materials, and chats and discussion forums. While 42.7% (n = 2,045) of students identified live classes as their primary mode of learning, 54.6% (n = 2,613) preferred this mode (Figure 1). Both recorded lectures and live classes were preferred over uploaded materials (6.22%, n = 298) and chat (3.36%, n = 161).
Figure 1. Actual (A) and preferred (B) primary modes of learning.
In addition to a preference for live classes, students whose primary mode was synchronous were more likely to enjoy the class, feel motivated and engaged, be satisfied with instruction and report higher levels of participation (Table 2 and Supplementary Figure 2). Regardless of primary mode, over two-thirds of students reported they are often distracted during remote courses.
Table 2. The effect of synchronous vs. asynchronous primary modes of learning on student perceptions.
Variation in Pedagogical Techniques for Synchronous Classes Results in More Positive Perceptions of the Student Learning Experience
To survey the use of passive vs. active instructional methods, students reported the pedagogical techniques used in their live classes. Among the synchronous methods, we identified three categories (National Research Council, 2000; Freeman et al., 2014). Passive methods (P) include lectures, presentations, and explanation using diagrams, whiteboards, and/or other media; these methods all rely on instructor delivery rather than student participation. The second category (A) represents active learning through primarily one-on-one interactions: in-class assessment, question-and-answer (Q&A), and classroom chat. The third category, group interactions (F), includes classroom discussions and small-group activities. Given these categories, Mann-Whitney U pairwise comparisons of Likert-scale responses across the seven possible combinations of categories showed that using a variety of methods resulted in higher ratings of experience than using a single method, regardless of whether that single method was active or passive (Table 3). Indeed, students whose classes used methods from each category (PAF) reported higher enjoyment, motivation, and satisfaction with instruction than those whose classes used any single method (p < 0.0001), and also reported higher participation and engagement than students whose classes used only passive methods (P) or only one-on-one active methods (A) (p < 0.00001). Student ratings of distraction were not significantly different in any comparison. Because sets of Likert responses often appeared significant together in these comparisons, we ran a CrossCat analysis to examine the probability of dependence across Likert responses. Responses have a high probability of dependence on each other, limiting what we can claim about any discrete response (Supplementary Figure 3).
Table 3. Comparison of combinations of synchronous methods on student perceptions. Effect size (f).
Mann-Whitney U pairwise comparisons were also used to check whether improvement in student experience was associated with the number of methods used rather than the variety of types of methods. For every comparison, we found that more methods resulted in higher scores on all Likert measures except distraction (Table 4). Even comparing four or fewer methods against more than four methods, there was a 59% chance that a student in the latter group enjoyed the course more (p < 0.00001) and a 60% chance that they felt more motivated to learn (p < 0.00001). Students who selected more than four methods (n = 417) were also more likely to report higher satisfaction with instruction (f = 0.651, p < 0.00001), engagement (f = 0.629, p < 0.00001), and active participation (f = 0.643, p < 0.00001). Therefore, there was overlap between how the number and the variety of methods influenced students’ experiences. Since the number of techniques per category is 2–3, we cannot fully disentangle the effect of number vs. variety. Pairwise comparisons of subsets of the data with 2–3 methods from a single group vs. 2–3 methods across groups controlled for this but had low sample numbers in most groups and yielded no significant findings (data not shown). Therefore, from the data we have in our survey, there seems to be an interdependence between the number and variety of methods in students’ learning experiences.
Table 4. Comparison of the number of synchronous methods on student perceptions. Effect size (f).
Variation in Asynchronous Pedagogical Techniques Results in More Positive Perceptions of the Student Learning Experience
Along with synchronous pedagogical methods, students reported the asynchronous methods that were used for their classes. We divided these methods into three main categories and conducted pairwise comparisons. Learning methods include video lectures, video content, and posted study materials. Interacting methods include discussion/chat forums, live office hours, and email Q&A with professors. Testing methods include assignments and exams. Our results again show the importance of variety in students’ perceptions (Table 5). For example, compared to providing learning materials only, providing learning materials, interaction, and testing improved enjoyment (f = 0.546, p < 0.001), motivation (f = 0.553, p < 0.0001), satisfaction with instruction (f = 0.596, p < 0.00001), engagement (f = 0.572, p < 0.00001), and active participation (f = 0.563, p < 0.00001) (row 6). Similarly, compared to interacting methods alone, the combination of all three categories improved five of the six indicators, all except distraction in class (row 11).
Table 5. Comparison of combinations of asynchronous methods on student perceptions. Effect size (f).
Ordinal logistic regression was used to assess the likelihood that the platforms students used predicted student perceptions (Supplementary Table 2). Platform choices were based on the answers to open-ended questions in the pre-test survey. The synchronous and asynchronous methods used were consistently more predictive of Likert responses than the specific platforms. Likewise, distraction continued to be our outlier with no differences across methods or platforms.
Students Prefer In-Person and Synchronous Online Learning Largely Due to Social-Emotional Reasoning
As expected, 86.1% (4,123) of survey participants report a preference for in-person courses, while 13.9% (666) prefer online courses. When asked to explain the reasons for their preference, students who prefer in-person courses most often mention the importance of social interaction (693 mentions), engagement (639 mentions), and motivation (440 mentions). These students are also more likely to mention a preference for a fixed schedule (185 mentions) vs. a flexible schedule (2 mentions).
In addition to identifying social reasons for their preference for in-person learning, students’ suggestions for improvements in online learning focus primarily on increasing interaction and engagement, with 845 mentions of live classes, 685 mentions of interaction, 126 calls for increased participation and calls for changes related to these topics such as, “Smaller teaching groups for live sessions so that everyone is encouraged to talk as some people don’t say anything and don’t participate in group work,” and “Make it less of the professor reading the pdf that was given to us and more interaction.”
Students who prefer online learning primarily identify independence and flexibility (214 mentions) and reasons related to anxiety and discomfort in in-person settings (41 mentions). Anxiety was only mentioned 12 times in the much larger group that prefers in-person learning.
The preference for synchronous vs. asynchronous modes of learning follows similar trends (Table 6). Students who prefer live classes mention engagement and interaction most often while those who prefer recorded lectures mention flexibility.
Table 6. Most prevalent themes for students based on their preferred mode of remote learning.
Discussion
Student Perceptions Align With Research on Active Learning
The first, and most robust, conclusion is that incorporation of active-learning methods correlates with more positive student perceptions of affect and engagement. We can see this clearly in the substantial differences on a number of measures, where students whose classes used only passive-learning techniques reported lower levels of engagement, satisfaction, participation, and motivation when compared with students whose classes incorporated at least some active-learning elements. This result is consistent with prior research on the value of active learning (Freeman et al., 2014).
Though research shows that student learning improves in active learning classes, on campus, student perceptions of their learning, enjoyment, and satisfaction with instruction are often lower in active-learning courses (Deslauriers et al., 2019). Our finding that students rate enjoyment and satisfaction with instruction higher for active learning online suggests that the preference for passive lectures on campus relies on elements outside of the lecture itself. That might include the lecture hall environment, the social physical presence of peers, or normalization of passive lectures as the expected mode for on-campus classes. This implies that there may be more buy-in for active learning online vs. in-person.
A second result from our survey is that student perceptions of affect and engagement are associated with students experiencing a greater diversity of learning modalities. We see this in two different results. First, in addition to the fact that classes that include active learning outperform classes that rely solely on passive methods, we find that on all measures besides distraction, the highest student ratings are associated with a combination of active and passive methods. Second, we find that these higher scores are associated with classes that make use of a larger number of different methods.
This second result suggests that students benefit from classes that make use of multiple different techniques, possibly invoking a combination of passive and active methods. However, it is unclear from our data whether this effect is associated specifically with combining active and passive methods, or if it is associated simply with the use of multiple different methods, irrespective of whether those methods are active, passive, or some combination. The problem is that the number of methods used is confounded with the diversity of methods (e.g., it is impossible for a classroom using only one method to use both active and passive methods). In an attempt to address this question, we looked separately at the effect of number and diversity of methods while holding the other constant. Across a large number of such comparisons, we found few statistically significant differences, which may be a consequence of the fact that each comparison focused on a small subset of the data.
Thus, our data suggest that using a greater diversity of learning methods in the classroom may lead to better student outcomes. This is supported by research on student attention span, which suggests varying delivery after 10–15 min to retain students’ attention (Bradbury, 2016). This is likely even more relevant for online learning, where students report high levels of distraction across methods, modalities, and platforms. Given that number and variety are key, and that there are few passive-learning methods, we can assume that some combination of methods that includes active learning improves student experience. However, it is not clear whether we should predict that this benefit would come simply from increasing the number of different methods used, or if there are benefits specific to combining particular methods. Disentangling these effects would be an interesting avenue for future research.
Students Value Social Presence in Remote Learning
Student responses across our open-ended survey questions show a striking difference in reasons for their preferences compared with traditional online learners, who prefer flexibility (Harris and Martin, 2012; Levitz, 2016). Students’ reasons for preferring in-person classes and synchronous remote classes emphasize the desire for social interaction and echo the research on the importance of social presence for learning in online courses.
Short et al. (1976) outlined Social Presence Theory, which describes the degree to which communicators in different telecommunication media perceive each other as real. These ideas translate directly to online education and pedagogy, particularly to educational design in networked learning, where connection among learners and instructors improves learning outcomes, especially through “Human-Human interaction” (Goodyear, 2002, 2005; Tu, 2002). They also bear on asynchronous vs. synchronous learning: Tu reports that students respond positively to synchronous “real-time discussion in pleasantness, responsiveness and comfort with familiar topics,” with real-time discussion edging out asynchronous computer-mediated communication in immediacy of replies and responsiveness. Tu’s research indicates that students perceive more interaction in synchronous media such as discussions because of this immediacy, which enhances social presence and supports the use of active-learning techniques (Gunawardena, 1995; Tu, 2002). Thus, verbal immediacy and communities with face-to-face interaction, such as synchronous classrooms, lessen the psychological distance between communicators online and can simultaneously improve instructional satisfaction and reported learning (Gunawardena and Zittle, 1997; Richardson and Swan, 2019; Shea et al., 2019). While synchronous learning may not be ideal for traditional online students and a subset of our participants, this research suggests that non-traditional online learners are more likely to appreciate the value of social presence.
Social presence also connects to the importance of social connections in learning. Too often, current systems of education emphasize course content in narrow ways that fail to embrace the full humanity of students and instructors (Gay, 2000). With the COVID-19 pandemic leading to further social isolation for many students, the importance of social presence in courses, including live interactions that build social connections with classmates and with instructors, may be increased.
Limitations of These Data
Our undergraduate data consist of 4,789 responses from 95 different countries, an unprecedented global scale for research on online learning. However, since respondents were followers of @unjadedjade, an account that focuses on learning and wellness, they may not represent the average student. Survey responses are often biased by the recruitment technique; ours likely yielded more robust and thoughtful answers to free-response questions and may have inflated the preference for synchronous classes. It is unlikely, however, to have affected students’ reports of the pedagogical methods used in their remote courses, since those are outside student control.
Though we surveyed a global population, our design was rooted in literature assessing pedagogy in North American institutions. Therefore, our survey may not represent a global array of teaching practices.
This survey was sent out during the initial phase of emergency remote learning for most countries. This has two important implications. First, perceptions of remote learning may be clouded by complications of the pandemic which has increased social, mental, and financial stresses globally. Future research could disaggregate the impact of the pandemic from students’ learning experiences with a more detailed and holistic analysis of the impact of the pandemic on students.
Second, instructors, students and institutions were not able to fully prepare for effective remote education in terms of infrastructure, mentality, curriculum building, and pedagogy. Therefore, student experiences reflect this emergency transition. Single-modality courses may correlate with instructors who lacked the resources or time to learn or integrate more than one modality. Regardless, the main insights of this research align well with the science of teaching and learning and can be used to inform both education during future emergencies and course development for online programs that wish to attract traditional college students.
Global Student Voices Improve Our Understanding of the Experience of Emergency Remote Learning
Our survey shows that global student perspectives on remote learning agree with pedagogical best practices, breaking with the frequently documented negative reactions of students to these practices in traditional classrooms (Shekhar et al., 2020). Our analysis of open-ended questions and preferences shows that a majority of students prefer pedagogical approaches that promote both active learning and social interaction. These results can serve as a guide to instructors as they design online classes, especially for students whose first choice may be in-person learning. Indeed, with the near-ubiquitous adoption of remote learning during the COVID-19 pandemic, remote learning may become the default for colleges during temporary emergencies. This approach has already been used at the K-12 level as snow days become virtual learning days (Aspergren, 2020).
In addition to informing pedagogical decisions, the results of this survey can be used to inform future research. Although we survey a global population, our recruitment method selected for students who are English speakers, likely majority female, and have an interest in self-improvement. Repeating this study with a more diverse and representative sample of university students could improve the generalizability of our findings. While the use of a variety of pedagogical methods is better than a single method, more research is needed to determine what the optimal combinations and implementations are for courses in different disciplines. Though we identified social presence as the major trend in student responses, the over 12,000 open-ended responses from students could be analyzed in greater detail to gain a more nuanced understanding of student preferences and suggestions for improvement. Likewise, outliers could shed light on the diversity of student perspectives that we may encounter in our own classrooms. Beyond this, our findings can inform research that collects demographic data and/or measures learning outcomes to understand the impact of remote learning on different populations.
Importantly, this paper focuses on a subset of responses from the full data set which includes 10,563 students from secondary school, undergraduate, graduate, or professional school and additional questions about in-person learning. Our full data set is available here for anyone to download for continued exploration: doi: 10.7910/DVN/2TGOPH.
Data Availability Statement
The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.
Ethics Statement
Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. The patients/participants provided their written informed consent to participate in this study.
Author Contributions
GS: project lead, survey design, qualitative coding, writing, review, and editing. TN: data analysis, writing, review, and editing. CN and PB: qualitative coding. JW: data analysis, writing, and editing. CS: writing, review, and editing. EV and KL: original survey design and qualitative coding. PP: data analysis. JB: original survey design and survey distribution. HH: data analysis. MP: writing. All authors contributed to the article and approved the submitted version.
Conflict of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Acknowledgments
We want to thank Minerva Schools at KGI for providing funding for summer undergraduate research internships. We also want to thank Josh Fost and Christopher V. H.-H. Chen for discussion that helped shape this project.
Supplementary Material
The Supplementary Material for this article can be found online at:
References
Bostock, S. J. (1998). Constructivism in mass higher education: a case study. Br. J. Educ. Technol. 29, 225–240. doi: 10.1111/1467-8535.00066
Chen, B., Bastedo, K., and Howard, W. (2018). Exploring best practices for online STEM courses: active learning, interaction & assessment design. Online Learn. 22, 59–75. doi: 10.24059/olj.v22i2.1369
Davis, D., Chen, G., Hauff, C., and Houben, G.-J. (2018). Activating learning at scale: a review of innovations in online learning strategies. Comput. Educ. 125, 327–344. doi: 10.1016/j.compedu.2018.05.019
Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., and Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. 116, 19251–19257. doi: 10.1073/pnas.1821936116
Fink, L. D. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. Somerset, NJ: John Wiley & Sons, Incorporated.
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. 111, 8410–8415. doi: 10.1073/pnas.1319030111
Gale, N. K., Heath, G., Cameron, E., Rashid, S., and Redwood, S. (2013). Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med. Res. Methodol. 13:117. doi: 10.1186/1471-2288-13-117
Garrison, D. R., Anderson, T., and Archer, W. (1999). Critical inquiry in a text-based environment: computer conferencing in higher education. Internet High. Educ. 2, 87–105. doi: 10.1016/S1096-7516(00)00016-6
Gay, G. (2000). Culturally Responsive Teaching: Theory, Research, and Practice. Multicultural Education Series. New York, NY: Teachers College Press.
Gillingham, and Molinari, C. (2012). Online courses: student preferences survey. Internet Learn. 1, 36–45. doi: 10.18278/il.1.1.4
Gillis, A., and Krull, L. M. (2020). COVID-19 remote learning transition in spring 2020: class structures, student perceptions, and inequality in college courses. Teach. Sociol. 48, 283–299. doi: 10.1177/0092055X20954263
Goodyear, P. (2002). “Psychological foundations for networked learning,” in Networked Learning: Perspectives and Issues. Computer Supported Cooperative Work, eds C. Steeples and C. Jones (London: Springer), 49–75. doi: 10.1007/978-1-4471-0181-9_4
Goodyear, P. (2005). Educational design and networked learning: patterns, pattern languages and design practice. Australas. J. Educ. Technol. 21, 82–101. doi: 10.14742/ajet.1344
Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. Int. J. Educ. Telecommun. 1, 147–166.
Gunawardena, C. N., and Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. Am. J. Distance Educ. 11, 8–26. doi: 10.1080/08923649709526970
Harris, H. S., and Martin, E. (2012). Student motivations for choosing online classes. Int. J. Scholarsh. Teach. Learn. 6, 1–8. doi: 10.20429/ijsotl.2012.060211
Levitz, R. N. (2016). 2015-16 National Online Learners Satisfaction and Priorities Report. Cedar Rapids: Ruffalo Noel Levitz, 12.
Mansinghka, V., Shafto, P., Jonas, E., Petschulat, C., Gasner, M., and Tenenbaum, J. B. (2016). CrossCat: a fully Bayesian nonparametric method for analyzing heterogeneous, high dimensional data. J. Mach. Learn. Res. 17, 1–49. doi: 10.1007/978-0-387-69765-9_7
National Research Council (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academies Press, doi: 10.17226/9853
Richardson, J. C., and Swan, K. (2019). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Online Learn. 7, 68–88. doi: 10.24059/olj.v7i1.1864
Shea, P., Pickett, A. M., and Pelz, W. E. (2019). A Follow-up investigation of ‘teaching presence’ in the suny learning network. Online Learn. 7, 73–75. doi: 10.24059/olj.v7i2.1856
Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., and Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: a systematic review of underlying reasons. J. Coll. Sci. Teach. 49, 45–54.
Short, J., Williams, E., and Christie, B. (1976). The Social Psychology of Telecommunications. London: John Wiley & Sons.
Tu, C.-H. (2002). The measurement of social presence in an online learning environment. Int. J. E Learn. 1, 34–45. doi: 10.17471/2499-4324/421
Zull, J. E. (2002). The Art of Changing the Brain: Enriching Teaching by Exploring the Biology of Learning, 1st Edn. Sterling, VA: Stylus Publishing.