
What role can Gameful Pedagogy play in online courses?

COVID-19 caught everyone off guard in 2020. Suddenly, all classes had to be held online, and instructors and students had to adapt quickly with minimal support. Now, with time to reflect on those experiences, faculty are asking what methods can keep students engaged and motivated in an online or virtual environment.

Gameful pedagogy is one approach the Center for Academic Innovation uses to increase student engagement. This method of course design takes inspiration from how good games function and applies those lessons to the design of learning environments.

One key goal of gameful pedagogy, as one might guess, is leveraging student motivation. To achieve that, course designers draw on elements of Self-Determination Theory, or SDT for short. This theory centers the power of intrinsic motivation as a driver of behavior. It sits on three primary pillars: autonomy (the power of choice a learner can have in their learning experience), competency (a feeling of accomplishment derived from completing a challenge), and belongingness (a feeling of being included and heard by the environment one is in or the people around them) (Deci & Ryan, 2000). 

Yet gameful pedagogy isn’t just about SDT. Practitioners also favor an additive point-based grading system over traditional grading. In traditional deductive, percentage-based grading, learners start at 100% and lose points as the course goes on, which runs counter to what learning is about.
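The contrast between the two schemes can be sketched in a few lines of code; all point values, thresholds, and grade cutoffs below are made up for illustration, not drawn from any particular course:

```python
# Illustrative sketch of deductive vs. additive grading.
# All numbers here are invented for demonstration.

def deductive_percentage(deductions):
    """Traditional scheme: start at 100% and lose points for mistakes."""
    return 100 - sum(deductions)

# Hypothetical additive thresholds: accumulate points toward a goal.
GRADE_THRESHOLDS = [(900, "A"), (800, "B"), (700, "C")]

def additive_grade(earned_points):
    """Additive scheme: start at 0 and work upward to a target grade."""
    for cutoff, grade in GRADE_THRESHOLDS:
        if earned_points >= cutoff:
            return grade
    return "not yet at goal"

print(deductive_percentage([5, 10, 3]))   # 82
print(additive_grade(830))                # B
```

The additive version never takes anything away: a wrong answer simply earns fewer points, leaving room to make them up later.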

In a gameful course, learners are treated as novices when they first start a learning journey, so they start from zero and work their way up to their goals. This also gives learners the freedom to fail. From a gameful point of view, it is unfair to expect learners to be “perfect” in learning environments: mistakes are common in learning, and they are great growth opportunities. Therefore, gameful design favors learning environments that leave space for learners to explore and offer chances to make up for mistakes. This freedom does not mean creating an out-of-control environment, however. Educators can still apply constraints, such as assigning different point values or requiring the completion of certain tasks to unlock others, to ensure that students are working toward the learning goals.

All of these approaches and more make up gameful pedagogy, and this course design method has been used in a wide range of classes, from higher education down to K-12. However, most use cases occurred in person before the 2020 COVID outbreak. Does gameful pedagogy also work in online environments?

That turns out to be a great question for Pete Bodary, clinical associate professor of applied exercise science and movement science in the School of Kinesiology.  He has taught gameful courses for several years, including MOVESCI 241. This course teaches body mass regulation assessments, principles, and strategies. It is constructed with an additive point-based grading scheme, all-optional assignments (a student has the autonomy to complete any combination of assignments to get to their desired grade/goal), a strong supportive network, and real-world relevant topics (diabetes, disordered eating, weight control, supplements and safety, etc.). 

To maintain all assignments as optional while ensuring that students are on track to the learning objectives, Bodary assigns significantly more points to certain assignments to encourage completion. Some assignments include personal dietary intake and physical activity tracking, case studies, participation and reflections on dietary and physical challenges, and more. 

In Winter 2023, he decided to give students more freedom in how they engaged with class lectures on top of the existing setup. Students could choose from three distinct sections: in-person, synchronous virtual, or asynchronous virtual. In the in-person section, students were required to attend lectures in person. In the synchronous virtual section, students could participate online while the lectures were live-streamed. The asynchronous virtual section allowed students the freedom to watch lecture recordings at their convenience, without the obligation to attend lectures in real time.

Did students in different sections perform differently in this course? The short answer is no, not significantly.

“Those who are remote do not have the ease of popping out a question, [meaning the ability to raise their hand and spontaneously ask questions], so that is one difference to consider. However, we maintain a pretty active [asynchronous] Q/A space. I don’t believe that they ‘performed’ differently,” Bodary said.   

Students engage with the course content differently, but they are all motivated and learning in their own way. In fact, to find out what motivates students in this course, Bodary deployed a U-M Maizey project. U-M Maizey is a generative AI customization tool that allows faculty, staff, and students to build their own U-M GPT chatbot trained on a custom dataset. Bodary set up Maizey in the Fall 2023 term for the same course with a similar structure and prompted Maizey: What is the primary motivation of students?

By evaluating students’ activity data, Maizey summarized that students are primarily motivated by finding course materials relatable and beneficial to improving their own and their loved ones’ health and well-being, connecting issues they encounter in daily life to class content, and applying course content to real-world problems.

Looking at this example, three key characteristics emerge: controlled freedom for students to choose how to engage with the course, opportunities for students to make personal connections with course content, and possibilities for students to apply course content in real-world situations. 

Tying these characteristics back to gameful pedagogy, there is alignment between them and the three components of SDT – autonomy, belongingness, and competency. Furthermore, the additive grading system and all-optional assignment design support student exploration and agency to choose assignments and coursework.  The course format, whether in-person or online, didn’t impact students’ motivation. Instead, the fact that students can choose their own way to participate in the class may motivate them even more. 

What’s important here isn’t modality (online, in-person, or asynchronous) but rather the content and design of the course. The success of MOVESCI 241 hinges on a carefully designed course where students can successfully meet the learning goals regardless of how they engage. The design of MOVESCI 241 is gameful, but not all gameful courses are designed this way. If you want to use gameful pedagogy to increase engagement in your course, you can start with these steps. You can also check out GradeCraft, a learning management system (LMS) built at the center to support gameful courses. Some key features that make GradeCraft a perfect companion for gameful courses are its additive grading system, its mechanisms for tracking tangible progress (points planner, levels, unlocks, and badges), and its flexibility (it is highly tailorable for both instructors and students). Finally, if you want to learn more about gameful pedagogy or GradeCraft, please email us at [email protected], and staff will be happy to set up a conversation with you.

References:

Deci, E. L., & Ryan, R. M. (2000). The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11(4), 227–268.

Educators can use generative AI to transform dense, technical material into clear, easily understandable content. This improves students’ comprehension and makes the learning experience more inclusive for a wider audience. While students are growing in their knowledge of complex academic topics, academic terminology can sometimes be a barrier. Particularly early in the course, students may not yet be familiar with the jargon and language of your subject matter. In addition, you may have learners in your course with a wide range of educational and cultural backgrounds. Some of your students may be from countries outside of the United States, and English may not be their first language. By demystifying complex concepts, jargon, and metaphors with generative AI, educators can create more equitable and effective learning environments for a diverse array of learners.

For example, you can use the following example prompt to get started: 

In this prompt, we are asking ChatGPT to rewrite text to an 8-10th-grade reading level on the Flesch-Kincaid Grade Scale. This is the reading level recommended for a general adult lay audience. Feel free to adjust this to fit your target audience. 
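The Flesch-Kincaid Grade Level itself is a simple formula, 0.39 × (words ÷ sentences) + 11.8 × (syllables ÷ words) − 15.59, so you can roughly check a passage yourself. Here is a minimal sketch; the vowel-group syllable counter is a crude heuristic, so treat the results as approximate:

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid Grade Level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

print(flesch_kincaid_grade("The cat sat on the mat. It was warm."))
print(flesch_kincaid_grade(
    "Neurotransmitters facilitate intercellular "
    "communication across synaptic junctions."))
```

Short, plain sentences score at a low grade level, while jargon-heavy academic prose scores far above the 8-10th-grade target, which is exactly the gap the rewriting prompt is meant to close.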

Example: An Online Course on Neuroscience

Drafting

Now imagine that you are a renowned neuroscientist and a highly regarded faculty member at Michigan Medicine. You are interested in developing an online course that will bring neuroscience concepts to a lay audience. You are excited to get started, but as you begin to develop content, you quickly realize that your typical content is aimed at seasoned medical students and filled with jargon that may be daunting to those without prior knowledge. You realize that generative AI may be able to assist you in breaking down concepts into simpler terms. 

You fill in the example prompt with some of the text from one of your old in-person presentations with key concepts that you would like to include in this online course: 

In response to your input, ChatGPT gives you the following output: 

In this example, ChatGPT keeps all of the main concepts intact while using simpler language, providing definitions of terminology used (rather than removing it entirely), and breaking the large paragraph into more digestible, smaller paragraphs or chunks. 

Refining 

As the content expert, it is important to read through the output and ensure that all key concepts remain intact. It is also up to you to determine whether the revisions are sufficient and appropriate for your audience. You may choose to ask for stylistic revisions as well. For example, ChatGPT wrote the text as though the course is currently happening, but you plan to deliver this information at the beginning of the course to preview what the learner will learn, so you may prefer a different tense.

You can ask ChatGPT to revise with the following: 

ChatGPT will then go through and make the requested revisions to the text using the appropriate tense that you indicated in your input: 

Continue to refine as needed. Consider feeding examples of your tone of voice into the chat so that the content is not only accessible for learners but also retains a human element. In addition, you can raise the target reading level as your students grow in their knowledge of the subject.

Echoes of “Can we have a study guide?” still reverberate through the virtual classrooms, even as summer takes hold and the allure of relaxation sets in. Study guides offer a temporary solution to students’ hunger for knowledge, providing them with the fish they need to satisfy their immediate needs. This approach, however, creates a cycle of dependency, requiring another fix before the next test opens. This is not the way. Instead of spoon-feeding, students should be taught to fish.

Though study guides have their merits, their direct impact on learning is not always evident. Tests can be a significant source of stress for students, which in turn hampers their performance. Study guides can help alleviate this anxiety and improve exam scores (Dickson, Miller, & Devoley, 2005), but they don’t necessarily foster deep, long-term learning. If the goal is to guide students’ online study habits before a test, then they should receive guidance not only on what to study but also on how to study effectively. Problem Roulette is the way.

Problem Roulette is an invaluable personalized online learning tool that directs students’ attention to the study skills that work best for them. It offers a collection of previous test items for students to practice with and, starting this Fall 2023, will begin providing tailored study tips based on proven theory and algorithms designed to enhance test performance. In essence, Problem Roulette will not only feed but also teach students to fish. It will give them the confidence boost they crave through exposure to test-like items, while teaching them personally relevant study skills that can be applied to new situations. 

How will Problem Roulette work in online learning environments? In short, it will harness the power of gameful learning. As students engage with practice test items, the system will collect statistics on their performance, which will then be visualized and presented on a student-facing dashboard. This feedback will include information on the number of problems completed and the number of consecutive correct answers. These metrics will be compared with predefined volume and streak goals established through previous research (Black et al., 2023), known to maximize course performance. Consequently, the game for students becomes achieving their target volume and streak goals, which intrinsically incentivizes their study. To attain these goals, however, they must study effectively by consistently answering questions correctly in a row. As students strive to meet their volume and streak targets, they will simultaneously discover the study habits that yield the best results for them individually.
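As a sketch of the bookkeeping described above, here is how volume and streak metrics might be computed from a student's answer log. The goal thresholds here are illustrative placeholders, not Problem Roulette's actual targets:

```python
def study_metrics(answers, volume_goal=100, streak_goal=7):
    """Summarize an answer log (list of booleans: True = correct)
    into dashboard-style metrics.

    volume_goal and streak_goal are invented placeholders, not
    Problem Roulette's real research-derived targets.
    """
    volume = len(answers)
    best_streak = streak = 0
    for correct in answers:
        streak = streak + 1 if correct else 0
        best_streak = max(best_streak, streak)
    return {
        "volume": volume,
        "best_streak": best_streak,
        "volume_goal_met": volume >= volume_goal,
        "streak_goal_met": best_streak >= streak_goal,
    }

log = [True, True, False, True, True, True, True]
print(study_metrics(log, volume_goal=10, streak_goal=4))
```

Note that the streak resets on a wrong answer: a long streak can only come from consistently correct responses, which is what makes it a proxy for effective study rather than raw effort.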

In the realm of online teaching, Problem Roulette emerges as an empowering force, equipping students with the skills they need to become self-sufficient learners. It shifts the focus from mere information consumption to active engagement, encouraging students to take charge of their own learning journey. By embracing Problem Roulette, educators can foster a generation of online students who not only excel academically but also possess the essential skills to adapt, learn, and thrive in the digital age.

Generative AI can be a valuable asset to instructors looking for assistance with creating various aspects of course design. For example, generative AI, such as ChatGPT, can be a valuable tool for educators in drafting learning objectives. Using GenAI in any setting is usually a process of drafting and then refining prompts until the desired result is achieved. In this article, we will outline some ways to generate and refine learning objectives for a course.

Learning objectives are concise statements that articulate what students are expected to learn or achieve in a course. They play a crucial role in guiding both teaching strategies and assessment methods, ensuring that educational experiences are focused and effective. Clear and well-defined learning objectives are essential for aligning educational activities with desired learning outcomes. By analyzing a vast array of educational content and pedagogical methods in its training data, AI can offer a wide range of learning objective recommendations, which educators can then build off of, using their knowledge as experts in the field. 


Here is an example prompt that you can use with your preferred GenAI tool to get started:

This example prompt can be modified to fit your needs. For example, you may choose to add more ideas and give additional context about the course. The more detail and context you provide in your input, the better the AI output will be. So please feel free to add in outlines, syllabi, or any other materials that may help your GenAI assistant better understand your vision. 
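Since prompts like this are just text templates filled with course context, the fill-in step can be sketched in code. The wording and placeholders below are our own illustration, not the article's actual example prompt:

```python
# Hypothetical prompt template -- an illustration in the spirit of
# the article's example, not its actual wording.
PROMPT_TEMPLATE = (
    "You are an instructional design assistant. Draft {n} measurable "
    "learning objectives for an online course on {topic}. "
    "The course should cover: {key_ideas}. "
    "Use action verbs, and keep each objective to one sentence."
)

def build_prompt(topic, key_ideas, n=5):
    """Fill the template with course-specific context."""
    return PROMPT_TEMPLATE.format(
        n=n, topic=topic, key_ideas=", ".join(key_ideas))

print(build_prompt("the Cold War",
                   ["causes", "major events", "overall impact"]))
```

Treating the prompt as a template makes it easy to reuse across courses: swap in a new topic and list of key ideas, and optionally paste in an outline or syllabus as extra context.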

Example: An Online Course on the Cold War

Drafting Objectives

Now that we have our example prompt, let’s see it in action. Imagine you are an instructor for an introductory online course on the Cold War. You plan to use ChatGPT to generate ideas for potential learning objectives to get you started and guide your curriculum creation. You already have some general ideas of what you want to cover: causes, major events, and overall impact. You fill in the prompt like so:

You press enter and ChatGPT provides you with the following learning objectives: 

Refining

It is now up to you as the expert to determine which learning objectives are the most relevant and how you should go about revising them. For example, you may look at the list and notice that there are no learning objectives that ask the learners to create something with the knowledge they’ve acquired throughout the course (e.g., a final project). You return to ChatGPT and ask the following: 

In response, ChatGPT provides you with the following: 

If you disagree with this suggestion, you can reply with “More?” to get additional ideas. ChatGPT will then provide you with a longer list: 

You can repeat this process as often as you’d like – adjusting the prompt and adding additional context (e.g., outlines, key ideas, information about your teaching style) to get better responses. When formulating responses for you, ChatGPT looks at the entire chat log, so it is recommended that you continue to add to the same chat for best results.
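This works because chat models are stateless between requests: the client resends the accumulated message list on every turn. A minimal sketch of that pattern, where `ask_model` is a hypothetical stand-in for a real API call:

```python
# Sketch of why staying in one chat helps: the full message history
# travels with every request. `ask_model` is a hypothetical stand-in
# for a real model call, used here so the example is self-contained.

def ask_model(messages):
    # A real call would send `messages` to the model and return its
    # reply text; this placeholder just reports the context size.
    return f"(reply informed by {len(messages)} prior messages)"

history = []

def chat(user_text):
    history.append({"role": "user", "content": user_text})
    reply = ask_model(history)  # the entire log goes with each turn
    history.append({"role": "assistant", "content": reply})
    return reply

chat("Draft learning objectives for a Cold War course.")
chat("Add an objective where learners create a final project.")
print(len(history))  # 4 messages: both turns are kept as context
```

Starting a fresh chat resets `history` to an empty list, which is why a new conversation "forgets" the outlines and preferences you supplied earlier.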

In our next article, we’ll explore how to use Generative AI to improve accessible language in your course.

Introduction

Education is undergoing a significant transformation as generative artificial intelligence continues to develop at a rapid pace. It is now easier than ever for educators to experiment with generative AI in their practice and see for themselves how generative AI can be leveraged during the course development process to brainstorm, synthesize, and draft everything from communications to students to learning objectives.

Generative AI: The Basics

Before experimenting with Generative AI (GenAI), it is helpful to have some high-level foundational knowledge of how GenAI works. Essentially, GenAI functions using advanced machine learning algorithms, specifically neural networks, which are loosely inspired by how the human brain processes information. These networks are trained on large datasets, enabling them to learn language patterns, nuances, and structures. As a result, GenAI can produce contextually relevant and coherent content, a capability exemplified in tools like ChatGPT.

To better understand how GenAI tools like ChatGPT work, let’s look at a breakdown of the acronym “GPT”: 

GPT stands for “Generative Pre-trained Transformer.” It is a type of artificial intelligence model designed for natural language processing tasks. “Generative” refers to its ability to generate text based on a combination of the data it was trained on and your inputs. It can compose sentences, answer questions, and create coherent and contextually relevant paragraphs. 

The term “Pre-trained” indicates that the model has undergone extensive training on a vast dataset of text before it is fine-tuned for specific tasks. This pre-training enables the model to understand and generate human-like text. 

Finally, “Transformer” is the name of the underlying architecture used by GPT. Transformers are a type of neural network architecture that has proven especially effective for tasks involving understanding and generating human language due to their ability to handle sequences of data, such as sentences, and their capacity for parallel processing, which speeds up the learning process. 
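To make "generates text based on the data it was trained on" concrete, here is a toy next-word predictor. A bigram model is vastly simpler than a transformer, but it illustrates the same core idea of predicting the next token from what came before (the tiny "corpus" is our own invention, purely for illustration):

```python
from collections import defaultdict, Counter

# Toy "training data" -- a real model trains on billions of words.
corpus = "the model predicts the next word and the next word follows the last"

# "Training": count which word follows which.
follows = defaultdict(Counter)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def generate(start, length=5):
    """Greedy generation: always pick the most frequent next word."""
    out = [start]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))
```

A real GPT differs in scale and mechanism (it conditions on the whole context window via attention rather than just the previous word, and samples rather than always taking the top choice), but the task, predicting plausible next tokens from learned statistics, is the same.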

The GPT series, developed by OpenAI, has seen several iterations, with each new version showing significant improvements in language understanding and generation capabilities. Some of these improvements come from training on user interactions: OpenAI has been transparent that your data may be used to improve model performance, and you can choose to opt out by following the steps that will be outlined in upcoming articles on how to use GenAI tools for course design, learning objectives, and more.

Does it matter which GenAI Tool I use?

Not really. Individuals may prefer one tool over another based on response speed or comfort with the interface, and you may wish to choose a tool that lets you opt out of having your data used for training. Beyond that, most GenAI tools are broadly similar.

Next Steps and Considerations

In educational contexts, the incorporation of GenAI tools such as ChatGPT could reshape our approach to content creation and improve efficiency for educators who often find themselves pressed for time. However, it is important to acknowledge the technology’s limitations, such as potential biases, outdated information due to training data cutoffs, and incorrect information, often referred to as “hallucinations.” It is vital that you always fact-check and revise GenAI outputs to maintain the integrity and high quality of your content.

In conclusion, by leveraging GenAI tools like ChatGPT, educators can navigate course design with greater ease and efficiency. From drafting learning objectives and engaging course titles to simplifying complex academic language and brainstorming assessments, GenAI has the potential to be an invaluable asset to your design work. However, it is critical to remember that these tools come with limitations, including potential biases and inaccuracies. By combining the strengths of GenAI with the expertise and critical oversight of educators, we can efficiently create new experiences for our learners.

Introduction

If you have been anywhere that teaching is involved, you have probably heard mention of “learning styles.” “I’m a visual learner” vs. “I’m a hands-on learner” or “My instructor didn’t teach in my learning style” are the kinds of comments that come up when individuals talk about their own learning. Although it is appealing to sort individuals into tidy categories of learning, the idea is deeply flawed, has little empirical evidence to support it, and may cause more problems than it solves.

What are learning styles?

To best understand why learning styles are problematic, it is important to clearly define them. The idea of learning styles is that there are stable, consistent ways in which individuals take in, organize, process, and remember information, and that by teaching to those ways, we help students learn better.

One popular concept in learning styles posits that the modality of information is critical: a “visual” learner learns best by seeing, versus an “auditory” learner who learns best by having things spoken or described to them. Learning style theory would suggest that by using visual aids, a visual learner would organize and retain information better than, say, an auditory learner. The implication is that matching the modality of instruction to the modality of a student’s learning style is critical to student success.

At face value, the concept of learning styles makes sense. Individuals learn differently. Most educational settings are trying to reach large numbers of students in personalized ways.  It would be useful to have an easily applied theory that would help all students learn! As educators, we want to recognize the “uniqueness” of each student and help learners in any way we can. This desire has led educators to look for easier ways to navigate the complexities of teaching. Unfortunately, learning is not that simple.

Do learning styles really exist?

In general, most learning style theories make two presumptions: 

  1. Individuals have a measurable and consistent “style” of learning, and 
  2. Teaching to that style of learning will lead to better education outcomes, and conversely, teaching in a contradictory method would decrease achievement. 

In other words, if you are a visual learner, you should learn best if you see things, regardless of the situation. If you are a kinesthetic learner, you will learn best if you can physically manipulate something, regardless of the topic. However, neither of these two assumptions shows any grounding in research. These two propositions are where we can see the concept of learning styles breaking down.

Are learning styles measurable and consistent?

Did you know that there are actually over 50 different theories of learning styles from various researchers? Researchers have been trying for years to find reliable links between individual differences and better learning. Some theories suggest the modality of learning matters (like the common VARK theory), while others propose that details like time of day and the temperature of the room define a learning style. One study even suggested that using a cell phone was a learning style (Pursell, 2009). The sheer number of different styles makes it difficult to measure and make sense of an individual style.

In addition, most learning style inventories rely on a student’s self-report about how they perceive they learn best. These self-reports are generally not validated in any way, and humans tend to be poor judges of their own learning. Therefore, these surveys are generally measuring “learner preference” rather than “learning style.” You may think you are an auditory learner, but until it is validated that you objectively learn better through an audio format, it is a preference, not a style.

Also, when reporting results, many studies rely on “student satisfaction” or students’ reflections as a measure of success in a class. For example, many measures of learning styles ask students how they believe they learn best. Unfortunately, satisfaction with a class or a student’s recollection of success is a subjective measure, and generally not an accurate one (Kirschner & van Merriënboer, 2013; Kirschner, 2017). While understanding a learner’s preference is useful, just as understanding student satisfaction with a lesson is, neither carries enough weight to necessitate teaching to that preference.

Finally, “styles” are unstable and unreliable. The research on learning styles suggests that these preferences may be unstable: they may be topic-specific, and they also change over time (Coffield et al., 2004). That means that although an individual may be a kinesthetic learner in history this week, that person may be a visual learner in math when studying calculus (but not geometry), or may prefer to learn how to ride a bike kinesthetically instead of reading about it in a book. This calls into question whether a learning style is a “trait” (something stable and persisting for a person) or a “state” (something temporary that may change). Learning styles as a state of mind are not particularly useful: how can a teacher know the preference of an individual student today in a given subject?

Does teaching a learning style result in better learning?

Even more important, however, is the second assumption: does teaching to an individual’s learning style lead to achievement? Simply put, there is no evidence that teaching to a person’s specified learning style results in better learning (Alley et al., 2023; Cuevas, 2015; Kirschner & van Merriënboer, 2013; Krätzig & Arbuthnott, 2006; Pashler et al., 2008; Rogowsky et al., 2020). No study has shown that teaching to an identified learning style results in better retention, better learning outcomes, or student success. Instead, we see that teaching to a self-identified learning style has no impact on learning in children or adults (Krätzig & Arbuthnott, 2006; Pashler et al., 2008; Rogowsky et al., 2015; Rogowsky et al., 2020). Some research even suggests that some students performed better on tasks when taught in a different modality than their self-identified “learning style” (Krätzig & Arbuthnott, 2006; Rogowsky et al., 2020). Moreover, most studies of learning styles present multiple styles to all learners, meaning there is no way to isolate learning style from teaching method. This leads us to conclude that while the concept of learning styles is appealing, at this point it is still a myth.

Alternate explanations to learning styles

Anecdotally, there are many stories about the success of leveraging “learning styles.” If learning styles are not empirically supported, how are these successes explained? There are alternative explanations for why teaching with multiple methods increases achievement that do not sort students into style categories. Multi-modal learning explains how learning improves with varied methods of teaching.

Learning requires sustained attention. Therefore, if an educator can capture and maintain students’ attention, students’ learning outcomes likely improve. Providing engagement with content in multiple forms, whether through hands-on activities or different modalities, makes students pay attention to content in different ways and requires learners to integrate knowledge in new ways. If an educator is using multiple methods and modalities, the class is simply more interesting, and students pay more attention, which leads to better learning. Mayer and colleagues (2001, 2003) have extensively studied how students learn with visuals and audio, and the interaction of the two. They suggest that providing dual streams of information in multiple methods engages learners to work harder at understanding the material, which leads to better learning. It may be that the research on learning styles is actually showing that teaching with different modalities is simply more interesting to students, rather than catering to a particular style of learning (Krätzig & Arbuthnott, 2006).

Why learning styles are dangerous

While the intentions behind learning styles are good, their implications are more destructive than helpful. On the positive side, reflecting on how one learns is always a useful exercise. However, focusing on a style suggests that learners are passive vessels at the whim of the method of teaching. Ultimately, most educators want students to actively engage in their learning. The best learning takes place when individuals can connect and incorporate information into their personal experiences and understanding. By focusing on a student’s learning style, we reinforce a simplistic view of learning: that individuals have one way to learn best. Unfortunately, learning is complex; it is hard and takes time. It has very little to do with the way information is handed to a learner and much more to do with how the learner processes that knowledge. It is important to remember: learning is within the control of the learner.

Thinking critically about learning styles

If learning styles do not impact an individual’s ability to learn, why is there so much talk about them? Articles and books are still being published about learning styles and how to tailor teaching to reach every style. Research on teaching and learning is a complicated discipline, and being able to examine theories and concepts like learning styles critically is important to anyone working in education. The challenge is to keep a skeptical eye when you hear about research supporting learning styles and ask the right questions to make sure you are getting good information.

What should you think about the next time you encounter learning styles in the wild?

  1. What framework of learning styles are they referring to? Some are more empirically vetted than others. The most popular framework, VARK (Visual-Auditory-Read/Write-Kinesthetic), is also the least validated. Find out more about the learning style being discussed.
  2. How are they measuring both learning style and success? Are they self-reported? Are they looking at academic results or a self-report of satisfaction with learning?
  3. Is the study carefully controlled? Many studies fail to tailor the learning to a particular style; instead, the lesson uses all the styles to reach all the students, leaving no way to truly measure whether matching instruction to style made a difference.
  4. Learning styles remain controversial. They aren’t necessarily harmful if they encourage people to reflect on teaching and learning in different ways. They can be harmful if students come to believe that their learning is outside their control.


You may hear different terminology as you begin online teaching. The following are some working definitions to help differentiate the terms used when discussing teaching leveraging technology.

University of Michigan Registrar defined instruction-mode definitions:

In-Person: Indicated by a (P) on the course listing, in-person classes are the traditional face-to-face classes. Instructors and students meet at a designated time in a designated place each week.

Online: Indicated by a (D) by the registrar, online classes do not meet in person. All learning activities take place online.  Online classes can be:

  • Synchronous: There are at least some designated times for students and instructors to meet simultaneously in a tool like Zoom. Synchronous classes specify their meeting days and times in the registrar. Online synchronous classes are generally structured like an in-person class, except in a video conference format; however, there is often more asynchronous work as well.
  • Asynchronous: There is no requirement for simultaneous meetings. Most of the interactions between students and instructors happen through other communication tools. Lectures may be pre-recorded. While video conferencing may be utilized, it is generally an optional component.

Hybrid: Indicated by an (M) by the registrar (for Mixed), these classes have both a required in-person component and an online component. Hybrid classes could meet once a month (rather than once a week), or have one class online and one class in person each week. Meeting days/times need to be specified.

Other Online Teaching Definitions:

  • Hy-flex: A course where students can choose on a day-by-day basis whether to attend class in person or via a synchronous videoconference session. Because hy-flex classes need to account for room capacity, they would be considered in-person. Currently, the University of Michigan does not have a specific indicator for hy-flex classes.
  • Blended learning: While some people use “hybrid” and “blended” interchangeably, generally blended learning includes any kind of online component to supplement instruction. Even if a course does not reduce face-to-face meetings, maintaining and utilizing a Canvas course to extend the classroom indicates a blended learning experience. Blended learning takes advantage of online technology to enhance in-person classrooms.
  • Emergency Remote Teaching: Suddenly altering teaching modalities from in-person to online due to an emergency. Emergency remote teaching is a type of online teaching, however, it generally does not involve careful planning of instruction specifically for that modality.

The rapid shift to emergency remote instruction during COVID-19 left many instructors questioning how best to assess students, even well after classes resumed. Concerns about academic integrity left some wondering if using online tests made students more likely to violate academic integrity rules. Online test proctoring made news in many higher education settings as a way to ensure academic integrity. However, others have argued it is a violation of students’ privacy.

What is Online Proctoring?

You may be familiar with proctoring in a face-to-face or residential setting, where a designated authority oversees an exam in a controlled, specified environment. Similarly, online proctoring is a service in which either a person or an artificial intelligence algorithm monitors a learner’s environment during an online exam. However, the environment an online proctor oversees is a learner’s personal environment. This monitoring can take the form of video recording, logging students’ keystrokes, and collecting browser data, location data, and even biometric data like test-taker eye movements.

Advocates of online proctoring cite concerns about academic integrity in the online environment as a reason to implement proctoring (Dendir & Maxwell, 2020). Some even suggest that students do not mind the additional security because they believe it supports the integrity of the test and/or degree.

Online proctoring in the media and research

While onsite proctoring for academic integrity may seem reasonable, monitoring a learner’s home environment raises questions and has the potential for harm. Online proctoring can be perceived as invasive by students, as personal information about one’s location and physical data is recorded that is not otherwise necessary for an exam. Several institutions, like U-M Dearborn, the University of California Berkeley, the University of Illinois, and the University of Oregon, have placed limitations on, if not discontinued altogether, the use of third-party proctoring services. Institutions cite issues of accessibility, bias, concerns about student privacy, and institutional culture as reasons to discourage third-party proctoring. Student and faculty groups have publicly advocated for institutions to discontinue security features like locked-down browsers and third-party monitoring. At the University of Michigan Ann Arbor, third-party proctoring remains available through vendor partners, but it generally involves a separate fee and may be expensive.

Most of the academic research involving the use of online proctoring has focused on academic integrity, rather than the impact of proctoring itself. Wuthisatian (2020) found lower student achievement in online proctored exams compared to the same exam proctored onsite. Those students who were the least familiar with technology and the requirements for setting it up performed the most poorly. In addition, students who have test anxiety may experience even more anxiety in certain proctoring situations (Woldeab & Brothen, 2019). With further research, we may find the problem may not necessarily be proctoring, but rather the burden and effort of technology on students when taking an online exam.

Problems with internet connections or the home testing environment may be beyond students’ control. The inability to create a “proper” testing environment has raised students’ concerns about being unjustly accused of cheating (Meulmeester, Dubois, Krommenhoek-van Es, de Jong, & Langers, 2021).

What are the alternatives to proctoring?

Ultimately, only the instructor can determine whether proctoring is the right choice for a class, and sometimes proctoring may be the best choice for your discipline, field, or specific assessment. Particularly in a remote setting, it may feel like the integrity of your assessment (particularly a test) is beyond your control, so proctoring may seem like the only option. However, there are alternatives to proctoring exams, from using exam/quiz security measures to rethinking a course’s assessment strategy to deemphasize exams. If you are concerned about how and what you are assessing, the Center for Research on Learning and Teaching provides resources and consultations to discuss academic integrity and different methods of assessment. We also recommend CAI’s Faculty Proctoring document if you have questions about proctoring.

How this will help:

Understand key principles of ethical community engagement and how to operationalize them when designing and teaching online community-engaged courses.
Learn concrete suggestions, resources, and strategies for addressing the needs of students and community partners during online community-engaged teaching.
Discover ways that Ginsberg Center staff can support your community-engaged course, from finding remote engagement opportunities for students to helping prepare your students to partner with communities and maximize their learning.

The basics

Community-engaged learning is when “students engage in activities that address human and community needs, together with structured opportunities for reflection designed to achieve desired learning outcomes” (adapted from Jacoby, 1996). In your face-to-face classes, you may have experience working directly with community partners in various ways to integrate students into community-engaged learning.

But what does this look like in an online class? Or when the community engagement is virtual? When shifting to an online environment, many community-engaged instructors at the University of Michigan have expressed the difficulty of balancing students’ needs for accessible and empathetic virtual instruction, community partners’ rapidly shifting needs and priorities, and general public health and safety concerns. However, online instruction does not mean isolation; instead, it is possible to leverage technology for community engagement as well as use it as a lens to examine that engagement.

The Ginsberg Center is a community and civic engagement center with a mission to cultivate and steward equitable partnerships between communities and the University of Michigan in order to advance social change for the public good. Our Best Practices for Online Community Engaged Teaching and Learning provides tangible suggestions, resources, and strategies that are rooted in the 6 key principles that guide our work. The guide also synthesizes research on online service-learning and community engagement with a particular focus on the opportunities available at the University of Michigan. We offer highlights from our guide below:

1. Connecting Civic Learning Across Contexts:
We support students’ integrative learning across classroom, co-curricular, personal, and community settings. Reflection is a critical component of this integration throughout the partnership process.

Examples of how to apply this principle:

  • Have students reflect on their assumptions about technology and how these assumptions may impact their work with community partners.
  • Use technology to allow students to reflect in multiple ways: online journals, group discussion boards, videos, and audio recording. 
  • Take time to reflect upon how using technology has affected your approach to teaching and community engagement.

2. Starting with Community: Our approach centers around community-identified priorities and how we can most effectively match University of Michigan resources and expertise to those of community partners working to address these priorities. It’s important to start with your community partners’ goals and priorities when deciding how and when to integrate technology into your engaged course.

Examples of how to apply this principle:

  • If community partners are co-creating the virtual course with you, ask explicitly about the partners’ technological preferences and capacity, and maintain a dialogue throughout the course.
  • Consider inviting community members to your virtual classroom as guest speakers, because community partners bring ideas, perspectives, language, and knowledge to the table that can be inaccessible otherwise.
  • If the community members virtually “host” the students with their organization, communicate the roles of the community partner clearly and take an active role in managing your students’ participation.

3. Centering on Equity: We strive for balanced impact in our partnerships, which means that students, faculty, university staff, and community partners all have the opportunity to share their interests, goals, and expectations. Leveraging technology may bring more opportunities to align interests and goals, but it may also present added challenges to centering equity.

Examples of how to apply this principle:

  • Co-create course objectives with your community partners, and use these objectives to determine what technological tools are most appropriate and compatible with the goals and capacity of the community partner.
  • Give community partners access to all virtual components of the course, including discussion boards, course announcements, and readings.
  • Develop a plan to prepare students for both synchronous and asynchronous interaction with community partners.

4. Fostering Long-Term Partnerships: We focus on stewarding long-term relationships with community partners that last beyond a particular project or engagement.

Examples of how to apply this principle:

  • Work with Ginsberg to learn about what technological resources and supports are available to you and your community partners through the university.
  • Discuss with your partner how they can continue to have access to any online resources (readings, recordings, discussion boards, students’ work, etc.) that were created during your course. 

5. Acknowledging Power: Cultural humility requires a recognition of power differences and conscious attempts to balance these differences through reflection and learning (Tervalon & Murray-Garcia, 1998). Technology adds an additional layer of power and equity into the community conversation.

Examples of how to apply this principle:

  • Consider how power imbalances might manifest in specific forms of virtual interactions (video meetings, phone meetings, email, discussion boards, etc.) and establish a plan for how issues will be identified and counteracted.
  • Invite your students to reflect on how technology can be used to decrease the negative effects of power and privilege and when it may exacerbate those effects.

6. Moving from Individual to Collective Action: We support coordination, collaboration, and increased coherence by bringing together parties with shared interests to amplify positive community impact.

Examples of how to apply this principle:

  • Consider inviting community partners into the virtual classroom to share historical, political, organizational, and community contexts for the issues they are addressing and who else in the community is working on the issues.
  • Work with the Ginsberg Center to access our extensive network of partners so you can connect efficiently with partners ready and eager to collaborate.

Practical tips

  • Want some ways to get started in community-engaged learning right away? Consider inviting community partners or members into the virtual classroom as guest speakers to share the historical, political, organizational, and community contexts for the issues they are addressing; they bring ideas, perspectives, language, and knowledge that can be inaccessible otherwise.
  • The Community Engagement: Collaborating for Change MOOC offers free, online modules to help you and your students prepare for community-engaged learning.
  • Ginsberg Center staff hold regular Community of Practice gatherings and workshops for instructors of community-engaged teaching. View our full calendar of events.
  • Read the full guide for Online Community Engaged Learning and the general guide for Community-Engaged Learning.

Resources

University of Michigan

How this will help:

Find online resources available from the University of Michigan Museums to include in your course

The Basics

The museums and special library collections of the University of Michigan – Ann Arbor support online teaching with a wide range of digital collections and exhibition resources. Many have educational staff dedicated to hosting and crafting synchronous and asynchronous learning experiences with their digital resources. By clicking on the links to specific museums below, you can learn more about each institution’s materials and support for online learning.

Please click on the links for resources from the various museums:

University of Michigan Museum of Art

University of Michigan Museum of Natural History

University of Michigan Museum of Anthropological Archaeology

Papyrology Collection (University of Michigan Library)

Special Collections Research Center (University of Michigan Library)

University of Michigan Herbarium

University of Michigan Museum of Zoology

Matthaei Botanical Gardens & Nichols Arboretum

William L. Clements Library

Stearns Collection of Musical Instruments

University of Michigan Kelsey Museum of Archaeology

Sindecuse Museum of Dentistry

Bentley Historical Library

University of Michigan Museum of Paleontology

Clark Library of Maps and Atlases