
Making Your Course Accessible

A person wearing large over-ear headphones is focused on a laptop, with one hand on a Braille display in front of the keyboard.

Panorama tool offers easy, accurate remediation within Canvas

How this will help

Built-in tool can identify and fix accessibility issues
Checks that course materials meet federally mandated standards
Helps instructors design equitable lessons for all learners

Reviewing your course materials in Canvas for accessibility ensures all learners can participate without obstacles. As the importance of digital accessibility becomes more widely recognized, the tools available to meet accessibility requirements have grown more prevalent, more accurate, and easier to use. 

Panorama is one such tool and is currently available on Canvas. It can scan, evaluate, and fix content directly within the learning management system. Using automatic scripts and machine learning, Panorama reviews course materials and matches them to accessibility standards for color contrast, text, graphics, tables, and other issues that can pose barriers to learning for students with disabilities. 

Like many major accessibility checkers, Panorama is built around Web Content Accessibility Guidelines, or WCAG (commonly referred to as WICK-ag), which are considered the universal standard for digital accessibility. The University of Michigan’s Digital Accessibility Strategic Initiative aims to meet WCAG and Americans with Disabilities Act (ADA) regulations by April 2026. 

Getting Started

Panorama automatically audits all of the items in the course and checks their compliance with the prescribed accessibility standards. 

Panorama scans every Canvas course for accessibility issues, and there are several ways to access those results and start making corrections.

Accessibility scores also appear next to each item in your course materials, in one of three colors: 

Three icons indicate accessibility levels: a red warning icon for scores below 60%, a yellow caution icon for scores between 60–90%, and a green proceed icon for scores above 90%.
  • Red – Warning icon. This indicates significant accessibility issues with this item.
  • Yellow – Caution icon. There are some issues that may be difficult for learners to navigate.
  • Green – Proceed icon. This means the item meets or mostly meets accessibility criteria, which is the goal under the mandated requirements.
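
The three score bands above can be sketched as a simple lookup. This is illustrative only: how Panorama treats the exact boundary values (60 and 90) is an assumption here.

```python
def accessibility_band(score: float) -> str:
    """Map an accessibility score (0-100) to a color band.

    Sketch of the three bands described above; the treatment of the
    exact cutoffs 60 and 90 is an assumption, not Panorama's documented
    behavior.
    """
    if score < 60:
        return "red"      # significant accessibility issues
    elif score <= 90:
        return "yellow"   # some issues that may hinder learners
    else:
        return "green"    # meets or mostly meets accessibility criteria
```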

You can see the accessibility report for an item by clicking its accompanying icon. Instructors and designers can also open a full course report by selecting Panorama from the left-side navigation menu in the course. Note that these scores are not visible to students; they exist solely to inform instructors and designers.

Full course reports give you a snapshot of a course’s overall accessibility and a complete list of items with their accompanying scores. From this list, you can filter issues by severity or content type and prioritize remediation accordingly. 

If you are creating pages in Canvas, the Panorama accessibility tool icon, which looks like a temperature gauge, appears underneath the text box. As content is added, any accessibility issues appear as a numeric count on the icon. Clicking it opens a report where you can review a list of all accessibility issues Panorama has discovered.

A screenshot outlines three ways to access an accessibility report for a specific item. The first method is through the Rich Text Editor, shown with a gray gauge icon and a purple badge marked “1.” The second method is via the Accessibility Score Icon next to a page or file, represented by three icons: a red pentagon, a yellow triangle, and a green hexagon, each with a human figure inside. The third method is from the Accessibility Score Icon beside issues in the Course Report, illustrated with a gray gauge icon labeled “70%” in orange text.

Making Corrections

Panorama allows course designers and instructors to create, scan, and fix digital content directly in the Canvas platform.

The most common issues flagged include:

The Accessibility Report featured displays a total of 19 issues with a 0% accessibility score, categorized as 4 minor, 15 major, and 0 severe issues. Below, the "Review Issues" section lists specific problems: 1) A major issue stating "The slide does not have a title," with a "Learn more" link. 2) A minor issue about "Check reading order," accompanied by a "Learn more" link and a "Fix Issue" button. 3) Another major issue identical to the first one. Pagination buttons for navigating through the issues are located at the bottom, with page 1 highlighted.
  • Alt-text – Alternative text is a short description of an image and should be accurate, short, and contextual.
  • Tables – Tables should be used to help explain data, not create a visual layout.
  • Headings – Short text phrases that introduce sections in a document or page, and should follow a hierarchy of levels. 
  • Color contrast – Difference between lightness and darkness of two colors that improves visibility of text. 
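
Color contrast, the last item above, is one issue that can be checked mechanically: WCAG defines a contrast ratio between two colors based on their relative luminance, with 4.5:1 as the minimum for normal body text at the AA level. A minimal sketch of that calculation (function names are my own):

```python
def _relative_luminance(rgb):
    """Relative luminance per WCAG 2.x, from 0-255 sRGB channel values."""
    def linearize(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, from 1.0 up to 21.0.

    WCAG AA requires at least 4.5:1 for normal text and 3:1 for
    large text.
    """
    lighter, darker = sorted(
        (_relative_luminance(fg), _relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)
```

For example, black text on a white background yields the maximum ratio of 21:1, while light gray on white typically fails the 4.5:1 threshold.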

Once you access the list of issues by clicking the relevant icon and viewing that item’s accessibility report, you can make repairs in a few different ways.

Fix Issue

The easiest way to update an inaccessible item is to click the Fix Issue button listed on the accessibility report, if that button is available. 

Clicking the Fix Issue button launches a pop-up box that explains the issue, shows how to fix it, and recommends a change. Clicking Add Change applies the correction automatically.

Manual Remediation

The remediation process depends on factors like the type of item and which accessibility problems are present, so not all issues will have automated repair options. 

In those cases, instructors will have to manually correct the accessibility problem. Solutions can be found by clicking Learn More next to the listed item, which will provide step-by-step instructions on remediation. 

If the issues are with a source file, you can download the file, make corrections as advised, then upload it again using the update document feature in the accessibility report. Panorama rescans the item, then registers the item’s new accessibility score. 

Don’t Delay

Since reviewing and updating every page, document, and activity in your courses can take time, accessibility advocates recommend setting aside time to work with Panorama and improving content over multiple months to meet your deadline.

While building accessible content from the start is the ideal, remediation will still need to occur. Working through your materials ahead of time means you won’t be scrambling to make everything accessible on the first day when, for example, a student using a screen reader can’t access a PDF. Trying to remediate content on demand not only adds stress for instructors, but also forces the student to wait and risk falling behind in the class. 

So, avoid the scramble and the burnout and start using Panorama to check your course materials. Those working with faculty emphasize that progress, not perfection, is the current goal. With such a large volume of materials to review, accessibility advocates hope to see broad improvement rather than 100% accessibility ratings for a handful of courses.

As new guidelines are released, Panorama will update to meet those standards. This means that the sooner instructors implement its usage, the better they’ll be prepared to meet the needs of incoming students.

Practical Tips

  • Share your experiences and issues with the ITS Service Center. Any problems are sent to the vendor for corrections and updates.
  • Register for a training session or take the Canvas Accessibility with Panorama course from the Canvas Accessibility Service.

Resources

University of Michigan

Additional Resources

For those who don’t have access to Canvas and Panorama, there are external resources that can help check your course’s accessibility.

  • WAVE Web Accessibility Evaluation Tools (wave.webaim.org) are available as a free browser extension as well as subscription-based and stand-alone products that identify issues in web content.
  • Axe accessibility testing tools (deque.com/axe) include a free browser extension as well as more in-depth products.

Americans with Disabilities Act Title II Web and Mobile Accessibility

How this will help

Students can use generative AI to build their learning process
Providing structured guidance is key to responsible AI use

A current challenge facing university instructors is the use of generative AI tools by students to achieve the goals and objectives of a course. Students commonly interpret course goals and objectives as things they need to produce – the outputs of learning. But what is missing in this view is how they need to think, learn, and build their knowledge – the process of learning. 

Students are commonly asked to produce written output as evidence of their knowledge and thinking, such as essays and assignments. Given that, it’s not surprising they would turn to generative AI tools, such as ChatGPT, that are programmed to generate and produce contextually appropriate written output. And because of the relative ease with which quality-looking output can be generated with these tools, it’s also not surprising that instructors are placing even greater attention on verifying that assignments are created by students and not produced by a generative AI tool.

With so much concern surrounding students’ use of generative AI tools in classroom learning settings, specifically with the outputs of learning, could AI promote other important learning processes that are not focused on outputs? Could the focus be shifted to strengthening students’ thinking processes instead?

Expert Thinking

At the center of an effective learning experience are problems to be solved. The goal is for students to come away from the learning experience having acquired a certain amount of content knowledge so they can engage with and solve problems that are presented. However, teachers and educational researchers know that content knowledge alone isn’t sufficient to solve problems. 

When presented with a problem, experts not only use content knowledge, they also actively weigh the possible strategies and options they may want to use to tackle the problem – using metacognition in the problem-solving process. Metacognition is thinking about your own thought processes. Metacognitive strategies are ways to encourage and actively engage in the “whys” around thought processes.

For example, the problem-solving process may involve asking or reminding themselves:

  • Have I considered all the other options?
  • Am I jumping to conclusions too quickly?
  • Have I seen this before?
  • Remember what happened the last time. 

And while experts may develop metacognitive knowledge over the course of a professional career, educational researchers know that metacognitive strategies not only greatly benefit novices learning in the classroom, but can also be explicitly taught alongside content. 

Using AI to Support Metacognition for Learning

Let’s say you’ve assigned an analysis of a local art installation. Some students might find themselves struggling to get started even after attending lectures, passing quizzes, and attending small group discussions. By introducing metacognitive strategies, you could help get their thinking started. 

Providing them questions to ask themselves and explicitly walking them through this thinking process is an example. By demonstrating this question strategy with students, they not only begin the process of generating output for their analysis, but they also strengthen their metacognitive processes for similar analysis in the future. 

Your students are now feeling confident that they can tackle this analysis and will be able to implement these metacognitive strategies at 11 pm when they do their work, right? Of course, we know learning doesn’t work this way. 

This is where generative AI tools can come in to support students’ use of metacognitive strategies and their writing process – at the time when they need it. This shifts the use of generative AI tools from “prompting for output” to “prompting for learning.”

For example, if the student is stuck and doesn’t know how to get started, they could be provided a set of prompts that ask the generative AI tool to provide questions designed to elicit written responses from the student. 

Prompt: Act as a writing coach to help me write a critical analysis of a local art installation. Your goal is to model metacognitive strategies to help me unpack, organize, and compose my analysis.

The requirements for the analysis are as follows: 

  • The rubric that will be used to determine the quality of the analysis is as follows: [provide rubric information].
  • As a writing coach, you will ask me questions that will help me gather and compose my observations and insights about the art installation I visited.
  • You will then evaluate my response against the requirements and rubric, and ask me to consider what other information I could provide to clarify my analysis. Continue to ask me questions until I have written at least 200 words.
  • Do not provide any examples for how to improve my analysis. Your job is to help guide my approach toward composing my analysis.
  • Do not ask all the questions at once. Instead, ask one question at a time, expect a response, evaluate the response, then ask another question.

In creating a writing coach with a generative AI tool, students are not only prompted to start writing; the prompts themselves model what a metacognitive strategy looks like and what students can begin to use for future writing assignments.

If we want to help students fully realize the goals and objectives of our courses – to be able to apply what they’ve learned and create solutions in the real world – it’s important that we provide them with instruction and tools that emphasize strengthening their thinking skills and building their knowledge.

Resources

U-M Generative AI Tools

U-M Generative AI Use Cases

References

Dennis, J.L., Somerville, M.P. Supporting thinking about thinking: examining the metacognition theory-practice gap in higher education. High Educ 86, 99–117 (2023).

Krathwohl, D. R. (2002). A Revision of Bloom’s Taxonomy: An Overview. Theory Into Practice, 41(4), 212–218.

National Research Council. 2000. How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: The National Academies Press. 

McCormick, C. B., Dimmitt, C., & Sullivan, F. R. (2012). Metacognition, Learning, and Instruction. Handbook of Psychology, Second Edition, 7

Merrill, M. D. (2013). First Principles of Instruction: Identifying and Designing Effective, Efficient, and Engaging Instruction. San Francisco, CA: Pfeiffer.

Pintrich, P. R. (2002). The Role of Metacognitive Knowledge in Learning, Teaching, and Assessing. Theory Into Practice, 41(4), 219–225.

Tankelevitch, L., Kewenig, V., Simkute, A., Scott, A., Sarkar, A., Sellen, A., & Rintel, S. (2024). The Metacognitive Demands and Opportunities of Generative AI. CHI ‘24: Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems

Answers to common questions about forming student groups

How this will help

Well-constructed team-based assignments help students build skills and knowledge
Surveying students at the beginning of a course can inform better group formations
Establishing team roles gives learners structure and clear expectations

Project-based team learning is a great way to engage students and help them build real-world collaboration skills. But organizing your students into groups that blend easily and productively can be a daunting task at the beginning of the semester. 

Below are common questions about forming student teams, along with answers to help you create team projects that support your students and their learning goals.

How can I get to know my students at the beginning of the course?

A student survey at the beginning of the course may help you learn about your students and form teams more effectively. Consider including any of the following questions in your survey:

  • Where will you be living this semester? 
  • What days of the week are you most available to meet?
  • What time of day are you most productive? 
  • What concerns do you have about teamwork in this class?
  • What are your priorities or goals for yourself in this class?
  • How much experience do you have with [skill relevant to the class or project]?

You can also survey students about personality characteristics and working styles. Here is a sample survey for rating personality characteristics:

 An image of a sample survey for students asking the question, "Where would you place yourself on the following scales?" The scales go from 1 to 7. The following scenarios rank as a 1 on the scale: In groups, I tend to listen more than speak; I usually do work close to a deadline; I expect to fit right into this course; I like to share work, even if my team finishes tasks differently than me; I'd rather hold back ideas or preferences if my group stays happy. The following scenarios rank as a 7 on the scale: I often speak up in groups; I get working on a project when it's assigned; I expect to feel pretty out of place in this course; I'd rather pick up extra work so I know it's done right; It's easy for me to speak up about my ideas or preferences even if it disrupts my group.

In some cases, students with similar characteristics do well in the same groups; other characteristics are better served by being spread out among the teams. These insights help you build cohesive and equitable teams. 

Consider the following guidance for grouping students based on the survey sample:

  • Extroverted vs. Introverted – Group diverse.
  • Precrastinator vs. Procrastinator – Group similar to maximize team happiness, or group diverse to maximize team productivity.
  • Belongingness – Avoid stranding low-belonging students on a team of high-belonging students.
  • Controlling vs. Collaborative – Spread out controlling students.
  • Self-censoring vs. Contributing – Avoid stranding self-censoring students on a team of high contributors.
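
Several of the rules above amount to spreading students who rate high on a trait across teams rather than clustering them. A hypothetical helper (the function name, survey field names, and round-robin strategy are all my own, not part of any grouping tool) might sort by the 1–7 survey rating and deal students out in turn:

```python
import itertools

def spread_across_teams(students, trait, n_teams):
    """Distribute students so those rated high on a trait are spread out.

    Hypothetical sketch: sorts by the 1-7 survey rating for `trait` and
    deals students round-robin, so e.g. highly controlling or extroverted
    students land on different teams rather than clustering on one.
    """
    ranked = sorted(students, key=lambda s: s[trait], reverse=True)
    teams = [[] for _ in range(n_teams)]
    for team, student in zip(itertools.cycle(teams), ranked):
        team.append(student)
    return teams
```

For traits where similarity is preferred (such as procrastination style to maximize team happiness), the opposite approach applies: sort the same way but fill each team with a contiguous slice of the ranked list.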

How should teams be formed, and which factors should instructors consider?

Forming effective teams is the first step in setting students up for success on their team projects. Here are some strategies for team formation:

  • Aim for diversity in skills, experiences, and perspectives.
  • Aim for similarity in schedules and campus location (or time zone if students are remote) to reduce logistics issues.
  • Grouping students with similar time management approaches and procrastination styles may reduce conflict within the team, though it may also reduce project quality. 
  • Consider outcome goals, academic strengths, and previous experience with the topic and skills when forming teams.

If you have students who indicated they don’t enjoy teamwork or have conflicts with others, consider assigning them to a team with students who:

  • Have strong collaborative skills and patience.
  • Share similar goals for the class.
  • Are empathetic and adaptable.

In some cases, it may be better to let a student work alone on a project, especially if they have previously had difficult team experiences due to factors outside their control (e.g. cultural differences, neurodiversity, etc.).

How can instructors make structured teams (with roles) work well?

Planning and communication are key when designing and assigning roles for group work. Important considerations include:

  • Clearly define roles and explain how each role benefits the team’s success. Team roles can include a facilitator to lead team meetings, a timekeeper to track deadlines, and a reporter to record team decisions.
  • Ensure equitable distribution of the fun or exciting tasks, or tasks that help students build critical skills.
  • Rotate roles if possible. For example, note-taking should be a duty each team member takes on, rather than the same person always doing it. 
  • Talk about different leadership roles and how to share them in a team. Students can be idea leaders, task leaders, social leaders, or organization leaders (or a combination of leadership types).
Social leaders – Help to create the necessary social bond and cohesion among teammates. Common contributions:
  • Facilitate conflict resolution
  • Amplify teammates’ voices and ideas to increase equity of participation and contribution
  • Foster an environment of shared respect for teammates and excitement over the team’s work

Organizational leaders – Offer needed structure to the team and project. Common contributions:
  • Spearhead conversations about team policies and norms
  • Develop and implement approaches to project scheduling
  • Provide big-picture project oversight to make sure all aspects of the work are moving forward as needed

Idea leaders – Help get the team’s work off on the right foot and provide a boost when needed later in the project. Common contributions:
  • Model strategies for idea generation
  • Suggest alternatives when a solution isn’t working
  • Collect, evaluate, and prioritize feedback

Task leaders – Make sure that the team’s work progresses by using their skills and strengths to complete project tasks. Common contributions:
  • Demonstrate and teach skills to teammates who are motivated to learn
  • Outline tasks, objectives, and strategies to complete tasks with teammates
  • Set deadlines and guide teammates when there are gaps in knowledge

How can instructors form equitable teams?

Instructors should create teams thoughtfully to reduce learner barriers and encourage a sense of belonging, particularly for students who may hold a marginalized identity. Important considerations include:

  • When possible, avoid stranding students who may be underrepresented in their area of study on a team of all majority students. You can accomplish this by asking an open-ended question about how each student would prefer to be grouped. 
  • Allowing teams to self-select may help, but it may also create more homogeneous teams.
  • Forming teams based on self-reported sense of belonging may help accomplish this (e.g. not stranding students who report a low sense of belonging on a team with all high-belonging students).
  • Consult with expert resources on your campus to develop strategies that meet this goal while complying with privacy, legal, and ethical considerations.

For more information on team equity, see the scoping review conducted by researchers at the University of Michigan, as well as a graphic illustrating their findings. They identified seven themes of team equity: alignment, dialogism, heterophily, participation, power, ownership, and risk.

A graphic wheel with Team Equity at the center surrounded by seven themes that contribute toward Team Equity, including Alignment, Risk, Power, Participation, Ownership, Heterophily, and Dialogism.

References

Legal notice: Protected identity characteristics (race, ethnicity, sex, etc.) cannot be used to assign students to teams, and should not be collected in any team formation survey. The Tandem survey tool meets these guidelines. For more information, reach out to the University of Michigan Office of the General Counsel.


Moffat, A. D., Matz, R. L., Fowler, R. R., & Jeffrey, M. (2024). Facets of Team Equity: A Scoping Review. Small Group Research, 56(1), 32-70.

Resources

When examining the effectiveness of a massive open online course, or MOOC, standard metrics paint an incomplete picture. 

Test scores, completion rates, and grade-point averages are often used to measure the success of a traditional academic program. These markers don’t translate as easily to MOOCs, where enrollment and completion rates diverge. Instead, research shows students enrolled in open online courses are seeking something beyond top marks.

These learners intrigued Dr. Caren Stalburg, whose popular “Instructional Methods in Health Professions Education” MOOC has continued to attract interest since its launch in 2013, with nearly 25,000 enrollees and counting. Stalburg, an associate professor of obstetrics and gynecology and learning health sciences at the University of Michigan Medical School, wanted to find out how her learners were using the course, and whether it was impacting their professional and personal goals.

Teaching the Teachers

Stalburg’s career path was shaped early on by her curiosity around teaching and learning. After completing her residency at U-M, Stalburg was a new faculty member, adjusting to her role as an educator after years of being a student. She joined a committee that included experts from the Medical School and the Center for Research on Learning and Teaching, and swiftly realized there was more to instruction than just passing along knowledge. 

“I got involved in some of our curricular designs and realized that there is a science to education, and nobody teaches it to us,” Stalburg said.

She decided to learn the science. She earned a master’s degree in 2006 from what is now called the Marsal Family School of Education. Then in 2012, Stalburg again followed her curiosity, becoming one of the earliest faculty members to embrace the university’s new partnership with Coursera and launching a MOOC in collaboration with the center.

Stalburg’s course tapped into the lessons she learned early on about the importance of teaching the teachers, designing modules dedicated to informing and improving students’ instructional skills.

“I wanted the content to be broad enough so that it was applicable to any and all professionals who were teaching others about health,” she said.

Measuring the Impact

As the course’s enrollment grew, Stalburg was curious about its effectiveness. She decided to explore that question after seeing a report by the center’s senior research scientist Nate Cradit and former director Cait Hayward, which examined how students evaluated a MOOC’s quality based on the course’s affordances and attributes. 

Partnering with the center, Stalburg developed a survey specific to her course to evaluate if students’ needs were being met. In addition to gathering demographic data on the learners, the survey examined how they interacted with and benefitted from the content.

“We designed this study to look at understanding the participants’ goals for the course and how completing the course has impacted their professional goals,” Stalburg said. 

Surveys were sent to 278 learners, and 40 of those participants completed the survey. The respondents were diverse and educated, hailing from 18 different nations, with 75% of them holding advanced degrees. Many were entering the middle of their careers, and they worked in a wide range of fields, including medicine, nursing, dentistry, physical therapy, and more.

A vast majority of the learners said they took the course to increase knowledge and skill development, with some seeking professional development or meeting a requirement for their current job.

Interestingly, Stalburg and her team also discovered that the learners were using the material for their own instruction. Most of the participants downloaded the course material and used it to improve their teaching methods or create their own lessons. 

It was a satisfying finding for Stalburg, affirming her pursuit of sharing the science behind the instruction with fellow educators.

“I really believe that it’s about increasing human capacity, and meeting people where they are and in the needs that they identify,” she said. “And so, to me, this is like that old parable about teaching people to fish.” 

Local and Global Lessons

The impact of the Health Professions Education MOOC can be found here at the university, as well as on campuses across the globe.

In 2021, Stalburg was tapped to help design and launch the Health Infrastructures and Learning Systems Online Master of Science Degree, again in collaboration with the center. It is the first and only online degree program offered by the Medical School and Rackham Graduate School. 

Stalburg’s MOOC was selected by universities and hospitals around the world to help train their medical communities. The University of Guyana partnered with Coursera and enrolled 86 students between 2018 and 2021. 

There was also a cohort of surgeons from the SSR Medical College in Mauritius who utilized the course, and praised the focus on teaching the instructors.

“For most of us MOOCs are a novelty, and the fact that the course content was so relevant to our professional activities made the experience so much more enriching,” wrote one faculty member.

More to Learn

Stalburg now wants to know more about those original respondents to the MOOC query. 

She plans to contact the participants for further interviews to gain a deeper understanding of the course’s impact on their career trajectories or personal lives, how their professional roles may have changed, and how they are applying the course content with their colleagues. 

Looking at the data she has collected so far, Stalburg believes the responses will reveal this MOOC’s success, and how it can’t be measured solely in numbers, but instead through its impact on learners’ lives. 

“I am hoping that people will sort of say I got better at teaching, I’ve become more recognized for my teaching, or my job opportunities have increased,” Stalburg said. “You know, just a flourish, a boost in whatever direction they’re looking to go.” 

References

Cradit, N., Hayward, C. (2023, October). What is a Successful MOOC? Lessons from Global Learner Narratives [Paper presentation]. IEEE Learning with MOOCS, Cambridge, MA, United States.

Finding ways to support every student is a fundamental challenge for instructors. When the learning occurs online, ensuring an equitable experience can seem daunting, especially when students are part of teams that meet outside a professor’s purview.

According to researcher Yiwen Lin, interventions aimed at boosting student engagement and experience are effective, and the strategic use of generative AI could ensure group learning benefits every team member.

Local Inspiration

As an undergraduate student at the University of Michigan, Lin got a glimpse into her future research while attending a talk on the student support tool ECoach. Developed by the Center for Academic Innovation, ECoach software provides students personalized feedback and tailored strategies for success. 

Lin recalled attending the presentation given by ECoach founder Tim McKay, Arthur F. Thurnau Professor of Physics, Astronomy, and Education. She was struck by McKay’s finding that while female physics students did not frequently speak in class, they did engage and contribute in other meaningful and important ways. 

“What he found was that women like to back channel,” Lin said. “I thought, well women engage, but oftentimes they just engage differently, and it’s hard for an assessment that only looks at the frequency of participation.”

Lin, now a postdoctoral associate at the University of Pittsburgh, researches this deeper data with an eye on gender differences. She examines how psychological factors impact the persistence of online STEM learners, the quality of participation in team settings, and what interventions can be used to encourage more equity among students. Lin shared her research in an Innovation Insights talk titled “Charting Equity in Online Learning Teams: Opportunities and Challenges,” presented by the center.

Male vs. Female Motivation

Examining gender differences in STEM learning has traditionally evaluated how students’ psychological experiences impact outcomes. Lin’s research delves deeper into the learning process, revealing some surprising findings.

In one study, Lin and her team replicated a previous research project that looked at how a sense of belonging and STEM identity impacted female students’ desire to continue in STEM. But unlike the former study, Lin’s research used a pool of international online learners, many of them graduate students. 

The results corroborated the importance of belonging and identity for women. However, when they examined the same connection for male learners, Lin’s team found that belonging and identity were also strong motivators for men. In fact, identity and belonging showed a slightly stronger link to STEM persistence for men compared to their female peers. This was the opposite of the previous findings. 

Lin believes the pool of students (international and online) may have been a factor in the divergence from past research. Either way, interventions designed to increase female learners’ belonging and identity also clearly impacted male learners. Subsequent polling showed that a positive group dynamic impacted both male and female retention in STEM.

“We found that facilitating effective group dynamics can be potentially quite important for cultivating a more inclusive psychological experience,” Lin said.

Beyond Quantifying Participation

It can be challenging for instructors of online courses to incorporate those interventions, especially for small groups meeting outside the virtual classroom. 

Lin outlined those challenges and the importance of diving deeper into the data in a study monitoring 88 small teams (three students per team) who were given a series of challenges to complete in a short period of time. Examining the gender differences in participation, Lin’s team confirmed that women spoke less in mixed-gender groups as well as male-majority teams, using fewer words and speaking less often compared to their peers.

The team then ran a language analysis on the transcripts of the students’ collaboration and found the female students actually provided a higher quality of participation than their male peers. 

“Female students were better at responding to their teammates, building onto their contributions, and also being more cohesive with their own participation,” Lin said. 

It affirmed her assertion that research can look beyond initial observations about frequency. Lin hopes that assessing the quality of contributions will be key to developing effective tools that encourage student participation in online courses and bring more equity to small groups.

AI for an Equitable Learning Experience

What those tools may look like is an exciting proposition to Lin, especially generative AI tools that can be applied to what she describes as the “in between,” the learning experience of students as they work through their course and team assignments. 

“We sort of conceptualize that it is useful for AI to help us assess and model collaborative processes, rather than only collaborative outcomes,” Lin said. 

Generative AI tools could provide personalized support for students, identifying learning patterns that may require intervention, like an intelligent tutoring system. Lin also sees potential in creating a similar generative AI program for teams, encouraging more equity in their collaboration and helping students from varied backgrounds and diverse perspectives interact in constructive and respectful ways. She referred to the center tool Tandem as an example of how well-designed support tools can reveal more about team dynamics and help instructors better support and guide students. Tandem coaches students working on team projects and allows instructors the chance to intervene when they see a group needs assistance. 

Lin acknowledges that integrating generative AI with student support comes with challenges. That is why, Lin says, instructor input is key to ensuring tools are built using careful consideration of privacy and bias, and are extensively tested before launch. When done correctly, they could be powerful tools for building a more inclusive and equitable online learning environment. 

“We wanted to think more deeply about how we can leverage AI as a tool for equity,” Lin said. “And this would perhaps be always a constant discussion in the community as we move forward with it.”

References

Lin, Y., & Nixon, N. (2024). STEM pathways in a global online course: Are male and female learners motivated the same? L@S 2024: Proceedings of the Eleventh ACM Conference on Learning @ Scale, 243–249.

Lin, Y., Dowell, N., Godfrey, A., Choi, H., & Brooks, C. (2019). Modeling gender dynamics in intra and interpersonal interactions during online collaborative learning. LAK19: Proceedings of the 9th International Conference on Learning Analytics & Knowledge, 431–435.

Nixon, N., Lin, Y., & Snow, L. (2024). Catalyzing equity in STEM teams: Harnessing generative AI for inclusion and diversity. Policy Insights from the Behavioral and Brain Sciences, 11(1), 85–92.

Lewis, N. A., Sekaquaptewa, D., & Meadows, L. A. (2019). Modeling gender counter-stereotypic group behavior: A brief video intervention reduces participation gender gaps on STEM teams. Social Psychology of Education, 22(3), 557–577.

Dowell, N., Lin, Y., Godfrey, A., & Brooks, C. (2019, June 25–29). Promoting inclusivity through time-dynamic discourse analysis in digitally-mediated collaborative learning. In Artificial Intelligence in Education: 20th International Conference, AIED 2019, Chicago, IL, USA, Proceedings, Part I (pp. 207–219). Springer International Publishing.

Generative AI (GenAI) tools are becoming increasingly popular for a wide variety of uses, including in classrooms. Whether you’re generating images, building slides, or creating summaries of readings, it’s important to be thoughtful about the tools you’re using and the impact they can have on both your students and our world as a whole.

Bias

A GenAI tool is only as good as its training data; if that data contains content that is racist or sexist, we shouldn’t be surprised when the GenAI tool develops the same kind of bias. Bias can take several forms, including stereotypical, gender, and political bias, all of which can lead to certain groups being inaccurately over- or underrepresented in outputs.

Bloomberg tested the biases present in the Stable Diffusion text-to-image generator in 2023. When they prompted the model to create representations for jobs that were considered “high-paying” and “low-paying,” the images generated of high-paying jobs were typically of people with lighter skin tones. People with darker skin tones featured more prominently in images of low-paying jobs. Bloomberg found similar results when they looked at the gender of the people in the images. Stable Diffusion generated three images of men for one image of a woman. When women did appear in the generated images, they were typically in lower-paying and more traditional roles, like housekeeper. Prompts for jobs like “politician,” “lawyer,” “judge,” and “CEO” led to images that were almost entirely light-skinned men.

Harmful Content

Besides being biased, GenAI can produce content that is harmful in a variety of ways. GenAI can hallucinate content that is not based on actual data and is instead fictitious or unrealistic. It can be used to produce artificial video or audio content impersonating a person’s likeness. When this kind of video or audio content is created with the person’s permission, it’s commonly called “synthetic media.” When people create artificial video or audio content of someone without their permission, it’s referred to as a “deepfake.” Deepfakes are often used to harass, humiliate, and spread hate speech. GenAI has made the creation of deepfakes easy and cheap, and there have been several high-profile cases in the US and Europe of children and women being abused through their creation.

Policymaking efforts to combat the proliferation of and harm caused by deepfakes have become common both in the U.S. and abroad, with proposals often including disclosure requirements for the use of synthetic media, at least for certain activities. While educational uses of these technologies are unlikely to be restricted or banned, users should strongly consider disclosing their use by default, both in the interest of transparency and in anticipation of any future requirements to do so. It may also be worth considering whether companies offering these products are well positioned to comply with this quickly evolving regulatory landscape, and whether they are making reasonable efforts to prevent the misuse of their products.

Data

The collection of data used to train GenAI models can raise a variety of privacy concerns, particularly around personal and proprietary data. Some personal data collection can be declined, although the methods of how to do so are often buried in lengthy terms of service that most users don’t read. Those terms of service also cover how the GenAI tool can use the data that you put into the tool via prompting, so you should be cognizant of the kind of information you’re feeding it.

Recently, the Cisco 2024 Data Privacy Benchmark Study revealed that most organizations are limiting the use of GenAI, with some banning it entirely, because of data privacy and security issues. This is likely because 48% of employees surveyed admitted to entering non-public company information into GenAI tools. There’s also a general lack of transparency around what kinds of data sets have been used to train GenAI tools. Although some explicitly state where their training data comes from, many are vague about what the training data was and how they accessed it.

Copyright

Right now, many believe that using content, like books, images, and videos, to train GenAI falls under fair use in the U.S., but there are currently multiple lawsuits challenging this notion. If companies are unable to leverage fair use to acquire training data, the effectiveness and availability of GenAI is likely to decrease dramatically. The cost of obtaining licenses for the incredible amount of data needed will likely drive all but the biggest companies out of the market.

The outputs created by GenAI can have their own copyright issues, depending on how much they pull from the training data. If the image generated by GenAI, for example, is substantially similar to an image in the training data, there could potentially be some liability for copyright infringement if or when the image is used. Many GenAI tools are attempting to avoid this by refusing to generate content that is similar to copyrighted material, but there are ways for creative prompters to get around these restrictions.

Although many GenAI tools claim to be trained on openly licensed content, studies show that 70% of the tools didn’t specify the license requirements for the generated work, and when they did, they often provided a more permissive license than the original creator intended.

The use of GenAI brings up ethical issues around authorship that are often related to copyright but are separate. For example, when using information gathered from GenAI, there may be an ethical obligation to cite the original source to avoid claims of plagiarism. GenAI doesn’t typically provide citations, and when it does, those citations are frequently incorrect. There are also concerns about the displacement of human authors and artists by GenAI; this frequently comes up when GenAI is used to create works in the style of certain artists or authors.

Environmental Impact

GenAI has a huge environmental impact. Research has shown that training the early chatbots, such as GPT-3, produced as much greenhouse gas as a gasoline-powered vehicle driving 1 million miles. Generating one image with GenAI uses as much energy as fully charging a phone. ChatGPT alone consumes as much energy every day as a small town. On top of that, the data centers that house the training data and infrastructure for these tools require large amounts of electricity, as well as water to keep them from overheating. Right now, it’s nearly impossible to accurately evaluate the full extent of the environmental impacts of GenAI.

Equity

There are a variety of equity concerns when it comes to GenAI. Most GenAI tools are trained on data from data-rich languages and are less likely to include non-standard dialects or languages. There are also access and efficacy disparities. Not everyone will have access to GenAI tools, whether because of cost, a lack of internet access, or accessibility issues with the tool. Underrepresented or underserved groups may find their experiences missing from the training data, meaning outputs are optimized for some groups but not all, which limits their efficacy.

Finally, it’s important to remember that all of the legal and ethical issues discussed so far have a disproportionate effect on marginalized groups. For example, negative environmental effects tend to be felt most acutely in more vulnerable communities. Considering the major impact GenAI has on the environment, how are we going to work with these groups to help ensure they’re not further harmed?

Conclusion

Overall, there are pretty significant legal and ethical issues we should consider before using GenAI tools. This doesn’t mean that we shouldn’t use GenAI tools; it means that we should be thoughtful about when, how, and why we’re using them. And we should know that the way we use them might change in the not so distant future. The current lawsuits will take years to work their way through the legal system, and depending on how they shake out, GenAI tools may have to go through some major changes when it comes to their training data.

Practical Tips

Here are five tips for navigating through these complex issues:

  1. Investigate the reputation of the GenAI tool and the company that created it. Perform an online search for any potential legal or ethical issues. Add search terms like “complaint,” “violation,” or “lawsuit” with the company’s name, and be sure to read product reviews.
  2. Check the terms of service. Review the terms of service and privacy policies before using GenAI. Caution should be taken before publishing materials created through GenAI.
  3. Protect sensitive data. Unless otherwise stated, assume that data shared when using GenAI tools (in addition to data shared for training purposes) will be accessible to the third-party tool provider and its affiliates. Data sharing must adhere to U-M policies.
  4. Consider the ethics/limitations. Continue to remember, and remind your students, that GenAI tools are often biased, as the technology is designed to output common results based on its learning model. GenAI can also “hallucinate,” so specific claims should always be verified before sharing.
  5. Consult resources and ask for help. We are still swimming in uncharted waters. Utilize resources available here at U-M, including training and workshops on GenAI that are hosted across U-M. There is also a new GenAI as a Learning Design Partner series led by U-M instructors that is freely available via Coursera.

When the COVID-19 global pandemic began, so did a more frequent conversation about the collective trauma endured during that time, from healthcare to housing to education. As time has passed, specifically within the educational sphere, discussions about trauma-informed pedagogy, once commonplace in the scope of the pandemic, seem to have receded. However, understanding the impact of trauma in the classroom continues to be essential for student success.

What is Trauma-Informed Pedagogy?

Sarah Le Pichon and Steve Lundy (2023) share that “…trauma-informed pedagogy does not seek to provide a ‘cure’ for students’ personal or social histories of trauma. But a trauma-informed pedagogy…entails that there are measures educators can adopt that do not exacerbate and may even mitigate trauma in the course of learning.” The CDC states that “[a]dopting a trauma-informed approach is not accomplished through any single particular technique or checklist. It requires constant attention, caring awareness, sensitivity, and possibly a cultural change at an organizational level.”

Trauma itself isn’t always tied to a dramatic event or story. Roger Fallot and Maxine Harris, commonly credited with developing trauma-informed care principles in 2009, note that “Trauma is pervasive. National community-based surveys find that between 55 and 90% of us have experienced at least one traumatic event. Individuals report, on average, that they have experienced nearly five traumatic events in their lifetimes. The experience of trauma is simply not the rare exception we once considered it. It is part and parcel of our social reality.”

In considering how to conceptualize trauma-informed pedagogy in the current context of higher education, specifically online, we spoke with four education innovators at University of Michigan to gain insight into their expertise of how to best understand and practically apply this concept.

Who is Trauma-Informed Pedagogy For? 

Trauma-informed pedagogy is for everyone. Dr. Kyra Shahid, Director of the Trotter Multicultural Center, shares that it’s not only for everyone, but specifically for “those invested in education being a pathway towards healing, restoration, innovation and change”. She continues, “We’d be remiss not to pay attention to how global, national, and local trauma impacts the way students see the world, students who have seen vast changes in how we teach online.” Shahid feels that this work is relevant to the world we live in now and helps learners avoid cognitive dissonance during a time when the world is vastly changing. Because of this, she feels that one must disrupt the “normal” and teach in a way that is responsive to what this generation has lived through, from racial terror to mass shootings.

Dr. Rebeccah Sokol, Assistant Professor of Social Work, adds that trauma-informed pedagogy is beneficial both to her, as instructor, and to her students. It is, simply put, a “compassionate teaching style.” She feels that open communication about her students and their lives is beneficial and helps both her and her students be more authentic in the process, which opens the door to being able to learn and receive information. Sokol shares that because of trauma-informed teaching practices, she comes to every classroom setting with the understanding that students arrive at the learning experience with a lot of lived experiences. She recognizes and honors their diversity of experience, which enriches the depth of learning for the entire community.

“Trauma-informed pedagogy is a learner-centered approach that focuses on the needs of students first and foremost. Since the pandemic, I think we have seen an overall shift toward putting the experience of students first, even if it means making adjustments to expectations and timelines for course delivery,” says Dr. Rebecca Quintana, Director of Blended and Online Learning Design at The Center for Academic Innovation. She also shares that these ideas can be applied to instructors. “Instructors need to give themselves grace as they seek to provide grace to their students. For instructors, it can be challenging to know how much visibility to give students into challenges they are facing personally, so it is important to thoughtfully navigate each situation on a case-by-case basis.”

Dr. M. Remi Yergeau, Associate Director of the Digital Studies Institute, notes that there are misconceptions, or biases, when it comes to trauma-informed pedagogy. They note that one does not need to resolve one’s trauma before one can do the work of learning. “There is a common misconception, more of a bias, around trauma as well as disability…that people who are in the throes of lived experience, like people experiencing the traumatic impact of a life event, people who are going through a medical event, disabled folks…there is a presumption that you shouldn’t be here.” They add that there is a presumption that you need to get your life in order before you can do the work of learning, which is harmful and presumes that lived experience is not valuable. Yergeau also feels that it’s important to remember that trauma isn’t just one thing; it can be in the community, a lived experience, social structure, identity, or even one’s body.

Shahid also shared that we need to reframe our thinking that trauma-informed pedagogy is therapy: “…[Trauma-Informed Pedagogy] is not focused on individual needs but on the collective needs of the entire classroom, instructor included”. Sokol shares it could be as simple as a mindset shift to “…come to teaching with understanding that people have a diversity of experiences and backgrounds and being mindful of that diversity when teaching and interacting with students.” Overall, they emphasize that a student’s life experience is valuable within the learning environment.

Trauma-Informed Pedagogy for Online Learning

When we asked Shahid about practical ways to incorporate Trauma-Informed Pedagogy into their online teaching, she suggested that instructors build in time to reflect and time to incorporate the body. “Don’t fall into the practice of education just being an exchange of intellectualism,” she warns. “[We need to acknowledge] the ways that our body is impacted by what we learn, how we learn, where we learn.” If content is particularly challenging–whether due to the nature of the topic or technical difficulty–students can benefit from pausing and allowing the tension of that stress to move through them. Online learning environments can feel somewhat disembodied, relegating student representation to posts on a discussion forum or a small box on a video meeting screen. Technology can engender cognitive dissonance, Shahid says, and establishing ways to remember students as part of a learning group—and as part of their bodies—can support learning. 

Shahid points out that most of human communication is actually nonverbal. “It’s not the words we use, it’s our body language, it’s the eye contact, it’s the energy that we share when we come into a room,” she explains. “It’s those things that really influence how we experience, what we learn, and what triggers in our body that we’re safe or we’re not safe.” Since online learning tends to be absent of many of these cues, this can be particularly challenging for learners with a history of trauma, or for anyone living through unpredictable times. At the same time, she says, technology can bridge gaps, and bring in forms of engagement less common in a classroom.

Yergeau notes that instructors don’t have to limit their online teaching to tools like Zoom or Canvas. Platforms like Discord, for example, may have a steep learning curve, but can also allow for students to signal ways they would like to engage, and more layered conversation. No tool is perfect, and Yergeau suggests “pulling in students to do the critical work of assessing those technologies themselves.” They ask, “how are these technologies imagining their users?” Similarly, how are we as instructors imagining learners as we make decisions about how we teach?

While technology presents incredible opportunities for online teaching and learning, Shahid points out that educators aren’t always trained in how to fully utilize it in ways that are continuously accessible to students. Yergeau also notes that it is important to consider how the tech we use ultimately uses the data of our learners, with or without their consent.

Trauma-Informed Pedagogy and Diversity, Equity and Inclusion (DEI)

The principles of trauma-informed teaching—for example, caring awareness, transparency, and empowerment—can support all learners. They also provide a framework for instructors to be human too, sharing our pedagogical decisions with students so that those decisions can be improved. Because of their focus on trustworthiness, collaboration, and voice, trauma-informed approaches can be the glue that supports diversity, equity, and inclusion not only in the classroom, but on campus as well.

Shahid points out that trauma-informed pedagogy, DEI, and other forms of healing centered practices are interrelated. “For me, [trauma-informed pedagogy] is what allows us to apply the good work of DEI that we have been doing for so many years in ways that are responsive to the students we are working with in the moment.”

Whether we are talking about DEI or trauma-informed pedagogy, Yergeau says, “we’re talking about ways of viewing and approaching the world.” They note that it can be easy to imagine how individual trauma intersects with disability, but that it can also intersect with class, intergenerational trauma, legacies of colonialism, racism, and war – the ways in which our country is structured around violence and disempowering folks. We can provide learning experiences that support people where they’re at, Yergeau says, but we can also come at it thinking in terms of providing learning experiences that support social and educational transformation.

Practical Tips

Looking for concrete ways to incorporate trauma-informed pedagogy into your online classroom? Check out these tips below, synthesized from our conversations with faculty.

  1. Environment: Think about the kind of learning environment you want to create.
  • If you have synchronous meetings, plan the way you kick off each session. Are there ways to cultivate an atmosphere that gets students feeling welcome and ready to learn? Some instructors make use of icebreakers, check-ins, Zoom surveys or chat prompts, or music.
  • If your course is asynchronous, are there regular announcements that can be sent, or can periodic introductory pages be embedded in a course site? These can be ways to create rich regular touch points that use clear expectations, reminders, or gifs to build a cohesive community that helps learners to orient themselves and prepare.
  • Work with class participants to develop a group agreement. Sometimes called community guidelines or ground rules, activities like this can make the expectations and requests class members have of one another transparent.

  2. Time: Slow down and take stock of the moment.

  • Build in time for the class to pause and reflect, such as mindful moments before and after engaging learning activities. This can help students to prepare, shift gears, or reflect and synthesize. 
  • When traumatic events or experiences occur in the world or in the lives of individuals, it can feel dissonant to go about business as usual. If you are aware of something that may be weighing on students’ minds or bodies, consider creating moments to shift outside of routine. This can mean journaling, an unplanned discussion, or taking the opportunity to connect what is going on with course content so students can see how what they are learning is relevant.

  3. Bodies: Teach to the whole person.

  • Think about ways to acknowledge how the body is impacted by how and what we learn. If you teach online, recall that sitting for long periods can cause physical discomfort or present challenges to concentrating. Consider taking breaks for movement, or incorporating the body into the learning process. If course content is emotionally difficult, movement can be an ally to work through the material and any tension it may cause in the body.
  • Consider cultivating a learning environment that normalizes rest and restoration rather than busyness and opportunity/information overload. This could look like a segment of the course schedule that doesn’t introduce new content or assignments so that learners can focus on wellness, or regular messages that go beyond content to support student wellbeing.

  4. Engagement: Include yourself and students in your pedagogy.

  • Let students know why you’ve made certain decisions about class assignments or structure, and the ways in which your teaching style supports you–your passion and values, and also your own wellbeing and boundaries. Students want us to support them, but they don’t want us to burn out.
  • Invite students to be a part of class design or making decisions regarding their assessment. This can provide a sense of control, fairness, inclusion, and importance. We can do as much as possible to plan for learner success and inclusion, but nothing takes the place of students’ determining their own learning. Tools like Gameful that integrate with Canvas or other Learning Management Systems can be an effective way to support learners in individualizing their learning and assessment.
  • Learning isn’t just a two-way street between students and teachers, but also among students themselves. Invite students to share from their own knowledge and experiences, and build in time and activities to help build connections among learners. This can be particularly important in online classes, where students can sometimes feel isolated or as if they are going through class materials alone. For example, there could be regular discussion prompts that get students talking to one another about life or hobbies, and not just course material. Some instructors hold weekly synchronous drop in office hours where students can chat with the instructor about anything, or find fellow students and connect with them. It can be helpful to let students know that there is always room for conversation, connection, and disagreement.

5. Flexibility: Build in a diversity of ways to participate.

  • Online learning can rely heavily on live or recorded lectures, quizzes, and discussion forums. This can feel predictable for students–in both good ways and bad. To create a variety of ways of engaging with material, some instructors layer in use of Discord or other tools. This can provide opportunities for students to go beyond what Zoom or Canvas allow in terms of communication and relationship building (think gifs, or threaded chats, or ease of movement between multiple concurrent video discussions). Other instructors encourage opportunities for video, audio, or image responses as alternatives to writing.
  • Not all technology is equally accessible. The burden of pointing this out can fall to students whose needs aren’t being met. Before that happens, some instructors engage students in analyzing and selecting options that work best for the group or individuals.
  • Create ways for students to signal how they want to interact if they choose. For example, do they want to be reached out to outside of class for study groups, or do they like communicating by email, text, or other apps?


Copyright exists to promote progress by securing time-limited exclusive rights for creators of original literary and artistic works, including movies, songs, software, photographs, and architecture. Facts and ideas, on the other hand, do not fall under copyright protection, nor do methods of operation or systems. A work is copyrighted as soon as it is fixed in a tangible medium, meaning it is created in a way that is saved or recorded somehow. For instance, if you come up with a new song and sing it at an open mic night, that song is not protected until you either write it down or record it. Specific exceptions to copyright, such as using material in a classroom or making preservation copies for libraries, exist in the US. Fair use is the broadest of these exceptions (or user’s rights) and provides flexible guidelines to help determine how you can appropriately use a work.

If your use is not allowed under an exception to copyright law, permission is needed from the copyright holder. This can be difficult, especially when works are posted online without a person’s name or contact information attached. And even when creators were okay with people using their works under certain circumstances, there was no easy way to convey those permissions to the general public. Founded in 2001, Creative Commons created a suite of licenses to help bridge these gaps, making it easier for creators to grant permission for their content and for the general public to find works they can easily reuse. Creative Commons licenses work within the existing copyright landscape, not against it, and explicitly allow for fair uses of works, even when a fair use would contradict the other terms of the license.

Since 2001, Creative Commons has grown, not only as an organization but as a movement. The licenses are now used on nearly two billion works online across nine million websites. Creative Commons licenses increase access and “…give every person and organization in the world a free, simple, and standardized way to grant copyright permissions for creative and academic works; ensure proper attribution; and allow others to copy, distribute, and make use of those works” (About Creative Commons).

Layers of Creative Commons Licenses

There are three layers to every Creative Commons license:

  • A legal code layer: the base layer, providing terms that are enforceable in court.
  • A human-readable layer: a summary of the legal code that is easy for non-lawyers to understand.
  • A machine-readable layer: a summary of the license’s key features that technology, such as search engines, can understand, allowing works to be filtered by Creative Commons license.
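
To illustrate what the machine-readable layer makes possible, here is a minimal sketch of filtering works by license metadata. The records and field names below are hypothetical, invented for illustration; they are not an actual Creative Commons or search-engine API.

```python
# Hypothetical sketch: filtering works by their machine-readable
# license metadata, e.g. to find works that permit commercial use.
works = [
    {"title": "Campus photo", "license": "CC BY"},
    {"title": "Lecture notes", "license": "CC BY-NC"},
    {"title": "Old map", "license": "Public Domain Mark"},
]

def allows_commercial_use(license_name):
    # Licenses containing the NonCommercial (NC) element restrict
    # commercial use; other CC licenses and public domain works do not.
    return "NC" not in license_name

commercial_ok = [w["title"] for w in works
                 if allows_commercial_use(w["license"])]
print(commercial_ok)  # ['Campus photo', 'Old map']
```

In practice, search tools such as Openverse perform this kind of filtering automatically using the license information embedded in web pages.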

License Elements

There are four license elements to pick from when choosing a Creative Commons license: Attribution (BY), ShareAlike (SA), NonCommercial (NC), and NoDerivatives (ND). More detail about each type of license is outlined below, along with information about the two public domain tools Creative Commons has created.

  • The Attribution license (CC BY): allows people to use and adapt the work for any purpose (even commercially) as long as credit is given to the creator. This is the least restrictive Creative Commons license.
  • The Attribution-ShareAlike license (CC BY-SA): allows people to use and adapt the work for any purpose (even commercially) as long as credit is given to the creator and any adaptations made are shared under the same or a compatible license.
  • The Attribution-NonCommercial license (CC BY-NC): allows people to use and adapt the work for any noncommercial purpose as long as credit is given to the creator.
  • The Attribution-NoDerivatives license (CC BY-ND): allows people to use the work for any purpose (even commercially), as long as they give credit to the creator and do not create adaptations or derivatives of the work. This includes making any major changes, creating translations, or creating sequels. Under this license, people may adapt the work for their own personal use but may not share any adaptations publicly.

These license elements can be mixed and matched depending on the preferences of the copyright holder, yielding six standard licenses in all. It’s important to remember that all Creative Commons licenses require attribution.
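
The combinations can be made concrete with a short sketch: BY is always present, NC is optional, and ShareAlike and NoDerivatives cannot be combined (ND forbids sharing the adaptations that SA governs), which yields the six standard licenses. The enumeration below is purely illustrative, not an official Creative Commons tool.

```python
# Enumerate the six standard Creative Commons licenses from the
# elements: BY is mandatory, NC is optional, and SA/ND are optional
# but mutually exclusive.
from itertools import product

licenses = []
for nc, mod in product([None, "NC"], [None, "SA", "ND"]):
    parts = ["BY"] + [p for p in (nc, mod) if p]
    licenses.append("CC " + "-".join(parts))

print(licenses)
# ['CC BY', 'CC BY-SA', 'CC BY-ND', 'CC BY-NC', 'CC BY-NC-SA', 'CC BY-NC-ND']
```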

There are also two Public Domain tools: CC0 and the Public Domain Mark.

  • CC0: Allows creators/owners of a work to waive copyright and put their work in the public domain. This is different from a CC license because it is a choice to opt out of copyright protection.
  • Public Domain Mark: Universal label that shows a work is no longer covered by copyright. Popular with museums, this mark is for works that are free of known copyright restrictions around the world.

Get Involved!

The Creative Commons Global Network is part of the open movement focusing on collaboration and sharing works across the globe. You can learn more about the movement and how to get involved in your local Creative Commons Chapter at their website.

Resources

Visit Openverse or take a look at the Center for Academic Innovation Finding Useable Materials Guide to help you find openly licensed third-party materials.

The Copyright Team at the Center for Academic Innovation is always available to answer any questions you may have about Creative Commons licenses, including licensing your own work and using the works of others. Feel free to contact us at [email protected].

The Roundup on Research series is intended for faculty and staff who are interested in learning more about the theories, frameworks, and research in online and technology-enhanced teaching and learning.

If you have spent any time around teaching, you have probably heard mention of “learning styles.” “I’m a visual learner,” “I’m a hands-on learner,” and “My instructor didn’t teach in my learning style” are common refrains when individuals talk about their own learning. Although it is deeply appealing to be able to categorize individuals by how they learn, the concept is deeply flawed, has little empirical evidence to support it, and may cause more problems than it solves.

What are Learning Styles?

To understand why learning styles are problematic, it is important to define them clearly. The idea behind learning styles is that there are stable, consistent ways in which individuals take in, organize, process, and remember information, and that by teaching to those ways, students learn better.

One popular concept in learning styles posits that the modality of information is critical – a “visual” learner learns best by seeing, versus an “auditory” learner who learns best by having things spoken or described to them. Learning style theory would suggest that, given visual aids, a visual learner would organize and retain information better than, say, an auditory learner. The implication is that matching the modality of information to the modality of the learning style is critical to student success.

At face value, the concept of learning styles makes sense. Individuals learn differently. Most educational settings are trying to reach large numbers of students in personalized ways.  It would be useful to have an easily applied theory that would help all students learn! As educators, we want to recognize the “uniqueness” of each student and help learners in any way we can. This desire has led educators to look for easier ways to navigate the complexities of teaching. Unfortunately, learning is not that simple.

Do Learning Styles Really Exist?

In general, most learning style theories make two presumptions: 

  1. Individuals have a measurable and consistent “style” of learning, and 
  2. Teaching to that style of learning will lead to better educational outcomes, and conversely, teaching in a mismatched method will decrease achievement.

In other words, if you are a visual learner, you should learn best if you see things, regardless of the situation. If you are a kinesthetic learner, you will learn best if you can physically manipulate something, regardless of the topic. However, neither assumption is grounded in research, and these two propositions are where the concept of learning styles breaks down.

Are Learning Styles Measurable and Consistent?

Did you know that there are actually over 50 different theories of learning styles from various researchers? Researchers have tried for years to find stable individual characteristics that predict how best to help someone learn. Some theories suggest the modality of learning matters (like the common VARK theory), while others propose that details like the time of day or the temperature of the room define a learning style. One study even suggested that using a cell phone was a learning style (Pursell, 2009). The sheer number of proposed styles makes it difficult to measure and make sense of any individual style.

In addition, most learning style inventories rely on students’ self-reports about how they perceive they learn best. These self-reports are generally not validated in any way, and humans tend to be poor judges of their own learning. As a result, these surveys typically measure “learner preference” rather than “learning style.” You may think you are an auditory learner, but until it is validated that you objectively learn better through an audio format, it is a preference, not a style.

Also, when reporting results, many studies rely on “student satisfaction” or students’ own reflections as a measure of success in a class. For example, many measures of learning styles ask students how they believe they learn best. Unfortunately, satisfaction with a class and a student’s recollections of success are subjective measures, and generally not accurate ones (Kirschner & van Merriënboer, 2013; Kirschner, 2017). While understanding a learner’s preferences, or their satisfaction with a lesson, is useful, it does not carry enough weight to necessitate teaching to that preference.

Finally, “styles” are unstable and unreliable. The research on learning styles suggests that these preferences may be unstable – they may be topic-specific, and they also change over time (Coffield et al., 2004). That means an individual may be a kinesthetic learner in history this week, a visual learner in math when talking about calculus (but not geometry), and prefer to learn how to ride a bike kinesthetically instead of reading about it in a book. This calls into question whether a learning style is a “trait” (something stable and persistent for a person) or a “state” (something temporary that may change). Learning styles as states of mind are not particularly useful: how can a teacher know the preference of an individual student today in a given subject?

Does Teaching to a Learning Style Result in Better Learning?

Even more important, however, is the second assumption – does teaching to an individual’s learning style lead to achievement? Simply put, there is no evidence that teaching to a person’s specified learning style results in better learning (Alley et al., 2023; Cuevas, 2015; Kirschner & van Merriënboer, 2013; Krätzig & Arbuthnott, 2006; Pashler et al., 2008; Rogowsky et al., 2020). No study has shown that teaching to an identified learning style results in better retention, better learning outcomes, or student success. Instead, we see that teaching to a self-identified learning style has no impact on learning in children or adults (Krätzig & Arbuthnott, 2006; Pashler et al., 2008; Rogowsky et al., 2015; Rogowsky et al., 2020). Some research even finds that students performed better on tasks when taught in a different modality than their self-identified “learning style” (Krätzig & Arbuthnott, 2006; Rogowsky et al., 2020). Moreover, most studies of learning styles use a methodology that presents multiple styles to all learners – meaning there is no way to isolate the effect of matching teaching method to learning style. This leads us to conclude that while the concept of learning styles is appealing, at this point, it is still a myth.

Alternate Explanations to Learning Styles

Anecdotally, there are many stories about the success of leveraging “learning styles.” If learning styles are not empirically supported, how are these successes explained? There are alternative explanations for why teaching through multiple methods increases achievement that do not sort students into style categories. Multimodal learning explains how learning improves when various methods of teaching are used.

Learning requires sustained attention, so if an educator can capture and maintain students’ attention, students’ learning outcomes will likely improve. Providing engagement with content in multiple forms – be it through hands-on activities or different modalities – makes students pay attention to content in different ways and requires learners to integrate knowledge in new ways. If an educator uses multiple methods and modalities, the material is simply more interesting, students pay more attention, and better learning follows. Mayer and colleagues (2001, 2003) have extensively studied how students learn with visuals and audio, and the interaction of the two. They suggest that providing dual streams of information in multiple modalities engages learners to work harder at understanding the material, which leads to better learning. It may be that the research on learning styles is actually showing that teaching with different modalities is more interesting to students, rather than catering to a particular style of learning (Krätzig & Arbuthnott, 2006).

Why Learning Styles are Dangerous

While the intentions behind learning styles are good, their implications are more destructive than helpful. On the positive side, reflecting on how one learns is always valuable. However, focusing on a style suggests that learners are passive vessels at the whim of the method of teaching. Ultimately, most educators want students to actively engage in their learning. The best learning takes place when individuals can connect and incorporate information into their personal experiences and understanding. By focusing on a student’s learning style, we reinforce a simplistic view of learning – that individuals have one best way to learn. Unfortunately, learning is complex; it is hard and takes time! It has very little to do with the way information is handed to a learner and much more to do with how the learner processes that knowledge once they have it. It is important to remember – learning is within the control of the learner.

Thinking Critically About Learning Styles

If learning styles do not impact an individual’s ability to learn, why is there so much talk about them? Articles and books are still being published about learning styles and how to tailor teaching to reach every style. Research on teaching and learning is a complicated discipline, and being able to examine theories and concepts like learning styles critically is important to anyone working in education. The challenge is to keep a skeptical eye when you hear about research supporting learning styles and ask the right questions to make sure you are getting good information.

What Should You Think About the Next Time You Encounter Learning Styles in the Wild?

  1. What framework of learning styles are they referring to? Some are more empirically vetted than others. The most popular framework, VARK (Visual-Auditory-Read/Write-Kinesthetic), is also the least validated. Find out more about the learning style being discussed.
  2. How are they measuring both learning style and success? Are they self-reported? Are they looking at academic results or a self-report of satisfaction with learning?
  3. Is the study carefully controlled? Many studies fail to tailor the learning to a particular style. Instead, the lesson uses all the styles to reach all the students, leaving no way to truly measure the effect of any one style.
  4. Learning styles can be controversial. They aren’t necessarily harmful if they encourage people to reflect on teaching and learning in different ways. They can be harmful if students believe that their learning is outside their control.

References

Alley, S., Plotnikoff, R. C., Duncan, M. J., Short, C. E., Mummery, K., To, Q. G., Schoeppe, S., Rebar, A., & Vandelanotte, C. (2023). Does matching a personally tailored physical activity intervention to participants’ learning style improve intervention effectiveness and engagement? Journal of Health Psychology, 28(10), 889–899.

Coffield, F., Moseley, D., Hall, E., & Ecclestone, K. (2004). Should we be using learning styles? What research has to say to practice. Learning and Skills Research Centre.

Cuevas, J. (2015). Is learning styles-based instruction effective? A comprehensive analysis of recent research on learning styles. Theory and Research in Education, 13(3), 308–333.

Kirschner, P. A. (2017). Stop propagating the learning styles myth. Computers & Education, 106, 166–171.

Kirschner, P. A., & van Merriënboer, J. J. G. (2013). Do learners really know best? Urban legends in education. Educational Psychologist, 48(3), 169–183.

Krätzig, G. P., & Arbuthnott, K. D. (2006). Perceptual learning style and learning proficiency: A test of the hypothesis. Journal of Educational Psychology, 98(1), 238–246.

Lau, W., & Yuen, A. (2009). Exploring the effects of gender and learning styles on computer programming performance: Implications for programming pedagogy. British Journal of Educational Technology, 40(4), 696–712.

Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educational Psychologist, 38(1), 43–52.

Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105–119.

Pursell, D. P. (2009). Adapting to student learning styles: Engaging students with cell phone technology in organic chemistry. Journal of Chemical Education, 86(10), 1219–1222.

Rogowsky, B. A., Calhoun, B. M., & Tallal, P. (2015). Matching learning style to instructional method: Effects on comprehension. Journal of Educational Psychology, 107(1), 64–78.

Rogowsky, B. A., Calhoun, B. M., & Tallal, P. (2020). Providing instruction based on students’ learning style preferences does not improve learning. Frontiers in Psychology, 11.

How this will help

Examine the pros and cons surrounding ChatGPT
Navigate the concerns about LLMs as well as their potential benefits for learners and instructors

It is safe to say that by now, you have seen many articles, posts, opinions, and stories about ChatGPT, and about large language models (LLMs) more broadly, in relation to higher education and to teaching and learning in particular. These writings span a range of perspectives, from raising concerns to celebrating new opportunities, and often a mix of both. They also continue to grow rapidly in number as new AI-powered LLMs emerge and evolve (e.g., Google’s Bard).

The intent of this piece is not to add another article sharing tips or concerns about ChatGPT. Instead, this article (1) summarizes the major concerns about ChatGPT and (2) outlines ideas about its positive implications, based on what has been published to date.

Concerns about ChatGPT

Faculty, scholars, and higher education leaders have raised several concerns about ChatGPT. These concerns stem from the possible ways it can be used.

  • Using ChatGPT to cheat by asking it to write essays or answer open-ended questions on exams, discussion forums, and homework assignments (December 19th, 2022 NPR Story) (December 6th, 2022 Atlantic Story) (January 16th New York Times Story).
  • Using ChatGPT to author scholarly works, which conflicts with the ethical standards of scientific inquiry. Several high-profile journals have already formulated principles to guide authors on how to use LLM-based AI tools and why such a tool cannot be credited as an author: any attribution of authorship carries with it accountability for the scholarly work, and no AI tool can take such responsibility (January 24th, 2023 Nature Editorial).
  • ChatGPT can threaten the privacy of students and faculty (and any other user). Its privacy policy states that data can be shared with third-party vendors, law enforcement, affiliates, and other users. Also, while users can delete their ChatGPT accounts, the prompts they entered into ChatGPT cannot be deleted. This poses a risk for prompts involving sensitive or controversial topics, as that data cannot be removed (January 2023 Publication by Dr. Torrey Trust).
  • ChatGPT is not always trustworthy, as it can fabricate quotes and references. In an experiment conducted by Dr. Daniel Hickey of Indiana University Bloomington’s Instructional Systems Technology department, “ChatGPT was able to write a marginally acceptable literature review paper, but fabricated some quotes and references. With more work such as including paper abstracts in the prompts, GPT is scarily good at referencing research literature, perhaps as well as a first-year graduate student.” (January 6th, 2023 Article by Dr. Daniel Hickey)

Excitement about ChatGPT

At the other end of the spectrum, several ideas express interest and excitement about ChatGPT in higher education. These ideas stem from the ways it can be used ethically and in a controlled manner.

  • Using ChatGPT to speed up the writing of drafts for several outlets (reports, abstracts, emails, conference proposals, press releases, recommendation letters, etc.). ChatGPT can produce elaborate drafts that must then be edited to remove any inconsistencies or inaccuracies (December 7th, 2022 Social Science Space story).
  • Using ChatGPT in the process of brainstorming ideas for curriculum design, lesson planning, and learning activities. The tool can provide some novel ideas or remind educators of some instructional techniques and strategies that they had heard about in the past (January 23rd, 2023, Article by Dr. David Wiley).
  • Using ChatGPT to provide students with tutoring and scaffolding. The tool can act like a virtual tutor who does not simply give the answer to the student but rather scaffolds them to reach the correct answer by themselves (Sal Khan, founder/CEO of Khan Academy, Spring 2023 TED Talk).
  • Teaching with ChatGPT to train students on using AI tools and models, provide opportunities to exercise critical thinking skills, and improve their technological literacy (January 12th New York Times story).

Concluding Thoughts

There are major concerns about ChatGPT and the larger phenomenon of AI-powered large language models (LLMs). These concerns are legitimate, and they are counterbalanced by notable ideas about the positive implications of AI-powered LLMs in higher education classrooms. As we aspire to make evidence-based educational and learning design decisions, we should carefully review the research done so far on AI in relation to higher education and engage with the gaps as opportunities to expand knowledge and identify new opportunities and risks.

Our University’s newly formed advisory committee on the applications of generative AI is a good example of how higher education institutions can guide the use, evaluation, and development of emergent AI tools and services. Additionally, public discussions about generative AI and its implications for education are necessary to strengthen the public-facing mission of the University, where input from educators, students, and members of the community is welcome.