Teaching is Hard. Let’s Solve One Challenge: AI Slop.

[00:00:00]

Welcome to Make EdTech 100. I'm LindyHoc, educator, K-12 EdTech advisor, and your host. This is a podcast where we keep it real about what actually works in classrooms. No hype, no overwhelm, just practical strategies, honest stories, and tools that make a real difference for teachers and students. So come along with me on a journey to make EdTech 100.

Welcome to the inaugural episode of Make EdTech 100. I'm thrilled to share this time with you and honored that you chose time out of your busy day to spend with me. Teaching is hard. Full stop. It seems like every year it gets a little more challenging in one way or another, so I wanna spend the very first episode of this podcast giving one [00:01:00] solution to one very real problem that a lot of educators are facing right now, and that challenge is AI slop.

If you're not familiar, AI slop is the new pop-culture term for work that is clearly copied and pasted from AI. If you've ever opened student work and you knew in your gut that they just didn't write it, you're not alone. It's frustrating. Let's just pause and recognize that for a moment. It may seem like an impossible hill to climb, especially when you're in the thick of it.

It can be so frustrating, and it's hard to see a way out. It's like you're in a dense forest and you can't see the way through. But educators, we are resilient. We're problem solvers, and I promise you we can overcome this. So today [00:02:00] I'm gonna share a framework I created to make assessment in the age of AI feel doable again.

And actually, I feel like it even makes assessment exciting and better for us as educators, not just a change we have to make because of generative AI in particular.

Alright, it's time for a segment I call Tech Rec or Tech Wreck, where we give an honest take on a strategy, a trend, or an idea. Today's topic: one way you might see suggested to combat AI slop is AI content detectors. These detectors claim to identify what percent of a text is human-written versus AI-generated. I know this seems like a great solution to the problem. Quick, easy, done, we're gonna use this, just like we used to use plagiarism detectors to see [00:03:00] if a text had been copied from another source.

It seems like it should work, but the honest truth is AI content detectors are a TECH WRECK, all caps. Sometimes Tech Rec or Tech Wreck is gonna be opinion-based on this podcast, but this one is rooted in research. There are so many studies out there showing that AI content detectors just don't work. In fact, these studies show that you're better off flipping a coin to determine whether a piece of text was written by a human or by an AI model.

So think about that for a second. You're literally better off labeling one side of a coin "human" and the other side "AI" and flipping it. That's just as accurate as using these AI [00:04:00] content detectors. That's really problematic. In summary, they have really high error rates.

And if those high error rates aren't bad enough, there's also research showing that these content detector tools are really easily misled, meaning you can make just a few tweaks to a text and really sway the results. I promise you, students are doing this. If you go on TikTok and YouTube, you'll find a million videos of students explaining how to mislead these AI content detectors.

Also, and I don't know which finding is most concerning, but this one's pretty concerning to me: more research shows that these AI content detectors are biased against both non-native English writers and neurodivergent learners. And if we need even more proof as to why these content detectors are a tech [00:05:00] wreck, there's also a breakdown of trust.

They kind of create this gotcha culture. This really is, I think, the most concerning part for me, because if the pandemic taught us anything, it taught us that the student-teacher relationship is so critical to learning, and research overwhelmingly backs that, by the way. Literally, there are studies showing that if there is no student-teacher relationship, learning just will not occur.

So coming out of the pandemic, I feel like every school, every teacher, every educator has made a really distinct effort to focus on building those relationships. And now these content detector tools, which are easily misled and have high error rates, are coming [00:06:00] along and degrading that trust we've spent so long building back up with our students. To me, that effect on the culture of teaching and learning is very concerning. Even if you don't wanna look at the high error rates, how easily they're misled, and how they can be biased against certain populations, just take a minute and think about that. So, for me, moral of the story: AI detectors are a tech wreck.

Now, one of my personal pet peeves, if you know me, is presenting a problem with no solution, and I just presented a big problem. So, Lindy, quit talking and get to the solution. [00:07:00] My solution centers on shifting from policing mode to redesigning mode. Policing, like using AI content detectors, focuses on the symptom.

Redesigning really hits at the core of the problem, and the core of the problem of AI slop is this: if an AI can complete an assessment in full, that assessment has to change. If an AI can do it, that assessment needs to look different. So the key idea behind my solution is that it's all in how you, the teacher, craft the learning experience.

I say this to teachers all the time: you are the crafter of the learning. So my solution is called the Assessment Puzzle [00:08:00] Framework. It's just one strategy to help you think about redesigning assessments in the age of AI. It's important to note that I'm not saying this is the only way to redesign assessments or to think about assessing learning differently.

It's just one strategy of potentially many out there, and one I have personally had a lot of luck with. I've shared it with a lot of teachers over the last couple of years and gotten a really great response to it as one really effective solution to AI slop. So the Assessment Puzzle Framework goes like this.

Think of crafting AI-friendly assessments like building a puzzle. You need at least three pieces to make a puzzle. I always use the example of a toddler. If you were to hand a toddler two puzzle pieces, they're [00:09:00] gonna look at them and put them together in a few seconds or less. There's no critical thinking.

There's really not any problem solving there, right? But now if you hand the toddler a third puzzle piece, they have to think about it. They've gotta think about which two pieces to put together first, and then how that third piece fits in. Now you have some critical thinking and some problem solving involved.

Going back to assessment: the Assessment Puzzle Framework uses that same idea of a puzzle. You have to have at least three pieces in an assessment for it to really be AI-friendly. Just like a puzzle needs multiple pieces to create the full picture, an assessment needs at least three different ways of showing learning, so you can ensure [00:10:00] integrity and really evaluate that learning happened and that the student didn't just outsource their thinking to an AI or copy and paste. Each puzzle piece in the framework represents a different way students express their learning.

Like text. Yes, text is still there; it's just not the only puzzle piece. That's the trick: you can't have just text anymore, unfortunately, because that's what AI currently excels at. It's starting to excel at other pieces, but text is the big one with large language models, right? Visuals are another puzzle piece. Audio and video. Voice reflection is a big one; that one can be audio and/or video. Annotations are a puzzle piece, and if you're not familiar with that term, I like to describe an annotation as purposeful doodling. Personal connections are a puzzle [00:11:00] piece. Current events. And even collaboration with AI is one piece, one puzzle piece, I should say.

So yes, AI-friendly assessments leverage AI when appropriate. Not always, but there are some really key points where AI can be a really helpful learning tool. When students have to synthesize across these multiple modes, like video, audio, text, and annotation, learning becomes visible.

Making learning visible is really the key to the Assessment Puzzle Framework. If you're familiar with John Hattie's research, he talks a lot about making learning visible and the importance of that. When learning is visible, you can ensure integrity. You know whether the student truly understands a [00:12:00] concept or outsourced their thinking to AI.

So at this point, I think I need to give you an example to really bring this framework to life. I like to start with the classic photosynthesis concept because almost everybody has learned about photosynthesis from the learner perspective, and many of us have taught the concept, so it's just a good example for building out this idea of the Assessment Puzzle Framework.

A "before" assessment for photosynthesis might look something like this: write a one-page essay explaining photosynthesis. AI can do that in seconds. Actually, probably milliseconds. Microseconds, even; I think microseconds are less than milliseconds. I need to fact-check myself there. Picoseconds, I know that's smaller.

The student [00:13:00] learns very little 'cause the AI can do it, and you get polished slop.

So here is the redesigned assessment: create a visual diagram showing photosynthesis in our local ecosystem. Use images and/or videos of local plant life. Use annotations to build out the diagram. Record a voice reflection explaining how our local climate affects the process of photosynthesis differently than other climates.

Now I know that seems like a lot, a lot more than the one-sentence "write an essay about how photosynthesis works." But if you break it down, you can start to see these different puzzle pieces. We've got the visual: they're actually drawing out the process. They're using annotations to explain the process. There's the voice reflection part, where they're explaining [00:14:00] as they build out the photosynthesis process. And, you don't have to do this, but it's kind of an easy bonus, especially with something like this: we have the puzzle piece of personal connection, where they actually have to tie it to their local climate and ecosystem. Just another little punch to add one more puzzle piece in there.

So it's the same content. We're still teaching photosynthesis. We're still assessing their understanding of photosynthesis, but we're deepening the thinking. We're also adding a level of relevance, by the way, through that personal connection to their local ecosystem and climate.

In the redesigned assessment, you tweak how students show the learning, and you shift away from recall, which is AI's strength. That's what AI is good at: it can literally spit out and recall facts and information in a picosecond. It's not always correct, [00:15:00] of course, but I would say most of the time it is; we're getting fewer and fewer hallucinations as the AI models get better and better.

Moral of the story: AI is good at recall, so we're shifting from recall to synthesis, where the human brain has to put together all these different pieces via talking, annotating, and attaching it to their local climate. This particular photosynthesis example includes two of my favorite puzzle pieces: voice reflection and annotations.

And on that note, some puzzle pieces hold more weight or power than others. I like to explain them as the corner puzzle pieces. Everybody loves the corners, right? That's always where you start when you're building a puzzle. If you don't start with the corners, how are you even building? How do you know where to start [00:16:00] your puzzle without the corner pieces? So voice reflection and annotation are kind of like two corner puzzle pieces that hold a lot of power, because from the voice reflection side, when a learner explains something out loud, it's literally like you, the teacher, getting X-ray vision into their brain.

You're literally seeing and hearing the student's thought process. It's really amazing and really, really powerful. I gave you a science example with photosynthesis; let's look at a few other examples in different content areas. I wanna do math. Math works really well with this framework. So here's an example of a math assessment.

Solve a multi-step word problem on video using two different methods. Show your work for each approach [00:17:00] while verbally explaining every step and why you chose each method. So basically, to break that down: they are solving a math problem with two different methods. You don't have to require two different methods; you could have them choose the method that works best for them, depending on your learning outcomes and what you're assessing.

They're basically just solving the problem and explaining out loud why they're making the decisions they're making as they solve it. This works really well in this framework because, what are math teachers always saying? Show your work. The Assessment Puzzle Framework is totally focused on showing your work. In other words, making learning visible. Now, I gave you two core content areas, math and science. Let's look at an art example. For my elective and fine arts teachers out there: [00:18:00] use AI image generation to create a piece of art that embodies the characteristics of surrealism.

Annotate the image to identify at least three surrealist elements. Record a voice reflection explaining how your creation reflects the surrealism movement's principles. So hopefully this is starting to come together. That particular example actually uses the collaboration-with-AI puzzle piece, because they're using a text-to-image generator to generate a piece of surrealist art. Is that how you say it, surrealist art? Art teachers, don't judge me if I said that wrong. I didn't teach art. Your kids are safe. I think it's surrealist art, not surrealism art. Yep, I just said it out loud and it made sense: surrealist art. And by the way, I wanna take a pedagogy pause here for a [00:19:00] second.

Prompting an AI uses so much vocabulary. There's so much vocabulary development as part of prompting, 'cause think about it: in this example, they were writing a text prompt to create a piece of surrealist art that doesn't exist yet. They have to know how to explain the characteristics of surrealist art, which is key art terminology, just to write a prompt that generates anything at all. Then, once they've done that, they have to explain whatever piece of art they made via annotation, actually pointing out elements within the art they created and explaining, over the top of it, why each element shows a characteristic of that art period.

So, like I said, [00:20:00] hopefully this is starting to come together for you. You're starting to see these different puzzle pieces and how you can put them together to really require synthesizing information and make learning visible. If it isn't coming together for you, or maybe you're a really visual learner, don't worry. I have all of this outlined as a free download on my website, and I'll tell you at the end of the episode how to get it so you can dig deeper and see lots more examples.

In fact, I think there are over 35 examples across all content areas and all grade levels, K through 12. There are primary examples in there, I promise; I didn't forget about you, K-1-2 teachers. There are examples for electives, CTE, STEM, all the core content areas, et cetera, all in that toolkit.

The next step is to make [00:21:00] this possible, and to make it possible for an educator, I know it has to be quick and easy. We have very little extra time. As educators, we have very little non-instructional time in our day, time when we're not actively instructing students.

There are a million other things we have to get done during that non-instructional time: answering emails, grading, putting grades in, going to meetings, on and on and on. So I wanna acknowledge that it's easy for me to talk about redesigning assessments like, oh, we can snap our fingers and voila, it's done.

We've redesigned our assessment.

The reality is it isn't a snap-your-fingers moment, but it also doesn't have to take a significant amount of your time. The trick is to take it assessment by assessment. Just do them one by one. So here's what I want [00:22:00] you to do. Pick one upcoming assessment that feels copy-and-pasteable, one where a student could easily outsource their thinking to an AI, copy and paste the result, and submit it to you. Take that assessment and add a visual component. You're not necessarily changing the assessment in full; you're adding a visual component to it. Then add a voice reflection element that requires the student to explain their thinking.

Done. Honestly, adding that voice reflection alone, you could even drop the visual component, will completely transform that assessment from being copy-and-pasteable to requiring the learner to explain what they understand, or maybe don't understand.

Last piece of the puzzle. See what I did there? I know you're thinking: how do I [00:23:00] find school-friendly tools I can use with my students for a lot of these puzzle pieces, like the voice reflection? In the Assessment Puzzle Toolkit, there are two pages of school-friendly tools for you.

I wanna call out one in particular that is a really great starting point, and that tool is Snorkl, without an E: S-N-O-R-K-L. It really is the perfect tool for adding visuals, voice reflections, and annotations. Snorkl is a lot like the app called Explain Everything, if you used that back in the day on the iPad.

It was a really popular iPad app. It still exists; Promethean bought it, so if you're at a Promethean school, you might still have access to it. There was another app on the iPad called ShowMe. I don't know if that one still exists or not, but that gives you an idea of the other types of apps out there.

You get [00:24:00] a whiteboard, and you as the teacher can add stuff to that whiteboard for students to start from, or not. So in the photosynthesis example, you might have the instructions at the top and different pieces of the process laid out for students to drag around. They hit a draw button, they can add annotations, and while they're doing all of this, it's recording. It's recording them thinking out loud and explaining why they're moving this piece of the photosynthesis process over here, why they're adding that annotation, and why it works this way. That's the idea: a whiteboard that students talk over the top of, and that whiteboard gives them the power to do lots of different things, like images, annotations, et cetera.

Now, the really great thing about Snorkl that makes it above and beyond is that it gives [00:25:00] students immediate feedback on their videos. Once the student hits stop recording, it processes the video. And to back up a step: when you, the teacher, set up the Snorkl assignment, it asks you to tell it what you want the feedback to focus on.

So you as the teacher guide the AI and say, I wanna make sure they understand this concept, that they get this right; I wanna make sure you give them feedback on X, et cetera. Then the AI processes what the student said and what the student did in terms of visuals on the whiteboard, and gives them that immediate feedback.

This is awesome for two reasons. Reason one is that we know research shows immediate feedback is key. It's really, really important; there's overwhelming research showing how critical immediate feedback is to the learning process. But two, and I know you're thinking it, you're like, [00:26:00] Lindy, I have 30 kids in my class.

Or maybe you're a middle school or high school teacher and you have 150, maybe even 200 kids that you see every single day. You're like, I can't listen to that many videos. That's why Snorkl is so amazing: it does that for you. It gives the student the feedback, but then it puts it all into a teacher dashboard and really calls out, student by student, that this student got this concept but maybe didn't fully grasp that one.

And then, if you want, you can click in and watch the student video. Maybe you're starting to see a similar misconception, so you want to watch a few videos to get an idea of what students weren't understanding or where they went wrong, for example. You can do that, but you don't have to, and you don't have to listen to 200 videos, one from every single student in your class.

So that is the other reason why Snorkl is just chef's kiss.

So, as I said, Snorkl [00:27:00] is just one example of a tool you can use to really make this Assessment Puzzle Framework come to life, but there are lots of other tools in the toolkit. Again, welcome to the Make EdTech 100 podcast. This episode is what I call a LindyHoc Take, where you get to hear from me, but the show's gonna be like an EdTech variety show of sorts.

Sometimes you're gonna get these LindyHoc Takes. Other times you're gonna hear educator stories, and you'll even hear from the EdTech products themselves.

Before we wrap, we end every LindyHoc Take episode with a Make EdTech 100 Moment. I think today's 100 Moment, appropriately, is to make learning visible. Think about how you can shift to focus on the process of learning, making that [00:28:00] process visible rather than just focusing on the final product. AI can make products, but AI can't recreate the learning process. So document that process and get X-ray vision into your students' brains. I turned my Assessment Puzzle Framework into a 20-plus-page toolkit. Like I said, it includes everything you've heard here, plus more detail and over 30 more examples.

Actually, probably more like 35 more examples of copy-paste-proof assessments across all content areas and grade levels. There's also a list of school-friendly tools, like I said, Snorkl and many others, to help make this happen. It's all available as a free download at LindyHoc.com/assessment-puzzle.

I'll make sure to put that link in the show [00:29:00] notes.

Thanks for joining Make EdTech 100. I know educator time is valuable and I'm honored you choose to spend yours with me. For more EdTech strategies you can use tomorrow and ways to bring me to your school or event, head to LindyHoc.com. If this episode resonated, hit subscribe so you don't miss the next one.

I'm LindyHoc. Go forth and make EdTech 100.
