What is the Research Telling Us About AI Literacy?
Welcome to Make EdTech 100.
I am LindyHoc, educator, K-12
EdTech advisor, and your host.
This is a podcast where we keep it real
about what actually works in classrooms.
No hype, no overwhelm, just practical
strategies, honest stories and
tools that make a real difference
for teachers and students.
So come along with me on a
journey to make EdTech 100.
It is a big day today at Make EdTech 100.
First of all, it is March 27th, which
means it is officially AI Literacy Day.
Secondly, it is the
10th episode of Make EdTech 100.
We made it.
Thank you for sticking around
through the first episodes.
We're figuring out the technical
things and trying to make your listening
experience the best it can be.
I think I have gone through, no
joke, five pairs of headphones for
interview episodes, and I'm still
not happy with what I have right now.
So if you have a headphone
recommendation that, one, won't cause
mic bleed and, two, provides a great
listening and talking experience,
I am all ears, pun intended.
The first ones I had caused some mic
bleed, so I was getting echo, and the
ones I have now are just not a great
listening and talking experience
when I'm interviewing people.
So that's where we're at.
Anyway, again, thank you.
Thank you.
We made it to episode 10.
We're figuring it out.
Thanks for listening and sticking around.
However, that is not what today is about
because today is National AI Literacy Day.
So question number one: is AI
literacy a tech check or a tech rec?
I do this to some of my guests,
so I have to do it to myself.
If you have followed any of my
work, you know that I have been
literally traveling the country
preaching the importance of AI literacy.
AI literacy is a giant tech
check, so it is quite literally
my jam to talk about this.
But today I want to spend AI Literacy
Day talking about something else.
I have been traveling the country
preaching something related,
and that is the research on AI literacy.
The research, plus my personal experience
teaching AI to people of all ages, is
what is driving my push to preach to
everyone about, A, the importance
of AI literacy, and, B, increasing
AI literacy in people of all ages.
Let's start with a definition,
just in case you're not familiar
with the term AI literacy, and make
sure we're all on the same page
of what we're talking about here.
There are a lot of definitions out
there, and I mean like a lot, a lot, a lot.
But I like to use the definition from
Digital Promise's AI Literacy Framework.
It says: the knowledge and skills that
enable humans to critically understand,
use, and evaluate AI systems and tools
to safely and ethically participate
in an increasingly digital world.
I like this definition because
it has three clear verbs,
understand, use, and evaluate.
It's really easy to remember, and
they have this really great graphic
that shows how all three of those
verbs have to occur together.
So if you don't understand AI,
you won't know how to effectively
use it and evaluate it.
If you don't use AI, you're not
going to fully understand it.
If you don't know how to evaluate AI
outputs, then you aren't fully using
and understanding it, et cetera.
There are several studies out there
that define AI literacy, and lots
and lots and lots of frameworks
that define AI literacy as well.
In fact, the number of published
definitions doubled between 2022 and 2024.
I told you, there's a lot
of definitions out there.
So like I said, I like the Digital
Promise AI Literacy Framework
definition, but if you want more of a
definition coming from a research study,
there is one study called What is AI
Literacy? Competencies and Design
Considerations, and its definition is:
a set of competencies that enables
individuals to critically evaluate AI
technologies; communicate and collaborate
effectively with AI; and use AI as a tool
online, at home, and in the workplace.
So all the definitions are
basically saying the same thing,
just using different words.
Basically, AI literacy is the
foundation of healthy, productive,
ethical, and responsible AI use.
What is the research
saying about AI literacy?
This first study I share in almost
every AI training I do for educators.
It shows the correlation
of AI literacy and AI use.
The study calls it propensity to use AI;
that's essentially what they call AI use.
And the name of the study is Lower
Artificial Intelligence Literacy
Predicts Greater AI Receptivity.
So, in not so many words, this study
found that as a person's AI literacy
increases, their propensity to use
AI actually decreases. The study
calls it the magical thinking trap.
And this is a quote directly from
this study: This lower literacy,
greater receptivity link is not explained
by differences in perceptions of AI's
capability, ethicality, or feared impact
on humanity. Instead, this link occurs
because people with lower AI literacy
are more likely to perceive AI as magical
and experience feelings of awe in the
face of AI's execution of tasks that seem
to require uniquely human attributes.
In other words, when you don't
understand how AI works, that it's
just a prediction machine, just
predicting the next token, you view
it as magical and are in awe that
it can do human-like tasks.
This magical thinking results in less
critical evaluation of AI's outputs.
You think it's the answer to everything.
You think you can just copy and paste
the outputs it gives you, and that
you don't need to review them
and you don't need to edit them.
One more thing I want to note about this
particular study is that it's not an
education study. Most people are like,
oh, this came from a college of education.
Nope.
Nope.
It's a marketing study that comes
from multiple business professors
from different business schools,
higher education institutions, I
should say, and this really
hits home for me.
The last two sentences of the abstract
of this study say: these findings
suggest that companies may benefit
from shifting their marketing efforts
and product development toward
consumers with lower AI literacy.
In addition, efforts to demystify AI,
AKA increase AI literacy, may
inadvertently reduce its appeal.
So just stop and think
about that for a second.
This means that companies are preying
on people with low AI literacy to
get them to buy their stuff. This
is a problem for society as a whole,
especially for our vulnerable
populations, which include our kids.
And if you're teaching at the higher
ed level, not K-12, the majority
of your students still do not
have fully developed brains.
Our brains don't fully
develop till around 25.
So the majority of college-aged kids
still have underdeveloped brains.
I still consider them to
be a vulnerable population.
The next study is called ChatGPT in
Lesson Preparation, and it gives us
insights into AI literacy, as well as
whether teachers can leverage generative
AI to save time on instructional
preparation without decreasing quality.
That's something I've been
thinking about a lot lately.
So in the study, teachers
were split into two groups.
One was asked to not use
generative AI for lesson planning.
The second group was provided access to
ChatGPT, as well as a guide for how to
use ChatGPT for instructional planning.
Okay, so they got a guide that
helped them learn how to use it.
For this task, the teachers with
access to ChatGPT spent 31% less time
creating lessons than their peers.
And here is the important part.
There were no detectable differences
in lesson quality. So, multiple
AI literacy pieces here.
First thing to note is they were given
guidance; they weren't just handed
ChatGPT. And one thing I didn't note
yet: their time savings increased over
time. Over the first, I wanna say,
eight to 10 weeks, they had around
a 20% to 25% savings, something like
that. And then over six months, that
time savings increased to 31%, and,
wait for this, still no detectable
differences in lesson quality.
Here's the AI literacy piece:
teachers' use of AI tools.
So I told you that over time
their time savings increased.
Meanwhile, their use of AI tools
decreased from 39% to 29%, yet their
savings not only persisted but increased.
So the study says, quote, suggesting
teachers quickly learn where AI adds
value and deploy it more selectively.
So this is really reinforcing this
idea that when you increase AI literacy,
your use of AI, or propensity to use
AI, actually goes down, because you
start to learn: When can it help me?
When can it not help me?
When should I use it?
When should I not use it?
This is really reflected in my personal
use of this technology as well.
Those two studies show the amount
of AI use, but several other
studies note a troubling gap
between AI use and AI literacy.
We know students of all ages are
already using AI, with or without
guidance. So whether you show them
how to use it right, or whether you
ban it, block it, or open it
up, it doesn't matter.
The research suggests that if the use is
unguided, it leads to passive consumption,
AKA copy-pasting outputs, rather than
the collaborative, iterative engagement
that actually improves learning outcomes.
The question isn't, will they use it,
but do they have the literacy to use
it well? I will link to one of
these studies in the show notes.
And by the way, I will link to all of
the studies I referenced, or if I'm kind
of summarizing, I'll link to a couple
of the studies that I'm referencing.
Now, these studies are focusing
specifically on AI skills.
There are studies that find that
AI literacy instruction doesn't
just teach students about AI.
It tends to develop transferable
skills such as critical engagement,
reflection and metacognition, iteration,
communication and collaboration,
creativity, and even emotional
regulation for certain populations of
students, like multilingual learners.
In other words, when you increase AI
literacy, you aren't just focusing on AI.
You aren't just focusing
on the technology.
And I'm hearing this a lot:
we don't have time for this.
I get it.
Those darn standardized tests
dictate everything we do.
We've got these massive
standards we have to hit.
These curriculums aren't
super flexible all of the time,
but the research is telling us
that we have to find the time.
And my approach, if you follow my
work, is that I really have this
idea that AI literacy is core.
Emerging technologies, and technology in
general, don't exist outside of humans.
Humans create technology because they went
to school and learned core curriculum.
So I kind of take the approach that
it doesn't need to be a standalone
thing. It doesn't need to be that we
carve out more time for another class.
It should be integrated into what we're
already teaching. And these studies
are showing that students aren't
just learning about the tech,
they aren't just learning about AI.
So much of AI literacy is understanding
how to have a thought partner and
iterate and collaborate, for example.
That is a summary of the why,
why AI literacy is important.
But let's look at what the research
says about when we should teach AI
literacy. For the rest of these points,
I'm gonna give more of an overview and
summary of the research in general,
rather than going into the specifics
of individual studies, but again, I'll
put links to a few of the studies
I'm referencing in the show notes.
In terms of when, most of the research we
have on AI literacy programs is focused
on middle school, high school, and higher
ed. In fact, Stanford has this AI and
education hub with a repository of over
800 AI and education research studies,
and just a few weeks ago they did a
summary looking at those 800 studies
in the repository and asking, what
do these studies say? One of the main
findings of that review was that K-5
AI literacy is really understudied,
and, no surprise, higher ed
is the most studied.
That is always the case with research.
By the way, I have a book about online
learning, and I say in that book that
there's way more research on online
learning at the higher ed level than
there is at the K-12 level. But it
makes sense, because college professors
are the ones doing the research.
They have much easier access to
college-aged students than
K-12 aged students.
So what we have to do is take the
findings from that research that is
focused on higher ed and combine it
with our experience to find this
happy medium space. Moral of the
story: there's not a ton of
research on the when.
When you get down to elementary-age
students, however, there is a framework
called AI4K12 that maps out AI literacy
concepts across all grade levels.
And what that framework makes clear
is that the foundational concepts,
like what is AI, how does it make
decisions, who builds it, and why
do they build it, are absolutely
teachable in elementary school
and even preschool.
Don't panic.
I'll talk more about that in a second.
So the moral is, we're typically
waiting until kids get to high school,
maybe middle school, to take something
like a computer science class.
We might be embedding some
computational thinking in STEM and
STEAM in elementary school, but
really we need to be doing
it as soon as possible.
And I say, honestly, as soon as kids
start to talk and can communicate and
understand what you're saying, you
need to start having the conversations.
And the key word there is conversations.
Like I said, don't panic.
When I say we need to teach AI
literacy as soon as possible, some
people take it to the extreme, as
we humans do, right?
Your mind goes to, oh my gosh, we're
gonna put kindergartners in ChatGPT.
No.
Not the case at all.
So much of the AI literacy work that
I do is not using AI directly, and
actually a lot of it is no tech or very,
very low tech, with students of all ages.
I'm super excited. I did this
Kickstarter for this AI literacy
card deck for elementary kids.
I haven't gotten it yet, but I
think it should be coming soon.
And it's basically just a big set
of cards that are all completely
no-tech activities or discussion
starters that you can use with kids
to start talking about AI literacy.
So that's the key.
Start talking, start
having the conversations.
I always use the example of what we
call in my house the lady in the
corner, AKA A-L-E-X-A. I can't say
it or else it'll start talking to
me, because I have one in the
corner of the room here.
How many households have those,
or Hey Google devices?
I don't know what those are called;
I think they're called Hey Google.
I do not have those. Or phones that
have S-I-R-I. I can't say that one
either, 'cause my phone is right here.
I have this whole session I do
that's all about this idea that
we don't access technology
just through a screen anymore.
Think about it.
The lady in the corner has
a bit of a screen.
The Hey Googles, I don't think,
have a screen at all.
You know, I always use the example of
walking to school or work this morning:
your face was highly likely scanned
by a video camera using AI technologies
to do facial recognition. That's not a
screen. That's not sitting in front of
a computer and interacting with a
keyboard and a mouse or a touchscreen.
Technology is literally the infrastructure
of the world that we live in.
And because of that, you really can't
avoid it if you're a member of society.
And we need to have these
conversations as soon as possible.
So, bottom line on timing: don't
wait, and definitely don't assume AI
literacy is only for computer science
teachers or only for high school kids.
We've covered the why.
We've covered the when.
Let's talk about the how.
How do we teach AI literacy?
There is not a ton of research on the
how, but what we do have points to
three things: it needs to be hands-on,
it needs to be iterative, and it works
best when it's integrated across subjects
rather than siloed into one class.
So, exactly what I said from my
experience. I'm really focusing on:
hey, you don't need an AI literacy
class. You don't need an AI class.
We need to give teachers the knowledge
and curriculum to be able to do
this and integrate it into math,
and yes, ELA, social studies,
electives, everything.
Technology and AI. And it's not just
AI. We're talking AI today
'cause it's AI Literacy Day,
and this is really one of my passions
and goals in my life right now,
to talk about AI literacy, because
I think it is the bedrock of our
society going in positive directions.
But AI literacy is just a branch,
a very big, important branch,
under digital literacy.
So it's more than just AI.
And now that pretty much any tech has
an element of AI in it, it's teaching
digital literacy and digital citizenship
concepts embedded throughout our
curriculum and every content area.
Back to the how. I said it needs to
be iterative, it needs to be hands-on.
This means we're talking about
project-based learning, collaborative
problem solving, real-world
contexts, not worksheets about
AI, not watching videos about AI.
That doesn't mean those can't be a
tiny part of it, but your AI literacy
work, or digital literacy work in
general, can't just be students
passively watching videos and
completing worksheets. They can be
part of it; some videos about how AI
works, and I know code.org has some
great ones, are a great starting point
and spark the conversation,
but that can't be it.
That's kind of the how-do-we-teach-it
from the student learner lens, but
it's important to talk about teachers
in particular and how we should teach
AI literacy to teachers. And surprise,
surprise, no, not a surprise at all.
The research says that technical
training alone is not enough.
I've been saying this for years,
and not just me, me and others who
work in my instructional technology
field: we can't just give teachers
point-and-click training.
It can't just be technical training.
It has to be embedded into their
curriculum and instruction
and framed around pedagogy, and
the research is reflecting this.
So there's a study that did
a systematic review of 43
studies on teacher professional
development for AI in particular,
and they found that the schools where
teachers actually changed their practice
were the schools that combined
technical skill building
with pedagogical reflection,
ongoing support, and a culture that
made it safe to experiment and fail.
Now, I saved the best for last.
We did the why, we did the when, we
did the how, and what the research is
saying about those. But it's important
to know: what are students saying when
it comes to AI and AI literacy?
The consensus is, they
are saying: teach us.
They want to be taught how
to use it appropriately and
ethically and responsibly.
There are a lot of studies
out there on this.
No surprise, they're mostly
higher ed focused, but at a K-12
level, we have to take the
findings from that.
I don't think there's this cutoff
at age 18 where we all of a
sudden shift as learners.
Again, our brains aren't fully developed
till around 25, so I think we can take
a lot from these higher ed studies.
There's a study out there that looked
at 99 different studies and synthesized
all of them; they interviewed students,
all higher ed students, on their
thoughts and attitudes toward
generative AI in particular, and here's a
little clip from the summary of the study.
Students are already using generative AI
widely, but often lack confidence about
quality, ethics, and appropriate use.
They argue that successful integration
of gen AI in higher education depends
on giving students explicit guidance,
structured activities, and clear
expectations so that they can use these
tools critically rather than passively.
In other words, students overwhelmingly
are saying: we are not confident about
how to use it, when to use it, and
what's appropriate use. They want very
explicit guidance, very structured
activities, and very clear expectations
on how they should and should not
use this technology.
They want to learn how to use AI
responsibly and appropriately,
so let's teach them.
There you have it.
We did the why, the when, and the how
as it comes to AI literacy, and
what the research so far is saying.
We also looked at what are students
saying, but it's important to note that
this is very much a developing area.
Very, very, very much so, so expect
to see more and more studies.
This is something I am really staying on
top of and using to guide my professional
development sessions and advice that
I'm giving to schools and teachers
and educators and ed tech companies.
My Make EdTech 100 moment to leave you
with on this National AI Literacy Day
is: AI literacy is not a nice-to-have.
It's a now-to-have. The educators
who understand that today
are the ones who will define
what this is gonna look like tomorrow.
Everybody always asks,
you know, is AI the existential
dread, the end of humanity as we know it?
Or is it just another tool,
just another technology?
And my answer is always: there are a lot
of roads that we could go down with this
technology, and we wanna make sure that
we choose a road with a positive ending.
And I really, really think that
AI literacy is the key to that.
And again, the only way you are going
to be able to make sure we go down
that positive route is to be a part
of the conversation and help define
what does AI look like in education.
Happy AI Literacy Day.
Thanks for joining Make EdTech 100.
I know educator time is valuable and I'm
honored you choose to spend yours with me.
For more EdTech strategies you can use
tomorrow and ways to bring me to your
school or event, head to LindyHoc.com.
If this episode resonated, hit subscribe
so you don't miss the next one.
I'm LindyHoc.
Go forth and make EdTech 100.