November 1, 2024 | Vol. 82, No. 3 | Interview

AI as a “Paintbrush of Possibility”

    Stanford researcher Victor Lee delves into AI in education—the good, the bad, and the possibilities on the horizon.


    Technology | Equity
    Photo credit: Sofia Kukhar
      Victor Lee is an associate professor at Stanford University’s Graduate School of Education and a faculty lead for the Stanford Accelerator for Learning’s initiative on AI and Education. His research focuses on AI literacy, exploring how students and teachers can critically engage with AI in K–12 settings. As part of Stanford’s CRAFT initiative, Lee collaborates with K–12 teachers to design educational experiences that prepare learners for a data-driven future, integrating AI into everyday learning across subject areas. In this interview, Lee unpacks the ever-evolving definition of “AI literacy,” shares a sneak peek at Stanford’s growing collection of AI resources, and discusses what’s next for AI in education.

      You’ve described how AI literacy can mean different things depending on one’s perspective—for example, as a user, a developer, or a critic. How would you define AI literacy in the context of a K–12 student user? What are some core aspects of AI literacy that every student should learn?

      Broadly speaking, users need to know when and when not to use AI for a specific task. With generative AI, we want to think about, How is AI supportive of, or augmentative of, the things that I’m trying to do myself?
      With student writing, for example, let’s say we’re trying to make a compelling argument. When we aim for that goal without AI, we may have a peer look at it. We may reread and mark up our writing, then walk away and think about two or three possible ways to strengthen the argument we are making.
      When we involve AI in that writing process, we have to think about how it’s helping to accomplish our writing goals—for example, to help strengthen the ideas that we want to express. AI can make some things faster, tidier, and more efficient, but it can’t replace a human. Using AI well doesn’t mean pressing a button that writes an essay; it means we’re engaging back and forth with a tool that’s augmenting our goals. So it comes down to, What is our human intent and how is AI in service of that?
      In a similar way, when you’re cooking a meal, you put thought into picking your ingredients, considering how finely to chop them, or what temperature to cook them. You can have a fancy kitchen gadget that will help along the way, but for people who cook really well, that gadget is stitched into the larger activity of the cooking process: How do I optimally use a food processor or a mandolin or a certain knife to make that meal the way I want it to be? So, being a user means knowing when and when not to use a tool, whether it’s to cook or to write.

      How can AI help students become better digital problem solvers—better critical thinkers and savvy consumers of information—especially when some argue that AI is doing the critical thinking for students?

      I would stress, especially for education now, that students need to think about misinformation and accuracy. We missed the boat very badly on that with a lot of early digital citizenship, internet, and social-media-related activities.
      So, one thing to think about now is that the term “artificial intelligence” is misleading in and of itself. It sounds like AI is smart and thinks the way intelligent people think. But it doesn’t—it spits out patterns. Some of the patterns that it can spit out right now surprise us because it’s a big leap forward, but it’s still spitting out patterns. It’s not “intelligent” like humans are intelligent.
      If you approach AI by thinking, This is not an intelligent thing, it’s just spitting out patterns, then it changes your disposition to it as an educator or a student. You think, I’ll take some of what you say or what you give me as information, but you’re not intelligent.
      So, that’s one way to think critically: AI is handy, but it’s still just a tool, and you choose what to take from it. It puts together information, but it doesn’t judge information. Knowing a little bit of the inherent limits of AI changes your expectations of it.

      We want to give teachers good raw material to work with and ways that AI can fit or combine easily with existing lessons.


      What is Stanford’s CRAFT initiative? How is it helping to shape AI literacy in K–12 education?

      CRAFT (Classroom-Ready Resources About AI For Teaching) is a growing collection of free resources for high school teachers to support students’ understanding of AI literacy. Teachers of any subject or grade level have too much to cover as it is. So, when we have a new topic like AI, it’s very understandable to say, not me—that’s for the computer teacher or whoever else.
      But AI is such a big shift that it does impact everyone because it opens up how we think about art, music, writing, history, etc. At the same time, that doesn’t mean that everyone’s responsible for all knowledge about AI.
      So, the CRAFT initiative is trying to speak about what AI is for a specific subject area, and offering small tweaks for incorporating AI within or as part of an existing lesson. It’s grounded in what teachers already do because it’s being built with and by teachers. And the CRAFT resources are for small pockets of time, because there are already so many things to cover.
      We have a lot of materials in the pipeline. For example, one of the lessons that we have coming out is for history classes, asking, Is AI the new Industrial Revolution? It opens up questions like, What was the Industrial Revolution? What does “industrial revolution” mean? What is happening with AI, and is it similar to what was happening in the 1800s?
      Amid the very busy lives that teachers have, we’re working to make usable, grounded, manageable resources that are based on as much as we know about good pedagogy and practice. CRAFT is about bite-sized curricular resources for teachers, by teachers, with the support of people who spend a lot of time thinking about AI in education.
      The broad vision is that there will continually be things coming out, I would guesstimate on a monthly basis. What we aspire to is to build out a newsletter of the new things that have come out and to continue to grow because the topics are going to change and there are lots of new developments that teachers may want to capitalize on. For instance, we’re working on a lesson about why TikTok is starting to get banned because that’s a hot topic right now.
      Teachers are very creative individuals, which is why, for CRAFT, we should not be making a whole AI curriculum. We want to give teachers good raw material to work with and ways that AI can fit or combine easily with existing lessons to empower teachers and support their agency in adapting materials to fit their needs.

      The term 'artificial intelligence' is misleading in itself. AI is not 'intelligent' like humans are intelligent.


      What important conversations about AI literacy are we not yet having in education?

      One conversation goes back to this critic, developer, and user lens idea: It’s not clear how much is enough in terms of AI literacy. How deep down the rabbit hole do you need to go to do the things that you want to do? Does every teacher need to know that there’s training data involved in AI, or that there are all these named language models, or what large language models are, for example?
      A lot of people have talked about how AI has perpetuated systems of harm and oppression and how this continues to marginalize people, and that’s the most urgent thing for people to understand. And then you also have people who are saying that it is all well and good, but we aren’t doing enough teaching about how to use AI because it’s important for doing all different kinds of jobs. So, all these voices are jumping in and declaring, this is it. This is AI literacy. But we don’t know—we don’t know how much of each perspective is enough to get the outcomes we want.
      For now, what I would encourage educators to ask is, What do students need to know about AI to do the things I’m trying to teach? What do they need to know about AI specifically for my subject or for my grade? Those questions create a good boundary for the moment until we get standards in place, which will ideally synthesize best practices for AI use.

      What do you see on the horizon for AI in schools? What’s an emerging AI technology you’re excited about or intrigued by?

      So much has been focused on students, and I’m all about students, but a lot of survey research shows that teachers are heavier AI users than students. I think the most exciting situations are how to help teachers with the work that they’re doing, whether it’s lesson planning, synthesizing information, or streamlining routine communications.
      There’s also beginning to be more investment in real, day-to-day classroom challenges, like having multilingual classrooms or students with learning differences—How can AI help in creating more access to broaden the playing field for these students? I’m seeing a bit more movement in those directions, but I’m also trying to use whatever platforms I have to push things in that way.
      The other area I’ve been putting a lot of energy toward is creativity. How is AI a tool for creativity and creation? The conversation around AI has been so much about, How does this automate information delivery? How does it tutor you? How does it just make things fast? These questions almost yank out the human spirit of AI. AI is actually this incredible new paintbrush of possibility, so let’s play with that.
      You can imagine that kids can make video game simulations of the thing that they’re learning in history. Or they can make really cool multimodal presentations of the science idea that they’re learning in class. Or they could put some of their poetic writings into music. So, how can we make AI expressive, celebratory, and focused on possibilities?
      Editor’s note: This interview has been edited for length.

      Jessica Comola is an editor with Educational Leadership magazine.

      From our issue: Growing a Generation of Digital Problem Solvers