Episode 99: Sora Is Good Enough to Be Dangerous
January 12, 2026


About the Episode

Mallory Willsea sits down with Erin Fields, Marketing Director at Ologie (and former Enrollify team member), to explore the shifting role of AI video in higher education marketing. Together, they dissect the creative possibilities, strategic risks, and trust considerations emerging from tools like Sora, while tackling how marketers should use AI video as an ideation engine—not a shortcut to storytelling. This conversation offers a grounded, forward-looking take on how higher ed marketers can responsibly and creatively integrate AI video into their workflows.

Key Takeaways

  • AI video tools like Sora are better for ideation than execution in higher ed marketing—use them upstream to explore tone, narrative, and visual ideas quickly.
  • Overly detailed prompts lead to brittle, incoherent outputs; the best AI video results often stem from simple, focused inputs.
  • The real risk of AI video in higher ed isn't visual quality—it's trust erosion when audiences can’t tell what’s real and what’s generated.
  • Higher ed marketers must draw clear ethical lines: AI video should never stand in for real student experiences or simulate authenticity.
  • AI video unlocks faster thinking, not just faster production, shifting creative responsibility upstream to taste and judgment.
  • The Cadbury India campaign is a model of ethical AI video—transparently personalized, rooted in community impact, and strategically sound.
  • Sameness is a creeping threat—AI video is trained on existing visual patterns, and higher ed must resist the temptation to accept defaults.
  • Marketing’s job has shifted: it’s not about making more content faster; it’s about making fewer, better bets and knowing when to stop.

Episode Summary

What is AI video actually good for in higher ed marketing?

AI video has crossed a credibility threshold—it’s now capable of creating output that's “good enough” to prompt real marketing decisions. Mallory and Erin emphasize that AI video tools like Sora are most valuable during the ideation phase, not final production. They help teams explore different tones, story structures, and visual languages quickly, giving marketers more clarity before locking in a campaign direction. Erin shares that she’s used Sora to create playful, low-stakes content internally—not to represent real student experiences, but to test what the tool can do creatively. The takeaway: AI video is best when it's used for brainstorming, not broadcasting.

Why is “taste” now more important than execution?

With AI making execution fast and cheap, taste and judgment become the new marketing superpowers. Erin argues that good creative work has never been about controlling every detail—it’s about knowing when to intervene. Trying to over-direct an AI video with too many prompt details usually results in worse outcomes. Simpler prompts often yield better results, echoing a broader shift: marketers need to know when to stop generating and start deciding. In this AI era, the temptation to endlessly iterate is real—but productivity isn't the same as progress.

Where does AI video risk eroding trust?

As AI-generated visuals become more lifelike, audiences will become less forgiving of inauthentic content, especially in higher education where credibility is key. Mallory and Erin agree: the risk isn’t that the video “looks AI”—it’s that it might quietly simulate real student experiences. And that’s a hard line higher ed cannot cross. Marketing a manufactured moment as if it were authentic undermines trust. Instead, Erin recommends using AI video as a visual sketchpad—not a reality replacement. If AI is used to extend generosity or access, like Cadbury’s campaign that uplifted local businesses, people accept it. But when it’s used to simulate connection or manufacture meaning, that’s when trust erodes.

How can higher ed marketers use AI video responsibly?

Mallory and Erin recommend drawing firm ethical boundaries. AI video should never be used to fake classroom experiences, fabricate student stories, or flatten the real, messy, non-linear nature of student journeys. Instead, schools should leverage AI tools as collaborative ideation resources, especially in the early stages of campaign development. These tools can help bring strategists and creatives into the same conversation by visualizing ideas before resources are committed. Erin calls this the “study guide effect”—AI helping to translate strategy into visual possibilities without crossing ethical lines.

What does Cadbury’s campaign get right—and what can higher ed learn from it?

The Cadbury India campaign used AI to create over 100,000 personalized ads featuring Bollywood star Shah Rukh Khan promoting local businesses. It disclosed the use of AI clearly, never pretended the videos were real interactions, and used AI to amplify generosity, not simulate experience. Erin highlights this as a gold standard: AI stayed in the background, the strategy stayed upfront, and the campaign avoided false intimacy. For higher ed, this means that transparency isn’t optional—it’s a trust signal. Your audiences don’t need to understand the tech, but they do need to believe in the integrity of the story.

What’s the future of AI video in higher ed?

Both Mallory and Erin agree: the immediate opportunity isn’t publishing AI videos—it’s using them to get smarter, faster. Sora can help marketing teams test and visualize creative directions before the cameras roll, helping teams make more confident bets. However, Erin warns that sameness is a creeping risk—AI defaults can cause visual convergence, making it harder for institutions to stand out. That’s why creative taste, restraint, and intentionality matter more than ever. As Erin says, the boundary is simple: “AI can help you imagine what a story might look like, but humans must decide what’s worth showing.”

Connect With Our Host:

Mallory Willsea
https://www.linkedin.com/in/mallorywillsea/
https://twitter.com/mallorywillsea

Enrollify is produced by Element451, the next-generation AI student engagement platform helping institutions create meaningful and personalized interactions with students. Learn more at element451.com.

People in this episode

Host

Mallory Willsea is a strategist and consultant working in higher education marketing.

Interviewee

Erin Fields

Erin Fields is the Marketing Director at Ologie.

