About the Episode
Scott Cline sits down with Emily Thayer Owens, a college access strategist, counselor, and AI ethics advocate. With nearly two decades of experience across admissions, policy, and student support, Emily shares a refreshingly grounded perspective on how AI is already transforming the college admissions landscape—and what higher ed leaders are completely missing in this moment. From digital divides to transparency gaps, this episode unpacks what it takes to build trust and adapt strategically in the age of intelligent agents.
Key Takeaways
- AI is already embedded in the college admissions journey—from students using ChatGPT to platforms like SCOIR quietly pushing AI features.
- The digital divide is widening: Access to AI tools and training varies drastically between schools and communities, compounding existing equity issues.
- High school counselors and students want transparency: Institutions must disclose when and how AI is used in admissions and decision-making.
- Onboarding gaps are real: Many students arrive on campus without baseline tech fluency—assumptions that “digital natives” know everything are deeply flawed.
- Sustainability and social justice matter: Institutions that adopt AI must align their usage with their stated values—or risk losing credibility with Gen Z.
- AI policies require training, not just enforcement: Without proactive education, policy rollouts can inadvertently punish or alienate students.
- Leaders need to learn AI hands-on: Decision-makers who don’t understand the tech they’re investing in are setting up their campuses for failure.
Episode Summary
How are students and counselors actually using AI right now?
Emily breaks down how AI usage is already widespread—even if unacknowledged. Students are relying on everything from ChatGPT to embedded tools in platforms like SCOIR without always realizing it’s AI. Counselors, too, are piecing together their own tech stacks, building GPTs or leveraging Midjourney or Canva for recruitment visuals and presentations. But the kicker? Access to these tools—and to basic technology like reliable Wi-Fi or computers—is not equal across districts.
This means college admissions teams can’t afford to make blanket assumptions about tech fluency. What's accessible in one school may be completely out of reach in another. Institutions that recognize and address this digital divide in their recruitment practices will build stronger community ties—and trust.
What’s the biggest blind spot in higher ed’s AI onboarding strategy?
Emily points to a deceptively simple issue: email. Many students don’t know how to use basic email features—like forwarding—because the interface has changed, and no one’s teaching them. This micro example represents a macro problem: institutions often assume digital fluency based on age, but that assumption is dangerous.
When it comes to AI, this leads to vague policies that don’t account for lived student experience. If campuses implement AI usage rules but don’t pair them with hands-on learning or tech literacy resources, they’re setting students (and staff) up for confusion and inequity. Onboarding needs to be reimagined—starting with the fundamentals.
Do high school counselors trust colleges to use AI ethically in admissions?
In short: no. Emily is blunt about the skepticism on the high school side. Counselors are deep in conversations about AI’s ethics, transparency, and data privacy, and they expect the same from higher ed. If colleges want to maintain (or rebuild) trust with high school counselors, they need to show how AI aligns with institutional values—especially when those values include environmental stewardship or social justice.
This means clearly articulating how AI is being used, how student data is being protected, and what the institution will do in the event of a data breach. Bonus points for colleges that go a step further and demonstrate how they’re offsetting environmental impact or using AI to advance equitable outcomes.
Is AI becoming a differentiator in student choice?
Absolutely. Just as students weigh factors like size, location, and majors, AI usage is emerging as part of the value proposition. Emily suggests we’re heading toward a future where institutions position themselves along an AI spectrum—from fully analog, human-centered environments to highly tech-integrated, AI-forward campuses.
Some colleges are already experimenting, from offering AI-integrated coursework to promoting paper-only classrooms that emphasize human debate and critical thinking. Institutions that define their stance and communicate it clearly will stand out—not just because of what they offer, but because of why they offer it.
What does higher ed get wrong about AI and Gen Z engagement?
Gen Z expects immediacy. When they have a problem, they want it solved now. They're used to digital tools, but that doesn't mean they want everything automated. If a school’s AI-powered chat or mental health resource feels impersonal or slow, it risks alienating students rather than engaging them.
Emily also cautions that digital overload is real. Students are navigating school-issued devices, push notifications, and anxiety-inducing headlines—all through their phones. Colleges must be strategic about tech touchpoints: being accessible without becoming background noise.
What do high school counselors need from admissions teams right now?
Transparency, again, tops the list. As new AI regulations emerge (like Colorado’s AI Act or EU data rules), colleges will be legally required to disclose when AI influences admissions decisions. But Emily encourages schools not to wait for legislation: proactively share how AI is used, how bias is monitored, and what recourse students have if things go wrong.
Simplification also matters. The admissions process is overwhelming enough—FAFSA, essays, applications, AI-powered platforms. Institutions that streamline this experience (without sacrificing support) will stand out as more student-centric and trustworthy.
What worries Emily most—and what gives her hope?
Her biggest concern: the misalignment between how fast AI is evolving and how slowly educational institutions are responding. She sees a critical need for colleges to get to the table where tech policy is being shaped—and to advocate for students as innovation and regulation collide.
But Emily remains hopeful. As a mom and an educator, she believes in a future where students are equipped, not exploited. Her call to action for institutional leaders? Learn the tech. Ask hard questions. Partner with counselors. And be bold enough to admit when you don’t have the answers—but care enough to go find them.
Register for the AI Engage Summit — Oct 15–16, 2025
👉 Join us online from 12–4 PM ET each day for two afternoons of AI insights and training. Free to attend with on-demand access included. Register now.


