| Name | Department/Unit |
|---|---|
| Chris Cornwell (Chair) | Economics & Ivester Institute for Analytics and Insights |
| Margaret Christ | Accounting |
| Susan Cohen | Management |
| Jerry Kane | Management Information Systems |
| Son Lam | Marketing |
| Jim Metcalf | Office of Information Technology |
| Nikhil Paradkar | Finance |
| Marc Ragin | Institute for Leadership Advancement |
The artificial intelligence revolution is no longer approaching—it has arrived. In the three years since ChatGPT’s release in November 2022, AI capabilities have advanced at a pace that has surprised even leading researchers. Today, nearly half of U.S. workers aged 18-64 report using generative AI, with 23% using it regularly at work.1 For business schools, this transformation presents both an urgent challenge and a generational opportunity.
This report presents the findings and recommendations of the Terry College Faculty AI Strategy Task Force, established by Interim Dean Santanu Chatterjee in August 2025. Our charge was clear: develop a comprehensive two-year strategic plan to position Terry College as a leader in responsible AI integration across research, instruction, and industry engagement.
The evidence is sobering. Recent research documents a 13% relative decline in employment for young workers (ages 22-25) in AI-exposed occupations compared to older workers—even after controlling for firm-level factors.2 The occupations most affected include software development, customer service, accounting, and administrative support—fields where many of our graduates begin their careers.
Based on extensive research, faculty surveys, industry consultations, and benchmarking against peer institutions, we propose a comprehensive strategy organized around three pillars:
1. Faculty Development: Empower faculty to become confident users, creators, and evaluators of AI technologies through structured professional development, peer learning communities, and research enablement.
2. Student Learning and Instructional Innovation: Ensure every Terry graduate possesses the ability to use, build, and critically assess AI tools, with AI competencies integrated across the curriculum and new experiential learning opportunities.
3. Industry and Ecosystem Engagement: Position Terry as a trusted AI partner to business, government, and alumni through strategic partnerships, applied research collaborations, and thought leadership.
Our November 2025 faculty survey (n=148) revealed both readiness and concern: faculty self-assess their AI literacy at 3.09/5 and their AI competence at 2.50/4, indicating significant room for growth and strong demand for support.
We propose a phased implementation over 24 months:
| Phase | Timeframe | Focus |
|---|---|---|
| Phase 1: Foundation | Months 1-6 | Establish governance, launch faculty development, pilot student modules |
| Phase 2: Scaling | Months 7-12 | Expand curriculum integration, formalize partnerships, deploy assessment innovations |
| Phase 3: Institutionalization | Months 13-24 | Embed AI competencies in requirements, establish sustainable funding, achieve recognition |
Total Estimated Annual Investment: $500,000-$600,000
The institutions that thrive in the AI era will be those that move deliberately but decisively. Terry College has the faculty expertise, institutional resources, and strategic vision to lead. This report provides the roadmap. The time to act is now.
The AI systems available today bear little resemblance to the chatbots of five years ago. Modern large language models (LLMs) demonstrate capabilities that were unexpected even by their creators: they can write sophisticated code, conduct nuanced analysis, generate creative content, and engage in complex reasoning tasks. GPT-4, released in March 2023, scored at the 90th percentile on a simulated bar exam. Subsequent models have continued this trajectory of rapid improvement.
The transformation has been swift and comprehensive. Consider the progression: In early 2022, AI systems could generate plausible-sounding text but struggled with logical consistency and factual accuracy. By late 2022, ChatGPT demonstrated the ability to explain complex concepts, debug code, and engage in extended reasoning. By 2024, AI systems could pass professional licensing exams, generate functional software applications from natural language descriptions, and produce research-quality analysis.
The AI Index 2025 documents this acceleration in concrete terms: coding benchmark performance jumped from 4.4% in 2023 to 71.7% in 2024.3 This is not incremental improvement—it represents a qualitative shift in what these systems can accomplish.
Yet AI capabilities remain uneven in ways that matter profoundly for how we integrate these tools into education and work. Ethan Mollick and colleagues have characterized this as the “jagged frontier”—AI performs at superhuman levels on some tasks while failing completely or subtly on others.4
This jaggedness manifests in surprising ways. An AI system might flawlessly analyze a complex financial model yet confidently produce fabricated citations when asked for references. It might generate elegant prose that conveys entirely incorrect information with the same confident tone as accurate content. There is no instruction manual for navigating this frontier—no clear boundary between what AI can and cannot do.
For education, this jagged frontier creates particular challenges. Students must learn not only how to use AI tools but when to use them—and crucially, when not to. Faculty must redesign assessments to remain meaningful in a world where AI can complete many traditional assignments. The skills that matter are shifting from execution to judgment, from production to verification.
Experts disagree about what comes next, and this disagreement matters profoundly for institutional strategy.
The “Normal Technology” Scenario: Some researchers, including Arvind Narayanan and Sayash Kapoor at Princeton, argue that AI will follow the pattern of previous general-purpose technologies like electricity and computing.5 Capabilities will improve steadily, but diffusion will be slowed by integration costs, regulation, and organizational learning. Economic impact will be measured in a few percentage points of additional productivity growth over decades.
The “Fast Takeoff” Scenario: Others argue that AI possesses unique properties that could enable much more rapid advancement.6 If AI systems can meaningfully contribute to AI research itself—improving algorithms, curating training data, identifying errors—feedback loops could drive capabilities forward faster than institutions can adapt. Under this scenario, we could see superhuman performance in key domains by the late 2020s.
What is clear is that both scenarios demand significant institutional adaptation—they differ in how much time we have, not whether change is necessary.
The employment effects of AI are no longer theoretical projections. A rigorous analysis of over 5 million payroll records from ADP, conducted by Brynjolfsson, Chandar, and Chen at Stanford’s Digital Economy Lab, provides comprehensive evidence of how AI is reshaping labor markets.7
The findings are stark:
Entry-level workers bear the brunt: Employment for workers aged 22-25 in AI-exposed occupations has declined 13% relative to older workers in those same occupations.
Aggregate statistics mask distributional effects: Total employment continues growing, but employment for young workers in AI-exposed jobs declined 6% from late 2022 to July 2025, while employment for older workers in the same occupations grew 6-9%.
Adjustments occur through employment, not wages: Workers are losing jobs rather than accepting lower wages, suggesting AI functions as a substitute for entry-level labor.
The occupations most affected—software developers, customer service representatives, accountants, and administrative assistants—represent precisely the fields where many business school graduates begin their careers.
Yet the story is not simply one of displacement. A landmark field experiment at Procter & Gamble reveals AI’s potential as a powerful collaborative partner.8
Key findings: individuals working with AI matched the performance of two-person teams working without it; AI helped specialists produce balanced solutions outside their functional expertise, bridging silos between technical and commercial roles; and participants working with AI reported more positive emotional experiences.
If AI can produce competent work from the start, what experiences develop expertise? Traditional professional development followed a clear path: juniors learned by doing progressively more complex work under senior supervision. AI disrupts this path fundamentally.
This is not an argument against AI adoption—the productivity benefits are too substantial to forgo, and students must learn to work with AI. Rather, it is an argument for intentional pedagogy: we must design learning experiences that develop genuine expertise even when AI is available.
At UGA, academic honesty cases nearly doubled from academic year 2024 to 2025, with most of the increase attributed to AI-related violations.9 But the challenge goes beyond policing. Traditional assessments were designed for a world where completing them required the knowledge and skills we aimed to teach. When AI can produce competent responses, the assessment no longer measures what it was designed to measure.
Research in Nature Human Behaviour estimates that 22.5% of computer science paper abstracts showed evidence of LLM modification by September 2024.10 More troubling: after LLM adoption, complex writing in LLM-assisted papers correlates with worse outcomes—suggesting that sophisticated-sounding text may mask weak underlying work.
Surface-level quality signals—polished prose, comprehensive structure, sophisticated vocabulary—no longer reliably indicate mastery. Faculty need new frameworks for assessing genuine understanding.
A comprehensive study of 26 leading business schools reveals that top institutions have moved well beyond pilot programs to systematic AI integration.11
Six major themes characterize leading practice: movement from isolated pilots to coordinated portfolios, tiered AI literacy pathways, heavy investment in faculty development, ethics embedded across the curriculum, strategic industry partnerships, and an emphasis on assessment innovation. Representative initiatives include:
| Institution | Key AI Initiative |
|---|---|
| CMU Tepper | Integrated AI ecosystem; Block Center for ethics |
| Harvard Business School | Required ‘Data Science & AI for Leaders’ for all MBAs |
| Indiana Kelley | Comprehensive AI Playbook; five principles framework |
| Penn State Smeal | Provost-endorsed faculty AI initiatives |
| UT Austin McCombs | Weekly faculty workshops; structured development |
| UMD Smith | AI Initiative in finance; textbook integration |
Terry College has significant strengths: the Ivester Institute for Analytics and Insights, faculty already experimenting with AI, and a task force structure that ensures coordination. However, our survey also reveals gaps relative to these leading institutions.
The competitive landscape leaves little room for complacency.
Our strategy is organized around three interconnected pillars. These are mutually reinforcing: faculty who are confident with AI can better support students; students with strong AI skills are more attractive to industry partners; industry engagement generates insights that inform faculty development and curriculum design.
Faculty are the linchpin of any academic strategy. Our survey revealed faculty who are engaged but seeking support.
F1. Establish a Tiered Faculty Development Program
Foundational Track (All Faculty): Introduction to AI capabilities, basic prompting, policies, integrity considerations. Self-paced modules, available within 3 months.
Applied Track (Faculty Ready for Integration): Advanced prompting, AI-enhanced assignment design, assessment strategies, department-specific applications. Department-based workshops beginning Month 4.
Advanced Track (AI Champions): AI agent development, custom tools, research methodology. Cohort-based, beginning Year 2.
F2. Build Department-Specific Exemplar Libraries
Each department will designate 1-2 faculty members to curate a library including sample prompts, assignment templates, rubrics, and case studies.
F3. Establish Peer Learning Cohorts
Semester-based cohorts of 6-8 faculty from mixed departments meeting monthly to share experiences and troubleshoot.
F4. Create Faculty Showcase Series
Quarterly presentations mixing successes and “productive failures,” recorded for asynchronous viewing.
F5. Enable Research Applications
Map existing research, propose seed funding, clarify IRB protocols, evaluate infrastructure needs, negotiate enterprise licensing.
| Metric | Baseline | Year 1 | Year 2 |
|---|---|---|---|
| Faculty completing foundational training | 0% | 75% | 95% |
| Faculty self-reported AI literacy | 3.09/5 | 3.5/5 | 4.0/5 |
| Faculty self-reported AI competence | 2.50/4 | 3.0/4 | 3.5/4 |
| Departments with exemplar libraries | 0% | 100% | 100% |
| Faculty participating in cohorts | 0 | 30 | 60 |
Nearly 75% of college students have used ChatGPT.12 Education must do more than help students use AI—it must develop judgment to use it wisely, skills to work alongside it, and adaptability to navigate ongoing change.
S1. Establish AI Literacy Requirements
S2. Develop Terry AI Certificate
Timeline: Design in Year 1; pilot in Year 2
S3. Launch Annual AI Build-a-thon
Teams tackle real business challenges using AI tools with corporate sponsors providing problems and mentorship. First event Spring Year 2.
S4. Redesign Assessment for the AI Era
S5. Update Academic Integrity Policies
College-level framework with department flexibility, clear definitions, disclosure requirements, graduated response protocols, student education.
| Metric | Baseline | Year 1 | Year 2 |
|---|---|---|---|
| Students completing AI literacy requirement | 0% | 40% of new students | 100% of new students |
| Courses with explicit AI policies | Unknown | 80% | 100% |
| AI Certificate enrollment | N/A | 50 (pilot) | 150 |
| AI Build-a-thon participation | 0 | 100 | 200 |
Industry is adopting AI faster than most academic institutions, and our consultations revealed consistent themes.
I1. Establish Industry Advisory Panel on AI
Leverage TDAC and expand with AI-focused advisors for quarterly briefings, curriculum input, guest speakers, and early warning on skill requirements.
I2. Develop Corporate Partnership Program
I3. Pursue External Funding
Federal/state programs (NSF, Georgia initiatives, DOE) and corporate/foundation sources (Microsoft, Google, Amazon education initiatives).
I4. Build Thought Leadership Platform
Support faculty writing for practitioner audiences, host annual symposium, develop policy briefings, cultivate media relationships.
I5. Leverage Alumni Network
Survey alumni on AI adoption, recruit mentors, engage in executive education, connect students with alumni in AI implementation.
| Metric | Baseline | Year 1 | Year 2 |
|---|---|---|---|
| Corporate partners | 0 | 5 | 15 |
| Industry-sponsored student projects | Unknown | 10 | 25 |
| External funding for AI initiatives | $0 | $100K | $300K |
| Alumni engaged in AI initiatives | 0 | 50 | 150 |
Phase 1: Foundation (Months 1-6)
Governance: Establish Implementation Committee, designate AI Initiative Director, secure funding, negotiate licenses
Faculty Development: Launch foundational track, recruit exemplar curators, form first cohorts
Student Learning: Finalize AI literacy module, update integrity guidelines, begin Certificate design
Industry Engagement: Convene Advisory Panel, approach founding partners, submit funding applications
Phase 2: Scaling (Months 7-12)
Faculty Development: Full foundational rollout, launch applied track, expand cohorts
Student Learning: Require AI literacy for new students, pilot Certificate, deploy assessment strategies
Industry Engagement: Formalize partnerships, launch sponsored projects, host thought leadership event
Phase 3: Institutionalization (Months 13-24)
Faculty Development: Launch advanced track, integrate into annual reviews, establish ongoing funding
Student Learning: Full Certificate launch, host first Build-a-thon, integrate competencies into outcomes
Industry Engagement: Expand partnerships, scale projects, pursue major funding, achieve recognition
| Category | Annual Cost |
|---|---|
| Personnel (AI Director, Coordinator, Designers) | ~$150K |
| Faculty Support (releases, stipends, grants, travel, seed funding) | $280K |
| Technology (licenses, computing, platforms) | $135K |
| Events & Marketing (Build-a-thon, symposium, showcases) | $85K |
Total Estimated Annual Investment: $500,000-$600,000
This should be viewed against the costs of inaction: graduates less prepared, faculty struggling without support, and competitive position eroding.
AI Strategy Implementation Committee
Responsibilities: Oversee implementation, allocate resources, monitor metrics, adapt strategy, report quarterly to Dean
Recommendations: Update annual review guidelines; provide guidance to department heads; establish recognition programs; avoid mandates that penalize faculty still developing capabilities.
| Risk | Likelihood | Impact | Mitigation |
|---|---|---|---|
| Faculty resistance | Medium | High | Peer-led development; demonstrate value |
| Rapid tech change | High | Medium | Focus on principles not specific tools |
| Insufficient resources | Medium | High | Phased implementation; external funding |
| Academic integrity challenges | High | High | Proactive policy; assessment innovation |
| Competitive catch-up | Medium | Medium | Move quickly; distinctive positioning |
The AI transformation of business and business education is underway. Entry-level employment in AI-exposed occupations is declining. Traditional assessments are losing validity. Leading business schools are moving aggressively to integrate AI across their programs.
Terry College has the faculty expertise, institutional resources, and strategic position to respond effectively. This report provides a comprehensive framework: developing faculty capabilities, preparing students for AI-transformed workplaces, and engaging industry as a partner.
The recommendations are ambitious but achievable. They require investment, but the costs of inaction—measured in graduate preparedness, faculty satisfaction, and competitive position—are higher. They require change, but change in service of Terry’s enduring mission: preparing students to lead in business and society.
The task force respectfully submits these recommendations to Dean Chatterjee and looks forward to supporting implementation.
| Date | Activity |
|---|---|
| August 26, 2025 | Charge meeting; working groups established |
| September 30, 2025 | Second meeting; initial group reports |
| October-November 2025 | Faculty survey conducted (n=148) |
| November 4, 2025 | Third meeting; survey preliminary results |
| November 17, 2025 | Industry focus groups |
| December 2, 2025 | Fourth meeting |
| January 7, 2026 | Fifth meeting; final group reports |
| January 2026 | Report drafting and review |
Working Groups: Faculty Development; Student Learning and Instructional Innovation; Industry and Ecosystem Engagement
| Institution | Key AI Initiatives |
|---|---|
| CMU Tepper | Integrated AI ecosystem; Block Center for ethics |
| Harvard Business School | Required ‘Data Science & AI for Leaders’ for all MBAs |
| Stanford GSB | AI applications across curriculum; faculty development |
| Indiana Kelley | Comprehensive AI Playbook; five principles framework |
| Penn State Smeal | Provost-endorsed faculty AI initiatives |
| UT Austin McCombs | Weekly faculty workshops; structured development |
| UW-Madison | AI symposiums; marketing AI integration |
| UMD Smith | AI Initiative in finance; textbook integration |
| Rutgers | ‘AI in Accounting’ certificates |
Common Themes: Move from pilots to coordinated portfolios; tiered AI literacy pathways; heavy faculty development investment; ethics embedded across curriculum; strategic industry partnerships; assessment innovation emphasis.
AI Index Report 2025, Stanford Institute for Human-Centered AI. Available at: https://aiindex.stanford.edu/
Brynjolfsson, E., Chandar, D., & Chen, W. (2025). “Canaries in the Coal Mine: Early Evidence of AI’s Impact on the Labor Market.” Working paper, Stanford Digital Economy Lab. Available at: https://digitaleconomy.stanford.edu/publications/canaries-in-the-coal-mine/
AI Index Report 2025, Stanford Institute for Human-Centered AI.
Dell’Acqua, F., McFowland, E., Mollick, E., et al. (2023). “Navigating the Jagged Technological Frontier: Field Experimental Evidence of the Effects of AI on Knowledge Worker Productivity and Quality.” Harvard Business School Working Paper 24-013. Available at: https://www.hbs.edu/faculty/Pages/item.aspx?num=64700
Narayanan, A., & Kapoor, S. (2024). AI Snake Oil: What Artificial Intelligence Can Do, What It Can’t, and How to Tell the Difference. Princeton University Press.
“The AI Future of Leadership and Management.” Presentation, Orlando, 2025.
Brynjolfsson, Chandar, & Chen (2025), op. cit.
Dell’Acqua, F., et al. (2025). “The Cybernetic Teammate.” Harvard Business School Working Paper. Available at: https://www.hbs.edu/faculty/Pages/item.aspx?num=67197
UGA Office of Academic Honesty data, as reported to task force, November 2025.
Liang, W., et al. (2025). “Large Language Model Use in Scientific Publishing.” Nature Human Behaviour.
“The State of AI in Business Education.” Inspire Higher Ed, October 2025.
“Student Survey Summary 2025.” Journal of College Student Development, May/June 2025.