Task Force Members

| Name | Department/Unit |
|---|---|
| Chris Cornwell (Chair) | Economics & Ivester Institute for Analytics and Insights |
| Margaret Christ | Accounting |
| Susan Cohen | Management |
| Jerry Kane | Management Information Systems |
| Son Lam | Marketing |
| Jim Metcalf | Office of Information Technology |
| Nikhil Paradkar | Finance |
| Marc Ragin | Institute for Leadership Advancement |

1 Executive Summary

The artificial intelligence revolution is no longer approaching—it has arrived. In the three years since ChatGPT’s release in November 2022, AI capabilities have advanced at a pace that has surprised even leading researchers. Today, nearly half of U.S. workers aged 18-64 report using generative AI, with 23% using it regularly at work.1 For business schools, this transformation presents both an urgent challenge and a generational opportunity.

This report presents the findings and recommendations of the Terry College Faculty AI Strategy Task Force, established by Interim Dean Santanu Chatterjee in August 2025. Our charge was clear: develop a comprehensive two-year strategic plan to position Terry College as a leader in responsible AI integration across research, instruction, and industry engagement.

1.1 The Stakes Are High

The evidence is sobering. Recent research documents a 13% relative decline in employment for young workers (ages 22-25) in AI-exposed occupations compared to older workers—even after controlling for firm-level factors.2 The occupations most affected include software development, customer service, accounting, and administrative support—fields where many of our graduates begin their careers.

1.2 Our Strategic Response

Based on extensive research, faculty surveys, industry consultations, and benchmarking against peer institutions, we propose a comprehensive strategy organized around three pillars:

1. Faculty Development: Empower faculty to become confident users, creators, and evaluators of AI technologies through structured professional development, peer learning communities, and research enablement.

2. Student Learning and Instructional Innovation: Ensure every Terry graduate possesses the ability to use, build, and critically assess AI tools, with AI competencies integrated across the curriculum and new experiential learning opportunities.

3. Industry and Ecosystem Engagement: Position Terry as a trusted AI partner to business, government, and alumni through strategic partnerships, applied research collaborations, and thought leadership.

1.3 Key Findings from Faculty Survey

Our November 2025 faculty survey (n=148) revealed both readiness and concern:

Faculty self-assess their AI literacy at 3.09/5 and competence at 2.50/4, indicating significant room for growth—and strong demand for support.

1.4 Implementation Framework

We propose a phased implementation over 24 months:

| Phase | Timeframe | Focus |
|---|---|---|
| Phase 1: Foundation | Months 1-6 | Establish governance, launch faculty development, pilot student modules |
| Phase 2: Scaling | Months 7-12 | Expand curriculum integration, formalize partnerships, deploy assessment innovations |
| Phase 3: Institutionalization | Months 13-24 | Embed AI competencies in requirements, establish sustainable funding, achieve recognition |

Total Estimated Annual Investment: $500,000-$600,000

The institutions that thrive in the AI era will be those that move deliberately but decisively. Terry College has the faculty expertise, institutional resources, and strategic vision to lead. This report provides the roadmap. The time to act is now.


2 Part I: The AI Landscape in January 2026

2.1 The Current State of AI Capabilities

2.1.1 From Chatbot to Cognitive Partner

The AI systems available today bear little resemblance to the chatbots of five years ago. Modern large language models (LLMs) demonstrate capabilities that were unexpected even by their creators: they can write sophisticated code, conduct nuanced analysis, generate creative content, and engage in complex reasoning tasks. GPT-4, released in March 2023, passed a simulated bar exam with a score around the 90th percentile of test takers. Subsequent models have continued this trajectory of rapid improvement.

The transformation has been swift and comprehensive. Consider the progression: In early 2022, AI systems could generate plausible-sounding text but struggled with logical consistency and factual accuracy. By late 2022, ChatGPT demonstrated the ability to explain complex concepts, debug code, and engage in extended reasoning. By 2024, AI systems could pass professional licensing exams, generate functional software applications from natural language descriptions, and produce research-quality analysis.

The AI Index 2025 documents this acceleration in concrete terms: performance on the SWE-bench coding benchmark jumped from 4.4% of problems solved in 2023 to 71.7% in 2024.3 This is not incremental improvement—it represents a qualitative shift in what these systems can accomplish.

2.1.2 The “Jagged Frontier”

Yet AI capabilities remain uneven in ways that matter profoundly for how we integrate these tools into education and work. Fabrizio Dell’Acqua, Ethan Mollick, and colleagues have characterized this in a Harvard Business School working paper as the “jagged frontier”: AI performs at superhuman levels on some tasks while failing completely or subtly on others.4

This jaggedness manifests in surprising ways. An AI system might flawlessly analyze a complex financial model yet confidently produce fabricated citations when asked for references. It might generate elegant prose that conveys entirely incorrect information with the same confident tone as accurate content. There is no instruction manual for navigating this frontier—no clear boundary between what AI can and cannot do.

For education, this jagged frontier creates particular challenges. Students must learn not only how to use AI tools but when to use them—and crucially, when not to. Faculty must redesign assessments to remain meaningful in a world where AI can complete many traditional assignments. The skills that matter are shifting from execution to judgment, from production to verification.

2.1.3 Two Possible Futures

Experts disagree about what comes next, and this disagreement matters profoundly for institutional strategy.

The “Normal Technology” Scenario: Some researchers, including Arvind Narayanan and Sayash Kapoor at Princeton, argue that AI will follow the pattern of previous general-purpose technologies like electricity and computing.5 Capabilities will improve steadily, but diffusion will be slowed by integration costs, regulation, and organizational learning. Economic impact will be measured in a few percentage points of additional productivity growth over decades.

The “Fast Takeoff” Scenario: Others argue that AI possesses unique properties that could enable much more rapid advancement.6 If AI systems can meaningfully contribute to AI research itself—improving algorithms, curating training data, identifying errors—feedback loops could drive capabilities forward faster than institutions can adapt. Under this scenario, we could see superhuman performance in key domains by the late 2020s.

What is clear is that both scenarios demand significant institutional adaptation—they differ in how much time we have, not whether change is necessary.

2.2 AI’s Impact on Work and the Labor Market

2.2.1 Early Evidence of Labor Market Disruption

The employment effects of AI are no longer theoretical projections. A rigorous analysis of over 5 million payroll records from ADP, conducted by Brynjolfsson, Chandar, and Chen at Stanford’s Digital Economy Lab, provides comprehensive evidence of how AI is reshaping labor markets.7

The findings are stark:

  1. Entry-level workers bear the brunt: Employment for workers aged 22-25 in AI-exposed occupations has declined 13% relative to older workers in those same occupations.

  2. Aggregate statistics mask distributional effects: Total employment continues growing, but employment of young workers in AI-exposed jobs declined 6% from late 2022 to July 2025, while employment of older workers in those jobs grew by 6-9%.

  3. Adjustments occur through employment, not wages: Workers are losing jobs rather than accepting lower wages, suggesting AI functions as a substitute for entry-level labor.
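As a rough back-of-the-envelope check using the report’s own figures (and taking 7.5% as the midpoint of the reported 6-9% growth for older workers), the second finding is consistent with the first:

\[
\frac{1 - 0.06}{1 + 0.075} - 1 \approx -0.13,
\]

that is, roughly a 13% decline in young-worker employment relative to older workers in the same occupations, in line with the headline estimate. This is an illustrative calculation only, not the study’s estimation method.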

The occupations most affected—software developers, customer service representatives, accountants, and administrative assistants—represent precisely the fields where many business school graduates begin their careers.

2.2.2 AI as Collaborator: The Cybernetic Teammate

Yet the story is not simply one of displacement. A landmark field experiment at Procter & Gamble reveals AI’s potential as a powerful collaborative partner.8

Key findings:

  • Performance equivalence: Individuals with AI matched the performance of two-person teams with AI, and both exceeded teams without AI
  • Expertise democratization: Non-experts with AI achieved performance comparable to expert teams
  • Top-tier performance: Teams with AI were three times more likely to produce top 10% solutions
  • Emotional benefits: AI users reported more positive emotions, approaching the levels associated with human teamwork

2.2.3 The Training Paradox

If AI can produce competent work from the start, what experiences develop expertise? Traditional professional development followed a clear path: juniors learned by doing progressively more complex work under senior supervision. AI disrupts this path fundamentally.

This is not an argument against AI adoption—the productivity benefits are too substantial to forgo, and students must learn to work with AI. Rather, it is an argument for intentional pedagogy: we must design learning experiences that develop genuine expertise even when AI is available.

2.3 Challenges for Higher Education

2.3.1 The Academic Integrity Crisis

At UGA, academic honesty cases nearly doubled from academic year 2024 to 2025, with most of the increase attributed to AI-related violations.9 But the challenge goes beyond policing. Traditional assessments were designed for a world where completing them required the knowledge and skills we aimed to teach. When AI can produce competent responses, the assessment no longer measures what it was designed to measure.

2.3.2 The Quality Signal Problem

Research in Nature Human Behaviour estimates that 22.5% of computer science paper abstracts showed evidence of LLM modification by September 2024.10 More troubling: after LLM adoption, complex writing in LLM-assisted papers correlates with worse outcomes—suggesting that sophisticated-sounding text may mask weak underlying work.

Surface-level quality signals—polished prose, comprehensive structure, sophisticated vocabulary—no longer reliably indicate mastery. Faculty need new frameworks for assessing genuine understanding.

2.4 The Competitive Landscape in Business Education

A comprehensive study of 26 leading business schools reveals that top institutions have moved well beyond pilot programs to systematic AI integration.11

Six major themes characterize leading practice:

  1. From Pilots to AI Ecosystems: Coordinated portfolios across programs
  2. Democratizing AI Literacy: Tiered pathways for all students
  3. Domain-Specific Applications: Tailored approaches by discipline
  4. Faculty Development as Critical Success Factor: Heavy investment in support
  5. Responsible AI Integration: Ethics embedded across curriculum
  6. Strategic Partnerships: Leveraging industry and alumni networks

Selected Peer Institution AI Initiatives

| Institution | Key AI Initiative |
|---|---|
| CMU Tepper | Integrated AI ecosystem; Block Center for ethics |
| Harvard Business School | Required ‘Data Science & AI for Leaders’ for all MBAs |
| Indiana Kelley | Comprehensive AI Playbook; five principles framework |
| Penn State Smeal | Provost-endorsed faculty AI initiatives |
| UT Austin McCombs | Weekly faculty workshops; structured development |
| UMD Smith | AI Initiative in finance; textbook integration |

2.4.1 Where Terry Stands

Terry College has significant strengths: the Ivester Institute for Analytics and Insights, faculty experimenting with AI, and the task force structure ensuring coordination. However, our survey reveals gaps:

  • Average AI literacy of 3.09/5 and competence of 2.50/4 suggest many faculty do not feel confident
  • While 76% use ChatGPT, adoption of sophisticated tools remains limited
  • Faculty demand practical training, discipline-specific exemplars, and assessment guidance
  • Research use of AI lags instructional use

The competitive landscape leaves little room for complacency.


3 Part II: Strategic Framework and Recommendations

Our strategy is organized around three interconnected pillars. These are mutually reinforcing: faculty who are confident with AI can better support students; students with strong AI skills are more attractive to industry partners; industry engagement generates insights that inform faculty development and curriculum design.

3.1 Pillar 1: Faculty Development

3.1.1 The Challenge

Faculty are the linchpin of any academic strategy. Our survey revealed faculty who are engaged with AI but seeking structured support.

3.1.2 Recommendations

F1. Establish a Tiered Faculty Development Program

  • Foundational Track (All Faculty): Introduction to AI capabilities, basic prompting, policies, integrity considerations. Self-paced modules, available within 3 months.

  • Applied Track (Faculty Ready for Integration): Advanced prompting, AI-enhanced assignment design, assessment strategies, department-specific applications. Department-based workshops beginning Month 4.

  • Advanced Track (AI Champions): AI agent development, custom tools, research methodology. Cohort-based, beginning Year 2.

F2. Build Department-Specific Exemplar Libraries

Each department should designate 1-2 faculty members to curate a library of sample prompts, assignment templates, rubrics, and case studies.

F3. Establish Peer Learning Cohorts

Semester-based cohorts of 6-8 faculty from mixed departments meeting monthly to share experiences and troubleshoot.

F4. Create Faculty Showcase Series

Quarterly presentations mixing successes and “productive failures,” recorded for asynchronous viewing.

F5. Enable Research Applications

Map existing research, propose seed funding, clarify IRB protocols, evaluate infrastructure needs, negotiate enterprise licensing.

3.1.3 Success Metrics

| Metric | Baseline | Year 1 | Year 2 |
|---|---|---|---|
| Faculty completing foundational training | 0% | 75% | 95% |
| Faculty self-reported AI literacy | 3.09/5 | 3.5/5 | 4.0/5 |
| Faculty self-reported AI competence | 2.50/4 | 3.0/4 | 3.5/4 |
| Departments with exemplar libraries | 0% | 100% | 100% |
| Faculty participating in cohorts | 0 | 30 | 60 |

3.2 Pillar 2: Student Learning and Instructional Innovation

3.2.1 The Challenge

Nearly 75% of college students have used ChatGPT.12 Education must do more than help students use AI—it must develop judgment to use it wisely, skills to work alongside it, and adaptability to navigate ongoing change.

3.2.2 Recommendations

S1. Establish AI Literacy Requirements

  • Undergraduate: Required AI literacy module in core curriculum; AI applications in major-specific courses; ethics embedded across programs
  • Graduate: AI module in orientation; program-specific applications in core courses; executive education updated

S2. Develop Terry AI Certificate

  • Core requirement: AI Foundations seminar
  • Electives: AI-enhanced courses across departments
  • Capstone: Experiential project with AI application
  • Portfolio: Documented AI projects and competencies

Timeline: Design in Year 1; pilot in Year 2

S3. Launch Annual AI Build-a-thon

Teams tackle real business challenges using AI tools, with corporate sponsors providing problems and mentorship. First event in Spring of Year 2.

S4. Redesign Assessment for the AI Era

  • Immediate: College-wide guidelines, assessment strategy guides, rubrics for AI assistance
  • Near-Term: Expand in-class assessment, increase oral examination, design assignments requiring AI use with reflection
  • Longer-Term: Pilot AI-resistant formats, investigate process-based evaluation

S5. Update Academic Integrity Policies

College-level framework with department flexibility, clear definitions, disclosure requirements, graduated response protocols, student education.

3.2.3 Success Metrics

| Metric | Baseline | Year 1 | Year 2 |
|---|---|---|---|
| Students completing AI literacy requirement | 0% | 40% of new students | 100% of new students |
| Courses with explicit AI policies | Unknown | 80% | 100% |
| AI Certificate enrollment | N/A | 50 (pilot) | 150 |
| AI Build-a-thon participation | 0 | 100 | 200 |

3.3 Pillar 3: Industry and Ecosystem Engagement

3.3.1 The Challenge

Industry is adopting AI faster than most academic institutions. Our consultations revealed consistent themes:

  • Employers increasingly expect AI fluency from new hires
  • Managerial judgment is becoming more valuable
  • Companies investing heavily in upskilling want academic partners
  • “Our #1 strategy is upskilling” appeared repeatedly

3.3.2 Recommendations

I1. Establish Industry Advisory Panel on AI

Leverage TDAC and expand with AI-focused advisors for quarterly briefings, curriculum input, guest speakers, and early warning on skill requirements.

I2. Develop Corporate Partnership Program

  • Partner Benefits: Early access to graduates, sponsored projects, executive education priority, research collaboration
  • Terry Benefits: Real-world projects, practitioner input, financial support, placement pipeline
  • Target: 5 founding partners Year 1; 15 partners by Year 2

I3. Pursue External Funding

Federal/state programs (NSF, Georgia initiatives, DOE) and corporate/foundation sources (Microsoft, Google, Amazon education initiatives).

I4. Build Thought Leadership Platform

Support faculty writing for practitioner audiences, host annual symposium, develop policy briefings, cultivate media relationships.

I5. Leverage Alumni Network

Survey alumni on AI adoption, recruit mentors, engage in executive education, connect students with alumni in AI implementation.

3.3.3 Success Metrics

| Metric | Baseline | Year 1 | Year 2 |
|---|---|---|---|
| Corporate partners | 0 | 5 | 15 |
| Industry-sponsored student projects | Unknown | 10 | 25 |
| External funding for AI initiatives | $0 | $100K | $300K |
| Alumni engaged in AI initiatives | 0 | 50 | 150 |

4 Part III: Implementation Roadmap

4.1 Phased Implementation

4.1.1 Phase 1: Foundation (Months 1-6)

Governance: Establish Implementation Committee, designate AI Initiative Director, secure funding, negotiate licenses

Faculty Development: Launch foundational track, recruit exemplar curators, form first cohorts

Student Learning: Finalize AI literacy module, update integrity guidelines, begin Certificate design

Industry Engagement: Convene Advisory Panel, approach founding partners, submit funding applications

4.1.2 Phase 2: Scaling (Months 7-12)

Faculty Development: Full foundational rollout, launch applied track, expand cohorts

Student Learning: Require AI literacy for new students, pilot Certificate, deploy assessment strategies

Industry Engagement: Formalize partnerships, launch sponsored projects, host thought leadership event

4.1.3 Phase 3: Institutionalization (Months 13-24)

Faculty Development: Launch advanced track, integrate into annual reviews, establish ongoing funding

Student Learning: Full Certificate launch, host first Build-a-thon, integrate competencies into outcomes

Industry Engagement: Expand partnerships, scale projects, pursue major funding, achieve recognition

4.2 Resource Requirements

| Category | Annual Cost |
|---|---|
| Personnel (AI Director, Coordinator, Designers) | ~$150K |
| Faculty Support (releases, stipends, grants, travel, seed funding) | $280K |
| Technology (licenses, computing, platforms) | $135K |
| Events & Marketing (Build-a-thon, symposium, showcases) | $85K |

Total Estimated Annual Investment: $500,000-$600,000

This should be viewed against the costs of inaction: graduates less prepared, faculty struggling without support, and competitive position eroding.

4.3 Governance Structure

AI Strategy Implementation Committee

  • AI Initiative Director (chair)
  • One representative from each department
  • Directors of Graduate and Undergraduate Programs
  • OIT and CTL representatives
  • Industry Advisory Panel liaison

Responsibilities: Oversee implementation, allocate resources, monitor metrics, adapt strategy, report quarterly to Dean

4.4 Implications for Performance Evaluation

4.4.1 Faculty Evaluation

  • Teaching: Recognize AI-integrated pedagogy innovation; consider adoption as instructional development
  • Research: Acknowledge AI-related research; recognize methodological innovation
  • Service: Credit participation in AI initiatives and contributions to exemplar libraries

Recommendations: Update annual review guidelines; provide guidance to department heads; establish recognition programs; avoid mandates that penalize faculty still developing capabilities.

4.5 Risk Management

| Risk | Likelihood | Impact | Mitigation |
|---|---|---|---|
| Faculty resistance | Medium | High | Peer-led development; demonstrate value |
| Rapid tech change | High | Medium | Focus on principles, not specific tools |
| Insufficient resources | Medium | High | Phased implementation; external funding |
| Academic integrity challenges | High | High | Proactive policy; assessment innovation |
| Competitive catch-up | Medium | Medium | Move quickly; distinctive positioning |

5 Conclusion

The AI transformation of business and business education is underway. Entry-level employment in AI-exposed occupations is declining. Traditional assessments are losing validity. Leading business schools are moving aggressively to integrate AI across their programs.

Terry College has the faculty expertise, institutional resources, and strategic position to respond effectively. This report provides a comprehensive framework: developing faculty capabilities, preparing students for AI-transformed workplaces, and engaging industry as a partner.

The recommendations are ambitious but achievable. They require investment, but the costs of inaction—measured in graduate preparedness, faculty satisfaction, and competitive position—are higher. They require change, but change in service of Terry’s enduring mission: preparing students to lead in business and society.

The task force respectfully submits these recommendations to Dean Chatterjee and looks forward to supporting implementation.


6 Appendix A: Task Force Process

Task Force Timeline
| Date | Activity |
|---|---|
| August 26, 2025 | Charge meeting; working groups established |
| September 30, 2025 | Second meeting; initial group reports |
| October-November 2025 | Faculty survey conducted (n=148) |
| November 4, 2025 | Third meeting; survey preliminary results |
| November 17, 2025 | Industry focus groups |
| December 2, 2025 | Fourth meeting |
| January 7, 2026 | Fifth meeting; final group reports |
| January 2026 | Report drafting and review |

Working Groups:

  • Faculty Development (Son Lam, Nikhil Paradkar): Developed and administered faculty survey; analyzed results
  • Student Learning (Margaret Christ, Jerry Kane): Researched peer institutions; analyzed integrity trends; developed curriculum recommendations
  • Industry Engagement (Susan Cohen, Marc Ragin): Conducted focus groups and interviews; researched partnership models

7 Appendix B: Peer Institution Benchmarking

| Institution | Key AI Initiatives |
|---|---|
| CMU Tepper | Integrated AI ecosystem; Block Center for ethics |
| Harvard Business School | Required ‘Data Science & AI for Leaders’ for all MBAs |
| Stanford GSB | AI applications across curriculum; faculty development |
| Indiana Kelley | Comprehensive AI Playbook; five principles framework |
| Penn State Smeal | Provost-endorsed faculty AI initiatives |
| UT Austin McCombs | Weekly faculty workshops; structured development |
| UW-Madison | AI symposiums; marketing AI integration |
| UMD Smith | AI Initiative in finance; textbook integration |
| Rutgers | ‘AI in Accounting’ certificates |

Common Themes: Move from pilots to coordinated portfolios; tiered AI literacy pathways; heavy faculty development investment; ethics embedded across curriculum; strategic industry partnerships; assessment innovation emphasis.



  1. AI Index Report 2025, Stanford Institute for Human-Centered AI. Available at: https://aiindex.stanford.edu/

  2. Brynjolfsson, E., Chandar, D., & Chen, W. (2025). “Canaries in the Coal Mine: Early Evidence of AI’s Impact on the Labor Market.” Working paper, Stanford Digital Economy Lab. Available at: https://digitaleconomy.stanford.edu/publications/canaries-in-the-coal-mine/

  3. AI Index Report 2025, Stanford Institute for Human-Centered AI.

  4. Dell’Acqua, F., McFowland, E., Mollick, E., et al. (2023). “Navigating the Jagged Technological Frontier: Field Experimental Evidence of the Effects of AI on Knowledge Worker Productivity and Quality.” Harvard Business School Working Paper 24-013. Available at: https://www.hbs.edu/faculty/Pages/item.aspx?num=64700

  5. Narayanan, A., & Kapoor, S. (2024). AI Snake Oil: What Artificial Intelligence Can Do, What It Can’t, and How to Tell the Difference. Princeton University Press.

  6. “The AI Future of Leadership and Management.” Presentation, Orlando, 2025.

  7. Brynjolfsson, Chandar, & Chen (2025), op. cit.

  8. Dell’Acqua, F., et al. (2025). “The Cybernetic Teammate.” Harvard Business School Working Paper. Available at: https://www.hbs.edu/faculty/Pages/item.aspx?num=67197

  9. UGA Office of Academic Honesty data, as reported to the task force, November 2025.

  10. Liang, W., et al. (2025). “Large Language Model Use in Scientific Publishing.” Nature Human Behaviour.

  11. “The State of AI in Business Education.” Inspire Higher Ed, October 2025.

  12. “Student Survey Summary 2025.” Journal of College Student Development, May/June 2025.