EdTech AI Insights™


Generative AI in Business Education


Introduction

Business schools stand at a crossroads. They can treat generative AI as a threat to academic integrity, or as a tool to prepare graduates for the realities of professional work. A growing body of U.S. research suggests that banning these tools undermines program credibility, while structured use can enrich learning and protect equity. While this article draws on U.S. studies, the design principles apply to business education internationally, where questions of integrity, equity, and readiness are shared. This article translates research findings into course design principles that build judgment, not shortcuts.


Why a Disciplined Approach Matters

Graduates entering accounting, marketing, finance, or management will be expected to write, analyze, and present with AI. Pretending the technology does not exist only leaves students unprepared. Instead, courses should teach when and how to use AI responsibly, model expert prompts, and reinforce verification. Evidence from recent studies shows that design choices, not the tools themselves, determine learning outcomes (Sun & Deng, 2025; Wukich et al., 2024).


What the Research Shows


Five recent studies form a strong foundation for action:

• Accounting field experiment. In three accounting courses, students could choose to use ChatGPT. Test scores showed no difference between the 59 users and 111 non-users, regardless of major or modality. Students preferred notes and textbooks for qualitative tasks, while leaning on ChatGPT for calculations (Wukich, Henderson, & Daigle, 2024). This suggests that simply permitting AI does not erode outcomes; students still favor traditional resources for core tasks.


• Experiential design study. Guided activities aligned to Kolb’s learning cycle revealed that prompts created by instructors shaped the quality of student prompts. Students found ChatGPT helpful but noted challenges with metacognitive tasks, highlighting the need for scaffolds and verification routines (Sun & Deng, 2025).


• Regional survey. At a Southeastern U.S. university, 80% of business students had heard of ChatGPT, but only 22% used it in class. Sixty percent viewed unpermitted use as unethical, underscoring the influence of policy clarity: where expectations were explicit, students largely treated them as binding (Shurden & Shurden, 2024).

• National survey. A U.S. sample of 1,001 students found that institutional policies permitting AI use correlated with higher adoption. Non-native English speakers reported greater reliance on ChatGPT for writing, suggesting its potential as a language support tool (Baek, Tate, & Warschauer, 2024). Together, these adoption patterns underscore how policy signals are powerful levers for shaping student behavior.


• Marketing focus groups. Marketing students reported using ChatGPT for brainstorming, drafting, and feedback. While they valued efficiency, they expressed concerns about accuracy, ethics, and over-reliance (Hazari, 2025).


Together, these findings show that AI neither automatically boosts nor erodes learning. Outcomes depend on thoughtful course design, institutional policy, and equity measures.


Ten Design Principles for Business Courses

These principles align with established instructional design traditions while adapting them to the unique challenges of generative AI:

1. Tie AI use to explicit learning outcomes, aligned with Bloom’s taxonomy.

2. Model expert prompting in class and explain the structure of strong prompts.

3. Keep human judgment at the center; grade reasoning and decision quality.

4. Introduce AI in low-stakes labs early in the term.

5. Provide scaffolds, checklists, and clear tool boundaries.

6. Require process evidence such as drafts and verification notes.

7. Add short oral checks to confirm understanding.

8. Align AI use with disciplinary practices, such as accounting or marketing.

9. Provide plain-language guides and support for non-native speakers.

10. Close each cycle with structured reflection on transfer and verification.


Policy as a Teaching Tool

Policies are more than rules; they shape student behavior. When institutions or instructors allow AI under specific conditions, students respond with greater responsibility (Baek et al., 2024; Shurden & Shurden, 2024). Effective policies are assignment-bound, transparent, and include concrete examples. They should also address privacy, data use, and institutional approval of tools. Overreliance on detection tools can disproportionately harm non-native English speakers, who are more likely to be misclassified (Baek et al., 2024). Detection software, while tempting, should never be the sole basis for discipline.


Sub-Discipline Applications

• Accounting: Permit AI for narrative explanations, not final calculations. Require transparency and oral checks (Wukich et al., 2024).

• Marketing: Use AI for brainstorming and audience testing, but require verification of claims and brand safety (Hazari, 2025).

• Information Systems: Map prompts to Bloom’s levels, generate test cases, and require manual checks (Sun & Deng, 2025).

• Management and Strategy: Applications here are broader and integrative. Use AI to generate competing options, highlight counterarguments, and surface second-order effects. The goal is not polished prose but structured reasoning, so assignments should emphasize clarity of criteria, defensible choices, and short team defenses in class.

Across all four areas, the pattern is the same: AI becomes a routine part of disciplinary practice, and it works best when paired with human judgment and strategic decision-making.


Assessment That Rewards Thinking

A process-evidence model strengthens integrity. Students submit prompts, drafts, and revision notes, alongside a short rationale. Verification quality is graded separately, and rotating oral checks deter misuse. These steps emphasize judgment over polished output.


Equity and Workload

Equity requires more than access to tools. Students benefit from plain-language guides, language support, and low-bandwidth options. Limiting courses to one approved tool per assignment reduces confusion. To manage instructor workload, use pass-fail checks for routine artifacts and focus feedback on milestones.


Conclusion: A Pragmatic, Optimistic Path

Generative AI is already part of professional work. Business programs should meet this reality with careful design, structured policies, and a focus on human judgment. Business education has always adapted to disruptive technologies—from spreadsheets to ERP systems. Generative AI is the next wave. Programs that act now will not only protect academic integrity but also graduate students ready to lead in AI-augmented workplaces. This disciplined approach is equitable, scalable, and prepares business schools to lead rather than follow.


References


Baek, C., Tate, T., & Warschauer, M. (2024). “ChatGPT seems too good to be true”: College students’ use and perceptions of generative AI. Computers and Education: Artificial Intelligence, 7, 100294. https://doi.org/10.1016/j.caeai.2024.100294


Hazari, S. (2025). Marketing students’ perceptions towards ChatGPT: An AI-assisted thematic analysis. Marketing Education Review, 35(3), 171–187. https://doi.org/10.1080/10528008.2025.2470198


Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Prentice-Hall.

Shurden, S., & Shurden, M. (2024). Business student’s perception of the use of artificial intelligence in higher education with a focus on ChatGPT. Journal of Instructional Pedagogies, 30, 1–12. https://files.eric.ed.gov/fulltext/EJ1459917.pdf


Sun, R., & Deng, X. (2025). Using generative AI to enhance experiential learning: An exploratory study of ChatGPT use by university students. Journal of Information Systems Education, 36(1), 53–64. https://doi.org/10.62273/ZLUM4022


Wukich, J., Henderson, C., & Daigle, R. J. (2024, October 29). How students use—and don’t use—ChatGPT. Journal of Accountancy. https://www.journalofaccountancy.com/news/2024/oct/how-students-use-and-dont-use-chatgpt.html


© 2025 Charles Ulrich Company, Inc. | EdTech AI Insights™ | All Rights Reserved.  


Articles were developed with research, drafting, and grammar support from ChatGPT and Grammarly.  

All images were created using ChatGPT.
