Tech scholar publishes research on embracing AI, now a part of COB curriculum
Louisiana Tech’s Dr. Craig Van Slyke, an expert in the field of information systems, recently published an article in Communications of the Association for Information Systems. Titled “Generative Artificial Intelligence in Information Systems Education: Challenges, Consequences, and Responses,” Van Slyke’s analysis provides a thorough examination of the challenges posed by AI integration in information systems education, highlights the resulting transformations, and offers insightful responses to navigate this changing landscape.
What if I told you AI — ChatGPT, to be specific — wrote the above paragraph? Well, it did. I gave it a simple, one-sentence prompt, and it produced my introduction. The tool is becoming one of the most rapidly adopted technologies in history, and Van Slyke’s research aims to develop a better understanding of the range of impacts within information systems education and how educators might respond.
Written alongside Richard D. Johnson of Washington State University and Jalal Sarabadani of San Jose State University, the article outlines four possible responses educators can take to the rise of AI tools: do nothing; prohibit the use of AI tools; allow limited use of AI tools; or embrace AI tools as legitimate learning aids.
“My personal view is that it is a losing game to try to prevent AI use. We need to teach students and workers how to use generative AI ethically, effectively, and safely,” said Van Slyke, who serves as Tech’s Mike McCallister Eminent Scholar Chair in Information Systems. “A large part of this will be related to helping people understand the risks and benefits of AI.”
Tech’s College of Business is taking steps to do just that. This fall, faculty will begin embedding AI content in both CIS 125: IT Solutions for Business and CIS 310: Principles of Information Systems.
“One likely activity will be designed to help students learn to refine results through a chain of prompts,” Van Slyke said. “It’s pretty rare to get the instructions to ChatGPT right the first time. It’s kind of like talking with a colleague to help you refine ideas. It’s a process of back and forth through which the ideas become more concrete and better defined.”
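The back-and-forth Van Slyke describes works because each new prompt is sent along with the full conversation so far, letting the model refine its earlier answers. Here is a minimal illustrative sketch of that "chain of prompts" pattern; `send_to_model` is a hypothetical placeholder standing in for a real chat-model API call, not any specific vendor's library.

```python
# Illustrative sketch of a "chain of prompts": each follow-up is sent together
# with the prior conversation, so context carries forward and the model can
# refine its earlier output.

def send_to_model(messages):
    # Placeholder: a real implementation would call a chat-completion API here.
    return f"(model reply to: {messages[-1]['content']!r})"

def refine(initial_prompt, follow_ups):
    """Run an initial prompt, then a series of refining follow-ups,
    keeping the full message history so each reply builds on the last."""
    messages = [{"role": "user", "content": initial_prompt}]
    messages.append({"role": "assistant", "content": send_to_model(messages)})
    for follow_up in follow_ups:
        messages.append({"role": "user", "content": follow_up})
        messages.append({"role": "assistant", "content": send_to_model(messages)})
    return messages

conversation = refine(
    "Summarize our research article for a press release.",
    ["Make it shorter and less formal.",
     "Lead with the course names."],
)
print(len(conversation))  # 6 messages: 3 prompts and 3 replies
```

The point of the sketch is the structure, not the model: because the whole history travels with every request, each follow-up prompt refines the previous answer rather than starting over, much like iterating on an idea with a colleague.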
Students will also learn to use ChatGPT for specific learning-related tasks, such as brainstorming ideas and acting as a study partner or a tutor.
“We want to provide students with guidelines for ethical AI use for learning activities,” Van Slyke said. “Our goal is to help students understand how to use generative AI ethically AND effectively.”
Van Slyke and colleagues France Belanger of Virginia Tech and Rob Crossler of Washington State are currently working on the 5th edition of the textbook used in Tech’s CIS 310 course, Information Systems for Business: An Experiential Approach, which will continue to give students the tools and foundational knowledge they need to understand how new technologies fit into their lives.
“Through our curriculum, students learn to leverage technology in business to its fullest,” College of Business Dean Dr. Chris Martin said. “We have to recall that calculators, computers, and the internet were initially cast aside. Not having these today is unimaginable. As generative AI continues to become more mainstream, businesses will likely embrace these tools to become more efficient and effective. We want our students to be prepared — just as they are with current technology.”
Faculty across the business disciplines have started incorporating generative AI concepts into their coursework. Tech’s Dr. Patrick Scott, Associate Professor of Economics and Patricia Garland Endowed Professor, uses AI in his upper-division economics courses to help students with coding — something Scott anticipates will allow him to increase the number of analytical exercises offered by at least 30 percent next fall.
“ChatGPT queries of coding topics usually include minimal working examples as well as explanations for why someone would typically want to approach a problem in a given way,” said Scott. “This significantly shortens the learning curve for students by reducing the opportunity cost of trying new techniques. Students will be exposed to more material and expected to know more. This will close the gap between what students know in an undergraduate program and what they will need in a graduate program or job.”
The College of Business prides itself on preparing students for a global workforce and aligning curriculum with industry needs. Integrating emerging concepts into the classroom is one way faculty increase students’ exposure to both innovation and technology, two of the College’s core academic themes.
In Scott’s lower-division classes, he asks students to review and correct AI-generated content that is wrong.
“AI frequently gives irrelevant, incorrect, or improper output,” he said. “This exercise forces students to identify what is not right. It sharpens students’ critical thinking and discernment skills.”
As faculty and students alike begin to explore the challenges and possibilities of generative AI in the classroom, Van Slyke’s research will continue to provide relevant guidance on what could become, in his words, “as disruptive as e-commerce was in the late 1990s/early 2000s.”
“In our paper, we argue that Information Systems faculty should embrace AI tools as legitimate learning aids and that the Association for Information Systems should take a leadership role in determining our collective response to the threats and challenges from ChatGPT and other AI tools,” Van Slyke said. “Although nobody knows the extent of the impact of generative AI systems like ChatGPT, it is clear that the impacts will be significant. We need to prepare students to live in a world that’s heavily influenced by generative AI systems.”