Have you ever been frustrated after asking someone to complete a task for you, only to find that their results were disappointing? If you take a moment to reflect, is it possible that the problem was that your instructions weren’t as clear as they could have been? Weak or inadequate instructions usually lead to weak, ineffective results. Would you be surprised to hear that Generative AI is no different?
The rise of Gen-AI tools has brought with it unprecedented opportunities for education and business. However, it has also introduced an increasingly important digital skill that most of us did not previously know existed: understanding how to communicate effectively with artificial intelligence. Generative AI, like ChatGPT, is transforming how we learn, teach, and work. But we need to stop using it like we would Google or other traditional search engines. To leverage its full potential, we must learn how to provide it with clear and unambiguous commands. These initial instructions, known as prompts, need to be effective in order for us to truly take advantage of all that artificial intelligence is capable of. This is where the SMARTER framework comes in: a practical method for optimizing instructions and interactions with generative AI.
Why prompt engineering matters
Generative AI tools rely on user-provided input to generate meaningful output. Despite their sophistication, these tools are not mind readers. ChatGPT, Claude, and others are sophisticated prediction machines designed to follow instructions precisely as given. Ambiguous or poorly constructed prompts often lead to unhelpful responses, while clear, structured prompts produce higher-quality results. Prompt engineering, the skill of designing inputs for AI, has quickly become essential for navigating our AI-enhanced world. While advances in chatbots are smoothing over some of the more rudimentary daily Gen-AI interactions, the reality is that prompting well, especially in higher education, will be a valuable digital skill for most people. Think of it as training people how to think and communicate with AI, as well as teaching AI how to think and generate optimum output for the user.
Introducing the SMARTER framework
The SMARTER framework provides a step-by-step guide for creating prompts that maximize AI’s effectiveness. SMARTER is an acronym that stands for the following.
S. Specify your identity
Begin by clearly introducing yourself and your role. Providing context helps AI tailor its responses to your unique needs.
Prompt example: I am a university professor teaching Business Ethics to undergraduates. I need help designing an interactive classroom activity on ethical dilemmas.
Unless you have a paid subscription to ChatGPT, for example, the chatbot you are using most likely does not have memory and will not recall who you are or what you have previously told it about yourself.
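If you work with Gen-AI through an API rather than a chat window, the same principle applies: each request starts from a blank slate, so your identity and context have to be restated every time, typically as a system or opening message. The snippet below is only a rough sketch assuming the OpenAI Python SDK; the model name is an arbitrary example, and any chat-style library would work the same way.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# "S": specify your identity in every request, since the model has no memory of you.
identity = (
    "I am a university professor teaching Business Ethics to undergraduates. "
    "I need help designing an interactive classroom activity on ethical dilemmas."
)

response = client.chat.completions.create(
    model="gpt-4o",  # example model name; substitute whatever your institution uses
    messages=[
        {"role": "system", "content": identity},
        {"role": "user", "content": "Suggest one 20-minute activity I could run this week."},
    ],
)
print(response.choices[0].message.content)
```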
M. Make requests clear
Clearly articulate your goals and desired outcomes. The more specific your instructions, the more accurate AI’s response will be. Instead of asking, “Help me with class activities,” be more specific and try something like the following.
Prompt example: Suggest a role-play exercise for teaching corporate social responsibility.
A. Articulate steps to be taken
In the same way you would give your TA clear steps for completing a task, you need to provide a clear roadmap for what you want the AI to do. If you’re designing a curriculum, outline the structure.
Prompt example: I need a 12-week schedule for an MBA course on leadership, including weekly topics, readings, and three activities that can be completed asynchronously.
R. Request or give examples
Examples help AI understand your expectations. If you provide it with an example, it has a better idea of what you are looking for. Alternatively, you can ask it to provide you with specific illustrations to help teach a particular concept.
Prompt example: Give me a contemporary example, from within the last 12 months, of a debate topic for a business law class focusing on intellectual property rights.
T. Task limitations
Setting boundaries and constraints for Gen-AI’s responses is vital to keeping it within the scope of your inquiry. Telling it what to focus on or exclude prevents irrelevant or overly broad content, manages expectations, and keeps the output focused and manageable.
Prompt example: Provide examples of leadership strategies but avoid referencing military or sports contexts.
E. Enhance and refine
This step involves improving the quality of Gen-AI responses through iterative feedback: refining your prompts based on the initial outputs and ensuring the final product meets all specified requirements. Ask for summaries, rephrasing, corrections, or clarifications, and don’t hesitate to iterate until you get the desired result.
R. Regenerate and experiment
AI responses are not set in stone. If an answer isn’t satisfactory, refine your prompt or use the “regenerate” feature to explore different perspectives. Don’t settle for the first answer provided; trying different prompts or asking the AI to regenerate its output can surface new perspectives and ideas, fostering creativity and ensuring a robust examination of the topic.
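To make the whole framework concrete for readers who script their Gen-AI interactions, here is one way the seven elements might be assembled into a single prompt. This is a minimal, illustrative sketch rather than part of the framework’s own materials; the wording of each piece is what carries the SMARTER structure, and the composed text can be pasted into any chatbot or sent through any chat API.

```python
# Illustrative only: stitch the seven SMARTER elements into one prompt string.
smarter_parts = {
    "Specify your identity": "I am a university professor teaching an MBA course on leadership.",
    "Make requests clear": "Design one asynchronous discussion activity on ethical leadership.",
    "Articulate steps": "List the learning objective, the instructions for students, and a grading rubric.",
    "Request or give examples": "Model the rubric on a simple three-criterion, four-level format.",
    "Task limitations": "Avoid military and sports examples, and keep the activity under 60 minutes of student work.",
    "Enhance and refine": "After your first draft, summarize it in two sentences so I can check the focus.",
    "Regenerate and experiment": "Offer two alternative versions so I can compare approaches.",
}

prompt = "\n".join(f"{label}: {text}" for label, text in smarter_parts.items())
print(prompt)  # paste into a chatbot, or send as the user message through a chat API
```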
Practical applications of SMARTER prompts
The SMARTER framework is not just theoretical; it’s immensely practical. It gives us a clear approach to interacting with the plethora of emerging Gen-AI chatbots, whether in higher education specifically or in our wider context today. It is intended to help us optimize the way we interact with artificial intelligence tools by ensuring our prompts are concise, specific, and deliberately engineered to generate the most accurate and helpful content. As Gen-AI continues to saturate both academic and professional life, the ability to write clear, precise prompts that produce correct and detailed responses will be a critical digital skill, one that enhances innovation and efficiency in AI interactions.
Written by Martin Jones, PhD, JD, Associate Dean of Graduate and Online Business Programs, and Associate Professor of Law and Ethics in the College of Business and Entrepreneurship at North Greenville University.
To learn more about implementing the SMARTER prompt engineering framework, watch Dr. Jones’ upcoming webinar session, SMARTER Prompt Engineering: A Practical Framework for Using GenAI in Higher Ed, part of our 2025 virtual Empowered Educator conference, taking place February 12-February 13.