Lesson 4: Unleashing the Power of Effective Prompting Techniques

This lesson is part of The Prompt Artisan Prompt Engineering in ChatGPT: A Comprehensive Master Course.

4.1. Explicitness in Instructions

Being explicit in your instructions is a crucial aspect of prompt engineering. AI language models, such as GPT-4 and ChatGPT, are sensitive to input phrasing and can provide better results when given clear, unambiguous instructions. To enhance explicitness:

  • Specify the desired format or structure of the response, such as a list, a paragraph, or a step-by-step explanation.
  • Include examples to illustrate your expectations, while ensuring the examples don’t bias the AI model’s response.
  • Define any terms or concepts that may be ambiguous or open to interpretation.


Vague Prompt:

"Explain inflation."

Explicit Prompt:

"In simple terms, explain what inflation is and provide three common causes of inflation in modern economies."

Additional tips for enhancing explicitness:

  • Set clear goals: Make sure the AI model understands the ultimate goal or objective of the task.

Examples of good and bad prompts:

Good Prompt:

"Describe three ways in which solar energy contributes to environmental sustainability." 

This prompt is explicit, providing context (solar energy), a specific goal (environmental sustainability), and a format (three ways).

Bad Prompt:

"Tell me about solar energy." 

This prompt is vague and doesn’t provide guidance on the desired focus or format.
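When prompts are assembled programmatically, explicitness can be enforced with a simple template that always fixes the audience, format, and focus. The sketch below is illustrative only; the function name and fields are not from any library:

```python
def explicit_prompt(topic, audience, num_points, focus):
    """Build an explicit prompt that pins down audience, format, and focus."""
    return (f"In terms a {audience} can follow, explain what {topic} is "
            f"and provide {num_points} common {focus}.")

# Reconstructs a prompt similar to the explicit inflation example above.
prompt = explicit_prompt("inflation", "beginner", 3,
                         "causes of inflation in modern economies")
```

Templating like this makes it harder to accidentally send a vague prompt, because the required elements (format, count, focus) are function arguments rather than afterthoughts.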

4.2. Asking the Model to Think Step-by-Step

Encouraging the AI model to think step-by-step can lead to more thoughtful, coherent, and accurate responses. By breaking down complex tasks or problems into smaller steps, the AI model can better grasp the underlying concepts and provide more useful outputs. Some strategies include:

  • Asking the model to outline or enumerate the steps before diving into a detailed explanation.
  • Requesting the model to weigh pros and cons, evaluate alternatives, or consider different perspectives before arriving at a conclusion.
  • Scaffolding complex tasks by breaking larger tasks into smaller, manageable components for the AI model to address systematically.


Regular Prompt:

"What are the best ways to invest money?"

Step-by-Step Prompt:

"List five popular investment options and briefly analyze the advantages and disadvantages of each."

Examples of good and bad prompts:

Good Prompt:

"Outline the process of creating a marketing plan in five steps, starting with market research and ending with implementation." 

This prompt encourages step-by-step thinking and provides a clear starting and ending point.

Bad Prompt:

"How do you make a marketing plan?" 

This prompt is vague and doesn't encourage step-by-step thinking or a structured response.
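Step-by-step scaffolding can also be automated: wrap the task in boilerplate that asks the model to enumerate steps, optionally supplying the steps yourself. This is a minimal sketch with an illustrative function name:

```python
def step_by_step_prompt(task, steps=None):
    """Wrap a task so the model works through an explicit sequence of steps."""
    if steps:
        # Number the caller-supplied steps to give the model a fixed outline.
        outline = " ".join(f"{i}. {s}" for i, s in enumerate(steps, 1))
        return f"{task} Work through these steps in order: {outline}"
    # No outline given: ask the model to produce one before answering.
    return f"{task} Before answering, outline your steps, then explain each one."

p = step_by_step_prompt(
    "Create a marketing plan.",
    steps=["market research", "target audience", "positioning",
           "budget", "implementation"],
)
```

Supplying the steps yourself (as in the good marketing-plan prompt above) gives the model both a starting and an ending point; omitting them still nudges it toward a structured response.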

4.3. Experimenting with Temperature and Max Tokens

Adjusting parameters like temperature and max tokens can greatly influence the AI model’s output:

  • Temperature: A higher temperature (e.g., 0.8) leads to more diverse and creative outputs, while a lower temperature (e.g., 0.2) generates more focused and deterministic responses. Experiment with different temperature settings to achieve the desired balance between creativity and consistency.
  • Max Tokens: By setting a max token limit, you can control the length of the AI model’s output. Be cautious when setting a low max token limit, as it may result in truncated responses that are difficult to understand.
  • Context-dependent adjustments: Adjust these parameters based on the context and desired outcome of the task.


High Temperature:

"Write a creative story about a dragon and a knight."

Low Temperature:

"Summarize the key principles of project management."

Examples of good and bad prompts:

Good Prompt (low temperature):

"List three benefits of using renewable energy sources like solar and wind power." 

This prompt benefits from focused, deterministic responses generated at a low temperature.

Bad Prompt (low temperature):

"Write an imaginative story set in a world where plants can communicate with humans." 

This prompt would benefit from higher temperature, as it seeks creativity and diversity in the response.
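In practice, temperature and max tokens are set per request through the model's API. The sketch below assembles a request payload in the shape used by the OpenAI Chat Completions API; the helper function is illustrative, and field names may differ for other providers:

```python
def build_request(prompt, temperature=0.7, max_tokens=256, model="gpt-4"):
    """Assemble a chat-completion request payload with sampling parameters."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,  # lower = more focused and deterministic
        "max_tokens": max_tokens,    # hard cap on the length of the reply
    }

# Creative task: raise temperature for more diverse output.
creative = build_request("Write a creative story about a dragon and a knight.",
                         temperature=0.8)

# Factual task: lower temperature and a tighter length limit.
focused = build_request("Summarize the key principles of project management.",
                        temperature=0.2, max_tokens=150)
```

Matching the parameters to the task, as the two calls above do, is usually more effective than using one fixed setting for everything.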

4.4. Using User-like Inputs

Feeding the AI model user-like inputs can lead to more engaging and natural responses. This approach is especially useful when working with ChatGPT, which is designed for conversational interactions. To create user-like inputs:

  • Phrase your prompt as a question, request, or statement that a real user might pose.
  • Include conversational elements, such as greetings, follow-up questions, or acknowledgements, to create a more interactive and engaging experience.
  • Mimic real-world scenarios: Use phrasing and context that reflects real-world situations and challenges users may face.


Regular Prompt:

"Pros and cons of solar energy."

User-like Prompt:

"Hey, I've been thinking about switching to solar energy for my home. Can you tell me some of the pros and cons?"

Examples of good and bad prompts:

Good Prompt:

"I'm thinking about starting a small business but I'm not sure what type to choose. Can you suggest three types of small businesses that have been thriving in recent years and why they're successful?" 

This prompt simulates a genuine user inquiry, providing context (small businesses) and asking for specific suggestions.

Bad Prompt:

"Business ideas." 

This prompt lacks user-like input and doesn’t provide enough context or guidance for the AI model to generate a meaningful response.
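In chat-style APIs, user-like input is expressed as a message list with roles. The sketch below builds such a list in the common chat format; the function name is illustrative:

```python
def user_like_messages(greeting, question, system_role=None):
    """Assemble a conversational message list in the common chat format."""
    messages = []
    if system_role:
        # An optional system message frames the assistant's persona.
        messages.append({"role": "system", "content": system_role})
    # Phrasing the request as a real user would makes replies more natural.
    messages.append({"role": "user", "content": f"{greeting} {question}"})
    return messages

msgs = user_like_messages(
    "Hey, I've been thinking about switching to solar energy for my home.",
    "Can you tell me some of the pros and cons?",
)
```

Keeping the greeting and the question together in a single user turn mirrors how people actually write, which is exactly the effect the technique aims for.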

Mastering effective prompting techniques is essential to harness the full power of AI language models like GPT and ChatGPT. By applying these techniques, you will elevate your prompt engineering skills, creating prompts that generate exceptional results in a wide range of applications.

Take some time to test what you learned and try out new ideas of your own. When you are ready to learn more, continue to the next lesson, Lesson 5: Mastering Advanced Prompting Techniques.
