Manuel Aparicio

What is OpenAI Prompt Engineering?


Prompt Engineering with ChatGPT is one of the most common topics of conversation these days. Tools like it have reshaped almost every industry in less than a year, including software development. There are tons of applications for ChatGPT in and out of your workplace. It's no surprise that so many big corporations are using it to streamline their processes and boost productivity.

Even schools are teaching ChatGPT to students, despite the panic when it first came out. ChatGPT and similar generative AI tools are pretty powerful. But there's one caveat (probably more than one): they're (so far) only as effective as the prompts you give them. Thus, Prompt Engineering is crucial to get the most out of them. We'll explore the main aspects of Prompt Engineering for ChatGPT, starting with OpenAI's best practices.

What is Prompt Engineering?

Prompt Engineering comes down to the instructions you give generative AI tools to answer your questions. It's as if you were chatting with a friend. The thing is, tools like ChatGPT don't "work" the way we humans do. Therefore, the instructions you give them have to be clear and concise, following certain patterns. Still, it does feel like a conversation with another person. Keep in mind that Prompt Engineering with ChatGPT goes beyond content generation. You can also use it to summarize existing content and extract, translate, or categorize data. Another handy application is analyzing text.

Prompt Engineering involves using relevant keywords, signs, specific formats, and clear language. That said, OpenAI's best practices promote using detailed descriptions that guide responses, including length, style, tone, and context. You can also include an example of the desired format at the bottom of the prompt. OpenAI recommends using delimiters such as triple quotes (""") or #### to separate the context from the rest of the prompt to get the best results. Yet, that's just the tip of the iceberg.
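As a minimal sketch, here's how you might assemble such a delimited prompt in Python (the helper name and the example text are made up for illustration):

```python
def build_prompt(instruction: str, context: str) -> str:
    # The #### delimiters make it unambiguous where the instruction
    # ends and the text the model should operate on begins.
    return f"{instruction}\n\n####\n{context}\n####"

prompt = build_prompt(
    "Summarize the text below in three bullet points.",
    "Prompt Engineering is the practice of writing clear, structured "
    "instructions for generative AI tools.",
)
```

The same pattern works with triple quotes instead of ####; the point is simply that the delimiter never appears in the instruction itself.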

Parameters in OpenAI Prompt Engineering

Parameters are keywords to which you can assign certain values to play around with the results you get. OpenAI states that the model and temperature settings are the parameters that most affect outputs. Regarding models, the latest free and paid versions are currently GPT-3.5 Turbo and GPT-4. The main aspects you should consider are creativity, accuracy, and problem-solving skills. Needless to say, OpenAI recommends using GPT-4 (in other words, the latest model) to get the best results. Now, how can you use temperature to get different results?

The temperature parameter dictates how "random" (and potentially creative) responses will be. Setting a low value will make responses more standard, predictable, and to the point. So, if you're learning about a new subject or need factual information, you may want to set the temperature to a very low value. On the other hand, if you're looking for inspiration on an artistic subject, you may want to set the temperature to a higher value. Common use cases are writing stories, poems, marketing copy, emails, and so on.
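To make this concrete, here's a hedged sketch of how you might map use cases to temperature values and build the keyword arguments for a chat completions request. The preset values and the `request_params` helper are illustrative assumptions, not official recommendations:

```python
# Rough temperature presets by use case (values are illustrative).
TEMPERATURE_PRESETS = {
    "factual_qa": 0.0,      # predictable, to-the-point answers
    "summarization": 0.3,
    "marketing_copy": 0.9,  # more varied, "creative" wording
    "poetry": 1.2,
}

def request_params(task: str, prompt: str) -> dict:
    # Builds the kwargs you would pass to the chat completions endpoint,
    # e.g. client.chat.completions.create(**request_params(task, prompt)).
    return {
        "model": "gpt-4",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": TEMPERATURE_PRESETS.get(task, 0.7),
    }

params = request_params("poetry", "Write a haiku about autumn.")
```

Keeping the presets in one place makes it easy to experiment with different values per task without touching the rest of your code.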

How to Write Better Prompts?

Part of getting the most out of complex prompts involves getting more familiar with what you can do with them. I’m sure you know quite a few things you can accomplish with it, but let’s go over a few use cases in business environments. Apart from writing copy, summarizing information, and drafting emails, you can prompt ChatGPT to act as a personal assistant. Moreover, you can use it to create business presentations, process customer service data, and as a coding tool. Knowing the formats in which it can present a response is also key.

Examples include bullet points, numbered lists, comma-separated lists, inline code snippets, quotes, and headers. You can also ask for your response in a table or with web links included. Regardless of your goals, assigning roles is also very powerful. What do I mean by that? At the beginning of your prompt, you can ask ChatGPT to act as a certain character. That can help you play around with creativity and professionalism without touching the temperature.

For example, you can start your prompt with "Act as a full-stack developer with ten years of experience, specialized in the MERN stack." Then, it’s also important to state your goal or intended purpose with context. That could be, "I’m a beginner software developer and need your help to create a website blog. Start by helping me write the code for the navbar using React.js, HTML, and CSS."

Latest ChatGPT Updates

OpenAI has recently added some features to ChatGPT that take it to the next level. Instead of manually customizing each prompt to define the tone and intent of your request, you can specify that in the new custom instructions section. This new section contains two boxes. In the upper box, you can give ChatGPT any information you think it should take into account for better outputs. Suggestions include your job, goals, location, hobbies, etc. Then, in the lower box, you can tell ChatGPT how you expect it to respond. You can use the second box to define the outputs' tone, length, and bias. The best thing is that you can edit or deactivate your custom instructions anytime.

Another game-changing new feature for Prompt Engineering in ChatGPT is the code interpreter. Let's start with the bad news: so far, this feature is only available in the paid version (the GPT-4 model). The good news is that its power totally makes up for the membership cost. With the code interpreter, ChatGPT can write, execute, and even test its own code. As a result, you can use your prompts to complete coding tasks with much higher accuracy. What's really impressive is that it also lets you pass files and images in your prompts, from simple PDFs to complex Excel and CSV files with messy data. That opens up a whole new world of possibilities for Data Science and data analysis.

Prompt Engineering vs Problem Formulation

Generative AI tools like ChatGPT, DALL·E, and Midjourney caused a worldwide buzz about Prompt Engineering. While knowing how to interact with these tools is important, there's something you should keep in mind. Newer models will most likely be able to refine your prompts themselves. As they improve over time, Prompt Engineering will become less relevant. That's why you should also care about honing your problem-formulation skills in the meantime. Prompt Engineering skills involve interacting with a specific AI tool using specific keywords and following certain rules.

In contrast, Problem Formulation focuses on fully understanding the problem so you can easily delineate it. That’s especially powerful since even an excellent prompt is useless if it doesn’t target the right problem. For example, you may want to ask ChatGPT to create a database schema for an SQL database. Yet, the product in which you plan to use it requires horizontal scalability and involves tons of unstructured data. That means you should’ve probably approached that problem using a NoSQL Database Management System like MongoDB instead of an SQL database with a strict schema. That may not always be the case, but it should give you a rough idea of how important it is to understand the problem you want to solve.
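As a small illustration of why schema rigidity matters here (the sample records are made up), consider two records whose shapes barely overlap. A document store like MongoDB accepts both as-is, whereas a strict SQL schema would force NULL-heavy columns or a separate table per record type:

```python
# Two "product" records with different shapes -- typical unstructured data.
products = [
    {"name": "Laptop", "specs": {"ram_gb": 16, "cpu": "M2"}},
    {"name": "T-shirt", "sizes": ["S", "M", "L"], "color": "black"},
]

# The only field the two records share -- a fixed relational schema
# would have to accommodate every field of every record type.
shared_fields = set(products[0]) & set(products[1])
```

Spotting this mismatch before writing any prompt is exactly the kind of problem formulation the section describes.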


Prompt Engineering techniques and best practices are specific to each tool. In other words, part of the work to improve is tons of practice, experimenting, and interacting with the tool. Luckily, OpenAI has given us detailed guidelines for Prompt Engineering with its tools, so it's relatively easy for anyone to get started in the blink of an eye. It's worth noting that experimenting with your chosen tool involves using follow-up prompts to get the desired output.

However, the more you get used to the AI tool, the less you'll need them. As a full-cycle digital agency, we believe generative AI tools can help streamline the product development process to a large extent on a daily basis.