At one point in early 2023, there was a certain optimism in tech circles: the idea that you could make six figures just by being able to communicate with a machine. No coding boot camp. No computer science degree. Just the right words entered into a chat box in the right order. In retrospect, it sounds ridiculous. At the time, it looked like a real fissure in the wall that had always separated technical jobs from everyone else. The profession was called Prompt Engineer. And it was real for about eighteen months.
The reaction to ChatGPT’s late-2022 launch ranged from fascination to mild collective hysteria. Tens of millions of users registered within weeks. Screenshots of conversations in which the AI drafted legal briefs, wrote songs, or debugged code were all over social media. Amid the excitement came a subtler realization: how you phrased the question had a significant impact on the answer you got. Someone had to learn how to do that well. And businesses, it turned out, would pay for it.
| Topic Profile: Prompt Engineering as a Career | |
|---|---|
| Field | Artificial Intelligence / Human-AI Interaction |
| Emergence Year | 2022 (post-ChatGPT launch) |
| Peak Salary (US) | $335,000/year — offered by Anthropic, San Francisco |
| Pioneer | Riley Goodside — early GPT-3 experimenter, first Staff Prompt Engineer in history |
| Employer (Notable) | Scale AI, Anthropic, OpenAI, Microsoft, various startups |
| Decline Declared | Early 2025 — Sean Grove, senior OpenAI researcher |
| Microsoft Survey Finding | 31,000 employees surveyed; Prompt Engineer ranked 2nd least-wanted role for next 12–18 months |
| Entry Requirement (2023) | No computer science degree required |
| Key Techniques | Chain of Thought, Tree of Thought, Role-based Prompting, Reflection Prompts |
| Current Status | Absorbed into broader AI Engineering, LLM Engineer, and AI Product roles |
The San Francisco AI lab Anthropic advertised a position for a “Prompt Engineer & Librarian” with an annual salary of up to $335,000. Technical experience was not required. According to the job description, the ideal candidate “loved solving difficult problems.” It’s easy to see why that post went viral on Twitter and LinkedIn. For those who had always been told they lacked the necessary credentials for technology, it read like a door left wide open.
Long before anyone else realized that door existed, Riley Goodside had been using it. He was a self-described experimenter who had spent years posting on social media about his interactions with early language models, testing what GPT-3 would do when given personas, forced to review its own responses, and guided step-by-step through problems.

He wasn’t merely conversing with a computer. He was charting its psychology. Goodside would go on to become the first Staff Prompt Engineer in history, according to Scale AI founder Alexandr Wang. In 2021, that title seemed unique, even strange, but it would soon appear on hundreds of business cards.
There was something akin to a gold rush by 2023. According to a ResumeBuilder survey, almost 29% of businesses intended to hire prompt engineers, and about 25% of those anticipated offering starting salaries above $200,000. In China, there were rumors that senior practitioners received annual packages worth up to one million yuan.
In China’s startup world, a junior prompt engineer could earn 10,000 to 15,000 yuan per month, a genuinely competitive wage there. The profession felt young and a little chaotic, in the way fascinating things often do at first.
Since the job wasn’t always clearly defined in the beginning, it’s worth pausing on what it actually entailed. Some practitioners recall a period when business executives, having seen demonstrations of ChatGPT writing marketing copy or digital humans reading scripts, wanted AI integrated into their operations but had no idea how to do it.
Prompt engineers filled that gap. Part translator, part consultant, part technician. Because the duties were so ambiguous, salaries often reflected what a candidate could negotiate rather than any set rate card, according to some insiders. The role’s boundaries were hazy, but it was real.
The skills, however, were tangible. Chain of Thought prompting walked a model step-by-step through a problem. Reflection Prompts required the AI to evaluate and revise its own work. Role-based prompting changed the model’s tone or focus by dressing it in a persona. Tree of Thought branched the reasoning down several paths at once. These weren’t tricks, exactly.
They were strategies to make up for the shortcomings of the model—ways to extract dependable performance from a system that was both genuinely impressive and genuinely inconsistent. And that’s exactly where the narrative takes a turn.
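The techniques above were, at bottom, ways of composing text. A minimal sketch of three of them, with hypothetical wording chosen for illustration (the sketch only builds the prompt strings; the model call itself is out of scope):

```python
def chain_of_thought(question: str) -> str:
    """Chain of Thought: ask the model to reason step by step before answering."""
    return f"{question}\nLet's think step by step before giving the final answer."


def role_based(persona: str, task: str) -> str:
    """Role-based prompting: dress the model in a persona to steer tone and focus."""
    return f"You are {persona}. Stay in that role.\n\nTask: {task}"


def reflection(instruction: str) -> str:
    """Reflection prompt: have the model critique and revise its own output."""
    return (
        f"{instruction}\n\n"
        "Now review your answer above. List any errors, then write a corrected version."
    )


# Example: combine persona framing with step-by-step reasoning.
prompt = chain_of_thought(
    role_based("a patent attorney", "Summarize the claim in plain English.")
)
```

The point of such scaffolding was ordering and framing, which is why, as the next sections describe, it lost value once models inferred intent on their own.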
The models improved. Not gradually, as technological progress usually goes, but rapidly, almost rudely. GPT-4, Claude 3, Gemini 1.5. By 2024 and 2025, these systems could comprehend intent in ways their predecessors could not. The intricate scaffolding of prompt construction, the meticulous sequencing, persona framing, and instruction ordering, stopped mattering much. A vague request would get close enough to helpful. The gap that expert prompting had been filling was shrinking.
Tool use sharpened the break further. Early prompt engineering leaned on constructions like “Imagine you have access to a database…” to mimic capabilities the models lacked, such as searching, calculating, and retrieving. By 2025, models simply had those capabilities. Once the tools arrived, the gymnastics were no longer necessary.
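The shift can be sketched as a toy dispatch loop: instead of prompting the model to imagine a calculator, the runtime detects a tool request in the model’s output and runs a real function. The `CALL tool: arg` format and the tool names here are hypothetical, invented for illustration:

```python
import re

# Registry of real capabilities the runtime exposes (demo stand-ins).
TOOLS = {
    # Restricted eval for arithmetic only; a real system would use a safe parser.
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
    "lookup": lambda key: {"capital_of_france": "Paris"}.get(key, "unknown"),
}


def run_turn(model_output: str) -> str:
    """If the model emitted 'CALL <tool>: <arg>', execute the tool; else pass through."""
    m = re.match(r"CALL (\w+): (.+)", model_output.strip())
    if not m:
        return model_output  # ordinary text, no tool requested
    name, arg = m.group(1), m.group(2)
    return TOOLS[name](arg)
```

Production tool-calling APIs use structured function schemas rather than regex over text, but the division of labor is the same: the model asks, the runtime does.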
In an early 2025 private meeting, OpenAI senior researcher Sean Grove stated unequivocally that “prompt engineering is dead.” It’s the kind of statement that lands differently depending on where you sit. For those who had built careers on the discipline, it stung. For the doubters who had always considered the role a little speculative, it felt like confirmation.
According to a Microsoft survey of 31,000 workers, prompt engineer ranked as the second least desired new position that employers wanted to fill in the coming year. The gold rush, it seemed, had exhausted the town.
The real story is more complex and likely more fascinating. The role dissolved into its surroundings rather than vanishing outright. Job board data tracked by communities of AI practitioners shows a complicated picture: positions requiring prompt engineering skills, now nestled inside titles like AI Engineer, LLM Engineer, and Applied ML Engineer, grew significantly, while postings with the specific title “Prompt Engineer” decreased by about 30%. The skill was absorbed; the standalone occupation did not endure.
Watching this develop, I get the impression that prompt engineering always carried a specific vulnerability. The discipline’s fundamental goal was to communicate effectively with a system that needed help understanding you. As one practitioner put it, with only a hint of irony: “That’s not an AI skill.” It’s just conversation.
Expert intermediaries became less necessary as the systems grew more adept at determining intent. The certified prompt engineer bootcamps and YouTube courses that had sprung up around the role, many of which sold frameworks with names like LangGPT or structured prompt libraries, were already revising their curricula.
The practitioners who survived and prospered were the ones who had already begun moving toward something harder. The significant technical work shifted to agent architecture: building systems in which several AI models delegate tasks to one another, recover from errors, manage memory across sessions, and choose the best model for each subtask. It is genuinely complicated. It demands an understanding of systems, costs, failure modes, and design tradeoffs. You don’t learn it over a weekend.
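A stripped-down sketch of that orchestration pattern, with the “models” stubbed as plain functions (the routing rule, function names, and escalation order are all assumptions made for illustration): route each subtask to the cheapest capable model, escalate on failure, and keep a shared memory of results.

```python
def cheap_model(task: str) -> str:
    """Stub for a small, inexpensive model with a narrow competence."""
    if "summarize" in task:
        return "short summary"
    raise ValueError("out of scope for the cheap model")


def strong_model(task: str) -> str:
    """Stub for a larger, costlier model that handles anything."""
    return f"detailed answer to: {task}"


def orchestrate(tasks: list[str]) -> dict[str, str]:
    """Try the cheap model first; escalate to the strong one on failure."""
    memory: dict[str, str] = {}  # shared state, standing in for cross-session memory
    for task in tasks:
        for model in (cheap_model, strong_model):  # ordered cheapest-first
            try:
                memory[task] = model(task)
                break  # subtask handled; stop escalating
            except ValueError:
                continue  # recover from the error by trying the next model
    return memory
```

Real agent frameworks add retries, budgets, and tool access on top, but the core tradeoff is visible even here: every routing decision is a cost-versus-capability bet, which is why the work rewards systems thinking rather than clever wording.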
The calculus has changed once more for people like Xiuda, a Chinese student from what he describes as a third-tier college, who used prompt skills to land an internship he never thought possible. The door that opened in 2022 is narrower now, and the room behind it looks different, but it hasn’t closed entirely. What matters these days are composite skills: domain knowledge, product thinking, and some coding proficiency. The somewhat exaggerated “anyone can do this” pitch is no longer even a useful approximation.
The full arc of prompt engineering, from improbable salaries to premature obituaries, says something about how the tech sector handles innovation. When a new skill gets incorporated into a job title before anyone fully understands what the job actually entails, a moment of overcorrection always follows. The work was real.
The field developed practical methods that actually enhanced the capabilities of earlier models. However, it turned out to be a remarkably brief window of time during which that was a stand-alone, specialized, well-paid career. By most honest accounts, it was about eighteen months.
The precocious child still requires supervision. But the guides are now expected to be engineers, too.
