Table of Contents
  • Fight magic with magic
  • How to easily optimize prompts in 10 seconds?

Fully Automating Prompt Engineering: LeCun Fell Silent, and ChatGPT Called In the Experts

Apr 08, 2023, 11:01 AM


In computing, a prompt originally meant the short string printed before user input: for example, `C:\>` in MS-DOS, `$` in Linux shells, and `>>>` in the Python REPL are all prompts. In 2023, prompts have become the most natural and intuitive way to interact with large language models (LLMs).

If ChatGPT is the gorgeous magic of the Harry Potter novels, then prompts are the spells used to summon it. How well the magic works depends on whether you recite the spell clearly or with an "accent": the same spell has different power depending on who chants it. There may be a thousand Hamlets for a thousand readers, but the Avada Kedavra of a thousand wizards is no match for Voldemort's incantation alone (though of course, no matter how well Voldemort chants, it never works on Harry).

So, how well you can use ChatGPT and other large language models depends largely on the quality of your prompts. And not only language models: for AI text-to-image models such as DALL·E and Stable Diffusion, wildly popular just a few months ago, prompts also have a huge impact on the style and quality of the generated art.


(Same burger, same Stable Diffusion 2.1 model. Even with "Trending on Artstation" added to the prompt, the burger on the left is still unappetizing. So the question is: can you guess what the prompt on the right was?)

But prompts inevitably inspire both love and hate. Those who love them see a fusion of technology and art; those who hate them see a stumbling block on the road to better machine learning and AI.

OpenAI CEO Sam Altman sees prompt engineering — the black art of programming in natural language — as a decidedly high-leverage skill. Across the internet and forums, many people collect, organize, and even sell prompts at high prices or post bounties for good ones. Many regard prompts as the source code of this era's AIGC, and online courses on the topic have begun to appear.

On the other side, deep learning giant Yann LeCun argues that prompt engineering exists only because LLMs lack an understanding of the real world. In his view, LLMs' need for prompts is a temporary state that merely shows how much room current LLMs still have to improve. As the technology advances, LLMs will soon understand the real world, and prompt engineering will lose its value.

That future is far off, but judging objectively from where LLMs stand today, prompts serve a real purpose. Just as interaction between people requires communication skills, you can think of prompts as communication skills for interacting with machines. Good prompts get better results from LLMs, just as articulate, well-spoken people in the real world often coordinate to get the job done faster.

Although by 2023 natural language has leapt into a unified medium of communication — between people, and between people and machines — talking to an LLM is still harder than talking to a person. First, LLMs cannot grasp nuance, tone, or context the way humans do, so prompts must be carefully designed to be unambiguous and easy for the model to understand. You can imagine chatting away at an LLM only to have it coldly reply, "Talk like a human being." Second, limited by their training corpora, LLMs have blind spots in language understanding: long logical expressions, elaboration, reversals, and even simple real-world reasoning and induction may not be perfectly understood or executed. And some of the "code words" that work on LLMs come straight from the training corpus — most famously GPT's "Let's think step by step" and "Below is my best shot" — and are rare in everyday human conversation. All of this has complicated prompt engineering and pushed it toward so-called "metaphysics".
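The "Let's think step by step" trick mentioned above is easy to demonstrate. Below is a minimal sketch — nothing more than plain string handling — of how this zero-shot chain-of-thought trigger is appended to a question before it is sent to an LLM; the sample question is purely illustrative:

```python
# Zero-shot chain-of-thought: append the trigger phrase to a bare question
# so the model is nudged into spelling out intermediate reasoning steps.

COT_TRIGGER = "Let's think step by step."

def with_cot(question: str) -> str:
    """Append the chain-of-thought trigger phrase to a bare question."""
    return f"{question.strip()}\n\n{COT_TRIGGER}"

prompt = with_cot("A juggler has 16 balls. Half are golf balls, and half "
                  "of the golf balls are blue. How many blue golf balls?")
print(prompt)
```

The resulting prompt is then sent to the model as-is; the trigger phrase alone is often enough to markedly improve answers on multi-step arithmetic and logic questions.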

For Chinese users whose native language is not English, prompts are also the biggest pain point keeping them from trying LLMs. Back in the summer of 2022, when Midjourney and Stable Diffusion were at their peak in the English-speaking market, the response from the Chinese community was muted. The reason: their prompts are mainly in English, and constructing them demands a large vocabulary and a deep stock of pop-culture references — extremely unfriendly to Chinese users who want to try new things. Part of why ChatGPT took off in the Chinese community is its good Chinese support, which greatly lowers the barrier for Chinese users. If Chinese, one of the most spoken languages in the world, is still hampered by prompts, imagine how hard it is for less widely spoken languages.

In short, prompt engineering exists for good reason. A good prompt really can achieve twice the result with half the effort. Good prompts help us understand the capabilities and boundaries of large language models, probe their potential, and put them to better use in production. The most famous example of this is in-context learning.
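In-context learning, in its simplest form, means prepending a few labeled examples ("shots") to the query so the model infers the task without any fine-tuning. A minimal sketch — the sentiment task and its labels here are toy assumptions, not a fixed format:

```python
# Few-shot in-context learning: format labeled (input, output) pairs,
# then leave the final "Output:" blank for the model to complete.

def build_few_shot_prompt(examples: list[tuple[str, str]], query: str) -> str:
    """Format (input, output) pairs followed by the new query."""
    shots = "\n".join(f"Input: {x}\nOutput: {y}" for x, y in examples)
    return f"{shots}\nInput: {query}\nOutput:"

prompt = build_few_shot_prompt(
    [("This movie was fantastic!", "positive"),
     ("Terrible service, never again.", "negative")],
    "The food was delicious.",
)
print(prompt)
```

The model sees the pattern in the examples and continues it, classifying the new input without any weight updates.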


Fight magic with magic

In practice, optimizing a prompt means trial and error. Iteration is extremely tedious and demands background knowledge. Which raises the question: in today's AI era, can prompts be generated automatically?

Among the replies to Yann LeCun's tweet criticizing prompts, we noticed this one: "Prompt engineering is like the description and definition of a problem in science; the same problem, described by different people, may be good or bad, easy or hard, solvable or unsolvable. So there is nothing wrong with prompt engineering existing — and prompt engineering itself can be automated." The same netizen also pointed to a product: PromptPerfect (promptperfect.jina.ai). In other words, this new paradigm of using algorithms to optimize prompts has already landed!


Experience link: https://promptperfect.jina.ai

The promptperfect.jina.ai mentioned in that reply fights magic with magic, letting AI guide AI. Enter a prompt and it outputs an optimized version, letting you preview the model's output before and after optimization — a virtuous upgrade from "garbage (prompt) in, garbage (content) out" to "good input, good output". According to the product's official documentation, it supports not only the currently popular ChatGPT but also GPT-3, Stable Diffusion, and DALL·E. Next, let's evaluate the skills of this "AI prompt engineer" — PromptPerfect.

How to easily optimize prompts in 10 seconds?

1. Turn colloquial requests into clear prompts

Optimizing a prompt requires understanding the structure of language and knowing which words in a sentence can "activate" the intelligence of an LLM. Without that background, prompts end up vague and colloquial, and the LLM is easily misled. PromptPerfect has learned deep linguistic knowledge from massive data, so it can produce more accurate, clear, and effective prompts — whatever your need or task, it tailors the most precise phrasing directly.


When facing GPT-3 or ChatGPT, limited communication skills can leave us stuck: vaguely worded questions or instructions seriously hurt the quality of the model's answers. We tried PromptPerfect on some common instructions, as shown below. It expands the context of the original blunt prompt "Please send me some money-making ideas" and outputs a polished prompt:


(Manual input: relying on luck)



(With PromptPerfect: relying on technique)

Compared with the original prompt, PromptPerfect's version defines clear goals, specifies the output format, and adds scenario-setting context, making the ideas ChatGPT generates far more actionable — the improvement is visible to the naked eye.

2. Easily master the "dialects" of different LLMs

Different LLMs have different temperaments and habits. To communicate with them effectively, you need to learn the local dialect; otherwise the conversation goes nowhere. It's like finally mastering the Stable Diffusion spellbook only to discover that ChatGPT speaks an entirely different language, and you have to start over. PromptPerfect spares users the cost of learning each model separately — whether it's ChatGPT, GPT-3, Stable Diffusion, or DALL·E, just select the model and it optimizes the most appropriate prompt with one click.


3. One-click conversion of Chinese prompts into polished English prompts

Unlike the single-modality ChatGPT, AI painting demands well-written English. Even with a pile of prompt words at hand, you may still be frustrated by a limited vocabulary, by not knowing how to describe a scene, or by simply never finding the right words. PromptPerfect can turn a prompt you think up in Chinese directly into an English prompt, making it easier to use and more effective — no more cramming English adjectives; Chinese users can use it with ease. Sometimes when we use DALL·E or Stable Diffusion, the results are poor, perhaps because our English isn't good enough or our imagination isn't rich enough to pin down a concrete image or scene — so the generated pictures come out blurry or weird.

We tried PromptPerfect on some common commands. In the image below, it turns the simple, slightly boring "Impressionist Beijing street scene" into a richly descriptive sentence in excellent English!


(Before optimization, the prompt completely fails to convey the Impressionist style, the headphones, or the futuristic feel)

Tests on AI-painting prompts are even clearer. PromptPerfect generates a lengthy but extremely precise "spell" that directly elevates the aesthetics, imagination, and texture of the original prompt, making the image more vivid and more faithful to our original intent.

4. An API developers can call directly

If you want to optimize prompts in bulk, or integrate optimization directly into an existing system, you can call PromptPerfect's API to generate batches of high-quality prompts quickly. No matter how many prompts you need, PromptPerfect can complete them for you fast and serve them up at their best.
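As a rough sketch of batch optimization over such an API — note that the endpoint URL, auth header, and JSON schema below are assumptions for illustration only; consult PromptPerfect's official API documentation for the real interface:

```python
# Hedged sketch of bulk prompt optimization over an HTTP API.
# The URL, "x-api-key" header, and payload fields are HYPOTHETICAL.
import json
from urllib import request

API_URL = "https://api.promptperfect.example/optimize"  # hypothetical endpoint

def build_payload(prompt: str, target_model: str = "chatgpt") -> dict:
    """Assemble one optimization request (schema is assumed)."""
    return {"prompt": prompt, "targetModel": target_model}

def optimize_batch(prompts, api_key: str):
    """POST each prompt to the (hypothetical) API and collect JSON results."""
    results = []
    for p in prompts:
        req = request.Request(
            API_URL,
            data=json.dumps(build_payload(p)).encode("utf-8"),
            headers={"Content-Type": "application/json",
                     "x-api-key": api_key},  # assumed auth header
        )
        with request.urlopen(req) as resp:  # network call
            results.append(json.load(resp))
    return results
```

Separating the pure payload builder from the network call keeps the request format easy to test and swap once the real schema is known.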


The technology and team behind it

We noticed that since its launch on February 28, PromptPerfect has drawn a lot of attention — everyone wants to use it to optimize prompts for all kinds of scenarios. Within just a few days it attracted thousands of users who optimized nearly 10,000 prompts, earning rave reviews across platforms. After all, with the prompts it generates, large models produce output that is both creative and beautiful.

"The Most Beautiful Prompt Word" uses two advanced machine learning techniques to find the best prompt words for various language models: Reinforcement learning and contextual learning . Reinforcement learning is its coach, constantly instilling knowledge and experience into it, making it become more and more powerful. It first uses some manually screened prompt words to lay the foundation for a pre-trained model, and then adjusts the prompt word network strategy based on user input and model output. For example, when we want to optimize the prompt words of DALL・E and Stable Diffusion, we want the generated content to be both relevant and aesthetic, just like a coach asking athletes to perform extremely well in all aspects.

In-context learning is its teacher, instructing it through many examples. But instead of stacking all the examples together, it splits them into several groups and lets the language model encode each group on its own. This way, PromptPerfect can teach the model with more examples and generate more accurate, effective prompts. With these two tricks combined, it can optimize prompts for various language models, markedly improving efficiency and accuracy — like a top athlete trained by both coach and teacher.
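The grouping idea described above can be sketched as a simple chunking step: split a large example pool into fixed-size groups, each small enough to encode separately. The group size and plain-list format are illustrative assumptions, not PromptPerfect's actual implementation:

```python
# Grouped in-context learning, sketched: rather than stuffing every example
# into one prompt, split them into consecutive fixed-size groups, each small
# enough to fit the model's context window and be encoded on its own.

def chunk_examples(examples: list, group_size: int) -> list[list]:
    """Split examples into consecutive groups of at most group_size."""
    return [examples[i:i + group_size]
            for i in range(0, len(examples), group_size)]

groups = chunk_examples(list(range(7)), 3)
print(groups)  # → [[0, 1, 2], [3, 4, 5], [6]]
```

Each group would then be encoded or fed to the model separately, so the total number of teaching examples is no longer capped by a single context window.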

Today's large generative models — whether language models or multi-modal generative models — are all grounded in language, and in the future we will surely see more multi-modal generative models emerge. The R&D team behind PromptPerfect turns out to be Jina AI, an emerging technology company focused on multi-modal AI. Founded in 2020 and headquartered in Berlin, Germany, with R&D offices in Beijing and Shenzhen, Jina AI focuses on multi-modal AI technology, widely applied in search and generation. Jina AI has released a series of open-source projects that together have earned nearly 40,000 stars on GitHub from developers worldwide, making it easy for developers to build multi-modal AI applications quickly:

  • Multi-modal MLOps framework Jina: https://github.com/jina-ai/jina
  • DocArray, a data structure designed for multi-modal data: https://github.com/docarray/docarray
  • CLIP-as-service: https://github.com/jina-ai/clip-as-service


In an era when generative AI is breaking through modality barriers like a tsunami, PromptPerfect directly boosts the productivity of large models, bringing significant efficiency gains. We also noticed that Jina AI has built Rationale (rationale.jina.ai), an AI decision-making tool based on ChatGPT. Enter one or more decisions you are weighing, and within 10 seconds Rationale generates a tailored decision-evaluation report. It fits consulting, evaluation, research, planning, reporting, and similar scenarios. As an AI decision-making tool with "critical thinking", Rationale broadens your thinking, refines your views, and supports rational decisions by listing the pros and cons of different options, generating SWOT reports, and performing multi-criteria or cause-and-effect analysis. 2023 could be a game-changing year for startups.


Experience link: https://rationale.jina.ai

With the ChatGPT API now open, consumer AI applications in 2023 are exploding like the Internet era of 2000: hundreds of ChatGPT API applications launch every day, spreading across fields, breaking existing rules, and upending established ecosystems. Some traditional giants face challenges, some traditional barriers are falling, and some traditional industries face reinvention. For us, gaining a foothold in the new AI era means standing on the shoulders of giants, reciting perfect spells, and using magic to solve all kinds of generation tasks. After all, a perfect prompt is the soul of a ChatGPT application.

