Intelligence is compression. But the key to prompting ability is the courage to be verbose.

This article analyzes intelligence as compression and the skill of prompting, and proposes a new paradigm for communication in the AI era.
Core content:
1. Why prompting skill matters, and its value in the AI era
2. The concept of intelligence as compression, and how it appears in human thinking and language
3. A comparison of clear, Feynman-style prompts with extremely compressed prompts
There is no doubt that prompting is a meta-skill of the AI era. It is already crucial and will only grow more important.
But what makes a prompt good? And how do you train your prompting ability? Opinions differ completely.
I have observed a cult of "unclear but powerful prompts". These prompts are usually extremely condensed, to the point that even humans cannot read them, but they look impressive, so they attract many followers.
This is in stark contrast to my experience and thinking from two and a half years of using ChatGPT. The conflict is so sharp and significant that I would like to share my (possibly wrong) thinking with you to start a discussion.
Intelligence is compression
A defining feature of this type of prompt is extreme compression, so let's talk about compression first.
Compression is a sign of intelligence; there is no doubt about that.
Compression is intelligence: better compression means deeper understanding.
Intelligence is compression: better prediction, in particular better next-token prediction, presupposes better compression.
To sum up in one sentence: the essence of intelligence is compression and prediction. Intelligence is, at bottom, an efficient compression-and-prediction system. Our thinking, memory, and learning are all ways of compressing information and predicting the future.
The LLM paradigm of artificial intelligence compresses human knowledge into a model with up to trillions of parameters, then follows your instructions and prompts to make high-quality predictions.
And LLM intelligence is not the only kind that is compressed; isn't human intelligence also a form of compression? Language is a compression algorithm for human knowledge: every language, at its core, is a conceptual compression of information. From a neuroscience perspective, learning in the human brain is essentially a process of information compression. Even the core of writing is compressing and distilling complex ideas, which is consistent with the compressive nature of intelligence.
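The compression/prediction link above can be made concrete with a toy character model. This is a minimal sketch, and the function name, the sample text, and the probabilities are all invented for illustration: an ideal coder spends -log2 p(c) bits on each character, so the model that predicts the data better also compresses it into fewer bits.

```python
import math

def code_length_bits(probs, text):
    """Total Shannon code length (in bits) a character model assigns to a text.
    An ideal arithmetic coder spends -log2 p(c) bits per character c."""
    return sum(-math.log2(probs[c]) for c in text)

text = "aaab"

# A uniform model over {a, b}: it has learned nothing about the data.
uniform = {"a": 0.5, "b": 0.5}

# A model that has "learned" that 'a' is three times as common as 'b'.
learned = {"a": 0.75, "b": 0.25}

print(code_length_bits(uniform, text))  # 4.0 bits
print(code_length_bits(learned, text))  # about 3.25 bits: better prediction, better compression
```

The better predictor needs fewer bits for the same text, which is the sense in which compression and prediction are two faces of the same ability.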
Do prompts need to be compressed?
Intelligence is compression, but in my view there is no need to pursue extreme condensation when writing prompts. It is even fine to be a little wordy; what you should strive for, as much as possible, is clarity.
This is also the principle I established when practicing the Feynman technique: clarity (of expression and of thought) is the first principle.
The key to prompting is Feynman, and the first criterion for judging a Feynman explanation is clarity.
Accuracy? Comprehensiveness? Whether I am being long-winded? None of that matters nearly as much.
After two and a half years of using ChatGPT, I have become more and more certain: when chatting with ChatGPT, we really should not pursue overly compressed expressions or chase profound-looking tricks. It is unnecessary, and it raises the friction and psychological barriers that keep ordinary people from using ChatGPT.
In fact, using ChatGPT should be the most barrier-free way of learning and thinking: all you need is a mouth, and your brain can slowly catch up.
The study and pursuit of ultimately compressed prompts has led many people to waste time and energy that should have gone into actually using ChatGPT.
The way of prompting is the way of Feynman
The difference between the extremely compressed, "unclear but powerful" prompt and the clear, Feynman-style, plain-language prompt is not a difference of technique but a difference of underlying concept.
The underlying questions:
What makes a prompt good?
Why is a clear Feynman-style prompt better than an extremely compressed one? What is the reason?
First, context: communication between people, and between people and an LLM, should include not only the instruction itself but also the context needed to understand it. Extremely compressed prompts almost inevitably drop context, leaving the LLM to guess and easily go astray. Feynman-style prompts, by contrast, fully state the background, goals, and constraints, making the LLM's reasoning path shorter and more stable.
Second, cognitive load: an extremely compressed prompt forces the writer to labor over word counts and to design a content structure (even a private mini-language) that humans cannot understand, while the reader (including the LLM) must repeatedly decompress it and fill in the implicit information. A Feynman-style prompt, though a bit verbose, makes the implicit explicit, saving brainpower on both sides.
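To make the contrast concrete, here is a hypothetical pair of prompts for the same invented task. Neither prompt comes from the article; they only illustrate the two styles and how the Feynman-style version carries background, goal, and constraints that the compressed version forces the model to guess.

```python
# Hypothetical example: the same request written in the two styles.
# The task, wording, and details are invented for illustration.

compressed_prompt = "Q3 rev rpt, exec, bullets, <300w, CN mkt focus"

feynman_prompt = """I am preparing a report for our executive team.
Please summarize our Q3 revenue results as bullet points, under 300 words.
Background: most of this quarter's growth came from the Chinese market,
so please emphasize that. Constraint: no jargon; the audience is non-technical."""

# The compressed version drops the context (who the audience is, why the
# Chinese market matters, what "focus" means), so the model must guess.
# The Feynman version states background, goal, and constraints explicitly.
print(len(compressed_prompt), len(feynman_prompt))
```

The clear version costs a few more tokens, but it removes the guessing step entirely, which is exactly the trade the article argues for.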
In fact, Chinese speakers should find it easiest to judge whether compressed prompts are good. Classical Chinese is a textbook example of an extremely compressed writing style. Is classical Chinese easy to understand? If you had to switch from today's vernacular back to speaking in classical Chinese, would you feel delighted or driven crazy?