Introduction to AI Terminology (Part 2): Prompt, Agent, Langchain, the Next Stops on the AI Map

Written by
Jasper Cole
Updated on: June 13, 2025
Recommendation

In-depth exploration of AI terminology, mastering key concepts of product building and project development.

Core content:
1. The deep meaning and writing skills of Prompt
2. Agent role positioning and structural analysis
3. The core functions and application scenarios of the Langchain framework

Yang Fangxian
Founder of 53A/Most Valuable Expert of Tencent Cloud (TVP)

The previous article (Part 1) started with the basics, answering "what is AI?" and sorting out core terms such as AI, LLM, large models, and multimodality.

In this article, we go deeper and work through the common but confusing terms of product building, agent composition, and generative AI projects.

This article doesn't just explain "what they are"; it also answers the questions you have probably asked about these terms:

Why do some people's prompts work well while others' fail?

What are Langchain and LlamaIndex? Can you build products with them?

What is MCP? How does it relate to RAG and Agent?



Prompt: Why is it difficult to use? How can we write it well?


What is Prompt?




A prompt is the "question" or "instruction" you give to a large model. It is not necessarily a single sentence: it may be a long structured text, and it may embed function calls, format templates, and combinations of instructions.

Beginners most often run into two questions:

  1. How do I write a prompt the model will understand?

  2. How big is the difference between a well-written and a badly written prompt?

That's why we hear the term prompt engineering, which emphasizes:

Writing a prompt is not like writing an essay; it is designing input, and it should work with the model's context logic.

High-quality prompts usually have a clear structure, role settings, and explicit input and output boundaries.


For example:

You are a senior child psychologist. Please help me design an interactive game script with the theme of "courage" suitable for 6-year-old children. It should include 3 levels, each lasting 1 minute, and the characters are: fox and rabbit.

This prompt is much clearer than "Help me design a children's script."

The structure, objectives, context, and output content boundaries are all clearly stated.
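The idea of "designing input" can be made concrete with a small sketch in plain Python (no framework; the template fields and helper name are illustrative, not from any library): role, task, constraints, and output format each get an explicit slot, so the structure the article describes is visible in code.

```python
# A minimal sketch of a structured prompt: role, task, constraints, and
# output format are separate, explicit fields. All names are illustrative.
PROMPT_TEMPLATE = """You are a {role}.
Task: {task}
Constraints:
- {constraints}
Output format: {output_format}"""

def build_prompt(role, task, constraints, output_format):
    """Fill the template so structure, goal, and boundaries are explicit."""
    return PROMPT_TEMPLATE.format(
        role=role,
        task=task,
        constraints="\n- ".join(constraints),
        output_format=output_format,
    )

prompt = build_prompt(
    role="senior child psychologist",
    task='Design an interactive game script on the theme of "courage" for 6-year-olds.',
    constraints=["3 levels, each lasting 1 minute", "characters: a fox and a rabbit"],
    output_format="numbered list of levels",
)
print(prompt)
```

Keeping the fields separate is what makes prompts reusable: swap the role or constraints and the rest of the structure stays intact.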




How to learn the Prompt technique?




Prompt Library Tools:

You can browse popular prompt templates on sites such as FlowGPT, PromptBase, and PromptHero, and you can also collect prompts written by others and reuse them for reference.


Prompt Management Tools: 

Frameworks such as Langchain, LlamaIndex, and PromptLayer also provide prompt template components that can be centrally managed and reused.




Agent: Evolving from Chatbot to Automated Executor



What is Agent?




An Agent is not the large model itself, but a "dispatcher" + "executor" system that can:

Decide autonomously: determine what to do next based on your goal

Call tools: for example, search web pages, call functions, read and write files

Complete tasks in steps: for example, "Book a hotel for me and send me an email."




What is the structure of Agent?




User input → Thought → Tool use → Observe results → Make a decision → Output answer


Compared with an ordinary chatbot, it does not just "answer questions"; it "completes tasks".
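The loop above (input → thought → tool use → observe → decide → answer) can be sketched in a few lines of plain Python. The two tools and the fixed plan are toy stand-ins: a real agent would use an LLM to choose the next step, but the control flow is the same.

```python
# A toy agent loop: execute tool steps toward a goal, collecting
# observations along the way. Tools and plan are illustrative stand-ins.
def search_web(query):
    return f"results for: {query}"          # toy tool

def send_email(body):
    return f"email sent: {body}"            # toy tool

TOOLS = {"search": search_web, "email": send_email}

def run_agent(goal, plan):
    """plan: list of (tool_name, argument) steps. A real agent would decide
    these on the fly; here they are fixed so the control flow is visible."""
    observations = []
    for tool_name, arg in plan:             # thought -> tool use
        result = TOOLS[tool_name](arg)      # execute the tool
        observations.append(result)         # observe the result
    return f"Goal '{goal}' done. Steps: {observations}"  # final answer

answer = run_agent(
    goal="book a hotel and email me",
    plan=[("search", "hotels in Paris"), ("email", "booking confirmed")],
)
print(answer)
```

The key difference from a chatbot is that the output of each tool call feeds back into the loop as an observation before the final answer is produced.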

Common Agent frameworks include, for example, LangChain Agents, AutoGPT, and CrewAI.

Langchain and LlamaIndex: Not Large Models, but Assembly Frameworks



What is Langchain?




Langchain is not a model, but a Python framework for "assembling models and tools".

Its core goal:

Let you snap LLM + Prompt + Agent + Memory + tools together like building blocks to build an "AI application that can execute tasks".




What can Langchain do?




Chain multiple steps, such as "extract text → search for information → generate content"

Manage multiple agents and multiple model calls

Implement "memory" functions, such as context management and user preference memory
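The first capability, chaining steps, can be illustrated without the LangChain library itself: each step's output becomes the next step's input, exactly as in "extract text → search for information → generate content". The three step functions here are illustrative stand-ins, not real LLM calls.

```python
# A minimal "chain": pipe each step's output into the next step.
# The step functions are toy stand-ins for LLM and tool calls.
def extract_text(doc):
    return doc.strip().lower()

def search_info(query):
    return {"query": query, "facts": ["fact A", "fact B"]}

def generate_content(context):
    return f"Answer about '{context['query']}' using {len(context['facts'])} facts"

def run_chain(steps, user_input):
    """Run steps in order, feeding each output into the next step."""
    value = user_input
    for step in steps:
        value = step(value)
    return value

result = run_chain([extract_text, search_info, generate_content], "  Courage Games  ")
print(result)  # -> Answer about 'courage games' using 2 facts
```

Frameworks like Langchain add value on top of this pattern: prompt templates, retries, streaming, and memory between steps, rather than the piping itself.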




What is LlamaIndex?




Note: LLAMA here is not Meta's model LLaMA, but a similarly spelled tool, LlamaIndex (formerly GPT Index). It is an open-source framework designed to connect large language models (LLMs) with external data. Through structured indexing and efficient retrieval, it addresses the context-length limits and hallucination problems LLMs face when working with enterprise-level knowledge bases. It is suited for:

  • Private knowledge base construction

  • Question answering system based on files/databases/web pages

  • Use together with RAG (Retrieval-Augmented Generation)
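The retrieval idea behind LlamaIndex and RAG can be sketched in plain Python (this is not the LlamaIndex API; word overlap stands in for real embedding-based similarity search): index the documents, retrieve the most relevant one for a question, and hand only that snippet to the model as context.

```python
# A toy retrieval step: pick the document most relevant to the question,
# then build the prompt a real system would send to an LLM.
DOCS = [
    "Refund policy: refunds are issued within 14 days of purchase.",
    "Shipping: orders ship within 2 business days.",
]

def score(question, doc):
    """Toy relevance score: number of shared lowercase words.
    Real systems use embedding similarity instead."""
    return len(set(question.lower().split()) & set(doc.lower().split()))

def retrieve(question, docs):
    return max(docs, key=lambda d: score(question, d))

def answer_with_context(question, docs):
    context = retrieve(question, docs)      # retrieval step
    # A real system would now call an LLM with this prompt.
    return f"Context: {context}\nQuestion: {question}"

rag_prompt = answer_with_context("How many days for a refund?", DOCS)
print(rag_prompt)
```

Because only the retrieved snippet enters the context, the model stays within its token limit and is grounded in your data rather than guessing, which is exactly the hallucination problem RAG targets.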




Comparison with LangChain: Why choose LlamaIndex?





Suggestion: if you need to deeply optimize retrieval quality (for example, in finance or medicine), choose LlamaIndex; if you need to quickly build a general-purpose agent, choose LangChain.




Token: A key unit that affects cost, length, and effect



What is Token?




Token is the basic unit of text processing by the model.

It is not a "character" or a "word", but a "fragment that the model can recognize".

For example:

"ChatGPT" is typically split into 2 tokens

The Chinese greeting "你好" ("hello") typically counts as 1 token





Why is it important?




Cost is billed by token: for example, GPT-4 charges $0.03 per 1,000 input tokens, and DeepSeek-V3 charges 4.8 RMB per 1 million input tokens.

A prompt cannot be arbitrarily long; if it exceeds the token limit, it will be truncated.

Training, inference, and storage costs all scale with the number of tokens.
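A quick back-of-the-envelope check using the prices quoted above makes the cost model concrete (input tokens only; real billing also charges for output tokens, and the 10,000-token prompt size is an arbitrary example):

```python
# Cost of a 10,000-token input at the per-unit prices quoted in the text:
# GPT-4: $0.03 per 1,000 input tokens
# DeepSeek-V3: 4.8 RMB per 1,000,000 input tokens
def input_cost(tokens, price_per_unit, unit_size):
    return tokens * price_per_unit / unit_size

tokens = 10_000  # e.g. a long structured prompt plus retrieved context

gpt4_usd = input_cost(tokens, 0.03, 1_000)
deepseek_rmb = input_cost(tokens, 4.8, 1_000_000)

print(f"GPT-4: ${gpt4_usd:.2f}")            # -> GPT-4: $0.30
print(f"DeepSeek-V3: {deepseek_rmb:.3f} RMB")  # -> DeepSeek-V3: 0.048 RMB
```

The same arithmetic explains why trimming prompts and retrieved context (as RAG does) directly reduces cost per call.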




MCP: Understanding "Multi-Turn Context"



What is MCP?




MCP (Model Context Protocol) is a class of protocols or components used to manage a large model's "context state".

What it does is:

  • Defines how the model remembers what you said last time (dialogue state)

  • Ensure consistency of instructions, identities, and memories across multiple rounds of question-answering

  • Control the position and priority of content such as "System Prompt", "Character Settings", and "History Dialogue" in the context
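What a context-management layer does can be sketched in plain Python (this is an illustration of the idea, not any specific protocol's API; counting words stands in for a real tokenizer, and the message format is illustrative): keep the system prompt pinned, append the history and new input, and trim the oldest turns when the token budget would be exceeded.

```python
# Toy context assembly: the system prompt is never dropped; the oldest
# history turns are trimmed first when the token budget is exceeded.
def build_context(system_prompt, history, user_input, max_tokens=50):
    system = {"role": "system", "content": system_prompt}
    turns = history + [{"role": "user", "content": user_input}]
    # Word count stands in for a real tokenizer here.
    while sum(len(m["content"].split()) for m in [system] + turns) > max_tokens:
        turns.pop(0)  # drop the oldest turn first
    return [system] + turns

history = [
    {"role": "user", "content": "word " * 60},   # one very long old turn
    {"role": "assistant", "content": "short reply"},
]
ctx = build_context("You are a patient teacher.", history, "Next question?")
print([m["role"] for m in ctx])  # the long old user turn has been trimmed
```

This also explains the failure modes listed below: when trimming or ordering goes wrong, the model "forgets" earlier turns or a later instruction silently displaces an earlier one.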




Why is it important?




You will find:

  • Large models occasionally "can't remember what you said"

  • The persona becomes inconsistent (an expert in the last round turns into a student in this one)

  • A prompt does not take effect because it is overwritten by other context


These are often context-management (MCP) configuration issues. Such context control is commonly integrated into frameworks like Langchain and LangGraph, and into OpenAI function calling, to help you finely control the model context.




✅ To summarize: How do these terms relate to each other?


We string the core terms together to form an "AI term map" like this: