Tongyi Lingma 2.0, the AI Programmer, was launched in January 2025 and currently serves more than one million developers. Its intelligent programming capabilities have now been integrated with ModelScope, Alibaba Cloud's AI model development platform, making it the second development tool connected to the platform after Function Compute (FC). This collaboration brings Tongyi Lingma's code generation capabilities into the AI model development workflow and provides development support for algorithm engineers.
In ModelScope's Notebook online development environment, developers can now directly enable Tongyi Lingma's intelligent Q&A and AI Programmer capabilities and experience the core functions without any additional download, installation, or configuration.
Full process support for intelligent model development
When building model training scripts and data processing pipelines, Tongyi Lingma can generate code snippets that follow best practices based on the context of frameworks such as TensorFlow and PyTorch. Whether you are defining layer structures, configuring loss functions, or writing distributed training logic, you get accurate code suggestions.
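As an illustration, the kind of training scaffold described above might look like the following. This is a minimal hand-written sketch, not actual tool output; the layer sizes and hyperparameters are arbitrary:

```python
import torch
from torch import nn

class MLPClassifier(nn.Module):
    """A small classifier: layer structure, as one example of what the tool generates."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

model = MLPClassifier()
criterion = nn.CrossEntropyLoss()                          # loss function configuration
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a random batch.
x = torch.randn(8, 784)
y = torch.randint(0, 10, (8,))
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```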
It also supports intelligent code conversion between mainstream deep learning frameworks (such as PyTorch to TensorFlow). By describing the requirement in natural language, framework adaptation can be completed automatically, significantly reducing the cost of migrating code between frameworks.
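One concrete detail such a conversion must handle is weight layout: PyTorch's `nn.Linear` stores its weight as `(out_features, in_features)`, while a Keras `Dense` kernel has shape `(in_features, out_features)`. A hand-written sketch of the transpose involved (for illustration only; it verifies the equivalence numerically rather than using TensorFlow):

```python
import numpy as np
import torch

torch_layer = torch.nn.Linear(3, 2)

# A converter porting this layer to Keras must transpose the weight matrix:
# PyTorch weight is (out, in); a Keras Dense kernel is (in, out).
kernel = torch_layer.weight.detach().numpy().T
bias = torch_layer.bias.detach().numpy()

x = np.random.randn(5, 3).astype(np.float32)
torch_out = torch_layer(torch.from_numpy(x)).detach().numpy()
keras_style_out = x @ kernel + bias  # what Dense(kernel, bias) would compute
```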
It analyzes key parts of the model code in real time, such as computational graph construction and memory allocation, and offers professional suggestions such as GPU utilization optimization and mixed-precision training, helping developers get the most out of their hardware.
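For instance, a common mixed-precision suggestion is PyTorch's autocast plus gradient scaling. A minimal sketch (hand-written, not tool output; it falls back to full precision when no GPU is available):

```python
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"  # autocast to fp16 only makes sense on GPU here

model = nn.Linear(128, 10).to(device)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)  # no-op when disabled

x = torch.randn(32, 128, device=device)
y = torch.randint(0, 10, (32,), device=device)

# Forward pass runs in reduced precision under autocast (on GPU).
with torch.autocast(device_type=device, enabled=use_amp):
    loss = nn.functional.cross_entropy(model(x), y)

# The scaler prevents fp16 gradient underflow; with AMP off it passes through.
scaler.scale(loss).backward()
scaler.step(opt)
scaler.update()
```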
Based on the model type and dataset characteristics, it intelligently recommends initial hyperparameter combinations and suggests dynamic tuning strategies to accelerate model convergence.
It automatically generates unit test cases for models, covering key scenarios such as forward-propagation verification and gradient checking, to ensure the reliability of model code.
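The gradient-checking scenario mentioned above can be covered with `torch.autograd.gradcheck`, which compares analytical gradients against finite differences. A minimal hand-written example (the function below is a stand-in for a real model layer):

```python
import torch

def forward(x, w):
    # A tiny differentiable computation standing in for a model layer.
    return torch.tanh(x @ w).sum()

# gradcheck needs double precision for its finite-difference comparison.
x = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
w = torch.randn(3, 2, dtype=torch.double, requires_grad=True)

# Returns True if analytical and numerical gradients agree; raises otherwise.
ok = torch.autograd.gradcheck(forward, (x, w), eps=1e-6, atol=1e-4)
```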
It monitors common problems during training, such as OOM errors and gradient explosions, in real time. Given an error screenshot and the code, Tongyi Lingma can quickly identify the cause and provide repair suggestions.
Experience intelligent AI model development
On the ModelScope platform, developers can quickly start using the intelligent coding assistant as follows.
The ModelScope community Notebook, a cloud development IDE with a built-in VS Code programming environment and the Tongyi Lingma tool, lets you build AI applications in one stop. No complex environment configuration or coding background is required: as long as you are proficient with AI tools, you can build and run an AI application.
Linux, Windows, or macOS? ❓
VS Code or JetBrains? ❓
Copilot or Cursor? ❓
Notebook IDE + Tongyi Lingma ✔️
ModelScope Notebook is a cloud-based machine learning development IDE that provides a remote Linux programming environment suitable for AI developers of all levels. Notebook also provides a free computing power quota that can be used with one click.
For details on the environment, see:
https://modelscope.cn/docs/notebooks/intro
https://www.modelscope.cn/home
2. Introduction to Tongyi Lingma
Tongyi Lingma can modify code across multiple files and use tools, collaborating with developers on coding tasks such as code generation, requirement implementation, problem solving, unit test case generation, and batch code modification.
The complete Tongyi Lingma is built into the Notebook; we only need to log in to our account to use Lingma for code development.
First, log in to Tongyi Lingma: click the "Tongyi Lingma" logo on the left and select a login method.
Code Generation:
First, select the AI Programmer function (it supports file creation and multi-file modification).
Enter your requirements and let Tongyi Lingma create files and generate code.
After the code is generated, you can choose to accept it, or reject it and adjust the prompt.
After accepting, you can select the files that need to be modified as context.
After Tongyi Lingma modifies the code, you can again choose to accept or reject it. (Note: if the AI-generated content does not meet your expectations, or your requirements have changed, you can use the snapshot function to roll back to a previous conversation round and its code changes, then ask again.)
Write a small Sokoban (box-pushing) game: create the relevant code files and generate the corresponding code.
Code writing:
Unit Testing
Code Optimization
3. Case demonstration: building an app with Notebook + Tongyi Lingma
Now we use the Notebook environment and Tongyi Lingma to build a "Job Hunting Assistant" from scratch.
Generate code skeleton
Implement a job search assistant webpage based on Gradio that can help users optimize resumes, recommend positions, and recommend positions based on a resume. The detailed requirements are as follows:

1. Global configuration: large model loading and calling. The code is as follows.

Model loading part:

```python
qwen_model = ModelFactory.create(
    model_platform=ModelPlatformType.OPENAI_COMPATIBLE_MODEL,
    model_type="Qwen/Qwen2.5-72B-Instruct",
    api_key="Your API",  # Get the ModelScope API key from the environment variable, i.e. the personal SDK token
    url="https://api-inference.modelscope.cn/v1",
    model_config_dict=QwenConfig(temperature=0.2).as_dict(),
)
agent = ChatAgent(
    system_message="You're a helpful assistant for job analysis, please answer in Chinese.",
    message_window_size=30,
    model=qwen_model,
)
```

Model calling part:

```python
assistant_response = agent.step(combined_content)
assistant_response_str = assistant_response.msgs[0].content
```

3. Auxiliary functions. URL search: open a txt file that stores URL information (create this file for me), then let the large model analyze the target position and location to obtain the most suitable URL. Crawl information.

4. Specific functions. Job recommendation: call the large model to recommend three suitable positions based on the target position, location, and the input resume content, giving reasons for each recommendation. Resume analysis: first crawl the relevant information through the crawling function, then call the large model to recommend the three most suitable positions based on the user's resume content, explaining the reasons. Resume optimization: take the resume file and the target position as input, call the large model to optimize the resume content for the target position, generate an optimized resume file, and provide improvement suggestions.

Create the corresponding files according to the above requirements and fill in an approximate code framework.
For how to submit requests to Tongyi Lingma and more detailed procedures, please refer to:
https://mp.weixin.qq.com/s/dNcZ-_WhHtWCd5TuMfMxfw
Get your personal SDK token: https://modelscope.cn/my/myaccesstoken (note: pay attention to its expiration date)
After Tongyi Lingma generates files and code, check whether the code is reasonable. For example, when generating the code for the first time, because the description "large model loading and calling" is too general and this part can be configured in very complex ways, Tongyi Lingma wrote it in an overly complex way. So we can give Tongyi Lingma the prepared interface code directly and let it generate code based on that existing code.
The preliminary code framework is shown below (you can see that the code generated by Tongyi Lingma is structurally rigorous and well suited for further development):
Completing the functions
Improve the web page crawling and the various functional components to ensure they run correctly.
For example, to use the Crawl4AI tool, we tell Tongyi Lingma to replace the information-crawling function with our interface function, describe the specific adjustments needed (such as writing the crawled results to a specified file), and have it modify the corresponding calling code in the other files. Also, in the search_urls function we find the comment "Use a specific model for analysis," which means the model we loaded in the global configuration (model.py) has not been used yet, so we tell Tongyi Lingma that the model loaded in model.py should be called here.
In utils.py, change the information-crawling function to the following function and add this logic: the URL to crawl is obtained through search_urls, and the crawled information is written to a txt file. At the same time, modify the calls in the other files.

```python
async def bug_search(target_position, target_location):
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(
            url="https://www.nbcnews.com/business",
        )
        print(result.markdown)
```
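Put together, the refined logic described above might look like the following sketch. The crawler and the model call are stubbed out here for illustration: in the real project, fetch_markdown would use Crawl4AI's AsyncWebCrawler, and search_urls would call the model loaded in model.py:

```python
import asyncio

async def fetch_markdown(url):
    # Stub for AsyncWebCrawler().arun(url=...).markdown;
    # replace with the real Crawl4AI call in utils.py.
    return f"# Crawled content from {url}\n"

async def search_urls(target_position, target_location):
    # Stub: in the real code this asks the model in model.py to pick
    # the best-matching URL from the stored txt file.
    return "https://example.com/jobs"

async def bug_search(target_position, target_location, out_path="crawled.txt"):
    # Obtain the URL via search_urls, crawl it, and persist the result.
    url = await search_urls(target_position, target_location)
    markdown = await fetch_markdown(url)
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(markdown)
    return out_path

path = asyncio.run(bug_search("Algorithm Engineer", "Hangzhou"))
saved = open(path, encoding="utf-8").read()
```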
Call the model from model.py in the search_urls function to analyze and return the results.
Likewise, we can refine each function until it runs successfully.
Adding features
After the basic functions are implemented, we can add more features, which shows off the power of multi-file modification.
For example, we want to add target cities and fixed target positions to the job recommendation so that we can gather more relevant information when crawling.
In the position recommendation module, turn the target position and location inputs into a two-column layout with drop-down menus: users can select a position and city from the drop-downs or type them in directly. Define some globally common cities and positions (as many as possible) as selectable options in utils.py.
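The selectable options could live in utils.py as plain lists, which the Gradio UI would then pass to two `gr.Dropdown` components (with `allow_custom_value=True`, placed inside a `gr.Row` for the two-column layout) so users can either pick or type freely. A hand-written sketch of the utils side; the specific cities, positions, and the helper name are illustrative:

```python
# utils.py (sketch): globally common cities and positions for the drop-downs.
COMMON_CITIES = [
    "Beijing", "Shanghai", "Hangzhou", "Shenzhen",
    "Guangzhou", "Chengdu", "Wuhan", "Nanjing",
]
COMMON_POSITIONS = [
    "Algorithm Engineer", "Data Analyst", "Backend Developer",
    "Frontend Developer", "Product Manager", "Test Engineer",
]

def resolve_choice(value, choices, default=None):
    """Accept either a drop-down selection or free-typed text; fall back to a default."""
    value = (value or "").strip()
    if value:
        return value
    return default if default is not None else choices[0]

city = resolve_choice("  Hangzhou ", COMMON_CITIES)
position = resolve_choice("", COMMON_POSITIONS)
```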
Final result:
This article mainly introduced how to use the Notebook IDE environment and the Tongyi Lingma tool to develop AI products. As you can see, a good development environment and good development tools often make development more efficient and solve practical problems faster and better. As AI code generation tools continue to mature, just move your fingers and your AI product will become reality~