AI Programming: The Era of Co-Building Intelligent IDEs Begins

The future of AI programming has arrived: Microsoft is open-sourcing GitHub Copilot, ushering in the era of co-built intelligent IDEs.
Core content:
1. Microsoft has announced that GitHub Copilot will be open-sourced, opening a new chapter in community co-construction of AI programming IDEs
2. From a technical-implementation perspective, Copilot's prompt engineering and optimization strategies
3. Intelligent IDEs will steadily converge, and open source VS Code will become the best infrastructure for them
On May 19, Microsoft officially announced that GitHub Copilot will be open-sourced and built together with the community. This is a milestone moment!
Historical context
Since the advent of large language models (LLMs), intelligent programming tools represented by Cursor have ushered in a new era of AI programming. However, almost all AI programming IDEs today are rebuilt from forks of the VS Code source code; typical examples include Cursor and Windsurf. These IDEs are opaque and hard to customize: their advanced features depend on external models, which means exposing your internal source code to them. Even Microsoft's own Copilot was closed source.
From a technical-implementation perspective, Copilot is largely a matter of prompt engineering and optimization strategies (a rough sketch of the prompt-assembly idea follows below). Nowadays even a fresh graduate can fork VS Code and raise millions of dollars in investment. Institutions such as YC have invested in and incubated a large number of open source Cursor-like products, but these projects are typically built by teams of fewer than ten people, patched onto VS Code, with slow iteration and little community activity or influence (very low visibility). The mass of secondary forks leads to duplicated work, and there is no real co-construction model that pools the community's strength.
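To make "prompt engineering" concrete, here is a minimal sketch of how a completion prompt might be assembled from editor context in a fill-in-the-middle (FIM) style. This is illustrative only, not Copilot's actual internals; the sentinel markers, size limits, and field names are assumptions.

```typescript
// Illustrative only: a simplified fill-in-the-middle (FIM) prompt assembler of the
// kind commonly seen in code-completion tooling. Markers and heuristics are assumptions.

interface CompletionContext {
  languageId: string; // e.g. "typescript"
  path: string;       // file path, used as a cheap context hint
  prefix: string;     // text before the cursor
  suffix: string;     // text after the cursor
}

function buildFimPrompt(ctx: CompletionContext, maxPrefixChars = 2000, maxSuffixChars = 1000): string {
  // Keep the most relevant context: the tail of the prefix and the head of the suffix.
  const prefix = ctx.prefix.slice(-maxPrefixChars);
  const suffix = ctx.suffix.slice(0, maxSuffixChars);

  // A header comment gives the model lightweight signals (path, language) at low token cost.
  const header = `// Path: ${ctx.path}\n// Language: ${ctx.languageId}\n`;

  // Hypothetical FIM sentinel tokens; real models each define their own.
  return `<fim_prefix>${header}${prefix}<fim_suffix>${suffix}<fim_middle>`;
}

// Example usage: completing the body of a function.
const prompt = buildFimPrompt({
  languageId: "typescript",
  path: "src/math/sum.ts",
  prefix: "export function sum(xs: number[]): number {\n  ",
  suffix: "\n}\n",
});
console.log(prompt);
```

In a real product, the interesting work lies in the heuristics around this skeleton: which neighboring files to include, how to truncate context, and how to rank candidate completions.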
Against this backdrop, I once reverse-engineered Copilot, recovered some readable code, and successfully ran the intelligent IDE in an offline (intranet) environment. At the end of the articles "VS Code Secondary Development Guide" and "Building Intranet Offline Cursor", I drew the following conclusions:
3. In the long run, the best strategy is to reuse the open source VS Code infrastructure (including its AI chat and application capabilities) and implement Copilot as an internal extension (tuning prompts for internal models and strengthening agent strategies); see the sketch after this list.
4. Intelligent IDEs are currently very popular, but they are all forks of VS Code (known examples include Cursor, Windsurf, Trae, etc.), because the VS Code team has been too slow to respond to new tooling. In the long run, the intelligent IDE experience will stabilize and converge, and the need to fork VS Code will diminish. Most likely, VS Code itself will provide the common intelligent IDE capabilities and expose interfaces through which extensions can optimize and fine-tune their own LLM strategies.
…
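To make conclusion 3 concrete, here is a minimal sketch, assuming VS Code's public Chat and Language Model extension APIs, of how an internal extension could plug its own prompt strategy into the shared chat UI. The participant id, prompt wording, and model selection are illustrative assumptions, not an actual internal Copilot implementation.

```typescript
// A minimal sketch of "reuse VS Code's chat infrastructure, keep your own LLM
// strategy in an extension", using the Chat and Language Model extension APIs.
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  // Register a chat participant; users invoke it as "@internal-copilot" in the chat view.
  const participant = vscode.chat.createChatParticipant(
    'example.internal-copilot',
    async (request, _chatContext, stream, token) => {
      // The internal prompt strategy lives here: framing plus the user's request.
      const messages = [
        vscode.LanguageModelChatMessage.User(
          'You are an internal coding assistant. Answer with concise, reviewable code.'
        ),
        vscode.LanguageModelChatMessage.User(request.prompt),
      ];

      // Pick any available chat model; an enterprise build could route to an intranet model instead.
      const [model] = await vscode.lm.selectChatModels({ vendor: 'copilot' });
      if (!model) {
        stream.markdown('No chat model is available.');
        return;
      }

      // Stream the model's reply straight into the chat UI.
      const response = await model.sendRequest(messages, {}, token);
      for await (const fragment of response.text) {
        stream.markdown(fragment);
      }
    }
  );

  context.subscriptions.push(participant);
}
```

The point of the sketch is the division of labor: the editor supplies the chat UI, streaming, and model registry, while the extension owns only the prompt and agent strategy.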
Official Announcement
The following is the official announcement
————
We believe that the future of code editors should be open and powered by AI. Over the past decade, VS Code has been one of the most successful open source projects on GitHub. We are grateful to the vibrant community of contributors and users who choose VS Code because it is open source. As AI becomes core to the VS Code developer experience, we will continue to uphold our original development principles: open, collaborative, and community-driven.
We will open source the GitHub Copilot Chat extension under the MIT license, and then carefully refactor the extension's components into VS Code core. This is the next logical step in our journey to make VS Code an open source AI editor. It reflects that AI-driven tools are at the core of how we write code; it also reaffirms our belief that working in the open leads to better products for users and fosters a diverse extension ecosystem.
Why open source now?
Over the past few months, we have observed changes in AI development that motivated us to transition AI development in VS Code from closed source to open source:
Large language models have improved significantly, reducing the need for "secret sauce" prompting strategies.
The most popular and effective AI interaction UX treatments are already widely used in the editor today. We hope that by integrating these common UI elements into a stable, open code base, the community can continue to improve and build on them.
An ecosystem of open source AI tools and VS Code extensions has emerged. We want to make it easier for these extension authors to build, debug, and test their extensions. This is particularly challenging today without access to the source code for the Copilot Chat extension.
We’ve received a lot of questions about the data our AI editors collect. Open sourcing the Copilot Chat extension will increase transparency by giving you visibility into the data we collect.
Malicious actors are increasingly targeting AI developer tools. Throughout VS Code's history as open source software (OSS), community issues and PRs have helped us quickly discover and fix security issues.
In the coming weeks, we will work on open sourcing the code for the GitHub Copilot Chat extension and refactoring the AI capabilities in the extension into VS Code core. Our core priorities remain the same: delivering great performance, powerful extensibility, and an intuitive and beautiful user interface.
Open source works best when the community builds around a stable, shared foundation. Therefore, our goal is to make contributing AI features as easy as contributing to any other part of VS Code. The stochastic nature of large language models makes it particularly difficult to test AI features and iterate quickly. To make this easier, we are also open sourcing our prompt testing infrastructure to ensure community PRs can build and pass tests.
We’re excited to shape the future of development as an open source AI editor - and we hope you’ll join us on this open journey.
——End of official announcement——
AI programming requires open source infrastructure
I was genuinely excited to see this news. It not only confirmed my earlier intuition and prediction; more importantly, the community strength scattered across countless VS Code forks can now be expected to gather again. In the future, building offline intelligent programming platforms for enterprises will also involve far less reinventing of the wheel.
Here I would like to pay tribute to the VS Code team once more: over the past decade, their open source work has built what is arguably the most active community after Linux and Chromium. The VS Code repository shows just how lively that community is. The team not only publishes iteration plans and work details, but also thoughtfully labels a large number of issues "good first issue" so that newcomers have a friendly on-ramp to contributing code. Most importantly, VS Code remains, without question, the most worthwhile software project to study in the front-end field.
Can Copilot define a new era of open source AI programming infrastructure? That depends on whether the team leading it can truly carry forward VS Code's tradition of transparency and its respect for the community's opinions and enthusiasm.