From 20,000 lines of Linux code to global scale: Microsoft’s open-source journey


From Linux kernel code to AI at scale, discover Microsoft’s open source evolution and impact.

Microsoft’s engagement with the open source community has transformed the company from a one-time skeptic into one of the world’s leading open source contributors. In fact, over the past three years, Microsoft Azure has been the largest public cloud contributor (and the second largest overall contributor) to the Cloud Native Computing Foundation (CNCF). So, how did we get here? Let’s look at some milestones in our journey and explore how open-source technologies are at the heart of the platforms powering many of Microsoft’s biggest products, like Microsoft 365, and massive-scale AI workloads, including OpenAI’s ChatGPT. Along the way, we have also launched and contributed to several open-source projects inspired by our own experiences, giving back to the community and accelerating innovation across the ecosystem.

Embracing open source: Key milestones in Microsoft’s journey

2009—A new leaf: 20,000 lines to Linux. In 2009, Microsoft contributed more than 20,000 lines of code to the Linux kernel, initially Hyper‑V drivers, under the General Public License, version 2 (GPLv2). It wasn’t our first open source contribution, but it was a visible moment that signaled a change in how we build and collaborate. By 2011, Microsoft was among the top five companies contributing to Linux. Today, 66% of customer cores in Azure run Linux.

2015—Visual Studio Code: An open source hit. In 2015, Microsoft launched Visual Studio Code (VS Code), a lightweight, open-source, cross-platform code editor. Today, Visual Studio and VS Code together have more than 50 million monthly active developers, with VS Code itself widely regarded as the most popular development environment. We believe AI experiences can thrive by leveraging the open-source community, just as VS Code has successfully done over the past decade. With AI becoming an integral part of the modern coding experience, we’ve released the GitHub Copilot Chat extension as open source on GitHub.

2018—GitHub and the “all-in” commitment. In 2018, Microsoft acquired GitHub, the world’s largest developer community platform, which was already home to 28 million developers and 85 million code repositories. The acquisition underscored Microsoft’s transformation. As CEO Satya Nadella said in the announcement, “Microsoft is all-in on open source… When it comes to our commitment to open source, judge us by the actions we have taken in the recent past, our actions today, and in the future.” In the 2024 Octoverse report, GitHub counted 518 million public and open-source projects, over 1 billion contributions in 2024, about 70,000 new public and open-source generative AI projects, and a roughly 59% year-over-year surge in contributions to generative AI projects.

Open source at enterprise scale: Powering the world’s most demanding workloads

Open-source technologies like Kubernetes and PostgreSQL have become foundational pillars of modern cloud-native infrastructure—Kubernetes is the second largest open-source project after Linux and now powers millions of containerized workloads globally, while PostgreSQL is one of the most widely adopted relational databases. Azure Kubernetes Service (AKS) and Azure’s managed PostgreSQL take the best of these open-source innovations and elevate them into robust, enterprise-ready managed services. By abstracting away the operational complexity of provisioning, scaling, and securing these platforms, AKS and managed PostgreSQL let organizations focus on building and innovating. This combination of open source flexibility with cloud-scale reliability allows services like Microsoft 365 and OpenAI’s ChatGPT to operate at massive scale while staying highly performant.
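
To give a sense of how little operational surface a managed service leaves to the user, here is a minimal sketch (using the Azure SDK for Python; the resource group, cluster name, and sizes are hypothetical placeholders) of provisioning an AKS cluster programmatically. The service handles control-plane setup, node management, and upgrades behind this single long-running call.

```python
# Minimal sketch: provisioning an AKS cluster with the Azure SDK for Python.
# The subscription, resource group, cluster name, and VM size are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerservice import ContainerServiceClient
from azure.mgmt.containerservice.models import (
    ManagedCluster,
    ManagedClusterAgentPoolProfile,
    ManagedClusterIdentity,
)

credential = DefaultAzureCredential()
client = ContainerServiceClient(credential, subscription_id="<subscription-id>")

cluster = ManagedCluster(
    location="westeurope",
    dns_prefix="demo-aks",
    identity=ManagedClusterIdentity(type="SystemAssigned"),
    agent_pool_profiles=[
        ManagedClusterAgentPoolProfile(
            name="systempool",
            mode="System",
            count=3,
            vm_size="Standard_DS2_v2",
            os_type="Linux",
        )
    ],
)

# begin_create_or_update is a long-running operation; the managed service
# provisions and configures the control plane and node pool for us.
poller = client.managed_clusters.begin_create_or_update(
    resource_group_name="demo-rg",
    resource_name="demo-aks",
    parameters=cluster,
)
print(poller.result().provisioning_state)
```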

COSMIC: Microsoft’s geo-scale, managed container platform powers Microsoft 365’s transition to containers on AKS. It runs millions of cores and is one of the largest AKS deployments in the world. COSMIC bakes in security, compliance, and resilience while embedding architectural and operational best practices into our internal services. The result: drastically reduced engineering effort, faster time-to-market, and improved cost management, even while scaling to millions of monthly users around the world. COSMIC uses Azure and open-source technologies to operate at planet-wide scale: Kubernetes Event-driven Autoscaling (KEDA) for autoscaling, and Prometheus and Grafana for real-time telemetry and dashboards, to name a few.
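
To illustrate the KEDA piece of that stack, here is a small sketch that registers a KEDA ScaledObject through the Kubernetes Python client so a deployment scales on queue depth. The namespace, deployment name, and Azure Service Bus trigger settings are hypothetical placeholders, not COSMIC’s actual configuration.

```python
# Illustrative sketch: creating a KEDA ScaledObject via the Kubernetes Python
# client. KEDA then scales the target deployment based on queue length.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

scaled_object = {
    "apiVersion": "keda.sh/v1alpha1",
    "kind": "ScaledObject",
    "metadata": {"name": "worker-scaler", "namespace": "demo"},
    "spec": {
        "scaleTargetRef": {"name": "worker-deployment"},
        "minReplicaCount": 1,
        "maxReplicaCount": 50,
        "triggers": [
            {
                "type": "azure-servicebus",
                "metadata": {
                    "queueName": "jobs",
                    "messageCount": "100",
                    "connectionFromEnv": "SERVICEBUS_CONNECTION",
                },
            }
        ],
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="keda.sh",
    version="v1alpha1",
    namespace="demo",
    plural="scaledobjects",
    body=scaled_object,
)
```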

OpenAI’s ChatGPT: ChatGPT is built on Azure, using AKS for container orchestration, Azure Blob Storage for user and AI-generated content, and Azure Cosmos DB for globally distributed data. The scale is staggering: ChatGPT has grown to nearly 700 million weekly active users, making it the fastest-growing consumer app in history.1 And yet, OpenAI operates this service with a surprisingly small engineering team. As Microsoft Cloud and AI Group Executive Vice President Scott Guthrie highlighted at Microsoft Build in May, ChatGPT “needs to scale … across more than 10 million compute cores around the world,” with roughly 12 engineers to manage all that infrastructure. How? By relying on managed platforms like AKS that combine enterprise capabilities with the best of open source innovation to do the heavy lifting of provisioning, scaling, and healing Kubernetes clusters across the globe.

Consider what happens when you chat with ChatGPT: Your prompt and conversation state are stored in an open-source database (Azure Database for PostgreSQL) so the AI can remember context. The model runs in containers across thousands of AKS nodes. Azure Cosmos DB then replicates data in milliseconds to the datacenter closest to the user, ensuring low latency. All of this is powered by open-source technologies under the hood and delivered as cloud services on Azure. The result: ChatGPT can handle “unprecedented” load—over one billion queries per day, without a hitch and without needing a massive operations team.
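
The general pattern described above can be sketched in a few lines. This is an illustrative sketch only, not OpenAI’s actual implementation: conversation turns are written to a managed PostgreSQL database, and a compact session document is upserted into Cosmos DB, which handles multi-region replication. The connection strings, table, and container names are hypothetical placeholders.

```python
# Illustrative sketch of the "state in Postgres, globally replicated document
# in Cosmos DB" pattern. Not production code; all names are placeholders.
import os
import psycopg2
from azure.cosmos import CosmosClient

# 1) Persist the latest conversation turn in Azure Database for PostgreSQL.
pg = psycopg2.connect(os.environ["POSTGRES_DSN"])
with pg, pg.cursor() as cur:
    cur.execute(
        "INSERT INTO conversation_turns (session_id, role, content) "
        "VALUES (%s, %s, %s)",
        ("session-123", "user", "What is KEDA?"),
    )

# 2) Upsert a small session document into Cosmos DB; the service replicates it
#    to the configured regions and serves reads from the nearest datacenter.
cosmos = CosmosClient(os.environ["COSMOS_URL"], credential=os.environ["COSMOS_KEY"])
container = cosmos.get_database_client("chat").get_container_client("sessions")
container.upsert_item(
    {
        "id": "session-123",
        "userId": "user-42",
        "lastMessage": "What is KEDA?",
        "turnCount": 7,
    }
)
```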

What Azure teams are building in the open

At Microsoft, our commitment to building in the open runs deep, driven by engineers across Azure who actively shape the future of open-source infrastructure. Our teams don’t just use open-source technologies; they help build and evolve them.

Our open-source philosophy is straightforward: we contribute upstream first and then integrate those innovations into our downstream products. To support this, we play a pivotal role in upstream open-source projects, collaborating across the industry with partners, customers, and even competitors. Examples of projects we have built or contributed to include:

  • Dapr (Distributed Application Runtime): A CNCF-graduated project launched by Microsoft in 2019, Dapr simplifies cloud-agnostic app development with modular building blocks for service invocation, state, messaging, and secrets (a brief sketch of its state API follows this list).
  • Radius: A CNCF Sandbox project that lets developers define application services and dependencies, while operators map them to resources across Azure, AWS, or private clouds—treating the app, not the cluster, as the unit of intent.
  • Copacetic: A CNCF Sandbox tool that patches container images without full rebuilds, speeding up security fixes—originally built to secure Microsoft’s cloud images.
  • Dalec: A declarative tool for building secure OS packages and containers, producing software bills of materials (SBOMs) and provenance attestations to yield minimal, reproducible base images.
  • SBOM Tool: A command line interface (CLI) for generating SPDX-compliant SBOMs from source or builds—open-sourced by Microsoft to boost transparency and compliance.
  • Drasi: A CNCF Sandbox project launched in 2024, Drasi reacts to real-time data changes using a Cypher-like query language for change-driven workflows.
  • Semantic Kernel and AutoGen: Open-source frameworks for building collaborative AI apps—Semantic Kernel orchestrates large language models (LLMs) and memory, while AutoGen enables multi-agent workflows.
  • Phi-4 Mini: A compact 3.8 billion-parameter AI model released in 2025, optimized for reasoning and mathematics on edge devices; available on Hugging Face.
  • Kubernetes AI Toolchain Operator (KAITO): A CNCF Sandbox Kubernetes operator that automates AI workload deployment—supporting LLMs, fine-tuning, and retrieval-augmented generation (RAG) across cloud and edge, with AKS integration.
  • KubeFleet: A CNCF Sandbox project for managing applications across multiple Kubernetes clusters. It provides smart scheduling, progressive deployments, and cloud-agnostic orchestration.
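
As a taste of the building-block style Dapr encourages, here is a minimal sketch using the Dapr Python SDK to save and read state through a configured state store. It assumes a Dapr sidecar is running next to the app; the store name and key are hypothetical placeholders.

```python
# Minimal sketch of Dapr's state building block using the Dapr Python SDK.
# Assumes a sidecar is running and a state store component named "statestore"
# is configured; the key and value below are placeholders.
from dapr.clients import DaprClient

with DaprClient() as d:
    # Save state through the sidecar; the backing store (Redis, Cosmos DB,
    # PostgreSQL, ...) is swapped via component configuration, not code.
    d.save_state(store_name="statestore", key="order-1001", value='{"status": "paid"}')

    # Read it back.
    item = d.get_state(store_name="statestore", key="order-1001")
    print(item.data)
```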

This is just a small sampling of the open-source projects Microsoft is involved in—each one sharing, in code, the lessons we’ve learned from running systems at global scale and inviting the community to build alongside us.

Open Source + Azure = Empowering the next generation of innovation

Microsoft’s journey with open source has come a long way from that 20,000-line Linux patch in 2009. Today, open-source technologies are at the heart of many Azure solutions. And conversely, Microsoft’s contributions are helping drive many open-source projects forward—whether it’s commits to Kubernetes; new tools like KAITO, Dapr, and Radius; or research advancements like Semantic Kernel and Phi-4. Our engineers understand that the success of end-user solutions like Microsoft 365 and ChatGPT depends on scalable, resilient platforms like AKS—which in turn are built on and sustained by strong, vibrant open source communities.

Join us at Open Source Summit Europe 2025

As we continue to contribute to the open source community, we’re excited to be part of Open Source Summit Europe 2025, taking place August 25–27. You’ll find us at booth D3 with live demos, in-booth sessions covering a range of topics, and plenty of opportunities to connect with our Open Source team. Be sure to catch our conference sessions as well, where Microsoft experts will share insights, updates, and stories from our work across the open source ecosystem.


1 TechRepublic, “ChatGPT’s On Track For 700M Weekly Users Milestone: OpenAI Goes Mainstream,” August 5, 2025.


