Can you trust uv long-term?
In March 2026, OpenAI acquired Astral, the venture-backed startup behind uv. Before betting a team’s workflow on a tool whose owner’s priorities just changed, the reasonable question is what happens if Astral’s output slows or stops.
The handbook recommends uv. This page lays out why that recommendation survives the trust question, what would have to change before it doesn’t, and what to do today to keep your migration costs low.
Look at what you actually adopt
uv is dual-licensed under Apache 2.0 and MIT. Neither license can be revoked. Every release that has shipped is permanently available to fork, modify, or redistribute. Anyone, including you, can freeze the source tree at any commit and keep using it. No future owner can take that away.
The corporate risk question is not “what happens to the source code?” because the source is already a public good. The question is “what happens to the team that has been shipping releases on a roughly two-week cadence?” Those are different problems with different answers.
Track what changed when OpenAI acquired Astral
The Astral team is joining OpenAI’s Codex group. OpenAI says it will keep the open source tools open while building closer integration with Codex.
The community read on the deal was that it was an acquihire of a team near the end of its runway, not a takeover of the technology. The technology was already open. What OpenAI bought was the people who know it best.
The realistic worry is roadmap drift, not extinction. OpenAI could steer the project toward Codex-specific features, and release pace could slow as team attention shifts. Either scenario still leaves the existing tool functional and the source available to fork or maintain.
Test how forkable uv really is
uv is forkable on license terms. The practical question is whether anyone could match Astral’s velocity. uv depends on a custom resolver, the python-build-standalone project, PEP 517 build-backend interface support, and lockfile generation. Matching Astral’s two-week release cadence would need real engineering capacity. Keeping a fork functional, with bug fixes, security patches, and builds against new Python releases, is a much smaller job, well within reach of a handful of paid maintainers or a corporate sponsor. That is roughly the model pip, virtualenv, and pip-tools have run on for years.
A fork would also inherit the codebase but not Astral’s working relationships with PyPI, the Packaging Council, and standards authors. Those relationships influence which standards land and how quickly. A fork starts from scratch on that front.
What still makes a fork more credible for uv than for a typical orphaned project is that Python’s packaging standards are themselves open, so a fork would not have to invent compatibility from scratch. The Rye project was already absorbed into uv once before, which is a precedent for the codebase being adoptable by a different team.
A fork would still be slower than Astral was, but it is a real fallback rather than a hypothetical one.
Watch for signals that would change the recommendation
The handbook will keep recommending uv as long as the following remain true. When any of them flips, this page will say so.
- The published source matches what the uv binary actually runs (no closed components, no telemetry the user cannot disable)
- New releases stay under Apache 2.0 / MIT terms
- Bug reports get addressed on a timescale comparable to the pre-acquisition pace
- Standards work continues, including PEP 723 inline-script metadata and PEP 751 standardized lockfiles
- A working fork exists if and when releases stall
A switch away from uv would be triggered by license changes, opaque telemetry the user cannot disable, a release cadence that visibly drops, or a closed extension layer added on top of the open code. None of those have happened, and none of them are visible yet.
Keep your migration costs low
The single best hedge against any tool’s future is making sure you could leave it. uv is unusual in that this hedge is almost free.
Stick to standardized inputs and outputs. Keep runtime dependencies in [project.dependencies] and extras in [project.optional-dependencies], both defined by PEP 621. Put dev, test, and lint groups in [dependency-groups] (PEP 735) rather than uv’s older [tool.uv.dev-dependencies] table. Both formats are read by uv today, by recent versions of pip, and by Poetry. Lockfiles are headed in the same direction: PEP 751 defines a standard pylock.toml that any installer can consume.
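A minimal pyproject.toml that sticks to those standardized sections might look like this (the project name and dependencies are hypothetical):

```toml
[project]
name = "example-app"            # hypothetical project
version = "0.1.0"
dependencies = [                # PEP 621: readable by uv, pip, and Poetry
    "httpx>=0.27",
]

[project.optional-dependencies] # extras, also PEP 621
cli = ["click>=8.1"]

[dependency-groups]             # PEP 735: dev/test/lint groups
dev = ["pytest>=8", "ruff"]
```

Nothing in this file mentions uv, which is exactly the point: any PEP 621-aware tool can pick it up unchanged.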
Some uv tables don’t have standard equivalents yet. The common ones to know about:
- [tool.uv.sources] for git, path, or URL dependencies
- [tool.uv.workspace] for monorepo-style multi-package layouts
- [tool.uv.index] for alternative package indexes
Using these is fine. They are also the entries you would translate by hand if you ever switched tools, so it’s worth keeping the list short.
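For reference, those uv-specific tables look roughly like this (the package names, paths, and URLs are placeholders; check the uv documentation for the exact syntax your version supports):

```toml
[tool.uv.sources]
# Pull one dependency from git instead of an index (placeholder URL).
some-lib = { git = "https://github.com/example/some-lib" }

[tool.uv.workspace]
# Monorepo-style layout: every package under packages/ joins the workspace.
members = ["packages/*"]

[[tool.uv.index]]
# An alternative package index (placeholder name and URL).
name = "internal"
url = "https://pypi.example.com/simple"
```

Each of these entries maps to a concept other tools also have (direct references, path installs, index URLs), so translation is mechanical, just not automatic.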
Don’t depend on uv-only behaviors in your build. uv plays a development-time and CI role; the wheels you publish should still build with a standards-compliant backend like Hatch’s hatchling so that downstream consumers don’t need uv to install your package.
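Concretely, that means declaring a standard PEP 517 build backend in pyproject.toml, for example hatchling, so that any frontend, not just uv, can build your wheel:

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```

With this in place, pip, build, uv, and Poetry all produce the same wheel from the same source tree.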
Practice the exit you might never take. The How to migrate from uv to pip walkthrough makes this concrete: an exported pylock.toml plus pip inside a virtual environment replicates the install. If that sequence works, your project is portable. If it does not, you have leaked uv specifics into your code or config, and that is fixable independent of any acquisition.
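A sketch of that drill using only stock tooling, assuming a recent uv (the pylock.toml export format is comparatively new; the requirements.txt export is the long-supported fallback):

```shell
# Export uv's lockfile to the standardized PEP 751 format
# (assumes a uv version that supports the pylock.toml export format).
uv export --format pylock.toml -o pylock.toml

# Recreate the environment with only the standard library's venv and pip.
python -m venv .venv
. .venv/bin/activate

# pip's pylock.toml support is still maturing, so requirements.txt
# remains the safest interchange format for the drill itself.
uv export --format requirements-txt -o requirements.txt
python -m pip install -r requirements.txt
```

If the resulting environment runs your test suite, the project is portable; if not, the failures point directly at the uv specifics you have leaked.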
Compare uv to its alternatives
The trust question for any tool is “compared to what?” Pip is maintained by PyPA volunteers; it isn’t going anywhere, but it solves a smaller problem (just installation, not environment or interpreter management). Poetry is maintained by an independent team but has had multi-month gaps between releases and a smaller pool of paid maintainers than Astral’s pre-acquisition team.
“Use a slower tool to avoid VC risk” is a real choice, but the cost is a slower tool. Reverting to pip, virtualenv, pyenv, and pip-tools to avoid one company’s commercial decisions reintroduces the fragmented Python toolchain that uv consolidated. That tradeoff is worth making explicit.
If you distrust all VC-funded developer tools on principle, the consistent stance is to use only PSF-stewarded software. That is a coherent position, but almost nothing in modern developer tooling, in Python or any other ecosystem, survives the filter: it excludes a large fraction of the type checkers, linters, IDE integrations, and AI coding assistants the same teams already rely on every day.
Related
- How to migrate from uv to pip operationalizes the exit playbook with pyenv and pip
- Why You Should Try uv if You Use Python covers what uv replaces and the speed argument
- uv: A Complete Guide is the comprehensive overview
- OpenAI to Acquire Astral summarizes the announcement and community reaction
- Charlie Marsh on uv, agents, and the future of Python is the Astral CEO interview from before the acquisition
- Which Python package manager should I use? walks through the broader decision tree