Anthropic acquires Bun
(bun.com) | 1856 points by ryanvogel 17 hours ago
But how is another company that is also VC-backed and losing money providing stability for Bun?
How long before we hear about “Our Amazing Journey”?
On the other hand, I would rather see someone like Bun have a successful exit, where the founders started with a passion project, got funding, and built out something they were excited about, than yet another AI company from non-technical founders, built with the sole purpose of raising funding and exiting.
Anthropic may be losing money, but a company with $7bn revenue run rate (https://www.anthropic.com/news/statement-dario-amodei-americ...) is a whole lot healthier than a company with a revenue of 0.
If I had the cash, I could sell dollar bills for 50 cents and do a $7b run rate :)
I am fairly skeptical about many AI companies, but as someone else pointed out, Anthropic has 10x'ed their revenue for the past 3 years: 100m -> 1b -> 10b. While past performance is no predictor of future results, their product is solid, and to me it looks like they have found PMF.
Often VCs buy out companies from friendly funds, because the selling fund wants to show performance to its investors until "the big one", or to move cash from one wealthy pocket to another.
"You buy me this, next time I save you on that", etc...
"Raised $19 million Series A led by Khosla Ventures + $7 million"
"Today, Bun makes $0 in revenue."
Everything is MIT-licensed, which is almost public domain, and can be forked without paying a single dollar.
Questionable to claim that the technology is the real reason this was bought.
It's an acquihire. If Anthropic is already spending significant resources to improve Bun internally, or sees that it will have to, it makes a lot of sense. No nefarious undertones required.
An analogous example off the top of my head: Shopify hired Rafael França to work on Rails full-time.
If it was an acquihire, it's still a lot less slimy than just offering the employees they care about a large compensation package and leaving the company behind as a husk, like Amazon, Google, and Microsoft have done recently.
> But how is another company that is also VC-backed and losing money providing stability for Bun?
Reminds me of when Tron, the crypto company, bought BitTorrent.
I misread it as Amazon, implying that Amazon might buy Anthropic, and I think that's what will end up happening.
In my three or four non-chatbot-related projects, I've found Amazon's Nova models to be just as good as Anthropic's.
Ditto, and I got to know Bun via HN. It seemed intriguing, but also "why another JS runtime" etc.
If Bun embraces the sweet spot around edge computing, modern JS/TS, and AI services, I think their future looks bright.
Bun seems more alive than Deno, FWIW.
One thing I like about this, despite it meaning Bun will be funded, is that Anthropic is a registered public benefit corporation. While this doesn't mean Anthropic can't fuck over the users of Bun, it at least puts in some roadblocks. The path of least resistance here should be to improve Bun for users, not to monetize it to the point where it's no longer valuable.
I wonder what this means for Deno.
Will this make it more or less likely for people to use Bun vs Deno?
And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
Bun and Deno's goals seem quite different, I don't expect that to change. Bun is a one stop shop with an ever increasing number of built-in high-level APIs. Deno is focused on low level APIs, security, and building out a standard lib/ecosystem that (mostly) supports all JS environments.
People who like Bun for what it is are probably still going to, and same goes for Deno.
That being said I don't see how Anthropic is really adding long term stability to Bun.
I think Deno's management has been somewhat distracted by their ongoing lawsuit with Oracle over the release of the JavaScript trademark.
I started out with Deno, and when I discovered Bun, I pivoted. Personally I don't need the Node.js/npm compatibility. I wish there were a Bun-lite freed of the backward compatibility.
As for Deno, to me that means their business is not really flying and they need this kind of distraction instead.
The number of people at big corps who care about their lawsuit, and who would switch their IT guidelines from Node to Deno because of such heroic efforts?
Zero.
Ironically, this was early Deno - but then adoption required backwards compatibility.
The bloat. I prefer lean designs with plug-in modules for additional functionality. Not only do unused sub-systems take up memory, but they also increase the potential attack surface.
> Will this make it more or less likely for people to use Bun vs Deno?
I'm not sure it will make much of a difference in the short term.
Those who were drawn to Bun by hype and/or concerns around speed will continue to use Bun.
For me personally, I will continue to use Node for legacy projects and will continue using Deno for current projects.
I'm not interested in Bun for its hype (since hype is fleeting). I have a reserved interest in Bun's approach to speed, but I don't see it being a significant factor, since most JS speed concerns come from downloading dependencies (a one-off operation) and terrible JS framework practices (which aren't resolved by changing engines anyway).
----------------------------
The two largest problems I see in JS are:
1. Terrible security practices
2. A lack of a standard library which pushes people into dependency hell
Deno fixes both of those problems with a proper permission model and a standard library; see the sketch below.
----------------------------
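A minimal sketch of what I mean, assuming Deno (the @std import is just illustrative):

    // fetch.ts - under Deno's permission model this fails at runtime unless
    // you grant network access, e.g.: deno run --allow-net fetch.ts
    import { delay } from "jsr:@std/async";

    await delay(100); // standard library, no third-party dependency
    const res = await fetch("https://example.com");
    console.log(res.status);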
> And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
I think any predictions between 1-10 years out are going to be a little too chaotic. It all depends on how the AI bubble unwinds.
But after 10 years, I can see runtimes switching from their current engines to one based on Boa, Kiesel or something similar.
Prediction: Bun is absorbed in-house and used by Anthropic as a faster/cheaper place for Claude to run code.
It fades away as a direct to developer tool.
This is a good thing for Deno.
Deno is dead. Seems like there haven't been very relevant or user-informed changes on their roadmap for year(s) now.
Why don't they use their self-proclaimed SE-replacing AI coding bot to fork Bun and call it AnthroBun, instead of hiring the actual engineers behind Bun?
My first thought went to how OpenAI used Rust to build their CLI tool, and Anthropic's CEO bought influence over Zig as a reaction.
Jarred just tweeted a few days ago about how little influence over Zig he has, funnily enough.
> bought influence over Zig as a reaction
Elaborate? I believe Zig's donors don't get any influence and decision making power.
Since when is a CLI tool like this a sufficiently demanding technical project that it needs to buy the runtime just to get sufficient support?
This just isn't the hard part of the product.
Like if I was building a Claude Code competitor and I acquired bun, I wouldn't feel like I had an advantage because I could get more support with like fs.read?
As someone who has been using Deno for the last few years, is there anything that Bun does better? Bun seems to use a different runtime (JSC) which is less tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?). The last time I checked Bun's source code, it was... quite messy and spaghetti-like, plus Zig doesn't really offer many safety features, so it's not that hard to write incorrect code. Zig does force some safety with ReleaseSafe IIRC, but it's still not the same as even modern C++, let alone Rust.
I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.
I haven't used Deno, but I do use Bun purely as a replacement for npm. It does the hard-linking thing that seems to be increasingly common for package managers these days (i.e. it populates your local node_modules with a bunch of hard links to its systemwide cache), which makes it vastly quicker and more disk-efficient than npm for most usage.
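If you're curious, you can check the hard-linking yourself; a quick sketch (my own helper, not any tool's API):

    import { statSync } from "node:fs";

    // Two paths are hard links to the same file when they share a device
    // number and an inode number.
    function isHardLinked(p1: string, p2: string): boolean {
      const a = statSync(p1);
      const b = statSync(p2);
      return a.dev === b.dev && a.ino === b.ino;
    }

    // Paths are illustrative: compare a file under node_modules with the
    // corresponding file in the package manager's global cache.
    // isHardLinked("node_modules/react/index.js", "<global cache>/react/index.js");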
Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.
I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?
As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.
Deno does all that. Hell, yarn does too, or pnpm as the sibling mentioned.
Deno does that. It also refrains from keeping a local node_modules at all until/unless you explicitly ask it to for some compatibility reason. There are plugins for things like esbuild to use the Deno resolver and skip node_modules entirely (if you aren't using the Deno-provided bundler for whatever reason, such as it having disappeared for a couple of versions and still being marked "experimental").
pnpm does all that on top of node. Also disables postinstall scripts by default, making the recent security incidents we've seen a non-issue.
I decided to stick with Node in general. I don't see any compelling reason to change it.
Faster installs and less disk space due to hardlinks? Not really all that important to me. npm comes with a cache too, and I have the disk space. I don't need it to be faster.
With the old-school setup I can easily manually edit something in node_modules to quickly test a change.
No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.
Search for pointer exceptions or core dumps in Bun's GitHub issues and you'll see why people (should) use Deno over Bun, if only because Rust is a far safer language than Zig.
This is a non sequitur. Both Rust and Zig and any other language has the ability to end in an exception state. Whether it be kernel exception, pointer exception, or Rust's panic! - these things exist.
The reason you see so many GitHub issues about it is because that's where the development is. Deno is great. Bun is great. These two things can both be great, and we don't have to choose sides. Deno has its use case. Bun has its. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust vs. the world is getting old. Rust isn't any "safer" when Deno can panic too.
Don't make a false equivalence, how many times does one get a panic from Deno versus a segmentation fault in Bun? It's not a similar number, and it's simply wrong to say that both are just as unsafe when that's plainly untrue.
> This is a non sequitur. Both Rust and Zig and any other language has the ability to end in an exception state.
There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.
Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even in those GC languages there's degrees to memory safety.
I haven't verified this, but I would be willing to bet that most of Bun's issues here have more to do with interfacing with JavaScriptCore through the C FFI than with Zig itself. This is as much a problem in Rust as it is in Zig. In fact, it has been argued that writing unsafe Zig is safer than writing unsafe Rust: https://zackoverflow.dev/writing/unsafe-rust-vs-zig/
As someone who has researched the internals of Deno and Bun, your unverified vibe thoughts are flat out wrong. Bun is newer and buggier and that's just the way things go sometimes. You'll get over it.
Easily bundling and serving frontend code from your backend code is very appealing: https://bun.com/docs/bundler/fullstack
Despite the page title being "Fullstack dev server", it's also useful in production (Ctrl-F "Production Mode").
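The gist, going by those docs, is roughly this (a sketch; exact option names may vary by Bun version):

    // server.ts - Bun bundles the imported HTML (plus the scripts and styles
    // it references) and serves it next to ordinary backend routes.
    import homepage from "./index.html";

    Bun.serve({
      routes: {
        "/": homepage,                                       // bundled frontend
        "/api/hello": () => Response.json({ hello: "bun" }), // backend endpoint
      },
    });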
I tried several times to port Node projects to Deno. Each time compatibility had "improved" but I still didn't have a working build after a few days of effort.
I don't know how Deno is today. I switched to Bun and porting went a lot smoother.
Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.
My team has been using it in prod for about a year now. There were some minor bugs in the runtime's implementation of buffers in 1.22 (?), but that was about the only issue we ran into.
The nice things:
1. It's fast.
2. The standard library is great. (This may be less of an advantage over Deno.)
3. There's a ton of momentum behind it.
4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.
5. I don't have to think about JSR.
The warts:
1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.
Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.
> Bun seems to use a different runtime (JSC) which is less tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?).
JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store regulations the only JS engine supposedly allowable in all of iOS.
It's more "mature" than V8 in terms of predating it. (V8 was not a fork of it and was started from scratch, but V8 was designed to replace it in the Blink fork from WebKit.)
It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.
I’ve been using Deno too. Although npm support has improved and it’s fine for me, I think Deno has more of a “rewrite the world” philosophy. For example, they created their own package registry [1] and their own web framework [2]. Bun seems much more focused on preexisting JavaScript projects.
It's interesting that people have directly opposite opinions on whether Deno or Bun are meant to be used with the existing ecosystem - https://news.ycombinator.com/item?id=46125049
I don’t think these are mutually exclusive takes. Bun is essentially taking Node and giving it a standard library and standard tooling. But you can still use regular node packages if you want. Whereas Deno def leaned into the clean break for a while
At this stage I don't think either is better than the other. Deno has inexplicably high memory usage issues in Linux containers. Bun more or less suffers from the same, with an added dose of segfaults.
1. https://github.com/denoland/deno/issues?q=is%3Aissue%20state... 2. https://github.com/oven-sh/bun/issues?q=is%3Aissue%20state%3...
Node.js is a no-brainer for anyone shipping a TS/JS backend. I'd rather deal with poor DX and slightly worse performance than risk fighting runtime-related issues on deployment.
Linux needs to be a first-class citizen for any runtime/language toolchain.
I really want to like Deno and will likely try it again, but last time I did it was just a bit of a pain anytime I wanted to use something built for npm (which is most packages out there), whereas bun didn't have that problem.
There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.
Edit: Also, it has a nice standard library. Not a huge win, because that stuff is also doable in Deno, but again, it's just a bit less painful.
Looking at Bun's website (the comparison table under "What's different about Bun?") and what people have said here, the only significant benefit of Bun over Node.js seems to be that it's more batteries-included - a bigger standard library, more tools, some convenience features like compiling JSX and stripping TypeScript types on-the-fly, etc.
It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.
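For what it's worth, the batteries-included pitch is mostly about things like this (trivial example, run directly with `bun app.ts`):

    // app.ts - Bun strips the type annotations on the fly, so this runs
    // with no tsc step, bundler, or build config.
    interface Greeting {
      name: string;
    }

    function greet(g: Greeting): string {
      return `Hello, ${g.name}!`;
    }

    console.log(greet({ name: "Bun" }));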
Same. I had a little library I wrote to wrap IndexedDB, and Deno wouldn't even compile it because it referenced those browser APIs. I'm sure it's a simple flag or config file property, or x, or y, or z, but the simple fact is, Bun didn't fail to compile.
Between that and the Discord, I have gotten the distinct impression that Deno is for "server JavaScript" first, rather than just "JavaScript" first. Which is understandable, but not very catering to me, a frontend-first dev.
Is JSC less tested? I thought it was used in Safari, which has some market share.
I used Bun briefly to run the output of my compiler, because it was the only JavaScript runtime that did tail calls. But I eventually added a tail-call transform to my compiler and switched to Node, which runs 40% faster for my test case (the compiler building itself).
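For context, JSC is the only major engine that shipped ES2015 proper tail calls; this is roughly the kind of code that benefits (illustrative):

    // In strict-mode code (modules are strict by default), JavaScriptCore
    // reuses the stack frame for a call in tail position, so this runs in
    // constant stack space. Engines without proper tail calls, like V8,
    // overflow the stack for large n.
    function count(n: number, acc: number = 0): number {
      if (n === 0) return acc;
      return count(n - 1, acc + 1); // tail call
    }

    console.log(count(1_000_000));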
I've found it to be at least twice as fast with practically no compat issues.
You might want to revise what you consider to be "absolutely zero chance". Bun has an insanely fast startup time, so it definitely can be true for small workloads. A classic example of this was on Bun's website for a while[1] - it was "Running 266 React SSR tests faster than Jest can print its version number".
Keep in mind that it's not just a matter of comparing the JS engine. The runtime that is built around the engine can have a far greater impact on performance than the choice of v8 vs. JSC vs. anything else. In many microbenchmarks, Bun routinely outperforms Node.js and Deno in most tasks by a wide margin.
I find comments like this fascinating, because you're implicitly evaluating a counterfactual where Bun was built with Rust (or some other "interesting" language). Maybe Bun would be better if it were built in Rust. But maybe it would have been slower (either at runtime or development speed) and not gotten far enough along to be acquired by one of the hottest companies in the world. There's no way to know. Why did Anthropic choose Bun instead of Deno, if Deno is written in a better language?
Because maybe they reached out to them and they didn't take the money, while the Bun folks' business model wasn't working out?
Who knows?
Besides, how are they going to get back the money spent on the acquisition?
Many times the answer to acquisitions has nothing to do with technology.
> I'll admit I'm somewhat biased against Bun
Why? Genuine question, sorry if it was said/implied in your original message and I missed it.
Good question, hard to say, but I think it's mainly because of Zig. At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.
> At its core Zig is marketed as a competitor to C, not C++/Rust/etc
What gives you this impression?
I directly created Zig to replace C++. I used C++ before I wrote Zig. I wrote Zig originally in C++. I recently ported Chromaprint from C++ to Zig, with nice performance results. I constantly talk about how batching is superior to RAII.
Everyone loves to parrot this "Zig is to C as Rust is to C++" nonsense. It's some kind of mind virus that spreads despite any factual basis.
I don't mean to disparage you in particular, this is like the 1000th time I've seen this.
That's fair, but the word 'biased' felt unusual to describe how they perceive the runtime.
I always figured Bun was the "enterprise software" choice, where you'd want to use Bun tools and libraries for everything and not need to bring in much from the broader NPM library ecosystem.
Deno seems like the better replacement for Node, but it'd still be at risk of NPM supply chain attacks which seems to be the greater concern for companies these days.
If you want to download open source libraries to be used in your Bun project then they will come from npm, at least by default. [1].
So it seems odd to say that Bun is less dependent on the npm library ecosystem.
[1] It’s possible to use jsr.io instead: https://jsr.io/docs/using-packages
Yes, both can pull in open source libraries, and I can't imagine either dropping that ability. Though they do seem to differ in eagerness and competence around Node compatibility, and Bun seems better on that front.
From a long-term design philosophy perspective, Bun seems to want a sufficiently large core and standard library that you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun-specific features won't run on Node. It's the "embrace, extend, ..." approach.
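To make that concrete, a couple of Bun-only calls (my own minimal example; these throw on Node because the Bun global doesn't exist there):

    // Bun-specific globals: fine on Bun, a ReferenceError on Node.
    const pkg = Bun.file("package.json"); // lazy file handle
    console.log(await pkg.json());        // read and parse it
    console.log(Bun.version);             // runtime version string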
Deno seems much more focused on tooling than on expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries, instead of replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.
That’s true of some parts of Deno’s standard libraries, but major functionality like Deno.test and Deno.serve are Deno-specific API’s.
Here are the Bun API’s:
https://bun.com/docs/runtime/bun-apis
Here are the Deno APIs:
Stopped following Deno while they were rejecting the need for a package management solution. Used Bun instead.
They tried to realign package management with web standards and tools that browsers can share (URLs and importmaps and "cache, don't install"). They didn't offer compatibility with existing package managers (notably and notoriously npm) until late in that game and took multiple swings at URL-based package repositories (deno.land/x/ and JSR), with JSR eventually realizing it needed stronger npm compatibility.
Bun did prioritize npm compatibility earlier.
Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.
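For anyone who missed the early style: a dependency was just a URL, fetched and cached on first run, no install step (version and module path here are illustrative of the deno.land era):

    import { serve } from "https://deno.land/std@0.140.0/http/server.ts";

    serve(() => new Response("hello"), { port: 8000 });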
> is there anything that Bun does better?
Telling prospective employees that "if you're not ready to work 60-hour weeks, then what the fuck are you doing here?", for one.
> Zig does force some safety with ReleaseSafe IIRC
which Bun doesn't use, choosing to go with `ReleaseFast` instead.
Is it just me, or is npm really not that slow? Sure, it's not a speed demon, but I rarely need to run npm install anyway, so it's not a bottleneck for me.
For deploys, running the attached Terraform script usually takes more time.
So while a speed increase is welcome, I don't feel it gives me much of a boost.
I've been using Bun since 2022, just to be trendy for recruitment (it worked, and still works despite it almost being 2026).
Bun is fast, and it's worked as a drop-in replacement for npm in large legacy projects too.
I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun wasn't one of them.
https://github.com/aws/aws-cdk/issues/31753
This wasn't fixed until the end of 2024 and, as you can see, the fix was only accidentally merged in but tolerated. It was promptly broken by a Bun breaking change:
https://github.com/aws/aws-cdk/issues/33464
But don't let Amazon's own incompetence be the confirmation bias you were looking for about using a different package manager in production.
You can use SST to deploy cloud resources on AWS (and any cloud), and that package works with Bun.
Anthropic has been trying to win the developer marketshare, and has been quite successful with Claude Code. While I understand the argument that this acquisition is to protect their usage in CC or even just to acquire the team, I do hope that part of their goal is to use this to strengthen their brand. Being good stewards of open source projects is a huge part of how positively I view a company.
> Being good stewards of open source projects is a huge part of how positively I view a company.
Maybe an easier first step would be to open source Claude Code...?
I think because their models are open (e.g. CC can send any instruction and it'll use your Max plan), they need to keep the code obfuscated to prevent people from sending everybody and their mother through that API.
Codex has the opposite issue. It has an open client, which is relatively pointless, because it will accept only one system prompt and one prompt only.
From the comments here it sounds like most people think the amount Anthropic paid for the company was probably not much more than the VC funding which Bun raised.
How would the payout split work? It wouldn’t seem fair to the investors if the founder profited X million while the investors get their original money returned. I understand VC has the expectation that 99 out of 100 of investments will net them no money. But what happens in the cases where money is made, it just isn’t profitable for the VC firm.
What’s to stop everyone from doing this? Besides integrity, why shouldn’t every founder just cash out when the payout is life-changing?
Is there usually some clause in the agreements like “if you do not return X% profit, the founder forfeits his or her equity back to the shareholders”?
All VCs have preferred shares, meaning that in the case of a liquidation like this one, they get their investment back first, and then the remainder gets shared.
Additionally, depending on the round, they may also have multiples, like 2x, meaning they get at least 2x their investment before anyone else gets anything.
Hard to say it makes no sense when you don't know how much they were acquired for. I would guess it is a trivial amount relative to Anthropic's war chest.
I think this acquisition in reality has more to do with developer goodwill? And a little to do with the shell game of making these AI companies hard to value because they collect assets like this.
Bun is pretty cool. I maintain a Node.js library and updated my Node.js engine version and my library just didn't work on the latest version... In frustration, I decided to try Bun for the first time... I had never used it before but my library worked straight away, no warnings, no errors. I have never seen that level of compatibility before when a library works better with an alternative engine than the one it was designed for.
I did end up fixing Node.js compatibility later but it was extra work. Felt like they just created busy-work. Node.js maintainers should stop deprecating perfectly good features and complicating their modules.
Quote from the CEO of Anthropic in March 2025: "I think we'll be there in three to six months where AI is writing 90% of the code and then in 12 months we may be in a world where AI is writing essentially all of the code"
I think this wound up being close enough to true, it's just that it actually says less than what people assumed at the time.
It's basically the Jevons paradox for code. The price of lines of code (in human engineer-hours) has decreased a lot, so there is a bunch of code that is now economically justifiable which wouldn't have been written before. For example, I can prompt several ad-hoc benchmarking scripts in 1-2 minutes to troubleshoot an issue which might have taken 10-20 minutes each by myself, allowing me to investigate many performance angles. Not everything gets committed to source control.
Put another way, at least in my workflow and at my workplace, the volume of code has increased, and most of that increase comes from new code that would not have been written if not for AI, and a smaller portion is code that I would have written before AI but now let the AI write so I can focus on harder tasks. Of course, it's uneven penetration, AI helps more with tasks that are well-described in the training set (webapps, data science, Linux admin...) compared to e.g. issues arising from quirky internal architecture, Rust, etc.
At an individual level, I think it is for some people. Opus/Sonnet 4.5 can tackle pretty much any ticket I throw at it on a system I've worked on for nearly a decade. Struggles quite a bit with design, but I'm shit at that anyway.
It's much faster for me to just start with an agent, and I often don't have to write a line of code. YMMV.
Sonnet 3.7 wasn't quite at this level, but we are now. You still have to know what you're doing, mind you, and there's a lot of ceremony in tweaking workflows, much like there was for editors. It's not much different from instructing juniors.
From the article, Claude Code is being used extensively to develop Bun already.
> Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot. We have it set up in our internal Discord and we mostly use it to help fix bugs. It opens PRs with tests that fail in the earlier system-installed version of Bun before the fix and pass in the fixed debug build of Bun. It responds to review comments. It does the whole thing.
You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
> You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
Yeah but do you really need external hires to do that? Surely Anthropic has enough experienced JavaScript developers internally they could decide how their JS toolchain should work.
Actually, this is thinking too small. There's no reason that each developer shouldn't be able to customize their own developer tools however they want. No need for any one individual to control this, just have devs use AI to spin up their own npm-compatible package management tooling locally. A good day one onboarding task!
"Wasting" is doing a lot of work in that sentence.
They're effectively bringing on a team that's been focused on building a runtime for years. The models they could throw at the problem can't be tapped on the shoulder, and there's no guarantee they'd do a better job at building something like Bun.
Deciding what to implement and implementing the decisions are complementary; one of these is being commoditised.
And, in fact, decimated.
Personally I am benefitting almost beyond measure because I can spend my time as the architect rather than the builder.
Same. I don’t understand how people aren’t getting this yet. I’m spending all day thinking, planning and engineering while spending very little time typing code. My productivity is through the roof. All the code in my commits is of equal quality to what I would produce myself, why wouldn’t it be? Sure one can just ask AI to do stuff and not review it and iterate, but why on earth would one do that? I’m starting to feel that anyone who’s not getting this positive experience simply isn’t good at development to begin with.
Maybe he was correct in the extremely literal sense of AI producing more new lines of code than humans, because AI is no doubt very good at producing huge volumes of Stuff very quickly, but how much of that Stuff actually justifies its existence is another question entirely.
Why do people always stop this quote at the breath? The rest of it says that he still thinks they need tech employees.
> .... and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced
(He then said it would continue improving, but this was not in the 12 month prediction.)
Source interview: https://www.youtube.com/live/esCSpbDPJik?si=kYt9oSD5bZxNE-Mn
I actually like Claude Code, but that was always a risky thing to say (actually I recall him saying their software is 90% AI-produced), considering their CLI tool is literally infested with bugs. (Or at least it was last time I used it heavily. Maybe they've improved it since.)
Is this why everyone only seems to know the first half of Dario's quote? The guy in that video is commenting on a 40 second clip from twitter, not the original interview.
I posted a link and transcription of the rest of his "three to six months" quote here: https://news.ycombinator.com/item?id=46126784
I'm curious what people think of quotes like these. Obviously it makes an explicit, falsifiable prediction. That prediction is false. There are so many reasons why someone could predict that it would be false. Is it just optimistic marketing speech, or do they really believe it themselves?
Everybody knows that marketing speech is optimistic. Which means if you give realistic estimates, then people are going to assume those are also optimistic.
The big picture of “build a runtime” is an easier idea than “what would make this runtime better and how should the parts interact”.
Given the horrible stability of Windows this year, it seems like Microsoft went all in on that
What languages and frameworks? What is the domain space you're operating in? I use Cursor to help with some tasks, but mainly only use the autocomplete. It's great; no complaints. I just don't ever see being able to turn over anywhere close to 90% with the stuff we work on.
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI.
You can see my site here, if you'd like: https://chipscompo.com/
Probably about 95% of mine now. Much better than I could for the most part.
Weird, AI writes terrible code for me that would never pass a code review. I guess people have different standards for good code.
I don't really see how Bun fits as an acquisition for an AI company. This seems more like "we have tons of capital and we want to buy something great" than "Bun is essential to our core business model".
If Anthropic wants to own code development in the future, owning the full platform (including the runtime) makes sense.
Programming languages are all a balance between performance and making it easy for a human to interact with. This balance is going to shift as AI writes more code (and I assume Anthropic wants a future where humans might not even see the code, but rather an abstraction of it... after all, all code we look at is an abstraction on some level).
Even outside of code development, Anthropic seems to be leaning very strongly into code interpreters over native tool calling for advancing agentic LLM abilities (e.g. their "skills" approach). Given that those necessitate a runtime of sorts, owning a runtime like Bun, which could let them integrate that functionality into their products very seamlessly, doesn't seem like the worst idea.
They're baking the LoRA as we speak, and it'll default to `bun install` too.
"the full platform"
There are more languages than TS, though? Acquisition of Apple's Swift division incoming?
TypeScript is the most popular programming language on the most popular software hosting platform though, owning the best runtime for that seems like it would fit Pareto's rule well enough:
https://github.blog/news-insights/octoverse/octoverse-a-new-...
I think there's a potential argument to be made that Anthropic isn't trying to make it easier to write TS code, but rather that their goal is a level higher and the average person wouldn't even know what "language" is running it (in the same way most TS devs don't need to care the many layers their TS code is compiled via).
According to a JetBrains dev survey (I forget the year) roughly 58% of devs deploy to the web. That's a big money pie right there.
It doesn't make sense, and you definitely didn't say why it'd make sense... but enough people are happy enough to see the Bun team reach an exit (especially one that doesn't kill Bun) that I think the narrative that it makes sense will win out.
I see it as two hairy things canceling out: the accelerating trend of the JS ecosystem being hostage to VCs and Rauch is nonsensical, but this time a nonsensical acquisition is closing the loop as neatly as possible.
(actually this reminds me of Harry giving Dobby a sock: on so many levels!)
Claude Code running on Bun is an obvious justification, but Bun's features (high-performance runtime, fast starts, native TS) are also important for training and inference. For instance, in inference you develop a logical model in code that maps to a reasoning sequence, then execute the code to validate and refine the model, then use this to inform further reasoning. Bun, which is highly integrated and highly focused on performance, is an ideal fit for this. Having Bun in house means you can use the feedback from all of that automation-driven execution of Bun to drive improvements to its core.
Looks like they are acquiring the team rather than the product
No, they're clearly acquiring the technology. They're betting Claude Code on Bun; they have a vested interest in the health of Bun.
Why would they want to bet on nascent technology when Node.js has existed for a good 15 years?
That was my thinking: this would be useful for Claude Code.
It does actually.
Claude Code is a 1B+ cash machine and Anthropic directly uses Bun for it.
Acquiring Bun lowers the risk of the software being unmaintained as Bun made $0 and relied on VC money.
Makes sense, but this is just another day in San Francisco of a $0 revenue startup being bought out.
Does this acquisition mean Claude Code the CLI is more valuable than the entirety of Bun?
No, just that the people who lent Bun 7 million dollars want some of it back...
What is the business model behind open source projects like Bun? How can a company "acquire" it, and why does it do that?
In the article they write about the early days
We raised a $7 million seed round
Why do investors invest in people who build something that they give away for free?
The post mentions why: Bun eventually wanted to provide some sort of cloud-hosting SaaS product.
The standard argument here is that the maintainers of the core technology are likely to do a better job of hosting it because they have deeper understanding of how it all works.
There's also the trick Deno has been trying, where they can use their control of the core open source project to build features that uniquely benefit their cloud hosting: https://til.simonwillison.net/deno/deno-kv#user-content-the-...
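The KV feature in question looks roughly like this (a sketch based on the linked TIL; Deno-specific and marked unstable, and on Deno Deploy the same API is backed by their hosted store):

    // A built-in key-value store, no external database required locally.
    const kv = await Deno.openKv();
    await kv.set(["users", "alice"], { role: "admin" });
    const entry = await kv.get<{ role: string }>(["users", "alice"]);
    console.log(entry.value); // { role: "admin" }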
Hosting is a commodity. Runtimes are too. In this case, the strategy is to make a better runtime, attract developers, and eventually give them a super easy way to run their project in the cloud, e.g. `bun deploy`, which is a reserved no-op command. I really like Bun's DX.
Yep. This strategy can work, and it has also backfired before, like with Docker trying to monetize something they gave away for free.
Free now isn't free forever. If something has inherent value then folks will be willing to pay for it.
Anyone know how much Anthropic paid for Bun? I assume it was at least $26M, so Bun could break even and pay back its own investors, but I didn't see a number in the announcements from Anthropic or Bun.
"Node.js compatibility & replacing Node.js as the default server-side runtime for JavaScript"
Except Node's author already wrote its replacement: Deno.
I’ll be honest, while I have my doubts about the match of interests and cohesion between an AI company and a JS runtime company I have to say this is the single best acquisition announcement blog post I’ve seen in 20 years or so.
Very direct, very plain and detailed. They cover all the bases about the why, the how, and what to expect. I really appreciate it.
Best of luck to the team and hopefully the new home will support them well.