Anthropic acquires Bun
(bun.com) | 1856 points by ryanvogel 17 hours ago
Java was doing "cloud-native, stripped down (jlink) image, self-contained runtime with batteries included" long before Bun existed. There's also GraalVM for one executable binary if one's ambitious.
Yea, they just posted this a few days ago:
https://www.anthropic.com/engineering/advanced-tool-use
They discussed how running generated code is better for context management in many cases. The AI can generate code to retrieve, process, and filter the data it needs rather than doing it in-context, thus reducing context needs. Furthermore, if you can run the code right next to the server where the data is, it's all that much faster.
I see Bun like a Skynet: if it can run anywhere, the AI can run anywhere.
It’s relevant enough that I feel I can roll out this bash.org classic…
<Alanna> Saying that Java is nice because it works on all OS's is like saying that anal sex is nice because it works on all genders
EDIT: someone has (much to my joy) made an archive of bash.org so here is a link[1], but I must say I’m quite jealous of today’s potential 1/10,000[2] who will discover bash.org from my comment!
Not in the browser, and no – webassembly doesn't count, otherwise you can say the same about Go and others.
AI tools value simplicity, fast bootstrapping, and quick iteration; this rules out the JVM, which has the worst build system and package repositories I've ever had the displeasure of needing to use. Checking in Gradle binaries in 2025? Having to wait days for packages to sync? Windows/Linux Gradle wrappers in every project? Broken builds and churn after every major upgrade. It's broken beyond repair.
By contrast `bun install` is about as good as it gets.
run code anywhere hamstrung by 90s syntax and hidden code indirections
Under "Programmatic Tool Calling"
> The challenge
> Traditional tool calling creates two fundamental problems as workflows become more complex:
> Context pollution from intermediate results: When Claude analyzes a 10MB log file for error patterns, the entire file enters its context window, even though Claude only needs a summary of error frequencies. When fetching customer data across multiple tables, every record accumulates in context regardless of relevance. These intermediate results consume massive token budgets and can push important information out of the context window entirely.
> Inference overhead and manual synthesis: Each tool call requires a full model inference pass. After receiving results, Claude must "eyeball" the data to extract relevant information, reason about how pieces fit together, and decide what to do next—all through natural language processing. A five tool workflow means five inference passes plus Claude parsing each result, comparing values, and synthesizing conclusions. This is both slow and error-prone.
Basically, instead of Claude trying to, e.g., process data by using inference from its own context, it would offload to some program it specifically writes. Up until today we've seen Claude running user-written programs. This new paradigm allows it the freedom to create a program it finds suitable in order to perform the task, and then run it (within confines of a sandbox) and retrieve the result it needs.
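The idea above can be sketched concretely. This is a hypothetical, made-up example (the log lines and function are illustrative, not Anthropic's actual implementation): rather than pushing a large log file into the model's context, the model writes and runs a small script that returns only the summary it needs.

```javascript
// Hypothetical sketch of "programmatic tool calling": the model generates a
// script like this, runs it in a sandbox, and only the tiny summary (not the
// raw log) goes back into its context window. Log lines are made up.
const log = [
  "2025-11-01 12:00:01 ERROR TimeoutError: upstream timed out",
  "2025-11-01 12:00:02 INFO request served",
  "2025-11-01 12:00:03 ERROR TimeoutError: upstream timed out",
  "2025-11-01 12:00:04 ERROR NullPointerError: missing field",
].join("\n");

function errorFrequencies(text) {
  const counts = {};
  for (const line of text.split("\n")) {
    const m = line.match(/ERROR (\w+):/); // keep only the error class
    if (m) counts[m[1]] = (counts[m[1]] ?? 0) + 1;
  }
  return counts;
}

// Only this summary re-enters the model's context, not the 10MB log.
console.log(errorFrequencies(log)); // { TimeoutError: 2, NullPointerError: 1 }
```

The same pattern generalizes to the multi-table customer-data case: the generated code joins and filters records locally, and only the final answer is surfaced to the model.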
Yea - if you want a paranoidly-sandboxed, instant-start, high-concurrency environment, not just on beefy servers but on resource-constrained/client devices as well, you need experts in V8 integration shenanigans.
Cloudflare Workers had Kenton Varda, who had been looking at lightweight serverless architecture at Sandstorm years ago. Anthropic needs this too, for all the reasons above. Makes all the sense in the world.
Bun isn't based on V8, it's JavaScriptCore, but your point still stands.
you left out the best part...what happened to Kenton? He looked at lightweight serverless architecture..and then what?
> Yea - if you want a paranoidly-sandboxed, instant-start, high-concurrency environment, not just on beefy servers but on resource-constrained/client devices as well, you need experts in V8 integration shenanigans.
To be honest, that sounds more like a pitch for deno than for bun, especially the “paranoidly sandboxed” part.
It's fine, but why is JS a good language for agents? I mean, sure, it's faster than Python, but wouldn't something that compiles to native be much better?
JS has the fastest, most robust and widely deployed sandboxing engines (V8, followed closely by JavaScriptCore which is what Bun uses). It also has TypeScript which pairs well with agentic coding loops, and compiles to the aforementioned JavaScript which can run pretty much anywhere.
Note that "sandboxing" in this case is strictly runtime sandboxing - it's basically like having a separate process per event loop (as if you ran separate Node processes). It does not sandbox the machine context in which it runs (i.e. it's not VM-level containment).
> It also has TypeScript which pairs well with agentic coding loops, (...)
I've heard that TypeScript is pretty rough on agentic coding loops because the idiomatic static type assertion code ends up requiring huge amounts of context to handle in a meaningful way. Is there any truth to it?
> It also has TypeScript which pairs well with agentic coding loops
The language syntax has nothing to do with it pairing well with agentic coding loops.
Considering how close TypeScript and C# are syntactically, and C#'s speed advantage over JS among many other things, you'd expect C# to be the main language for building agents. It is not, and that's because the early SDKs were JS and Python.
It's widespread and good enough. The language just doesn't matter that much in most cases
This is one of those, "in theory, there's no difference between theory and practice. In practice, there is" issues.
In theory, quality software can be written in any programming language.
In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do. And you can see the downstream effects of this in the difference in production-grade developer experience and the quality of packages on offer in PyPI and npm versus Maven and NuGet.
> strong typing is very very useful for AI coding context
what makes you think so?
I believe strong typing is very very useful for human coding,
I'm not convinced it's so 'very very' for agents.
I'm not confused about the acquisition but about the investment. What were the investors thinking? This is an open source development tool with (to date) $0 of revenue and not even the beginnings of a plan for getting any.
The acquisition makes more sense. A few observations:
- No acquisition amount was announced. That indicates some kind of share swap, where the investors exchange shares in one company for shares in another. Presumably the founder now has some shares in Anthropic and a nice salary and vesting structure that will keep him on board for a while.
- The main investor was Kleiner Perkins. They are also an investor in Anthropic. 100M in the last round, apparently.
Everything else is a loosely buzzword-compatible thingy for Anthropic's AI coding thingy, plus some fresh talent for their team. All good, but it's beside the point. This was an investor bailout. They put quite a bit of money into Bun with exactly zero remaining chance of it turning into the next unicorn. Whatever flaky plan for revenue might once have justified the investment clearly wasn't happening. So they liquidated their investment through an acquihire via one of their other investments.
Kind of shocking how easy it was to raise that kind of money with essentially no plan whatsoever for revenue. Where I live (Berlin), you get laughed away by investors (in a quite smug way typically) unless you have a solid plan for making them money. This wouldn't survive initial contact with due diligence. Apparently money still grows on trees in Silicon Valley.
I like Bun and have used it but from where I'm sitting there was no unicorn lurking there, ever.
Isn't what you're describing just a set of APIs with native bindings that the LLM can call?
I'm not sure I understand why it's necessary to even couple this to a runtime, let alone own the runtime?
Can't you just do it as a library and train/instruct the LLM to prefer using that library?
Mostly, just Jarred Sumner makes it worth it for Anthropic.
Could also be a way to expand the customer for Claude Code from coding assistant to vibe coding, a la Replit creating a hosted app. CC working more closely with Bun could make all that happen much faster:
> Our default answer was always some version of "we'll eventually build a cloud hosting product.", vertically integrated with Bun’s runtime & bundler.
>Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases
100%. even more robust if paired with an overlay network which provides identity based s3 access (rather than ip address/network based). else server may not have access to s3/cloud resource, at least for many enterprises with s3 behind vpn/direct connect.
ditto for cases when want agent/client side to hit s3 directly, bypassing the server, and agent/client may not have permitted IP in FW ACL, or be on vpn/wan.
The writeup makes it sound like an acquihire, especially the "what changes" part.
ChatGPT is feeling the pressure of Gemini [0]. So it's a bit strange for Anthropic to be focusing hard on its javascript game. Perhaps they see that as part of their advantage right now.
[0] https://timesofindia.indiatimes.com/technology/tech-news/goo...
That's a really cool use case and seems super helpful. Working cloud-native is a chore sometimes, having to fiddle with internal APIs and ACL/permissions issues.
This matches some previous comments around LLMs driving adoption of programming languages or frameworks. If you ask Claude to write a web app, why not have it use your own framework, that it was trained on, by default?
Users are far more likely to ask it about shadcn, or material, than about node/deno/bun. So, what is this about?
Currently Claude etc. can interact with services (including AWS) via MCPs.
What the user you're replying to is saying is that the Bun acquisition looks silly if you see Bun as just a dev tool for Node. However, if you look at their binding work for services like S3[0], the LLM will be able to interact with cloud services directly (lower latency, tighter integration, simplified deployment).
An AI company scoops up frontend tech. Do you really think it was because of s3?
As a commandline end user who prefers to retrieve data from the www as text-only, I see deno and bun as potential replacements (for me, not necessarily for anyone else) for the so-called "modern" browser in those rare cases where I need to interpret Javascript^1
At present the browser monstrosity is used to (automatically, indiscriminately) download into memory and run Javascripts from around the web. At least with a commandline web-capable JS runtime monstrosity the user could in theory exercise more control over what scripts are downloaded and if and when to run them. Perhaps more user control over permissions to access system resources as well (cf. corporate control)
1. One can already see an approach something like this being used in the case of
https://github.com/yt-dlp/yt-dlp/wiki/EJS
where a commandline JS runtime is used without the need for any graphics layer (advertising display layer)
> At the time of writing, Bun's monthly downloads grew 25% last month (October, 2025), passing 7.2 million monthly downloads. We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
I believe this completely. They didn't have to join, which means they got a solid valuation.
> Instead of putting our users & community through "Bun, the VC-backed startups tries to figure out monetization" – thanks to Anthropic, we can skip that chapter entirely and focus on building the best JavaScript tooling.
I believe this a bit less. It'll be nice to not have some weird monetization shoved into bun, but their focus will likely shift a bit.
> They didn't have to join, which means they got a solid valuation.
Did they? I see a $7MM seed round in 2022. Now to be clear that's a great seed round and it looks like they had plenty of traction. But it's unclear to me how they were going to monetize enough to justify their $7MM investment. If they continued with the consultancy model, they would need to pay back investors from contracts they negotiate with other companies, but this is a fraught way to get early cashflow going.
Though if I'm not mistaken, Confluent did the same thing?
They had a second round that was $19m in late 2023. I don't doubt for a second that they had a long runway given the small team.
I don't like all of the decisions they made for the runtime, or some of the way they communicate over social media/company culture, but I do admire how well-run the operation seems to have been from the outside. They've done a lot with (relatively) little, which is refreshing in our industry. I don't doubt they had a long runway either.
Thanks I scrolled past that in the announcement page.
With more runway comes more investor expectations too though. Some of the concern with VC backed companies is whether the valuation remains worthwhile. $26mm in funding is plenty for 14 people, but again the question is whether they can justify their valuation.
Regardless happy for the Oven folks and Bun has been a great experience (especially for someone who got on the JS ecosystem quite late.) I'm curious what the structure of the acquisition deal was like.
> They didn't have to join, which means they got a solid valuation.
This isn't really true. It's more about who wanted them to join. Maybe it was Anthropic who really wanted to take over Bun/hire Jarred, or it was Jarred who got sick of Bun and wanted to work on AI. I don't really know any details about this acquisition, and I assume it's the former, but acquihires are also done for other reasons than "it was the only way".
Can't edit my comment anymore but Bun posted a pretty detailed explanation of their motivation here: https://bun.com/blog/bun-joins-anthropic
Sounds like "monetizing Bun is a distraction, so we're letting a deep-pocketed buyer finance Bun moving forward".
Anthropic is still a new company and so far they seem "friendly". That being said, I still feel this can go either way.
Yep. Remember when "Open"AI took a bunch of grant money and then turned for-profit?
And kept their fraudulent name.
> I believe this a bit less.
They weren't acquired and paid just to keep building tooling as before while ignoring monetization until the end of time.
This is my fear. It's one thing to lose a major sponsor. It's another to get cut due to a focus on profitability later down the line.
Yeah, now they are part of Anthropic, who haven't figured out monetization themselves. Yikes!
I'm a user of Bun and an Anthropic customer. Claude Code is great and it's definitely where their models shine. Outside of that, Anthropic sucks: their apps and web UI are complete crap, borderline unusable, and the models are just meh. I get it: Claude Code's head probably got to make a power play here, given that his department is carrying the company and his secret sauce, according to marketing from Oven, was Bun. In fact, VS Code's Claude backend is distributed as a bun-compiled binary exe, and the guy has been featured on the front page of the Bun website for at least a week or so. So they bought the kid the toy he asked for.
Anthropic needs urgently, instead, to acquire a good team behind a good chatbot and make something minimally decent. Then make their models work for everything else as well as they do with code.
Not sure if that counts as "figured out monetization" when no AI company is even close to being profitable -- being able to get some money for running far more expensive setups is not nothing, but also not success.
"We were maybe gonna fuck ya, buy now we promise we wont"
I am more shocked about the origin story compared to the acquisition.
> Almost five years ago, I was building a Minecraft-y voxel game in the browser. The codebase got kind of large, and the iteration cycle time took 45 seconds to test if changes worked. Most of that time was spent waiting for the Next.js dev server to hot reload.
Why in the hell would anyone be using Next.js to make a 3D game... Jarred has always seemed pretty smart, but this makes no sense. He could've saved so much time and avoided building a whole new runtime by simply not using the completely wrong tool for the job.
Most people use what they know. You start out that way, and if it turns out to be good, you can always do a v2
Yes, but there are obvious limits to that. This is like someone who knows how to bake wanting to build a car, so they start making it out of dough.
Because he wanted to? Do you also berate the choices of people in the 4K demo scene for using too little memory?
He may have been serving a game in a canvas hosted in a Next.js app, but have done all the actual game (rendering, simulation, etc.) in something else. That’s a decent approach - Next can handle the header of the webpage and the marketing blog or whatever just fine.
My point isn’t that you absolutely need that, just that the negative effect on your game development are pretty minimal if you’re not leaning on the SPA framework for anything related to the game. If your game is going to be embedded into an otherwise normal-ish website, this isn’t a terrible way to go (I’ve done it personally with a game mostly written in Rust and compiled to WASM). You can get gains by splitting your game and web site bundles and loading the former from the latter explicitly, but they’re not massive if your bundler was already reasonably incremental (or was already esbuild).
Thanks for assuming I “read” about bundlers somewhere, though. I’ve been using (and configuring) them since they existed.
index.html with script files would still benefit from a bundler. You can have a very minimal react footprint and still want to use react build tools just for bundling.
> Anthropic has direct incentive to keep Bun excellent.
Huh, this feels very odd to read, and buying a company outright is definitely not the only way to push Bun to be excellent. Contributing to Bun with their own developers, becoming a sponsor, donating through other means, buying 'consulting services' or similar, or even forking it and keeping it up to date would all also be steps toward keeping Bun excellent.
This is vendoring a dependency on steroids, and the first moment the community's interests aren't aligned with what Anthropic needs, it will be interesting to see how this unfolds. History has taught us that this will end up with the claims in the blog post not holding much weight.
I'm sort of surprised to see that you used Claude Code so much. I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc. And I know Bun started with an extreme attention to detail around performance.
I would have thought LLM-generated code would run a bit counter to both of those. I had sort of carved the world into "vibe coders" who care about the eventual product but don't care so much about the "craft" of code, and people who get joy out of the actual process of coding and designing beautiful abstractions and data structures and all that, which I didn't really think worked with LLM code.
But I guess not, and this definitely causes me to update my understanding of what LLM-generated code can look like (in my day to day, I mostly see what I would consider as not very good code when it comes from an LLM).
Would you say your usage of Claude Code was more "around the edges", doing things like writing tests and documentation and such? Or did it actually help in real, crunchy problems in the depths of low level Zig code?
I am not your target with this question (I don't write Zig) but there is a spectrum of LLM usage for coding. It is possible to use LLMs extensively but almost never ship LLM generated code, except for tiny trivial functions. One can use them for ideation, quick research, or prototypes/starting places, and then build on that. That is how I use them, anyway
Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
> Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
Anyone who has spent time working with LLMs knows that the LinkedIn-style vibecoding where someone writes prompts and hits enter until they ship an app doesn't work.
I've had some fun trying to coax different LLMs into writing usable small throwaway apps. It's hilarious, in a way, to see the contrast between what an experienced developer sees coming out of LLMs and what the LinkedIn and Twitter influencers are saying. If you know what you're doing and you have enough patience, you really can get an LLM to do a lot of the things you want, but it can require a lot of handholding, rejecting bad ideas, and reviewing.
In my experience, the people pushing "vibecoding" content are influencers trying to ride the trend. They use the trend to gain more followers, sell courses, get the attention of a class of investors desperate to deploy cash, and other groups who want to believe vibecoding is magic.
I also consider them a vocal minority, because I don't think they represent the majority of LLM users.
I'll give you a basic example where it saved me a ton of time to vibe code instead of doing it myself, and I believe it would hold true for anyone.
Creating ~50 different types of calculators in JavaScript. Gemini can bang out in seconds what would take me far longer (and it's reasonable at basic Tailwind-style front-end design to boot). A large amount of work smashed down to a couple of days of cumulative instruction + testing in my spare time. It takes far longer to think of how I want something to function in this example than it does for Gemini to successfully produce it. This is a use case where something like Gemini 3 is exceptionally capable, and far exceeds the capability requirements needed to produce a decent outcome.
Do I want my next operating system vibe coded by Gemini 3? Of course not. Can it knock out front-end JavaScript tasks trivially? Yes, and far faster than any human could ever do it. Classic situation of using a tool for the things it's particularly well suited for.
Here's another one. An SM-24 Geophone + Raspberry Pi 5 + ADC board. Hey Gemini / GPT, I need to build bin files from the raw voltage figures + timestamps, then using Flask I need a web viewer + conversion of the geophone velocity figures into displacement and acceleration. Properly instructed, they'll create a highly functional version of that, with some adjustments/iteration, in 15-30 minutes. I basically had them recreate REW's RTA mode for my geophone velocity data, and there's no way a person could do it nearly as fast. It requires some checking and iteration, and that's assumed in the comparison.
Yeah I had OpenAI crank out 100 different fizzbuzz implementations in a dozen seconds—-and many of them worked! No chance a developer would have done it that fast, and for anyone who needs to crank out fizzbuzz implementations at scale this is the tool to beat. The haters don’t know what they’re talking about.
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.
I feel like an important step for a language is when people outside of the mainline language culture start using it in anger. In that respect, Zig has very much "made it."
That said, if I were to put on my cynical hat, I do wonder how much of that Anthropic money will be donated to the Zig Software Foundation itself. After all, throwing money at maintaining and promoting the language that powers a critical part of their infrastructure seems like a mutually beneficial arrangement.
Handmade Cities founder here.
We never associated with Bun other than extending an invitation to rent a job booth at a conference: this was years ago when I had a Twitter account, so it's fair if Jarred doesn't remember.
If Handmade Cities had the opportunity to collaborate with Bun today, we would not take it, even prior to this acquisition. HMC wants to level up systems while remaining performant, snappy and buttery smooth. Notable examples include File Pilot [0] or my own Terminal Click (still early days) [1], both coming from bootstrapped indie devs.
I'll finish with a quote from a blog post [2]:
> Serious Handmade projects, like my own Terminal Click, don’t gain from AI. It does help at the margins: I’ve delegated website work since last year, and I enjoy seamless CI/CD for my builds. This is meaningful. However, it fails at novel problems and isn’t practical for my systems programming work.
All that said, I congratulate Bun even as we disagree on philosophy. I imagine it's no small feat getting acquired!
I find this comment interesting: the parent comment didn't suggest any past association, yet this reply seemingly uses the project reference as a pivot point for outgroup counter-signaling / negging Bun?
I might be missing some context. Just to check my understanding: HMC and Bun aren't a good match anymore because Bun devs use LLM/AI tooling more than HMC does? Basically, really leveling up a system is incompatible with these tools? (IYHO)
Thank you! I appreciated this clarifying write-up.
back in my day we used to write code on punch cards.
"exquisitely hand-written"
This sounds so cringe. We are talking about computer code here lol
Bespoke handcrafted ethically sourced all natural cruelty free source code
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.
In my experience, the extreme anti-LLM people and extreme pro-vibecoding people are a vocal online minority.
If you get away from the internet yelling match, the typical use case for LLMs is in the middle. Experienced developers use them for some small tasks and also write their own code. They know when to switch between modes and how to make the most of LLMs without deferring completely to their output.
Most of all: they don't go around yelling about their LLM use (or anti-use) because they're not interested in the online LLM wars. They just want to build things with the tools available.
Yep. And there's a lot of people making use of LLMs in both coding and learning/searching doing exactly that.
One of my favorite things is describing a bug to an LLM and asking it to find possible causes. It's helped track something down many times, even if I ultimately coded the fix.
More people should have such a healthy approach, not only to LLMs but to life in general. Same reason I partake less and less in online discourse: it's so tribal and filled with anger that it's just not worth contributing anymore. Learning how to be in the middle did wonders for me as a programmer, and I think as a person as well.
I'm not sure about exquisite and small.
Bun genuinely made me doubt my understanding of what good software engineering is. Just take a look at their code, here are a few examples:
- this hand-rolled JS parser of 24k dense, memory-unsafe lines: https://github.com/oven-sh/bun/blob/c42539b0bf5c067e3d085646... (this is a version from quite a while ago to exclude LLM impact)
- hand-rolled re-implementation of S3 directory listing that includes "parsing" XML via hard-coded substrings https://github.com/oven-sh/bun/blob/main/src/s3/list_objects...
- MIME parsing https://github.com/oven-sh/bun/blob/main/src/http/MimeType.z...
It goes completely contrary to a lot of what I think is good software engineering. There is very little reuse, everything is ad-hoc, NIH-heavy, verbose, seemingly fragile (there's a lot of memory manipulation interwoven with business logic!), with relatively few tests or assurances.
And yet it works on many levels: as a piece of software, as a project, as a business. Therefore, how can it be anything but good engineering? It fulfils its purpose.
I can also see why it's a very good fit for LLM-heavy workflows.
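For readers curious what "parsing XML via hard-coded substrings" looks like in practice, here is a minimal sketch in plain JS (illustrative only, not Bun's actual code): pulling every `<Key>...</Key>` value out of an S3 ListObjects-style response by scanning for the literal tag strings.

```javascript
// Hand-rolled substring "parsing" of an S3-style XML listing.
// Fast and dependency-free, but brittle: breaks on CDATA, entities,
// or attributes inside the tag. That trade-off is the point of the
// critique above.
const xml =
  "<ListBucketResult>" +
  "<Contents><Key>a.txt</Key></Contents>" +
  "<Contents><Key>b/c.png</Key></Contents>" +
  "</ListBucketResult>";

function extractKeys(body) {
  const keys = [];
  let i = 0;
  for (;;) {
    const start = body.indexOf("<Key>", i);
    if (start === -1) break;
    const end = body.indexOf("</Key>", start);
    keys.push(body.slice(start + 5, end)); // 5 === "<Key>".length
    i = end;
  }
  return keys;
}

console.log(extractKeys(xml)); // [ 'a.txt', 'b/c.png' ]
```

A general-purpose XML parser would handle the edge cases but pull in dependencies and overhead, which is exactly the NIH-versus-reuse tension discussed above.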
I can't speak as much about the last two examples, but writing a giant parser file is pretty common in Zig from what I've seen. Here's Zig's own parser, for example[1]. I'm also not sure what you mean by memory unsafe, since all slices have bounds checks. It also looks like this uses an arena allocator, so lifetime tracking is pretty simple (dump everything onto the allocator, and copy over the result at the end). Granted, I could be misunderstanding the code, but that's the read I get of it.
[1] https://codeberg.org/ziglang/zig/src/commit/be9649f4ea5a32fd...
What are your thoughts on using AI generated cartoons as your primary marketing material on social media? For instance https://xcancel.com/bunjavascript/status/1955893818529866055...
Are you at liberty to divulge how much Anthropic paid for Bun?
Amazing news, congrats! Been using Bun for a long while now and I love it.
Is there anything I could do to improve this PR/get a review? I understand you are def very busy right now with the acquisition, but wanted to give my PR the best shot:
Isn't that still "acqui-hiring" according to common usage of the term?
Sometimes people use the term to mean that the buyer only wants some/all of the employees and will abandon or shut down the acquired company's product, which presumably isn't the case here.
But more often I see "acqui-hire" used to refer to any acquisition where the expertise of the acquired company is the main reason for the acquisition (rather than, say, an existing revenue stream), and the buyer intends to keep the existing team dynamics.
I think it’s an acquihire, and they also like Bun.
Why can't you make CLI autocompletions work? It's so basic, but the ticket has languished for almost as long as bun has existed!
Thanks, Jarred. Seeing what you built with Bun has been a real inspiration, the way one focused engineer can shift an entire ecosystem. It pushed me back into caring about the lower-level side of things again, and I’m grateful for that spark. Congrats on the acquisition, and excited to see what’s next
Hi Jarred. Congratulations on the acquisition! Did (or will) your investors make any profit on what they put into Bun?
Probably not. When we add new APIs in Bun, we generally base the interface off of popular existing packages. The bar is very high for a runtime to include libraries because the expectation is to support those APIs ~forever. And I can’t think of popular existing JS libraries for these things.
Hi Jarred,
I contributed to Bun one time for SQLite. I've a question about the licensing. Will each contributor continue to retain their copyright, or will a CLA be introduced?
Thanks
With Bun's existing OSS license and contribution model, all contributors retain their copyright and Bun retains the license to use those contributions. An acquisition of this kind cannot change the terms under which prior contributions were made without explicit agreement from all contributors. If Bun did switch to a CLA in the future, just like with any OSS project, that would only impact future contributions made after that CLA went into effect and it depends entirely on the terms established in that hypothetical CLA.
Hello, thank you, but that doesn't answer my question. I'm not asking for a definition, but for information about licensing decisions for the future of Bun.
Does this acquisition preclude implementing an s3 style integration for AWS bedrock? Also is IMDSv2 auth on the roadmap?
Hi Jarred, thanks for all your work on Bun.
I know that one thing you guys are working on or are at least aware of is the size of single-file executables. From a technical perspective, is there a path forward on this?
I'm not familiar with Bun's internals, but in order to get the size down, it seems like you'd have to somehow split up/modularize Bun itself and potentially JavaScriptCore as well (not sure how big the latter is). That way only the things that are actually being used by the bundled code are included in the executable.
Is this even possible? Is the difficulty on the Bun/Zig side of things, or JSC, or something else? Seems like a very interesting (and very difficult) technical problem.
Any chance there will be some kind of updating mechanism for 'compiled' bun executables?
I have a PR that's been sitting for a while that exposes the extra options from the renameat2 and renameatx_np syscalls, which are a good way to implement self-updaters that work even when multiple processes are updating the same path on disk at the same time. These syscalls are supported on Linux and macOS, but I don't think there's an equivalent on Windows. We use them internally for `bun install` to make adding packages to the global install cache work when multiple `bun install` processes are running simultaneously
No high-level self updater api is planned right now, but yes for at least the low level parts needed to make a good one
Yeah why are you not out on a boat somewhere enjoying this moment? Go have fun please.
Acquisitions typically come with additional stipulations you have to follow; they probably have new deadlines and some temporary stress for the next few months.
yes, acquisitions rarely result in an immediate cash payout.
Any thoughts on the claude "soul document" that was leaked this week?
I wonder if this is a sign of AI companies trying to pivot?
> Bun will ship faster.
That'll last until FY 2027. This is an old lie that acquirers encourage the old owner to repeat, because the owner has no power to enforce it, and the acquirer never actually said it themselves, so they're not on the hook. It's practically a cheesy pickup line, and given the context, it kind of is one.
"Insanity Is Doing the Same Thing Over and Over Again and Expecting Different Results"
I find it a little sad that there is almost no pushback on what a few people with deep pockets are trying to sell here. Normally on HN an article on balcony gardening would be met with more critical thinking than this piece. Maybe instead of staring at the screen all day long, take a break and think about what people with lots of money care about. And I don't judge; making money is nothing illegal. But Anthropic would be absolutely NOTHING without OSS. And then to see this kind of effusive, submissive admiration and gratitude for their JS wrapper thing makes me sick to my stomach.
A lot of people seem confused about this acquisition because they think of Bun as a Node.js-compatible bundler/runtime and just compare it to Deno/npm. But I think it's a really smart move if you consider where Bun has been pushing lately: a kind of cloud-native, self-contained runtime (S3 API, SQL, streaming, etc.). For an agent like Claude Code this trajectory is really interesting, as you're creating a runtime where the agent can work inside cloud services as fluently as it currently does with a local filesystem. Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases