Gaining access to anyone's Arc browser without them even visiting a website
(kibty.town)
1469 points by xyzeva 10 months ago
Was the post written for HN users only? I cannot see it on your blog page (https://arc.net/blog). It's not posted on your Twitter either. Your whole handling seems to amount to responding only when there is enough noise about it.
Hursh, can you please respond to the above commenter? As an early adopter, I find it fairly troubling to see a company that touts transparency hide the blog post and only publicly "own up to it" within the confines of a single HN thread.
Pretty obvious now that Arc will only share security alerts with the people who "catch" them at it - as few as possible
Leaves no choice but for this community to make the rest of the Arc community aware of it, since they refuse to be transparent.
Not a good look that it's not on the main page! I personally use [zen browser](https://github.com/zen-browser/desktop); I like the ideas of Arc, but it always seemed sketchy to me, especially it being Chromium-based and closed-source.
Heads up: HN doesn't support link naming markdown and some of the extra characters broke the hyperlink.
In case the parent can't fix it in time for the edit window: https://github.com/zen-browser/desktop
I used Arc for a while because, despite my misgivings about using a browser that requires an account, the workflow was very good for me.
I started moving to Zen about a week ago. Hearing about this vulnerability yesterday, and especially seeing their reaction to it, I know I made the right choice in leaving Arc.
Hi Hursh, I'm Tom. A couple friends use Arc and they like it, so I had considered switching to it myself. Now, I won't, not really because of this vulnerability itself (startups make mistakes), but because you paid a measly $2k bounty for a bug that owns, in a dangerous way, all of your users. I won't use a browser made by a vendor who takes the security of their users this unseriously.
By the way, I don't know for sure, but given the severity I suspect on the black market this bug would have gone for a _lot_ more than $2k.
Selling a vulnerability on the black market is immoral and may be illegal. The goal of bug bounty programs was initially to signal "we won't sue white hat researchers who disclose their findings to us". When did it evolve into "pay me more than criminals would, or else"?
Let's set aside morality for a second. There is a reason low payouts are bad without even having to consider the black market: it pushes people to search for bugs in a competitor's app that pays more instead of in your app!
If your app is paying out $2K and a competing app pays out $100K, why would anyone bother searching for bugs in your app? Every minute spent researching your app pays 1/50th of what you'd get searching the competing app (unless your app has 50x more bugs, I suppose, but perhaps then you have bigger problems...).
I'm always so confused by the negative responses to people asking for higher bug bounties. It feels like it still comes from this weird entitlement that researchers owe you the bug report. Perhaps they do. But you know what they definitely don't owe you? Looking for new bugs! Ultimately this attitude always leads to the same place: the places that pay more protect their users better. It is thus completely reasonable for a user to decide not to use a product if the company that makes it isn't paying high bug bounties. It's the same as discovering that a restaurant is cheaping out on health inspections and deciding to no longer eat there.
> because you paid a measly $2k bounty for a bug that owns, in a dangerous way, all of your users
The case is redeemable. It may still be an opportunity if handled deftly. But it would require an almost theatrical display of generosity to the white hat (together, likely, with a re-constituting of the engineering team).
After thinking about it for a good long ten seconds, yeah. It would be very easy to steal users' banking information with this. If you crack into one single bank account you have a decent shot at making over $2k right there, a skilled hacker could do a lot more.
So you're not going to use Arc. How much do you pay for the browser you do use?
Should have at least paid €1 per user. Eh, maybe that’s what they did?
Comments further down are concerned that on each page load, you're sending both the URL and a(n identifiable?) user ID to TBC. You may want to comment on that, since I think it's reasonable to say that those of us using not-Chrome (I don't use Arc personally, but I'm definitely in the 1% of browser users) are likely to also be the sort of person concerned with privacy. Vulnerabilities happen, but sending browsing data seems like a deliberate design choice.
I think that is addressed in the post. Apparently the URL was only sent under certain conditions and has since been addressed:
>We’ve fixed the issues with leaking your current website on navigation while you had the Boost editor open. We don’t log these requests anywhere, and if you didn’t have the Boosts editor open these requests were not made. Regardless this is against our privacy policy and should have never been in the product to begin with.
Given the context (boosts need to know the URL they apply to after all) this indeed was a "deliberate design choice" but not in the manner you appear to be suggesting. It's still very worrisome, I agree.
There isn't really anything you can do to convince me that your team has the expertise to maintain a browser after this. It doesn't matter that you have fixed it, your team is clearly not capable of writing a secure browser, now or ever.
I think this should be a resigning matter for the CTO.
Yeah, I also think that asking someone to resign for this does not look like a proportionate response
They are owning up to their mistakes, and making sure such things don't happen again (and increasing the amount from $2K :-)) seems like the right approach to me.
Surprise surprise, turns out it takes a looong time for every software startup to finally strip out all the hacky stuff from their MVP days. Apparently nobody on this startup community forum has ever built a startup before.
Pro tip: if stuff like this violently upsets you, never be an early adopter of anything. Wait 5-10 years and then make your move.
Personally, I expect stuff like this from challenger alternatives, this is the way it should be. There is no such thing as a new, bug-free software product. Software gets good by gaining adoption and going through battle testing, it’s never the other way around like some big company worker would imagine.
I don't think you understood the severity or the noobiness of the error. This is a browser, not a CRUD app or an Electron app. A browser is a complex, system-level piece of software, not a hacky MVP, and this kind of error shows that maybe they don't have the competence to be building something like this. It makes you wonder what other basic flaws are just waiting to be exploited, even if it's built on top of Chromium. Would you fly in an MVP airplane built by bicycle engineers? (Maybe not the best analogy, since the first airplane was built by bicycle engineers.)
Will you be increasing the bug bounty payout? $2,000 is a tiny fraction of what this bug is worth, I hope you will pay the discoverer a proper bounty.
You've been handed a golden opportunity to set the right course.
> $2,000 is a tiny fraction of what this bug is worth
The Browser Company raised $50mm at a $550mm post-money valuation in March [1]. They’ve raised $125mm altogether.
Unless they’re absolute asshats, they’ll increase the bug payout. But people show who they are when they don’t think they’re being watched: a vulnerability of this magnitude was worth $2k to this company. That’s... eyebrow-raising.
[1] https://techcrunch.com/2024/03/21/the-browser-company-raises...
"We will let anyone run arbitrary JavaScript on all your web pages if you send them a referral link" is surely a 6-7 figure vulnerability for a web browser. That this vulnerability was discoverable using about two steps of analysis tools suggests many more issues are in the product.
They have more users than what I could have guessed:
> As of July 2023, The Browser Company has 100,000+ users
https://www.boringbusinessnerd.com/startups/the-browser-comp....
Hursh responded elsewhere on the thread:
Most of the vulnerabilities I've disclosed, and I've seen disclosed, were disclosed for free, with no expectation of getting anything. Why do you think every researcher is an amoral penny pincher who will just sell exploits without caring for the consequences?
I know a lot of different people who do independent security research and have submitted vulns to bounty programs. Not a single one would even come close to saying "well, the bounty is low so I'll sell this on the black market."
Low bounties might mean that somebody doesn't bother to look at a product or doesn't bother to disclose beyond firing off an email or maybe even just publishes details on their blog on their own.
Bounties aren't really meant to compete with black markets. This is true even for the major tech companies that have large bounties.
> including moving off Firebase
Firebase is not to blame here. It's a solid technology which just has to be used properly. Google highlights the fact that setting up ACLs is critical and provides examples on how to set them up correctly.
If none of the developers who were integrating the product into Arc bothered about dealing with the ACLs, then they are either noobs or simply didn't care about security.
Until this individual comes back and responds to at least a few of the questions/comments, I don't think we should even pay attention to this marketing-dept-written post. They basically want this to go away, and answering any questions would raise more issues most likely, so they just seemed to have done the bare minimum and left it at that. It's 3 hours later now, they might as well have not even posted anything here.
50k or 100k would be far more appropriate given the severity of this issue. But overall, this makes me think there's probably a lot more vulnerabilities in Arc that are undiscovered/unpatched.
Also, there's the whole notion of every URL you visit being sent to Firebase -- were these logged? Awful for a browser.
> Honestly this was our first bounty ever awarded and we could have been more thoughtful
That’s corporate speak for “no, we won’t pay the researcher any more money.”
I think the bigger question is: why are you violating your own security policy by keeping track of what we browse? I thought my browsing was private and hidden away from you, but if you store my browsing data in your Firebase, this is not acceptable at all.
> "...the hypothetical depth of this vulnerability is unacceptable."
What is also unacceptable is to pay 2000 dollars for something like this AND have to create user accounts to use your browser. Will definitely stay away from it.
Are you going to address the part where you send visited websites to Firebase which goes against your privacy policy of not tracking visited URLs?
I would like to respectfully provide the suggestion of allowing for the use of Arc without being signed into an account. Although I understand browser/device sync is part of most modern browsers, and the value it provides, normally it is a choice to use this feature. Arc still provides a lot of attractive features, even without browser sync on.
I like Arc, and I don’t want to pile on: God knows I’ve written vulnerable code.
To explore a constructive angle both for the industry generally and the Browser Company specifically: hire this clever hacker who pwned your shit in a well-remunerated and high-profile way.
The Browser Company is trying to break with a lot of obsolete Web norms; how about, rather than paying bullshit bounties under pressure, putting the underground experts to work guarding the henhouse?
If the Browser Company started a small but aggressive internal red team to take on the biohazard that is the modern web?
I'll learn some new keyboard shortcuts, and I bet a lot of people will.
So when there are near weekly reports of websites being compromised due to horrid Firebase configuration, did absolutely no one on your teams raise a red flag? Is there some super low-pri ticket that says "actually make sure we use ACLs on Firebase"?
Hursh / ha470, where did you go? There are lots of good questions in the replies to your thread, yet you went dark immediately after posting more than 8 hours ago. It's hard to imagine what could be more pressing than addressing people's concerns after a major security incident such as this.
To be honest, I'm a bit disappointed. For future reference, this doesn't seem like a good strategy to contain reputational damage.
remember when reading this that this guy's company is valued at a billion dollars and his comp is 10x yours if not more. we live in a meritocracy
ngl this is pretty pathetic. the massive security hole is one thing but you're just gonna gloss over violating your own privacy policy?
> This kind of bug could be sold for 100-200k easily
Maybe not. If the browser is that buggy, there may be plenty of these lying around. The company itself is pricing the vulnerability at $2k. That should speak volumes to their internal view of their product.
Many engineers at SV startups use Arc on a daily basis. This bug could've resulted in the compromise of multiple companies, probably including crypto exchanges. A browser bug of this severity is extremely valuable, even for a niche browser like Arc.
I just want to call out that there is a lot of blame put on Firebase here in the comments, but I think that's just people parroting stuff they don't actually know about (I don't use Firebase; I have tried it out in the past, though). This isn't some edge case or hard-to-solve thing in Firebase; this is the easy stuff.
The real issue here is that someone wrote an API that trusted the client to tell it who they were. At the end of the day this is an amateur mistake that likely took a one-line diff to fix. Don't believe me? Check out the docs: https://firebase.google.com/docs/rules/rules-and-auth#cloud-... - `request.auth` gives you the user id you need (`request.auth.uid`).
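For illustration, a minimal firestore.rules sketch of that idea; the collection and field names ("boosts", "creatorID") are guesses for illustration, not Arc's actual schema:
```
// Hypothetical sketch -- "boosts" and "creatorID" are illustrative names,
// not Arc's actual schema.
match /boosts/{boostId} {
  // Broken pattern: any authenticated client may create a boost and put
  // whatever creatorID it likes into the payload.
  // allow create: if request.auth != null;

  // Safer: the owner field must equal the verified uid from the auth token.
  allow create: if request.auth != null
                && request.resource.data.creatorID == request.auth.uid;
}
```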
As someone with an app built on firebase, yes. As the author rightly points out, it's very easy to misconfigure, but basic security practices like these are highlighted in bright, bold warning text in the Firebase docs.
Security rules are meant to be taken seriously, and it's your only line of defense.
> bold warning text in the Firebase docs.
Unfortunately, we currently have an industry where highly paid "engineers" unironically believe that their job can be done by reading/watching random tutorials, googling for StackOverflow answers, and pasting code from gists.
Attentively reading documentation or developing a mental model of how your tools work so that you know how they are built to be handled does not make it on to any job listing bullet points. It presumably fell off the bottom in favor of team spirit or brand enthusiasm or whatever.
How many tutorials, community answers, and gists do you think conveyed that warning?
Reading/watching random tutorials and asking basic questions on SO __instead of reading the official docs__ is a trend I've observed for the last 10 years. Even for stuff pretty well documented like Python, Postgres, React, etc.
I am glad to put engineers in quotes because many people here and elsewhere will use that word with a straight face while also believing that you can call yourself that while learning your job from watching youtube vids and pasting code you don't understand. We need to stop using the word "engineer" for "software developer".
I shall watch the downvotes come from these so called "engineers".
ChatGPT would have probably parroted the bold text. It is always super concerned about risks.
> it's very easy to misconfigure, but basic security practices like these are highlighted in bright, bold warning text in the Firebase docs.
I'm sorry, but if the whole design is "one big database shared with everyone, and we must manually configure the database for auth," there is a problem that's deeper than just having to read the docs. It means the basics of keeping data as private as possible are not understood. A shared database only works when the server accesses it, not when the client has direct access.
What Arc needs is to segregate each user's data in a different place, in the design of the database, not as part of configuration or custom code. Make it impossible to list all users' data, or even users. When, not if, an id is guessed, related data becomes accessible to someone else; make it so that someone else still can't read it, or can't replace it.
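One common way to express that kind of segregation in Firestore is to put ownership in the document path itself. A minimal sketch, again with made-up names rather than Arc's real schema:
```
// Hypothetical layout -- names are illustrative, not Arc's schema.
// Each user's documents live under their own path: /users/{userId}/boosts/...
match /users/{userId}/boosts/{boostId} {
  // The rule is keyed off the path segment and the verified auth token,
  // so listing or touching another user's subtree is simply not expressible.
  allow read, write: if request.auth != null
                     && request.auth.uid == userId;
}
```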
It's interesting to see software engineers going from rolling their own auth, to not rolling their own auth, to not even noticing this quite blatant security problem.
It doesn't matter if you roll your own auth or not, you need to understand a very basic fundamental of it all: never trust the client.
> At the end of the day this is an amateur mistake
God I wish. More than one of my coworkers has made this exact mistake with our (thankfully internal) front-end apps.
Coworker implies paid work, and therefore they are not amateurs. They very well may make the same mistakes, but those mistakes would be professional mistakes.
If it's internal, did they really need to have auth?
YES!!! You need auth to prevent employees from looking up sensitive user data without a good reason, or it'll be a stalker's haven. And to prevent possible intruders from gaining more data/access. Defense in depth. And for preventing an experiment from wiping use data. And for so many other reasons!
The term of art is "Friendly fraud".
A significant amount of product stolen from retail stores actually goes out the back door.
> If it's internal, did they really need to have auth?
Nothing on a network is truly internal. The moment you break the physical link between metal and man you're in an unintuitive, and thus insecure, state.
A security plan which depends on any person never making an amateur mistake, is an amateur mistake.
Agreed. If I understand correctly, the fix for this issue would be the following rules inside a "match" statement in firestore.rules, which is plainly documented as Firebase/Firestore security 101:
```
// Allow creating a new object if the user is authenticated
allow create: if request.auth != null;
// Allow updating or deleting a document only if the user owns it
allow update, delete: if request.auth != null && request.auth.uid == resource.data.ownerUID;
```
Unclear if they had these rules in place already, but I'm curious... If the rule permits writing when the userid matches, presumably there is nothing stopping the write operation from changing the userid value, to your point.
Which then leads me to the next question: what is the practical way to write rules against that operation?
In my limited experience, I've seen it handled by adding the user's ID in the path of any resource that belongs to a particular user, so that the user ID from the resource path can be compared with the authenticated user ID as a security rule condition.
But as expected, you can validate the incoming data as well (https://firebase.google.com/docs/firestore/security/rules-co...), though this would need to be done for any attribute that might lead to a change of ownership.
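As a sketch of what that validation could look like (the field name "ownerUID" follows the example above and is illustrative): the update rule compares the incoming owner field against the stored one, so an otherwise-legitimate update cannot transfer ownership.
```
// Hypothetical update rule: the owner may edit the document, but the
// incoming ownerUID must match what is already stored, which blocks
// "update the document and change its owner" writes.
allow update: if request.auth != null
              && request.auth.uid == resource.data.ownerUID
              && request.resource.data.ownerUID == resource.data.ownerUID;
```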
Is there no allow-read rule? Edit: Yes, there is:
```
allow read, update, delete: if request.auth != null && request.auth.uid == userId;
```
This is what happens when you hire solely based on leetcode skill. A shit-tier engineer can master leetcode within months, but a good engineer will probably struggle at Find Nth Smallest Sum problem because he spends more time reading and thinking about code.
Leetcode is a fucking joke to the industry, gone are the days when you actually had good code with devs who spent time thinking about information architecture. In my experience boomer devs are actually the only ones who write idiomatic code. Millennial and Gen-z devs are the worst, they have no understanding beyond basic function calling.
The whole idea of Firebase is flawed, as logic that belongs on a server is now on the client side. I don't know much about security, but that sounds like it makes any centralized rule (e.g. security) hard to implement. It also tends to expose more internal logic than the client needs to know, which is bad in both software design and security.
I just wanted to say, I enjoyed the little pixel art cat that runs towards wherever you click immensely. It’s one of those fun, whimsical little touches that I don’t see all that often. A reminder that the internet can be a fun, whimsical place if we want it to be :)
It does: https://github.com/adryd325/oneko.js/blob/main/oneko.js
```
const isReducedMotion =
  window.matchMedia(`(prefers-reduced-motion: reduce)`) === true ||
  window.matchMedia(`(prefers-reduced-motion: reduce)`).matches === true;
if (isReducedMotion) return;
```
Simple but effective. More websites should include this check. Well done, adryd325!
Same for me. On FF you can override it with:
```
about:config
ui.prefersReducedMotion = 0
```
https://developer.mozilla.org/en-US/docs/Web/CSS/@media/pref...
It's doing great for being a 35-year-old cat!
I don't, but I run the same system configuration, so I can compile it on my computer, transfer it and run it.
Alternatively, if a compiler such as gcc is available, you could also run
```
# https seems to be broken on this website currently
wget http://www.daidouji.com/oneko/distfiles/oneko-1.2.sakura.5.tar.gz
tar -xf oneko-1.2.sakura.5.tar.gz
cd oneko-1.2.sakura.5/
gcc oneko.c -lX11 -lm -o oneko
./oneko &
cd ..
# remove all traces
rm -r oneko-1.2.sakura.5 oneko-1.2.sakura.5.tar.gz
```
It sits when it's next to the pointer; just don't move your mouse.
For the curious, that specific cat goes back to 1989:
It is distracting and annoyed me, I stopped reading because of it.
According to this article, Arc requires an account and sends Google's Firebase the hostname of every page you visit along with your user ID. Does this make Arc the least private web browser currently being used?
I trashed Arc immediately after install when I found out having an account was mandatory. That seemed so silly, like toothbrushes-requiring-wifi absurd. How much moreso now.
Truly. I was looking for a privacy respecting Chromium-based browser to use for Web MiniDisc (https://web.minidisc.wiki/) and came across some enthusiastic praise for Arc. I downloaded it and it immediately wanted me to create an account to even use it. How can that possibly respect my privacy? It went right in the trash.
I had doubts already when submissions promoting the browser were posted on HN while there was no way to see what it looks like or even test it out - for quite some time there was nothing but a mail signup on their page.
I guess it's relatively easy to test: add the Firebase domain to your hosts file, point it to 127.0.0.1, and try to use the browser.
Sometimes things like this handle outright connection failures better than never-ending connection attempts, so you might also want to add a throttle or something for the traffic between the domain and the browser; that might trip it up too.
"Arc is the Chrome replacement I’ve been waiting for." [1]
I guess now we know why they frame it that way.
Chrome does not require an account to use. And Chrome by default doesn't send sites you visit to Google, unless you turn on the "make searches and browsing better" feature or the "enhanced safe browsing" feature.
So the OP is right. Arc's privacy is worse than Chrome.
This is such a fantastic bug. Firebase security rules (as with other BaaS systems) have this weird default that is hard to describe. Basically, if I write my own API, I will set the userId of the record (a 'boost' in this case) to the userId from the session, rather than passing it in the request payload. It would never even occur to a developer writing their own API, past a certain level of experience, to let the client pass (what is supposed to be) their own userId to a protected API route.
On the other hand, with security rules you are trying to imagine every possible misuse of the system regardless of what its programmed use actually is.
> On the other hand, with security rules you are trying to imagine every possible misuse of the system regardless of what its programmed use actually is.
Tbh you're doing it wrong if you go that way.
Default deny, and then you only have to imagine the legitimate uses.
The failure modes are much clearer: when you write the API in a default-deny context & forget to add that allowed pattern, it never works, so you notice & figure out the bug.
The same story with default-allow means the system looks like it works fine, and you end up with no security at all.
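In Firestore rules that default-deny posture can be written out explicitly. Unmatched paths are denied anyway, but a catch-all rule like this sketch documents the intent and makes a forgotten rule fail closed:
```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Explicit default deny: nothing is readable or writable unless a
    // more specific match below grants it.
    match /{document=**} {
      allow read, write: if false;
    }
  }
}
```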
And then when you imagine the legitimate uses you have to imagine how allowing those legitimate uses could be misused. You always need to think red and blue.
For inserts yes, but for updates I've frequently seen cases where people just stuff the whole request into their ORM or document store. It is pretty easy to think "the owner can update the document" without realizing that there are some fields (that the official client doesn't set) that shouldn't be updated (like the owner or created timestamp).
The correct solution is likely default-deny auth for every single field. Then you at least have to explicitly make the owner field writable, and hopefully consider the impact of transferring this object to another user.
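Firestore rules can approximate that per-field default deny with an allow-list of mutable fields. A sketch, with illustrative field names:
```
// Hypothetical field-level allow-list on update -- field names illustrative.
// Only 'title' and 'body' may change; ownerUID (and anything else) is frozen.
allow update: if request.auth != null
              && request.auth.uid == resource.data.ownerUID
              && request.resource.data.diff(resource.data)
                   .affectedKeys()
                   .hasOnly(['title', 'body']);
```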
I'm amazed by how profoundly stupid this vulnerability is. To get arbitrary code execution, you literally just send somebody else's user ID, which is fairly trivial to obtain.
I don't work at FAANG. I just work at some company that makes crap products you don't actually need, and even I would never build this kind of bug.
But these people want to build a web browser, with all the security expertise and moral duty that implies?! Wow.
Can we have Arc added to the title of the post to better alert people who use or know people who use the browser?
I’m Hursh, cofounder and CTO of The Browser Company (the company that makes Arc). Even though no users were affected and we patched it right away, the hypothetical depth of this vulnerability is unacceptable. We’ve written up some technical details and how we’ll improve in the future (including moving off Firebase and setting up a proper bug bounty program) here: https://arc.net/blog/CVE-2024-45489-incident-response.
I'm really sorry about this, both the vuln itself and the delayed comms around it, and really appreciate all the feedback here – everything from disappointment to outrage to encouragement. It holds us accountable to do better, and makes sure we prioritize this moving forward. Thank you so much.