Comment by chrsw
I still find some pages that don't work 100% correctly in Firefox, but not nearly often enough to keep me from using it on my personal machines. (My employer doesn't allow any browser except Chrome and Edge.) For me, the most important feature of a browser is the web experience. I guess it should be security, but I try to be careful about what I do online regardless of which browser I'm using.
Many years ago I used to run the Firefox NoScript extension exclusively. For sites that I trusted and visited frequently, I would add their domains to an exceptions list. For sites I wasn't sure about, I would load them with all scripts disabled and then selectively allow scripts until the site was functional, starting with the scripts hosted on the same domain as the site I wanted to see/use.
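That default-deny policy maps fairly directly onto the WebExtension blocking webRequest API, which NoScript itself builds on. A minimal sketch (the allowlisted domains are made up, and the real extension does far more, e.g. per-tab state and first-party heuristics):

```typescript
// Sketch of a default-deny script policy as a WebExtension background script.
// Requires the "webRequest", "webRequestBlocking", and host permissions.
declare const browser: any; // WebExtension API global (e.g. webextension-polyfill)

// Hypothetical "exceptions list": hosts whose scripts may load.
const allowedScriptHosts = new Set<string>([
  "example.com",     // made-up trusted site
  "cdn.example.com", // made-up first-party CDN
]);

browser.webRequest.onBeforeRequest.addListener(
  (details: any) => {
    const host = new URL(details.url).hostname;
    // Allow the script only if its host (or a parent domain) is allowlisted.
    const allowed = [...allowedScriptHosts].some(
      (h) => host === h || host.endsWith("." + h)
    );
    return { cancel: !allowed }; // cancel == block the request
  },
  { urls: ["<all_urls>"], types: ["script"] },
  ["blocking"]
);
```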
Eventually I got too lazy to keep doing that, but aside from the painstaking overhead it was by far the best web experience I've ever had. I got pretty good at recognizing which scripts I needed to enable to get a site to load and work. Plus, uBlock Origin and the annoyance filter lists got so good that I stopped stressing about the web so much.
But all this got me thinking: why not have the browser block all scripts by default, then have an AI agent selectively enable scripts until I get the functionality I need? I could even give feedback to the agent so it improves over time. This would essentially automate what I was doing myself years ago. Why wouldn't this work? Do I not understand AI? Or web technology? Or are people already doing this?
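The core loop, if I imagine automating it, is simple enough. Here's a rough sketch where loadWithAllowed and looksFunctional are placeholders I made up for the hard parts: actually rendering the page with a given allowlist, and judging whether it "works" (which is where the AI, a DOM heuristic, or my own feedback would go):

```typescript
// Hypothetical agent loop: start with all scripts blocked, then enable script
// hosts one at a time (first-party first) until the page is judged functional.
async function findMinimalScriptSet(
  siteHost: string,
  candidateScriptHosts: string[],                            // script hosts seen on the page
  loadWithAllowed: (allowed: Set<string>) => Promise<void>,  // render with only these hosts enabled
  looksFunctional: () => Promise<boolean>,                   // the "agent" judgment call
): Promise<Set<string>> {
  // Try first-party scripts before third-party ones, like the manual workflow.
  const ordered = [...candidateScriptHosts].sort(
    (a, b) => Number(b.endsWith(siteHost)) - Number(a.endsWith(siteHost))
  );

  const allowed = new Set<string>();
  await loadWithAllowed(allowed);              // baseline: everything blocked
  if (await looksFunctional()) return allowed;

  for (const host of ordered) {
    allowed.add(host);
    await loadWithAllowed(allowed);
    if (await looksFunctional()) break;        // stop as soon as the page works
  }
  return allowed;                              // remember this per-site for next time
}
```

The returned set could be cached per site, so the agent only has to do the slow trial-and-error pass the first time a site is visited.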
> I still find some pages don't work 100% correctly in Firefox.
Sometimes this is simply because the site preemptively throws an error when it detects Firefox, because the developers don't want to QA another browser with a smaller market share. Usually those sites work fine if you just change the user agent Firefox reports so it looks like Chrome (there are add-ons for that). Personally, I haven't had to resort to a non-Firefox browser or a user-agent spoof even once in well over a year now.
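Under the hood, those add-ons mostly just rewrite the User-Agent header on outgoing requests. A minimal sketch, assuming a WebExtension with the "webRequest", "webRequestBlocking", and host permissions (the Chrome UA string below is only an example; real add-ons keep it current):

```typescript
// Rewrite the User-Agent header so sites see Chrome instead of Firefox.
declare const browser: any; // WebExtension API global (e.g. webextension-polyfill)

const fakeChromeUA =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36";

browser.webRequest.onBeforeSendHeaders.addListener(
  (details: any) => {
    for (const header of details.requestHeaders ?? []) {
      if (header.name.toLowerCase() === "user-agent") {
        header.value = fakeChromeUA; // report Chrome instead of Firefox
      }
    }
    return { requestHeaders: details.requestHeaders };
  },
  { urls: ["<all_urls>"] },
  ["blocking", "requestHeaders"]
);
```

Note this only changes the HTTP header; sites that read navigator.userAgent from JavaScript need a content-script override as well, which the real add-ons also handle.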