renehsz 2 days ago

The Plan 9 operating system.

It's the closest thing to a Unix successor we ever got, taking the "everything is a file" philosophy to another level and making it easy to share those files over the network to build distributed systems. Accessing remote resources is easy and robust on Plan 9, whereas on other systems we need to install specialized software with poor interoperability for each individual use case.

Plan9 also had some innovative UI features, such as mouse chording to edit text, nested window managers, the Plumber to run user-configurable commands on known text patterns system-wide, etc.

Its distributed nature should have made it perfect for today's world, with mobile, desktop, cloud, and IoT devices all connected to each other. Instead, we're stuck with operating systems that were never designed for that.

There are still active forks of Plan9 such as 9front, but the original from Bell Labs is dead. The reasons it died are likely:

- Legal challenges (the Plan 9 license, pointless lawsuits, etc.) meant it wasn't adopted by major players in the industry.

- Plan9 was a distributed OS during a time when having a local computer became popular and affordable, while using a terminal to access a centrally managed computer fell out of fashion (though the latter sort of came back, in a worse form, with cloud computing).

- Bad marketing and positioning itself as merely a research OS meant they couldn't capitalize on the dot-com boom.

- AT&T lost its near-endless source of telephone revenue. Bell Labs was sold multiple times over the following years, and many of the Unix/Plan 9 people went to other companies like Google.

  • teddyh 2 days ago

    > The reasons it died are likely:

    The reason Plan 9 died a swift death was that, unlike Unix – which hardware manufacturers could license for a song and adapt to their own hardware (and be guaranteed compatibility with lots of Unix software) – Bell Labs tried to sell Plan 9, as commercial software, for $350 a box.

    (As I have written many times in the past: <https://news.ycombinator.com/item?id=22412539>, <https://news.ycombinator.com/item?id=33937087>, and <https://news.ycombinator.com/item?id=43641480>)

    • EdiX 2 days ago

      Version 1 was never licensed to anyone. Version 2 was licensed only to universities, for an undisclosed price. Version 3 was sold as a book; I think this is the version you are referring to. Note, however, that this version came with a license that allowed only non-commercial use of the source code. It also came with no support, no community, and no planned updates (the project was shelved half a year later in favor of Inferno).

      More than the price tag, the problem was that Plan 9 wasn't really released until 2004.

    • Shugyousha 2 days ago

      Strictly speaking, it's not dead. The code is now open source and all the rights are with the Plan 9 foundation: https://p9f.org/

      It's just unlikely that it will get as big of a following as Linux has.

    • pjmlp a day ago

      Had UNIX been priced like other OSes, instead of going for a song as you say, it would never have taken off. It was more about the openness and being far cheaper than the alternatives than anything else.

  • pjmlp a day ago

    The team moved on to work on Inferno, which Plan 9 aficionados tend to forget about. It was also a much better idea for a UNIX successor: Plan 9 combined with a managed userspace. That didn't go down well either.

  • mycall 2 days ago

    Plan 9 Filesystem Protocol lives on inside WSL2.

    • ajross 2 days ago

      9P is used everywhere in the VM ecosystem. It's clean and simple and well supported by almost all guests.
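
      (To make "clean and simple" concrete: a rough Python sketch of encoding the initial 9P2000 Tversion handshake message. The layout is size[4] type[1] tag[2] msize[4] version[s], little-endian; the msize value here is an arbitrary choice.)

```python
import struct

TVERSION = 100   # message type for the version handshake
NOTAG = 0xFFFF   # Tversion is sent with the special "no tag" value

def t_version(msize: int = 8192, version: bytes = b"9P2000") -> bytes:
    """Encode a 9P Tversion: size[4] type[1] tag[2] msize[4] version[s].

    String ("s") fields are a 2-byte length followed by the bytes;
    all integers are little-endian; size[4] counts the whole message.
    """
    body = struct.pack("<BHI", TVERSION, NOTAG, msize)
    body += struct.pack("<H", len(version)) + version
    return struct.pack("<I", 4 + len(body)) + body

msg = t_version()
print(len(msg), msg[4])  # 19 100 (19-byte message, type byte is TVERSION)
```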

  • tjchear 2 days ago

    What’s stopping other Unix-like systems from adopting the everything is a file philosophy?

    • c0balt 2 days ago

      Probably that not everything can be cleanly abstracted as a file.

      One might want to, e.g., have fine control over how a network connection is handled. You can abstract that as a file, but it becomes increasingly complicated and can make API design painful.

      • Someone 2 days ago

        > Probably that not everything can be cleanly abstracted as a file.

        I would say almost nothing can be cleanly abstracted as a file. That’s why we got ioctl (https://en.wikipedia.org/wiki/Ioctl), which is a bad API (calls mean “do something with this file descriptor”, with only conventions introducing some consistency).
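
        (For a concrete taste of the catch-all nature, here is a small Linux-specific Python sketch using the real FIONREAD request: the call is just a magic number plus an opaque buffer on some fd.)

```python
import fcntl
import os
import struct
import termios  # FIONREAD is exposed here in CPython on Linux

r, w = os.pipe()
os.write(w, b"hello")

# ioctl(fd, request, arg): "do something with this fd". What it does
# depends entirely on the request code and how the driver interprets arg.
buf = fcntl.ioctl(r, termios.FIONREAD, struct.pack("i", 0))
(pending,) = struct.unpack("i", buf)
print(pending)  # 5 (bytes queued in the pipe)
```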

    • WD-42 2 days ago

      They have to an extent. The /proc file system on Linux is directly inspired by plan 9 IIRC. Other things like network sockets never got that far and are more related to their BSD kin.

      • pjmlp a day ago

        Not at all, /proc comes from Solaris.

        • WD-42 9 hours ago

          Looking into it we are both wrong. Plan 9 implemented /proc after 8th edition Unix. Solaris and Linux both implemented it at the same time in 1992.

      • cratermoon 2 days ago

        There's also /dev/tcp on Linux (a Bash feature, not a kernel one):

            exec 5<>/dev/tcp/www.google.com/80
            echo -e "GET / HTTP/1.1\r\nhost: www.google.com\r\nConnection: close\r\n\r\n" >&5
            cat <&5

    • mike_hearn 2 days ago

      The fact that everything is not a file. No OS actually implements that idea, including Plan9. For example, directories are not files. Plan9 re-uses a few of the APIs for them, but you can't write() to a directory, you can only read it.

      Pretending everything is a file was never a good idea and is based on an inaccurate model of computing. The everything-is-an-object phase the industry went through was much closer to reality.

      Consider how you represent a GUI window as a file. A file is just a flat byte array at heart, so:

      1. What's the data format inside the file? Is it a raw bitmap? Series of rendering instructions? How do you communicate that to the window server, or vice-versa? What about ancillary data like window border styles?

      2. Is the file a real file on a real filesystem, or is it an entry in a virtual file system? If the latter then you often lose a lot of the basic features that make "everything is a file" attractive, like the ability to move files around or arrange them in a user-controlled directory hierarchy. VFSes like procfs are pretty limited. You can't even add your own entries, like adding symlinks to procfs directories.

      3. How do you receive callbacks about your window? At this point you start to conclude that you can't use one file to represent a useful object like a window, you'd need at least a data and a control file where the latter is some sort of socket speaking some sort of RPC protocol. But now you have an atomicity problem.

      4. What exactly is the benefit again? You won't be able to use the shell to do much with these window files.

      And so on. For this reason Plan9's GUI API looked similar to that of any other OS: a C library that wrapped the underlying file "protocol". Developers didn't interact with the system using the file metaphor, because it didn't deliver value.

      All the post-UNIX operating system designs ignored this idea because it was just a bad one. Microsoft invested heavily in COM and NeXT invested in the idea of typed, IDL-defined Mach ports.

      • pjmlp a day ago

        Unfortunately Microsoft didn't invest heavily enough in COM tooling; it sucks in 2025 as much as it did in the 1990s.

    • IshKebab 2 days ago

      Probably the fact that it's a pretty terrible idea. It means you take a normal properly typed API and smush it down into some poorly specified text format that you now have to write probably-broken parsers for. I often find bugs in programs that interact with `/proc` on Linux because they don't expect some output (e.g. spaces in paths, or optional entries).

      The only reasons people think it's a good idea in the first place are a) every programming language can read files, so it sort of gives you an API that works with any language (but a really bad one), and b) it's easy to poke around in from the command line.

      Essentially it's a hacky cop-out for a proper language-neutral API system. In fairness it's not like Linux actually came up with a better alternative. I think the closest is probably DBus which isn't exactly the same.

      Maybe something like FIDL is a proper solution but I have only read a little about it: https://fuchsia.dev/fuchsia-src/get-started/learn/fidl/fidl
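
      (A sketch of the classic /proc trap: in /proc/[pid]/stat the process name is parenthesized and may itself contain spaces and ')', so a naive space-split silently grabs the wrong field. The stat line below is synthetic, so nothing here depends on a real /proc.)

```python
def parse_stat(line: str):
    """Parse a /proc/[pid]/stat-style line robustly.

    The comm field is parenthesized and may itself contain spaces
    and ')', so we anchor on the *last* ')' instead of splitting.
    """
    pid, rest = line.split(" ", 1)
    close = rest.rindex(")")
    comm = rest[1:close]
    fields = rest[close + 2:].split(" ")
    return int(pid), comm, fields[0]  # fields[0] is the process state

line = "4242 (tricky) name) R 1 4242 4242"

naive_state = line.split(" ")[2]      # grabs "name)" instead of the state
pid, comm, state = parse_stat(line)
print(naive_state, "|", pid, comm, state)  # name) | 4242 tricky) name R
```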

      • vacuity 2 days ago

        I think you have to standardize a basic object system and then allow people to build opt-in interfaces on top, because any single-level abstraction will quickly be pulled in countless directions for as many users.

  • AfterHIA a day ago

    UNIX is for dorks. We needed a Smalltalk style, "everything is an object and you can talk to all objects" but thankfully we got Java and, "object oriented" C++. The Alto operating system was leaps and bounds ahead of the Mac and Windows 3.1 system and it took Steve Jobs a decade to realize, "oh shit we could have just made everything an object." Then we get WebObjects and the lousy iPod and everything is fascist history.

    #next #never #forget #thieves

    • pjmlp a day ago

      I had a UNIX zealot phase back in the 1990's, until the university library opened my eyes to the Xerox PARC world. Tucked away at the back there were all the manuals and books about Smalltalk from Xerox; eventually I also did some assignments with Smalltalk/V, and found a way to learn about Interlisp and Mesa/Cedar as well.

      My graduation project was porting a visualisation framework from Objective-C/NeXTSTEP to Windows.

      At the time, my X setup was a mix of AfterStep or windowmaker, depending on the system I was at.

    • fud101 a day ago

      Plan9 fanboys are some of the dumbest well meaning idiots you'll find. It's kind of adorable.

Animats 2 days ago

- Photon, the graphical interface for QNX. Oriented more towards real time (widgets included gauges) but good enough to support two different web browsers. No delays. This was a real time operating system.

- MacOS 8. Not the Linux thing, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. Probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be covered up to justify the bailout of NeXT by Apple to get Steve Jobs.

- Transaction processing operating systems. The first one was IBM's Customer Information Control System (CICS). A transaction processor is a kind of OS where everything is like a CGI program: load program, do something, exit program. Unix and Linux are, underneath, terminal-oriented time-sharing systems.

- IBM MicroChannel. Early minicomputer and microcomputer designers thought "bus", where peripherals can talk to memory and peripherals look like memory to the CPU. Mainframes, though, had "channels", simple processors which connected peripherals to the CPU. Channels could run simple channel programs, and managed device access to memory. IBM tried to introduce that with the PS2, but they made it proprietary and that failed in the marketplace. Today, everything has something like channels, but they're not a unified interface concept that simplifies the OS.

- CPUs that really hypervise properly. That is, virtual execution environments look just like real ones. IBM did that in VM, and it worked well because channels are a good abstraction for both a real machine and a VM. Storing into device registers to make things happen is not. x86 has added several layers below the "real machine" layer, and they're all hacks.

- The Motorola 680x0 series. Should have been the foundation of the microcomputer era, but it took way too long to get the MMU out the door. The original 68000 came out in 1978, but then Motorola fell behind.

- Modula. Modula-2 and Modula-3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC.

- XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly?

- Word Lens. Look at the world through your phone, and text is translated, standalone, on the device. No Internet connection required. Killed by Google in favor of hosted Google Translate.

  • ndiddy 2 days ago

    > MacOS 8. Not the Linux thing, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. Probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be covered up to justify the bailout of NeXT by Apple to get Steve Jobs.

    You have things backwards. The Copland project was horribly mismanaged. Anybody at Apple who came up with a new technology got it included in Copland, with no regard to feature creep or stability. There's a leaked build floating around from shortly before the project was cancelled. It's extremely unstable and even using basic desktop functionality causes hangs and crashes. In mid-late 1996, it became clear that Copland would never ship, and Apple decided the best course of action was to license an outside OS. They considered options such as Solaris, Windows NT, and BeOS, but of course ended up buying NeXT. Copland wasn't killed to justify buying NeXT, Apple bought NeXT because Copland was unshippable.

  • jasode 2 days ago

    >- XHTML. [...] Would it kill people to have to close their tags properly?

    XHTML appeals to the intuition that there should be a Strict Right Way To Do Things ... but you can't use that unforgiving framework for web documents that are widely shared.

    The "real world" has 2 types of file formats:

    (1) file types where consumers cannot contact/control/punish the authors (open-loop): HTML, pdf, zip, csv, etc. The common theme is that the data itself is more important than the file format. That's why Adobe Reader will read malformed pdf files written by buggy PDF libraries. And both 7-Zip and Winrar can read malformed zip files with broken headers (because some old buggy Java libraries wrote bad zip files). MS Excel can import malformed csv files. E.g. the Citi bank export to csv wrote a malformed file and it was desirable that MS Excel imported it anyway, because the raw data of dollar amounts was more important than the incorrect commas in the csv file -- and -- I have no way of contacting the programmer at Citi to tell them to fix their buggy code that created the bad csv file.

    (2) file types where the consumer can control the author (closed-loop): programming language source code like .c, .java, etc or business interchange documents like EDI. There's no need to have a "lenient forgiving" gcc/clang compiler to parse ".c" source code because the "consumer-and-author" will be the same person. I.e. the developer sees the compiler stop at a syntax error so they edit and fix it and try to re-compile. For business interchange formats like EDI, a company like Walmart can tell the vendor to fix their broken EDI files.

    XHTML wants to be in group (2) but web surfers can't control all the authors of .html so that's why lenient parsing of HTML "wins". XHTML would work better in a "closed-loop" environment such as a company writing internal documentation for its employees. E.g. an employee handbook can be written in strict XHTML because both the consumers and authors work at the same company. E.g. can't see the vacation policy because the XHTML syntax is wrong?!? Get on the Slack channel and tell the programmer or content author to fix it.

    • crote 2 days ago

      The problem is that group (1) results in a nightmarish race-to-the-bottom. File creators have zero incentive to create spec-compliant files, because there's no penalty for creating corrupted files. In practice this means a large proportion of documents are going to end up corrupt. Does it open in Chrome? Great, ship it! The file format is no longer the specification, but it has now become a wild guess at whatever weird garbage the incumbent is still willing to accept. This makes it virtually impossible to write a new parser, because the file format suddenly has no specification.

      On the other hand, imagine a world where Chrome would slowly start to phase out its quirks modes. Something like a yellow address bar and a "Chrome cannot guarantee the safety of your data on this website, as the website is malformed" warning message. Turn it into a red bar and a "click to continue" after 10 years, remove it altogether after 20 years. Suddenly it's no longer that one weird customer who is complaining, but everyone - including your manager. Your mistakes are painfully obvious during development, so you have a pretty good incentive to properly follow the spec. You make a mistake on a prominent page and the CTO sees it? Well, guess you'll be adding an XHTML validator to your CI pipeline next week!

      It is very tempting to write a lenient parser when you are just one small fish in a big ecosystem, but over time it will inevitably lead to the degradation of that very ecosystem. You need some kind of standards body to publish a validating reference parser. And like it or not, Chrome is big enough that it can act as one for HTML.

      • pixl97 2 days ago

        >File creators have zero incentive to create spec-compliant files, because there's no penalty for creating corrupted files

        This depends. If you are a small creator with a unique corruption, then you're likely out of luck. The problem with big creators is "fuck you, I do what I want."

        >"Chrome cannot guarantee the safety of your data on this website, as the website is malformed" warning message.

        This would appear on pretty much every website. And it would appear on websites that are no longer updated, so they'd functionally disappear from any updated browser. In addition, the 10-20 year timeline just won't work at US companies; simply put, if they get too much pressure on it next quarter, it's gone.

        >Your mistakes are painfully obvious during development,

        Except this isn't how a huge number of websites work. They get HTML from many sources and possibly libraries. Simply put, no one is going to follow your insanity, which is why XHTML never worked in the first place. They'll drop Chrome before they drop the massive number of existing and potential bugs out there.

        >And like it or not, Chrome is big enough that it can act as one for HTML.

        And hopefully in a few years between the EU and US someone will bust parts of them up.

        • immibis a day ago

          We don't accept this from any other file format - why is HTML different? For example, if I include random blocks of data in a JPEG file, the picture is all broken or the parser gives up (which is often turned into a partial picture by some abstraction layer that ignores the error code) - in both cases the end user treats it as completely broken. If I add random bytes to a Word or LibreOffice document, I expect it not to load at all.

      • bsimpson 2 days ago

        That would break decades of the web with no incentive for Google to do so. Plus, any change of that scale that they make is going to draw antitrust consideration from _somebody_.

      • drob518 2 days ago

        You’re right, but even standards bodies aren’t enough. At the end of the day, it’s always about what the dominant market leader will accept. The standard just gives your bitching about the corrupted files some abstract moral authority, but that’s about it.

    • afavour 2 days ago

      I’d argue a good comparison here is HTTPS. Everyone decided it would be good for sites to move over to serving via HTTPS so browsers incentivised people to move by gating newer features to HTTPS only. They could have easily done the same with XHTML had they wanted.

      • JimDabell 2 days ago

        The opportunities to fix this were pretty abundant. For instance, it would take exactly five words from Google to magically make a vast proportion of web pages valid XHTML:

        > We rank valid XHTML higher

        It doesn’t even have to be true!

        • pixl97 2 days ago

          Even more reason to break Google up.

    • layer8 2 days ago

      > That's why Adobe Reader will read malformed pdf files written by buggy PDF libraries.

      No, the reason is that Adobe’s implementation never bothered to perform much validation, and then couldn’t add strict validation retroactively because it would break too many existing documents.

      And it’s really the same for HTML.

    • Fluorescence 2 days ago

      This is an argument for a repair function that transforms a broken document into a well-formed one without loss but keeps the spec small, simple and consistent. It's not an argument for baking malformations into a complex messy spec.

    • easyThrowaway a day ago

      We could've made the same arguments for supporting Adobe Flash on the iPhone.

      And yet Apple decided that no, this time we do it the "right" way[1], stuck with plain HTML/CSS/JS and frankly we're all better for it.

      [1] I'm aware this is a massive oversimplification and there were more cynical reasons behind dropping the flash runtime from iOS, but they're not strictly relevant to this discussion.

  • JimDabell 2 days ago

    > - XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly?

    Amen. Postel’s Law was wrong:

    https://datatracker.ietf.org/doc/html/rfc9413

    We stop at the first sign of trouble for almost every other format; we do not need lax parsing for HTML. Lax parsing has caused a multitude of security vulnerabilities and only makes things more difficult for pretty much everybody.

    The attitude towards HTML5 parsing seemed to grow out of this weird contrarianism that everybody who wanted to do better than whatever Internet Explorer did had their head in the clouds and that the role of a standard was just to write down all the bugs.
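
    (The two models are easy to contrast with Python's standard library; this is a sketch of the general strict-vs-lax behavior, not a claim about any particular browser.)

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

bad = "<html><body><p><b>bold runs on</p></body></html>"  # <b> never closed

# XML / XHTML model: stop at the first sign of trouble.
try:
    ET.fromstring(bad)
    strict_ok = True
except ET.ParseError:
    strict_ok = False

# HTML model: swallow the error and carry on.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

lax = TagCollector()
lax.feed(bad)
print(strict_ok, lax.tags)  # False ['html', 'body', 'p', 'b']
```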

    • maratc 2 days ago

      Just to remind you that <bold> <italic> text </bold> </italic> [0], which has worked for ages in every browser ever, is NOT valid XHTML and should be rejected under GP's proposal.

      I, for one, am kinda happy that XHTML is dead.

      [0]: By <bold> I mean <b> and by <italic> I mean <i>, and the reason it's not valid HTML is that the order of closing is not reverse of the order of opening as it should properly be.

      • JimDabell 2 days ago

        That caused plenty of incompatibilities in the past. At one point, Internet Explorer would parse that and end up with something that wasn’t even a tree.

        HTML is not a set of instructions that you follow. It’s a terrible format if you treat it that way.

      • reactordev 2 days ago

        Actually, it's not even well-formed XML, so the schema isn't the issue.

        XHTML does let you use arbitrary XML, and <bold> <italic> would be fine as schema-less XML nodes, but the mis-ordered closing tags break well-formedness, so an XHTML parser would reject it regardless. The correct forms have been and will always be <b> and <i>. Since the beginning.

  • dimal 2 days ago

    I was all gung ho on XHTML back in the day until I realized that a single unclosed tag in an ad or another portion of our app that I had no control over would cause the entire page to fail. The user would see nothing except a giant ugly error. And your solution of rendering the rest of the page in Times New Roman isn’t an option. Do you try to maintain any of the HTML semantics or just render plain text? If it’s plain text, that’s useless. If you’re rendering anything with any semantics, then you need to know how to parse it. You’re back where you started.

    Granted, I could ensure that my code was valid XHTML, but I’m a hypermeticulous autistic weirdo, and most other people aren’t. As much as XHTML “made sense”, it was completely unworkable in reality, because most people are slobs. Sometimes, worse really is better.

    • markasoftware 2 days ago

      if the world was all XHTML, then you wouldn't put an ad on your site that wasn't valid XHTML, the same way you wouldn't import a python library that's not valid python.

      • jasode 2 days ago

        >, then you wouldn't put an ad on your site that wasn't valid XHTML,

        You're overlooking how incentives and motivations work. The gp (and their employer) wants to integrate the advertisement snippet -- even with broken XHTML -- because they receive money for it.

        The semantic data ("advertiser's message") is more important than the format ("purity of perfect XHTML").

        Same incentives would happen with a jobs listing website like Monster.com. Consider that it currently has lots of red errors with incorrect HTML: https://validator.w3.org/nu/?doc=https%3A%2F%2Fwww.monster.c...

        If there was a hypothetical browser that refused to load that Monster.com webpage full of errors because it's for the users' own good and the "good of the ecosystem"... the websurfers would perceive that web browser as user-hostile and would choose another browser that would be forgiving of those errors and just load the page. Job hunters care more about the raw data of the actual job listings so they can get a paycheck rather than invalid <style> tags nested inside <div> tags.

        Those situations above are a different category (semantic_content-overrides-fileformatsyntax) than a developer trying to import a Python library with invalid syntax (fileformatsyntax-Is-The-Semantic_Content).

        EDIT reply to: >Make the advertisement block an iframe [...] If the advertiser delivers invalid XHTML code, only the advertisement won't render.

        You're proposing a "technical solution" to avoid errors instead of a "business solution" to achieve a desired monetary objective. To re-iterate, they want to render the invalid XHTML code so your idea to just not render it is the opposite of the goal.

        In other words, if rendering imperfect-HTML helps the business goal more than blanking out invalid XHTML in an iframe, that means HTML "wins" in the marketplace of ideas.

      • mikehall314 2 days ago

        But all it takes in that world is for a single browser vendor to decide - hey, we will even render broken XHTML, because we would rather show something than nothing - and you’re back to square one.

        I know which I, as a user, would prefer. I want to use a browser which lets me see the website, not just a parse error. I don’t care if the code is correct.

      • troupo 2 days ago

        Yes, you would be able to put an ad on your site that wasn't XHTML, because XHTML is just text parsed in the browser at runtime. And yes, that would fail, silently, or with a cryptic error

    • easyThrowaway a day ago

      The most sensible option would be to just show the error for the ad part of the website.

      Also, the whole argument falls apart the moment the banner has a JavaScript error too. Should we attempt to run malformed code just in case? Or should browsers start shipping shims and compatibility fixes for known broken websites, like Microsoft does for Windows apps?

  • eterm 2 days ago

    > Would it kill people to have to close their tags properly

    It would kill the approachability of the language.

    One of the joys of learning HTML when it tended to be hand-written was that if you made a mistake, you'd still see something just with distorted output.

    That was a lot more approachable for a lot of people who were put off "real" programming languages because they were overwhelmed by terrible error messages any time they missed a bracket or misspelled something.

    If you've learned to program in the last decade or two, you might not even realise just how bad compiler errors tended to be in most languages.

    The kind of thing where you could miss a bracket on line 47 but end up with a compiler error complaining about something 20 lines away.

    Rust (in particular) got everyone to up their game with respect to meaningful compiler errors.

    But in the days of XHTML? Error messages were arcane, you had to dive in to see what the problem actually was.

    • bazoom42 2 days ago

      If you forget a closing quote on an attribute in HTML, all content until the next quote is ignored and not rendered - even if it's the rest of the page. I don't think this is more helpful than an error message; it was just simpler to implement.

      • eterm 2 days ago

        Let's say you forget to close a <b></b> element.

        What happens?

        Even today, after years of better error messages, the strict validator at https://validator.w3.org/check says:

            Error Line 22, Column 4: end tag for "b" omitted, but OMITTAG NO was specified 
        
        What is line 22?

            </p>
        
        
        It's up to you to go hunting back through the document, to find the un-closed 'b' tag.

        Back in the day, the error messages were even more misleading than this, often talking about "Extra content at end of document" or similar.

        Compare that to the very visual feedback of putting this exact document into a browser.

        You get more bold text than you were expecting, the bold just runs into the next text.

        That's a world of difference, especially for people who prefer visual feedback to reading and understanding errors in text form.

        Try it for yourself, save this document to a .html file and put it through the XHTML validator.

            <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
                "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
            <?xml-stylesheet href="http://www.w3.org/StyleSheets/TR/W3C-WD.css" type="text/css"?>
            <html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
        
            <head>
              <title>test XHTML 1.0 Strict document</title>
              <link rev="made" href="mailto:gerald@w3.org" />
            </head>
        
            <body>
        
            <p>
            This is a test XHTML 1.0 Strict document.
            </p>
        
            <p>
            See: <a href="./">W3C Markup Validation Service: Tests</a>
            <b>huh
            Well, isn't that good
        
            </p>
        
            <hr />
        
            <address>
              <a href="https://validator.w3.org/check?uri=referer">valid HTML</a><br />
              <a href="../../feedback.html">Gerald Oskoboiny</a>
            </address>
        
            </body>
        
            </html>
    • 1718627440 a day ago

      I can "handwrite" C, Python, etc. just fine and they don't assign fallback meanings to syntax errors.

    • MangoToupe 2 days ago

      > Rust ( in particular ) got everyone to bring up their game with respect to meaningful compiler errors.

      This was also part of the initial draw of `clang`.

  • PaulRobinson 2 days ago

    Nice list. Some thoughts:

    - I think without the move to NeXT, even if Jobs had come back to Apple, they would never have been able to get to the iPhone. iOS was - and still is - a unix-like OS, using unix-like philosophy, and I think that philosophy allowed them to build something game-changing compared to the SOTA in mobile OS technology at the time. So much so, Android follows suit. It doesn't have a command line, and installation is fine, so I'm not sure your line of reasoning holds strongly. One thing I think you might be hinting at though that is a missed trick: macOS today could learn a little from the way iOS and iPadOS is forced to do things and centralise configuration in a single place.

    - I think transaction processing operating systems have been reinvented today as "serverless". The load/execute/quit cycle you describe is how you build in AWS Lambda, GCP Cloud Run Functions, or Azure Functions.

    - Most of your other ideas (with an exception, see below) died because of people trying to grab money rather than build cool tech, and arguably the free market decided to vote with its feet. I do wonder when we might next get a major change in hardware architectures again, though - it does feel like we've now got "x86" and "ARM" and that's that for the next generation.

    - XHTML died because it was too hard for people to get stuff done. The forgiving nature of the HTML specs is a feature, not a bug. We shouldn't expect people to be experts at reading specs to publish on the web, nor should it need special software that gatekeeps the web. It needs to be scrappy, messy, and evolutionary, because it is a technology that serves people - we don't want people to serve the technology.
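
    The "load/execute/quit" point above can be sketched as a Lambda-style handler. This is a generic illustration rather than anything from the thread; the event field is a made-up example, though the `handler(event, context)` signature is the standard AWS Lambda Python convention:

```python
import json

# Minimal sketch of the load/execute/quit lifecycle as an AWS Lambda-style
# handler. The runtime loads the module, calls the handler once per request,
# and may tear the whole process down afterwards, so no resident state is
# assumed to survive between invocations.
def handler(event, context=None):
    name = event.get("name", "world")  # hypothetical event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

    The same shape maps onto GCP Cloud Run Functions or Azure Functions, just with different entry-point conventions.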

    • JimDabell 2 days ago

      > XHTML died because it was too hard for people to get stuff done.

      This is not true. The reason it died was that Internet Explorer 6 didn’t support it, and that hung around for about a decade and a half. There was no way for XHTML to succeed given that situation.

      The syntax errors that cause XHTML to stop parsing also cause JSX to stop parsing. If this kind of thing really were a problem, it would have killed React.

      People can deal with strict syntax. They can manage it with JSX, they can manage it with JSON, they can manage it with JavaScript, they can manage it with every back-end language like Python, PHP, Ruby, etc. The idea that people see XHTML being parsed strictly and give up has never had any truth to it.

      • troupo 2 days ago

        > The syntax errors that cause XHTML to stop parsing also cause JSX to stop parsing. If this kind of thing really were a problem, it would have killed React.

        JSX is processed during the build step, XHTML is processed at runtime, by the browser.

    • eloisant 2 days ago

      They would have gotten another modern OS instead of Next as the base for MacOSX (then iOS).

      Another possibility they were exploring was buying BeOS, which would have been pretty interesting because it was an OS built from scratch in the 90's without any of the cruft from the 70's.

      Also, the only thing specific to NeXT that survived in MacOSX and iOS was Objective-C and the whole NeXTSTEP APIs, which honestly I don't think is a great thing. It was pretty cool in the 90's, but by the time the iPhone was released it was already kinda obsolete. For the kernel, Linux or FreeBSD would have worked just the same.

      • otabdeveloper4 2 days ago

        > without any of the cruft from the 70's

        By "cruft" you mean "lessons learned", right?

    • pjmlp a day ago

      There is hardly any UNIX stuff for iOS and Android applications sold via the respective app stores.

      You won't get far with POSIX on any of the platforms.

    • bsimpson 2 days ago

      Didn't Google already own Android when iOS was announced?

      • troupo 2 days ago

        Yes, and they were going to position it against Windows Mobile.

        When iOS was announced, Google scrambled to re-do the entire concept

    • donatj 2 days ago

      On XHTML, I think there was room for both HTML and a proper XHTML that barks on errors. If you're a human typing HTML, or using a language where you build your HTML by concatenation like early PHP, sure, it makes sense to allow loosey-goosey HTML. But if you're using any sort of simple DOM builder, which should preclude the possibility of outputting invalid HTML, strict XHTML makes a lot more sense.

      Honestly I'm disappointed the promised XHTML5 never materialized alongside HTML5. I guess it just lost steam.

      • detaro 2 days ago

        But an HTML5 parser will obviously parse "strict" HTML5 just fine too, so what value is there in special-casing the "this was generated by a DOM builder" path client-side?

      • chrismorgan 2 days ago

        > Honestly I'm disappointed the promised XHTML5 never materialized along side HTML5. I guess it just lost steam.

        The HTML Standard supports two syntaxes, HTML and XML. All browsers support XML syntax just fine—always have, and probably always will. Serve your file as application/xhtml+xml, and go ham.
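
        "Serve your file as application/xhtml+xml" needs nothing more than a MIME-type mapping. Not from the thread, but here's a sketch using Python's stdlib http.server; the class name is mine:

```python
from http.server import SimpleHTTPRequestHandler

# Sketch: map the .xhtml extension onto the XHTML media type so browsers
# use the strict XML parser instead of the forgiving HTML one.
class XHTMLHandler(SimpleHTTPRequestHandler):
    extensions_map = {
        **SimpleHTTPRequestHandler.extensions_map,
        ".xhtml": "application/xhtml+xml",
    }

# To serve the current directory:
#   from http.server import HTTPServer
#   HTTPServer(("", 8000), XHTMLHandler).serve_forever()
```

        Any real web server can do the same thing with one line of config; the point is only that the media type, not the doctype, is what switches the browser into strict XML mode.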

  • archargelod 2 days ago

    > Modula. Modula 2 and 3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC.

    If you appreciate Modula's design, take a look at Nim[1].

    I remember reading the Wikipedia page for Modula-3[2] and thinking "huh, that's just like Nim" in every other section.

    [1] https://nim-lang.org

    [2] https://en.wikipedia.org/wiki/Modula-3

    • pjmlp a day ago

      Swift, D and C# are also quite close to Modula-3 in spirit and features.

  • bazoom42 2 days ago

    > Would it kill people to have to close their tags properly?

    Probably not, but what would be the benefit of having more pages fail to render? If XHTML had been coupled with some cool features which only worked in XHTML mode, it might have become successful, but on its own it does not provide much value.

    • defanor 2 days ago

      > but what would be the benefit of having more pages fail to render?

      I think those benefits are quite similar to having more programs failing to run (due to static and strong typing, other static analysis, and/or elimination of undefined behavior, for instance), or more data failing to be read (due to integrity checks and simply strict parsing): as a user, you get documents closer to valid ones (at least in the rough format), if anything at all, and additionally that discourages developers from shipping a mess. Then parsers (not just those in viewers, but anything that does processing) have a better chance to read and interpret those documents consistently, so even more things work predictably.
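
      Not from the thread, but the contrast can be demonstrated with Python's standard library: a strict XML parser rejects a document with an unclosed tag outright, while the forgiving HTML parser recovers and keeps going:

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

broken = "<p><b>bold text</p>"  # <b> is never closed

# Strict: the XML parser fails loudly at the first error.
try:
    ET.fromstring(broken)
    strict_ok = True
except ET.ParseError:
    strict_ok = False

# Forgiving: the HTML parser consumes the same input without complaint.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(broken)

print(strict_ok)       # False: the XML parser rejected it
print(collector.tags)  # ['p', 'b']: the HTML parser carried on
```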

      • bazoom42 2 days ago

        Sure, authoring tools should help authors avoid mistakes and produce valid content. But the browser is a tool for the consumer of content, and there is no benefit for the user if it fails to render some existing pages.

        It is like Windows jumping through hoops to support backwards compatibility even with buggy software. The interest of the customer is that the software runs.

      • detaro 2 days ago

        HTML5 was the answer for the consistency part: where before browsers did different things to recover from "invalid" HTML, HTML5 standardizes it because it doesn't care about valid/invalid as much, it just describes behavior anyways.

    • Eric_WVGG 2 days ago

      I used to run an RSS feed consolidator, badly formed XML was the bane of my life for a very long time.

      If devs couldn't even get RSS right, a web built on XHTML was a nonstarter.

  • bawolff 2 days ago

    > XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized?

    I actually have, and it's not that bad.

    If anything, the worst part is foreign content (SVG, MathML), which has different rules, more similar to XML but also not the same as XML.

    Just as an aside, browsers still support XHTML: just serve it with the application/xhtml+xml MIME type and it all works, including aggressive error checking. This is very much a situation where consumers are voting with their feet, not browser vendors forcing a choice.

  • MarsIronPI 2 days ago

    > - XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly?

    IMO there's a place for XHTML as a generated output format, but I think HTML itself should stay easy to author and lightweight as a markup format. Specifically when it comes to tag omission, if I'm writing text I don't want to see a bunch of `</li>` or `</p>` everywhere. It's visual noise, and I just want a lightweight markup.

  • Eric_WVGG 2 days ago

    +1 Copland

    BeOS. I like to daydream about an alternate reality where it was acquired by Sony, and used as the foundation for PlayStation, Sony smartphones, and eventually a viable alternative to Windows on their Vaio line.

    Neal Stephenson, https://web.stanford.edu/class/cs81n/command.txt :

    > Imagine a crossroads where four competing auto dealerships are situated… (Apple) sold motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery.

    > (Microsoft) is much, much bigger… the big dealership came out with a full-fledged car: a colossal station wagon (Windows 95). It had all the aesthetic appeal of a Soviet worker housing block, it leaked oil and blew gaskets, and it was an enormous success.

    > On the other side of the road… (Be, Inc.) is selling fully operational Batmobiles (the BeOS). They are more beautiful and stylish even than the Euro-sedans, better designed, more technologically advanced, and at least as reliable as anything else on the market--and yet cheaper than the others.

    > … and Linux, which is right next door, and which is not a business at all. It's a bunch of RVs, yurts, tepees, and geodesic domes set up in a field and organized by consensus. The people who live there are making tanks.

    It would be years before OS X could handle things that wouldn’t cause BeOS to break a sweat, and BeOS had a bit of a responsiveness edge that OS X still can't seem to match (probably due to the PDF rendering layer).

  • d3Xt3r 2 days ago

    In addition to Photon, I would say QNX itself (the desktop OS). I ran QNX 6 Neutrino on my PIII 450 back in the day, and the experience was so much better than every other mainstream OS on the market. The thing that blew me away was how responsive the desktop was while multitasking, something Linux struggled with even decades later.

    Similarly, I'm also gutted that the QNX 1.44MB demo floppy didn't survive past the floppy era - they had some really good tech there. Imagine if they pitched it as a rescue/recovery OS for PCs, you could've run it entirely from the UEFI. Or say as an OS for smart TVs and other consumer smart devices.

  • pjmlp a day ago

    Somehow I feel C# has become the right successor to Modula-3 ideas, even if it has taken 25 years to get there.

    GCC nowadays has Modula-2 as official frontend, not sure how much it will get used though.

    XHTML, yep I miss it, was quite into it back then.

  • bobmcnamara 2 days ago

    > IBM MicroChannel. Early minicomputer and microcomputer designers thought "bus", where peripherals can talk to memory and peripherals look like memory to the CPU. Mainframes, though, had "channels", simple processors which connected peripherals to the CPU.

    TIL: what microchannel meant by micro and channel.

    Also it had OS independent device-class drivers.

    And you could stuff a new CPU on a card and pop it right in. Went from a 286+2MB to a 486dx2+32MB.

  • kanwisher 2 days ago

    The Word Lens team was bought by Google; it's far better in Google Translate than the local app ever was. You could recreate the old app with a local LLM now pretty easily, but it still won't come close in quality to Google Translate.

  • incognito124 2 days ago

    > word lens

    I don't know if you know it, but that's a feature of Google Lens.

  • le-mark 2 days ago

    CICS is still going strong as part of z/OS. There are industries where green-screen, mainframe terminal apps still rule, and CICS is driving them.

    • ethbr1 2 days ago

      CICS seems perfectly fine in problem spaces where requirements change slowly enough that one can trade development time for reliability (read: finance and insurance).

  • elric 2 days ago

    I love this mismatched list of grievances and I find myself agreeing with most of them. XHTML and proper CPU hypervisors in particular.

    People being too lazy to close the <br /> tag was apparently a gateway drug into absolute mayhem. Modern HTML is a cesspool. I would hate to have to write a parser that's tolerant enough to deal with all the garbage people throw at it. Is that part of the reason why we have so few browsers?

    • chrismorgan 2 days ago

      > People being too lazy to close the <br /> tag was apparently a gateway drug into absolute mayhem.

      Your chronology is waaaaaaaaaaaay off.

      <BR> came years before XML was invented. It was a tag that didn’t permit children, so writing it <BR></BR> would have been crazy, and inventing a new syntax like <BR// or <BR/> would have been crazy too. Spelling it <BR> was the obvious and reasonable choice.

      The <br /> or <br/> spelling was added to HTML after XHTML had already basically lost, as a compatibility measure for porting back to HTML, since those enthusiastic about XHTML had taken to writing it and it was nice having a compatible spelling that did the same in both. (In XHTML you could also write <br></br>, but that was incorrect in HTML; and if you wrote <br /> in HTML it was equivalent to <br /="">, giving you one attribute with name "/" and value "". There were a few growing pains there, such as how <input checked> used to mean <input checked="checked">—it was actually the attribute name that was being omitted, not the value!—except… oh why am I even writing this, messy messy history stuff, engines doing their own thing blah blah blah, these days it’s <input checked="">.)

      Really, the whole <… /> thing is more an artefact of an arguably-misguided idea after a failed reform. The absolute mayhem came first, not last.

      > I would hate to have to write a parser that's tolerant enough to deal with all the garbage people throw at it.

      The HTML parser is magnificent, by far the best spec for something reasonably-sized that I know of. It’s exhaustively defined in terms of state machines. It’s huge, far larger than one would like it to be because of all this compatibility stuff, but genuinely easy to implement if you have the patience. Seriously, go read it some time, it’s really quite approachable.

    • detaro 2 days ago

      Not really. HTML5 parsing is very well documented and quite easy compared to all the other things a browser needs.

  • mberning 2 days ago

    CICS and HATS are perhaps the most annoying pieces of technology I’ve ever encountered.

  • Timwi 2 days ago

    The reason XHTML failed is that the spec required it to be sent with a new MIME type (application/xhtml+xml I believe), which no webserver did out of the box. Everything defaulted to text/html, which all browsers would interpret as HTML, and given the mismatching doctype, would interpret as tag soup (quirks mode/lenient).

    Meanwhile, local files with the doctype would be treated as XHTML, so people assumed the doctype was all you needed. So everyone who tried to use XHTML didn't realize that it would go back to being read as HTML when they upload it to their webserver/return it from PHP/etc. Then, when something went wrong/worked differently than expected, the author would blame XHTML.

    Edit: I see that I'm getting downvoted here; if any of this is factually incorrect I would like to be educated please.

    • JimDabell 2 days ago

      > The reason XHTML failed is because the spec required it to be sent with a new MIME type (application/xml+xhtml I believe) which no webserver did out of the box. Everything defaulted to text/html, which all browsers would interpret as HTML, and given the mismatching doctype, would interpret as tag soup (quirks mode/lenient).

      None of that is correct.

      It was perfectly spec. compliant to label XHTML as text/html. The spec. that covers this is RFC 2854 and it states:

      > The text/html media type is now defined by W3C Recommendations; the latest published version is [HTML401]. In addition, [XHTML1] defines a profile of use of XHTML which is compatible with HTML 4.01 and which may also be labeled as text/html.

      https://datatracker.ietf.org/doc/html/rfc2854

      There’s no spec. that says you need to parse XHTML served as text/html as HTML not XHTML. As the spec. says, text/html covers both HTML and XHTML. That’s something that browsers did but had no obligation to.

      The mismatched doctype didn’t trigger quirks mode. Browsers don’t care about that. The prologue could, but XHTML 1.0 Appendix C told you not to use that anyway.

      Even if it did trigger quirks mode, that makes no difference in terms of tag soup. Tag soup is when you mis-nest tags, for instance <strong><em></strong></em>. Quirks mode was predominantly about how it applied CSS layout. There are three different concepts being mixed up here: being parsed as HTML, parsing tag soup, and doctype switching.

      The problem with serving application/xhtml+xml wasn’t anything to do with web servers. The problem was that Internet Explorer 6 didn’t support it. After Microsoft won the browser wars, they discontinued development and there was a five year gap between Internet Explorer 6 and 7. Combined with long upgrade cycles and operating system requirements, this meant that Internet Explorer 6 had to be supported for almost 15 years globally.

      Obviously, if you can’t serve XHTML in a way browsers will parse as XML for a decade and a half, this inevitably kills XHTML.

      • Timwi a day ago

        Okay, I guess I got a fair bit of the details wrong. However, there's one detail I want to push back on:

        > In addition, [XHTML1] defines a profile of use of XHTML which is compatible with HTML 4.01 and which may also be labeled as text/html.

        If you read this carefully, you'll see that it's not saying that text/html can be used to label XHTML. It's saying that you can use text/html if you write your XHTML in such a way that it's compatible with HTML 4.01, because the browser will parse and interpret it as HTML.

        You're correct that the doctype wasn't the reason it was treated as tag soup. It was instead because of the parts of XHTML that are not directly compatible with HTML 4.01.

        The mismatch between local files and websites served as text/html was very real and I experienced it myself. It's curious that you'd think I'd make it up. There were differences in behavior, especially when JavaScript was involved (notably: Element.tagName is all-uppercase in HTML but lowercase in XHTML) and it is absolutely the case that developers like myself blamed this on XHTML.

    • crote 2 days ago

      Isn't that what the <!DOCTYPE> tag was supposed to solve?

      • Timwi 2 days ago

        Yes, I covered that; everyone assumed that you only needed to specify the doctype, but in practice browsers only accepted it for local files or HTTP responses with Content-Type: application/xhtml+xml. I've edited the comment to make that more explicit.

        • crote 2 days ago

          Ah, I see. Yeah, that's a bit silly. They should've gone for "MUST have doctype, SHOULD have content type".

BirAdam 2 days ago

Google Wave.

Edit: you asked why. I first saw it at SELF where Chris DiBona showed it to me and a close friend. It was awesome. Real time translation, integration of various types of messaging, tons of cool capabilities, and it was fully open source. What made it out of Google was a stripped down version of what I was shown, the market rejected it, and it was a sad day. Now, I am left with JIRA, Slack, and email. It sucks.

  • mikewarot 2 days ago

    Google Wave was built on an awesome technology layer, and then they totally blew it on the user interface... deciding to treat it as a set of separate items instead of a single document that everyone, everywhere, could edit all at once... killed it.

    It made it seem needlessly complicated, and effectively erased all the positives.

    • vendiddy 2 days ago

      I think this is spot on. A document metaphor would have made a Wave a lot easier to understand.

  • socalgal2 2 days ago

    I was blown away by the demo, but after I thought about it, it seemed like a nightmare to me: all the problems of Slack, of having to manually check channels for updates, except x100. (Yeah, I get that Slack wasn't available then.) My point is that it seemed impossible to keep up with nested, constantly updated hierarchical threads. Keeping up with channels on Slack is bad enough, so imagine if Wave had succeeded. It'd be even worse.

    • prisenco 2 days ago

      Wave was great for conversation with one or two other people on a specific project, which I'm sure most people here used it for. I can't imagine it scaling well beyond that.

    • pwlm a day ago

      Twitter has hierarchical threads and it succeeded.

      Mailing lists use hierarchical threads and they haven't gone away.

    • LargoLasskhyfv 2 days ago

      Maybe one could have worked around that by embedding Yahoo Pipes, thus automating the X 100.

  • brap 2 days ago

    Google Wave had awesome tech but if you look at the demo in hindsight you can tell it’s just not a very good product. They tried making an all-in-one kind of product which just doesn’t work.

    In a sense Wave still exists but was split into multiple products, so I wouldn’t say it’s “dead”. The tech that powered it is still used today in many of Google’s popular products. It turns out that having separate interfaces for separate purposes is just more user friendly than an all-in-one.

  • aftergibson 2 days ago

    I managed trips with friends and it was a great form factor for ad-hoc discussions with docs and links included. I thought it was the future and in my very early programming days wrote probably the most insecure plugin ever to manage your servers.

    https://github.com/shano/Wave-ServerAdmin

    It's been 16 years. I should probably archive this..

  • edanm 2 days ago

    Immediately thought of this.

    Even the watered-down version of wave was something I used at my host startup, it was effectively our project management tool. And it was amazing at that.

    I don't know how it would fare compared to the options available today, but back then, it shutting down was a tremendous loss.

  • drnick1 2 days ago

    Isn't Nextcloud (including Nextcloud Talk) a viable alternative? Certainly, something like Discord (centralized and closed source) isn't.

  • gwbas1c 2 days ago

    It was smoke and mirrors, spiced with everyone letting their imagination run away.

    I downloaded the open-source version of the server to see if I could build a product around it, but it came with a serious limitation: The open-source server did not persist any data. That was a complete non-starter for me.

    At that point I suspected it wasn't going anywhere. My suspicions were confirmed when I sat near some Wave team members at an event, and overheard one say, with stars in his eyes, "won't it be groovy when everyone's using Wave and..."

    ---

    Cool concept, though.

  • 1-more a day ago

    Studied for some CS/EE final with my class on Google Wave. It absolutely rocked for making a study guide together. I can't even really remember how it worked, just that I was blown away by it.

  • delduca 2 days ago

    Slack is the new Google Wave; Wave was too far ahead of its time.

  • spooky_deep 2 days ago

    Is there a video or anything of this version of Wave?

    • BirAdam 2 days ago

      I haven’t found one showing what Chris showed. Most seem to focus on just communications with little demonstration of productivity or other features. This is sad to me because its most glorious asset was being open source with a rich set of plugins/extensions allowing tons of functionality.

  • jwpapi 2 days ago

    Discord is, function-wise, the best now...

    • portaouflop 2 days ago

      I don’t get the downvotes. Discord for all its flaws is amazing. I never experienced wave so maybe the comparison is not a good one?

      • progval 2 days ago

        It's indeed not a good one. Discord refined instant messaging and bolts other things on top, like forums, but isn't fundamentally different. Google Wave was (and still is) a completely different paradigm. Everything was natively collaborative: it mixed instant messaging with document editing (like Google Docs or pads), and any widget you could think of (polls, calendars, playing music, drawing, ...) could be added by users through sandboxed JavaScript. The current closest I can think of is DeltaChat's webxdc.

  • bdangubic 2 days ago

    wave was fucking amazing. buggy but amazing

    • burnt-resistor 2 days ago

      Google sucked/s at executive function because they completely lack appreciation for proper R&D and long-term investment and also kill things people use and love.

      • smrtinsert 2 days ago

        Honestly, a lot of the time they seem to be in "what do humans want?" mode.

bxparks 2 days ago

A lot of things on https://killedbygoogle.com/ . I used to use 30-40 Google products and services. I'm down to 3-4.

Google Picasa: Everything local, so fast, so good. I'm never going to give my photos to G Photos.

Google Hangouts: Can't keep track of all the Google chat apps. I use Signal now.

Google G Suite Legacy: It was supposed to be free forever. They killed it, tried to make me pay. I migrated out of Google.

Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.

Google Finance: Tracked my stocks and funds there. Then they killed it. Won't trust them with my data again.

Google NFC Wallet: They killed it. Then Apple launched the same thing, and took over.

Google Chromecast Audio: It did one thing, which is all I needed. Sold mine as soon as they announced they were killing it.

Google Chromecast: Wait, they killed Chromecast? I did not know that until I started writing this..

  • brandonb927 2 days ago

    Google Reader: I will forever be salty about how Google killed something that likely required very little maintenance in the long run. It could have stayed exactly the same for a decade and I wouldn't have cared, because I use an RSS reader exactly the same way I did back in 2015.

    • nine_k 2 days ago

      Yes. That was the single worst business decision in Google history, as somebody correctly noted. It burned an enormous amount of goodwill for no gain whatsoever.

      Killing Google Reader affected a relatively small number of users, but these users disproportionately happened to be founders, CTOs, VPs of engineering, social media luminaries, and people who eventually became founders, CTOs, etc. They had been painfully taught not to trust Google, and, since that time, they didn't. And still don't.

      • perardi 2 days ago

        Just think of the data mining they could have had there.

        They had a core set of ultra-connected users who touched key aspects of the entire tech industry. The knowledge graph you could have built out of what those people read and shared…

        They could have just kept the entire service running with, what, 2 software engineers? Such a waste.

        • nine_k 2 days ago

          This would require the decision-maker to think and act at the scale and in interests of the entire company. Not at the scale of a promo packet for next perf: "saved several millions in operation costs by shutting down a low-impact, unprofitable service."

      • eloisant 2 days ago

        Yes, Google killing Reader was probably the first time they killed a popular product and what started the idea that any Google product could be killed at any time.

      • benjaminwootton 2 days ago

        There is some truth in this. I fit into a few of these buckets and I don’t think I could ever recommend their enterprise stuff after having my favourite consumer products pulled.

    • Spooky23 2 days ago

      Yes! I loved this product… it was our little social network for my friends and coworkers.

    • ta12653421 2 days ago

      I never understood why no one built a copycat (like "bgr" -> "better google reader" :-D). There would have been a clear chance to fill this vacuum?

      The thing is: I guess they didn't see a good way to monetize it (according to their "metrics"), while the product itself had relatively high OpEx and was somewhat of a niche thing.

      • noirscape 21 hours ago

        Killing Reader didn't just kill Reader. It killed the expectation of RSS to be a valid default consumption format of the internet. These days, if you use RSS, it's either relying on some legacy hidden feed feature that hasn't been shuttered yet (lots of Rails and WordPress sites that are like this) or you're explicitly adding RSS to your site as a statement.

        Picking up the pieces after Reader was impossible because the entire RSS ecosystem imploded with it. Almost every single news site decided that, with Reader killed, they wouldn't bother maintaining their RSS feeds, leaving them basically all "legacy" until they irrevocably break one day and get shut down because nobody wants to maintain them.

      • starkparker a day ago

        > I never understood why noone built a Copycat (like "bgr" -> "better google reader :-D)

        like theoldreader and Inoreader, which explicitly copied the columnar interfaces, non-RSS bookmarklet content saving, item favoriting, friend-of-a-friend commenting and quasi-blog social sharing features, and mobile app sync options via APIs? Or NewsBlur, which did all of that _and also_ added user-configurable algorithmic filtering? Or Feedly, which copied Reader's UX but without the social features? or Tiny Tiny RSS and FreshRSS, which copied Reader's UX as self-hosted software?

        theoldreader remains the most straightforward hosted ripoff of Google Reader, right down to look and feel, and hasn't changed much in more than a decade. Tiny Tiny is very similar, and similarly unchanging. FreshRSS implemented some non-RSS following features. So did NewsBlur, but as it always has, it still struggles with feed parsing and UI performance.

        Inoreader and Feedly both pivoted toward business users and productivity to stay afloat, with the former's ditching of social features leading to another exodus of people who'd switched to it after Google Reader folded.

      • janwl a day ago

        There were a few copycats, but they 1) weren't as good (mostly because they wanted to do more than google reader!) and 2) they weren't free.

  • huhkerrf 2 days ago

    > Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.

    You can argue whether it's as good as GPM or not, but it's false to imply that your uploaded music disappeared when Google moved to YouTube Music. I made the transition, and all of my music moved without a new upload.

    • david_allison 2 days ago

      YouTube Music isn't available in all countries which Google Play Music was available in.

      My music was deleted.

    • shakna 2 days ago

      You made the transition, under differing licensing terms. Not always an option.

  • mikewarot 2 days ago

    Picasa was awesome, they had face recognition years before almost everything else, in a nice offline package.

    Unfortunately the last public version has a bug that randomly swaps face tags, so you end up training on the wrong persons faces just enough to throw it all off, and the recognition becomes effectively worthless on thousands of family photos. 8(

    Digikam is a weak sauce replacement that barely gets the job done.

  • nja 2 days ago

    Chromecast Audio still works! They just don't sell them anymore. I use mine every day, and have been keeping an eye out for anyone selling theirs...

    • bxparks 2 days ago

      Hmm, good to know. But given Google's history, I assumed that it would stop working.

      I also need to sell my Google Chromecast with Google TV 4K. Brand new, still in its shrink wrap. Bought it last year, to replace a flaky Roku. It was a flaky HDMI cable instead. I trust Roku more than Google for hardware support.

  • tdeck 2 days ago

    I'm still amused that they killed Google Notebook and then a few years later created Google Keep, an application with basically the same exact feature set.

    • TheCapeGreek 2 days ago

      You can say that for a fair few of the services mentioned by GP.

      Google killed a lot of things to consolidate them into more "integrated" (from their perspective) product offerings. Picasa -> Photos, Hangouts -> Meet, Music -> YT Premium.

      No idea what NFC Wallet was, other than the Wallet app on my phone that still exists and works?

      The only one I'm not sure about is Chromecast - a while back my ones had an "update" to start using their newer AI Assistant system for managing it. Still works.

      • jychang 2 days ago

        They stopped making the Chromecast hardware

  • electroglyph 2 days ago

    Google G Suite offered a free option after initially saying it was ending. Just logged into my Workspace account: https://ibb.co/99jBLJnD

    still have many domains on there, all with gmail

    • zimmund 16 hours ago

      They've been crippling the free tier for a long time. And a few years ago they were about to restrict it completely (they backed out at the last moment), to the point where many of us migrated to other platforms and never looked back.

  • daxfohl 2 days ago

    Google Search: Not officially dead yet, but....

    • bdangubic 2 days ago

      yup, losing 0.000087% year-over-year so in 865 billion years it’ll be dead :)

      • bxparks 2 days ago

        That was probably me, when I stopped using Google Search some years ago. :-) Got tired of the ads, the blog spam, and AI-generated content crap floating to the top of their results page.

      • kirubakaran 2 days ago

        How did you go bankrupt?

        Two ways. Gradually, then suddenly.

        - Ernest Hemingway, The Sun Also Rises

        • bdangubic 2 days ago

          I guess I’ve heard it all now… Google going bankrupt would not have made Top-1 Million list of likely things to read on Sunday morning…

      • easyThrowaway a day ago

        I wouldn't be surprised if they're going to kill it with their own hands by implementing some half-assed AI feature that breaks the core functionality of the product.

  • dwayne_dibley 2 days ago

    I still use PICASA it works fine. However, when google severed the gdrive-photo linking it meant my photos didn’t automatically download from google to my PC. This is what killed google for me.

  • tomComb 2 days ago

    I’m still using:

    - free g suite
    - play music
    - finance
    - nfc wallet is just google wallet isn’t it?
    - chromecast, video and audio-only

    I guess play music is now YouTube music, and doesn't have uploads, so that can be considered dead, but the others seem alive to me.

  • eloisant 2 days ago

    > Google Chromecast: Wait, they killed Chromecast? I did not know that until I started writing this..

    They have something called Google TV Streamer now, so for me it's more of a rebrand than really killing a product.

    • Hobadee 2 days ago

      Except Google TV isn't the same. You can cast to it, but it's more akin to a Roku - it comes with a remote and has "channels" you install.

      Oh, and a metric crapton of ads it shows you.

  • stavros 2 days ago

    Immich is a great replacement for Google Photos, if maybe not Picasa.

  • bigthymer 2 days ago

    I'm still upset that Google Maps no longer tracks my location. It was very useful to be able to go back and see how often and where I had gone.

    Is there another app where I can store this locally?

    • socalgal2 2 days ago

      Google Maps still tracks my location.

      The difference is they no longer store the data on their servers, it's stored on your phone (iPhone/Android)

      https://support.google.com/maps/answer/6258979

      That way, they can't respond to requests for that data by governments as they don't have it.

      I can look on my phone and see all the places I've been today/yesterday, etc

    • bapak 2 days ago

      Arc and its free Arc Mini companion. iOS. Been using it since Facebook eclipsed Moves app. A decade later, it's still not as good as Moves.

    • forever_frey 2 days ago

      Check out Dawarich. It has an official iOS app, and you can use a number of 3rd-party mobile apps to track your data and then upload it to a server: either one run on your own hardware (FOSS self-hosted) or the Dawarich Cloud one: https://dawarich.app

      Using it on a daily basis.

    • sameline 2 days ago

      Apple Maps added a Visited Places (beta) feature recently.

    • bxparks 2 days ago

      Strava? :-) Half-joking, half-serious, I haven't used Strava in years, I don't remember all its capabilities.

      Edit: Missed the "locally" part. Sorry no suggestions. Maybe Garmin has something?

      • iamacyborg 2 days ago

        Nope, Garmin only tracks your location when you record an activity that uses gps, which is good, frankly.

  • socalgal2 2 days ago

    I used Picasa and loved it, until I realized I want all my photos available from all my devices at all times and so gave in to Google Photos (for access, not backup)

    • bxparks 2 days ago

      I use SyncThing for that purpose. It syncs across my phone, my laptops, and my Synologies. But I don't sync all my photos.

      I don't like the thought of providing Google thousands of personal photos for their AI training. Which will eventually leak to gov't agencies, fraudsters, and criminals.

  • chrismorgan 2 days ago

    > Google Hangouts:

    Which particular thing called Hangouts? There were at least two, frankly I’d say more like four.

    Google and Microsoft are both terrible about reusing names for different things in confusing ways.

    > Can't keep track of all the Google chat apps.

    And Hangouts was part of that problem. Remember Google Talk/Chat? That was where things began, and in my family we never wanted Hangouts, Talk/Chat was better.

    Allo, Chat, Duo, Hangouts, Meet, Messenger, Talk, Voice… I’ve probably forgotten at least two more names, knowing Google. Most of these products have substantial overlap with most of the rest.

  • hshdhdhj4444 2 days ago

    I think Chromecast has been replaced by Google TV which is a souped up Chromecast.

  • nine_k 2 days ago

    Picasa definitely went against the grain of Google, which is all about tying you to online services.

    Hangouts had trouble scaling to many participants. Google Meet is fine, and better than e.g. MS Teams.

    Legacy suite, free forever? Did they also promise a pony?..

    Play Music: music is a legal minefield. Don't trust anybody commercial who suggests you upload music you did not write yourself.

    Finance: IDK, I still get notifications about the stocks I'm interested in.

    NFC Wallet: alive and kicking, I use it literally every day to pay for subway.

    Can't say anything about Chromecast. I have a handful of ancient Chromecasts that work. I don't want any updates for them.

  • hshdhdhj4444 2 days ago

    Google Desktop Search (and also the Search Appliance if you were an SMB).

  • rgblambda 2 days ago

    Add Google Podcasts to the list. I switched to AntennaPod. Youtube Music has too noisy an interface.

  • andsoitis 2 days ago

    Why did you keep on using so many Google products if those products get cancelled?

    Why didn’t you quit Google after, say, the third product you used got canned?

    • dlcarrier 2 days ago

      I used Google Talk, then Hangouts, but once they switched to Meet, I gave up on them. By then my family was all using Hangouts, and we never settled on a new service, because one of my siblings didn't want to support any chat services that don't freely give user information to the government, and the rest of us didn't want to use a chat platform that does freely give user information to the government.

  • dheera 2 days ago

    Isn't it "Google TV Streamer" now?

    • bxparks 2 days ago

      From what I can tell (since I am just finding out about this today), they stopped manufacturing the old Chromecast hardware, and at some point, will stop supporting the old devices. The old devices may stop working in the future, for example, because they sunset the servers. Like their thermostats. Who knows?

      • dheera 2 days ago

        I wish there was some law that requires open-sourcing firmware and flashing tools if a company decides to EOL a product ...

  • kulahan a day ago

    I am just learning via this comment they’re killing chromecast. My disappointment is immeasurable. I have 3, and use them daily. This might be the push to get me to install network wide Adblock.

    I should’ve realized when that recent update broke them for like a week, then they brought them all back online, but suddenly much buggier.

  • serial_dev 2 days ago

    Am I the only one salty about Google Podcasts? For me that was the straw that broke the camel’s back… I dropped Android, switched to iOS, and slowly phasing out the Google products in my life.

zaptheimpaler 2 days ago

Adobe Flash / Shockwave. After all these decades, I've yet to see a tool that makes it as easy to make games or multimedia as Flash did. One of many reminders recently (many others in politics) that humanity doesn't just inevitably or linearly move forward in any domain, or even 2 steps forward 1 step back. Some things are just lost to time - maybe rediscovered in a century, maybe never.

  • mikkupikku 2 days ago

    Enabling novice normies to make games was excellent, and I believe the whole game industry benefited from the resulting injection of fresh ideas. A lot of indie developers with fresh takes on what games could be got started this way. Zachtronics is one example of many that comes to mind right now.

    On the other hand, for every flash game made there were about ten thousand flash-based ads, and nearly as many websites that used flash poorly for things like basic navigation (remember flash based website dropdown menus?). And for a few years it seemed like every single restaurant with a website was using flash for the entire thing; the results were borderline unusable in the best cases. And let's not forget that as long as flash was dominant, it was choking out the demand to get proper video support into browsers. Flash based video players performed like dog shit and made life on Linux a real chore.

    • Doctor_Fegg 2 days ago

      Which made it much easier to block ads than it is now.

    • snicky 2 days ago

      This post reminded me about the good time I had watching Salad Fingers and Happy Tree Friends. "Na, na, nanana na".

  • eloisant 2 days ago

    I wish Flash would have died sooner.

    It was a plague on the web, you couldn't zoom, select text, go back, just a black box ignoring everything about your web browser.

    Killing it was probably the best thing Jobs ever did.

    • acheron 2 days ago

      This. Flash was awful. I see people defending it and I feel like I’m taking crazy pills.

      • 9x39 2 days ago

        It was both awful when it showed up in the enterprise and amazing at unleashing creativity for many. Most young non-technical people I knew during its rise had regularly made Flash creations or even games, and deeply enjoyed the Cambrian explosion of games and animations for a few years.

      • big_toast 2 days ago

        I dunno, a whole subtree of the internet died and I’m not sure it really came back. It was a beautiful Galápagos Islands.

      • morshu9001 2 days ago

        It was really meant for animation and games but got misused as a web GUI tool. I think it would've been fine to allow it anyway, and anyone who wants to build a GUI can just not use Flash.

      • sarchertech 2 days ago

        For the most part, people are talking about games and animation, not text based websites.

      • acidburnNSA 2 days ago

        Did you ever try one of those Flash-based room escape games? It was really amazing to lose yourself in the challenges and puzzles.

    • Minor49er 2 days ago

      Flash players had zoom built in. And I believe there were textareas that allowed people to copy and paste text if they wanted, though it wasn't very common

      Flash was the last thing that got people excited for the Web generally

    • pjmlp a day ago

      Thankfully now we have WebAssembly/WebGL/WebGPU for that.

    • ethbr1 2 days ago

      Flash was the original web Excel (also Lotus 1-2-3) -- a simultaneous design + data + programming tool.

      These are terrible for maintainability, but excellent for usability.

      On the whole, I'd say it was easily a loss for the greater web that web programming left the citizen-programmer behind. (By requiring them all to turn into hamfisted front-end javascript programmers...)

      Many of the centralized evils of the current web might have been avoided if there had remained an onramp for the neophyte to really create for the web.

      I.e. Facebook et al. might have instead been replaced by a hosted, better-indexed Macromedia create + edit + host platform

      Or the amount of shit code produced by inexperienced front-end devs throwing spaghetti at IE might have been reduced

  • neya 2 days ago

    Even if Adobe had gotten their act together and fixed all the security holes, Apple would have still killed it. It was always a threat as a popular design tool. And decades later, with the HTML canvas hype faded, there's still no replacement for what Adobe Flash could do - any designer could create stellar, interactive design that can be embedded into any website... without a monthly subscription.

    • prox 2 days ago

      True, I do think Godot is on the right path, I haven’t had time to look into it in detail, but their HTML5 export seems solid from the videos I saw.

      • herpdyderp 2 days ago

        Eh... it's very hit or miss. It keeps getting better though!

  • sen 2 days ago

    Godot is pretty awesome. Easy to learn, can do 2D or 3D, and can export to HTML5/webasm that works across all major OSes and browsers including mobile.

    It’s far from perfect but I’ve been enjoying playing with it even for things that aren’t games and it has come a long way just in the last year or two. I feel like it’s close to (or is currently) having its Blender moment.

  • watwut 2 days ago

    Yes. I never used flash personally, but I loved those little games people created with them. There was the whole scene of non developers creating little games of all kinds and it just ceased to exist.

    • Imustaskforhelp 2 days ago

      There is still a way to run flash apps via Ruffle (https://ruffle.rs/). You can probably still make flash games and run them with it.

      • PhilipRoman 2 days ago

        Ruffle is amazing. I launched a 20+ year old game yesterday with zero compatibility issues. Even better than the original Flash because of superior security isolation mechanisms.

    • rkomorn 2 days ago

      So much college years time spent (wasted?) on Addicting Games.

      • kjkjadksj 2 days ago

        Time you enjoyed wasting is not wasted.

        • rkomorn 20 hours ago

          I agree.

          My transcripts, on the other hand, have a different opinion.

    • portaouflop 2 days ago

      Kids now create games in Roblox. More constrained, more commercial, more exploitative- but there is still a huge scene of non-developers creating games if you care to look.

  • Y-bar 2 days ago

    Those tools were awesome. But as formats go, they were awful due to bad performance and more security holes than anything else.

    I still miss Macromedia Fireworks.

    • KaiserPro 2 days ago

      > more security holes than anything else.

      yeah it wasn't secure

      but;

      > bad performance

      I don't think that's the case. For the longest while Flash was faster than JS at doing anything vaguely graphics based. The issue for Apple was that the CPU in the iPhone wasn't fast enough to do Flash and anything else. Moreover, Adobe didn't get on with Jobs when they were talking about custom versions.

      You have to remember that "apps" were never meant to be a thing on the iPhone; it was all about "desktop"-like web performance.

      • Y-bar 2 days ago

        I remember well. I earned my living for a few years around 2010 porting slow Flash sites to regular web tech. It was hard to translate some functionality, but Flash was definitely slow compared to the equivalent regular website done without the plugin.

    • OtherShrezzing 2 days ago

      Macromedia Fireworks was an outstanding piece of software.

      The 20 most common things you’d do with the tool were there for you in obvious toolbars. It had a lot of advanced features for image editing. It had a scripting language, so you could do bulk editing operations. It supported just about every file extension you could think of.

      Most useful feature of all was that it’d load instantly. You’d click the icon on the desktop, and there’d be the Fireworks UI before you could finish blinking. Compared to 2025 Adobe apps, where you click the desktop icon and make a coffee while it starts, it’s phenomenal performance.

    • GuB-42 2 days ago

      Performance was way better than what we have now with modern web stacks, we just have more powerful computers.

      I agree on security and bugs, but bugs can be fixed. It just shows neglect by Adobe, which was, I think, the real problem. I think that if Adobe seriously wanted to, it could have been a web standard.

      • Y-bar 2 days ago

        Lots of people say performance was good, but that seems to be through the nostalgic lens of a handful of cool games.

        Those did sometimes run really great, but most implementations were indeed very slow.

        I remember vividly because it was part of my job back then to help with web performance and when we measured page speed and user interface responsiveness flash was almost always the worst.

    • Sankozi 2 days ago

      Flash performance is still better than current web stack's. Probably will always be - you could write non trivial games that would work on 128MB memory machine. Currently single browser tab with simple page can take more than that.

    • hulitu 2 days ago

      > more security holes than anything else.

      Adobe was never known for its security or quality.

  • bombcar 2 days ago

    Flash was the HyperCard of the 90s/early 2000s.

    There hasn’t been a replacement, yet.

  • al_borland 2 days ago

    The big issue with Flash was how overused it was.

    When Flash was on its way out one app made at the place I worked still said they needed it, and I couldn't figure out why... it was a Java app. After some digging, I found it, some horizontal dividers on the page. They could have, and should have, just been images. They didn't do anything. Yet someone made them in Flash.

    I'd also say all the drop-down menu systems were an overuse. Splash screens on every car company's home page. It was out of hand.

    I guess you could call it a victim of its own success, where once it was time for it to die (due to mobile), very few people were sad to see it go.

  • achisler 2 days ago

    Try Roblox! Unless you haven't yet. I was SO impressed. Everything works as expected. 5 minutes after starting the game making kit I totally understood why Roblox is worth billions. It just works. It's magic. All can be scripted, but also any 6y.o. can use it.

  • donatj 2 days ago

    Personal pet peeve, but as someone who still makes gifs, Image Ready. Adobe kind of absorbed Image Ready into Photoshop and it's just never lived up to how easy it was to make simple gifs in Image Ready

  • josefrichter 21 hours ago

    It was actually fantastic even for creating websites. To think that 20 years later we still don't have tools to make similar stuff with similar ease is mindblowing.

  • morshu9001 2 days ago

    I was even fine with Flash being misused for web GUIs, just to pressure the open web to get its act together. At least devs got to pick 2 between [fancy, fast, easy]. If you want something better, make it instead of hobbling the competition.

  • Wistar 2 days ago

    Well, I miss Director which I used a lot for demos/prototyping.

  • make3 2 days ago

    it's called Roblox and it's bigger than Flash ever was

w10-1 2 days ago

Optane persistent memory had a fascinating value proposition: stop converting data structures for database storage and just persist the data directly. No more booting or application launch or data load: just pick up where you left off. Died because it was too expensive, but probably long after it should have.

VMs persist memory snapshots (as do Apple's containers, for macOS at least), so there's still room for something like that workflow.
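The programming model survives today in ordinary mmap: map a file, mutate a data structure in place, and the state is simply there on the next run. A minimal sketch of the idea (a plain file here, and the file name is made up; real pmem would map a DAX device and flush CPU caches, e.g. via PMDK, but the shape of the code is the same):

```python
import mmap
import os
import struct

PATH = "counter.dat"  # hypothetical file standing in for a pmem/DAX mapping
SIZE = mmap.PAGESIZE

# Create the backing file once, sized to one page of zeros.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.truncate(SIZE)

def bump_counter():
    """Map the file and update a counter in place: no serialize/load step.

    The u64 at offset 0 is the whole 'database'. We read it, increment it,
    and the mapping makes the new value durable."""
    with open(PATH, "r+b") as f:
        with mmap.mmap(f.fileno(), SIZE) as m:
            (value,) = struct.unpack_from("<Q", m, 0)
            struct.pack_into("<Q", m, 0, value + 1)
            m.flush()  # analogous to flushing CPU caches to persistent media
            return value + 1

# Rerun the script and it keeps counting from where it left off.
print(bump_counter())
```

The difference with Optane was that the "flush" really was just a cache flush, at memory latency, so this style became plausible for whole applications rather than a single page.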

  • lexszero_ 33 minutes ago

    I haven't heard about Optane before, but the concept of persistent memory reminds me of PhantomOS[0], which is based around the idea that from the app perspective everything is already in memory and the kernel/runtime (JVM-ish, so object-aware) takes care of {,de}serialization to a non-volatile storage by virtue of a highly sophisticated virtual memory manager so the app programmer doesn't have to think about it. I remember seeing it being presented at some conference around 2012 and the live demo running Tetris game slowed down to a crawl and crashed after a few blocks due to bugs in GC.

    [0] https://en.wikipedia.org/wiki/Phantom_OS

  • Gud 2 days ago

    +1 for 3D XPoint.

    The technology took decades to mature, but the business people didn’t have the patience to let the world catch up to this revolutionary technology.

    • dlcarrier 2 days ago

      The world had already caught up. By the time it was released, flash memory was already nearing its speed and latency, to the point that the difference wasn't worth the cost.

      • Havoc 2 days ago

        >flash memory was already nearing it's speed and latency

        Kinda, but for small writes it's still nowhere near.

        Samsung 990 Pro - IOPS 4KQD1 113 MBytes/Sec

        P4800X optane - IOPS 4KQD1 206 MBytes/Sec

        And that's a device 5 years newer and on a faster pcie generation.

        It disappeared because the market that values the above attribute is too small, and it's hard to market because at first glance they look about the same on a lot of metrics, as you say.

  • LargoLasskhyfv 2 days ago

    Not only because of price. The 'ecosystem' infrastructure wasn't there, or at least not spread wide enough. The 'mindshare'/thinking of ways how to do, neither. This is more aligned with (live) 'image-based' working environments like early Lisp and Smalltalk systems. Look at where they are now...

    A few more thoughts about that, since I happen to have some of the last systems who actually had systems level support for that in their firmware, and early low-capacity optanes designed for that sort of use. It's fascinating to play with these, but they are low capacity, and bound to obsolete operating systems.

    Given enough RAM, you can emulate that with working suspend and resume to and from RAM.

    Another avenue are the ever faster and larger SSDs, in practice, with some models it makes almost no difference anymore, since random access times are so fast, and transfer speeds insane. Maybe total and/or daily TBW remains a concern.

    Both of these can be combined.

  • tester756 2 days ago

    Optane was impressive from tech standpoint.

    We were about to get rid of the split between RAM and disk memory and use a single stick for both!

  • Findecanor 2 days ago

    Systems are stuck in old ways in how they model storage, so they weren't ready for something that is neither really RAM nor disk. Optane did inspire quite a few research projects for a while though. A few applications emerged in the server space, in particular.

  • veqq 2 days ago

    I have an optane drive with the kernel on it, instant boot!

    • stanac 2 days ago

      How does that work? It loads kernel from drive to ram?

      Isn't windows fast boot something like that (only slower, depending on ssd)? It semi-hibernates, stores kernel part of memory on disk for faster startup.

      • goku12 2 days ago

        This one would have behaved more like suspend to RAM. In suspend to RAM, the RAM is kept powered, while everything else is shut down. The recovery would be near instant, since all the execution contexts are preserved on the RAM.

        Optane was nearly as fast as RAM, but also persistent like a storage device. So you do a suspend to RAM, without the requirement to keep it powered like a RAM.

_bent 2 days ago

Lytro light field cameras. The tech was impressive and the company was able to put two products on to the shelves, though unfortunately they hadn't quite reached the image quality needed for professional photographers.

But now with the new Meta Ray-Bans featuring a light field display and with new media like Gaussian splats, we're on the verge of being able to make full use of all the data those cameras were able to capture, beyond the demos of "what if you could fix your focus after shooting" of back then.

Beyond high tech, there's a big market for novelty kinda-bad cameras like Polaroids or Instax. The first Lytro has the perfect form factor for that and was already bulky enough that slapping a printer on it wouldn't have hurt.

  • mikewarot a day ago

    The problem with the Lytro was that the sensor/lens pair was just too darned small. If they had somehow scaled it up so that the sensor was about 4" in diameter, even if it meant using an interposing frosted glass plate or something to allow a smaller image sensor, the depth of field effects could have been fantastic. It would have allowed genuine and beautiful bokeh across almost any arbitrary focal plane, even pan/tilt, in software.

  • bobmcnamara 2 days ago

    > unfortunately they hadn't quite reached the image quality needed for professional photographers.

    I always wondered about that - since it works by interleaving pixels at different focal depths, there's always going to be a resolution tradeoff that a single-plane focus camera wouldn't have.

    It's such a cool idea though, and no more difficult to manufacture than a sensor + micro lens array.

    • summa_tech 2 days ago

      In fact, the Lytro Illum (the big one) had a really nice, very flexible, bright super-zoom lens. If you ever wondered how that was achieved: having the microlens array and a light field sensor (1) allows relaxing so many aberration constraints on the lens that you could have a light, compact super-zoom.

      (1) it's not really different focal depths, it's actually more like multiple independent apertures at different spatial locations, each with a lower resolution sensor behind it - stereovision on steroids (stereoids?)
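      The "multiple independent apertures" view is what makes software refocusing almost trivial: shift each sub-aperture view in proportion to its aperture offset, then average (classic shift-and-add). A toy sketch of that, with a synthetic light field of one bright dot (all names hypothetical; real pipelines add calibration and sub-pixel interpolation):

```python
import numpy as np

def refocus(subviews, shift_per_view):
    """Synthetic-aperture refocus: shift each sub-aperture view by its
    (u, v) offset from the center times `shift_per_view`, then average.
    `subviews` maps integer (u, v) offsets to 2D grayscale arrays."""
    acc = None
    for (u, v), img in subviews.items():
        shifted = np.roll(img, (shift_per_view * u, shift_per_view * v),
                          axis=(0, 1))
        acc = shifted if acc is None else acc + shifted
    return acc / len(subviews)

# Toy light field: a point whose apparent position moves one pixel per
# viewpoint step, as a point at a fixed depth would.
views = {}
for u in (-1, 0, 1):
    for v in (-1, 0, 1):
        img = np.zeros((16, 16))
        img[8 + u, 8 + v] = 1.0  # disparity of 1 pixel per view step
        views[(u, v)] = img

sharp = refocus(views, shift_per_view=-1)   # shift cancels the disparity
blurry = refocus(views, shift_per_view=0)   # no correction: dot smeared 3x3
print(sharp[8, 8], blurry[8, 8])  # 1.0 vs 1/9: in focus vs out of focus
```

      Sweeping `shift_per_view` is exactly the "pick your focal plane afterwards" demo, and the averaging over views is where the resolution tradeoff mentioned above comes from.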

  • s3p 2 days ago

    Don't phones do this now? I remember Lytro cameras, they were really exciting.

    • phire 2 days ago

      Phone cameras fake it.

      They don't capture a light field like Lytro did, they capture a regular image with a very deep depth of field, extract a depth map (usually with machine learning, but some phones augment it with stereoscopy or even LIDAR on high end iPhones) and then selectively blur based on depth.
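      That depth-then-blur recipe can be caricatured in a few lines. A toy sketch of the idea (pure NumPy, all names hypothetical; real portrait modes use learned depth maps, edge matting, and far better blur kernels):

```python
import numpy as np

def box_blur(img, k=3):
    """Crude k x k box blur built from rolled copies (illustration only)."""
    acc = np.zeros_like(img, dtype=float)
    r = k // 2
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            acc += np.roll(img, (dy, dx), axis=(0, 1))
    return acc / k**2

def fake_bokeh(img, depth, focus_depth, tolerance=0.1):
    """Keep pixels near the chosen focal plane sharp; replace the rest
    with a blurred copy, driven entirely by the depth map."""
    blurred = box_blur(img)
    in_focus = np.abs(depth - focus_depth) <= tolerance
    return np.where(in_focus, img, blurred)

# Toy scene: stripes everywhere; left half near (depth 0), right half far.
img = np.zeros((8, 8)); img[:, ::2] = 1.0
depth = np.zeros((8, 8)); depth[:, 4:] = 1.0
out = fake_bokeh(img, depth, focus_depth=0.0)
print(out[0, 0], out[0, 6])  # left stays crisp; right averaged toward gray
```

      Since the blur is synthesized from a single all-in-focus exposure, it can only ever approximate what a Lytro recorded; occlusion edges and fine structures like hair are where the fake typically shows.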