Comment by panstromek 5 hours ago

> As Facebook would push for more engagement, some bands would flood their pages with multiple posts per day

The causation is the opposite, and it's the whole problem with chronological feeds, including RSS: chronological feeds incentivise spam-posting, so posters compete on quantity to get attention. That's one of the main reasons Facebook and other sites implemented algorithmic feeds in the first place. If you take away the time component, posters compete on quality instead.

> The story we are sold with algorithmic curation is that it adapts to everyone’s taste and interests, but that’s only true until the interests of the advertisers enter the picture.

Yea, exactly, but as emphasized here: the problem is not curation, the problem is the curator. Feed algorithms are important; they solve real problems. I don't think going back to RSS and chronological feeds is the answer.

I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.

mixcocam 5 hours ago

> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.

RSS is just a protocol. You could build a reader right now with any algorithm you want to display feeds. In fact, I can't imagine that no one is using the AI boom to claim they'll build a decentralized Twitter using RSS plus AI for the algorithm.

  • jasode 4 hours ago

    >RSS is just a protocol. You could make a reader now with any algorithm you want that displays feeds.

    Your proposal to filter on the client side where the RSS reader runs can't do what the gp wants: algorithmic suggestions on the _server_ side.

    The issue with an AI algorithm applied to client-side RSS is that it's limited to the closed set of items in the particular feed(s) the reader happened to download from whatever websites the user pre-defined in the whitelist.

    Example of inherent client-side limitation would be how Youtube works:

    - a particular Youtube channel about power tools : can use RSS to get a feed of that channel. Then use further customized client filtering (local AI LLM) to ignore any videos that talk about politics instead of tools.

    - the Youtube algorithm of suggested and related videos to discover unknown channels or topics : RSS can't subscribe to this, so there's no ability to filter on the client side. Either Youtube itself would have to offer a "Suggested Videos as RSS feed" -- which it doesn't -- or a 3rd-party SaaS website would have to constantly scrape millions of Youtube videos and offer the result as an RSS feed. That's not realistic, as Google would ban that 3rd-party scraper, but let's pretend it was allowed... getting millions of XML records only to filter them client-side and throw away 99% is not ideal. So you're still back to filtering on the server side to make the RSS feed manageable.

    In the "explore-vs-exploit" framework, the "explore" phase is more efficiently accomplished with server-side algorithms. The "exploit" phase is where RSS can be used.

    - "explore" : use https://youtube.com and its server-side algorithms to navigate billions of videos to find new topics and content creators. Then add interesting channel to RSS whitelist.

    - "exploit" : use RSS to get updates of a particular channel

    • jcgl 3 hours ago

      GGP does express interest in Algorithm-as-a-Service (AaaS), but I don't see why AaaS or server-side anything would be required to have non-chronological feed algorithms. Client-side is perfectly suitable for the near-universal case where feed servers don't overwhelm the client with spam (in which case you remove the offending server from your feed).

      To your points about YouTube-style algorithmic discovery, I do agree that that would require the server to do things like you describe. So I think that there could be both client-side and server-side algorithms. In time, who knows? Maybe even some client-server protocol whereby the two could interact.

      • jasode 2 hours ago

        >, but I don't see why AaaS or server-side anything would be required to have non-chronological feed algorithms.

        You assume gp's idea of a "non-chronological" feed means taking the already-small subset downloaded by RSS and running a client-side algorithm on it to re-order it. I'm not debating this point because this scenario is trivial and probably not what the gp is talking about.

        I'm saying gp's idea of "non-chronological" feed (where he emphasized "curation is not the problem") means he wants the huge list of interesting but unknown content filtered down into a smaller manageable list that's curated by some ranking/weights.

        The only technically feasible way to do curation/filtering algorithm on the unexplored vastness out on the internet -- trillions of pages and petabytes of content -- is on servers. That's the reasonable motivation for why gp wants Algorithm-as-a-Service. The issue is that the companies wealthy enough to run expensive datacenters to do that curation ... want to serve ads.

        • jcgl 2 hours ago

          Maybe you're right about what they meant. I'll not debate that.

          I will say that, for my purposes, I would definitely like an RSS reader that has more dynamic feed presentation. Maybe something that could watch and learn my preferences, taking into account engagement, time of day, and any number of other factors.

          What's more, with primarily text-oriented articles, the total number of articles can be extremely high before overwhelming the server or the client. And a sufficiently smart client needn't be shy about discarding articles that the user is unlikely to want to read.
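A client like that could be sketched as follows (everything here is hypothetical: a toy model that nudges per-word weights based on whether the user opened an article, then ranks unread items by the learned score):

```python
from collections import defaultdict

class PreferenceModel:
    """Toy preference learner: per-word weights updated from engagement."""

    def __init__(self):
        self.weights = defaultdict(float)

    def record(self, title, opened):
        # Reward words from opened articles, mildly penalize skipped ones.
        delta = 1.0 if opened else -0.5
        for word in title.lower().split():
            self.weights[word] += delta

    def score(self, title):
        return sum(self.weights[w] for w in title.lower().split())

    def rank(self, titles):
        # Highest predicted interest first; a real client could also
        # discard anything scoring below some threshold.
        return sorted(titles, key=self.score, reverse=True)

model = PreferenceModel()
model.record("rust async runtime internals", opened=True)
model.record("celebrity gossip roundup", opened=False)

print(model.rank(["more celebrity gossip", "async rust deep dive"]))
# ['async rust deep dive', 'more celebrity gossip']
```

Time of day and other signals would just be extra features feeding the score; the structure stays the same.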

akho an hour ago

I've patched miniflux with a different sorting algorithm that is less preferential to frequent posters. It did change my experience for the better (though my particular patch is likely not to everyone's taste).
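One way such a sort could work (this is not akho's actual miniflux patch, just a sketch of the idea): divide each entry's recency score by a logarithmic penalty on how many items its feed posted, so prolific feeds can't crowd out quiet ones.

```python
import math
from datetime import datetime, timedelta

def rank(entries, now):
    # Count how many items each feed contributed to this batch.
    counts = {}
    for e in entries:
        counts[e["feed"]] = counts.get(e["feed"], 0) + 1

    def score(e):
        age_hours = (now - e["published"]).total_seconds() / 3600
        recency = 1.0 / (1.0 + age_hours)
        # Logarithmic penalty: a feed posting 3x as much gains far
        # less than 3x the visibility.
        penalty = 1.0 + math.log(counts[e["feed"]])
        return recency / penalty

    return sorted(entries, key=score, reverse=True)

now = datetime(2024, 1, 1, 12, 0)
entries = [
    {"feed": "quiet-blog", "published": now - timedelta(hours=3)},
    {"feed": "firehose", "published": now - timedelta(hours=1)},
    {"feed": "firehose", "published": now - timedelta(hours=2)},
    {"feed": "firehose", "published": now - timedelta(minutes=30)},
]
for e in rank(entries, now):
    print(e["feed"], e["published"])
```

With these numbers, quiet-blog's 3-hour-old post outranks firehose's 1-hour-old one, which is exactly the behavior a pure chronological sort can't give you.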

It is a bit strange that RSS readers do not compete on that, and are, generally, not flexible in that respect.

Social media targets engagement, which is not a good target. Even a pure chronological sort is better.

joules77 2 hours ago

It's all subjective. There is no clear quantification of X attention consumed = Y value produced. So saying what the algo does is important is like saying astrology is important. Or that HN is important ;) At the end of the day, most info produced is just entertainment/placebo. Three-inch chimp brains have upper limits on how much they can consume and how many updates to their existing neural net are possible. Since there is nothing signaling these limits to people, people (both producers and consumers of info) live in their own la-la land about what their own limits are or when those limits have been crossed; mostly everyone is hallucinating about the value of info.

The UN report on the Attention Economy says 0.05% of info generated is actually consumed. And that was based on a study 10-15 years ago.

nickdothutton 4 hours ago

> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.

I thought about this back in 2017 (within the context of LinkedIn signal to noise) [1]. I had hoped for a marketplace/app store for algos. For example:

"What if the filter could classify a given post as 30% advertorial 70% editorial, and what if you could set a threshold for seeing posts of up to 25% advertorial but no more?"

and

"What if the filter could identify that you’d already received 25 permutations of essentially the same thing this month, and handle it accordingly."

[1] https://blog.eutopian.io/building-a-better-linkedin/
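That threshold filter could look something like this sketch (the classifier here is a toy keyword ratio, a stand-in for whatever real model a marketplace filter would plug in):

```python
# Hypothetical promotional vocabulary for the toy classifier.
PROMO_WORDS = {"sponsored", "offer", "discount", "webinar", "buy"}

def classify_advertorial(text):
    """Toy classifier: fraction of words that look promotional."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!:") in PROMO_WORDS for w in words) / len(words)

def filter_posts(posts, max_advertorial=0.25):
    # Keep only posts at or under the user's advertorial threshold.
    return [p for p in posts if classify_advertorial(p) <= max_advertorial]

posts = [
    "Deep dive into our compiler's register allocator",
    "Sponsored offer: buy our webinar at a discount now!",
]
print(filter_posts(posts))
# ["Deep dive into our compiler's register allocator"]
```

The point of the marketplace idea is that `classify_advertorial` would be swappable: different vendors compete on the classifier while the user keeps control of the threshold.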

wodenokoto 4 hours ago

I think these are good points, and also a reason why I never understood why people wanted Digg and Reddit to supply them with RSS feeds back in the heyday of RSS.

latexr 3 hours ago

> If you take away the time component, posters compete on quality instead.

That is verifiably false simply by looking at the state of social media. What they compete on is engagement bait, and the biggest of them all is rage.

By your logic, social media would be full of quality posts by now, but instead it's going to shit with fast-paced lies. Quick dopamine hits prevail, not “quality”.

> I'm thinking of something like "algorithm as a service," which would be aligned with your interests and tuned for your personal goals.

So, another service dedicated to tracking you and mining your data. I can already smell the enshittification.

andrepd an hour ago

That's nonsense. If the problem truly were spam, then the "algorithm" would be a simple and transparent penalty proportional to posting frequency. The goal is not that (it's """engagement""") and the algorithm is not that either (it's a turbo-charged Skinner box attacking your mind with the might of ten thousand data centres).

thaumasiotes 5 hours ago

> The causation is opposite, and it's the whole problem with chronological feeds, including RSS - chronological feeds incentivises spam-posting, posters compete on quantity to get attention.

That doesn't make any sense. Quantity might make you more prominent in a unified facebook feed, but an RSS reader will show it like this:

    Sam and Fuzzy (5)
    Station V3 (128)
They've always displayed that way. You never see one feed mixed into another feed. This problem can't arise in RSS. There is no such incentive. Quantity is a negative thing; when I see that I've missed 128 posts, I'm just going to say "mark all as read" and forget about them. (In fact, I have 174 unread posts in Volokh Conspiracy A† right now. I will not be reading all of those.)

† Volokh Conspiracy is hosted on Reason. Reason provides an official feed at http://reason.com/volokh/atom.xml . But Volokh Conspiracy also provides an independent feed at http://feeds.feedburner.com/volokh/mainfeed . Some of their posts go into one of those feeds, and the rest go into the other. I can't imagine that they do this on purpose, but it is what they do.

  • pbmonster 5 hours ago

    > They've always displayed that way. You never see one feed mixed into another feed. This problem can't arise in RSS.

    All readers I know have the option to display all feeds chronologically, or an entire folder of feeds chronologically. In most, that's the default setting when you open the app/page.

    I always use it like that. If I want to see all new posts from a single author, I might as well just bookmark their blog.

    • thaumasiotes 5 hours ago

      > All readers I know have the option to display all feeds chronologically, or an entire folder of feeds chronologically. In most, that's the default setting when you open the app/page.

      The option might exist. It was certainly not the default in mainstream readers in the past and it still isn't now. I never encountered it in Google Reader (as mainstream as it gets), or in Yoleo (highly niche), or in Thunderbird (also as mainstream as it gets).

      Whether a bunch of unused projects make something strange the default doesn't really have an impact on the user experience. This is not something you can expect to encounter when using RSS.

      > If I'd want to see all new posts from a single author, I might as well just bookmark their blog.

      That approach will fail for two obvious reasons:

      1. The bookmark is not sensitive to new posts. When there is no new post, you have to check it anyway. When there are several new posts, you're likely to overlook some of them.

      2. Checking one bookmark is easy; checking 72 bookmarks is not.

      • akho 2 hours ago

        > I never encountered it in Google Reader

        It was the default view in Google Reader, the "All Items" view.

        A mix of all feeds, ordered chronologically, is the default view in tt-rss, miniflux, inoreader, feedly, netnewswire, and all RSS readers I've ever seen.

        It's also what "syndication" means.

        • Kye an hour ago

          Inoreader as well. "Magic" sorting costs extra.