Comment by goblin89 5 days ago

8 replies

SoundCloud used to be good prior to the redesign.

Recently I decided to evaluate it for serious use and start posting there again, but their new uploader told me I needed to switch to a paid plan, even though I triple-checked that I was well within the free limits. Under my old, now unused username I had uploaded a lot more (mostly experimental things I am not that proud of anymore).

It looks like their microservices architecture is in chaos and some system overrides the limits outlined in the docs with stricter ones. How can I be sure they will respect the new limits once I do pay, instead of upselling me the next plan in line?

Add to that the general jankiness, the never-ending spam from “get more fake listeners for $$$” accounts (which seem to be in obvious symbiosis with the platform, boosting the numbers for optics), and last year’s ambiguous ToS change allowing them to train ML systems on your work, and it was enough for me to drop it. Thankfully, it was a trial run and I had not published any pending releases.

If you still publish on SoundCloud and you make original music (as opposed to publishing, say, DJ sets, where dealing with IP is problematic), ask yourself whether it is time to grow up and do proper publishing!

storystarling 5 days ago

This sounds like a classic consistency vs latency trade-off. Enforcing strict quotas across distributed services usually requires coordination that kills performance. They likely rely on asynchronous counters that drift, meaning the frontend check passes but the backend reconciliation fails later. It is surprisingly hard to solve this without making the uploader feel sluggish.
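To make the drift concrete: this is a minimal toy sketch (not SoundCloud's actual architecture; all class names and numbers are made up) of how a frontend check against a stale replica of a usage counter can pass, while the backend's reconciliation against the authoritative counter later rejects the same upload.

```python
import time

class QuotaService:
    """Toy eventually consistent quota tracker.

    The 'frontend' reads a replica that only syncs periodically,
    while the 'backend' enforces against the authoritative counter.
    Purely illustrative; names and limits are invented.
    """

    def __init__(self, limit_minutes: float, sync_interval: float = 60.0):
        self.limit = limit_minutes
        self.authoritative_used = 0.0   # source of truth
        self.replica_used = 0.0         # what the frontend sees
        self.last_sync = time.monotonic()
        self.sync_interval = sync_interval

    def _maybe_sync(self) -> None:
        # Replica catches up only once per sync interval, so it drifts.
        now = time.monotonic()
        if now - self.last_sync >= self.sync_interval:
            self.replica_used = self.authoritative_used
            self.last_sync = now

    def frontend_check(self, upload_minutes: float) -> bool:
        self._maybe_sync()
        return self.replica_used + upload_minutes <= self.limit

    def backend_commit(self, upload_minutes: float) -> bool:
        # Strict check against the authoritative counter.
        if self.authoritative_used + upload_minutes > self.limit:
            return False
        self.authoritative_used += upload_minutes
        return True

quota = QuotaService(limit_minutes=180)
quota.backend_commit(170)            # earlier upload; replica not yet synced
assert quota.frontend_check(20)      # stale replica says: plenty of room
assert not quota.backend_commit(20)  # reconciliation rejects: 190 > 180
```

The latency angle is visible in `_maybe_sync`: syncing the replica on every check would restore consistency, but that is exactly the coordination cost a sluggish uploader would pay.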

  • LoganDark 5 days ago

    That would explain why the front-end would allow you to attempt something that goes over your limits, but not why the back-end would reject something that doesn't go over your limits.

    • goblin89 5 days ago

      My bet at the time was that they have a bunch of hidden extra limits based on account age, IP/user agent information, etc. If that is true, their problem is that they advertise the larger limits instead of the smaller limits (to get more users signed up), and that they do not communicate when their extra limits apply and instead straight up upsell you, which are both dark patterns.

      • storystarling 4 days ago

        That sounds plausible. I've had to implement similar reputation-based limits on my own backend just to keep inference costs from exploding, so I sympathize with the fraud prevention angle. Masking that as a generic quota issue to push an upsell is pretty hostile though.

        • goblin89 4 days ago

          The feeling of being gaslit, when I calculated and recalculated the length of my tracks and compared it with limits on their pricing page, was quite unpleasant.

Another possibility is that they reduced their limits from 3 to 2 hours of audio around the same time. I don’t know if that happened before or after my experience; I did not read their blog or press releases, only made sure I was well under whatever limits were listed on their pricing & plans page at the time (I was probably under 2 hours as well, but at this point I can’t be bothered to check). Perhaps that transition was chaotic and for a while their left hand did not know what the right hand was doing.

    • storystarling 4 days ago

      Fair point. I suspect it comes down to ghost reservations or stale caches. If a previous upload failed mid-flight but didn't roll back the quota reservation immediately, the backend thinks you're over the limit until a TTL expires. Or you delete something to free up space, but the decrement hasn't propagated to the replica checking your quota yet.

    • storystarling 4 days ago

      Fair point. I suspect it comes down to how they handle retries. If an upload times out but the counter already incremented, the system sees the space as used until an async cleanup job runs. It is really common to have ghost usage in eventually consistent systems.
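Both failure modes described above (a ghost reservation held until a TTL expires, and a retry double-counting quota) can be sketched in one toy ledger. This is not SoundCloud's real system; all names, limits, and the idempotency-key mechanism are illustrative assumptions about how such a service might work.

```python
import uuid

class ReservationLedger:
    """Toy quota ledger with TTL'd reservations and idempotency keys.

    A reservation holds quota as soon as an upload starts. If the
    client times out and never confirms, the quota stays 'used'
    (a ghost reservation) until the TTL cleanup expires it.
    Retrying with the same idempotency key reuses the existing
    reservation instead of double-counting. Purely illustrative.
    """

    def __init__(self, limit_minutes: float, ttl_seconds: float = 3600.0):
        self.limit = limit_minutes
        self.ttl = ttl_seconds
        self.confirmed = 0.0
        self.pending = {}  # idempotency_key -> (minutes, created_at)

    def _pending_total(self, now: float) -> float:
        # Lazily expire ghost reservations past their TTL.
        self.pending = {k: (m, t) for k, (m, t) in self.pending.items()
                        if now - t < self.ttl}
        return sum(m for m, _ in self.pending.values())

    def reserve(self, key: str, minutes: float, now: float) -> bool:
        if key in self.pending:  # retry: reuse, don't double-count
            return True
        used = self.confirmed + self._pending_total(now)
        if used + minutes > self.limit:
            return False
        self.pending[key] = (minutes, now)
        return True

    def confirm(self, key: str) -> None:
        minutes, _ = self.pending.pop(key)
        self.confirmed += minutes

ledger = ReservationLedger(limit_minutes=180, ttl_seconds=3600)
key = str(uuid.uuid4())
assert ledger.reserve(key, 170, now=0.0)          # upload starts, then times out
assert not ledger.reserve("other", 20, now=10.0)  # ghost reservation blocks it
assert ledger.reserve(key, 170, now=10.0)         # same-key retry is accepted
assert ledger.reserve("other", 20, now=4000.0)    # TTL expired, quota freed
```

Without the idempotency-key lookup, the retry at `now=10.0` would try to reserve another 170 minutes on top of the ghost, exactly the "counter already incremented" symptom the comment describes; without the TTL expiry, the ghost would block uploads forever.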