Comment by coreylane 13 hours ago

RClone has been so useful over the years I built a fully managed service on top of it specifically for moving data between cloud storage providers: https://dataraven.io/

My goal is to smooth out some of the operational rough edges I've seen companies deal with when using the tool:

  - Team workspaces with role-based access control
  - Event notifications & webhooks – Alerts on transfer failure or resource changes via Slack, Teams, Discord, etc.
  - Centralized log storage
  - Vault integrations – Connect 1Password, Doppler, or Infisical for zero-knowledge credential handling (no more plain text files with credentials)
  - 10 Gbps connected infrastructure (Pro tier) – High-throughput Linux systems for large transfers

noname120 13 hours ago

I hope that you sponsor the rclone project given that it’s the core of your business! I couldn’t find any indication online that you do give back to the project. I hope I’m wrong.

  • coreylane 13 hours ago

    I'm certainly planning on sponsoring the project as soon as possible, but so far I have zero paying customers; hopefully that will change soon.

    • znnajdla 11 hours ago

      first thing that popped into my mind is that your free plan is crazy generous. cut it out.

      • PunchyHamster 9 hours ago

        first thing that popped into mine is that $30/mo for running a VM with a command is something people will now just tell an LLM to do

        • tonymet 6 hours ago

          first thing that popped into my mind is that OP did a lot of hard work and doesn't need cynical and useless comments about it.

  • stronglikedan 12 hours ago

    that's just creepy and hella presumptuous

    • asacrowflies 11 hours ago

      Yeah I've seen this pop up in foss a lot lately and I don't like it.

  • sneak 12 hours ago

    Gifts do not confer obligation. If you give me a screwdriver and I use it to run my electrical installation service business, I don’t owe you a payment.

    This idea that one must “give back” after receiving a gift freely given is simply silly.

    • burnte 11 hours ago

      Yes but thank-yous are always good. Making sure the project sticks around is just smart.

    • MattGrommes 11 hours ago

      If your neighbor kept baking and giving you cookies, to the point where you were wrapping and reselling them at the market, don't you think you should do something for them in return?

      • einsteinx2 3 hours ago

        Not if they gave me a legal document explicitly stating I didn’t need to give them anything…and I could get an infinite amount of the cookies with no extra work or money on their part…

        And I would probably suggest to them that if they were interested in profiting from their cookies, they should stop giving them away for free and sell them instead. They might then tell me they don't want to spend the effort and money to commercialize their cookies, or that they prefer it as a hobby with no obligations to customers. Or maybe they have a philosophical belief that they should give their cookies away for free, for anyone to do as they please with, including commercializing them, as long as they aren't legally responsible for anything done with the cookies, which is why they handed me that legal contract explicitly stating so when they gave them to me in the first place.

plasticsoprano 13 hours ago

How do you deal with how poorly rclone handles rate limits? It doesn't honor Dropbox's Retry-After header and just adds an exponential backoff that, in my migrations, has resulted in pauses of days.

I've adjusted threads and the various other controls rclone offers, but I still feel like I'm not seeing its true potential, because the second it hits a rate limit I can all but guarantee the job will have to be restarted with new settings.

  • darthShadow 11 hours ago

    > doesn't honor dropbox's retry-after header

    That hasn't been true for more than 8 years now.

    Source: https://github.com/rclone/rclone/blob/9abf9d38c0b80094302281...

    And the PR adding it: https://github.com/rclone/rclone/pull/2622

    • plasticsoprano 3 hours ago

      Interesting. I’ve been through 4 large transfers to Dropbox in the last 3 years and never once has it honored that header.
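The behavior being debated above (honor a server's Retry-After header when present, fall back to exponential backoff otherwise) can be sketched generically. This is an illustration only, not rclone's actual Go implementation; `send` is a hypothetical stand-in for any HTTP call returning a status code and headers:

```python
import time


def request_with_backoff(send, max_retries=5, base_delay=1.0):
    """Retry a rate-limited request.

    Honors a Retry-After header (seconds) when the server provides one;
    otherwise falls back to exponential backoff (base_delay * 2**attempt).
    `send` is any callable returning (status_code, headers_dict).
    """
    status, headers = None, {}
    for attempt in range(max_retries):
        status, headers = send()
        if status != 429:  # not rate-limited: done
            return status
        retry_after = headers.get("Retry-After")
        if retry_after is not None:
            delay = float(retry_after)       # server-specified wait
        else:
            delay = base_delay * 2 ** attempt  # exponential fallback
        time.sleep(delay)
    return status  # exhausted retries; return last status
```

The key design point is the order of the two branches: a server-provided Retry-After should always win over the client's own backoff schedule, since blind exponential backoff can balloon into the multi-day pauses described above.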

  • coreylane 13 hours ago

    I honestly haven't used it with Dropbox before. Have you tried adjusting the --tpslimit 12 --tpslimit-burst 0 flags? Are you creating a dedicated API key for the transfer? Rate limits may vary between Plus/Advanced plans. forum.rclone.org is quite active; you may want to post more details there.

    • plasticsoprano 3 hours ago

      I have made a dedicated API application and I have adjusted the tps flags. I've scanned the forums a few times but have yet to inquire there.
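For readers following along, the flags discussed in this sub-thread combine like this. Remote names and paths are placeholders, and the values are starting points to tune against your Dropbox plan's limits, not verified settings:

```shell
# Throttle rclone to ~12 API transactions per second with no burst,
# to stay under Dropbox's rate limits during a large transfer.
rclone copy mydropbox:source-folder /local/dest \
  --tpslimit 12 \
  --tpslimit-burst 0 \
  --transfers 4 \
  --progress
```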

tonymet 6 hours ago

I had been thinking about this service for a long time, especially something supporting transforms and indexing for backups. Great job spinning it up.

  • coreylane 5 hours ago

    Thanks! 1. Are you thinking of something like the AWS Data Firehose transform feature, where pandas or something can run inline? https://docs.aws.amazon.com/firehose/latest/dev/data-transfo...

    2. Do you have an example of what indexed backups would look like? I'm thinking of macOS Time Machine, where each backup only contains deltas from the last backup. Or am I completely off?

    • tonymet 2 hours ago

      For transforms, the concept would be user-friendly processing, like downcoding video & photos, compressing PDFs & text files, and filtering out temporary or wasteful files. Something like AirTable for backups, with a GUI workflow editor offering common processing jobs.

      For indexing, full text indexing of backups to allow for record retrieval based on keyword or date. E.g. “images in Los Angeles before 2010” or “tax records from 2015”. If possible, low resolution thumbnails of the backups to make retrieval easier.

      I think #1 (transforms) would be more generally useful for cross-cloud applications, and #2 is more catered toward backups.