dragonscave.space is one of the many independent Mastodon servers you can use to participate in the fediverse.
A fun, happy little Mastodon/Glitch instance.


Hey #linux #debian people: it’s occurring to me that #rclone might not actually be the best way to do what I’m doing. So: if you had two Debian servers on a vpn, both with 1 gig fiber links to the internet, in cities 100 km apart, how would you go about having shared filesystems between them? Right now I am using rclone mount with sftp. Is there a less janky way?
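For reference, the current setup being described is roughly the following; the remote name `server2` and the paths are illustrative, and this assumes an sftp-type remote has already been created with `rclone config`:

```shell
# Mount an existing sftp-type rclone remote named "server2".
# --vfs-cache-mode writes buffers writes locally before upload;
# --daemon runs the mount in the background.
rclone mount server2:/srv/files /mnt/remote \
  --vfs-cache-mode writes \
  --daemon
```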

@fastfinge Do they need to be shared, or are you looking for eventual consistency? Because my first answer is that I wouldn't; I'd locally cache the data for some period of time, be it seconds, minutes, hours, or days, and then batch-sync it on a schedule. Removes an entire layer of complexity from proceedings.

@jscholes They need to be shared. The machine that needs to act on the files doesn't have enough space to store the files.

@fastfinge I'd probably still be looking at a rotating sync. Grab some files, work on them, upload the results, delete and repeat.
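A sketch of that rotating cycle using rclone itself; the remote name `server2`, the paths, and the `process.sh` step are all placeholders:

```shell
# Pull a batch of files, process them locally, push the results
# back, and clean up before the next pass.
rclone copy server2:/incoming /work/incoming
./process.sh /work/incoming /work/results   # placeholder for the actual job
rclone move /work/results server2:/done     # move deletes local files after upload
rm -rf /work/incoming
```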

@jscholes Right, but we don't know what files we're going to need until we need them. And we need to know what files are available at all times. So now we're into keeping an updated list in sync between two machines, and doing caching and batch syncs, and this is all starting to sound like a filesystem.

@fastfinge Fair enough. Not knowing what you're trying to achieve, I can't offer much more.

@jscholes NFS is pretty much what I want. In short, I'm managing my dozens of Linux ISOs and hours of public domain video, but the box that actually downloads the ISOs is on a different network from the one that requests, serves, and indexes them.
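For what it's worth, a minimal NFS-over-VPN setup on Debian might look like the following. The VPN addresses (10.0.0.1/10.0.0.2) and the export path are illustrative; this assumes `nfs-kernel-server` is installed on the storage box and `nfs-common` on the client:

```shell
# On the storage server: export the media directory only to the
# peer's VPN address (10.0.0.2 here is illustrative).
# /etc/exports:
#   /srv/media  10.0.0.2(rw,sync,no_subtree_check)
sudo exportfs -ra   # re-read /etc/exports

# On the other box: mount the export over the VPN link
# (10.0.0.1 is the storage server's VPN address).
sudo mount -t nfs 10.0.0.1:/srv/media /mnt/media
```

Restricting the export to the peer's VPN address keeps the share off the public interfaces, which matters since NFS itself does no encryption.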