I use pCloud. They're a little expensive for the storage you get, but they're more sensitive to privacy concerns, less likely to be politically disturbed, and the help is real people. Although I don't currently use them for private data, I have in the past, while travelling. I don't use them for bulk media backup because my internet is lousy, so the limited space isn't currently an issue. I had thought that with this PC I might change to Microsoft Family, but the 6 TB is divided into 6 × 1 TB accounts, which is spectacularly inconvenient. Obviously it's also rather less private/unpolitical.
You're fundamentally right about the NAS serving.
It's something of a bummer, really, but the cloud supplier doesn't have an app to install on Synology's NAS. And they're too small a cloud for Synology to offer a specific tested component or app. Gid's Mutual Obscurity Law of Compatibility, I guess.
As I understand it (last reviewed a year ago), the general supported-by-all interface is WebDAV, and both sides claim to support it. But there's a known issue with Synology's app and WebDAV such that it basically only works in trivial cases. IIRC, one enters cloud login details into the app to set it up and schedule jobs or synced directories. Clearly there's some form of credential exchange at the start, but the cloud server (and this isn't specific to my supplier) limits how long {speculation: some credential session or token} is valid. When it expires, the Synology app treats it as a fatal error and has to be manually restarted with the login entered again. Useless.
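For what it's worth, the fix on the client side looks conceptually simple: when the server invalidates the session, re-authenticate and retry rather than abort. A minimal sketch in Python, with the client entirely stubbed (none of these names come from Synology's or any cloud supplier's actual API; it's just the retry shape):

```python
class SessionExpired(Exception):
    """Raised when the server rejects a stale session token (e.g. HTTP 401)."""


def sync_with_reauth(client, job, max_reauths=5):
    """Run a sync job, transparently re-authenticating when the session
    token expires, instead of treating expiry as a fatal error.

    `client` is a hypothetical object with login() and run_sync(job)."""
    for attempt in range(max_reauths + 1):
        try:
            return client.run_sync(job)
        except SessionExpired:
            if attempt == max_reauths:
                raise  # credentials genuinely broken: give up
            client.login()  # refresh the token, then retry the job
```

The point is just that token expiry is an expected event in the protocol, so it belongs in the retry path, not the fatal-error path.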
For me to do the job on the NAS, while logical, requires the cloud supplier to provide a package (in Python on Linux, maybe?) that exposes their cloud to the Linux file system. They weren't remotely forthcoming when I asked them for solutions. Otherwise I'd have to do a better job over WebDAV myself, which I suspect would be a fair-sized task (providing quite high-level function on top of a fairly low-level stack), given that a respectable-ish company like Synology can't get it right.
Whereas since the cloud supplier's PC client exposes my cloud account to the Windows file system, Task Scheduler -> batch file -> Robocopy just robustly blats it: one line of script plus my error checking and logging. So it's a very time-effective kludge.
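The "error checking" part mostly comes down to reading Robocopy's exit code correctly, since it's a bitmask rather than the usual zero-is-success convention: 1 = files copied, 2 = extra files found, 4 = mismatches, 8 = some copies failed, 16 = fatal error, so anything below 8 counts as success. A sketch in Python rather than batch (the paths, flags, and wrapper are illustrative, not my actual script):

```python
import subprocess


def robocopy_succeeded(returncode):
    """Interpret Robocopy's bitmask exit code: 1=files copied,
    2=extra files, 4=mismatches, 8=copy failures, 16=fatal error.
    Values below 8 mean the mirror completed without failures."""
    return 0 <= returncode < 8


def mirror_to_cloud(src, dst):
    """Mirror src into dst with Robocopy (Windows only). Hypothetical
    wrapper: /MIR mirrors the tree, /R and /W bound the retries."""
    rc = subprocess.run(
        ["robocopy", src, dst, "/MIR", "/R:2", "/W:5"]
    ).returncode
    return robocopy_succeeded(rc)
```

In an actual scheduled batch file the same check is the `IF ERRORLEVEL 8` line after the Robocopy command.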
I suspect my optimal paths are to change cloud supplier or to wait until someone else fixes it. I should re-review now to see whether a year has changed things.
(Not really the right forum here though.)
(London Underground have a lot of heat to dump as well, and again, it either isn't or wasn't reused, even though in London there's great demand for heat for six months of the year.)