I archived my reddit history to try some machine learning out, and I didn't read the docs (always read the docs) and ended up nuking it. So I will re-write this because I love this community. This guide will be brief, but I am here to help as always.

DISCLAIMER: this guide is offered in the hope that it is helpful, but comes with no warranty/guarantee/etc.

So, backups: the 3-2-1 rule is a good practice. 3 copies of your data, 2 local and 1 offsite.

With backup programs I look for the following features, and only in extreme situations will I accept less:

- De-duplication (Cut down on file size for repeating files).
- Compression (Make files smaller by removing statistical redundancy).
- Encryption (Keep files secure from prying eyes).
- Pruning & Archiving (Keep X backups for X time).
- Multiple platforms (Linux, Windows, etc.).
- Open Source (I like to see that code of yours).
- Remote backup (Remember, we need an off-site backup for the 3-2-1 rule).
- Error detection (We want to keep our data reliable).

I've seen people here recommend the following programs, and here is my feedback as of the time of my research.

Duplicati looked nice, but in the first day of testing it also choked after the first few hundred GB or so of data. Due to how Duplicati stores files, it is EXTREMELY slow at recovering a single file. Worst software experience I've had so far for backups.

Arq worked and is even multi-threaded, but I hardly noticed any de-duplication (I've had similar results with just compression), which I could live with; however, there were no options to manage how many versions to keep or for how long, the Windows UI was clunky, and there was no Linux version.

Cloudberry: no de-duplication natively, and a 5 TB maximum storage.

Duplicacy: there is one backup tool called Duplicacy that I found, and it is pretty freaking awesome. The only thing I didn't like was how it stored little files in each directory during backup. I did find the fix for this, but after much thought I didn't want to have to remember how to do all of those steps, and further remember the commands for a restore. This software is amazing, and they have a free version too. The web UI and some OS versions of the Duplicacy program seem to require a payment (not sure if one-time or not). Some community complaints about Duplicacy circle around its license agreement, but the creator basically made a mistake, more or less, and it is clearly outlined on their website. Side note: I hate how close the names Duplicacy, Duplicity, and Duplicati all are to each other.

So if Borg does all of these things, why do we need Rclone? Well, Rclone first and foremost has some tricks to get versioning, but it is not a backup tool per se. Rclone is more of a "move this data to that location" tool. Borg can do remote backups, but only via SSH. I use Google Drive to back my data up, and that is where Rclone comes in: Borg runs the backup, and Rclone moves a copy of the backup to my Google cloud.

Steps

1. Grab the user scripts plugin from the app store on unRAID.
2. Grab the nerdpack plugin from the app store on unRAID. Once you have it, use it to install Borg, setuptools, llfuse and libffi.
3. Grab rclone beta by waseh on the unRAID app store.
4. We need to create our repo for Borg. The repo is where Borg will store its backup files. To do this, type the following:

borg init --encryption=repokey '/path/to/repo'

You will then be prompted for a password. This is your secret repo encryption password (make it strong!). It is also probably not the best idea to store your array backup on your actual array. I use a hard drive mounted with the unassigned devices plugin, so my command would be as follows:

borg init --encryption=repokey '/mnt/disks/my_backup_location/borg_repo/'

If the borg init command went through successfully, you installed Borg properly.
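The Borg + Rclone flow described above can be sketched as a single user script. This is a dry run that only prints each command rather than executing it, so the sequence can be reviewed first; the source path, rclone remote name, and retention flags are assumptions, substitute your own.

```shell
#!/bin/bash
# Sketch of the Borg + Rclone backup flow. Dry run: prints commands only.
# REPO matches the example above; SOURCE and REMOTE are hypothetical.

REPO='/mnt/disks/my_backup_location/borg_repo'  # on an unassigned-devices disk, NOT the array
SOURCE='/mnt/user/appdata'                      # hypothetical data to back up
REMOTE='gdrive:borg_backups'                    # hypothetical rclone remote

# Print the command instead of running it, so this sketch is safe to review.
run() { echo "+ $*"; }

run borg init --encryption=repokey "$REPO"               # one-time setup; prompts for the passphrase
run borg create --stats "$REPO::backup-{now}" "$SOURCE"  # local backup (copy 2 of 3-2-1)
run borg prune --keep-daily=7 --keep-weekly=4 "$REPO"    # "keep X backups for X time"
run rclone sync "$REPO" "$REMOTE"                        # off-site copy (the 1 in 3-2-1)
```

To actually run the commands, change `run()` to execute `"$@"` instead of echoing, and export `BORG_PASSPHRASE` (or let borg prompt) before scheduling it with the user scripts plugin.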