How Paranoid Should You Be For Backing Up Your Data?

Backup Backup Backup – And Test Restores (Photo credit: Wikipedia)

If you ask me for the best way to back up your data, I will probably point you to more than one way.  Think of it as a not-placing-all-of-your-eggs-in-one-basket kind of scenario.  What's the point of backing up data in the first place?  It's so that when things go crazy, such as data corruption on your computer, you can still reach your most valuable data.  If you rely on only one backup method, what will you do when, at the critical moment, even the backup isn't accessible through that one method?  Even a perfect storm is a plausible reason for spreading your eggs across more than one basket, so I think being paranoid and safekeeping your data with more than one backup method is the best way to go about backing up your valuable data.

For us normal folks, the regular Joes who have data we want to safeguard, it's a must to spread our data across more than one basket.  You don't have to be a company to take this approach.  Furthermore, regular Joes nowadays have plenty of ways to back up their data.  Let me list a few of them:

  • Google Drive
  • Pogoplug
  • Dropbox
  • Amazon Simple Storage Service
  • CrashPlan
  • External hard drives
  • Network attached storage solutions such as QNAP NAS servers
  • Do-it-yourself FreeNAS server solutions
  • rsync to a rented server with an affordable monthly fee

And the list can go on a lot longer, as third party cloud services are now in ample supply.  I think the problem for the regular Joe isn't finding a backup solution or solutions; it's the affordability, speed, security, and convenience aspects.  Say a regular Joe wants to spread his backup data across more than one basket; how affordable can that be?  So on and so on…

I think affordability is not as big of an issue as it was before the era of third party cloud services and competitive (affordable) computer hardware pricing.  Unless you intend to harbor hundreds of gigabytes worth of data for streaming purposes or some other extreme configuration, backing up a few gigabytes should not cost you much at all.  Perhaps you can even do it at no cost.  For example, I think Google Drive gives you around 10 gigabytes of free space, or a little more than that, and with this service alone you don't have to spend a dime to back up your data as long as you stay under the free space limit that Google Drive allows.  Don't like third party cloud services, for whatever reason?  Computer hardware such as external hard drives is no longer outrageously priced, so it's easier than ever for regular Joes to go this route for their data backups.  Or how about coupling Linux with a spare, dusty computer to form a local backup storage server at zero cost in terms of money?  You will have to spend time putting things together, such as installing Linux and deploying Linux's network attached storage services, to have a more complete backup server solution.

I see the many third party cloud services as good solutions for doing backups.  How come?  Say you're paranoid about the safety of your data to the point where you consider the scenario in which all of your local backups get corrupted at the same time, for whatever reason, such as a virus or hack attack (or an even more nefarious scenario); third party cloud services then become additional safety reservoirs for your backup data.  If you are this paranoid, I think you're doing it right.  Although third party cloud services are a good measure against local data corruption, there are problems with the whole approach in general.  Let me list a few:

  • Broadband upload speeds (Internet connections) aren't fast enough for a major backup (i.e., backing up gigabytes worth of data)
  • Security issue… how do we know our data will be securely safeguarded and stored on the remote servers?
  • Trust issue… how do we know our privacy and the privacy of our data won't be breached on the remote servers?

I sneakily snuck in the speed and security concerns about backing up data remotely through third party cloud services, but we should not take the security issue lightly, since many people do not want their private backup data made known to the whole world.  Security done right, for both local and remote backups, also addresses the privacy concern.  I think employing good network and computer security measures locally will raise the level of protection for your backup data.  Such measures include hardware and software firewalls, antivirus software, and so on.  Don't forget to update your software and firmware, because updating is how security bugs get weeded out.  You can never be too sure about the security of your data when backing up remotely, so you should encrypt your backup data before you upload it to the remote servers.  One good encryption tool I know of is TrueCrypt, which can be downloaded and used freely.
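
The post names TrueCrypt, but just to illustrate the encrypt-before-upload idea, here is a minimal sketch using GnuPG's symmetric mode instead (a swapped-in tool, and the file names here are made up for the example):

    # Bundle the folder you want to back up into one archive.
    tar czf photos-backup.tar.gz ~/Pictures

    # Encrypt it with a passphrase before it ever leaves your machine;
    # gpg prompts for the passphrase and writes photos-backup.tar.gz.gpg.
    gpg --symmetric --cipher-algo AES256 photos-backup.tar.gz

    # Upload only the .gpg file to the cloud service.  To restore later:
    # gpg --output photos-backup.tar.gz --decrypt photos-backup.tar.gz.gpg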

I don't think we should sacrifice our data security for convenience, because data security is definitely the more important of the two.  Still, convenience should be part of the calculation of our data backup challenge too.  It's just that we have to make sure we never trade data security away for it.  Say you want to back up your data to a third party cloud service, but you don't like the idea of encrypting your data locally first… that means you are sacrificing your data security for convenience, and that is truly bad for you as the owner of the backup data (i.e., a privacy concern).

In summary, I think that if you're paranoid enough about the health of your data, you should devise multiple backup plans for it.  You should try to back up your data both locally and remotely, and you should employ encryption whenever you back up remotely.  Backing up a huge amount of data remotely can be very inconvenient at this point in time, since so many regular Joes do not have access to fast upload broadband speeds.  Let's hope this changes soon, and I think it will, since data streaming, data sharing, and data backup are in more demand than ever before.  One example is Google Fiber.  Google is driving the Internet Service Provider competition forward as it deploys its gigabit Internet connection service to households in various lucky cities and towns.  With Google pushing for more competition in broadband speed, I think the future of having a great Internet connection for uploading our backups is definitely bright.  As time moves on and the costs of backup hardware and backup services grow even more competitive, we can expect deploying backup measures for our data to only get cheaper and easier.  I like the idea of having a NAS locally and using one or two third party cloud services for my data backups.

(How paranoid should you be about backing up your data?  In my opinion, the answer should be: the more the merrier.)

Upload Any File To iCloud, But You Have To Manually Rename The Uploaded File Correctly!

iCloud (Photo credit: BasBoerman)

I barely use iCloud, because I prefer Dropbox, Pogoplug (i.e., the software-only version, which turns a computer into a Pogoplug device), Ubuntu One, CrashPlan, and FreeNAS (i.e., I prefer to virtualize FreeNAS until I can set up a proper physical FreeNAS box).  This is why I know so little about iCloud.  In fact, the only time I use iCloud is when I hit the iCloud button to back up my iPhone and iPad to my free iCloud account (i.e., as of this writing, iCloud gives you 5 GB of free storage space).  According to the video right after the break, iCloud doesn't let users upload arbitrary files from their Mac computers, so you can't really use iCloud the way you've been using Dropbox.  I'm perplexed as to why this is the case, but anyhow, the video right after the break will show you how to upload any file to iCloud.  It seems like a lot of work to me.  (Just stick with Dropbox instead?)

Using FreeNAS’s CIFS Service To Allow Local Computers (e.g., Mac, Windows, Linux) To Share Data Within A Local Network

As I get to know FreeNAS better, I'm beginning to like it more than ever.  FreeNAS has allowed me to set up a CIFS share (Common Internet File System share) so I no longer have to rely on the Pogoplug software to share data between my local computers.  Why is FreeNAS's CIFS share better than the Pogoplug solution?  Well, I like that my data doesn't have to travel through Pogoplug's servers, which are hosted outside my local network, just for me to share data between my local computers.  Data that travels locally is always faster (i.e., it doesn't make a trip out to the Internet first, which saves time and bandwidth) and more secure, provided the local network itself is secured correctly.  Of course, I'm still going to use Pogoplug when I travel abroad, because Pogoplug is great at letting you connect to local computers without opening any port on your router (meaning you don't have to sacrifice your network security to share files between the local network and the Internet).  Still, you must trust Pogoplug's network security to access your local computers through the Pogoplug software, because ultimately your data travels through Pogoplug's network before it reaches the devices you use outside your local network.

Steps to create a CIFS share in FreeNAS (the instructions below are tailored for FreeNAS 8; a quick command-line check of the share follows right after the list).

  1. The first thing you want to do when setting up CIFS within FreeNAS is to make sure you have created a ZFS Dataset.  What on earth is a ZFS Dataset?  Within FreeNAS, you can create separate ZFS Datasets within a ZFS volume, and each ZFS Dataset acts like a partition within a partition.  You can view each ZFS Dataset as a partition within a ZFS volume, even though a ZFS volume can itself be viewed as a partition.  Anyhow, why on earth would anyone want a partition within a partition?  Simple!  FreeNAS allows the creation of ZFS Datasets for one reason: to enhance data security.  Each ZFS Dataset can be configured with specific permissions that don't necessarily match the global permissions of the ZFS volume.  This means that having access to a specific ZFS volume does not automatically grant access to a ZFS Dataset (i.e., partition) within it; only a user with the correct permissions can access a specific ZFS Dataset.  In my case, I named my ZFS Dataset for the CIFS share windows_share.  (Create a ZFS Dataset by going to Storage > Create ZFS Dataset.)
  2. Now go to Services and click on the wrench icon next to the on/off switch for CIFS.  A CIFS settings window will pop up.  In this CIFS settings window, you might want to:
    • set Authentication Model to Local User (better security this way)
    • set NetBIOS Name to simply freenas
    • leave Workgroup as WORKGROUP
    • set the log level to Minimum (so your FreeNAS server/box won't be overloaded with extremely large log files)
    • check the box labeled Local Master
    • check the box labeled Time Server for Domain
    • leave the Guest account drop-down box as nobody
    • do not check the box labeled Allow guest access (for security purposes)
    • check the box labeled Large RW support
    • check the box labeled Send files with sendfile(2) (makes Samba faster when accessing this CIFS share)
    • check the box labeled EA Support (to enable extended attributes support)
    • check the box labeled Support DOS File Attributes
    • check the box labeled Zeroconf share discovery (to allow Mac OS X clients to discover the CIFS share)
    • click the OK button to save all the CIFS settings
  3. Now, under Services again, flip the CIFS switch from OFF to ON.
  4. Click on Sharing > Windows > Add Windows Share.
    • Inside the Name text box, enter windows_share
    • For the path, either enter the path of the ZFS Dataset we created earlier or browse to it using the Browse button
    • Check the box labeled Browsable to Network Clients
    • Enter the local IP addresses of the computers you want to allow access to the ZFS Dataset (i.e., the CIFS share) into the text box labeled Hosts Allow
    • Enter ALL into the text box labeled Hosts Deny (to deny every computer whose IP address isn't listed inside the Hosts Allow text box)
    • Click the OK button to save everything and exit the Windows Share window
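
As promised, here is a quick way to verify the share from a command line once the service is on.  This is a minimal sketch that assumes the Samba client tools are installed; the IP address and username are placeholders for your own:

    # Ask the FreeNAS box to list the shares it exports over CIFS/SMB;
    # windows_share should appear in the output.  smbclient prompts for
    # the password of the user you pass with -U.
    smbclient -L //192.168.0.101 -U youruser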

Now you should be able to connect to this particular FreeNAS ZFS Dataset.  From the standpoint of a normal user on a Mac, Linux, or Windows machine, the share just looks like another local network folder (or, you could say, a local network destination).  Basically, any local computer with permission to connect to this specific ZFS Dataset will see it as a Windows Share folder, so the data within the ZFS Dataset suddenly becomes available to Windows, Mac, and Linux machines alike.  How come Mac and Linux can see the data within the CIFS share folder (i.e., the ZFS Dataset behind the CIFS share)?  Mac and Linux ship with support for reading from and writing to CIFS/SMB shares.

Using a Windows computer to connect to the FreeNAS Windows Share is easy!  All you have to do is go to Computer > Network.  Once Network locates the FreeNAS Windows Share volume (i.e., the ZFS Dataset of the CIFS share), you can browse to it and use it as if it were just another network folder, allowing local computers to read and write the same data.

You can also use a Mac to connect to the FreeNAS Windows Share!  How?  Open up Finder and go to Go > Connect to Server.  Inside the Connect to Server box, enter cifs://192.168.0.101/ (replace the local IP address with the one your FreeNAS server runs on).  Click the Connect button to connect to the FreeNAS Windows Share.  If it asks for user credentials (i.e., username and password), enter the username and password you granted access to this particular FreeNAS Windows Share (i.e., the ZFS Dataset of the CIFS share).  Once you can browse the FreeNAS Windows Share, you can read and write data to this ZFS Dataset, consequently allowing Windows computers to share data with Macs within the local network.
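
If you prefer the Mac's Terminal over Finder, the same connection can be made with mount_smbfs.  A minimal sketch, assuming the same placeholder IP address, username, and share name as above:

    # Create a mount point and mount the CIFS share onto it; the
    # password is asked for interactively.
    mkdir -p ~/freenas-share
    mount_smbfs //youruser@192.168.0.101/windows_share ~/freenas-share

    # Unmount when you are done.
    umount ~/freenas-share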

I've not used Linux to access the FreeNAS Windows Share myself, so I can't speak from experience just yet; a sketch of the standard approach is below, but if you know a better way, please share your knowledge in a comment.  Thank you!
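
That said, the usual route on most Linux distributions is the CIFS mount helper from the cifs-utils package.  Here is a minimal sketch that I haven't tested against FreeNAS myself, with the IP address, share name, and username as placeholders:

    # Install the CIFS mount helper (Debian/Ubuntu package name shown).
    sudo apt-get install cifs-utils

    # Mount the FreeNAS share; mount prompts for the user's password.
    sudo mkdir -p /mnt/windows_share
    sudo mount -t cifs //192.168.0.101/windows_share /mnt/windows_share -o username=youruser

    # Unmount when finished.
    sudo umount /mnt/windows_share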

Combining Local Virtualization And Remote Cloud Together Can Truly Help Everyday People Prevent Data Loss

Oh, crap!!

Image by portfolium via Flickr

These may not be the best data redundancy solutions of all, but if you follow the approach here, I think your data will be very resilient against loss.  The idea is to have more than one backup of everything; emphasizing data redundancy is the key.  This is well known among businesses, but here I'm pointing it out to everyday people who happen to have personal data they want to protect for a long time to come.  So let us begin.

You need to create a personal file server and use a remote cloud.  A personal file server has become easy to create nowadays; what you need is the right solution.  I used to love Pogoplug, but I noticed that Pogoplug requires your local data to travel through its network when accessed from remote locations, which is not a good idea on a slow Internet connection or for data security.

In our specific case, we want a personal backup file server to boost our data redundancy, and we don't need our file servers to stay up 24/7 the way businesses do.  With this in mind, we can just use a virtual machine as a WebDAV, rsync, or FTP server.  We then clone our main virtual machine.  We're going to store our important backup data on both the main and the clone virtual machines.  We can place the clone virtual machines on different external hard drives so we can access them as easily as the main virtual machine.  Each time we have new backup data, we sync or copy it onto the main and clone virtual machines.  Even if our main virtual machine goes bad, we can rely on the clones to recover our backup data.

For security purposes, our backup data must be encrypted.  You don't really have to encrypt your entire external hard drives, since that process would take too long; instead, I recommend encrypting one big backup partition within the main virtual machine, once.  To encrypt that one big backup partition we can use TrueCrypt.  Encrypting a single backup partition within the main virtual machine once speeds the encryption process up tremendously, and the backup data can still be super secure.  We don't have to create newly encrypted backup partitions for the clone virtual machines, since we clone the main virtual machine anyway.  We only clone the main virtual machine right after we have completely saved our backup data onto its TrueCrypt-encrypted backup partition.
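
TrueCrypt's Linux build includes a text-mode interface, so the encrypted backup container inside the main virtual machine can be managed from a shell.  The flags below are from my memory of TrueCrypt 7.x, so treat them as an assumption and double-check against truecrypt --help:

    # Interactively create a file-backed encrypted volume (TrueCrypt
    # asks for the size, encryption algorithm, and passphrase).
    truecrypt -t -c

    # Mount the container so backups can be written into it.
    sudo mkdir -p /mnt/backup
    truecrypt -t /path/to/backup-container.tc /mnt/backup

    # Dismount once the backup run is done.
    truecrypt -t -d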

To create a main virtual machine, you can use VirtualBox, Parallels, or VMware.  I recommend VirtualBox, since it's free and as capable as the paid products.  Next, you have to decide which operating system to use for your main virtual machine.  I recommend the operating system you know best, so you can set up WebDAV or FTP as fast as possible.  For people who care more about the planning process and want to learn something new at the same time, I recommend Ubuntu as the operating system for the main virtual machine.  Why?  Ubuntu, like any other Linux distribution, lets you rsync backup files easily, so by using Ubuntu you get not just the WebDAV and FTP capabilities but the rsync capability as well.
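
If you'd rather script the VM creation than click through VirtualBox's wizard, VBoxManage can do it from the command line.  A minimal sketch, where the VM name, memory size, and disk size are placeholders of my own choosing:

    # Create and register a new VM for the backup server.
    VBoxManage createvm --name "backup-vm" --ostype Ubuntu_64 --register

    # Give it some memory and a 20 GB virtual disk (sizes in MB).
    VBoxManage modifyvm "backup-vm" --memory 1024
    VBoxManage createhd --filename backup-vm.vdi --size 20480

    # Attach the disk through a SATA controller.
    VBoxManage storagectl "backup-vm" --name "SATA" --add sata
    VBoxManage storageattach "backup-vm" --storagectl "SATA" \
        --port 0 --device 0 --type hdd --medium backup-vm.vdi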

The obvious next step is to set up the file services on our main virtual machine so we can back up our important data onto it.  If you want a lot of choices, you can set up both WebDAV and FTP servers on your main virtual machine.  If you want only one, I recommend WebDAV.  WebDAV is better since it allows you to map network drives to your WebDAV folders.  That way, you can just copy, paste, drag, and drop files and folders from local hard drives onto the network drives.
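
On an Ubuntu guest, one common way to provide WebDAV is Apache's mod_dav.  This is a rough sketch under the assumption of a Debian/Ubuntu-style Apache layout; the /webdav alias and /srv/backup path are placeholders, and you would still want to add authentication before trusting it with real data:

    # Install Apache and enable the WebDAV modules.
    sudo apt-get install apache2
    sudo a2enmod dav dav_fs

    # Point a WebDAV location at the backup directory.
    sudo mkdir -p /srv/backup
    sudo chown www-data:www-data /srv/backup
    sudo tee /etc/apache2/conf.d/webdav.conf <<'EOF'
    Alias /webdav /srv/backup
    <Location /webdav>
        DAV On
    </Location>
    EOF

    sudo service apache2 restart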

Ubuntu comes with rsync ready to go, so you can just use rsync to sync your backup data from your desktop or laptop to the main virtual machine.  Rsync syncs only the new backup data, so it can update your backup partition faster than a full copy would.  You can also have rsync delete old backup data from the backup partition; this way, the backup partition of your main virtual machine stays identical to the backup structure on your desktop or laptop and on the clone virtual machines.
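
A minimal sketch of such an rsync run, where the VM's user, IP address, and paths are placeholders:

    # -a preserves permissions and timestamps, -v is verbose, -z
    # compresses in transit, and --delete removes files on the VM
    # that no longer exist locally.
    rsync -avz --delete ~/backups/ user@192.168.0.50:/mnt/backup/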

The obvious last step in creating the personal file server solution is to clone the main virtual machine.  Parallels, VMware, and VirtualBox each have their own method for cloning a virtual machine.  After cloning the main virtual machine more than once, you can place the clone virtual machines on separate external hard drives.  Each time you back up new data, you have to fire up the main and clone virtual machines to do so.  The good thing is that you don't have to fire up all the virtual machines at once; you can always fire up the main virtual machine first and each clone later.
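
In VirtualBox, for instance, the cloning can be scripted too; a minimal sketch reusing the placeholder VM name from the earlier sketch:

    # Make a full, independent copy of the main backup VM and register
    # it, ready to be moved onto an external hard drive.
    VBoxManage clonevm "backup-vm" --name "backup-vm-clone1" --register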

With a proper local/personal backup file server solution, your backup data is now more resilient against loss than before.  Still, a local backup file server is susceptible to fire, flood, power surges, hardware failures, and other unfortunate catastrophic events.  When such an event happens, your backup data could be lost forever.  This is why we must also back up our data to a remote cloud.

There are several remote cloud solutions you can look into, but most require you to pay a monthly fee for a certain amount of cloud storage space.  You can use free remote cloud solutions such as SkyDrive, Ubuntu One, and Dropbox.  That being said, sometimes it's better to go with a premium cloud solution, since free cloud solutions usually come with limitations; one good example is not having enough cloud storage space.

Besides using remote cloud solutions through third parties, you can create your own remote cloud, such as by renting a web hosting server.  This requires you to be knowledgeable about securing your web hosting server.  After renting a web hosting server, you can turn it into a personal WebDAV, FTP, or rsync backup server.  That way it acts as your remote cloud, but a private one.  With that being said, some web hosting companies will not allow you to use their servers as remote file servers or a remote cloud solution, so read up on their terms of use before implementing this solution, OK?
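
Assuming the rented server allows SSH access, the remote leg can reuse the same rsync habit.  A minimal sketch with a made-up host and path, pushing only the already-encrypted archives:

    # rsync runs over SSH by default when given a user@host target,
    # so the transfer itself is encrypted as well as the archives.
    rsync -avz ~/backups-encrypted/ user@example-host.com:/home/user/backups/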

Of course, don't forget to encrypt your backup data with TrueCrypt when you back up to a remote file server or cloud.  Encryption matters even more when you are actually sending your backup data out to a remote file server or cloud, because you don't have complete control over the security of the remote servers.  We're talking about the whole enchilada here.  Ideally, the physical location of the file or cloud servers has to be secure from unauthorized access; the servers have to be secured with firewalls, antivirus and antimalware software, and so on; there should be physical preventive measures against hardware failures; the list can go on and on.

Another thing to make sure of is that your remote file or cloud servers can churn along 24/7.  It's important for you to be able to reach your backup data at any time, remotely.  You never know what will happen when you need your backup data and cannot reach the file or cloud servers hosting it, right?

In summary, it costs some money to protect data.  Even if you're just protecting some private data, it is still going to cost you something, such as buying external hard drives.  Everyday people like us might not even need the remote file or cloud solution.  Still, people who are paranoid enough about protecting their backup data should deploy a remote file or cloud solution as well.  It's smart to use virtualization for the local backup file server, since virtual machines can be cloned easily and stored on external hard drives for data redundancy purposes.

After Pogoplug Video Debacle, Cloud Engines Wins My Love Again By Releasing Free Software To Turn Everyone’s Hardware Into Nodes Of Their Personal Clouds

Just recently, I was disappointed with Cloud Engines because its Pogoplug Video devices have a tendency to burst into flames, but this time I have to praise Cloud Engines for renewing its commitment to its users: it is forgoing the defective devices altogether and providing a new and awesome solution, a personal cloud that doesn't require you to buy any hardware at all.  According to ReadWriteWeb's "Pogoplug Launches Personal Cloud Service – No USB Drive Required," Cloud Engines provides free software that turns your local computers into servers which act together as a personal cloud.  To make your personal cloud available over the Internet, you pay a one-time $29 fee.

I assume that if this works out, Cloud Engines can be a very effective competitor against Apple, Amazon, Google, and so on.  Nonetheless, I fear the idea might work only for some people, because many others prefer Cloud Engines to provide cloud hardware, such as a Pogoplug device, that runs quietly and energy-efficiently.  I don't think turning a desktop at home into a node of a personal cloud using Cloud Engines's software is that energy efficient.  Plus, I'm not that fond of having a desktop run 24/7 just to satisfy my music addiction.  This is why I think Cloud Engines should also provide new waves of hardware devices that won't burst into flames, devices that are energy efficient and powerful enough to deliver cloud features!

After saying all of that, as a curious being I'm going to have a lot of fun turning a computer or a virtual machine into a node of my personal cloud using Cloud Engines's software.  In the end, it's great to see Cloud Engines push out a new solution and revive its commitment to its users.  I'm loving the idea of paying a one-time $29 fee to share my personal cloud with whomever I like across the Internet!  I think Cloud Engines has just won my love again!  So, I don't recommend you continue to use Cloud Engines' defective Pogoplug Video device (i.e., it may burst into flames and burn down your house), but I do recommend you try out Cloud Engines's free software that promises to turn all of the computers on your local network into nodes of your very own personal cloud.  I think it's way cheaper that way than going with any other cloud solution, for which you'd probably have to pay a sizable monthly fee.

(Why isn't Cloud Engines making software that's compatible with Linux yet?  For now, its software only supports Windows and Mac!)

Pogoplug Videos Burst Into Flames, Cloud Engines Frantically Recalls Them

Apparently, the Pogoplug Video from Cloud Engines can't handle the strenuous features its maker has advertised, and some users have seen their Pogoplug Videos burst into flames.  Cloud Engines warns that users who have models Pogo-P11 through 14 need to return the devices for a refund, and that the affected models should not be used unless the users don't really care about burning down their houses.

I've got a Pogoplug, but it's not the Pogoplug Video!  Mine is one of the older models that doesn't encode videos on the fly, and so it runs cool to the touch!  I'm so satisfied with my older model that I find it hard to hear the news that the Pogoplug Video can burst into flames.  Personally, I hope Cloud Engines corrects its mistakes, fixes the problems, and compensates its users through and through, because I really think the products Cloud Engines is making have great potential.

On a side note, I hope Cloud Engines thinks about supporting USB 3.0, Thunderbolt, or both!  For your information, I recommended Pogoplug in two blog posts I'd written long ago, which everyone can read here and here.  Unfortunately, I can't really recommend Cloud Engines' products at the moment, not until the company fully fixes the problems and shows it can bring out better products without jeopardizing the security of its users, whether that means computer security or plain health hazards.

Source:  http://www.engadget.com/2011/06/17/cloud-engines-recalls-potentially-flammable-pogoplug-video/