Try Using rsync To Complete A Failed scp Job

Here is a little trick for nerds who manage websites on a server that allows tools such as SSH and rsync.  Basically, let’s say you were running an scp command such as …

[scp -r -Cpv example@example.com:/home/example/public_html/* ~/Download/backup/example.com/public_html/]

but your Internet connection dropped and stopped the scp command from finishing the copy of files from the remote server to your local machine.  If you don’t have a lot of files to copy, then you should be able to run the same scp command again until everything gets copied from the remote server to the local machine.  But what if you have a huge amount of files (i.e., tens of Gigabytes) to copy down from the remote server?  A disconnection during the scp process is devastating in this situation, because the scp command would restart the copying of files that were already downloaded to your local machine.  This would be a waste of time.

No sweat.  I’ve got a solution for you.  Use the rsync command to sync the remote files to your local files instead.  Existing files will be skipped, and rsync will only download the files that are still missing from the local machine.  Of course, you can reverse the direction of the copy too, such as from the local machine to the remote server.  Nonetheless, the command right after this paragraph shows you how to stop wasting time and continue copying files from the remote server to the local machine when scp gets interrupted.

[rsync -avzhe ssh example@example.com:/home/example/public_html/* ~/Download/backup/example.com/public_html]

The rsync command above uses the -e parameter to specify ssh, so the transfer runs through SSH for secure file copying.  Basically, the -e parameter specifies the remote shell to be used.  By the way, the other parameters are -a (equivalent to -rlptgoD, meaning it preserves more file attributes than just using -r), -v (verbose output), -z (compress files during transfer for faster transfers), and -h (output numbers in a human-readable format).  By using rsync this way, you can continue the process of copying files from the remote server to the local machine when scp failed to complete the job the first time around.
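
One more tip: by default, rsync will restart a half-finished file from scratch on the next run.  For flaky connections, I find it handy to add --partial, which keeps partially transferred files so they can be resumed, and --progress, which shows per-file transfer status.  A variant of the same hypothetical command with those two flags added:

[rsync -avzh --partial --progress -e ssh example@example.com:/home/example/public_html/* ~/Download/backup/example.com/public_html]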


Be The Master Of Your ownCloud Data: Installing ownCloud And Running A DropBox-Like Service Privately For Free

Dropbox and various online third-party cloud services are great and free up to a point, but a truly all-you-can-eat level of service is definitely not the kind of thing these cloud services can provide.  Right off the bat, one thing these third-party cloud services cannot provide is the level of privacy you get from storing data within your own private network.  Want more cloud space than the so-called free space?  It’s not free, and you have to pay more for however many more Gigabytes you want.

ownCloud is free, open source software which acts like DropBox, but you can download, install, and use it freely.  I think ownCloud gives you the opportunity to be 100% in control of your data’s privacy.  If you know how to implement robust security measures such as a proper firewall and port-forwarding, you can even roam the seven seas and still sync with your local data securely.  Unlike with DropBox and other third-party cloud services, with ownCloud you know you’re the master of your own data in the cloud.  OK, I’m beginning to rant unnecessarily.

Anyhow, want to know how to install ownCloud and use it?  Check out the video right after the break, where I show you how to install ownCloud on Linux Mint.  Of course, you can follow the video’s instructions to do the same on Ubuntu, because Linux Mint is just an Ubuntu-based distribution.  Enjoy!!!
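
For readers who prefer a text outline, the installation roughly takes the shape sketched below.  Treat it as a sketch only: the PHP package names and the download URL here are assumptions that vary by ownCloud release, so follow the video for the exact steps on Linux Mint.

# Install a web server, database, and PHP (package names are assumptions
# appropriate to the Ubuntu 14.04 / Linux Mint 17 era):
sudo apt-get update
sudo apt-get install apache2 mysql-server libapache2-mod-php5 php5-mysql php5-gd php5-curl
# Download and unpack the ownCloud server tarball (URL is an assumption;
# get the current one from owncloud.org):
wget https://download.owncloud.org/community/owncloud-latest.tar.bz2
tar -xjf owncloud-latest.tar.bz2
sudo mv owncloud /var/www/html/
sudo chown -R www-data:www-data /var/www/html/owncloud
# Then finish the setup wizard in a browser at http://localhost/owncloud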

Installing And Securing Linux Mint 17, And Installing Adobe Reader

I was installing Linux Mint 17 in a virtual machine on my PC, and I decided it was a good idea to record the whole process.  Furthermore, I also installed Adobe Reader manually on Linux Mint 17, so by watching this video you will also learn how to do that.  If you’re trying to do what I’ve done in this video, make sure you do not deny shell access and lock the password for the regular user or users that you want to keep using, because doing so will leave you unable to log into the system.  If you follow my video closely, denying shell access means editing the /etc/passwd file, and locking a password means changing the /etc/shadow file by executing the command passwd -l [username].
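
To make those two operations concrete, here is a minimal sketch; "someuser" is a hypothetical account name, and per the warning above, don’t do this to an account you still need to log in with:

# Deny shell access by setting the account's login shell to nologin
# (this rewrites the shell field in /etc/passwd):
sudo usermod -s /usr/sbin/nologin someuser
# Lock the account's password (this prefixes the hash in /etc/shadow):
sudo passwd -l someuser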

Moreover, if you’re trying to edit the /etc/fstab file as I’d done in the video, make sure you make a copy of the original /etc/fstab file first before editing the original /etc/fstab file.  /etc/fstab file is very important, because it tells the system how to load up the devices such as hard drive, and screwing this file up will prevent your system from loading/booting.  Having the original copy of /etc/fstab file will allow you to restore it in the case that you screw up the original /etc/fstab file.

If you pay close attention to the part where I edit the /etc/fstab file, you will notice that I made an error when adding the rw option to the /tmp and /dev/shm entries, but you will also notice that I corrected the error in the video a few seconds later.  Basically, the rw option is correct, but in the video, before I set the option to rw I had typed it as wr.  The system won’t recognize wr as an option, so instead of wr, it has to be rw.

rw is a mount option that allows reading and writing.  On its own, though, rw does not stop anything from executing programs inside /tmp and /dev/shm; that job belongs to the noexec option, which is typically combined with rw (along with nosuid) when hardening these mount points, so that the devices only allow reading and writing but no execution.  Anyhow, you can check out this video right after the break.  Enjoy!!!
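
For reference, hardened entries for these two mount points often look something like the following in /etc/fstab (the exact option list is an assumption; use whatever the video shows for your own setup):

# tmpfs mounts that can be read and written but not executed from:
tmpfs   /tmp       tmpfs   rw,noexec,nosuid,nodev   0   0
tmpfs   /dev/shm   tmpfs   rw,noexec,nosuid,nodev   0   0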

Virtual Machine Is A Very Beautiful Thing

A virtual machine is a very beautiful thing, but the majority of computer users might be ignorant of it.  How beautiful is a virtual machine?  Let me just say this right off the bat: a virtual machine is there to piss off evildoers!  It’s so beautiful that you can basically download computer viruses onto a virtual machine without the fear of these nasty things going around and infecting a physical machine.  Of course, as with just about anything, if one is inept enough with computers, one might still allow the viruses and whatnot to infect the whole Intranet (LAN) even while using a virtual machine.  Nonetheless, one has to be very inept to do so.  For example, allowing a virtual machine to sit on the same subnet as a physical machine without its own protection measures (i.e., antivirus, firewall, and whatnot) shows just another door to the evildoers, who can use a compromised active virtual machine as a gateway for their Intranet (LAN) hacking activities.  The beautiful thing is that if one is smart enough to secure a virtual machine, one basically has a hardened sandbox which can easily be used as a platform for browsing the dangerous web at will, or even for downloading computer viruses and whatnot for testing purposes, such as testing the effectiveness of an antivirus program.  Professional antivirus software reviewers mostly use a hardened virtual machine to test how effective an antivirus program can be.

A virtual machine is so beautiful that it is very perverted.  How?  I’ve heard of many people whose computers got infected with viruses, worms, trojans, and whatnot just because they had been browsing dangerous pornographic websites.  What’s worse is that these folks do not use readily available, simple measures such as Javascript-blocking plugins (e.g., ScriptSafe, NoScript, etc.).  For example, I talked to one person who complained that he had to format his computer often because it caught too many viruses.  This very person likes to say that he’s an advanced computer user.  Nonetheless, he told me he was befuddled by how his Windows machine kept catching a flu (i.e., sarcasm for computer viruses).  Furthermore, he told me that it was too easy for his computer to catch a flu whenever he got perverted; obviously, it meant that he browsed pornographic websites and his computer caught a flu.  In the end, he told me his assumption that there’s no way a PC can be OK if one is browsing a pornographic website.  I told him flat out that he’s dead wrong.  The simplest answer I could give him at that point was to make sure his physical machine is clean (i.e., not infected with any computer virus) and then install a virtual machine.

A virtual machine is beautiful since it allows us to have a secure sandbox to play around in.  Of course, it’s a bit more complicated than just a secure sandbox, because a virtual machine can run just about any major operating system.  Furthermore, a virtual machine can be a quick testing ground for security software and whatnot.  If virtual machine users don’t like what they see, they can simply go through a few clicks to delete a virtual machine and make a new one.  My suggestion for whoever browses the web dangerously is to install a virtual machine on a clean physical machine, install a Linux distribution such as Ubuntu in it, install a firewall and ClamAV on Ubuntu, harden the Ubuntu virtual machine as if it were running on a real machine, and then browse the dangerous web.
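
Inside the Ubuntu guest, that basic hardening boils down to a handful of commands; here is a minimal sketch (ufw and clamav are both in Ubuntu’s standard repositories):

# Bring the guest up to date:
sudo apt-get update && sudo apt-get upgrade
# Install a firewall front end and the ClamAV antivirus:
sudo apt-get install ufw clamav
# Block unsolicited incoming connections and turn the firewall on:
sudo ufw default deny incoming
sudo ufw enable
# Update virus definitions (the clamav-freshclam service may already
# be doing this automatically), then scan the home directory:
sudo freshclam
clamscan -r ~/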

A virtual machine is a strange beast, because it can do certain things exceptionally well and efficiently, yet it can be totally useless at other times.  For example, playing games on a virtual machine is a no-no.  First of all, a virtual machine does not use a dedicated graphics card directly; it emulates one.  Even when a virtual machine environment allows the physical computer to share dedicated graphics resources, I doubt a virtual machine could really share them efficiently, so playing graphically demanding games would be almost impossible.  Nonetheless, if one uses a virtual machine for applications such as virtualizing a NAS (i.e., a Network Attached Storage server), things become very interesting.  Imagine this further: how interesting is it that one can clone a virtualized NAS easily?  Virtual machine platforms such as VirtualBox certainly carry the option of letting a computer user clone a virtual machine with a few clicks of a mouse.
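
VirtualBox exposes that same cloning through its command-line tool as well; the VM names below are hypothetical:

# Clone an existing VM and register the clone with VirtualBox:
VBoxManage clonevm "MyNAS" --name "MyNAS-clone" --register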

In summary, a virtual machine is very beautiful, but how beautiful it is scales according to whoever is using it.  One can simply use a virtual machine to test how effective an antivirus program can be, but one can also use it to run a virtualized NAS.  If one is horny, one can simply browse the dangerous pornographic websites within a virtual machine.  Basically, a virtual machine is quite useful and secure if one knows how to use it as a sandbox.

How Paranoid Should You Be For Backing Up Your Data?

(Image: Backup Backup Backup – And Test Restores.  Photo credit: Wikipedia)

If you ask me what the best way to back up your data is, I will probably direct your concern to more than one way.  I like to think of it as not placing all of your eggs in one basket.  What’s the point of backing up data in the first place?  It’s the hope that when things go crazy, such as when a computer’s data gets corrupted, you can still access your most valuable backup data.  If you rely on only one preferred backup method, then what will you do in a critical moment when even the backup data isn’t accessible through that one method?  Even a perfect storm is a possible scenario, and it argues for spreading your eggs across more than one basket; therefore I think being paranoid about safekeeping your data with more than one backup method is the best way to go about backing up your valuable data.

For us normal folks, the regular Joes, who have data we want to safeguard, it’s a must to spread our data across more than one basket.  You don’t have to be a company to take this approach.  Furthermore, nowadays regular Joes have plenty of ways to back up their data.  Let me list a few of them:

  • Google Drive
  • Pogoplug
  • Dropbox
  • Amazon Simple Storage Service
  • CrashPlan
  • External hard drives
  • Network attached storage solutions such as QNAP NAS servers
  • Do-it-yourself FreeNAS server solutions
  • rsync to a rented server with an affordable monthly fee

And the list could go on a lot longer, as third-party cloud services are now in ample supply.  I think the problem isn’t about finding a backup solution or solutions for the regular Joes, but about the affordability, speed, security, and convenience aspects.  Let’s say a regular Joe wants to spread his backup data across more than one basket; how affordable can this be?  And so on…

I think affordability is not as big an issue as it was before the days of third-party cloud services and competitive (affordable) computer hardware pricing.  If you don’t intend to harbor hundreds of Gigabytes worth of data for streaming purposes or some similarly extreme configuration, backing up a few Gigabytes worth of data should not cost you much at all.  Perhaps you can even do it at no cost.  As one example, Google Drive gives you around 15 Gigabytes of free space, and with this service alone you don’t have to spend a dime to back up your data as long as you don’t go over the free space limit that Google Drive allows.  Don’t like third-party cloud services for whatever reason?  Computer hardware such as external hard drives is no longer outrageously priced, so it’s easier for regular Joes to go this route for their data backups.  You could even couple Linux with a spare, dusty computer to form a local backup storage server at zero monetary cost, although you would have to spend time putting things together, such as installing Linux and deploying Linux’s network attached storage services, to have a more complete backup server solution.

I see the many third-party cloud services as good additional solutions for doing backups.  How come?  Let’s say you’re paranoid about the safety of your data to the point that you consider the scenario where all of your local backup data gets corrupted at the same time, for reasons such as a virus/hack attack (or an even more nefarious scenario); in that case, third-party cloud services are additional safety reservoirs for your backup data.  If you are this paranoid, I think you’re doing it right.  Although third-party cloud services are good measures against local data corruption, there are problems with this whole approach in general.  Let me list a few:

  • Broadband’s upload speed (Internet connection) isn’t fast enough to do a major backup (i.e., backing up huge amount of data in Gigabytes worth)
  • Security issue… how do we know our data can be securely safeguarded and stored on the remote servers?
  • Trust issue… such as how do we know our data privacy and our privacy won’t be breached on the remote servers?

I snuck the speed and security concerns about backing up data remotely through third-party cloud services into the list above, but we should not take the security issue lightly, since many people may not want their privately backed-up data to be made known to the whole world.  Security done right for backing up data locally and remotely will also address the privacy concern.  I think employing good network and computer security measures locally will raise the protection level for the backup data; such measures include employing hardware and software firewalls, antivirus, and so on.  Don’t forget to update your software and firmware, because it’s through updating these things that you can be assured of weeding out security bugs.  You can never be too sure about the security of your data when you’re backing it up remotely, therefore you should encrypt your backup data before you upload it to the remote servers.  One good encryption tool I know of is TrueCrypt, which can be downloaded and used freely.
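
As one illustration of encrypting before uploading (using GnuPG here rather than TrueCrypt, simply because it’s scriptable from the command line; the paths are hypothetical):

# Bundle the data, then encrypt the archive with a passphrase:
tar -czf backup.tar.gz ~/important-data
gpg --symmetric --cipher-algo AES256 backup.tar.gz
# backup.tar.gz.gpg is now safe to upload to a third-party cloud service.
# To restore later:
gpg --decrypt backup.tar.gz.gpg > backup.tar.gz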

I don’t think we should sacrifice our data security for conveniency, because data security is definitely more important than otherwise.  Still, conveniency should be considered in the calculation of our data backup challenge too.  It’s just that we have to make sure we don’t have to sacrifice data security for conveniency.  Let say, you want to backup your data to a third party cloud service, but you don’t like the idea of doing a local encryption for your data first… this means you are sacrificing your data security for conveniency and this is truly bad for you as the owner of the backup data (i.e., privacy concern).

In summary, I think if you’re paranoid enough about the health of your data, then you should devise multiple backup plans for it.  You should try to back up your data both locally and remotely, but you should encrypt your data when you back it up remotely.  Backing up a huge amount of data remotely can be very inconvenient at this point in time, since so many regular Joes do not have access to fast upload broadband speeds.  Let’s hope this will change soon; I know things will move in this direction, since data streaming, data sharing, and data backup are in much more demand than ever before.  One example would be Google Fiber.  Google is driving the Internet Service Provider competition forward as it deploys its Gigabit Internet connection service to households in various lucky cities and towns.  With Google pushing for more competition in the area of broadband speed, I think the future, with great Internet connections for uploading our backups, is definitely bright.  As time moves on and the costs of backup hardware and backup services become even more competitive, we can expect deploying backup measures for our data to only get cheaper and easier.  I like the idea of having a NAS locally and using one or two third-party cloud services for my data backups.

(How paranoid should you be about backing up your data?  In my opinion, the answer should be: the more the merrier.)

How To Connect And Mount iSCSI Onto Ubuntu And Linux Mint

In the video right after the break, I show you how to connect to an iSCSI target and mount an iSCSI LUN on Ubuntu and Linux Mint.  Enjoy!!!
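
For reference, the typical open-iscsi workflow on Ubuntu/Linux Mint looks like the sketch below; the portal IP address and target IQN are hypothetical stand-ins, and the video walks through the real values:

# Install the iSCSI initiator tools:
sudo apt-get install open-iscsi
# Discover targets offered by the storage server:
sudo iscsiadm -m discovery -t sendtargets -p 192.168.1.50
# Log in to a discovered target:
sudo iscsiadm -m node -T iqn.2014-01.com.example:storage.lun1 -p 192.168.1.50 --login
# The LUN shows up as a new block device (e.g., /dev/sdb);
# format it (first use only!) and mount it:
sudo mkfs.ext4 /dev/sdb
sudo mount /dev/sdb /mnt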