Try Using rsync To Complete A Failed scp Job

Here is a little trick/tip for nerds who manage websites on a server that allows tools such as SSH and rsync.  Basically, let's say you were running an scp command such as …

[scp -r -Cpv example@example.com:/home/example/public_html/* ~/Download/backup/example.com/public_html/]

but your Internet connection dropped and stopped the scp command from finishing the copy of files from the remote server to the local machine.  If you don't have a lot of files to copy, then you can simply run the same scp command again until everything gets copied.  But what if you have a huge amount of files (i.e., tens of gigabytes) to pull down from the remote server?  A disconnection of the Internet during the scp process is devastating in this situation, because scp has no resume feature; rerunning the command would recopy the existing files that were already downloaded to your local machine.  This would be a waste of time.

No sweat.  I've got a solution for you.  Try using the rsync command to sync the remote files to local files instead.  This means existing files will be skipped, and rsync will only download new files from the remote server to the local machine.  Of course, you can reverse the direction of the copy too, such as from the local machine to the remote server.  Nonetheless, the command right after this paragraph shows you how to stop wasting time and continue copying files from the remote server to the local machine when scp got interrupted.

[rsync -avzhe ssh example@example.com:/home/example/public_html/* ~/Download/backup/example.com/public_html]

The rsync command above uses the -e parameter to specify the ssh command, so the transfer runs through SSH for secure file copying.  Basically, the -e parameter specifies the remote shell to be used.  The other parameters are -a (equivalent to -rlptgoD, which preserves more file attributes than -r alone), -v (verbose printout), -z (compress files during transfer to save bandwidth), and -h (output numbers in a human readable format).  Since the earlier scp command used -p to preserve timestamps, rsync's quick check (file size and modification time) will recognize the already copied files and skip them.  By using rsync this way, you can now continue copying files from the remote server to the local machine when scp failed to complete the job the first time around.
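To see the skip-existing behavior without needing a remote server, here is a tiny local demonstration (the /tmp/rsync_demo paths are made up for this example; the same flags apply to the remote command above):

```shell
# Set up two local directories as stand-ins for the remote and local servers.
mkdir -p /tmp/rsync_demo/src /tmp/rsync_demo/dst
echo "already copied" > /tmp/rsync_demo/src/old.txt
cp -p /tmp/rsync_demo/src/old.txt /tmp/rsync_demo/dst/   # a file the "failed scp" already transferred
echo "new file" > /tmp/rsync_demo/src/new.txt            # a file scp never got to

# -a preserves attributes, -v lists what actually transfers; old.txt is
# skipped because its size and modification time already match.
rsync -av /tmp/rsync_demo/src/ /tmp/rsync_demo/dst/
```

Adding --partial to the real command also keeps partially transferred files around so rsync can finish them instead of starting those files over from scratch.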

OnePlus One Personal Experience After Using It For 3 Days (Ultra HD)

Just adding a video on YouTube to describe how I personally felt about using the OnePlus One after 3 days.  The video is shaky since I'd rarely ever used a smartphone for recording videos, so I didn't anticipate the footage being so shaky.  The footage I took with the OnePlus One is raw; meaning I didn't use Photoshop or any other video enhancement feature to make the footage look better.  I wanted the footage to be pristine, just as it was captured, showing the real quality of the hardware I used to capture it.  Nonetheless, since the video was way too shaky for normal viewing, I had to use the stabilizer feature which came with my video editing software to somewhat stabilize the shaky footage.  In the future, if I'm ever going to do another video recording session with a smartphone, I'll definitely put more effort into anchoring myself so the footage won't be too shaky.  Anyhow, check out the video right after the break.  Enjoy!!!

Be The Master Of Your ownCloud Data: Installing ownCloud And Running A Dropbox-Like Service Privately For Free

Dropbox and various online third party cloud services are great and free up to a point, but a true all-you-can-eat buffet is definitely not the kind of thing these cloud services can provide.  Right off the bat, one thing for sure these third party cloud services cannot provide is the level of privacy one gets by storing data within one's own private network.  Want more cloud space than the so-called free tier?  It's not free, and you have to pay more for however many more gigabytes you want and so forth.

ownCloud is free, open source software which acts like Dropbox, but you can download, install, and use it freely.  I think ownCloud gives you the opportunity to be 100% in control of your data's privacy.  If you know how to implement robust security measures such as a proper firewall and port forwarding, you can even allow yourself to roam the seven seas and still sync with your local data securely.  Unlike Dropbox and other third party cloud services, with ownCloud you know you're the master of your own data in the cloud.  OK, I'm beginning to rant unnecessarily.

Anyhow, want to know how to install ownCloud and use it?  Check out the video right after the break, where I show you how to install ownCloud on Linux Mint.  Of course, you can follow the video's instructions to do the same on Ubuntu, because Linux Mint is just an Ubuntu based distribution.  Enjoy!!!
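For readers who prefer text to the video, here is a rough sketch of the kind of manual install the video covers.  The package names assume an Ubuntu/Mint release of that era, and owncloud-x.y.z.tar.bz2 is a placeholder for whatever tarball you download from owncloud.org, so adjust both to match your system:

```shell
# Install the web stack ownCloud runs on (Apache, MySQL, PHP).
sudo apt-get update
sudo apt-get install apache2 mysql-server php5 libapache2-mod-php5 php5-mysql php5-gd php5-curl

# Unpack the ownCloud tarball into the web root, then hand it to the
# web server user so ownCloud can write its config and data.
sudo tar -xjf owncloud-x.y.z.tar.bz2 -C /var/www/
sudo chown -R www-data:www-data /var/www/owncloud
sudo service apache2 restart

# Then finish the setup wizard in a browser at http://localhost/owncloud
```

The setup wizard asks for an admin account and the MySQL credentials you chose during the mysql-server install.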

Installing And Securing Linux Mint 17, And Installing Adobe Reader

I was installing Linux Mint 17 in a virtual machine on my PC, and I decided it was a good idea to record the whole process.  Furthermore, I also installed Adobe Reader manually on Linux Mint 17, so by watching this video you will also know how to do that.  If you're trying to do what I've done within this video, make sure you do not deny shell access or lock the password for the regular user or users that you want to keep using, because doing so will leave you unable to log into the system.  As shown in the video, denying shell access means editing the /etc/passwd file, and locking a password means executing the command passwd -l [username], which updates the /etc/shadow file.
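As a quick sketch of the two lockout pitfalls just mentioned (someuser is a placeholder account name, and these commands need root):

```shell
# Lock the password: this prepends "!" to the hash in /etc/shadow, which
# is exactly what you must NOT do to the account you log in with.
sudo passwd -l someuser
sudo passwd -S someuser          # the status line shows "L" when locked

# Deny shell access: the cleaner equivalent of hand-editing /etc/passwd.
sudo usermod -s /usr/sbin/nologin someuser

# Undo both if you locked a secondary account by mistake:
sudo passwd -u someuser
sudo usermod -s /bin/bash someuser
```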

Moreover, if you’re trying to edit the /etc/fstab file as I’d done in the video, make sure you make a copy of the original /etc/fstab file first before editing the original /etc/fstab file.  /etc/fstab file is very important, because it tells the system how to load up the devices such as hard drive, and screwing this file up will prevent your system from loading/booting.  Having the original copy of /etc/fstab file will allow you to restore it in the case that you screw up the original /etc/fstab file.

If you pay close attention to the part where I edit the /etc/fstab file, you will notice that I made an error when adding the rw option to the /tmp and /dev/shm entries, but you will also notice that I corrected the error in the video a few seconds later.  Basically, the rw option is correct, but in the video, before I wrote the option as rw I had it as wr.  The system won't recognize wr as an option.  So instead of wr, it should be rw.

rw is a mount option that allows reading and writing.  By itself rw does not block execution; that job belongs to the noexec option, which hardening guides commonly pair with nosuid and nodev on /tmp and /dev/shm.  With noexec set, these filesystems won't allow anything to be executed on them; they only allow things to be read and written.  Anyhow, you can check out this video right after the break.  Enjoy!!!
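As a sketch of what such hardened entries can look like (your distribution's actual devices and options may differ; note it is noexec, not rw, that blocks execution, while rw simply permits reading and writing):

```
# <file system>  <mount point>  <type>  <options>                         <dump>  <pass>
tmpfs            /tmp           tmpfs   defaults,rw,noexec,nosuid,nodev   0       0
tmpfs            /dev/shm       tmpfs   defaults,rw,noexec,nosuid,nodev   0       0
```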

Adding .htaccess File To QNAP’s /share/Web/ To Secure All Web Applications Within

Legal Disclaimer:  Follow the tip within this blog post at your own risk.  You have been warned, thus you know that you are about to do something dangerous to your web server or QNAP server.  With this knowledge, and by having read this warning or skipped this clear warning, you cannot hold me responsible for your own dangerous actions against your very own QNAP server or web server, or against anyone's web server that you administer.

Are you running a web server on a QNAP NAS?  NAS stands for Network Attached Storage.  If you are, whether this web server is for production or for testing, you might want to know that an .htaccess file can help secure QNAP's web applications such as WordPress, Drupal, and the rest.  Here's how to create a proper .htaccess file that controls all web applications at once on your QNAP server.

  1. You need to change into the /share/Web directory by using this Linux command [cd /share/Web].  Of course, please ignore the square brackets, as these are only there to set off the command line.
  2. Quickly do [ls -la] to figure out whether you already have an .htaccess file.  If you do, please make a backup of this file in case you need the original again for whatever purpose.  To make a backup of the .htaccess file that already exists in QNAP's /share/Web directory, use this command [cp -a /share/Web/.htaccess /share/Web/.htaccess-old] (the -a flag already implies -p, preserving the file's attributes).
  3. Once you have followed step #2, you can remove the original .htaccess file (not the backup one you just made, OK?) by using this command [rm -rf /share/Web/.htaccess].  Be very careful with the [rm -rf] command, because if you misspell the file or directory you're trying to remove, you will lose that directory or file forever and won't be able to recover it.
  4. Now let us create the .htaccess file again, but this time we're creating it the way we like it.  Of course, .htaccess can be a complex file, but regular Joes like us need not make it complex.  Instead, let's just create a simple .htaccess file that denies all IP addresses and allows only specific ones.  This means, if you want to allow one or two specific IP addresses to access QNAP's web applications, this .htaccess file will satisfy your demand.  So here we go…
    1. Create the .htaccess file by using this command [touch /share/Web/.htaccess].
    2. Now, let’s edit the .htaccess file we just created by using this command [vim /share/Web/.htaccess].
    3. Let's enter the lines below into our new .htaccess file, shall we?  These lines should be in the following order…
      1. order deny,allow
      2. allow from 192.168.0.x (please use your very own IP address here)
      3. allow from 192.168.0.x (please use your very own IP address here)
      4. deny from all
    4. What we have done is add 2 IP addresses to the allow list in the .htaccess file, so these 2 IP addresses will be able to access the web applications that reside in QNAP's /share/Web directory.  You can add more IP addresses, or trim the list down to a single one, by simply adding or removing [allow from…] lines.  Keep all [allow from…] lines above the [deny from all] line and below the [order deny,allow] line (strictly speaking, Apache's Order directive controls the evaluation order regardless of line position, but this layout keeps the file easy to read).  Now, we must save our newly edited .htaccess file by doing the following while still in the vim editor.
      1. Hit the Escape key on the keyboard to exit insert mode.
      2. Type [:wq] and hit the Enter key on the keyboard.  Again, please ignore the square brackets, as these are only there to set off the command line.
  5. The last step is to secure our new .htaccess file by doing two things.
    1. The first thing is to make sure the owner and the group owner of the .htaccess file are indeed the right ones.  Personally, I prefer not to use the admin user and administrators group for any web application files and directories, because I don't want evildoers to be able to use one of these high privilege files to escalate privileges and execute malicious commands.  This is why, on my QNAP server, I'd rather have most of my web applications' files and directories owned by the user httpdusr and the group everyone.  So let's make this happen: type [chown httpdusr:everyone /share/Web/.htaccess].  Afterward, just do [ls -la /share/Web/.htaccess] to see whether the .htaccess file is indeed owned by user httpdusr and group everyone.
    2. The second thing is to make sure the .htaccess file has the right permissions.  So we need to use this command [chmod 400 /share/Web/.htaccess].  What this command does is change the permission of the .htaccess file in the /share/Web directory to read only for the user (the owner of the .htaccess file), with no permissions for anyone else; hence the two zeros after the 4.  The first zero stands for no permissions for the group (whoever belongs to the file's group), and the last zero stands for no permissions for everyone else.  Finally, you can do [ls -la /share/Web/.htaccess] to confirm whether the permission of the .htaccess file is indeed 400.  If it is, only the QNAP web server user httpdusr will be able to read the file, and even this user cannot write to or execute it.
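The whole procedure above can be condensed into a few lines.  Here it is demonstrated against a scratch directory so nothing real gets touched; on an actual QNAP you would set WEBROOT=/share/Web and run the chown shown in the comment, and the two sample IP addresses are placeholders for your own:

```shell
# Scratch stand-in for QNAP's /share/Web (use WEBROOT=/share/Web for real).
WEBROOT=/tmp/qnap_demo/Web
mkdir -p "$WEBROOT"

# Back up any existing .htaccess before replacing it (step #2).
if [ -f "$WEBROOT/.htaccess" ]; then
    cp -a "$WEBROOT/.htaccess" "$WEBROOT/.htaccess-old"
fi

# Write the allow-list .htaccess (steps #3 and #4); the addresses
# 192.168.0.10 and 192.168.0.11 are placeholders for your own.
cat > "$WEBROOT/.htaccess" <<'EOF'
order deny,allow
allow from 192.168.0.10
allow from 192.168.0.11
deny from all
EOF

# Lock the permissions down to read-only for the owner (step #5).
# On the real server, also run: chown httpdusr:everyone "$WEBROOT/.htaccess"
chmod 400 "$WEBROOT/.htaccess"
ls -la "$WEBROOT/.htaccess"
```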

Now, with this .htaccess configuration for your QNAP's /share/Web directory, the web applications residing within this Web directory will not be accessible from any IP address unless it is one allowed by this very .htaccess file.

Do you know that by following the tip herein, you can also use this very tip for a non-QNAP web server?  As long as the server is Apache or another server that honors .htaccess files, just create a similar .htaccess file within the relevant web directory to block most IP addresses and allow only the ones listed within.

Virtual Machine Is A Very Beautiful Thing

A virtual machine is a very beautiful thing, but the majority of computer users might be ignorant of it.  How beautiful is a virtual machine?  Let me just say this right off the bat: a virtual machine is there to piss off evildoers!  It's so beautiful that you can basically download computer viruses onto a virtual machine without much fear of these nasty things going around and infecting the physical machine.  Of course, as with just about anything, if one is inept at computer things, one might still allow the computer viruses and whatnot to infect the whole Intranet (LAN) even when using a virtual machine.  Nonetheless, one has to be very inept to do so.  For example, putting a virtual machine on the same subnet as a physical machine, without giving the virtual machine its own protection measures (i.e., antivirus, firewall, and whatnot), shows just another door to the evildoers.  The evildoers can use a compromised active virtual machine as a gateway for their Intranet (LAN) hacking activities.  The beautiful thing is that if one is smart enough to secure a virtual machine, one basically has a hardened sandbox which can easily be used as a platform for browsing the dangerous web at will.  Perhaps even for downloading computer viruses and whatnot for testing purposes, such as testing the effectiveness of an antivirus program.  Professional antivirus software reviewers mostly use a hardened virtual machine to test how effective an antivirus program can be.

A virtual machine is so beautiful that it is very perverted.  How?  I've heard of many people who have seen their computers get infected with computer viruses, worms, trojans, and whatnot just because they have been browsing dangerous pornographic websites.  What's worse is that these folks do not use readily available simple measures such as JavaScript blocker plugins (e.g., ScriptSafe, NoScript, etc.).  For example, I talked to one person who complained that he had to format his computer often because it caught too many computer viruses.  This very person would like to say that he's an advanced computer user.  Nonetheless, he told me he was befuddled by how his Windows machine kept on catching a flu (i.e., sarcasm for computer viruses).  Furthermore, he told me that it was too easy for his computer to catch a flu whenever he got perverted.  Obviously, it meant that he browsed pornographic websites and his computer caught a flu.  In the end, he told me his assumption that there's no way a PC can be OK if one is browsing a pornographic website.  I told him flat out that he's dead wrong.  The simplest answer I could give him at that point was to make sure his physical machine is clean (i.e., not infected with any computer virus) and then install a virtual machine.

A virtual machine is beautiful since it allows us to have a secure sandbox to play around in.  Of course, it's a bit more than just a secure sandbox, because a virtual machine can run just about all major operating systems.  Furthermore, a virtual machine can be a quick testing ground for security software and whatnot.  If a virtual machine user doesn't like what he or she sees, he or she can simply go through a few clicks to delete a virtual machine and make a new one.  My suggestion for whoever browses the web dangerously is to install a virtual machine on a clean physical machine, install a Linux distribution such as Ubuntu, install a firewall and ClamAV onto Ubuntu, harden up Ubuntu (the virtual machine) as if it were running on a real machine, and then browse the dangerous web.

A virtual machine is a strange beast, because it can do certain things exceptionally well and efficiently, but it can be totally useless at other times.  For example, playing games on a virtual machine is a no-no.  First of all, a virtual machine does not use a dedicated graphics card directly; it emulates one.  Even if a virtualization platform allows the guest to share the physical computer's dedicated graphics resources, I doubt a virtual machine could really share them efficiently.  Playing graphics-intensive games would be almost impossible.  Nonetheless, if one uses a virtual machine for applications such as virtualizing a NAS (i.e., Network Attached Storage) server, it can become very interesting.  Imagine this further: how interesting would it be to clone a virtualized NAS easily, right?  Virtual machine platforms such as VirtualBox certainly carry the option of allowing a computer user to clone a virtual machine with a few clicks of a mouse.

In summary, a virtual machine is very beautiful, but its degree of beauty scales with whoever is using it.  One can simply use a virtual machine to test how effective an antivirus program can be, but one can also use it to run a virtualized NAS.  If one is horny, one can simply browse the dangerous pornographic websites within a virtual machine.  Basically, a virtual machine is quite useful and secure if one knows how to use it as a sandbox.