Installing And Securing Linux Mint 17, And Installing Adobe Reader

I was installing Linux Mint 17 in a virtual machine on my PC, and I decided it was a good idea to record the whole process.  I also installed Adobe Reader manually on Linux Mint 17, so by watching this video you will learn how to do that as well.  If you're trying to do what I've done in this video, make sure you do not deny shell access and lock the password for the regular user or users you still want to use, because doing so will leave you unable to log into the system.  If you follow my video closely, denying shell access means editing the /etc/passwd file, and locking a password means changing the /etc/shadow file by executing the command passwd -l [username].
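For reference, here is a minimal sketch of those two lockdown steps, assuming a Debian/Ubuntu-style system and a hypothetical user named alice (again: only do this to accounts you do not need to log into):

```shell
# Work on a copy of /etc/passwd first so a typo can't lock you out.
cp /etc/passwd /tmp/passwd.edit

# Deny shell access: swap the account's login shell for nologin.
# "alice" is a hypothetical user name; adjust paths to your system.
sed -i 's|^\(alice:.*:\)/bin/bash$|\1/usr/sbin/nologin|' /tmp/passwd.edit

# Review the change, then (as root) put the edited copy in place:
#   cp /tmp/passwd.edit /etc/passwd

# Locking the password is done with passwd itself, which prefixes the
# hash in /etc/shadow with "!" (undo later with: passwd -u alice):
#   passwd -l alice
```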

Moreover, if you're trying to edit the /etc/fstab file as I did in the video, make sure you make a copy of the original /etc/fstab file before editing it.  The /etc/fstab file is very important, because it tells the system how to mount devices such as hard drives, and messing this file up will prevent your system from booting.  Keeping a copy of the original /etc/fstab file will allow you to restore it in case you break the one in use.
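For instance (a minimal sketch; the backup location is just my choice):

```shell
# Copy /etc/fstab somewhere safe before editing it; reading the file
# needs no root, only writing it back does.
cp /etc/fstab "$HOME/fstab.bak"

# Confirm the backup really matches the original before editing:
cmp /etc/fstab "$HOME/fstab.bak" && echo "backup OK"

# If a bad edit later stops the system from booting, restore from a
# recovery shell or live CD with:
#   cp /path/to/fstab.bak /etc/fstab
```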

If you pay close attention to the part where I edit the /etc/fstab file, you will notice that I made an error when adding the rw option to the /tmp and /dev/shm entries, but you will also notice that I corrected the error a few seconds later in the video.  Basically, rw is the correct option, but before I wrote it as rw I had typed it as wr.  The system won't recognize wr as an option, so instead of wr, it has to be rw.

rw is a mount option that permits reading and writing.  On its own, rw does not stop programs from being executed; that job belongs to the noexec option, which is why hardening guides pair rw with noexec (and usually nosuid) on /tmp and /dev/shm.  The result is that these mounts can be read from and written to, but nothing can be executed from them.  Anyhow, you can check out the video right after the break.  Enjoy!!!
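For context, entries of the kind discussed above would look something like this in /etc/fstab (a sketch; the exact devices, filesystem types, and extra options on your system may differ):

```
# rw permits reading and writing; noexec blocks running programs from
# the mount; nosuid ignores setuid bits (commonly paired with noexec).
tmpfs   /tmp       tmpfs   defaults,rw,noexec,nosuid   0   0
tmpfs   /dev/shm   tmpfs   defaults,rw,noexec,nosuid   0   0
```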


Virtual Machine Is A Very Beautiful Thing

A virtual machine is a very beautiful thing, but the majority of computer users might be ignorant of it.  How beautiful is a virtual machine?  Let me just say this right off the bat: a virtual machine is there to piss off evildoers!  It's so beautiful that you can basically download computer viruses onto a virtual machine without the fear of these nasty things going around and infecting the physical machine.  Of course, as with just about anything, if one is inept enough in computer things, one might still allow the computer viruses and whatnot to infect the whole intranet (LAN) even while using a virtual machine.  Nonetheless, one has to be very inept to do so.  For example, allowing the virtual machine onto the same subnet as a physical machine without its own protection measures (i.e., antivirus, firewall, and whatnot) shows the evildoers just another door; they can use a compromised active virtual machine as a gateway for their intranet (LAN) hacking activities.  The beautiful thing is that if one is smart enough to secure a virtual machine, one basically has a hardened sandbox which can easily be used as a platform for browsing the dangerous web at will, perhaps even for downloading computer viruses and whatnot for testing purposes, such as testing the effectiveness of an antivirus program.  Professional antivirus software reviewers mostly use a hardened virtual machine to test how effective an antivirus program can be.

A virtual machine is so beautiful that it is very perverted.  How?  I've heard of many people whose computers got infected with viruses, worms, trojans, and whatnot just because they had been browsing dangerous pornographic websites.  What's worse is that these folks do not use readily available simple measures such as JavaScript-blocking plugins (e.g., ScriptSafe, NoScript, etc…).  For example, I talked to one person who complained that he had to format his computer often because he caught too many computer viruses.  This very person liked to say that he was an advanced computer user.  Nonetheless, he told me he was befuddled about how his Windows machine kept catching the flu (i.e., sarcasm for computer viruses).  Furthermore, he told me that it was too easy for his computer to catch the flu whenever he got perverted.  Obviously, it meant that he browsed pornographic websites and his computer got infected.  In the end, he told me his assumption that there is no way a PC can be OK if one is browsing a pornographic website.  I told him flat out that he was dead wrong.  The simplest answer I could give him at that point was to make sure his physical machine is clean (i.e., not infected with any computer virus) and then install a virtual machine.

A virtual machine is beautiful since it allows us to have a secure sandbox to play around in.  Of course, it's a bit more complicated than just a secure sandbox, because a virtual machine can run just about every major operating system.  Furthermore, a virtual machine can be a quick testing ground for security software and whatnot.  If virtual machine users don't like what they see, they can simply go through a few clicks to delete a virtual machine and make a new one.  My suggestion for whoever browses the web dangerously is to install a virtual machine on a clean physical machine, install a Linux distribution such as Ubuntu, install a firewall and ClamAV onto Ubuntu, harden Ubuntu (the virtual machine) as if it were running on a real machine, and then browse the dangerous web.
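As a rough sketch of that suggestion, here is the sort of setup script I would prepare for a fresh Ubuntu guest (package names are Ubuntu's; it is written to a file here so you can review it before running it as root inside the virtual machine):

```shell
cat > harden-ubuntu-vm.sh <<'EOF'
#!/bin/sh
# Run inside the Ubuntu guest as root. Installs a firewall front end
# (ufw) and the ClamAV antivirus, then turns the firewall on with a
# default-deny inbound policy.
apt-get update
apt-get install -y ufw clamav
ufw default deny incoming
ufw default allow outgoing
ufw --force enable
# Refresh virus definitions and scan the home directories:
freshclam
clamscan -r /home
EOF
chmod +x harden-ubuntu-vm.sh
```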

A virtual machine is a strange beast, because it can do certain things exceptionally well and efficiently, but it can be totally useless at other times.  For example, playing games on a virtual machine is a no-no.  First of all, a virtual machine does not use a dedicated graphics card directly; it emulates one.  Even when a virtual machine environment allows guests to share the physical computer's dedicated graphics resources, I doubt a virtual machine could really share them efficiently.  Playing graphically demanding games would be almost impossible.  Nonetheless, if one uses a virtual machine for applications such as virtualizing a NAS (i.e., a Network Attached Storage server), things become very interesting.  Imagining this further, how interesting is it that one can clone a virtualized NAS easily?  Virtual machine platforms such as VirtualBox certainly carry the option of letting a computer user clone a virtual machine through a few clicks of a mouse.
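Those few clicks also have a command-line equivalent.  A sketch, assuming VirtualBox is installed on the host and a VM hypothetically named "NAS-vm" (written to a script file so you can check the name before running it):

```shell
cat > clone-nas-vm.sh <<'EOF'
#!/bin/sh
# Clone the VM "NAS-vm" (hypothetical name) into "NAS-vm-clone" and
# register the copy so it shows up in the VirtualBox manager:
VBoxManage clonevm "NAS-vm" --name "NAS-vm-clone" --register
EOF
chmod +x clone-nas-vm.sh
```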

In summary, a virtual machine is very beautiful, but the degree of its beauty scales according to whoever is using it.  One can simply use a virtual machine to test how effective an antivirus program is, but one can also use it to run a virtualized NAS.  If one is horny, one can simply browse the dangerous pornographic websites within a virtual machine.  Basically, a virtual machine is quite useful and secure if one knows how to use it as a sandbox.


Had Ditched Spotify, Was With Xbox Music, Now I’m Back With Spotify Again But Still Using And Loving Windows 8 Ecosystem

Spotify's headquarters on Humlegårdsgatan (Photo credit: Wikipedia)

A month after I quit Spotify, I'm now back with it for my music-listening pleasure.  A month ago, I ditched Spotify for Microsoft's Xbox Music as I bought into the idea of a more coherent Windows 8 ecosystem.  Unfortunately, Xbox Music was a frustrating experience.  Fortunately, I still like the Windows 8 ecosystem, but my love affair with it will just have to do without the Xbox Music experience.

The Xbox Music experience was bad for me, because it was hard to get the playlists on my HTC 8X Windows Phone 8 device to sync correctly with the ones on my Windows 8 PC, and creating playlists was a pain through the Xbox Music app.  Furthermore, Xbox Music suddenly refused to play any music in my playlists even though I had one more day left in my free trial subscription.  I would have stuck with Xbox Music by subscribing to its monthly plan, but the last straw was how poorly Microsoft had trained its customer support departments to deal with Xbox Music errors, whether those errors appeared on the smartphone or on the Windows 8 PC.  I experienced this firsthand as customer support transferred me back and forth between the Windows 8 and Xbox support departments, and in the end my question and problem were never resolved.

I thought I would go for weeks on end without being able to listen to awesome music on my smartphone, since I'm no longer using an iPhone 5.  Instead of the iPhone 5, I'm using the HTC 8X, a Windows Phone 8 device.  I like the HTC 8X a lot, because it runs the Windows Phone 8 operating system.  I think Windows Phone 8 is way cooler than the stuff that makes up the Android and iPhone operating systems.  Nonetheless, without a great music experience, it was painful for me.  Luckily, Spotify came to the rescue.  The Spotify app, a beta version nonetheless, is now available in the Windows Phone 8 app store.  Like a thirsty person in a desert, I was only too eager to download the Spotify app onto my HTC 8X and pay roughly $10 per month for Spotify's premium plan.

The great feeling of being able to listen to whatever music I want, without being frustrated by the creation and syncing of playlists, is a wonderful thing.  Now I'm listening to Spotify again whenever I'm in my car, at home, and elsewhere.

I think Microsoft needs dedicated teams within its Xbox Music department to make sure Xbox Music does well.  If I were Microsoft, I would imagine Xbox Music, a core service among core services within the Microsoft complex, as a company of its own, just like Spotify, so that the teams that build Xbox Music can be dedicated enough to see things from the ground up and make sure Xbox Music is just as easy and pleasurable to use as Spotify.  After all, Xbox Music does carry a substantial amount of music, which forms the core of the whole service.  Unfortunately, fulfilling the content side isn't enough for people like me, because a clunky user interface for creating and syncing playlists, plus weird Xbox Music errors, pushes people like me away from the service.  Furthermore, bad customer support for Xbox Music does not alleviate but only compounds Xbox Music's problems.

In conclusion, I think Xbox Music needs to be better, and I think Microsoft has the resources to dedicate to such a task.  The Windows 8 ecosystem is great, but I think people will appreciate it more if Xbox Music becomes part of that greatness.  Thanks to Spotify, I will gladly stick with the HTC 8X a lot longer.  There is one downside with Spotify at the moment: it doesn't work well on a Windows 8 PC.  Why?  In my experience, Spotify refuses to quit, and its process cannot be terminated once you launch it on a Windows 8 PC.  Furthermore, Spotify tends to crash too frequently on a Windows 8 PC.  Perhaps some people have better luck with Spotify on a Windows 8 PC, but I don't.  Some people answer Spotify's problematic behavior on a Windows 8 PC by launching Spotify in a Windows 7 virtual machine (i.e., installing Windows 7 in a virtual machine that runs on the Windows 8 PC).

Gigabit LAN Empowers Productivity Such As Running Virtual Machines On A Network Attached Storage’s iSCSI

Intel Pro/1000 GT Gigabit Ethernet PCI Network card (Photo credit: Wikipedia)

If you're on a Gigabit LAN (Local Area Network), you can do so many things that sometimes the extra effort seems redundant, but that is the whole idea!!!  For instance, on my Gigabit LAN, I installed a Fedora 16 virtual machine (a VirtualBox one) onto my FreeNAS box (my home Network Attached Storage server), yet I access this virtual machine from my other home computers.  This way I can centralize whatever virtual machines I have in one location while still being able to access them from anywhere (i.e., any local computer which has VirtualBox installed).  To run a virtual machine off the FreeNAS box, I set up iSCSI and installed a virtual machine (using VirtualBox) on the iSCSI drive (an iSCSI ZFS dataset volume).
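One way to wire this up is VirtualBox's built-in iSCSI initiator, which can attach an iSCSI target on the FreeNAS box directly as a VM disk.  A hedged sketch: the VM name "fedora16", the controller name, the server address, and the IQN below are all placeholders you would replace with your own FreeNAS iSCSI settings (saved to a script file so you can review it before running it on a machine with VirtualBox installed):

```shell
cat > attach-iscsi.sh <<'EOF'
#!/bin/sh
# Attach an iSCSI target as the first SATA disk of the VM "fedora16".
# Server address and IQN are placeholders; 3260 is the default iSCSI port.
VBoxManage storageattach "fedora16" \
  --storagectl "SATA" --port 0 --device 0 --type hdd \
  --medium iscsi --server \
  --target "" \
  --tport 3260
EOF
chmod +x attach-iscsi.sh
```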

I wonder… what happens if two local computers access the same virtual machine at the same time?  Probably something bad.  I don't think there would be a problem with two local computers accessing the same virtual machine at different times, though.  Nonetheless, why don't you try this out and let me know, OK?

Other examples of how I have used a Gigabit LAN include backing up Windows 7 machines, Macs, and other computers to the FreeNAS box (e.g., over CIFS and AFP).  Without a Gigabit LAN, doing the many things I mentioned above would be tedious and slow.  A Gigabit LAN pushes data ten to forty times faster (at least that is how it feels) than slower types of LAN.

Getting a Gigabit LAN going in your home isn't hard at all!  The requirements are CAT6 cabling, a Gigabit NIC (Network Interface Card), and a Gigabit router.  That's all you really need.  I wish I could say more, as if I were very sophisticated, but there isn't much more to getting a Gigabit LAN going.

Nowadays, CAT6 cable isn't expensive anymore.  For example, I looked on Amazon and saw a 50-foot CAT6 cable that cost only $3.45.  The same inexpensive story goes for Gigabit NICs; I saw a PCI-E Gigabit NIC priced around $32 on Amazon.  The Gigabit router is probably the most expensive item you have to get before you can have a Gigabit LAN going.  I saw a Gigabit wireless router priced around $72 on Amazon, but a few reviewers said this router had an overheating problem.  You definitely need to get a good Gigabit router with few problems, or you might not get anywhere close to the advertised Gigabit speed.

Sometimes, Glossing Over The Simplest Things Would Prevent One From Fixing The Problems

I had built an awesome FreeNAS 8.0.4 box, but little did I know that this was the beginning of all the problems, and these problems bugged me for two days straight.  Did you notice that I had not updated my blog in two days?  Anyway, it all started when I bought three 3 TB, 7200 RPM, non-spin-down Seagate hard drives and installed them in an HP Pavilion desktop computer I had not touched for at least two years.  The HP Pavilion desktop had the specs to make a fine FreeNAS box: 6 GB of DDR2 SDRAM at 800 MHz and a quad-core CPU, and everything else wasn't that important to building a FreeNAS box besides the three 3 TB Seagate hard drives I had bought for that sole purpose.  Before this, I had only experienced FreeNAS through virtualization technology (e.g., VirtualBox, VMware, Parallels), so I had always been eager to build a real FreeNAS box.  It was about time, I guess.  It was a breeze to install the three 3 TB Seagate drives into the HP Pavilion desktop, and installing FreeNAS 8.0.4 onto a USB flash drive was just as easy.

With everything in place before my FreeNAS set sail, I thought, man, I've got this!  Sure, I had it, but… here is the but.  I had forgotten that there was a reason I had not played with the HP Pavilion desktop all this time.  Since the day I had taken this computer off its Windows 7 addiction, I had been too lazy to put Windows 7 back on so I could flash an updated BIOS for it, but without the newer BIOS this computer would freeze on reboot or on a fresh boot; the BIOS would not even get the chance to start, and the whole computer would freeze at a black screen.  This problem obviously gave me a hard time putting Windows back on, because 9 out of 10 times the computer would freeze before the BIOS could even boot, so I would not even get the chance to let the computer read the Windows 7 installation disk or USB flash drive.  Luckily, I was persistent and finally got the computer to start the BIOS.  I quickly installed Windows 7 and crossed my fingers that it would let me boot into Windows 7 so I could update the BIOS.  That too was a lucky shot, and eventually I had the BIOS updated.

After the BIOS mess was over, I thought I could now use my awesome FreeNAS box with joy.  Such joy was never to last, because I kept asking myself why on earth it took my MacBook Pro over eight or nine hours just to back up around 10 GB worth of data to the FreeNAS AFP ZFS share volume.  This second incident had me pulling my hair and cursing foully.  I should have known better and done the right thing first: making sure the basic elements weren't the root of the problems.  Instead, I went on impatiently, fixated on the idea that it had to be a FreeNAS problem from the start.  It took so much of my precious time to diagnose the FreeNAS box and so on, just to find out that my last-ditch effort was what I should have done from the very beginning.  It was the router's configuration that had my MacBook Pro sending only 1 MB worth of data per second.  Considering I'm on a Gigabit network, a 1 MB per second transfer rate had to be one of the lamest things I had ever seen.  After readjusting the router's configuration, I was glad to see that even over WiFi, my MacBook Pro was able to send data 14 or 15 times faster (i.e., an Ethernet connection would be much, much faster).

The moral of this story is that you have to think things through before you actually embark on fixing them.  Fixing computing and networking matters can be a lot simpler than it looks, but sometimes you gloss over the simple elements and miss the whole show.  I had done just that, and it was exhausting.  To end this blog post of mine, I'd like to leave a tip on how one would go about measuring the data transfer speed between one's computer and a FreeNAS box.  The idea is to use an FTP program like FileZilla and monitor the upload data rate of a very large file (preferably gigabytes in size) as it gets transferred from a computer to the FreeNAS FTP volume (i.e., an FTP ZFS dataset).
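If you prefer the command line to FileZilla, the same measurement can be sketched with dd and curl.  Hedged sketch: the host name, credentials, and share path below are placeholders for your own FreeNAS settings.

```shell
# 1) Make a large test file; the bigger the file, the steadier the
#    measured rate (256 MB here, though gigabytes are better):
dd if=/dev/zero of=/tmp/transfer-test bs=1M count=256

# 2) Upload it; curl prints the average upload speed when it finishes.
#    "user", "password", "freenas.local", and "ftp-share" are placeholders:
#   curl -T /tmp/transfer-test ftp://user:password@freenas.local/ftp-share/
```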

Let's Download An Entire Website Locally For Viewing A Website Offline

A download symbol. (Photo credit: Wikipedia)

I should have known how to save a website for offline viewing long, long ago, but the truth is that I did not know an elegant way of doing it until now!  For the longest time, I have used wget for downloading open source software over the Internet, but I had no idea that I could also use wget to download an entire website for offline viewing.  Now I know, but how do I use wget to download an entire website on a Mac?

Before I continue with the main point, you might wonder what the point of downloading an entire website is.  The point is simply that some people might experience Internet Interruption Syndrome, and by downloading a website for offline viewing, they can somewhat anticipate this very syndrome.  You know, it can happen to you too!  Say you're on a road trip to somewhere you have fantasized about, but your so-called 21st-century car doesn't have 21st-century wireless technology, and you don't have any other 21st-century always-on wireless technology with you (e.g., a portable hotspot, or a good enough smartphone data plan that lets your smartphone behave as a portable hotspot).  You're in a bind, unable to connect to the Internet while inside a rather modern moving car, and this makes you want to scream, "Oh my God, I want a cure for my Internet Interruption Syndrome!"  Don't scream too loudly, though, because you might make your driver swerve dangerously in and out of the highway lane.  The driver might blame you for triggering a "Sudden Oh My God Syndrome," but the blame has to come after the fact, provided the car and its passengers are still whole.

With the why of using wget to download an entire website out of the way, let us move on to how to acquire wget so we can use it, OK?  Unfortunately, wget doesn't come with the Mac by default, but you can always get it onto a Mac by following a How To Get Wget For Your Mac tutorial.  If for some reason you don't want to follow such a tutorial, you can always install a virtual machine (e.g., VMware, VirtualBox, Parallels) that runs Linux (e.g., Ubuntu, Fedora, CentOS, Mint, etc…), and this way you automatically acquire wget, since Linux distributions generally install wget by default (i.e., so far, Linux has always included wget).  Just remember, you need to enable a shared folder between the Linux virtual machine and your host machine (e.g., Mac, Windows) so you can share whatever wget has downloaded between the virtual machine and the host machine.  This way you don't have to copy the content from the virtual machine onto a USB flash drive and then share the flash drive's contents with the host machine.

OK, with how to acquire wget out of the way, let us move on to how to use wget to download an entire website, OK?  I followed Linux Journal's Downloading an Entire Web Site with wget tutorial.  In case you don't want to check out that tutorial, you can read on, as I will repeat how to use wget to download an entire website within this blog post of mine.  To use wget to download an entire website, open up a terminal in Linux (or a terminal on the Mac if you have wget installed successfully there) and type in the commands below:

  1. cd ~/Documents/
  2. mkdir wget-Downloads
  3. cd ~/Documents/wget-Downloads
  4. wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains

After running the commands above, you should have a wget-Downloads directory created inside your Documents directory (e.g., Linux: /home/[user-name-here]/Documents/wget-Downloads; Mac: /Users/[user-name-here]/Documents/wget-Downloads) and, inside it, the website you downloaded.  Of course, remember to replace with the actual website you want, OK?  Also, if you compare the tutorial from Linux Journal against mine, you will notice I did not use the --no-parent parameter for the wget command.  Using the --no-parent parameter restricts wget from downloading an entire website, so you might end up with broken links when viewing the website offline.  Still, if you are sure about the usage of the --no-parent parameter, then you should use it.  Also, you should know that using wget to download an entire website can sometimes be the worst thing you can do, because you might have to twiddle your thumbs for the longest time, if not forever, when the website you try to download is way, way too big.  Luckily, you can always use the Ctrl+C key combination on Linux (it might be the same on a Mac) to stop wget from continuing the download of an entire website.

As Linux Journal explained,

  • The --recursive parameter tells wget to download an entire website
  • The --domains parameter tells wget to download only the contents within a specific website and not the contents of other websites, since wget can actually follow links that point to other websites and scrape their contents too
  • The --no-parent parameter tells wget not to follow links outside of a given directory within a website, stopping wget from downloading whatever contents are located outside of that specific directory
  • The --page-requisites parameter tells wget to download all the extra contents besides just text (e.g., CSS, images, etc…), so the offline website will appear pretty much the same as if it were being viewed online
  • The --html-extension parameter tells wget to save the offline website's files with an .html extension, keeping the website structure as if it were being served online (this is useful for a website owner backing up a website locally)
  • The --convert-links parameter tells wget to convert links to point locally, so when the website is viewed offline, its links will connect to each other properly (locally)
  • The --restrict-file-names=windows parameter tells wget to convert file names so that the downloaded files will also display correctly on Windows (i.e., Windows will be able to serve the offline website's files correctly in whatever browsers are installed on it)
  • The --no-clobber parameter tells wget not to overwrite any existing file, saving some bandwidth and storage space, but sometimes it's best not to use this parameter so you can actually update the offline copy of the website (i.e., sometimes a website updates its pages with newer contents)

In summary, I have tried many other methods of saving a website for later offline viewing, but none is as elegant and simple as using wget.  How come?  For example, when I used a browser to save a website (i.e., File > Save Page As), I had to do it more than once to save all the portions of the website correctly.  Furthermore, I had to reorganize the saved portions locally, or else they would appear disorganized within the local directory.