Be The Master Of Your Own Cloud Data: Install ownCloud And Run A Dropbox-Like Service Privately For Free

Dropbox and various online third party cloud services are great and free up to a point, but the all-you-can-eat buffet kind of expectation is definitely not something these cloud services can satisfy.  Right off the bat, one thing these third party cloud services cannot provide is the level of privacy you get from storing data within your own private network.  Want more cloud space than the so-called free allotment?  It isn't free; you pay for however many extra Gigabytes you want, and so forth.

ownCloud is free, open source software which acts like Dropbox, but you can download, install, and use it freely.  I think ownCloud gives you the opportunity to be 100% in control of your data's privacy.  If you know how to implement robust security measures such as a proper firewall and port forwarding, you can even allow yourself to roam the seven seas and still sync with your local data securely.  Unlike Dropbox and other third party cloud services, with ownCloud you know you're the master of your own data in the cloud.  OK, I'm beginning to rant unnecessarily.

Anyhow, want to know how to install ownCloud and use it?  Check out the video right after the break, where I show you how to install ownCloud on Linux Mint.  Of course, you can follow the video's instructions to do the same on Ubuntu, because Linux Mint is just an Ubuntu-based distribution.  Enjoy!!!
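If you'd rather have the gist in text form, here is a rough sketch of the manual route, assuming an Ubuntu/Mint box and an ownCloud tarball downloaded from owncloud.org (the tarball file name below is just a placeholder for whatever the current version is):

    # Install the web server, PHP, and database that ownCloud relies on (package names for the Ubuntu 12.x era)
    sudo apt-get install apache2 php5 php5-gd php5-mysql php5-curl mysql-server
    # Unpack the downloaded ownCloud tarball into Apache's web root
    sudo tar xjf owncloud-latestversion.tar.bz2 -C /var/www/
    # Let the web server own the files so the first-run setup wizard can write its config
    sudo chown -R www-data:www-data /var/www/owncloud
    # Finish by browsing to http://your-server/owncloud and completing the setup wizard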

Allowing Specific IP Addresses To Access QNAP’s Web Apps Using .htaccess File And Preventing All Other IP Addresses From Meddling With QNAP’s Web Apps

If you’re using a QNAP as a NAS, you probably know that QNAP allows you to install web apps onto the QNAP server.  Web apps are cool, but they can also be a security nightmare, which is why you often have to upgrade them.  One example of a popular web app you can install on a QNAP server is WordPress.  Anyhow, whether a web app carries a vulnerability or not, you want to secure your QNAP’s web apps with a .htaccess file.  By adding a .htaccess file to the /share/Web directory on the QNAP server, you add one more hoop (security layer) for hackers to deal with.  In the video right after the break, I’m going to show you how to add a very simple .htaccess file to QNAP’s /share/Web directory to thwart a malicious user who might be able to bypass the router’s firewall and hack your QNAP server through web app vulnerabilities.  Enjoy!!!
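For reference, here is a minimal sketch of what such a .htaccess file could look like, assuming the QNAP web server honors Apache 2.2-style access directives (the 192.168.1.0/24 range is only an example; substitute the addresses you actually want to allow).  Save these lines as /share/Web/.htaccess:

    # Deny everyone by default, then allow only the trusted addresses below
    Order Deny,Allow
    Deny from all
    Allow from 192.168.1.0/24
    Allow from 127.0.0.1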

How Paranoid Should You Be For Backing Up Your Data?

Backup Backup Backup – And Test Restores (Photo credit: Wikipedia)

If you ask me what the best way to back up your data is, I will probably direct you to more than one way.  I like to think of it as not placing all of your eggs in one basket.  What’s the point of backing up data in the first place?  It’s so that when things go crazy, such as when a computer’s data gets corrupted, you can still access your most valuable backup data.  If you rely on only one backup method, what will you do when, at a critical moment, even the backup data isn’t accessible through that one method?  Even a perfect storm is a plausible reason for spreading your eggs across more than one basket, so I think being paranoid and safekeeping your data with more than one backup method is the best way to go about backing up your valuable data.

For us normal folks, the regular Joe(s), who have data we want to safeguard, it’s a must to spread our data across more than one basket.  You don’t have to be a company to take this approach.  Furthermore, nowadays regular Joe(s) have plenty of ways to go about backing up their data.  Let me list a few of them:

  • Google Drive
  • Pogoplug
  • Dropbox
  • Amazon Simple Storage Service
  • CrashPlan
  • External hard drives
  • Network attached storage solutions such as QNAP NAS servers
  • Do-it-yourself FreeNAS server solution
  • rsync to a rented server with an affordable monthly fee (see the example a bit further down)

And the list could go on a lot longer, as third party cloud services are now in ample supply.  I think the problem isn’t about finding a backup solution or two for the regular Joe(s); it’s about affordability, speed, security, and convenience.  Let’s say a regular Joe wants to spread his backup data across more than one basket: how affordable can that be?  So on and so on…
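Speaking of the rsync option from the list above, here is a minimal sketch of what such a run could look like, assuming a rented server reachable over SSH (the user, host name, and paths are just placeholders):

    # Mirror a local folder to the rented server over SSH:
    # -a preserves permissions and timestamps, -z compresses in transit, --delete removes files that no longer exist locally
    rsync -az --delete -e ssh /home/joe/Documents/ backupuser@rented-server.example.com:/backups/documents/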

I think affordability should not be as big of an issue as it was before third party cloud services and competitive (affordable) computer hardware pricing came along.  Unless you intend to harbor hundreds of Gigabytes worth of data for streaming purposes or some other extreme configuration, backing up a few Gigabytes worth of data should not cost you much at all.  Perhaps you can even do it at no cost.  For example, I think Google Drive gives you around 10 Gigabytes of free space, or a little bit more than this, and with this service alone you don’t have to spend a dime to back up your data as long as you stay under the free space limit that Google Drive allows.  Don’t like third party cloud services for whatever reason?  Computer hardware such as external hard drives is no longer outrageously priced, so it’s easier for regular Joe(s) to go this route for their data backups.  How about coupling Linux with a spare, dusty computer to form a local backup storage server?  It costs nothing in terms of money, but you do have to spend time putting things together, such as installing Linux and deploying Linux’s network attached storage services, to have a more complete backup server solution.

I can see the many third party cloud services as good solutions for doing backups.  How come?  Let’s say you’re paranoid about the safety of your data to the point that you consider the scenario where all local backup data gets corrupted at the same time, for whatever reason, such as a virus/hack attack (or an even more nefarious scenario); third party cloud services then become additional safety reservoirs for your backup data.  If you are this paranoid, I think you’re doing it right.  Although third party cloud services are a good measure against local data corruption, there are problems with this whole approach in general.  Let me list a few:

  • Broadband upload speed (Internet connection) isn’t fast enough to do a major backup (i.e., backing up a huge amount of data, many Gigabytes’ worth)
  • Security issues… how do we know our data will be securely safeguarded and stored on the remote servers?
  • Trust issues… how do we know our data privacy, and our privacy in general, won’t be breached on the remote servers?

I sneakily snuck the speed and security concerns about backing up data remotely through third party cloud services into that list, but we should not take the security issue lightly, since many people may not want their privately backed-up data to be made known to the whole world.  Getting security right for backing up data both locally and remotely also addresses the privacy concern.  I think employing good network and computer security measures locally will raise the protection level for the backup data; such measures include hardware and software firewalls, antivirus, and so on.  Don’t forget to update your software and firmware, because it’s through these updates that security bugs get weeded out.  You can never be too sure about the security of your data when you back it up remotely, so you should encrypt your backup data before you upload it to the remote servers.  One good encryption tool I know of is TrueCrypt, which can be downloaded and used freely.
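Just to illustrate the encrypt-before-upload idea, here is a minimal sketch using GPG (chosen only because it makes for a short example; TrueCrypt or any other solid encryption tool works too, and the file and folder names are placeholders):

    # Bundle the data into one archive, then encrypt the archive with a passphrase
    tar czf mybackup.tar.gz /home/joe/Documents
    gpg --symmetric --cipher-algo AES256 mybackup.tar.gz   # produces mybackup.tar.gz.gpg
    # Upload only the .gpg file to the remote service; keep the passphrase safe and separate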

I don’t think we should sacrifice our data security for convenience, because data security is definitely the more important of the two.  Still, convenience should be considered in the calculation of our data backup challenge too.  It’s just that we have to make sure we don’t sacrifice data security for it.  Let’s say you want to back up your data to a third party cloud service, but you don’t like the idea of encrypting your data locally first… this means you are sacrificing your data security for convenience, and that is truly bad for you as the owner of the backup data (i.e., a privacy concern).

In summary, I think if you’re paranoid enough about the health of your data, then you should devise multiple backup plans for it.  You should try to back up your data both locally and remotely, but you should encrypt your data whenever you back it up remotely.  Backing up a huge amount of data remotely can be very inconvenient at this point in time, since so many regular Joe(s) do not have access to fast upload broadband speeds.  Let’s hope this changes soon; I know things will move in this direction, since data streaming, data sharing, and data backup are in much more demand than ever before.  One example is Google Fiber.  Google is driving the Internet Service Provider competition forward as it deploys its Gigabit Internet connection service to households in various lucky cities and towns.  With Google pushing for more competition in broadband speed, I think the future of having a great Internet connection for uploading our backups is definitely bright.  And as time moves on and the costs of backup hardware and backup services become even more competitive, we can expect deploying backup measures for our data to only get cheaper and easier.  I like the idea of having a NAS locally and using one or two third party cloud services for my data backups.

(How paranoid should you be about backing up your data?  In my opinion, the answer should be: the more the merrier.)

How To Create, Attach/Associate, And Mount EBS Onto Amazon EC2 (Using Ubuntu 12.04 Linux OS)

Cloud Computing (Photo credit: Wikipedia)

I was playing around with Amazon Web Services, and I thought it would be a good idea to make a how-to video which shows people (who are new to Amazon Web Services) how to create, attach/associate, and mount an EBS (Elastic Block Store) volume on an Amazon EC2 (Elastic Compute Cloud) instance, using Ubuntu 12.04 Linux for the EC2 instance.  For your information, Amazon Web Services is like cloud web hosting and network infrastructure (plus a whole lot more).  Nonetheless, if you have no idea what I just spewed and still think Amazon Web Services sounds interesting, you can definitely find out more about it at Amazon (the website, not the jungle).  Anyhow, the thought seemed awesome to me, and so the end result is the video right after the break.  Enjoy!!!
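For anyone who wants the gist in text form, here is a rough sketch of the mounting part once the volume has been created and attached in the AWS console (the device name /dev/xvdf and the mount point are assumptions; check what your instance actually reports):

    # On the Ubuntu 12.04 instance, confirm the device name the attached volume shows up as
    sudo fdisk -l
    # Put a filesystem on the new, empty volume (this erases it, so only do it once)
    sudo mkfs -t ext4 /dev/xvdf
    # Create a mount point and mount the volume
    sudo mkdir /mnt/ebs
    sudo mount /dev/xvdf /mnt/ebs
    # Optionally add it to /etc/fstab so it comes back after a reboot
    echo '/dev/xvdf /mnt/ebs ext4 defaults,nofail 0 2' | sudo tee -a /etc/fstab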

On A QNAP Server, How Do I Set Up FTP And Connect To It?

The QNAP TS-419P II is what I use to hold backup data and to share data among the machines within my house.  Basically, the QNAP TS-419P II is a network attached storage server.  It has RAID and a host of other capabilities, such as hosting a Time Machine service.  Nonetheless, within this post I’m posting a video which talks about how to set up FTP, and how to connect to it, on the QNAP TS-419P II.  Obviously, the TS-419P II isn’t the only model which uses this particular firmware/software, therefore any other QNAP server model running the same firmware/software will work with the instructions in the video right after the break.  Please enjoy the video!!!

(If you already know how to set up FTP and connect to it on a QNAP server, then I think this video is rather useless for you.)
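Just for reference, once the FTP service is enabled on the QNAP, connecting from a terminal typically looks something like this (the IP address below is a placeholder for your NAS’s address; FTP listens on port 21 by default):

    # Connect with the stock command-line client and log in with a QNAP user account
    ftp 192.168.1.50
    # ...or point a graphical client such as FileZilla at ftp://192.168.1.50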

Installing Memcached To Improve The Performance Of A Server

What is Memcached?  Quoting directly from Memcached.org:

Free & open source, high-performance, distributed memory object caching system, generic in nature, but intended for use in speeding up dynamic web applications by alleviating database load.

Memcached is an in-memory key-value store for small chunks of arbitrary data (strings, objects) from results of database calls, API calls, or page rendering.

Memcached is simple yet powerful. Its simple design promotes quick deployment, ease of development, and solves many problems facing large data caches. Its API is available for most popular languages.

Memcached is FREE and yet has the capability to reduce server load and improve a website’s performance.  So there is a huge benefit to installing Memcached onto a server and then writing the necessary code for your web applications to take advantage of it.  In this post, I’ll briefly explain and provide some instructions for installing Memcached onto your servers.

  1. Assume you are installing Memcached as root, so let’s change into root’s home directory!  Execute the command [cd /root].
  2. In the /root directory, make a new temporary directory, with whatever name you like, to hold the necessary packages and dependencies for installing Memcached.  For example, make a new temporary directory by executing the command [mkdir mynewtemp].
  3. Memcached needs LibEvent to be installed before Memcached itself can be installed successfully.  You need to go to LibEvent’s official website to download its latest package, and then put that package into the /root/mynewtemp directory.  Let’s assume you have done that already, and so we simulate how you would go about installing LibEvent.
    1. tar xzvf libevent-latestversion.tar.gz
    2. cd libevent-latestversion
    3. ./configure; make; make install
  4. It’s now time to download Memcached and install it onto your server.  Let’s execute the command [cd /root/mynewtemp].
  5. You need to download the latest version of Memcached from its official website and put it into your /root/mynewtemp directory.
    1. tar xzvf memcached-latestversion.tar.gz
    2. cd memcached-latestversion
    3. ./configure --with-libevent=/usr/local/; make; make install
  6. Let’s try to start Memcached right now by executing the command [memcached -u nobody] (the -u flag is required because Memcached refuses to run as root).  If you don’t see any error, then you don’t have to worry about anything.  If you see this error: (memcached: error while loading shared libraries: libevent-2.0.so.5: cannot open shared object file: No such file or directory)
    1. We can fix this error!  Perhaps you’re on a 64 bit machine, and LibEvent was installed in a location the system’s dynamic linker doesn’t search by default.  Let’s fix the problem by pointing the system at the LibEvent library with a symbolic link.  Execute the command [ln -s /usr/local/lib/libevent-2.0.so.5 /usr/lib64/libevent-2.0.so.5].
    2. Let’s start Memcached again to see if the error is fixed!  Execute the command [memcached -d -u nobody].  By now the error should be fixed.  If not, please Google for help or ask another expert!
  7. Since I’m familiar with WHM/cPanel, I’ll only cover the WHM/cPanel way of loading a Memcached module into PHP.  Without loading a Memcached module into PHP, you basically cannot use PHP with Memcached.  Let’s begin by logging into WHM and going to Software >> Module Installers.  Pick PHP PECL >> Manage.  Search for memcache and install the available module package.  If there is no error, then you’re almost done.  Does WHM spit out an error while installing the PHP memcache module?  Perhaps it’s a WHM bug, and we need a workaround!  Let’s work around the problem with a symbolic link, as below:
    • ln -s /usr/lib/php/extensions/no-debug-non-zts-20090626/memcache.so /usr/local/lib/php/extensions/no-debug-non-zts-20090626/memcache.so
    • chmod 0755 /usr/local/lib/php/extensions/no-debug-non-zts-20090626/memcache.so
    • restart Apache so PHP picks up the memcache module.
  8. Let’s edit /etc/rc.local so Memcached will start on boot!
    1. vim /etc/rc.local
    2. add this line to the rc.local file, on its own line: memcached -d -u nobody &
    3. save the /etc/rc.local file and exit.
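Once those steps are done, a quick way to confirm the daemon is actually answering, assuming the default port 11211 and that netcat is installed, is:

    # Ask the running Memcached for its stats; a screenful of STAT lines means it is alive
    printf 'stats\r\nquit\r\n' | nc 127.0.0.1 11211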

The command that I mention for starting Memcached and adding to /etc/rc.local is very general.  You can further customize the command to make Memcached work better for your server, since each server has different specifications and customizations, such as the amount of memory (i.e., RAM).  You can always execute the command [man memcached] in a terminal to see more information on how to use Memcached effectively.
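As an example of that kind of customization, the numbers below are just placeholders to adapt to your own hardware:

    # Run as the nobody user, daemonized, capped at 256 MB of RAM,
    # and listening only on localhost port 11211 so it is not exposed to the outside
    memcached -d -u nobody -m 256 -p 11211 -l 127.0.0.1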

Disclaimer:  Please carefully consider the risks of installing any new software onto your production servers.  Also, you must do your own research in addition to following the tips in this post.  If you don’t understand the risks and the technicalities of installing something like Memcached onto your server, and don’t know why you even need something like Memcached, then you have only yourself to blame for crashing your server or servers!  This post is by no means soliciting Memcached; it’s merely a brief tutorial or how-to for working with Memcached!