Google DNS Benchmarked

Last week, Google made a pair of public DNS servers available to everyone. The claim was that Google’s DNS was much faster than any DNS servers available to date. Andrew Brampton ran a series of tests to determine if this claim was indeed true. He tested Google’s DNS against the OpenDNS, Sky/Easynet and Plus.net domain name servers. His findings were indeed interesting.

He found that OpenDNS is still faster than Google’s DNS servers, but Google’s DNS is faster than Sky/Easynet and Plus.net. In the meantime, I’ve already set my DNS servers to Google’s. I will probably leave them there since we are only talking milliseconds here. Like Andrew, I expect Google’s servers to be optimized and tuned in the near future. It will only get better.

Install Subversion Repository on Ubuntu Desktop

This is a tutorial on how to install Subversion on your desktop. Subversion is an open-source revision control system. A repository is usually installed on servers so developers and programmers can have easy access to code. Subversion uses a check-in and check-out process for submitting changes to the repository. The repository can also be installed on desktop systems. Access is gained through several means: direct file access, http, https, svn and svn+ssh. See the chart below.

Installing Subversion will install both the Subversion administration tools and the client. On Ubuntu or Debian-based systems, you can install Subversion with the following command. By the way, I added the Apache Subversion WebDAV module so both can be installed with just a single command.

Install Subversion

sudo apt-get install subversion libapache2-svn

Restart the Apache Web Server

sudo /etc/init.d/apache2 restart

Create a Subversion Repository

svnadmin create /home/yourname/repository/

I’m placing the repository in my home directory, but you can place it anywhere on your system. You may need to use sudo if you create it outside of your home directory. Remember the repository location; we will use it a few times below to configure the Apache Subversion WebDAV module, etc.

Import your Repository

svn import /path/to/import/directory file:///home/yourname/repository

If you have a project ready to import, now is a good time to do it. If you are just starting out, you can initialize the repository here.
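Note that the imported directory itself is not placed under version control; to start working you check out a fresh working copy from the repository. A quick sketch, with example paths:

```shell
# Check out a working copy from the newly created repository
# (repository path and target directory name are examples)
svn checkout file:///home/yourname/repository myproject
cd myproject
```

From here, edits inside myproject can be committed back with svn commit.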

Access to Subversion

file://      Direct access on local disk
http://      WebDAV protocol over http
https://     WebDAV protocol over https (secure)
svn://       Subversion protocol
svn+ssh://   Subversion protocol over an SSH tunnel

Configure WebDav protocol

sudo vi /etc/apache2/apache2.conf

Add

<Location /svn>
DAV svn
SVNPath /home/yourname/repository
AuthType Basic
AuthName "Repository"
AuthUserFile /etc/subversion/passwd
<LimitExcept GET PROPFIND OPTIONS REPORT>
Require valid-user
</LimitExcept>
</Location>

Change Ownership to HTTP-User

sudo chown -R www-data:www-data /home/yourname/repository

Password Protect the Repository

sudo htpasswd -c /etc/subversion/passwd username

You will be asked to provide a password. Enter the password twice.

Restart the Apache Server

sudo /etc/init.d/apache2 restart

It’s probably a good idea to restart the Apache server one more time.

Browser Access
Next, open up your browser and access http://localhost/svn from the address bar. You will be asked for the username and password. You should see the repository and any content or directory underneath it. That’s it. Happy coding.

WordPress and Password Protected Directories

I think I just solved an issue with WordPress permalinks and password-protected directories that use Apache’s .htaccess. Here’s the problem in detail. I have WordPress installed at the root of my domain. Under that domain, I have a directory that I want password protected using .htaccess. It’s just a directory containing a few PHP scripts. Every time I try to access the password-protected directory, I get a 404 page not found error. WordPress gets confused, thinking the directory is a post or a page; since it’s not, it generates a 404 error instead.

The workaround for this is to place a couple of error code rules at the top of the .htaccess to pre-empt the WordPress rules. There are two scenarios. If a 401 status occurs, an authentication challenge in this case, Apache sends the user to the error document, which is just a blank HTML file, and the WordPress permalink rules never get processed. If a 403 status occurs, a forbidden situation in this case, Apache sends the user to that error document as well.

Here is the working .htaccess file. You will see the two error code rules at the top of the file. Underneath, you will see the standard WordPress permalinks rules.

ErrorDocument 401 ./blank.html
ErrorDocument 403 ./blank.html
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

Simple fix. Thanks to aiso.net.
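For completeness, the protected subdirectory carries its own .htaccess with the usual Basic auth directives. This is a sketch; the password file path and realm name here are assumptions, not from the original setup:

```apache
# .htaccess inside the password-protected directory
AuthType Basic
AuthName "Protected Scripts"
AuthUserFile /home/yourname/.htpasswd
Require valid-user
```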

Concatenate Video Files Using FFMPEG

This article will show you how to concatenate two mpg files into one big file using the cat and ffmpeg commands. Here was the dilemma. I had two video files that were recorded separately. The recording was accidentally stopped for a brief second or two, so the camera created two separate video files. I wanted to stitch and merge the two files into one big file.

To accomplish the feat, I will be using the cat and ffmpeg commands, which are common to most Linux systems. If you use the cat command alone, like in the example below, it will not work. You’ll have timestamp issues, and the player will not go past the end of the first video. It will never play the second video, or the third if you have more.

cat alone is not enough:

cat abc.mpg def.mpg > ghi.mpg

What you need to do is use cat and pipe the output through ffmpeg.

The command:

cat abc.mpg def.mpg | ffmpeg -f mpeg -i - -vcodec copy -acodec copy ghi.mpg

You now have a file named ghi.mpg, which is the combined output of abc.mpg and def.mpg. You can concatenate three or more files if you wish. ffmpeg is a powerful video converter capable of converting videos to many formats such as avi, wmv and mov. Earlier in the year, I wrote an article on how to install ffmpeg and another explaining its options.

Diet Chromium OS

Google Chrome OS requires a 4GB USB stick. If you need a smaller version, check out Diet Chromium OS, which needs only 1GB. From the Yahoo article:

Many of the builds thus far have been targeted at specific hardware configurations, such as one made available by a Dell employee designed for the Dell‘s Mini 10V netbook.

The Diet Chromium build has a smaller footprint, but promises wider hardware support. Diet Chromium comes courtesy of a UK student and programmer known as Hexxeh. Hexxeh explains that he constructed the build in order to “fill a gap that hadn’t been filled.” His Web site offers instructions on how to install the lighter Chromium build on Windows, Mac and Linux machines.

Ian Paul will show you how to install a more standard build of Chrome OS right now.

Using Google Public DNS

Google just announced a new public DNS today, aimed at making browsing an even faster experience. DNS, or domain name servers, are servers that translate domain names into IP addresses that computers can understand. Having a faster DNS can definitely make surfing the web a faster experience. In the past, I’ve used OpenDNS as an alternative to my ISP’s nameservers. Now Google has its own.

To use Google DNS, point your system at the following nameservers:

nameserver 8.8.8.8
nameserver 8.8.4.4

In Windows, you can open up your network interface IP properties and enter the Google nameservers. In Linux, you can place the Google nameservers in resolv.conf. In your router, you can replace your ISP nameservers with Google’s nameservers. Complete instructions on how to use Google’s nameservers are available from Google’s website.
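On a Linux box, a quick way to try it out is to rewrite resolv.conf directly. This is a sketch rather than a permanent setup; some distributions regenerate this file automatically, so the change may not survive a reboot:

```shell
# Back up the current resolver configuration, then switch to Google DNS
sudo cp /etc/resolv.conf /etc/resolv.conf.bak
printf 'nameserver 8.8.8.8\nnameserver 8.8.4.4\n' | sudo tee /etc/resolv.conf
```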

Google DNS was no surprise to me. It makes perfect sense. What’s next? Web hosting.

Does Ubuntu Really Need A Longer Release Schedule?

LinuxPlanet.com brought up a good debate about whether Ubuntu needs a longer release schedule. Ubuntu releases a new version every six months and an LTS, or Long Term Support, version every two years. Here’s my take on it. Like a little kid in a toy store, I always get excited when a new version of Ubuntu comes out. I’ve gone so far as upgrading my desktop to a Release Candidate prior to the official announcement.

Along with the excitement comes disappointment. Twice, I’ve had to revert to a previous version due to bugs and problems with the latest Ubuntu release. The bugs are not always fixed in the first few days; sometimes it takes months. At the moment, I’m still running 9.04, so I’ll wait a few months before moving to 9.10. Dell and other hardware vendors do the same. Dell currently sells Ubuntu 9.04 systems on their website.

So, what do I recommend? I recommend Ubuntu switch to a once-a-year release schedule. I know it seems like a long time between releases, but six months goes by really fast. If the development team takes the time to work out bugs and do more testing, then Ubuntu can really focus on delivering a great product with every release. Yearly is not as taxing as every six months. A yearly schedule would also work hand in hand with the LTS cycle: every other release would be an LTS, instead of the current three non-LTS releases between each LTS.