Terminal – calculate number of lines of code in a directory

We had an interesting question: can we calculate how many lines of code we have written for an entire project? It turns out this isn't the easiest thing to calculate for a web project, but we gave it a go. This is the best we have come up with so far to count the lines of code across all the PHP, CSS, JS, HTML and HTM pages we have written.

( find ./ -name '*.php' -print0 -o -name '*.css' -print0 -o -name '*.js' -print0 -o -name '*.html' -print0 -o -name '*.htm' -print0 | xargs -0 cat ) | wc -l

The answer for our particular project was 1,500,784 lines of code!

If you wanted to do just PHP pages, it's rather easier:

( find ./ -name '*.php' -print0 | xargs -0 cat ) | wc -l
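
If you want the total broken down per file type, the same find-and-cat approach can be wrapped in a loop. This is just a sketch; the function name is our own, and it takes the directory to scan as an optional argument:

```shell
# Sketch: count lines per extension under a directory (defaults to the current one)
count_lines_by_ext() {
    dir=${1:-.}
    for ext in php css js html htm; do
        # find the files of this type, cat them together, and count the lines
        lines=$(find "$dir" -name "*.$ext" -print0 | xargs -0 cat 2>/dev/null | wc -l)
        echo "$ext: $lines"
    done
}
```

Running, say, count_lines_by_ext /var/www/html prints one total per extension, which makes it easier to see where the bulk of the code lives.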

Ubuntu – set the time zone so that it takes BST into account

On Ubuntu, if you have only set the locale using

sudo locale-gen en_GB.UTF-8

then the problem is that the time zone itself is left unconfigured, so BST (British Summer Time) is not taken into account and during BST the server time is out by one hour. (locale-gen only generates locales; it does not set the clock or the time zone.) This is obviously an issue if you have time restrictions on logging in to whatever system is hosted on the server.

The solution is to run:

sudo dpkg-reconfigure tzdata

Follow the on-screen prompts to set the geographic area to Europe and then London. This solves the issue, and the server automatically stays at the correct time when the clocks change.
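
To confirm the change took effect, check what the system now reports (Europe/London shown here as the expected value):

```shell
# After reconfiguring tzdata, the zone file records the chosen zone
cat /etc/timezone   # on Ubuntu this should print Europe/London
date                # the UTC offset shows +0100 during BST and +0000 in winter
date +%Z            # prints the zone abbreviation currently in effect, e.g. BST or GMT
```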

Ubuntu – Copy all files from another server by FTP

We were moving servers from one infrastructure to another – in our case, from Webfusion UK to AWS.

The problem we faced was that there were over 82 GB of files to move from one server to the other. Traditionally we would have downloaded them all locally and then uploaded them again, but what if there was a way to transfer them directly from one server to the other?

We turned on FTP on the source server, and updated the firewall so that only the destination IP could connect.
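
If the firewall on the source server is ufw, a rule along these lines would do it (203.0.113.10 is a placeholder for the destination server's IP):

```shell
# Allow FTP control connections only from the destination server (placeholder IP)
sudo ufw allow from 203.0.113.10 to any port 21 proto tcp
```

Note that passive-mode FTP data connections use additional ports, so you may also need to open the passive port range configured in your FTP server.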

Then on the destination server we can simply type:

wget -r 'ftp://sourceip/folderinftproot/*' --ftp-user=username --ftp-password=password -P /var/www/html/ -q

This copies all folders from the FTP root on the source server into the web root of the new server.

Transferring 82 GB of data between data centres took 14 minutes, compared to the older download-then-upload method we used to use, which took several overnights of downloading locally!

And of course, remember to turn off FTP on the source server once the transfer is complete!
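
Assuming the FTP daemon was vsftpd (substitute whichever daemon you enabled), something like:

```shell
# Stop the FTP daemon and prevent it starting again at boot (assumes vsftpd)
sudo systemctl stop vsftpd
sudo systemctl disable vsftpd
```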

Ubuntu – count all files in a directory recursively

It took a while to figure out how to do this in a single command, but it is so very useful for checking whether all files have copied successfully.

find . -type f | wc -l

How this works:
find . -type f finds all files ( -type f ) in the current ( . ) directory and lists each one on its own line.

The pipe | then sends that list into wc (word count); the -l option tells wc to count only the lines of its input.

Together they count all files in the folder you are in and all of its sub-folders.
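
The same pattern extends to other counts. As a sketch, here is a small helper (the name is our own) that counts files matching a pattern under a given directory:

```shell
# Count files matching a name pattern under a directory (defaults to the current one)
count_files() {
    find "${2:-.}" -type f -name "$1" | wc -l
}
```

For example, count_files '*.php' /var/www/html prints the number of PHP files under the web root.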

Ubuntu – Backup Amazon EC2 to S3 (Pre-requisites)

The pre-requisites on the server are the Amazon S3 tools (s3cmd) and zip:

sudo apt-get install s3cmd

sudo apt-get install zip unzip

You then need to configure s3cmd to use an Access Key ID and Secret Access Key from IAM within your AWS console. It's recommended that you set up a new user with programmatic access only for each server / project, and give the user the AmazonS3FullAccess permission.

Back on the server, run

sudo s3cmd --configure

and enter your access key and secret key.

If, like me, you are using EU-West-1 (Dublin) as your data centre, then type in "eu-west-1" for the Default Region.

Enter a password to encrypt traffic between the EC2 instance and S3 (DO NOT USE YOUR MAIN ACCOUNT PASSWORD, MAKE A NEW ONE)

Path to GPG program – just press Enter

Use HTTPS – Yes

HTTP Proxy – leave blank, just press Enter

Test – Yes
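
Once the configuration test succeeds, a first backup might look something like this (my-backup-bucket is a placeholder for your own bucket name, and /var/www/html for the directory you want to back up):

```shell
# Zip the web root and push the archive to S3 (bucket name is a placeholder)
BACKUP="backup-$(date +%F).zip"
zip -r "$BACKUP" /var/www/html
s3cmd put "$BACKUP" s3://my-backup-bucket/
```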