Tuesday, March 26, 2013

Learn to Use rsync for Transferring Files in Linux



rsync is a free software utility for Unix-like systems that synchronizes files and directories from one location to another while minimizing data transfer, using delta encoding when appropriate. An important feature of rsync not found in most similar programs/protocols is that the mirroring takes place with only one transmission in each direction.

So why do I care?

It can perform differential uploads and downloads of files over the network, transferring only the data that is different. This is great for doing backups between servers, or transferring contents between a development server and a production server. (Just what I used it for.) This is really neat because instead of needing to download the files locally and then re-upload them to the server, you just ask the servers to talk to each other.

 

How do I get it?

If your install of Linux doesn't already have it, install it with your distribution's package manager (yum, apt-get, etc.).
For example, if you are using Debian or Ubuntu Linux, type the following command:

# apt-get install rsync


 

Copy file from a local computer to a remote server

Copy the file /www/backup_file.tar.gz to a remote server called epsilonk.com:
$ rsync -v -e ssh /www/backup_file.tar.gz dummyaccount@epsilonk.com:~
Output:
Password:
sent 19009 bytes  received 35 bytes  2093.31 bytes/sec
total size is 19045

Copy file from a remote server to a local computer

Copy the file /home/dummyaccount/some_file.txt from the remote server epsilonk.com to the local computer's /files directory:
$ rsync -v -e ssh dummyaccount@epsilonk.com:~/some_file.txt /files

Synchronize a local directory with a remote directory

$ rsync -r -a -v -e "ssh -l dummyaccount" --delete /local/www epsilonk.com:/www

Synchronize a remote directory with a local directory

$ rsync -r -a -v -e "ssh -l dummyaccount" --delete epsilonk.com:/www/ /local/www

Mirror a directory between my "old" and "new" web server/ftp

You can mirror a directory between an old (old.epsilonk.com) and new web server with the command:
$ rsync -zavrR --delete --links --rsh="ssh -l user" old.epsilonk.com:/home/www /home/www

This assumes that SSH keys are set up for passwordless authentication.


The most common rsync command options

  • --delete : delete files on the receiver that don't exist on the sender
  • -v : verbose output
  • -vv : even more detail than -v
  • -e "ssh options" : specify ssh as the remote shell
  • -a : archive mode (preserves permissions, ownership, and timestamps, and implies -r)
  • -r : recurse into directories
  • -z : compress file data during transfer

As always though, consult the man page.


Thanks to +Brian Downey of thelinuxfix.com for introducing me to rsync. :-) 

    Thursday, March 21, 2013

    Nikon D7100 review....

    I had been crouched next to the mailbox waiting for the mailman to arrive with my shiny new D7100... and at last, it finally showed up.


    I am currently writing up a full review, but in a nutshell: if you are a DX user who doesn't already have a D7000 and are pondering an upgrade, get one. If you already have a D7000 (as I did prior to getting this), it's a more complicated flowchart... I'll explain shortly.

    In short, Nikon hit a home run with this camera.

    Tuesday, March 12, 2013

    Creating a Droplet with Photoshop

    In my last post I covered the basics of optimizing images for the web. Now I will go over how to automate that workflow for contributors or whatnot.

    Raindrops are falling on my head...

    Creating the droplet

    First we must create the actions that we wish to incorporate into our droplet:


    1. Open the Actions panel (Alt+F9 brings it up).
    2. Create a new action set. Name it whatever you want; I called mine "Web Optimization".

      Create new action set
          3. Create a new action. Name it whatever you want; I named mine "optimize and save".
              Once you click Record, it will begin recording your actions so that they can be
              replayed later when you run the action. (You can pause recording at the bottom
              of the Actions panel.)

                4. Proceed to optimize as shown in the last blog post.
                5. Once this is complete, click the Stop button at the bottom of the panel.
                6. Then choose File > Automate > Create Droplet...
              At this point you will encounter the dialog shown below:

         Ensure that you save it somewhere you can find it, using the "Choose" button.
        The destination defaults to "Save and Close", but I prefer "Folder"; this way you can create a naming convention that works well. Additionally, I recommend checking the box for Unix/Linux file name compatibility, since that is likely where the files will be hosted.

        This whole process creates an executable (.exe on Windows) that you can merely drag files onto; it will optimize them and save them according to the naming convention you created. Even a non-technical person can now optimize their own content.

        Additionally this could be used to automate any number of other tasks.


        Web image optimization



        While working on a recent client's site, I noticed that writers and editors were not optimizing their images at all. In one example, an image in PNG format was 2376K! This results in extremely slow loading times. The reality is that when publishing online there has to be some consideration given to the balance between quality and load time. I will show how to optimize images, and then how to automate the workflow for your staff/contributors and get consistent results.
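Before optimizing anything, it helps to find the offenders. Here's a quick shell sketch that flags oversized images under a web root (the /tmp path and the 500 KB threshold are illustrative, not from the post):

```shell
# Flag images larger than 500 KB under a site directory.
# Fake web root with one oversized and one reasonable file for the demo:
mkdir -p /tmp/site/images
head -c 2433024 /dev/zero > /tmp/site/images/huge.png   # ~2376K, like the offender
head -c 40960   /dev/zero > /tmp/site/images/ok.jpg     # 40K is fine
find /tmp/site -type f \( -name '*.png' -o -name '*.jpg' \) -size +500k
# -> /tmp/site/images/huge.png
```

Run the same find against your real document root and you have an instant to-do list.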

        Image formats

        First let's discuss the various image formats that we can use on the web. The primary differentiation is the type of image compression scheme used, either lossy or lossless.


        Lossy compression is an encoding method that throws away some of the data that comprises the image, minimizing the amount of data used to represent it. Typically we display images at a much lower resolution than what comes out of the camera, which allows us to discard some of that information and still end up with something that looks like the original.

        Lossless compression, on the other hand, retains all of the original information found in the source file; it merely compresses it using various algorithms. (The differences between those algorithms would be a whole different post, but see this wiki page for more info.)
        When you decompress a lossless file you get an exact copy of the source file. This is largely unnecessary on the web.

        JPEG

        JPEG is a lossy compression method that can often achieve a 10:1 compression ratio with very little perceptible loss of image quality. It is the de facto standard for much of the web. It does not support transparency, however, and for small graphics (bullet points, buttons, etc.) it may be better to use a different format.

        The file name extensions can be the following: .jpg, .jpeg, .jpe, .jif, .jfif, .jfi

        GIF

        This is a bitmap image format that supports 8 bits per pixel, which allows 256 colors. Back in 1987 when it was invented that was great; now that we have monitors showing the entire Adobe RGB gamut, it doesn't cut it for general graphics use, but for limited uses it is still fine. It is primarily used for banners, small graphics, bullet points, etc., and it "favors flat areas of uniform color with well defined edges". However, for this sort of work I prefer the PNG format, as it features better compression, which gives us better load times.

        The file name extensions can be the following: .gif

        PNG

        This format was created as a reaction to the patent battle over the GIF format. It is better than GIF in every way, unless you need animation; in that case, use MNG.
        It also supports transparency.

        The file name extensions can be the following: .png


        I use Photoshop to optimize my images, but that is understandably not in the budget for everyone. Here is a list of other tools that will help shrink your file sizes and achieve better load times.

         


        Saving images with many colors for the web:

        *Always make sure you are working with a copy and have an original unmodified file saved elsewhere. 

        First let's check the resolution of the image: Image > Image Size...

        We can see that this image is far too large for sensible display on the web. Let's drop it down to something more reasonable:

        Notice that I reduced the resolution as well as the pixel dimensions. 300 DPI is suitable for print; 72 DPI is fine for the web.


        Once we have done this, we will save for the web. Choose File > Save for Web & Devices... (Alt+Shift+Ctrl+S). Start learning shortcuts!



         At the top of this panel, make sure you click "4-Up". This will show four panes that you can manipulate independently to compare the various compression settings against each other.


        The file sizes and estimated load times are displayed in each pane. The load times assume a 2 Mbps connection, which may not be the case for your visitors; I am more concerned with the file size to quality ratio. When you look at each pane you can see the artifacts introduced into the image. For my purposes I can't see a difference between the 73K file (quality 60) and the original at 932K. However, the load time difference will be noticeable!
        The file size that you shoot for will depend entirely on a number of factors. I'll leave the exact size up to you, but now you have the tools to find it.



        In the case of the client's photo, which was a PNG at over 2000K, I reduced the file size to under 90K just by optimizing for the web. That means they were wasting 26 times the bandwidth every time the page was loaded, not to mention the CPU cycles and memory load on the server.

        The tiny blue line is the optimized file. The large red line is the original, unoptimized file from the client.
         
        This may seem like a small example, but extrapolate it out for a site that gets 40K unique page views a month. (Assuming the file is cached by all repeat visitors, which it wasn't... but that is a topic for another blog post.) That is 90 GB of data being served out, versus 3 GB.
        Incredible!
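The arithmetic checks out; sketched in shell with the figures from above (2376 KB vs 90 KB per view, 40,000 views a month):

```shell
# Monthly transfer = per-view size (KB) * views, converted KB -> MB -> GB
echo "$((2376 * 40000 / 1024 / 1024)) GB unoptimized"   # -> 90 GB
echo "$((90 * 40000 / 1024 / 1024)) GB optimized"       # -> 3 GB
```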

        Moving a Drupal 7 site to a new host at TheLinuxFix.com

        The Linux Fix

         I am migrating some sites over to a new host for a client. I highly recommend The Linux Fix; those guys know what they are doing. Great value, great service. This is not a paid advert or anything; I am just happy with what they provide.

        Anyhow on to the migrating....

        Flying south for the winter


        Pro tip: use the MySQL command-line client to import databases when migrating a D7 site, as you often hit the max upload size when using phpMyAdmin (even when the .sql file is compressed).


        I have seen some resources suggesting the use of "source" (even, apparently, the official documentation?). But "source" is really meant to execute SQL queries and display their output; it isn't meant for importing large databases.

        Instead we are going to use the MySQL client via the following commands:
        1. Create your database: mysql> CREATE DATABASE exampleDB;

        2. Change to the directory your .sql file is located in.

        3. Execute: mysql -u root -p exampleDB < dbarchive.mysql
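If the dump is gzip-compressed, you don't even need to unpack it to disk first; you can pipe it straight in. (The mysql line is a sketch assuming a reachable server and the same placeholder names as above; the rest sanity-checks the decompression half of the pipe locally:)

```shell
# Import a gzip-compressed dump without unpacking it first:
#   gunzip -c dbarchive.mysql.gz | mysql -u root -p exampleDB
# The decompression side of that pipe, demonstrated locally:
printf 'CREATE TABLE t (id INT);\n' > /tmp/dbarchive.mysql
gzip -f /tmp/dbarchive.mysql                  # produces /tmp/dbarchive.mysql.gz
gunzip -c /tmp/dbarchive.mysql.gz | head -n 1
# -> CREATE TABLE t (id INT);
```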

        As an alternative, you can pipe an SQL file into the mysql command line like this:
        mysql -u username -p < exampledatabasetable.sql

        However, the SQL file will then need a USE `exampleDB`; statement at the top of the file.


        NOTE:
        The following variables need to be replaced with your own information:
        • -u username specifies the database username.
        • -p designates that you will be entering a password.
        • exampleDB is the name of the database you are trying to import.
        • dbarchive.mysql is the name I gave my backup file; you can name yours however you please.