Sep 07 2013

I’ve been using Gallery (aka Menalto) for almost a decade, starting with Gallery1 and now up to Gallery3 in the current release. In the early days the workflow for adding photos and providing metadata about them wasn’t easy. That got better with Gallery Remote, which I used for many years, but the Gallery Remote workflow was still clumsy. If I was going to stay with Gallery in any form, the process needed to be much easier.

Thankfully alloyphoto made that much easier – his Lightroom Gallery Export Plug-in works extremely well with Gallery3. It works similarly to other export plugins (Flickr, SmugMug, etc) – set up albums and photo properties, then publish to your Gallery3 site. Any content changes made in Lightroom (including image or album removal) can be published back to the Gallery3 site.

gallery3 publish services

When you first start out with the Gallery3 publish service in Lightroom, it will not have any of your existing Gallery3 albums or images. This can be remedied by first running the Import albums process, and then associating images in Lightroom with the ones retrieved from the Gallery3 instance.

lightroom gallery3 plugin import albums

lightroom gallery3 plugin albums

Once the photos are associated with the Lightroom library, new versions of previously created gallery photos can be uploaded again. This is helpful if you have updated image metadata or tags, added or corrected EXIF GPS data, or simply applied new image processing rules. Because I have so many albums and images already on the site (and because I am running a very low-resource server), I changed the associate images options to handle specific large albums one at a time before attempting the entire library. I’ve found that this process is quite time- and resource-consuming, so if you have a lot to do you may want to grab a cup of tea while you wait. I also changed which images get associated: I typically have both JPG and RAW versions of the same files in my library from previous exports, and I want to make sure I’m associating the RAW files instead of any generated JPGs.

lightroom gallery3 plugin associate images

lightroom gallery3 plugin associate images settings

One thing I really appreciate in this plugin is the ability to set the number of albums the plugin accesses at a time and to throttle the response rate. This probably doesn’t apply to everyone, but since I run my site on an Amazon EC2 micro instance with only bursting CPU, it really helps.

gallery3 publish service server options

I enthusiastically recommend this plugin if you use Gallery3 and Lightroom. The plugin is not free, but at $15 it is more than worth it for the sheer amount of time it has saved me by streamlining my workflow. One of the main reasons I get so behind in publishing my photos is the amount of work it used to take – now everything I’m working on is controlled via Lightroom, which takes all of the headache out of maintaining multiple sets of data.

Lightroom Gallery Publish Plugin home page
Gallery Plug-in at Adobe Exchange

Features

  • Supports Export operations
  • Supports Publish Services in Lightroom 3 and above
  • Supports multiple hosting servers and multiple accounts
  • Supports nested album structures
  • Supports custom sorting in published albums
  • Allows you to import the album structure from the hosting environment into your Lightroom catalog
  • Allows you to associate existing photos with photos in your Lightroom catalog
  • Supports photo keywords (tags) and comments
  • The plug-in automatically checks if a new version is available and updates itself with one click

Requirements

  • Lightroom 2 (2.4 – 2.7): Windows XP, Windows Vista, Windows 7, Windows 8, Mac OS X
  • Lightroom 3 (3.3 – 3.6): Windows XP, Windows Vista, Windows 7, Windows 8, Mac OS X
  • Lightroom 4 (4.0 – 4.4): Windows Vista, Windows 7, Windows 8, Mac OS X
  • Lightroom 5 (5.0): Windows Vista, Windows 7, Windows 8, Mac OS X
  • Gallery 3.0.1 (and above) hosting service with the following modules enabled:
    • Comments
    • Exif Data
    • Exif GPS Data (if you want your Gallery to show map locations for geoencoded photos)
    • REST API
    • Tags
    • Lightroom Plugin Helper – available from the plugin author’s site
May 27 2013

I’ve been using a combination of scripts to do local backups on the Amazon EC2 micro instance that serves this website – AutoMySQLBackup and some cron jobs that run rsync for various paths on a rotation. For example:

#!/bin/sh
# Mirror the web content and Gallery2 data into the weekly backup set
rsync -a /var/www/html /mnt/backup/filebackup/weekly
rsync -a /var/lib/g2data /mnt/backup/filebackup/weekly
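
The rotation itself was driven by ordinary crontab entries pointing at scripts like the one above – the schedule and script path here are illustrative, not my exact setup:

# Run the weekly file backup every Sunday at 3:00 AM
0 3 * * 0 /root/backup-weekly.sh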

This is frankly a pretty lazy way to do it. I’m not protected at all if something wipes out the backup destinations or if the EBS drive goes bad, and this method uses up a lot of EBS space because there are multiple sets of the files. I could use the AWS EC2 framework to script out EBS snapshots, but that would just further increase my monthly Amazon bill without giving me very specific point-in-time restores for files. Instead, I thought I should make use of something I’m already paying for: CrashPlan.

As noted in a previous post, I’m using CrashPlan to back up our desktop & laptop computers, as well as my file server (a Synology NAS). I have a CrashPlan+ Family Unlimited plan, which means I can add up to 10 computers and store unlimited backups from them in the CrashPlan cloud (local backups and peer-to-peer CrashPlan backups are always free and don’t require a plan).

Install CrashPlan on an Amazon Linux AMI

CrashPlan offers a number of different clients, including a headless Java client for Linux. This is perfectly suited to the micro instance I’m using in EC2 – the Amazon Linux AMI, which is based on RedHat/CentOS. I installed the headless client using the following commands – note that I’m using the latest version at the time of this post (3.5.3), but you can find the latest download link on their page. I’m also using sudo in my commands; you can remove or ignore that if it isn’t needed in your Linux configuration.

# Install the dependencies the CrashPlan installer needs
sudo yum install grep sed cpio gzip coreutils
wget http://download.crashplan.com/installs/linux/install/CrashPlan/CrashPlan_3.5.3_Linux.tgz
sudo tar -xzf CrashPlan_3.5.3_Linux.tgz
cd CrashPlan-install/
sudo ./install.sh

At this point the installer launches and will ask questions about where the files should go. Their suggestions are reasonable for my configuration and I was able to simply follow the defaults, hitting enter the whole way through the install. Once the install finishes, it will start the CrashPlan service automatically.
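
You can verify the engine is running using the same service name that appears later in this post (assuming the init script supports the status action, which it did in my install):

sudo service crashplan status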

 

Connect to the headless CrashPlan Linux server with a remote client

Now that the service has been started, a remote client needs to connect to it in order to configure backup options. The easiest and most secure way to do that is to use SSH’s ability to tunnel to the server. The following instructions are for Windows, but similar steps can be performed on other operating systems. First, install the CrashPlan client if it is not already installed on your computer, but don’t start the program. Next, locate and edit the ui.properties file in a text editor; on Windows it is typically located at C:\Program Files\CrashPlan\conf\ui.properties. As shown below, remove the # to uncomment the servicePort line and change the port to 4200. When done, save the file and exit.

Edit servicePort for ui.properties
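
In plain text, the one-line change to ui.properties looks like this (the commented-out default below is what shipped with my copy of the client):

# Before – the servicePort line ships commented out:
#servicePort=4243
# After – uncommented and set to the tunnel’s local port:
servicePort=4200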

Next, an SSH tunnel needs to be created so the client can connect to the server. Open PuTTY and create a new connection to your Linux server. In the configuration menu, navigate to Connection, SSH, then click the Tunnels option. On that page, enter “4200” as the Source port, enter “localhost:4243” as the Destination, and click the Add button. Once completed, connect to the server as normal via the configured SSH session and leave the terminal window open.

putty tunnel add

putty tunnel added
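
If you prefer the command line over PuTTY (on Mac, Linux, or with OpenSSH installed on Windows), the same tunnel looks like this – the user and hostname are placeholders for your own server:

# Forward local port 4200 to port 4243 on the server's loopback interface
ssh -L 4200:localhost:4243 ec2-user@your-server.example.com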

At this point the CrashPlan client can be started. It will first ask for CrashPlan credentials, then display the usual interface. Note that the compression and dedupe options can be resource-heavy, which means the first backups will likely consume a lot of CPU, particularly on EC2 micro instances, which have low (bursting) CPU throughput to start with. This CPU usage should decrease over time as the backup deltas get smaller.

crashplan ui linux remote

Configure CrashPlan on Linux to back up /var or other hidden directories (if needed)

Note that by default, several directories and file structures are hidden in the CrashPlan client for Linux. In my case I want to back up files under /var, since that is where my gallery2 files and web content reside. To expose that folder structure to CrashPlan, edit the my.service.xml configuration file and remove the pattern regex="/var/" line under the Linux section. First stop the CrashPlan service and open the config file (assuming you installed using the default file paths):

sudo service crashplan stop
sudo vi /usr/local/crashplan/conf/my.service.xml

Next, look for the following line under the Linux section and remove it (in vi, dd deletes the current line):

<pattern regex="/var/"></pattern>

Save the file (Esc, :wq + enter in vi) and then start the service back up. After connecting again using the client, the /var folder should now be visible.

sudo service crashplan start
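
As an aside, if you’d rather avoid the interactive vi edit, a sed one-liner should remove the same line – run it while the service is stopped; this assumes the default install path, and -i.bak keeps a backup copy of the original file:

sudo sed -i.bak '/<pattern regex="\/var\/">/d' /usr/local/crashplan/conf/my.service.xml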

crashplan ui linux remote file selection

May 19 2013

I’ve got a lot of data. I’ve been shooting RAW photos for a decade (almost 200 GB at this point), have a large (legit!) music collection (74 GB), and have a lot of other files from various projects over the years that I want to hang onto. In total, it’s about 330 GB I want to keep. This used to be a very expensive proposition – stuffing it all in Amazon S3 or using backup services that charge by file size was tough to swallow. That landscape has changed recently. I’ve been using CrashPlan+ Family Unlimited for my home backups for about a year and couldn’t be happier. Unlimited cloud backups for up to 10 computers for $9–14 per month (depending on subscription length) is an amazing deal.

More than being a good deal, I’ve also been very impressed with the CrashPlan software itself. It does the typical things you want to see in backup software – good performance, the ability to set transfer rates and times of day, data deduplication, compression, and encryption. However, its hidden strength is its great flexibility in backup targets while maintaining security. In addition to the unlimited cloud storage, you can also keep local encrypted backups, send encrypted backups to another one of your computers running CrashPlan (peer to peer using the same account), or even send your encrypted backups to a friend (peer to peer with different CrashPlan accounts). These options create a perfect backup scenario for me – I know I have a local copy of files which I can get at quickly (compared with downloading them all), but they are also stored in the cloud to protect against catastrophic loss (e.g. a house fire).

CrashPlan destination options

Their software is available for multiple platforms, and they support a headless Java client on Linux. This means the software can be installed on many different machine types and opens up a lot of options. The most important one for me is Synology support. I’ve been using Synology NAS units as my file servers for many years, as they are very customizable and powerful. With some hard work invested, patters was able to get the headless client running on a wide range of newer Synology devices. Using his packages and instructions, I’ve got all of the files on my Synology file server backing up locally as well as to the cloud:

Desktop/laptops

  • Backup to the local Synology NAS CrashPlan target (Vol2)
  • Backup to the CrashPlan cloud

Synology NAS (Vol1)

  • Backup Vol1 to the local Synology NAS CrashPlan target (Vol2)
  • Backup to the CrashPlan cloud

Some screen shots of what this looks like:

synology package center

crashplan ui synology remote

If you find the post at pcloadletter.co.uk hard to follow, Scott Hanselman has a great guide on his site on how to set up CrashPlan on Synology.

Other than having to restart the Synology CrashPlan package after updates, everything has worked amazingly well together. I was able to customize everything the way I needed to and still feel well protected. If you aren’t using a backup solution you like, I highly recommend giving CrashPlan a try – the following link will save you 20% off their prices: http://www.crashplan.com/ff20
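
For reference, that package restart can be done from the Package Center UI, or over SSH with Synology’s synopkg tool – the package name below matches patters’ package on my unit, but verify it with synopkg list on yours:

# Restart the third-party CrashPlan package
synopkg restart CrashPlan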

Jan 26 2007

I’ve owned a few routers in my day, but have never really been happy with any of them. Coming from a tech background, I always found them dumbed down and complicated in the worst ways. All that has been changing, though. Thanks to Linksys being forced to release the code for their routers under the GPL, people have been able to take that code and write some fabulous firmware.

My favorite so far is the Tomato Firmware. It is lean, fast, and full of fantastic features. The QoS works wonderfully to prioritize specific network traffic. The interface and graphs, powered by Ajax, are clean and informative. The router even has a site survey to see which wireless channels have the most traffic.

If you are comfortable flashing your Linksys router, I highly recommend the Tomato firmware.

Oct 04 2006

Adobe updated Lightroom to beta 4 a week back. I’ve been playing with it a bit so far, and they have definitely made some improvements.

Since purchasing Pixmantec (and the RawShooter code), Adobe has added some new features to the develop module, like Recover and Fill Light. Recover is a highlight recovery tool, and Fill Light lets you lighten dark shadows. Both seem to be nice additions. Photoshop News has a great overview of the Lightroom develop module. Existing Adobe Camera Raw develop settings are still incompatible with Lightroom, though that’s supposed to be fixed “soon”.

In addition to sprucing up the UI, Adobe seems to have added more options for import and export and continued support for XMP files. However, I’d still like to see the program write as many changes as possible to the files themselves, in addition to its internal database. Photoshop News also has a good overview of the Lightroom library module.

If they can address my archival/multiple-machine fears and fix the Camera Raw compatibility, I can definitely see my workflow changing to Lightroom.

Update: RawWorkflow.com has some great free Lightroom tutorial videos here. I’m really starting to dig the way Adobe has broken contrast out into four sections – Highlights, Lights, Darks, and Shadows. Being able to manipulate all of these on the image, histogram, or curve is really cool.

Jul 30 2006

I’m a fan of AutoStitch and use it quite a lot to make panoramic shots. Microsoft Labs has hit on a great idea that takes this to a new level: Photosynth. Of course, the name tells you nothing. The product sorts through a bunch of photos and figures out their relationships to each other. It then places all of those photos in a 3D world, trying to replicate exactly how they were taken. This is sort of backwards from the GPS & direction EXIF information that will be coming to cameras in a few years, but it is still very cool. When it is done, you can fly through the 3D world of photos, enlarging some or taking a virtual tour.

Photosynth

I want.

PS – Sony just put out a GPS tracker. It logs where you were and when, and you can then sync that up with your photos after the fact.

Jul 13 2005

Skippy has a very snazzy WordPress Database Backup plugin. I just loaded it up on my site and it works fine (1&1 hosting). This is really nice for me. Previously I had to log in to my hosting account, go to the MySQL admin, then run a backup on the tables. Now I can schedule backups, and can even get it to email the gzipped SQL to me.
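
What the plugin automates is roughly this manual step – the database name and user below are placeholders for your own:

# Dump the WordPress tables and gzip the result, like the plugin's emailed backup
mysqldump -u wp_user -p wordpress_db | gzip > wp-backup.sql.gz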

So swassy.

Update: This plugin is now part of WordPress v2.

Feb 08 2005

Ah, Firefox has hit the big time. Why? It has a bug that IE doesn’t. It has to do with IDN names mimicking real domains; IE hasn’t implemented IDN yet. The exploit uses international characters to make it look like you are at a different site. In order to get this to work, attackers have to get you to click a link – so I’m sure phishing scams are coming. Try the demo out here. Full info on the exploit here.

Someone has shown how to disable IDN here.