Tag: Subversion

Converting a Subversion repository to Git

I just used the instructions in this article by John Albin to archive an old svn project on my private machine.

A shell summary (see John’s article for details):

svn log -q | awk -F '|' '/^r/ {sub("^ ", "", $2); sub(" $", "", $2); print $2" = "$2" <"$2">"}' | sort -u > authors-transform.txt
 
vim authors-transform.txt
# Make changes
 
git svn clone [SVN repo URL] --no-metadata -A authors-transform.txt --stdlayout ~/temp
cd ~/temp
git svn show-ignore > .gitignore
git add .gitignore
git commit -m 'Convert svn:ignore properties to .gitignore.'
 
git init --bare ~/new-bare.git
cd ~/new-bare.git
git symbolic-ref HEAD refs/heads/trunk
cd ~/temp
git remote add bare ~/new-bare.git
git config remote.bare.push 'refs/remotes/*:refs/heads/*'
git push bare
cd ~
rm -rf ~/temp
cd ~/new-bare.git
git branch -m trunk master
cd ~/new-bare.git
git for-each-ref --format='%(refname)' refs/heads/tags |
cut -d / -f 4 |
while read ref
do
  git tag "$ref" "refs/heads/tags/$ref";
  git branch -D "tags/$ref";
done

John has also put all this into a number of scripts published on GitHub.

Aihato WordPress maintenance notes

According to Brian Kernighan, “Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?” With that in mind I’m re-exposing myself to the spaghetti that is the Aihato WordPress code base.

My last comment on that project dates from December 2010, although there have been some minor code updates in the meantime (not all of them by me), mostly having to do with the look and feel.

During my Christmas break, I visited the Ytec office to meet with their sysadmin. He made the old Subversion repository available to me again (although, internally, they’ve gone over to Bazaar) and created a new development version of the site (which hadn’t been moved along with the move of the production version to a new server). After some changes to my Makefile, I’m now able to do the whole development/production dance again, and I’ve just tested this by successfully upgrading to WordPress 3.5.1 (all the way from 3.0.5). Yes, this is quite a simple affair, if your setup is 1337:

Upgrading WordPress

I really like it better when the WordPress factory stuff lives in its own directory. I like this better even than using vendor branches to keep track of what’s mine and what’s WordPress’s. Let me show you why (and yes, I could have turned this into a proper script, if I wanted to):

# Get into my working dir
cd www.aihato.nl
 
# Add the new WP version to the externals
svn propget svn:externals . > /tmp/wp-ext
echo 'wp-3.5.1 http://core.svn.wordpress.org/tags/3.5.1/' >> /tmp/wp-ext
svn update
svn propset svn:externals --file /tmp/wp-ext .
 
rm wp # The symbolic link pointing to the old wp-3.0.5
ln -s wp-3.5.1 wp
 
make update-development # to test if the upgrade will fuck up anything major
make deploy-production # because it works
 
svn ci -m "Upgraded WP from wp-3.0.5 to wp-3.5.1."

Rewrite problems after upgrade

Of course, the upgrade had to cause some trouble. This particular trouble was caused by some code in my theme’s functions.php that really belonged in a plugin. I had basically made a rogue call to $wp_rewrite->flush_rules(), which caused the rewrite rules to lack either the ones concerning my custom category base (‘/categorie’) or those concerning my custom rewrite endpoints.

To fix this I created an exceedingly simple plugin with just the following code:

// Register the endpoint on every request.
add_action('init', 'gallery_endpoint_initialize');
 
// Flush the rewrite rules only once, on plugin activation.
register_activation_hook( __FILE__, function() {
  global $wp_rewrite;
 
  add_rewrite_endpoint('gallery', EP_PERMALINK | EP_PAGES);
  $wp_rewrite->flush_rules();
});
 
function gallery_endpoint_initialize() {
  add_rewrite_endpoint('gallery', EP_PERMALINK | EP_PAGES);
  add_filter('query_vars', 'aihato_queryvars');
}
 
// Make WordPress recognize 'gallery' as a query variable.
function aihato_queryvars($qvars) {
  $qvars[] = 'gallery';
  return $qvars;
}

Thoughts on custom post types for fighter profiles

I commented on custom post types before, in the section on fighter profiles in the extensive summary of the whole development process, and in a comment in my development notes.

The custom post type for fighter profiles that I started implementing as a plugin is still quite high on my feature wishlist. The complex dance with page parent, page template and custom fields has proven to be quite difficult for the guys.

At the time, a real showstopper “bug” was that the featured image box wasn’t displayed with the post type. Now, it took me approximately five minutes to find out that I should have simply added the custom post type to the add_theme_support() call in my theme’s functions.php:

add_theme_support('post-thumbnails', array('post', 'page', 'aihato_fighter'));

Another problem “was that clicking the Insert image button replaced the current page with the upload dialog instead of loading it in a modal dialog through AJAX.” This seems to be resolved in the new WordPress, so postponing the custom post type stuff until it had stabilized a bit in the future has proven to be an excellent strategy. Apparently, that future has arrived.

I do wonder: if I implement the custom post type for fighter profiles now, how do I convert my existing page content to the custom post type? Could I just change the post_type field in the database after adding the necessary function calls to register this new type?
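
I think something like the following one-off migration should do it; this is only a sketch, in which the parent page ID (42) is made up and ‘aihato_fighter’ is simply the post type name I would register:

// Hypothetical one-off migration. 42 is a made-up parent page ID and
// 'aihato_fighter' is the (assumed) post type registered by the plugin.
function aihato_migrate_fighter_pages() {
    $fighter_pages = get_pages(array('child_of' => 42));
    foreach ($fighter_pages as $page) {
        // set_post_type() does little more than rewrite the post_type column for this row.
        set_post_type($page->ID, 'aihato_fighter');
    }
    // The new post type gets different permalinks, so the rules need a flush.
    flush_rewrite_rules();
}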

Also, I’m thinking of turning some of the custom fields not just into fancy boxes in the post edit GUI, but into proper taxonomies. This might complicate migration even further, so it’s probably the wisest course to do this thing in phases:

  1. Implement the custom post types with integrated support in the GUI for the custom fields that will remain custom fields under the hood.
  2. Maybe migrate some custom fields (such as fighter discipline) to proper taxonomies; see the sketch right after this list.
  3. Modify the theme to use the new custom post type instead of the old pages.
  4. Integrate the fight record stuff with the aihato-events stuff.
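
To make step 2 a bit more concrete, here is a rough sketch of what that migration could look like; the taxonomy name, post type name and meta key are assumptions on my part:

// Assumed names: 'aihato_discipline' taxonomy, 'aihato_fighter' post type,
// 'discipline' custom field. The taxonomy would be registered on init as usual.
register_taxonomy('aihato_discipline', 'aihato_fighter', array(
    'label' => 'Discipline',
    'hierarchical' => false,
));
 
// One-off migration: copy the old custom field value into the new taxonomy.
$fighters = get_posts(array('post_type' => 'aihato_fighter', 'numberposts' => -1));
foreach ($fighters as $fighter) {
    $discipline = get_post_meta($fighter->ID, 'discipline', true);
    if ($discipline) {
        // Creates the term if it doesn't exist yet and attaches it to the post.
        wp_set_object_terms($fighter->ID, $discipline, 'aihato_discipline');
    }
}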

The process of moving fighter profiles to their own custom post type deserves a dedicated post, but this’ll have to wait until I actually continue this process.

Thoughts on improving the photo gallery

There are some annoying bugs in the photo gallery. I want to start out by upgrading all the relevant JavaScript libraries that I used, to see how far that brings me.

Performance-wise, there’s also some low-hanging fruit. Photos are often uploaded in high resolution. Instead of forbidding this, I want to make it performance-neutral: rather than using the full version of the image in many places (such as the gallery), I’d use a huge-ass thumbnail that serves as the “enlarged” version when you click pictures in the gallery and such.
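
Roughly, I’m thinking along these lines; the size name and dimensions below are placeholders, not what I’ll actually use:

// Register a big-but-not-full-size version next to the existing thumbnails.
add_image_size('aihato-gallery-large', 1024, 768, false);
 
// In the gallery template, link FancyBox to that size instead of the original upload.
// $attachment_id comes from the gallery loop (illustrative).
$large = wp_get_attachment_image_src($attachment_id, 'aihato-gallery-large');
if ($large) {
    echo '<a class="fancybox" href="' . esc_url($large[0]) . '">'
       . wp_get_attachment_image($attachment_id, 'thumbnail')
       . '</a>';
}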

Thoughts on improving the video gallery

With regards to the video gallery, it has turned out that supporting only YouTube is not sufficient. I will have to find a way to base the gallery on all oEmbed links. I have, since the initial deployment, already added embed (not oEmbed) support for fightstartv.com:

wp_embed_register_handler('fightstartv', '|http://www\.fightstartv\.com/videos/([a-zA-Z0-9-_]+)/|', 'embed_handler_fightstartv');
 
function embed_handler_fightstartv($matches, $attr, $url, $rawattr) {
  $fstv_html = file_get_contents($url);
  $fstv_html_matches = array();
  preg_match('|"(/upload/[^"/]+.mp4)".*"image": "([^"]+)"|', $fstv_html, $fstv_html_matches);
  $fstv_mp4_url = 'http://www.fightstartv.com' . $fstv_html_matches[1];
  $fstv_image_url = $fstv_html_matches[2];
 
  $embed = '<embed src="http://www.fightstartv.com/wp-content/plugins/vipers-video-quicktags/resources/jw-flv-player/player.swf" height="300" width="600" bgcolor="0x000000" allowscriptaccess="always" allowfullscreen="true" flashvars="file='.urlencode($fstv_mp4_url).'&skin=http%3A%2F%2Fwww.fightstartv.com%2Fwp-content%2Fplugins%2Fvipers-video-quicktags%2Fresources%2Fjw-flv-player%2Fskins%2Fmodieus.swf&volume=100&bufferlength=3&backcolor=0x000000&duration=-1&frontcolor=0xffffff&lightcolor=0xffffff&screencolor=0x000000&autostart=false&image='.urlencode($fstv_image_url).'&ltas.cc=zerfildkisjwvgk&ltas.height=268&ltas.width=700&ltas.y=0&ltas.visible=true&ltas.x=0&adtvideo.height=268&adtvideo.y=0&adtvideo.position=over&adtvideo.width=700&adtvideo.visible=true&adtvideo.x=0&adtvideo.config=http%3A%2F%2Fwww.fightstartv.com%2Fmy_config.xml& adttext.height=268& adttext.width=700& adttext.y=0& adttext.visible=true& adttext.x=0&adttext.y=0&adttext.x=0&adttext.height=268&adttext.visible=true&adttext.config=http%3A%2F%2Fwww.fightstartv.com%2Fadttext.php&adttext.width=600&plugins=hd-2,viral-1,ltas,adtvideo, http://plugins.longtailvideo.com/5/flow/flow-2.swf, adttext,adttext"/>';
 
  return apply_filters('embed_fightstartv', $embed, $matches, $attr, $url, $rawattr);
}

But these videos show up as just links in the film gallery instead of sporting the fancy FancyBox popup. Luckily, thumbnails do work, though more due to a quirk than by actual design. 🙁 This is going to take some time to clean up, because most of the code concerned is terribly YouTube-centric by design.

Latest video on the homepage

The homepage features a block with the latest video. From my previous notes on this subject: “[… T]o show the latest video on the homepage, I just perform a search for posts which contain a YouTube URL. Then I parse the content a bit, and include the YouTube ID in my own low-res embed code. (The latest video area on the homepage is smaller than the default embed created by WordPress.)”
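
In outline, that dance looks something like this; the query, dimensions and embed markup here are illustrative rather than the actual theme code:

// Find the most recent post that contains a YouTube watch URL.
$query = new WP_Query(array('s' => 'youtube.com/watch', 'posts_per_page' => 1));
if ($query->have_posts()) {
    $post = $query->posts[0];
    // Pull the video id out of the post content.
    if (preg_match('|http://www\.youtube\.com/watch\?v=([-a-zA-Z0-9_]+)|',
                   $post->post_content, $matches)) {
        // Build a low-res embed that fits the homepage block.
        echo '<iframe width="300" height="169" frameborder="0" '
           . 'src="http://www.youtube.com/embed/' . $matches[1] . '"></iframe>';
    }
}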

As of late, no video had been showing there, making me suspect—No I’m not telling you that. It’s actually too embarrassing!

Block with(out) latest video on the home page

This had everything to do with a missing hyphen character in the video id character class in the YouTube URL matching regexp within the scraping function:

preg_match('|http://www.youtube.com/watch\?v=([-a-zA-Z0-9]+)|', // better
preg_match('|http://www.youtube.com/watch\?v=([a-zA-Z0-9]+)|', // worse 

With the hyphen added:

Block with latest video on the home page

Making post thumbnails crop from the top

WordPress hard crops uploaded images to match the post thumbnail size (set with set_post_thumbnail_size()) and the other alternative image sizes (added through add_image_size()). It does this from the center. This gives horrible results in the common case that a person does not happen to be decapitated and holding his or her head in his or her hands:

Aihato fighter profiles with cut-off heads

In the image editor, you can edit the thumbnail separately from the actual image, which allows you to crop the thumbnail manually. But this is cumbersome, and it only lets you choose between applying the changes to all image sizes, to just the thumbnail, or to everything but the thumbnail; and I happen not to be using the thumbnail size for the fighter profiles.

Basically, what I want is for WordPress to hard crop from the top instead of from the middle of the photo. And this would be an ok default for all photos. I truly don’t get why this isn’t configurable by default.

I found someone who suggested changing image_resize_dimensions() in wp-includes/media.php because, at the time he ran into the problem, there were apparently no plugin hooks available. Since WP version 3.4, there is a filter called “image_resize_dimensions”. A quick Googling for that function/filter name reveals that the Codex now even has a ready-to-go recipe to crop thumbnails by keeping the top of the image instead of the default center.
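
The gist of such a recipe (my paraphrase, not the literal Codex code) is to redo core’s crop math but anchor the source rectangle at the top:

add_filter('image_resize_dimensions', 'aihato_crop_from_top', 10, 6);
 
function aihato_crop_from_top($payload, $orig_w, $orig_h, $dest_w, $dest_h, $crop) {
    if (!$crop) {
        return null; // Not a hard crop: let core handle it.
    }
 
    // Same scaling math as core's image_resize_dimensions()...
    $aspect_ratio = $orig_w / $orig_h;
    $new_w = min($dest_w, $orig_w);
    $new_h = min($dest_h, $orig_h);
    if (!$new_w) { $new_w = (int) ($new_h * $aspect_ratio); }
    if (!$new_h) { $new_h = (int) ($new_w / $aspect_ratio); }
    $size_ratio = max($new_w / $orig_w, $new_h / $orig_h);
    $crop_w = round($new_w / $size_ratio);
    $crop_h = round($new_h / $size_ratio);
 
    $s_x = floor(($orig_w - $crop_w) / 2); // ...still centered horizontally,
    $s_y = 0;                              // but anchored at the top instead of the middle.
 
    return array(0, 0, (int) $s_x, (int) $s_y, (int) $new_w, (int) $new_h, (int) $crop_w, (int) $crop_h);
}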

With this code up and running in a plugin called ‘top-crop’, I consider this problem fixed. No more looking at beheaded fighters and no need for my users to muck around with the image editor (which wouldn’t work anyway because I use too many different image sizes).

My trusty old regenerate-thumbnails and the new AJAX Thumbnail Rebuild did their job, after I spent hours chasing ghosts because of a silent file permission problem on the development web server.

Aihato fighter profiles with heads

Relief and anticipation

In retrospect, digging back into the spaghetti isn’t as bad as it could have been, partly because I spent an extraordinary amount of time documenting all the bumps in the road before. After reacquainting myself, I’m actually less intimidated by the prospect of upping the awesomeness of www.aihato.nl than I was when I actually delivered the product, which goes entirely against the saying at the beginning of this article. Either I’m smarter now than I was a couple of years ago, or I exercised sufficient restraint in introducing complexities (although (mostly well-documented) idiosyncrasies abound).

First steps with Subversion’s new merge tracking

A while ago we succeeded in upgrading the Debian server where I keep many of my SVN repositories. (The server was running on “testing”, but we hadn’t dared to upgrade it for a long time, until it came time to make it “stable” again with Lenny.) One of the upgraded packages in Lenny is Subversion, now at 1.5.1, which means that I can finally start using Subversion’s new merge tracking features.

To be able to use merge tracking for my existing repositories, I first have to upgrade these. From the svn 1.5 release notes:

The Subversion 1.5 server works with 1.4 and older repositories, and it will not upgrade such repositories to 1.5 unless specifically requested to via the svnadmin upgrade command. This means that some of the new 1.5 features will not become available simply by upgrading your server: you will also have to upgrade your repositories. (We decided not to auto-upgrade repositories because we didn’t want 1.5 to silently make repositories unusable by 1.4 — that step should be a conscious decision on the part of the repository admin.)

The only other machine I’m using these repos from is my laptop, which, running Gentoo, is already at Subversion 1.6.2. So, how do I upgrade?

First, I made a backup of all our SVN repos using a script that Halfgaar made to run from cron. After running that script, I initialized the upgrade procedure:

cd /var/svn
svnadmin upgrade blog.bigsmoke.us
svn-populate-node-origins-index blog.bigsmoke.us

The last command is optional, but it may speed up the next few operations. The release notes again:

After running svnadmin upgrade, you may wish to also run the svn-populate-node-origins-index program on the repository. Subversion 1.5 maintains a node-origins index for each repository, and builds the index lazily as the information is needed. But for old repositories with lots of revisions, it’s better to create the index in one step, using the aforementioned tool, than to have live queries be slower until the index has built itself. See issue #3024 for details.

With my data safe and the repo upgraded, it’s time to actually test the new merge tracking features:

cd ~/blog.bigsmoke.us # Enter working copy (on the same server)
 
# I'm now in the lichtgekruid branch.
svn switch file:///var/svn/blog.bigsmoke.us/trunk
svn add wp-content/plugins/openid
svn ci -m "I was convinced this was a change which didn't belong to my feature branch" wp-content/plugins/openid
 
svn switch file:///var/svn/blog.bigsmoke.us/branches/lichtgekruid
svn merge file:///var/svn/blog.bigsmoke.us/trunk
# The last command was a real kicker, not having to find and specify the correct range of revision numbers.
 
svn ci -m "Merged changes from trunk to 'lichtgekruid' branch."
svn switch file:///var/svn/blog.bigsmoke.us/trunk
svn merge file:///var/svn/blog.bigsmoke.us/branches/lichtgekruid
# (Another spontaneous orgasm.)
 
svn ci -m "Merged 'lichtgekruid' branch back into trunk."

Ok, I know I could have simply changed all these projects to use Git, and I like Git. I love its simplicity and flexibility. But, call me old-fashioned; I have an SVN fetish, and so far I find its merge tracking quite convincing.

Besides all the gotchas and pointers in the 1.5 release notes, there’s a lot more info about merge tracking in the SVN redbook.

Replacing the full contents of a Subversion working (sub)dir

The annoyances that I suffered earlier today during the upgrade of a WordPress plugin made me turn to my favorite text-editor to create a simple script, svn-replace-dir:

#!/bin/bash
 
usage() {
    cat <<"EOF"
$0 [--dry-run] <svn_dir> <replacement_dir>
 
This script replaces the contents of <svn_dir> with the contents of <replacement_dir>,
where <replacement_dir> is not an svn directory.
 
Copyleft 2010, Rowan Rodrik van der Molen <rowan@bigsmoke.us>
EOF
}
 
fatal_error() {
    message=$1
 
    echo -e "\e[1;31m$message\e[0m"
    exit 1
}
 
usage_error() {
    error="Wrong usage."
 
    if [ -n "$1" ]; then
        error=$1
    fi
 
    echo -e "\e[1;31m$error\e[0m"
    exit 1
}
 
run_command() {
    echo -e "\e[1;34m$1\e[0m"
 
    [ $dry_run == 1 ] || $1
}
 
dry_run=0
if [ "$1" == '--dry-run' ]; then
    dry_run=1
    shift
fi
 
[ $# == 2 ] || usage_error "Wrong number of arguments."
 
svn_dir=`echo "$1"|sed -e 's#/$##'`
replacement_dir=`echo "$2"|sed -e 's#/$##'`
begin_path=$PWD
 
#if [ "${svn_dir:0:1}" != "/" ]; then svn_dir="$PWD/$svn_dir"; fi
#if [ "${replacement_dir:0:1}" != "/" ]; then replacement_dir="$PWD/$replacement_dir"; fi
 
[ -d "$svn_dir" ] || usage_error "$svn_dir is not a directory."
[ -d "$replacement_dir" ] || usage_error "$replacement_dir is not a directory."
 
 
# Create all subdirectories in $svn_dir that do not yet exist
cd $replacement_dir
find . -mindepth 1 -type d -print | sed -e 's#^./##' | while read d; do
    cd $begin_path/$svn_dir
    # Doesn't the destination directory already exist?
    if [ ! -d "$d" ]; then
        run_command "svn mkdir '$d'"
    fi
done
 
# Copy all files from $replacement_dir to $svn_dir
cd $begin_path/$replacement_dir
find . -type f -print | sed -e 's#^./##' | while read f; do
    cd $begin_path
    run_command "cp '$replacement_dir/$f' '$svn_dir/$f'" # FIXME: Quoting problem
done
 
# Remove all files that do no longer exist in $replacement dir
cd $begin_path/$svn_dir
find . -type f -print | grep -v '.svn' | while read f; do
    if [ ! -f "$begin_path/$replacement_dir/$f" ]; then
        run_command "svn rm '$f'"
    fi
done
 
# Remove all subdirs that do no longer exist in $replacement dir
cd $begin_path/$svn_dir
find . -mindepth 1 -type d -print | grep -v '.svn' | while read d; do
    if [ ! -d "$begin_path/$replacement_dir/$d" ]; then
        run_command "svn rm '$d'"
    fi
done
 
exit 0

Using the script is simple:

svn-replace-dir simple-tags new-simple-tags|less -R

It replaces all the contents of the first directory (simple-tags in the example) with those of the second directory and it deletes everything that is no longer present in the second dir. In the process, it does all the necessary calls to svn mkdir, svn rm and (in the next version) svn add.

diff tells me that the script has done its work correctly:

diff -x .svn -ruN simple-tags new-simple-tags
# Emptiness is bliss :-) 

This is another one of these occasions when Git would have made life so much easier. Luckily, at least there’s GitHub to host this script as a Gist. Check there if you want to fetch the newest version of this script.

Tracking WordPress in a Subversion vendor branch

Two months prior to writing a script to upgrade MediaWiki installations using Subversion vendor branches, I wrote something similar for WordPress. It’s a little bit more limited and should really incorporate some of the improvements made for the MediaWiki version, but it has worked fine so far:

cat upgrade-wordpress.sh 
#!/bin/bash
 
svn_repo_url=file:///var/svn/blog.omega-research.org/vendor/wordpress/
last_version=$1
new_version=$2
 
tmp_dir=`mktemp -d`
cd $tmp_dir
for version in $*; do
    echo "Downloading and extracting WordPress version $version..."
 
    if [ -z `svn ls $svn_repo_url|grep $version` ]; then
        branch=`echo $version |sed -e 's/\.[0-9]\+$//'`
        archive_file="wordpress-$version.tar.gz"
        md5_file="wordpress-$version.md5"
 
        wget "http://wordpress.org/$md5_file"
        wget "http://wordpress.org/$archive_file"
 
        if [ `md5sum $archive_file |cut -f 1 -d ' '` != `cat $md5_file` ]; then
            echo "MD5 sum did not match!" >&2
            exit 1
        fi
 
        tar --extract --ungzip --transform "s/^wordpress/$version/" --file $archive_file
        svn_load_dirs.pl $svn_repo_url -t $version current $version
    fi
done
cd -
svn merge "$svn_repo_url$last_version" "$svn_repo_url$new_version" .

I’m actually planning to make both scripts a little bit more generic (in the sense that svn_repo_url becomes an external parameter) and to track future changes to them using GitHub’s Gist. (How ironic is that, tracking a script for Subversion using Git?)

Tracking MediaWiki in a Subversion vendor branch

Vendor branches are the proper™ way of merging upstream changes into your web application installations. In Subversion, managing vendor branches isn’t as easy as it is in Git. Still, vendor branches make it much easier to track upstream.

Even before I first deployed the Omega Research Wiki, I was already using svn to track changes to my MediaWiki installation. However, for upgrading from one upstream release to the next, I used diff and patch. This isn’t the most reliable of methods, as is exemplified by diffing a fresh MediaWiki download against the actual files in my repo (which are supposed to belong to the same version).

What’s basically a shortcoming of svn is that I can’t just say:

svn merge http://svn.wikimedia.org/svnroot/mediawiki/tags/REL1_15_0/phase3/ \
http://svn.wikimedia.org/svnroot/mediawiki/tags/REL1_15_1/phase3/ .

This would have been incredibly helpful, because now I’m keeping a vendor branch not because of local modifications to upstream, but just to be able to merge cleanly.

In Subversion, maintaining a vendor branch by hand is quite some work, because you need to do a checkout first before you can import each version. (My working copy is normally a checkout of /trunk, not of /vendor/mediawiki.) Luckily, Subversion is distributed with a handy Perl script, svn_load_dirs.pl, which can do most of the heavy lifting.

Still, I didn’t feel like having to do too many manual steps, such as typing in the painfully long URLs for merging, so I decided to wrap the whole process into a nice little Bash script:

cat upgrade-mediawiki.sh
#!/bin/bash
 
svn_repo_url=file:///var/svn/wiki.omega-research.org/vendor/mediawiki/
 
merge=1
while [ $# -gt 2 ]; do # Process extra options
    case "$1" in
        --no-merge)
            merge=0
            ;;
    esac
    shift
done
 
if [ $# -ne 2 ]; then
    echo "Usage: $0 [--no-merge] version version"
    exit 1
fi
 
last_version=$1
new_version=$2
 
tmp_dir=`mktemp -d`
cd $tmp_dir
for version in $*; do
    echo "Downloading and extracting MediaWiki version $version..."
 
    if [ -z `svn ls $svn_repo_url|grep $version` ]; then
        branch=`echo $version |sed -e 's/\.[0-9]\+$//'`
        download_file="mediawiki-$version.tar.gz"
        download_url="http://download.wikimedia.org/mediawiki/$branch/$download_file"
 
        wget $download_url || { echo "Downloading $download_file failed" >&2; exit 1; }
        tar --extract --ungzip --transform 's/^mediawiki-//' --file $download_file
        svn_load_dirs.pl $svn_repo_url -t $version current $version
    fi
done
 
cd -
if [ $merge == '1' ]; then
    svn merge "$svn_repo_url$last_version" "$svn_repo_url$new_version" .
fi
exit 0

The script only downloads and imports each specified version if that version doesn’t already exist in /vendor/mediawiki/. Also, because it has a --no-merge option, you can download all the old versions of MediaWiki that you’ve ever used, so that you can go back and compare old versions of your installation with the factory version. Of course, this is only useful if you were already tracking your installation in svn at the time, and even then not really. 😉

Anyway, the important thing is that you can use the script to download the version you’re running now and the version you want to upgrade to. I wanted to make a big jump, from 1.11.1 to 1.15.0 (I had put off the upgrade for a long time, because I first wanted to learn more about vendor branches), so those are the two versions I fed to the script.

Of course, as always, you need to check if the patch went well. My own results were a vivid demonstration of the unreliability of my previous method:

svn st|grep '^C'
C      languages/messages/MessagesKrj.php
C      languages/messages/MessagesWar.php
C      languages/messages/MessagesSe.php
C      languages/messages/MessagesFrc.php

Luckily, these were all files I was sure I hadn’t modified:

for i in `svn st|grep '^C'|sed -e 's/^C //'`; do mv $i.merge-right.r32 $i; svn resolved $i; done
Resolved conflicted state of 'languages/messages/MessagesKrj.php'
Resolved conflicted state of 'languages/messages/MessagesWar.php'
Resolved conflicted state of 'languages/messages/MessagesSe.php'
Resolved conflicted state of 'languages/messages/MessagesFrc.php'

Apparently, a previous upgrade hadn’t turned my MediaWiki installation exactly into 1.11.1.

Styling XML SVN logs with CSS

My friend, Wiebe, keeps his website in Subversion. (Always keep your project files in version management or you’ll be sorry.) He used to manually track the date with the last significant change in each file (because who cares about typos, right?). But, of course, he kept forgetting to update this when he actually made such changes. So, he decided that he wanted to publish the full SVN log for each page.

The raw SVN logs are a bit raw, reason enough to try to turn them into something prettier. Luckily, there’s no need to parse the log files, because svn log has a command-line option, --xml. This option causes the log to be printed in a simple XML format:

<?xml version="1.0"?>
<log>
<logentry revision="345">
<author>halfgaar</author>
<date>2009-05-25T10:50:07.560914Z</date>
<msg>Added very useful note to index about an awkward sentence.
</msg>
</logentry>
<!-- Snipped the rest of the logentry elements -->
</log>

Now we can use any number of ready-to-use XML tools to process this log, but I figured that, maybe, a very simple solution could work: CSS. Cascading Style Sheets can be used for more than just styling HTML. One of the few differences is that, with non-HTML XML, there are no defaults for the CSS properties (and aren’t we always trying to discover and override the various browser-specific CSS defaults anyway?)

First, we want to add a <?xml-stylesheet?> processing instruction to the log file:

svn log --xml example_file.xhtml | sed -e '/<\?xml / a<?xml-stylesheet type="text/css" media="screen" href="/css/svn-log.css"?>'

The XML file now references the CSS file that we’re going to make:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/css" media="screen" href="svn-log.css"?>
<!-- snip -->

svn-log.css:

log {
  display: block;
  background-image: url('http://svn.collab.net/repos/svn/trunk/notes/logo/256-colour/subversion_logo-200x173.png');
  background-repeat: no-repeat;
  background-position: top right;
  padding: 2em 204px 2em 5ex;
}
 
logentry {
  display: block;
  margin-bottom: 1em;
  border-bottom: 1px solid #999;
}
 
author {
  display: none;
}
 
date {
  display: block;
  width: 10ex;
  /* If Firefox would support font families even when you force a font,
     I could use 'overflow: hidden' to hide everything except the date part of <date>. */
  font-family: monospace;
  font-size: 110%;
  color: #7488a7;
}
 
msg {
  display: block;
  white-space: pre-wrap;
}

We now have a nicely formatted log file. Other things you could do:

  • Add styles for printing (in a separate stylesheet or by using @media blocks).
  • Display the revision author instead of hiding it.

Of course, you could do all this and much more with XSLT, but that’s just all too obvious. 🙂

If you want to see the stylesheet in action, take a look at Wiebe’s website and look for a View revision log of this page link in the footer of any of his pages.

XML SVN log styled with CSS

Silent change in URLs of Ruby on Rails svn repository

I don’t know why, but everything associated with Ruby on Rails seems to change all the time, without notice, or (in the case of URLs) without redirect.

We used http://svn.rubyonrails.org/rails for our svn:external deployment of rails 1.2.6. But that suddenly stopped working. The Ruby on Rails weblog doesn’t contain any information on it (that I can find), going to svn.rubyonrails.org with my web browser yielded just the front page of the weblog and Googling didn’t help much. That is, until I accidentally found http://dev.rubyonrails.org/svn/rails.

So, I changed the svn:externals from “rails http://svn.rubyonrails.org/rails/tags/rel_1-2-6/” to “rails http://dev.rubyonrails.org/svn/rails/tags/rel_1-2-6/” and now it works again.

Untangling WordPress’ core files from your local customizations

Since version 2.6, WordPress can be installed in its own directory, separate from your customizations and everything else. Needless to say, this makes upgrading a whole lot easier.

If, in the pre-2.6 days, you wanted to fetch your WordPress updates through SVN, the docs would advise you to do an svn checkout from the official WP SVN repo in your working dir and then do an svn update whenever you wanted to update WordPress. This works because svn update leaves local modifications alone. However, it also means that you’ll be unable to commit your local changes (configuration, themes, plugins, etc.) if you choose this route.

I used my own Subversion repository for my blogs and thus had to upgrade the old-fashioned way with each release (although I prefer diff/patch over rm/cp). (I could have used vendor branches, but, clearly, I hadn’t thought about that at the time.) This was pretty much a royal pain in the ass, so I was glad when I could move WordPress into a separate directory with its 2.6 release.

This process consisted of removing everything except wp-content/, wp-config.php and .htaccess. (I also kept robots.txt, favicon.ico and some other personal files.) Then, I added the current WordPress release as an svn:external.

svn propset svn:externals 'wp-factory http://svn.automattic.com/wordpress/tags/2.6.1' .

.htaccess changes

In the WordPress codex, it is then suggested to copy index.php to the root dir and to change it to require wp-factory/wp-blog-header.php instead of ./wp-blog-header.php. I preferred adding some mod_rewrite voodoo of my own to .htaccess, so I did:

<IfModule mod_rewrite.c>
# This way I don't need directory indices
RewriteRule ^$ /wp-factory/index.php [L]
 
# This way WordPress can manage its own block without doing any harm
RewriteRule ^index.php /wp-factory/index.php [L]
 
# Allow easier access to /wp-factory/wp-admin/
RewriteRule ^wp-admin http://%{HTTP_HOST}/wp-factory/wp-admin/ [L,R=301]
</IfModule>

The middle rule performs most of the magic. It redirects all the requests to /index.php to the factory default index.php. This means that I can let WordPress pretend that index.php does live in the root, so I don’t have to modify the rewrite rules that are managed by WordPress itself:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress 

wp-config.php changes

In Giving WordPress Its Own Directory in the WordPress Codex, it is suggested to change the “siteurl” and “home” options through the administration panel. In my case they would have to be changed to “http://blog.bigsmoke.us/wp-factory/” and “http://blog.bigsmoke.us”. I couldn’t do this because I override these with WP_SITEURL and WP_HOME in my wp-config.php. This is because I configured WordPress to support a development environment separate from the live production environment.

Ignoring the customizations for my development environment, these are the relevant settings in wp-config.php:

define('WP_HOME', 'http://blog.bigsmoke.us');
define('WP_SITEURL', WP_HOME . '/wp-factory');
 
define('WP_CONTENT_DIR', dirname(__FILE__) . '/wp-content');
define('WP_CONTENT_URL', WP_HOME . '/wp-content');

BTW: I really like how WordPress disables the form controls for siteurl and home when you override these settings in wp-config.php. Kudos for that, devs!

Next time: git

In the end, this is all quite a bit of pain to compensate for what is essentially a version management problem. That’s why, on my newer projects, I’m now using git which makes forking and tracking an upstream repo absolutely trivial. 🙂


Using diff and patch to upgrade web application installations

Update (July 30, 2008): I added information about making sure that the patch was successful.

When you install a big-ass web application such as WordPress or MediaWiki, you usually end up with a bunch of configuration files and customizations (skins/themes, extensions/plugins, uploads, etc.). This makes upgrading the files that come with the application a bit tricky. There’s a simple solution, however, which works regardless of whether you use a revision control system or not.

First of all, you do, of course, always need a revision control system. I personally recommend Git or Subversion, which are both excellent tools. But, that’s not what this post is about. I’m going to use two simple tools which are uniformly available on all (Unixy) platforms: diff and patch.

The procedure is simple:

  1. Download the version of the application which you’re currently running. For our example, we pretend that this version is extracted into the directory webapp-1.4.3.

  2. Then, download the version to which you’d like to upgrade. (We’re assuming that this version is extracted into the webapp-1.6.2 directory.)

  3. Compare the two versions to create a patch file:

    $ diff --unified --recursive --new-file webapp-1.4.3 webapp-1.6.2 > webapp-upgrade.diff
    
  4. Apply the patch to the installation of said web app:

    $ cd webapp-installed
    $ patch --strip=1 --remove-empty-files < ../webapp-upgrade.diff || echo "Some failures!"
    

Check if everything was patched perfectly

Now, if the patch command returned a non-zero status (printing Some failures! in the above example), it's time to check which hunks of which files failed. Get a summary by searching for all files with an .rej or an .orig suffix:

$ find . -name "*.rej"

After manually applying any failed hunks, what's left is to compare your directory containing the patched application to the directory with the contents of the new application archive which you've used to create the patch:

$ cd ..
$ diff --unified --recursive --new-file webapp-1.6.2 webapp-installed

Version management

Your upgrade is done. Now, if you're using a revision control system, you just need to add the new files and remove the deleted ones. In Subversion, I do this quickly using the following command sequence:

$ svn status|sed -e '/^\?/!d; s/^\?//'|xargs svn add
$ svn status|sed -e '/^\!/!d; s/^\!//'|xargs svn del

If you'd been using Git, you could do all of this a little more elegantly, but my Git skills are not advanced enough to go around giving others advice. Also, it's nice to learn a generic method before learning more specific tools.
