Smokes your problems, coughs fresh air.

Author: Rowan Rodrik

Rowan is mainly a writer. This blog is a dumping ground for miscellaneous stuff that he just needs to get out of his head. He is far more passionate about the subjects he writes about on Sapiens Habitat: the connections between humans and each other, and between humans and nature, including their own human nature.

I am naked and feeling very vulnerable

There are many clever ways to tell you this. There are many ways to deceive. But in the end I feel that more often than not the deception merely serves to reinforce that image of a very vulnerable naked man.

Thus: “I am naked and feeling very vulnerable.”

MediaWiki ConfirmEdit/QuestyCaptcha extension

Since I moved my LDAP wiki over from DokuWiki to MediaWiki, I’ve been buried by a daily torrent of spam. Just like with my tropical timber investments wiki, the ReCaptcha extension (with pretty intrusive settings) doesn’t seem to do much to stop this shitstream.

How do the spammers do this? Do they primarily trick visitors of other websites into solving these captchas for them, or do they employ spam sweatshops in third-world countries? Fuck them! I’m trying something new.

I’ve upgraded to the ConfirmEdit extension. (ReCaptcha has also moved into this extension.) This allows me to try different captcha types. The one I was most interested in is QuestyCaptcha, which allows me to define a set of questions which the user needs to answer. I’m now trying it out with the following question:

$wgCaptchaQuestions[] = array( 'question' => "LDAP stands for ...", 'answer' => "Lightweight Directory Access Protocol" );

I don’t think it’s a particularly good question, since it’s incredibly easy to Google. But, we’ll see, and in the meantime I’ll try to come up with one or two questions that are context-sensitive, yet easy enough to answer for anyone with some knowledge of LDAP. If you have an idea, please leave a comment.

Safari: don’t give gzipped content a .gz extension

Yesterday, while helping Caloe with the website for her company De Buitenkok, I came across the mother of all stupid bugs in Safari. Having recently announced payformystay, I loaded it up in Apple’s hipster browser only to notice that the CSS wasn’t loaded. Oops!

Reloading didn’t help, but … going over to the development version, everything loaded just fine. Conclusion? My recent optimizations—concatenating + gzipping all JavaScript and CSS—somehow fucked up payformystay for Safari users. The 14 Safari visitors (16.28% of our small group of alpha users) I’ve received since the sixth must have gotten a pretty bleak image of the technical abilities of payformystay’s Chief Technician (me). 😥

The old cat | gzip

So, what happened?

To reduce the number of HTTP requests per page for all the JavaScript/CSS stuff (especially when none of it is in the browser cache yet), I made a few changes to my build file to scrape the <head> of my layout template (layout.php), which I made to look something like this:

<?php if (DEV_MODE): ?>
  <link rel="stylesheet" type="text/css" href="/layout/jquery.ui.selectmenu.css" />                                   <!--MERGE ME-->
  <link rel="stylesheet" type="text/css" href="/layout/fancybox/jquery.fancybox-1.3.4.css" />                         <!--MERGE ME-->
  <link rel="stylesheet" type="text/css" href="/layout/style.css" />                                                  <!--MERGE ME-->
  <script src="/layout/jquery-1.4.4.min.js" type="text/javascript"></script>                                          <!--MERGE ME-->
  <script src="/layout/jquery.base64.js" type="text/javascript"></script>                                             <!--MERGE ME-->
  <script src="/layout/jquery-ui-1.8.10.custom.min.js" type="text/javascript"></script>                               <!--MERGE ME-->
  <script src="/layout/jquery.ui.selectmenu.js" type="text/javascript"></script>                                      <!--MERGE ME-->
  <script src="/layout/jquery.cookie.js" type="text/javascript"></script>                                             <!--MERGE ME-->
  <script src="/layout/fancybox/jquery.fancybox-1.3.4.js" type="text/javascript"></script>                            <!--MERGE ME-->
  <script src="/layout/" type="text/javascript"></script>                                  <!--MERGE ME-->
  <script src="/layout/jquery.writeCapture-1.0.5-min.js" type="text/javascript"></script>                             <!--MERGE ME-->
<?php else: # if (!DEV_MODE) ?>
  <link href="/layout/motherofall.css.gz?2" rel="stylesheet" type="text/css" />
  <script src="/layout/3rdparty.js.gz?2" type="text/javascript"></script>
<?php endif ?>

It’s very simple: All the files with a “<!--MERGE ME-->” comment on the same line got concatenated and gzipped into motherofall.css.gz and 3rdparty.js.gz respectively, like so:

MERGE_JS_FILES := $(shell grep '<script.*<!--MERGE ME-->' layout/layout.php|sed -e 's/^.*<script src="\/\([^"]*\)".*/\1/')
MERGE_CSS_FILES := $(shell grep '<link.*<!--MERGE ME-->' layout/layout.php|sed -e 's/^.*<link .*href="\/\([^"]*\)".*/\1/')
all: layout/3rdparty.js.gz layout/motherofall.css.gz
layout/3rdparty.js.gz: layout/layout.php $(MERGE_JS_FILES)
        cat $(MERGE_JS_FILES) | gzip > $@
layout/motherofall.css.gz: layout/layout.php $(MERGE_CSS_FILES)
        cat $(MERGE_CSS_FILES) | gzip > $@

Of course, I simplified away the rest of my Makefile. You may notice that I could have used yui-compressor or something similar to minify the concatenated files before gzipping them, but yui-compressor chokes on some of the third-party stuff. I am using it to optimize my own CSS/JS (again, only in production).
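
By the way, a quick way to sanity-check that grep|sed scraping pipeline is to feed it a single sample line (the path here is just one of the files from the template above):

```shell
line='<script src="/layout/jquery.cookie.js" type="text/javascript"></script> <!--MERGE ME-->'
echo "$line" \
  | grep '<script.*<!--MERGE ME-->' \
  | sed -e 's/^.*<script src="\/\([^"]*\)".*/\1/'
# Output: layout/jquery.cookie.js
```

The leading slash is stripped by the pattern, so the result is a path relative to the project root, exactly what the Makefile prerequisites need.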

Safari ignores the Content-Type for anything ending in .gz

As far as the HTTP spec is concerned, “file” extensions mean absolutely nothing. They’re trivial drivel. Whether a URL ends in .gz, .css, .gif or .png, what it all comes down to is what the Content-Type header tells the browser about the response being sent.

You may have noticed me being lazy in the layout template above when I referenced the merged files:

<link href="/layout/motherofall.css.gz?2" rel="stylesheet" type="text/css" />
<script src="/layout/3rdparty.js.gz?2" type="text/javascript"></script>

I chose to directly reference the gzipped version of the CSS/JS, even though I had a .htaccess file in place (within /layout/) which was perfectly capable of using the right Content-Encoding for each Accept-Encoding:

$ cat /layout/.htaccess

AddEncoding gzip .gz

RewriteEngine On
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)$ $1.gz [QSA,L]

<Files *.css.gz>
ForceType text/css
</Files>
<Files *.js.gz>
ForceType application/javascript
</Files>

You may notice that the .htaccess file contains some configuration to make sure that the .gz files are not served as something like application/gzip-compressed.

Anyway, I went to see if there were any browsers left that do not yet send Accept-Encoding: gzip, and could find none. When, yesterday, I was faced with an unstyled version of my homepage, my first reaction (after the one where I hit reload 20 times, embarrassedly mumbling something about “those damn browser caches!”) was: “Oh, then apparently Safari must be some exception to the rule that browsers have all supported gzip encoding for, like, forever!”

No, it isn’t so. Apparently, Safari ignores the Content-Type header for any resource with a URL ending in .gz. Yes, that’s right. Safari understands Content-Encoding: gzip just fine. No problem. Just don’t call it .gz.

The new cat ; gzip

So, let’s remove the .gz suffix from these files and be done with it. The .htaccess above was already set up to negotiate properly, serving the gzipped version only when the client accepts it (which is always, but I digress).

A few adjustments to my Makefile:

MERGE_JS_FILES := $(shell grep '<script.*<!--MERGE ME-->' layout/layout.php|sed -e 's/^.*<script src="\/\([^"]*\)".*/\1/')
MERGE_CSS_FILES := $(shell grep '<link.*<!--MERGE ME-->' layout/layout.php|sed -e 's/^.*<link .*href="\/\([^"]*\)".*/\1/')
all: layout/3rdparty.js.gz layout/motherofall.css.gz layout/pfms.min.js.gz
layout/3rdparty.js: layout/layout.php $(MERGE_JS_FILES)
	cat $(MERGE_JS_FILES) > $@
layout/motherofall.css: layout/layout.php $(MERGE_CSS_FILES)
	cat $(MERGE_CSS_FILES) > $@
%.gz: %
	gzip -c $^ > $@
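
The %.gz pattern rule just gzips whatever uncompressed target it depends on. Spelled out as plain shell commands (file names and contents made up for the sketch), the whole build boils down to:

```shell
# Stand-ins for two third-party scripts:
printf 'alert(1);\n' > a.js
printf 'alert(2);\n' > b.js

# What the layout/3rdparty.js rule does:
cat a.js b.js > 3rdparty.js

# What the "%.gz: %" pattern rule does:
gzip -c 3rdparty.js > 3rdparty.js.gz

# The compressed file round-trips back to the plain concatenation:
gzip -dc 3rdparty.js.gz | cmp -s - 3rdparty.js && echo same
# Output: same
```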

And here’s the simple change to my layout.php template:

<link href="/layout/motherofall.css?2" rel="stylesheet" type="text/css" />
<script src="/layout/3rdparty.js?2" type="text/javascript"></script>

That’s it. I welcome back all 14 Safari users looking for paid work abroad! Whether you’re looking for international work in Africa, in America, in Asia or in Europe, please come visit payformystay and have a look at what we have on offer. 😉


January the first: a very good day to announce a new project that I’ve been working on this past year. Which I did, on Facebook and Twitter. Now, five days later, it’s time to repeat the announcement to give it some much-needed link-juice. I know that normal people don’t follow this blog. (I don’t even follow this blog!) But it does have PageRank. And it does have 4000 monthly visitors. Time for some link-whoring!

PFMS search screen - top

PFMS search screen - bottom

payformystay is a website for adventurers who’re looking for paid work abroad. Whether you want to work in Europe, work in Africa, work in Asia, work in Australia or whether you just want to do some seasonal work anywhere but home (grape picking, strawberry harvest, whatever you fancy). Of course we have many types of work: office jobs, tourism jobs, healthcare jobs, childcare jobs, wildlife jobs, anything.

The cool thing about payformystay, though, is that we only sport paid jobs. So, no wrestling through page after page of crappy offers where some evil cunt swine tries to make you pay for your own work. That’s right! Job offers on payformystay must at the very least include full board (something like a bed or tent and 3 meals daily) or enough pay to cover these basic living expenses! Offers are audited and violators are fed to the spammers.

Go get yourself a piece of the action: payformystay.com – where people get paid to go on adventure

Peace out. End of announcement.

Have fun! Be scared! Be tough! And be safe!

Ubuntu Desktop Linux and Acer TravelMate 7513WSMi

My youngest sister has retired her big-ass (17″) Acer TravelMate (model 7513WSMi 7510) with a more modern offering from Sony. That was last year. Now, she thought it’d be a good idea to donate it to our oldest sister. But since the thing has always “run” like a pig with Windows Vista, her girl-geek instincts thought it better if I’d equip the old monster with Ubuntu Linux instead.

AMD 64bit

I’m also considering upgrading my own laptop to 64 bit. (They’ve told me that, really, the 32 bit age is over.) So, the first thing I’m trying to find out (now that I’m getting on the 64 bit train) is if this thing supports 64 bit. I can’t really think of a quick way to find out, so I’m just going to create a 64bit installation CD and see how that works.

Or, I could have just popped open the hood to see the “AMD Turion64x2 Mobile Technology” sticker. 😯


After changing the boot order, the installation CD (burned from my T61 using “wodim -data ubuntu-10.10-desktop-amd64.iso”) seems to be booting despite the worrying sounds that seem to indicate that the laptop is trying to rip apart and eat the disc.

I’m surprised how good the current installation program looks and that it asks me if I want to “download updates while installing” and “install third-party software”. Nice.

Great idea to ask all the annoying questions (timezone, etc.) during installation instead of after! I’m amused at how far behind the times I am when I see all the promotional screens for new and improved software, which are meant to keep me inspired during the installation process. “OpenOffice.org is fully compatible with Microsoft Office[…]” Am I really that much behind the times? Nah, I can’t imagine. I must still have some very, very nasty Excel sheet lying around somewhere, gathering dust. If I feed that monster of a thing to OpenOffice, then I’m pretty sure… Yeah, that’s going to be fun. 😈

Post-installation configuration

I had expected to spend at least an hour or two hunting around forums to find solutions for obscure driver-related issues and other nuisances. But no issues popped up. It just worked. Ubuntu is very compatible with the Acer TravelMate 7513WSMi! 😀

So, I spent some of the time saved on setting a user pic and a few other niceties, but I refrained from doing anything fancy, because I’ve figured out a new sister support strategy that I might blog about later. (It involves a four-hour work-week…)

[For my own reference, I started on the first draft of this post on January 14.]

Making a shell-script run with setuid root

If you want to run a process with root privileges that you can invoke as a less privileged user, you can make the program setuid root. This can be very useful, for example, when you want a PHP or CGI script to call a backup process, or to create a new site, or to irrevocably delete your whole system. The latter example points to a serious security problem: if anyone can figure out a way to make your program do something you don’t want, you’re screwed, because you just gave them root privileges to wreak maximum havoc. That’s why, normally, scripts (anything the kernel hands off to an interpreter because of a shebang line) won’t get elevated privileges when you set their setuid bit.

To understand the setuid bit, let’s first see what happens when I try to cat a file that belongs to root:

su -
# I am now root; fear me
touch no-one-can-touch-me
chmod 600 no-one-can-touch-me
exit
cat no-one-can-touch-me
# cat: no-one-can-touch-me: Permission denied

Next, I’ll create a shell script that cats the file (let’s call it cat-wrapper.sh):

#!/bin/sh
cat no-one-can-touch-me

And make the script setuid root:

su -
chown root:root cat-wrapper.sh
chmod +xs cat-wrapper.sh

If I now execute the script, I still get “Permission denied”. What I need to make this work is a wrapper program. For that, I refer to Wiebe’s post about the same subject. (Yeah, I know: why bother publishing this if Wiebe already did an excellent job explaining? Well, I just hate to throw away an otherwise fine draft.)
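
For completeness: the setuid bit itself is just a mode bit, and you don’t need root to see what it looks like on a file of your own (this assumes GNU stat, as found on Ubuntu):

```shell
touch demo-script
chmod 4755 demo-script     # the leading 4 is the setuid bit (same as u+s)
stat -c '%A' demo-script
# Output: -rwsr-xr-x
```

Note the “s” where the owner’s “x” would normally be; the kernel simply refuses to honor it for interpreted scripts.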

Remove trailing slash from a path using Sed

Here’s how you can remove the trailing slash from a path using sed, the stream editor:

echo /just/a/path/ | sed -e 's#/$##'
# Output: /just/a/path
# And, if there isn't a trailing slash, nothing happens:
echo /just/another/path | sed -e 's#/$##'
# Output: /just/another/path

It works quite simply. Sed executes the expression (given with -e) on its standard input. The expression is a substitution using regular expressions. The #-sign is the delimiter. The part between the first two hash signs (/$) is the matching expression, and the (empty) part between the second and the third hash sign is the replacement. This expression (“s#/$##”) basically says: replace a “/” at the end of the line (the dollar sign is the end-of-line anchor) with nothing.
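
The delimiter is a free choice, by the way; # simply saves you from escaping the slash. With the default / delimiter, the same substitution looks like this:

```shell
echo /just/a/path/ | sed -e 's/\/$//'
# Output: /just/a/path
```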

To use this in a script is easy-peasy. Suppose $1 is a system path that may or may not include a trailing slash:

sanitized_path=`echo "$1" | sed -e 's#/$##'`
echo "$sanitized_path"

This script outputs its first parameter with the trailing slash removed.
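
For what it’s worth, the shell can also do this without spawning sed at all, using POSIX parameter expansion (strip_slash is just a name I picked for the sketch):

```shell
strip_slash() {
  printf '%s\n' "${1%/}"   # ${1%/} removes one trailing "/" from $1, if present
}
strip_slash /just/a/path/
# Output: /just/a/path
strip_slash /just/another/path
# Output: /just/another/path
```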

Ubuntu and SiS 671 VGA chipset driver

The video on my mom’s laptop, a Fujitsu Siemens Esprimo Mobile V5535, had recently gone awry. At the time, the laptop was running Ubuntu 9.04 (I think). Reconfiguring the driver didn’t do much good, so I upgraded the machine to 10.04, hoping that that would fix it. It didn’t.

lspci|grep -i vga
01:00.0 VGA compatible controller: Silicon Integrated Systems [SiS] 771/671 PCIE VGA Display Adapter (rev 10)

I solved the problem by manually installing a replacement driver that I found through a blog post that I found through another blog post that I found through a forum post.

Or something like that. Who cares? The point is that I’m uploading the files I found here so that I don’t have to jump through MegaUpload hoops again (and sit through MegaAnnoying ads):

Installing the binary driver wasn’t too difficult. (I just always cringe when something happens outside of package management.) 🙁

mkdir sis; cd sis
unzip *zip
sudo cp sis671_drv.* /usr/lib/xorg/modules/drivers
# Edit /etc/X11/xorg.conf and set `Driver "sis671"` in the "Device" section:
[ -z "$EDITOR" ] && EDITOR=/usr/bin/vim
sudo $EDITOR /etc/X11/xorg.conf
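
For reference, the resulting "Device" section in xorg.conf ends up looking something like this (keep whatever Identifier your existing section already has; mine is just an example):

```
Section "Device"
	Identifier	"Configured Video Device"
	Driver		"sis671"
EndSection
```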

Restarting the X server after that was a bit difficult, since the upgrade to 10.04 also fucked up the console (that damn framebuffer) and because Ctrl-Alt-Backspace is disabled by default. I had to reboot. (Ok, I hate to admit: it’s not that it’s difficult, it’s just wrong.)

Anyway, after the system restart, it worked just fine again. The X log agrees:

(II) SIS: driver for SiS chipsets: SIS5597/5598, SIS530/620,
        SIS6326/AGP/DVD, SIS300/305, SIS630/730, SIS540, SIS315, SIS315H,
        SIS315PRO/E, SIS550, SIS650/M650/651/740, SIS330(Xabre),
        SIS[M]661[F|M]X/[M]741[GX]/[M]760[GX]/[M]761[GX]/662, SIS340,
        [M]670/[M]770[GX], [M]671/[M]771[GX]
(II) SIS: driver for XGI chipsets: Volari Z7 (XG20),
        Volari V3XT/V5/V8/Duo (XG40/XG42)
(II) Primary Device is: PCI 01@00:00:0
(WW) Falling back to old probe method for sis671
(--) Assigning device section with no busID to primary device
(--) Chipset [M]671/[M]771[GX] found
(II) SIS(0): SiS driver (2006/10/17-1, compiled for
(II) SIS(0): Copyright (C) 2001-2005 Thomas Winischhofer  and others
(II) SIS(0): *** See
(II) SIS(0): *** for documentation, updates and a Premium Version.
(II) SIS(0): RandR rotation support not available in this version.
(II) SIS(0): Dynamic modelist support not available in this version.
(II) SIS(0): Screen growing support not available in this version.
(II) SIS(0): Advanced Xv video blitter not available in this version.
(II) SIS(0): Advanced MergedFB support not available in this version.
(--) SIS(0): sisfb not found
(--) SIS(0): Relocated I/O registers at 0x9000

Then, to also fix the console:

grep vga16fb /etc/modprobe.d/* || sudo sh -c "echo blacklist vga16fb >> /etc/modprobe.d/blacklist-framebuffer.conf"
sudo update-initramfs -u
sudo reboot
# pray 

I had two other issues that popped up after the upgrade to 10.04. I was inclined to blame the first on the new video driver, but I solved it by disabling “Hardware Acceleration” in the Flash plugin preferences. [source]

Another problem that confused my mother was that the volume control icon had gone. [solution]

How to make a wiki work: PALDAP

The first ever wiki I started was PALDAP, which stands for “PALDAP: A Lazy Directory Administrator’s Pal”. Yes, that’s a recursive acronym. Cute, ainnit? I actually registered the domain because it was the name of a crappy abandonware LDAP administration tool that I wrote in PHP, but I decided instead to configure it as a wiki to host some of my assorted experiences with LDAP, and OpenLDAP in particular.

I never much bother with LDAP anymore, but the wiki remains because cool URLs don’t change and it doesn’t cost me that much. AdSense income for the wiki is only marginal (€15 in over three years) because the wiki’s content is only marginally useful and the traffic (300 visitors/month) reflects that fact.


Six days have passed since I wrote the last paragraph. It’s a funny thing how writing can mess with your head. I was going to use PALDAP as an introduction to my struggle to make money off my wikis in general. Because the most promising of these wikis are my Hardwood Wikis and not my LDAP wiki, I wasn’t going to linger too much on it. But it’s a week later and some unexpected things have happened.

Ever since becoming more familiar with Semantic MediaWiki, I’ve been toying with the idea of converting the site’s DokuWiki installation to a Semantic MediaWiki installation. Yet, nothing ever happened. I no longer work with LDAP professionally and most of the time I just kind of forgot that the site even existed. Until a week ago, when I started writing this post.

So, what happened? How do decisions happen? I have no idea. I’m not a neurologist. (I’m not even a sceintist; Hell, I can’t even spell “scientist”.)

What I have now are rationalizations for my decision, but my decision is quite clear: I can’t kill my darling, even though I never really properly cared for it. For the last four years or so I had simply abandoned it on the grounds that it wasn’t costing me much anyway (it’s hosted at NearlyFreeSpeech.Net). Yes, the costs have gone up, but that’s just a rationalization. I could have just gone on ignoring the site’s existence without it ever making much of an impact on my cash flow. (Some years ago, I did actually promise the site’s most active contributor that I would never take the site off-line.)

So, if I was being rational, I would have just left the site alone. But, I’m not a rational being. Increasingly less so, in fact. A happy fact, if you ask me.


The PALDAP logo designed by Jeroen Dekker

The looks

Anyway, I still haven’t told you what happened last week. I didn’t leave the site alone. I created a development version of the site based on Semantic MediaWiki. It’s fucking kick-ass. It looks awesome thanks to MediaWiki’s new vector skin. But it looks even more awesome thanks to Jeroen Dekker. As we often do, we were hanging out at his place in a lazy haze, being generally unproductive but with random bursts of intelligent conversation and productivity. This day I had been absent-mindedly hacking away on my new MediaWiki darling and I was about to leave and jump on my bicycle when I mentioned that I could use a logo for PALDAP.

It was probably way past one in the morning already, but Jeroen was still in a creative mood from play-practicing with his new lighting set. All I can say about his creativity is that it was late and I hadn’t slept very long the night before (or the night before that, or the night before that…). Let’s just say that, besides his excellent gear, he didn’t have very good material to work with. I was feeling ugly and tired. Yet…

Rowan, January 7, 2010

Jeroen's creative genius is a compliment to my awesome facial features 🙂

He went into a kind of frenzy on his big-ass touch-screen and being coaxed by me he created the perfect offset for the boring technical subject that is LDAP.

The brains

In the meantime, since last weekend, I’ve been starting to assemble a logical structure of semantic properties (think of LDAP attributes or SQL fields) and templates (sort of like MediaWiki functions) that’ll allow me to capture all the semantics related to LDAP and the ecology around it.

The booty

I still don’t believe that PALDAP has a huge revenue potential, but hosting costs have increased and if I can get the website to awaken from its winter sleep, maybe it’ll at least start paying for itself again. Not that I really care, honestly. Somehow it’s just masturbatorily satisfying to use the expressiveness of RDF to capture the semantics of LDAP. What I like about it is that the wiki concept (and especially the semantic wiki concept) is a very tight fit for technical documentation. Another thing that I like about working with a wiki about a technical subject is that the wiki has a technical audience. I mean, there’s a reason that the visitors of my Hardwood Investment Wikis click on all those expensive links and that reason is not the technical insight that’ll lead to users clicking the edit button and actually contributing content.

In fact, even with the old DokuWiki version of the site, much of the content was actually created by other users (most of it by the same user, brontolo). If the community of my Hardwood Wikis worked this well… Let’s just say that I could remain in retirement for a while then.

So, even if this’ll just be an exercise in effective community building/plumbing rather than a way to make advertising income easily, it’ll still be effective as an exercise. I’m going to follow my intuition on this one and see how successful it’ll become and how much time it’ll take.

Fuck, this post sucks, but it sure does help me. Don’t ask me how, but it does. Kinda.

Effective CLI habits

Just an example of some effective CLI magic that I copy/pasted into a draft exactly a year ago. Can you see what’s happening? I’m moving some selected files into a subdirectory.

$ ls *png
boucoule-17jaar-met-steen.png         evening_cloud.png  small-map-molenweg.png  tile11.png
boucoule-2001-2002-face5-400x300.png  hardwood-logo.png  step-01.png             tile9a.png
$ ls *png|while read f; do echo $f; done
$ ls *png|while read f; do svn mv $f index; done
A         index/boucoule-17jaar-met-steen.png
D         boucoule-17jaar-met-steen.png
A         index/boucoule-2001-2002-face5-400x300.png
D         boucoule-2001-2002-face5-400x300.png
A         index/evening_cloud.png
D         evening_cloud.png
A         index/hardwood-logo.png
D         hardwood-logo.png
A         index/small-map-molenweg.png
D         small-map-molenweg.png
A         index/step-01.png
D         step-01.png
A         index/tile11.png
D         tile11.png
A         index/tile9a.png
D         tile9a.png

Bonus points if you notice that I could have moved the JPEGs and PNGs in one command, instead of doing the same thing a second time for the JPEGs, as below. (I probably forgot that I also had some JPEGs lying around, or there must have been some other lame excuse.)

$ ls *jpg
bruggetje-225x300.jpg  favicon.jpg  purple-rowan.jpg        rowan-2007.jpg                rowan-wilderness.jpg
bruggetje.jpg          hekje.jpg    rowan-2007-448x300.jpg  rowan-wilderness-400x300.jpg
$ ls *jpg|grep -v favi
$ ls *jpg|grep -v favi|while read f; do svn mv $f index; done
A         index/bruggetje-225x300.jpg
D         bruggetje-225x300.jpg
A         index/bruggetje.jpg
D         bruggetje.jpg
A         index/hekje.jpg
D         hekje.jpg
A         index/purple-rowan.jpg
D         purple-rowan.jpg
A         index/rowan-2007-448x300.jpg
D         rowan-2007-448x300.jpg
A         index/rowan-2007.jpg
D         rowan-2007.jpg
A         index/rowan-wilderness-400x300.jpg
D         rowan-wilderness-400x300.jpg
A         index/rowan-wilderness.jpg
D         rowan-wilderness.jpg
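
For the record, the PNGs and JPEGs could indeed have been moved in one go. A sketch, with plain mv standing in for svn mv and made-up file names (a for loop over globs also survives file names with spaces, which ls | while read does not):

```shell
mkdir -p index
touch tile.png map.png photo.jpg favicon.jpg   # stand-in files

for f in *.png *.jpg; do
  case $f in favicon*) continue;; esac   # skip the favicon, like grep -v favi did
  mv "$f" index/
done

ls index
# favicon.jpg stays behind; everything else is now in index/
```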
