Ethernet cable labelling

Ever felt that you have too many cables in your home and don’t know which one leads where?  Welcome to my world!  Here’s my recipe for making the situation easier.

It all starts with a sticky tape.


Now comes the pen.


Let’s cut the tape with scissors.


At this point you should remove the protective foil from the back of the tape to expose the glue and shape a loop.


As a final move, pull the cable through the loop and push the tape against the cable to make it stick.


If you do all the above properly, you’ll get a sight that makes the pants of any nerd wet.

Make your OpenWrt router automatically enable your home server and notify you

Here’s the situation: I have a Mini-ITX home server on my LAN that sometimes gets shut down when the power goes out. I could set up its BIOS to power the machine on automatically when it gets power, but given the nature of power outages, the power may go out and come back in rapid succession over a short period of time, which isn’t really good for the hardware.

I’d rather have my server checked every 5 minutes and automatically woken up, and I also wanna be notified by email on such occasions so that I can SSH into it and decrypt the crypto partition.

I’m about to carry out this task using my OpenWrt driven router. Everything is pretty straightforward, except that I can’t send emails through the Gmail SMTP server from OpenWrt because mini_sendmail lacks SASL support, so I’ll just fetch a PHP script that’ll actually send the mail.
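The checking and waking is done by a small supervise-host script. Here’s a minimal sketch of the idea; the MAC address, notify URL, and the exact ping/wget flags are placeholders of mine, not the real values:

```shell
#!/bin/sh
# supervise-host -- wake a LAN host when it stops answering pings.
# The MAC address and notify URL below are placeholders.

HOST_MAC="00:11:22:33:44:55"                # MAC for the Wake-on-LAN packet
NOTIFY_URL="http://example.com/notify.php"  # PHP script that sends the mail

host_is_up() {
    ping -c 1 -W 2 "$1" >/dev/null 2>&1
}

supervise_host() {
    if ! host_is_up "$1"; then
        wol "$HOST_MAC"                             # send the magic packet
        wget -q -O /dev/null "$NOTIFY_URL?host=$1"  # trigger the email
    fi
}

if [ $# -eq 1 ]; then
    supervise_host "$1"
fi
```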


Be aware that the above wol utility is a home-cooked one.


# m h  dom mon dow   command
*/5 * * * * supervise-host mybox

At this point restart cron on OpenWrt.

<?php
mail('', $_GET['host'] . ' has been rebooted', 'uncrypt me!');


Google Reader transition

I’ve been using Google Apps for a while and a transition opportunity has appeared lately on the Google Apps admin interface that I’ve just executed. As a result I can use Reader through my Apps account instead of my Gmail account and my shared items page has changed from the old page to the new page.

According to the statistics I’ve read 86,937 items since April 7, 2006 with my old account which seems like a lot.

Minimizing CSS and JavaScript HTTP requests automatically on the CMS level

Broadband Internet connections are pretty standard nowadays, but despite this, many sites feel slow to load. Why is that? There is a multitude of reasons, but the one that really hits home for me is excess HTTP requests. To be even more specific, I’d like to talk about excess CSS and JavaScript file requests. It’s not unusual for sites to load about a dozen or more CSS and JavaScript files combined, which I think is way overkill.

I came up with an algorithm that could be implemented by any CMS on the API level and it could dramatically reduce the load times of sites and relieve web servers significantly.

  1. Expose a dedicated API on the CMS level for plugins, such as add_cached_css() / add_cached_js() .
  2. Execute the following steps upon every page load:
     1. Check the modification times of all the CSS / JavaScript files.
     2. If the modification time of any file has changed since the last page load, or any new file has been added, then go on; otherwise abort.
     3. Save the modification times of all the CSS / JavaScript files, concatenate the files and md5sum the concatenated result.
     4. Save the concatenated CSS / JavaScript files under their md5sums, such as 7c1735b79f2d13052454c196259ca511.css and 9fee0c4c4391bd75ca4269dac409a0aa.js
     5. Save the md5sums so the CMS can reference the generated files from the main page.
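To make the check–concatenate–md5sum cycle concrete, here’s a shell sketch for CSS files (JavaScript works the same way). The stamp-file convention and paths are my own illustration, not part of any particular CMS:

```shell
#!/bin/sh
# Sketch of the CSS bundling cycle: rebuild the concatenated file only
# when a source file changed, and name the bundle after its md5sum.

bundle_css() {
    outdir="$1"; shift              # where generated bundles are stored
    stamp="$outdir/.css.stamp"      # holds the current bundle name;
                                    # its mtime marks the last rebuild

    # Skip the rebuild unless some file is newer than the stamp
    # (or no stamp exists yet).
    rebuild=0
    [ -f "$stamp" ] || rebuild=1
    for f in "$@"; do
        [ "$f" -nt "$stamp" ] && rebuild=1
    done
    if [ "$rebuild" -eq 0 ]; then
        cat "$stamp"                # nothing changed: reuse the old bundle
        return
    fi

    # Concatenate, md5sum, and save under the hash-based name.
    sum=$(cat "$@" | md5sum | cut -d' ' -f1)
    cat "$@" > "$outdir/$sum.css"
    printf '%s.css\n' "$sum" | tee "$stamp"
}
```

The CMS would then emit a single stylesheet link pointing at the printed file name, which can be cached forever since any content change produces a new name.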

A couple of things to note:

  • The generated CSS / JavaScript files should be cached forever as it’s practically impossible for two distinct generated files to ever collide.
  • This algorithm could be implemented by any CMS so that any plugins could use it with no effort.
  • New API functions may not even be necessary in every CMS.  For example, WordPress already has functions for adding CSS / JavaScript files.  A simple define should be enough to activate such an algorithm.

Let me know what you think.

Stick your file to a specific path with stickfile

Update (2011-04-22): Zach let me know in the meantime that there’s a much easier way to implement stickfile in Bash.

Moral of the story: I should have searched for inotify command line which would lead me to inotify-tools which contains inotifywait.

And now to the original post:

My employer uses SonicWALL NetExtender for its VPN needs. Saying that I’m not a fan of IPsec would definitely be an understatement, but my major problem is that NetExtender overwrites my resolv.conf upon every connection, which breaks hostname resolution on my LAN from my laptop. chmoding or chowning resolv.conf doesn’t help because NetExtender re-chowns and re-chmods it.

I was thinking about overwriting resolv.conf on a regular basis from a script but it seemed rather inelegant. But how should I do it otherwise? With inotify, of course.

Here’s the script I’ve written which you should save as “stickfile” to a directory that is featured in your $PATH.
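A minimal version can be sketched with inotifywait from inotify-tools; the tool choice and event flags are my assumptions, not necessarily what the original script used:

```shell
#!/bin/sh
# stickfile -- keep DEST identical to SOURCE, restoring it whenever
# something (e.g. NetExtender) overwrites it.

stickfile() {
    src="$1"; dst="$2"
    cp "$src" "$dst"
    # Block until the destination is written to, then restore it.
    while inotifywait -qq -e modify -e attrib -e close_write "$dst"; do
        cp "$src" "$dst"
    done
}

if [ $# -eq 2 ]; then
    stickfile "$1" "$2"
fi
```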

After I created a valid resolv.conf and saved it as /etc/resolv.conf.orig I only had to execute the following as root before starting up NetExtender:

stickfile /etc/resolv.conf.orig /etc/resolv.conf

Self hosting or cloud hosting: that is the question

I know many geeks. Most of them are smart. Some of them are brilliant. Few of them are world class. And guess what? Many of them blog on a regular basis. These are tech savvy folks who run their blogs on their own (or rented) server for maximum customizability and pay a monthly fee. What do you think: how long will the information they produced survive if some of them get hit by a train? It’s very simple. Divide the amount in their bank account by their monthly hosting fee and you get the number of months in question.

What I wanna ultimately conclude is that there are lots of folks who create valuable content on a daily basis, and their content is way too vulnerable.

I was thinking a lot about this issue. I have 350 posts at the moment and although I know that mankind would happily survive without any of them, I feel the need to secure this information just in case.

Since I (and most people) use WordPress, I was thinking about a WordPress specific solution. The idea is to dump my self hosted blog to my cloud hosted blog on a regular basis from cron. Of course, there are some drawbacks: Google may find the mirrored content alongside or before my primary site, and I won’t be able to really customize the cloud hosted blog, but I think the benefits outweigh the drawbacks.

The term I’m about to coin is the “backup blog”, which is a cloud hosted blog where you mirror your content on a regular basis in an automated manner. I’m about to do this but I’m not there yet.