
Remote Working and non-reliable ISP

Having agreed it with my manager, who was kind enough to allow me to work from home while away in India (my home country), I was quite excited about the Easter break. As usual, I was visiting my family and was hopeful that it would all work out OK for me, but it turns out that the country seems to be developing backwards!

Basically, the ISP in my town here is quite appalling! A strong breeze and the connection is gone! The summer rains, usually accompanied by thunderstorms, make it worse! Power cuts, or "load shedding" as they call it, are the cherry on the cake! So I am left with a home setup that has a few kVA of backup power but no reliable Internet connection, which means I am having to work offline mostly and then sync occasionally!

Some people might ask: why not use a remote filesystem like SSHFS (a FUSE mount over SSH)? I could, if the connection were stable, but it is a no-go on an unreliable link, as the mount hangs whenever the link goes away!

Admittedly, what the title describes is not very relevant to most people in the developed world, but the same setup works equally well for anyone who needs to develop in an environment where source edits happen on one machine (laptop, desktop, tablet, etc.) and the updated files need to be copied to another host to be compiled or run (on a web server, for example).

lsyncd came to my rescue! In short, it watches a directory tree for changes and replicates them to a remote host. The command that I ended up formulating is below. The utility is available as a package in Linux Mint (I am ditching Ubuntu for now, till Mr Shuttleworth makes up his mind; I am not his guinea pig), so a simple sudo apt-get install lsyncd does the job.

```shell
lsyncd -nodaemon -log Exec -rsyncssh $HOME/devhost_backup/svn_root/trunk devhost.mycompany.com $HOME/svn_root/trunk
```
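For anything longer-lived, lsyncd prefers a Lua config file over command-line flags. Below is a sketch of what I believe is the config-file equivalent of the one-liner above, assuming lsyncd 2.x syntax for the default.rsyncssh layer; check it against the lsyncd manual for your version before relying on it.

```lua
-- Sketch (untested): lsyncd 2.x config equivalent of the one-liner above.
-- Key names follow the lsyncd manual for the default.rsyncssh layer.
settings {
    nodaemon = true,   -- stay in the foreground, like -nodaemon on the CLI
}

sync {
    default.rsyncssh,
    source    = os.getenv("HOME") .. "/devhost_backup/svn_root/trunk",
    host      = "devhost.mycompany.com",
    targetdir = os.getenv("HOME") .. "/svn_root/trunk",
}
```

Save it as, say, lsyncd.conf.lua and start it with lsyncd ./lsyncd.conf.lua.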

Here, I keep a backup of my usual SVN checkout under $HOME/devhost_backup. Whenever I change anything within that folder, lsyncd notices via inotify; the changes are aggregated over a few seconds and then replicated using rsync. File operations that are better done via shell commands (moves, deletes) are run over an SSH session instead. (It is assumed that the username is the same on both hosts, and that key-based SSH access is set up and working.)

IMO, inotify + rsync + SSH (for moves, deletes, etc.) is the best combo. Seems to work a treat!
