Daniel Hoelbling-Inzko talks about programming

Compiling vim8 with python support on Ubuntu

Today I took a day off from work, and as always when I try some new stuff I ended up spending two hours on my Vim configuration before actually getting anything done. So today's two hours were spent getting Vim compiled with python3 support.

First off - do use Vim8 - it's awesome, and do compile it from source. It's rather simple and saves you from outdated packages on Ubuntu :).

Now, my issue today was that I tried enabling python2 and python3 support at the same time. For no apparent reason, the following configuration always resulted in a vim binary that thought it had python support - but didn't.

./configure --with-features=huge \
            --enable-multibyte \
            --enable-rubyinterp=yes \
            --enable-pythoninterp=yes \
            --with-python-config-dir=/usr/lib/python2.7/config-x86_64-linux-gnu \
            --enable-python3interp=yes \
            --with-python3-config-dir=/usr/lib/python3.5/config-3.5m-x86_64-linux-gnu \
            --enable-perlinterp=yes \
            --enable-luainterp=yes \
            --enable-cscope --prefix=/usr

Running vim --version showed +python/dyn and +python3/dyn, so I thought - cool, it's working. Until I started vim and was greeted by:

Sorry, this command is disabled, the Python library could not be loaded.

To make things more interesting, :echo has('python') returned 0 too - although Vim was built with python support (and --enable-fail-if-missing is supposed to make the build fail if python can't be linked).

So after trying around a bit and not getting anywhere, I decided to just remove the python3 support from the configure line and voilà - python is statically linked and working. Yay!

./configure --with-features=huge \
            --enable-multibyte \
            --enable-rubyinterp=yes \
            --enable-pythoninterp=yes \
            --with-python-config-dir=/usr/lib/python2.7/config-x86_64-linux-gnu \
            --enable-perlinterp=yes \
            --enable-luainterp=yes \
            --enable-cscope --prefix=/usr
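The distinction that bit me here is between a static +python and a dynamic +python/dyn in the vim --version output: only the dynamic variant can still fail at runtime with "the Python library could not be loaded". A small sketch of a check (the helper name is made up):

```shell
# Classify vim's python support from its `vim --version` feature line.
# "+python/dyn" means libpython is loaded at runtime and can still fail;
# a plain "+python" means it is statically linked in and just works.
check_python_linkage() {
  case "$1" in
    *+python/dyn*|*+python3/dyn*) echo "dynamic" ;;
    *+python*)                    echo "static"  ;;
    *)                            echo "none"    ;;
  esac
}

# usage: check_python_linkage "$(vim --version)"
```

Inside vim itself, :echo has('python') is still the definitive test, since it actually tries to load the interpreter.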
Filed under vim, python, tools

Enable code coverage reports in create-react-app projects

create-react-app is a nice and easy way to bootstrap a new React.js project with some sane defaults and most of the tedious configuration required for Webpack builds, Babel, etc. already done for you.

One thing I was missing from the generated configs though is how to output code coverage. Turns out it's rather simple - locate your package.json and add the following line under scripts:

    "coverage": "node scripts/test.js --env=jsdom --coverage"

This way you can run yarn coverage or npm run coverage and get nicely formatted output with your coverage data. You can read more about the Jest CLI options in the docs.
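In context, the scripts section of your package.json would then look something like this (the start and test entries are whatever create-react-app already generated for you; only the coverage line is new):

```json
{
  "scripts": {
    "start": "node scripts/start.js",
    "test": "node scripts/test.js --env=jsdom",
    "coverage": "node scripts/test.js --env=jsdom --coverage"
  }
}
```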

Filed under reactjs, testing, tools, javascript

Razer Black Widow Ultimate Review

I have been putting this review off for a very, very long time since purchasing the Razer Black Widow Ultimate (in fact, it's been almost 3 years since I got mine), but since friends keep asking about the keyboard I thought I could save myself a few keystrokes here.

So, short and sweet: Is it any good?


I don't know how many times this has already been said (see Jeff Atwood for example), but keyboards matter. And having a great keyboard is one of the most important things to me personally.

So after six worn-out Microsoft Natural 4000 keyboards and some intermediate Razer and Logitech keyboards, I decided to bite the bullet, jump on the new "mechanical" keyboard wagon to test it out, and got the Razer Black Widow Ultimate. And god, this thing changed my life!

When you first type on it (or any mechanical keyboard for that matter) there is this "HOLY CRAP" moment when you remember how typing felt back on those IBM keyboards in your youth. The keys travel perfectly uniformly, with exactly the right amount of pressure and a satisfying click at the end. Let's just say the typing is sublime. It's just plain better than conventional keyboards - period.

Now that we have established you need a mechanical keyboard: do you need the Razer Black Widow?

Yes, no and maybe. I love Razer products, I swear by my Razer mouse, and their keyboards have always served me well before. So I would say the Razer Black Widow is a well-built, solid and great-looking keyboard you'll want to buy. But: don't buy it for its gaming features. Buy it for the looks, the build quality and the switches.

Why not for the gaming features? Because gaming keyboards are a lie - gaming keyboards are the equivalent of 3D TVs, just a marketing gag to extort money from you. You don't want an extra row of macro buttons, because you don't need an extra row of macro buttons. That's like putting a second handle on a door - everything you need out of a keyboard is already there: on or near the WASD keys. No game on this earth expects its players to have a macro-recording super-duper keyboard, so all games are designed to work well with a standard keyboard. I have yet to find a game where I could not remap the keys in the interface, or had to perform keyboard input so weird that I had to use those extra keys - ever.

The second lie with gaming keyboards is their anti-ghosting technology. Again: you ain't gonna need it. Yes, the keyboard may accept more than 4 inputs at the same time, but I have never felt this was a problem on keyboards that lacked the feature. The days of playing multiplayer games with two people on the same keyboard are gone, and for everything else you will never hit the limits, even on a 10€ keyboard.

The third lie is the ultra-fast 1ms response time. Who are we kidding? There is no noticeable keyboard delay on regular keyboards, so any improvement on already unnoticeable lag is just snake oil. But heck, it sure sounds like that's the only thing holding you back in multiplayer games.

Now that we have established that I love my Razer Black Widow but think all the gaming features they market it with are crap, I also have to express my frustration with the Ultimate version of the keyboard.

When I bought it, you could get the Razer Black Widow for around 80€ and the Black Widow Ultimate for 120€. I went for the Ultimate edition because it has backlight illumination, and I liked that. It also has an additional USB port and an audio/mic pass-through. In theory this means you could connect your headset to the keyboard, avoiding problems with cable length etc. The reality is just frustrating: brainless monkeys designed this feature! They put it on the right side of the keyboard - right where my mousepad starts! What on earth were they thinking? I am supposed to have cables and USB sticks on my mousepad? Like there is no space anywhere else around the keyboard! Actually, there is exactly the same amount of unoccupied space on the left side of the keyboard, and the whole back of the keyboard is empty. I've seen other keyboards solve this far better - I have had keyboards with grooves on the bottom to route your headset cable underneath so it isn't in your way. And Razer designed theirs so that the whole point of the cables is to be in your way.

So in closing: you want this keyboard - it's great. Just make sure you really, really want to pay 40€ extra for the illumination - because the rest of the "Ultimate" package is just crap.

Filed under hardware, tools, review

Amazon S3 uploads fail in numerous ways for no apparent reason

Today I spent almost 30 minutes trying to debug a problem where one of our servers was not correctly synchronizing files to Amazon S3.

The first thing I tried was manually doing an s3cmd put of the files in question, and I was immediately greeted by numerous connection reset by peer error messages. s3cmd is smart enough to automatically retry 5 times while throttling the upload bandwidth - still to no avail. When down to 5 kb/s upload speed I started getting broken pipe error messages.

Well, obviously our server has more than a 5 kb/s uplink, so I suspected someone had broken something at the network level - but everything else was working fine.

The final clue came when I ran `s3cmd ls` on my bucket and finally got a meaningful error message: the difference between the request time and the current time is too large. Huh? Yes! Running date on the server revealed that the clock was almost 20 minutes behind, so the connection to S3 could not be established.
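The underlying rule (per AWS's documented RequestTimeTooSkewed error) is that S3 rejects any request whose timestamp differs from AWS's clock by more than 15 minutes. A minimal sketch of that check:

```python
from datetime import datetime, timedelta

# S3 returns RequestTimeTooSkewed when the request timestamp differs
# from AWS's clock by more than 15 minutes
MAX_SKEW = timedelta(minutes=15)

def request_accepted(request_time: datetime, aws_time: datetime) -> bool:
    """True if S3 would accept a request timestamped at request_time."""
    return abs(aws_time - request_time) <= MAX_SKEW
```

A clock 20 minutes behind, like ours, lands outside that window, which is why every single upload failed.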

Turns out we had not enabled ntp, so the server clock had been drifting for a couple of months. The solution was straightforward:

sudo ntpdate ntp.ubuntu.com
sudo apt-get install ntp

The first command does an immediate one-off sync (merely installing ntp takes some time to slew the clock back, so a manual update fixes the problem right away), and the second installs ntp to prevent this from happening in the future.

Filed under tools

Make GNU screen xterm-256color work on OSX

I just ran into this and spent about 2 hours with Linux genius Jam trying to figure out why on earth I could run Vim in 256-color mode on my server, yet once I started screen it didn't work anymore.

The issue was twofold. a) My local Terminal.app was reporting itself as xterm-color instead of xterm-256color. You have to update this setting in Terminal.app's preferences.


b) Once that's done, you only need to edit your .screenrc to include the following lines:


# terminfo and termcap for nice 256 color terminal
# allow bold colors - necessary for some reason
attrcolor b ".I"
# tell screen how to set colors. AB = background, AF=foreground
termcapinfo xterm 'Co#256:AB=\E[48;5;%dm:AF=\E[38;5;%dm'
# erase background with current bg color
defbce "on" 
# set TERM
term screen-256color-bce

The problem is: even if you set up your .screenrc correctly, it won't matter if your terminal isn't reporting the correct TERM string in the first place.
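A quick way to verify the whole chain works, both inside and outside screen, is to print the full palette and see whether you actually get 256 distinct background colors (a plain sanity-check loop, nothing screen-specific):

```shell
# print the 256-color palette, 16 swatches per row; if the terminal,
# TERM string and .screenrc are all correct you should see 16 rows
# of distinct background colors rather than a repeating 8/16-color cycle
for i in $(seq 0 255); do
  printf '\033[48;5;%dm %3d \033[0m' "$i"
  if [ $(( (i + 1) % 16 )) -eq 0 ]; then printf '\n'; fi
done
```

Run it once in a bare shell and once inside screen; if the colors differ between the two, screen is the one mangling them.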

Filed under tools

Securely managing database.yml when deploying with Capistrano

The more I venture into Ruby land, the more magic unicorns I find along the way. The wonders of SSH still seem totally outlandish to someone used to doing deployments by RDPing into a server and xcopying a directory structure into the IIS folder.

But here I am and learning the ways of Capistrano and how deployments to multiple servers really should work.

Naturally I ran into issues (which I'll detail a bit later), but one of my major problems with my Rails deployment was the different database.yml between my production and my dev environment. Since the repository is in a shared location, I could not put the production server's MySQL password into the config, as it would be available to anyone with read access to the repository. This may be something you can get away with in a corporate environment, but if you ever plan on open-sourcing your project you should make sure you don't put production passwords into your repository :).

My solution to the problem is quite simple: I ssh'd into my server, put a "production" database.yml into the home directory of my deployment user, and added the following task to my Capfile:

namespace :db do
  task :db_config, :except => { :no_release => true }, :role => :app do
    run "cp -f ~/database.yml #{release_path}/config/database.yml"
  end
end

after "deploy:finalize_update", "db:db_config"

The after statement tells Capistrano to run the db_config task right after the code update finishes, but before any migrations run in case you use cap deploy:migrations. So during every deployment the database.yml from the repo gets overwritten with the one on the server.
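For reference, the database.yml sitting in the deploy user's home directory is just an ordinary Rails database config - every value below is a placeholder, not my actual setup:

```yaml
production:
  adapter: mysql2
  encoding: utf8
  database: myapp_production
  username: myapp
  password: changeme
  host: localhost
```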

I also added an assets:precompile task, since Capistrano won't precompile Rails assets out of the box (you need RVM integration for this though):

namespace :deploy do
  task :precompile, :role => :app do
    run "cd #{release_path}/ && rake assets:precompile"
  end
end

after "deploy:finalize_update", "deploy:precompile"

Et voilà: I can now run cap deploy:migrations from my dev machine and it will automatically connect to my release server, pull the code out of the git repository, compile the assets and migrate the database to a new version. It will even roll back to the old version if something goes wrong along the way.

PS: I also struggled at one point with the SSH keys for the git repository. Since the deployment user on the server has no private key of its own, I was inclined to generate one and add it to my git server's allowed-keys list. But that's apparently the wrong way to go about it. The right thing to do here is to simply enable agent forwarding, so the server forwards any key challenges to your dev machine, which already has the appropriate keys available.

ssh_options[:forward_agent] = true

Filed under rails, ruby, tools

SSH address book

This may be old news to anyone somewhat used to Linux server administration, but Jammm just enlightened me so I thought I'd share.

Say you don't have a hostname associated with your server (yet); you may get bored of typing ssh root@<server-ip> all the time. Assuming you use public key authentication anyway (and you should!), typing ssh myserver would be far more convenient.

Turns out you can do exactly that by adding an entry to your ~/.ssh/config file (substitute your server's actual IP for the HostName placeholder):

Host myserver
  HostName <server-ip>
  Port 22
  User root

Filed under tools

Measure execution time in PowerShell

I have no idea why, but although I have been a Windows user for most of my career, I know the Unix command line pretty well. In fact, one of the best things in PowerShell was the ls alias for the Get-ChildItem command.

Naturally, Microsoft could not include an alias for every Unix command out there, so I spend a fair amount of time hunting down the PowerShell equivalents of Unix commands whenever I need one.

This time it's the time command, which lets you measure how long the execution of a particular command took. The PowerShell equivalent is called Measure-Command and does exactly the same thing, returning a System.TimeSpan.

For example, to measure the execution time of a git checkout:

Measure-Command { git checkout gh-pages }

Switched to branch 'gh-pages'

Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 0
Milliseconds      : 344
Ticks             : 3448544
TotalDays         : 3,99137037037037E-06
TotalHours        : 9,57928888888889E-05
TotalMinutes      : 0,00574757333333333
TotalSeconds      : 0,3448544
TotalMilliseconds : 344,8544

I considered creating an alias for Measure-Command called just time, but the usages are so rare that it's not really necessary.

Filed under tools

Using Readability on the iPhone

Disclaimer: This is not a post about programming. No code was harmed during the creation of this blogpost.

As you may have guessed from the title, I got myself an iPhone 4 some weeks ago and have loved it ever since. The browser in particular is great, yet sometimes even the best browser can't change the fact that a website is badly designed. Too often you can't make out the content between all the Google AdWords, the fonts are hideous, or it's a fixed-width layout that's way too wide.

On my PC I just hit the Readability bookmarklet and, through magic, all the ugly stuff goes away and only the content remains. Since Readability is just JavaScript, you can do the same thing on the iPhone too - it's just a bit trickier to install.

Here is how a badly readable site looks with Readability (note that it does not remove images that belong to the post!):

Before – After Readability


As you can see, the width of the layout is too wide to be easily readable in portrait orientation.

Step 1: Go to http://lab.arc90.com/experiments/readability/ on your iPhone, select all the text from the textbox on the right, and copy it.

Step 2: Hit add bookmark in Safari and save the Readability site.

Step 3: Go into your bookmarks, edit the Readability bookmark, delete the previous address, and paste the code you copied earlier.


Et voilà: whenever you want to see a page more clearly, just open that bookmark and it will convert any ugly site into a rather pleasant read.

Filed under internet, personal, tools

Using git from Powershell just got easier: Posh-git

Whenever someone asks me at the end of my git presentations what tools to use with git, my answer is always the same:

Learn to use the command line. It's by far the most convenient way to get stuff done. That's the way git was intended to be used, and with msysgit and PowerShell you get a pretty powerful shell to do your stuff. I work exclusively from the command line, rarely using gitk to take a look at the history.
Unfortunately the basic Windows command line (cmd) is just awful and outdated. And while some people swear by the MinGW stuff, I loathe the Unix command line on Windows. So PowerShell was the only good way for me to use git on Windows.

Now, thanks to some great work from Keith Dahlby, Jeremy Skinner and Mark Embling using git from Powershell just got a whole lot more comfortable with Posh-Git.

Documentation is still a bit sparse, but at its core Posh-Git gives you two things: it modifies your PowerShell prompt to display relevant git information (branch name and staged/unstaged changes), and it adds tab completion to all git commands. Tab completion also works on branches, so you can hit tab after git checkout and it will complete the branch name. Really, really nice, I have to say.

Here is how my git shell looks now with Posh-Git:


Once you have changes it will display them also on the prompt:


(Meaning: 1 new file, 1 deleted and 0 old ones changed)


Unfortunately not everyone is a PowerShell buff like Keith and friends, so I had a bit of trouble finding out how to set up Posh-Git. Actually, it's dead simple, but nobody told me: just put all the files from the Posh-Git repository into the folder containing your PowerShell profile, and rename the file profile.example.ps1 to Microsoft.PowerShell_profile.ps1. If you already had a PowerShell_profile.ps1 file set up with some custom settings, you can just add the code from profile.example.ps1 to your existing profile (assuming you didn't change your prompt before).

After that, restart your Powershell prompt and enjoy!


The latest Posh-Git only works with git 1.7.1 and higher, so if you are still running an older release you have to use the v0.2 Posh-Git release.

Filed under git, tools
