Monthly Archives: June 2015

logstash on fedora

But refer to the linked post when trying to find the package to install for F20.

Contrary to previous comments, this process works very nicely and I really appreciate the work that’s gone into the nginx package to use the Ubuntu vhost layout.

The only variation for me was to use port 5000 for the server,

# firewall-cmd --zone=public --add-port=5000/udp
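Worth noting that the rule above is runtime-only and lost on reboot. Assuming the same zone and port, the persistent variant would be something like:

```shell
# persist the rule across reboots, then reload firewalld to apply it now
firewall-cmd --permanent --zone=public --add-port=5000/udp
firewall-cmd --reload
```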

logstash-forwarder on r-pi

Have just discovered that the digitalocean instructions for logstash-forwarder don’t work on an r-pi: alas, there’s no pre-built ARM package. No matter, very good workaround instructions are available elsewhere.

I love it when people go to all the effort and work out all these details for people like me to just walk up and grab it. Thanks.

As an alternative, another post describes a perfect way to get the forwarder installed on an r-pi. The beauty here is that we get an installable package for distribution to other boxes.

Am very aware that I’m feeding off the great work done by others and not actually contributing an awful lot myself.

$ dpkg -i logstash-forwarder_0.4.0_armhf.deb
Selecting previously unselected package logstash-forwarder.
(Reading database ... 64610 files and directories currently installed.)
Unpacking logstash-forwarder (from logstash-forwarder_0.4.0_armhf.deb) ...
Setting up logstash-forwarder (0.4.0) ...
update-rc.d: using dependency based boot sequencing
Logs for logstash-forwarder will be in /var/log/logstash-forwarder/
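Once installed, the forwarder just needs pointing at the logstash server. A minimal /etc/logstash-forwarder.conf sketch — the hostname, certificate path and log paths here are placeholders, not from my actual setup:

```json
{
  "network": {
    "servers": [ "logstash.example.com:5000" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    {
      "paths": [ "/var/log/syslog", "/var/log/auth.log" ],
      "fields": { "type": "syslog" }
    }
  ]
}
```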

Linux love and hate

Straight to the point: Linux printing and CUPS has always sucked and still sucks. Every time I consider printing from a Linux desktop I have to ask: have they not fixed that mess yet?

When I send a job to a printer with instructions to use the full size of the page, I don’t expect the print to consist of the page scrunched up as small as possible in the top-left corner. Trying to add a printer using CUPS is as horrible now as it ever has been. I really don’t care about all the different possible protocols and whatnot: on Windows and Mac, printing just works.

In a similar vein, I had fun and games trying to scan a document on a network printer. Now, granted, this is quite a pain on Windows and Mac too, requiring a Photoshop install to acquire the scan; Gimp on Linux is a non-starter (even with an xsane package). I found a whole bunch of packages (xsane, sane-backends, sane-backends-drivers-scanners) and a reference to /etc/sane.d/epson2.conf, but running xsane was still throwing device errors. When the basic stuff is this difficult, it’s no wonder desktop Linux will never catch on.

I believe the command ought to be,

xsane epson2:net:

but nowhere could I find an example of the actual command to use to connect to the scanner. There’s too much guessing until something works, so you don’t know exactly what is required for a working solution and are unlikely to be able to repeat it seamlessly in the future. At least the scanned document actually matched the original. It’s depressing to see that these fundamental desktop operations are as unpleasant now as they were 15 years ago, and it doesn’t look like they’ll improve anytime in the next 15.
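For the record, the pieces that eventually seemed necessary — the IP address here is a placeholder for whatever the printer actually has:

```shell
# tell the epson2 backend where the network scanner lives
echo "net 192.168.1.50" >> /etc/sane.d/epson2.conf

# confirm sane can now see it; this should list an epson2:net:... device
scanimage -L

# then point xsane at the reported device name
xsane epson2:net:192.168.1.50
```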

Enabling webcam on Fedora 20

Decided to try and be brave with grabbing a photo to upload to the address book.

dmesg was reporting the following error:

[ 12.101118] uvcvideo: Found UVC 1.00 device <unnamed> (05ca:1839)
[ 12.101567] uvcvideo: UVC non compliance - GET_DEF(PROBE) not supported. Enabling workaround.
[ 12.101942] uvcvideo: Failed to query (129) UVC probe control : -32 (exp. 26).
[ 12.101945] uvcvideo: Failed to initialize the device (-5).

A quick G-search for ‘uvcvideo sony’ turned up a fix, and after installing the libusb-devel.i686 and glib-devel.i686 packages (I’m on a 32-bit laptop) followed by,

r5u87x-loader --reload

it did the trick. Installed cheese to grab the image from the webcam.

Note, however, that the following would probably have been a bit simpler,

# yum search uvcvideo
Loaded plugins: langpacks
============================ N/S matched: uvcvideo =============================
libwebcam.i686 : A library for user-space configuration of the uvcvideo driver
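Once the firmware loader has run, a quick sanity check that the kernel actually registered a capture device (v4l2-ctl comes from the v4l-utils package — an assumption, I used cheese directly):

```shell
# a /dev/video* node should now exist
ls -l /dev/video*

# list the detected capture devices and their driver
v4l2-ctl --list-devices
```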

Giving up on rails on Docker

Having gone through the process of preparing a Rails image from a ruby build, it has ballooned out to over 1.2GB even after removing the compiler and associated packages.

I even tested running a bundle update to identify the gems that need native compilation: mysql2, bcrypt, therubyracer.

And then… And then, after downloading the application, running the bundle update borked because bcrypt 3.1.9 was needed as a dependency instead of the 3.1.10 installed. Given the time it takes to compile gems, it’s not reasonable or sensible to try and stay on top of deploying rails applications to a container.
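One way to dodge this version drift, assuming the application ships a Gemfile.lock, is to install exactly the locked versions rather than resolving anew inside the image:

```shell
# install precisely what Gemfile.lock specifies; fail rather than re-resolve
bundle install --deployment --without development test
```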

Okay, so we’ll try to do a WordPress deployment instead.

Docker image building – a time-consuming process

I have to use the following construct to ensure that the Ruby environment is set up for building gems, and it’s a recommended practice for Docker,

RUN /bin/bash -c "source /usr/local/rvm/scripts/rvm \
 && gem update --system --no-rdoc --no-ri \
 && gem update --no-rdoc --no-ri \
 && gem install --no-rdoc --no-ri bundler \
 && gem install --no-rdoc --no-ri libv8 \
 && gem install --no-rdoc --no-ri mysql2"

One implication of this is that it is an atomic operation: when you discover that the libmysqlclient-dev package is missing and the build needs to be run again, there’s no cache to fall back on. Ruby takes 5 hours to build on an r-pi; libv8 takes 2 hours. This is not quick-turnaround stuff for a background build, although it will really improve Rails container deployment times. Always a tradeoff.
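One mitigation — a sketch assuming the same rvm-based image — is to move the slow, stable steps into their own RUN layers ahead of anything likely to change, so a missing system package only invalidates the layers below it:

```dockerfile
# system build deps first: changing these invalidates everything below
RUN apt-get update && apt-get install -y libmysqlclient-dev

# the multi-hour native gem builds get their own cached layer
RUN /bin/bash -c "source /usr/local/rvm/scripts/rvm \
 && gem install --no-rdoc --no-ri libv8 \
 && gem install --no-rdoc --no-ri mysql2"

# the quick, more volatile steps come last
RUN /bin/bash -c "source /usr/local/rvm/scripts/rvm \
 && gem update --system --no-rdoc --no-ri \
 && gem install --no-rdoc --no-ri bundler"
```

Each RUN becomes its own cache entry, so a failure in the final layer no longer costs the 7 hours of compilation above it.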