Three points on the costs of COTS

It seems to be quite popular to move away from custom-built IT solutions to so-called COTS – commercial off-the-shelf – solutions. The idea being that the software fulfils a functionality which has long been commoditized and standardized to such an extent that it offers no “competitive edge” nor core value to the business.

For most companies and organizations the office suite would be a pretty safe bet for a piece of software which is magnificently suited for a COTS solution. Finding someone who develops a word processor in-house seems crazy, as so many fully capable solutions exist in the market.

As time passes, more and more software seems to fall into the commoditized category, and custom solutions are replaced by standard solutions which serve the same areas adequately and capably.

The drive to COTS software seems to be a hard challenge for many organizations, as the primary driver in most COTS adoption projects seems to be pressure from the accountants and a mistrust of the IT department’s ability to choose and deliver the best-fit solutions for the rest of the business.

The number of failed Microsoft Office implementations seems fairly small, yet the number of failed ERP projects seems endless. The scope of this post is not to address when nor how to choose COTS solutions, but just to make the point that the choice of COTS is often naive and not fully understood ahead of the decision itself.

  • When adopting COTS you’re tied to the options and customizations offered by the chosen COTS software. You should never expect to be able to force the solution to adapt to your organization and processes, but instead be prepared to adapt the organization and processes to fit within the options offered by the chosen software.
  • Choosing COTS is a strategic commitment to the vendor of the software within the scope the COTS solution is adapted to fit within the organization. Once implemented, the adopting organization is often committed to follow the roadmap and direction the vendor chooses – as switching to another solution is often a large and challenging project.
  • When adopting COTS you’re committing to follow along. All versions of software have a limited “life cycle”, and as new versions are released you’re expected to follow along – at the pace that’s suitable for your organization and within the roadmap offered by the vendor (in terms of support and upgrade paths).

While COTS software seems like a cheap and easy solution to many areas within an organization, the three points above are forgotten too often and cause problems with the standard COTS solutions again and again.

Coming back to Microsoft Office, it seems all organizations are more than capable of staying within the possibilities offered by Word, Excel and “friends”. As Office documents seem to be the standard exchange format, there is an implicit drive to move the organization to current versions of the software and the new options offered by new versions.

When COTS implementations fail, it often seems that organizations are unwilling to adapt to the options offered by the chosen COTS software – thus breaking the core idea of COTS as a commoditized solution.

Many organizations also seem to forget the commitment to follow the COTS vendor, and often end up using dangerously outdated software versions, as no budget exists to upgrade, or too many customizations have been made (see the points above) for an easy upgrade to current versions.

While COTS may offer solutions to many areas in the organization, please be warned – there is no free lunch. COTS does not only come with an initial implementation price – it also comes with commitment.

Bulk conversion of webp files to png format

Google has come up with a nice new image format called webp. Currently support for this format is fairly limited, so if you need to use webp images elsewhere it might be nice to convert them to a more widely supported format. To do the conversion, Google has made a small tool available called dwebp. The tool, however, only seems to support conversion of a single image, not a batch of images.

Using regular command line magic it’s easy though. Download the tool, pair it with the find and xargs commands, and you should quickly be on your way. If all the webp files needing conversion to png are in a single directory, simply do this:

find . -name "*.webp" | xargs -I {} dwebp {} -o {}.png

It finds all webp files and converts them one by one. If the initial file was named image.webp the resulting file will be called image.webp.png (as the command above doesn’t remove the .webp but only appends .png at the end).
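
If you’d rather avoid the double extension, a small shell loop can strip it – a sketch using bash parameter expansion to replace the .webp suffix with .png:

for f in *.webp; do
  # "${f%.webp}.png" removes the .webp suffix before appending .png
  dwebp "$f" -o "${f%.webp}.png"
done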

The command assumes the dwebp program is available in your PATH. If this isn’t the case, you need to specify the complete path to the program.

Watching your Raspberry Pi

So I’ve installed a Raspberry Pi and it’s been running smoothly day in, day out. I’d like it to stay that way, but as the server runs it gathers lint in log files, databases grow, and I’d like to know how CPU and memory are utilized over time – so I was looking for a tool which could help me keep an eye on all this.

As fun as it might be to build your own solution, I’ve learned to appreciate ready-to-use solutions, and it seems a nice little tool is available called RPi-Monitor. Assuming you run Raspbian, RPi-Monitor is available as a package ready to install through the standard package manager (once you’ve added the package repository).
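
Assuming the project’s package repository has already been added as described on the RPi-Monitor site (and assuming the package name below – check the site if it differs), installation should be as simple as:

# Refresh the package lists, then install the monitor package
apt-get update
apt-get install rpimonitor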

RPi-Monitor installs a web server on port 8888 and gives you a nice overview of key resources – CPU, memory, disk and more – and even historical graphs are available.

RPi-Monitor is free, but if you find it useful, do consider donating to the author on the website.

Using (Google) Calendar for domains

Here’s a little trick which has proven itself just as useful as it is easy. To most companies handling domains is a critical task, as losing your domain name may have catastrophic consequences. Handling domains isn’t particularly hard, but there are some tasks that are time-critical and must be handled in due time – luckily Google Calendar provides an easy way to help make sure these tasks are handled.

(In this little tip, I’m using Google Calendar as the reference, but Office365 or any other online calendaring system can probably do the same.)

Set up a new Google Calendar on an existing Google Account and call it “domains”.

Whenever a domain name is bought or renewed, make a new entry in the calendar on the expiry date of the domain. Note the domain name in the subject of the calendar entry, and if you buy domains at various registrars note any details needed (but not confidential) in the description field.

The next step is to remove the default pop-up notification and add email notifications instead. Choose which warning horizons you’d like – e.g. 1 month, 1 week and 48 hours – and Google will let you know when the renewal is coming up.

The final step is to invite anyone else who needs to be notified of the domain expiry to the appointment, and make sure their notifications are also set up with the warning horizons they like.

… also applicable to certificates

The calendar notifications can also be utilized for SSL / TLS certificates. When buying or renewing certificates make an entry on their expiry date and set up notifications as described above. This way you should be able to ensure your users never see an expired certificate again.
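
If you need to look up a certificate’s expiry date for the calendar entry, openssl can print it – a sketch assuming a PEM-encoded certificate file and an example hostname:

# Print the expiry date of a local certificate file
openssl x509 -in certificate.pem -noout -enddate

# Or check the certificate currently served by a site
echo | openssl s_client -connect example.com:443 2>/dev/null | openssl x509 -noout -enddate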

Beware of DNS

For some time the server running this site had been acting up. Page loads were slow, access through SSH seemed lagging and something was absolutely misbehaving.

I’ve been trying to figure out what exactly was going on, but nothing really made sense. There was plenty of disk space, memory was reasonably utilized (no swapping) and the CPU load seemed to be less than 0.1 at any time – there was no good reason the server was “turtling” along at such a perceived slow pace.

Thanks to a tip from Henrik Schack, the server is now running at full speed again. It turned out that one of the DNS resolvers used by the machine was in a bad state, and slow, unreliable or dysfunctional DNS causes trouble all sorts of places. The fix was quite easy: the file /etc/resolv.conf was updated to contain the IPs of the Google Public DNS servers, and once the file was saved things were back to the rapid normal.
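
For reference, the updated /etc/resolv.conf simply lists the Google Public DNS addresses:

# /etc/resolv.conf
nameserver 8.8.8.8
nameserver 8.8.4.4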

All computers really need solid, fast DNS servers these days – be it servers or workstations – as auto-updates and the use of “cloud resources” of various kinds must have working DNS to reach the destinations they need. If your system starts acting up without any reasonable explanation, DNS could be an easy place to start checking.

Viewing EML files

As mails bounce around, some email programs (I’m looking at you, Microsoft) seem to package forwarded mails as attachments with the extension .eml.

On Linux…

While Mozilla Thunderbird should be able to read them (as should Evolution), that requires you to have the mail application available on your machine – and I haven’t; I’m doing just fine with GMail in the browser. So far the best solution I’ve found – assuming the files are trivial, non-sensitive and non-personal – is an online viewer, which seems to work pretty well. My preferred solution is the free one from encryptomatic. It handles the mails quite nicely, it restores the formatting to something quite readable and it even handles embedded images and attachments within the eml-file.

On Windows…

If you’re using Windows Live Mail or any other mail application running on Windows, it can probably handle the .eml files. Another option is to look for an app, as there seem to be several apps on Windows which render .eml files with no issues.

A little trick (with a browser)

When using Windows – even in a VirtualBox – there’s an easy little trick you can use: save the file, simply rename the file extension from “.eml” to “.mht” and open the file with Internet Explorer. It should render perfectly.

Once the .eml file is renamed to .mht, Google Chrome and Firefox seem able to render the contents too – though the handling of images and attachments seems much less graceful.

Updating Viscosity certificates (on Mac OS X)

When using Viscosity to connect to a corporate network or any other OpenVPN server, you’re probably using certificates with a reasonable lifetime, but sometimes the certificate expires and needs to be updated. Replacing the certificate files through the Viscosity interface is quite easy – just edit the connection and replace the certificate files in the appropriate tab.

There is however another little trick which may need to be applied before the new certificates work. Viscosity offers to save the certificate password in the Keychain, and I chose to use this feature, which caused a bit of trouble when updating the certificate. While it ought to, Viscosity does not clear the saved password when the certificate is changed, so to get prompted for the new password you need to go into the Keychain Access tool and delete the stored password.

Look for an entry for the Viscosity connection’s private key password and delete the occurrence.

[Screenshot: the stored Viscosity password entry highlighted in Keychain Access]


Connection debugging tip

Viscosity provides a detailed log, which makes it much easier to debug connection issues. In the OS X menu bar, right-click the Viscosity icon, then choose “Details”. This opens a details window with a button bar; the button to the right allows you to see a fairly detailed log of what Viscosity is doing, and provides clues on what to fix. In my case it was a wrong certificate password issue (“private-key-password-failure”).


Sending mail from a droplet

As stated earlier this site is now running on a DigitalOcean droplet. A droplet is basically the same as having a “real server”, and a bare-bones machine isn’t born with the ability to handle email – neither receiving nor sending. As a number of web apps require the ability to send mail, I had to set up facilities on the server (or droplet) to handle it.

The “default” way to do this would probably be to install sendmail or postfix, as they are full-featured mail servers, but configuring a mail server and keeping it secure and updated is a nightmare I’d like to avoid. Therefore it was time to look for another option.

Enter msmtp

msmtp is an open-source, light-weight solution which allows your server to send email – or, as the project itself describes it:

In the default mode, it transmits a mail to an SMTP server (for example at a free mail provider) which takes care of further delivery.

msmtp project homepage

There are several ways msmtp can be set up, but in this post I’ll just cover the two basic scenarios.


If you have an SMTP server available: Your hosting provider or someone else may provide you with access to a full-featured SMTP server. If this is the case, you can configure msmtp to pass all mail on to that server like this:

# smtp server configuration
account smtp
# Replace with the hostname of the SMTP server you have access to
host smtp.example.com
port 25
# Default account to use
account default : smtp

As you’re talking to a “real” SMTP server all options and features should (potentially) be available to you.

If you have a Google account – either a regular Gmail account or a Google Apps account will do just fine – you can configure msmtp to use the Gmail SMTP server with this configuration:

# Gmail/Google Apps
account gmail
host smtp.gmail.com
port 587
# Replace with your actual Gmail address and password
from your-account@gmail.com
user your-account@gmail.com
password enter-password-here!
auth on
tls on
tls_trust_file /etc/ssl/certs/ca-certificates.crt
# Default account to use
account default : gmail

In the example above you need to change “your-account@gmail.com” to an actual Gmail account, and you need to change “enter-password-here!” to the password belonging to the specified Gmail address.

Using Gmail, all mail passed on from msmtp will be sent from the account used in the configuration, and there doesn’t seem to be a way to override this. You may therefore opt to create a specific mail account for this use. You can set a custom Reply-To header in the mails passed through the Gmail SMTP server, which in many cases may help ensure replies get to a proper recipient.
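
A sketch of what such a message could look like (the addresses are hypothetical) – replies will then go to the Reply-To address rather than to the sending account:

To: recipient@example.com
Reply-To: support@example.com
Subject: Notification from the server

Body of the notification goes here.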

If your site has adopted DMARC, this may not be a suitable option (at least not on the free tier), as Gmail doesn’t support custom signing and doesn’t offer dedicated IP addresses for your SPF records.

Testing 1, 2, 3…

Once you’ve set up the msmtp configuration file, it’s time to do some testing. Create a text file called “testmail.txt” with this content:

Subject: Subject for test mail
This is the body content for the test mail.

Then run this from the command line, changing the address to your own actual email address:

cat testmail.txt | msmtp your-address@example.com

You should receive your test mail shortly.
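
If nothing arrives, msmtp can print the whole SMTP conversation, which usually reveals authentication or TLS problems – same command, with debug output enabled:

cat testmail.txt | msmtp --debug your-address@example.com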

Setting up an alias

Many Unix/Linux tools and apps seem to assume that you have sendmail installed and available at /usr/bin/sendmail or a few other locations in the file system. To handle these cases easily, you can create a symlink pointing the sendmail name to the msmtp binary like this (the examples should cover most cases):

ln -s /usr/bin/msmtp /usr/sbin/sendmail
ln -s /usr/bin/msmtp /usr/bin/sendmail
ln -s /usr/bin/msmtp /usr/lib/sendmail

Depending on which package manager your installation uses, it may automatically set up these symlinks, so do check if they exist before trying to create them.
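
A quick way to check – this lists any of the usual sendmail locations that already exist and what they point to:

ls -l /usr/sbin/sendmail /usr/bin/sendmail /usr/lib/sendmail 2>/dev/null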

Setting up with PHP

If you made the symlinks as suggested above, it may already work, but you should make the following changes, just to keep things clean and transparent.
Find all applicable php.ini files (you probably have one for the web server and another for the command line).
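
The PHP CLI can tell you where its own configuration lives (the web server’s php.ini is typically in a neighbouring directory, or visible through phpinfo()):

php --ini | grep "Loaded Configuration File"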

Add or change the line (the -t flag makes msmtp read the recipients from the message headers, just as sendmail would):

sendmail_path = "/usr/bin/msmtp -t"

Now for some testing. Add a file with the following content (change the example-address to your own):

<?php mail("your-address@example.com", "test", "test"); ?>

Now, call the file from the command line using the PHP CLI, and then call the file through the webserver. In both cases you should receive an email shortly.

Another suggestion…

Apart from running sendmail or postfix, there also seems to be an application similar to msmtp called ssmtp, which offers many of the same features.

Server setup: Setting up a firewall

A firewall is a basic filter that can provide effective protection for your server by only allowing traffic in and out as the rules of the firewall permit. Setting up a firewall on an Ubuntu Linux server does not need to be complicated – in fact the one used in this example is called “uncomplicated firewall” (ufw).

To get the firewall up and running, make sure it’s installed through the package manager. Log in and switch to a root shell, then install the firewall with this command:

apt-get install ufw

If everything goes okay, the firewall is installed but neither configured nor enabled.

Firewall Configuration

I find the easiest way to manage the firewall is through a little script in the root home directory. To begin with, the script could look something like this:

#!/bin/sh
ufw reset
ufw allow from 203.0.113.1
#ufw allow ssh
ufw enable
ufw status

Line 2 resets any existing configuration rules in the firewall.

In line 3 you should change the placeholder IP to your own fixed IP address if you have one (you really ought to). This line will allow any traffic from your IP address into the server (assuming there is something able to receive it, naturally).

If you don’t have a fixed IP address, line 3 should be removed and line 4 uncommented and used instead. It allows SSH connections from any outside IP address to knock on the door – we’ll then rely on the SSH daemon (and its configuration) to reject any unwanted visitors knocking on the server.

Line 5 enables the firewall and line 6 prints a list of the current status and configuration of the firewall.

Depending on what you are using your server for, you’ll probably need a few more lines in the firewall script. If you’re running a webserver, you should at least add a line (just above the “ufw enable” line) allowing web traffic to reach the server:

ufw allow www

Are you using https on your webserver? – then you need to allow that too:

ufw allow https

The simple allow lines above are suitable for “publicly accessible services” – things the whole world should be able to use. If you need more fine-grained rules, UFW allows for that too. The community documentation on UFW over at the Ubuntu site is quite helpful.
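
As an example of a more fine-grained rule (the subnet and port here are hypothetical), you could allow only a specific subnet to reach a database port:

# Allow MySQL (port 3306) only from a trusted subnet
ufw allow from 203.0.113.0/24 to any port 3306 proto tcp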