Karls dev-blog

Creating LESS mixins from Fontello config

• fontello, javascript, less, and node

Fontello is a great service for managing icon fonts for a project. For those who haven’t used it: it lets you cherry-pick icons from a range of different font collections into your own custom font. It also makes it really easy to upload your own SVG icons and include them in your custom font.

Creating your own custom font with just the icons you need has several advantages. First of all it reduces the download size to just the needed glyphs (instead of downloading the whole of Font Awesome, for example).

Secondly, it gives you a larger selection of icons, even though mixing a lot of icon sets might not look that good. But not having a thousand generic icons readily available also forces you to actually look for a good icon instead of just taking whatever is there. It might take a bit more time, but in the end it will hopefully lead to a better icon selection.

Exporting from Fontello

When exporting from Fontello you get a zip file with the following structure (most of the CSS files omitted for brevity).


Importing fontello.less (and probably updating the font paths) is all you need to get it up and running, which is great for some projects. For larger projects though, we’ve always resorted to using mixins for icons, and our designer manually updated the mixins every time we added a new icon. This of course increased the time it took to add a new icon by a lot.

As per usual, when someone is doing something manually, it can often be automated if they just ask. And within a development team there is usually someone who can do it.

Converting the Fontello config to a LESS mixin

Create a new file in your project called fontello-to-less.js and paste the code below into it. Adjust the import of the font config on line 3 if it doesn’t match up with your workspace.

/* global require, module */
const fs = require('fs');
const config = require('../fonts/config');

// This method reads the Fontello config and outputs
// the icons as LESS mixins
const fontelloToLess = function fontelloToLess(outputPath) {
  const lineBreak = '\r\n';
  let output = '#icons {' + lineBreak;
  for (let glyph of config.glyphs) {
    output += `  .${glyph.css}() {` + lineBreak;
    output += `    content: '\\${glyph.code.toString(16)}';` + lineBreak;
    output += `  }` + lineBreak;
  }
  output += '}';

  fs.writeFileSync(outputPath, output);
};

module.exports = fontelloToLess;
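The only mildly cryptic part is the content escape: LESS (like CSS) wants each glyph as a hex escape such as `'\e800'`, while Fontello stores the code point as a plain integer. A quick sketch of that conversion, using made-up glyph entries that mirror the shape of `config.glyphs`:

```javascript
// Made-up glyph entries mirroring the shape of Fontello's config.glyphs
const glyphs = [
  { css: 'arrow-left', code: 0xe800 },
  { css: 'heart', code: 0xe801 }
];

// Each integer code point becomes a CSS content escape such as '\e800'
const toEscape = glyph => `\\${glyph.code.toString(16)}`;

const escapes = glyphs.map(toEscape);
```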

Then either create a new file to invoke the task like below or adjust it to make it a part of your build pipeline.

For a simple invocation via node it could look like this (let’s call this file build.js).

const fontelloToLess = require('./fontello-to-less');

fontelloToLess('./styles/fontello-icon-definitions.less');

Then call it like node build.js.

This will output a LESS file that looks like this in ./styles/fontello-icon-definitions.less.

#icons {
  .arrow-left() {
    content: '\e800';
  }
}

Which makes the icons a lot easier to work with, if you ask me. I’ve used this in two projects so far, one with a simple Grunt pipeline and one with an Ember CLI pipeline, and I’ll probably use it in a lot of future projects as well.

All the little things - OpenSearch

• opensearch, c#, aspnetcore, saga, json-ld, and schema.org

Since a huge thing in Saga is its search engine, adding an OpenSearch endpoint was a given. This might seem like a small thing and the average internet user won’t know what it is but a lot of people appreciate it being there.

So what does it get you? Well first of all it is a standardized way to declare how to search your website, which is always a good thing since it makes your site easier to understand for browsers and other services.

Then there is the user facing parts. In Firefox you get the option to add a new search provider when visiting a site with OpenSearch.

Add search provider in Firefox

After adding the search provider, typing a search query into the address field will allow you to search directly on the site by using the icons at the bottom of the dropdown.

Execute a search provider in Firefox

Chrome goes even further and automatically loads search providers when visiting a site for the first time. It will then be integrated into what they call the “Omnibox” so that it can be used quickly with only the keyboard.

Chrome Omnibox

So as we see this is something that we really want to have, especially if we have a site where searching is a huge part of the reason to visit it.

Implementing the OpenSearch endpoint

If adding this to a regular website it would be enough to just put a static XML file somewhere on the site and link it in the header, but for a CMS like Saga we of course needed to generate this dynamically. Since we are using ASP.Net Core in Saga, the easiest way to do it was a controller action.

public class OpenSearchController : Controller {
    private readonly ISiteSettings _settings;

    public OpenSearchController(ISiteSettings settings) {
        _settings = settings;
    }

    public ActionResult Status() {
        var xml =
            $@"<OpenSearchDescription xmlns=""http://a9.com/-/spec/opensearch/1.1/"" xmlns:moz=""http://www.mozilla.org/2006/browser/search/"">
<Image height=""64"" width=""64"" type=""image/png"">{_settings.SiteUrl}/images/favicon.png</Image>
<Url type=""text/html"" method=""get"" template=""{_settings.SiteUrl}/search?query={{searchTerms}}"" />
<Url type=""self"" template=""{_settings.SiteUrl}/open-search"" />
</OpenSearchDescription>";

        return Content(xml, "application/opensearchdescription+xml");
    }
}
While this could have been done by building an XDocument instead, which would be safer, it was a quick fix early in development that has worked fine since.

So then we just need to wire it up with a link-tag in the header template.

<link rel="search" type="application/opensearchdescription+xml" href="/open-search" title="The Local Library" />

And then we are done! Next time you visit the site the OpenSearch stuff will be available.

Going the extra mile

So while OpenSearch is great, structured data was also something we used quite a bit in Saga. I might talk more about that in a later post but for now, the website search snippet is the interesting part.

It is as simple as adding this snippet to your layout template.

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?query={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>


So this is just one of the small things we did to make Saga a great CMS. I’ve had the chance to ask both our customers and their users about this, and while most don’t seem to know about it, they are all both surprised and excited when they find out.

The great Saga

• saga

Saga is a product that started development mere weeks after I joined Open Library Solutions back in 2013. We were a small team, at no time more than four developers and one designer, but for the most part just two and a half developers. The core team has been me and two colleagues since the start.

The goal was to create a new and more modern version of the company’s older product CSLibrary, a website with integrated library catalog search and other regular CMS functions to publish content.

We spent a few weeks evaluating different content management systems to base it on. We made a huge list of features from the old product that we needed to be able to support. Below are some of the more complex ones.

In the end we settled on a completely custom solution; most systems we evaluated either didn’t have all the features we needed or were way too hard to customize and then upgrade.

Now, six years later, we have a fully custom CMS tailored to our exact needs. We have advanced widgets with entirely customizable editor UI; we support multiple languages and have connections between pages in different languages; we have full control to publish content at certain times and archive it at later times; and so on. The admin interface as well as the API is part of the product deployment, while each custom website is its own git repository that is simply merged with each new release when upgrading.

The name Saga was chosen after a year or so of development. We went through a bunch of names but felt that Saga was a good fit since it works across languages, it can be related to libraries and it just sounds good.

It took us almost three years to get our first customer up and running (with a small detour of a couple of months for another project), but in the end we delivered something amazing. And now, another three years later, there are hundreds of libraries across Sweden, Norway and Finland using Saga.

Familjen Helsingborg running on Saga 3.0

Above is the site for Familjen Helsingborg, a collaboration of 11 municipalities in Sweden, running on Saga 3.0. The main layout of the website is similar to a standard Saga site, but they have some things of their own going on as well. It is one of the more advanced customers as of writing and they have some really good editors; the site might turn up from time to time in other posts as an example.

Time for a change

• saga

Back in January the company I have worked for the past 6 years was bought by the only real competitor on the market. While we spent almost all those 6 years building our product Saga, a specialized CMS for public libraries, our competitor threw together a product based on a generic CMS and called it a day.

So while I’m out searching for a new job I’m also a bit saddened to leave Saga behind. We did some great things with our small team over the years, some basic and some quite interesting, which is what made Saga exceptional for its purpose.

So my hope here is to write a series of posts describing some of the little things we’ve added to Saga over the years of development. Since this is a proprietary product I won’t be able to show much actual code for things that aren’t already public (such as client-side JavaScript), but I can still talk about the ideas and concepts.

Let’s try Jekyll for a while!

• all-the-little-things

Seems like it’s time to move to a new blog platform again. I really enjoyed Ghost but there was something seriously wrong with my setup.

My old setup was Ghost hosted on Heroku’s free tier with binaries stored in AWS. It was fine when I set it up, but then Heroku limited their free tiers and the AWS plugin started having problems uploading images.

So enter Jekyll and GitHub Pages!

This blog is now completely hosted in a GitHub repository based on a simple theme called Contrast forked from another GitHub repository. No need for advanced setups or hosting providers!

It took me about 30 minutes to get going. I installed Ruby, installed Jekyll via gem, forked the theme repo and cloned it, then just ran jekyll serve --watch from my terminal and I was up and running. So far it’s been a blast, and since my old posts were already written in Markdown I just copied them over, updated the syntax highlighting markup, and I was back!

Hopefully this ease of everything will encourage me to write some more; I’ve got some fun stuff lined up that might be interesting.

A tale of debugging ASP.Net 5 and HttpPlatformHandler

• c#, aspnetcore, and httpplatformhandler

Had an interesting problem at work this morning. We were ready to deploy a new version of our ASP.Net 5 website to our staging environment, and everything was looking fine both locally and on our development server. We merged it to our staging branch and pushed it to our build server, and everything was looking fine until we tried to browse to the site and got this:


Not much to go on there, something obviously crashed but that’s it. So how do we proceed here?

First off, the Event Viewer in Windows is always a good place to start when you want to know what happened in IIS, so let’s start there. Looking in the Application log, we could see something like this.

The description for Event ID 1001 from source HttpPlatformHandler cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer. If the event originated on another computer, the display information had to be saved with the event. The following information was included with the event: Process ‘5624’ started successfully and is listening on port ‘5225’.

Well that’s swell; it seems like the platform handler managed to start the dnx process and was happy with it, but no other information was available. Seems like the Event Viewer might not be the “go to place” it used to be when running via a platform handler.

After this I started checking all other parts of our system to see if there was something wrong with the API, since we fetch some settings from it during startup, which might end badly if the API wasn’t responding. This turned up empty though; everything else was running fine, so the culprit was somewhere in our ASP.Net 5 site after all.

What I then found was that the platform handler could log the stdout of the started process. Open the web.config located in wwwroot of your deployed site and change stdoutLogEnabled to true.

<handlers>
  <add name="httpplatformhandler" path="*" verb="*" modules="httpPlatformHandler" resourceType="Unspecified" />
</handlers>
<httpPlatform processPath="..\approot\web.cmd" arguments="" stdoutLogEnabled="true" stdoutLogFile="..\logs\httpplatform-stdout" startupTimeLimit="3600"></httpPlatform>

When I then tried to browse to the site again I got this in my logfile.

error   : [Microsoft.AspNet.Hosting.Internal.HostingEngine] Application startup exception
System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.ArgumentException: 
Parameter name: value
   at Microsoft.AspNet.Http.PathString..ctor(String value)
   at Microsoft.AspNet.Builder.ExceptionHandlerExtensions.UseExceptionHandler(IApplicationBuilder app, String errorHandlingPath)
   at MorbidFox.Web.Startup.ConfigureStaging(IApplicationBuilder app, ILoggerFactory loggerFactory)
   --- End of inner exception stack trace ---
   at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
   at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)
   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
   at Microsoft.AspNet.Hosting.Startup.ConfigureBuilder.Invoke(Object instance, IApplicationBuilder builder)
   at Microsoft.AspNet.Hosting.Internal.HostingEngine.BuildApplication()

It’s not much, but at least we got a stack trace pointing us to Startup.ConfigureStaging and the UseExceptionHandler call.

Here are our ConfigureStaging and ConfigureProduction methods from Startup.cs.

//This method is invoked when ASPNET_ENV is 'Staging'
public void ConfigureStaging(IApplicationBuilder app, ILoggerFactory loggerFactory) {
    // StatusCode pages to gracefully handle status codes 400-599.
    app.UseStatusCodePages();
    app.UseExceptionHandler("error"); // note: no leading slash (path shortened for illustration)
}

//This method is invoked when ASPNET_ENV is 'Production'
public void ConfigureProduction(IApplicationBuilder app, ILoggerFactory loggerFactory) {
    // StatusCode pages to gracefully handle status codes 400-599.
    app.UseStatusCodePages();
    app.UseExceptionHandler("/error");
}

So what caused this hard-to-track-down error? Apparently we were missing a / at the start of the url passed to app.UseExceptionHandler in our setup for the staging environment, and that caused it all to crash and burn. Since the string is internally converted to a PathString, it has to start with a / or you get an ArgumentException with the message “Parameter name: value”. I added the missing /, tested it locally, pushed it to the staging branch, and everything was up and running again.

Lazy loading external scripts in Ember

• ember and javascript

When building a large Ember application you might get to a point where you feel the initial download size is just too large. In my current project we have almost 2 MB of JavaScript once all the templates and extra utilities are compiled and bundled. It’s an internal application, so 95% of the time speed won’t be a problem, but as developers we strive to deliver a great experience for all our users.

Looking through our dependencies in bower.json I quickly found two huge offenders. We use TinyMCE as the base for our “WYSIWYG” editor and vis.js to render a nice timeline. Both of these add a few hundred kilobytes each to the vendor.js that all first-time users must download.

So how can we avoid adding these dependencies to vendor.js but still use them in our application? This turned out to be quite easy: simply add them to the DOM in the route’s model hook using a promise. I created two methods, one for JS and one for CSS, and put them in a small utility script which can be seen below.


import Ember from 'ember';

export var loadScriptResource = function loadScriptResource(uniqueName, path) {
  return new Ember.RSVP.Promise(function(resolve, reject) {
    // Resolve immediately if the script has already been added
    if (document.getElementById(uniqueName)) {
      return resolve();
    }

    var element = document.createElement('script');
    element.id = uniqueName;
    element.src = path;
    element.addEventListener('load', function() {
      resolve();
    });
    element.addEventListener('error', function() {
      reject(`Failed to load ${uniqueName} (${path})`);
    });
    document.head.appendChild(element);
  }, `Loading external script resource ${uniqueName} (${path})`);
};

export var loadStyleResource = function loadStyleResource(uniqueName, path) {
  return new Ember.RSVP.Promise(function(resolve, reject) {
    // Resolve immediately if the stylesheet has already been added
    if (document.getElementById(uniqueName)) {
      return resolve();
    }

    var element = document.createElement('link');
    element.id = uniqueName;
    element.href = path;
    element.rel = 'stylesheet';
    element.addEventListener('load', function() {
      resolve();
    });
    element.addEventListener('error', function() {
      reject(`Failed to load ${uniqueName} (${path})`);
    });
    document.head.appendChild(element);
  }, `Loading external style resource ${uniqueName} (${path})`);
};

Then in the route where you need the script, you simply import the methods and make them part of your route model.


import Ember from 'ember';
import { loadScriptResource, loadStyleResource } from 'sauces/utils/external-resource-loader';

export default Ember.Route.extend({
  model: function(params) {
    var visjs = loadScriptResource('visjs', '/assets/visjs/vis.min.js');
    var viscss = loadStyleResource('viscss', '/assets/visjs/vis.min.css');

    return Ember.RSVP.hash({
      awesomeSauces: this.get('store').findAll('sauce'),
      visjs: visjs,
      viscss: viscss
    });
  }
});
The name you give a resource when loading it is used to identify it once loaded, so it won’t be loaded again; you can reuse the same name across several routes and it will still only be loaded once.
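For anyone curious about the dedup logic, here is a minimal sketch of the idea in plain JavaScript; a `Set` stands in for the DOM id lookup so it can run outside the browser, and all names are hypothetical:

```javascript
// A Set stands in for the document.getElementById(uniqueName) check
const loadedResources = new Set();

function loadOnce(uniqueName, doLoad) {
  // Already loaded under this name: resolve without loading again
  if (loadedResources.has(uniqueName)) {
    return Promise.resolve(uniqueName + ' (cached)');
  }
  loadedResources.add(uniqueName);
  doLoad();
  return Promise.resolve(uniqueName + ' (loaded)');
}

// Two routes asking for the same script: only the first triggers a load
let loads = 0;
loadOnce('visjs', () => loads++);
loadOnce('visjs', () => loads++);
```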

This change saved us about 600 kilobytes on the initial load, and for users who never watch the timeline vis.js will never be loaded at all.

Downloading Windows 10

• misc

I heard somewhere that the internet was a bit crowded yesterday with everyone downloading Windows 10. Can’t say that I noticed it when I got it myself.

Downloading Windows 10

For some reason I had a Swedish license for Windows 8.1, so I had to go through that to successfully update to Windows 10. The “Ta emot” label would read “Receive” if I had bothered to install the language pack before updating.

Installing an Add-in Express addin using NSIS

• nsis and add-in-express

This post was migrated from my old WordPress installation and later from my Ghost installation and was initially published back in September 2013.

I’m currently in the end phase of a really nice project using Add-in Express to create plugins for Microsoft’s Office suite. While it comes with support for generating a WiX installer, which works fine, I’ve always liked NSIS more and have good templates ready for my own branded installers.

Breaking apart the generated WiX installer script wasn’t that hard, thankfully. The important parts, i.e. registering the add-ins, were easy to extract and rely on adxregistrator.exe and AddinExpress.MSO.2005.dll, both available from the Add-in Express installation folder. Embedding those in my project and then calling adxregistrator.exe to do the actual registration made this simple enough. It’s important to keep adxregistrator.exe after the installation though, since it’s needed to uninstall the application afterwards.


Section "Office Addins"
  ; To make this check you need the FindProc-plugin for NSIS and the
  ; macro defined below in the article.
  !insertmacro CheckAppRunning "OUTLOOK" "OUTLOOK.EXE" "Microsoft Outlook"
  !insertmacro CheckAppRunning "EXCEL" "EXCEL.EXE" "Microsoft Excel"
  !insertmacro CheckAppRunning "WORD" "WINWORD.EXE" "Microsoft Word"
  !insertmacro CheckAppRunning "POWERPOINT" "POWERPOINT.EXE" "Microsoft PowerPoint"
  SetOutPath "$INSTDIR"
  ; We need a few files from the Add-in Express folders, you might need
  ; to update the paths to match your system though.
  File "C:\Program Files (x86)\Add-in Express\Add-in Express for .NET\Redistributables\adxregistrator.exe"
  File "C:\Program Files (x86)\Add-in Express\Add-in Express for .NET\Bin\AddinExpress.MSO.2005.dll"
  ; We need all outputted files from the project, make sure that you have registered
  ; the project with Office at least once so that the adxloader-dlls are available
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\adxloader.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\adxloader.dll.manifest"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\adxloader64.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\Microsoft.Office.Interop.Excel.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\Microsoft.Office.Interop.Outlook.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\Microsoft.Office.Interop.PowerPoint.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\Microsoft.Office.Interop.Word.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\Microsoft.Vbe.Interop.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\Office.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\Shorthand.Core.OfficeAddins2.Addin.dll"
  File "..\Shorthand.Core.OfficeAddins2.Addin\bin\Release\Shorthand.Core.OfficeAddins2.Addin.tlb"
  ; Finally we call adxregistrator.exe with the path to our dll to register it on the target system
  ExecWait "$INSTDIR\adxregistrator.exe /install=$\"$INSTDIR\Shorthand.Core.OfficeAddins2.Addin.dll$\" /privileges=user /returnExitCode=false"
SectionEnd

Section Uninstall
  ; And of course, when we run the uninstaller we call adxregistrator.exe again to unregister the addin
  ExecWait "$INSTDIR\adxregistrator.exe /uninstall=$\"$INSTDIR\Shorthand.Core.OfficeAddins2.Addin.dll$\" /privileges=user"
SectionEnd


!macro CheckAppRunning ID Proc Name
checkApp${ID}:
  FindProcDLL::FindProc "${Proc}"
  IntCmp $R0 1 0 notRunning${ID}
  MessageBox MB_OK|MB_ICONINFORMATION "${Name} is currently running and needs to be closed before the installation can continue."
  goto checkApp${ID}
notRunning${ID}:
!macroend