Monday, July 29, 2013

Tools for changing the World

As an entrepreneur, ambitions to change the world are part of the job description. One thing I notice when I talk to tech people is their obsession with details.

Knowing some of your infrastructure in depth sure is important. Even as a non-technical person running a tech start-up you clearly want to know why things are the way they are. If people use expensive Windows servers for no obvious reason, for example, find out why and consider using something else.

I've grown up with Linux. I know my way around UNIX systems, and for a long time I wanted to know all there is. The truth, however, is that that's an impossible goal. I'd spend my whole life learning more and more details that don't get me anywhere.

I realised how important it is to have the right tools and the right know-how to avoid getting tied up in unproductive work.

For instance, unless you're running a mail server company, spending time configuring a mail server is a waste of time and money. The hours spent fiddling with those settings could have gone into productive work in the field you're actually trying to sell in.

The same goes for desktops. I know a lot of die-hard Linux guys who use their Linux desktops. Sometimes that's fine, but the truth is that in many cases a Mac with all its fancy software is more productive than a Linux desktop. People spend way too much time tinkering with their systems and produce results with tools that are way too complicated.

I'm not saying that I don't use Linux desktops or that they're bad. My point is that you should choose your tools wisely. Find out what works for you, not against you. Avoid things that require a lot of time without really adding any value.

Most importantly, get rid of anything that distracts you from your actual work. If your phone causes a lot of trouble, consider getting another one that gets the job done without your constant supervision.

Choose tools not only by their price but also by how comfortable they are to use. Spending a bit of money on a stress-free user interface instead of going with a free program that's a mess can save you endless hours of unproductive work.

Tuesday, July 23, 2013

Fixing Power Manager on Linux Mint 15 MATE

Due to some issues with my previous Mint 15 installation using the Cinnamon desktop, I decided to do a fresh installation of Linux Mint 15 MATE. That way I could make use of the new LUKS manager in the Mint installer and get rid of the eCryptFS home directory encryption, which also caused issues.

The MATE edition is really nice and pretty much unchanged from a much earlier version of Mint I fell in love with. The fantastic "start menu" is still there, all the GNOME goodness still works as I'm used to, and it feels like a desktop written for a computer instead of one trying to serve multiple purposes.

One problem, however, is the power management. I don't like it when the screen dims just because I pull the power cable. The standard Mint power manager doesn't offer any options for that, and I really didn't fancy dropping down to the command line on this one.

Instead I installed the XFCE4 power manager, which integrates seamlessly. More importantly, it allows much more detailed settings for how the notebook should behave in different modes.

To install it on your Mint Linux box use the following command:

sudo apt-get install -y xfce4-power-manager

Afterwards, find it in the menu or launch it from the console using the same name as in the installation command above.

Sunday, July 14, 2013

Discovering JavaScript Closures

One of the biggest issues I have with JavaScript is callback functions that cannot access data from their parent's scope. I tried OOP to get around this problem, yet the lack of a reliable $this variable pretty much voids those efforts.

Right now I'm working on a piece of middleware which should download a file from a CouchDB server, send it to a service provider's API and, obviously, update the source document with the result.

Sounds easy, but if the callbacks no longer know about the CouchDB document, this simple task quickly becomes a nightmare.

And so it happened that I discovered an article about JS closures. I'm not entirely sure how it works under the hood, but it sure solved my problem. The final code looks a bit like this:

var http = require('http');

function change_processor(change, param) {

  return {

    download: function() {

      // `change` and `param` stay available in every callback below,
      // thanks to the closure.
      for (var filename in change.doc._attachments) {
        http.get({
          host: param.dbhost,
          port: param.dbport,
          path: '/' + param.dbname + '/' + change.doc._id + '/' + filename
        }, function(res) {
          // Collect the response chunks into a single buffer.
          var data = [];
          var dataLen = 0;
          res.on('data', function(chunk) {
            data.push(chunk);
            dataLen += chunk.length;
          }).on('end', function() {
            var buf = new Buffer(dataLen);
            for (var i = 0, len = data.length, pos = 0; i < len; i++) {
              data[i].copy(buf, pos);
              pos += data[i].length;
            }
            console.log(buf.toString('ascii'));
          });
        });
      }
    }
  };
}

module.exports = change_processor;

The cool thing about this closure is that variables like `change` remain available throughout the entire construct, including the callbacks of http.get. It's incredible.
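
As a smaller illustration of the same mechanism, here's a minimal closure sketch (the names are made up for the example): the inner function keeps access to `count` long after `makeCounter` has returned, just like the callbacks above keep access to `change`.

```javascript
// Minimal closure example: `count` lives on inside the returned
// function, even though makeCounter() has already returned.
function makeCounter() {
  var count = 0;
  return function() {
    count += 1;
    return count;
  };
}

var counter = makeCounter();
console.log(counter()); // 1
console.log(counter()); // 2
```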

Wednesday, July 10, 2013

Why use CouchApps?

Seeing one of CouchDB's coolest features, CouchApps, neglected by most CouchDB users often saddens me. While CouchDB itself makes for a great NoSQL data store, it can also serve web applications with excellence.

CouchApps are nothing but design documents in a CouchDB. They may contain only things like map/reduce functions or replication instructions, yet they may also hold any number of assets like CSS files, images, videos and so on. On top of that, the document itself is a JSON document that can be accessed in the usual way.
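
To make that concrete, here's a rough sketch of such a design document expressed as a JavaScript object (the document id, view name and rewrite rule are made up for the example; the field names follow CouchDB's design-document format):

```javascript
// Hypothetical design document: one map view plus a rewrite rule
// that serves an attached index.html at the app's root.
var designDoc = {
  _id: '_design/example',
  views: {
    by_title: {
      // View functions are stored as strings inside the document.
      map: "function(doc) { if (doc.title) emit(doc.title, null); }"
    }
  },
  rewrites: [
    { from: '/', to: 'index.html' }
  ]
};

console.log(JSON.stringify(designDoc, null, 2));
```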

CouchDB's internal HTTP server can then serve those files and data with ease. Data may be displayed and managed by nothing other than the CouchApp itself, eliminating the need for an additional layer altogether.

There seems to be a mental block around this in the community. To me it almost appears as if people can't let go of the idea of separating data storage and application. Yes, CouchApps have some limitations, but in most cases those limitations don't get in the way of what the apps are doing.

So by adding another application layer (e.g. a PHP script), all developers really do is add more work. CouchDB was designed to make life easier. Its ability to serve entire apps is one of the biggest steps in that direction, so you should consider making use of it.

To get started, make yourself familiar with Design Documents and all the bits and pieces they may contain. Of course there are a number of tools to make life easier.

Even though it's no longer maintained, the Python tool couchapp still does a fantastic job. Besides, its website contains a lot of nice insights into how CouchApps work.

Then there's kanso. Like the couchapp tool, it allows you to structure your CouchApp within the file system of your computer. But it does a lot more than just that. Kanso introduces a node.js-style package manager to the world of CouchApps.

Every kanso app is a package and can be reused by another app. That makes it extremely nice to work with in terms of modular design. Imagine a number of apps, all of which belong to the same overall project. Another app contains global code and assets like templates, configs, etc. The different apps may then use that "core app", and you can reuse every bit of code in there.

However, each app is completely atomic. You may use different core versions within them so that they can live on their own. Whenever you change something for one app, that doesn't necessarily affect all the others. That's good news if you want to avoid introducing new problems while fixing bugs in one feature.

To me kanso is one of the best choices for webapp development. It's fast, and the modular approach makes sharing code a no-brainer. I seriously suggest you give it a go and find out how much development time you might save by cutting out the middleman, so you can relax a little more.

Tuesday, July 9, 2013

Getting things done with JavaScript

I've been a software developer for more than 15 years now. There have been lots of ups and downs. It feels fantastic to label a project "done" after months of hard work. At other times I wished for another job and really hated everything about development.

It's the lengthiness of projects that made it increasingly difficult to love my work. But all of that changed just recently.

I wrote a lot of software using PHP or Perl. Those who know me know that I don't limit myself to those languages at all, but that's what sold best. Yet despite my wealth of experience with a number of programming languages, I never managed to develop my own product.

Still, that's what I want most. I know that I have only so much energy left after my daily work is done. Asking a developer to work for eight hours a day is just not realistic; squeeze four hours out of a creative programmer and you're lucky.

Building my own product on the side, however, takes a lot more than that. After my eight-hour day is over I'd still have to put in a few more hours. For a while that's doable, but not in the long run.

So how do I solve this problem? Simple: use modern technology rather than the common patterns you're used to. PHP does a fine job if done right. Use a framework like CakePHP or CodeIgniter and you'll save tons of time in development. Yet it still takes quite a bit of time to get basics like user logins right for each individual project. SQL isn't really helpful either, and even though there are module systems like PEAR, I don't really like using them.

I consider those languages, as good and useful as they are, old-world languages. The day I discovered JavaScript was a revelation for me, and these days it's no longer limited to the browser. Node.js is a great example: a lot of common tasks are handled by modules that are easy to manage and common to use.

Attach that to a NoSQL database like CouchDB or MongoDB and you've cut down development time significantly. Just recently I discovered the kanso framework, allowing me to reduce my stack even further. I played with CouchDB and CouchApps a while ago, yet it's kanso that really got me excited.

CouchApps live within CouchDB itself. They're written in JavaScript and require only a little code to get web apps running. There's no need for SQL, kanso provides a great module system similar to node.js's, and almost the entire app is written in frontend code.

I'm impressed by how fast I'm able to develop apps now. A lot of the basics just work and I can focus on what matters most. Armed with these new-world tools I'll surely be able to crank out a product in what little time I have left.

Rich dad, poor dad

Since I developed an interest in entrepreneurship I've read quite a lot of books on the subject. Reading about other entrepreneurs and their struggles for success, start-up stories and techniques, as well as guides on how to get started, really helps a lot.

One book that has been suggested to me over and over again is Rich Dad, Poor Dad. Somehow I didn't like the title, so I avoided it until I found a free audio version on YouTube.

Having read a lot of these kinds of books meant that it wasn't all new to me, yet it still blew my mind a little. I like the parts about taxes, learning and the explanation of the difference between assets and liabilities.

The book helped me greatly to look at my finances in a rich man's way. Other books like The Art of the Start or Think and Grow Rich are very inspiring and contain helpful tips on how to build a start-up or how to develop the right mindset.

Rich Dad, Poor Dad isn't just another book re-spinning a bunch of existing ones. It contains quite a bit of solid advice for those who listen. Besides, it's not all about entrepreneurship; a lot of the advice may be applied by employees too, by investing their income into assets rather than liabilities.

That said, it's time for me to kill my huge column of liabilities ;)

Quicktip: Use Leaflet for maps

When building an online map application you have quite a lot of frameworks to choose from. Google and other map-providing companies offer very good frameworks, yet they only work with their own maps.

A popular open source solution is OpenLayers. Since it's quite popular I tried it first. However, I think it's really nasty to use. Even though every bit of it is documented, I just can't make sense of it. So I looked further.

What I found is Leaflet. It seems to have fewer features than OpenLayers, yet I couldn't complain about the lack of any of them. The documentation is brilliant and so is the entire API. On top of that, Leaflet works amazingly well with mobile devices out of the box.
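
To show how little code a basic map takes, here's a rough browser-side sketch. It assumes Leaflet's script and stylesheet are already loaded (so the global `L` exists), that the page contains a `<div id="map">`, and it uses OpenStreetMap's standard tile URL; the coordinates are made up.

```javascript
// Create a map in the #map element and centre it.
var map = L.map('map').setView([51.505, -0.09], 13);

// Add a tile layer; the URL template points at OpenStreetMap's tiles.
L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
  attribution: '&copy; OpenStreetMap contributors'
}).addTo(map);

// Drop a marker with a popup.
L.marker([51.505, -0.09]).addTo(map).bindPopup('Hello, Leaflet!');
```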

So if you're looking for an open mapping library that just works Leaflet is what I'd suggest to you.

Monday, July 8, 2013

The Mac Effect

Macs and I have a very special relationship dating back to my early computer days. In fact, the first computer I ever used was a Mac, and so was the first one I owned. Needless to say, a lot has happened since.

I also used (and hated) Windows. Linux was a logical step along the way, and the BSDs followed. For a long time I didn't have a Mac, yet the desire to own one never stopped burning within me.

These days I use pretty much any OS. Linux still occupies a massive part of my IT life, but it lacks a little thing I call the Mac effect.

A modern Linux desktop provides everything you need: graphical file managers, menus, desktop backgrounds, widgets and, most of all, pretty terminals. But that's just it; it's all you need.

Mac OS, on the other hand, provides all that plus the little extras that put a smile on my face. For example, I like the Notes app in Mac OS 10.8. I haven't used it for a while now, but it still keeps notes in a way I like to work with.

I wanted to call my internet provider. In my mind I went through all the steps I'd need to take: first, find their telephone number; second, dig up my customer ID; third, check their offerings for discounts. All of that would have taken me around 10 minutes.

Then I typed my provider's name into the Spotlight search and an old note came up. Within seconds I had all the information I needed. I called them, conducted my business, and all in all it didn't even take five minutes.

And that's just it. Of course I can keep notes on a Linux desktop, but would I ever find them again? No, not me. And that's why I love using Macs so much. Hardly a day goes by without some little detail making me smile.

Sunday, July 7, 2013

Creating a private kanso package repository

Setting up a private kanso repository is an easy task and it makes a lot of sense. I'm considering using kanso and CouchDB for a rather large project that should be split up into many smaller packages. That way I can make use of kanso's package mechanism and don't have to repeat myself when it comes to code shared by all the different apps.

Of course I could easily use the public package repository, and whenever a package is generic enough it actually should be used. But when your package is highly specialised or not meant to be published, a private repository gets you around that problem.

The first thing you need to do is set up the repository itself. Check out the GitHub project and install it to a CouchDB server of your choice using kanso. And that's really all there is to it. It's amazing how simple it is.

Now use the --repository or --repo parameter (the naming isn't consistent yet) to specify your local repository when publishing. Here's an example:

kanso publish --repo http://localhost:5984/kansorep/

In the final step you need to edit `lib/kansorc` within your kanso installation. Find the line that says `exports.DEFAULTS` and add your repository URL. Here's an example:

exports.DEFAULTS = {
    repositories: [
      "http://kan.so/repository",
      "http://localhost:5984/kansorep"
    ],
    env: {
        // custom push locations
    }
};

Once that's done, kanso is smart enough to look for packages in both repositories. That means you'll be able to use `kanso install` without having to worry about where the packages you're looking for are located.

I've gotta admit that kanso's repository handling isn't great yet. There's an old discussion touching on the subject, yet it doesn't look like the functions suggested in there have been implemented.

Saturday, July 6, 2013

Lenovo X1 Carbon, six months later

About half a year ago I received a brand new X1 Carbon. At first glance it appears to be a "black" clone of the 13" MacBook Air. It's very thin, quite light and even the shape is remarkably similar. But that's where the similarities end.

Instead of a convenient Mac OS X UNIX system that I could run from day one until the day the box goes out of service, it ships with a pre-installed Windows 8. So before you can start using it you have to install Linux, BSD or Windows 7 to make it usable. The latter OS may not be my first choice, but at least it's bearable.

Luckily, a common Linux installation is done in little time. To my surprise everything worked out of the box, even back then when the machine was pretty new. Even the mobile modem caused no trouble whatsoever.

The first thing I really noticed is the keyboard. It's a fantastic piece of kit. Its backlight is controlled manually without any need for OS drivers. Typing on it is really comfortable and, unlike on the MacBook, all keys, including the function keys, are reasonably large.

The mousepad is a bit of a pig. It's one of those new clickpads Apple introduced. Its huge proportions mean that your palms touch it constantly when typing. I'm used to tap-clicking, which is just impossible with this pad. It's so annoying when I'm in the middle of typing something and suddenly realise that I've accidentally activated a completely different window.

Since I'm completely incapable of using the "click the entire touchpad" function, I can't click on the pad. Luckily, the second set of click buttons just above it is easy to use in combination. Besides, they offer a lot more feel to the click as such. Still, a smaller mousepad would have done the trick just as well.

The batteries are phenomenal. Not only do they last for hours so that I actually use them, they also recharge faster than light. I'm usually quite uncomfortable with batteries, trying to keep them charged at all times for when I need them. With the X1C, however, I discharge them once, sometimes twice a day. They never take more than half an hour to be in top shape again.

Another marvel is the display. So far I've never had a laptop that could be used outdoors, let alone in sunshine. With the X1C, however, I may sit in the brightest sunlight and still read the display. That benefit alone is worth having the X1C rather than an older T series model. It's just amazing how much freedom is gained once you can actually use a laptop in just about any light.

When it comes to the body of the X1C I am a bit disappointed. The bottom half is very solid; obviously it's the carbon part. That part is very good. However, the other half, housing the display, is a joke. It bends like a sail in the wind. Besides, it's poorly mounted. I already managed to break the left display mount somehow. As a result, the crappy plastic opened up quite a gap through which I can spot the inside. That really sucks.

Talking about poor build quality and insides: I've owned quite a few Macs in my life. All of them, even the ones from the 1990s, were built to perfection. There were no gaps, no openings through which I could see LEDs; heck, there were no unnecessary LEDs either (I really don't give a nickel about HDD activity). The X1C, however, has lots of those imperfections. At these high prices the build quality should be supreme, but it just isn't.

In terms of ruggedness I have a split opinion as well. It's sturdy all right, but the fact that the cheapish plastic mounts on the display broke already doesn't really convince me.

All in all, I wouldn't buy it again. The only reason for me to choose it was the company policy of using Lenovos. I'd never buy a Lenovo voluntarily. I hate the fact that they come with Windows pre-installed and that so many things on them are made from cheap plastic. Another issue is that I can only choose from OSes I wouldn't want to use on a laptop: Windows is no good for what I need and Linux desktops just annoy me. As good as this machine is, I can't say I'd choose it over a MacBook.

Kanso Build Processors

Today I created a custom jQuery package for Kanso. It not only contains a much newer jQuery version but also all the jQuery extensions I normally use. My goal was to compile all of them into one single file so that applications using it don't have to carry a column of script tags just to get all the extensions in.

Initial effort

I began by creating a new package with a `kanso.json` file like this:

{
  "name": "jquery2",
  "version": "2.0.3",
  "categories": ["utils"],
  "attachments": [
    "jquery.js"
  ],
  "maintainers": [
    {
      "name": "Arthur McFlint",
      "url": "https://github.com/arthurmcflint"
    }
  ],
  "url": "http://jquery.com/",
  "dependencies": {
    "attachments": null,
    "modules": ">=0.0.8"
  },
  "description": "Contains the jQuery core as well as some extensions commonly used by my projects."
}

By adding the `jquery2` package to an app, jquery.js is automatically added as well.

Fine Tuning

Even though the above solution works like a charm, it has a massive flaw: I couldn't care less about manually maintaining one big file containing the jQuery core as well as the extensions. After a while it would be beyond repair.

Instead I added build processors: some JavaScript files executed by node when kanso runs. The new `kanso.json` file looks a bit like this:

{
  "name": "nexus-jquery",
  "version": "2.0.3",
  "categories": ["utils"],
  "attachments": [
    "jquery.js"
  ],
  "maintainers": [
    {
      "name": "Arthur McFlint",
      "url": "https://github.com/arthurmcflint"
    }
  ],
  "preprocessors": {
    "merge": "build/merge"
  },
  "postprocessors": {
    "cleanup": "build/cleanup"
  },
  "url": "http://jquery.com/",
  "dependencies": {
    "attachments": null,
    "modules": ">=0.0.8"
  },
  "description": "Contains the jQuery core as well as some extensions commonly used by my projects."
}

Whenever kanso pushes an app to the server, it first goes through the process of building the app. By adding 'hooks' in the shape of preprocessors and postprocessors, it's possible to alter the standard behaviour quite a bit.

First of all, I created a new file called `build_config.json`:

{
  "jsfiles": [
    "jquery-2.0.3.js",
    "extensions/misc/jquery.ba-serializeobject.js"
  ]
}

Then I added `build/merge.js`:

var fs = require('fs');
var uglify = require('uglify-js');

module.exports = function(root, path, settings, doc, callback) {

  // Read the extensions config file
  var build_config = JSON.parse(fs.readFileSync(__dirname + '/../build_config.json', 'ascii'));

  // Resolve paths to the js files
  var jsfiles = [];
  build_config.jsfiles.forEach(function(file) {
    jsfiles.push(__dirname + '/../' + file);
  });

  // This is where the output will go
  var target_file = __dirname + '/../jquery.js';

  // Combine the JS files into one string
  var output = '';
  jsfiles.forEach(function(file) {
    output += fs.readFileSync(file, 'ascii');
  });

  // Add a newline at the end of the file
  output += "\n";

  // If compression is active, minify the code
  if (settings.jscompression) {
    output = uglify.minify(output, {fromString: true}).code;
  }

  // Write jquery.js
  fs.writeFileSync(target_file, output);

  // Carry on :)
  callback(null, doc);
};

And of course `build/cleanup.js`:

var fs = require('fs');

module.exports = function(root, path, settings, doc, callback) {

  // Remove the generated file again after the push
  fs.unlinkSync(__dirname + '/../jquery.js');
  callback(null, doc);
};

To make it all work, I finally added a `package.json` file listing `uglify-js` in the dependencies and installed it with npm.
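
For reference, that `package.json` only needs the one dependency; a minimal sketch (the package name, version and version range here are made up):

```json
{
  "name": "jquery2-build",
  "version": "0.0.1",
  "dependencies": {
    "uglify-js": "2.x"
  }
}
```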

The result

As a result, the code is now automatically merged from the files listed in the `build_config.json` file. If I turn on `jscompression` in the `kanso.json` file of the main application using this package, kanso automatically minifies the code during the build.

All in all that's a very nice result and shows how much tweaking is possible within the kanso framework. The only thing you have to be cautious about is async code. I ran into big trouble using minify or node-minify to do the job. However, as long as the functions run synchronously you're good to go.