Thursday, October 2, 2014

Better by design: when less is way more

As technology evolves and gets cheaper, more and more people jump on the bandwagon. Technology offers a new world of possibilities and enhances our lives. But does it really come cheap?

In the early days of computing, computers were huge machines built by engineers for engineers. They were far from user friendly and no average Joe could even turn one on (nor did they want to, either). As they became smaller and cheaper, a new concept emerged: the personal computer. A lot of criticism came with it, as many people doubted the need for having a computer at home. Those computers, however, were still not too user friendly, with command line interfaces (CLI) and a limited set of commands you had to keep in mind in order to use them.


Evolution continued to the GUI (Graphical User Interface) and our beloved mouse. Now computers were really and truly intuitive and user friendly; or were they? With Windows 3.1, Microsoft started its conquest of the world, and I remember spending lots of time moving windows around, trying different color schemes, playing solitaire and minesweeper, because there wasn't much to do..., for me. However, it was a breakthrough in terms of office productivity. With each evolution of Windows, the applications that ran on it became more powerful and more complex. Menus, sub-menus, nested menus, drop-downs and standalone panels when there were too many options, macros and even scripts! Nowadays you can do a ton of things! (if you happen to know how to do them).

It would be shocking to know how much of the power of current devices is actually used, compared to those of the early years. I think Pareto's law applies here: 80% of users use only 20% of the power of their computer (hardware and applications), and I guess it's just because they are huge. The concept of "more is better" is to blame here. It's been shown that the broader the range of options, the less happy the average individual is with their choice.

During the early cellphone days, people spent lots of time choosing the ring tone, alarm music, background color... Then came the first smartphones, with Symbian, Windows CE and PalmOS. Now you could spend days playing around with options! Android 1.X was the pinnacle of options...

However, the iPhone arrived and it was a truly "better by design" breakthrough: simple, functional and with fewer options. It freed the user from the task of thinking about how they wanted things to be. Things were the way "they should be": an orthogonal, simple and logical design that everybody could understand. In fact, Android has evolved that way, simplifying most things, both in the general interface and in the settings... However, Apple has gone the other way around...

Another great example of this paradigm is the Nest thermostat. The user just sets the temperature every now and then and the device takes care of the pattern. It makes the technology fade to the point of almost disappearing. "Better by design" is, as Jony Ive said in the original iPad introduction video, when you don't have to change to fit the product; it fits you. It's a pity Apple completely lost focus after Jobs passed away.

Apart from technology, bad design principles apply all around us, as Donald A. Norman describes in one of the best design books: The Design of Everyday Things.

In my years of experience, I've seen one major cause of poor design: design made by engineers. We tend to cover every possible scenario and possibility. However, "better by design" implies simplifying and thinking about the best way of doing what the product is intended for, not providing support for every possible way of using it.
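To make the contrast concrete, here is a minimal, purely hypothetical sketch in Java (the interfaces and method names are made up for illustration, not taken from any real product) of a thermostat API designed by engineers to cover every scenario versus one designed around what the user actually wants to do:

// Engineer-driven design: every internal knob is exposed, so the user must
// understand the machinery before they can do anything useful.
interface ConfigurableThermostat {
    void setHysteresisBand(double degrees);
    void setSamplingIntervalSeconds(int seconds);
    void setWeeklySchedule(int dayOfWeek, int hour, double targetCelsius);
    void setBoostMode(boolean enabled, int minutes);
    void setTargetTemperature(double celsius);
}

// "Better by design": one verb that matches the user's intent; everything
// else is defaulted or learned behind the scenes, Nest-style.
interface SimpleThermostat {
    void setTemperature(double celsius);
}

The point is not that the options disappear, but that they stop being the user's problem.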

So next time you find yourself developing a product, do your users a favor and think about how it will be used before you start :)

Wednesday, August 6, 2014

Agile..., but not that much

It's been a long time since I first went to university to study Computer Engineering, and things have changed a lot since then, but not always for the better, I guess.

During the first years, we were taught general sciences, programming, problem solving... And shortly after that, the classic project lifecycle. We didn't get into project management, but we knew the classic lifecycle phases and project structure. Feasibility documents, analysis, requirements, technical design... The waterfall approach at its best.

You are taught that every stage is like a black box, sandboxed and with well-defined interfaces that link the phases together. Everything is clear and tidy; nothing can go wrong, right?

When you start working with this approach, you'll sooner or later realize that it's far from perfect. Sometimes it's even unusable, and some of its premises are not even true! For this method to work, every participant in the project (including clients and suppliers) must be exactly on the same page and know the whole system, so that nothing new can come up after a phase is closed. So, when taken to the real world, it just doesn't fit.

Most of the time, the picture looks much like this one:

After suffering this incoherence, some clever people came up with new approaches that are widely used nowadays under the common name of agile methodologies. I myself am a certified SCRUM Manager and it's been of great help over the last few years.

However, I've found an alarming number of engineers who are "agile extremists" and don't see the risk of completely eliminating traditional methods. Add to this the overwhelming number of new startups, born from the ideas of non-technical people who want immediate results in order to try to raise money. MVP (Minimum Viable Product) is a keyword in the startup world, and the fastest way to get to one is through fast prototyping.

I've worked both in a methodology-strict company, which complied with and applied a big bunch of methodologies like Prince, ISO 9001 and 27001, CMMI3 and ITIL, and in a company with no methodology at all. This last one could be an extreme example of Lean and how it shouldn't be done. Even though I've never been a friend of documentation, I have to admit the first one worked better. However, it started to work at its best when we redefined the traditional waterfall stages as a cycle, with the introduction of SCRUM.

The major risk comes, in my opinion, when you try to manage your company in an "agile" way. You have to be quick, you have to be nimble and you have to be agile, but before all that, you need to know where you stand. It's a common error I'm seeing very often: a startup with non-technical founders wants an MVP as soon as possible and, to get it, they look for developers so the product is done ASAP.

It doesn't matter how senior a developer is; the first thing a technology-related startup needs is an architect. Someone with a wide and long-term view to set the basis. It doesn't matter much how you implement it, whether it's Java or PHP, REST or SOA; after all, you'll most likely have to re-engineer your project within a 10-year timeframe, but your product has to be thought through before it's built.

Every company needs a backbone and a structure, and a manager cannot be improvising because the company lacks a method. So my humble advice is to take agility with a grain of salt and use it where it's effective: in the production cycle. Be agile, but don't be extreme.