Microservices Need Architects

Microservices Need Architects – an excellent article on the complexity lurking behind something with ‘micro’ in its name. And, yes, I know this problem space and I am here to help, with over a decade of experience in service design and enterprise integration.

For the past two years, microservices have been taking the software development world by storm. Their use has been popularized by organizations adopting Agile Software Development, continuous delivery and DevOps, as a logical next step in the progression to remove bottlenecks that slow down software delivery. As a result, much of the public discussion on microservices is coming from software developers who feel liberated by the chance to code without the constraints and dependencies of a monolithic application structure. While this “inside the microservice” perspective is important and compelling for the developer community, there are a number of other important areas of microservice architecture that aren’t getting enough attention.

Specifically, as the number of microservices in an organization grows linearly, this new collection of services forms an unbounded system whose complexity threatens to increase exponentially. This complexity introduces problems with security, visibility, testability, and service discoverability. However, many developers currently treat these as “operational issues” and leave them for someone else to fix downstream. If addressed up front—when the software system is being designed—these aspects can be handled more effectively. Likewise, although there is discussion on techniques to define service boundaries and on the linkage between organizational structure and software composition, these areas can also benefit from an architectural approach. So, where are the architects?
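
A quick way to see why the complexity outruns the service count: even under the simplifying assumption that complexity tracks potential point-to-point interactions, n services yield n(n-1)/2 possible links. A small illustrative sketch in Python:

```python
# Potential point-to-point links among n services: n choose 2.
# A simplifying assumption -- real systems constrain who talks to whom,
# but every potential link is something security, testing, and service
# discovery still have to account for.
def potential_links(n: int) -> int:
    return n * (n - 1) // 2

for n in (5, 10, 50, 100):
    print(f"{n:>3} services -> {potential_links(n):>4} potential interactions")
# 5 -> 10, 10 -> 45, 50 -> 1225, 100 -> 4950
```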

Corporatization of the Consumer Space

Much has been written in the last decade about the ‘Consumerization of Corporate IT’, with the primary example being corporate users wanting to bring their smartphones and tablets from home into the corporate ecosystem.

I would argue that the inverse of that trend has started in the last few years. That is, concerns that were once firmly in the corporate space are starting to bleed into the consumer space. These include:

  • a focus on security for personal devices, with more emphasis on firewalls, encryption, SSL, password strength and even two-factor authentication.
  • a growing interest in, and need for, analytics and visualization tools for the expanding amount of data from wearables and other in-home devices. Currently this is served by one-off tools from each vendor, but corporate-style integration platforms are emerging that take in data from disparate systems and provide a more unified ‘dashboard’ view to consumers.
  • additional emphasis on in-home automation, monitoring, and control systems for everything from thermostats and lighting to locks, motion sensors, and flow sensors. Previously, this was the realm of building security groups and manufacturing plants. Automation and monitoring are also driving the previously mentioned areas of security and analytics.

Canary is an engineering and support failure

Canary.is promotes its product as an alternative to a real home security system. Nothing could be further from reality. Here is the simple truth:

  • if the power goes out, you will get robbed (no battery backup, and it probably wouldn’t make a difference even if there were, because…)
  • if your internet connection goes out, you will get robbed (more on this later)
  • if your internet upload connection experiences any slowness, you will get robbed
  • NONE of these things is true with a real home security system.

You must understand that this is basically just a dumb camera unit that requires an internet connection to do anything. There is no local storage or processing in the unit itself, which means that if your internet connection is down or slow, the Canary is absolutely useless. Because it can do nothing on its own, it is constantly trying to upload video for analysis (motion detection). Make sure you set it to ‘privacy’ mode when you are home to cut down on it hammering your wifi.

I had one unit and it sort of worked; I added a second one and BOTH of them stopped working. The more units you add, the more of your upload bandwidth they suck up (and suck they do).

Sadly, tech support is basically useless. At some point they will have you run a test on speedtest.net, and if you EVER tell them you experienced an upload speed of less than 1Mbps, then, game over: that is THE problem and, apparently, the end of their sorry support script. It seems their ‘engineers’ are unfamiliar with data compression, efficient data streaming, error-handling algorithms, etc. If you are .01 under 1Mbps (my case), then you are screwed: they won’t support their product (or allow you to return it, because it says 1Mbps on the web site). It doesn’t matter if you can FaceTime or run Google Hangouts without any glitching; ‘the problem’ is your bandwidth, not their dubious implementation.
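
Some back-of-the-envelope math shows why adding units makes things worse. The per-unit bitrate below is my assumption – Canary publishes no such spec – but the shape of the problem holds regardless:

```python
# Rough uplink arithmetic. PER_CAMERA_MBPS is an assumed figure --
# Canary publishes no bitrate spec. The point stands either way:
# cameras that must upload everything for cloud analysis all share
# one residential uplink.
UPLINK_MBPS = 1.0        # the ~1Mbps their support script fixates on
PER_CAMERA_MBPS = 0.5    # assumed continuous upload per unit

for cameras in (1, 2, 3):
    demand = cameras * PER_CAMERA_MBPS
    status = "fits" if demand < UPLINK_MBPS else "saturates the link"
    print(f"{cameras} unit(s): {demand:.1f}Mbps of demand, {status}")
```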

Seriously, save your money and/or look for alternatives. This canary is dead in the coal mine.

GPS Is No Substitute For Knowing Where You Are Going

Seems obvious, right? And yet, stories like this are still relatively commonplace. What is really egregious in this case is that the faux pas was committed by a commercial tour bus driver.

Confirm the address or location on a map (heck, even maps.google.com). You will be disappointed if you arrive in Dayton, KY when you intended to go to Dayton, OH.

Run Linux In A Window On A Chromebook

This is a handy new development that allows you to run Linux (via Crouton) in a window on a Chromebook. It also addresses some of the difficulties of copying and pasting between ChromeOS and Crouton, which is something I have been missing very much.

Obviously, you can run Crouton without this plugin by switching between full-screen Crouton and full-screen ChromeOS, but this just feels more seamless and integrated.

Why SOAP Lost? or Why software engineering is hard and lazy is good

I chuckled my way through this post on ‘Why SOAP lost’ because it seems to be missing some fundamental observations.

SOAP (like Java) was designed for structured use in well-designed systems. Most developers shy away from anything structured; it gets in the way of just writing code (or, more frequently, downloading code and pasting it into the editor). Much better to use JSON and write a bunch of validation code than to use SOAP/XML and reuse existing robust, well-tested parsers and validators. I know, I am making a big assumption there – that a developer would actually write validation code. The more ad hoc the development process, the more ‘agile’ it is, and that is good, right? Ask your friendly neighborhood QA and operations people about the value of optimizing for slap-dash development versus designing for sustainability, consistency, uptime and performance.
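
To make the reuse point concrete, here is a minimal sketch – assuming Python and the lxml library, with a schema and documents invented purely for illustration – of letting a battle-tested schema processor do the validation instead of hand-rolling it:

```python
# Sketch: validating a payload with an existing, well-tested validator
# instead of hand-written checks. Schema and documents are invented.
from lxml import etree

XSD = b"""<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="order">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="sku" type="xs:string"/>
        <xs:element name="quantity" type="xs:positiveInteger"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

schema = etree.XMLSchema(etree.fromstring(XSD))

good = etree.fromstring(b"<order><sku>A-100</sku><quantity>3</quantity></order>")
print(schema.validate(good))   # True -- no validation code written by hand

bad = etree.fromstring(b"<order><sku>A-100</sku><quantity>-1</quantity></order>")
print(schema.validate(bad))    # False -- the schema catches it for free
```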

XML is ‘much harder to read’ than JSON? Right. Give someone a JSON document with 2-3 levels of nesting and an array or two and see how much easier they think JSON is. XML can be verbose, but that is for the purpose of clarity. Oh, and kudos for adding the line breaks in front of the namespace declarations to make your example XML look more ‘verbose’.

I’m not sure I understand the comment about SOAP’s use of HTTP POST being a hardship because it can’t be tested in a browser. That is easily solved by using something like the POSTman plugin in your browser. And I suppose the author is one of those service designers who doesn’t use anything but HTTP GET and returns everything (including errors) with an HTTP status of 200. Because, you know, that is easier – especially when your production environment is a browser and not a server or something exotic like that.
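
And for the record, exercising a POST endpoint outside a browser takes a few lines in any scripting language. A sketch using Python’s requests library (the endpoint URL, action, and envelope are placeholders, not a real service):

```python
# Sketch: calling a SOAP-style POST endpoint without a browser.
# URL, SOAPAction, and envelope are placeholders, not a real service.
import requests

envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body><GetStatus/></soap:Body>
</soap:Envelope>"""

resp = requests.post(
    "https://example.com/service",          # placeholder endpoint
    data=envelope,
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": '"GetStatus"'},  # placeholder action
    timeout=10,
)
print(resp.status_code, resp.headers.get("Content-Type"))
```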

The last set of bullet items in the post is missing a little something as well:

Laziness, when it is the primary decision criterion, optimizes for development and sacrifices everything else. That is like optimizing for 5% or less of the lifecycle of that code. Just dumb. Be a nice person and drink your steaming cup of STFU when your YAGNI snark causes 20 hours of production downtime a month because the code has no design rigor behind it and certainly doesn’t take supportability concerns into consideration.

Using Two-Factor (YubiKey) with Mac OS X

An outstanding, detailed article on using two-factor authentication with the Mac OS X operating system. Note that there is a lot of good follow-up in the comments section as well.

I bought a YubiKey Neo back in October and have been using it with Google’s U2F implementation. I think this is a smart way to go security-wise, and I am glad to see that Google is making it easier to take advantage of. You can also opt for the less expensive YubiKey Standard if you don’t have a need for the Near Field Communication (NFC) capability of the Neo.

Happy New Year

Happy New Year! 2014 was filled with ups and downs (as is to be expected). Hopefully, 2015 will see projects successfully completed and new directions explored. Coming into the new year with a bit of the flu has been kind of a drag, but things should start picking up again in a few days.