While not a basic introduction, this article is a valuable chronicle of some hands-on learnings from using Kubernetes. The hand-drawn illustrations are a great addition.
Great article on system failures in IT and how groups/people react to them. Here is a summary:
tl;dr: Catastrophic system failures are remarkably common in IT-dependent environments. The reactions to such failures vary, but are often some version of blame-and-train. There are a number of problems with blame-and-train, but perhaps the most important is that it is a form of organizational blindness that forestalls improvement.
- These failures are markers of systemic brittleness, the inverse of resilience.
- The blame-and-train reaction is a diversion, a red herring, and counterproductive; it increases brittleness.
- There are productive reactions to failure but they are difficult to accomplish, especially when the failure has big consequences.
Serverless architecture uses a lot of services — which is why some prefer to call the architecture “service-full” instead of serverless. Those services are essentially elements of an application that are independent of your testing regime.
An external element.
A good external service will be tested for you. And that’s really important. Because you shouldn’t have to test the service itself. You only really need to test the effect of your interaction with it.
Here’s an example …
Let’s say you have a Function as a Service (e.g. Lambda function) and you utilise a database service (e.g. DynamoDB). You’ll want to test the interaction with the database service from the function to ensure your data is saved/read correctly, and that your function can deal with the responses from the service.
Now, the above scenario is relatively easy because you can utilise DynamoDB from your local machine, and run unit tests to check the values stored in the database. But have you spotted something with this scenario? It’s not the live service — it’s a copy of it. But the API is the same. So, as long as the API doesn’t change we’re ok, right?
To be honest, I’ve reached a point where I’m realising that if we use an AWS service, the likelihood is that AWS have done a much better job of testing it than I have. So we mock the majority of our interactions with AWS (and other) services in unit tests. This makes it relatively simple to develop a function’s logic and unit test it, with mocks standing in for the services it requires.
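To make that concrete, here is a minimal sketch of the approach, using Python’s standard `unittest.mock`. The function and field names (`save_order`, `orderId`) are illustrative, not from the article; the mocked `put_item` response mimics the shape of a boto3 DynamoDB call:

```python
from unittest.mock import MagicMock

# Hypothetical function logic (names are illustrative): save a record via an
# injected DynamoDB-style table object, as you might inside a Lambda handler.
def save_order(table, order_id, total):
    response = table.put_item(Item={"orderId": order_id, "total": total})
    # We only care about the effect of our interaction with the service.
    return response["ResponseMetadata"]["HTTPStatusCode"] == 200

# Unit test: mock the service, then verify our interaction with it —
# no live (or even local) DynamoDB required.
mock_table = MagicMock()
mock_table.put_item.return_value = {"ResponseMetadata": {"HTTPStatusCode": 200}}

assert save_order(mock_table, "o-123", 42) is True
mock_table.put_item.assert_called_once_with(Item={"orderId": "o-123", "total": 42})
```

The point is that the test exercises your logic and your call to the service, while the service itself stays AWS’s problem.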
Here is a quick summary of what is new in the latest Android Things Developer Preview.
Earlier this month Google launched its AIY Projects initiative by releasing a build-your-own Google Home kit. Built on the Raspberry Pi 3 developer board, the kit showcased Android Things’ ever-expanding features, particularly the integration of the Voice Kit. Enabling developers to build a proper Voice User Interface (VUI), the Voice Kit is an open-source platform that can integrate cloud services such as the Google Assistant SDK or the Cloud Speech API, or simply run similar services directly on the device with TensorFlow, Google’s machine-learning framework.
Google also added some important drivers to the mix – most notably those necessary for implementing the Google Assistant SDK on any certified development board. Also in tow is support for the Inter-IC Sound bus (I2S), which has been added to the Peripheral I/O API; a Voice Kit sample is included to demonstrate the use of I2S for audio.
Developer Preview 4 will also bring new hardware, adding a Board Support Package for the NXP i.MX7D. Also, in a display of Android Things’ scalability, Google has released Edison Candle – a sample of custom hardware that fits modularly with SoMs (system-on-modules) running the lightweight OS. Code for this sample is hosted on GitHub, while the hardware design files can be found on CircuitHub.
Things seem to be coming together quite well for Google’s IoT solution. With the 1.0 release of Tensorflow in February and I/O kicking off, we hope to see even bigger strides today.
I found this posting to be a bit swear-y (you’ve been warned), but otherwise on the money.
The final paragraph nails it (I have definitely seen my share of those ‘success’ messages):
Above all else, have a wonderful holiday season and give your teams a break until the code freeze is lifted in mid-January. Then you can get back to shoving Agile on people, making them work 60 hours a week again and then having your directors send “we did it the Agile way!!!!” success messages after the project you executed took production offline, took twice as long to finish and cost 3 times as much.
I saw that Google is testing a password generator for the Chrome browser. Hmm, I wonder if that means that they will stop storing passwords in clear text?
Password-generating tools like LastPass, 1Password, RoboForm, and others are a mainstay of browser accessories, and are often recommended by security experts because they can help create and manage “strong” passwords. “Strong” refers to passwords that are difficult for hackers and computers to guess. Google’s effort, if it makes it into the regular version of Chrome, could encourage other browser makers to build password generators and make the field more competitive.
1Password has the advantage that it is multi-platform and not tied to a single browser, which I consider to be a very good thing. Having each browser create its own incompatible password manager would be even worse than each browser having its own incompatible HTML interpreter.
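Under the hood, a “strong” password generator of the sort these tools provide boils down to a cryptographically secure random draw from a large alphabet. A minimal sketch using Python’s standard `secrets` module (the alphabet and length here are my own illustrative choices):

```python
import secrets
import string

def generate_password(length=16):
    """Generate a hard-to-guess password from letters, digits and symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    # secrets.choice uses a CSPRNG, unlike random.choice, which is
    # predictable and unsuitable for passwords.
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # prints a random 16-character password
```

Of course, generation is the easy part; the real value of tools like 1Password is in storing and syncing those passwords so nobody has to remember them.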
Another recent post, again focused on API design, but could/should apply to all tech efforts (The Four Principles of Successful APIs). This time the guidance takes a slightly different approach:
1. Understand The Strategy
2. Decide Who You Are Really Designing For
3. Start Small and Iterate
4. Architect for the Long Term with Abstraction
#2 sounds like a component of #1 – your target user base should be part of your strategy. #3 is a good opportunity to apply the consistency principle from the previous posting. #4 is interesting because abstraction seems to be a hard concept for developers who tend to think that API = CRUD overlay.
I had to chuckle when I read through this post titled When crafting your API strategy, put design first. It is very high-level and could/should apply to anything. Here are the main points:
Design for consistency
Design for scale
Design for people
Check. Yes to all of these. I suppose some folks need to be reminded of this. Especially the ‘sling code first and declare victory at some arbitrary point’ proponents. The ones with 60 hours of production downtime a month because design ‘just slows them down’. Apparently downtime doesn’t slow them down, but it sure slows down the consumers.
Another chuckle was this paragraph, which is nearly a direct quote from me (emphasis added):
Planning too little is dangerous. But so is planning too much. This isn’t a science experiment to find the ideal design. Perfection isn’t the goal: consistency is.
I saw this whiny article in the Washington Post that was just begging for a response. The gist of the article is that the author got a Computer Science degree and wasn’t given his dream job out of the gate. This then becomes an indictment of the education system rather than the typical sniveling millennial i-wasn’t-given-the-world-without-having-to-work-for-it screech that it is. Let’s take a look at some quotes from the posting:
My college education left me totally unprepared to enter the real workforce. My degree was supposed to make me qualified as a programmer, but by the time I left school, all of the software and programming languages I’d learned had been obsolete for years.
I think this betrays a misunderstanding of how higher education works – it is not what you are given, it is what you do with it (the whole learning-how-to-learn thing). It is as if he expected to read a book on swimming but never get into the pool; and ‘surprise’, he really can’t swim because he put no effort into applying the learnings. Also, if ‘all of the software and programming languages’ were obsolete, what were they teaching? FORTRAN? RPG? Visual Basic?
To find real work, I had to teach myself new technologies and skills outside of class, and it wasn’t easy.
Poor you. You should have been doing this all along. The Computer Science curriculum should be teaching you fundamental concepts of how computers work, along with programming concepts and techniques that can be applied across specific programming languages, databases and platforms. Actually, it is a bit shocking how many recent CS grads don’t have a grasp of the fundamentals.
Businesses aren’t looking for college grads, they’re looking for employees who can actually do things – like build iPhone apps, manage ad campaigns and write convincing marketing copy. I wish I’d been taught how to do those things in school, but my college had something different in mind.
Businesses are indeed looking for those things, but they are looking for people who can learn and grow and apply what they have learned in the past. If you have a CS degree and can’t figure out how to write an iPhone app you either had a horrible curriculum or slept through most of your class time. The fact that you weren’t specifically trained for that is not a problem with your education. Rather it is a failure to apply what you should have learned.
At least 90 percent of my college education (and that of so many others) boiled down to pure terminology, or analysis of terminology. My success in any given class was almost wholly based on how well I could remember the definitions of countless terms – like the precise meaning of “computer science” or how to explain “project management” in paragraph form, or the all-too-subtle differences between marketing and advertising.
Wow. Ok. So, if that percentage is accurate, I can see why you can’t get a job. When I got my CS degree (many moons ago) that was maybe 1% of what we were taught.
To me, this is the root of our college problem: The average college student is paying $30,000 a year for the chance to learn valuable skills from professors who haven’t had the opportunity to learn those skills themselves. Maybe it’s a crazy idea, but if you’re going to spend all that money for a college education, shouldn’t you expect to learn real-world skills from people who know what they’re doing?
This seems excessively harsh and a bit misguided. If you want to be learning what is new and trendy, go to a conference, a user group, or actually talk with people who are doing interesting things. By the time those things get packaged up into an approved curriculum, the technology might be on the stale side. But, again, if you don’t understand the fundamentals, you are not going to be able to effectively apply new technology and concepts. No one can give that to you at any price.
Solving the issue of inexperienced teachers may be even simpler: have schools relax academic requirements for professors and focus far more on hiring effective businesspeople. With a little more leeway, academically-minded candidates will have more freedom to gain job experience, and schools may even attract more talent directly from the business world. Success in business and success in the classroom are certainly different things, but I’d wager that it’s a lot easier to show an accomplished businessperson how to teach than it is to show a teacher how to be an accomplished businessperson.
So it sounds like what you want is for all universities to be trade schools, focused on cranking out very specific skills and techniques rather than more broadly educating students and preparing them to apply a wide set of competencies to a range of problem domains. This sounds a bit like the certification trap from the 90s – go get a very narrow, often vendor specific certification but still have no practical experience in applying that knowledge. When that vendor falls out of favor, you are a bit stuck unless you can teach yourself the reasoning and abstraction skills you would have learned in college.
To steal the trite closing from the original article: But what do I know, I have been happily applying my Computer Science degree for nearly 30 years with technologies, programming languages and platforms that never even existed when I graduated.
I have struggled with the Zinio magazine app for about a year now on several different devices. The device I use the most is my Samsung Galaxy Tab 10.1, as it has the best form factor for e-magazine reading. I just can’t believe how wide of the mark Zinio has fallen.
First of all, here is what Zinio thinks it is (from zinio.com):
Zinio is more than a mobile reading application. We’ve spent the last decade creating the digital editions of the magazines you love, delivering the exact same material you get in print‚ plus exclusive features like video, audio and live links, on your iPad, iPhone, desktop and laptop.
No, Zinio, all you are is a reading app (and not a very good one). Things that I would expect to do (and can do in other reading apps) like search, highlight, take notes, bookmark, et al are all missing from your application. When I open the application, I want to be returned to where I left off reading, not to my library or even worse, your shopping app where you waste my bandwidth displaying stuff I didn’t ask for in the first place. And I have yet to see a publication with ‘exclusive video and audio’.
All of the things I listed above I can do with a physical magazine but not with your supposedly enhanced electronic version. The one thing I can do with Zinio that I can’t do with a physical magazine is follow links. But you manage to get that hopelessly wrong as well.
I mean, why, oh why, when I click on a link in a magazine you insist on taking me to your walled-garden browser where I can’t save the URL or basically have access to any browser functions other than viewing? Why not just open the link in the browser on the tablet? Or at least give me the option of choosing? I actually reported this as a bug and got several nonsensical answers back that never addressed the issue.
Me: “Zinio app needs to allow the user to use the browser built into the tablet and not the broken walled-garden browser that zinio forces you to use. the zinio browser lacks much functionality and is a real nuisance to use (and is completely unnecessary)”
Zinio: “Zinio does not have a browser.
Zinio has an app.
If your device browser does not support Flash, you will not be able to read your magazines through Zinio’s website. You must, then, use the app.”
This seems to demonstrate a complete failure to apprehend what I was asking about. So I tried again:
Me: “You are wrong. I run the reader on an Android tablet using the Android version of the Zinio application. When I click on a link inside of a publication in Zinio it does not launch the tablet’s browser, it opens a minimally functional browser window that is part of the Zinio application. This broken version of the Zinio browser does not allow any of the functions of the native browser. Users need to have the option to run the full-featured browser. There is no reason I can see why this shouldn’t be …”
Zinio: “Zinio does not have a browser or an Android Reader.
Zinio has an Android app. The app does not integrate with any browser.”
In frustration, I finally responded: “You are a bit of an idiot and keep repeating yourself. Perhaps you should pass this on to someone with a better understanding of both the Zinio Android application and customer service.”
Here endeth my attempt as a customer to be heard by Zinio.
So I guess it is safe to say that your dubious product is matched by your equally dubious support function.
Well, Zinio, do you recognize these obvious issues and have a plan to fix them? Or should I just anticipate you going out of business in a few years and seek alternatives?
There have always been those few apps that insist on looking like their physical, real world, equivalent. Calculator apps, date books, calendars, note taking apps, “stickies” — you know what I am talking about. Despite there being better options out there, better ways of displaying the data, designers stick with the known representation of the tool.
Now, though, Apple is taking it too far.
If you have seen any of the screenshots linked across the web about the new iCal interface you know what I am talking about. If you haven’t seen those, iCal is looking a lot like it does on the iPad right now in Lion’s developer preview. It’s ugly, and we should be way past this style by now.
Ugly and harder to use than it should be. Designers need to focus on how to allow the user to fluidly access and manipulate their data, not slavishly stick to the limitations of physical items.
Another dimension of this is how poorly developers/designers have approached the touch interface. The industry seems to be mired in a button-driven, pull-a-menu-to-do-anything paradigm. Interfaces really need to take better advantage of long-tap context options and gestures to make interactions more fluid. This is one of the things that drives me bonkers about the iPad – it is so modal; I have to close one app to do something in another. I guess I have gotten used to how easy it is in Android to just share data between apps without having to change apps.
Speaking of Android apps, I think that Feedly is the first really usable news reader that I have encountered on Android. I subscribe to a lot of feeds and that seems to be the death of most readers on mobile devices because the developers thought it would be a good idea to download all your feed updates at once. This typically results in the app going away for a long time. Feedly does it more on demand. And they are clever about using gestures in the app – swipe down and to the left and I have marked that page of articles read and moved on to the next. Brilliant. Much better than ‘pull menu, select mark read, select next page, close menu’ annoyance of other apps.
Here is yet another proclamation on the death of the netbook because of tablet computers. I’m not convinced quite yet. My own experience shows that when we travel with both an iPad and a netbook, the iPad sees some use (in short spurts) but the netbook does the majority of the computing duties. Part of that may be that the iPad in particular is just so darn limited in what it can do (and, in many ways, too cumbersome in the way that it does/doesn’t do things).
The new billboards, developed by the Japanese electronics company NEC, scan the faces of passing shoppers, quickly determine their age and gender, and then display demographic-appropriate ads.
Critics decry the technology as an invasion of privacy, but NEC say people will remain anonymous, their faces instantly deleted. The technology will get an American trial later this year.
Interesting long-ish post on The Mobile Data Apocalypse, And What It Means To You. Of course, as noted in the posting, the assumptions are made based on the Cisco provided data — Cisco not exactly being a disinterested party when it comes to selling more WiFi and network gear.
The mobile industry is now completing a huge shift in its attitude toward mobile data. Until pretty recently, the prevailing attitude among mobile operators was that data was a disappointment. It had been hyped for a decade, and although there were some successes, it had never lived up to the huge growth expectations that were set at the start of the decade. Most operators viewed it as a nice incremental add-on rather than the driver of their businesses.
But in the last year or so, the attitude has shifted dramatically from “no one is using mobile data” to “oh my God, there’s so much demand for mobile data that it’ll destroy the network.” A lot of this attitude shift was caused by the iPhone, which has indeed overloaded some mobile networks. But there’s also a general uptick in data usage from various sources, and the rate of growth seems to be accelerating.
I find it fascinating (and a bit disappointing) that Gartner and others are just beginning to figure out that an effective Enterprise Architecture practice needs to start with an understanding of business strategy and direction and cannot (successfully) exist as a purely technical concern.
Perhaps this is because the early days of EA were really more of an application or technology architecture focus. The much-lauded (and, in my opinion, over-rated) Zachman framework is really nothing more than a taxonomy, as much as it wants to be sold as an ‘architecture’. If you can fill out the top row of Zachman, you have probably exhausted its usefulness (and really gained nothing more than the who/what/when/where/why perspective that you learned in elementary school).
Spewak then came along with another view of EA that was heavily technology oriented. The thrust of this seemed to assert that if you had a complete inventory of your applications and their interactions you were doing EA. No, actually you were on your way to doing portfolio rationalization – a valuable EA service, but not EA in its entirety.
Maybe it was the recent addition of the Business dimension to TOGAF in release 9 that caused these ‘pundits’ to finally come to their senses and realize what successful EA practitioners were doing all along.
It would seem that this technology-focused approach has been the seed corn for the old saw ‘IT needs to align with the business’. I always thought it was odd that there was never an exhortation to have Accounting align with the business or Marketing align with the business. I believe that both parties are to blame here — the business needs to articulate a vision and plan that IT can understand and execute against. Without a clear plan from the business, IT more often than not will turn inward and focus on technology in a way that may or may not support business direction.
This disconnect on having business drive EA reminds me of the strange looks that I would get about 10 years ago when I would try to explain that, before an enterprise rushes into slathering pointy brackets on their data and declaring that they are ‘service oriented’, they should take the opportunity to make sure that there is a single enterprise definition of their data, and use services to expose it in a uniform way across the enterprise. ‘That has nothing to do with SOA!’ I was told. Tsk, tsk, that is data management, not SOA. Now this ‘insight’ is all the rage, with every vendor and consulting firm thumping their chests and proclaiming ‘data comes first’ and ‘the importance of MDM’ as a precursor to SOA.
Similarly, the same pundits thundered on that it was laughable for BPM to be tied to SOA. The problem is that BPM has a certain amount of ambiguity around what the M in BPM means for any given speaker. Is it Modeling? Management? Monitoring? Mapping? So, yes, for all of the non-implementation aspects of BPM, the service orientation part is largely irrelevant. But for any business process implementation that has system touch points (nearly all non-trivial processes do), services do (or should) play a role in exposing business functions in a consistent, re-usable manner within the enterprise.
So, yes, Enterprise Architecture should be business driven, not technology driven. MDM is a critical underpinning for successful SOA. And BPM is probably the most visible part of service orientation and SOA is key to BPM implementation. What next, governance is key to enterprise SOA success?
As I was reading Martin Fowler’s post on ServiceCustodian I was struck by something that, in his words, didn’t smell right. After re-reading the article several times, I finally put my finger on it. He appears to assume that a service is no different than a Java .class file or a .jar. Nothing could be further from the truth.
A true service should reflect a reusable business function, not merely some technical/programmatic detail. As such, it should have a business owner who defines and controls what changes are appropriate to that function at a business level. Having coders make changes willy-nilly could prove disastrous to the business (but quite satisfying to the coders). It is unlikely that the business service owner will be able to understand the nature of a change from a patch (or even what a ‘patch’ was, for that matter). There is no substitute for appropriate documentation and change control procedures to avoid errant changes.
This seems to be an increasingly frequent miss for coders: focusing on the code and what is convenient for the coder rather than on what makes sense for the business that they are supposed to be supporting.
I was reflecting on the state of the BPM marketplace while returning from Software AG’s Innovation World. It seems that, by and large, there are few consultants out there who can advise you on the actual implementation of BPM (the hard part) but plenty of them that can fulminate on the easier theoretical portions. For example, here is a relative plot of the marketplace as I see it:
The justification phase is easy, as it primarily consists of the same pro forma advice for any IT-related project: have an executive sponsor, get business buy-in, don’t try to justify a big-bang approach, blah, blah, blah. Check.
The analysis phase is where the Lean/Six Sigma types will descend upon you with endless discussion of SIPOC and other jargon. Don’t get me wrong, this is a valuable analysis to have, it just does not solve the entire problem.
Then comes the actual implementation and the sounds of crickets in the field. For implementation, that favorite consulting cliche comes out all too often: ‘it all depends’. Well, yes, it does all depend, but if anyone has successfully implemented BPM even a handful of times, they should be able to begin to synthesize a set of best practices and guidelines in general and offer specifics in a given tool stack. This area is sorely wanting — in most cases, even the vendors can’t tell you how to effectively use their own tool stacks in any detail.
Assuming that you have navigated the rocky shores of implementation, there are any number of Business Activity Monitoring and Business Intelligence vendors who will sell you their wares to help you visualize your process data as executive friendly dashboards and portals. They typically have nothing to say about effective data collection and meaningful representation of data.
I tried out the latest Yahoo Go mobile app on my Nokia N95 8GB. Go quickly demonstrated that Yahoo have no idea about the mobile market and their offering stinks. By focusing on bandwidth wasting adverts they undermine the entire mobile experience.
In my case I loaded up Go to try out the new voice search feature. Marginal success, in that it misinterpreted most everything that I spoke into it. Just for fun, I clicked over to check my email. Up pops an error that it can’t connect to email. But apparently what it *could* do was connect to a server and start streaming some useless video for some Ford product that I had absolutely no interest in. If I wasn’t on an unlimited plan, I’d be really pissed. Oh, and there is no way to stop the ad until it downloads completely — sheer genius.
I loved this blog post title Experience should guide, not constrain. Basically the point of the post was a recasting of the old cliche about ‘when all you have is a hammer everything starts to look like a nail’.
What the post really made me think about was the importance of having a breadth of experience in technology as well as depth in a few areas, especially if you are (or aspire to be) an enterprise architect. I personally have been lucky enough to work as a software developer, database administrator, network engineer, project manager and tech lead over my 20+ year career. I feel that each of these roles has helped me as an architect, letting me bring all of that experience to bear on current issues and plans, and to consider trade-offs and side effects.
The converse of this is the puzzling phenomenon I have seen where people who only know Microsoft technologies declare themselves to be ‘enterprise architects’ when in fact they are little more than one-note technologists. This is particularly laughable in enterprises that aren’t 100% MS technology. These EAs probably only have about 5% of the picture — have they lost track of what the ‘enterprise’ really is? So it is no wonder that the way they ‘fix’ a problem is by insisting that it move onto the MS platform (which in my experience is usually the wrong answer).
So in technology as in life, grow what you know, keep learning and try new things.
I had to chuckle at this article wherein IBM seems vexed that the number of computer science and IT graduates is declining in the USA. Really. IBM is probably one of the IT companies that led the charge to offshore jobs and slash US IT positions.
And they wonder why IT is not as attractive an option for college students? They have already sent the message that ‘cheap’ is what they want; not homegrown (or even good, for that matter).