I saw that Google is testing a password generator for the Chrome browser. Hmm, I wonder if that means that they will stop storing passwords in clear text?

Password-generating tools like LastPass, 1Password, RoboForm, and others are a mainstay of browser accessories, and are often recommended by security experts because they can help create and manage “strong” passwords. “Strong” refers to passwords that are difficult for hackers and computers to guess. Google’s effort, if it makes it into the regular version of Chrome, could encourage other browser makers to build password generators and make the field more competitive.

1Password has the advantage that it is multi-platform and not tied to a single browser, which I consider to be a very good thing. Having each browser create its own incompatible password manager would be even worse than each browser having its own incompatible HTML interpreter.
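
For a sense of what these generators actually produce, here is a minimal sketch in Python using the standard secrets module; the length and character set are my own assumptions, not the behavior of any particular tool.

    # Minimal sketch of strong-password generation. The length and character
    # set here are my own choices, not those of any particular password manager.
    import secrets
    import string

    def generate_password(length: int = 20) -> str:
        """Return a random password drawn from letters, digits, and punctuation."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(generate_password())  # different every run, e.g. 'k#V9w...'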

Another recent post, The Four Principles of Successful APIs, again focuses on API design but could/should apply to all tech efforts. This time the guidance takes a slightly different approach:

    1. Understand The Strategy
    2. Decide Who You Are Really Designing For
    3. Start Small and Iterate
    4. Architect for the Long Term with Abstraction

#2 sounds like a component of #1 – your target user base should be part of your strategy. #3 is a good opportunity to apply the consistency principle from the previous posting. #4 is interesting because abstraction seems to be a hard concept for developers who tend to think that API = CRUD overlay.
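
To make that last point concrete, here is a hypothetical Python sketch of my own (not from the post): the first class simply mirrors the storage model, while the second abstracts it behind intent.

    # Hypothetical illustration of CRUD overlay vs. abstraction; all names invented.

    class AccountTableAPI:
        """CRUD overlay: callers must know the storage model and poke it directly."""
        def update_row(self, account_id: int, column: str, value) -> None:
            ...  # writes straight through to the 'accounts' table

    class AccountAPI:
        """Abstracted API: callers express intent; the storage model can change freely."""
        def suspend(self, account_id: int, reason: str) -> None:
            ...  # may touch several tables, emit events, notify billing, etc.

        def reinstate(self, account_id: int) -> None:
            ...

The first version leaks the schema to every consumer and breaks them all when it changes; the second can evolve underneath the interface.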

I had to chuckle when I read through this post titled When crafting your API strategy, put design first. It is very high-level and could/should apply to anything. Here are the main points:

    Design for consistency
    Design for scale
    Design for people

Check. Yes to all of these. I suppose some folks need to be reminded of this. Especially the ‘sling code first and declare victory at some arbitrary point’ proponents. The ones with 60 hours of production downtime a month because design ‘just slows them down’. Apparently downtime doesn’t slow them down, but it sure slows down the consumers.

Another chuckle was this paragraph, which is nearly a direct quote from me (emphasis added):

Planning too little is dangerous. But so is planning too much. This isn’t a science experiment to find the ideal design. Perfection isn’t the goal: consistency is.

IBM’s Watson is not just good for game shows – apparently it takes a turn at bartending: This Cocktail Concocted By IBM’s Watson Isn’t Half Bad

TL;DR version:

What You’ll Need:

    1.5 oz. coconut milk
    3 oz. white rum
    3 oz. banana juice
    4 oz. pure pineapple juice
    1/2 oz. fresh lime juice
    10(ish) drops blue food coloring
    2 USB sticks for garnish (optional)

To Finish:

    5 oz. Sprite or similar soda

I saw this whiny article in the Washington Post that was just begging for a response. The gist of the article is that the author got a Computer Science degree and wasn’t given his dream job out of the gate. This then becomes an indictment of the education system rather than the typical sniveling millennial i-wasn’t-given-the-world-without-having-to-work-for-it screech that it is. Let’s take a look at some quotes from the posting:

My college education left me totally unprepared to enter the real workforce. My degree was supposed to make me qualified as a programmer, but by the time I left school, all of the software and programming languages I’d learned had been obsolete for years.

I think this belies a misunderstanding of how higher education works – it is not what you are given, it is what you do with it (the whole learning-how-to-learn thing). It is as if he expected to read a book on swimming without ever getting into the pool, and then was surprised he couldn’t swim because he put no effort into applying what he learned. Also, if ‘all of the software and programming languages’ were obsolete, what were they teaching? FORTRAN? RPG? Visual Basic?

To find real work, I had to teach myself new technologies and skills outside of class, and it wasn’t easy.

Poor you. You should have been doing this all along. The Computer Science curriculum should be teaching you fundamental concepts of how computers work, along with programming concepts and techniques that can be applied across specific programming languages, databases and platforms. Actually, it is a bit shocking how many recent CS grads don’t have a grasp of fundamentals.

Businesses aren’t looking for college grads, they’re looking for employees who can actually do things – like build iPhone apps, manage ad campaigns and write convincing marketing copy. I wish I’d been taught how to do those things in school, but my college had something different in mind.

Businesses are indeed looking for those things, but they are looking for people who can learn and grow and apply what they have learned in the past. If you have a CS degree and can’t figure out how to write an iPhone app you either had a horrible curriculum or slept through most of your class time. The fact that you weren’t specifically trained for that is not a problem with your education. Rather it is a failure to apply what you should have learned.

At least 90 percent of my college education (and that of so many others) boiled down to pure terminology, or analysis of terminology. My success in any given class was almost wholly based on how well I could remember the definitions of countless terms – like the precise meaning of “computer science” or how to explain “project management” in paragraph form, or the all-too-subtle differences between marketing and advertising.

Wow. Ok. So, if that percentage is accurate, I can see why you can’t get a job. When I got my CS degree (many moons ago) that was maybe 1% of what we were taught.

To me, this is the root of our college problem: The average college student is paying $30,000 a year for the chance to learn valuable skills from professors who haven’t had the opportunity to learn those skills themselves. Maybe it’s a crazy idea, but if you’re going to spend all that money for a college education, shouldn’t you expect to learn real-world skills from people who know what they’re doing?

This seems excessively harsh and a bit misguided. If you want to be learning what is new and trendy, go to a conference, a user group, or actually talk with people who are doing interesting things. By the time those things get packaged up into an approved curriculum, the technology might be on the stale side. But, again, if you don’t understand the fundamentals, you are not going to be able to effectively apply new technology and concepts. No one can give that to you at any price.

Solving the issue of inexperienced teachers may be even simpler: have schools relax academic requirements for professors and focus far more on hiring effective businesspeople. With a little more leeway, academically-minded candidates will have more freedom to gain job experience, and schools may even attract more talent directly from the business world. Success in business and success in the classroom are certainly different things, but I’d wager that it’s a lot easier to show an accomplished businessperson how to teach than it is to show a teacher how to be an accomplished businessperson.

So it sounds like what you want is for all universities to be trade schools, focused on cranking out very specific skills and techniques rather than more broadly educating students and preparing them to apply a wide set of competencies to a range of problem domains. This sounds a bit like the certification trap from the 90s – go get a very narrow, often vendor-specific certification but still have no practical experience in applying that knowledge. When that vendor falls out of favor, you are a bit stuck unless you can teach yourself the reasoning and abstraction skills you would have learned in college.

To steal the trite closing from the original article: But what do I know, I have been happily applying my Computer Science degree for nearly 30 years with technologies, programming languages and platforms that didn’t even exist when I graduated.

One of the things that I really like about the Android ecosystem is the degree to which you can customize it to suit yourself (and not what some designer in Cupertino thought was good enough). This extends to the newly released Android Wear devices (I opted for the Samsung Gear Live).

I just discovered a new widget app called Zooper Wear – Square Wearables that brings useful aggregation to Wear notifications. I really like having a consolidated view of the number and type of alerts all on one screen. Throw in current temperature and battery level and I am sold.

Interestingly, top coffee drinkers are from countries that probably couldn’t grow their own coffee if they tried.

America might be famous for running on coffee, but it doesn’t run on much. Not compared to a handful of other countries, anyway. When it comes to actual coffee consumption per person, the US doesn’t even crack the top 15.

I would love to see (and use) this in more locations. Sadly, it will likely be quickly perverted to route visitors to/near shops and other unattractive locales.

If you want to find the most scenic route to get somewhere, there may soon be an app for that. Daniele Quercia and colleagues at Yahoo Labs in Barcelona have come up with a way to create a crowd-sourced measure of a city’s beauty, and made an algorithm to find the prettiest way to get from one point to another. “The goal of this work is to automatically suggest routes that are not only short but also emotionally pleasant,” the scientists told Technology Review:

Quercia and co begin by creating a database of images of various parts of the center of London taken from Google Street View and Geograph, both of which have reasonably consistent standards of images. They then crowd-sourced opinions about the beauty of each location using a website called UrbanGems.org.
Each visitor to UrbanGems sees two photographs and chooses the one which shows the more beautiful location. That gives the team a crowd-sourced opinion about the beauty of each location. They then plot each of these locations and their beauty score on a map which they use to provide directions.
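
As a rough sketch of how beauty-weighted routing could work (my own simplification, not the researchers’ actual algorithm, with an invented graph and scores):

    # Rough sketch of beauty-weighted routing: trade a little extra distance for
    # higher crowd-sourced beauty scores. Graph, scores, and weights are invented.
    import heapq

    # node -> list of (neighbor, distance in meters)
    edges = {
        "A": [("B", 150), ("C", 100)],
        "B": [("D", 120)],
        "C": [("D", 80)],
        "D": [],
    }
    # crowd-sourced beauty score per location, 0 (ugly) to 1 (beautiful)
    beauty = {"A": 0.4, "B": 0.9, "C": 0.2, "D": 0.5}

    def scenic_route(start, goal, beauty_weight=200):
        """Dijkstra over a cost that penalizes distance and rewards beauty."""
        queue = [(0, start, [start])]
        seen = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return path
            if node in seen:
                continue
            seen.add(node)
            for nxt, dist in edges[node]:
                step_cost = dist + beauty_weight * (1 - beauty[nxt])
                heapq.heappush(queue, (cost + step_cost, nxt, path + [nxt]))
        return None

    print(scenic_route("A", "D"))  # ['A', 'B', 'D'] -- longer, but via the prettier B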

This topic has been getting a lot of banter lately. I like the Guardian’s perspective on the topic.

Picking up a book is gratifying: look at me, not reading dumb listicles on the internet! Finishing a book, however, is a challenge. Which of this summer’s top-selling books have the highest reader attrition? Dr. Jordan Ellenberg has a semi-scientific way to find out, using buyer-generated info from Amazon to identify this year’s most unread book.

It’s a charmingly simple (if not entirely rigorous) method: Dr. Ellenberg cruises the “Popular Highlights” listings for each title, which show the five passages most frequently highlighted by Kindle readers. If most folks make it to the very last page, those passages should come from the front, the back, and everywhere in between. If everyone drops off in Chapter 3, the most popular passages will be focused in the first few pages.
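
A back-of-the-envelope version of that calculation (the page numbers below are invented; the idea is simply to average the positions of the top highlights as a fraction of the book’s length):

    # Back-of-the-envelope "Hawking Index": average page position of the most
    # popular highlights as a fraction of the book. Page numbers are invented.

    def hawking_index(highlight_pages, total_pages):
        """Average position of the popular highlights, as a fraction of the book."""
        return sum(highlight_pages) / len(highlight_pages) / total_pages

    # Highlights clustered near the front suggest readers dropped off early.
    print(hawking_index([8, 14, 21, 26, 33], total_pages=700))      # ~0.03
    # Highlights spread throughout suggest people actually finished.
    print(hawking_index([40, 180, 350, 510, 660], total_pages=700)) # ~0.50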

Should you finish every book you start?

…But this funny business of the Hawking Index, a lighthearted attempt to work out how far people persist in reading books, as indicated by the passages they highlight on their Kindles, has got me thinking. And it’s made me realise that my view has changed. I used to believe that if you really weren’t enjoying a book, you should toss it to one side and move on to something you might find more rewarding; essentially, it was born of an insurmountable fear of the sheer number of books I wouldn’t get round to reading before I died.

But things have changed. Clearly, I’ve got older and realised that I was a fool to see world literature as a mountain I had to scale, but more to the point, I’ve seen the threat that endless distractions and a wussy, don’t-like-it, bring-me-another attitude poses to our reading culture. I know I risk sounding po-faced, but the best books are a medium of thick description, painstakingly built word by word to produce strange and unexpected effects in the brain and heart; they deserve more than being treated like a passing bit of entertainment that hasn’t quite lived up to the reader’s exacting standards.

Interesting development, but even Chrome doesn’t support Dart by default.

Google’s Dart language is now an official ECMA standard with the catchy name of ECMA-408. ECMA may not be a household name, but if you’re reading this, your browser is using ECMAScript to render at least some parts of this page. That’s because ECMA is the official standards body behind JavaScript (standardized as ECMAScript). In the past, the organization has also been behind the specs for JSON, C#, the Office Open XML format and various CD-ROM specs.

I am not so sure how ‘useful’ these examples are. Is the fact that most of them have negative connotations a reflection of the person who curated them or of the Russian language?

This was one of my favorites:

переподвыподверт (‘per-e-pod-‘voy-pod-‘vert)
Reddit user deffun on /r/doesnottranslate defined this noun as “to do something in a complex, incomprehensible way.”

The word kind of embodies itself, as it has four prefixes including one that repeats itself twice.

I think this is an interesting approach to gesture control of a device that does not require a camera. It uses changes in wireless signals.

The challenge is going to be making this work in an area occupied by multiple people (and with multiple devices). How do you ‘cue’ a device that it should react to a gesture? What do you do about mis-cues?
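
Purely as a thought experiment (this is my own toy sketch, not the actual system, which surely uses far richer signal features), the core idea might look like watching how much the received signal strength fluctuates:

    # Toy sketch: flag a possible gesture when RSSI readings fluctuate more than
    # a threshold over a short window. Not the actual system; values invented.
    from statistics import pstdev

    def detect_gesture(rssi_window, threshold=3.0):
        """Return True if signal-strength readings vary enough to suggest a gesture."""
        return pstdev(rssi_window) > threshold

    still_room = [-52, -52, -53, -52, -52, -53]   # little variation: no gesture
    hand_wave = [-52, -47, -58, -45, -60, -50]    # large swings: likely a gesture
    print(detect_gesture(still_room))  # False
    print(detect_gesture(hand_wave))   # True

Even this toy version hints at the cueing problem: anything that disturbs the signal looks like a gesture.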

About a month ago, I attended the API Strategy and Practice conference in San Francisco. Overall, a pretty good conference and, as always, much of the value was in connecting with people between and after sessions.

One panel discussion concluded with the question ‘what is the future of the internet?’. The responses seemed to fall into two categories: 1) code/APIs everywhere and/or 2) intelligent consumption and composition of the available APIs.

I wanted to point out that the first category of thought reflected the view of a (now defunct) little technology outfit just down the highway that voiced the credo of ‘the network is the computer’. They had a great spec/implementation for a technology called JINI that very much reflected the philosophy of ‘be a node and not a hub’ – inherently scalable and cluster-able, running practically anywhere.

The second group reminded me of the great AI gold rush in the mid-90s, when ‘intelligent agents’ were going to manage all our personal data and book travel and plan meetings based on all of the metadata that we surround ourselves with. Companies were funded and failed trying to deliver on this vision (General Magic anyone?). Perhaps it was an idea before its time, and enough has changed and opened up that it might work this time. We shall see.