The State of the Internet – 2016

Self-proclaimed futurist tweets an obfuscated link to an ad-encrusted pull quote that links to an article… behind a paywall.

For some, the internet has moved from a means to share information and ideas to one that exists solely to generate clicks that have zero information value (well, except to them – ‘ad impressions’ and all that). And, no, I don’t want to sign up for your email-harvesting ‘newsletter’ that you never publish but profit from by selling on my contact information.

IoT and Auto-Replenishment – A Good Thing?

I guess I am taking a somewhat more cautious/skeptical stance when it comes to the auto-replenishment feature touted by many IoT pundits and vendors. If you aren’t familiar, the idea is that a device determines you are out of (or running low on) a given consumable – be it a food item, dish soap or toilet paper – and then orders more of it on your behalf.

Here is the problem: the vendor and the device don’t have your best interests at heart and might tend to exaggerate the current state of the consumable and (maybe) order more of it more frequently than you actually need (or want). For example, if you have ever owned an inkjet or laser printer you have probably experienced this already – persistent warnings/notifications to replace a toner or ink cartridge when, in reality, the useful life of the item is much, much longer than you are being led to believe. Heck, I have a laser printer that has been telling me for 13 months that I need to replace the toner. In that time my family and I have printed hundreds of additional pages with this ‘empty’ toner cartridge.
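To make the concern concrete, here is a minimal sketch (my own, with made-up threshold numbers – not any vendor’s actual logic) of how a vendor-tuned reorder threshold triggers purchases long before the consumable is actually gone:

```python
# Hypothetical sketch of auto-replenishment logic; the threshold values are
# invented for illustration, not taken from any real device or vendor.

def should_reorder(remaining_pct: float, reorder_threshold_pct: float) -> bool:
    """Reorder when the reported remaining level falls below the threshold."""
    return remaining_pct <= reorder_threshold_pct

# A consumer-friendly device would wait until the item is nearly gone...
print(should_reorder(remaining_pct=20.0, reorder_threshold_pct=5.0))   # False

# ...while a vendor-tuned device can declare 'empty' far earlier and keep
# ordering refills you don't need yet (see: my 13-month 'empty' toner).
print(should_reorder(remaining_pct=20.0, reorder_threshold_pct=25.0))  # True
```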

Consider also the existing confusion over the meaning of ‘sell by’ and ‘best by’ designations on other consumables (most notably food). What if vendors add a ‘replenish by’ or ‘order by’ date into the mix? Not a great situation for consumers, especially if they have delegated this to a networked device in the name of ‘convenience’.

Creating a shopping list: Evernote vs Google Keep

Evernote experience:

1) create list in desktop app
2) attempt to share it with my wife; sorry, have to upgrade to paid version for this
3) finally share with wife, she attempts to edit shared list; sorry, she has to upgrade to paid version (screw that)
4) remember a few more items on the go, add to list via mobile app
5) attempt to sync from mobile; get loads of errors – sync fails
6) only way to fix sync error is to copy note contents, delete note and paste contents into a new note
7) repeat from step 2 or just give up

Google Keep experience:
1) create list on tablet using Keep app
2) share with wife; no problem – she has access to it within seconds
3) she needs to add items to the list – no problem; she adds them and they automatically sync with me
4) edit list on mobile – no problem; list automatically syncs
5) both of us run Keep app in grocery store, ticking off items from the list; no problem – list syncs automatically
6) marvel at the superior user experience from Google Keep
7) BONUS: I can set a reminder on the list that is a location; Google Now notifies me when I am near the store.

Evernote just keeps getting worse and worse. About the only thing that keeps me using it is the web clip functionality in the browser. Come on Keep, add that and I can leave Evernote behind.

Learn How To Learn And Stop Blaming Your Education

I saw this whiny article in the Washington Post that was just begging for a response. The gist of the article is that the author got a Computer Science degree and wasn’t given his dream job right out of the gate. He then turns this into an indictment of the education system rather than recognizing it for the typical sniveling millennial i-wasn’t-given-the-world-without-having-to-work-for-it screech that it is. Let’s take a look at some quotes from the posting:

My college education left me totally unprepared to enter the real workforce. My degree was supposed to make me qualified as a programmer, but by the time I left school, all of the software and programming languages I’d learned had been obsolete for years.

I think this betrays a misunderstanding of how higher education works – it is not what you are given, it is what you do with it (the whole learning-how-to-learn thing). It is as if he expected to read a book on swimming without ever getting into the pool and then, ‘surprise’, he can’t actually swim because he put no effort into applying what he learned. Also, if ‘all of the software and programming languages’ were obsolete, what were they teaching? FORTRAN? RPG? Visual Basic?

To find real work, I had to teach myself new technologies and skills outside of class, and it wasn’t easy.

Poor you. You should have been doing this all along. The Computer Science curriculum should be teaching you the fundamentals: how computers work, plus programming concepts and techniques that can be applied across specific programming languages, databases and platforms. Actually, it is a bit shocking how many recent CS grads don’t have a grasp of those fundamentals.

Businesses aren’t looking for college grads, they’re looking for employees who can actually do things – like build iPhone apps, manage ad campaigns and write convincing marketing copy. I wish I’d been taught how to do those things in school, but my college had something different in mind.

Businesses are indeed looking for those things, but they are looking for people who can learn and grow and apply what they have learned in the past. If you have a CS degree and can’t figure out how to write an iPhone app you either had a horrible curriculum or slept through most of your class time. The fact that you weren’t specifically trained for that is not a problem with your education. Rather it is a failure to apply what you should have learned.

At least 90 percent of my college education (and that of so many others) boiled down to pure terminology, or analysis of terminology. My success in any given class was almost wholly based on how well I could remember the definitions of countless terms – like the precise meaning of “computer science” or how to explain “project management” in paragraph form, or the all-too-subtle differences between marketing and advertising.

Wow. Ok. So, if that percentage is accurate, I can see why you can’t get a job. When I got my CS degree (many moons ago) that was maybe 1% of what we were taught.

To me, this is the root of our college problem: The average college student is paying $30,000 a year for the chance to learn valuable skills from professors who haven’t had the opportunity to learn those skills themselves. Maybe it’s a crazy idea, but if you’re going to spend all that money for a college education, shouldn’t you expect to learn real-world skills from people who know what they’re doing?

This seems excessively harsh and a bit misguided. If you want to be learning what is new and trendy, go to a conference, a user group, or actually talk with people who are doing interesting things. By the time those things get packaged up into an approved curriculum, the technology might be on the stale side. But, again, if you don’t understand the fundamentals, you are not going to be able to effectively apply new technology and concepts. No one can give that to you at any price.

Solving the issue of inexperienced teachers may be even simpler: have schools relax academic requirements for professors and focus far more on hiring effective businesspeople. With a little more leeway, academically-minded candidates will have more freedom to gain job experience, and schools may even attract more talent directly from the business world. Success in business and success in the classroom are certainly different things, but I’d wager that it’s a lot easier to show an accomplished businessperson how to teach than it is to show a teacher how to be an accomplished businessperson.

So it sounds like what you want is for all universities to be trade schools, focused on cranking out very specific skills and techniques rather than more broadly educating students and preparing them to apply a wide set of competencies to a range of problem domains. This sounds a bit like the certification trap from the 90s – go get a very narrow, often vendor-specific certification but still have no practical experience in applying that knowledge. When that vendor falls out of favor, you are a bit stuck unless you can teach yourself the reasoning and abstraction skills you would have learned in college.

To steal the trite closing from the original article: But what do I know, I have been happily applying my Computer Science degree for nearly 30 years to technologies, programming languages and platforms that didn’t even exist when I graduated.

Algorithm Maps The Most Beautiful Route To Where You’re Going

I would love to see (and use) this in more locations. Sadly, it will likely be quickly perverted to route visitors to/near shops and other unattractive locales.

If you want to find the most scenic route to get somewhere, there may soon be an app for that. Daniele Quercia and colleagues at Yahoo Labs in Barcelona have come up with a way to create a crowd-sourced measure of a city’s beauty, and made an algorithm to find the prettiest way to get from one point to another. “The goal of this work is to automatically suggest routes that are not only short but also emotionally pleasant,” the scientists told Technology Review:

Quercia and co begin by creating a database of images of various parts of the center of London taken from Google Street View and Geograph, both of which have reasonably consistent standards of images. They then crowd-sourced opinions about the beauty of each location using a website called UrbanGems.org.
Each visitor to UrbanGems sees two photographs and chooses the one which shows the more beautiful location. That gives the team a crowd-sourced opinion about the beauty of each location. They then plot each of these locations and their beauty score on a map which they use to provide directions.
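The quoted description is all the detail we get, but the core idea is easy to sketch. Here is my own illustrative version (not the Yahoo Labs implementation): a standard shortest-path search where each street segment’s cost blends its length with a crowd-sourced beauty score.

```python
import heapq

# My own illustrative sketch, not the Yahoo Labs implementation. Each edge
# carries a distance and a crowd-sourced beauty score in [0, 1]; the search
# pays a small distance penalty to favor the prettier streets.

def scenic_route(graph, start, goal, alpha=0.5):
    """graph: {node: [(neighbor, distance_km, beauty_0_to_1), ...]}"""
    queue = [(0.0, start, [start])]
    best = {start: 0.0}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        for nxt, dist, beauty in graph.get(node, []):
            # Ugly segments (beauty near 0) are made to look 'longer'.
            new_cost = cost + dist * (1.0 + alpha * (1.0 - beauty))
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(queue, (new_cost, nxt, path + [nxt]))
    return None

# Tiny made-up street graph: the direct street is shorter but drab, the
# detour is slightly longer but far prettier, so the detour wins.
streets = {
    "A": [("B", 1.0, 0.1), ("C", 0.7, 0.9)],
    "C": [("B", 0.6, 0.9)],
}
print(scenic_route(streets, "A", "B"))  # ['A', 'C', 'B']
```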

“Useful” Russian Words With No English Equivalent

I am not so sure how ‘useful’ these examples are. Is the fact that most of them have negative connotations a reflection of the person who curated them or of the Russian language?

This was one of my favorites:

переподвыподверт (‘per-e-pod-‘voy-pod-‘vert)
Reddit user deffun on /r/doesnottranslate defined this noun as “to do something in a complex, incomprehensible way.”

The word kind of embodies itself, as it has four prefixes including one that repeats itself twice.

Thoughts on APIStrat Panel on The Future of the Internet

About a month ago, I attended the API Strategy and Practice conference in San Francisco. Overall, a pretty good conference and, as always, much of the value was in connecting with people between and after sessions.

One panel discussion concluded with the question ‘what is the future of the internet?’. The responses seemed to fall into two categories: 1) code/APIs everywhere and/or 2) intelligent consumption and composition of the available APIs.

I wanted to point out that the first category of thought reflected the view of a (now defunct) little technology outfit just down the highway that voiced the credo ‘the network is the computer’. They had a great spec/implementation for a technology called JINI that very much reflected the philosophy of ‘be a node and not a hub’ – inherently scalable and cluster-able, running practically anywhere.

The second group reminded me of the great AI gold rush in the mid-90s, when ‘intelligent agents’ were going to manage all our personal data and book travel and plan meetings based on all of the metadata that we surround ourselves with. Companies were funded and failed trying to deliver on this vision (General Magic, anyone?). Perhaps it was an idea before its time, and enough has changed and opened up that it might work this time. We shall see.

Women And Tech Events

Not even sure where to start with this psycho-babble rant I just read.

To me, the whole thing reads like an insecure individual trying to justify their warped world view by blaming it on everyone else. Sure, I know that there are bad things that happen to women at tech conferences (and elsewhere), and that is stupid and inexcusable. But to use that as justification for the view that *all* men are rapists/gropers/whatever is also stupid and inexcusable. The simple fact is that if you are looking to be offended, you will find offense in everything and everybody around you. Especially if you play both sides of a situation: if someone engages with you, it is for strictly sexual purposes, and if they don’t, then they are *obviously* denigrating you because you are a female. There is no good way out of that spiral other than recognizing that the premise is set up for self-fulfilling distrust.

Being guilty of whatever darkness is in another person’s head is just raw prejudice with an unhealthy dose of over-generalization/labeling. Sure, let’s play the game (fill in the blanks): All ______ are lazy. All _____ are cheap. All _____ are bad drivers. And all men at tech events are misogynistic predators. Right.

I have some recent evidence on this front. I was at a tech conference in San Francisco last month and had some fantastic conversations with some of the female attendees there (and the male ones as well). Topics ranged from privacy, genetic testing, pregnancy(!), hypermedia and security concerns for services to agile practices, gardening, coding standards and whisky. No one was groped or molested or talked down to. Oddly enough, at the drinks reception on the last night, I did have a woman approach me several times and try to invite herself back to my room (which I gently but firmly declined). Do I think all women at tech events are horn-dogs because of that? No, not for one minute.

For an excellent exploration of learning to be offended, see this article at NPR. The referenced article is coming at this topic from race rather than gender, but I think it resonates with many of the points I made.

Internet of Things

I am a little wary of the vague but hype-intensive discussions around the Internet of Things (IoT). I am particularly leery when I ask a pundit in the area ‘what *specifically* is IoT going to have the biggest impact on?’ The answer tends to meander along the path of ‘it will change everything’ and ‘there isn’t anything you can’t do with it’. Right. Sort of reminds me of the 90s-era hyperbolic proclamations about Object Oriented databases and how they were going to change everything. As one wag rightly summarized that hype: ‘Object oriented databases are a billion dollar market with no customers’.

SCADA, RFID, SNMP, RPC, etc. – didn’t all of these come with the same set of snares and delusions that seems to surround the IoT piper? I fear the only thing that is different this time around is that IoT is paired with the equally rabid running mate ‘Big Data’, which is desperately trying to find a problem to solve and, in so doing, might encourage the accumulation of whatever IoT data it can take on.

Irrational Consumerism

Yes, it is that time of year when people go shopping because, well, they are supposed to shop. New York magazine has a great article, Why Black Friday Is A Behavioral Economist’s Nightmare, that explores just how crazy this is:

The big problem with Black Friday, from a behavioral economist’s perspective, is that every incentive a consumer could possibly have to participate — the promise of “doorbuster” deals on big-ticket items like TVs and computers, the opportunity to get all your holiday shopping done at once — is either largely illusory or outweighed by a disincentive on the other side. It’s a nationwide experiment in consumer irrationality, dressed up as a cheerful holiday add-on.

It then goes on to explore the retailing ‘tricks’ that are employed:

The doorbuster: The doorbuster is a big-ticket item (typically, a TV or other consumer electronics item) that retailers advertise at an extremely low cost. (At Best Buy this year, it’s this $179.99 Toshiba TV.) We call these things “loss-leaders,” but rarely are the items actually sold at a loss. More often, they’re sold at or slightly above cost in order to get you in the store, where you’ll buy more stuff that is priced at normal, high-margin levels.
That’s the retailer’s Black Friday secret: You never just buy the TV. You buy the gold-plated HDMI cables, the fancy wall-mount kit (with the installation fee), the expensive power strip, and the Xbox game that catches your eye across the aisle. And by the time you’re checking out, any gains you might have made on the TV itself have vanished.

Implied scarcity: This is when a store attempts to drum up interest in an item by claiming “limited quantity” or “maximum two per customer,” which makes us think we’re getting something valuable when we may not be. It’s a staple of deceptive marketing, and at no time in the calendar year is it in wider use than on Black Friday. (There is also actual scarcity on Black Friday — when stores carry only 50 or 100 of an advertised doorbuster item — which also introduces a risk that you’ll be 51st or 101st in line and waste your time entirely. Both are bad.)
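To see how quickly the loss-leader math turns against you, here is a back-of-the-envelope sketch. Only the $179.99 doorbuster price comes from the article; the ‘regular’ price, the add-on prices and the margin percentage are invented for illustration:

```python
# Back-of-the-envelope illustration; only the $179.99 doorbuster price is
# from the article - the regular price, add-on prices and margin are made up.

tv_regular_price = 279.99
tv_doorbuster_price = 179.99
shopper_savings = tv_regular_price - tv_doorbuster_price

addons = {
    "gold-plated HDMI cable": 49.99,
    "wall-mount kit plus installation": 99.99,
    "fancy power strip": 39.99,
    "impulse-buy Xbox game": 59.99,
}
# Assume roughly 60% of each accessory's price is retailer margin.
retailer_margin_on_addons = 0.60 * sum(addons.values())

print(f"Shopper 'saves' on the TV:  ${shopper_savings:.2f}")
print(f"Retailer margin on add-ons: ${retailer_margin_on_addons:.2f}")
print("Shopper comes out ahead?   ", shopper_savings > retailer_margin_on_addons)
```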

I spent Black Friday at home, with my family, working through my to-do list. Aside from lunch, we didn’t venture out to buy a thing.

What If We Reviewed Movies The Same Way We Review Tablets?

What if we reviewed movies the same way that we review tablets? That is, don’t rate them based on their own merits but always relative to some other popular movie, allow lots of subjective, unsupported assertions and conclude that popularity equals quality. So if we assume that Spiderman was the benchmark du jour, it might go something like this:

Avengers had quite a few popular characters in it, but the fatal flaw was that there was no Spiderman. However, everyone noted that many of the characters closely copied Spiderman in having an alternate identity, special powers and a snazzy costume; clearly these touches were meant to make the characters more like Spiderman, who is the leader in the super hero space. While the movie was entertaining, it just didn’t have the same flow and ‘ease of watching’ that Spiderman did. And while we paid less to see the Avengers at a matinee, the quality of Spiderman clearly made it worth the extra ticket expense because everyone knows that Spiderman is just a higher quality product. We are sure that the Avengers might appeal to some people, but we still believe that Spiderman is the best movie there is.

Application Privacy Over-Reach

The amount of attempted privacy over-reach in mobile apps is appalling. The number of mobile applications that, either out of the box or via subsequent updates, require the privilege to access (and in some cases upload) the contacts on your device is growing. In most cases the same reason is given for this invasive action: it is for *your* convenience. Meh. It is unnecessary, plain and simple. Twitter, Facebook, Foursquare, Yelp, Linkedin, Path, Gowalla and others all do this – with or without your permission or knowledge. I mean, why would an application that is supposed to serve as a remote control for your TV need access to your contacts?

Second to the contacts grab is the gratuitous demand for fine-grained location information for no apparent reason. For example, why would an application that identifies music need your precise location to work? What does that have to do with music recognition? It seems to be collecting data just for the sake of collecting it.

It is beginning to get even more obnoxious. Some web-based services no longer allow users to create their own usernames and passwords. Rather, they try to force you to log in using your Twitter or Facebook accounts. And, with few exceptions, they require access to your contacts and other inappropriate information. Some applications (typically browsers) are taking this approach as well. This is even more heinous because now they not only have your contact information, they have a record of every site you visit and every keystroke that you type into every site that you visit. Think about that before you run something like RockMelt.

Be aware of what permissions sites and applications ‘require’ and don’t be afraid to say no. After all, it is your data that is being given away. And once it is gone, chances are you’ll never get it back or get it deleted.
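On Android, one crude way to exercise that caution is to look at the permissions an app declares in its manifest before you install it (or accept an update). READ_CONTACTS and ACCESS_FINE_LOCATION are real Android permission names; the sample manifest and the little script below are my own illustration, not any particular app’s actual manifest:

```python
import xml.etree.ElementTree as ET

# Crude illustration: flag the manifest permissions behind the contacts grab
# and fine-grained location. The two permission names are real Android
# permissions; the sample manifest below is invented for this example.

SUSPECT_PERMISSIONS = {
    "android.permission.READ_CONTACTS": "can read (and potentially upload) your contacts",
    "android.permission.ACCESS_FINE_LOCATION": "can track your precise location",
}

SAMPLE_MANIFEST = """\
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
  <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
</manifest>
"""

ANDROID_NAME_ATTR = "{http://schemas.android.com/apk/res/android}name"

root = ET.fromstring(SAMPLE_MANIFEST)
for perm in root.iter("uses-permission"):
    name = perm.get(ANDROID_NAME_ATTR)
    if name in SUSPECT_PERMISSIONS:
        print(f"{name}: {SUSPECT_PERMISSIONS[name]}")
```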

Holiday Shopping And The Sheeple

I am a bit amazed at the manufactured frenzy that is Black Friday and Cyber Monday. It seems that each year the press does their very best to hype something that really doesn’t have a need to exist any longer (and probably doesn’t for the majority of people).

There really is no reason for people to be pitching tents in front of retailers the day before Thanksgiving so they can be first in line for the big ‘deals’. Is this really more of a social thing than a necessary thing? Do these people not value their own time? Or do they (the sheeple) do it because the press tells them that is what they should do? Is the press trying to justify its repeated (if not specious) claim that the day after Thanksgiving is ‘the busiest shopping day of the year’, when actual facts (something that journalism in this country seems to have only a nodding acquaintance with of late) show that the weekend before Christmas typically sees the busiest shopping of the year? The only thing that I bought on ‘Black Friday’ was a couple of pints at the pub – well away from the shopping mayhem.

The ‘Cyber Monday’ hype is another head-scratcher. I could see how this might have been significant a decade ago, when most people didn’t have high-speed internet connectivity at home and availed themselves of their employer’s internet pipe after returning from the Thanksgiving holiday. But now most people *do* have high-speed connectivity at home. And not only that, they have it the other 364 days of the year as well, so there is no practical need to wait for a specific day to do their online ordering. In fact, quite a few folks I know begin shopping online as early as October to ensure that they get the selection they want and have plenty of time to deal with backorders and special orders.

Figure it out folks. Don’t believe the hype.

User Interfaces Need To Evolve

This post titled Don’t Mimic Real-World Interfaces really resonated with me and reminded me of a post that I had done a while ago titled Evolution Of The Mobile Experience.

There have always been those few apps that insist on looking like their physical, real world, equivalent. Calculator apps, date books, calendars, note taking apps, “stickies” — you know what I am talking about. Despite there being better options out there, better ways of displaying the data, designers stick with the known representation of the tool.

Now, though, Apple is taking it too far.

If you have seen any of the screenshots linked across the web about the new iCal interface you know what I am talking about. If you haven’t seen those, iCal is looking a lot like it does on the iPad right now in Lion’s developer preview. It’s ugly, and we should be way past this style by now.

Ugly and harder to use than it should be. Designers need to focus on how to allow the user to fluidly access and manipulate their data, not slavishly stick to the limitations of physical items.

Another dimension of this is how poorly developers/designers have approached the touch interface. The industry seems to be mired in a button-driven, pull-a-menu-to-do-anything paradigm. Interfaces really need to take better advantage of long-tap context options and gestures to make interactions more fluid. This is one of the things that drives me bonkers about the iPad – it is so modal; I have to close one app to do something in another. I guess I have gotten used to how easy it is in Android to just share data between apps without having to switch apps.

Speaking of Android apps, I think that Feedly is the first really usable news reader that I have encountered on Android. I subscribe to a lot of feeds, and that seems to be the death of most readers on mobile devices because the developers thought it would be a good idea to download all your feed updates at once. This typically results in the app going away for a long time. Feedly fetches content more on demand. And they are clever about using gestures in the app – swipe down and to the left and I have marked that page of articles read and moved on to the next. Brilliant. Much better than the ‘pull menu, select mark read, select next page, close menu’ annoyance of other apps.
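The difference is basically eager versus lazy loading. Here is a toy sketch of the on-demand approach (my own illustration, not Feedly’s actual code):

```python
# Toy illustration of on-demand (lazy) feed loading, not Feedly's actual
# code: pages of articles are fetched only when the reader asks for them,
# instead of pulling every subscription down in one giant, app-freezing sync.

def fetch_page(feed, page):
    # Stand-in for a real network call; returns fake article titles.
    return [f"{feed} article {page * 10 + i}" for i in range(10)]

def lazy_reader(feeds, pages_per_feed=3):
    """Yield one page of articles at a time, only as the reader scrolls."""
    for feed in feeds:
        for page in range(pages_per_feed):
            yield fetch_page(feed, page)

reader = lazy_reader(["Example Feed One", "Example Feed Two"])
print(next(reader))  # only the first page is 'downloaded'
print(next(reader))  # the next page arrives on demand, not all at once
```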

Why Are Downtown Cincinnati Bloggers So Bitter?

I spent a little time this morning browsing the blogs of people who live in downtown Cincinnati. After about 10 minutes I had to stop. Why do they all seem so bitter and angry? On one hand, they spend a fair amount of time talking about how great it is to live downtown, then turn and belittle people who come down from the (evil) suburbs to partake of the urban greatness. Leaves me wondering why I should hang out downtown with such a cliquish, bitter crowd.

They also seem to love to hate on people who have chosen to live in the ‘burbs (apparently all we do is drive SUVs and go to the mall). We have nothing entertaining to do, nothing interesting to eat and nothing worthy to see. Look, downtown folks, it is all about choices; I made mine and you made yours – it doesn’t make either one of us right or wrong.

Does the anger and bitterness come from a perceived lack of awe at the downtown living decision? Should there be weekly articles in the local press about how wonderful the people who live downtown are? Do they not feel vindicated in their decision, or revered enough for having chosen to live downtown? And where is the line? I am sure it exists. That is, the line beyond which you are no longer ‘downtown’ enough to be part of the in-crowd. Yikes, now I am doing it too. Downtown folks, here is what I have for you: respect. Care to share?

Don’t get me wrong, I would love to live downtown, especially if I worked downtown. But I don’t. I work in the evil suburbs (Blue Ash) and live in the even more evil exurbs (Union Township). I enjoy being able to commute to work on my Vespa. I enjoy being a few miles from the fantastic Little Miami Bike Trail; my wife and I love to cycle down to Loveland for brunch on sunny Sundays. I am sure I enjoy a pint at the Brazenhead just as much as I would at the Lackman. I like that my daughter has a fantastic school system to attend. I enjoy having the largest YMCA in the country a few miles from my home. I enjoy not being able to see my neighbors’ houses. I enjoy having a large vegetable garden that feeds us through part of the year. Besides, looking at what condos are going for downtown, I would pay about twice as much for what amounts to a two-bedroom apartment as I paid for my three-bedroom house on five acres of land here in the heart of evil-dom.

Aberrant Behavior Online vs Real Life

I’ve been thinking further about my previous post on social media – in particular, how some people behave very differently online than they do in person. Looking back at the example from my previous post, I have my doubts that the parties involved would have behaved the same if the online communication channel hadn’t been available. In the one case, would a person exchange physical postal mail for months and then fire off a grudging missive? Probably not. Or they would skip right to the missive. It is probably the same thinking that motivates spammers – if they had to physically address and mail letters hawking boner pills and fake watches, they likely wouldn’t. But the ability to electronically send the same junk to thousands at the push of a button is just too easy to pass up.

Another example that comes to mind is a former co-worker who began following me on Twitter and Facebook. He stands as the only person (so far) that I have had to ban/block online because of continual obnoxious behavior. In real life he is a likable enough guy and very opinionated. He is ultra-right wing, but claims to be a Libertarian. I always suspected that this was just cover so that he could support the most radical aspects of the Republican agenda but claim ‘I’m not one of them’ when they get caught in their inevitable lies and corruption.

As I said, in person he was fine; online it was just a constant torrent of right wing talking points and Fox News propaganda and spin. The really sad thing was, he couldn’t defend or explain any of it – only parrot the shout-radio spew. I debated him a few times and buried him every single time because there were no facts or logic behind his diatribes. This just made him even more radical. Not liking his online shellacking, he began posting lies/distortions about me and what I said in other ‘safe’ forums where he knew he would get no challenge from his right wing buddies. When he made some pretty overtly racist statements on my Facebook wall, I was done. It would have been one thing if there had been some intelligent debate or discussion. Instead this was just tedious, willfully ignorant, offensive, poorly reasoned noise on his part. Banned.

If you need further examples of online bad behavior, take a look at the sewer that is the comment section on most posts on the Cincinnati Enquirer site. Maybe I have too much faith in humanity, but I am fairly certain that in real life a person would not react with ‘they probably had it coming’ upon hearing that someone had died in a car accident – yet you see that sort of response almost daily on that site. You’ll also see the full regurgitation of the shout-radio sloganeering in response to any news posting with even a hint of politics in it.

I guess the anonymizing effect of being online seduces some into the most outrageous behavior. Of course this effect exists offline as well. As I have pointed out: “there is never a line for the toilet at the public pool”. Yes, people will do pretty obnoxious things in public if they think they stand a chance of getting away with it – not least of which is peeing in a public swimming pool.
