While this podcast is largely about home automation, this particular episode, featuring Basheer Tome from Google, is a fantastic, intelligent, clear, no-hype exploration of virtual reality, augmented reality and user experience and design topics. Definitely worth a listen, IMHO.
There is a lot of talk these days of the evil of ‘fake news’. As far as I can tell, fake news is just a symptom. The real problem is the ‘easy money’ mentality of the online ad machine. This largely anonymous (and most definitely unregulated) mechanism rewards any and all bad behavior by handing out cash for page views. These days, that means page views creaking from the overload of irrelevant advertising that delivers no value to the viewer but does enrich the bottom-dwellers that plaster ads all over the page. Fake news, clickbait, porn, gossip, real news – at this point they are rewarded equally by advertisers.
Fake news is only the most headline-grabbing example; there is so much more of this dubious activity festering everywhere and in more subtle ways. Most recently, I have noticed online retailers starting to use unnecessary parcel tracking services ‘to better serve their customers’. In this case, ‘better serving’ customers means delivering bandwidth-wasting, unnecessary advertising. I have strongly suggested to some retailers that I do business with to just stick the tracking number for my order in the shipment confirmation email. I don’t want to (or need to) click on a series of links that are awash with advertising just to get to the tracking number that could and should be provided to me in plain text in my email.
I can’t tell if companies are just ill-informed or just don’t care that much about customer satisfaction and privacy when it comes to things like this. We recently stayed at a hotel in San Diego that offered as one of its ‘customer services’ the ability to text the hotel if something was needed. What was not disclosed was that this service is not operated by the hotel, but by a third party. So, by texting the ‘hotel’ you are (probably unwittingly) providing this third party with your cell phone number, name, info about your stay and who knows what else. That information gets sold immediately and you get nothing for it. Just like the dubious ‘free’ safe in your room that requires you to swipe a credit card to ‘activate it’. As soon as you swipe, some unknown, undisclosed third party now has your credit card number, name and whatever else is encoded on your credit card’s magstripe. That’s right, I don’t need or want your data harvesting in the middle.
Additionally, I have little sympathy for all of the web sites (forbes, businessinsider, wsj, wired, etc.) that block visitors who are using ad-blocking software – software which has been shown to prevent the distribution of ad-based malware. They whine about not getting their vig from online ads but are silent about the 10, 15, 20 trackers and beacons (in the form of cookies and local storage) that they DO profit from, placed on your system without your knowledge or approval (again, ‘to better provide you service’). But don’t just believe me – run a browser extension like Ghostery to see all the garbage that gets placed on your system when you visit one of these cookie cesspools. Alternatively, you can at least click on the site information icon in Chrome and see all the ‘3rd party’ cookies that are placed on your system.
Increasingly, the web is moving away from its roots as a means to easily share information (and actual data) into the realm of the quick buck, ‘publish anything that will generate a click’ crapfest we have now.
Install ad-blocking in your browser and think before (and after) you click.
Self proclaimed futurist tweets an obfuscated link to an ad-encrusted pull quote that links to an article… behind a paywall.
For some, the internet has moved from a means to share information and ideas to one that exists solely to generate clicks that have zero information value (well, except to them – ‘ad impressions’ and all that). And, no, I don’t want to sign up for your email-harvesting ‘newsletter’ that you never publish but profit from by selling my contact information.
I guess I am taking a little more cautious/skeptical stance when it comes to the auto-replenishment feature touted by many IoT pundits and vendors. If you aren’t familiar, this would allow a device to determine that you were out of or running low on a given consumable (be it a food item, dish soap or toilet paper) and then order more of it on your behalf.
Here is the problem: the vendor and the device don’t have your best interests at heart and might tend to exaggerate the current state of the consumable and (maybe) tend to order more of it more frequently than you might actually need (or want). For example, if you have ever owned an inkjet or laser printer you have probably experienced this already – persistent warnings/notifications to replace a toner or ink cartridge when, in reality, the useful life of the item is much, much longer than you are being led to believe. Heck, I have a laser printer that has been telling me for 13 months that I need to replace the toner. In that time my family and I have printed hundreds of additional pages with this ’empty’ toner cartridge.
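The conflict of interest boils down to who sets the reorder threshold. A minimal sketch (all the numbers and thresholds here are made up for illustration, not taken from any real device) of how much earlier a vendor-biased trigger fires than a user-chosen one:

```python
def days_until_reorder(level, daily_use, threshold):
    """Simulate a consumable draining and count the days until the
    reported level drops to the reorder threshold."""
    days = 0
    while level > threshold:
        level -= daily_use
        days += 1
    return days

full = 100.0      # percent of consumable remaining at purchase
daily_use = 1.0   # percent consumed per day

# A vendor-set 'low' threshold of 30% reorders far earlier than
# a user-set threshold of 5% – more orders, more revenue.
vendor = days_until_reorder(full, daily_use, threshold=30.0)
user = days_until_reorder(full, daily_use, threshold=5.0)

print(f"vendor threshold reorders after {vendor} days; "
      f"user threshold after {user} days")
```

With these toy numbers the vendor-controlled device reorders 25 days (a quarter of the cartridge’s life) early – exactly the ‘replace your toner now’ behavior printers already exhibit.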
Consider also the existing confusion over the meaning of ‘sell by’ and ‘best by’ designations on other consumables (most notably food). What if vendors add a ‘replenish by’ or ‘order by’ date into the mix? Not a great situation for consumers, especially if they have delegated this to a networked device in the name of ‘convenience’.
Evernote experience:
1) create list in desktop app
2) attempt to share it with my wife; sorry, have to upgrade to paid version for this
3) finally share with wife, she attempts to edit shared list; sorry, she has to upgrade to paid version (screw that)
4) remember a few more items on the go, add to list via mobile app
5) attempt to sync from mobile; get loads of errors – sync fails
6) only way to fix sync error is to copy the note contents, delete the note and paste the contents into a new note
7) repeat from step 2 or just give up
Google Keep experience:
1) create list on tablet using Keep app
2) share with wife; no problem – she has access to it within seconds
3) she needs to add items to the list – no problem; she adds them and they automatically sync with me
4) edit list on mobile – no problem; list automatically syncs
5) both of us run Keep app in grocery store, ticking off items from the list; no problem – list syncs automatically
6) marvel at the superior user experience from Google Keep
7) BONUS: I can set a reminder on the list that is a location; Google Now notifies me when I am near the store.
Evernote just keeps getting worse and worse. About the only thing that keeps me using it is the web clip functionality in the browser. Come on Keep, add that and I can leave Evernote behind.
I saw this whiny article in the Washington Post that was just begging for a response. The gist of the article is that the author got a Computer Science degree and wasn’t given his dream job out of the gate. This then becomes an indictment of the education system rather than the typical sniveling millennial i-wasn’t-given-the-world-without-having-to-work-for-it screech that it is. Let’s take a look at some quotes from the posting:
My college education left me totally unprepared to enter the real workforce. My degree was supposed to make me qualified as a programmer, but by the time I left school, all of the software and programming languages I’d learned had been obsolete for years.
I think this belies a misunderstanding of how higher education works – it is not what you are given, it is what you do with it (the whole learning-how-to-learn thing). It is as if he expects to read a book on swimming but never gets into the pool; and ‘surprise’, he really can’t swim because he put no effort into applying the learnings. Also, if ‘all of the software and programming languages’ were obsolete, what were they teaching? FORTRAN? RPG? Visual Basic?
To find real work, I had to teach myself new technologies and skills outside of class, and it wasn’t easy.
Poor you. You should have been doing this all along. The Computer Science curriculum should be teaching you fundamental concepts of how computers work, plus programming concepts and techniques that can be applied across specific programming languages, databases and platforms. Actually, it is a bit shocking how many recent CS grads don’t have a grasp of fundamentals.
Businesses aren’t looking for college grads, they’re looking for employees who can actually do things – like build iPhone apps, manage ad campaigns and write convincing marketing copy. I wish I’d been taught how to do those things in school, but my college had something different in mind.
Businesses are indeed looking for those things, but they are looking for people who can learn and grow and apply what they have learned in the past. If you have a CS degree and can’t figure out how to write an iPhone app you either had a horrible curriculum or slept through most of your class time. The fact that you weren’t specifically trained for that is not a problem with your education. Rather it is a failure to apply what you should have learned.
At least 90 percent of my college education (and that of so many others) boiled down to pure terminology, or analysis of terminology. My success in any given class was almost wholly based on how well I could remember the definitions of countless terms – like the precise meaning of “computer science” or how to explain “project management” in paragraph form, or the all-too-subtle differences between marketing and advertising.
Wow. Ok. So, if that percentage is accurate, I can see why you can’t get a job. When I got my CS degree (many moons ago) that was maybe 1% of what we were taught.
To me, this is the root of our college problem: The average college student is paying $30,000 a year for the chance to learn valuable skills from professors who haven’t had the opportunity to learn those skills themselves. Maybe it’s a crazy idea, but if you’re going to spend all that money for a college education, shouldn’t you expect to learn real-world skills from people who know what they’re doing?
This seems excessively harsh and a bit misguided. If you want to be learning what is new and trendy, go to a conference, a user group, or actually talk with people who are doing interesting things. By the time those things get packaged up into an approved curriculum, the technology might be on the stale side. But, again, if you don’t understand the fundamentals, you are not going to be able to effectively apply new technology and concepts. No one can give that to you at any price.
Solving the issue of inexperienced teachers may be even simpler: have schools relax academic requirements for professors and focus far more on hiring effective businesspeople. With a little more leeway, academically-minded candidates will have more freedom to gain job experience, and schools may even attract more talent directly from the business world. Success in business and success in the classroom are certainly different things, but I’d wager that it’s a lot easier to show an accomplished businessperson how to teach than it is to show a teacher how to be an accomplished businessperson.
So it sounds like what you want is for all universities to be trade schools, focused on cranking out very specific skills and techniques rather than more broadly educating students and preparing them to apply a wide set of competencies to a range of problem domains. This sounds a bit like the certification trap from the 90s – go get a very narrow, often vendor specific certification but still have no practical experience in applying that knowledge. When that vendor falls out of favor, you are a bit stuck unless you can teach yourself the reasoning and abstraction skills you would have learned in college.
To steal the trite closing from the original article: But what do I know, I have been happily applying my Computer Science degree for nearly 30 years with technologies, programming languages and platforms that never even existed when I graduated.
I would love to see (and use) this in more locations. Sadly, it will likely be quickly perverted to route visitors to/near shops and other unattractive locales.
If you want to find the most scenic route to get somewhere, there may soon be an app for that. Daniele Quercia and colleagues at Yahoo Labs in Barcelona have come up with a way to create a crowd-sourced measure of a city’s beauty, and made an algorithm to find the prettiest way to get from one point to another. “The goal of this work is to automatically suggest routes that are not only short but also emotionally pleasant,” the scientists told Technology Review:
Quercia and co begin by creating a database of images of various parts of the center of London taken from Google Street View and Geograph, both of which have reasonably consistent standards of images. They then crowd-sourced opinions about the beauty of each location using a website called UrbanGems.org.
Each visitor to UrbanGems sees two photographs and chooses the one which shows the more beautiful location. That gives the team a crowd-sourced opinion about the beauty of each location. They then plot each of these locations and their beauty score on a map which they use to provide directions.
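The two ingredients described above – turning pairwise ‘which is prettier?’ votes into per-location scores, then routing on a blend of distance and beauty – can be sketched roughly as follows. The win-fraction scoring, the toy graph, the penalty formula and its weight are my own simplified stand-ins for illustration, not the researchers’ actual method:

```python
import heapq
from collections import defaultdict

def beauty_scores(votes):
    """Turn pairwise (winner, loser) votes into a 0-1 score per
    location: the fraction of its comparisons each location won."""
    wins = defaultdict(int)
    seen = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        seen[winner] += 1
        seen[loser] += 1
    return {loc: wins[loc] / seen[loc] for loc in seen}

def prettiest_path(graph, scores, start, goal, weight=0.5):
    """Dijkstra where each edge costs distance * (1 + weight * (1 - beauty
    of the destination)), so ugly segments are penalized but distance
    still matters. Unknown locations default to a neutral 0.5 beauty."""
    queue = [(0.0, start, [start])]
    best = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in best and best[node] <= cost:
            continue
        best[node] = cost
        for nbr, dist in graph.get(node, []):
            penalty = 1 + weight * (1 - scores.get(nbr, 0.5))
            heapq.heappush(queue, (cost + dist * penalty, nbr, path + [nbr]))
    return None

# Toy map: two equal-length routes from A to D; B was judged prettier
# than C in 2 of 3 pairwise votes, so the route via B should win.
votes = [("B", "C"), ("B", "C"), ("C", "B")]
graph = {
    "A": [("B", 1.0), ("C", 1.0)],
    "B": [("D", 1.0)],
    "C": [("D", 1.0)],
}
scores = beauty_scores(votes)
print(prettiest_path(graph, scores, "A", "D"))
```

The `weight` parameter is the knob the article alludes to: at 0 you get the plain shortest path, and as it grows the router trades extra distance for ‘emotionally pleasant’ segments.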
Amazingly small. But I imagine what it would take to distribute all of that power is the real trick.
I like this because it makes the concept much more concrete: leaving excavation equipment in the ground because they didn’t have a plan for extracting it (and now it is ‘too expensive’). Sound familiar tech folks?
I am not so sure how ‘useful’ these examples are. Is the fact that most of them have negative connotations a reflection of the person who curated them or of the Russian language?
This was one of my favorites:
Reddit user deffun on /r/doesnottranslate defined this noun as “to do something in a complex, incomprehensible way.”
The word kind of embodies itself, as it has four prefixes including one that repeats itself twice.
About a month ago, I attended the API Strategy and Practice conference in San Francisco. Overall, a pretty good conference and, as always, much of the value was in connecting with people between and after sessions.
One panel discussion was concluded with the question ‘what is the future of the internet?’. The responses seemed to fall into two categories 1) code/APIs everywhere and/or 2) intelligent consumption and composition of the available APIs.
I wanted to point out that the first category of thought reflected the view of a (now defunct) little technology outfit just down the highway who voiced the credo of ‘the network is the computer’. They had a great spec/implementation for a technology called JINI that very much reflected the philosophy of ‘be a node and not a hub’ – inherently scalable and cluster-able running practically anywhere.
The second group reminded me of the great AI gold rush in the mid-90s, when ‘intelligent agents’ were going to manage all our personal data and book travel and plan meetings based on all of the metadata that we surround ourselves with. Companies were funded and failed trying to deliver on this vision (General Magic anyone?). Perhaps it was an idea before its time and enough has changed and opened up that it might work this time. We shall see.
Not even sure where to start with this psycho-babble rant I just read.
To me, the whole thing reads like an insecure individual trying to justify their warped world view by blaming it on everyone else. Sure, I know that there are bad things that happen to women at tech conferences (and elsewhere) and that is stupid and inexcusable. But, to use that as justification for the view that *all* men are rapists/gropers/whatever is also stupid and inexcusable. The simple fact is that if you are looking to be offended, you will find offense in everything and everybody around you. Especially if you play both sides of a situation: if someone engages with you, it is for strictly sexual purposes, and if they don’t, then they are *obviously* denigrating you because you are a female. There is no good way out of that spiral other than recognizing that the premise is set up for self-fulfilling distrust.
Being guilty of whatever darkness is in another person’s head is just raw prejudice with an unhealthy dose of over-generalization/labeling. Sure, let’s play the game (fill in the blanks): All ______ are lazy. All _____ are cheap. All _____ are bad drivers. And all men at tech events are misogynistic predators. Right.
I have some recent evidence on this front. I was at a tech conference in San Francisco last month and had some fantastic conversations with some of the female attendees there (and the male ones as well). Topics ranged from privacy, genetic testing, pregnancy(!), hypermedia, security concerns for services, agile practices, gardening, coding standards and whisky. No one was groped or molested or talked down to. Oddly enough, at the drinks reception on the last night, I did have a woman approach me several times and try to invite herself back to my room (which I gently but firmly declined). Do I think all women at tech events are horn-dogs because of that? No, not for one minute.
For an excellent exploration of learning to be offended, see this article at NPR. The referenced article is coming at this topic from race rather than gender, but I think it resonates with many of the points I made.
I am a little wary of the vague but hype-intensive discussions around the Internet of Things (IoT). I am particularly leery when I ask a pundit in the area ‘what *specifically* is IoT going to have the biggest impact on?’ The answer tends to meander along the path of ‘it will change everything’ and ‘there isn’t anything you can’t do with it’. Right. Sort of reminds me of the 90s-era hyperbolic proclamations about Object Oriented databases and how they were going to change everything. As one wag rightly summarized that hype: ‘Object oriented databases are a billion dollar market with no customers’.
SCADA, RFID, SNMP, RPC, etc – didn’t all of these come with the same set of snares and delusions that seems to surround the IoT piper? I fear the only thing that is different this time around is that IoT is paired with the equally rabid running mate ‘Big Data’ that is desperately trying to find a problem to solve and in so doing might encourage the accumulation of whatever data from IoT that it can take on.
I think I am in favor of pro-active laws against wearing/using something like Google Glass while driving. It is a form of distracted driving and aligns with the current laws regarding texting and driving.
Yes, it is that time of year, when people go shopping, because, well, they are supposed to shop. New York magazine has a great article that explores just how crazy this is, Why Black Friday Is A Behavioral Economist’s Nightmare:
The big problem with Black Friday, from a behavioral economist’s perspective, is that every incentive a consumer could possibly have to participate — the promise of “doorbuster” deals on big-ticket items like TVs and computers, the opportunity to get all your holiday shopping done at once — is either largely illusory or outweighed by a disincentive on the other side. It’s a nationwide experiment in consumer irrationality, dressed up as a cheerful holiday add-on.
It then goes on to explore the retailing ‘tricks’ that are employed:
The doorbuster: The doorbuster is a big-ticket item (typically, a TV or other consumer electronics item) that retailers advertise at an extremely low cost. (At Best Buy this year, it’s this $179.99 Toshiba TV.) We call these things “loss-leaders,” but rarely are the items actually sold at a loss. More often, they’re sold at or slightly above cost in order to get you in the store, where you’ll buy more stuff that is priced at normal, high-margin levels.
That’s the retailer’s Black Friday secret: You never just buy the TV. You buy the gold-plated HDMI cables, the fancy wall-mount kit (with the installation fee), the expensive power strip, and the Xbox game that catches your eye across the aisle. And by the time you’re checking out, any gains you might have made on the TV itself have vanished.
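The basket arithmetic in that quote is easy to make concrete. A quick sketch with entirely hypothetical prices and costs (only the $179.99 TV price comes from the article; the rest are invented to illustrate the loss-leader pattern):

```python
# Hypothetical Black Friday basket: the doorbuster TV is sold near cost,
# while the add-ons carry normal high-margin retail pricing.
basket = [
    # (item, price_paid, retailer_cost)
    ("doorbuster TV", 179.99, 175.00),  # ~3% margin: the 'deal'
    ("HDMI cables",    39.99,   4.00),  # ~90% margin
    ("wall-mount kit", 89.99,  25.00),
    ("power strip",    49.99,  10.00),
    ("Xbox game",      59.99,  35.00),
]

total_paid = sum(price for _, price, _ in basket)
retailer_profit = sum(price - cost for _, price, cost in basket)
tv_profit = basket[0][1] - basket[0][2]

print(f"customer pays ${total_paid:.2f}")
print(f"retailer profit ${retailer_profit:.2f}, "
      f"of which only ${tv_profit:.2f} comes from the TV")
```

With these made-up numbers, the ‘deal’ item contributes about 3% of the retailer’s profit on the trip – the other 97% comes from everything you grabbed on the way to the register.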
Implied scarcity: This is when a store attempts to drum up interest in an item by claiming “limited quantity” or “maximum two per customer,” which makes us think we’re getting something valuable when we may not be. It’s a staple of deceptive marketing, and at no time in the calendar year is it in wider use than on Black Friday. (There is also actual scarcity on Black Friday — when stores carry only 50 or 100 of an advertised doorbuster item — which also introduces a risk that you’ll be 51st or 101st in line and waste your time entirely. Both are bad.)
I spent Black Friday at home, with my family, working through my to-do list. Aside from lunch, we didn’t venture out to buy a thing.
What if we reviewed movies the same way that we review tablets? That is, don’t rate them based on their own merits but always relative to some other popular movie, allow lots of subjective, unsupported assertions and conclude that popularity equals quality. So if we assume that Spiderman was the benchmark du jour, it might go something like this:
Avengers had quite a few popular characters in it, but the fatal flaw was that there was no Spiderman. However, everyone noted that many of the characters closely copied Spiderman in having an alternate identity, special powers and a snazzy costume; it was clear these traits were there to make the characters more like Spiderman, who is the leader in the super hero space. While the movie was entertaining, it just didn’t have the same flow and ‘ease of watching’ that Spiderman did. And while we paid less to see the Avengers at a matinee, the quality of Spiderman clearly made it worth the extra ticket expense because everyone knows that Spiderman is just a higher quality product. We are sure that the Avengers might appeal to some people; we still believe that Spiderman is the best movie there is.
The amount of attempted privacy over-reach in mobile apps is approaching appalling. The number of mobile applications that, either out of the box or via subsequent updates, require the privilege to access (and in some cases upload) your contacts from your device is growing. In most cases it seems the same reason is given for this invasive action: it is for *your* convenience. Meh. It is unnecessary, plain and simple. Twitter, Facebook, Foursquare, Yelp, Linkedin, Path, Gowalla and others all do this – with or without your permission or knowledge. I mean, why would an application that is supposed to serve as a remote control for your TV need access to your contacts?
Second to the contacts grab is the gratuitous need to have fine grained location information for no apparent reason. For example, why would an application that identifies music need your fine location to work? What does that have to do with music recognition? It seems to be just collecting data for the sake of collecting it.
It is beginning to get even more obnoxious. Some web-based services are not allowing users to create their own username and password. Rather, they try to force you to log in using your Twitter or Facebook account. And with few exceptions they require access to your contacts and other inappropriate information. Some applications (typically browsers) are taking this approach. This is even more heinous as now not only do they have your contact information, they have a record of every site you visit and every keystroke that you type into every site that you visit. Think about that before you run something like RockMelt.
Be aware of what permissions sites and applications ‘require’ and don’t be afraid to say no. After all, it is your data that is being given away. And once it is gone, chances are you’ll never get it back or get it deleted.
I am a bit amazed at the manufactured frenzy that is Black Friday and Cyber Monday. It seems that each year the press does their very best to hype something that really doesn’t have a need to exist any longer (and probably doesn’t for the majority of people).
There really is no reason for people to be pitching tents in front of retailers the day before Thanksgiving so they can be first in line for the big ‘deals’. Is this really more of a social thing than a necessary thing? Do these people not value their own time? Or do they (the sheeple) do it because the press tells them that is what they should do? Are the press trying to justify their repeated (if not specious) claim that the day after Thanksgiving is ‘the busiest shopping day of the year’, when actual facts (something that journalism in this country seems to have only a nodding acquaintance with of late) show that the weekend before Christmas is typically the busiest shopping time? The only thing that I bought on ‘Black Friday’ was a couple of pints at the pub – well away from the shopping mayhem.
The ‘Cyber Monday’ hype is another head scratcher. I could see how this might have been significant a decade ago when most people didn’t have high speed internet connectivity at home and availed themselves of their employer’s internet pipe after returning from Thanksgiving holiday. But now most people *do* have high speed connectivity at home. And not only that, they have high speed connectivity at home the other 364 days of the year as well; so there is no practical need to wait for a specific day to do their online ordering. In fact, quite a few folks I know begin shopping online as early as October to ensure that they get the selection they want and have plenty of time to deal with backorders and special orders.
Figure it out folks. Don’t believe the hype.
This is a great story about a guy in Copenhagen who had his bike stolen and through the power of social media and the interwebs he got it back. And what a great reward for the guy who found it for him!
There have always been those few apps that insist on looking like their physical, real world, equivalent. Calculator apps, date books, calendars, note taking apps, “stickies” — you know what I am talking about. Despite there being better options out there, better ways of displaying the data, designers stick with the known representation of the tool.
Now, though, Apple is taking it too far.
If you have seen any of the screenshots linked across the web about the new iCal interface you know what I am talking about. If you haven’t seen those, iCal is looking a lot like it does on the iPad right now in Lion’s developer preview. It’s ugly, and we should be way past this style by now.
Ugly and harder to use than it should be. Designers need to focus on how to allow the user to fluidly access and manipulate their data not slavishly stick to the limitations of physical items.
Another dimension of this is how poorly developers/designers have approached the touch interface. The industry seems to be mired in a button-driven, pull-a-menu-to-do-anything paradigm. Interfaces really need to take better advantage of long-tap context options and gestures to make the interactions more fluid. This is one of the things that drives me bonkers about the iPad – it is so modal; I have to close one app to do something in another. I guess I have gotten used to how easy it is in Android to just share data between apps without having to change apps.
Speaking of Android apps, I think that Feedly is the first really usable news reader that I have encountered on Android. I subscribe to a lot of feeds and that seems to be the death of most readers on mobile devices because the developers thought it would be a good idea to download all your feed updates at once. This typically results in the app going away for a long time. Feedly does it more on demand. And they are clever about using gestures in the app – swipe down and to the left and I have marked that page of articles read and moved on to the next. Brilliant. Much better than ‘pull menu, select mark read, select next page, close menu’ annoyance of other apps.