I have been obsessed with User Interfaces (UI) for as long as I can remember. I remember marveling at the beauty that was Compaq TabWorks while I played “The Incredible Machine” and listened to “Tears For Fears—Greatest Hits” on the family computer.
Don’t judge me—I was listening to “Mad World” way before Donnie Darko and that creepy rabbit. If none of those references landed with you, it’s probably because I’m super old. In the words of George Costanza, “It’s not you, it’s me.”
That’s another super old reference you might not get. You know what—forget all that, let’s move on.
I really got into UI when I bought my own computer. I had joined the Coast Guard and saved a bunch of money during boot camp (when you can’t go shopping—you know—because of push-ups and stuff). I wanted to buy a Chevy Cavalier (sadly, that’s not a joke), but my father encouraged me to invest in a computer instead, so I bought a Compaq from Office Depot that came with Windows 98. Also you can’t buy a Cavalier with 800 bucks.
I spent countless hours changing the themes in Windows 98. I was mesmerized by the way windows overlapped and how the icons and fonts would change; the shapes of buttons and the different colors. The slight drop shadow each window had to layer it in space. Each theme was better than the previous theme!
If only I had known how much better things were going to get. If only I had known about Windows XP.
Does love at first sight exist? No—don’t be ridiculous. Love is an extremely complex part of the human condition that can only manifest itself over time through long periods of struggling and the dark night of the soul.
“What is love? Baby don’t hurt me. Don’t hurt me. No more.”
—Haddaway, “What Is Love”
But love’s fickle and cruel cousin, Infatuation, does exist and it is almost exclusively available at first sight. I was absolutely infatuated with Windows XP.
The curves on the start menu. The menu animations. I could just look at it for hours. And I did. Shocking fact—I wasn’t exactly in high social demand so I had a great deal of free time to do weird things like stare at an operating system.
For those who remember, Windows XP was extremely customizable. Virtually every part of the operating system could be skinned or themed. This spawned a lot of UI hacking communities and third-party tools like WindowBlinds from the fine folks at Stardock. I see you Stardock; the north remembers.
I could go on and on about my long, boring, and slightly disturbing obsession with UI. Oddly enough, I am not a designer or an artist. I can build a decent UI, but you would not hire me to design your site. Or you would, but your name would be “Burke’s Mom.”
I can, however, assemble great UI if I have the building blocks. I’ve been lucky enough to work on some great UI projects in my career, including being part of the Kendo UI project when it first launched. I love buttons, dropdown lists, and dialog windows with over-the-top animation. And I can assemble those parts into an application like Thomas Kinkade. I am the UI assembler of light.
But as a user, one thought has been recurring for me during the past few years: the best user experience is really no user interface at all.
The only reason that a UI even exists is so that users can interact with our systems. It’s a middleman. It’s an abstracted layer of communication, and the conversation is pre-canned. The user and the UI can communicate, but only within the specifically defined boundaries of the interface. And this is how we end up with GLORIOUS UX fails like the one that falsely notified Hawaii residents this past weekend of an incoming ballistic missile.
This is the screen that set off the ballistic missile alert on Saturday. The operator clicked the PACOM (CDW) State Only link. The drill link is the one that was supposed to be clicked. #Hawaii pic.twitter.com/lDVnqUmyHa
— Honolulu Civil Beat (@CivilBeat) January 16, 2018
We have to anticipate how the user is going to think or react, and everyone is different. Well-designed systems can get us close to intuitive. I am still a fan of skeuomorphic design and “sorry not sorry.” If a 4-year-old can pick up and use an iPad with no instruction, that’s kind of a feat of UX genius.
That said, even a perfect UI would be less than ideal. The ideal is to have no middleman at all. No translation layer. Historically speaking, this hasn’t been possible because we can’t “speak” to computers.
Natural-language processing (NLP) is the field of computing that deals with language interaction between humans and machines. The most recognizable examples of this would be the Amazon Echo, Siri, Cortana or Google. Or “OK Google.” Or whatever the heck you call that thing.
I firmly believe that being able to communicate with an AI via spoken language is a better user interaction than a button—every time. To make this case, I would like to give you three examples of how NLP can completely replace a UI and the result is a far better user experience.
Siri is not a shining example of “a better user experience,” but one thing it does fairly well, and the thing I use it for almost every day, is creating reminders.
It is a far better user experience to say “Hey Siri, remind me to email my mom tomorrow morning 9 AM” than it is to do this…
- Open the app
- Tap a new line
- Type out the reminder
- Tap the “i”
- Select the date
- Tap “Done”
No matter how beautiful the Reminders app is, it will never match the UX of just telling Siri to do it.
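Under the hood, the assistant has to turn that one sentence into the same structured fields the Reminders form collects by hand. Here is a toy sketch of that extraction (purely illustrative; real assistants use trained NLP models, not a regex, and the phrase patterns here are my own invention):

```python
import re

def parse_reminder(utterance: str):
    """Toy extraction of a task, day, and time from a reminder phrase.

    Illustrative only: it shows that the output of "just say it" is
    the same structured data the app's form collects tap by tap.
    """
    m = re.match(
        r"remind me to (?P<task>.+?)"
        r"(?: (?P<day>today|tomorrow))?"
        r"(?: morning)?(?: at)? (?P<time>\d{1,2} ?(?:AM|PM))$",
        utterance.strip(),
        re.IGNORECASE,
    )
    if not m:
        return None
    return {
        "task": m.group("task"),
        "day": (m.group("day") or "today").lower(),
        "time": m.group("time").upper(),
    }
```

One spoken sentence in, the whole filled-out form comes back: `parse_reminder("remind me to email my mom tomorrow morning 9 AM")` yields the task, the day, and the time in one shot.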
Now this comes with the disclaimer of, “when it works.” Siri frequently just goes to lunch or cuts me off halfway through, which results in a nonsensical reminder with no due date. When NLP goes wrong, it tends to go WAY wrong. It’s also incredibly annoying, as anyone who has EVER used Siri can attest.
This is a simple example, and one that you might already be aware of or not that impressed with. Fair enough; here’s another: Home Automation.
I have a bunch of the GE Z-Wave switches installed in my house. I tie them all together with a Vera Controller. If you aren’t big into home automation, just know that the switches connect to the controller and the controller exposes the interface with which to control them, allowing me to turn the lights on and off with my phone.
The Vera app for controlling lights is quite nice. It’s not perfect, but the UX is decent. For instance, if I wanted to turn on the office lights, this is how I would do it using the app.
I said it was “quite nice.” Not perfect. I’m just saying I’ve seen worse.
To be honest though, when I want to turn a light on or off, I don’t want to go hunting and pecking through an app on my phone to do it. That is not awesome. I want the light on and I want it on now. Turning lights on and off via your phone is a step backward in usability when compared to, I don’t know, A LIGHT SWITCH?
What is awesome, is telling my Echo to do it.
I can, for any switch in my house, say…
“Alexa, turn on/off the office lights”
Or the bedroom, or the dining room or what have you. Vera has an Alexa skill that allows Alexa to communicate directly with the controller and because Alexa uses NLP, I don’t have to say the phrase exactly right to get it to work. It just works.
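To give a rough idea of what sits behind that phrase, here is a minimal sketch of an Alexa-style intent handler. The “ControlLight” intent name, the “Room” and “State” slot names, and the `controller.set_switch()` call are all hypothetical; Vera’s actual skill and the real Alexa APIs differ in the details.

```python
def handle_intent(event, controller):
    """Minimal sketch of an Alexa-style intent handler.

    Assumes a hypothetical "ControlLight" intent with "Room" and
    "State" slots, plus a controller object exposing set_switch().
    The NLP has already happened by the time this runs: Alexa hands
    the skill structured slots, not raw speech.
    """
    intent = event["request"]["intent"]
    room = intent["slots"]["Room"]["value"]
    state = intent["slots"]["State"]["value"]  # "on" or "off"
    controller.set_switch(room, state == "on")
    return {"outputSpeech": f"Okay, turning the {room} lights {state}."}
```

The nice part is that the skill never sees my exact wording; the NLP layer has already boiled “could you shut off the office lights” and “office lights off” down to the same two slots.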
Now, there is a slight delay between the time that I finish issuing the command and the time that Alexa responds. I assume this is the latency to go out to the server, execute the skill, call back into my controller, turn off the light, go back out to the skill in the cloud and then back down into my house.
I’m going to be honest and say that I sometimes get irritated that it takes a second or two to turn the lights on. Sure—blah blah blah technical reasons, but I don’t care. I want the lights on and I want them on NOW. Like Veruca Salt.
I also have Nest thermostats which I can control with the Echo and I gotta tell you, being able to adjust your thermostat without even getting out of bed is kind of, well, it’s kind of pathetic now that I’ve said it out loud. Never mind. I never ever do that.
NLP doesn’t have to be limited to the spoken word. It turns out that interfacing with computers via text is STILL better than buttons and sliders.
For that, I give you Exhibit C.
Digit is a remarkable little service that I discovered via a Twitter ad. You’ve always wondered who clicks on Twitter ads, and now you know.
I wish more people knew about Digit. The basic premise behind the service is that it automatically saves money for you each month, running machine learning on your spending habits to figure out how much it can set aside without sending you into the red.
The most remarkable thing about Digit is that you don’t interface with it via an app. Everything is done via text, and I love it.
Digit texts me every day to give me an update on my bank account balance. This is a nice daily heads up look at my current balance.
If I want to know how much Digit has saved for me, I just ask how much is in my savings. But again, because Digit is using NLP, I can ask it however I like. I can even just use the word “savings” and it still works. It’s almost like I’m interfacing with a real person.
Now if I want to transfer some of that back into checking because I want to buy more Lego and my wife says that Lego are a “want,” not a “need,” and that we should be saving for our kids’ “college,” I can just ask Digit to transfer some money. Again, I don’t have to know exactly what to say. I can interface with Digit until I get the right result. Even if I screw up mid-transaction, Digit can handle it. This is basically me filling out a form via text without the hell that is “filling out a form.”
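That “form filling via text” can be sketched as simple intent parsing. The commands and wording below are hypothetical, and a real service would use a trained model rather than keyword rules, but the shape of the output is the point: free-form text reduces to the same fields a transfer form would collect.

```python
import re

def parse_money_command(text: str):
    """Sketch of Digit-style text-command parsing (hypothetical commands).

    A real service uses trained NLP; this keyword version only shows
    that a chatty text reduces to an action, an amount, and an account.
    """
    text = text.lower().strip()
    # A transfer mentions an action verb and, somewhere after it, an amount.
    m = re.search(r"\b(transfer|move|save)\b.*?\$?(\d+(?:\.\d{2})?)", text)
    if m:
        return {"action": "transfer", "amount": float(m.group(2))}
    # Any mention of "savings" falls back to a balance check,
    # mirroring how just texting "savings" works.
    if "savings" in text:
        return {"action": "balance", "account": "savings"}
    return {"action": "unknown"}
```

Note that the one-word “savings” shortcut and the full “how much is in my savings?” question land in the same place, which is exactly the forgiving behavior that makes texting a service feel like texting a person.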
After using Digit via text for so long, I now want to interface with everything via text. Sometimes it’s even better than having to talk out loud, especially if you are in a situation where you can’t just yell something out to a robot, or you can’t be bothered to speak. I have days like that too.
No. Emphatically no. NLP is not a substitute for all user interfaces. For instance, I wouldn’t want to text my camera to tell it to take a picture. Or scroll through photos with my voice. It is, however, a new way to think about how we design our user interfaces now that we have this powerful new form of input available.
So, before you design that next form or shopping cart, ask yourself: Do I really even need this UI? There’s a good chance that thanks to NLP and AI/ML, you don’t.
Building NLP applications is far easier than you might think. We’ve come a long way in terms of developer tooling. Check out the LUIS project from Azure, which provides a GUI tool for building and training NLP models.
It’s free and seriously easy.
Here’s a video of me building an AI that can understand when I ask it to turn lights on or off by picking the light state and room location out of an interaction.
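If you’d rather skim code than watch a video, here’s a toy version of what such a model learns to do, with a hypothetical room list. A trained LUIS model generalizes far beyond this keyword matching; the sketch only shows the shape of the structured result it hands back:

```python
# Hypothetical set of rooms the "model" knows about.
ROOMS = {"office", "bedroom", "dining room", "kitchen"}

def extract_light_intent(utterance: str):
    """Toy intent extraction: pull a light state and a room out of a phrase.

    A trained NLP model handles wording it has never seen; this
    keyword version just illustrates the output: an intent name
    plus the entities (state, room) picked out of the utterance.
    """
    text = utterance.lower()
    words = text.replace(",", " ").split()
    state = "on" if "on" in words else "off" if "off" in words else None
    # Check longer (multi-word) room names first so "dining room" wins.
    room = next((r for r in sorted(ROOMS, key=len, reverse=True) if r in text), None)
    if state is None or room is None:
        return None
    return {"intent": "SetLightState", "state": state, "room": room}
```

Feed it “Alexa, turn on the office lights” and out comes `{"intent": "SetLightState", "state": "on", "room": "office"}`, which is all a light controller ever needed from you in the first place.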