Remember when reading on paper was a thing? It still is – sort of. Sometime in the early nineties, we started reading more on screens than on paper. The shift became apparent when we realised that most of the reading we do on a computer is done to get the gist of the information, not to retain all of its meaning. An eye-tracking report by Nielsen in 2018 found that users ‘tracked words’ rather than reading continuously. Shrinking short-term memory, driven by growing consumption of ‘other’ formats of information, meant readers got through an average of only 20% of any article of 500 words or more. The influx of audio, short-format text and video made reading not just a thing of the past but something that drew confused looks from peers.
“You still read?”
The Kindle, by Amazon, was an exciting evolution for readers and a well-timed product. If it had been released today (the future) or in the 1970s (when Xerox first tested a similar product), it probably wouldn’t have achieved the kind of success it did. Users read physical books, and the transition had to be slow and iterative.
Tech companies, brands, digital publishers and social media have all played a role in how we consume information today. For this piece, however, let’s focus on the way corporations shape how we consume information and how we interact with devices to receive it. Habits out of thin air.
Do as I say. And as I do.
It seems forever ago when Mr Jobs took the stage and, for the first time, showed what a touchscreen device “should” be like – the first iPhone. While other Windows-based manufacturers like O2, HTC, Palm and Asus were making touchscreens with differently branded but essentially identical styli (yes, that’s the plural), Mr Jobs simply flicked his finger on the screen and the contacts just scrolled. It was like magic.
This, of course, is by design. A lot of how we interact with our communication tools today is guided by a product or UX design manager somewhere. Their job is to follow two simple guidelines:
- How do I make interactions with this product/software a habit?
- The way users interact with the product/software should either:
    - look amazingly cool, or
    - be a reflection of an action they are organically inclined to perform.
There are, of course, other elements – like efficiency, ruggedness and live interaction with other users (like Snap, and later Instagram, with face tracking for pets) – which gained prominence. In the long term, however, corporations follow through with these guidelines.
It was Jobs’ mission to change the way we interact with devices. Not just because he wanted to make money, but because he thought it wasn’t natural to “press a button” or “use an interface input that is not your finger”. The way an individual “did a thing” – calling someone, taking a picture, playing music or browsing albums – was changed forever. Every single manufacturer now bases its UX on an improved version of, or elements inspired by, the original iPhone.
Thirteen years later, we still use a version of that multi-touch on all our devices. We still look at a minimal-looking camera UX with the shutter button on the right side of the screen (holding the phone in landscape is the only right way to take a photo – just saying). And typing out an SMS and other short-format texts on a touchscreen has become second nature, compared to a physical QWERTY keyboard.
Motorola, with its gesture controls, was pretty cool. Their goal was to make some features more accessible than others – specifically, the ones you might need in a hurry. To launch the flashlight or the camera, users didn’t need to press the power button to switch on the screen, unlock the phone, locate the app and then execute it. It was a shake or a twirl of the hand. OnePlus does the same on its devices with the “Draw on-screen” feature. Users got hooked.
Sounds good. Doesn’t work
Making users ‘do something instead of that other thing they’re used to’ hasn’t always worked out. One of the devices launched this year by a Korean company had amazing gesture controls – at least in the promo. Essentially, you look at the device and wave your hands in patterns to make it do things. In the real world, however, it just didn’t feel practical to suddenly wave your hand wildly over a phone (sometimes two or three times) to check the time or cut a call – especially with others in the room. It satisfied the second of the UX manager’s guidelines: it looked really cool. Was it practical? Not for everyone. No habits formed here.
This is a beginning
With AR & VR playing active roles in our lives, interaction with people, machines and our pets is probably going to change in small, continuous, iterative steps. Radical changes rarely make it through. Everything has its time.
I’m not one to make predictions, but the future of interaction – with people and machines – is probably going to be a stirred cocktail of augmented reality with the olives of tactile feedback. We’d walk past a convenience store wearing a thin, glove-like layer on our hands that lets us feel the physical attributes of a product and order it via voice commands. Corporations like Amazon have already reached about 75% of this goal. The other big three in tech are probably not far behind.
Your habits will change. Tech will change them
Twenty years from now, we’ll probably guffaw at the 2020 generation, when most of us had rectangular bricks as our primary communication devices.
This article would probably be read on a tiny flexible lens placed on your eye. If that is the case, I’d like to ask you:
Dear Future Reader,
You guys still read?
Blink, if yes.
The views and opinions published here belong to the author and do not necessarily reflect the views and opinions of the publisher.
Thank you Reputation Today. Cheers.
(makes happy noises)