‘Good’ hardware design can be defined as making sure products are created with great respect for the customer’s needs through form and function. However, with the ever-growing shift from hardware to software (specifically mobile apps and websites), are companies now designing to disrespect those needs?
I recently broke my phone. The device I used constantly to do everything was suddenly an unresponsive shiny monolith. Although this only lasted four days, I was astonished at how much it felt like losing a limb. I realised my apps had made me addicted.
Addictions are indulgences we repetitively consume to distract ourselves from a deeper misery. Studies report a strong link between increased levels of depression and addicted internet users (Young & Rogers, 1998). Depression has led to increases in suicide (Evans, 2015), with the rate in Scotland rising by 8% from 2015 to 2016 (NRS, 2017). With social networking accounting for 28% of our time spent online (Go-Globe, 2017), there is strong evidence that it is a significant predictor of mobile addiction (Mohammed, 2013).
These problems have been around for decades and do not stem solely from our phones. As customers, however, we buy phones to help us live more time-efficient lives. So why are we allowing companies to take advantage of us and steal our time?
‘Good’ interface design can be defined as being aesthetically pleasing, simple to navigate and intuitive to use. I believe these to be the triggers that trap people in interminable, destructive distractions from the outside world. If apps didn’t have these sensorially satisfying qualities, surely people would cease to use them? Personally, whenever I use a website for research I am turned off by a bare-bones, barely formatted block of monochrome text, in the same way I’d refuse to use a poorly designed product.
Coincidence or planned? My frustration that something as severe as an addiction could stem from something as innocent as a designer trying to create a beautiful application is reinforced by Tristan Harris (2017), an ex-product designer at Google, who believes that these design choices are far more strategic and far more conniving.
Harris says that our minds have been “hijacked” by our phones and that the main goal driving all digital tech is a “race for our attention” (TED, 2017). He points to several social media sites that force you into giving them your full attention: Facebook, Instagram, Snapchat and LinkedIn all have systems that reward you (TED, 2017). Instagram gives young people skewed representations of other people’s lives. It then becomes a game to see who can get the most likes – a game with reported negative effects on teens’ emotional well-being (Weinstein, 2017). We attach importance to these made-up constructs and spend our days seeking validation through them (Crossman, 2016). We are seeing a shift from using sites for information to using them to numb ourselves into a stasis through their mechanics of instant gratification. Swiping, hearing notification “pings” and receiving “likes” then become the things that trap us (Crossman, 2016). Several times I have tweeted out of boredom to get that brief feeling of delight from a retweet.
Harris left Google to create ‘Time Well Spent’, a company whose aim is to create “ethically motivated digital products” that combat the current zeitgeist (TWS, 2017). Like many of us, he wants software design to have “new standards” under which apps free us instead of entangling us (TWS, 2017). He is currently seeking public pressure to achieve this goal (TWS, 2017).
To defend what appears to be indefensible would seem absurd. We aren’t just resorting to these apps through lack of willpower; we are being manipulated. There are, however, points to take into account. Applications and websites are platforms, which means they have as much opportunity to be used considerately as they have been used exploitatively. This is seen in the 165,000 apps dedicated to health and well-being, the largest category of which are aids for mental health, including addiction (Byrnes, 2017). The problem may have a deeper root in the industry.
Harris claims he was taught the techniques these industries use on us while studying at Stanford University under BJ Fogg (Harris, 2017). Fogg is a scientist who teaches students to think about ‘behaviour design’ in a course fittingly titled the ‘Persuasive Technology Lab’ (Fogg, 2017). His goal is to teach people the most efficient routes to making persuasive technologies, particularly for mobiles (Fogg, 2017). Fogg is a major influence on the current state of apps and websites, with co-founders of guilty companies like LinkedIn, Instagram and Snapchat having graduated from Stanford (Stone, 2014). “Fifteen years ago there were relatively few examples of persuasive technologies,” he explains (Fogg, 2017), adding that software used to focus more on “crunching data and boosting productivity” (Fogg, 2017).
Higher education should be more responsible; it is usually where students are taught sustainable values. How is it acceptable that someone is setting up the next generation to corrupt the minds of the public?
I gained more insight into Fogg’s ideas through his curriculum. He has devised a list of eight steps which he claims can guide you towards a successful piece of persuasive technology (Fogg, 2009). Steps 1 to 6 detail user research and how you must “choose the easiest target audience” (Fogg, 2009), something that could be said of tech companies targeting adolescents. He also holds up problematic companies like Facebook as good examples for his students to “imitate” (Fogg, 2009). Despite this, an overwhelming amount of the process comes from a place of positivity. Fogg details how the main aims are to “provide ability” to people and be “motivating”, and how this can lead to big successes like saving the environment (Fogg, 2009). He also finds that his students are always ambitious and somewhat idealistic. “Everything big started small,” he notes (Fogg, 2009). “Google offered a simple search box, Yahoo was a list of links, Facebook was a directory created for friends” (Fogg, 2009).
Vanja Garaj, a senior lecturer in digital design at Brunel University, shared similar sentiments of positivity when interviewed. According to Garaj (2017), the ‘gamification’ of information that Fogg teaches is beneficial. “Contextualising tasks as play makes more contributions to human development than mandatory tasks,” he states, adding that designers were brought into the industry because information became “too long and difficult to process” (Garaj, 2017).
Ultimately, for every Instagram co-founder, the education system also creates people like Tristan Harris. The system is somewhat comparable to mobile phone apps: both are platforms that give you the basic tools, and people can choose whether to use them positively or negatively.
There is strong evidence that people are using design, with an additional layer of psychology, to create trivial constructs that trick people into addiction. We are also, however, starting to witness the rumblings of a counter-movement that has begun to expose this to the public and build its own humane alternatives. From what I’ve come to understand, we’re at an advantage now. More and more people are starting to recognise this as a real addiction with real consequences. We need the next generation of budding Silicon Valley entrepreneurs to recognise the importance of ethical behavioural design. Now, while this industry is still evolving, is the time to act. We all own phones, and we’re all stuck like flies on the world wide web. It’s only a matter of time before the spider comes.
Article by Thomas Bryant
For more information and references, please contact email@example.com