It is a great concept in theory. Smartphones and apps have revolutionised every aspect of our daily lives: from expert-level advice to professional-quality photographs, it seemed natural that they might one day ease our depression and loneliness too. The dawn of mindfulness apps such as Headspace suggested there might be real potential here. However, to suggest that yet more time spent on our anxiety-inducing, data-harvesting devices can somehow improve our mental wellbeing is a touch optimistic, if not outright naïve.
Internet addiction is a thing; social-media-induced anxiety, or ‘fomo’, is absolutely a thing. Both are drivers of depression, and the more time you spend on your smartphone, the more likely you are to experience them. If your phone is on you or near you and a notification pops up, best believe you will be on that phone asap. Resisting the urge is not a matter of willpower: hundreds of engineers have worked to ensure the device and its apps are designed to make you pick it up and stay glued to the screen. Nor does it matter that a mindfulness app is downloaded alongside your plethora of social media applications. In addition to the social-media fomo, there is the physical attachment to your phone: carrying it around in your hand, having it on your desk as you study, charging it by your bed. Researchers in Hong Kong call it nomophobia. Phobia or not, the detrimental effects of this physical and psychological dependence on our mental health are experienced first-hand by us all.

This is why I find it so strange to see articles claiming that a new ‘AI’ app can help solve all our mental health woes. The underlying problem of physical and psychological attachment remains the same. Apps make money from our screen time and our attention, so irrespective of what any given app’s mission statement may be, our usage is its bottom line. That is why Snapchat has streaks with your best friends, why Facebook sends you completely irrelevant notifications, and why meditation apps give you rewards if you ‘keep up the good work’. These are all attention-grabbing plays to bring you back into the app. For a more detailed description of these techniques, I suggest listening to former Google design ethicist Tristan Harris. So if all that matters is that you keep coming back and staring at the screen, does it really matter whether the app is ‘good’ or ‘bad’?
Whilst I see the mindfulness app market as little more than effective marketing of an equally addictive product, some might argue that being addicted to an app that induces positive habits is not such a bad thing. The problem, then, is the very different kind of data these mindfulness apps collect.
Headspace has 16 million users and is valued at $250m, a small part of a booming $183bn industry. But you do not process that much user data and reach a valuation of a quarter of a billion dollars without a pretty damn good profit model. Unlike Facebook or Instagram, mindfulness apps store far more sensitive data. From our mood swings and heart rates to our app usage, all of it can be neatly aggregated, shared according to ‘community sharing’ permissions we seldom read, and sold off to the highest bidder. That is the for-profit model of contemporary apps. It is less of a problem when Instagram places a targeted ad for a clothing brand you might like. With mindfulness apps, on the other hand, there is a real danger that highly sensitive data about your wellbeing is sold on and exploited for commercial gain.