Take hold of your thoughts

Early testing and development

We developed a prototype that would be the simplest version of our service. This prototype could meaningfully mimic the final offering, so we could experiment, understand the user experience and hone the final proposition before investing too heavily in a native app for the public.

We therefore created a web app which encouraged users to ask themselves a question and then answer it. The intent was that users would inadvertently find themselves having a fluid conversation with themselves, one in which they might recognise more of their own capacity to respond to their challenges with a clearer thought process.

We recruited 30 users aged 20–50 with mixed levels of self-declared anxiety to use the service as much as they wished over a two-week period. Through interviews with the participants, their usage data, survey responses and the contents of their conversations, we gained many new insights about the experience of our proposition.

The most crucial learning was that most people were sceptical about the idea of having conversations with themselves. People often felt it was embarrassing or ineffective, or weren’t quite sure what to do. However, we also found that some people (including some sceptics) were surprised that the process eventually delivered strong results, helping them think more clearly, feel listened to or even feel less alone. With these insights, we focussed on guiding and simplifying the conversation process and on managing user expectations from the start.

In addition, we found that people wanted more than to simply have and store conversations: they wanted to be given a new understanding of themselves based on what they had written. While there are many advanced mechanisms for delivering deeper understanding, the team sought out new, simple methods of curating conversations in ways that would reveal new insight. One approach was to give more emphasis to an already appreciated mood-rating feature within the prototype.


What is the problem?

This service evolved from research on the EQLs design proposition, where we explored how to reduce anxiety in people who struggle to self-reflect. EQLs offered people a range of AI characters to talk to that would support them, monitor their mood and alert them to patterns in their behaviour that influence their happiness. The topics arising from this proposition, which warranted further investigation, were: how do people feel about having their emotional status analysed? How comfortable are people sharing intimate information with a digital service? And what kinds of value can people take from non-human interactions?

We continued our research and found that many young people suffering from anxiety were not engaging with self-reflection because it was often ineffective. And while conversational types of reflection with other people were valued, they were seen as unsafe from a social or emotional perspective, and therefore often avoided. This perception of a lack of safety was often connected to the idea that self-reflection meant you had a mental health problem and therefore shared some of the same stigma.

We chose an approach of developing heuristics that use conversation to help people build rituals and skills for effective self-reflection, ultimately building emotional resilience and agency over their lives. We explored simple technological ways to foster the valuable attributes of conversation with ‘another’, because such conversation is effective and familiar, but without involving a second human, who is seen as ‘unsafe’.


“Not much in order to boost my mood or help me if I’m not feeling great.”

“It surprised me that playing both roles in a conversation – physically typing the issue and then generating the response yourself – is so effective. I might have been very sceptical to be honest, but actually doing it showed me how helpful this is to my peace of mind.”

What is ‘Hold’?

Following the detailed analysis of the prototype trial, as well as online engagement testing and user interface experiments, we refined the proposal and developed a native app, available on the App Store and Play Store.

The proposal was positioned as a tool that helps people think clearly through difficult moments by enabling structured self-conversation: it essentially borrows the format of conversation with another person, but makes it safe, anonymous, private, always available, and easier to reflect on and learn from.


Structured self-conversation

The main function of the app is a self-conversation feature. The user can start a conversation either by asking themselves a question or simply by saying whatever is on their mind. They are then prompted to continue according to how they started: if they began by saying what is on their mind, they are prompted to ask themselves a question about what they’ve just said, then to answer that question, and so on. As they express what they are thinking, their words are displayed in a familiar messenger-style conversation format that helps sustain the illusion that they are discussing something with someone else.

The conversation can be typed or spoken, and spoken words are transcribed into text. At any point, the user can have an artificial voice (with an accent and gender of their choice) read their words back to them.

At the end of each conversation, the user is asked to rate their mood and give the conversation a title.
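The alternating flow described above can be sketched in a few lines. This is a minimal, hypothetical illustration rather than the app’s actual implementation: the Conversation class, the prompt wording and the 1–5 mood scale are all assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Conversation:
    """One self-conversation: the user alternates between statements,
    questions and answers, then closes with a mood rating and a title."""
    entries: List[Tuple[str, str]] = field(default_factory=list)
    mood: Optional[int] = None   # assumed 1-5 scale, logged at the end
    title: str = ""

    def add(self, text: str) -> str:
        """Record an entry and return the next prompt: a question invites
        an answer, a plain statement invites a question."""
        is_question = text.rstrip().endswith("?")
        self.entries.append(("question" if is_question else "statement", text))
        if is_question:
            return "Now answer your own question."
        return "Ask yourself a question about what you've just said."

    def close(self, mood: int, title: str) -> None:
        """End the conversation with a mood rating and a user-chosen title."""
        self.mood = mood
        self.title = title
```

The design choice worth noting is that the app never generates content itself; it only decides which kind of contribution to invite next, which is what keeps both sides of the exchange in the user’s own words.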


Conversation reflection

All conversations are stored with their titles in a log, organised chronologically or by the mood rating logged at the time. When a user re-reads a conversation, they find questions from the app that encourage healthy, inquisitive reflection on what was written. Additionally, conversations are grouped together automatically based on simple traits, such as the most viewed or the highest mood rated. Beyond the automatic collections, users are also prompted to group conversations into more sophisticated classifications, such as the topic, the environment they were written in, or the emotion they felt.
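The automatic collections described above amount to simple sorts over the stored log. As a rough sketch under assumed names (the function, the field names `views` and `mood`, and the top-3 cut-off are all invented for illustration, not the production logic):

```python
def automatic_collections(log, top_n=3):
    """Group stored conversations by simple traits.

    Hypothetical sketch: each conversation is a dict with 'title',
    'mood' (rating logged at the time) and 'views' (times re-read).
    """
    return {
        # most re-read conversations
        "most_viewed": sorted(log, key=lambda c: c["views"], reverse=True)[:top_n],
        # conversations with the highest mood rating logged at the time
        "highest_mood": sorted(log, key=lambda c: c["mood"], reverse=True)[:top_n],
        # the chronological view is simply the log itself, oldest first
        "chronological": list(log),
    }
```

The richer classifications (topic, environment, emotion) cannot be derived this way, which is why the app prompts the user to create those groupings themselves.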

These different levels of reflective activity encourage users to use the app not just to vent and cathartically divulge their thoughts, but to investigate, consider and understand themselves further.


Guided thinking

Throughout the app there is a guiding voice which instructs the user if they are unsure about anything. It bolsters a perception of authority, builds the user’s trust and encourages them to engage with the app. It is this guiding voice that offers a steady selection of optional prompt questions during the conversation as well as reflection questions if they return to their conversations or create collections from them. The positionality of the voice is written to be neutral, trustworthy, informed, non-judgemental and open to support the user in whatever they choose.


Respectful in its position

This positioning of the guiding voice was mirrored through the entire environment and all brand touchpoints. In our exploration of the market, we found that comparable services tended either to ‘own’ the fact that they were mental health apps or to disguise it with playfulness, but in all cases they still required the user to admit to an emotional issue. Given the associated stigma, this seemed to put up barriers for many of our users. With Hold, we try to be honest and clear about the tool rather than heavily branding it. We reduced clutter, created neutral space and avoided tacky characters in an attempt to respect the process people go through during use.


The science behind it

To create an environment that supports self-reflection, we enlisted well-established psychological frameworks to underpin the structure and content of the conversations people construct.

The objective is to support the user in different types of reflective practice, spanning from simply expressing what is on their mind through to critical self-reflection (i.e. self-reflection that is more aware of time, place and context). The ultimate objective is to reveal deeper assumptions that may transform people’s lives.

The prompt questions in the app are designed with the working memory model in mind, to enrich and bolster self-reflection at a cognitive level. This is done by encouraging expression that provokes long-term and short-term memory, audio and visual stimuli, and multiple modes of thought such as simple expression versus critical thinking (Baddeley & Hitch’s (1974) theory of working memory). The questions are also tuned to provoke varied modes of language, helping users illustrate their reflections with greater description, narration, examination or consideration, thus creating a deeper, broader engagement with the reflection content. The aim is to encourage greater awareness, control and insight about the reflection, the act of self-reflection and, therefore, their own thoughts, feelings and behaviours (Cubero et al., 2008).
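One way to picture prompts tuned across modes of thought is a tagged prompt bank that the guiding voice cycles through, so successive questions vary the register of reflection rather than repeating one. The prompt texts and mode labels below are invented for illustration; they are not the app’s actual question set.

```python
import itertools

# Hypothetical prompt bank, tagged by the mode of thought each question
# aims to provoke (simple expression vs critical thinking).
PROMPTS = {
    "expression": ["What is on your mind right now?",
                   "Describe how that moment felt."],
    "critical":   ["What assumption are you making here?",
                   "How might this look to an outside observer?"],
}

def prompt_cycle(modes=("expression", "critical")):
    """Yield (mode, prompt) pairs, alternating between cognitive modes so
    that successive prompts vary the register of reflection."""
    pools = {mode: itertools.cycle(PROMPTS[mode]) for mode in modes}
    for mode in itertools.cycle(modes):
        yield mode, next(pools[mode])
```

Alternating modes in this way is one simple mechanism for provoking both the descriptive and the examining language the paragraph above describes.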

Encouraging the user to reflect not just on their situation but on their understanding of their situation helps build their knowledge of how they think, as well as their ability to plan, monitor and assess how they think. This is called metacognition; it is a skill which may help build people’s emotional resilience and, ultimately, their ability to control their thoughts and their life (Schraw, 1994; Schraw, 1998; Schraw & Dennison, 1994; Sperling, Howard, Miller, & Murphy, 2002; Sperling, Howard, Staley, & DuBois, 2004, cited in Hussain, 2015: 134).

Through these models, we refined attributes of the guidance and content in the app as well as established brand positioning and features such as the text-to-voice playback.


The trial

From the 12th of December to the 20th of January 2019, we ran a trial in which we promoted and monitored the app and subsequently conducted a series of interviews with engaged users. The trial yielded five main learnings.


The product is desirable

Over the entire period, we had 2712 users, with an average of 11.3% of them returning to use the app later. These users completed 1123 conversations consisting of 6497 thought entries (the different stages of the conversation). On average, 59% of people who clicked on our adverts downloaded the app. During our most optimised week, we spent £400 on advertising, resulting in 208 fully engaged users costing just £1.25 each to acquire. This high conversion percentage and the low cost of engaging a new user demonstrate that the proposition is highly desirable.


Conversations are valued in two distinct ways

There are two distinct ways that people found value in having conversations with themselves. Firstly, the initial expression allows a cathartic unpacking of complex, anxious or out-of-control thoughts, which provides an in-the-moment release and calm. Secondly, the subsequent extended conversation reframes users’ mindsets so they can further understand their context and thoughts, and then potentially progress toward a conclusion, ultimately providing a calming way to process thought and reach new understandings.


Revisiting conversations can be challenging but offers reassurance, learning & pride

Different users engage with past conversations in different ways and for different reasons. Some users are intimidated by the intensity of the emotion in their conversations and avoid them to protect themselves; some want to see their conversations only once there are enough of them to yield bigger insights or to feel proud of how many they have; and some revisit after each conversation. Users return in calm moments or stressful ones, but when they do, they often find assurance that they can get through what they are experiencing, because they can see evidence that they have done so before. Revisiting also helps them learn and remember what works for them, and they often develop a sense of independence, self-reliance and pride.


It is valued for being a private, convenient, neutral space to externalise thoughts

We validated that some of the critical attributes we had built into the service were valued by our users. Being able to write and speak thoughts that were previously internal helps people process them, and the fact that they are stored in the app lends a welcome sense of weight and significance to the process. Absolute privacy helped people explore thoughts they wouldn’t have explored otherwise and was a critical reason why some adopted the app. Having Hold available at all times created a sense of comfort and made self-reflection more achievable in people’s busy lives. The simple nature of the app and the light-touch prompts provide structure for people but leave plenty of room for them to act as they wish, offering a flexibility that respects their process.


Our users suffer from anxiety more than we expected

We found that our most frequent users are more anxious than we had previously anticipated. Through previous research we hypothesised that our most likely target user would be someone who is experiencing symptoms of anxiety, but is not formally engaging in professional therapeutic services and does not effectively self-reflect. However, the users engaging the most with the app were often already professionally diagnosed with anxiety type disorders and often had regular therapy. These users tended to prefer the app as a regular mechanism to manage their conditions and in some cases used the app and their conversations to inform their therapy sessions.


Emerging discussions

What has this service taught us more broadly? Beyond exploring what self-reflection can mean to people and how they may engage with it in the future, this service raises many new areas of interest regarding people’s attitudes towards understanding themselves through technology. Below we discuss the service in relation to the original question: how might AI be used to help people gain agency over their lives? We briefly touch on four topics: harnessing valuable formats of interaction, how people are treating themselves, what people need in exchange for their data, and what it means for a service to have an identity and a relationship with its users.


Mimicking ingrained interactions

This service invites people to act directly to improve the management of their thoughts and their lives. To do this, the core feature is a small heuristic that mimics a conversation with someone else and, in doing so, incites learning and development. The fact that simply mimicking such an ingrained format of communication has helped people engage with and understand complex thought demonstrates an opportunity to explore co-opting other formats to enrich complex exchanges.


People are gaming themselves

In addition to the opportunities for co-opting interactions, this heuristic tells a story of people’s increased understanding and acceptance of their own logical inconsistencies in the service of helping themselves. Simply by mimicking a conversation, this device enables people to overcome stigma and think more clearly, even though they are knowingly participating in an illusion or role play.

This willingness to indulge in the illusion of a conversation with someone else is an acknowledgement by the user of their own human irrationalities, and an awareness of the fractious or demanding nature of their thought processes and of the conditions required to unravel a complex personal issue.

This speaks of a humble, pastoral dynamic people have with themselves and a willingness to bring technology into that dynamic, demonstrating a high level of trust in technology. People are essentially saying, ‘I’m unable to think clearly and while I want to speak to someone else that’s not appropriate, however, I can overcome these problems by pretending to have a conversation.’

Is this a sign of an emerging shift in people’s attitudes toward themselves that maturely embraces some of the anomalous irrationalities in human behaviour in order to develop more control and ultimately be happier? Is this a collective increase in meta-cognition? Does it demonstrate a shift in people’s relationship with mental health, the role that technology can have within it and a willingness to game themselves?


The price of sharing our data is to be shown ourselves

There is a commonly discussed public concern about companies collecting data about us and the power they may wield as that data enables them to understand, and potentially control, us. Yuval Noah Harari speaks of a need to learn and understand ourselves so we cannot be easily manipulated by technology. This dynamic equates to an arms race of knowledge about ourselves. Services such as Hold build both sides of that arms race, enriching both the individual and the company with knowledge of the user. Can this growth continue symbiotically? Will we humans reach our capacity too quickly? Will technology always have more scope for growth? Would this be technology companies’ greatest trick in gaining control?

When asked what they consider a fair exchange for sharing data about themselves, people are ostensibly responding that they want the same thing companies do: to understand themselves. Does this mean people do not understand the risks of data sharing, or does it represent how much people crave knowledge of themselves?


Who is the service?

Another interesting area of discussion that emerges from the service concerns the relationship people form with the service as an entity. In this particular situation, the user mentally manifests another entity with whom they directly converse, but the app itself also has an identity. In this case, the identity is intended to manifest as a neutral, trustworthy, informed and non-judgemental presence that lends authority to the space and, in turn, helps the user respect their own process of reflection. In many emerging therapeutic services, the identity of the service is also the other entity with which the user interacts and converses.

One of the challenges in designing the ‘Hold’ service was establishing the right positionality of the artificial ‘voice’ in the app (the instructive guide), so that it hosts the user in the space in the correct way and allows them to form the correct relationship with the activities conducted within it. There is a complexity in the interplay and overlap between the brand of the service, the identity of any voices within it, and the space. This complexity was carefully considered for our users. However, as more complex services emerge in which some of the voices are artificial, directly interacting with users and deeply engaged in a long-term, intimate exploration of themselves, there is not only an imperative to fastidiously curate that dynamic on a level not so far paralleled, but also a need to consider how these dynamics must be customised for each individual and how these relationships will influence people, perhaps years into a relationship.
Our service represents a willingness on the part of the user to engage with other entities of different, alien characteristics in an emotionally intimate and trusting dynamic. That trusting dynamic is fascinating, filled with potential, powerful, and it comes with a heavy responsibility.

Nicolas Rebolledo
John Makepeace
Francesca Ferrari
Mattia Gobbo


Would you like to know more?

Let's find the place to think, the freedom to challenge and the capability to act on real change. Together.

Let's Talk!
