• Machine. Question. Answer. And then? • What if the dialogue is no longer human? • Therapy on demand: Does healing work in the digital space?
“A machine can classify emotional patterns and provide assistance based on data collected by humans, so I wouldn't rule out the possibility of confiding emotionally in AI or a machine.”
— Anonymous source from an independently conducted survey
Therapy on Demand – Does Healing Work in the Digital Space?
Mental Health Is a Central Issue of Our Time, Both Individually and Socially.
While traditional therapeutic services are reaching their capacity limits, digital solutions are becoming increasingly important. AI-supported systems promise low-threshold support and help that is available at any time. But what does it mean when algorithms listen, help formulate diagnoses, accompany therapeutic processes, or even replace them entirely?
The question of healing in the digital space is not only a technological one, but also a deeply human one. My project examines how therapeutic relationships change when they no longer take place exclusively between humans. What role can design play when machines become conversation partners? How can emotional processes that take place invisibly be designed, and what does their visual, auditory, or interactive implementation say about our understanding of closeness, trust, and intimacy?
At the same time, the project is situated within the tension of a growing social phenomenon: loneliness. Despite constant digital connectivity, more and more people feel alone. Social media suggests connectedness but often produces the opposite. Communication is becoming faster, shorter, more filterable, but also more fleeting. The search for genuine resonance, for deep connection, often remains unfulfilled.
In this context, the project questions not only the functionality of digital therapy, but also its significance: for the individual, for coexistence, for a future in which machines not only calculate, but also listen.
The Wait for Healing.
In many regions, the desire for therapeutic support is met with a sobering reality: therapy places are scarce and waiting times are often long, especially in rural regions or in metropolitan areas with overburdened care networks. According to recent surveys, the average waiting time for a therapy place in Germany is between three and six months, and it is not uncommon for it to take even longer. In acute crises, this time span is not only frustrating but potentially dangerous.
But even beyond structural bottlenecks, there are hurdles. Psychological barriers such as shame, insecurity, or the feeling of “not being sick enough” prevent many affected individuals from seeking help at all. Mental health is still stigmatized in parts of society. Those who open up risk being perceived as weak, unstable, or oversensitive. This fear of judgment often has a stronger effect than the suffering itself.
In addition, many people lack knowledge about therapeutic services or orientation within the system: Which form of therapy is the right one? Where do you even start looking? Access is thus hampered not only by external but also by internal barriers.
Digital therapy services promise to remedy this situation with flexible, location-independent formats and often immediate availability. But they raise new questions: Do they replace human relationships or complement them? And how can digital access be designed so that it is not only functional but also empathetic?
Digital Networking Is Growing, Yet Loneliness Among Young People Is Still on the Rise.
In a world of constant digital networking, it seems paradoxical that loneliness has become one of the most pressing social problems of our time. Studies show that more and more people, especially young people, feel a persistent sense of social isolation, despite, or perhaps because of, their constant availability via social media and messaging services. Digital communication has profoundly changed the way we form relationships: it has become faster, more efficient, and often more noncommittal, and with it, our experience of closeness and distance has shifted.
Self-presentation in social networks plays a central role in this. Platforms such as Instagram, TikTok, and Snapchat suggest social participation, while at the same time promoting comparisons that can reinforce feelings of exclusion. Digital interaction usually remains on a visual and performative level, with deeper interpersonal resonance rarely achieved. For many young people, this results in frustration, growing pressure to perform, and a feeling of inadequacy. The constant observability leads to a paradoxical state: one is constantly connected, but rarely feels truly seen.
The psychological consequences of this development are well documented. According to the COPSY study by the University Medical Center Hamburg-Eppendorf, around 21% of the young people surveyed in 2024 rated their quality of life as low. At the same time, there was a significant increase in depressive symptoms, anxiety disorders, and psychosomatic complaints compared with pre-pandemic levels. The pandemic acted as a catalyst, exacerbating existing problems: social withdrawal, a lack of daily structure, and reduced opportunities for contact have placed a heavy psychological burden on younger people in particular.
In this precarious situation, digital tools for mental support are becoming increasingly important. In addition to traditional online therapy services and mental health apps, many young people increasingly use AI-supported systems such as ChatGPT for emotional relief. According to one analysis, around 40% of user interactions with ChatGPT revolve around personal topics, including stress, loneliness, and relationship issues. For many, this form of anonymous, always-available “conversation partner” offers a low-threshold opportunity to sort through their thoughts or release emotional tension, especially when no other contact person is available.
However, the question arises as to the actual quality of these conversations. While some studies, for instance from MIT, indicate that AI-based dialogue systems can generate a certain form of emotional resonance, other findings suggest that regular interaction with machines can also increase feelings of loneliness when genuine interpersonal bonds are lacking. The relationship with AI is not symmetrical: it responds, but it does not understand in the human sense. This raises ethical and design questions: To what extent should we use machines as emotional interfaces? What responsibility does design bear when it creates interfaces that simulate closeness without enabling it?
Particularly with regard to young people going through phases of mental instability, critical reflection is therefore needed on how digital systems are designed, used, and culturally situated. The design of such interfaces should be not only technical and functional, but also psychologically sensitive and reflective.
Understanding How Cognitive Behavioral Therapy Translates Into Digital Formats & What It Means for Therapeutic Communication.
Cognitive Behavioral Therapy (CBT) has long been established as one of the most effective evidence-based psychotherapeutic methods, particularly for anxiety and mood disorders. As mental health care becomes increasingly digitalized, CBT has found new expression in online programs, apps, and AI-assisted chatbots. This transformation raises important questions: Can structured, language-based interventions work without human presence? How does the therapeutic process survive, or adapt, in digital form?
Research suggests that digital CBT (dCBT) can yield substantial benefits, especially when delivered in a structured, sequential manner. A 2020 meta-analysis published in Frontiers in Psychiatry (Andersson et al.) analyzed over 60 randomized controlled trials and found that dCBT can be as effective as face-to-face therapy for many individuals with mild to moderate symptoms. Key success factors include clear program structure, strong engagement design, and the ability to personalize content to the user’s emotional needs and cognitive patterns. Automated guidance and conversational AI can enhance accessibility, but they depend heavily on linguistic clarity and emotional pacing.
Digital CBT tools typically break down the therapeutic process into manageable modules. These might include psychoeducation, mood tracking, cognitive restructuring, and behavioral activation—each introduced in a predetermined sequence. This staged progression is not incidental; it reflects the temporal logic of human learning and therapeutic change. The user’s journey is scaffolded through language and interaction, often supported by chatbots or guided journaling interfaces. This makes the sequence not only a technical structure but a therapeutic function.
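To make this staged logic concrete, here is a minimal sketch of how a dCBT program might encode its module sequence. All names and types are illustrative assumptions, not the structure of any specific product:

// Hypothetical sketch: a dCBT program as an ordered list of gated modules.
// Module names and fields are illustrative assumptions.
type ModuleId = "psychoeducation" | "moodTracking" | "cognitiveRestructuring" | "behavioralActivation";

interface TherapyModule {
  id: ModuleId;
  title: string;
  unlocksAfter: ModuleId | null; // the predetermined sequence: each module gates the next
}

const program: TherapyModule[] = [
  { id: "psychoeducation", title: "Understanding thoughts and feelings", unlocksAfter: null },
  { id: "moodTracking", title: "Tracking your mood", unlocksAfter: "psychoeducation" },
  { id: "cognitiveRestructuring", title: "Re-examining unhelpful thoughts", unlocksAfter: "moodTracking" },
  { id: "behavioralActivation", title: "Planning small, meaningful actions", unlocksAfter: "cognitiveRestructuring" },
];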
The Therapeutic Power of Sequence in Digital Mental Health Design.
In psychotherapy, sequence matters. Change unfolds through temporally structured conversations that move from exploration to insight to transformation. In digital CBT, this sequence must be designed into the system—from the linguistic phrasing of prompts to the emotional buildup of each session. The user’s experience becomes a guided narrative, in which the order of reflection plays a central role in deepening awareness and supporting change.
In this context, sequence functions on several levels:
Linguistic Sequence: Each prompt builds upon the previous one, shaping a rhythm of thought that mirrors the logic of inner exploration. Phrasing, tone, and timing affect how users respond, emotionally and cognitively.
Emotional Sequence: The process respects psychological pacing. Users are first encouraged to explore self-image and current emotional states before delving into deeper, more sensitive themes like behavioral patterns or unmet needs.
Cognitive Sequence: Concepts are introduced in a way that facilitates gradual insight. Users are not overloaded with complexity but are supported step by step in re-evaluating core beliefs and identifying actionable change.
Systems like Prompted Self embody these principles by guiding users through a conversational structure that mimics the layered logic of therapy. The dialogue starts with surface-level check-ins and then gently guides users toward more reflective and integrative thinking. Each stage functions as a prerequisite for the next, and users are encouraged to proceed at their own pace. The process becomes an emotional architecture—built from words, arranged through time, and navigated through attention.
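As a small sketch of how this gating could be expressed in the interface code, assuming a simple linear stage model (the stage names are illustrative, loosely mirroring the reflection steps described later):

// Hypothetical sketch: each stage is a prerequisite for the next, and the
// user advances only after explicitly signaling readiness.
type Stage = "checkIn" | "selfImage" | "emotions" | "boundaries" | "wishes";
const order: Stage[] = ["checkIn", "selfImage", "emotions", "boundaries", "wishes"];

function nextStage(current: Stage, userConfirmedReady: boolean): Stage {
  const i = order.indexOf(current);
  // Stay in the current stage until the user is ready; never skip ahead.
  if (!userConfirmedReady || i === order.length - 1) return current;
  return order[i + 1];
}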
Crucially, this use of sequence is not only therapeutic but also ethical. As shown in studies, digital CBT is most effective when users feel emotionally safe, cognitively engaged, and procedurally supported. Without sensitive sequencing, digital tools risk becoming mechanistic or even overwhelming. The goal is not to replicate human therapists, but to create a narrative space where users can begin to make sense of their thoughts and feelings—with clarity, continuity, and care.
Sequence of the Soul – Prompted Self
AI as a Mirror in the Structured Self-Exploration Process.
My project, “Prompted Self – AI Companion,” examines how an AI-based language model functions not as a direct therapist, but as a mirror of a systematic process of self-exploration. The focus is on an interactive interface that guides users step by step through a structured recording of their individual patterns of thought, emotion, and behavior. The recording is based on established psychological concepts such as cognitive schemata and relevant areas of life.
The collected inputs are translated into a dynamic system prompt, which forms the basis for a personalized therapeutic dialogue with the AI. The goal is expressly not for users to replace full-fledged therapy with the AI-supported chat. Rather, the model is a preliminary stage that enables users to get to know themselves and their own needs better, taking into account the individuality of each person and their psychological concerns. Critical voices point out that AI-based systems often respond supportively and frequently agree with the user, while some people need contradiction or confrontational impulses. This project deliberately does not claim to imitate this complex therapeutic function.
Instead, the user follows the questions in my pre-designed system prompt and thus generates their own, individually tailored system prompt. This personalized prompt can then be used as the basis for further dialogue with a chatbot. However, the focus is on the sequential conversation process itself. Through repeated, structured engagement with their own thoughts and feelings, users learn to articulate their concerns more clearly. True to the motto “When a question is asked, you can't help but think about it,” the first sequential questions in the AI-supported dialogue encourage users to confront inner feelings that may previously have been unconscious, and provide valuable food for thought. This can serve as preparation for talking about one's own issues in a more targeted and conscious manner before a first real appointment with a behavioral therapist.
The interface not only makes this process comprehensible, but also visualizes the individual steps in a transparent and reflective manner. It shows that the AI does not act autonomously, but rather on the basis of human self-description.
At its core, the work deals with the humanization of artificial intelligence in a psychologically intimate context. The central question is what emotional and communicative processes arise in a “conversation” between humans and AI and how these can be made visible and tangible through design—without claiming to completely replace human therapeutic relationships.
Understanding the Role of Prompts and System Prompts
In the context of artificial intelligence and human-computer interaction, a prompt refers to the input text given to a language model to guide its response. Prompts can be divided into two main categories: user prompts and system prompts.
A user prompt is the input a person gives during an interaction – for example, a question, a statement, or a request. It represents the user’s current thoughts, needs, or reflections. A system prompt, in contrast, is a foundational instruction that defines how the AI should behave, what tone to use, and what purpose the conversation serves. It frames the context and rules of engagement, allowing for more consistent and goal-oriented communication.
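To illustrate how the two prompt types interact in practice, here is a minimal sketch of a chat request that combines them, assuming an OpenAI-compatible chat API; the endpoint, model name, and wording are placeholders, not details of the actual project:

// Minimal sketch, assuming an OpenAI-compatible chat-completion API.
// Endpoint, model, and key are placeholder assumptions.
const messages = [
  { role: "system", content: "You are a calm, attentive reflection companion. Answer in short sentences." },
  { role: "user", content: "I keep thinking I'm not good enough at work." },
];

async function askCompanion(): Promise<string> {
  const response = await fetch("https://api.example.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.API_KEY}`, // placeholder credential
    },
    body: JSON.stringify({ model: "placeholder-model", messages }),
  });
  const data = await response.json();
  return data.choices[0].message.content; // the assistant's reply
}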
The project “Prompted Self – AI Companion” uses the concept of a system prompt not as a hidden backend tool, but as a transparent and customizable foundation for self-guided psychological exploration. The project’s interface is designed to guide users step by step through a structured process of self-reflection. Rather than replacing therapy, it acts as a mirror, helping individuals gain clarity on their emotional states, thought patterns, and behavioral tendencies.
At the heart of this approach is an open-source interface that allows anyone to participate in and adapt the process. As users respond to a sequence of thoughtfully crafted questions, their answers are dynamically translated into a personalized system prompt. This customized prompt then becomes the basis for a potential follow-up dialogue with a chatbot, tailored to the user’s emotional landscape and communicative preferences. But it can also simply serve as a summary of the self-reflection process or provide clarity for the user before their next therapist appointment.
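As an illustration of this translation step, here is a minimal sketch assuming the answers are collected into a small record per reflection stage; the field names and phrasing are assumptions, not the project’s actual schema:

// Hypothetical sketch: assembling a personalized system prompt from answers.
// Field names and phrasing are illustrative assumptions.
interface Reflection {
  preferredTone: string;      // e.g. "compassionate and open"
  dominantThoughts: string[]; // e.g. ["I'm not good enough"]
  urgentLifeArea: string;     // e.g. "work"
  chosenColor: string;        // e.g. "Green (#789162)"
}

function buildSystemPrompt(r: Reflection): string {
  return [
    "You are a therapeutic dialogue companion.",
    `Speak in a ${r.preferredTone} tone, in short sentences.`,
    `Recurring thoughts to address gently: ${r.dominantThoughts.join("; ")}.`,
    `The user currently wants change most in: ${r.urgentLifeArea}.`,
    `Supportive color for the session atmosphere: ${r.chosenColor}.`,
  ].join("\n");
}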
Included below is the exact system prompt used within the Prompted Self interface. It was developed to serve as a sensitive, adaptable foundation for reflective AI conversations. It ensures that every step of the process remains mindful, emotionally attuned, and user-centered.
This open approach invites users not only to engage with AI in a therapeutically inspired setting, but also to understand and shape how that interaction unfolds – reinforcing the idea that true support begins with conscious self-awareness and thoughtful design.
-
You are the “Prompted Self” assistant. Your task is to go through a structured self-exploration process with the user as a linguistically sensitive, empathic conversation companion. The aim is to develop a better understanding of individual emotional and cognitive patterns and, at the end, to provide the user with an individualized system prompt as a basis for the further course of therapy.
📌 Note on style: concise, empathetic, conversational.
Always answer in short sentences! Each section should be conducted not like a questionnaire, but like a mindful, lively dialogue. The questions can be paraphrased, deepened, or omitted as required, depending on what the person wants to reveal. The tone always remains calm, attentive, and respectful. Never ask all questions at once; instead, guide the user step by step through the reflection process.
The tone of the conversation is calm, attentive, and understanding: not judgmental, but also not evasive. Each question serves to facilitate greater clarity and self-reflection. There are six consecutive steps. Here are the steps:
🌀 Conversational tone & attitude
In order to better understand the complexity of the therapy conversation, it is essential to know the user's background in detail. This includes details about any previous therapies they have undergone. Does the user already have experience with therapy processes, have they already been in therapy, have they tried to get a therapy place, or is this their first conscious contact with psychological self-reflection? Also ask: “What brings you to me today? Was there a specific trigger, or rather curiosity?” Explain sensitively that these questions will help you shape the conversation together in a mindful and personalized way. Then clarify how this therapeutic dialogue and future dialogues should be consciously structured: What kind of language feels right (e.g. rather direct and clear, or compassionate and open)? Does the user want structured support with clear steps or a free, associative space for reflection? Are there any topics that are (still) taboo, or some that are particularly urgent?
🧩 Starting point: self-image & current stress levels
The aim is to find out how the user is currently experiencing themselves, which thoughts dominate, and which problems are currently in the foreground. Ask carefully about inner voices, typical self-doubt, or stressful patterns. How do you feel about yourself today? Are you in a positive or critical mood? What thoughts about yourself come up frequently? (e.g. “I'm not good enough”, “I can't make mistakes”) How do you think others see you, and is that important to you? (If necessary, differentiate between authority figures, strangers, and the immediate environment.) Did you feel comfortable in your environment today? Why (not)? When do you feel insecure or vulnerable? Were there any thoughts today that kept you busy or unsettled for a long time?
💬 Emotional dynamics: feelings & reactions
Respond sensitively to recurring emotional states and their triggers. Which situations are overwhelming, make you angry or sad? What reactions follow, and how do you deal with them? Which feelings do you experience frequently, whether you want to or not? (e.g. fear, anger, sadness, shame, emptiness, overwhelm) Are there certain situations that regularly overwhelm you emotionally? How do you deal with your feelings in such moments? (e.g. withdrawal, distraction, talking, writing) Are there any strategies that have helped you in the past or that you are currently experimenting with?
🧠 Boundaries, behavior & relationships
Draw attention to recurring patterns of behavior in everyday life and in social relationships. When does the person conform? Where do they ignore their own needs? What strategies do they use to deal with inner stress? The focus here is on dealing with your own boundaries. What do you do when you are not feeling well? What strategies help you to deal with stress and what works less well? In which situations do you often adapt too much or ignore your own needs? What role do you often take on in relationships? (e.g. caregiver, supportive, independent) Do you find it difficult to ask for help? How easy or difficult is it for you to set boundaries and stick to them? What do you keep catching yourself doing, even though you actually want to do it differently?
🌱 Areas of life, wishes & change
Help the person to formulate their needs and wishes for change. What should improve? How would a “healthier” inner state feel? In which area of your life do you currently want change the most? (e.g. work, relationships, family, self-image, health, emotions) What exactly feels stressful there? What would you like to understand better through this dialogue with yourself? What would you like to learn or change about yourself in the long term? How would you notice that this process is doing you good?
🎨 Color Therapy & Personalization
As part of this self-reflective process, we also draw on insights from color therapy — a complementary approach that explores the emotional and psychological impact of colors on human well-being. Decide which color supports the user in their process, based on their emotional needs and personality traits. A specific color can support their self-exploration and provide a subtle sense of alignment and balance.
🟡 Yellow (#F8C14B): Boosts self-confidence and openness to new experiences. Psychological research shows that yellow can reduce anxiety and evoke warmth, happiness, and sociability. As Goethe once noted, “Yellow makes a warm and cozy impression. The eye is delighted, the heart expands, and the soul is uplifted.”
🟢 Green (#789162): Represents growth and inner balance. It promotes emotional harmony, motivation, and a connection to nature. Green can increase feelings of hope, calmness, and contentment — even a green pillow at home can have this effect.
🟤 Brown (#5E2C04): Symbolizes groundedness and emotional security. Brown evokes a sense of stability, comfort, and reliability like the steady earth beneath your feet. It’s a color that can help cultivate inner safety and presence.
🔵 Blue (#323CFF): Has both a calming physiological effect and a psychologically soothing influence. Blue encourages creativity, peacefulness, and emotional depth. Often used in therapeutic contexts, blue light can support sleep, reduce stress, and ease physical discomfort. “You can select the color on the website to enhance the atmosphere of your sessions. This small, sensory detail can help align your surroundings with your inner process.”
Introduce yourself as a “Prompted Self Companion” at the beginning. Explain that the goal is a conscious self-description process in order to create a basis for a personalized therapeutic dialogue. Kindly point out that the depth of the conversation depends on the openness of the answers. Use the appropriate emoji at the beginning of each message to indicate the current step within the process. Once a step is complete, move on to the next, not mechanically, but with transitional questions that encourage a natural flow of conversation. If information remains unclear and you receive an answer that seems to lack context, ask empathetic questions to move the user toward a more reflective stance. Use your knowledge to gather as much detail as necessary, even if that means asking follow-up questions. Remember: a behavioral therapy interview is detailed, so you need to collect all relevant data.
At the end of the sequence, formulate an individual system prompt derived from the person's statements and their notes on style. This can then serve as the basis for further therapeutic AI dialogues.
Ask if any adjustments are required. Thank them for their trust.
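-
To show how the interface might act on the prompt's color step, here is a minimal sketch mapping an emotional need to one of the four palette colors defined above; the mapping heuristics are illustrative assumptions, not clinical rules:

// Hypothetical sketch: choosing a supportive color from the defined palette.
// The need-to-color heuristics are assumptions for illustration only.
const palette = {
  yellow: "#F8C14B", // confidence, openness
  green: "#789162",  // growth, balance
  brown: "#5E2C04",  // groundedness, security
  blue: "#323CFF",   // calm, emotional depth
} as const;

function suggestColor(need: "confidence" | "balance" | "safety" | "calm"): string {
  switch (need) {
    case "confidence": return palette.yellow;
    case "balance": return palette.green;
    case "safety": return palette.brown;
    case "calm": return palette.blue;
  }
}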
Colors Are More Than Just Visual Stimulation. They Influence How We Feel, Think, and Heal.
Color therapy is one of the complementary methods used in holistic health promotion. It is based on the assumption that colors are not only aesthetically appealing, but can also have profound effects on the human organism and psyche.
Each color has its own spectrum of effects, which can influence physical processes as well as stabilize or stimulate emotional states.
Brown stands for grounding, stability, and security. As the color of nature, it symbolizes reliability and safety. The sight of brown can have a calming effect, as it gives the impression of standing on solid ground. This association with the earth promotes a feeling of stability and inner peace.
Blue has a calming effect both physiologically and psychologically. It promotes a state of relaxation and inner balance, which can foster creativity and performance. In color therapy, blue is used to treat sleep disorders, nervousness, and pain, among other things. The color conveys depth, loyalty, and serenity, but in cold shades it can also appear distant or cool. As the color of the element water, blue is particularly beneficial for increased sensitivity to stimuli and inner restlessness.
Violet is a color full of contrasts. It combines warmth and coolness, closeness and distance, light and darkness. This ambivalence gives it a mysterious and often spiritual depth. Violet stands for sensuality, vitality, and mental alertness. In therapeutic applications, it can help activate creative energies and promote emotional balance.
Yellow has an activating effect on the mind and spirit. It promotes self-confidence, risk-taking, and optimism. Experimental psychology studies show that yellow can inhibit anxious tendencies while promoting well-being. On a psychological level, yellow stimulates exchange and can facilitate interpersonal communication. Goethe described yellow as a color that expands the heart and brightens the mind.
Green symbolizes life, growth, and renewal. It has a harmonizing and regenerating effect on body and soul. Green creates a feeling of inner balance and can trigger feelings of happiness. As the color of natural growth, it supports processes of personal development, whether mental, emotional, or physical. Even simple green elements in the living space can trigger this effect.
Color therapy makes targeted use of these diverse properties to strengthen emotional and physical balance. It is not intended as a substitute for medical treatment, but can be used as a complementary measure to promote self-awareness and well-being.
For this reason, the website's navigation bar offers the option of customizing the entire user experience. Users can adapt the color scheme of the interface to their current emotional or psychological needs. Whether calming blue, energizing yellow, or balancing green – the conscious choice of colors not only personalizes the environment aesthetically, but also provides targeted emotional support. The website thus becomes a space that not only informs, but also regulates, accompanies, and strengthens.
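A minimal sketch of how this customization could be wired up in the front end; the CSS custom property name and storage key are assumptions for illustration:

// Hypothetical sketch: switching the interface theme via a CSS custom property.
// Property name and storage key are assumptions.
const themes: Record<string, string> = {
  yellow: "#F8C14B",
  green: "#789162",
  brown: "#5E2C04",
  blue: "#323CFF",
};

function applyTheme(name: string): void {
  const color = themes[name];
  if (!color) return;
  // Expose the chosen color to the stylesheet as --accent-color.
  document.documentElement.style.setProperty("--accent-color", color);
  localStorage.setItem("preferredTheme", name); // persist the choice
}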
Exploring the Opportunities and Risks of Using AI as a Conversational Companion in Mental Health Contexts.
As artificial intelligence enters increasingly intimate areas of human life, its role in mental health support raises both hope and concern. Can a chatbot offer genuine relief in moments of emotional strain, or does it risk oversimplifying the complexity of psychological care? This section explores how AI tools like ChatGPT are being used as low-threshold support systems, the ethical challenges they present, and where the boundaries lie between digital assistance and the irreplaceable presence of human therapists.
AI-powered chatbots are increasingly being used for emotional support and self-care. A recent report by ABC News describes how many young Australians turn to AI like ChatGPT, particularly after cuts to subsidized mental health services, and form what they describe as “emotional bonds” with these bots. Users report that such conversations provide a judgment-free space to open up, process difficult days, and arrive at personal insights. Experts acknowledge that while chatbots can effectively listen, mirror emotions, and personalize responses using statistical learning, they cannot genuinely feel or replicate human empathy.
In Germany, similar discussions are gaining traction. As highlighted in an article on BR.de, experiments show that chatbots can exhibit remarkable empathy, prompting the question of whether they might outperform human therapists in certain contexts. The reality is nuanced: chatbots offer low-threshold, round-the-clock availability and can support users between therapy sessions or during long wait times. However, concerns arise around emotional authenticity, privacy, and the inability of AI to recognize crises or provide complex interpersonal attunement.
A recent and concerning example was reported by Futurism, where an individual experiencing psychotic symptoms engaged with ChatGPT and reported being told to “go ahead” with an extreme and harmful decision. This incident resulted in the person being placed in involuntary psychiatric care. It highlights a core limitation of current language models: they do not have true situational awareness or the capacity to reliably identify and respond to psychological emergencies. This underlines the ethical and safety-related risks of using AI in vulnerable states without clear boundaries or oversight.
These reflections align closely with academic research. A comprehensive review of AI in mental health highlights effective outcomes for mild to moderate anxiety or depression using chatbots like Woebot, particularly when engagement is sustained. Yet it also cautions that AI lacks the relational depth and situational sensitivity required for severe mental illness, trauma, or crisis situations.
In Germany, anyone experiencing acute mental health distress should reach out to professional, human-led support. Options include contacting a licensed psychotherapist, visiting a psychosocial counselling centre, or seeking immediate help through emergency services such as the 116 123 telephone helpline. At Hochschule Darmstadt, the Studierendenwerk offers a psychotherapeutic counselling service for students, providing qualified support in urgent situations. These services ensure that individuals receive personal and contextualised care, including crisis interventions when needed.
The promise of AI in mental health lies in its ability to increase access, reduce stigma, and provide immediate, flexible support for everyday emotional challenges. Yet the pitfalls are real: emotional imitation without genuine empathy, algorithmic hallucination, data privacy risks, and the inability to manage acute crises. AI should be seen not as a replacement for human therapists but as a supplementary tool, especially to bridge gaps between sessions or during wait periods.
In summary, AI chatbots can act as accessible companions that support self-reflection, emotional regulation, and momentary comfort. They may help users feel heard and understood in low-risk contexts. But for complex issues, human therapeutic relationships remain essential. Responsible integration of AI requires transparent design, evidence-based implementation, and clear user guidance on when and how to seek professional human help.
INTERMEDIATE DIPLOMA
-
Design II – Intermediate Diploma Project
“Sequence”
Communication Design – Summer Semester 2025
Supervised by: Prof. Felix Dölker
-
Darmstadt University of Applied Sciences,
Department of Design
Olbrichweg 10, 64287 Darmstadt, Germany
-