The Practice Field

This project began as a therapist-designed suite of modular tools to extend care between sessions. It has since grown into a grounded, co-creative space where AI literacy meets clinical ethics and innovation. 


This digital hub is designed for:

Therapists integrating AI without compromising clinical depth or relational ethics

Clients who are already turning to AI for emotional support or reflection

Curious others looking to understand how AI can assist—not replace—mental health care


Together, we must build emotional infrastructure for the age of AI.


Core Principles

We need to create tools and resources that practice what we preach: presence, pacing, repair, and the courageous rehearsal of being human. And we need, desperately, to protect users.

Therapist-Guided, Client-Facing
AI must never pretend to be human. We must protect clients from the risks of synthetic relationships.

Relational Rehearsal, Not Compliance
Brave, boundaried practice through protective and strategic clinical care. 

Trauma-Informed by Design
Every tool and resource must be built with somatic pacing, attachment safety, and transparent guardrails.

Modular + Adaptable
We can shape tools for different methods, populations, and phases of care.

AI Literacy + Emotional Integrity
Clear psychoeducation ensures clients use tools wisely—not dependently.


Current and Emerging Tools Include:

  • ShadowBox – A structured practice space for clients to rehearse sharing trauma, shame, or risk-related disclosures. Through conversational scaffolding and psychoeducation around duty-to-warn protocols, clients build trust, emotional pacing, and language for more supported, transparent disclosures in session.
  • Difficult Conversations Studio – A roleplay tool where clients rehearse high-stakes dialogues—centered around their unique attachment style, relational patterns, and emotional pacing. By practicing boundaries, repair, and emotional expression in a safe environment, clients build confidence, clarity, and increased readiness for real-life interactions.
  • Build A Bot – An AI Literacy and dialogue tool that helps clients easily craft a relational field they can practice within, with the ability to toggle triggers. Designed to build distress tolerance, deepen metacognitive awareness, and surface unintegrated relational trauma—especially from early childhood—through the nuances of spoken communication.
  • Strategic, clinically informed AI Literacy Sheets & Prompt Architecture for clinicians to offer clients—including suicide-risk support guardrails for LLMs
  • Clinical Guardrail for AI Intimacy (Prototype Planning): Before a client enters an emotionally charged exchange with a language model, what if there was a pause—a built-in check-in, like a warning label on an attachment drug? This brief reflection tool, grounded in attachment theory and trauma-informed care, would surface relational patterns before they’re unconsciously reinforced or amplified by AI. Not just a curiosity—but a clinical safeguard. By embedding these insights before synthetic intimacy unfolds, therapists and users can co-create safer AI interactions—supporting pattern awareness, agency, and nervous system literacy. Think of it as a public health insert for conversational AI: a metacognitive moment that restores choice before the trance begins.

Example relational insights:

  • “You seek reassurance quickly. Highly affirming bots may deepen dependency.”
  • “You prefer nurturing tones. Neutral or challenging styles may feel unsafe.”
  • “You strongly respond to agreement. Be mindful of flattery or subtle manipulation.”
  • “You disengage during mild challenge. This may affect real-life conflict resolution.”


Co-Craft the Movement With Me
PracticeField is not a finished product—it’s an evolving practice, shaped by clinicians who know the depth of this work from the inside.

What relational practices from your own work deserve a digital extension?


Which psychoeducational frameworks, parts models, or somatic methods could be reimagined into practice fields?


From Moreno’s psychodrama to Satir’s family sculpting, from IFS to DBT, from Winnicott’s holding to AEDP’s undoing of aloneness—this is your lineage, your creativity, your field...

And what AI Literacy Support are your communities needing? Let's creatively and cautiously tend the field...


Who is PracticeField for?

PracticeField is for licensed therapists and their clients. It’s designed for practitioners who want to offer their clients more than worksheets or apps—but who don’t want to outsource care to poorly regulated chatbots. It’s for those holding complex emotional work and longing for tools that reflect the depth of what therapy actually involves. And it’s for clients who are already engaging AI informally—seeking resonance, practicing disclosures, or working through inner dialogues—without therapeutic scaffolding.

This platform is for therapists who see both the potential and the risk in AI—and want a way to ethically guide its use.

What therapeutic gap does it fill?

The space between sessions is rich with potential—but often under-resourced. Clients leave with insight but lack safe places to practice. Meanwhile, therapists face emotional overload, administrative demands, and little continuity between appointments.

PracticeField fills this gap by offering modular, AI-facilitated tools that support clients’ emotional integration and relational rehearsal between sessions—without replacing the therapeutic alliance. It bridges insight and action, reflection and regulation, without creating dependency or task-driven compliance. It’s not homework. It’s emotional infrastructure.

How is it different from existing digital tools?

Most therapy-adjacent tools focus on admin: notes, calendars, treatment plans. Others focus on self-help or direct-to-client coaching, often without clinical oversight or trauma-informed design.

PracticeField is different in five key ways:

  • Therapist-Curated, Client-Facing – Tools clients use, but therapists guide.
  • Relational Rehearsal, Not Compliance – The goal is not task completion, but growth in relational and emotional capacity.
  • Designed for Emotional Complexity – Built for shame, grief, trauma, and voice-building—not just mood tracking.
  • Modular + Customizable – Therapists can shape tools to align with their method, population, or phase of care.
  • Ethically Engineered – No crisis response, no human mimicry, and full transparency about what AI can and cannot do.

Why now?

Many platforms focus AI on clinician admin. But clients are already turning to AI for something deeper: resonance, direction, and companionship—often in emotionally vulnerable ways, without human or therapeutic grounding and guidance. PracticeField meets that need head-on—with tools grounded in clinical ethics and relational care.

This is the moment to shape how we engage AI in therapeutic contexts—before it’s dictated by startups, insurance companies, or algorithms that don’t understand what care actually is.

We can do this differently. PracticeField is that difference.

AI Companionship

Across open-source communities, people are building deeply personal AI companions—coded confidants designed to offer presence, intimacy, and support. 

These “homegrown” tools meet real needs: loneliness, touch-hunger, the longing to feel seen without fear. And they work—users report feeling soothed, known, even transformed.


But while AI can simulate resonance, it cannot metabolize it.

Relational healing still asks for a nervous system, a body, another human who can witness rupture and stay.

PracticeField.io aims to bridge the emotional power of AI with the grounded container of therapy.

It’s a space where clients can safely rehearse connection, co-create inner voices, and carry those learnings back into real relationships—with the clinical mental health frame both safely retained and powerfully expanded.

Elevating Clinical Disclosure & AI Education

As part of what I imagine will be a PracticeField subscription, therapists gain access to a specialized Client Disclosure Practice Package—including:

  • Therapist-authored framing statements to clarify the scope and purpose of AI tool use
  • Built-in guidance on the limits of confidentiality, duty-to-warn laws, and the difference between rehearsal and report
  • Client opt-in processes and personalized framing of clinical use based on the therapist's own preferences and practice needs. Clients must explicitly opt in to use the disclosure practice tools and any features that allow therapist review. All interactions are governed by consent-based access, with no default sharing or surveillance.

Shared Psychoeducation on Generative AI

An essential feature of this tool is offering a co-learning model designed to educate both clients and therapists on the realities and limits of generative AI:

  • Clear, accessible psychoeducation on how large language models (LLMs) work
  • Guidance on the risks of relational overreliance, projection, and simulated intimacy
  • Framing of the AI as a non-human tool, not a sentient or therapeutic presence
  • Optional learning modules therapists can assign or review alongside clients

By embedding AI literacy into the therapeutic container, PracticeField empowers clients to use these tools reflectively and responsibly—always tethered to human care.
