Curiosity Over Certainty

by TLD Group

Rarely does a week go by at TLD Group without a conversation about AI and coaching. Sometimes it comes from a Chief Human Resources Officer wondering whether to pilot an AI coaching platform for mid-level managers. Sometimes from one of our independent coaches, asking whether AI will eventually make their work redundant — or whether embracing it is now a prerequisite for staying competitive. Sometimes from inside our own team, as we figure out where and how we want to use these tools in our own practice.

We do not have all the answers. What we have is a growing set of questions, and a strong conviction that asking them carefully, honestly, and together is the right place to start.

The Opportunity Is Real

Let us be clear about something first: AI in coaching is not a threat to be managed. It is a genuine expansion of what is possible.

Coaching has historically been constrained by access. It is expensive. It requires scheduling. It is available to senior leaders far more often than to the managers, physicians, and emerging leaders who arguably need it just as much. AI changes that equation. Platforms can now offer on-demand reflection prompts, goal-tracking, behavioral nudges, and conversational support at scale, reaching populations who would never otherwise have a development experience that feels personalized.

We also see real utility in the ways AI supports the work of human coaches. Transcription tools that free coaches to be fully present rather than note-taking. Assessment platforms that synthesize data more quickly than manual scoring. Scheduling tools that reduce friction and increase follow-through. When AI handles the routine, it creates more space for what only human coaches can do: the kind of deep, generative listening that shifts how a leader sees themselves and their world.

The International Coaching Federation's AI Coaching Framework puts it plainly: AI used well allows coaches to invest more in the high-value work of complex, transformational coaching in the human-to-human domain. We agree. The question is not whether to engage with AI. It is how.

What We Are Watching

The relationship is where coaching lives. Every credentialed coach knows the coaching relationship is not the container for the work — it is the work. Safety, trust, and human presence are not nice-to-haves. They are the mechanism by which change happens. Can an AI create the conditions for real coaching? The ICF framework is honest about this: while a coaching agreement may be relatively straightforward to establish with AI, the depth of presence and trust required for a client to feel truly safe is harder to replicate. The leaders we work with are navigating genuinely complex terrain, leading through uncertainty, reconciling who they are with who their role demands they become. That work requires a human witness who can sit in silence, hold ambiguity, and ask the question that disrupts a long-held assumption.

Disclosure is no longer optional. The updated ICF Code of Ethics, effective April 2025, is explicit: ICF professionals must disclose the use of artificial intelligence and ensure clients' interests are protected when AI technologies are used. Transcription tools. Analysis software. Any platform that touches client data. This is not bureaucratic fine print. It is the recognition that clients are entitled to know when a machine is in the room. For organizations evaluating AI coaching platforms, the same rigor applies: ask the vendor how data is stored, who has access, how long it is retained, and whether the platform discloses its AI nature to users. These are ethical questions, not technical ones.

Equity cuts both ways. One of the most compelling arguments for AI in coaching is access. AI can extend meaningful development to leaders at every level, not just those at the top. We hold that value deeply. But access is only equitable if the experience is actually useful and trustworthy for everyone who receives it. AI systems trained on narrow or biased datasets can reproduce existing inequities rather than challenge them. Healthcare organizations in particular, already navigating deep questions of equity in care and culture, should press hard on this dimension before deploying AI coaching at scale.

Coaching is a specific discipline. It is not therapy. It is not mentoring. It is not consulting. The ICF definition is clear: coaching partners with the client in a thought-provoking, creative process that inspires them to maximize their potential. That definition does not change because the coach is human or machine. What concerns us is when AI tools blur those boundaries — when a platform offering behavioral prompts and wellness check-ins begins to function more like a mental health tool without the safeguards that entails. For organizations deploying these tools across clinical workforces, where the emotional stakes are high and burnout is real, this distinction is not abstract.

How We Are Thinking About This at TLD Group

We are a firm grounded in human relationships. Our coaches are credentialed, experienced, and deeply committed to ICF standards. That is not going to change.

At the same time, we are genuinely curious about where AI can extend our reach, strengthen our clients' experience between coaching conversations, and support the kind of continuous learning that leadership development requires. We are not treating this as a binary choice between "all AI" and "no AI." We are treating it as a design challenge: how do we build hybrid models that use technology to enhance human coaching, not replace it?

We are also watching our clients grapple with this in real time. HR and talent leaders under pressure to democratize development. Learning and development teams trying to evaluate a rapidly growing marketplace of AI coaching platforms, many of which make claims that outpace their evidence. CEOs asking whether their investment in executive coaching will look outdated in five years.

Our answer to that last question: no. The most complex leadership development work, such as helping a physician leader hold both accountability and compassion, or guiding a pharma executive through an organization-wide transformation, requires human depth. It requires relationship. That will not be automated away.

The Questions Worth Sitting With

Three keep coming back for us:

  1. What is the coaching experience actually designed to do, and does the tool match that intention? The difference between a reflection app and a transformational coaching experience is significant. Naming what you are buying matters.

  2. Who has access to the data, and has the client truly consented? Disclosure and informed consent are now codified in the ICF Code of Ethics. They should be foundational to any deployment decision.

  3. What does the human coach do that this tool cannot, and are we protecting that space? The answer to this question should shape every decision about where AI fits in your coaching architecture.

The best outcomes we have seen come when coaches, clients, HR leaders, and technology partners are thinking through these questions in collaboration, not when any one party is handed a tool and told to use it.

Leadership development has always been about helping people grow into complexity. AI is adding a new layer of complexity to the field itself. The right response is neither uncritical enthusiasm nor defensive resistance. It is curiosity — rigorous, ethical, human-centered curiosity.

That, at least, is something no algorithm can replace.


Tracy Duberman, PhD, is President and CEO of The Leadership Development Group and co-author of From Competition to Collaboration: How Leaders Cultivate Cross-Sector Partnerships to Drive Change. TLD Group partners with healthcare, pharmaceutical, and health-adjacent organizations to design and deliver customized leadership development solutions. Learn more at www.tldgroupinc.com.

Topics: Executive Coaching, Artificial Intelligence