Dear Negotiation Explorer,
Welcome to Week 15 of our NegoAI series.
In our recent McKinsey–PharmaCrest preparation, we worked from a simple premise: negotiations with top consultancies are shifting because day‑to‑day delivery is changing. Soon after, The Wall Street Journal published a feature on McKinsey’s AI transition (Aug 2, 2025). It describes many of the same dynamics—smaller senior‑led teams, outcomes‑linked fees, and explicit AI roles in delivery. For email subscribers, here is the gift link (it should be full access, no subscription required).
Previous Rounds
For transparency on method: in previous rounds we prepared both sides using Cassidy AI workflows.
McKinsey perspective — four specialized agents:
Jerry (B2B Sales Agent): analyzed market context, competitors, and PharmaCrest’s business to define McKinsey’s defensible value proposition.
Daniel (Behavioral Agent): profiled PharmaCrest’s CPO, Sarah Chen, to tailor the communication strategy.
Deepak (Negotiator): mapped interests, BATNAs, scenarios, and creative integrative options using Jerry and Daniel’s inputs.
Linus (Compiler): integrated the layers into a single, actionable report aligning strategic, behavioral, and tactical elements.
PharmaCrest perspective — three specialized agents:
Eli (Strategic Buyer): evaluated vendor performance, alternatives, and procurement strategy from PharmaCrest’s side.
Deepak (Negotiator): mapped stakeholders, interests, BATNAs, options, and scenarios.
Linus (Compiler): integrated outputs into a coherent strategy.
WSJ, through a negotiation lens
When we prepared the McKinsey–PharmaCrest negotiation, we envisioned AI as part of the team rather than a silent back office. Practically, that meant using AI for research, synthesis, and quality checks while senior people retained judgment, stakeholder work, and sign‑offs. The WSJ describes a very similar working model inside McKinsey. At the table, it’s reasonable—and helpful—to talk openly about where AI speeds the work and where human review is non‑negotiable. That transparency reduces objections early and makes the offer easier to compare.
We also organized the proposal around a smaller, senior‑anchored core, with AI lifting routine analysis that once drove headcount. The reporting points in the same direction: leaner teams concentrated on experience. That shift naturally changes the price conversation. Rather than negotiating hourly rates, both sides can look at a clear team design, understand what each role adds, and let commercial terms follow the design.
On economics, our preparation favored putting a measured portion of fees at risk against observable milestones—time‑to‑pilot, adoption thresholds—so progress is recognized without turning the engagement into an all‑or‑nothing bet. The WSJ notes the broader move toward outcomes‑linked work, which helps normalize that structure. Tying the at‑risk element to mutual conditions—access to data, a steady decision cadence, executive sponsorship—keeps incentives aligned and renegotiations rare.
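The at-risk mechanics above can be sketched in a few lines. This is a minimal illustration, not contract language: the milestone names, shares, and dollar figures are assumptions chosen for the example, and the key point is that each milestone releases its share of the at-risk pool only when both the milestone is achieved and the mutual conditions (data access, decision cadence, sponsorship) were honored.

```python
from dataclasses import dataclass

@dataclass
class Milestone:
    name: str
    share: float          # fraction of the at-risk pool this milestone releases
    achieved: bool
    conditions_met: bool  # mutual conditions: data access, cadence, sponsorship

def fee_due(base_fee: float, at_risk_pool: float, milestones: list[Milestone]) -> float:
    """Base fee is always due; each milestone releases its share of the
    at-risk pool only if it was achieved AND the buyer upheld the
    mutual conditions attached to it."""
    released = sum(
        m.share * at_risk_pool
        for m in milestones
        if m.achieved and m.conditions_met
    )
    return base_fee + released

# Illustrative numbers (assumptions, not proposal terms), in $M:
milestones = [
    Milestone("time-to-pilot", share=0.5, achieved=True, conditions_met=True),
    Milestone("adoption-threshold", share=0.5, achieved=False, conditions_met=True),
]
print(fee_due(base_fee=10.0, at_risk_pool=4.0, milestones=milestones))  # 12.0
```

Because only a measured pool is at risk and releases are conditional on both sides' behavior, a missed milestone trims the fee rather than collapsing it, which is the "not all-or-nothing" property described above.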
Scope, in this context, is more than a deck. We focused on enablement: playbooks, training, and shadow‑to‑lead handovers that allow the client to run the process after we step back. The article highlights the same emphasis on capability building alongside strategy. Making those transfer artifacts explicit in the statement of work turns “value” into something the buyer can see and use, not just hear about.
Pace and transparency matter. To avoid surprises, we proposed a simple decision cadence—short, fixed‑agenda check‑ins every two weeks—and a light tracker so decisions surface early and course corrections are quick. The pressures described in the WSJ point the same way: agree on the rhythm up front, and both sides stay calm when timelines tighten.
Where AI expands, guardrails become part of the negotiation, not a legal appendix. We set out who owns what, excluded client data from external model training, and kept audit trails for AI‑assisted work. The WSJ’s focus on scaled AI adoption makes these points standard practice; writing them down keeps the conversation on outcomes rather than process anxiety.
Applying this to the McKinsey–PharmaCrest negotiation
The structure we drafted—a smaller senior core, AI‑assisted analysis, and milestone‑linked economics—fits the landscape the WSJ describes. It lets a buyer compare like‑for‑like models across firms and gives a consultant room to differentiate where it matters: partner judgment, execution discipline, and stakeholder alignment, with AI compressing the routine.
From the PharmaCrest vantage point in Part One, the stakes were explicit: an $18.5M annual baseline with a target band of $13.9–$14.5M, anchored by quoted alternatives—BCG at $13.8M, Bain at $14.2M, Deloitte at $11.5M (implementation‑heavy). Anchoring the negotiation on team design first lets price follow design rather than devolving into rate cutting.
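A quick arithmetic check of the buyer's alternatives clarifies why the band matters. The figures are the ones quoted above; note that the Deloitte quote is implementation-heavy and so not a like-for-like comparison.

```python
baseline = 18.5                 # $M, current annual spend
band = (13.9, 14.5)             # $M, PharmaCrest's target band
quotes = {"BCG": 13.8, "Bain": 14.2, "Deloitte": 11.5}  # $M, as quoted

for firm, price in quotes.items():
    in_band = band[0] <= price <= band[1]
    savings = baseline - price
    print(f"{firm}: ${price}M, saves ${savings:.1f}M vs baseline, in target band: {in_band}")
```

Only Bain lands inside the band; BCG sits just below it and Deloitte well below but with a different scope. Every alternative saves at least $4M against the baseline, which is the leverage that makes a design-first conversation credible.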
Performance data also shaped the buyer’s leverage: only 42% of recommendations were fully implemented versus a 65% benchmark, and internal AI compressed analysis cycles from two weeks to two hours, with roughly $2.3M in documented quarterly savings. Bringing those facts into the room legitimizes explicit AI roles and milestone‑linked economics (e.g., time‑to‑pilot, adoption thresholds) so progress is recognized without turning the engagement into an all‑or‑nothing bet.
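The leverage figures above compound quickly once annualized; a short sanity check of the arithmetic (the annualization assumes the documented quarterly run rate holds, which is an assumption, not a reported figure):

```python
# Implementation gap: 42% fully implemented vs a 65% benchmark
implemented = 0.42
benchmark = 0.65
gap = benchmark - implemented             # 23-point shortfall

# Documented internal-AI savings, annualized from the quarterly figure
quarterly_savings = 2.3                   # $M per quarter
annualized = quarterly_savings * 4        # $M per year if the run rate holds

# Cycle-time compression: two weeks -> two hours
cycle_before_hours = 14 * 24
cycle_after_hours = 2
speedup = cycle_before_hours / cycle_after_hours

print(f"gap={gap:.2f}, annualized=${annualized:.1f}M, speedup={speedup:.0f}x")
```

A 23-point implementation gap and an order-of-magnitude cycle compression are exactly the kind of observable facts that make milestone-linked terms easy to justify at the table.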
Compliance concerns mattered too. PharmaCrest flagged senior‑partner engagement shortfalls; a simple decision cadence with a light tracker makes accountability visible and course corrections quick—without inflating headcount.
Finally, the most constructive path is integrative rather than adversarial: a hybrid AI‑human partnership that uses PharmaCrest’s internal AI with McKinsey oversight; success‑sharing tied to implementation milestones; a knowledge‑transfer accelerator (playbooks, training, shadow‑to‑lead); and a phased transition that reconciles cost, speed, and capability. This is exactly the shift from advice to enablement described in the WSJ reporting.
Spelling these terms out in the proposal—team design first, plain‑English AI usage, measurable milestones, enablement deliverables, simple decision cadence, and clear data/IP rules—reduces friction, shortens cycles, and keeps the fee discussion grounded in design rather than headcount.