By the time a planning engagement reaches Step 7, the consulting team has been living inside the program for weeks and understands the logic behind every decision. The people who will actually execute the plan (e.g., sales representatives, district managers, market access teams, and medical science liaisons) were not in those sessions. Step 7 asks how those people will experience what the room has designed.
Change Management Builds on Structural Decisions
We place change management at Step 7 intentionally. Steps 3 through 6 define what will change: architecture, risk assessment, roadmap, and operating model. Step 7 asks how people will experience that change, who will resist it, and what the organization needs to do to support adoption. When change management is treated as an afterthought, the plan arrives too late to be specific.
Attempting change management before the structural work is complete produces vague plans. We can’t design a communication calendar if the milestones aren’t set, and we can’t build a training plan if the process changes aren’t defined. The structural work provides the specificity that makes change management actionable.
Impact Assessments Reveal How Change Accumulates by Role
The first artifact is the impact assessment. For each role affected by the program, the consulting team documents what changes:
- New processes and tools
- New reporting relationships, metrics, and decision rights
The assessment maps the magnitude of change by role, not by workstream. This reframing matters because programs are organized by workstream (e.g., technology, operations, finance) while changes accumulate by person. A district sales manager might be affected by three different workstreams simultaneously: a new CRM platform and a new territory alignment model layered on top of a restructured incentive compensation plan. Each workstream sees its own change as manageable; the district manager experiences all of them at once.
The impact assessment makes this accumulation visible. When the room sees that a single role is absorbing changes from four workstreams in the same quarter, the team can make sequencing decisions that the workstream-level view would never surface. The intelligence that makes this possible often comes from Step 2, where the consulting team mapped stakeholders beyond the org chart and surfaced the informal networks that shape how change actually travels through the organization.
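The pivot from workstream view to role view is, at bottom, a regrouping of the same records. A minimal sketch of that regrouping, using hypothetical roles, workstreams, and quarters for illustration (none of these records come from a real program):

```python
from collections import defaultdict

# Hypothetical impact-assessment records: each workstream logs its own changes.
changes = [
    {"workstream": "Technology", "role": "District Sales Manager", "quarter": "Q3",
     "change": "New CRM platform"},
    {"workstream": "Operations", "role": "District Sales Manager", "quarter": "Q3",
     "change": "New territory alignment model"},
    {"workstream": "Finance", "role": "District Sales Manager", "quarter": "Q3",
     "change": "Restructured incentive compensation plan"},
    {"workstream": "Technology", "role": "MSL", "quarter": "Q4",
     "change": "New engagement platform"},
]

# Pivot from workstream view to role view: group changes by (role, quarter).
by_role = defaultdict(list)
for c in changes:
    by_role[(c["role"], c["quarter"])].append(c)

# Flag roles absorbing changes from several workstreams in the same quarter.
for (role, quarter), items in by_role.items():
    workstreams = {c["workstream"] for c in items}
    if len(workstreams) >= 3:
        print(f"{role} ({quarter}): {len(items)} changes "
              f"across {len(workstreams)} workstreams")
```

Each workstream's list looks short on its own; only the grouped view shows the district sales manager absorbing three workstreams' changes in one quarter.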
Specific Resistance Points Produce Actionable Responses
The resistance analysis asks the consulting team to name, specifically, who will resist the changes and why: identify the role, the specific change they’ll resist, and the reason for resistance. The facilitator’s job is to push past generalities; “people resist change” is not useful. The question is which people, resisting which changes, for which reasons.
When the analysis gets specific, it produces actionable intelligence:
- The field sales leaders may resist the new territory model because it disrupts established physician relationships and they weren’t consulted on the alignment methodology.
- The MSL team may resist the new engagement platform because it adds a documentation layer that Medical Affairs views as compliance but the field views as administrative burden.
- The market access managers may resist the new payer engagement cadence because the contracting cycle doesn’t align with the go-live timeline.
Each resistance point has a different root cause and requires a different response. The field sales leaders need to see how the new territories were designed and have input into transition plans for their key accounts; the MSL team needs the documentation requirements scoped to what’s actually necessary for compliance; the market access managers need the go-live timeline aligned to the contracting cycle or a bridge plan for the gap. Framing resistance as data (i.e., information about where the plan has gaps) produces honest analysis; framing it as a problem to overcome produces defensive, surface-level responses.
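The facilitator's test, pushing past "people resist change", can be made mechanical: an entry in the resistance register is actionable only if it names a role, a specific change, and a non-generic reason. A minimal sketch (the field names and the list of generic phrases are illustrative assumptions):

```python
# Phrases that signal the analysis has not yet reached a root cause (assumed list).
GENERIC_REASONS = {"people resist change", "fear of change", "change fatigue"}

def is_actionable(point: dict) -> bool:
    """An entry is actionable only if it names a specific role, a specific
    change, and a reason that goes beyond generalities."""
    if any(not point.get(key, "").strip() for key in ("role", "change", "reason")):
        return False
    return point["reason"].strip().lower() not in GENERIC_REASONS

vague = {"role": "Everyone", "change": "", "reason": "people resist change"}
specific = {
    "role": "Field sales leaders",
    "change": "New territory model",
    "reason": "Disrupts established physician relationships; "
              "not consulted on the alignment methodology",
}

assert not is_actionable(vague)
assert is_actionable(specific)
```

The check is crude, but it captures the point: only entries specific enough to pass it can be mapped to a distinct response.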
Communication and Training Work When They’re Timed to the Audience
The communication calendar maps what gets communicated, to whom, through which channel, and when. Different audiences need different messages at different times: the executive steering committee needs strategic rationale before launch, middle managers need operational detail two weeks before changes hit, and frontline teams need personal impact and support information one week before go-live. The calendar breaks communication into digestible pieces timed to when each audience needs the information.
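Because each message is anchored to go-live, the calendar can be derived by working backward from that date. A small sketch, assuming a hypothetical go-live date and a six-week executive lead time (the source specifies only "before launch" for executives; the two-week and one-week lead times follow the cadence above):

```python
from datetime import date, timedelta

go_live = date(2025, 9, 1)  # illustrative go-live date

# Weeks before go-live that each audience needs its message.
lead_times = {
    "Executive steering committee": 6,  # strategic rationale well before launch (assumed)
    "Middle managers": 2,               # operational detail two weeks out
    "Frontline teams": 1,               # personal impact and support one week out
}

# Work backward from go-live to a send date per audience.
calendar = {audience: go_live - timedelta(weeks=weeks)
            for audience, weeks in lead_times.items()}

for audience, send_date in sorted(calendar.items(), key=lambda kv: kv[1]):
    print(f"{send_date}: {audience}")
```

Shifting go-live automatically shifts every send date, which is exactly how a real calendar should behave when the roadmap from Step 5 moves.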
The training plan is built from the impact assessment, not from the workstream deliverables. Workstream teams tend to design training around their technology, but the people receiving the training need to learn how their job changes. A training module on “how your territory planning workflow changes and why” is harder to build than a module on the new CRM interface, and far more effective. The plan maps training delivery to the rollout schedule so that each deployment wave receives training close enough to go-live for the content to be retained and early enough for people to feel prepared. Teams that ignore what early waves teach them tend to relearn those lessons at scale, consuming time and credibility that a well-timed training plan would have preserved.