The whole team was trained upfront. The first engagements were a disaster.
…because an unused skill fades.
An accounting and tax firm of about twenty people decides to replace its shared drives and physical binders with an integrated document management system, connected to its accounting and tax software. The new platform marks a clear break from the existing setup: a new filing structure, automatic indexing of incoming documents, approval workflows for invoices, e-signature for tax filings, and rule-driven legal archiving. The partner in charge of the project plans ahead. He sends the entire team — partners, engagement managers, accountants, administrative assistants — to training at the solution integrator. Three days per person, spread across two weeks to avoid paralyzing the year-end closings already underway.
Training goes well. Feedback is positive. Everyone reports being operational.
The rollout slips. Migrating the historical archive proves more demanding than expected. Integrating with the accounting software requires technical adjustments. Tax season opens and rules out any tool change during closings. The actual go-live happens six months after the training ends.
The first weeks are chaotic. Documents are mis-indexed at scan time; some become unfindable. Approval workflows are bypassed because no one remembers the exact sequence. Several team members keep filing documents in the old shared drives “just in case.” Two clients receive their tax filings late. A third puts a complaint on record about the quality of follow-up.
What this go-live cost
The loss does not show up in the “training” line of the budget. It surfaces at several levels that are almost never consolidated.
The productivity lost during the two weeks of training already represents tens of thousands of francs on the relevant payroll. The errors and rework at launch add hours of re-keying, time spent searching for missing pieces, and calls to clients to request documents that were lost in the system. The least visible effect is also the most lasting: internal trust. Part of the team concludes that the system is poorly designed. Another part concludes that they don’t know their job. A third keeps using the old shared drives in parallel, which produces two competing repositories and undermines the project for months.
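The payroll figure can be made concrete with a back-of-envelope calculation. The headcount and the three training days come from the case; the loaded day rate is an illustrative assumption, not a figure from the firm:

```python
# Back-of-envelope cost of the training time alone.
# Headcount and days are from the case; the day rate is assumed.
headcount = 20
training_days_per_person = 3
loaded_day_rate_chf = 800  # assumed average loaded cost per person-day

direct_training_cost_chf = headcount * training_days_per_person * loaded_day_rate_chf
print(direct_training_cost_chf)  # 48000
```

And that sum covers only the time in the room: the re-keying, the searches for missing documents, and the parallel filing systems come on top of it, in lines of the budget that are never added together.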
The partner had reasoned like a sound manager: anticipate, avoid being caught off guard, have the team ready on the day of the cutover. The reasoning holds on paper. It overlooks an elementary property of human memory, and it underestimates the probability that the cutover will be delayed.
Why training too early doesn’t train
The forgetting curve described by Ebbinghaus in the late 19th century is a stable, well-known result. Without reactivation, roughly 70% of new content is lost within 24 hours, and most of the rest within a few weeks. What survives long-term depends on two conditions: rapid use in real situations, and spaced repetition. Neither was present in the case above.
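The shape of the curve can be sketched with a single-exponential model. This is a common textbook simplification, not Ebbinghaus's original savings measure, and the strength parameter below is simply calibrated to the 70%-in-24-hours figure above:

```python
import math

# Single-exponential sketch of the forgetting curve (a common simplification,
# not Ebbinghaus's exact data). Retention after `hours` with memory `strength`:
def retention(hours, strength):
    return math.exp(-hours / strength)

# Calibrate strength so that ~70% is lost within 24 hours, as stated above:
strength = -24 / math.log(0.30)  # roughly 20 hours

print(round(retention(24, strength), 2))   # 0.3 — 70% gone after a day
print(retention(24 * 7, strength))         # well under 1% after a week
```

The pure exponential overstates long-run loss (empirical curves flatten), but the qualitative point stands: the decay is steepest at the start, and only rapid use or spaced reactivation changes the trajectory. Six months of delay sits far out on the flat, empty end of that curve.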
Brinkerhoff, who spent thirty years studying how training transfers to the workplace, reaches an even harsher conclusion. In most organizations, only 15 to 20% of training content is ever applied on the job. The gap rarely comes from the content or the pedagogy itself; it comes from what happens before and after. Before: no preparation for immediate use. After: no real case in the window where the knowledge is still fresh, no mentoring, no reactivation.
The firm did not train its team. It funded an exposure to content that was never anchored. The distinction is not semantic; it is budgetary.
What the annual plan doesn’t see
The traditional annual training plan operates in campaigns. Needs are identified at the start of the year, a budget is negotiated, providers are commissioned, and boxes are ticked in a tracking sheet. The format is legible for management and for HR; it is largely disconnected from the cadence at which technical, regulatory, or software changes actually reach the firm.
In a service SME that has to absorb several disruptions per year — across tools, methods, standards, client expectations — the annual plan produces two symmetrical effects. It trains too early on what will not be used for a long time, as in the case above. And it doesn’t train at all on topics that emerged after the budget was set, which have to wait for next year’s plan. The skill ends up either premature or late; rarely synchronous with the actual need.
This desynchronization hurts services even more than industry. A service is, by construction, what the team knows how to do. When skill is out of sync with the tool, what slips is the quality delivered to the client.
What could have been organized differently
A different logic indexes training to the first operational use, not to the planned cutover date. In the document management case, this could have looked like the following.
A short, targeted initial session — half a day to a day — delivered to a small core: one partner, one engagement manager, one accountant, one administrative assistant. Not the whole team; the people who would carry the first engagement run on the new platform. A volunteer client engagement then becomes the main learning material. The solution integrator supports this engagement — on-site or remotely — and answers questions when they actually arise. Skill is built on a real case, with a real client, real source documents, real closings.
The rest of the team is trained later, through internal mentoring by those who just did it. Knowledge moves from practitioner to practitioner, in a format that takes hold because it is immediately usable. This is situated learning, described by Lave and Wenger, and a recurring theme in the literature on the learning organization, notably in Senge's work: collective skill is built by shared action, not by simultaneous exposure to theoretical content.
This logic does not eliminate external training. It moves it. It accepts not being ready on cutover day in order to be genuinely competent three months later.
The application-delay test
One short question reveals where an organization stands. What is the average delay, in your firm, between the end of a training session and a participant’s first operational use of its content? If the answer is measured in weeks, training has a chance to produce skill. If it is measured in months, it produces mostly certificates. If the question has never been asked, the firm is funding a line item whose return on investment it does not measure.
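For a firm that wants to actually measure this, the delay is easy to compute from a simple training log. The dates below are invented for illustration; only the shape of the calculation matters:

```python
from datetime import date
from statistics import median

# Hypothetical log: (end of training, first operational use) per participant.
# All dates are invented for illustration.
log = [
    (date(2025, 1, 17), date(2025, 7, 14)),
    (date(2025, 1, 17), date(2025, 7, 21)),
    (date(2025, 1, 24), date(2025, 8, 4)),
]

delays_in_days = [(first_use - end).days for end, first_use in log]
print(median(delays_in_days))  # measured in months here, not weeks
```

The point is not the tooling; a column in a spreadsheet does the same job. The point is that the number exists, and that almost no one tracks it.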
The question to ask this week
Identify the last significant training session organized in your firm. Ask one of the people who attended what they have actually applied in their work over the past three months. The honest answer is almost always lower than expected. Then ask yourself what should have been organized between the training and today for that answer to be different.
Lionel Jaquet — CVO @tebicom · Responsive Management · DBA Candidate @GEM


