Data Cut-offs

AI models are trained on data up to a specific date and do not signal when their information is outdated.
What it means

AI models are trained on data up to a specific date. They have no knowledge of events, regulatory changes, or developments after that point and, crucially, do not signal when their information is outdated.

Why it matters

AI tools used for regulatory guidance, market intelligence, legal research, or stakeholder assessment may be operating on significantly outdated information, presented with the same confidence as current fact.

Board governance implications

The board must know the training data cut-off of any AI tool used in decision-relevant contexts. This is a due diligence requirement, not a technical detail.

Governance failure timeline

Pre-deployment

Failure to establish the training data cut-off date of any AI tool before approving its use in decision-relevant contexts.

Absence of a due diligence requirement covering knowledge currency as part of procurement governance.
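A knowledge-currency check of the kind described above can be sketched in a few lines. The registry, tool names, and dates here are illustrative assumptions, not a reference to any real product; the point is that the cut-off date is recorded at procurement and compared against the effective date of whatever the tool is being asked about:

```python
from datetime import date

# Hypothetical registry populated during procurement due diligence:
# each approved AI tool mapped to its vendor-stated training data cut-off.
TOOL_CUTOFFS = {
    "contract-review-assistant": date(2023, 4, 30),
    "market-intel-summariser": date(2024, 6, 1),
}

def knowledge_currency_check(tool: str, topic_effective_date: date) -> str:
    """Flag use of a tool on a topic (e.g. a regulation's effective date)
    that postdates the tool's recorded training data cut-off."""
    cutoff = TOOL_CUTOFFS.get(tool)
    if cutoff is None:
        # No recorded cut-off means due diligence is incomplete.
        return "BLOCK: no recorded training cut-off for this tool"
    if topic_effective_date > cutoff:
        return (f"WARN: topic postdates cut-off ({cutoff.isoformat()}); "
                "verify against current sources")
    return "OK: topic falls within the tool's recorded knowledge window"

# A regulation that took effect after the tool's cut-off triggers a warning.
print(knowledge_currency_check("contract-review-assistant", date(2024, 1, 17)))
```

In practice the registry would live in the organisation's tool inventory, and the check would run wherever AI output feeds a regulatory, legal, or market decision.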

Deployment

Decisions are made on outdated regulatory, legal, or market information, presented with the same confidence as current fact.

Advice reaches boards or clients that predates material change, with nothing in the output to indicate it.

Post-deployment

The exposure surfaces when compliance decisions are reviewed in hindsight and found to have relied on superseded guidance, or when audit findings identify AI-informed advice that predates a regulatory change.

Where the knowledge cut-off was a contributing factor and was not disclosed, enforcement action follows.
