As organisations increasingly embed generative AI into customer- and student-facing platforms, a critical challenge has emerged: how to govern AI model responses in a way that is provable, explainable and auditable. Traditional moderation approaches, such as keyword filters and probabilistic classifiers, struggle to reliably detect nuanced policy breaches, particularly where intent and context matter more than explicit wording.
For First Education & Technology Group (FETG), operator of the MarsLadder AI learning platform, this challenge was amplified by the need to comply with the Safer Technologies 4 Schools (ST4S) framework. The platform must ensure that AI responses to students consistently protect privacy, prevent unsafe interactions, and adhere to regulatory expectations while still delivering high-quality educational support.
PwC worked with FETG to explore how Amazon Bedrock Automated Reasoning (AR) could provide a stronger governance mechanism for AI responses. Unlike traditional filters, automated reasoning evaluates responses against formal logic rules, enabling deterministic validation rather than probabilistic judgement.
The PoC focused on integrating AR into MarsLadder’s existing AI Adapter architecture. Key ST4S principles were distilled into ten clear “if–then” rules covering personal data protection, student safety and privacy of others. These rules were programmatically generated from policy documentation and then refined through human review to ensure clarity and relevance to student interactions.
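To illustrate the shape of such a distilled rule, the sketch below represents one "if–then" policy rule as a small data structure. The rule ID, description and predicate are illustrative assumptions only, not the actual ST4S rule set, and the keyword-style condition is a stand-in for the formal-logic predicate that Automated Reasoning would evaluate:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass(frozen=True)
class Rule:
    """A deterministic if-then policy rule: if condition(text) holds, the verdict applies."""
    rule_id: str
    description: str
    condition: Callable[[str], bool]  # the "if" part, evaluated over the candidate response
    verdict: str                      # the "then" part, e.g. "BLOCK"

# Hypothetical example of a personal-data-protection rule.
# In the PoC the real conditions are formal-logic predicates, not substring checks.
PERSONAL_DATA_RULE = Rule(
    rule_id="ST4S-PD-01",
    description="If the response requests a student's personal contact details, then block it.",
    condition=lambda text: any(
        phrase in text.lower()
        for phrase in ("your home address", "your phone number", "your email address")
    ),
    verdict="BLOCK",
)

def evaluate(rule: Rule, response: str) -> Optional[str]:
    """Return the rule's verdict if it fires, otherwise None."""
    return rule.verdict if rule.condition(response) else None
```

Expressing each policy clause as an explicit rule object is what makes the outcome deterministic and traceable: the same response always triggers the same rule, and the rule ID identifies which clause fired.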
Key activities completed during the engagement included:
This architecture ensures that every AI request and response is evaluated before being shown to a student, with clear traceability of which rule was applied and why.
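The evaluation flow described above can be sketched as a small guard function that checks both the student's request and the model's response before anything is displayed, and returns a verdict record for the audit trail. The rule IDs, reasons and predicates here are hypothetical placeholders, not the production rule set:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class Verdict:
    allowed: bool
    rule_id: Optional[str]  # which rule fired (None if all rules passed)
    reason: Optional[str]   # why it fired, recorded for auditability

# Hypothetical rule table: (rule_id, reason, predicate over the text).
RULES = [
    ("ST4S-PD-01", "Response requests personal contact details",
     lambda t: "your home address" in t.lower()),
    ("ST4S-SAFE-02", "Response arranges an offline meeting",
     lambda t: "meet in person" in t.lower()),
]

def check(text: str) -> Verdict:
    """Deterministically evaluate text against every rule; first match wins."""
    for rule_id, reason, predicate in RULES:
        if predicate(text):
            return Verdict(allowed=False, rule_id=rule_id, reason=reason)
    return Verdict(allowed=True, rule_id=None, reason=None)

def guarded_exchange(request: str, model_response: str) -> Tuple[str, Verdict]:
    """Check both the request and the response before anything reaches the student."""
    for text in (request, model_response):
        verdict = check(text)
        if not verdict.allowed:
            # Withhold the content but keep the verdict so the breach is traceable.
            return ("This message was withheld by the safety policy.", verdict)
    return (model_response, Verdict(allowed=True, rule_id=None, reason=None))
```

Because the verdict carries the rule ID and reason alongside the allow/block decision, every blocked interaction can be traced back to the specific policy clause that caused it.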
The PoC demonstrated that automated reasoning can significantly reduce the effort required to operationalise complex policy frameworks, with early estimates indicating up to 80% reduction in initial rule-setup effort and 50% reduction in ongoing compliance overhead. More importantly, it provided mathematically provable evidence of compliance, an essential capability for high-risk, regulated environments such as education.
From a performance perspective, our initial testing showed that automated reasoning introduced an average of 8–13 seconds of latency, accounting for roughly 67% of total response time. After engaging AWS specialists from AWS's Seattle headquarters to further analyse and optimise latency, we reduced it to as low as 1.5 seconds, depending on inference geolocation. This collaboration between FETG, PwC Australia and AWS has shown what the power of three can achieve: responsive, scalable AI solutions, without compromise, in a regulated industry.
© 2017 - 2026 PwC. All rights reserved. PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity. Please see www.pwc.com/structure for further details. Liability limited by a scheme approved under Professional Standards Legislation.