Cryptopolitan
2025-07-21 12:15:44

New UK audit standard will go live on July 31 to regulate AI

The UK’s national standards body, the British Standards Institution (BSI), is reportedly set to unveil a new standard for companies that independently audit artificial intelligence tools, in a bid to protect innovation. The new standard, tagged the UK AI audit standard, will focus on addressing unregulated and potentially risky practices in the emerging AI sector by introducing a structured framework for auditing AI systems.

The standard comes into effect on July 31

According to the BSI, there are now “hundreds” of “unchecked” groups offering audits that claim to assess whether companies utilizing AI models, in both traditional and pioneering use cases, do so reliably, fairly and safely. Many of the groups that sell AI audits also develop their own AI technologies, “raising concerns about independence and rigor”, the BSI said.

According to the institute, the standard will launch on July 31 and is the first international set of requirements to standardize how assurance firms verify whether companies are complying with AI management standards.

The news comes weeks after the UK’s Financial Reporting Council (FRC) published its guidance on AI in audit on June 26, 2025. The guidance set out a coherent approach to implementing AI tools in audits and emphasized documentation requirements to support innovation in the audit profession.

The UK’s new standard is crucial

While the world has embraced AI, its vulnerability to mistakes such as hallucinations, along with the other dangers the technology poses, has raised the stakes for reliable AI assurance services. Companies also need to worry about complying with international regulations, such as the EU AI Act, making assurance services crucial.

Boutique companies have come out of the woodwork to take advantage of the increase in demand, pitting themselves against larger contenders, including the Big Four accountancy firms. Although it is still a fledgling sector, the AI assurance market already generates a gross value of more than £1 billion for the UK.

So it makes sense that regulators have sounded the alarm on the lack of standardization in the sector. As things stand, a company can offer assurance services that amount to little more than light-touch advice, or that are limited to checking whether an AI system complies with one particular piece of legislation.

Mark Thirlwell, global digital director at the BSI, said: “Businesses need to be sure that when their AI management system is being assessed, it is being done in a robust, coherent and consistent manner.”

It is the BSI’s hope that the standard will aid regulators, customers and investors by helping them distinguish AI that has been assured by a certified assurance provider from AI that has not, thereby “supporting responsible AI innovation.”

The standard is being branded as a “pivotal step forward for the AI assurance ecosystem” as it clarifies which companies are qualified to certify AI systems against ISO standards.

Inioluwa Deborah Raji, a researcher at UC Berkeley who specializes in AI audits and evaluations, has highlighted that “many assurance firms used proprietary systems to audit AI”, and while companies “will pay to evaluate against [proprietary] standards”, she warned that there is currently no way of externally vetting the quality of these standards.
