The potential economic consequences of new artificial intelligence models such as Anthropic PBC’s Mythos may require a “whole of government” response rather than oversight by a collection of agencies such as the Ontario Securities Commission, the head of Canada’s largest securities regulator said Wednesday.

“I do in the back of my mind — and I think many people would echo this — wonder if the technology is so transformative that we need a different approach,” Grant Vingoe, chief executive of the OSC, said at a Bloomberg conference on Wednesday.

“There’s a question about AI being so transformative in so many dimensions that maybe it does need a bespoke type of regulatory environment.”

Given the potential of Anthropic’s Mythos technology to amp up the speed and precision of cyberattacks by quickly identifying and exploiting vulnerabilities, the San Francisco-based company has restricted its deployment. A handful of large banks and technology companies have been given strictly controlled access to test it before the full launch of the new AI, which has been flagged as a potential risk to financial stability.

Beyond the cybersecurity risks, fast-developing AI is poised to have a significant impact on a range of capital markets activities, Vingoe said, from pricing investments and synthesizing information to conducting asset management.

Until now, the OSC’s approach to new technology has been to apply traditional principles of securities regulation and set rules for capital markets activities in a “technology neutral” manner. When it comes to AI, regulators are looking to ensure accountability through documentation and system integrity to protect against cyber risk.

“With other technologies, we tended to say that we’re technology neutral, and that really is our stance,” Vingoe said. “For new product offerings, service offerings, or technologies (with the) same risks (and) same activities, (applying the) same rules or regulation is a good general mantra.”

But he said he now sees fear — and understands it — when he speaks to groups of investment professionals and asset managers who are contemplating what the effect of AI might be on the entire investment management process and the skilled professionals who carry it out.

“The fear is out there, but we as a society and a regulatory and business community… we have to accelerate our thinking about what’s next,” he said.

“If we did take a different approach, it’s likely not by securities regulators or any collection of agencies alone, but would require a whole of government approach when you consider the potential economic consequences, where whole areas of work might disappear.”

Canada is grappling with who will manage the risks posed by the rollout of Mythos, holding closed-door meetings of government, regulatory and industry organizations such as the Canadian Financial Sector Resiliency Group, which is chaired by the Bank of Canada. The OSC’s Quebec counterpart, the Autorité des marchés financiers (AMF), is also a member.

On Friday, Bank of Canada governor Tiff Macklem said the group, whose members also include the Office of the Superintendent of Financial Institutions (OSFI), the Finance Department, the Canadian Centre for Cyber Security and technology experts from the big banks, had already met twice this month to discuss Mythos.


Sources in Canada’s financial sector have said government officials are keen for the Bank of Canada, which oversees the payments system, and OSFI, which oversees the banks, to take the lead on a response.

Macklem said he has also been in touch with United States Federal Reserve chairman Jerome Powell to discuss the risks posed by Anthropic’s latest artificial intelligence model, discussions that were among a series of high-level communications between Canadian and U.S. officials on the topic.


“The Minister of Finance has been talking to the Secretary of the Treasury in the U.S. about the U.S. approach,” Macklem said Friday during a news conference from Washington, where he was attending meetings of the International Monetary Fund.

• Email: bshecter@nationalpost.com