As part of our Compliance & Conversation event held on February 5, 2025, we were joined by a panel of surveillance and compliance experts to explore key takeaways from the Global Relay Data Insights: Communication Capture Trends in 2024/25 report. Analyzing the data of over 12,000 regulated financial institutions, the report shed light on the biggest challenges facing compliance and surveillance teams, including artificial intelligence (AI), social media, and conduct risks.
We’ve summarized some of the expert insights raised by the panel below.
Is the FCA in trouble?
The Financial Conduct Authority (FCA) has spent the last year receiving considerable criticism from across the financial sector for a range of reasons, including its controversial ‘name and shame’ proposals and pressure to support the U.K. government’s growth agenda.
Anna Gooch, who spent 13 years working on market conduct at the FCA, was unsurprised by the negative press. “As an ex-regulator, you’re never expecting good headlines. You’re not doing it for the praise.” She noted that criticism of the regulator is “not just the last few years” but “the last 13 or so,” adding that the industry response “goes with the territory of being a regulator” and that “hopefully the FCA is learning lessons.”
Asking financial firms what they think of the regulator and expecting a balanced response is “like asking ten thousand Spurs fans what they think of Arsenal,” according to Gooch.
Gooch observed that the FCA is now “being more direct” and showing “a lot more transparency.” While the regulator is “usually quite silent on matters,” she believes its more candid engagement with the industry is beneficial, as there’s “nothing like hearing from the regulator directly.”
Is the FCA’s AI report result a surprise?
The FCA published a report in November 2024, outlining that 75% of surveyed firms were already using AI, with a further 10% intending to employ it within the next three years. Concerningly, only 34% of respondents were confident they had a complete understanding of how AI works.
Ugne Willard said the figure may “sound like a high number” but it “doesn’t surprise me,” with AI use “significantly expanding as we speak.” She believes there is currently a “good relationship between regulators and senior surveillance SMEs at banks and firms,” but that pressure from regulators around ‘explainability’ and requiring firms to communicate “how the black box is working” might prevent future innovation. She believes that the FCA is “perhaps a bit too cautious” with its AI sandbox, though it has made “admirable progress.”
Hannah Bowery sees that it is not just regulators prioritizing ‘explainability,’ with many firms requesting “vendor information up front when choosing to implement tools,” and that she is seeing “more clients wanting to independently test how models are working, not just compliance or surveillance teams.” This includes firms testing lexicons vs. Large Language Models (LLMs), “especially where firms are looking to replace systems.”
Anna Gooch said of the FCA’s 75% figure that “it is quite high” and she wonders “if people are being entirely truthful.” She sees the FCA’s AI lab as “a good initiative” and that the regulator is “trying to build a framework around something that is unknown.”
Will Generative AI be used to model risk?
When asked about the possibility that Generative AI platforms like ChatGPT might be used as part of risk modeling, Willard said that there is “a lot of excitement around that,” and that questions were being asked including “how will it change risk detection?” and “do we need people overseeing it?” She believes it will “shape skillsets and team setups,” but she is not sure currently if model risk “has answers today with how to deal with Gen-AI adoption.”
Bowery believes there are more questions to ask, including ascertaining “how you make sure something implemented today or a year ago is still working the way it should in a year’s time,” and whether it is ever “okay to check this using a machine.”
Will 2025 be a year of conduct and culture focus?
In 2024, the FCA published the results of its first survey into non-financial misconduct (NFM) in The City. The regulator has increasingly focused on how firms are ensuring good conduct and culture, and setting the correct ‘tone from the top’ on this. With the regulator recently publishing a ‘Dear CEO’ letter to brokers that included substantial focus on conduct and culture, will 2025 see a particular focus on this?
Anna Gooch believes it is “not a new thing that conduct and culture goes hand in hand,” and that it is “rare to see a market abuse case that didn’t have other types of misconduct,” estimating that “about 95%” of financial misconduct cases will include NFM elements. She said firms “have a policy for a reason, it’s there to be adhered to,” and that if regulators “were looking at someone who was a regulated individual you’d inevitably find an HR document as long as your arm with red flags.”
She believes it is clear that “poor conduct does lead to poor culture,” and that firms are “missing a trick” if they don’t use their internal surveillance and compliance data “to inform the culture within your firm.”
Are communication surveillance tools used to identify NFM?
Hannah Bowery summarized that “communications surveillance is primarily put in place for market abuse and market misconduct,” and that firms need to ask themselves how they can “actively try and use this information” as part of broader efforts to tackle NFM. She believes it is vital we see firms “using the data they have,” but that “more needs to be done to use this to proactively drive programs” to identify and escalate NFM instances. Ugne Willard agreed, saying that it requires a “shift in thinking” to address.
Is social media a compliance risk?
Our Data Insights report found that 33% of surveyed firms were capturing social media communication, seeing it as an area of potential risk. Interestingly, Hannah Bowery identified that although firms are treating it as a risk, it is not being given the same priority as risks like off-channel communications.
Bowery highlighted that “social media is being discussed, but not right at the top of the list. Surveillance gaps in more traditional communications methods are the priority,” saying we had seen a “big swing in mobile messaging capture in recent years.” However, she sees it as a potentially evolving risk area, positing that “who’s to say social media won’t be the biggest challenge for the next few years?”
All three panelists strongly agreed that capturing communications, regardless of channel or platform, is an essential part of surveillance and compliance workflows. Willard summarized that “if it’s allowed for business use, it has to be recorded, and if it’s recorded, it has to be put through surveillance.” Gooch agreed, affirming that “if you’re going to allow it, you should be doing surveillance on it.”
To stay informed of all our upcoming events and panel discussions, sign up to our newsletter here.