How fund operators can act on ESG data

Aria Goudarzi, Experienced Data Leader and Advisor, discusses actioning ESG data through the operational and regulatory pipeline.

Andrew Putwain, posted on 25 September 2023

Andrew Putwain: What are your biggest pain points around data? To what extent is the lack of data, an abundance of data, or an insufficiency of data a roadblock for you?

Aria Goudarzi: Initially, it was a lack of data. Now, it's a case of efficient storage and retrieval of the data, as well as the correct entity mapping to the securities held within portfolios and benchmarks.  

There are two key parts here. The first is storage and retrieval – which means you need to have the correct data governance structures in place to do so efficiently. You also need to consider the meta-tagging and data lineage components to ensure you can also incorporate data quality aspects further down the line.
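As an illustration of the meta-tagging and data-lineage point, here is a minimal sketch (in Python) of attaching lineage and quality metadata to a dataset at ingestion time; the field names, checks, and vendor label are hypothetical, not a description of any particular firm's pipeline.

```python
# A minimal sketch of tagging an ingested ESG dataset with lineage metadata
# so that quality issues can be traced further down the line.
# Field names and checks are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DatasetLineage:
    source_vendor: str                        # where the data came from
    received_at: str                          # when it entered the pipeline
    transformations: list[str] = field(default_factory=list)
    quality_checks: dict[str, bool] = field(default_factory=dict)

def ingest(source_vendor: str) -> DatasetLineage:
    tag = DatasetLineage(
        source_vendor=source_vendor,
        received_at=datetime.now(timezone.utc).isoformat(),
    )
    # Each processing step records itself, so a downstream quality issue
    # can be traced back to the step that introduced it.
    tag.transformations.append("parsed_raw_vendor_file")
    tag.quality_checks["row_count_nonzero"] = True
    return tag

print(ingest("vendor_x"))
```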

The second consideration is making sure the information maps correctly to what you need. The challenge is to correctly link each security to the relevant issuer for which we have the data. Because data is typically provided at the issuer level, this is and will always be a massive challenge for every firm, as an issuer can have multiple security issuances that may not always be easily identifiable. This is where I see a lot of firms continuing to spend quite a bit of resource to ensure it is done correctly. Otherwise, you will always have data quality issues.
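To illustrate the issuer-to-security mapping challenge Aria describes, the sketch below joins issuer-level ESG scores onto security-level holdings and flags anything that fails to map; the identifiers, column names, and values are hypothetical.

```python
# A minimal sketch of mapping issuer-level ESG data to security-level holdings.
# Identifiers, column names, and sample values are hypothetical.
import pandas as pd

# ESG data is typically delivered at the issuer (parent entity) level.
issuer_esg = pd.DataFrame({
    "issuer_lei": ["LEI-AAA", "LEI-BBB"],
    "esg_score": [72.5, 41.0],
})

# Portfolio and benchmark holdings sit at the security level; each security
# must be linked back to its issuing entity before scores can be applied.
holdings = pd.DataFrame({
    "isin": ["ISIN-001", "ISIN-002", "ISIN-003"],
    "issuer_lei": ["LEI-AAA", "LEI-BBB", None],  # third security not yet mapped
    "weight": [0.40, 0.35, 0.25],
})

mapped = holdings.merge(issuer_esg, on="issuer_lei", how="left")

# Surface unmapped securities as a data-quality exception rather than
# silently producing an incomplete portfolio-level view.
unmapped = mapped[mapped["esg_score"].isna()]
if not unmapped.empty:
    print("Securities needing entity-mapping review:")
    print(unmapped[["isin", "weight"]])
```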

Andrew: Are data silos that much of a persistent and serious issue? And if so, how can they best be mitigated?

Aria: Data consistency is critical.

We don’t want to be in a position where, for example, a global firm has multiple offices and multiple investment teams, and the Singapore investment teams report one view on the ESG ranking or manual score whilst the London team gives a different view.

"In terms of outputs and mitigation strategies, I’m an advocate of 

centralising the use of the data function."

Data silos can lead to data inconsistencies and conflicting views, which can result in reputational concerns being flagged. They can also reveal that you don’t have a united front on the ESG angle, which brings its own set of issues.

In terms of outputs and mitigation strategies, I’m an advocate of centralising the use of the data function. If it goes through one function, then one function is responsible for doing the screening, tagging, and pushing this information out to the rest of the business.

It then becomes the central team’s job to ensure that whoever needs access to ESG data has a single team they can go to – or at least a single database or repository they can retrieve from. This, in turn, removes the need for individual teams – or, in this case, data silos – to manage their own strategy, tech, and approach to solving a wider business need, which could otherwise become disjointed.
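As a sketch of what "one function responsible for screening, tagging, and pushing the information out" could look like in practice, here is a hypothetical central store that every team reads from; the class, method, and field names are illustrative assumptions rather than a real system.

```python
# A minimal sketch of a single, central retrieval point for ESG data.
# Class, method, and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class EsgRecord:
    issuer_id: str
    score: float
    source: str        # which vendor or model produced the score
    as_of: str         # lineage: the date the record was loaded
    screened: bool     # whether the central team's screening has been applied

class CentralEsgStore:
    """One team owns screening, tagging, and distribution; every other team
    retrieves through this interface instead of keeping its own copy."""

    def __init__(self) -> None:
        self._records: dict[str, EsgRecord] = {}

    def publish(self, record: EsgRecord) -> None:
        # Only the central data function writes here, after screening/tagging.
        self._records[record.issuer_id] = record

    def get(self, issuer_id: str) -> EsgRecord | None:
        # London and Singapore teams both read the same record,
        # so they cannot drift into inconsistent views.
        return self._records.get(issuer_id)
```

The design choice is simply that writes are restricted to the central function, while reads are open to anyone in the business who needs the data.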

Andrew: You mentioned reputational risks, safeguarding, and retaining client trust. How do data silos factor into the equation there – in terms of mitigating those risks?

Aria: Data silos create those risks. When you have people working in silos while extracting and dealing with data, those risks emerge because you could lose sight of the overall business view or objectives.

Having consistency and an open approach in terms of who has access to ESG data for investment and research needs can mitigate some of those risks. You want your data teams working closely with end users to better understand how the information is being used. This insight will help ensure there are no incorrect assumptions about the purpose of the data and will also help data teams fine-tune their models or vendors to meet evolving end-user requirements. Data silos make this process significantly more challenging.

Andrew: How do you achieve confluence within a reporting team and operational pipeline – for example, getting buy-in from top to bottom, or getting everyone to communicate and perform their tasks correctly?

Aria: Many of the changes in our industry stem from one of two components: clients and regulations.

Typically, if you tell your senior management team it’s a regulatory-driven project, you can pave the way for quicker progression. Knowing it's a significant client query also helps.

"You want to make sure you can deliver what you said you would, by having the

appropriate operational structure and business buy-in."

The rest of the challenge is around linking the issue to business viability. By this, I mean not only connecting it to peer competitiveness but also aligning it with the firm’s global branding. For example, if we said we’d do X publicly, we need to ensure we’re providing data and proof to meet that requirement.

One often overlooked way of achieving these goals is via efficient operational integration. You want to make sure you can deliver what you said you would, by having the appropriate operational structure and business buy-in. There is a risk that you could commit to something you haven’t yet built, but at least having open conversations internally as soon as possible is an essential building block.

Andrew: What is the most significant improvement that you have seen in recent years when it comes to reporting efficiency and segment alignment around ESG? How can the typical conversations around that – and in ESG in general – be advanced?

Aria: We used to complain that there wasn’t enough data, or enough options for data sources, but we’ve now moved beyond that conversation. Thankfully, a lot of new players have entered the market.

Of course, with that, you also have new challenges. For example, the abundance of data can cause some confusion amongst users, as they have all this information and don’t know what to do with it or how to fully utilise it.

Another issue is inconsistency in how the information is reported – little details such as figures being reported in different units, like “metric tonnes”, “cubic metric tonnes”, “MtCE”, or “MTCO2E”. All of these terms have small variations which can cause problems if they are not properly standardised and reported.
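The unit problem Aria mentions becomes tractable once a canonical unit is chosen; the sketch below normalises differently labelled emissions figures to tonnes of CO2 equivalent. The alias table and conversion factors are illustrative assumptions, not an authoritative mapping.

```python
# A minimal sketch of normalising inconsistently labelled emissions units.
# The alias table and conversion factors are illustrative assumptions.
UNIT_ALIASES = {
    "metric tonnes": "tCO2e",
    "mtco2e": "MtCO2e",   # million tonnes of CO2 equivalent
    "mtce": "MtCE",       # million tonnes of carbon equivalent
}

# Factors converting each canonical unit into tonnes of CO2 equivalent.
TO_TCO2E = {
    "tCO2e": 1.0,
    "MtCO2e": 1_000_000.0,
    "MtCE": 1_000_000.0 * 44.0 / 12.0,  # carbon to CO2 by molecular weight
}

def to_tco2e(value: float, reported_unit: str) -> float:
    """Convert a reported emissions figure into tonnes of CO2 equivalent."""
    canonical = UNIT_ALIASES.get(reported_unit.strip().lower(), reported_unit)
    if canonical not in TO_TCO2E:
        raise ValueError(f"Unrecognised unit: {reported_unit!r}")
    return value * TO_TCO2E[canonical]

print(to_tco2e(2.5, "MTCO2E"))  # a figure reported as "MTCO2E" -> 2,500,000 tCO2e
```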

I’m on the asset management side, so we’re buyers of this data. This means I'm an advocate of more players in the market. Increased competition gives us more options, and we can finally shift from “we have all this data” to “how do we improve quality?”. It’s an important shift.

Alongside the increase in data providers, regulations have helped reporting companies ensure they are providing data under the proper frameworks, which makes it easier for us to use this data and ultimately report to our clients.

It’s only fair to say that the system was never going to be perfect from day one. I’m hoping that the next set of conversations around this topic centres on ensuring that the right type of data is captured at source, collected, and relayed back in the most efficient manner.

"With real time data capture, you could seriously improve information accuracy.

It means engaging in conversations that could open a new era for the industry."

That conversation could focus on data capture technology and firms that specialise in aggregating that information – for example, scraping the reports – and, therefore, helping these underlying companies at source.

This brings us to another issue: there’s value in historical data, but we’re also moving towards real-time reporting, which will be a game changer.

With real-time data capture, you could seriously improve information accuracy. It means engaging in conversations that could open a new era for the industry. You can get into alpha-generation ideas: high-frequency trading, day trading, or hedge funds that will take that data and plug it into their models.

This would be a change from models that predict future value or impact – which, to be honest, aren’t entirely accurate and come with lengthy caveats or assumptions. Real-time data is going to play a fundamental role in the next generation of models, valuations, and trading strategies.

Andrew: You mentioned difficulties around data scraping and available information. What did those relationships look like in the past – and how have they changed since then? How do you access this information for regular use?

Aria: Some companies do it in a seek-and-find style – such as start-up-type firms that have the technology to scale and pivot quickly, as well as the skill to get more niche and focus on specific data. In comparison, larger data providers offer muscle and often cast a wider net.

One relatable analogy is a super tanker versus a speedboat; the latter can pivot and adjust more quickly. If you ask large data providers to find specific datasets, they won’t do it as quickly (or at all) – unless you're a significant client or there’s wider demand.

Andrew: A lot of companies do have multiple sources of data. If cost weren’t a factor, would you recommend using a bigger player versus a smaller, more agile start-up – especially when it comes to these benefits of real-time data?

Aria: The strategy that I’ve pitched in the past revolves around having multiple data providers at different levels and stages.

You often need the larger provider because sometimes it is client-driven, and your clients want that specific provider to deliver their data. But there will always be gaps – whether around data coverage or quality – so you want to supplement your datasets. This could mean having one or two large providers and supplementing them with newer, up-and-coming providers.
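To make the supplementing strategy concrete, here is a minimal sketch of preferring a large, client-mandated provider and falling back to a smaller provider only where coverage is missing; the provider data, identifiers, and record shape are hypothetical.

```python
# A minimal sketch of filling coverage gaps from a secondary ESG provider.
# Provider data and identifiers are hypothetical.
def combined_score(issuer_id: str,
                   primary: dict[str, float],
                   secondary: dict[str, float]) -> tuple[float, str] | None:
    """Prefer the large, client-mandated provider; fall back to the
    up-and-coming provider only where the primary has no coverage."""
    if issuer_id in primary:
        return primary[issuer_id], "primary"
    if issuer_id in secondary:
        return secondary[issuer_id], "secondary"
    return None  # still a gap: flag for research rather than guessing

primary_scores = {"LEI-AAA": 68.0}
secondary_scores = {"LEI-AAA": 70.0, "LEI-BBB": 55.0}
print(combined_score("LEI-BBB", primary_scores, secondary_scores))  # (55.0, 'secondary')
```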

Aria is on the judging panel for the ESG Investment Leader awards and will be speaking at the ESG Investment Leader conference in November.