Frank Smietana in conversation with Karthik Rajagopalan and Peter Sherriff. 

Investment managers and asset owners increasingly rely on partnerships with third-party data, analytics and application providers to support investment and risk decision making, bring new products to market faster, and deliver a superior user experience to their stakeholders. Open architecture platforms that enable interoperability via application programming interfaces (APIs) and cloud services are key to leveraging those partnerships, allowing clients to tailor their operating environment in ways that best meet the unique demands of their product and asset class mix.

In this article, Karthik Rajagopalan and Peter Sherriff join Frank Smietana to discuss how the partnership between FactSet and State Street Alpha® provides clients with an integrated and co-engineered solution that delivers differentiated multi-asset data and analytics across their front and middle office.

Frank Smietana

Head of Thought Leadership,
State Street Alpha

Karthik Rajagopalan, CIPM

Director of Client Solutions,
FactSet

Peter Sherriff

Director of Product Strategy, Asia Pacific,
Charles River

Why are traditional data infrastructures and database constructs failing investment managers?

KARTHIK: These infrastructures tend to be monolithic and lack the dexterity and agility to scale with rapidly growing data volumes and demands. While traditional data solutions provide core capabilities for mastering and aggregation, they have inherent limitations due to their design. For example, the ability to process unstructured data from email, social media, and news feeds alongside structured investment data sets such as exposures and analytics requires advanced capabilities not found in legacy technology.

The ingress and egress of data between disparate solutions comes with ETL (extract, transform, load) and orchestration challenges that hinder interoperability between systems and limit the ability to process large volumes and varied types of data. Supporting the data-intensive needs of quants and AI/ML initiatives is something these systems were not designed for.

PETER: Over time, data was created in many different siloed systems, each specific to a use case or asset class. Consolidating those silos has typically relied on ETL tools, a suboptimal and very technical approach that requires highly skilled operations teams to organize that data into a centralized data store.

These approaches increasingly fall short because of the human resources they require. It’s not sustainable to keep allocating people to centralizing information and delivering it to data consumers.

A recent survey found that, depending on industry sector and use case, barely 50 percent of the data contained in these centralized warehouses is actually being used. Implementing modern data architectures that make it easier to track and manage data consumption patterns will improve the way data is accessed and applied to solving business problems.
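
To make that idea concrete, here is a minimal sketch of the kind of consumption-pattern check a modern architecture makes easy. It is a toy illustration, not any vendor’s API: the catalog, table names, and access log are hypothetical sample data.

```python
from datetime import datetime, timedelta

# Hypothetical warehouse catalog and query-access log.
catalog = {"positions", "benchmarks", "esg_scores", "legacy_trades", "fx_rates"}

access_log = [  # (table_name, last_query_timestamp) -- illustrative sample data
    ("positions", datetime(2024, 5, 30)),
    ("benchmarks", datetime(2024, 5, 12)),
    ("fx_rates", datetime(2023, 11, 2)),
]

# Flag tables that nobody has queried in the last 90 days.
cutoff = datetime(2024, 6, 1) - timedelta(days=90)
recently_used = {table for table, ts in access_log if ts >= cutoff}

usage_ratio = len(recently_used) / len(catalog)
unused = sorted(catalog - recently_used)

print(f"Tables queried in the last 90 days: {usage_ratio:.0%}")
print("Candidates for review or retirement:", unused)
```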

What’s driving fund managers and asset owners to make greater use of their existing investment data?

KARTHIK: There is an ongoing need for organizations to make use of alternative datasets and blend different data domains to extract investment insights. Examples include analyzing the sentiment of earnings call transcripts to create alpha signals, as well as using ESG factors for regulatory oversight and reporting purposes. Regardless of how alternative data is used, cross-referencing is a prerequisite to combining different datasets; it helps ensure legal entity and security identifiers are mapped correctly and holistically across all sources.
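
As a simplified illustration of that cross-referencing step, the sketch below (hypothetical identifiers and column names, assuming pandas is available) resolves two datasets to a common security master before blending them.

```python
import pandas as pd

# Hypothetical security master used as the cross-reference ("xref") table:
# every source identifier resolves to one internal security_id and legal entity.
xref = pd.DataFrame({
    "security_id":  ["SEC001", "SEC002"],
    "isin":         ["US0378331005", "US5949181045"],
    "ticker":       ["AAPL", "MSFT"],
    "legal_entity": ["Apple Inc.", "Microsoft Corp."],
})

# Structured holdings keyed by ISIN; alternative sentiment data keyed by ticker.
holdings = pd.DataFrame({"isin": ["US0378331005", "US5949181045"],
                         "weight": [0.04, 0.03]})
sentiment = pd.DataFrame({"ticker": ["AAPL", "MSFT"],
                          "call_sentiment": [0.62, -0.10]})

# Resolve both datasets to security_id, then blend them on that common key.
holdings_x = holdings.merge(xref[["security_id", "isin"]], on="isin")
sentiment_x = sentiment.merge(xref[["security_id", "ticker"]], on="ticker")
blended = holdings_x.merge(sentiment_x, on="security_id")

print(blended[["security_id", "weight", "call_sentiment"]])
```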

Timely and sophisticated analytics are increasingly gaining traction as the foundation for efficient investment decision-making, especially through intraday portfolio monitoring and real-time market data updates. We see continued expansion into new geographies as well as asset classes (e.g., alternatives, digital assets, and exotic derivatives) to drive investment returns and grow AUM. Being able to reduce the total cost of ownership through data and technology is a huge opportunity; however, it requires the adoption of a new data management paradigm.

PETER: Increasingly, institutional investors realize that new data sources can provide insight and advantage, such as creating a differentiated product based on something identified in one of those data sets.

For example, ETF managers are harnessing large language models and natural language processing to parse regulatory filings, patent applications, and other alternative data, enabling innovative and disruptive new products. The other way to differentiate is by delivering superior risk-adjusted performance relative to competitors. Here too, unlocking insights from data can be a key enabler of success.
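
The sketch below is a deliberately minimal stand-in for the NLP/LLM pipelines described here: it scores a filing or transcript excerpt with a small keyword lexicon. Real systems use proper language models; the lexicon, excerpt, and function name are purely illustrative.

```python
import re

# Toy sentiment lexicon -- a stand-in for model-based scoring, not a real signal.
POSITIVE = {"growth", "record", "innovation", "expand", "beat"}
NEGATIVE = {"decline", "impairment", "lawsuit", "shortfall", "miss"}

def sentiment_signal(text: str) -> float:
    """Return a score in [-1, 1]: (positive hits - negative hits) / total hits."""
    tokens = re.findall(r"[a-z]+", text.lower())
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

excerpt = "Record revenue growth as we expand into new markets despite a legal shortfall."
print(round(sentiment_signal(excerpt), 2))  # 0.5 for this illustrative excerpt
```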

How are fund management clients leveraging partnerships and open architecture platforms like State Street Alpha to provide stakeholders with fit-for-purpose data?

KARTHIK: Our clients tell us they want deeper relationships with fewer partners to streamline and simplify their operating environment. A connected architecture between external solutions is imperative to reducing operational friction and achieving a single source of truth across the enterprise.

FactSet’s collaboration with State Street and its Alpha Data Platform (ADP) stands on the shoulders of the partnership with Snowflake, unlocking the power of interconnected solutions for our clients. We have automated the flow of ABOR/IBOR datasets such as portfolio hierarchy, security terms and conditions, and portfolio holdings and transactions from ADP into FactSet.

Our multi-asset class data populates ADP, enabling a golden source of analytics that enhances operational workflows and accelerates investment decision-making. All of this is achieved through turnkey integration that supports faster time to market for our clients. The ability to persist and “lock down” analytics through book-of-record capabilities ensures data consistency across each client’s enterprise.

Whether users access data via Looker dashboards, Power BI, Jupyter notebooks, or the Alpha portal, ADP ensures everyone is operating from the same official data. ADP provides interpretability, lineage, and governance, regardless of how data is consumed.

PETER: In addition to the desire for fewer but deeper partner relationships, our clients also expect their partners to have deeper relationships with each other. This requires viewing the integration through a client lens. It’s important to understand what the client is trying to do with that data and what value they’re looking to capture.

Fund managers are leveraging existing partnerships between their partners as much as they are their own relationships with those partners. For example, the industry has struggled for years to align OTC instrument security terms and conditions across different platforms. With an open architecture platform, OTC instruments start their life in the front office. In the case of State Street Alpha, that information flows seamlessly through the platform and out to FactSet to inform investment professionals how those OTC instruments are impacting portfolio returns and attribution, and how that ties into risk and other analytics. It’s critical that our clients have confidence that partners are working together to ensure accurate and consistent data and analytics across the investment process.

What capabilities does a modern enterprise data architecture offer fund managers who are either looking to consolidate their latest merger or launch a wholesale transformation of their organization?

KARTHIK: First, the ability to process unstructured and structured data in a synchronized manner; it’s important that data integration patterns such as Open API standards are adopted. Second, powerful cloud-native architectures that increase resiliency, agility, and scalability to support data governance and AI/ML initiatives, improve automation, and lower operating costs. Lastly, the ability to minimize data movement through cloud data-sharing technologies and to harness the power of real-time streaming pipelines. Reducing the proliferation of data silos and empowering data democratization through relevant access controls would not be possible without these capabilities.

PETER: Data quality is always a central focus: ensuring that the information derived from the data is accurate, complete, and reliable. Newer solutions provide better technology and capabilities to assist with that. For example, machine learning and other cloud-native tools can be used to flag data anomalies while generating significantly fewer false positives.
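
A minimal sketch of that anomaly-flagging idea, assuming scikit-learn is available; the features, thresholds, and sample data are illustrative rather than a description of any specific product’s checks.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Illustrative daily data-quality features per record: price return,
# stale-price ratio, and position change. One record is deliberately odd.
rng = np.random.default_rng(0)
normal = rng.normal(loc=[0.0, 0.01, 0.0], scale=[0.01, 0.005, 0.02], size=(500, 3))
suspect = np.array([[0.35, 0.40, 1.20]])  # e.g. a mispriced or duplicated record
features = np.vstack([normal, suspect])

# An Isolation Forest learns what "typical" records look like and isolates
# outliers, which tends to produce fewer false positives than static limits.
model = IsolationForest(contamination=0.005, random_state=0).fit(features)
flags = model.predict(features)  # -1 = anomaly, 1 = normal

print("Records flagged for review:", int((flags == -1).sum()))
```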

As Karthik pointed out, data governance and inherent confidence in the accuracy of the data are critical. Many firms embarking on data transformation projects have identified gaps in the governance of traditional solutions and architectures, noting the difficulty of eliminating data silos at an enterprise level, which results in a lack of trust and inconsistent data.

Why does consistency matter?

PETER: As a portfolio manager, knowing at the start of your day that the positions, exposures, and cash you see have gone through rigorous and consistent quality checks and governance controls is key to having confidence in your portfolio data.

Security and data privacy considerations in a cloud-enabled world are also critical. With the increasing number of cyber threats, the time and investment large cloud solutions providers like Microsoft put into ensuring the security and privacy of their cloud solutions is phenomenal. For the manager, there’s a lot of confidence in the security and privacy of that data because it’s being managed to the highest possible standards. That’s difficult, if not impossible, for even the largest global asset managers to achieve solely with internal resources.

How is access to real-time data transforming the landscape for investment firms?

KARTHIK: End-of-day analytics and processing are no longer sufficient for making timely investment decisions. As a content provider, we stream real-time feeds into order management systems from FactSet’s global ticker plants. This, combined with our portfolio management system, which calculates up-to-date positions from order management system fills and allocations, allows portfolio managers to pivot as needed based on real-time exposures and risk indicators as trades occur intraday.
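
As a simplified illustration of that position-keeping step, here is a pure-Python sketch that rolls start-of-day positions forward with intraday fills. The fills, identifiers, and prices are hypothetical; real OMS/PMS integrations are considerably richer.

```python
from collections import defaultdict

# Hypothetical intraday fills from an order management system:
# (security_id, side, quantity, price)
fills = [
    ("SEC001", "BUY", 10_000, 187.25),
    ("SEC002", "SELL", 4_000, 415.10),
    ("SEC001", "BUY", 5_000, 187.40),
]

start_of_day = {"SEC001": 50_000, "SEC002": 20_000}  # e.g. positions sourced from the IBOR

def intraday_positions(sod, fills):
    """Roll start-of-day positions forward with signed fill quantities."""
    positions = defaultdict(int, sod)
    for security_id, side, qty, _price in fills:
        positions[security_id] += qty if side == "BUY" else -qty
    return dict(positions)

positions = intraday_positions(start_of_day, fills)
marks = {"SEC001": 187.40, "SEC002": 415.10}  # latest intraday prices, illustrative
exposures = {sec: qty * marks[sec] for sec, qty in positions.items()}

print(positions)  # {'SEC001': 65000, 'SEC002': 16000}
print(exposures)
```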

How does our partnership benefit operations professionals working in middle office performance and risk teams?

PETER: We’ve worked with clients in the region who previously relied on a multi-day exercise, involving the manual ingestion and aggregation of information from a variety of sources, to generate an enterprise view of performance and risk. By leveraging the power of Snowflake’s zero-movement data sharing, fund managers can effectively eliminate the information lag between the front office and other parts of the organization that rely on the same investment data but for a different use case, one that places more importance on oversight and control processes.
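
A minimal sketch of what consuming such a share can look like, assuming the Snowflake Python connector; the account, share, database, and table names are hypothetical, and this is an illustration of the general pattern rather than the Alpha integration itself.

```python
import snowflake.connector

# Hypothetical connection details; in practice these come from secure configuration.
conn = snowflake.connector.connect(
    account="consumer_account",
    user="middle_office_svc",
    password="********",
    warehouse="ANALYTICS_WH",
)
cur = conn.cursor()

# Mount the provider's share once -- the underlying data never moves or gets copied.
cur.execute("CREATE DATABASE FRONT_OFFICE_SHARE FROM SHARE provider_account.positions_share")

# Query the same governed positions the front office sees, with no overnight file drop.
cur.execute(
    "SELECT portfolio_id, security_id, quantity "
    "FROM FRONT_OFFICE_SHARE.PUBLIC.POSITIONS "
    "WHERE as_of_date = CURRENT_DATE()"
)
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()
```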

From an operational and back-office perspective, information lags will be increasingly problematic in a T+1 settlement world. The movement of data from the front office into the middle office for trade matching and confirmation affects the settlement process and the delivery of positions and cash. Traditional methods of moving files around overnight will no longer be viable. Giving all stakeholders access to consistent information in real time streamlines the response to industry challenges such as T+1 settlement because everyone has visibility into what is happening across the transaction lifecycle. Our clients are excited by the new opportunities this affords them to improve the value they add for their end clients, for example by getting plan members more engaged with their savings and pension plans.

How are asset managers across APAC leveraging outsourced data management services?

KARTHIK: We are seeing an increased demand for data scrubbing and managed performance and reporting services—especially in high-cost markets such as Australia, Singapore, Hong Kong, and Japan. As our clients expand into new geographies, they are looking to delegate operationally intensive tasks to trusted partners that can provide a flexible service offering. This helps strike a balance between insourcing, outsourcing, and co-sourcing models tailored to each organization’s unique requirements.

PETER: For some time, fund managers have leveraged their asset service provider for data management services because there’s an expectation that the positional and unit pricing data is well governed. Initially, funds were primarily interested in NAV oversight; over time, demand for greater transparency and visibility has grown to meet stakeholder and regulatory requirements.

The co-sourcing option Karthik referred to is particularly interesting. As clients move toward an outsourced data management service, they still need to exercise due diligence and oversight of their data. ADP allows our partners like FactSet to import their data into a single repository that is managed and governed with transparent data lineage, auditability, and traceability. Proprietary data sets can still travel through the same pipeline but can be ringfenced with secure access controls in place for defined users. Having this level of consistency gives users defensibility in their data sets. They have the lineage and visibility into where that data came from and how it arrived at the designated end point.
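
To make the ring-fencing idea concrete, the sketch below uses standard role-based grants of the kind a Snowflake-style platform supports; the role, database, schema, and table names are hypothetical, and this is not a description of any specific Alpha configuration.

```python
# Hypothetical ring-fencing of a proprietary dataset using role-based grants.
# `cur` is any DB-API cursor (for example, the Snowflake cursor from the
# earlier sketch); the statements themselves are standard grant syntax.
RINGFENCE_STATEMENTS = [
    # Proprietary research lives in its own schema, readable only by one role.
    "CREATE ROLE IF NOT EXISTS QUANT_RESEARCH_READER",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE QUANT_RESEARCH_READER",
    "GRANT USAGE ON SCHEMA ANALYTICS.PROPRIETARY_SIGNALS TO ROLE QUANT_RESEARCH_READER",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.PROPRIETARY_SIGNALS "
    "TO ROLE QUANT_RESEARCH_READER",
    # Shared, vendor-sourced data stays readable by the broader analyst population.
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.SHARED_MARKET_DATA TO ROLE ANALYST",
]

def apply_ringfence(cur):
    """Apply the grants through a DB-API cursor; each statement is safe to re-run."""
    for stmt in RINGFENCE_STATEMENTS:
        cur.execute(stmt)
```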

How does the State Street and FactSet partnership help clients evolve their data architecture?

KARTHIK: The complementary nature of State Street’s and FactSet’s solutions, together with a shared focus on client centricity throughout our partnership, is a powerful combination. Combining ADP’s robust front-to-back capabilities with FactSet’s market-leading performance and risk analytics, research management, and portfolio management capabilities provides our clients with bespoke, fit-for-purpose solutions tailored to their unique challenges.

PETER: Technology is no longer the defining point. Instead, we’re focused on supporting our clients’ unique requirements and the delivery of consistent, high-quality data to drive investment insights and improve operations. Data architecture is all about how information is consumed. What we are doing is evolving our clients’ data architecture to a point where they only need to think about the consumption pattern. We’re providing them with a toolkit that engenders the trust and confidence to share that data amongst their consumers.

“While traditional data solutions provide core capabilities, they have inherent limitations due to their design.”

– Karthik Rajagopalan

“New data sources created by blending different datasets can provide insight and advantage that previously may have been overlooked.”

– Peter Sherriff

“Clients are looking to delegate operationally intensive tasks to trusted partners that can provide a flexible service offering – striking a balance between insourcing, outsourcing and co-sourcing models.”

– Karthik Rajagopalan

“Consistent quality checks and governance controls are key to promoting confidence in portfolio data.”

– Peter Sherriff

Contact Us

To learn more about State Street Alpha® or to schedule a demo, contact us.

5893117.1.1.GBL.

The material presented is for informational purposes only. The views expressed in this material are the views of the author and are subject to change based on market and other conditions and factors; moreover, they do not necessarily represent the official views of Charles River Development and/or State Street Corporation and its affiliates.