Jennifer Cosco
Luke Stafford
The world today is characterised by countervailing forces: a borderless digital economy is advancing like never before, while the international system descends into a deepening geopolitical recession. These divisions are driving fragmentation across the traditional economy, from global supply and value chains to international flows of capital, and increasingly in the digital sphere.
While global leaders have taken proactive steps to forge agreement on the most pressing issues of the moment – the UK’s initiative in convening the AI Safety Summit at Bletchley Park in 2023 is a pertinent example – cooperation on digital and technology policy remains on precarious footing. With elections taking place in more than 60 countries in 2024, potentially bringing new governments with new mandates and new priorities, that footing could become more precarious still.
Understandably, given the profound – and sometimes existential – questions attached to the issue, artificial intelligence (AI) has been prioritised by policymakers around the world. The collective need to manage the biggest threats and opportunities that this technology presents is a source of common ground across borders. But with the focus predominantly on AI’s possible effects, there’s a risk of losing sight of the need to collaborate on ensuring the integrity of the key ingredient that goes into it: data.
Since long before the explosion of AI, data has been the lifeblood of the global economy and financial system. Flowing from sources to users around the world, data underpins how we all work and live: whether accessing the cloud, informing strategic decisions or enabling remote working, its role is ubiquitous across organisations of all sizes, sectors and geographies. With services representing an increasing proportion of global trade alongside goods, cross-border data flows are only becoming more important.
LSEG alone delivers around 300 billion data messages to customers across 190 markets every day. With data becoming ever more central to the global economy, we have seen customer demand for our data rise by around 40% per year since 2019.
To unlock the value of data for the global economy and society, it needs to be able to flow across borders freely, securely and with trust. For example, globally accessible know-your-customer data helps financial institutions prevent their businesses from being used to launder the proceeds of crime, and cross-border data flows allow firms to monitor and respond to cybersecurity and other operational risks in real time.
When Russia invaded Ukraine, LSEG saw a 600% spike in customers’ use of our risk intelligence database to implement sanctions quickly and effectively. At 40,000 customer requests per day, the ability of data to cross borders makes a huge difference to our capacity to meet their needs.
The need for trusted data flows is no different when it comes to capitalising on the opportunities and minimising the risks associated with AI. An AI system’s output is only as good as its data inputs: its potential depends on access to large datasets sourced from jurisdictions around the world, and on robust processes to ensure the quality, integrity and lineage of that data.
This is particularly apparent in the financial services sector, which is rapidly adopting AI systems across a range of use cases. Our industry relies on pinpoint accuracy: if LSEG were to tell our customers that our data and analytics were 95% accurate, it would simply not be good enough – at around 300 billion messages a day, a 5% error rate would mean roughly 15 billion bad data points daily.
In this respect, there’s certainly a need to coordinate on policy that promotes the responsible use of AI, and we’ve seen welcome progress through fora such as the G7 and the OECD. LSEG has drawn on this work to inform our own approach in developing Responsible AI Principles.
But continuing to advance cooperation on data governance and strong digital trade rules is also an important part of the recipe. The G7 and OECD have been central on this front too, driving international cooperation to promote cross-border data flows while ensuring trust in privacy, security, and intellectual property rights. The Japanese Government was pivotal in gaining political support for “Data Free Flow with Trust” (DFFT) during its 2023 G7 Presidency, and now the OECD is leading multi-stakeholder work to operationalise this agenda at a technical level.
DFFT has many applications, and enabling the development of robust, reliable and responsible AI systems that benefit the economy and society is one of them. Looking ahead, policymakers should consider these issues holistically, connecting the dots between parallel workstreams that are intrinsically linked, and continuing to go further and faster to keep pace with the rapid technological transformation the world is undergoing.
You can’t make a good meal without the right ingredients – and for AI to meet its potential in addressing the world’s biggest challenges, we need to be able to rely on the data that’s going into it.