AI Act: AFME Regulatory Recommendations

AFME supports the fair, competitive, and safe development of AI in Europe. Its recommendations cover the definition of AI, requirements for General Purpose AI (GPAI) providers, and the approach to high-risk AI systems. It calls for amendments to make data governance requirements clearer and more proportionate, and for financial authorities to supervise firms' compliance.

EU Regulatory Recommendations for AI

AFME Provides Detailed Recommendations on EU AI Act for Effective Data Governance

Source: Association for Financial Markets in Europe

The Association for Financial Markets in Europe (AFME) has shared its position on the EU Artificial Intelligence Act legislative process in a document offering detailed insight into the upcoming trilogue negotiations. AFME expresses support for the fair, competitive, and safe development of Artificial Intelligence (AI) in Europe and outlines its priorities for accomplishing this goal. The paper presents recommendations on aspects such as the definition of AI, requirements for General Purpose AI (GPAI) providers, and the approach to high-risk AI systems. Data governance is a key area, with AFME advocating for amendments that clarify the requirements on data sets and training data and make them proportionate. The paper also calls for financial institutions' compliance with the regulation to be supervised by financial authorities across the Union. Lastly, AFME welcomes the opportunity to rely on voluntary codes of conduct to cultivate trustworthy AI across the EU, but suggests that a more balanced approach is needed when asking for voluntary adherence to codes of conduct for non-high-risk AI systems.





The Association for Financial Markets in Europe (AFME) has shared a significant position paper on the European Union's AI Act legislative process, providing vital insights into upcoming negotiations. This document, rich in proposals for AI development within the financial sector, has the potential to shape the future of AI regulations and implementation in Europe.

A central theme running through AFME's position is the call for clear, adaptable definitions of Artificial Intelligence (AI) and General Purpose AI (GPAI). More precise definitions would make it easier for banks, insurance companies, investment firms, and other financial institutions to pursue innovative AI applications without capturing systems that fall outside the regulation's intent. This flexibility could prove pivotal for future-proof rules that keep pace with rapid technological advances in AI.

Moreover, AFME stresses the importance of a risk-based approach in determining which AI systems are treated as high-risk. By focusing on risks to health, safety, or fundamental rights, the proposed legislation could bolster the responsible development and deployment of AI within financial services. This approach aims to lay the groundwork for safer, more transparent AI applications that meet rigorous risk management requirements, thereby inspiring trust within the EU's financial sector.

An important highlight of AFME's recommendations revolves around data governance. The call for a proportionate approach to data set and training data requirements represents a significant push towards balancing the need for robust data regulation with the imperative to encourage innovation. This approach is relevant for all financial institutions leveraging AI and can support data governance policies that are fit for purpose and conducive to innovation.

In the spirit of consistency and fair competition, AFME also recommends a unified supervisory approach by financial authorities across the EU. Such harmonization could lead to a more level playing field and ensure that AI regulations are applied consistently across all EU member states.

Lastly, AFME encourages the adoption of voluntary codes of conduct, fostering an environment of self-regulation and ethical AI development. However, they also emphasize the need for balance, suggesting that regulations should avoid being excessively strict, particularly for non-high-risk AI systems.

To mitigate potential regulatory impacts, financial institutions should actively engage with regulators and AFME, review and update internal AI processes, enhance risk management strategies, update data governance strategies, prepare for increased supervisory scrutiny, and consider adopting voluntary codes of conduct.

Given the legislative nature of the process, the timeline for these changes is uncertain but is likely to unfold over several months to years. This positions the AFME's document as an essential guide for financial institutions navigating the EU's evolving AI regulatory landscape.





Read More

The Association for Financial Markets in Europe (AFME) is the voice of Europe’s wholesale financial markets. We represent the leading global and European banks and other significant capital market players.



