The AI Act entered into force on August 1, 2024, and its obligations apply in stages: since August 2, 2025, the governance rules and the obligations for general-purpose AI models have applied. The regulation introduces groundbreaking requirements for the responsible development and use of artificial intelligence systems in the European Union. Companies must meet specific legal requirements for risk assessment, technical documentation, transparency, and human oversight of AI.
Expert commentary – why is this important?
Compliance with the AI Act is not just a formality – it is the basis for the safe and ethical use of AI, data protection, prevention of discrimination, and building trust with customers and business partners. Integrating the requirements of the AI Act with the ESG (Environmental, Social, Governance) approach allows companies to operate in line with global trends in social responsibility and corporate governance.
Why are AI Governance and ESG the foundation of a modern organization?
- Environment - AI systems should support sustainable development, including through effective data and computing resource management, which reduces the carbon footprint.
- Social - responsible AI must protect human rights, counteract bias, and support social inclusion.
- Governance - companies need transparent AI management policies, roles and responsibilities, and oversight and audit mechanisms.
By integrating AI Governance with ESG, organizations increase their competitiveness, reduce regulatory and reputational risk, and gain an advantage in their dialogue with investors and customers.
What do companies need to do to comply with the AI Act?
- Conduct an inventory of AI systems - determine which ones are subject to the AI Act and identify their risk category.
- Define an AI Governance and ESG strategy - develop a policy for the implementation, monitoring, and ethical use of AI.
- Prepare documentation and processes - create model cards, impact assessments, risk management procedures, and compliance reporting.
- Implement AI oversight and monitoring - ensure testing, validation, and periodic audits of AI systems.
- Train employees - understanding of the AI Act requirements and the principles of ethical AI use must be widespread throughout the organization.
- Manage supplier risk - ensure that external technologies meet legal and ethical requirements.
Why is this crucial in the context of ESG?
Implementing AI in accordance with the AI Act and ESG principles is not only a regulatory obligation. It is also a way to:
- increase transparency and trust,
- reduce risks related to discrimination, privacy violations, and cyber threats,
- adapt to the growing demands of investors and business partners,
- build long-term value for the company in the global market.
Summary - legal foundation and competitive advantage
The AI Act is a turning point in the regulation of artificial intelligence. Companies that implement effective AI Governance and ESG processes now will gain an advantage – they will be prepared for legal requirements, avoid financial and reputational risks, and build customer trust.
Since August 2, 2025, the AI Act's penalty provisions have applied: fines for the most serious infringements can reach EUR 35 million or 7% of global annual turnover, whichever is higher. Therefore, it is essential to:
- conduct risk assessments and classify AI systems,
- implement monitoring and auditing processes,
- maintain complete compliance documentation,
- train employees and build a culture of ethical AI.
Ethical and compliant AI is not just a requirement - it is the foundation of sustainable, responsible business in the digital age.