Have you, following the DPO and the CISO, already appointed an "AI Officer"? For most organizations this will not be necessary to keep their AI activities compliant, but they still need appropriate governance. Having discussed our 11 principles for the responsible use of AI and risk assessments, we now explain what you should look out for and which steps you should take in terms of governance to ensure a legal and ethical use of AI. This is part 5 of our AI series.
In simple terms, governance for the legal and ethical use of AI means that a company defines who has which tasks, powers and responsibilities in relation to the use of AI and the implementation of AI projects, and which rules and procedures must be complied with. This ensures that everything runs smoothly, that important goals can be achieved and that no unwanted risks are taken. Achieving this with regard to compliance is not easy, especially in the field of artificial intelligence, for three reasons:
How is a compliance or legal department that is suddenly swamped with requests supposed to provide a reasonable answer within a reasonable time? This has to settle down and, above all, be regulated and organized, i.e. companies have to define their guidelines (not only legal, but also technical, such as which platforms may be used) and regulate responsibilities and procedures. This is what we currently do most often with our clients, apart from assessing specific projects, providers and tools.
One other remark: Proper governance is not only necessary for ensuring compliance, but also for achieving the other goals an organization may have. While we will focus on compliance governance in the following, similar processes, rules and structures will also help in achieving other goals in relation to the use of AI.
This raises the question of who in the company is responsible for the topic. At least with regard to legal issues, this has not really been settled in many places. However, we are seeing certain trends among our clients. One of them is that the topic of AI compliance is now primarily assigned to those who already take care of data protection compliance. It is true that AI can also affect other areas of law, such as copyright law. However, as many companies are primarily users of AI technologies that are already available on the market, these other legal issues are somewhat less prominent for them than for those who offer AI products themselves (we will discuss separately the changes that result from the AI Act).
In many cases, we believe it also makes sense for AI compliance to be coordinated internally with those who are already responsible for data protection. They will usually already have the most experience: many of the central concerns and approaches relating to a legally compliant, ethical and risk-aware use of AI are already well known and established in data protection. Examples include the principles of transparency, accuracy and self-determination. Similarly, data protection practitioners already have a lot of experience with two compliance tools that are now also becoming important in the field of AI: records of processing activities and data protection impact assessments. We recommend the former to our clients for their internal AI applications, and the latter is a proven methodology for assessing and addressing risks that is also well suited to AI projects (see Part 4 of our series). It is therefore not surprising that most regulatory recommendations on the use of AI to date have come from supervisory authorities in the area of data protection.
The EU AI Act will also add the dimension of product regulation to the use of AI (especially for those companies that qualify as providers) and a number of cases in which companies will have to introduce certain standard checks, such as whether a particular project could result in a prohibited AI application, whether one of the special obligations imposed on deployers of AI (e.g. when using AI-generated content for public information) needs to be complied with, or whether the intended AI application is considered "high-risk" under the AI Act.
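For companies that want to operationalize such standard checks, they can be built into the intake step of the AI approval workflow. The following is a minimal sketch in Python of what such a triage could look like; the class, field names and questions are our own illustrative assumptions, not a complete or authoritative mapping of the AI Act's requirements.

```python
# Hypothetical intake triage for new AI use cases, loosely following the standard
# checks mentioned above (prohibited practices, deployer duties, high-risk areas).
# The questions and categories are illustrative placeholders only.
from dataclasses import dataclass


@dataclass
class AIUseCase:
    name: str
    description: str
    uses_ai_content_for_public_info: bool = False   # deployer transparency duty?
    possible_prohibited_practice: bool = False      # e.g. social scoring, manipulation
    possible_high_risk_area: bool = False           # e.g. employment, credit decisions


def triage(use_case: AIUseCase) -> list[str]:
    """Return follow-up actions for the compliance team."""
    actions: list[str] = []
    if use_case.possible_prohibited_practice:
        actions.append("Escalate: project may fall under a prohibited AI practice.")
    if use_case.uses_ai_content_for_public_info:
        actions.append("Check deployer transparency obligations for AI-generated content.")
    if use_case.possible_high_risk_area:
        actions.append("Assess whether the application qualifies as high-risk.")
    if not actions:
        actions.append("Proceed with standard review (e.g. risk assessment per Part 4).")
    return actions


# Example: a tool that drafts public-facing marketing texts
case = AIUseCase(
    name="Marketing copy generator",
    description="LLM drafts public-facing product texts",
    uses_ai_content_for_public_info=True,
)
for action in triage(case):
    print(action)
```

In practice, such a checklist would of course be embedded in the company's existing approval and documentation processes rather than run as standalone code.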
To govern compliance in relation to the use of AI, a company needs to address at least these three elements: clear guidelines (not only legal, but also technical, e.g. which platforms and tools may be used), an organization with defined responsibilities, and processes for assessing and approving AI projects and tools.
Even though artificial intelligence is currently surrounded by a great deal of hype, it seems clear that AI will play an increasingly important role in the corporate world and that companies will need to engage with it, if only to avoid falling behind their competitors. Establishing sound AI governance is therefore not only a question of compliance, but also a step towards exploiting the full potential of the technology while minimizing risks. In our experience advising clients, the current great interest in promoting the use of AI represents a very good opportunity to also create the necessary guidelines, organization and processes for compliance and thus governance – at least if this can be shown to enable rather than slow down AI applications.
We will be happy to answer any specific questions you may have.
David Rosenthal
This article is part of a series on the responsible use of AI in companies:
We support you with all legal and ethical issues relating to the use of artificial intelligence. We don't just talk about AI, we also use it ourselves. You can find more of our resources and publications on this topic here.