The use of Generative AI (GenAI) software tools has grown in popularity among many companies. The focus is often on OpenAI's large language model, ChatGPT. But GenAI does not stop at text generation: it also promises greater efficiency, improved output and better use of resources in producing other kinds of content. However, companies that want to take advantage of this should be aware of a few rules of the game.
Why should companies draft an internal GenAI policy?
Despite all the euphoria about the potential of this now widespread and easily accessible technology, there is growing concern about whether its use is lawful and about possible liability risks for companies. Not least, the temporary ban on ChatGPT in Italy has shown that GenAI's features entail many legal uncertainties. Companies that want to use GenAI in their own business should therefore address these challenges.
If GenAI tools are used incorrectly, there is a risk of fines, claims for damages, cease-and-desist letters or the unintentional disclosure of trade secrets. Companies can counter this by creating internal guidelines for the use of GenAI that provide employees with clear rules on how they should handle data in the new environment.
What topics should a GenAI policy address?
There are some key points that should be taken into account when drafting a GenAI policy for your business:
Differentiation between Input Data and Output Data
When dealing with GenAI, it is important to look carefully at the data actually used. Users can “feed” the tool with their own data (Input Data) and then use the data it generates (Output Data) for different purposes. Both data sets require separate consideration, with corresponding guidance for employees.
Training of GenAI models
GenAI acquires its capabilities through training on (large) data sets. In some cases, ongoing training to meet the user’s needs is beneficial and desirable; in other cases, it can be undesirable or even harmful. It is therefore important to check the requirements of each individual use case in your company and to make appropriate agreements with the GenAI service provider. These must be reflected accordingly in the internal guidelines for employees.
Privacy
Data protection can be a major stumbling block for companies using GenAI. It is a particularly risky area because data protection violations can result in severe fines. Companies should therefore understand exactly how the GenAI tools they use process Input Data. This is a prerequisite for establishing a valid legal basis for the data processing performed by the tool and for fulfilling the information obligations towards the data subjects.
Intellectual property rights
While companies entering Input Data must ensure that they hold the necessary licences for the intended use, many complex questions can arise with regard to the rights to further use of the Output Data. To avoid costly cease-and-desist letters from rights holders, appropriate measures should be taken to minimise this risk, depending on the use scenario. This also requires clear rules for employees, training in the use of the tools, and any necessary checks before Output Data is published.
Trade secrets
Companies also need to think about protecting their trade secrets when using GenAI. Agreeing with the service provider on how the software works and how it handles Input Data can help prevent confidential data from ending up in the wrong place.
Security, truth and bias
GenAI is powerful, but not omnipotent. The possibility of incorrect Output Data must never be lost sight of, and appropriate measures must be taken to prevent it from causing damage to the business or to third parties. Adverse outcomes can also occur if Output Data is biased against certain people. Companies should therefore take appropriate measures to address this risk as well.
Summary
The further development of GenAI tools remains an exciting topic for companies. To enable their use on a legally sound footing, companies should address the points above in internal guidelines.
Dr Johannes Baur is an associate in Fieldfisher’s Hamburg office and advises on IT and data protection law.