
Guidelines for data and information security and copyright in relation to generative AI

Digital tools, services and utilities based on generative AI, such as ChatGPT, Copilot Chat and Midjourney, can be valuable resources for employees at SDU. However, when using generative AI, there are a number of things you need to be aware of.

Guidelines

Most generative AI solutions are built so that your input leaves SDU when you use them. This means that the provider of the solution can use the data, and there is a risk that it will be read by others. Data can be used to train the model or be read by the provider’s employees. Even if the provider promises not to do so, SDU is not always able to verify this.

Here you will find a series of short interviews addressing questions about copyright and data security. Below, you will find an elaboration on the topics and information about the authorisation of AI services.

If the AI system has been approved for general use at SDU, you must adhere to the restrictions in the collective assessment (you can read more here). If not, it is possible to use the system locally (see more below).

Many generative AI systems are operated by providers other than SDU. Typically, you are not permitted to use personal data in your input, and you must be aware of the types of information you are entering:

If the information has been published or will be published (e.g. text for sdu.dk), you are welcome to use it as input.

If the information is internal SDU information (e.g. ordinary meeting minutes without confidential items or personal names), you must assess whether it is prudent to use it as input. You must consider whether there may be a risk in relation to the AI service you use and the specific information you share. There will be many situations in which this is not a problem, but be aware that there may be a risk of the information being shared with unauthorised parties.

If the information is confidential (e.g. closed meeting points, information subject to non-disclosure agreements, information about patentable inventions or similar), it must never be used as input in AI solutions that are not explicitly authorised for this purpose.

Be aware of licence terms. Even free solutions impose requirements as a condition of use. For example, ChatGPT's licence terms specify that responses from ChatGPT may not be used for direct decisions that may have legal consequences for individuals. Likewise, if the text is used in a user-facing context, it must be declared that it was written with the assistance of AI.

Please note that you may only provide inputs that you are authorised to disclose to a generative AI service. This means that you are not permitted to upload images, text or code that violate the copyright of third parties.

Never use personal data as input in generative AI unless the solution has been authorised for this purpose. Personal data is information about a person by which the person is identified, can be identified from the information itself, or can be identified by combining the information you have about the person with other available information.

Authorisation of AI services

The Committee for Information Security and Data Protection at SDU (UID) has approved a new guideline for the security authorisation of IT systems in connection with IT acquisitions. The guideline is currently being implemented. According to the guideline, central systems purchased and licence-controlled by SDU IT, including systems that students at SDU are expected to use, must be approved centrally by SDU IT and, if personal data is processed in the system, also by SDU RIO.

If you plan to use generative AI for teaching, you must make sure that the system/service (e.g. Copilot or ChatGPT) has been centrally approved at SDU. If the service has not been centrally approved, you cannot require students to sign up for the service and use it as part of the course. However, this does not prevent lecturers from using the services and their outputs in teaching, as long as the above-mentioned input rules are respected.

Systems/services that a department wishes to use for delimited research projects or other local purposes, and that are neither aimed at students nor process student data, are referred to as local systems/services. These can be used locally, provided that the individual VIP/TAP and their management vouch for the legal use of the system. This also applies to the use of generative AI.
In concrete terms, this means that the VIP/TAP in question must be able to assess whether the security of the service is commensurate with the input they intend to feed into it. In addition, the above-mentioned principles regarding the processing of personal data, copyright and licence terms must, of course, also be observed. This also applies in relation to SDU’s data if the VIP/TAP in question uses a service in their capacity as a private person.

If you need further information, you should first contact your local GDPR and information security coordinator. You can find your local coordinator by accessing the contact page and navigating to your area. You are also welcome to contact the Help Desk or SDU Digital Compliance at sdu-digital-compliance@sdu.dk.



Last Updated 02.07.2024