Artificial intelligence (AI) is an evolving, fast-paced topic. It includes chatbots, algorithms, deep learning, machine learning/intelligence, predictive analytics, and more. The purpose of this guidance is to share current legal considerations and best practices for conducting human subject research at UCI. Please consider the following before submitting a human subject research protocol.

Look for Red Flags

Red Flags May Include:

  • Use, storage, sharing, or selling of identifiable information about UC students, employees, research subjects, or patients, including Protected Health Information (PHI)
  • AI makes decisions or recommendations that affect research subjects
  • AI conducts foreign language translations of consent documents

Engage with Procurement

Please note that UCI already offers a selection of contracted AI services. If you intend to use AI outside of these available options, please ask Procurement to review and execute all agreements or contracts with AI vendors. During the Procurement process, the Terms of Use or Privacy Policy will be reviewed to ensure it does not expose UC to risk. Contracts must include an Appendix - Data Security and, where PHI is involved, an Appendix - Business Associate Agreement.

Remember, faculty do not have the authority to sign contracts on behalf of UC. If a contract has already been signed and the data involves Red Flags, contact Procurement to protect UC.

Alert Legal, Privacy and Compliance, and Information Security

AI can be helpful, but it may also carry risks. Think about issues such as privacy, cybersecurity, discrimination, and liability.

AI can be biased and inaccurate. UCI may need to know what datasets were used to create the AI tool, as datasets may contain inappropriate material. Further, the IRB may ask for evidence of how the AI translation addresses bias, whether and for how long the data will be stored, and whether the data may be shared.

Research that includes Red Flags must work with UCI Campus Legal Counsel and the appropriate Privacy Partners: the UCI Health Compliance & Privacy Office (for PHI); the UCI Registrar's Office (for student data); UCI Privacy (for all other personal data, including some PHI and student data); and UCI Information Security, to ensure that the use of the AI product is appropriate at UCI.

Research involving health data must also work with the Health Data Governance Committee.

Follow Federal, UC, Local and International Security Policies

When providing personal information, including protected health information, UC’s Electronic Information Security Policy (IS-3), along with local UC data security policies for researchers, must be followed.

Importantly, the National Institutes of Health (NIH) has noted that institutions that have been issued a Certificate of Confidentiality (CoC) must consider the CoC protections when selecting third parties or vendors. Why? These third parties (contractors, online platform vendors) must provide the same CoC protections, meaning that they must also protect covered information against compelled disclosure.

Provisions under the GDPR and PIPL may also apply if the data subjects are located internationally (e.g., in the EEA, UK, or China).

When AI is used for consent translations, researchers must follow UCI HRP Policy #31. An AI translation must be verified by a human translator whose qualifications align with UCI HRP Policy.

Data Use Agreement

AI should be treated as a third-party vendor.

Any content provided to AI tools can be saved and reused by the tool and its affiliates. Accordingly, agreements must be in place to ensure proper privacy and confidentiality terms, among others. UC must be able to directly negotiate and impose requirements on vendors to defend and indemnify UC against third-party claims. This is important because AI is not always accurate and could infringe on intellectual property rights.

To discuss Data Use Agreements at UCI, please contact Ms. Wanda Seang in the Office of Research.

Questions about getting started?

Please reach out to HRP Staff directly.