Official Guidance on Generative AI Use
Georgia Tech's Office of Information Technology, in partnership with the Office of General Counsel, Institute Communications, and Cyber Security, has crafted general guidance on the use of GenAI. While we understand that nuances exist with respect to GenAI use in research and academic discovery, these guidelines outline approved practices for the general use of GenAI for work optimization and enhancement. The guidance will be updated regularly as the Institute reviews the rapidly evolving role of AI at Georgia Tech.
Generative AI Guidance for Privacy and Security
Georgia Tech's Generative AI Guidance for Privacy and Security is provided to inform the use of Artificial Intelligence (AI). Highlights of this guidance include, but are not limited to, the following:
- Georgia Tech faculty, staff, and students shall not submit any personally identifiable information (PII), Protected Data, Regulated Data, or Georgia Tech Organizational Data into any AI tool.
- All faculty, staff, and students are expected to adhere to the security best practices outlined within the complete guidance document when using AI tools.
It is important that AI users read this guidance in its entirety before leveraging a generative AI tool. Guidance will be updated as the Institute continues to review the role of AI at Georgia Tech and explores formal contracts and agreements with AI vendors.
AI Dos and Don'ts
The following is a list of key dos and don'ts for using generative AI. Adhering to these guidelines will help you maximize your use of the technology while minimizing risk and ensuring the ethical, responsible, and effective use of AI in your work.
- DO cooperate with a risk review for new AI tools and existing tools that have not been approved by OIT.
- DO disclose the use of AI tools for relevant activities and the manner and scope of such use.
- DO exercise caution when using data scraped from the internet to train AI models.
- DO verify the accuracy and validity of AI-generated content against reliable sources.
- DO review AI-generated content for bias and remediate as necessary.
- DO regularly review and comply with all official AI guidance and key GT data policies (i.e., the Data Privacy Policy, Personal Information Privacy Policy, Data Governance and Management Policy, and Protected Data Practices).
- DO ensure vendor contract terms align with safe and responsible AI governance.
- DO report any potential data incident or breach to soc@gatech.edu.
- DON’T enter any sensitive, protected, regulated, or confidential data into an AI tool, regardless of whether the tool is approved for use.
- DON’T assume that public data is free of intellectual property rights.
- DON'T use AI tools that have not completed a Third-Party Security Assessment.
- DON’T use AI tools that do not meet security standards set forth by the Office of Information Technology.
Tool Availability by Security Review Status
All technology services and tools offered at the Institute must undergo a comprehensive review, which includes a series of third-party risk assessments, privacy and security analysis, contractual negotiations, financial and technical support impact assessments, and data stewardship approvals. Georgia Tech's Generative AI Guidance for Privacy and Security highlights this process as a requirement before any tool is leveraged with protected or regulated data or integrated with existing Georgia Tech services or systems. With this in mind, OIT has disabled the use of third-party AI plugins pending a comprehensive risk review.
The following table shows the most up-to-date status of AI tools that are either available or under investigation, based on each tool's current security review. This list will be updated as additional tools are assessed.
Note: We realize that specific AI tools used in academic and research spaces have been reviewed and approved at a college, school, or departmental level. The following list does not include those tools; it outlines tools that are being explored at an Institute-wide level.