Cautionary Notes on Generative AI
Many unresolved questions about the legal, ethical, technical, and practical dimensions of generative Artificial Intelligence (“genAI”) tools remain, and easy answers seem unlikely to arrive anytime soon. Regardless, these questions–I think–should shape how we each decide to use genAI in our work and personal lives. For a number of the reasons below, I don’t use any genAI tools, and if you choose to work with me, you can count on the work I create being the product of the experience and instincts I’ve honed over my career thus far.
I’m not creating this page to shame anyone, but rather to collate some useful resources on this topic and to offer food for thought as you consider whether and how to use these tools yourself.
Critical Thinking
Studies of how AI use affects brain activity and critical thinking skills are beginning to be conducted and published. In one such study, using generative AI tools was linked to reduced critical thinking, memory retention, and neural connectivity (Kosmyna et al., 2025). Frequent AI use has also been found to correlate negatively with critical thinking ability in adults aged 17-45, with the effect strongest in the youngest age group studied, 17-25 year olds (Gerlich, 2025). In work contexts, participants reported doing less critical thinking the more they trusted genAI outputs (Lee et al., 2025).
Corporate Oversight
Generative AI engines are developed and owned by corporations, but legal safeguards protecting customers–or the public in general–are still in their infancy (Yakimova & Ojamo, 2022). In 2022, for example, Google fought a 13-month legal battle (Rogoway, 2022) to avoid disclosing that its data centers were using 25 percent of the drinking water in The Dalles, Oregon (Osaka, 2023).
Environmental Impact
Using AI directly and substantially contributes to climate change (United Nations Environment Programme, 2024).
AI Queries – A ChatGPT query uses about five times the energy of an equivalent web search (Zewe, 2025). A 100-word email written by ChatGPT (GPT-4) uses 519 mL (17.6 oz) of water–roughly a single-use water bottle (Verma & Tan, 2024).
Data Centers – Data centers use 10-50 times the energy of a commercial office building of the same size (US Department of Energy, 2026). The electricity powering these data centers is still largely generated by burning fossil fuels. Data centers also tend to be built where land is cheap, and water is often scarce in such places (Osaka, 2023). A large data center may use 1-5 million gallons of water a day, comparable to the daily water usage of a town of 10,000-50,000 people (Osaka, 2023). In 2022, for example, an average Google data center used 450,000 gallons of water each day (Hölzle, 2022). In 2018, United States data centers used just under 136 billion gallons of water (Siddik et al., 2021).
Bias
If the information going into a model is inaccurate, biased, or fabricated, its outputs will be too. AI suffers from a range of such biases (Nazer et al., 2023). GenAI used in hiring decisions is particularly problematic, as it reflects the biases built into current hiring pools and industry demographics. Amazon had to scrap its AI hiring tool after finding it penalized resumes that included the word “women” (Dastin, 2018). A genAI-based speech recognition system used by HireVue to make hiring decisions was found to disadvantage deaf and non-white applicants (Stein & Carrick, 2025).
Intellectual Property
People’s writing, art, code, analyses, data, etc. are fed into these systems without consent, attribution, or compensation (Appel et al., 2023). GitHub itself acknowledges that the “model that powers Copilot is trained on a broad collection of publicly accessible code, which may include copyrighted code” (GitHub, 2026). This is–in my opinion–problematic in and of itself, but it is worsened by the use of genAI tools to displace the very creators whose work was used to train the models!
Output Reliability
Anecdotally, many of my colleagues (almost exclusively working data scientists or academics) have commented on how common genAI “hallucinations” (i.e., entirely fabricated, incorrect, or otherwise unreliable outputs) remain–even in the most recent versions of the more popular LLM tools (e.g., GitHub Copilot, ChatGPT, Claude, Gemini).
This unreliability means that all genAI-produced outputs need to be carefully vetted (which I support!), but it also means that the efficiency gains of using AI may be smaller than proponents claim. One colleague told me that it takes them as long to double-check the code outputs of genAI as it would to just write the code themselves from the start.
Resources
- Appel, G., Neelbauer, J., & Schweidel, D.A. 2023. Generative AI Has an Intellectual Property Problem. Harvard Business Review Online
- Dastin J., 2018. Amazon Scraps Secret AI Recruiting Tool That Showed Bias Against Women. Reuters
- Gerlich, M. 2025. AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking. Societies
- GitHub. 2026. GitHub Copilot - Your AI Pair Programmer. Accessed March 21, 2026
- Hölzle, U. 2022. Our Commitment to Climate-Conscious Data Center Cooling. Google, The Keyword
- Kosmyna, N., Hauptmann, E., Yuan, Y.T., Situ, J., Liao, X., Beresnitzky, A.V., Braunstein, I., & Maes, P. 2025. Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task. arXiv. https://arxiv.org/abs/2506.08872
- Lee, H.P., Sarkar, A., Tankelevitch, L., Drosos, I., Rintel, S., Banks, R., & Wilson, N. 2025. The Impact of Generative AI on Critical Thinking: Self-Reported Reductions in Cognitive Effort and Confidence Effects From a Survey of Knowledge Workers. Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems
- Nazer, L.H., Zatarah, R., Waldrip, S., Ke, J. X. C., Moukheiber, M., Khanna, A. K., Hicklen, R.S., Moukheiber, L., Moukheiber, D., Ma, H., & Mathur, P. 2023. Bias in Artificial Intelligence Algorithms and Recommendations for Mitigation. PLOS Digital Health
- Osaka, S. 2023. A New Front in the Water Wars: Your Internet Use. Washington Post
- Verma, P. & Tan, S. 2024. A Bottle of Water Per Email: The Hidden Environmental Costs of Using AI Chatbots. Washington Post
- Rogoway, M. 2022. The Dalles Settles Public Records Lawsuit Over Google’s Data Centers, Will Disclose Water Use to The Oregonian/OregonLive. The Oregonian
- Siddik, M.A.B., Shehabi, A., & Marston, L. 2021. The Environmental Footprint of Data Centers in the United States. Environmental Research Letters
- Stein, L. & Carrick, D. 2025. AI Job Recruitment Tools Could ‘Enable Discrimination’ Against Marginalised Groups, Research Finds. ABC News, Law Report
- United Nations Environment Programme. 2024. AI Has an Environmental Problem. Here’s What the World Can Do About That. https://www.unep.org/news-and-stories/story/ai-has-environmental-problem-heres-what-world-can-do-about
- United States Department of Energy. 2026. Data Centers and Servers. Accessed March 21, 2026
- Yakimova, Y. & Ojamo, J. 2022. Artificial Intelligence Act: Deal on Comprehensive Rules for Trustworthy AI. European Parliament Press Release
- Zewe, A. 2025. Explained: Generative AI’s Environmental Impact. MIT News