Sam Altman, CEO of OpenAI, stated that an average ChatGPT query consumes about 0.34 watt-hours of energy, comparable to running an oven for roughly a second or an efficient lightbulb for a couple of minutes.
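To put the 0.34 Wh figure in perspective, the sketch below converts it into appliance run times. The oven and bulb wattages are illustrative assumptions, not values provided by OpenAI.

```python
# Back-of-the-envelope check of the claimed 0.34 Wh per query.
# Appliance wattages below are assumptions for illustration only.
QUERY_WH = 0.34  # claimed energy per "average" ChatGPT query, in watt-hours

APPLIANCES_W = {
    "electric oven element (assumed ~1,200 W)": 1200,
    "LED lightbulb (assumed ~10 W)": 10,
}

for name, watts in APPLIANCES_W.items():
    seconds = QUERY_WH / watts * 3600  # convert Wh to watt-seconds, divide by power
    print(f"{QUERY_WH} Wh ≈ {seconds:.1f} s of running a {name}")
```

Under these assumed wattages, the query works out to about one second of oven use or about two minutes of lightbulb use, which is how the comparison is usually framed.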
Concerns center on the lack of context for this figure, including how an "average" query is defined and whether training and server cooling are factored into the energy use.
Experts such as Sasha Luccioni of Hugging Face question the credibility of Altman's estimate and point to the absence of detailed supporting information from OpenAI.
Research underscores the urgent need for transparency about AI's environmental impact, noting that 84% of large language model (LLM) usage involves models with no environmental disclosures.
Discrepancies persist among energy-consumption claims: the widely repeated assertion that a ChatGPT request uses ten times the energy of a Google search lacks solid evidence and traces back to unverified statements.
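For context on where the "ten times" figure typically comes from, the sketch below divides two commonly repeated per-query estimates. Both inputs are unverified assumptions drawn from secondary reporting, not measured values.

```python
# Illustrative origin of the often-quoted "10x" ratio.
# Both per-query figures are unverified assumptions repeated in secondary reporting.
GOOGLE_SEARCH_WH = 0.3   # per-search estimate Google published back in 2009
CHATGPT_QUERY_WH = 3.0   # rough third-party per-query estimate, not confirmed by OpenAI

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"Implied ratio: {ratio:.0f}x")  # -> 10x, but only as good as its inputs

# Note: using Altman's 0.34 Wh figure instead would put the ratio close to 1x,
# which is exactly why the comparison is contested.
```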
Vendor Risk Awareness: Evaluate vendor protections, especially for generative AI, and confirm whether the vendor will provide support and legal backing if issues arise.
Indemnification by AI Vendors: Major vendors such as Google, Amazon, and IBM offer indemnification to protect customers against claims of unintentional copyright infringement.
Training and Cultural Shift: Effective adherence to AI policies requires comprehensive training and a cultural shift, reflected in staffing decisions that emphasize AI governance.
Role of IT in AI Readiness: IT decisions must align with budget constraints and organizational readiness, which shapes how workloads are divided between on-premises and cloud environments.
Strategic Infrastructure Decisions: Moving workloads back on-premises is often driven by cost concerns, while cloud resources remain beneficial when knowledgeable staff are in place to manage them.
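As a rough illustration of the cost reasoning behind such repatriation decisions, the sketch below compares cumulative cloud spend against an up-front on-premises investment. Every figure is a hypothetical placeholder, not data from any vendor or survey.

```python
# Hypothetical break-even comparison between cloud and on-premises hosting.
# All cost figures are made-up placeholders for illustration only.
CLOUD_MONTHLY = 12_000   # assumed monthly cloud bill for a given workload
ONPREM_CAPEX = 250_000   # assumed up-front hardware and setup cost
ONPREM_MONTHLY = 4_000   # assumed monthly power, space, and staff overhead

def cumulative_cost(months: int) -> tuple[float, float]:
    """Return (cloud, on_prem) cumulative cost after `months`."""
    cloud = CLOUD_MONTHLY * months
    on_prem = ONPREM_CAPEX + ONPREM_MONTHLY * months
    return cloud, on_prem

# Find the first month where on-prem becomes cheaper under these assumptions.
month = 1
while cumulative_cost(month)[0] < cumulative_cost(month)[1]:
    month += 1
print(f"Under these assumptions, on-prem breaks even around month {month}.")
```

The point of the sketch is only that the answer hinges on the assumed utilization, staffing, and capital costs, which is why the article frames the on-premises versus cloud choice as a readiness question rather than a universal rule.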