Big Green AI: Balancing Innovation with Sustainability

Last modified on March 25, 2026 • 3 min read • 546 words

Dartmouth is known as the Big Green, a nickname that reflects not only the color of our logo but also our shared commitment to a healthy planet. By protecting the environment, we create a stronger foundation for learning, discovery, and innovation, and those same principles guide our approach to teaching, research, and the responsible use of AI.

In this way, AI becomes a tool that amplifies our ability to study climate systems, develop sustainable solutions, and enrich every classroom and laboratory on campus.

Local models  

Local models run on Dartmouth-owned computers. Data stays on Dartmouth infrastructure, the models are small enough to keep energy use low, and there is no token budget, so you can use them as often as needed.

Cloud models  

Cloud models are hosted by external providers such as Google, OpenAI, or Anthropic. They often provide richer answers, but each request uses tokens from your daily allowance, sends your data to a third-party data center, and typically consumes more power from distant servers.

For more information, see Dartmouth Chat FAQs.

User-installed local models  

User-installed local models are small open-source models that individuals can run on their own laptops or desktops. Because they stay on the user’s machine, they carry no token cost, have a minimal energy footprint, and offer the highest level of data privacy.

This option is especially useful for developers and researchers who want full control over custom workflows without relying on campus or cloud infrastructure.

Where the hardware lives  

Dartmouth Chat’s compute resources for local models reside in the Massachusetts Green High-Performance Computing Center (MGHPCC) in Holyoke, Massachusetts. MGHPCC is a joint venture among Dartmouth, Boston University, Harvard, MIT, Northeastern, the University of Massachusetts system, and Yale.

The facility is designed to maximize energy efficiency and minimize environmental impact, and it has earned LEED Platinum certification.

Because the center runs on shared infrastructure that is continuously tuned for low power per computation, the carbon intensity of a local request is generally lower than that of a comparable request to a typical commercial cloud service.

Rightsizing the model  

Not every task requires the largest or most powerful model. By selecting a model that is just large enough for the job, we gain several benefits:

  • Lower energy use
  • No token cost for Dartmouth-hosted local models
  • Faster turnaround for users

A simple workflow might look like this:

  • Start with a local model for routine queries such as factual lookups, spell-checking, or simple calculations.
  • Escalate to a cloud model when the task requires deeper reasoning, broader synthesis, or specialized knowledge.
  • If you have the expertise and want complete control, run a user-installed model on your own machine.
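The escalation workflow above can be sketched as a simple routing function. This is a minimal illustration, not Dartmouth Chat’s actual implementation; the tier names and the deep-reasoning flag are assumptions chosen for clarity.

```python
def choose_model_tier(prompt: str, needs_deep_reasoning: bool = False) -> str:
    """Return the smallest model tier likely to handle a prompt well.

    Illustrative sketch only: the tiers ("local", "cloud") and the
    needs_deep_reasoning flag are assumptions, not a real policy.
    """
    if needs_deep_reasoning:
        # Deeper reasoning, broader synthesis, or specialized knowledge:
        # escalate to a cloud model (uses the daily token allowance).
        return "cloud"
    # Routine queries (factual lookups, spell-checks, simple
    # calculations) stay on a local model: no token cost, lower energy.
    return "local"


print(choose_model_tier("Check the spelling of 'accommodate'"))   # local
print(choose_model_tier("Synthesize these five reports", True))   # cloud
```

In practice the decision is the user’s: pick the local tier by default and reach for a cloud model only when the task clearly needs it.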

Why AI and the environment are connected  

Conversations about AI often return to the environment because the compute that powers modern language models draws electricity, and electricity has a carbon cost.

Dartmouth’s compute home in a LEED Platinum data center, the ability to run unlimited local models on campus hardware, and the option for users to run small models on their own machines give the community concrete ways to reduce impact while still benefiting from AI.

By understanding where prompts travel, choosing the right-sized model for each task, and grounding decisions in real data, Dartmouth can keep the Big Green spirit alive while still using cutting-edge AI responsibly.