This tool estimates how much electricity your chatbot messages consume


Ever wonder how much electricity you’re using when you prompt, or thank, an AI model? Hugging Face engineer Julien Delavande did, so he built a tool to help arrive at the answer.

AI models consume energy each time they’re run, typically on GPUs and other specialized chips that need a lot of power to carry out the associated computational workloads at scale. It’s not easy to pin down a model’s power consumption, but growing usage of AI technologies is widely expected to drive electricity needs to new heights in the next couple of years.

The demand for more power to fuel AI has led some companies to pursue environmentally unfriendly strategies. Tools like Delavande’s aim to bring attention to this, and perhaps give some AI users pause.

“Even small energy savings can scale up across millions of queries — model choice or output length can lead to major environmental impact,” Delavande and the tool’s other creators wrote in a statement.

Delavande’s tool is designed to work with Chat UI, an open-source front end for models like Meta’s Llama 3.3 70B and Google’s Gemma 3. It estimates the energy consumption of messages sent to and from a model in real time, reporting results in watt-hours or joules, and compares that usage to the energy draw of common household appliances, like microwaves and LEDs.
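
The article doesn’t spell out how the estimate is computed, but a common back-of-the-envelope approach is to multiply generation time by an assumed average hardware power draw. The Python sketch below illustrates that idea only; the function name, the 350 W figure, and the two-second response time are assumptions made for the example, not details of Delavande’s tool.

```python
# Back-of-the-envelope estimate of the energy used by one model response.
# Assumption for illustration: energy ≈ generation time × average hardware power.

def estimate_energy_wh(generation_seconds: float, avg_power_watts: float) -> float:
    """Rough per-response energy estimate in watt-hours."""
    return generation_seconds * avg_power_watts / 3600  # 3,600 seconds per hour

# Hypothetical values: a 2-second response on hardware averaging 350 W.
print(f"~{estimate_energy_wh(2.0, 350.0):.4f} Wh")  # prints ~0.1944 Wh
```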

According to the tool, asking Llama 3.3 70B to write a typical email uses approximately 0.1841 watt-hours, equivalent to running a microwave for 0.12 seconds or using a toaster for 0.02 seconds.
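
The unit conversions behind a figure like that are simple arithmetic. The sketch below converts the reported 0.1841 watt-hours into joules and shows how a per-message number adds up at volume; the one-million-message count is an assumed figure, chosen only to illustrate the kind of scaling the creators mention.

```python
# Unit conversions behind the reported per-email figure.
WH_TO_JOULES = 3600  # 1 watt-hour = 3,600 joules

per_email_wh = 0.1841                       # figure reported for Llama 3.3 70B
per_email_joules = per_email_wh * WH_TO_JOULES

# Assumed volume, only to illustrate how per-message energy scales up.
messages = 1_000_000
total_kwh = per_email_wh * messages / 1000  # 1 kWh = 1,000 Wh

print(f"{per_email_wh} Wh ≈ {per_email_joules:.0f} J per email")
print(f"Across {messages:,} such emails: ≈ {total_kwh:.1f} kWh")
```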

It’s worth remembering that the tool’s estimates are only that — estimates. Delavande makes no claim that they’re incredibly precise. Still, they serve as a reminder that everything — chatbots included — has a cost.

“With projects like the AI energy score and broader research on AI’s energy footprint, we’re pushing for transparency in the open source community. One day, energy usage could be as visible as nutrition labels on food!” Delavande and his co-creators wrote.




