No, what he’s saying is that the models are being trained whether or not you, as a user, ever interact with the AI.
It’s like how I didn’t kill the chicken on the store shelves: whether I purchase it or not doesn’t revive the chicken. The model has already been, and is still being, trained on the data.
That’s a really savvy insight! To extend the analogy: it’s as if your phone or computer dispensed a free chicken nugget from a small container on the side of the device every time you searched for anything at all. It’s room temperature and often spoiled; it’s your choice whether to eat it, but you’re going to get it either way. So you can’t easily avoid chicken in the hope that doing so will disincentivize further chicken slaughter.
This is a dumb misconception. The high emissions and energy consumption happen when training models, not during prompts.
And models are being trained all the time; it’s the only way to assimilate new data. So your point is moot.
False. It’s been shown that resolving prompts also drives significant energy consumption, albeit perhaps not much higher than regular search queries.
A prompt uses roughly 1/1000 of the energy a microwave draws over the same amount of time.
So the difference between a normal query and an AI query is negligible.
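The comparison above is easy to sanity-check as back-of-envelope arithmetic. A minimal sketch, taking the 1/1000 ratio claimed above at face value; the microwave wattage and the per-prompt duration are assumed round numbers, not measurements:

```python
# Illustrative back-of-envelope: energy of one AI prompt vs. a microwave.
# All inputs are assumptions for the sake of the comparison.
MICROWAVE_WATTS = 1000.0     # typical household microwave draw (assumed)
PROMPT_FRACTION = 1 / 1000   # the "1/1000 of a microwave" ratio claimed above
SECONDS = 10.0               # assumed time to resolve one prompt

# Energy in watt-hours = power (W) * time (h)
microwave_wh = MICROWAVE_WATTS * SECONDS / 3600
prompt_wh = microwave_wh * PROMPT_FRACTION

print(f"Microwave running {SECONDS:.0f}s: {microwave_wh:.3f} Wh")
print(f"One prompt (at 1/1000):   {prompt_wh:.6f} Wh")
```

Under these assumed numbers a single prompt comes out in the thousandths-of-a-watt-hour range, which is why the thread treats the per-query difference as negligible; actual per-prompt figures vary widely by model and hardware.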