

It is better than the competition, but it will never be like pre-2019 Google, because they'll never build their own index.
My guess would be that using a desktop computer to type the queries and read the results consumes more energy than the LLM inference itself, at least for models that answer quickly.
The expensive part is training a model, but usage is most likely not sold at a loss, so it can't be consuming an unreasonable amount of energy.
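For what it's worth, here is a rough back-of-envelope sketch of that comparison. All the numbers are assumed round figures, not measurements: a desktop plus monitor drawing around 150 W, a couple of minutes spent on the query and answer, and a commonly cited per-query inference range of a few tenths of a Wh up to a few Wh.

    # Back-of-envelope comparison; every figure below is an assumption.
    desktop_power_w = 150            # assumed desktop + monitor draw (watts)
    time_per_query_h = 2 / 60        # assumed minutes spent typing/reading, in hours
    desktop_wh = desktop_power_w * time_per_query_h   # client-side energy per query

    llm_wh_low, llm_wh_high = 0.3, 3.0   # assumed per-query inference range (Wh)

    print(f"desktop side: {desktop_wh:.1f} Wh per query")
    print(f"LLM side:     {llm_wh_low}-{llm_wh_high} Wh per query")
    # With these assumptions the client machine lands around 5 Wh,
    # on the order of (or above) the inference estimate.

Under those assumptions the desktop side comes out comparable to or larger than the inference side, which is the point of the guess above; change the assumed numbers and the conclusion can flip.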
Instead of this ridiculous energy argument, we should focus on the fact that AI (and other products money is thrown at) isn't actually that useful, but companies control the narrative. AI is particularly successful here, with every CEO wanting in on it and people afraid it is so good it will end the world.