πŸ“• Node [[chat gpt]]
πŸ“„ chat gpt.md by @agora@botsin.space
πŸ“„ chat gpt.md by @an_agora@twitter.com
πŸ“„ chat gpt.md by @flancian@social.coop
πŸ“„ chat gpt.md by @flancian@twitter.com

Playing a bit with [[chat gpt]], which is impressive: https://t.co/58c1V78KWC.

By the third interaction I’d reached what may become a staple of our future: the moment when you don’t know whether the AI is volunteering only actual information or also making stuff up (I haven’t (1/2)

πŸ“• Node [[chatgpt]] pulled by the Agora

ChatGPT

https://chat.openai.com/

I like ChatGPT a lot in practice, although I'm looking forward to more open alternatives like Mistral catching up.

ChatGPT

For inference (i.e., conversation with ChatGPT), our estimate shows that ChatGPT needs a 500-ml bottle of water for a short conversation of roughly 20 to 50 questions and answers, depending on when and where the model is deployed. Given ChatGPT’s huge user base, the total water footprint for inference can be enormous.

– The Secret Water Footprint of AI Technology – The Markup
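The figures quoted above imply a rough per-exchange cost. A back-of-envelope sketch, using only the numbers from the quote (500 ml per conversation of 20 to 50 exchanges; the function and constant names are mine):

```python
# Back-of-envelope estimate from the quoted figures:
# one short conversation (20-50 question/answer exchanges) ~ one 500 ml bottle.
BOTTLE_ML = 500
MIN_EXCHANGES, MAX_EXCHANGES = 20, 50

def water_per_exchange_ml(bottle_ml: float, exchanges: int) -> float:
    """Average water footprint per question-answer exchange, in millilitres."""
    return bottle_ml / exchanges

# Longer conversations spread the bottle over more exchanges, so less water each.
low = water_per_exchange_ml(BOTTLE_ML, MAX_EXCHANGES)
high = water_per_exchange_ml(BOTTLE_ML, MIN_EXCHANGES)
print(f"Roughly {low:.0f}-{high:.0f} ml of water per exchange")
```

So each individual exchange lands somewhere around 10 to 25 ml on the article's estimate; the quote notes the real figure depends on when and where the model is deployed.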

RT @codexeditor: "What is the missing link between databases and language models?"

#ChatGPT https://t.co/jwi4u3b4X3