ChatGPT
I like ChatGPT a lot in practice, although I'm looking forward to more open alternatives like Mistral catching up.
For inference (i.e., conversation with ChatGPT), our estimate shows that ChatGPT needs a 500-ml bottle of water for a short conversation of roughly 20 to 50 questions and answers, depending on when and where the model is deployed. Given ChatGPT's huge user base, the total water footprint for inference can be enormous.
– The Secret Water Footprint of AI Technology – The Markup
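A quick back-of-envelope calculation of what the quoted figure implies per exchange; the 500-ml bottle and the 20-to-50-exchange range are the article's estimates, and the per-exchange numbers below are simply derived from them:

```python
# Derive a per-exchange water estimate from the figures quoted above:
# a 500-ml bottle per short conversation of 20-50 question/answer exchanges.
BOTTLE_ML = 500
EXCHANGES_LOW, EXCHANGES_HIGH = 20, 50

# Fewer exchanges per bottle means more water per exchange, and vice versa.
ml_per_exchange_high = BOTTLE_ML / EXCHANGES_LOW   # 25.0 ml at 20 exchanges
ml_per_exchange_low = BOTTLE_ML / EXCHANGES_HIGH   # 10.0 ml at 50 exchanges

print(f"{ml_per_exchange_low:.0f}-{ml_per_exchange_high:.0f} ml of water per exchange")
```

So each question-and-answer pair costs on the order of 10 to 25 ml of water under the article's assumptions.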
#ChatGPT #UBI
---
RT @JMSifter
at my uncle’s funeral, one of my great aunts took to asking me all kinds of stuff about chatGPT (I’m the extended family’s “smart internet guy”)
all on her own, she said 1) it’s obvious now it will take all jobs, 2) UBI is the only thing we can do and 3) her gen must die first
https://twitter.com/JMSifter/status/1623741892486742018
#ChatGPT speaking [[agorese]] :)
RT @codexeditor: "What is the missing link between databases and language models?"
#ChatGPT https://t.co/jwi4u3b4X3
RT @mylesbyrne: @codexeditor #ChatGPT:
‘To not integrate KGs, the semweb, and LLMs in the open web would therefore be illogical.’ https://…
RT @codexeditor: "What is the next development in the modelling of ontologies?"
#ontology #ChatGPT https://t.co/TCXrkC2PSW
#ChatGPT Does not know what a garden path sentence is and will fight me over it https://t.co/luHG0DNaLz