thewebguyd 3 days ago
> There is like one or two really clever uses I've seen - disappointingly, one of them was Jira. The internal jargon dictionary tool was legitimately impressive. Will it make any more money? Probably not.

Sounds like Microsoft 365 Copilot at my org. It sucks at nearly everything, but it actually makes a fantastic search engine for emails, Teams convos, SharePoint docs, etc. Much better than Microsoft's own global search stuff.

Outside of coding, that's the only other real-world use case I've found for LLMs - "get me all the emails, chats, and documents related to this upcoming meeting" - and it's pretty good at that. Though I'm not sure we should be killing the earth for better search; there are probably other, better ways to do it.
ljf 3 days ago
Agreed - 95% of the questions I ask Copilot, I could answer myself by searching emails, Teams messages, and files - BUT Copilot does a far, far better job than me, and quicker. I went from barely using it to using it daily. I wouldn't say it is a massive speed boost for me, but I'd miss it if it was taken away.

Then the other 5% is the 'extra' it does for me, getting me details I wouldn't have even known where to find. So far it is just fancy search for me - but fancy search I see as valuable.
tasty_freeze 3 days ago
My favorite Copilot use is when I join an MS Teams meeting a few minutes late: I can ask Copilot, "What have I missed?" It does a fantastic job of summarizing who said what.
kyledrake 3 days ago
> Though I'm not sure we should be killing the earth for better search

Are we, though? What I have read so far suggests the carbon footprint of training models like GPT-4 was "a couple weeks of flights from SFO to NYC": https://andymasley.substack.com/p/individual-ai-use-is-not-b...

They also seem to be coming down substantially in power usage, at least for inference. There are pretty good models that can run on laptops now, and I still very much think we're in the Model T phase of this technology, so I expect further efficiency refinements. It also seems like models have recently hit a "cap" on the intelligence they gain from more raw power.

The trendline right now makes me wonder if we'll be talking about "dark datacenters" in the future the same way we talked about dark fiber after the dot-com bubble.