| ▲ | ementally 2 hours ago |
| Copilot https://github.com/microsoft/litebox/blob/main/.github/copil... |
|
| ▲ | pjmlp an hour ago | parent | next [-] |
| To be expected, given how many organisations now require employees to use AI if they want to meet their OKRs, especially those that sell AI tools. |
| |
| ▲ | outofpaper 21 minutes ago | parent | next [-] |
What's dumb, on top of everything, is needing to store otherwise ordinary standard operating procedures in AI-specific folders and files just to work with AI tooling. |
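[A minimal sketch of one way to avoid that duplication, assuming a repo whose canonical SOPs live in CONTRIBUTING.md; the filename and wording below are hypothetical, not taken from the linked repo. The AI-specific file just points at the existing doc instead of restating it.]

```
<!-- .github/copilot-instructions.md (hypothetical contents) -->
Our standard operating procedures live in CONTRIBUTING.md at the repo root.
Follow them as written: testing, review, and commit conventions all apply.
Nothing in this file overrides or replaces that document.
```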
| ▲ | andai an hour ago | parent | prev [-] |
https://files.catbox.moe/cq6xf4.png |
| ▲ | UqWBcuFx6NV4r 6 minutes ago | parent [-] |
The funniest thing about this take, and I mean “laugh at you” funny, is that it’s so singularly focused on hating AI that in the process of doing so it takes a CEO soundbite at face value.
I’m sick to death of seeing this image and the million others like it. If you were sleeping on the fact that Microsoft was releasing third-rate software well before GPT-1 existed then that’s on you. That, or you’re simply too young, in which case, go to bed. |
|
| ▲ | embedding-shape 5 minutes ago | parent | prev [-] |
> Extremely simple changes do not require explicit unit tests.

I haven't used Copilot much, because people keep saying how bad it is, but generally if you add escape hatches like this without hard requirements for when the LLM can take them, it won't follow that rule in an intuitive way most of the time. |
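[As an illustration of what a "hard requirement" version of that escape hatch might look like, here is a hedged sketch; the wording and criteria are hypothetical and not the repository's actual instructions.]

```
<!-- Hypothetical copilot-instructions.md excerpt: the escape hatch is kept,
     but gated on explicit, checkable conditions instead of "extremely simple" -->
Every change must include unit tests, except when ALL of the following hold:
- the change touches only comments, documentation, or formatting, and
- no function signature, control flow, or observable behavior changes.
If any condition is uncertain, write the test.
```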