falcor84 | 8 hours ago
If you pay for Copilot Business/Enterprise, they actually offer IP indemnification and support in court, if needed, which is more accountability than you would get from human contributors. https://resources.github.com/learn/pathways/copilot/essentia...
|
blibble | 2 hours ago
Nine lines of code came close to costing Google $8.8 billion. How much use do you think these indemnification clauses will be if training ends up being ruled not to be fair use?
falcor84 | an hour ago

Are you concerned that this will bankrupt Microsoft?
tsimionescu | an hour ago

I think they're afraid they would have to sue Microsoft to get them to abide by the promise to come to their defense in another suit.
blibble | an hour ago

It would be nice, wouldn't it? Poetic justice for a company founded on the idea of not stealing software.
|
|
|
christoph-heiss | 8 hours ago
I think the fact that they felt the need to offer such a service says everything: it's basically an admission that LLMs plagiarize and violate licenses.
|
jayd16 | 8 hours ago
Does that cover any random contribution claiming to be AI-generated?
falcor84 | 3 hours ago

Their docs say:

> If any suggestion made by GitHub Copilot is challenged as infringing on third-party intellectual property (IP) rights, our contractual terms are designed to shield you.

I'm not actually aware of a situation where this was needed, but I assume MS might have some tools to check whether a given suggestion was, or is likely to have been, generated by Copilot rather than some other AI.