| ▲ | arjunchint 7 days ago |
Wait, so they raised all that money just to give it to publishers? Can only imagine the pitch: yes, please give us billions of dollars, we are going to make a huge investment, like paying off our lawsuits.
|
| ▲ | Wowfunhappy 7 days ago | parent | next [-] |
| From the article: > Although the payment is enormous, it is small compared with the amount of money that Anthropic has raised in recent years. This month, the start-up announced that it had agreed to a deal that brings an additional $13 billion into Anthropic’s coffers. The start-up has raised a total of more than $27 billion since its founding in 2021. |
| |
| ▲ | slg 7 days ago | parent | next [-] | | Maybe small compared to the money raised, but it is in fact enormous compared to the money earned. Their revenue was under $1b last year and they projected themselves as likely to make $2b this year. This payout equals their average yearly revenue of the last two years. | | |
| ▲ | masterjack 7 days ago | parent [-] | | I thought they were projecting $10B and said a few months ago that they had already grown from a $1B to a $4B run rate. | | |
| ▲ | slg 7 days ago | parent | next [-] | | Here is an article that discusses why those numbers are misleading[1]. At a high level, "run rate" numbers typically take a monthly revenue figure and multiply it by 12, and that just isn't an accurate way to report revenue, for reasons outlined in that article. When it comes to actual projections, they have said $2b is the most likely outcome for their 2025 annual revenue. [1] - https://www.wheresyoured.at/howmuchmoney/ | |
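(To make the run-rate point concrete, here is a rough sketch using made-up monthly figures, not Anthropic's actual numbers: annualizing the latest month of a fast-growing business can come out to nearly double what the trailing twelve months actually brought in.)

    # Illustrative sketch only: hypothetical monthly revenue in $M, not Anthropic's real figures.
    monthly = [50, 60, 80, 100, 120, 150, 180, 210, 240, 270, 300, 333]
    run_rate = monthly[-1] * 12    # "annualized run rate": extrapolate the latest month -> ~$4.0B
    trailing = sum(monthly)        # revenue actually earned over those 12 months -> ~$2.1B
    print(run_rate, trailing)      # 3996 2093 (in $M): the run rate is nearly double actual revenue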
| ▲ | privatelypublic 7 days ago | parent | prev | next [-] | | It doesn't matter if they end up in Chapter 11... if it kneecaps all the other copyright lawsuits. I won't pretend to know the exact legal details, but I am (unfortunately) old enough that this isn't my first "giant corporation benefits from legally and ethically dubious copyright-adjacent activities, gets sued, settles/wins." (Cough, Google Books.) | | |
| ▲ | utyop22 7 days ago | parent [-] | | Personally I believe that in the ideal scenario (for the federal government) these firms will develop the tech. The feds will then turn around and want those lawsuits to succeed - effectively gutting the firms financially and putting the tech in the hands of the public sector. You never know, it's a game of interests and incentives - but one thing is for sure: does the federal government want the private sector to own and control a technology of this kind? Nope. |
| |
| ▲ | stingraycharles 7 days ago | parent | prev [-] | | But what are the profits? $1.5B is a huge amount no matter what, especially if you’re committing to destroying the datasets as well. That implies you basically paid $1.5B for a few years of additional training data, a huge price. |
|
| |
| ▲ | dkdcio 7 days ago | parent | prev | next [-] | | maybe I’m bad at math but paying out >5% of your capital raised to settle a single lawsuit doesn’t seem great from a business perspective | |
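(Checking that arithmetic against the figures quoted above: a $1.5B settlement against the roughly $27B raised works out to about 5.6%, so the ">5%" claim holds.)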
| ▲ | ryao 7 days ago | parent | next [-] | | If it allowed them to move faster than their competition, I imagine management would consider it money well spent. They are expected to spend absurd amounts of money to get ahead. They were never expected to spend money efficiently if it meant taking additional months/years to get results. | | |
| ▲ | carstenhag 7 days ago | parent [-] | | Someone here commented saying they claimed they did not even use it for training, so apparently it was useless. | | |
| ▲ | ryao 6 days ago | parent [-] | | In that case, this was one of the most expensive no-ops in history. |
|
| |
| ▲ | arrty88 7 days ago | parent | prev | next [-] | | If they are going to be making billions in net income every year going forward, for as many years as analysts can make projections, and using these works allowed them to go to market faster and gain an advantage over competitors, then it is quite great from a business perspective. | |
| ▲ | bongodongobob 7 days ago | parent | prev | next [-] | | Yeah it does. The cost of materials would be way more than that if they were building something physical, like a new widget. Same idea: they paid for their raw materials. | |
| ▲ | siliconpotato 7 days ago | parent | prev [-] | | It's VC money, I don't think anyone believes it's real money | | |
| ▲ | Aachen 7 days ago | parent | next [-] | | If it weren't, why are we taking it as legal tender? I certainly wouldn't mind being paid in VC money | |
| ▲ | 7 days ago | parent | prev [-] | | [deleted] |
|
| |
| ▲ | xnx 7 days ago | parent | prev [-] | | The money they don't pay out in settlements goes to Nvidia. |
|
|
| ▲ | jongjong 7 days ago | parent | prev | next [-] |
Isn't that how the whole system operates? Everyone is a conduit to allow rich people to enrich themselves further. The amount and quality of opportunities any individual receives are proportional to how well it serves existing capital. So long as there is an excuse to justify the money flows, that's fine; big capital doesn't really care what the excuse is, as long as it is persuasive enough to satisfy the regulators and the judges. The money flows happen first, and only later do people come up with good narratives. That's exactly what happened in this case. They paid the authors a lot of money as a settlement and agreed on a narrative that works for both sides: that training was fine and it was the pirating that was the problem... It's likely why they settled; they preferred to pay a lot of money and agree on some false narrative that works for both groups rather than risk a precedent that AI training on copyrighted material is illegal; that would have been the biggest loss for them.
| |
| ▲ | danans 7 days ago | parent [-] | | > Isn't that how the whole system operates? Everyone is a conduit to allow rich people to enrich themselves further. The amount and quality of opportunities any individual receives are proportional to how well it serves existing capital. Yes, and FWIW that's very succinctly stated. | | |
| ▲ | utyop22 7 days ago | parent [-] | | Sort of. Some individuals in society find a way through that and figure out a way to strategically achieve their goals. Rare though. |
|
|
|
| ▲ | non_aligned 7 days ago | parent | prev | next [-] |
| You're joking, but that's actually a good pitch. There was a significant legal issue hanging over their heads, with some risk of a potentially business-ending judgment down the line. This makes it go away, which makes the company a safer, more valuable investment. Both in absolute terms and compared to peers who didn't settle. |
| |
| ▲ | freejazz 7 days ago | parent [-] | | It just resolves their liability with regard to books they claim they did not even train the models on, which is all that was left in this case after summary judgment. Sure, the potential liability was company-ending, but it was a stupid business decision when it was ultimately over books they did not even train on. It basically does nothing for them besides that. Given the split decisions so far, I'm not sure what value the Alsup decision is going to bring to the industry going forward, when it's in the context of books that Anthropic physically purchased. The other AI cases generally don't involve fact patterns where the LLM was trained on copyrighted materials the AI company had legally purchased copies of. |
|
|
| ▲ | freejazz 7 days ago | parent | prev [-] |
| They wanted to move fast and break things. No one made them. |