anematode 2 days ago

Who could have seen this one coming. From yesterday: https://www.cbsnews.com/news/google-ai-pentagon-classified-u... ("Hundreds of Google workers urge CEO to refuse classified AI work with Pentagon").

Any AI researcher who continues to work here is morally compromised.

orochimaaru 2 days ago | parent | next [-]

Why is it morally wrong for a US citizen to work with their government?

finghin 2 days ago | parent | next [-]

The acts of the government being wrong in an upsetting number of cases would be a big reason.

fooker 2 days ago | parent | prev | next [-]

Because, we have pretty convincing historical precedent that 'just following orders' does not work as a defense when your government does something indefensible.

ReptileMan 2 days ago | parent [-]

Worked just well for the paperclip guys.

citadel_melon 19 hours ago | parent [-]

Let’s steel-man the parent comment. Obviously “just following orders” is not generally a morally sufficient argument even if you end up not facing repercussions for your actions.

tyre 2 days ago | parent | prev | next [-]

It’s not, but legal is not the same as ethical.

For a long time, and probably still, it was legal for the US to torture enemy combatants. It was never ethical.

rob74 2 days ago | parent | next [-]

If you add to that the very broad limits of what the current administration considers "legal" (as in "pretty much anything we want to do"), I can understand feeling uneasy as a Google employee...

gigatree 2 days ago | parent | prev [-]

You’d need some shared ethical/moral framework to make that claim, which doesn’t really seem to exist anymore

yibg 2 days ago | parent [-]

You don't need a shared moral framework to come to a personal moral conclusion.

lo_zamoyski 2 days ago | parent [-]

What does that mean? How does one come to a personal moral conclusion? Vibes?

(I take "moral framework" to mean a principled stance that gives objective grounding for a moral judgement. I agree that we can come to a moral judgement without putting it through a systematic and discursive defense, and I reject the notion that there are many moralities or that they are arbitrary, but it is also true that diverging conceptions of the basis of morality will frustrate agreement. Stopping at personal moral judgement does not lend itself to fruitful dialogue and understanding, as it constraints the domain of what is intersubjectively knowable.)

yibg 16 hours ago | parent [-]

My moral framework can be different from yours. Me the individual can come to the conclusion that something is immoral when the rest of the group doesn’t agree with me. And (at least for my own moral framework) I should take action accordingly.

So I don’t need a shared framework to make the claim that something is immoral (to me).

josefx 2 days ago | parent | prev | next [-]

What makes you think that Google's AI experts are US citizens?

kube-system 2 days ago | parent [-]

100% of the Google employees who would be working on "classified AI work" are US citizens by law.

mattlondon 2 days ago | parent [-]

So what, they won't be using any of the existing Google Gemini models or infra then? Because all of Google - from Gemini to the data center infra etc - has been (and still is being) worked on by non-US persons, even - gasp - outside the US. They'll do a complete clean-room ground-up bootstrap of all the research and infrastructure from zero?

Seems unlikely.

kube-system 2 days ago | parent [-]

You of course don't have to reinvent science, but it is in fact standard practice to do infrastructure from the literal ground up with US citizens for even unclassified government data.

https://aws.amazon.com/govcloud-us/

AlotOfReading a day ago | parent [-]

Can you provide a different source on that? The govcloud page you've linked says operated by US citizens, not built by US citizens. I'd be pretty surprised if they did the latter. Standard practice as I understand it is to simply run the standard software in a separate environment. A recent Propublica report [0] pointed out that Microsoft was hiring citizens to escort the actual engineers that aren't citizens, for example.

[0] https://www.propublica.org/article/microsoft-digital-escorts...

hashmap 2 days ago | parent | prev | next [-]

working to directly advance a product used substantially to oppress people via surveillance or war crimes, when you have many other choices, is immoral. easy.

_vertigo 2 days ago | parent | prev | next [-]

It’s not morally wrong per-se but just because you are working with your government does not mean what you’re doing is necessarily moral

cooper_ganglia 2 days ago | parent [-]

Just because you are working with your government does not mean what you’re doing is necessarily immoral, either.

_alternator_ 2 days ago | parent | next [-]

Correct. It depends. For example, it might depend on what the collaboration is likely to result in. Perhaps it would be more likely to be moral if there were some boundaries in place, like "no mass domestic surveillance" or "no fully autonomous weapons".

Because the US government currently believes it is legal to blow up civilian drug traffickers and wage war without congressional approval. So at some point, yes, collaboration is immoral.

nradov 2 days ago | parent [-]

The US military has deployed fully autonomous weapons since at least 1979, and potential adversaries are now doing the same. For better or worse that ship has sailed.

_alternator_ 2 days ago | parent | next [-]

Look, a dumb bomb is a fully autonomous weapon once it's launched. Let's be real: an LLM making decisions on who to target and when and where to launch munitions represents a meaningful change in our concept of autonomous weapons.

Forgeties79 2 days ago | parent | prev [-]

So we are wrong to express any opposition or desire to maybe raise the bar here? Aren’t we supposed to be “the good guys”? Or should we just accept a role as the menace of the world, wildly throwing its weight around whenever we have an unscrupulous president?

nradov 2 days ago | parent [-]

Those questions are moot. There are situations where it's simply impossible to have a human in the loop because reaction time is too slow or the environment is too dangerous or communication links are unreliable. Russia is deploying fully autonomous weapons to attack Ukraine today and they will be selling those weapons (or licensing the technology) to their allies. There is no option to stop. And let's please not have any nonsense suggestions that we can somehow convince Russia / China / Iran / North Korea to sign a binding, enforceable treaty banning such weapons: that's never going to happen.

t-3 2 days ago | parent | next [-]

There's always an option to stop. We can choose civility over barbarity, stop trying to kill people over 1000+ year old dick waving contests, and stop threatening each other with doomsday weapons because your grandpa shot my grandpa. Just because our leaders are too stupid and cowardly doesn't mean there's no option.

nradov 2 days ago | parent [-]

Sounds good! Please convince Vladimir Putin to choose civility over barbarity, then get back to us so we can discuss options.

convolvatron 2 days ago | parent | next [-]

I wasn't aware that the US was throwing away its moral compass for the just cause of frustrating Putin's expansionism. The new story seems to be Putin gets to do what he wants, and so do we.

nradov 2 days ago | parent [-]

If you think there's something wrong with giving our warfighters the most effective weapons to carry out their assigned missions with minimum casualties then your moral compass is completely broken. Personally I favor a less interventionist foreign policy but that has to be addressed through the political process. Not by unaccountable individual defense contractor employees making arbitrary policy decisions.

Forgeties79 a day ago | parent [-]

> warfighters

You should know that every single veteran I know ruthlessly mocks Hegseth for trying to use this term non-comedically. It’s a synonym for someone who takes their service way too seriously/makes it their whole identity. It’s almost exclusively used to mock people.

sillyfluke 2 days ago | parent | prev | next [-]

Not sure you're aware, but the joke may be on you. It's apparently Putin who's convinced Trump and the Mullahs (not the band) to choose civility over barbarity by allowing a superyacht of one of his cronies to pass through the Hormuz.[0]

Russian trolling at its finest, truly. This timeline keeps raising the bar on the absurdity quotient.

[0] https://www.bbc.com/news/articles/cm2pn8zdxdjo

Forgeties79 2 days ago | parent | prev [-]

We aren’t Russian and Putin is not our leader. We can choose how we behave and operate. This is like saying we should use chemical weapons if someone else deploys one. You’re speaking as if it’s all so binary. “Do what they do or you lose.”

nradov 2 days ago | parent [-]

It's cheap and easy for someone sitting safely behind a computer to pretend to be morally superior when you're not the one who has to make hard decisions, or deal with the consequences. Chemical weapons have seen minimal use after WWI largely because they're not very militarily effective. Autonomous kinetic weapons actually work. Right now Ukrainians are building autonomous weapons to defend themselves against Russian autonomous weapons. For Ukrainians it is binary: do what they do or you lose. Would you prefer that they lose? And don't presume to tell us that the Russians can be persuaded to stop by non-violent means, that would be completely delusional.

Forgeties79 2 days ago | parent [-]

>It's cheap and easy for someone sitting safely behind a computer to pretend to be morally superior when you're not the one who has to make hard decisions, or deal with the consequences.

This is a deeply flawed argument that has an obvious application back at you, but either way if you’re going to stoop to personal attacks I think we’re done here.

2 days ago | parent | prev [-]
[deleted]
vintermann a day ago | parent | prev | next [-]

Right, so it was a comically bad defense.

Like the guy in an old clip saying "What is my crime? Enjoying a meal? A succulent Chinese meal?" while being arrested for trying to pay with a stolen credit card. The succulence of the meal has nothing to do with it, and that it's your own government has nothing to do with it. It's just a sad way to try to distract from what's actually wrong with helping build tools for mass surveillance and autonomous murder.

t-3 2 days ago | parent | prev | next [-]

In a logical or mathematical sense, sure, but when it's the US government and a huge surveillance-tech company it's pretty necessarily immoral (at least in an American context where harming liberty is immoral - other cultures disagree).

Jtarii 2 days ago | parent | prev | next [-]

Hegseth bombed a girls school in Iran last month. I think it's fair to doubt the moral worth of anyone assisting this admin.

somenameforme 2 days ago | parent | next [-]

I don't think that was intentional, but invading countries while trying to distract them with negotiations, randomly assassinating leaders and hoping everything just turns out well, threatening to "destroy civilizations", targeting bridges and more, all while aiding and abetting Israel which is intentionally destroying pharmaceutical, educational, and other such civilian institutions is all 100% intentional.

In some ways worse than bombing the school was the effort to implicitly deny it. The school was near a military facility, and itself was a military facility in the past. US intelligence screwed up. They should have simply acknowledged what happened and why. Their response just reeked of cowardice and malice at the highest level.

xp84 2 days ago | parent | prev | next [-]

[flagged]

conartist6 2 days ago | parent | prev [-]

[flagged]

conartist6 2 days ago | parent [-]

[flagged]

cooper_ganglia 2 days ago | parent [-]

You should probably give this a second look:

https://news.ycombinator.com/newsguidelines.html

conartist6 2 days ago | parent [-]

If speaking vigorously in defense of morality is wrong, I guess that's something I'll just have to live with.

ThrowawayR2 2 days ago | parent | next [-]

You'll have to live with it somewhere else. Neither HN's administrators nor readership will tolerate that kind of behavior. If you intend to participate on Hacker News over the long term, please take up the suggestion by the other poster to review the guidelines and adhere to them.

conartist6 2 days ago | parent [-]

I thought what I said was borderline, but we seem to believe in free speech in this country and here in this place.

And you haven't disagreed with what I said, only how I said it ;)

k12sosse 2 days ago | parent [-]

They'll say your 1A doesn't exist here

conartist6 2 days ago | parent [-]

Of course it doesn't! I acknowledge that I have no first amendment right to speak in this forum, none at all. I merely observe that the people who run the forum are themselves champions of free speech, within limits of course.

2 days ago | parent | prev [-]
[deleted]
Forgeties79 2 days ago | parent | prev [-]

Who said otherwise? Clearly it’s about facilitating specific acts by the government. Why are y’all acting like it was so wildly broad? No one said “working with the government is inherently immoral.”

cooper_ganglia 2 days ago | parent [-]

Literally the parent comment:

>Any AI researcher who continues to work here is morally compromised.

Forgeties79 2 days ago | parent [-]

…doing this kind of work with the federal government. That is clearly what they are saying. You stripped all context from the discussion.

You’re looking for the least defensible, worst interpretation of their comment.

cooper_ganglia 2 days ago | parent [-]

No. Their comment was: “Any AI researcher who continues to work here is morally compromised.”

But, “…doing this kind of work with the federal government.” is added context that was not there and is based on your own interpretation.

The language of the parent comment charges that simply working at a company that is engaging in this makes one complicit in an immoral act, and the complicity itself is immoral. I disagree with all of that.

Forgeties79 2 days ago | parent [-]

Yes. Working at a company explicitly profiting off of doing clearly immoral acts is wrong. It doesn’t mean working for a company contracted with the federal government is always wrong.

blks 2 days ago | parent | prev | next [-]

Besides all the questionable and illegal stuff that the current government does, a lot of people don’t want to work on technologies that kill people.

SauciestGNU 2 days ago | parent | prev | next [-]

Because the government is comprised of Nazis now and is waging wars of expansionist conquest abroad and murdering domestic dissidents at home. Anyone working toward enabling that deserves to be on the receiving end of the systems they build.

unethical_ban 2 days ago | parent | prev | next [-]

Are you intentionally lumping in all civic service in one moral bucket? Is working at the post office morally equivalent to developing panopticon technology to suppress protest and track citizens?

pigpag 2 days ago | parent | prev | next [-]

Weird, why is it morally right for anyone to work with immoral organizations? -- That's what's in the focus, right?

Whether the current government is immoral, or whether a government can even be philosophically immoral, is up for debate. But your question sounds like a deflection to me.

Cider9986 2 days ago | parent [-]

Heya pigpag. Your account seems to be shadowbanned, even though your comments seem normal. If you want people to be able to see your comments I recommend creating a new account or appealing to hn@ycombinator.com

hawk_ 2 days ago | parent | prev | next [-]

Sorry to Godwin the thread but the Third Reich would like a word.

vintermann a day ago | parent | prev | next [-]

Same thing that's wrong with enjoying a succulent Chinese meal.

mattnewton 2 days ago | parent | prev | next [-]

Idk about morality, but it’s certainly a way to stop dystopian mass surveillance nightmares if everyone capable of building one refuses.

So if you live in the US and don’t want one government agency in the US to have this power (that is ambiguous under current law), one way you can try to avoid it is by refusing to sell it to them and urging others to do the same.

It’s a long shot sure, but it certainly seems more effective than hoping the legislature wakes up and reins in the executive these days.

psychoslave 2 days ago | parent | prev | next [-]

Given most government policies and direct engagement in all kind of monstrosities over the last millennia, there is really no reason to limit the case to USA, indeed.

Terr_ 2 days ago | parent | prev | next [-]

You're using a strawman. This was never about just being employed by a government in the most tepid and universal sense.

Ex: "Why is it morally wrong for a US citizen to work with their government?", asked the employee compiling lists of American citizens of Japanese descent to be rounded up into Internment Camps.

2 days ago | parent | prev | next [-]
[deleted]
OkWing99 a day ago | parent | prev | next [-]

Change the country from 'US' to 'China' or 'Iran'. And ask the question again.

IshKebab 2 days ago | parent | prev | next [-]

Because their current government is immoral.

catcowcostume 21 hours ago | parent | prev | next [-]

Because the American government is a criminal organization

tastyface 2 days ago | parent | prev [-]

Because the current government is a vindictive, murderous, proto-fascist government. (But you know that already.)

tjwebbnorfolk 2 days ago | parent | prev | next [-]

Why is it morally compromising to work with the military of the country you live in?

plaidthunder 2 days ago | parent | next [-]

I'm not anti-military as a rule but... c'mon. Opinions on the US military vary.

In extremis, were the people working for Pol Pot just good patriots with no moral culpability?

We could surely at least agree that there are cases where working for the military of your home country doesn't fully excuse you from your actions.

In fact, I think international tribunals have existed which operated on just those principles.

2 days ago | parent | prev | next [-]
[deleted]
throawayonthe a day ago | parent | prev | next [-]

because the country you live in is the united states? this is not complicated

2 days ago | parent | prev | next [-]
[deleted]
mrexcess 2 days ago | parent | prev [-]

We can all agree that working for the Nazi government’s military would be morally compromising, right?

You propose that other governments militaries would not be so compromising. Seems reasonable.

But the question then becomes, what is the operative distinction between the two?

cooper_ganglia 2 days ago | parent [-]

[flagged]

banannaise 2 days ago | parent | next [-]

"Lawful" as determined by the party executing the action is very different from actually lawful.

The courts can intervene later, but they can't un-bomb a hospital.

This is setting aside the obvious problem where governments will often set laws based on self-interest rather than morality, particularly when it comes to military conflict.

exe34 2 days ago | parent | prev | next [-]

Lawful use in the US is whatever Dementia Don says it is.

CamperBob2 2 days ago | parent | prev [-]

This government doesn't GAF what is "lawful" and what isn't. Was what happened to Pretti and Good in Minneapolis lawful? Would you work for ICE/CBP with no qualms at all?

See also the new national sport of hunting for fishing boats off the South American coast. Is that "lawful?"

And yes, since you went there: everything the Nazis did was "lawful." To the extent it wasn't "lawful," they made it "lawful."

cooper_ganglia 2 days ago | parent [-]

[flagged]

exe34 2 days ago | parent [-]

> Don't attack law enforcement with a deadly weapon, whether it's a vehicle or gun.

How do you attack law enforcement with a gun while on your knees, with your arms pinned behind you and the gun is holstered? It's interesting how we can watch the same video, and some people only see what they are told to see.

cooper_ganglia 2 days ago | parent [-]

[flagged]

declan_roberts 2 days ago | parent | prev | next [-]

Thankfully Russia, China, etc have the same qualms as we do in the United States and will refuse to send their brightest engineers to work on weapons so they don't become "morally compromised"!!!

titzer 2 days ago | parent | next [-]

I don't think the long-term game theory of race to the bottom works out quite how you think.

"Our enemies would have no qualms building a weapon that will end life on earth! We better build it first because we're the good guys!"

declan_roberts 2 days ago | parent [-]

Listen to this guy!

yibg 2 days ago | parent | prev | next [-]

We also used to point to Russia and China as places we don't want to copy.

declan_roberts a day ago | parent [-]

You don't have to copy them. You have to beat them.

notJim 2 days ago | parent | prev | next [-]

This was the same logic that was used when building nuclear weapons, and many of the scientists involved in that tried to find a different path (most notably Niels Bohr). I think we would be in a much better world if they had been successful. It's good that we're trying again w/ LLMs.

mvelbaum 2 days ago | parent [-]

[flagged]

genxy 2 days ago | parent | prev | next [-]

People in those countries do have qualms, they are people after all and they choose to work in other fields.

tensor 2 days ago | parent | prev | next [-]

The US is sure becoming an unfree scary place just like Russia. Keep it up following those role models!

gambiting 2 days ago | parent | prev | next [-]

I don't know if you're being sarcastic(sounds like you are!) but indeed a lot of engineers left Russia after the war in Ukraine started as they didn't want to be drafted and didn't want to contribute to the war effort in some way, even if indirectly. Of course, many stayed or even willingly help. See how many engineers from Iran work abroad too, for moral and other reasons.

The point is - this happens everywhere, it's not just some weird western thing.

cooper_ganglia 2 days ago | parent [-]

[flagged]

griffzhowl 2 days ago | parent | next [-]

National security can mean protecting a society founded on the values of life, liberty and the pursuit of happiness.

It can also mean facilitating a militaristic surveillance state.

Not necessarily the same things, and at some point we might have to choose whose side we're on

wood_spirit 2 days ago | parent | prev | next [-]

Just curious, did the move on Greenland and Iran have national security interest? And is it economical or life threatening?

cooper_ganglia 2 days ago | parent [-]

[flagged]

wewtyflakes 2 days ago | parent | prev | next [-]

We, the people, ostensibly get to say what these security interests are. Also, the security policy executed on by the state is not some immutable monolith. One can agree or disagree with it as it changes over time, and hopefully, influence its direction to arc towards goodness.

d5lt5 2 days ago | parent | prev [-]

I did and helped d33ps33k. You are welcome!

2 days ago | parent | prev [-]
[deleted]
boringg 2 days ago | parent | prev | next [-]

Why is it that this line item comes up EVERY TIME an article comes out, in a knee-jerk reaction - it's so incredibly absolute:

"Any AI researcher who continues to work here is morally compromised."

It feels like a constant campaign and the posters seem so incredibly self righteous and unthoughtful.

crumpled 2 days ago | parent [-]

Probably because the articles are talking about how the AI will be used in immoral ways, and that the people who know that and continue doing the work must be morally compromised.

I know that there might be $several ways those highly-paid engineers might still rationalize their work. Some of them might have ideological reasons to treat entire classes of people as unworthy of life. Within the model of their ideologies, the most evil things might be perfectly moral.

I wonder what reasons you have to disagree with people's moral stance against using AI as a weapon.

boringg 2 days ago | parent [-]

Speaking in absolutes costs you your credibility, except when rallying people to arms.

anematode 2 days ago | parent [-]

I stand by it. I'm not including all Google employees, ofc – there are some fantastic projects coming out of there – just the people working on their AI systems which will be accessible to the government with (effectively) no oversight.

I actually don't think it's so nuanced. We know (from its spat with Anthropic) that the government wants the ability to use AI to implement mass surveillance of Americans and fully autonomous killings. We also have ample data that this administration takes the law as a mere suggestion. It's imperative not to make their abuses easier.

Google's researchers aren't stuck there; their skills are in extraordinary demand and I'm sure Anthropic, for example, would hire them in an instant.

devin 2 days ago | parent | prev | next [-]

That's what the 7 figure salaries are for.

testfrequency 2 days ago | parent [-]

It’s funny to me how many progressive people I know and am friends with who work at these AI companies which are marginalized demographics (Trans, Gay, Latino, Black).

Still have faded Bernie stickers on their cars, No Kings organizers, “fuck SF I’m in the east bay for life fuck tech” - and you all make 7 figures Monday - Friday by supporting the death of society and democracy.

I don’t dare say anything though because “money is money”, the bay is expensive..but I do sure as shit judge every single person I know who joined OAI, Anthropic, Google, and Meta.

foobar_______ 2 days ago | parent | next [-]

Preach. The hypocrisy is startling. I think people started at these companies maybe years ago with "good intentions" and are willing to turn a blind eye. But now, given just how glaring clear it is, I don't think it is really excusable anymore. To be clear, people can work wherever they want including these companies but what kills me is the hypocrisy. They are pathological liars to themselves if they somehow think they aren't complicit.

beernet 2 days ago | parent | prev | next [-]

Agreed. Just shows that big money doesn't dilute small character.

site-packages1 2 days ago | parent | prev [-]

I would suggest looking inwards if this is how you really feel.

testfrequency 2 days ago | parent | next [-]

I mean no harm in saying what I said, I love my friends. I just can’t stomach the hypocrisy, it’s what the companies are preying and feeding off of.

My friends are incredibly bright and good at what they do, it’s why they all have the roles they have. It makes me sad (and frustrated) knowing they are lured in by enough money dangling in front of them that makes them swallow their souls and identity, while fuelling the fire in the same breath.

I have a deep amount of respect and gratitude for my friends (and anyone else) who chooses to work at non-profits, and more ethical - mission based companies for less. I hate how much these AI companies and roles are offering people, it’s completely forced lots of gifted people into a war machine.

site-packages1 2 days ago | parent [-]

Do you suspect there is any chance they are fully independent adult human beings with full agency, who have looked at the pros and cons, and chosen to make the choices they did with clear eyes? Do you think there's any context that might square their choices with their own internal principles that don't make them hypocrites? I mean these as real questions. For "friends you love" you really seem to take a dim view of their intelligence.

somenameforme 2 days ago | parent | next [-]

One of humanity's greatest weaknesses is cognitive dissonance. People can convince themselves of just about anything. And in some ways intelligence is a burden here. A fool will just do something with a reason of 'f you, that's why.' It's only the clever man that will even bother rationalizing the villain into the hero, and we're great at it. An interesting thought experiment is to ask people if they'd be willing to push a button that would randomly kill a person somewhere in the world for a million dollars. They'd have no direct accountability themselves and their action would be unknown to anybody else.

People will rationalize themselves into declaring this moral even though it is obviously one of the most overtly amoral actions possible. One friend I have, a rather intelligent guy otherwise, was even trying to create a utilitarian argument that he'd donate some percent of his 'earnings' to life saving charities meaning he'd be saving more life on the net. The fact that if everybody thought and behaved the same way, the entirety of humanity would cease to exist, was a consideration he didn't have a response for. Let alone the fact that he just rationalized his way into justifying near to any deed imaginable, so long as you got paid enough for it.

testfrequency 2 days ago | parent | prev [-]

I’ll be honest and say it’s made me question and reposition some of my friendships with a number of these friends. Some joined well before we knew the fallout of how AI has affected and impacted society negatively, some have joined in recent years because they were offered 2x their currently already high comp package, and others will take any job they can get (who, admittedly, I judge far less as I know they are just needing to survive in a HCOL city).

My dim view is more on the AI companies being absurdly overvalued, with too much money to know what to do, which feeds downwards into compensation packages, which lure in “innocent” individuals who can’t say no. It’s not been a healthy market to be vulnerable in, most companies outside AI are just not getting the same funding or can compete at all - and it’s a shit storm.

gambiting 2 days ago | parent | prev [-]

I'm curious what is that you're suggesting, exactly.

site-packages1 2 days ago | parent [-]

I made another comment above. People contain multitudes. Different contexts, different choices, not everyone is in a box defined by the viewer's world view. You can't really know what's going on with someone else, in their heads, in their context, so give them some grace. Instead, this person's "friends" are "hypocrites" who were "lured" into their choices. It's very condescending. I am suggesting the poster re-examine their own views on other people in light of this.

foltik 2 days ago | parent [-]

You're missing the point. They're just lamenting the contrast between what their friends say (fuck tech, no kings) and what they spend their workweek in service of.

It's not complicated: if these friends would take a non-society-destroying job at equal pay (who wouldn't?) then their values aren't driving the decision, money is. Fine, that's a choice adults get to make. But then own it and actually justify it on its merits, don't just retreat to "who are you to judge."

senordevnyc 2 days ago | parent [-]

Not everyone sees AI as "society-destroying".

foltik a day ago | parent | next [-]

Didn’t say that. The friends in question clearly think it is. My point more generally was about people who publicly talk about $X being society-destroying while materially enabling $X for a paycheck.

senordevnyc 20 hours ago | parent [-]

It’s really not clear to me that they think that. OP was clearly saying that if you’re progressive, the intellectually honest position is to be anti-AI. I don’t think that necessarily follows.

a day ago | parent | prev | next [-]
[deleted]
a day ago | parent | prev [-]
[deleted]
JeremyNT 2 days ago | parent | prev | next [-]

Also yesterday, on Brin getting cozy with this administration:

https://www.nytimes.com/2026/04/27/us/politics/sergey-brin-g...

robrenaud 2 days ago | parent | prev | next [-]

Is every American tax payer morally compromised?

eks391 2 days ago | parent [-]

Yes ;)

I agree with the intent of your rhetorical question, so I'm jesting with you. I'm justifying my "yes" with the hopefully humorous distraction that every person, including American taxpayers, has at some point made a nonsustainable/selfish (my definition of immoral) decision.

RobRivera 2 days ago | parent | prev | next [-]

That's not a productive stance to take, if you're trying to be good faith and an agent of progress, even assuming morality isn't relative, and the context nuanced.

pixel_popping 2 days ago | parent | prev | next [-]

Why would they be morally compromised? So the ones building open-source models should be as well because some terrorist will use the model to do nefarious stuff?

thisisauserid 2 days ago | parent | prev | next [-]

I agree that it is immoral to obey some laws. Which ones are you saying are immoral here?

ddtaylor 2 days ago | parent [-]

An AI researcher can work anywhere they want, can't they? At the minimum they could work in a different field entirely. It seems like a false dichotomy to frame the question around laws.

thisisauserid 2 days ago | parent [-]

Got it. It's immoral because you said so.

ddtaylor a day ago | parent [-]

What did I say was immoral?

2 days ago | parent | prev | next [-]
[deleted]
ReptileMan 2 days ago | parent | prev | next [-]

Morality is relative and malleable. And usually people are quite good at claiming that whatever suits my agenda is moral.

site-packages1 2 days ago | parent | prev | next [-]

> Any AI researcher who continues to work here is morally compromised.

Arguably it's exactly the opposite. In the same way we ask billionaires to pay their taxes because the regulatory regime is what allowed them the structure to make their billions in the first place, the national security of the country the AI researchers are in is what allows them to make a vast salary to work on interesting, leading edge capabilities like AI. They should feel obligated to help the military.

2OEH8eoCRo0 2 days ago | parent | prev | next [-]

Is it any less moral than surveilling your neighbors and/or turning your neighbors against each other with social media?

mvelbaum 2 days ago | parent | prev [-]

Any AI researcher who refuses to support his own country in a technological arms race is morally bankrupt, foolishly naive and does not deserve to enjoy the way of life created for him by those who sacrificed their lives.