| ▲ | mrlongroots 3 days ago |
| I think Palantir is highly misunderstood. As a technology, it is just database joins. It is just that they are able to pull in data from everything from S3 to SAP to ArcGIS, and provide analytics, visualization etc. on top to provide global visibility into any system. The visibility can be "show me all illegal immigrant clusters" or "show me bottlenecks and cost sinks in CAHSR construction". When we offload the moral impetus for society from politics to technology, we also squander control. Tech is tech and can be used for both good and bad. A strategy that aims to cap downsides by preventing the proliferation of technology is not inherently bad, but it is doomed to fail. The evidence for dysfunction lies not in the existence of Palantir but in the failure of the watchdog layer of society (also called the government). |
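To make the "it is just database joins" point concrete, here is a minimal sketch, assuming a generic PySpark setup rather than anything Palantir-specific; the bucket, JDBC endpoint, table, and column names are all invented for illustration:

    # Minimal sketch, not Palantir code: read one dataset from an S3 data
    # lake and one from an operational RDBMS over JDBC, then join them.
    # Every path, URL, table, and column below is a placeholder.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("cross-source-join").getOrCreate()

    # Event records landed in object storage (hypothetical bucket/path).
    events = spark.read.parquet("s3a://example-bucket/events/")

    # Reference records pulled from a relational system (hypothetical
    # connection details; in practice this could sit behind an ERP).
    accounts = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://example-host:5432/erp")
        .option("dbtable", "public.accounts")
        .option("user", "reader")
        .option("password", "example")
        .load()
    )

    # The "analytics on top" is then ordinary relational work: join,
    # filter, aggregate, and hand the result to a visualization layer.
    joined = events.join(accounts, on="account_id", how="inner")
    summary = joined.groupBy("region").count()
    summary.show()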
|
| ▲ | nxobject 3 days ago | parent | next [-] |
| That also deflects moral responsibility away from Palantir: they have had, and still have, every choice to question the purpose of their contracts. The essence of Palantir is specifically pursuing government surveillance contracts as a lucrative, never-ending source of profit. No doubt Deloitte or any other contractor shop would be able to do the same thing - but they don't choose to. |
| |
| ▲ | Terr_ 3 days ago | parent | next [-] | | Analogy: It's true that TNT is, on its own, just a tool, and vastly useful for mining and demolition. But if the major vendor and purveyor happens to be Blow Thine Enemies To Tiny Bits Incorporated, developing faster fuses and embedded shrapnel, then people are right to be concerned about "The TNT stuff". | |
| ▲ | claw-el 5 hours ago | parent | prev | next [-] | | I am curious whether you think the payment processing companies refusing to serve legal yet undesirable businesses are in a similar situation to this Palantir situation here? | |
| ▲ | mrlongroots 3 days ago | parent | prev | next [-] | | > No doubt Deloitte or any other contractor shop would be able to do the same thing - but they don't choose to.
I'm sorry, but I absolutely disagree that the reason, say, Deloitte is leaving a few hundred billion dollars on the table is the presence of a moral compass. | | |
| ▲ | pydry 3 days ago | parent [-] | | It might be. Deloitte doesn't specifically select for employees who don't have one. | | |
| ▲ | mrlongroots 3 days ago | parent | next [-] | | If a selection mechanism is orthogonal to a property, it seems weird to argue that the selected subset is distributed differently along that axis than the broader population. | |
| ▲ | jjani 3 days ago | parent | prev [-] | | I wouldn't be so sure of that. |
|
| |
| ▲ | andoando 3 days ago | parent | prev [-] | | What you think is ethical is different from what I think is ethical. Power shouldn't be concentrated in a few hands, period. | | |
| ▲ | 3 days ago | parent | next [-] | | [deleted] | |
| ▲ | user____name 2 days ago | parent | prev | next [-] | | I think you're conflating ethics with morality. | |
| ▲ | bumby 3 days ago | parent | prev [-] | | Ethics, in simple terms, is how we treat each other. If you claim it’s intrinsically attached to something like decentralized power, it’s at the least a misunderstanding and possibly a misapplied dogma. |
|
|
|
| ▲ | irishloop 3 days ago | parent | prev | next [-] |
| Yes. But also, all technologies will eventually be used as weapons. And so it's important for us to understand how they can be weaponized and to consider the social cost of that weaponization. |
| |
| ▲ | mrlongroots 3 days ago | parent [-] | | Kitchen knives murder people. Toyota Hiluxes have powered more jihad than modern battle tanks. Our tastes, beliefs, and opinions as a society are shaped by recommendation algorithms run by facebook/instagram/twitter, to our profound detriment (personal opinion).
> And so it's important for us to understand how they can be weaponized and to consider the social cost of that weaponization.
To be clear, I absolutely agree. Plenty of tech is double-edged, and Palantir very much so. Let me restate my point. Palantir (or that class of tech products) is powerful at enabling visibility over a complex system. But visibility is not decisions; it is an input to decisions. If you had real-time telemetry from every single stomach, you could maybe automatically dispatch drones with food wherever someone is starving. Or you could use the data as a high-frequency indicator for a successful invasion. Morality is downstream of decisions, not data. | | |
| ▲ | Avshalom 3 days ago | parent | next [-] | | Palantir is not double-edged: technology is pretty much by definition an application, and Palantir is applying it in exactly one direction. "oh it's just database joins" is about like me ripping your arms off and describing it as "chemical reactions" | | |
| ▲ | mandevil 3 days ago | parent | next [-] | | No, you've only heard about one application of it. Airbus and Palantir built something so powerful they productized it and now sell it to airlines to help manage their fleet https://aircraft.airbus.com/en/services/enhance/skywise-data... They have a thriving commercial business outside of their government work. (Disclaimer: long PLTR) | | |
| ▲ | bumby 3 days ago | parent [-] | | That link is more marketing than substance. Is there any data on how well these models perform? For example, how well does their predictive maintenance work, how much risk-adjusted money savings does it provide, what data streams does it require? |
| |
| ▲ | mrlongroots 3 days ago | parent | prev | next [-] | | > "oh it's just database joins" is about like me ripping your arms off and describing it as "chemical reactions"
This argument is both inconsistent and counterproductive. Inconsistent in that the harm to me from having my arms ripped off comes from you deciding to act on an intent to harm me. No photograph or x-ray of my arms can produce that intent. Counterproductive in that the "good vs bad" framing is pointless because it does not help with solutions. If your solution is to ban joins, you will have a hard time gaining traction for your cause. Strategic advocacy requires understanding the axes along which you may be able to produce a coherent argument and gain leverage. "Ban joins" does not help. | |
| ▲ | const_cast 2 days ago | parent [-] | | The root cause of all of this isn't an evil government, or data analytics, or joins, or even an evil company. It's data collection. It's privacy. If we're waiting around for the day people start acting ethically, we'll experience the heat death of the universe first. Governments can always turn evil. Companies can always be compelled. People can always turn evil. We need to not give them the ammunition. We've cornered ourselves into a situation where we sacrifice our data and privacy, and we are forced to blindly trust that it will not be used against us. If we do not collect data, we cannot have data breaches. If we do not collect data, we cannot have mass surveillance. If we do not collect data, we cannot have wiretapping. We've simply allowed and encouraged tech companies to collect as much data as humanly possible. That starts with Google, Meta, et al. We then trust they will not abuse it. But they certainly can, and they certainly will. What is done now cannot be undone. We cannot take back data that has been immortalized. But what we can do is prevent new data collection. Use private services. Run software locally when feasible. Deny analytics. Block advertisements. Use end-to-end encryption. Etc. | |
| |
| ▲ | datadrivenangel 3 days ago | parent | prev [-] | | A good government having better information technology allows it to do more to serve our interests. |
| |
| ▲ | bigyabai 3 days ago | parent | prev [-] | | Palantir is still a tumor. We don't need people profiting off database joins; Oracle did that and became the most hated company on the planet. If the surveillance industry ends up resembling the other "rice bowl" military contractors, American taxpayers will suffer most. It will inevitably become a cost treadmill with infinite billable hours; Congress has seen this happen hundreds of times. In truth, the rest of your argument is fully correct. Palantir is often portrayed as the "hacking American businesses" group, but that's NSO. Palantir is merely buying up the data from morally-flexible telecoms and capricious cookie-laden websites. There is an uncomfortable truth about networked technology that America has swept under the rug for decades, and now we have entire businesses as a symptom of that failure. It's a sickening precedent for a free society. I'd like to believe in a political solution to this. I've yet to see one, and the consequences of the Snowden leaks suggest we may never correct course here in America. |
|
|
|
| ▲ | coldtea 3 days ago | parent | prev | next [-] |
| >Tech is tech and can be used for both good and bad
It's not that simple, since tech also enables bad that was previously not possible. |
| |
| ▲ | ojbyrne 3 days ago | parent [-] | | Just quibbling, but obviously tech also enables good that was previously not possible. | | |
| ▲ | coldtea 3 days ago | parent [-] | | Sure. And if the bad it enables is worse than the good it enables, then tech is not really that neutral. | | |
| ▲ | ojbyrne 3 days ago | parent [-] | | Correct, but then there's all sorts of value judgments involved. And things that are "good" just become part of the background. Makes me think of Louis C.K. and his "Everything is Amazing and Nobody is Happy" bit, which now sort of highlights the double-edgedness of things. | | |
| ▲ | AdieuToLogic 3 days ago | parent [-] | | >>> Just quibbling, but obviously tech also enables good that was previously not possible.
>> Sure. And if the bad it enables is worse than the good it enables, then tech is not really that neutral.
> Correct, but then there's all sorts of value judgments involved.
The problem is when the form of "bad" enabled has no remedy. For example, identifying potential dissidents and their "network of associates" to authoritarian regimes. In these cases, there is no amount of "things that are 'good'" which can offset the "bad". |
|
|
|
|
|
| ▲ | supercanuck 3 days ago | parent | prev | next [-] |
| The problem with Palantir is that they target government agencies. Most of the time, the companies that run systems like Palantir's (think SAP, Oracle, and so on) have to report earnings to the street through a 10-K or comply with regulations like Sarbanes-Oxley. They will also have in-house IT staff to monitor the logs, etc. The programs installing the Foundry system have an incentive to hide the data from prying eyes, and therefore it never leaves the Palantir ecosystem. The government doesn't hire independent consultants, auditors, etc. to confirm whether it's being used or not. They simply have to demonstrate trustworthiness to a security officer and hope an IG doesn't have an external equivalent of a forward deployed engineer. So while the technology is mediocre, the real issue is the nebulousness, the lack of auditability, and the question of whether the people writing the checks are the same people signing them. So I sympathize with Karp's talk about the technology being fine, but it's the apparatus surrounding it that says "just trust us" that gives pause, especially in today's culture of conflict. If I told you that 90% of all transactions get routed through a foreign company's software, you might pause, but it's been like that for years (SAP). The difference is there are controls in place. |
|
| ▲ | lolive 3 days ago | parent | prev | next [-] |
| Being a daily user of Foundry, I really see Foundry as a scalable implementation of the SemanticWeb / LinkedData principles. Not to get into the technical debate [again], I will summarise their stack: they use Spark as their foundation, with a simple pattern of materialisationAsTables when needed (see the sketch just after this comment), plus possible synchronisation to an RDBMS or to a graph database, with a strong ontological layer on top. They then provide a web app stack and a low-code/no-code dev environment
[there are other components in the platform, but let's keep it simple]. So no IT rocket science here, but the UX mostly hides all the IT bricks under a pure data-oriented workflow [very few of my colleagues know what Python, Spark, or AWS are]. Three comments on this: could anyone rebuild such a platform? Yes. Is it worth it? Most companies will say no. Do SAP analytics tools compete? Time will tell. Really, in Foundry the scalability is MASSIVE!
[but keep in mind that this is an analytics platform, not a write-oriented platform]. Now let's switch to the political side of [the company called] Palantir. Can such a platform be used by Santa Claus to monitor the data for the next Christmas? Yes. Can someone decide to aggregate all the data of all the citizens and hope to do mass control with that? Probably [but hey, Facebook/Netflix/TikTok are already doing that, plus they are actively hacking your *brain*, and no one complains] |
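A minimal sketch of the materialisationAsTables pattern described above, assuming a plain open-source PySpark setup rather than Foundry's actual APIs; the dataset path, columns, and output table name are all invented for illustration:

    # Hypothetical example: derive an analysis-ready dataset from a raw one
    # and materialise it as a table that downstream apps (or a sync to an
    # RDBMS / graph store) can read without re-running the computation.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("materialise-dataset").getOrCreate()

    # Upstream dataset produced by an earlier pipeline stage (placeholder path).
    flights = spark.read.parquet("/data/raw/flights/")

    # The transform itself is ordinary Spark; the platform adds scheduling,
    # lineage, and the ontology layered on top of tables like this one.
    delays_by_route = (
        flights
        .filter(F.col("status") == "landed")
        .groupBy("origin", "destination")
        .agg(F.avg("arrival_delay_minutes").alias("avg_delay"))
    )

    # Materialise the result (assumes a table catalog is configured).
    delays_by_route.write.mode("overwrite").saveAsTable("delays_by_route")

The transform is nothing special on its own; the value described above is the UX, scheduling, and ontology wrapped around tables produced this way.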
| |
| ▲ | AlexeyBelov 15 hours ago | parent [-] | | > and no one complains
You lost me at this point. The comment was insightful up until the whataboutism and "no one complains". Lmao. |
|
|
| ▲ | 3 days ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | 3np 3 days ago | parent | prev | next [-] |
| > As a technology, it is just database joins.
As a business, it is far more. Their FDEs are intrinsic to unlocking capabilities for customers. |
|
| ▲ | 3 days ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | AtlasBarfed 3 days ago | parent | prev | next [-] |
| I'm sorry, is Palantir a self-sustaining AI with zero human employees? Or is it a corporation of people that is (I know, try to stifle the laughter) supposed to have at least some morality? I get it. Corporations haven't functioned as any institutional morality since their inception as a legal framework, despite the Supreme Court handing them immortal citizenship with effective privilege over any real citizen. So far we have:
- masked paramilitary agents chasing down the lowest rungs of "at first they came for ..."
- deployed formal military to democratic cities for intimidation
- cowering, terrified tech CEOs embarrassingly kissing ass
- threatened, capitulated universities, law firms, and fourth-estate TV companies
- massive amounts of purging of civilian institutions from any oversight
- purging of military leadership based solely on blind loyalty to the president
- massive fraud leading to a multibillion-dollar increase in Trump's wealth
- a Supreme Court that may as well have been disbanded, having handed unlimited privilege to Trump's executive branch
And waiting in the wings is Palantir-enabled TOTAL INFORMATION AWARENESS of the entire populace. So back to Palantir, the absolved "just a tech firm" that has been providing turnkey authoritarian control to the US government for decades now. Of course it won't function as any bulwark against the coming storm. Oh, I think I understand Palantir very well. Anyone that works there should know that you exist to set up totalitarianism. That is your function now. "Homeland defense" and all those weak USA PATRIOT Act justifications and funding are now far in the rear view mirror. Up ahead: Mount Totalitarianism. Cloak yourself in doublespeak, Palantir. I have likely marked myself for death. |
|
| ▲ | themafia 3 days ago | parent | prev | next [-] |
| > As a technology, it is just database joins
Which don't work out all that well in practice.
> on top to provide global visibility into any system.
Global visibility into the data. There's no guarantee your data and your performance match. We have so much data that the quality of much of it is fairly low.
> Tech is tech and can be used for both good and bad
You can also just lie about what you're doing and use it as a cover for violations of civil rights and federal law. I mean, if it's just "database joins," then why is the government buying this from a vendor? Shouldn't they just be able to _do that_? |
| |
| ▲ | lolive 3 days ago | parent [-] | | That's called MakeOrBuy. Why does everyone go to Facebook [/or HN] instead of self-hosting their blog? Because Facebook [/or HN] is a bunch of highly skilled experts who have shaped the proper UX for ubiquitous information exchange between humans [/ geeks]. So we, the users, can concentrate on our own business [/ trolls]. | |
| ▲ | standardly 2 days ago | parent [-] | | I love how so many HN users make up their own "syntax" in comments "[/or HN]" ! Jesus, lol.. Reminds me of when folks /emphasize/ words like /this/ in their comments. Anyways, I'm just joshin' ya. Personal blogs /would/ make for a superior internet as opposed to Facebook/X. | | |
|
|
|
| ▲ | 1231423 3 days ago | parent | prev | next [-] |
| >When we offload the moral impetus for society from politics to technology,
I mean, Palantir execs and their employees are part of society too, right? They themselves are making a moral choice by working on such technology instead of joining tables for hospitals. |
| |
| ▲ | mrlongroots 2 days ago | parent [-] | | Our social contract bestows individuals with freedom, and governments with the monopoly on violence to police the limits of individual freedom. If the acts of law-abiding individuals (or groups) are a net negative for society, that is not an individual failure. Fiduciary responsibility is a useful parallel: it is not the job of a sugar manufacturer to think about the public health aspects of sugar. Their responsibility to their shareholders is to produce clean, safe, and edible sugar at competitive prices and do a good job with marketing and distribution, that's all. |
|
|
| ▲ | theOGognf 3 days ago | parent | prev | next [-] |
| It's ironic that HN threads about Palantir, on arguably one of the forums where the majority of users should understand a tech company and its tech, always devolve into some weird speculative and conspiracy-like discussion. Palantir's docs are pretty open too - it's not like it's a black box that you can only see if you have a contract with them. So one would think the HN crowd would know something and have an interesting discussion on how it compares to what they've seen, etc. But it somehow always turns mostly political and less about the tech. |
| |
| ▲ | tamimio 3 days ago | parent [-] | | It's because we understand technology very well, and how it can be used to further control or surveil you. The tech itself isn't complicated; at best you would have a unified protocol that seamlessly integrates with all data sources, at worst that part is done manually, but the rest of the tech isn't new as a concept, so there's really nothing to discuss technology-wise. However, as technical people, we can see how something can be used in a bad way, or at least how it might be in the future based on the current trend, and it's necessary to discuss such implications even if it's political. For example, when a messaging app requires a phone number to activate, it's essential to highlight that it could be exploited in a SIM-swap attack (thus the user should not trust it) or it could leak that number, which would expose this person's real identity. And in this case, having so much information collected, shared, and easily accessed by one centralized entity is never a good indicator. It's also ironic that the people who used to (still?) attack China and other countries for being surveillance-state Orwellian dystopias, while virtue signaling about democracy and freedom values, are now okay with such data collection and processing and potential red-flagging for things as simple as social media posts. |
|
|
| ▲ | mgh2 3 days ago | parent | prev [-] |
| They are just a data company capitalizing on the AI hype - when the AI bubble pops, they will too. When ChatGPT launched, Palantir's stock started climbing as it sold its "AI platform". The cycle follows a marketing funnel: AIDA - awareness, interest, decision, action. https://www.smartinsights.com/traffic-building-strategy/offe...
FUD: Awareness and interest (AI) - at the initial stages, doomer marketing by big tech to government about its dangers and regulations. https://en.wikipedia.org/wiki/Fear,_uncertainty,_and_doubt
FOMO: Decision and action (DA) - after selling, it is all about investing in infrastructure and adopting the technology. https://en.wikipedia.org/wiki/Fear_of_missing_out
Sentiment shift: https://news.ycombinator.com/item?id=44870777 |
| |
| ▲ | stocksinsmocks 3 days ago | parent | next [-] | | I don't think Palantir is going anywhere, and it preceded the AI hype train. I suspect they're kind of an out-and-proud MAIN CORE successor. Disturbing, but the cyberpunk genre has warned us about this for some time. | |
| ▲ | mingus88 3 days ago | parent | prev [-] | | The Palantir bubble will not pop as long as Thiel and his folk are embedded in USG. The stock took off when Trump/Vance came in, and Vance is Thiel's pick. Their primary technology predates any AI hype by a decade at least, and their strength has always been in deploying great engineers. | |
|