| ▲ | applfanboysbgon 3 days ago |
| Software development jobs are too accessible. Jobs with access to/control over millions of people's data should require some kind of genuine software engineering certification, and there should be business-cratering fines for something as egregious as completely ignoring security reports. It is ridiculous how we've completely normalised leaks like this on a weekly or almost-daily basis. |
|
| ▲ | morpheuskafka 2 days ago | parent | next [-] |
At my last job, I opened up Shodan in my free time and clicked through our ASN with the free filters. In two minutes I found multiple iDRACs online. Surprisingly, none had default passwords. But one had a years-old public exploit vuln allowing takeover... Turns out that during a firewall hardware migration years ago, several units' firewalls were switched to audit mode (not enforcing rules). So an entire institute (health research!) had its whole subnet public with zero firewalls, both the server OS and iDRAC interfaces. iDRAC isn't even supposed to be on the same VLAN per Dell, let alone on the internet. To top it off, after making some tickets from Shodan (admittedly not all as serious, e.g. MFP web UIs on the internet), I got pushback from the firewall team for causing units to submit too many changes. I also got in trouble with our Qualys analyst for undermining his work because he hadn't gotten to that unit's annual review yet, even though I didn't even have a Qualys login. (And even if I had found it there, since when do we wait for annual reviews to fix that?) It took at least three weeks internally to get it fixed, and by that I mean only the iDRAC IP was blocked, with the server itself still wide open. And that's only because I mentioned it to my manager (awesome guy, and not formally responsible for firewall rules) after an unrelated no-firewall host incident came through and he authorized an emergency rule. |
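For anyone curious, a sweep like that is only a few lines. A real run would go through the shodan Python library with an API key (roughly `shodan.Shodan(key).search("asn:AS64496")`); since that needs credentials, this sketch mocks the banner data and only shows the filtering side. The ASN and IPs below are documentation/test values, not real hosts.

```python
# Sketch of the exposure sweep described above. A real run would use the
# shodan library ("pip install shodan"), roughly:
#   api = shodan.Shodan(API_KEY)
#   results = api.search("asn:AS64496")["matches"]  # AS64496: documentation ASN
# Here the banners are mocked so the filtering logic stands on its own.

def flag_exposed_bmcs(banners):
    """Return (ip, port, product) for banners that look like
    internet-facing out-of-band management interfaces (iDRAC, iLO, IPMI)."""
    keywords = ("idrac", "ilo", "ipmi", "bmc")
    return [
        (b["ip_str"], b["port"], b["product"])
        for b in banners
        if any(k in (b.get("product") or "").lower() for k in keywords)
    ]

# Mocked Shodan banner dicts (ip_str/port/product are standard banner fields).
sample_banners = [
    {"ip_str": "203.0.113.10", "port": 443, "product": "Dell iDRAC"},
    {"ip_str": "203.0.113.22", "port": 80, "product": "Apache httpd"},
]
print(flag_exposed_bmcs(sample_banners))  # [('203.0.113.10', 443, 'Dell iDRAC')]
```

None of this is exploitation, just reading banners your own org is already broadcasting, which is what makes the pushback described above so absurd.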
| |
| ▲ | bblb 2 days ago | parent [-] | | Huawei Enterprise devices tend to have a CAPTCHA by default on their BMC/OOB GUIs or the other various system/infrastructure service GUIs (such as the HuaweiCloud/FusionCloud products). I'm guessing the reason is that people leave the management ports and GUIs wide open to the public Internet, so the CAPTCHA is protecting at least from the very basic script kiddie bots. |
|
|
| ▲ | morpheuskafka 3 days ago | parent | prev | next [-] |
They may be part of it, but as a publicly traded company, there's got to be at least a few people there with a fancy pedigree (not that that actually means they're good at their job or care). But if such a test existed, they presumably would have passed it. They also have an ISO 27001 certificate (they try to claim a bunch of AWS's certs by proxy on their security page, which is ironic since they say AWS stores most of their data while apparently all uploads are on this). |
| |
| ▲ | trollbridge 2 days ago | parent | next [-] | | A while ago I had a customer come to me who had a simple Shopify site and fell for a phishing type of attack where someone simply had an email like "shopify_security at gmail" and kept telling her she needed to apply all kinds of changes. They laundered the payments through Fiverr. Then they would install WordPress plugins to make the site worse and claim even more "work" was needed. I documented the entire thing, including my own credentials, and sent it off to Fiverr. Fiverr's response was that everything was fine and there was nothing they could do about it, even though it was obvious fraud. Google never did anything about it either, nor did Shopify. Given how they handled a minor situation like that... I guess it shouldn't be surprising they're just asleep at the switch for a major one like this. | |
| ▲ | bluefirebrand 2 days ago | parent | prev [-] | | > But if such a test existed, they presumably would have passed it Sure, and now they could have their credentials revoked, potentially be legally liable, and never find work in this field again, which would prevent them from cocking up another company this way |
|
|
| ▲ | Aurornis 2 days ago | parent | prev | next [-] |
| > should require some kind of genuine software engineering certification Wouldn't change a thing, other than add another hassle you have to pay for to do your job. This is the result of carelessness, not someone who didn't know that private data should be private because they weren't certified. |
| |
| ▲ | applfanboysbgon 2 days ago | parent | next [-] | | This is the result of somebody who has no idea how the fuck the tech they're using works. They surely knew it should be private, but they did not know that they were making it publicly available, because they were blindly fumbling their way around in a job beyond their competence level. There is a 0% chance this was ordinary carelessness, in the form of "I know better but don't care enough"; this is so clearly a case of "I don't know what I'm doing". | | |
| ▲ | Aurornis 2 days ago | parent [-] | | Any time someone tries to suggest certification as a solution I ask the same question: How would it have solved this problem? Would the certification require someone to take an official certification test for the framework used? And therefore we’re only allowed to use frameworks which have certification tests available? If you want to write some new software, do you have to generate a certification for it and get that approved so people are allowed to use it? Sounds like a great way to force us all to use Big Company approved software, because they’re the only ones with pockets deep enough to play all of the certification games. | | |
| ▲ | fsflover 2 days ago | parent | next [-] | | How did original engineering certification prevent dangerous constructions? Did it force everyone to use a Big Company? | |
| ▲ | applfanboysbgon 2 days ago | parent | prev | next [-] | | The fact that you're thinking purely in frameworks is the exact problem that plagues the software industry. Framework-focused development is why we're in this mess; frameworks make it easy for people who don't understand how to program to publish shitty software by copying and pasting code and fudging a few strings or variables to match their use case. That kind of accessibility is great for low-stakes software, letting anyone make interesting toys, but should be completely unacceptable in a professional environment with, for example, people's fucking tax documentation at stake. If I had my way, the certification process would start at the bottom of the stack, i.e. you should be expected to have a functional knowledge of assembly instructions, memory management, registers, and the call stack, and build up from there. Not that we need to write assembly on a daily basis, but all of the abstractions are built on top of that, and you cannot realistically engineer secure software if you don't understand what is being abstracted away. If you do understand the things being abstracted away, you have the fundamentals necessary to do good work with any programming language or framework. Throw in another certification starting from networking fundamentals if your job involves that. 30 years ago, most professional programmers had this level of understanding as table stakes, so we can hardly say it's an unrealistic burden that's impossible to meet. Would it be a higher barrier to entry that massively cuts the size of the field working on sensitive software and slows software development down? Yes. That is exactly what we need. There was a time when people built bridges that collapsed; then we implemented standards and expected engineers to do real work to make sure that didn't happen. Is that work expensive and expertise-intensive? Yes. Do bridges still collapse? Only very rarely. 
We are witnessing software bridge collapses on a weekly basis, which should be seen as completely unacceptable. The harm is less obvious than when everyone on a bridge dies, but I do think that routinely leaking millions of people's sensitive data is causing serious harm and likely does lead to people dying in second-order effects. | | |
| ▲ | bruce511 2 days ago | parent | next [-] | | I follow your logic here, and it's certainly a coherent argument. That said, there are perhaps some factors you are overlooking which matter. The first is that no amount of certification solves the actual problem (which is that security mistakes are made, often in new and novel ways.) Secondly, the amount of software being needed (and produced) is immense. Bridges require engineers, but the demand for new bridges is tiny. The demand for new software is enormous, and the current rate of production requires many more people than could ever be certified. In other words, say you only allowed comp-sci graduates with a proper 4-year degree, covering assembly upwards etc. The supply of programmers would drop to what colleges could produce. Which is not nearly enough. The analogy also falls down a bit on penalty-for-failure: a collapsed bridge kills people, while bugs in my notepad app might lead to information leaks? That's not the same thing. In truth, at least for the last 35 years, unqualified developers have outnumbered qualified ones by orders of magnitude. And there still seems to be no limit to software demand. Finally, there have been no studies I am aware of that suggest security flaws are added more frequently by non comp-sci grads compared to comp-sci grads. Anecdotally I don't see that distinction myself. (From my observation, security outcomes correlate to the degree to which the individual considers security to be important.) And, of course, security issues are not limited to programmers; management has a role to play as well. Should they be certified too? So, I'm not convinced that your suggestion, however desirable, would solve the problem. And since it's clearly unimplementable in the real world, it's a moot argument anyway. | |
| ▲ | applfanboysbgon 2 days ago | parent [-] | | "Bridges" are shorthand. There is no shortage of need for new infrastructure. Any kind of construction needs engineers involved to ensure what's being built doesn't collapse from a gust of wind. Apparently, in the US, there seem to be about 1.5 million engineers and 4.5 million software developers. Well, I think in the short term, certifying only 1.5 million "software engineers" would be fine, actually. Note that my argument pertains only to sensitive software. If you want to make software that doesn't pose a danger to its users, you don't need an 'engineer'. This should have the second-order benefit of making PII toxic waste. If you need a real engineering team to process PII, companies that don't need PII will stop scraping every last fucking thing and leaking it. The majority of software in the world doesn't actually need PII to function, they could just be incentivized to stop hoarding it and use a regular "software development" team if they want to deliver cheap and fast. I also wouldn't specifically associate this with college degrees. In fact I think universities are doing a shockingly bad job of producing functional software developers. But, on the other hand, you don't need a university to produce a good programmer. Software development is possibly the most open, information-available discipline in the world. Self-motivated learners can absolutely become competent on their own. The certification should be merit-based, and provide a clear path to learning the material the certification is based on. Many people will go through the effort to educate themselves and learn the required skills, especially if certified software engineers are in high demand and command a higher salary. Regarding the penalty-for-failure, as I said, the harm is not as immediately apparent as when people die in a bridge collapse. But leaking sensitive information still leads to people dying, even if the connection is not as direct. 
Doxxing and blackmail frequently lead to suicide, and there are other damages that could lead to a butterfly effect culminating in a higher death rate, or, even if not death, tangible harm. This leak contained birth certificates, IDs, passports, tax documentation, passwords, all kinds of information that could be used to ruin someone's life with identity fraud. There is also, of course, some software in the world that is directly safety-critical, much of the software used in the health field for instance, which is also currently being written by the lowest bidder in many cases. Regarding management, they don't need a certification but rather consequences for their actions. Currently the incentive structure is such that management is rewarded for cutting costs and is never punished for harming customers. Fiverr, for instance, should be facing an investigation that threatens to shut down the business given that not only did this happen in the first place, and not only did they ignore it for 40 days, but even after it went public the sensitive files were still accessible for 12+ hours (notably, after they were definitely made aware of it, given reports in this thread of people receiving replies from Fiverr about it). Maybe throw in some criminal liability for the people most responsible for a situation this horrible. Management would tighten up real quick. I don't agree that this is unimplementable in the real world at all. If anything it's a complete abnormality that software development is the way it is, when most other skilled professions are licensed and regulated. |
| |
| ▲ | joseangel_sc 2 days ago | parent | prev [-] | | i have bad news for you |
| |
| ▲ | ryandrake 2 days ago | parent | prev | next [-] | | The certification obviously would have to have teeth. A certification that you needed in order to do work as a software professional, which could be revoked for cases of carelessness or negligence, would disincentivize carelessness and negligence. This is how airline pilot certificates work. And in that career, certification actually works. It's not a miracle or unexplainable. | |
| ▲ | throwanem 2 days ago | parent | prev [-] | | > Would the certification require someone to take an official certification test for the framework used? > And therefore we’re only allowed to use frameworks which have certification tests available? When it's safety-critical, yes, absolutely. A service that handles sensitive PII, such as the one whose "engineers" should be prosecuted for this incident, is definitionally safety-critical. If you're afraid in that world you'd be unable to work, maybe you deserve to be. |
|
| |
| ▲ | hilariously 2 days ago | parent | prev | next [-] | | It's so much worse in the industry. The truth is that many people literally have no idea how to secure things, what to secure, or why to secure it - they pay no attention, are plainly ignorant of the state of the world, and are oftentimes just stupid. I worked at a company where a customer called, confused, because when they googled our company, as they did every day to log in to their portal, they found that driver's licenses we stored were available on the public internet. The devs literally didn't know about insecure direct object references and thought obfuscation was enough, didn't know how robots.txt worked, didn't know about google webmaster shit, didn't know about sitemaps; they were just the cheapest labor the company could find who could do the thing. This is a huge portion of outsourced labor in my experience, not because they are worse overseas in any respect, but because the people looking for cheap labor were always looking for the cheapest labor and had no idea how that applied to the actual technical work of running their business. | |
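For anyone who hasn't run into the term: an insecure direct object reference is exactly the bug described above, serving a record by a guessable ID with no ownership check, where obfuscating the ID changes nothing. A toy sketch of the bug and the fix (all names here are made up, not from any real codebase):

```python
# Toy illustration of an insecure direct object reference (IDOR) and the
# fix: an ownership check on every lookup. Illustrative data only.

DOCUMENTS = {
    "1001": {"owner": "alice", "path": "licenses/alice.pdf"},
    "1002": {"owner": "bob",   "path": "licenses/bob.pdf"},
}

def fetch_document_insecure(doc_id):
    # Vulnerable: anyone who can guess or increment doc_id gets the file,
    # no matter who they are. Making the IDs "random-looking" doesn't fix it.
    return DOCUMENTS[doc_id]["path"]

def fetch_document(doc_id, current_user):
    # Fixed: verify the requester owns the record before serving it.
    doc = DOCUMENTS.get(doc_id)
    if doc is None or doc["owner"] != current_user:
        # Same error for "missing" and "not yours": don't leak existence.
        raise LookupError("not found")
    return doc["path"]

print(fetch_document("1001", "alice"))  # licenses/alice.pdf
```

The point is that the authorization check has to live next to the lookup; robots.txt, sitemaps, and unguessable URLs are all access-control theater.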
| ▲ | jval43 2 days ago | parent [-] | | > they were just the cheapest labor the company could find who could do the thing. That's the problem right there. The company doesn't care. No amount of personal certifications is going to fix that. It MUST be on the companies. They should be fined out of existence for such breaches, and they would quickly change their tune. | |
| ▲ | ChrisMarshallNY 2 days ago | parent [-] | | > They should be fined out of existence for such breaches and they would quickly change tune. Looks like this is a great opportunity for an object lesson. Let’s see how it goes… As far as certification stuff… Civil engineering has had licensing forever. That’s because Bad Things Happen, when they make mistakes. I do think that it would be a good idea to score/certify critical infrastructure stuff. That might involve certification of the people that make it, but it should certainly involve penalties for the people responsible. That might include the authors, but it should probably also include the folks that decide to use the bad code. I know that ISO 9000 is an attempt to address this kind of thing. In my opinion, it’s kind of a mess. I’ve worked in ISO 9000 shops, and it’s not much fun. The thing you learn, pretty quickly, is how to end-run the process, as it’s so heavy, that it basically stops all forward progress. It doesn’t have to, but often does. Mistakes get made. If you design carefully, these mistakes won’t cause real damage. I just figured out that an app I wrote, that’s been out for two years, has an embarrassing bug (mea culpa). I’ll get it fixed today. Because I’m pretty careful, it doesn’t affect stuff like user privacy. It just introduces performance overhead, in one operation, so the fix will mean that the app will suddenly speed up. I’m not sure that certification would have solved it. My security mindset is why user privacy wasn’t affected, and that comes from experience. > Good judgment comes from experience. Experience comes from bad judgement. | | |
| ▲ | Orygin 2 days ago | parent [-] | | Also, if you are personally liable for gross negligence, you will: 1. Get paid more (as fewer fake "engineers" are available for the responsibility). 2. Push back harder (or at least document in detail) on malpractice during development. Manager did not listen to your warnings? Document it, and when shit hits the fan, the manager gets the stick instead of you. Hitting companies with monetary fines does not work. Hitting the employees with jail time will make sure they don't sign off on dangerous or known problematic systems. Manager not listening? Remind them they will face a trial if the issue does surface. | |
| ▲ | ailef 2 days ago | parent [-] | | > Hitting companies with monetary fines does not work. Hitting the employees with jail time will make sure they don't sign on dangerous or known problematic systems. What!? So, when you can't switch jobs because the market is bad or for any other reason, your choices are: 1) quit and lose the income (which you can't afford) or 2) sign on whatever and accept the risk of jail time? | | |
| ▲ | Orygin a day ago | parent | next [-] | | The job market in such society would not be the same as it is now. If you are certified, chances are you will have lots of choices to work. | |
| ▲ | ChrisMarshallNY a day ago | parent | prev [-] | | Sounds like every other vocation out there. Software devs have been insanely privileged, for the last couple of decades. That seems to be changing. |
|
|
|
|
| |
| ▲ | seemaze 2 days ago | parent | prev [-] | | > Wouldn't change a thing... That's exactly what certification or licensure does; it imposes financial, civil, and criminal penalties for malpractice. The liability of incurring penalties quickly outweighs the benefit of arbitraging costs with an unqualified practitioner. | |
| ▲ | hurflmurfl 2 days ago | parent [-] | | I think just putting it on the companies is enough. If the fines are serious and can put your company out of business, and are enforced, then the companies themselves will probably work out processes for not doing stupid stuff.
Whether that be creating some sort of certifications that would be prized by the companies, knowing to hire a specialized team for a security review, or anything else. If everyone knows that messing up security gets you in real trouble and the company loses real money, and it happens all the time, and it's not just "Facebook fined $x million for doing shady stuff", then I think the industry will adapt. Like when GDPR got released and no matter if I thought we are or are not handling PII, I had to read up and double-check my assumptions just because it was being talked about all over the place and it would be embarrassing to be caught with your pants down when you didn't actually intend to do a shady thing. | | |
| ▲ | Orygin 2 days ago | parent [-] | | > I think just putting it on the companies is enough. If the fines are serious and can put your company out of business They don't care. It's either never enough to make them care, or the company can just go bankrupt and you go do something else. If you or your manager has the threat of jail in the back of their mind, it's no longer just someone else's money being lost; it's personal. > If everyone knows that messing up security gets you in real trouble and the company loses real money There are already huge fines on paper for this, but the fines are never, ever enough. It's always factored into the "cost of doing business". Also, it's still someone else's money, so why would an engineer care? Please show me a GDPR fine that hit hard enough to scare companies into not fucking up? Evidently here it was not enough for Fiverr. Edit: Just to provide an example, Takata airbags have been recalled massively (if you don't know why, look it up), but the company is now bankrupt, and who is footing the bill? Their customers. You cannot impose a fine on them, as it's bankrupt (now, but it was always the plan). They deliberately sold dangerous airbags, and now what can you do so it doesn't happen again? Fine them some more? Or maybe throw a few execs in jail, because they knew of the problem and continued as usual. |
|
|
|
|
| ▲ | userbinator 2 days ago | parent | prev | next [-] |
> some kind of genuine software engineering certification That only gives those in power another way to push people into toeing the line. There's enough corporate authoritarianism these days as it is. Give Stallman's "Right to Read" a read. His dystopia is exactly where we're headed, and quickly, if we keep demanding that someone "do something". "The optimal amount of fraud is nonzero." "Those who give up freedom for security deserve neither." |
| |
| ▲ | computably 2 days ago | parent [-] | | You're responding to literally 7 words out of context. > Jobs with access to/control over millions of people's data should require some kind of genuine software engineering certification FAANG, Fortune 500, etc., almost universally go out of their way to violate user freedom in pursuit of profit. Regulation is practically the only way to force megacorps to respect users' rights and improve their security, as evidenced by right-to-repair, surveillance/privacy, and so on. And none of that has anything to do with users' individual rights to create, run, and modify their own software. (Yes, regulatory capture exists, no, it doesn't mean all regulation is bad.) | | |
| ▲ | userbinator 2 days ago | parent [-] | | If the megacorps are going in that direction of being strictly regulated, the rest of the industry will follow. It's the general movement of the Overton Window that's the underlying issue. | | |
| ▲ | subscribed 2 days ago | parent [-] | | No, they won't. No one in their right mind "wants" ISO27001, ISO9001, SOC or multiple PITA certifications. Companies do that because they want to attract a certain kind of customer and have enough spare manpower and money to go through this all year long. ....or they want to hold very sensitive data that requires *proven* processes, trainings and skills. My firm has several of these, and we have to keep a full compliance team and *always* have some auditor on site. No one does it just because. |
|
|
|
|
| ▲ | victorbjorklund 2 days ago | parent | prev | next [-] |
I once worked at a company and noticed that customer financial statements were publicly accessible. I ran into the software team and got the reply that no one had told them it should be behind authentication. Some people really don't use their own brains. |
| |
| ▲ | 21asdffdsa12 2 days ago | parent [-] | | If you do, you get into trouble with the hierarchy: all those middle-managers and responsibility-distributing committees would be unemployed otherwise. |
|
|
| ▲ | Loughla 2 days ago | parent | prev | next [-] |
| Teachers have to be licensed and keep up on licensing. Plumbers. Electricians. Lawyers. Doctors. Hell, I have to get a license to run my own business. Why shouldn't software come with a branch for licenses if you're working with sensitive data? |
| |
| ▲ | coldtea 2 days ago | parent | next [-] | | We're going the other way: now any random vibe-coded slop is the norm. | |
| ▲ | bad_haircut72 2 days ago | parent [-] | | Normalize "vibe-plumbing" | | |
| ▲ | bombcar 2 days ago | parent | next [-] | | Both plumbing and wiring are “easier” in a way than programming, as they'll violently and potentially explosively let you know if you messed up; whereas programming lets you be blissfully unaware until you see your data plastered across the nightly news. | |
| ▲ | avian 2 days ago | parent | next [-] | | Wiring mistakes can kill or burn down a house months or years after they were made. You will not notice an unconnected protective earth or badly dimensioned circuit breakers until something else breaks and the protective element is not there. | |
| ▲ | Ekaros 2 days ago | parent | prev | next [-] | | There are many failure cases that are slow, especially with water. Let's say the pipes sag a bit and a connection is poor. It might slowly start leaking over months, causing structural damage or at least dampness and microbiological effects. | |
| ▲ | ChoGGi 2 days ago | parent | prev [-] | | I'm in the midst of renovating a house at the moment, it's about ten years old. The plumber siliconed all the shower valves to the fiberglass walls without screwing them to a backplate. Unsurprisingly the builder is now out of business. | | |
| |
| ▲ | lotsofpulp 2 days ago | parent | prev | next [-] | | It is, it just usually results in immediate calls to actual plumbers without anyone else finding out. Or it’s hidden behind some new drywall and paint until a different occupant finds out. | |
| ▲ | dfedbeef 2 days ago | parent | prev [-] | | This is a good comment |
|
| |
| ▲ | bradleyankrom 2 days ago | parent | prev [-] | | Hairdressers! |
|
|
| ▲ | ge96 2 days ago | parent | prev | next [-] |
People at my company don't even lock their computer when they walk away from their desk. Which, yeah, it's in a controlled environment, but still. |
| |
| ▲ | yojo 2 days ago | parent | next [-] | | My work has a “donuts” slack channel for this. If you find an unlocked computer, you post “donuts on me!” and social pressure says they buy the office donuts. We still get a few a week, but at least it’s public and amusing. | |
| ▲ | rcbdev 2 days ago | parent [-] | | This would be borderline illegal in most countries. Not very enforceable, sure, but illegal. |
| |
| ▲ | SillyUsername 2 days ago | parent | prev [-] | | We used to flip the display upside down in the display options, which also reverses the mouse. We'd then lock the PC and disconnect the keyboard.
After they'd figured out the keyboard had been pulled, they often still couldn't work out why their screen was upside down... |
|
|
| ▲ | fnimick 2 days ago | parent | prev | next [-] |
| At least I'm sure LLM tools deploying code to production won't result in this happening more frequently. "Make sure it's secure. Make no mistakes." |
| |
|
| ▲ | philip1209 2 days ago | parent | prev | next [-] |
| good thing it's getting easier to code - nothing bad can come of this :-) |
|
| ▲ | borplk a day ago | parent | prev [-] |
| Unfortunately everything is going in the opposite direction. We are in the age of AI-slop AI-everything AI-break-it AI-fix-it. Software companies are competing with each other on how low they can push the quality and still get away with it. There's no reward or incentive for paying attention to the details or the quality. In fact you will get penalised for it. |