Aurornis 2 days ago

> should require some kind of genuine software engineering certification

Wouldn't change a thing, other than add another hassle you have to pay for to do your job.

This is the result of carelessness, not someone who didn't know that private data should be private because they weren't certified.

applfanboysbgon 2 days ago | parent | next [-]

This is the result of somebody who has no idea how the fuck the tech they're using works. They surely knew it should be private, but they did not know that they were making it publicly available, because they were blindly fumbling their way around in a job beyond their competence level. There is a 0% chance this was ordinary carelessness in the form of "I know better but don't care enough"; this is so clearly a case of "I don't know what I'm doing".

Aurornis 2 days ago | parent [-]

Any time someone tries to suggest certification as a solution I ask the same question: How would it have solved this problem?

Would the certification require someone to take an official certification test for the framework used?

And therefore we’re only allowed to use frameworks which have certification tests available?

If you want to write some new software, do you have to generate a certification for it and get that approved so people are allowed to use it?

Sounds like a great way to force us all to use Big Company-approved software, because they're the only ones with pockets deep enough to play all of the certification games.

fsflover 2 days ago | parent | next [-]

How did traditional engineering certification prevent dangerous construction? Did it force everyone to use a Big Company?

applfanboysbgon 2 days ago | parent | prev | next [-]

The fact that you're thinking purely in frameworks is the exact problem that plagues the software industry. Framework-focused development is why we're in this mess; frameworks make it easy for people who don't understand how to program to publish shitty software by copying-and-pasting code and fudging around a few strings or variables to match their use case. That kind of accessibility is great for low-stakes software, letting anyone make interesting toys, but should be completely unacceptable in a professional environment with, for example, people's fucking tax documentation at stake.

If I had my way, the certification process would start at the bottom of the stack, i.e. you should be expected to have a functional knowledge of assembly instructions, memory management, registers, the call stack, and build up from there. Not that we need to write assembly on a daily basis, but all of the abstractions are built on top of that, and you cannot realistically engineer secure software if you don't understand what is being abstracted away. If you do understand the things being abstracted away, you have the fundamentals necessary to do good work with any programming language or framework. Throw in another certification starting from networking fundamentals if your job involves that. 30 years ago, most professional programmers had this level of understanding as table stakes, so we can hardly say it's an unrealistic burden that's impossible to meet.

Would it be a higher barrier to entry that massively cuts the size of the field working on sensitive software and slows software development down? Yes. That is exactly what we need. There was a time when people built bridges that collapsed; then we implemented standards and expected engineers to do real work to make sure that didn't happen. Is that work expensive and expertise-intensive? Yes. Do bridges still collapse? Only very rarely. We are witnessing software bridge collapses on a weekly basis, which should be seen as completely unacceptable. The harm is less obvious than when everyone on a bridge dies, but I do think that routinely leaking millions of people's sensitive data is causing serious harm and likely does lead to people dying through second-order effects.

bruce511 2 days ago | parent | next [-]

I follow your logic here, and it's certainly a coherent argument.

That said, there are perhaps some factors you are overlooking which matter.

The first is that no amount of certification solves the actual problem (which is that security mistakes are made, often in new and novel ways).

Secondly, the amount of software being needed (and produced) is immense. Bridges require engineers, but the demand for new bridges is tiny. The demand for new software is enormous, and the current rate of production requires many more people than could ever be certified.

In other words, say you only allowed comp-sci graduates with a proper 4-year degree, covering assembly upwards, etc. The supply of programmers would drop to what colleges could produce, which is not nearly enough.

The analogy also falls down a bit on penalty-for-failure: a collapsed bridge kills people, while bugs in my notepad app might lead to information leaks. That's not the same thing.

In truth, at least for the last 35 years, the number of unqualified developers has exceeded qualified ones by orders of magnitude. And there still seems to be no limit to software demand.

Finally, there have been no studies I am aware of that suggest security flaws are added more frequently by non-comp-sci grads than by comp-sci grads. Anecdotally, I don't see that distinction myself. (From my observation, security outcomes correlate with the degree to which the individual considers security to be important.)

And, of course, security issues are not limited to programmers; management has a role to play as well. Should they be certified too?

So, I'm not convinced that your suggestion, however desirable, would solve the problem. And since it's clearly unimplementable in the real world, it's a moot argument anyway.

applfanboysbgon 2 days ago | parent [-]

"Bridges" are shorthand. There is no shortage of need for new infrastructure. Any kind of construction needs engineers involved to ensure what's being built doesn't collapse from a gust of wind. Apparently there are about 1.5 million engineers and 4.5 million software developers in the US. Well, I think in the short term, certifying only 1.5 million "software engineers" would be fine, actually. Note that my argument pertains only to sensitive software: if you want to make software that doesn't pose a danger to its users, you don't need an 'engineer'. This should have the second-order benefit of making PII toxic waste. If you need a real engineering team to process PII, companies that don't need PII will stop scraping every last fucking thing and leaking it. The majority of software in the world doesn't actually need PII to function; companies could just be incentivized to stop hoarding it and use a regular "software development" team if they want to deliver cheap and fast.

I also wouldn't specifically associate this with college degrees. In fact I think universities are doing a shockingly bad job of producing functional software developers. But, on the other hand, you don't need a university to produce a good programmer. Software development is possibly the most open, information-available discipline in the world. Self-motivated learners can absolutely become competent on their own. The certification should be merit-based, and provide a clear path to learning the material the certification is based on. Many people will go through the effort to educate themselves and learn the required skills, especially if certified software engineers are in high demand and command a higher salary.

Regarding the penalty-for-failure, as I said, the harm is not as immediately apparent as when people die in a bridge collapse. But leaking sensitive information still leads to people dying, even if the connection is not as direct. Doxxing and blackmail frequently lead to suicide, and there are other damages that could lead to a butterfly effect culminating in a higher death rate, or, even if not death, tangible harm. This leak contained birth certificates, IDs, passports, tax documentation, passwords, all kinds of information that could be used to ruin someone's life with identity fraud. There is also, of course, some software in the world that is directly safety-critical, much of the software used in the health field for instance, which is also currently being written by the lowest bidder in many cases.

Regarding management, they don't need a certification but rather consequences for their actions. Currently the incentive structure is such that management is rewarded for cutting costs and is never punished for harming customers. Fiverr, for instance, should be facing an investigation that threatens to shut down the business given that not only did this happen in the first place, and not only did they ignore it for 40 days, but even after it went public the sensitive files were still accessible for 12+ hours (notably, after they were definitely made aware of it, given reports in this thread of people receiving replies from Fiverr about it). Maybe throw in some criminal liability for the people most responsible for a situation this horrible. Management would tighten up real quick.

I don't agree that this is unimplementable in the real world at all. If anything it's a complete abnormality that software development is the way it is, when most other skilled professions are licensed and regulated.

joseangel_sc 2 days ago | parent | prev [-]

i have bad news for you

ryandrake 2 days ago | parent | prev | next [-]

The certification obviously would have to have teeth. A certification that you needed in order to do work as a software professional, which could be revoked for cases of carelessness or negligence, would disincentivize carelessness and negligence.

This is how airline pilot certificates work. And in that career, certification actually works. It's not a miracle or unexplainable.

throwanem 2 days ago | parent | prev [-]

> Would the certification require someone to take an official certification test for the framework used?

> And therefore we’re only allowed to use frameworks which have certification tests available?

When it's safety-critical, yes, absolutely. A service that handles sensitive PII, such as the one whose "engineers" should be prosecuted for this incident, is definitionally safety-critical.

If you're afraid in that world you'd be unable to work, maybe you deserve to be.

hilariously 2 days ago | parent | prev | next [-]

It's so much worse in the industry. The truth is that many people literally have no idea how to secure things, what to secure, or why to secure it: they pay no attention, are plainly ignorant of the state of the world, and are oftentimes just stupid.

I worked at a company where a customer called, confused: when they googled our company (as they did every day to log in to their portal), they found that drivers' licenses we stored were available on the public internet.

The devs literally didn't know about insecure direct object references and thought obfuscation was enough; they didn't know how robots.txt worked, didn't know about Google webmaster tooling, didn't know about sitemaps. They were just the cheapest labor the company could find who could do the thing.
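The mistake described here (the server hands back any object whose ID you supply, with no ownership check) can be sketched in a few lines. All names, IDs, and functions below are made up for illustration; this is not anyone's actual code:

```python
# Hypothetical illustration of an insecure direct object reference (IDOR).
# A tiny in-memory "document store" stands in for the real storage backend.
DOCUMENTS = {
    101: {"owner": "alice", "body": "alice-drivers-license.jpg"},
    102: {"owner": "bob", "body": "bob-drivers-license.jpg"},
}

def fetch_document_insecure(doc_id):
    # The vulnerable pattern: whoever supplies an ID gets the file.
    # Sequential IDs mean an attacker can simply enumerate 1, 2, 3, ...
    # and download every document in the system.
    return DOCUMENTS.get(doc_id)

def fetch_document_checked(doc_id, requesting_user):
    # The fix: an authorization check on every single object access.
    doc = DOCUMENTS.get(doc_id)
    if doc is None or doc["owner"] != requesting_user:
        return None  # behave as if the document does not exist
    return doc
```

Obfuscating or base64-encoding the ID changes nothing, since the underlying ID space is still enumerable, and robots.txt only asks polite crawlers not to index a URL; it does nothing to stop anyone from fetching it directly.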

This is a huge portion of outsourced labor in my experience, not because they are worse overseas in any respect, but because the people looking for cheap labor were always looking for the cheapest labor and had no idea how that applied to the actual technical work of running their business.

jval43 2 days ago | parent [-]

>they were just the cheapest labor the company could find who could do the thing.

That's the problem right there. The company doesn't care. No amount of personal certifications is going to fix that.

It MUST be on the companies. They should be fined out of existence for such breaches and they would quickly change tune.

ChrisMarshallNY 2 days ago | parent [-]

> They should be fined out of existence for such breaches and they would quickly change tune.

Looks like this is a great opportunity for an object lesson. Let’s see how it goes…

As far as certification stuff…

Civil engineering has had licensing forever. That’s because Bad Things Happen, when they make mistakes.

I do think that it would be a good idea to score/certify critical infrastructure stuff. That might involve certification of the people that make it, but it should certainly involve penalties for the people responsible. That might include the authors, but it should probably also include the folks that decide to use the bad code.

I know that ISO 9000 is an attempt to address this kind of thing. In my opinion, it's kind of a mess. I've worked in ISO 9000 shops, and it's not much fun. The thing you learn, pretty quickly, is how to end-run the process, as it's so heavy that it basically stops all forward progress. It doesn't have to, but often does.

Mistakes get made. If you design carefully, these mistakes won’t cause real damage.

I just figured out that an app I wrote, which has been out for two years, has an embarrassing bug (mea culpa). I'll get it fixed today.

Because I’m pretty careful, it doesn’t affect stuff like user privacy. It just introduces performance overhead, in one operation, so the fix will mean that the app will suddenly speed up.

I’m not sure that certification would have solved it. My security mindset is why user privacy wasn’t affected, and that comes from experience.

> Good judgment comes from experience. Experience comes from bad judgement.

Orygin 2 days ago | parent [-]

Also, if you are personally liable for gross negligence, you will:

1. Get paid more (as fewer fake "engineers" are available for the responsibility).

2. Push back harder (or at least document in detail) on malpractice during development. Manager did not listen to your warnings? Document it, and when shit hits the fan, the manager gets the stick instead of you.

Hitting companies with monetary fines does not work. Hitting the employees with jail time will make sure they don't sign off on dangerous or known-problematic systems.

Manager not listening? Remind them they will face a trial if the issue does surface.

ailef 2 days ago | parent [-]

> Hitting companies with monetary fines does not work. Hitting the employees with jail time will make sure they don't sign off on dangerous or known-problematic systems.

What!? So, when you can't switch jobs because the market is bad or for any other reason, your choices are: 1) quit and lose the income (which you can't afford), or 2) sign off on whatever and accept the risk of jail time?

Orygin a day ago | parent | next [-]

The job market in such a society would not be the same as it is now.

If you are certified, chances are you will have lots of choices to work.

ChrisMarshallNY a day ago | parent | prev [-]

Sounds like every other vocation out there.

Software devs have been insanely privileged, for the last couple of decades. That seems to be changing.

seemaze 2 days ago | parent | prev [-]

> Wouldn't change a thing...

That's exactly what certification or licensure does; it imposes financial, civil, and criminal penalties for malpractice.

The liability of incurring penalties quickly outweighs the benefit of arbitraging costs with an unqualified practitioner.

hurflmurfl 2 days ago | parent [-]

I think just putting it on the companies is enough. If the fines are serious, can put your company out of business, and are enforced, then the companies themselves will probably work out processes for not doing stupid stuff, whether that be creating some sort of certification that would be prized by the companies, knowing to hire a specialized team for a security review, or anything else.

If everyone knows that messing up security gets you in real trouble and the company loses real money, and it happens all the time, and it's not just "Facebook fined $x million for doing shady stuff", then I think the industry will adapt.

Like when GDPR came into force: no matter whether I thought we were or were not handling PII, I had to read up and double-check my assumptions, just because it was being talked about all over the place and it would be embarrassing to be caught with your pants down when you didn't actually intend to do a shady thing.

Orygin 2 days ago | parent [-]

> I think just putting it on the companies is enough. If the fines are serious and can put your company out of business

They don't care. It's either never enough to make them care, or the company can just go bankrupt and you go do something else.

If you or your manager have the threat of jail in the back of your minds, it's no longer just someone else's money being lost; it's personal.

> If everyone knows that messing up security gets you in real trouble and the company loses real money

There are already huge fines on paper for this, but the fines are never enough. They're always factored into the "cost of doing business". Also, it's still someone else's money; why would an engineer care?

Please show me a GDPR fine that hit hard enough to scare companies into not fucking up. Evidently it was not enough for Fiverr here.

Edit: Just to provide an example, Takata airbags have been recalled massively (if you don't know why, look it up), but the company is now bankrupt, and who is footing the bill? Their customers.

You cannot impose a fine on them, as the company is bankrupt (now, though that was arguably always the plan). They deliberately sold dangerous airbags, and now what can you do so it doesn't happen again? Fine them some more? Or maybe throw a few execs in jail, because they knew of the problem and continued as usual.