| ▲ | tsujamin 2 days ago |
| This reminds me strongly of reaching the final year industry projects in my software engineering degree, and seeing a significant portion of my colleagues unable to develop software in any meaningful way. There was a curriculum correction in the years afterwards I think, but so many students had zero concept of version control, of how to start working on a piece of software (sans an assignment specification or scaffold), or how to learn and apply libraries or frameworks in general. It was unreal. |
|
| ▲ | ninkendo 2 days ago | parent | next [-] |
| In my senior design project in college, we were the only team that decided to use version control. I pushed for it, and set up a CVS server, mostly because I was the team lead and thought it was an easy way to feel like I was making a difference on the team. This was around 2005 or so, git didn’t really exist yet, svn was the new kid on the block and cvs was the established player. I had never used any vcs before and neither had anyone on any team, but man was it worth it. The ability to have one place with the latest code and not emailing zip files around was great, but so was being able to easily roll back to a known good version if we caused an issue, compare changes, etc. By the end of it we all agreed it would have been impossible to do as well as we did if we didn’t do version control. (This was a cross disciplinary engineering curriculum with ME/CE/CS, ours was slightly more software-heavy than other teams but everyone had some amount of software. Version control wasn’t taught and most teams just didn’t even consider it. It was a very different time from today.) |
| |
| ▲ | compootr 2 days ago | parent [-] | | That seems pretty crazy to me (18M). Even for one-off scripts, I'll often throw them into a VCS because why not! | | |
| ▲ | ninkendo 2 days ago | parent [-] | | It was quite a bit harder back then to set up CVS. You had to have a cvs server running, there was no way to just “git init” and commit as you go and worry about pushing later. (At least not that I knew about then, I was pretty green.) Public hosting services were hard to come by, so it meant setting up a real server on the internet for your colleagues to use, figuring out auth, etc. Nowadays version control is just so easy it’s easy to forget how good we have it. Not just in getting started locally but pushing to a public service that everyone can share (even private repos are free on GitHub nowadays, it’s a complete no brainer.) |
|
|
|
| ▲ | bdndndndbve 2 days ago | parent | prev | next [-] |
| At the end of my 5 year computer engineering degree, one of the groups had nothing to show for their industry project. They had written an Android app with MySQL credentials hard-coded into it, and on the school's network they couldn't connect to port 3306. They could have changed the MySQL port, or they could have written a REST API, but instead they just gave up and it didn't matter. I was already pretty disillusioned with my undergrad program but that was really the icing on the cake. |
|
| ▲ | intelVISA 2 days ago | parent | prev | next [-] |
| Is that unique to software though? Plenty of people can follow a plan but still find it tough to start from first principles, I would think. |
| |
| ▲ | SkiFire13 2 days ago | parent [-] | | I've worked for a bit in an engineering company and I was surprised at how bad they were at versioning their documents. | | |
| ▲ | codazoda 2 days ago | parent [-] | | Do you have any examples of correctly versioned documents? | | |
| ▲ | SkiFire13 a day ago | parent | next [-] | | Some big issues were:
- they were manually tracking the "versions" of documents by copy-pasting them into some folders;
- even this was only done for each "release" of each document; between two releases all the changes were made to files shared on OneDrive (possibly concurrently by two people, sometimes leading to conflicts and the loss of days of work);
- at every release, the changes since the last release had to be looked up manually and compiled into a document, which was very time consuming;
- information was duplicated across multiple documents with no way to relate them, so every change to one of them had to be manually replicated to the others. I would argue that a correctly versioned document should not have these issues. Dedicated software should track all the changes, including individual ones between releases. It should also provide a way to list them, possibly relative to some milestone (like the last release). Data should be kept in a format that's easy to compare automatically for changes, and ideally in a deduplicated way so that changes need to be made in only one place. If that's not possible, I would argue there should be software that checks for inconsistent information and prompts for it to be synchronized. In the software development world this has mostly been solved by version control systems like git and continuous integration to ensure that the codebase is in a consistent state after every change. | |
| ▲ | josephg 2 days ago | parent | prev [-] | | I mean, I’d say a markdown / latex / typst document in a Git repository would fit the bill. I’m working on a history project at the moment which has reconstructed the version history of the US constitution based on the secretarial records and various commentaries written during the drafting process. At the moment we’re working on some US state constitutions, the Indian constitution, Irish peace process and the Australian constitutional process. We only have so many historical records of the committee processes, but it turns out to be more than enough to reconstruct the version history of the text. | | |
| ▲ | andai 2 days ago | parent [-] | | Fascinating, what tools are you using to keep track of all that? (Also, are there any interesting practices involved here?) | | |
| ▲ | josephg a day ago | parent [-] | | All custom tooling. I should write it up at some point - there's an awful lot to say about the whole thing, both technically and historically. And the data shows that there are several things apparently taught in American schools (like the idea that the final text is a compromise between the Virginia plan and New Jersey plan) that are simply wrong. |
|
|
|
|
|
|
| ▲ | MrMcCall 2 days ago | parent | prev | next [-] |
| Our final CS course, Operating Systems, was in C, after all the previous courses were in Pascal. Luckily, I had already gotten access to various Unix systems and had taught myself C (thanks, K&R !!). And I say luckily because it was especially lucky for my SWE group members, who would otherwise not have graduated. I was already 10x at that point because of early access and passion for the craft. Most if not all of them had already decided that programming was not their career path, so it was to everyone's benefit and happiness. Funny enough, there was no version control at our uni (a pretty good one, but not primarily technical), and the OS we tweaked for the course was the then-current version of Tanenbaum's Minix, which Linus transformed into Linux. Twenty minutes for a recompile-and-test loop to fix that stupid mistake in the semaphore logic was painful, but that's life on a 286. It took real passion to want to bang through that learning curve. It really weeded out the folks who were just looking for an engineering job, at least among the handful (4) of people I knew in the program. |
| |
| ▲ | aleph_minus_one 2 days ago | parent | next [-] | | > It really weeded out the folks who were just looking for an engineering job Wanting an engineering job means that engineering is such an important part of your life that you desire that your job (i.e. many hours each day) centers around it. The breed of people you mentioned being weeded out were not looking for an engineering job, but for some well-paid (often management) job that formally requires engineering qualifications but whose daily business has barely anything to do with engineering. | |
| ▲ | MrMcCall 2 days ago | parent [-] | | In this ~40yo case, I'm guessing it was more of a "I got into a very prestigious state university because I busted my butt in high school and I've learned that CS is a high growth industry." The talented and hard working folks got in and found that studying algorithms at the beginning of the 3rd year from a textbook was doable, but designing and implementing a significant software system (or tweaking an operating system) in the 4th year is a whole other level. It's just that software design and engineering is really a unique beast. I mean, it is the most difficult engineering on the planet, because every single other industry and discipline depends upon it. | | |
| ▲ | ninkendo 2 days ago | parent [-] | | It certainly should be the most difficult engineering on the planet, yes… but IME the standards for quality are so low that it’s a pretty easy gig. You’re expected to slap together stuff that barely works and fix things later. That’s not how any other engineering discipline works, or at least not nearly to the same degree. Also it’s hard to observe how hacky software is from the outside, so it’s easy to get away with a terrible mess of code that would nauseate anyone who looked at it. Most of the time management doesn’t even care. Watching Practical Engineering on YouTube is a pretty illuminating experience for me, as it shows the extreme care that goes into projects from the outset, how much planning is involved, how much we’ve learned from centuries of experience building things, and how even despite all of this, things can still fail spectacularly. And when they do, there are independent reports done by thorough investigators, who find the real root causes and carry the knowledge forward for future projects. It makes me sad that software isn’t treated this way. Yes, we get things off the ground fast, but we don’t learn from our mistakes, we don’t conduct thorough investigations into failures, and often people just release things and move on to the next job. Software may be a more complicated and “difficult” discipline, but it sure isn’t treated like it. | |
| ▲ | MrMcCall 2 days ago | parent | next [-] | | Its difficulty and uniqueness is why it just hasn't been 'engineerized' or 'engineerified' yet. It's still a craft, as opposed to an engineering discipline. As such, that means the quality of the software produced by any shop is going to vary according to the organization's standards, which is their combination of management and engineers. Sometimes management is very business-schooly in their perspective, sometimes more engineery. That boundary layer and how it is treated by the top of the hierarchy makes all the difference. I was summer programming in an IT department in the late 80's; it was under the auspices of the comptroller, simply because the org didn't know where else to put it. Management was still figuring out which department would have the budget/cost stuff allocated to it. You can forget about engineering excellence or even semblance of IT knowledge. Everything since then has been the (in)organic growth of IT in the situation that, for the vast majority of companies, IT is simply a cost whose benefit is hard to quantify, especially to the money guys, who are always the ones in charge. | |
| ▲ | jart 2 days ago | parent | prev [-] | | Planning and "extreme care" makes sense if you're building a bridge. Professional software engineering is more like a demolition derby. If you want to be successful you should focus on building fast and breaking things. It wouldn't make sense to blow up a bridge right after you've constructed it, but with software you can do just that. For example, I was just using American Fuzzy Lop a moment ago to break my symbol demangler. It was able to find eight ways to make it crash in a minute. So I fixed all those and let it run for a couple hours and it found another. Tools like this are a superpower. It's also fun to write torture tests and DDOS your own infrastructure. If you don't break it then someone else will. |
|
|
| |
| ▲ | jart 2 days ago | parent | prev [-] | | With that kind of build latency, you're weeding out people who don't want to spend their whole day on coffee break. | | |
| ▲ | MrMcCall a day ago | parent [-] | | Yeah, I guess. I rather enjoyed those days but was always craving more power to just get on with it. And those build times were nothing compared to waiting for a long SQL process. It's all really just determining whether the data flowed properly or not. From a C64 BASIC program to an OS component to a database's tables, it's all just data flowing from one place/form to another. |
|
|
|
| ▲ | neerajsi 2 days ago | parent | prev [-] |
| I was an EE undergrad. The formative project for me was a competition to see who could make the fastest digitally controlled maze solving robot. The key that gave my team an advantage was the humble ASSERT. If the robot got off track, it would stop in place, blink a light, and show a line number on a digital display. I've been working in and around Windows for a long time, and I'd say asserts and crash dumps are the two things that allow us to improve our quality given that we're still mostly using C/C++. |