eddythompson80 8 hours ago

Python was commonplace long before ML. Ever since 1991, it would jump in popularity every now and then, collect enough mindshare, then dive again once people found better tools for the job. It long ago took the place of Perl as the quick "Linux script that's too complex for bash," especially when Python 2 shipped with almost all distros.

For example, Python got a similar boost in popularity in the late 2000s and early 2010s, when almost every startup was either Ruby on Rails or Django. Then again in the mid 2010s when "data science" got popular with pandas. Then again at the end of the 2010s with ML. Then again in the 2020s with LLMs. Every time, people eventually dropped it for something else. It's arguably in a much better place these days, with type hints, asyncio, and a far richer ecosystem in general, than it was back then. As someone who worked on developer tools and devops for most of that time, though, I always dread dealing with Python developers, tbh.

CatMustard 7 hours ago | parent [-]

> I always dread dealing with python developers though tbh.

Out of curiosity, why is that?

eddythompson80 5 hours ago | parent [-]

There are plenty of brilliant people who use Python. However, in every one of these boom cycles I dealt with A LOT of developers with horrific software engineering practices, little understanding of how their applications and dependencies work, and just plain bizarre ideas of how services work.

Like the one who shows up with a single 8k-line run.py containing maybe 3 functions, asking to "deploy it as a service" and expecting us to literally launch `python3 run.py` for every request. It takes 5 minutes to run. It assumes there is only one execution at a time per VM because it always writes to /tmp/data.tmp. Then come the "you guys don't know what you're doing" questions, like "yeah, it takes a minute, but can't you just return a progress bar?" In a REST API? Or "yeah, just run one per machine. Shouldn't you provide isolation?"

Then there's the guy who zips up his venv from a Mac or Windows machine and expects it to just run on a Linux server. Or the guy who has no idea what system libs his application needs and is confused that we're not running a full Ubuntu desktop in a server environment. Or the guy who hands you a 12GB Docker image because "well, I'm using Anaconda."
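The hard-coded /tmp/data.tmp pattern breaks the moment two runs overlap on one VM. A minimal sketch of the fix, using only the standard library (function name is mine, for illustration): let `tempfile` hand each invocation its own path instead of sharing one:

```python
import os
import tempfile

def write_scratch(data: bytes) -> str:
    """Write data to a per-invocation temp file instead of a shared
    hard-coded path like /tmp/data.tmp, so concurrent runs on the
    same machine can't clobber each other."""
    fd, path = tempfile.mkstemp(prefix="run-", suffix=".tmp")
    try:
        os.write(fd, data)
    finally:
        os.close(fd)
    return path

# Two overlapping invocations get distinct files:
a = write_scratch(b"first run")
b = write_scratch(b"second run")
assert a != b
```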

Containers have certainly helped a lot with Python deployments these days, even if the Python community was late to adopt them for some reason. Throughout the 2010s, when containers would have provided a much better story (especially for Python, where many popular libraries are just C wrappers and you have to pip install on the same target environment), the Python developers I dealt with were all very dismissive of them and just wanted to upload a zip or tarball because "Python is cross-platform. It shouldn't matter." Then we had to invent all sorts of workarounds to make sure we had hundreds of random system libs installed, because who knew what they were using and what pip would need to build their things. Prebuilt wheels were a lot less common back then too, which made pip installs very resource-intensive, slow, and flaky whenever some system lib was missing or had been updated. Still, Python application Docker images always range in the tens of GBs.
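For contrast, a sketch of how prebuilt wheels change that picture today (the file names and base image are illustrative, not from the comment): a slim base plus wheel-only installs avoids both the compiler toolchain and the "who knows what system libs" problem, since manylinux wheels bundle their native dependencies:

```dockerfile
# Illustrative sketch only: a small Python service image that relies
# on prebuilt wheels instead of compiling C extensions at build time.
FROM python:3.12-slim

WORKDIR /app

# --only-binary=:all: refuses source builds, so the build fails fast
# if a dependency ships no prebuilt wheel (instead of silently pulling
# in a compiler and random -dev packages); --no-cache-dir keeps pip's
# download cache out of the image layer.
COPY requirements.txt .
RUN pip install --no-cache-dir --only-binary=:all: -r requirements.txt

COPY . .
CMD ["python", "run.py"]
```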