| ▲ | dzonga 8 hours ago |
| good to see incredible stuff being shipped in Swift. Haven't used it since v3 though. around 2015-17 - Swift could have easily dethroned Python. it was simple enough - very fast - could plug into the C/C++ ecosystem. Hence all the numeric stuff people were doing in Python powered by C++ libraries could've been done with Swift. the server ecosystem was starting to come to life, even supported by IBM. I think the letdown was on the Apple side - they didn't bring in the community fast enough whether on marketing, or messaging - unfortunately Swift has remained largely an Apple ecosystem thing - with complexity now chasing C++. |
|
| ▲ | willio58 3 hours ago | parent | next [-] |
| > the server ecosystem was starting to come to life, even supported by IBM. I was in college at the time and doing some odd freelance jobs to make some money. Unbeknownst to my clients I was writing their website backends in Swift, using buildpacks on Heroku to get them hosted. It was a fun time for me and I love Swift, but I will admit last year I went ahead and rewrote an entire one of those sites in good ol' TypeScript. I love Swift, but anything outside the Apple ecosystem just seems like it hasn't hit critical mass yet. |
|
| ▲ | isodev 4 hours ago | parent | prev | next [-] |
| > Swift has remained largely an Apple ecosystem Even today, with the fancy Swift 6.3, the experience of using Swift for anything other than apps for Apple platforms is very painful. There is also the question of trust - I don't think anyone would voluntarily introduce Apple "The Gatekeeper" in parts of their stack unless they're forced to do it. |
| |
| ▲ | mathverse 3 hours ago | parent [-] | | You can use Swift on the server, but what for? You have gigantic ecosystems in languages X, Y, Z. Even Apple does not use Swift on the server (AFAIK), so why would you? | | |
| ▲ | jshier 3 hours ago | parent | next [-] | | What, of course Apple uses Swift on the server, that's the only reason they're investing in any of this. Many of the foundational Swift on the server libraries were written at Apple and later opened, like SwiftNIO. | |
| ▲ | isodev 3 hours ago | parent | prev [-] | | > Even Apple does not use Swift Exactly true - they've created all these "working groups" of open source / volunteers to care for Android / Server / Wasm / ... all while being constrained "as an Apple product". Of course the end result is crappy | | |
| ▲ | mathverse 3 hours ago | parent [-] | | Yea there is no incentive. Why use Swift on the server or in k8s when you have gazillion other languages that are performant and have the ecosystems. |
|
|
|
|
| ▲ | vmsp 7 hours ago | parent | prev | next [-] |
| True. Google was even thinking of switching TensorFlow from Python to Swift. https://github.com/tensorflow/swift |
| |
| ▲ | mi_lk 6 hours ago | parent | next [-] | | That’s really because Chris Lattner was at Google Brain at the time. Don’t think it ever took off in meaningful ways | |
| ▲ | mark_l_watson 6 hours ago | parent | prev [-] | | I was enthusiastic about early TensorFlow in Swift efforts, sorry when the effort ended. My interest then flowed into early Mojo development for a while. I wrote an eBook on Swift several years ago but rarely update that book anymore. Count me as one of the many developers who for a while thought Swift would take over the world. At least Swift is a fun language to use, and now with LLM coding tools writing macOS/iOS/iPadOS apps is fairly easy. | | |
| ▲ | c-fe 3 hours ago | parent [-] | | funnily enough, I talked recently to someone working on the swift compiler (not an Apple employee) to make Swift functions differentiable. So its not all dead yet |
|
|
|
| ▲ | mdemare 5 hours ago | parent | prev | next [-] |
| Python 3 barely managed to dethrone Python. |
| |
| ▲ | throwaway27448 3 hours ago | parent [-] | | I'm sorry, that's absolutely bullshit. In fact, I wish we had left everyone who complained behind—the python community would have been happier and healthier for it. Absolute crybabies who wanted to be catered to without caring for how intractable the problems with python2 were—e.g. dealing with unicode was a royal pain in the ass, and the bytes/string divide completely fixed it. IMO, it was the best-executed breaking change I've ever witnessed in a language. In comparison, e.g. Scala 2 -> Scala 3 was an absolute nightmare—it just didn't have the same vocal wailing from maintainers in the community (or, I suppose, a fraction of Python's popularity to begin with). | | |
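The str/bytes divide the parent credits can be sketched in a few lines (a minimal illustration; variable names are arbitrary):

```python
# Python 3 separates text (str, a sequence of Unicode code points)
# from raw bytes; converting between them is an explicit step.
text = "héllo"
data = text.encode("utf-8")   # str -> bytes, encoding named explicitly

assert isinstance(data, bytes)
assert data.decode("utf-8") == text  # bytes -> str round-trips

# Mixing the two is a TypeError rather than a silent mojibake bug,
# which Python 2's implicit str/unicode coercion made easy to hit.
try:
    text + data
except TypeError:
    print("refused to mix str and bytes")
```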
| ▲ | crest 2 hours ago | parent [-] | | Being too aggressive in breaking stuff gets you a shitshow like Node.js or Ruby. Long-term source code compatibility is a very useful feature for open source and a sign of a mature ecosystem. Feel free to add stuff, but once it's part of a stable release it has to be maintained long after a "better" way to do it comes along. | | |
| ▲ | jwlake 42 minutes ago | parent [-] | | Node.js itself doesn't have very many breakages; I have plenty of code that is unchanged from 0.12 to 24. npm is a whole other kettle of fish, but I don't think you can blame the core project for the sins of everyone that publishes to the package manager. Python2 -> Python3 on the other hand had a lot of breakage in "standard" code. |
|
|
|
|
| ▲ | lynndotpy 2 hours ago | parent | prev | next [-] |
| Python's interactive interpreter makes it pretty useful as a shell, for iterative development, and crucially useful in a Jupyter notebook. I've also found CircuitPython's interpreter to be bonkers useful in prototyping embedded projects. (This, on top of the nice datascience, ML, and NN libraries). Swift just wasn't doing the same things. And even if it did, Swift would compete with other languages that were understood as "a better Python", like Julia. Even then, Swift only came to Linux in 2016, Windows in 2020, and FreeBSD less than a year ago with WWDC 2025. I think it doesn't help that the mid 2010s saw a burst of Cool and New languages announced or go mainstream. Go, Julia, Rust, TypeScript, Solidity, etc. along with Swift. I think most of us only have space to pick up one or two of these cool-and-new languages every few years. |
|
| ▲ | tarentel 4 hours ago | parent | prev | next [-] |
| > could plug into the C/C++ ecosystem. Hence all the numeric stuff people were doing in Python powered by C++ libraries could've been done with Swift. In 2015-2017 you could interop with C, C++ support wasn't added until very recently. I do agree with you though and I am not sure what the exact reasoning is, but Swift is definitely an Apple ecosystem language despite the random efforts to gain traction elsewhere. |
|
| ▲ | wiseowise 5 hours ago | parent | prev | next [-] |
| > around 2015-17 - Swift could have easily dethroned Python. Why could it? > it was simple enough - very fast - could plug into the C/C++ ecosystem. Hence all the numeric stuff people were doing in Python powered by C++ libraries could've been done with Swift. Half a dozen languages fit this description. > the server ecosystem was starting to come to life, even supported by IBM. No, not at all. Kitura and Vapor (a fitting name) were just toys that no serious player ever touched. |
| |
| ▲ | hocuspocus 4 hours ago | parent [-] | | After that, and IBM losing interest, Apple did hire a few competent people (including contributors to Netty and Akka) to build the Swift Server Workgroup. But I don't know why I'd pick Swift on the server when Rust is better in almost every dimension, with a thriving and more community-driven ecosystem. | | |
| ▲ | mathverse 3 hours ago | parent [-] | | I think it's not about that but about dogfooding Swift on the server. Apple uses Go, Java, etc. for a lot of its server components and refused to invest in hiring people that would extend the ecosystem for server Swift. That's the problem. | | |
| ▲ | hocuspocus 3 hours ago | parent [-] | | It certainly doesn't help, but among big tech, Apple is not the only company where teams are siloed and independent. Microsoft has people writing Java or Go instead of C# too. I assume the server side usage is not zero, but not enough to reach a critical mass, you're probably right there. |
|
|
|
|
| ▲ | rasmus1610 8 hours ago | parent | prev | next [-] |
| Maybe Chris Lattner leaving and creating Mojo also didn’t help in that regard. Swift for TensorFlow was a cool idea in that time … |
| |
| ▲ | troupo 7 hours ago | parent [-] | | Lattner probably left because Apple didn't give the team any breathing room to properly implement the language. It was "we must have this feature yesterday". A lot of Swift is the equivalent of JavaScript's "we have 10 days to implement and ship it": https://youtu.be/ovYbgbrQ-v8?si=tAko6n88PmpWrzvO&t=1400 --- start quote --- Swift has turned into a gigantic super complicated bag of special cases, special syntax, special stuff... We had a ton of users, it had a ton of internal technical debt... the whole team was behind, and instead of fixing the core, what the team did is they started adding all these special cases. --- end quote --- | | |
| ▲ | groundzeros2015 4 hours ago | parent | next [-] | | For this language to become default at Apple they had to be doing a massive amount of internal promotion - in other words they knew where it was going. And then if that's the case, how were they not ready to solve the many problems that a big organization would run into? And all the schedule constraints that come with it? | |
| ▲ | mpweiher 3 hours ago | parent | prev | next [-] | | > Swift has turned into a gigantic super complicated bag of special cases, special syntax, special stuff... That's true, but only partly true. It already was a gigantic super complicated bag of special cases right from the start. Rob Rix noted the following 10 years ago: Swift is a crescendo of special cases stopping just short of the general; the result is complexity in the semantics, complexity in the behaviour (i.e. bugs), and complexity in use (i.e. workarounds). https://www.quora.com/Which-features-overcomplicate-Swift-Wh... Me, 2014: Apple's new Swift language has taken a page from the C++ and Java playbooks and made initialization a special case. Well, lots of special cases actually. The Swift book has 30 pages on initialization, and they aren't just illustration and explanation, they are dense with rules and special cases https://blog.metaobject.com/2014/06/remove-features-for-grea... Of course, that doesn't mean that it didn't get worse. It got a lot worse. For example (me again, 2020): I was really surprised to learn that Swift recently adopted Smalltalk keyword syntax ... Of course, Swift wouldn't be Swift if this weren't a special case of a special case, specifically the case of multiple trailing closures, which is a special case of trailing closures, which are weird and special-casey enough by themselves. https://blog.metaobject.com/2020/06/the-curious-case-of-swif... Oh, and Function Builders (2020, also me): A prediction I made was that these rules, despite or more likely because of their complexity, would not be sufficient. And that turned out to be correct, as predicted, people turned to workarounds, just like they did with C++ and Java constructors. https://blog.metaobject.com/2020/04/swift-initialization-swi... So it is true that it is now bad and that it has gotten worse. It's just not the case that it was ever simple to start with.
And the further explosion of complexity was not some accidental thing that happened to what was otherwise a good beginning. That very explosion was already pretty much predetermined in the language as it existed from inception and in the values that were visible. From my exchange with Chris regarding initializers: "Chris Lattner said... Marcel, I totally agree with your simplicity goal, but this isn't practical unless you are willing to sacrifice non-default initializable types (e.g. non-nullable pointers) or memory safety." Part of my response: "Let me turn it around: Chris, I totally agree with your goal of initializable types, but it is just not practical unless you are willing to sacrifice simplicity, parsimony and power (and ignore the fact that it doesn't actually work)." Simplicity is not the easy option. Simplicity is hard. Swift took the easy route. [...] when you first attack a problem it seems really simple because you don't understand it. Then when you start to really understand it, you come up with these very complicated solutions because it's really hairy. Most people stop there. But a few people keep burning the midnight oil and finally understand the underlying principles of the problem and come up with an elegantly simple solution for it. But very few people go the distance to get there. -- Steve Jobs (borrowed and adapted from Heinlein) https://blog.metaobject.com/2014/04/sophisticated-simplicity... | |
| ▲ | hirvi74 4 hours ago | parent | prev | next [-] | | To be fair, I think such a fate is inevitable for most languages after many years of changes and development. | |
| ▲ | msie 2 hours ago | parent | prev [-] | | [dead] |
|
|
|
| ▲ | pjmlp 3 hours ago | parent | prev | next [-] |
| The thing people don't get about C++'s complexity is that complexity is unavoidable. It is also there in Ada, C#, Java, Python, Common Lisp,.... Even if the languages started tiny, complexity eventually grows on them. C23 + compiler extensions is quite far from where K&R C was. Scheme R7 is quite far from where Scheme started. Go's warts are directly related to ignoring the history of growing pains from other ecosystems. |
| |
| ▲ | embedding-shape 3 hours ago | parent [-] | | > Even if the languages started tiny, complexity eventually grows on them. And then of course the case that proves the opposite, Clojure. Sure, new ideas appear, but core language is more or less unchanged since introduced, rock solid and decades old projects still run just fine, although usually a bit faster. | | |
| ▲ | pjmlp 3 hours ago | parent [-] | | That is because Clojure is done; hardly anything is being done other than what matters to NuBank and Datomic. Its market share kind of shows it. | |
| ▲ | honr 2 hours ago | parent | next [-] | | The current market share shows how far you can go with just being a better Java. If (or when? I haven't checked recently) a decent and well-thought-out LLVM backend emerges for it, ideally with some new underlying complexity seeping through, the market share might expand overnight. And as for C++, while some complexity is certainly unavoidable, a rigorous complexity control is desperately needed. Ideally, the same way Bell Labs folks did when they initially conceived Go from Algol68 and C and similar (before or after joining Google; I couldn't tell), and Rich Hickey did when he initially designed Clojure. Some people are managing the complexity using style guides and clang-tidy checks. Which is great in that doing so doesn't need lengthy language committee decisions. But that approach hasn't been enough to make code _sufficiently_ safe; every now and then an enterprising engineer or team finds a way to abuse a feature in a way that produces unsafe or unpredictable results. Rust is a bit better and solves a few of the common problems, but sadly the list of potential issues (of using Rust in a codebase at scale; Engineers' faults, not Rust's) is long and growing. My verdict is we need both complex and simple LLVM languages, ideally co-designed to have no interop problems by design, while allowing expressing some logic in the simple parts and some logic in the complex parts. Or better, a 3 tier design would be nearly perfect: expressive config language, glue and research language, and core building blocks language. I think a clojure-style language can be designed to achieve all three. | |
| ▲ | zdragnar 3 hours ago | parent | prev | next [-] | | That's a pretty far cry from "complexity is unavoidable". Reading that to me implies that the complexity is inherent in programming language design, whereas this follow-up argument seems to say that complexity is the result of tacking on new features. The latter is a bit tautological, since the size of the language grammar is itself a measure of complexity. | | |
| ▲ | pjmlp 2 hours ago | parent [-] | | I think they haven't even adopted newer JVM features; it is a hosted language designed to depend on its host, plus it is a Lisp. The complexity would be to grow like Common Lisp; instead it is up to Clojure folks to write Java, C#, JavaScript code, and therein lies the complexity. |
| |
| ▲ | embedding-shape 3 hours ago | parent | prev [-] | | > That is because Clojure is done Yes, that's one approach to avoiding ever growing complexity, maybe the other languages should try it sometime ;) With that said, everything around Clojure keeps improving and getting better. While the language doesn't have static types, clojure.spec offers something that is even better than static typing (imo), and doesn't even require any changes to the core language. Something else other mainstream languages could learn too. | | |
| ▲ | pjmlp 2 hours ago | parent [-] | | Is Typed.Clojure finally stable and sound? In theory we only need parentheses, prefix operators and a REPL, but mainstream never went down that route. Anyway the complexity then ends up being custom DSLs and macros. |
|
|
|
|
|
| ▲ | michaelcampbell 6 hours ago | parent | prev | next [-] |
| > Swift could have easily dethroned Python. Just IMO, but... no. To me a "could have easily" requires n-1 things to have happened, and 1 thing not happening. Like, we "could have easily" had a nuclear exchange with the USSR, were it not for the ONE Russian guy who decided to wait for more evidence. https://en.wikipedia.org/wiki/1983_Soviet_nuclear_false_alar... But even in '15-'17, there were too many people doing too many things with Python (the big shift to data orientation started in the mid/late 90's which paved the way to ML and massive Python usage) by then. The 'n' was large, and not nearly all of the 'n' things were in Swift's favor then. Again, IMO. |
|
| ▲ | iamcalledrob 8 hours ago | parent | prev | next [-] |
| That's my read too. Swift was feeling pretty exciting around ~v3. It was small and easy to learn, felt modern, and had solid interop with ObjC/C++. ...but then absolutely exploded in complexity. New features and syntax thrown in make it feel like C++. 10 ways of doing the same thing. I wish they'd kept the language simple and lean, and wrapped additional complexity as optional packages. It just feels like such a small amount of what the Swift language does actually needs to be part of the language. |
| |
| ▲ | 72deluxe 7 hours ago | parent | next [-] | | I get this feeling with C#. I have been here since its release. I looked at Swift and then they moved very quickly at the beginning, so the book I had to teach me was out of date moments after it was printed. With all the complexity being thrown in, I stuck with C++ because at least it was only 1 language I had to keep track of (barely)! | | |
| ▲ | CharlieDigital 4 hours ago | parent [-] | | C# is the other direction, IMO. I've been using C# since the first release in 2003/4 timeline? Aside from a few high profile language features like LINQ, generics, `async/await`, the syntax has grown, but the key additions have made the language simpler to use and more terse. Tuples and destructuring for example. Spread operators for collections. Switch expressions and pattern matching. These are mostly syntactic affordances. You don't have to use any of them; you can write C# exactly as you wrote it in 2003...if you want to. But I'm not sure why one would forgo the improved terseness of modern C#. Next big language addition will be discriminated unions and even that is really "opt-in" if you want to use it. | | |
| |
| ▲ | willtemperley 7 hours ago | parent | prev | next [-] | | Which keywords would you get rid of and why? You don't have to use all of them! | | |
| ▲ | fauigerzigerk 6 hours ago | parent | next [-] | | I would remove result builders and all other uses of @attributes that change the semantics of the code (e.g property wrappers). I would remove the distinction between value types and reference types at the type level. This has caused so many bugs in my code. This distinction should be made where the types are used not where they are defined. I would remove everything related to concurrency from the language itself. The idea to let code execute on random threads without any explicit hint at the call site is ridiculous. It's far too complicated and error prone, which is why Swift designers had to radically change the defaults between Swift 6.0 and 6.2 and it's still a mess. I would remove properties that are really functions (and of course property wrappers). I want to see at the call site whether I'm calling a function or accessing a variable. I would probably remove async/await as well, but this is a broader debate beyond Swift. And yes you absolutely do have to know and use all features that a language has, especially if it's a corporate language where features are introduced in order to support platform APIs. | | |
| ▲ | fingerlocks 4 hours ago | parent [-] | | I agree with you about result builders, silly feature that only exists for SwiftUI. But a lot of what you said, except for the concurrency and property wrapper stuff, largely exists for Obj-C interop. The generated interface is more readable, and swift structs act like const C structs. It’s nice. |
| |
| ▲ | quietbritishjim 7 hours ago | parent | prev | next [-] | | I'm not a Swift user, but I can tell you from C++ experience that this logic doesn't mitigate a complex programming language. * If you're in a team (or reading code in a third-party repo) then you need to know whatever features are used in that code, even if they're not in "your" subset of the language. * Different codebases using different subsets of the language can feel quite different, which is annoying even if you know all the features used in them. * Even if you're writing code entirely on your own, you still end up needing to learn about more language features than you need to for your code in order that you can make an informed decision about what goes in "your" subset. | |
| ▲ | cloogshicer 7 hours ago | parent | prev | next [-] | | But you have to know all of them to read other people's code. To answer your question: I would immediately get rid of guard. Also, I think the complexity and interplay of structs, classes, enums, protocols and now actors is staggering. | | |
| ▲ | willtemperley 4 hours ago | parent [-] | | I'm surprised, guard is really useful, especially when unwrapping optionals. It's terse, explicit and encourages defensive programming. internal should definitely go though. | | |
| ▲ | cosmic_cheese 4 hours ago | parent [-] | | The absence of guard in Kotlin is one of those things that regularly trips me up when bouncing between it and Swift. Rather than Swift losing guard I’d prefer if Kotlin gained it. | | |
| ▲ | iamcalledrob 2 hours ago | parent [-] | | I think the ?: operator ends up being a decent alternative, e.g.

    // Swift
    guard let foo = maybeFoo else {
        print("missing foo")
        return false
    }

    // Kotlin
    val foo = maybeFoo ?: run {
        print("missing foo")
        return false
    }

Unless there's a use case for guard I'm not thinking of.
|
|
| |
| ▲ | merlindru 5 hours ago | parent | prev | next [-] | | i would get rid of associatedtype, borrowing, consuming, deinit, extension, fileprivate, init, inout, internal, nonisolated, open, operator, precedencegroup, protocol, rethrows, subscript, typealias, #available, #colorLiteral, #else, #elseif, #endif, #fileLiteral, #if, #imageLiteral, #keyPath, #selector, #sourceLocation, #unavailable, associativity, convenience, didSet, dynamic, indirect, infix, lazy, left, mutating, nonmutating, postfix, precedence, prefix, right, unowned, weak, and willSet | | |
| ▲ | willtemperley 4 hours ago | parent [-] | | It's true that internal is pointless. Focusing on the keywords rather than the macros, I think the rest of them have legitimate use cases, though they're often misused, especially fileprivate. |
| |
| ▲ | eptcyka 7 hours ago | parent | prev | next [-] | | You can take this approach in personal projects - with teams you need to decide on this and then on-board people into your use of the language. This does not work. | | | |
| ▲ | troupo 7 hours ago | parent | prev [-] | | 1. You don't have to use it all, but someone will. And there are over 200 keywords in the language: https://x.com/jacobtechtavern/status/1841251621004538183 2. On top of that many of the features in the language exist not because they were carefully designed, but because they were rushed: https://news.ycombinator.com/item?id=47529006 | | |
| ▲ | uasi 5 hours ago | parent | next [-] | | That number is unfairly exaggerated. The list includes ~40 internal keywords used only by language developers, plus dozens of tokens that would be called preprocessor directives, attributes, or annotations in other languages (e.g. `canImport` as in `#if canImport(...) #endif`; `available` and `deprecated` as in `@available(*, deprecated) func`). | |
| ▲ | dematz 6 hours ago | parent | prev [-] | | are there actually 217 keywords? Just wondering what the difference between that file and https://docs.swift.org/swift-book/documentation/the-swift-pr... (a mere 102 keywords) | | |
| ▲ | merlindru 5 hours ago | parent [-] | | That file is the compiler's list of reserved keywords, so some of them may not have been added to docs, or they're experimental/internal/... I'm not 100% sure but I think the swift doc you linked is missing at least a dozen keywords so the truth probably lies in the middle | | |
| ▲ | dematz 5 hours ago | parent [-] | | Ah makes sense, personally I wouldn't consider reserved but unused words as keywords in the sense that you don't need to know them to read the language (even though they're keywords in some other technical sense). I was curious because I just tried counting number of keywords by language and it seemed surprisingly ambiguous/subjective/up to the language to say what's a "keyword" vs some type of core module. So my attempt (https://correctarity.com/keywords) probably has mistakes... |
|
|
|
| |
| ▲ | msie 2 hours ago | parent | prev [-] | | I felt that too many smart people were getting involved in the evolution of the language. There should have been a benevolent dictator to say NO. |
|
|
| ▲ | Terretta 3 hours ago | parent | prev | next [-] |
| > Haven't used it since v3 though. Since 5.10 it's been worth picking back up if you're on macOS. |
|
| ▲ | ramesh31 4 hours ago | parent | prev | next [-] |
| >"around 2015-17 - Swift could have easily dethroned Python." NumPy, SciPy, Pandas, and Pytorch are what drove the mass adoption of Python over the last few years. No language feature could touch those libraries. I now know how the C++/Java people felt when JS started taking over. It's a nightmare to watch a joke language (literally; Python being named for Monty Python) become the default simply because of platform limitations. |
|
| ▲ | afavour 4 hours ago | parent | prev | next [-] |
| Eh, I don't think Swift would ever have dethroned Python. What pain point would it practically solve? I don't use Python often but I don't hear folks complaining about it much. I do, though, think Swift had/has(?) a chance to dethrone Rust in the non-garbage collected space. Rust is incredibly powerful but sometimes you don't really need that complexity, you just need something that can compile cross-platform and maintain great performance. Before now I've written Rust projects that heavily use Rc<> just so I don't have to spend forever thinking about lifetimes, when I do that I think "I wish I could just use Swift for this" sometimes. You're right, though, that Swift remains Apple's language and they don't have a lot of interest in non-Apple uses of it (e.g. Swift SDK for Android was only released late last year). They're much happier to bend the language in weird ways to create things like SwiftUI. |
| |
| ▲ | fainpul 4 hours ago | parent [-] | | > just need something that can compile cross-platform and maintain great performance. I think Go has already taken that part of the cake. | | |
| ▲ | afavour 4 hours ago | parent [-] | | Go is garbage collected, though. Rust and Swift still occupy a niche Go doesn't. | | |
| ▲ | mathverse 3 hours ago | parent [-] | | ARC is a form of garbage collection. Swift does not fare better than Go usually. |
|
|
|
|
| ▲ | oefrha 7 hours ago | parent | prev | next [-] |
| > Swift could have easily dethroned Python No way something that compiles as slowly as Swift dethrones Python. Edit: Plus Swift goes directly against the Zen of Python > Explicit is better than implicit. > Namespaces are one honking great idea -- let's do more of those! All this, coupled with shitty LSP support (even to this day), makes code even harder to understand than when you `import *` in Python. Edit 2: To expand a little on how shitty the LSP support is for those who don't work with Swift: any trivial iOS or macOS project that builds fine in Xcode can have a bunch of SourceKit-LSP (the official Swift LSP) errors because it fails to resolve frameworks/libraries. The only sane way to work with Swift in VS Code or derivatives I've found is to turn off SourceKit diagnostics altogether and only keep swiftc diagnostics. Even with the swift-lsp plugin in Claude Code, there's a routine baseline of SourceKit errors being ignored. So you have symbols without explicit namespaces, and the LSP simply can't resolve lots of them, so no lookup for you. Good luck. |
| |
| ▲ | vovavili 7 hours ago | parent | next [-] | | >No way something that compiles as slowly as Swift dethrones Python. This must have pushed Chris Lattner towards making Mojo both interpreted and compiled at the same time. | | | |
| ▲ | bossyTeacher 7 hours ago | parent | prev | next [-] | | > Explicit is better than implicit. That's funny. To me magic is implicit by definition and Python strikes me as a very magical language compared to something like Java that is way more explicit. | | |
| ▲ | robmccoll 6 hours ago | parent | next [-] | | Until you start using frameworks like Spring and then everything is so painfully magic that no one knows how the program actually runs. | |
| ▲ | wiseowise 5 hours ago | parent | prev [-] | | Magical language how? And you should see what reflection based Java monstrosities do in the background. |
| |
| ▲ | commandersaki 6 hours ago | parent | prev [-] | | Plus Swift goes directly against the Zen of Python The Zen of Python is how we got crap like argparse where arguments are placed in the namespace instead of a dict. | | |
| ▲ | coldtea 4 hours ago | parent [-] | | I wouldn't change that in any way. I might make it an Arguments class, but I wouldn't make what the parser returns merely a dict. | |
| ▲ | commandersaki 4 hours ago | parent [-] | | Yeah, so what happens when you have an option with a '-' in it that isn't valid as a variable name (I know what happens). It's just stupid. | | |
| ▲ | coldtea 2 hours ago | parent [-] | | The same thing you'd do yourself if you wanted to assign it to a namesake local variable even if it was in a dict to begin with: you'd make the dash an underscore. | |
| ▲ | commandersaki 2 hours ago | parent [-] | | It would be extremely unlikely that you would replicate the name as a local variable if it was in a dict, but regardless a dict doesn't have that limitation. The namespace thing is atrocious and bad design -- no straightforward way to iterate over them, merging/updating them is awful, collides with keyword methods (keys, items, etc.), and so on; thankfully more modern argument parsing libraries didn't repeat this mistake. It's just a shame this ended up in the standard library, but then Python standard library has never really been any good, e.g. logging and urllib1234567. |
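For reference, this is what the standard library actually does with dashed option names (the dash-to-underscore conversion discussed above, plus `vars()` as the escape hatch back to a plain dict; option name chosen for illustration):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--log-level", default="info")

args = parser.parse_args(["--log-level", "debug"])

# argparse turns the dash into an underscore on the Namespace attribute:
assert args.log_level == "debug"

# vars() recovers a plain dict when you want dict semantics
# (iteration, merging, etc.):
assert vars(args) == {"log_level": "debug"}
```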
|
|
|
|
|
|
| ▲ | WD-42 4 hours ago | parent | prev [-] |
| Dethroned Python? The Apple language, seriously. Where is numpy for swift? |