| ▲ | xnorswap 7 months ago |
| I had an interview last year that left such a bad impression on me it still puts me off interviewing. The opening question was: "List all the C# primitive types." The next question was: "List all their sizes in bytes." Such a pointless waste of time. I got them all right (as far as I remember), but it's just trivia: if you really need to know it, you're better off keeping the reference handy than committing it to memory. As far as I could tell, the whole thing was just set up to demonstrate how smart the interviewers were. It devolved later on when I was berated for saying I'd google to check whether a System Timer Tick was 10ns or 100ns. I thought it was 100ns, but I'd check if I was doing something where it mattered. The interviewer took offence at the very concept of googling. |
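For what it's worth, the 100ns guess is right for .NET's `TimeSpan`/`DateTime` ticks (assuming that's the tick the interviewer meant), and it can be derived from the documented `TimeSpan.TicksPerSecond` constant rather than memorized. A minimal check:

```csharp
using System;

class TickCheck
{
    static void Main()
    {
        // TimeSpan.TicksPerSecond is a documented constant (10,000,000),
        // so one tick is 1,000,000,000 ns / 10,000,000 = 100 ns.
        long nsPerTick = 1_000_000_000L / TimeSpan.TicksPerSecond;
        Console.WriteLine(nsPerTick); // 100
    }
}
```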
|
| ▲ | ygra 7 months ago | parent | next [-] |
| Interestingly enough, there are no primitive types in C#; that's a Java term. I'd probably ask which definition they're after, because I can think of a few:

• Types that have a type alias (which makes them special in the language), but that would include string and object.

• Value types that have a type alias. Closer to what Java does; it would include most basic numeric types but exclude System.Half and System.Int128, for example.

• Value types that the runtime knows about and might treat in a special way (e.g. by also having specific IL instructions for them). That would include most basic numeric types (and enums), but probably exclude bool, decimal and the newer ones.

• Value types in general, as they all function similarly, whether it's bool, int, ValueTuple<float, ushort> or an enum. But that list is no longer finite.

Or perhaps the ensuing discussion about types in C# and .NET was what they were after. Who knows. |
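One more candidate definition worth noting: the runtime publishes its own verdict through reflection's `Type.IsPrimitive`, which accepts the fixed-size numerics, `bool` and `char` but rejects `decimal`, `string` and `object`, aliases notwithstanding. A quick sketch of that boundary:

```csharp
using System;

class PrimitiveCheck
{
    static void Main()
    {
        // The CLR's own notion of "primitive", exposed via reflection:
        Console.WriteLine(typeof(int).IsPrimitive);     // True
        Console.WriteLine(typeof(char).IsPrimitive);    // True
        Console.WriteLine(typeof(decimal).IsPrimitive); // False
        Console.WriteLine(typeof(string).IsPrimitive);  // False
    }
}
```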
| |
|
| ▲ | kardianos 7 months ago | parent | prev [-] |
| If I were an interviewer, I would want any programmer to know these things. If you don't, then you likely know programming by pattern matching rather than by knowing the spec. Learning by pattern matching is a fine way to start, but as a professional programmer you need to understand the language, and these details are a proxy for knowing it fully. Believe it or not, while knowing these things without looking them up is not useful on its own, if you don't know them, you don't know the language to the level at which I need you to know it. Full stop. |
| |
| ▲ | y-c-o-m-b 7 months ago | parent | next [-] | | I've been in tech for almost 20 years now. I've worked in all sorts of industries, small companies, start-ups, big companies, you name it and I've been there. I'm currently in FAANG. I've been a tech lead, worked in management, and have even been offered C-level positions in the past. There has not been a single time in my entire career where such a thing was even remotely true. Having been the interviewer on many occasions, if I heard a candidate speak this way, I would absolutely not hire them, regardless of how well they performed otherwise. This reeks of "I can't work well with others, so it's my way or the highway". It's very toxic and I question whether you're trolling at this point. | |
| ▲ | cess11 7 months ago | parent | prev | next [-] | | I'd rather the candidate show an acquaintance with profiling and debugging tooling. If they're good with those, they'll figure out which data structures aren't a good fit once it starts to matter. Learning some numbers and names by rote doesn't mean you actually understand the trade-offs they imply. | |
| ▲ | bena 7 months ago | parent | prev | next [-] | | The size of primitives in bytes is something that rarely matters, and when it does, you can look it up. | | |
| ▲ | Moru 7 months ago | parent [-] | | Don't they also depend on the architecture you are programming for? 8/16/32/64-bit computers? Or did they standardise the names since I learned C (and never used it) 30-40 years ago? | | |
| ▲ | bena 7 months ago | parent | next [-] | | This was about C#, not C or C++. So it's a little more involved. The CLR, which C# targets, has explicitly sized types. There is no "int", there's only "Int32", which is a signed 32-bit integer. C# maps keywords to certain CLR types. "int" always maps to Int32. It's a guarantee in C# and I think a requirement of all languages targeting the CLR. | |
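A small illustration of that guarantee, as a sketch: the keyword and the CLR type compare as the very same `Type`, and the sizes are fixed by the spec rather than by the hardware:

```csharp
using System;

class AliasCheck
{
    static void Main()
    {
        // "int" is an alias for System.Int32, not a machine-dependent type:
        Console.WriteLine(typeof(int) == typeof(Int32)); // True

        // The sizes are the same on every architecture the CLR targets:
        Console.WriteLine(sizeof(int));  // 4
        Console.WriteLine(sizeof(long)); // 8
    }
}
```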
| ▲ | unwind 7 months ago | parent | prev [-] | | It was in C#, not C. In C the sizes are not standardized, but there are requirements on relative sizes, and since C99 there are also fixed-width types like `int32_t` and friends. |
|
| |
| ▲ | Abekkus 7 months ago | parent | prev | next [-] | | I could imagine a team that needed it, but most professional programming work does not require this level of memorization; I'd have to see the job description to know which side to take in this argument. | | |
| ▲ | kardianos 7 months ago | parent [-] | | [flagged] | | |
| ▲ | do_not_redeem 7 months ago | parent | next [-] | | In C# it's not called `i32`, it's called `int`. Also don't forget in C#, `decimal` is a primitive type too. How many bytes does a decimal take up? If I had to look that up, would you fire me? | |
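For the record, the answer to that rhetorical question is checkable in one line: `decimal` is a 128-bit value type, so it occupies 16 bytes:

```csharp
using System;

class DecimalSize
{
    static void Main()
    {
        // sizeof(decimal) is allowed in a safe context because decimal
        // is a built-in type with a compile-time-constant size.
        Console.WriteLine(sizeof(decimal)); // 16
    }
}
```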
| ▲ | 7 months ago | parent | prev [-] | | [deleted] |
|
| |
| ▲ | mynameisvlad 7 months ago | parent | prev [-] | | This is asinine. How often has the size of a primitive come up in day-to-day programming in, say, the last month? When has one of your developers ever had to use this information to the point where it had to be memorized? Especially when it's literally a one-second search away. | | |
| ▲ | kardianos 7 months ago | parent [-] | | This isn't about knowledge. If you can't compute that a 32-bit integer is 4 bytes, or don't know that an int in C# is 32 bits and a long is 64 bits, then no, you don't actually know enough to be on a team. And yes, it does come up, and yes, it does matter. | | |
|
|