bendergarcia 4 hours ago

We are without our consent introducing a party in between people. The models become the arbiters of who does and does not get a job. It feels problematic.

justonceokay 3 hours ago | parent | next [-]

There will be a great arbitrage for people who do not use LLMs.

If your HR department is using ChatGPT to filter resumes, you’ll end up with people who used ChatGPT to generate resumes. I don’t want to make a “slippery slope” argument, but my gut feeling is that the quality of your organization will deteriorate quickly.

On the other hand, I am a handyman/subcontractor. Almost all of my work comes through phone calls, texts, and one-off emails. I only work with people who are recommended by trusted sources. I haven’t handled a traditional resume (mine or anyone else’s) in over eight years.

If I started interacting with somebody and they seemed like they were a computer, that would be the fastest way for me to know I should move on to another client. If they can’t take the time to interact with me, how am I supposed to perform hundreds of hours of physical labor for them?

bendergarcia 4 hours ago | parent | prev | next [-]

And I anticipate the common response: well, just use the model that’s available. AI is, and will probably always be, resource-constrained and profit-driven. That means we will eventually see a world where poor people have worse resumes than rich people, and there really won’t be any way around it, because the man in the middle has the final say.

adrianN 3 hours ago | parent [-]

Not too long ago I bet resumes that were printed from a computer were preferred to resumes typed on a typewriter. What happened was that computers became commodities. It is reasonable to assume that LLMs will become commodified too.

YurgenJurgensen 3 hours ago | parent | next [-]

That would hardly be surprising. Monospaced fonts make natural language a pain to read, so what that would prove is that well-presented resumes are preferred to poorly-presented ones.

This case is different, as the LLM output isn’t measurably better than the human output (unless you have a particular love of bland corpo-speak).

Nuzzerino 3 hours ago | parent | prev [-]

This is a terrible way to soften an obvious alignment failure with AI rollout.

falcor84 3 hours ago | parent | prev | next [-]

The ship sailed as soon as hiring managers stopped reading CVs directly and recruiting became a profession.

ekianjo 4 hours ago | parent | prev | next [-]

Before, it used to be HR, so you always had a party in between the "actual" people. HR (mostly) never cared about the CV; they just looked at a checklist and checked whether it matched.

sneak 4 hours ago | parent | prev | next [-]

We already did that when we all created LinkedIn accounts.

sxg 4 hours ago | parent | prev [-]

Take a look at how things worked before (and still do): employers decide who gets jobs based on a combination of personal biases, nepotism, and ulterior motives, while applicants present distorted versions of themselves and network/pull strings to put the odds in their favor. That seems more problematic.

1attice 3 hours ago | parent [-]

You would be surprised at the process in other industries. What you are describing is the tech job market specifically.

Other fields have their own problems, including credentialism and the ballooning student loans that come with it, but by strict convention they do not hire based on vibes or pulled strings. Often to their partial detriment, the cure -- i.e., strict oversight of hiring that also forces the hiring manager to ignore important implicit signals -- is alive and well in medicine, law, civil engineering, education, and the trades. Notable exceptions include entertainment, sales, real estate, and software engineering.

By optimizing for vibes, the tech industry gains "Spidey senses" in the hiring loop but pays for it in impartiality.

IMO this precipitated the DEI movement's advent, as it was seen as a way of remediating the drawbacks while preserving the information channel.

Without it, expect homophily and, eventually, a harsh and remedial credentialism.