yawniek 7 hours ago

fwiw i know tobias and it's very, very unlikely he made this up. my guess is it's intentionally vague so as not to leak any information about the culprit, which i guess is fair.

rausr 7 hours ago | parent | next [-]

heh, I know that username. I came to the same conclusion. (I hope all is well with you, Yannick)

BrissyCoder 7 hours ago | parent | prev | next [-]

Okay. If it's real I apologize.

But in any case it's so brief and so lacking in detail as to be uninteresting; it might as well be fake.

> Somebody "vibecodes" medical app/system. The app was insecure. Personal info leaked.

Okay cool.

kuboble 7 hours ago | parent [-]

It's really weird to me that this is your reaction.

It's a rarely updated personal blog, not a daily tabloid story.

BrissyCoder 6 hours ago | parent [-]

It's pure bs. If you read that blog post and think "this definitely happened", let alone "wow - this is interesting" then I have a monorail to sell you.

> Technical Background

> The entire application was a single HTML file with all JavaScript, CSS, and structure written inline. The backend was a managed database service with zero access control configured, no row-level security, nothing. All "access control" logic lived in the JavaScript on the client side, meaning the data was literally one curl command away from anyone who looked.

> All audio recordings were sent directly to external AI APIs for transcription and summarization.

> There was more, but this is already enough to get the idea.

Hmmmm... interesting. Now that I have the "Technical Background" I for sure know that this medical app was 100% vibe coded by a Medical Practice in the Real World and exists! (TM)
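For what it's worth, the anti-pattern quoted above (all "access control" living in client-side JavaScript, backend wide open) is a real and common failure mode. A minimal sketch, with all names invented for illustration:

```python
# Hypothetical sketch of the described setup: the only "access control"
# is an if-statement in client code, while the unsecured backend returns
# every record to anyone who asks. All data and names are made up.

PATIENTS = [
    {"id": 1, "owner": "dr_a", "notes": "confidential"},
    {"id": 2, "owner": "dr_b", "notes": "also confidential"},
]

def backend_fetch_all():
    # What a managed database endpoint with no auth and no row-level
    # security effectively does: hand over everything.
    return PATIENTS

def client_view(current_user):
    # The filtering happens *after* the full dataset has already crossed
    # the wire, so it protects nothing.
    return [r for r in backend_fetch_all() if r["owner"] == current_user]

# Anyone bypassing the client (e.g. with curl) skips the filter entirely:
leaked = backend_fetch_all()
```

The point of the "one curl command away" remark is exactly this: since the server enforces nothing, skipping the client-side filter yields the full dataset.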

trelbutate 6 hours ago | parent [-]

What do you want as proof? A link to the app?

BrissyCoder 5 hours ago | parent [-]

Non-ironically: "yes please" if you want me to believe that any of this happened.

spacebacon 7 hours ago | parent | prev [-]

It’s unlikely that any LLM given a prompt involving medical records would fail to automatically address separation of concerns. The type of data involved is the worst-case scenario, and a single JS file is also the worst-case scenario. This is why it may feel manufactured. If it is true, they truly deserve to be put on blast.

moooo99 7 hours ago | parent [-]

I can 100% imagine prompts that would feel perfectly natural yet never hint at the medical nature of the data being processed. It could be as simple as writing "customer" instead of "patient".