cactacea 8 months ago

PCB layout is as much art and black magic as it is science. I'm not sure why you dismiss the complexity so easily; this is definitely not just a matter of applying Maxwell's equations.

0_____0 8 months ago | parent [-]

Layout is a puzzle, especially at particularly high densities, but some of this is ameliorated by high-layer-count and fine trace/space boards becoming cheaper. Definitely not black magic. RF layout is black magic, let's not steal their thunder here.

tuetuopay 8 months ago | parent | next [-]

High speed PCBs are RF. At high enough frequencies, traces become waveguides, and the result cannot be predicted analytically. Simulation is your only light in this mess.
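To make the transmission-line point concrete, here's a minimal sketch (not from the thread) using the well-known Hammerstad closed-form approximation for microstrip characteristic impedance. The geometry and FR-4 permittivity below are illustrative assumptions; real high-speed work quickly outgrows closed forms, which is the commenter's point about simulation.

```python
import math

def microstrip_z0(w: float, h: float, er: float) -> float:
    """Approximate Z0 (ohms) of a microstrip trace of width w over a
    dielectric of height h and relative permittivity er (Hammerstad)."""
    u = w / h
    if u <= 1:
        eeff = (er + 1) / 2 + (er - 1) / 2 * ((1 + 12 / u) ** -0.5 + 0.04 * (1 - u) ** 2)
        return 60 / math.sqrt(eeff) * math.log(8 / u + u / 4)
    eeff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 / u) ** -0.5
    return 120 * math.pi / (math.sqrt(eeff) * (u + 1.393 + 0.667 * math.log(u + 1.444)))

# 0.2 mm trace over 0.1 mm of FR-4 (er ~ 4.4): close to the classic 50-ohm target
print(round(microstrip_z0(w=0.2, h=0.1, er=4.4), 1))
```

Even this simple model shows why layout geometry matters at speed; above a few GHz, dispersion, roughness, and coupling push you into full-wave simulation.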

0_____0 8 months ago | parent [-]

I have been lucky to not have to lay out anything that had frequencies of interest over 1 GHz or so. What's your experience been? E.g. types of signals, frequency range, issues you ran into?

fxtentacle 8 months ago | parent [-]

Signals that arrive faster than the speed of light should physically allow for that trace length, because you made the corners too sharp: instead of flowing along your path, the electricity creates a magnetic field, which then induces a current, and that lets the signal tunnel through non-conductive walls.

High-speed boards cannot be simulated well, because they are far from deterministic. That's what makes them so different from coding.

0_____0 8 months ago | parent [-]

What was the context you had that issue in? RAM bus?

numpad0 8 months ago | parent | prev [-]

It's just cross-modal. The list of components is a linear list, the connections between components form a graph, the placements are geometrically constrained, and the overall shape is both geometric and external to the board. So you can't just mechanically derive the board from a mere linear textual description of it.
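The cross-modal point can be sketched in code: the same hypothetical design exists simultaneously as a linear list, a connection graph, and geometrically constrained placements. All part names, nets, and dimensions below are made up for illustration.

```python
from dataclasses import dataclass

components = ["U1", "C1", "C2", "J1"]               # linear list

nets = {                                            # graph: net -> connected pins
    "VCC": [("U1", 1), ("C1", 1), ("J1", 1)],
    "GND": [("U1", 2), ("C1", 2), ("C2", 2), ("J1", 2)],
}

@dataclass
class Placement:
    x: float  # mm
    y: float  # mm

placements = {"U1": Placement(10, 10), "C1": Placement(12, 10),
              "C2": Placement(30, 5), "J1": Placement(0, 8)}

BOARD_W, BOARD_H = 40.0, 20.0   # geometric constraint external to the netlist

def on_board(p: Placement) -> bool:
    """Placement validity lives in geometry, not in any textual description."""
    return 0 <= p.x <= BOARD_W and 0 <= p.y <= BOARD_H

assert all(on_board(p) for p in placements.values())
```

No single one of these three representations determines the others, which is why a flat prompt underconstrains the board.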

A lot of automagic "AGI achieved" LLM projects have this same problem: it is assumed that a brief literal prompt will fully constrain the end result, so long as it is well thought out. And that's just not how it (reality, or animal brains) works.

0_____0 8 months ago | parent [-]

You need a LOT of context about what the components are and how they're being used in order to route them. The extreme case is an FPGA, where a GPIO might be a DAC output or one half of a SERDES diff pair.

numpad0 8 months ago | parent [-]

Doesn't even have to be that extreme: there is no way the port placement of a Mac Mini can be mathematically derived from a plain-English natural-language prompt, and yet that's what they're trying to do. It's just reality that not everything happens or can be done in literal language. I guess it will take a few more years before everyone accepts that.

bb88 8 months ago | parent [-]

There's nothing new in EE under the sun; hasn't been for 40 years, really. EEs min/max a bunch of mathematical equations. There are a lot of them, but it's not nearly as difficult as people think it is. They end up being design constraints, which can be coded, measured, and fed back into the AI.
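A minimal sketch of "constraints which can be coded, measured, and fed back": each rule is a predicate over measured board parameters, and the violation list is the feedback signal for the next iteration. All rule names and thresholds here are invented for illustration.

```python
# Hypothetical measured parameters from a board, in engineering units.
measured = {"trace_clearance_mm": 0.15, "trace_width_mm": 0.2,
            "diff_pair_skew_ps": 12.0}

# Hypothetical design rules, each coded as a predicate over the measurements.
rules = {
    "min_clearance": lambda m: m["trace_clearance_mm"] >= 0.1,
    "min_width":     lambda m: m["trace_width_mm"] >= 0.15,
    "max_skew":      lambda m: m["diff_pair_skew_ps"] <= 10.0,
}

def check(m: dict) -> list[str]:
    """Return the names of violated rules -- the feedback to the tool."""
    return [name for name, rule in rules.items() if not rule(m)]

print(check(measured))  # the skew rule fails; feed that back and iterate
```

Whether the hard part is evaluating rules like these, or knowing which rules apply at all, is exactly what the reply below disputes.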

It's not even been three years since GitHub Copilot was released to developers. And now we're all complaining about "vibe-coding".

0_____0 8 months ago | parent | next [-]

Design constraints with so many factors that people still don't use autorouters for most stuff. You're not getting it: drawing the wires isn't the hard part, understanding the constraints is.

bb88 8 months ago | parent [-]

I think we agree on that part.

I once thought software constraints were so hard that a machine would never be able to program within them.

But on the other hand, tons of circuit boards are designed day after day. If it were super hard, we wouldn't have the tens of thousands of high-speed motherboards that come out year after year.

0_____0 8 months ago | parent [-]

Are you in HW design by chance?

Software and hardware are fundamentally different in the ability of the engineer to isolate working segments. You can take a piece of code and set up unit tests for it, and if you feel good about your test suite, you can be fairly certain that it will serve your engineering and product goals.
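As a sketch of that isolation: any pure function (a bitwise CRC-8 here, chosen arbitrarily, not from the thread) can be fenced off and verified by a self-contained test suite, independent of the rest of the system.

```python
def crc8(data: bytes, poly: int = 0x07) -> int:
    """Bitwise CRC-8 (MSB-first, init 0, no reflection) over `data` --
    a pure function that can be tested entirely in isolation."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly if crc & 0x80 else crc << 1) & 0xFF
    return crc

# A minimal "unit test suite": pass these, and this segment of code
# can be trusted without knowing anything about its callers.
assert crc8(b"") == 0
assert crc8(b"\x00") == 0
assert crc8(b"\x01") == 0x07   # a single low bit reduces to the polynomial
```

No equivalent fencing exists for a circuit block, which is the point of the paragraphs that follow.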

In hardware engineering, that kind of isolation is a liability. As a working electrical design engineer, you should be working tightly with your mechanical and SW/FW/GW teams to optimize what you're building. The massive context and knowledge base you collectively synthesize a design from is a huge benefit, and things like your phone or laptop, or any piece of spaceflight hardware, would not be possible without it.

Example: take something like a motor controller. Easy peasy, you say. Grab the best-stocked, reasonably priced TI IC off Digikey and slap its reference design into your copy of Altium Designer. If you give it its own power, thermal, and packaging solution, you can absolutely silo that component and hand it off to an AI agent that builds that piece for you.

Congrats, you've built a standalone motor control module, which you can also buy off of Digikey for a reasonable price that is much cheaper than the time you spent thinking about this.

Also congrats, systems engineering wants your head on a pike and mechanical engineering has taped a picture of your face to a football and is kicking it around in the parking lot.

If you're designing into a product, you're working with the mech and systems teams to create an integrated product that meets the systems/module requirements. The context for this includes not just circuit function but thermal performance, the EMI situation, whether there's room to push back on systems and product as you weigh thermal performance and device longevity against module volume, and global industrial geopolitics and its effect on part availability (there's a tariff tickbox in Digikey now, and during COVID I had to redesign parts several times before being able to actually build them, because parts became unavailable overnight due to panic buying)... the list is huge.

The cost of "compiling and running against the test suite" is also huge, because it typically involves weeks of answering questions and issues from the fab/assembly house, waiting for them to build and ship the board, doing electrical bring-up, and actually running the tests you care about...

It is also hard to catch design issues in schematic or layout reviews. We don't have comprehensive and ubiquitous models for electronic devices, so we can't economically simulate this stuff.

This huge cost means "mashing GO until the LLM spits out the right code" can't work, at all.

If you really do want to apply AI to EDA software, I think there's actually a really good use case in catching small issues in a board: things that are too small to surface in design reviews but have a meaningful impact on bring-up timelines for R&D test articles. Stupid things like having a footprint flipped, or drawing the schematic symbol for a slightly different version of the part that has a subtly different power pin configuration (my latest fuck-up). That's a fairly tightly containable problem, because our schematics all have links to vendor data and PDF datasheets that should be easily ingestible, and in practice a lot of EDA users copy pin configs into their tools by hand. I think AI would actually be good at catching the "dumb" errors that are hard for humans to see.
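A minimal sketch of such a checker, assuming pin tables can be extracted from both the vendor data and the EDA library (all part data below is made up): diff the schematic symbol's pin map against the datasheet's and report every disagreement.

```python
# Hypothetical pin tables for one part; in practice these would be
# parsed from the vendor datasheet and the EDA library respectively.
datasheet_pins = {1: "VDD", 2: "GND", 3: "SDA", 4: "SCL"}
symbol_pins    = {1: "VDD", 2: "GND", 3: "SDA", 4: "VDDIO"}  # wrong part variant

def pin_mismatches(datasheet: dict, symbol: dict) -> list[tuple]:
    """Return (pin_number, datasheet_name, symbol_name) for every
    pin where the two sources disagree, including missing pins."""
    return [(n, datasheet.get(n), symbol.get(n))
            for n in sorted(set(datasheet) | set(symbol))
            if datasheet.get(n) != symbol.get(n)]

print(pin_mismatches(datasheet_pins, symbol_pins))  # flags pin 4
```

Exactly the class of "dumb" error described above: trivially machine-checkable, yet easy for a human reviewer to miss in a dense schematic.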

numpad0 8 months ago | parent | prev [-]

So "not everything happens or can be done in literal language" is the part that got you?

bb88 8 months ago | parent [-]

18 GHz circuits have been around since 1973 was the part that got me.

Your response doesn't really add to the conversation so I'll stop here.