hansvm 13 hours ago
For "ordinary" working programmers (some denominator of reasonably common knowledge across the industry without any specific skills that help with CTEs in particular), there are a couple mental models I find helpful: 1. Recursion and iteration are duals of each other. Anywhere a "recursive" CTE is a good tool for the job, there exists a natural for-loop or while-loop you would write in an ordinary programming language to solve the same job. Figure out what that looks like, and then you can translate it to SQL. Optimizing further often isn't necessary. The general form isn't terrible (ellipses hide the extra columns you'd have to manually include in most SQL dialects):
1 (continued). You can explicitly encode for-loops and while-loops, doing all the normal sorts of things an ordinary programming language allows. SQL is just a different language for representing things you already know how to do; don't make it complicated until performance becomes a problem.

2. There exists a concept of "equality saturation" that's exceptionally powerful in a number of domains. Everything in (1) is familiar to an ordinary working programmer, but the database's mental model of a recursive CTE is closer to saturation: start from the rows you already have, keep applying the recursive step to whatever is new, and stop once a pass adds nothing you haven't already seen.
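A small sketch of that flavor (the edge table and the seed node 1 are made up purely for illustration): graph reachability, where the answer is "everything you can learn" rather than "the result of a loop":

    -- grow a set of facts until no rule can add anything new
    WITH RECURSIVE
    edge(src, dst) AS (                      -- a tiny hard-coded graph
        VALUES (1, 2), (2, 3), (3, 1), (4, 5)
    ),
    reachable(node) AS (
        SELECT 1                             -- seed: what we know before doing any work
        UNION                                -- UNION (not UNION ALL) drops already-derived facts,
                                             -- which is what lets the expansion hit a fixed point
        SELECT e.dst
        FROM edge e
        JOIN reachable r ON e.src = r.node   -- derive new facts only from current knowledge
    )
    SELECT node FROM reachable;              -- stops when a pass produces nothing new: 1, 2, 3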
2 (continued 0). One model of solving a sudoku puzzle is an iterative/recursive branch-and-bound algorithm (try a reasonable thing, expand all the easily learnable knowledge, exit if impossible or if done (recursing at each level), go to the next reasonable thing). One "equality saturation" version of that solution is (plus a stopping condition somewhere):

2a. For each partial potential solution

2b. For all the ways in which the potential solution is viable

2c. Expand into all those new potentialities

2 (continued 1). That ability to describe a monotonically increasing set of things which finitely approaches a boundary is powerful in math, powerful in search, powerful in optimization (the Rust EGG crate ("e-graphs good") has reasonable documentation for one particular concrete way that idea can manifest, if such concreteness helps you learn -- whether you know/like Rust or not), and so on. Gradient descent is just expanding your information till you don't have much more to learn. Optimizing a program is just expanding the set of optimization-based rewrites till you don't have any more rewrites (and then picking the best one). Parsing a document is just adding to the things you know about it till no parsing rule can glean any more information. Since that thinking modality is how the database treats your query, you'll usually have better-optimized queries if you can naturally formulate your problem in that language (as opposed to the naive iterative solution I proposed in (1)). Not all problems can be phrased that way, but if you're doing a lot of work in your database with recursive CTEs, it's a good idea to spend a week or three hammering the idea home.

3. Combining (1) and (2) a bit, your database will usually have a cost somewhere around O(all_the_rows_you_produce) when evaluating recursive CTEs. These tasks only get hard when performance is a problem and you have to figure out how to take the naive ideas from (1) and transform them into the expected model of (2) in a way that actually reduces unnecessary work. For the sudoku example, you can do that by adding a bit more internal state to transform the breadth-first search I described into a depth-first search (making the solution _more_ iterative, interestingly; the "natural" solution is comparatively very slow), but in general you might have to get very creative, or you might not actually have a reasonable solution available.
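I won't try to reproduce a sudoku solver here, but as a minimal sketch of the cost principle (reusing the toy edge table from above; the depth cap of 3 is arbitrary): carrying a little extra state inside the recursion means unwanted rows are never produced at all, whereas filtering outside the CTE lets the recursion do the work anyway.

    WITH RECURSIVE
    edge(src, dst) AS (                      -- same toy graph as above
        VALUES (1, 2), (2, 3), (3, 1), (4, 5)
    ),
    walk(node, depth) AS (
        SELECT 1, 0                          -- extra state: track how deep we've walked
        UNION ALL
        SELECT e.dst, w.depth + 1
        FROM edge e
        JOIN walk w ON e.src = w.node
        WHERE w.depth < 3                    -- the bound lives inside the recursion, so
                                             -- deeper rows are never produced; a filter
                                             -- outside the CTE wouldn't stop the recursion
    )
    SELECT node, depth FROM walk;

(With UNION ALL and a cyclic graph like this one, that in-recursion bound is also what makes the query terminate at all.)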