| ▲ | AIPedant 3 days ago |
| Articles like this indicate we should lock down a definition of "computation" that meaningfully distinguishes computing machines from other physical phenomena - a computation is a process that maps symbols (or strings of symbols) to other symbols, obeying certain simple rules[1]. A computer is a machine that does computations. In that sense life is obviously not a computation: it makes some sense to view DNA as symbolic, but it is misleading to do the same for the proteins it encodes. These proteins are solving physical problems, not expressing symbolic solutions to symbolic problems - a wrench is not a symbolic solution to the problem of a symbolic lug nut. From this POV the analogy of DNA to computer program is just wrong: both are analogous to blueprints, but not particularly analogous to each other. We should insist that DNA is no more "computational" than the rules that dictate how elements are formed from subatomic particles. [1] Turing computability, lambda definability, primitive recursion, whatever. |
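The symbols-to-symbols definition above can be made concrete with a toy sketch (a hypothetical illustration, not from the comment): a one-rule machine that maps a string of symbols to another string by a fixed local rule.

```python
# A minimal "symbols in, symbols out" machine in the sense of the
# definition above: each symbol is rewritten by a fixed transition
# table. The table itself is an arbitrary illustrative choice.
RULES = {"0": "1", "1": "0"}  # transition table: symbol -> symbol

def compute(tape: str) -> str:
    """Map a string of symbols to another string, obeying simple rules."""
    return "".join(RULES[s] for s in tape)

print(compute("0110"))  # prints "1001", the bitwise complement
```

Nothing in the machine above cares what "0" and "1" stand for; whether a given physical process counts as computing such a function is exactly what the thread goes on to debate.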
|
| ▲ | da_chicken 3 days ago | parent | next [-] |
| I don't think it's necessary to completely discard the idea. However, I do think it's important, at the end of it all, to ask: Okay, so what's the utility of this framework? What am I getting out of setting up my point of view this way? I'm reminded of an old YouTube video [0] that I rewatched recently. That video is "Every Zelda is the Darkest Zelda." Topically, it's completely different. But in it Jacob Geller talks about how there are many videos with fan theories about Zelda games where they're talking about how messed up the game is. Except, that's their only point. If you frame the game in some way, it's really messed up. It doesn't extract any additional meaning, and textually it's not what's present. So you're going through all this decoding and framing, and at the end your conclusion is... nothing. The Mario characters represent the seven deadly sins? Well, that's messed up. That's maybe fun, but it's an empty analysis. It has no insight. No bite. So, what's the result here other than: Well, that's neat. It's an interesting frame. But other than the thought to construct it, does it inform us of anything? Honestly, I'm not even sure it's really saying life is a form of programming. It seems equally likely it's saying programming is a form of biochemistry (which, honestly, makes more sense given the origins of programming). But even if that were so, what does that give us that we didn't already know? I'm going to bake a pie, so I guess I should learn Go? No, the idea feels descriptive rather than a synthesis. Like an analogy without the conclusion. The pie has no bite. [0]: https://youtu.be/O2tXLsEUpaQ |
| |
| ▲ | dsign 3 days ago | parent | next [-] | | > I don't think it's necessary to completely discard the idea. However, I do think it's important, at the end of it all, to ask: Okay, so what's the utility of this framework? What am I getting out of setting up my point of view this way? That's the important question indeed. In particular, classing life as a computation means that it's amenable to general theories of computation. Can we make a given computation--an individual--non-halting? Can we configure a desirable attractor, i.e. remaining "healthy" or "young"? Those are monumentally complex problems, and nobody is going to even try to tackle them while we still believe that life is a mixture of molecules dunked in unknowable divine aether. Beyond that, the current crop of AI gets closer than anything we have had before to general intelligence, and when you look under the hood, it's literally a symbols-in symbols-out machine. To me, that's evidence that symbols-in symbols-out machines are a pretty general conceptual framework for computation, even if concrete computation is actually implemented in CPUs, GPUs, or membrane-delimited blobs of metabolites. | |
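The "desirable attractor" framing has a concrete analogue in computational models mentioned later in the thread: a Hopfield network stores a pattern as an attractor, and perturbed states relax back to it. A minimal sketch (the pattern and network size are arbitrary, purely illustrative):

```python
# Toy Hopfield network: store one pattern as an attractor, then show
# a perturbed state relaxing back to it. All values are illustrative.
def train(pattern):
    n = len(pattern)
    # Hebbian weights w[i][j] = p_i * p_j, with no self-connections
    return [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
            for i in range(n)]

def step(weights, state):
    # Synchronous update: each unit takes the sign of its weighted input
    return [1 if sum(w * s for w, s in zip(row, state)) >= 0 else -1
            for row in weights]

healthy = [1, -1, 1, 1, -1]     # the "desirable" attractor
weights = train(healthy)
perturbed = [1, -1, 1, -1, -1]  # one unit flipped away from the attractor
recovered = step(weights, perturbed)
print(recovered == healthy)     # prints True: the dynamics restore the pattern
```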
| ▲ | vidarh 2 days ago | parent | prev [-] | | The very immediate utility, if life is computation, would be to tell us that life is possible to simulate, and that AGI is possible (because if there is no "magic spark" of life, then the human brain would be existence proof that a power- and space-efficient computer capable of general intelligence can be constructed, however hard it might be). If life is not a computation, then neither of those is a given. But it has other impacts too, such as moral impacts. If life is a computation, then that rules out any version of free will that involves effective agency (a compatibilist conception of free will is still possible, but that does not involve effective agency, merely the illusion of agency), and so blaming people for their actions would be immoral as they could not at any point have chosen differently, and moral frameworks for punishment would need to center on minimising harm to everyone including perpetrators. That is a hard pill to swallow for most. It has philosophical implications as well, in that proof that life is computation would mean the simulation argument becomes more likely to hold. |
|
|
| ▲ | Xcelerate 3 days ago | parent | prev | next [-] |
| > a computation is a process that maps symbols (or strings of symbols) to other symbols, obeying certain simple rules[1] There are quite a number of people who believe this is the universe. Namely, that the universe is the manifestation of all rule sets on all inputs at all points in time. How you extract quantum mechanics out of that... not so sure |
|
| ▲ | dsign 3 days ago | parent | prev | next [-] |
| > In that sense life is obviously not a computation: it makes some sense to view DNA as symbolic but it is misleading to do the same for the proteins they encode. Proteins can also be seen as a sequence of symbols: one symbol for each amino acid. But that's beside the point. Computational theory uses Turing machines as a conceptual model. The theories employ some human-imposed conceptual translation to encode what happens in a digital processor or a Lego computer, even if those are not made with a tape and a head. Anybody who actually understands these theories could try to make a rigorous argument for why biological systems are Turing machines, and I give them very high chances of succeeding. > These proteins are solving physical problems, not expressing symbolic solutions to symbolic problems This sentence is self-contradictory. If a protein solves a physical problem and it can only do so because of its particular structure, then its particular structure is an encoding of the solution to the physical problem. How that encoding can be "symbolic" is more of a question for the beholder (us, humans), but as stated before, using the amino acid sequence gives one such symbolic encoding. Another symbolic encoding could be the local coordinates of each atom of the protein, up to the precision limits allowed by quantum physics. The article correctly states that biological computation is full of randomness, but it also explains that computational theories are well furnished with revolving doors between randomness and determinism (pseudo-random numbers and Hopfield networks are good examples of conduits in either direction). > ... whatever. Please don't use this word to finish an argument where there are actual scientists who care about the subject. |
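The "one symbol for each amino acid" reading can be made literal: translation from codons to residues is a string-to-string mapping. A sketch using a small excerpt of the standard genetic code (only four entries, enough for the example):

```python
# Translating an mRNA string into a one-letter-per-amino-acid string:
# a literal symbols-to-symbols mapping. Only a few entries of the
# standard genetic code are listed here.
CODON_TABLE = {
    "AUG": "M",  # methionine (also the start codon)
    "UUU": "F",  # phenylalanine
    "GGC": "G",  # glycine
    "UAA": "",   # stop codon
}

def translate(mrna: str) -> str:
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue == "":  # stop codon ends translation
            break
        protein.append(residue)
    return "".join(protein)

print(translate("AUGUUUGGCUAA"))  # prints "MFG"
```

Whether this mapping is "really" symbolic or just chemistry is, of course, the point under dispute.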
|
| ▲ | jes5199 3 days ago | parent | prev | next [-] |
| Our relationship to computation got weird when we moved to digital computers. Like, I don’t think anyone was saying “life is like millions of slide-rules solving logarithms in parallel”. But now that computers are de-materialized, they can be a metaphor for pretty much anything |
| |
| ▲ | mannykannot 3 days ago | parent [-] | | Good point - maybe the analogy to computation arises simply because digital computation and the synthesis of DNA, RNA and proteins are all performed by discrete-state machines? | | |
| ▲ | jes5199 3 days ago | parent [-] | | does DNA/RNA keep state other than the position of the read head? | | |
|
|
|
| ▲ | vidarh 2 days ago | parent | prev | next [-] |
| By your definition, life is obviously a computation. The symbolic nature of digital computers is our interpretation on top of physical "problems". If we attribute symbols to the proteins encoded by DNA, symbolic computation takes place. If we don't attribute symbols to the voltages in a digital computer, we could equally dismiss them as not being computers. And we have a history of analogue computers as well, e.g. water-based computation[1][2], to drive home that computers are solving physical problems in the process of producing what we then interpret as symbols. There is no meaningful distinction. The question of whether life is a computation hinges largely on whether life can produce outputs that can not be simulated by a Turing complete computer, and that can not be replicated by an artificial computer without some "magic spark" unique to life. Even in that case, there'd be the question of whether those outputs were simply the result of some form of computation, just outside the computable set inside our universe, but at least in that case there'd be a reasonable case for saying life isn't a computation. As it is, we have zero evidence to suggest life exceeds the Turing computable. [1] https://en.wikipedia.org/wiki/Water_integrator [2] https://news.stanford.edu/stories/2015/06/computer-water-dro... |
|
| ▲ | ants_everywhere 3 days ago | parent | prev | next [-] |
| I think you may be forgetting about analog computers https://en.wikipedia.org/wiki/Analog_computer |
| |
| ▲ | lmm 3 days ago | parent [-] | | I don't think they are. The things analog computers work on are still symbolic - we don't care about the length of the rod or what have you, we care about the thing the length of the rod represents. | | |
| ▲ | ants_everywhere 3 days ago | parent [-] | | analog computers don't generally compute by operating on symbols. For example see the classic video on fire control computers https://youtu.be/s1i-dnAH9Y4?t=496 OP's specific phrasing is that they "map symbols to symbols". Analog computers don't do that. Some can, but that's not their definition. Turing machines et al. are a model of computation in mathematics. Humans do math by operating on symbols, so that's why that model operates on symbols. It's not an inherent part of the definition. | | |
| ▲ | lmm 3 days ago | parent | next [-] | | > analog computers don't generally compute by operating on symbols. For example see the classic video on fire control computers https://youtu.be/s1i-dnAH9Y4?t=496 > OP's specific phrasing is that they "map symbols to symbols". Analog computers don't do that. Some can, but that's not their definition. How is that not symbolic? Fundamentally that kind of computer maps the positions of some rods or gears or what have you to the positions of some other rods or gears or what have you, and the first rods or gears are symbolising motion or elevation or what have you and the final one is symbolising barrel angle or what have you. (And sure, you might physically connect the final gear directly to the actual gun barrel, but that's not the part that's computation; the computation is the part happening with the little gears and rods in the middle, and they have symbolic meanings). | | |
| ▲ | defrost 3 days ago | parent [-] | | There's a confusion of nomenclature. Computers are functional mappings from inputs to outputs, sure. Analog fire computers are continuous mappings from a continuum, a line segment (curved about a cam), to another continuum, a dial perhaps. Symbolic operations, mapping from patterns of 0s and 1s (say) to other patterns, are discrete, countable mappings. With a real-valued electrical current, discrete symbols are forced by threshold levels. | |
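The threshold point can be shown directly: a continuous voltage only becomes a discrete symbol once a cutoff is imposed. A sketch (the 1.5 V threshold and the trace are arbitrary illustrative values):

```python
# A continuous quantity carries no symbols until thresholds impose them.
# Here a noisy analog trace is quantized into bits; the cutoff is an
# arbitrary illustrative choice.
THRESHOLD = 1.5  # volts: at or above reads as 1, below reads as 0

def digitize(voltages):
    """Map real-valued samples onto the discrete alphabet {0, 1}."""
    return [1 if v >= THRESHOLD else 0 for v in voltages]

trace = [0.2, 0.4, 2.9, 3.1, 0.1, 2.8]  # noisy analog samples
print(digitize(trace))  # prints [0, 0, 1, 1, 0, 1]
```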
| ▲ | lmm 2 days ago | parent | next [-] | | > Analog fire computers are continuous mappings from a continuum, a line segment (curved about a cam), to another continuum, a dial perhaps. > Symbolic operations, mapping from patterns of 0s and 1s (say) to other patterns are discrete, countable mappings. What definition of "symbolic" are you using that draws a distinction between these two cases? If it means merely something that symbolises something else (as I would usually use it), then both a position on a line segment and a pattern of voltage levels qualify. If you mean it in the narrow sense of a textual mark, that pattern of voltage levels is just as much not a "symbol" as the position on the line segment. | |
| ▲ | emmelaich 3 days ago | parent | prev [-] | | To what degree is the threshold precise? Maybe fundamentally there's not that much difference. |
|
| |
| ▲ | AIPedant 3 days ago | parent | prev [-] | | No, analog computers truly are symbolic. The simplest analog computer - the abacus - is obviously symbolic, and this is also true for WW2 gun fire control computers, ball-and-shaft integrators, etc. They do not use inscriptions, which is maybe where you're getting confused. But the turning of a differential gear to perform an addition is a symbolic operation: we are no more interested in the mechanics of the gear than we are the calligraphy of a written computation or the construction of an abacus bead; we are interested in the physical quantity that gear is symbolically representing. Your comment is only true if you take an excessively reductive view of "symbol." | |
| ▲ | ants_everywhere 3 days ago | parent | next [-] | | I'm not confused, and an abacus is a digital computer. You keep referring to what we are interested in, but that's not a relevant quantity here. A symbol is a discrete sign that has some sort of symbol table (explicit or not) describing the mapping of the sign to the intended interpretation. An analog computer often directly solves the physical problem (e.g. an ODE) by building a device whose behavior is governed by that ODE. That is, it solves the ODE by just applying the laws of physics directly to the world. If your claim is that analog computers are symbolic but the same physical process is not merely because we are "interested in" the result then I don't agree. And you'd also be committed to saying proteins are symbolic if we build an analog computer that runs on DNA and proteins. In which case it seems like they become always symbolic if we're always interested in life as computation. | | |
| ▲ | AIPedant 3 days ago | parent [-] | | This is where you are confused - in fact just plain wrong: > A symbol is a discrete sign that has some sort of symbol table (explicit or not) describing the mapping of the sign to the intended interpretation
Symbols do not have to be discrete signs. You are thinking of inscriptions, not symbols. Symbols are impossible for humans to define. For an analog computer, the physical system of gears / etc symbolically represents the physical problem you are trying to solve. X turns of the gear symbolizes Y physical kilometers. |
| |
| ▲ | zabzonk 3 days ago | parent | prev [-] | | Surely an abacus is a simple form of digital computer? The position/state of the beads is not continuous, ignoring the transient motion between positions. |
|
|
|
|
|
| ▲ | antegamisou 3 days ago | parent | prev [-] |
| I think the notion largely boils down to another dogmatic display of tech industry's megalomania. |
| |
| ▲ | ok_dad 3 days ago | parent [-] | | In what sense? I agree the tech industry fucking sucks right now, but I don't see how this has anything to do with that. A physical computer is still a computer, no matter what it's computing. The only use a computer has to us is to compute things relative to physical reality, so a physical computer seems even closer to a "real computer" or "real computation" to me than our sad little hot rocks, which can barely simulate anything real to any degree of accuracy, when compared to reality. | | |
| ▲ | MountDoom 3 days ago | parent [-] | | I suspect what the parent is alluding to is that we tend to reduce everything to computer-world analogies, which we believe we're uniquely qualified to analyze. It's sort of like a car mechanic telling you "SQL query, eh? It must be similar to what happens in an intake manifold." For all I know, there might be Turing-equivalency between databases and the inner workings of internal combustion engines, but you wouldn't consider that to be a useful take. |
|
|