| ▲ | AdieuToLogic 4 days ago |
> I'm not sure English is a bad way to outline what the system should do.

It isn't, as this is how stakeholders convey needs to those charged with satisfying same (a.k.a. "requirements"). Where expectations become unrealistic is in believing language models can somehow "understand" those outlines as a human expert would, in order to produce an equivalent work product.

Language models produce nondeterministic results based on the statistical model derived from their training data set(s), with varying degrees of relevance as determined by the persons interpreting the generated content. They do not understand "what the system should do."
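A toy Python sketch of that last point, with an invented three-token distribution standing in for statistics a real model would derive from its training data: sampling from the distribution is what makes the output nondeterministic, and nothing in the loop models what the requirement means; continuations are merely ranked and drawn by plausibility.

    import random

    # Invented toy statistics standing in for P(next token | prompt):
    next_token = {"log": 0.40, "retry": 0.35, "halt": 0.25}

    def sample(dist):
        # Draw one token in proportion to its weight; repeated runs can differ.
        tokens, weights = zip(*dist.items())
        return random.choices(tokens, weights=weights, k=1)[0]

    for _ in range(3):
        print("on error, the system should", sample(next_token))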
| ▲ | veqq 4 days ago | parent | next [-] |
> not sure English is a bad way to outline

Human language is imprecise and allows unclear and logically contradictory things, besides not being checkable. That's literally why we have formal languages and programming languages, and why things like COBOL failed: https://alexalejandre.com/languages/end-of-programming-langs...
| ▲ | stinkbeetle 4 days ago | parent [-] |

> Human language is imprecise and allows unclear and logically contradictory things,

Most languages do. What does "x = true, x = false" mean? It's unclear. It looks contradictory. Human language allows for clarification to be sought and adjustments made.

> besides not being checkable.

It's very checkable. I check claims and assertions people make all the time.

> That's literally why we have formal languages and programming languages, and why things like COBOL failed: https://alexalejandre.com/languages/end-of-programming-langs...

"Formal languages" are at some point specified and defined by human language. Human language can be as precise, clear, and logical as a speaker intends, all the way to specifying "formal" systems.
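A minimal Python sketch of the disagreement above (the satisfiable helper is invented here for illustration, not from the thread): read "x = true, x = false" as two simultaneous constraints and a checker mechanically reports the contradiction; read it as sequential assignment and it is perfectly well defined. Formal notation is what lets a tool, rather than a clarifying question, resolve the ambiguity.

    from itertools import product

    def satisfiable(constraint, names):
        # Brute-force check: does any True/False assignment satisfy the constraint?
        return any(constraint(dict(zip(names, values)))
                   for values in product([True, False], repeat=len(names)))

    # Reading 1: simultaneous constraints -- detected as a contradiction.
    print(satisfiable(lambda env: env["x"] and not env["x"], ["x"]))  # False

    # Reading 2: sequential assignment -- no contradiction, just rebinding.
    x = True
    x = False
    print(x)  # False on every run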
| ▲ | DonHopkins 4 days ago | parent [-] |

Let X=X.
You know, it could be you.
It's a sky-blue sky.
Satellites are out tonight.
Language is a virus! (mmm)
Language is a virus!
Aaah-ooh, ah-ahh-ooh
Aaah-ooh, ah-ahh-ooh
| ▲ | idopmstuff 4 days ago | parent | prev | next [-] |
This is just semantics. You can say they don't understand, but I'm sitting here with Nano Banana Pro creating infographics, and it's doing as good a job as my human designer does with the same kinds of instructions. Does it matter if that's understanding or not?
| ▲ | AdieuToLogic 4 days ago | parent [-] |

> This is just semantics.

Precisely my point:

    semantics: the branch of linguistics and logic concerned with meaning.

> You can say they don't understand, but I'm sitting here with Nano Banana Pro creating infographics, and it's doing as good a job as my human designer does with the same kinds of instructions. Does it matter if that's understanding or not?

Understanding, when used in its unqualified form, implies people possessing same. As such, it is a metaphysical property unique to people and defined wholly therein.

Excel "understands" well-formed spreadsheets by performing specified calculations. But who defines those spreadsheets? And who determines the result to be "right?"

Nano Banana Pro "understands" instructions to generate images. But who defines those instructions? And who determines the result to be "right?"

"They" do not understand. You do.
| ▲ | bonoboTP 4 days ago | parent | next [-] |

"This is just semantics" is a set phrase in English; it means that the issue being discussed is merely about definitions of words, not about the substance (the object level). And generally the point is that it does not matter whether we call what they do "understanding" or not. It will have the same kind of consequences in the end, economic and otherwise.

This is basically the number one hangup people have had about AI systems, all the way back to Turing's time. The consequences will come from AI's ability to produce certain types of artifacts and perform certain types of transformations of bits. That's all we need for all the scifi stuff to happen.

Turing realized this very quickly, and his famous Turing test is exactly about making this point. It's not an engineering kind of test; it's a thought experiment arguing that it does not matter whether it's just "simulated understanding." A simulated cake is useless; I can't eat it. But simulated understanding can have real-world effects of the exact same sort as real understanding.
| ▲ | AdieuToLogic 4 days ago | parent [-] |

> "This is just semantics" is a set phrase in English; it means that the issue being discussed is merely about definitions of words, not about the substance (the object level).

I understand the general use of the phrase and used same as an entryway to broach a deeper discussion regarding "understanding."

> And generally the point is that it does not matter whether we call what they do "understanding" or not. It will have the same kind of consequences in the end, economic and otherwise.

To me, when the stakes are significant enough that the economic impacts of this technology are already visible, it is important for people to know where understanding resides. It exists exclusively within oneself.

> A simulated cake is useless; I can't eat it. But simulated understanding can have real-world effects of the exact same sort as real understanding.

I agree with you in part. Simulated understanding absolutely can have real-world effects when it is presented and accepted as real understanding. When simulated understanding is known to be unrelated to real understanding and treated as such, its impact can be mitigated. To wit, few believe parrots understand the sounds they reproduce.
| ▲ | nick__m 4 days ago | parent [-] |

Your view on parrots is wrong! Parakeets don't understand, but some parrots are exceptionally intelligent. African grey parrots do understand the words they use; they don't merely reproduce them. Once mature, they have the intelligence (and temperament) of a 4- to 6-year-old child.
| ▲ | AdieuToLogic 4 days ago | parent [-] |

> Your view on parrots is wrong!

There's a good chance of that.

> African grey parrots do understand the words they use; they don't merely reproduce them. Once mature, they have the intelligence (and temperament) of a 4- to 6-year-old child.

I did not realize I could discuss with an African grey parrot the shared experience of how difficult it was to learn to tie my shoelaces, or what it felt like to go every day to a place (school) that was not my home. I stand corrected.
| ▲ | dhoe 4 days ago | parent | prev | next [-] |

You can, of course, define understanding as a metaphysical property that only people have. If you then try to use that definition to determine whether a machine understands, you'll have a clear answer for yourself. The whole operation, however, does not lead to much understanding of anything.
| ▲ | AdieuToLogic 4 days ago | parent [-] |

>> Understanding, when used in its unqualified form, implies people possessing same.

> You can, of course, define understanding as a metaphysical property that only people have.

This is not what I said. What I said was that unqualified use of "understanding" implies the understanding people possess, which makes it a metaphysical property by definition, existing strictly within a person.

Many other entities possess their own form of understanding. Most would agree mammals do. Some would say any living creature does. I would make the case that every program compiler (C, C#, C++, D, Java, Kotlin, Pascal, etc.) possesses understanding of a particular sort. All of the aforementioned examples differ from the kind of understanding people possess.
| ▲ | DonHopkins 4 days ago | parent | prev | next [-] |

The visual programming language for scripting human and object behavior in The Sims is called "SimAntics": https://simstek.fandom.com/wiki/SimAntics
| ▲ | throw310822 4 days ago | parent | prev [-] |

> it is a metaphysical property unique to people

So basically your thesis is also your assumption.
| ▲ | kjkjadksj 4 days ago | parent | prev [-] |
When do we jump the shark and replace the stakeholders with AI acting in their best interest (tm)? It seems that would come soon. It makes no sense to me that we'd obsolete engineering talent but keep around, for reasons, the people who got a 3.1 GPA in a business program. Once we hit that point, just dispense with English and have the models communicate with each other in binary. We can go play with sticks in caves.