Monday, June 29, 2020


Is Natural Language Recursion or Dereferencing? - A comment on the article "Recursive language and modern language were acquired simultaneously ..."

https://phys.org/news/2019-08-recursive-language-modern-simultaneously-years.html?fbclid=IwAR04SqHIJKrbZR52gvllBiM2qJp3-xhZ8aJ0vQXW1-hwEcplsjTw8ZccglE

And the discussion by B.K. of CogAlg: https://www.facebook.com/boris.kazachenko.5/posts/10216534326967246
"a snake on the boulder to the left of the tall tree that is behind the hill,"  
"Prefrontal Synthesis (PFS)."  
" Similarly, nested explanations, such as "a snake on the boulder to the left of the tall tree that is behind the hill,"  force listeners to use PFS to combine objects (a snake, the boulder, the tree, and the hill) into a novel scene. Flexible object combination and nesting (otherwise known as recursion) are characteristic features of all human languages. For this reason, linguists refer to modern languages as recursive languages."

The bold part (bold - mine) is a general characteristic of codes.

I didn't see the capacity of "working memory" mentioned, while it is crucial and general - both in the classical "7±2" sense and in the more general sense of the complexity of the representations one can productively work with, which correlates with the complexity of the matters one can deal with.

Nested and recursive are not synonyms by default, IMO: they *could* be, but are not obliged to be. Recursion is about self-reference, while the example IMO is not "self"-reference, unless one assumes such a "function call" mechanism. It is *chaining* and redirection of relations (a list, a graph, a network), and some of the parts might not be processed the same way, thus not "recursion".
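A minimal sketch of the distinction (my toy example, not from the article): the "snake" scene can be walked as a plain chain of redirections, with no self-referential call anywhere:

```python
# The "snake" sentence as a chain of located objects (hypothetical encoding).
# Each entry just redirects to the next landmark - chaining, not recursion.
scene = {
    "snake": ("on", "boulder"),
    "boulder": ("to the left of", "tree"),
    "tree": ("behind", "hill"),
    "hill": None,  # end of the chain
}

def follow_chain(obj):
    """Iteratively walk the chain of relations - no self-reference needed."""
    parts = [obj]
    while scene[obj] is not None:
        relation, next_obj = scene[obj]
        parts.append(f"{relation} {next_obj}")
        obj = next_obj
    return ", ".join(parts)

print(follow_chain("snake"))
# snake, on boulder, to the left of tree, behind hill
```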

( Sure, one can call "recursion" whatever she wants; e.g. regarding CogAlg there was a time - I don't know if it is still the case - when "recursion" and "iteration" were used interchangeably, although they are opposite methods in programming. The generalization of both is "a stage of processing". )
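For contrast, here is the same traversal written both ways (a toy sketch): the recursive version refers to itself, the iterative one is an explicit loop; both are just "a stage of processing" over the same chain:

```python
links = {"snake": "boulder", "boulder": "tree", "tree": "hill", "hill": None}

def depth_recursive(node):
    """Recursion: the function calls itself on the next node."""
    if links[node] is None:
        return 0
    return 1 + depth_recursive(links[node])

def depth_iterative(node):
    """Iteration: a loop with explicit state, no self-reference."""
    depth = 0
    while links[node] is not None:
        node = links[node]
        depth += 1
    return depth

assert depth_recursive("snake") == depth_iterative("snake") == 3
```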

As for redirection, a famous quote in Computer Science says that "we can solve everything with one more level of redirection". The original uses another word:

Indirection | Dereferencing | Redirection

"We can solve any problem by introducing an extra level of indirection."  
Indirection, dereferencing: https://en.wikipedia.org/wiki/Indirection
https://en.wikipedia.org/wiki/Fundamental_theorem_of_software_engineering
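A sketch of the proverb in code (the names and tables are made up for illustration): instead of binding a caller directly to a target, bind it to an entry in a table that can be re-pointed, so one extra lookup lets the target change without touching any caller:

```python
# One extra level of indirection: callers go through a re-pointable table.
handlers = {"storage": "disk"}  # re-point this single entry to switch backends

backends = {
    "disk":   lambda data: f"wrote {len(data)} bytes to disk",
    "memory": lambda data: f"cached {len(data)} bytes in memory",
}

def store(data):
    return backends[handlers["storage"]](data)  # dereference twice

print(store(b"hello"))           # wrote 5 bytes to disk
handlers["storage"] = "memory"   # one change redirects every caller
print(store(b"hello"))           # cached 5 bytes in memory
```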

Processor's Indirection

It starts with the CPUs and their addressing modes, present even in simple CPUs: read the value at the address pointed to by the current content of a register, offset by the content of a selected index register (read the content of the index register, multiply it by the selected word size - byte, word, doubleword, quadword - and add it to the base address; those selections are encoded in the opcodes of the instructions), and store the value that was read back into the same source register.

Say:  A - Accumulator, X - IndeX register

MOV A, @[A+X]    ; A = memory[A + X*word_size]

The loaded value could itself point to an address which is used as the address of another table; add an offset from another register to it, add something else, etc.
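A tiny simulation of those two steps (the word size, register names and memory contents are arbitrary, just to make the dereferences explicit):

```python
WORD = 4                                  # word size selected by the opcode
memory = {0: 40, 4: 44, 8: 100, 100: 42}  # toy address -> value map

A, X = 0, 2                               # A - Accumulator, X - IndeX register

# MOV A, @[A+X]: effective address = A + X*WORD = 8; load, store back into A
A = memory[A + X * WORD]                  # A = memory[8] = 100

# The loaded value is itself an address into another table - dereference again
A = memory[A]                             # A = memory[100] = 42
print(A)                                  # 42
```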

This is constantly done when parsing and expanding high-level code and data structures as well: first load one table of addresses, search it with a key (an identifier), find where it redirects, get another table, etc.
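For instance, a compiler or interpreter resolving an identifier walks a chain of scope tables in just this way (a hypothetical sketch):

```python
# Nested scopes as chained tables; each miss redirects to the parent table.
global_scope = {"x": 1}
outer_scope  = {"y": 2, "__parent__": global_scope}
inner_scope  = {"z": 3, "__parent__": outer_scope}

def resolve(name, scope):
    """Follow the chain of tables until the key is found."""
    while scope is not None:
        if name in scope:
            return scope[name]
        scope = scope.get("__parent__")
    raise NameError(name)

print(resolve("x", inner_scope))  # found two redirections away: 1
```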

It is similar to linguistic dereferences like:

"This is the book of the girl that lives on the third floor in your house".

All of that is easier to express in code and in graphs/flowcharts with arrows than in text, because it is naturally "spatial" and connected.
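For example, the "book" sentence above becomes a short chain of arrows (a made-up object graph):

```python
# "This is the book of the girl that lives on the third floor in your house."
house = {"kind": "house", "owner": "you"}
floor = {"kind": "floor", "number": 3, "in": house}
girl  = {"kind": "girl", "lives_on": floor}
book  = {"kind": "book", "of": girl}

# Each "of / that lives on / in" is one dereference, one arrow in the graph:
print(book["of"]["lives_on"]["in"]["owner"])  # -> you
```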

...

Regarding the example from the article ("a snake on the boulder to the left of the tall tree that is behind the hill"):
It can be traversed and processed recursively, but it could also be done iteratively, up to a particular "depth" or length. Since in the human case that length is quite limited anyway and does not grow infinitely, unlike what the artificial syntactic examples claim, most people very quickly get lost in the relations between words - probably like in this sentence.
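Concretely (a toy sketch, with a hypothetical extra landmark appended to the article's example): an iterative walker with an explicit capacity budget gets "lost" exactly where a listener would:

```python
def understand(phrase_chain, capacity):
    """Hold chain elements in 'working memory' until the budget runs out."""
    held = []
    for part in phrase_chain:
        if len(held) >= capacity:
            return held, "...lost the thread here"
        held.append(part)
    return held, "understood"

chain = ["snake", "on boulder", "left of tree", "behind hill", "near the river"]
print(understand(chain, capacity=4))
# (['snake', 'on boulder', 'left of tree', 'behind hill'], '...lost the thread here')
```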

That has a simple general interpretation, though, and exposure to nested input can't help it: too limited working memory (resources) does not allow one to spawn and hold enough objects/patterns and relations in order to understand their relations or to combine them into new ones. Mere activation/recall is cheaper, so a system may be able just to recall memories and reconstruct already-visited patterns, which may require fewer activations in the higher levels (PFC), but it could not sustain an even higher level which combines many of these into new imagined entities. The relations are patterns themselves, and they seem to be expensive ones for humans.

Even if you had that skill of "PFS" within your cognitive repertoire and could generate sequences in principle - if you had that "modern imagination" - the result would be the same if the available resources allow you to manage just one level of a certain complexity.

This is exemplified by the complexity of the sentences and language one uses, which is supposed to grow during language acquisition, but which has a limit. The same goes for the complexity of programming-code expressions relative to a developer's skills.

The math "word problems" given to students are also tests of working-memory capacity:

John has 5 apples, Mary has 3 apples more, but she gave 2 to Kate, who had 1. How many apples does Mary have now?

These problems require a correspondingly big enough working memory for such patterns, in order to hold all the elements, and not just the ability in principle to do "recursion" (or nesting, or chaining).
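Written out, each clause is one more binding to keep alive (a trivial sketch):

```python
john = 5          # "John has 5 apples"
mary = john + 3   # "Mary has 3 apples more" -> 8
kate = 1          # "Kate had 1" - a distractor that must be held anyway
mary -= 2         # "she gave 2 to Kate"
kate += 2
print(mary)       # 6 - the answer, but four bindings had to stay in memory
```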

...

Mapping to vocalisation

I agree with Boris that a mapping to vocalisation exists (and to recording spoken words in working memory as well). I've sometimes measured the length of text which I can remember and write down correctly while listening to a talk, a show, etc. See also my old compressed definition of natural language, given for example in the "What's Wrong With the NLP" series:

Todor: Natural language is a hierarchical redirection/abstraction/generalization/compression of sequences of multi-modal sensory inputs and motor outputs, and records and predictions for both.
When we learn the language in a multi-modal net which includes motor commands to the vocal tract and sensory input from the records of other people talking, from our own sounds and utterances, and from expectations of what sound would be produced when we execute this or that sequence of motor commands to the vocal system starting from the current state, etc., then it would recall memories of "vocalization" etc.

https://en.wikipedia.org/wiki/Vocal_tract


Declarative Memory, Hippocampus, Consciousness, Creativity, Sequences

I also agree that there is a correlation/relation between declarative memory capacity/clarity/skills and general intelligence and creativity (I have at least myself as an example of that phenomenon). So if the hippocampus is required and important for the former, then it would be important for general hierarchical sequence generation and analysis/re-synthesis as well, which is, in short, what producing and cognition of creative works is - either "hard" ones such as code or scientific theories, or more artistic ones such as creative writing, music, films. Also, "declarative memory" is required for navigation as "memory": you must be able to record and compare what you have visited and what is "history", in what sequence, what was prior and what follows, etc.

It is known from research that the capacity of working memory strongly correlates with the G-factor (general intelligence), which suggests either or both of the following: that there is general processing within the "mind" and this memory is somewhat distributed; and/or something that Schopenhauer suggested in the 19th century, if I remember correctly - that the difference between the genius and the average/mediocre mind is actually quantitative, not qualitative; the latter just runs "out of memory" too quickly and cannot reach the complexity and length required to understand, discover or produce something "more meaningful" or more original than the expected*.


* If I'm not mistaken, that was in "Parerga and Paralipomena".

Prepositions

That reminds me of insights I had about "prepositions" in language and how they relate to Boris's Cognitive Algorithm theory as I knew and understood it back at the end of 2015.

I may revisit the discussion:

To be continued...


* Regarding the "prepositions", mentioned in the article, as material syntactical elements, they could be virtual, implied in the word forms, but also, technically, as someone mentions in the comments section, too, many ancient languages usually are with cases (падежи), the prepositions are scarce or auxilliary or still require cases - such as the European languages: Latin, Greek, German (thus Proto-English), Slavic - except for Bulgarian which gradually lost the cases (except a few informal and in some expressions, and we generally understand some of the cases in kin languages because they use morphemes/suffixes which we use and understand, such as - "у", "ов", "му" - etc. and there are archaisms and Old Bulgarian ("Old Church-Slavonic") which are known such as "Православному българскому народу", "Моли се Богу", и съвременното: "У нас" (At my home, at our home) и пр.
