The cognitive architecture of the new Perl AI is
so simple and straightforward that the heart of the
associative neural net can be seen at a glance while
one interacts with the AI in order to debug it.
A core mind-dump is displayed below after the human
user has entered "Boys make robots" into the AI Mind.
The mind-dump displays the following associative tags:
psi -- the concept number standing in for a conceptual neuron;
act -- the act(ivation) level of the concept or quasi-neuron;
hlc -- the human language code: en=English; de=German; ru=Russian;
pos -- the part of speech: 1=adj; 2=adv; 3=conj; 4=interj; 5=noun; 6=prep; 7=pron; 8=verb;
jux -- any juxtaposed concept, particularly an adverb such as NOT;
pre -- the previous concept associated with the psi concept;
tkb -- time-in-knowledge-base where a time-bound idea is stored;
seq -- the subsequent concept associated with the psi-concept;
num -- the grammatical number of a noun, pronoun, or verb;
mfn -- male-female-neuter 1-2-3 identifier of the gender;
dba -- "doing business as": case of a noun; person of a verb;
rv -- recall-vector time-point of a word stored in the @ear array.
Time "t" above is an internal counter geared to the phonemes
stored in the @ear auditory array for quasi-phonetic input.
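The flag-panel of associative tags listed above can be pictured as one record per concept in the knowledge base. A minimal sketch follows, in Python rather than Perl purely for illustration, with field names taken from the tag list; the non-zero values other than psi and seq are invented placeholders, not actual mind-dump output:

```python
# Hypothetical sketch of one conceptual "flag-panel" row, using the
# associative tags listed above. The real AI Mind is written in Perl;
# this Python dataclass is illustrative only.
from dataclasses import dataclass

@dataclass
class FlagPanel:
    psi: int   # concept number standing in for a conceptual neuron
    act: int   # activation level of the concept or quasi-neuron
    hlc: str   # human language code: "en", "de", "ru"
    pos: int   # part of speech: 1=adj ... 5=noun ... 8=verb
    jux: int   # juxtaposed concept, e.g. an adverb like NOT (0 = none)
    pre: int   # previous associated concept (0 = none)
    tkb: int   # time-in-knowledge-base where the idea is stored
    seq: int   # subsequent associated concept (0 = none)
    num: int   # grammatical number
    mfn: int   # gender: 1=male, 2=female, 3=neuter
    dba: int   # case of a noun; person of a verb
    rv: int    # recall-vector time-point into the @ear array

# A row for 589 "boys" might look like this (placeholder values):
boys = FlagPanel(psi=589, act=48, hlc="en", pos=5, jux=0, pre=0,
                 tkb=0, seq=835, num=2, mfn=1, dba=1, rv=23)
```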
Concepts are identified by a number such as 589 for "boy";
835 for "make"; and 571 for "robot". Any new concept in
the AI is assigned a new concept number, starting at 3001.
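That numbering scheme amounts to a simple lookup-or-mint rule: known words keep their fixed concept numbers, and any unknown word receives the next free number starting at 3001. A hypothetical Python sketch of the rule (the lexicon dict and function name are illustrative, not from the Perl source):

```python
# Sketch of concept-number assignment as described above: known words
# keep their fixed numbers; new words are minted starting at 3001.
NEW_CONCEPT_START = 3001

lexicon = {"boy": 589, "make": 835, "robot": 571}
next_new = NEW_CONCEPT_START

def concept_number(word):
    """Return the concept number for a word, minting one if unknown."""
    global next_new
    if word not in lexicon:
        lexicon[word] = next_new
        next_new += 1
    return lexicon[word]

print(concept_number("robot"))   # 571
print(concept_number("cyborg"))  # 3001 (first newly minted concept)
```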
The most recent addition to the Perl free AI source code
registers the associations among the words in the sentence.
Concept 589 "boys" above has a "seq" tag to 835 "make" as
the verb of which 589=boys is the subject: "Boys make...."
Concept 835 "make" has a "pre" tag back to 589-boys and
a "seq" tag forward to 571-robots: "Boys make robots".
Concept 571-robots has only a "pre" tag back to 835-make.
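The pre/seq chaining just described can be sketched as a single pass over the concepts of a sentence, tagging each one with its neighbors. A hypothetical Python sketch (the helper name and dict representation are illustrative; the actual registration happens in the Perl source):

```python
# Sketch of registering pre/seq associations for "Boys make robots":
# 589-boys --seq--> 835-make --seq--> 571-robots, with each "pre"
# tag pointing back along the same chain, and 0 meaning "no link".
def link_sentence(concepts):
    """Set pre/seq tags along a list of concept records (dicts)."""
    for i, c in enumerate(concepts):
        c["pre"] = concepts[i - 1]["psi"] if i > 0 else 0
        c["seq"] = concepts[i + 1]["psi"] if i < len(concepts) - 1 else 0
    return concepts

boys, make, robots = link_sentence(
    [{"psi": 589}, {"psi": 835}, {"psi": 571}])
# boys:   pre=0,   seq=835
# make:   pre=589, seq=571
# robots: pre=835, seq=0
```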
As the Perl AI grows more and more complex during
development, the details of the mind-state will remain
visible in the mind-dump.
Anyone may download both Strawberry Perl5 and the AI and
see results similar to the above in English or in Russian.
Mentifex here claims that associative neural-net AI is
actually a lot simpler in design and function than most
Netizens realize, for lack of True AI programs to inspect.
Even the most complex DeepMind or OpenCog or OpenCyc or
what-have-you AI will need to instantiate concepts and
their productive (generative) relations with stored words.
Anyone who wants to see the AI in action does not need to
wait for the completion of the Perl port, but may instead
run one of the four previous AI Minds as listed upthread.