> I would not have any problem with (AI's) hardcoded goals if they are
> guaranteed to stay fully compatible with our goals.
Nobody can give such a guarantee.
For instance, the desire to protect their families was part of the motivation of the suicide pilots who crashed into the Twin Towers in NYC in September 2001.
>In order to make AI to achieve such "high level goals",
>operator/admin has to carefully design set of "simple goals".
Given enough data, an AI can generate all the needed sub-goals and solutions.
Without assistance (in the form of goals) it's practically impossible to learn.
Without learning it's practically impossible to achieve high-level goals.
Without the sexual instinct, reproduction is practically impossible.
> Optionally, admin can specify rules which cannot be broken during the
> problem solving process.
The problem-solving process is too delicate to hand over to an admin.
It should be implemented by a developer under strict architect supervision.
>>It is impossible to educate without "desire to learn"
>> (read: "learning hardcoded goals") already implemented in AI.
> Not sure if I understand correctly. Assuming I do, I would say it
> applies to people, not to AI.
It applies to any intelligent learning system.
> The AI needs to be able to generate customized models for particular
> problem scenarios. The same question can be asked under different
> scenarios and the correct answers might be different or even contrary.
A different scenario means that the scenario will be mentioned in the question.
If a different scenario is mentioned in the question, a simple AI would generate a different answer.
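A minimal sketch of what I mean (the questions, scenarios, and answers are hypothetical, just for illustration): if the scenario is part of the question, a plain lookup keyed on both can already return different, even contrary, answers.

```python
# Hypothetical sketch: answers keyed by (question, scenario).
# When the scenario is mentioned in the question, a simple AI
# returns a different answer without any customized model.
answers = {
    ("is it safe to cross?", "green light"): "yes",
    ("is it safe to cross?", "red light"): "no",
}

def answer(question: str, scenario: str) -> str:
    # Fall back to "unknown" when this pair was never learned.
    return answers.get((question, scenario), "unknown")

print(answer("is it safe to cross?", "green light"))  # yes
print(answer("is it safe to cross?", "red light"))    # no
```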
> That's one of the reasons why your AI cannot work well. Another one is
> that it cannot generalize.
Generalization is a separate feature. It could be implemented later.
BTW, most humans don't generalize well.
They can borrow generalizations from other people, but typically don't create their own.
A simple AI will be able to borrow generalizations from NL text.
Bottom line: generalization ability is not a core AI feature.
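By "borrowing" I mean something as simple as this sketch (the pattern and sample text are hypothetical): pulling ready-made "X is a Y" statements out of text that other people wrote, instead of inventing the generalization yourself.

```python
import re

# Hypothetical sketch: "borrow" generalizations from NL text by
# extracting simple "X is a Y" statements someone else already made.
def borrow_generalizations(text):
    # Each match is a ready-made generalization; nothing is invented.
    return re.findall(r"(\w+) is an? (\w+)", text)

text = "A dog is an animal. A rose is a flower."
print(borrow_generalizations(text))
# [('dog', 'animal'), ('rose', 'flower')]
```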
>>BTW, these "logically advanced" humans are not necessarily the most
>>successful ones :-)
>Right.. Success takes some luck..
This is not about luck.
Strong communication skills and an efficient set of goals are far more important for intelligence than advanced logical skills.
> The most basic demo might be doable in a few days.
> The parser which does the inserts should be relatively easy to do.
Correct, I implemented it successfully.
But this is not a full demo, so there is nothing to show or experiment with yet.
> Put the sentence-parts into a single table as you have originally planned.
> Let it learn from locally stored text files...
The learning part will take longer to develop.
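For the record, the "parser which does the inserts" amounts to something like this sketch (single in-memory table, toy schema, all names hypothetical): sentence-parts from local text go into one table.

```python
import sqlite3

# Hypothetical sketch: adjacent word pairs from a text
# are inserted into a single table, as originally planned.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pairs (w1 TEXT, w2 TEXT)")

def learn(text):
    words = text.lower().split()
    for w1, w2 in zip(words, words[1:]):
        conn.execute("INSERT INTO pairs (w1, w2) VALUES (?, ?)", (w1, w2))

learn("the cat sat on the mat")
print(conn.execute("SELECT COUNT(*) FROM pairs").fetchone()[0])  # 5
```

The real learning part (reading locally stored text files, weighting the pairs) is what will take the extra time.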
>>If I need 1 digit precision, then my AI need just remember few hundred
> There is an infinite number of combinations.
With 10 digits???
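With 1-digit operands the combinations are finite and tiny. A sketch of the point (addition only, as an example):

```python
# With 1-digit operands there are only 10 * 10 = 100 addition facts,
# so an AI can simply remember them all instead of computing.
addition_facts = {(a, b): a + b for a in range(10) for b in range(10)}

print(len(addition_facts))     # 100
print(addition_facts[(2, 2)])  # 4
```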
> It's terribly limited if it cannot do calculation it did not observed.
Intelligence is possible without the ability to count.
History proves it.
>>Also my AI would use special math functions for calculations :-)
> Good, you are getting there ;-)..
Well, the NL text has to be processed first. After that, the need for a calculation should be identified. Then the parameters should be prepared and passed to the math functions.
To me it's obvious that an AI can work without math, but cannot work without NL processing.
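The pipeline above can be sketched in a few lines (a hypothetical toy handling only "+", just to show the order of steps: NL processing first, then parameter preparation, then the math function):

```python
import re

# Hypothetical sketch of the pipeline:
# 1. process the NL text,
# 2. identify that a calculation is needed,
# 3. prepare the parameters and pass them to a math function.
def answer(text):
    m = re.search(r"(\d+)\s*\+\s*(\d+)", text)
    if m:
        a, b = int(m.group(1)), int(m.group(2))
        return str(a + b)  # the math function does the arithmetic
    return "no calculation found"

print(answer("What is 2 + 4?"))  # 6
```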
>>> The system needs to understand symbols "2", "+", "4", "=" separately.
>>Yes, but in a limited way.
>>Concept "2" may have relations with "1 + 1", "0 + 2", and "1 * 2".
>>"=" may be associated with internal math calculator. And with "2*2 = 4".
> Crazy ;-)..
Sorry, but that's how our minds work.
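The relations above can be sketched as a tiny concept graph (a hypothetical illustration of the idea, not the actual design):

```python
# Hypothetical concept graph: each symbol is linked to related
# expressions; "=" is additionally linked to an internal calculator.
concepts = {
    "2": {"1 + 1", "0 + 2", "1 * 2"},
    "=": {"2*2 = 4", "<internal math calculator>"},
}

def related(symbol):
    return concepts.get(symbol, set())

print(sorted(related("2")))  # ['0 + 2', '1 * 2', '1 + 1']
```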
> Here you go. Do not waste time with lots of coding. Google is your AI.
> The problem is that you would need a lot more magic than some synonyms
> from webster to turn it into a clever AI.
I cannot update the weights of Google's links.
That's why I cannot just play with Google.