In 1986 Griffiths and Palissier published their brief book Algorithmic Methods for Artificial Intelligence. In my 2011 paper* I decomposed thought (and, specifically, A.s.a. H.) into remembering, generalization, comparison, explanation, deduction, organization, induction, classification, concept formation, image manipulation, feature detection, analogy, compression, simulation, and value assessment. In order to implement these processes on a computer, each of them was, in turn, decomposed into: sorting, searching, vector averaging, vector differencing, vector dot product, sensitivity analysis, renormalization, interpolation, extrapolation, concatenation, time warping, and image manipulations.** In mid-2023 Stefan Edelkamp is expected to release his book Algorithmic Intelligence: Towards an Algorithmic Foundation for Artificial Intelligence. I am eager to see what he has to say. A Google search turns up a number of sites listing typical AI algorithms as classifiers, clustering, and regression/forecasting. Some also add deduction, dimension reduction, and neural networks to the list.
* Trans. Kansas Acad. Sci., vol. 114, no. 1-2, p. 162, 2011.
** There's no claim that these decompositions are unique.
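A few of the vector primitives named above can be sketched in ordinary Python. This is only an illustration of what such operations look like, under my own choice of names and conventions; it is not code from A.s.a. H. itself.

```python
# Illustrative sketches of some primitive operations listed above:
# vector averaging, vector differencing, dot product, and interpolation.
# Function names and signatures are hypothetical, not taken from Asa H.

def vector_average(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(components) / n for components in zip(*vectors)]

def vector_difference(a, b):
    """Component-wise difference a - b."""
    return [x - y for x, y in zip(a, b)]

def dot_product(a, b):
    """Sum of component-wise products; one simple similarity measure."""
    return sum(x * y for x, y in zip(a, b))

def interpolate(a, b, t):
    """Linear interpolation between vectors a and b for 0 <= t <= 1."""
    return [x + t * (y - x) for x, y in zip(a, b)]
```

Operations like generalization (averaging case vectors) and comparison (dot products between them) can then be built on top of primitives of this sort.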