Thursday, August 28, 2014
Distraction and focus of attention again
Each layer of the Asa H hierarchy passes a vector up to the next layer. Perhaps focus of attention could be obtained in the following way: compute the mean and standard deviation of the vector components (assuming all components are positive), keep only those components which lie "a couple" of standard deviations above the mean, delete all the other components, and renormalize the vector. Report a zero vector if no components survive this test. (What number should "a couple" really be? Should it vary?) I plan to try this on Asa H.
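The filter described above can be sketched as follows. This is only a minimal illustration, not Asa H's actual code; the function name, the NumPy dependency, and the default threshold multiplier `k` are my assumptions.

```python
import numpy as np

def focus_filter(v, k=2.0):
    """Attention filter: keep only components at least k standard
    deviations above the mean, zero the rest, and renormalize.
    Returns a zero vector if no component survives."""
    v = np.asarray(v, dtype=float)      # assumes all components are positive
    threshold = v.mean() + k * v.std()
    mask = v >= threshold
    if not mask.any():
        return np.zeros_like(v)          # report a zero vector
    out = np.where(mask, v, 0.0)
    return out / np.linalg.norm(out)     # renormalize the survivors
```

For example, `focus_filter([1, 1, 1, 10], k=1.0)` keeps only the fourth component and returns `[0, 0, 0, 1]`, while with `k=2.0` nothing in that small vector clears the threshold and the zero vector is reported. This already hints at the open question in the post: the right multiplier depends on the vector's length and spread.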