I have given my AI Asa H a kind of minimalist set of concepts based (mostly) on the Toki Pona artificial language:
"need" or "want" is defined by low robot battery and need to recharge
"away", "long" (distance), or "far" is defined by a Lego NXT ultrasonic sensor reading which is approaching 255
"near" is defined by a small input reading from an ultrasonic sensor
"strength" or "force" or "push" is defined by input from HiTechnic force sensors
"white" is defined by an input reading from a HiTechnic color sensor approaching 17
"grasp" is defined as a gripper servo closing and feeling an object with force or touch sensors
"drop" is defined as opening a gripper that had been grasping an object
"strike", or "hit" is defined by inputs from force and Lego NXT contact sensors
"home" is defined by a robot's docking station and recharger
"say" is defined by robot sound or other signal transmission
any "time" is defined by reference to the computer's clock (or an external time reference)
"move" is defined by input from encoders in a robot's drive servos and by any HiTechnic motion sensors
"taste" is defined by inputs from Vernier pH and salinity sensors
"light" or "bright" is defined by input from a Lego NXT light sensor
"knowledge" is defined by input (and output) of a computer file/case-base
"front", "back", "left", "right", "side", "top"/"on", "bottom", "body", "head", "hand", "arm", etc. are all defined by force or touch sensors on those various sides/parts of a Lego NXT robot.
Any "location" is defined by input from a Dexter industries gps module
"hot", "cold", and "temperature" are defined by input from a Vernier temperature sensor
"end" and "stop" are learned as the cessation of some servo actions
"black" and "dark" are defined by a low input from a HiTechnic color sensor or light sensor
"work" or "active" is defined by motor activity continuing over time
"eye" is defined by the inputs from a webcam
"leg" or "foot" are defined by signals to and from the appropriate servos
"word" or "name" are defined by the set of categories and names learned for them
"path" or "road" can be defined by a line following system
"food" can be defined by the measured amount of energy stored in the robot's batteries
"eat" is defined by sensing battery recharging
"earth" or "ground" or "floor" can be defined by setting down force or contact sensors
"wall" is defined by lateral touch or force sensor contacts and gps readings
"see" is defined by inputs from a webcam, light, or color sensors
"red", "green", and "blue" (or other colors) are given as inputs from a HiTechnic color sensor
"hear" and "sound" are defined by inputs from a Lego NXT sound sensor
"color" is defined by an input from the color sensor which is neither too high nor too low
"wind" and "air" or "fluid" are defined by the input from a Vernier anemometer
"wait" or "stay" is defined by prolonged lack of servo operation and fixed gps reading
"bump" or "acceleration" is defined by input from a HiTechnic acceleration sensor
"rotation" is defined by input from a HiTechnic gyro sensor
"north", "south", "east", "west", and "magnetism" are defined by input from HiTechnic compasses and magnetic field sensors
"turn" is defined by input from the gyro sensor, compass, and servos
"fast" and "slow" are defined by the level of inputs from various servos
"hunger" is defined by a low battery charge measurement
"pain" and "breakage" are defined by input from fine damage detecting (breakage) wires
"mouth" can be defined by the robot's battery charging contacts
"piece" can be defined by the components of a robot ("body", "arm", "gripper"/"hand", etc.)
for a virus AI like Ava 1.0 "reproduction" can be defined by disk or file copying
"parent" can be defined as the source copy when file copying occurs
"child" can be defined as the file copy
"dead servo" can be defined by zero current and zero motion when the servo is commanded
We can also detect when certain sensors are "dead".
"dead robot" can be defined by seeing when all or many servos and sensors are dead and/or many "pain" signals are input
"sense" is defined by input from any of the robot's sensors
"surface" is defined as a "wall" or "floor"
"control" can be defined by the activation/use of a PID postprocessor
"age", the robot keeps track of how long it's been in operation
"young" or "new" can be defined as an "age" less than some given number
"old" can be defined as an "age" greater than some given number
"inside" can be defined by gps values falling within a certain range
Asa H can make use of a NOT or inverse (see my paper in Trans. Kansas Acad. Sci. 109 (3/4), pg 161, 2006) and then "live" can be defined as NOT "dead". (You can elect to define the inverse of just a limited number of signals.)
"room" or "container" is defined by "floor" and "walls"
"hard" is defined by strong "push" and small displacement
"soft" is defined by small "push" and larger displacement
"take" is learned as a sequence "grasp"-"lift"-"move"
"tool" is learned as sequences like "push"-"object"-"push"-X
"collision" is learned as a sequence "near"-"strike"-"accelerate"
"damage" is learned by a sensor pegging and/or in terms of breakage detectors ("pain")
level of "health" is learned as a combination of level of "damage" and "hunger"
"leave" is defined by a "near" proximity measurement followed by a "far" measurement
"approach" or "arrive" is defined by a "far" proximity measurement followed by a "near" measurement
"feel" is defined by input from any force or touch sensor
"good" and "bad" are defined by the degree of activation of the "health" concept.
Neural network preprocessors have been trained to identify various "letters", "numbers", "shapes", common objects etc. "similar" and "different" can be defined by the degree of match reported by such preprocessors.
Algorithms are available to detect "faces" and "people" in images and to count them.
"group" or "many" or "large" can be defined as when a count exceeds some given number.
"lone" is a single person or face.
In some cases we would like to have additional definitions for a given concept.
Looking back on this work I think that up until now I may have underestimated the value of embodiment.
As they are listed above, the English words for each of these concepts can be learned by association with each of the relevant input hardware signals seen by the robot. We can then hope to converse with Asa in this elementary robot language. This is an extension of the simple communications reported in chapter 1 of my book Twelve Papers (see my website www.robert-w-jones.com under Book).