A real (Lego NXT) or simulated robot (running Asa H 2.0) can learn sensor-action patterns like "collision":
at time step 1, the robot is moving and senses an object far away
at time step 2, the robot is moving and senses an object nearby
at time step 3, the robot senses contact
If an observer inputs the word "collision" at the same time, Asa H associates this sign with the concept it has learned (see my blog of 6 March 2013 and chapter 1 of my book, Twelve Papers, at www.robert-w-jones.com). Asa H has much of the same functionality that Meystel prescribes in his book, Semiotic Modeling and Situation Analysis (AdRem, 1995).
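The idea above can be illustrated with a minimal sketch. This is not Asa H's actual code; the data structures and names (CaseMemory, feature sets such as "moving" and "contact") are illustrative assumptions. It shows a temporal sensor-action sequence being stored as a case and an observer-supplied word being attached to it as a sign, so that the sign can later be recalled when the pattern recurs.

```python
# Illustrative sketch only -- not Asa H's implementation.
# Each time step is represented as a set of active sensor/action features.
episode = [
    {"moving", "object_far"},    # time step 1: moving, object far away
    {"moving", "object_near"},   # time step 2: moving, object nearby
    {"contact"},                 # time step 3: contact sensed
]

class CaseMemory:
    """Stores learned sequences (cases), each optionally paired with a sign."""
    def __init__(self):
        self.cases = []  # list of (sequence, sign) pairs

    def learn(self, sequence, sign=None):
        # An observer's word input at learning time becomes the sign
        # associated with the learned concept.
        self.cases.append((list(sequence), sign))

    def recall_sign(self, sequence):
        """Return the sign of the first stored case matching the sequence."""
        for stored, sign in self.cases:
            if stored == list(sequence):
                return sign
        return None

memory = CaseMemory()
# The observer inputs "collision" while the pattern occurs,
# so the word is stored alongside the learned sequence.
memory.learn(episode, sign="collision")

print(memory.recall_sign(episode))  # prints: collision
```

A real system would of course match sequences approximately rather than exactly; this sketch only shows the association of a sign with a learned temporal pattern.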