Each level in the Asa H hierarchy learns a set of cases and the components/features that make up those cases. Each case is a vector and the features are the vector's components. In my blog of 7 Oct. 2015 I noted that we can use things like statistical measures of independence to prune features. We can also use standard statistical measures to determine the relative importance of each feature in identifying the cases in which it occurs. Such statistical measures can then be used as weights in the (dot product or other) similarity measure that Asa uses when it compares cases.
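A minimal sketch of this idea, using NumPy. The choice of importance measure here (per-feature standard deviation across the stored cases, normalized to sum to one) is just one hypothetical option; any statistical measure of feature importance could be substituted. The weights then scale each feature's contribution inside a cosine (normalized dot product) similarity:

```python
import numpy as np

def feature_weights(cases):
    """Assign each feature a weight proportional to its spread across
    the stored cases (a hypothetical importance measure); a feature
    that is constant across all cases gets weight zero."""
    cases = np.asarray(cases, dtype=float)
    w = cases.std(axis=0)
    total = w.sum()
    if total == 0.0:
        return np.full(cases.shape[1], 1.0 / cases.shape[1])
    return w / total

def weighted_similarity(a, b, w):
    """Cosine similarity with each feature scaled by its weight."""
    a, b, w = (np.asarray(x, dtype=float) for x in (a, b, w))
    num = np.sum(w * a * b)
    denom = np.sqrt(np.sum(w * a * a)) * np.sqrt(np.sum(w * b * b))
    return num / denom if denom > 0.0 else 0.0

# Example: the third feature is identical in every case, so it
# receives weight zero and disagreement there is ignored.
cases = [[1, 0, 2], [0, 1, 2], [1, 1, 2]]
w = feature_weights(cases)
print(weighted_similarity([1, 0, 2], [1, 0, 5], w))  # → 1.0
```

With uniform weights this reduces to ordinary cosine similarity, so the weighting is a strict generalization of the dot-product measure Asa already uses.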
(In other case-based reasoning systems it is not especially common to see the dot product used as the similarity measure. I have tried other similarity measures in Asa H, but coming out of a physics background I have probably shown a bias toward the dot product. It is worth noting that Jannach et al., in Recommender Systems (Cambridge Univ. Press, 2011), p. 19, say "In item-based recommendation approaches cosine similarity is established as the standard metric as it has been shown that it produces the most accurate results.")