I have addressed the question of how deep our networks should be before (see my blog post of 14 March 2014, for example), whether for Asa H, artificial neural networks, semantic networks, or whatever. In December of last year Microsoft (Peter Lee) reported work employing an artificial neural network of 152 layers. This seems excessive.
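For a sense of what 152 layers means, here is a minimal sketch in Python of a plain feedforward stack parameterized by depth. The width of 64, the ReLU activations, and the scaled random initialization are my own illustrative assumptions, not Microsoft's actual architecture:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def make_stack(depth, width, rng):
    # One weight matrix per layer; "depth" is the number of stacked layers.
    return [rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
            for _ in range(depth)]

def forward(layers, x):
    # Each layer is one more transformation the signal must pass through.
    for w in layers:
        x = relu(w @ x)
    return x

rng = np.random.default_rng(0)
layers = make_stack(depth=152, width=64, rng=rng)  # 152 layers, as reported
y = forward(layers, rng.standard_normal(64))
print(len(layers), y.shape)
```

Even in this toy, every added layer is another full matrix of weights to train, so depth is not free.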
Features should be defined so as to "carve nature at its joints" (as in Plato's Phaedrus). Admittedly, Asa H adds a few layers in the form of the various preprocessors it makes use of.
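As a purely hypothetical sketch (the stage names and operations here are my own illustrations, not Asa H's actual code), one can think of preprocessors as extra pipeline stages ahead of the core hierarchy, each one in effect another layer:

```python
# Illustrative preprocessors: each stage transforms the signal before
# it reaches the main hierarchy, effectively adding depth.
def normalize(signal):
    m = max((abs(v) for v in signal), default=1.0) or 1.0
    return [v / m for v in signal]

def difference(signal):
    # First differences: emphasize change between successive values.
    return [b - a for a, b in zip(signal, signal[1:])]

PREPROCESSORS = [normalize, difference]

def preprocess(signal):
    for stage in PREPROCESSORS:
        signal = stage(signal)
    return signal

print(preprocess([1.0, 3.0, 2.0]))
```

On this view the extra layers are justified only when each stage exposes a genuinely useful feature, which is the "joints" criterion again.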