With a scalar utility, an AI like Marcus Hutter's AIXI might evolve toward lower intelligence if it resides in a sufficiently simple and benign environment.* Indeed, it's the harsh world we find ourselves in that has forced life to develop minds. A society of A.s.a. H. agents with their vector utilities (at least one component of which values mental prowess of some sort) will always seek to produce some offspring with increased mentality.**
* Even if only temporarily.
** I include the acquisition of knowledge as part of this.
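The contrast can be sketched in a few lines of Python. This is purely an illustration under assumed numbers, not A.s.a. H.'s actual utility code: a scalar utility in which intelligence carries a maintenance cost can score a duller agent higher in a benign environment, while a vector utility that reserves one component for mentality still ranks the smarter offspring higher on that component.

```python
# Hypothetical illustration only -- not the actual A.s.a. H. implementation.

def scalar_utility(agent, difficulty):
    # In a benign (low-difficulty) environment, mentality costs more than
    # it earns, so the scalar total falls as mentality rises.
    return agent["resources"] - agent["mentality"] * (1.0 - difficulty)

def vector_utility(agent, difficulty):
    # Vector utility: (survival component, mentality component).
    # The mentality component values mental prowess regardless of environment.
    survival = agent["resources"] - agent["mentality"] * (1.0 - difficulty)
    return (survival, agent["mentality"])

dull  = {"resources": 10.0, "mentality": 1.0}
smart = {"resources": 10.0, "mentality": 5.0}
benign = 0.1  # assumed: a very simple, safe environment

# Under the scalar utility, the dull agent scores higher in a benign world:
assert scalar_utility(dull, benign) > scalar_utility(smart, benign)

# But selection on the mentality component of the vector utility
# still prefers the smarter offspring:
assert vector_utility(smart, benign)[1] > vector_utility(dull, benign)[1]
```

The point of the sketch: maximizing the scalar total alone lets mentality drift downward when the environment is easy, whereas a vector utility keeps a dimension along which greater mentality is always preferred.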