There are various kinds of intelligence and various ways in which you can become more intelligent. I suppose you could have more and faster memory, more knowledge, more and faster processors, more and better algorithms, etc., without making any improvement to your value system. But I think the human value system is unsound* and must be improved too. I doubt that we can survive as a species if this does not happen.** So I don't like the idea of making AIs that share current human values and are slaves to them. I'm OK with humans going extinct so long as we don't take AIs with us.
* Being composed mostly of a set of simple drives and aversions.
** i.e., the Fermi paradox: if species with value systems like ours reliably survived this stage, we would expect to have seen evidence of some of them by now.