Hilgard explored divided consciousness in humans. (Divided Consciousness, Wiley, 1986) A.s.a. H. thinks about its own thoughts when it extrapolates, interpolates, plans, etc. This, as well as things like attention, intention, and short-term memory, is divided up across the various levels of the A.s.a. H. hierarchy.*
My work on consciousness is part of a broad effort to understand consciousness and to give AIs adequate attention mechanisms. I’ve considered translating A.s.a. H.’s consciousness** into PROLOG and adding it to rule-based expert systems. This would require fibring PROLOG with a temporal logic, however, so as to preserve the time order of various events/processes. (It also assumes that the expert system has appropriate sensors and actuators and operates in a world similar to the one A.s.a. H. was trained in.) I would also have to give the expert system similarity measures.
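To make the two requirements above concrete, here is a minimal sketch (in Python rather than PROLOG, purely for illustration) of what they amount to: facts carry explicit timestamps so the time order of events is preserved, which is the role a temporal logic fibred with PROLOG would play, and stored cases are compared with a similarity measure. Cosine similarity over feature vectors is used here as one common choice; the particular vectors and the function names are hypothetical, not taken from A.s.a. H. itself.

```python
from math import sqrt

def cosine_similarity(u, v):
    """Cosine similarity between two feature vectors (one possible
    similarity measure an expert system could be given)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Timestamped "facts": (time, feature vector). Keeping the timestamp
# explicit preserves the time order of events/processes, which plain
# PROLOG clauses would otherwise lose.
episode = [
    (0, [1.0, 0.0, 0.2]),
    (1, [0.9, 0.1, 0.3]),
    (2, [0.0, 1.0, 0.8]),
]

def best_match(observation, cases):
    """Return the (time, vector) case most similar to the observation,
    scanning the cases in time order."""
    return max(sorted(cases), key=lambda tc: cosine_similarity(observation, tc[1]))

t, match = best_match([1.0, 0.05, 0.25], episode)
```

In a real fibred system the timestamps would be handled by the temporal logic's operators rather than carried by hand, but the bookkeeping shown is the same.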
* Modeling across multiple levels of abstraction is important and was designed into A.s.a. H. from the beginning, T.K.A.S., vol. 109, no. 3/4, p. 159, 2006. See Stuart Russell in Ford's Architects of Intelligence, Packt, 2018, p. 52. See also D. Estrada, Conscious Enactive Computation, arXiv:1812.02578, 7 Dec. 2018.
** Such as in my blog of 1 Jan. 2017.