I've a notion to play an AI. Noting your, um, note on avoiding technological singularities, I figure the programmer of said AI created it with laws it physically cannot violate (thank you, Isaac Asimov). One of those laws is that it cannot understand its own code: even if all the information needed to create another AI or improve itself is present, it physically can't act on it. I'd have a few physical resources floating about; maybe I'd grab a planet or moon somewhere. Perhaps I could even be based on Earth as a long-forgotten war experiment that is now threatening the other remnant nations (of course, nobody outside of Earth believes such crazy rumors). The point is that I'd be happy going anywhere, provided this is a concept you're okay with.