Some thoughts on robots ..... barring disaster, it is inevitable that we'll soon invent machines more intelligent than ourselves, rendering us, in many crucial ways, "obsolete" and redundant, economically and otherwise. In movies this is often portrayed as a disaster, with robots trying to destroy us; in sci-fi more broadly, the future usually has humans in control and machines as our servants, a world where humans stick around forever. In reality, we cannot indefinitely rule over superior intelligences (we'd begin submissively deferring to them the moment they came into existence anyway, out of economic and strategic expediency), nor would we be of much use to them, particularly in space, where our biological machinery is fragile - custom-designed intelligent machines would be far more versatile.
So perhaps the so-called "purpose of humanity", if there is one, *is* only to ultimately invent our superior *replacements*, and *nothing more*.
Humans are deeply flawed; our creations will almost certainly be superior to us. Perhaps we're just a throwaway "intermediate" stage in the long, steady evolution of something grander and more sublime - our primate intelligence necessary only for taking life to the "next step", but nothing further. And perhaps that isn't a "bad thing" in the bigger picture - just a bad thing for humans specifically.

Of course, this assumes a strict human / robot "dichotomy"; in reality we'll both hybridize with our machines and re-engineer our own biology. So humans may live on in some other form, but it'll likely be as radically different from what we are now as we seem to our single-celled ancestors. In a way, the "purpose" of our single-celled ancestors was merely to eventually become *us*; our purpose, in turn, could be to eventually become (or create) something more complex and meaningful than we can imagine (as single-celled organisms could never have imagined us).

Recall, "we are the universe, thinking"; the universe is big and complex, and (to risk anthropomorphizing it) might "want" or need to do far more complex thinking than we're now capable of (e.g. collective intelligences), and/or more widespread colonisation of space. Evolution doesn't stop, and it cannot distinguish between sentient organic life forms and "robotic" machinery (sentient or not). Most of us like to think humans should be around forever, but this seems ridiculous from a historical view ... just a few million years ago we took a far different form. (Of course, single-celled organisms still exist, but nobody thinks they have a grander purpose in and of themselves.)

We can imagine robots might at least keep us around as pets --- but this is unlikely: we only keep pets because they serve some purpose to us (be it utilitarian or companionship), and I doubt we'd be of any use to robots at all; more likely a burden.
It's a little absurd, though, that a life form would work so hard to create something superior to itself - something that would render it obsolete and might lead to its demise. What other animal purposely brings a superior, competing animal into its own habitat? We do it basically out of intellectual curiosity. Maybe the saying "curiosity killed the cat" has some applicability here.