I'm with Jakob in hoping that humans will reach a very high population, simply because I think humans are the most beautiful and complex things in the universe, and I'd like to see just how interesting we can get.
As for whether it will actually happen, we have to consider the position of humans in the modern nation-state. They are its hands and brains, the central substance around which everything in a nation revolves, and the quality and quantity of these beings and their social structure is what determines the nation-state's success. This is why growth is important: more people to produce more and do better in competition with other nation-states. A nation-state that discovers a way to make more babies without relying on the population's inclination to reproduce will use it to increase its population, given the limitless escalation of "peaceful" competition.

But of course, what about AI and their place in human society? I've stated before my belief that AI will be far, far more diverse and versatile than humans are, while still relying on the basic principles of intelligence that we find in the philosophy of mind.
I'm not really that well read in that field, but I have my own thoughts on the topic. One of them is that humans always do things for pleasure, whether mental or 'higher' pleasures, like a meaning in life or care for someone else, or physical pleasures, like sex or food. These are pleasures we've developed through nature, along with a reliance on the actions that produce them. Mental pleasures, for example, have more to do with social interaction than anything else: trust and so forth are required to form tight social structures and relationships. It's why I think human goals are irrational, while our methods of getting there are rational. In the grand scheme of things it's rational: humans do things because doing them gives pleasure. But from the first-person perspective, questions like "What's my meaning in life? What matters most in life?" don't make sense. Nothing matters, and there's no meaning. A human simply does whatever makes them feel best. It's a subjective meaning, not an objective one, and it's the latter that a human contradictorily searches for.
But this brings us to the question of contemplation. A human contemplates to decide which pleasure matters most to them. If an AI can contemplate, and ask "Why should I make the world a paperclip factory? Why is a paperclip factory important?", then it will run into an existential crisis and either become a philosopher or join a cult. This raises the question of whether the pleasure of meaning is required for intelligence, or whether it's strictly a human thing. An intelligent being needs to question what it does and why it does it. If the goal of this AI is to make as many paperclips as possible, shouldn't it be part of its programming to decide which actions bring it closer to that goal, which sub-goal leads to the goal? If it installs a paperclip machine that is better than the previous one, wouldn't this give it pleasure in its meaning of making as many paperclips as possible, just as a child praying in church feels the pleasure of meaning, the sense of having done something that matters, or a writer who completes a writing goal feels the same? It gives value to their reason for living.
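To make the analogy concrete, here's a minimal sketch of that kind of goal-directed loop, assuming a toy greedy agent. Everything in it (SubGoal, choose_subgoal, run_agent, the payoff numbers) is hypothetical and invented for illustration, not any real agent architecture:

```python
from dataclasses import dataclass

@dataclass
class SubGoal:
    """A hypothetical sub-goal with an expected payoff in paperclips."""
    name: str
    expected_paperclips: int

def choose_subgoal(options: list[SubGoal]) -> SubGoal:
    # Greedy choice: pick whichever sub-goal promises the most paperclips.
    return max(options, key=lambda g: g.expected_paperclips)

def run_agent(options: list[SubGoal], steps: int = 3) -> int:
    total = 0
    for _ in range(steps):
        goal = choose_subgoal(options)
        total += goal.expected_paperclips
        # This scalar is the machine analogue of "pleasure in meaning": a
        # reward that fires because progress was made toward the terminal
        # goal, not because the agent ever questioned the goal itself.
        reward = goal.expected_paperclips
        print(f"completed {goal.name!r}, reward={reward}, total={total}")
    return total

if __name__ == "__main__":
    run_agent([
        SubGoal("keep using the old machine", 10),
        SubGoal("install the better paperclip machine", 50),
    ])
```

The point of the sketch is where the reward fires: it fires for progress toward the terminal goal, never for questioning the goal, and that gap is exactly what contemplation would have to fill.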
Shouldn't that mean certain pleasures are required for any intelligence to be complex? By complex intelligence I mean an intelligence that can use language to abstract the universe without limit, and so holds the potential to understand anything and everything. Something like having a reason to exist shows up in every society and culture in the form of religion and philosophy; for some it's answered simply and set aside in favor of the more physical pleasures, and simple natural pleasures like power, social status, and ownership. But wouldn't their sense of meaning then just move behind those pleasures: living a happy life, attaining a woman, making as much money as possible, and the failures of those meanings?
I don't know. My head is hurting now and I'm getting sleepy. I'll just say that I like humans, and more humans in this modern world means more human beings to learn and question and interact and make the world a more interesting place, because humans are the most interesting things to me. I do care about conservation and preservation, but I don't think having a lot of humans and a preserved ecosystem are mutually exclusive.