Wednesday, July 26, 2017

Is there a downside to a significant overestimation in a reserve()?

Let's suppose we have a method that creates and uses possibly very big vector<foo>s. The maximum number of elements is known to be maxElems.

Standard practice as of C++11 is, to the best of my knowledge:

vector<foo> fooVec;
fooVec.reserve(maxElems);
//... fill fooVec using emplace_back() / push_back()

But what happens in a scenario where the actual number of elements is significantly smaller than maxElems in the majority of calls to our method?

Is there any disadvantage to the conservative reserve call other than the excess allocated memory (which supposedly can be freed with shrink_to_fit() if necessary)?
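For illustration, here is a minimal sketch of the pattern I have in mind (foo, maxElems and actualElems are hypothetical placeholders chosen only for the example): the capacity stays at the reserved value until shrink_to_fit() is called, and even then the shrink is only a non-binding request.

struct foo { int value; };  // hypothetical element type, for illustration only

#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    const std::size_t maxElems = 1000000;  // hypothetical worst-case bound
    const std::size_t actualElems = 1000;  // a typical call uses far fewer

    std::vector<foo> fooVec;
    fooVec.reserve(maxElems);              // conservative, one-shot reservation

    for (std::size_t i = 0; i < actualElems; ++i)
        fooVec.push_back(foo{static_cast<int>(i)});

    std::cout << "capacity after fill: " << fooVec.capacity() << '\n';

    fooVec.shrink_to_fit();                // non-binding request to release the excess
    std::cout << "capacity after shrink_to_fit: " << fooVec.capacity() << '\n';
}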
