I am running a loop in C++ (using containers from the Standard Template Library) and printing the time elapsed since the beginning of the loop each time another 10% of the data has been processed.
I see the following behavior:
In row 1 you see the cumulative time from the beginning of the loop to each 10% mark.
In row 2 you see the difference between adjacent cells in row 1, which gives the time it took to process just that extra 10%.
In row 3 you see the ratio of adjacent cells in row 2, i.e. the factor by which processing each successive 10% of the loop slows down.
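For reference, the timing harness is essentially this sketch (the per-item work is elided, and `data` stands in for my real input):

```cpp
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    std::vector<int> data(1000000);            // placeholder for the real data
    std::vector<double> row1;                  // cumulative seconds at each 10%

    auto start = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < data.size(); ++i) {
        // ... per-item work on data[i] goes here ...
        if ((i + 1) % (data.size() / 10) == 0) {
            std::chrono::duration<double> t =
                std::chrono::steady_clock::now() - start;
            row1.push_back(t.count());
        }
    }

    // Row 2: time spent on each individual 10%;
    // row 3: ratio of adjacent row-2 cells.
    std::vector<double> row2;
    for (std::size_t i = 0; i < row1.size(); ++i)
        row2.push_back(i == 0 ? row1[0] : row1[i] - row1[i - 1]);
    for (std::size_t i = 1; i < row2.size(); ++i)
        std::printf("decile %zu: row2=%.3fs row3=%.3f\n",
                    i + 1, row2[i], row2[i] / row2[i - 1]);
}
```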
Given my knowledge of the code, I was expecting row 3 to be close to 1.0.
Oddly enough, there is an exponential increase from one 10% to the next: at the beginning the factor is even worse, and later it stabilizes, but the time is still growing exponentially with a factor of ~1.4.
I do manage a lot of data structures (mostly unordered_map, unordered_set, etc., all from the standard C++ library), and I am wondering what the best way is to identify why this happens, perhaps with valgrind or otherwise. Is there anything to be wary of when using such containers? The only thing I can think of at the moment is that I have a map whose values are themselves maps, and I am wondering whether the whole inner map gets copied each time I look one up by key. It looks like this: I define an object of type unordered_map<T, unordered_map<T, my_set> >,
where I have typedef unordered_set<some_obj<T> > my_set;.
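To make that concern concrete, here is a stripped-down sketch (T simplified to std::string and some_obj<T> to int so it compiles on its own) of the two access patterns, one of which copies and one of which does not:

```cpp
#include <string>
#include <unordered_map>
#include <unordered_set>

// Simplified stand-ins for the question's types (T -> std::string,
// some_obj<T> -> int) so the example is self-contained.
typedef std::unordered_set<int> my_set;
typedef std::unordered_map<std::string,
                           std::unordered_map<std::string, my_set>> outer_map;

int main() {
    outer_map m;
    m["a"]["b"].insert(1);

    auto  inner_copy = m["a"];   // COPIES the entire inner map (and its sets)
    auto& inner_ref  = m["a"];   // binds a reference: no copy at all

    // find() also yields an iterator to the stored element, not a copy:
    auto it = m.find("a");
    if (it != m.end()) it->second["b"].insert(2);  // mutates in place

    (void)inner_copy; (void)inner_ref;
}
```

So find() and operator[] hand back references to the stored inner map; a deep copy only happens if the result is assigned to a plain (non-reference) variable.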
Also, I do not want to wait that long again (the numbers in the first row are in seconds!). Is there a quicker way to test this, even with valgrind? I just need to identify a (set of) function(s) whose cost grows like this.
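For example, I could imagine a harness like this sketch (run_loop is a placeholder for my real loop, here given a deliberately quadratic dummy body) that times geometrically growing prefixes of the input instead of the whole thing:

```cpp
#include <chrono>
#include <cstddef>
#include <cstdio>

// Placeholder standing in for the real loop, run over the first n items.
// This dummy body is deliberately O(n^2) so the harness has something to catch.
void run_loop(std::size_t n) {
    volatile std::size_t sink = 0;
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = 0; j < i; ++j)
            sink = sink + j;
}

double time_run(std::size_t n) {
    auto t0 = std::chrono::steady_clock::now();
    run_loop(n);
    std::chrono::duration<double> dt = std::chrono::steady_clock::now() - t0;
    return dt.count();
}

int main() {
    std::size_t n = 1 << 12;
    double prev = time_run(n);
    for (int i = 0; i < 4; ++i) {
        n *= 2;
        double cur = time_run(n);
        // Linear work would give a ratio near 2; quadratic work, near 4.
        std::printf("n=%zu time=%.4fs ratio=%.2f\n", n, cur, cur / prev);
        prev = cur;
    }
}
```

If the ratios already climb well above 2 at these small sizes, the problem reproduces quickly, and a tool like valgrind's callgrind could then be run on the reduced input to attribute the cost to specific functions.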