If I know I'm going to insert a large amount of data (about one million entries) into a std::unordered_map, is there anything I can do in advance to boost performance? (Similar to how std::vector::reserve can reserve enough memory to avoid reallocation when the approximate size of the data is known before a bulk insert.)
More specifically, the key in the hash map is a coordinate on the 2D plane with a custom hash function, as shown below:
using CellIndex = std::pair<int32_t, int32_t>;
struct IdxHash {
    std::size_t operator()(const std::pair<int32_t, int32_t> &idx) const {
        return ((size_t)idx.second << 31) ^ idx.first;
    }
};
std::unordered_map<CellIndex, double, IdxHash> my_map;
// bulk insert into my_map
...