• Why is there an iterator-range version of std::unordered_map::erase()?

    From Juha Nieminen@21:1/5 to All on Fri Nov 12 14:32:48 2021
    I recently noticed that std::unordered_map::erase() has a version that
    takes an iterator range.

    Why?!

    It makes no sense to me. What exactly can be the purpose of that?
    The order in which the elements are stored in the unordered_map is,
    as far as I know, unspecified. For all intents and purposes it could
    well be random (always the same random permutation for the same
    element type and hash, but random nevertheless).

    This means that if you erase a range, you'll be essentially
    erasing a bunch of elements pretty much at random. What can
    possibly be the purpose of this? Why does this function exist
    at all?
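
    For reference, the overload in question is the two-iterator form,
    iterator erase(const_iterator first, const_iterator last). A minimal,
    self-contained call looks like this (values chosen arbitrarily):

        #include <unordered_map>

        int main()
        {
            std::unordered_map<int, int> m{{1, 10}, {2, 20}, {3, 30}};

            // Range overload: erasing [begin(), end()) removes every
            // element, equivalent in effect to m.clear().
            m.erase(m.begin(), m.end());
        }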

  • From Marcel Mueller@21:1/5 to All on Fri Nov 12 22:09:26 2021
    On 12.11.21 at 15:32, Juha Nieminen wrote:
    > I recently noticed that std::unordered_map::erase() has a version that
    > takes an iterator range.
    >
    > Why?!
    >
    > It makes no sense to me. What exactly can be the purpose of that?
    > The order in which the elements are stored in the unordered_map is,
    > as far as I know, unspecified.

    The order is unspecified, but it is preserved as long as you do not
    modify the collection. Note that unordered_map provides forward
    iterators. So if you erase a range that you determined beforehand,
    the behavior is well defined.
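
    A minimal sketch of that pattern: the iterators are obtained first,
    the map is not touched in between, and the examined range is then
    erased in one call (the keys and values are just for illustration):

        #include <iostream>
        #include <iterator>
        #include <string>
        #include <unordered_map>

        int main()
        {
            std::unordered_map<int, std::string> m{
                {1, "a"}, {2, "b"}, {3, "c"}, {4, "d"}, {5, "e"}};

            // Walk the container and remember a sub-range of its
            // (unspecified but stable) iteration order. Forward
            // iterators only, so std::next rather than arithmetic.
            auto first = std::next(m.begin());  // 2nd element in iteration order
            auto last  = std::next(first, 2);   // two elements further on

            // Nothing modified the map in between, so [first, last)
            // is still valid and erasing it is well defined.
            m.erase(first, last);

            for (const auto& [k, v] : m)
                std::cout << k << " -> " << v << '\n';
        }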

    And unless the documentation at https://en.cppreference.com/w/cpp/container/unordered_map/erase_if
    is wrong, the element sequence does not change when you erase elements
    from the collection. Otherwise the equivalent loop code shown there
    would have undefined behavior.
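
    The loop that page shows as a possible implementation is approximately
    this (reproduced from the cited page, not necessarily verbatim):

        #include <unordered_map>

        template<class Key, class T, class Hash, class KeyEqual,
                 class Alloc, class Pred>
        typename std::unordered_map<Key, T, Hash, KeyEqual, Alloc>::size_type
        erase_if(std::unordered_map<Key, T, Hash, KeyEqual, Alloc>& c, Pred pred)
        {
            auto old_size = c.size();
            for (auto first = c.begin(), last = c.end(); first != last;)
            {
                if (pred(*first))
                    first = c.erase(first);  // erase() returns the next iterator
                else
                    ++first;
            }
            return old_size - c.size();
        }

    This only works because erase() invalidates nothing but the erased
    element and returns the iterator following it.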

    The magic seems to be that an unordered_map never shrinks unless you
    call rehash(). So any erase() implementation will always preserve the
    element sequence. This guarantee (that erase preserves the relative
    order of the elements that are not erased) was added in C++14, but it
    likely held in practice before, for the above reason.
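
    A quick way to observe that guarantee: the iteration order itself is
    unspecified, but the second pass below should list the remaining keys
    in the same relative order as the first pass.

        #include <iostream>
        #include <unordered_map>

        int main()
        {
            std::unordered_map<int, char> m{
                {1, 'a'}, {2, 'b'}, {3, 'c'}, {4, 'd'}};

            // Record the iteration order before erasing.
            for (const auto& [k, v] : m)
                std::cout << k << ' ';
            std::cout << '\n';

            m.erase(2);  // single-key erase; no rehash occurs

            // Per C++14, the remaining elements keep their relative order.
            for (const auto& [k, v] : m)
                std::cout << k << ' ';
            std::cout << '\n';
        }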

    But I am confused about the worst-case complexity of O(size()).
    That makes no sense to me.

    So if you ask for a use case: it might be more efficient to remove
    adjacent elements en bloc rather than removing them one by one,
    although I do not think that makes a significant difference in any
    real existing implementation.
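
    One plausible shape of that use case, sketched: a contiguous run (in
    iteration order) of elements is delimited by a predicate and removed
    with a single range erase. The helper name erase_first_empty_run and
    the predicate are hypothetical, purely for illustration.

        #include <string>
        #include <unordered_map>

        // Hypothetical helper: erase the first contiguous run (in
        // iteration order) of elements whose mapped value is empty,
        // using one range erase instead of repeated single erases.
        void erase_first_empty_run(std::unordered_map<int, std::string>& m)
        {
            auto first = m.begin();
            while (first != m.end() && !first->second.empty())
                ++first;

            auto last = first;
            while (last != m.end() && last->second.empty())
                ++last;

            m.erase(first, last);  // no-op if the range is empty
        }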


    Marcel
