This is an implementation of a map and set for integer keys. It's an ordered container (ordered by byte order, which matches sorted order for integers and byte strings when the most significant bits are consumed first) with O(1) worst-case lookup, removal and insertion: the depth is bounded by the fixed key width, not by the number of elements. There's no rebalancing or rehashing, so it's genuinely O(1) without amortizing any costs.
The fanout can be adjusted in powers of 2 from 2-ary through 256-ary, but it's hardcoded at 16-ary because there isn't a way to expose that in the type system yet. To keep things simple, it also only allows `uint` keys for now; later I'll expand it to all the built-in integer types and byte arrays.
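To give a rough idea of the shape, here's a hypothetical sketch in modern Rust (made-up names like `Child`, `nibble` and `find`, not the actual code in this PR): a 16-ary trie consumes the key four bits at a time, so lookup depth is fixed by the key width rather than by the number of elements.

```rust
// Hypothetical sketch of a 16-ary trie over word-sized keys; names and layout
// are illustrative only, not the actual implementation.
enum Child<T> {
    Internal(Box<[Child<T>; 16]>), // branch on the next 4-bit nibble
    External(usize, T),            // suffix-compressed leaf holding the full key
    Nothing,                       // empty slot
}

// Take the idx-th nibble, most significant first, so traversal order matches
// integer order.
fn nibble(key: usize, idx: usize) -> usize {
    (key >> (usize::BITS as usize - 4 * (idx + 1))) & 0xF
}

fn find<T>(mut node: &Child<T>, key: usize) -> Option<&T> {
    let mut idx = 0;
    loop {
        match node {
            Child::Internal(children) => {
                // Descend one level: pick the child slot from the next nibble.
                node = &children[nibble(key, idx)];
                idx += 1;
            }
            Child::External(k, v) if *k == key => return Some(v),
            _ => return None,
        }
    }
}
```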
There's quite a bit of room for performance improvement, on top of the boost that will come from dropping the headers on `Owned` `~` boxes and getting rid of the overhead of the stack switches to the allocator. It currently does suffix compression for a single node and then splits into two n-ary trie nodes; that single-node suffix could be replaced with an array holding at least 4-8 suffixes before splitting. There's also the option of doing path compression, which may or may not be worthwhile depending heavily on the data stored.
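A sketch of the insert path under the same hypothetical layout shows the single-leaf suffix compression and the split into n-ary nodes described above (again illustrative, not this PR's code; it reuses the `Child` enum and `nibble` helper from the previous sketch and passes nodes by value to keep the recursion simple):

```rust
fn new_children<T>() -> Box<[Child<T>; 16]> {
    Box::new(std::array::from_fn(|_| Child::Nothing))
}

fn insert<T>(node: Child<T>, key: usize, value: T, idx: usize) -> Child<T> {
    match node {
        // Empty slot: the whole remaining key lives in one leaf (suffix compression).
        Child::Nothing => Child::External(key, value),
        // Same key: replace the value.
        Child::External(k, _) if k == key => Child::External(key, value),
        // Two distinct keys collide: split the leaf into an internal node and
        // push both keys one level down; recursion keeps splitting while their
        // leading nibbles still agree.
        Child::External(k, v) => {
            let mut children = new_children();
            children[nibble(k, idx)] = Child::External(k, v);
            let taken = std::mem::replace(&mut children[nibble(key, idx)], Child::Nothing);
            children[nibble(key, idx)] = insert(taken, key, value, idx + 1);
            Child::Internal(children)
        }
        // Already an internal node: descend into the matching slot.
        Child::Internal(mut children) => {
            let taken = std::mem::replace(&mut children[nibble(key, idx)], Child::Nothing);
            children[nibble(key, idx)] = insert(taken, key, value, idx + 1);
            Child::Internal(children)
        }
    }
}
```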
I want to share the test suite with the other maps, which is why I haven't duplicated all of the existing integer key tests in this file. I'll send another pull request to deal with that.
Current benchmark numbers against the other map types:
| Map | Keys | insert | search | remove |
|-----------|------------|----------|----------|----------|
| TreeMap   | Sequential | 0.798295 | 0.188931 | 0.435923 |
| TreeMap   | Random     | 1.557661 | 0.758325 | 1.720527 |
| LinearMap | Sequential | 0.272338 | 0.141179 | 0.190273 |
| LinearMap | Random     | 0.293588 | 0.162677 | 0.206142 |
| TrieMap   | Sequential | 0.0901   | 0.012223 | 0.084139 |
| TrieMap   | Random     | 0.392719 | 0.261632 | 0.470401 |
@graydon is using an earlier version of this for the garbage collection implementation, which is why I added it to libcore. I left out the `next` and `prev` methods *for now* because I just wanted to get the essentials in first.
My merges for #5143 missed a couple of other copies. This patch corrects that and gets stage0 to compile libsyntax with `#[deny(vecs_implicitly_copyable)]`. stage1 still fails, though.
Before:
````
test.rs:3:21: 3:30 error: expected constant integer for repeat count but found variable
test.rs:3 let a = ~[0, ..n]; //~ ERROR expected constant integer for repeat count but found variable
^~~~~~~~~
````
After:
````
test.rs:3:27: 3:28 error: expected constant integer for repeat count but found variable
test.rs:3 let a = ~[0, ..n]; //~ ERROR expected constant integer for repeat count but found variable
^
````
Good morning,
It's taken a long time, but I'm finally almost done freeing libsyntax of `vecs_implicitly_copyable` in this pull request. However, I'm running into some issues. I've confirmed that all but the last commit (which only disables `vecs_implicitly_copyable`) pass the `check` tests. The last commit errors with this message, which makes no sense to me:
```
/Users/erickt/rust/rust/src/libcore/num/f32.rs:35:37: 35:43 error: expected `,` but found `=`
/Users/erickt/rust/rust/src/libcore/num/f32.rs:35 pub pure fn $name($( $arg : $arg_ty ),*) -> $rv {
^~~~~~
```
and this stack trace:
```
#1 0x00000001000b059b in sys::begin_unwind_::_a923ca4ae164c::_06 ()
#2 0x00000001000b0542 in sys::begin_unwind::anon::anon::expr_fn_13876 ()
#3 0x00000001000048a1 in sys::begin_unwind::_8ec273289fc0adc0::_06 ()
#4 0x00000001005df999 in diagnostic::__extensions__::meth_7941::span_fatal::_efdf2d14612d79ec::_06 ()
#5 0x0000000100682d48 in parse::parser::__extensions__::meth_16938::fatal::_8aa3239426747a3::_06 ()
#6 0x00000001006850b8 in parse::common::__extensions__::meth_17005::expect::_d3604ec6c7698d5f::_06 ()
#7 0x00000001006b59f1 in parse::common::__extensions__::parse_seq_to_before_end_17860::_48c79835f9eb1011::_06 ()
#8 0x00000001006a50f7 in parse::parser::__extensions__::meth_17606::parse_fn_decl::_14f3785fe78967d::_06 ()
#9 0x00000001006b6f59 in parse::parser::__extensions__::meth_17987::parse_item_fn::_8a6be529cf7b2ca5::_06 ()
#10 0x00000001006ac839 in parse::parser::__extensions__::meth_17761::parse_item_or_view_item::_bfead947d6dd7d25::_06 ()
#11 0x00000001006c8b8f in parse::parser::__extensions__::meth_18364::parse_item::_96b54e33f65abe76::_06 ()
#12 0x000000010076179f in ext::tt::macro_rules::add_new_extension::generic_extension::anon::anon::expr_fn_23365 ()
#13 0x000000010072e793 in ext::expand::expand_item_mac::_a4f486c4465cfb1b::_06 ()
#14 0x00000001007b5ad3 in __morestack ()
```
There are also a bunch of new warnings that I haven't cleaned up yet: https://gist.github.com/erickt/5048251.
@nikomatsakis thought there might be some scary bug caused by moving a vector in the parser instead of copying it, which is why I'm filing this pull request before it's ready. Thanks for any help!
This allows `TreeMap`/`TreeSet` to fully express their requirements and reduces the comparisons from ~1.5 per level to 1, which really helps for string keys.
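For illustration, here is a minimal modern-Rust sketch (hypothetical names, not the code in this PR): with a single three-way comparison per node, a tree lookup does exactly one key comparison per level, whereas a `<` test sometimes followed by a `>` test averages about 1.5.

```rust
use std::cmp::Ordering;

// Minimal binary-search-tree node; names here are illustrative.
struct Node<K, V> {
    key: K,
    value: V,
    left: Option<Box<Node<K, V>>>,
    right: Option<Box<Node<K, V>>>,
}

// One three-way comparison per level instead of up to two separate tests.
// For string keys each comparison walks the string, so cutting the count matters.
fn find<'a, K: Ord, V>(mut node: Option<&'a Node<K, V>>, key: &K) -> Option<&'a V> {
    while let Some(n) = node {
        match key.cmp(&n.key) {
            Ordering::Less => node = n.left.as_deref(),
            Ordering::Greater => node = n.right.as_deref(),
            Ordering::Equal => return Some(&n.value),
        }
    }
    None
}
```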
I also added `ReverseIter` to the prelude exports, because I forgot to do that when I originally added it.