Renamed bytes_iter to byte_iter to match other iterators
Refactored the str iterators to use DoubleEndedIterator and typedefs instead of wrapper structs (a sketch of the pattern follows these notes)
Reordered the Iterator section
Whitespace fixup
Moved clunky `each_split_within` function to the one place in the tree where it's actually needed
Replaced all block doccomments in str with line doccomments
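As a rough illustration of that typedef-over-wrapper-struct pattern, here is a sketch written with today's `std` names (purely illustrative, not the actual libstr definitions): an alias over existing adapters gets double-ended iteration for free.

```rust
use std::iter::Rev;
use std::str::Bytes;

// A type alias over existing iterator adapters instead of a hand-written
// wrapper struct; reverse iteration comes for free from the adapters.
type ByteRevIterator<'a> = Rev<Bytes<'a>>;

fn bytes_rev(s: &str) -> ByteRevIterator<'_> {
    s.bytes().rev()
}

fn main() {
    assert_eq!(bytes_rev("abc").collect::<Vec<u8>>(), vec![b'c', b'b', b'a']);
}
```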
Continuation of naming cleanup in `libsyntax::ast`:
```rust
ast::node_id => ast::NodeId
ast::local_crate => ast::LOCAL_CRATE
ast::crate_node_id => ast::CRATE_NODE_ID
ast::blk_check_mode => ast::BlockCheckMode
ast::ty_field => ast::TypeField
ast::ty_method => ast::TypeMethod
```
Also moved the span field directly into the `TypeField` struct and cleaned up overlooked `ast::CrateConfig` renamings from the last pull request.
Cheers,
Michael
`crate => Crate`
`local => Local`
`blk => Block`
`crate_num => CrateNum`
`crate_cfg => CrateConfig`
Also, Crate and Local are not wrapped in spanned<T> anymore.
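A minimal before/after sketch of dropping the `spanned<T>` wrapper (field names are illustrative, not the exact ast definitions):

```rust
// Before, the node was wrapped in a generic spanned<T>:
//
//     struct Spanned<T> { node: T, span: Span }
//     type Local = Spanned<Local_>;   // every access went through `.node`
//
// After, the span is an ordinary field on the struct itself:
struct Span { lo: u32, hi: u32 }

struct Local {
    name: String, // placeholder for the real fields
    span: Span,   // span stored directly, no wrapper
}

fn main() {
    let l = Local { name: "x".to_string(), span: Span { lo: 0, hi: 5 } };
    println!("{} @ {}..{}", l.name, l.span.lo, l.span.hi);
}
```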
This does a number of things, but most notably it dramatically reduces the
number of allocations performed for operations involving attributes/
meta items:
- Converts ast::meta_item & ast::attribute and other associated enums
to CamelCase.
- Converts several standalone functions in syntax::attr into methods,
defined on two traits, AttrMetaMethods & AttributeMethods. The former
is common to both MetaItem and Attribute since the latter is a thin
wrapper around the former (a sketch of this follows below).
- Deletes functions that are unnecessary due to iterators.
- Converts other standalone functions to use iterators and the generic
AttrMetaMethods rather than allocating a lot of new vectors (e.g. the
old code had to allocate a new vector just to call functions that
operated on &[meta_item] with an &[attribute]).
- Moves the core algorithm of the #[cfg] matching to syntax::attr,
similar to find_inline_attr and find_linkage_metas.
This doesn't have much of an effect on the speed of #[cfg] stripping,
despite hugely reducing the number of allocations performed; presumably
most of the time is spent in the ast folder rather than doing attribute
checks.
Also fixes the Eq instance of MetaItem_ to correctly ignore spans, so
that `rustc --cfg 'foo(bar)'` now works.
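A minimal sketch of the shared-trait idea described in the list above (types and method names are illustrative, not the exact syntax::attr API):

```rust
struct MetaItem { name: String }

// Attribute is a thin wrapper around a meta item.
struct Attribute { node: MetaItem }

// Methods common to meta items and attributes, so callers can query an
// &[Attribute] directly instead of first allocating a vector of meta items.
trait AttrMetaMethods {
    fn name(&self) -> &str;

    fn check_name(&self, name: &str) -> bool {
        self.name() == name
    }
}

impl AttrMetaMethods for MetaItem {
    fn name(&self) -> &str { &self.name }
}

impl AttrMetaMethods for Attribute {
    fn name(&self) -> &str { self.node.name() }
}

fn main() {
    let attrs = vec![
        Attribute { node: MetaItem { name: "inline".to_string() } },
        Attribute { node: MetaItem { name: "cfg".to_string() } },
    ];
    // Iterator-based query with no intermediate vector of meta items.
    assert!(attrs.iter().any(|a| a.check_name("cfg")));
}
```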
Macros can be conditionally defined because stripping occurs before macro
expansion, but the built-in macros were only added as part of the actual
expansion process and so couldn't be stripped to make their definitions
conditional on cfg flags.
debug! is defined conditionally in terms of the debug config, expanding to
nothing unless the --cfg debug flag is passed (to be precise, it expands to
`if false { normal_debug!(...) }` so that the arguments are still type
checked and unused-variable lints are avoided).
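A minimal sketch of the conditional definition, written in today's `macro_rules!` syntax purely for illustration (the real change wires this into the built-in macro machinery instead):

```rust
// With --cfg debug: debug! forwards to println!.
#[cfg(debug)]
macro_rules! debug {
    ($($arg:tt)*) => { println!($($arg)*) };
}

// Without it: the `if false { ... }` body keeps the arguments type checked
// and silences unused-variable lints while compiling to nothing.
#[cfg(not(debug))]
macro_rules! debug {
    ($($arg:tt)*) => { if false { println!($($arg)*) } };
}

fn main() {
    let x = 42;
    debug!("x = {}", x); // a no-op unless built with --cfg debug
}
```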
Clang actually highlights using bold, not bright white. Match Clang on
this so our diagnostics are still readable on terminals with a white
background.
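For reference, a small sketch of the two SGR escape sequences involved (standard terminal codes, not rustc's actual terminal handling):

```rust
fn main() {
    // SGR 1 is the bold attribute: it keeps the terminal's own foreground
    // colour, so it stays visible on both light and dark backgrounds.
    println!("\x1b[1mbold text\x1b[0m");
    // SGR 97 forces a bright-white foreground, which disappears on a
    // white background.
    println!("\x1b[97mbright white text\x1b[0m");
}
```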
If the TLS key is 0-sized, then the Linux linker is apparently smart enough to
put everything at the same address. OS X, on the other hand, will reserve some
space for all of them. To get around this, the TLS key now actually consumes
space, ensuring that it gets a unique address.
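A minimal sketch of the workaround, with illustrative names rather than the actual runtime internals:

```rust
// A TLS key's identity is its address, so it must occupy at least one byte
// to be guaranteed a unique address on every platform.
pub struct Key {
    // Never read; it only exists so the static is not zero-sized and the
    // linker cannot fold all keys onto the same address.
    _occupy: u8,
}

pub static KEY_A: Key = Key { _occupy: 0 };
pub static KEY_B: Key = Key { _occupy: 0 };

fn main() {
    // With the byte of storage, each key is guaranteed a distinct address.
    assert_ne!(&KEY_A as *const Key, &KEY_B as *const Key);
    println!("{:p} {:p}", &KEY_A, &KEY_B);
}
```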