Commit Graph

344 Commits

Author SHA1 Message Date
Aaron Hill
b9b2546417
Unconditionally capture tokens for attributes.
This allows us to avoid synthesizing tokens in `prepend_attr`, since we
have the original tokens available.

We still need to synthesize tokens when expanding `cfg_attr`,
but this is an unavoidable consequence of the syntax of `cfg_attr` -
the user does not supply the `#` and `[]` tokens that a `cfg_attr`
expands to.
2020-10-21 18:57:29 -04:00
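As a concrete illustration of the `cfg_attr` point above (ordinary user code, not rustc internals), the `#`, `[`, and `]` tokens of the expanded attribute never appear in the user's source:

```rust
// The attribute the user actually writes (`extra-debug` is a hypothetical
// feature name used only for this example):
#[cfg_attr(feature = "extra-debug", derive(Debug))]
struct Point {
    x: i32,
    y: i32,
}

// When `feature = "extra-debug"` is enabled, this expands to:
//
//     #[derive(Debug)]
//     struct Point { x: i32, y: i32 }
//
// The `#`, `[`, and `]` of the expanded attribute were never typed by the
// user, so the parser must synthesize those tokens instead of reusing
// captured ones.

fn main() {
    let _ = Point { x: 1, y: 2 };
}
```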
bors
22e6b9c689 Auto merge of #77250 - Aaron1011:feature/flat-token-collection, r=petrochenkov
Rewrite `collect_tokens` implementations to use a flattened buffer

Instead of trying to collect tokens at each depth, we 'flatten' the
stream as we go along, pushing open/close delimiters to our buffer
just like regular tokens. Once capturing is complete, we reconstruct a
nested `TokenTree::Delimited` structure, producing a normal
`TokenStream`.

The reconstructed `TokenStream` is not created immediately - instead, it is
produced on-demand by a closure (wrapped in a new `LazyTokenStream` type). This
closure stores a clone of the original `TokenCursor`, plus a record of the
number of calls to `next()/next_desugared()`. This is sufficient to reconstruct
the tokenstream seen by the callback without storing any additional state. If
the tokenstream is never used (e.g. when a captured `macro_rules!` argument is
never passed to a proc macro), we never actually create a `TokenStream`.

This implementation has a number of advantages over the previous one:

* It is significantly simpler, with no edge cases around capturing the
  start/end of a delimited group.

* It can be easily extended to allow replacing tokens at an arbitrary
  'depth' by just using `Vec::splice` at the proper position. This is
  important for PR #76130, which requires us to track information about
  attributes along with tokens.

* The lazy approach to `TokenStream` construction allows us to easily
  parse an AST struct, and then decide after the fact whether we need a
  `TokenStream`. This will be useful when we start collecting tokens for
  `Attribute` - we can discard the `LazyTokenStream` if the parsed
  attribute doesn't need tokens (e.g. is a builtin attribute).

The performance impact seems to be negligible (see
https://github.com/rust-lang/rust/pull/77250#issuecomment-703960604). There is a
small slowdown on a few benchmarks, but it only rises above 1% for incremental
builds, where it represents a larger fraction of the much smaller instruction
count. There is a ~1% speedup on a few other incremental benchmarks - my guess is
that the speedups and slowdowns will usually cancel out in practice.
2020-10-21 15:03:14 +00:00
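For readers skimming the log, here is a heavily simplified sketch of the flattening idea described in this commit; the types and names (`FlatToken`, `Tree`, `reconstruct`) are illustrative only, not the actual rustc implementation:

```rust
// Delimiters are pushed into the flat buffer just like regular tokens;
// nesting is rebuilt only once capturing is complete.

#[derive(Clone, Debug)]
enum FlatToken {
    Token(char), // stand-in for a regular token
    OpenDelim,
    CloseDelim,
}

#[derive(Debug)]
enum Tree {
    Token(char),
    Delimited(Vec<Tree>), // reconstructed nested group, akin to TokenTree::Delimited
}

/// Rebuild the nested structure from the flat buffer.
fn reconstruct(flat: &[FlatToken]) -> Vec<Tree> {
    let mut stack = vec![Vec::new()];
    for tok in flat {
        match tok {
            FlatToken::Token(c) => stack.last_mut().unwrap().push(Tree::Token(*c)),
            FlatToken::OpenDelim => stack.push(Vec::new()),
            FlatToken::CloseDelim => {
                let group = stack.pop().unwrap();
                stack.last_mut().unwrap().push(Tree::Delimited(group));
            }
        }
    }
    stack.pop().unwrap()
}

fn main() {
    // Corresponds roughly to the token text `a ( b c ) d`.
    let flat = [
        FlatToken::Token('a'),
        FlatToken::OpenDelim,
        FlatToken::Token('b'),
        FlatToken::Token('c'),
        FlatToken::CloseDelim,
        FlatToken::Token('d'),
    ];
    println!("{:?}", reconstruct(&flat));
}
```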
Aaron Hill
593fdd3d45
Rewrite collect_tokens implementations to use a flattened buffer
Instead of trying to collect tokens at each depth, we 'flatten' the
stream as we go along, pushing open/close delimiters to our buffer
just like regular tokens. Once capturing is complete, we reconstruct a
nested `TokenTree::Delimited` structure, producing a normal
`TokenStream`.

The reconstructed `TokenStream` is not created immediately - instead, it is
produced on-demand by a closure (wrapped in a new `LazyTokenStream` type). This
closure stores a clone of the original `TokenCursor`, plus a record of the
number of calls to `next()/next_desugared()`. This is sufficient to reconstruct
the tokenstream seen by the callback without storing any additional state. If
the tokenstream is never used (e.g. when a captured `macro_rules!` argument is
never passed to a proc macro), we never actually create a `TokenStream`.

This implementation has a number of advantages over the previous one:

* It is significantly simpler, with no edge cases around capturing the
  start/end of a delimited group.

* It can be easily extended to allow replacing tokens at an arbitrary
  'depth' by just using `Vec::splice` at the proper position. This is
  important for PR #76130, which requires us to track information about
  attributes along with tokens.

* The lazy approach to `TokenStream` construction allows us to easily
  parse an AST struct, and then decide after the fact whether we need a
  `TokenStream`. This will be useful when we start collecting tokens for
  `Attribute` - we can discard the `LazyTokenStream` if the parsed
  attribute doesn't need tokens (e.g. is a builtin attribute).

The performance impact seems to be negligible (see
https://github.com/rust-lang/rust/pull/77250#issuecomment-703960604). There is a
small slowdown on a few benchmarks, but it only rises above 1% for incremental
builds, where it represents a larger fraction of the much smaller instruction
count. There is a ~1% speedup on a few other incremental benchmarks - my guess is
that the speedups and slowdowns will usually cancel out in practice.
2020-10-19 13:59:18 -04:00
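The lazy, on-demand construction mentioned above can be pictured with a small hypothetical sketch (the real `LazyTokenStream` stores a cloned `TokenCursor` plus a call count; here an arbitrary closure stands in for that state):

```rust
// A stream that is only reconstructed if someone actually asks for it.
struct LazyStream {
    make: Box<dyn Fn() -> Vec<String>>,
}

impl LazyStream {
    fn new(make: impl Fn() -> Vec<String> + 'static) -> Self {
        LazyStream { make: Box::new(make) }
    }

    /// Pay the reconstruction cost only when the stream is needed,
    /// e.g. when a captured fragment is passed on to a proc macro.
    fn create(&self) -> Vec<String> {
        (self.make)()
    }
}

fn main() {
    // The closure captures whatever state is needed to replay parsing
    // (in rustc: a clone of the cursor plus a count of `next()` calls).
    let lazy = LazyStream::new(|| {
        println!("reconstructing token stream...");
        vec!["fn".into(), "main".into(), "(".into(), ")".into()]
    });

    // If this call never happens, no stream is ever built.
    let tokens = lazy.create();
    println!("{:?}", tokens);
}
```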
Aaron Hill
f6aec82d4d
Avoid cloning the contents of a TokenStream in a few places 2020-10-19 12:30:41 -04:00
bors
834821e3b6 Auto merge of #78066 - bugadani:wat, r=jonas-schievink
Clean up small, surprising bits of code

This PR cleans up a small number of small, unrelated things I found while browsing the code base.
2020-10-18 13:50:31 +00:00
Dániel Buga
d708d7fb79 No need to map the max_distance 2020-10-18 11:01:08 +02:00
Santiago Pastorino
03defb627c
Add check_generic_arg early pass 2020-10-16 17:14:36 -03:00
Santiago Pastorino
c3e8d7965c
Parse inline const expressions 2020-10-16 15:15:30 -03:00
Yuki Okushi
022d20759b
Rollup merge of #77739 - est31:remove_unused_code, r=petrochenkov,varkor
Remove unused code

Rustc has a builtin lint for detecting unused code inside a crate, but when an item is marked `pub`, the code, even if unused inside the entire workspace, is never marked as such. Therefore, I've built [warnalyzer](https://github.com/est31/warnalyzer) to detect unused items in a cross-crate setting.

Closes https://github.com/est31/warnalyzer/issues/2
2020-10-15 07:32:29 +09:00
est31
49d4a756f1 Remove unused code from rustc_ast 2020-10-14 04:14:32 +02:00
Aaron Hill
9a6ea38647
Add hack to keep actix-web and actori-web compiling
This extends the existing `ident_name_compatibility_hack` to handle the
`tuple_from_req` macro defined in `actix-web` (and its fork
`actori-web`).
2020-10-11 13:20:26 -04:00
Esteban Küber
4ae8f6ec7c address review comments 2020-10-09 22:00:48 -07:00
bors
91a79fb29a Auto merge of #76985 - hbina:clone_check, r=estebank
Prevent stack overflow in deeply nested types.

Related issue #75577 (?)

Unfortunately, I am unable to test whether this actually solves the problem because apparently, 12GB RAM + 2GB swap is not enough to compile the (admittedly toy) source file.
2020-10-07 21:51:12 +00:00
Robin Schoonover
5ab19676ed Remove extra indirection in LitKind::ByteStr 2020-10-04 15:52:15 -06:00
Erik Hofmayer
764967a7e5 tidy 2020-09-23 22:08:30 +02:00
Erik Hofmayer
138a2e5eaa /nightly/nightly-rustc 2020-09-23 21:51:56 +02:00
Erik Hofmayer
dd66ea2d3d Updated html_root_url for compiler crates 2020-09-23 21:14:43 +02:00
Dylan MacKenzie
6044836284 Add #![feature(const_fn_transmute)] to rustc_ast 2020-09-22 10:22:21 -07:00
Hanif Bin Ariffin
dc655b2842 Prevent stack overflow 2020-09-21 04:20:41 +08:00
Ralf Jung
8405d50e12
Rollup merge of #76890 - matthiaskrgr:matches_simpl, r=lcnr
use matches!() macro for simple if let conditions
2020-09-20 15:52:01 +02:00
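A quick example of the pattern this refactor targets (plain Rust, not rustc-specific code):

```rust
fn main() {
    let value: Option<i32> = Some(3);

    // Before: an `if let` used purely as a boolean condition.
    let is_some_old = if let Some(_) = value { true } else { false };

    // After: the same check with the `matches!` macro.
    let is_some_new = matches!(value, Some(_));

    assert_eq!(is_some_old, is_some_new);
}
```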
Matthias Krüger
40dddd3305 use matches!() macro for simple if let conditions 2020-09-18 20:28:35 +02:00
est31
ebdea01143 Remove redundant #![feature(...)] 's from compiler/ 2020-09-17 07:58:45 +02:00
bors
9b4154193e Auto merge of #76541 - matthiaskrgr:unstable_sort, r=davidtwco
use sort_unstable to sort primitive types

It's not important to retain the original order of equal elements when we have, for example, &[1, 1, 2, 3].

clippy::stable_sort_primitive
2020-09-14 21:43:17 +00:00
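For context, a minimal example of the swap (plain Rust): for primitive elements the two sorts produce identical results, and equal elements are indistinguishable anyway, so stability buys nothing.

```rust
fn main() {
    let mut v = [3, 1, 2, 1];

    // Stable sort: preserves the relative order of equal elements
    // (irrelevant for plain integers).
    // v.sort();

    // Unstable sort: same result for primitives, typically faster
    // and without the extra allocation.
    v.sort_unstable();

    assert_eq!(v, [1, 1, 2, 3]);
}
```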
Aaron Hill
fec0479075
Fully integrate token collection for additional AST structs
This commit contains miscellaneous changes that don't fit into any of
the other commits in this PR
2020-09-10 17:58:14 -04:00
Aaron Hill
156ef2bee8
Attach tokens to ast::Stmt
We currently only attach tokens when parsing a `:stmt` matcher for a
`macro_rules!` macro. Proc-macro attributes on statements are still
unstable, and need additional work.
2020-09-10 17:33:06 -04:00
Aaron Hill
c1011165e6
Attach TokenStream to ast::Visibility
A `Visibility` does not have outer attributes, so we only capture tokens
when parsing a `macro_rules!` matcher
2020-09-10 17:33:06 -04:00
Aaron Hill
55082ce413
Attach TokenStream to ast::Path 2020-09-10 17:33:06 -04:00
Aaron Hill
3815e91ccd
Attach tokens to NtMeta (ast::AttrItem)
An `AttrItem` does not have outer attributes, so we only capture tokens
when parsing a `macro_rules!` matcher
2020-09-10 17:33:06 -04:00
Aaron Hill
1823dea7df
Attach TokenStream to ast::Ty
A `Ty` does not have outer attributes, so we only capture tokens
when parsing a `macro_rules!` matcher
2020-09-10 17:33:05 -04:00
Aaron Hill
de4bd9f0f8
Attach TokenStream to ast::Block
A `Block` does not have outer attributes, so we only capture tokens when
parsing a `macro_rules!` matcher
2020-09-10 17:33:05 -04:00
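The series of "Attach tokens/TokenStream to ..." commits above all follow the same shape; a highly simplified, hypothetical sketch (not the real rustc definitions) looks like this:

```rust
// Each AST node gains an optional, lazily built token stream.
struct LazyTokenStream; // placeholder for the real lazy stream type

struct Block {
    // ...the existing fields (statements, span, etc.), simplified here...
    stmts: Vec<String>,
    /// Captured only when the node is parsed inside a `macro_rules!` matcher;
    /// `None` means no tokens were collected.
    tokens: Option<LazyTokenStream>,
}

fn main() {
    let block = Block {
        stmts: vec!["let x = 1;".into()],
        tokens: None,
    };
    println!(
        "{} stmt(s), captured tokens? {}",
        block.stmts.len(),
        block.tokens.is_some()
    );
}
```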
David Tolnay
fd4dd00dde
Syntactically permit unsafety on mods 2020-09-10 06:56:33 -07:00
bors
a18b34d979 Auto merge of #76291 - matklad:spacing, r=petrochenkov
Rename IsJoint -> Spacing

Builds on #76286 and might conflict with #76285

r? `@petrochenkov`
2020-09-10 08:07:48 +00:00
Tyler Mandry
fdff7defc9 Revert "Rollup merge of #76285 - matklad:censor-spacing, r=petrochenkov"
This reverts commit 85cee57fd7, reversing
changes made to b4d3873024.
2020-09-10 02:18:46 +00:00
Matthias Krüger
b4935e0726 use sort_unstable to sort primitive types
It's not important to retain the original order of equal elements when we have, for example, &[1, 1, 2, 3].

clippy::stable_sort_primitive
2020-09-10 00:03:58 +02:00
Dylan DPC
6545985888
Rollup merge of #76274 - scottmcm:fix-76271, r=petrochenkov
Allow try blocks as the argument to return expressions

Fixes #76271

I don't think this needs to be edition-aware (phew) since `return try` in 2015 is also the start of an expression, just with a struct literal instead of a block (`return try { x: 4, y: 5 }`).
2020-09-07 01:17:46 +02:00
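An illustration of the syntax this change accepts (nightly-only, behind the unstable `try_blocks` feature; example code only, not taken from the PR):

```rust
#![feature(try_blocks)]

fn parse_both(a: &str, b: &str) -> Result<(i32, i32), std::num::ParseIntError> {
    // A `try` block directly as the argument of `return` now parses.
    return try {
        let x: i32 = a.parse()?;
        let y: i32 = b.parse()?;
        (x, y)
    };
}

fn main() {
    println!("{:?}", parse_both("1", "2"));
    println!("{:?}", parse_both("1", "x"));
}
```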
bors
ffaf158608 Auto merge of #76331 - Aaron1011:fix/group-compat-hack-test, r=petrochenkov
Account for version number in NtIdent hack

Issue #74616 tracks a backwards-compatibility hack for certain macros.
This hack is implemented by hard-coding the filenames and macro names of
certain code that we want to continue to compile.

However, the initial implementation of the hack was based on the
directory structure when building the crate from its repository (e.g.
`js-sys/src/lib.rs`). When the crate is built as a dependency, the path will
include a version number from the copy in the cargo registry (e.g.
`js-sys-0.3.17/src/lib.rs`), which would fail the check.

This commit modifies the backwards-compatibility hack to check that
the desired crate name (`js-sys` or `time-macros-impl`) is a prefix of the
proper part of the path.

See https://github.com/rust-lang/rust/issues/76070#issuecomment-687215646
for more details.
2020-09-06 06:15:28 +00:00
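A simplified, hypothetical version of the prefix check described above (not the actual rustc code) might look like this:

```rust
// Instead of comparing against a fixed directory name, verify that the crate
// name is a prefix of the relevant path component, so both
// `js-sys/src/lib.rs` and `js-sys-0.3.17/src/lib.rs` are accepted.

fn matches_crate_dir(path: &str, crate_name: &str) -> bool {
    path.split('/')
        .rev()
        .nth(2) // directory two levels above `lib.rs`, e.g. `js-sys-0.3.17`
        .map_or(false, |dir| dir.starts_with(crate_name))
}

fn main() {
    assert!(matches_crate_dir("js-sys/src/lib.rs", "js-sys"));
    assert!(matches_crate_dir("registry/js-sys-0.3.17/src/lib.rs", "js-sys"));
    assert!(!matches_crate_dir("other-crate-1.0/src/lib.rs", "js-sys"));
}
```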
Aaron Hill
9e7ef659e1
Account for version number in NtIdent hack
Issue #74616 tracks a backwards-compatibility hack for certain macros.
This hack is implemented by hard-coding the filenames and macro names of
certain code that we want to continue to compile.

However, the initial implementation of the hack was based on the
directory structure when building the crate from its repository (e.g.
`js-sys/src/lib.rs`). When the crate is built as a dependency, the path will
include a version number from the copy in the cargo registry (e.g.
`js-sys-0.3.17/src/lib.rs`), which would fail the check.

This commit modifies the backwards-compatibility hack to check that
the desired crate name (`js-sys` or `time-macros-impl`) is a prefix of the
proper part of the path.

See https://github.com/rust-lang/rust/issues/76070#issuecomment-687215646
for more details.
2020-09-04 13:10:23 -04:00
Aleksey Kladov
09d3db2e59 Optimize Cursor::look_ahead
Cloning a tt is cheap, but not free (there's an Arc inside).
2020-09-03 23:28:22 +02:00
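A hedged sketch of the kind of change this describes: hand out a reference from the lookahead instead of cloning the `Arc`-backed element (names are placeholders, not the real rustc types):

```rust
use std::sync::Arc;

#[derive(Debug)]
struct Tt(Arc<String>); // cloning this bumps an atomic refcount

struct Cursor {
    stream: Vec<Tt>,
    index: usize,
}

impl Cursor {
    // Before: returning a clone costs an Arc increment per call.
    fn look_ahead_cloned(&self, n: usize) -> Option<Tt> {
        self.stream.get(self.index + n).map(|tt| Tt(Arc::clone(&tt.0)))
    }

    // After: borrowing avoids touching the refcount at all.
    fn look_ahead(&self, n: usize) -> Option<&Tt> {
        self.stream.get(self.index + n)
    }
}

fn main() {
    let cursor = Cursor {
        stream: vec![Tt(Arc::new("fn".into())), Tt(Arc::new("main".into()))],
        index: 0,
    };
    println!("{:?} {:?}", cursor.look_ahead_cloned(0), cursor.look_ahead(1));
}
```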
Aleksey Kladov
ccf41dd5eb Rename IsJoint -> Spacing
To match better naming from proc-macro
2020-09-03 17:32:45 +02:00
Scott McMurray
791f93c796 Allow try blocks as the argument to return expressions
Fixes 76271
2020-09-02 23:39:50 -07:00
bors
b4acb11033 Auto merge of #76170 - matklad:notrivia, r=petrochenkov
Remove trivia tokens

r? @ghost
2020-09-02 03:19:38 +00:00
Aleksey Kladov
5326361fc0 Remove trivia tokens 2020-09-01 11:39:11 +02:00
Aaron Hill
090b16717a
Factor out StmtKind::MacCall fields into MacCallStmt struct
In PR #76130, I add a fourth field, which makes using a tuple variant
somewhat unwieldy.
2020-08-30 18:38:53 -04:00
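An illustrative sketch of the refactor (not the actual rustc definitions): the tuple variant's fields move into a named struct, which is easier to extend with a fourth field.

```rust
struct MacCall; // placeholder for the macro-call payload
enum MacStmtStyle { Semicolon, Braces, NoBraces }
struct AttrVec(Vec<String>);

// Before (roughly): StmtKind::MacCall(MacCall, MacStmtStyle, AttrVec)

struct MacCallStmt {
    mac: MacCall,
    style: MacStmtStyle,
    attrs: AttrVec,
    // ...room for additional fields, e.g. captured tokens...
}

enum StmtKind {
    MacCall(Box<MacCallStmt>),
    // ...other variants elided...
}

fn main() {
    let stmt = StmtKind::MacCall(Box::new(MacCallStmt {
        mac: MacCall,
        style: MacStmtStyle::Semicolon,
        attrs: AttrVec(Vec::new()),
    }));
    if let StmtKind::MacCall(m) = &stmt {
        let _ = (&m.mac, &m.attrs);
        println!(
            "macro-call statement, semicolon style: {}",
            matches!(m.style, MacStmtStyle::Semicolon)
        );
    }
}
```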
mark
9e5f7d5631 mv compiler to compiler/ 2020-08-30 18:45:07 +03:00