Yuki Okushi c14c9bafcd
Rollup merge of #77629 - Julian-Wollersberger:recomputeRawStrError, r=varkor
Cleanup of `eat_while()` in lexer

The size of a lexer `Token` was inflated by the largest `TokenKind` variants, `LiteralKind::RawStr` and `RawByteStr`, because
* they used `usize` although `u32` is sufficient in rustc, since crates must be smaller than 4 GB,
* and they stored the 20-byte `RawStrError` enum for error reporting (see the sketch after this list).
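
As a rough illustration of why this matters: a Rust enum occupies as much space as its largest variant, so widening `RawStr` widens every token. The sketch below uses simplified stand-in types (`FatLiteralKind`, `SlimLiteralKind`, and a cut-down `RawStrError`), not the real rustc_lexer definitions, to show the effect with `std::mem::size_of`.

```rust
use std::mem::size_of;

// Simplified stand-ins, not the real rustc_lexer definitions. The point is
// only that an enum is as large as its largest variant, so a `usize` field
// plus an inline error enum in `RawStr` inflates every token.
#[allow(dead_code)]
#[derive(Clone, Copy)]
enum RawStrError {
    InvalidStarter { bad_char: char },
    NoTerminator { expected: usize, found: usize, possible_terminator_offset: Option<usize> },
    TooManyDelimiters { found: usize },
}

#[allow(dead_code)]
#[derive(Clone, Copy)]
enum FatLiteralKind {
    Int,
    RawStr { n_hashes: usize, err: Option<RawStrError> },
}

#[allow(dead_code)]
#[derive(Clone, Copy)]
enum SlimLiteralKind {
    Int,
    RawStr { n_hashes: u16 }, // error details are recomputed on demand instead
}

fn main() {
    // On a 64-bit target the "fat" variant dominates the enum's size,
    // while the slimmed-down variant keeps it small.
    println!("fat:  {} bytes", size_of::<FatLiteralKind>());
    println!("slim: {} bytes", size_of::<SlimLiteralKind>());
}
```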

If a raw string is invalid, it now needs to be re-parsed to recover the `RawStrError` details, but that is a very cold code path (sketched below).
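
To make the idea concrete, here is a minimal sketch of that "recompute on the error path" pattern. The helper names (`scan_raw_str`, `validate_raw_str`) and the cut-down `RawStrError` are made up for illustration; this is not the actual rustc_lexer/rustc_parse code.

```rust
/// Cut-down error enum, for illustration only.
#[derive(Debug)]
enum RawStrError {
    InvalidStarter,
    NoTerminator,
}

/// Hot path: a cheap scan that only records *whether* the raw string was
/// valid, so the token stays small.
fn scan_raw_str(src: &str) -> bool {
    validate_raw_str(src).is_ok()
}

/// Cold path: re-parse the same text to reconstruct the detailed error.
/// This runs only when a diagnostic is about to be emitted, so the extra
/// pass over the (invalid) literal does not matter for performance.
fn validate_raw_str(src: &str) -> Result<(), RawStrError> {
    let mut chars = src.chars();
    if chars.next() != Some('r') {
        return Err(RawStrError::InvalidStarter);
    }
    let hashes: String = chars.take_while(|&c| c == '#').collect();
    let terminator = format!("\"{}", hashes);
    let rest = &src[1 + hashes.len()..];
    if rest.starts_with('"') && rest[1..].contains(terminator.as_str()) {
        Ok(())
    } else {
        Err(RawStrError::NoTerminator)
    }
}

fn main() {
    let bad = r###"r##"never closed"###; // missing the matching `"##`
    if !scan_raw_str(bad) {
        // Only now do we pay for the detailed re-parse.
        println!("error: {:?}", validate_raw_str(bad).unwrap_err());
    }
}
```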

Technically this breaks other tools that depend on rustc_lexer, because they are now also restricted to a maximum file size of 4 GB. But this shouldn't matter in practice, and rustc_lexer isn't stable anyway.

Can I also get a perf run?

Edit: this makes no difference in performance. The PR now only contains a small cleanup of `eat_while()`.
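
For readers unfamiliar with the helper in the title, below is a generic, self-contained sketch of what an `eat_while`-style routine looks like in a hand-written lexer. The `Cursor` struct and the `first`/`bump` method names are illustrative, not the exact rustc_lexer API; the PR's actual cleanup details live in #77629 itself.

```rust
use std::str::Chars;

/// Minimal cursor over the source text (illustrative, not rustc_lexer's).
struct Cursor<'a> {
    chars: Chars<'a>,
}

impl<'a> Cursor<'a> {
    fn new(input: &'a str) -> Self {
        Cursor { chars: input.chars() }
    }

    /// Peek at the next character without consuming it.
    fn first(&self) -> Option<char> {
        self.chars.clone().next()
    }

    /// Consume one character.
    fn bump(&mut self) -> Option<char> {
        self.chars.next()
    }

    /// Consume characters as long as `predicate` holds for the next one.
    fn eat_while(&mut self, mut predicate: impl FnMut(char) -> bool) {
        while matches!(self.first(), Some(c) if predicate(c)) {
            self.bump();
        }
    }
}

fn main() {
    let mut cursor = Cursor::new("1234abc");
    cursor.eat_while(|c| c.is_ascii_digit());
    // The digits have been consumed; the rest of the input remains.
    assert_eq!(cursor.chars.as_str(), "abc");
}
```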
2020-10-11 03:19:07 +09:00