Cleanup of `eat_while()` in lexer
The size of a lexer `Token` was inflated by the largest `TokenKind` variants, `LiteralKind::RawStr` and `RawByteStr`, because
* it used `usize` although `u32` is sufficient in rustc, since crates must be smaller than 4GB,
* and it stored the 20-byte `RawStrError` enum for error reporting (a rough size sketch follows this list).
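For illustration, here is a minimal, self-contained size sketch of the effect described above. The enums are hypothetical stand-ins, not the actual `rustc_lexer` definitions:

```rust
// Rough stand-ins only: the real names/fields in rustc_lexer may differ.
#![allow(dead_code)]
use std::mem::size_of;

// Approximation of `RawStrError` (about 20 bytes on a 64-bit target).
enum RawStrError {
    InvalidStarter { bad_char: char },
    NoTerminator { expected: u32, found: u32, possible_terminator_offset: Option<u32> },
    TooManyDelimiters { found: u32 },
}

// "Before": the raw string variant carries `usize` data and the error inline.
enum KindBefore {
    RawStr { n_hashes: usize, err: Option<RawStrError> },
    Ident, // representative cheap variant
}

// "After": `u32` is enough (crates are < 4GB) and the error is dropped;
// an invalid literal is re-lexed later to recover the details.
enum KindAfter {
    RawStr { n_hashes: u32 },
    Ident,
}

fn main() {
    println!("RawStrError: {} bytes", size_of::<RawStrError>());
    println!("kind before: {} bytes", size_of::<KindBefore>());
    println!("kind after:  {} bytes", size_of::<KindAfter>());
}
```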
If a raw string is invalid, it now needs to be reparsed to get the `RawStrError` data, but that is a very cold code path.
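As a sketch of what such a cold-path re-scan could look like (the `rescan_raw_str` helper and the error enum below are hypothetical stand-ins, not the actual `rustc_lexer` API):

```rust
// Hypothetical sketch: re-derive the error details for an invalid raw string
// only when diagnostics are actually emitted (the cold path).
#[derive(Debug)]
enum RawStrError {
    InvalidStarter { bad_char: char },
    NoTerminator { expected: u32, found: u32 },
}

/// Re-scan the part of a raw string literal after the `r` prefix,
/// e.g. `###"abc"###` for the literal `r###"abc"###`.
fn rescan_raw_str(after_r: &str) -> Result<u32, RawStrError> {
    let mut chars = after_r.chars();

    // Count the opening `#`s.
    let mut expected: u32 = 0;
    let mut c = chars.next();
    while c == Some('#') {
        expected += 1;
        c = chars.next();
    }

    // The next character must open the string body.
    if c != Some('"') {
        // '\0' stands in for "end of input".
        return Err(RawStrError::InvalidStarter { bad_char: c.unwrap_or('\0') });
    }

    // Look for a closing `"` followed by `expected` `#`s.
    let mut max_found: u32 = 0;
    while let Some(c) = chars.next() {
        if c != '"' {
            continue;
        }
        let mut found = 0;
        while found < expected && chars.clone().next() == Some('#') {
            chars.next();
            found += 1;
        }
        if found == expected {
            return Ok(expected);
        }
        max_found = max_found.max(found);
    }
    Err(RawStrError::NoTerminator { expected, found: max_found })
}

fn main() {
    assert!(rescan_raw_str("##\"ok\"##").is_ok());
    // Missing one closing `#`: the error data is reconstructed on demand.
    println!("{:?}", rescan_raw_str("##\"oops\"#"));
}
```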
Technically this breaks other tools that depend on `rustc_lexer`, because they are now also restricted to a max file size of 4GB. But this shouldn't matter in practice, and `rustc_lexer` isn't stable anyway.
Can I also get a perf run?
Edit: This makes no difference in performance. The PR now only contains a small cleanup.