10689: Handle pub tuple fields in tuple structs r=Veykril a=adamrk
The current implementation will throw a parser error for tuple structs
that contain a pub tuple field. For example,
```rust
struct Foo(pub (u32, u32));
```
is valid Rust, but rust-analyzer throws a parser error because the parens
after `pub` are treated as a visibility restriction. Allowing a tuple type
to follow `pub` in the special case where we are defining the fields of a
tuple struct fixes the issue.
I guess this is a really minor case, because there's not much reason to
have a tuple type within a tuple struct, but it is valid Rust syntax...
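A toy sketch of the disambiguation, with made-up names (this is not the actual rust-analyzer parser code): inside tuple-struct fields, `pub(` only begins a visibility restriction when the token after the `(` is one of the restriction keywords; anything else means the parentheses belong to a tuple type.
```rust
// Hypothetical helper, not the real parser: decide whether the `(` that
// follows `pub` in a tuple-struct field opens a visibility restriction
// or a tuple type, based on the token right after it.
fn pub_paren_starts_visibility(token_after_paren: &str) -> bool {
    matches!(token_after_paren, "crate" | "self" | "super" | "in")
}

fn main() {
    // `struct Foo(pub(crate) u32);` -> visibility restriction
    assert!(pub_paren_starts_visibility("crate"));
    // `struct Foo(pub (u32, u32));` -> tuple type, not a visibility
    assert!(!pub_paren_starts_visibility("u32"));
}
```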
Co-authored-by: Adam Bratschi-Kaye <ark.email@gmail.com>
10738: internal: Do not search through all three namespaces in `ItemScope::name_of` r=Veykril a=Veykril
Brings down `5ms - find_path_prefixed (46 calls)` to `1ms - find_path_prefixed (46 calls)` for me on the `integrated_completion_benchmark`.
Still `O(n)`, but this should considerably cut down lookups nevertheless (as shown by the timings already).
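A minimal sketch of the idea with standalone types (the real `ItemScope` is more involved): pick the map for the item's own namespace up front and do the reverse lookup only there, instead of scanning all three.
```rust
use std::collections::HashMap;

// Stand-ins for the real rust-analyzer types, just to show the shape of
// the change: the reverse lookup stays linear, but only one namespace's
// map is walked instead of all three.
enum ItemInNs {
    Types(u32),
    Values(u32),
    Macros(u32),
}

struct ItemScope {
    types: HashMap<String, u32>,
    values: HashMap<String, u32>,
    macros: HashMap<String, u32>,
}

impl ItemScope {
    fn name_of(&self, item: &ItemInNs) -> Option<&str> {
        // Dispatch on the namespace first, then search only that map.
        let (map, id) = match item {
            ItemInNs::Types(id) => (&self.types, id),
            ItemInNs::Values(id) => (&self.values, id),
            ItemInNs::Macros(id) => (&self.macros, id),
        };
        map.iter().find(|(_, v)| *v == id).map(|(name, _)| name.as_str())
    }
}
```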
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10699: internal: Make CompletionItem `label` and `lookup` fields `SmolStr`s r=Veykril a=Veykril
This replaces a bunch of `String` clones with `SmolStr` clones, though it also makes a few parts a bit more expensive (mainly things involving `format!`ted strings as labels).
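A standalone illustration of the trade-off, assuming the `smol_str` crate that rust-analyzer uses: short strings are stored inline, so cloning a `SmolStr` label avoids a heap allocation, while `format!`-built labels still go through a `String` first.
```rust
use smol_str::SmolStr;

fn main() {
    // Short labels are stored inline; cloning is a cheap copy rather
    // than a fresh heap allocation.
    let label = SmolStr::new("clone");
    let copy = label.clone();
    assert_eq!(copy.as_str(), "clone");

    // Labels built from `format!` still allocate a `String` first and are
    // then converted, which is the "a bit more expensive" part above.
    let formatted = SmolStr::new(format!("{}<{}>", "Vec", "String"));
    assert_eq!(formatted.as_str(), "Vec<String>");
}
```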
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10704: internal: Short-circuit `descend_into_macros_single` r=Veykril a=Veykril
There is no need to descend everything if all we are interested in is the first mapping.
This brings the `descend_into_macros` timing when highlighting `rust-analyzer/src/config.rs` from `154ms - descend_into_macros (2190 calls)` down to `24ms - descend_into_macros (2190 calls)`, since we use the single variant there (this will regress once we want to highlight multiple namespaces again, though).
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10703: internal: Don't check items for macro calls if they have no attributes r=Veykril a=Veykril
Turns out that when highlighting we currently populate the `DynMap`s of pretty much every item in a file; who would've known that would be so costly...
Shaves off 250 ms for the integrated benchmark on `rust-analyzer/src/config.rs`.
We are still looking at a hefty `154ms - descend_into_macros (2190 calls)`, but I feel like this is slowly nearing just call overhead.
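A minimal sketch of the early-out with hypothetical names (not the real source-to-def code): an item with no attributes at all cannot carry attribute macro calls, so skip it before doing any expensive work.
```rust
// Hypothetical item type used only for illustration.
struct Item {
    attrs: Vec<String>,
}

fn collect_macro_calls(item: &Item) {
    // The cheap check added by this change: no attributes means no
    // attribute macro calls, so don't populate any caches for this item.
    if item.attrs.is_empty() {
        return;
    }
    // ... expensive attribute-macro resolution would go here ...
}

fn main() {
    collect_macro_calls(&Item { attrs: vec![] });
    collect_macro_calls(&Item { attrs: vec!["derive(Debug)".into()] });
}
```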
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10701: internal: Cache ast::MacroCalls to their expansions in Semantics::descend_into_macros_impl r=Veykril a=Veykril
Saves ~45ms when highlighting `rust-analyzer/src/config.rs` for me
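A minimal sketch of the caching pattern with hypothetical types (not the real `Semantics` internals): remember each macro call's expansion the first time it is computed, so repeated descents into the same call don't expand it again.
```rust
use std::collections::HashMap;

// Hypothetical stand-ins: a macro call is identified by an id and its
// expansion is just a string here.
#[derive(Clone)]
struct Expansion(String);

#[derive(Default)]
struct ExpansionCache {
    by_call: HashMap<u32, Expansion>,
}

impl ExpansionCache {
    // Expand on first request, then serve the cached result.
    fn expansion_of(&mut self, call_id: u32) -> Expansion {
        self.by_call
            .entry(call_id)
            .or_insert_with(|| expensive_expand(call_id))
            .clone()
    }
}

fn expensive_expand(call_id: u32) -> Expansion {
    Expansion(format!("expansion of macro call {}", call_id))
}
```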
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10686: internal: Add `Semantics::original_ast_node` for upmapping nodes out of macro files r=Veykril a=Veykril
Fixes import insertion targeting macro-expanded files, which then produced text edits on very wrong text ranges.
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10623: internal: replace L_DOLLAR/R_DOLLAR with parenthesis hack r=matklad a=matklad
The general problem we are dealing with here is this:
```
macro_rules! thrice {
($e:expr) => { $e * 3}
}
fn main() {
let x = thrice!(1 + 2);
}
```
we really want this to print 9 rather than 7.
The way rustc solves this is rather ad-hoc. In rustc, token trees are
allowed to include whole AST fragments, so `1 + 2` is passed through macro
expansion as a single unit. This is a significant violation of the token
tree model.
In rust-analyzer, we intended to handle this in a more elegant way,
using token trees with "invisible" delimiters. The idea was that we
introduce a new kind of parenthesis, "left $"/"right $", and let the
parser intelligently handle this.
The idea was inspired by the relevant comment in the proc_macro crate:
https://doc.rust-lang.org/stable/proc_macro/enum.Delimiter.html#variant.None
> An implicit delimiter, that may, for example, appear around tokens
> coming from a “macro variable” $var. It is important to preserve
> operator priorities in cases like $var * 3 where $var is 1 + 2.
> Implicit delimiters might not survive roundtrip of a token stream
> through a string.
Now that we are older and wiser, we conclude that the idea doesn't work.
_First_, the comment in the proc-macro crate is wishful thinking. Rustc
currently completely ignores `None` delimiters. It solves the `(1 + 2) * 3`
problem by having magical token trees which can't be duplicated:
* https://rust-lang.zulipchat.com/#narrow/stream/185405-t-compiler.2Frust-analyzer/topic/TIL.20that.20token.20streams.20are.20magic
* https://rust-lang.zulipchat.com/#narrow/stream/131828-t-compiler/topic/Handling.20of.20Delimiter.3A.3ANone.20by.20the.20parser
_Second_, it's not like our implementation in rust-analyzer works. We
special-case expressions (as opposed to treating all kinds of $var
captures the same) and we don't know how parser error recovery should
work with these dollar-parentheses.
So, in this PR we simplify the whole thing away by not pretending that
we are doing something proper and instead just explicitly special-casing
expressions by wrapping them into real `()`.
In the future, to maintain bug-parity with `rustc`, what we will probably
do is add an explicit `CAPTURED_EXPR` *token* which we can explicitly
account for in the parser.
If/when rustc starts handling delimiter=none properly, we'll port that
logic as well, in addition to special handling.
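Spelled out by hand on the `thrice!` example above, this is exactly what the explicit wrapping buys (plain Rust, no macro machinery involved):
```rust
fn main() {
    // Splicing the raw tokens of `1 + 2` into `$e * 3` without any
    // delimiter would mean this:
    let naive = 1 + 2 * 3;
    assert_eq!(naive, 7);

    // Wrapping the captured expression in real parentheses preserves the
    // intended grouping:
    let wrapped = (1 + 2) * 3;
    assert_eq!(wrapped, 9);
}
```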
Co-authored-by: Aleksey Kladov <aleksey.kladov@gmail.com>
10629: Add assist for replacing turbofish with explicit type. r=Veykril a=terrynsun
Converts a turbofish (`::<_>`) into an explicit type annotation on the binding.
```
let args = args.collect::<Vec<String>>();
```
->
```
let args: Vec<String> = args.collect();
```
Closes #10285
Co-authored-by: Terry Sun <terrynsun@gmail.com>
10649: internal: Remove `CompletionKind` in favor of `CompletionItemKind` r=Veykril a=Veykril
and move some more tests around
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10642: minor: Add dummy impls for `trace_macros` and `log_syntax` r=Veykril a=Veykril
Both of these are macros for debugging macros and as such don't really need an implementation for us.
Closes https://github.com/rust-analyzer/rust-analyzer/issues/2212
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10633: fix: Implement most proc_macro span handling for other ABIs r=Veykril a=Veykril
Follow-up to #10378
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10634: minor: Drop resolver and `authors` manifest entry in `limit` r=lnicola a=lnicola
The new resolver is on by default in the 2021 edition.
bors r+
Co-authored-by: Laurențiu Nicola <lnicola@dend.ro>
10631: fix: Fix postfix completions panicking r=Veykril a=Veykril
Fixes https://github.com/rust-analyzer/rust-analyzer/issues/10243. I couldn't reproduce the panic with the given snippet, but this change should still guard against it.
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10603: fix: Don't resolve attributes to non attribute macros r=Veykril a=Veykril
Also changes `const`s to `static`s for `Limit`s, as we have interior mutability in those (though it is only used with a certain feature flag enabled).
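A standalone illustration of why `static` is the right choice once interior mutability is involved (this uses `AtomicUsize`, not the actual `Limit` type): a `const` is effectively inlined at every use site, so updates made through interior mutability land on a fresh temporary and are lost, whereas a `static` is a single shared place.
```rust
use std::sync::atomic::{AtomicUsize, Ordering};

const CONST_COUNTER: AtomicUsize = AtomicUsize::new(0);
static STATIC_COUNTER: AtomicUsize = AtomicUsize::new(0);

fn main() {
    // Each mention of the const produces a fresh value, so this update
    // goes to a temporary and is thrown away.
    CONST_COUNTER.fetch_add(1, Ordering::Relaxed);
    assert_eq!(CONST_COUNTER.load(Ordering::Relaxed), 0);

    // The static is one shared place, so the update is observable.
    STATIC_COUNTER.fetch_add(1, Ordering::Relaxed);
    assert_eq!(STATIC_COUNTER.load(Ordering::Relaxed), 1);
}
```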
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10563: feat: Make "Generate getter" assist use semantic info r=agluszak a=agluszak
This PR makes "Generate getter" assist use semantic info instead of dealing with types encoded as strings.
Getters for types which are:
- `Copy` no longer return references
- `AsRef<str>` (e.g. `String`) return `&str` (instead of `&String`)
- `AsRef<[T]>` (e.g. `Vec<T>`) return `&[T]` (instead of `&Vec<T>`)
- `AsRef<T>` (e.g. `Box<T>`) return `&T` (instead of `&Box<T>`)
- `Option<T>` return `Option<&T>` (instead of `&Option<T>`)
- `Result<T, E>` return `Result<&T, &E>` (instead of `&Result<T, E>`)
`String`, `Vec`, `Box` and `Option` were previously handled as special cases.
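A sketch of what the resulting signatures look like for a few of the cases above (the exact bodies the assist generates may differ slightly):
```rust
struct Person {
    name: String,
    nickname: Option<String>,
    scores: Vec<u32>,
}

impl Person {
    // `String` field: `&str` instead of `&String`.
    fn name(&self) -> &str {
        self.name.as_ref()
    }

    // `Option<T>` field: `Option<&T>` instead of `&Option<T>`.
    fn nickname(&self) -> Option<&String> {
        self.nickname.as_ref()
    }

    // `Vec<T>` field: `&[T]` instead of `&Vec<T>`.
    fn scores(&self) -> &[u32] {
        self.scores.as_ref()
    }
}

fn main() {
    let p = Person { name: "Ada".into(), nickname: None, scores: vec![1, 2, 3] };
    assert_eq!(p.name(), "Ada");
    assert_eq!(p.nickname(), None);
    assert_eq!(p.scores(), &[1, 2, 3]);
}
```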
Closes #10295
Co-authored-by: Andrzej Głuszak <gluszak.andrzej@gmail.com>
10387: Move `IdxRange` into la-arena r=Veykril a=arzg
Currently, `IdxRange` (named `IdRange`) is located in `hir_def::item_tree`, when really it isn’t specific to `hir_def` and could become part of la-arena. The rename from `IdRange` to `IdxRange` is to maintain consistency with the naming convention used throughout la-arena (`Idx` instead of `Id`, `RawIdx` instead of `RawId`). This PR also adds a few new APIs to la-arena on top of `IdxRange` for convenience, namely:
- indexing into an `Arena` by an `IdxRange` and getting a slice of values back
- creating an `IdxRange` from an inclusive range
Currently this PR also exposes a new `Arena::next_idx` method to make constructing inclusive `IdxRange`s using `IdxRange::new` easier; however, it would in my opinion be better to remove this, as it allows for easy creation of out-of-bounds `Idx`s, while `IdxRange::new_inclusive` mostly covers the same use case and is less error-prone.
I decided to bump the la-arena version to 0.3.0 from 0.2.0 because adding a new `Index` impl for `Arena` turned out to be a breaking change: I had to add a type hint in `crates/hir_def/src/body/scope.rs` when one wasn’t necessary before, since rustc couldn’t work out the type of a closure parameter now that there are multiple `Index` impls. I’m not sure whether this is the right decision, though.
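A usage sketch of the additions described above (assuming the la-arena API matches what this PR describes, i.e. `IdxRange::new_inclusive` and an `Index` impl that yields a slice):
```rust
use la_arena::{Arena, IdxRange};

fn main() {
    let mut arena: Arena<i32> = Arena::new();
    let first = arena.alloc(10);
    arena.alloc(20);
    let last = arena.alloc(30);

    // Creating an `IdxRange` from an inclusive range of indices...
    let range = IdxRange::new_inclusive(first..=last);

    // ...and indexing the arena with it to get a slice of values back.
    let values: &[i32] = &arena[range];
    assert_eq!(values, &[10, 20, 30]);
}
```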
Co-authored-by: Aramis Razzaghipour <aramisnoah@gmail.com>
10600: minor: Make some functions non-generic r=Veykril a=lnicola
This reduces the `.text` section size by 10192 bytes (0.064% 😢), with no apparent change in performance.
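The usual shape of this kind of change, shown as a generic sketch (not the specific functions touched in this PR): keep the generic shell thin and move the bulk of the body into a non-generic inner function, so only the thin shell gets monomorphized per caller type.
```rust
use std::fs;
use std::io;
use std::path::Path;

// The thin generic wrapper is instantiated once per caller type...
pub fn read_to_string(path: impl AsRef<Path>) -> io::Result<String> {
    // ...but the actual work is compiled exactly once.
    fn inner(path: &Path) -> io::Result<String> {
        fs::read_to_string(path)
    }
    inner(path.as_ref())
}

fn main() {
    // Both calls share the same non-generic `inner`.
    let _ = read_to_string("Cargo.toml");
    let _ = read_to_string(String::from("Cargo.toml"));
}
```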
Co-authored-by: Laurențiu Nicola <lnicola@dend.ro>