10699: internal: Make CompletionItem `label` and `lookup` fields `SmolStr`s r=Veykril a=Veykril
This replaces a bunch of `String` clones with `SmolStr` clones, though it also makes a few parts a bit more expensive (mainly those that build labels from `format!`ted strings).
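As a rough illustration of the trade-off (a minimal sketch using the `smol_str` crate directly, not the actual completion code):
```rust
use smol_str::SmolStr;

fn main() {
    // Cloning a SmolStr is cheap: short strings are stored inline and
    // longer ones are reference-counted, so no fresh allocation happens.
    let label = SmolStr::new("push");
    let copy = label.clone();

    // Labels built with format! still allocate a String first and then
    // convert it, which is the "a bit more expensive" case mentioned above.
    let formatted = SmolStr::from(format!("{}()", copy));
    assert_eq!(formatted.as_str(), "push()");
}
```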
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10704: internal: Short-circuit `descend_into_macros_single` r=Veykril a=Veykril
There is no need to descend everything if all we are interested in is the first mapping.
This brings the `descend_into_macros` timing in highlighting of `rust-analyzer/src/config.rs` from `154ms - descend_into_macros (2190 calls)` down to `24ms - descend_into_macros (2190 calls)`, since we use the single variant there (this will regress once we want to highlight multiple namespaces again, though).
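The shape of the change, as a hypothetical sketch (not the real `Semantics` API): with a lazy iterator over mappings, asking only for the first one means the rest are never computed.
```rust
// Hypothetical helper illustrating the short-circuit: the "single"
// variant takes the first mapping and stops, instead of descending
// into every expansion.
fn descend_single<T>(mappings: impl IntoIterator<Item = T>) -> Option<T> {
    // `next()` pulls exactly one item; with a lazy iterator the
    // remaining mappings are never produced.
    mappings.into_iter().next()
}

fn main() {
    let first = descend_single((0..).map(|i| i * 2));
    assert_eq!(first, Some(0));
}
```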
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10703: internal: Don't check items for macro calls if they have no attributes r=Veykril a=Veykril
Turns out that when highlighting we currently populate the `DynMap`s of pretty much every item in a file; who would've known that would be so costly...
Shaves off 250 ms for the integrated benchmark on `rust-analyzer/src/config.rs`.
We are still looking at a hefty `154ms - descend_into_macros (2190 calls)`, but I feel like this is slowly approaching plain call overhead.
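The check itself is cheap; here is a hypothetical sketch of the early-out with illustrative types (not the real rust-analyzer ones):
```rust
// Illustrative stand-in for an item in a file: only the attribute
// list matters for this sketch.
struct Item {
    attributes: Vec<String>,
}

fn collect_attr_macro_calls(item: &Item) {
    // An item without attributes cannot contain an attribute macro
    // call, so skip the expensive resolution entirely.
    if item.attributes.is_empty() {
        return;
    }
    // ... comparatively expensive macro-call resolution would go here ...
}

fn main() {
    collect_attr_macro_calls(&Item { attributes: Vec::new() });
}
```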
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10701: internal: Cache ast::MacroCalls to their expansions in Semantics::descend_into_macros_impl r=Veykril a=Veykril
Saves ~45ms when highlighting `rust-analyzer/src/config.rs` for me
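A minimal sketch of the memoization pattern, with placeholder types standing in for the real hir ones:
```rust
use std::collections::HashMap;

// Placeholder types: the real code maps ast::MacroCall nodes to their
// expansion results inside Semantics::descend_into_macros_impl.
type MacroCallId = u32;
type Expansion = String;

fn run_expansion(call: MacroCallId) -> Expansion {
    format!("expansion of call {}", call) // stands in for the real expansion work
}

#[derive(Default)]
struct ExpansionCache {
    cache: HashMap<MacroCallId, Expansion>,
}

impl ExpansionCache {
    // Expanding the same call twice during one descent now hits the
    // cache instead of re-running the expansion.
    fn expand(&mut self, call: MacroCallId) -> &Expansion {
        self.cache.entry(call).or_insert_with(|| run_expansion(call))
    }
}

fn main() {
    let mut cache = ExpansionCache::default();
    cache.expand(1);
    cache.expand(1); // second lookup is a cache hit
}
```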
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10686: internal: Add `Semantics::original_ast_node` for upmapping nodes out of macro files r=Veykril a=Veykril
Fixes inserting imports into macro-expanded files, which then resulted in text edits being applied to very wrong text ranges.
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10623: internal: replace L_DOLLAR/R_DOLLAR with parenthesis hack r=matklad a=matklad
The general problem we are dealing with here is this:
```
macro_rules! thrice {
    ($e:expr) => { $e * 3 }
}

fn main() {
    let x = thrice!(1 + 2);
}
```
We really want `x` here to be 9 rather than 7.
The way rustc solves this is rather ad-hoc. In rustc, token trees are
allowed to include whole AST fragments, so `1 + 2` is passed through macro
expansion as a single unit. This is a significant violation of the token
tree model.
In rust-analyzer, we intended to handle this in a more elegant way,
using token trees with "invisible" delimiters. The idea was that we
introduce a new kind of parenthesis, "left $"/"right $", and let the
parser handle it intelligently.
The idea was inspired by the relevant comment in the proc_macro crate:
https://doc.rust-lang.org/stable/proc_macro/enum.Delimiter.html#variant.None
> An implicit delimiter, that may, for example, appear around tokens
> coming from a “macro variable” $var. It is important to preserve
> operator priorities in cases like $var * 3 where $var is 1 + 2.
> Implicit delimiters might not survive roundtrip of a token stream
> through a string.
Now that we are older and wiser, we conclude that the idea doesn't work.
_First_, the comment in the proc-macro crate is wishful thinking. Rustc
currently completely ignores `None` delimiters. It solves the `(1 + 2) * 3`
problem by having magical token trees which can't be duplicated:
* https://rust-lang.zulipchat.com/#narrow/stream/185405-t-compiler.2Frust-analyzer/topic/TIL.20that.20token.20streams.20are.20magic
* https://rust-lang.zulipchat.com/#narrow/stream/131828-t-compiler/topic/Handling.20of.20Delimiter.3A.3ANone.20by.20the.20parser
_Second_, it's not like our implementation in rust-analyzer works either. We
special-case expressions (as opposed to treating all kinds of `$var`
captures the same), and we don't know how parser error recovery should
work with these dollar-parentheses.
So, in this PR we simplify the whole thing away by not pretending that
we are doing something proper and instead just explicitly special-casing
expressions by wrapping them into real `()`.
In the future, to maintain bug-parity with `rustc`, we will probably add
an explicit `CAPTURED_EXPR` *token* which the parser can explicitly
account for.
If/when rustc starts handling `Delimiter::None` properly, we'll port that
logic as well, in addition to the special handling.
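For reference, the arithmetic that the wrapping preserves:
```rust
fn main() {
    // Pasting the captured tokens verbatim gives `1 + 2 * 3`, which is 7
    // because of operator precedence.
    assert_eq!(1 + 2 * 3, 7);

    // Wrapping the captured expression in real parentheses keeps the
    // intended grouping: `(1 + 2) * 3` is 9.
    assert_eq!((1 + 2) * 3, 9);
}
```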
Co-authored-by: Aleksey Kladov <aleksey.kladov@gmail.com>
10629: Add assist for replacing turbofish with explicit type. r=Veykril a=terrynsun
Converts a `::<_>` turbofish into an explicit type annotation on the binding.
```
let args = args.collect::<Vec<String>>();
```
->
```
let args: Vec<String> = args.collect();
```
Closes #10285
Co-authored-by: Terry Sun <terrynsun@gmail.com>
10649: internal: Remove `CompletionKind` in favor of `CompletionItemKind` r=Veykril a=Veykril
and move some more tests around
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10642: minor: Add dummy impls for `trace_macros` and `log_syntax` r=Veykril a=Veykril
Both of these are macros for debugging macros, and as such they don't really need an implementation on our side.
Closes https://github.com/rust-analyzer/rust-analyzer/issues/2212
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10633: fix: Implement most proc_macro span handling for other ABIs r=Veykril a=Veykril
Follow-up to #10378
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10634: minor: Drop resolver and `authors` manifest entry in `limit` r=lnicola a=lnicola
The new resolver is on by default in the 2021 edition.
bors r+
Co-authored-by: Laurențiu Nicola <lnicola@dend.ro>
10631: fix: Fix postfix completions panicking r=Veykril a=Veykril
Fixes https://github.com/rust-analyzer/rust-analyzer/issues/10243. I couldn't reproduce the panic with the given snippet, but this change should still guard against it.
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10603: fix: Don't resolve attributes to non attribute macros r=Veykril a=Veykril
Also changes `const`s to `static`s for `Limit`s, as those have interior mutability (though it is only used with a certain feature flag enabled).
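A minimal sketch of why this matters, using an illustrative atomic counter instead of the real `Limit` type: every use of a `const` creates a fresh copy of the value, so updates made through interior mutability are silently lost, while a `static` is one shared place.
```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Illustrative stand-in for `Limit`: a value with interior mutability.
struct Counter(AtomicUsize);

// With `const COUNTER: Counter = ...`, each use of COUNTER would be a
// fresh temporary, so the fetch_add below would update a copy that is
// immediately thrown away. A `static` is a single shared place.
static COUNTER: Counter = Counter(AtomicUsize::new(0));

fn main() {
    COUNTER.0.fetch_add(1, Ordering::Relaxed);
    assert_eq!(COUNTER.0.load(Ordering::Relaxed), 1);
}
```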
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10563: feat: Make "Generate getter" assist use semantic info r=agluszak a=agluszak
This PR makes the "Generate getter" assist use semantic info instead of dealing with types encoded as strings.
Getters for types which are:
- `Copy` no longer return references
- `AsRef<str>` (i.e. `String`) return `&str` (instead of `&String`)
- `AsRef<[T]>` (i.e. `Vec<T>`) return `&[T]` (instead of `&Vec<T>`)
- `AsRef<T>` (i.e. `Box<T>`) return `&T` (instead of `&Box<T>`)
- `Option<T>` return `Option<&T>` (instead of `&Option<T>`)
- `Result<T, E>` return `Result<&T, &E>` (instead of `&Result<T, E>`)
String, Vec, Box and Option were previously handled as special cases; see the sketch below for the new `String` behaviour.
Closes #10295
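For example, for a `String` field the assist would now produce a getter along these lines (an illustrative sketch rather than the exact generated code):
```rust
struct Person {
    name: String,
}

impl Person {
    // Returns `&str` via `AsRef<str>` instead of the old `&String`.
    fn name(&self) -> &str {
        self.name.as_ref()
    }
}
```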
Co-authored-by: Andrzej Głuszak <gluszak.andrzej@gmail.com>
10387: Move `IdxRange` into la-arena r=Veykril a=arzg
Currently, `IdxRange` (named `IdRange`) is located in `hir_def::item_tree`, when really it isn’t specific to `hir_def` and could become part of la-arena. The rename from `IdRange` to `IdxRange` is to maintain consistency with the naming convention used throughout la-arena (`Idx` instead of `Id`, `RawIdx` instead of `RawId`). This PR also adds a few new APIs to la-arena on top of `IdxRange` for convenience, namely:
- indexing into an `Arena` by an `IdxRange` and getting a slice of values back
- creating an `IdxRange` from an inclusive range
Currently this PR also exposes a new `Arena::next_idx` method to make constructing inclusive `IdxRange`s using `IdxRange::new` easier (see the sketch below); however, it would in my opinion be better to remove this, as it allows for easy creation of out-of-bounds `Idx`s, while `IdxRange::new_inclusive` mostly covers the same use case and is less error-prone.
I decided to bump the la-arena version to 0.3.0 from 0.2.0 because adding a new `Index` impl for `Arena` turned out to be a breaking change: I had to add a type hint in `crates/hir_def/src/body/scope.rs` when one wasn’t necessary before, since rustc couldn’t work out the type of a closure parameter now that there are multiple `Index` impls. I’m not sure whether this is the right decision, though.
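A rough usage sketch of the new APIs described above (assuming the la-arena 0.3 surface from this PR):
```rust
use la_arena::{Arena, IdxRange};

fn main() {
    let mut arena: Arena<u32> = Arena::new();
    let first = arena.alloc(1);
    arena.alloc(2);
    let last = arena.alloc(3);

    // Create a range from an inclusive range of indices...
    let range = IdxRange::new_inclusive(first..=last);

    // ...and index the arena by it to get a slice of values back.
    assert_eq!(arena[range].to_vec(), vec![1, 2, 3]);
}
```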
Co-authored-by: Aramis Razzaghipour <aramisnoah@gmail.com>
10600: minor: Make some functions non-generic r=Veykril a=lnicola
This reduces `text` size by 10192 bytes (0.064% 😢), with no apparent change in performance.
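One common shape for such a change (a sketch of the general technique, not necessarily the exact functions touched here): keep a thin generic shim and move the body into a non-generic inner function, so the body is monomorphized only once.
```rust
// Generic outer function: only the thin conversion is duplicated per
// instantiation; the actual work lives in the non-generic inner fn
// and is compiled exactly once.
fn write_label(label: impl AsRef<str>) {
    fn inner(label: &str) {
        println!("label: {}", label);
    }
    inner(label.as_ref());
}

fn main() {
    write_label("foo");                // &str instantiation
    write_label(String::from("bar")); // String instantiation
}
```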
Co-authored-by: Laurențiu Nicola <lnicola@dend.ro>
10417: feat(assist): add new assist to unwrap the result return type r=bnjjj a=bnjjj
Does the opposite of the "wrap the return type in Result" assist.
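A hypothetical before/after (illustrative function, mirroring the style of the turbofish example above):
```
fn fetch_count() -> Result<u32, String> {
    Ok(1)
}
```
->
```
fn fetch_count() -> u32 {
    1
}
```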
Co-authored-by: Benjamin Coenen <5719034+bnjjj@users.noreply.github.com>
Co-authored-by: Coenen Benjamin <benjamin.coenen@hotmail.com>
10542: Use workspace cargo to fetch rust source's metadata r=lnicola a=Alexendoo
Previously the detected cargo was the global one, as it used the
directory of the rust source, which doesn't pick up the local override.
This fixes the case in clippy where the local rust toolchain is a recent
nightly that has a 2021 edition Cargo.toml. The global (stable) cargo
returns an error when attempting to parse it.
Fixes #10445
Co-authored-by: Alex Macleod <alex@macleod.io>
10552: fix: Fix missing_fields diagnostic fix replacing wrong text ranges r=Veykril a=Veykril
Fixes #5393 by replacing the problematic behaviour there with a new "problem":
It replaces the correct range now, but it potentially discards the whitespace in the macro input. This is the best we can do currently until we get a formatter.
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10543: Narrow add_missing_match_arms assist range r=Veykril a=antonfirsov
Contributes to #10220 with logic borrowed from #10267.
Note: if anyone has recommendations for further analyzers to check, I'm happy to do so (it's hard to do on my own, as I'm completely new to the language).
Co-authored-by: Anton Firszov <antonfir@gmail.com>
10491: Support nested type on replace if let with match r=k-nasa a=k-nasa
## Why
close: https://github.com/rust-analyzer/rust-analyzer/issues/8690
Currently, replacing if-let with match can't produce exhaustive pattern code.
This was because the `else` conversion used specific types (e.g. Option, Result) instead of wildcards.
I think generating non-exhaustive patterns is more of a problem than losing the benefit of using the concrete types.
How about using wildcards in `else`?
Is this change policy acceptable?
## What
- using wildcards on `make_else_arm`
- Change test cases
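A sketch of the proposed output, with a wildcard arm replacing the previously generated concrete pattern (`value`, `use_it`, and `fallback` are illustrative names, not from the actual change):
```
if let Some(x) = value {
    use_it(x)
} else {
    fallback()
}
```
->
```
match value {
    Some(x) => use_it(x),
    _ => fallback(),
}
```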
Co-authored-by: k-nasa <htilcs1115@gmail.com>
10546: feat: Implement promote_local_to_const assist r=Veykril a=Veykril
Fixes #7692; that is, one can now invoke the `extract_variable` assist on something and then follow that up with this assist to turn it into a const.
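A hypothetical sketch of what this assist does to a local created by `extract_variable` (names and the exact const naming are illustrative):
```
fn compute() -> usize {
    let megabyte = 1024 * 1024;
    megabyte * 4
}
```
->
```
fn compute() -> usize {
    const MEGABYTE: usize = 1024 * 1024;
    MEGABYTE * 4
}
```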
bors r+
Co-authored-by: Lukas Wirth <lukastw97@gmail.com>
10539: Add "generate delegate methods" assist r=Veykril a=yoshuawuyts
_Co-authored with @rylev._
This patch adds a new assist: "generate delegate method" which creates a method that calls to a method defined on an inner field. Delegation is common when authoring newtypes, and having IDE support for this is the best way we can make this easier to author in Rust, bar adding language-level support for it. Thanks!
Closes #5944.
## Example
__before__
```rust
struct Age(u8);

impl Age {
    fn age(&self) -> u8 {
        self.0
    }
}

struct Person {
    ag$0e: Age,
}
```
__after__
```rust
struct Age(u8);

impl Age {
    fn age(&self) -> u8 {
        self.0
    }
}

struct Person {
    age: Age,
}

impl Person {
    $0fn age(&self) -> u8 {
        self.age.age()
    }
}
```
Co-authored-by: Ryan Levick <me@ryanlevick.com>
Co-authored-by: Yoshua Wuyts <yoshuawuyts@gmail.com>