Some initial normalization method changes
1. Rename `AtExt::normalize` to `QueryNormalizeExt::query_normalize` (using the `QueryNormalizer`)
2. Introduce `NormalizeExt::normalize` to replace `partially_normalize_associated_types_in` (using the `AssocTypeNormalizer`)
3. Rename `FnCtxt::normalize_associated_types_in` to `FnCtxt::normalize`
4. Remove some other unused normalization fns in `Inherited` and `FnCtxt`
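For orientation, here is a rough sketch of what a call site of the new `NormalizeExt::normalize` looks like. This is rustc-internal code, so it only builds inside the compiler, and the import paths and exact signature here are recalled from memory rather than guaranteed:

```rust
use rustc_infer::infer::{InferCtxt, InferOk};
use rustc_middle::ty::{ParamEnv, Ty};
use rustc_trait_selection::traits::{NormalizeExt, ObligationCause};

// Hypothetical helper showing the call-site shape of the fulfillment-based
// normalization (the replacement for `partially_normalize_associated_types_in`).
// The returned `InferOk` carries the obligations that still need registering.
fn normalize_ty<'tcx>(
    infcx: &InferCtxt<'tcx>,
    cause: &ObligationCause<'tcx>,
    param_env: ParamEnv<'tcx>,
    ty: Ty<'tcx>,
) -> InferOk<'tcx, Ty<'tcx>> {
    infcx.at(cause, param_env).normalize(ty)
}
```

The query-based path keeps the same `infcx.at(...)` entry point but goes through `QueryNormalizeExt::query_normalize` instead.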
Also includes one drive-by change: we no longer create a `FnCtxt` inside `check_fn`, but pass one in. This means we don't need such weird `FnCtxt` construction logic.
Stacked on top of #104835 for convenience.
r? types
type alias impl trait: add tests showing that hidden type only outlives lifetimes that occur in bounds
fixes #103642

https://github.com/rust-lang/rust/pull/102417 only made sure that hidden types cannot outlive lifetimes other than the ones mentioned in bounds, but didn't allow us to actually infer anything from that.
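For concreteness, here is a minimal sketch (names invented, assuming the `type_alias_impl_trait` feature gate) of the shape of code these tests exercise:

```rust
#![feature(type_alias_impl_trait)]

// `'a` is the only lifetime that occurs in the opaque's bounds, so `'a` is the
// only lifetime the hidden type is required, and now known, to outlive.
type Opaque<'a> = impl Sized + 'a;

fn define<'a>(x: &'a str) -> Opaque<'a> {
    x
}
```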
cc `@aliemjay`
Prefer doc comments over `//`-comments in compiler
Doc comments are generally nicer: they show up in the documentation, they are shown in IDEs when you hover over other mentions of the item, etc. Thus it makes sense to use them instead of `//`-comments.
Separate lifetime ident from lifetime resolution in HIR
Drive-by: change how suggested generic args are computed.
Fixes https://github.com/rust-lang/rust/issues/103815
I recommend reviewing commit-by-commit.
Avoid `GenFuture` shim when compiling async constructs
Previously, async constructs would be lowered to "normal" generators, with an additional `from_generator` / `GenFuture` shim in between to convert from `Generator` to `Future`.
The compiler will now special-case these generators internally so that async constructs will *directly* implement `Future` without the need to go through the `from_generator` / `GenFuture` shim.
The primary motivation for this change was hiding this implementation detail in stack traces and debuginfo, but it can in theory also help the optimizer as there are fewer abstractions to see through.
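To make the shim concrete, here is a deliberately simplified stand-in for the old adapter, not the actual library code: the real `GenFuture` also threads the task `Context` through the generator's resume argument, which this toy version omits. It is written against the `generators` feature of that era (since renamed to coroutines):

```rust
#![feature(generators, generator_trait)]

use std::future::Future;
use std::ops::{Generator, GeneratorState};
use std::pin::Pin;
use std::task::{Context, Poll};

/// Toy stand-in for the old `GenFuture` shim: wrap a generator and implement
/// `Future` by resuming it, mapping `Yielded` to `Pending` and `Complete` to
/// `Ready`.
struct ToyGenFuture<G>(G);

impl<G, R> Future for ToyGenFuture<G>
where
    G: Generator<Yield = (), Return = R> + Unpin,
{
    type Output = R;

    fn poll(mut self: Pin<&mut Self>, _cx: &mut Context<'_>) -> Poll<R> {
        match Pin::new(&mut self.0).resume(()) {
            GeneratorState::Yielded(()) => Poll::Pending,
            GeneratorState::Complete(value) => Poll::Ready(value),
        }
    }
}
```

With this change, no such wrapper type shows up at all: the generator produced for an async construct implements `Future` directly.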
---
Given this demo code:
```rust
use std::backtrace::Backtrace;

pub async fn a(arg: u32) -> Backtrace {
    let bt = b().await;
    let _arg = arg;
    bt
}

pub async fn b() -> Backtrace {
    Backtrace::force_capture()
}
```
I would get the following with the latest stable compiler (on Windows):
```
4: async_codegen::b::async_fn$0
        at .\src\lib.rs:10
5: core::future::from_generator::impl$1::poll<enum2$<async_codegen::b::async_fn_env$0> >
        at /rustc/897e37553bba8b42751c67658967889d11ecd120\library\core\src\future\mod.rs:91
6: async_codegen::a::async_fn$0
        at .\src\lib.rs:4
7: core::future::from_generator::impl$1::poll<enum2$<async_codegen::a::async_fn_env$0> >
        at /rustc/897e37553bba8b42751c67658967889d11ecd120\library\core\src\future\mod.rs:91
```
whereas now I get a much cleaner stack trace:
```
3: async_codegen::b::async_fn$0
        at .\src\lib.rs:10
4: async_codegen::a::async_fn$0
        at .\src\lib.rs:4
```
Use `tcx.require_lang_item` instead of unwrapping lang items
I clearly remember Esteban telling me that there is `require_lang_item`, but he was on his phone at the time and I couldn't find it, so I didn't use it. I stumbled on it today, so here we are :)
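A small hypothetical before/after of the call-site shape, using the rustc-internal API as it existed around this time (the exact signature may have drifted since):

```rust
use rustc_hir::def_id::DefId;
use rustc_hir::LangItem;
use rustc_middle::ty::TyCtxt;

// Hypothetical helper: `require_lang_item` reports a proper "lang item
// required" diagnostic when the item is missing, instead of panicking via
// `unwrap` on the `Option` returned by the lang-item getters.
fn owned_box_did(tcx: TyCtxt<'_>) -> DefId {
    // Before: tcx.lang_items().owned_box().unwrap()
    tcx.require_lang_item(LangItem::OwnedBox, None)
}
```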
Support using `Self` or projections inside an RPIT/async fn
I reuse the same idea as https://github.com/rust-lang/rust/pull/103449: using variances to encode whether a lifetime parameter is captured by an impl-trait.
The current implementation of async and RPIT replaces all lifetimes from the parent generics with `'static`. This PR changes the scheme:
```rust
impl<'a> Foo<'a> {
    fn foo<'b, T>() -> impl Into<Self> + 'b { ... }
}

// The opaque type above is desugared roughly as:
opaque Foo::<'_a>::foo::<'_b, T>::opaque<'b>: Into<Foo<'_a>> + 'b;

impl<'a> Foo<'a> {
    // OLD
    fn foo<'b, T>() -> Foo::<'static>::foo::<'static, T>::opaque::<'b> { ... }
                             ^^^^^^^ the `Self` becomes `Foo<'static>`
    // NEW
    fn foo<'b, T>() -> Foo::<'a>::foo::<'b, T>::opaque::<'b> { ... }
                             ^^ the `Self` stays `Foo<'a>`
}
```
The same issue exists with projections. In the example, substitute `Self` with `<T as Trait<'b>>::Assoc` in the sugared version, and `Foo<'_a>` with `<T as Trait<'_b>>::Assoc` in the desugared one.
This allows us to support `Self` in impl-trait, since we no longer replace lifetimes with `'static`. The same trick allows using projections like `T::Assoc` where `Self` is allowed. The feature is gated behind the `impl_trait_projections` feature gate.
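As a rough illustration of what the gate unlocks, here is a sketch assuming a nightly with `impl_trait_projections` (the `Foo` type mirrors the example above):

```rust
#![feature(impl_trait_projections)]

struct Foo<'a>(&'a str);

impl<'a> Foo<'a> {
    // Previously this was rejected because the opaque type would have had to
    // capture the parent lifetime `'a` through `Self`; with the feature gate,
    // `Self` stays `Foo<'a>` and the code is accepted.
    fn foo<'b>(&'b self) -> impl Into<Self> + 'b {
        Foo(self.0)
    }
}
```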
The implementation relies on tweaking the rules for opaques in two places:
- we only relate substs that correspond to captured lifetimes during TypeRelation;
- we only list captured lifetimes in choice region computation.
For simplicity, I encoded the "capturedness" of lifetimes as a variance, `Bivariant` vs `Invariant` for unused vs captured lifetimes. The `variances_of` query used to ICE for opaques.
Impl-traits that do not reference `Self` or projections will have their variances as:
- `o` (invariant) for each parent type or const;
- `*` (bivariant) for each parent lifetime --> will not participate in borrowck;
- `o` (invariant) for each own lifetime.
Impl-traits that do reference `Self` and/or projections will have some parent lifetimes marked `o` (as in the example above), and those will participate in type relation and borrowck. In the example above, `variances_of(opaque) = ['_a: o, '_b: *, T: o, 'b: o]`.
r? types
cc `@compiler-errors`, as you asked about the issue with `Self` and projections.