Currently, `Decoder` implementations are not given the tuple arity as a
parameter to `read_tuple`, which forces every encoder/decoder pair to
serialize the arity along with the elements. The tuple arity is always known
statically at the decode site, because it is part of the tuple's type, so it
can instead be provided as an argument to `read_tuple`, just as it is to
`read_struct`.
The upside to this is that serialized tuples could become smaller in
encoder/decoder implementations which choose not to serialize type
(arity) information. For example, @TyOverby's
[binary-encode](https://github.com/TyOverby/binary-encode) format is
currently forced to serialize the tuple-arity along with every tuple,
despite the information being statically known at the decode site.
A downside to this change is that the tuple arity of serialized tuples can no
longer be checked automatically during deserialization. However, for formats
that do serialize the tuple arity, either explicitly (rbml) or implicitly
(json), this check can be added to the `read_tuple` method.
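A minimal sketch of what such a check might look like, assuming a hypothetical
decoder type (this is not the actual `serialize::Decoder` trait; the names and
error handling are made up for illustration):

``` rust
// `expected` is the statically known arity that the caller now passes in; a
// format that also stores the arity in its output can compare the two and
// reject a mismatch instead of silently mis-reading the elements.
struct JsonishDecoder {
    serialized_arity: uint, // arity read back from the serialized data
}

impl JsonishDecoder {
    fn read_tuple(&mut self, expected: uint) -> Result<(), String> {
        if self.serialized_arity != expected {
            return Err(format!("expected a tuple of arity {}, found arity {}",
                               expected, self.serialized_arity));
        }
        // ... decode the `expected` elements here ...
        Ok(())
    }
}
```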
The signatures of `Deserialize::read_tuple` and
`Deserialize::read_tuple_struct` have changed, and thus binary
backwards-compatibility is broken. This change does *not* force serialization
formats to change, and thus does not break decoding of values serialized prior
to this change.
[breaking-change]
In some obscure circumstances, failure to do this can cause unsubstituted
type parameters to show up where they aren't expected, causing an ICE.
Closes #18514
As part of the collections reform RFC, this commit removes all collections
traits in favor of inherent methods on collections themselves. All methods
should continue to be available on all collections.
This is a breaking change: all of the collections traits have been removed
and are no longer in the prelude. To update old code, move the trait
implementations to inherent implementations directly on the type itself, as
sketched below. Note that some traits had default methods, which will also
need to be implemented to maintain backwards compatibility.
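For illustration, here is roughly what that migration looks like for a
hypothetical user-defined collection (`MyMap` and its `entries` field are made
up for this example):

``` rust
struct MyMap {
    entries: Vec<(uint, uint)>,
}

// Before this change, `len` would typically come from a collections trait
// implementation, e.g.:
//
//     impl Collection for MyMap {
//         fn len(&self) -> uint { self.entries.len() }
//     }
//
// After this change, the same method is written as an inherent impl on the
// type itself:
impl MyMap {
    fn len(&self) -> uint { self.entries.len() }
}
```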
[breaking-change]
cc #18424
Methods that used to take `ToCStr` implementors by value now take them by reference. In particular, this breaks some uses of `Command`:
``` rust
Command::new("foo"); // Still works
Command::new(path) -> Command::new(&path)
cmd.arg(string) -> cmd.arg(&string) or cmd.arg(string.as_slice())
```
[breaking-change]
---
It may be sensible to remove `impl ToCStr for String` since:
- We're getting `impl Deref<str> for String`, so `string.to_cstr()` would still work
- Callers of `Command` methods could still write `cmd.arg(string[..])` instead of `cmd.arg(&string)`.
But, I'm leaving that up to the library stabilization process.
r? @aturon
cc #16918
On some Windows versions of GDB this is more stable than setting breakpoints via function names. This is also something I have wanted to do for some time, because it makes the tests more consistent.
@brson:
These changes are in response to issue #17540. It works on my machine with the toolchain mentioned in the issue. In order to find out if the problem is really worked around, we also need to make the build bots use the newer GDB version again.
Teach variance checker about the lifetime bounds that appear in trait object types.
[breaking-change] This patch fixes a hole in the type system: lifetime parameters that were used only in trait object types were not being checked by the variance checker. It's hard to characterize precisely the changes that might be needed to fix affected code.
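A rough illustration of the kind of code that falls into this hole (the type
names are hypothetical; whether a particular use now produces an error depends
on how the lifetime is actually used):

``` rust
trait Message {}

// `'a` appears only inside the trait object's lifetime bound. Previously such
// a parameter escaped variance checking; code that relied on the resulting,
// overly permissive variance may now be rejected.
struct Mailbox<'a> {
    inbox: Box<Message + 'a>,
}
```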
cc #18262 (this fixes the test case by @jakub- but I am not sure if this is the same issue that @alexcrichton was reporting)
r? @pnkfelix
Fixes #18205
As with last time (where I marked .woff files as binary so that git does
not perform newline normalization), it's unclear how it got corrupted in
the first place.
- The signatures of the `*_equiv` methods of `HashMap` and similar structures have changed, and now require one less level of indirection. Change your code from:
``` rust
hashmap.find_equiv(&"Hello");
hashmap.find_equiv(&&[0u8, 1, 2]);
```
to:
``` rust
hashmap.find_equiv("Hello");
hashmap.find_equiv(&[0u8, 1, 2]);
```
- The generic parameter `T` of the `Hasher::hash<T>` method has become `Sized?`. Downstream code must add `Sized?` to that method in its implementations. For example:
``` rust
impl Hasher<FnvState> for FnvHasher {
    fn hash<T: Hash<FnvState>>(&self, t: &T) -> u64 { /* .. */ }
}
```
must be changed to:
``` rust
impl Hasher<FnvState> for FnvHasher {
    fn hash<Sized? T: Hash<FnvState>>(&self, t: &T) -> u64 { /* .. */ }
    //      ^^^^^^
}
```
[breaking-change]
---
After review I'll squash the commits and update the commit message with the above paragraph.
r? @aturon
cc #16918
This fixes ICEs caused by late-bound lifetimes ending up in argument
datum types and being used in cleanup - user `Drop` impls would then
fail to monomorphize if the type was used to look up the impl of a
method call - which happens in trans now, I presume for multidispatch.
This PR aims to improve the readability of diagnostic messages that involve unresolved type variables. Currently, messages like the following:
```rust
mismatched types: expected `core::result::Result<uint,()>`, found `core::option::Option<<generic #1>>`
<anon>:6 let a: Result<uint, ()> = None;
^~~~
mismatched types: expected `&mut <generic #2>`, found `uint`
<anon>:7 f(42u);
^~~
```
tend to appear unapproachable to new users. [0] While specific type variable IDs are valuable in
diagnostics that deal with more than one such variable, in practice many messages
only mention one. In those cases, leaving out the specific number makes the messages
slightly less terrifying.
```rust
mismatched types: expected `core::result::Result<uint, ()>`, found `core::option::Option<_>`
<anon>:6 let a: Result<uint, ()> = None;
^~~~
mismatched types: expected `&mut _`, found `uint`
<anon>:7 f(42u);
^~~
```
As you can see, I also tweaked the aesthetics slightly by changing type variables to use the type hole syntax `_`. For integer variables, the syntax used is:
```rust
mismatched types: expected `core::result::Result<uint, ()>`, found `core::option::Option<_#1i>`
<anon>:6 let a: Result<uint, ()> = Some(1);
```
and float variables:
```rust
mismatched types: expected `core::result::Result<uint, ()>`, found `core::option::Option<_#1f>`
<anon>:6 let a: Result<uint, ()> = Some(0.5);
```
[0] https://twitter.com/coda/status/517713085465772032
Closes https://github.com/rust-lang/rust/issues/2632.
Closes https://github.com/rust-lang/rust/issues/3404.
Closes https://github.com/rust-lang/rust/issues/18426.
This is an implementation of the rustc bits of [RFC 403][rfc]. This adds a new
flag to the compiler, `-l`, as well as tweaking the `include!` macro (and
related source-centric macros).
The compiler's new `-l` flag is used to link libraries in from the command line.
This flag stacks with `#[link]` directives already found in the program. The
purpose of this flag, also stated in the RFC, is to ease linking against native
libraries which have wildly different requirements across platforms and even
within distributions of one platform. This flag accepts a string of the form
`NAME[:KIND]`, where `KIND` is optional and, if present, is one of `dylib`,
`static`, or `framework`. Passing this flag is roughly equivalent to writing
the corresponding `#[link]` directive in the program.
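For example (a minimal sketch; the library name `foo` is made up for
illustration), `-l foo:static` on the command line expresses roughly the same
thing as this directive in the source:

``` rust
// Link the native library `foo` statically, as `-l foo:static` would.
#[link(name = "foo", kind = "static")]
extern {}
```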
The `include!` macro has been modified to recursively expand macros, allowing,
for example, `concat!` to be used as an argument. The use case spelled out in
RFC 403 is to combine this with `env!` to include compile-time generated
files. The macro also received a bit of tweaking so that it can expand to
either an expression or a series of items, depending on the context it's used
in.
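A small sketch of that use case (the `OUT_DIR` environment variable and the
`generated.rs` file name are assumptions for illustration, not part of this
change):

``` rust
// Builds the path of a generated file at compile time and includes its
// contents; `include!` now expands the nested `concat!`/`env!` invocations.
include!(concat!(env!("OUT_DIR"), "/generated.rs"));
```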
[rfc]: https://github.com/rust-lang/rfcs/pull/403