* Rename `utf16_items` to `decode_utf16`. "Items" is meaningless.
* Move it to `rustc_unicode::char`, exposed in `std::char`.
* Generalize it to any `u16` iterable, not just `&[u16]`.
* Make it yield `Result` instead of a custom `Utf16Item` enum that was isomorphic to `Result`. This enables using the `FromIterator for Result` impl.
* Add a `REPLACEMENT_CHARACTER` constant.
* Document how `result.unwrap_or(REPLACEMENT_CHARACTER)` replaces `Utf16Item::to_char_lossy`.
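A minimal usage sketch, assuming the API as described above (collecting into `Result` relies on the `FromIterator for Result` impl; the concrete error type is left inferred here):

```rust
use std::char::{decode_utf16, REPLACEMENT_CHARACTER};

fn main() {
    // "𝄞mus" encoded as UTF-16, followed by an unpaired surrogate.
    let v = [0xD834, 0xDD1E, 0x006D, 0x0075, 0x0073, 0xDD1E];

    // Lossy decoding: this unwrap_or pattern replaces Utf16Item::to_char_lossy.
    let lossy: String = decode_utf16(v.iter().cloned())
        .map(|r| r.unwrap_or(REPLACEMENT_CHARACTER))
        .collect();
    assert_eq!(lossy, "𝄞mus\u{FFFD}");

    // Strict decoding: collecting into Result stops at the first unpaired surrogate.
    let strict: Result<String, _> = decode_utf16(v.iter().cloned()).collect();
    assert!(strict.is_err());
}
```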
Fix (and extend) src/test/run-pass/foreign-call-no-runtime.rs
While going over various problems signaled by valgrind when running `make check` on a build configured with `--enable-valgrind`, I discovered a bug in this test case.
Namely, the test case was previously creating an `i32` (originally an `int`, a.k.a. `isize`, but then we changed the name and the fallback rules), and then reading from a `*const isize`. Valgrind rightly complains about this, since we are reading an 8-byte value on 64-bit systems, but in principle only 4 bytes have been initialized.
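A minimal sketch of the offending pattern (hypothetical code, not the actual test):

```rust
fn main() {
    let x: i32 = 42; // only 4 bytes are initialized
    let p = &x as *const i32 as *const isize;
    // On a 64-bit target this read covers 8 bytes, half of which were never
    // written -- exactly what valgrind reports as use of uninitialized memory.
    let y = unsafe { *p };
    println!("{}", y);
}
```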
(I wish this was the only valgrind unclean test, but unfortunately there are a bunch more. This was just the easiest/first one that I dissected.)
The methods gave wrong results for TyIs and TyUs, whose suffix len
should be 5 nowadays (the suffixes are `isize` and `usize`). But since
they were only used for parsing, and have been unneeded for that since
606a309d, remove them rather than fixing them.
I hope this is ok to do, since all of rustc is considered unstable...
Issue #27583 was caused by the fact that `LUB('a,'b)` yielded `'static`, even if there existed a region `'tcx:'a+'b`. This PR replaces the old very hacky code for computing how free regions relate to one another with something rather more robust. This solves the issue for #27583, though I think that similar bizarro bugs can no doubt arise in other ways -- the root of the problem is that the region-inference code was written in an era when a LUB always existed, but that hasn't held for some time. To *truly* solve this problem, it needs to be generalized to cope with that reality. But let's leave that battle for another day.
r? @aturon
These commits move libcore into a state so that it's ready for stabilization, performing some minor cleanup:
* The primitive modules for integers in the standard library were all removed from the source tree as they were just straight reexports of the libcore variants.
* The `core::atomic` module now lives in `core::sync::atomic`. The `core::sync` module is otherwise empty, but ripe for expansion! (A usage sketch follows this list.)
* The `core::prelude::v1` module was stabilized after auditing that it is a subset of the standard library's prelude plus some primitive extension traits (char, str, and slice).
* Some unstable hacks for float parsing errors were shifted around so they no longer rely on the same unstable machinery (e.g. the `flt2dec` module is now used for "privacy").
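A minimal sketch of the new atomics path, assuming only the module move described above (shown through the `std` reexport so it runs as a standalone program):

```rust
// Atomics are now reached through the `sync` module; `std::sync::atomic`
// reexports `core::sync::atomic`.
use std::sync::atomic::{AtomicUsize, Ordering};

fn main() {
    let counter = AtomicUsize::new(0);
    counter.fetch_add(1, Ordering::SeqCst);
    assert_eq!(counter.load(Ordering::SeqCst), 1);
}
```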
After this commit, the remaining large unstable functionality specific to libcore is:
* `raw`, `intrinsics`, `nonzero`, `array`, `panicking`, `simd` -- these modules are all unstable or not reexported in the standard library, so they simply remain in the same state as before
* `num::Float` - this extension trait for floats needs to be audited for functionality (much of that is happening in #27823) and may also want to be renamed to `FloatExt` or `F32Ext`/`F64Ext`.
* Should the extension traits for primitives be stabilized in libcore?
I believe other unstable pieces are not isolated to just libcore but also affect the standard library.
cc #27701
Returning a primitive bool results in a somewhat confusing API: does
`true` indicate success (i.e. no timeout), or that a timeout has
occurred? An explicitly named enum makes it clearer.
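A hedged sketch of the resulting call site, assuming a `WaitTimeoutResult`-style return value with a `timed_out()` accessor (names follow today's `std::sync`; the exact names in this change may differ):

```rust
use std::sync::{Condvar, Mutex};
use std::time::Duration;

fn main() {
    let lock = Mutex::new(());
    let cvar = Condvar::new();
    let guard = lock.lock().unwrap();

    // Nobody will ever notify this condvar, so the wait is expected to time out.
    let (_guard, result) = cvar
        .wait_timeout(guard, Duration::from_millis(10))
        .unwrap();

    // The named result leaves no doubt about what happened, unlike a bare bool.
    assert!(result.timed_out());
}
```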
[breaking-change]
r? @alexcrichton
In the case where there are no parens in the AST, the pretty printer doesn't correctly print binary operations where precedence is concerned. Parentheses may be missing due to some kind of expansion or manipulation of the AST.
Example:
The pretty printer prints `Expr(*, Expr(+, 1, 1), 2)` as `1 + 1 * 2`, as opposed to `(1 + 1) * 2`.
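A hypothetical sketch of how such a parens-free AST can arise (the macro and names here are illustrative only):

```rust
// `mul!(1 + 1, 2)` expands to the AST Expr(*, Expr(+, 1, 1), 2) with no
// explicit parenthesis node. The expression still evaluates as (1 + 1) * 2,
// but a precedence-unaware pretty-printer renders it as `1 + 1 * 2`.
macro_rules! mul {
    ($a:expr, $b:expr) => {
        $a * $b
    };
}

fn main() {
    assert_eq!(mul!(1 + 1, 2), 4);
}
```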
r? @nrc
This prepares both `Arc` and `Rc` for the FCP of #27718
Arc:
* Add previously omitted function `Arc::try_unwrap(Self) -> Result<T, Self>`
* Move `arc.downgrade()` to `Arc::downgrade(&Self)` per conventions.
* Deprecate `Arc::weak_count` and `Arc::strong_count` for raciness. It is almost
impossible to correctly act on these results without a CAS loop on the actual
fields.
* Rename `Arc::make_unique` to `Arc::make_mut` to avoid uniqueness terminology
and to clarify relation to `Arc::get_mut`.
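A minimal usage sketch of the `Arc` changes above (a sketch against the signatures listed, not taken from the PR's own tests):

```rust
use std::sync::Arc;

fn main() {
    // try_unwrap returns the inner value only when this is the sole strong reference.
    let unique = Arc::new(5);
    assert_eq!(Arc::try_unwrap(unique), Ok(5));

    // With a second strong reference, the original Arc is handed back as the Err value.
    let shared = Arc::new(5);
    let _other = shared.clone();
    assert!(Arc::try_unwrap(shared).is_err());

    // downgrade is now an associated function, per conventions.
    let strong = Arc::new("data");
    let weak = Arc::downgrade(&strong);
    assert!(weak.upgrade().is_some());

    // make_mut (formerly make_unique) clones the inner value only if the Arc is shared.
    let mut value = Arc::new(1);
    *Arc::make_mut(&mut value) += 1;
    assert_eq!(*value, 2);
}
```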
Rc:
* Add `Rc::would_unwrap(&Self) -> bool` to introspect whether `try_unwrap` would succeed,
because `try_unwrap` is destructive (unlike `get_mut`).
* Move `rc.downgrade()` to `Rc::downgrade(&Self)` per conventions.
* Deprecate `Rc::weak_count` and `Rc::strong_count` for questionable utility.
* Deprecate `Rc::is_unique` for questionable semantics (there are two kinds of
uniqueness with Weak pointers in play).
* Rename `rc.make_unique()` to `Rc::make_mut(&mut Self)` per conventions, to
avoid uniqueness terminology, and to clarify the relation to `Rc::get_mut`.
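A minimal sketch of the `make_mut`/`get_mut` distinction the rename is meant to clarify (`Rc::would_unwrap` is left out here since it is experimental):

```rust
use std::rc::Rc;

fn main() {
    let mut shared = Rc::new(String::from("hi"));
    let other = shared.clone();

    // get_mut is non-destructive: it refuses to hand out &mut while `other` exists.
    assert!(Rc::get_mut(&mut shared).is_none());

    // make_mut (formerly make_unique) instead clones the String, so the mutation
    // cannot be observed through `other`.
    Rc::make_mut(&mut shared).push_str(" there");
    assert_eq!(*shared, "hi there");
    assert_eq!(*other, "hi");

    drop(other);
    // Once the other strong reference is gone (and no Weak exists), get_mut succeeds.
    assert!(Rc::get_mut(&mut shared).is_some());
}
```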
Notable omission:
* Arc::would_unwrap is not added because it's racy, and therefore doesn't
provide much actionable information.
(note: rc_would_unwrap is not proposed for FCP as it is truly experimental)
r? @aturon (careful attention needs to be paid to the new Arc::try_unwrap, but intuitively it should "just work" by virtue of being semantically equivalent to Drop).