Refactor Windows stdio and remove stdin double buffering
I was looking for something nice and small to work on, tried to tackle a few FIXMEs in Windows stdio, and things grew from there.
This part of the standard library contains some tricky code, and has changed over the years to handle more corner cases. It could use some refactoring and extra comments.
Changes/fixes:
- Made `StderrRaw` `pub(crate)`, to remove the `Write` implementations on `sys::Stderr` (used unsynchronised for panic output).
- Removed the unused `Read` implementation on `sys::windows::stdin`.
- The `windows::stdio::Output` enum made sense when we cached the handles, but now that we get the handle on every read/write we can use simple functions like `is_console` instead.
- `write` now calculates the number of bytes written in terms of the original UTF-8 when not all `u16`s could be written (see the first sketch after this list).
- If `write` could only write one half of a surrogate pair, it attempts another write for the other half, because user code can't reslice its input in any way that would allow us to write it otherwise.
- Removed the double buffering on stdin. Documentation on the unexposed `StdinRaw` says: 'This handle is not synchronized or buffered in any fashion'; which is now true.
- `sys::windows::Stdin` now only ever partially fills its buffer, so we can guarantee any arbitrary UTF-16 can be re-encoded without losing any data.
- `sys::windows::STDIN_BUF_SIZE` is slightly larger to compensate. There should be no real change in the number of syscalls the buffered `Stdin` makes: this buffer is a little larger, while the extra buffer on `Stdin` is gone.
- `sys::windows::Stdin` now attempts to handle unpaired surrogates at its buffer boundary (see the second sketch after this list).
- `sys::windows::Stdin` no longer allocates for its buffer, but the UTF-16 decoding still does.
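For illustration, a minimal sketch of the byte-count calculation (not the actual std code; the function name and signature are made up here): walk the original UTF-8 input and count a character only if all of its UTF-16 units were accepted, so `write` can report progress in the caller's terms.

```rust
/// Given the UTF-8 input passed to `write` and the number of UTF-16 code
/// units the console actually accepted, return the number of UTF-8 bytes
/// that were fully written.
fn utf8_bytes_written(utf8: &str, utf16_units_written: usize) -> usize {
    let mut units = 0;
    for (byte_idx, ch) in utf8.char_indices() {
        let next = units + ch.len_utf16();
        if next > utf16_units_written {
            // This character was not written (or only half of its surrogate
            // pair was); stop counting before it.
            return byte_idx;
        }
        units = next;
    }
    utf8.len()
}
```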
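And a hedged sketch of the buffer-boundary handling (again not the real implementation; `split_trailing_surrogate` is a hypothetical helper): if the last UTF-16 unit read is a leading surrogate, hold it back so the next read can complete the pair.

```rust
/// Split a freshly read UTF-16 buffer into the part that can be decoded now
/// and a trailing leading surrogate (if any) to carry over to the next read.
fn split_trailing_surrogate(buf: &[u16]) -> (&[u16], Option<u16>) {
    match buf.split_last() {
        // 0xD800..=0xDBFF is the range of leading (high) surrogates.
        Some((&last, rest)) if (0xD800..=0xDBFF).contains(&last) => (rest, Some(last)),
        _ => (buf, None),
    }
}
```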
### Testing
I did some manual testing of reading from and writing to the console. The console does support UTF-16 in some sense, but doesn't support displaying characters outside the BMP.
- compile stage 1 stdlib with a tiny value for `MAX_BUFFER_SIZE` to make it easier to catch corner cases
- run a simple test program that reads from stdin and echoes to stdout (like the sketch after this list)
- write some lines with plenty of ASCII and emoji in a text editor
- copy and paste in console to stdin
- return with `\r\n` or CTRL-Z
- copy and paste in text editor
- check it round-trips
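The echo program was nothing more elaborate than something along these lines (reconstructed for illustration, not the exact program used):

```rust
use std::io::{self, BufRead, Write};

fn main() -> io::Result<()> {
    let stdin = io::stdin();
    let stdout = io::stdout();
    let mut out = stdout.lock();
    // Read lines from the console and write them straight back out,
    // exercising both the stdin and stdout console paths.
    for line in stdin.lock().lines() {
        writeln!(out, "{}", line?)?;
    }
    Ok(())
}
```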
-----
Fixes https://github.com/rust-lang/rust/issues/23344. All but one of the suggestions in that issue are now implemented. The missing one is:
> * When reading data, we require the entire set of input to be valid UTF-16. We should instead attempt to read as much of the input as possible as valid UTF-16, only returning an error for the actual invalid elements. For example if we read 10 elements, 5 of which are valid UTF-16, the 6th is bad, and then the remaining are all valid UTF-16, we should probably return the first 5 on a call to `read`, then return an error, then return the remaining on the next call to `read`.
Stdin in console mode deals with text typed directly by a user. In my opinion getting an unpaired surrogate is quite unlikely in that case, and a valid reason to error on the entire line of input (which is probably short). Dealing with it gracefully is also incompatible with an unbuffered stdin, which seems the more interesting guarantee to me.
Simplify the unix `Weak` functionality
- We can avoid an allocation by including the terminating NUL in the function name (see the sketch after this list).
- We can get `Option<F>` directly, rather than aliasing the inner `AtomicUsize`.
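To illustrate the first point (a rough sketch assuming the `libc` crate's `dlsym`; the `fetch` helper is hypothetical): the name is stored with an embedded NUL, so no `CString` has to be allocated at lookup time.

```rust
use std::ffi::c_void;

/// Look up a symbol by a name that already ends in NUL, e.g. "getrandom\0",
/// so no CString allocation is needed to produce a C string.
unsafe fn fetch(name: &str) -> Option<*mut c_void> {
    debug_assert_eq!(name.as_bytes().last(), Some(&0u8));
    let sym = libc::dlsym(libc::RTLD_DEFAULT, name.as_ptr() as *const _);
    if sym.is_null() { None } else { Some(sym) }
}
```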
Clarify guarantees for `Box` allocation
This basically says `Box` does the obvious things for its allocations.
See also: https://users.rust-lang.org/t/alloc-crate-guarantees/24981
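For instance (my own illustration of the kind of guarantee being spelled out, not wording from the change itself): a `Box` allocation comes from the global allocator, so a pointer obtained from `Box::into_raw` can later be turned back into a `Box` and freed by dropping it.

```rust
fn main() {
    let b = Box::new(42u32);
    // Hand the allocation off as a raw pointer, e.g. across an FFI boundary.
    let raw = Box::into_raw(b);

    // Later, reclaim ownership; dropping the Box frees the memory through
    // the same global allocator it was allocated with.
    let b = unsafe { Box::from_raw(raw) };
    drop(b);
}
```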
This may require a T-libs FCP? Not sure.
r? @sfackler
Update parking_lot to 0.7
Unfortunately this will duplicate parking_lot until the data_structures crate is published and rls is updated in conjunction with crossbeam-channel.
Added a connection timeout and speed threshold when downloading the Docker cache
This is an attempt to fix one possible cause of #56112. Similar to #52846, this changes the Docker cache download command to fail if it cannot connect, or if its transfer speed stays below 10 bytes/s for 30 seconds, so the download can be retried sooner.
Don't generate minification variables if minification disabled
If minification is disabled, there is no sense in having those variables.
r? @QuietMisdreavus
Allow Self::Module to be mutated.
This only changes `&Self::Module` to `&mut Self::Module` in a couple of places.
`codegen_allocator` and `write_metadata` from `ExtraBackendMethods` mutate the underlying LLVM module. As such, it makes sense for these two functions to receive a mutable reference to the module (as opposed to an immutable one).
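A simplified sketch of the signature change (parameter lists trimmed down; this is not the real `ExtraBackendMethods` definition):

```rust
trait ExtraBackendMethods {
    type Module;

    // These previously took `&Self::Module`; taking `&mut Self::Module`
    // lets a backend mutate its module without interior mutability.
    fn codegen_allocator(&self, module: &mut Self::Module);
    fn write_metadata(&self, module: &mut Self::Module);
}
```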
I am trying to implement `codegen_allocator` for my backend, and I need to be able to mutate `Self::Module`:
f66e4697ae/src/librustc_codegen_ssa/traits/backend.rs (L41)
Modifying the module in `codegen_allocator`/`write_metadata` is not a problem for the LLVM backend, because [ModuleLlvm](https://github.com/rust-lang/rust/blob/master/src/librustc_codegen_llvm/lib.rs#L357) contains a raw pointer to the underlying LLVM module, so it can easily be mutated through FFI calls.
I am trying to avoid interior mutability and `unsafe` as much as I can. What do you think? Does this change make sense, or is there a reason why this should stay the way it is?