I was looking into #9303 and was curious if this would still be valuable. @kballard had already done 99% of the work, so I brought the branch up to date and added a feature gate. Any feedback would be appreciated; I wasn't sure if this should be set up as a syntax extension with `#[macro_registrar]`, and if so, where it should be located.
Original PR is here: #9255
TODO:
* [x] Convert to loadable syntax extension
* [x] Default to big endian
* [x] Add `target` identifier
* [x] Expand to include code points 128-255
It was decided that a consistent result across platforms would be the
most useful and least surprising. A `target` option has been added to
get the old behaviour of using the target platform's endianness.
fourcc!() allows you to embed FourCC (or OSType) values that are
evaluated as u32 literals. It takes a 4-byte ASCII string and produces
the u32 that results from interpreting those 4 bytes as a u32, in
big-endian order by default, or explicitly as little endian, or (via
`target`) using the target platform's endianness.
I don't know if anything depends on `MemReader::fill` returning an empty slice instead of `EndOfFile`, but I'm pretty sure that `MemReader::read_until` should not go into an infinite loop.
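To illustrate the failure mode with a simplified, hypothetical sketch (current Rust, not the actual `MemReader` internals): if end-of-input shows up as an empty read that the loop ignores, a delimiter that never appears makes the loop spin forever, so the empty read has to become a terminating condition.

```rust
use std::io::{self, ErrorKind, Read};

// Simplified read_until: keep pulling chunks until the delimiter appears.
// Treating a zero-length read as end of input is what guarantees termination;
// ignoring it would loop forever when the delimiter is never found.
fn read_until<R: Read>(r: &mut R, delim: u8, out: &mut Vec<u8>) -> io::Result<()> {
    let mut buf = [0u8; 64];
    loop {
        let n = r.read(&mut buf)?;
        if n == 0 {
            return Err(io::Error::new(ErrorKind::UnexpectedEof, "delimiter not found"));
        }
        if let Some(i) = buf[..n].iter().position(|&b| b == delim) {
            out.extend_from_slice(&buf[..=i]);
            return Ok(());
        }
        out.extend_from_slice(&buf[..n]);
    }
}

fn main() {
    let mut cur = io::Cursor::new(b"hello\nworld".to_vec());
    let mut line = Vec::new();
    read_until(&mut cur, b'\n', &mut line).unwrap();
    assert_eq!(line, b"hello\n".to_vec());
}
```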
These are ancient. I removed a bunch of questions that are less relevant, or completely irrelevant, updated other entries, and removed things that are already better expressed elsewhere.
Before:
test test::bench_nonpod_nonarena ... bench: 62 ns/iter (+/- 6)
test test::bench_pod_nonarena ... bench: 0 ns/iter (+/- 0)
After:
test test::bench_nonpod_nonarena ... bench: 158 ns/iter (+/- 11)
test test::bench_pod_nonarena ... bench: 48 ns/iter (+/- 2)
The other tests show no change, but are adjusted to use the generic
return value of `.iter` anyway so that this doesn't change in future
benchmarking.
This allows a result to be marked as "used" by passing it to a function
LLVM cannot see inside. By making `iter` generic and using this
`black_box` on the result, benchmarks can get this behaviour simply by
returning their computation.
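As a rough sketch of the idea (in current Rust, with a volatile read standing in for whatever barrier the real implementation uses), `black_box` is just an identity function the optimizer cannot see through, and a generic `iter` feeds the closure's return value into it:

```rust
use std::{mem, ptr};

// Optimization barrier: the value round-trips through a volatile read, so the
// compiler must treat it as used and cannot discard the work that produced it.
// (Only an illustration of the idea, not the actual implementation.)
pub fn black_box<T>(value: T) -> T {
    unsafe {
        let ret = ptr::read_volatile(&value);
        mem::forget(value);
        ret
    }
}

// With a generic `iter`, whatever the closure returns is passed through
// `black_box`, so a benchmark keeps its result alive simply by returning it.
pub fn iter<T, F: FnMut() -> T>(mut f: F) {
    // (timing code omitted)
    black_box(f());
}

fn main() {
    iter(|| (0u64..1_000).sum::<u64>());
}
```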
`make dist` or building from a generated tarball is currently not possible due to some files that have been renamed and the ongoing libextra split. This PR fixes all (current) issues so that Rust can be built from the .tar.gz source alone.
This pull request tries to fix #12050.
I went after these wrong error messages quite aggressively, so I may also have changed some strings that are not actual errors.
Please point those out and I will update this pull request accordingly.
libextra is currently being split into several crates. This commit adds
them all to the dist target in order to have them in the final tarballs.
Signed-off-by: Luca Bruno <lucab@debian.org>
Error messages cleaned in librustc/middle
Error messages cleaned in libsyntax
Error messages cleaned in libsyntax more aggressively
Error messages cleaned in librustc more aggressively
Fixed affected tests
Fixed other failing tests
Last failing tests fixed
src/README.txt was renamed in a30d61b05a, so make dist fails because it
cannot find the file.
This commit makes the dist target work again.
Signed-off-by: Luca Bruno <lucab@debian.org>
The lexer and json were using `transmute(-1)` as a `char` sentinel value for EOF, which is invalid since `char` is strictly a Unicode codepoint.
Fixing this allows for range asserts on chars, since they always lie between 0 and 0x10FFFF.
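A tiny sketch of the invariant this restores (current Rust, hypothetical variable names): EOF is represented as the absence of a `char` rather than an out-of-range sentinel, so every `char` really is a valid Unicode scalar value.

```rust
fn main() {
    // With no sentinel in play, every char satisfies the codepoint range check.
    let c: char = '€';
    assert!((c as u32) <= 0x10FFFF);

    // EOF modelled as the absence of a char rather than a bogus char value;
    // callers that want to "ignore" EOF can still do so explicitly.
    let next: Option<char> = None;
    match next {
        Some(ch) => println!("read {}", ch),
        None => println!("end of input"),
    }
    assert_eq!(next.unwrap_or('\x00'), '\x00');
}
```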
Declare `type SendStr = MaybeOwned<'static>` to ease readability of
types that needed the old SendStr behavior.
Implement all the traits for `MaybeOwned` that `SendStr` used to implement.
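Purely as illustration, a sketch of the shape of this change (using String in place of the era's ~str; the accessor method is made up for the example):

```rust
// Illustrative only: an enum with the same general shape as MaybeOwned,
// owning its string or borrowing it for some lifetime 'a.
enum MaybeOwned<'a> {
    Slice(&'a str),
    Owned(String),
}

// The alias from the description: the 'static case keeps the old SendStr meaning.
type SendStr = MaybeOwned<'static>;

impl<'a> MaybeOwned<'a> {
    // Example accessor so the sketch is usable; not taken from the PR.
    fn as_slice(&self) -> &str {
        match *self {
            MaybeOwned::Slice(s) => s,
            MaybeOwned::Owned(ref s) => s.as_str(),
        }
    }
}

fn main() {
    let a: SendStr = MaybeOwned::Slice("static str");
    let b: SendStr = MaybeOwned::Owned(String::from("owned"));
    assert_eq!(a.as_slice(), "static str");
    assert_eq!(b.as_slice(), "owned");
}
```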
- Convert the formatting traits to `&self` rather than `_: &Self`
- Rejig `syntax::ext::{format,deriving}` a little in preparation
- Implement `#[deriving(Show)]` (see the usage sketch below)
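Written with today's names (`#[derive(Debug)]` / `fmt::Display`, standing in for the `#[deriving(Show)]` / `Show` of this era), a small sketch of what the derive and the `&self`-based formatting trait look like in use:

```rust
use std::fmt;

// Deriving gives a formatting implementation for free...
#[derive(Debug)]
struct Point {
    x: i32,
    y: i32,
}

// ...and a manual implementation now takes `&self` rather than a separate
// `_: &Self`-style argument, as the first bullet above describes.
impl fmt::Display for Point {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "({}, {})", self.x, self.y)
    }
}

fn main() {
    let p = Point { x: 1, y: 2 };
    println!("{:?}", p); // derived:  Point { x: 1, y: 2 }
    println!("{}", p);   // manual:   (1, 2)
}
```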
The transmute was unsound.
There are many instances of `.unwrap_or('\x00')` for "ignoring" EOF which
either do not make the situation worse than it was (well, actually make
it better, since it's easy to grep for places that don't handle EOF) or
can never actually be read.
Fixes #8971.