This makes sure that the system files that Rust binaries depend on under Windows get packaged into stage0 snapshots as well as into the Windows installer.
Currently these include `libgcc_s_dw2-1.dll`, `libstdc++-6.dll` and `libpthread-2.dll`. Note that the latter will need to be changed to `pthreadGC2.dll` once the Windows build bots get upgraded to mingw 4.0.
Closes #9252. Closes #5878. Closes #9218. Closes #5712.
Mostly as per a short discussion on IRC (@cmr):
08:46 < cmr> so I'm thinking
Obsolete{Let,With,FieldTerminator,ClassTraits,ModeInFnType,MoveInit,BinaryMove,ImplSyntax,MutOwnedPointer,MutVector,RecordType,RecordPattern,PostFnTySigil,NewtypeEnum,Mode,ImplicitSelf,LifetimeNotation,Purity,StaticMethod,ConstItem,FixedLengthVectorType}
08:46 < cmr> Those are the ones that are older than 0.6
08:46 < cmr> (at least!)
This PR removes these specific "obsolete syntax"/"suggestion for change" errors and just lets the parser run into regular parser errors for long-invalid syntax. I also removed `ObsoletePrivSection`, which apparently dates back further than either cmr or I could recall, and `ObsoleteUnenforcedBound`, which seemed unused. I also removed `ObsoleteNewtypeEnum`.
Replaces existing tests for removed obsolete-syntax errors with tests
for the resulting regular errors, adds a test for each of the removed
parser errors to make sure that obsolete forms don't start working
again, removes some obsolete/superfluous tests that were now failing.
Deletes some amount of dead code in the parser and includes some small
changes to parser error messages to accommodate the new tests.
do_strptime() and do_strftime()
I don't see any reason why src/libextra/time.rs has two functions that each just call a very similarly named function whose only use is to be called by them.
strftime() just calls do_strftime()
strptime() just calls do_strptime()
and the do_* functions aren't called by anything else.
The parameters and return types are exactly the same at both levels, so I have just removed the second layer, unless there is a need for that second level.
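For reference, here is a minimal sketch of the pattern being collapsed; the signatures are simplified and hypothetical, not the actual ones from src/libextra/time.rs:

```rust
// Before: the public function only forwards to a private do_* twin that has
// the exact same parameters and return type (signature simplified for this sketch).
fn do_strftime(format: &str) -> String {
    // stand-in for the real formatting logic
    format.to_string()
}

pub fn strftime(format: &str) -> String {
    do_strftime(format)
}
```

After the change, the body of `do_strftime()` simply lives directly in `strftime()`, and likewise for `strptime()`.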
This purges about 500 lines of visitor cruft from lint passes. All lints are
handled in a much more sane way at this point. The other huge bonus of this
commit is that there are no more @-boxes in the lint passes, fixing the 500MB
memory regression seen when the lint passes were refactored.
Closes #8589
This fixes two existing bugs along the way:
* The `transmute` intrinsic did not correctly handle casts of immediate
aggregates like newtype structs and tuples (see the sketch after this list).
* The code for calling foreign functions used the wrong type to create
an `alloca` temporary.
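To make the first point concrete, here is a minimal sketch in present-day Rust of a `transmute` on an immediate aggregate (a newtype struct); the `Wrapper` type and the value are made up for illustration and are not taken from the patch itself:

```rust
use std::mem;

// A single-field newtype struct; repr(transparent) guarantees it has the same
// layout as the u64 it wraps, so the transmute below is well-defined.
#[repr(transparent)]
struct Wrapper(u64);

fn main() {
    let w = Wrapper(42);
    // Cast the immediate aggregate (the newtype) to its underlying integer;
    // this is the kind of cast that was previously mishandled.
    let raw: u64 = unsafe { mem::transmute(w) };
    assert_eq!(raw, 42);
}
```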
For example, given this function:
enum Foo { A, B }
fn foo() -> Foo { A }
Before:
; Function Attrs: nounwind uwtable
define void @_ZN3foo18hbedc642d5d9cf5aag4v0.0E(%enum.Foo* noalias nocapture sret, { i64, %tydesc*, i8*, i8*, i8 }* nocapture readnone) #0 {
"function top level":
  %2 = getelementptr inbounds %enum.Foo* %0, i64 0, i32 0
  store i64 0, i64* %2, align 8
  ret void
}
After:
; Function Attrs: nounwind readnone uwtable
define %enum.Foo @_ZN3foo18hbedc642d5d9cf5aag4v0.0E({ i64, %tydesc*, i8*, i8*, i8 }* nocapture readnone) #0 {
"function top level":
  ret %enum.Foo zeroinitializer
}
The general idea of hyperlinking between crates is that it should require as
little configuration as possible, if any at all. In this vein, there are two
separate ways to generate hyperlinks between crates:
1. When you're generating documentation for a crate 'foo' into a folder 'doc',
then if foo's external crate dependencies have already been documented into the
folder 'doc', hyperlinks will be generated. This works because all of the
documentation is in the same folder, allowing links to work seamlessly both
on the web and when browsing the local filesystem.
The rationale for this use case is a package with multiple libraries/crates
that all want to link to one another, and you don't want to have to deal with
going to the web. In theory this could be extended to a RUST_PATH-style
searching situation, but I'm not sure that it would work as seamlessly on the
web as it does on the local filesystem, so I'm not attempting to explore this
case in this pull request. I believe that to fully realize this potential,
rustdoc would have to act as a server instead of a static site generator.
2. One of foo's external dependencies has a `#[doc(html_root_url = "...")]`
attribute. This means that all hyperlinks to the dependency will be rooted at
this URL.
This use case encompasses all packages using libstd/libextra. These two
crates now have this attribute encoded (currently pointing at the /doc/master
url), and it will be read by anything which has a dependency on libstd/libextra.
This should also work for arbitrary crates in the wild that have online
documentation. I don't like how the version is hard-wired into the URL, but I
think that this may be a case-by-case thing which doesn't end up being too
bad in the long run.
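As a hedged illustration of case 2, a crate can advertise where its rendered documentation lives with a crate-level doc attribute; the inner-attribute form and the placeholder URL below are illustrative, not the actual libstd/libextra configuration:

```rust
// Crate-level attribute telling downstream rustdoc runs where this crate's
// online documentation is rooted. The URL here is a placeholder.
#![doc(html_root_url = "https://example.com/doc/master")]

pub fn hello() {}
```

Any crate that depends on this one and is documented with rustdoc can then root its hyperlinks to this crate's items at that URL.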
Closes #9539
That is, only a single expression or item gets parsed, so if there are
any extra tokens (e.g. the start of another item/expression) the user
should be told about them, rather than having them silently dropped.
An example:
macro_rules! foo {
    () => {
        println("hi");
        println("bye");
    }
}
would expand to just `println("hi")`, which is almost certainly not
what the programmer wanted.
Fixes #8012.