Auto merge of #35166 - nikomatsakis:incr-comp-ice-34991-2, r=mw
Address ICEs running with incremental compilation and building glium. Fixes for various ICEs I encountered trying to build glium with incremental compilation enabled. Building glium now works. Of the 4 ICEs, I have test cases for 3 of them -- I didn't isolate a test for the last commit and kind of want to go do other things -- most notably, figuring out why incremental isn't saving much *effort*. If it seems worthwhile, I can come back and try to narrow down the problem. r? @michaelwoerister Fixes #34991 Fixes #32015
commit e1d2bc2916
@ -134,6 +134,10 @@ to read from it. Similarly, reading from the `tcache` map for item `X`
(which is a `DepTrackingMap`, described below) automatically invokes
`dep_graph.read(ItemSignature(X))`.

**Note:** adding `Hir` nodes requires a bit of caution due to the
"inlining" that old trans and constant evaluation still use. See the
section on inlining below.

To make this strategy work, a certain amount of indirection is
required. For example, modules in the HIR do not have direct pointers
to the items that they contain. Rather, they contain node-ids -- one
@ -387,3 +391,24 @@ RUST_DEP_GRAPH_FILTER='Hir&foo -> TypeckItemBody & bar'
This will dump out all the nodes that lead from `Hir(foo)` to
`TypeckItemBody(bar)`, from which you can (hopefully) see the source
of the erroneous edge.

### Inlining of HIR nodes

For the time being, at least, we still sometimes "inline" HIR nodes
from other crates into the current HIR map. This creates a weird
scenario where the same logical item (let's call it `X`) has two
def-ids: the original def-id `X` and a new, inlined one `X'`. `X'` is
in the current crate, but it's not like other HIR nodes: in
particular, when we restart compilation, it will not be available to
hash. Therefore, we do not want `Hir(X')` nodes appearing in our
graph. Instead, we want a "read" of `Hir(X')` to be represented as a
read of `MetaData(X)`, since the metadata for `X` is where the inlined
representation originated in the first place.

To achieve this, the HIR map will detect if the def-id originates in
an inlined node and add a dependency to a suitable `MetaData` node
instead. If you are reading a HIR node and are not sure if it may be
inlined or not, you can use `tcx.map.read(node_id)` and it will detect
whether the node is inlined or not and do the right thing. You can
also use `tcx.map.is_inlined_def_id()` and
`tcx.map.is_inlined_node_id()` to test.
@ -20,7 +20,7 @@ macro_rules! try_opt {
)
}

#[derive(Clone, Debug, PartialEq, Eq, Hash, RustcEncodable, RustcDecodable)]
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Hash, RustcEncodable, RustcDecodable)]
pub enum DepNode<D: Clone + Debug> {
// The `D` type is "how definitions are identified".
// During compilation, it is always `DefId`, but when serializing
@ -82,9 +82,11 @@ pub enum DepNode<D: Clone + Debug> {
Privacy,
IntrinsicCheck(D),
MatchCheck(D),
MirMapConstruction(D),
MirPass(D),
MirTypeck(D),

// Represents the MIR for a fn; also used as the task node for
// things read/modify that MIR.
Mir(D),

BorrowCheck(D),
RvalueCheck(D),
Reachability,
@ -214,9 +216,7 @@ impl<D: Clone + Debug> DepNode<D> {
CheckConst(ref d) => op(d).map(CheckConst),
IntrinsicCheck(ref d) => op(d).map(IntrinsicCheck),
MatchCheck(ref d) => op(d).map(MatchCheck),
MirMapConstruction(ref d) => op(d).map(MirMapConstruction),
MirPass(ref d) => op(d).map(MirPass),
MirTypeck(ref d) => op(d).map(MirTypeck),
Mir(ref d) => op(d).map(Mir),
BorrowCheck(ref d) => op(d).map(BorrowCheck),
RvalueCheck(ref d) => op(d).map(RvalueCheck),
TransCrateItem(ref d) => op(d).map(TransCrateItem),
@ -245,6 +245,6 @@ impl<D: Clone + Debug> DepNode<D> {
/// some independent path or string that persists between runs without
/// the need to be mapped or unmapped. (This ensures we can serialize
/// them even in the absence of a tcx.)
#[derive(Clone, Debug, PartialEq, Eq, Hash, RustcEncodable, RustcDecodable)]
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Hash, RustcEncodable, RustcDecodable)]
pub struct WorkProductId(pub String);
@ -61,6 +61,12 @@ impl<M: DepTrackingMapConfig> DepTrackingMap<M> {
self.map.get(k)
}

pub fn get_mut(&mut self, k: &M::Key) -> Option<&mut M::Value> {
self.read(k);
self.write(k);
self.map.get_mut(k)
}

pub fn insert(&mut self, k: M::Key, v: M::Value) -> Option<M::Value> {
self.write(&k);
self.map.insert(k, v)
@ -70,6 +76,10 @@ impl<M: DepTrackingMapConfig> DepTrackingMap<M> {
self.read(k);
self.map.contains_key(k)
}

pub fn keys(&self) -> Vec<M::Key> {
self.map.keys().cloned().collect()
}
}

impl<M: DepTrackingMapConfig> MemoizationMap for RefCell<DepTrackingMap<M>> {
@ -41,6 +41,7 @@ pub fn visit_all_items_in_krate<'a, 'tcx, V, F>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
let task_id = (self.dep_node_fn)(item_def_id);
let _task = self.tcx.dep_graph.in_task(task_id.clone());
debug!("Started task {:?}", task_id);
assert!(!self.tcx.map.is_inlined_def_id(item_def_id));
self.tcx.dep_graph.read(DepNode::Hir(item_def_id));
self.visitor.visit_item(i);
debug!("Ended task {:?}", task_id);
@ -1854,7 +1854,7 @@ impl<'a> LoweringContext<'a> {

let parent_def = self.parent_def;
let def = self.resolver.definitions().map(|defs| {
let def_path_data = DefPathData::Binding(name);
let def_path_data = DefPathData::Binding(name.as_str());
let def_index = defs.create_def_with_parent(parent_def, pat.id, def_path_data);
Def::Local(DefId::local(def_index), pat.id)
}).unwrap_or(Def::Err);
@ -135,11 +135,11 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {
DefPathData::Impl,
ItemKind::Enum(..) | ItemKind::Struct(..) | ItemKind::Trait(..) |
ItemKind::ExternCrate(..) | ItemKind::ForeignMod(..) | ItemKind::Ty(..) =>
DefPathData::TypeNs(i.ident.name),
ItemKind::Mod(..) => DefPathData::Module(i.ident.name),
DefPathData::TypeNs(i.ident.name.as_str()),
ItemKind::Mod(..) => DefPathData::Module(i.ident.name.as_str()),
ItemKind::Static(..) | ItemKind::Const(..) | ItemKind::Fn(..) =>
DefPathData::ValueNs(i.ident.name),
ItemKind::Mac(..) => DefPathData::MacroDef(i.ident.name),
DefPathData::ValueNs(i.ident.name.as_str()),
ItemKind::Mac(..) => DefPathData::MacroDef(i.ident.name.as_str()),
ItemKind::Use(..) => DefPathData::Misc,
};
let def = self.create_def(i.id, def_data);
@ -150,12 +150,12 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {
for v in &enum_definition.variants {
let variant_def_index =
this.create_def(v.node.data.id(),
DefPathData::EnumVariant(v.node.name.name));
DefPathData::EnumVariant(v.node.name.name.as_str()));
this.with_parent(variant_def_index, |this| {
for (index, field) in v.node.data.fields().iter().enumerate() {
let name = field.ident.map(|ident| ident.name)
.unwrap_or_else(|| token::intern(&index.to_string()));
this.create_def(field.id, DefPathData::Field(name));
this.create_def(field.id, DefPathData::Field(name.as_str()));
}

if let Some(ref expr) = v.node.disr_expr {
@ -172,8 +172,8 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {
}

for (index, field) in struct_def.fields().iter().enumerate() {
let name = field.ident.map(|ident| ident.name)
.unwrap_or(token::intern(&index.to_string()));
let name = field.ident.map(|ident| ident.name.as_str())
.unwrap_or(token::intern(&index.to_string()).as_str());
this.create_def(field.id, DefPathData::Field(name));
}
}
@ -184,7 +184,8 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {
}

fn visit_foreign_item(&mut self, foreign_item: &ForeignItem) {
let def = self.create_def(foreign_item.id, DefPathData::ValueNs(foreign_item.ident.name));
let def = self.create_def(foreign_item.id,
DefPathData::ValueNs(foreign_item.ident.name.as_str()));

self.with_parent(def, |this| {
visit::walk_foreign_item(this, foreign_item);
@ -193,7 +194,7 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {

fn visit_generics(&mut self, generics: &Generics) {
for ty_param in generics.ty_params.iter() {
self.create_def(ty_param.id, DefPathData::TypeParam(ty_param.ident.name));
self.create_def(ty_param.id, DefPathData::TypeParam(ty_param.ident.name.as_str()));
}

visit::walk_generics(self, generics);
@ -202,9 +203,9 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {
fn visit_trait_item(&mut self, ti: &TraitItem) {
let def_data = match ti.node {
TraitItemKind::Method(..) | TraitItemKind::Const(..) =>
DefPathData::ValueNs(ti.ident.name),
TraitItemKind::Type(..) => DefPathData::TypeNs(ti.ident.name),
TraitItemKind::Macro(..) => DefPathData::MacroDef(ti.ident.name),
DefPathData::ValueNs(ti.ident.name.as_str()),
TraitItemKind::Type(..) => DefPathData::TypeNs(ti.ident.name.as_str()),
TraitItemKind::Macro(..) => DefPathData::MacroDef(ti.ident.name.as_str()),
};

let def = self.create_def(ti.id, def_data);
@ -220,9 +221,9 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {
fn visit_impl_item(&mut self, ii: &ImplItem) {
let def_data = match ii.node {
ImplItemKind::Method(..) | ImplItemKind::Const(..) =>
DefPathData::ValueNs(ii.ident.name),
ImplItemKind::Type(..) => DefPathData::TypeNs(ii.ident.name),
ImplItemKind::Macro(..) => DefPathData::MacroDef(ii.ident.name),
DefPathData::ValueNs(ii.ident.name.as_str()),
ImplItemKind::Type(..) => DefPathData::TypeNs(ii.ident.name.as_str()),
ImplItemKind::Macro(..) => DefPathData::MacroDef(ii.ident.name.as_str()),
};

let def = self.create_def(ii.id, def_data);
@ -239,7 +240,7 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {
let parent_def = self.parent_def;

if let PatKind::Ident(_, id, _) = pat.node {
let def = self.create_def(pat.id, DefPathData::Binding(id.node.name));
let def = self.create_def(pat.id, DefPathData::Binding(id.node.name.as_str()));
self.parent_def = Some(def);
}

@ -271,11 +272,11 @@ impl<'ast> visit::Visitor for DefCollector<'ast> {
}

fn visit_lifetime_def(&mut self, def: &LifetimeDef) {
self.create_def(def.lifetime.id, DefPathData::LifetimeDef(def.lifetime.name));
self.create_def(def.lifetime.id, DefPathData::LifetimeDef(def.lifetime.name.as_str()));
}

fn visit_macro_def(&mut self, macro_def: &MacroDef) {
self.create_def(macro_def.id, DefPathData::MacroDef(macro_def.ident.name));
self.create_def(macro_def.id, DefPathData::MacroDef(macro_def.ident.name.as_str()));
}
}

@ -301,9 +302,9 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {
hir::ItemEnum(..) | hir::ItemStruct(..) | hir::ItemTrait(..) |
hir::ItemExternCrate(..) | hir::ItemMod(..) | hir::ItemForeignMod(..) |
hir::ItemTy(..) =>
DefPathData::TypeNs(i.name),
DefPathData::TypeNs(i.name.as_str()),
hir::ItemStatic(..) | hir::ItemConst(..) | hir::ItemFn(..) =>
DefPathData::ValueNs(i.name),
DefPathData::ValueNs(i.name.as_str()),
hir::ItemUse(..) => DefPathData::Misc,
};
let def = self.create_def(i.id, def_data);
@ -314,12 +315,12 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {
for v in &enum_definition.variants {
let variant_def_index =
this.create_def(v.node.data.id(),
DefPathData::EnumVariant(v.node.name));
DefPathData::EnumVariant(v.node.name.as_str()));

this.with_parent(variant_def_index, |this| {
for field in v.node.data.fields() {
this.create_def(field.id,
DefPathData::Field(field.name));
DefPathData::Field(field.name.as_str()));
}
if let Some(ref expr) = v.node.disr_expr {
this.visit_hir_const_integer(expr);
@ -335,7 +336,7 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {
}

for field in struct_def.fields() {
this.create_def(field.id, DefPathData::Field(field.name));
this.create_def(field.id, DefPathData::Field(field.name.as_str()));
}
}
_ => {}
@ -345,7 +346,8 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {
}

fn visit_foreign_item(&mut self, foreign_item: &'ast hir::ForeignItem) {
let def = self.create_def(foreign_item.id, DefPathData::ValueNs(foreign_item.name));
let def = self.create_def(foreign_item.id,
DefPathData::ValueNs(foreign_item.name.as_str()));

self.with_parent(def, |this| {
intravisit::walk_foreign_item(this, foreign_item);
@ -354,7 +356,7 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {

fn visit_generics(&mut self, generics: &'ast hir::Generics) {
for ty_param in generics.ty_params.iter() {
self.create_def(ty_param.id, DefPathData::TypeParam(ty_param.name));
self.create_def(ty_param.id, DefPathData::TypeParam(ty_param.name.as_str()));
}

intravisit::walk_generics(self, generics);
@ -363,8 +365,8 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {
fn visit_trait_item(&mut self, ti: &'ast hir::TraitItem) {
let def_data = match ti.node {
hir::MethodTraitItem(..) | hir::ConstTraitItem(..) =>
DefPathData::ValueNs(ti.name),
hir::TypeTraitItem(..) => DefPathData::TypeNs(ti.name),
DefPathData::ValueNs(ti.name.as_str()),
hir::TypeTraitItem(..) => DefPathData::TypeNs(ti.name.as_str()),
};

let def = self.create_def(ti.id, def_data);
@ -380,8 +382,8 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {
fn visit_impl_item(&mut self, ii: &'ast hir::ImplItem) {
let def_data = match ii.node {
hir::ImplItemKind::Method(..) | hir::ImplItemKind::Const(..) =>
DefPathData::ValueNs(ii.name),
hir::ImplItemKind::Type(..) => DefPathData::TypeNs(ii.name),
DefPathData::ValueNs(ii.name.as_str()),
hir::ImplItemKind::Type(..) => DefPathData::TypeNs(ii.name.as_str()),
};

let def = self.create_def(ii.id, def_data);
@ -398,7 +400,7 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {
let parent_def = self.parent_def;

if let hir::PatKind::Binding(_, name, _) = pat.node {
let def = self.create_def(pat.id, DefPathData::Binding(name.node));
let def = self.create_def(pat.id, DefPathData::Binding(name.node.as_str()));
self.parent_def = Some(def);
}

@ -430,10 +432,10 @@ impl<'ast> intravisit::Visitor<'ast> for DefCollector<'ast> {
}

fn visit_lifetime_def(&mut self, def: &'ast hir::LifetimeDef) {
self.create_def(def.lifetime.id, DefPathData::LifetimeDef(def.lifetime.name));
self.create_def(def.lifetime.id, DefPathData::LifetimeDef(def.lifetime.name.as_str()));
}

fn visit_macro_def(&mut self, macro_def: &'ast hir::MacroDef) {
self.create_def(macro_def.id, DefPathData::MacroDef(macro_def.name));
self.create_def(macro_def.id, DefPathData::MacroDef(macro_def.name.as_str()));
}
}

@ -12,8 +12,11 @@ use middle::cstore::LOCAL_CRATE;
use hir::def_id::{DefId, DefIndex};
use hir::map::def_collector::DefCollector;
use rustc_data_structures::fnv::FnvHashMap;
use std::fmt::Write;
use std::hash::{Hash, Hasher, SipHasher};
use syntax::{ast, visit};
use syntax::parse::token::InternedString;
use ty::TyCtxt;
use util::nodemap::NodeMap;

/// The definition table containing node definitions
@ -109,6 +112,40 @@ impl DefPath {
data.reverse();
DefPath { data: data, krate: krate }
}

pub fn to_string(&self, tcx: TyCtxt) -> String {
let mut s = String::with_capacity(self.data.len() * 16);

if self.krate == LOCAL_CRATE {
s.push_str(&tcx.crate_name(self.krate));
} else {
s.push_str(&tcx.sess.cstore.original_crate_name(self.krate));
}
s.push_str("/");
s.push_str(&tcx.crate_disambiguator(self.krate));

for component in &self.data {
write!(s,
"::{}[{}]",
component.data.as_interned_str(),
component.disambiguator)
.unwrap();
}

s
}

pub fn deterministic_hash(&self, tcx: TyCtxt) -> u64 {
let mut state = SipHasher::new();
self.deterministic_hash_to(tcx, &mut state);
state.finish()
}

pub fn deterministic_hash_to<H: Hasher>(&self, tcx: TyCtxt, state: &mut H) {
tcx.crate_name(self.krate).hash(state);
tcx.crate_disambiguator(self.krate).hash(state);
self.data.hash(state);
}
}

/// Root of an inlined item. We track the `DefPath` of the item within
@ -153,31 +190,31 @@ pub enum DefPathData {
/// An impl
Impl,
/// Something in the type NS
TypeNs(ast::Name),
TypeNs(InternedString),
/// Something in the value NS
ValueNs(ast::Name),
ValueNs(InternedString),
/// A module declaration
Module(ast::Name),
Module(InternedString),
/// A macro rule
MacroDef(ast::Name),
MacroDef(InternedString),
/// A closure expression
ClosureExpr,

// Subportions of items
/// A type parameter (generic parameter)
TypeParam(ast::Name),
TypeParam(InternedString),
/// A lifetime definition
LifetimeDef(ast::Name),
LifetimeDef(InternedString),
/// A variant of a enum
EnumVariant(ast::Name),
EnumVariant(InternedString),
/// A struct field
Field(ast::Name),
Field(InternedString),
/// Implicit ctor for a tuple-like struct
StructCtor,
/// Initializer for a const
Initializer,
/// Pattern binding
Binding(ast::Name),
Binding(InternedString),
}

impl Definitions {
@ -291,16 +328,16 @@ impl DefPathData {
pub fn as_interned_str(&self) -> InternedString {
use self::DefPathData::*;
match *self {
TypeNs(name) |
ValueNs(name) |
Module(name) |
MacroDef(name) |
TypeParam(name) |
LifetimeDef(name) |
EnumVariant(name) |
Binding(name) |
Field(name) => {
name.as_str()
TypeNs(ref name) |
ValueNs(ref name) |
Module(ref name) |
MacroDef(ref name) |
TypeParam(ref name) |
LifetimeDef(ref name) |
EnumVariant(ref name) |
Binding(ref name) |
Field(ref name) => {
name.clone()
}

Impl => {
@ -204,9 +204,21 @@ pub struct Map<'ast> {
/// All NodeIds that are numerically greater or equal to this value come
/// from inlined items.
local_node_id_watermark: NodeId,

/// All def-indices that are numerically greater or equal to this value come
/// from inlined items.
local_def_id_watermark: usize,
}

impl<'ast> Map<'ast> {
pub fn is_inlined_def_id(&self, id: DefId) -> bool {
id.is_local() && id.index.as_usize() >= self.local_def_id_watermark
}

pub fn is_inlined_node_id(&self, id: NodeId) -> bool {
id >= self.local_node_id_watermark
}

/// Registers a read in the dependency graph of the AST node with
/// the given `id`. This needs to be called each time a public
/// function returns the HIR for a node -- in other words, when it
@ -214,60 +226,99 @@ impl<'ast> Map<'ast> {
/// otherwise have had access to those contents, and hence needs a
/// read recorded). If the function just returns a DefId or
/// NodeId, no actual content was returned, so no read is needed.
fn read(&self, id: NodeId) {
pub fn read(&self, id: NodeId) {
self.dep_graph.read(self.dep_node(id));
}

fn dep_node(&self, id0: NodeId) -> DepNode<DefId> {
let map = self.map.borrow();
let mut id = id0;
loop {
match map[id as usize] {
EntryItem(_, item) => {
let def_id = self.local_def_id(item.id);
// NB ^~~~~~~
//
// You would expect that `item.id == id`, but this
// is not always the case. In particular, for a
// ViewPath item like `use self::{mem, foo}`, we
// map the ids for `mem` and `foo` to the
// enclosing view path item. This seems mega super
// ultra wrong, but then who am I to judge?
// -nmatsakis
return DepNode::Hir(def_id);
if !self.is_inlined_node_id(id) {
loop {
match map[id as usize] {
EntryItem(_, item) => {
let def_id = self.local_def_id(item.id);
// NB ^~~~~~~
//
// You would expect that `item.id == id`, but this
// is not always the case. In particular, for a
// ViewPath item like `use self::{mem, foo}`, we
// map the ids for `mem` and `foo` to the
// enclosing view path item. This seems mega super
// ultra wrong, but then who am I to judge?
// -nmatsakis
assert!(!self.is_inlined_def_id(def_id));
return DepNode::Hir(def_id);
}

EntryForeignItem(p, _) |
EntryTraitItem(p, _) |
EntryImplItem(p, _) |
EntryVariant(p, _) |
EntryExpr(p, _) |
EntryStmt(p, _) |
EntryLocal(p, _) |
EntryPat(p, _) |
EntryBlock(p, _) |
EntryStructCtor(p, _) |
EntryLifetime(p, _) |
EntryTyParam(p, _) =>
id = p,

RootCrate =>
return DepNode::Krate,

RootInlinedParent(_) =>
bug!("node {} has inlined ancestor but is not inlined", id0),

NotPresent =>
// Some nodes, notably struct fields, are not
// present in the map for whatever reason, but
// they *do* have def-ids. So if we encounter an
// empty hole, check for that case.
return self.opt_local_def_id(id)
.map(|def_id| DepNode::Hir(def_id))
.unwrap_or_else(|| {
bug!("Walking parents from `{}` \
led to `NotPresent` at `{}`",
id0, id)
}),
}
}
} else {
// reading from an inlined def-id is really a read out of
// the metadata from which we loaded the item.
loop {
match map[id as usize] {
EntryItem(p, _) |
EntryForeignItem(p, _) |
EntryTraitItem(p, _) |
EntryImplItem(p, _) |
EntryVariant(p, _) |
EntryExpr(p, _) |
EntryStmt(p, _) |
EntryLocal(p, _) |
EntryPat(p, _) |
EntryBlock(p, _) |
EntryStructCtor(p, _) |
EntryLifetime(p, _) |
EntryTyParam(p, _) =>
id = p,

EntryForeignItem(p, _) |
EntryTraitItem(p, _) |
EntryImplItem(p, _) |
EntryVariant(p, _) |
EntryExpr(p, _) |
EntryStmt(p, _) |
EntryLocal(p, _) |
EntryPat(p, _) |
EntryBlock(p, _) |
EntryStructCtor(p, _) |
EntryLifetime(p, _) |
EntryTyParam(p, _) =>
id = p,
RootInlinedParent(parent) => match *parent {
InlinedItem::Item(def_id, _) |
InlinedItem::TraitItem(def_id, _) |
InlinedItem::ImplItem(def_id, _) |
InlinedItem::Foreign(def_id, _) =>
return DepNode::MetaData(def_id)
},

RootCrate |
RootInlinedParent(_) =>
// FIXME(#32015) clarify story about cross-crate dep tracking
return DepNode::Krate,
RootCrate =>
bug!("node {} has crate ancestor but is inlined", id0),

NotPresent =>
// Some nodes, notably struct fields, are not
// present in the map for whatever reason, but
// they *do* have def-ids. So if we encounter an
// empty hole, check for that case.
return self.opt_local_def_id(id)
.map(|def_id| DepNode::Hir(def_id))
.unwrap_or_else(|| {
bug!("Walking parents from `{}` \
led to `NotPresent` at `{}`",
id0, id)
}),
NotPresent =>
bug!("node {} is inlined but not present in map", id0),
}
}
}
}
@ -664,10 +715,6 @@ impl<'ast> Map<'ast> {
pub fn node_to_user_string(&self, id: NodeId) -> String {
node_id_to_string(self, id, false)
}

pub fn is_inlined(&self, id: NodeId) -> bool {
id >= self.local_node_id_watermark
}
}

pub struct NodesMatchingSuffix<'a, 'ast:'a> {
@ -846,13 +893,15 @@ pub fn map_crate<'ast>(forest: &'ast mut Forest,
}

let local_node_id_watermark = map.len() as NodeId;
let local_def_id_watermark = definitions.len();

Map {
forest: forest,
dep_graph: forest.dep_graph.clone(),
map: RefCell::new(map),
definitions: RefCell::new(definitions),
local_node_id_watermark: local_node_id_watermark
local_node_id_watermark: local_node_id_watermark,
local_def_id_watermark: local_def_id_watermark,
}
}

@ -866,7 +915,8 @@ pub fn map_decoded_item<'ast, F: FoldOps>(map: &Map<'ast>,
-> &'ast InlinedItem {
let mut fld = IdAndSpanUpdater::new(fold_ops);
let ii = match ii {
II::Item(i) => II::Item(i.map(|i| fld.fold_item(i))),
II::Item(d, i) => II::Item(fld.fold_ops.new_def_id(d),
i.map(|i| fld.fold_item(i))),
II::TraitItem(d, ti) => {
II::TraitItem(fld.fold_ops.new_def_id(d),
ti.map(|ti| fld.fold_trait_item(ti)))
@ -875,7 +925,8 @@ pub fn map_decoded_item<'ast, F: FoldOps>(map: &Map<'ast>,
II::ImplItem(fld.fold_ops.new_def_id(d),
ii.map(|ii| fld.fold_impl_item(ii)))
}
II::Foreign(i) => II::Foreign(i.map(|i| fld.fold_foreign_item(i)))
II::Foreign(d, i) => II::Foreign(fld.fold_ops.new_def_id(d),
i.map(|i| fld.fold_foreign_item(i)))
};

let ii = map.forest.inlined_items.alloc(ii);

@ -94,19 +94,19 @@ pub enum DefLike {
/// that we trans.
#[derive(Clone, PartialEq, Eq, RustcEncodable, RustcDecodable, Hash, Debug)]
pub enum InlinedItem {
Item(P<hir::Item>),
Item(DefId /* def-id in source crate */, P<hir::Item>),
TraitItem(DefId /* impl id */, P<hir::TraitItem>),
ImplItem(DefId /* impl id */, P<hir::ImplItem>),
Foreign(P<hir::ForeignItem>),
Foreign(DefId /* extern item */, P<hir::ForeignItem>),
}

/// A borrowed version of `hir::InlinedItem`.
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
pub enum InlinedItemRef<'a> {
Item(&'a hir::Item),
Item(DefId, &'a hir::Item),
TraitItem(DefId, &'a hir::TraitItem),
ImplItem(DefId, &'a hir::ImplItem),
Foreign(&'a hir::ForeignItem)
Foreign(DefId, &'a hir::ForeignItem)
}

/// Item definitions in the currently-compiled crate would have the CrateNum
@ -283,8 +283,8 @@ impl InlinedItem {
where V: Visitor<'ast>
{
match *self {
InlinedItem::Item(ref i) => visitor.visit_item(&i),
InlinedItem::Foreign(ref i) => visitor.visit_foreign_item(&i),
InlinedItem::Item(_, ref i) => visitor.visit_item(&i),
InlinedItem::Foreign(_, ref i) => visitor.visit_foreign_item(&i),
InlinedItem::TraitItem(_, ref ti) => visitor.visit_trait_item(ti),
InlinedItem::ImplItem(_, ref ii) => visitor.visit_impl_item(ii),
}
@ -8,9 +8,31 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.

use util::nodemap::NodeMap;
use dep_graph::{DepGraph, DepNode, DepTrackingMap, DepTrackingMapConfig};
use hir::def_id::DefId;
use mir::repr::Mir;
use std::marker::PhantomData;

pub struct MirMap<'tcx> {
pub map: NodeMap<Mir<'tcx>>,
pub map: DepTrackingMap<MirMapConfig<'tcx>>,
}

impl<'tcx> MirMap<'tcx> {
pub fn new(graph: DepGraph) -> Self {
MirMap {
map: DepTrackingMap::new(graph)
}
}
}

pub struct MirMapConfig<'tcx> {
data: PhantomData<&'tcx ()>
}

impl<'tcx> DepTrackingMapConfig for MirMapConfig<'tcx> {
type Key = DefId;
type Value = Mir<'tcx>;
fn to_dep_node(key: &DefId) -> DepNode<DefId> {
DepNode::Mir(*key)
}
}

@ -11,7 +11,6 @@
use dep_graph::DepNode;
use hir;
use hir::map::DefPathData;
use hir::def_id::DefId;
use mir::mir_map::MirMap;
use mir::repr::{Mir, Promoted};
use ty::TyCtxt;
@ -73,9 +72,6 @@ impl<'a, 'tcx> MirSource {
/// Various information about pass.
pub trait Pass {
// fn should_run(Session) to check if pass should run?
fn dep_node(&self, def_id: DefId) -> DepNode<DefId> {
DepNode::MirPass(def_id)
}
fn name(&self) -> &str {
let name = unsafe { ::std::intrinsics::type_name::<Self>() };
if let Some(tail) = name.rfind(":") {
@ -119,10 +115,11 @@ impl<'tcx, T: MirPass<'tcx>> MirMapPass<'tcx> for T {
map: &mut MirMap<'tcx>,
hooks: &mut [Box<for<'s> MirPassHook<'s>>])
{
for (&id, mir) in &mut map.map {
let def_id = tcx.map.local_def_id(id);
let _task = tcx.dep_graph.in_task(self.dep_node(def_id));

let def_ids = map.map.keys();
for def_id in def_ids {
let _task = tcx.dep_graph.in_task(DepNode::Mir(def_id));
let mir = map.map.get_mut(&def_id).unwrap();
let id = tcx.map.as_local_node_id(def_id).unwrap();
let src = MirSource::from_node(tcx, id);

for hook in &mut *hooks {

@ -745,6 +745,8 @@ options! {DebuggingOptions, DebuggingSetter, basic_debugging_options,
"attempt to recover from parse errors (experimental)"),
incremental: Option<String> = (None, parse_opt_string,
"enable incremental compilation (experimental)"),
incremental_info: bool = (false, parse_bool,
"print high-level information about incremental reuse (or the lack thereof)"),
dump_dep_graph: bool = (false, parse_bool,
"dump the dependency graph to $RUST_DEP_GRAPH (default: /tmp/dep_graph.gv)"),
query_dep_graph: bool = (false, parse_bool,

@ -525,7 +525,7 @@ impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
}

pub fn retrace_path(self, path: &DefPath) -> Option<DefId> {
debug!("retrace_path(path={:?})", path);
debug!("retrace_path(path={:?}, krate={:?})", path, self.crate_name(path.krate));

let root_key = DefKey {
parent: None,

@@ -52,8 +52,10 @@ impl<'tcx, 'lt> TyIVar<'tcx, 'lt> {
self.untracked_get()
}

/// Reads the ivar without registering a dep-graph read. Use with
/// caution.
#[inline]
fn untracked_get(&self) -> Option<Ty<'tcx>> {
pub fn untracked_get(&self) -> Option<Ty<'tcx>> {
match self.0.get() {
None => None,
// valid because of invariant (A)

@@ -1757,8 +1757,7 @@ impl<'a, 'gcx, 'tcx, 'container> AdtDefData<'tcx, 'container> {
/// Due to normalization being eager, this applies even if
/// the associated type is behind a pointer, e.g. issue #31299.
pub fn sized_constraint(&self, tcx: TyCtxt<'a, 'gcx, 'tcx>) -> Ty<'tcx> {
let dep_node = DepNode::SizedConstraint(self.did);
match self.sized_constraint.get(dep_node) {
match self.sized_constraint.get(DepNode::SizedConstraint(self.did)) {
None => {
let global_tcx = tcx.global_tcx();
let this = global_tcx.lookup_adt_def_master(self.did);
@@ -1786,12 +1785,18 @@ impl<'a, 'tcx> AdtDefData<'tcx, 'tcx> {
/// such.
/// - a TyError, if a type contained itself. The representability
/// check should catch this case.
fn calculate_sized_constraint_inner(&'tcx self, tcx: TyCtxt<'a, 'tcx, 'tcx>,
fn calculate_sized_constraint_inner(&'tcx self,
tcx: TyCtxt<'a, 'tcx, 'tcx>,
stack: &mut Vec<AdtDefMaster<'tcx>>)
{

let dep_node = || DepNode::SizedConstraint(self.did);
if self.sized_constraint.get(dep_node()).is_some() {

// Follow the memoization pattern: push the computation of
// DepNode::SizedConstraint as our current task.
let _task = tcx.dep_graph.in_task(dep_node());
if self.sized_constraint.untracked_get().is_some() {
// ---------------
// can skip the dep-graph read since we just pushed the task
return;
}

@@ -168,8 +168,10 @@ fn borrowck_fn(this: &mut BorrowckCtxt,
attributes: &[ast::Attribute]) {
debug!("borrowck_fn(id={})", id);

let def_id = this.tcx.map.local_def_id(id);

if attributes.iter().any(|item| item.check_name("rustc_mir_borrowck")) {
let mir = this.mir_map.unwrap().map.get(&id).unwrap();
let mir = this.mir_map.unwrap().map.get(&def_id).unwrap();
this.with_temp_region_map(id, |this| {
mir::borrowck_mir(this, fk, decl, mir, body, sp, id, attributes)
});

@@ -142,7 +142,7 @@ pub fn lookup_const_by_id<'a, 'tcx: 'a>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
}
let mut used_substs = false;
let expr_ty = match tcx.sess.cstore.maybe_get_item_ast(tcx, def_id) {
Some((&InlinedItem::Item(ref item), _)) => match item.node {
Some((&InlinedItem::Item(_, ref item), _)) => match item.node {
hir::ItemConst(ref ty, ref const_expr) => {
Some((&**const_expr, tcx.ast_ty_to_prim_ty(ty)))
},
@@ -198,7 +198,7 @@ fn inline_const_fn_from_external_crate<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
}

let fn_id = match tcx.sess.cstore.maybe_get_item_ast(tcx, def_id) {
Some((&InlinedItem::Item(ref item), _)) => Some(item.id),
Some((&InlinedItem::Item(_, ref item), _)) => Some(item.id),
Some((&InlinedItem::ImplItem(_, ref item), _)) => Some(item.id),
_ => None
};

@@ -22,6 +22,12 @@ impl BitVector {
BitVector { data: vec![0; num_words] }
}

pub fn clear(&mut self) {
for p in &mut self.data {
*p = 0;
}
}

pub fn contains(&self, bit: usize) -> bool {
let (word, mask) = word_mask(bit);
(self.data[word] & mask) != 0
@@ -118,32 +124,32 @@ impl FromIterator<bool> for BitVector {
}
}

/// A "bit matrix" is basically a square matrix of booleans
/// represented as one gigantic bitvector. In other words, it is as if
/// you have N bitvectors, each of length N. Note that `elements` here is `N`.
/// A "bit matrix" is basically a matrix of booleans represented as
/// one gigantic bitvector. In other words, it is as if you have
/// `rows` bitvectors, each of length `columns`.
#[derive(Clone)]
pub struct BitMatrix {
elements: usize,
columns: usize,
vector: Vec<u64>,
}

impl BitMatrix {
// Create a new `elements x elements` matrix, initially empty.
pub fn new(elements: usize) -> BitMatrix {
// Create a new `rows x columns` matrix, initially empty.
pub fn new(rows: usize, columns: usize) -> BitMatrix {
// For every element, we need one bit for every other
// element. Round up to an even number of u64s.
let u64s_per_elem = u64s(elements);
let u64s_per_row = u64s(columns);
BitMatrix {
elements: elements,
vector: vec![0; elements * u64s_per_elem],
columns: columns,
vector: vec![0; rows * u64s_per_row],
}
}

/// The range of bits for a given element.
fn range(&self, element: usize) -> (usize, usize) {
let u64s_per_elem = u64s(self.elements);
let start = element * u64s_per_elem;
(start, start + u64s_per_elem)
/// The range of bits for a given row.
fn range(&self, row: usize) -> (usize, usize) {
let u64s_per_row = u64s(self.columns);
let start = row * u64s_per_row;
(start, start + u64s_per_row)
}

pub fn add(&mut self, source: usize, target: usize) -> bool {
@@ -173,7 +179,7 @@ impl BitMatrix {
pub fn intersection(&self, a: usize, b: usize) -> Vec<usize> {
let (a_start, a_end) = self.range(a);
let (b_start, b_end) = self.range(b);
let mut result = Vec::with_capacity(self.elements);
let mut result = Vec::with_capacity(self.columns);
for (base, (i, j)) in (a_start..a_end).zip(b_start..b_end).enumerate() {
let mut v = self.vector[i] & self.vector[j];
for bit in 0..64 {
@@ -209,6 +215,15 @@ impl BitMatrix {
}
changed
}

pub fn iter<'a>(&'a self, row: usize) -> BitVectorIter<'a> {
let (start, end) = self.range(row);
BitVectorIter {
iter: self.vector[start..end].iter(),
current: 0,
idx: 0,
}
}
}
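The hunk above changes `BitMatrix` from an implicit `elements x elements` square to an explicit `rows x columns` rectangle, with each row occupying `u64s(columns)` words. A standalone sketch of that row-major layout (not the librustc type itself; `u64s`, `add`, and `contains` are simplified here for illustration):

```rust
// Minimal sketch of a rows-by-columns bit matrix stored in one Vec<u64>.
fn u64s(elements: usize) -> usize {
    (elements + 63) / 64 // round up to whole words
}

struct BitMatrix {
    columns: usize,
    vector: Vec<u64>,
}

impl BitMatrix {
    fn new(rows: usize, columns: usize) -> BitMatrix {
        let u64s_per_row = u64s(columns);
        BitMatrix { columns: columns, vector: vec![0; rows * u64s_per_row] }
    }

    // The word range holding bits for `row`.
    fn range(&self, row: usize) -> (usize, usize) {
        let u64s_per_row = u64s(self.columns);
        let start = row * u64s_per_row;
        (start, start + u64s_per_row)
    }

    // Set bit (row, column); returns true if it was newly set.
    fn add(&mut self, row: usize, column: usize) -> bool {
        let (start, _) = self.range(row);
        let (word, mask) = (column / 64, 1u64 << (column % 64));
        let v = self.vector[start + word];
        self.vector[start + word] = v | mask;
        v & mask == 0
    }

    fn contains(&self, row: usize, column: usize) -> bool {
        let (start, _) = self.range(row);
        let (word, mask) = (column / 64, 1u64 << (column % 64));
        self.vector[start + word] & mask != 0
    }
}

fn main() {
    // 64 rows, 100 columns: rows need not equal columns any more.
    let mut m = BitMatrix::new(64, 100);
    assert!(m.add(3, 75));
    assert!(!m.add(3, 75)); // already set
    assert!(m.contains(3, 75));
    assert!(!m.contains(3, 74));
}
```

The rectangular shape is what lets `matrix_iter` below build a 64x100 matrix, which the old square constructor could not express.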

fn u64s(elements: usize) -> usize {
@@ -294,7 +309,7 @@ fn grow() {

#[test]
fn matrix_intersection() {
let mut vec1 = BitMatrix::new(200);
let mut vec1 = BitMatrix::new(200, 200);

// (*) Elements reachable from both 2 and 65.

@@ -322,3 +337,45 @@ fn matrix_intersection() {
let intersection = vec1.intersection(2, 65);
assert_eq!(intersection, &[10, 64, 160]);
}

#[test]
fn matrix_iter() {
let mut matrix = BitMatrix::new(64, 100);
matrix.add(3, 22);
matrix.add(3, 75);
matrix.add(2, 99);
matrix.add(4, 0);
matrix.merge(3, 5);

let expected = [99];
let mut iter = expected.iter();
for i in matrix.iter(2) {
let j = *iter.next().unwrap();
assert_eq!(i, j);
}
assert!(iter.next().is_none());

let expected = [22, 75];
let mut iter = expected.iter();
for i in matrix.iter(3) {
let j = *iter.next().unwrap();
assert_eq!(i, j);
}
assert!(iter.next().is_none());

let expected = [0];
let mut iter = expected.iter();
for i in matrix.iter(4) {
let j = *iter.next().unwrap();
assert_eq!(i, j);
}
assert!(iter.next().is_none());

let expected = [22, 75];
let mut iter = expected.iter();
for i in matrix.iter(5) {
let j = *iter.next().unwrap();
assert_eq!(i, j);
}
assert!(iter.next().is_none());
}

@@ -296,12 +296,7 @@ impl<N: Debug, E: Debug> Graph<N, E> {
start: NodeIndex,
direction: Direction)
-> DepthFirstTraversal<'a, N, E> {
DepthFirstTraversal {
graph: self,
stack: vec![start],
visited: BitVector::new(self.nodes.len()),
direction: direction,
}
DepthFirstTraversal::with_start_node(self, start, direction)
}
}

@@ -378,26 +373,57 @@ pub struct DepthFirstTraversal<'g, N: 'g, E: 'g> {
direction: Direction,
}

impl<'g, N: Debug, E: Debug> DepthFirstTraversal<'g, N, E> {
pub fn new(graph: &'g Graph<N, E>, direction: Direction) -> Self {
let visited = BitVector::new(graph.len_nodes());
DepthFirstTraversal {
graph: graph,
stack: vec![],
visited: visited,
direction: direction
}
}

pub fn with_start_node(graph: &'g Graph<N, E>,
start_node: NodeIndex,
direction: Direction)
-> Self {
let mut visited = BitVector::new(graph.len_nodes());
visited.insert(start_node.node_id());
DepthFirstTraversal {
graph: graph,
stack: vec![start_node],
visited: visited,
direction: direction
}
}

pub fn reset(&mut self, start_node: NodeIndex) {
self.stack.truncate(0);
self.stack.push(start_node);
self.visited.clear();
self.visited.insert(start_node.node_id());
}

fn visit(&mut self, node: NodeIndex) {
if self.visited.insert(node.node_id()) {
self.stack.push(node);
}
}
}

impl<'g, N: Debug, E: Debug> Iterator for DepthFirstTraversal<'g, N, E> {
type Item = NodeIndex;

fn next(&mut self) -> Option<NodeIndex> {
while let Some(idx) = self.stack.pop() {
if !self.visited.insert(idx.node_id()) {
continue;
}

let next = self.stack.pop();
if let Some(idx) = next {
for (_, edge) in self.graph.adjacent_edges(idx, self.direction) {
let target = edge.source_or_target(self.direction);
if !self.visited.contains(target.node_id()) {
self.stack.push(target);
}
self.visit(target);
}

return Some(idx);
}

return None;
next
}
}
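The rewrite above turns the depth-first walk into a reusable, resettable iterator that marks nodes visited as they are popped and pushes unvisited neighbors. A minimal sketch of that same pop/mark/push loop over a plain adjacency list (illustrative names; this is not the librustc `Graph`/`NodeIndex` API):

```rust
// Sketch of a depth-first iterator over an adjacency-list graph: each node
// is yielded at most once, in stack (LIFO) order.
struct Dfs<'g> {
    edges: &'g Vec<Vec<usize>>, // edges[n] = successors of node n
    stack: Vec<usize>,
    visited: Vec<bool>,
}

impl<'g> Iterator for Dfs<'g> {
    type Item = usize;

    fn next(&mut self) -> Option<usize> {
        while let Some(idx) = self.stack.pop() {
            // A node can be pushed more than once; only yield it the
            // first time it is popped.
            if self.visited[idx] {
                continue;
            }
            self.visited[idx] = true;
            for &target in &self.edges[idx] {
                if !self.visited[target] {
                    self.stack.push(target);
                }
            }
            return Some(idx);
        }
        None
    }
}

fn main() {
    // 0 -> {1, 2}, 1 -> {2}: node 2 is reachable twice but yielded once.
    let edges = vec![vec![1, 2], vec![2], vec![]];
    let dfs = Dfs { edges: &edges, stack: vec![0], visited: vec![false; 3] };
    let order: Vec<usize> = dfs.collect();
    assert_eq!(order, vec![0, 2, 1]);
}
```

Keeping the stack and visited set in the struct is what makes `reset` cheap: the allocations are reused across traversals instead of rebuilt per call.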

@@ -252,7 +252,8 @@ impl<T: Debug + PartialEq> TransitiveRelation<T> {
}

fn compute_closure(&self) -> BitMatrix {
let mut matrix = BitMatrix::new(self.elements.len());
let mut matrix = BitMatrix::new(self.elements.len(),
self.elements.len());
let mut changed = true;
while changed {
changed = false;

@@ -956,20 +956,24 @@ fn print_with_analysis<'tcx, 'a: 'tcx>(sess: &'a Session,
PpmMir | PpmMirCFG => {
if let Some(mir_map) = mir_map {
if let Some(nodeid) = nodeid {
let mir = mir_map.map.get(&nodeid).unwrap_or_else(|| {
sess.fatal(&format!("no MIR map entry for node {}", nodeid))
});
let def_id = tcx.map.local_def_id(nodeid);
match ppm {
PpmMir => write_mir_pretty(tcx, iter::once((&nodeid, mir)), &mut out),
PpmMir => write_mir_pretty(tcx, iter::once(def_id), &mir_map, &mut out),
PpmMirCFG => {
write_mir_graphviz(tcx, iter::once((&nodeid, mir)), &mut out)
write_mir_graphviz(tcx, iter::once(def_id), &mir_map, &mut out)
}
_ => unreachable!(),
}?;
} else {
match ppm {
PpmMir => write_mir_pretty(tcx, mir_map.map.iter(), &mut out),
PpmMirCFG => write_mir_graphviz(tcx, mir_map.map.iter(), &mut out),
PpmMir => write_mir_pretty(tcx,
mir_map.map.keys().into_iter(),
&mir_map,
&mut out),
PpmMirCFG => write_mir_graphviz(tcx,
mir_map.map.keys().into_iter(),
&mir_map,
&mut out),
_ => unreachable!(),
}?;
}

@@ -71,6 +71,13 @@ pub fn assert_dep_graph<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>) {
dump_graph(tcx);
}

// if the `rustc_attrs` feature is not enabled, then the
// attributes we are interested in cannot be present anyway, so
// skip the walk.
if !tcx.sess.features.borrow().rustc_attrs {
return;
}

// Find annotations supplied by user (if any).
let (if_this_changed, then_this_would_need) = {
let mut visitor = IfThisChanged { tcx: tcx,

@@ -14,6 +14,7 @@
use syntax::attr::AttributeMethods;
use std::hash::{Hash, SipHasher, Hasher};
use rustc::hir::def_id::{CRATE_DEF_INDEX, DefId};
use rustc::hir::map::{NodeItem, NodeForeignItem};
use rustc::hir::svh::Svh;
use rustc::ty::TyCtxt;
use rustc::hir::intravisit::{self, Visitor};
@@ -92,8 +93,12 @@ impl<'a, 'tcx> SvhCalculate for TyCtxt<'a, 'tcx, 'tcx> {
intravisit::walk_crate(&mut visit, krate);
} else {
let node_id = self.map.as_local_node_id(def_id).unwrap();
let item = self.map.expect_item(node_id);
visit.visit_item(item);
match self.map.find(node_id) {
Some(NodeItem(item)) => visit.visit_item(item),
Some(NodeForeignItem(item)) => visit.visit_foreign_item(item),
r => bug!("calculate_item_hash: expected an item for node {} not {:?}",
node_id, r),
}
}
}

@@ -139,12 +144,7 @@ mod svh_visitor {
}

fn hash_def_path(&mut self, path: &DefPath) {
self.tcx.crate_name(path.krate).hash(self.st);
self.tcx.crate_disambiguator(path.krate).hash(self.st);
for data in &path.data {
data.data.as_interned_str().hash(self.st);
data.disambiguator.hash(self.st);
}
path.deterministic_hash_to(self.tcx, self.st);
}
}

@@ -19,7 +19,6 @@ use super::directory::DefPathIndex;
/// Data for use when recompiling the **current crate**.
#[derive(Debug, RustcEncodable, RustcDecodable)]
pub struct SerializedDepGraph {
pub nodes: Vec<DepNode<DefPathIndex>>,
pub edges: Vec<SerializedEdge>,

/// These are hashes of two things:
@@ -44,15 +43,22 @@ pub struct SerializedDepGraph {
pub hashes: Vec<SerializedHash>,
}

/// Represents a "reduced" dependency edge. Unlike the full dep-graph,
/// the dep-graph we serialize contains only edges `S -> T` where the
/// source `S` is something hashable (a HIR node or foreign metadata)
/// and the target `T` is something significant, like a work-product.
/// Normally, significant nodes are only those that have saved data on
/// disk, but in unit-testing the set of significant nodes can be
/// increased.
pub type SerializedEdge = (DepNode<DefPathIndex>, DepNode<DefPathIndex>);

#[derive(Debug, RustcEncodable, RustcDecodable)]
pub struct SerializedHash {
/// node being hashed; either a Hir or MetaData variant, in
/// practice
pub node: DepNode<DefPathIndex>,
/// def-id of thing being hashed
pub dep_node: DepNode<DefPathIndex>,

/// the hash itself, computed by `calculate_item_hash`
/// the hash as of previous compilation, computed by code in
/// `hash` module
pub hash: u64,
}

@@ -16,9 +16,12 @@
use rustc::dep_graph::DepNode;
use rustc::hir::map::DefPath;
use rustc::hir::def_id::DefId;
use rustc::middle::cstore::LOCAL_CRATE;
use rustc::ty::TyCtxt;
use rustc::util::nodemap::DefIdMap;
use std::fmt::{self, Debug};
use std::iter::once;
use syntax::ast;

/// Index into the DefIdDirectory
#[derive(Copy, Clone, Debug, PartialOrd, Ord, Hash, PartialEq, Eq,
@@ -31,17 +34,79 @@ pub struct DefPathIndex {
pub struct DefIdDirectory {
// N.B. don't use Removable here because these def-ids are loaded
// directly without remapping, so loading them should not fail.
paths: Vec<DefPath>
paths: Vec<DefPath>,

// For each crate, saves the crate-name/disambiguator so that
// later we can match crate-numbers up again.
krates: Vec<CrateInfo>,
}

#[derive(Debug, RustcEncodable, RustcDecodable)]
pub struct CrateInfo {
krate: ast::CrateNum,
name: String,
disambiguator: String,
}

impl DefIdDirectory {
pub fn new() -> DefIdDirectory {
DefIdDirectory { paths: vec![] }
pub fn new(krates: Vec<CrateInfo>) -> DefIdDirectory {
DefIdDirectory { paths: vec![], krates: krates }
}

fn max_current_crate(&self, tcx: TyCtxt) -> ast::CrateNum {
tcx.sess.cstore.crates()
.into_iter()
.max()
.unwrap_or(LOCAL_CRATE)
}

/// Returns a string form for `index`; useful for debugging
pub fn def_path_string(&self, tcx: TyCtxt, index: DefPathIndex) -> String {
let path = &self.paths[index.index as usize];
if self.krate_still_valid(tcx, self.max_current_crate(tcx), path.krate) {
path.to_string(tcx)
} else {
format!("<crate {} changed>", path.krate)
}
}

pub fn krate_still_valid(&self,
tcx: TyCtxt,
max_current_crate: ast::CrateNum,
krate: ast::CrateNum) -> bool {
// Check that the crate-number still matches. For now, if it
// doesn't, just return None. We could do better, such as
// finding the new number.

if krate > max_current_crate {
false
} else {
let old_info = &self.krates[krate as usize];
assert_eq!(old_info.krate, krate);
let old_name: &str = &old_info.name;
let old_disambiguator: &str = &old_info.disambiguator;
let new_name: &str = &tcx.crate_name(krate);
let new_disambiguator: &str = &tcx.crate_disambiguator(krate);
old_name == new_name && old_disambiguator == new_disambiguator
}
}
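The `krate_still_valid` check above trusts a saved crate number only if the recorded name/disambiguator pair still matches the current session. A reduced sketch of just that comparison (the crate names and disambiguator strings below are hypothetical; `CrateInfo` fields follow the diff):

```rust
// Sketch: a saved crate record is valid only if both the crate name and
// its disambiguator are unchanged in the current compilation session.
#[derive(Debug)]
struct CrateInfo {
    name: String,
    disambiguator: String,
}

fn krate_still_valid(old: &CrateInfo, new_name: &str, new_disambiguator: &str) -> bool {
    old.name == new_name && old.disambiguator == new_disambiguator
}

fn main() {
    let old = CrateInfo {
        name: "glium".to_string(),
        disambiguator: "abc123".to_string(),
    };
    assert!(krate_still_valid(&old, "glium", "abc123"));
    // A rebuilt dependency gets a fresh disambiguator, so its old
    // crate number must not be reused.
    assert!(!krate_still_valid(&old, "glium", "def456"));
}
```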

pub fn retrace(&self, tcx: TyCtxt) -> RetracedDefIdDirectory {
let max_current_crate = self.max_current_crate(tcx);

let ids = self.paths.iter()
.map(|path| tcx.retrace_path(path))
.map(|path| {
if self.krate_still_valid(tcx, max_current_crate, path.krate) {
tcx.retrace_path(path)
} else {
debug!("crate {} changed from {:?} to {:?}/{:?}",
path.krate,
self.krates[path.krate as usize],
tcx.crate_name(path.krate),
tcx.crate_disambiguator(path.krate));
None
}
})
.collect();
RetracedDefIdDirectory { ids: ids }
}
@@ -70,10 +135,26 @@ pub struct DefIdDirectoryBuilder<'a,'tcx:'a> {

impl<'a,'tcx> DefIdDirectoryBuilder<'a,'tcx> {
pub fn new(tcx: TyCtxt<'a, 'tcx, 'tcx>) -> DefIdDirectoryBuilder<'a, 'tcx> {
let mut krates: Vec<_> =
once(LOCAL_CRATE)
.chain(tcx.sess.cstore.crates())
.map(|krate| {
CrateInfo {
krate: krate,
name: tcx.crate_name(krate).to_string(),
disambiguator: tcx.crate_disambiguator(krate).to_string()
}
})
.collect();

// the result of crates() is not in order, so sort list of
// crates so that we can just index it later
krates.sort_by_key(|k| k.krate);

DefIdDirectoryBuilder {
tcx: tcx,
hash: DefIdMap(),
directory: DefIdDirectory::new()
directory: DefIdDirectory::new(krates),
}
}

@@ -91,12 +172,17 @@ impl<'a,'tcx> DefIdDirectoryBuilder<'a,'tcx> {
.clone()
}

pub fn lookup_def_path(&self, id: DefPathIndex) -> &DefPath {
&self.directory.paths[id.index as usize]
}

pub fn map(&mut self, node: &DepNode<DefId>) -> DepNode<DefPathIndex> {
node.map_def(|&def_id| Some(self.add(def_id))).unwrap()
}

pub fn into_directory(self) -> DefIdDirectory {
self.directory
pub fn directory(&self) -> &DefIdDirectory {
&self.directory
}
}

@@ -39,12 +39,19 @@ impl<'a, 'tcx> HashContext<'a, 'tcx> {
}
}

pub fn hash(&mut self, dep_node: &DepNode<DefId>) -> Option<u64> {
pub fn is_hashable(dep_node: &DepNode<DefId>) -> bool {
match *dep_node {
DepNode::Hir(_) => true,
DepNode::MetaData(def_id) => !def_id.is_local(),
_ => false,
}
}

pub fn hash(&mut self, dep_node: &DepNode<DefId>) -> Option<(DefId, u64)> {
match *dep_node {
// HIR nodes (which always come from our crate) are an input:
DepNode::Hir(def_id) => {
assert!(def_id.is_local());
Some(self.hir_hash(def_id))
Some((def_id, self.hir_hash(def_id)))
}

// MetaData from other crates is an *input* to us.
@@ -52,7 +59,7 @@ impl<'a, 'tcx> HashContext<'a, 'tcx> {
// don't hash them, but we do compute a hash for them and
// save it for others to use.
DepNode::MetaData(def_id) if !def_id.is_local() => {
Some(self.metadata_hash(def_id))
Some((def_id, self.metadata_hash(def_id)))
}

_ => {
@@ -66,7 +73,16 @@ impl<'a, 'tcx> HashContext<'a, 'tcx> {
}

fn hir_hash(&mut self, def_id: DefId) -> u64 {
assert!(def_id.is_local());
assert!(def_id.is_local(),
"cannot hash HIR for non-local def-id {:?} => {:?}",
def_id,
self.tcx.item_path_str(def_id));

assert!(!self.tcx.map.is_inlined_def_id(def_id),
"cannot hash HIR for inlined def-id {:?} => {:?}",
def_id,
self.tcx.item_path_str(def_id));

// FIXME(#32753) -- should we use a distinct hash here
self.tcx.calculate_item_hash(def_id)
}

@@ -28,7 +28,7 @@ use super::dirty_clean;
use super::hash::*;
use super::util::*;

type DirtyNodes = FnvHashSet<DepNode<DefId>>;
type DirtyNodes = FnvHashSet<DepNode<DefPathIndex>>;

type CleanEdges = Vec<(DepNode<DefId>, DepNode<DefId>)>;

@@ -93,7 +93,6 @@ fn load_data(sess: &Session, path: &Path) -> Option<Vec<u8>> {
None
}
}

}

/// Decode the dep graph and load the edges/nodes that are still clean
@@ -108,155 +107,125 @@ pub fn decode_dep_graph<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
let directory = try!(DefIdDirectory::decode(&mut dep_graph_decoder));
let serialized_dep_graph = try!(SerializedDepGraph::decode(&mut dep_graph_decoder));

debug!("decode_dep_graph: directory = {:#?}", directory);
debug!("decode_dep_graph: serialized_dep_graph = {:#?}", serialized_dep_graph);

// Retrace the paths in the directory to find their current location (if any).
let retraced = directory.retrace(tcx);

debug!("decode_dep_graph: retraced = {:#?}", retraced);
// Compute the set of Hir nodes whose data has changed or which
// have been removed. These are "raw" source nodes, which means
// that they still use the original `DefPathIndex` values from the
// encoding, rather than having been retraced to a `DefId`. The
// reason for this is that this way we can include nodes that have
// been removed (which no longer have a `DefId` in the current
// compilation).
let dirty_raw_source_nodes = dirty_nodes(tcx, &serialized_dep_graph.hashes, &retraced);

// Compute the set of Hir nodes whose data has changed.
let mut dirty_nodes =
initial_dirty_nodes(tcx, &serialized_dep_graph.hashes, &retraced);
// Create a list of (raw-source-node ->
// retraced-target-node) edges. In the process of retracing the
// target nodes, we may discover that some of their def-paths no longer exist,
// in which case there is no need to mark the corresponding nodes as dirty
// (they are just not present). So this list may be smaller than the original.
//
// Note though that in the common case the target nodes are
// `DepNode::WorkProduct` instances, and those don't have a
// def-id, so they will never be considered to not exist. Instead,
// we do a secondary hashing step (later, in trans) when we know
// the set of symbols that go into a work-product: if any symbols
// have been removed (or added) the hash will be different and
// we'll ignore the work-product then.
let retraced_edges: Vec<_> =
serialized_dep_graph.edges.iter()
.filter_map(|&(ref raw_source_node, ref raw_target_node)| {
retraced.map(raw_target_node)
.map(|target_node| (raw_source_node, target_node))
})
.collect();

debug!("decode_dep_graph: initial dirty_nodes = {:#?}", dirty_nodes);
// Compute which work-products have an input that has changed or
// been removed. Put the dirty ones into a set.
let mut dirty_target_nodes = FnvHashSet();
for &(raw_source_node, ref target_node) in &retraced_edges {
if dirty_raw_source_nodes.contains(raw_source_node) {
if !dirty_target_nodes.contains(target_node) {
dirty_target_nodes.insert(target_node.clone());

// Find all DepNodes reachable from that core set. This loop
// iterates repeatedly over the list of edges whose source is not
// known to be dirty (`clean_edges`). If it finds an edge whose
// source is dirty, it removes it from that list and adds the
// target to `dirty_nodes`. It stops when it reaches a fixed
// point.
let clean_edges = compute_clean_edges(&serialized_dep_graph.edges,
&retraced,
&mut dirty_nodes);
if tcx.sess.opts.debugging_opts.incremental_info {
// It'd be nice to pretty-print these paths better than just
// using the `Debug` impls, but wev.
println!("module {:?} is dirty because {:?} changed or was removed",
target_node,
raw_source_node.map_def(|&index| {
Some(directory.def_path_string(tcx, index))
}).unwrap());
}
}
}
}

// Add synthetic `foo->foo` edges for each clean node `foo` that
// we had before. This is sort of a hack to create clean nodes in
// the graph, since the existence of a node is a signal that the
// work it represents need not be repeated.
let clean_nodes =
serialized_dep_graph.nodes
.iter()
.filter_map(|node| retraced.map(node))
.filter(|node| !dirty_nodes.contains(node))
.map(|node| (node.clone(), node));

// Add nodes and edges that are not dirty into our main graph.
// For work-products that are still clean, add their deps into the
// graph. This is needed because later we will have to save this
// back out again!
let dep_graph = tcx.dep_graph.clone();
for (source, target) in clean_edges.into_iter().chain(clean_nodes) {
debug!("decode_dep_graph: clean edge: {:?} -> {:?}", source, target);
for (raw_source_node, target_node) in retraced_edges {
if dirty_target_nodes.contains(&target_node) {
continue;
}

let _task = dep_graph.in_task(target);
dep_graph.read(source);
let source_node = retraced.map(raw_source_node).unwrap();

debug!("decode_dep_graph: clean edge: {:?} -> {:?}", source_node, target_node);

let _task = dep_graph.in_task(target_node);
dep_graph.read(source_node);
}

// Add in work-products that are still clean, and delete those that are
// dirty.
let mut work_product_decoder = Decoder::new(work_products_data, 0);
let work_products = try!(<Vec<SerializedWorkProduct>>::decode(&mut work_product_decoder));
reconcile_work_products(tcx, work_products, &dirty_nodes);
reconcile_work_products(tcx, work_products, &dirty_target_nodes);

Ok(())
}

fn initial_dirty_nodes<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
hashes: &[SerializedHash],
retraced: &RetracedDefIdDirectory)
-> DirtyNodes {
/// Computes which of the original set of def-ids are dirty. Stored in
/// a bit vector where the index is the DefPathIndex.
fn dirty_nodes<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
hashes: &[SerializedHash],
retraced: &RetracedDefIdDirectory)
-> DirtyNodes {
let mut hcx = HashContext::new(tcx);
let mut items_removed = false;
let mut dirty_nodes = FnvHashSet();
for hash in hashes {
match hash.node.map_def(|&i| retraced.def_id(i)) {
Some(dep_node) => {
let current_hash = hcx.hash(&dep_node).unwrap();
if current_hash != hash.hash {
debug!("initial_dirty_nodes: {:?} is dirty as hash is {:?}, was {:?}",
dep_node, current_hash, hash.hash);
dirty_nodes.insert(dep_node);
}
}
None => {
items_removed = true;
}
}
}

// If any of the items in the krate have changed, then we consider
// the meta-node `Krate` to be dirty, since that means something
// which (potentially) read the contents of every single item.
if items_removed || !dirty_nodes.is_empty() {
dirty_nodes.insert(DepNode::Krate);
for hash in hashes {
if let Some(dep_node) = retraced.map(&hash.dep_node) {
let (_, current_hash) = hcx.hash(&dep_node).unwrap();
if current_hash == hash.hash {
continue;
}
debug!("initial_dirty_nodes: {:?} is dirty as hash is {:?}, was {:?}",
dep_node.map_def(|&def_id| Some(tcx.def_path(def_id))).unwrap(),
current_hash,
hash.hash);
} else {
debug!("initial_dirty_nodes: {:?} is dirty as it was removed",
hash.dep_node);
}

dirty_nodes.insert(hash.dep_node.clone());
}

dirty_nodes
}

fn compute_clean_edges(serialized_edges: &[(SerializedEdge)],
retraced: &RetracedDefIdDirectory,
dirty_nodes: &mut DirtyNodes)
-> CleanEdges {
// Build up an initial list of edges. Include an edge (source,
// target) if neither node has been removed. If the source has
// been removed, add target to the list of dirty nodes.
let mut clean_edges = Vec::with_capacity(serialized_edges.len());
for &(ref serialized_source, ref serialized_target) in serialized_edges {
if let Some(target) = retraced.map(serialized_target) {
if let Some(source) = retraced.map(serialized_source) {
clean_edges.push((source, target))
} else {
// source removed, target must be dirty
debug!("compute_clean_edges: {:?} dirty because {:?} no longer exists",
target, serialized_source);
dirty_nodes.insert(target);
}
} else {
// target removed, ignore the edge
}
}

debug!("compute_clean_edges: dirty_nodes={:#?}", dirty_nodes);

// Propagate dirty marks by iterating repeatedly over
// `clean_edges`. If we find an edge `(source, target)` where
// `source` is dirty, add `target` to the list of dirty nodes and
// remove it. Keep doing this until we find no more dirty nodes.
let mut previous_size = 0;
while dirty_nodes.len() > previous_size {
debug!("compute_clean_edges: previous_size={}", previous_size);
previous_size = dirty_nodes.len();
let mut i = 0;
while i < clean_edges.len() {
if dirty_nodes.contains(&clean_edges[i].0) {
let (source, target) = clean_edges.swap_remove(i);
debug!("compute_clean_edges: dirty source {:?} -> {:?}",
source, target);
dirty_nodes.insert(target);
} else if dirty_nodes.contains(&clean_edges[i].1) {
let (source, target) = clean_edges.swap_remove(i);
debug!("compute_clean_edges: dirty target {:?} -> {:?}",
source, target);
} else {
i += 1;
}
}
}

clean_edges
}

/// Go through the list of work-products produced in the previous run.
|
||||
/// Delete any whose nodes have been found to be dirty or which are
|
||||
/// otherwise no longer applicable.
|
||||
fn reconcile_work_products<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
|
||||
work_products: Vec<SerializedWorkProduct>,
|
||||
dirty_nodes: &DirtyNodes) {
|
||||
dirty_target_nodes: &FnvHashSet<DepNode<DefId>>) {
|
||||
debug!("reconcile_work_products({:?})", work_products);
|
||||
for swp in work_products {
|
||||
let dep_node = DepNode::WorkProduct(swp.id.clone());
|
||||
if dirty_nodes.contains(&dep_node) {
|
||||
if dirty_target_nodes.contains(&DepNode::WorkProduct(swp.id.clone())) {
|
||||
debug!("reconcile_work_products: dep-node for {:?} is dirty", swp);
|
||||
delete_dirty_work_product(tcx, swp);
|
||||
} else {
|
||||
|
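The dirty-propagation loop in `compute_clean_edges` is a fixed-point computation. A minimal standalone sketch of the same idea, using plain `u32` node ids and a hypothetical `propagate_dirty` helper in place of the compiler's `DepNode`/`DirtyNodes` types (the dirty-*target* branch is omitted for brevity):

```rust
use std::collections::HashSet;

// Repeatedly sweep the edge list: any edge whose source is dirty is
// removed, and its target becomes dirty too. Stop once a full sweep
// adds no new dirty nodes; the surviving edges are the clean ones.
fn propagate_dirty(mut edges: Vec<(u32, u32)>, dirty: &mut HashSet<u32>) -> Vec<(u32, u32)> {
    let mut previous_size = 0;
    while dirty.len() > previous_size {
        previous_size = dirty.len();
        let mut i = 0;
        while i < edges.len() {
            if dirty.contains(&edges[i].0) {
                let (_, target) = edges.swap_remove(i);
                dirty.insert(target);
            } else {
                i += 1;
            }
        }
    }
    edges
}
```

Starting with only node 1 dirty and edges 1→2→3 plus an unrelated 4→5, the sweep dirties 2 and then 3, and only (4, 5) survives as clean.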
@@ -17,6 +17,7 @@ mod directory;
mod dirty_clean;
mod hash;
mod load;
mod preds;
mod save;
mod util;
mod work_product;

src/librustc_incremental/persist/preds.rs (new file, 73 lines)
@@ -0,0 +1,73 @@
// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

use rustc::dep_graph::{DepGraphQuery, DepNode};
use rustc::hir::def_id::DefId;
use rustc_data_structures::fnv::FnvHashMap;
use rustc_data_structures::graph::{DepthFirstTraversal, INCOMING, NodeIndex};

use super::hash::*;

/// A data-structure that makes it easy to enumerate the hashable
/// predecessors of any given dep-node.
pub struct Predecessors<'query> {
    // - Keys: dep-nodes that may have work-products, output meta-data
    //   nodes.
    // - Values: transitive predecessors of the key that are hashable
    //   (e.g., HIR nodes, input meta-data nodes)
    pub inputs: FnvHashMap<&'query DepNode<DefId>, Vec<&'query DepNode<DefId>>>,

    // - Keys: some hashable node
    // - Values: the hash thereof
    pub hashes: FnvHashMap<&'query DepNode<DefId>, u64>,
}

impl<'q> Predecessors<'q> {
    pub fn new(query: &'q DepGraphQuery<DefId>, hcx: &mut HashContext) -> Self {
        // Find nodes for which we want to know the full set of preds
        let mut dfs = DepthFirstTraversal::new(&query.graph, INCOMING);
        let all_nodes = query.graph.all_nodes();
        let tcx = hcx.tcx;

        let inputs: FnvHashMap<_, _> = all_nodes.iter()
            .enumerate()
            .filter(|&(_, node)| match node.data {
                DepNode::WorkProduct(_) => true,
                DepNode::MetaData(ref def_id) => def_id.is_local(),

                // if -Z query-dep-graph is passed, save more extended data
                // to enable better unit testing
                DepNode::TypeckItemBody(_) |
                DepNode::TransCrateItem(_) => tcx.sess.opts.debugging_opts.query_dep_graph,

                _ => false,
            })
            .map(|(node_index, node)| {
                dfs.reset(NodeIndex(node_index));
                let inputs: Vec<_> = dfs.by_ref()
                    .map(|i| &all_nodes[i.node_id()].data)
                    .filter(|d| HashContext::is_hashable(d))
                    .collect();
                (&node.data, inputs)
            })
            .collect();

        let mut hashes = FnvHashMap();
        for input in inputs.values().flat_map(|v| v.iter().cloned()) {
            hashes.entry(input)
                  .or_insert_with(|| hcx.hash(input).unwrap().1);
        }

        Predecessors {
            inputs: inputs,
            hashes: hashes,
        }
    }
}
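`Predecessors::new` walks the dep-graph against the edge direction (`INCOMING`) from each output node, keeping only hashable predecessors. The reverse-reachability walk at its core can be sketched with a hypothetical `transitive_preds` over a plain adjacency map (std `HashMap`/`HashSet` stand in for the compiler's graph types):

```rust
use std::collections::{HashMap, HashSet};

// `preds` maps each node to its direct predecessors; walk it
// transitively from `start`, collecting everything reachable backwards.
fn transitive_preds(preds: &HashMap<u32, Vec<u32>>, start: u32) -> HashSet<u32> {
    let mut seen = HashSet::new();
    let mut stack = vec![start];
    while let Some(n) = stack.pop() {
        if let Some(ps) = preds.get(&n) {
            for &p in ps {
                if seen.insert(p) {
                    stack.push(p);
                }
            }
        }
    }
    seen
}
```

With predecessor lists 3←2, 2←1, and an unrelated 5←4, the walk from 3 finds exactly {1, 2}.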
@@ -10,11 +10,13 @@

use rbml::opaque::Encoder;
use rustc::dep_graph::DepNode;
use rustc::hir::def_id::DefId;
use rustc::middle::cstore::LOCAL_CRATE;
use rustc::session::Session;
use rustc::ty::TyCtxt;
use rustc_serialize::{Encodable as RustcEncodable};
use std::hash::{Hasher, SipHasher};
use rustc_data_structures::fnv::FnvHashMap;
use rustc_serialize::Encodable as RustcEncodable;
use std::hash::{Hash, Hasher, SipHasher};
use std::io::{self, Cursor, Write};
use std::fs::{self, File};
use std::path::PathBuf;

@@ -22,15 +24,26 @@ use std::path::PathBuf;

use super::data::*;
use super::directory::*;
use super::hash::*;
use super::preds::*;
use super::util::*;

pub fn save_dep_graph<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>) {
    debug!("save_dep_graph()");
    let _ignore = tcx.dep_graph.in_ignore();
    let sess = tcx.sess;
    if sess.opts.incremental.is_none() {
        return;
    }
    let mut hcx = HashContext::new(tcx);
    save_in(sess, dep_graph_path(tcx), |e| encode_dep_graph(&mut hcx, e));
    save_in(sess, metadata_hash_path(tcx, LOCAL_CRATE), |e| encode_metadata_hashes(&mut hcx, e));
    let mut builder = DefIdDirectoryBuilder::new(tcx);
    let query = tcx.dep_graph.query();
    let preds = Predecessors::new(&query, &mut hcx);
    save_in(sess,
            dep_graph_path(tcx),
            |e| encode_dep_graph(&preds, &mut builder, e));
    save_in(sess,
            metadata_hash_path(tcx, LOCAL_CRATE),
            |e| encode_metadata_hashes(tcx, &preds, &mut builder, e));
}

pub fn save_work_products(sess: &Session, local_crate_name: &str) {

@@ -40,14 +53,12 @@ pub fn save_work_products(sess: &Session, local_crate_name: &str) {
    save_in(sess, path, |e| encode_work_products(sess, e));
}

fn save_in<F>(sess: &Session,
              opt_path_buf: Option<PathBuf>,
              encode: F)
fn save_in<F>(sess: &Session, opt_path_buf: Option<PathBuf>, encode: F)
    where F: FnOnce(&mut Encoder) -> io::Result<()>
{
    let path_buf = match opt_path_buf {
        Some(p) => p,
        None => return
        None => return,
    };

    // FIXME(#32754) lock file?

@@ -55,11 +66,11 @@ fn save_in<F>(sess: &Session,

    // delete the old dep-graph, if any
    if path_buf.exists() {
        match fs::remove_file(&path_buf) {
            Ok(()) => { }
            Ok(()) => {}
            Err(err) => {
                sess.err(
                    &format!("unable to delete old dep-graph at `{}`: {}",
                             path_buf.display(), err));
                sess.err(&format!("unable to delete old dep-graph at `{}`: {}",
                                  path_buf.display(),
                                  err));
                return;
            }
        }
@@ -68,97 +79,106 @@ fn save_in<F>(sess: &Session,

    // generate the data in a memory buffer
    let mut wr = Cursor::new(Vec::new());
    match encode(&mut Encoder::new(&mut wr)) {
        Ok(()) => { }
        Ok(()) => {}
        Err(err) => {
            sess.err(
                &format!("could not encode dep-graph to `{}`: {}",
                         path_buf.display(), err));
            sess.err(&format!("could not encode dep-graph to `{}`: {}",
                              path_buf.display(),
                              err));
            return;
        }
    }

    // write the data out
    let data = wr.into_inner();
    match
        File::create(&path_buf)
        .and_then(|mut file| file.write_all(&data))
    {
        Ok(_) => { }
    match File::create(&path_buf).and_then(|mut file| file.write_all(&data)) {
        Ok(_) => {}
        Err(err) => {
            sess.err(
                &format!("failed to write dep-graph to `{}`: {}",
                         path_buf.display(), err));
            sess.err(&format!("failed to write dep-graph to `{}`: {}",
                              path_buf.display(),
                              err));
            return;
        }
    }
}

pub fn encode_dep_graph<'a, 'tcx>(hcx: &mut HashContext<'a, 'tcx>,
                                  encoder: &mut Encoder)
                                  -> io::Result<()>
{
    let tcx = hcx.tcx;
    let query = tcx.dep_graph.query();

    let mut builder = DefIdDirectoryBuilder::new(tcx);

    // Create hashes for inputs.
    let hashes =
        query.nodes()
             .into_iter()
             .filter_map(|dep_node| {
                 hcx.hash(&dep_node)
                    .map(|hash| {
                        let node = builder.map(dep_node);
                        SerializedHash { node: node, hash: hash }
                    })
             })
             .collect();
pub fn encode_dep_graph(preds: &Predecessors,
                        builder: &mut DefIdDirectoryBuilder,
                        encoder: &mut Encoder)
                        -> io::Result<()> {
    // Create a flat list of (Input, WorkProduct) edges for
    // serialization.
    let mut edges = vec![];
    for (&target, sources) in &preds.inputs {
        match *target {
            DepNode::MetaData(ref def_id) => {
                // Metadata *targets* are always local metadata nodes. We handle
                // those in `encode_metadata_hashes`, which comes later.
                assert!(def_id.is_local());
                continue;
            }
            _ => (),
        }
        let target = builder.map(target);
        for &source in sources {
            let source = builder.map(source);
            edges.push((source, target.clone()));
        }
    }

    // Create the serialized dep-graph.
    let graph = SerializedDepGraph {
        nodes: query.nodes().into_iter()
                    .map(|node| builder.map(node))
                    .collect(),
        edges: query.edges().into_iter()
                    .map(|(source_node, target_node)| {
                        let source = builder.map(source_node);
                        let target = builder.map(target_node);
                        (source, target)
                    })
                    .collect(),
        hashes: hashes,
        edges: edges,
        hashes: preds.hashes
                     .iter()
                     .map(|(&dep_node, &hash)| {
                         SerializedHash {
                             dep_node: builder.map(dep_node),
                             hash: hash,
                         }
                     })
                     .collect(),
    };

    debug!("graph = {:#?}", graph);

    // Encode the directory and then the graph data.
    let directory = builder.into_directory();
    try!(directory.encode(encoder));
    try!(builder.directory().encode(encoder));
    try!(graph.encode(encoder));

    Ok(())
}
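`encode_dep_graph` flattens the `Predecessors::inputs` map (target → hashable sources) into a plain `(source, target)` edge list before serialization. The shape of that transformation, sketched with string labels and a hypothetical `flatten_edges` (a `BTreeMap` stands in for `FnvHashMap` only so the demo's output order is deterministic):

```rust
use std::collections::BTreeMap;

// Every (target, [sources]) entry contributes one (source, target)
// pair per source; the real function additionally skips metadata
// targets, which are handled by encode_metadata_hashes.
fn flatten_edges(inputs: &BTreeMap<&str, Vec<&str>>) -> Vec<(String, String)> {
    let mut edges = vec![];
    for (&target, sources) in inputs {
        for &source in sources {
            edges.push((source.to_string(), target.to_string()));
        }
    }
    edges
}
```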
pub fn encode_metadata_hashes<'a, 'tcx>(hcx: &mut HashContext<'a, 'tcx>,
                                        encoder: &mut Encoder)
                                        -> io::Result<()>
{
    let tcx = hcx.tcx;
    let query = tcx.dep_graph.query();
pub fn encode_metadata_hashes(tcx: TyCtxt,
                              preds: &Predecessors,
                              builder: &mut DefIdDirectoryBuilder,
                              encoder: &mut Encoder)
                              -> io::Result<()> {
    let mut def_id_hashes = FnvHashMap();
    let mut def_id_hash = |def_id: DefId| -> u64 {
        *def_id_hashes.entry(def_id)
                      .or_insert_with(|| {
                          let index = builder.add(def_id);
                          let path = builder.lookup_def_path(index);
                          path.deterministic_hash(tcx)
                      })
    };

    let serialized_hashes = {
        // Identify the `MetaData(X)` nodes where `X` is local. These are
        // the metadata items we export. Downstream crates will want to
        // see a hash that tells them whether we might have changed the
        // metadata for a given item since they last compiled.
        let meta_data_def_ids =
            query.nodes()
                 .into_iter()
                 .filter_map(|dep_node| match *dep_node {
                     DepNode::MetaData(def_id) if def_id.is_local() => Some(def_id),
                     _ => None,
                 });
    // For each `MetaData(X)` node where `X` is local, accumulate a
    // hash. These are the metadata items we export. Downstream
    // crates will want to see a hash that tells them whether we might
    // have changed the metadata for a given item since they last
    // compiled.
    //
    // (I initially wrote this with an iterator, but it seemed harder to read.)
    let mut serialized_hashes = SerializedMetadataHashes { hashes: vec![] };
    for (&target, sources) in &preds.inputs {
        let def_id = match *target {
            DepNode::MetaData(def_id) => {
                assert!(def_id.is_local());
                def_id
            }
            _ => continue,
        };

        // To create the hash for each item `X`, we don't hash the raw
        // bytes of the metadata (though in principle we

@@ -166,34 +186,32 @@ pub fn encode_metadata_hashes<'a, 'tcx>(hcx: &mut HashContext<'a, 'tcx>,

        // from the dep-graph. This corresponds to all the inputs that
        // were read to construct the metadata. To create the hash for
        // the metadata, we hash (the hash of) all of those inputs.
        let hashes =
            meta_data_def_ids
                .map(|def_id| {
                    assert!(def_id.is_local());
                    let dep_node = DepNode::MetaData(def_id);
                    let mut state = SipHasher::new();
                    debug!("save: computing metadata hash for {:?}", dep_node);
                    for node in query.transitive_predecessors(&dep_node) {
                        if let Some(hash) = hcx.hash(&node) {
                            debug!("save: predecessor {:?} has hash {}", node, hash);
                            state.write_u64(hash.to_le());
                        } else {
                            debug!("save: predecessor {:?} cannot be hashed", node);
                        }
                    }
                    let hash = state.finish();
                    debug!("save: metadata hash for {:?} is {}", dep_node, hash);
                    SerializedMetadataHash {
                        def_index: def_id.index,
                        hash: hash,
                    }
                });
        debug!("save: computing metadata hash for {:?}", def_id);

        // Collect these up into a vector.
        SerializedMetadataHashes {
            hashes: hashes.collect()
        }
    };
        // Create a vector containing a pair of (source-id, hash).
        // The source-id is stored as a `DepNode<u64>`, where the u64
        // is the det. hash of the def-path. This is convenient
        // because we can sort this to get a stable ordering across
        // compilations, even if the def-ids themselves have changed.
        let mut hashes: Vec<(DepNode<u64>, u64)> = sources.iter()
            .map(|dep_node| {
                let hash_dep_node = dep_node.map_def(|&def_id| Some(def_id_hash(def_id))).unwrap();
                let hash = preds.hashes[dep_node];
                (hash_dep_node, hash)
            })
            .collect();

        hashes.sort();
        let mut state = SipHasher::new();
        hashes.hash(&mut state);
        let hash = state.finish();

        debug!("save: metadata hash for {:?} is {}", def_id, hash);
        serialized_hashes.hashes.push(SerializedMetadataHash {
            def_index: def_id.index,
            hash: hash,
        });
    }

    // Encode everything.
    try!(serialized_hashes.encode(encoder));
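The key trick in the new `encode_metadata_hashes` is making the combined hash independent of hash-map iteration order: each input is keyed by a stable id, the (id, hash) pairs are sorted, and only then is the vector hashed. A minimal sketch of that pattern, using `DefaultHasher` from std where the compiler uses `SipHasher` directly, and plain `u64` ids in place of `DepNode<u64>`:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Sort the (stable-id, hash) pairs so the final hash does not depend
// on the order in which the inputs happened to be visited.
fn metadata_hash(mut inputs: Vec<(u64, u64)>) -> u64 {
    inputs.sort();
    let mut state = DefaultHasher::new();
    inputs.hash(&mut state);
    state.finish()
}
```

Two runs that visit the same inputs in different orders then produce the same metadata hash, which is what keeps the saved hashes stable across compilations.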
@@ -201,21 +219,17 @@ pub fn encode_metadata_hashes<'a, 'tcx>(hcx: &mut HashContext<'a, 'tcx>,

    Ok(())
}

pub fn encode_work_products(sess: &Session,
                            encoder: &mut Encoder)
                            -> io::Result<()>
{
    let work_products: Vec<_> =
        sess.dep_graph.work_products()
                      .iter()
                      .map(|(id, work_product)| {
                          SerializedWorkProduct {
                              id: id.clone(),
                              work_product: work_product.clone(),
                          }
                      })
                      .collect();
pub fn encode_work_products(sess: &Session, encoder: &mut Encoder) -> io::Result<()> {
    let work_products: Vec<_> = sess.dep_graph
        .work_products()
        .iter()
        .map(|(id, work_product)| {
            SerializedWorkProduct {
                id: id.clone(),
                work_product: work_product.clone(),
            }
        })
        .collect();

    work_products.encode(encoder)
}
@@ -78,8 +78,8 @@ pub fn encode_inlined_item(ecx: &e::EncodeContext,
                           rbml_w: &mut Encoder,
                           ii: InlinedItemRef) {
    let id = match ii {
        InlinedItemRef::Item(i) => i.id,
        InlinedItemRef::Foreign(i) => i.id,
        InlinedItemRef::Item(_, i) => i.id,
        InlinedItemRef::Foreign(_, i) => i.id,
        InlinedItemRef::TraitItem(_, ti) => ti.id,
        InlinedItemRef::ImplItem(_, ii) => ii.id,
    };

@@ -146,8 +146,8 @@ pub fn decode_inlined_item<'a, 'tcx>(cdata: &cstore::CrateMetadata,
                                     decode_ast(ast_doc),
                                     dcx);
    let name = match *ii {
        InlinedItem::Item(ref i) => i.name,
        InlinedItem::Foreign(ref i) => i.name,
        InlinedItem::Item(_, ref i) => i.name,
        InlinedItem::Foreign(_, ref i) => i.name,
        InlinedItem::TraitItem(_, ref ti) => ti.name,
        InlinedItem::ImplItem(_, ref ii) => ii.name
    };

@@ -158,7 +158,7 @@ pub fn decode_inlined_item<'a, 'tcx>(cdata: &cstore::CrateMetadata,
    region::resolve_inlined_item(&tcx.sess, &tcx.region_maps, ii);
    decode_side_tables(dcx, ast_doc);
    copy_item_types(dcx, ii, orig_did);
    if let InlinedItem::Item(ref i) = *ii {
    if let InlinedItem::Item(_, ref i) = *ii {
        debug!(">>> DECODED ITEM >>>\n{}\n<<< DECODED ITEM <<<",
               ::rustc::hir::print::item_to_string(&i));
    }

@@ -348,8 +348,8 @@ fn simplify_ast(ii: InlinedItemRef) -> (InlinedItem, IdRange) {

    let ii = match ii {
        // HACK we're not dropping items.
        InlinedItemRef::Item(i) => {
            InlinedItem::Item(P(fold::noop_fold_item(i.clone(), &mut fld)))
        InlinedItemRef::Item(d, i) => {
            InlinedItem::Item(d, P(fold::noop_fold_item(i.clone(), &mut fld)))
        }
        InlinedItemRef::TraitItem(d, ti) => {
            InlinedItem::TraitItem(d, P(fold::noop_fold_trait_item(ti.clone(), &mut fld)))

@@ -357,8 +357,8 @@ fn simplify_ast(ii: InlinedItemRef) -> (InlinedItem, IdRange) {
        InlinedItemRef::ImplItem(d, ii) => {
            InlinedItem::ImplItem(d, P(fold::noop_fold_impl_item(ii.clone(), &mut fld)))
        }
        InlinedItemRef::Foreign(i) => {
            InlinedItem::Foreign(P(fold::noop_fold_foreign_item(i.clone(), &mut fld)))
        InlinedItemRef::Foreign(d, i) => {
            InlinedItem::Foreign(d, P(fold::noop_fold_foreign_item(i.clone(), &mut fld)))
        }
    };
@@ -1241,15 +1241,15 @@ fn copy_item_types(dcx: &DecodeContext, ii: &InlinedItem, orig_did: DefId) {
    }
    // copy the entry for the item itself
    let item_node_id = match ii {
        &InlinedItem::Item(ref i) => i.id,
        &InlinedItem::Item(_, ref i) => i.id,
        &InlinedItem::TraitItem(_, ref ti) => ti.id,
        &InlinedItem::ImplItem(_, ref ii) => ii.id,
        &InlinedItem::Foreign(ref fi) => fi.id
        &InlinedItem::Foreign(_, ref fi) => fi.id
    };
    copy_item_type(dcx, item_node_id, orig_did);

    // copy the entries of inner items
    if let &InlinedItem::Item(ref item) = ii {
    if let &InlinedItem::Item(_, ref item) = ii {
        match item.node {
            hir::ItemEnum(ref def, _) => {
                let orig_def = dcx.tcx.lookup_adt_def(orig_did);

@@ -1383,6 +1383,9 @@ fn test_more() {

#[test]
fn test_simplification() {
    use middle::cstore::LOCAL_CRATE;
    use rustc::hir::def_id::CRATE_DEF_INDEX;

    let cx = mk_ctxt();
    let item = quote_item!(&cx,
        fn new_int_alist<B>() -> alist<isize, B> {

@@ -1393,15 +1396,16 @@ fn test_simplification() {
    let cx = mk_ctxt();
    with_testing_context(|lcx| {
        let hir_item = lcx.lower_item(&item);
        let item_in = InlinedItemRef::Item(&hir_item);
        let def_id = DefId { krate: LOCAL_CRATE, index: CRATE_DEF_INDEX }; // dummy
        let item_in = InlinedItemRef::Item(def_id, &hir_item);
        let (item_out, _) = simplify_ast(item_in);
        let item_exp = InlinedItem::Item(P(lcx.lower_item(&quote_item!(&cx,
        let item_exp = InlinedItem::Item(def_id, P(lcx.lower_item(&quote_item!(&cx,
            fn new_int_alist<B>() -> alist<isize, B> {
                return alist {eq_fn: eq_int, data: Vec::new()};
            }
        ).unwrap())));
        match (item_out, item_exp) {
            (InlinedItem::Item(item_out), InlinedItem::Item(item_exp)) => {
            (InlinedItem::Item(_, item_out), InlinedItem::Item(_, item_exp)) => {
                assert!(pprust::item_to_string(&item_out) ==
                        pprust::item_to_string(&item_exp));
            }
@@ -546,11 +546,13 @@ impl<'tcx> CrateStore<'tcx> for cstore::CStore {
                .borrow_mut()
                .insert(def_id, None);
        }
        decoder::FoundAst::Found(&InlinedItem::Item(ref item)) => {
        decoder::FoundAst::Found(&InlinedItem::Item(d, ref item)) => {
            assert_eq!(d, def_id);
            let inlined_root_node_id = find_inlined_item_root(item.id);
            cache_inlined_item(def_id, item.id, inlined_root_node_id);
        }
        decoder::FoundAst::Found(&InlinedItem::Foreign(ref item)) => {
        decoder::FoundAst::Found(&InlinedItem::Foreign(d, ref item)) => {
            assert_eq!(d, def_id);
            let inlined_root_node_id = find_inlined_item_root(item.id);
            cache_inlined_item(def_id, item.id, inlined_root_node_id);
        }

@@ -797,7 +797,7 @@ pub fn maybe_get_item_ast<'a, 'tcx>(cdata: Cmd, tcx: TyCtxt<'a, 'tcx, 'tcx>, id:
                                    grandparent_def_id,
                                    ast_doc,
                                    parent_did);
    if let &InlinedItem::Item(ref i) = ii {
    if let &InlinedItem::Item(_, ref i) = ii {
        return FoundAst::FoundParent(parent_did, i);
    }
}

@@ -1690,7 +1690,7 @@ fn item_def_key(item_doc: rbml::Doc) -> hir_map::DefKey {
    let mut decoder = reader::Decoder::new(def_key_doc);
    let simple_key = def_key::DefKey::decode(&mut decoder).unwrap();
    let name = reader::maybe_get_doc(item_doc, tag_paths_data_name).map(|name| {
        token::intern(name.as_str_slice())
        token::intern(name.as_str_slice()).as_str()
    });
    def_key::recover_def_key(simple_key, name)
}
@@ -10,7 +10,7 @@

use rustc::hir::def_id::DefIndex;
use rustc::hir::map as hir_map;
use syntax::ast::Name;
use syntax::parse::token::InternedString;

#[derive(RustcEncodable, RustcDecodable)]
pub struct DefKey {

@@ -75,7 +75,7 @@ fn simplify_def_path_data(data: hir_map::DefPathData) -> DefPathData {
    }
}

pub fn recover_def_key(key: DefKey, name: Option<Name>) -> hir_map::DefKey {
pub fn recover_def_key(key: DefKey, name: Option<InternedString>) -> hir_map::DefKey {
    let data = hir_map::DisambiguatedDefPathData {
        data: recover_def_path_data(key.disambiguated_data.data, name),
        disambiguator: key.disambiguated_data.disambiguator,

@@ -86,7 +86,7 @@ pub fn recover_def_key(key: DefKey, name: Option<Name>) -> hir_map::DefKey {
    }
}

fn recover_def_path_data(data: DefPathData, name: Option<Name>) -> hir_map::DefPathData {
fn recover_def_path_data(data: DefPathData, name: Option<InternedString>) -> hir_map::DefPathData {
    match data {
        DefPathData::CrateRoot => hir_map::DefPathData::CrateRoot,
        DefPathData::Misc => hir_map::DefPathData::Misc,
@@ -743,7 +743,8 @@ fn encode_repr_attrs(rbml_w: &mut Encoder,
}

fn encode_mir(ecx: &EncodeContext, rbml_w: &mut Encoder, node_id: NodeId) {
    if let Some(mir) = ecx.mir_map.map.get(&node_id) {
    let def_id = ecx.tcx.map.local_def_id(node_id);
    if let Some(mir) = ecx.mir_map.map.get(&def_id) {
        rbml_w.start_tag(tag_mir as usize);
        rbml_w.emit_opaque(|opaque_encoder| {
            tls::enter_encoding_context(ecx, opaque_encoder, |_, opaque_encoder| {

@@ -861,7 +862,7 @@ fn encode_info_for_item<'a, 'tcx>(ecx: &EncodeContext<'a, 'tcx>,
            encode_bounds_and_type_for_item(rbml_w, ecx, index, item.id);
            encode_name(rbml_w, item.name);
            encode_attributes(rbml_w, &item.attrs);
            encode_inlined_item(ecx, rbml_w, InlinedItemRef::Item(item));
            encode_inlined_item(ecx, rbml_w, InlinedItemRef::Item(def_id, item));
            encode_mir(ecx, rbml_w, item.id);
            encode_visibility(rbml_w, vis);
            encode_stability(rbml_w, stab);

@@ -879,7 +880,7 @@ fn encode_info_for_item<'a, 'tcx>(ecx: &EncodeContext<'a, 'tcx>,
            encode_attributes(rbml_w, &item.attrs);
            let needs_inline = tps_len > 0 || attr::requests_inline(&item.attrs);
            if needs_inline || constness == hir::Constness::Const {
                encode_inlined_item(ecx, rbml_w, InlinedItemRef::Item(item));
                encode_inlined_item(ecx, rbml_w, InlinedItemRef::Item(def_id, item));
                encode_mir(ecx, rbml_w, item.id);
            }
            encode_constness(rbml_w, constness);

@@ -942,7 +943,7 @@ fn encode_info_for_item<'a, 'tcx>(ecx: &EncodeContext<'a, 'tcx>,
            for v in &enum_definition.variants {
                encode_variant_id(rbml_w, ecx.tcx.map.local_def_id(v.node.data.id()));
            }
            encode_inlined_item(ecx, rbml_w, InlinedItemRef::Item(item));
            encode_inlined_item(ecx, rbml_w, InlinedItemRef::Item(def_id, item));
            encode_mir(ecx, rbml_w, item.id);

            // Encode inherent implementations for this enumeration.

@@ -989,7 +990,7 @@ fn encode_info_for_item<'a, 'tcx>(ecx: &EncodeContext<'a, 'tcx>,
               needs to know*/
            encode_struct_fields(rbml_w, variant);

            encode_inlined_item(ecx, rbml_w, InlinedItemRef::Item(item));
            encode_inlined_item(ecx, rbml_w, InlinedItemRef::Item(def_id, item));
            encode_mir(ecx, rbml_w, item.id);

            // Encode inherent implementations for this structure.

@@ -1311,7 +1312,7 @@ fn encode_info_for_foreign_item<'a, 'tcx>(ecx: &EncodeContext<'a, 'tcx>,
    encode_bounds_and_type_for_item(rbml_w, ecx, index, nitem.id);
    encode_name(rbml_w, nitem.name);
    if abi == Abi::RustIntrinsic || abi == Abi::PlatformIntrinsic {
        encode_inlined_item(ecx, rbml_w, InlinedItemRef::Foreign(nitem));
        encode_inlined_item(ecx, rbml_w, InlinedItemRef::Foreign(def_id, nitem));
        encode_mir(ecx, rbml_w, nitem.id);
    }
    encode_attributes(rbml_w, &nitem.attrs);

@@ -1361,7 +1362,7 @@ fn my_visit_expr(expr: &hir::Expr,
        ecx.tcx.closure_kind(def_id).encode(rbml_w).unwrap();
        rbml_w.end_tag();

        assert!(ecx.mir_map.map.contains_key(&expr.id));
        assert!(ecx.mir_map.map.contains_key(&def_id));
        encode_mir(ecx, rbml_w, expr.id);

        rbml_w.end_tag();
@@ -9,7 +9,9 @@
// except according to those terms.

use dot;
use rustc::hir::def_id::DefId;
use rustc::mir::repr::*;
use rustc::mir::mir_map::MirMap;
use rustc::ty::{self, TyCtxt};
use std::fmt::Debug;
use std::io::{self, Write};

@@ -19,10 +21,16 @@ use rustc_data_structures::indexed_vec::Idx;

/// Write a graphviz DOT graph of a list of MIRs.
pub fn write_mir_graphviz<'a, 'b, 'tcx, W, I>(tcx: TyCtxt<'b, 'tcx, 'tcx>,
                                              iter: I, w: &mut W)
                                              iter: I,
                                              mir_map: &MirMap<'tcx>,
                                              w: &mut W)
                                              -> io::Result<()>
    where W: Write, I: Iterator<Item=(&'a NodeId, &'a Mir<'a>)> {
    for (&nodeid, mir) in iter {
    where W: Write, I: Iterator<Item=DefId>
{
    for def_id in iter {
        let nodeid = tcx.map.as_local_node_id(def_id).unwrap();
        let mir = &mir_map.map[&def_id];

        writeln!(w, "digraph Mir_{} {{", nodeid)?;

        // Global graph properties
@@ -22,6 +22,7 @@ use rustc::mir::transform::MirSource;
use rustc::middle::const_val::ConstVal;
use rustc_const_eval as const_eval;
use rustc_data_structures::indexed_vec::Idx;
use rustc::dep_graph::DepNode;
use rustc::hir::def_id::DefId;
use rustc::hir::intravisit::FnKind;
use rustc::hir::map::blocks::FnLikeNode;

@@ -61,7 +62,17 @@ impl<'a, 'gcx, 'tcx> Cx<'a, 'gcx, 'tcx> {
            MirSource::Promoted(..) => bug!()
        };

        let attrs = infcx.tcx.map.attrs(src.item_id());
        let src_node_id = src.item_id();

        // We are going to be accessing various tables
        // generated by TypeckItemBody; we also assume
        // that the body passes type check. These tables
        // are not individually tracked, so just register
        // a read here.
        let src_def_id = infcx.tcx.map.local_def_id(src_node_id);
        infcx.tcx.dep_graph.read(DepNode::TypeckItemBody(src_def_id));

        let attrs = infcx.tcx.map.attrs(src_node_id);

        // Some functions always have overflow checks enabled,
        // however, they may not get codegen'd, depending on
@@ -18,6 +18,7 @@

use build;
use rustc::dep_graph::DepNode;
use rustc::hir::def_id::DefId;
use rustc::mir::repr::Mir;
use rustc::mir::transform::MirSource;
use rustc::mir::visit::MutVisitor;

@@ -29,7 +30,6 @@ use rustc::infer::InferCtxtBuilder;
use rustc::traits::ProjectionMode;
use rustc::ty::{self, Ty, TyCtxt};
use rustc::ty::subst::Substs;
use rustc::util::nodemap::NodeMap;
use rustc::hir;
use rustc::hir::intravisit::{self, FnKind, Visitor};
use syntax::ast;

@@ -38,15 +38,13 @@ use syntax_pos::Span;
use std::mem;

pub fn build_mir_for_crate<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>) -> MirMap<'tcx> {
    let mut map = MirMap {
        map: NodeMap(),
    };
    let mut map = MirMap::new(tcx.dep_graph.clone());
    {
        let mut dump = BuildMir {
            tcx: tcx,
            map: &mut map,
        };
        tcx.visit_all_items_in_krate(DepNode::MirMapConstruction, &mut dump);
        tcx.visit_all_items_in_krate(DepNode::Mir, &mut dump);
    }
    map
}

@@ -94,6 +92,7 @@ struct BuildMir<'a, 'tcx: 'a> {
/// F: for<'b, 'tcx> where 'gcx: 'tcx FnOnce(Cx<'b, 'gcx, 'tcx>).
struct CxBuilder<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
    src: MirSource,
    def_id: DefId,
    infcx: InferCtxtBuilder<'a, 'gcx, 'tcx>,
    map: &'a mut MirMap<'gcx>,
}

@@ -101,9 +100,11 @@ struct CxBuilder<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
impl<'a, 'gcx, 'tcx> BuildMir<'a, 'gcx> {
    fn cx<'b>(&'b mut self, src: MirSource) -> CxBuilder<'b, 'gcx, 'tcx> {
        let param_env = ty::ParameterEnvironment::for_item(self.tcx, src.item_id());
        let def_id = self.tcx.map.local_def_id(src.item_id());
        CxBuilder {
            src: src,
            infcx: self.tcx.infer_ctxt(None, Some(param_env), ProjectionMode::AnyFinal),
            def_id: def_id,
            map: self.map
        }
    }

@@ -133,7 +134,7 @@ impl<'a, 'gcx, 'tcx> CxBuilder<'a, 'gcx, 'tcx> {
            mir
        });

        assert!(self.map.map.insert(src.item_id(), mir).is_none())
        assert!(self.map.map.insert(self.def_id, mir).is_none())
    }
}
@@ -10,7 +10,9 @@

use build::{Location, ScopeAuxiliaryVec, ScopeId};
use rustc::hir;
use rustc::hir::def_id::DefId;
use rustc::mir::repr::*;
use rustc::mir::mir_map::MirMap;
use rustc::mir::transform::MirSource;
use rustc::ty::{self, TyCtxt};
use rustc_data_structures::fnv::FnvHashMap;

@@ -18,7 +20,6 @@ use rustc_data_structures::indexed_vec::{Idx};
use std::fmt::Display;
use std::fs;
use std::io::{self, Write};
use syntax::ast::NodeId;
use std::path::{PathBuf, Path};

const INDENT: &'static str = "    ";

@@ -89,12 +90,15 @@ pub fn dump_mir<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,

/// Write out a human-readable textual representation for the given MIR.
pub fn write_mir_pretty<'a, 'b, 'tcx, I>(tcx: TyCtxt<'b, 'tcx, 'tcx>,
                                         iter: I,
                                         mir_map: &MirMap<'tcx>,
                                         w: &mut Write)
                                         -> io::Result<()>
    where I: Iterator<Item=(&'a NodeId, &'a Mir<'tcx>)>, 'tcx: 'a
    where I: Iterator<Item=DefId>, 'tcx: 'a
{
    let mut first = true;
    for (&id, mir) in iter {
    for def_id in iter {
        let mir = &mir_map.map[&def_id];

        if first {
            first = false;
        } else {

@@ -102,6 +106,7 @@ pub fn write_mir_pretty<'a, 'b, 'tcx, I>(tcx: TyCtxt<'b, 'tcx, 'tcx>,
            writeln!(w, "")?;
        }

        let id = tcx.map.as_local_node_id(def_id).unwrap();
        let src = MirSource::from_node(tcx, id);
        write_mir_fn(tcx, src, mir, w, None)?;
@@ -16,6 +16,7 @@

use rustc_data_structures::bitvec::BitVector;
use rustc_data_structures::indexed_vec::{IndexVec, Idx};
use rustc::dep_graph::DepNode;
use rustc::hir;
use rustc::hir::def_id::DefId;
use rustc::hir::intravisit::FnKind;

@@ -883,8 +884,8 @@ fn qualify_const_item_cached<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,

let extern_mir;
let param_env_and_mir = if def_id.is_local() {
-let node_id = tcx.map.as_local_node_id(def_id).unwrap();
-mir_map.and_then(|map| map.map.get(&node_id)).map(|mir| {
+mir_map.and_then(|map| map.map.get(&def_id)).map(|mir| {
+let node_id = tcx.map.as_local_node_id(def_id).unwrap();
(ty::ParameterEnvironment::for_item(tcx, node_id), mir)
})
} else if let Some(mir) = tcx.sess.cstore.maybe_get_item_mir(tcx, def_id) {

@@ -919,9 +920,10 @@ impl<'tcx> MirMapPass<'tcx> for QualifyAndPromoteConstants {

// First, visit `const` items, potentially recursing, to get
// accurate MUTABLE_INTERIOR and NEEDS_DROP qualifications.
-for &id in map.map.keys() {
-let def_id = tcx.map.local_def_id(id);
-let _task = tcx.dep_graph.in_task(self.dep_node(def_id));
+let keys = map.map.keys();
+for &def_id in &keys {
+let _task = tcx.dep_graph.in_task(DepNode::Mir(def_id));
+let id = tcx.map.as_local_node_id(def_id).unwrap();
let src = MirSource::from_node(tcx, id);
if let MirSource::Const(_) = src {
qualify_const_item_cached(tcx, &mut qualif_map, Some(map), def_id);

@@ -931,9 +933,9 @@ impl<'tcx> MirMapPass<'tcx> for QualifyAndPromoteConstants {
// Then, handle everything else, without recursing,
// as the MIR map is not shared, since promotion
// in functions (including `const fn`) mutates it.
-for (&id, mir) in &mut map.map {
-let def_id = tcx.map.local_def_id(id);
-let _task = tcx.dep_graph.in_task(self.dep_node(def_id));
+for &def_id in &keys {
+let _task = tcx.dep_graph.in_task(DepNode::Mir(def_id));
+let id = tcx.map.as_local_node_id(def_id).unwrap();
let src = MirSource::from_node(tcx, id);
let mode = match src {
MirSource::Fn(_) => {

@@ -950,6 +952,7 @@ impl<'tcx> MirMapPass<'tcx> for QualifyAndPromoteConstants {
};
let param_env = ty::ParameterEnvironment::for_item(tcx, id);

+let mir = map.map.get_mut(&def_id).unwrap();
for hook in &mut *hooks {
hook.on_mir_pass(tcx, src, mir, self, false);
}
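The second loop in the hunk above can no longer iterate `&mut map.map` directly, because promotion mutates the map while it is being walked; the patch snapshots the keys up front and takes a fresh `get_mut` borrow per item instead. A minimal, self-contained sketch of that pattern (hypothetical `u32` keys and `Vec<i32>` bodies stand in for the real `DefId`-keyed MIR map):

```rust
use std::collections::HashMap;

// Hypothetical stand-in for the MIR map: u32 keys play the role of
// def-ids, Vec<i32> plays the role of a MIR body that a pass mutates.
fn run_pass(map: &mut HashMap<u32, Vec<i32>>) {
    // Snapshot the keys first so no borrow of `map` is held across
    // the loop; each iteration can then borrow an entry mutably.
    let keys: Vec<u32> = map.keys().cloned().collect();
    for key in &keys {
        let body = map.get_mut(key).unwrap();
        body.push(0); // placeholder for the real transformation
    }
}

fn main() {
    let mut map = HashMap::new();
    map.insert(1, vec![10]);
    map.insert(2, vec![20]);
    run_pass(&mut map);
    assert!(map.values().all(|v| v.len() == 2));
}
```

The snapshot also means items inserted during the pass (e.g. by promotion) are not themselves visited, which matches the comment's "without recursing".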
@@ -11,8 +11,6 @@
//! This pass type-checks the MIR to ensure it is not broken.
#![allow(unreachable_code)]

-use rustc::dep_graph::DepNode;
-use rustc::hir::def_id::DefId;
use rustc::infer::{self, InferCtxt, InferOk};
use rustc::traits::{self, ProjectionMode};
use rustc::ty::fold::TypeFoldable;

@@ -714,7 +712,4 @@ impl<'tcx> MirPass<'tcx> for TypeckMir {
}

impl Pass for TypeckMir {
-fn dep_node(&self, def_id: DefId) -> DepNode<DefId> {
-DepNode::MirTypeck(def_id)
-}
}
@@ -108,36 +108,13 @@ use rustc::ty::{self, TyCtxt, TypeFoldable};
use rustc::ty::item_path::{self, ItemPathBuffer, RootMode};
use rustc::hir::map::definitions::{DefPath, DefPathData};

-use std::fmt::Write;
use syntax::attr;
use syntax::parse::token::{self, InternedString};
use serialize::hex::ToHex;

pub fn def_id_to_string<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>, def_id: DefId) -> String {
let def_path = tcx.def_path(def_id);
def_path_to_string(tcx, &def_path)
}

fn def_path_to_string<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>, def_path: &DefPath) -> String {
-let mut s = String::with_capacity(def_path.data.len() * 16);
-
-if def_path.krate == cstore::LOCAL_CRATE {
-s.push_str(&tcx.crate_name(def_path.krate));
-} else {
-s.push_str(&tcx.sess.cstore.original_crate_name(def_path.krate));
-}
-s.push_str("/");
-s.push_str(&tcx.crate_disambiguator(def_path.krate));
-
-for component in &def_path.data {
-write!(s,
-"::{}[{}]",
-component.data.as_interned_str(),
-component.disambiguator)
-.unwrap();
-}
-
-s
+def_path.to_string(tcx)
}

fn get_symbol_hash<'a, 'tcx>(scx: &SharedCrateContext<'a, 'tcx>,

@@ -167,7 +144,7 @@ fn get_symbol_hash<'a, 'tcx>(scx: &SharedCrateContext<'a, 'tcx>,
// the main symbol name is not necessarily unique; hash in the
// compiler's internal def-path, guaranteeing each symbol has a
// truly unique path
-hash_state.input_str(&def_path_to_string(tcx, def_path));
+hash_state.input_str(&def_path.to_string(tcx));

// Include the main item-type. Note that, in this case, the
// assertions about `needs_subst` may not hold, but this item-type
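The deleted body above shows the shape of the string being built before the patch replaced it with the shared `DefPath::to_string`: crate name, `/`, disambiguator, then one `::name[disambiguator]` segment per path component. A self-contained sketch of that format, with plain strings standing in for the compiler's interned names and crate metadata (all names here are hypothetical):

```rust
use std::fmt::Write;

// Sketch of the def-path string format the removed function produced:
// "<crate>/<disambiguator>::<comp>[<n>]::<comp>[<n>]...".
fn def_path_to_string(krate: &str, disambiguator: &str, components: &[(&str, u32)]) -> String {
    let mut s = String::with_capacity(components.len() * 16);
    s.push_str(krate);
    s.push_str("/");
    s.push_str(disambiguator);
    for &(name, disambig) in components {
        // The [n] suffix disambiguates same-named siblings.
        write!(s, "::{}[{}]", name, disambig).unwrap();
    }
    s
}

fn main() {
    let s = def_path_to_string("std", "abcd", &[("vec", 0), ("Vec", 0)]);
    assert_eq!(s, "std/abcd::vec[0]::Vec[0]");
    println!("{}", s);
}
```

Centralizing this in `DefPath::to_string` keeps the symbol hash and other consumers agreeing on one canonical rendering.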
@@ -725,6 +725,10 @@ pub fn run_passes(sess: &Session,
work_items.push(work);
}

+if sess.opts.debugging_opts.incremental_info {
+dump_incremental_data(&trans);
+}
+
// Process the work items, optionally using worker threads.
// NOTE: This code is not really adapted to incremental compilation where
// the compiler decides the number of codegen units (and will

@@ -902,6 +906,17 @@ pub fn run_passes(sess: &Session,
}
}

+fn dump_incremental_data(trans: &CrateTranslation) {
+let mut reuse = 0;
+for mtrans in trans.modules.iter() {
+match mtrans.source {
+ModuleSource::Preexisting(..) => reuse += 1,
+ModuleSource::Translated(..) => (),
+}
+}
+println!("incremental: re-using {} out of {} modules", reuse, trans.modules.len());
+}
+
struct WorkItem {
mtrans: ModuleTranslation,
config: ModuleConfig,
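The new `dump_incremental_data` just tallies how many codegen modules were re-used from the previous session versus re-translated, which is the measurement behind the PR's note about figuring out why incremental "isn't saving much effort". The same counting logic in a standalone form (the two-variant enum here is a hypothetical stand-in for rustc's `ModuleSource`, whose variants carry payloads):

```rust
// Hypothetical stand-in for rustc's ModuleSource: a codegen module is
// either re-used from a previous compilation or translated afresh.
enum ModuleSource {
    Preexisting,
    Translated,
}

// Count re-used modules, mirroring dump_incremental_data's loop.
fn count_reuse(modules: &[ModuleSource]) -> usize {
    let mut reuse = 0;
    for m in modules {
        match m {
            ModuleSource::Preexisting => reuse += 1,
            ModuleSource::Translated => (),
        }
    }
    reuse
}

fn main() {
    let modules = [
        ModuleSource::Preexisting,
        ModuleSource::Translated,
        ModuleSource::Preexisting,
    ];
    println!(
        "incremental: re-using {} out of {} modules",
        count_reuse(&modules),
        modules.len()
    );
}
```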
@@ -1244,7 +1244,7 @@ pub fn inlined_variant_def<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>,
}), ..}) => ty,
_ => ctor_ty
}.ty_adt_def().unwrap();
-let variant_def_id = if ccx.tcx().map.is_inlined(inlined_vid) {
+let variant_def_id = if ccx.tcx().map.is_inlined_node_id(inlined_vid) {
ccx.defid_for_inlined_node(inlined_vid).unwrap()
} else {
ccx.tcx().map.local_def_id(inlined_vid)
@@ -1026,7 +1026,7 @@ pub fn get_static<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, def_id: DefId)
.get(TransItem::Static(id))
.expect("Local statics should always be in the SymbolMap");
// Make sure that this is never executed for something inlined.
-assert!(!ccx.tcx().map.is_inlined(id));
+assert!(!ccx.tcx().map.is_inlined_node_id(id));

let defined_in_current_codegen_unit = ccx.codegen_unit()
.items()
@@ -81,7 +81,7 @@ pub struct SharedCrateContext<'a, 'tcx: 'a> {
check_overflow: bool,
check_drop_flag_for_sanity: bool,
mir_map: &'a MirMap<'tcx>,
-mir_cache: RefCell<DefIdMap<Rc<mir::Mir<'tcx>>>>,
+mir_cache: RefCell<DepTrackingMap<MirCache<'tcx>>>,

use_dll_storage_attrs: bool,

@@ -186,6 +186,19 @@ impl<'tcx> DepTrackingMapConfig for TraitSelectionCache<'tcx> {
}
}

+// Cache for mir loaded from metadata
+struct MirCache<'tcx> {
+data: PhantomData<&'tcx ()>
+}
+
+impl<'tcx> DepTrackingMapConfig for MirCache<'tcx> {
+type Key = DefId;
+type Value = Rc<mir::Mir<'tcx>>;
+fn to_dep_node(key: &DefId) -> DepNode<DefId> {
+DepNode::Mir(*key)
+}
+}
+
/// This list owns a number of LocalCrateContexts and binds them to their common
/// SharedCrateContext. This type just exists as a convenience, something to
/// pass around all LocalCrateContexts with and get an iterator over them.

@@ -474,7 +487,7 @@ impl<'b, 'tcx> SharedCrateContext<'b, 'tcx> {
symbol_hasher: RefCell::new(symbol_hasher),
tcx: tcx,
mir_map: mir_map,
-mir_cache: RefCell::new(DefIdMap()),
+mir_cache: RefCell::new(DepTrackingMap::new(tcx.dep_graph.clone())),
stats: Stats {
n_glues_created: Cell::new(0),
n_null_glues: Cell::new(0),

@@ -538,8 +551,7 @@ impl<'b, 'tcx> SharedCrateContext<'b, 'tcx> {

pub fn get_mir(&self, def_id: DefId) -> Option<CachedMir<'b, 'tcx>> {
if def_id.is_local() {
-let node_id = self.tcx.map.as_local_node_id(def_id).unwrap();
-self.mir_map.map.get(&node_id).map(CachedMir::Ref)
+self.mir_map.map.get(&def_id).map(CachedMir::Ref)
} else {
if let Some(mir) = self.mir_cache.borrow().get(&def_id).cloned() {
return Some(CachedMir::Owned(mir));
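The `mir_cache` change above swaps a plain `DefIdMap` for a `DepTrackingMap`, so that every cache lookup also records a `DepNode::Mir(def_id)` read in the dependency graph. Stripped of the dep-graph bookkeeping, the caching pattern itself looks like the following self-contained sketch (a `u32` key and `String` value stand in for `DefId` and decoded MIR; the dep-tracking side is deliberately elided):

```rust
use std::cell::RefCell;
use std::collections::HashMap;
use std::rc::Rc;

// Sketch of the mir_cache pattern: interior-mutable map from a
// def-id-like key to Rc'd MIR so metadata is decoded at most once.
// (The real MirCache is a DepTrackingMap that additionally registers
// a Mir(def_id) dep-graph read on every access.)
struct MirCache {
    map: RefCell<HashMap<u32, Rc<String>>>,
}

impl MirCache {
    fn new() -> Self {
        MirCache { map: RefCell::new(HashMap::new()) }
    }

    fn get_or_decode<F: FnOnce() -> String>(&self, def_id: u32, decode: F) -> Rc<String> {
        if let Some(mir) = self.map.borrow().get(&def_id).cloned() {
            return mir; // cache hit: hand out the shared Rc
        }
        let mir = Rc::new(decode());
        self.map.borrow_mut().insert(def_id, mir.clone());
        mir
    }
}

fn main() {
    let cache = MirCache::new();
    let a = cache.get_or_decode(1, || "mir for 1".to_string());
    let b = cache.get_or_decode(1, || unreachable!("second access must hit the cache"));
    assert!(Rc::ptr_eq(&a, &b));
}
```

`RefCell` gives the `&self` methods interior mutability, and `Rc` lets callers keep the decoded body alive without cloning it.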
@@ -326,7 +326,7 @@ impl<'tcx> TypeMap<'tcx> {
// First, find out the 'real' def_id of the type. Items inlined from
// other crates have to be mapped back to their source.
let def_id = if let Some(node_id) = cx.tcx().map.as_local_node_id(def_id) {
-if cx.tcx().map.is_inlined(node_id) {
+if cx.tcx().map.is_inlined_node_id(node_id) {
// The given def_id identifies the inlined copy of a
// type definition, let's take the source of the copy.
cx.defid_for_inlined_node(node_id).unwrap()

@@ -1846,7 +1846,7 @@ pub fn create_global_var_metadata(cx: &CrateContext,
// crate should already contain debuginfo for it. More importantly, the
// global might not even exist in un-inlined form anywhere which would lead
// to a linker errors.
-if cx.tcx().map.is_inlined(node_id) {
+if cx.tcx().map.is_inlined_node_id(node_id) {
return;
}
@@ -83,8 +83,6 @@ impl<'a, 'tcx> TransItem<'tcx> {
// that the incoming edges to a particular fn are from a
// particular set.

-self.register_reads(ccx);
-
match *self {
TransItem::Static(node_id) => {
let def_id = ccx.tcx().map.local_def_id(node_id);

@@ -120,46 +118,6 @@ impl<'a, 'tcx> TransItem<'tcx> {
ccx.codegen_unit().name());
}

-/// If necessary, creates a subtask for trans'ing a particular item and registers reads on
-/// `TypeckItemBody` and `Hir`.
-fn register_reads(&self, ccx: &CrateContext<'a, 'tcx>) {
-let tcx = ccx.tcx();
-let def_id = match *self {
-TransItem::Static(node_id) => {
-tcx.map.local_def_id(node_id)
-}
-TransItem::Fn(instance) => {
-if let Some(node) = tcx.map.as_local_node_id(instance.def) {
-if let hir_map::Node::NodeItem(_) = tcx.map.get(node) {
-// This already is a "real" item
-instance.def
-} else {
-// Get the enclosing item and register a read on it
-tcx.map.get_parent_did(node)
-}
-} else {
-// Translating an inlined item from another crate? Don't track anything.
-return;
-}
-}
-TransItem::DropGlue(_) => {
-// Nothing to track for drop glue
-return;
-}
-};
-
-tcx.dep_graph.with_task(DepNode::TransCrateItem(def_id), || {
-tcx.dep_graph.read(DepNode::Hir(def_id));
-
-// We are going to be accessing various tables
-// generated by TypeckItemBody; we also assume
-// that the body passes type check. These tables
-// are not individually tracked, so just register
-// a read here.
-tcx.dep_graph.read(DepNode::TypeckItemBody(def_id));
-});
-}
-
pub fn predefine(&self,
ccx: &CrateContext<'a, 'tcx>,
linkage: llvm::Linkage) {
@@ -1442,6 +1442,7 @@ fn type_scheme_of_item<'a,'tcx>(ccx: &CrateCtxt<'a,'tcx>,
// NB. Since the `memoized` function enters a new task, and we
// are giving this task access to the item `item`, we must
// register a read.
+assert!(!ccx.tcx.map.is_inlined_def_id(item_def_id));
ccx.tcx.dep_graph.read(DepNode::Hir(item_def_id));
compute_type_scheme_of_item(ccx, item)
})

@@ -1569,6 +1570,7 @@ fn type_scheme_of_foreign_item<'a, 'tcx>(
// NB. Since the `memoized` function enters a new task, and we
// are giving this task access to the item `item`, we must
// register a read.
+assert!(!ccx.tcx.map.is_inlined_def_id(item_def_id));
ccx.tcx.dep_graph.read(DepNode::Hir(item_def_id));
compute_type_scheme_of_foreign_item(ccx, item, abi)
})
@@ -10,6 +10,7 @@

// aux-build:a.rs
// revisions:rpass1 rpass2
+// compile-flags:-Z query-dep-graph

#![feature(rustc_attrs)]

@@ -9,6 +9,7 @@
// except according to those terms.

// revisions: rpass1 cfail2
+// compile-flags: -Z query-dep-graph

#![allow(warnings)]
#![feature(rustc_attrs)]
src/test/incremental/foreign.rs (new file, 45 lines)
@@ -0,0 +1,45 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

// Test what happens when we save incremental compilation state that makes
// use of foreign items. This used to ICE (#34991).

// revisions: rpass1

#![feature(libc)]

extern crate libc;

use std::ffi::CString;

mod mlibc {
use libc::{c_char, c_long, c_longlong};

extern {
pub fn atol(x: *const c_char) -> c_long;
pub fn atoll(x: *const c_char) -> c_longlong;
}
}

fn atol(s: String) -> isize {
let c = CString::new(s).unwrap();
unsafe { mlibc::atol(c.as_ptr()) as isize }
}

fn atoll(s: String) -> i64 {
let c = CString::new(s).unwrap();
unsafe { mlibc::atoll(c.as_ptr()) as i64 }
}

pub fn main() {
assert_eq!(atol("1024".to_string()) * 10, atol("10240".to_string()));
assert_eq!((atoll("11111111111111111".to_string()) * 10),
atoll("111111111111111110".to_string()));
}
@@ -9,6 +9,7 @@
// except according to those terms.

// revisions: rpass1 rpass2
+// compile-flags: -Z query-dep-graph

#![allow(warnings)]
#![feature(rustc_attrs)]
src/test/incremental/inlined_hir_34991/main.rs (new file, 33 lines)
@@ -0,0 +1,33 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

// Regression test for #34991: an ICE occurred here because we inline
// some of the vector routines and give them a local def-id `X`. This
// got hashed after trans (`Hir(X)`). When we load back up, we get an
// error because the `X` is remapped to the original def-id (in
// libstd), and we can't hash a HIR node from std.

// revisions:rpass1 rpass2

#![feature(rustc_attrs)]

use std::vec::Vec;

pub fn foo() -> Vec<i32> {
vec![1, 2, 3]
}

pub fn bar() {
foo();
}

pub fn main() {
bar();
}
src/test/incremental/krate_reassign_34991/auxiliary/a.rs (new file, 14 lines)
@@ -0,0 +1,14 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#![crate_type="rlib"]

pub type X = u32;
src/test/incremental/krate_reassign_34991/main.rs (new file, 30 lines)
@@ -0,0 +1,30 @@
// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

// aux-build:a.rs
// revisions:rpass1 rpass2

#![feature(rustc_attrs)]

#[cfg(rpass1)]
extern crate a;

#[cfg(rpass1)]
pub fn use_X() -> u32 {
let x: a::X = 22;
x as u32
}

#[cfg(rpass2)]
pub fn use_X() -> u32 {
22
}

pub fn main() { }
@@ -16,7 +16,7 @@
// aux-build:a.rs
// revisions:rpass1 rpass2 rpass3
// no-prefer-dynamic

// compile-flags: -Z query-dep-graph

#![feature(rustc_attrs)]
@@ -35,14 +35,10 @@ mod x {
X { x: 11, y: 11 }
}

-#[rustc_dirty(label="TypeckItemBody", cfg="rpass2")]
-#[rustc_clean(label="ItemSignature", cfg="rpass2")]
pub fn new() -> X {
make()
}

-#[rustc_clean(label="TypeckItemBody", cfg="rpass2")]
-#[rustc_clean(label="ItemSignature", cfg="rpass2")]
pub fn sum(x: &X) -> u32 {
x.x + x.y
}

@@ -51,7 +47,6 @@ mod x {
mod y {
use x;

-#[rustc_clean(label="TypeckItemBody", cfg="rpass2")]
pub fn assert_sum() -> bool {
let x = x::new();
x::sum(&x) == 22
@@ -9,6 +9,7 @@
// except according to those terms.

// revisions: rpass1 rpass2
+// compile-flags: -Z query-dep-graph

#![allow(warnings)]
#![feature(rustc_attrs)]

@@ -12,6 +12,7 @@
// in between revisions (hashing should be stable).

// revisions:rpass1 rpass2
+// compile-flags: -Z query-dep-graph

#![feature(rustc_attrs)]

@@ -12,6 +12,7 @@
// in between revisions (hashing should be stable).

// revisions:rpass1 cfail2
+// compile-flags: -Z query-dep-graph

#![feature(rustc_attrs)]

@@ -12,6 +12,7 @@
// in between revisions (hashing should be stable).

// revisions:rpass1 rpass2
+// compile-flags: -Z query-dep-graph

#![feature(rustc_attrs)]

@@ -10,6 +10,7 @@

// aux-build:a.rs
// revisions:rpass1 rpass2
+// compile-flags: -Z query-dep-graph

#![feature(rustc_attrs)]

@@ -12,6 +12,7 @@
// in between revisions (hashing should be stable).

// revisions:rpass1 rpass2
+// compile-flags: -Z query-dep-graph

#![feature(rustc_attrs)]

@@ -12,6 +12,7 @@
// in between revisions (hashing should be stable).

// revisions:rpass1 rpass2
+// compile-flags: -Z query-dep-graph

#![feature(rustc_attrs)]

@@ -10,6 +10,7 @@

// aux-build:a.rs
// revisions:rpass1 rpass2 rpass3
+// compile-flags: -Z query-dep-graph

#![feature(rustc_attrs)]