When a field is skipped during deserialization and the container struct has
`#[serde(default)]` set, the field no longer uses its own type's Default
implementation; its value comes from the container's Default instead.
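A minimal sketch of the behavior, assuming a hypothetical `Config` struct and
the serde derive:
```
use serde::Deserialize;

// Hypothetical example: with `#[serde(default)]` on the container, the
// skipped field takes its value from the container's Default impl (1024
// here), not from usize's own Default (0).
#[derive(Deserialize)]
#[serde(default)]
struct Config {
    name: String,
    #[serde(skip_deserializing)]
    cache_size: usize,
}

impl Default for Config {
    fn default() -> Config {
        Config {
            name: String::new(),
            cache_size: 1024,
        }
    }
}
```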
The serde_test Serializer and Deserializer panic in `is_human_readable` unless
the readability has been set explicitly through one of the hidden functions.
This forces types that have distinct readable/compact representations to be
tested explicitly as one or the other, rather than with a plain
`assert_tokens`, which arbitrarily picks one.
We need to follow up by designing a better API in serde_test to expose this
publicly. For now serde_test cannot be used to test types that rely on
`is_human_readable`. (The hidden functions are meant for our test suite only.)
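For illustration, here is roughly what such a type looks like; this sketch
assumes a hypothetical `Timestamp` type with a string form for human-readable
formats and an integer form for compact ones:
```
use serde::{Serialize, Serializer};

// Hypothetical type with distinct readable/compact representations.
struct Timestamp {
    secs: u64,
}

impl Serialize for Timestamp {
    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
    where
        S: Serializer,
    {
        if serializer.is_human_readable() {
            // Human-readable formats like JSON get a string.
            serializer.serialize_str(&self.secs.to_string())
        } else {
            // Compact binary formats get the raw integer.
            serializer.serialize_u64(self.secs)
        }
    }
}
```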
Since we know exactly how many bytes we are going to serialize, we can hint
to the serializer that a length is not required, which further reduces the
serialized size compared to just serializing as bytes.
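As a sketch of the idea, assuming a hypothetical 16-byte `Id` wrapper:
serializing through `serialize_tuple` tells the format the element count is
statically known, so compact formats can omit a length prefix:
```
use serde::ser::{Serialize, SerializeTuple, Serializer};

// Hypothetical fixed-size identifier.
struct Id([u8; 16]);

impl Serialize for Id {
    fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
    where
        S: Serializer,
    {
        // The length is known up front, so formats that honor the hint
        // do not need to write it out.
        let mut tup = serializer.serialize_tuple(self.0.len())?;
        for byte in &self.0 {
            tup.serialize_element(byte)?;
        }
        tup.end()
    }
}
```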
This implements the KISS approach suggested in
https://github.com/serde-rs/serde/issues/790. It is possible that one of the
other approaches may be better, but this seemed like the simplest one to
reignite some discussion.
Personally I find the original suggestion of adding two traits perhaps
slightly cleaner in theory, but I think it ends up more complicated in the
end, since the added traits would also need to be duplicated to the `Seed`
traits.
Closes #790
This commit implements the two serde traits for the libstd `OsStr` and
`OsString` types. This came up as a use case while implementing sccache, where
we're basically just doing IPC to communicate paths around. Additionally, the
`Path` and `PathBuf` implementations have been updated to delegate to the OS
string ones.
These types are platform-specific, however, so the serialization/deserialization
isn't trivial. Currently this "fakes" a newtype variant for Unix/Windows to
prevent cross-platform serialization/deserialization. This means that if you're
doing IPC within the same OS (e.g. Windows to Windows), serialization should be
infallible. If you're doing IPC across platforms (e.g. Unix to Windows), using
`OsString` is guaranteed to fail, as bytes from one OS won't deserialize on the
other (even if they're Unicode).
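A minimal sketch of the same-OS IPC use case, using a hypothetical request
type; the paths round-trip because `PathBuf` now delegates to the `OsString`
implementation:
```
use std::path::PathBuf;
use serde::{Deserialize, Serialize};

// Hypothetical message passed between processes on the same OS.
#[derive(Serialize, Deserialize)]
struct CompileRequest {
    source: PathBuf,
    output: PathBuf,
}
```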
This allows structs annotated with `#[serde(default)]` to use the default
value for each field as defined in the struct's `std::default::Default`
implementation, rather than the default value for the field's type.
```
#[derive(Deserialize)]
#[serde(default)]
struct StructDefault {
    a: i32,
    b: String,
}

impl Default for StructDefault {
    fn default() -> StructDefault {
        StructDefault {
            a: 100,
            b: "default".to_string(),
        }
    }
}
```
The code above will now return `100` for field `a` and `"default"` for
`b`, rather than `0` and `""` respectively.
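As a usage sketch, assuming serde_json as the format, deserializing an empty
object now fills in the struct's own defaults:
```
// Hedged sketch: requires the serde_json crate.
let s: StructDefault = serde_json::from_str("{}").unwrap();
assert_eq!(s.a, 100);
assert_eq!(s.b, "default");
```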