Rollup merge of #71167 - RalfJung:big-o, r=shepmaster
big-O notation: parentheses for function calls, explicit multiplication

I saw `O(n m log n)` in the docs and found that really hard to parse. In particular, I don't think we should use blank space as syntax for *both* multiplication and function calls; that is just confusing. This PR makes both multiplication and function calls explicit, using Rust-like syntax. If you prefer, I can also leave one of them implicit, but I believe explicit is better here. While I was at it, I also added backticks consistently.
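As an illustration of the convention (hypothetical example, not quoted from the diff below): a bound that used to be written `O(m n log n)` is now written `O(m * n * log(n))`, so both the multiplications and the `log(...)` call are explicit.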
commit d5d9bf0406
@@ -1,10 +1,10 @@
 //! A priority queue implemented with a binary heap.
 //!
-//! Insertion and popping the largest element have `O(log n)` time complexity.
+//! Insertion and popping the largest element have `O(log(n))` time complexity.
 //! Checking the largest element is `O(1)`. Converting a vector to a binary heap
 //! can be done in-place, and has `O(n)` complexity. A binary heap can also be
-//! converted to a sorted vector in-place, allowing it to be used for an `O(n
-//! log n)` in-place heapsort.
+//! converted to a sorted vector in-place, allowing it to be used for an `O(n * log(n))`
+//! in-place heapsort.
 //!
 //! # Examples
 //!
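Not part of the diff, but as context for the module docs above, a minimal sketch of the in-place heapsort they describe, using the standard `BinaryHeap` API:

```rust
use std::collections::BinaryHeap;

fn main() {
    // Vec -> BinaryHeap is an O(n) in-place heapify...
    let heap: BinaryHeap<i32> = vec![3, 1, 4, 1, 5].into();
    // ...and BinaryHeap -> sorted Vec is O(n * log(n)), giving an in-place heapsort.
    assert_eq!(heap.into_sorted_vec(), [1, 1, 3, 4, 5]);
}
```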
@@ -233,9 +233,9 @@
 ///
 /// # Time complexity
 ///
-/// | [push] | [pop]    | [peek]/[peek\_mut] |
-/// |--------|----------|--------------------|
-/// | O(1)~  | O(log n) | O(1)               |
+/// | [push] | [pop]     | [peek]/[peek\_mut] |
+/// |--------|-----------|--------------------|
+/// | O(1)~  | O(log(n)) | O(1)               |
 ///
 /// The value for `push` is an expected cost; the method documentation gives a
 /// more detailed analysis.
@@ -398,7 +398,7 @@ pub fn with_capacity(capacity: usize) -> BinaryHeap<T> {
 ///
 /// # Time complexity
 ///
-/// Cost is O(1) in the worst case.
+/// Cost is `O(1)` in the worst case.
 #[stable(feature = "binary_heap_peek_mut", since = "1.12.0")]
 pub fn peek_mut(&mut self) -> Option<PeekMut<'_, T>> {
 if self.is_empty() { None } else { Some(PeekMut { heap: self, sift: true }) }
@@ -422,8 +422,7 @@ pub fn peek_mut(&mut self) -> Option<PeekMut<'_, T>> {
 ///
 /// # Time complexity
 ///
-/// The worst case cost of `pop` on a heap containing *n* elements is O(log
-/// n).
+/// The worst case cost of `pop` on a heap containing *n* elements is `O(log(n))`.
 #[stable(feature = "rust1", since = "1.0.0")]
 pub fn pop(&mut self) -> Option<T> {
 self.data.pop().map(|mut item| {
@@ -456,15 +455,15 @@ pub fn pop(&mut self) -> Option<T> {
 ///
 /// The expected cost of `push`, averaged over every possible ordering of
 /// the elements being pushed, and over a sufficiently large number of
-/// pushes, is O(1). This is the most meaningful cost metric when pushing
+/// pushes, is `O(1)`. This is the most meaningful cost metric when pushing
 /// elements that are *not* already in any sorted pattern.
 ///
 /// The time complexity degrades if elements are pushed in predominantly
 /// ascending order. In the worst case, elements are pushed in ascending
-/// sorted order and the amortized cost per push is O(log n) against a heap
+/// sorted order and the amortized cost per push is `O(log(n))` against a heap
 /// containing *n* elements.
 ///
-/// The worst case cost of a *single* call to `push` is O(n). The worst case
+/// The worst case cost of a *single* call to `push` is `O(n)`. The worst case
 /// occurs when capacity is exhausted and needs a resize. The resize cost
 /// has been amortized in the previous figures.
 #[stable(feature = "rust1", since = "1.0.0")]
@@ -623,7 +622,7 @@ fn log2_fast(x: usize) -> usize {
 
 // `rebuild` takes O(len1 + len2) operations
 // and about 2 * (len1 + len2) comparisons in the worst case
-// while `extend` takes O(len2 * log_2(len1)) operations
+// while `extend` takes O(len2 * log(len1)) operations
 // and about 1 * len2 * log_2(len1) comparisons in the worst case,
 // assuming len1 >= len2.
 #[inline]
@@ -644,7 +643,7 @@ fn better_to_rebuild(len1: usize, len2: usize) -> bool {
 /// The remaining elements will be removed on drop in heap order.
 ///
 /// Note:
-/// * `.drain_sorted()` is O(n lg n); much slower than `.drain()`.
+/// * `.drain_sorted()` is `O(n * log(n))`; much slower than `.drain()`.
 /// You should use the latter for most cases.
 ///
 /// # Examples
@@ -729,7 +728,7 @@ pub fn into_iter_sorted(self) -> IntoIterSorted<T> {
 ///
 /// # Time complexity
 ///
-/// Cost is O(1) in the worst case.
+/// Cost is `O(1)` in the worst case.
 #[stable(feature = "rust1", since = "1.0.0")]
 pub fn peek(&self) -> Option<&T> {
 self.data.get(0)
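The `BinaryHeap` hunks above all concern the same handful of operations; for context (standard library API, not part of this change), a minimal usage sketch:

```rust
use std::collections::BinaryHeap;

fn main() {
    let mut heap = BinaryHeap::new();
    heap.push(3); // expected O(1), amortized
    heap.push(1);
    heap.push(5);
    assert_eq!(heap.peek(), Some(&5)); // O(1)
    assert_eq!(heap.pop(), Some(5));   // O(log(n)) worst case
}
```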
@@ -40,7 +40,7 @@
 /// performance on *small* nodes of elements which are cheap to compare. However in the future we
 /// would like to further explore choosing the optimal search strategy based on the choice of B,
 /// and possibly other factors. Using linear search, searching for a random element is expected
-/// to take O(B log<sub>B</sub>n) comparisons, which is generally worse than a BST. In practice,
+/// to take O(B * log(n)) comparisons, which is generally worse than a BST. In practice,
 /// however, performance is excellent.
 ///
 /// It is a logic error for a key to be modified in such a way that the key's ordering relative to
@@ -390,7 +390,7 @@ pub const fn new() -> Self {
 /// This reuses all the nodes from `other` and moves them into `self`. After
 /// this operation, `other` becomes empty.
 ///
-/// This operation should compute in O(1) time and O(1) memory.
+/// This operation should compute in `O(1)` time and `O(1)` memory.
 ///
 /// # Examples
 ///
@@ -547,7 +547,7 @@ pub fn cursor_back_mut(&mut self) -> CursorMut<'_, T> {
 
 /// Returns `true` if the `LinkedList` is empty.
 ///
-/// This operation should compute in O(1) time.
+/// This operation should compute in `O(1)` time.
 ///
 /// # Examples
 ///
@@ -568,7 +568,7 @@ pub fn is_empty(&self) -> bool {
 
 /// Returns the length of the `LinkedList`.
 ///
-/// This operation should compute in O(1) time.
+/// This operation should compute in `O(1)` time.
 ///
 /// # Examples
 ///
@@ -594,7 +594,7 @@ pub fn len(&self) -> usize {
 
 /// Removes all elements from the `LinkedList`.
 ///
-/// This operation should compute in O(n) time.
+/// This operation should compute in `O(n)` time.
 ///
 /// # Examples
 ///
@@ -737,7 +737,7 @@ pub fn back_mut(&mut self) -> Option<&mut T> {
 
 /// Adds an element first in the list.
 ///
-/// This operation should compute in O(1) time.
+/// This operation should compute in `O(1)` time.
 ///
 /// # Examples
 ///
@@ -760,7 +760,7 @@ pub fn push_front(&mut self, elt: T) {
 /// Removes the first element and returns it, or `None` if the list is
 /// empty.
 ///
-/// This operation should compute in O(1) time.
+/// This operation should compute in `O(1)` time.
 ///
 /// # Examples
 ///
@@ -783,7 +783,7 @@ pub fn pop_front(&mut self) -> Option<T> {
 
 /// Appends an element to the back of a list.
 ///
-/// This operation should compute in O(1) time.
+/// This operation should compute in `O(1)` time.
 ///
 /// # Examples
 ///
@@ -803,7 +803,7 @@ pub fn push_back(&mut self, elt: T) {
 /// Removes the last element from a list and returns it, or `None` if
 /// it is empty.
 ///
-/// This operation should compute in O(1) time.
+/// This operation should compute in `O(1)` time.
 ///
 /// # Examples
 ///
@@ -824,7 +824,7 @@ pub fn pop_back(&mut self) -> Option<T> {
 /// Splits the list into two at the given index. Returns everything after the given index,
 /// including the index.
 ///
-/// This operation should compute in O(n) time.
+/// This operation should compute in `O(n)` time.
 ///
 /// # Panics
 ///
@@ -880,7 +880,7 @@ pub fn split_off(&mut self, at: usize) -> LinkedList<T> {
 
 /// Removes the element at the given index and returns it.
 ///
-/// This operation should compute in O(n) time.
+/// This operation should compute in `O(n)` time.
 ///
 /// # Panics
 /// Panics if at >= len
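For context on the `LinkedList` operations whose docs are touched above (standard library API, not part of this change), a minimal sketch:

```rust
use std::collections::LinkedList;

fn main() {
    let mut list: LinkedList<u32> = LinkedList::new();
    list.push_front(0); // O(1)
    list.push_back(1);  // O(1)
    list.push_back(2);  // O(1)

    assert_eq!(list.len(), 3);             // O(1): the length is stored
    let tail = list.split_off(1);          // O(n): walks to the split point
    assert_eq!(tail.front(), Some(&1));
    assert_eq!(list.pop_front(), Some(0)); // O(1)
}
```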
@@ -1391,7 +1391,7 @@ fn is_contiguous(&self) -> bool {
 /// Removes an element from anywhere in the `VecDeque` and returns it,
 /// replacing it with the first element.
 ///
-/// This does not preserve ordering, but is O(1).
+/// This does not preserve ordering, but is `O(1)`.
 ///
 /// Returns `None` if `index` is out of bounds.
 ///
@@ -1426,7 +1426,7 @@ pub fn swap_remove_front(&mut self, index: usize) -> Option<T> {
 /// Removes an element from anywhere in the `VecDeque` and returns it, replacing it with the
 /// last element.
 ///
-/// This does not preserve ordering, but is O(1).
+/// This does not preserve ordering, but is `O(1)`.
 ///
 /// Returns `None` if `index` is out of bounds.
 ///
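A minimal sketch of the `VecDeque::swap_remove_*` behaviour documented above (standard library API, not part of this change):

```rust
use std::collections::VecDeque;

fn main() {
    let mut deque: VecDeque<u32> = (0..5).collect(); // [0, 1, 2, 3, 4]

    // O(1): the hole is filled by the back element, so ordering is not preserved.
    assert_eq!(deque.swap_remove_back(1), Some(1));
    assert_eq!(Vec::from(deque.clone()), vec![0, 4, 2, 3]);

    // O(1): the hole is filled by the front element.
    assert_eq!(deque.swap_remove_front(2), Some(2));
    assert_eq!(Vec::from(deque), vec![4, 0, 3]);
}
```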
@@ -2927,7 +2927,7 @@ impl<T> From<VecDeque<T>> for Vec<T> {
 /// [`Vec<T>`]: crate::vec::Vec
 /// [`VecDeque<T>`]: crate::collections::VecDeque
 ///
-/// This never needs to re-allocate, but does need to do O(n) data movement if
+/// This never needs to re-allocate, but does need to do `O(n)` data movement if
 /// the circular buffer doesn't happen to be at the beginning of the allocation.
 ///
 /// # Examples
@@ -165,7 +165,7 @@ pub fn to_vec<T>(s: &[T]) -> Vec<T>
 impl<T> [T] {
 /// Sorts the slice.
 ///
-/// This sort is stable (i.e., does not reorder equal elements) and `O(n log n)` worst-case.
+/// This sort is stable (i.e., does not reorder equal elements) and `O(n * log(n))` worst-case.
 ///
 /// When applicable, unstable sorting is preferred because it is generally faster than stable
 /// sorting and it doesn't allocate auxiliary memory.
@@ -200,7 +200,7 @@ pub fn sort(&mut self)
 
 /// Sorts the slice with a comparator function.
 ///
-/// This sort is stable (i.e., does not reorder equal elements) and `O(n log n)` worst-case.
+/// This sort is stable (i.e., does not reorder equal elements) and `O(n * log(n))` worst-case.
 ///
 /// The comparator function must define a total ordering for the elements in the slice. If
 /// the ordering is not total, the order of the elements is unspecified. An order is a
@@ -254,7 +254,7 @@ pub fn sort_by<F>(&mut self, mut compare: F)
 
 /// Sorts the slice with a key extraction function.
 ///
-/// This sort is stable (i.e., does not reorder equal elements) and `O(m n log n)`
+/// This sort is stable (i.e., does not reorder equal elements) and `O(m * n * log(n))`
 /// worst-case, where the key function is `O(m)`.
 ///
 /// For expensive key functions (e.g. functions that are not simple property accesses or
@@ -297,7 +297,7 @@ pub fn sort_by_key<K, F>(&mut self, mut f: F)
 ///
 /// During sorting, the key function is called only once per element.
 ///
-/// This sort is stable (i.e., does not reorder equal elements) and `O(m n + n log n)`
+/// This sort is stable (i.e., does not reorder equal elements) and `O(m * n + n * log(n))`
 /// worst-case, where the key function is `O(m)`.
 ///
 /// For simple key functions (e.g., functions that are property accesses or
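For context on the stable slice sorts above (standard library API, not part of this change), a minimal sketch:

```rust
fn main() {
    let mut v = vec!["hello", "a", "rust"];

    // Stable sort: O(n * log(n)) worst case.
    v.sort();
    assert_eq!(v, ["a", "hello", "rust"]);

    // Stable sort by key: O(m * n * log(n)) worst case for an O(m) key function.
    v.sort_by_key(|s| s.len());
    assert_eq!(v, ["a", "rust", "hello"]);
}
```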
@@ -935,7 +935,7 @@ fn drop(&mut self) {
 /// 1. for every `i` in `1..runs.len()`: `runs[i - 1].len > runs[i].len`
 /// 2. for every `i` in `2..runs.len()`: `runs[i - 2].len > runs[i - 1].len + runs[i].len`
 ///
-/// The invariants ensure that the total running time is `O(n log n)` worst-case.
+/// The invariants ensure that the total running time is `O(n * log(n))` worst-case.
 fn merge_sort<T, F>(v: &mut [T], mut is_less: F)
 where
 F: FnMut(&T, &T) -> bool,
@@ -1606,7 +1606,7 @@ pub fn binary_search_by_key<'a, B, F>(&'a self, b: &B, mut f: F) -> Result<usize
 /// Sorts the slice, but may not preserve the order of equal elements.
 ///
 /// This sort is unstable (i.e., may reorder equal elements), in-place
-/// (i.e., does not allocate), and `O(n log n)` worst-case.
+/// (i.e., does not allocate), and `O(n * log(n))` worst-case.
 ///
 /// # Current implementation
 ///
@@ -1642,7 +1642,7 @@ pub fn sort_unstable(&mut self)
 /// elements.
 ///
 /// This sort is unstable (i.e., may reorder equal elements), in-place
-/// (i.e., does not allocate), and `O(n log n)` worst-case.
+/// (i.e., does not allocate), and `O(n * log(n))` worst-case.
 ///
 /// The comparator function must define a total ordering for the elements in the slice. If
 /// the ordering is not total, the order of the elements is unspecified. An order is a
@@ -1697,7 +1697,7 @@ pub fn sort_unstable_by<F>(&mut self, mut compare: F)
 /// elements.
 ///
 /// This sort is unstable (i.e., may reorder equal elements), in-place
-/// (i.e., does not allocate), and `O(m n log n)` worst-case, where the key function is
+/// (i.e., does not allocate), and `O(m * n * log(n))` worst-case, where the key function is
 /// `O(m)`.
 ///
 /// # Current implementation
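And the unstable, in-place counterparts (standard library API, not part of this change):

```rust
use std::cmp::Reverse;

fn main() {
    let mut v = vec![5, 4, 1, 3, 2];

    // Unstable, in-place sort: O(n * log(n)) worst case, no allocation.
    v.sort_unstable();
    assert_eq!(v, [1, 2, 3, 4, 5]);

    // Unstable sort by key: O(m * n * log(n)) worst case for an O(m) key function.
    v.sort_unstable_by_key(|x| Reverse(*x));
    assert_eq!(v, [5, 4, 3, 2, 1]);
}
```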
@@ -1957,7 +1957,7 @@ pub fn partition_dedup_by<F>(&mut self, mut same_bucket: F) -> (&mut [T], &mut [
 // over all the elements, swapping as we go so that at the end
 // the elements we wish to keep are in the front, and those we
 // wish to reject are at the back. We can then split the slice.
-// This operation is still O(n).
+// This operation is still `O(n)`.
 //
 // Example: We start in this state, where `r` represents "next
 // read" and `w` represents "next_write`.
@@ -143,7 +143,7 @@ fn insertion_sort<T, F>(v: &mut [T], is_less: &mut F)
 }
 }
 
-/// Sorts `v` using heapsort, which guarantees `O(n log n)` worst-case.
+/// Sorts `v` using heapsort, which guarantees `O(n * log(n))` worst-case.
 #[cold]
 pub fn heapsort<T, F>(v: &mut [T], is_less: &mut F)
 where
@@ -621,7 +621,7 @@ fn recurse<'a, T, F>(mut v: &'a mut [T], is_less: &mut F, mut pred: Option<&'a T
 }
 
 // If too many bad pivot choices were made, simply fall back to heapsort in order to
-// guarantee `O(n log n)` worst-case.
+// guarantee `O(n * log(n))` worst-case.
 if limit == 0 {
 heapsort(v, is_less);
 return;
@@ -684,7 +684,7 @@ fn recurse<'a, T, F>(mut v: &'a mut [T], is_less: &mut F, mut pred: Option<&'a T
 }
 }
 
-/// Sorts `v` using pattern-defeating quicksort, which is `O(n log n)` worst-case.
+/// Sorts `v` using pattern-defeating quicksort, which is `O(n * log(n))` worst-case.
 pub fn quicksort<T, F>(v: &mut [T], mut is_less: F)
 where
 F: FnMut(&T, &T) -> bool,
@@ -110,10 +110,10 @@
 //!
 //! For Sets, all operations have the cost of the equivalent Map operation.
 //!
-//! |              | get       | insert   | remove   | predecessor | append |
-//! |--------------|-----------|----------|----------|-------------|--------|
-//! | [`HashMap`]  | O(1)~     | O(1)~*   | O(1)~    | N/A         | N/A    |
-//! | [`BTreeMap`] | O(log n)  | O(log n) | O(log n) | O(log n)    | O(n+m) |
+//! |              | get       | insert    | remove    | predecessor | append |
+//! |--------------|-----------|-----------|-----------|-------------|--------|
+//! | [`HashMap`]  | O(1)~     | O(1)~*    | O(1)~     | N/A         | N/A    |
+//! | [`BTreeMap`] | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n))   | O(n+m) |
 //!
 //! # Correct and Efficient Usage of Collections
 //!
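For context on the map operations the table above covers (standard library API, not part of this change), a minimal sketch:

```rust
use std::collections::{BTreeMap, HashMap};

fn main() {
    // HashMap: expected O(1)~ get / insert / remove.
    let mut h = HashMap::new();
    h.insert("a", 1);
    assert_eq!(h.get("a"), Some(&1));
    assert_eq!(h.remove("a"), Some(1));

    // BTreeMap: O(log(n)) get / insert / remove, plus ordered queries.
    let mut b = BTreeMap::new();
    b.insert(2, "two");
    b.insert(1, "one");
    // A "predecessor"-style query via a range: the largest key strictly below 2.
    assert_eq!(b.range(..2).next_back(), Some((&1, &"one")));
}
```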
@@ -43,8 +43,8 @@
 //! terminator, so the buffer length is really `len+1` characters.
 //! Rust strings don't have a nul terminator; their length is always
 //! stored and does not need to be calculated. While in Rust
-//! accessing a string's length is a O(1) operation (because the
-//! length is stored); in C it is an O(length) operation because the
+//! accessing a string's length is a `O(1)` operation (because the
+//! length is stored); in C it is an `O(length)` operation because the
 //! length needs to be computed by scanning the string for the nul
 //! terminator.
 //!
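A minimal sketch of the `O(1)` vs `O(length)` point made above (standard library API, not part of this change):

```rust
use std::ffi::CString;

fn main() {
    // A Rust string stores its length, so `len()` is O(1).
    let s = String::from("hello");
    assert_eq!(s.len(), 5);

    // A C string's length has to be found by scanning for the nul terminator,
    // which is O(length); `CString::new` does such a scan to reject interior nuls.
    let c = CString::new("hello").unwrap();
    assert_eq!(c.as_bytes_with_nul(), b"hello\0");
}
```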