consensus_encoding: tighten memory bound in VecDecoder? #5157

@apoelstra

Description

Right now in VecDecoder we have the code

            // `cast_to_usize_if_valid` asserts length < 4,000,000, so no DoS vector here.
            self.buffer = Vec::with_capacity(self.bytes_expected);

But asserting that the vector length is under 4,000,000 only bounds the number of Ts we allocate, and T's size in principle has no bound. In rust-bitcoin I think the largest type we deserialize is TxIn, which on my system is 104 bytes, so this is not a 4-meg allocation limit but a 416-meg one. (And if downstream users try to use this with some T of their own, say an Elements type containing inline surjection proofs or something, the limit might be very high indeed.)

This might be fine, but if so we should at least clarify the comment. If not, we may want to add a method to Decodable indicating a "minimum encoded size" for an object. In particular, this would be 9 for TxOut, 41 for TxIn, 10 for Transaction, 80 for BlockHeader, etc.
