[C++][Parquet] Decoding byte stream split encoded columns fails when it has null values #28737

@asfimport

Description

Reading from a Parquet file fails with the following error:

Data size too small for number of values (corrupted file?).

This happens when a BYTE_STREAM_SPLIT-encoded column has fewer values stored than the number of rows, which is the case when the column contains null values (definition levels are present).
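For context, BYTE_STREAM_SPLIT stores byte k of every value in stream k, and only non-null values are written, so the encoded buffer holds num_values * sizeof(T) bytes, which can be smaller than num_rows * sizeof(T). A minimal self-contained sketch of the layout (hypothetical helper names, not the Arrow implementation):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Sketch: byte-stream-split encode floats by scattering byte k of each
// value into stream k. Only non-null values are passed in, so the output
// length reflects the stored value count, not the row count.
std::vector<uint8_t> ByteStreamSplitEncode(const std::vector<float>& values) {
  const size_t n = values.size();
  std::vector<uint8_t> out(n * sizeof(float));
  for (size_t i = 0; i < n; ++i) {
    uint8_t bytes[sizeof(float)];
    std::memcpy(bytes, &values[i], sizeof(float));
    for (size_t k = 0; k < sizeof(float); ++k) {
      out[k * n + i] = bytes[k];  // byte k of value i goes into stream k
    }
  }
  return out;
}

// Sketch: the inverse transform. The value count is recoverable from the
// buffer length alone: n = data.size() / sizeof(float).
std::vector<float> ByteStreamSplitDecode(const std::vector<uint8_t>& data) {
  const size_t n = data.size() / sizeof(float);
  std::vector<float> out(n);
  for (size_t i = 0; i < n; ++i) {
    uint8_t bytes[sizeof(float)];
    for (size_t k = 0; k < sizeof(float); ++k) {
      bytes[k] = data[k * n + i];
    }
    std::memcpy(&out[i], bytes, sizeof(float));
  }
  return out;
}
```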

The problematic part is the condition checked in ByteStreamSplitDecoder<DType>::SetData, which raises this error whenever the number of values does not match the size of the data buffer.

I'm not sure I have enough experience with the internals of the encoding/decoding part of this implementation to fix the issue myself, but my suggestion would be to initialize num_values_in_buffer_ with len / static_cast<int64_t>(sizeof(T)) instead.
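The suggestion amounts to deriving the buffered value count from the buffer length rather than requiring it to equal num_values, which counts nulls. A hedged, self-contained sketch of that logic (hypothetical function name; not the actual Arrow code):

```cpp
#include <cstdint>
#include <stdexcept>

// Hypothetical sketch of the suggested SetData logic: compute the number
// of values actually present in the buffer from its byte length. With
// nulls present this may legitimately be smaller than num_values, so
// requiring exact equality (as the current check does) rejects valid pages.
template <typename T>
int64_t ComputeNumBufferedValues(int num_values, int64_t len) {
  if (len % static_cast<int64_t>(sizeof(T)) != 0) {
    throw std::runtime_error(
        "Data size not a multiple of type width (corrupted file?)");
  }
  const int64_t num_in_buffer = len / static_cast<int64_t>(sizeof(T));
  if (num_in_buffer > num_values) {
    throw std::runtime_error("More values in buffer than expected");
  }
  return num_in_buffer;
}
```

For example, a page of 5 rows with 2 nulls and 4-byte floats carries 12 bytes, and the sketch yields 3 buffered values rather than raising an error.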

Reporter: Roman Karlstetter / @romankarlstetter

Note: This issue was originally created as ARROW-13024. Please see the migration documentation for further details.
