
appending to a table with Decimal > 32767 results in int too big to convert #669

@vtk9

Description

Apache Iceberg version

0.6.0 (latest release)

Please describe the bug 🐞

Hello,

Is this a bug, or am I misunderstanding/misusing something obvious with respect to Decimal? Since 32767 = 2^15 - 1, perhaps this is a signed 16-bit overflow or an endianness issue? (I am relatively new to Iceberg.) Tested on macOS, M2 (arm64).

from decimal import Decimal
from pyiceberg.catalog.sql import SqlCatalog
import pyarrow as pa

pylist = [{'decimal_col': Decimal('32768.')}]
arrow_schema = pa.schema(
    [
        pa.field('decimal_col', pa.decimal128(38, 0)),
    ],
)
arrow_table = pa.Table.from_pylist(pylist, schema=arrow_schema)

catalog = SqlCatalog(
    'test_catalog',
    **{
        'type': 'sql',
        'uri': 'sqlite:///pyiceberg.db',
    },
)

namespace = 'test_ns'
table_name = 'test_table'

catalog.create_namespace(namespace=namespace)
new_table = catalog.create_table(
    identifier=f'{namespace}.{table_name}',
    schema=arrow_schema,
    location='.',
)

new_table.append(arrow_table)
OverflowError: int too big to convert
  • Note: pylist = [{'decimal_col': Decimal('32767.')}] works
  • Switching to pa.field('decimal_col', pa.decimal128(38, 1)) also works
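For what it's worth, the error message and the 32767/32768 boundary are consistent with a signed int.to_bytes conversion whose byte length is computed from the unsigned magnitude. This is only a guess at the cause, not PyIceberg's actual code, but a minimal sketch reproduces the exact message:

```python
from decimal import Decimal

def unsigned_byte_length(value: int) -> int:
    # Bytes needed for the magnitude alone, leaving no room for a sign bit.
    # (Hypothetical helper for illustration; not from PyIceberg.)
    return (value.bit_length() + 7) // 8

ok = int(Decimal('32767.'))   # bit_length() == 15 -> 2 bytes, sign bit still fits
bad = int(Decimal('32768.'))  # bit_length() == 16 -> 2 bytes, but the sign bit no longer fits

print(ok.to_bytes(unsigned_byte_length(ok), 'big', signed=True))  # b'\x7f\xff'

try:
    bad.to_bytes(unsigned_byte_length(bad), 'big', signed=True)
except OverflowError as exc:
    print(exc)  # int too big to convert
```

This would also explain why scale 1 works: Decimal('32768.') at scale 1 has unscaled value 327680, whose bit length (19) rounds up to 3 bytes, which is enough to hold the sign bit.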

Thank you!
