
Encode images as a byte blob + pixel format + resolution; not as a tensor #6386

@teh-cmc

Description

  • The blob is untyped -- it's always a u8 blob, not a generic TensorData.
  • The stride should be a component, overridable in the UI and/or blueprint.
  • The pixel format should be a component, overridable in the UI and/or blueprint.

Examples of pixel formats:

  • RGBA8
  • RG32F
  • NV12
  • JPEG
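These formats behave very differently with respect to stride: packed formats like RGBA8 and RG32F have a simple bytes-per-row, NV12 is planar (so a plane stride matters too), and JPEG is compressed and has no stride at all. A minimal sketch of the arithmetic, purely illustrative (none of these helpers exist in the codebase):

```python
# Illustrative stride arithmetic for the example pixel formats.
# NV12 is planar (full-res Y plane + half-res interleaved UV plane),
# and JPEG is compressed, so neither has a simple per-row stride.

def bytes_per_row(pixel_format: str, width: int) -> int:
    bytes_per_pixel = {
        "RGBA8": 4,  # 4 channels x 1 byte
        "RG32F": 8,  # 2 channels x 4 bytes (f32)
    }
    return width * bytes_per_pixel[pixel_format]

def nv12_buffer_size(width: int, height: int) -> int:
    # Y plane: width * height bytes; UV plane: width * height / 2 bytes.
    return width * height * 3 // 2

print(bytes_per_row("RGBA8", 640))   # 2560
print(nv12_buffer_size(640, 480))    # 460800
```

This is also why the stride component is optional: for tightly-packed formats the viewer can derive it from the resolution and pixel format.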

Also:

  • "Swizzling" width and height can be handled using a 2D transform.
  • If someone creates an Image directly from a numpy tensor, we still log the shape as a component; the viewer uses it to guess the stride and pixel format unless those have been logged explicitly by the user.
  • Some pixel formats might require secondary caching due to heavy intermediate representation changes (e.g. JPEG).
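The shape-based guessing mentioned above could look roughly like the following sketch. The format names (L8, RGB8, R32F) and the heuristic itself are assumptions for illustration, not the viewer's actual logic:

```python
import numpy as np

def guess_pixel_format(shape: tuple, dtype: np.dtype) -> str:
    """Hypothetical heuristic: map a tensor shape + dtype to a pixel format."""
    if len(shape) == 2:
        channels = 1          # H x W: single-channel image
    elif len(shape) == 3:
        channels = shape[2]   # H x W x C
    else:
        raise ValueError(f"unsupported shape: {shape}")

    if dtype == np.uint8:
        return {1: "L8", 3: "RGB8", 4: "RGBA8"}[channels]
    if dtype == np.float32:
        return {1: "R32F", 2: "RG32F"}[channels]
    raise ValueError(f"unsupported dtype: {dtype}")

img = np.zeros((480, 640, 4), dtype=np.uint8)
print(guess_pixel_format(img.shape, img.dtype))  # RGBA8
```

Explicitly logged pixel format and stride components would always take precedence over this guess.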

Update

Following live discussion, we introduce a separate ImageEncoded archetype alongside the reworked Image archetype:

archetype Image {
  buffer: ImageBuffer,
  resolution: Resolution2D,
  pixel_format: PixelFormat,
  stride: Option<ImageStride>,
}

component ImageStride {
  bytes_per_row: u32,
  bytes_per_plane: Option<u32>,
}

/// e.g. JPEG, PNG, …
archetype ImageEncoded {
  blob: Blob,
  media_type: Option<MediaType>,
}
