[C++] What's the difference between ARROW_SIMD_LEVEL and ARROW_RUNTIME_SIMD_LEVEL #40333

@mapleFU

Description

In Arrow, we have two SIMD flags in CMake, listed below:

  define_option_string(ARROW_SIMD_LEVEL
                       "Compile-time SIMD optimization level"
                       "DEFAULT" # default to SSE4_2 on x86, NEON on Arm, NONE otherwise
                       "NONE"
                       "SSE4_2"
                       "AVX2"
                       "AVX512"
                       "NEON"
                       "SVE" # size agnostic SVE
                       "SVE128" # fixed size SVE
                       "SVE256" # "
                       "SVE512" # "
                       "DEFAULT")

  define_option_string(ARROW_RUNTIME_SIMD_LEVEL
                       "Max runtime SIMD optimization level"
                       "MAX" # default to max supported by compiler
                       "NONE"
                       "SSE4_2"
                       "AVX2"
                       "AVX512"
                       "MAX")

They were added in these two patches:

  1. c15637d
  2. f8c9c8b

However, ARROW_RUNTIME_SIMD_LEVEL is also checked at compile time. So, what's the difference between these two flags? Or do I misunderstand the meaning of "runtime" here (I mean dynamically checking it when running the program)?

Component(s)

C++
