Conversation

@wesm (Member) commented Jun 5, 2020

The idea of this patch is to provide a more comprehensive baseline for the optimization work I'm undertaking.

Summary:

  • Benchmark take when the indices are monotonic and contain no nulls. Monotonic takes are much faster because they access memory sequentially rather than at random (see the sketch after this list).
  • Test null percentages down to 0.01% (even 1% is a lot of nulls, and obscures how behavior changes between 1% and 0%).
  • Benchmark take indices / filter masks both with and without nulls, since there may be faster code paths for the no-nulls case.
  • Benchmark the case where the values being taken/filtered contain no nulls at all.
  • Benchmark filtering/taking smaller strings. The benchmarks previously used strings of size 0 to 128; realistic workloads generally work with smaller strings, so I changed the range to 0 to 32, with an average size of 16.
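
To illustrate the monotonic-vs-random distinction in the first bullet, here is a minimal, self-contained google-benchmark sketch of a naive take over int64 values. It is not the Arrow benchmark code itself; the names, sizes, and the plain-vector gather are illustrative only.

#include <benchmark/benchmark.h>

#include <algorithm>
#include <cstdint>
#include <numeric>
#include <random>
#include <vector>

// Build an index vector of n positions into a value array of the same size.
// If monotonic is true the indices stay sorted, so the take below walks memory
// sequentially; otherwise they are shuffled and access is effectively random.
static std::vector<int64_t> MakeIndices(int64_t n, bool monotonic) {
  std::vector<int64_t> indices(n);
  std::iota(indices.begin(), indices.end(), 0);
  if (!monotonic) {
    std::mt19937 gen(42);
    std::shuffle(indices.begin(), indices.end(), gen);
  }
  return indices;
}

// Naive "take": gather values[indices[i]] into an output vector.
static void BM_TakeInt64(benchmark::State& state) {
  const int64_t n = state.range(0);
  const bool monotonic = state.range(1) != 0;
  std::vector<int64_t> values(n);
  std::iota(values.begin(), values.end(), 0);
  const auto indices = MakeIndices(n, monotonic);
  std::vector<int64_t> out(n);
  for (auto _ : state) {
    for (int64_t i = 0; i < n; ++i) {
      out[i] = values[indices[i]];
    }
    benchmark::DoNotOptimize(out.data());
    benchmark::ClobberMemory();
  }
  state.SetBytesProcessed(state.iterations() * n * sizeof(int64_t));
}

// Second argument: 1 = monotonic (sequential access), 0 = shuffled (random).
BENCHMARK(BM_TakeInt64)->Args({1 << 20, 1})->Args({1 << 20, 0});

BENCHMARK_MAIN();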

wesm requested a review from fsaintjacques on June 5, 2020 at 17:56

};

static void FilterInt64FilterNoNulls(benchmark::State& state) {
  return FilterBenchmark(state, false).Int64();
}

Contributor commented:

You might get compiler warnings for returning in a void function.

Member Author (@wesm) replied:

ah thanks, fixing
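
For context, the fix was presumably just dropping the return keyword; a minimal sketch of the warning-free form, assuming the FilterBenchmark helper from the excerpt above:

static void FilterInt64FilterNoNulls(benchmark::State& state) {
  // Plain call without `return`: the function's return type is void, so the
  // value of the (void) expression carries no information anyway.
  FilterBenchmark(state, false).Int64();
}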
