
build: fix macOS build (requires bazel 0.6.0) #1779

Merged

mattklein123 merged 1 commit into envoyproxy:master from turbinelabs:fix-osx-bazel-0.6.0 on Sep 29, 2017


Conversation

@zuercher (Member)

Re-patched the bazel files cribbed from the bazel tree for OS X. There are fewer changes now, but strip is still broken without these modifications (see bazelbuild/bazel#3838). I may try to submit a patch upstream, since it's just a CROSSTOOL.tpl change now -- the fix for bazelbuild/bazel#209 made it possible to specify flags for strip in that file.

Homebrew doesn't support bazel 0.6.0 yet, so one has to install it manually. I submitted a patch to Homebrew, hopefully it lands soon: Homebrew/homebrew-core#18686.

Signed-off-by: Stephan Zuercher <stephan@turbinelabs.io>
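Since the build now requires bazel 0.6.0 and Homebrew couldn't yet supply it, a build script might want to fail early when an older bazel is on the PATH. A minimal sketch of such a guard — the hard-coded `have` value and the guard itself are illustrative, not part of this PR:

```shell
#!/bin/sh
# Hypothetical version guard (not from this PR): bail out if the
# installed bazel is older than the 0.6.0 this change requires.
required="0.6.0"
# In a real script you would parse this from the tool, e.g.:
#   have=$(bazel version | awk '/Build label/ {print $3}')
have="0.5.4"  # illustrative stale version for the example

# sort -V orders version strings numerically; if the required version
# sorts first (or the two are equal), the installed one is new enough.
oldest=$(printf '%s\n%s\n' "$required" "$have" | sort -V | head -n1)
if [ "$oldest" = "$required" ]; then
  echo "bazel $have satisfies >= $required"
else
  echo "bazel $have is too old; need >= $required"
fi
```

With the illustrative `have="0.5.4"` this prints the "too old" branch; swapping in a parsed 0.6.0 or newer takes the success branch.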

@htuch (Member) left a comment


Thanks for the update. Looking forward to the day we can completely nuke our fork of toolchain autoconfigure; between the work of @jmillikin-stripe and yours, I think we're closing in on this.


@mattklein123 (Member) left a comment


🍎

@mattklein123 mattklein123 merged commit 486cd7a into envoyproxy:master Sep 29, 2017
@zuercher zuercher deleted the fix-osx-bazel-0.6.0 branch September 29, 2017 16:15
costinm pushed a commit to costinm/envoy that referenced this pull request Oct 2, 2017
Signed-off-by: Stephan Zuercher <stephan@turbinelabs.io>
mathetake pushed a commit that referenced this pull request Mar 3, 2026
**Description**

This proposal introduces Batch Processing API support for Envoy AI
Gateway, enabling users to submit large-scale asynchronous inference
jobs with significant cost savings (typically 50% vs real-time
inference). See details in `007-batch-processing/proposal.md` file.

---------

Signed-off-by: Xiaolin Lin <xlin158@bloomberg.net>
