Add e2e test for external metrics #136799
Conversation
@johanneswuerbach, I think this provides the necessary pieces you can use to add end-to-end tests.
```go
ginkgo.It("should scale up and down based on external metric value", func(ctx context.Context) {
	ginkgo.By("Creating the resource consumer deployment")
	initPods := 1
	rc = e2eautoscaling.NewDynamicResourceConsumer(ctx,
```
Unrelated to your change, but this seems to end up using apiVersion: apps/v1beta2; we should fix that at some stage.
Yes - we can handle this in a follow-up PR. It will likely require updates across all tests.
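For reference, the follow-up fix would move the resource consumer's Deployment off the removed apps/v1beta2 API group onto apps/v1. A minimal illustrative manifest (the name, labels, and image tag here are placeholders, not the actual e2e framework values):

```yaml
apiVersion: apps/v1   # apps/v1beta2 was removed in Kubernetes 1.16
kind: Deployment
metadata:
  name: resource-consumer   # placeholder name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: resource-consumer
  template:
    metadata:
      labels:
        app: resource-consumer
    spec:
      containers:
      - name: resource-consumer
        image: registry.k8s.io/e2e-test-images/resource-consumer:latest  # placeholder tag
```

In apps/v1, `spec.selector` is required and immutable, which is the main thing callers constructing Deployments programmatically need to add when migrating off the beta groups.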
Signed-off-by: Omer Aplatony <omerap12@gmail.com>
Force-pushed from bff232e to d712e0e
This is awesome! Thanks for doing it!
/retest
Totally, when this merges I intend to open some good-first-issues to improve coverage.
I'm happy with this; it's adding an e2e test for a feature that's never been tested, which is amazing. /lgtm
LGTM label has been added.
Git tree hash: 129265fafd5a4b704c7bbdfbd8ae5f905ad5020e
```go
func (emc *ExternalMetricsController) doRequestWithPortForward(ctx context.Context, action, metricName string, params url.Values) error {
	// Find the pod
	pods, err := emc.clientSet.CoreV1().Pods(emc.namespace).List(ctx, metav1.ListOptions{
```
Suggestion: you could replace this entire port-forward logic by directly invoking PortForwardOptions.RunPortForwardContext. We've written those commands so that they should be easily embeddable anywhere needed. Also, we're slowly switching away from SPDY, which you're using below, towards WebSockets, so using port-forward's code directly gives you the benefit that you don't have to think about it 😉
Yeah, that sounds good. I will open an issue to address it (I don't want to delay this PR since we need it for other work).
[APPROVALNOTIFIER] This PR is APPROVED. This pull-request has been approved by: omerap12, soltysh. The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing `/approve` in a comment.
/triage accepted
/retest
This commit adds three new test cases to expand the HPA External Metrics test coverage:

1. Multiple Metrics: verifies that a single HPA can correctly handle two or more external metrics simultaneously, ensuring that the highest desired replica count from all metrics is used.
2. Stabilization Window: ensures that the HPA does not scale too aggressively when metric values fluctuate rapidly, by respecting the configured stabilization window.
3. Scaling Limits: validates that the configured scaling limits (behavior.scaleUp and behavior.scaleDown) are enforced correctly, including policies for maximum pods per period.

Additionally, a new helper function CreateMultiMetricHorizontalPodAutoscalerWithBehavior is added to the framework to support creating HPAs with multiple metrics and custom behavior policies.

Fixes: kubernetes#137132
Related to: kubernetes#136799
What type of PR is this?
/kind feature
What this PR does / why we need it:
Following the KEPs:
kubernetes/enhancements#5680 and kubernetes/enhancements#2022, we need a testing framework for HPA based on external metrics.
This PR introduces the necessary helper functions to enable those tests (based on #136721) and adds an initial basic test, since none currently exist.
Which issue(s) this PR is related to:
Special notes for your reviewer:
Does this PR introduce a user-facing change?
Additional documentation e.g., KEPs (Kubernetes Enhancement Proposals), usage docs, etc.: