improve(liquidator): Send log when WDF is purposefully not activated#2432

Merged
nicholaspai merged 8 commits into master from npai/2419
Jan 22, 2021

Conversation

@nicholaspai (Member) commented Jan 19, 2021

Signed-off-by: Nick Pai npai.nyc@gmail.com

Motivation

Problem mentioned as part of #2419

Previously, this error log was being emitted even when the WDF was purposefully not activated (i.e., there was no actual error):

[Screenshot: the erroneously emitted error log, 2021-01-19]

Summary

Liquidator queries LiquidatorStrategy to determine if it should emit a specialized log about the WDF not being activated yet.


Signed-off-by: Nick Pai <npai.nyc@gmail.com>
@coveralls commented Jan 19, 2021

Coverage Status

Coverage remained the same at 93.284% when pulling 39c1912 on npai/2419 into e6a54e6 on master.

@chrismaree (Member) left a comment


Looks good! As discussed IRL, we need to think about whether we always want to be sending out these info messages. We really just need to know when the initial WDF liquidation is sent out and whether the bot is running low on funds / can't extend a liquidation. The way this is set up, we'll see this info message every 5 mins, which I think is fine, but might be a bit noisy.

```diff
-        parseFloat(defenseActivationPercent)
-      )
-        return false;
+      if (!passedDefenseActivationPercent({ position, currentBlockTime })) return false;
```
Member


Can you not just return `passedDefenseActivationPercent({ position, currentBlockTime })` at this point?
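The suggested simplification can be sketched as follows (a hedged illustration: `passedDefenseActivationPercent` is taken from the diff above, but the wrapper function names are assumptions, not the PR's exact code):

```javascript
// Hypothetical wrappers illustrating the review suggestion.

// Before: an explicit early return around the boolean check.
function shouldDefendBefore(utils, position, currentBlockTime) {
  if (!utils.passedDefenseActivationPercent({ position, currentBlockTime })) return false;
  return true;
}

// After: return the boolean expression directly, as suggested.
function shouldDefendAfter(utils, position, currentBlockTime) {
  return utils.passedDefenseActivationPercent({ position, currentBlockTime });
}
```

One subtlety: the early-return version coerces the helper's result to a strict boolean via `!`, while the direct return passes it through unchanged; the two are identical whenever the helper already returns `true`/`false`.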

Member Author


true!

```javascript
) {
  this.logger.info({
    at: "Liquidator",
    message: "Withdrawal liveness has not passed WDF activation threshold, skipping✋",
```
Member


Shall we use a new emoji? This one is used in other places.

Member Author


key comment, will do

@daywiss (Contributor) left a comment


Looks great Nick, I like how you broke out the check into its own function. Approving, assuming you will address @chrismaree's concerns.

```javascript
  this.defenseActivationPercent &&
  !this.liquidationStrategy.utils.passedDefenseActivationPercent({ position, currentBlockTime })
) {
  this.logger.info({
```
Contributor


Will these logs cause PagerDuty to go off? I think I'd prefer not tripping PD on this, since it could happen every time the bot runs.

Member Author


INFO-level logs should not set off PagerDuty.
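The level-gating behavior can be sketched without any particular logging library (a minimal illustration only, not the repo's actual logger or PagerDuty transport configuration):

```javascript
// Minimal sketch: each transport declares a minimum severity; a log call only
// reaches transports whose threshold it meets. An "info" log therefore never
// reaches a pager transport gated at "error".
const SEVERITY = { debug: 0, info: 1, warn: 2, error: 3 };

function makeLogger(transports) {
  const dispatch = (level, message) => {
    for (const t of transports) {
      if (SEVERITY[level] >= SEVERITY[t.level]) t.sink({ level, message });
    }
  };
  return {
    debug: m => dispatch("debug", m),
    info: m => dispatch("info", m),
    error: m => dispatch("error", m),
  };
}

// Example: the console sink sees info and above, the pager sink only errors.
const pagerEvents = [];
const logger = makeLogger([
  { level: "info", sink: e => console.log(e.level, e.message) },
  { level: "error", sink: e => pagerEvents.push(e) },
]);

logger.info("WDF not activated yet, skipping"); // logged, does not page
logger.error("bot is out of funds");            // logged and pages
```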

@nicholaspai (Member Author)

> Looks good! As discussed IRL, we need to think about whether we always want to be sending out these info messages. We really just need to know when the initial WDF liquidation is sent out and whether the bot is running low on funds / can't extend a liquidation. The way this is set up, we'll see this info message every 5 mins, which I think is fine, but might be a bit noisy.

Good point. Given the constraint that this bot runs serverlessly, this is a bit tricky. Thoughts on changing this to a debug log vs. keeping it as an info? I'm not sure what we'd prefer as a team, but I kind of like the idea of having more information than not.

nicholaspai and others added 3 commits January 20, 2021 10:15
Co-authored-by: Chris Maree <christopher.maree@gmail.com>
Signed-off-by: Nick Pai <npai.nyc@gmail.com>
Signed-off-by: Nick Pai <npai.nyc@gmail.com>
Signed-off-by: Nick Pai <npai.nyc@gmail.com>
@nicholaspai nicholaspai changed the title improve(liquidator): Send INFO log when WDF is purposefully not activated improve(liquidator): Send log when WDF is purposefully not activated Jan 20, 2021
Signed-off-by: Nick Pai <npai.nyc@gmail.com>
```diff
   liquidatorConfig,
-  liquidatorOverridePrice
+  liquidatorOverridePrice,
+  startingBlock,
```
Member


As discussed on Slack, you could consider putting these within the liquidatorConfig.

Member Author


I'm going to pack startingBlock and endingBlock into liquidatorConfig before creating the Liquidator.js object, but I still want to read those vars from the environment because the serverless hub conveniently sets them there.
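The packing described above can be sketched as follows (hedged: the environment-variable names here are illustrative placeholders, not necessarily the ones the serverless hub actually sets):

```javascript
// Illustrative only: read the block window from the environment, then merge it
// into the base liquidator config before constructing the Liquidator object.
function buildLiquidatorConfig(baseConfig, env) {
  // Convert to numbers only when the variables are actually set, so an unset
  // window leaves the keys undefined rather than NaN.
  const startingBlock =
    env.STARTING_BLOCK_NUMBER !== undefined ? Number(env.STARTING_BLOCK_NUMBER) : undefined;
  const endingBlock =
    env.ENDING_BLOCK_NUMBER !== undefined ? Number(env.ENDING_BLOCK_NUMBER) : undefined;
  return { ...baseConfig, startingBlock, endingBlock };
}
```

With this shape the Liquidator constructor only needs the single config object, while the hub keeps setting plain environment variables.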

Member Author


See this commit for the proposed changes in response to your comment.

Signed-off-by: Nick Pai <npai.nyc@gmail.com>
@daywiss (Contributor) left a comment


Looks good to me, I just have questions about how the starting and ending block get set. Is this something that gets persisted in GCP somehow and set dynamically in Cloud Run, or what?

@chrismaree (Member) left a comment


This looks great! Thanks for refining this & addressing all the nits.

```javascript
};

// Add block window into `liquidatorConfig`
let overriddenLiquidatorConfig = {
```
@chrismaree (Member) commented Jan 21, 2021


Nit: does this need to be a new variable? Could you not just re-assign liquidatorConfig?

```javascript
liquidatorConfig = { ...liquidatorConfig, startingBlock, endingBlock };
```

The only reason I bring this up is that it is more consistent with how I was trying to override the liquidator config in the perpetual generalization.

Member Author


Good call. I originally had it set this way but ran into an error because I redeclared `liquidatorConfig` with a `let` statement.
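The error in question is JavaScript's block-scope rule: `let` cannot redeclare an existing binding in the same scope, while plain reassignment is fine. A minimal sketch:

```javascript
let liquidatorConfig = { logLevel: "info" };

// Redeclaring with `let` in the same scope fails at parse time:
// let liquidatorConfig = { ...liquidatorConfig, startingBlock: 100 };
//     ^ SyntaxError: Identifier 'liquidatorConfig' has already been declared

// Plain reassignment works and keeps the existing keys:
liquidatorConfig = { ...liquidatorConfig, startingBlock: 100, endingBlock: 200 };
```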

@nicholaspai (Member Author)

> Looks good to me, I just have questions about how the starting and ending block get set. Is this something that gets persisted in GCP somehow and set dynamically in Cloud Run, or what?

Yep, look at how the Hub sets the environment here.

Signed-off-by: Nick Pai <npai.nyc@gmail.com>
```javascript
  fromBlock: this.startingBlock,
  toBlock: this.endingBlock
});
if (blockWindowHasRequestedWithdraw) {
```
Member


Tiny nit: this could be done more cleanly using a simple ternary operator.

```javascript
const logLevel = blockWindowHasRequestedWithdraw ? "info" : "debug";
```

Feel free to ignore as well, this is more personal preference.
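With the ternary, the chosen level can also be used to index into the logger, assuming a winston-style logger where `logger.info` and `logger.debug` are methods (a sketch under that assumption, not the PR's final code; the stub logger and message text are illustrative):

```javascript
// Stub logger standing in for the bot's real logger; records calls so the
// routing is easy to see.
const calls = [];
const logger = {
  info: payload => calls.push({ level: "info", payload }),
  debug: payload => calls.push({ level: "debug", payload }),
};

function logWithdrawCheck(logger, blockWindowHasRequestedWithdraw) {
  // Pick the level once, then make a single log call instead of branching.
  const logLevel = blockWindowHasRequestedWithdraw ? "info" : "debug";
  logger[logLevel]({
    at: "Liquidator",
    message: "Checked for withdraw requests within block window",
  });
}

logWithdrawCheck(logger, true);  // routed to logger.info
logWithdrawCheck(logger, false); // routed to logger.debug
```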
