
Call log.warning instead of raising PluginWarning#1723

Merged
liiight merged 9 commits into Flexget:develop from jonathanou:fix_delete
Mar 15, 2017

Conversation

@jonathanou
Contributor

Motivation for changes:

I use the delete plugin to clean folders older than 30 days. The delete plugin has a feature to clean the source folder (clean_source: [folder size]) once the folder is under a certain size. When using this feature, I often run into this scenario:

  1. Folder contains 2 files, A.txt and B.txt, both 0.5 MB large.
  2. The delete plugin specifies clean_source: 1, so it will clean the source folder once the folder size is less than 1 MB.
  3. The delete plugin accepts A.txt, B.txt, as well as other files in other folders as its list of entries to delete.
  4. The delete plugin removes A.txt, the folder size is now 0.5 MB.
  5. Immediately after removing A.txt, the plugin checks if the folder size is less than 1 MB. It is, so it removes the entire folder (including B.txt).
  6. The delete plugin moves on to remove B.txt, but it does not exist anymore.
  7. A PluginWarning is raised claiming it can't find B.txt, which aborts the task.
  8. Subsequent files that were scheduled to be deleted are not processed.

Detailed changes:

  • Instead of raising PluginWarning in move.py when the code detects something is wrong, we call log.warning so the message is still logged, but the plugin can move on and process the next file.
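The change described above can be sketched in isolation. This is a simplified stand-in for the real loop in move.py, not FlexGet's actual code; the entry structure, helper names, and `clean_source_size_mb` parameter are assumptions for illustration:

```python
import logging
import os
import shutil

log = logging.getLogger('delete')


def folder_size_mb(folder):
    """Total size of all files under folder, in MB."""
    total = sum(
        os.path.getsize(os.path.join(root, name))
        for root, _, files in os.walk(folder)
        for name in files
    )
    return total / (1024 * 1024)


def process_entries(entries, clean_source_size_mb=None):
    """Delete each entry's file, then clean the source folder if it has
    shrunk below the threshold. A missing file is logged and skipped
    instead of raising PluginWarning, which would abort the whole task."""
    for entry in entries:
        path = entry['location']
        if not os.path.exists(path):
            # Previously: raise plugin.PluginWarning(...) -- this stopped
            # all subsequent entries from being processed.
            log.warning('Could not find `%s`, skipping', path)
            continue
        folder = os.path.dirname(path)
        os.remove(path)
        if (clean_source_size_mb is not None
                and os.path.isdir(folder)
                and folder_size_mb(folder) < clean_source_size_mb):
            shutil.rmtree(folder)
```

Replaying the scenario from the description (two 0.5 MB files, `clean_source: 1`), the folder is removed right after A.txt, and the B.txt iteration now logs a warning and moves on instead of killing the task.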

Jonathan Ouyang and others added 9 commits January 15, 2017 18:02
bt.hliang.com provides Asian TV show torrents.  However, it does not support RSS feeds.  To auto-download shows, use the http plugin to download the homepage of bt.hliang.com, and this URL rewrite plugin will do the following:

1.  Go through each entry URL and follow the link to open the details page.
2.  In the details page, search for the actual torrent download URL (any link that contains "down.php?[unique id]").
3.  Return the URL if found, or raise UrlRewritingError otherwise.
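Step 2 hinges on the regex fix mentioned in the next commit: `.` and `?` are regex metacharacters and must be escaped to match literally. A minimal sketch, assuming a plain `href="..."` layout on the details page (the real page markup may differ):

```python
import re

# 'down\.php\?' matches the literal string "down.php?"; unescaped, '.'
# would match any character and '?' would make the preceding 'p' optional.
DOWNLOAD_RE = re.compile(r'href="([^"]*down\.php\?[^"]*)"')


def find_download_url(details_html):
    """Return the first torrent download URL in the details page HTML,
    or None if no matching link is found."""
    match = DOWNLOAD_RE.search(details_html)
    return match.group(1) if match else None
```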
Forgot to escape '.' and '?' for the "down.php?" regex pattern.
1.  Wrapped requests.get() call inside try-except
2.  Changed plugin registration to specify "interfaces" instead of "groups"

Additional changes:
1.  Removed unused urllib2 import.
2.  Grouped import statements.
When making request calls, catch request specific exceptions and log the error before raising UrlRewritingError.

For the soup library call, it is not immediately clear what exception types it is guaranteed to throw, so it is left unchanged.
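The request-handling pattern these commits describe looks roughly like this. The function name and error class here are illustrative stand-ins (FlexGet's actual `UrlRewritingError` lives in its urlrewriting plugin and the plugin uses FlexGet's own session object), but the catch-log-reraise shape is what the commit messages describe:

```python
import logging

import requests
from requests.exceptions import RequestException

log = logging.getLogger('bt_hliang')


class UrlRewritingError(Exception):
    """Stand-in for FlexGet's URL-rewriting error class."""


def fetch_details_page(url):
    """Fetch a details page, catching only request-specific exceptions
    (RequestException covers timeouts, connection errors, HTTP errors)
    and logging before re-raising as the plugin-level error."""
    try:
        response = requests.get(url, timeout=30)
        response.raise_for_status()
    except RequestException as err:
        log.error('Unable to fetch %s: %s', url, err)
        # Store the message, not the exception object (see next commit).
        raise UrlRewritingError(str(err))
    return response.text
```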
Change from log.exception to log.error as requested in the pull request.  Also fixed how the arguments are passed: the message takes a format string, and the arguments that follow are the values to substitute.  The previous implementation was incorrect.
Instead of storing the actual exception, store the exception message in UrlRewritingError. Printing the exception includes the traceback, which is a little verbose for log files.  Showing just the exception message gives more meaningful info.
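The logging-argument fix above is a common stdlib-logging pitfall, so a small demonstration may help (the function and message text here are hypothetical):

```python
import logging

log = logging.getLogger('urlrewrite_demo')


def report_rewrite_failure(url, err_msg):
    # Correct: pass a format string plus the values as separate arguments;
    # the logging module substitutes them lazily when the record is emitted.
    log.error('Unable to rewrite %s: %s', url, err_msg)
    # Incorrect (the shape of the earlier bug): passing extra positional
    # arguments without matching %s placeholders, e.g.
    #   log.error('Unable to rewrite', url, err_msg)
    # produces a formatting error instead of the intended message.
```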
Flexget will stop the entire task when the code throws a PluginWarning exception.  This prevents the task from processing subsequent files.

By calling log.warning and continuing to the next iteration, subsequent files can be processed.
@liiight
Member

liiight commented Mar 10, 2017

It sounds like the correct approach would be to break the loop instead of just logging a warning

@jonathanou
Contributor Author

But breaking the loop stops other files from being processed. If I'm deleting a bunch of files and one of them doesn't exist on disk anymore, why should that stop the other files from being deleted?

@liiight
Member

liiight commented Mar 10, 2017

Maybe I misunderstood the scenario. You describe a situation where during iteration over files in a folder, that folder gets deleted, and trying to operate on subsequent files in the loop cannot work, right?
If so, when deleting the folder the loop needs to be handled correctly.

If I didn't understand this correctly, I apologize.

@jonathanou
Contributor Author

Yes, that's right. Here's a more concrete example:

  1. I have clean_source: 1 specified (delete the folder if less than 1 MB).
  2. I'm trying to delete the following files Folder_A/A.txt, Folder_A/B.txt, and Folder_B/C.txt.
  3. The loop deletes Folder_A/A.txt, then checks if Folder_A is less than 1 MB. Let's say it is, and so it deletes Folder_A.
  4. The loop iterates to Folder_A/B.txt, but finds that it does not exist on disk anymore. The current implementation will raise PluginWarning and the entire task aborts.
  5. Folder_B/C.txt never gets processed.

I feel that the loop should carry on to the next iteration regardless of whether it can perform the operation on the current item. Even if deleting the folder is handled gracefully by the loop, there could still be other reasons for the delete to fail, such as a permission-denied error.

I think the easiest way is to log a warning and continue, like what I have proposed.
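The behavioral difference the two sides of this discussion hinge on can be shown in isolation. These two toy functions are illustrative, not FlexGet code; `existing` stands in for the filesystem:

```python
import logging

log = logging.getLogger('demo')


def delete_all_raise(paths, existing):
    """Raising on the first missing file aborts the whole batch,
    mirroring the old PluginWarning behavior."""
    deleted = []
    for path in paths:
        if path not in existing:
            raise RuntimeError('could not find %s' % path)
        existing.discard(path)
        deleted.append(path)
    return deleted


def delete_all_warn(paths, existing):
    """Logging and continuing still processes every remaining file,
    mirroring the proposed log.warning behavior."""
    deleted = []
    for path in paths:
        if path not in existing:
            log.warning('could not find %s, skipping', path)
            continue
        existing.discard(path)
        deleted.append(path)
    return deleted
```

With `B` missing (as in the Folder_A example), the raising version never reaches `C`, while the warning version deletes both `A` and `C`.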

@liiight
Member

liiight commented Mar 10, 2017

The easiest way and the best way are not always interchangeable. That being said, I'm not sure you're wrong. I'd rather wait for some more feedback on this.

@jonathanou
Contributor Author

Was there anyone you wanted to include to discuss this?

@liiight
Member

liiight commented Mar 13, 2017

@cvium and @paranoidi maybe. I imagine they saw this already

@gazpachoking
Member

This change seems good to me. We don't want errors handling one entry to stop all subsequent entries from processing. I'm a bit confused by all the extra commits in this PR, but I suppose they don't matter as we can just squash it down when merging.

@liiight liiight merged commit cddaef4 into Flexget:develop Mar 15, 2017
@jonathanou
Contributor Author

The extra commits in the PR were for a separate PR that was already merged back in January. I reused the same fork to add this change.

I'm quite new to Git PRs so I'm probably not doing it right. Is it recommended to delete the fork once a PR is merged, and then later re-fork if you want to create another PR? Or is there a way to reuse the fork but keep the commit history clean?

@jonathanou jonathanou deleted the fix_delete branch March 16, 2017 01:31
@liiight
Member

liiight commented Mar 16, 2017

Best practice is to keep your fork in sync and open a branch per feature/PR.
