Add 'Schedule' buttons in the Jobs page
This PR adds a schedule endpoint based on #256. Note that the 'Cancel' and 'Schedule' buttons are still not visible under Python 3 due to #312.
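For context, a 'Schedule' button like this would presumably re-submit a job through Scrapyd's existing `schedule.json` API. The sketch below only builds the request URL and form body; the host/port and the project/spider names are illustrative assumptions, not values from this PR.

```python
# Minimal sketch (assumptions: default Scrapyd host/port, example names).
# Scrapyd's schedule.json endpoint takes a POST with 'project' and 'spider'.
from urllib.parse import urlencode

def build_schedule_request(project, spider, base_url="http://localhost:6800"):
    """Return the (url, body) pair for a schedule.json POST."""
    url = f"{base_url}/schedule.json"
    body = urlencode({"project": project, "spider": spider})
    return url, body

url, body = build_schedule_request("myproject", "myspider")
print(url)   # http://localhost:6800/schedule.json
print(body)  # project=myproject&spider=myspider
```

Sending this with any HTTP client (or the browser form behind the button) is what actually enqueues the job.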
Codecov Report
Merging #321 into master will decrease coverage by 0.45%. The diff coverage is 11.11%.
```diff
@@            Coverage Diff             @@
##           master     #321      +/-   ##
==========================================
- Coverage   68.37%   67.91%   -0.46%
==========================================
  Files          16       16
  Lines         819      826       +7
  Branches       96       98       +2
==========================================
+ Hits          560      561       +1
- Misses        230      236       +6
  Partials       29       29
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| scrapyd/website.py | 56.43% <11.11%> (-3.14%) | :arrow_down: |
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Powered by Codecov. Last update 5298f9b...c0a9b06. Read the comment docs.
@Digenis, What do you think about the added 'Schedule' button?
There's no way to add arguments and settings, and because the button is taken from the finished jobs list, it creates the false impression that the job will be re-run with the same arguments.
The reason I wrote the cancel shortcut is that the solution suits any user.
If we add a form for passing arguments, we are getting into the scope of other tools such as scrapyweb. What do you think?
Not every spider run needs arguments and settings from the command line, and it's convenient to repeat crawling jobs with just a click.
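To make the trade-off concrete: a scheduling form would have to cover what `schedule.json` already accepts, namely `setting=KEY=VALUE` entries plus arbitrary spider arguments. The helper below is a hypothetical sketch of building such a request body; the setting and argument names are examples, not part of this PR.

```python
# Hypothetical sketch of the form data a full scheduling UI would submit.
# schedule.json accepts repeated 'setting' entries and arbitrary spider args;
# the specific names/values below are illustrative assumptions.
from urllib.parse import urlencode

def build_schedule_body(project, spider, settings=None, **spider_args):
    """Build a schedule.json POST body with optional settings and spider args."""
    params = [("project", project), ("spider", spider)]
    for key, value in (settings or {}).items():
        params.append(("setting", f"{key}={value}"))
    params.extend(spider_args.items())
    return urlencode(params)

body = build_schedule_body(
    "myproject", "myspider",
    settings={"DOWNLOAD_DELAY": "2"},
    start_url="http://example.com",
)
```

A plain 'Schedule' button sidesteps all of this by sending only `project` and `spider`, which is why it stays within scrapyd's scope.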
PS: typo for scrapydweb
I share the concerns of @Digenis in https://github.com/scrapy/scrapyd/pull/321#issuecomment-485448325