Fix broken examples #753
Merged
samuelcolvin merged 2 commits into pydantic:master on Aug 17, 2019
Conversation
Codecov Report

@@            Coverage Diff            @@
##           master    #753    +/-   ##
=======================================
  Coverage     100%    100%
=======================================
  Files          15      15
  Lines        2723    2723
  Branches      536     536
=======================================
  Hits         2723    2723
Member

thanks, as you say I think we should run all example scripts in CI. The simplest way of denoting if they should fail would be either to include
alexdrydew pushed a commit to alexdrydew/pydantic that referenced this pull request on Dec 23, 2023
Change Summary
Fixes a handful of broken examples in the docs, and adds a make recipe, test-examples, that can be used to identify which examples are broken. Some of the examples explicitly claim they will error, so I didn't set up any sort of pass/fail test, and didn't add the recipe to CI or similar. But at least with this recipe you can quickly get a list of the failures and confirm there are no surprises.
I suppose we could add validation that the failures are exactly the files we expect, but I figured I'd keep this change small.
Checklist
Unit tests for the changes exist
N/A ?
Tests pass on CI and coverage remains at 100%
Documentation reflects the changes where applicable
changes/<pull request or issue id>-<github username>.rst file added describing change (see changes/README.md for details)