[WIP] Fix additionalProperties dict-schema formatting tests and add Copilot setup steps #1750
Conversation
@copilot try again

@copilot try again
Co-authored-by: alexmojaki <3627481+alexmojaki@users.noreply.github.com>
Deploying logfire-docs with Cloudflare Pages

| Latest commit: | `fd29786` |
| Status: | ✅ Deploy successful! |
| Preview URL: | https://20c54e50.logfire-docs.pages.dev |
| Branch Preview URL: | https://copilot-sub-pr-1738.logfire-docs.pages.dev |
Implemented in be9b1c6 (with the main test update in c4b3eac).

⚠️ Warning: Firewall rules blocked me from connecting to one or more addresses. I tried to connect to the following addresses, but was blocked by firewall rules:

If you need me to access, download, or install something from one of these locations, you can either:

@copilot open a new PR which adds a copilot-setup-steps.yml file that allows installing precommit
Implemented in fd29786. Added `.github/workflows/copilot-setup-steps.yml`.
```diff
 def test_homogeneous_dict_formatting(capsys: pytest.CaptureFixture[str]):
     """Test that homogeneous dicts with additionalProperties are formatted correctly."""
     logfire.configure(
         send_to_logfire=False,
         console=logfire.ConsoleOptions(verbose=True, colors='never', include_timestamps=False),
     )

+    # Use >10 keys to exercise the additionalProperties path
     d = {f'date_{i}': datetime(2023, 1, i + 1) for i in range(11)}
-    logfire.info('dates {d=}', d=d)
+    assert create_json_schema(d, set()) == {
+        'type': 'object',
+        'additionalProperties': {'type': 'string', 'format': 'date-time', 'x-python-datatype': 'datetime'},
+    }

-    assert exporter.exported_spans_as_dict(parse_json_attributes=True) == snapshot(
+    logfire.info('dates', d=d)
+    assert capsys.readouterr().out.splitlines() == snapshot(
         [
-            {
-                'name': 'dates {d=}',
-                'context': {'trace_id': 1, 'span_id': 1, 'is_remote': False},
-                'parent': None,
-                'start_time': 1000000000,
-                'end_time': 1000000000,
-                'attributes': {
-                    'logfire.span_type': 'log',
-                    'logfire.level_num': 9,
-                    'logfire.msg_template': 'dates {d=}',
-                    'logfire.msg': "dates d={'date_0': datetime.datetime(2023, 1, 1, 0, 0), 'date_1': date...1, 10, 0, 0), 'date_10': datetime.datetime(2023, 1, 11, 0, 0)}",
-                    'code.filepath': 'test_json_args.py',
-                    'code.function': 'test_homogeneous_dict_formatting',
-                    'code.lineno': 123,
-                    'd': {
-                        'date_0': '2023-01-01T00:00:00',
-                        'date_1': '2023-01-02T00:00:00',
-                        'date_2': '2023-01-03T00:00:00',
-                        'date_3': '2023-01-04T00:00:00',
-                        'date_4': '2023-01-05T00:00:00',
-                        'date_5': '2023-01-06T00:00:00',
-                        'date_6': '2023-01-07T00:00:00',
-                        'date_7': '2023-01-08T00:00:00',
-                        'date_8': '2023-01-09T00:00:00',
-                        'date_9': '2023-01-10T00:00:00',
-                        'date_10': '2023-01-11T00:00:00',
-                    },
-                    'logfire.json_schema': {
-                        'type': 'object',
-                        'properties': {
-                            'd': {
-                                'type': 'object',
-                                'additionalProperties': {
-                                    'type': 'string',
-                                    'format': 'date-time',
-                                    'x-python-datatype': 'datetime',
-                                },
-                            }
-                        },
-                    },
-                },
-            },
+            'dates',
+            IsStr(regex=r'^│ tests/test_json_args.py:\d+ info$'),
+            '│ d={',
+            "│     'date_0': datetime.datetime(2023, 1, 1, 0, 0),",
+            "│     'date_1': datetime.datetime(2023, 1, 2, 0, 0),",
+            "│     'date_2': datetime.datetime(2023, 1, 3, 0, 0),",
+            "│     'date_3': datetime.datetime(2023, 1, 4, 0, 0),",
+            "│     'date_4': datetime.datetime(2023, 1, 5, 0, 0),",
+            "│     'date_5': datetime.datetime(2023, 1, 6, 0, 0),",
+            "│     'date_6': datetime.datetime(2023, 1, 7, 0, 0),",
+            "│     'date_7': datetime.datetime(2023, 1, 8, 0, 0),",
+            "│     'date_8': datetime.datetime(2023, 1, 9, 0, 0),",
+            "│     'date_9': datetime.datetime(2023, 1, 10, 0, 0),",
+            "│     'date_10': datetime.datetime(2023, 1, 11, 0, 0),",
+            '│   }',
         ]
     )
```
🚩 Test no longer verifies exported span structure with additionalProperties
The old `test_homogeneous_dict_formatting` verified the full exported span dict structure, including `logfire.json_schema` with `additionalProperties` nested inside it (verifying the end-to-end pipeline from schema generation through span export).

The new test splits concerns: it directly asserts that `create_json_schema(d, set())` produces the correct schema, then separately checks console output formatting. This is a reasonable refactor, but the test no longer verifies that the JSON schema is correctly attached to the span attributes in `logfire.json_schema` after going through the full `logfire.info()` → export pipeline.

The new `test_medium_dict_schema` and related tests verify schema generation in isolation, and the console output test verifies formatting, but the integration between schema generation and span attribute attachment for `additionalProperties` schemas is no longer explicitly tested end-to-end.
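For context, the homogeneous-dict collapse these tests exercise can be sketched independently of logfire's internals. This is a minimal, hypothetical version of the inference: `infer_value_schema`, `infer_dict_schema`, and the `max_properties=10` threshold are illustrative names and assumptions, not logfire's actual API.

```python
from datetime import datetime


def infer_value_schema(value):
    """Map a Python value to a minimal JSON-schema fragment (illustrative only)."""
    if isinstance(value, datetime):
        return {'type': 'string', 'format': 'date-time', 'x-python-datatype': 'datetime'}
    if isinstance(value, str):
        return {'type': 'string'}
    if isinstance(value, bool):  # check bool before int: bool is an int subclass
        return {'type': 'boolean'}
    if isinstance(value, (int, float)):
        return {'type': 'number'}
    return {}


def infer_dict_schema(d, max_properties=10):
    """Collapse a large dict with homogeneous value schemas into a single
    'additionalProperties' schema, instead of listing every key under 'properties'."""
    value_schemas = [infer_value_schema(v) for v in d.values()]
    if len(d) > max_properties and value_schemas and all(s == value_schemas[0] for s in value_schemas):
        return {'type': 'object', 'additionalProperties': value_schemas[0]}
    return {'type': 'object', 'properties': dict(zip(d, value_schemas))}


# 11 datetime values (> 10 keys) collapse into one additionalProperties schema
d = {f'date_{i}': datetime(2023, 1, i + 1) for i in range(11)}
assert infer_dict_schema(d) == {
    'type': 'object',
    'additionalProperties': {'type': 'string', 'format': 'date-time', 'x-python-datatype': 'datetime'},
}

# A small or heterogeneous dict keeps an explicit 'properties' map
assert 'properties' in infer_dict_schema({'a': 1, 'b': 'x'})
```

The point the review raises is orthogonal to this schema-generation logic: even if the collapse is correct in isolation, an end-to-end test is still needed to show the resulting schema lands in the span's `logfire.json_schema` attribute.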
@copilot keep the old part of the test, just add to it
@copilot under what conditions can you open a fresh PR from main? do i have to open an issue and assign it to you?
I can’t open a brand-new PR from `main` on my own.
This PR now covers two related follow-ups from review feedback:

**additionalProperties formatting validation**

Updated `tests/test_json_args.py` so `test_homogeneous_dict_formatting` verifies real verbose console formatting output for homogeneous dicts that use `additionalProperties`, including `datetime.datetime(...)` in console output.

**Copilot environment setup support**

Added `.github/workflows/copilot-setup-steps.yml` to provide a minimal Copilot setup workflow. It installs `uv`, syncs dependencies, and verifies `pre-commit` is available.

Targeted tests and checks were run for the modified area, and code review/security scans were completed.
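A workflow along these lines would match that description. This is a sketch, not the verified contents of the file: the exact steps and action versions are assumptions, though GitHub's Copilot coding agent does require the job to be named `copilot-setup-steps`.

```yaml
name: "Copilot Setup Steps"

on: workflow_dispatch

jobs:
  # The job name "copilot-setup-steps" is required by the Copilot coding agent
  copilot-setup-steps:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5

      - name: Sync dependencies
        run: uv sync

      - name: Verify pre-commit is available
        run: uv run pre-commit --version
```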