Fix container tests in run_tests.sh use hard-coded DV_VERSION #225
Draft: JR-1991 wants to merge 12 commits into main from fix-container-tests
Conversation
Standardized quoting in docker-compose-base.yml and removed unused version fields. Enhanced unit test service in docker-compose-test-all.yml to dynamically fetch Dataverse version, install required dependencies, and improve readiness checks for Dataverse before running tests.
Introduces DV_VERSION=6.7.1 to local-test.env for use as the default Dataverse version, which will be dynamically detected in tests.
Enhanced messaging and status reporting for test containers, including better wait feedback, exit code handling, and conditional log output. The script now distinguishes between missing log files and container logs, and provides clearer start/stop notifications for containers.
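The exact logic lives in docker-compose-test-all.yml and the test startup script, but a minimal sketch of the approach described in these commits (wait for Dataverse, derive DV_VERSION from its /api/info/version endpoint, then run pytest) could look like the following. The dataverse hostname and port, the python3-based JSON parsing, and the dependency installation step are assumptions for illustration, not the exact commands from this PR.

```bash
#!/bin/sh
# Hypothetical sketch of the test container's startup logic, not the exact script from this PR.

DATAVERSE_URL="${DATAVERSE_URL:-http://dataverse:8080}"   # assumed service name and port

# Wait until the Dataverse API answers before running tests.
until curl -sf "${DATAVERSE_URL}/api/info/version" >/dev/null; do
  echo "Waiting for Dataverse at ${DATAVERSE_URL}..."
  sleep 5
done

# Derive DV_VERSION from the running instance instead of hard-coding it.
DV_VERSION=$(curl -s "${DATAVERSE_URL}/api/info/version" \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["data"]["version"])')
export DV_VERSION
echo "Detected Dataverse version: ${DV_VERSION}"

# Install test dependencies (the exact dependency set / extra name is an assumption).
python3 -m pip install --quiet -e ".[dev]"

python3 -m pytest tests/ -v
```

With this pattern, the DV_VERSION in local-test.env acts only as a fallback; the value reported by the live instance takes precedence.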
pdurbin reviewed Sep 24, 2025
printf "\n🧹 Stopping containers...\n"
docker compose \
    -f docker/docker-compose-base.yml \
    -f ./docker/docker-compose-test-all.yml \
I'm getting this error:
pdurbin@beamish pyDataverse % ./run-tests.sh
⚠️ No Python version specified falling back to '3.11'
🚀 Preparing containers
Using PYTHON_VERSION=3.11
🚀 Starting all containers...
The test container will wait for Dataverse and fetch the version automatically
[+] Running 10/10
✔ Network pydataverse_dataverse Created 0.0s
✔ Network pydataverse_default Created 0.0s
✔ Container dv_initializer Exited 1.4s
✔ Container solr_initializer Exited 0.8s
✔ Container postgres Started 0.3s
✔ Container smtp Started 0.3s
✔ Container solr Started 0.9s
✔ Container dataverse Healthy 31.9s
✔ Container bootstrap Exited 47.0s
✔ Container unit-tests Started 47.0s
🔎 Running pyDataverse tests
Test container will handle version detection and testing...
⏳ Waiting for tests to complete... (1)
⏳ Waiting for tests to complete... (2)
⏳ Waiting for tests to complete... (3)
⏳ Waiting for tests to complete... (4)
⏳ Waiting for tests to complete... (5)
⏳ Waiting for tests to complete... (6)
⏳ Waiting for tests to complete... (7)
⏳ Waiting for tests to complete... (8)
📋 Unit tests completed with exit code: 1
❌ Unit tests failed. Showing test results...
=== PYTEST OUTPUT ===
============================= test session starts ==============================
platform linux -- Python 3.11.13, pytest-8.4.2, pluggy-1.6.0 -- /usr/local/bin/python3
cachedir: .pytest_cache
rootdir: /pydataverse
configfile: pyproject.toml
plugins: asyncio-1.2.0, cov-7.0.0, anyio-4.11.0
asyncio: mode=Mode.STRICT, debug=False, asyncio_default_fixture_loop_scope=None, asyncio_default_test_loop_scope=function
collecting ... collected 82 items
tests/api/test_access.py::TestDataAccess::test_get_data_by_id PASSED [ 1%]
tests/api/test_access.py::TestDataAccess::test_get_data_by_pid PASSED [ 2%]
tests/api/test_api.py::TestApiConnect::test_api_connect PASSED [ 3%]
tests/api/test_api.py::TestApiConnect::test_api_connect_base_url_wrong PASSED [ 4%]
tests/api/test_api.py::TestApiTokenAndAuthBehavior::test_api_token_none_and_auth_none PASSED [ 6%]
tests/api/test_api.py::TestApiTokenAndAuthBehavior::test_api_token_none_and_auth PASSED [ 7%]
tests/api/test_api.py::TestApiTokenAndAuthBehavior::test_api_token_and_auth PASSED [ 8%]
tests/api/test_api.py::TestApiTokenAndAuthBehavior::test_api_token_and_auth_none PASSED [ 9%]
tests/api/test_api.py::TestApiRequests::test_get_request PASSED [ 10%]
tests/api/test_api.py::TestApiRequests::test_get_dataverse PASSED [ 12%]
tests/api/test_api.py::TestApiToken::test_token_missing PASSED [ 13%]
tests/api/test_api.py::TestApiToken::test_token_empty_string PASSED [ 14%]
tests/api/test_api.py::TestApiToken::test_token_right_create_dataset_rights FAILED [ 15%]
tests/api/test_api.py::TestApiToken::test_token_should_not_be_exposed_on_error PASSED [ 17%]
tests/api/test_api.py::TestApiToken::test_using_auth_on_individual_requests_is_deprecated[True] PASSED [ 18%]
tests/api/test_api.py::TestApiToken::test_using_auth_on_individual_requests_is_deprecated[False] PASSED [ 19%]
tests/api/test_api.py::TestApiToken::test_using_auth_on_individual_requests_is_deprecated[api-token] PASSED [ 20%]
tests/api/test_api.py::TestApiToken::test_using_auth_on_individual_requests_is_deprecated[auth3] PASSED [ 21%]
tests/api/test_api.py::TestApiToken::test_using_auth_on_individual_requests_is_deprecated_unauthorized[True] PASSED [ 23%]
tests/api/test_api.py::TestApiToken::test_using_auth_on_individual_requests_is_deprecated_unauthorized[False] PASSED [ 24%]
tests/api/test_api.py::TestApiToken::test_using_auth_on_individual_requests_is_deprecated_unauthorized[api-token] PASSED [ 25%]
tests/api/test_api.py::TestApiToken::test_using_auth_on_individual_requests_is_deprecated_unauthorized[auth3] PASSED [ 26%]
tests/api/test_api.py::TestApiToken::test_sword_api_requires_http_basic_auth PASSED [ 28%]
tests/api/test_api.py::TestApiToken::test_sword_api_can_authenticate PASSED [ 29%]
tests/api/test_api.py::TestApiToken::test_sword_api_cannot_authenticate_without_token PASSED [ 30%]
tests/api/test_async_api.py::TestAsyncAPI::test_async_api PASSED [ 31%]
tests/api/test_edit.py::TestEditDatasetMetadata::test_edit_dataset_metadata_replace PASSED [ 32%]
tests/api/test_edit.py::TestEditDatasetMetadata::test_edit_dataset_metadata_add PASSED [ 34%]
tests/api/test_upload.py::TestFileUpload::test_file_upload PASSED [ 35%]
tests/api/test_upload.py::TestFileUpload::test_file_upload_without_metadata PASSED [ 36%]
tests/api/test_upload.py::TestFileUpload::test_bulk_file_upload PASSED [ 37%]
tests/api/test_upload.py::TestFileUpload::test_file_replacement_wo_metadata FAILED [ 39%]
tests/api/test_upload.py::TestFileUpload::test_file_replacement_w_metadata FAILED [ 40%]
tests/auth/test_auth.py::TestApiTokenAuth::test_token_header_is_added_during_auth_flow PASSED [ 41%]
tests/auth/test_auth.py::TestApiTokenAuth::test_raise_if_token_is_not_str[123_0] PASSED [ 42%]
tests/auth/test_auth.py::TestApiTokenAuth::test_raise_if_token_is_not_str[non_str_token1] PASSED [ 43%]
tests/auth/test_auth.py::TestApiTokenAuth::test_raise_if_token_is_not_str[<lambda>] PASSED [ 45%]
tests/auth/test_auth.py::TestApiTokenAuth::test_raise_if_token_is_not_str[1.423] PASSED [ 46%]
tests/auth/test_auth.py::TestApiTokenAuth::test_raise_if_token_is_not_str[123_1] PASSED [ 47%]
tests/auth/test_auth.py::TestApiTokenAuth::test_raise_if_token_is_not_str[non_str_token5] PASSED [ 48%]
tests/auth/test_auth.py::TestBearerTokenAuth::test_authorization_header_is_added_during_auth_flow PASSED [ 50%]
tests/auth/test_auth.py::TestBearerTokenAuth::test_raise_if_token_is_not_str[123_0] PASSED [ 51%]
tests/auth/test_auth.py::TestBearerTokenAuth::test_raise_if_token_is_not_str[non_str_token1] PASSED [ 52%]
tests/auth/test_auth.py::TestBearerTokenAuth::test_raise_if_token_is_not_str[<lambda>] PASSED [ 53%]
tests/auth/test_auth.py::TestBearerTokenAuth::test_raise_if_token_is_not_str[1.423] PASSED [ 54%]
tests/auth/test_auth.py::TestBearerTokenAuth::test_raise_if_token_is_not_str[123_1] PASSED [ 56%]
tests/auth/test_auth.py::TestBearerTokenAuth::test_raise_if_token_is_not_str[non_str_token5] PASSED [ 57%]
tests/models/test_datafile.py::TestDatafileGeneric::test_datafile_set_and_get_valid PASSED [ 58%]
tests/models/test_datafile.py::TestDatafileGeneric::test_datafile_set_invalid PASSED [ 59%]
tests/models/test_datafile.py::TestDatafileGeneric::test_datafile_from_json_valid PASSED [ 60%]
tests/models/test_datafile.py::TestDatafileGeneric::test_datafile_from_json_invalid PASSED [ 62%]
tests/models/test_datafile.py::TestDatafileGeneric::test_datafile_to_json_valid PASSED [ 63%]
tests/models/test_datafile.py::TestDatafileGeneric::test_datafile_to_json_invalid PASSED [ 64%]
tests/models/test_datafile.py::TestDatafileGeneric::test_datafile_validate_json_valid PASSED [ 65%]
tests/models/test_datafile.py::TestDatafileGeneric::test_datafile_validate_json_invalid PASSED [ 67%]
tests/models/test_datafile.py::TestDatafileSpecific::test_datafile_init_valid PASSED [ 68%]
tests/models/test_datafile.py::TestDatafileSpecific::test_datafile_init_invalid PASSED [ 69%]
tests/models/test_datafile.py::TestDatafileGenericTravisNot::test_dataverse_from_json_to_json_valid PASSED [ 70%]
tests/models/test_dataset.py::TestDatasetGeneric::test_dataset_set_and_get_valid PASSED [ 71%]
tests/models/test_dataset.py::TestDatasetGeneric::test_dataset_set_invalid PASSED [ 73%]
tests/models/test_dataset.py::TestDatasetGeneric::test_dataset_validate_json_valid PASSED [ 74%]
tests/models/test_dataset.py::TestDatasetSpecific::test_dataset_from_json_valid PASSED [ 75%]
tests/models/test_dataset.py::TestDatasetSpecific::test_dataset_to_json_valid PASSED [ 76%]
tests/models/test_dataset.py::TestDatasetSpecific::test_dataset_init_valid PASSED [ 78%]
tests/models/test_dataset.py::TestDatasetSpecific::test_dataset_init_invalid PASSED [ 79%]
tests/models/test_dataset.py::TestDatasetSpecific::test_dataset_from_json_invalid PASSED [ 80%]
tests/models/test_dataset.py::TestDatasetSpecific::test_dataset_to_json_invalid PASSED [ 81%]
tests/models/test_dataset.py::TestDatasetSpecific::test_dataset_validate_json_invalid PASSED [ 82%]
tests/models/test_dataset.py::TestDatasetSpecificTravisNot::test_dataset_to_json_from_json_valid PASSED [ 84%]
tests/models/test_dataverse.py::TestDataverseGeneric::test_dataverse_set_and_get_valid PASSED [ 85%]
tests/models/test_dataverse.py::TestDataverseGeneric::test_dataverse_set_invalid PASSED [ 86%]
tests/models/test_dataverse.py::TestDataverseGeneric::test_dataverse_from_json_valid PASSED [ 87%]
tests/models/test_dataverse.py::TestDataverseGeneric::test_dataverse_from_json_invalid PASSED [ 89%]
tests/models/test_dataverse.py::TestDataverseGeneric::test_dataverse_to_json_valid PASSED [ 90%]
tests/models/test_dataverse.py::TestDataverseGeneric::test_dataverse_to_json_invalid PASSED [ 91%]
tests/models/test_dataverse.py::TestDataverseGeneric::test_dataverse_validate_json_valid PASSED [ 92%]
tests/models/test_dataverse.py::TestDataverseGeneric::test_dataverse_validate_json_invalid PASSED [ 93%]
tests/models/test_dataverse.py::TestDataverseSpecific::test_dataverse_init_valid PASSED [ 95%]
tests/models/test_dataverse.py::TestDataverseSpecific::test_dataverse_init_invalid PASSED [ 96%]
tests/models/test_dataverse.py::TestDataverseGenericTravisNot::test_dataverse_from_json_to_json_valid PASSED [ 97%]
tests/models/test_dvobject.py::TestDVObject::test_dataverse_init PASSED [ 98%]
tests/utils/test_utils.py::TestUtilsSaveTreeData::test_dataverse_tree_walker_valid_default PASSED [100%]
=================================== FAILURES ===================================
_____________ TestApiToken.test_token_right_create_dataset_rights ______________
tests/api/test_api.py:184: in test_token_right_create_dataset_rights
assert resp.json()["status"] == "OK"
E AssertionError: assert 'ERROR' == 'OK'
E
E - OK
E + ERROR
----------------------------- Captured stdout call -----------------------------
Dataset with pid 'doi:10.5072/FK2/AMR3PV' created.
_______________ TestFileUpload.test_file_replacement_wo_metadata _______________
tests/api/test_upload.py:186: in test_file_replacement_wo_metadata
file_id = response.json()["data"]["files"][0]["dataFile"]["id"]
^^^^^^^^^^^^^^^^^^^^^^^
E KeyError: 'data'
_______________ TestFileUpload.test_file_replacement_w_metadata ________________
tests/api/test_upload.py:248: in test_file_replacement_w_metadata
file_id = response.json()["data"]["files"][0]["dataFile"]["id"]
^^^^^^^^^^^^^^^^^^^^^^^
E KeyError: 'data'
=============================== warnings summary ===============================
tests/api/test_api.py: 3 warnings
tests/api/test_upload.py: 56 warnings
/pydataverse/pyDataverse/api.py:244: DeprecationWarning: The auth parameter is deprecated. Please pass your auth arguments to the __init__ method instead.
warn(
tests/api/test_api.py::TestApiToken::test_token_right_create_dataset_rights
/pydataverse/pyDataverse/api.py:374: DeprecationWarning: The auth parameter is deprecated. Please pass your auth arguments to the __init__ method instead.
warn(
tests/api/test_api.py::TestApiToken::test_sword_api_requires_http_basic_auth
tests/api/test_api.py::TestApiToken::test_sword_api_can_authenticate
/pydataverse/pyDataverse/api.py:127: UserWarning: You provided both, an api_token and a custom auth method. We will only use the auth method.
warn(
tests/api/test_api.py::TestApiToken::test_sword_api_can_authenticate
tests/api/test_api.py::TestApiToken::test_sword_api_cannot_authenticate_without_token
tests/api/test_edit.py::TestEditDatasetMetadata::test_edit_dataset_metadata_replace
tests/api/test_edit.py::TestEditDatasetMetadata::test_edit_dataset_metadata_add
/pydataverse/pyDataverse/api.py:182: DeprecationWarning: The auth parameter is deprecated. Please pass your auth arguments to the __init__ method instead.
warn(
tests/api/test_edit.py::TestEditDatasetMetadata::test_edit_dataset_metadata_replace
tests/api/test_edit.py::TestEditDatasetMetadata::test_edit_dataset_metadata_add
/pydataverse/pyDataverse/api.py:312: DeprecationWarning: The auth parameter is deprecated. Please pass your auth arguments to the __init__ method instead.
warn(
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
================================ tests coverage ================================
_______________ coverage: platform linux, python 3.11.13-final-0 _______________
Name Stmts Miss Cover
-----------------------------------------------
pyDataverse/__init__.py 9 0 100%
pyDataverse/api.py 579 334 42%
pyDataverse/auth.py 19 0 100%
pyDataverse/exceptions.py 20 0 100%
pyDataverse/models.py 521 71 86%
pyDataverse/utils.py 216 140 35%
-----------------------------------------------
TOTAL 1364 545 60%
=========================== short test summary info ============================
FAILED tests/api/test_api.py::TestApiToken::test_token_right_create_dataset_rights - AssertionError: assert 'ERROR' == 'OK'
- OK
+ ERROR
FAILED tests/api/test_upload.py::TestFileUpload::test_file_replacement_wo_metadata - KeyError: 'data'
FAILED tests/api/test_upload.py::TestFileUpload::test_file_replacement_w_metadata - KeyError: 'data'
================== 3 failed, 79 passed, 68 warnings in 9.87s ===================
=== END PYTEST OUTPUT ===
🧹 Stopping containers...
[+] Running 10/10
✔ Container unit-tests Removed 0.1s
✔ Container smtp Removed 0.1s
✔ Container bootstrap Removed 0.0s
✔ Container dataverse Removed 10.2s
✔ Container postgres Removed 0.2s
✔ Container solr Removed 0.6s
✔ Container dv_initializer Removed 0.0s
✔ Container solr_initializer Removed 0.0s
✔ Network pydataverse_dataverse Removed 0.2s
✔ Network pydataverse_default Removed 0.4s
pdurbin@beamish pyDataverse %
Inserted response.raise_for_status() calls after API requests in test_upload.py to ensure HTTP errors are caught during testing. Also refactored some assert statements for improved readability.
Replaced status assertions with explicit error handling in test_api.py and test_upload.py. Now exceptions are raised with API error messages when responses are not OK, improving test reliability and clarity.
Replaces usage of 'response.ok' with 'response.is_success' for response validation. Refactors assertions to use single-line format with error messages as arguments for improved readability.
Added information about the location of Dataverse logs and guidance for sharing logs on Zulip in case of internal server errors.
Adds a step to save Dataverse container logs to dv/dataverse-logs.log after unit tests complete, aiding in debugging and post-test analysis.
Changed the dv volume mount in docker-compose-test-all.yml to use an absolute path based on ${PWD} for improved reliability and consistency.
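For reference, the log-saving step mentioned in the commits above can be expressed with the standard docker compose logs subcommand. This is only a sketch: the exact invocation in run-tests.sh may differ, and the --env-file argument is an assumption.

```bash
# Sketch: persist the Dataverse container logs for post-test analysis.
# File paths mirror those mentioned in the commits above.
mkdir -p dv
docker compose \
    -f docker/docker-compose-base.yml \
    -f ./docker/docker-compose-test-all.yml \
    --env-file local-test.env \
    logs dataverse > dv/dataverse-logs.log 2>&1
```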
@pdurbin, the script now saves the Dataverse logs into a separate file. This might help pinpoint the exact issue causing the tests to fail.
Issue #224 reported the problem of the local container-based test runner failing for various reasons. The current implementation hard-coded DV_VERSION=6.3, which is not sustainable with newer versions. This PR addresses that issue specifically and also cleans up a couple of other processes in the test chain.

Testing workflow improvements: The unit-tests container now waits for Dataverse to be ready before running tests, dynamically fetches the Dataverse version via its API, and sets the DV_VERSION environment variable accordingly. The test startup script also installs required Python dependencies and provides more detailed output during test execution. The run-tests.sh script has been enhanced to provide clearer progress updates, improved error handling, and more informative output for both successful and failed test runs, including displaying pytest results or container logs as appropriate.

Configuration and environment updates: The local-test.env file now includes a default DV_VERSION, which is overridden by dynamic detection during tests.

Docker Compose file standardization: Values in docker/docker-compose-base.yml (such as container names, hostnames, ports, commands, and restart policies) now use single quotes for consistency and to avoid YAML parsing issues.

Please note that the test log in #224 showed several other failing tests, but I could not reproduce these on my machine. Tests pass using the 6.7.1 version. @pdurbin, these could be related to other issues we should investigate.

unit-tests.log
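To make the improved error handling and conditional log output described for run-tests.sh more concrete, the wait/exit-code/log pattern visible in the output earlier in this thread could be implemented roughly as in the sketch below. The container name unit-tests comes from the compose output above; the polling interval and the docker inspect format strings are assumptions, not the actual run-tests.sh code.

```bash
# Rough sketch of the wait-and-report pattern, not the exact run-tests.sh code.
CONTAINER="unit-tests"   # name taken from the compose output above

i=0
while [ "$(docker inspect -f '{{.State.Status}}' "$CONTAINER")" = "running" ]; do
  i=$((i + 1))
  printf "⏳ Waiting for tests to complete... (%d)\n" "$i"
  sleep 5
done

EXIT_CODE=$(docker inspect -f '{{.State.ExitCode}}' "$CONTAINER")
printf "📋 Unit tests completed with exit code: %s\n" "$EXIT_CODE"

# Only dump the test container's output when something went wrong.
if [ "$EXIT_CODE" -ne 0 ]; then
  printf "❌ Unit tests failed. Showing test results...\n"
  echo "=== PYTEST OUTPUT ==="
  docker logs "$CONTAINER"
  echo "=== END PYTEST OUTPUT ==="
fi

exit "$EXIT_CODE"
```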