Add DSS integration tests (New) #1787
Conversation
The first job sets up the repo and the venv, then one job runs the CPU integration tests, and the next runs the GPU integration tests if the NVIDIA GPU was enabled successfully.
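As a rough illustration of that ordering (not the actual job definitions from this PR; the repo URL, paths, and tox environment names below are assumptions), the three stages could boil down to shell steps like:

```bash
#!/bin/bash
set -euo pipefail

# Stage 1: set up the DSS repo and a virtual environment (URL and paths are placeholders).
git clone https://github.com/canonical/data-science-stack.git dss
python3 -m venv dss-venv
dss-venv/bin/pip install tox

# Stage 2: run the CPU integration tests (the tox environment name and test selector are assumptions).
(cd dss && ../dss-venv/bin/tox -e integration -- -k "not gpu")

# Stage 3: run the GPU integration tests only if the NVIDIA GPU was enabled successfully.
if command -v nvidia-smi >/dev/null && nvidia-smi >/dev/null 2>&1; then
    (cd dss && ../dss-venv/bin/tox -e integration -- -k "gpu")
else
    echo "NVIDIA GPU not available; skipping GPU integration tests."
fi
```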
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@           Coverage Diff           @@
##             main    #1787   +/-  ##
=======================================
  Coverage   49.83%   49.83%
=======================================
  Files         377      377
  Lines       40719    40719
  Branches     6851     6851
=======================================
  Hits        20294    20294
  Misses      19700    19700
  Partials      725      725
```
@fernando79513, is there anything still holding up this PR? I am unfortunately sick today, but I hope to be able to look at fixing any issues tomorrow.
Hey @motjuste, what's the exact purpose of this PR?
In this PR, we clone the DSS repo and only run their integration tests. The integration tests expect that the DSS snap is installed and that there is a Kubernetes cluster set up. We do not install DSS from this repo. I am sorry for the confusion.
You are right, the DSS integration tests are basically calling the DSS snap in different scenarios. So what's the difference between these tests and the ones we are already running?
To be honest, our tests at the moment are more extensive than what the DSS integration tests cover. For example, the DSS integration tests currently do not cover Intel GPUs at all, but we do. I am following the spec KF114 that @deusebio et al. created for me. It does make sense to run the integration tests from the product team, especially since they will be in the best position to keep those tests up to date, e.g. with more granular tests that don't make sense to add to this Checkbox provider. There is one pattern we can use for handling a potential version mismatch:
The DSS repo currently doesn't provide separate branches. I believe the default ... @deusebio: please provide any comments.
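For illustration, pinning to a fixed ref instead of tracking the default branch could look like the following sketch (the repo URL and ref are placeholders, not values taken from this PR):

```bash
# Clone the DSS repo and check out a known-good ref instead of whatever the
# default branch currently points at (URL and ref are placeholders).
DSS_REPO="https://github.com/canonical/data-science-stack.git"
DSS_REF="<tag-or-commit-sha>"

git clone "$DSS_REPO" dss
git -C dss checkout "$DSS_REF"
```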
I had a discussion with @deusebio from the DSS team, and he was indeed in favour of adding these tests into the Checkbox provider. The DSS product team will then be able to add more specific tests in the future without having to update this Checkbox provider. To avoid a version mismatch, there are two options:
Yes, @motjuste, thank you for the summary. I'm definitely in favor of externalizing UAT tests into a repository managed by the product team, so that we can keep updating the tests and also implement new ones without the Checkbox workflow having to be modified. The version mismatch can be fixed in either of the ways you propose. My preference would probably be option 2, given that for DSS everything is self-contained in one repository. Charmed Kubeflow is a bit more complex (it is made up of multiple repos/charms/services) and we have a dedicated repo for UATs, hence the need to use branches there, but this is not required here. The hash of the commit could be parsed from the version (e.g. the version of DSS is ...).
@fernando79513 ... I am looking into parsing the commit hash ... but feel free to add any other comments you may have in the meantime.
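One possible sketch of that parsing, assuming the installed snap is named data-science-stack and its version string embeds a short commit hash after the last dash (both are assumptions, not confirmed in this thread):

```bash
# Read the installed snap version and take the suffix after the last dash as the
# commit hash (e.g. "1.0-abc1234" -> "abc1234"); the version format is an assumption.
VERSION=$(snap list data-science-stack | awk 'NR==2 {print $2}')
COMMIT="${VERSION##*-}"

# Check out the matching commit in the cloned DSS repo.
git -C dss checkout "$COMMIT"
```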
I still don't think this should be a Checkbox test. The idea of having these tests in Checkbox (and eventually on Test Observer) is to have control over some of the core functionalities of the DSS module across different devices. If you really want to use Checkbox to run these integration tests, I would have a separate Testflinger job that does all the setup and calls ... And these integration tests should not be included in the dss-validation test plan.
My understanding was that this PR was to provide the framework to enable testing, but use-case definition and implementation would still be on the product team. From your answer I gather this aims to be "independent" testing, and you want to own both the definition and the implementation of the tests. I don't have particular concerns with this, and it is definitely your call to make whether externalizing tests is appropriate, but the latitude on the support we can provide for evolving those tests will obviously be more limited (probably for good reasons, since they are independent tests and the product team has limited knowledge of the Checkbox framework).
I don't know who is going to be in charge of writing/implementing the tests, but I don't think having this as a Checkbox test is a good approach with the current reporting system we have in place. I see the point in running the integration tests on a set of devices, since those tests will probably be more up to date, but I think the setup should be done in a separate script (not in a Checkbox test) and probably include everything in a separate Testflinger job.
For the sake of reproducible test results, any sort of floating test definition (one that changes depending on when a test job is run) being pulled into a Checkbox job doesn't seem like a good idea. If the tests are executable in some other context and there is a reason to reference them as a git ref (a tag, or a precise commit hash), I can understand that. But since Checkbox is how testing on real hardware in the labs is conducted, and since testing on real hardware seems essential for the relevant use cases, @deusebio, it seems reasonable to expect that your team would become familiar enough with this test tool to help maintain the coverage. Happy to discuss in a meeting before or at the sprint if you do not find consensus otherwise.
I feel there are a few different points of discussion we should unpack:
Yes, I believe having a dedicated meeting in Frankfurt would definitely be very helpful to unpack those properly. I would prefer the roadmap sprint if possible, since the engineering sprint is looking more hectic. Would the relevant stakeholders be there in the first week already? I'm sure we will be able to understand the constraints and the latitude for compromise we have. As a general point, I would like to centralize the business logic for user testing in one place, as re-implementing the same tests in different frameworks or repositories would substantially increase the burden of maintenance and of keeping things in sync, leaving aside the current skill gap. That's really my main concern and point here.
Agreed on all of that; let's resolve at the sprint and conclude back here (product week is indeed the better option for us as well).
Description
- In the install-deps script, add installing git and python3-venv to be able to set up for running the DSS integration tests from their repo.
- Set up a venv with tox for running the tests (a sketch of these steps follows below).
- TODO
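A minimal sketch of the steps described above, assuming apt-based dependency installation and illustrative paths and tox environment names (the actual install-deps changes in this PR may differ):

```bash
# Install the new dependencies needed for the DSS integration tests.
sudo apt-get install -y git python3-venv

# Create a virtual environment and install tox into it (paths are illustrative).
python3 -m venv dss-venv
dss-venv/bin/pip install tox

# Run the DSS integration tests from their cloned repo via tox
# (the tox environment name is an assumption).
dss-venv/bin/tox -c dss/tox.ini -e integration
```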
Resolved issues
Documentation
No changes to Checkbox documentation.
Tests
There are no new unit tests added. A full run of the checkbox-provider in Testflinger can be found here.