Description
In test_get_presence() there is a new check added by PR #2985:
name = module.get_name(platform_api_conn, i)
if name in self.skip_mod_list:
    self.expect(presence is False, "Module {} is not present".format(i))
else:
    self.expect(presence is True, "Module {} is not present".format(i))
I am seeing an issue with the above logic.
In our testbed we share one physical chassis that uses the same Supervisor card but groups different LCs into different "logical chassis", so for some "logical chassis" certain LCs are added to "skip_mod_list" because they are not meant to be tested for that logical chassis. Our expectation is that if a module is marked as skipped, it is not meant for testing and none of its state should be used in any validation logic. We should always trust what the inventory marks as skipped and not try to reason about what state the "skipped module" should be in.
I am raising this as a testcase issue and will follow up with a PR to address it.
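For illustration only, here is a minimal sketch of the behavior I would expect (this is just an illustration of the idea, not the actual follow-up PR): any module found in skip_mod_list is skipped outright, so its presence state never feeds into the pass/fail result:

for i in range(self.num_modules):
    name = module.get_name(platform_api_conn, i)
    if name in self.skip_mod_list:
        # Trust the inventory: a skipped module is not validated at all
        continue
    presence = module.get_presence(platform_api_conn, i)
    if self.expect(presence is not None, "Unable to retrieve module {} presence".format(i)):
        if self.expect(isinstance(presence, bool), "Module {} presence appears incorrect".format(i)):
            self.expect(presence is True, "Module {} is not present".format(i))
self.assert_expectations()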
Steps to reproduce the issue:
1. Set up the inventory file for your logical chassis to skip some LC(s) that share the same physical chassis.
2. Run platform_tests/api/test_module.py::TestModuleApi::test_get_presence
3. Observe that the skipped modules which are physically online but not part of the "logical chassis" are flagged as failures by the testcase.
Describe the results you received:
admin@chassis-sup-3:~$ show chassis modules status
Name Description Physical-Slot Oper-Status Admin-Status
------------ ------------------------------------------- --------------- ------------- --------------
FABRIC-CARD0 Cisco 8808 Fabric Card for 14.4T Line Cards 18 Online up
FABRIC-CARD1 N/A 19 Empty up
FABRIC-CARD2 Cisco 8808 Fabric Card for 14.4T Line Cards 20 Online up
FABRIC-CARD3 N/A 21 Empty up
FABRIC-CARD4 Cisco 8808 Fabric Card for 14.4T Line Cards 22 Online up
FABRIC-CARD5 Cisco 8808 Fabric Card for 14.4T Line Cards 23 Online up
FABRIC-CARD6 Cisco 8808 Fabric Card for 14.4T Line Cards 24 Online up
FABRIC-CARD7 N/A 25 Empty up
LINE-CARD0 N/A 2 Online up
LINE-CARD1 N/A 4 Online up
LINE-CARD2 N/A 6 Online up
LINE-CARD3 N/A 8 Online up
LINE-CARD4 N/A 10 Online up
LINE-CARD5 N/A 12 Empty up
LINE-CARD6 N/A 14 Empty up
LINE-CARD7 N/A 16 Empty up
SUPERVISOR0 Cisco 8800 Route Processor 30 Online up
SUPERVISOR1 N/A 31 Empty up
admin@chassis-sup-3:~$
Here is the skip module list marked in the inventory file for the logical chassis that I am running the test for, but the test run failed with the following error:
self = <test_module.TestModuleApi object at 0x7f545e9df7d0>, duthosts = [<MultiAsicSonicHost chassis-sup-3>], enum_rand_one_per_hwsku_hostname = 'chassis-sup-3'
localhost = <tests.common.devices.local.Localhost object at 0x7f5457caa7d0>, platform_api_conn = <httplib.HTTPConnection instance at 0x7f5423ed86e0>
    def test_get_presence(self, duthosts, enum_rand_one_per_hwsku_hostname, localhost, platform_api_conn):
        duthost = duthosts[enum_rand_one_per_hwsku_hostname]
        #self.ignore_mod_list = get_ignore_mod_list(duthost)
        for i in range(self.num_modules):
            presence = module.get_presence(platform_api_conn, i)
            if self.expect(presence is not None, "Unable to retrieve module {} presence".format(i)):
                if self.expect(isinstance(presence, bool), "Module {} presence appears incorrect".format(i)):
                    name = module.get_name(platform_api_conn, i)
                    if name in self.skip_mod_list:
                        self.expect(presence is False, "Module {} is not present".format(i))
                    else:
                        self.expect(presence is True, "Module {} is not present".format(i))
>       self.assert_expectations()
duthost = <MultiAsicSonicHost chassis-sup-3>
duthosts = [<MultiAsicSonicHost chassis-sup-3>]
enum_rand_one_per_hwsku_hostname = 'chassis-sup-3'
i = 17
localhost = <tests.common.devices.local.Localhost object at 0x7f5457caa7d0>
name = 'FABRIC-CARD7'
platform_api_conn = <httplib.HTTPConnection instance at 0x7f5423ed86e0>
presence = False
self = <test_module.TestModuleApi object at 0x7f545e9df7d0>
platform_tests/api/test_module.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <test_module.TestModuleApi object at 0x7f545e9df7d0>
    def assert_expectations(self):
        """
        Checks if there are any error messages waiting in failed_expectations.
        If so, it will fail an assert and pass a concatenation of all pending
        error messages. It will also clear failed_expectations to prepare it
        for the next use.
        """
        if len(self.failed_expectations) > 0:
            err_msg = ", ".join(self.failed_expectations)
            # TODO: When we move to Python 3.3+, we can use self.failed_expectations.clear() instead
            del self.failed_expectations[:]
>           pytest_assert(False, err_msg)
E           Failed: Module 2 is not present, Module 6 is not present
err_msg = 'Module 2 is not present, Module 6 is not present'
self = <test_module.TestModuleApi object at 0x7f545e9df7d0>
platform_tests/api/platform_api_test_base.py:32: Failed
--------------------------------------------------------------------------------------- generated xml file: /var/src/sonic-mgmt-int/tests/logs/tr.xml ---------------------------------------------------------------------------------------
========================================================================================================== short test summary info ==========================================================================================================
FAILED platform_tests/api/test_module.py::TestModuleApi::test_get_presence[chassis-sup-3] - Failed: Module 2 is not present, Module 6 is not present
======================================================================================================== 1 failed in 226.05 seconds =========================================================================================================
INFO:root:Can not get Allure report URL. Please check logs
gechiang@4219e8f3f5c8:/var/src/sonic-mgmt-int/tests$
Note that Module 2 corresponds to "LINE-CARD0" and Module 6 corresponds to "LINE-CARD4", which were meant to be skipped for this test even though they are physically "Online".
Describe the results you expected:
Whatever is marked as skipped should be skipped, and none of its state should be used as part of any test result.