
Phase1 Changes:
  1. Deploy Data Center Changes:

    • Separating TestClient from deployDataCenter: Previously, test client creation was part of deployDataCenter, even though creating a test client is not that module's responsibility. Test client creation has now moved to the cloudstackTestClient module, which is responsible for creating and providing the test client for a regression run. Both deployDataCenter and cloudstackTestClient were updated accordingly, along with a few miscellaneous fixes and some cleanup. deployDataCenter now simply takes the required configuration and parameters and creates the requested Data Center.
    • Delete Data Center: Deleting a created data center was not possible in the existing deployDataCenter; if a user wanted to delete and recreate one, deletion had to be done manually. A new Delete Data Center interface now deletes a previously created data center, so a user can delete it and recreate it with new settings. This is useful in many testing scenarios that require destroying and recreating a data center.
      Usage: python -i <path_to_marvin_config> -r <path_to_exported_create_dc_config>
    • Data Center creation as a transaction: Previously, if deployment failed partway through, the entries created so far remained in CloudStack and were not cleaned up, leaving a half-created data center. A subsequent recreate attempt would then fail because those entries already existed. Data center creation now behaves as a transaction: either all entries are created as part of the run or none are, and a cleanup step deletes any half-created data center.
    • Export the created Data Center: The created Data Center is exported as configuration, so the user can later use the exported entries to delete the data center, enabling create/export/delete/recreate cycles during testing. With this change, once deployDataCenter finishes, it exports the created configuration to a path specified in the input config file.
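The transactional create-or-rollback behavior, together with the export step, can be sketched as follows. This is an illustrative sketch, not the actual deployDataCenter code; the step tuples, function names, and the JSON export format are hypothetical:

```python
import json

def deploy_data_center(steps, export_path):
    """Run each creation step; on any failure, roll back everything
    created so far, so the data center is either fully created or absent.
    Each step is a hypothetical (name, create_fn, delete_fn) tuple."""
    created = []
    try:
        for name, create, delete in steps:
            create()                      # create one entry (zone, pod, cluster, ...)
            created.append((name, delete))
    except Exception:
        # Roll back in reverse order, mimicking transactional behavior.
        for _name, delete in reversed(created):
            delete()
        raise
    # On success, export the created entries so they can be deleted later.
    with open(export_path, "w") as f:
        json.dump([name for name, _ in created], f)
    return [name for name, _ in created]
```

A later delete pass would read the exported file back and invoke the delete path for each listed entry.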

  2. Parallel test runs against a single management server: This change allows test suites to run in parallel against multiple zones, with each zone backed by one hypervisor. It helps cut regression and BVT time and can surface new bugs by running suites against multiple zones concurrently. Previously, the test suites contained hard-coded values that always picked the first zone, first pod, first host, first template, etc., so even with multiple zones and hosts, only the first of each was ever exercised. With the new changes, automation can run against multiple hosts across zones on one management server, driven by configuration. A simple workflow:
    • Deploy a Data Center with multiple zones, say 3 zones, one per hypervisor (zoneforxen, zoneforkvm, zoneforvmware, etc.).
        python -i <input_cfg_file>

    • Run all test suites against each zone in parallel.

nosetests --with-xunit --xunit-file=xen_hypervisor_output.xml --with-marvin --marvin-config=/hudson/scripts/nightly_asf_master.cfg  -w <path_to_smoke_tests> -a tags=advanced --zone=<zoneforxen>

nosetests --with-xunit --xunit-file=kvm_hypervisor_output.xml --with-marvin --marvin-config=/hudson/scripts/nightly_asf_master.cfg -w <path_to_smoke_tests> -a tags=advanced --zone=<zoneforkvm>

nosetests --with-xunit --xunit-file=vmware_hypervisor_output.xml --with-marvin --marvin-config=/hudson/scripts/nightly_asf_master.cfg -w <path_to_smoke_tests> -a tags=advanced --zone=<zoneforvmware>
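The per-zone runs above can be started concurrently from a small wrapper. Below is a minimal Python sketch, assuming nosetests and the Marvin plugin are installed and reusing the flags shown above; build_cmd and run_parallel are hypothetical helper names, not part of Marvin:

```python
import subprocess

def build_cmd(zone, marvin_cfg, tests_dir):
    """Build the nosetests command line for one zone (same flags as above)."""
    return ["nosetests", "--with-xunit",
            "--xunit-file=%s_output.xml" % zone,
            "--with-marvin", "--marvin-config=%s" % marvin_cfg,
            "-w", tests_dir, "-a", "tags=advanced",
            "--zone=%s" % zone]

def run_parallel(zones, marvin_cfg, tests_dir):
    """Start one nosetests process per zone and wait for all to finish."""
    procs = [subprocess.Popen(build_cmd(z, marvin_cfg, tests_dir)) for z in zones]
    return [p.wait() for p in procs]
```

For example, run_parallel(["zoneforxen", "zoneforkvm", "zoneforvmware"], "/hudson/scripts/nightly_asf_master.cfg", "<path_to_smoke_tests>") launches the three runs side by side and collects their exit codes.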

  3. Added a few new test suites and feature coverage, e.g., new cases for volumes and instances, areas where many issues were reported during a release. Also fixed a few issues reported under marvin\test suites. More details under the bugs below:

  4. Marvin Enhancements:

  •  Marvin SSH library issues: The existing Marvin SSH client had several problems, e.g., command execution returned the error and the output in the same list, connection setup had extra delay and sometimes took too long, among other SSH issues. These have been fixed.
  •  Logging issues: All logs for a run are now streamlined in one place, which helps in debugging failures after the run. For the entire run, and for each test suite, separate logs are produced, viz. run, results, exceptions, and pass/fail, all in one place. Separating logs by category helps resolve issues quickly and distinguish product issues from test failures. Each test suite's logs go into their own folder containing all of these logs.
  • Moving services from test suites: In Phase 1, the services dict was removed from the sanity tests and moved out, which helps segregate "test code" from "test data". In many places the services dictionary was embedded in test code, making it unreadable and unmaintainable; it was also passed as-is to the base libraries, so any change to the dict could break them. For example, to run all test suites on a given test fixture against a given template, the OS type was hard-coded (via the services dict) in every test suite, so enabling this required changes in all of them. With the data moved out, the value can be changed in one place and takes effect across all test suites. Other related issues were also fixed.
  • TestClient is now a clear interface between the Marvin framework and the test suites. For example, getHypervisorInfo, getzoneInfo, getparsedTestData, etc. are all accessed by tests through the test client, so any changes to the underlying implementation remain abstracted from the test suites, which continue to work unchanged.
  • There was an interface change to the CSConnection class: previously, management server details were passed individually; now they are passed as a single structure. The scheme variable, which indicated the protocol (http\https), was removed; the protocol can instead be obtained from the management details structure.
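The test-data separation and accessor-style TestClient interface can be illustrated with a small sketch. The class, data shapes, and method bodies here are hypothetical, though the accessor names come from the notes above:

```python
# Test data lives outside the test code, so a change in one place
# (e.g. ostype) takes effect across all test suites.
TEST_DATA = {
    "ostype": "CentOS 5.6 (64-bit)",
    "volume": {"diskname": "testdisk"},
}

class CloudstackTestClient:
    """Sketch of a test client that abstracts environment details
    behind accessor methods, in the spirit of Marvin's TestClient."""
    def __init__(self, hypervisor, zone, test_data):
        self._hypervisor = hypervisor
        self._zone = zone
        self._test_data = test_data

    def getHypervisorInfo(self):
        # Tests ask the client for the hypervisor instead of hard-coding it.
        return self._hypervisor

    def getzoneInfo(self):
        # Likewise for the zone: no "first zone" assumptions in test code.
        return self._zone

    def getparsedTestData(self):
        # Test data is provided by the client, keeping suites data-free.
        return self._test_data
```

Because suites only call these accessors, the framework can change how the hypervisor, zone, or test data are resolved without touching the test suites.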
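The CSConnection change can be sketched like this. Field and method names below are assumptions, not the actual Marvin API; the point is that the protocol is derived from a single management-details structure instead of a separate scheme argument:

```python
class MgmtDetails:
    """Single structure holding management server details,
    replacing individually passed parameters (names hypothetical)."""
    def __init__(self, mgtSvrIp, port=8096, useHttps=False):
        self.mgtSvrIp = mgtSvrIp
        self.port = port
        self.useHttps = useHttps

class CSConnection:
    """Sketch of the new-style constructor taking one structure."""
    def __init__(self, mgmt_details):
        self.mgmt = mgmt_details
        # scheme is no longer a separate constructor argument;
        # it is derived from the management details instead.
        self.scheme = "https" if mgmt_details.useHttps else "http"

    def baseUrl(self):
        return "%s://%s:%d/client/api" % (
            self.scheme, self.mgmt.mgtSvrIp, self.mgmt.port)
```

Passing one structure keeps the constructor signature stable when new management-server fields are added.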


Phase2 Changes:

Fixes for the issues\enhancements below are planned.
