OOAD Items

1. Q/A & Optimisation activity

Q/A & Optimisation is intended here as an activity which can be applied to the software product, both in a global context and within each component domain of the project.
This activity includes the following specific topics:
  • a. monitoring of dynamic memory allocation (memory leaks) by using specific tools applied to well defined testbed applications.

  • Goal: identify possible weaknesses/bugs in the code concerning memory access, dynamic memory usage and illegal pointer operations, in order to improve the overall usability and robustness of the applications at run-time.
  • b. performance monitoring and profiling, based on well defined testbed applications/tests.

  • Goal: identify areas in the code where simple measures and solutions can be applied to improve run-time performance. This can be achieved by performing a statistical analysis of CPU and memory usage and by profiling the processes.
  • c. filtering of the source code to identify major violations of coding rules which may point to potential bugs in the software, or violations of our coding conventions.

  • The set of coding rules should be defined, and proper tools and filtering scripts adopted.
  • Goal: improve quality and reliability of the code by ensuring that developed code does not violate major or established coding conventions.
  • d. filtering of the source code to identify evident metrics violations.

  • Goal: improve quality, maintainability and portability of the code by identifying areas of complexity and quantifying, through quality factors, the textual and structural aspects of the source code.
  • e. Test coverage analysis.

  • Goal: improve the quality of testing, by statistically identifying those parts of the code (files, functions, statements) which are not executed and therefore not exercised during normal testing.
Priority should be given to topics (a) and (b).
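
A minimal, hypothetical testbed fragment is sketched below as an illustration of topic (a); the function name and the sizes used are invented for this sketch and do not come from Geant4 code. It shows a per-event memory leak and an illegal pointer access of the kind a memory-checking tool, run on the testbed applications, should report. The same testbed applications, run under a profiler, would serve topic (b).

  // Hypothetical testbed fragment: a per-event leak and an out-of-bounds
  // access, two typical defects a memory-checking tool should flag.
  #include <cstddef>

  void LeakyEventLoop(std::size_t nEvents)
  {
    for (std::size_t i = 0; i < nEvents; ++i)
    {
      double* buffer = new double[100];      // allocated for every event...
      buffer[0] = static_cast<double>(i);
      // ...but never released: a leak checker reports nEvents lost blocks.
      // The fix is a matching 'delete [] buffer;' at the end of the loop body.
    }
  }

  int main()
  {
    LeakyEventLoop(1000);

    double* p    = new double[10];
    double  last = p[10];                    // reads one element past the end:
    (void)last;                              // an illegal pointer operation
    delete [] p;                             // flagged at run-time by the tool
    return 0;
  }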

This activity has to be integrated into the current Software Process and is therefore based on "mutual trust" between the G4 developers and Category Coordinators.

  • The implementation of the suggested fixes to the code should be -seriously- evaluated and carry high priority with Category Coordinators.
  • If not considered, a valid motivation should be expressed and documented.
Actions:
  • q1. Create a team of at most 2 people, involved for -at least- 30% of their time, for setting up and implementing the topics above. The team should take the responsibility of:
    • identifying the proper professional tools to be used, associated with the topics mentioned above, also considering their availability and the project resources.
    • selecting and/or defining the 'testbed' applications to be used, and maintaining/upgrading them throughout the development.
      • maintenance is foreseen in collaboration with the STT team and Category Coordinators.
    • making a first complete analysis for points (a) and (b) above.
    • delegating to Category Coordinators the responsibility to monitor, assign to developers and/or implement the fixes resulting from the QA activity.
  • q2. Identify possible resources (tools/people) within external groups which are part of the Geant4 Collaboration.
  • q3. The created QA team should take the responsibility of identifying the valid set of rules at point (c) and implementing the proper filters and scripts to be adopted (a minimal filtering sketch is given after this list of actions).
  • q4. Other activities involving the QA team are:
    • make publicly available to collaborators/developers the tools, setups and scripts, to allow 'unit' QA activity within each project domain (see also automation below).
    • automate as much as possible (and document) the QA activity, possibly through the World-Wide-Web (access restricted to Geant4 developers).
    • make available and distribute the results of the filtering and analysis done (see also automation, through WWW and/or direct mail contact with Category Coordinators).
    • perform a complete analysis regularly every 1 or 2 months, based on the latest suggested Reference Tag.
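
As a sketch of the filtering mentioned in topic (c) and action q3, the fragment below scans source files for two illustrative rules. The rules themselves (no 'using namespace' directives, no lines longer than 80 characters) are assumptions chosen only for this example; the actual rule set remains to be defined by the QA team.

  // Sketch of a coding-rule filter; the two rules below are examples only.
  #include <fstream>
  #include <iostream>
  #include <string>

  int main(int argc, char** argv)
  {
    if (argc < 2)
    {
      std::cerr << "Usage: rulefilter <source-file> [...]" << std::endl;
      return 1;
    }
    for (int f = 1; f < argc; ++f)
    {
      std::ifstream in(argv[f]);
      std::string line;
      for (int n = 1; std::getline(in, line); ++n)
      {
        // Example rule 1: no 'using namespace' directives.
        if (line.find("using namespace") != std::string::npos)
          std::cout << argv[f] << ":" << n
                    << ": 'using namespace' directive" << std::endl;

        // Example rule 2: lines should not exceed 80 characters.
        if (line.size() > 80)
          std::cout << argv[f] << ":" << n
                    << ": line longer than 80 characters" << std::endl;
      }
    }
    return 0;
  }

Reporting violations in a file:line format keeps the output easy to collect and publish, as foreseen in q4.
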
Workplan & timescale:
  • q1. - Creation of the team: as soon as possible. Review current situation.
  • q2. - As soon as possible.
  • q3. - December 15th 2000.
  • q4. - Progressive improvement.

2. Analysis & Design software cycle

The following list of proposed actions is meant to document a well established OOP procedure which is still required in the "production" phase of the software product. They are needed in order to ensure that improvements and new developments are kept in sync with the overall OOA&D.
These actions, most of which have to be integrated into the regular software development of each project domain, will, if associated with a regular QA activity, guarantee that the code quality does not degrade with time. They will also ensure a coherent development where coupling does not increase with the complexity of the software.

Process Elements:

  • a1. Periodically (every 6 months) review the current category diagram and check/identify those areas where violations/changes have been introduced, in a global context.
  • a2. Within each Category domain, Category Coordinators should periodically perform the following actions:
    • review and if necessary update the User Requirements Document (URD) by analysing it in the context of their category domain, possibly starting from a "use-cases" list;
    • review and identify those areas where an OOA&D software cycle needs to be applied and implement it;
    • review the A&D documents (class diagrams AND scenario diagrams for the most relevant object interactions concerned). If necessary update and integrate them;
    • review the code to check its consistency with the design.
  • a3. Within each Category domain, Category Coordinators should regularly:
    • ensure that the on-going development in their category is consistent with the design dictated by the above documents by supervising the development activity and, according to available resources, organising proper training for the developers in the team.
  • a4. Make the above documents available on the web and define a clear procedure for their maintenance and update.
  • a5. Collect architectural design diagrams, and define a clear procedure for their maintenance and update.
Workplan & timescale:
  • a1. - Perform the first review by October 31st 2000.
  • a2. & a3. - Assessment of the current Software Process in Geant4:
  1. Generate a questionnaire based on ISO/ESA (references: CERN Project Support Team and past SPICE assessment), and distribute it to Category Coordinators - DONE;
  2. Collect the results of the questionnaire - DONE;
  3. Integrate the assessment's results and the Software Process improvement proposal in an ISO-15504 document - DONE.
  • a4. - Updated documents/diagrams to be posted on web should be submitted to the Software Manager Coordinator by December 15th 2000. Maintenance and update procedure will be defined during this period.
  • a5. Create a CVS repository with restricted access to Geant4 Category Coordinators for collecting architectural designs (.mdl Rose files) and implement a policy for regular updates: DONE.

  • All documents in the repository - by December 15th 2000.

3. Testing

The following is a list of actions to be considered in order to improve the System Testing activity and assure continuity.
  • t1. Review currently dedicated resources available for testing.
    • prioritise recruitment of new manpower and promote a training activity for 'new-comers'.
  • t2. Improvement of system tests & examples (priority order):
    1. establish clear responsibilities for maintenance and integration of system tests and examples in the normal development process, in order to improve communication and collaboration between the Testing team and developers;
    2. review and improve the code quality of official public examples. The goal is to facilitate take-up and training of users of Geant4 by providing them with a set of well thought out reference examples;
    3. review and properly document current system tests; check and possibly increase the scope of the system tests by verifying correspondence with the URD and use cases. This is required to easily monitor the evolution of the system tests and verify that the required functionalities are correctly implemented and tested;
    4. use "regression tests". These are required to detect and understand behavioral changes that may be introduced by the integration of new development;
    5. use "statistical tests". The goal is to help verify the coherence of the results with their physical meaning (statistical distributions) for each test (a minimal sketch is given below, after item t3).
  • t3. Improve/implement automation (priority order):
    1. adoption of the Bonsai tool and the LXR browser, to automate and facilitate the testing activity by providing a WWW tool for submitting tags to system testing (Bonsai) and a way of easily browsing the code online through the WWW (LXR);
    2. adoption of an automatic testing system based on Tinderbox, with customisation for Geant4 (tagged code), to allow all developers to view and monitor the progress of system tests and to allow their distributed control;
    3. integration of Q/A automation (see above), to give developers the possibility of performing basic Q/A checks on their code in an easy way, sharing the same tools/scripts adopted by the Q/A team, through the same integrated environment used for testing (Bonsai).
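
The fragment below sketches how points 4 and 5 of t2 could be combined in practice: a test compares one of its output distributions against a reference produced with an earlier Reference Tag and fails when the agreement degrades. The bin contents, reference values and chi-square tolerance are invented for illustration and are not taken from any existing Geant4 test.

  // Sketch of a regression/statistical check: all numbers are illustrative.
  #include <cstddef>
  #include <iostream>
  #include <vector>

  // Chi-square distance between the current histogram and its reference.
  double ChiSquare(const std::vector<double>& current,
                   const std::vector<double>& reference)
  {
    double chi2 = 0.0;
    for (std::size_t i = 0; i < current.size(); ++i)
    {
      const double expected = reference[i];
      if (expected > 0.0)
        chi2 += (current[i] - expected) * (current[i] - expected) / expected;
    }
    return chi2;
  }

  int main()
  {
    // Reference distribution stored with an earlier Reference Tag (invented).
    const std::vector<double> reference = {120, 340, 560, 410, 180, 60};
    // Distribution produced by the current run of the test (invented).
    const std::vector<double> current   = {118, 352, 545, 420, 175, 63};

    const double chi2      = ChiSquare(current, reference);
    const double tolerance = 12.6;   // illustrative threshold for 6 bins

    if (chi2 > tolerance)
    {
      std::cout << "REGRESSION: chi2 = " << chi2
                << " exceeds tolerance " << tolerance << std::endl;
      return 1;   // non-zero exit so an automated harness flags the change
    }
    std::cout << "OK: chi2 = " << chi2 << std::endl;
    return 0;
  }

A non-zero exit code is enough for whatever automated harness runs the test to flag the change, without further instrumentation.
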
Workplan & timescale:
  • t1. - DONE; review current situation;
  • t2. - implementation:
    • Point (1) - as soon as possible;
    • Point (2) - by November 2000;
    • Point (3) - DONE; review current situation.
    • Review system tests correspondence with the URD by December 2000.
      Introduce new tests to cover deficiencies found above: progressive;
    • Point (4) - by December 2000;
    • Point (5) - progressive improvement.
  • t3. - implementation:
    • Bonsai - DONE; put in production the latest fixes.
    • LXR - DONE.
    • Tinderbox - by December 2000.
    • Q/A automation - progressive improvement.