CrossGrid Quality Assurance Activities Report

Transcript and Presenter's Notes

1
CrossGrid Quality Assurance Activities - Report
Robert Pajak, Piotr Nowakowski and Grzegorz
Mlynarczyk X TAT ACC CYFRONET AGH, Kraków,
Poland www.eu-crossgrid.org

2
Outline
  • Recommendations from the CG Review
  • QA Procedures for deliverables
  • Quality Assurance Reports
  • Process Quality Indicators
  • Testbed Quality Indicators
  • Code Quality Indicators
    • Progress and effectiveness of testing
    • Static source code metrics

3
Recommendations from the CG Review
  • Recommendations for Quality Assurance in the
    CG project from the Final Report on the CG Annual
    Review, 18 June 2003
  • A project Quality Engineer should be quickly
    nominated, and tightly integrated into the
    project management structure, reporting directly
    to the project coordinator. The Quality Engineer
    must have the overall responsibility for all
    quality-related aspects of the entire project
    (documents, reports, software). He must
    permanently monitor the quality of all
    deliverables and dissemination material.
  • Done.
  • We recommend that for the next deliverables up to
    and including M18, both the draft reports
    arriving at the project Quality Engineer and the
    final versions be sent to the Commission.
  • Done.

4
QAP for deliverables
  • After internal review each deliverable is
    verified and corrected by the Quality Engineer.
  • Both the draft reports arriving at the project
    Quality Engineer and the final versions are
    afterwards sent to the Commission.

5
Number of monthly reports delivered to CG Office
on time
6
Number of monthly reports delivered to CG Office
on time
  • The following Partners have never sent their
    monthly report on time during the last 7 months:
  • INP (AC3), UCY (AC12), UAB (AC16), AUTH (AC19),
    CSIC (CR15)
  • The following Partners have most frequently sent
    their monthly report on time during the last 7
    months:
  • AC9 (USTUTT) and CR5 (UVA) - 6 times each
  • AC7 (UNI LINZ) and AC10 (TUM) - 4 times each

7
Number of persons allocated by each Partner to
development of all tasks during Dec 2003 (EU
funded hours)
8
Mailing lists activity in December 2003
9
Mailing lists activity in December 2003
10
Testbed Quality Indicators - December 2003
11
Testbed Quality Indicators last 4 months
12
Testbed Quality Indicators - notes
  • Number of sites corresponds to the sum of all
    sites involved in the Production, Development and
    Validation testbeds.
  • Number of users refers to the users registered in
    all crossgrid VOs.
  • Job submission refers to the number of Globus
    jobs submitted in the Production testbed.
  • Monthly uptime refers to the CE gatekeeper
    availability; it includes both gatekeeper and
    ICMP response.
  • During November-December 2003 several sites were
    being upgraded; this is why the success rate and
    uptime are very low.

13-18
Code QIs - Progress and effectiveness of testing -
December 2003 (charts)
19
Code QIs - Progress and effectiveness of testing
missing reports
  • The reports from Tasks 1.1, 1.2, 1.3 and 1.4 have
    not been sent to the Quality Engineer and are
    still missing.

20
Code QIs - Progress and effectiveness of testing
December 2003
21
Code QIs - Progress and effectiveness of testing
December 2003
  • The main conclusions about the above data and
    the bug tracking system:
  • From the beginning of the project only a few bugs
    were reported using the bug tracking system, so
    no advanced measurements of the testing phase
    could be calculated; since last month, people
    working on Task 1.3 have started to use the
    bugtracker to report compilation errors.
  • In the bugtracker configuration many projects
    still have no defined persons responsible for
    resolving reported bugs, to whom the bugs may and
    should be assigned.

22
Code QIs - Progress and effectiveness of testing
December 2003
  • In the bugtracker configuration many projects
    still have no defined categories, so even if
    somebody from a development team wanted to enter
    a bug, it would not be possible to assign a
    category to it. This may complicate future
    analysis of the reported issues/problems and of
    the stability of distinct modules, because right
    now projects without categories are treated,
    from the bug tracking standpoint, as one big
    package.
  • Feedback after the first quality report indicates
    that most people use direct email notification
    to inform each other about bugs, but this form
    is not sufficient to track the development
    process; developers should use the bug-tracking
    tool as a centralized database for issue
    reporting, which itself provides an email
    notification mechanism to keep them informed
    about problems.

23
Tasks source code in project hierarchy
24
Tasks source code in project hierarchy - notes
  • On its first level, the diagram distinguishes
    only those tasks of individual WPs whose CVS
    directories use the naming convention wpX_Y_Z.
  • If an existing project is not classified in this
    way but should be analyzed, the QE should be
    informed about such cases.
  • The diagram marks in green only those tasks whose
    source directories contained sources that could
    be compiled and for which suitable makefiles were
    delivered.
  • Adjusting the source code directories of
    individual projects may have disrupted the
    indicator values in the December 2003 QA report,
    since the values provided a month ago were
    obtained from different directories.
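The wpX_Y_Z naming convention mentioned above can be checked mechanically. The sketch below is hypothetical: the directory names are illustrative, not taken from the actual CVS tree, and the actual QA tooling may classify directories differently.

```python
import re

# Hypothetical sketch: flag CVS directories that do not follow the
# wpX_Y_Z task naming convention described above (e.g. wp2_4_1).
TASK_DIR = re.compile(r"^wp\d+_\d+_\d+$")

def is_task_dir(name):
    """Return True if a directory name matches the wpX_Y_Z convention."""
    return bool(TASK_DIR.match(name))

# Illustrative directory names, not actual CVS content.
dirs = ["wp2_4_1", "wp1_1_2", "perfmon", "wp3_2"]
unrecognized = [d for d in dirs if not is_task_dir(d)]
```

Directories left in `unrecognized` would be the cases the QE should be told about.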

25
Tasks source code in project hierarchy - notes
  • Due to 'backward compatibility' of reports, Work
    Package data have been gathered using the whole
    content found in CVS under the Crossgrid/wpx
    directory. As a result, the values of several
    indicators at package level may not be a simple
    sum of the corresponding indicators at project
    level: project-level indicators may omit (as
    requested) sources not found in the src dir,
    e.g. sources of tests or includes.
  • The package-level indicator values presented in
    this report were adjusted not to contain fake
    sources found in CVS. For instance, GTK sources
    (found in the wp2_4_1-perfmon directory) and
    doubled Workload sources were eliminated from
    the Work Package data.
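The aggregation caveat above can be made concrete with a small sketch. The paths and line counts are invented for illustration: package-level totals cover the whole Crossgrid/wpx tree, while project-level values count only each project's src directory, so the package total can exceed the sum of the project values.

```python
# Hypothetical sketch of the package- vs project-level aggregation
# described above. Line counts per path are illustrative only.
tree = {
    "Crossgrid/wp2/proj_a/src": 400,
    "Crossgrid/wp2/proj_a/tests": 120,  # omitted at project level
    "Crossgrid/wp2/proj_b/src": 300,
}

# Package level: everything under the WP directory.
package_total = sum(tree.values())

# Project level: only the src directories, as requested in the report.
project_total = sum(v for path, v in tree.items() if path.endswith("/src"))
```

Here `package_total` (820) is larger than `project_total` (700) precisely because of the test sources, matching the note above.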

26-33
Code QIs - Static source code metrics - December
2003 (charts)
34
Code QIs - Static source code metrics December
2003
Total complexity of CrossGrid has been reduced.
35-36
Code QIs - Static source code metrics - December
2003 (charts)
37-38
Code QIs - Static source code metrics - last 6
months (charts)
39-41
Code QIs - Static source code metrics - December
2003 (charts)
42
Code QIs - Static source code metrics December
2003
  • Number of quality notifications for WP4 tasks
  • Because WP4 is not divided into individual
    projects, its individual tasks appear not to
    contain source code that generates quality
    remarks.
  • The remarks are generated from headers kept in
    CVS in directories not recognized by the QA
    report as task directories.

43-46
Code QIs - Static source code metrics (charts)
47
Code QIs - Static source code metrics December
2003
48
Code QIs - Static source code metrics December
2003
  • The main conclusions about the above data:
  • The projects wp2_3 (the bench) and marmot have
    the largest ratio of comment lines to effective
    lines of code.
  • A value of this coefficient close to 0.9, in the
    case of the first of these two projects, means
    that statistically 9 out of 10 lines of actual
    (effective) code have their own comment line.
  • In practice such large values of this coefficient
    are not necessary, but it is important to keep it
    at a level not lower than 0.4; the majority of
    examined projects still do not follow this rule.
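The comment-to-code coefficient discussed above can be sketched as follows. This is a simplified, hypothetical illustration: the report's actual values come from the RSM tool, whose line-classification rules differ in detail, and the sample source lines are invented.

```python
# Hypothetical sketch of the comment-to-code coefficient: ratio of
# comment lines to effective (non-blank, non-comment) lines. Line
# classification here is deliberately simplified (C-style comments).
def comment_ratio(lines):
    is_comment = lambda l: l.strip().startswith(("//", "/*", "*"))
    comments = sum(1 for l in lines if is_comment(l))
    effective = sum(1 for l in lines if l.strip() and not is_comment(l))
    return comments / effective if effective else 0.0

# Illustrative source fragment, not actual project code.
source = [
    "/* add two ints */",
    "int add(int a, int b) {",
    "    return a + b;",
    "}",
]
ratio = comment_ratio(source)   # 1 comment line / 3 effective lines
flagged = ratio < 0.4           # below the 0.4 target from the report
```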

49
Code QIs - Static source code metrics December
2003
50
Code QIs - Static source code metrics December
2003
  • The main conclusions about the above data:
  • The coefficient is calculated as the ratio of the
    number of quality notifications/violations to the
    number of effective lines of code.
  • Accepting 0.1 as a satisfactory value of this
    coefficient (the lower the value, the better the
    quality), it has been observed that only a few
    projects are below this level.
  • Most projects keep the value of this coefficient
    between 0.1 and 0.2, which is unacceptable and
    should be corrected.
  • The quality rules used for the analysis can be
    verified by reading through the reports generated
    by the RSM tool.
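The coefficient and the 0.1 threshold described above amount to a one-line calculation per project. The sketch below is hypothetical; the project names and numbers are illustrative, not actual CrossGrid data.

```python
# Hypothetical sketch: quality notifications divided by effective
# lines of code, with 0.1 taken as the satisfactory threshold.
def violation_coefficient(notifications, effective_loc):
    return notifications / effective_loc

# Illustrative numbers, not actual project data.
projects = {
    "project_a": (80, 1000),   # 0.08 -> satisfactory
    "project_b": (150, 1000),  # 0.15 -> should be corrected
}
verdicts = {name: violation_coefficient(n, loc) <= 0.1
            for name, (n, loc) in projects.items()}
```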

51
Code QIs - Static source code metrics December
2003
52
Code QIs - Static source code metrics December
2003
  • The main conclusions about the above data:
  • During the analysis, problems with overly high
    module complexity have been noticed within
    individual projects.
  • Even though the average function complexity in
    all projects does not exceed 10, the average
    class complexity in half of the projects exceeds
    the industry-accepted value of 15.
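The two thresholds above (average function complexity at most 10, average class complexity at most 15) can be applied with a small checker. This is a hypothetical sketch; the metric values fed in are illustrative, and in practice they would come from a metrics tool such as RSM.

```python
# Hypothetical sketch applying the complexity thresholds from the
# report: average function complexity <= 10, average class
# complexity <= 15.
FUNC_LIMIT, CLASS_LIMIT = 10, 15

def check_complexity(avg_func, avg_class):
    """Return a list of threshold violations for one project."""
    issues = []
    if avg_func > FUNC_LIMIT:
        issues.append("average function complexity > %d" % FUNC_LIMIT)
    if avg_class > CLASS_LIMIT:
        issues.append("average class complexity > %d" % CLASS_LIMIT)
    return issues

# Illustrative metric values, not actual project measurements.
issues = check_complexity(avg_func=7.2, avg_class=18.5)
```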

53-69
Code QIs - Static source code metrics - Nov-Dec
2003 (charts)
70
Quality Assurance Reports
  • All Quality Assurance Reports are available at
    the following website (password-protected):
  • http://www.eu-crossgrid.org/wp5-1-login/QA_reports.htm