Transcript and Presenter's Notes

Title: VirtuCo


1
VirtuCo
  • Implementation 2
  • Project Review
  • 11.2.2004

2
Agenda
  • Project status
  • achieved goals in the iteration
  • project metrics
  • Used work practices
  • Completed work
  • Mobile demo
  • Plans for the next iteration

3
Status of planned goals
  • Review and refactor code
  • Clearer structure of code.
  • Future implementation is now easier.
  • Construct community architecture
  • All community requirements produced.
  • Produce simple community demo
  • New functionality will be shown with a working
    demo.

4
Status of planned deliverables
  • Project plan and requirements documents
  • Updated to match current situation
  • Technical documentation
  • Architecture description of new requirements
    added
  • Updated after refactoring
  • Test case specifications
  • Updated to cover new functionality
  • Test report
  • Added reports of new executions

5
Realization of the tasks
  • Design hours went to implementation
  • Debugging was hindered
  • New cycle was started before I1 ended

6
Working hours by person

Realized hours in this iteration:

          Real  Plan  Diff
Hannu       55    60    -5
Jorma       93    77    16
Kaarle      53    60    -7
Kai         45    47    -2
Petteri     57    52     5
Sami        59    53     6
Ville       89    54    35
Total      451   403    48

Plan in the beginning of this iteration:

          PP   I1  Subtot   I2   I3   DE  Total
Hannu     30   67      97   54   42   23    210
Jorma     25   54      79   57   48   25    210
Kaarle    30   70      90   48   40   25    210
Kai       32   89     121   48   16   25    210
Petteri   41   72     113   42   36   23    210
Sami      55   45     100   45   32   36    210
Ville     44   70     114   47   32   23    210
Total    257  465     722  341  266  122   1470

Latest plan (incl. realized hours and other updates):
  • Client debugging problems
  • Work load shifted to Ville
  • Server studying

          PP   I1   I2  Subtot   I3   DE  Total
Hannu     30   67   55     152   36   22    210
Jorma     25   54   93     172   24   14    210
Kaarle    30   70   53     153   40   17    210
Kai       32   89   45     166   17   27    210
Petteri   41   72   57     160   32   18    210
Sami      55   45   59     159   40   21    220
Ville     44   70   89     203    9    8    220
Total    257  465  451     704  194  121   1490
7
Hours per work type in I2
8
Hours per work type in the whole project
9
Quality metrics
Issue metrics:

           PP  I1  I2  Total
Reported   92   4  25    121
Closed     92   3  17    112
Open        0   1   8      9

Implementation 2 issues by severity:

                          Blockers  Critical  Major  Minor  Trivial  Total
Total open                       0         0      4      3        1      8
This iteration reported          0         3      8     12        2     25
  • Results of system testing
  • Testing discovered new bugs that are being fixed

10
Quality assessment
Functional area   Coverage  Quality   Comments
Conversation             3  not sure  Some major bugs, but works overall.
Push messages            2  good      Works.
User management          3  not sure  Some major bugs, but works overall.
  • Overall quality is good enough for a
    demonstration
  • All unit tests passed!

Legend: Coverage: 0 = nothing, 1 = we looked at it, 2 = we
checked all functions, 3 = it's tested. Quality: good / not
sure / bad.
11
Software size
Lines of code            I1     I2  I3  DE
Client total           2114   5895
Client comments         946   2404
Server total           2006   4592
Server comments         941   1958
Common total            204    156
Common comments         109     99
Total                  4324  10643
Comments               1996   4461

Lines of code per programming hour   I1  I2  I3  DE
Functional lines of code             15  10
Any line of code                     27  16

12
Risks
  • MIDP2.0 phone
  • Have not received a phone to test on.
  • Testing with phone will cause delays.
  • Tools produce problems
  • Debugging didn't work with Eclipse and NDS.
  • Project started slowly
  • Personnel were sick or otherwise unable to work.

13
Work practices
  • Meeting practices
  • Usability tests
  • Unit tests
  • Process construction and tuning
  • Refactoring

14
Unit tests
  • Unit tests were written when possible (see the
    sketch below)
  • Enhanced code quality
  • Metrics (test classes / actual classes)
  • client 12 / 29 (41%)
  • common 0 / 3 (0%)
  • server 3 / 58 (5.2%)
  • all 15 / 90 (16.7%)
  • Each test class contained multiple tests.
  • Common package doesn't contain functionality.
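
A minimal sketch of what such a test class could look like with
J2MEUnit (the framework spiked later in this presentation),
assuming its reflection-free, JUnit 3 style API where each test
is registered by hand in suite(); the class and test names here
are illustrative, not from the project's code base:

  import java.util.Vector;
  import j2meunit.framework.Test;
  import j2meunit.framework.TestCase;
  import j2meunit.framework.TestMethod;
  import j2meunit.framework.TestSuite;

  public class VectorTest extends TestCase {

      public VectorTest() {
          super();
      }

      public VectorTest(String name, TestMethod method) {
          super(name, method);
      }

      // One test: a freshly created Vector is empty.
      public void testStartsEmpty() {
          Vector v = new Vector();
          assertEquals(0, v.size());
      }

      // CLDC has no reflection, so tests are registered explicitly.
      public Test suite() {
          TestSuite suite = new TestSuite();
          suite.addTest(new VectorTest("testStartsEmpty",
                  new TestMethod() {
                      public void run(TestCase tc) {
                          ((VectorTest) tc).testStartsEmpty();
                      }
                  }));
          return suite;
      }
  }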

15
Meeting practices
  • New meeting practices were taken into use in I1
  • Meetings took roughly half the time
  • All necessary subjects were addressed
  • Metrics
  • Meetings lasted 7% less than planned
  • Agenda times differed 60% from planned.

16
Refactoring
  • Refactoring was done for the whole system before
    Christmas.
  • Code was refactored to be simpler and clearer at
    the expense of efficiency (illustrative sketch
    below).
  • It was decided to optimize later, if necessary.
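
As a purely hypothetical illustration (not from the VirtuCo code
base), the trade-off described above might look like this:
rebuilding a result on demand instead of maintaining a hand-tuned
incremental cache, accepting some repeated work in exchange for
code that is obviously correct:

  import java.util.Vector;

  public class MessageLog {
      private final Vector messages = new Vector();

      public void add(String message) {
          messages.addElement(message);
      }

      // Before refactoring one might cache the concatenated text
      // and patch it on every add(); after refactoring the text
      // is simply rebuilt on demand: slower if called often, but
      // clear, and easy to optimize later if profiling shows it
      // matters.
      public String asText() {
          StringBuffer sb = new StringBuffer();
          for (int i = 0; i < messages.size(); i++) {
              sb.append((String) messages.elementAt(i)).append('\n');
          }
          return sb.toString();
      }
  }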

17
Usability tests
  • A GUI walkthrough test was carried out with a HUT
    student
  • Usability issues were discovered in each
    usability category.
  • New improvements to the GUI were derived from the
    issues and listed.

18
Process construction and tuning
  • First reflection workshop was held
  • Improvement suggestions were discovered.
  • Many of them were implemented immediately, some
    left to be implemented later on.
  • One more workshop to be held
  • Generally, workshops seem to be a very good way to
    gather information about the process from the
    project group

19
Configuration management
  • Nightly builds
  • Succeeded 49 times
  • Failed 14 times, 9 of them due to unit tests.
  • The other failures were due to problems with
    scripts.

20
Design patterns
  • The number of added design patterns decreased
  • Architecture was stabilized
  • No new patterns were needed
  • Refactoring
  • Accounts for many of the newly applied design
    patterns.

21
Results of the iteration
  • Software
  • Community functionality
  • Server and client
  • Demonstration
  • New documentation
  • User's guide

22
Spikes
  • MIDP1.0
  • Connection over HTTP worked (sketch below)
  • J2MEUnit
  • Tested and works
  • GameCanvas and fullscreen
  • Tested with MIDP2.0 and works (sketch below).
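
A minimal sketch of the MIDP 1.0 HTTP spike, using the standard
Generic Connection Framework (javax.microedition.io); the class
name is a placeholder and the URL is supplied by the caller:

  import java.io.IOException;
  import java.io.InputStream;
  import javax.microedition.io.Connector;
  import javax.microedition.io.HttpConnection;

  public class HttpSpike {
      // Fetch a URL with HTTP GET and return the body as a String.
      public static String fetch(String url) throws IOException {
          HttpConnection conn = null;
          InputStream in = null;
          try {
              conn = (HttpConnection) Connector.open(url);
              conn.setRequestMethod(HttpConnection.GET);
              int rc = conn.getResponseCode();
              if (rc != HttpConnection.HTTP_OK) {
                  throw new IOException("HTTP response code: " + rc);
              }
              in = conn.openInputStream();
              StringBuffer sb = new StringBuffer();
              int ch;
              while ((ch = in.read()) != -1) {
                  sb.append((char) ch);
              }
              return sb.toString();
          } finally {
              if (in != null) in.close();
              if (conn != null) conn.close();
          }
      }
  }

And a sketch of the MIDP 2.0 GameCanvas full-screen spike; the
class name and drawing code are illustrative:

  import javax.microedition.lcdui.Graphics;
  import javax.microedition.lcdui.game.GameCanvas;

  public class FullScreenCanvas extends GameCanvas {
      public FullScreenCanvas() {
          super(true);              // suppress key events for game actions
          setFullScreenMode(true);  // MIDP 2.0: draw over the whole screen
      }

      // Clear the off-screen buffer and push it to the display.
      public void render() {
          Graphics g = getGraphics();
          g.setColor(0x000000);
          g.fillRect(0, 0, getWidth(), getHeight());
          flushGraphics();
      }
  }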

23
Lessons learned
  • Eclipse and NDS won't work together when
    debugging MIDlets.

24
Service overview
  • by Jorma Rinkinen

25
Architecture overview
26
Demonstration
27
Plans for the next iteration
  • Goals
  • Scorched Earth game
  • Deliverables
  • Peer testing
  • Risks
  • Actual device
  • Time underrun
  • Schedule
  • Task dependencies