The VEGA check-out overall approach
EGOS 2005, Mauro Cardone (ESA/IPT)
The VEGA Mission

VEGA is the European launcher designed to allow a wide range of missions and payload configurations, in order to best respond to market expectations.
  • Single launch: one satellite
  • Double launch: two satellites
  • Multiple launch: one main satellite plus up to six micro-satellites
  • Total payload mass up to

From equatorial to polar and Sun-synchronous orbits, VEGA will allow a large range of missions:
  • Inclination: from 5.2° to 102° (0.05° accuracy)
  • Altitude: from 300 to 1500 km (10 km accuracy)

The reference for the in-orbit capacity of Vega is 1500 kg into a 700 km-altitude polar orbit.

ESA, through the Launchers Directorate and the Vega Department, is in charge of the management of the Vega Programme. The development activities of the Vega Programme are organised in three projects run by the VEGA Integrated Project Team (ESA, ASI, CNES):
  • Launch Vehicle
  • P80 (1st stage solid rocket motor)
  • Ground
  • The VEGA checkout systems
  • The VEGA ground checkout systems include the EGSE (Colleferro) and the CCV (Kourou). They are composed of a Test Configuration System (Centre de Calcul, computing centre) and a Test Execution System (Banc de Contrôle Vega, Vega control bench).
  • The Test Configuration System will enable preparation of the data and procedures needed for operations or tests to be performed on the Test Execution System, independently of running tests.
  • The Test Execution System will execute tests or operations.
  • The Post-Processing System will enable results analysis.
  • The EGSE will perform avionics chain qualification tests and stage acceptance tests.
  • The CCV will perform P80 acceptance tests, LV integration and check-out until lift-off. The CCV will moreover perform launch campaign organization, i.e. mainly:
  • control, operation and monitoring of the launch vehicle and of the ground fluid process
  • countdown and final sequence (synchronized sequence)
  • safety state assurance
  • data storage for on-line and off-line result analysis

  • The EGSE shall interface with stages, the simulated launcher (Vega Electrical Simulator, VES, and VEGASIM) and special devices.
  • The CCV shall interface with the P80, stages, the full launcher, the VES, the fluids process, SA devices and the Mission Control Centre.

  • Coherence of VEGA checkout systems
  • The VEGA programme objective is to verify and set in coherence the requirements and design of the two check-out systems, in order to guarantee maximum coherence of tests and operations during the development, qualification and production phases, with a view to achieving:
  • optimisation during development, taking advantage of common equipment and sub-systems and of the relevant proposed solutions (HW, SW, system evolutions, human resources), avoiding developing the same functions twice;
  • risk reduction during production phases via more effective fault detection and correction;
  • increased maintainability and reduction of the relevant costs;
  • portability of launcher procedures between sites, in order to test possible evolutions and for training purposes.

  • Target levels of coherence
  • Coherence Level 1: hardware and Level 1 SW, i.e. COTS, OS, RDBMS, runtime database, SCOS, ASE, router, SCOE software (LLCS, TLM, 1553).
  • Coherence Level 2: Level 2 SW tools and services (archiving, monitoring, commanding, ...), both specifications and realization.
  • Coherence Level 3: Level 3 SW, i.e. operational and test procedure specifications (cadre de procédures, procedure framework) and implementation (logiciel, software).

Target coherence by level: Level 1 medium, Level 2 high, Level 3 very high.

Level 1 of coherence: MMI and performances
The target level of coherence is medium and relevant to the two following areas.

MMI: In order to have operators in a similar environment when performing tests in Colleferro and in Kourou, the MMI is to be similar, so as to allow operators to:
- send manual commands in the same way
- manage test sequences with the same logic
- view synoptics in the same manner
- follow the same approach for anomaly investigation
- have the same security levels for data monitoring and procedures

Performances: Some differences in the HW performance requirements are justified by the different launch vehicle configurations in Europe and Guyana. Higher convergence is however targeted for the reaction time of the system and for the monitoring/acquisition frequencies.
Level 2 of coherence
  • The target level of coherence for level 2 is high and based on compatible definitions and implementation of:
  • the test and operational environment,
  • the PLUTO engine,
  • the Space System Model (SSM).

Level 2 of coherence: test/operational environment
  • Different tools may be used depending on the various interfaces and user needs. Some of these elementary tools are called directly by the operator, to be used in manual mode (part of the MMI commonalities). Most are used within automatic tests (part of test sequence management).
  • This set of services shall be implemented on both sites:
  • command tools
  • acquisition and control tools
  • monitoring tools
  • SCD-specific tools
  • TM front-end management
  • editors
  • TDI (off-line immediate) and TD (off-line) data
  • management of automatic test procedures
  • Coherence shall be ensured for test sequences through the tailoring of the standards ECSS-E-70-32 (procedure definition language) and ECSS-E-70-31 (monitoring and control data definition) for launcher applications, in order to have a common environment that implements a PLUTO engine and the SSM syntax.
  • Moreover, compatible time-tagging, archiving and logging services will be implemented to ensure test comparison between sites.

Level 2 of coherence: the PLUTO engine
  • The portability of procedures shall be implemented by the use of the same semantics in the procedure language (guaranteed by the use of the PLUTO standard, ECSS-E-70-32) and of the same services to implement the automatic procedures.
  • E-70-32 specifies the language used to define those activities of the Space System Model that are to be implemented as ground procedures.
  • This language is called the Procedure Language for Users in Test and Operations (PLUTO).
  • The procedure language is standardized in order to:
  • facilitate the transfer of procedure knowledge, acquired in the check-out domain during functional testing, to the mission operations domain (for a given mission);
  • encourage the use of a common procedure language across missions; this reduces the learning time for staff moving between missions.
  • E-70-32 refers to E-70-31, which presents the limited view of the SSM required to prepare and execute procedures.
  • E-70-32 introduces the requirements that have been followed to define the procedure language, and specifies:
  • the structure and dynamic behaviour of procedures;
  • the syntax and semantics of the procedure language.
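The "structure and dynamic behaviour" E-70-32 specifies (preconditions checked before execution, then a main body whose steps are initiated and confirmed) can be illustrated with a small sketch. This is not PLUTO syntax or the real engine, just an assumed, simplified model of those dynamics; the step names are invented.

```python
# Sketch of PLUTO-like procedure dynamics: check preconditions, then run
# the main body step by step, confirming each step before moving on.
# Steps and checks are illustrative, not real Vega procedures.

def run_procedure(preconditions, steps):
    """Each step is a pair (initiate, confirm): initiate starts an activity,
    confirm returns True once that activity completed successfully."""
    if not all(check() for check in preconditions):
        return "NOT_EXECUTED"          # preconditions failed: do not start
    for initiate, confirm in steps:
        initiate()
        if not confirm():
            return "ABORTED"           # a step failed its confirmation
    return "COMPLETED"

power = {"on": False}
steps = [
    (lambda: power.update(on=True),    # initiate: switch the unit on
     lambda: power["on"]),             # confirm: verify the status report
]
print(run_procedure([lambda: True], steps))  # COMPLETED
```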

Level 2 of coherence: the PLUTO engine
  • The PLUTO language shall be tailored for launcher EGSE/CCV activities, in order to map the VEGA automatic procedures:
  • application modules, or procedure subsets (MA)
  • shutdown configuration modules, or recovery procedures (MCA)
  • specific process actions, or watchdog procedures (ASP)
  • sets of PARMODs (MODifiable Parameters), for each MA, MCA and ASP
  • chaining file (list and execution order of the modules)
  • FRPA (result file)
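The artifacts above fit together as an execution scheme: application modules (MA) run in chaining-file order with their PARMOD values, a recovery module (MCA) runs on failure, and the outcome is recorded in an FRPA-like result record. A minimal sketch, with all module names and behaviour invented:

```python
# Illustrative executor for the artifacts listed above; not the real
# EGSE/CCV software, just an assumed reading of how MA, MCA, PARMODs,
# chaining file and FRPA relate.

def execute_chain(chaining, modules, parmods, mca):
    frpa = []                                   # FRPA: result file content
    for name in chaining:                       # chaining file: execution order
        ok = modules[name](**parmods.get(name, {}))  # run MA with its PARMODs
        frpa.append((name, "PASS" if ok else "FAIL"))
        if not ok:
            mca()                               # recovery: reach a safe state
            break
    return frpa

modules = {
    "MA_SWITCH_ON": lambda voltage: voltage > 20.0,
    "MA_PERF_TEST": lambda tolerance: tolerance < 0.1,
}
parmods = {"MA_SWITCH_ON": {"voltage": 28.0}, "MA_PERF_TEST": {"tolerance": 0.05}}
print(execute_chain(["MA_SWITCH_ON", "MA_PERF_TEST"], modules, parmods,
                    mca=lambda: print("MCA: safe shutdown")))
```

Because the chaining file, PARMOD sets and modules are plain data, the same chain could be re-run on either site, which is the point of the level-2 coherence target.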

Level 2 of coherence: SSM definition
  • The portability of procedures is to be implemented by the use of compatible definitions of the SSM, i.e. by following the same tailored standard, ECSS-E-70-31.
  • E-70-31 introduces:
  • the concept of a generic Space System Model (SSM), made of a hierarchy of system elements representing the functional and physical decomposition of the space system, and the relationships between these objects;
  • the monitoring and control view of this space system (the knowledge required by the EGSE and the Mission Control System). This view is modelled as:
  • characteristics of a given system element (e.g. the APID of an on-board application process, the address of a device)
  • monitoring data objects (e.g. TM parameters, SCOE parameters, monitor tables)
  • activities (e.g. a TC, a procedure, a SCOE command, a request for an operator)
  • events (e.g. OOL, SW and HW failures)
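The SSM concepts above (a hierarchy of system elements, each carrying characteristics, monitoring data and activities) can be modelled as a simple tree. The attribute layout and the element names below are illustrative assumptions, not the schema defined by E-70-31.

```python
from dataclasses import dataclass, field

@dataclass
class SystemElement:
    """One node of a sketched Space System Model: the element hierarchy plus
    the monitoring-and-control view attached to it."""
    name: str
    characteristics: dict = field(default_factory=dict)  # e.g. APID, device address
    monitoring: dict = field(default_factory=dict)       # e.g. TM/SCOE parameters
    activities: list = field(default_factory=list)       # e.g. TCs, SCOE commands
    children: list = field(default_factory=list)         # functional decomposition

    def find(self, name):
        """Depth-first lookup of an element anywhere in the hierarchy."""
        if self.name == name:
            return self
        for child in self.children:
            if (hit := child.find(name)) is not None:
                return hit
        return None

# Invented decomposition for illustration only.
ipdu = SystemElement("IPDU1",
                     characteristics={"bus_address": 12},
                     monitoring={"BATT_V": 28.0},
                     activities=["IPDU1_Power_On"])
launcher = SystemElement("VEGA_LV", children=[SystemElement("AVUM", children=[ipdu])])
print(launcher.find("IPDU1").activities)  # ['IPDU1_Power_On']
```

A procedure engine only needs this limited view of the model to resolve the parameters it monitors and the activities it initiates, which is why E-70-32 refers to E-70-31.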

Level 3 of coherence
  • The target level of coherence for level 3 is very high and based on:
  • compatible test and operational specifications,
  • the same data and procedures as per launcher.

Level 3 of coherence: specifications

Test specifications (ATS/SMO): test requirements. A specification is foreseen for each avionics test; these specifications shall be developed according to the overall avionics test plan. They shall include test requirements, test logic, expected results, LV constraints, etc. These specifications are divided into:
- ATS (Avionics Test Specification): test specifications for LV integration tests in Europe (EGSE) and Guyana (CCV).
- SMO (Spécification de Mise en Oeuvre, operating specification): test specifications for LV flight preparation in Guyana (CCV).

Test SW specifications (Level 3 SW): SW sequence/procedure requirements. One or several SW sequences/procedures shall be specified and developed for each test.

Test sequence/procedure (or LN3 SW): automatic test procedures to be activated on the EGSE/CCV in order to perform the tests (equipment switch-on/off, performance tests, acceptance tests, integration tests, configuration tests).
Level 3 of coherence: specifications

Objective: Part of the ATS/SMO could be common for Europe and Guyana (or at least very similar, with some differences linked to environmental constraints) for qualification purposes, for assembly acceptance, for LV integration checkout, for flight preparation, and in general for all tests to be performed at equipment/assembly level, especially in case of equipment replacement (test done a first time in Europe and a second time in Guyana). The coherence between the Europe ATS, the Guyana ATS and the SMO will avoid a duplication of work and will optimise the overall test development chain: avionics test specifications -> test SW specification -> test SW coding -> test SW validation -> test SW configuration management.
Level 3 of coherence: launcher data and procedures

User coherency for launcher data and activities is guaranteed through proper data specification and population via the Vega Interface Database (VIDB), which is part of the overall operational database (VODB). The VIDB contains a definition of the LV and of its related subsystems, to be shared between the EGSE and the CCV, such as:
  • commands (digital, analog, 1553)
  • measurements (digital, analog, 1553, ...)
  • signal calibration data (calibration curves from raw to engineering values)
  • parameters for characterization of the 1553 bus
  • parameters for characterization of the wired equipment (sensors, batteries, ...)
  • parameters for characterization of the TLM
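The "signal calibration data" entry implies both sites converting raw samples to engineering values through the same shared curve. A common way to do this is piecewise-linear interpolation over calibration points; the curve and values below are invented for illustration, not taken from the VIDB.

```python
# Sketch of the raw-to-engineering conversion a shared calibration curve
# in the VIDB would support; the calibration points are invented.

def calibrate(raw, curve):
    """Piecewise-linear interpolation over (raw, engineering) points,
    clamped to the curve's end points."""
    points = sorted(curve)
    if raw <= points[0][0]:
        return points[0][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if raw <= x1:
            return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)
    return points[-1][1]

# Hypothetical battery-voltage curve: 12-bit ADC counts -> volts.
battery_curve = [(0, 0.0), (2048, 16.0), (4095, 32.0)]
print(calibrate(2048, battery_curve))  # 16.0
```

Storing the points once in the VIDB and sharing the conversion routine ensures that the EGSE in Colleferro and the CCV in Kourou report the same engineering value for the same raw sample.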

Level 3 of coherence: local data and procedures

User coherency for local data and activities is to be guaranteed. Databases related to elementary activities (such as wired SCOE, 1553 bus SCOE, OBC management, ...) shall have identical structure and identical bus streams.
Note that the implementation of the activities could differ, because of the different implementation of the interfaces to the devices accessed by the testing environment (SSM drivers).
Activity IPDU1_Power_On
  • Activity load_frame
  • Activity start_frame
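The note above (identical activity and database structure, with only the site-specific SSM drivers differing) can be sketched as one activity definition bound to interchangeable drivers. The driver class and bus-stream contents below are invented; only the activity names come from the slide.

```python
# Sketch of the note above: the activity definition (name, command stream)
# is identical on both sites, while the SSM driver that reaches the real
# device differs per site. The driver here is a made-up stand-in.

class Activity:
    def __init__(self, name, commands):
        self.name = name
        self.commands = commands        # identical bus stream on both sites

    def run(self, driver):
        for command in self.commands:
            driver.send(command)        # only the transport differs per site

class Bus1553Driver:
    """Illustrative site driver, e.g. the Colleferro EGSE implementation;
    a Kourou CCV driver would expose the same send() interface."""
    def __init__(self):
        self.sent = []
    def send(self, command):
        self.sent.append(command)

ipdu1_power_on = Activity("IPDU1_Power_On", ["load_frame", "start_frame"])
egse_driver = Bus1553Driver()
ipdu1_power_on.run(egse_driver)
print(egse_driver.sent)  # ['load_frame', 'start_frame']
```

Running the same `Activity` against a CCV driver would produce the identical bus stream, which is exactly the local-data coherence the slide asks for.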