1
The deliverable: THE ALICE PHOS FRONT END
ELECTRONICS - moving data from the PHOS detector to
the ALICE DAQ system -
PHOS FEE project, Beijing and Wuhan, Nov. 2002
http://www.fys.uio.no/elg/alice
B. Skaali, t.b.skaali@fys.uio.no
2
The PHOS detector in ALICE / Status for PHOS
Physics requirements for PHOS FrontEnd Electronics (FEE)
General organization of the PHOS FEE
Timing synchronization with LHC and ALICE, ALICE Trigger
Data flow from Very FrontEnd to Data acquisition
FEE for the RHIC PHENIX electromagnetic calorimeter
Quick look at US proposal for the PHOS (and EMCAL) FEE
Design cycle
Integrating the FEE and Detector Control
Project road map
3
The ALICE detector
PHOS
4
The ALICE detector, longitudinal view
5
The PHOS detector
Courtesy Dmitry Budnikov, RFNC-VNIIEF, Sarov
(Sep. 2002)
6
PHOS status March 2002
7
Physics requirements for FEE - summary
8
The PHOS FrontEnd Electronics (FEE)
  • R&D and construction of PHOS electronics has so
    far focused on the preamp/shaper for the
    photodiode (PIN, and now APD). A design of the
    readout, trigger and TOF systems does not exist
  • As such, the PHOS FEE project is late, and is on
    the watch list of the ALICE management
  • Time constraint: a baseline system (Energy) must
    be available when integration and calibration of
    the first PHOS module starts in 2004
  • Development resources of the European PHOS
    partners are not adequate for design and
    delivery-in-time
  • A US proposal for building a separate EMCAL for
    ALICE, as well as delivery of the complete PHOS
    FEE chain, has been put on hold, earliest release
    of money is 2004
  • Note also that the US groups have indicated
    strong interest in using a non-US PHOS FEE also
    for EMCAL!
  • ALICE management has therefore urged the PHOS
    collaboration to adopt electronics and solutions
    developed by other ALICE detectors

9
General functional requirements for FEE
  • Geometry: PHOS module mechanics, space/volume,
    heat removal capability
  • PHOS specific requirements
  • energy resolution
  • also individual/sector gain control of
    photodetector via APD HV settings or variable
    gain amplifier (?)
  • dynamic range
  • L0 trigger generation (L1 ROI?)
  • TOF capabilities
  • ALICE requirements
  • LHC timing and synchronization, interface to
    ALICE trigger and the TTC system
  • Data flow and link interface to DAQ hardware-
    and protocol-wise
  • Interface to ALICE Detector Control System
  • voltage setting/monitoring
  • cooling, temperature
  • channel gain control (?)

10
Module geometry for PHOS FEE
  • 64 x 56 (3584) channels
  • warm volume for FEE underneath the crystal
    volume
  • (horizontal) area 130 x 150 cm²
  • total height 32 cm, with 20 cm height (TBD) for
    vertically mounted (daughter) cards, each card
    serving a strip unit with 8 crystals via a
    connector and cabling to the cooled (-25 °C)
    crystal volume
  • heat removal capacity 2 kW
  • Total (single side) area for FEE (see following
    mechanical design proposal)
  • 448 daughter cards
  • plus horizontal area 19500 cm²
  • in theory an enormous area available for the
    FEE!
  • however, there are a number of constraints!
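The channel and card counts above can be cross-checked with a short calculation; note the per-card heat figure is a derived estimate, not a number from the slides:

```python
# Cross-check of the PHOS module figures quoted above; the per-card
# heat number is a derived estimate, not a figure from the slides.
channels = 64 * 56                      # crystals per module
crystals_per_card = 8                   # one strip unit per daughter card
daughter_cards = channels // crystals_per_card
heat_budget_w = 2000                    # 2 kW heat-removal capacity
watts_per_card = heat_budget_w / daughter_cards

print(channels)                         # 3584
print(daughter_cards)                   # 448
print(round(watts_per_card, 2))         # 4.46 (W per daughter card)
```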

PWO crystals, -25 °C
warm volume, room temp
11
Mechanics
12
TTC - Timing Trigger Control
  • ALICE Central Trigger Processor (CTP)
  • PHOS Local Trigger (VME) Crate
  • TTC signals to FEE
  • LHC 40.08 MHz clock
  • Overall synchronization
  • ALICE L1 trigger and L1 trigger word
  • ALICE L2 trigger and L2 message
  • TOF reference
  • Note! At the detector this clock will have a
    jitter of some hundred ps, use PLL to clean
  • Interface TTC → FEE via the TTCRX chip

13
Data flow FEE → ALICE DAQ
  • FEE data sampling and event buffering
  • latched and flow control from ALICE trigger
    sequence L0 (1.2 µs), L1 (5.5 µs),
    L2accept/L2reject
  • single event and multi-event buffer, detector
    specific
  • ALICE L1 trigger data block and LHC bunch and
    orbit included in event data structure
  • Data transport from detector RCU (Readout
    Controller Unit) to ALICE DAQ RORC (ReadOut
    Receiver Card) over (ALICE specific) Digital Data
    Link (DDL)
  • DDL 100 Mbyte/sec bi-directional optical link
  • PHOS one DDL per module (5 modules)
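A back-of-envelope check of the 100 Mbyte/s DDL against a full PHOS module readout; the samples-per-channel count and word packing are illustrative assumptions, not numbers from these slides:

```python
# Rough sizing of one DDL per PHOS module. Sample count and packing
# are assumptions for illustration only.
channels = 3584
samples_per_channel = 20        # assumed samples kept per channel per event
bytes_per_sample = 2            # assumed: 10-bit ADC word padded to 16 bits
event_bytes = channels * samples_per_channel * bytes_per_sample

ddl_rate = 100e6                # 100 Mbyte/s DDL
max_event_rate_hz = ddl_rate / event_bytes

print(event_bytes)              # 143360 bytes per (unsuppressed) event
print(round(max_event_rate_hz)) # 698 Hz sustained, before zero suppression
```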

14
The ALICE Digital Data Link
  • DDL specs for H/W designers of detector FEE
  • Source Interface Unit (SIU) must be mounted on
    FEE card (Readout Controller Unit), data flow
    from frontend buffer to DDL controlled by onboard
    firmware
  • right top: prototype of the Destination Interface
    Unit (DIU) on the RORC side. Note dual fibres for
    bi-directional traffic.
  • right bottom: SIU prototype
  • Nov. 2002: new DDL (SIU and DIU) link cards in
    CMC form factor now under test!

Courtesy CERN PhotoLab
15
Luciano Musa
L0: 1.2 µs (fixed). L1: 5.5 µs
(fixed). L2accept: < 100 µs. L2reject: after
L1. For PHOS modules, L0 and L1 buffers are in
on-detector boards, the L2 buffer probably on a
single RCU (Readout Controller Unit)
board. F2D-DL is the ALICE Digital Data Link
(DDL). RORC is part of the ALICE DAQ (Data
Acquisition) system.
16
PHOS Trigger
  • PHOS shall supply a L0 trigger to the ALICE
    Trigger system
  • Max delivery latency: 800 ns (the ALICE L0 is
    issued after around 1200 ns).
  • Traditional approach is to make fast (analog)
    sums over a group of crystals and compare the sum
    with a threshold.
  • Sliding summing window technique, illustrated to
    the right. The yellow rectangle indicates a
    cluster of crystals with output above threshold.
  • What about clusters that are distributed over
    several groups, as illustrated by the green
    rectangle?
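The sliding-summing-window scheme above can be sketched in a few lines. This is an illustration only; the grid size, window size and threshold are arbitrary, not PHOS parameters:

```python
# Sliding-window L0 trigger sketch: sum every (overlapping) n x n
# patch of crystal amplitudes and fire if any patch exceeds threshold.

def sliding_window_fires(amps, n, threshold):
    """amps: 2-D list of crystal amplitudes; True if any n x n
    window sum exceeds threshold."""
    rows, cols = len(amps), len(amps[0])
    for r in range(rows - n + 1):
        for c in range(cols - n + 1):
            s = sum(amps[r + i][c + j] for i in range(n) for j in range(n))
            if s > threshold:
                return True
    return False

# A cluster straddling two fixed 2x2 groups is still caught,
# because the window slides one crystal at a time.
grid = [[0] * 8 for _ in range(8)]
grid[3][3] = grid[3][4] = 5       # energy split across a group boundary
print(sliding_window_fires(grid, 2, 8))  # True
```

This is one answer to the green-rectangle question: with overlapping windows, a cluster distributed over several fixed groups still produces at least one window sum above threshold.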

17
The PHENIX Electromagnetic Calorimeter
  • PHENIX: what can we learn?
  • Physics performance: Energy, Trigger,
    Time-Of-Flight
  • Electronics system design
  • Pb-scintillator calorimeter and Pb-glass
    calorimeter
  • read out by photomultipliers
  • Pb-glass: best granularity and energy resolution
  • Pb-scintillator: best linearity and timing
  • Electronics
  • Energy: variable gain amplifier, gain can be set
    remotely, two outputs: low and high gain.
    Analog stage: ASIC.
  • gain control important for uniform amplitude
    response for trigger summation
  • Fast trigger: ASIC summing of 4 analog signals
    plus results from neighbouring ASICs
  • channel TOF: TAC started by signal above
    threshold, stopped by beam clock; pipeline values
    until trigger decision. Analog stage: ASIC
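The PHENIX-style channel TOF can be sketched numerically: a TAC starts when the signal crosses threshold and is stopped by the next beam-clock edge, so the hit time is the clock edge minus the TAC value. The clock period and hit time below are illustrative values only:

```python
# TAC-based TOF sketch: start on threshold crossing, stop on next
# beam-clock edge, reconstruct hit time from edge minus TAC value.
import math

BEAM_CLOCK_PERIOD_NS = 106.0          # ~9.4 MHz RHIC beam clock (approx.)

def tac_value_ns(threshold_crossing_ns):
    """Time the TAC integrates: threshold crossing to next clock edge."""
    stop_edge = (math.ceil(threshold_crossing_ns / BEAM_CLOCK_PERIOD_NS)
                 * BEAM_CLOCK_PERIOD_NS)
    return stop_edge - threshold_crossing_ns

def hit_time_ns(threshold_crossing_ns):
    """Recover the absolute hit time from the clock edge and TAC value."""
    stop_edge = (math.ceil(threshold_crossing_ns / BEAM_CLOCK_PERIOD_NS)
                 * BEAM_CLOCK_PERIOD_NS)
    return stop_edge - tac_value_ns(threshold_crossing_ns)

print(tac_value_ns(250.0))   # 68.0 ns stored in the TAC
print(hit_time_ns(250.0))    # 250.0: the hit time is recovered
```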

The PHENIX Collaboration PHENIX calorimeter
18
RHIC PHENIX calorimeter fast trigger
Within each ASIC the 4x4 sum signal is compared
to three separate thresholds, each remotely
programmable.
Courtesy the PHENIX Collaboration
19
RHIC PHENIX Pb-scintillator - timing
PMT output risetime ≈ 5 ns. Beam clock: 9.4 MHz.
Courtesy the PHENIX Collaboration
20
The US proposal for PHOS FEE (and ALICE EMCAL)
  • Based on the PHENIX system
  • Includes a full chain from APDs to ALICE DAQ
  • topology: a three-level system; per PHOS module:
    56 × FrontEndModules
    → 7 × DCM layer 1 → 1 × DCM layer 2
  • includes some specially designed ASICs

21
Designing and building the full PHOS FEE
  • Activity so far (since 1997): preamplifier and
    (shaper + ADC) energy card for the PHOS-256
    prototype
  • Remaining: Energy chain, Trigger, TOF and
    interfaces to ALICE systems (DAQ, Trigger, DCS)
    to be designed and built
  • ALICE/PHOS FEE
  • US proposal 2001-02 (now on HOLD..)
  • Design study June 2002 (Oslo)
  • Design study Sep 2002, and Design of FEE for
    PHOS, Nov. 2002 (I. Sibiriak)
  • The above studies are way short of a real design
    specification!

Oct. 2003: ALICE milestone, final design of PHOS
FEE. 2004: start of integration and calibration of
the first PHOS module.
However, the PHOS FEE project offers challenging
tasks in developing state-of-the-art electronics
in a short time, with close contact to the
international high tech community at CERN!
22
PHOS FEE baseline
  • Physics requirements: main issues previously
    listed
  • ALICE (protocol) interfaces: summary on previous
    foils
  • Some interfaces not yet frozen, for instance
    Central Trigger processor, Detector Control
    System (DCS)
  • PHOS module geometry and FEE
  • main mechanical parameters fixed, but internal
    organization of FEE warm volume remains to be
    defined
  • the detector module is built of 448 strip units,
    each with 8 crystals with APDs and preamps.
    Signal/power connection to each strip unit is via
    a common connector with cabling into the cold
    volume.
  • first FEE stage could be one vertically mounted
    card per strip unit
  • a number of horizontally mounted (mother)boards
    for processing sectors with say 14x16 (224)
    crystals, giving 16 sectors per PHOS module
  • PHOS preamplifier (Bergen, Kurchatov)
  • output dynamic range, signal characteristics
  • any modification regarding current output 7V
    dynamic range?
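The 7 V preamp dynamic range quoted above is one reason multiple gain outputs come up later in these slides: a single ADC range over 7 V has a coarse LSB. The ADC resolution and gain ratio below are illustrative assumptions:

```python
# Why one ADC range is tight for a 7 V dynamic range: LSB size for a
# single 10-bit range versus an assumed x16 high-gain range.
full_scale_v = 7.0
lsb_10bit_mv = full_scale_v / 2**10 * 1000
lsb_high_gain_mv = (full_scale_v / 16) / 2**10 * 1000  # assumed x16 gain

print(round(lsb_10bit_mv, 2))      # 6.84 mV per count, single range
print(round(lsb_high_gain_mv, 3))  # 0.427 mV per count on high-gain range
```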

23
PHOS FEE design baseline I
  • ALICE Technical Co-ordinator: in order to deliver
    in time, re-use electronics / concepts already
    developed by/for other ALICE detectors!
  • Electronic development resources are urgently
    required, in particular in digital (programmable)
    logic
  • However, there is also a mechanical engineering
    task to be coordinated with the FEE layout
  • We have to design and build a FEE system, not a
    number of individual electronics blocks
  • Beam tests of prototypes and construction of
    test benches
  • TOF performance
  • communication with timing/trigger system via the
    TTC system
  • First milestone, March 2003: a preliminary design
    (or design options) and time planning have to be
    presented by PHOS Project leader V. Manko to the
    LHC Committee.
  • Economy

24
PHOS FEE design baseline II
  • Off-the-shelf components only, no ASICs
  • No time for ASIC development, neither is it
    necessary
  • For maximum flexibility and low cost, use
    programmable logic wherever possible
  • FPGAs offer a wide range of circuit design
    facilities
  • Megabits of internal memory capacity, replacing
    separate memory chips
  • Processing offers inherent parallelism
  • IP (Intellectual Property) processor and
    interfacing cores can be bought (for a price)
  • Technology trend: increase in speed and capacity,
    matched with advanced development tools
  • Minimize interboard cabling, a source of
    crosstalk, signal degradation, etc.

25
Key system issues
  • Event / trigger data to ALICE
  • Crystal Energy, TOF value
  • Crystal / crystal block gain setting / bias
    voltage (TBD)
  • Crystal sector trigger generation
  • PHOS module temperature readings, etc (how
    many?)
  • PHOS L0, possibly L1 ROI (Region Of Interest)
  • Needed: a redesigned shaper with outputs for
    Energy, Trigger and TOF channels
  • Energy: current proto-design with shaping ≈ 2 µs
    for best signal/noise and for ADC amplitude
    digitization
  • sampling digitization: shaping time?
  • Very large dynamic range → multiple gain outputs
    (PHENIX has low and high)
  • Programmable gain for output equalization as an
    alternative to individual bias regulation?
  • Trigger and TOF channels are fast; the same signal
    may be used for both channels
  • Segmentation for trigger calculation
  • dividing the 3584 crystal channels into a number
    of individual trigger logic matrices
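One possible segmentation of the 3584 crystal channels into trigger-logic matrices is 16 sectors of 14 × 16 crystals, the sector size used elsewhere in these slides; the row-major mapping below is an assumption for illustration:

```python
# Sketch: map each crystal (row, col) of the 56 x 64 module grid to
# one of 16 trigger sectors of 14 x 16 crystals. Mapping is assumed.
ROWS, COLS = 56, 64           # module crystal grid
SEC_ROWS, SEC_COLS = 14, 16   # one trigger sector

def sector_of(row, col):
    """Sector index 0..15 for a crystal at (row, col)."""
    return (row // SEC_ROWS) * (COLS // SEC_COLS) + (col // SEC_COLS)

print((ROWS // SEC_ROWS) * (COLS // SEC_COLS))  # 16 sectors
print(sector_of(0, 0))                          # 0
print(sector_of(55, 63))                        # 15
```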

26
PHOS preamp output
Intrinsic rise time with the APD capacitance: ≈ 10
ns. With beam signals the risetime is longer,
around 50 ns. This increase reflects the
scintillation decay times (several components) of
the PbWO4 crystals plus internal multiple
scattering. On the right are shown preamp outputs
after LED pulses into the crystal. These
measurements also show that the rise time is
amplitude dependent. (B. Pommersche, Bergen)
27
PHOS Trigger Zhongbao Yin, Univ. of Bergen
28
Trigger generation an example
Trigger decision in 300 ns. Sector with 16x14
crystals, first stage: analog 2x2 sum (ref. Oslo
Design Study)
29
Trigger logic with sampling
Measurements show a rise time of around 40-50 ns
for (fast) signals from the current PHOS electronics
(preamp). However, this does not cause a problem
for the trigger logic, as a time-sliding summing
window can be included in the FPGA trigger firmware.
The LHC clock is 40 MHz. The beam crossing interval
in p-p is 25 ns, in Pb-Pb 125 ns. All output data
(Energy, Trigger, TOF) must be referred to a
specific LHC beam crossing.
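Referring a sampled hit to a specific bunch crossing can be sketched as follows. With a 40 MHz sampling clock (25 ns period) a reconstructed peak time maps directly to a crossing number; in Pb-Pb only every 125 ns carries a crossing. The peak time below is an illustrative value:

```python
# Sketch: assign the nearest LHC bunch crossing to a reconstructed
# peak time, for p-p (25 ns spacing) and Pb-Pb (125 ns spacing).
CLOCK_NS = 25.0               # 40 MHz sampling clock period

def bunch_crossing(peak_time_ns, crossing_spacing_ns):
    """Nearest bunch-crossing number for a reconstructed peak time."""
    return round(peak_time_ns / crossing_spacing_ns)

print(bunch_crossing(1002.0, 25.0))   # 40: p-p crossing number
print(bunch_crossing(1002.0, 125.0))  # 8: Pb-Pb crossing number
```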
30
Time Of Flight timelines
Note: this is an ideal case. What is the time
jitter due to different shower characteristics,
internal multiple light scattering, etc.? Beam
test and simulation (difficult) data are needed!
31
anti-neutrons
32
System design options
  • Conventional design
  • revised shaper with slow (2 µs shaping) E channel
    with peak digitization, plus fast outputs for
    trigger / timing
  • trigger: fast digitization and FPGA firmware
  • timing: TAC, stopped by LHC clock, buffered until
    L0 decision (Sibiriak)
  • alternative: fast FPGA firmware

Nov. 2002 Sibiriak note
  • TPC-type design
  • revised shaper with shorter (200 ns?) E channel
    shaping
  • Energy: ALTRO 40 MHz sampling, 2 (or 3) ADC
    channels for 2 (or 3) E-gain shaper outputs into
    the ALTRO's 10-bit ADC
  • TOF timing: calculate directly from E signal (?)
  • trigger: use ALTRO 40 MHz sampling and FEC
    onboard FPGA
  • use a modified TPC FrontEnd Controller with more
    channels for a larger trigger sector; the current
    version has 128 channels
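Merging the 2 (or 3) gain-ranged shaper outputs above into one energy value can be sketched as below: read the high-gain ADC channel unless it is near saturation, then fall back to the low-gain channel. The gain ratio and saturation cut are illustrative assumptions, not design values:

```python
# Sketch: combine high- and low-gain 10-bit ADC samples into one
# low-gain-equivalent energy. Gain ratio and cut are assumed values.
ADC_MAX = 1023          # 10-bit ALTRO-style ADC full scale
GAIN_RATIO = 16         # assumed high/low gain ratio

def merged_energy(adc_high, adc_low, sat_cut=1000):
    """Low-gain-equivalent counts from the two gain ranges."""
    if adc_high < sat_cut:
        return adc_high / GAIN_RATIO   # high-gain channel, rescaled
    return float(adc_low)              # high gain saturated: use low gain

print(merged_energy(800, 50))    # 50.0 (high-gain channel used)
print(merged_energy(1023, 300))  # 300.0 (fallback to low gain)
```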

33
The conventional system (Oslo Design Study) -
summary
  • Low number (16) of identical motherboards for
    14x16 crystals
  • Motherboard including Energy ADCs mainly contains
    a number of off-the-shelf ADCs and two medium-
    sized FPGAs (of 2002 speed-grade)
  • Buffer memory in FPGA
  • Moderate bandwidth requirements
  • One of the motherboards carries firmware for
    multi-event buffering and communication with the
    Readout Controller Unit. The RCU is probably best
    mounted outside the module mechanics for easy
    access to the ALICE data link
  • Minimized cabling
  • Power budget well below PHOS design specs

34
TPC RCU as PHOS FEE back-end
DDL SIU/DIU
First version of a generic card for TPC and HLT,
designed by Heidelberg. It can be configured as a
Readout Controller Unit (RCU) or an HLT/DAQ Read
Out Receiver Card (RORC). The DDL SIU/DIU is
mounted as a mezzanine. For HLT the FPGA (lightly
coloured chip) will also contain the track finder
co-processor. The rest of the card is mainly
memory. To be used as a TPC RCU it is complemented
with a mezzanine card (designed by Oslo),
attached through four connectors in the middle of
the PCB. This mezzanine contains the GTL bus
drivers for read-out of the TPC FrontEnd Cards, a
PROFIBUS chip for Detector Control, and the TTCRX
chip for trigger and timing. The PCI bus
connector is only used for testing and debugging.
35
TPC RCU board design
The data processing functionalities in the FPGA
are not indicated here. For PHOS that would as a
minimum imply formatting the event to be shipped
to DAQ. The data format over the DDL must conform
to the common ALICE standard.
36
TPC RCU (FPGA) Firmware
Most of the components are also relevant for PHOS.
37
PHOS FEE and Detector Control System
38
PHOS FEE project road map
  • Organization
  • Main partners: Kurchatov and Wuhan
  • Challenge: a fair sharing of responsibilities
    and benefits
  • Others: Bergen, Protvino, Oslo
  • Steering committee: B. Skaali, V. Manko, I.
    Sibiriak, .. (Wuhan), A. Klovning, L. Musa. US
    contact: T. Awes.
  • The project requirements are sufficiently well
    understood to get a first-order estimate of the
    FTE (Full-Time Engineering) years required for
    2003 and 04/05 (definitely not less than 10!)
  • Road map
  • March 03: design options with respective plans to
    the LHCC
  • 2003: building and testing of prototype systems
  • new shaper
  • ALTRO-based test bench
  • TOF performance (both for the ALTRO and Sibiriak
    proposals)
  • Trigger logic (lower priority in startup phase)
  • end 2003 (milestone Oct!): full plan, including
    mechanical structure and Detector Control System