1
SciDAC Accelerator Modeling Project
Kwok Ko and Robert D. Ryne
SciDAC PI meeting
Charleston, South Carolina
March 23, 2004
2
Outline
  • Project overview
  • Applications
  • Collaborations in Applied Math and Computer
    Science
  • Future Plans

3
Outline
  • Project overview
  • Applications
  • Collaborations in Applied Math and Computer
    Science
  • Future Plans

4
DOE's "Facilities for the Future of Science: A Twenty-Year Outlook" is a testament to the importance of DOE/SC and of particle accelerators.
Of the 28 priorities on the list, nearly 1/2 are accelerator facilities.
5
Accelerator projects on the 20-year list
  • LCLS
  • RIA
  • CEBAF upgrade
  • BTeV
  • Linear Collider
  • SNS upgrade
  • RHIC II
  • NSLS upgrade
  • Super Neutrino Beam
  • ALS upgrade
  • APS upgrade
  • eRHIC
  • IBX

6
SciDAC Accelerator Modeling Project
Goal: Create a comprehensive simulation environment, capable of modeling a broad range of physical effects, to solve the most challenging problems in 21st century accelerator science and technology.
Sponsored by the DOE/SC Office of High Energy Physics (formerly HENP) in collaboration with the Office of Advanced Scientific Computing Research.
7
SciDAC codes are having a major impact on
existing accelerators and future projects
  • PEP-II interaction region heating analysis (Omega3P, Tau3P, T3P)
  • Simulation of beam-beam effects in the Tevatron, PEP-II, RHIC, and LHC (BeamBeam3D)
  • Discovery that self-ionization can lead to meter-long high density plasma sources for plasma accelerators
  • NLC accelerating structure design (Omega3P) and wakefield computation (Omega3P, S3P, Tau3P)
  • Beam loss studies at the FNAL Booster (Synergia)
  • Study of the e-cloud instability in the LHC (QuickPIC)
  • NLC peak surface fields and dark current simulations (Tau3P, Track3P)
  • Gas jet modeling (Chombo/EB)
  • RIA RFQ cavity modeling (Omega3P)

8
The SciDAC Accelerator Modeling Project team: A multidisciplinary, multi-institutional team producing comprehensive terascale accelerator design tools
  • BNL: space charge in rings, wakefield effects, Booster experiments
  • FNAL: space charge in rings, software integration, Booster experiments
  • UC Davis: particle and mesh visualization
  • LBNL (AFRD): beam-beam effects, space charge in linacs and rings, parallel Poisson solvers
  • U. Maryland: Lie methods in accelerator physics
  • SLAC: large-scale electromagnetic modeling
  • LANL: high intensity linacs, computer model evaluation
  • SNL: mesh generation
  • Stanford, LBNL (CRD): parallel linear solvers, eigensolvers, PDE solvers, AMR
  • UCLA, USC, Tech-X, U. Colorado: plasma-based accelerator modeling, parallel PIC frameworks (UPIC)
9
Code Development
  • Electromagnetics: Omega3P, Tau3P, T3P, S3P, Track3P
  • Beam Dynamics: BeamBeam3D, IMPACT, MaryLie/IMPACT, Synergia, Langevin3D
  • Advanced Accelerators: OSIRIS, VORPAL, QuickPIC, UPIC

10
IMPACT code suite: user map
  • SLAC
  • LBNL
  • LANL
  • Tech-X Corp.
  • FNAL
  • ORNL
  • MSU
  • BNL
  • JLab
  • RAL
  • PSI
  • GSI
  • KEK

11
Collaborations with Applied Math and Computer
Science
  • SciDAC ISICs (TOPS, APDEC, TSTT), SAPP
  • Eigensolvers and linear solvers
  • Poisson solvers
  • AMR
  • Meshing and discretization
  • Parallel PIC methods
  • Partitioning
  • Visualization
  • Statistical methods

12
Outline
  • Project overview
  • Applications
  • Collaborations in Applied Math and Computer
    Science
  • Future Plans

13
Modeling the PEP-II Interaction Region
Courtesy K. Ko et al., SLAC
FULL-SCALE OMEGA3P MODEL FROM CROTCH TO CROTCH
Beam heating in the beamline complex near the IR prevented PEP-II from operating at high currents. Omega3P analysis helped in redesigning the IR for the upgrade.
14
Modeling the PEP-II Interaction Region
15
Tevatron Modeling
  • Large computing requirement: each point requires 12 hours on 1024 processors
  • Recent result: good agreement for antiproton lifetime vs. proton intensity
  • Courtesy Fermilab and LBNL

16
Beam-Beam Studies of PEP-II
  • Collaborative study/comparison of beam-beam codes
  • Predicted luminosity is sensitive to the number of slices used in the simulation

17
Modeling a Plasma Wakefield Accelerator with added realism in full 3D models (OSIRIS, VORPAL)
Full EM PIC simulation of a drive beam ionizing lithium in a gas cell. Courtesy W. Mori et al., UCLA
18
Full-Scale Modeling of a 30-cell Structure
  • Distributed model on a mesh of a half-million hexahedral elements
  • Study of RF damage during high power X-band operation using Tau3P and Track3P

Courtesy K. Ko et al., SLAC
19
NLC Accelerating Structure Design
20
QuickPIC calculations have delivered up to a 500x increase in performance over fully electromagnetic PIC
Wake produced by an electron beam propagating through a plasma cell (a simplified linear-theory sketch follows below)
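QuickPIC's speedup comes from the quasi-static approximation: the plasma response is computed slice by slice in the co-moving frame, so the 3D timestep only has to resolve the slow evolution of the drive beam rather than the fast plasma oscillation. As a rough illustration of the physics being modeled (not of QuickPIC's algorithm), the sketch below evaluates the on-axis wake behind a drive beam using textbook 1D linear wakefield theory; the beam profile, grid, and normalizations are illustrative assumptions.

import numpy as np

def linear_wake(xi, nb_over_n0, kp):
    # On-axis longitudinal wakefield, in units of the cold wavebreaking
    # field E0 = m_e c omega_p / e, from 1D linear wakefield theory:
    #   E_z(xi) = kp * Integral_{xi' >= xi} (n_b/n_0)(xi') cos(kp (xi' - xi)) dxi'
    # where xi = z - c t increases toward the beam head.
    dxi = xi[1] - xi[0]
    Ez = np.empty_like(xi)
    for i, x in enumerate(xi):
        ahead = xi >= x  # only charge ahead of this point contributes
        Ez[i] = kp * dxi * np.sum(nb_over_n0[ahead] * np.cos(kp * (xi[ahead] - x)))
    return Ez

# Illustrative drive beam: Gaussian, peak density 10% of the plasma density
kp = 1.0                          # plasma wavenumber (normalized units)
xi = np.linspace(-20.0, 5.0, 2001)
nb = 0.1 * np.exp(-xi**2 / 2.0)   # rms length sigma_z = 1/kp
Ez = linear_wake(xi, nb, kp)      # sinusoidal wake behind the beam

Behind the beam the wake is a sinusoid at the plasma wavelength; because the quasi-static step can span many plasma periods, speedups of order 100x to 1000x over full EM PIC become possible.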
21
Modeling beam loss in the Fermilab Booster using
Synergia
Booster simulation and experimental results. (P.
Spentzouris and J. Amundson, FNAL)
22
Outline
  • Project overview
  • Applications
  • Collaborations in Applied Math and Computer
    Science
  • Future Plans

23
Collaboration w/ SciDAC ISICs
  • TOPS: linear algebra libraries, preconditioners, eigensolvers for better convergence and accuracy
  • APDEC: solvers based on block-structured AMR, and methods for AMR/PIC
  • TSTT: gridding and meshing tools

24
Collaboration with TOPS: Eigensolvers and Linear Solvers
25
Collaboration w/ TOPS: Partitioning
26
Collaboration with APDEC
  • AMR for particle-in-cell.
  • Goal: Develop a flexible suite of fast solvers for PIC codes, based on APDEC's Chombo framework for block-structured adaptive mesh refinement (AMR).
  • Block-structured adaptive mesh solvers.
  • Fast infinite-domain boundary conditions.
  • Flexible specification of the interaction between grid and particle data (a minimal 1D sketch follows this list).
  • Accurate representation of complex geometries.
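To make "interaction between grid and particle data" concrete, here is a minimal 1D cloud-in-cell deposit/gather pair on a single uniform periodic grid. This is the generic textbook PIC scheme, not Chombo's API, and all names are illustrative; the AMR/PIC work generalizes exactly this interpolation to hierarchies of refined patches.

import numpy as np

def deposit_cic(pos, charge, n, h):
    # Cloud-in-cell (linear) deposition: each particle's charge is shared
    # between its two nearest grid points, weighted by proximity.
    rho = np.zeros(n)
    cell = np.floor(pos / h).astype(int)   # index of the grid point to the left
    frac = pos / h - cell                  # fractional position within the cell
    np.add.at(rho, cell % n, charge * (1.0 - frac) / h)
    np.add.at(rho, (cell + 1) % n, charge * frac / h)
    return rho

def gather_cic(field, pos, h):
    # Matching linear interpolation of a grid field back to the particles;
    # using the same weights for deposit and gather avoids self-forces.
    n = field.size
    cell = np.floor(pos / h).astype(int)
    frac = pos / h - cell
    return field[cell % n] * (1.0 - frac) + field[(cell + 1) % n] * frac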

27
Collaboration with APDEC: Benefits from the Heavy Ion Fusion program
AMR modeling of an HIF source and triode region
in (r,z) geometry
  • In this example, we obtain a 4x savings in
    computational cost for the same answer

Courtesy of A. Friedman, P. Colella et al., LBNL
28
Collaboration with APDEC: Embedded boundary methods for gas jet modeling
29
Collaboration with TSTT: Meshing and Discretization
30
Collaboration with TSTT: AMR on Unstructured Grids
31
SciDAC Accelerator Modeling Project provides
challenging visualization problems
Courtesy K.-L. Ma et al., UC Davis
32
Simulating high intensity beams and beam halos
  • Courtesy Andreas Adelmann (PSI) and Cristina Siegerist (NERSC visualization group)
33
Parallel Performance and Parallel Implementation Issues
  • Example: BeamBeam3D

Scaling using the weak-strong option
Performance of different parallelization techniques in the strong-strong case
Milestone: First-ever million-particle, million-turn strong-strong simulation performed for the LHC
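For scale on what each beam-beam interaction involves: in weak-strong mode, every macroparticle of the weak beam receives a thin-lens impulse from the strong beam's field once per turn per interaction point. The sketch below implements the standard closed-form kick for a round Gaussian strong beam; BeamBeam3D itself treats the general case (elliptical beams, multiple slices, self-consistent strong-strong), so this round-beam formula and its parameter names are a simplification for illustration.

import numpy as np

def round_gaussian_kick(x, y, N, sigma, r0, gamma):
    # Thin-lens beam-beam kick (in radians) from a round Gaussian beam with
    # N particles and rms size sigma, acting on a test particle at (x, y):
    #   dx' = -(2 N r0 / gamma) * (x / r^2) * (1 - exp(-r^2 / (2 sigma^2)))
    # and likewise for y. r0 is the classical particle radius, gamma the
    # Lorentz factor; the sign gives focusing for opposite-charge beams.
    r2 = x * x + y * y
    f = np.where(r2 > 0,
                 -np.expm1(-r2 / (2.0 * sigma**2)) / np.maximum(r2, 1e-300),
                 1.0 / (2.0 * sigma**2))   # smooth r -> 0 limit on axis
    k = -2.0 * N * r0 / gamma
    return k * x * f, k * y * f

A million-turn simulation applies kicks like this, plus the lattice map, 10^6 times to each of 10^6 macroparticles, which is what makes the milestone computationally demanding.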
34
High Aspect Ratio Solver Based on the Integrated Green Function (IGF)
The new algorithm provides < 1% accuracy using a 64x64 grid (black curve).
[Figure: IGF solver on a 64x64 grid compared with a conventional solver on grids from 64x1024 up to 64x16384.]
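Background on the IGF idea, as a hedged sketch: the standard open-boundary space-charge solve computes phi = G * rho as a discrete convolution via zero-padded FFTs (Hockney's method), sampling the free-space Green function G at grid points. On a high aspect ratio grid that point sampling is inaccurate, so the IGF replaces G with its integral over each cell. The production algorithm evaluates that integral analytically; the 2D sketch below (all names illustrative) approximates it by sub-cell quadrature instead.

import numpy as np

def integrated_green(nx, ny, hx, hy, nsub=4):
    # Cell-averaged 2D free-space Green function G(r) = -ln(r) (constants
    # dropped), approximated by nsub x nsub quadrature points per cell.
    # The actual IGF uses the closed-form integral instead.
    gx = (np.arange(2 * nx) - nx) * hx      # offsets on the zero-padded grid
    gy = (np.arange(2 * ny) - ny) * hy
    s = (np.arange(nsub) + 0.5) / nsub - 0.5
    G = np.zeros((2 * nx, 2 * ny))
    for ax in s:
        for ay in s:
            r2 = (gx[:, None] + ax * hx)**2 + (gy[None, :] + ay * hy)**2
            G += -0.5 * np.log(np.where(r2 > 0, r2, (0.3 * min(hx, hy))**2))
    return G / nsub**2

def solve_potential(rho, hx, hy):
    # Open-boundary convolution phi = G * rho via zero-padded FFTs.
    nx, ny = rho.shape
    G = integrated_green(nx, ny, hx, hy)
    pad = np.zeros((2 * nx, 2 * ny))
    pad[:nx, :ny] = rho
    phi = np.fft.irfft2(np.fft.rfft2(pad) * np.fft.rfft2(G), s=(2 * nx, 2 * ny))
    return phi[nx:, ny:] * hx * hy          # G was centered at index (nx, ny)

With a point-sampled G this scheme degrades badly once the cells become strongly elongated; averaging G over each cell is what lets the 64x64 IGF grid match much finer conventional grids.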
35
Comparisons with Experiments
  • LANL proton radiography (single-particle optics)
  • LANL LEDA beam halo experiment
  • J-PARC front end test (collab w/ KEK/JAERI)
  • FNAL booster
  • BNL booster
  • CERN PS (collab w/ CERN, GSI)

36
Statistical Methods for Calibration and
Forecasting
  • Determining the initial phase space distribution from 1D wire scan data
  • Courtesy D. Higdon (LANL) et al.

Simulation of a high intensity proton beam through a series of quadrupole magnets. Statistical techniques were used to combine 1D profile monitor data with simulations to infer the 4D beam distribution. The figure shows the 90% intervals for the predicted profile at scanner 6 (shaded regions) and, for comparison, the observed data (black line). Only data from the odd-numbered scanners were used to make the prediction.
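The LANL analysis used formal Bayesian model calibration; the toy sketch below conveys the core idea on a deliberately simple problem: combine a forward model of a 1D profile with noisy profile data to obtain a posterior over one beam parameter. The Gaussian forward model, noise level, and grids here are illustrative assumptions, not the actual method.

import numpy as np

def gaussian_profile(x, sigma):
    # Toy forward model: 1D projected density of a Gaussian beam of width sigma.
    return np.exp(-x**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def calibrate(x, observed, noise, sigmas):
    # Grid-search posterior over the beam width, assuming independent
    # Gaussian measurement errors and a flat prior on sigma.
    logpost = np.array([
        -0.5 * np.sum((observed - gaussian_profile(x, s))**2) / noise**2
        for s in sigmas])
    w = np.exp(logpost - logpost.max())
    return w / w.sum()

# Toy usage: recover sigma = 2 mm from one noisy wire-scan-like profile
x = np.linspace(-10.0, 10.0, 81)                 # mm
rng = np.random.default_rng(0)
data = gaussian_profile(x, 2.0) + 0.002 * rng.standard_normal(x.size)
sigmas = np.linspace(1.0, 4.0, 61)
post = calibrate(x, data, 0.002, sigmas)
print("posterior mean of sigma:", (sigmas * post).sum())

The real problem has the same structure scaled up: the forward model is a full beam dynamics simulation, the unknown is the 4D input distribution, and the data are profiles from multiple scanners.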
37
Outline
  • Project overview
  • Applications
  • Collaborations in Applied Math and Computer
    Science
  • Future Plans

38
NLC Accelerating Structure Design
39
NLC Accelerating Structure Design
40
3D First-Principles Fokker-Planck Modeling
  • Requires the analog of 1000s of space-charge calculations per step
  • "it would be completely impractical (in terms of # of particles, computation time, and statistical fluctuations) to actually compute the Rosenbluth potentials as multiple integrals" (J. Math. Phys. 138 (1997))

FALSE. Feasibility demonstrated on parallel machines at NERSC and ACL.
[Figure: self-consistent diffusion coefficients compared with the Spitzer approximation.]
Previous approximate calculations, performed without parallel computation, were not self-consistent.
Courtesy J. Qiang (LBNL) and S. Habib (LANL)
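For reference, the Rosenbluth potentials in the quotation are the two velocity-space integrals of the distribution f (standard Fokker-Planck notation, constants suppressed):

H(\mathbf{v}) = \int \frac{f(\mathbf{v}')}{|\mathbf{v} - \mathbf{v}'|} \, d^3v'
\qquad
G(\mathbf{v}) = \int f(\mathbf{v}') \, |\mathbf{v} - \mathbf{v}'| \, d^3v'

with the dynamical friction \mathbf{F} \propto \nabla_{\mathbf{v}} H and the diffusion tensor \mathsf{D} \propto \nabla_{\mathbf{v}} \nabla_{\mathbf{v}} G. Evaluated directly for every particle, these are O(N^2) multiple integrals per step, which is why the computation was long judged impractical; the result above shows that direct, self-consistent evaluation became feasible on terascale parallel machines.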
41
Optimization
  • Accelerator system design including space charge
  • Shape optimization
  • Plasma afterburner
