OptIPuter Overview - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
OptIPuter Overview
  • Larry Smarr
  • Director, California Institute for
    Telecommunications and Information Technology
  • University of California, San Diego
  • September 2003

2
OptIPuter Overview
  • Motivation from e-Science Distributed
    Cyberinfrastructure
  • What are the Science Barriers We are Trying to
    Overcome?
  • Gigabyte Data Objects Need Interactive
    Visualization
  • Shared Internet Limits Speed of File Transfers
  • Inadequate Campus Grid Infrastructure
  • Creating a Multi-Latency OptIPuter Laboratory
  • System Software From Grid to LambdaGrid
  • Education and Outreach
  • Project Management

3
We Have Studied e-Science Barriers in
Distributed Cyberinfrastructure
LHC
ATLAS
4
Application Barrier One: Gigabyte Data Objects
Need Interactive Visualization
  • Montages--Hundred-Million Pixel 2-D Images
  • Microscopy or Telescopes
  • Remote Sensing
  • GigaZone 3-D Objects
  • Seismic or Medical Imaging
  • Supercomputer Simulations
  • Interactive Analysis and Visualization of Such
    High-Resolution Data Objects Requires:
  • Scalable Visualization Displays
  • Montage and Volumetric Visualization Software
  • JuxtaView and Vol-a-Tile

5
OptIPuter Includes On-Line Microscopes Creating
Very Large Biological Montage Images
IBM 9M Pixels
  • 2-Photon Laser Confocal Microscope
  • High Speed On-line Capability
  • Using High-Res IBM Displays to Interactively Pan
    and Zoom Large Montage Images
  • Montage Image Sizes Exceed 16x Highest Resolution
    Monitors
  • 150 Million Pixels!
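The pixel arithmetic behind the slide's claim can be sketched in a few lines; the 150-megapixel montage and the 9-megapixel IBM panel figures are taken from the slides, and the comparison is just division:

```python
# Back-of-envelope check on the montage sizes quoted on this slide.
montage_pixels = 150_000_000   # biological montage image (from the slide)
display_pixels = 9_000_000     # 9-megapixel IBM panel (from the slide)

panels_needed = montage_pixels / display_pixels
print(f"~{panels_needed:.1f} panels to show the montage at full resolution")
# Roughly 16.7, consistent with "exceed 16x highest resolution monitors".
```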

Source: David Lee, NCMIR, UCSD
6
Challenge: Finding the Geo Needle in the Haystack
SIO ARAD Seismic Volume: 642 Million Voxels!
Source: Graham Kent, SIO; Mike Bailey, SDSC
7
Many Groups Are Experimenting with Tiled Displays
for 2D-Montage and 3D-Volumetric Viewing
PerspecTile Running JuxtaView -- Jason Leigh, EVL,
UIC
www.llnl.gov/icc/sdd/img/images/pdf/Walls97.pdf
LCD Panels
Video Projectors
Each 3x5; 20 Megapixels Total
8
OptIPuter Project Goal: Scaling to 100 Million
Pixels
  • JuxtaView (UIC EVL) for PerspecTile LCD Wall
  • Digital Montage Viewer
  • 8000x3600 Pixel Resolution -- ~30M Pixels
  • Display Is Powered By
  • 16 PCs with Graphics Cards
  • 2 Gigabit Networking per PC

See Jason Leigh Talk for More
Source: Jason Leigh, EVL, UIC; USGS EROS
9
Vol-a-Tile: Using Scalable Displays for 3D
Visualization -- ARAD Seismic Data, 1001 x 801 x
801 x 32 bits
SIO/IGPP - Graham Kent
See Jason Leigh and John Orcutt Talks for More
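The volume dimensions quoted above multiply out to the "642 million voxels" figure on slide 6; a quick check, with the uncompressed size following from 4 bytes per 32-bit sample:

```python
# ARAD seismic volume from the slide: 1001 x 801 x 801 voxels at 32 bits each.
nx, ny, nz = 1001, 801, 801
voxels = nx * ny * nz
size_bytes = voxels * 4  # 32-bit (4-byte) samples

print(f"{voxels:,} voxels")                        # 642,242,601 -- the "642 million" on slide 6
print(f"~{size_bytes / 1e9:.2f} GB uncompressed")  # ~2.57 GB
```

A multi-gigabyte object like this is exactly the scale at which the shared Internet (next slide) stops being interactive.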
10
Application Barrier Two: Shared Internet Makes
Interactive Gigabyte Impossible
  • NASA Earth Observation System
  • Over 100,000 Users Pull Data from Federated
    Repositories
  • Two Million Data Products Delivered per Year
  • 10-50 Mbps (May 2003) Throughput to Campuses
  • Typically over Abilene from Goddard, Langley, or
    EROS
  • Best FTP with Direct Fiber OC-12, Goddard to
    UMaryland: 123 Mbps
  • UCSD-SIO to Goddard (ICESAT, CERES Satellite
    Data): 12.4 Mbps
  • Interactive Megabyte Possible, but Gigabyte Is
    Impossible
  • BIRN Between UCSD and Boston -- Similar Story
  • Lots of Specialized Network Tuning Used:
    50-80 Mbps
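The throughput figures above translate directly into transfer times for a gigabyte object; a minimal sketch, using the measured rates from the slide and the project's 10 Gbps dedicated-lambda target for comparison:

```python
# Time to move a 1 GB data object at the throughputs measured on this slide,
# versus a dedicated 10 Gbps lambda (the OptIPuter target).
size_bits = 1e9 * 8  # 1 gigabyte in bits

for label, mbps in [("UCSD-SIO to Goddard", 12.4),
                    ("OC-12 Goddard to UMaryland", 123.0),
                    ("Dedicated 10 Gbps lambda", 10_000.0)]:
    seconds = size_bits / (mbps * 1e6)
    print(f"{label}: {seconds:,.1f} s")
# 12.4 Mbps -> ~645 s (over ten minutes); 10 Gbps -> ~0.8 s,
# which is what makes gigabyte objects interactive rather than impossible.
```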

11
For Those Who Want the Details
Source: Bernard Minster, SIO, UCSD
12
Removing User Networking Barriers: Global
Intellectual Convergence
  • SERENATE is a Strategic Study into the Evolution
    of European Research and Education Networking
    Over the Next 5-10 Years
  • Some Findings
  • On A Multi-year Timescale, Move Towards Optical
    Switching
  • Evolution Towards Heterogeneous NREN Networks
    (and GÉANT), with General Internet Use
    (Many-to-many) via Classical Packet Switching
    AND
  • Specialized High-Speed Traffic (Few-to-Few) via
    Optical Paths?
  • Even End-to-End Paths??
  • -> OptIPuter Project
  • Research Needed to Make This Practical

Source: David Williams, CERN (2003)
13
Solution is to Use Dedicated 1-10 Gigabit Lambdas
Parallel Lambdas Will Drive This Decade The Way
Parallel Processors Drove the 1990s
14
Application Barrier Three: Campus Grid
Infrastructure is Inadequate
  • Campus Infrastructure is Designed for Web Objects
  • Being Swamped by Sharing of Digital Multimedia
    Objects
  • No Strategic Thinking About Needs of Data
    Researchers
  • Challenge of Matching Storage to Bandwidth
  • Need To Ingest And Feed Data At Multi-Gbps
  • Scaling to Enormous Capacity
  • Use Standards-Based Commodity Clusters and Disk
    Arrays
  • OptIPuter Aims at Prototyping a National
    Architecture
  • Federated National and Global Data Repositories
  • Lambdas on Demand
  • Campus Laboratories Using Clusters with
    TeraBuckets
  • Campus Eventually with a Shared PetaCache

Giving Researchers What Students Have with P2P
15
OptIPuter End User Building Blocks: Compute,
Storage, Viz Linux Clusters
  • Cluster: 16-128 Nodes (Typically Two Intel
    Processors)
  • Storage: 0.1-1 TB per Node
  • Graphics: Nvidia Card per Node
  • Visualization Displays: Desktop, Wall, Theatre,
    Tiled, VR
  • Specialized Data Source/Sink Instruments
  • All Nodes Have 1 or 10 GigE I/O

See Tom DeFanti and Phil P. Talks for More
Commodity GigE Switch
Fibers or Lambdas
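The building-block numbers above multiply out as follows; a back-of-envelope sketch at the high end of the slide's ranges (128 nodes, 1 TB/node, 10 GigE/node), not a measured figure:

```python
# Aggregate capacity of one end-user cluster building block,
# taking the high end of the ranges listed on this slide.
nodes = 128
tb_per_node = 1.0     # 0.1-1 TB per node; use the upper bound
gbps_per_node = 10    # each node has 1 or 10 GigE I/O

print(f"storage: {nodes * tb_per_node:.0f} TB per cluster")        # 128 TB
print(f"aggregate I/O: {nodes * gbps_per_node / 1000:.2f} Tbps")   # 1.28 Tbps
```

An aggregate I/O of over a terabit per second is why the commodity GigE switch must front fibers or lambdas rather than the shared campus network.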
16
The UCSD OptIPuter Deployment
Year One: UCSD -- Prototyping a Campus-Scale
OptIPuter
Forged a New Level of Campus Collaboration in
Networking Infrastructure
[Campus map: To CENIC; SDSC; SDSC Annex; Preuss
High School; JSOE Engineering; CRCA; SOM Medicine;
6th College; Phys. Sci.-Keck; Collocation Node M;
Earth Sciences SIO; 2 miles, 0.01 ms]
Details of UCSD and UIC Campus Infrastructure:
Tom DeFanti and Phil P. Talks
Source: Phil Papadopoulos, SDSC; Greg Hidley,
Cal-(IT)2
17
Year Two: We Will Install Tele-Presence at SoCal
and Chicago Sites
Falko Kuester, UCI: Laboratory with Smart Boards
and Optically Connected Large Screens; Cal-(IT)2
Collaboration Research. See Leigh Talk.
18
The OptIPuter Laboratory is Designed to Enable
Research on Differential Latency
  • Observations
  • The Cost of Latency Cannot be Eliminated
  • Each Doubling of Bandwidth Doubles the Cost of
    Latency
  • Protocols with Fewer Round-Trips Should be
    Preferred, even if the Messages per Trip are
    Greater Than Corresponding Many-Round Protocols
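The observations above can be made concrete with a simple completion-time model, roughly rounds x RTT + data / bandwidth. This is a minimal sketch (the 20 ms national-scale RTT and the round counts are assumed for illustration, not from the slides):

```python
# Sketch of the latency argument: doubling bandwidth halves only the
# data-transfer term, so round-trip count dominates as links get faster.
def completion_time(rounds: int, rtt_s: float, data_bits: float, bps: float) -> float:
    """Rough protocol completion time: handshake round-trips plus transfer."""
    return rounds * rtt_s + data_bits / bps

rtt = 0.020   # assumed 20 ms national-scale round trip
data = 8e9    # a 1 GB object

for bps in (1e9, 10e9, 100e9):
    chatty = completion_time(rounds=100, rtt_s=rtt, data_bits=data, bps=bps)
    lean = completion_time(rounds=2, rtt_s=rtt, data_bits=data, bps=bps)
    print(f"{bps/1e9:4.0f} Gbps: 100-round {chatty:.2f} s vs 2-round {lean:.2f} s")
# At 100 Gbps the chatty protocol spends ~2.0 s in round trips but only
# ~0.08 s moving data -- fewer round trips win even with bigger messages.
```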

[U.S. map with inter-city link latencies in ms
among Nome, New York, Chicago, LA, Miami, and
Honolulu]
The OptIPuter Difference: What else can we do now
that we have unlimited bandwidth and control the
entire network ourselves, from the hardware and
protocols to the application?
Source: Michael Goodrich, UCI
19
Year Two: Multi-Latency OptIPuter Laboratory --
Metro-Scale Experimental Network
  • Linked UCSD and SDSU
  • Dedication March 4, 2002

UCSD
Linking Control Rooms
44 Miles of Cox Fiber 0.2 ms
SDSU
Cox, Panoram, SAIC, SGI, IBM, TeraBurst
Networks SD Telecom Council
20
Year Two: Multi-Latency OptIPuter Laboratory --
State-Scale Experimental Network
  • Identified at UC Irvine
  • Cluster Lab
  • Campus Optical Fiber
  • CENIC Optical Connections
  • OptIPuter Up End of 2003

NASA Ames?
Source: CENIC
400 Miles 2 ms
USC
UCI
See Phil P. Talk for More
SDSU
UCSD
21
Year Two: Multi-Latency OptIPuter Laboratory --
National-Scale Experimental Network
National Lambda Rail
See Tom DeFanti Talk for More
USC, UCI, UCSD, SDSU
SoCal OptIPuter
2000 Miles, 10 ms -- 1000x Campus Latency
Source: John Silvester, Dave Reese, Tom West -- CENIC
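The latency figures quoted across the laboratory slides (2 mi at 0.01 ms, 44 mi at 0.2 ms, 400 mi at 2 ms, 2000 mi at 10 ms) are consistent with one-way propagation at the vacuum speed of light; signals in fiber travel ~1.5x slower, so real paths are somewhat worse. A back-of-envelope check:

```python
# Reproduce the one-way propagation delays quoted on the laboratory slides.
C_KM_PER_S = 299_792  # speed of light in vacuum, km/s

for miles in (2, 44, 400, 2000):
    km = miles * 1.609
    delay_ms = km / C_KM_PER_S * 1000
    print(f"{miles:5d} miles -> {delay_ms:.2f} ms one-way")
# 2000 miles -> ~10.7 ms, matching the slide's "10 ms, 1000x campus latency".
```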
22
OptIPuter is Studying the Best Application Usage
for Routed vs. Switched Lambdas
  • OptIPuter Evaluating Both
  • Routers
  • Chiaro, Juniper, Cisco, Force10
  • Optical Switches
  • Calient, Glimmerglass
  • UCSD Focusing on Routing Initially
  • UIC Focusing on Switching Initially
  • Next Year Merge into Mixed Optical Fabric

Optical Switch Workshop October 2002
23
UCSD Uses Chiaro Optical Phased Array: Multiple
Parallel Optical Waveguides
See Phil P. Talk for More
[Diagram: output fibers fed by 128 parallel GaAs
waveguides (WG 1 ... WG 128) across an air gap]
24
UIC Calient DiamondWave Switches for StarLight
and NetherLight
  • 3D MEMS structure
  • Bulk MEMS
  • High Density Chips
  • Electrostatic Actuation
  • $800/port at Any Speed
  • 128x128 at StarLight
  • 64x64 at NetherLight

See Tom DeFanti Talk for More
25
UIC Uses a 64x64 Glimmerglass Networks Photonic
Switch for Optical Multicast
[Diagram: replication module fanning one input
across 64 output optical fibers]
See Jason Leigh Talk for More
Credit: Glimmerglass Networks, Inc.
26
OptIPuter Open Source LambdaGrid Software for
Distributed Virtual Computers
Source: Andrew Chien, UCSD -- OptIPuter Software
Architect
27
SIO Uses the Visualization Center to Teach a
Wide Variety of Graduate Classes
  • Geodesy
  • Gravity and Geomagnetism
  • Planetary Physics
  • Radar and Sonar Interferometry
  • Seismology
  • Tectonics
  • Time Series Analysis

Multiple Interactive Views of Seismicity and
Topography Datasets
See John Orcutt Talk for More
28
How Can We Make Scientific Discovery as Engaging
as Video Games?
An OptIPuter Project
Source: Mike Bailey, SDSC
Interactive 3D APPLICATIONS
Underground Earth Sciences
Neurosciences
Anatomy
Geography
29
Outreach Metric, Year One: Invited OptIPuter
Lectures from the Technical Community
  • University Researchers
  • Invited Berkeley CS Colloquium Jan 2003
  • Industry
  • Industrial Research Institute Invited Talk Oct
    2003
  • International Networking
  • NorduNET Invited Talk August 2003
  • Supercomputing
  • SC2003 Panel: SuperNetworking -- Transforming
    Supercomputing, Nov 2003
  • NSF Applications
  • Ocean Research Interactive Observatories Network
    workshop Jan 2004
  • NASA
  • Invited Director's Talks, 2003-2004
  • NIH
  • Invited Panel on Networking, Digital Biology
    Summit Nov 2003
  • Publications
  • CACM Special Issue Nov 2003

Dozens More Listed in Annual Report
30
Building the OptIPuter Team Using a Team-Leader
Approach
  • Data, Visualization, Collaboration Jason
    Leigh-UIC/EVL
  • Bob Grossman-UIC Data
  • Padhraic Smyth-UCI Data
  • Marcus Thiébaux-USC ISI Visualization
  • Graham Kent-UCSD/SIO Earth Sciences
  • Debi Kilb-UCSD/SIO Earth Sciences
  • Atul Nayak-UCSD/SIO Visualization
  • Rob Newman-UCSD/SIO Visualization
  • David Lee-UCSD/NCMIR Medical Imaging
  • Luc Renambot-UIC/EVL Visualization
  • Chaitan Baru-UCSD/SDSC Data
  • System Software Andrew Chien-UCSD
  • Kane Kim-UCI Real-Time
  • Sid Karin-UCSD Security
  • Michael Goodrich-UCI Security
  • Carl Kesselman-USC ISI Globus
  • Valerie Taylor-TAMU Performance

31
Building the OptIPuter Team Using a Team-Leader
Approach
  • Optical Architecture Joe Mambretti-Northwestern
  • Oliver Yu-UIC Optical Architecture
  • George Clapp-Telcordia/SAIC Network Management
  • Routed Infrastructure: Phil Papadopoulos-UCSD/SDSC
  • Greg Hidley-UCSD/Cal-(IT)2 Infrastructure
  • Tom Hutton-UCSD/SDSC Networking
  • Alan Benner-IBM Cluster Networking
  • Switched Infrastructure Tom DeFanti-UIC/EVL
  • Joe Mambretti-NU/iCAIR Optical Architecture
  • Cees de Laat-UAmsterdam Optical Networking
  • High-Performance Protocols Joe Bannister-USC
    ISI
  • Jason Leigh-UIC/EVL UDP and Quanta
  • Ted Faber-USC ISI Protocols
  • Aaron Falk-USC ISI Protocols
  • Eric Coe-USC ISI Protocols
  • Andrew Chien-UCSD System Software
  • Bob Grossman-UIC/LAC Data

32
Building the OptIPuter Team Using a Team-Leader
Approach
  • Medical Imaging Mark Ellisman-UCSD/NCMIR
  • Steve Peltier-UCSD/NCMIR
  • Hiro Hakozaki-UCSD/NCMIR
  • David Lee-UCSD/NCMIR
  • Jason Leigh-UIC/EVL
  • Earth Sciences John Orcutt-UCSD/SIO
  • Graham Kent-UCSD/SIO Earth Sciences
  • Debi Kilb-UCSD/SIO Earth Sciences
  • Atul Nayak-UCSD/SIO Visualization
  • Rob Newman-UCSD/SIO Visualization
  • Mike Bailey-UCSD/SDSC Visualization
  • Eric Frost-SDSU Optically Linked Visualization
  • Brian Davis-USGS Remote Sensing
  • UCSD Education Rozeanne Steckler-UCSD/SDSC
  • Mike Bailey-UCSD/SDSC Visualization
  • Debi Kilb-UCSD/SIO Earth Sciences
  • UIC Education Tom Moher-UIC
  • Debi Kilb-UCSD/SIO Earth Sciences

33
Methods by Which We Build the Team
  • Ad Hoc Meetings Called by Team Leaders
  • Working Visits
  • EVL to NCMIR and SIO
  • Data from NCMIR and SIO to EVL
  • All Hands Meetings
  • February 2003
  • January 2004
  • Topical Meetings
  • Optical Switch Workshop January 2003
  • Optical Signaling Network Management Meeting
    May 2003
  • Bi-Weekly Conference Calls on Technical
    Developments
  • Frontier Advisory Committee Meeting February 2003
  • Iterated Written Architecture or Deployment
    Documents
  • Annual Report, Program Plan, and Site Visit
    Process
  • Seasoned Leaders Who Have Worked Together Before

34
SIGGRAPH 89: Science by Satellite
"What we really have to do is eliminate distance
between individuals who want to interact with
other people and with other computers." -- Larry
Smarr, Director, National Center for
Supercomputing Applications, UIUC
"Using satellite technology...demo of what it
might be like to have high-speed fiber-optic links
between advanced computers in two different
geographic locations." -- Al Gore, Senator; Chair,
US Senate Subcommittee on Science, Technology and
Space
35
SIGGRAPH 92: Showcase -- Science in the 21st Century
"From the atom to the Universe -- it's all here.
Three dozen projects can now come through the
network and appear to be in McCormick Place...
Simulations on remote supercomputers or created
from data gathered from far-away instruments,
these visualizations demonstrate the power of
distributed computing, doing computing where the
resources are and not necessarily on a single
machine." -- Larry Smarr, Director, National
Center for Supercomputing Applications, UIUC
UCSD NCMIR in San Diego
UCSD NCMIR in Chicago
"We have to develop the technology and
techniques -- and the sociology -- to go along
with group activities." -- Sid Karin, Director,
San Diego Supercomputer Center, UCSD
UIC
UCSD National Center for Microscopy and Imaging
Research (NCMIR): http://www-ncmir.ucsd.edu
36
Supercomputing 95: I-WAY, the Information Wide
Area Year
  • I-WAY Featured:
  • Networked Visualization Application
    Demonstrations
  • OC-3 (155 Mbps) Backbone
  • Large-Scale Immersive Displays
  • I-Soft Programming Environment -> Globus

Cellular Semiotics
"One of the reasons we've been working on
virtual-reality technology is because it's an
excellent test for this sort of technology. We
need the supercomputers to give us the realism
and the simulations, and we need the high-speed
networks to give us the feel of telepresence --
of being somewhere else." -- Tom DeFanti,
Director, Electronic Visualization Lab, UIC
"VR is an intelligent user interface into the
whole electronic superhighway. How are people
going to talk to computers in the future?"
-- Maxine Brown, Associate Director, Electronic
Visualization Lab, UIC
http://archive.ncsa.uiuc.edu/General/Training/SC95/GII.HPCC.html
UIC
37
iGrid 2002: September 24-26, 2002, Amsterdam, The
Netherlands
  • Fifteen Countries/Locations Proposing 28
    Demonstrations: Canada, CERN, France, Germany,
    Greece, Italy, Japan, The Netherlands, Singapore,
    Spain, Sweden, Taiwan, United Kingdom, United
    States
  • Applications Demonstrated: Art, Bioinformatics,
    Chemistry, Cosmology, Cultural Heritage,
    Education, High-Definition Media Streaming,
    Manufacturing, Medicine, Neuroscience, Physics,
    Tele-science
  • Grid Technologies: Grid Middleware, Data
    Management/Replication Grids, Visualization
    Grids, Computational Grids, Access Grids, Grid
    Portal

Sponsors: HP, IBM, Cisco, Philips, Level (3),
Glimmerglass, etc.
UIC
www.startap.net/igrid2002