Why Not Store Everything in Main Memory? Why use disks?
(Transcript of a PowerPoint presentation; 29 slides.)


Slide 1

How do we use this theory? For Dot-Product-gap-based clustering, we can hill-climb a_kk (below) to a d that gives us the global maximum variance. Heuristically, higher variance means more prominent gaps.

Maximizing the Variance
Given any table, X(X1, ..., Xn), and any unit vector, d, in n-space, let V(d) be the variance of the dot-product projections {x∘d : x in X}; then grad V(d) = 2Ad for a symmetric n x n matrix A with entries a_ij. These computations are O(C) (C = number of classes) and are effectively instantaneous. Once we have the matrix A, we can hill-climb to obtain a d that maximizes the variance of the dot-product projections of the class means.
FAUST Classifier MVDI (Maximized Variance Definite/Indefinite)
Build a decision tree: 1. each round, find the d that maximizes the variance of the dot-product projections of the class means; 2. apply DI (Definite/Indefinite separation) each round.
FAUST technology relies on: 1. a distance-dominating functional, F; 2. use of gaps in range(F) to separate.
We can separate out the diagonal or not.
For Unsupervised learning (Clustering): hierarchical divisive? piecewise linear? other? Performance analysis (which approach is best for which type of table?).
For Supervised learning (Classification): decision tree? nearest neighbor? piecewise linear? Performance analysis (which is best for which training set?).
Given any d0, one can hill-climb to locally maximize the variance, V, as follows: d1 = grad V(d0) / |grad V(d0)|, d2 = grad V(d1) / |grad V(d1)|, ...
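A minimal sketch of this hill-climb (the matrix A and starting vector below are hypothetical illustrations, not from the slides): since V(d) = d∘(Ad) and grad V(d) = 2Ad, normalizing the gradient each round is power iteration on A, which climbs to the top eigenvector of A, the global maximizer of V over unit vectors.

```python
# Variance hill-climb sketch: V(d) = d.(A d), grad V(d) = 2 A d.
# Normalizing the gradient each round gives power iteration on A.

def matvec(A, d):
    return [sum(a * x for a, x in zip(row, d)) for row in A]

def unitize(v):
    norm = sum(x * x for x in v) ** 0.5
    return [x / norm for x in v]

def variance_hill_climb(A, d0, rounds=50):
    d = unitize(d0)
    for _ in range(rounds):
        d = unitize(matvec(A, d))   # d_{k+1} = grad V(d_k) / |grad V(d_k)|
    V = sum(x * y for x, y in zip(d, matvec(A, d)))  # V(d) = d.(A d)
    return d, V

# Example (made-up A): climbs to the top eigenvector (1, 0), V = 2.
d, V = variance_hill_climb([[2.0, 0.0], [0.0, 1.0]], [1.0, 1.0])
```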
White papers: "Terabyte Head Wall"; "The Only Good Data is Data in Motion". Multilevel pTrees: k = 0, 1 suffices! A PTreeSet is defined by specifying a table, an array of stride lengths (usually equi-length, so just that one length is specified) and a stride predicate (a True/False condition on a stride, a stride being a bag, or array, of bits). So the metadata of PTreeSet(T, sl, sp) specifies T, sl and sp. A raw PTreeSet has sl = 1 and the identity predicate (sl and sp are not used). A cooked PTreeSet (AKA Level-1 PTreeSet) for a table has sl > 1 (its main purpose: provide compact summary information on the table). Let PTS(T) be a raw PTreeSet; then it, plus PTS(T, 64, p), ..., PTS(T, 64^k, p), form a tree of vertical summarizations of T. Note that PTS(T, 64*64, p) is different from PTS(PTS(T, 64, p), 64, p), but both make sense, since PTS(T, 64, p) is a table and PTS(PTS(T, 64, p), 64, p) is just a cooked pTree on it.
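The stride summarization can be sketched as follows (function and predicate names are mine, not from any pTree library): a "cooked" level-1 pTree summarizes each stride of sl bits by one bit, using a stride predicate sp.

```python
# Level-1 ("cooked") pTree sketch: one summary bit per stride of sl bits.

def cooked_ptree(bits, sl, sp):
    """Apply the stride predicate sp to each length-sl stride of bits."""
    return [1 if sp(bits[i:i + sl]) else 0
            for i in range(0, len(bits), sl)]

pure1 = lambda stride: all(stride)   # predicate: stride is all 1s
any1  = lambda stride: any(stride)   # predicate: stride contains a 1

bits = [1, 1, 1, 1, 0, 1, 0, 0]
# with stride 4: pure1 gives [1, 0], any1 gives [1, 1]
```

Stacking two cooked levels, e.g. `cooked_ptree(cooked_ptree(bits, 64, any1), 64, any1)`, corresponds to PTS(PTS(T, 64, p), 64, p), which, as the slide notes, differs from PTS(T, 64*64, p).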
Slide 2

FAUST MVDI on IRIS: 15 records from each class were held out for testing (Virg39 was removed as an outlier).

           Mean (4 attributes)          Definite       Indefinite
s-Mean   50.49 34.74 14.74  2.43    s: (-1, 10)
e-Mean   63.50 30.00 44.00 13.50    e: (23, 48)    s_e: (23, 10) empty
i-Mean   61.00 31.50 55.50 21.50    i: (38, 70)    se_i: (38, 48)

In this case, since the indefinite interval is so narrow, we absorb it into the two definite intervals, resulting in the decision tree.
Slide 3

FAUST MVDI on SatLog: 413 training records, 4 attributes, 6 classes, 127 test records.
[Per-node statistics: for each class mean, the FoMN, Ct, min, max, max+1 of the projections, computed first using class means only and then using the full data (much better!). Gradient hill-climbs of Variance(d) on X and on sub-nodes t25, t257, t75, t13: e.g., on X the climb moves from d = (0, 0, 1, 0) with V(d) = 282 to d = (0.39, 0.84, 0.35, 0.10) with V(d) = 853; on t257 the result is the same using class means or the training subset.]
On the 127-sample SatLog test set: 4 errors, or 96.8% accuracy.
Speed? With horizontal data, DTI is applied one unclassified sample at a time (per execution thread). With this pTree decision tree, we take the entire test set (a PTreeSet), create the various dot-product SPTSs (one for each interior node), and create cut-interval SPTS masks. These masks mask the results for the entire test set.
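A toy sketch of that whole-test-set style (pure Python; the data and names are mine): one dot-product column per tree node, then interval masks classify every sample at once rather than descending per sample.

```python
# Whole-test-set classification sketch: build one projection column per
# tree node, then interval masks select samples in bulk.

def dot_column(rows, d):
    """Dot-product projection of every test row onto direction d."""
    return [sum(x * w for x, w in zip(r, d)) for r in rows]

def interval_mask(col, lo, hi):
    """Bit mask of samples whose projection lies in [lo, hi)."""
    return [1 if lo <= v < hi else 0 for v in col]

rows = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]   # toy test set
d = [1.0, 0.0]                                 # toy node direction
col = dot_column(rows, d)                      # [1.0, 3.0, 5.0]
mask = interval_mask(col, 0.0, 4.0)            # [1, 1, 0]
```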
[Further gradient hill-climbs of Var(d) on t143, t156-161, t146-156 and t127, with their (F, count, min, max, max+1) class statistics. Several nodes were inconclusive both ways; in those cases predict the plurality class. For WINE (min/max+1 statistics shown): awful results!]
Slide 4

FAUST MVDI on Concrete and Seeds.
Concrete: d0 = (-0.34, -0.16, 0.81, -0.45); 7 test errors / 30 = 77% accuracy.
Seeds: d0 = (.97, .17, -.02, .15); 8 test errors / 32 = 75% accuracy.
[The slide shows the per-node (min, max+1) training intervals and test counts for the low/medium/high (l/m/h) classes along each branch of the two decision trees.]
Slide 5

FAUST Classifier
1. Cut in the middle of the Vector of Medians (VOM), not the means. Use the stdev ratio, rather than the middle, for even better cut placement?
2. Cut in the middle of [Max(r∘d), Min(v∘d)] (assuming max r∘d ≤ min v∘d). If there is no gap, move the cut to minimize R-errors + V-errors.
3. Hill-climb d to maximize the gap, or to minimize training-set errors, or (simplest) to minimize dis(max r∘d, min v∘d).
4. Replace mR, mV with the average of the margin points?
If y∘d falls in P_R or P_V, the classification is definite; else re-do on the indefinite region, Cut_R ≤ x∘d ≤ Cut_V, until an actual gap appears (ANDed with some stopping condition, e.g., "On the nth round, use definite only: cut at midpt(mR, mV)").
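A tiny sketch of step 2's cut placement (pure Python; the data and names are illustrative, following the slide's maxRod = max of R's projections, minVod = min of V's):

```python
# Cut placement between two classes R and V on the projection line.

def cut_point(r_proj, v_proj):
    """Midpoint cut; assumes R projects below V (max R <= min V)."""
    max_rod, min_vod = max(r_proj), min(v_proj)
    gap = min_vod - max_rod          # gap <= 0 means the classes overlap
    cut = (max_rod + min_vod) / 2.0  # cut in the middle of the gap
    return cut, gap

cut, gap = cut_point([1.0, 2.0, 3.0], [5.0, 6.0])
# cut = 4.0, gap = 2.0; with gap <= 0 one would instead move the cut
# to minimize R-errors + V-errors, as in step 2 above.
```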
Another way to view FAUST DI is as a decision-tree method: with each non-empty indefinite set, descend down the tree to a new level; for each definite set, terminate the descent and make the classification.
Each round, it may be advisable to go through an outlier-removal process on each class before setting Min(v∘d) and Max(r∘d) (e.g., iteratively check whether F^-1(Min(v∘d)) consists of V-outliers).
[Scatter plot in dims 1-2 of the r and v points, with mR and mV marked.]
Slide 6

FAUST DI: K-class training set, TK, and a given d (e.g., from D = Mean(TK) → Median(TK)).
Let mi = mean(Ci), ordered so that d∘m1 ≤ d∘m2 ≤ ... ≤ d∘mK, and let
Mn_i = Min(d∘Ci), Mx_i = Max(d∘Ci), Mn_{>i} = Min_{j>i} Mn_j, Mx_{<i} = Max_{j<i} Mx_j.
Definite_i = ( Mx_{<i}, Mn_{>i} )
Indefinite_{i,i+1} = [ Mn_{>i}, Mx_{<i+1} ]
Then recurse on each Indefinite.
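One round of this interval construction can be sketched as follows (pure Python; assumes the classes are already ordered by projected mean, and the example data is mine):

```python
# One round of FAUST DI: Definite_i = (Mx_{<i}, Mn_{>i}) and
# Indefinite_{i,i+1} = [Mn_{>i}, Mx_{<i+1}], computed from the
# per-class projection lists projs[i] = [d.x for x in class i].

def di_intervals(projs):
    k = len(projs)
    mn = [min(p) for p in projs]     # Mn_i
    mx = [max(p) for p in projs]     # Mx_i
    definite, indefinite = [], []
    for i in range(k):
        lo = max(mx[:i]) if i > 0 else float('-inf')        # Mx_{<i}
        hi = min(mn[i + 1:]) if i < k - 1 else float('inf')  # Mn_{>i}
        definite.append((lo, hi))
        if i < k - 1:
            ind_lo = min(mn[i + 1:])     # Mn_{>i}
            ind_hi = max(mx[:i + 1])     # Mx_{<i+1}
            if ind_lo <= ind_hi:         # keep only non-empty overlaps
                indefinite.append((ind_lo, ind_hi))
    return definite, indefinite

definite, indefinite = di_intervals([[0.0, 1.0, 2.0], [1.5, 3.0, 4.0], [5.0, 6.0]])
# definite[1] = (2.0, 5.0); indefinite = [(1.5, 2.0)]
```

The recursion then re-runs the same construction (with a fresh d) on the points falling in each indefinite interval.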
For IRIS, 15 records were extracted from each class for testing; the rest are the training set, TK. D = MEANs → MEANe.

           Mean (4 attributes)          Definite        Indefinite
s-Mean   50.49 34.74 14.74  2.43    s: (-1, 25)
e-Mean   63.50 30.00 44.00 13.50    e: (10, 37)     se: (25, 10) empty
i-Mean   61.00 31.50 55.50 21.50    i: (48, 128)    ei: (37, 48)
1st round, D = Means → Meane:
  F < 18 → setosa (35 seto)
  18 < F < 37 → versicolor (15 vers)
  37 ≤ F ≤ 48 → IndefiniteSet2 (20 vers, 10 virg)
  48 < F → virginica (25 virg)
IndefSet2 round, D = Meane → Meani:
  F < 7 → versicolor (17 vers, 0 virg)
  7 ≤ F ≤ 10 → IndefSet3 (3 vers, 5 virg)
  10 < F → virginica (0 vers, 5 virg)
IndefSet3 round, D = Meane → Meani:
  F < 3 → versicolor (2 vers, 0 virg)
  3 ≤ F ≤ 7 → IndefSet4 (2 vers, 1 virg); here we assign 0 ≤ F ≤ 7 → versicolor
  7 < F → virginica (0 vers, 3 virg)

Test:
1st round, D = Means → Meane:
  F < 15 → setosa (15 seto)
  15 < F < 15 → versicolor (0 vers, 0 virg) [empty]
  15 ≤ F ≤ 41 → IndefiniteSet2 (15 vers, 1 virg)
  41 < F → virginica (14 virg)
IndefSet2 round, D = Meane → Meani:
  F < 20 → versicolor (15 vers, 0 virg)
  20 < F → virginica (0 vers, 1 virg)
100% accuracy.
Option-1: The sequence of D's is Mean(Class_k) → Mean(Class_{k+1}), k = 1, ... (and Mean could be replaced by VOM, or ...?)
Option-2: The sequence of D's is Mean(Class_k) → Mean(∪_{h=k+1..n} Class_h), k = 1, ... (and Mean could be replaced by VOM, or ...?)
Option-3: D sequence: Mean(Class_k) → Mean(∪_{h not used yet} Class_h), where k is the class with max count in the subcluster (VoM instead?)
Option-2 (restated): D sequence: Mean(Class_k) → Mean(∪_{h=k+1..n} Class_h) (VOM?), where k is the class with max count in the subcluster.
Option-4: D sequence: always pick the means pair which are furthest separated from each other.
Option-5: D: start with Median-to-Mean of the IndefiniteSet, then the means pair corresponding to max separation of F(mean_i), F(mean_j).
Option-6: D: always use Median-to-Mean of the IndefiniteSet, IS (initially, IS = X).
Slide 7

FAUST DI, sequential. For SEEDS, 15 records were extracted from each class for testing.

Option-4: means pair most separated in X. m1 = (14.4, 5.6, 2.7, 5.1), m2 = (18.6, 6.2, 3.7, 6.0), m3 = (11.8, 5.0, 4.7, 5.0), with d(m1,m2) = 4.4, d(m1,m3) = 3.4, d(m2,m3) = 7.0. The resulting cut puts essentially the whole training set into one interval, 0 ≤ F ≤ 106, so it is totally non-productive!

Option-6: D = Median-to-Mean of the IndefSet (initially IS = X). Projected means: meanF1 = 37.3, meanF2 = 71.2, meanF3 = 2.0. On the whole training set, the definite intervals are pure: def3 = (-inf, 21) holds 32 class-3 records, def1 = [28, 49) holds 22 class-1 records, def2 = [58, inf) holds 30 class-2 records; the indefinite intervals are ind1 = [21, 28) and ind2 = [49, 58).
Slide 8

FAUST DI, sequential (continued). For SEEDS (15 test records per class), Option-6 with D = Median-to-Mean of X gives the same intervals on the whole training set: def3 = (-inf, 21), def1 = [28, 49), def2 = [58, inf), with indefinite sets ind31 = [21, 28) and ind12 = [49, 58).
Recursing: (-inf, 21) → class 3; [28, 49) → class 2; [58, inf) → class 3 with d = (.9, -.1, -.2, -.2); [21, 28) → ind31 with d = (-.9, -.1, .14, -.1); [49, 58) → ind12 with d = (0, .31, -.9, 0); then (-inf, 18) is definite and [49, 58) becomes ind23.
Slide 9

FAUST CLUSTERING
Use DPP_d(x) = x∘d, but which unit vector, d, provides the best gap(s)?
1. Exhaustively search a grid of d's for the best gap provider.
2. Use some heuristic to choose a good d:
GV: Gradient-optimized Variance. MM: use the d that maximizes Median(F(X)) - Mean(F(X)); we have Avg as a function of d, but Median? (Can you do it?) HMM: use a heuristic for Median(F(X)): F(VectorOfMedians) = VOM∘d. MVM: use D = MEAN(X) → VOM(X), d = D/|D|.
Maximize variance: is it wise?
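The gap heuristic itself, split X at the widest gap in the projections, can be sketched as follows (pure Python; the example data is mine):

```python
# Split a point set at the widest gap in its projection values F(X) = X.d
# (the "dot product gap" heuristic); returns the cut value and the index
# lists of the two resulting clusters.

def largest_gap_split(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    gaps = [(values[order[j + 1]] - values[order[j]], j)
            for j in range(len(order) - 1)]
    width, j = max(gaps)                       # widest consecutive gap
    cut = (values[order[j]] + values[order[j + 1]]) / 2.0
    left = order[:j + 1]                       # indices below the cut
    right = order[j + 1:]                      # indices above the cut
    return cut, left, right

cut, left, right = largest_gap_split([1.0, 1.2, 8.0, 8.3, 1.1])
# widest gap is between 1.2 and 8.0, so the cut lands near 4.6
```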
[Worked example: eight columns of eleven projection values, with their std (2.13 ... 4.98), variance (4.5 ... 24.8), Avg, consecutive differences, avgCD (1.00 for every column), maxCD (ranging from 1 to 10) and mean-VOM. All columns have the same average consecutive difference but very different variances and max gaps, which is the point of the "is maximizing variance wise?" question.]
Slide 10

FAUST Clustering, simple example: G_d(x) = x∘d, F_d(x) = G_d(x) - Min(G), on a dataset X of 15 image points z1, ..., z9, za, ..., zf with coordinates (x1, x2) on an 11 x 15 grid.
[Coordinate table and scatter plot of the 15 points.]
The 15 Value_Arrays (one for each q = z1, z2, z3, ...):
z1: 0 1 2 5 6 10 11 12 14
z2: 0 1 2 5 6 10 11 12 14
z3: 0 1 2 5 6 10 11 12 14
z4: 0 1 3 6 10 11 12 14
z5: 0 1 2 3 5 6 10 11 12 14
z6: 0 1 2 3 7 8 9 10
z7: 0 1 2 3 4 6 9 11 12
z8: 0 1 2 3 4 6 9 11 12
z9: 0 1 2 3 4 6 7 10 12 13
za: 0 1 2 3 4 5 7 11 12 13
zb: 0 1 2 3 4 6 8 10 11 12
zc: 0 1 2 3 5 6 7 8 9 11 12 13
zd: 0 1 2 3 7 8 9 10
ze: 0 1 2 3 5 7 9 11 12 13
zf: 0 1 3 5 6 7 8 9 10 11

The 15 Count_Arrays:
z1: 2 2 4 1 1 1 1 2 1
z2: 2 2 4 1 1 1 1 2 1
z3: 1 5 2 1 1 1 1 2 1
z4: 2 4 2 2 1 1 2 1
z5: 2 2 3 1 1 1 1 1 2 1
z6: 2 1 1 1 1 3 3 3
z7: 1 4 1 3 1 1 1 2 1
z8: 1 2 3 1 3 1 1 2 1
z9: 2 1 1 2 1 3 1 1 2 1
za: 2 1 1 1 1 1 4 1 1 2
zb: 1 2 1 1 3 2 1 1 1 2
zc: 1 1 1 2 2 1 1 1 1 1 1 2
zd: 3 3 3 1 1 1 1 2
ze: 1 1 2 1 3 2 1 1 2 1
zf: 1 2 1 1 2 1 2 2 2 1
Gaps: between F = 6 and F = 10, and between F = 2 and F = 5.
pTree masks of the 3 z1-clusters (obtained by ORing):
z12  0 0 0 0 0 1 0 0 0 0 0 0 0 0 1
z11  0 0 0 0 0 0 1 1 1 1 1 1 1 1 0
z13  1 1 1 1 1 0 0 0 0 0 0 0 0 0 0
Slide 11

What have we learned? What is the DPPd FAUST CLUSTER algorithm?
D = Median → Mean, d1 = D/|D| is a good start; but first, Variance-Gradient hill-climb it. (Median means the Vector of Medians.)
For X2 = SubCluster2, use a d2 which is perpendicular to d1? In high dimensions there are many perpendicular directions. GV hill-climb d2 = D2/|D2| (D2 = Median(X2) - Mean(X2)) constrained to be perpendicular to d1, i.e., constrained to d2∘d1 = 0 (in addition to d2∘d2 = 1). We may not want to constrain this second hill-climb to unit vectors perpendicular to d1: the gap might get wider using a d2 which is not perpendicular to d1.
GMP: Gradient hill-climb (wrt d) of Variance(DPPd), starting at d2 = Unitized( VoM{x - (x∘d1)d1 : x in X2} - Mean{x - (x∘d1)d1 : x in X2} ), Variance-Gradient hill-climbed subject only to d∘d = 1. (We shouldn't constrain the 2nd hill-climb to d1∘d2 = 0, nor subsequent hill-climbs to dk∘dh = 0, h = 2, ..., k-1, since the gap could be larger. So the 2nd round starts at that d2 and hill-climbs subject only to d∘d = 1.)
GCCP: Gradient hill-climb (wrt d) of Variance(DPPd), starting at d2 = D2/|D2| where D2 = CCi(X2) - CCj(X2), hill-climbed subject to d∘d = 1, where the CCs are two of the circumscribing rectangle's corners (the CCs may be faster to calculate than Mean and VoM). Taking all edges and diagonals of CCR(X) (the Coordinate-wise Circumscribing Rectangle of X) provides a grid of unit vectors. It is an equi-spaced grid iff we use a CCC(X) (Coordinate-wise Circumscribing Cube of X). Note that there may be many CCC(X)s; a canonical one is the one that is furthest from the origin (take the longest side first, then extend each other side the same distance from the origin side of that edge). A good choice may be to always take the longest side of CR(X) as D, D = LS(CR(X)). Should outliers on the (n-1)-dim faces at the ends of LS(CR(X)) be removed first? If so, remove all LS(CR(X))-endface outliers until, after removal, the same side is still the LS(CR(X)); then use that LS(CR(X)) as D.
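The perpendicular-start construction (project the seed direction D2 off d1, then unitize) can be sketched as follows (pure Python; the example vectors are mine):

```python
# Build a unit vector perpendicular to d1 from a seed direction D2
# (e.g., D2 = Median(X2) - Mean(X2)): subtract D2's component along d1,
# then unitize, so that d2.d1 = 0 and d2.d2 = 1.

def perp_unit(D2, d1):
    proj = sum(a * b for a, b in zip(D2, d1))        # D2 . d1
    residual = [a - proj * b for a, b in zip(D2, d1)]  # remove d1 component
    norm = sum(x * x for x in residual) ** 0.5
    return [x / norm for x in residual]

d2 = perp_unit([1.0, 1.0, 0.0], [1.0, 0.0, 0.0])
# d2 = [0.0, 1.0, 0.0], and d2 . d1 = 0
```

As the slide argues, one may prefer to use this d2 only as the starting point and then hill-climb subject to d∘d = 1 alone, since dropping the perpendicularity constraint can widen the gap.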
Slide 12

WINE: GV, MVM and GM runs.

ACCURACY   WINE
GV         62.7
MVM        66.7
GM         81.3

[The rest of the slide is per-node output: hill-climbed d vectors with their variances (e.g., (.11, .19, .96, .19) with V = 209 climbing to (-.02, .41, .91, 0) with V = 232), flattened (F-MN, count, gap) histograms for cluster nodes C1 through C766, and low/high class counts at each leaf of the cluster tree (e.g., "38L 68H C7", "51L 83H C1").]
Slide 13

SEEDS: GV, MVM and GM runs.

ACCURACY   SEEDS   WINE
GV         94      62.7
MVM        93.3    66.7
GM         96      81.3

[The rest of the slide is per-node output: hill-climb traces of d with V(d), (F-MN, count, gap) histograms, and interval assignments with per-class counts (k, r, c), e.g., "[29,38): 18k 0r 0c → C4".]
Slide 14

IRIS: GV, MVM and GM runs.

ACCURACY   IRIS   SEEDS   WINE
GV         82.7   94      62.7
MVM        94     93.3    66.7
GM         94.7   96      81.3

[The rest of the slide is per-node output: hill-climb traces, (F-MN) gap histograms, and leaf counts by class (s = setosa, e = versicolor, i = virginica), e.g., "50s 1i C1", "27e 16i C23".]
Slide 15

CONCRETE: GV, MVM and GM runs.

ACCURACY   CONCRETE   IRIS   SEEDS   WINE
GV         76         82.7   94      62.7
MVM        78.8       94     93.3    66.7
GM         83         94.7   96      81.3

[The rest of the slide is per-node output: hill-climb traces, gap histograms, and leaf counts over the low/medium/high classes, e.g., "C1 43L 33M 55H", "C213 4L 7M 38H".]
Slide 16

ABALONE: GV, MVM and GM runs.

ACCURACY   CONCRETE   IRIS   SEEDS   WINE   ABALONE
GV         76         83     94      63     73
MVM        79         94     93      67     79
GM         83         95     96      81     81

[The rest of the slide is per-node output: hill-climb traces, gap histograms, and leaf counts over the L/M/H classes, e.g., "30L 85M 12H C1", "17L 78M 9H C111".]
17
gp 1, Ct 8, C16
KOS blogs: MVM with d = the unit STD vector, flagging every gap > 6 x (average gap). Some of the flagged outliers are substantial.
[Table residue: the tail of the F-ordering (ranks 3364-3430 of the KOS blog documents), one row per document: rank, doc id, F value, gap to the previous F value, and a 0/1 flag marking gaps > 6 x (average gap). F runs from 185.38 (doc 1804) up to 364.01 (doc 2852); the largest gaps (22.66, 24.61, 28.18) isolate outlier documents 232, 1832, and 2852.]
[Table residue: two gap listings for KOS blogs. First (avg gap 0.0085, threshold 6 x avg): columns ROW, doc, F, GAP, CT, from row 1 (doc 1791, F 0.2270) through row 3392 (doc 1096, F 15.459). Second (avg gap 0.1, max gap 28.2, gap threshold 0.6, 64 gaps flagged): columns Row, doc, F, Gap, flag, showing the rows around each flagged gap, e.g., row 9 (doc 2397, gap 0.65) through row 3356 (doc 1396, gap 1.82).]
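The 0/1 flag column in these listings marks every consecutive gap larger than 6 x the average gap (or an absolute cutoff such as 0.6). A minimal sketch of that flagging, with hypothetical names:

```python
import numpy as np

def flag_large_gaps(F, factor=6.0):
    """Sort projection values F and flag each consecutive gap exceeding
    factor * (average gap), mirroring the GAP / 0-1 flag columns above."""
    Fs = np.sort(np.asarray(F, dtype=float))
    gaps = np.diff(Fs)
    return gaps, gaps > factor * gaps.mean()

# A uniform run of values followed by one huge jump.
gaps, flags = flag_large_gaps(list(range(10)) + [100])
print(flags.tolist())   # → nine False, then one True
```

Each flagged gap closes one cluster (or isolates an outlier when the run between flags has a single member), which is how the cluster list below is produced.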
Clusters cut at each flagged gap (gap size, member count):
gap .65  Ct 9     C1
gap .6   Ct 2613  C2
gap .75  Ct 502   C3
gap .6   Ct 87    C4
gap .61  Ct 30    C5
gap .73  Ct 45    C6
gap .89  Ct 8     C7
gap .65  Ct 8     C8
gap .72  Ct 11    C9
gap .65  Ct 1     outlier
gap .61  Ct 12    C11
gap 1.2  Ct 6     C12
gap 1.1  Ct 11    C13
gap .67  Ct 1     outlier
gap 1.1  Ct 3     C15
gap 1.8  Ct 4     C16
gap 1.8  Ct 5     outliers
18
GV using a grid (Unitized Corners of the Unit Cube); Diagonal of the Variance Matrix; Mean-to-