1
ASSESSMENT OF TECHNICAL EFFICIENCY OF INDIAN
B-SCHOOLS A COMPARISON BETWEEN THE
CROSS-SECTIONAL AND TIME-SERIES ANALYSIS
  • Presented by
  • Gokulananda Patel
  • Birla Institute of Management Technology
  • India
  • gn.patel_at_bimtech.ac.in

2
Brief Introduction
  • PRE-liberalization - the government had a very
    vital role in the development of the Indian
    economy.
  • Most of the industries were in the hands of the
    Government; only a handful were open to the
    private sector. As these private-sector units
    were operating in a monopolistic environment,
    they remained profitable despite producing
    low-quality products and providing poor customer
    service.

3
Brief Introduction
  • POST-liberalization - the economy was opened to
    private and foreign companies.
  • Entry of multinationals - large-scale operations,
    optimum capacity utilization, accompanied by
    massive advertising and effective sales
    promotions, backed by quality products and good
    customer relationships.
  • This has compelled the Indian corporate sector to
    put more stress on the quality of products,
    reduction in the cost of production, better
    after-sales service, adoption of modern
    technology, engagement of efficient human
    resources and provision of a fair rate of return
    to investors.
  • Against this background the responsibilities of
    management institutes have gone up manifold,
    which has necessitated the efficiency measurement
    of these schools.

4
Some Indian magazines conducting B-School rankings
  • Outlook - Cfore
  • Business World - COSMODE
  • Indian Management - IMRB
  • The parameters covered are
  • Intellectual capital, infrastructure, admissions,
    placements, research and publications, etc.
  • Why these rankings are important
  • They enhance the prestige of individual MBA
    programmes.
  • They influence the starting salary of students
    (placement performance).
  • Stakeholders may use these rankings to choose a
    school.

5
Ranking of B-Schools may differ depending on the
target audience
  • The ranking criteria of students may be different
    from the criteria of employers.
  • For a programme administrator, all the criteria
    considered by students and employers may be
    important.

6
Ranking of B-Schools may differ depending on the
target audience (Contd.)
  • Students may be more interested in the placement
    performance of the school, the average salary at
    initial placement, etc.
  • Employers may be more interested in the
    intellectual capital, infrastructure facilities,
    etc.

7
Comparison of Outlook-Cfore Survey and Business
World-COSMODE Survey
Outlook-Cfore Survey (Parameter - Weightage)
1 Placement Performance - 24
2 Intellectual Capital and Faculty - 15
3 Industry Interface - 14
4 Infrastructure Facilities - 13
5 Perception Score - 20
6 Extra Curricular Activities - 5
7 International Linkages - 9
Business World-COSMODE Survey (Parameter - Weightage)
1 Infrastructure, both physical and academic - 100 (10)
2 Faculty, Research Consultancy, MDP and Other Programmes - 250 (25)
3 Admission System, Curriculum and Delivery - 250 (25)
4 Networking and Innovation - 100 (10)
5 Placement and Industry Interface - 300 (30)
8
Interpretation from Comparison
  • On comparison of the above table it is clear
    that
  • both surveys use different sets of parameters
    and assign different weights to them.
  • The weight given to placement performance and
    industry interface by the Cfore survey is 38,
    whereas the weight given to these parameters by
    COSMODE is 30.
  • Similarly, the sub-parameters considered under
    Intellectual Capital and Faculty and
    Extra-Curricular Activities in the Cfore survey
    are the same as the sub-parameters considered
    under Faculty, Research Consultancy, MDP and
    Other Programmes in COSMODE;
  • the weightage given by the Cfore survey to this
    parameter is 20, whereas the weightage given by
    COSMODE is 25.

9
Interpretation from Comparison (Contd.)
  • The methodology of converting raw data into
    scores on the various parameters also varies
    across these two surveys.
  • The COSMODE survey uses both absolute and
    relative scales to assign scores, depending on
    the nature of the parameter. Wherever an absolute
    score is used it is based on slabs or intervals;
    the relative score is assigned based on
    comparison with the highest in the category.
  • In the Cfore survey the marks in a particular
    category are allotted by normalizing against the
    topper in that category.
  • The number of institutes participating in these
    two surveys also varies: the Cfore survey invited
    950 B-Schools, of which 236 participated, whereas
    for COSMODE 384 B-Schools were given the
    questionnaire, of which 130 submitted completely
    filled-in questionnaires, and 100 B-Schools were
    ranked.

10
Comparison of Ranks
Sl.No. Name of the Institute Rank (COSMODE) Rank (Cfore) Absolute Deviation(d)
1 Indian Institute of Management, Ahmedabad 1 1 0
2 Indian Institute of Management, Bangalore 2 3 1
3 Management Development Institute, Gurgaon 3 4 1
4 Indian Institute of Management, Calcutta 4 2 2
5 Xavier Labour Relations Institute, Jamshedpur 5 6 1
6 Indian Institute of Foreign Trade, New Delhi 6 9 3
7 National Institute Of Industrial Engineering, Mumbai 7 7 0
8 Institute of Rural Management, Anand 8 - -
9 S P Jain Institute, Mumbai 9 8 1
10 Xavier Institute Of Management, Bhubaneswar 10 12 2
11
Sub-parameters and their Weightages for CFORE Survey
Parameters Sub-parameters Marks Total Marks Percentage
Placement performance Percentage of student placed 55 385 24
Placement performance Median salary 55 385 24
Placement performance Maximum salary 55 385 24
Placement performance Minimum salary 55 385 24
Placement performance Percentage of students placed abroad 55 385 24
Placement performance Max salary (abroad jobs) 55 385 24
Placement performance ROI index (AvgMedMin Salary/Fee) 55 385 24
Intellectual capital and faculty Books 35 250 15
Intellectual capital and faculty Research papers 70 250 15
Intellectual capital and faculty Cases 35 250 15
Intellectual capital and faculty Journal 20 250 15
Intellectual capital and faculty Faculty/student ratio 20 250 15
Intellectual capital and faculty Teaching experience of Faculty 15 250 15
Intellectual capital and faculty Corporate experience of Faculty/Students 25 250 15
Intellectual capital and faculty PhD/students ratio 15 250 15
Intellectual capital and faculty Faculty with Ph.D.(abroad) 15 250 15
12
Sub-parameters and their Weightages for CFORE
Survey (Contd.)
Parameters Sub-parameters Marks Total Marks Percentage
Industry interface Revenue from consultancy 50 230 14
Industry interface Revenue from MDP 100 230 14
Industry interface Seminars 40 230 14
Industry interface Workshops 40 230 14
Infrastructure and facilities Area (In Acres) 25 205 13
Infrastructure and facilities Built-Up Area 40 205 13
Infrastructure and facilities Computers (PII) Per Batch 35 205 13
Infrastructure and facilities Amphitheatre Class Room 15 205 13
Infrastructure and facilities Library Books 25 205 13
Infrastructure and facilities Electronic Database 25 205 13
Infrastructure and facilities Residential Facilities 15 205 13
Infrastructure and facilities Single Occupancy Room 15 205 13
Infrastructure and facilities MDP Hostel 10 205 13
Perception score Perception score 320 320 20
Extra-curricular activities National-level events organized 40 80 5
Extra-curricular activities Awards won by students 40 80 5
International linkages Student Exchange Programme 25 150 9
International linkages Faculty Exchange Programme 25 150 9
International linkages Students actually exchanged 50 150 9
International linkages Faculty actually exchanged 50 150 9
TOTAL 1620 100
13
Objective Ranking from Students', Employers' and
Programme Administrators' Perspectives
  • The Cfore ranking uses surveys of students,
    employers and programme administrators to measure
    their combined perceptions of the quality of MBA
    programmes. We develop objective rankings of
    B-Schools that address the interests of students,
    employers and MBA programme administrators
    separately.
  • Students - PP, SAL and FEE (29 schools lost
    rank; 5 lost within 5 positions; ICFAI Hyderabad
    lost 20 positions; three schools maintained their
    rank).
  • Employers - IC, IF, ECA, IL, RS (23 schools lost
    their rank, out of which 19 lost within 5
    positions).
  • Programme administrators - all parameters (44
    schools lost their rank).
  • So we want an alternative ranking method that is
    capable of handling the complexities involved in
    ranking B-Schools; this demands a methodology
    that is mathematically robust.

14
Berlin Principles, International Rankings Expert
Group (IREG), 2006 - Highlights
  • Rankings are effective techniques for assessing
    higher-education inputs, processes and outputs.
  • The ranking process should recognize the
    diversity among institutions and take the
    divergent goals and missions of the institutions
    into consideration.
  • Authenticity of databases.
  • The process should take into account the
    linguistic, cultural, economic and historical
    contexts of the educational systems being ranked.
  • There should be transparency about the
    methodology used for creating the rankings.
  • Weights should be assigned to various indicators.

15
Choosing The Suitable Technique
  • Analytical Hierarchy Process (AHP)
  • Bayesian Latent Variable Model
  • Data Envelopment Analysis (DEA)

16
Choosing The Suitable Technique contd
  • AHP helps capture both subjective and objective
    evaluation measures, providing a useful mechanism
    for checking the consistency of the evaluation
    measures and alternatives suggested by the team,
    thus reducing bias in decision-making.
  • Though AHP has many advantages, it has some
    limitations which make the method difficult to
    apply to B-School ranking.
  • The B-School ranking problem involves a large
    number of units (B-Schools) and a wide range of
    parameters; such a situation may make pairwise
    comparison difficult.
  • AHP assumes linear independence of criteria and
    alternatives. If there is dependence among the
    criteria, the Analytic Network Process (ANP) is
    more appropriate, yet ANP requires far more
    comparisons, which may be formidable in a
    practical decision environment (Ozden and Birsen,
    2005).
  • Other drawbacks are the highly subjective nature
    of the preference weights and the inconsistencies
    in preferences between objectives that sometimes
    arise (Qureshi and Harrison, 2003).

17
Choosing The Suitable Technique contd
  • Not many applications of the Bayesian latent
    variable model are found in the literature.
  • The model works in two steps: first it calculates
    the relative importance of the parameters using
    information embedded in the data, and then it
    simultaneously determines the degree of
    uncertainty that surrounds the ranks.
  • The method considers the variability present in
    the data and adjusts accordingly. If an
    institution is performing well in all the
    parameters then clearly it is ranked above all
    other institutes.
  • Though the method has some merits, applying it
    makes it difficult to distinguish the performance
    of public and private institutions, especially in
    the Indian context, where public institutions
    receive a lot of Government funding and are
    expected to do very well in certain inputs like
    infrastructure, faculty strength, etc.

18
Choosing The Suitable Technique
contd
  • Data Envelopment Analysis, on the other hand, is
    a non-parametric method based on the application
    of a mathematical technique called linear
    programming. It has been successfully employed
    for assessing the relative efficiency of a set of
    firms which use a variety of identical inputs to
    produce a variety of identical outputs. Unlike
    statistical methods of performance analysis, DEA
    is non-parametric in the sense that it does not
    require an assumption of a functional form
    relating inputs to outputs.
  • We choose Data Envelopment Analysis for ranking
    the Indian B-Schools (a minimal LP sketch is
    given below).
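
To make the linear-programming idea concrete, here is a minimal sketch, not the authors' code, of an output-oriented variable-returns-to-scale (BCC) DEA model solved with scipy.optimize.linprog. The function name bcc_output_efficiency, the toy data and the convention of reporting 1/phi as the technical efficiency score are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of an output-oriented VRS (BCC)
# DEA model solved as a linear programme with scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

def bcc_output_efficiency(X, Y, o):
    """X: inputs (m x n DMUs), Y: outputs (s x n DMUs), o: DMU under
    evaluation.  Returns 1/phi, so 1.0 means technically efficient."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: z = [phi, lambda_1, ..., lambda_n]; maximise phi.
    c = np.concatenate(([-1.0], np.zeros(n)))
    # Inputs:  sum_j lambda_j * x_ij <= x_io
    A_in = np.hstack([np.zeros((m, 1)), X])
    # Outputs: phi * y_ro - sum_j lambda_j * y_rj <= 0
    A_out = np.hstack([Y[:, [o]], -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([X[:, o], np.zeros(s)])
    # VRS (BCC) convexity constraint: sum_j lambda_j = 1
    A_eq = np.concatenate(([0.0], np.ones(n))).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return 1.0 / res.x[0]

# Toy example: 3 inputs and 2 outputs for 5 hypothetical B-Schools.
X = np.array([[20., 30., 40., 25., 35.],
              [10., 12., 15., 11., 14.],
              [ 5.,  6.,  9.,  7.,  8.]])
Y = np.array([[90., 80., 60., 85., 70.],
              [50., 45., 30., 48., 40.]])
print([round(bcc_output_efficiency(X, Y, o), 3) for o in range(X.shape[1])])
```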

19
Research Objectives
  • The present study focuses on ranking some Indian
    B-Schools based on their efficiency scores,
  • identifying benchmark institutions and discussing
    improvement areas for inefficient institutions.
  • The study is also extended by considering the
    performance of the B-Schools across the years.
  • A sensitivity analysis is done to assess the
    robustness of the model.

20
Data Set
  • For our study we have considered the surveys
    conducted by one of the popular Indian magazines,
    Outlook.
  • The data are collected for two consecutive years,
    i.e. 2004 and 2005.
  • These surveys are conducted to rank the top
    B-Schools in India based on their performance for
    the corresponding years.
  • The data are compiled from various sources
    (Outlook, 2004; B-School Directory, 2005;
    Business School Directory, 2005;
    www.bschools.com). We have taken the top
    twenty-eight Indian B-Schools for analysis.

21
Data Classification and Reversal for DEA
Applications
  • For our DEA analysis the data are classified into
    two categories, viz. inputs and outputs. The
    criteria for selection of inputs and outputs are
    quite subjective; there is no specific rule
    determining the procedure for selecting inputs
    and outputs (Ramanathan, 2001). The parameters
    used for the analysis are shown below.

22
Input Output Classification
Input Output
X1 IC Y1 II
X2 IF Y2 PP
X3 FEE Y3 IL
Y4 RS
Y5 SS
Y6 FS
Y7 ECA
Y8 SAL
23
Data Reversal
  • For applying DEA we have reversed two of the
    inputs, IC and IF.
  • The total scores for intellectual capital (IC)
    and infrastructure and facilities (IF) are 250
    and 200 respectively.
  • These scores are not directly taken for the DEA
    analysis, as a higher score for IC or IF means
    the school has developed more infrastructure,
    facilities and intellectual capital, which is
    desirable.
  • If we used the score directly in the model, a
    higher value would be reflected as usage of more
    input for producing the desired output, which is
    contradictory.
  • So for the DEA analysis the complement of the
    score from the total is used, i.e. if an
    institute scored 230 out of 250 on an input
    variable then the input value is taken as 20 (a
    small sketch follows).
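
A tiny sketch of the reversal described above; the raw scores are hypothetical, and the totals are the ones quoted on this slide.

```python
# Score reversal before feeding inputs to DEA (illustrative values only).
totals = {"IC": 250, "IF": 200}          # maximum attainable scores
raw    = {"IC": 230, "IF": 150}          # hypothetical survey scores
inputs_for_dea = {k: totals[k] - raw[k] for k in totals}
print(inputs_for_dea)                    # {'IC': 20, 'IF': 50}
```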

24
Analysis and Results
  • The general output-oriented BCC DEA model is used
    to solve the problem and obtain the efficiency
    scores (the formulation is sketched below).
  • The results of the DEA analysis are shown in
    Table-I. The 1st column of Table-I shows the rank
    as assigned by the magazine; the 2nd column shows
    the efficiency score calculated from the BCC
    model for the year 2004 and the 3rd column for
    the year 2005. The 4th column is the new rank
    assigned to the B-Schools based on the efficiency
    score for 2004 and the 5th column the new rank
    for 2005. The 6th and 7th columns show the
    deviation between the conventional ranking and
    the DEA ranking for 2004 and 2005 respectively.
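
For reference, the standard output-oriented BCC (variable-returns-to-scale) model of Banker, Charnes and Cooper (1984) for the DMU under evaluation, indexed o, can be written as below, where x_ij and y_rj are the inputs and outputs of DMU j and lambda_j are the intensity weights; the 0-1 efficiency scores in Table-I are consistent with reporting 1/phi*.

```latex
\begin{aligned}
\max_{\phi,\;\lambda}\quad & \phi \\
\text{subject to}\quad & \sum_{j=1}^{n} \lambda_j x_{ij} \le x_{io}, && i = 1,\dots,m,\\
& \sum_{j=1}^{n} \lambda_j y_{rj} \ge \phi\, y_{ro}, && r = 1,\dots,s,\\
& \sum_{j=1}^{n} \lambda_j = 1, \qquad \lambda_j \ge 0, && j = 1,\dots,n.
\end{aligned}
```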

25
Table-I Results of BCC Output-Oriented DEA Model
DMU/ Rank (I) Efficiency 2004 Efficiency 2005 New Rank(II)2004 New Rank(III) 2005 D1 (I-II) D2 (I-III)
1 1 1 1 1 0 0
2 1 1 1 1 1 1
3 1 1 1 1 2 2
4 1 1 1 1 3 3
5 1 1 1 1 4 4
6 0.918367 1 13 1 -7 5
7 0.902937 0.933465 19 21 -12 -14
8 0.918367 0.913751 13 27 -5 -19
9 0.884011 0.977761 22 16 -13 -7
10 0.869822 1 26 1 -16 9
11 0.79845 0.952867 28 19 -17 -8
12 1 1 1 1 11 11
13 0.94808 0.972128 11 18 2 -5
26
Results (Contd.)
14 1 1 1 1 13 13
15 0.897959 0.99376 20 13 -5 2
16 0.973211 1 8 1 8 15
17 0.912843 0.919431 15 25 2 -8
18 0.877607 0.983258 24 15 -6 3
19 0.909846 0.926927 17 23 2 -4
20 0.872909 0.92611 25 24 -5 -4
21 0.959184 0.8083 9 28 12 -7
22 0.906798 0.933034 18 22 4 0
23 0.89258 1 21 1 2 22
24 0.95352 0.988082 10 14 14 10
25 0.910084 0.947667 16 20 9 5
26 0.931626 0.916263 12 26 14 0
27 0.882043 1 23 1 4 26
28 0.815822 0.976355 27 17 1 11
Mean Efficiency (2004) 0.926288, Mean Efficiency (2005) 0.966756
27
Discussion
  • The results show that the top six Indian
    B-Schools retain their positions.
  • There is an improvement in the mean efficiency
    score of the institutes over the years.
  • The high efficiency scores are obtained because
    only the very top B-Schools are considered in the
    analysis.
  • It is also interesting to see from the last two
    columns that the positions of the top five
    schools do not change over the years.
  • DMU1, which is one of the top B-Schools in India
    (IIM Ahmedabad), retains its position in all
    rankings.
  • The biggest loser in the year 2004 on technical
    efficiency score is DMU11, which lost seventeen
    positions, and the biggest gainers are DMU24 and
    DMU26, which gained fourteen positions each.
  • Similarly, the last column of Table-I shows the
    losses and gains of the B-Schools for the year
    2005.

28
Sensitivity Analysis
  • DEA is an extreme point technique because the
    efficiency frontier is formed by the actual
    performance of best-performing DMUs.
  • A direct consequence of this aspect is that
    errors in measurement can affect the DEA result
    significantly.
  • So, under the DEA technique, it is possible for a
    B-School to become efficient if it achieves
    exceptionally good results in terms of one output
    but performs below average in the other outputs.

29
Sensitivity Analysis
  • The sensitivity of DEA efficiency can be verified
    by checking whether the efficiency of a DMU is
    affected appreciably when
  • only one input or output is omitted from the DEA
    analysis, or
  • one efficient DMU at a time is dropped from the
    DEA analysis.
  • For our study the robustness test of the DEA
    results is done in two ways (see the sketch
    below).
  • Initially the input Intellectual Capital is
    dropped from the analysis and the technical
    efficiency of the DMUs is calculated; then the
    input Fee is dropped; similarly, the outputs
    Industry Interface and Placement Performance are
    dropped one by one.
  • At the second level the efficient units DMU1 and
    DMU12 are dropped one by one and the technical
    efficiency is calculated.
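
A minimal sketch of how these two robustness checks could be automated, reusing the hypothetical bcc_output_efficiency(X, Y, o) helper and toy X, Y matrices from the earlier sketch; an illustration, not the authors' code.

```python
# Robustness checks: drop one input/output, or one efficient DMU, at a time.
import numpy as np

def all_scores(X, Y):
    return np.array([bcc_output_efficiency(X, Y, o) for o in range(X.shape[1])])

base = all_scores(X, Y)

# (1) Omit one input (row of X) or one output (row of Y) at a time.
for i in range(X.shape[0]):
    changed = all_scores(np.delete(X, i, axis=0), Y)
    print(f"drop input {i}:  max |change| = {np.abs(changed - base).max():.3f}")
for r in range(Y.shape[0]):
    changed = all_scores(X, np.delete(Y, r, axis=0))
    print(f"drop output {r}: max |change| = {np.abs(changed - base).max():.3f}")

# (2) Drop one efficient DMU (one column of X and Y) at a time and re-score.
for j in np.where(np.isclose(base, 1.0))[0]:
    rescored = all_scores(np.delete(X, j, axis=1), np.delete(Y, j, axis=1))
    print(f"drop DMU {j}:    mean efficiency = {rescored.mean():.3f}")
```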

30
Table-II Sensitivity Analysis Report (Output
Oriented DEA, Scale Assumption VRS)
DMU/ Rank (I) Efficiency 2004 Efficiency 2005 Dropping IC 2004 Dropping IC 2005 Dropping FEE 2004 Dropping FEE 2005 Dropping DMU 1, 2004 Dropping DMU 1, 2005 Dropping DMU 12, 2004 Dropping DMU 12, 2005
1 1 1 1 1 1 1 --- --- 1 1
2 1 1 1 1 1 1 1 1 1 1
3 1 1 1 1 1 1 1 1 1 1
4 1 1 1 1 1 1 1 1 1 1
5 1 1 1 1 0.902 1 1 1 1 1
6 0.918 1 0.918 1 0.918 1 0.918 1 0.918 1
7 0.902 0.933 0.902 0.933 0.902 0.933 0.902 0.933 0.902 0.933
8 0.918 0.913 0.918 0.913 0.918 0.901 0.918 0.930 0.918 0.913
9 0.884 0.977 0.884 0.977 0.883 0.977 0.888 0.977 0.884 0.977
10 0.869 1 0.869 1 0.869 1 0.869 1 0.869 1
11 0.798 0.952 0.798 0.952 0.798 0.952 0.814 0.952 0.798 0.952
12 1 1 1 1 0.834 0.879 1 1 --- ---
31
Table-II Sensitivity Analysis Report (Output
Oriented DEA, Scale Assumption VRS)
13 0.948 0.972 0.948 0.972 0.897 0.936 0.948 0.979 0.986 0.972
14 1 1 1 1 0.918 0.922 1 1 1 1
15 0.897 0.993 0.897 0.993 0.897 0.978 0.897 0.995 0.897 0.993
16 0.973 1 0.973 1 0.834 0.957 0.973 1 1 1
17 0.912 0.919 0.912 0.919 0.846 0.871 0.912 0.919 1 0.919
18 0.877 0.983 0.877 0.983 0.828 0.936 0.877 0.987 0.899 0.984
19 0.909 0.926 0.909 0.926 0.857 0.879 0.909 0.938 0.925 0.928
20 0.872 0.926 0.872 0.926 0.822 0.886 0.872 0.930 0.886 0.926
21 0.959 0.808 0.959 0.808 0.959 0.797 0.959 0.836 0.959 0.808
22 0.906 0.933 0.906 0.933 0.869 0.884 0.906 0.946 0.913 0.936
23 0.892 1 0.892 1 0.836 0.948 0.892 1 0.919 1
24 0.953 0.988 0.953 0.988 0.857 0.914 0.953 1 0.974 1
25 0.910 0.947 0.910 0.947 0.834 0.914 0.910 0.954 0.941 0.947
26 0.931 0.916 0.931 0.916 0.846 0.855 0.931 0.916 0.949 0.932
27 0.882 1 0.882 1 0.852 0.980 0.882 1 0.889 1
28 0.815 0.976 0.815 0.976 0.804 0.957 0.815 0.976 0.817 0.976
32
Discussion
  • It is observed from the table above that when the
    input IC is dropped from the analysis there is no
    change in the technical efficiency scores.
  • When the input Fee is dropped from the analysis
    there is a change in the efficiency scores; two
    DMUs, viz. DMU5 and DMU14, become inefficient.
  • Dropping the efficient DMUs from the analysis
    does not make the remaining efficient units
    inefficient. The analysis shows the robustness of
    the model used.

33
Time-Series Analysis
  • A time series is the values of a function sampled
    at different points in time.
  • In this section we have observed the DMUs over
    multiple time periods to find the changes in
    efficiency over time. In such a setting, it is
    possible to perform DEA over time by using a
    moving average analogue, where a DMU in each
    different period is treated as if it were a
    "different" DMU. Specifically, a DMU's
    performance in a particular period is contrasted
    with its performance in other periods in addition
    to the performance of the other DMUs (Cooper et
    al.); a small pooling sketch is given below.
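
A minimal sketch of the "each DMU in each period is a different DMU" idea, again reusing the hypothetical bcc_output_efficiency helper; X_by_period and Y_by_period are assumed lists of per-period input (m x n) and output (s x n) matrices, not the study's actual data.

```python
# Pool the DMU-period pairs into one reference set and score them together.
import numpy as np

def pooled_period_scores(X_by_period, Y_by_period):
    n = X_by_period[0].shape[1]              # DMUs per period
    X_pool = np.hstack(X_by_period)          # every DMU-period is one column
    Y_pool = np.hstack(Y_by_period)
    scores = [bcc_output_efficiency(X_pool, Y_pool, o)
              for o in range(X_pool.shape[1])]
    # Rows of the result correspond to periods, columns to DMUs.
    return np.array(scores).reshape(len(X_by_period), n)

# e.g. pooled_period_scores([X_2004, X_2005], [Y_2004, Y_2005])
```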

34
Results
DMU Period-1 Period-2 Period-3
1 1.000 1.000 1.000
2 1.000 1.000 1.000
3 1.000 1.000 1.000
4 1.000 0.974 1.000
5 1.000 1.000 1.000
6 1.000 0.926 1.000
7 0.906 0.866 0.950
8 0.895 1.000 1.000
9 0.970 0.880 1.000
10 0.959 0.864 0.992
11 0.941 0.784 0.958
12 1.000 1.000 0.933
13 0.964 1.000 0.962
35
Results
14 1.000 1.000 0.972
15 0.939 0.898 0.970
16 1.000 1.000 0.973
17 0.919 0.901 0.888
18 0.954 0.869 0.948
19 0.913 0.902 0.899
20 0.922 0.864 0.905
21 0.786 0.959 0.799
22 0.905 0.901 0.898
23 1.000 0.883 0.976
24 0.967 0.940 0.934
25 0.918 0.897 0.910
26 0.914 0.917 0.877
27 0.993 0.877 0.998
28 0.922 0.814 0.950
36
Discussion
  • The above table reflects the stability of the
    technical efficiency scores over a period of
    time. It is observed that the technical
    efficiency scores of the inefficient units
    decreased in period 2 but are highest in period
    3. The performance of DMU14 and DMU16 has come
    down over the periods, and they have become
    relatively inefficient.

37
Conclusion
  • As management education is characterized by a
    multi-input, multi-output system, Data
    Envelopment Analysis (DEA), with its ability to
    handle multiple inputs and multiple outputs, has
    been used in this paper to rank the Indian
    B-Schools based on their technical efficiency
    scores.
  • The ranking is done using the BCC model and the
    results are compared with the conventional
    ranking done by popular Indian magazines. The
    comparison shows that the ranking using the
    DEA-VRS model differs significantly from the
    conventional ranking.

38
Conclusion (Contd.)
  • The sensitivity analysis shows that there is no
    significant change in the efficiency scores of
    the DMUs when an input or output is dropped from
    the DEA analysis.
  • Dropping efficient DMUs from the analysis shows
    the same result. This demonstrates the robustness
    of the model.
  • The time-series analysis done over three periods
    shows the stability of the technical efficiency
    scores over time. It is observed that the
    technical efficiency scores of the inefficient
    units decreased in period 2. The analysis shows
    that the performance of the schools has improved
    over time.
  • The methodology suggested in the paper can
    provide useful information by identifying
    clusters of DMUs performing better in certain
    contexts. This technique allows the researcher to
    investigate why and how they are able to perform
    better.

39
References
  • Banker, R.D., Charnes, A. and Cooper, W.W.
    (1984), "Some models for estimating technical and
    scale inefficiencies in Data Envelopment
    Analysis", Management Science, Vol. 30,
    pp. 1078-1092.
  • B-School Directory 2005, Largest Listing of
    B-Schools, Business India.
  • Business School Directory 2005, Dalal Street.
  • Charnes, A., Cooper, W.W. and Rhodes, E. (1978),
    "Measuring the efficiency of decision making
    units", European Journal of Operational Research,
    Vol. 2, pp. 429-444.
  • Juran, J.M. and Gryna, F.M. Jr (Eds) (1988),
    Juran's Quality Control Handbook, 4th ed.,
    McGraw-Hill, New York, NY.
  • Natarajan, R. (2003), "Quality and Accreditation
    in Technical Management Education", Productivity,
    Vol. 44, No. 2, July-September.
  • Parasuraman, A., Zeithaml, V.A. and Berry, L.L.
    (1985), "A Conceptual Model of Service Quality
    and its Implications for Future Research",
    Journal of Marketing, Vol. 49 (Fall), pp. 41-50.
  • Peters, T.J. and Waterman, R.H. (1982), In Search
    of Excellence, Harper and Row, New York, NY.
  • Ramanathan, R. (2001), "A Data Envelopment
    Analysis of comparative performance of schools in
    the Netherlands", Opsearch, Vol. 38, No. 2,
    pp. 160-182.
  • Sreekumar and Patel, G.N. (2005), "Measuring the
    Relative Efficiency of Some Indian MBA
    Programmes: A DEA Approach", Business
    Perspective, Vol. 7, No. 2, July-December,
    pp. 47-59.
  • Sreekumar and Patel, G. (2007), "Comparative
    Analysis of B-School Rankings and an Alternate
    Ranking Method", International Journal of
    Operations and Quantitative Management, Vol. 13,
    No. 1, March, pp. 33-46.
  • Outlook, September 27, 2004.
  • Outlook, September 2005.
  • Cooper, W.W., Seiford, L.M. and Zhu, J., Data
    Envelopment Analysis: History, Models and
    Interpretations.
  • www.indiabschools.com

40
  • Thank You