The Testing of Language for Business Purposes
1
The Testing of Language for Business Purposes
  • Barry O'Sullivan
  • Centre for Research in Testing, Evaluation and
    Curriculum
  • Roehampton University

2
Focus of this talk
Overview of current assessment practices in the
business domain
  • An historical overview of testing language for
    business purposes
  • Outline of current practice in testing language
    for business purposes
  • A theoretical framework for describing specific
    purpose tests
3
Business Assessment Practice
FOCUS
4
Historical Overview
1554
London Stalhof: young apprentices from Germany
had to spend one year in the country "to get a
proper command of everyday English and the more
specific technical terms"
1972
LCCIEB Language Section undertook a major
analysis of foreign language use involving over
11,500 employees of almost 600 international
firms. Replicated in Germany, France, Greece
and Spain between 1982 and 1985
5
Historical Overview
1979
Test of English for International Communication
(TOEIC). Developed by ETS in response to
suggestions by the Japanese government. Format:
standardised MCQ (listening and reading)
1986
Royal Society of Arts in the UK developed the
Certificate in English as a Foreign Language for
Secretaries (CEFLS). English Oral, Reading,
Writing and Listening tests (French, German and
Swedish translation)
6
Historical Overview
1988
UCLES introduced the Certificate in English for
International Business and Trade (CEIBT), which
consisted of three papers: Reading and Writing,
Listening, and Oral Interaction. Based on actual
company documents (e.g. Rolls Royce, Japan
Airlines, McDonald's and The Body Shop)
1990
The Oxford Delegacy of Local Examinations (UODLE)
introduced the Oxford International Business
English Certificate (OIBEC). The OIBEC
tested all four skills at two levels, First and
Executive.
7
Historical Overview
1993
UCLES introduced the Business English Certificate
(BEC), which consisted of three papers: Reading,
Writing, and Listening. Those candidates who
performed well went on to take a Speaking paper.
BEC 2 was introduced in 1994, Level 3 in 1996.
Mid-1990s
UCLES introduced the English version of the
Business Language Testing System (BULATS). ALTE
partners introduced French, German and Spanish
versions.
8
Current LSP Testing Practice
Developers?
  Cambridge ESOL, ETS, Pitman Qualifications and
  LCCIEB (all English language); Goethe (German);
  Perugia (Italian); Alliance Française (French);
  Salamanca (Spanish); JETRO (Japanese)
Skills?
  Mixed: some receptive only, some offer
  individual skill papers, others all 4 skills
Tasks & Items?
  MCQ, SAF, cloze, guided and extended writing,
  speaking (interview, long turn, interactive).
  Some tests use only a single item type, others
  more.
Reporting?
  Usually criterion-referenced (pass/fail or
  grade), but some norm-referenced (related to
  other test takers)
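The reporting distinction above can be illustrated with a minimal sketch. The cut-off score, cohort scores and function names are hypothetical, not drawn from any actual test:

```python
# Criterion-referenced vs norm-referenced score reporting.
# All scores and the cut-off below are illustrative only.

def criterion_referenced(score: float, cut_off: float = 60.0) -> str:
    """Pass/fail against a fixed standard, regardless of other candidates."""
    return "pass" if score >= cut_off else "fail"

def norm_referenced(score: float, cohort: list[float]) -> float:
    """Percentile standing relative to the other test takers."""
    below = sum(1 for s in cohort if s < score)
    return 100.0 * below / len(cohort)

cohort = [45.0, 52.0, 61.0, 70.0, 88.0]
print(criterion_referenced(61.0))     # judged against the fixed cut-off
print(norm_referenced(61.0, cohort))  # judged against the cohort
```

The same raw score can thus yield a stable result under criterion referencing but a shifting one under norm referencing, since the latter depends on who else sat the test.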
9
Task and Item Types I
10
Task and Item Types II
11
Task and Item Types III
12
What Can We Say About These?
They vary between appearing to be very similar to
a specific domain (business English) and
appearing to be very general in nature.
They seem to vary on a number of dimensions
(purpose, response format, length and nature of
input and output, etc.).
They also seem to vary in the way the candidate
is expected to interact with the task (in terms
of cognitive and meta-cognitive processing).
13
Current LSP Testing Theory
Douglas (2000: 19) defines an LSP test as
"one in which test content and methods are
derived from an analysis of a specific purpose
target language use situation, so that test tasks
and content are authentically representative of
tasks in the target situation, allowing for an
interaction between the test taker's language
ability and specific purpose content knowledge,
on one hand, and the test tasks on the other.
Such a test allows us to make inferences about a
test taker's capacity to use language in the
specific purpose domain."
The focus is on AUTHENTICITY
14
Authenticity
test tasks and content are authentically
representative of tasks in the target situation
SITUATIONAL
an interaction between the test takers language
ability and specific purpose content knowledge,
on one hand, and the test tasks on the other
INTERACTIONAL
15
Problems with Current Testing Theory
Distinguishability
  No clear borders between different domains
Non-language factors
  Potential for inseparability of skills: we need
  to decide if this is good or bad (or both, or
  neither)
Assessment
  How should it be done, and who should do it?
Operational definition
  Any definition needs to be of practical use
16
Distinguishability
Most language is not domain specific
It is not possible to mark definitive borders to
any language domain
This implies that we can never create a true test
of language for any specific purpose
HOWEVER
We can, and have, identified uses of language
that are specific to a domain
17
Impact of Non-Language Factors
Two ways to look at this
In everyday communication, transferral of a
message is achieved through a combination of
language, cues, signals and symbols; the same can
be said of communication in a specific purpose
domain.
Background knowledge, in this case of the
business domain, can have a positive (or
negative) impact on an individual's ability to
perform a particular task.
18
Assessment
A number of ways to look at this:
  • We can opt to use assessment scales which
    include criteria that focus on language only
    and do not take into account the notion of
    task fulfilment
  • We can create "indigenous" scales which are
    focused on task performance rather than on
    language per se
  • We can attempt to do both of the above; this
    was done in the PLAB test
19
Operational Definition
At the moment there is none that is universally
accepted.
Douglas offers a basic theoretical defence of SP
testing, but not an operational definition (e.g.
his test reviews are not driven by a systematic
framework).
Critics of SP testing (e.g. Davies, Elder,
Cumming) do not offer us an operational solution.
We need to be able to make more evidence-driven
statements about a test in order to more clearly
understand its specific nature.
20
How Specific are LSP Tests?
There seems to be a range of specificity
21
What is Specificity?
The Degree of Test Specificity Continuum
22
Extending our Ideas of Specificity
23
Locating Specificity I
Framework for Test Validation (based on Weir,
2004)
24
Locating Specificity II
Aspects of Context Validity for Speaking (from
Weir, 2004)
25
Locating Specificity III
A Multi-Componential View of Specificity
Where Degree of Specificity is seen as evidence
of Situational Authenticity
26
Specificity as Situational Authenticity
Situational Authenticity is defined as lying
not along a single continuum, but along a whole
series of continua, each one reflecting an
aspect of the demands that define the test task
27
Locating Specificity Example I
Differences in Task Demands between LSP and
General Proficiency Test Papers
28
Locating Specificity Example II
Differences in Text Demands between LSP and
General Proficiency Test Papers
29
The Other Side of Authenticity
Aspects of Theory-Based Validity for Speaking
(from Weir, 2004)
30
Interactional Authenticity
Framework for Test Validation (based on Weir,
2004)
31
Specificity as Interactional Authenticity
Interactional authenticity is defined as the
interaction of the test taker's executive
resources and internal processes (i.e. their
cognitive and meta-cognitive processing) and the
context of the test task, as defined by the
demands of that task
32
Evidence of Interactional Authenticity
Interactional Authenticity is located within the
individual test taker
It is realised in the cognitive and
meta-cognitive processes used by the individual
in responding to tasks
Though we hope that SP test and real-life tasks
will result in the same interaction between
processing and resources, we can never truly
measure interactional authenticity
We can use protocol analysis and/or
questionnaire-based analysis to probe for evidence
33
Observations
SP tests tend to reflect (to varying degrees) the
SP language domain, in that they draw on both
domain-specific and non-domain-specific language.
SP tests can be described theoretically using
Douglas's authenticity-based approach as a basis.
In order to operationalise this description, we
need to adopt a multi-componential view of test
specificity (or situational authenticity), while
recognising that interactional authenticity can
only be explored through candidate-based research.
34
References
Cumming, A. 2001. ESL/EFL instructors' practices
for writing assessment: specific purposes or
general purposes? Language Testing, 18, 2:
207-224.
Davies, A. 2001. The logic of testing languages
for specific purposes. Language Testing, 18, 2:
133-147.
Douglas, D. 2000. Assessing Language for Specific
Purposes. Cambridge: CUP.
Elder, C. 2001. Assessing the language proficiency
of teachers: are there any border controls?
Language Testing, 18, 2: 149-170.
O'Sullivan, B. 2005. Issues in Testing English for
Business Purposes: The BEC Revision Project.
Studies in Language Testing Series, Vol. 17.
Cambridge: CUP/Cambridge ESOL.
Weir, C. J. 2004. Language Testing and Validation:
An Evidence-Based Approach. Oxford: Palgrave.
35
CONTACT
Dr Barry O'Sullivan
Centre for Research in Testing, Evaluation and Curriculum
Erasmus House
Roehampton University
Roehampton Lane
London SW15 5PU
United Kingdom
Tel: +44 (0)20 8392 3467
Fax: +44 (0)20 8392 3031
b.osullivan@roehampton.ac.uk
36
Main Reference
O'Sullivan, B. 2005. Issues in Testing English
for Business Purposes: The BEC Revision Project.
Studies in Language Testing Series, Vol. 17.
Cambridge: CUP/Cambridge ESOL.