Reflections of an editor on research and practice (PowerPoint transcript)

1
Reflections of an editor on research and practice
  • Richard Smith
  • Editor, BMJ
  • Granada, May 2002
  • www.bmj.com/talks

2
What I want to talk about
  • The disconnect between practice and research
  • A rough history of health research in Britain
  • A vision of how to improve the connection
  • The relation between disease burden and volume of
    research
  • Setting research priorities

3
What I want to talk about
  • Where does innovation come from?
  • Peer review of research
  • Measuring the value of research
  • Disseminating research
  • How to get from research to change?
  • Conclusions

4
The disconnect between practice and research
  • Research is usually funded by the Ministry of
    Education, whereas health care and public health
    are funded by the Ministry of Health
  • Research is run by researchers who value basic
    science, discovery, and original questions,
    thinking, and methodology
  • The answering of practical questions is seen as
    dull, unoriginal, and unimportant in scientific
    terms
  • Nobel prizes go to the discoverers of molecular
    mechanisms, not to those who work out the most
    cost effective method for treating incontinence

5
The disconnect between practice and research
  • There is often no mechanism to transmit the
    questions of practitioners (and patients) to
    researchers
  • Scientists are wary of directed research: "only
    scientists can know what is scientifically
    important"; "directed research leads nowhere"
  • The results of research do not seem valuable to
    practitioners
  • The idea that doctors are scientists is a myth

6
The disconnect between practice and research
  • Most practitioners are not competent researchers
  • Nor are practitioners sophisticated consumers of
    research
  • "Practice is one thing, research another. I make
    decisions based on my experience and what
    clinical experts advise"
  • Health policy makers sometimes boast that they
    don't use research results
  • Evidence based practice is a force for change,
    bringing research and practice together

7
The disconnect between practice and research
  • It must be evidence informed practice, not
    evidence tyrannised practice
  • But we realise that fewer than 5% of studies in
    medical journals are both valid and relevant to
    clinicians or policy makers; in most journals
    it's less than 1%
  • We have good evidence on perhaps 10% of
    treatments, and a smaller percentage of questions
    about diagnosis, symptoms and signs, and
    prognosis
  • Evidence needed for health management and policy
    is even weaker

8
A rough history of health research in Britain
  • 1900s--independent researchers
  • 1930s--Medical Research Council (MRC) begins
  • 1940--Pharmaceutical companies begin to do a
    great deal of research
  • 1980--MRC begins some health services research
  • 1986--House of Lords realises that the National
    Health Service has almost no research capacity

9
A rough history of health research in Britain
  • 1990--NHS research and development directorate
    established
  • vision is a knowledge based health service
  • aim is to spend 3% of NHS turnover on R&D
  • programme attracts international interest
  • 2001--NHS R&D programme still there, but it is
    less central than it once was

10
Bringing practice and research closer together: a
vision
  • Patient asks a question to a doctor
  • Doctor consults databases on what the evidence
    says (Cochrane Library, Clinical Evidence, or an
    electronic decision support system)
  • (Or, increasingly, patient consults the same
    knowledge sources as the doctor--besttreatments.org)
  • If there is evidence, patient and doctor discuss
    best course of action

11
Bringing practice and research closer together: a
vision
  • If there is no evidence, then a systematic review
    may be needed
  • Or the patient and doctor consult the
    meta-register of trials underway
  • If there is a trial, the patient may enter the
    trial (knowing that patients treated in trials do
    better than others, whether or not they get the
    active treatment)
  • If there is no trial, then the patient and doctor
    register the question with a central database

12
Bringing practice and research closer together: a
vision
  • Trials can then be conducted to answer the
    questions that are most important and arising
    most commonly
  • The information sources needed to achieve this
    vision exist for questions on treatment
  • (The culture and the infrastructure do not exist)
  • The information sources do not exist for
    questions on diagnosis, prognosis, health policy,
    and much else--but could be created

13
The relationship between disease burden and the
amount of research
  • Examples from the US, Africa, and neurology
  • The 90/10 rule--90% of research is on diseases
    affecting 10% of the world's population
  • The association is often weak: some diseases
    with a small burden are highly researched,
    whereas some with a high burden are poorly
    researched

14
(No Transcript)
15
(No Transcript)
16
(No Transcript)
17
A system for prioritising research
  • Consider disease burden
  • Consider questions generated by patients and
    health care providers
  • Consider possible research gain (what are the
    chances that an investment could result in real
    advances?)
  • Incorporate social and professional values
  • Britain has had a system along these lines--but
    only for the NHS R&D programme

18
Where does innovation come from? Two models
  • The linear model: curiosity driven
    research---applied research---experimental
    development---innovation
  • The market pull model: market need---applied
    research---experimental development---innovation

19
Project Hindsight (1966)
  • Examined 20 weapon systems (including Polaris)
  • Researchers identified 686 research or
    exploratory development events that were
    essential for development of the weapons
  • Only 9% were scientific research (0.3% basic
    research)
  • Only 9% of the research was conducted in
    universities

20
Project Hindsight (1966)
  • "Science and technology funds deliberately
    invested and managed for defence purposes have
    been about one order of magnitude more efficient
    in producing useful events than the same amount
    of funds invested without specific concern for
    defence needs."

21
TRACES Study (1968)
  • Technology in Retrospect and Critical Events in
    Science
  • Origins of magnetic ferrites, video recorder,
    contraceptive pill, electron microscope, and
    matrix isolation
  • Looked back 50 years, not 20 as Project
    Hindsight did

22
TRACES Study (1968)
  • 340 events
  • 70% non-mission research, 20% mission oriented,
    and 10% development and application
  • Universities did 75% of non-mission and one third
    of mission oriented research

23
Comroe and Dripps (1976)
  • Julius Comroe, physiologist, and Robert Dripps,
    anaesthetist
  • The top 10 advances in cardiovascular and
    pulmonary medicine and surgery in the last 30
    years
  • Around 100 specialists selected the top 10

24
Top 10 advances
  • Cardiac surgery
  • Vascular surgery
  • Drug treatment of hypertension
  • Medical treatment of myocardial ischaemia
  • Cardiac resuscitation
  • Oral diuretics
  • Intensive care units
  • Antibiotics
  • New diagnostic methods
  • Prevention of polio

25
Comroe and Dripps (1976)
  • Went back to the dawn of time
  • 137 essential bodies of knowledge
  • 500 essential or key articles
  • 41% not clinically oriented
  • 37% basic and not clinically oriented
  • 25% basic and clinically oriented

26
Conclusions from studies of innovation
  • The sources of innovation are numerous, varied,
    and scattered
  • Both the science push and market pull models of
    innovation are oversimplified
  • Research funders should not put all their eggs in
    one basket
  • Attempts to force more relevant research may
    backfire

27
Conclusions from studies of innovation
  • The coming together of different lines of
    research and of scientists from different
    disciplines seems to be important
  • Promoting interdisciplinary research may seed
    innovations
  • Research into research may be beneficial

28
Peer review of research
  • Research grants are often given after peer review
  • Which research will be published is often decided
    by peer review
  • But there are problems with peer review

29
Problems with peer review
  • A lottery
  • A black box
  • Ineffective
  • Slow
  • Expensive
  • Biased
  • Easily abused
  • Can't detect fraud

30
Peer review
  • But it is hard to find an alternative to peer
    review
  • It's like democracy--the least bad system
  • The answer seems to be to improve peer review
    with training, openness, blinding, etc

31
Measuring the value of research
  • The point of health research is to improve health
  • But researchers are usually rewarded according to
    measures of scientific value
  • These include the impact factor of the journal in
    which they publish--despite there being little or
    no correlation for individual authors between the
    impact factor of the journal in which they
    publish and citations to their articles
  • There are many other problems with impact
    factors--bias towards certain disciplines, the
    US, and certain methodologies; the data are
    often unreliable

32
(No Transcript)
33
Measuring the value of research
  • Royal Netherlands Academy of Arts and Sciences is
    trying to devise a measure of the social impact
    of research
  • Might include publications, software, products,
    press coverage, etc
  • But it's not easy to find a reliable measure

34
My suggestions for measuring influence/impact
  • Level one (the highest): making change happen
  • Level two: setting the agenda for debate
  • Level three: leading by example
  • Level four: being quoted
  • Level five: being paid attention to
  • Level six (the lowest): being known about

35
Dissemination of research
  • There are tens of thousands of journals
  • Millions of studies are published each year
  • Most studies are neither valid nor relevant
  • It's hard--usually impossible--for clinicians and
    policy makers to keep up
  • There is a need to review research results
    systematically

36
Dissemination of research
  • Evidence based journals--coverage of one-off
    studies, not put into context
  • Cochrane Library--treatments only, big gaps,
    researcher (not clinician) led questions, complex
  • Clinical Evidence--treatments only, 160 topics,
    400 needed

37
Dissemination of research
  • Guidelines--cover only some topics, sometimes not
    evidence based, go beyond the evidence, tell
    people what to do
  • Appraisals by the National Institute for Clinical
    Excellence (NICE)--cover only a few topics, must
    incorporate evidence, cost, and values,
    insufficiently transparent

38
From information to change
  • Change
  • Know how
  • Know about
  • Information
  • Data

39
Failures to follow evidence
  • Aspirin underused in patients with vascular
    disease
  • ACE inhibitors underused in patients with heart
    failure
  • Inhalational steroids underused in patients with
    asthma
  • Antibiotics overused in patients with upper
    respiratory tract infections and acute otitis
    media
  • Enemas, pubic shaving, and episiotomies overused
    in women in labour

40
From research to change
  • "We should stop all research for two years and
    concentrate instead on implementing what we
    already know."
  • Somebody in, I think, the Lancet, quite some
    time ago

41
From information to change
  • Achieving change is hard
  • Information on its own rarely changes practice
  • Combinations of audit and feedback, computerised
    reminders, educational outreach, and interactive
    educational sessions will sometimes change
    practice

42
From information to change
  • Interactive learning
  • Improvement methods
  • Organisational development
  • Consultancy
  • Just in time information

43
(No Transcript)
44
The thing that will save us
  • Able to answer highly complex questions
  • Connected to a large valid database
  • Electronic - portable, fast, and easy to use
  • Prompts doctors - in a helpful rather than
    demeaning way
  • Connected to the patient record
  • A servant of patients as well as doctors
  • Responds to the need for psychological support
    and affirmation

45
Conclusions
  • Research and practice are currently not well
    connected
  • It's possible to envision how they might be
    better connected
  • Some substantial health problems are poorly
    researched, while some smallish problems are
    heavily researched
  • Mechanisms are needed to set priorities in health
    research

46
Conclusions
  • Research into sources of innovation suggests that
    different sorts of research in different
    circumstances are important
  • Innovation often comes from interdisciplinary
    work
  • Peer review has many problems but can probably be
    improved

47
Conclusions
  • Better methods are needed for measuring the
    performance of health researchers
  • The dissemination of research results is
    inadequate, but better means are appearing
  • Moving from research to change is hard, but we
    can see how to do it better