Transcript and Presenter's Notes

Title: Deliberation and Mass Participation in Regulatory Rulemaking


1
Deliberation and Mass Participation in Regulatory Rulemaking
  • Dr. Stuart W. Shulman

American University, March 15, 2005
THE STATE OF RULEMAKING IN THE FEDERAL GOVERNMENT
2
  • This research has been supported by grants from
    the National Science Foundation
  • EIA 0089892
  • SGER: Citizen Agenda-Setting in the Regulatory
    Process: Electronic Collection and Synthesis of
    Public Commentary
  • EIA 0327979, 0328175, 0328914, and 0328618
  • SGER Collaborative: A Testbed for eRulemaking
    Data
  • IIS 0429293, 0429102, 0429360, and 0429243
  • Collaborative Research: Language Processing
    Technology for Electronic Rulemaking
  • SES 0322662
  • Democracy and E-Rulemaking: Comparing
    Traditional vs. Electronic Comment from a
    Discursive Democratic Framework
  • Any opinions, findings, and conclusions or
    recommendations expressed in this material are
    those of the authors and do not necessarily
    reflect those of the National Science Foundation.

3
Jamie Callan, Carnegie Mellon University
Eduard Hovy, USC/Information Sciences Institute
Stuart Shulman, University of Pittsburgh
Stephen Zavestoski, University of San Francisco
David Schlosberg, Northern Arizona University
(with Shulman and Zavestoski)
Project Home Page: http://erulemaking.ucsur.pitt.edu
Contact: Shulman@pitt.edu
4
Overview of the Talk
  • Origin of the Research Question
  • Is electronic rulemaking more deliberative?
  • How would we know it if we saw it?
  • Overview of the Digital Landscape
  • Mass participation trends
  • Background on a recent survey
  • Nature of the sample and initial findings
  • Update on the Qualitative Data Analysis
  • Over 2000 public comments coded to date

5
A Social Science Research Agenda
  • Discursive democracy
  • Can IT better serve the democratic ideals underlying
    citizen participation and public discourse?
  • Tough concepts to test
  • Survey research is one route
  • Qualitative data analysis is another

Stuart W. Shulman, David Schlosberg, Stephen
Zavestoski, and David Courard-Hauri, "Electronic
Rulemaking: New Frontiers in Public Participation,"
Social Science Computer Review 21, 2 (Summer 2003), 162-178.
6
E-Rulemaking and Discursive Democracy
  • Looking for signs of
  • Deliberation, not preference aggregation
  • Inclusion of difference
  • Respect for multiple positions
  • Transformation of preferences
  • Expanding discourse in the public sphere
  • Authenticity and impact
  • The jury is out, however
  • the early signs are mixed for advocates of
    ameliorative e-deliberation in rulemaking

7
(No Transcript)
8
(No Transcript)
9
(No Transcript)
10
(No Transcript)
11
(No Transcript)
12
(No Transcript)
13
Will any old words add meaning?
What other trust is required here?
14
Trust that you are not being misled?
15
(No Transcript)
16
(No Transcript)
17
(No Transcript)
18
(No Transcript)
19
(No Transcript)
20
Clearly there are some unusual things going
on with public commenters
21
Duplicate Detection Solutions: Language Processing Technology
  • Duplicate detection algorithms
  • Generate summary counts
  • Identify the reference copy
  • Summarize differences from reference copy
  • Near-duplicate detection techniques
  • Use cosine correlation to identify similar
    documents
  • Identify near-duplicates using document
    fingerprints
  • Sequences of words that match in each document
  • Output
  • A reliable and easy count of duplicates
  • Unique passages isolated and displayed/clustered

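To make the two near-duplicate checks listed above concrete, here is a minimal sketch; it is illustrative rather than the project's actual code, and the library choice (scikit-learn), the 5-word fingerprint length, and the 0.9 cosine threshold are assumptions.

```python
# Illustrative sketch: cosine similarity over TF-IDF vectors, plus
# overlapping word-sequence "fingerprints" shared between two comments.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def cosine_near_duplicates(comments, threshold=0.9):
    """Return index pairs of comments whose cosine similarity exceeds threshold."""
    vectors = TfidfVectorizer(stop_words="english").fit_transform(comments)
    sims = cosine_similarity(vectors)
    n = len(comments)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if sims[i, j] >= threshold]

def fingerprints(text, n=5):
    """Set of overlapping word n-grams ('fingerprints') for one comment."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_fingerprint_ratio(a, b, n=5):
    """Fraction of fingerprints shared, relative to the shorter comment."""
    fa, fb = fingerprints(a, n), fingerprints(b, n)
    return len(fa & fb) / min(len(fa), len(fb)) if fa and fb else 0.0
```

In a workflow like the one described here, the most frequent exact text could then serve as the reference copy, with the passages that fall outside the shared fingerprints isolated as the unique additions.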
22
OAR-2003-0214
OAR-2001-0017
OAR-2003-0214
23
Overview of the Regulatory Actions
24
Case Characteristics and Data Access
The original unsorted dataset shared by the EPA
had duplicates, triplicates, spam, and other
unrelated e-mails
25
Summary of Completed Telephone Surveys
Draft paper available at http://snipurl.com/ddl2
26
Brief Summary of Survey Findings
  • Electronic commenters are no more discursive than
    paper commenters
  • High levels of self-reported discursive activity
    across all types of commenters
  • Significant differences between individuals who
    submitted original comments and those who posted
    or mailed form letters

27
Paper vs. Electronic
  • One significant difference
  • Paper submitters were more likely to refer to
    arguments of others
  • One significant similarity
  • Paper commenters also use the web for information
    gathering
  • Implication for this project
  • It may just be too late to distinguish between
    paper and electronic comments

28
Overall High Levels of Discursive Activity
  • Four indicators of discursive activity were high
    across submission type
  • Commenters we surveyed claim they
  • are information seekers
  • 90% say they get a lot or some information
  • review others' comments
  • 68.0% say they had read the comments of others
  • gain an understanding of others
  • 73.2% say they get a better understanding of the
    positions of other citizens by reading their
    comments
  • change their positions
  • 36.3% of those surveyed report that their
    position on an issue actually changed after
    reading others' comments

29
Differences Between the Experiences of Original
and Form Comment Submitters
  • Original commenters use agency websites more
    often and are more likely to
  • report gaining a greater understanding of the
    positions of others
  • think their comments will be reviewed by
    government employees
  • have a positive view of the agency after
    participating
  • be satisfied with agency decisions (though
    overall, response is overwhelmingly negative)
  • trust the government to do what is right

30
  • Our current qualitative data analysis efforts
  • Sample of 1000 e-mails
  • Drawn at random from 536,000 e-mails sent to the
    EPA on the mercury rule
  • Two completed rounds of coding this data
  • Overlap to test inter-rater reliability
  • Two coders checked the work of each coder after
    the second round
  • Coming soon
  • A new batch of 1000 documents from the official
    mercury E-DOCKET population (4200)

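As a small illustration of the sampling step described above (not the project's actual procedure; the function name and seed are hypothetical), drawing 1,000 e-mails at random might look like:

```python
# Hypothetical sketch: a reproducible simple random sample of 1,000 e-mails
# drawn without replacement from a larger corpus of message IDs.
import random

def draw_sample(email_ids, k=1000, seed=2005):
    """Return k randomly chosen e-mail IDs, reproducible via the seed."""
    return random.Random(seed).sample(list(email_ids), k)

# e.g., sample = draw_sample(all_mercury_email_ids)  # 1,000 of ~536,000
```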
31
Primary documents were divided up to allow
for systematic checks of inter-rater reliability
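One common way to quantify this kind of overlap check is Cohen's kappa; the sketch below is illustrative only, and the label set shown is assumed rather than taken from the project's codebook.

```python
# Illustrative Cohen's kappa for two coders who labeled the same documents
# (e.g., "exact duplicate", "near-duplicate", "off-topic", "original").
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' label lists."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return 1.0 if expected == 1 else (observed - expected) / (1 - expected)

# Example: two coders agree on 2 of 3 documents -> kappa = 0.5
print(cohens_kappa(["dup", "orig", "dup"], ["dup", "orig", "off-topic"]))
```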
32
Qualitative Methods
  • Goal: Valid inferences about phenomena
  • Replicable and transparent methods
  • Attention to error and corrective measures
  • External validation of results
  • Using computers for qualitative data analysis
    helps
  • The rigor still originates with the research
    design
  • Software makes better organization possible

33
Only 1 passage of text made reference to another
comment (but not really)
32 passages of text made reference to
information in the docket
680 of the first 1000 mercury e-mails sampled
were coded by one student and then checked by two
others. 485 (71%) were 98% exact duplicates, 122
(18%) were 70% duplicates, 50 (7%) were
off-topic, and only 23 (3%) were original
submissions.
122 passages were unique text added to a form
letter
34
1 passage made a reference to another comment
32 passages made a reference to information in
the Docket
320 of the 1000 emails were coded by two
students, and then checked by two more.
140 passages contained unique text added to a
form letter
35
Reports from the original coding can be subcoded
as new analytical units
36
(No Transcript)
37
(No Transcript)
38
(No Transcript)
39
(No Transcript)
40
(No Transcript)
41
(From the Waters public comment dataset)
42
Thank you for inviting me!
  • Dr. Stuart W. Shulman, University of Pittsburgh
  • Shulman@pitt.edu (e-mail)
  • http://shulman.ucsur.pitt.edu (home page)
  • 412.624.3776 (voice)