A Social Validation of Collaborative Annotations on Digital Documents

1
A Social Validation of Collaborative Annotations on Digital Documents
International Workshop on Annotation for Collaboration, Paris, November 24-25, 2005
Guillaume Cabanac, Max Chevalier, Claude Chrisment, Christine Julien

Laboratoire de Gestion et Cognition
2
Our context of work
(Figure: annotations on hardcopy documents are unsharable and get lost; 87% are made by the redactor vs. 13% by readers (Ovsiannikov et al., 1999))
3
Talk Roadmap: Social Validation of Collaborative Annotations
  • Collaborative annotations' weaknesses: utility and usability study
  • Our approach: definitions and validity computation
  • Implementation: the TafAnnote prototype
  • Conclusion and perspectives of future work

4
Collaborative annotations: weaknesses (1/2)
I. Collaborative annotations: weaknesses, utility and usability study
  • Demonstration

Annotea (W3C): annotation server
Amaya (INRIA / W3C): annotation system
5
Collaborative annotations: weaknesses (2/2)
I. Collaborative annotations: weaknesses, utility and usability study
  • no range information (starting point only)
  • no reply count
  • no annotator expertise
  • no annotator opinion in a discussion thread
  • painful annotation exploration
  • no personal annotation space, as for bookmarks
  • scalability issues

6
Collaborative annotations: utility (1/4)
I. Collaborative annotations: weaknesses, utility and usability study
  • Who needs them?
  • Web context
  • for redactors: publication improvement
  • for annotators: debate about different points of view

(Figure: example annotation threads. Correctness: "This is wrong because… you should…"; completeness: "You can also cite (Robert, 1999) who…"; on Keats' poetry: "No, I think that the death of Keats' mother made him…" / "I'm not sure. In his poem To Hope, he shows…")
7
Collaborative annotations: utility (2/4)
I. Collaborative annotations: weaknesses, utility and usability study
  • Who needs them?
  • Decisional systems
  • annotations formulated on
  • schema elements
  • the values themselves
  • → collect and build an expertise memory, cf. (Cabanac, Chevalier, Teste & Ravat, 2006), to appear in EGC'06

8
Collaborative annotations: utility (3/4)
I. Collaborative annotations: weaknesses, utility and usability study
  • Who needs them?
  • Digital libraries
  • the number of digitized documents keeps growing
  • librarians' annotations → improved indexing and information retrieval processes
9
Collaborative annotations: utility (4/4)
I. Collaborative annotations: weaknesses, utility and usability study
  • Who needs them?
  • Industrial context
  • technical documentation improvement

(Figure: lifecycle of a plane's technical documentation: engineers write a draft, technicians test and annotate it, their feedback leads the engineers to modify it, and pilots use and annotate the resulting documentation)
10
Collaborative annotations: usability
I. Collaborative annotations: weaknesses, utility and usability study
  • Are they convenient and practical to use?
  • Scalability projection
  • Our proposal: identify socially validated annotations

11
Talk Roadmap: Social Validation of Collaborative Annotations
  • Collaborative annotations' weaknesses: utility and usability study
  • Our approach: definitions and validity computation
  • Implementation: the TafAnnote prototype
  • Conclusion and perspectives of future work

12
Definitions (1/2)
II. Our approach: definitions and validity computation
  • Collaborative annotation model
  • Objective information
  • Subjective information
  • Annotators' expertise
  • Annotation types

13
Definitions (2/2)
II. Our approach: definitions and validity computation
  • Annotation model
  • Objective information
  • Subjective information
  • Model instantiation

(Model instantiation. Annotation by John Doe, 12/21/2004, "Internet vs Web invention": "It's false: Vint Cerf and Bob Kahn invented IP [1], and Tim Berners-Lee invented the Web [2]", with references [1] Internet history and [2] Inventor: Tim Berners-Lee. Reply by Robert Langdon, 05/14/2005, "Tim Berners-Lee's point of view": "Tim explains on his webpage [1] that he didn't invent the Internet, but rather the World Wide Web.")
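The transcript preserves this instantiation but not the model's formal definition. Below is a minimal Python sketch of such an annotation model; the field names (author, date, title, opinion, ann_type, references, replies) are assumptions inferred from the example above, not the paper's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Annotation:
    """A collaborative annotation: objective metadata plus subjective content."""
    # Objective information (who, when)
    author: str
    date: str                   # e.g. "12/21/2004"
    # Subjective information (what the annotator thinks)
    title: str
    opinion: str
    ann_type: str               # e.g. "correctness", "completeness"
    references: List[str] = field(default_factory=list)        # cited sources
    replies: List["Annotation"] = field(default_factory=list)  # discussion thread

# The instantiation above, rebuilt as data
john = Annotation(
    author="John Doe", date="12/21/2004",
    title="Internet vs Web invention",
    opinion="It's false: Vint Cerf and Bob Kahn invented IP [1], "
            "and Tim Berners-Lee invented the Web [2].",
    ann_type="correctness",
    references=["Internet history", "Inventor: Tim Berners-Lee"],
)
john.replies.append(Annotation(
    author="Robert Langdon", date="05/14/2005",
    title="Tim Berners-Lee's point of view",
    opinion="Tim explains on his webpage [1] that he didn't invent "
            "the Internet, but rather the World Wide Web.",
    ann_type="completeness",
    references=["Tim Berners-Lee's webpage"],
))
```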
14
Agreement of an annotator (1/3)
II. Our approach: definitions and validity computation
  • A mathematical example
  • Considering the combination of annotation types

ann1: "You are mistaken, consider correcting with ..."
    ann1.1: "OK, for example ..."
    ann1.2: "Wrong equation for negative values"
        ann1.2.1: "In general ..."
ann2: "False formula, see this counterexample ..."
ann3: "You should specify the domain of x"
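Using the Annotation sketch above, this discussion tree can be rebuilt to make the reply structure explicit. Authors and dates are not given in the deck, and the ann_type labels below are illustrative guesses.

```python
# Rebuild the slide's thread; authors/dates are unknown in the deck,
# and the ann_type labels are illustrative guesses.
ann121 = Annotation("?", "", "", "In general ...", "generalization")
ann12 = Annotation("?", "", "", "Wrong equation for negative values",
                   "correctness", replies=[ann121])
ann11 = Annotation("?", "", "", "OK, for example ...", "example")
ann1 = Annotation("?", "", "", "You are mistaken, consider correcting with ...",
                  "correctness", replies=[ann11, ann12])
ann2 = Annotation("?", "", "", "False formula, see this counterexample ...",
                  "correctness")
ann3 = Annotation("?", "", "", "You should specify the domain of x",
                  "completeness")

def print_thread(ann: Annotation, depth: int = 0) -> None:
    """Print a discussion thread recursively, indenting replies."""
    print("  " * depth + f"[{ann.ann_type}] {ann.opinion}")
    for reply in ann.replies:
        print_thread(reply, depth + 1)

for root in (ann1, ann2, ann3):
    print_thread(root)
```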
15
Agreement of an annotator (2/3)
II. Our approach: definitions and validity computation
  • Considering the annotator's involvement
  • in commenting
  • in referencing

ann1: "You are mistaken, consider correcting with ..." (comment only)
ann2: reference [1] only
ann3: "You are mistaken, see [1], [2]" (comment plus two references)
16
Agreement of an annotator (3/3)
II. Our approach: definitions and validity computation
  • Combining the agreement of an annotator
  • A concrete example
  • a = 0.6 > b = 0.4 → comments weighted more than references

agreement(a)  involvement
0.64          1 com., 2 ref.
0.22          0 com., 0 ref.
0.36          1 com., 0 ref.
0.20          1 com., 1 ref.
0.29          0 com., 3 ref.
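The agreement formula itself is not preserved in this transcript. The sketch below is one plausible reading: the opinion carried by the annotation type, reinforced by the annotator's involvement in commenting and referencing, with the slide's weights a = 0.6 and b = 0.4. The saturating involvement function and the 1/2 blend are assumptions, and the table's exact values are not reproduced here.

```python
A, B = 0.6, 0.4  # slide: comments weighted more than references

def involvement(count: int, saturation: int = 3) -> float:
    """Map a raw comment/reference count to [0, 1]. The saturating form
    is an assumption; the deck only says involvement grows with the
    number of comments and references an annotator provides."""
    return min(count, saturation) / saturation

def agreement(type_score: float, n_com: int, n_ref: int) -> float:
    """Agreement expressed by one annotator: the opinion carried by the
    annotation type (type_score in [-1, 1]), reinforced by involvement
    in commenting (weight A) and referencing (weight B)."""
    reinforcement = A * involvement(n_com) + B * involvement(n_ref)
    return type_score * (1 + reinforcement) / 2
```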
17
Reliability of an annotated passage
II. Our approach: definitions and validity computation
  • Annotated passage social reliability

18
Validity of a collaborative annotation (1/2): towards a discussion thread opinion synthesis
II. Our approach: definitions and validity computation

reliability scale: -1 = not reliable at all, 0 = dubious, +1 = reliable
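The reliability formula is likewise absent from the transcript. A minimal sketch, assuming the reliability of a passage is the expertise-weighted mean of the agreements of the annotators who discussed it, clamped to the slide's scale:

```python
from typing import Sequence, Tuple

def reliability(opinions: Sequence[Tuple[float, float]]) -> float:
    """Social reliability of an annotated passage on the slide's scale:
    -1 = not reliable at all, 0 = dubious, +1 = reliable.
    `opinions` holds (agreement, expertise) pairs, one per annotator;
    the expertise-weighted mean is an assumption, not the paper's formula."""
    total_expertise = sum(e for _, e in opinions)
    if total_expertise == 0:
        return 0.0  # nobody reacted: the passage stays dubious
    mean = sum(a * e for a, e in opinions) / total_expertise
    return max(-1.0, min(1.0, mean))
```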
19
Validity of a collaborative annotation (2/2): discussion thread opinion synthesis
II. Our approach: definitions and validity computation
  • We consider
  • the agreement of replies (types, comments, references)
  • the expertise declared by the people who reply
  • the context: many replies → annotation more validated

(Figure: an annotation whose thread gathers many concurring replies is more validated than one with few)
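Again, the exact synthesis is not in the transcript. A sketch that reuses the reliability function above and tempers it by the number of replies, since the slide states that more replies make an annotation more validated; the confidence factor is an assumption:

```python
from typing import Sequence, Tuple

def validity(replies: Sequence[Tuple[float, float]]) -> float:
    """Validity of a collaborative annotation, synthesized from its
    discussion thread. `replies` holds (agreement, expertise) pairs,
    one per reply; reuses the reliability sketch above."""
    if not replies:
        return 0.0
    consensus = reliability(replies)                # expertise-weighted consensus
    confidence = len(replies) / (len(replies) + 1)  # more replies -> closer to 1
    return consensus * confidence
```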
20
Talk Roadmap: Social Validation of Collaborative Annotations
  • Collaborative annotations' weaknesses: utility and usability study
  • Our approach: definitions and validity computation
  • Implementation: the TafAnnote prototype
  • Conclusion and perspectives of future work

21
The TafAnnote prototype: general description
III. Implementation: social validation of annotations in the TafAnnote prototype
  • Client / server architecture
  • Mozilla Firefox extension

22
The TafAnnote prototype: main features (1/4)
III. Implementation: social validation of annotations in the TafAnnote prototype
  • Annotation creation

(Screenshot: the personal annotation space)
23
The TafAnnote prototype: main features (2/4)
III. Implementation: social validation of annotations in the TafAnnote prototype
  • Annotation consultation

(Screenshot: adaptive notification of new information; discussion thread)
24
The TafAnnote prototype: main features (3/4)
III. Implementation: social validation of annotations in the TafAnnote prototype
  • Personal annotation space management

(Screenshot: drag-and-drop reorganization; view by annotation type)
25
The TafAnnote prototype: main features (4/4)
III. Implementation: social validation of annotations in the TafAnnote prototype
  • Annotation search
  • Boolean search
  • filter by annotators' types

26
The TafAnnote prototype: social validation at work
III. Implementation: social validation of annotations in the TafAnnote prototype
  • Implemented in Oracle PL/SQL (server side)
  • Modification of the annotations' display

emphasized → validated annotation
quite hidden → dubious remark
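The validity computation runs server-side in Oracle PL/SQL; the transcript only states the display rule. As a hypothetical illustration (not TafAnnote's actual code), a client-side mapping from validity to styling could look like this, with arbitrary thresholds:

```python
def display_style(validity_score: float) -> dict:
    """Map an annotation's validity in [-1, 1] to CSS-like properties:
    validated annotations are emphasized, dubious remarks quite hidden.
    Thresholds and property values are illustrative assumptions."""
    if validity_score >= 0.5:
        return {"font-weight": "bold", "opacity": "1.0"}  # emphasized
    if validity_score <= -0.5:
        return {"opacity": "0.25"}                        # quite hidden
    return {"opacity": "0.7"}                             # in between
```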
27
Conclusion and perspectives
IV. Conclusion and perspectives of future work
  • A social validation of collaborative annotations
  • why? a scalability issue affecting usability
  • how? exploit people's opinions expressed in discussion threads
  • aim? indicate validated information expressed by annotations
  • Implementation: the TafAnnote prototype
  • client / server architecture
  • user interaction: Mozilla Firefox extension
  • annotation storage: Oracle RDBMS
  • personal and collective annotation management
  • emphasized validated annotations → pre-sort the right information
  • Perspectives of future work
  • evaluation with real users
  • NLP techniques → deducing annotators' opinions
  • exploitation: indexing, summarizing

28
Question time