1
Approaches To Automated Benchmarking Of Public
Sector Web Sites
  • Brian Kelly
  • UK Web Focus, UKOLN
  • University of Bath
  • Bath, BA2 7AY

Email: B.Kelly@ukoln.ac.uk  URL: <http://www.ukoln.ac.uk/>
UKOLN is supported by
2
Contents
  • Background
  • WebWatch Project
  • Current Approach
  • Pilot UK Local Authority Survey
  • Other Approaches
  • Discussion
  • Conclusions and Recommendations

3
The Problem
  • Background
  • Local and central government organisations are
    developing Web-based services
  • There is a need to audit the services in order to
    measure compliance with standards and guidelines,
    coverage, usability, etc.
  • Aim Of This Talk
  • This talk describes experiences in the use of
    Web-based auditing services and summarises the
    benefits and limitations of this approach
  • NOTE
  • The talk does not provide detailed results of a
    survey of UK public sector Web sites although a
    summary of a pilot is given
  • The talk does not cover manual evaluation of Web
    sites

4
Web Site Benchmarking
  • Why benchmark Web sites?
  • To monitor compliance with standards and guidelines
  • To monitor trends and developments across a
    community
  • To allow funders to observe developments
  • To allow members of a community to see how the
    community is developing and how they compare with
    the community
  • To inform the Web community on the uptake of Web
    standards and protocols e.g.
  • inform W3C on extent of compliance with WAI
    guidelines across large communities
  • inform e-Government on take-up of E-GIF standards

5
Benchmarking Examples
  • Examples
  • Do local government Web sites comply with W3C WAI
    guidelines?
  • How large are the entry points to local
    government Web sites?
  • Do the Web sites comply with HTML, CSS, XML, etc.
    standards?
  • Do the Web sites work?
  • Does it appear, for example, that the importance
    of accessibility and standards compliance has been
    accepted, or does compliance seem too difficult to
    achieve?

6
WebWatch Project
  • WebWatch project
  • Funded for one year by the British Library
  • Started in 1997
  • Software developer recruited
  • Development and use of robot software to monitor
    Web sites across communities
  • Several surveys carried out
  • UK Public Library Web sites
  • UK University Web sites
  • See <http://www.ukoln.ac.uk/web-focus/webwatch/reports/>

7
WebWatch Mark II
  • By 1999
  • Funding had finished
  • Software developer left
  • Realisation that
  • Development of in-house software was expensive
  • Web site auditing tools were becoming available
  • Web site auditing Web services were becoming
    available
  • Since 1999
  • Use of (mainly) freely available Web services to
    benchmark various public sector Web communities
  • Regular columns in Ariadne e-journal
    <http://www.ariadne.ac.uk/> (list at
    <http://www.ukoln.ac.uk/web-focus/webwatch/reports/#latest>)
  • Experience gained in issues of Web site
    benchmarking

8
Benchmarking Web Sites
http://www.cast.org/bobby/
  • Bobby is an example of a Web-based benchmarking
    service which provides information on compliance
    with W3C WAI guidelines

9
Use Of The Services
  • The benchmarking Web sites are normally designed
    for interactive (manual) use
  • However the input to the Web sites can be managed
    automatically, which speeds up the submission
    process (see the sketch below)
  • It would be possible to automate processing of
    the results, but this hasn't (yet) been done
  • Lack of software developer resources
  • Quality of output needs to be determined
  • It should be the responsibility of the service
    provider to provide output in a reusable format
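A minimal Python sketch of such automated submission, assuming a
hypothetical checker that accepts the target URL as a query parameter
(the endpoint and parameter names are illustrative, not a real
service):

    from urllib.parse import urlencode
    from urllib.request import urlopen

    CHECKER = "http://checker.example.org/check"  # hypothetical endpoint

    def submit(site):
        # Build the query string the interactive form would have sent
        query = urlencode({"URL": site, "output": "Submit"})
        with urlopen(CHECKER + "?" + query) as response:
            return response.read()  # raw HTML report; processing it is still manual

    for site in ("http://www.ukoln.ac.uk/", "http://www2.brent.gov.uk/"):
        print(site, len(submit(site)), "bytes of report HTML")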

10
Displaying Results
  • The input to the benchmarking Web services and a
    summary of the results are provided as a Web
    resource.
  • This provides
  • Openness of methodology
  • Ability to compare your Web sites with those
    published
  • Technique Used
  • Use the Web service on a site
  • Copy the resulting URL into a template
  • Determine the URL structure
  • Use it as the basis for requests for other URLs
    (see the sketch below)

http://bobby.cast.org/bobby/bobbyServlet?URL=http%3A%2F%2Fwww2.brent.gov.uk%2F&output=Submit&gl=wcag1-aaa
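The query URL above shows the technique in miniature: capture the URL
generated by one interactive run, then substitute other sites into it.
A minimal Python sketch (the second council URL is a made-up
placeholder):

    from urllib.parse import quote

    TEMPLATE = ("http://bobby.cast.org/bobby/bobbyServlet"
                "?URL={url}&output=Submit&gl=wcag1-aaa")

    sites = [
        "http://www2.brent.gov.uk/",
        "http://www.example-council.gov.uk/",  # hypothetical further site
    ]
    for site in sites:
        # Percent-encode each site URL so it can sit inside the query string
        print(TEMPLATE.format(url=quote(site, safe="")))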
11
Use of Bobby
Bobby analysis of <http://www.ukonline.gov.uk/>
  • Analysis of UKOnline appears to show a compliant
    site, 0.5K in size.
  • Examination shows that this is an analysis of a
    Redirect page. Analysis of the destination shows
    lack of compliance with WAI guidelines and a size
    of 1.17K
  • Further examination shows that this is an analysis
    of a Frames page. Analysis of the individual
    frames shows
  • A file size of 24.8 K for one frame
  • The other frame could not be analysed due to lack
    of support for cookies in Bobby
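Both the redirect trap and the frames trap described above can be
detected automatically before a page is submitted for analysis. A
minimal Python sketch, assuming a crude regular expression is good
enough to spot frame elements:

    import re
    from urllib.parse import urljoin
    from urllib.request import urlopen

    def pages_to_analyse(url):
        with urlopen(url) as response:     # urlopen follows HTTP redirects itself
            final_url = response.geturl()  # destination after any redirects
            html = response.read().decode("utf-8", errors="replace")
        # If this is a frameset page, each frame needs its own analysis
        frames = re.findall(r'<frame[^>]+src="([^"]+)"', html, re.IGNORECASE)
        return [final_url] + [urljoin(final_url, src) for src in frames]

    print(pages_to_analyse("http://www.ukonline.gov.uk/"))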

12
Benchmarking Services (2)
http://www.netmechanic.com/
  • NetMechanic is another example of a Web-based
    Web site testing service
  • It can check
  • Links
  • HTML and browser compatibility
  • File sizes

13
Benchmarking Sites
  • It is possible to benchmark entire Web sites and
    not just individual pages, such as entry points
  • Number of links to the Web site (see the sketch
    below)
  • Number of pages indexed
  • Relationships with other Web sites
  • You can also measure the server availability and
    uptime (e.g. using Netcraft)
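A sketch of how the link counts can be gathered: construct the "link:"
queries that search engines supported at the time and submit them (the
AltaVista endpoint shown is an assumption, and the counts themselves
still have to be read off the result pages):

    from urllib.parse import quote

    def link_queries(site):
        # The link: operator asks a search engine for pages linking to the site
        q = quote("link:" + site, safe="")
        return [
            "http://www.google.com/search?q=" + q,
            "http://www.altavista.com/web/results?q=" + q,  # assumed endpoint
        ]

    for url in link_queries("www.brent.gov.uk"):
        print(url)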

14
Standard Files
  • It is also possible to analyse a number of
    standard Web site files
  • The robots.txt file
  • Has one been created (to stop robots from
    indexing, say, pre-release information)?
  • Is it valid?
  • The 404 error page
  • Has a tailored 404 page been created or is the
    server default one used?
  • Is it rich in functionality (search facility,
    links to appropriate help information, etc.)?
  • Search Engine page
  • Is a search facility provided, and, if so, what
    type?

Note: manual observation of the functionality of
these files is currently needed (the sketch below
automates only the basic fetch-and-status checks)
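Parts of these checks can nonetheless be automated. A minimal Python
sketch, using the standard library's robotparser for robots.txt and
probing a path that should not exist (the path is made up) to see
whether a real 404 status comes back:

    from urllib import robotparser
    from urllib.error import HTTPError
    from urllib.request import urlopen

    def robots_allows_all(site):
        rp = robotparser.RobotFileParser(site + "robots.txt")
        rp.read()  # a missing or empty file imposes no restrictions
        return rp.can_fetch("*", site)

    def error_status(site):
        try:
            urlopen(site + "this-page-should-not-exist")  # hypothetical path
        except HTTPError as err:
            return err.code  # 404 expected; judging the page's richness stays manual
        return 200  # a catch-all page was served instead of a real 404

    site = "http://www.ukoln.ac.uk/"
    print(robots_allows_all(site), error_status(site))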
15
Pilot Benchmarking
  • Short-listed candidates for the SPIN 2001-SOCITM
    Web site Awards were used in a pilot benchmarking
    exercise
  • Benchmarking initially carried out in July 2001
    (for a paper at the EuroWeb 2001 conference)
  • Repeated in April 2002 (allowed trends to be
    spotted)
  • Web sites analysed were
  • L B Brent
  • L B Camden
  • L B Richmond
  • Tameside MBC
  • Wokingham Council
  • Dumfries & Galloway Council
  • Dundee City Council
  • East Renfrewshire Council
  • Moray Council
  • West Lothian Council
  • Cardiff CC
  • Ceredigion CC
  • Isle of Anglesey CC
  • Wrexham CBC
  • Antrim BC
  • Armagh DC
  • Belfast City Council
  • Newtownabbey BC

5 English, 5 Scottish, 4 Welsh and 4 Northern Irish sites
Findings at <http://www.ukoln.ac.uk/web-focus/events/conferences/spin-2002/>
16
Pilot Benchmarking Findings (1)
  • Accessibility (using Bobby)
  • In the first survey 8 (44%) sites had no WAI P1
    errors on the home page
  • In the second survey only 1 site had no P1 errors
  • Comments
  • Accessibility is an important issue and awareness
    of this is growing, but the most visible page
    on these Web sites tends not to be accessible,
    and this is getting worse
  • Discussion
  • Bobby changed its service between the two
    surveys. It no longer reports on the file size.
    Has it changed its algorithm for measuring
    accessibility?

17
Pilot Benchmarking Findings (2)
  • HTML Quality, etc. (using NetMechanic)
  • One home page appeared to have a broken link in
    both surveys, but this appears not to be the case
  • All home pages have HTML errors, and in some
    cases this is getting worse (from 4 errors to 101
    errors in one case)
  • Comments
  • Compliance with HTML standards is needed in order
    to (a) avoid browser dependencies, (b) facilitate
    access by specialist browsers and (c) facilitate
    repurposing
  • The Web sites do not appear to be addressing this
  • Many errors could be easily fixed, e.g. by adding
    a DOCTYPE declaration at the top of the file

18
Pilot Benchmarking Findings (3)
  • Web Server Software (using Netcraft)
  • 12 Web sites (66.7%) use an MS Windows platform,
    5 (27.8%) a Unix platform and 1 (5.6%) an unknown
    platform
  • Proportions had not changed in second survey
  • Will proportions change in light of MS licensing
    fees?
  • Link Popularity (using LinkPopularity)
  • In the initial survey the most linked-to site was
    Dundee City Council (896 links according to
    AltaVista) or L B Brent (626 links according to
    Google)
  • In the second survey the most linked-to site was
    Brent (883 links according to AltaVista) or
    Camden (1,810 links according to Google)

19
Pilot Benchmarking Findings (4)
  • 404 Page
  • 12 Web sites (67%) still had the server default
    404 page
  • Proportions had not changed in second survey
  • Search Engine Page
  • 6 Web sites (33%) do not appear to have a search
    facility

20
Some Issues (1)
  • When using Bobby and NetMechanic, different
    results may be obtained.
  • This may be due to
  • Analysing vs. following redirects
  • Analysis of frameset page but not individual
    frame pages
  • Not analysing images due to Robot Exclusion
    Protocol
  • Differences in covering external resources such
    as JavaScript files, CSS, etc.
  • Splash screens

21
Some Issues (2)
  • Bobby changed its interface, URL, functionality
    and licensing conditions between the two surveys
  • The URL change meant the previous live survey
    wouldn't work
  • Bobby no longer provides information on browser
    compatibility errors or file sizes
  • The downloadable version of Bobby is no longer
    free (not an issue for this work)
  • This illustrates some of the dangers of this
    approach
  • It is not known if Bobby's algorithms for
    measuring WAI compliance were changed, which could
    affect comparisons

22
Market For Benchmarking
  • There is increasing interest in Web site
    benchmarking
  • Consortia, e.g. see the SOCITM "Will you be Better
    Connected in 2001?" service at
    <http://www.socitm.gov.uk/mapp/mapdf/Web_inner.pdf>
  • Visual impairment rating
  • 12 page report about your site
  • Recommendations for improving site
  • £495 (subscribers) or £950 for the survey
  • Industry e.g. companies such as Business2www
  • Published factual audit of Local Government sites
  • See <http://www.business2www.com/>
  • Or Google search for "Web auditing"

23
Who Does The Work And Why?
  • Who should benchmark?
  • Community itself (e.g. national association)
  • But how self-critical can it be?
  • The funders
  • But will they take on board the complexities?
  • Neutral body
  • But is there an obvious body to do the work?
  • What is the purpose of the benchmarking?
  • Is it linked to funding, with penalty clauses for
    non-compliance?
  • Is it to support the development of the
    community, by highlighting best practices?

24
Technical Issues
  • Web Services
  • There is a need to move from the use of
    interactive Web sites to services designed for
    machine use
  • There may be a role for a "Web Service" approach
    in which a rich set of inputs can be provided
    (e.g. using SOAP)
  • EARL
  • There is a need for a neutral and reusable output
    format from benchmarking services
  • W3C's EARL (Evaluation and Report Language)
    may have a role to play
  • As EARL is based on RDF it should be capable of
    describing the benchmarking environment in a rich
    and machine understandable way
  • See <http://www.w3.org/WAI/ER/IG/earl.html>
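As a rough illustration of the kind of reusable output EARL aims at,
the Python sketch below emits a single benchmarking result as RDF-style
triples. EARL was still a draft at the time of this talk, and the real
schema places the outcome on a separate TestResult resource, so the
property names here are simplified and illustrative:

    EARL = "http://www.w3.org/ns/earl#"  # namespace of the later W3C schema

    def assertion(subject_url, test_id, outcome):
        # One flattened EARL-style assertion: which page, which test, what result
        node = "_:assertion1"  # blank node identifying the assertion
        return [
            (node, EARL + "subject", subject_url),
            (node, EARL + "test", test_id),
            (node, EARL + "outcome", EARL + outcome),
        ]

    for triple in assertion("http://www2.brent.gov.uk/", "WCAG-1.0-P1", "failed"):
        print(triple)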

25
Recommendations (1)
  • Standards Bodies (e.g. W3C or equivalent)
  • There is a clear need for rigorous definitions
    to assist in Web auditing in order to ensure that
    valid comparisons can be made across auditing
    services
  • It would be useful to provide test case Web sites
    in order to compare different audits
  • Examples
  • Definitions of a page
  • Files which should be analysed
  • How to handle the Robot Exclusion Protocol
  • User-agent view

26
Recommendations (2)
  • Applications Developers
  • There is a need to ensure that Web-based
    benchmarking services can be tailored and the
    output can be reused
  • Benchmarking services should be capable of
    emulating a range of user agents (see the sketch
    after this list)
  • Benchmarking services should provide user control
    over compliance with the Robot Exclusion Protocol
  • Benchmarking services should provide user control
    over definitions of files to be analysed
  • Benchmarking services should provide user control
    over the definition of a page (e.g. include
    redirected pages, sum results of original and
    redirected page, etc.)
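A minimal Python sketch of the user-agent emulation recommended above:
request the same page with different User-Agent headers and compare
what is served (the header strings are illustrative):

    from urllib.request import Request, urlopen

    AGENTS = {
        "text browser": "Lynx/2.8",
        "graphical browser": "Mozilla/4.0 (compatible; MSIE 5.0; Windows NT)",
    }

    def fetch_as(url, agent_string):
        request = Request(url, headers={"User-Agent": agent_string})
        with urlopen(request) as response:
            return len(response.read())  # compare sizes as a crude first check

    for name, agent in AGENTS.items():
        print(name, fetch_as("http://www.ukoln.ac.uk/", agent), "bytes")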

27
Recommendations (3)
  • There are benefits to communities in monitoring
    trends and sharing the best practices spotted in
    benchmarking work
  • Let's share the results and issues across our
    related communities
  • Let's share the approaches to benchmarking across
    bodies involved in benchmarking

28
Questions
  • Any questions?