Title: Solving Some Modeling Challenges when Testing Rich Internet Applications for Security
1. Solving Some Modeling Challenges when Testing Rich Internet Applications for Security
Software Security Research Group (SSRG), University of Ottawa
In collaboration with IBM
2. SSRG Members
- University of Ottawa
- Prof. Guy-Vincent Jourdan
- Prof. Gregor v. Bochmann
- Suryakant Choudhary (Master's student)
- Emre Dincturk (PhD student)
- Khaled Ben Hafaiedh (PhD student)
- Seyed M. Mir Taheri (PhD student)
- Ali Moosavi (Master's student)
- In collaboration with
- Iosif Viorel Onut (PhD), Research and Development, IBM Security AppScan Enterprise
3. IBM Security Solutions
IBM Rational AppScan Enterprise Edition: product overview
4. IBM Rational AppScan Suite: Comprehensive Application Vulnerability Management
(Diagram: AppScan products mapped across the application lifecycle)
- Analysis types: Dynamic Analysis (blackbox), Static Analysis (whitebox)
- Lifecycle phases: Security Requirements, Code, Build, QA, Pre-Prod, Production
- Products: Security Requirements Definition, AppScan Source, AppScan Build, AppScan Tester, AppScan Standard, AppScan Enterprise, AppScan onDemand, AppScan Reporting Console
- Activities across the lifecycle:
- Security requirements defined before design and implementation
- Build security testing into the IDE
- Automate security/compliance testing in the build process
- Security/compliance testing incorporated into testing and remediation workflows
- Outsourced testing for security audits and production site monitoring
- Security compliance testing, oversight, control, policy, audits
- Application Security Best Practices: Secure Engineering Framework
5. AppScan Enterprise Edition capabilities
6. AppScan Enterprise Workflows
- Management
- Review most common security issues
- View trends
- Assess risk
- Evaluate progress
- Development / QA
- Conduct assessments
- View assessment results
- Remediate issues
- Assign issue status
- Compliance Officers
- Review compliance reports
AppScan Enterprise
- Build automation
- Source code analysis for security issues as part of build verification
- Publish findings for remediation and trending
- Information Security
- Schedule and automate assessments
- Conduct assessments with AppScan Standard and
AppScan Source and publish findings for
remediation and trending
- Tools
- AppScan Standard Edition
- AppScan Source Edition
- Tools
- AppScan Source for Automation
- AppScan Standard Edition CLI
7. View detailed security issue reports
- Security Issues Identified with Static Analysis
- Security Issues Identified with Dynamic Analysis
- Aggregated and correlated results
- Remediation Tasks
- Security Risk Assessment
8. Obtain a high-level view of the security of your applications
- Compare the number of issues across teams and applications
- Identify top security issues and risks
- View trending of the number of issues by severity over time
- Monitor the progress of issue resolution
9. Assess regulatory compliance risk
- Over 40 compliance reports, including
- The Payment Card Industry Data Security Standard (PCI)
- VISA CISP
- Children's Online Privacy Protection Act (COPPA)
- Financial Services (GLBA)
- Healthcare Services (HIPAA)
- Sarbanes-Oxley Act (SOX)
10. Introduction: Traditional Web Applications
- Navigation is achieved using links (URLs)
- Synchronous communication
11. Introduction: Rich Internet Applications
- More interactive and responsive web apps
- Page changes via client-side code (JavaScript)
- Asynchronous communication
12. Crawling and web application security testing
- All parts of the application must be discovered before we can analyze them for security
- Why are automatic crawling algorithms important for security testing?
- Most RIAs are too large for manual exploration
- Efficiency
- Coverage
13. What we present
- Techniques and approaches to make web application security assessment tools perform better
- How to improve the performance?
- Make them efficient by analysing only what's important and ignoring irrelevant information
- Making rich internet applications accessible to them
14. Web application crawlers
- Main components
- Crawling strategy
- Algorithm which guides the crawler
- State equivalence
- Algorithm which indicates what should be
considered new
15. State Equivalence
- Client states
- Decides if two client states of an application should be considered different or the same
- Why important?
- Infinite runs or state explosion
- Incomplete coverage of the application
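The idea above can be sketched as follows. This is a minimal illustration, not the product's actual algorithm: it identifies a client state by hashing a normalized copy of the DOM, where `normalize` is a hypothetical placeholder for whatever filtering (e.g. the Delta(X) technique) the crawler applies. The function names are illustrative.

```python
import hashlib
import re

def normalize(dom_html):
    """Strip content assumed irrelevant before comparing states.
    The regexes here are illustrative; a real crawler would use a
    learned or configured notion of relevance."""
    s = re.sub(r"<!--.*?-->", "", dom_html, flags=re.S)  # drop HTML comments
    s = re.sub(r"\s+", " ", s)                           # collapse whitespace
    return s.strip()

def state_id(dom_html):
    """A client state is identified by the hash of its normalized DOM."""
    return hashlib.sha256(normalize(dom_html).encode()).hexdigest()

def same_state(dom_a, dom_b):
    """Two DOMs represent the same state iff their identifiers match."""
    return state_id(dom_a) == state_id(dom_b)
```

Too strict an equivalence (raw DOM hashing) causes state explosion; too loose an equivalence merges genuinely different pages and loses coverage, which is exactly the trade-off the slide names.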
16. Techniques
- 1. Load-Reload: discovering non-relevant dynamic content of web pages
- 2. Identifying session variables and parameters
17. 1. Load-Reload: Discovering non-relevant dynamic content of web pages
- Extracting the relevant information from a page.
18. What we propose
- Reload the web page (URL) to determine the parts of the content that are relevant
- Calculate Delta(X): the content that changed between the two loads
19. What we propose (2)
- Delta(X): X is any web page and Delta(X) is the collection of XPaths of the contents that are not relevant
- E.g. Delta(X) = { /html/body/div, /html/body/a/@href }
20. Example
21. Example (2)
22. What we propose (3)
- Delta(X)
- Is purpose- and application-dependent
- A few computing techniques:
- Use proxies
- Manual identification to supplement the automatic detection algorithm, etc.
23. 2. Identifying Session Variables and Parameters
- What is a session?
- A session is a conversation between the server and a client
- Why should a session be maintained?
- HTTP is stateless: when there is a series of continuous requests and responses from the same client to a server, the server cannot identify which client it is getting requests from
24. Identifying Session Variables and Parameters (2)
- Session tracking methods
- User authorization
- Hidden fields
- URL rewriting
- Cookies
- Session tracking API
- Problems that are addressed
- Redundant crawling: might result in a crawler trap or infinite runs
- Session termination problem: incomplete coverage of the application if the application requires a session throughout the access
25. What we propose
- Two recordings of the log-in sequence are made on the same website, using the same user input (e.g. same user name and password) and the same user actions
26. Example
27. 3. Crawling Strategies for RIAs
- Crawling extracts a model of the application that consists of
- States, which are distinct web pages
- Transitions, which are triggered by event executions
- The strategy decides how the application exploration should proceed
28. Standard Crawling Strategies
- Breadth-First and Depth-First
- They are not flexible
- They do not adapt themselves to the application
- Breadth-First often goes back to the initial page
- This increases the number of reloads (loading the URL)
- Depth-First requires traversing long paths
- This increases the number of event executions
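The breadth-first reload cost can be illustrated with a toy cost model. This is not AppScan's implementation; it assumes the application is a known state graph (state -> list of (event, next-state) pairs) and that after firing one event from a state, reaching an unexplored sibling event requires reloading the URL and replaying. The function name and cost accounting are illustrative.

```python
from collections import deque

def bfs_crawl_cost(graph, start):
    """Toy cost model: count URL reloads and event executions a
    breadth-first crawl incurs on `graph`, assuming every event after
    the first one fired from a state requires a reload to get back."""
    reloads, events = 0, 0
    visited, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        for i, (_event, nxt) in enumerate(graph[state]):
            events += 1
            if i > 0:        # no longer at `state` after the first event:
                reloads += 1 # model this as one reload + replay
            if nxt not in visited:
                visited.add(nxt)
                queue.append(nxt)
    return reloads, events
```

Even on a tiny graph with two events at the start page, BFS already pays one reload; on real RIAs with hundreds of events per state, these reloads dominate, which motivates the model-based strategies on the next slide.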
29. What we propose
- Model-Based Crawling
- A model is an assumption about the structure of the application
- Specify a good strategy for crawling any application that follows the model
- Specify how to adapt the crawling strategy in case the application being crawled deviates from the model
30. What we propose (2)
- Existing models
- Hypercube Model
- Independent events
- The set of enabled events at a state is the same as at the initial state, except for the ones executed to reach it
- Probability Model
- Statistics gathered about event execution results are used to guide the application exploration strategy
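The hypercube assumption can be made concrete with a small sketch: if the initial state enables n independent events, a state is characterised by the subset of events already executed, and the enabled events are the rest, giving 2^n predicted states. The function name is illustrative, not from the cited work.

```python
from itertools import combinations

def hypercube_states(initial_events):
    """Enumerate the states the hypercube model predicts for a set of
    independent events: one state per subset of executed events, whose
    enabled events are exactly the not-yet-executed ones."""
    events = frozenset(initial_events)
    states = {}
    for r in range(len(events) + 1):
        for executed in combinations(sorted(events), r):
            states[frozenset(executed)] = events - frozenset(executed)
    return states
```

With three initial events the model predicts 2^3 = 8 states; a strategy built on this model can plan event executions in advance and only falls back to adaptation when the real application deviates from the predicted hypercube.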
31. Conclusion
- Crawling is essential for automated security testing of web applications
- We introduced two techniques to enhance security testing of web applications
- Identifying and ignoring irrelevant web page contents
- Identifying and ignoring session information
- We have worked on new crawling algorithms
32. Thank You!
33. Demonstration
- Rich Internet Application Security Testing
- IBM Security AppScan Enterprise
34. DEMO: IBM Security AppScan Enterprise
- IBM Security AppScan Enterprise is an automated web application scanner
- We added RIA crawling capability to a prototype of AppScan
- We will demo how the coverage of the tool increases with RIA crawling capability
35. DEMO: Test Site (Altoro Mutual)
36. DEMO: Results
37. DEMO: Results