
A Web Application Hacker’s Toolkit

Updated: Jan 3, 2023



Some attacks on web applications can be performed using only a standard web browser; however, the majority of them require you to use some additional tools. Many of these tools operate in conjunction with the browser, either as extensions that modify the browser’s own functionality or as external tools that run alongside the browser and modify its interaction with the target application.





The most important item in your toolkit falls into this latter category. It operates as an intercepting web proxy, enabling you to view and modify all the HTTP messages passing between your browser and the target application. Over the years, basic intercepting proxies have evolved into powerful integrated tool suites containing numerous other functions designed to help you attack web applications. This blog examines how these tools work and describes how you can best use their functionality.

Finally, numerous smaller tools are designed to perform specific tasks when testing web applications. Although you may use these tools only occasionally, they can prove extremely useful in particular situations.


Web Browsers

A web browser is not exactly a hacking tool, as it is the standard means by which web applications are designed to be accessed. Nevertheless, your choice of web browser may have an impact on your effectiveness when attacking a web application. Furthermore, various extensions are available to different types of browsers, which can help you carry out an attack. This section briefly examines two popular browsers and some of the extensions available for them.


Firefox

Firefox is currently the second most widely used web browser. By most estimates, it makes up approximately 35% of the market. The majority of web applications work correctly on Firefox; however, it has no native support for ActiveX controls.


There are many subtle variations among different browsers’ handling of HTML and JavaScript, particularly when they do not strictly comply with the standards. Often, you will find that an application’s defenses against bugs such as cross-site scripting mean that your attacks are not effective against every browser platform. Firefox’s popularity is sufficient that Firefox-specific XSS exploits are perfectly valid, so you should test these against Firefox if you encounter difficulties getting them to work against IE. Also, features specific to Firefox have historically allowed a range of attacks to work that are not possible against IE.



A large number of browser extensions are available for Firefox that may be useful when attacking web applications, including the following:



  • HttpWatch, an HTTP viewing and debugging add-on originally released for Internet Explorer, is also available for Firefox.

  • FoxyProxy enables flexible management of the browser’s proxy configuration, allowing quick switching, setting of different proxies for different URLs, and so on.

  • LiveHTTPHeaders lets you modify requests and responses and replay individual requests.

  • PrefBar allows you to enable and disable cookies (useful for quick access control checks), switch between different proxies, clear the cache, and change the browser’s user agent.

  • Wappalyzer uncovers technologies in use on the current page, showing an icon for each one found in the URL bar.

  • The Web Developer toolbar provides a variety of useful features. Among the most helpful is the ability to view all links on a page, alter HTML to make form fields writable, remove maximum lengths, unhide hidden form fields, and change a request method from GET to POST.





Chrome

Chrome is a relatively new arrival on the browser scene, but it has rapidly gained popularity, capturing approximately 15% of the market.

A number of browser extensions are available for Chrome that may be useful when attacking web applications, including the following:

  • XSS Rays is an extension that tests for XSS vulnerabilities and allows DOM inspection.

  • Cookie editor allows in-browser viewing and editing of cookies.

  • The Web Developer Toolbar is also available for Chrome.

Chrome is likely to contain its fair share of quirky features that can be used when constructing exploits for XSS and other vulnerabilities.


Integrated Testing Suites


After the essential web browser, the most useful item in your toolkit when attacking a web application is an intercepting proxy. In the early days of web applications, the intercepting proxy was a standalone tool that provided minimal functionality. The venerable Achilles proxy simply displayed each request and response for editing. Although it was extremely basic, buggy, and a headache to use, Achilles was sufficient to compromise many a web application in the hands of a skilled attacker.

Over the years, the humble intercepting proxy has evolved into a number of highly functional tool suites, each containing several interconnected tools designed to facilitate the common tasks involved in attacking a web application. Several testing suites are commonly used by web application security testers:

  • Burp Suite

  • WebScarab

  • Paros

  • Zed Attack Proxy

  • Andiparos

  • Fiddler

  • CAT

  • Charles

These toolkits differ widely in their capabilities, and some are newer and more experimental than others. In terms of pure functionality, Burp Suite is the most sophisticated, and currently, it is the only toolkit that contains all the functionality described in the following sections. To some extent, which tools you use is a matter of personal preference. If you do not yet have a preference, we recommend that you download and use several of the suites in a real-world situation and establish which best meets your needs.


This section examines how the tools work and describes the common workflows involved in making the best use of them in your web application testing.


How the Tools Work


Each integrated testing suite contains several complementary tools that share information about the target application. Typically, the attacker engages with the application in the normal way via his browser. The tools monitor the resulting requests and responses, storing all relevant details about the target application and providing numerous useful functions. The typical suite contains the following core components.

  • An intercepting proxy

  • A web application spider

  • A customizable web application fuzzer

  • A vulnerability scanner

  • A manual request tool

  • Functions for analyzing session cookies and other tokens

  • Various shared functions and utilities

Intercepting Proxies


The intercepting proxy lies at the heart of the tool suite and remains today the only essential component. To use an intercepting proxy, you must configure your browser to use a port on the local machine as its proxy server. The proxy tool is configured to listen on this port and receives all requests issued by the browser. Because the proxy has access to the two-way communications between the browser and the destination web server, it can stall each message for review and modification by the user, and perform other useful functions.
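To illustrate the core mechanic, the following is a minimal sketch of a plain-HTTP intercepting proxy written in Python using only the standard library. It simply logs each GET request before forwarding it; a real suite such as Burp also handles HTTPS and other methods, and lets you pause, edit, and replay each message in flight.

```python
# Minimal sketch of a plain-HTTP intercepting proxy using only the standard
# library. Point the browser's HTTP proxy at 127.0.0.1:8080 to see each
# request logged before it is forwarded. HTTPS, non-GET methods, and request
# editing are deliberately omitted.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


class InterceptingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # When the browser uses this server as a proxy, the request line
        # carries the absolute URL, so self.path is the full target URL.
        print(f"[intercepted] GET {self.path}")
        # A real tool would pause here so the tester can review and modify
        # the request (headers, parameters, cookies) before sending it on.
        upstream = urlopen(self.path)
        body = upstream.read()
        self.send_response(upstream.status)
        self.send_header("Content-Type", upstream.headers.get("Content-Type", "text/html"))
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), InterceptingProxy).serve_forever()
```

Pointing the browser at 127.0.0.1:8080 (as described in the next section) causes every request to pass through do_GET, which is exactly the interception point that the commercial tools expose through their user interfaces.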


Configuring Your Browser


If you have never set up your browser to use a proxy server, this is easy to do on any browser. First, establish which local port your intercepting proxy uses by default to listen for connections (usually 8080). Then follow the steps required for your browser; a quick way to confirm that traffic is passing through the proxy is sketched after the list below.


  • In Internet Explorer, select Tools → Internet Options → Connections → LAN settings. Ensure that the “Automatically detect settings” and “Use automatic configuration script” boxes are not checked. Ensure that the “Use a proxy server for your LAN” box is checked. In the Address field, enter 127.0.0.1, and in the Port field, enter the port used by your proxy. Click the Advanced button, and ensure that the “Use the same proxy server for all protocols” box is checked. If the hostname of the application you are attacking matches any of the expressions in the “Do not use a proxy server for addresses beginning with” box, remove these expressions. Click OK in all the dialogs to confirm the new configuration.

  • In Firefox, select Tools → Options → Advanced → Network → Settings. Ensure that the Manual Proxy Configuration option is selected. In the HTTP Proxy field, enter 127.0.0.1, and in the adjacent Port field, enter the port used by your proxy. Ensure that the “Use this proxy server for all protocols” box is checked. If the hostname of the application you are attacking matches any of the expressions in the “No proxy for” box, remove these expressions. Click OK in all the dialogs to confirm the new configuration.

  • Chrome uses the proxy settings from the native browser that ships with the operating system on which it is running. You can access these settings via Chrome by selecting Options → Under the Bonnet → Network → Change Proxy Settings.
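Once the browser (or any other HTTP client) is pointed at the proxy, it is worth confirming that traffic really flows through it. The following sketch uses the third-party requests library and assumes the proxy is listening on 127.0.0.1:8080 and that http://example.com/ is a reachable placeholder target; the request should appear in the proxy’s history or log.

```python
# Send one request through the local intercepting proxy and confirm that it
# shows up in the proxy's history. Assumes the proxy listens on 127.0.0.1:8080.
import requests

proxies = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

response = requests.get("http://example.com/", proxies=proxies, timeout=10)
print(response.status_code, len(response.content))
# For HTTPS targets, install the proxy's CA certificate in the browser or
# client trust store so that the intercepted certificates are accepted.
```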


Web Application Spiders

Web application spiders work much like traditional web spiders. They request web pages, parse them for links to other pages, and then request those pages, continuing recursively until all of a site’s content has been discovered (a minimal sketch of this core crawl loop appears after the following list). To accommodate the differences between functional web applications and traditional websites, application spiders must go beyond this core function and address various other challenges:

  • Forms-based navigation, using drop-down lists, text input, and other methods

  • JavaScript-based navigation, such as dynamically generated menus

  • Multistage functions requiring actions to be performed in a defined sequence

  • Authentication and sessions

  • The use of parameter-based identifiers, rather than the URL, to specify different content and functionality

  • The appearance of tokens and other volatile parameters within the URL query string, leading to problems identifying unique content.
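As referenced above, the basic crawl loop itself is straightforward. The following is a minimal, illustrative same-host spider using the requests library and the standard library’s HTMLParser; the start URL is a placeholder, and none of the challenges listed above are handled.

```python
# Minimal same-host spider: fetch a page, extract <a href> links, and visit
# each discovered URL once. Forms, JavaScript navigation, sessions, and
# scope controls described in this section are deliberately omitted.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse, urldefrag

import requests


class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def spider(start_url, max_pages=50):
    host = urlparse(start_url).netloc
    seen, queue = set(), deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        print(response.status_code, url)
        parser = LinkExtractor()
        parser.feed(response.text)
        for link in parser.links:
            # Resolve relative links and drop fragments; stay on the same host.
            absolute, _ = urldefrag(urljoin(url, link))
            if urlparse(absolute).netloc == host:
                queue.append(absolute)
    return seen


if __name__ == "__main__":
    # Replace with a target you are authorized to test.
    spider("http://example.com/")
```

Real application spiders layer the features described next on top of this loop: scope rules, form submission, session handling, and so on.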


Several of these problems are addressed in integrated testing suites by sharing data between the intercepting proxy and spider components. This enables you to use the target application in the normal way, with all requests being processed by the proxy and passed to the spider for further analysis. Any unusual mechanisms for navigation, authentication, and session handling are thereby taken care of by your browser and your actions. This enables the spider to build a detailed picture of the application’s contents under your fine-grained control. Having assembled as much information as possible, the spider can then be launched to investigate further under its own steam, potentially discovering additional content and functionality.


The following features are commonly implemented within web application spiders:


  • Automatic update of the site map with URLs accessed via the intercepting proxy.

  • Passive spidering of content processed by the proxy, by parsing it for links and adding these to the site map without actually requesting them (see Figure 1).

  • Presentation of discovered content in table and tree form, with the facility to search these results.

  • Fine-grained control over the scope of automated spidering. This enables you to specify which hostnames, IP addresses, directory paths, file types, and other items the spider should request, so that you can focus on a particular area of functionality and prevent the spider from following inappropriate links either within or outside of the target application’s infrastructure. This feature is also essential to avoid spidering powerful functionality such as administrative interfaces, which may cause dangerous side effects such as the deletion of user accounts. It is also useful to prevent the spider from requesting the logout function, thereby invalidating its own session.

  • Automatic parsing of HTML forms, scripts, comments, and images, and analysis of these within the site map.

  • Parsing of JavaScript content for URLs and resource names. Even if a full JavaScript engine is not implemented, this function often enables a spider to discover the targets of JavaScript-based navigation, because these usually appear in literal form within the script.

  • Automatic and user-guided submission of forms with suitable parameters (see Figure 1.1).

  • Detection of customized File Not Found responses. Many applications respond with an HTTP 200 message when an invalid resource is requested. If spiders are unable to recognize this, the resulting content map will contain false positives (a simple baseline check for this behavior is sketched after this list).

  • Automatic retrieval of the root of all enumerated directories. This can be useful to check for directory listings or default content.

  • Automatic processing and use of cookies issued by the application to enable spidering to be performed in the context of an authenticated session.

  • Automatic testing of session dependence of individual pages. This involves requesting each page both with and without any cookies that have been received. If the same content is retrieved, the page does not require a session or authentication. This can be useful when probing for some kinds of access control flaws (a manual version of this check is also sketched after this list).

  • Automatic use of the correct Referer header when issuing requests. Some applications may check the contents of this header, and this function ensures that the spider behaves as much as possible like an ordinary browser.

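As a manual illustration of the customized File Not Found detection mentioned above, you can establish a baseline by requesting a resource that almost certainly does not exist and recording the status code and a fingerprint of the body. The sketch below assumes the requests library and a hypothetical target URL.

```python
# Fingerprint the application's "not found" behaviour by requesting a
# resource that should not exist. If the status is 200, the body hash
# becomes the baseline against which spidered pages can be compared.
import hashlib
import uuid

import requests

base_url = "http://example.com"  # hypothetical target
probe = f"{base_url}/{uuid.uuid4().hex}.html"

response = requests.get(probe, timeout=10)
fingerprint = hashlib.sha256(response.content).hexdigest()
print("status:", response.status_code)
print("body fingerprint:", fingerprint[:16])
# Any spidered URL whose response matches this status and fingerprint is
# likely a customized "File Not Found" page rather than real content.
```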
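The session-dependence check described above can likewise be reproduced by hand: fetch a page twice, with and without the session cookie, and compare the responses. The cookie name and value below are hypothetical placeholders, and the sketch again assumes the requests library.

```python
# Test whether a page requires the session: fetch it with and without the
# session cookie and compare the responses. Cookie name, value, and URL are
# hypothetical placeholders.
import requests

url = "http://example.com/account/profile"
session_cookies = {"JSESSIONID": "replace-with-a-valid-session-token"}

with_session = requests.get(url, cookies=session_cookies, timeout=10)
without_session = requests.get(url, timeout=10)

if (with_session.status_code == without_session.status_code
        and with_session.text == without_session.text):
    print("Page does not appear to require a session; check access controls.")
else:
    print("Responses differ; the page appears to be session-dependent.")
```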


Figure 1: The results of passive application spidering, where items in gray have been identified passively but not yet requested.


Figure 1.1: Burp Spider prompting for user guidance when submitting forms.


In our next blog, we will cover further Web Application tools.
