Readers, we will now focus on the need to secure web applications, since they can allow an attacker to compromise a web server or network over the legitimate port of entry. As more businesses host web-based applications as a natural extension of themselves, the damage that can result from a compromise assumes significant proportions.
After completing this section, you will be familiar with the following aspects:
- Understanding Web Application Security
- Common Web Application Security Vulnerabilities
- Web Application Penetration Methodologies
- Input Manipulation
- Authentication And Session Management
- Tools: Lynx, Teleport Pro, Black Widow, Web Sleuth
- Countermeasures
Understanding Web Application Security
Web application security differs from the general discussion on security. In the general context, an IDS and/or a firewall lends some degree of security. In the case of web applications, however, the session takes place through an allowed port - typically the default web server port 80. This is equivalent to establishing a connection without a firewall. Even if encryption is implemented, it only encrypts the transport: in the event of an attack, the attacker's session will simply be encrypted as well. Encryption does not thwart the attack.
Attacking web applications is one of the most common ways attackers compromise hosts, networks and users. Defending against these attacks is challenging, as there is often little scope for logging the actions performed. This is particularly true for today's business applications, where a significant percentage of applications are custom-made or sourced from third-party software components.
- Reliability of Client-Side Data
- Special Characters that have not been escaped
- HTML Output Character Filtering
- Root accessibility of web applications
- ActiveX/JavaScript Authentication
- Lack of User Authentication before performing critical tasks.
It has been noted that, more often than not, web application vulnerabilities can be eliminated to a great extent by the way the applications are designed. Apart from this, common security procedures are often overlooked in the way the application functions.
Reliability of Client-Side Data: It is recommended that the web application rely on server-side data for critical operations rather than on client-side data, especially for input purposes.
Special Characters that have not been escaped: This aspect is often overlooked, and special characters that attackers can use to modify instructions are found unescaped in web application code. For example, UTF-7 provides an alternative encoding for "<" and ">", and several popular browsers recognize these as the start and end of a tag.
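As a quick illustration (Python is used here only as a convenient way to exercise the encoding), a filter that looks for the literal characters "<" and ">" will not see them in their UTF-7 form, yet a UTF-7-aware browser will decode them back into markup:

    # "+ADw-" and "+AD4-" are the UTF-7 encodings of "<" and ">".
    payload = b"+ADw-script+AD4-alert(1)+ADw-/script+AD4-"
    print(payload.decode("utf-7"))   # <script>alert(1)</script>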
HTML Output Character Filtering: Output filtering helps a developer build an application that is not susceptible to cross-site scripting attacks. When information is displayed back to users, it should be escaped so that any embedded HTML is rendered inactive and cannot trigger a cross-site scripting attack.
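A minimal sketch of such output escaping, using Python's standard library purely for illustration:

    import html

    user_input = '<script>alert("XSS")</script>'
    # Escaping turns the markup characters into HTML entities, so the browser
    # displays the text instead of executing it.
    print(html.escape(user_input))
    # &lt;script&gt;alert(&quot;XSS&quot;)&lt;/script&gt;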
Root accessibility of web applications: Ideally, web applications should not expose the root directory of the web server. Sometimes it is possible for a user to access the root directory if he can manipulate the input or the URL.
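A rough server-side sketch of the kind of check that prevents this (the document root and function name here are hypothetical): file requests are resolved against the web root, and anything that escapes it is refused.

    import os

    WEB_ROOT = "/var/www/app/public"   # hypothetical document root

    def resolve(requested_path):
        # Normalise the path and refuse anything that escapes the web root,
        # such as "../../etc/passwd".
        full = os.path.realpath(os.path.join(WEB_ROOT, requested_path))
        if not full.startswith(WEB_ROOT + os.sep):
            raise ValueError("request escapes the web root")
        return full

    print(resolve("images/logo.png"))    # allowed
    try:
        resolve("../../etc/passwd")      # typical traversal attempt
    except ValueError as exc:
        print("blocked:", exc)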
ActiveX/JavaScript Authentication: Client-side scripting languages are vulnerable to attacks such as cross-site scripting, so authentication logic placed in them cannot be trusted.
Lack of User Authentication before performing critical tasks: An obvious security lapse, where access to a restricted area is granted without proper authentication, through reuse of a cached authentication, or because of poor logout procedures. Such applications can be vulnerable to cookie-based attacks.
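A minimal, framework-agnostic sketch of the server-side check implied here (the session structure and time limit are assumptions): a critical action verifies both that the session is authenticated and that the authentication is recent, so a stale or replayed cookie cannot reach the restricted area.

    import time

    def can_perform_critical_task(session, max_age_seconds=300):
        # The session is server-side state looked up from the session cookie.
        if not session.get("authenticated"):
            return False
        # Require recent authentication for critical tasks.
        return (time.time() - session.get("auth_time", 0)) <= max_age_seconds

    stale_session = {"authenticated": True, "auth_time": time.time() - 900}
    print(can_perform_critical_task(stale_session))   # False: re-authenticate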
Web Application Penetration Methodologies
- Information Gathering and Discovery
- Documenting Application / Site Map
- Identifiable Characteristics / Fingerprinting
- Signature Error and Response Codes
- File / Application Enumeration
- Forced Browsing
- Hidden Files
- Vulnerable CGIs
- Sample Files
- Input/Output Client-Side Data Manipulation
Penetrating web servers is no different from attacking other systems when it comes to the basic methodology. Here, too, we begin with information gathering and discovery. This can be anything from searching for particular file types or banners on search engines like Google. For example, searching for "index/" may bring up unsuspecting directories on interesting sites where one may find information that can be used for penetrating the web server.
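The fingerprinting and signature error/response code items in the list above need nothing more than plain HTTP requests. The sketch below (the target URL is hypothetical) reads the Server banner and then the status code and body returned for a non-existent resource, both of which often identify the server or framework in use:

    import urllib.error
    import urllib.request

    target = "http://www.example.com/"            # hypothetical target

    # Banner grabbing: the Server header often names the web server.
    with urllib.request.urlopen(target) as resp:
        print(resp.headers.get("Server"))

    # Signature error and response codes: forced browsing to a missing page.
    try:
        urllib.request.urlopen(target + "no-such-page")
    except urllib.error.HTTPError as err:
        print(err.code)                           # e.g. 404
        print(err.read()[:200])                   # error page signature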
Instant Source lets you take a look at a web page's source code, to see how things are done. Also, you can edit HTML directly inside Internet Explorer!
Instant Source is an application that lets the user view the underlying source code as he browses a web page. The traditional way of doing this has been the View Source command in the browser. However, the process was tedious, as the viewer had to read through the entire text file when searching for a particular block of code. Instant Source allows the user to view the code for the selected elements instantly, without having to open the entire source.
The program integrates into Internet Explorer and opens a new toolbar window, instantly displaying the source code of the page or selection in the browser window. Instant Source can show all Flash movies, script files (*.JS, *.VBS), style sheets (*.CSS) and images on a page. All external files can be demarcated and stored separately in a folder. The tool also includes HTML, JavaScript and VBScript syntax highlighting and support for viewing external CSS and script files directly in the browser. This is not available with the browser's View Source command.
With dynamic HTML, the source code changes after the basic HTML page loads - that is, after the HTML that was delivered by the server without any further processing. Instant Source integrates into Internet Explorer and shows these changes as they happen, thereby eliminating the need for an external viewer.
Lynx is a text-based browser used for downloading source files and directory links.
Lynx is a text browser client for users running cursor-addressable, character-cell display devices. It can display HTML documents containing links to files on the local system, as well as files on remote systems running http, gopher, ftp, wais, nntp, finger, or cso/ph/qi servers, and services accessible via logins to telnet, tn3270 or rlogin accounts. Current versions of Lynx run on UNIX, VMS, Windows 3.x/9x/NT, 386DOS and OS/2 EMX.
Lynx can be used to access information on the Internet, or to build information systems intended primarily for local access. The current developmental Lynx has two PC ports, for Win32 (95 and NT) and DOS 386+. There is an SSL-enabled version of Lynx for Win32 by the name of lynxw32.lzh.
The default download option is Save to disk; this is disabled if Lynx is running in anonymous mode. Any number of download methods, such as kermit and zmodem, may be defined in addition to this default in the lynx.cfg file.
- Wget is a command line tool for Windows and Unix that will download the contents of a web site.
- It works non-interactively, so it can work in the background after the user has logged off.
- Wget works particularly well with slow or unstable connections by continuing to retrieve a document until the document is fully downloaded.
- Both HTTP and FTP retrievals can be time-stamped, so Wget can see if the remote file has changed since the last retrieval and automatically retrieve the new version if it has.
GNU Wget is a freely available network utility for retrieving files from the Internet using HTTP and FTP. It works non-interactively, allowing it to run in the background after the user has logged off. Recursive retrieval of HTML pages, as well as of FTP sites, is supported. It can be used to make mirrors of archives and home pages, or to traverse the web like a WWW robot.
Wget works well on slow or unstable connections, retrying until the document is fully retrieved; re-getting files from where a transfer left off works on servers (both HTTP and FTP) that support it. Wildcard matching and recursive mirroring of directories are available when retrieving via FTP. Both HTTP and FTP retrievals can be time-stamped, so Wget can see whether the remote file has changed since the last retrieval and automatically retrieve the new version if it has.
By default, Wget supports proxy servers, which can lighten the network load, speed up retrieval and provide access behind firewalls. However, if the user is behind a firewall that requires a SOCKS-style gateway, he can get the SOCKS library and compile Wget with SOCKS support.
Another tool that can be found in an attacker's arsenal is Black Widow. This tool can be used for various purposes because it functions as a web site scanner, a site mapping tool, a site ripper, a site mirroring tool, and an offline browser program. Note its use as a site mirroring tool. An attacker can use it to mirror the target site on his hard drive and parse it for security flaws in the offline mode.
The attacker can also use this tool in the information gathering and discovery phase by scanning the site and creating a complete profile of the site's structure, files, e-mail addresses, external links and even error messages. This will help him launch a targeted attack that has a better chance of succeeding and leaving a smaller footprint.
The attacker can also look for specific file types and download any selection of files, from 'JPG' to 'CGI' to 'HTML' to MIME types. There is no file size restriction, and the user can download small to large files that are part of a site or of a group of sites.
Websleuth is a tool that combines web crawling with the capabilities of a personal proxy. The current version of Sleuth supports functionality to convert hidden and select form elements to text boxes, parse and analyze forms efficiently, edit the rendered source of web pages, and edit cookies in their raw state.
It can also make raw HTTP requests to servers while impersonating the referrer, cookie and other headers; block JavaScript pop-ups automatically; highlight and parse the full HTML source code; and analyze CGI links, apart from logging all surfing activity and the HTTP headers of requests and responses.
Sleuth can generate reports on the elements of a web page and facilitates enhanced IE proxy management as well as security settings management. Sleuth can also monitor cookies in real time, and its JavaScript console aids in interacting directly with a page's scripts and in removing all scripts from a web page.
- Hidden fields are embedded within HTML forms to maintain values that will be sent back to the server.
- Hidden fields serve as a means for the web application to pass information between different applications.
- Using this method, an application may pass the data without saving it to a common backend system (typically a database).
- A major assumption about hidden fields is that, since they are not visible (i.e. hidden), they will not be viewed or changed by the client.
- Web attacks challenge this assumption by examining the HTML code of the page and changing the request (usually a POST request) going to the server.
- By changing the value, the attacker breaks the logic tying the different application parts together, and the application is manipulated into working with the new value.
Hidden field tampering:
Most of us who have dabbled with some HTML coding have come across the hidden field. An order form, for example, might carry the price in a tag along the lines of <input type="hidden" name="price" value="499.00">.
Most web applications rely on HTML forms to receive input from the user. However, users can choose to save the form to a file, edit it and then use the edited form to submit data back to the server. Herein lies the vulnerability, as this is a "stateless" interaction with the web application. HTTP transactions are connectionless, one-time transmissions.
The conventional way of checking for continuity of the session is to check the state of the user from information stored at the user's end (another pointer to the fallacy of trusting client-side data). This state can be stored in the browser in three ways: cookies, encoded URLs and HTML form "hidden" fields.
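A minimal sketch of the tampering itself, assuming the hypothetical order form above is served from http://shop.example.com/order: the attacker does not even need to edit a saved copy of the form, and can simply post a modified value for the hidden field.

    import urllib.parse
    import urllib.request

    # The request carries the same fields the form would submit; only the
    # hidden "price" value has been altered (field names are hypothetical).
    data = urllib.parse.urlencode({
        "item_id": "1005",     # visible form field
        "price": "1.00",       # hidden field, tampered from 499.00
    }).encode()

    response = urllib.request.urlopen("http://shop.example.com/order", data=data)
    print(response.status)

If the server simply reads the price back from the request, the order goes through at the attacker's price.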
Countermeasure
The first rule in web application development, from a security standpoint, is not to rely on client-side data for critical processes. Using encrypted sessions such as SSL, or "secure" cookies, is advocated instead of using hidden fields. Digital signature algorithms may be used, where the values of critical parameters are hashed and signed so that the authenticity of the data can be ascertained. The safest bet is to rely on server-side authentication mechanisms for high-security applications.
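As a sketch of that hashing idea (keyed hashing with HMAC is used here as one possible realization; the key and field values are assumptions), the server signs the value before emitting it in the form and verifies the signature when the form comes back, so a tampered price no longer validates:

    import hashlib
    import hmac

    SECRET_KEY = b"server-side secret, never sent to the client"   # hypothetical

    def sign(value):
        return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

    def verify(value, signature):
        return hmac.compare_digest(sign(value), signature)

    # When rendering the form, emit both the value and its signature,
    # e.g. price=499.00 and price_sig=sign("499.00").
    price, price_sig = "499.00", sign("499.00")

    print(verify(price, price_sig))     # True  - untouched form
    print(verify("1.00", price_sig))    # False - the hidden field was tampered with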
---Regards,
Amarjit Singh