Week 2 - Enumerating An Application

Enumerating a Web Application exercises
Ethical Hacking lab exercise.

NOTE: -
You should make notes about the tools and evaluate which of these
tools you will likely be using for your coursework.

Note that the information contained in this document is for educational purposes.

Contents
1 Introduction to Enumeration
1.1 Netcraft
1.2 Punkspider
2 Enumeration exercises
2.1 Fingerprinting the web server
2.2 Examining robots.txt
2.3 WGET tool
2.4 CURL tool
2.5 Enumerate the applications on the server
2.6 Nmap Scripts
2.7 Introduction to NIKTO
2.8 Footprinting a web application
2.8.1 Whatweb
2.8.2 Blind Elephant
2.9 Spidering a web application
2.9.1 Spidering using OWASP ZAP
2.10 Brute-forcing hidden folders and files
2.10.1 dirb
2.10.2 dirbuster
Appendices
Appendix A - Command line kung fu with netcat
Appendix B - Configuring OWASP ZAP Web proxy
Appendix C - Wackopicko URLs from OWASP ZAP

1 INTRODUCTION TO ENUMERATION
The first phase of a security assessment focuses on collecting as much information as possible about a
target application. Information gathering is a necessary step of a penetration test. This task can be
carried out in many different ways.

By using public tools (search engines), scanners, sending simple HTTP requests, or specially crafted
requests, it is possible to force the application to leak information, e.g., disclosing error messages or
revealing the versions and technologies used.
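For example, requesting a page that does not exist will often return an error page whose headers and body disclose the server software. A minimal sketch using curl (the path here is just an example, and the exact output depends on the server configuration): -

curl -i http://192.168.1.100/thispagedoesnotexist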

In addition to the exercises here, footprinting techniques can be used to enumerate an application, for
example: -

• Search engine discovery and Google hacking.
• Shodan
• Archive.org
• Maltego

1.1 NETCRAFT
From the Netcraft web site, find out the web server software and version that www.abertay.ac.uk is
running.

• Browse to https://www.netcraft.com/

1.2 PUNKSPIDER
Punkspider is a web application vulnerability search engine. It searches for Blind SQL injection, SQL
injection, Cross-Site Scripting, Path Traversal, Mail Injection, OS Command Injection (yes, format c:) and
XPath Injection.

• Investigate the use of the site https://www.punkspider.org/

DO NOT CLICK ON ANY OF THE VULNERABLE WEBSITES.

The most recent vulnerable websites are only available by logging in. Note that any e-mail address can
be specified (since it is not verified). You must choose a complex password though, e.g. "Hacklab123".

2 ENUMERATION EXERCISES
2.1 FINGERPRINTING THE WEB SERVER
Web server fingerprinting uses tests that identify the web server from implementation differences in
the HTTP protocol. It involves sending requests and monitoring the responses.
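Before reaching for the dedicated tools, the idea can be illustrated manually: different server products answer unusual or malformed requests with different status codes, header orderings and error pages, and fingerprinting tools simply automate this comparison. A rough sketch (the exact response depends on the server build; press Ctrl-C if netcat does not exit): -

echo -e "GET / JUNK/1.0\r\n\r\n" | nc 192.168.1.100 80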

There are two good tools that can be used for this, namely httprint and httprecon (copy these from
Hacklab1\Student tools).

• Use these tools to footprint the Web app Virtual Machine, i.e. 192.168.1.100

References:-

httprint - http://net-square.com/httprint.html
httprecon - http://www.computec.ch/projekte/httprecon/

2.2 EXAMINING ROBOTS.TXT


Web spiders (such as Google) retrieve a web page and then recursively traverse hyperlinks to retrieve
further web content. Their accepted behavior is specified by the “Robots Exclusion Protocol” of the
robots.txt file in the web root directory. The file can be accessed by simply specifying it in the URL. E.g.
www.xxxxxx.com/robots.txt. An example is below: -
User-agent: *
Disallow: /search
Disallow: /groups
Disallow: /images

The user-agent entry can be used to target specific spiders but is normally * (i.e. all spiders). The
disallow directive informs spiders not to examine specific folders. This means that a hacker/tester
would be interested in these folders.
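A quick way to pull out just the disallowed entries is to fetch the file and filter it (a sketch; wget and grep are both available in Kali): -

wget -qO- http://192.168.1.100/robots.txt | grep -i '^Disallow'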

• Pick any web site that you are familiar with and examine the robots.txt file.

• Try the web app virtual machine.

http://192.168.1.100/robots.txt
http://192.168.1.100/dvwa/robots.txt

2.3 WGET TOOL


Wget is a useful command-line tool for retrieving files over HTTP, HTTPS and FTP. It has a lot of
functionality that can be used by web testers. It is available on different OS platforms.

• Browse to the WGET manual page http://www.gnu.org/software/wget/manual/wget.html

This exercise will show the simplest case of downloading the DVWA robots.txt file.

• Restore the virtual machine Badstore (revert to snapshot Booted).

Note: You may have to issue the command in the Badstore virtual machine: -

ifconfig eth0 192.168.1.101

From a terminal in Kali Linux,

wget http://192.168.1.100/robots.txt

Then examine the file downloaded.

more robots.txt
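Wget can also mirror part of a site for offline inspection. A hedged sketch (the depth and output folder here are arbitrary examples): -

wget -r -l 1 -np -P /root/mirror http://192.168.1.100/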

2.4 CURL TOOL


Curl is a command-line tool for doing all sorts of URL manipulations and transfers. It can be very useful
when scripting.

curl -O http://192.168.1.100/robots.txt
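Curl can also fetch only the response headers, or report just the status code, which is handy when scripting checks across many URLs. A sketch (see the scripting guide referenced below): -

curl -I http://192.168.1.100/
curl -s -o /dev/null -w '%{http_code}\n' http://192.168.1.100/robots.txt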

Reference: - https://curl.haxx.se/docs/httpscripting.html

2.5 ENUMERATE THE APPLICATIONS ON THE SERVER
Nmap is very useful against web applications and has many switches and scripts that can be used to
gather information about a target.

While web applications usually listen on port 80 (http) and 443 (https), there is nothing magic about
these port numbers. In fact, web applications may be associated with arbitrary TCP ports, and can be
referenced by specifying the port number as follows: http[s]://www.example.com:port/. For example,
http://www.example.com:20000/. It is easy to check for the existence of web applications on non-
standard ports. A port scanner such as nmap is capable of performing service recognition by means of
the -sV option, and will identify http[s] services on arbitrary ports.

The normal port scan would be for all TCP ports (i.e. -p 0-65535) but for speed, we will use 1-10000.

• Under Kali Linux, run the following command from a terminal: -

nmap -p 1-10000 -sT 192.168.1.100

This should show all the open ports. We should see that port 80 is open (the normal HTTP port) and also
443 (the normal HTTPS port). If we add version checking then we can better determine what is actually running.

nmap -sV -p 1-10000 -sT 192.168.1.100

Examine the screen output: there looks to be a web server running on port 8081.

• Prove this by browsing to http://192.168.1.100:8081. Note: in the real situation, you would
search for vulnerabilities on these versions.
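It is also worth saving the scan in greppable form so that the web-related ports can be pulled out later. A sketch (the output file name is arbitrary): -

nmap -sV -p 1-10000 -oG webscan.gnmap 192.168.1.100
grep -i http webscan.gnmap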

2.6 NMAP SCRIPTS
The Nmap Scripting Engine (NSE) is one of Nmap's most powerful and flexible features. It allows users to
write (and share) simple scripts to automate a wide variety of networking tasks. There are many that are
useful in Web application testing.

A list of the scripts is available at

http://nmap.org/nsedoc/index.html

• Select Scripts.

• Scroll down to see a list of the scripts starting with HTTP.

• Under Kali, browse to the scripts that are installed (/usr/share/nmap/scripts).

• Open the scripts with a text viewer to get an idea of what they achieve.

Run the following scripts against the bee-box Apache server.

nmap 192.168.1.100 -p 80,443 --script=http-headers
nmap 192.168.1.100 -p 80,443 --script=http-methods
nmap 192.168.1.100 -p 80,443 --script=http-apache-negotiation
nmap 192.168.1.100 -p 80,443 --script=http-comments-displayer
nmap 192.168.1.100 -p 80,443 --script=http-date
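Several scripts can also be combined in a single run by giving --script a comma-separated list, or a quoted wildcard such as "http-*" (which runs every HTTP script and can be slow). A sketch: -

nmap 192.168.1.100 -p 80,443 --script=http-headers,http-methods,http-title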

2.7 INTRODUCTION TO NIKTO
Nikto is an Open Source web server scanner which can perform comprehensive tests against web
servers. It can also be used for simple scans such as footprinting. The Nikto homepage is
http://www.cirt.net/nikto2

• To display the version details of Nikto and its plugins, run the following: -

nikto -V

You should be able to see something similar to the following: -

nikto_favicon.plugin 2.08 2011-08-08
nikto_headers.plugin 2.09 2011-08-08
nikto_httpoptions.plugin 2.09 2011-02-19

We can run any of the plug-ins individually by specifying the name between “nikto_” and “.plugin”. In
this case we will use “headers”.

nikto -Plugins headers -h http://192.168.1.100
nikto -Plugins headers -h http://192.168.1.100:8080
nikto -Plugins headers -h http://192.168.1.100:9080

• To run all Nikto tests,

nikto -h http://192.168.1.100
nikto -h http://192.168.1.100:8080
nikto -h http://192.168.1.100:9080

• Examine the information retrieved (e.g. look at some of the discovered folders).
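For the coursework it can be useful to keep the Nikto results in a report file. A hedged sketch (the -o and -Format options are described in the Nikto documentation; the file name is just an example): -

nikto -h http://192.168.1.100 -o nikto_100.html -Format htm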

2.8 FOOTPRINTING A WEB APPLICATION.

2.8.1 Whatweb
There are several different vendors and versions of web servers on the market today. Knowing the type
of web server that you are testing significantly helps in the testing process, and will also change the
course of the test.

This information can be derived by sending the web server specific commands and analysing the output,
as each version of web server software may respond differently to these commands. By knowing how
each type of web server responds to specific commands and keeping this information in a web server
fingerprint database, a penetration tester can send these commands to the web server, analyse the
response, and compare it to the database of known signatures. Note that it usually takes several
different commands to accurately identify the web server, as different versions may react similarly to
the same command. Rarely, however, different versions react the same to all HTTP commands. So, by
sending several different commands, you increase the accuracy of your guess.

WhatWeb is written in Ruby. Matches for fingerprinting are made with text strings (case sensitive), regular
expressions, Google Hack Database queries (a limited set of keywords), MD5 hashes, URL recognition, HTML
tag patterns, and custom Ruby code for passive and aggressive operations.

From a terminal, type

whatweb

• Use whatweb to footprint 192.168.1.100 (ports 80, 8080 and 9080).

whatweb 192.168.1.100
whatweb 192.168.1.100:8081
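More detail can be requested with whatweb's verbose and aggression options. A sketch (aggression level 3 sends extra requests, so only use it against the lab virtual machines): -

whatweb -v -a 3 192.168.1.100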

2.8.2 Blind Elephant


BlindElephant is an open-source web application fingerprinting utility.
https://community.qualys.com/community/blindelephant
• Browse to the OWASP Broken Web Applications virtual machine and select the icon below: -

• Look down the page and you should see a section “Old (Vulnerable) Versions of Real
Applications”

BlindElephant allows us to footprint the exact version of CMS applications such as these. For example,
to examine the Joomla install, browse to http://192.168.1.100/joomla/ (the version should be shown
on the page).

• In Kali Linux, from a terminal, footprint the application; hopefully the version will be
detected.

BlindElephant.py http://192.168.1.100/joomla joomla
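As a manual cross-check, many Joomla releases also expose their version in language metadata files. A hedged sketch (the exact path varies between Joomla versions, so treat this as an assumption to verify): -

wget -qO- http://192.168.1.100/joomla/language/en-GB/en-GB.xml | grep -i version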

2.9 SPIDERING A WEB APPLICATION.


A spider is a tool used to automatically discover the resources/pages on a website. A spider
starts with a base URL to visit and will then map out the entire application.
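OWASP ZAP is used in this exercise, but for comparison a very rough spider can be improvised with wget (a sketch; the depth of 2 is arbitrary and the output parsing is crude): -

wget --spider -r -l 2 -np http://192.168.1.100/WackoPicko/ 2>&1 | grep '^--' | awk '{print $3}' | sort -u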

2.9.1 Spidering using OWASP ZAP


We will use OWASP ZAP to spider the WackoPicko website.

• In Kali Linux, run the OWASP ZAP proxy and configure the Dolphin web browser as previously
(see Appendix B for a reminder of instructions).

Note: running Kali full screen can be helpful, and Alt-Tab can be used to toggle between OWASP
ZAP and Dolphin.

• In Dolphin, browse to the Wackopicko site (http://192.168.1.100/WackoPicko/index.php).

• Log in using the username bryce and the password bryce.

You should see that the WackoPicko folder appears in OWASP ZAP.

• In OWASP ZAP, right-click on WackoPicko and select Spider.

• We only wish to spider the Subtree so ensure that this is selected.

• We can now export all the URLs to a file (e.g. /root/Desktop/links.txt).

Note: The URLs found are contained in Appendix C.

2.10 BRUTE-FORCING HIDDEN FOLDERS AND FILES.


Many web applications have content that is not found by a spider. Brute-force tools will try
to guess the existence of files and folders.

2.10.1 dirb
We can specify a dictionary, or dirb can use its default dictionary.

dirb http://192.168.1.100/WackoPicko
dirb http://192.168.1.100/WackoPicko /usr/share/dirb/wordlists/common.txt

In Kali Linux, there are dictionaries in the folders /usr/share/dirb/wordlists and also
/usr/share/dirbuster/wordlists
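Because most WackoPicko pages are PHP, it can also be worth telling dirb to append an extension to each word. A sketch (the -X option is described in the dirb help output): -

dirb http://192.168.1.100/WackoPicko /usr/share/dirb/wordlists/common.txt -X .php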

2.10.2 dirbuster
DirBuster is a multi-threaded Java application designed to brute force directory and file names on
web/application servers. From a command prompt, run: -

dirbuster

This application is easy to run! The dictionaries are held in /usr/share/dirbuster/wordlists

APPENDICES
APPENDIX A - COMMAND LINE KUNG FU WITH NETCAT
Netcat allows us to make raw connections, send simple commands and view the web application's
headers. The following is a reference list of all the possible HTTP headers: -

http://en.wikipedia.org/wiki/List_of_HTTP_header_fields

GET method
The HTTP GET command is by far the most commonly used and will simply retrieve a web page. From Kali
Linux, ensure that the web application is working.

• Browse to 192.168.1.100 and choose the bWAPP link.

We can now use netcat to connect to the application.

• Open a terminal window.

• Enter the following (note that you will not get a response until the end): -
nc 192.168.1.100 80
GET /bWAPP/login.php HTTP/1.1
host: 192.168.1.100

• Then you must press enter twice.

• Scroll up to examine the header.

By creating a simple connection, we have discovered the Web Server version and other information
about the server setup.

Note that this information is being leaked by the “Server:” line, and the “X-Powered-By” header also
gives useful information.

HEAD method
The HEAD command only retrieves the header. This can be useful when we are
automating information gathering against multiple targets.

• Enter the following into a terminal window under Kali.

nc 192.168.1.100 80
HEAD /bWAPP/login.php HTTP/1.1
host: 192.168.1.100

• Then you must press enter twice.

Note that the HEAD command can be disabled in Web servers.

OPTIONS method
The OPTIONS command can be used to find which HTTP options are supported by the web server.

• Enter the following into a terminal window under Kali.

nc 192.168.1.100 80
OPTIONS / HTTP/1.1
host: 192.168.1.100

This shows that the TRACE command is enabled. This is considered a dangerous command as it may be
exploited via cross-site tracing (XST), an attack that builds on cross-site scripting. Note that both the
HEAD and the OPTIONS commands can be disabled in Web Servers.

WebDAV options
WebDAV is a service that allows files to be shared over the Internet. When you connect to a WebDAV
file server, you can open, edit and delete files as if they were on your device. Files stored on a WebDAV
file server can be accessed (by default) from a wide variety of devices, such as Windows PCs, Macs and
iPads. WebDAV is convenient for developers as it allows them to remotely edit and manage files on web
servers. If not configured correctly, WebDAV can leave a server vulnerable.

Enter the following into a terminal window under Kali.

nc 192.168.1.100 80
OPTIONS /webdav/ HTTP/1.1
host: 192.168.1.100

https://blog.skullsecurity.org/2009/webdav-detection-vulnerability-checking-and-exploitation

http://skidspot.blogspot.co.uk/2010/05/hacking-iis-via-webdav.html
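Nmap also ships an NSE script for WebDAV discovery, which automates the same kind of check. A hedged sketch (script availability depends on the installed Nmap version): -

nmap -p 80 --script=http-webdav-scan 192.168.1.100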

POST method
To re-create a submission of the login form at http://192.168.1.100/bWAPP/login.php, we can use the
following: -

nc 192.168.1.100 80
POST /bWAPP/login.php HTTP/1.0
Content-Type: application/x-www-form-urlencoded
Content-Length: 35

login=bee&password=bug&form=submit
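The same request is much simpler to build with curl, which sets the Content-Type and Content-Length headers automatically. A sketch: -

curl -i -d 'login=bee&password=bug&form=submit' http://192.168.1.100/bWAPP/login.php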

APPENDIX B - CONFIGURING OWASP ZAP WEB PROXY
OWASP ZAP is a free and open source MITM proxy. It allows viewing of HTTP and HTTPS traffic and
allows information to be intercepted and altered. Most proxies are very similar to set up.

• Run OWASP ZAP from the Web Application Analysis menu in Kali (it can be slow to start
the first time).

• Select the first option for persistence.

• OWASP ZAP listens on port 8080 by default. We must configure our browser to send the
traffic to this port.

• Run Dolphin web browser (this can be run from the first icon in the left-hand menu).
Now we can tell the browser to send the traffic to localhost, port 8080.

• You may have to enable the Menu Bar (right-click on the bar at the top).

• Then select Edit and Preferences.

• Then Advanced, Network.

• Enter the proxy details and also make sure that the No proxy field is blank.

You should be able to browse to a website on a virtual machine. The information should be captured in
OWASP ZAP.

• All requests and responses should also be captured.

APPENDIX C - WACKOPICKO URLS FROM OWASP ZAP
http://192.168.1.100/WackoPicko/
http://192.168.1.100/WackoPicko/action.swf?directory=%2FWackoPicko%2F
http://192.168.1.100/WackoPicko/admin
http://192.168.1.100/WackoPicko/admin/index.php?page=login
http://192.168.1.100/WackoPicko/calendar.php
http://192.168.1.100/WackoPicko/calendar.php?date=1504259908
http://192.168.1.100/WackoPicko/css/
http://192.168.1.100/WackoPicko/css/?C=S;O=D
http://192.168.1.100/WackoPicko/css/blueprint/
http://192.168.1.100/WackoPicko/css/blueprint/?C=M;O=D
http://192.168.1.100/WackoPicko/css/blueprint/ie.css
http://192.168.1.100/WackoPicko/css/blueprint/plugins/
http://192.168.1.100/WackoPicko/css/blueprint/plugins/?C=S;O=D
http://192.168.1.100/WackoPicko/css/blueprint/plugins/fancy-type/
http://192.168.1.100/WackoPicko/css/blueprint/plugins/fancy-type/?C=D;O=D
http://192.168.1.100/WackoPicko/css/blueprint/plugins/fancy-type/readme.txt
http://192.168.1.100/WackoPicko/css/blueprint/plugins/fancy-type/screen.css
http://192.168.1.100/WackoPicko/css/blueprint/print.css
http://192.168.1.100/WackoPicko/css/blueprint/screen.css
http://192.168.1.100/WackoPicko/css/blueprint/src/
http://192.168.1.100/WackoPicko/css/blueprint/src/?C=S;O=D
http://192.168.1.100/WackoPicko/css/blueprint/src/forms.css
http://192.168.1.100/WackoPicko/css/blueprint/src/grid.css
http://192.168.1.100/WackoPicko/css/blueprint/src/ie.css
http://192.168.1.100/WackoPicko/css/blueprint/src/print.css
http://192.168.1.100/WackoPicko/css/blueprint/src/reset.css
http://192.168.1.100/WackoPicko/css/blueprint/src/typography.css
http://192.168.1.100/WackoPicko/css/stylings.css
http://192.168.1.100/WackoPicko/css/stylings.php
http://192.168.1.100/WackoPicko/error.php?msg=Error,%20need%20to%20provide%20a%20query%20to%20search
http://192.168.1.100/WackoPicko/guestbook.php
http://192.168.1.100/WackoPicko/index.php
http://192.168.1.100/WackoPicko/passcheck.php
http://192.168.1.100/WackoPicko/pic'%20+%20'check'%20+%20'.php
http://192.168.1.100/WackoPicko/pictures/
http://192.168.1.100/WackoPicko/pictures/?C=D;O=D
http://192.168.1.100/WackoPicko/pictures/conflict.php
http://192.168.1.100/WackoPicko/pictures/conflictview.php
http://192.168.1.100/WackoPicko/pictures/high_quality.php
http://192.168.1.100/WackoPicko/pictures/purchased.php
http://192.168.1.100/WackoPicko/pictures/recent.php
http://192.168.1.100/WackoPicko/pictures/search.php
http://192.168.1.100/WackoPicko/pictures/search.php?query=ZAP
http://192.168.1.100/WackoPicko/pictures/upload.php
http://192.168.1.100/WackoPicko/pictures/view.php
http://192.168.1.100/WackoPicko/pictures/view.php?picid=15

http://192.168.1.100/WackoPicko/tos.php
http://192.168.1.100/WackoPicko/users/
http://192.168.1.100/WackoPicko/users/?C=S;O=D
http://192.168.1.100/WackoPicko/users/check_pass.php
http://192.168.1.100/WackoPicko/users/home.php
http://192.168.1.100/WackoPicko/users/login.php
http://192.168.1.100/WackoPicko/users/logout.php
http://192.168.1.100/WackoPicko/users/register.php
http://192.168.1.100/WackoPicko/users/sample.php
http://192.168.1.100/WackoPicko/users/sample.php?userid=1
http://192.168.1.100/WackoPicko/users/similar.php
http://192.168.1.100/WackoPicko/users/view.php
http://ocsp.digicert.com/

