In this post I highlight the sources and tools I use to perform passive footprinting as part of the Reconnaissance phase of an ethical hacking exercise.

Passive footprinting involves the use of tools and resources that can help you obtain more information about your target without ever ‘touching’ the target’s environment. The key objective here is obtaining information while remaining stealthy.

Links and Commands

Here is a summary of the links and commands used in the detailed descriptions below, in the unlikely event you do not want to read the rest of this post:



Google search for data on the target’s website

<search string> site:<website domain name>

Google search to see where target appears in a URL

inurl:<target name>

Google search for information about the target on social media

site:twitter.com <target name>, site:facebook.com <target name>, etc.

Google search for information about the target on Job Sites

site:<job site domain> <target name>, etc.




DNS Tools

dnsrecon -d <target domain> -w

dnscan -d <target domain> -w <subdomain text file> -v

dmitry -winse <target domain>

theharvester -d <target domain> -l 500 -b google

python Belati.py -d <target domain>



The best place to start is with a WHOIS query to obtain information about the registered owners and assignees of your target’s domain. There are many WHOIS resources on the Internet; the one I prefer to use is the whois lookup hosted by ICANN.
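Under the hood, WHOIS is a very simple protocol (RFC 3912): you open a TCP connection to port 43, send the query followed by CRLF, and read until the server closes the connection. Here is a minimal sketch; the starting server whois.iana.org is an assumption of mine, chosen because it will refer you on to the authoritative registry for the domain’s TLD.

```python
import socket

def build_whois_query(domain):
    # RFC 3912: a WHOIS request is just the bare query terminated by CRLF
    return (domain.strip() + "\r\n").encode("ascii")

def whois_lookup(domain, server="whois.iana.org", timeout=10):
    """Query a WHOIS server directly over TCP port 43.

    whois.iana.org is an assumed starting point; follow the 'refer:'
    field in its response to reach the registry for the domain's TLD.
    """
    with socket.create_connection((server, 43), timeout=timeout) as sock:
        sock.sendall(build_whois_query(domain))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # server closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")
```

A web-based lookup such as ICANN’s does the same thing for you, but knowing the raw protocol is handy when you want to script bulk lookups.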


Google Dorks / Google Hacking

There is a treasure trove of information if you know how and where to look. Google Dorks are Google query strings you can use to help you gather information about your target. There are literally thousands of different Google Dorks for you to use, and a great resource that lists them is the Google Hacking Database.

Here are a few of the generic ones I use when starting the exercise. I often return to this list at the end of the entire Reconnaissance phase if I am still missing some vital information.

For reconnaissance targeting I recommend using the following:

<target name> – Generic Search

<search string> site:<website domain name>  – Search for data on target’s website

inurl:<target name> – see where the target’s name appears in a URL on the web

inurl:admin inurl:uploads site:<website domain name> – Fishes images and text from upload sites – Toby Rose
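Since I reuse these dorks for every engagement, it can help to generate them programmatically. The sketch below assembles the generic dorks listed above for a given target; the function name is my own, and `site:` / `inurl:` are standard Google search operators.

```python
def build_dorks(target_name, domain):
    """Assemble the generic Google Dorks listed above for one target.

    target_name and domain are the placeholders you would otherwise
    fill in by hand for each query.
    """
    return [
        target_name,                                          # generic search
        "{} site:{}".format(target_name, domain),             # data on the target's own site
        "inurl:{}".format(target_name),                       # target name appearing in URLs
        "inurl:admin inurl:uploads site:{}".format(domain),   # admin upload directories
    ]
```

You can then paste each string into Google, or feed the list to a scripted search tool of your choice.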


Social Media

You should also scan social media sites for information about the target. This is very useful for obtaining information about employees, the technologies they have in use, and so on. Google Dorks make sifting through social media data a little less painful, e.g. site:twitter.com <target name> and site:facebook.com <target name> will return all indexed information about the target from Twitter and Facebook. Of course, there are many social media sites out there, so please adapt these Google Dorks accordingly for LinkedIn, Instagram etc. You will be amazed at the amount of information you can find on social media, so do not skip this step.


Job Sites

Job sites are valuable resources for identifying technologies in use by the target organization. Once again, use Google Dorks to search these, e.g. site:<job site domain> <target name>.


Public Website / Press Releases

The target’s own public website and other digital assets it hosts in the public domain can also be used to gather information needed in further phases of the ethical hacking exercise. Press releases issued by the organization can also be useful, as they state the names and designations of key employees and the technologies or projects they have successfully implemented.


Netcraft is a useful online service that aids you in multiple aspects of web application security, including anti-phishing and anti-fraud services. Netcraft can be used to gather information about websites run by the target organization, returning details such as IP address, hosting provider, technology in use etc. Use the Netcraft Toolbar on the right-hand side of the page as shown in the image below.

Netcraft Footprinting


Same IP

Discovering which sites are running on the same server as your target’s website often uncovers valuable information. For example, you might find sub-domains or development sites. Often the service provider who hosts this site is responsible for other services as well. Remember that during the Reconnaissance phase there is no such thing as too much information. Sites that provide this service include onsameip.



The greatest tool at your disposal during this phase of reconnaissance is DNS. This Internet protocol will help you obtain a list of IP addresses and match these to possible services the target is running. In addition, DNS gives insight into how the target’s email is routed, special application configurations you can derive from TXT and SRV records, and of course the IPs and names of the authoritative DNS servers. You could use the built-in DNS tool ‘nslookup’, which comes pre-installed on most operating systems, to perform DNS investigations. There are, however, many tools that automate this for you and perform other services like WHOIS lookups, Google searches etc. Let’s take a look at five tools I like to use.
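When you do use nslookup by hand, you often end up eyeballing the same record formats over and over. As an illustration, the sketch below parses MX records out of `nslookup -type=mx` style output; the line format ("<domain> mail exchanger = <preference> <host>") is the one nslookup typically prints, so adjust the pattern if your resolver formats it differently.

```python
import re

# Matches lines like: "example.com  mail exchanger = 10 mail.example.com."
MX_LINE = re.compile(r"mail exchanger = (\d+)\s+(\S+)")

def parse_mx_records(nslookup_output):
    """Extract (preference, mail server) pairs from nslookup MX output.

    Returns the records sorted by preference: the lowest value is the
    server the target's mail is routed to first.
    """
    records = []
    for line in nslookup_output.splitlines():
        m = MX_LINE.search(line)
        if m:
            records.append((int(m.group(1)), m.group(2).rstrip(".")))
    return sorted(records)
```

Seeing the preferred mail exchanger immediately tells you something about the target, e.g. an MX host under a third-party domain usually means mail is outsourced to that provider.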

DNS Recon

DNSRecon is a great tool for conducting DNS Reconnaissance. You can install it from the creator’s GitHub repository. Once installed, run the command ‘dnsrecon -d <target domain> -w’, where the -w option initiates a deep WHOIS record analysis. The output of DNSRecon will provide you with the WHOIS record, host addresses, name servers and IP addresses, as well as the MX mail records and other pertinent DNS information.


dnscan is another DNS reconnaissance tool. You can download the Python script from the creator’s GitHub repository. dnscan has similar features to DNSRecon, but it comes with a DNS subdomain dictionary, which is invaluable for finding subdomains of the internet domain you are interrogating. To run dnscan, type the following command in the terminal: ‘python dnscan.py -d <target domain> -w <subdomain text file> -v’. dnscan has DNS subdomain text files saved in the same GitHub repository, which comes in quite handy when you need a subdomain text file. The -v option just adds verbosity to the script so that you can track progress as it runs.
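Conceptually, a dictionary scan like dnscan’s just combines each wordlist entry with the target domain and tries to resolve the result. The sketch below shows the candidate-generation half of that (the function name is mine; actual DNS resolution is left out so the example stays offline and stealth-neutral).

```python
def subdomain_candidates(domain, wordlist):
    """Expand a subdomain wordlist into fully qualified candidate names.

    This is the generation step of a dictionary scan; a real scanner
    like dnscan would then attempt to resolve each candidate.
    """
    seen = set()
    for word in wordlist:
        word = word.strip().lower()
        if word and word not in seen:   # skip blanks and duplicates
            seen.add(word)
            yield "{}.{}".format(word, domain)
```

Feeding a large, well-curated wordlist into this step is what makes the difference between finding the obvious `www` and discovering a forgotten `staging` or `vpn` host.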


dmitry is another DNS / web search footprinting reconnaissance tool. dmitry is an abbreviation for Deepmagic Information Gathering Tool. You can install it on your own Linux machine by visiting the creator’s website. The command to perform a dmitry ‘footprinting’ scan is ‘dmitry -winse <target domain>’.


theharvester is another enumeration tool that can assist you in gathering information during the early phases of an ethical hacking exercise. theharvester gathers information such as email addresses, subdomains, hosts, employee names etc. You can install it by following the creator’s instructions. To run theharvester, type the following at the command line: ‘theharvester -d <domain> -l <number of searches e.g. 500> -b <search engine e.g. google>’.
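At its core, email harvesting is pattern extraction over scraped search-engine results. The sketch below shows that core step, assuming you already have page text in hand; the regex is deliberately simple (good enough for harvesting, not full RFC 5322 validation) and the function name is my own.

```python
import re

# Simple address pattern: local part, @, dotted domain with a 2+ letter TLD
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def harvest_emails(page_text, domain):
    """Pull addresses belonging to one domain out of scraped page text.

    This is the extraction step that theharvester automates across
    many search engines; subdomain addresses are kept too.
    """
    found = {m.group(0).lower() for m in EMAIL_RE.finditer(page_text)}
    return sorted(e for e in found
                  if e.endswith("@" + domain) or e.endswith("." + domain))
```

Run against enough indexed pages, a handful of addresses like these quickly reveals the organization’s email naming convention, which matters later for social engineering and password-spray scoping.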


Belati is a relatively new kid on the block and comes with features similar to the previously mentioned tools. You can install Belati by following the instructions on the creator’s GitHub page. The tool is written in Python and uses the Django web framework for reporting. It enumerates WHOIS records, HTTP banners and subdomains, and checks Google, GIT etc. To run Belati, type the following at the command line in the Belati directory: ‘python Belati.py -d <target domain>’. Below is an example of an output report.


Footprinting - Belati Report Interface


There are of course many footprinting tools out there and many more created as time goes by. This list is by no means complete and a quick Google search will uncover many others. This is a list of tools and resources I use to start the information gathering exercise and I hope that you find it adds some value.

