Tier I
Passive Info Gathering
Passive information gathering involves collecting data without directly interacting with the target system.
This reduces the chance of detection and helps gain useful insights through public sources and tools.
General Methods & Tools
Common Methods
- /robots.txt: May expose restricted or sensitive areas of a website
- `host <domain>`: Resolve the IP address and other DNS records
- `whois <domain/ip>`: Retrieve registration and owner information
- `whatweb <domain>`: Identify web technologies in use
- HTTrack: Clone websites for offline analysis
- NetCraft: Web-based tool to analyze technologies, emails, and systems
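The command-line tools above can be chained into one pass over a target. A minimal sketch, assuming a POSIX shell and `example.com` as a placeholder target; tools that are not installed are skipped:

```shell
#!/bin/sh
# Run the basic passive-recon commands against one domain,
# skipping any tool that is not installed on this machine.
recon() {
  domain="$1"
  echo "[*] Target: $domain"
  for tool in host whois whatweb; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "[*] Running: $tool $domain"
      "$tool" "$domain"
    else
      echo "[!] $tool not installed, skipping"
    fi
  done
}

recon "example.com"   # placeholder target; replace with a domain in scope
```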
DNS Reconnaissance
- `dnsrecon -d <domain>`: Discover DNS records, mail servers, and technologies
- DNSDumpster: Web-based DNS analysis tool
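A hedged usage sketch for dnsrecon, where `example.com` is a placeholder target and `-t std` selects the standard record scan (SOA, NS, A, AAAA, MX, SRV):

```shell
# DNS enumeration sketch with dnsrecon; example.com is a placeholder target.
domain="example.com"
echo "[*] Enumerating DNS for $domain"
if command -v dnsrecon >/dev/null 2>&1; then
  dnsrecon -d "$domain" -t std     # standard record enumeration
  dnsrecon -d "$domain" -t axfr    # attempt a zone transfer
else
  echo "[!] dnsrecon not installed"
fi
```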
Email Harvesting
- TheHarvester: Search for public emails using Google, LinkedIn, etc.
- `theHarvester -d <domain> -b google,linkedin`: Command-line usage
Firewall Reconnaissance
- Wafw00f: Detect web application firewalls (WAFs) protecting a target (available on GitHub)
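A sketch of running wafw00f, with `https://example.com` as a placeholder target; the `-a` flag asks it to report every matching WAF rather than stopping at the first:

```shell
# WAF detection sketch; https://example.com is a placeholder target.
target="https://example.com"
echo "[*] Checking for a WAF in front of $target"
if command -v wafw00f >/dev/null 2>&1; then
  wafw00f -a "$target"   # -a: test for all known WAF signatures
else
  echo "[!] wafw00f not installed (github.com/EnableSecurity/wafw00f)"
fi
```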
Leaked Password Databases
- HaveIBeenPwned: Check if an email/password was leaked in past breaches
Subdomain Enumeration
- Sublist3r: Finds public subdomains using search engines like Google
- `sublist3r -d <domain> -e <engine>`: Command-line usage
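A hedged example run, with `example.com` as a placeholder; `-o` saves the discovered subdomains to a file for later active testing:

```shell
# Subdomain enumeration sketch; example.com is a placeholder target.
domain="example.com"
echo "[*] Enumerating subdomains of $domain"
if command -v sublist3r >/dev/null 2>&1; then
  sublist3r -d "$domain" -e google -o subdomains.txt   # save results to a file
else
  echo "[!] sublist3r not installed (github.com/aboul3la/Sublist3r)"
fi
```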
Website Archive
- Wayback Machine: View historical snapshots of websites via web.archive.org
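Snapshots can also be listed from the command line through the Wayback Machine's public CDX API; a sketch with `example.com` as a placeholder and `limit=5` capping the results:

```shell
# List archived snapshots via the Wayback Machine CDX API (no key required).
url="https://web.archive.org/cdx/search/cdx?url=example.com&limit=5&output=json"
echo "[*] Fetching: $url"
curl -s "$url" || echo "[!] request failed (offline or curl missing)"
```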
Google Dorks
Advanced Google search operators to find hidden content.
You can combine different dorks to narrow your search.
- `site:website` — searches all indexed pages of that website
- `inurl:admin` — looks for “admin” in the URL
- `intitle:"index of" password` — searches for password files exposed in open directory listings
- `site:*.website` — subdomain discovery
- Other useful operators: `intitle:admin`, `filetype:pdf`, keywords such as `courses`, etc.
- Examples of sensitive info:
  - `inurl:auth_user_file.txt`
  - `inurl:password.txt`
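Combining operators is just string concatenation; a tiny sketch that builds one query from placeholder values, ready to paste into Google:

```shell
# Combine several dorks into one query string (all values are placeholders).
site="example.com"
query="site:${site} intitle:\"index of\" filetype:pdf"
echo "$query"   # → site:example.com intitle:"index of" filetype:pdf
```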