PyOsint

An automation script written in Python for subdomain enumeration, web scraping, and finding usernames.

Installation

git clone https://github.com/d8rkmind/Pyosint.git
cd Pyosint
pip3 install -r requirements.txt
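
To verify the setup, you can print the built-in help (this uses the script name from the usage line below):

python3 Pyosint.py -h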

Usage:

python3 Pyosint.py [OPTIONS]

Brief info:

The main functionality of this program is divided into three parts:

  • Find – Searches for usernames across a list of 326 websites
  • Scrap – Crawls a given website, extracts all links, and stores them in a file
  • Enum – Automates subdomain discovery for a given domain using several services

In the Scrap module, results are automatically stored in the output/web folder, with the IP address of the website used as the filename.
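
For example, scraping a site and checking where the links end up might look like this (the output path follows the naming rule above; the exact filename depends on the IP address resolved at run time):

python3 pyosint.py -m scrap -n http://scanme.nmap.org
# extracted links are written to output/web/<ip-address-of-the-target>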

The services used are VirusTotal, PassiveDns, CrtSearch, and ThreatCrowd.
The Enum module needs a VirusTotal API key, which you can get by going Here.

Paste the key inside the api.json file:

* If this step is skipped, VirusTotal may block your requests.
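
A minimal sketch of putting the key in place, assuming api.json holds a single field named "virustotal" (the field name is a guess; check the api.json shipped with the repository for the exact structure it expects):

# "virustotal" is an assumed field name; edit the existing api.json by hand if its layout differs
echo '{ "virustotal": "<YOUR_VIRUSTOTAL_API_KEY>" }' > api.json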

Command-Line Usage Information

The following arguments work with this program:

| Argument | Short form | Long form | Functionality |
|----------|------------|-----------|---------------|
| Name     | -n | --name    | To specify the domain name or username to use |
| Module   | -m | --module  | To specify which module to use |
| Output   | -o | --output  | To specify the output file name |
| Threads  | -t | --threads | To specify the number of threads to use [not applicable to web crawling] |
| Limit    | -l | --limit   | To specify the maximum number of URLs to crawl [applicable only to web crawling] |
| Verbose  | -v | --verbose | To enable verbose mode [applicable only to enumeration] |
| Ports    | -p | --ports   | To specify the ports to scan [applicable only to enumeration] |
| Help     | -h | --help    | To show the help options |

Examples:

Linux commands:

<div class="snippet-clipboard-content position-relative overflow-auto" data-snippet-clipboard-copy-content="python3 pyosint.py -m find -n exampleuser <– Username-huntdown

python3 pyosint.py -m scrap -n http://scanme.nmap.org <– Scrapping using bot

python3 pyosint.py -m enum -n google.com

python3 pyosint.py -m find -n exampleuser               <-- Username hunting

python3 pyosint.py -m scrap -n http://scanme.nmap.org   <-- Web scraping

python3 pyosint.py -m enum -n google.com                <-- Subdomain enumeration
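
The optional flags from the table above can be combined in a single run; the combination and the comma-separated port format below are illustrative assumptions, not taken from the project's own examples:

python3 pyosint.py -m enum -n example.com -t 10 -p 80,443 -o results.txt -v   <-- Enumeration with threads, ports, output file, and verbose mode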