Search the Community
Showing results for tag 'crawler'.
-
Scraper is an automatic plugin that copies content and posts automatically from any website. With a ton of useful and unique features, the Scraper WordPress plugin takes the content-creation process to another level. [Hidden Content] [hide][Hidden Content]
-
crawlergo is a browser crawler that uses Chrome headless mode for URL collection. It hooks key positions across the whole web page during the DOM rendering stage, automatically fills and submits forms, triggers JS events intelligently, and collects as many entry points exposed by the website as possible. The built-in URL de-duplication module filters out a large number of pseudo-static URLs while maintaining fast parsing and crawling speed on large websites, yielding a high-quality collection of request results. crawlergo currently supports the following features:
- Chrome browser environment rendering
- Intelligent form filling and automated submission
- Full DOM event collection with automated triggering
- Smart URL de-duplication to remove most duplicate requests
- Intelligent page analysis and URL collection, including JavaScript file content, page comments, robots.txt files, and automatic fuzzing of common paths
- Host binding support, with automatic fixing and adding of the Referer header
- Browser request proxy support
- Pushing results to passive web vulnerability scanners
[hide][Hidden Content]
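The pseudo-static de-duplication idea above can be sketched as follows. This is a simplified Python illustration, not crawlergo's actual implementation (crawlergo is written in Go): URLs that differ only in numeric path segments or query-string values are collapsed to one structural pattern, and only the first URL per pattern is kept.

```python
from urllib.parse import urlsplit, parse_qsl

def url_pattern(url):
    """Reduce a URL to a structural pattern: numeric path segments and
    query values are replaced with placeholders so pseudo-static pages
    (e.g. /article/1, /article/2) collapse to one entry."""
    parts = urlsplit(url)
    path = "/".join(
        "{num}" if seg.isdigit() else seg
        for seg in parts.path.split("/")
    )
    # Only the sorted set of query keys matters, not their values.
    keys = "&".join(sorted(k for k, _ in parse_qsl(parts.query)))
    return f"{parts.netloc}{path}?{keys}"

def dedupe(urls):
    """Keep the first URL seen for each structural pattern."""
    seen, kept = set(), []
    for u in urls:
        p = url_pattern(u)
        if p not in seen:
            seen.add(p)
            kept.append(u)
    return kept
```

With this, `/article/1` and `/article/2` share one pattern, while `/about` survives as a distinct entry.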
-
SourceWolf: an amazingly fast response crawler for finding juicy stuff in source code. What can SourceWolf do?
- Crawl through responses to find hidden endpoints, either by sending requests or from local response files (if any).
- Brute-force hosts using a wordlist.
- Get the status codes for a list of URLs / filter the live domains out of a list of hosts.
All of the features above run at great speed: SourceWolf uses the Session object from the requests library, which means it reuses the TCP connection, making it really fast. SourceWolf also provides an option to crawl response files locally, so that you aren't sending requests again to an endpoint whose response you already have a copy of. The final endpoints are reported in complete form with a host, like [Hidden Content], not as /api/admin; this comes in useful when you are scanning a list of hosts.
Changelog v1.8: SourceWolf can now grab GitHub and LinkedIn profiles along with other social media links!
[hide][Hidden Content]
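The "crawl responses for hidden endpoints" step can be sketched roughly like this. The regex and function name are illustrative assumptions, not SourceWolf's actual pattern: the idea is to pull quoted absolute URLs and root-relative paths out of a saved response body (HTML or JavaScript).

```python
import re

# Hypothetical pattern (not SourceWolf's own): quoted root-relative
# paths or absolute http(s) URLs inside a response body.
ENDPOINT_RE = re.compile(r'["\'](/[A-Za-z0-9_\-./]+|https?://[^"\']+)["\']')

def extract_endpoints(body):
    """Scan one response body for endpoint-looking strings."""
    return sorted(set(m.group(1) for m in ENDPOINT_RE.finditer(body)))
```

Running this over locally saved response files avoids re-requesting endpoints you already have copies of, which is the same trade-off the tool describes.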
-
Tagged with: sourcewolf, v1.8 (and 4 more)
-
Interactive CLI web crawler. Evine is a simple, fast, and interactive web crawler and web scraper written in Golang. Evine is useful for a wide range of purposes such as metadata and data extraction, data mining, reconnaissance, and testing. Views:
- URL: enter the URL string here.
- Options: set crawler options.
- Headers: set the HTTP headers.
- Keys: used after crawling; extracts data (docs, URLs, etc.) from the pages that have been crawled.
- Regex: search regexes in the crawled pages; write your regex in this view and press Enter.
- Response: all results are written to this view.
- Search: search regexes within the Response content.
[hide][Hidden Content]
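The URL-extraction step at the heart of any such crawler can be sketched in a few lines. Evine itself is written in Go; this is a rough Python analogue (names are illustrative) that collects `<a href>` targets from a fetched page and resolves them against the page's base URL, producing the frontier for the next crawl round.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collect href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links become absolute, ready to enqueue.
                    self.links.append(urljoin(self.base, value))

def extract_links(base_url, html):
    parser = LinkParser(base_url)
    parser.feed(html)
    return parser.links
```

A real crawler would feed each extracted link back into a queue, fetch it, and repeat until the frontier is exhausted or a depth limit is hit.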
-
SourceWolf: an amazingly fast response crawler for finding juicy stuff in source code. What can SourceWolf do?
- Crawl through responses to find hidden endpoints, either by sending requests or from local response files (if any).
- Brute-force hosts using a wordlist.
- Get the status codes for a list of URLs / filter the live domains out of a list of hosts.
All of the features above run at great speed: SourceWolf uses the Session object from the requests library, which means it reuses the TCP connection, making it really fast. SourceWolf also provides an option to crawl response files locally, so that you aren't sending requests again to an endpoint whose response you already have a copy of. The final endpoints are reported in complete form with a host, like [Hidden Content], not as /api/admin; this comes in useful when you are scanning a list of hosts.
Changelog v1.3: fixed a bug that prevented const variables from being detected.
[hide][Hidden Content]
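The v1.3 fix concerns detecting `const` declarations in JavaScript responses. As a hedged sketch of that kind of detection (the regex and function are assumptions, not SourceWolf's implementation), one can match the three JS declaration keywords and capture the identifier that follows:

```python
import re

# Hypothetical sketch: find const/let/var declarations in a
# JavaScript response body. A regex that only matched "var" and "let"
# would exhibit exactly the kind of bug the changelog describes.
DECL_RE = re.compile(r'\b(const|let|var)\s+([A-Za-z_$][\w$]*)')

def find_declarations(js_source):
    """Return (keyword, identifier) pairs for each declaration found."""
    return [(kw, name) for kw, name in DECL_RE.findall(js_source)]
```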
-
Tagged with: sourcewolf, v1.3 (and 4 more)
-
diskover - file system crawler, disk space usage, file search engine, and storage analytics powered by Elasticsearch. diskover is an open-source file system crawler and disk-space-usage tool that uses Elasticsearch to index and manage data across heterogeneous storage systems. Using diskover, users can search and organize files more effectively, and system administrators can manage storage infrastructure, provision storage efficiently, monitor and report on storage use, and make better-informed decisions about new infrastructure purchases. As the amount of file data generated by businesses continues to expand, the stress on expensive storage infrastructure, on users and system administrators, and on IT budgets continues to grow. With diskover, users can identify old and unused files and gain better insight into data change, file duplication, and wasted space. diskover supports crawling local file systems, crawling NFS/SMB, crawling through TCP sockets using the Tree Walk Client, and crawling over HTTP with the Storage Agent. Importing Amazon S3 inventory files is also supported. Plugin extensions can be used to add extra metadata. diskover is written and maintained by shirosaidev and runs on Linux, OS X/macOS, and Windows 10 (using the Windows Subsystem for Linux) with Python 2 or 3.
[Image: diskover crawler and worker bots running in a terminal]
[Hidden Content]
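The core of a local file system crawl is a directory walk that emits one metadata document per file, which an indexer would then ship to Elasticsearch. A minimal sketch, with field names that are illustrative rather than diskover's actual index mapping:

```python
import os

def crawl_metadata(root):
    """Walk a directory tree and build one metadata document per file,
    the kind of record a file system crawler would index.
    (Field names are illustrative, not diskover's actual mapping.)"""
    docs = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            docs.append({
                "filename": name,
                "path_parent": dirpath,
                "filesize": st.st_size,       # bytes
                "last_modified": st.st_mtime,  # epoch seconds
            })
    return docs
```

Aggregating `filesize` over these documents is what enables the disk-space-usage and old/unused-file reports described above.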
-
A web crawler that also scans for SQL injection vulnerabilities. [hide][Hidden Content]
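A classic error-based check such a scanner performs is to inject a stray quote into a parameter and look for database error signatures in the response. The signature list and function below are an illustrative assumption, not this tool's actual code:

```python
import re

# Illustrative (not exhaustive) database error signatures.
ERROR_SIGNATURES = [
    r"you have an error in your sql syntax",  # MySQL
    r"unclosed quotation mark",               # SQL Server
    r"pg_query\(\): query failed",            # PostgreSQL
]

def looks_injectable(response_body):
    """True if the response to an injected quote shows a DB error."""
    body = response_body.lower()
    return any(re.search(sig, body) for sig in ERROR_SIGNATURES)
```

The crawler half of such a tool feeds every discovered parameterized URL into this check.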
-
Tagged with: ultimate-dork, web (and 1 more)
-
Web Crawler, Scanner, and Analyzer Framework (shell-script based). Bashter is a tool for scanning a web-based application. Bashter is well suited to bug bounty work or penetration testing. It is designed like a framework, so you can easily add a script to detect a vulnerability. [hide][Hidden Content]
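Bashter itself is shell-script based; as a rough Python analogue of its framework idea (all names here are illustrative assumptions), each vulnerability check is a small pluggable function registered with the scanner, so adding a new check means adding one function/script:

```python
# Registry of pluggable checks, keyed by check name.
CHECKS = {}

def check(name):
    """Decorator that registers a check function under a name."""
    def register(fn):
        CHECKS[name] = fn
        return fn
    return register

@check("missing-x-frame-options")
def clickjacking(headers, body):
    # Fires when the anti-clickjacking header is absent.
    return "x-frame-options" not in {k.lower() for k in headers}

@check("server-banner")
def banner(headers, body):
    # Fires when the server discloses its banner.
    return "server" in {k.lower() for k in headers}

def scan(headers, body):
    """Run every registered check and report the ones that fire."""
    return sorted(name for name, fn in CHECKS.items() if fn(headers, body))
```

Dropping a new decorated function into the checks directory is all it takes to extend the scanner, which mirrors Bashter's "add a script per vulnerability" design.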