-Sh0w

LvL-23
  • Content Count: 5
  • Avg. Content Per Day: 0
  • Joined
  • Last visited

Community Reputation

50 Excellent

About -Sh0w

  • Rank
    ./End
  • Birthday 03/15/1996


  1. WebCruiser – Web Vulnerability Scanner, a compact but powerful web security scanning tool that helps you audit your site. It combines a vulnerability scanner with a series of security tools, and it supports both scanning a website and generating POCs (proofs of concept) for web vulnerabilities: SQL Injection, Cross-Site Scripting, XPath Injection, etc. WebCruiser is therefore also an automatic SQL injection tool, an XPath injection tool, and a Cross-Site Scripting tool.

     Key features:
       • Crawler (site directories and files)
       • Vulnerability scanner: SQL Injection, Cross-Site Scripting, XPath Injection, etc.
       • SQL injection scanner
       • SQL injection tool: GET/POST/Cookie injection POC (proof of concept)
       • SQL injection for SQL Server: PlainText/Union/Blind injection
       • SQL injection for MySQL: PlainText/Union/Blind injection
       • SQL injection for Oracle: PlainText/Union/Blind/CrossSite injection
       • SQL injection for DB2: Union/Blind injection
       • SQL injection for Access: Union/Blind injection
       • POST data resend
       • Cross-Site Scripting scanner and POC
       • XPath injection scanner and POC
       • Auto-get cookie from web browser for authentication
       • Report output

     System requirement: Windows with .NET Framework 2.0 or higher

     Download: [Hidden Content]
  2. Re: web application hacking course, a good point to start learning website hacking from zero to advanced level

     "Good tutorial."
  3. -Sh0w

    BYG - BeYeuGroup's Shell

    Re: BYG - BeYeuGroup's Shell: "What is the login password? The code is obfuscated '-'"
  4. -Sh0w

    Scanner Website 1.0

    Functions:

    Admin Finder: an admin finder, for those who don't know, tries to find the URL where the administrator logs in.

    Web Crawler: software designed to sweep the web systematically, following information it deems relevant to its function. Crawlers capture the text of pages and record the links they find, which lets them discover new pages. They are one of the foundations of search engines: they are responsible for indexing sites, storing them in the search engines' databases. They are also known as Spiders or Bots (robots).

    Download | Scan

    Credits: -Show
    Source: [Hidden Content]
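The error-based SQL injection checks that the WebCruiser post above lists (GET/POST injection POCs against MySQL, SQL Server, Oracle, etc.) can be sketched in a few lines: append a single quote to a GET parameter and look for database error signatures in the response body. This is an illustrative outline only, not WebCruiser's actual implementation; the function names and the (deliberately short) signature list are assumptions.

```python
import re
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative error signatures; real scanners ship far larger lists.
SQL_ERROR_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (
        r"you have an error in your sql syntax",                # MySQL
        r"unclosed quotation mark after the character string",  # SQL Server
        r"ora-\d{5}",                                           # Oracle
        r"sqlite3?\.operationalerror",                          # SQLite
    )
]

def inject_quote(url: str, param: str) -> str:
    """Return the URL with a single quote appended to one GET parameter."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query[param] = query.get(param, "") + "'"
    return urlunsplit(parts._replace(query=urlencode(query)))

def looks_like_sql_error(body: str) -> bool:
    """True if the response body matches a known database error message."""
    return any(p.search(body) for p in SQL_ERROR_PATTERNS)
```

A scanner would fetch `inject_quote(url, param)` for each discovered parameter and flag the page when `looks_like_sql_error` matches the response; union- and blind-injection checks build on the same probe-and-compare idea with crafted payloads instead of a bare quote.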
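The crawler behaviour described in the Scanner Website 1.0 post (capture page text, record the links found, use them to discover new pages) can be sketched with the standard library alone. This is a minimal illustration of the general technique, not the tool's actual code; `extract_links` and `crawl` are hypothetical names.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""

    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html: str, base_url: str) -> list[str]:
    """Return all absolute link targets found in an HTML document."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

def crawl(start_url: str, max_pages: int = 10) -> set[str]:
    """Breadth-first crawl: fetch a page, queue its links, repeat."""
    from urllib.request import urlopen

    seen: set[str] = set()
    frontier = [start_url]
    while frontier and len(seen) < max_pages:
        url = frontier.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            body = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip unreachable pages
        frontier.extend(extract_links(body, url))
    return seen
```

An admin finder works on the same fetch loop, except that instead of following discovered links it requests a fixed wordlist of likely login paths (e.g. /admin/, /login.php) and reports which ones respond.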