Showing results for tags 'reddit'.

Found 19 results

  1. Proxies: Yes | Bots: 100 | User:Pass | Capture: Premium / Mail / Comment Karma / Post Karma / Credits / Coins / Country / City [Hidden Content]
  2. Abusing the Reddit API to host C2 traffic: since most blue-team members use Reddit, this might be a great way to make the traffic look legitimate.

     Workflow

     Teamserver
     1. Go to the specific Reddit post and post a new comment with the command ("in: ").
     2. Read for a new comment that includes the word "out:".
     3. If no such comment is found, go back to step 2.
     4. Parse the comment, decrypt it, and read its output.
     5. Edit the existing comment to "executed" to avoid re-executing it.

     Client
     1. Go to the specific Reddit post and read the latest comment that includes "in:".
     2. If no new comment is detected, go back to step 1.
     3. Parse the command out of the comment, decrypt it, and execute it locally.
     4. Encrypt the command's output and post it as a reply to the respective comment ("out: ").

     Disclaimer: use of this project is for educational/testing purposes only. Using it on unauthorised machines is strictly forbidden. If somebody is found to use it for illegal/malicious intent, the author of the repo will not be held responsible.

     [Hidden Content]
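The comment convention above (an "in: " marker for tasking, "out: " for results) can be sketched with a pair of helpers. This is a minimal illustration with base64 standing in for the project's encryption layer; the function names are hypothetical, not the repo's actual API:

```python
import base64
from typing import Optional

IN_MARKER = "in: "    # teamserver -> client tasking comment
OUT_MARKER = "out: "  # client -> teamserver result comment

def make_command_comment(command: str) -> str:
    """Teamserver side: wrap an encoded command in the 'in:' marker."""
    return IN_MARKER + base64.b64encode(command.encode()).decode()

def parse_command_comment(body: str) -> Optional[str]:
    """Client side: extract and decode a command, or return None if
    the comment is not a tasking comment."""
    if not body.startswith(IN_MARKER):
        return None
    return base64.b64decode(body[len(IN_MARKER):]).decode()

def make_output_comment(output: str) -> str:
    """Client side: wrap encoded command output in the 'out:' marker."""
    return OUT_MARKER + base64.b64encode(output.encode()).decode()
```

In the real workflow these strings would be posted and read as Reddit comments (e.g. via PRAW), with a proper cipher in place of base64.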
  3. Proxies: Yes | Bots: 100 | User:Pass / Email:Pass | Capture: [Hidden Content]
  4. Proxies: Yes | Bots: 100 | User:Pass | Capture: Karma / CakeDay [Hidden Content]
  5. Proxies: Yes | Bots: 100 | User:Pass | Capture: hasGoldSubscription / hasPaypalSubscription / hasAndroidSubscription / hasIOSSubscription [Hidden Content]
  6. This is a universal Reddit scraper that can scrape Subreddits, Redditors, and comments on posts. Scrape speeds will be determined by the speed of your internet connection.

     Changelog v3.3.2

     Added
     • Source code – In Redditor.py: added a new method GetInteractions._get_user_subreddit(), which extracts subreddit data from the UserSubreddit object into a dictionary.
     • Tests – In test_Redditor.py: added TestGetUserSubredditMethod().test_get_user_subreddit() to test the new method.

     Changed
     • Source code – In Redditor.py: GetInteractions._get_user_info() calls the new GetInteractions._get_user_subreddit() method to set the Redditor's subreddit data within the main Redditor information dictionary. In Version.py: incremented the version number.
     • README – Incremented the PRAW badge version number.

     [Hidden Content]
  7. Proxies: Yes | Bots: 100 | User:Pass | Capture: isVerified / isMod / Gold / Gold Credits / Link [Hidden Content]
  8. Universal Reddit Scraper This is a universal Reddit scraper that can scrape Subreddits, Redditors, and comments on posts. Scrape speeds will be determined by the speed of your internet connection. [Hidden Content]
  9. This is a universal Reddit scraper that can scrape Subreddits, Redditors, and comments on posts. Scrape speeds will be determined by the speed of your internet connection.

     Summary – Introduced livestreaming tools:
     • Livestream comments or submissions submitted within Subreddits.
     • Livestream comments or submissions submitted by a Redditor.

     [Hidden Content]
  10. Universal Reddit Scraper This is a universal Reddit scraper that can scrape Subreddits, Redditors, and comments on posts. Scrape speeds will be determined by the speed of your internet connection.

      Changelog v3.2.1
      • Structured comments export has been upgraded to include comments of all levels. Structured comments are now the default export format; exporting to raw format requires the --raw flag.
      • Tons of metadata have been added to all scrapers. See the Full Changelog section for a full list of the added attributes.
      • Credentials.py has been deprecated in favor of .env to avoid hard-coding API credentials.
      • Added more terminal eye candy – Halo has been implemented to spice up the output.

      [Hidden Content]
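v3.2.1 moves API credentials out of Credentials.py and into a .env file. python-dotenv is the usual library for this; the stand-in parser below shows the idea with only the standard library (the file contents are invented for the demo):

```python
import os
import tempfile

def load_env(path: str) -> dict:
    """Minimal .env reader: KEY=VALUE lines; blank lines and '#' comments
    are ignored; surrounding quotes on values are stripped."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip("'\"")
    return env

# Demo: write a throwaway .env and read the credentials back.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("# PRAW credentials\nCLIENT_ID=abc123\nCLIENT_SECRET='s3cret'\n")
    env_path = f.name
creds = load_env(env_path)
os.unlink(env_path)
```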
  11. Universal Reddit Scraper This is a universal Reddit scraper that can scrape Subreddits, Redditors, and comments on posts. Scrape speeds will be determined by the speed of your internet connection. [Hidden Content]
  12. Universal Reddit Scraper This is a universal Reddit scraper that can scrape Subreddits, Redditors, and comments on posts. Scrape speeds will be determined by the speed of your internet connection.

      Changelog v3.1.2
      • URS will create sub-directories within the date directory based on the scraper. Exported files are now stored in the subreddits, redditors, or comments directories. These directories are only created if the scraper is run; for example, the redditors directory will not be created if you never run the Redditor scraper.
      • Removed the first character previously used in exported filenames to distinguish the scrape type; this is no longer necessary with the new sub-directory structure.
      • The forbidden-access message that may appear when running the Redditor scraper was originally red; changed the color to yellow to avoid confusion.
      • Fixed a filenaming bug that would omit the scrape type if the filename length exceeded 50 characters.
      • Updated README: updated demo GIFs, added a new directory-structure visual generated by the tree command, and created new section headers to improve navigation.
      • Minor code reformatting/refactoring.
      • Updated STYLE_GUIDE to reflect the new changes and made a minor change to the PRAW API walkthrough.

      [Hidden Content]
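The directory layout described in v3.1.2 (a date directory, then a per-scraper subdirectory created only when that scraper actually runs) can be sketched as follows; the helper name is illustrative, not URS's actual code:

```python
import os
import tempfile
from datetime import date

def export_path(scrape_type: str, filename: str, root: str = "scrapes") -> str:
    """Build <root>/<today>/<scrape_type>/<filename>, creating the
    directories on demand so e.g. redditors/ only appears once the
    Redditor scraper has been run."""
    directory = os.path.join(root, date.today().isoformat(), scrape_type)
    os.makedirs(directory, exist_ok=True)
    return os.path.join(directory, filename)

# Demo in a temporary directory.
demo = export_path("subreddits", "askreddit-hot.json", root=tempfile.mkdtemp())
```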
  13. Universal Reddit Scraper v3.1.1 release: scrape Subreddits, Redditors, and comments on posts.

      This is a universal Reddit scraper that can scrape Subreddits, Redditors, and comments on posts. Scrape speeds will be determined by the speed of your internet connection. You can specify Subreddits, which category of posts, and how many results are returned from each scrape. I have also added a search option where you can search for keyword(s) within a Subreddit, and the scraper will get all posts returned by the search.

      The post category options are: Hot, New, Controversial, Top, Rising, Search.
      NOTE: all results are returned if you search for something within a Subreddit, so you will not be able to specify how many results to keep.

      Changelog v3.1.1
      • Users can now specify a time filter for the Subreddit categories Controversial, Search, and Top. The valid time filters are: all, day, hour, month, week, year.
      • Updated CLI unit tests to match the new way Subreddit args are parsed.
      • Updated community documents located in the .github/ directory: STYLE_GUIDE and PULL_REQUEST_TEMPLATE.
      • Updated README to reflect the new changes.

      [Hidden Content]
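The six time filters listed in v3.1.1 are easy to validate up front; a small helper (the name is illustrative, not URS's actual code) might look like:

```python
# Valid time filters for the Controversial, Search, and Top categories,
# as listed in the v3.1.1 changelog.
TIME_FILTERS = {"all", "day", "hour", "month", "week", "year"}

def validate_time_filter(value: str) -> str:
    """Normalize a user-supplied time filter, rejecting unknown values."""
    normalized = value.strip().lower()
    if normalized not in TIME_FILTERS:
        raise ValueError(
            f"invalid time filter {value!r}; choose one of {sorted(TIME_FILTERS)}"
        )
    return normalized
```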
  14. Universal Reddit Scraper This is a universal Reddit scraper that can scrape Subreddits, Redditors, and comments on posts. Scrape speeds will be determined by the speed of your internet connection.

      Changelog v3.1.0
      • Scrapes are now exported to the scrapes/ directory, within a subdirectory corresponding to the date of the scrape. These directories are created automatically when you run URS.
      • Added log decorators that record what happens during each scrape, which scrapes were run, and any errors that arise during runtime in the log file scrapes.log. The log is stored in the same date subdirectory.
      • Replaced bulky titles with minimalist titles for a cleaner look, and added color to terminal output.
      • Improved the naming convention for scripts.
      • Integrated Travis CI and Codecov.
      • Updated community documents located in the .github/ directory: BUG_REPORT, CONTRIBUTING, FEATURE_REQUEST, PULL_REQUEST_TEMPLATE, and STYLE_GUIDE.
      • Numerous changes to README; the most significant was splitting the walkthroughs out and storing them in docs/.

      [Hidden Content]
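The log decorators described in v3.1.0, which record which scrape ran and any runtime errors into scrapes.log, can be sketched like this (names are illustrative, not URS's actual code):

```python
import functools
import logging

# Mirror the changelog: one log file named scrapes.log.
logging.basicConfig(filename="scrapes.log", level=logging.INFO)

def log_scrape(func):
    """Record each scrape run, and log (then re-raise) any error."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        logging.info("running %s", func.__name__)
        try:
            return func(*args, **kwargs)
        except Exception:
            logging.exception("%s failed", func.__name__)
            raise
    return wrapper

@log_scrape
def subreddit_scrape(name: str) -> str:
    """Stand-in scraper used to demonstrate the decorator."""
    return f"scraped r/{name}"
```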
  15. Proxies: Yes | Bots: 100 | User:Pass | Capture: Karma Points / Coins / Is Gold / Expires On [Hidden Content]
  16. Proxies: Yes | Bots: 100 | User:Pass | Capture: is_suspended / has_verified_email / suspension_expiration_utc / link_karma / comment_karma [Hidden Content]
  17. Proxies: Yes | Bots: 100 | User:Pass / Email:Pass | Capture: Karma [Hidden Content]
  18. Email/User: User | Capture: see image below | Proxyless: No. Reddit config | Parse full account information + auto follow. How to config auto follow: [Hidden Content]