ParanoidSec (@paranoids3c)'s Twitter Profile
ParanoidSec

@paranoids3c

Security for Paranoid People.

Proactive Paranoid Works

ID: 1924473220843106305

Link: https://t.me/paranoidsec · Joined: 19-05-2025 14:33:18

57 Tweets

9 Followers

1 Following

ParanoidSec (@paranoids3c)'s Twitter Profile Photo

Recon is not about finding domains. It’s about finding patterns. Patterns reveal infrastructure. Infrastructure reveals people. 🧠 #OSINT #BugBounty #CyberSecurity

ParanoidSec (@paranoids3c)'s Twitter Profile Photo

Many researchers underestimate OPSEC during investigations. A single reused username, timezone pattern, or unfiltered request can expose the operator. Automation helps, but discipline protects. 🧠🐰 #OSINT #OPSEC #CyberSecurity

Phoronix (@phoronix)'s Twitter Profile Photo

sudo-rs Affected By Multiple Security Vulnerabilities - Impacting Ubuntu 25.10. One of the issues involves the sudo password being leaked if a timeout occurs or the sudo process is killed. phoronix.com/news/sudo-rs-s…

ParanoidSec (@paranoids3c)'s Twitter Profile Photo

Your surface isn’t what you think it is. Your real attack surface is everything you deployed, inherited, misconfigured, abandoned… and forgot.

ParanoidSec (@paranoids3c)'s Twitter Profile Photo

Security isn’t broken by advanced exploits - it’s broken by shortcuts. Check your habits. Review your footprint. Treat convenience as a threat. Paranoia is hygiene. 🧠🐰

Bash Bunny (@_bashbunny_)'s Twitter Profile Photo

While Cloudflare is down, I've been updating my Dark Web crawler. It's now fast and needs very little to run: just pure bash. Check it here: github.com/bash-bunny/DW_… For your info, it's running on Alpine Linux with 2 GB of RAM.

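The thread doesn't show the crawler's internals, but one crawl step in pure shell might look like this (a minimal sketch; `extract_links` and the Tor proxy address are assumptions, not the repo's actual code):

```shell
#!/bin/sh
# Hypothetical sketch of one crawl step using only grep and sed.
# extract_links pulls href targets out of raw HTML -- crude, but
# dependency-free, in the spirit of a pure-shell crawler.
extract_links() {
  grep -oE 'href="https?://[^"]+"' | sed -E 's/^href="//; s/"$//'
}

# Against a live onion service this would be routed through Tor, e.g.:
#   curl -s --socks5-hostname 127.0.0.1:9050 "$url" | extract_links >> frontier.txt
```

Piping extracted links back into a frontier file is what lets a loop like this keep discovering new pages from each fetch.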
Bash Bunny (@_bashbunny_)'s Twitter Profile Photo

Update: this is what it has gathered since yesterday. Almost 2 million URLs and 6,000 unique domains. Not bad for that 2 GB Alpine box and pure bash. Should I build my own Dark Web search engine?

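Unique-domain counts like the ones above fall out of a one-URL-per-line dump with standard tools alone (a sketch; `urls.txt` is a hypothetical crawl output name, not a file named in the thread):

```shell
# Field 3 of scheme://host/path, split on "/", is the host.
# sort -u dedupes the hosts, wc -l counts them.
awk -F/ '{print $3}' urls.txt | sort -u | wc -l
```

The same split also makes it easy to rank the noisiest domains, e.g. with `sort | uniq -c | sort -rn` instead of `sort -u | wc -l`.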
Bash Bunny (@_bashbunny_)'s Twitter Profile Photo

Day 2:
- 1.7 GB of text files
- Almost 9 million URLs
- 9,000 unique domains

Not bad for only two days on an Alpine machine.

Bash Bunny (@_bashbunny_)'s Twitter Profile Photo

Been doing some tests with Claude and this happened to me yesterday. Without being asked to, Claude automatically tried to access my filesystem. It couldn't, but it's interesting to see that it's running as root on the server.

Bash Bunny (@_bashbunny_)'s Twitter Profile Photo

Day 4:
- 1.9 GB of text files
- 10 million URLs
- Almost 14k unique domains

I've been improving the crawler, because some nasty sites hang indefinitely. It also now lets you resume past runs. Check it here: github.com/bash-bunny/DW_…

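Bounding hang-prone fetches and resuming past runs can both be done with standard tools; a sketch under assumptions (`fetch`, `resume`, `seen.txt`, and `frontier.txt` are illustrative names, not the repo's actual code):

```shell
#!/bin/sh
# Kill a fetch that stalls: timeout(1) hard-kills curl after 35s,
# and curl's own --max-time caps the transfer at 30s.
fetch() {
  timeout 35 curl -s --max-time 30 "$1" || echo "SKIP $1" >&2
}

# Resuming a past run: drop already-crawled URLs from the frontier.
# -x matches whole lines only, -F treats seen entries as literal strings.
resume() {
  grep -vxFf "$1" "$2"   # usage: resume seen.txt frontier.txt > todo.txt
}
```

The two deadlines are deliberately staggered: `--max-time` gives curl a chance to exit cleanly, and `timeout` is the backstop if curl itself wedges.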