- https://en.wikipedia.org/wiki/Have_I_Been_Pwned?
- https://en.wikipedia.org/wiki/Troy_Hunt – 2011–present: Microsoft MVP for Developer Security
Category: Internet
-
“Have I Been Pwned?” – WTF? Who is behind it?
-
details of an HTTPS server – as seen from outside
- https://en.wikipedia.org/wiki/HTTPS
- https://stackoverflow.com/questions/40557031/command-prompt-to-check-tls-version-required-by-a-host/55764641#55764641
- https://maxchadwick.xyz/blog/checking-ssl-tls-version-support-of-remote-host-from-command-line
I found this command line (suggested above) really, really useful:
$ nmap --script ssl-enum-ciphers -p 443 www.google.com
Make sure nmap is fairly up-to-date, otherwise you will miss too many interesting bits – seriously! The script only knows the protocol versions and ciphers that existed when your nmap was released; older releases, for example, know nothing about TLS 1.3. In other words: it makes no sense at all to work like this with an outdated nmap.
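If nmap is not at hand, plain openssl can at least probe individual protocol versions – a rough sketch, assuming an OpenSSL recent enough to know the -tls1_3 option; watch the “Protocol” and “Cipher” lines in the output, and note that a version the server refuses ends in a handshake error:
# probe one protocol version per call
$ openssl s_client -connect www.google.com:443 -tls1_2 </dev/null
$ openssl s_client -connect www.google.com:443 -tls1_3 </dev/null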
-
details of sshd (Secure Shell) – as seen from inside
- https://en.wikipedia.org/wiki/Secure_Shell
- “sshd -v” (or sshd with any other illegal option) makes sshd print its usage banner, which includes the versions of OpenSSH and OpenSSL it was built against
- “sshd -T” (extended test mode) dumps the complete effective configuration of your sshd, i.e. every option with the value actually in effect
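The “sshd -T” output is one lowercase keyword–value pair per line, so grep quickly extracts what you care about – a sketch; run it as root, because sshd has to read its host keys:
# show only the offered crypto algorithms
$ sudo sshd -T | grep -E '^(ciphers|macs|kexalgorithms)'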
-
URL decoding
URL decoding is the inverse process of URL encoding. It is used to parse query strings or path parameters passed in URLs. It is also used to decode HTML form parameters that are submitted in the application/x-www-form-urlencoded MIME format.
URLs, as you might know, may only contain a limited set of characters from the US-ASCII character set: letters (A-Z, a-z), digits (0-9), hyphen (-), underscore (_), tilde (~), and dot (.). Any character outside this allowed set is encoded using URL encoding, also known as percent encoding.
This is why it becomes necessary to decode query strings or path parameters passed in URLs to get at the actual values.
Your browser may well display URLs with umlauts and other characters not listed above – but if you copy such a URL into a text area, it arrives “URL encoded”, which is usually not what I want in that text area, e.g. in my text editor. Far too often I have fixed those URL-encoded strings manually – but recently I came across the tool listed above, and I quite like it for this purpose.
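If no web tool is at hand, Python’s standard library decodes just as well – a minimal sketch; the umlaut URL is a made-up example:
$ python3 -c 'import sys, urllib.parse; print(urllib.parse.unquote(sys.argv[1]))' 'https://de.wikipedia.org/wiki/Stra%C3%9Fe'
This prints https://de.wikipedia.org/wiki/Straße. For application/x-www-form-urlencoded form data, urllib.parse.unquote_plus() is the better fit, since it additionally turns “+” into spaces.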
-
robots / web harvesters eating up my Internet traffic volume
- https://wiki.hostsharing.net/?title=Traffic – recognise, analyse, block, …
- …
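To see which requests actually eat the volume, it helps to aggregate the access log first – a sketch, assuming the usual Apache common/combined log format (field 7 = request path, field 10 = response size in bytes):
# sum the bytes sent per request path, biggest consumers first
$ awk '{ bytes[$7] += $10 } END { for (p in bytes) print bytes[p], p }' $HOME/var/web.log | sort -rn | head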
My 2 top volume killers:
… wp-cron.php?doing_wp_cron …
(seen in $HOME/var/web.log)
That is functionality of my own WordPress site(s) – rather surprising that my own software “kills” me. It is actually my top “volume killer”. The problem is described here:
… wp-content/themes/twentytwenty/assets/fonts/inter/Inter-upright-var.woff2 …
(seen in $HOME/web.log)
That’s a font file that gets downloaded excessively from my WordPress site(s). The problem is described here:
I zeroed the font file in question – after backing up the original. (In fact I accidentally zeroed the other font file w/o backing it up.)
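“Zeroing” here simply means truncating the file to zero bytes while keeping its name, so WordPress still finds it but ships nothing – a sketch; the absolute path is hypothetical, adjust it to your DocumentRoot:
# hypothetical DocumentRoot – adjust to your installation
$ cd /var/www/html/wp-content/themes/twentytwenty/assets/fonts/inter
$ cp Inter-upright-var.woff2 Inter-upright-var.woff2.orig   # backup first!
$ : > Inter-upright-var.woff2   # truncate to 0 bytes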
I can reverse the effect of zeroing that font file by
- de-activating the WordPress theme in question, and then
- removing, re-installing, and re-activating that theme – scriptable with WP-CLI, see below.
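With WP-CLI that reinstall cycle can be scripted – a sketch; strictly speaking a theme is not “de-activated” but replaced by activating another one (twentytwentyone is just a stand-in):
$ wp theme activate twentytwentyone          # switch away first
$ wp theme delete twentytwenty               # remove the zeroed files
$ wp theme install twentytwenty --activate   # fresh copy, re-activated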