Pocket Trick

Firefox’s Pocket browser plugin is great for capturing web content to read later. Also, check out this trick — if you happen to be checking out pages that you’re not sure about, and trying to protect yourself by browsing with JavaScript disabled, sometimes those pages seem to never finish loading. Pocket may help there too. It’s possible the site’s content has loaded enough for Pocket to grab the text, even if the page seems to be waiting on you.

It does this by looking at the HTML source code. You could read that as well, by right-clicking on the page and selecting “View Source”, but the nice trick about Pocket is that it formats everything all nice and pretty to read, so you don’t have to be your own HTML parser.

The case of the missing docker volume

Contain yourself…

Containers, or docker containers in my case, can sure be fun. There are so many moments of instant gratification, which are especially powerful when linking multiple orchestrated images and firing it up with docker-compose up. Suddenly, a stack of services that may have taken days to set up with multiple virtual machines or physical boxes is conjured up and running on a single machine. Then, you just get to play. However, what if that stack is so fun, and maybe so useful, that it seems like you may want to keep it going? Well then, of course we need to start thinking about how to make that ephemeral stack have some persistence.

Detecting changes over time

In a recent case, I was playing with a docker compose script that comes from the folks at Wazuh, which packs up the OSSEC HIDS along with an ELK stack. I had experimented with this a few times in the past, but I was considering leaving a Wazuh stack running for a while. (Having just set up a new Windows laptop, and thinking about documenting its initial state, it seemed like a good job for OSSEC. Maybe overkill, but frankly, it’s just how I do.)

So, I thought it was a good time to try out Wazuh again, and put docker to the test while I was at it.

A Window into Docker

While my home lab is a mix of operating systems, the machine I have been using for a virtual machine “server” is a Windows 10 Pro XPS with an i7 and 32 GB of RAM. Though my first experience with docker was actually on Windows 8.1 and Windows Server 2012, I really have preferred using Docker on Linux for the most part. There’s just something I’ve never really felt comfortable with about that GUI entry point, for some reason. Also, the need to share the whole C drive in order to mount drives just feels icky. Am I crazy? Sorry, I digress. The point is, I was reluctant to make this my main VM server and to use Docker along with it, but Hyper-V has been a solid choice for me in the past, I know it pretty well, and there are so many resources for troubleshooting, so I went with it.

Why is this important? Well, it all comes back to that pesky mounting-of-data thing I mentioned. As I was testing the setup, I thought I’d make a few volumes to persist the data, but I decided to just let docker handle it for a bit, with one exception: I chose to edit the docker compose yaml file and add my own named volumes, one for OSSEC data (/var/ossec/data) and one for the configs (/var/ossec/etc).
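For context, the edit looked something like this. This is just a sketch, not Wazuh’s actual compose file — the service name, image, and volume names here are illustrative:

```yaml
# Sketch: adding named volumes to a docker-compose file so OSSEC's
# data and config directories persist across container rebuilds.
# Service, image, and volume names are hypothetical.
services:
  wazuh:
    image: wazuh/wazuh
    volumes:
      - ossec-data:/var/ossec/data
      - ossec-etc:/var/ossec/etc

# Declaring the volumes here makes them named volumes managed by docker,
# instead of anonymous volumes with random names.
volumes:
  ossec-data:
  ossec-etc:
```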

Everything seemed to work just fine.

It wasn’t fine

One day, it seemed to me that the named volumes didn’t persist. I had been making some changes to the host system, and had to turn off the Docker Desktop service. When I brought the machine back up, everything worked, but my data was not there. It was like a new installation. What happened to my volume? The problem, I’ve come to believe, is more a problem of timing. I think the initial spin-up I did didn’t actually use my named volumes. So, what did that mean? Well, it meant that the docker volumes with massive random names probably had my data.

First, I had to see if I could even find my volumes. Inspecting the volume (docker volume inspect <volume name>) showed the path as /var/lib... which, on a Windows 10 machine, didn’t make much sense. So, I figured (and verified online) that the .vhdx drive that Docker uses is where I needed to look.
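As a sketch of what that lookup returns: below is a trimmed sample of `docker volume inspect` output (the volume name is hypothetical, and the JSON is hard-coded rather than queried live), with the Mountpoint pulled out using standard tools:

```shell
# Trimmed sample of what `docker volume inspect <name>` returns;
# the volume name here is hypothetical.
inspect_json='[{"Name":"wazuh_ossec-data","Mountpoint":"/var/lib/docker/volumes/wazuh_ossec-data/_data"}]'

# Extract the Mountpoint field. On Windows, this Linux-style path lives
# inside Docker Desktop's .vhdx disk, not on the host filesystem.
printf '%s\n' "$inspect_json" | sed -n 's/.*"Mountpoint":"\([^"]*\)".*/\1/p'
```

On a live system you’d skip the hard-coded JSON and use something like `docker volume inspect <name> --format '{{ .Mountpoint }}'` instead.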

Next time: how did I go looking for that data? Did I find it? And once I did, how would I get it back into the machine it belongs in? To be continued…

Capture a pcap from a router (Ubiquiti EdgeRouter X)

Remote connection

Perhaps you’ve used Wireshark to capture packets on your laptop or PC. But what if you need to troubleshoot your router? A quick way to grab a dump is a simple ssh/tcpdump combo. It’s easy to do, and can be done remotely.

First, your router must accept ssh connections for this method. In the case of EdgeMax router software, the settings to enable are in the “System” tab, found at the bottom of the web-based admin portal.

SSH and saving the pcap file

$ ssh $(router_ip) "sudo tcpdump -s 0 -w -" > ~/Desktop/router_capture.pcap

The above command basically does it, and you can stop the capture with a Ctrl-C. Until then, you’ll be capturing packets remotely from your router, right onto your desktop. However, note that you’ll need to change $(router_ip) to your router’s IP address or hostname. You will likely be asked for passwords, both to connect to the router and to use sudo.

You could change the ‘~/Desktop/router_capture.pcap‘ path to anywhere on your computer. However, I recommend saving it somewhere only you can access. It may have sensitive info.

Parsing the file

This packet capture file will be readable by Wireshark, if this is all accomplished with a modern enough system. However, if you prefer NOT to have that format, and perhaps just want to ‘grep’ or read the text output, you can remove the ‘-w -‘ part of the command. In addition, you could pipe the tcpdump output through a ‘grep’ first, to limit the amount of data. For instance:

$ ssh $(router_ip) "sudo tcpdump -s 0 -l | grep -v '192\.168\.0'" > ~/router_capture.txt

The above will remove lines that have “192.168.0” in them from the output. However, I don’t suggest this, as Wireshark will be much more powerful for searching and finding issues you might be having.

Apdex – a simple measure of happiness

Rabbit Hole #1

So, I was just poking around websites, checking out the source code as I do from time to time when I’m curious, and noticed some New Relic JavaScript. I had not looked at New Relic for a while, and I’ve never had the occasion to use the services, so I thought I’d check them out for a minute.

Side note

(Really quick context for people that have no idea what I’m talking about. On web pages, you can right-click and from the menu choose to look at the source code. There you see JavaScript code, which is the part of the web page code that tends to make the page feel “fluid” and “dynamic”. It sends and receives data as you do stuff on the page. For instance, as you type in search fields, the results start showing up so you don’t have to hit enter. Or it might load more of the page and data as you scroll down. New Relic is a service that helps monitor those web pages for problems, using JavaScript code to gather and send information [errors, data, etc.] to their servers. They can then aggregate details and report to the owners of the website. Go check it out. For those who need this service, I’m sure it’s very worth it. https://newrelic.com)

Rabbit hole #2

Back to the “Apdex” part of all this. As I hit the New Relic page and scrolled, that term jumped out at me. I thought, “huh, ‘Apdex’, I’m not sure I know what that is…”, so down another rabbit hole I went. Turns out it is short for Application Performance Index. It’s this sort of beautiful little formula and metric at its core. Frankly, within the first 10 seconds of looking into it, I was reflecting on just how rare it is to see something this simple.

It is a method of ranking an experience from 0 to 1. Or, if you want to think of it in percentages, 0-100%. Apdex.org clearly wants this to be more than just for web pages, but for the sake of tying this to my little research journey, think of this metric being used for websites. Specifically, how much the user liked the experience of using the site. So,

Apdex = ( Fully Happy + Content/2) / Total Visits

Taking a step back

Clearly, this isn’t going to be super illuminating, in-depth information with lots of details all on its own. Consider this though: survey 100 people and most of them would give you an answer to “how satisfied are you?” Intuitively, I wanted there to be more to this equation when I first saw it, but it grew on me. This detail over 100 visits an hour would be very illuminating, especially over time or as new changes were made to the site. And perhaps comparing from page to page, you’d find a clear dip on a page or two.


Alright, at this point you probably either agree with me, or you don’t. Or you just don’t really care. Hey! We can take that Apdex!

Let’s say 5 people agree, 3 people don’t, and 8 readers got this far but are kinda bored.

56.25% = ( 5 + 8/2 ) / 16
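That arithmetic is easy to sanity-check with a throwaway shell function (awk handles the floating point; the function name and argument order are just my own sketch):

```shell
# Apdex = (satisfied + tolerating/2) / total samples
# Arguments: satisfied-count, tolerating-count, total-count
apdex() {
  awk -v s="$1" -v t="$2" -v n="$3" 'BEGIN { printf "%.4f\n", (s + t/2) / n }'
}

apdex 5 8 16   # (5 + 8/2) / 16 = 0.5625, i.e. 56.25%
```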

Hmm, not a great score… I’ll have to add some memes next time.

Why I chose ProtonMail for my Domain SMTP service

SMTP with protonmail

Like many in and around the infosec community, I’ve followed privacy issues as they relate to security for a while. When I first heard about protonmail, I believe it was in the wake of the Snowden revelations. Now, let me be clear: my primary email address is a Gmail account, and I also have a grandfathered-in G Suite account for another domain, but I needed to set up SMTP for beaukinstler.com, even if just to redirect it.

SMTP at ProtonMail

Nowadays, the choices for free personal domain email seem rather limited. I could use the ISP’s offering and pay, or I could switch registrars and maybe get some free email service, and there’s Zoho (which I tried, but I didn’t get their confirmation text for about an hour, which didn’t bode well). So ultimately, I decided to accomplish two things with my need: I upgraded my protonmail account to a paid version and added my domain (beaukinstler.com) for email use, which was pretty simple.

Because of this, I now support, with some dollars, an organization that is doing great work for securing information. If you too are interested in really secure email, both in transit and at rest, then check them out. ProtonMail.com or ProtonMail.ch (Swiss).

Helping protect your recipients

One other point. I think people often give up on email as a secure form of communication, especially when it’s not B2B, because they just can’t really rely on how others have their email set up. PGP, S/MIME, etc. aren’t easy. I’ve set them up, but never use them. But with ProtonMail, you can send emails to others that DON’T have a secure account, and give them a password out-of-band. They can even email you back via that message. And if you know the recipient of your email needs a file or a message that would be bad for others to see, but they’re for any reason unable to set up a secure system for themselves, this is something you could use to help them. It’s really slick.

Quick look

Here are a couple screenshots to give you the idea:

Enable encryption on the message you’re going to send

add encryption to email


Create a password, and if you like a hint!

setting password in protonmail

For me? It could be. Honestly, I have hardly ever needed this level of privacy, but I’m lucky. I live in a place and have a life that I’m not required to be secret about. If I NEED it, it’s usually for some tax records, or something mundane. But there are people who need this level of protection, along with things like Tor, and I’m happy that my need for SMTP services can help support them.