Picking a blogging platform
I’ve been bouncing around between blogging platforms, and realized I should really settle down and stick with one. This is mostly a page to track my decision-making, and isn’t meant to be an objective comparison of different platforms. The one thing I’ll be limiting myself to is static site generators. They’re light, easy to write for, and I can keep my blog in source control.
The markup language
I’d really rather avoid having to learn a whole new markup language, so that effectively limits me to either Markdown or ReST.
Using shush as a crontab wrapper
Cron is a great tool for Linux servers, but it's very limited in its capabilities (since it follows the Unix philosophy). When I started to run up against those limits, I began doing all sorts of bash trickery to accomplish what I needed, but that swiftly started giving me even more problems. At work, I use the Jenkins CI tool as a cron replacement (a great tool: distributed runs, queued tasks, emails on failure, etc.), but it seemed rather heavyweight for a homelab.
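That's where shush comes in: it's a small wrapper that runs a command, captures its output, and mails it to you based on criteria you configure. The crontab side is as simple as the line below (the job name is illustrative; the job's actual command, recipients, and match criteria live in shush's own configuration file, so check its manpage for the exact syntax):

```
# Run the nightly backup through shush instead of invoking it directly
0 4 * * * shush backup
```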
Flashing LSI SAS 9211-8i with EFI
I recently went on an upgrade crusade through my homelab, and as part of that, upgraded FreeNAS to 9.3. When I did, there was a non-urgent alert about a driver/firmware version mismatch for my LSI HBA (FreeNAS expected version 16, the card had 12). Thus, I decided to upgrade the firmware.
Directions
This assumes that your server can boot directly into an EFI shell. Some motherboards might require a standalone Shell.efi, but I can't tell you much more than that, as mine boots straight into EFI.
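For reference, the core of the procedure looks roughly like the following EFI shell session (the firmware and boot ROM filenames depend on the exact firmware package you download, and the `-e 6` full erase is the commonly documented crossflash step, so double-check everything against LSI's own instructions before running it):

```
fs0:                                             # switch to the USB stick
sas2flash.efi -listall                           # confirm the controller is visible
sas2flash.efi -o -e 6                            # erase the existing flash
sas2flash.efi -o -f 2118it.bin -b mptsas2.rom    # write the firmware and boot ROM
```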
Using Heritrix to archive sites to a directory structure
So one day I found myself in the market for a good web archiver. Specifically, there were some interesting open directories I wanted to mirror. My ideal solution would be a web front end around wget, but a little bit of research and testing showed that such an architecture would be too simplistic for the level of detail I wanted. There were a couple of spider frameworks I tried out, like Scrapy, but I wasn't enthusiastic about the prospect of rolling my own solution when I knew sites like the Internet Archive had the exact kind of thing I had in mind, and they use the Heritrix engine to archive their material.
Check_mk and FreeNAS, Pt. 3
A continuation of Check_mk and FreeNAS, Pt. 2
Now that we have all our SMART data nicely set up, let's see if we can't get some stats on I/O speed. I'm pretty sure FreeNAS is supposed to have an I/O entry in its "Reports" section, but for whatever reason it's not in my install, and I'd like to have the data in Nagios in any case.
Just like with the SMART data, we're going to write a small script that the check_mk agent can use.
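A sketch of the approach (the section name `freenas_iostat` and the iostat column handling are my own assumptions, not necessarily the post's exact script):

```shell
#!/bin/sh
# Sketch of a check_mk agent plugin for disk I/O on FreeNAS/FreeBSD.
# check_mk agents frame custom data in <<<section>>> blocks.

emit_iostat_section() {
    echo '<<<freenas_iostat>>>'
    # iostat prints two header lines before the per-device rows; skip them
    awk 'NR > 2 && NF { print }'
}

# On the NAS this would be:  iostat -x -d | emit_iostat_section
# Demo with canned iostat output so the sketch is self-contained:
printf '%s\n' \
    '                        extended device statistics' \
    'device     r/s   w/s    kr/s    kw/s qlen svc_t  %b' \
    'ada0       1.2   3.4    56.7    89.0    0   1.1   2' \
    | emit_iostat_section
```

Dropped into the agent's plugins/ directory, the output of a script like this shows up as a new section on the monitoring side.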
Check_mk and FreeNAS, Pt. 2
A continuation of Check_mk and FreeNAS, Pt. 1.
So I've got check_mk set up on my NAS, and it's monitoring things beautifully. However, it's not monitoring something very near and dear to my heart on this server: S.M.A.R.T. data. FreeNAS comes with smartctl, and there are already S.M.A.R.T. data plugins for the Linux agents, so I figured this wouldn't be a big deal. And I was right! All I had to do was add the following script to my plugins/ folder for check_mk to find, and the server picked it up automatically.
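The general shape is something like this (my own approximation, not the exact script from the post; the device list and the number of smartctl header lines to skip are assumptions):

```shell
#!/bin/sh
# Approximate check_mk SMART plugin for FreeNAS (illustrative only).

format_smart() {
    # Prefix each attribute row from `smartctl -A` with its device name;
    # assumes the first 7 lines of smartctl output are headers.
    awk -v dev="$1" 'NR > 7 && NF { print dev, $0 }'
}

echo '<<<smart>>>'
for disk in /dev/ada0 /dev/ada1 /dev/ada2; do
    if [ -e "$disk" ]; then
        smartctl -A "$disk" | format_smart "$disk"
    fi
done
```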
Check_mk and FreeNAS
Note
Software involved:
FreeNAS 9.2.0
OMD 1.10 (check_mk 1.2.2p3)
FreeNAS is great, and the web interface makes it easy and simple to understand my NAS's overall structure. However, my favored method of monitoring in my homelab is OMD with check_mk, while FreeNAS prefers a self-contained collectd solution. We're in luck, however: FreeNAS is heavily based on FreeBSD, which check_mk happens to have a plugin for, so it shouldn't be too hard to set things up the way I like them.
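The agent side boils down to dropping check_mk_agent onto the box and exposing it through inetd, roughly as below (the agent path is an assumption, and FreeNAS keeps much of its filesystem read-only, so persisting these entries across reboots takes extra care):

```
# /etc/services -- give the agent its conventional port
check_mk        6556/tcp   # check_mk agent

# /etc/inetd.conf -- have inetd spawn the agent per connection
check_mk stream tcp nowait root /usr/local/bin/check_mk_agent check_mk_agent
```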
BASH Documentation
One of the things that always bothers me is a lack of proper documentation. Now, I’m lazy just like everyone else, but if I’m going to document something, I prefer to do it properly and keep it up to date. I’ve inherited a nice suite of bash scripts, which aren’t really complicated, but they all have the same copy-and-pasted header dated from 2003. Not exactly helpful.
So while I have a wiki that explains how some of the processes work on a higher level, it would be nice to have clean documentation in my bash scripts.
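The shape I'm after is something like this (the script name, header fields, and the function itself are all just illustrative):

```shell
#!/usr/bin/env bash
# report_sync.sh -- illustrative example of the documentation style I want.
#
# Purpose : joins report names for the summary e-mail
# Author  : whoever actually touched it last, not a 2003 ghost
# Updated : kept current in source control, not in this header

##
# join_by: join all remaining arguments with the delimiter in $1.
# Example: join_by ", " daily weekly  ->  "daily, weekly"
join_by() {
    delim="$1"; shift
    out="$1"; shift
    for item in "$@"; do
        out="${out}${delim}${item}"
    done
    printf '%s\n' "$out"
}

join_by ", " daily weekly monthly
```

A per-function comment block like this is cheap to keep accurate, since it sits right next to the code it describes.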
Fun with Basic PHP Optimization
A while ago I came across a full-featured PHP application for controlling a daemon. It worked well with a small data set, but quickly became laggy with a dataset numbering in the thousands. Admittedly, it really wasn’t built for that kind of load, so I removed it and controlled the daemon manually, which wasn’t a big deal.
Then, a while later, I came across a post by someone who managed to mitigate the problem by shifting a particularly expensive operation to an external Python program.