Drew Stephens

Perl and Dubstep

In the past year, I have come to quite like dubstep, a burgeoning electronic music genre; perhaps that is a more interesting way to introduce an article than most. Dubstep's current phase reminds me of generalized techno (mostly house & dance) during the mid-90s: a huge variety of styles, many small artists, and much of it distributed through non-traditional channels. In the late 90s, that distribution happened via early filesharing networks, most notably Napster. For dubstep, YouTube, Mixcloud, and Soundcloud seem to be the preferred ways of getting new tracks out.

To chronicle and share this growth of a genre, I started @DailyWub, a Twitter account that posts a new dubstep track every day. Being an engineer, I found the idea of manually keeping a queue and posting a track every day to be a dreadful task. For some time, the account has been powered by Buffer, a simple webapp that lets you create a queue of tweets that are metered out on a schedule of your choosing. Buffer is OK, but it limits the queue to 10 tweets and at some point started shortening URLs even when not needed, which breaks the YouTube thumbnails in many Twitter clients. Having a queue that is regularly plucked from and emitted to Twitter is a fairly simple operation, so I wrote my own program to do it: Net::Twitter::Queue.

Net::Twitter::Queue is a simple Perl module that employs Net::Twitter to do the heavy lifting. To use it, I have a queue of tweets in a file, tweets.yaml:

- Caspa - Where's My Money? http://www.youtube.com/watch?v=myZU2DZoD9w
- Skrillex - First Of The Year http://www.youtube.com/watch?v=2cXDgFwE13g
- Rusko - Everyday http://www.youtube.com/watch?v=xDAX2aVWAag

When run, Net::Twitter::Queue will remove the top item from that YAML file and post it to Twitter using the account information specified in config.yaml:

consumer_key: [consumer_key]
consumer_secret: [consumer_secret]
access_token: [access_token]
access_token_secret: [access_token_secret]

Where do those values come from? Two places: the consumer information is on the page for your application at dev.twitter.com (go ahead, make one!) and the access tokens are specific to the account you want to post as. To generate them, I used Twurl. With the consumer key & secret in hand, simply run Twurl:

Titus:~/$ twurl authorize --consumer-key [consumer_key] \
--consumer-secret [consumer_secret]

Twurl will respond with a URL that you can visit in a web browser, login to Twitter with the account you want to post as, and get a PIN back. Give the PIN to Twurl and it will complete the authentication process, saving the access token & associated secret in your ~/.twurlrc. Grab those two, toss them into config.yaml and run Net::Twitter::Queue from the directory that has config.yaml & tweets.yaml in it:

caligula:~/twitter/dailywub$ ls
config.yaml  tweets.yaml
caligula:~/twitter/dailywub$ perl -MNet::Twitter::Queue -e \
'$q=Net::Twitter::Queue->new->tweet'

Easy as that—the top entry in tweets.yaml has been popped and posted to Twitter.
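Under the hood there isn't much to it: shift the first entry off the YAML queue, post it with Net::Twitter, and write the remainder back to disk. The following is a rough sketch of that flow; it is illustrative rather than the actual Net::Twitter::Queue source, and the choice of the YAML module and the OAuth traits are my assumptions:

use strict;
use warnings;
use Net::Twitter;
use YAML qw(LoadFile DumpFile);

# Read credentials and the queue of pending tweets
my $config = LoadFile('config.yaml');
my $tweets = LoadFile('tweets.yaml');

# Take the oldest entry off the top of the queue
my $next = shift @$tweets;
die "Queue is empty\n" unless defined $next;

# Post it using the OAuth credentials from config.yaml
my $twitter = Net::Twitter->new(
    traits              => [qw/OAuth API::REST/],
    consumer_key        => $config->{consumer_key},
    consumer_secret     => $config->{consumer_secret},
    access_token        => $config->{access_token},
    access_token_secret => $config->{access_token_secret},
);
$twitter->update($next);

# Save the shortened queue for the next run
DumpFile('tweets.yaml', $tweets);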

Creating a Perl Module in Modern Style

One of the very best things about Perl is CPAN, a repository of modules to do everything from browsing the web to manipulating image files. CPAN provides a consistent method for installing modules (install cpanminus and then cpanm <Module::Name>) and the largest number of modules of any of the scripting languages. More often than not, what you are trying to do has already been done and exists as a CPAN module. In the event you are doing something new, the best way to give back to the community and get free help is to encapsulate your work and distribute it as a module on CPAN.

Historically, creating a module suitable for general consumption was a confusing affair. Tutorials from years past abound, each one longer than the last and each employing a different toolchain, making it hard to synthesize the common concepts. These days, however, things are much easier. In the past 6 years or so the Perl community has tried on a number of methods for building modules. The focus of this article will be a recent (circa late 2009…the Perl ecosystem takes a measured pace) build system, Dist::Zilla.

Whereas ExtUtils::MakeMaker and Module::Build are systems for building, testing, and installing a release, Dist::Zilla sits at a higher level. With Dist::Zilla, you create a single file that controls the build & test flow (using ExtUtils::MakeMaker under the hood) but also provides functionality for generating semi-boilerplate files (LICENSE, MANIFEST, META.yml) and releasing the code via CPAN. The configuration file, dist.ini, is easy to read and short, in contrast to prior build systems.

To start using Dist::Zilla, install it using the standard CPAN shell command (cpan -i Dist::Zilla) or with cpanminus (cpanm Dist::Zilla). You utilize Dist::Zilla through the command dzil; if that’s not in your path (which dzil), then you’ll want to find it and symlink it somewhere useful or add it to your path. Global setup of dzil is done by invoking dzil setup and answering the questions it poses. With that done you can very easily mint a new distribution:

Titus:~/sandbox$ dzil new Number::Cruncher
[DZ] making target dir /Users/dinomite/Dropbox/sandbox/Number-Cruncher
[DZ] writing files to /Users/dinomite/Dropbox/sandbox/Number-Cruncher
[DZ] dist minted in ./Number-Cruncher

In the newly created Number-Cruncher directory you'll find a lib directory containing Number/Cruncher.pm and a dist.ini:

name    = Number-Cruncher
author  = Drew Stephens <drew@dinomite.net>
license = Perl_5
copyright_holder = Drew Stephens
copyright_year   = 2011

version = 0.001

[@Basic]

At this point, code and tests are the only things needed to have a Perl module. Here’s some simple code for lib/Number/Cruncher.pm:

use strict;
use warnings;
package Number::Cruncher;

=head1 NAME

Number::Cruncher - crunch numbers

=cut

sub new {
    my $class = shift;
    my $self = {
        first => shift,
        second => shift,
    };
    bless $self, $class;
    return $self;
}

sub crunch {
    my $self = shift;
    return $self->{first} + $self->{second};
}

1;

And the associated test file that I created, t/number-cruncher.t:

#!/usr/bin/env perl
use strict;
use warnings;

use Test::More tests => 2;

BEGIN { use_ok 'Number::Cruncher'; }

my $first = 1;
my $second = 7;
my $cruncher = Number::Cruncher->new($first, $second);
ok (($first + $second) == $cruncher->crunch());

In three files—dist.ini, lib/Number/Cruncher.pm, and t/number-cruncher.t—we have a module. Run the test with dzil test and you’ll see Dist::Zilla build your module into a temporary directory and run the test suite from there:

Titus:~/sandbox/Number-Cruncher$ dzil test
[DZ] building test distribution under .build/mgJsQhkFi6
[DZ] beginning to build Number-Cruncher
[DZ] guessing dist's main_module is lib/Number/Cruncher.pm
[DZ] extracting distribution abstract from lib/Number/Cruncher.pm
[DZ] writing Number-Cruncher in .build/mgJsQhkFi6
Checking if your kit is complete...
Looks good
Writing Makefile for Number::Cruncher
cp lib/Number/Cruncher.pm blib/lib/Number/Cruncher.pm
Manifying blib/man3/Number::Cruncher.3
PERL_DL_NONLAZY=1 /usr/local/Cellar/perl/5.12.3/bin/perl "-MExtUtils::Command::MM" "-e" "test_harness(0, 'blib/lib', 'blib/arch')" t/*.t
t/number-cruncher.t .. ok   
All tests successful.
Files=1, Tests=2,  0 wallclock secs ( 0.02 usr  0.01 sys +  0.02 cusr  0.00 csys =  0.05 CPU)
Result: PASS
[DZ] all's well; removing .build/mgJsQhkFi6

To build a distributable tarball, just run dzil build:

Titus:~/sandbox/Number-Cruncher$ dzil build
[DZ] beginning to build Number-Cruncher
[DZ] guessing dist's main_module is lib/Number/Cruncher.pm
[DZ] extracting distribution abstract from lib/Number/Cruncher.pm
[DZ] writing Number-Cruncher in Number-Cruncher-0.001
[DZ] building archive with Archive::Tar::Wrapper
[DZ] writing archive to Number-Cruncher-0.001.tar.gz
Titus:~/sandbox/Number-Cruncher$ ls Number-Cruncher-0.001*
Number-Cruncher-0.001.tar.gz

Number-Cruncher-0.001:
LICENSE     META.yml    README      lib
MANIFEST    Makefile.PL dist.ini    t

If you already have a PAUSE account, you can use dzil release to upload that tarball to PAUSE for inclusion in CPAN. If you haven’t authored a Perl module before, request an account for uploading modules to CPAN. With PAUSE and Dist::Zilla, creating widely-available Perl modules is easy.

Rafter Pull-Up Bar

I quite enjoy exercising and, ever since I did parkour, I've really liked body weight movements. I don't do much in the way of parkour any more, but I do CrossFit religiously, which features a lot of pull-ups. I've got a doorway pull-up bar which is superb for just getting going in the morning; since they're only $25, I think everyone should have (and use) one. While having a pull-up bar inside the house is great, I do most of my real workouts in my garage—with easy access to running (the street), box jumps (wall in my back yard), and a barbell, I can do many different WODs. For WODs that involve pull-ups, I've been doing them on the rafters in the garage. While this works, the rafters are a bit too high to get to easily and they really strain my grip. I'm all for the workout being difficult, but I don't always want it to be training for climbing. To that end, I built a pull-up bar this weekend.

bolt detail It’s a pretty simple affair, largely based upon this instructional video. The pull-up bar runs perpendicular to a pair of rafters that are 4 feet apart, hanging a few inches below the bottom of the rafters. In my garage, the rafters are 2x6, which seem to be plenty sturdy enough even when I kip & do muscle-ups over the bar.

The materials I used for the bar:

  • A 48” long, 1” diameter pipe
  • Two 12” long 2x6s
  • Two 6” long 2x6s
  • Six 4” long 3/8” bolts
  • Six 3/8” nuts
  • Twelve flat washers
  • Eight 2.5” long wood screws
  • Construction adhesive

The only prep beyond cutting the wood is drilling the holes for the bar in the larger 2x6 pieces. Note that a nominal 1” pipe has a 1” interior diameter, so its outside diameter is larger (for standard pipe, roughly 1.3”) and you need a drill bit bigger than 1”; I used a 1¼” bit, which was still slightly small but worked with a bit of extra drill action. Once the bar holes were drilled, I made the holes for the 3/8” bolts on the opposite half of the 2x6, and corresponding holes in the rafters where the bar was to be mounted. My standard procedure for woodworking is to supplement my lackluster skill with adhesive, so I applied some construction adhesive with my caulk gun between the 2x6 and the rafter and then bolted it in place.

With the supports bolted in place, I slipped the bar itself through the holes and quickly discovered the need for something more. The bar itself is 48 inches long, and the space between my rafters is also 48 inches. Since I put the supports inside a pair of rafters, the bar was nicely flush with the outside of the supports, but this also meant it would only take a couple of inches of lateral movement for it to fall out. The solution was to fashion caps from some more 2x6 and secure them with wood screws.

Parallel Processing in PHP

Though not a first choice for long-running processes, many web shops end up writing daemons or batch processing scripts in PHP. As business grows, the need to process records more quickly to deal with traffic becomes an issue. Oftentimes, the processing is limited by something other than raw processing power, network latency and database query times being the usual slowdowns. When this is the case, the easiest way to increase throughput is with multiprocessing: multiple children that spread out the time spent waiting so as to fully utilize the processing power available.

To this end, I have created a simple framework for managing child/worker multiprocessing in PHP. As with other high-level interpreted languages, the most straightforward way to spin things up is to use fork(2) to create new processes. While not as Hardcore and Awesome as the lightweight threads that other languages provide, OS-level process creation isn't a huge hindrance if you code for it: make the child processes long running so as to mitigate the startup cost.

The framework is part of the Team Lazer Beez Open Source project—you can find it in the utility package. The entire thing is simple enough to fit in a single class, gosUtility_Parallel, the basics of which can be credited to chaos’ post on Stack Overflow.

Using the library is simple—extend gosUtility_Parallel and override the doWorkChildImpl() method:

// Include the Genius config file
require_once dirname(dirname(__FILE__)) .'/Core/testConfig.inc.php';

class Minimal extends gosUtility_Parallel {
    protected function doWorkChildImpl() {
        gosUtility_Parallel::$logger->debug($this->workerID . " started");
        usleep(2000000);

        gosUtility_Parallel::$logger->debug($this->workerID . " doing work");
        usleep(2000000);

        gosUtility_Parallel::$logger->info($this->workerID . " finishing");
        exit(1);
        return;
    }
}

This class creates simple workers that print a couple of debug messages with some sleeping in between, and then announce that they are done working. Now you can instantiate the class with a single argument: the number of children to run. gosUtility_Parallel will take care of all the details.

// Make with the go
$minimal = new Minimal(2);
$minimal->go();

If children exit with a non-zero status, the parent will spin up a replacement. The parent will continue to run until all children have exited normally, or it gets INT (say, ctrl+c) or TERM (the default signal sent by kill(1)), in which case it will pass that signal on to the children, ensure they shut down, and then end itself. gosUtility_Parallel provides ample logging information; running the above produces the following output:

INFO - Started worker 0 (pid 42093)
DEBUG - 0 started
INFO - Started worker 1 (pid 42094)
DEBUG - 1 started
DEBUG - 0 doing work
DEBUG - Checking worker 0 (pid 42093)
DEBUG - Checking worker 1 (pid 42094)
DEBUG - 1 doing work
INFO - 0 finishing
INFO - 1 finishing
DEBUG - Checking worker 0 (pid 42093)
INFO - Worker 0 (pid 42093) exited normally
DEBUG - Checking worker 1 (pid 42094)
INFO - Worker 1 (pid 42094) exited normally

gosUtility_Parallel provides a number of overrideable methods whose names explain their purpose: parentSetup(), parentCleanup(), and childCleanup(). Children can also get their $workerID and the $maxWorkers count, making processing based upon modular division trivial. The example parallel class in the distribution demonstrates some of these features:

// Include the Genius config file
require_once dirname(dirname(__FILE__)) .'/Core/testConfig.inc.php';

class Par extends gosUtility_Parallel {
    public function __construct($maxWorkers) {
        parent::__construct($maxWorkers);

        // Redefine the logger
        gosUtility_Parallel::$logger = Log5PHP_Manager::getLogger('gosParallel.Par');
    }

    protected function doWorkChildImpl() {
        gosUtility_Parallel::$logger->debug($this->workerID . " started");

        // Run until told not to
        global $run;
        while ($run) {
            gosUtility_Parallel::$logger->debug($this->workerID . " doing work.");
            usleep(2000000);
            if ($this->workerID == 0 && rand(0,10) == 7) {
                gosUtility_Parallel::$logger->info($this->workerID . " returning");
                return;
            }
        }
    }

    protected function parentCleanup() {
        gosUtility_Parallel::$logger->debug("Parent cleaning up");
    }

    protected function childCleanup() {
        gosUtility_Parallel::$logger->debug($this->workerID . " cleaning up");
    }
}

The example above runs out-of-the-box (provided your PHP was built with --enable-pcntl), so I encourage you to download the source and take it for a test drive.

Incidentally, if you're in the Perl world you can just use Parallel::ForkManager and be on your way.
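For completeness, here is a minimal sketch of that approach using Parallel::ForkManager; the two-worker limit and the ten pretend jobs are just for illustration:

use strict;
use warnings;
use Parallel::ForkManager;

# Run at most two children at a time
my $pm = Parallel::ForkManager->new(2);

for my $job (1 .. 10) {
    # In the parent, start() returns the child PID; move on to the next job
    $pm->start and next;

    # In the child: do the work, then exit via finish()
    print "Worker $$ handling job $job\n";
    sleep 2;

    $pm->finish(0);    # child exit status 0
}

# Wait for all outstanding children before the parent exits
$pm->wait_all_children;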

Learning About Nutrition

"Go Paleo" I have learned a lot about nutrition in the past few years, mainly fueled by my interest in fitness. Once I got beyond run-of-the-mill Globo Gym workouts by delving into truly challenging fitness like parkour and CrossFit, it became apparent that I would need to match exercise with proper nutrition in order to excel. Note: if you don’t give a shit about what I’ve done and just want to learn about nutrition, head to the bottom.

In The Beginning, There Was Parkour

Training at Primal Fitness was the first time I came across folks who offered dietary advice that wasn't focused on weight loss, something I haven't ever been interested in or needed. Right in line with the nature of the parkour community at the time, the focus of nutrition was pretty loose: eat more protein and less sugar. Like any athlete trying to build muscle, eating more protein than the standard American diet (SAD) provides is mostly a no-brainer. Weight lifting folks long ago figured out that protein was essential to building muscle, and in recent years it's become common knowledge since we all see Bros downing their protein powder. Less sugar has almost always been generally accepted as good nutrition advice…at least until it was pushed out by the blind fear of fat…but I'm getting ahead of myself.

The Next Level: CrossFit

I picked up CrossFit from hanging out at Primal Fitness, and largely as a way to get better at parkour. Parkour involves lots of short distance sprinting with long distance running and a large amount of gymnastic strength & jumping. What better to train such a diverse set of skills than the general purpose fitness focus of CrossFit? Indeed, Primal is also a CrossFit box in addition to being the first facility in the US with such a focus on Parkour. What does CrossFit have to say about nutrition? Quite a bit, and it’s a significant part of the famous World-Class Fitness in 100 Words:

Eat meat and vegetables, nuts and seeds, some fruit, little starch and no sugar. Keep intake to levels that will support exercise but not body fat. Practice and train major lifts: Deadlift, clean, squat, presses, C&J, and snatch. Similarly, master the basics of gymnastics: pull-ups, dips, rope climb, push-ups, sit-ups, presses to handstand, pirouettes, flips, splits, and holds. Bike, run, swim, row, etc, hard and fast. Five or six days per week mix these elements in as many combinations and patterns as creativity will allow. Routine is the enemy. Keep workouts short and intense. Regularly learn and play new sports.

Enter The Zone

The main dietary message from CrossFit HQ is that The Zone Diet is the best set of guiding principles for optimum nutrition. As much as it may seem like it on the face of it, this isn't just some marketing cross-promotion crap; much like CrossFit, you don't need to buy anything to eat Zone. The basic tenet is that 40% of each meal's energy (calories) should come from carbohydrates, 30% from protein, and 30% from fat (the food pyramid advises about 55:20:25). Zone prescribes that the carbohydrates you eat be of the low-glycemic-index variety: vegetables, whole grains, whole fruit, though some argue that this is offered as secondary to the macronutrient ratio that is the center of Zone.
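To make that ratio concrete, here is the arithmetic for a hypothetical 2,000-calorie day, using the standard energy densities of 4 kcal per gram for carbohydrate and protein and 9 kcal per gram for fat:

carbohydrate: 40% of 2,000 kcal = 800 kcal ≈ 200 g
protein:      30% of 2,000 kcal = 600 kcal ≈ 150 g
fat:          30% of 2,000 kcal = 600 kcal ≈ 67 g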

I gave Zone a solid try while doing CrossFit on my own in 2009 and really liked it—when I managed to stick to the relatively low carbohydrate formula for even a few days I had much more stable energy throughout the day and felt ready to tackle the workout of the day whether I decided to do it first thing in the morning or late in the evening. For a couple of years, I more-or-less followed the Zone and was pretty happy. When I moved back to DC, I started CrossFitting at Potomac CrossFit right around the time they were starting a Paleo Challenge, which encourages people to give a strict paleo diet a try for 30 days. While I didn’t participate in the challenge, I figured it was worth reading up on paleo and giving it a try since I had heard so much about it in the CrossFit community.

Where I End Up: Paleo

I picked up Robb Wolf's book, The Paleo Solution, and, after initially being put off by the self-help, anecdotal tone at the beginning, I was impressed with the scientific information and references provided later. Over the week I read the book, I quickly moved from "I'll give this paleo thing a bit of a try" to "I will only eat grass-fed beef and organic broccoli cooked in coconut oil". I found it so convincing in part because of the science, but also because of the back-to-basics origin of its ideas on nutrition. As anyone who has read widely on modern nutritionism knows, the dietary advice offered over the past 50 years has done nothing to make Americans, or Westerners in general, any healthier. Things like margarine were pushed as healthier replacements, only for us to find out later that partially hydrogenated fats are supremely deadly.

Paleo starts by saying, "Nutrition is so complex that we haven't come close to understanding it well enough scientifically to offer complete dietary advice." Instead, paleo bases its guidelines on what we evolved to eat: the foods that sustained humans for the hundreds of thousands of years prior to the rise of agriculture. Since we have arguably evolved very little in the roughly 10,000 years since the products of agriculture (grains, legumes) became the central part of our diet, looking back to what our evolution had us eating seems a very good start. We need not attempt to reenact the caveman lifestyle, but we can use the diets of our evolutionary ancestors as a logical framework for making nutrition choices in the modern world.

In so many words, that is my nutritional journey: I am now a complete paleo convert. After trying it for a month, I was absolutely hooked and, much like with CrossFit, I now try to tell everyone I meet about how awesome this paleo thing is. In addition to never suffering from blood-sugar-fluctuation-induced unhappiness, I am not only ready to tackle workouts whenever, but ready to absolutely own them. Beyond that, the dietary guidelines of paleo fall in line with a wide set of evidence showing that modern diets are wrong in so many ways that I really believe eating this way makes me greatly healthier overall.

Information on Nutrition

The original point of this post was not just to tell my story of nutritional discovery, but to let others know what they should read to understand nutrition. I never like to just tell people what they should eat, because that makes me just another guy hawking advice that Really Will Make Everything Better! Instead, I want to give folks the information to make their own decision, and I think the evidence points so strongly in one direction that anyone who does read up on it will end up in the same camp that I'm in.

If you're just getting interested in nutrition, then these are the articles you should read. If you don't care about nutrition, I would encourage you to at least read those by Michael Pollan: they paint a pretty grim picture of the food supply in the United States.

For those who want to know more, a good next step is to check out some documentaries:

To really understand what nutrition is about, book reading is in order:

  • Why We Get Fat – A much longer version of the Gary Taubes article above explaining the history & details of modern nutrition advice
  • The Omnivore’s Dilemma – Michael Pollan’s book that introduced nutritionism and its failings, and describes the industrial food system in detail, contrasting it with local agriculture
  • In Defense of Food – More on food & nutritionism. This is where “Eat food. Not too much. Mostly plants” comes from
  • The Vegetarian Myth – A very striking and thorough tearing apart of every angle of vegetarianism written by a former long-time vegan

If you really want to have a thorough understanding of the lipid hypothesis, Gary Taubes wrote another book, Good Calories, Bad Calories, which is basically Why We Get Fat with even more scientific evidence and more on why the calories-in, calories-out model of obesity doesn't work.

To get an idea of this whole Paleo thing that I have fallen in love with check out these things, in order of brevity (read: depth):

Building an Olympic Lifting Platform

CrossFit got me started on Olympic lifting and it is now one of my favorite things that I do in the exercise realm. The tools needed for CrossFit are few and generally inexpensive, but Oly lifting does require some significant outlay if you plan on doing it at home. One of the things you need is a platform upon which to perform lifts: it's important to have a stable, flat surface to stand on and the ability to drop weights. Before going through with building a platform, I dumped a light bar on my brick patio and in the back yard, both of which left marks. It was time for me to build a platform.

The basic arrangement of a platform is simple: a sufficiently wide patch of wood to stand on, flanked by rubber to absorb dropped weights. There are a handful of notes on building platforms online, most of which suggest two base layers made from full sheets of ¾” plywood, topped with a third half-sheet of ¾” plywood, with horse-stall mats on either side. Simple enough, but these instructions also usually state that the resulting device cannot be lifted by one person alone. I wanted the ability to at least flip my platform on end by myself to get it out of the way, so I took a different route. To keep the weight more reasonable, I used 3/8” (11/32” at your local lumber yard) plywood and reduced the fore-to-aft dimension from 8 feet to 6, making for an 8-foot-wide by 6-foot-long platform. In practice, this is plenty of space to perform lifts, even if you're moving at some real speed as part of a CrossFit workout.

Materials:

  • Two 6x8 foot sheets of cheap 11/32 plywood
  • Two 4x6 foot sheets of cheap 11/32 plywood
  • One 4x6 foot sheet of nice 11/32 plywood
  • Construction adhesive
  • 150 ¾ inch wood screws

Assembly is a very simple affair:

  1. Lay the 6x8 sheets with their long edges side by side (photo)
  2. Apply construction adhesive to half of 6x8 sheet pair and top with one of the 4x6 sheets
  3. Drill & screw edges every 6 to 10 inches, and put screws all over the interior as well—it’s OK to put screws anywhere, as you won’t be standing on this portion
  4. Top with weights to ensure layers bond evenly (photo)
  5. Apply adhesive to other half, top with remaining 4x6 sheet, and repeat screwing process
  6. Test fit 4x6 piece of nice plywood in the center of the previously assembled parts and mark edges
  7. Apply adhesive to the platform between marks, lay nice plywood down and secure with screws along front & back (4 foot) edges only
  8. Put a crapton of weights on top to ensure your platform comes out flat
  9. Once everything is dry, give it a coat of deck sealant to make it waterproof

The end result is a platform that can (just barely) be dragged by one person. If I were doing this whole thing again, I'd take one more foot off of each dimension, making for a 5 by 7 foot platform and saving significant weight in the process. I think this would still leave enough room for any sort of lifting.

Watching iPad Applications

Shortly after the release of The Daily, Andy Baio created The Daily: Indexed and, more importantly, described how he created that index in a blog post. The crux of his reverse engineering of The Daily app was Charles, which he describes how to use in the aforementioned blog post. Since reading that post, I’ve wanted to explore a number of applications on my iPad and iPhone to see what they’re really doing when they cause the network indicator to spin.

First things first, I set up Charles and started up Reeder, an RSS feed reader that integrates tightly with Google Reader. My main interest was to see when it actually marked posts as read: I often read in short spurts on my iPhone, which results in pulling up a post only to switch out of Reeder or lock my phone seconds later. Sometimes the posts would be marked read if I pulled up Google Reader, but sometimes they were still marked unread. Was this a network latency problem when the phone was using 3G/EDGE internet, or was Reeder doing some fanciness with when it marked posts read?

I didn't get to my goal right away, because the first thing I noticed upon starting up Reeder is that it hits the original blog for every single one of the feeds that I subscribe to in Google Reader. As someone who hasn't cleaned out the many subscriptions to blogs that don't post anymore, that was a few hundred feeds. I shouldn't have been surprised by this, as it doesn't really make sense for Google to pull all of the content for all of those blogs and package it up for my convenience.

Getting back to my initial focus, Reeder attempts to mark a post read as soon as you open a post—any failure of a post being marked read is because the network was slow or inoperative at the time you were reading. Reeder periodically refreshes all of your feeds, as indicated by the spinning icon on the iPad or the replaced battery display on the iPhone, but it actually spends much longer doing this than the icon’s state would lead you to believe. From day-to-day usage the update to the feeds I care about (read: those that actually have updates) is done in short order, but Charles reveals that Reeder is still pulling data from individual websites. My guess is that Reeder pulls the feed list from Google, gets the new posts mentioned therein, and then proceeds to do its own checking of feeds.

Reeder was the only app that had really crossed my mind after Andy Baio’s post, and it fulfilled my desire to experiment with Charles, which is a very good tool that I’ll turn to if I have future questions that need answering.

Paleo Egg Muffins

There are a number of recipes for paleo muffins, which are a great way to get fast paleo food in the morning. As I usually do with cooking, I created my own recipe from the ones I found online. For 24 muffins, I assembled:

  • 18 eggs
  • 5 small sausages (breakfast size)
  • 2 large sausages
  • 1 red pepper
  • 1 large onion
  • ½ cup strained yogurt (Greek style)

The odd sausage arrangement is simply because that is what I had on hand—once cooked and crumbled it was about 2 ½ cups worth of sausage.

I began by cooking the sausage, followed by sautéing the diced onion & pepper in the fat that rendered out of the sausage. After cracking all of the eggs into a large bowl, I whisked them together with the yogurt. With all of that assembled, I combined the sausage with the pepper & onion and portioned it into a pair of muffin pans. The eggs are the last thing added before the oven, filling each cup ⅔ full. Into the oven for 15 minutes, rotate the top and bottom pans, and give them another 7-10. The bottoms of mine were a bit undercooked, so I might try putting the lower rack all the way down, rather than in the standard middle position.

Altering Many Directories at Once With CmdDirs

On any machine I use, I create a directory, sandbox, at the root of my home directory to hold checkouts of source code I'm working on. This directory often contains code from many different repositories, dozens of projects that I intermittently work on. Many of these repositories depend on others, in particular Java submodules for Clearspring, and I want to be able to easily update all of them at once. With Subversion this is easy: the svn command allows you to act upon a checkout without being in the directory that contains it. Simply issuing svn up * from ~/sandbox ensures that I have the latest code revision in each of my checkouts, and svn st * allows me to see if I have any uncommitted changes.

While I love Git, it does not make such actions this simple. Git requires you to be in the repository directory (or to set a number of environment variables) to work with that repo. While the -exec option of find(1) would allow me to descend into each directory and perform an action, I wanted to make this easy, because such all-checkout actions are something that I want to do a number of times each day. Like most problems, this one is (best?) solved with Perl. Enter App::CmdDirs.

CmdDirs is a fairly simple Perl app that I have written to do what I describe above–descend into any number of directories and perform a command in each one.

titus:~/sandbox$ ls
CmdAll                 mac-itunes             genius-os
scoreboard             GAE                    hf
uaParser               WebService-LOC-CongRec iTunes-Sync
titus:~/sandbox$ cmddirs "git st"
Performing `git st` in <cmdall>
## master
?? App-CmdDirs-1.00.tar.gz

Performing `git st` in <itunes-sync>
## master

Performing `git st` in <uaparser>
## master
 M uaParser/test/test_user_agent.py

Performing `git st` in <webservice-loc-congrec>
## master

See the numerous directories? Note that there are 9 directories in my sandbox, but git st was only performed in a few of them: those that are Git repositories. CmdDirs has a modicum of intelligence: if it knows what your command is, the command will only be performed in applicable directories. This can be overridden with -all, -git, or -svn doing what you expect. Git and Subversion are the only two things supported right now, because that's all I have a need for. Writing new Traversers is simple: just copy the form of git.pm or svn.pm. You can probably manage that even without knowing Perl.
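To give a sense of how the traversal works, here is an illustrative sketch of the core loop (this is not the actual App::CmdDirs source; the .git/.svn detection shown is an assumption based on the behavior described above):

use strict;
use warnings;
use Cwd qw(getcwd);

# The command to run in each checkout, e.g. "git st"
my $command = shift @ARGV or die "Usage: $0 <command>\n";

# Guess which kind of checkout the command applies to
my $marker = $command =~ /^git\b/ ? '.git'
           : $command =~ /^svn\b/ ? '.svn'
           :                        undef;

my $top = getcwd();
opendir my $dh, $top or die "Can't open $top: $!";

for my $dir (sort grep { -d "$top/$_" && !/^\./ } readdir $dh) {
    # Skip directories that the command doesn't apply to
    next if defined $marker && !-e "$top/$dir/$marker";

    print "Performing `$command` in <\L$dir\E>\n";
    chdir "$top/$dir" or next;
    system $command;
    chdir $top or die "Can't return to $top: $!";
    print "\n";
}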

Here’s a one-liner for installing cmddirs:

curl https://github.com/dinomite/CmdDirs/raw/master/bin/cmddirs > ~/bin/cmddirs && chmod a+x ~/bin/cmddirs

Crapcan Racing Tips

Drivers' schools are incredibly fun and the best way to improve your driving prowess. Track days can be very useful for honing your skill once you have been on track enough to catch your own mistakes. While both are great tools for learning to drive fast, neither can touch the 24 Hours of LeMons for having fun with motorsports. I have previously written about the basics of getting to a LeMons race, and what happened at our first race. The team just finished another race, the 2010 Arse-Freeze-Apalooza at Buttonwillow, this past weekend, where we finished 68th of 173. Finishing better than half the field made us happy, especially because the head gasket blew with an hour and a half left in the race.

Track driving is certainly a specialized skill and racing is a step beyond that—in addition to driving a car at the limit, you have to deal with numerous other cars on track who may decide to pass you at any time. In crapcan racing things are even crazier because most folks don’t care much if the car gets hit and many of them have little to no experience driving on a racetrack, much less in anger.

Track Experience

The biggest piece of advice I can give is to get some track experience before heading out on track in your first LeMons race. Street driving, or even autocross, shares very little with driving on a racetrack, and there is no adequate preparation aside from sufficient time on a track. Even if you have done a LeMons race, going to a much less crowded HPDE or other track event will be very informative; rare are the times you get to take an unmolested line, much less a full lap, at a race.

Many Races

The first time you go to a 24 Hours of LeMons event, you'll find at least three different types of race teams: those who built something that doesn't belong on a racetrack; the teams there to win; and the teams that are just there to have fun in any way possible. Don't get me wrong, everyone is there to have fun, but the first two have very specific goals in mind. The folks who bring a Fiat 600 or a limousine aren't interested in winning the race outright; they're looking for the Index of Effluency, or simply testing their own mechanical mettle.

By the same token, some teams come with the intent to win the speed race—they have a reliable, quick car, know how to do fast pit stops, and don’t intend to spend time in the penalty box. If it’s not obvious who these teams are, check the time sheets after a couple hours of racing; they are the ones on the lead lap or just behind. Figure out which cars belong in this group, do your best to stay out of their way, and certainly don’t hit them.

Caution Wave

If you’ve done other racing or track days, one of the first things you’ll notice about crapcan racing is the number of yellow flags thrown. When you have hundreds of $500 cars on a racetrack with inexperienced drivers, problems happen often. At the Arse-Freeze-Apalooza, Jay said that the recovery crew did 75 tows on Saturday alone. Oftentimes, cautions will be thrown well ahead of what you can see if you’re closely following a pack of cars. For this reason, you’ll see folks who have been racing a while throw up their hand by the rear-view mirror and wave when they see a yellow flag. This way, everyone behind knows that they are going to slow for caution, and you should too.

Photos by Marshall Pierce.