Introducing cyborg.txt: robots.txt utilities for Node

In short: cyborg.txt is a collection of robots.txt utilities for Node.

I've been experimenting with web crawlers lately. I doubt I'll build the next great search engine, but it's fun to do things like sentiment analysis on the web. When you're writing a (polite) web crawler, you quickly find that you need to parse robots.txt.
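For reference, robots.txt is just a plain-text file of rules grouped by user agent. Something like this tells every crawler to stay out of `/admin/` and tells one bot to stay away entirely:

```
User-agent: *
Disallow: /admin/

User-agent: ExampleBot
Disallow: /
```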

Node has a few different libraries for dealing with robots.txt, but I thought it'd be fun (and possibly useful) to make my own. Go take a look at cyborg.txt, my new library for parsing and generating robots.txt files!
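To get a sense of the problem space, here's a deliberately naive sketch — not cyborg.txt's API or implementation, just an illustration — that pulls the Disallow rules out of a robots.txt string for a given user agent:

```javascript
// Naive illustration only; this is not how cyborg.txt works internally,
// and it isn't its API. It just shows the kind of parsing involved.
function disallowedPathsFor(robotsTxt, userAgent) {
  const disallowed = [];
  let applies = false;

  for (const rawLine of robotsTxt.split('\n')) {
    const line = rawLine.trim();
    const colon = line.indexOf(':');
    if (colon === -1) continue;

    const field = line.slice(0, colon).trim().toLowerCase();
    const value = line.slice(colon + 1).trim();

    if (field === 'user-agent') {
      // Start of a group: does it name us, or the wildcard?
      applies = value === '*' || value.toLowerCase() === userAgent.toLowerCase();
    } else if (field === 'disallow' && applies && value !== '') {
      disallowed.push(value);
    }
  }

  return disallowed;
}

// Reusing the example file from above:
const example = [
  'User-agent: *',
  'Disallow: /admin/',
  '',
  'User-agent: ExampleBot',
  'Disallow: /',
].join('\n');

console.log(disallowedPathsFor(example, 'MyCrawler'));  // ['/admin/']
console.log(disallowedPathsFor(example, 'ExampleBot')); // ['/admin/', '/'] — naively merged!
```

Even this toy version hints at the subtleties a real parser has to get right: a crawler is supposed to obey the group for its own user agent when one exists rather than merging everything together, and real-world files also use Allow rules, path wildcards, and plenty of creative formatting. That's exactly the kind of thing it's nice to hand off to a library.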