For those who are not computer geniuses but still want to try an electronic approach to content analysis, you might want to try the Firefox extension DownThemAll. It lets you selectively and automatically download all the links on a web page to a local folder.
Say, for instance, that you wanted to perform a content analysis of Google News reporting. Open your starting page, then right-click. You then have the option of filtering out certain URLs (e.g., those containing the strings "google" or ".net") and automatically saving the rest.
It's not a perfect solution, but for some of you it may be easier (and quicker) than learning Perl or Java.
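For the curious, the same filter-then-save idea is only a few lines in a scripting language. Here's a rough Python sketch (the page content and URLs are made up for illustration; this isn't how DownThemAll itself works): it pulls every link out of an HTML page, then keeps only the ones that don't contain any excluded string.

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def filter_links(html, excluded_substrings):
    """Return links containing none of the excluded substrings."""
    parser = LinkExtractor()
    parser.feed(html)
    return [url for url in parser.links
            if not any(s in url for s in excluded_substrings)]


# A toy page with a mix of links (hypothetical URLs)
page = """
<a href="http://news.example.com/story1.html">Story 1</a>
<a href="http://www.google.com/search">Google</a>
<a href="http://feeds.example.net/rss">Feed</a>
<a href="http://news.example.com/story2.html">Story 2</a>
"""

# Keep everything except URLs containing "google" or ".net";
# each surviving URL could then be saved locally, e.g. with
# urllib.request.urlretrieve.
keep = filter_links(page, ["google", ".net"])
print(keep)
# → ['http://news.example.com/story1.html', 'http://news.example.com/story2.html']
```

Of course, the whole point of the extension is that you get this behavior without writing or debugging any code at all.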
Kudos to Shanna for the tip.