I'll get into this a little more, but for now here's a quick intro... The first thing you need to do is download the free page "sucker":
www.httrack.com
It's free/open source, and what it does is download web pages to your hard drive so you can browse them off-line.
However, we'll use it to convert our dynamic pages to static HTML.
Making pages with HTTrack:
I won't get into the specifics of HTTrack beyond this general guide. It assumes you've created a "skin" for Fatbomb that matches your site.
In the Fatbomb admin, be sure to configure it for the resources you want to use.
Open HTTrack and:
-Name the project
-Create a category (not sure what this really does)
-Click NEXT
Here's where you do your "work".
Paste a long list of URLs into the "Web Addresses" box. We don't need to spider the site; instead, just enter the URLs directly.
Note: Be careful; you don't want to suck pages so fast that you bring your own site to its knees, or make your host mad.
To set your "speed limit," click "Set Options" on the Add URLs screen, then click "Limits" and adjust the appropriate limits.
If you can't figure out the proper "Limits," just do 10-20 or so pages at a time... But the last thing you want to do is bring down your own server.
- Click NEXT
- Click FINISH
Your Fatbomb pages will be sucked and downloaded to your hard drive.
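(By the way, if you'd rather skip the wizard, HTTrack also ships with a command-line version. Something like the line below roughly matches the steps above; the flags are from the HTTrack manual, but double-check them against httrack --help, and treat the file and folder names as placeholders:

httrack -%L urls.txt -O ./fatbomb-static -r1 -c2 -%c1 -A25000

-%L reads URLs from a text file, -O sets the output folder, -r1 keeps it from spidering past the pages you list, and -c2 / -%c1 / -A25000 throttle connections and transfer rate so you don't hammer your own server.)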
-If you own Tuelz, you can touch them up from there.
How to make URLs to suck pages using Notetab:
Free at www.notetab.com
-Enter your list of keywords
-Find:
_ (_ represents one blank space)
Replace:
- (hyphen)
Goal: to replace the blank space in each multi-word phrase with a hyphen.
-Add your base URL at the beginning of each phrase:
Find:
^p
Replace:
^phttp://YourDomain.com/cgi-bin/fatbomb/fatbomb.cgi/?skin=default&&keywords=
Or if using the htaccess file...
Replace:
^phttp://YourDomain.com/
Note the ^p at the beginning of the URL.
This should put the base URL before all your keywords...
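If you'd rather script these find/replaces than do them in Notetab, here's a rough Python equivalent. The URL format is copied from above; keywords.txt and urls.txt are just placeholder file names:

# Build Fatbomb URLs from a keyword list (one phrase per line)
BASE = "http://YourDomain.com/cgi-bin/fatbomb/fatbomb.cgi/?skin=default&&keywords="

with open("keywords.txt") as f, open("urls.txt", "w") as out:
    for line in f:
        phrase = line.strip()
        if not phrase:
            continue
        # Swap each blank space for a hyphen, then prepend the base URL
        out.write(BASE + phrase.replace(" ", "-") + "\n")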
Experts with Tuelz:
Run this list through Replacez to vary the length of your pages, e.g.:
length=20
length=8
length=29
length=17
Say from 8-30 results. A problem with scraper pages is that they all have the same number of results per page. This gives your pages variety...
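Don't own Tuelz? A few lines of Python can stand in for the Replacez step. This assumes your Fatbomb URLs carry a length= parameter that controls results per page (I'm inferring that from the length= examples above):

import random

# Append a random result count (8-30) to each URL so the pages vary
with open("urls.txt") as f, open("urls-varied.txt", "w") as out:
    for line in f:
        url = line.strip()
        if url:
            out.write(url + "&length=%d\n" % random.randint(8, 30))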
You can repeat these steps over and over again, using different "skins" and resources/databases each time, to give you a vast assortment of pages for any given niche.
Also, run Namez to give your page file names some keywords. Don't worry about matching each one up perfectly. Note: I STRONGLY suggest using the htaccess method, which will give your pages elvis-presley.html-style file names.
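For reference, the htaccess method boils down to a mod-rewrite rule along these lines. This is only a sketch; the Fatbomb path and parameters are guessed from the URL format earlier in this guide, and your script would need to treat the hyphens as spaces:

RewriteEngine On
# Map elvis-presley.html onto the Fatbomb CGI, pulling the keyword from the file name
RewriteRule ^([A-Za-z0-9-]+)\.html$ /cgi-bin/fatbomb/fatbomb.cgi?skin=default&keywords=$1 [L]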
You can also use Tagz and Replacez to mess with your pages' titles and other stuff, so that your pages look more unique and less "machine made".
The main purpose of sucking Fatbomb pages is to create static html pages that are not only server-friendly but can also be used on a variety of other domains without having to reinstall cgi scripts on each and every one.
While the initial sucking takes some server load, from that point on you'll have pages that load as fast as possible and require very few server resources.
I'll get into this a little more... Just download www.httrack.com for starters...
For those wanting to make and suck pages like this, use the Linez Tuel. I don't want this thread to be about Linez, so no questions about it here; instead, post on the Linez thread in the Tuelz forum.
I will post this quick guide:
1. Paste your list of keywords into Linez.
2. Find: (enter a single blank space)
Replace: - (hyphen)
3. Suffix lines with: .html
4. Prefix lines with: http://www.yourdomain.com/
(assuming this is how the mod-rewrite is set up)
5. Paste the results/URLs into HTTrack and run the program.
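No Linez? The same five steps in a few lines of Python (again, the file names are placeholders and the domain is whatever your mod-rewrite setup expects):

# Turn keyword phrases into static-looking URLs: hyphenate, prefix, suffix
with open("keywords.txt") as f, open("urls.txt", "w") as out:
    for line in f:
        phrase = line.strip()
        if phrase:
            out.write("http://www.yourdomain.com/" + phrase.replace(" ", "-") + ".html\n")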
You can create tons and tons of static pages this way.
For best results, use your own custom databases and/or give the best results more "weight" so they appear higher in your search results.