  1. #1
ABW Ambassador Doug247
    Join Date
    January 18th, 2005
    Location
    DE USA
    Posts
    931
I am starting to use more and more datafeeds and wanted to know whether veteran affiliate marketers have a particular method for managing their feeds.

    Here is what I am thinking.

    Option 1
Do you create a custom table for each merchant, based on the number and content of that feed's fields?

    Option 2
    Do you have a standard table for every merchant, with say 25 fields and just doctor every feed to fit nicely into that table?
    Ex. you have 3 feeds so you would use 3 tables

    Option 3
    Do you use a master table with say 25 fields and doctor all the feeds into that table?
    Ex. You have 3 feeds but load all of them into one table

    Option 4
Do you use a different method and care to share?


I am trying to find the most efficient way to manage feeds while also having a plan for incorporating new feeds. I know each way will have pros and cons, but I'd like to get the feedback of others.
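To make Option 3 concrete, the kind of master table I have in mind might look something like this in MySQL (the column names are just placeholders; the merchant_id column is what lets all the feeds share one table):

    CREATE TABLE products (
        product_id   INT NOT NULL AUTO_INCREMENT,
        merchant_id  INT NOT NULL,           -- which feed the row came from
        name         VARCHAR(255) NOT NULL,
        description  TEXT,
        price        DECIMAL(10,2),
        buy_url      VARCHAR(255),           -- affiliate link
        image_url    VARCHAR(255),
        category     VARCHAR(100),
        PRIMARY KEY (product_id),
        INDEX (merchant_id),
        INDEX (category)
    );

Each feed would then get doctored into those same columns on import.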

    Thanks,
    Doug

  2. #2
mega crap martyogelvie
    Join Date
    January 18th, 2005
    Location
    Atlanta
    Posts
    608
    Doug,
    Here is how I do it...
I create a separate database for each merchant.
I then have to create a separate connection to each db.

Then I create separate tables for different brands/items from each merchant.

As an example, using Akiva's SportsFanFare datafeed: I have a single DB for all of Akiva's SportsFanFare items, then I create separate tables for each of the following: NFL, MLB, NBA, NHL, NCAA, Autographs.

    This helps with querying the db. I would like to think my dynamic pages would load faster simply because there is less data to sort through.

    As a general database rule, create more tables instead of just creating more columns per table.
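Roughly, the setup for one merchant looks like this (the column list is just a simplified example; the real layout depends on the feed):

    CREATE DATABASE sportsfanfare;
    USE sportsfanfare;

    -- one table per category, all with the same layout
    CREATE TABLE nfl (
        product_id  INT NOT NULL AUTO_INCREMENT,
        name        VARCHAR(255),
        team        VARCHAR(100),
        price       DECIMAL(10,2),
        buy_url     VARCHAR(255),
        PRIMARY KEY (product_id)
    );
    -- ...and the same again for mlb, nba, nhl, ncaa, autographs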

    I am anxious to see what others do.


  3. #3
ABW Ambassador Doug247
    Join Date
    January 18th, 2005
    Location
    DE USA
    Posts
    931
Hmm... I know what you mean, but is there a noticeable difference in speed on a datafeed of that size? I could definitely see doing that with larger feeds. Thanks for the input.
    Thanks,
    Doug

  4. #4
mega crap martyogelvie
    Join Date
    January 18th, 2005
    Location
    Atlanta
    Posts
    608
    Doug,
The SportsFanFare feed from Akiva has 10k+ records, and I am sure Akiva is NOT done. That database will continue to grow.

    half a second here or half a second there could mean the difference in a sale.

By separating merchants into different dbs and then creating separate tables per category or brand, I can limit a query from 10k+ records to somewhere around 1k records.

Plus it makes separating the data on my pages easier. I.e., I have a db for SportsFanFare and separate tables for NFL, MLB, etc. When I create a page for, say, Dallas Cowboys merchandise, I create a recordset from the NFL table and then only have to query once for Dallas Cowboys, versus having one large table, creating a recordset from it, then having to query for NFL and query again for Dallas Cowboys.
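Put another way, the page query for the per-category setup stays as small as this (the team and league column names are just examples):

    -- per-category table: one query over ~1k rows
    SELECT * FROM nfl WHERE team = 'Dallas Cowboys';

    -- vs. one big table: the same lookup wades through all 10k+ rows
    SELECT * FROM all_products WHERE league = 'NFL' AND team = 'Dallas Cowboys';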

    make sense?

  5. #5
    Full Member
    Join Date
    January 18th, 2005
    Posts
    362
Do you have an automatic way of backing up each database? I was thinking of a separate table per feed in one database, so I can just grab the whole database for a backup and it would have my whole site in it.

But on the other hand, if I add something like the Betty Mills feed into the mix, would it slow down the single database? Or, being a separate table, would it not make a difference?

I can see separate databases being handy if you normalize and format each feed to be nearly identical, so you don't have to change queries other than the database name for the connection.

Still slowly learning this stuff myself.. I want it up NOW but damnit.. I want to do it right also :P

  6. #6
mega crap martyogelvie
    Join Date
    January 18th, 2005
    Location
    Atlanta
    Posts
    608
    Canadian eh,
    My host performs routine backups nightly, so the database gets backed up often, plus I have a copy on my PC.

I think that a separate table per merchant in a single DB would probably not slow the query down, since you create a recordset off the single table.

One of the best things about having multiple databases is upload times. If I add a feed, I upload a single database, which may be 5-10 MB. If I had all my feeds in one database and added a feed, or just edited some of the records, I would have to upload the entire db, which could get rather large and take upwards of an hour even with broadband. Many of your dynamic pages that pull data from this database would be down until the entire db is uploaded. One way of offsetting this is to create a web page to edit/add to the database in real time, but that doesn't answer the question of adding an entire feed.

    I am sure that I do this bassackwards, but this is how I do it.


  7. #7
Full Member jollygoodpirate
    Join Date
    January 18th, 2005
    Location
    NC
    Posts
    227
I use all of the options above, plus one you have not mentioned: I create tables for individual merchants, I create tables for multiple merchants combined, and I have even created separate tables (and websites) from the same merchant for different parts of the datafeed.

    You do whatever you need to do, based on the purpose of the website.

    For my "big" site I use multiple feeds in one table. The advantage of this is that I can create a simple search script that would search accross all the merchants products. And If I want to create a page on blue widgets, I can start with one query to one table in one big swooop.

To offset the problem of having your website down while you are loading new data... well, I've used two methods (both are MySQL-specific):


One is to create a loading table laid out exactly like your final table. Load all your new data into that table while your website serves from the original table; when the load is done, you can do:

(MySQL's RENAME TABLE statement can do all three swaps in a single atomic statement)
RENAME TABLE original_table TO temp,
             loading_table TO original_table,
             temp TO loading_table;
FLUSH TABLES;

and you are done, with a loading_table ready for the next time. If you have problems with the new table, you can roll back to your last table by running the renames in reverse.

    For really large tables this is still too slow, and I use this method instead:

Load the table on your local Linux machine.
Bzip2 or gzip all the table files (bzip2 saves you multiple megabytes).
Upload the compressed files to the server (an hour or more to upload).
Then unzip the files into the MySQL data directory and run REPAIR TABLE on each table.

The uploading is still very slow, but the unzipping and repair will only take 5 minutes, if that much. Do it at 4 AM and nobody will notice. If you gzip your original files before you unzip the new ones, you can still roll back if you have problems.


    Hey, don't tell anybody, these are all my own secrets... wouldn't want everybody doing this.

  8. #8
    Newbie
    Join Date
    January 18th, 2005
    Posts
    22
    Doug,

    I currently do something most like your Option #2.

    I create a unique database (table) for each merchant and I map all fields into my own standard ones. This way all tags that I use in my templates are the same for all merchants.

    I search the databases by creating an inverted index on standard tags that I select (usually product name, description, id, keywords, etc).

    With the inverted index, I can then search any merchant database super fast without regard to database size.

    Soon, I am planning on combining all merchant datafeeds into a single database with a combined index that performs kind of like Froogle.

    I automate all steps with cron and update all feeds on a regularly scheduled basis.

    Bo

  9. #9
    Full Member
    Join Date
    January 18th, 2005
    Posts
    362
What is an inverted index? I can't seem to find specifics with Google.

Also, when you format each feed into your own field layout, do you use PHP or MySQL commands to do it?

I was spending hours on one problem trying to get it working with PHP, and then found that a MySQL INSERT ... SELECT combo did it in one line :P
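For anyone else hunting for it, the one-liner was along these lines (table and column names made up for the example):

    -- remap the raw feed's columns into the standard layout in one statement
    INSERT INTO products (merchant_id, name, description, price, buy_url)
    SELECT 3, prod_title, prod_desc, retail_price, link_url
    FROM   raw_feed;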

  10. #10
    Newbie
    Join Date
    January 18th, 2005
    Posts
    22
An inverted index is like the index in the back of a book: it contains a list of words and a reference to the pages each word appears on.

Like Google does with web pages, I do the same with the datafeeds: I take each word from each datafeed entry and record which product it is associated with. Then when you perform a keyword search, the lookup is blazingly fast.
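In table terms, the structure is just a word-to-product mapping. Here is a relational sketch of the idea (my actual setup is Perl and a Berkeley database, not SQL):

    CREATE TABLE inverted_index (
        word        VARCHAR(50) NOT NULL,
        product_id  INT         NOT NULL,
        INDEX (word)
    );

    -- products containing BOTH keywords of a two-word search
    SELECT product_id
    FROM   inverted_index
    WHERE  word IN ('blue', 'widget')
    GROUP  BY product_id
    HAVING COUNT(DISTINCT word) = 2;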

    Here is a good article describing the basic approach in more detail:

    http://hotwired.lycos.com/webmonkey/...6/index2a.html

There are tons of good articles if you search for "search engine inverted index" on Google.

I do all of my programming in Perl. I use a Berkeley database (DB_File) with a few twists to save on disk space.

    Bo


