Page 1 of 2
Results 1 to 25 of 30
  1. #1
    Full Member
    Join Date
    January 16th, 2006
    Posts
    447
    Anyone Use a DataFeed with MySQL or SQL Server?
    I'm used to working on an intranet where I have access to all the pieces. I'm trying to figure out how to get the data from my CSV datafeed file into my database on a server on the Internet.

    Can anyone point me in the right direction?

    Thank you for any help.

  2. #2
    Member SeanW's Avatar
    Join Date
    July 28th, 2005
    Posts
    69
    Depends on how you've written your code and how your database is laid out. I have a Perl script for each datafeed that parses the feed, whatever its format, into its components, then passes them off to a Perl class that handles the insert or update in the database.

    Sean
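    A minimal sketch of that per-feed-parser-plus-shared-class pipeline, in Python rather than Perl and with SQLite standing in for the database (the table layout, column names, and feed format here are hypothetical):

    ```python
    import csv
    import sqlite3

    class ProductStore:
        """Shared class that handles the insert-or-update for every feed."""
        def __init__(self, conn):
            self.conn = conn
            conn.execute("""CREATE TABLE IF NOT EXISTS products (
                merchant TEXT, sku TEXT, name TEXT, price REAL,
                PRIMARY KEY (merchant, sku))""")

        def upsert(self, merchant, sku, name, price):
            # Insert the row, or update it if (merchant, sku) already exists.
            self.conn.execute(
                "INSERT INTO products (merchant, sku, name, price) VALUES (?, ?, ?, ?) "
                "ON CONFLICT (merchant, sku) DO UPDATE SET "
                "name = excluded.name, price = excluded.price",
                (merchant, sku, name, price))

    def import_feed(store, merchant, path, colmap):
        """Per-feed parser: colmap says which CSV header holds which field."""
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                store.upsert(merchant,
                             row[colmap["sku"]],
                             row[colmap["name"]],
                             float(row[colmap["price"]]))
    ```

    Each datafeed then only needs its own colmap (and, for non-CSV formats, its own parser), while the insert/update logic stays in one place.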

  3. #3
    Full Member
    Join Date
    September 30th, 2005
    Location
    Spain
    Posts
    180
    With big feeds, you have to transform them offline and then upload them.

  4. #4
    Lite On The Do, Heavy On The Nuts Donuts's Avatar
    Join Date
    January 18th, 2005
    Location
    Winter Park, FL
    Posts
    6,930
    Assuming you've set up your db/tables within MySQL, you just need a script that grabs your uploaded CSV file and imports it. I found a free script for this purpose and use it on several sites of mine.
    http://www.hotscripts.com/Detailed/18098.html

    And I think Uncle Scooter, an affiliate here at ABW, has a script he sells that automates it all via cron jobs.

    I bet DesignerWiz has something like this as well.

  5. #5
    Moderator MichaelColey's Avatar
    Join Date
    January 18th, 2005
    Location
    Mansfield, TX
    Posts
    16,232
    Quote Originally Posted by pexcornel
    With big feeds, you have to transform it offline and then upload it
    Actually, big feeds are ideal for processing on the server. I process several feeds with hundreds of thousands of items. The computer doesn't care whether it's 1 item or a million.
    Michael Coley
    Amazing-Bargains.com
     Affiliate Tips | Merchant Best Practices | Affiliate Friendly? | Couponing | CPA Networks? | ABW Tips | Activating Affiliates
    "Education is the most powerful weapon which you can use to change the world." Nelson Mandela

  6. #6
    Member buy-tees's Avatar
    Join Date
    December 1st, 2005
    Location
    UK
    Posts
    175
    I use DTS in SQL Server and let it do the importing automatically for me. I just download the feed from CJ, AWIN etc into a folder and SQL Server will pick it up when it's scheduled to. It can even FTP the data itself but I haven't got that far yet.

  7. #7
    Newbie
    Join Date
    March 17th, 2006
    Location
    London
    Posts
    4
    CJ datafeeds
    Doesn't CJ charge for datafeeds from merchants? All I require is a datafeed, not any help from CJ to set it up for me. Any ideas?

  8. #8
    Full Member
    Join Date
    January 16th, 2006
    Posts
    447
    Quote Originally Posted by nutscrewsbolts
    Doesn't CJ charge for datafeeds from merchants? All I require is a datafeed, not any help from CJ to set it up for me. Any ideas?
    Some merchants have free datafeeds. Others would probably give you a free datafeed if you sold enough, from what I read. If you are like me, you need the datafeeds initially to sell, so that last point may be out. I just read a big thread here at ABW on the whole CJ payment thing, and everyone says it's a ripoff. But what are you going to do if you are new? I think all you can do is pay the one-time fee. $200 is not bad for a business investment. But check the merchant first. I haven't paid yet.

    I eventually figured out what I needed to do on the database transfer. I received the datafeed and imported it into Access. From there I changed the columns to the datatypes I needed. I then right-clicked on the table and exported it to an ODBC database (the one on my Internet server).

  9. #9
    What's the word? Rhia7's Avatar
    Join Date
    January 13th, 2006
    Posts
    9,578
    Quote Originally Posted by johnnyWebAffiliate
    I eventually figured out what I needed to do on the database transfer. I received the datafeed and imported it into Access. From there I changed the columns to the datatypes I needed. I then right clicked on the table and exported it to an ODBC database (the one on my Internet Server).
    Fantastic idea!
    ~Rhia7 -- Remember the 7
    Twitter me

  10. #10
    ABW Ambassador Snib's Avatar
    Join Date
    January 18th, 2005
    Location
    Virginia
    Posts
    5,303
    I've got a system in place that downloads the feeds to my server and updates my prices automatically. I've got it set up so I have a single import script that reads the datafeed mapping information from a table. This information is configured by me via my administration panel and tells my import script which field goes where in my database. Everything is updated via the crontab and it takes me about 10-20 minutes to add a clean datafeed to the system. It's just unfortunate that most feeds aren't clean.

    Building this was no small feat. It's taken me over 2 years to develop and get it to this point, but it's been well worth the investment.

    - Scott
    Hatred stirs up strife, But love covers all transgressions.
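    The mapping-table idea could be sketched like this in Python with SQLite (the feed_mappings and products tables, their columns, and the feed names are all made up for illustration): one generic import script reads the admin-configured mapping and never hardcodes a feed's layout.

    ```python
    import csv
    import sqlite3

    def load_mapping(conn, feed_name):
        """Read the admin-configured mapping: which feed column goes to which DB field."""
        rows = conn.execute(
            "SELECT feed_column, db_field FROM feed_mappings WHERE feed_name = ?",
            (feed_name,)).fetchall()
        return dict(rows)  # e.g. {"PRODUCT_SKU": "sku", "RETAIL_PRICE": "price"}

    def import_feed(conn, feed_name, path):
        """Single generic import script, driven entirely by the mapping table."""
        mapping = load_mapping(conn, feed_name)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                record = {db_field: row[feed_col]
                          for feed_col, db_field in mapping.items()}
                # Column names come from the trusted admin table, not the feed.
                cols = ", ".join(record)
                marks = ", ".join("?" * len(record))
                conn.execute(
                    f"INSERT OR REPLACE INTO products ({cols}) VALUES ({marks})",
                    tuple(record.values()))
    ```

    Adding a new (clean) feed is then just a matter of inserting its column mappings, which fits the 10-20 minute setup time described above.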

  11. #11
    Newbie TheHoff's Avatar
    Join Date
    March 5th, 2006
    Posts
    260
    ^^ Same here as Scott, though mine is not as slick. I'm in the process of abstracting it and making it easier to add feeds across different sites from one interface. Take a cue and start writing your own import system rather than pulling each feed into Excel or Access.

  12. #12
    Full Member
    Join Date
    January 16th, 2006
    Posts
    447
    Quote Originally Posted by TheHoff
    ^^ Same here as Scott though mine is not as slick. I'm in the process of abstracting it and making it easier to add feeds across different sites from one interface. Take a cue and start writing your own import system rather than pulling each feed into Excel or Access.
    Oh, I didn't mean to imply what I did was great or anything (not saying anyone said that). It was just a quick way for me to get started. In the future, I can see building a script that gets the datafeed and uses INSERT SELECT, if the data and schema are constant.

  13. #13
    Full Member
    Join Date
    January 16th, 2006
    Posts
    447
    Quote Originally Posted by Rhia7
    Fantastic idea!
    Well, it was when there were only 13k rows. I finally paid CJ (and I know everyone says it's a ripoff) and got my feeds. It ended up as 134,000 rows. I did the export, and about 2 hours later it was almost halfway through. I don't know of a quicker way to upload the data, which is 130MB as a text file and 255MB as an Access 97 database. At least I am using a cable modem....

  14. #14
    Full Member
    Join Date
    January 16th, 2006
    Posts
    447
    Quote Originally Posted by Snib
    I've got a system in place that downloads the feeds to my server and updates my prices automatically. I've got it set up so I have a single import script that reads the datafeed mapping information from a table. This information is configured by me via my administration panel and tells my import script which field goes where in my database. Everything is updated via the crontab and it takes me about 10-20 minutes to add a clean datafeed to the system. It's just unfortunate that most feeds aren't clean.

    Building this was no small feat. It's taken me over 2 years to develop and get it to this point, but it's been well worth the investment.

    - Scott
    I know I need a "real" script, but I wonder if I could put the Access DB on the server and build an ASP.NET page to read from the Access .mdb and upload it. Do you think it would be faster since everything is on the server? Or will my middle page (my ASP page) make it just as slow? I guess I can look into importing from the text file with SQL Server.

  15. #15
    ABW Ambassador Snib's Avatar
    Join Date
    January 18th, 2005
    Location
    Virginia
    Posts
    5,303
    Quote Originally Posted by johnnyWebAffiliate
    I know I need a "real" script, but I wonder if I could put the Access DB on the server and build an ASP.NET page to read from the Access .mdb and upload it. Do you think it would be faster since everything is on the server? Or will my middle page (my ASP page) make it just as slow? I guess I can look into importing from the text file with SQL Server.
    I think the slowest part of this will be importing the data into your mdb database. If you automate this aspect, it should be okay. Just as long as you've got all the proper indexes created on your Access table it should operate smoothly. Creating the asp.net page to display your site based on the mdb file isn't trivial, but that shouldn't cause any slowdown.

    - Scott
    Hatred stirs up strife, But love covers all transgressions.
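    The point about indexes applies to any backend, not just Access. A quick illustration in Python with SQLite (table and index names are made up): index the columns the site actually filters and joins on, so page queries don't scan the whole product table.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE products (
        merchant TEXT, sku TEXT, name TEXT, price REAL)""")
    # Lookups by merchant + SKU (the usual join/update path) get their own index...
    conn.execute("CREATE INDEX idx_products_merchant_sku ON products (merchant, sku)")
    # ...and so do name searches, if the site offers them.
    conn.execute("CREATE INDEX idx_products_name ON products (name)")
    ```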

  16. #16
    Member
    Join Date
    September 5th, 2005
    Location
    Mansfield, TX
    Posts
    161
    If you're using MySQL, I would check out its "LOAD DATA INFILE" command; it will load big files very fast. The command is something like:


    LOAD DATA INFILE 'YourFile.CSV'
    INTO TABLE cj_temp
    FIELDS
        TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;

  17. #17
    ABW Ambassador Snib's Avatar
    Join Date
    January 18th, 2005
    Location
    Virginia
    Posts
    5,303
    Quote Originally Posted by DHolland
    If you're using MySQL, I would check out its "LOAD DATA INFILE" command; it will load big files very fast.
    The problem with this is that it will only import the products into a table. It doesn't take existing products into account, so it'll duplicate anything that you've already imported previously. You'll have to delete all your data every time before doing this, and you'll lose your auto-increment product ID numbers in the table.

    I think it's better to have a script that will update the existing data and insert any new products. I just find it sloppy to delete everything every time you need to run an update.

    - Scott
    Hatred stirs up strife, But love covers all transgressions.

  18. #18
    Member
    Join Date
    September 5th, 2005
    Location
    Mansfield, TX
    Posts
    161
    No, the command comes with a REPLACE parameter, which will replace any existing record with the newly imported one. You just need to have your keys set correctly.

    LOAD DATA INFILE 'YourFile.CSV' REPLACE
    INTO TABLE cj_temp
    FIELDS
        TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;
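    For reference, MySQL's REPLACE matches rows against the table's PRIMARY or UNIQUE key, so a compound key over merchant and SKU keeps different merchants' products from colliding. A small illustration of the same key-based replace semantics, using SQLite's INSERT OR REPLACE from Python (the table and column names are made up):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    # The UNIQUE key over (merchant, sku) is what the replace matches against,
    # so two merchants can share a SKU without clobbering each other.
    conn.execute("""CREATE TABLE products (
        merchant TEXT, sku TEXT, price REAL,
        UNIQUE (merchant, sku))""")

    rows = [("acme", "A1", 9.99), ("bravo", "A1", 4.99)]  # same SKU, two merchants
    conn.executemany("INSERT OR REPLACE INTO products VALUES (?, ?, ?)", rows)

    # Re-importing acme's feed with a new price replaces only acme's row.
    conn.execute("INSERT OR REPLACE INTO products VALUES ('acme', 'A1', 8.49)")
    ```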

  19. #19
    ABW Ambassador Snib's Avatar
    Join Date
    January 18th, 2005
    Location
    Virginia
    Posts
    5,303
    Quote Originally Posted by DHolland
    No, the command comes with a REPLACE parameter, which will replace any existing record with the newly imported one. You just need to have your keys set correctly.

    LOAD DATA INFILE 'YourFile.CSV' REPLACE
    INTO TABLE cj_temp
    FIELDS
        TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;
    But by what criteria does it know when to replace an item? Will it replace a row if it finds the same SKU? What if you've got more than one merchant in your table and there are overlapping SKUs? I just don't think it's a good long-term solution.

    - Scott
    Hatred stirs up strife, But love covers all transgressions.

  20. #20
    Member
    Join Date
    April 11th, 2006
    Location
    Michigan
    Posts
    70
    We are in the process of building an application which will download datafeeds from various merchants into our master database. Since we offer hosting for affiliates, we will provide the affiliate datafeed free of cost. Our application will update the datafeed automatically via a cron job, then distribute it to each host, which stores the data in its own MySQL tables. We will be responsible for managing the hosting and the data.
    Burhan Tanweer
    Marketing Manager
    sales/at/awebhosts.com
    [URL=http://www.awebhosts.com]www.awebhosts.com [/URL]

  21. #21
    Full Member
    Join Date
    January 16th, 2006
    Posts
    447
    Quote Originally Posted by csearch
    We are in the process of building an application which will download datafeeds from various merchants into our master database. Since we offer hosting for affiliates, we will provide the affiliate datafeed free of cost. Our application will update the datafeed automatically via a cron job, then distribute it to each host, which stores the data in its own MySQL tables. We will be responsible for managing the hosting and the data.
    Too bad you don't use SQL Server. I can use MySQL and have used it for many applications in the past. I am using SQL Server now, however. It works much better with Visual Studio .NET 2005 and ASP.NET 2.0.

  22. #22
    Member
    Join Date
    April 11th, 2006
    Location
    Michigan
    Posts
    70
    We have both Linux and Windows hosting, with MySQL and SQL Server respectively. Linux hosting is cheaper than Windows; it depends on how much an affiliate would like to spend on hosting.
    We offer hosting services. Since affiliate datafeeds are gaining popularity and everyone is working individually to download the data locally, we have decided to offer datafeed services for free to the websites hosted by us.
    Burhan Tanweer
    Marketing Manager
    sales/at/awebhosts.com
    [URL=http://www.awebhosts.com]www.awebhosts.com [/URL]

  23. #23
    Member
    Join Date
    September 5th, 2005
    Location
    Mansfield, TX
    Posts
    161
    Snib,

    As long as you make the vendor and the sku/product ID unique keys, it will replace the correct one.

    That said, I personally think it is better to delete the vendor's inventory and then reload. That way you get rid of old products that are no longer valid, and your database is up to date. Also, inserts are about 10-20 times faster than updates, which is important as your database grows larger.

    Everyone has their own way of doing things. I say use what works best for you.


    JohnnyWebAffiliate,

    A similar program exists in SQL Server. It's called BCP. Just google it and you'll get the parameters.
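    A sketch of that delete-then-reload pattern, in Python with SQLite standing in for the database (the table and column names are illustrative); wrapping both steps in one transaction keeps readers from ever seeing a half-loaded vendor:

    ```python
    import sqlite3

    def reload_vendor(conn, vendor, items):
        """Drop a vendor's old inventory, then bulk-insert the fresh feed."""
        with conn:  # one transaction: commit on success, roll back on error
            # Delete only this vendor's rows; other merchants are untouched.
            conn.execute("DELETE FROM products WHERE merchant = ?", (vendor,))
            conn.executemany(
                "INSERT INTO products (merchant, sku, price) VALUES (?, ?, ?)",
                ((vendor, sku, price) for sku, price in items))
    ```

    Stale products disappear automatically with each reload, which is the advantage over an update-in-place approach; the trade-off, as noted above, is losing any auto-increment IDs tied to the old rows.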

  24. #24
    ABW Ambassador Snib's Avatar
    Join Date
    January 18th, 2005
    Location
    Virginia
    Posts
    5,303
    Quote Originally Posted by DHolland
    That said I personally think it is better to delete the vendor inventory and then reload. That way you get rid of old products that are no longer valid and your database is up to date. Also inserts are about 10-20 times faster than updates which is important as your database grows larger.
    Reducing insertion time is definitely an important factor and something I strive for. I agree that it should be as fast as possible. One advantage to keeping the old, out of stock data is to be able to let the customer know that the product they're seeking is out of stock and to offer alternate, similar suggestions.

    I use my auto_increment product ID field in my product URL's. This gives me a short and unique URL for each product across many merchants. I can also tie product reviews, wishlists and various other tools to the product ID's.

    This is just what works best for me.

    - Scott
    Hatred stirs up strife, But love covers all transgressions.

  25. #25
    Full Member
    Join Date
    January 16th, 2006
    Posts
    447
    Quote Originally Posted by DHolland
    Snib,

    As long as you make the vendor and the sku/product ID unique keys, it will replace the correct one.

    That said, I personally think it is better to delete the vendor's inventory and then reload. That way you get rid of old products that are no longer valid, and your database is up to date. Also, inserts are about 10-20 times faster than updates, which is important as your database grows larger.

    Everyone has their own way of doing things. I say use what works best for you.


    JohnnyWebAffiliate,

    A similar program exists in SQL Server. It's called BCP. Just google it and you'll get the parameters.
    Thanks, I found BCP earlier today after I wrote my post. I can't use it on my host, though; I have worked out something else with them. Now I am stuck on CJ's datafeed not having all the fields I want. For example, they are missing sizes on the items; in fact, there is no size column at all. I don't know why. I hope I can get fields/columns in the datafeed beyond what they have initially set up. I don't know whether they will set up what you want or not.


