  1. #1
    ABW Ambassador
    Join Date
    January 18th, 2005
    Posts
    543
Any guide or tutorial for converting a data feed (CSV) with PHP/MySQL?

  2. #2
    ABW Ambassador iucpxleps's Avatar
    Join Date
    January 18th, 2005
    Posts
    648
Convert CSV to what?

    He who steals a minaret prepares a proper
    cover beforehand, said of someone who intends to do something illegal.

  3. #3
    ABW Ambassador
    Join Date
    January 18th, 2005
    Posts
    543
To HTML pages.
I have some datafeeds from merchants like BCS but don't know how to convert them to HTML pages.


Thanks iucpxleps
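Not from the thread, but a minimal sketch of what "CSV to HTML" can look like in PHP. The file name `feed.csv` and the column layout (name, URL, price) are assumptions; match them to your merchant's actual feed before using this.

```php
<?php
// Sketch: render a merchant CSV datafeed as an HTML table.
// Assumed columns: 0 = product name, 1 = product URL, 2 = price.

function csv_to_html(string $csvPath): string
{
    $html = "<table>\n<tr><th>Product</th><th>Price</th></tr>\n";
    $in = fopen($csvPath, 'r');
    while (($row = fgetcsv($in)) !== false) {
        if (count($row) < 3) {
            continue; // skip blank or short lines
        }
        $name  = htmlspecialchars($row[0]);
        $url   = htmlspecialchars($row[1]);
        $price = htmlspecialchars($row[2]);
        $html .= "<tr><td><a href=\"$url\">$name</a></td><td>$price</td></tr>\n";
    }
    fclose($in);
    return $html . "</table>\n";
}

// Usage: echo csv_to_html('feed.csv');
```

From there it's a matter of wrapping the table in whatever page template your site uses.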

  4. #4
    Newbie
    Join Date
    January 18th, 2005
    Posts
    21
What are people using for a query to do their feeds? I used to use a LOAD DATA query, but my hosting company just turned off the permissions, so I need to find a new solution (not to mention a new hosting company).
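A hedged sketch of one LOAD DATA replacement: read the CSV with fgetcsv() and build INSERT statements yourself. The table name `products` and the column names here are hypothetical, and in real code you'd prefer prepared statements over addslashes() for escaping.

```php
<?php
// Sketch: turn a CSV file into INSERT statements when LOAD DATA is
// not allowed. Table/column names are placeholders -- rename to suit
// your schema. For production, use prepared statements instead of
// addslashes() for proper escaping.

function csv_to_inserts(string $csvPath, string $table, array $cols): array
{
    $stmts = [];
    $in = fopen($csvPath, 'r');
    while (($row = fgetcsv($in)) !== false) {
        $vals = array_map(
            function ($v) { return "'" . addslashes($v) . "'"; },
            $row
        );
        $stmts[] = sprintf(
            'INSERT INTO %s (%s) VALUES (%s);',
            $table,
            implode(', ', $cols),
            implode(', ', $vals)
        );
    }
    fclose($in);
    return $stmts;
}

// Usage: feed each returned statement to mysql_query() / mysqli_query().
```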

    thanks for your time

    partners@bigvaluezone.com
    ------------------------------
    Let us show you where the best online deals are!
    --------------------------------
    http://www.bigvaluezone.com

  5. #5
    ABW Ambassador Greg Rice's Avatar
    Join Date
    January 18th, 2005
    Location
    Ohio
    Posts
    4,889
My current host offers phpMyAdmin, so it's done with a GUI and I don't think I can help much. My big problem now is how to get some of these huge files uploaded to my server. Zipped up they're not too bad, 20-70 MB, but unzipped they're ungodly. Is it possible to have MySQL unzip and load the data?

  6. #6
    Newbie
    Join Date
    January 18th, 2005
    Posts
    21
PHP has a ton of FTP commands and some unzip commands, so you might log into the FTP server, cd to the directory, copy the file to your server, close the FTP connection, unzip the file, and then do a LOAD DATA query.

    hope that helps

My hosting company (No Monthly Fees) is no longer allowing me to use LOAD DATA queries in my PHP, so you might check with your hosting company about permissions first before doing an all-in-one PHP load script.
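If shelling out to gunzip isn't allowed either, the decompression can be done in PHP itself with the zlib functions (available on most hosts). A sketch; file names are placeholders:

```php
<?php
// Sketch: decompress a downloaded .gz feed without shelling out to
// gunzip, using PHP's zlib functions.

function gunzip_file(string $gzPath, string $outPath): void
{
    $gz  = gzopen($gzPath, 'rb');   // reads gzip-compressed data
    $out = fopen($outPath, 'wb');
    while (!gzeof($gz)) {
        fwrite($out, gzread($gz, 8192)); // decompress in 8 KB chunks
    }
    gzclose($gz);
    fclose($out);
}

// Usage: gunzip_file('feed.txt.gz', 'feed.txt');
```

Streaming in chunks like this also avoids loading a multi-hundred-MB uncompressed feed into memory at once.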

    partners@bigvaluezone.com
    ------------------------------
    Let us show you where the best online deals are!
    --------------------------------
    http://www.bigvaluezone.com

  7. #7
    ABW Ambassador Greg Rice's Avatar
    Join Date
    January 18th, 2005
    Location
    Ohio
    Posts
    4,889
    Thanks much, I'll have to ask them. Currently, I'm uploading text files which works well since they're only a couple megs each but some of the newer sources I'm looking at would be much too large to upload as text files. The small one is over 190 MB uncompressed. Thanks again. If it helps any, I'm hosting with featureprice.com and they offer mySQL/PHP with their premium and platinum plans. Not sure if they will allow you to do what you need to do but you might ask them.

  8. #8
    Member
    Join Date
    January 18th, 2005
    Posts
    63
They probably stopped you because you were killing the server or using up all their bandwidth. If you have broadband then you should try downloading the datafeed yourself, then modifying it (install Apache and PHP on your own computer, perhaps), then zip up all the files you want to upload and upload them, and finally use a PHP page to decompress the zipped files on the server. It will reduce the server's bandwidth usage (saving you money) and also cut down the server's load so that your host won't get as mad at you.
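The final "decompress on the server" step might look like this sketch using PHP's ZipArchive (the paths are placeholders):

```php
<?php
// Sketch: unpack an uploaded zip on the server with ZipArchive
// (requires PHP's zip extension, enabled on most hosts).

function unzip_to(string $zipPath, string $destDir): bool
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath) !== true) {
        return false; // could not open the archive
    }
    $ok = $zip->extractTo($destDir);
    $zip->close();
    return $ok;
}

// Usage: unzip_to('uploads/pages.zip', '/home/you/public_html/pages');
```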

  9. #9
    ABW Ambassador CrazyGuy's Avatar
    Join Date
    January 18th, 2005
    Posts
    1,463
    I was downloading and uploading until I discovered the perl module Net::FTP.

It is approximately a zillion times faster to transfer server to server - you'll be amazed.

    The code below is an anonymized version of the code I use to get the LinkShare datafeeds.

    As always, your mileage may vary according to your host, perl mods installed etc. Oh, and it seems you have to delete the old file before you can gunzip the new one.

    ===================================

    #!/usr/bin/perl

    print "Content-type: text/html\n\n";

    use Net::FTP;
    # open a connection and log in!

    $ftp = Net::FTP->new('site_to_connect_to','Debug',10);
    $ftp->login('your_account','your_password');

    # set transfer mode to binary

    $ftp->binary();

$ftp->get('file_name.txt.gz', '/path/towhere/youwant/file_name.txt.gz') or warn "Couldn't get file!";
$ftp->quit;

# gunzip the file where it was saved (same path as in get() above)
system('gunzip /path/towhere/youwant/file_name.txt.gz');

    print "<P>done</P>";

    ===================================

    Are you Crazy?

  10. #10
    ABW Ambassador
    Join Date
    January 18th, 2005
    Location
    ÄúsTrálíĺ
    Posts
    1,372
FTP with PHP (you need v4-something or later because earlier versions don't support it):

    http://www.php.net/manual/en/ref.ftp.php

<?php
// set up basic connection
$conn_id = ftp_connect($ftp_server);

// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

// check connection
if ((!$conn_id) || (!$login_result)) {
    echo "FTP connection has failed!";
    echo "Attempted to connect to $ftp_server for user $ftp_user_name";
    die;
} else {
    echo "Connected to $ftp_server, for user $ftp_user_name";
}

// download the file
$download = ftp_get($conn_id, $destination_file, $source_file, FTP_BINARY);

// check download status
if (!$download) {
    echo "FTP download has failed!";
} else {
    echo "Downloaded $source_file from $ftp_server to $destination_file";
}

// close the FTP stream
ftp_close($conn_id);
?>

  11. #11
    ABW Ambassador Greg Rice's Avatar
    Join Date
    January 18th, 2005
    Location
    Ohio
    Posts
    4,889
    I'm not sure I understand the benefit of using PHP to FTP vs using any other FTP software, at least for downloading. I can see a benefit of using PHP to upload if I knew where to put the file and then could then uncompress it.

    After checking the settings for my PHP install, I see they have an upload limit of 10 MB but I've only been able to get it to take around 6MB max. Currently, for one large datafile I download it, open it up and split it into 4 files of around 5MB each. Then I upload each section, one at a time, into my database. It's not too big of a deal but there must be a better way to handle it.

    The Wal-Mart datafeed is just too large for me to handle right now as it's just under 80MB compressed. A site to site transfer would probably work wonders for me.

If I upload using PHP, how do I know where the file should go? I need this location info to also uncompress it and then import it into my database.

  12. #12
    ABW Ambassador
    Join Date
    January 18th, 2005
    Location
    ÄúsTrálíĺ
    Posts
    1,372
    I use php (with a cron job) to automatically download my feeds daily and store them on my server.
    I then use other cron jobs to add the data to the various tables, update existing products, delete old products etc.
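For reference, a cron setup along those lines might look like this (the script names, paths, and times are placeholders):

```
# fetch the feed at 3 AM, process it at 4 AM (paths are placeholders)
0 3 * * * /usr/bin/php /home/you/scripts/fetch_feed.php
0 4 * * * /usr/bin/php /home/you/scripts/update_tables.php
```

Running the jobs in the small hours keeps the load off the server when real visitors are around.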

  13. #13
    ABW Ambassador CrazyGuy's Avatar
    Join Date
    January 18th, 2005
    Posts
    1,463
    Xandman - these scripts are to transfer from the merchant/network server direct to your web server, without your PC getting involved at all.

    Ideally, they get run automatically to transfer latest datafeed and process it, but they can also be useful just to do the transfer when you trigger it.

Of course, if you're having to process the data through some desktop tools before you can use it, you need to download and upload.

    It's worth looking at though - honestly, massive files zip across between these servers with their fast connections.

    Are you Crazy?

  14. #14
    Member
    Join Date
    January 18th, 2005
    Posts
    63
Yes, they transfer very fast from server to server, but it will also mean huge amounts of traffic that you have to pay for. Anyone with broadband should just download the files themselves, figure out if anything is new, create pages (or a database to be merged with the server's) for the new products, zip everything (to transfer quicker and use less server bandwidth), then use a PHP page (or whatever else) to unzip it and put it where it needs to go. This way may not be as speedy as server to server, but it will keep your web hosting bills down and keep your host from getting pissed at you. I know if I was Pete's host I would be sure to charge him a buttload for transferring feeds daily! Remember people, if you're using cheap hosting (i.e. a shared server) then you're not only affecting your own site, but also every other site that is on that server!
Edit: Oh, and one more thing. My way also allows you to get around the server's file size limits.

  15. #15
    ABW Ambassador CrazyGuy's Avatar
    Join Date
    January 18th, 2005
    Posts
    1,463
    Tweaker - I guess it depends on how you value your time.

    And what kind of hosting deal you have.

I certainly got fed up with the limitations of shared hosting for a number of reasons, but bandwidth wasn't one of them. Even transferring an 80MB datafeed every day totals 2.4GB/month (if my maths is correct) when even an $8/month plan gives you 8GB/month bandwidth.

    It's hard to justify a lot of manual handling of data to overcome bandwidth limitations at those prices.

    Are you Crazy?

  16. #16
    Member
    Join Date
    January 18th, 2005
    Posts
    63
That's somewhat fine and dandy (in terms of bandwidth) for an 80MB feed, but feeds can grow much larger than that. Also consider the fact that while your script is handling the feed (putting it into a database, making pages, etc.) you're putting a huge strain on the server's CPU. If your host notices, they'll probably get pissed off a bit (especially if it's happening every day). And if I'm not mistaken, most feeds don't even get updated daily, so it's sort of futile anyway. I'm simply trying to warn people to watch what they do and to keep in contact with their host so they don't run into trouble. I'd also recommend only updating your feeds once a week, at a time when there's not much traffic (i.e. 3 AM using a cron job).

  17. #17
    ABW Ambassador Greg Rice's Avatar
    Join Date
    January 18th, 2005
    Location
    Ohio
    Posts
    4,889
Well, monthly bandwidth isn't a problem for me. One thing I don't know: if I were to use a site-to-site transfer, I have no idea where to put the file. My hosting provider offers phpMyAdmin, so I currently upload the files through it.

    Another possibility could be to compare the current datafile to the last one that was uploaded and only upload the changed products. Any idea if this is possible? If it is, how could you identify products that were dropped and remove them from the working table?

Pete and CrazyGuy, thanks for your time, I appreciate it.
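Comparing feeds is doable. A sketch that diffs two feed files by SKU (assumed here to be the first column; the file names are placeholders). The 'removed' list gives you the SKUs to DELETE from your working table:

```php
<?php
// Sketch: compare yesterday's feed to today's by SKU (assumed to be
// column 0) to find added and dropped products.

function diff_feeds(string $oldPath, string $newPath): array
{
    $readSkus = function (string $path): array {
        $skus = [];
        $in = fopen($path, 'r');
        while (($row = fgetcsv($in)) !== false) {
            $skus[$row[0]] = true; // index by SKU for fast lookup
        }
        fclose($in);
        return $skus;
    };

    $old = $readSkus($oldPath);
    $new = $readSkus($newPath);

    return [
        'added'   => array_keys(array_diff_key($new, $old)),
        'removed' => array_keys(array_diff_key($old, $new)),
    ];
}

// Usage sketch (table/column names are hypothetical):
// $diff = diff_feeds('feed_yesterday.csv', 'feed_today.csv');
// then INSERT the 'added' rows and
// DELETE FROM products WHERE sku IN ('removed' SKUs...)
```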

  18. #18
    ABW Ambassador
    Join Date
    January 18th, 2005
    Posts
    1,086
    The fastest way to upload data is generally to use whatever data loading procedures came with your database.

If your host doesn't allow file uploads, I've found that you can stuff extremely large amounts of data into a textarea box; I am not sure of the limit. To do this, you simply create a form with a textarea, paste a large amount of data into the box, and press your submit button.

    Your PHP program will need to parse the data in the textarea variable and create sql statements that you can send to the DB.

If your data is in CSV (with quotes and properly escaped) you might be able to stuff it into SQL statements directly, line by line. I generally explode the inner string and validate the data.

Some PHP pseudo-code:

$sqlStart = "INSERT INTO My_Table (colone, coltwo, colthree ...) VALUES (";

$tok = strtok($textareaString, "\n");
while ($tok) {
    // add more validation here
    // e.g. $array = explode(",", $tok) ...
    echo $sqlStart . $tok . ");<br>\n";
    // execute sql statement
    $tok = strtok("\n");
}

    The important thing to note in the code is that you use strtok to loop through the big blob of text. You can use explode on the inner loop.

    Protophoto - Short Stories

  19. #19
    Member
    Join Date
    January 18th, 2005
    Posts
    63
    <BLOCKQUOTE class="ip-ubbcode-quote"><font size="-1">quote:</font><HR>Another possibility could be to compare the current datafile to the last one that was uploaded and only upload the changed products. Any idea if this is possible?<HR></BLOCKQUOTE>
    It's part of what I stated earlier. In order to do this, you'd first have to download it to your own computer, then check for changes, then upload it to your server. You could still do it all with php but you'd have to install apache and php on your own computer.

  20. #20
    ABW Ambassador iucpxleps's Avatar
    Join Date
    January 18th, 2005
    Posts
    648
<BLOCKQUOTE class="ip-ubbcode-quote"><font size="-1">quote:</font><HR>Originally posted by yintercept:
The fastest way to upload data is generally to use whatever data loading procedures came with your database. [...]<HR></BLOCKQUOTE>


You could try using file() to read the datafeed line by line into an array, then process it the way you say. Although I know what to do in my head, I'm still trying to figure out how to do it. Well, I have a long weekend ahead of me :P

    He who steals a minaret prepares a proper
    cover beforehand, said of someone who intends to do something illegal.

  21. #21
    ABW Ambassador
    Join Date
    January 18th, 2005
    Posts
    543
Hi,
I'm just learning MySQL & PHP. Some questions:

How can we create thousands of pages with that datafeed? I mean, can you show me the PHP script/function that creates them?

i.e.

echo("whatever");

Sorry, but I'm still confused.
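Not an answer from the thread, but one way to sketch "thousands of pages": a template function plus a loop over a products table. The field names, table name, and output paths are all assumptions here.

```php
<?php
// Sketch: render one HTML page per product from an associative array.
// Field names ('name', 'price', 'url') are placeholders -- match them
// to your own table's columns.

function product_page(array $p): string
{
    $name = htmlspecialchars($p['name']);
    return "<html><head><title>$name</title></head><body>"
         . "<h1>$name</h1>"
         . "<p>Price: " . htmlspecialchars($p['price']) . "</p>"
         . "<p><a href=\"" . htmlspecialchars($p['url']) . "\">Buy now</a></p>"
         . "</body></html>";
}

// Usage sketch: loop over your products table, one file per row.
// $result = mysqli_query($db, 'SELECT sku, name, price, url FROM products');
// while ($p = mysqli_fetch_assoc($result)) {
//     file_put_contents("pages/{$p['sku']}.html", product_page($p));
// }
```

Many sites skip the static files entirely and instead generate the page on the fly from a query-string parameter (one product.php that looks up the SKU in the database).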
