  1. #1
    Join Date
    October 12th, 2005
    I frigging did it again ..
    Otay, the problem lies in the available space for tmp processing ... you can't kill a repair either. But ... I killed the repair prior to it deleting the table completely ... so I had a partial db.
    Holiday weekend, so I can't get all the ftp approvals that I wanted. So I got weary and did something.
    Maybe not as helpful as Dirk, but ... create a table (as I understand it, the SAS feeds share this layout) ... note the fulltext index is one I wanted; it may be deleted or changed.

    CREATE TABLE bigun (
    ItemNumber varchar(25) NOT NULL default '',
    ItemName varchar(255) NOT NULL default '',
    VendID varchar(25) NOT NULL default '',
    VendName varchar(100) NOT NULL default '',
    ItemLink varchar(255) NOT NULL default '',
    ItemThumb varchar(255) NOT NULL default '',
    ItemLarge varchar(255) NOT NULL default '',
    ItemPrice mediumint(15) NOT NULL default '0',
    AltPrice varchar(15) NOT NULL default '',
    GenCat varchar(255) NOT NULL default '',
    SpecCat varchar(255) NOT NULL default '',
    Description varchar(255) NOT NULL default '',
    SearchTerms varchar(255) NOT NULL default '',
    Custom1 varchar(100) NOT NULL default '',
    Custom2 varchar(100) NOT NULL default '',
    Custom3 varchar(100) NOT NULL default '',
    Custom4 varchar(100) NOT NULL default '',
    Custom5 varchar(100) NOT NULL default '',
    Stock varchar(100) NOT NULL default '',
    PRIMARY KEY (ItemNumber),
    FULLTEXT KEY ItemName (ItemName,VendName,Description)
    ) ENGINE=MyISAM;
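    For clarity, here is what the script below expects a feed line to look like: one record per line, pipe-delimited, in the same column order as the table above. A minimal sketch (the sample values are invented, not real SAS data) of how the split carves one up:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical feed line -- sample values only, in the bigun column order
    my $line = "SKU-1|Widget|V42|Acme Tools|http://example.com/w|thumb.jpg|large.jpg|1999|19.99|Tools|Hand Tools|A fine widget|widget hand tool|c1|c2|c3|c4|c5|instock";

    # Split on the pipe; the positive 19 limit keeps trailing empty columns in place
    my @tabledata = split(/\s*\|\s*/, $line, 19);

    print "columns:    " . scalar(@tabledata) . "\n";   # 19
    print "ItemNumber: $tabledata[0]\n";                # SKU-1
    print "VendName:   $tabledata[3]\n";                # Acme Tools
    print "Stock:      $tabledata[18]\n";               # instock
    ```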
    Then, when downloading the feeds from SAS (it was the quickest access to the most items to fill in the holes), save them and later ftp them, via ascii, all to the same folder on the server as-is (consistency is cool). All of them - 1 to 100.
    Modify the following and you have a populated db to tinker with. (Only tested to fill in gaps ... plus a little raw test, and it added some 34,000 records to a new db, using only the feeds that began with "c", in about 3 minutes.)
    So here might be your introduction to feeds (again, the index may need to be modified to your needs ... and I may need more sleep).
    This script will run the gamut through the directory, picking up files that end with .txt (perhaps .html too, as it was one used for other purposes), and place the data in the db. It should strip the html, the tabs, and the crazy "\t" entered by too many for not knowing.
    It probably needs work (feel free to hack), but it was simply quick, down and dirty, with the eyes/mind a-dragging. Food for thought at least ...
    Final note ... offered as-is and for free -> do not come up with another db tool to sell.
    Enough ... good luck and enjoy (I hope). Named adamnit.cgi for I had had enough but wanted it up top in the file listing. Hope it comes out in print usable ...

    use Mysql;
    use CGI::Carp qw(fatalsToBrowser);
    $|++; ## unbuffer output so the progress lines print as the script runs

    my ($dbh, $sth, $sql);
    my $host = '';
    my $database = 'db_name';
    my $user = 'user_name';
    my $dbpassword = 'db_password';
    my $row = "ItemNumber,ItemName,VendID,VendName,ItemLink,ItemThumb,ItemLarge,ItemPrice,AltPrice,GenCat,SpecCat,Description,SearchTerms,Custom1,Custom2,Custom3,Custom4,Custom5,Stock";
    my $table = "bigun"; ## table name
    my $old_path = "<[^>]*>"; ## matches html tags
    my $notabs = "\t"; ## real tab characters
    my $notab = "\\t"; ## the literal two-character "\t" some feeds carry
    my $new_path = ""; ## what the above are replaced with
    my $fields = 19; ## columns in the feed/table
    my $file_dir = "/directory_where_you_put_the_files/SAS";
    ### no editing below required ( I think)
    print "Content-type: text/html\n\n";

    if ($dbh = Mysql->connect($host, $database, $user, $dbpassword)) {

    opendir(THEFILES, $file_dir) || die("Unable to open directory $file_dir");
    while (my $afile = readdir(THEFILES)) {
    next if (-d "$file_dir/$afile"); ## skip subdirectories
    next unless ($afile =~ /\.(txt|html)$/); ## feed files only
    print qq|Working on <B>$afile</B><BR>\n|;
    open(FEED, "$file_dir/$afile") || die("Unable to open $file_dir/$afile");
    while (my $line = <FEED>) {
    chomp($line);
    $line =~ s/$old_path/$new_path/ig; ## strip html
    $line =~ s/$notabs/$new_path/g; ## strip real tabs
    $line =~ s/\Q$notab\E/$new_path/g; ## strip the literal "\t"
    next if ($line eq ""); ## skip blank lines
    my @tabledata = split(/\s*\|\s*/, $line, $fields);
    s/'/\\'/g for @tabledata; ## escape quotes so the INSERT survives them
    my $itemnumber = $tabledata[0];
    my $itemname = substr($tabledata[1], 0,254);
    my $vendid = $tabledata[2];
    my $vendname = $tabledata[3];
    my $itemlink = $tabledata[4];
    my $itemthumb = $tabledata[5];
    my $itemlarge = $tabledata[6];
    my $itemprice = $tabledata[7];
    my $altprice = $tabledata[8];
    my $gencat = substr($tabledata[9], 0,254);
    my $speccat = substr($tabledata[10], 0,254);
    my $description = substr($tabledata[11], 0,254);
    my $searchterms = substr($tabledata[12], 0,254);
    my $custom1 = substr($tabledata[13], 0,254);
    my $custom2 = substr($tabledata[14], 0,254);
    my $custom3 = substr($tabledata[15], 0,254);
    my $custom4 = substr($tabledata[16], 0,254);
    my $custom5 = substr($tabledata[17], 0,254);
    my $stock = substr($tabledata[18], 0,254);
    $sql = "INSERT INTO $table ($row) VALUES ('$itemnumber', '$itemname', '$vendid', '$vendname', '$itemlink', '$itemthumb', '$itemlarge', '$itemprice', '$altprice', '$gencat', '$speccat', '$description', '$searchterms', '$custom1', '$custom2', '$custom3', '$custom4', '$custom5', '$stock')";
    $sth = $dbh->Query($sql) || print qq|Insert failed on $itemnumber<BR>\n|;
    } # end line loop
    close(FEED);
    print qq|Done with $afile<BR>\n|;
    } # end file loop
    closedir(THEFILES);
    print qq|Done with $file_dir\n|;

    } else { # end if -- connect failed
    print qq|Couldn't connect to database\n|;
    }
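    One gotcha with building the INSERT by hand: an apostrophe in an item name (think "Bob's Widget") breaks the quoted VALUES list. A small sketch of the backslash-escaping the loop above does, with made-up values:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical feed values -- note the apostrophe that would break the INSERT
    my @tabledata = ("SKU-2", "Bob's Widget", "V42");

    # Backslash-escape single quotes so the quoted VALUES list stays intact
    s/'/\\'/g for @tabledata;

    my $sql = "INSERT INTO bigun (ItemNumber,ItemName,VendID) VALUES ('"
            . join("', '", @tabledata) . "')";
    print "$sql\n";
    ```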

  2. #2
    Join Date
    October 12th, 2005
    otay, the great fireball is coming up and I'm ready to go down but ...
    $notab = "\t";
    should be
    $notabs = "\t";
    And note that the data is still raw ... you will need to substitute the userid with your own as you utilize the data.
    This is for SAS ... though easily modified.
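    To put that userid swap in code: a one-line substitution sketch. The link format here is a made-up placeholder (example.com, hypothetical b/u/m parameters) - pattern the match after the links in your actual feed:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical ItemLink pulled from the db -- the u= parameter carries the userid
    my $itemlink = "http://www.example.com/r.cfm?b=1234&u=userid&m=5678";

    # Swap the feed's placeholder userid for your own before display
    my $myid = "99999";
    (my $mylink = $itemlink) =~ s/u=userid/u=$myid/;

    print "$mylink\n";
    ```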
