Update MySQL Table Using Inner Join Problem

March 25th, 2009, 10:27 AM #1
I am working on a PHP script that will update a MySQL table using another MySQL table. TableA and TableB have the same field names. Each table has around 150,000 records. TableA is my main table that has my live data. TableB has some potential updates, such as price for one example, that I want to apply to TableA.
TableA: ID PROGRAM PRICE ....
TableB: ID PROGRAM PRICE ....
"UPDATE TableA A INNER JOIN TableB B ON A.ID = B.ID AND A.PROGRAM = B.PROGRAM SET A.PRICE = B.PRICE";
Now this is a valid MySQL query, but after running for a minute, I see the following message:
Unable to query: Server shutdown in progress
I was thinking that an update with a join would be the quickest way to do this, but it looks like I may have gone over some memory limit, although we're only talking about 150K records.
March 25th, 2009, 11:08 AM #2
I'm no database expert but here is how I update one table with another:
UPDATE TableA, TableB SET TableA.PRICE = TableB.PRICE WHERE TableA.ID = TableB.ID
Of course this only updates one field at a time, so I'm not sure if it can be used for multiple fields at the same time.
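For what it's worth, MySQL does take several assignments in one SET clause if you separate them with commas; a sketch (STOCK is just a made-up second column):

UPDATE TableA, TableB
SET TableA.PRICE = TableB.PRICE,
    TableA.STOCK = TableB.STOCK
WHERE TableA.ID = TableB.ID;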
March 25th, 2009, 11:42 AM #3
Using a simple loop in PHP will be faster (there's a sketch after the outline):
For each row in Table A:
    Query Table B using the Table A keys
    Update Table A with the Table B data
End For Loop
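A minimal sketch of that loop with mysqli, assuming the ID, PROGRAM, and PRICE columns from the first post (connection details are placeholders):

<?php
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// Read Table A's keys (buffered, so other statements can run meanwhile)
$rows = $db->query("SELECT ID, PROGRAM FROM TableA");

// Prepare the per-row lookup and update once, outside the loop
$sel = $db->prepare("SELECT PRICE FROM TableB WHERE ID = ? AND PROGRAM = ?");
$upd = $db->prepare("UPDATE TableA SET PRICE = ? WHERE ID = ? AND PROGRAM = ?");

while ($a = $rows->fetch_assoc()) {
    $sel->bind_param('is', $a['ID'], $a['PROGRAM']);
    $sel->execute();
    $sel->store_result();   // buffer so the update can run on the same connection
    $sel->bind_result($price);
    if ($sel->fetch()) {
        $upd->bind_param('dis', $price, $a['ID'], $a['PROGRAM']);
        $upd->execute();
    }
    $sel->free_result();
}
?>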
March 25th, 2009, 01:05 PM #4
Looping through each record takes forever.
March 25th, 2009, 05:05 PM #5
What I do is run a PHP script that checks the data in TableB and compares it to the data in TableA. I create objects representing the data in both tables and serialize them to compare the data. If there is a variation then I update TableA with the data in TableB. This way there are no unnecessary updates to TableA.
I do this for every product, one merchant at a time. 150k products isn't really that many to loop through though.
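The comparison itself boils down to something like this ($live and $feed are the same product's row fetched as associative arrays from TableA and TableB; updateTableA() is a made-up helper):

// serialize() turns each row into a string, so one comparison
// covers every column at once
if (serialize($live) !== serialize($feed)) {
    updateTableA($db, $feed); // only write when something actually changed
}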
March 26th, 2009, 10:50 AM #6
The biggest issue to me is that when you do a mass update as a single statement, it can "lock" the table until the statement finishes. If your web site is pulling data from that table, pages will time out. Also, it's easy to miss "deletes" in a mass update: if a product was in the datafeed at one time and no longer is, what do you want to do with it?
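If flagging them rather than deleting is acceptable, a LEFT JOIN catches the orphans in one pass (the ACTIVE column here is made up):

UPDATE TableA A
LEFT JOIN TableB B ON A.ID = B.ID AND A.PROGRAM = B.PROGRAM
SET A.ACTIVE = 0
WHERE B.ID IS NULL;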
March 26th, 2009, 11:10 AM #7
I ended up giving up on this. I'm too limited by the memory cap on my shared hosting setup.
March 26th, 2009, 08:40 PM #8
Could always try brute force and raise PHP's limits at the beginning of the script, something like:
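// Values are guesses, and many shared hosts cap or ignore these
ini_set('memory_limit', '256M');
set_time_limit(0); // remove the execution time limit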
Better still, is there some way to break it into smaller segments? Like Michael said, you don't want to "lock" up a table.
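For example, the join update from the first post could be run over slices of the ID range (the batch size is arbitrary):

UPDATE TableA A INNER JOIN TableB B ON A.ID = B.ID AND A.PROGRAM = B.PROGRAM
SET A.PRICE = B.PRICE
WHERE A.ID BETWEEN 1 AND 10000;
-- then 10001 to 20000, and so on; no single statement holds the lock for long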
March 26th, 2009, 08:42 PM #9
Or could you do the update locally and then upload the finished table to your remote server for your live table? Kind of a pain, but it could be the easiest way in the end.
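Roughly, with placeholder names and credentials: run the update on a local copy, dump just that table, and load it on the live server:

mysqldump -u user -p localdb TableA > TableA.sql
mysql -h live.example.com -u user -p livedb < TableA.sql

By default the dump includes a DROP/CREATE for the table, so the live side gets swapped out in one load rather than updated row by row.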