  1. #1
    Newbie
    Join Date
    January 18th, 2005
    Posts
    4
    I have a weird problem and wonder if others have had similar issues.

    I have a database with over 12,000 records that I'm processing, but the build pages function always seems to stall out at record 9984...

    I thought maybe this was simply due to a bad record that was confusing WebMerge, so I got rid of that record, but build pages still stalls out at record 9984.

    I've processed larger numbers of records without a hitch before...

    Anybody have any ideas of what might cause this?

    Thanks,

    Roger

  2. #2
    Full Member
    Join Date
    January 18th, 2005
    Posts
    413
    I had a problem similar to this a while back. In my case the culprit was a quotation mark (") in the Category field.

    Check any field that you are using to generate file names to make sure that it does not include unwanted characters, e.g., / ( ) ".

    It's OK to have those characters in a "Description" field, but they cause problems when they are in the Category and Sub-Category fields (if you use those to generate file names).
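
    If you want to pre-flight the feed before a build, a small script can flag the offending records. Here's a rough sketch in Python; the feed name, delimiter, and field names are assumptions, so adjust them to match your own database export:

    [CODE]
    import csv

    FEED = "datafeed.txt"                        # assumption: tab-delimited export of the database
    NAME_FIELDS = ["Category", "Sub-Category"]   # the fields you use to build file names
    BAD_CHARS = set('/\\()"<>:*?|') | {"\x00"}   # characters the OS may reject, plus NULL bytes

    with open(FEED, newline="", encoding="utf-8", errors="replace") as f:
        reader = csv.DictReader(f, delimiter="\t")
        for row_num, row in enumerate(reader, start=2):   # row 1 is the header line
            for field in NAME_FIELDS:
                value = row.get(field) or ""
                hits = sorted(set(value) & BAD_CHARS)
                if hits:
                    print(f"Record {row_num}: {field} contains {hits}: {value!r}")
    [/CODE]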

    Also, you might check the record that occurs just prior to #9984. It could be the one causing the problem.

    Finally, some advice from Richard (from a previous post):

    =================
    RICHARD WROTE:
    An operating system can return an error to WebMerge when WebMerge tries to create a file for any number of reasons. It's not possible to guess as to what could be at play here without knowing what the path is that's shown with the error message.

    In most cases seeing the full path will tell us what the problem is in short order, which is why I wrote the error-handling to display it.

    Please send me an email with the full path as shown in the error dialog and I should be able to determine why the OS is having a problem with it.
    ========================

    Have a Creative Day!
    FRANK, Baertracks
    Be More Creative

  3. #3
    Newbie
    Join Date
    January 18th, 2005
    Posts
    4
    Thanks for trying to help, Frank... alas, I'm still unable to fix the problem.

    I did go and check the record and the one before it, but they both seem fine.

    In this instance I'm using serialized names for the pages so they're not pulling names from any given category.

    I think Richard may be right that it's an OS issue, because sometimes when I go to look in a folder in which I have generated a lot of pages, Windows Explorer will crap out, unable to handle the large number of files.

    Unfortunately I'm unable to read the error log when I do this build, because WebMerge simply stops responding and I have to go in and end the process.

    I'm using Windows XP with over 700 MB of RAM, so I wouldn't think it would be such an issue... Anybody got any other ideas? Or know a way to assign more memory to WebMerge?

    Much obliged-

    Roger

  4. #4
    Affiliate Manager
    Join Date
    January 18th, 2005
    Location
    Los Angeles, California
    Posts
    1,913
    Originally posted by codetrance:
    Thanks for trying to help, Frank... alas, I'm still unable to fix the problem.

    I did go and check the record and the one before it, but they both seem fine.

    In this instance I'm using serialized names for the pages so they're not pulling names from any given category.

    I think Richard may be right that it's an OS issue, because sometimes when I go to look in a folder in which I have generated a lot of pages, Windows Explorer will crap out, unable to handle the large number of files.

    Unfortunately I'm unable to read the error log when I do this build, because WebMerge simply stops responding and I have to go in and end the process.

    I'm using Windows XP with over 700 MB of RAM, so I wouldn't think it would be such an issue... Anybody got any other ideas? Or know a way to assign more memory to WebMerge?

    You should not need to manually adjust WebMerge's memory on Windows or OS X. Only Mac Classic requires that for handling large files.

    If you're absolutely confident the data does not have any odd characters in it (I've seen NULL bytes present in text which will cause errors), it may be that one of the detail files being generated has the same name as an index file; with the index file still open to write, the detail page cannot be written.

    With serialized pages, remember that WebMerge 2.3 and earlier only handle 9,999 pages due to the four-character serialization limit (this is changing in v2.4). It has never been reported as a problem, since most large data sets also contain a unique identifier (for product feeds it's usually the SKU or product ID field), and using that unique identifier field for generated page names is strongly recommended: it allows any number of pages without fear of overwriting any.
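
    To illustrate the naming idea (this is just a sketch, not WebMerge's own code), the page name can be derived from the unique ID column instead of a running serial number; the feed and column names below are assumptions:

    [CODE]
    import csv
    import re

    FEED = "datafeed.txt"   # assumption: tab-delimited feed with a unique "SKU" column

    seen = set()
    with open(FEED, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            # Replace anything that isn't filename-safe, then add the extension.
            name = re.sub(r"[^A-Za-z0-9_-]", "_", row["SKU"].strip()) + ".html"
            if name in seen:
                print("Collision: two records map to", name)
            seen.add(name)
    # A serial scheme like f"detail{n:04d}.html" tops out at detail9999.html,
    # while an ID-based name has no such ceiling and never overwrites another page.
    [/CODE]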

    --
    Richard Gaskin
    Fourth World Media Corporation
    Developer of WebMerge: Publish any database on any site
    ___________________________________________________________
    Ambassador@FourthWorld.com http://www.FourthWorld.com
    Tel: 323-225-3717 AIM: FourthWorldInc

  5. #5
    Newbie
    Join Date
    January 18th, 2005
    Posts
    4
    Hey All,

    I think I have come up with a workaround for my problem, so maybe this will work for you if you have the same issue:

    Basically what I did was create a new field in the database called ProductIDnew.

    For half the records in the DB, I put this in that field: folder_A/youruniqueitemID

    For the other half, I put this in: folder_B/youruniqueitemID

    On the WebMerge detail page you can set "Name Based on contents of this Field". If you select the ProductIDnew field, WebMerge will generate detail pages into two folders, folder_A and folder_B.

    You can set this up to generate as many folders as you wish.

    The issue is that Windows doesn't like too many items in one folder. By splitting the 12,000 records into two folders of 6,000 each, Windows was able to handle it, and WebMerge was able to complete the build.

    This might be an idea for a new feature for WebMerge: have it automatically create a new folder every 5,000 records or so, instead of having to create fields in the database to make it do this. I'm sure it would help with the issues around handling large numbers of files.
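
    Until a feature like that exists, the same bucketing can be done as a quick preprocessing step on the feed itself. A rough sketch, assuming a tab-delimited feed with a unique "ProductID" column (both names are placeholders); it writes the "folder/ID" values into a new ProductIDnew column, starting a new folder every 5,000 records:

    [CODE]
    import csv

    FEED_IN = "datafeed.txt"          # assumption: tab-delimited feed
    FEED_OUT = "datafeed_split.txt"   # same feed with a ProductIDnew column added
    PER_FOLDER = 5000                 # start a new folder every 5,000 records

    with open(FEED_IN, newline="", encoding="utf-8", errors="replace") as src, \
         open(FEED_OUT, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src, delimiter="\t")
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames + ["ProductIDnew"],
                                delimiter="\t")
        writer.writeheader()
        for i, row in enumerate(reader):
            folder = f"folder_{i // PER_FOLDER:02d}"   # folder_00, folder_01, ...
            row["ProductIDnew"] = f"{folder}/{row['ProductID']}"
            writer.writerow(row)
    [/CODE]

    Point the detail page's "Name Based on contents of this Field" setting at ProductIDnew and each batch of 5,000 pages lands in its own folder.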

    Alternatively, maybe someone could tell me how to assign more memory to Windows Explorer so that it could handle this...

    Best Regards,

    Roger

  6. #6
    Newbie
    Join Date
    January 18th, 2005
    Posts
    4
    Ahh... so it was because I was doing serialized pages! Dope!

    So much for my fancy shmancy workaround :P

    Thanks Richard.

  7. #7
    Affiliate Manager
    Join Date
    January 18th, 2005
    Location
    Los Angeles, California
    Posts
    1,913
    Originally posted by codetrance:
    Ahh... so it was because I was doing serialized pages! Dope!

    So much for my fancy shmancy workaround :P

    No problem. Just use the product ID or other unique identifying field and you should be good to go.

    --
    Richard Gaskin
    Fourth World Media Corporation
    Developer of WebMerge: Publish any database on any site
    ___________________________________________________________
    Ambassador@FourthWorld.com http://www.FourthWorld.com
    Tel: 323-225-3717 AIM: FourthWorldInc

  8. #8
    ABW Ambassador buy_online
    Join Date
    January 18th, 2005
    Location
    Richmond, VA
    Posts
    3,234
    "The issue is that windows doesn't like too many items in one folder. By splitting the 12,000 records into two folders with 6000 each."

    Exactly! That was my workaround too. Not fun, but Win98 didn't like WM writing all those files to one directory. I pulled my hair out trying to get it fixed - changing/fixing/altering records - nothing worked.

    My final solution was to convert an old PC into a Linux box hung off my LAN. I also added a new 20 GB drive.

    So far, I have had WM process almost 50,000 files at one time. This also solved another problem I had, which was FTPing to the site. Linux is so robust that I am able to do all of these things without spending lots of money - and it always works.

    Fred

    Are you sure the nurses know you're using the computer?

  9. #9
    Affiliate Manager
    Join Date
    January 18th, 2005
    Location
    Los Angeles, California
    Posts
    1,913
    Originally posted by buy_online:
    "The issue is that windows doesn't like too many items in one folder. By splitting the 12,000 records into two folders with 6000 each."

    Exactly! That was my workaround too. Not fun, but Win98 didn't like WM writing all those files to one directory. I pulled my hair out trying to get it fixed - changing/fixing/altering records - nothing worked.

    My final solution was to convert an old PC into a Linux box hung off my LAN. I also added a new 20 GB drive.

    So far, I have had WM process almost 50,000 files at one time. This also solved another problem I had, which was FTPing to the site. Linux is so robust that I am able to do all of these things without spending lots of money - and it always works.

    Great story -- chalk up one more for the penguin!

    Would it be helpful if you had a Linux version of WebMerge so you could do everything from your Linux box?

    --
    Richard Gaskin
    Fourth World Media Corporation
    Developer of WebMerge: Publish any database on any site
    ___________________________________________________________
    Ambassador@FourthWorld.com http://www.FourthWorld.com
    Tel: 323-225-3717 AIM: FourthWorldInc

  10. #10
    ABW Ambassador Mike O
    Join Date
    January 18th, 2005
    Location
    Los Angeles area
    Posts
    843
    FWIW, I used WM to build 30,000 pages from the BettyMills feed, and put 'em all into one directory with no problem in Windows XP.

    Perhaps the limitation was in Windows 98.

    -- Mike

    "Men travel faster now, but I do not know if they go to better things."
    -- Willa Cather

  11. #11
    Full Member
    Join Date
    January 18th, 2005
    Posts
    413
    I had the same experience as Mike. 30,000 files in one directory in WinXP is no problem.

    Have a Creative Day!
    FRANK, Baertracks
    Be More Creative

  12. #12
    Affiliate Manager
    Join Date
    January 18th, 2005
    Location
    Los Angeles, California
    Posts
    1,913
    Originally posted by MikeO:
    FWIW, I used WM to build 30,000 pages from the BettyMills feed, and put 'em all into one directory with no problem in Windows XP.

    Perhaps the limitation was in Windows 98.

    I've had reports like this from users running Win98, while it works great on Me, 2K, and XP.

    Micro$oft will stop supporting Win98 soon. Time to upgrade? The security enhancements in XP make it well worth the trouble. Win98 is like Swiss cheese to a hacker.

    --
    Richard Gaskin
    Fourth World Media Corporation
    Developer of WebMerge: Publish any database on any site
    ___________________________________________________________
    Ambassador@FourthWorld.com http://www.FourthWorld.com
    Tel: 323-225-3717 AIM: FourthWorldInc

  13. #13
    Member
    Join Date
    January 18th, 2005
    Posts
    145
    I'm a bit confused by this...

    Is the Windows 98 problem regarding thousands of files in one directory only an issue if you have W98 on YOUR machine?

    Or more seriously, does this also mean that anyone who tries to view your site in W98 will encounter errors - if you have thousands of files in a single directory?

  14. #14
    Affiliate Manager
    Join Date
    January 18th, 2005
    Location
    Los Angeles, California
    Posts
    1,913
    Originally posted by Tiebreaker:
    I'm a bit confused by this...

    Is the Windows 98 problem regarding thousands of files in one directory only an issue if you have W98 on YOUR machine?

    Or more seriously, does this also mean that anyone who tries to view your site in W98 will encounter errors - if you have thousands of files in a single directory?

    Based on what I can gather from anecdotal evidence only (I've been unable to find a Micro$oft tech note on this), it appears that there may be a limit on the number of files that can be written to a given directory in Windows 98. I've had no other reports of such incidents on any later Microsoft OS (nor on Mac or any server OS).

    Given what appears to be the nature of the problem, your site visitors would only be affected if they were running Win98 and downloaded every one of your thousands of pages to a single folder on their hard drive. I doubt many would care to do that.

    --
    Richard Gaskin
    Fourth World Media Corporation
    Developer of WebMerge: Publish any database on any site
    ___________________________________________________________
    Ambassador@FourthWorld.com http://www.FourthWorld.com
    Tel: 323-225-3717 AIM: FourthWorldInc

  15. #15
    Member
    Join Date
    January 18th, 2005
    Posts
    145
    Thanks for putting my mind at ease, Richard. I thought I could be losing valuable traffic - there are still a lot of W98 users out there.

  16. #16
    ABW Veteran Mr. Sal
    Join Date
    January 18th, 2005
    Posts
    6,795
    Count me in. I just uploaded over 1,300 pages two weeks ago, and I still use Windows 98 and FrontPage Explorer Version 3.0.3, and so far, so good.

    Sal.

  17. #17
    Affiliate Manager
    Join Date
    January 18th, 2005
    Location
    Los Angeles, California
    Posts
    1,913
    Originally posted by Mr. Sal:
    Count me in. I just uploaded over 1,300 pages two weeks ago, and I still use Windows 98 and FrontPage Explorer Version 3.0.3, and so far, so good.

    Sal.

    Should be no problem: the issue I've seen only crops up when you have more than about 9,200 files in a single directory.

    --
    Richard Gaskin
    Fourth World Media Corporation
    Developer of WebMerge: Publish any database on any site
    ___________________________________________________________
    Ambassador@FourthWorld.com http://www.FourthWorld.com
    Tel: 323-225-3717 AIM: FourthWorldInc

  18. #18
    Member
    Join Date
    January 18th, 2005
    Posts
    145
    I've experimented with Windows 98, putting all my detail pages in one directory...

    WebMerge stopped running at record number 32,766.

    (I'm not sure about this 9,200 records figure that Richard quotes - I've never had a problem with that amount)

    There is no problem with the feed - when I place the files into different directories the whole feed runs fine - over 40,000 records - so I know the problem is purely having over 32,766 files in a single directory.

    Time to upgrade my machine to 2K or XP.

    We've had a couple of posts saying 30,000 files are no problem in XP - but W98 can process 30,000 files too - are we sure there are no problems with XP once you get to 40-50,000+ files?

  19. #19
    Affiliate Manager
    Join Date
    January 18th, 2005
    Location
    Los Angeles, California
    Posts
    1,913
    Originally posted by Tiebreaker:
    I've experimented with Windows 98, putting all my detail pages in one directory...

    WebMerge stopped running at record number 32,766.

    (I'm not sure about this 9,200 records figure that Richard quotes - I've never had a problem with that amount)

    There is no problem with the feed - when I place the files into different directories the whole feed runs fine - over 40,000 records - so I know the problem is purely having over 32,766 files in a single directory.

    Time to upgrade my machine to 2K or XP.

    We've had a couple of posts saying 30,000 files are no problem in XP - but W98 can process 30,000 files too - are we sure there are no problems with XP once you get to 40-50,000+ files?

    It has not been reported as a problem, so I think you're safe on systems later than Win98.

    32,766 sounds more logical (it sits just shy of the 16-bit signed integer limit of 32,767 - a round number in binary); I think there were other factors involved in Win98 throwing errors at 9,200+ files.

    --
    Richard Gaskin
    Fourth World Media Corporation
    Developer of WebMerge: Publish any database on any site
    ___________________________________________________________
    Ambassador@FourthWorld.com http://www.FourthWorld.com
    Tel: 323-225-3717 AIM: FourthWorldInc

  20. #20
    Member
    Join Date
    January 18th, 2005
    Location
    Australia
    Posts
    118
    Originally posted by MikeO:
    FWIW, I used WM to build 30,000 pages from the BettyMills feed, and put 'em all into one directory with no problem in Windows XP.

    I can believe that, but I think that statement needs to be qualified. I have an older PC with a 1 GHz P3 CPU running WinXP Pro with 512 MB RAM, and I cannot get past 21,959 files in one directory. By splitting the data file, I can build the remaining pages in another directory.

    I haven't looked up (on the web) what variables lead to such blockages, but I suspect it is system RAM. I cannot copy the files from the second directory into the main one. Neither can I select those 21,959 files to zip them up.

    FWIW.
    [URL=http://www.netmagellan.com/]Net Magellan blog[/URL]

  21. #21
    ABW Ambassador Mike O
    Join Date
    January 18th, 2005
    Location
    Los Angeles area
    Posts
    843
    Might be the RAM, but I've only got twice that, at 1 GB.

    Does it work if you shut down most other programs first? That should free up more RAM.
    A joy shared is a joy doubled.
    A burden shared is a burden lightened.

  22. #22
    ABW Ambassador buy_online
    Join Date
    January 18th, 2005
    Location
    Richmond, VA
    Posts
    3,234
    In my tests, it seems to be the operating system or maybe RAM. For example, in Win98 WM chokes after a while, depending on the number of files and the OS. I have 512 MB of RAM and have never run it all the way down during the analyzing process. I have also had situations where the datafeed itself caused WM to stop; although I cannot reproduce the error, it does happen. If it is indeed Betty Mills, it may be the feed - it's happened to me before, and I ended up cutting the feed in half after separating the records by the category I wanted to sort by.

    I have also had problems (in the old days) when I had 256 MB of RAM and ran it all the way down. (I use a tool called "Cacheman" that gives me an up-to-the-second readout of available RAM.) WM called it quits in that situation. So the finger I have been pointing has been aimed (mostly) at the OS. I would think that Win2K and above are far more robust than Win98, though, so it's a little weird that this problem is occurring. It may simply be the feed.

    As I've posted before, my current solution is to write the files to a better operating system. Just last week, I ran a feed through that had more than 86,000 records non-stop! The trick was to have WM run on my Win98 machine and write the files across my network to my Linux machine. Never a hiccup with this configuration; it's worked dozens of times.

    I have a back-burner project to get WebMerge running under WINE, so that it will run right on the Linux machine. I'll let everyone know how it works.

    Fred

  23. #23
    Member
    Join Date
    January 18th, 2005
    Location
    Australia
    Posts
    118
    I dug deeper. I forgot that I had left this drive as FAT32 - it occasionally comes in handy when networking with older machines. Now, FAT32 can handle 65,536 files in a directory (see this doc from Microsoft), but it also mentions that "the use of long file names can significantly reduce the number of available files and subfolders within a folder." I have tried to build the entire BM set in one folder for SEO reasons, so I have hit this limit. I'll look to prune bytes here and there in the directory names, or use one of my NTFS drives.
    [URL=http://www.netmagellan.com/]Net Magellan blog[/URL]
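
    For anyone wondering how much of that 65,536-entry budget long names eat up: as I understand the FAT32 long-file-name scheme (an assumption on my part, not something stated in the Microsoft doc quoted above), each file takes one short 8.3 directory entry plus roughly one extra entry per 13 characters of its long name. A rough sketch to estimate the usage for a folder of generated pages (the folder name is a placeholder):

    [CODE]
    import math
    import os

    OUT_DIR = "detail_pages"   # assumption: the folder the detail pages are built into

    names = os.listdir(OUT_DIR)
    # One 8.3 entry per file, plus approximately one long-name entry
    # for every 13 characters of the long file name.
    entries = sum(1 + math.ceil(len(n) / 13) for n in names)

    print(f"{len(names)} files, roughly {entries} directory entries")
    print(f"About {entries / 65536:.0%} of a FAT32 directory's 65,536-entry limit")
    [/CODE]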

  24. #24
    Member
    Join Date
    January 18th, 2005
    Location
    Australia
    Posts
    118
    I can now confirm that switching to an NTFS volume allowed me to build the full BM set in one directory without any problem. Therefore I conclude that RAM did not have a lot to do with it (or that 256 MB with sufficient HD space to spare was enough).

    But I also went overboard in shortening the file names - I was trying to reduce the duplication in the Super-Cat/Cat/Sub-Cat trio and renamed "Index Cards" to "Index". You can imagine what happened.

    [Added: I also accidentally shortened some descriptions within a column down to "wood", "metal" - that messed it up completely. Moral: Do not prune at all, or do it in small chunks after searching to see the implications.]

    Festina Lente. (Make haste slowly)
    [URL=http://www.netmagellan.com/]Net Magellan blog[/URL]

  25. #25
    ABW Ambassador buy_online
    Join Date
    January 18th, 2005
    Location
    Richmond, VA
    Posts
    3,234
    Thanks for the advice. It now makes more sense to me why my Linux machine "swallows" all of the files I write to it, while the Win98 (FAT16) machine does not.

    From Microsoft (crm911's link):
    Each file system supports a maximum volume size, file size, and number of files per volume. Because FAT16 and FAT32 volumes are limited to 4 GB and 32 GB respectively, you must use NTFS to create volumes larger than 32 GB.

    Thanks again,

    Fred


