Originally Posted By: Hal Itosis
Well, 256 is only slightly more than 150...

If by "slightly" we mean "almost double," then you're right.

Quote:
if say perhaps (just speculating) it needed two copies of the data: one to buffer the input it's reading, and another to store the output it's constructing (be it ASCII-betical sorting or column-width calculating, etc.), then 150 x 2 = 300. [at this point i presume my "alert the operator" guess was probably incorrect... just pointing out that it wasn't necessarily totally ridiculous either, since (so far) there doesn't seem to be an actual memory error.]

But since it's already been established that -exec + is cutting off and starting a new command line every 128 KiB or so, similar to xargs, that's clearly not the issue.
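For anyone following along, the batching behavior being described is easy to see directly. This is a minimal, deterministic demo using `-n` (max arguments per invocation) rather than the byte-based `-s` limit, since the exact byte accounting differs between BSD and GNU xargs. Each `echo` run prints one line, so the number of output lines equals the number of command invocations xargs made:

```shell
# xargs splits its input into multiple command invocations once a
# limit is hit; -n 2 caps each invocation at two arguments.
# Five inputs / two per batch = three invocations = three lines.
printf '%s\n' one two three four five | xargs -n 2 echo
# → one two
# → three four
# → five
```

With the real byte limit instead of `-n`, the same thing happens every ~256 KiB of accumulated arguments, which is the cutoff under discussion.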

If this were designed to "alert the operator", then it would simply give an error such as "argument list too long" or something actually appropriate. This is not the case.


Quote:
Reading those statements, it seems apparent you were not yet aware of find's new {} + syntax (if Leopard 10.5.0 can still be called "new" that is), which i briefly mentioned earlier in this very thread. Its express intent is to emulate xargs. And —in some of the measurements i've taken on occasion, since Fall 2007 —it actually surpasses xargs (...not by much mind you, but still impressive).

Instead of posting snotty comments like this, you could just think about it and realize that the problem with -exec + is that it evidently DOESN'T WORK RIGHT, whereas xargs does. Given the choice between taking a fraction of a second longer and actually working versus failing slightly faster, I'd go with the one that works (and if speed is what you're after, why are you using the shell anyway?).

And since -exec + cuts off the command line at around 128 KiB whereas xargs cuts off at 256 KiB, if the command line you're running takes a non-trivial time to execute, you might make back that 0.15 seconds anyway. I'd use xargs for now, until they fix find.
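A sketch of the xargs workaround (self-contained here with a throwaway temp directory; substitute your own path and command). Using `-print0` with `-0` keeps filenames containing spaces or newlines intact as single arguments, and xargs batches them up to its own limit:

```shell
# Hypothetical scan target: a temp dir with two files, one of which
# has a space in its name.
dir=$(mktemp -d)
touch "$dir/plain" "$dir/with space"

# -print0/-0 delimit names with NUL, so "with space" stays one
# argument; printf repeats its format per argument, printing one
# path per line. We expect exactly 2 lines.
find "$dir" -type f -print0 | xargs -0 printf '%s\n'

rm -rf "$dir"
```

Replace `printf '%s\n'` with whatever command you were handing to `-exec`.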

You suggested using ; instead of + to work around this error, which would of course be quite a bit slower, since ; forks a new process for every single file. I pointed out that xargs is a better solution for this case. Chill.
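To make the speed difference concrete (on a find whose + handling works correctly, unlike the one under discussion): `-exec cmd {} \;` forks once per file, while `-exec cmd {} +` batches many files into one invocation. With `echo` as the command, each invocation prints one line, so the line counts below reveal the fork counts:

```shell
# Five files in a throwaway temp dir.
dir=$(mktemp -d)
for i in 1 2 3 4 5; do touch "$dir/f$i"; done

# One echo per file: five invocations, five lines.
find "$dir" -type f -exec echo {} \; | wc -l   # → 5

# All five names fit well under the batch limit: one invocation.
find "$dir" -type f -exec echo {} + | wc -l    # → 1

rm -rf "$dir"
```

At five files the difference is trivial; at tens of thousands, forking per file is what makes ; so much slower than + or xargs.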
Quote:
bunch of fiddling with command line switches to attempt to isolate the issue

I've actually issued the same command line multiple times, and gotten failures sometimes and not others. At this point, I think there is probably a random element involved, depending on the state of memory at the time the code runs (which is often the case with memory-related issues), so the only really effective way to troubleshoot it would be to go through the source code of the find tool itself with a debugger and a fine-toothed comb. If the bug could be isolated, it could even be reported to Apple, along with a patch to fix it.

That would be a legitimate thing to investigate; however, I'm losing interest in this. I posted in this thread because artie505 asked for help, and I did so by posting a C-based tool which should scan for HFS+ damaged files significantly faster and more accurately than the find tool, as well as more reliably, since it doesn't depend on possibly damaged shell tools.

However, I'm starting to think it was a mistake to come here. If the thread's going to be about defending your "This error was designed to inform the operator about something that the tool automatically takes care of anyway, by reporting a completely different error" statement ad nauseam, then I'm out.