(I just noticed that final entry:
find: fts_read: Cannot allocate memory
Any idea what it's all about?)
The
-exec . . . {} + portion of the
find command seems to have tripped over those results of yours, which add up to around 150 KB worth of pathnames to be processed. I'm not sure exactly where it breaks down (100K maybe?). The safe way to prepare for a really large results list is to first write the output from
find into a temp file... and then read (and/or pipe) that temp file into whatever command
-exec was calling.
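A minimal sketch of that temp-file approach, assuming for illustration that you were matching '*.log' files and running grep (substitute whatever your actual find expression and -exec command were):

```shell
# 1. Write the pathname list to a temp file, NUL-delimited so
#    names containing spaces or newlines survive intact.
tmpfile=$(mktemp)
find . -type f -name '*.log' -print0 > "$tmpfile"

# 2. Feed the list to the command in batches. xargs splits the
#    input so no single invocation exceeds the system's
#    argument-length limit (-r skips the run if the list is empty).
xargs -0 -r grep -l 'some pattern' < "$tmpfile"

rm -f "$tmpfile"
```

The -print0/-0 pairing is optional if you know your filenames are tame, but it costs nothing and avoids the classic whitespace-in-filenames breakage.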