Title pretty much says it all.
I have a loop that reads attributes from a directory into a buffer until all of the entries have been read. The buffer size is 16K, typically enough to hold 100-150 directory entries at a time. I've successfully tested this code on directories with thousands of files.
However, I have one customer who's getting an ERANGE (34) error when trying to read a specific directory (/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/matplotlib/tests). On my system this directory has about 90 files and reads just fine.
If I interpret it correctly, ERANGE means the buffer isn't big enough to hold even a single entry. Is it possible for one directory entry to consume more than 16K of data?
The attributes I'm requesting are:
ATTR_CMN_NAME
ATTR_CMN_DEVID
ATTR_CMN_OBJTYPE
ATTR_CMN_SCRIPT
ATTR_CMN_CRTIME
ATTR_CMN_MODTIME
ATTR_CMN_CHGTIME
ATTR_CMN_OWNERID
ATTR_CMN_GRPID
ATTR_CMN_ACCESSMASK
ATTR_CMN_FLAGS
ATTR_CMN_USERACCESS
ATTR_CMN_FILEID
files also get:
ATTR_FILE_LINKCOUNT
ATTR_FILE_TOTALSIZE
ATTR_FILE_ALLOCSIZE
ATTR_FILE_DATALENGTH
directories also get:
ATTR_DIR_LINKCOUNT
ATTR_DIR_ENTRYCOUNT
ATTR_DIR_MOUNTSTATUS
All of these (except for the name) are fixed-size types, so how could a single entry require more than 16K? Could the customer's directory structure be damaged?