I'm using the following VB6 code fragment to read, for a given folder, all the file names and their associated modified times. This works OK up to about 4000 items, then the time to read the next 1000 begins to increase dramatically. I've got between 25K-30K items per folder to read. The whole reason I do this is so that I can sort the times and then process the files sequentially based on the modified date. The array is dim'd at a large number (50K), but changing it doesn't seem to affect performance.

This is reading from a CD; reading from a local HD is only slightly faster. Performance Monitor shows the time being spent in the kernel.

Stats: 2000 items: 79 seconds, 4000 items: 300 seconds, 24000 items: 9750 seconds.

Any suggestions?

I've enjoyed reading the other posts. Thanks for having this forum!


' Procedure to get all the file name and date/time pairs.
' folderspec (format: "d:\04112000\"), aryFiles, and strRoutineDisp
' are set up by the caller.
Dim fs As Object, f As Object, fc As Object, fl As Object
Dim i As Long, j As Long
Dim strT1 As String, strT2 As String, strTmpB As String

Set fs = CreateObject("Scripting.FileSystemObject")
Set f = fs.GetFolder(folderspec)
Set fc = f.Files
i = 0
j = 1
strT1 = CStr(Time)   ' start time for the first interval
For Each fl In fc
    i = i + 1
    aryFiles(i, 1) = fl.Name
    aryFiles(i, 2) = fl.DateLastModified
    ' Display a message box for every 1000 files.
    ' j counts how many intervals have elapsed; T2 is the interval end
    ' time. Reset the start time (T1) so each display shows one interval.
    If i > (1000 * j) Then
        strT2 = CStr(Time)
        strTmpB = strRoutineDisp & strT1 & "->" & strT2 & " = " & _
                  CStr(DateDiff("s", strT1, strT2))
        MsgBox CStr(1000 * j) & " loaded." & vbCrLf & strTmpB
        strT1 = strT2
        j = j + 1
    End If
Next
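
For completeness, the sort step described above (order the pairs by modified date before processing) could be sketched like this. This is just one way to do it, assuming aryFiles has been filled by the loop, i holds the final count, and the date sits in column 2; a simple insertion sort keeps each name/date pair together (it will be slow on 25K+ rows, but shows the idea):

' Sort aryFiles(1..i, 1..2) ascending by the date in column 2.
Dim a As Long, b As Long
Dim tmpName As Variant, tmpDate As Variant
For a = 2 To i
    tmpName = aryFiles(a, 1)
    tmpDate = aryFiles(a, 2)
    b = a - 1
    ' Shift later-dated rows up one slot to make room.
    Do While b >= 1
        If aryFiles(b, 2) <= tmpDate Then Exit Do
        aryFiles(b + 1, 1) = aryFiles(b, 1)
        aryFiles(b + 1, 2) = aryFiles(b, 2)
        b = b - 1
    Loop
    aryFiles(b + 1, 1) = tmpName
    aryFiles(b + 1, 2) = tmpDate
Next
' aryFiles rows can now be processed in modified-date order.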