Every month, we FTP over 200 files from the mainframe to a Windows server, and the process takes 8 to 12 hours. Many of the files are very large and have to be split for the FTP. We compare record counts (expected vs. received), and a few of the files arrive with incorrect counts and must be re-FTP'd. The discrepancy is usually fewer than 50 records per file and hits a seemingly random 2-10 files.
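For context, the count check we run is essentially the sketch below (illustrative only: the manifest file, landing directory, and newline-delimited record counting are placeholder assumptions, not our exact setup, since some of our files are fixed-length):

    # Compare expected vs. received record counts on the Windows side.
    # The manifest format ("filename,expected"), paths, and line-based
    # counting are assumptions for illustration.
    import csv
    from pathlib import Path

    RECEIVED_DIR = Path(r"D:\mainframe\incoming")    # hypothetical landing dir
    MANIFEST = RECEIVED_DIR / "expected_counts.csv"  # hypothetical manifest

    def count_records(path: Path) -> int:
        """Count newline-delimited records; fixed-length files would
        instead divide the file size by the record length."""
        with path.open("rb") as f:
            return sum(1 for _ in f)

    def find_mismatches():
        mismatches = []
        with MANIFEST.open(newline="") as f:
            for row in csv.DictReader(f):
                expected = int(row["expected"])
                received = count_records(RECEIVED_DIR / row["filename"])
                if received != expected:
                    mismatches.append((row["filename"], expected, received))
        return mismatches

    if __name__ == "__main__":
        for name, expected, received in find_mismatches():
            print(f"{name}: expected {expected}, received {received} "
                  f"({received - expected:+d})")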
We tried reducing the number of parallel FTP streams from 5 to 3, but that had no effect.
What would be the first places to research to identify the cause? Re-FTPing is a time-consuming process because of the number of large files.
MK