When uploading large datasets to the platform via SFTP — especially volumes in the range of hundreds of GB or even several TB — efficient data transfer becomes critical. Below are best practices to help you optimize your upload process, save time, and avoid interruptions.
Copy Files to a Local Disk Before Uploading
Avoid uploading directly from network drives or external shares.
Why? Uploading from a network drive causes two simultaneous network operations — first reading the data from the drive, then uploading it — which slows down performance significantly.
Tip: First copy the files to your local hard drive (e.g., C:\Temp) and upload from there.
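A minimal sketch of this staging workflow, assuming hypothetical paths (`/mnt/share/dataset` for the network share, `/tmp/upload` for the local copy) and a placeholder host:

```shell
#!/bin/sh
# Stage data on a local disk first, then upload from the local copy.
# SRC and DST are placeholders -- point SRC at your network share.
SRC="${SRC:-/mnt/share/dataset}"
DST="${DST:-/tmp/upload}"

stage() {
    mkdir -p "$DST"
    # Plain recursive copy; rsync -a would also work and is resumable.
    cp -R "$SRC/." "$DST/"
}

# After staging, upload from the local copy, e.g. with OpenSSH sftp:
#   sftp user@sftp.example.com
#   sftp> put -r /tmp/upload /incoming
```

This way the network carries only one stream (the upload) instead of a read from the share plus the upload at the same time.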
Use a High-Performance SFTP Client with Compression Enabled
Clients like WinSCP, FileZilla, or Cyberduck are highly recommended.
Enable compression in the transfer settings (e.g., zlib). Especially useful for text-based or uncompressed binary files.
Compression may increase CPU load — monitor system performance during transfer.
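One way to decide whether compression is worth the CPU cost is to gzip a sample file and compare sizes: a ratio near 1.0 means the data is already compressed (ZIP archives, JPEGs, etc.) and zlib will mostly burn CPU. A rough sketch:

```shell
#!/bin/sh
# Print the compressed/original size ratio of a sample file.
# Ratios well below 1.0 suggest enabling compression will help.
sample_ratio() {
    f="$1"
    orig=$(wc -c < "$f")
    comp=$(gzip -c "$f" | wc -c)
    awk "BEGIN { printf \"%.2f\n\", $comp / $orig }"
}

# With OpenSSH's command-line sftp, compression is the -C flag:
#   sftp -C user@sftp.example.com
```

GUI clients expose the same zlib option in their connection or transfer settings.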
Enable Parallel Uploads
Most SFTP clients support multi-threaded transfers. This can dramatically improve throughput.
Start with 2–4 parallel transfers
Avoid too many threads if you're on limited bandwidth or using a slower drive
In WinSCP: Go to Preferences → Transfer → Maximum concurrent transfers
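If you are scripting the upload rather than using a GUI client, the same idea can be sketched with background jobs — run a bounded batch of transfers at once, then wait. The `scp` line in the comment is a placeholder for your real transfer command:

```shell
#!/bin/sh
# Run at most MAX transfers concurrently using background jobs.
# parallel_batch runs one command per file in DIR, MAX at a time.
parallel_batch() {
    dir="$1"; max="$2"
    count=0
    for f in "$dir"/*; do
        # Replace the echo with your transfer, e.g.:
        #   scp "$f" user@sftp.example.com:/incoming/ &
        echo "uploading $f" &
        count=$((count + 1))
        if [ "$count" -ge "$max" ]; then
            wait   # let this batch finish before launching more
            count=0
        fi
    done
    wait           # wait for the final batch
}
```

Starting with a concurrency of 2–4, as above, mirrors the recommendation for GUI clients.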
Test Your Internet Upload Speed
Your upload bandwidth is the main bottleneck. Measure it with an online speed test before starting a large transfer.
Compare your actual upload speed (Mbps) with the expected time for your file size.
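The arithmetic is simple: 1 GB is 8,000 megabits (decimal units, as speed tests report), so a quick estimate looks like this:

```shell
#!/bin/sh
# Rough transfer-time estimate.
# $1 = data size in GB, $2 = measured upload speed in Mbps.
estimate_hours() {
    awk "BEGIN { printf \"%.1f\n\", ($1 * 8000) / $2 / 3600 }"
}

# Example: 500 GB at 100 Mbps
estimate_hours 500 100   # prints 11.1 (hours)
```

If the real transfer runs much slower than this estimate, something other than bandwidth (disk, CPU, client settings) is likely the bottleneck.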
Upload During Off-Peak Hours
Network congestion — either in your local network or with your ISP — can slow down transfers during business hours.
Try scheduling uploads in the early morning, evening, or overnight.
Monitor CPU, RAM, and Disk Usage
Slow SFTP performance may be caused by system overload.
Watch CPU usage, disk activity, and RAM consumption during the upload.
Antivirus tools, browser tabs, or background sync (OneDrive, Dropbox) may interfere.
Use Task Manager (Windows) or Activity Monitor (macOS) to identify performance issues.
Use Resume-Capable SFTP Clients
Interrupted connection? Don’t start over.
Choose clients that support “resume on reconnect”
This allows you to pick up where you left off after a dropped session
Both FileZilla and WinSCP support this out of the box
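For command-line users, OpenSSH's `sftp` can resume too: after reconnecting, the `reput` command continues a partial upload instead of restarting it. A fragment, with a placeholder host and file name:

```shell
# Resume an interrupted upload with OpenSSH sftp:
#   sftp user@sftp.example.com
#   sftp> reput bigfile.tar /incoming/bigfile.tar
```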
(Advanced) Split Very Large Files Before Uploading
Some file systems or clients may struggle with multi-GB files.
Use tools like 7-Zip or split (Linux/macOS) to divide large files into 2–5 GB chunks. After upload, we can reassemble them server-side (on request).
Contact our support team if you'd like assistance with this method
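On Linux/macOS, the split-and-reassemble round trip can be sketched like this (chunk size and file names are placeholders; verifying with a checksum such as `shasum` before and after is a good habit):

```shell
#!/bin/sh
# Split a large file into fixed-size chunks, then verify the chunks
# reassemble to the original byte-for-byte.
split_file() {
    # $1 = input file, $2 = chunk size (e.g. 2G), $3 = output prefix
    split -b "$2" "$1" "$3"
}

reassemble() {
    # $1 = output file, remaining args = chunks in order
    out="$1"; shift
    cat "$@" > "$out"
}
```

The default `split` suffixes (`aa`, `ab`, …) sort lexicographically, so a shell glob passes the chunks to `reassemble` in the correct order.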
(Optional) Use SFTP Over Wired Connection
Wi-Fi is convenient, but unstable connections increase the risk of timeouts or dropped packets during long uploads.
Use a wired (Ethernet) connection for better reliability and speed, especially for multi-GB transfers.
Still Experiencing Upload Issues?
If your upload is stuck or significantly slower than expected:
Let us know the client used, upload speed, and approximate data size
We can help troubleshoot or propose alternatives (e.g., physical transfer, external storage links)
