The Reports of FTP's Death Are Greatly Exaggerated

The reports of my death are greatly exaggerated.
— File Transfer Protocol

Earlier this year, you may recall reading that the venerable and ubiquitous File Transfer Protocol (FTP) was going the way of the dodo. The rumor began after the Debian Project announced on April 25 that it was ending FTP services on November 1.

The Debian Project said the decision came because FTP had outlived its usefulness. The protocol doesn't support caching or acceleration, is difficult to use and configure, is inefficient, doesn't play well with firewalls, and doesn't get much use anymore.

The announcement sent ripples across the tech landscape, prompting The Register to opine, in typical El Reg style, that FTP was becoming the "Forgotten Transfer Protocol." But a closer read of both the Debian announcement and the Register article confirms what logic dictates: it takes more than six months to kill off more than four decades of implementation. FTP isn't going anywhere anytime soon. That news is both good and bad.

It’s good news because there are an unknown number of FTP servers still in use and replacing legacy tech is often a chore. It’s bad news because there are an unknown number of FTP servers still in use and, well, FTP was never designed to be a secure protocol and so has proven to be vulnerable to hackers.
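To make those shortcomings concrete, here is a minimal sketch of a plain FTP download using Python's standard ftplib module. The host, credentials, and file name are hypothetical; the points to notice are that the login and the file contents cross the network unencrypted, and that the actual transfer rides on a second, dynamically negotiated data connection, which is exactly the behavior that frustrates firewalls.

```python
# A minimal sketch, using Python's standard ftplib, of a plain FTP download.
# Host, credentials, and file name are hypothetical.
from ftplib import FTP

ftp = FTP("ftp.example.com")       # control connection on port 21
ftp.login("user", "hunter2")       # USER/PASS sent in cleartext
ftp.set_pasv(True)                 # passive mode: the server opens a separate,
                                   # high-numbered port for the data channel
with open("report.csv", "wb") as fh:
    ftp.retrbinary("RETR report.csv", fh.write)   # file bytes, also cleartext
ftp.quit()
```

A packet capture of that session would show the password and every byte of the file in the clear, which is a big part of why SFTP, FTPS, and managed transfer platforms have displaced plain FTP for anything sensitive.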

FTP was created in 1971 to meet a simple need: move data files from one point to another. File size and bandwidth constraints at the time called for a specialized process to get the job done, and setting up an FTP server was a simple enough solution that the protocol kept its niche as computing and networking evolved, even as that niche shifted from useful legacy to shadow IT.

Today there are numerous options for moving data, and most of them are simple and (when used correctly) secure. If you are an individual who wants to share photos and videos of a recent wedding or vacation with friends and family, there are applications that make it fast and simple. If you run a small business and need a way to share information and collaborate with employees and partners, there are cloud-based services that are cost-effective and user-friendly.

And if you are an enterprise that needs to securely transfer large files of sensitive data, and to do so under growing regulatory restrictions intended to protect valuable intellectual property and private consumer data, there are powerful options designed to meet those specific needs without compromising those objectives for the sake of convenience.

Top of the list for such mission-critical tasks is managed file transfer (MFT) technology, which is engineered to handle the transmission of huge files in a safe, reliable manner. What do we mean by huge? The file required to store and analyze genomic data could run to hundreds of megabytes, while the resulting sequenced data might be measured in gigabytes. Files containing hours of raw, high-definition video shot from a remote location and sent halfway across the world to a Hollywood studio for editing might be measured in terabytes.

Moving that kind of data is no easy task. Consider that if you wanted to send 1TB of data from New York City to London over a standard 100 Mbps Internet connection, it could well be faster to copy the file to physical media and fly it over on a 747 airliner. Even at the full line rate the raw arithmetic works out to nearly a day, and at the roughly 10 Mbps a long-haul connection often sustains in practice, the transfer would take about ten days. In that time you could arrive in London and visit places like Big Ben, Parliament, and Windsor Castle; take a drive to Edinburgh, sail across the Firth of Forth, and play 18 holes at the Royal and Ancient Golf Club at St. Andrews; then head back to London and, on the way, take in a Manchester United match at Old Trafford, all before your data transfer was complete.
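The back-of-the-envelope arithmetic is easy to reproduce. The short Python sketch below uses assumed sustained rates purely for illustration; they are not measurements of any particular link.

```python
# Back-of-the-envelope transfer times for 1 TB at a few sustained rates.
# The line rate may say 100 Mbps, but long-haul TCP throughput is often
# far lower; the rates below are illustrative assumptions.

TERABYTE_BITS = 1_000_000_000_000 * 8  # 1 TB expressed in bits

def transfer_time_days(rate_mbps: float) -> float:
    """Days needed to move 1 TB at a sustained rate in megabits per second."""
    seconds = TERABYTE_BITS / (rate_mbps * 1_000_000)
    return seconds / 86_400

for rate in (100, 20, 10):  # nominal line rate vs. plausible effective rates
    print(f"{rate:>3} Mbps sustained -> {transfer_time_days(rate):.1f} days")

# Output:
# 100 Mbps sustained -> 0.9 days
#  20 Mbps sustained -> 4.6 days
#  10 Mbps sustained -> 9.3 days
```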

Managed file transfer doesn't accommodate sightseeing, but it does accommodate speed and efficiency (unlike FTP). It also accommodates vital operational functions like security and authentication, regulatory compliance, data integration, and process automation that are necessary for doing business today. These are not problems that only large companies face. On the contrary, because today's collaborative communications technologies support lean organizations that may consist of a core group of employees, dozens of contractors, and a network of business partners, it is not uncommon for smaller enterprises to be responsible for outsized amounts of highly sensitive data.
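As a rough illustration of the kind of job an MFT platform scripts, audits, and retries on your behalf, here is a minimal sketch of an automated, encrypted, integrity-checked file push. It uses SFTP via the third-party paramiko library purely as a stand-in; the host, account, key path, and file names are hypothetical, and a real MFT deployment would add scheduling, logging, and compliance reporting on top.

```python
# A minimal sketch of an automated, integrity-checked transfer over SFTP.
# paramiko (pip install paramiko) stands in for a product API; the host,
# account, key path, and file names below are hypothetical.
import hashlib
import os
import paramiko

HOST = "transfer.example.com"                 # hypothetical partner endpoint
LOCAL = "genome_batch_42.bam"                 # the large file to move
REMOTE = "/incoming/genome_batch_42.bam"

def sha256(path: str) -> str:
    """Hash the file in 1 MiB chunks so the receiver can verify integrity."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Write a sidecar checksum the receiving side can use to confirm the upload.
with open(LOCAL + ".sha256", "w") as fh:
    fh.write(sha256(LOCAL) + "\n")

client = paramiko.SSHClient()
client.load_system_host_keys()                # verify the server's identity
client.connect(HOST, username="svc_transfer",
               key_filename=os.path.expanduser("~/.ssh/id_ed25519"))
sftp = client.open_sftp()
sftp.put(LOCAL, REMOTE)                       # encrypted in transit
sftp.put(LOCAL + ".sha256", REMOTE + ".sha256")
sftp.close()
client.close()
```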

Whether you are a medical services provider under the aegis of the Health Insurance Portability and Accountability Act (HIPAA) or a financial services firm handling client files filled with private personal and account data, answerable to a litany of laws like the Gramm-Leach-Bliley Act (GLBA), the Fair Credit Reporting Act (FCRA), the Fair and Accurate Credit Transactions Act (FACTA), the Right to Financial Privacy Act, and others, there is little room for error. And for any organization doing business overseas, the European Union's omnibus General Data Protection Regulation (GDPR) goes into effect in May of 2018, and with it come stiff financial penalties for negligence.

With the stakes so high, there's no benefit to taking chances on not-ready-for-prime-time consumer-grade services or on outdated, outgunned, roll-your-own technologies like FTP. It's probably worth fighting your way through the cobwebs to find that old FTP server a long-retired techie installed during the Reagan Administration and shut it down. But change can be hard, and dislodging legacy tech takes time. So while it may be too soon to write FTP's obituary, it's long since time to take an inventory of the systems entrusted with handling your most sensitive, and most at-risk, data files.


About Greg Hoffer

Greg Hoffer is Vice President of Engineering at Globalscape, where he leads the product development teams responsible for the design and engineering of all Globalscape products.
