Practically speaking, its use of two ports often causes problems on residential ADSL connections, which are frequently double-NATed (first NAT at the home router, second at the ISP's local gateway).
FTP is not secure. FTPS is but I haven't seen it used in ages.
FTP server programs are hard to write correctly and to secure properly. See the long history of vulnerabilities in FTP implementations.
The GNU project is already running an HTTP server. What does running an additional program with identical functionalities to a static HTTP server bring them? I see it as a not-so-useful expansion of the attack surface.
I am quite against blindly following fashions in computing, but I fail to see the rationale for maintaining an FTP infrastructure for publishing files rather than simply serving them over plain HTTP.
> Practically speaking, its use of two ports often causes problems on residential ADSL connections, which are frequently double-NATed (first NAT at the home router, second at the ISP's local gateway).
Doesn't PASV mode solve this?
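For reference, in passive mode the server opens the data port itself and tells the client where to connect via the 227 reply, so the client only ever makes outbound connections, which NAT handles fine. A quick sketch of how a client decodes that reply (the address here is made up):

```python
import re

def parse_pasv_reply(reply: str) -> tuple[str, int]:
    """Decode an FTP 227 reply: four numbers are the host IP,
    the last two encode the data port as high*256 + low."""
    match = re.search(r"\((\d+,\d+,\d+,\d+,\d+,\d+)\)", reply)
    h1, h2, h3, h4, p_hi, p_lo = map(int, match.group(1).split(","))
    return f"{h1}.{h2}.{h3}.{h4}", p_hi * 256 + p_lo

host, port = parse_pasv_reply("227 Entering Passive Mode (192,168,1,2,19,137)")
print(host, port)  # 192.168.1.2 5001
```

It's active mode, where the server connects back to a port the client advertised with PORT, that breaks behind NAT.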
> FTP is not secure.
Neither is HTTP
> FTPS is but I haven't seen it used in ages.
Presumably because of the ubiquity of HTTPS, which is fair enough.
> The GNU project is already running an HTTP server. What does running an additional program with identical functionalities to a static HTTP server bring them? I see it as a not-so-useful expansion of the attack surface.
Presumably just tradition. I guess also that it's worked for the past 20+ years and they don't see a reason to change. I haven't checked but I imagine you could get the programs via HTTP as well.
> I am quite against blindly following fashions in computing, but I fail to see the rationale for maintaining an FTP infrastructure for publishing files rather than simply serving them over plain HTTP.
As I see it, the two main advantages of FTP are that it's possible to browse the directory hierarchy and to upload files. Of course both of these things can be done via HTTP, but in a non-standard manner (with the exception of WebDAV, though that is an extension to HTTP rather than part of HTTP itself).
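For what it's worth, the directory-browsing part is trivial to replicate over plain HTTP with nothing but the Python standard library; a minimal sketch (the temp directory and file name are just placeholders):

```python
import http.server
import os
import tempfile
import threading
import urllib.request

# Serve a throwaway directory containing one file.
tmp = tempfile.mkdtemp()
open(os.path.join(tmp, "example.txt"), "w").close()

# SimpleHTTPRequestHandler auto-generates a browsable index page
# for directories, much like an FTP LIST.
handler = lambda *a, **kw: http.server.SimpleHTTPRequestHandler(
    *a, directory=tmp, **kw)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
index = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read().decode()
print("example.txt" in index)  # the generated index lists the file
server.shutdown()
```

Uploading is the part with no universally agreed-upon plain-HTTP convention, which is where WebDAV comes in.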
I agree that FTP is less relevant today than it used to be, but I'm not sure that it's fundamentally broken. The two-way connection thing is the main problem as far as I'm concerned but as I understand it PASV mode solves that.
It's not possible to upload files safely using plain FTP, if you care about preventing strangers from sniffing your credentials off the wire and uploading their own files.
(And yet, many people still run FTP servers in public that allow uploads, and others upload to them over public/unencrypted WiFi. That's my personal gripe.)