It is pretty neat, but I have yet to actually use it to fix a bug. The bugs I can't just spot tend to be ones that occurred long ago in the program's execution, something that record doesn't handle well (it could, in principle, but storing all that execution history takes a lot of space).
It seems that the only thing reverse debugging helps me do, i.e. find errors from not long ago, is something that is actually pretty easy anyway. Maybe I am using it wrong.
The releases are cryptographically signed, so it's not an issue that the download isn't done over SSL.
Or is there something else that worries you about FTP?
> Or is there something else that worries you about FTP?
You know what his objection likely is. Computing is a fashion industry now, so if you want to sound modern, you have to use something "new" solely because it is new, and stop using "old" stuff because it doesn't run on Rails and isn't agile or whatever.
Can HTTP do downloads from one arbitrary offset to another? Not a rhetorical question; genuinely curious.
A long, long time ago I wrote an application (out of a mix of curiosity, whim, and as a learning exercise) to share RPM files (Red Hat Package Manager) peer to peer. I chose FTP because it could parallelize the download to max out my bandwidth.
Connectivity to the main servers used to be terrible in those days in India, and there were no local mirrors with fat pipes either.
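On the arbitrary-offset question: HTTP/1.1 can do this via byte-range requests (the `Range` header, RFC 7233), which is also how download accelerators split a file and fetch the pieces in parallel. A minimal Python sketch, assuming the server advertises `Accept-Ranges: bytes`; the helper names and the even-split scheme are mine, not from any particular tool:

```python
import urllib.request

def split_ranges(total_size, parts):
    """Divide [0, total_size) into `parts` contiguous (start, end) byte ranges."""
    step = total_size // parts
    ranges = []
    for i in range(parts):
        start = i * step
        # Last chunk absorbs the remainder so every byte is covered.
        end = total_size - 1 if i == parts - 1 else start + step - 1
        ranges.append((start, end))
    return ranges

def fetch_range(url, start, end):
    """Request bytes start..end inclusive; a 206 reply means the server honored it."""
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example: split a 1 MiB file into 4 chunks, one per parallel connection.
print(split_ranges(1 << 20, 4))
```

Each chunk could then be fetched on its own connection and the results stitched together in order.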
Practically speaking, its use of two ports often causes problems with residential ADSL connections that are often double-NATed (first NAT at the home router, second NAT at the local gateway).
FTP is not secure. FTPS is, but I haven't seen it used in ages.
FTP server programs are hard to write correctly and to secure properly; see the long list of vulnerabilities in FTP server software.
The GNU project is already running an HTTP server. What does running an additional program with identical functionality to a static HTTP server bring them? I see it as a not-so-useful expansion of the attack surface.
I am quite against blindly following fashions in computing, but I fail to see the rationale for maintaining an FTP infrastructure for publishing files over simply serving them via plain HTTP.
> Practically speaking, its use of two ports often causes problems with residential ADSL connections that are often double-NATed (first NAT at the home router, second NAT at the local gateway).
Doesn't PASV mode solve this?
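For context on why PASV sidesteps the NAT issue: in passive mode the client opens the data connection outbound, to an address the server advertises in its `227` reply, so nothing has to connect inward through the client's NAT. A sketch of decoding that reply (the helper name is mine; the reply format is from RFC 959):

```python
import re

def parse_pasv_reply(reply):
    """Extract (host, port) from a '227 Entering Passive Mode' reply.

    The server encodes the address as six decimal bytes:
    four for the IPv4 address and two for the 16-bit port (high, low).
    """
    m = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply)
    if not m:
        raise ValueError("not a PASV reply: %r" % reply)
    h1, h2, h3, h4, p_hi, p_lo = map(int, m.groups())
    return "%d.%d.%d.%d" % (h1, h2, h3, h4), p_hi * 256 + p_lo

# Example reply from a server offering its data port at 192.0.2.7:49320.
print(parse_pasv_reply("227 Entering Passive Mode (192,0,2,7,192,168)"))
# -> ('192.0.2.7', 49320)
```

Python's `ftplib`, for what it's worth, uses passive mode by default for exactly this reason.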
> FTP is not secure.
Neither is HTTP
> FTPS is but I haven't seen it used in ages.
Presumably because of the ubiquity of HTTPS, which is fair enough.
> The GNU project is already running an HTTP server. What does running an additional program with identical functionality to a static HTTP server bring them? I see it as a not-so-useful expansion of the attack surface.
Presumably just tradition. I guess also that it's worked for the past 20+ years and they don't see a reason to change. I haven't checked but I imagine you could get the programs via HTTP as well.
> I am quite against blindly following fashions in computing, but I fail to see the rationale for maintaining an FTP infrastructure for publishing files over simply serving them via plain HTTP.
As I see it, the two main advantages of FTP are that it's possible to browse the directory hierarchy, and to upload files. Of course both of these things can be done via HTTP, but in a non-standard manner (with the exception of WebDAV, though that is an extension to HTTP rather than part of HTTP itself).
I agree that FTP is less relevant today than it used to be, but I'm not sure that it's fundamentally broken. The two-way connection thing is the main problem as far as I'm concerned but as I understand it PASV mode solves that.
It's not possible to upload files using FTP, if you care about preventing anonymous strangers from stealing your credentials and uploading their own files.
(And yet, many people still run FTP servers in public that allow uploads, and others upload to them over public/unencrypted WiFi. That's my personal gripe.)
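To make the credential-stealing point concrete: a plain FTP login is literal cleartext on the control channel, so anyone sniffing the path (say, on open WiFi) reads the password verbatim. The helper below just illustrates the wire format (the function name is mine, purely for demonstration); the fix is FTPS, e.g. Python's `ftplib.FTP_TLS`, which wraps the same commands in TLS:

```python
def ftp_login_on_the_wire(user, password):
    """Raw control-channel bytes a plain FTP client sends to log in.

    USER and PASS are ordinary CRLF-terminated text commands (RFC 959);
    there is no hashing or encryption anywhere in the exchange.
    """
    return ("USER %s\r\nPASS %s\r\n" % (user, password)).encode("ascii")

print(ftp_login_on_the_wire("alice", "hunter2"))
# -> b'USER alice\r\nPASS hunter2\r\n'
```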