I don't think a compression algorithm is going to work. Compression rarely helps on such short texts, and most URLs are already very dense in information per character. Also, URL shorteners have to stay within the URL-safe character set, and the final product still has to look like a URL.
I tried gzipping a Google Maps directions URL and then outputting that in base64. Results:
$ wc -c url*
217 url
186 url.gz
252 url.gz.base64
So the compressed-and-then-base64'ed version is actually longer. And of course it's more opaque. And I haven't even prepended some http://some.domain/ at the beginning to make it URL-like.
This doesn't even work in theory, let alone the practical impossibility of getting every http-fetching service to adhere to this scheme.
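The overhead described above is easy to reproduce in Python: gzip adds roughly 18 bytes of header and trailer, and base64 then inflates the result by 4/3. (The URL below is a made-up stand-in for a real Google Maps directions link.)

```python
import base64
import gzip

# Hypothetical Google Maps-style directions URL, used only for measurement.
url = ("http://maps.google.com/maps?f=d&source=s_d"
       "&saddr=San+Francisco&daddr=Mountain+View&hl=en")

raw = url.encode("ascii")
gz = gzip.compress(raw)          # gzip header/trailer alone cost ~18 bytes
b64 = base64.b64encode(gz)       # base64 expands by a factor of 4/3

print(len(raw), len(gz), len(b64))
# On inputs this short, the fixed gzip overhead plus the base64 expansion
# reliably make the final string longer than the original URL.
```

Swapping in any real long URL gives the same shape of result: the short, already-dense input leaves gzip nothing to work with.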
Hello. Gzip is not the way to go; try 'Smaz' and you will get better results. But I'm going to write a version of Smaz specifically tuned to work well with URLs.
I'm doing this work for Redis (another project of mine), but I imagine somebody else could use it to build a stateless URL shortener service. I hope so, at least.
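The idea behind Smaz-style compression can be sketched with a toy static-dictionary codec: common substrings map to single bytes, and anything else is carried verbatim behind an escape byte plus a length. The codebook below is invented for illustration; the real Smaz table is different and much larger.

```python
# Toy codec in the spirit of Smaz, with a made-up URL-oriented codebook.
# Order matters where one entry is a prefix of another: ".com"/".org"/".net"
# must be tried before the bare "." so they are not shadowed.
CODEBOOK = ["http://", "https://", "www.", ".com", ".org", ".net", "/", ".", "-"]
ESCAPE = 0xFF  # marks a verbatim run: ESCAPE, length byte, raw bytes

def compress(text: str) -> bytes:
    out, lit = bytearray(), bytearray()

    def flush_literals():
        # Emit any pending verbatim bytes in runs of at most 255.
        while lit:
            chunk = bytes(lit[:255])
            out.extend(bytes([ESCAPE, len(chunk)]) + chunk)
            del lit[:255]

    i = 0
    while i < len(text):
        for code, sub in enumerate(CODEBOOK):
            if text.startswith(sub, i):
                flush_literals()
                out.append(code)   # one byte replaces the whole substring
                i += len(sub)
                break
        else:
            lit += text[i].encode("ascii")  # assumes ASCII-only input
            i += 1
    flush_literals()
    return bytes(out)

def decompress(data: bytes) -> str:
    out, i = [], 0
    while i < len(data):
        if data[i] == ESCAPE:
            n = data[i + 1]
            out.append(data[i + 2:i + 2 + n].decode("ascii"))
            i += 2 + n
        else:
            out.append(CODEBOOK[data[i]])
            i += 1
    return "".join(out)

url = "http://www.example.com/maps"
packed = compress(url)
print(len(url), "->", len(packed))  # dictionary hits make it shorter
assert decompress(packed) == url
```

Unlike gzip, this approach has no per-message header, so even a handful of dictionary hits shrinks a short URL, which is exactly the regime where general-purpose compressors lose.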
I don't know much about this sort of compression, but there are some pretty long URLs out there. Reducing a four-line Google Maps URL to one line would be an amazing achievement, but it doesn't seem like quite enough for what people do with shortened URLs today.