One trouble with discoverability is the problem of precise semantics. Even if an automated system can discover that a service exists, knowing *exactly* what to do with it requires progress in upper and middle ontologies that still isn't here in 2012.
I hate to defend SOAP, but WSDL does a good job of documenting web services. My jaw dropped when I pointed Visual Studio at Salesforce.com's WSDL file and it correctly created a set of statically typed stub functions for a very complex API... I'm so used to this stuff not working that when it does it's like WOW.
HATEOAS addresses just a fraction of what WSDL does.
My experience is limited, but I've never had a WSDL provide any tangible benefit. In one instance, the WSDL for a single-method (login) SOAP service generated a 26,000-line Apache Axis2 stub that didn't compile without manual attention. I was able to replace it with about 30 lines of custom code once I had the wire trace and could figure out the weird undocumented incantations required for it to work.
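For what it's worth, those "30 lines of custom code" usually amount to building the SOAP envelope by hand and POSTing it with the right headers. Here's a hedged sketch of that approach in Python; the endpoint URL, namespace, and field names are hypothetical placeholders, not the actual service from the anecdote:

```python
# Minimal hand-rolled SOAP 1.1 call, replacing a generated client stub.
# All service-specific strings below (namespace, SOAPAction, element
# names) are assumptions for illustration; a real service's wire trace
# tells you what they actually need to be.
import urllib.request


def build_login_envelope(username: str, password: str) -> bytes:
    """Construct a minimal SOAP 1.1 envelope for a login operation."""
    return f"""<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <login xmlns="http://example.com/service">
      <username>{username}</username>
      <password>{password}</password>
    </login>
  </soap:Body>
</soap:Envelope>""".encode("utf-8")


def call_login(endpoint: str, username: str, password: str) -> bytes:
    """POST the envelope with the headers a SOAP 1.1 service expects."""
    req = urllib.request.Request(
        endpoint,
        data=build_login_envelope(username, password),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            # SOAPAction value is hypothetical; services are picky about it.
            "SOAPAction": "http://example.com/service/login",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

The point isn't that this is good engineering; it's that once you can see the wire format, the generated stub often buys you nothing a template string can't.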
I suspect that .NET WSDLs (of which this was one) work well with .NET clients, but in that case it might as well be any proprietary RPC protocol.
Correct me if I'm wrong, but REST discoverability isn't meant to provide machine discoverability, but human discoverability.
The applications that tend to get pentested most are for a variety of reasons more likely to have SOAP interfaces than apps in general, so we spend a fair bit of time with WSDL files, and this just isn't my experience; for the most part, given a WSDL for a service, a Ruby RPC binding for that service is mostly painless.
That doesn't mean I like SOAP (I don't), but I don't find that this particular critique of it rings true.
HATEOAS is a design constraint; WSDL is a spec. HATEOAS could accomplish a lot more than WSDL does, given appropriate specs and a widely adopted programming model or two. Progress on that front has been slow.
The challenge, IMO, has been a programming model that fits the Web, which can then be captured in a media type spec or two (similar to how HTML was codified around the experience of a web browser). WSDL is basically the procedural programming model, where networked interactions are mapped to procedure calls. That model has a whole bunch of long-discussed problems when it comes to wider-scale interoperability.
Is WSDL richer and more productive than documenting a bunch of URI patterns for a complex API? I'm not sure about that. WSDL tends toward code generation for very specific interaction scenarios, and ignores things like cacheability of results, the safety or idempotency of requests, and the ability to traverse data across endpoints without a lot of a priori knowledge in the client. Many of those properties are easier to get with plain HTTP used in a RESTful manner.