Many people would need additional tooling to parse JSON into a structured type, especially in dynamic languages where unstructured data types are the norm. And even if you do that, you're still going to have to validate the data and tell the client where they might have messed up the formatting.
Validation is a totally fine, low-cost, low-risk way to check chunks of JSON before trying to parse them -- or not trying to parse them, if that's your choice. People who use TypeScript on the server side, for example, will only need to validate.
> Many people would need additional tooling to parse JSON into a structured type, especially in dynamic languages where unstructured data types are the norm.
Well, there's your first issue. Specifying your formats is a good idea, regardless of language. And this is completely possible in dynamic languages too; see, for example, Python's Marshmallow.[0]
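A minimal sketch of what that looks like (the schema and field names here are made up for illustration):

```python
from marshmallow import Schema, fields, ValidationError

# Hypothetical schema, just for illustration.
class UserSchema(Schema):
    email = fields.Email(required=True)
    name = fields.Str(required=True)

try:
    # load() deserializes and validates in one step, returning
    # structured data instead of just a yes/no answer.
    user = UserSchema().load({"email": "ada@example.com", "name": "Ada"})
except ValidationError as err:
    # err.messages maps each bad field to a human-readable reason --
    # exactly what you'd send back to the client.
    print(err.messages)
```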
> And even if you do that, you're still going to have to validate the data and tell the client where they might have messed up the formatting.
The point is that you should take the opportunity to encode this validation in your types. A validated email address is no longer a `String`; it's an `Email`, with different valid operations. Or, for an extreme example, you (hopefully) wouldn't try to do time calculations directly on RFC 3339 strings.
This doesn't have to happen at exactly the same time as parsing the JSON AST; it's completely sensible to go JSON -> JSON AST -> Model, as sketched below.
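In Python terms, that pipeline might look something like this (a rough sketch; `Email` and `parse_email` are names I'm making up):

```python
import json
import re
from typing import NewType

# A validated email is no longer a plain str at the type level.
Email = NewType("Email", str)

def parse_email(raw: str) -> Email:
    # Deliberately simplistic pattern, just for illustration.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", raw):
        raise ValueError(f"not a valid email address: {raw!r}")
    return Email(raw)

# JSON text -> JSON "AST" (plain dicts/lists) -> model type.
doc = json.loads('{"email": "ada@example.com"}')
email = parse_email(doc["email"])  # downstream code can demand Email, not str
```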
OK, I understand what you're saying. OpenAPI can be used for what you're describing. I just don't really understand the false dichotomy between validation and parsing. OpenAPI can be used for both depending on the language, and if you're writing docs, you might as well write them in OpenAPI format and get a validator for free.
There is a difference between the common meaning of parsing (text -> (ast | error)) and the generalized meaning that Alexis uses in the post (less structured -> (more structured | error)).
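Sketched as signatures (all names hypothetical), the contrast is:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Model:
    name: str

def validate(data: Any) -> bool:
    # Answers yes/no; the caller still holds untyped data afterwards.
    return isinstance(data, dict) and isinstance(data.get("name"), str)

def parse(data: Any) -> Model:
    # Returns something *more structured* than its input, or fails loudly;
    # the knowledge that the check succeeded survives in the return type.
    if not validate(data):
        raise ValueError("expected an object with a string 'name'")
    return Model(name=data["name"])
```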
Yes. Schemas are useful. What alternative do you propose? The only other option I can imagine is sending the schema along with the object, in which case you should just be using XML instead of JSON.
REST is often used as a very, very limited transport model for a two-bit, buggy, incomplete and incoherent clone of SOAP, yes.
Unfortunately, neither REST nor SOAP is understood by the majority of developers writing code for them. (To this day, I insist that the real issue with SOAP was people using IDE code generation without ever studying how SOAP works.)
I've found OpenAPI is very poor for more dynamic resource-based APIs where the mechanics of interaction between the resources are standard but the resources themselves differ, e.g. a JSONAPI API. Does anyone know of tooling that more accurately fits that use case? I have hand-rolled (well, generated) JSON Schemas in the past and published those, but that felt unsatisfying and too ad hoc.
I don't know of a tool for it, but in my experience dynamic APIs are usually not the right tool for the job. However, if one is required, having a declarative API will make the process much easier, I would imagine.
EDIT: `declarative API` as in some kind of proxy layer which you can better isolate.
OpenAPI v3 can handle what you describe, although it's a pain to write by hand, and tooling in your language may be missing support for edge cases (i.e., an incomplete implementation of the spec).
My suggestion would be to define your APIs in a neutral, flexible format (such as Spot[1]) and then generate OAS 3 or whatever other formats you need. That helps you avoid lock-in and any tedium required to handle less-capable formats.
If I'm understanding you correctly, this is what discriminators are supposed to solve, but I've never used them and don't know the state of the tooling for them.
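For context, a discriminator is just a designated field whose value tells you which schema the rest of the payload should conform to. Roughly, in code (made-up resource types, and JSONAPI's `type` member standing in for the discriminator):

```python
from typing import Callable

# Hypothetical per-resource parsers, keyed by the discriminator value.
def parse_article(doc: dict) -> dict:
    return {"kind": "article", "title": doc["title"]}

def parse_comment(doc: dict) -> dict:
    return {"kind": "comment", "body": doc["body"]}

PARSERS: dict[str, Callable[[dict], dict]] = {
    "article": parse_article,
    "comment": parse_comment,
}

def parse_resource(doc: dict) -> dict:
    # The "type" field plays the role of an OpenAPI discriminator:
    # it selects which schema the rest of the document must satisfy.
    try:
        parser = PARSERS[doc["type"]]
    except KeyError:
        raise ValueError(f"unknown resource type: {doc.get('type')!r}")
    return parser(doc)
```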