JSON Schema is awesome. I wish TypeScript had better support for it, though; having to do everything twice, in Zod and in JSON Schema, sucks.
I have a system I built that compiles TS types to JSON Schema, which then validates data coming into my endpoints. This way I am type-safe at compile time (via the TypeScript API), but if someone hits my REST endpoint without using my library, I still get runtime goodness.
There are a surprising number of ways JSON Schema can be programmatically generated, and therefore expressed; different tools produce very different JSON Schemas.
Also, the error messages JSON Schema validators give back are kind of trash; then again, the JSON Schema on one of our endpoints is over 200 KB.
Ditto. It took me a while (coming from C++) to get that types are not the ideal source of truth in TypeScript, since they disappear at runtime.
Having a runtime parser from which types are inferred (one which can consequently express things impossible with TypeScript types alone, like "a positive integer" rather than settling for any `number`) is a needed mindset shift, and Zod makes it easy.
I use JSON Schemas for request validation and response serialisation (e.g. [1]) in Fastify, derived from Zod parsers via zod-to-json-schema [2]. Some of Zod's runtime validation does not translate to JSON Schema (typically transforms), so YMMV. But this gives good runtime glue with the static typing of request handlers.
Maybe I was unclear - my post was a criticism of Zod because it involves a bunch of duct tape that I'm not sure makes sense.
Typebox just creates JSON Schema objects at runtime and projects them into the type system with `Static<T>`. In so doing, you simultaneously create schema and types, and the process of doing so is pleasant: you can just hand a TObject to Fastify as a validator object and you're done. Plus, with a Typebox type provider, it infers down to your handler.
JSON Schema is the lingua franca; to me, working in it, rather than converting to it, is a much easier proposition.
Zod has the same "simultaneously create schema and types and the process of doing so is pleasant" feature, it's just not using JSON Schema for it (and has more power than JSON schema does).
In the Zod world, if you need JSON Schema to e.g. communicate to the outside world, you can extract it from the Zod schema with https://github.com/StefanTerdell/zod-to-json-schema -- but if you don't, you don't.
JSON Schema is kind of underpowered for real validation, so if you limit yourself to it, you'll just have another round of validation immediately after. TypeBox's CreateType seems to be the same idea; it cannot be expressed in just JSON Schema.
I think Colin's a rad programmer, he works on EdgeDB which is absolutely my favorite datastore I've ever used, and I think Zod's fine if you want to use it. But I don't agree with your premise. I don't, in practice, find the way Zod asks you to think about data compelling. It's probably because I think specifically in terms of communication between systems: making the particulars of the interchange format central to the act of writing the thing, for me, keeps top-of-mind the necessity of both ends having an identical understanding of the allowable semantics. (Similarly, I've never used any Typebox features that don't map to JSON Schema, and I've never felt the need to.)
Interesting. I'd rather write TypeScript types than Zod schemas. I haven't used JSON Schema, but going from TS to Zod was straightforward and really pleasant.
TypeScript can't understand types as objects, but it can understand objects as types (using `typeof`). So starting with objects and generating types is more natural.
Also, if you start with objects, you can express runtime conditions that are impossible with types alone, for example maximum sizes.
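A plain-TypeScript sketch of both points (the names here are made up):

```typescript
// Start with an object...
const defaultConfig = {
  host: "localhost",
  port: 8080,
  retries: 3,
};

// ...and derive the type from it with `typeof`.
type Config = typeof defaultConfig;

// Runtime conditions the type can't carry, e.g. maximum sizes:
function isValidConfig(c: Config): boolean {
  return c.port > 0 && c.port <= 65535 && c.retries <= 10;
}

console.log(isValidConfig(defaultConfig)); // true
console.log(isValidConfig({ ...defaultConfig, port: 99999 })); // false
```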
Zod is quite a bit more general than JSON Schema because it can express transformations, not just validations.
Maybe a better solution (not necessarily for your exact use case) would be to generate the TypeScript types from the JSON Schema? The schema feels more like it should be the real source of truth. That's how protobuf (for example) works: there's a language-independent schema and you can generate types for any language you wish, but you don't have to depend on any particular language if you don't need it.
1. A fancy combo of build- and compile-time generics generated all of our libraries
2. Tooling ran over the return values from the generic definitions and created the schema
I rewrote it, and now everything is defined in JSON files. Those JSON files are run through a code generator that creates our exported libraries, and we still generate the schema based on the TS exports.
Having everything defined in JSON then allowed us to write tooling on top of the JSON to make changes to our libraries.
Protobuf v3 is horrible; I had to start using it recently. The type system is so anemic it's a joke how hard it is to model things in it.
JSON schema is more powerful than TS, and TS is orders of magnitude more powerful than what can be expressed in PB.
The original generics code was super cool, and obscenely succinct, but it wasn't amenable to being auto generated.
Not OP, but we have a very similar setup. We use https://github.com/vega/ts-json-schema-generator in a prebuild hook to generate a schema from Typescript. The schema is then fed into Fastify, and referenced in route configurations for incoming (validation) and outgoing (fast serialization) data, as well as auto-generated Swagger. Took some custom code to simplify attaching definition references to our route configs, but it works quite well.
As an extra, our frontend uses the same shared TS code for data-transfer objects, which adds an extra level of type safety at the API boundary.