We have been struggling with the siren song of no-code ideologies for a while. We have managed to make so much of our platform data-driven that it seems like a natural evolution to make it 100% so.
Unfortunately, the closer we get, the more dragons seem to appear. The hardest problem I have seen that will need to be addressed is mapping business facts from an internal representation to the representations required by the external systems we interact with. In many cases this isn't a simple 1:1 map, and we have to write imperative code to handle that last mile of wiring various things together. Also, developing business rules and validations is incredibly difficult without proper code if you are dealing with any sort of complex logical combination of facts or entities.
The most sustainable model I have been able to come up with (at least for our products and customers) is to push as much as we reasonably can into the data-driven realm, and then explicitly expose standardized coding contexts for mapping, validating, etc. various combinations of data-driven entities.
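To make the shape of this concrete, here is a minimal, hypothetical sketch (the names and record layout are invented for illustration, not our actual platform): declarative field mappings cover the 1:1 cases, while a registered code hook provides the "standardized coding context" for the non-trivial last mile.

```python
# Hypothetical sketch: declarative mappings for the 1:1 cases,
# plus an explicit code hook for the non-1:1 "last mile".

FIELD_MAP = {                # data-driven: internal name -> external name
    "customer_id": "CustomerRef",
    "email": "EmailAddress",
}

HOOKS = {}                   # standardized coding contexts: name -> callable

def hook(name):
    def register(fn):
        HOOKS[name] = fn
        return fn
    return register

@hook("order_total")         # imperative code where data alone can't express it
def order_total(record):
    # the external system wants a single total; internally it's line items
    return sum(item["qty"] * item["unit_price"] for item in record["lines"])

def to_external(record):
    out = {ext: record[internal] for internal, ext in FIELD_MAP.items()}
    out["OrderTotal"] = HOOKS["order_total"](record)
    return out

record = {
    "customer_id": "C42",
    "email": "a@example.com",
    "lines": [{"qty": 2, "unit_price": 5.0}, {"qty": 1, "unit_price": 3.0}],
}
print(to_external(record))
# {'CustomerRef': 'C42', 'EmailAddress': 'a@example.com', 'OrderTotal': 13.0}
```

The point of the hook registry is that the imperative escape hatch is named, discoverable, and scoped, rather than scattered ad hoc through the codebase.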
For our purposes I would say the 80/20 rule applies fairly well. Approximately 80% of our platform can be data-driven, with the remaining 20% absolutely requiring code to connect the dots.
With a data-driven system, you basically end up with the data as a sort of intermediate code which is then interpreted by the "environment". The interpreter still needs to support the domain objects that implement the required behaviours. If you keep extending such a system until it meets all the requirements, you end up with yet another programming language and interpreter.
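A toy example of that slippery slope (invented for illustration): validation rules stored as data and interpreted by the environment. Every operator the data can reference must already exist in the interpreter, and the "data" is visibly an AST.

```python
# Minimal sketch: rules as data, interpreted by the environment.
# Each operator the data references must exist in the interpreter.

OPS = {
    "eq": lambda a, b: a == b,
    "gt": lambda a, b: a > b,
    "and": lambda *xs: all(xs),
}

def evaluate(rule, facts):
    op, *args = rule
    vals = [evaluate(a, facts) if isinstance(a, list)
            else facts.get(a, a)          # fact lookup, else literal
            for a in args]
    return OPS[op](*vals)

# This "data" is already an abstract syntax tree; keep adding operators
# and you have reinvented a programming language and its interpreter.
rule = ["and", ["gt", "age", 18], ["eq", "country", "NZ"]]
print(evaluate(rule, {"age": 30, "country": "NZ"}))  # True
```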
A far more practical approach is to have a clearly demarcated library of reusable domain-specific objects which are then assembled as required. Thus the majority of the functionality (the 80%) ends up being written in a sort of embedded domain-specific language.
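The embedded-DSL version of the same idea might look like this (again a hypothetical sketch, not anyone's real library): reusable domain objects composed with ordinary language features, so the host language is already there when you hit the 20%.

```python
# Hedged sketch: a small library of reusable domain objects,
# composed with ordinary Python rather than a bespoke interpreter.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    check: Callable[[dict], bool]

    def __and__(self, other):
        # composition is just host-language operator overloading
        return Rule(lambda facts: self.check(facts) and other.check(facts))

def field_gt(name, value):
    return Rule(lambda facts: facts[name] > value)

def field_eq(name, value):
    return Rule(lambda facts: facts[name] == value)

# The "80%" is assembled declaratively from the library...
adult_kiwi = field_gt("age", 18) & field_eq("country", "NZ")

# ...and plain Python is available for the remaining 20%.
print(adult_kiwi.check({"age": 30, "country": "NZ"}))  # True
```

The difference from a pure data-driven rule store is that you never have to extend an interpreter: anything the library can't express is just written in the host language next to it.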
Is that final 20% not just configuration? It may still be a candidate for a no/low-code interface.
Imperative code is the easiest way for technical people to solve mapping problems and encode business rules, but it definitely isn't the only way of solving those types of problems. My guess is that there will be a few attempts to create an accessible and generalised way of expressing business logic for the everyday user - a DSL for business logic. Software engineers have largely rejected visual programming environments for serious programming, but perhaps these tools will be reinvented for an entirely new audience.