If so, could there perhaps be a step where the LoRA is merged back into the main model?
That would be like sleeping :-)
LoRAs tend to be adapters bolted onto systems by people other than the system designers, and they are low-rank factorizations.
There is nothing low rank or adapter here.
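For context, the "merge" the parent comment asks about is only possible because a LoRA update is low rank: the product of the two small factors has the same shape as the frozen base weight, so it can be folded in once and the adapter discarded. A minimal sketch with NumPy (hypothetical shapes; the `alpha / r` scaling follows the common LoRA convention, not any particular codebase):

```python
import numpy as np

d, k, r = 8, 8, 2          # base weight is d x k; adapter rank r << min(d, k)
alpha = 4.0                # LoRA scaling hyperparameter (illustrative value)
rng = np.random.default_rng(0)

W = rng.standard_normal((d, k))  # frozen base weight
A = rng.standard_normal((r, k))  # low-rank factor A (r x k)
B = rng.standard_normal((d, r))  # low-rank factor B (d x r)

x = rng.standard_normal(k)

# With the adapter attached, the layer computes the base path plus
# the scaled low-rank correction:
y_adapter = W @ x + (alpha / r) * (B @ (A @ x))

# "Merging back" folds the rank-r update into the base weight once,
# so inference afterwards needs no adapter at all:
W_merged = W + (alpha / r) * (B @ A)
y_merged = W_merged @ x

assert np.allclose(y_adapter, y_merged)
```

The merged model is numerically identical to base-plus-adapter, which is why merging is a routine final step in LoRA workflows; the reply's point is that this trick requires the update to actually be a low-rank adapter in the first place.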