Hi there, it's because I copied the project template from a previous project. I'll check and try to upgrade to React 19 soon, if it works with the libraries.
First time I've seen this. My first impression is that its output quality is better than mine, and my code is only based on one model.
In my understanding, it's possible if the model's author exports it to ONNX Runtime (https://onnxruntime.ai). The downside is that users would need to download a ton of data to their device; currently it's ~100-200 MB.
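Roughly, loading an ONNX-converted model with Transformers.js looks something like this (a minimal sketch; the model id and dtype here are just examples, not what the playground actually uses):

```ts
import { pipeline } from '@huggingface/transformers';

// Any model that has been converted to ONNX can be loaded by id.
// 'onnx-community/Qwen2.5-0.5B-Instruct' is only an example.
const generator = await pipeline(
  'text-generation',
  'onnx-community/Qwen2.5-0.5B-Instruct',
  {
    device: 'webgpu', // run inference on the GPU via WebGPU
    dtype: 'q4',      // quantized weights to keep the download smaller
  },
);

const output = await generator('Hello, how are you?', { max_new_tokens: 64 });
console.log(output);
```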
Yeah, I think it's because of the quality of the model; hopefully we'll have better quality in the near future. I'll see if there's anything I can do with the settings.
Same here, this is my first time working with this library. It really makes me believe even more in the future of Transformers.js/WebGPU; it's just the beginning.
Exactly this. As mentioned in the post, I used the same technology as this playground (and copied lots of code from it).
What I did was mostly make the UX better.
I tried it on Chrome on my Mac, but I'm not sure how it behaves on other devices (I assumed it would work). I'll test it out later. Sorry for the inconvenience.
The error message is ambiguous right now and I'll try to make it clearer (contributions welcome).
The idea is that it falls back to not using WebGPU if your browser doesn't support it, but it was made in 2 hours, so bugs are acceptable :)
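For anyone curious, the fallback check is roughly this shape (a sketch of the approach, not the actual code in the project):

```ts
// Sketch: detect WebGPU support and fall back to WASM when it's unavailable.
async function pickDevice(): Promise<'webgpu' | 'wasm'> {
  try {
    // navigator.gpu only exists in browsers with WebGPU enabled.
    const adapter = await (navigator as any).gpu?.requestAdapter();
    return adapter ? 'webgpu' : 'wasm';
  } catch {
    return 'wasm';
  }
}
```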