Cool! Sweep actually already does this before the PR is shown to the user. I agree it might help to expose some of the Sweep-generated reviews (which we did before).
We made a tool to summarize Hacker News posts with ChatGPT. You can pick which topics you care about, then ask ChatGPT to give you a TL;DR of the posts. Feel free to give it a try!
Hmm, we allow the CLI to execute AOT code sections embedded in WASM files as a convenience feature -- you can do the AOT compilation once and then execute the result with the same CLI.
For server side deployment, you should always do the AOT compilation on the server. But I agree this could be clearer. We are adding a new CLI option to disable the AOT code segment in the WASM file for people who do not wish to perform the separate AOT compilation step.
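For concreteness, the usual two-step flow looks roughly like this (file names are made up, and the exact tool names vary by WasmEdge release; wasmedgec is the AOT compiler shipped alongside the wasmedge CLI):

```sh
# AOT-compile the module ahead of time; the output keeps the .wasm container
# format with the native AOT code section embedded.
wasmedgec app.wasm app_aot.wasm

# Run it with the CLI, which picks up the embedded AOT code.
wasmedge app_aot.wasm
```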
* Lightweight (no GC, under 2MB)
* Designed to be a secure sandbox (not just a language runtime)
* Modular (add host functions as needed — eg support for Tensorflow)
* Multiple languages (esp compiled languages like Rust)
* Works with container tools (Docker, k8s etc)
Lightweight (no GC, under 2MB) => Which is why all GC-based languages are forced to bring their own GC when targeting WASM.
Designed to be a secure sandbox (not just a language runtime) => JVM and CLR were also designed as such. It didn't work out as well as planned once black hat hackers started researching how to game the sandbox. Still open for WASM.
Modular (add host functions as needed — eg support for Tensorflow) => JVM and CLR have dynamic code loading capabilities. WASM still hasn't standardized dynamic linking.
Multiple languages (esp compiled languages like Rust) => Just like CLR, hence why it is called the Common Language Runtime. C and C++ were part of the version 1.0 release.
Works with container tools (Docker, k8s etc) => Just like JVM and CLR do.
One of the benefits of running JS in Wasm (specifically the QuickJS approach) is that you can create JS APIs in Rust. That allows you to move a lot of compute-intensive operations into Rust while still giving your developers a clean and nice JS API. WasmEdge does this with its JS Tensorflow API:
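As a minimal sketch of the pattern only (this uses the generic quick_js crate rather than WasmEdge's own QuickJS module, and the registered function is hypothetical): the Rust side registers a native callback, and JS code calls it like any other function.

```rust
use quick_js::Context;

fn main() {
    let context = Context::new().unwrap();

    // Stand-in for a compute-heavy routine implemented in native Rust;
    // in the WasmEdge case this is where Tensorflow-style host calls would live.
    context
        .add_callback("fastMultiply", |a: i32, b: i32| a * b)
        .unwrap();

    // The JS side just sees an ordinary function.
    let result = context.eval_as::<i32>("fastMultiply(6, 7)").unwrap();
    assert_eq!(result, 42);
}
```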
After working on a couple codebases that used wasmtime and wasmer heavily, I think if I were to start from scratch I'd just use V8 as my WASM runtime.
(And the model I would follow would be to containerize the V8 runtime piece inside its own cgroup, with resource budgets and permission constraints. cgroups accounting beats the limited and inefficient opcode-based accounting one gets from the various WASM runtimes.)
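For context on what that opcode-based accounting looks like, here is a rough sketch of fuel metering in wasmtime; the fuel method names have shifted across wasmtime releases (add_fuel in older versions, set_fuel more recently), so treat this as illustrative rather than exact:

```rust
use anyhow::Result;
use wasmtime::{Config, Engine, Instance, Module, Store};

fn main() -> Result<()> {
    // Enable per-instruction "fuel" accounting in the engine.
    let mut config = Config::new();
    config.consume_fuel(true);
    let engine = Engine::new(&config)?;

    // A guest function that would otherwise spin forever.
    let module = Module::new(
        &engine,
        r#"(module (func (export "run") (loop (br 0))))"#,
    )?;

    let mut store = Store::new(&engine, ());
    store.set_fuel(1_000_000)?; // execution budget; traps once exhausted

    let instance = Instance::new(&mut store, &module, &[])?;
    let run = instance.get_typed_func::<(), ()>(&mut store, "run")?;

    // Instead of hanging, the call fails with an out-of-fuel trap.
    assert!(run.call(&mut store, ()).is_err());
    Ok(())
}
```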
Personally, I believe v8 is and always will be primarily focused on the browser use case. Its support for WASI and container tooling will always take a back seat compared with priorities in the browser.
Today, there is a large set of Rust apps that can run in WasmEdge, wasmtime, Fermyon Spin, wasmCloud, etc., but would not run on v8's embedded Wasm engine …
One of the challenges is that Wasm supports multiple languages. So we will have to decide whether to start from a Rust app, a JS app, or something else. Would love your suggestions.
Disclaimer: I am a maintainer of WasmEdge. WASI-NN allows Wasm to be a wrapper of native tensor frameworks — very much the same way Python is a wrapper for Tensorflow and PyTorch.
The benefit of using Wasm as a wrapper is its high performance (use Rust to prepare data) and multi-language support (inference in JS!)
WasmEdge supports Tensorflow, PyTorch, and OpenVINO as WASI-NN backends.
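Roughly, the guest-side flow looks like the sketch below, written against the older unsafe wasi-nn Rust bindings; exact function and constant names differ between crate versions and runtimes, and the backend choice, tensor shape, and output size here are made up for illustration.

```rust
// Runs inference from inside a Wasm guest: the runtime's native backend
// (OpenVINO in this sketch) does the heavy lifting, the guest just moves bytes.
fn infer(model_bytes: &[u8], input: &[u8]) -> Vec<f32> {
    unsafe {
        // Load the model into whichever backend the host runtime provides.
        let graph = wasi_nn::load(
            &[model_bytes],
            wasi_nn::GRAPH_ENCODING_OPENVINO,
            wasi_nn::EXECUTION_TARGET_CPU,
        )
        .unwrap();
        let ctx = wasi_nn::init_execution_context(graph).unwrap();

        // Bind the input tensor (dimensions are illustrative: NCHW image).
        wasi_nn::set_input(
            ctx,
            0,
            wasi_nn::Tensor {
                dimensions: &[1, 3, 224, 224],
                type_: wasi_nn::TENSOR_TYPE_F32,
                data: input,
            },
        )
        .unwrap();

        // Run inference in the native backend, then copy the output back
        // into guest memory (output length is a placeholder).
        wasi_nn::compute(ctx).unwrap();
        let mut output = vec![0f32; 1001];
        wasi_nn::get_output(
            ctx,
            0,
            output.as_mut_ptr() as *mut u8,
            (output.len() * std::mem::size_of::<f32>()) as u32,
        )
        .unwrap();
        output
    }
}
```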