Hacker News | juntao's comments

Maybe you can try https://flows.network/, which supports environment variables when creating an app. You can also manage the environment variables later through its UI. If you click https://flows.network/flow/createByTemplate/Telegram-ChatGPT, you can see how to set up system_prompt.


I'm wondering what will happen if we let ChatGPT review these PRs created by ChatGPT.

Yes. We made a small tool to help developers review their PRs. It seems like a great supplement for Sweep AI.

Build your own PR review bot in 3 minutes here: https://github.com/flows-network/github-pr-summary
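The core idea behind such a PR review bot can be sketched as follows. This is illustrative only, not the actual github-pr-summary code, and all function names here are made up: split a unified diff into per-file patches, then build one review prompt per file so large PRs fit within the model's context window.

```rust
// Hypothetical sketch of a ChatGPT PR reviewer's pre-processing step:
// split a unified diff into per-file patches, one review prompt each.
// (Illustrative only; not the actual github-pr-summary implementation.)

fn split_patch_by_file(patch: &str) -> Vec<(String, String)> {
    let mut files: Vec<(String, String)> = Vec::new();
    for line in patch.lines() {
        if let Some(rest) = line.strip_prefix("diff --git ") {
            // "a/src/main.rs b/src/main.rs" -> "src/main.rs"
            let path = rest.split(" b/").last().unwrap_or(rest).to_string();
            files.push((path, String::new()));
        }
        if let Some((_, body)) = files.last_mut() {
            body.push_str(line);
            body.push('\n');
        }
    }
    files
}

fn build_review_prompt(path: &str, diff: &str) -> String {
    format!(
        "You are a code reviewer. Review this patch to {path}. \
         Point out bugs and risky changes, then summarize.\n\n{diff}"
    )
}

fn main() {
    let patch = "diff --git a/src/main.rs b/src/main.rs\n\
                 --- a/src/main.rs\n\
                 +++ b/src/main.rs\n\
                 @@ -1 +1 @@\n\
                 -fn main() {}\n\
                 +fn main() { println!(\"hi\"); }\n";
    for (path, diff) in split_patch_by_file(patch) {
        println!("{}", build_review_prompt(&path, &diff));
    }
}
```

The per-file prompts would then each be sent to the chat model; that API call is omitted here.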


Cool! Sweep actually already does this before the PR is shown to the user. I agree it might help to expose some of the Sweep-generated reviews (which we did before).


We made a tool to summarize Hacker News posts with ChatGPT. You can customize which topics you care about, then ask ChatGPT to give you a TLDR of the posts. Feel free to give it a try!

Discord version: https://github.com/flows-network/hacker-news-alert-chatgpt-d...

Slack version: https://github.com/flows-network/hacker-news-alert-chatgpt-s...

The Telegram version is on the way.
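The topic-matching step such a bot needs before asking ChatGPT for a TLDR can be sketched like this (names and data shapes are illustrative, not the repo's actual code):

```rust
// Hypothetical sketch: keep only HN posts whose titles mention a
// watched topic, then build a TLDR prompt per match.
// (Illustrative only; not the actual flows.network function code.)

struct Post {
    title: String,
    url: String,
}

fn match_topics<'a>(posts: &'a [Post], topics: &[&str]) -> Vec<&'a Post> {
    posts
        .iter()
        .filter(|p| {
            let title = p.title.to_lowercase();
            topics.iter().any(|t| title.contains(&t.to_lowercase()))
        })
        .collect()
}

fn tldr_prompt(post: &Post) -> String {
    // The actual ChatGPT API call is omitted here.
    format!(
        "Give a short TL;DR of this Hacker News post:\n{}\n{}",
        post.title, post.url
    )
}

fn main() {
    let posts = vec![
        Post { title: "Show HN: Run WebAssembly on the edge".into(), url: "https://example.com/a".into() },
        Post { title: "A history of medieval farming".into(), url: "https://example.com/b".into() },
    ];
    for p in match_topics(&posts, &["webassembly", "rust"]) {
        println!("{}", tldr_prompt(p));
    }
}
```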


Actually, that's why I got to your post so quickly.


GitHub as the conversational UI.


Hmm, we allow the CLI to execute AOT-compiled code sections embedded in Wasm files as a convenience feature -- you can do the AOT compilation once and then execute the result from your CLI.

For server-side deployment, you should always do the AOT compilation on the server. But I agree this could be clearer. We are adding a new CLI option to disable the AOT code segment in the Wasm file for people who do not wish to perform the separate AOT compilation step.
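For reference, the server-side flow described above might look roughly like this, assuming the wasmedgec AOT compiler that ships alongside the wasmedge CLI (tool and flag names can differ between WasmEdge versions):

```shell
# On the server: AOT-compile once, then run the optimized output.
wasmedgec app.wasm app_aot.wasm   # ahead-of-time compilation
wasmedge app_aot.wasm             # execute the AOT-compiled module
```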


I can think of a few:

* Lightweight (no GC, under 2MB)
* Designed to be a secure sandbox (not just a language runtime)
* Modular (add host functions as needed — eg support for Tensorflow)
* Multiple languages (esp compiled languages like Rust)
* Works with container tools (Docker, k8s etc)


Lightweight (no GC, under 2MB) => Which is why all GC-based languages are forced to bring their own GC when targeting WASM.

Designed to be a secure sandbox (not just a language runtime) => JVM and CLR also were designed as such. It didn't work out as well as planned, when black hat hackers started researching how to game the sandbox. Still open for WASM.

Modular (add host functions as needed — eg support for Tensorflow) => JVM and CLR have dynamic code loading capabilities. WASM still hasn't standardized dynamic linking.

Multiple languages (esp compiled languages like Rust) => Just like CLR, hence the name Common Language Runtime; C and C++ were part of the version 1.0 release.

Works with container tools (Docker, k8s etc) => Just like JVM and CLR do.


We are hoping to get WasmEdge running on as many IoT devices as possible …

We would love to see more containerization on the edge:

https://www.cncf.io/blog/2021/11/11/containerization-on-the-...


Disclaimer: I am a maintainer at WasmEdge.

One of the benefits of running JS in Wasm (specifically the QuickJS approach) is that you can create JS APIs in Rust. That allows you to move a lot of compute-intensive operations into Rust while still giving your developers a clean and nice JS API. WasmEdge does this with its JS Tensorflow API:

https://wasmedge.org/book/en/write_wasm/js/tensorflow.html

In fact, we are using Rust to support the entire Node.js API in WasmEdge. :)
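The pattern looks roughly like the host-function examples in the wasmedge_quickjs crate. This is a hedged sketch only: the trait and type names below (JsFn, JsValue, Context, new_function, eval_global_str) follow those examples but may differ between crate versions, and it must be built for a Wasm target and run inside WasmEdge.

```rust
// Sketch after wasmedge_quickjs's host-function examples (names may
// differ by crate version): expose a Rust function to JS so heavy
// work runs in Rust behind a plain-looking JS call.
use wasmedge_quickjs::*;

struct Checksum;
impl JsFn for Checksum {
    fn call(_ctx: &mut Context, _this: JsValue, argv: &[JsValue]) -> JsValue {
        // Compute-intensive work stays in Rust.
        if let Some(JsValue::String(s)) = argv.first() {
            let sum: i32 = s.to_string().bytes().map(i32::from).sum();
            return JsValue::Int(sum);
        }
        JsValue::UnDefined
    }
}

fn main() {
    let mut ctx = Context::new();
    let f = ctx.new_function::<Checksum>("checksum");
    ctx.get_global().set("checksum", f.into());
    // JS callers just see a normal function:
    ctx.eval_global_str("print(checksum('hello'))".to_string());
}
```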


Not to denigrate your work, it's good what you're working on... but I can create JS APIs in Rust with native V8 bindings, too: https://github.com/denoland/rusty_v8/blob/main/examples/proc...

After working on a couple of codebases that used wasmtime and wasmer heavily, I think if I were to start from scratch I'd just use V8 as my WASM runtime.

(And the model I would follow would be to containerize the V8 runtime piece inside its own cgroup, with resource budgets and permission constraints. cgroup accounting beats the limited and inefficient opcode-based accounting one gets from the various WASM runtimes.)
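The cgroup approach described above can be sketched with the cgroup v2 interface files; the group name and budget numbers here are illustrative, and this requires root plus a cgroup2 mount:

```shell
# Create a dedicated cgroup for the V8/Wasm runtime process
# (cgroup v2; run as root; budgets are illustrative).
mkdir /sys/fs/cgroup/v8-sandbox
echo "512M"         > /sys/fs/cgroup/v8-sandbox/memory.max   # hard memory cap
echo "50000 100000" > /sys/fs/cgroup/v8-sandbox/cpu.max      # 50ms per 100ms = half a CPU
echo $$             > /sys/fs/cgroup/v8-sandbox/cgroup.procs # move this shell in
# Any runtime started from this shell now inherits the budgets.
```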


Personally, I believe V8 is and always will be primarily focused on the browser use case. Its support for WASI and container tooling will always take a back seat to priorities in the browser.

Today, there are a large set of Rust apps that can run in WasmEdge, wasmtime, Fermyon spin, wasmCloud etc, but would not run on v8’s embedded Wasm engine …

However, I do agree that v8 excels in JavaScript.


The design of WASI and the component model are layered, though: you could start with V8 and build a runtime with full WASI etc. support.

In fact, Node.js and Deno have both done exactly that, a V8 core with layered APIs on top including WASI.


Will improve! Thanks!

One of the challenges is that Wasm supports multiple languages, so we will have to decide whether to start from a Rust app, a JS app, or something else. Would love your suggestions.


I've seen multiple docs have a default code area (JS), with tabs for other languages (Rust, etc.)


Thanks. I think JS would be a good option since it's a common language for web developers, who are most likely to be the target audience.


Disclaimer: I am a maintainer of WasmEdge. WASI-NN allows Wasm to be a wrapper for native tensor frameworks — very much the same way Python is a wrapper for TensorFlow and PyTorch.

The benefit of using Wasm as a wrapper is its high performance (use Rust to prepare data) and multi-language support (inference in JS!).

WasmEdge supports TensorFlow, PyTorch, and OpenVINO as WASI-NN backends.
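The wrapper pattern boils down to the wasi-nn call sequence. This sketch uses the spec-level function names from the WASI-NN proposal (load, init_execution_context, set_input, compute, get_output), which the Rust `wasi-nn` bindings mirror; exact signatures vary by crate version, and it only runs inside a Wasm runtime with a wasi-nn backend:

```rust
// Sketch of the wasi-nn call sequence (spec-level names; check your
// crate version). The host (e.g. WasmEdge) dispatches each call to a
// native backend such as TensorFlow, PyTorch, or OpenVINO.
unsafe {
    let graph = wasi_nn::load(
        &[model_bytes],                   // model blob(s) loaded earlier
        wasi_nn::GRAPH_ENCODING_OPENVINO, // which backend format
        wasi_nn::EXECUTION_TARGET_CPU,
    )?;
    let ctx = wasi_nn::init_execution_context(graph)?;
    wasi_nn::set_input(ctx, 0, input_tensor)?; // data prepared in Rust
    wasi_nn::compute(ctx)?;                    // native inference
    wasi_nn::get_output(ctx, 0, out.as_mut_ptr(), out.len() as u32)?;
}
```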

