Hacker News
Swift sucks at web serving or does it? (wadetregaskis.com)
146 points by MBCook on May 16, 2024 | 84 comments


Swift could have been a big player in web development 9-10 years ago if they had invested more in Linux support, with a good standard library (and [0] package manager). Nowadays I would say Go is the alternative, rather than Node, and maybe Rust for some extreme cases.

I wanted to love Swift, but there was (maybe still is) low Apple investment outside the Apple ecosystem.

[0] To this day I'm still not sure why Swift has a package manager if Xcode doesn't use it (or at least wasn't using it a few years ago); I've only loosely followed Swift development over the past few years.


I’ve been doing a lot of backend stuff recently with Dart after coming from a Node / Ruby background and I can’t begin to describe what a breath of fresh air it is. Insanely high quality of life improvement for development.

The experience is very similar to what people here are describing that they wanted from Swift.

Also, in a similar vein to TypeScript, it's another language I can now comfortably use and share between the front and back end, but with a very different and, I think, for most use cases better set of trade-offs.

Honestly I think it’s the most underrated modern language out there at the moment because people still only associate it with Flutter which is unfortunate.


The problem with backend dev is that it's 99% about plumbing between components written by someone else (db, cache, queue, other web services, etc.).

In that sense, the most important property of a language is its popularity: it will ensure connectors exist, and are used enough to be working fine.


One of the nice things about Dart in particular is that they are actually very aware that they are currently a small community.

In that sense, they ended up making a big focus on having great interop with other languages. Currently they have C, C++, Rust, Swift, JavaScript (browser and Node), Java, Kotlin, and Objective-C all more or less working, and an upcoming WASI goal will also open up that entire ecosystem.

Plus all of your usual interop solutions for out-of-process communication, i.e. anything that speaks gRPC (Ruby, C#, Go, PHP, etc.) gets strongly typed, automatically generated code libraries.


Many of those solutions rely on Flutter platform channels (i.e. serialization), which may not suit high-performance use cases.

Plug: native Swift platform channel implementation which I’ve been working on lately. https://github.com/PADL/FlutterSwift


I don't think this is wrong, but maybe missing some nuance:

Popularity definitely gives you "more stuff on the shelf", but it's also probably fair to say that at some level of popularity there's "enough" on the shelf.

What "enough" is, probably varies for different projects.

Just as an example, a while back I was involved in an Erlang project and while that ecosystem certainly does not have as much available as, say, Ruby, it had the pieces we needed, like a Postgres library and a few other things. The runtime itself was a better fit than others for the semi-embedded system we were working on, so in the end we chose that over more popular languages.


For sure you're right, it's always a tradeoff. But to compensate for a deficit in popularity, a technology needs to have tremendous merits.

Language performance is only important in some very specific cases (I/O trumps CPU in most use cases); type safety is now becoming pretty standard everywhere.

Your example of an embedded system is indeed a case where the environment has such specific constraints that the runtime behavior of the technology can make a huge difference. But for the regular "json <-> postgres" backends that are the bread and butter of our industry, I'd say it's an exception.


>In that sense, the most important property of a language is its popularity: it will ensure connectors exist, and are used enough to be working fine.

I would have thought so too, but surprisingly we have seen completely new ecosystems emerge where incumbents had seemingly covered all the bases.

When Node.js and Go came along, we already had Perl, PHP, Python, Java and C# for backends (as well as tons of less popular options).

Each new player offered some technical benefits, but I never felt that these relatively modest technical differences merited starting a whole new library ecosystem and doing it all over again.

I can only conclude that you and I are wrong to think that the availability of mature libraries dominates other factors such as people wanting to get rid of the previous generation's culture. Wiping the slate clean seems to be a deeply felt desire.


Most of the stalwarts you mention had been built to depend on more or less huge systems where the webserver/appserver was in a central position (Apache, J2EE, or IIS). Node had a certain minimalism, combined with client/server code sharing, going for it, and Go had the whole "be as cool as Google" vibe (also self-contained rather than tied to a big server), and both of them meshed well with the microservice trend of those days.

Yes, there was a certain hype driving them, but they also fit into the "devops" paradigm shift, where the unit of management became a container/application instead of begging a BOFH to configure part of a "large" server, at a time when servers were suddenly becoming commoditized. (The desire wasn't maybe so much a clean slate as not being held back by sysadmins.)


Fashion-driven development is obviously much more pervasive than we'd like to admit. And if I don't understand the previous tech better than the new one I might just as well use what's hot and all the rage now.


Dart's claim to fame was a static type system, but TS's type system eventually surpassed Dart's and then some: TS not only gives lots of escape hatches from soundness (because the JS ecosystem is messy and needs them), but it also doesn't repeat Dart's colossal mistake of making every generic type covariant and then inserting runtime checks on writes (something everyone knew was a mistake after Java arrays and Eiffel did it).

Dart has some things in its favor: it's not defined by underlying JS semantics, so it's free to jettison some of its legacy (bye bye `undefined`). Unfortunately its changelog doesn't paint a picture of an evolving language.
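The covariance mistake described above is easy to demonstrate. Here is a minimal TypeScript sketch (TypeScript itself keeps arrays covariant, and unlike Dart or Java it inserts no runtime check on the write):

```typescript
class Animal { name = "animal"; }
class Cat extends Animal { meow(): string { return "meow"; } }

const cats: Cat[] = [new Cat()];

// Covariance: a Cat[] is accepted where an Animal[] is expected...
const animals: Animal[] = cats;

// ...so the checker happily lets us put a plain Animal into what is
// really a Cat[]. Java arrays throw ArrayStoreException here; Dart
// inserts a runtime check on the write; TypeScript does neither.
animals.push(new Animal());

// The damage only surfaces later, at the use site:
// cats[1].meow(); // TypeError at runtime: cats[1].meow is not a function
console.log(cats.length); // 2
```

This is the trade-off being discussed: Java/Dart pay for the unsound assignment with a check on every write, while TypeScript simply accepts the hole as one of its "escape hatches from soundness."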


Dart "lost" because it was a completely different language from JavaScript, with a completely different, much smaller ecosystem of libraries, and thus huge switching costs.

TypeScript, being a type system over JavaScript, reuses the JavaScript ecosystem, which allowed incremental adoption.

Dart is now "winning" because Flutter is a very good way to write once and deploy on Android and iOS (and maybe in the future on web and desktop). That drives adoption of Dart and builds ecosystem of libraries.


Yah I just added the bit about Dart not being chained to JS, which is potentially a big win. But stuff like generic covariance is a whopper. Still, they fixed null-safety and implicit downcasts (which was pure WTF), so there's hope yet.


> Unfortunately its changelog doesn't paint a picture of an evolving language.

You haven't kept up with Dart in the last 3 or 4 years, have you??

They've improved almost every part of the language in this timeframe. I don't know what an "evolving" language is if Dart isn't one.


> You haven't kept up with Dart in the last 3 or 4 years, have you??

Not lately, but to my shame the changelog I looked at stopped at Dart 2, so I got a poor picture of things. Looks like variance annotations are still “experimental” though, but I take back what I said about Dart standing still.


It's easy to make that mistake if you don't use Dart. I was also shocked: I had played with Dart 1 long ago, and when I "came back" to it many years later, I found a totally changed language.

I would suggest having a look at the changes since Dart 2 here (Dart is currently at version 3): https://dart.dev/guides/whats-new

There's A LOT of new stuff (new releases every 6 months or something like that, similar to Java).


I also like Dart, having used it for Flutter.

How's the performance story when using it for the backend?

I love and use Go for 99% of backend work, and Node for scripts (usually there's an npm package that does what I need; done in 10 min) or spinning up a simple AWS Lambda.

Go is highly performant at all things, including CPU-bound tasks (no point comparing I/O-bound stuff, although it can handle a larger volume of those).

Plus, the scaling story is that it just keeps working fast the more resources you throw at it.

I don't use Rust because I don't need that level of perf. If Rust is a 10, Go is an 8-9. That's fine for me. For context, I have services doing 10-100k rps that cost me like $4 plus bandwidth each (depending on use case). I don't need beefy instances.


I mean it’s a fully compiled, optimized machine code binary that I deploy but having said that, it’s not Rust or Go and I wouldn’t expect 100k rps at all.

I just throw it up on Cloud Run and let it auto scale up and down as required and call it a day.


Stop putting Go in the same bucket as Rust. You've never looked at its low-level details. Please do.

It has a weak compiler and expensive write barriers, and it offers neither APIs to reach optimal hardware utilization (aside from a completely custom ASM dialect) nor the ability to define zero-cost abstractions.

Its purpose is fast networked back-end applications that are scaled horizontally (by cranking replica count, because Go's GC does not scale well with cores, and the CSP pattern has scalability issues too), but that's pretty much it.


I’m literally responding to someone who talked about Go and Rust who wanted a comparison.


Interesting. Thank you!

Cheers


How do you find it compared to Java or Kotlin?


Kotlin would be the most direct comparison for sure.

There is much less boilerplate than Java. Also, because it's a relatively new language (~10 years old) that was allowed to grow for a long time without a large community attached to it, but with an absolutely first-class technical steering committee, they actually had the chance to bring in a lot of really cool features without big, messy community migrations.

Even the major migrations that have occurred (i.e. introducing fully sound null safety a couple of years ago) were done in such a smooth way that it gave me a huge amount of hope for its future.


I write mostly Java and Kotlin professionally but have lots of Dart side projects.

Dart has evolved a lot in the last few years and is now a wonderful language to work with: not only does the language itself have lots of useful features (non-nullability by default, sealed classes and pattern matching, Isolates, list spreading, first-class await/async, stream generators), but it also has high-quality documentation, package-manager, and IDE support. The fact that you can hot swap code without fuss and benchmark it using the Dart DevTools is amazing (https://dart.dev/tools/dart-devtools).

Kotlin as a language has some features that are nicer (e.g. data classes, though Dart has just released the first part of its new macro system, which may make Dart more powerful than Kotlin in this regard) and others that are worse (e.g. pattern matching), so it's a close call. In terms of tooling, Kotlin has wonderful IDE support as well, but lacks in terms of code documentation (its doc system was at "alpha" quality level for years; I haven't checked lately, but I wouldn't be surprised if it's still beta), package management (it piggybacks on top of Maven, which is not so great IMO; it does the job, but it's not as nice as Rust's and Dart's; have a look at pub.dev, it's so good), and its main build system (Gradle seems to be the most popular, and everyone who has used it knows it's not that nice to use, though it's extremely powerful; note that you can use Maven just fine if you're not doing multi-platform, at least).

If you want to do multi-platform, Dart is a clear winner because you'll have Flutter, which has an extremely polished UX and stable support for all platforms. Kotlin MP is still catching up; I have used it a lot recently and it's all just coming out of beta now (and a lot of peripheral stuff is unfinished). Kotlin has an advantage, however: you can write common Kotlin code and use the native UI system, instead of going the Flutter way, which requires you to write the UI in Dart. So if that's what you prefer, Kotlin is the better option.

If you do backend web development, Kotlin is superior, I think, because it can use everything from the Java world (Spring, Jetty, all the instrumentation stuff, libs to do anything). But Dart is fine as well, especially if you want to keep one language for the back and front ends. And Dart has better interop with JS and a well-developed FFI for calling native code (which makes the lack of libraries less of a problem... it's not much of a problem anyway, as Dart has A LOT of packages these days).

In the end, I think that which one is "better" depends a lot on pure personal preference, but regardless of which language you pick, you'll be alright, they're both great languages now.


Apple just did not care enough about server development to invest in it. There are plenty of languages and frameworks out there. I wouldn't say Go and Rust are particularly logical choices for web development though. Of course they are being used for that but it's not necessarily the best fit.

Rust is kind of restrictive and requires some serious skills to be effective in. And with Rust developers being in high demand, you end up paying a premium for them. So if you need a simple backend, you might want to pick something more accessible. Go is of course very popular for all sorts of server stuff, but you mostly find it lower in the stack, in all sorts of middleware. They are both a bit low-level for web development.

People seem to gravitate to more flexible/richer languages here, which is why Java and JavaScript are very popular and why e.g. Python, PHP, and other scripting languages are still widely used for web development. On the native front you have things like Elixir and a few others gaining traction.

Swift being similar to Kotlin, which is popular for server stuff, could have stepped up easily. But it didn't happen. Mostly that's because Apple tightly controls its roadmap and just wasn't interested in it.

Interestingly, Kotlin now has an iOS compiler and a native compiler for Linux, and there are some Kotlin server-side frameworks that work on Kotlin Native (e.g. Ktor). However, it's not a great option for native server development yet, mostly for the same reasons Swift is not ideal either: library support is a bit lacking, and server/Linux support just isn't a huge priority in either community. So you get compiler bugs, lack of features, performance issues, etc.

On the JVM it's a different story. There Kotlin is a drop-in replacement for Java in pretty much any Java framework you can name, and there are a few Kotlin-specific ones as well. IMHO, it's a great fit with Spring Boot, which comes with lots of Kotlin support. I've been using it for years for this and can highly recommend it. I'm interested in switching to Kotlin Native for server stuff, but right now it seems like I'd be a very early adopter. Swift is very much in the same boat: you can probably make it work, but you'd be on your own.


Elixir is not a native language. It compiles to Erlang's bytecode format (BEAM) for the Erlang VM.


Swift was first released in June 2014, which is 1 month away from being exactly 10 years ago. So you wanted them to optimize it for web development as it was being birthed as a language?


The alternatives are the .NET and Java platforms, with plenty of languages to choose from, and AOT is also part of the picture, if needed.

Go, only if one feels like coding in the 1990s.

Even Node is better, thanks to TypeScript, and if there are performance issues, native modules are an option.


I was a long-time Windows .NET Framework dev and switched to TypeScript + Node for a while because that's what all the startups were using, and I thoroughly enjoyed it. One thing that stood out to me in particular was just how fluid working with types in TS is, and also how good the unit testing story is on Node.

Then switched back to .NET 6 when I joined another (rare) startup using C#/.NET for the backend. .NET now is a fantastic backend platform and I've been very pleasantly surprised.

Part of it is that the base framework now feels really complete and well put together -- it feels "batteries included", but with the Duracell branded batteries and not the cheap knockoffs. The Entity Framework ORM didn't seem as competent in the .NET Framework days, but now EF Core is really, really good; highly performant and ergonomic to use. Part of it is that the multi-platform capabilities matured a lot since the 4.8 Framework days; the team I'm on is all macOS and we deploy to a mix of AWS t4g Arm64 instances. Part of it is that there's no longer a hard dependency on Visual Studio for most workloads; the team is a mix of VS Code and Rider.

What really stood out to me after that stint with TS is how close TS and C# have become[0] and even more so with the more recent releases of C# 12.

I think it's probably easier for teams that are working with TS + Node to add C#/.NET workloads than any other language/runtime like Go or Rust when there is a benefit to adding multi-threading or getting higher throughput from some backend system.

[0] https://github.com/CharlieDigital/js-ts-csharp


Maybe going a bit off topic, but I'm down the exact same route: I used .NET Framework + EF as a Windows guy up until around 4.8, then I started TS development (not for the backend, but because I switched focus to frontend, while also simultaneously switching to the Apple ecosystem/iOS dev, etc.) and still love it.

If I were to start a new backend project today, I'd be going with Node.js/TS, but because that's the language/framework I'm used to, not because it's necessarily better. But I loved the C#/.NET/VS/EF/Azure stack just as much back in the day.

Would you recommend going in with .NET Core/Vscode/EF Core on macOS today, if you were to start a fresh backend project?


    > Would you recommend going in with .NET Core/Vscode/EF Core on macOS today, if you were to start a fresh backend project? 
"It depends". I also do a lot of work with Firebase and there, I prefer doing backend using TypeScript and Node on Cloud Functions. The front-end integration is really tight and Firebase is designed to work that way as a stack. That said, I'll still develop pure back-end workloads (e.g. async jobs) alongside my Firebase stack using C# and deploy into Cloud Run Jobs.

If you're building a pure backend, I'd choose .NET 8 with confidence because the overall DX feels better than Node to me. EF Core is really, really good; I prefer it to Prisma and the handful of alternatives on Node for relational DBs. I think .NET gives senior devs more "reach" (while still being approachable for junior devs) because it is multi-threaded[0], has source generators[1], interops with C++ easily, has deep optimization options for performance, and is generally a deeper platform than Node. Deployment into the serverless container runtimes is a breeze[2]. I also find myself patching CVEs and updating packages less in .NET land than in Node land (felt like it was a monthly and sometimes weekly chore in Node to be upgrading deps because of a security advisory).

Things I miss from Node are the aforementioned unit testing, JS's better hot reload (.NET's is serviceable, but not as good), discriminated unions (though it's possible to patch those on with libraries like OneOf[3]), and Node's greater package variety (while I think .NET has better package quality).

[0] https://chrlschn.dev/blog/2024/05/need-for-speed-llms-beyond...

[1] https://chrlschn.dev/blog/2023/08/dotnet-source-generators-d...

[2] https://www.youtube.com/watch?v=GlnEm7JyvyY

[3] https://github.com/mcintyre321/OneOf
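For anyone unfamiliar with the discriminated unions mentioned above, a minimal TypeScript sketch of the feature (the type and function names are illustrative):

```typescript
// A tagged union: the literal "kind" field discriminates the variants,
// and the compiler checks the switch below exhaustively.
type Result =
  | { kind: "ok"; value: number }
  | { kind: "error"; message: string };

function describe(r: Result): string {
  switch (r.kind) {
    case "ok":
      return `value = ${r.value}`;    // r narrowed to the "ok" variant
    case "error":
      return `failed: ${r.message}`;  // r narrowed to the "error" variant
  }
}

console.log(describe({ kind: "ok", value: 42 }));       // "value = 42"
console.log(describe({ kind: "error", message: "x" })); // "failed: x"
```

Adding a third variant to `Result` makes the switch stop compiling until it is handled, which is the safety net libraries like OneOf try to retrofit onto C#.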


I really should take EF Core apart and see how it all works before I next work on db-layer code. I've stolen ideas from older versions before now, but I suspect it deserves a deep dive on the current version.

(JS db layer code always makes me sad and I always end up missing https://p3rl.org/DBIx::Class)


The most interesting JS DB lib I came across was https://github.com/kysely-org/kysely, but it still didn't feel as good as EF Core.


Somewhat belated, but thank you - I'll probably have a play with that, it looks ... less annoying than anything else I've seen, at the very least.

Losing my temper and writing my own is still on the cards at some point, I imagine, but this was definitely worth a mention.


Yes, there are .NET teams using Macs exclusively; it's no different from using Node/Spring/etc., except with a much better experience and performance (Spring is notoriously slow; JS "performance" needs no introduction).


Xcode projects can have Swift package dependencies, and Xcode can open Swift packages directly.



And has for some time now. Haven’t used CocoaPods or Carthage in years (thank goodness).


Swift could have been a big player, but equally I can't see a particularly good reason why you'd choose it. I like it a lot as a language, but unless I was really wedded to the idea of reusing code between server and native app client, I wouldn't be pushing it. For one, it's way too easy to create memory leaks compared to garbage-collected languages.


Once you've worked with refcounting GC systems for a while, avoiding leaks becomes pretty much second nature.

It is absolutely a pain when you first come to it from a mark-and-sweep GC'ed language, but once figuring out the right spot to use a weak reference becomes automatic, creating leaks largely ceases to feel 'too easy' (though occasionally throwing a cycle checker at your code is recommended nonetheless).


For sure. But the question still remains: when you’re running on a high performance server that’s more than capable of speedy garbage collection, why bother?


My last comment was only addressing your 'too easy to' part (which, per sibling comment, may not be as universal an appearance as I thought).

However, fair question.

My answer here is that I am also really rather fond of timely destruction - in the same way as a Rust value's Drop happens as soon as the value is no longer required, a refcounting GC can fire destructors/finalizers immediately.

That means that e.g. if code that's using an "external" resource (think file descriptor) throws an exception, the resource will automatically be released during the stack unwind rather than having to wait for the next GC pass.

It also means that you can (and I, in fact, do) have a transaction object that can emit a warning at a useful point if you let it go out of scope without calling commit or rollback on it.

You can absolutely get (close) equivalents to this in a mark-and-sweep GC language (Go's 'defer' strikes me as something I could implement at least 90% of these patterns with), but it's still something you have to consciously implement.

What's trickier is e.g. if you have N objects in flight sharing a resource but want the resource to go away when all of those are done - in a non-refcounting language you either have to track the count yourself (i.e. effectively implementing a mini-refcounter) or again rely on GC to sweep it up soon enough not to be a problem.

So it's a trade-off between having to think about resource releases and having to think about breaking cycles, and I'm not at all arguing refcounting is -better- but I personally mostly find it just -different- and I'm definitely not convinced that getting used to it is (much) harder than getting used to the relevant dances to not temporarily leak file descriptors and similar.
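The "mini-refcounter" for a shared resource described above can be sketched in a few lines; this TypeScript sketch is illustrative (the `SharedHandle` name and API are invented), showing the release function firing deterministically when the last of N users is done:

```typescript
// Hypothetical mini-refcounter: N in-flight users share one resource,
// and the release callback runs exactly when the last user lets go --
// deterministic timing, no GC pass required.
class SharedHandle<T> {
  private count = 1;

  constructor(private resource: T, private release: (r: T) => void) {}

  // Hand the resource to one more in-flight user.
  acquire(): SharedHandle<T> {
    this.count++;
    return this;
  }

  get value(): T {
    return this.resource;
  }

  // Each user calls done() exactly once; the last call frees the resource.
  done(): void {
    if (--this.count === 0) this.release(this.resource);
  }
}

// Usage: three workers share a connection; it closes after the third done().
let closed = false;
const handle = new SharedHandle({ fd: 42 }, () => { closed = true; });
const worker2 = handle.acquire();
const worker3 = handle.acquire();
handle.done();
worker2.done();  // closed is still false here
worker3.done();  // last reference released: closed === true
```

This is exactly what a refcounting runtime does for you automatically; writing it by hand also reintroduces the "forgot to call done()" class of bug that GC otherwise papers over.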


Given my experience with COM, and Cocoa, I pretty much doubt it.


Possibly it just fits my brain unusually well; possibly those environments have more footguns than the ones I've used refcounting in.

I do think something Recycler-enabled is preferable, but I'm still very fond of deterministic destruction and personally willing to make the trade-off.


re: why does Swift have a package manager if Xcode doesn’t use it:

I occasionally use Swift but almost never use Xcode. I find that the command-line Swift tools, including the package manager, with Emacs and/or VSCode, make for a nicer dev experience.


>A few folks asserted that the CPU-heavy nature of calculating Fibonacci numbers isn’t representative of web servers generally. Multiple people noted that – in the Swift implementation, at least – the majority of the CPU time was spent doing the Fibonacci calculation. Some felt this was therefore not a useful benchmark of Vapor itself. A lot of this boiled down to the “no true Scotsman” problem, which is very common in benchmarking, with a bit of perfect world logical fallacy peppered in, trying to identify the One True Representative Benchmark

Well, it doesn't take any True Scotsman to understand that "busy calc of Fibonacci" is a crap web server benchmark.

>Accusations were made pretty quickly that the benchmark is “unfair” to Swift because Swift doesn’t – it was asserted – have a properly-optimised “BigInt” implementation, unlike all the other languages tested. No real evidence was given for this. Even if it were true, it doesn’t invalidate the benchmark – in fact, it just makes the benchmark more successful because it’s then highlighted an area where Swift is lacking.

This takes defensiveness to the next level.

Sure, it might highlight an area where Swift is lacking, and even lead to an opportunity for improvement there.

But an even more important question is: does it say anything about what it was supposed to say (and the conclusions they drew from it), that is, whether Swift is good for web serving?

To that end: are BigInts a big part of web serving? I've used them maybe a couple of times in 20 years of web/API work, in very specialized cases.

Instead of finding various excuses for why the benchmark is not crap, maybe write a decent benchmark to begin with? One can find stupid, non-relevant bottlenecks in any language (e.g. nerf Node by blocking the event loop, find some PHP operation that's hella slow and lean on that, etc.).
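For context on why the Fibonacci workload measures the big-integer library rather than the web framework, a small TypeScript sketch (using JS BigInt as a stand-in for whatever arbitrary-precision type each benchmarked language provides):

```typescript
// Iterative Fibonacci over arbitrary-precision integers.
function fib(n: number): bigint {
  let a = BigInt(0);
  let b = BigInt(1);
  for (let i = 0; i < n; i++) {
    const next = a + b;
    a = b;
    b = next;
  }
  return a;
}

// fib(93) is the first Fibonacci number that no longer fits in a signed
// 64-bit integer, so for the large n a benchmark hammers per request, the
// time goes into arbitrary-precision addition, not routing or parsing.
console.log(fib(10).toString());         // "55"
console.log(fib(100).toString().length); // 21 (digits in fib(100))
```

Which is the commenters' point: past n ≈ 93 the benchmark is mostly a contest between BigInt implementations, with the web framework along for the ride.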


From memory you do need BigInts for ASN.1 decoding which is used in PKIX and thus TLS. Not a bottleneck though.


Maybe, but none of the others are benchmarked with TLS either; who terminates TLS with Node or PHP or even Java? So why would the Swift version need to handle it?


Ha, ok, the fact I am not a web developer shows. Comment withdrawn :)


Sounds like macOS sucks at web serving :)

I guess one lesson not explicitly mentioned is that doing benchmarking on something like macOS can have a lot of pitfalls compared to systems more intended for this kind of workload (Linux...).

But the mention of wrk immediately stood out as a red flag; there is a notable problem with wrk called "coordinated omission":

https://github.com/wg/wrk/issues/485

https://github.com/giltene/wrk2

https://news.ycombinator.com/item?id=34148502

Benchmarking is difficult. And there is a lot of subtlety in how you set up the client(s) and servers that can heavily impact the results.
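To make the coordinated omission issue concrete, here is a toy simulation (all numbers invented for illustration): a server that takes 1 ms per request stalls once for a full second; a closed-loop client like wrk records the stall as a single slow sample, while an open-loop client that keeps issuing requests on schedule records the queueing delay every request behind the stall actually experienced:

```typescript
// Toy model: 1 ms service time, one 1000 ms stall starting at t=500.
const STALL_AT = 500;
const STALL_MS = 1000;
const SERVICE_MS = 1;
const TOTAL = 2000; // requests the open-loop client intends to issue, 1/ms

// When does a request whose service would begin at `start` finish?
function serviceEnd(start: number): number {
  const begin =
    start >= STALL_AT && start < STALL_AT + STALL_MS ? STALL_AT + STALL_MS : start;
  return begin + SERVICE_MS;
}

// Closed-loop client (wrk-style): the next request goes out only after the
// previous response arrives, so the stall shows up as ONE slow sample.
const closedLoop: number[] = [];
let now = 0;
for (let i = 0; i < TOTAL; i++) {
  const end = serviceEnd(now);
  closedLoop.push(end - now);
  now = end;
}

// Open-loop client: requests are issued on schedule whether or not earlier
// ones completed, so everything queued behind the stall also records it.
const openLoop: number[] = [];
let free = 0; // time at which the server becomes free again
for (let t = 0; t < TOTAL; t++) {
  const end = serviceEnd(Math.max(t, free));
  openLoop.push(end - t);
  free = end;
}

const p99 = (xs: number[]) =>
  [...xs].sort((a, b) => a - b)[Math.floor(xs.length * 0.99)];

console.log(`closed-loop p99: ${p99(closedLoop)} ms`); // 1 ms (stall hidden)
console.log(`open-loop p99:   ${p99(openLoop)} ms`);   // 1001 ms
```

Same server, same stall: the closed-loop p99 is 1 ms because the client simply stopped sending during the freeze, which is exactly the omission wrk2 was written to correct.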


People who read the entire article might have noticed many of these problems were reproduced on Linux.


The default limits of macOS are not really web server benchmark friendly:

  ulimit -a
  Maximum size of core files created                              (kB, -c) 0
  Maximum size of a process’s data segment                        (kB, -d) unlimited
  Maximum size of files created by the shell                      (kB, -f) unlimited
  Maximum size that may be locked into memory                     (kB, -l) unlimited
  Maximum resident set size                                       (kB, -m) unlimited
  Maximum number of open file descriptors                             (-n) 256
  Maximum stack size                                              (kB, -s) 8176
  Maximum amount of CPU time in seconds                      (seconds, -t) unlimited
  Maximum number of processes available to current user               (-u) 5333
  Maximum amount of virtual memory available to each process      (kB, -v) unlimited

Debian's limits are better:

  ~> ulimit -a
  Maximum size of core files created                              (kB, -c) 0
  Maximum size of a process’s data segment                        (kB, -d) unlimited
  Control of maximum nice priority                                    (-e) 0
  Maximum size of files created by the shell                      (kB, -f) unlimited
  Maximum number of pending signals                                   (-i) 30744
  Maximum size that may be locked into memory                     (kB, -l) 993336
  Maximum resident set size                                       (kB, -m) unlimited
  Maximum number of open file descriptors                             (-n) 64000
  Maximum bytes in POSIX message queues                           (kB, -q) 800
  Maximum realtime scheduling priority                                (-r) 0
  Maximum stack size                                              (kB, -s) 8192
  Maximum amount of CPU time in seconds                      (seconds, -t) unlimited
  Maximum number of processes available to current user               (-u) 30744
  Maximum amount of virtual memory available to each process      (kB, -v) unlimited
  Maximum contiguous realtime CPU time                                (-y) unlimited

For testing I always create an instance somewhere (mostly AWS) and configure the OS to have much higher performance limits to enable better performance outcomes.


Interestingly, macOS does have a "server performance mode" you can set with kernel boot args:

https://apple.stackexchange.com/questions/264958/what-does-s...

https://support.apple.com/101992


I feel like comparing the performance of Vapor against Node.js in May of 2024 isn't all I want to see. Yes, Node.js still dominates, but we know other runtimes like Bun run much faster.

That part aside, I have to say Vapor kind of took over my brain for a year when I was in college. So much so that I actually built some freelance sites with it. A typed language for web development?! I loved the idea of using Swift to make websites, and honestly everything just worked so well... until it didn't.

Remember Heroku? Well, there was a Heroku buildpack that made a Vapor site build with a push to a git repo. It worked great 4 years ago. But I'd been ignoring Heroku emails (my email is a landfill) for about 3 years, and little did I know they had completely end-of-lifed the Postgres version I was using, to the point where the entire build was bricked. I could have spent like 5 hours downloading Xcode, getting it updated, and working through errors, but instead I just spun up a Next.js TypeScript site and had a working solution with the code ported over (yay for static small sites) and deployed live. Today TypeScript does the job that Swift did for me back then: enabling types for web dev. Anyway, thanks to everyone who worked on or currently works on Vapor!


Swift has a pretty good VSCode extension these days actually.

Check out https://marketplace.visualstudio.com/items?itemName=sswg.swi...


Solid is pushing it. It's OK, but still miles behind Go, Rust, and C++, partly because swift-lsp-server itself is lacking in functionality.


> A typed language for web development?!

What a novel idea.


Even the selection of frameworks is kind of weird, versus

https://www.techempower.com/benchmarks

This is what it would be interesting to see Vapor/Swift on.


I don't think this blog post ever intended to provide a comprehensive comparison between Swift/Vapor and other languages/frameworks.

I think the bigger lesson from this writeup is that we should be careful when looking at these kinds of benchmark comparisons because there may be much more nuance to their results than there initially appears to be.


I think so too, I also saw it as a lesson on careful and methodical debugging. I've been in situations like the one written up, trying to debug something tricky where each team member is coming up with multiple wrong hypotheses and missing something obvious.


Yeah, and it leaves out a lot of details.

For example, there's no Debian build of Swift. And when I ported the Ubuntu version to a .deb, I ran into odd bugs, like when you kill it, it lives on! Apparently a known issue.

Basic _big_ bugs that show Apple's half-assed commitment to open source on any platform other than their own. I can't blame them, but Swift on Linux: beware.

Swift is a wonderful language; sadly I'm not seeing the commitment from Apple to make it ubiquitous. Also of note, Apple has basically killed macOS as a server, pushing Swift more towards being a system/UI target language; which is a shame, because again, it's a wonderful, safe language.


I find it kind of surprising that they don't put more effort behind Swift on Linux; it would make learning Swift a generally more valuable skill. That would increase the pool of people willing to write Mac software and, in particular, standalone libraries.


I don't find it at all surprising that Apple is not putting any effort into Linux and is instead pooling their developer efforts behind other things. Hopefully those include things like SwiftUI, a framework that has existed for around five years and still feels a bit immature.


Yes, SwiftUI, Swift Concurrency, and to an extent SPM have been quite frustrating. SwiftUI on macOS in particular is a major step backwards from AppKit as it is right now.


Foundation is going through a gradual open-source rewrite by Apple and the Swift community, which is going to take a while but will eventually open up Swift more to other platforms.


By the time the Swift project eventually has a great message to share on that, there could well be hardly any non-iOS/macOS developers listening. Look at the last 5-10 years. The momentum gap between e.g. Rust and Swift in the wider development community is gargantuan.

IMO if Swift had had a great multiplatform toolchain/developer-experience in the ~2015-2019 timeframe then it could have established itself as the no-brainer C++ successor. But that moment has passed and they squandered the opportunity.


It still has an opportunity to be so, but only in the context of Apple's kingdom.

If nothing else, because the King says what tools are allowed.


Yeah I have high hopes for the new Foundation. The current open source Foundation is OK but it doesn't quite match what's there on Apple platforms and makes it all a bit difficult. Having one shared implementation should improve things.


It's still funny how this is positively encouraged on HN, yet .NET receives a constant stream of FUD and criticism for already being what Swift hopes to be in some 5 to 10 years in terms of platform agnosticism (both the runtime and the tools).


There's a project for using Swift to write GNOME applications, which is fascinating to me: https://github.com/AparokshaUI/adwaita-swift

I wish them success, but realistically they won't get it.


Yeah, I get that it's frustrating. But it is open source, and there isn't anything stopping you or anyone else from making contributions that bring the language to more places. Sure, various aspects of the project could be improved, but that doesn't mean it's "half-assed".


Vapor is on that list, although it has not implemented all tasks (and thus has no composite score). It has results for JSON serialization, and database interop. It places near the bottom in each task it has completed.
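For context, the JSON-serialization task in that benchmark suite is tiny: encode a one-field struct per request. In Swift that boils down to Codable plus JSONEncoder; a minimal, framework-free sketch of the workload (the `Message` type is illustrative, not Vapor's actual benchmark code):

```swift
import Foundation

// Roughly what the JSON-serialization test exercises on each request:
// encode a one-field struct to a JSON body.
struct Message: Codable {
    let message: String
}

let data = try! JSONEncoder().encode(Message(message: "Hello, World!"))
print(String(data: data, encoding: .utf8)!)
// {"message":"Hello, World!"}
```

In a real Vapor handler the struct would conform to Vapor's `Content` protocol and be returned from a route, but the encoding work per request is essentially the above.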


Are the kernel bugs actual bugs, or just behavior that isn't what one assumed? Lots of socket edge cases are unexpected, but also 30 years old.


He's wrong about increasing somaxconn. That'll just increase 'bufferbloat' if your service can't keep up processing requests and is overloaded. The upstream load balancer will notice far too late and keep sending you way too many requests.
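Back-of-the-envelope for why a huge accept backlog can hurt an overloaded server: connections sitting in the backlog accrue latency before the load balancer sees any failure. The numbers below are illustrative, not from the article's benchmark:

```swift
// Hypothetical figures for an overloaded server.
let serviceRate = 1_000.0   // requests/s the server can actually drain
let backlog = 4_096.0       // somaxconn raised to a large value

// A connection at the back of a full backlog waits roughly this long
// before it is even accepted -- invisible to the load balancer,
// which keeps routing new traffic here the whole time.
let worstCaseWait = backlog / serviceRate
print("worst-case queueing delay: \(worstCaseWait) s")
// worst-case queueing delay: 4.096 s
```

With a small backlog the kernel refuses connections quickly instead, which is exactly the signal a load balancer needs to back off.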


Interesting article (did not finish it); however, the little asides feel soooo condescending. It impedes the reading in my case. Might just be me though; don't pay too much mind to it.


Makes me wonder how the other languages/frameworks would perform if experts were also looking into them and why they were so slow. Recent versions of pure native PHP, in particular, are incredibly fast and lightweight, and I'm quite surprised by its pokey showing here.


Is there a mistake in the Swift graph of 'Success rate (Y) over concurrent requests (X)'? It shows a near-constant 1.0 success rate, but the description says "[Swift] had higher throughput than PHP yet requests started failing much sooner as load increased".

EDIT: Nm, it must be referring to the slight dip that starts early in the Swift graph.


That's the problem OP mentions quite early: the benchmark didn't make sense.


The benchmark comparison should have used the most popular* web framework for each respective platform and included .NET too.

*I could write a web micro-framework with the U8String I'm working on, or raw Span/Memory&lt;byte&gt; slices in C#, with zero-copy request argument/body binding, and beat all of these. Would it be a representative experience compared to ASP.NET Core? No, because that's not what would be used in an average scenario.


Yeah one could use https://github.com/walkor/workerman for PHP and it would probably beat all other frameworks in the graph.

Would it be a representative experience compared to PHP? No, because that's not what would be used in an average scenario.


It would still be slower :P

At least once you start "gaming" benchmarks, interpreted and/or dynamically typed languages have a strict ceiling they can't really surpass (just.js doesn't count, as it's as thin a wrapper on top of C as it can get).

https://www.techempower.com/benchmarks/#hw=ph&test=fortune&s...

(All top entries are bottlenecked by the DB driver implementation and its ability to multiplex queries, plus context-switching cost, so the frameworks that can do perfect static partitioning and query multiplexing win out.)


Wonderful travel down the rabbit hole.


I am curious about the full PHP setup, because these benchmarks seem like very uncharitable stats.


I have no idea what I just read



