I've been mostly happy with Action Launcher. It has the few features I really liked from Nova that are missing from the Pixel Launcher: I can make my home screens scroll in a circular/infinite manner, I can remove the search bar and the Google news feed (or whatever they call the left page), and I can set more than one page in the dock.
Unfortunately, the app list page isn't quite as configurable. There are folders rather than tabs, and there's an extra click necessary to search by app name. Overall, it does the job.
Could you explain why? I have been interested, in theory, in Kotlin Multiplatform, but I'm already very comfortable in Dart and Flutter. I have decades of experience with Java and JavaScript, and quite a few years with TypeScript. Kotlin feels like a different kind of language, one I find grating. I think this is primarily aesthetic, but it's still enough to make getting over the initial hump annoying. As petty as it is, I think the lack of statement-terminating semicolons is a major reason I do not like it.
I would welcome a factual list of things that make the KM experience better for you.
Kotlin doesn’t feel right to me either. I did a portion of AoC in it this year and it was more verbose than I expected. I think the thing I liked least was the trailing lambda syntax, combined with how verbose it is to define variables with types.
It also inherits all of the bad parts of the JVM: crappy build tooling (Gradle), slow startup, and high memory usage.
Coming from writing a ton of Swift (and previously Obj-C), Kotlin’s ergonomics feel kinda off. It also feels like it’s different for the sake of being different more often than I’d like.
Can you give concrete examples? I played with Swift in school after already making the dive into Kotlin. It just felt like Kotlin but trapped on Apple. Part of the program even included a course in using a different language every couple weeks where we went through Scala, Lisp, and some others just to see what they could do.
Currently Kotlin is far and away my favorite language, but I also haven't looked into the newer languages recently and am interested in hearing the pain points people have, especially if they aren't just annoyances with Gradle.
Broadly speaking, Kotlin deviates from popular conventions more than Swift does. For example, Kotlin expects you to use `if` as an inline expression, whereas Swift's ternary operator works like it does in C, JavaScript, and many other languages.
There are also things like Swift's guard statements, which can help make intent clearer and read a bit more nicely.
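To make the comparison concrete, here's a minimal sketch of the difference being described (the `parity`/`greet` names are made up for illustration). Kotlin has no C-style ternary, so `if` does double duty as an expression, and the closest common stand-in for Swift's `guard let` is an early return via the Elvis operator:

```kotlin
// Kotlin has no ternary operator; `if` is an expression instead.
fun parity(n: Int): String =
    if (n % 2 == 0) "even" else "odd"   // C/Swift would write: n % 2 == 0 ? "even" : "odd"

// The closest Kotlin idiom to Swift's `guard let` is an Elvis-operator early return.
fun greet(name: String?): String {
    val n = name ?: return "anonymous"  // guard-style early exit when name is null
    return "hello, $n"
}

fun main() {
    println(parity(3))       // odd
    println(greet(null))     // anonymous
}
```

Whether the Elvis-return idiom reads as clearly as an explicit `guard` block is exactly the kind of taste call being debated here.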
Same opinion. It just feels off. Why use `fun` for function declarations instead of `func` or `function`? Dropping the () before { makes it hard to tell what runs first. It feels like it's trying to be different for the sake of being different. I'm not able to quickly skim Kotlin code like other C-like languages and tell what is going on, because it's trying to be too clever.
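For anyone who hasn't seen it, a small sketch of the trailing-lambda convention being complained about (the `twice` helper is hypothetical): when the last parameter is a lambda, Kotlin lets you move it outside the parentheses, and when it's the only argument the parentheses disappear entirely.

```kotlin
// `fun` rather than `func`/`function`; the last lambda parameter may trail outside the ().
fun twice(x: Int, op: (Int) -> Int): Int = op(op(x))

fun main() {
    println(twice(3, { n -> n + 1 }))   // explicit: lambda passed inside the parentheses -> 5
    println(twice(3) { n -> n + 1 })    // idiomatic: same call, lambda trails after the () -> 5

    // With a single lambda argument the parentheses vanish altogether,
    // which is the "hard to tell what runs first" skimming complaint above.
    val evens = listOf(1, 2, 3, 4).filter { it % 2 == 0 }
    println(evens)                      // [2, 4]
}
```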
I used Kotlin as well and it just feels off to me too. Package support is a major thing: I don't want to mess around in Gradle, I want something that Just Works™. Dart 3 has much of the same feature set as Kotlin now, with sealed class support; it's just not as functional. It also recently got tearoffs, so you don't have to specify the class, just the property, similar to Swift (i.e., if you have an `enum Color { red, blue }` and a function takes a `Color`, you can just do `f(.red)` instead of `f(Color.red)`).
The main thing, though, is that Dart has pub.dev and a CLI that makes it extremely easy to add packages via `dart pub add`. If I do want to go a more functional route, I'll just use Rust instead: it has everything Kotlin has and more, plus similarly streamlined package management in the form of `cargo add`.
Dart's pretty good. It has a lot of modern features (nullable types, pattern matching, sum types, and factory constructors) and some really good build tooling. It can compile fully AOT.
Eh, it's getting there, slowly at first but more rapidly now. It recently got tearoffs, as I explained in another comment:
> if you have an `enum Color { red, blue }` and a function takes `Color`, you can just do `f(.red)` not `f(Color.red)`
Dart is getting new features pretty fast; they really started focusing on DX more after Dart 2, and especially after Dart 3. Macros were supposed to ship, but they were incompatible with the goal of fast compilation, so other, smaller features will ship instead.
The big turnoff with Dart is the lack of built-in JSON (de)serialization; it's kind of shocking to have to resort to source code generation libraries in a modern language.
Also, it's statement-based instead of expression-based, and not immutable by default, which is kind of a drag; not the end of the world, but a bit unpleasant, IMO.
Serialization support is coming, probably this year. As for statements vs. expressions, Dart does have some expression forms, such as `if` and `for` inside collection literals, but changing it wholesale into an expression-based language would be too much of a breaking change.
Serialization support has been coming for years, I lost patience.
Otherwise, yes: some support for expressions, some support for immutability, no support for optional semicolons, and no privacy modifiers, so `_` gets littered everywhere.
I just found it to be an exceedingly ugly language when I used it a couple of years ago. Yes, some more pleasant modern functionality has been bolted on since then, but it's unfortunate that Dart was chosen as the backing language for Flutter, which is an awesome mobile framework.
Serialization has always been possible via libraries, so most people were doing fine with that. What's coming is native serialization support, but in practice it will be functionally the same: rather than you running build_runner, the compiler will do it for you. I'm not sure what you were hung up on, but there were always ways to solve it.
Dart is a pragmatic language; it has everything you need and a lot of benefits too, such as sound null safety (very few languages have this; Rust comes to mind), JIT and AOT support (JavaScript/TypeScript, such as for React Native, doesn't have both, and Kotlin is just getting there with Kotlin/Native, which still has a lot of issues), and now more functional programming concepts, with algebraic data types via sealed classes and pattern matching.
What language would you have chosen when Flutter came out circa a decade ago, or, we can be even more charitable and ask what language would you use today if you were to implement Flutter? I'm curious because everyone has their own ideas but they all don't work for one reason or another.
Isn't this just because Dart is way newer than those? It's from the 2010s, so it's really modern in comparison (same generation as Kotlin, Swift, and TypeScript).
Dart is hands down the best modern language out there for app development right now; what are you even talking about? I understand that a lot of people haven't used it, or haven't used it in years, and that probably drives a lot of the FUD, but it has had stupidly high ratings from the developers who do use it, and has for years.
Great progress! But it smells a lot like the language I had it pegged for when "underscore as a wildcard" lands in February 2025, two years after pattern matching landed.
How did they ship pattern matching in 2023, with a million examples of how to do it right already hashed out and in the wild, and then not figure out a wildcard symbol for two years?
* Dart was awful, lost to JavaScript because no one rated it highly enough to justify moving off JavaScript, and was practically dead until Flutter dusted off the corpse and pivoted away from the browser goals... so it's super weird revisionism to act like we're talking about some beloved evergreen language.
> How did they ship pattern matching in 2023, with a million examples of how to do it right already hashed out and in the wild... and then not figure out a wildcard symbol for 2 years?
We shipped support for `_` as wildcards in patterns with Dart 3.0 when pattern matching first shipped.
However, prior to Dart 3.0, `_` was already a valid identifier (as it is in most other languages). The feature you're mentioning from last year was to remove support for uses of `_` as an identifier outside of patterns. This way `_` consistently behaves like a wildcard everywhere in the language. We didn't ship that in 3.0 because it's a breaking change and those are harder to roll out without causing a lot of user pain.
It's OK to not like Dart. There are multiple popular languages for a reason. But it is helpful when communicating about a language to others to be accurate so that they can make their own informed opinions.
Dart wasn’t awful. It wasn’t adopted at the time because it had a distinct runtime that would have split the web in two, which nobody wanted. On top of that, it gave Google too much power, because they would then control both the runtime (V8) and the language (Dart).
TypeScript won and became king because it was pretty much JS 2.0 instead of JS++ like Dart.
In your version of history, Dart was always a great language... but Google was simultaneously too powerful for other vendors to allow Dart to proliferate, yet too weak to sustain it themselves, despite Chrome going on to do just that for many, many web standards.
I'm sure that's a really cozy idea, but it doesn't pass the "common sense" test: a bit like your random misuse of the term FUD.
The simple reality is that it wasn't very good, so no one was rushing to use it, and that limited how hard Google could push it. ES6 made JavaScript good enough for the time being.
Dart 1.x had a weak type system, and Dart 2 was adding basics Kotlin already had almost two years earlier. That was also around the time I first crossed paths with Flutter, and honestly, Flutter by itself was also pretty god-awful, since it was slowly reinventing native UI/UX from a canvas.
(It was a lot like Ionic: something you used when you had a captive user-base that literally couldn't pick a better product. Great for Google!)
> In your version of history Dart was always a great language... but Google was simultaneously too powerful for other vendors to allow Dart to proliferate, but also too weak to sustain it themselves despite Chrome going on to do just that for many many web standards.
"In my version of history"
It takes two seconds to find this out if you weren't there when it happened. Google had a fork of Chromium with the Dart VM, called Dartium; it wasn't a matter of resources. The industry flipped Google off, plain and simple.
Educate yourself before making such claims; the decision not to adopt Dart wasn't about its technical merits as a language.
The rest of your comment is just your opinion, so you do you. I'm not the Dart or Flutter devrel team here to sell you their product.
I guess this is the Dunning-Kruger effect everyone talks about!
To understand just enough to regurgitate what happened, but miss why it happened... and then to assume that someone who's pointing at the much more relevant "why" is just plain wrong.
Because the "why" requires actual understanding of things like developer mindshare, rather than regurgitating search results.
The hint I'll leave, if you're willing to consider that maybe you don't know everything: look at whose feedback is being promoted when Chrome wants to do obviously unpopular things on the web: https://github.com/webmachinelearning/prompt-api/blob/main/R...
And model for yourself what happens if developer interest exceeds vendor refusal in magnitude, so Google just ships the thing, without a feature flag, to a massive percentage of the web-going world.
Maybe Windows; I haven't used it in a long time. But I have noticed my son's MacBook Pro (it used to be my work laptop) only pretends to be available after waking. It repeatedly and silently fails to actually take input in the login password field, leading to missing characters in the password, so it takes several attempts to fill it out fully. I don't know what it's doing in this time, but not showing the "busy beachball" is a lie.
Neither "wanting to be liked" nor "wanting to avoid being disliked" rings true to me, at least as applied to my own social anxiety. I want to avoid being thought of at all. The idea of being liked is just as anxiety-producing as being disliked. Possibly more so. Every relationship with another person, positive or negative, is another cognitive burden to maintain. I would vastly prefer most of my interactions to just remain at the default/stranger level, where I can re-use the same anticipatory model for most people I deal with.
Tangentially related, I have for some time had a desire to write short stories, but the anxiety around revealing anything that might expose my inner self is probably the biggest reason why I don't.
I was reading a collection of short stories yesterday and came upon Michael Swanwick's "Slow Life". It struck me that it shares more than a few similarities to his "The Very Pulse of the Machine": Woman astronaut on a moon in the outer solar system is placed in lethal danger, encounters alien intelligence that communicates by reading/influencing minds, she isn't sure whether the communication is genuine or hallucinated, eventually the alien intelligence provides a long-shot resolution to save her.
Maybe Swanwick just had another story to tell with some of the same beats. It happens. Or maybe it's like bare feet in a Tarantino movie. The point is, the idea of someone examining my own stories and thinking such thoughts about me is extremely distressing. It's not being disliked that I try to avoid. I'm trying to avoid the baseline stress of social interaction.
I recognize the irony of opening up about this in writing. If you have something to say _about me_, please don't.
The weird thing about "the idea of being liked is just as anxiety-producing as being disliked" is that it's an incorrect prediction of reality: actually being liked wouldn't be. Thinking about it is really a different thing: it overestimates the stakes involved, it mistakenly invents "ideas of people" to do the liking, which don't behave like actual people, and it can't build any self-esteem by imagining people liking you, because these imagined people are under your own control; being liked by your own imagined people doesn't "count" the way being liked by real people would...
The human mind is not really designed to handle under-socialization well, and seems to fill in the empty space with imaginary figures which fail to meet its social needs. Taken outside its natural tribal operating regime, it bugs out in all kinds of strange ways.
> the idea of someone examining my own stories and thinking such thoughts about me is extremely distressing
This is a very familiar feeling to me, and in my experience it actually is a fear of being disliked, or more specifically about not being able to control others' reactions to me. But the fear is so great and unapproachable that the mind cordons it "out of sight" of conscious feeling.
It becomes better to not be thought of than to expose myself to the possibility of others seeing me poorly, especially if I'm not able to defend myself and make the case for my being seen with grace. I suspect that it is over-exposure to human meanness and judgement and under-exposure to kindness and grace which brings about this expectation of others' dispositions towards oneself; this perhaps is the reason for the Christian injunction that humans not judge one another--it guards against this particular failure mode of the social mind.
I looked at several yesterday. I landed on Action Launcher because it has a scrollable dock and infinite scrolling homepages. It lacks tabs in the app drawer, but it does let you create folders there.
Hyperion seemed to have all necessary features, but the UI was unintuitive and the documentation non-existent.
I also liked Lynx, but that would be a major change for me.
I don't watch YouTube, but my young kid loves to watch screaming Minecraft morons pretend to scare themselves. I've been growing increasingly concerned by the ads he's incidentally seeing, and just yesterday I was looking at giving in and subscribing to Premium. There's a normal Premium for $14 and a Premium "Lite" for $8. I'd have subscribed at $1/month, but those prices are absolutely insane.
Instead, I installed ReVanced on his tablet. For free.
To manage my child’s online viewing, I remove YouTube from all of his devices and instead use Pinchflat[1] to automatically download videos from a curated list of pre‑approved channels. We periodically explore YouTube together to discover new content aligned with his interests, which I then add to the list. Pinchflat retrieves new uploads within hours and automatically deletes them after a set period. The videos are stored locally and made available through Plex, Emby, or Jellyfin for remote access. This approach eliminates ads and algorithm‑driven recommendations while giving me full control over content, though it requires some setup, ongoing management, and storage capacity.
I don't have kids of my own, but I'm generally concerned about some of the advertising kids are exposed to these days (as seen through friends and family with kids). At the risk of viewing my own childhood with rose-tinted glasses, it seems much worse today. The main sources of ads back then were TV and newspapers, and while neither was "perfect" at this, both seemed to have much higher standards for what kinds of ads to allow than platforms do today.
Whenever I see my (9-year-old) nephew watching YouTube on their TV at home, I get a little horrified at some of the ads he's exposed to. But this has been normalized throughout his childhood, so it seems unremarkable to him, and his parents seem to be desensitized to it as well. I suppose some of this is bias on my part: I aggressively avoid exposing myself to advertising, using in-browser ad blocking, network-level ad-blocking, and OS-level DNS VPN-based ad-blocking on my phone. Whenever offered, I always pay for the service tier (like YouTube Premium) that removes ads. I get that this could get expensive real fast for a lot of people, and isn't feasible. (But I know a lot of people who don't even install browser ad-blockers, which is just baffling to me.)
But that's the big problem... people are being forced to choose between an uncomfortable level of financial expense, and paying for things with their attention, where that attention is being exploited with deeper and deeper psychological manipulation that has been fine-tuned over the span of decades. On the occasion I do see an ad (especially a video ad), my reaction to it is so viscerally negative that I mute audio and look away, even sometimes shutting it down entirely.
Feels like I'm in the twilight zone here. A few minutes of my pay per month for my kid to not be bombarded with bullshit ads seems like a good deal to me.
Why is it required that children have access to YouTube? I'm not that old, and we went long stretches with no TV in our household because we didn't have the money for it. It was fine. I'm fine. I didn't turn out normal, but that was for a lot of other reasons. Read a book, kid.
$14/mo is a lot just for brainrot so you can listen to your kid shout "skibidi Ohio sigma rizz" all the time
On the other hand, I had unlimited video games, which boomers were certain was going to doom me, yet it was fine, and I'm fine. My kids watch less YouTube than I played NHL '93 at their age, tbh.
> $14/mo is a lot just for brainrot so you can listen to your kid shout "skibidi Ohio sigma rizz" all the time
They'll get that from school; most of them aren't watching a YouTube vid and inventing culture, they're just picking it up, like every other generation.
> There's a normal premium for $14 and a premium "lite" for $8. I'd have subscribed at $1/month, but those prices are absolutely insane
This just demonstrates how much money is in those ads and how much YouTube needs to charge to compensate. Advertisers wouldn't pay that much if it wasn't at least somewhat worth it, i.e., the psychological manipulation is worth at least that much to their bottom line. The average person therefore "pays" with $14 worth of brainwashing.
You could obviously argue that $14 is just a ripoff and they don't make that much money off you. Sure you are not average and perhaps less influenced than others, but fun fact, a large majority of people believe that about themselves.
By the way, YouTube Premium still collects and uses your data. Without that, it would need to be even more expensive.
For me, $5/month would feel pretty reasonable, but $14 is out of control. Some of my friends are subscribed at that price, however, and consider it money well spent.
It's money well spent for me. It's the only "streaming service" I use. I've also heard multiple creators say that they get more out of a Premium view than an ad supported one.
The popular ones are boring and way overused.
Most people are happy with a subscription service that doesn't really serve their tastes or compensate the sources effectively.
* I won't really get into it unless I have a solution using hardware I own.
* People that care about it are incredibly passionate about it. For the rest it is a boring subject in and of itself.
* "It can change the world"
"it mostly worked" is just a more nuanced way of saying "it didn't work".
Apparently the author did eventually get something working, but it is false to say that the LLMs produced a working project.
Well, yeah. It’s a more nuanced way of saying that, because “it didn’t work” isn’t very useful or descriptive.
What if it wrote all of the boilerplate and just let you focus on the important bit that deserves your scrutiny?
You could say I failed every single project I ever built because it took many iterations to get to the final deliverable stage with lots of errors along the way. So more nuance would be needed.
But when it comes to LLMs suddenly we get all gleeful about how negatively we can frame the experience. Even among HN tech scholars.
I dunno. Depending on the writer and their particular axe to grind, the definition can vary widely. I would like it to mean, "any fixes I needed to make were minimal and not time-intensive."
You can use Zigler for a C NIF, using the easy_c (or c_src) options.
The big advantage is that it will automatically box/unbox to/from C values for you and generate sane error messages when you pass incompatible terms into the function (which Rustler does not, last I checked).
On the other hand, Rustler lets you precompile (which is coming in a future version of Zigler).
What is your definition of "a working project"? It does what it says on the tin (actually, it probably does more, because splint throws some warnings...).
Amen, thank you for noticing. The goal here was not to produce something of stellar quality, which is anyway out of the question as I don't have the skills/knowledge to evaluate anything other than "it returns the Elixir map I wanted". It was to see if this is feasible at all.