
Have been using Jellyfin on my Synology server for the past 2 years. It's a dream, but you do need to name your files/directories per their naming convention: https://jellyfin.org/docs/general/server/media/shows/ -- https://jellyfin.org/docs/general/server/media/movies/


I've always wondered why Jellyfin doesn't have better support for parsing Scene release names, which follow strict naming conventions.

https://en.wikipedia.org/wiki/Standard_(warez)

https://scenerules.org/t.html?id=2020_X265.nfo

99% coverage is achievable via a few straightforward regexes.
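For illustration, here's a minimal sketch of what that kind of parsing could look like in Python; the patterns and field names are my own, not what Jellyfin actually uses:

```python
import re

# Minimal sketch of scene-name parsing; field names are my own, not Jellyfin's.
# Matches e.g. "Some.Show.S01E02.1080p.WEB-DL.x265-GROUP.mkv"
TV_RE = re.compile(
    r"^(?P<title>.+?)[. _]S(?P<season>\d{1,2})E(?P<episode>\d{2,3})"
    r"(?:[. _](?P<extra>.+?))?(?:-(?P<group>[^.-]+))?\.(?P<ext>\w+)$",
    re.IGNORECASE,
)

# Matches e.g. "Some.Movie.2019.2160p.BluRay.x265-GROUP.mkv"
MOVIE_RE = re.compile(
    r"^(?P<title>.+?)[. _]\(?(?P<year>(19|20)\d{2})\)?"
    r"(?:[. _](?P<extra>.+?))?(?:-(?P<group>[^.-]+))?\.(?P<ext>\w+)$",
    re.IGNORECASE,
)

def parse_release(filename):
    """Return parsed fields for a scene-style release name, or None."""
    for kind, pattern in (("episode", TV_RE), ("movie", MOVIE_RE)):
        m = pattern.match(filename)
        if m:
            fields = {k: v for k, v in m.groupdict().items() if v}
            fields["title"] = fields["title"].replace(".", " ").replace("_", " ").strip()
            fields["kind"] = kind
            return fields
    return None

print(parse_release("Some.Show.S01E02.1080p.WEB-DL.x265-GROUP.mkv"))
print(parse_release("Some.Movie.2019.2160p.BluRay.x265-GROUP.mkv"))
```

Real release names have more edge cases (multi-episode files, year-in-title movies, season packs), but a small pattern set like this covers the overwhelming majority of scene-compliant names.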

Don't get me wrong though, I really like and appreciate Jellyfin, especially on Apple TV with Swiftfin. It's my daily driver for big screen entertainment and it's amazing, 10e9 times better than Chromecasting from a laptop to Google TV, which is a horrible UX (no pause button on the TV) and would also randomly freeze for 5-30 seconds every few minutes.

Plex was nice too, and works great if you are okay with being at the mercy of a closed system for your media center. Though I sure don't miss those pointless forced UI "downgrade in functionality" updates!


Jellyfin parses scene naming conventions fine. I have thousands of films and hundreds of TV shows (with thousands of episodes) all in scene name format, and I can think of only a handful of matching errors in Jellyfin, usually due to a commonish film name combined with a wrong year or something similar.


I agree it works most of the time, but sometimes it falls flat on its face, especially for TV episodes and even entire seasons.


Same here. I basically never edit metadata; I only occasionally add an IMDB id when a mismatch happens.


You can automate that with sonarr/radarr/lidarr. Works like a dream.


Add prowlarr and bazarr for the full win.


The full win would also include 2 instances of readarr (one for ebooks and one for audiobooks), whisparr for your 18+ needs, stash as a frontend for your 18+ needs, and I think audiobookshelf and kavita to play the audiobooks and ebooks respectively. If you're into comics you might also want to throw mylar3 in there, and if you have multiple users (aka a spouse and/or offspring) you may want to throw ombi in there too.


Some additional recommendations:

- Setup two Radarr instances (normal and 4K) and two Sonarr instances (TV and anime)

- Follow the TRaSH guides to improve media quality

- Komga to read manga and comics

- Readarr Calibre integration to delegate parsing & metadata to Calibre

- Jellyseerr or Overseerr instead of Ombi

- Unpackerr to catch and handle compressed media

- autobrr if you’re looking to build ratio


Why use a separate instance for 4k? Can't you just use a separate quality profile? And why a separate one for anime? It seems unnecessary to me.


For Radarr, it's for cleaner separation and easier automation. Also, keep in mind that you cannot have the same movie with two qualities on the same instance.

My 4K instance has one root path (different from non-4K) and one quality profile that's auto-selected by default. Using the Radarr connect option, I have a quality profile on the non-4K instance that automatically adds the movie to the 4K instance. This way, you can ensure you have both 4K and non-4K copies of certain movies (e.g., for external access/transcoding). You also ~never need to actually interact with the 4K instance. See this page for details: https://trash-guides.info/Radarr/Tips/Sync-2-radarr-sonarr/
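If you're curious what that cross-instance add amounts to, here's a rough sketch against the Radarr v3 API; the hosts, API keys, profile id, and root folder are placeholders for your own setup, and the built-in connect option already does this for you without any scripting:

```python
import requests

# Hypothetical hosts/keys for illustration only.
NON_4K = {"url": "http://localhost:7878", "key": "NON4K_API_KEY"}
FOUR_K = {"url": "http://localhost:7879", "key": "4K_API_KEY"}

def list_movies(instance):
    r = requests.get(f"{instance['url']}/api/v3/movie",
                     headers={"X-Api-Key": instance["key"]})
    r.raise_for_status()
    return r.json()

def add_movie(instance, movie, quality_profile_id, root_folder):
    payload = {
        "title": movie["title"],
        "tmdbId": movie["tmdbId"],
        "qualityProfileId": quality_profile_id,   # the 4K profile on the target instance
        "rootFolderPath": root_folder,            # e.g. a separate /movies-4k root
        "monitored": True,
        "addOptions": {"searchForMovie": True},
    }
    r = requests.post(f"{instance['url']}/api/v3/movie", json=payload,
                      headers={"X-Api-Key": instance["key"]})
    r.raise_for_status()

# Mirror every movie from the non-4K instance onto the 4K one
# (no tag filtering in this sketch).
existing_4k = {m["tmdbId"] for m in list_movies(FOUR_K)}
for movie in list_movies(NON_4K):
    if movie["tmdbId"] not in existing_4k:
        add_movie(FOUR_K, movie, quality_profile_id=1, root_folder="/movies-4k")
```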

For Sonarr, it's also for separation, but less of an issue if you're on Sonarr v4. The one thing you gain even on v4 is the ability to have different quality definitions for anime and non-anime content. See: https://trash-guides.info/Sonarr/Sonarr-Quality-Settings-Fil...


There are anime-specific trackers that are better for downloading anime, especially if you're looking for non-English subs, but radarr doesn't have a way to tell it to use a specific tracker for a particular series, so having a separate radarr instance with only those trackers on it ensures it downloads from them every time.


Pretty sure the 4k thing is a hard limitation due to not being able to select the same root path or something. I don't remember tbh.

For anime, that's the first time I've heard that suggestion, but anime in general is super annoying to download since it often doesn't have real seasons. Wonder how it helps.


I have a separate radarr for anime because there are anime-specific trackers that are much better for downloading anime, especially if you are looking for non-English subs, but radarr doesn't have a way to pick a specific tracker to download a particular series from. So my anime radarr instance only has those specific trackers on it and will download from them every time.


Which non-English subs do you look for? I try to look for Chinese subs and it’s been a struggle to find a good tracker.


Rather than a whole separate Chinese tracker, Plex at least has a pretty robust feature where you can have it automatically find and download subtitles in whatever language you set. The only problem there is that sometimes the timing is a little off from the actual video, so the subs don't pop up at the right time, but you can change the offset with just a few clicks until it's right, and then it's usually the same for every episode going forward.

If you're not using Plex or find it doesn't work so well for Chinese subs, you can also try bazarr, which does the same thing with subs but is standalone software.

So with the above setup you can download the actual series from wherever, and it'll fetch Chinese subs for you and add them automatically. Otherwise, for actual Chinese public trackers, try share.dmhy.org and mikanani.me.
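If you'd rather fix a badly-timed .srt file once instead of nudging the offset in the player each time, a standalone sketch like this (not part of Plex or Bazarr) shifts every timestamp by a fixed number of milliseconds:

```python
import re
import sys
from datetime import timedelta

# Shift every timestamp in an .srt file by a fixed offset (positive = later).
TIMESTAMP = re.compile(r"(\d{2}):(\d{2}):(\d{2}),(\d{3})")

def shift_match(m, offset_ms):
    h, mi, s, ms = (int(g) for g in m.groups())
    total = timedelta(hours=h, minutes=mi, seconds=s, milliseconds=ms) \
            + timedelta(milliseconds=offset_ms)
    total_ms = max(0, int(total.total_seconds() * 1000))
    h, rem = divmod(total_ms, 3_600_000)
    mi, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{mi:02}:{s:02},{ms:03}"

def shift_srt(path_in, path_out, offset_ms):
    with open(path_in, encoding="utf-8-sig") as f:
        text = f.read()
    shifted = TIMESTAMP.sub(lambda m: shift_match(m, offset_ms), text)
    with open(path_out, "w", encoding="utf-8") as f:
        f.write(shifted)

if __name__ == "__main__":
    # e.g. python shift_srt.py episode.zh.srt episode.zh.fixed.srt 750
    shift_srt(sys.argv[1], sys.argv[2], int(sys.argv[3]))
```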

Feel free to shoot me an email (in my profile) if you need any advice or help setting anything up :)


With anime, people are wayyy pickier about release groups because releases come part and parcel with fanmade translated subtitles, preferences for which track is the default in dual-language releases, whether all the honorifics are written out, etc.

You either have to commit those preferences to a tag that you try to remember to put on everything (fiddly), or you just commit to them being on everything you download regardless of whether or not it is anime, which can cause weird selections when sonarr is sourcing western media.


Jesus christ. Thank you for the Unpackerr recommendation, that was pissing me off, but jesus christ.

I’d pay an extra $10/mo for my seedbox to have just a single interface for all of this without having to manage all these independent apps. Trying to debug why sonarr->prowlarr->flaresolverr don’t work is a nightmare. That reminds me:

- flaresolverr: proxy that handles cloudflare bot checks for torrent trackers that are starting to put it up
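When debugging that chain, it can help to hit FlareSolverr directly and confirm it solves the challenge before blaming Prowlarr or Sonarr. A quick sketch, assuming the default port 8191 and a placeholder tracker URL:

```python
import requests

# Ask FlareSolverr (default port 8191) to fetch a Cloudflare-protected page.
# The tracker URL here is a placeholder; use one of your own indexers.
resp = requests.post(
    "http://localhost:8191/v1",
    json={
        "cmd": "request.get",
        "url": "https://example-tracker.invalid/",
        "maxTimeout": 60000,  # ms to spend solving the challenge
    },
    timeout=90,
)
resp.raise_for_status()
data = resp.json()
print(data["status"])                      # "ok" if the challenge was solved
print(data["solution"]["status"])          # HTTP status of the fetched page
print(data["solution"]["response"][:200])  # start of the returned HTML
```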


Haha yeah, it can definitely become a rabbit hole if you have the interest and time :)

I think the base setup of Radarr + Sonarr + SABnzbd/qBittorrent + Prowlarr is a huge improvement over doing things manually. A lot of stuff on top of that is helpful, but the benefit vs. effort ratio diminishes quite quickly.

I haven't had to set up Flaresolverr just yet, but might do that soon.


I set up radarr + sonarr after a few years of doing it manually. Really glad I did — adding something from my phone and then having it pop into plex on my tv is delightful.


Why do you recommend Jellyseerr or Overseerr instead of Ombi?


Readarr is weird. Turned it off after I started getting messages from it saying it was deleting my ebooks. I don't know why it can't just work like Sonarr.


Throw in bragibooks for parsing the audiobooks from readarr and adding metadata for audiobookshelf, like chapters etc.


Some may prefer Overseerr to Ombi.


Indeed there are probably multiple paths to the full win. I like to use airsonic-advanced to feed music to my legacy sonos systems (can somebody please make an easy-to-use, pretty open-source alternative to the sonos multiroom software stack that can output to HiFiBerrys or ideally a cheaper alternative? even better would be an included best-at-each-price-level alternative speaker to double-sided-tape the HiFiBerry to). I also prefer to use beets and a cron job to convert my whole music collection into decent quality opus files and syncthing to get them onto my phone, rather than using a frontend and mobile data to stream the flac files from my house.
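For the conversion half of that, the cron job really can be a short script that walks the library and shells out to ffmpeg; a sketch with made-up paths and a 128 kbps Opus target (beets' convert plugin can be set up to do the same thing):

```python
import subprocess
from pathlib import Path

# Made-up paths for illustration; point these at your own library and phone-sync folder.
SRC = Path("/music/flac")
DST = Path("/music/opus")

for flac in SRC.rglob("*.flac"):
    out = DST / flac.relative_to(SRC).with_suffix(".opus")
    if out.exists() and out.stat().st_mtime >= flac.stat().st_mtime:
        continue  # already converted and up to date
    out.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(flac),
         "-c:a", "libopus", "-b:a", "128k",  # decent-quality Opus, small enough for a phone
         str(out)],
        check=True,
    )
```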


Prowlarr is a deitysend. Instead of having to configure indexers in Sonarr, it does the heavy lifting for you. Goes in, configures an indexer, and that's your search setup done.


How many indexers do you have for this to become a problem? I never considered the indexer setup to be a pain point, genuinely curious.


I would also prefer to manage my download clients from prowlarr but for some reason the maintainers are conceptually opposed to that.


Toss in RDClient for real-debrid support via the qBittorrent plugin in sonarr/radarr and it's fantastic.


I use Prowlarr as it’s tightly integrated with the *arr stack, but Jackett is another great option if you don’t need that.


What do prowlarr and bazarr do?


Bazarr seems to be a subtitle downloader, and Prowlarr an “Index Manager” (though I’m unsure what that means).


Means you can say "Here's my API key for these private trackers" in one spot instead of 8. "Indexer Manager" is probably a better description.
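Under the hood, each *arr app then just queries Prowlarr instead of the individual trackers, and you can poke it the same way yourself. A sketch assuming a local instance on the default port 9696 and its v1 search endpoint, with a placeholder API key:

```python
import requests

# Sketch only: assumes a local Prowlarr on its default port and its v1 search endpoint.
PROWLARR = "http://localhost:9696"
API_KEY = "YOUR_PROWLARR_API_KEY"  # Settings > General in the Prowlarr UI

resp = requests.get(
    f"{PROWLARR}/api/v1/search",
    params={"query": "some show s01e01", "type": "search"},
    headers={"X-Api-Key": API_KEY},
)
resp.raise_for_status()

# One entry per matching release across every configured indexer.
for release in resp.json()[:10]:
    print(release.get("indexer"), release.get("title"), release.get("size"))
```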


Ooooooh.


Prowlarr is also important if you run multiple *arr apps (Sonarr, Radarr, Readarr). Instead of configuring your indexers (multiple) in multiple places, you point the searches at Prowlarr and keep it updated in one spot.

I personally run a fairly modified version of this HTPC setup on a QNAP (was a Synology but upgraded earlier this year - better base board + dedicated graphics card):

https://github.com/sebgl/htpc-download-box


I use the full stack and am very grateful for the products.

The remaining annoyances are:

- lack of multi language support

- there is no connection between the systems when you want to remove a movie (ideally you'd remove it in one place and everything else would know about that and act accordingly)

- I still haven't managed to fully grasp how and where to say "I do not want this particular release". I think I saw that in radarr but it's never obvious to me where it is.


Yeah, I was initially surprised when learning about the *arr stack for the first time, as my intuition was very insistently telling me: "I must be getting it wrong, these ought to all be a single service!!"

They all definitely feel like small parts of a single package and don't look like they merit being their own thing. But it's not my thing, so what do I know.


The architecture of how they work together is a lot more functional than lots of the software at Fortune 500 companies


I have no idea why Radarr and Sonarr are separate programs. Surely the difference in searching for movies vs TV is a single line of a query search string.


For some reason the developers hate symlinks, which would fix most of the issues.


Honestly I just think the concept of Son/Radarr doesn't translate well to music, I find Lidarr fiddly in general.

In particular I'd add to your list that the overnight scans to update cover art are an absolute mess. It's not so big a deal on my libraries in Sonarr and Radarr, but for Lidarr? Jesus Christ. I have a reasonably sized music library (~400/500 gig), and every night Lidarr starts phoning out to check, for every single album and artist, whether the associated cover art or artist image has changed. This takes hours, is completely unnecessary, and cannot be turned off. I've resorted to just blocking the addresses it does this on, but this breaks things when I try and use it to add new music.


> there is no connection between the systems when you want to remove a movie (you remove it in one place and everything knows about that and acts accordingly)

Not following this. Settings -> Connections in Sonarr for example.


Sorry for not having been clearer.

What I meant is that there is no common management of data. When I make changes in one service, the others do not know about it, or only find out after the fact (via a reindexing, for instance).

Say, for instance, I delete a movie in Jellyfin. Radarr will then pick up that it is missing and grab it again.

Or say I deleted a torrent in Deluge; Radarr will restart it.

Most of these things are doable, one just needs to know what to do where, exactly (otherwise the other pieces may try to recover).

Right now I am making the changes in Radarr (despite actually seeing them in Jellyfin). This is a real problem when I have, say, two downloaded versions of a movie and want to keep only one. I have to be very careful to track down in JF what I will remove in Radarr.

It would have been great if there was a common indexing mechanism across all the suite.

What you pointed me to are notifiers - they work fine (but they're shoot-and-forget kinds of services).
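Roughly what that careful removal looks like when done through Radarr's API, so the one system that owns the files does the delete; the host, key, and ids are placeholders, and the response field names are my assumption of Radarr's schema:

```python
import requests

# Placeholder host/key; sketch of removing one of two copies through Radarr
# so Radarr itself knows the file is gone and nothing tries to re-grab it.
RADARR = "http://localhost:7878"
HEADERS = {"X-Api-Key": "YOUR_RADARR_API_KEY"}

movie_id = 123  # hypothetical Radarr movie id

# List the files Radarr knows about for this movie.
files = requests.get(f"{RADARR}/api/v3/moviefile",
                     params={"movieId": movie_id}, headers=HEADERS).json()
for f in files:
    print(f["id"], f["relativePath"], f["quality"]["quality"]["name"])

# Delete the copy you don't want; Radarr removes it from disk as well,
# and Jellyfin picks the change up on its next library scan.
unwanted_file_id = files[0]["id"]
requests.delete(f"{RADARR}/api/v3/moviefile/{unwanted_file_id}",
                headers=HEADERS).raise_for_status()
```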


I’ve been using it for a year. I don’t name anything in a special way, and it just works.


Yup, running it on an RPi with an external hard drive. No need to name anything.


As RPi 4Bs are still hard to get / overpriced, unless you’re absolutely married to the RPi, check out the OrangePi 5B for a much faster CPU, more RAM, eMMC, and good wifi, OR get the 5+ for the same minus wifi (it has an E-key m.2 to add it) but with an added 2280 m.2 slot, so you could throw a 4TB m.2 SSD on there (which are about $200 now) and now you’ve got a pretty dang good little NAS that’s fast and fanless for about $350-400 all-in. Did I mention the 5+ has 2x 2.5G Ethernet?


For less than that much you could get a tiny N100 computer.


My pi3b with 3 512GB flash drives streams perfectly in 720p and uses less power than the lamp next to the sofa, what a time to be alive


How would you rate the performance? Is it sufficient for 4K HDR streams over your network?


Which model Synology? I am looking into one of the dual/quad core Celeron models because I heard the iGPU is critical for any kind of transcoding.


Not OP, but you pretty much have to run the Celeron ones. I don't think the docker image would work with the Realtek ones (the ones that end in j). I have a DS220+ (J4025, dual core only). It works OK; you pretty much max out one of the cores running a 4K stream. I would recommend separating the storage and server if you can afford it. The price difference between the quad core (4 bay) and dual core (2 bay) is enough to get a 2 bay + an N95 mini pc that can handle 4 streams of 4K.


Another idea is to get a thin client with an i7 or i9 (you probably want at least 10th gen at this point) and either an external enclosure for a few SSDs or, if you find one with a PCIe slot, a PCIe card to fit maybe 4x m.2 SSDs. Some good deals are out there on U.2 SSDs as well if you look, like 8TB for $400 from Intel or WD.

Don’t forget that Asrock Rack and Supermicro sell Atom and Xeon-D boards, as well as some Ryzen AM4/5 models, if you want to DIY. There are great cases out there (e.g. Fractal Node 304/804) these days that support full-size modular PSUs with 80+ Titanium ratings to sip power. That’s been my biggest gripe with x86 over ARM: idle power usage for something I expect to have on 24x7 with PG&E’s 50c/kWh. I just rebuilt my old desktop 5950X into a NAS using a Silverstone RM44 with air cooling, but it’s made to support liquid as well. That’s got plenty of room to fit 4x full-size GPUs and a power supply to match if you dabble with AI on the side. RTX 4060s are coming soon for $300, and that should be more than enough power for transcodes for the whole family.


I actually think buying a first gen M1 Mac Mini with Thunderbolt drive storage is a better idea. I have friends in California who do this for the reasons you mentioned. The utility price has made homelab servers and 3D printing pretty much non-viable unless you want to pay a $300+ electricity bill.


Actually I do have an M1 Mac Mini - what's his setup? Maybe my problem is that macOS doesn't feel as friendly for getting an entire homelab setup.


He runs Jellyfin straight up from the downloads page [1]. I'm not entirely sure if it runs on Rosetta, but he hasn't had any issues with multiple streams. For storage you have a couple of options, but enabling file sharing on macOS + a large drive of your choice is your best bet.

[1] https://jellyfin.org/downloads/macos


It’ll run through docker surely? A nice docker compose media server is a beautiful thing to behold.

I use a Nuc but the Mac mini would be a great server (though costs a lot).


If anyone wants to try this, be wary of transcode acceleration / HW passthrough with Docker, esp. on M1/M2. In general I’d love it if there was a GUI-less stripped-down macOS Server edition instead of running the full-fat consumer OS as an always-on server.


What is there to be wary of? It won't work at all?


I run an Intel NUC with an fstab entry to mount an NFS share. The gruntier NUCs will even take a full-size PCIe card and therefore 10GbE.

Mine has been bullet proof and works really well with a huge amount of storage (Synology).

It’ll transcode 10+ streams of high bitrate video without breaking a sweat.


It worked on my older "DS 216+II". Could not stream to more than 1 person at a time though. That was OK for my usage but not sure about yours.

I recently upgraded to "DS 423+" and it's a lot faster - can have multiple streams going if I want.


Do you ever notice/regret the fact that it only has 1GbE instead of 2.5?


I run an 1821 and stuck a super cheap 10GbE card into it. It’s fantastic.


It was the same thing with Emby.



