
What's new at ElfHosted?

We're small, passionate, and building rapidly. Keep an eye on our blog for updates!

Here are our latest developments...

Emby / Jellyfin transcode fixes

The recent transcode path fixes to Jellyfin / Emby have brought a small bug out of the woodwork: in some cases, the streamers may insist on transcoding your media based on your perceived bandwidth limits, and then fail to transcode because (a) it's 4K content, or (b) the transcoding path is not set to /transcode, and we've blocked use of the network storage for transcoding!

A simple workaround is to edit your Jellyfin / Emby users, and remove their permission to perform video transcodes, something like this:
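Scripted against the Jellyfin API, the same change looks roughly like the sketch below (the /Users/{id}/Policy endpoint and the EnableVideoPlaybackTranscoding flag are our reading of the current Jellyfin API, and Emby's equivalent differs slightly, so treat this as a starting point rather than gospel):

```python
# Rough sketch: disable video transcoding for every Jellyfin user via the API.
# Endpoint names and the EnableVideoPlaybackTranscoding flag are assumptions -
# check your server's API docs before running this against anything you love.
import requests

JELLYFIN_URL = "https://jellyfin.example.elfhosted.com"  # hypothetical URL
API_KEY = "your-admin-api-key"
headers = {"X-Emby-Token": API_KEY}

# Fetch each user's current policy, flip the transcode flag off, and save it back.
users = requests.get(f"{JELLYFIN_URL}/Users", headers=headers).json()
for user in users:
    policy = user["Policy"]
    policy["EnableVideoPlaybackTranscoding"] = False
    requests.post(f"{JELLYFIN_URL}/Users/{user['Id']}/Policy",
                  headers=headers, json=policy)
```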

Read on for more fixes...

Jellyfin 10.9 is out! (and now, fixed)

Jellyfin 10.9 was released last week, with new features including "trickplay" (live video scrubbing), admin UI revamping, and improved ffmpeg transcoding powerz.

Unfortunately, there was a significant bug in 10.9 causing random lockups, and we were unable to roll back because of database upgrades. While the devs worked on a fix, we rolled out some changes to our health checks: rather than a TCP connection to confirm Jellyfin is alive (but, sadly, still locked up), we now make an HTTP request to the API's health endpoint, which fails when Jellyfin is locked up, so that we can at least quickly kill and restart a stuck instance.
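For the curious, the new probe boils down to logic like this (a minimal Python sketch, not our actual probe config; the /health path and port 8096 are assumptions about a typical Jellyfin install):

```python
# Minimal sketch of the health-check logic: a TCP connect can succeed even when
# Jellyfin is deadlocked, so we probe an HTTP endpoint instead and treat a
# timeout or non-200 response as "stuck, restart it".
# The /health path and address are assumptions - substitute your own.
import sys
import requests

JELLYFIN_URL = "http://localhost:8096"  # hypothetical in-cluster address

try:
    resp = requests.get(f"{JELLYFIN_URL}/health", timeout=5)
    healthy = resp.status_code == 200
except requests.RequestException:
    healthy = False

# Exit non-zero so the orchestrator (e.g. a liveness-probe wrapper) kills and
# restarts the stuck pod.
sys.exit(0 if healthy else 1)
```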

The bugfix has rolled out in 10.9.2, so hopefully this is now a non-issue!

It also turns out that Jellyfin was not consistent about where it stored its transcoding data: some instances defaulted to /config/transcodes (/config is backed by our expensive NVMe network storage, not where we want to be sending GBs of temporary transcoding data!), while others were set to /transcode (correct) or /config/cache/transcode (also incorrect).

Tonight's update symlinks all of these combinations to /transcode, the 50GB ephemeral NVMe-backed disk on the local node, avoiding stress on our network storage. In summary, you can ignore the transcode path in Jellyfin. We'll make it work in the backend :)
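For the curious, the backend change amounts to something like the following (a simplified Python sketch of the symlink shuffle; the real logic lives in our container startup hooks, but the paths are the ones named above):

```python
# Sketch: replace any legacy transcode directories on network storage with
# symlinks to the local ephemeral /transcode disk.
import os
import shutil

LOCAL_TRANSCODE = "/transcode"  # 50GB ephemeral NVMe on the local node
LEGACY_PATHS = ["/config/transcodes", "/config/cache/transcode"]

for path in LEGACY_PATHS:
    if os.path.islink(path):
        continue  # already a symlink, leave it alone
    if os.path.isdir(path):
        shutil.rmtree(path)  # drop temporary transcode data from network storage
    os.makedirs(os.path.dirname(path), exist_ok=True)
    os.symlink(LOCAL_TRANSCODE, path)  # point the old path at local disk
```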

KnightCrawler updated to v2, filename results fixed

The first ElfHosted Stremio Addon was a hacky, untested build of iPromKnight's monster rewrite of the backend components of the original torrentio open-source code.

It served us well from Feb 2024, and was my introduction to the wider StremioAddons community, but the rapid pace of the KnightCrawler devs outpaced our build. So while fresh builds were prancing around with fancy parsers and speedy Redis caches, we ended up with a monster MongoDB instance 🐷 (shared by the consumers, and the public/private addon instances), which would occasionally eat more than its allocated 4 vCPUs and get into a bad state!

To migrate our hosted instance to the v2 code, we ran a parallel build, imported the 2M torrents we'd scraped/ingested, and re-ran these through KnightCrawler's v2 parsing/ingestion process. Look at how happily our v2 instance is purring along 🐯 now!

We cut over to the v2 code a few days ago, and since then we've had some users of the Prowlarr indexer pointing out that the results coming back from the KnightCrawler indexer were...

RDTClient Updated, tested

In the past we've had issues with updates to RDTClient, since the version we initially used (based on the laster13 fork of https://github.com/rogerfar/rdt-client) ran as root (later dropping privileges), and was hard to lock down.

Today an intrepid team of elves worked on refactoring and testing the latest official upstream (v2.0.73 currently, but it changes fast!), which I'm happy to report is working well, and is noticeably faster than the old version.

By the time you read this, you'll have been auto-upgraded to the latest version, and subsequent upstream updates will be applied automatically (no more testing required after each upstream release).

Show @rogerfar some ❤

If you'd like to encourage RDTClient's developer, @rogerfar, to continue making bug fixes and feature improvements, weigh in on or just add some reactions to this issue!

Stremio Simultaneous Streams are Stupendous

Now that the April 2024 repricing is behind us, and the dust has settled, we can continue working on the ongoing tension of features and stability!

Today's headline update is Stremio Server - an instance of the official Stremio server/transcoder, which you can leverage to:

  1. Safely stream direct torrents (not necessarily Debrid, see below) without worrying about a VPN
  2. Simultaneously stream Debrid downloads from supported clients (currently just https://web.strem.io or our hosted Stremio Web instance)

For more details, see the step-by-step guide to simultaneous Stremio streaming!

Read on for more developments, features, and bugfixes!

Price rebalancing planned for 1 April 2024

As previously announced, we'll be doing the first of what will become routine price-rebalancing, starting on 1 April 2024. This is the first time we've adjusted prices since we were "born" in June of 2023, so this process may be a little jarring / unpolished at first!

TL;DR - How will this affect me?

There will be no change to the price of existing, active subscriptions. New prices will apply to new subscriptions.

However, where there's a significant difference between existing and new pricing, we may reduce resource allocations on existing subscriptions proportionately (for example, the existing Plex pricing only covers about 30% of actual expenses). I'm seeking community feedback on how to proceed here.

I've laid out our transparent pricing process and costs in detail, and the stated goal is:

At the highest level, our goal is to recover hardware costs equivalent to resources consumed, and to realize 20% profit on costs (see how we're going on this goal by examining our monthly reports!)
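As a worked example of that goal (using a made-up cost figure, not a real ElfHosted number):

```python
# Worked example of the pricing goal above: recover resource cost, plus 20%.
monthly_hardware_cost = 3.00  # hypothetical cost of resources an app consumes, $/month
target_price = monthly_hardware_cost * 1.20  # cost + 20% profit
print(f"Target subscription price: ${target_price:.2f}/month")  # => $3.60/month
```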

Read below for details on app-specific changes..

Setting Sail for Stability

It's been 5 days since we had to turn all tenant workloads down for 10h while we performed "open-heart" surgery on our storage nodes. The root cause of our storage issues, it turned out, was some poor defaults applied to new Ceph metadata pools when the SSD-based /config volumes were first added. At that time, our size / load wasn't significant enough to bring the issue to light.

Since the fix was applied (extending pg_num on the blockpool from 1 to 32), we've not seen another I/O issue, nor any pods stuck waiting to restart.
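For fellow Ceph wranglers, the fix amounts to something like this (a sketch only; the pool name below is a placeholder, not our real Rook-managed pool):

```python
# Sketch of the fix described above: raising pg_num on the affected pool.
import subprocess

POOL = "ssd-metadata-pool"  # hypothetical pool name, substitute your own

# Equivalent to running `ceph osd pool set <pool> pg_num 32` against the cluster.
subprocess.run(["ceph", "osd", "pool", "set", POOL, "pg_num", "32"], check=True)
```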

So.. now that the fire is out, and we're enjoying a period of nice, calm, stability, here's what's coming...

Plex storage migration approaches

We've started to run into I/O contention on the big, busy storage volumes, which use CephFS to make files available to multiple pods (for example, to Plex for storage, but to FileBrowser for browsing config).

In the next maintenance window, we'll be migrating users' Plex libraries (some of which are >50GB!) to Ceph "block" storage, which bypasses the I/O contention, at the cost of being unable to "view" your Plex metadata / DB / logs via FileBrowser (we'll come up with an alternate solution as required).

This migration has been observed to take a long time, especially for larger libraries (roughly 2 min / GB of library), so we're trying to "stagger" it as best we can, by shifting "earlybird" users onto the alpha channel, so that their migration can happen at a time of their choosing, rather than during maintenance.
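For a rough sense of timing, based on that ~2 min / GB observation:

```python
# Back-of-the-envelope estimate from the ~2 min/GB figure above.
def migration_minutes(library_gb: float, minutes_per_gb: float = 2.0) -> float:
    """Estimate how long a Plex library migration to block storage takes."""
    return library_gb * minutes_per_gb

print(migration_minutes(50))  # ~100 minutes for a 50GB library
print(migration_minutes(10))  # ~20 minutes for a smaller one
```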

If you've got a big(ish) Plex library, or you care quite a lot about when it goes down, create an #elf-help ticket to arrange a time (before the next maintenance window) to have your Plex instance migrated.

Auto-delete trash in Plex now disabled

Thanks to a suggestion from @pomnz, we're now explicitly configuring Plex not to empty trash automatically after a scan. (Auto-emptying trash can cause confusion and re-downloads: deleted files disappear, and Radarr/Sonarr then attempt to re-download the content!)
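If you run your own Plex and want the same behaviour, the change looks roughly like this (a hedged sketch; the autoEmptyTrash preference key is our assumption about Plex's prefs API, so verify it against your server's Preferences.xml first):

```python
# Sketch: flip the "Empty trash automatically after every scan" preference off.
# The autoEmptyTrash key is an assumption - confirm it before relying on this.
import requests

PLEX_URL = "http://localhost:32400"  # hypothetical server address
PLEX_TOKEN = "your-plex-token"

requests.put(
    f"{PLEX_URL}/:/prefs",
    params={"autoEmptyTrash": 0},
    headers={"X-Plex-Token": PLEX_TOKEN},
).raise_for_status()
```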

Today's scoreboard

| Metric | Numberz | Delta |
|---|---|---|
| Total subscribers | 354 | +29 |
| 👽 Zurg mounts | 131 | +2 |
| 💾 ElfStorage in TBs | 65 | - |
| Tenant pods | 4368 | +271 |
| 🦸 Elf-vengers | 4 | - |
| 🧑‍🎓 Trainees | 2 | - |
| Bugz squished | - | - |
| New toyz | 2 | - |

Summary

Thanks for geeking out with us, and please share these posts with related geeks!