What's new at ElfHosted?

We're small, passionate, and building rapidly. Keep an eye on our blog for updates!

Here are our latest developments...

KnightCrawler updated to v2, filename results fixed

The first ElfHosted Stremio Addon was a hacky, untested build of iPromKnight's monster rewrite of the backend components of the original torrentio open-source code.

It served us well from Feb 2024, and was my introduction to the wider StremioAddons community, but the rapid pace of the KnightCrawler devs outpaced our build, and so while fresh builds were prancing around with fancy parsers and speedy Redis caches, we ended up with a monster MongoDB instance 🐷 (shared by the consumers, and public/private addon instances), which would occasionally eat more than its allocated 4 vCPUs and get into a bad state!

To migrate our hosted instance to the v2 code, we ran a parallel build, imported the 2M torrents we'd scraped/ingested, and re-ran these through KnightCrawler's v2 parsing/ingestion process. Look at how happily our v2 instance is purring along 🐯 now!

We cut over to the v2 code a few days ago, and since then we've had some users of the Prowlarr indexer pointing out that the results coming back from the KnightCrawler indexer were...

RDTClient Updated, tested

In the past, we've had issues with updates to RDTClient, since the version we initially used (based on the laster13 fork of https://github.com/rogerfar/rdt-client) ran as root (only later dropping privileges), and was hard to lock down.

Today an intrepid team of elves worked on refactoring and testing the latest official upstream (v2.0.73 currently, but it changes fast!), which I'm happy to report is working well, and is noticeably faster than the old version.

By the time you read this, you'll have been auto-upgraded to the latest version, and subsequent upstream updates will be applied automatically (no more testing required after each upstream release).

Show @rogerfar some ❤

If you'd like to encourage RDTClient's developer, @rogerfar, to continue making bug fixes and feature improvements, weigh in on, or just add some reactions to, this issue!

Stremio Simultaneous Streams are Stupendous

Now that the April 2024 repricing is behind us, and the dust has settled, we can continue working on the ongoing tension of features and stability!

Today's headline update is Stremio Server - an instance of the official Stremio server/transcoder, which you can leverage to:

  1. Safely stream direct torrents (not necessarily Debrid, see below) without worrying about a VPN
  2. Simultaneously stream Debrid downloads from supported clients (currently just https://web.strem.io or our hosted Stremio Web instance)

For more details, see the step-by-step guide to simultaneous Stremio streaming!

Read on for more developments, features, and bugfixes!

Price rebalancing planned for 1 April 2024

As previously announced, we'll be doing the first of what will become routine price-rebalancing, starting on 1 April 2024. This is the first time we've adjusted prices since we were "born" in June of 2023, so this process may be a little jarring / unpolished at first!

TL;DR - How will this affect me?

There will be no change to the price of existing, active subscriptions. New prices will apply to new subscriptions.

However, where there's a significant difference between existing and new pricing, we may reduce resource allocations on existing subscriptions proportionately (for example, the existing Plex pricing only covers about 30% of actual expenses). I'm seeking community feedback on how to proceed here.

I've laid out our transparent pricing process and costs in detail, and the stated goal is:

At the highest level, our goal is to recover hardware costs equivalent to resources consumed, and to realize 20% profit on costs (see how we're going on this goal by examining our monthly reports!)

Read below for details on app-specific changes...

Setting Sail for Stability

It's been 5 days since we had to turn all tenant workloads down for 10h while we performed "open-heart" surgery on our storage nodes. The root cause of our storage issues, it turned out, was some poor defaults applied to new ceph metadata pools when the SSD-based /config volumes were first added. At that time, our size / load wasn't significant enough to bring the issue to light.

Since the fix was applied (extending the pg_num on the blockpool from 1 to 32), we've not seen another I/O issue, nor pods stuck waiting on restarting.
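
For the curious, the fix was roughly of this shape (the pool name below is illustrative, not our actual pool name):

```bash
# Raise the placement group count on the affected pool from 1 to 32.
# On recent Ceph releases pgp_num follows pg_num automatically, but
# we include it here for clarity.
ceph osd pool set example-blockpool pg_num 32
ceph osd pool set example-blockpool pgp_num 32
```

With a single placement group, all of the pool's I/O was effectively pinned to one small set of OSDs; spreading it across 32 PGs lets Ceph distribute that load over the whole cluster.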

So... now that the fire is out, and we're enjoying a period of nice, calm stability, here's what's coming...

Plex storage migration approaches

We've started to run into I/O contention on the big, busy storage volumes, which use CephFS to make the same files available to multiple pods (for example, to Plex for its storage, and to FileBrowser for browsing config).

In the next maintenance window, we'll be migrating users' Plex libraries (some of which are >50GB!) to Ceph "block" storage, which bypasses the I/O contention, at the cost of being unable to "view" your Plex metadata / DB / logs via FileBrowser (we'll come up with an alternate solution as required).

This migration has been observed to take a long time, especially for larger libraries (roughly 2 min / GB of library, so a 50GB library takes around 100 minutes), so we're trying to "stagger" it as best as possible, by shifting "earlybird" users onto the alpha channel, so that their migration can happen at a time of their choosing, rather than during maintenance.

If you've got a big(ish) Plex library, or you care quite a lot about when it goes down, create an #elf-help ticket to arrange a time (before the next maintenance window) to have your Plex instance migrated.

Auto-delete trash in Plex now disabled

Thanks to a suggestion from @pomnz, we're now explicitly configuring Plex not to empty trash automatically after a scan. (Auto-emptying can cause confusion and re-downloads: deleted files disappear, and then Radarr/Sonarr attempt to re-download the content!)
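
If you run Plex elsewhere and want the same behaviour, the UI setting is "Empty trash automatically after every scan". As a hedged sketch only (the autoEmptyTrash attribute name is our assumption about Plex's Preferences.xml, so verify against your own file first):

```bash
# Assumption: Plex persists this toggle as autoEmptyTrash in
# Preferences.xml, where 0 = don't empty trash after a scan.
sed -i 's/autoEmptyTrash="1"/autoEmptyTrash="0"/' \
  "/config/Library/Application Support/Plex Media Server/Preferences.xml"
```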

Today's scoreboard

| Metric | Numberz | Delta |
|--------|---------|-------|
| Total subscribers | 354 | +29 |
| 👽 Zurg mounts | 131 | +2 |
| 💾 ElfStorage in TBs | 65 | - |
| Tenant pods | 4368 | +271 |
| 🦸 Elf-vengers | 4 | - |
| 🧑‍🎓 Trainees | 2 | - |
| Bugz squished | - | - |
| New toyz | 2 | - |

Summary

Thanks for geeking out with us, and please share these posts with related geeks!

Elf-Disclosure for Feb 2024 now available

We're seven months old now, starting to outgrow our (1Gbps) baby 👶 clothes, and exploring a new service niche: hosted Stremio addons! Our Feb 2024 report is now published!

Here's an excerpt:

February saw us outgrowing our 1Gbps storage nodes, and becoming more involved in the r/StremioAddons community, moving from hosting a single hacky alternative for the popular-but-overloaded torrentio service, to providing free / subscription hosting for many popular Stremio Addons.

Highlights from February 2024

Upcoming in March 2024

You can read about all the changes, expenses vs income, and geek out over the tech stats, in the Elf-Disclosure for Feb 2024 report!

We recently refactored where we store symlinks, for optimal I/O utilization. I've made a few adjustments based on early feedback, and updated the documentation here.

To summarize the changes since the last update...

  1. /storage/symlinks is now 100GB, because the *arrs (Radarr, Sonarr, etc.) were refusing to import new content to the 1GB volume, since they thought it wouldn't fit. We can't afford to give out 100GB of NVMe storage to all users though, so this space is intended to be utilized only for symlinks (and so remain empty), and a regular process will delete any files in /storage/symlinks which are not symlinks (see the sketch after this list).
  2. ElfBot will import symlinks to /storage/symlinks/import, so as not to conflict with library storage at /storage/symlinks/{movies,series,etc}
  3. ElfBot can also symlink individual folders now, so for example, you could navigate to /storage/realdebrid-zurg/shows/The Simpsons - All Seasons, and run elfbot symlink here, and you'd end up with /storage/symlinks/import/The Simpsons - All Seasons to point Sonarr at.
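
As a rough illustration of what that cleanup process might look like (a sketch only, not ElfBot's actual implementation):

```bash
# Delete regular files under /storage/symlinks, leaving symlinks
# (and directories) untouched. Without -L, -type f never matches
# a symlink itself, so the links are safe from -delete here.
find /storage/symlinks -type f -delete
```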

Need to migrate your symlinks?

Have you got thousands of symlinks sitting in /storage/elfstorage/ on spinning rust, and need them migrated to /storage/symlinks? We'll have an illustrated guide available in a day or two, so sit tight, or create an #elf-help ticket if you'd like a hand!

In addition to reporting on broken symlinks with elfbot symlink report-broken, you can now tell ElfBot to delete any broken symlinks, with elfbot symlink delete-broken.
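
A typical session might look like this (the report path is per the original feature announcement, further down the page):

```bash
elfbot symlink report-broken                # generate the report
less /storage/symlinks/report-broken.txt    # sanity-check what's listed
elfbot symlink delete-broken                # then remove the broken links
```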

Today's scoreboard

| Metric | Numberz | Delta |
|--------|---------|-------|
| Total subscribers | 325 | +9 |
| Storageboxes mounted | 18 | - |
| 👽 Zurg mounts | 131 | +2 |
| 💾 ElfStorage in TBs | 65 | +3 |
| Tenant pods | 4097 | +38 |
| 🦸 Elf-vengers | 4 | - |
| 🧑‍🎓 Trainees | 2 | - |
| Bugz squished | 1 | - |
| New toyz | 1 | - |

Summary

Thanks for geeking out with us, and please share these posts with related geeks!

When we started using symlinks for real-debrid content, the natural place to store them was in /storage/elfstorage, since all users get 100GB free, so it's a consistent location. As we've increased in users, and users have increased their library sizes, I realized that this may be an inefficient way of "storing" the symlinks.

ElfStorage is backed by HDDs, and uses Ceph's erasure coding at an 8:2 profile, meaning every object gets split into 8 data chunks plus 2 coding chunks, and distributed across our 10 ElfStorage "dwarf" nodes. This is great for 80GB movies (10GB per node!), but probably not great for tiny little symlinks, since it represents a lot of extra work for the ceph cluster!
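
In Ceph terms, an 8:2 profile is created along these lines (the profile and pool names here are illustrative, not our actual names):

```bash
# k=8 data chunks + m=2 coding chunks per object, so any 2 of the
# 10 chunks (i.e. any 2 nodes) can be lost without losing data.
ceph osd erasure-code-profile set elfstorage-ec k=8 m=2
ceph osd pool create elfstorage-data 128 128 erasure elfstorage-ec
```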

To this end, I've added a /storage/symlinks folder for each user. Unlike /storage/elfstorage, this is backed by NVMes on our 10Gbit nodes, in a simple 2-replica configuration, so the actual load on the cluster resulting from storage to /storage/symlinks should be negligible.
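
By contrast, a simple 2-replica pool like the one backing /storage/symlinks is created along these lines (again, names illustrative):

```bash
# A replicated pool; size 2 keeps two full copies of every object,
# which is far cheaper per-operation than 8:2 erasure coding when
# the objects are tiny symlinks.
ceph osd pool create symlinks-pool 32 32 replicated
ceph osd pool set symlinks-pool size 2
```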

Your old symlinks in /storage/elfstorage will continue to work, but ElfBot will now create all new symlinks in /storage/symlinks. Apps' default configs will be updated for fresh installs, but you'll want to update your Radarr / Sonarr root folders if you use them as part of your symlink setup.

Many users with large Real-Debrid collections have noticed that their content occasionally "goes away". The symlinks remain, but the original sources in /storage/realdebrid-zurg/__all__ (for example) are removed. We're not sure exactly why this happens to some content and not others, but to make it easier to manage, ElfBot can now generate a report of all broken symlinks.

To get your report, run elfbot symlink report-broken, and find the resulting report in /storage/symlinks/report-broken.txt.
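
Under the hood, a "broken" symlink is just one whose target no longer resolves, so you can also spot-check manually with GNU find (adjust the path if your symlinks still live in /storage/elfstorage):

```bash
# -xtype l matches symlinks whose targets no longer exist.
find /storage/symlinks -xtype l
```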

Deleting broken symlinks

It's not implemented yet (some apps legitimately use symlinks which appear to be broken upon scanning), but we could also allow ElfBot to automatically delete broken symlinks. Watch this space!

Today's scoreboard

| Metric | Numberz | Delta |
|--------|---------|-------|
| Total subscribers | 316 | +43 |
| Storageboxes mounted | 18 | +1 |
| 👽 Zurg mounts | 129 | +21 |
| 💾 ElfStorage in TBs | 62 | +5 |
| Tenant pods | 4059 | +402 |
| 🦸 Elf-vengers | 4 | - |
| 🧑‍🎓 Trainees | 2 | - |
| Bugz squished | 1 | - |
| New toyz | 1 | - |

Summary

Thanks for geeking out with us, and please share these posts with related geeks!