
What's new at ElfHosted?

Price rebalancing planned for 1 April 2024

As previously announced, we'll be doing the first of what will become routine price rebalancing, starting on 1 April 2024. This is the first time we've adjusted prices since we were "born" in June 2023, so this process may be a little jarring / unpolished at first!

TL;DR - How will this affect me?

There will be no change to the price of existing, active subscriptions. New prices will apply to new subscriptions.

However, where there's a significant difference between existing and new pricing, we may reduce resource allocations on existing subscriptions proportionately (for example, the existing Plex pricing only covers about 30% of actual expenses). I'm seeking community feedback on how to proceed here.

I've laid out our transparent pricing process and costs in detail, and the stated goal is:

At the highest level, our goal is to recover hardware costs equivalent to resources consumed, and to realize 20% profit on costs (see how we're going on this goal by examining our monthly reports!)

Read below for details on app-specific changes..

Setting Sail for Stability

It's been 5 days since we had to turn all tenant workloads down for 10h while we performed "open-heart" surgery on our storage nodes. The root cause of our storage issues, it turned out, was some poor defaults applied to new ceph metadata pools when the SSD-based /config volumes were first added. At that time, our size / load wasn't significant enough to bring the issue to light.

Since the fix was applied (extending pg_num on the blockpool from 1 to 32), we've not seen another I/O issue, nor pods stuck waiting to restart.
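
For the curious, the fix itself is a one-liner against the affected pool. A minimal sketch, using a hypothetical pool name (not necessarily what ours is called), and assuming the placement-group autoscaler isn't overriding manual changes:

```bash
# Check the current placement-group count on the (hypothetical) blockpool
ceph osd pool get config-blockpool pg_num

# Raise pg_num (and pgp_num) so data is spread across more placement groups, and thus more OSDs
ceph osd pool set config-blockpool pg_num 32
ceph osd pool set config-blockpool pgp_num 32

# Watch the cluster rebalance
ceph -s
```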

So.. now that the fire is out, and we're enjoying a period of nice, calm stability, here's what's coming...

Plex storage migration approaches

We've started to run into I/O contention on the big, busy storage volumes, which use CephFS to make files available to multiple pods (for example, to Plex for storage, but to FileBrowser for browsing config).

In the next maintenance window, we'll be migrating users' Plex libraries (some of which are >50GB!) to Ceph "block" storage, which bypasses the I/O contention, at the cost of being unable to "view" your Plex metadata / DB / logs via FileBrowser, since a block volume can only be mounted by one pod at a time (we'll come up with an alternate solution as required).
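
To give a rough sense of what each migration involves, here's a minimal sketch. The mount paths and workload name are hypothetical, and the real migrations are run by our tooling during maintenance rather than by hand, but the gist is: stop Plex, copy the library from the CephFS-backed volume to the new block-backed volume, then start it again:

```bash
# Hypothetical mount paths: old CephFS-backed config vs new block-backed volume
OLD=/old-config/plex
NEW=/new-config/plex

# Stop Plex so the database isn't written to mid-copy
kubectl scale deployment/plex --replicas=0

# Copy the library, preserving permissions and symlinks (-a), with progress output
rsync -a --info=progress2 "$OLD/" "$NEW/"

# Bring Plex back up against the new volume
kubectl scale deployment/plex --replicas=1
```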

This migration has been observed to take a long time, especially for larger libraries (roughly 2 min / GB of Library), so we're trying to "stagger" it as best as possible, by shifting "earlybird" users onto the alpha channel, so that their migration can happen in a time of their choosing, rather than during maintenance.

If you've got a big(ish) Plex library, or you care quite a lot about when it goes down, create an #elf-help ticket to arrange a time (before the next maintenance window) to have your Plex instance migrated.

Auto-delete trash in Plex now disabled

Thanks to a suggestion from @pomnz, we're now explicitly configuring Plex not to empty trash automatically after a scan. (This can cause confusion and re-downloads, as deleted files disappear from the library, and Radarr/Sonarr then attempt to re-download the content!)
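
If you'd like to check (or apply) this yourself on another Plex instance, the setting lives in Plex's Preferences.xml, or in the web UI under Settings > Library. A minimal sketch, assuming a LinuxServer-style /config layout and that the relevant attribute is autoEmptyTrash (both assumptions; stop Plex before editing the file by hand):

```bash
# Assumed location of the Plex preferences file inside the container
PREFS="/config/Library/Application Support/Plex Media Server/Preferences.xml"

# Check whether auto-emptying of trash is explicitly configured ("1" = enabled, "0" = disabled)
grep -o 'autoEmptyTrash="[01]"' "$PREFS" || echo "autoEmptyTrash not explicitly set"

# Disable "Empty trash automatically after every scan" (only flips an existing attribute)
sed -i 's/autoEmptyTrash="1"/autoEmptyTrash="0"/' "$PREFS"
```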

Today's scoreboard

| Metric | Numberz | Delta |
|---|---|---|
| Total subscribers | 354 | +29 |
| 👽 Zurg mounts | 131 | +2 |
| 💾 ElfStorage in TBs | 65 | - |
| Tenant pods | 4368 | +271 |
| 🦸 Elf-vengers | 4 | - |
| 🧑‍🎓 Trainees | 2 | - |
| Bugz squished | - | - |
| New toyz | 2 | - |

Summary

Thanks for geeking out with us, and please share these posts with related geeks!

Elf-Disclosure for Feb 2024 now available

We're 7 months old now, starting to outgrow our (1Gbps) baby 👶 clothes, and exploring a new service niche: hosted Stremio addons! Our Feb 2024 report is now published!

Here's an excerpt:

February saw us outgrowing our 1Gbps storage nodes, and becoming more involved in the r/StremioAddons community, moving from hosting a single hacky alternative for the popular-but-overloaded torrentio service, to providing free / subscription hosting for many popular Stremio Addons.

Highlights from February 2024

Upcoming in March 2024

You can read about all the changes, expenses vs income, and geek out over the tech stats, in the Elf-Disclosure for Feb 2024 report!

We recently refactored where we store symlinks, for optimal I/O utilization. I've made a few adjustments based on early feedback, and updated the documentation here:

To summarize the changes since the last update..

  1. /storage/symlinks is now 100GB, because the *arrs were refusing to import new content to the 1GB volume, since they thought it wouldn't fit. We can't afford to give out 100GB of NVMe storage to all users though, so this space is intended to be utilized only for symlinks (and so remain basically empty), and a regular process will delete any files in /storage/symlinks which are not symlinks (see the sketch after this list).
  2. ElfBot will import symlinks to /storage/symlinks/import, so as not to conflict with library storage at /storage/symlinks/{movies,series,etc}
  3. ElfBot can also symlink individual folders now. For example, you could navigate to /storage/realdebrid-zurg/shows/The Simpsons - All Seasons, run elfbot symlink here, and you'd end up with /storage/symlinks/import/The Simpsons - All Seasons to point Sonarr at.
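
The cleanup process mentioned in (1) can be pictured as something like the following. A minimal sketch, not the exact job we run; note that find's -type f only ever matches regular files, never symlinks, so the symlinks themselves survive the sweep:

```bash
# Delete regular files which have crept into the symlink-only volume
# (symlinks are untouched, since -type f never matches them)
find /storage/symlinks -type f -print -delete

# Optionally tidy up any empty directories left behind
find /storage/symlinks -mindepth 1 -type d -empty -delete
```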

Need to migrate your symlinks?

Have you got thousands of symlinks sitting in /storage/elfstorage/ on spinning rust, and need them migrated to /storage/symlinks? We'll have an illustrated guide available in a day or two, so sit tight, or create an #elf-help ticket if you'd like a hand!

In addition to reporting on broken symlinks with elfbot symlink report-broken, you can now tell ElfBot to delete any broken symlinks, with elfbot symlink delete-broken.
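
Under the hood, spotting a broken symlink is cheap: GNU find's -xtype l matches only symlinks whose target no longer resolves. A minimal sketch of what the delete pass could boil down to (the actual ElfBot implementation may differ):

```bash
# List symlinks whose targets no longer exist
find /storage/symlinks -xtype l

# ...and delete them (destructive, which is why ElfBot makes this a separate, explicit command)
find /storage/symlinks -xtype l -delete
```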

Today's scoreboard

| Metric | Numberz | Delta |
|---|---|---|
| Total subscribers | 325 | +9 |
| Storageboxes mounted | 18 | - |
| 👽 Zurg mounts | 131 | +2 |
| 💾 ElfStorage in TBs | 65 | +3 |
| Tenant pods | 4097 | +38 |
| 🦸 Elf-vengers | 4 | - |
| 🧑‍🎓 Trainees | 2 | - |
| Bugz squished | 1 | - |
| New toyz | 1 | - |

Summary

Thanks for geeking out with us, and please share these posts with related geeks!

When we started using symlinks for Real-Debrid content, the natural place to store them was in /storage/elfstorage, since all users get 100GB free, so it's a consistent location. As we've grown in users, and users have grown their library sizes, I've realized that this may be an inefficient way of "storing" the symlinks.

ElfStorage is backed by HDDs, using Ceph erasure coding with an 8:2 profile, meaning every object gets split into 8 data chunks plus 2 coding chunks, distributed across our 10 ElfStorage "dwarf" nodes. This is great for 80GB movies (10GB per node!), but probably not great for tiny little symlinks, since it represents a lot of extra work for the Ceph cluster!

To this end, I've added a /storage/symlinks folder for each user. Unlike /storage/elfstorage, this is backed by NVMes on our 10Gbit nodes, in a simple 2-replica configuration, so the actual load on the cluster resulting from storage to /storage/symlinks should be negligible.
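
For the Ceph-curious, the difference between the two backends looks roughly like this. A minimal sketch with hypothetical pool names and parameters; our production pools weren't created by hand like this:

```bash
# ElfStorage: HDD-backed, erasure-coded into 8 data + 2 coding chunks
ceph osd erasure-code-profile set elfstorage-8-2 k=8 m=2 crush-failure-domain=host
ceph osd pool create elfstorage-data erasure elfstorage-8-2

# Symlinks: NVMe-backed, simple 2-replica pool - far less work per tiny object
ceph osd pool create symlinks-data replicated
ceph osd pool set symlinks-data size 2
```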

Your old symlinks in /storage/elfstorage will continue to work, but ElfBot will now create all new symlinks in /storage/symlinks. Apps' default configs will be updated for fresh installs, but you'll want to update your Radarr / Sonarr root folders if you use them as part of your symlink setup.

Many users with large Real-Debrid collections have been noticing that their content occasionally "goes away". The symlinks remain, but the original sources in /storage/realdebrid-zurg/__all__ (for example) are removed. We're not sure exactly why this happens to some content and not others, but to make it easier to manage, ElfBot can now generate you a report of all broken symlinks.

To get your report, run elfbot symlink report-broken, and find the resulting report in /storage/symlinks/report-broken.txt.
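
A minimal sketch of what generating such a report could boil down to, leaning on find's -xtype l to match symlinks whose targets no longer resolve (the paths ElfBot actually scans may differ):

```bash
# Collect broken symlinks into the report file
find /storage/symlinks /storage/elfstorage -xtype l > /storage/symlinks/report-broken.txt

# Review the results
less /storage/symlinks/report-broken.txt
```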

Deleting broken symlinks

It's not implemented yet (some apps legitimately use symlinks which appear to be broken upon scanning), but we could also allow ElfBot to automatically delete broken symlinks. Watch this space!

Today's scoreboard

| Metric | Numberz | Delta |
|---|---|---|
| Total subscribers | 316 | +43 |
| Storageboxes mounted | 18 | +1 |
| 👽 Zurg mounts | 129 | +21 |
| 💾 ElfStorage in TBs | 62 | +5 |
| Tenant pods | 4059 | +402 |
| 🦸 Elf-vengers | 4 | - |
| 🧑‍🎓 Trainees | 2 | - |
| Bugz squished | 1 | - |
| New toyz | 1 | - |

Summary

Thanks for geeking out with us, and please share these posts with related geeks!

Full service outage on 29 February 2024 between 6:00 AM and 8:00 AM (CET)


All ElfHosted services will be affected. Your pods will be shut down at 6 AM, and restored at 8 AM.

We've received notice from Hetzner that they'll be undertaking maintenance on the switch which connects our 10Gbps "goblin" nodes, which provide all of the shared storage for our /config volumes. The maintenance is scheduled for 29 February 2024, between 6:30 AM and 7:30 AM (CET).

Given the criticality of these nodes, and the amount of #chaos which would ensue if they all went away simultaneously, we'll be taking all customer workloads down 30 minutes prior to the scheduled maintenance, and bringing them back again 30 minutes afterwards.

This means that for ElfHosted customers, the actual service outage will be a total of 2 hours, from 6:00 AM to 8:00 AM (CET).


In summary..

Full service outage on 29 February 2024 between 6:00 AM and 8:00 AM (CET)

All ElfHosted services will be affected. Your pods will be shut down at 6 AM, and restored at 8 AM.

Xtremio Excels Excellently

After our recent hosting of public TorrentIO / Knightcrawler and Annatar instances, I was approached to take over hosting of another community addon, the "Xtremio IPTV" addon. This one doesn't use Real-Debrid at all, but if you plug in your paid IPTV subscription details (assuming your provider supports Xtream Codes), you can stream all your IPTV content inside Stremio!
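
If you're not sure whether your IPTV provider supports Xtream Codes, a quick way to check is to hit the player_api endpoint that Xtream-compatible panels expose. A minimal sketch; the host, port, and credentials below are placeholders for your own subscription details:

```bash
# Replace host, port and credentials with your own IPTV subscription details
curl -s "http://your-provider.example:8080/player_api.php?username=USER&password=PASS"

# A JSON response containing user_info / server_info means the provider speaks Xtream Codes
```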

It was a fun little addon to integrate, since there's no persistence or database required - you can see the final result at https://xtremio.elfhosted.com, and the public traefik stats here.

We now offer hosted instances of Xtremio, which provide the same features, but with a higher rate-limit.

Annatar Arrives "Afficially"

A casual offer of hosting to a new addon announced in r/StremioAddons has turned into a rewarding partnership with Annatar, a Stremio addon which leverages Jackett and Redis to provide a cached, just-in-time torrent search for Stremio.

The official, community site is at https://annatar.elfhosted.com, and we (@brett and I) have been geeking out over the public "GrafAnnatar" metrics.

Once the public instance was up, I added ElfHosted bundles to the store - these private instances benefit from the shared redis cache, which is well-seeded by the public instance, but allow users to configure their own selection of torrent indexers in their own Jackett backend.

AllDebrid 100% supported (if you BYO VPN)

We've had AllDebrid support for our debrid-related apps in testing for a few days, and our elf-a-testers have confirmed it's good to go (thanks elf-a-testers!).

The trick with AllDebrid, it turns out, is that they don't allow use of their API or WebDAV endpoints from "datacenter IPs", which includes our Hetzner IP ranges. (Confusingly, they allow listing of WebDAV files from anywhere, but not actual downloading!).

However, if you configure your AllDebrid account to specify that you're using a VPN, then you can access their endpoints via that VPN provider. So.. the workaround for using AllDebrid with ElfHosted apps is to... BYO VPN credentials (like we do with our torrent clients).
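
You can see the datacenter-IP restriction for yourself by querying the AllDebrid API directly. A minimal sketch, assuming you've generated an AllDebrid API key; from a datacenter IP like ours the request is refused, while from your VPN provider's exit IP it returns your account details:

```bash
# Replace with your own AllDebrid API key (and any agent name you like)
APIKEY="your-alldebrid-api-key"
curl -s "https://api.alldebrid.com/v4/user?agent=elfhosted-test&apikey=${APIKEY}"
```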

Currently this only works with PrivateInternetAccess (PIA), but we can easily add more VPN flavors in future (let me know if you need one ASAP).

Here are the AllDebrid / VPN-enhanced versions of our regular debrid apps...