Compare commits


45 commits

Author SHA1 Message Date
Felix Jancso-Szabo
67d8bd5598
Fix spelling of 'Unaffected' in help texts (#838)
Noticed this while setting up pinchflat, figured I'd submit a fix.
2025-12-16 09:30:00 -08:00
Daniel Da Cunha
6cb715c1d0
Move Active Tasks to tab in Media History section (#836)
Consolidate the home page UI by moving Active Tasks from a separate
section into a third tab alongside Downloaded and Pending tabs.

Co-authored-by: Daniel <ddacunha@MacBook-Pro-14.local>
2025-12-16 09:29:32 -08:00
Edward Horsey
d38c26f6fd
Enable overflow scroll for tables inside tabs (#822) 2025-12-16 09:28:51 -08:00
Googleplex
163e8eb8cc
Update selfhosted.Dockerfile (#802) 2025-09-27 21:31:17 -07:00
Kieran Eglin
0688e880f5
Added yt-dlp for arm64 2025-09-27 10:20:33 -07:00
Kieran Eglin
4f8cba3f9c
Bumped version 2025-09-26 16:04:56 -07:00
Kieran
2a371c4419
[Dev] Add Deno to Dockerfiles (#801)
* Added Deno to Dockerfiles

* Updated yt-dlp source

* Added unzip

* Update deno directory

* The ACTUAL deno install path this time. Christ, am I new here?

* Linting
2025-09-26 16:04:22 -07:00
Kieran Eglin
076f2fe78b
Version bump 2025-06-06 14:39:32 -07:00
Kieran
68da8bc522
[Housekeeping] Dependency updates 6-Jun-2025 (#733)
* Bumped Elixir

* Silenced mix check warnings

* Updated all deps with minor version upgrades

* Updated more deps; Refactored text components to work with phoenix_html updates
2025-06-06 13:44:14 -07:00
Kieran
1cee7a19ee
Made source sorting case-insensitive (#708) 2025-04-28 11:43:51 -07:00
Kieran
a55f17ac5f
Update the link (#697) 2025-04-10 09:39:37 -07:00
Brandon Philips
f637bbd195
[Docs] Add podman to README (#686)
* README: add podman

Docker always has a tendency to get in my way on Debian. Also, I really
like the userns setup for podman for giving permissions between host and
container.

* Ran linting on README

---------

Co-authored-by: Kieran Eglin <kieran.eglin@gmail.com>
2025-04-10 09:33:34 -07:00
Kieran
7f56c0c802
Better copy (#696) 2025-04-10 09:20:56 -07:00
Kieran Eglin
6d97c8c1c4
Bumped version 2025-03-17 15:02:16 -07:00
Kieran
030f5fbdfe
[Enhancement] Add setting to restrict filenames to ASCII characters (#660)
* Added a new column for restricting filenames

* Adds restrict-filenames to command runner

* Added UI to settings form
2025-03-17 14:58:25 -07:00
Kieran
ee2db3e9b7
Stopped logging healthcheck requests (#659) 2025-03-17 14:48:07 -07:00
Kieran
4554648ba7
[Enhancement] Add download rate limiting to app settings (#646)
* Added rate limit column to settings

* Added limit_rate option to command runner

* Added rate limit to settings form
2025-03-11 15:45:56 -07:00
Kieran Eglin
0fbf810cb6
bumped version 2025-03-06 14:41:36 -08:00
Kieran
a97bb248e2
[Enhancement] Retry a download using cookies if doing so might help (#641)
* Sources that use cookies when_needed now retry downloads if we think it'd help

* tweaked error message we're checking on to match media_download_worker
2025-03-05 16:41:07 -08:00
Kieran
ac895944a8
[Enhancement] Add option for a source to only use cookies when needed (#640)
* Updated model with new attribute

* Update app logic to use new cookie logic

* lots of tests

* Updated UI and renamed attribute

* Updated tests
2025-03-05 15:32:15 -08:00
Kieran Eglin
59f8aa69cd
updating yt-dlp permissions, again 2025-03-04 11:08:02 -08:00
Kieran
b790e05133
Testing yt-dlp binary permissions (#634) 2025-03-04 10:53:40 -08:00
Kieran Eglin
9953e4d316
bumped version 2025-02-20 15:49:44 -08:00
Kieran
b62eb2bc6b
[Bugfix] Improve YouTube shorts detection for new YouTube pants (#618)
* Update youtube shorts detection to support youtube pants

* Updates a test
2025-02-20 15:49:09 -08:00
Kieran Eglin
464a595045
readme wording 2025-02-14 15:10:06 -08:00
Kieran Eglin
05f33acd78
Added note to README 2025-02-14 15:06:37 -08:00
Kieran
e7adc9d68f
[Enhancement] Record and display errors related to downloading (#610)
* Added last_error to media item table

* Error messages are now persisted to the last_error field

* Minor layout updates

* Added help tooltip to source content view

* Added error information to homepage tables

* Remove unneeded index

* Added docs to tooltip component
2025-02-12 10:17:24 -08:00
Kieran
fe5c00dbef
[Enhancement] Download failures due to videos being members-only are not immediately retried (#609) 2025-02-10 12:13:37 -08:00
rebel onion
28f0d8ca6e
[Enhancement] Support Multiple YouTube API Keys (#606)
* feat: multiple YouTube API keys

* fix: requested changes
2025-02-10 11:30:28 -08:00
Kieran Eglin
b62d5c201b
Bumped version 2025-01-27 15:48:20 -08:00
Kieran
6ead29182d
[Enhancement] Auto-update yt-dlp (#589)
* Added a command for updating yt-dlp

* Added a yt-dlp update worker to run daily

* Added a new file that runs post-boot when the app is ready to serve requests; put yt-dlp updater in there

* Updated config to expose the current env globally; updated startup tasks to not run in test env

* Removes unneeded test code
2025-01-27 11:33:38 -08:00
Kieran
62214b80a6
[Enhancement] Run fast indexing on source creation and at higher priority (#583)
* Updated default job priorities for downloading queue

* Added the ability to set priority to various downloading helpers

* Sets sources to fast index on creation
2025-01-22 14:54:15 -08:00
Kieran
704d29dc7e
[Enhancement] Add support for UMASK environment variable (#582)
* Add umask setting to docker start

* Testing adding umask env var

* Added umask to README
2025-01-21 14:22:04 -08:00
Kieran
3dd20141e0
Ensured first indexing pass runs if a source has never been indexed before (#581) 2025-01-21 11:55:27 -08:00
Kieran Eglin
993c57f853
Bumped version 2025-01-16 16:39:10 -08:00
Kieran
63bb4d2327
Added pending check before downloading media (#571) 2025-01-15 11:35:59 -08:00
Kieran
80406c9e0e
Change a GT to a GTE (#570) 2025-01-15 10:54:45 -08:00
Kieran Eglin
61ae50735f
Bumped version 2025-01-14 13:13:32 -08:00
Kieran
d8d7353228
[Enhancement] Add Discord link (#565)
* Add a discord link in sidebar

* Added discord link to README
2025-01-14 13:12:43 -08:00
Kieran Eglin
03a0afd657
Add blurb about websockets 2025-01-14 12:54:43 -08:00
Kieran
ca90da49f5
Add simple icons (#564) 2025-01-14 12:53:35 -08:00
Kieran Eglin
40cde43be1
Added grafana dashboards 2025-01-14 12:04:33 -08:00
Kieran
e9f6b45953
[Enhancement] Add rate limiting to yt-dlp requests; prevent saving Media Items when throttled by YouTube (#559)
* Added sleep interval to settings

* Added new sleep setting to yt-dlp runner and added tests

* Added setting for form; updated setting name

* Updated form label

* Prevented saving/updating of media items if being throttled by youtube

* Added the bot message to the list of non-retryable errors

* Fixed typo
2025-01-14 11:38:40 -08:00
Kieran
fb27988963
[Enhancement] Add Prometheus support (#556)
* Added prometheus to deps list

* WIP - screwing around with Prometheus and grafana

* Added basic prometheus config

* Updated docs in prom_ex module

* Updated README
2025-01-09 12:38:17 -08:00
Kieran
8a40d296c4
Updated healthcheck to run every 30s (#555) 2025-01-09 11:17:55 -08:00
91 changed files with 12217 additions and 403 deletions

View file

@ -1,3 +1,6 @@
> [!IMPORTANT]
> (2025-02-14) [zakkarry](https://github.com/sponsors/zakkarry), who is a collaborator on [cross-seed](https://github.com/cross-seed/cross-seed) and an extremely helpful community member in general, is facing hard times due to medical debt and family illness. If you're able, please consider [sponsoring him on GitHub](https://github.com/sponsors/zakkarry) or donating via [buymeacoffee](https://tip.ary.dev). Tell him I sent you!
<p align="center">
<img
src="priv/static/images/originals/logo-white-wordmark-with-background.png"
@ -15,7 +18,8 @@
<div align="center">
[![](https://img.shields.io/github/license/kieraneglin/pinchflat?style=for-the-badge&color=ee512b)](LICENSE)
[![](https://img.shields.io/github/v/release/kieraneglin/pinchflat?style=for-the-badge)](https://github.com/kieraneglin/pinchflat/releases)
[![](https://img.shields.io/github/v/release/kieraneglin/pinchflat?style=for-the-badge&color=purple)](https://github.com/kieraneglin/pinchflat/releases)
[![](https://img.shields.io/static/v1?style=for-the-badge&logo=discord&message=Chat&color=5865F2&label=Discord)](https://discord.gg/j7T6dCuwU4)
[![](https://img.shields.io/github/actions/workflow/status/kieraneglin/pinchflat/lint_and_test.yml?style=for-the-badge)](#)
[![](https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode&style=for-the-badge)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/kieraneglin/pinchflat)
@ -33,6 +37,7 @@
- [Portainer](#portainer)
- [Docker](#docker)
- [Environment Variables](#environment-variables)
- [A note on reverse proxies](#reverse-proxies)
- [Username and Password (authentication)](https://github.com/kieraneglin/pinchflat/wiki/Username-and-Password)
- [Frequently asked questions](https://github.com/kieraneglin/pinchflat/wiki/Frequently-Asked-Questions)
- [Documentation](https://github.com/kieraneglin/pinchflat/wiki)
@ -124,6 +129,23 @@ docker run \
ghcr.io/kieraneglin/pinchflat:latest
```
### Podman
The Podman setup is similar to the Docker one but changes a few flags so the container runs under a user namespace instead of as root. To run Pinchflat under Podman using the current user's UID/GID for file access, run:
```
podman run \
--security-opt label=disable \
--userns=keep-id --user=$UID \
-e TZ=America/Los_Angeles \
-p 8945:8945 \
-v /host/path/to/config:/config:rw \
-v /host/path/to/downloads/:/downloads:rw \
ghcr.io/kieraneglin/pinchflat:latest
```
With this setup, consider creating a dedicated `pinchflat` user and giving that user ownership of the config and download directories (a sketch follows below). See the [Podman --userns](https://docs.podman.io/en/v4.6.1/markdown/options/userns.container.html) docs.
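For illustration only — the paths mirror the `podman run` example above and are placeholders for your own directories — that setup might look like:
```
# Illustrative sketch: substitute your own config/download paths
sudo useradd --system --create-home --shell /usr/sbin/nologin pinchflat
sudo chown -R pinchflat:pinchflat /host/path/to/config /host/path/to/downloads
```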
### IMPORTANT: File permissions
You _must_ ensure the host directories you've mounted are writable by the user running the Docker container. If you get a permission error, follow the steps it suggests, and see [#106](https://github.com/kieraneglin/pinchflat/issues/106) for more. A rough sketch of the usual fix is shown below.
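As a hedged example only — the `1000:1000` UID/GID and the paths are placeholders; use whatever user actually runs your container:
```
# Placeholder values: replace 1000:1000 with the UID:GID that runs the container
sudo chown -R 1000:1000 /host/path/to/config /host/path/to/downloads
sudo chmod -R u+rwX /host/path/to/config /host/path/to/downloads
```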
@ -142,18 +164,24 @@ If you change this setting and it works well for you, please leave a comment on
### Environment variables
| Name | Required? | Default | Notes |
| --------------------------- | --------- | ------------------------- | ------------------------------------------------------------------------------------------------------------------------------- |
| `TZ` | No | `UTC` | Must follow IANA TZ format |
| `LOG_LEVEL` | No | `debug` | Can be set to `info` but `debug` is strongly recommended |
| `BASIC_AUTH_USERNAME` | No | | See [authentication docs](https://github.com/kieraneglin/pinchflat/wiki/Username-and-Password) |
| `BASIC_AUTH_PASSWORD` | No | | See [authentication docs](https://github.com/kieraneglin/pinchflat/wiki/Username-and-Password) |
| `EXPOSE_FEED_ENDPOINTS` | No | `false` | See [RSS feed docs](https://github.com/kieraneglin/pinchflat/wiki/Podcast-RSS-Feeds) |
| `ENABLE_IPV6` | No | `false` | Setting to _any_ non-blank value will enable IPv6 |
| `JOURNAL_MODE` | No | `wal` | Set to `delete` if your config directory is stored on a network share (not recommended) |
| `TZ_DATA_DIR` | No | `/etc/elixir_tzdata_data` | The container path where the timezone database is stored |
| `BASE_ROUTE_PATH` | No | `/` | The base path for route generation. Useful when running behind certain reverse proxies, but prefix must be stripped. |
| `YT_DLP_WORKER_CONCURRENCY` | No | `2` | The number of concurrent workers that use `yt-dlp` _per queue_. Set to 1 if you're getting IP limited, otherwise don't touch it |
| Name | Required? | Default | Notes |
| --------------------------- | --------- | ------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------- |
| `TZ` | No | `UTC` | Must follow IANA TZ format |
| `LOG_LEVEL` | No | `debug` | Can be set to `info` but `debug` is strongly recommended |
| `UMASK` | No | `022` | Unraid users may want to set this to `000` |
| `BASIC_AUTH_USERNAME` | No | | See [authentication docs](https://github.com/kieraneglin/pinchflat/wiki/Username-and-Password) |
| `BASIC_AUTH_PASSWORD` | No | | See [authentication docs](https://github.com/kieraneglin/pinchflat/wiki/Username-and-Password) |
| `EXPOSE_FEED_ENDPOINTS` | No | `false` | See [RSS feed docs](https://github.com/kieraneglin/pinchflat/wiki/Podcast-RSS-Feeds) |
| `ENABLE_IPV6` | No | `false` | Setting to _any_ non-blank value will enable IPv6 |
| `JOURNAL_MODE` | No | `wal` | Set to `delete` if your config directory is stored on a network share (not recommended) |
| `TZ_DATA_DIR` | No | `/etc/elixir_tzdata_data` | The container path where the timezone database is stored |
| `BASE_ROUTE_PATH` | No | `/` | The base path for route generation. Useful when running behind certain reverse proxies - prefixes must be stripped. |
| `YT_DLP_WORKER_CONCURRENCY` | No | `2` | The number of concurrent workers that use `yt-dlp` _per queue_. Set to 1 if you're getting IP limited, otherwise don't touch it |
| `ENABLE_PROMETHEUS` | No | `false` | Setting to _any_ non-blank value will enable Prometheus. See [docs](https://github.com/kieraneglin/pinchflat/wiki/Prometheus-and-Grafana) |
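For example (illustrative values only, reusing the `docker run` form from earlier in this README), environment variables are passed to the container with `-e` flags:
```
docker run \
  -e TZ=America/Los_Angeles \
  -e UMASK=022 \
  -e LOG_LEVEL=debug \
  -e ENABLE_PROMETHEUS=true \
  -p 8945:8945 \
  -v /host/path/to/config:/config:rw \
  -v /host/path/to/downloads:/downloads:rw \
  ghcr.io/kieraneglin/pinchflat:latest
```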
### Reverse Proxies
Pinchflat makes heavy use of WebSockets for real-time updates. If you're running Pinchflat behind a reverse proxy, make sure the proxy is configured to pass WebSocket connections through.
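The exact configuration depends on your proxy, so no snippet is given here. As a rough smoke test (the hostname is a placeholder, and the `/live/websocket` path is assumed from Phoenix LiveView defaults rather than documented by Pinchflat), you can check whether the proxy forwards the WebSocket upgrade headers:
```
# Hypothetical hostname; /live/websocket is the usual Phoenix LiveView socket path.
# If upgrades are passed through you should get a WebSocket-specific response (eg: a 101
# or an error from Phoenix) rather than a generic HTML error page from the proxy itself.
curl -i -N \
  -H "Connection: Upgrade" \
  -H "Upgrade: websocket" \
  -H "Sec-WebSocket-Version: 13" \
  -H "Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==" \
  https://pinchflat.example.com/live/websocket
```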
## EFF donations

View file

@ -347,6 +347,38 @@ module.exports = {
},
{ values }
)
}),
plugin(function ({ matchComponents, theme }) {
let iconsDir = path.join(__dirname, './vendor/simple-icons')
let values = {}
fs.readdirSync(iconsDir).forEach((file) => {
let name = path.basename(file, '.svg')
values[name] = { name, fullPath: path.join(iconsDir, file) }
})
matchComponents(
{
si: ({ name, fullPath }) => {
let content = fs
.readFileSync(fullPath)
.toString()
.replace(/\r?\n|\r/g, '')
return {
[`--si-${name}`]: `url('data:image/svg+xml;utf8,${content}')`,
'-webkit-mask': `var(--si-${name})`,
mask: `var(--si-${name})`,
'mask-repeat': 'no-repeat',
'background-color': 'currentColor',
'vertical-align': 'middle',
display: 'inline-block',
width: theme('spacing.5'),
height: theme('spacing.5')
}
}
},
{ values }
)
})
]
}

View file

@ -0,0 +1 @@
<svg role="img" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg"><title>Discord</title><path d="M20.317 4.3698a19.7913 19.7913 0 00-4.8851-1.5152.0741.0741 0 00-.0785.0371c-.211.3753-.4447.8648-.6083 1.2495-1.8447-.2762-3.68-.2762-5.4868 0-.1636-.3933-.4058-.8742-.6177-1.2495a.077.077 0 00-.0785-.037 19.7363 19.7363 0 00-4.8852 1.515.0699.0699 0 00-.0321.0277C.5334 9.0458-.319 13.5799.0992 18.0578a.0824.0824 0 00.0312.0561c2.0528 1.5076 4.0413 2.4228 5.9929 3.0294a.0777.0777 0 00.0842-.0276c.4616-.6304.8731-1.2952 1.226-1.9942a.076.076 0 00-.0416-.1057c-.6528-.2476-1.2743-.5495-1.8722-.8923a.077.077 0 01-.0076-.1277c.1258-.0943.2517-.1923.3718-.2914a.0743.0743 0 01.0776-.0105c3.9278 1.7933 8.18 1.7933 12.0614 0a.0739.0739 0 01.0785.0095c.1202.099.246.1981.3728.2924a.077.077 0 01-.0066.1276 12.2986 12.2986 0 01-1.873.8914.0766.0766 0 00-.0407.1067c.3604.698.7719 1.3628 1.225 1.9932a.076.076 0 00.0842.0286c1.961-.6067 3.9495-1.5219 6.0023-3.0294a.077.077 0 00.0313-.0552c.5004-5.177-.8382-9.6739-3.5485-13.6604a.061.061 0 00-.0312-.0286zM8.02 15.3312c-1.1825 0-2.1569-1.0857-2.1569-2.419 0-1.3332.9555-2.4189 2.157-2.4189 1.2108 0 2.1757 1.0952 2.1568 2.419 0 1.3332-.9555 2.4189-2.1569 2.4189zm7.9748 0c-1.1825 0-2.1569-1.0857-2.1569-2.419 0-1.3332.9554-2.4189 2.1569-2.4189 1.2108 0 2.1757 1.0952 2.1568 2.419 0 1.3332-.946 2.4189-2.1568 2.4189Z"/></svg>

assets/vendor/simple-icons/github.svg vendored Normal file
View file

@ -0,0 +1 @@
<svg role="img" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg"><title>GitHub</title><path d="M12 .297c-6.63 0-12 5.373-12 12 0 5.303 3.438 9.8 8.205 11.385.6.113.82-.258.82-.577 0-.285-.01-1.04-.015-2.04-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729 1.205.084 1.838 1.236 1.838 1.236 1.07 1.835 2.809 1.305 3.495.998.108-.776.417-1.305.76-1.605-2.665-.3-5.466-1.332-5.466-5.93 0-1.31.465-2.38 1.235-3.22-.135-.303-.54-1.523.105-3.176 0 0 1.005-.322 3.3 1.23.96-.267 1.98-.399 3-.405 1.02.006 2.04.138 3 .405 2.28-1.552 3.285-1.23 3.285-1.23.645 1.653.24 2.873.12 3.176.765.84 1.23 1.91 1.23 3.22 0 4.61-2.805 5.625-5.475 5.92.42.36.81 1.096.81 2.22 0 1.606-.015 2.896-.015 3.286 0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12"/></svg>


View file

@ -10,6 +10,7 @@ import Config
config :pinchflat,
ecto_repos: [Pinchflat.Repo],
generators: [timestamp_type: :utc_datetime],
env: config_env(),
# Specifying backend data here makes mocking and local testing SUPER easy
yt_dlp_executable: System.find_executable("yt-dlp"),
apprise_executable: System.find_executable("apprise"),
@ -49,16 +50,7 @@ config :pinchflat, PinchflatWeb.Endpoint,
config :pinchflat, Oban,
engine: Oban.Engines.Lite,
repo: Pinchflat.Repo,
# Keep old jobs for 30 days for display in the UI
plugins: [
{Oban.Plugins.Pruner, max_age: 30 * 24 * 60 * 60},
{Oban.Plugins.Cron,
crontab: [
{"0 1 * * *", Pinchflat.Downloading.MediaRetentionWorker},
{"0 2 * * *", Pinchflat.Downloading.MediaQualityUpgradeWorker}
]}
]
repo: Pinchflat.Repo
# Configures the mailer
#
@ -99,6 +91,12 @@ config :logger, :default_formatter,
# Use Jason for JSON parsing in Phoenix
config :phoenix, :json_library, Jason
config :pinchflat, Pinchflat.PromEx,
disabled: true,
manual_metrics_start_delay: :no_delay,
drop_metrics_groups: [],
metrics_server: :disabled
# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{config_env()}.exs"

View file

@ -81,3 +81,5 @@ config :phoenix_live_view, :debug_heex_annotations, true
# Disable swoosh api client as it is only required for production adapters.
config :swoosh, :api_client, false
config :pinchflat, Pinchflat.PromEx, disabled: false

View file

@ -43,15 +43,30 @@ config :pinchflat, Pinchflat.Repo,
# Some users may want to increase the number of workers that use yt-dlp to improve speeds
# Others may want to decrease the number of these workers to lessen the chance of an IP ban
{yt_dlp_worker_count, _} = Integer.parse(System.get_env("YT_DLP_WORKER_CONCURRENCY", "2"))
# Used to set the cron for the yt-dlp update worker. The reason for this is
# to avoid all instances of PF updating yt-dlp at the same time, which 1)
# could result in rate limiting and 2) gives me time to react if an update
# breaks something
%{hour: current_hour, minute: current_minute} = DateTime.utc_now()
config :pinchflat, Oban,
queues: [
default: 10,
fast_indexing: 6,
fast_indexing: yt_dlp_worker_count,
media_collection_indexing: yt_dlp_worker_count,
media_fetching: yt_dlp_worker_count,
remote_metadata: yt_dlp_worker_count,
local_data: 8
],
plugins: [
# Keep old jobs for 30 days for display in the UI
{Oban.Plugins.Pruner, max_age: 30 * 24 * 60 * 60},
{Oban.Plugins.Cron,
crontab: [
{"#{current_minute} #{current_hour} * * *", Pinchflat.YtDlp.UpdateWorker},
{"0 1 * * *", Pinchflat.Downloading.MediaRetentionWorker},
{"0 2 * * *", Pinchflat.Downloading.MediaQualityUpgradeWorker}
]}
]
if config_env() == :prod do
@ -72,6 +87,7 @@ if config_env() == :prod do
# For running PF in a subdirectory via a reverse proxy
base_route_path = System.get_env("BASE_ROUTE_PATH", "/")
enable_ipv6 = String.length(System.get_env("ENABLE_IPV6", "")) > 0
enable_prometheus = String.length(System.get_env("ENABLE_PROMETHEUS", "")) > 0
config :logger, level: String.to_existing_atom(System.get_env("LOG_LEVEL", "debug"))
@ -95,6 +111,8 @@ if config_env() == :prod do
database: db_path,
journal_mode: journal_mode
config :pinchflat, Pinchflat.PromEx, disabled: !enable_prometheus
# The secret key base is used to sign/encrypt cookies and other secrets.
# A default value is used in config/dev.exs and config/test.exs but you
# want to use a different value for prod and you most likely don't want

View file

@ -1,6 +1,7 @@
ARG ELIXIR_VERSION=1.17.0
ARG OTP_VERSION=26.2.5
ARG DEBIAN_VERSION=bookworm-20240612-slim
ARG ELIXIR_VERSION=1.18.4
ARG OTP_VERSION=27.2.4
ARG DEBIAN_VERSION=bookworm-20250428-slim
ARG DEV_IMAGE="hexpm/elixir:${ELIXIR_VERSION}-erlang-${OTP_VERSION}-debian-${DEBIAN_VERSION}"
FROM ${DEV_IMAGE}
@ -12,7 +13,7 @@ RUN echo "Building for ${TARGETPLATFORM:?}"
RUN apt-get update -qq && \
apt-get install -y inotify-tools curl git openssh-client jq \
python3 python3-setuptools python3-wheel python3-dev pipx \
python3-mutagen locales procps build-essential graphviz zsh
python3-mutagen locales procps build-essential graphviz zsh unzip
# Install ffmpeg
RUN export FFMPEG_DOWNLOAD=$(case ${TARGETPLATFORM:-linux/amd64} in \
@ -31,8 +32,14 @@ RUN curl -sL https://deb.nodesource.com/setup_20.x -o nodesource_setup.sh && \
# Install baseline Elixir packages
mix local.hex --force && \
mix local.rebar --force && \
# Install Deno - required for YouTube downloads (See yt-dlp#14404)
curl -fsSL https://deno.land/install.sh | DENO_INSTALL=/usr/local sh -s -- -y --no-modify-path && \
# Download and update YT-DLP
curl -L https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp -o /usr/local/bin/yt-dlp && \
export YT_DLP_DOWNLOAD=$(case ${TARGETPLATFORM:-linux/amd64} in \
"linux/amd64") echo "https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux" ;; \
"linux/arm64") echo "https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux_aarch64" ;; \
*) echo "https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux" ;; esac) && \
curl -L ${YT_DLP_DOWNLOAD} -o /usr/local/bin/yt-dlp && \
chmod a+rx /usr/local/bin/yt-dlp && \
yt-dlp -U && \
# Install Apprise

View file

@ -1,13 +1,13 @@
# Find eligible builder and runner images on Docker Hub. We use Ubuntu/Debian
# instead of Alpine to avoid DNS resolution issues in production.
ARG ELIXIR_VERSION=1.17.0
ARG OTP_VERSION=26.2.5
ARG DEBIAN_VERSION=bookworm-20240612-slim
ARG ELIXIR_VERSION=1.18.4
ARG OTP_VERSION=27.2.4
ARG DEBIAN_VERSION=bookworm-20250428-slim
ARG BUILDER_IMAGE="hexpm/elixir:${ELIXIR_VERSION}-erlang-${OTP_VERSION}-debian-${DEBIAN_VERSION}"
ARG RUNNER_IMAGE="debian:${DEBIAN_VERSION}"
FROM ${BUILDER_IMAGE} as builder
FROM ${BUILDER_IMAGE} AS builder
ARG TARGETPLATFORM
RUN echo "Building for ${TARGETPLATFORM:?}"
@ -73,6 +73,7 @@ RUN mix release
FROM ${RUNNER_IMAGE}
ARG TARGETPLATFORM
ARG PORT=8945
COPY --from=builder ./usr/local/bin/ffmpeg /usr/bin/ffmpeg
@ -94,13 +95,21 @@ RUN apt-get update -y && \
python3 \
pipx \
jq \
# unzip is needed for Deno
unzip \
procps && \
# Install Deno - required for YouTube downloads (See yt-dlp#14404)
curl -fsSL https://deno.land/install.sh | DENO_INSTALL=/usr/local sh -s -- -y --no-modify-path && \
# Apprise
export PIPX_HOME=/opt/pipx && \
export PIPX_BIN_DIR=/usr/local/bin && \
pipx install apprise && \
# yt-dlp
curl -L https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp -o /usr/local/bin/yt-dlp && \
export YT_DLP_DOWNLOAD=$(case ${TARGETPLATFORM:-linux/amd64} in \
"linux/amd64") echo "https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux" ;; \
"linux/arm64") echo "https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux_aarch64" ;; \
*) echo "https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux" ;; esac) && \
curl -L ${YT_DLP_DOWNLOAD} -o /usr/local/bin/yt-dlp && \
chmod a+rx /usr/local/bin/yt-dlp && \
yt-dlp -U && \
# Set the locale
@ -110,26 +119,27 @@ RUN apt-get update -y && \
rm -rf /var/lib/apt/lists/*
# More locale setup
ENV LANG en_US.UTF-8
ENV LANGUAGE en_US:en
ENV LC_ALL en_US.UTF-8
ENV LANG=en_US.UTF-8
ENV LANGUAGE=en_US:en
ENV LC_ALL=en_US.UTF-8
WORKDIR "/app"
# Set up data volumes
RUN mkdir -p /config /downloads /etc/elixir_tzdata_data /etc/yt-dlp/plugins && \
chmod ugo+rw /etc/elixir_tzdata_data /etc/yt-dlp /etc/yt-dlp/plugins
chmod ugo+rw /etc/elixir_tzdata_data /etc/yt-dlp /etc/yt-dlp/plugins /usr/local/bin /usr/local/bin/yt-dlp
# set runner ENV
ENV MIX_ENV="prod"
ENV PORT=${PORT}
ENV RUN_CONTEXT="selfhosted"
ENV UMASK=022
EXPOSE ${PORT}
# Only copy the final release from the build stage
COPY --from=builder /app/_build/${MIX_ENV}/rel/pinchflat ./
HEALTHCHECK --interval=120s --start-period=10s \
HEALTHCHECK --interval=30s --start-period=15s \
CMD curl --fail http://localhost:${PORT}/healthcheck || exit 1
# Start the app

View file

@ -9,8 +9,13 @@ defmodule Pinchflat.Application do
@impl true
def start(_type, _args) do
check_and_update_timezone()
attach_oban_telemetry()
Logger.add_handlers(:pinchflat)
children = [
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
[
Pinchflat.PromEx,
PinchflatWeb.Telemetry,
Pinchflat.Repo,
# Must be before startup tasks
@ -23,17 +28,11 @@ defmodule Pinchflat.Application do
{Finch, name: Pinchflat.Finch},
# Start a worker by calling: Pinchflat.Worker.start_link(arg)
# {Pinchflat.Worker, arg},
# Start to serve requests, typically the last entry
PinchflatWeb.Endpoint
# Start to serve requests, typically the last entry (except for the post-boot tasks)
PinchflatWeb.Endpoint,
Pinchflat.Boot.PostBootStartupTasks
]
attach_oban_telemetry()
Logger.add_handlers(:pinchflat)
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: Pinchflat.Supervisor]
Supervisor.start_link(children, opts)
|> Supervisor.start_link(strategy: :one_for_one, name: Pinchflat.Supervisor)
end
# Tell Phoenix to update the endpoint configuration

View file

@ -0,0 +1,46 @@
defmodule Pinchflat.Boot.PostBootStartupTasks do
@moduledoc """
This module is responsible for running startup tasks on app boot
AFTER all other boot steps have taken place and the app is ready to serve requests.
It's a GenServer because that plays REALLY nicely with the existing
Phoenix supervision tree.
"""
alias Pinchflat.YtDlp.UpdateWorker, as: YtDlpUpdateWorker
# restart: :temporary means that this process will never be restarted (ie: will run once and then die)
use GenServer, restart: :temporary
import Ecto.Query, warn: false
def start_link(opts \\ []) do
GenServer.start_link(__MODULE__, %{env: Application.get_env(:pinchflat, :env)}, opts)
end
@doc """
Runs post-boot application startup tasks.
Any code defined here will run every time the application starts. You must
make sure that the code is idempotent and safe to run multiple times.
This is a good place to set up default settings, create initial records, stuff like that.
Should be fast - anything with the potential to be slow should be kicked off as a job instead.
"""
@impl true
def init(%{env: :test} = state) do
# Do nothing _as part of the app bootup process_.
# Since bootup calls `start_link` and that's where the `env` state is injected,
# you can still call `.init()` manually to run these tasks for testing purposes
{:ok, state}
end
def init(state) do
update_yt_dlp()
{:ok, state}
end
defp update_yt_dlp do
YtDlpUpdateWorker.kickoff()
end
end

View file

@ -1,7 +1,7 @@
defmodule Pinchflat.Boot.PostJobStartupTasks do
@moduledoc """
This module is responsible for running startup tasks on app boot
AFTER the job runner has initiallized.
AFTER the job runner has initialized.
It's a GenServer because that plays REALLY nicely with the existing
Phoenix supervision tree.
@ -12,7 +12,7 @@ defmodule Pinchflat.Boot.PostJobStartupTasks do
import Ecto.Query, warn: false
def start_link(opts \\ []) do
GenServer.start_link(__MODULE__, %{}, opts)
GenServer.start_link(__MODULE__, %{env: Application.get_env(:pinchflat, :env)}, opts)
end
@doc """
@ -25,6 +25,13 @@ defmodule Pinchflat.Boot.PostJobStartupTasks do
Should be fast - anything with the potential to be slow should be kicked off as a job instead.
"""
@impl true
def init(%{env: :test} = state) do
# Do nothing _as part of the app bootup process_.
# Since bootup calls `start_link` and that's where the `env` state is injected,
# you can still call `.init()` manually to run these tasks for testing purposes
{:ok, state}
end
def init(state) do
# Nothing at the moment!

View file

@ -19,7 +19,7 @@ defmodule Pinchflat.Boot.PreJobStartupTasks do
alias Pinchflat.Lifecycle.UserScripts.CommandRunner, as: UserScriptRunner
def start_link(opts \\ []) do
GenServer.start_link(__MODULE__, %{}, opts)
GenServer.start_link(__MODULE__, %{env: Application.get_env(:pinchflat, :env)}, opts)
end
@doc """
@ -32,6 +32,13 @@ defmodule Pinchflat.Boot.PreJobStartupTasks do
Should be fast - anything with the potential to be slow should be kicked off as a job instead.
"""
@impl true
def init(%{env: :test} = state) do
# Do nothing _as part of the app bootup process_.
# Since bootup calls `start_link` and that's where the `env` state is injected,
# you can still call `.init()` manually to run these tasks for testing purposes
{:ok, state}
end
def init(state) do
ensure_tmpfile_directory()
reset_executing_jobs()

View file

@ -27,13 +27,15 @@ defmodule Pinchflat.Downloading.DownloadingHelpers do
Returns :ok
"""
def enqueue_pending_download_tasks(%Source{download_media: true} = source) do
def enqueue_pending_download_tasks(source, job_opts \\ [])
def enqueue_pending_download_tasks(%Source{download_media: true} = source, job_opts) do
source
|> Media.list_pending_media_items_for()
|> Enum.each(&MediaDownloadWorker.kickoff_with_task/1)
|> Enum.each(&MediaDownloadWorker.kickoff_with_task(&1, %{}, job_opts))
end
def enqueue_pending_download_tasks(%Source{download_media: false}) do
def enqueue_pending_download_tasks(%Source{download_media: false}, _job_opts) do
:ok
end
@ -55,13 +57,13 @@ defmodule Pinchflat.Downloading.DownloadingHelpers do
Returns {:ok, %Task{}} | {:error, :should_not_download} | {:error, any()}
"""
def kickoff_download_if_pending(%MediaItem{} = media_item) do
def kickoff_download_if_pending(%MediaItem{} = media_item, job_opts \\ []) do
media_item = Repo.preload(media_item, :source)
if media_item.source.download_media && Media.pending_download?(media_item) do
Logger.info("Kicking off download for media item ##{media_item.id} (#{media_item.media_id})")
MediaDownloadWorker.kickoff_with_task(media_item)
MediaDownloadWorker.kickoff_with_task(media_item, %{}, job_opts)
else
{:error, :should_not_download}
end

View file

@ -3,6 +3,7 @@ defmodule Pinchflat.Downloading.MediaDownloadWorker do
use Oban.Worker,
queue: :media_fetching,
priority: 5,
unique: [period: :infinity, states: [:available, :scheduled, :retryable, :executing]],
tags: ["media_item", "media_fetching", "show_in_dashboard"]
@ -49,8 +50,7 @@ defmodule Pinchflat.Downloading.MediaDownloadWorker do
media_item = fetch_and_run_prevent_download_user_script(media_item_id)
# If the source or media item is set to not download media, perform a no-op unless forced
if (media_item.source.download_media && !media_item.prevent_download) || should_force do
if should_download_media?(media_item, should_force, is_quality_upgrade) do
download_media_and_schedule_jobs(media_item, is_quality_upgrade, should_force)
else
:ok
@ -60,6 +60,20 @@ defmodule Pinchflat.Downloading.MediaDownloadWorker do
Ecto.StaleEntryError -> Logger.info("#{__MODULE__} discarded: media item #{media_item_id} stale")
end
# If this is a quality upgrade, only check if the source is set to download media
# or that the media item's download hasn't been prevented
defp should_download_media?(media_item, should_force, true = _is_quality_upgrade) do
(media_item.source.download_media && !media_item.prevent_download) || should_force
end
# If it's not a quality upgrade, additionally check if the media item is pending download
defp should_download_media?(media_item, should_force, _is_quality_upgrade) do
source = media_item.source
is_pending = Media.pending_download?(media_item)
(is_pending && source.download_media && !media_item.prevent_download) || should_force
end
# If a user script exists and, when run, returns a non-zero exit code, prevent this and all future downloads
# of the media item.
defp fetch_and_run_prevent_download_user_script(media_item_id) do
@ -91,13 +105,13 @@ defmodule Pinchflat.Downloading.MediaDownloadWorker do
:ok
{:recovered, _} ->
{:recovered, _media_item, _message} ->
{:error, :retry}
{:error, :unsuitable_for_download} ->
{:error, :unsuitable_for_download, _message} ->
{:ok, :non_retry}
{:error, message} ->
{:error, _error_atom, message} ->
action_on_error(message)
end
end
@ -115,7 +129,11 @@ defmodule Pinchflat.Downloading.MediaDownloadWorker do
defp action_on_error(message) do
# This will attempt re-download at the next indexing, but it won't be retried
# immediately as part of job failure logic
non_retryable_errors = ["Video unavailable"]
non_retryable_errors = [
"Video unavailable",
"Sign in to confirm",
"This video is available to this channel's members"
]
if String.contains?(to_string(message), non_retryable_errors) do
Logger.error("yt-dlp download will not be retried: #{inspect(message)}")

View file

@ -9,7 +9,9 @@ defmodule Pinchflat.Downloading.MediaDownloader do
alias Pinchflat.Repo
alias Pinchflat.Media
alias Pinchflat.Sources
alias Pinchflat.Media.MediaItem
alias Pinchflat.Utils.StringUtils
alias Pinchflat.Metadata.NfoBuilder
alias Pinchflat.Metadata.MetadataParser
alias Pinchflat.Metadata.MetadataFileHelpers
@ -20,16 +22,57 @@ defmodule Pinchflat.Downloading.MediaDownloader do
@doc """
Downloads media for a media item, updating the media item based on the metadata
returned by yt-dlp. Also saves the entire metadata response to the associated
media_metadata record.
returned by yt-dlp. Encountered errors are saved to the Media Item record. Saves
the entire metadata response to the associated media_metadata record.
NOTE: related methods (like the download worker) won't download if the media item's source
is set to not download media. However, I'm not enforcing that here since I need this for testing.
This may change in the future but I'm not stressed.
Returns {:ok, %MediaItem{}} | {:error, any, ...any}
Returns {:ok, %MediaItem{}} | {:error, atom(), String.t()} | {:recovered, %MediaItem{}, String.t()}
"""
def download_for_media_item(%MediaItem{} = media_item, override_opts \\ []) do
case attempt_download_and_update_for_media_item(media_item, override_opts) do
{:ok, media_item} ->
# Returns {:ok, %MediaItem{}}
Media.update_media_item(media_item, %{last_error: nil})
{:error, error_atom, message} ->
Media.update_media_item(media_item, %{last_error: StringUtils.wrap_string(message)})
{:error, error_atom, message}
{:recovered, media_item, message} ->
{:ok, updated_media_item} = Media.update_media_item(media_item, %{last_error: StringUtils.wrap_string(message)})
{:recovered, updated_media_item, message}
end
end
# Looks complicated, but here's the key points:
# - download_with_options runs a pre-check to see if the media item is suitable for download.
# - If the media item fails the precheck, it returns {:error, :unsuitable_for_download, message}
# - However, if the precheck fails in a way that we think can be fixed by using cookies, we retry with cookies
# and return the result of that
# - If the precheck passes but the download fails, it normally returns {:error, :download_failed, message}
# - However, there are some errors we can recover from (eg: failure to communicate with SponsorBlock).
# In this case, we attempt the download anyway and update the media item with what details we do have.
# This case returns {:recovered, updated_media_item, message}
# - If we attempt a retry but it fails, we return {:error, :unrecoverable, message}
# - If there is an unknown error unrelated to the above, we return {:error, :unknown, message}
# - Finally, if there is no error, we update the media item with the parsed JSON and return {:ok, updated_media_item}
#
# Restated, here are the return values for each case:
# - On success: {:ok, updated_media_item}
# - On initial failure but successfully recovered: {:recovered, updated_media_item, message}
# - On error: {:error, error_atom, message} where error_atom is one of:
# - `:unsuitable_for_download` if the media item fails the precheck
# - `:unrecoverable` if there was an initial failure and the recovery attempt failed
# - `:download_failed` for all other yt-dlp-related downloading errors
# - `:unknown` for any other errors, including those not related to yt-dlp
# - If we retry using cookies, all of the above return values apply. The cookie retry
# logic is handled transparently as far as the caller is concerned
defp attempt_download_and_update_for_media_item(media_item, override_opts) do
output_filepath = FilesystemUtils.generate_metadata_tmpfile(:json)
media_with_preloads = Repo.preload(media_item, [:metadata, source: :media_profile])
@ -38,31 +81,30 @@ defmodule Pinchflat.Downloading.MediaDownloader do
update_media_item_from_parsed_json(media_with_preloads, parsed_json)
{:error, :unsuitable_for_download} ->
Logger.warning(
message =
"Media item ##{media_with_preloads.id} isn't suitable for download yet. May be an active or processing live stream"
)
{:error, :unsuitable_for_download}
Logger.warning(message)
{:error, :unsuitable_for_download, message}
{:error, message, _exit_code} ->
Logger.error("yt-dlp download error for media item ##{media_with_preloads.id}: #{inspect(message)}")
if String.contains?(to_string(message), recoverable_errors()) do
attempt_update_media_item(media_with_preloads, output_filepath)
{:recovered, message}
attempt_recovery_from_error(media_with_preloads, output_filepath, message)
else
{:error, message}
{:error, :download_failed, message}
end
err ->
Logger.error("Unknown error downloading media item ##{media_with_preloads.id}: #{inspect(err)}")
{:error, "Unknown error: #{inspect(err)}"}
{:error, :unknown, "Unknown error: #{inspect(err)}"}
end
end
defp attempt_update_media_item(media_with_preloads, output_filepath) do
defp attempt_recovery_from_error(media_with_preloads, output_filepath, error_message) do
with {:ok, contents} <- File.read(output_filepath),
{:ok, parsed_json} <- Phoenix.json_library().decode(contents) do
Logger.info("""
@ -71,12 +113,13 @@ defmodule Pinchflat.Downloading.MediaDownloader do
anyway
""")
update_media_item_from_parsed_json(media_with_preloads, parsed_json)
{:ok, updated_media_item} = update_media_item_from_parsed_json(media_with_preloads, parsed_json)
{:recovered, updated_media_item, error_message}
else
err ->
Logger.error("Unable to recover error for media item ##{media_with_preloads.id}: #{inspect(err)}")
{:error, :retry_failed}
{:error, :unrecoverable, error_message}
end
end
@ -113,13 +156,48 @@ defmodule Pinchflat.Downloading.MediaDownloader do
defp download_with_options(url, item_with_preloads, output_filepath, override_opts) do
{:ok, options} = DownloadOptionBuilder.build(item_with_preloads, override_opts)
use_cookies = item_with_preloads.source.use_cookies
runner_opts = [output_filepath: output_filepath, use_cookies: use_cookies]
force_use_cookies = Keyword.get(override_opts, :force_use_cookies, false)
source_uses_cookies = Sources.use_cookies?(item_with_preloads.source, :downloading)
should_use_cookies = force_use_cookies || source_uses_cookies
case YtDlpMedia.get_downloadable_status(url, use_cookies: use_cookies) do
{:ok, :downloadable} -> YtDlpMedia.download(url, options, runner_opts)
{:ok, :ignorable} -> {:error, :unsuitable_for_download}
err -> err
runner_opts = [output_filepath: output_filepath, use_cookies: should_use_cookies]
case {YtDlpMedia.get_downloadable_status(url, use_cookies: should_use_cookies), should_use_cookies} do
{{:ok, :downloadable}, _} ->
YtDlpMedia.download(url, options, runner_opts)
{{:ok, :ignorable}, _} ->
{:error, :unsuitable_for_download}
{{:error, _message, _exit_code} = err, false} ->
# If there was an error and we don't have cookies, this method will retry with cookies
# if doing so would help AND the source allows. Otherwise, it will return the error as-is
maybe_retry_with_cookies(url, item_with_preloads, output_filepath, override_opts, err)
# This gets hit if cookies are enabled which, importantly, also covers the case where we
# retry a download with cookies and it fails again
{{:error, message, exit_code}, true} ->
{:error, message, exit_code}
{err, _} ->
err
end
end
defp maybe_retry_with_cookies(url, item_with_preloads, output_filepath, override_opts, err) do
{:error, message, _} = err
source = item_with_preloads.source
message_contains_cookie_error = String.contains?(to_string(message), recoverable_cookie_errors())
if Sources.use_cookies?(source, :error_recovery) && message_contains_cookie_error do
download_with_options(
url,
item_with_preloads,
output_filepath,
Keyword.put(override_opts, :force_use_cookies, true)
)
else
err
end
end
@ -128,4 +206,11 @@ defmodule Pinchflat.Downloading.MediaDownloader do
"Unable to communicate with SponsorBlock"
]
end
defp recoverable_cookie_errors do
[
"Sign in to confirm",
"This video is available to this channel's members"
]
end
end

View file

@ -12,6 +12,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpers do
alias Pinchflat.Repo
alias Pinchflat.Media
alias Pinchflat.Tasks
alias Pinchflat.Sources
alias Pinchflat.Sources.Source
alias Pinchflat.FastIndexing.YoutubeRss
alias Pinchflat.FastIndexing.YoutubeApi
@ -40,7 +41,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpers do
Returns [%MediaItem{}] where each item is a new media item that was created _but not necessarily
downloaded_.
"""
def kickoff_download_tasks_from_youtube_rss_feed(%Source{} = source) do
def index_and_kickoff_downloads(%Source{} = source) do
# The media_profile is needed to determine the quality options to _then_ determine a more
# accurate predicted filepath
source = Repo.preload(source, [:media_profile])
@ -53,6 +54,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpers do
Enum.map(new_media_ids, fn media_id ->
case create_media_item_from_media_id(source, media_id) do
{:ok, media_item} ->
DownloadingHelpers.kickoff_download_if_pending(media_item, priority: 0)
media_item
err ->
@ -61,7 +63,9 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpers do
end
end)
DownloadingHelpers.enqueue_pending_download_tasks(source)
# Pick up any stragglers. Intentionally has a lower priority than the per-media item
# kickoff above
DownloadingHelpers.enqueue_pending_download_tasks(source, priority: 1)
Enum.filter(maybe_new_media_items, & &1)
end
@ -85,12 +89,16 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpers do
defp create_media_item_from_media_id(source, media_id) do
url = "https://www.youtube.com/watch?v=#{media_id}"
# This is set to :metadata instead of :indexing since this happens _after_ the
# actual indexing process. In reality, slow indexing is the only thing that
# should be using :indexing.
should_use_cookies = Sources.use_cookies?(source, :metadata)
command_opts =
[output: DownloadOptionBuilder.build_output_path_for(source)] ++
DownloadOptionBuilder.build_quality_options_for(source)
case YtDlpMedia.get_media_attributes(url, command_opts, use_cookies: source.use_cookies) do
case YtDlpMedia.get_media_attributes(url, command_opts, use_cookies: should_use_cookies) do
{:ok, media_attrs} ->
Media.create_media_item_from_backend_attrs(source, media_attrs)

View file

@ -38,8 +38,8 @@ defmodule Pinchflat.FastIndexing.FastIndexingWorker do
Order of operations:
1. FastIndexingWorker (this module) periodically checks the YouTube RSS feed for new media.
with `FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed`
2. If the above `kickoff_download_tasks_from_youtube_rss_feed` finds new media items in the RSS feed,
with `FastIndexingHelpers.index_and_kickoff_downloads`
2. If the above `index_and_kickoff_downloads` finds new media items in the RSS feed,
it indexes them with a yt-dlp call to create the media item records then kicks off downloading
tasks (MediaDownloadWorker) for any new media items _that should be downloaded_.
3. Once downloads are kicked off, this worker sends a notification to the apprise server if applicable
@ -67,7 +67,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingWorker do
new_media_items =
source
|> FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed()
|> FastIndexingHelpers.index_and_kickoff_downloads()
|> Enum.filter(&Media.pending_download?(&1))
if source.download_media do

View file

@ -12,6 +12,8 @@ defmodule Pinchflat.FastIndexing.YoutubeApi do
@behaviour YoutubeBehaviour
@agent_name {:global, __MODULE__.KeyIndex}
@doc """
Determines if the YouTube API is enabled for fast indexing by checking
if the user has an API key set
@ -19,7 +21,7 @@ defmodule Pinchflat.FastIndexing.YoutubeApi do
Returns boolean()
"""
@impl YoutubeBehaviour
def enabled?(), do: is_binary(api_key())
def enabled?, do: Enum.any?(api_keys())
@doc """
Fetches the recent media IDs from the YouTube API for a given source.
@ -74,8 +76,45 @@ defmodule Pinchflat.FastIndexing.YoutubeApi do
|> FunctionUtils.wrap_ok()
end
defp api_key do
Settings.get!(:youtube_api_key)
defp api_keys do
case Settings.get!(:youtube_api_key) do
nil ->
[]
keys ->
keys
|> String.split(",")
|> Enum.map(&String.trim/1)
|> Enum.reject(&(&1 == ""))
end
end
defp get_or_start_api_key_agent do
case Agent.start(fn -> 0 end, name: @agent_name) do
{:ok, pid} -> pid
{:error, {:already_started, pid}} -> pid
end
end
# Gets the next API key in round-robin fashion
defp next_api_key do
keys = api_keys()
case keys do
[] ->
nil
keys ->
pid = get_or_start_api_key_agent()
current_index =
Agent.get_and_update(pid, fn current ->
{current, rem(current + 1, length(keys))}
end)
Logger.debug("Using YouTube API key: #{Enum.at(keys, current_index)}")
Enum.at(keys, current_index)
end
end
defp construct_api_endpoint(playlist_id) do
@ -83,7 +122,7 @@ defmodule Pinchflat.FastIndexing.YoutubeApi do
property_type = "contentDetails"
max_results = 50
"#{api_base}?part=#{property_type}&maxResults=#{max_results}&playlistId=#{playlist_id}&key=#{api_key()}"
"#{api_base}?part=#{property_type}&maxResults=#{max_results}&playlistId=#{playlist_id}&key=#{next_api_key()}"
end
defp http_client do

View file

@ -40,6 +40,7 @@ defmodule Pinchflat.Media.MediaItem do
:thumbnail_filepath,
:metadata_filepath,
:nfo_filepath,
:last_error,
# These are user or system controlled fields
:prevent_download,
:prevent_culling,
@ -88,6 +89,7 @@ defmodule Pinchflat.Media.MediaItem do
# Will very likely revisit because I can't leave well-enough alone.
field :subtitle_filepaths, {:array, {:array, :string}}, default: []
field :last_error, :string
field :prevent_download, :boolean, default: false
field :prevent_culling, :boolean, default: false
field :culled_at, :utc_datetime
@ -112,6 +114,9 @@ defmodule Pinchflat.Media.MediaItem do
|> dynamic_default(:uuid, fn _ -> Ecto.UUID.generate() end)
|> update_upload_date_index()
|> validate_required(@required_fields)
# Validate that the title does NOT start with "youtube video #" since that indicates a restriction by YouTube.
# See issue #549 for more information.
|> validate_format(:title, ~r/^(?!youtube video #)/)
|> unique_constraint([:media_id, :source_id])
end

View file

@ -9,6 +9,7 @@ defmodule Pinchflat.Metadata.MetadataFileHelpers do
needed
"""
alias Pinchflat.Sources
alias Pinchflat.Utils.FilesystemUtils
alias Pinchflat.YtDlp.Media, as: YtDlpMedia
@ -66,7 +67,7 @@ defmodule Pinchflat.Metadata.MetadataFileHelpers do
yt_dlp_filepath = generate_filepath_for(media_item_with_preloads, "thumbnail.%(ext)s")
real_filepath = generate_filepath_for(media_item_with_preloads, "thumbnail.jpg")
command_opts = [output: yt_dlp_filepath]
addl_opts = [use_cookies: media_item_with_preloads.source.use_cookies]
addl_opts = [use_cookies: Sources.use_cookies?(media_item_with_preloads.source, :metadata)]
case YtDlpMedia.download_thumbnail(media_item_with_preloads.original_url, command_opts, addl_opts) do
{:ok, _} -> real_filepath

View file

@ -93,7 +93,7 @@ defmodule Pinchflat.Metadata.SourceMetadataStorageWorker do
defp determine_series_directory(source) do
output_path = DownloadOptionBuilder.build_output_path_for(source)
runner_opts = [output: output_path]
addl_opts = [use_cookies: source.use_cookies]
addl_opts = [use_cookies: Sources.use_cookies?(source, :metadata)]
{:ok, %{filepath: filepath}} = MediaCollection.get_source_details(source.original_url, runner_opts, addl_opts)
case MetadataFileHelpers.series_directory_from_media_filepath(filepath) do
@ -113,6 +113,7 @@ defmodule Pinchflat.Metadata.SourceMetadataStorageWorker do
defp fetch_metadata_for_source(source) do
tmp_output_path = "#{tmp_directory()}/#{StringUtils.random_string(16)}/source_image.%(ext)S"
base_opts = [convert_thumbnails: "jpg", output: tmp_output_path]
should_use_cookies = Sources.use_cookies?(source, :metadata)
opts =
if source.collection_type == :channel do
@ -121,7 +122,7 @@ defmodule Pinchflat.Metadata.SourceMetadataStorageWorker do
base_opts ++ [:write_thumbnail, playlist_items: 1]
end
MediaCollection.get_source_metadata(source.original_url, opts, use_cookies: source.use_cookies)
MediaCollection.get_source_metadata(source.original_url, opts, use_cookies: should_use_cookies)
end
defp tmp_directory do

View file

@ -14,7 +14,7 @@ defmodule Pinchflat.Profiles.MediaProfileDeletionWorker do
Starts the profile deletion worker. Does not attach it to a task like `kickoff_with_task/2`
since deletion also cancels all tasks for the profile
Returns {:ok, %Task{}} | {:error, %Ecto.Changeset{}}
Returns {:ok, %Oban.Job{}} | {:error, %Ecto.Changeset{}}
"""
def kickoff(profile, job_args \\ %{}, job_opts \\ []) do
%{id: profile.id}

lib/pinchflat/prom_ex.ex Normal file
View file

@ -0,0 +1,40 @@
defmodule Pinchflat.PromEx do
@moduledoc """
Configuration for the PromEx library which provides Prometheus metrics
"""
use PromEx, otp_app: :pinchflat
alias PromEx.Plugins
@impl true
def plugins do
[
Plugins.Application,
Plugins.Beam,
{Plugins.Phoenix, router: PinchflatWeb.Router, endpoint: PinchflatWeb.Endpoint},
Plugins.Ecto,
Plugins.Oban,
Plugins.PhoenixLiveView
]
end
@impl true
def dashboard_assigns do
[
default_selected_interval: "30s"
]
end
@impl true
def dashboards do
[
{:prom_ex, "application.json"},
{:prom_ex, "beam.json"},
{:prom_ex, "phoenix.json"},
{:prom_ex, "ecto.json"},
{:prom_ex, "oban.json"},
{:prom_ex, "phoenix_live_view.json"}
]
end
end

View file

@ -14,15 +14,19 @@ defmodule Pinchflat.Settings.Setting do
:apprise_server,
:video_codec_preference,
:audio_codec_preference,
:youtube_api_key
:youtube_api_key,
:extractor_sleep_interval_seconds,
:download_throughput_limit,
:restrict_filenames
]
@required_fields ~w(
onboarding
pro_enabled
video_codec_preference
audio_codec_preference
)a
@required_fields [
:onboarding,
:pro_enabled,
:video_codec_preference,
:audio_codec_preference,
:extractor_sleep_interval_seconds
]
schema "settings" do
field :onboarding, :boolean, default: true
@ -32,6 +36,10 @@ defmodule Pinchflat.Settings.Setting do
field :apprise_server, :string
field :youtube_api_key, :string
field :route_token, :string
field :extractor_sleep_interval_seconds, :integer, default: 0
# This is a string because it accepts values like "100K" or "4.2M"
field :download_throughput_limit, :string
field :restrict_filenames, :boolean, default: false
field :video_codec_preference, :string
field :audio_codec_preference, :string
@ -42,5 +50,6 @@ defmodule Pinchflat.Settings.Setting do
setting
|> cast(attrs, @allowed_fields)
|> validate_required(@required_fields)
|> validate_number(:extractor_sleep_interval_seconds, greater_than_or_equal_to: 0)
end
end

View file

@ -106,7 +106,7 @@ defmodule Pinchflat.SlowIndexing.FileFollowerServer do
{:noreply, %{state | last_activity: DateTime.utc_now()}}
:eof ->
Logger.debug("EOF reached, waiting before trying to read new lines")
Logger.debug("Current batch of media processed. Will check again in #{@poll_interval_ms}ms")
Process.send_after(self(), :read_new_lines, @poll_interval_ms)
{:noreply, state}

View file

@ -39,7 +39,6 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpers do
def kickoff_indexing_task(%Source{} = source, job_args \\ %{}, job_opts \\ []) do
job_offset_seconds = if job_args[:force], do: 0, else: calculate_job_offset_seconds(source)
Tasks.delete_pending_tasks_for(source, "FastIndexingWorker")
Tasks.delete_pending_tasks_for(source, "MediaCollectionIndexingWorker", include_executing: true)
MediaCollectionIndexingWorker.kickoff_with_task(source, job_args, job_opts ++ [schedule_in: job_offset_seconds])
@ -133,13 +132,14 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpers do
{:ok, pid} = FileFollowerServer.start_link()
handler = fn filepath -> setup_file_follower_watcher(pid, filepath, source) end
should_use_cookies = Sources.use_cookies?(source, :indexing)
command_opts =
[output: DownloadOptionBuilder.build_output_path_for(source)] ++
DownloadOptionBuilder.build_quality_options_for(source) ++
build_download_archive_options(source, was_forced)
runner_opts = [file_listener_handler: handler, use_cookies: source.use_cookies]
runner_opts = [file_listener_handler: handler, use_cookies: should_use_cookies]
result = MediaCollection.get_media_attributes_for_collection(source.original_url, command_opts, runner_opts)
FileFollowerServer.stop(pid)
@ -231,8 +231,9 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpers do
# The download archive isn't useful for playlists (since those are ordered arbitrarily)
# and we don't want to use it if the indexing was forced by the user. In other words,
# only create an archive for channels that are being indexed as part of their regular
# indexing schedule
# indexing schedule. The first indexing pass should also not create an archive.
defp build_download_archive_options(%Source{collection_type: :playlist}, _was_forced), do: []
defp build_download_archive_options(%Source{last_indexed_at: nil}, _was_forced), do: []
defp build_download_archive_options(_source, true), do: []
defp build_download_archive_options(source, _was_forced) do

View file

@ -28,7 +28,7 @@ defmodule Pinchflat.Sources.Source do
series_directory
index_frequency_minutes
fast_index
use_cookies
cookie_behaviour
download_media
last_indexed_at
original_url
@ -78,7 +78,7 @@ defmodule Pinchflat.Sources.Source do
field :collection_type, Ecto.Enum, values: [:channel, :playlist]
field :index_frequency_minutes, :integer, default: 60 * 24
field :fast_index, :boolean, default: false
field :use_cookies, :boolean, default: false
field :cookie_behaviour, Ecto.Enum, values: [:disabled, :when_needed, :all_operations], default: :disabled
field :download_media, :boolean, default: true
field :last_indexed_at, :utc_datetime
# Only download media items that were published after this date

View file

@ -32,6 +32,19 @@ defmodule Pinchflat.Sources do
source.output_path_template_override || media_profile.output_path_template
end
@doc """
Returns a boolean indicating whether or not cookies should be used for a given operation.
Returns boolean()
"""
def use_cookies?(source, operation) when operation in [:indexing, :downloading, :metadata, :error_recovery] do
case source.cookie_behaviour do
:disabled -> false
:all_operations -> true
:when_needed -> operation in [:indexing, :error_recovery]
end
end
@doc """
Returns the list of sources. Returns [%Source{}, ...]
"""
@ -180,9 +193,12 @@ defmodule Pinchflat.Sources do
end
defp add_source_details_to_changeset(source, changeset) do
use_cookies = Ecto.Changeset.get_field(changeset, :use_cookies)
original_url = changeset.changes.original_url
should_use_cookies = Ecto.Changeset.get_field(changeset, :cookie_behaviour) == :all_operations
# Skipping sleep interval since this is UI blocking and we want to keep this as fast as possible
addl_opts = [use_cookies: should_use_cookies, skip_sleep_interval: true]
case MediaCollection.get_source_details(changeset.changes.original_url, [], use_cookies: use_cookies) do
case MediaCollection.get_source_details(original_url, [], addl_opts) do
{:ok, source_details} ->
add_source_details_by_collection_type(source, changeset, source_details)
@ -297,6 +313,10 @@ defmodule Pinchflat.Sources do
%{__meta__: %{state: :built}} ->
SlowIndexingHelpers.kickoff_indexing_task(source)
if Ecto.Changeset.get_field(changeset, :fast_index) do
FastIndexingHelpers.kickoff_indexing_task(source)
end
# If the record has been persisted, only run indexing if the
# indexing frequency has been changed and is now greater than 0
%{__meta__: %{state: :loaded}} ->

View file

@ -36,4 +36,18 @@ defmodule Pinchflat.Utils.NumberUtils do
end
end)
end
@doc """
Adds jitter to a number based on a percentage. Returns 0 if the number is less than or equal to 0.
Returns integer()
"""
def add_jitter(num, jitter_percentage \\ 0.5)
def add_jitter(num, _jitter_percentage) when num <= 0, do: 0
def add_jitter(num, jitter_percentage) do
jitter = :rand.uniform(round(num * jitter_percentage))
round(num + jitter)
end
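# Illustrative outcomes with the default 50% jitter (the added amount is random):
#
#   NumberUtils.add_jitter(10)       #=> a value in 11..15
#   NumberUtils.add_jitter(10, 0.2)  #=> a value in 11..12
#   NumberUtils.add_jitter(0)        #=> 0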
end


@ -35,4 +35,13 @@ defmodule Pinchflat.Utils.StringUtils do
def double_brace(string) do
"{{ #{string} }}"
end
@doc """
Returns binaries unchanged and converts any other term to its string representation
via `inspect/1`. Useful for working with error messages whose types can vary.
Returns binary()
"""
def wrap_string(message) when is_binary(message), do: message
def wrap_string(message), do: "#{inspect(message)}"
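# For example (hypothetical error values):
#
#   StringUtils.wrap_string("plain error")  #=> "plain error"
#   StringUtils.wrap_string(:timeout)       #=> ":timeout"
#   StringUtils.wrap_string({:error, 403})  #=> "{:error, 403}"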
end


@ -5,7 +5,9 @@ defmodule Pinchflat.YtDlp.CommandRunner do
require Logger
alias Pinchflat.Settings
alias Pinchflat.Utils.CliUtils
alias Pinchflat.Utils.NumberUtils
alias Pinchflat.YtDlp.YtDlpCommandRunner
alias Pinchflat.Utils.FilesystemUtils, as: FSUtils
@ -22,23 +24,23 @@ defmodule Pinchflat.YtDlp.CommandRunner do
for a file watcher.
- :use_cookies - if true, will add a cookie file to the command options. Will not
attach a cookie file if the user hasn't set one up.
- :skip_sleep_interval - if true, will not add the sleep interval options to the command.
Usually only used for commands that would be UI-blocking
Returns {:ok, binary()} | {:error, output, status}.
"""
@impl YtDlpCommandRunner
def run(url, action_name, command_opts, output_template, addl_opts \\ []) do
Logger.debug("Running yt-dlp command for action: #{action_name}")
# This approach lets us mock the command for testing
command = backend_executable()
output_filepath = generate_output_filepath(addl_opts)
print_to_file_opts = [{:print_to_file, output_template}, output_filepath]
user_configured_opts = cookie_file_options(addl_opts)
user_configured_opts = cookie_file_options(addl_opts) ++ rate_limit_options(addl_opts) ++ misc_options()
# These must stay in exactly this order, hence why I'm giving it its own variable.
all_opts = command_opts ++ print_to_file_opts ++ user_configured_opts ++ global_options()
formatted_command_opts = [url] ++ CliUtils.parse_options(all_opts)
case CliUtils.wrap_cmd(command, formatted_command_opts, stderr_to_stdout: true) do
case CliUtils.wrap_cmd(backend_executable(), formatted_command_opts, stderr_to_stdout: true) do
# yt-dlp exit codes:
# 0 = Everything is successful
# 100 = yt-dlp must restart for update to complete
@ -74,6 +76,24 @@ defmodule Pinchflat.YtDlp.CommandRunner do
end
end
@doc """
Updates yt-dlp to the latest version
Returns {:ok, binary()} | {:error, binary()}
"""
@impl YtDlpCommandRunner
def update do
command = backend_executable()
case CliUtils.wrap_cmd(command, ["--update"]) do
{output, 0} ->
{:ok, String.trim(output)}
{output, _} ->
{:error, output}
end
end
defp generate_output_filepath(addl_opts) do
case Keyword.get(addl_opts, :output_filepath) do
nil -> FSUtils.generate_metadata_tmpfile(:json)
@ -111,6 +131,32 @@ defmodule Pinchflat.YtDlp.CommandRunner do
end)
end
defp rate_limit_options(addl_opts) do
throughput_limit = Settings.get!(:download_throughput_limit)
sleep_interval_opts = sleep_interval_opts(addl_opts)
throughput_option = if throughput_limit, do: [limit_rate: throughput_limit], else: []
throughput_option ++ sleep_interval_opts
end
defp sleep_interval_opts(addl_opts) do
sleep_interval = Settings.get!(:extractor_sleep_interval_seconds)
if sleep_interval <= 0 || Keyword.get(addl_opts, :skip_sleep_interval) do
[]
else
[
sleep_requests: NumberUtils.add_jitter(sleep_interval),
sleep_interval: NumberUtils.add_jitter(sleep_interval),
sleep_subtitles: NumberUtils.add_jitter(sleep_interval)
]
end
end
defp misc_options do
if Settings.get!(:restrict_filenames), do: [:restrict_filenames], else: []
end
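# With hypothetical settings of download_throughput_limit: "4.2M",
# extractor_sleep_interval_seconds: 5 and restrict_filenames: true, the rate-limit
# and misc portions of user_configured_opts come out roughly as (jittered values vary):
#
#   [limit_rate: "4.2M", sleep_requests: 7, sleep_interval: 6, sleep_subtitles: 8] ++
#     [:restrict_filenames]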
defp backend_executable do
Application.get_env(:pinchflat, :yt_dlp_executable)
end


@ -151,7 +151,7 @@ defmodule Pinchflat.YtDlp.Media do
#
# These don't fail if duration or aspect_ratio are missing
# due to Elixir's comparison semantics
response["duration"] <= 60 && response["aspect_ratio"] <= 0.85
response["duration"] <= 180 && response["aspect_ratio"] <= 0.85
end
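# e.g. a 170s vertical upload (aspect_ratio around 0.56) now counts as a short,
# while a 200s upload of the same shape does not (values illustrative).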
end


@ -0,0 +1,44 @@
defmodule Pinchflat.YtDlp.UpdateWorker do
@moduledoc false
use Oban.Worker,
queue: :local_data,
tags: ["local_data"]
require Logger
alias __MODULE__
alias Pinchflat.Settings
@doc """
Starts the yt-dlp update worker. Does not attach it to a task like `kickoff_with_task/2`
Returns {:ok, %Oban.Job{}} | {:error, %Ecto.Changeset{}}
"""
def kickoff do
Oban.insert(UpdateWorker.new(%{}))
end
@doc """
Updates yt-dlp and saves the version to the settings.
This worker is scheduled to run via the Oban Cron plugin as well as on app boot.
Returns :ok
"""
@impl Oban.Worker
def perform(%Oban.Job{}) do
Logger.info("Updating yt-dlp")
yt_dlp_runner().update()
{:ok, yt_dlp_version} = yt_dlp_runner().version()
Settings.set(yt_dlp_version: yt_dlp_version)
:ok
end
defp yt_dlp_runner do
Application.get_env(:pinchflat, :yt_dlp_runner)
end
end
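For reference, a minimal way this worker could be enqueued manually and wired into Oban's cron plugin (the schedule is an assumption; the actual cron config is not part of this diff):

    # Manual kickoff, e.g. on app boot:
    {:ok, _job} = Pinchflat.YtDlp.UpdateWorker.kickoff()

    # Hypothetical cron entry:
    config :pinchflat, Oban,
      plugins: [{Oban.Plugins.Cron, crontab: [{"0 3 * * *", Pinchflat.YtDlp.UpdateWorker}]}]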


@ -9,4 +9,5 @@ defmodule Pinchflat.YtDlp.YtDlpCommandRunner do
@callback run(binary(), atom(), keyword(), binary()) :: {:ok, binary()} | {:error, binary(), integer()}
@callback run(binary(), atom(), keyword(), binary(), keyword()) :: {:ok, binary()} | {:error, binary(), integer()}
@callback version() :: {:ok, binary()} | {:error, binary()}
@callback update() :: {:ok, binary()} | {:error, binary()}
end


@ -43,7 +43,7 @@ defmodule PinchflatWeb do
layouts: [html: PinchflatWeb.Layouts]
import Plug.Conn
import PinchflatWeb.Gettext
use Gettext, backend: PinchflatWeb.Gettext
alias Pinchflat.Settings
alias PinchflatWeb.Layouts
@ -94,7 +94,7 @@ defmodule PinchflatWeb do
# HTML escaping functionality
import Phoenix.HTML
# Core UI components and translation
import PinchflatWeb.Gettext
use Gettext, backend: PinchflatWeb.Gettext
import PinchflatWeb.CoreComponents
import PinchflatWeb.CustomComponents.TabComponents
import PinchflatWeb.CustomComponents.TextComponents


@ -15,8 +15,7 @@ defmodule PinchflatWeb.CoreComponents do
Icons are provided by [heroicons](https://heroicons.com). See `icon/1` for usage.
"""
use Phoenix.Component, global_prefixes: ~w(x-)
import PinchflatWeb.Gettext
use Gettext, backend: PinchflatWeb.Gettext
alias Phoenix.LiveView.JS
alias PinchflatWeb.CustomComponents.TextComponents
@ -700,7 +699,7 @@ defmodule PinchflatWeb.CoreComponents do
attr :class, :string, default: nil
attr :rest, :global
def icon(%{name: "hero-" <> _} = assigns) do
def icon(assigns) do
~H"""
<span class={[@name, @class]} {@rest} />
"""


@ -3,6 +3,7 @@ defmodule PinchflatWeb.CustomComponents.ButtonComponents do
use Phoenix.Component, global_prefixes: ~w(x-)
alias PinchflatWeb.CoreComponents
alias PinchflatWeb.CustomComponents.TextComponents
@doc """
Render a button
@ -104,7 +105,7 @@ defmodule PinchflatWeb.CustomComponents.ButtonComponents do
def icon_button(assigns) do
~H"""
<div class="group relative inline-block">
<TextComponents.tooltip position="bottom" tooltip={@tooltip} tooltip_class="text-nowrap">
<button
class={[
"flex justify-center items-center rounded-lg ",
@ -117,18 +118,7 @@ defmodule PinchflatWeb.CustomComponents.ButtonComponents do
>
<CoreComponents.icon name={@icon_name} class="text-stroke" />
</button>
<div
:if={@tooltip}
class={[
"hidden absolute left-1/2 top-full z-20 mt-3 -translate-x-1/2 whitespace-nowrap rounded-md",
"px-4.5 py-1.5 text-sm font-medium opacity-0 drop-shadow-4 group-hover:opacity-100 group-hover:block bg-meta-4"
]}
>
<span class="border-light absolute -top-1 left-1/2 -z-10 h-2 w-2 -translate-x-1/2 rotate-45 rounded-sm bg-meta-4">
</span>
<span>{@tooltip}</span>
</div>
</div>
</TextComponents.tooltip>
"""
end
end


@ -41,7 +41,7 @@ defmodule PinchflatWeb.CustomComponents.TabComponents do
{render_slot(@tab_append)}
</div>
</header>
<div class="mt-4 min-h-60">
<div class="mt-4 min-h-60 overflow-x-auto">
<div :for={tab <- @tab} x-show={"openTab === '#{tab.id}'"} class="font-medium leading-relaxed">
{render_slot(tab)}
</div>


@ -71,19 +71,39 @@ defmodule PinchflatWeb.CustomComponents.TextComponents do
formatted_text =
Regex.split(~r{https?://\S+}, assigns.text, include_captures: true)
|> Enum.map(fn
"http" <> _ = url ->
Phoenix.HTML.Tag.content_tag(:a, url, class: "text-blue-500 hover:text-blue-300", href: url, target: "_blank")
text ->
text
|> String.split("\n", trim: false)
|> Enum.intersperse(Phoenix.HTML.Tag.tag(:span, class: "inline-block mt-2"))
"http" <> _ = url -> {:url, url}
text -> Regex.split(~r{\n}, text, include_captures: true, trim: true)
end)
assigns = Map.put(assigns, :text, formatted_text)
~H"""
<span>{@text}</span>
<span>
<.rendered_description_line :for={line <- @text} content={line} />
</span>
"""
end
defp rendered_description_line(%{content: {:url, url}} = assigns) do
assigns = Map.put(assigns, :url, url)
~H"""
<a href={@url} target="_blank" class="text-blue-500 hover:text-blue-300">
{@url}
</a>
"""
end
defp rendered_description_line(%{content: list_of_content} = assigns) do
assigns = Map.put(assigns, :list_of_content, list_of_content)
~H"""
<span
:for={inner_content <- @list_of_content}
class={[if(inner_content == "\n", do: "block", else: "mt-2 inline-block")]}
>
{inner_content}
</span>
"""
end
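# For a description like "New video!\nWatch: https://example.com" (illustrative),
# the splitter above yields roughly
#   [["New video!", "\n", "Watch: "], {:url, "https://example.com"}, []]
# which renders as inline spans plus an anchor tag.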
@ -146,4 +166,60 @@ defmodule PinchflatWeb.CustomComponents.TextComponents do
<.localized_number number={@num} /> {@suffix}
"""
end
@doc """
Renders a tooltip with the given content
"""
attr :tooltip, :string, required: true
attr :position, :string, default: ""
attr :tooltip_class, :any, default: ""
attr :tooltip_arrow_class, :any, default: ""
slot :inner_block
def tooltip(%{position: "bottom-right"} = assigns) do
~H"""
<.tooltip tooltip={@tooltip} tooltip_class={@tooltip_class} tooltip_arrow_class={["-top-1", @tooltip_arrow_class]}>
{render_slot(@inner_block)}
</.tooltip>
"""
end
def tooltip(%{position: "bottom"} = assigns) do
~H"""
<.tooltip
tooltip={@tooltip}
tooltip_class={["left-1/2 -translate-x-1/2", @tooltip_class]}
tooltip_arrow_class={["-top-1 left-1/2 -translate-x-1/2", @tooltip_arrow_class]}
>
{render_slot(@inner_block)}
</.tooltip>
"""
end
def tooltip(assigns) do
~H"""
<div class="group relative inline-block cursor-pointer">
<div>
{render_slot(@inner_block)}
</div>
<div
:if={@tooltip}
class={[
"hidden absolute top-full z-20 mt-3 whitespace-nowrap rounded-md",
"p-1.5 text-sm font-medium opacity-0 drop-shadow-4 group-hover:opacity-100 group-hover:block bg-meta-4",
"border border-form-strokedark text-wrap",
@tooltip_class
]}
>
<span class={[
"border-t border-l border-form-strokedark absolute -z-10 h-2 w-2 rotate-45 rounded-sm bg-meta-4",
@tooltip_arrow_class
]}>
</span>
<div class="px-3">{@tooltip}</div>
</div>
</div>
"""
end
end
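A minimal call sketch for the new tooltip component (attribute values are illustrative; the same pattern appears in the table templates further down):

    <.tooltip tooltip={media_item.last_error} position="bottom-right" tooltip_class="w-64">
      <.icon name="hero-exclamation-circle-solid" class="text-red-500" />
    </.tooltip>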


@ -15,11 +15,12 @@ defmodule PinchflatWeb.Layouts do
attr :text, :string, required: true
attr :href, :any, required: true
attr :target, :any, default: "_self"
attr :icon_class, :string, default: ""
def sidebar_item(assigns) do
~H"""
<li class="text-bodydark1">
<.sidebar_link icon={@icon} text={@text} href={@href} target={@target} />
<.sidebar_link icon={@icon} text={@text} href={@href} target={@target} icon_class={@icon_class} />
</li>
"""
end
@ -89,6 +90,7 @@ defmodule PinchflatWeb.Layouts do
attr :href, :any, required: true
attr :target, :any, default: "_self"
attr :class, :string, default: ""
attr :icon_class, :string, default: ""
def sidebar_link(assigns) do
~H"""
@ -103,7 +105,7 @@ defmodule PinchflatWeb.Layouts do
@class
]}
>
<.icon :if={@icon} name={@icon} /> {@text}
<.icon :if={@icon} name={@icon} class={@icon_class} /> {@text}
</.link>
"""
end


@ -8,7 +8,7 @@
>
<section>
<div class="flex items-center justify-between gap-2 px-6 py-4">
<a href="/" class="flex items-center">
<a href={~p"/"} class="flex items-center">
<img src={~p"/images/logo-2024-03-20.png"} alt="Pinchflat" class="w-auto" />
</a>
@ -47,8 +47,10 @@
text="Docs"
target="_blank"
href="https://github.com/kieraneglin/pinchflat/wiki"
icon_class="scale-110"
/>
<.sidebar_item icon="hero-cog" text="Github" target="_blank" href="https://github.com/kieraneglin/pinchflat" />
<.sidebar_item icon="si-github" text="Github" target="_blank" href="https://github.com/kieraneglin/pinchflat" />
<.sidebar_item icon="si-discord" text="Discord" target="_blank" href="https://discord.gg/j7T6dCuwU4" />
<li>
<span
class={[
@ -59,7 +61,7 @@
]}
phx-click={show_modal("donate-modal")}
>
<.icon name="hero-currency-dollar" /> Donate
<.icon name="hero-currency-dollar" class="scale-110" /> Donate
</span>
</li>
<li>


@ -16,7 +16,7 @@
</.link>
</nav>
</div>
<div class="rounded-sm border border-stroke bg-white py-5 pt-6 shadow-default dark:border-strokedark dark:bg-boxdark px-7.5">
<div class="rounded-sm border py-5 pt-6 shadow-default border-strokedark bg-boxdark px-7.5">
<div class="max-w-full">
<.tabbed_layout>
<:tab_append>
@ -24,7 +24,15 @@
</:tab_append>
<:tab title="Media" id="media">
<div class="flex flex-col gap-10 dark:text-white">
<div class="flex flex-col gap-10 text-white">
<section :if={@media_item.last_error} class="mt-6">
<div class="flex items-center gap-1 mb-2">
<.icon name="hero-exclamation-circle-solid" class="text-red-500" />
<h3 class="font-bold text-xl">Last Error</h3>
</div>
<span>{@media_item.last_error}</span>
</section>
<%= if media_file_exists?(@media_item) do %>
<section class="grid grid-cols-1 xl:gap-6 mt-6">
<div>
@ -54,19 +62,21 @@
</section>
<% end %>
<h3 class="font-bold text-xl mt-6">Raw Attributes</h3>
<section>
<strong>Source:</strong>
<.subtle_link href={~p"/sources/#{@media_item.source_id}"}>
{@media_item.source.custom_name}
</.subtle_link>
<.list_items_from_map map={Map.from_struct(@media_item)} />
<h3 class="font-bold text-xl mb-2">Raw Attributes</h3>
<section>
<strong>Source:</strong>
<.subtle_link href={~p"/sources/#{@media_item.source_id}"}>
{@media_item.source.custom_name}
</.subtle_link>
<.list_items_from_map map={Map.from_struct(@media_item)} />
</section>
</section>
</div>
</:tab>
<:tab title="Tasks" id="tasks">
<%= if match?([_|_], @media_item.tasks) do %>
<.table rows={@media_item.tasks} table_class="text-black dark:text-white">
<.table rows={@media_item.tasks} table_class="text-white">
<:col :let={task} label="Worker">
{task.job.worker}
</:col>


@ -100,7 +100,7 @@
field={f[:embed_subs]}
type="toggle"
label="Embed Subtitles"
help="Downloads and embeds subtitles in the media file itself, if supported. Uneffected by 'Download Subtitles'"
help="Downloads and embeds subtitles in the media file itself, if supported. Unaffected by 'Download Subtitles'"
x-init="$watch('selectedPreset', p => p && (enabled = presets[p]))"
/>
</section>
@ -154,7 +154,7 @@
field={f[:embed_thumbnail]}
type="toggle"
label="Embed Thumbnail"
help="Downloads and embeds thumbnail in the media file itself, if supported. Uneffected by 'Download Thumbnail' (recommended)"
help="Downloads and embeds thumbnail in the media file itself, if supported. Unaffected by 'Download Thumbnail' (recommended)"
x-init="$watch('selectedPreset', p => p && (enabled = presets[p]))"
/>
</section>
@ -178,7 +178,7 @@
field={f[:embed_metadata]}
type="toggle"
label="Embed Metadata"
help="Downloads and embeds metadata in the media file itself, if supported. Uneffected by 'Download Metadata' (recommended)"
help="Downloads and embeds metadata in the media file itself, if supported. Unaffected by 'Download Metadata' (recommended)"
x-init="$watch('selectedPreset', p => p && (enabled = presets[p]))"
/>
</section>


@ -25,8 +25,10 @@
<:tab title="Media Profile" id="media-profile">
<div class="flex flex-col gap-10 text-white">
<h3 class="font-bold text-xl mt-6">Raw Attributes</h3>
<.list_items_from_map map={Map.from_struct(@media_profile)} />
<section>
<h3 class="font-bold text-xl mt-6 mb-2">Raw Attributes</h3>
<.list_items_from_map map={Map.from_struct(@media_profile)} />
</section>
</div>
</:tab>
<:tab title="Sources" id="sources">


@ -28,10 +28,22 @@ defmodule Pinchflat.Pages.HistoryTableLive do
</span>
<div class="max-w-full overflow-x-auto">
<.table rows={@records} table_class="text-white">
<:col :let={media_item} label="Title" class="truncate max-w-xs">
<.subtle_link href={~p"/sources/#{media_item.source_id}/media/#{media_item}"}>
{media_item.title}
</.subtle_link>
<:col :let={media_item} label="Title" class="max-w-xs">
<section class="flex items-center space-x-1">
<.tooltip
:if={media_item.last_error}
tooltip={media_item.last_error}
position="bottom-right"
tooltip_class="w-64"
>
<.icon name="hero-exclamation-circle-solid" class="text-red-500" />
</.tooltip>
<span class="truncate">
<.subtle_link href={~p"/sources/#{media_item.source_id}/media/#{media_item.id}"}>
{media_item.title}
</.subtle_link>
</span>
</section>
</:col>
<:col :let={media_item} label="Upload Date">
{DateTime.to_date(media_item.uploaded_at)}


@ -56,12 +56,8 @@
session: %{"media_state" => "pending"}
)}
</:tab>
<:tab title="Active Tasks" id="active-tasks">
{live_render(@conn, Pinchflat.Pages.JobTableLive)}
</:tab>
</.tabbed_layout>
</div>
<div class="rounded-sm border shadow-default border-strokedark bg-boxdark mt-4 p-5">
<span class="text-2xl font-medium mb-4">Active Tasks</span>
<section class="mt-6 min-h-80">
{live_render(@conn, Pinchflat.Pages.JobTableLive)}
</section>
</div>


@ -29,18 +29,40 @@
<section class="mt-8">
<section>
<h3 class="text-2xl text-black dark:text-white">
Indexing Settings
Extractor Settings
</h3>
<.input
field={f[:youtube_api_key]}
placeholder="ABC123"
placeholder="ABC123,DEF456"
type="text"
label="YouTube API Key"
label="YouTube API Key(s)"
help={youtube_api_help()}
html_help={true}
inputclass="font-mono text-sm mr-4"
/>
<.input
field={f[:extractor_sleep_interval_seconds]}
placeholder="0"
type="number"
label="Sleep Interval (seconds)"
help="Sleep interval in seconds between each extractor request. Must be a positive whole number. Set to 0 to disable"
/>
<.input
field={f[:download_throughput_limit]}
placeholder="4.2M"
label="Download Throughput"
help="Sets the max bytes-per-second throughput when downloading media. Examples: '50K' or '4.2M'. Leave blank to disable"
/>
<.input
field={f[:restrict_filenames]}
type="toggle"
label="Restrict Filenames"
help="Restrict filenames to only ASCII characters and avoid ampersands/spaces in filenames"
/>
</section>
</section>


@ -27,6 +27,14 @@ defmodule PinchflatWeb.Sources.SourceHTML do
]
end
def friendly_cookie_behaviours do
[
{"Disabled", :disabled},
{"When Needed", :when_needed},
{"All Operations", :all_operations}
]
end
def cutoff_date_presets do
[
{"7 days", compute_date_offset(7)},


@ -23,7 +23,7 @@ defmodule PinchflatWeb.Sources.MediaItemTableLive do
<header class="flex justify-between items-center mb-4">
<span class="flex items-center">
<.icon_button icon_name="hero-arrow-path" class="h-10 w-10" phx-click="reload_page" tooltip="Refresh" />
<span class="ml-2">
<span class="mx-2">
Showing <.localized_number number={length(@records)} /> of <.localized_number number={@filtered_record_count} />
</span>
</span>
@ -46,10 +46,22 @@ defmodule PinchflatWeb.Sources.MediaItemTableLive do
</div>
</header>
<.table rows={@records} table_class="text-white">
<:col :let={media_item} label="Title" class="truncate max-w-xs">
<.subtle_link href={~p"/sources/#{@source.id}/media/#{media_item.id}"}>
{media_item.title}
</.subtle_link>
<:col :let={media_item} label="Title" class="max-w-xs">
<section class="flex items-center space-x-1">
<.tooltip
:if={media_item.last_error}
tooltip={media_item.last_error}
position="bottom-right"
tooltip_class="w-64"
>
<.icon name="hero-exclamation-circle-solid" class="text-red-500" />
</.tooltip>
<span class="truncate">
<.subtle_link href={~p"/sources/#{@source.id}/media/#{media_item.id}"}>
{media_item.title}
</.subtle_link>
</span>
</section>
</:col>
<:col :let={media_item} :if={@media_state == "other"} label="Manually Ignored?">
<.icon name={if media_item.prevent_download, do: "hero-check", else: "hero-x-mark"} />
@ -205,6 +217,6 @@ defmodule PinchflatWeb.Sources.MediaItemTableLive do
# Selecting only what we need GREATLY speeds up queries on large tables
defp select_fields do
[:id, :title, :uploaded_at, :prevent_download]
[:id, :title, :uploaded_at, :prevent_download, :last_error]
end
end


@ -24,16 +24,18 @@
</:tab_append>
<:tab title="Source" id="source">
<div class="flex flex-col gap-10 text-white">
<h3 class="font-bold text-xl mt-6">Raw Attributes</h3>
<div class="flex flex-col text-white gap-10">
<section>
<strong>Media Profile:</strong>
<.subtle_link href={~p"/media_profiles/#{@source.media_profile_id}"}>
{@source.media_profile.name}
</.subtle_link>
</section>
<h3 class="font-bold text-xl mb-2 mt-6">Raw Attributes</h3>
<section>
<strong>Media Profile:</strong>
<.subtle_link href={~p"/media_profiles/#{@source.media_profile_id}"}>
{@source.media_profile.name}
</.subtle_link>
</section>
<.list_items_from_map map={Map.from_struct(@source)} />
<.list_items_from_map map={Map.from_struct(@source)} />
</section>
</div>
</:tab>
<:tab title="Pending" id="pending">


@ -87,10 +87,11 @@
/>
<.input
field={f[:use_cookies]}
type="toggle"
label="Use Cookies for Downloading"
help="Uses your YouTube cookies for this source (if configured). Used for downloading private playlists and videos. See docs for important details"
field={f[:cookie_behaviour]}
options={friendly_cookie_behaviours()}
type="select"
label="Cookie Behaviour"
help="Uses your YouTube cookies for this source (if configured). 'When Needed' tries to minimize cookie usage except for certain indexing and downloading tasks. See docs"
/>
<section x-show="advancedMode">


@ -54,8 +54,8 @@ defmodule PinchflatWeb.Sources.SourceLive.IndexTableLive do
defp sort_attr(:pending_count), do: dynamic([s, mp, dl, pe], pe.pending_count)
defp sort_attr(:downloaded_count), do: dynamic([s, mp, dl], dl.downloaded_count)
defp sort_attr(:media_size_bytes), do: dynamic([s, mp, dl], dl.media_size_bytes)
defp sort_attr(:media_profile_name), do: dynamic([s, mp], mp.name)
defp sort_attr(:custom_name), do: dynamic([s], s.custom_name)
defp sort_attr(:media_profile_name), do: dynamic([s, mp], fragment("? COLLATE NOCASE", mp.name))
defp sort_attr(:custom_name), do: dynamic([s], fragment("? COLLATE NOCASE", s.custom_name))
defp sort_attr(:enabled), do: dynamic([s], s.enabled)
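# With SQLite's COLLATE NOCASE, sorting by custom_name now yields e.g.
# ["apple", "Banana", "cherry"] rather than ["Banana", "apple", "cherry"]
# (illustrative names).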
defp set_sources(%{assigns: assigns} = socket) do


@ -20,7 +20,7 @@ defmodule PinchflatWeb.Endpoint do
plug Plug.Static,
at: "/",
from: :pinchflat,
gzip: Mix.env() == :prod,
gzip: Application.compile_env(:pinchflat, :env) == :prod,
only: PinchflatWeb.static_paths()
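# For Application.compile_env(:pinchflat, :env) to resolve, config.exs would need
# something along these lines (an assumption; that config change is not shown here):
#
#   config :pinchflat, env: config_env()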
# Code reloading can be explicitly enabled under the
@ -32,12 +32,17 @@ defmodule PinchflatWeb.Endpoint do
plug Phoenix.Ecto.CheckRepoStatus, otp_app: :pinchflat
end
plug PromEx.Plug, prom_ex_module: Pinchflat.PromEx
plug Phoenix.LiveDashboard.RequestLogger,
param_key: "request_logger",
cookie_key: "request_logger"
plug Plug.RequestId
plug Plug.Telemetry, event_prefix: [:phoenix, :endpoint]
plug Plug.Telemetry,
event_prefix: [:phoenix, :endpoint],
log: {__MODULE__, :log_level, []}
plug Plug.Parsers,
parsers: [:urlencoded, :multipart, :json],
@ -53,6 +58,10 @@ defmodule PinchflatWeb.Endpoint do
plug PinchflatWeb.Router
# Disables logging in Plug.Telemetry for healthcheck requests
def log_level(%Plug.Conn{path_info: ["healthcheck"]}), do: false
def log_level(_), do: :info
# URLs need to be generated using the host of the current page being accessed
# for things like Podcast RSS feeds to contain links to the right location.
#


@ -5,7 +5,7 @@ defmodule PinchflatWeb.Gettext do
By using [Gettext](https://hexdocs.pm/gettext),
your module gains a set of macros for translations, for example:
import PinchflatWeb.Gettext
use Gettext, backend: PinchflatWeb.Gettext
# Simple translation
gettext("Here is the string to translate")
@ -20,5 +20,5 @@ defmodule PinchflatWeb.Gettext do
See the [Gettext Docs](https://hexdocs.pm/gettext) for detailed usage.
"""
use Gettext, otp_app: :pinchflat
use Gettext.Backend, otp_app: :pinchflat
end


@ -68,7 +68,7 @@ defmodule PinchflatWeb.Router do
scope "/", PinchflatWeb do
pipe_through :api
get "/healthcheck", HealthController, :check
get "/healthcheck", HealthController, :check, log: false
end
scope "/dev" do

mix.exs

@ -4,9 +4,10 @@ defmodule Pinchflat.MixProject do
def project do
[
app: :pinchflat,
version: "2025.1.3",
version: "2025.9.26",
elixir: "~> 1.17",
elixirc_paths: elixirc_paths(Mix.env()),
elixirc_options: [warnings_as_errors: System.get_env("EX_CHECK") == "1"],
start_permanent: Mix.env() == :prod,
aliases: aliases(),
deps: deps(),
@ -46,13 +47,13 @@ defmodule Pinchflat.MixProject do
# Type `mix help deps` for examples and options.
defp deps do
[
{:phoenix, "~> 1.7.14"},
{:phoenix, "~> 1.7.21"},
{:phoenix_ecto, "~> 4.4"},
{:ecto, "~> 3.12.3"},
{:ecto_sql, "~> 3.12"},
{:ecto_sqlite3, ">= 0.0.0"},
{:ecto_sqlite3_extras, "~> 1.2.0"},
{:phoenix_html, "~> 3.3"},
{:phoenix_html, "~> 4.2"},
{:phoenix_live_reload, "~> 1.2", only: :dev},
{:phoenix_live_view, "~> 1.0.0"},
{:floki, ">= 0.36.0", only: :test},
@ -60,20 +61,22 @@ defmodule Pinchflat.MixProject do
{:esbuild, "~> 0.8", runtime: Mix.env() == :dev},
{:tailwind, "~> 0.2.0", runtime: Mix.env() == :dev},
{:swoosh, "~> 1.3"},
{:finch, "~> 0.13"},
{:telemetry_metrics, "~> 0.6"},
{:telemetry_poller, "~> 1.0"},
{:finch, "~> 0.18"},
{:telemetry_metrics, "~> 1.0"},
{:telemetry_poller, "~> 1.1"},
{:gettext, "~> 0.20"},
{:jason, "~> 1.2"},
{:dns_cluster, "~> 0.1.1"},
{:dns_cluster, "~> 0.2"},
{:plug_cowboy, "~> 2.5"},
{:oban, "~> 2.17"},
{:nimble_parsec, "~> 1.4"},
{:timex, "~> 3.0"},
# See: https://github.com/bitwalker/timex/issues/778
{:timex, git: "https://github.com/bitwalker/timex.git", ref: "cc649c7a586f1266b17d57aff3c6eb1a56116ca2"},
{:prom_ex, "~> 1.11.0"},
{:mox, "~> 1.0", only: :test},
{:credo, "~> 1.7.7", only: [:dev, :test], runtime: false},
{:credo_naming, "~> 2.1", only: [:dev, :test], runtime: false},
{:ex_check, "~> 0.14.0", only: [:dev, :test], runtime: false},
{:ex_check, "~> 0.16.0", only: [:dev, :test], runtime: false},
{:faker, "~> 0.17", only: :test},
{:sobelow, "~> 0.13", only: [:dev, :test], runtime: false}
]

mix.lock

@ -1,68 +1,73 @@
%{
"bunt": {:hex, :bunt, "1.0.0", "081c2c665f086849e6d57900292b3a161727ab40431219529f13c4ddcf3e7a44", [:mix], [], "hexpm", "dc5f86aa08a5f6fa6b8096f0735c4e76d54ae5c9fa2c143e5a1fc7c1cd9bb6b5"},
"castore": {:hex, :castore, "1.0.10", "43bbeeac820f16c89f79721af1b3e092399b3a1ecc8df1a472738fd853574911", [:mix], [], "hexpm", "1b0b7ea14d889d9ea21202c43a4fa015eb913021cb535e8ed91946f4b77a8848"},
"castore": {:hex, :castore, "1.0.14", "4582dd7d630b48cf5e1ca8d3d42494db51e406b7ba704e81fbd401866366896a", [:mix], [], "hexpm", "7bc1b65249d31701393edaaac18ec8398d8974d52c647b7904d01b964137b9f4"},
"cc_precompiler": {:hex, :cc_precompiler, "0.1.10", "47c9c08d8869cf09b41da36538f62bc1abd3e19e41701c2cea2675b53c704258", [:mix], [{:elixir_make, "~> 0.7", [hex: :elixir_make, repo: "hexpm", optional: false]}], "hexpm", "f6e046254e53cd6b41c6bacd70ae728011aa82b2742a80d6e2214855c6e06b22"},
"certifi": {:hex, :certifi, "2.12.0", "2d1cca2ec95f59643862af91f001478c9863c2ac9cb6e2f89780bfd8de987329", [:rebar3], [], "hexpm", "ee68d85df22e554040cdb4be100f33873ac6051387baf6a8f6ce82272340ff1c"},
"certifi": {:hex, :certifi, "2.15.0", "0e6e882fcdaaa0a5a9f2b3db55b1394dba07e8d6d9bcad08318fb604c6839712", [:rebar3], [], "hexpm", "b147ed22ce71d72eafdad94f055165c1c182f61a2ff49df28bcc71d1d5b94a60"},
"combine": {:hex, :combine, "0.10.0", "eff8224eeb56498a2af13011d142c5e7997a80c8f5b97c499f84c841032e429f", [:mix], [], "hexpm", "1b1dbc1790073076580d0d1d64e42eae2366583e7aecd455d1215b0d16f2451b"},
"cowboy": {:hex, :cowboy, "2.12.0", "f276d521a1ff88b2b9b4c54d0e753da6c66dd7be6c9fca3d9418b561828a3731", [:make, :rebar3], [{:cowlib, "2.13.0", [hex: :cowlib, repo: "hexpm", optional: false]}, {:ranch, "1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "8a7abe6d183372ceb21caa2709bec928ab2b72e18a3911aa1771639bef82651e"},
"cowboy": {:hex, :cowboy, "2.13.0", "09d770dd5f6a22cc60c071f432cd7cb87776164527f205c5a6b0f24ff6b38990", [:make, :rebar3], [{:cowlib, ">= 2.14.0 and < 3.0.0", [hex: :cowlib, repo: "hexpm", optional: false]}, {:ranch, ">= 1.8.0 and < 3.0.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "e724d3a70995025d654c1992c7b11dbfea95205c047d86ff9bf1cda92ddc5614"},
"cowboy_telemetry": {:hex, :cowboy_telemetry, "0.4.0", "f239f68b588efa7707abce16a84d0d2acf3a0f50571f8bb7f56a15865aae820c", [:rebar3], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7d98bac1ee4565d31b62d59f8823dfd8356a169e7fcbb83831b8a5397404c9de"},
"cowlib": {:hex, :cowlib, "2.13.0", "db8f7505d8332d98ef50a3ef34b34c1afddec7506e4ee4dd4a3a266285d282ca", [:make, :rebar3], [], "hexpm", "e1e1284dc3fc030a64b1ad0d8382ae7e99da46c3246b815318a4b848873800a4"},
"credo": {:hex, :credo, "1.7.7", "771445037228f763f9b2afd612b6aa2fd8e28432a95dbbc60d8e03ce71ba4446", [:mix], [{:bunt, "~> 0.2.1 or ~> 1.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:file_system, "~> 0.2 or ~> 1.0", [hex: :file_system, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "8bc87496c9aaacdc3f90f01b7b0582467b69b4bd2441fe8aae3109d843cc2f2e"},
"cowlib": {:hex, :cowlib, "2.15.0", "3c97a318a933962d1c12b96ab7c1d728267d2c523c25a5b57b0f93392b6e9e25", [:make, :rebar3], [], "hexpm", "4f00c879a64b4fe7c8fcb42a4281925e9ffdb928820b03c3ad325a617e857532"},
"credo": {:hex, :credo, "1.7.12", "9e3c20463de4b5f3f23721527fcaf16722ec815e70ff6c60b86412c695d426c1", [:mix], [{:bunt, "~> 0.2.1 or ~> 1.0", [hex: :bunt, repo: "hexpm", optional: false]}, {:file_system, "~> 0.2 or ~> 1.0", [hex: :file_system, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "8493d45c656c5427d9c729235b99d498bd133421f3e0a683e5c1b561471291e5"},
"credo_naming": {:hex, :credo_naming, "2.1.0", "d44ad58890d4db552e141ce64756a74ac1573665af766d1ac64931aa90d47744", [:make, :mix], [{:credo, "~> 1.6", [hex: :credo, repo: "hexpm", optional: false]}], "hexpm", "830e23b3fba972e2fccec49c0c089fe78c1e64bc16782a2682d78082351a2909"},
"db_connection": {:hex, :db_connection, "2.7.0", "b99faa9291bb09892c7da373bb82cba59aefa9b36300f6145c5f201c7adf48ec", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "dcf08f31b2701f857dfc787fbad78223d61a32204f217f15e881dd93e4bdd3ff"},
"decimal": {:hex, :decimal, "2.2.0", "df3d06bb9517e302b1bd265c1e7f16cda51547ad9d99892049340841f3e15836", [:mix], [], "hexpm", "af8daf87384b51b7e611fb1a1f2c4d4876b65ef968fa8bd3adf44cff401c7f21"},
"dns_cluster": {:hex, :dns_cluster, "0.1.2", "3eb5be824c7888dadf9781018e1a5f1d3d1113b333c50bce90fb1b83df1015f2", [:mix], [], "hexpm", "7494272040f847637bbdb01bcdf4b871e82daf09b813e7d3cb3b84f112c6f2f8"},
"ecto": {:hex, :ecto, "3.12.3", "1a9111560731f6c3606924c81c870a68a34c819f6d4f03822f370ea31a582208", [:mix], [{:decimal, "~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "9efd91506ae722f95e48dc49e70d0cb632ede3b7a23896252a60a14ac6d59165"},
"ecto_sql": {:hex, :ecto_sql, "3.12.0", "73cea17edfa54bde76ee8561b30d29ea08f630959685006d9c6e7d1e59113b7d", [:mix], [{:db_connection, "~> 2.4.1 or ~> 2.5", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.12", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.7", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.19 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:tds, "~> 2.1.1 or ~> 2.2", [hex: :tds, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "dc9e4d206f274f3947e96142a8fdc5f69a2a6a9abb4649ef5c882323b6d512f0"},
"ecto_sqlite3": {:hex, :ecto_sqlite3, "0.17.2", "200226e057f76c40be55fbac77771eb1a233260ab8ec7283f5da6d9402bde8de", [:mix], [{:decimal, "~> 1.6 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:ecto, "~> 3.12", [hex: :ecto, repo: "hexpm", optional: false]}, {:ecto_sql, "~> 3.12", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:exqlite, "~> 0.22", [hex: :exqlite, repo: "hexpm", optional: false]}], "hexpm", "a3838919c5a34c268c28cafab87b910bcda354a9a4e778658da46c149bb2c1da"},
"decimal": {:hex, :decimal, "2.3.0", "3ad6255aa77b4a3c4f818171b12d237500e63525c2fd056699967a3e7ea20f62", [:mix], [], "hexpm", "a4d66355cb29cb47c3cf30e71329e58361cfcb37c34235ef3bf1d7bf3773aeac"},
"dns_cluster": {:hex, :dns_cluster, "0.2.0", "aa8eb46e3bd0326bd67b84790c561733b25c5ba2fe3c7e36f28e88f384ebcb33", [:mix], [], "hexpm", "ba6f1893411c69c01b9e8e8f772062535a4cf70f3f35bcc964a324078d8c8240"},
"ecto": {:hex, :ecto, "3.12.5", "4a312960ce612e17337e7cefcf9be45b95a3be6b36b6f94dfb3d8c361d631866", [:mix], [{:decimal, "~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "6eb18e80bef8bb57e17f5a7f068a1719fbda384d40fc37acb8eb8aeca493b6ea"},
"ecto_sql": {:hex, :ecto_sql, "3.12.1", "c0d0d60e85d9ff4631f12bafa454bc392ce8b9ec83531a412c12a0d415a3a4d0", [:mix], [{:db_connection, "~> 2.4.1 or ~> 2.5", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.12", [hex: :ecto, repo: "hexpm", optional: false]}, {:myxql, "~> 0.7", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.19 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:tds, "~> 2.1.1 or ~> 2.2", [hex: :tds, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "aff5b958a899762c5f09028c847569f7dfb9cc9d63bdb8133bff8a5546de6bf5"},
"ecto_sqlite3": {:hex, :ecto_sqlite3, "0.19.0", "00030bbaba150369ff3754bbc0d2c28858e8f528ae406bf6997d1772d3a03203", [:mix], [{:decimal, "~> 1.6 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: false]}, {:ecto, "~> 3.12", [hex: :ecto, repo: "hexpm", optional: false]}, {:ecto_sql, "~> 3.12", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:exqlite, "~> 0.22", [hex: :exqlite, repo: "hexpm", optional: false]}], "hexpm", "297b16750fe229f3056fe32afd3247de308094e8b0298aef0d73a8493ce97c81"},
"ecto_sqlite3_extras": {:hex, :ecto_sqlite3_extras, "1.2.2", "36e60b561a11441d15f26c791817999269fb578b985162207ebb08b04ca71e40", [:mix], [{:exqlite, ">= 0.13.2", [hex: :exqlite, repo: "hexpm", optional: false]}, {:table_rex, "~> 4.0", [hex: :table_rex, repo: "hexpm", optional: false]}], "hexpm", "2b66ba7246bb4f7e39e2578acd4a0e4e4be54f60ff52d450a01be95eeb78ff1e"},
"elixir_make": {:hex, :elixir_make, "0.8.4", "4960a03ce79081dee8fe119d80ad372c4e7badb84c493cc75983f9d3bc8bde0f", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:certifi, "~> 2.0", [hex: :certifi, repo: "hexpm", optional: true]}], "hexpm", "6e7f1d619b5f61dfabd0a20aa268e575572b542ac31723293a4c1a567d5ef040"},
"esbuild": {:hex, :esbuild, "0.8.1", "0cbf919f0eccb136d2eeef0df49c4acf55336de864e63594adcea3814f3edf41", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.4", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "25fc876a67c13cb0a776e7b5d7974851556baeda2085296c14ab48555ea7560f"},
"ex_check": {:hex, :ex_check, "0.14.0", "d6fbe0bcc51cf38fea276f5bc2af0c9ae0a2bb059f602f8de88709421dae4f0e", [:mix], [], "hexpm", "8a602e98c66e6a4be3a639321f1f545292042f290f91fa942a285888c6868af0"},
"expo": {:hex, :expo, "0.5.1", "249e826a897cac48f591deba863b26c16682b43711dd15ee86b92f25eafd96d9", [:mix], [], "hexpm", "68a4233b0658a3d12ee00d27d37d856b1ba48607e7ce20fd376958d0ba6ce92b"},
"exqlite": {:hex, :exqlite, "0.23.0", "6e851c937a033299d0784994c66da24845415072adbc455a337e20087bce9033", [:make, :mix], [{:cc_precompiler, "~> 0.1", [hex: :cc_precompiler, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:elixir_make, "~> 0.8", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "404341cceec5e6466aaed160cf0b58be2019b60af82588c215e1224ebd3ec831"},
"faker": {:hex, :faker, "0.17.0", "671019d0652f63aefd8723b72167ecdb284baf7d47ad3a82a15e9b8a6df5d1fa", [:mix], [], "hexpm", "a7d4ad84a93fd25c5f5303510753789fc2433ff241bf3b4144d3f6f291658a6a"},
"file_system": {:hex, :file_system, "0.2.10", "fb082005a9cd1711c05b5248710f8826b02d7d1784e7c3451f9c1231d4fc162d", [:mix], [], "hexpm", "41195edbfb562a593726eda3b3e8b103a309b733ad25f3d642ba49696bf715dc"},
"finch": {:hex, :finch, "0.17.0", "17d06e1d44d891d20dbd437335eebe844e2426a0cd7e3a3e220b461127c73f70", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: false]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.3", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.6 or ~> 1.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "8d014a661bb6a437263d4b5abf0bcbd3cf0deb26b1e8596f2a271d22e48934c7"},
"floki": {:hex, :floki, "0.36.3", "1102f93b16a55bc5383b85ae3ec470f82dee056eaeff9195e8afdf0ef2a43c30", [:mix], [], "hexpm", "fe0158bff509e407735f6d40b3ee0d7deb47f3f3ee7c6c182ad28599f9f6b27a"},
"gettext": {:hex, :gettext, "0.24.0", "6f4d90ac5f3111673cbefc4ebee96fe5f37a114861ab8c7b7d5b30a1108ce6d8", [:mix], [{:expo, "~> 0.5.1", [hex: :expo, repo: "hexpm", optional: false]}], "hexpm", "bdf75cdfcbe9e4622dd18e034b227d77dd17f0f133853a1c73b97b3d6c770e8b"},
"hackney": {:hex, :hackney, "1.20.1", "8d97aec62ddddd757d128bfd1df6c5861093419f8f7a4223823537bad5d064e2", [:rebar3], [{:certifi, "~> 2.12.0", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "~> 6.1.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "~> 1.0.0", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~> 1.1", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.4.1", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~> 1.1.0", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}, {:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "fe9094e5f1a2a2c0a7d10918fee36bfec0ec2a979994cff8cfe8058cd9af38e3"},
"hpax": {:hex, :hpax, "0.1.2", "09a75600d9d8bbd064cdd741f21fc06fc1f4cf3d0fcc335e5aa19be1a7235c84", [:mix], [], "hexpm", "2c87843d5a23f5f16748ebe77969880e29809580efdaccd615cd3bed628a8c13"},
"elixir_make": {:hex, :elixir_make, "0.9.0", "6484b3cd8c0cee58f09f05ecaf1a140a8c97670671a6a0e7ab4dc326c3109726", [:mix], [], "hexpm", "db23d4fd8b757462ad02f8aa73431a426fe6671c80b200d9710caf3d1dd0ffdb"},
"esbuild": {:hex, :esbuild, "0.10.0", "b0aa3388a1c23e727c5a3e7427c932d89ee791746b0081bbe56103e9ef3d291f", [:mix], [{:jason, "~> 1.4", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "468489cda427b974a7cc9f03ace55368a83e1a7be12fba7e30969af78e5f8c70"},
"ex_check": {:hex, :ex_check, "0.16.0", "07615bef493c5b8d12d5119de3914274277299c6483989e52b0f6b8358a26b5f", [:mix], [], "hexpm", "4d809b72a18d405514dda4809257d8e665ae7cf37a7aee3be6b74a34dec310f5"},
"expo": {:hex, :expo, "1.1.0", "f7b9ed7fb5745ebe1eeedf3d6f29226c5dd52897ac67c0f8af62a07e661e5c75", [:mix], [], "hexpm", "fbadf93f4700fb44c331362177bdca9eeb8097e8b0ef525c9cc501cb9917c960"},
"exqlite": {:hex, :exqlite, "0.31.0", "bdf87c618861147382cee29eb8bd91d8cfb0949f89238b353d24fa331527a33a", [:make, :mix], [{:cc_precompiler, "~> 0.1", [hex: :cc_precompiler, repo: "hexpm", optional: false]}, {:db_connection, "~> 2.1", [hex: :db_connection, repo: "hexpm", optional: false]}, {:elixir_make, "~> 0.8", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:table, "~> 0.1.0", [hex: :table, repo: "hexpm", optional: true]}], "hexpm", "df352de99ba4ce1bac2ad4943d09dbe9ad59e0e7ace55917b493ae289c78fc75"},
"faker": {:hex, :faker, "0.18.0", "943e479319a22ea4e8e39e8e076b81c02827d9302f3d32726c5bf82f430e6e14", [:mix], [], "hexpm", "bfbdd83958d78e2788e99ec9317c4816e651ad05e24cfd1196ce5db5b3e81797"},
"file_system": {:hex, :file_system, "1.1.0", "08d232062284546c6c34426997dd7ef6ec9f8bbd090eb91780283c9016840e8f", [:mix], [], "hexpm", "bfcf81244f416871f2a2e15c1b515287faa5db9c6bcf290222206d120b3d43f6"},
"finch": {:hex, :finch, "0.19.0", "c644641491ea854fc5c1bbaef36bfc764e3f08e7185e1f084e35e0672241b76d", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.6.2 or ~> 1.7", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4 or ~> 1.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 1.1", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "fc5324ce209125d1e2fa0fcd2634601c52a787aff1cd33ee833664a5af4ea2b6"},
"floki": {:hex, :floki, "0.37.1", "d7aaee758c8a5b4a7495799a4260754fec5530d95b9c383c03b27359dea117cf", [:mix], [], "hexpm", "673d040cb594d31318d514590246b6dd587ed341d3b67e17c1c0eb8ce7ca6f04"},
"gettext": {:hex, :gettext, "0.26.2", "5978aa7b21fada6deabf1f6341ddba50bc69c999e812211903b169799208f2a8", [:mix], [{:expo, "~> 0.5.1 or ~> 1.0", [hex: :expo, repo: "hexpm", optional: false]}], "hexpm", "aa978504bcf76511efdc22d580ba08e2279caab1066b76bb9aa81c4a1e0a32a5"},
"hackney": {:hex, :hackney, "1.24.1", "f5205a125bba6ed4587f9db3cc7c729d11316fa8f215d3e57ed1c067a9703fa9", [:rebar3], [{:certifi, "~> 2.15.0", [hex: :certifi, repo: "hexpm", optional: false]}, {:idna, "~> 6.1.0", [hex: :idna, repo: "hexpm", optional: false]}, {:metrics, "~> 1.0.0", [hex: :metrics, repo: "hexpm", optional: false]}, {:mimerl, "~> 1.4", [hex: :mimerl, repo: "hexpm", optional: false]}, {:parse_trans, "3.4.1", [hex: :parse_trans, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~> 1.1.0", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}, {:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "f4a7392a0b53d8bbc3eb855bdcc919cd677358e65b2afd3840b5b3690c4c8a39"},
"hpax": {:hex, :hpax, "1.0.3", "ed67ef51ad4df91e75cc6a1494f851850c0bd98ebc0be6e81b026e765ee535aa", [:mix], [], "hexpm", "8eab6e1cfa8d5918c2ce4ba43588e894af35dbd8e91e6e55c817bca5847df34a"},
"idna": {:hex, :idna, "6.1.1", "8a63070e9f7d0c62eb9d9fcb360a7de382448200fbbd1b106cc96d3d8099df8d", [:rebar3], [{:unicode_util_compat, "~> 0.7.0", [hex: :unicode_util_compat, repo: "hexpm", optional: false]}], "hexpm", "92376eb7894412ed19ac475e4a86f7b413c1b9fbb5bd16dccd57934157944cea"},
"jason": {:hex, :jason, "1.4.4", "b9226785a9aa77b6857ca22832cffa5d5011a667207eb2a0ad56adb5db443b8a", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "c5eb0cab91f094599f94d55bc63409236a8ec69a21a67814529e8d5f6cc90b3b"},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm", "69b09adddc4f74a40716ae54d140f93beb0fb8978d8636eaded0c31b6f099f16"},
"mime": {:hex, :mime, "2.0.6", "8f18486773d9b15f95f4f4f1e39b710045fa1de891fada4516559967276e4dc2", [:mix], [], "hexpm", "c9945363a6b26d747389aac3643f8e0e09d30499a138ad64fe8fd1d13d9b153e"},
"mimerl": {:hex, :mimerl, "1.3.0", "d0cd9fc04b9061f82490f6581e0128379830e78535e017f7780f37fea7545726", [:rebar3], [], "hexpm", "a1e15a50d1887217de95f0b9b0793e32853f7c258a5cd227650889b38839fe9d"},
"mint": {:hex, :mint, "1.5.2", "4805e059f96028948870d23d7783613b7e6b0e2fb4e98d720383852a760067fd", [:mix], [{:castore, "~> 0.1.0 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:hpax, "~> 0.1.1", [hex: :hpax, repo: "hexpm", optional: false]}], "hexpm", "d77d9e9ce4eb35941907f1d3df38d8f750c357865353e21d335bdcdf6d892a02"},
"mox": {:hex, :mox, "1.1.0", "0f5e399649ce9ab7602f72e718305c0f9cdc351190f72844599545e4996af73c", [:mix], [], "hexpm", "d44474c50be02d5b72131070281a5d3895c0e7a95c780e90bc0cfe712f633a13"},
"nimble_options": {:hex, :nimble_options, "1.1.0", "3b31a57ede9cb1502071fade751ab0c7b8dbe75a9a4c2b5bbb0943a690b63172", [:mix], [], "hexpm", "8bbbb3941af3ca9acc7835f5655ea062111c9c27bcac53e004460dfd19008a99"},
"nimble_parsec": {:hex, :nimble_parsec, "1.4.0", "51f9b613ea62cfa97b25ccc2c1b4216e81df970acd8e16e8d1bdc58fef21370d", [:mix], [], "hexpm", "9c565862810fb383e9838c1dd2d7d2c437b3d13b267414ba6af33e50d2d1cf28"},
"nimble_pool": {:hex, :nimble_pool, "1.0.0", "5eb82705d138f4dd4423f69ceb19ac667b3b492ae570c9f5c900bb3d2f50a847", [:mix], [], "hexpm", "80be3b882d2d351882256087078e1b1952a28bf98d0a287be87e4a24a710b67a"},
"oban": {:hex, :oban, "2.18.3", "1608c04f8856c108555c379f2f56bc0759149d35fa9d3b825cb8a6769f8ae926", [:mix], [{:ecto_sql, "~> 3.10", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:ecto_sqlite3, "~> 0.9", [hex: :ecto_sqlite3, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "36ca6ca84ef6518f9c2c759ea88efd438a3c81d667ba23b02b062a0aa785475e"},
"mime": {:hex, :mime, "2.0.7", "b8d739037be7cd402aee1ba0306edfdef982687ee7e9859bee6198c1e7e2f128", [:mix], [], "hexpm", "6171188e399ee16023ffc5b76ce445eb6d9672e2e241d2df6050f3c771e80ccd"},
"mimerl": {:hex, :mimerl, "1.4.0", "3882a5ca67fbbe7117ba8947f27643557adec38fa2307490c4c4207624cb213b", [:rebar3], [], "hexpm", "13af15f9f68c65884ecca3a3891d50a7b57d82152792f3e19d88650aa126b144"},
"mint": {:hex, :mint, "1.7.1", "113fdb2b2f3b59e47c7955971854641c61f378549d73e829e1768de90fc1abf1", [:mix], [{:castore, "~> 0.1.0 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:hpax, "~> 0.1.1 or ~> 0.2.0 or ~> 1.0", [hex: :hpax, repo: "hexpm", optional: false]}], "hexpm", "fceba0a4d0f24301ddee3024ae116df1c3f4bb7a563a731f45fdfeb9d39a231b"},
"mox": {:hex, :mox, "1.2.0", "a2cd96b4b80a3883e3100a221e8adc1b98e4c3a332a8fc434c39526babafd5b3", [:mix], [{:nimble_ownership, "~> 1.0", [hex: :nimble_ownership, repo: "hexpm", optional: false]}], "hexpm", "c7b92b3cc69ee24a7eeeaf944cd7be22013c52fcb580c1f33f50845ec821089a"},
"nimble_options": {:hex, :nimble_options, "1.1.1", "e3a492d54d85fc3fd7c5baf411d9d2852922f66e69476317787a7b2bb000a61b", [:mix], [], "hexpm", "821b2470ca9442c4b6984882fe9bb0389371b8ddec4d45a9504f00a66f650b44"},
"nimble_ownership": {:hex, :nimble_ownership, "1.0.1", "f69fae0cdd451b1614364013544e66e4f5d25f36a2056a9698b793305c5aa3a6", [:mix], [], "hexpm", "3825e461025464f519f3f3e4a1f9b68c47dc151369611629ad08b636b73bb22d"},
"nimble_parsec": {:hex, :nimble_parsec, "1.4.2", "8efba0122db06df95bfaa78f791344a89352ba04baedd3849593bfce4d0dc1c6", [:mix], [], "hexpm", "4b21398942dda052b403bbe1da991ccd03a053668d147d53fb8c4e0efe09c973"},
"nimble_pool": {:hex, :nimble_pool, "1.1.0", "bf9c29fbdcba3564a8b800d1eeb5a3c58f36e1e11d7b7fb2e084a643f645f06b", [:mix], [], "hexpm", "af2e4e6b34197db81f7aad230c1118eac993acc0dae6bc83bac0126d4ae0813a"},
"oban": {:hex, :oban, "2.19.4", "045adb10db1161dceb75c254782f97cdc6596e7044af456a59decb6d06da73c1", [:mix], [{:ecto_sql, "~> 3.10", [hex: :ecto_sql, repo: "hexpm", optional: false]}, {:ecto_sqlite3, "~> 0.9", [hex: :ecto_sqlite3, repo: "hexpm", optional: true]}, {:igniter, "~> 0.5", [hex: :igniter, repo: "hexpm", optional: true]}, {:jason, "~> 1.1", [hex: :jason, repo: "hexpm", optional: true]}, {:myxql, "~> 0.7", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.16", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "5fcc6219e6464525b808d97add17896e724131f498444a292071bf8991c99f97"},
"octo_fetch": {:hex, :octo_fetch, "0.4.0", "074b5ecbc08be10b05b27e9db08bc20a3060142769436242702931c418695b19", [:mix], [{:castore, "~> 0.1 or ~> 1.0", [hex: :castore, repo: "hexpm", optional: false]}, {:ssl_verify_fun, "~> 1.1", [hex: :ssl_verify_fun, repo: "hexpm", optional: false]}], "hexpm", "cf8be6f40cd519d7000bb4e84adcf661c32e59369ca2827c4e20042eda7a7fc6"},
"parse_trans": {:hex, :parse_trans, "3.4.1", "6e6aa8167cb44cc8f39441d05193be6e6f4e7c2946cb2759f015f8c56b76e5ff", [:rebar3], [], "hexpm", "620a406ce75dada827b82e453c19cf06776be266f5a67cff34e1ef2cbb60e49a"},
"phoenix": {:hex, :phoenix, "1.7.17", "2fcdceecc6fb90bec26fab008f96abbd0fd93bc9956ec7985e5892cf545152ca", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.7", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:plug_crypto, "~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:websock_adapter, "~> 0.5.3", [hex: :websock_adapter, repo: "hexpm", optional: false]}], "hexpm", "50e8ad537f3f7b0efb1509b2f75b5c918f697be6a45d48e49a30d3b7c0e464c9"},
"phoenix_ecto": {:hex, :phoenix_ecto, "4.4.3", "86e9878f833829c3f66da03d75254c155d91d72a201eb56ae83482328dc7ca93", [:mix], [{:ecto, "~> 3.5", [hex: :ecto, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "d36c401206f3011fefd63d04e8ef626ec8791975d9d107f9a0817d426f61ac07"},
"phoenix_html": {:hex, :phoenix_html, "3.3.4", "42a09fc443bbc1da37e372a5c8e6755d046f22b9b11343bf885067357da21cb3", [:mix], [{:plug, "~> 1.5", [hex: :plug, repo: "hexpm", optional: true]}], "hexpm", "0249d3abec3714aff3415e7ee3d9786cb325be3151e6c4b3021502c585bf53fb"},
"phoenix_live_dashboard": {:hex, :phoenix_live_dashboard, "0.8.3", "7ff51c9b6609470f681fbea20578dede0e548302b0c8bdf338b5a753a4f045bf", [:mix], [{:ecto, "~> 3.6.2 or ~> 3.7", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_mysql_extras, "~> 0.5", [hex: :ecto_mysql_extras, repo: "hexpm", optional: true]}, {:ecto_psql_extras, "~> 0.7", [hex: :ecto_psql_extras, repo: "hexpm", optional: true]}, {:ecto_sqlite3_extras, "~> 1.1.7 or ~> 1.2.0", [hex: :ecto_sqlite3_extras, repo: "hexpm", optional: true]}, {:mime, "~> 1.6 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:phoenix_live_view, "~> 0.19 or ~> 1.0", [hex: :phoenix_live_view, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6 or ~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "f9470a0a8bae4f56430a23d42f977b5a6205fdba6559d76f932b876bfaec652d"},
"phoenix_live_reload": {:hex, :phoenix_live_reload, "1.4.1", "2aff698f5e47369decde4357ba91fc9c37c6487a512b41732818f2204a8ef1d3", [:mix], [{:file_system, "~> 0.2.1 or ~> 0.3", [hex: :file_system, repo: "hexpm", optional: false]}, {:phoenix, "~> 1.4", [hex: :phoenix, repo: "hexpm", optional: false]}], "hexpm", "9bffb834e7ddf08467fe54ae58b5785507aaba6255568ae22b4d46e2bb3615ab"},
"phoenix_live_view": {:hex, :phoenix_live_view, "1.0.0", "3a10dfce8f87b2ad4dc65de0732fc2a11e670b2779a19e8d3281f4619a85bce4", [:mix], [{:floki, "~> 0.36", [hex: :floki, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6.15 or ~> 1.7.0", [hex: :phoenix, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 3.3 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.15", [hex: :plug, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "254caef0028765965ca6bd104cc7d68dcc7d57cc42912bef92f6b03047251d99"},
"peep": {:hex, :peep, "3.4.1", "0e5263710fa0b42675bd0a11fdcdd3ee4f484e319105b6ad9a576c91a5d3cb55", [:mix], [{:nimble_options, "~> 1.1", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:plug, "~> 1.16", [hex: :plug, repo: "hexpm", optional: true]}, {:telemetry_metrics, "~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "7a9b8c1f17b8b9475efb27b7048afa4d89ab84ef33a3d1df13696c85c12cd632"},
"phoenix": {:hex, :phoenix, "1.7.21", "14ca4f1071a5f65121217d6b57ac5712d1857e40a0833aff7a691b7870fc9a3b", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 2.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.7", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:plug_crypto, "~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:websock_adapter, "~> 0.5.3", [hex: :websock_adapter, repo: "hexpm", optional: false]}], "hexpm", "336dce4f86cba56fed312a7d280bf2282c720abb6074bdb1b61ec8095bdd0bc9"},
"phoenix_ecto": {:hex, :phoenix_ecto, "4.6.4", "dcf3483ab45bab4c15e3a47c34451392f64e433846b08469f5d16c2a4cd70052", [:mix], [{:ecto, "~> 3.5", [hex: :ecto, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.1", [hex: :phoenix_html, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:postgrex, "~> 0.16 or ~> 1.0", [hex: :postgrex, repo: "hexpm", optional: true]}], "hexpm", "f5b8584c36ccc9b903948a696fc9b8b81102c79c7c0c751a9f00cdec55d5f2d7"},
"phoenix_html": {:hex, :phoenix_html, "4.2.1", "35279e2a39140068fc03f8874408d58eef734e488fc142153f055c5454fd1c08", [:mix], [], "hexpm", "cff108100ae2715dd959ae8f2a8cef8e20b593f8dfd031c9cba92702cf23e053"},
"phoenix_live_dashboard": {:hex, :phoenix_live_dashboard, "0.8.7", "405880012cb4b706f26dd1c6349125bfc903fb9e44d1ea668adaf4e04d4884b7", [:mix], [{:ecto, "~> 3.6.2 or ~> 3.7", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_mysql_extras, "~> 0.5", [hex: :ecto_mysql_extras, repo: "hexpm", optional: true]}, {:ecto_psql_extras, "~> 0.7", [hex: :ecto_psql_extras, repo: "hexpm", optional: true]}, {:ecto_sqlite3_extras, "~> 1.1.7 or ~> 1.2.0", [hex: :ecto_sqlite3_extras, repo: "hexpm", optional: true]}, {:mime, "~> 1.6 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:phoenix_live_view, "~> 0.19 or ~> 1.0", [hex: :phoenix_live_view, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6 or ~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "3a8625cab39ec261d48a13b7468dc619c0ede099601b084e343968309bd4d7d7"},
"phoenix_live_reload": {:hex, :phoenix_live_reload, "1.6.0", "2791fac0e2776b640192308cc90c0dbcf67843ad51387ed4ecae2038263d708d", [:mix], [{:file_system, "~> 0.2.10 or ~> 1.0", [hex: :file_system, repo: "hexpm", optional: false]}, {:phoenix, "~> 1.4", [hex: :phoenix, repo: "hexpm", optional: false]}], "hexpm", "b3a1fa036d7eb2f956774eda7a7638cf5123f8f2175aca6d6420a7f95e598e1c"},
"phoenix_live_view": {:hex, :phoenix_live_view, "1.0.17", "beeb16d83a7d3760f7ad463df94e83b087577665d2acc0bf2987cd7d9778068f", [:mix], [{:floki, "~> 0.36", [hex: :floki, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix, "~> 1.6.15 or ~> 1.7.0 or ~> 1.8.0-rc", [hex: :phoenix, repo: "hexpm", optional: false]}, {:phoenix_html, "~> 3.3 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: false]}, {:phoenix_template, "~> 1.0", [hex: :phoenix_template, repo: "hexpm", optional: false]}, {:phoenix_view, "~> 2.0", [hex: :phoenix_view, repo: "hexpm", optional: true]}, {:plug, "~> 1.15", [hex: :plug, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "a4ca05c1eb6922c4d07a508a75bfa12c45e5f4d8f77ae83283465f02c53741e1"},
"phoenix_pubsub": {:hex, :phoenix_pubsub, "2.1.3", "3168d78ba41835aecad272d5e8cd51aa87a7ac9eb836eabc42f6e57538e3731d", [:mix], [], "hexpm", "bba06bc1dcfd8cb086759f0edc94a8ba2bc8896d5331a1e2c2902bf8e36ee502"},
"phoenix_template": {:hex, :phoenix_template, "1.0.4", "e2092c132f3b5e5b2d49c96695342eb36d0ed514c5b252a77048d5969330d639", [:mix], [{:phoenix_html, "~> 2.14.2 or ~> 3.0 or ~> 4.0", [hex: :phoenix_html, repo: "hexpm", optional: true]}], "hexpm", "2c0c81f0e5c6753faf5cca2f229c9709919aba34fab866d3bc05060c9c444206"},
"plug": {:hex, :plug, "1.16.1", "40c74619c12f82736d2214557dedec2e9762029b2438d6d175c5074c933edc9d", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.1.1 or ~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "a13ff6b9006b03d7e33874945b2755253841b238c34071ed85b0e86057f8cddc"},
"plug_cowboy": {:hex, :plug_cowboy, "2.7.2", "fdadb973799ae691bf9ecad99125b16625b1c6039999da5fe544d99218e662e4", [:mix], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:cowboy_telemetry, "~> 0.3", [hex: :cowboy_telemetry, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "245d8a11ee2306094840c000e8816f0cbed69a23fc0ac2bcf8d7835ae019bb2f"},
"plug_crypto": {:hex, :plug_crypto, "2.1.0", "f44309c2b06d249c27c8d3f65cfe08158ade08418cf540fd4f72d4d6863abb7b", [:mix], [], "hexpm", "131216a4b030b8f8ce0f26038bc4421ae60e4bb95c5cf5395e1421437824c4fa"},
"ranch": {:hex, :ranch, "1.8.0", "8c7a100a139fd57f17327b6413e4167ac559fbc04ca7448e9be9057311597a1d", [:make, :rebar3], [], "hexpm", "49fbcfd3682fab1f5d109351b61257676da1a2fdbe295904176d5e521a2ddfe5"},
"sobelow": {:hex, :sobelow, "0.13.0", "218afe9075904793f5c64b8837cc356e493d88fddde126a463839351870b8d1e", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "cd6e9026b85fc35d7529da14f95e85a078d9dd1907a9097b3ba6ac7ebbe34a0d"},
"plug": {:hex, :plug, "1.18.0", "d78df36c41f7e798f2edf1f33e1727eae438e9dd5d809a9997c463a108244042", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.1.1 or ~> 1.2 or ~> 2.0", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "819f9e176d51e44dc38132e132fe0accaf6767eab7f0303431e404da8476cfa2"},
"plug_cowboy": {:hex, :plug_cowboy, "2.7.3", "1304d36752e8bdde213cea59ef424ca932910a91a07ef9f3874be709c4ddb94b", [:mix], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:cowboy_telemetry, "~> 0.3", [hex: :cowboy_telemetry, repo: "hexpm", optional: false]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "77c95524b2aa5364b247fa17089029e73b951ebc1adeef429361eab0bb55819d"},
"plug_crypto": {:hex, :plug_crypto, "2.1.1", "19bda8184399cb24afa10be734f84a16ea0a2bc65054e23a62bb10f06bc89491", [:mix], [], "hexpm", "6470bce6ffe41c8bd497612ffde1a7e4af67f36a15eea5f921af71cf3e11247c"},
"prom_ex": {:hex, :prom_ex, "1.11.0", "1f6d67f2dead92224cb4f59beb3e4d319257c5728d9638b4a5e8ceb51a4f9c7e", [:mix], [{:absinthe, ">= 1.7.0", [hex: :absinthe, repo: "hexpm", optional: true]}, {:broadway, ">= 1.1.0", [hex: :broadway, repo: "hexpm", optional: true]}, {:ecto, ">= 3.11.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:finch, "~> 0.18", [hex: :finch, repo: "hexpm", optional: false]}, {:jason, "~> 1.4", [hex: :jason, repo: "hexpm", optional: false]}, {:oban, ">= 2.10.0", [hex: :oban, repo: "hexpm", optional: true]}, {:octo_fetch, "~> 0.4", [hex: :octo_fetch, repo: "hexpm", optional: false]}, {:peep, "~> 3.0", [hex: :peep, repo: "hexpm", optional: false]}, {:phoenix, ">= 1.7.0", [hex: :phoenix, repo: "hexpm", optional: true]}, {:phoenix_live_view, ">= 0.20.0", [hex: :phoenix_live_view, repo: "hexpm", optional: true]}, {:plug, ">= 1.16.0", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, ">= 2.6.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, ">= 1.0.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}, {:telemetry_metrics_prometheus_core, "~> 1.2", [hex: :telemetry_metrics_prometheus_core, repo: "hexpm", optional: false]}, {:telemetry_poller, "~> 1.1", [hex: :telemetry_poller, repo: "hexpm", optional: false]}], "hexpm", "76b074bc3730f0802978a7eb5c7091a65473eaaf07e99ec9e933138dcc327805"},
"ranch": {:hex, :ranch, "2.2.0", "25528f82bc8d7c6152c57666ca99ec716510fe0925cb188172f41ce93117b1b0", [:make, :rebar3], [], "hexpm", "fa0b99a1780c80218a4197a59ea8d3bdae32fbff7e88527d7d8a4787eff4f8e7"},
"sobelow": {:hex, :sobelow, "0.14.0", "dd82aae8f72503f924fe9dd97ffe4ca694d2f17ec463dcfd365987c9752af6ee", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "7ecf91e298acfd9b24f5d761f19e8f6e6ac585b9387fb6301023f1f2cd5eed5f"},
"ssl_verify_fun": {:hex, :ssl_verify_fun, "1.1.7", "354c321cf377240c7b8716899e182ce4890c5938111a1296add3ec74cf1715df", [:make, :mix, :rebar3], [], "hexpm", "fe4c190e8f37401d30167c8c405eda19469f34577987c76dde613e838bbc67f8"},
"swoosh": {:hex, :swoosh, "1.14.4", "94e9dba91f7695a10f49b0172c4a4cb658ef24abef7e8140394521b7f3bbb2d4", [:mix], [{:cowboy, "~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:ex_aws, "~> 2.1", [hex: :ex_aws, repo: "hexpm", optional: true]}, {:finch, "~> 0.6", [hex: :finch, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13 or ~> 1.0", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:req, "~> 0.4 or ~> 1.0", [hex: :req, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "081c5a590e4ba85cc89baddf7b2beecf6c13f7f84a958f1cd969290815f0f026"},
"table_rex": {:hex, :table_rex, "4.0.0", "3c613a68ebdc6d4d1e731bc973c233500974ec3993c99fcdabb210407b90959b", [:mix], [], "hexpm", "c35c4d5612ca49ebb0344ea10387da4d2afe278387d4019e4d8111e815df8f55"},
"swoosh": {:hex, :swoosh, "1.19.1", "77e839b27fc7af0704788e5854934c77d4dea7b437270c924a717513d598b8a4", [:mix], [{:bandit, ">= 1.0.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:cowboy, "~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:ex_aws, "~> 2.1", [hex: :ex_aws, repo: "hexpm", optional: true]}, {:finch, "~> 0.6", [hex: :finch, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13 or ~> 1.0", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mua, "~> 0.2.3", [hex: :mua, repo: "hexpm", optional: true]}, {:multipart, "~> 0.4", [hex: :multipart, repo: "hexpm", optional: true]}, {:plug, "~> 1.9", [hex: :plug, repo: "hexpm", optional: true]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:req, "~> 0.5.10 or ~> 0.6 or ~> 1.0", [hex: :req, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.2 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "eab57462d41a3330e82cb93a9d7640f5c79a85951f3457db25c1eb28fda193a6"},
"table_rex": {:hex, :table_rex, "4.1.0", "fbaa8b1ce154c9772012bf445bfb86b587430fb96f3b12022d3f35ee4a68c918", [:mix], [], "hexpm", "95932701df195d43bc2d1c6531178fc8338aa8f38c80f098504d529c43bc2601"},
"tailwind": {:hex, :tailwind, "0.2.2", "9e27288b568ede1d88517e8c61259bc214a12d7eed271e102db4c93fcca9b2cd", [:mix], [{:castore, ">= 0.0.0", [hex: :castore, repo: "hexpm", optional: false]}], "hexpm", "ccfb5025179ea307f7f899d1bb3905cd0ac9f687ed77feebc8f67bdca78565c4"},
"telemetry": {:hex, :telemetry, "1.3.0", "fedebbae410d715cf8e7062c96a1ef32ec22e764197f70cda73d82778d61e7a2", [:rebar3], [], "hexpm", "7015fc8919dbe63764f4b4b87a95b7c0996bd539e0d499be6ec9d7f3875b79e6"},
"telemetry_metrics": {:hex, :telemetry_metrics, "0.6.2", "2caabe9344ec17eafe5403304771c3539f3b6e2f7fb6a6f602558c825d0d0bfb", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "9b43db0dc33863930b9ef9d27137e78974756f5f198cae18409970ed6fa5b561"},
"telemetry_poller": {:hex, :telemetry_poller, "1.0.0", "db91bb424e07f2bb6e73926fcafbfcbcb295f0193e0a00e825e589a0a47e8453", [:rebar3], [{:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "b3a24eafd66c3f42da30fc3ca7dda1e9d546c12250a2d60d7b81d264fbec4f6e"},
"timex": {:hex, :timex, "3.7.11", "bb95cb4eb1d06e27346325de506bcc6c30f9c6dea40d1ebe390b262fad1862d1", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.20", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 1.1", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "8b9024f7efbabaf9bd7aa04f65cf8dcd7c9818ca5737677c7b76acbc6a94d1aa"},
"tzdata": {:hex, :tzdata, "1.1.2", "45e5f1fcf8729525ec27c65e163be5b3d247ab1702581a94674e008413eef50b", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "cec7b286e608371602318c414f344941d5eb0375e14cfdab605cca2fe66cba8b"},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.7.0", "bc84380c9ab48177092f43ac89e4dfa2c6d62b40b8bd132b1059ecc7232f9a78", [:rebar3], [], "hexpm", "25eee6d67df61960cf6a794239566599b09e17e668d3700247bc498638152521"},
"telemetry_metrics": {:hex, :telemetry_metrics, "1.1.0", "5bd5f3b5637e0abea0426b947e3ce5dd304f8b3bc6617039e2b5a008adc02f8f", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "e7b79e8ddfde70adb6db8a6623d1778ec66401f366e9a8f5dd0955c56bc8ce67"},
"telemetry_metrics_prometheus_core": {:hex, :telemetry_metrics_prometheus_core, "1.2.1", "c9755987d7b959b557084e6990990cb96a50d6482c683fb9622a63837f3cd3d8", [:mix], [{:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}, {:telemetry_metrics, "~> 0.6 or ~> 1.0", [hex: :telemetry_metrics, repo: "hexpm", optional: false]}], "hexpm", "5e2c599da4983c4f88a33e9571f1458bf98b0cf6ba930f1dc3a6e8cf45d5afb6"},
"telemetry_poller": {:hex, :telemetry_poller, "1.2.0", "ba82e333215aed9dd2096f93bd1d13ae89d249f82760fcada0850ba33bac154b", [:rebar3], [{:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7216e21a6c326eb9aa44328028c34e9fd348fb53667ca837be59d0aa2a0156e8"},
"timex": {:git, "https://github.com/bitwalker/timex.git", "cc649c7a586f1266b17d57aff3c6eb1a56116ca2", [ref: "cc649c7a586f1266b17d57aff3c6eb1a56116ca2"]},
"tzdata": {:hex, :tzdata, "1.1.3", "b1cef7bb6de1de90d4ddc25d33892b32830f907e7fc2fccd1e7e22778ab7dfbc", [:mix], [{:hackney, "~> 1.17", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "d4ca85575a064d29d4e94253ee95912edfb165938743dbf002acdf0dcecb0c28"},
"unicode_util_compat": {:hex, :unicode_util_compat, "0.7.1", "a48703a25c170eedadca83b11e88985af08d35f37c6f664d6dcfb106a97782fc", [:rebar3], [], "hexpm", "b3a917854ce3ae233619744ad1e0102e05673136776fb2fa76234f3e03b23642"},
"websock": {:hex, :websock, "0.5.3", "2f69a6ebe810328555b6fe5c831a851f485e303a7c8ce6c5f675abeb20ebdadc", [:mix], [], "hexpm", "6105453d7fac22c712ad66fab1d45abdf049868f253cf719b625151460b8b453"},
"websock_adapter": {:hex, :websock_adapter, "0.5.8", "3b97dc94e407e2d1fc666b2fb9acf6be81a1798a2602294aac000260a7c4a47d", [:mix], [{:bandit, ">= 0.6.0", [hex: :bandit, repo: "hexpm", optional: true]}, {:plug, "~> 1.14", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.6", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:websock, "~> 0.5", [hex: :websock, repo: "hexpm", optional: false]}], "hexpm", "315b9a1865552212b5f35140ad194e67ce31af45bcee443d4ecb96b5fd3f3782"},
}

View file

@ -0,0 +1,607 @@
{
"annotations": {
"list": [
{
"builtIn": 1,
"datasource": "-- Grafana --",
"enable": true,
"hide": true,
"iconColor": "rgba(0, 211, 255, 1)",
"name": "Annotations & Alerts",
"type": "dashboard"
},
{
"datasource": "-- Grafana --",
"enable": true,
"hide": true,
"iconColor": "#73BF69",
"limit": 100,
"name": "PromEx service start",
"showIn": 0,
"tags": ["prom_ex", "pinchflat", "start"],
"type": "tags"
},
{
"datasource": "-- Grafana --",
"enable": true,
"hide": true,
"iconColor": "#FF9830",
"limit": 100,
"name": "PromEx service stop",
"showIn": 0,
"tags": ["prom_ex", "pinchflat", "stop"],
"type": "tags"
}
]
},
"description": "All the data that is presented here is captured by the PromEx Application plugin (https://github.com/akoutmos/prom_ex/blob/master/lib/prom_ex/plugins/application.ex)",
"editable": false,
"gnetId": null,
"graphTooltip": 1,
"id": null,
"links": [
{
"asDropdown": false,
"icon": "bolt",
"includeVars": false,
"keepTime": false,
"tags": [],
"targetBlank": true,
"title": "Sponsor PromEx",
"tooltip": "",
"type": "link",
"url": "https://github.com/sponsors/akoutmos"
},
{
"asDropdown": false,
"icon": "doc",
"includeVars": false,
"keepTime": false,
"tags": [],
"targetBlank": true,
"title": "Application Plugin Docs",
"tooltip": "",
"type": "link",
"url": "https://hexdocs.pm/prom_ex/PromEx.Plugins.Application.html"
}
],
"panels": [
{
"datasource": "prometheus",
"description": "The amount of time that the application has been running.",
"fieldConfig": {
"defaults": {
"custom": {},
"decimals": 1,
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
}
]
},
"unit": "dtdurationms"
},
"overrides": []
},
"gridPos": {
"h": 6,
"w": 8,
"x": 0,
"y": 0
},
"id": 6,
"options": {
"colorMode": "value",
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": ["last"],
"fields": "",
"values": false
},
"textMode": "auto"
},
"pluginVersion": "7.1.3",
"targets": [
{
"expr": "pinchflat_prom_ex_application_uptime_milliseconds_count{job=\"$job\", instance=\"$instance\"}",
"format": "table",
"instant": true,
"interval": "",
"legendFormat": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Application Uptime",
"type": "stat"
},
{
"datasource": "prometheus",
"description": "The data is populated by the PromEx Application plugin and provides information regarding your application's dependencies.",
"fieldConfig": {
"defaults": {
"custom": {
"align": "left",
"displayMode": "auto"
},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
}
]
}
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "Status"
},
"properties": [
{
"id": "custom.displayMode",
"value": "color-background"
},
{
"id": "mappings",
"value": [
{
"from": "",
"id": 0,
"text": "Started",
"to": "",
"type": 1,
"value": "1"
},
{
"from": "",
"id": 1,
"text": "Loaded",
"to": "",
"type": 1,
"value": "0"
}
]
},
{
"id": "custom.align",
"value": "center"
},
{
"id": "custom.width",
"value": 202
}
]
},
{
"matcher": {
"id": "byName",
"options": "Name"
},
"properties": [
{
"id": "custom.width",
"value": 349
}
]
},
{
"matcher": {
"id": "byName",
"options": "Version"
},
"properties": [
{
"id": "custom.width",
"value": 187
}
]
}
]
},
"gridPos": {
"h": 36,
"w": 16,
"x": 8,
"y": 0
},
"id": 2,
"options": {
"showHeader": true,
"sortBy": [
{
"desc": false,
"displayName": "Name"
}
]
},
"pluginVersion": "7.1.3",
"targets": [
{
"expr": "pinchflat_prom_ex_application_dependency_info{job=\"$job\", instance=\"$instance\"}",
"format": "table",
"instant": true,
"interval": "",
"legendFormat": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Dependency Information",
"transformations": [
{
"id": "organize",
"options": {
"excludeByName": {
"Time": true,
"__name__": true,
"instance": true,
"job": true,
"Value": true
},
"indexByName": {
"Time": 0,
"Value": 4,
"__name__": 1,
"instance": 2,
"job": 3,
"modules": 7,
"name": 5,
"version": 6
},
"renameByName": {
"Value": "Status",
"modules": "Number of Modules Loaded",
"name": "Name",
"version": "Version"
}
}
}
],
"type": "table"
},
{
"datasource": "prometheus",
"description": "The name of the primary application that is running.",
"fieldConfig": {
"defaults": {
"custom": {},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
}
]
}
},
"overrides": []
},
"gridPos": {
"h": 6,
"w": 8,
"x": 0,
"y": 6
},
"id": 11,
"options": {
"colorMode": "value",
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": ["last"],
"fields": "/^name$/",
"values": false
},
"textMode": "auto"
},
"pluginVersion": "7.1.3",
"targets": [
{
"expr": "pinchflat_prom_ex_application_primary_info{job=\"$job\", instance=\"$instance\"}",
"format": "table",
"instant": true,
"interval": "",
"legendFormat": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Application Name",
"type": "stat"
},
{
"datasource": "prometheus",
"description": "The Git SHA of the application.",
"fieldConfig": {
"defaults": {
"custom": {},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
}
]
}
},
"overrides": []
},
"gridPos": {
"h": 6,
"w": 8,
"x": 0,
"y": 12
},
"id": 10,
"options": {
"colorMode": "value",
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": ["last"],
"fields": "/^sha$/",
"values": false
},
"textMode": "auto"
},
"pluginVersion": "7.1.3",
"targets": [
{
"expr": "pinchflat_prom_ex_application_git_sha_info{job=\"$job\", instance=\"$instance\"}",
"format": "table",
"instant": true,
"interval": "",
"legendFormat": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Application Git SHA",
"type": "stat"
},
{
"datasource": "prometheus",
"description": "The author of the application's last Git commit.",
"fieldConfig": {
"defaults": {
"custom": {},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
}
]
}
},
"overrides": []
},
"gridPos": {
"h": 6,
"w": 8,
"x": 0,
"y": 18
},
"id": 12,
"options": {
"colorMode": "value",
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": ["last"],
"fields": "/^author$/",
"values": false
},
"textMode": "auto"
},
"pluginVersion": "7.1.3",
"targets": [
{
"expr": "pinchflat_prom_ex_application_git_author_info{job=\"$job\", instance=\"$instance\"}",
"format": "table",
"instant": true,
"interval": "",
"legendFormat": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Application Git Author",
"type": "stat"
},
{
"datasource": "prometheus",
"description": "The version of the primary application that is running.",
"fieldConfig": {
"defaults": {
"custom": {},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
}
]
}
},
"overrides": []
},
"gridPos": {
"h": 6,
"w": 8,
"x": 0,
"y": 24
},
"id": 7,
"options": {
"colorMode": "value",
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": ["last"],
"fields": "/^version$/",
"values": false
},
"textMode": "auto"
},
"pluginVersion": "7.1.3",
"targets": [
{
"expr": "pinchflat_prom_ex_application_primary_info{job=\"$job\", instance=\"$instance\"}",
"format": "table",
"instant": true,
"interval": "",
"legendFormat": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Application Version",
"type": "stat"
},
{
"datasource": "prometheus",
"description": "The number of modules loaded by the primary application that is running.",
"fieldConfig": {
"defaults": {
"custom": {},
"mappings": [],
"thresholds": {
"mode": "absolute",
"steps": [
{
"color": "green",
"value": null
}
]
}
},
"overrides": []
},
"gridPos": {
"h": 6,
"w": 8,
"x": 0,
"y": 30
},
"id": 9,
"options": {
"colorMode": "value",
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"reduceOptions": {
"calcs": ["last"],
"fields": "/^modules$/",
"values": false
},
"textMode": "auto"
},
"pluginVersion": "7.1.3",
"targets": [
{
"expr": "pinchflat_prom_ex_application_primary_info{job=\"$job\", instance=\"$instance\"}",
"format": "table",
"instant": true,
"interval": "",
"legendFormat": "",
"refId": "A"
}
],
"timeFrom": null,
"timeShift": null,
"title": "Application Modules Loaded",
"type": "stat"
}
],
"refresh": "5s",
"schemaVersion": 26,
"style": "dark",
"tags": ["PromEx", "Application", "pinchflat"],
"templating": {
"list": [
{
"allValue": null,
"datasource": "prometheus",
"definition": "label_values(pinchflat_prom_ex_prom_ex_status_info, job)",
"hide": 0,
"includeAll": false,
"label": "Prometheus Job",
"multi": false,
"name": "job",
"options": [],
"query": "label_values(pinchflat_prom_ex_prom_ex_status_info, job)",
"refresh": 2,
"regex": "",
"skipUrlSync": false,
"sort": 6,
"tagValuesQuery": "",
"tags": [],
"tagsQuery": "",
"type": "query",
"useTags": false
},
{
"allValue": null,
"datasource": "prometheus",
"definition": "label_values(pinchflat_prom_ex_prom_ex_status_info, instance)",
"hide": 0,
"includeAll": false,
"label": "Application Instance",
"multi": false,
"name": "instance",
"options": [],
"query": "label_values(pinchflat_prom_ex_prom_ex_status_info{job=\"$job\"}, instance)",
"refresh": 2,
"regex": "",
"skipUrlSync": false,
"sort": 0,
"tagValuesQuery": "",
"tags": [],
"tagsQuery": "",
"type": "query",
"useTags": false
}
]
},
"time": {
"from": "now-1h",
"to": "now"
},
"timepicker": {
"refresh_intervals": ["5s", "10s", "30s", "1m", "5m"]
},
"timezone": "",
"title": "Pinchflat - PromEx Application Dashboard",
"uid": "7DBBC471C5775585391E8F24D1E62319",
"version": 1
}

2328
priv/grafana/beam.json Normal file

File diff suppressed because it is too large Load diff

1247
priv/grafana/ecto.json Normal file

File diff suppressed because it is too large Load diff

2866
priv/grafana/oban.json Normal file

File diff suppressed because it is too large Load diff

1978
priv/grafana/phoenix.json Normal file

File diff suppressed because it is too large Load diff

File diff suppressed because it is too large Load diff

Binary file not shown.

Before

Width:  |  Height:  |  Size: 442 KiB

After

Width:  |  Height:  |  Size: 506 KiB


View file

@ -0,0 +1,9 @@
defmodule Pinchflat.Repo.Migrations.AddExtractorSleepIntervalToSettings do
use Ecto.Migration
def change do
alter table(:settings) do
add :extractor_sleep_interval_seconds, :number, null: false, default: 0
end
end
end

View file

@ -0,0 +1,9 @@
defmodule Pinchflat.Repo.Migrations.AddLastErrorToMediaItem do
use Ecto.Migration
def change do
alter table(:media_items) do
add :last_error, :string
end
end
end

View file

@ -0,0 +1,18 @@
defmodule Pinchflat.Repo.Migrations.AddCookieBehaviourToSources do
use Ecto.Migration
def change do
alter table(:sources) do
add :cookie_behaviour, :string, null: false, default: "disabled"
end
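# Backfill: sources that previously had use_cookies = TRUE map to the new "all_operations" behaviour; the second statement reverses this on rollback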
execute(
"UPDATE sources SET cookie_behaviour = 'all_operations' WHERE use_cookies = TRUE",
"UPDATE sources SET use_cookies = TRUE WHERE cookie_behaviour = 'all_operations'"
)
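# With the data migrated into cookie_behaviour, the legacy boolean column can be dropped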
alter table(:sources) do
remove :use_cookies, :boolean, null: false, default: false
end
end
end

View file

@ -0,0 +1,9 @@
defmodule Pinchflat.Repo.Migrations.AddRateLimitSpeedToSettings do
use Ecto.Migration
def change do
alter table(:settings) do
add :download_throughput_limit, :string
end
end
end

View file

@ -0,0 +1,9 @@
defmodule Pinchflat.Repo.Migrations.AddRestrictFilenamesToSettings do
use Ecto.Migration
def change do
alter table(:settings) do
add :restrict_filenames, :boolean, default: false
end
end
end

View file

@ -6,6 +6,9 @@ if [ $? -ne 0 ]; then
exit 1
fi
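# Apply the configured UMASK before running migrations so newly created files inherit it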
echo "Setting umask to ${UMASK}"
umask ${UMASK}
/app/bin/migrate
cd -P -- "$(dirname -- "$0")"

View file

@ -0,0 +1,16 @@
defmodule Pinchflat.Boot.PostBootStartupTasksTest do
use Pinchflat.DataCase
alias Pinchflat.YtDlp.UpdateWorker
alias Pinchflat.Boot.PostBootStartupTasks
describe "update_yt_dlp" do
test "enqueues an update job" do
assert [] = all_enqueued(worker: UpdateWorker)
PostBootStartupTasks.init(%{})
assert [%Oban.Job{}] = all_enqueued(worker: UpdateWorker)
end
end
end

View file

@ -10,7 +10,7 @@ defmodule Pinchflat.Downloading.DownloadingHelpersTest do
alias Pinchflat.Downloading.MediaDownloadWorker
describe "enqueue_pending_download_tasks/1" do
test "it enqueues a job for each pending media item" do
test "enqueues a job for each pending media item" do
source = source_fixture()
media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
@ -19,7 +19,7 @@ defmodule Pinchflat.Downloading.DownloadingHelpersTest do
assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
end
test "it does not enqueue a job for media items with a filepath" do
test "does not enqueue a job for media items with a filepath" do
source = source_fixture()
_media_item = media_item_fixture(source_id: source.id, media_filepath: "some/filepath.mp4")
@ -28,7 +28,7 @@ defmodule Pinchflat.Downloading.DownloadingHelpersTest do
refute_enqueued(worker: MediaDownloadWorker)
end
test "it attaches a task to each enqueued job" do
test "attaches a task to each enqueued job" do
source = source_fixture()
media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
@ -39,7 +39,7 @@ defmodule Pinchflat.Downloading.DownloadingHelpersTest do
assert [_] = Tasks.list_tasks_for(media_item)
end
test "it does not create a job if the source is set to not download" do
test "does not create a job if the source is set to not download" do
source = source_fixture(download_media: false)
assert :ok = DownloadingHelpers.enqueue_pending_download_tasks(source)
@ -47,17 +47,26 @@ defmodule Pinchflat.Downloading.DownloadingHelpersTest do
refute_enqueued(worker: MediaDownloadWorker)
end
test "it does not attach tasks if the source is set to not download" do
test "does not attach tasks if the source is set to not download" do
source = source_fixture(download_media: false)
media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
assert :ok = DownloadingHelpers.enqueue_pending_download_tasks(source)
assert [] = Tasks.list_tasks_for(media_item)
end
test "can pass job options" do
source = source_fixture()
media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
assert :ok = DownloadingHelpers.enqueue_pending_download_tasks(source, priority: 1)
assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id}, priority: 1)
end
end
describe "dequeue_pending_download_tasks/1" do
test "it deletes all pending tasks for a source's media items" do
test "deletes all pending tasks for a source's media items" do
source = source_fixture()
media_item = media_item_fixture(source_id: source.id, media_filepath: nil)
@ -109,6 +118,14 @@ defmodule Pinchflat.Downloading.DownloadingHelpersTest do
refute_enqueued(worker: MediaDownloadWorker)
end
test "can pass job options" do
media_item = media_item_fixture(media_filepath: nil)
assert {:ok, _} = DownloadingHelpers.kickoff_download_if_pending(media_item, priority: 1)
assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id}, priority: 1)
end
end
describe "kickoff_redownload_for_existing_media/1" do

View file

@ -46,13 +46,20 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
assert_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id, "force" => true})
end
test "can be called with additional job options", %{media_item: media_item} do
job_opts = [max_attempts: 5]
assert {:ok, _} = MediaDownloadWorker.kickoff_with_task(media_item, %{}, job_opts)
test "has a priority of 5 by default", %{media_item: media_item} do
assert {:ok, _} = MediaDownloadWorker.kickoff_with_task(media_item)
[job] = all_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
assert job.max_attempts == 5
assert job.priority == 5
end
test "priority can be set", %{media_item: media_item} do
assert {:ok, _} = MediaDownloadWorker.kickoff_with_task(media_item, %{}, priority: 0)
[job] = all_enqueued(worker: MediaDownloadWorker, args: %{"id" => media_item.id})
assert job.priority == 0
end
end
@ -67,7 +74,7 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
:ok
end
test "it saves attributes to the media_item", %{media_item: media_item} do
test "saves attributes to the media_item", %{media_item: media_item} do
assert media_item.media_filepath == nil
perform_job(MediaDownloadWorker, %{id: media_item.id})
media_item = Repo.reload(media_item)
@ -75,20 +82,20 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
assert media_item.media_filepath != nil
end
test "it saves the metadata to the media_item", %{media_item: media_item} do
test "saves the metadata to the media_item", %{media_item: media_item} do
assert media_item.metadata == nil
perform_job(MediaDownloadWorker, %{id: media_item.id})
assert Repo.reload(media_item).metadata != nil
end
test "it won't double-schedule downloading jobs", %{media_item: media_item} do
test "won't double-schedule downloading jobs", %{media_item: media_item} do
Oban.insert(MediaDownloadWorker.new(%{id: media_item.id}))
Oban.insert(MediaDownloadWorker.new(%{id: media_item.id}))
assert [_] = all_enqueued(worker: MediaDownloadWorker)
end
test "it sets the job to retryable if the download fails", %{media_item: media_item} do
test "sets the job to retryable if the download fails", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_downloadable_status, _opts, _ot, _addl -> {:ok, "{}"}
_url, :download, _opts, _ot, _addl -> {:error, "error"}
@ -127,7 +134,36 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
end)
end
test "it ensures error are returned in a 2-item tuple", %{media_item: media_item} do
test "does not set the job to retryable if youtube thinks you're a bot", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_downloadable_status, _opts, _ot, _addl -> {:ok, "{}"}
_url, :download, _opts, _ot, _addl -> {:error, "Sign in to confirm you're not a bot", 1}
end)
Oban.Testing.with_testing_mode(:inline, fn ->
{:ok, job} = Oban.insert(MediaDownloadWorker.new(%{id: media_item.id, quality_upgrade?: true}))
assert job.state == "completed"
end)
end
test "does not set the job to retryable you aren't a member", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_downloadable_status, _opts, _ot, _addl ->
{:ok, "{}"}
_url, :download, _opts, _ot, _addl ->
{:error, "This video is available to this channel's members on level: foo", 1}
end)
Oban.Testing.with_testing_mode(:inline, fn ->
{:ok, job} = Oban.insert(MediaDownloadWorker.new(%{id: media_item.id, quality_upgrade?: true}))
assert job.state == "completed"
end)
end
test "ensures error are returned in a 2-item tuple", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_downloadable_status, _opts, _ot, _addl -> {:ok, "{}"}
_url, :download, _opts, _ot, _addl -> {:error, "error", 1}
@ -136,7 +172,7 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
assert {:error, :download_failed} = perform_job(MediaDownloadWorker, %{id: media_item.id})
end
test "it does not download if the source is set to not download", %{media_item: media_item} do
test "does not download if the source is set to not download", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 0, fn _url, :download, _opts, _ot, _addl -> :ok end)
Sources.update_source(media_item.source, %{download_media: false})
@ -152,7 +188,7 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
perform_job(MediaDownloadWorker, %{id: media_item.id})
end
test "it saves the file's size to the database", %{media_item: media_item} do
test "saves the file's size to the database", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, _addl ->
{:ok, "{}"}
@ -201,6 +237,14 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
perform_job(MediaDownloadWorker, %{id: media_item.id})
end
test "does not download if the media item isn't pending download", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 0, fn _url, :download, _opts, _ot, _addl -> :ok end)
Media.update_media_item(media_item, %{media_filepath: "foo.mp4"})
perform_job(MediaDownloadWorker, %{id: media_item.id})
end
end
describe "perform/1 when testing non-downloadable media" do
@ -219,12 +263,30 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
describe "perform/1 when testing forced downloads" do
test "ignores 'prevent_download' if forced", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, _addl -> {:ok, "{}"}
_url, :download, _opts, _ot, _addl -> {:ok, render_metadata(:media_metadata)}
_url, :download_thumbnail, _opts, _ot, _addl -> {:ok, ""}
end)
Sources.update_source(media_item.source, %{download_media: false})
Media.update_media_item(media_item, %{prevent_download: true})
perform_job(MediaDownloadWorker, %{id: media_item.id, force: true})
end
test "ignores whether the media item is pending when forced", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, _addl -> {:ok, "{}"}
_url, :download, _opts, _ot, _addl -> {:ok, render_metadata(:media_metadata)}
_url, :download_thumbnail, _opts, _ot, _addl -> {:ok, ""}
end)
Media.update_media_item(media_item, %{media_filepath: "foo.mp4"})
perform_job(MediaDownloadWorker, %{id: media_item.id, force: true})
end
test "sets force_overwrites runner option", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, _addl ->
@ -252,6 +314,34 @@ defmodule Pinchflat.Downloading.MediaDownloadWorkerTest do
assert media_item.media_redownloaded_at != nil
end
test "ignores whether the media item is pending when re-downloaded", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, _addl -> {:ok, "{}"}
_url, :download, _opts, _ot, _addl -> {:ok, render_metadata(:media_metadata)}
_url, :download_thumbnail, _opts, _ot, _addl -> {:ok, ""}
end)
Media.update_media_item(media_item, %{media_filepath: "foo.mp4"})
perform_job(MediaDownloadWorker, %{id: media_item.id, quality_upgrade?: true})
end
test "doesn't redownload if the source is set to not download", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 0, fn _url, :download, _opts, _ot, _addl -> :ok end)
Sources.update_source(media_item.source, %{download_media: false})
perform_job(MediaDownloadWorker, %{id: media_item.id, quality_upgrade?: true})
end
test "doesn't redownload if the media item is set to not download", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 0, fn _url, :download, _opts, _ot, _addl -> :ok end)
Media.update_media_item(media_item, %{prevent_download: true})
perform_job(MediaDownloadWorker, %{id: media_item.id, quality_upgrade?: true})
end
test "sets force_overwrites runner option", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, _addl ->

View file

@ -60,7 +60,8 @@ defmodule Pinchflat.Downloading.MediaDownloaderTest do
{:ok, Phoenix.json_library().encode!(%{"live_status" => "is_live"})}
end)
assert {:error, :unsuitable_for_download} = MediaDownloader.download_for_media_item(media_item)
assert {:error, :unsuitable_for_download, message} = MediaDownloader.download_for_media_item(media_item)
assert message =~ "Media item ##{media_item.id} isn't suitable for download yet."
end
test "non-recoverable errors are passed through", %{media_item: media_item} do
@ -69,7 +70,7 @@ defmodule Pinchflat.Downloading.MediaDownloaderTest do
_url, :download, _opts, _ot, _addl -> {:error, :some_error, 1}
end)
assert {:error, :some_error} = MediaDownloader.download_for_media_item(media_item)
assert {:error, :download_failed, :some_error} = MediaDownloader.download_for_media_item(media_item)
end
test "unknown errors are passed through", %{media_item: media_item} do
@ -78,7 +79,7 @@ defmodule Pinchflat.Downloading.MediaDownloaderTest do
_url, :download, _opts, _ot, _addl -> {:error, :some_error}
end)
assert {:error, message} = MediaDownloader.download_for_media_item(media_item)
assert {:error, :unknown, message} = MediaDownloader.download_for_media_item(media_item)
assert message == "Unknown error: {:error, :some_error}"
end
end
@ -107,13 +108,15 @@ defmodule Pinchflat.Downloading.MediaDownloaderTest do
expect(YtDlpRunnerMock, :run, 0, fn _url, :download, _opts, _ot, _addl -> {:ok, ""} end)
assert {:error, :unsuitable_for_download} = MediaDownloader.download_for_media_item(media_item)
assert {:error, :unsuitable_for_download, message} = MediaDownloader.download_for_media_item(media_item)
assert message =~ "Media item ##{media_item.id} isn't suitable for download yet."
end
test "returns unexpected errors from the download status determination method", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, fn _url, :get_downloadable_status, _opts, _ot, _addl -> {:error, :what_tha} end)
assert {:error, "Unknown error: {:error, :what_tha}"} = MediaDownloader.download_for_media_item(media_item)
assert {:error, :unknown, "Unknown error: {:error, :what_tha}"} =
MediaDownloader.download_for_media_item(media_item)
end
end
@ -154,15 +157,16 @@ defmodule Pinchflat.Downloading.MediaDownloaderTest do
{:ok, ""}
end)
source = source_fixture(%{use_cookies: true})
source = source_fixture(%{cookie_behaviour: :all_operations})
media_item = media_item_fixture(%{source_id: source.id})
assert {:ok, _} = MediaDownloader.download_for_media_item(media_item)
end
test "does not set use_cookies if the source does not use cookies" do
test "does not set use_cookies if the source uses cookies when needed" do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, _addl ->
_url, :get_downloadable_status, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, "{}"}
_url, :download, _opts, _ot, addl ->
@ -173,26 +177,52 @@ defmodule Pinchflat.Downloading.MediaDownloaderTest do
{:ok, ""}
end)
source = source_fixture(%{use_cookies: false})
source = source_fixture(%{cookie_behaviour: :when_needed})
media_item = media_item_fixture(%{source_id: source.id})
assert {:ok, _} = MediaDownloader.download_for_media_item(media_item)
end
test "does not set use_cookies if the source does not use cookies" do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, "{}"}
_url, :download, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, render_metadata(:media_metadata)}
_url, :download_thumbnail, _opts, _ot, _addl ->
{:ok, ""}
end)
source = source_fixture(%{cookie_behaviour: :disabled})
media_item = media_item_fixture(%{source_id: source.id})
assert {:ok, _} = MediaDownloader.download_for_media_item(media_item)
end
end
describe "download_for_media_item/3 when testing retries" do
describe "download_for_media_item/3 when testing non-cookie retries" do
test "returns a recovered tuple on recoverable errors", %{media_item: media_item} do
message = "Unable to communicate with SponsorBlock"
expect(YtDlpRunnerMock, :run, 2, fn
expect(YtDlpRunnerMock, :run, 3, fn
_url, :get_downloadable_status, _opts, _ot, _addl ->
{:ok, "{}"}
_url, :download, _opts, _ot, _addl ->
_url, :download, _opts, _ot, addl ->
[{:output_filepath, filepath} | _] = addl
File.write(filepath, render_metadata(:media_metadata))
{:error, message, 1}
_url, :download_thumbnail, _opts, _ot, _addl ->
{:ok, ""}
end)
assert {:recovered, ^message} = MediaDownloader.download_for_media_item(media_item)
assert {:recovered, _media_item, ^message} = MediaDownloader.download_for_media_item(media_item)
end
test "attempts to update the media item on recoverable errors", %{media_item: media_item} do
@ -212,11 +242,121 @@ defmodule Pinchflat.Downloading.MediaDownloaderTest do
{:ok, ""}
end)
assert {:recovered, ^message} = MediaDownloader.download_for_media_item(media_item)
assert {:recovered, updated_media_item, ^message} = MediaDownloader.download_for_media_item(media_item)
assert DateTime.diff(DateTime.utc_now(), updated_media_item.media_downloaded_at) < 2
assert String.ends_with?(updated_media_item.media_filepath, ".mkv")
end
test "returns an unrecoverable tuple if recovery fails", %{media_item: media_item} do
message = "Unable to communicate with SponsorBlock"
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_downloadable_status, _opts, _ot, _addl ->
{:ok, "{}"}
_url, :download, _opts, _ot, _addl ->
# This errors because the metadata is not written to the file so JSON parsing fails
{:error, message, 1}
end)
assert {:error, :unrecoverable, ^message} = MediaDownloader.download_for_media_item(media_item)
end
test "sets the last_error appropriately when recovered", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 3, fn
_url, :download, _opts, _ot, addl ->
[{:output_filepath, filepath} | _] = addl
File.write(filepath, render_metadata(:media_metadata))
{:error, "Unable to communicate with SponsorBlock", 1}
_url, :get_downloadable_status, _opts, _ot, _addl ->
{:ok, "{}"}
_url, :download_thumbnail, _opts, _ot, _addl ->
{:ok, ""}
end)
assert {:recovered, updated_media_item, _message} = MediaDownloader.download_for_media_item(media_item)
assert updated_media_item.last_error == "Unable to communicate with SponsorBlock"
end
test "sets the last_error appropriately when unrecoverable", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_downloadable_status, _opts, _ot, _addl ->
{:ok, "{}"}
_url, :download, _opts, _ot, _addl ->
{:error, "Unable to communicate with SponsorBlock", 1}
end)
assert {:error, :unrecoverable, _message} = MediaDownloader.download_for_media_item(media_item)
media_item = Repo.reload(media_item)
assert DateTime.diff(DateTime.utc_now(), media_item.media_downloaded_at) < 2
assert String.ends_with?(media_item.media_filepath, ".mkv")
assert media_item.last_error == "Unable to communicate with SponsorBlock"
end
end
describe "download_for_media_item/3 when testing cookie retries" do
test "retries with cookies if we think it would help and the source allows" do
expect(YtDlpRunnerMock, :run, 4, fn
_url, :get_downloadable_status, _opts, _ot, [use_cookies: false] ->
{:error, "Sign in to confirm your age", 1}
_url, :get_downloadable_status, _opts, _ot, [use_cookies: true] ->
{:ok, "{}"}
_url, :download, _opts, _ot, addl ->
assert {:use_cookies, true} in addl
{:ok, render_metadata(:media_metadata)}
_url, :download_thumbnail, _opts, _ot, _addl ->
{:ok, ""}
end)
source = source_fixture(%{cookie_behaviour: :when_needed})
media_item = media_item_fixture(%{source_id: source.id})
assert {:ok, _} = MediaDownloader.download_for_media_item(media_item)
end
test "does not retry with cookies if we don't think it would help even the source allows" do
expect(YtDlpRunnerMock, :run, 1, fn
_url, :get_downloadable_status, _opts, _ot, [use_cookies: false] ->
{:error, "Some other error", 1}
end)
source = source_fixture(%{cookie_behaviour: :when_needed})
media_item = media_item_fixture(%{source_id: source.id})
assert {:error, :download_failed, "Some other error"} = MediaDownloader.download_for_media_item(media_item)
end
test "does not retry with cookies even if we think it would help but source doesn't allow" do
expect(YtDlpRunnerMock, :run, 1, fn
_url, :get_downloadable_status, _opts, _ot, [use_cookies: false] ->
{:error, "Sign in to confirm your age", 1}
end)
source = source_fixture(%{cookie_behaviour: :disabled})
media_item = media_item_fixture(%{source_id: source.id})
assert {:error, :download_failed, "Sign in to confirm your age"} =
MediaDownloader.download_for_media_item(media_item)
end
test "does not retry with cookies if cookies were already used" do
expect(YtDlpRunnerMock, :run, 1, fn
_url, :get_downloadable_status, _opts, _ot, [use_cookies: true] ->
{:error, "This video is available to this channel's members", 1}
end)
source = source_fixture(%{cookie_behaviour: :all_operations})
media_item = media_item_fixture(%{source_id: source.id})
assert {:error, :download_failed, "This video is available to this channel's members"} =
MediaDownloader.download_for_media_item(media_item)
end
end
@ -324,6 +464,25 @@ defmodule Pinchflat.Downloading.MediaDownloaderTest do
File.rm(updated_media_item.metadata_filepath)
end
test "sets the last_error to nil on success" do
media_item = media_item_fixture(%{last_error: "Some error"})
assert {:ok, updated_media_item} = MediaDownloader.download_for_media_item(media_item)
assert updated_media_item.last_error == nil
end
test "sets the last_error to the error message on failure", %{media_item: media_item} do
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_downloadable_status, _opts, _ot, _addl -> {:ok, "{}"}
_url, :download, _opts, _ot, _addl -> {:error, :some_error}
end)
assert {:error, :unknown, _message} = MediaDownloader.download_for_media_item(media_item)
media_item = Repo.reload(media_item)
assert media_item.last_error == "Unknown error: {:error, :some_error}"
end
end
describe "download_for_media_item/3 when testing NFO generation" do

View file

@ -38,36 +38,48 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
end
end
describe "kickoff_download_tasks_from_youtube_rss_feed/1" do
test "enqueues a new worker for each new media_id in the source's RSS feed", %{source: source} do
describe "index_and_kickoff_downloads/1" do
test "enqueues a worker for each new media_id in the source's RSS feed", %{source: source} do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
assert [media_item] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [media_item] = FastIndexingHelpers.index_and_kickoff_downloads(source)
assert [worker] = all_enqueued(worker: MediaDownloadWorker)
assert worker.args["id"] == media_item.id
assert worker.priority == 0
end
test "does not enqueue a new worker for the source's media IDs we already know about", %{source: source} do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
media_item_fixture(source_id: source.id, media_id: "test_1")
assert [] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [] = FastIndexingHelpers.index_and_kickoff_downloads(source)
refute_enqueued(worker: MediaDownloadWorker)
end
test "kicks off a download task for all pending media but at a lower priority", %{source: source} do
pending_item = media_item_fixture(source_id: source.id, media_filepath: nil)
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
assert [worker_1, _worker_2] = all_enqueued(worker: MediaDownloadWorker)
assert worker_1.args["id"] == pending_item.id
assert worker_1.priority == 1
end
test "returns the found media items", %{source: source} do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
assert [%MediaItem{}] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
test "does not enqueue a download job if the source does not allow it" do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
source = source_fixture(%{download_media: false})
assert [%MediaItem{}] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
refute_enqueued(worker: MediaDownloadWorker)
end
@ -75,7 +87,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
test "creates a download task record", %{source: source} do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
assert [media_item] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [media_item] = FastIndexingHelpers.index_and_kickoff_downloads(source)
assert [_] = Tasks.list_tasks_for(media_item, "MediaDownloadWorker")
end
@ -89,35 +101,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
{:ok, media_attributes_return_fixture()}
end)
FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
end
test "sets use_cookies if the source uses cookies" do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
stub(YtDlpRunnerMock, :run, fn _url, :get_media_attributes, _opts, _ot, addl ->
assert {:use_cookies, true} in addl
{:ok, media_attributes_return_fixture()}
end)
source = source_fixture(%{use_cookies: true})
assert [%MediaItem{}] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
end
test "does not set use_cookies if the source does not use cookies" do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
stub(YtDlpRunnerMock, :run, fn _url, :get_media_attributes, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, media_attributes_return_fixture()}
end)
source = source_fixture(%{use_cookies: false})
assert [%MediaItem{}] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
FastIndexingHelpers.index_and_kickoff_downloads(source)
end
test "does not enqueue a download job if the media item does not match the format rules" do
@ -142,7 +126,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
{:ok, output}
end)
assert [%MediaItem{}] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
refute_enqueued(worker: MediaDownloadWorker)
end
@ -154,7 +138,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
{:ok, "{}"}
end)
assert [] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
test "does not blow up if a media item causes a yt-dlp error", %{source: source} do
@ -164,11 +148,55 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
{:error, "message", 1}
end)
assert [] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
end
describe "kickoff_download_tasks_from_youtube_rss_feed/1 when testing backends" do
describe "index_and_kickoff_downloads/1 when testing cookies" do
test "sets use_cookies if the source uses cookies" do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
stub(YtDlpRunnerMock, :run, fn _url, :get_media_attributes, _opts, _ot, addl ->
assert {:use_cookies, true} in addl
{:ok, media_attributes_return_fixture()}
end)
source = source_fixture(%{cookie_behaviour: :all_operations})
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
test "does not set use_cookies if the source uses cookies when needed" do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
stub(YtDlpRunnerMock, :run, fn _url, :get_media_attributes, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, media_attributes_return_fixture()}
end)
source = source_fixture(%{cookie_behaviour: :when_needed})
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
test "does not set use_cookies if the source does not use cookies" do
expect(HTTPClientMock, :get, fn _url -> {:ok, "<yt:videoId>test_1</yt:videoId>"} end)
stub(YtDlpRunnerMock, :run, fn _url, :get_media_attributes, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, media_attributes_return_fixture()}
end)
source = source_fixture(%{cookie_behaviour: :disabled})
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
end
describe "index_and_kickoff_downloads/1 when testing backends" do
test "uses the YouTube API if it is enabled", %{source: source} do
expect(HTTPClientMock, :get, fn url, _headers ->
assert url =~ "https://youtube.googleapis.com/youtube/v3/playlistItems"
@ -178,7 +206,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
Settings.set(youtube_api_key: "test_key")
assert [] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
test "the YouTube API creates records as expected", %{source: source} do
@ -188,7 +216,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
Settings.set(youtube_api_key: "test_key")
assert [%MediaItem{}] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
test "RSS is used as a backup if the API fails", %{source: source} do
@ -197,7 +225,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
Settings.set(youtube_api_key: "test_key")
assert [%MediaItem{}] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
test "RSS is used if the API is not enabled", %{source: source} do
@ -209,7 +237,7 @@ defmodule Pinchflat.FastIndexing.FastIndexingHelpersTest do
Settings.set(youtube_api_key: nil)
assert [%MediaItem{}] = FastIndexingHelpers.kickoff_download_tasks_from_youtube_rss_feed(source)
assert [%MediaItem{}] = FastIndexingHelpers.index_and_kickoff_downloads(source)
end
end
end

View file

@ -7,31 +7,67 @@ defmodule Pinchflat.FastIndexing.YoutubeApiTest do
alias Pinchflat.FastIndexing.YoutubeApi
describe "enabled?/0" do
test "returns true if the user has set a YouTube API key" do
test "returns true if the user has set YouTube API keys" do
Settings.set(youtube_api_key: "key1, key2")
assert YoutubeApi.enabled?()
end
test "returns true with a single API key" do
Settings.set(youtube_api_key: "test_key")
assert YoutubeApi.enabled?()
end
test "returns false if the user has not set an API key" do
test "returns false if the user has not set any API keys" do
Settings.set(youtube_api_key: nil)
refute YoutubeApi.enabled?()
end
test "returns false if only empty or whitespace keys are provided" do
Settings.set(youtube_api_key: " , ,")
refute YoutubeApi.enabled?()
end
end
describe "get_recent_media_ids/1" do
setup do
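# Stop the globally registered key index agent (if running) so key rotation starts from the first key in each test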
case :global.whereis_name(YoutubeApi.KeyIndex) do
:undefined -> :ok
pid -> Agent.stop(pid)
end
source = source_fixture()
Settings.set(youtube_api_key: "test_key")
Settings.set(youtube_api_key: "key1, key2")
{:ok, source: source}
end
test "rotates through API keys", %{source: source} do
expect(HTTPClientMock, :get, fn url, _headers ->
assert url =~ "key=key1"
{:ok, "{}"}
end)
expect(HTTPClientMock, :get, fn url, _headers ->
assert url =~ "key=key2"
{:ok, "{}"}
end)
expect(HTTPClientMock, :get, fn url, _headers ->
assert url =~ "key=key1"
{:ok, "{}"}
end)
# three calls to verify rotation
YoutubeApi.get_recent_media_ids(source)
YoutubeApi.get_recent_media_ids(source)
YoutubeApi.get_recent_media_ids(source)
end
test "calls the expected URL", %{source: source} do
expect(HTTPClientMock, :get, fn url, headers ->
api_base = "https://youtube.googleapis.com/youtube/v3/playlistItems"
request_url = "#{api_base}?part=contentDetails&maxResults=50&playlistId=#{source.collection_id}&key=test_key"
request_url = "#{api_base}?part=contentDetails&maxResults=50&playlistId=#{source.collection_id}&key=key1"
assert url == request_url
assert headers == [accept: "application/json"]

View file

@ -921,6 +921,14 @@ defmodule Pinchflat.MediaTest do
media_item = media_item_fixture()
assert %Ecto.Changeset{} = Media.change_media_item(media_item)
end
test "validates the title doesn't start with 'youtube video #'" do
# This is to account for youtube restricting indexing. See issue #549 for more
media_item = media_item_fixture()
assert %Ecto.Changeset{valid?: false} = Media.change_media_item(media_item, %{title: "youtube video #123"})
assert %Ecto.Changeset{valid?: true} = Media.change_media_item(media_item, %{title: "any other title"})
end
end
describe "change_media_item/1 when testing upload_date_index and source is a channel" do

View file

@ -88,13 +88,35 @@ defmodule Pinchflat.Metadata.MetadataFileHelpersTest do
Helpers.download_and_store_thumbnail_for(media_item)
end
test "returns nil if yt-dlp fails", %{media_item: media_item} do
stub(YtDlpRunnerMock, :run, fn _url, :download_thumbnail, _opts, _ot, _addl -> {:error, "error"} end)
filepath = Helpers.download_and_store_thumbnail_for(media_item)
assert filepath == nil
end
end
describe "download_and_store_thumbnail_for/2 when testing cookie usage" do
test "sets use_cookies if the source uses cookies" do
expect(YtDlpRunnerMock, :run, fn _url, :download_thumbnail, _opts, _ot, addl ->
assert {:use_cookies, true} in addl
{:ok, ""}
end)
source = source_fixture(%{use_cookies: true})
source = source_fixture(%{cookie_behaviour: :all_operations})
media_item = Repo.preload(media_item_fixture(%{source_id: source.id}), :source)
Helpers.download_and_store_thumbnail_for(media_item)
end
test "does not set use_cookies if the source uses cookies when needed" do
expect(YtDlpRunnerMock, :run, fn _url, :download_thumbnail, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, ""}
end)
source = source_fixture(%{cookie_behaviour: :when_needed})
media_item = Repo.preload(media_item_fixture(%{source_id: source.id}), :source)
Helpers.download_and_store_thumbnail_for(media_item)
@ -106,19 +128,11 @@ defmodule Pinchflat.Metadata.MetadataFileHelpersTest do
{:ok, ""}
end)
source = source_fixture(%{use_cookies: false})
source = source_fixture(%{cookie_behaviour: :disabled})
media_item = Repo.preload(media_item_fixture(%{source_id: source.id}), :source)
Helpers.download_and_store_thumbnail_for(media_item)
end
test "returns nil if yt-dlp fails", %{media_item: media_item} do
stub(YtDlpRunnerMock, :run, fn _url, :download_thumbnail, _opts, _ot, _addl -> {:error, "error"} end)
filepath = Helpers.download_and_store_thumbnail_for(media_item)
assert filepath == nil
end
end
describe "parse_upload_date/1" do


@ -254,7 +254,23 @@ defmodule Pinchflat.Metadata.SourceMetadataStorageWorkerTest do
end)
profile = media_profile_fixture(%{download_source_images: true})
source = source_fixture(media_profile_id: profile.id, use_cookies: true)
source = source_fixture(media_profile_id: profile.id, cookie_behaviour: :all_operations)
perform_job(SourceMetadataStorageWorker, %{id: source.id})
end
test "does not set use_cookies if the source uses cookies when needed" do
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_source_details, _opts, _ot, _addl ->
{:ok, source_details_return_fixture()}
_url, :get_source_metadata, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, render_metadata(:channel_source_metadata)}
end)
profile = media_profile_fixture(%{download_source_images: true})
source = source_fixture(media_profile_id: profile.id, cookie_behaviour: :when_needed)
perform_job(SourceMetadataStorageWorker, %{id: source.id})
end
@ -270,7 +286,7 @@ defmodule Pinchflat.Metadata.SourceMetadataStorageWorkerTest do
end)
profile = media_profile_fixture(%{download_source_images: true})
source = source_fixture(media_profile_id: profile.id, use_cookies: false)
source = source_fixture(media_profile_id: profile.id, cookie_behaviour: :disabled)
perform_job(SourceMetadataStorageWorker, %{id: source.id})
end
@ -323,7 +339,21 @@ defmodule Pinchflat.Metadata.SourceMetadataStorageWorkerTest do
{:ok, "{}"}
end)
source = source_fixture(%{series_directory: nil, use_cookies: true})
source = source_fixture(%{series_directory: nil, cookie_behaviour: :all_operations})
perform_job(SourceMetadataStorageWorker, %{id: source.id})
end
test "does not set use_cookies if the source uses cookies when needed" do
expect(YtDlpRunnerMock, :run, 2, fn
_url, :get_source_details, _opts, _ot, addl ->
assert {:use_cookies, false} in addl
{:ok, source_details_return_fixture()}
_url, :get_source_metadata, _opts, _ot, _addl ->
{:ok, "{}"}
end)
source = source_fixture(%{series_directory: nil, cookie_behaviour: :when_needed})
perform_job(SourceMetadataStorageWorker, %{id: source.id})
end
@ -337,7 +367,7 @@ defmodule Pinchflat.Metadata.SourceMetadataStorageWorkerTest do
{:ok, "{}"}
end)
source = source_fixture(%{series_directory: nil, use_cookies: false})
source = source_fixture(%{series_directory: nil, cookie_behaviour: :disabled})
perform_job(SourceMetadataStorageWorker, %{id: source.id})
end
end


@ -77,5 +77,20 @@ defmodule Pinchflat.SettingsTest do
assert %Ecto.Changeset{} = Settings.change_setting(setting, %{onboarding: true})
end
test "ensures the extractor sleep interval is positive" do
setting = Settings.record()
assert %Ecto.Changeset{valid?: true} = Settings.change_setting(setting, %{extractor_sleep_interval_seconds: 1})
assert %Ecto.Changeset{valid?: true} = Settings.change_setting(setting, %{extractor_sleep_interval_seconds: 0})
assert %Ecto.Changeset{valid?: false} = Settings.change_setting(setting, %{extractor_sleep_interval_seconds: -1})
end
test "allows you to reset the extractor sleep interval" do
setting = Settings.record()
assert {:ok, setting} = Settings.update_setting(setting, %{extractor_sleep_interval_seconds: 1})
assert %Ecto.Changeset{valid?: true} = Settings.change_setting(setting, %{extractor_sleep_interval_seconds: 0})
end
end
end
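A sketch of the changeset rule implied by these assertions (the helper name is an assumption): the interval must be zero or greater, with 0 doubling as the reset value that disables sleeping.

# Sketch: 0 and positive values pass, negative values are rejected.
defp validate_extractor_sleep_interval(changeset) do
  Ecto.Changeset.validate_number(changeset, :extractor_sleep_interval_seconds,
    greater_than_or_equal_to: 0
  )
end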


@ -96,26 +96,6 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
assert_raise Ecto.NoResultsError, fn -> Repo.reload!(task) end
end
test "deletes any pending media tasks for the source" do
source = source_fixture()
{:ok, job} = Oban.insert(FastIndexingWorker.new(%{"id" => source.id}))
task = task_fixture(source_id: source.id, job_id: job.id)
assert {:ok, _} = SlowIndexingHelpers.kickoff_indexing_task(source)
assert_raise Ecto.NoResultsError, fn -> Repo.reload!(task) end
end
test "deletes any fast indexing tasks for the source" do
source = source_fixture()
{:ok, job} = Oban.insert(FastIndexingWorker.new(%{"id" => source.id}))
task = task_fixture(source_id: source.id, job_id: job.id)
assert {:ok, _} = SlowIndexingHelpers.kickoff_indexing_task(source)
assert_raise Ecto.NoResultsError, fn -> Repo.reload!(task) end
end
test "can be called with additional job arguments" do
source = source_fixture(index_frequency_minutes: 1)
job_args = %{"force" => true}
@ -290,6 +270,29 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
assert %Ecto.Changeset{} = changeset
end
test "doesn't blow up if the media item cannot be saved", %{source: source} do
stub(YtDlpRunnerMock, :run, fn _url, :get_media_attributes_for_collection, _opts, _ot, _addl_opts ->
response =
Phoenix.json_library().encode!(%{
id: "video1",
# This is a disallowed title - see MediaItem changeset or issue #549
title: "youtube video #123",
original_url: "https://example.com/video1",
live_status: "not_live",
description: "desc1",
aspect_ratio: 1.67,
duration: 12.34,
upload_date: "20210101"
})
{:ok, response}
end)
assert [changeset] = SlowIndexingHelpers.index_and_enqueue_download_for_media_items(source)
assert %Ecto.Changeset{} = changeset
end
test "passes the source's download options to the yt-dlp runner", %{source: source} do
expect(YtDlpRunnerMock, :run, fn _url, :get_media_attributes_for_collection, opts, _ot, _addl_opts ->
assert {:output, "/tmp/test/media/%(title)S.%(ext)S"} in opts
@ -299,14 +302,27 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
SlowIndexingHelpers.index_and_enqueue_download_for_media_items(source)
end
end
describe "index_and_enqueue_download_for_media_items/2 when testing cookies" do
test "sets use_cookies if the source uses cookies" do
expect(YtDlpRunnerMock, :run, fn _url, :get_media_attributes_for_collection, _opts, _ot, addl_opts ->
assert {:use_cookies, true} in addl_opts
{:ok, source_attributes_return_fixture()}
end)
source = source_fixture(%{use_cookies: true})
source = source_fixture(%{cookie_behaviour: :all_operations})
SlowIndexingHelpers.index_and_enqueue_download_for_media_items(source)
end
test "sets use_cookies if the source uses cookies when needed" do
expect(YtDlpRunnerMock, :run, fn _url, :get_media_attributes_for_collection, _opts, _ot, addl_opts ->
assert {:use_cookies, true} in addl_opts
{:ok, source_attributes_return_fixture()}
end)
source = source_fixture(%{cookie_behaviour: :when_needed})
SlowIndexingHelpers.index_and_enqueue_download_for_media_items(source)
end
@ -317,7 +333,7 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
{:ok, source_attributes_return_fixture()}
end)
source = source_fixture(%{use_cookies: false})
source = source_fixture(%{cookie_behaviour: :disabled})
SlowIndexingHelpers.index_and_enqueue_download_for_media_items(source)
end
@ -452,7 +468,9 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
end
describe "index_and_enqueue_download_for_media_items when testing the download archive" do
test "a download archive is used if the source is a channel", %{source: source} do
test "a download archive is used if the source is a channel that has been indexed before" do
source = source_fixture(%{collection_type: :channel, last_indexed_at: now()})
expect(YtDlpRunnerMock, :run, fn _url, :get_media_attributes_for_collection, opts, _ot, _addl_opts ->
assert :break_on_existing in opts
assert Keyword.has_key?(opts, :download_archive)
@ -476,6 +494,19 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
SlowIndexingHelpers.index_and_enqueue_download_for_media_items(source)
end
test "a download archive is not used if the source has never been indexed before" do
source = source_fixture(%{collection_type: :channel, last_indexed_at: nil})
expect(YtDlpRunnerMock, :run, fn _url, :get_media_attributes_for_collection, opts, _ot, _addl_opts ->
refute :break_on_existing in opts
refute Keyword.has_key?(opts, :download_archive)
{:ok, source_attributes_return_fixture()}
end)
SlowIndexingHelpers.index_and_enqueue_download_for_media_items(source)
end
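Taken together with the forced-run case just below, the archive tests imply a decision along these lines (the helper name is an assumption, not the actual private function): the download archive and --break-on-existing only apply to channels that have been indexed at least once and whose current index run was not forced.

# Sketch of the condition the download-archive tests describe.
defp use_download_archive?(source, was_forced) do
  source.collection_type == :channel and
    not is_nil(source.last_indexed_at) and
    not was_forced
end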
test "a download archive is not used if the index has been forced to run" do
source = source_fixture(%{collection_type: :channel})
@ -489,7 +520,9 @@ defmodule Pinchflat.SlowIndexing.SlowIndexingHelpersTest do
SlowIndexingHelpers.index_and_enqueue_download_for_media_items(source, was_forced: true)
end
test "the download archive is formatted correctly and contains the right video", %{source: source} do
test "the download archive is formatted correctly and contains the right video" do
source = source_fixture(%{collection_type: :channel, last_indexed_at: now()})
media_items =
1..21
|> Enum.map(fn n ->


@ -60,6 +60,33 @@ defmodule Pinchflat.SourcesTest do
end
end
describe "use_cookies?/2" do
test "returns true if the source has been set to use cookies" do
source = source_fixture(%{cookie_behaviour: :all_operations})
assert Sources.use_cookies?(source, :downloading)
end
test "returns false if the source has not been set to use cookies" do
source = source_fixture(%{cookie_behaviour: :disabled})
refute Sources.use_cookies?(source, :downloading)
end
test "returns true if the action is indexing and the source is set to :when_needed" do
source = source_fixture(%{cookie_behaviour: :when_needed})
assert Sources.use_cookies?(source, :indexing)
end
test "returns false if the action is downloading and the source is set to :when_needed" do
source = source_fixture(%{cookie_behaviour: :when_needed})
refute Sources.use_cookies?(source, :downloading)
end
test "returns true if the action is error_recovery and the source is set to :when_needed" do
source = source_fixture(%{cookie_behaviour: :when_needed})
assert Sources.use_cookies?(source, :error_recovery)
end
end
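The use_cookies?/2 behaviour exercised here maps cleanly onto pattern-matched clauses. A sketch of one possible shape (the real function may be structured differently):

# Sketch: :all_operations always uses cookies, :disabled never does, and
# :when_needed only uses them for indexing and error recovery.
def use_cookies?(%Source{cookie_behaviour: :all_operations}, _action), do: true
def use_cookies?(%Source{cookie_behaviour: :disabled}, _action), do: false

def use_cookies?(%Source{cookie_behaviour: :when_needed}, action) do
  action in [:indexing, :error_recovery]
end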
describe "list_sources/0" do
test "it returns all sources" do
source = source_fixture()
@ -294,6 +321,34 @@ defmodule Pinchflat.SourcesTest do
assert_enqueued(worker: MediaCollectionIndexingWorker, args: %{"id" => source.id})
end
test "creation will schedule a fast indexing job if the fast_index option is set" do
expect(YtDlpRunnerMock, :run, &channel_mock/5)
valid_attrs = %{
media_profile_id: media_profile_fixture().id,
original_url: "https://www.youtube.com/channel/abc123",
fast_index: true
}
assert {:ok, %Source{} = source} = Sources.create_source(valid_attrs)
assert_enqueued(worker: FastIndexingWorker, args: %{"id" => source.id})
end
test "creation will not schedule a fast indexing job if the fast_index option is not set" do
expect(YtDlpRunnerMock, :run, &channel_mock/5)
valid_attrs = %{
media_profile_id: media_profile_fixture().id,
original_url: "https://www.youtube.com/channel/abc123",
fast_index: false
}
assert {:ok, %Source{}} = Sources.create_source(valid_attrs)
refute_enqueued(worker: FastIndexingWorker)
end
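A sketch of the post-commit scheduling these two tests describe; the helper name is an assumption, while the Oban call mirrors the fixture setup used elsewhere in this diff.

# Sketch: only enqueue the fast indexing job when the source opts in.
defp maybe_enqueue_fast_indexing(%Source{fast_index: true} = source) do
  Oban.insert(FastIndexingWorker.new(%{"id" => source.id}))
end

defp maybe_enqueue_fast_indexing(_source), do: :ok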
test "creation schedules an index test even if the index frequency is 0" do
expect(YtDlpRunnerMock, :run, &channel_mock/5)
@ -354,7 +409,72 @@ defmodule Pinchflat.SourcesTest do
end
end
describe "create_source/2 when testing options" do
describe "create_source/2 when testing yt-dlp options" do
test "sets use_cookies to true if the source has been set to use cookies" do
expect(YtDlpRunnerMock, :run, fn _url, :get_source_details, _opts, _ot, addl ->
assert Keyword.get(addl, :use_cookies)
{:ok, playlist_return()}
end)
valid_attrs = %{
media_profile_id: media_profile_fixture().id,
original_url: "https://www.youtube.com/channel/abc123",
cookie_behaviour: :all_operations
}
assert {:ok, %Source{}} = Sources.create_source(valid_attrs)
end
test "does not set use_cookies if the source uses cookies when needed" do
expect(YtDlpRunnerMock, :run, fn _url, :get_source_details, _opts, _ot, addl ->
refute Keyword.get(addl, :use_cookies)
{:ok, playlist_return()}
end)
valid_attrs = %{
media_profile_id: media_profile_fixture().id,
original_url: "https://www.youtube.com/channel/abc123",
cookie_behaviour: :when_needed
}
assert {:ok, %Source{}} = Sources.create_source(valid_attrs)
end
test "does not set use_cookies if the source has not been set to use cookies" do
expect(YtDlpRunnerMock, :run, fn _url, :get_source_details, _opts, _ot, addl ->
refute Keyword.get(addl, :use_cookies)
{:ok, playlist_return()}
end)
valid_attrs = %{
media_profile_id: media_profile_fixture().id,
original_url: "https://www.youtube.com/channel/abc123",
cookie_behaviour: :disabled
}
assert {:ok, %Source{}} = Sources.create_source(valid_attrs)
end
test "skips sleep interval" do
expect(YtDlpRunnerMock, :run, fn _url, :get_source_details, _opts, _ot, addl ->
assert Keyword.get(addl, :skip_sleep_interval)
{:ok, playlist_return()}
end)
valid_attrs = %{
media_profile_id: media_profile_fixture().id,
original_url: "https://www.youtube.com/channel/abc123"
}
assert {:ok, %Source{}} = Sources.create_source(valid_attrs)
end
end
describe "create_source/2 when testing its options" do
test "run_post_commit_tasks: false won't enqueue post-commit tasks" do
expect(YtDlpRunnerMock, :run, &channel_mock/5)
@ -902,28 +1022,30 @@ defmodule Pinchflat.SourcesTest do
end
defp playlist_mock(_url, :get_source_details, _opts, _ot, _addl) do
{
:ok,
Phoenix.json_library().encode!(%{
channel: nil,
channel_id: nil,
playlist_id: "some_playlist_id_#{:rand.uniform(1_000_000)}",
playlist_title: "some playlist name"
})
}
{:ok, playlist_return()}
end
defp channel_mock(_url, :get_source_details, _opts, _ot, _addl) do
{:ok, channel_return()}
end
defp playlist_return do
Phoenix.json_library().encode!(%{
channel: nil,
channel_id: nil,
playlist_id: "some_playlist_id_#{:rand.uniform(1_000_000)}",
playlist_title: "some playlist name"
})
end
defp channel_return do
channel_id = "some_channel_id_#{:rand.uniform(1_000_000)}"
{
:ok,
Phoenix.json_library().encode!(%{
channel: "some channel name",
channel_id: channel_id,
playlist_id: channel_id,
playlist_title: "some channel name - videos"
})
}
Phoenix.json_library().encode!(%{
channel: "some channel name",
channel_id: channel_id,
playlist_id: channel_id,
playlist_title: "some channel name - videos"
})
end
end


@ -47,4 +47,21 @@ defmodule Pinchflat.Utils.NumberUtilsTest do
assert NumberUtils.human_byte_size(nil) == {0, "B"}
end
end
describe "add_jitter/2" do
test "returns 0 when the number is less than or equal to 0" do
assert NumberUtils.add_jitter(0) == 0
assert NumberUtils.add_jitter(-1) == 0
end
test "returns the number with jitter added" do
assert NumberUtils.add_jitter(100) in 100..150
end
test "optionally takes a jitter percentage" do
assert NumberUtils.add_jitter(100, 0.1) in 90..110
assert NumberUtils.add_jitter(100, 0.5) in 50..150
assert NumberUtils.add_jitter(100, 1) in 0..200
end
end
end
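The assertions above are loose enough to admit several implementations; one sketch that satisfies all of them treats the percentage (default assumed to be 0.5) as an upper bound on extra delay that is only ever added, never subtracted.

# Sketch: non-positive inputs return 0; otherwise add a random 0..(number * percentage).
def add_jitter(number, percentage \\ 0.5)
def add_jitter(number, _percentage) when number <= 0, do: 0

def add_jitter(number, percentage) do
  max_jitter = round(number * percentage)

  # :rand.uniform/1 returns 1..n, so shift it down to cover 0..max_jitter.
  number + :rand.uniform(max_jitter + 1) - 1
end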


@ -33,4 +33,14 @@ defmodule Pinchflat.Utils.StringUtilsTest do
assert StringUtils.double_brace("hello") == "{{ hello }}"
end
end
describe "wrap_string/1" do
test "returns strings as-is" do
assert StringUtils.wrap_string("hello") == "hello"
end
test "returns other values as inspected strings" do
assert StringUtils.wrap_string(1) == "1"
end
end
end
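A minimal sketch of what these two tests imply: binaries pass through untouched and everything else goes through inspect/1.

# Sketch: strings are returned as-is, other terms are inspected.
def wrap_string(value) when is_binary(value), do: value
def wrap_string(value), do: inspect(value)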


@ -1,6 +1,7 @@
defmodule Pinchflat.YtDlp.CommandRunnerTest do
use Pinchflat.DataCase
alias Pinchflat.Settings
alias Pinchflat.Utils.FilesystemUtils
alias Pinchflat.YtDlp.CommandRunner, as: Runner
@ -95,6 +96,52 @@ defmodule Pinchflat.YtDlp.CommandRunnerTest do
end
end
describe "run/4 when testing rate limit options" do
test "includes sleep interval options by default" do
Settings.set(extractor_sleep_interval_seconds: 5)
assert {:ok, output} = Runner.run(@media_url, :foo, [], "")
assert String.contains?(output, "--sleep-interval")
assert String.contains?(output, "--sleep-requests")
assert String.contains?(output, "--sleep-subtitles")
end
test "doesn't include sleep interval options when skip_sleep_interval is true" do
assert {:ok, output} = Runner.run(@media_url, :foo, [], "", skip_sleep_interval: true)
refute String.contains?(output, "--sleep-interval")
refute String.contains?(output, "--sleep-requests")
refute String.contains?(output, "--sleep-subtitles")
end
test "doesn't include sleep interval options when extractor_sleep_interval_seconds is 0" do
Settings.set(extractor_sleep_interval_seconds: 0)
assert {:ok, output} = Runner.run(@media_url, :foo, [], "")
refute String.contains?(output, "--sleep-interval")
refute String.contains?(output, "--sleep-requests")
refute String.contains?(output, "--sleep-subtitles")
end
test "includes limit_rate option when specified" do
Settings.set(download_throughput_limit: "100K")
assert {:ok, output} = Runner.run(@media_url, :foo, [], "")
assert String.contains?(output, "--limit-rate 100K")
end
test "doesn't include limit_rate option when download_throughput_limit is nil" do
Settings.set(download_throughput_limit: nil)
assert {:ok, output} = Runner.run(@media_url, :foo, [], "")
refute String.contains?(output, "--limit-rate")
end
end
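A hedged sketch of how the runner might turn these settings into yt-dlp flags, assuming the project's convention that keyword options are rendered as --flag-name arguments. The helper name is made up, and Settings.get/1 is assumed to return {:ok, value} as elsewhere in this diff.

# Sketch: build the sleep/rate-limit options appended to every yt-dlp command.
defp rate_limit_options(addl_opts) do
  {:ok, sleep_interval} = Settings.get(:extractor_sleep_interval_seconds)
  {:ok, throughput_limit} = Settings.get(:download_throughput_limit)

  sleep_opts =
    if Keyword.get(addl_opts, :skip_sleep_interval, false) || (sleep_interval || 0) <= 0 do
      []
    else
      [
        sleep_interval: sleep_interval,
        sleep_requests: sleep_interval,
        sleep_subtitles: sleep_interval
      ]
    end

  limit_opts = if throughput_limit, do: [limit_rate: throughput_limit], else: []

  sleep_opts ++ limit_opts
end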
describe "run/4 when testing global options" do
test "creates windows-safe filenames" do
assert {:ok, output} = Runner.run(@media_url, :foo, [], "")
@ -115,6 +162,24 @@ defmodule Pinchflat.YtDlp.CommandRunnerTest do
end
end
describe "run/4 when testing misc options" do
test "includes --restrict-filenames when enabled" do
Settings.set(restrict_filenames: true)
assert {:ok, output} = Runner.run(@media_url, :foo, [], "")
assert String.contains?(output, "--restrict-filenames")
end
test "doesn't include --restrict-filenames when disabled" do
Settings.set(restrict_filenames: false)
assert {:ok, output} = Runner.run(@media_url, :foo, [], "")
refute String.contains?(output, "--restrict-filenames")
end
end
describe "version/0" do
test "adds the version arg" do
assert {:ok, output} = Runner.version()
@ -123,6 +188,14 @@ defmodule Pinchflat.YtDlp.CommandRunnerTest do
end
end
describe "update/0" do
test "adds the update arg" do
assert {:ok, output} = Runner.update()
assert String.contains?(output, "--update")
end
end
defp wrap_executable(new_executable, fun) do
Application.put_env(:pinchflat, :yt_dlp_executable, new_executable)
fun.()


@ -269,7 +269,7 @@ defmodule Pinchflat.YtDlp.MediaTest do
response = %{
"original_url" => "https://www.youtube.com/watch?v=TiZPUDkDYbk",
"aspect_ratio" => 0.5,
"duration" => 59,
"duration" => 150,
"upload_date" => "20210101"
}


@ -0,0 +1,24 @@
defmodule Pinchflat.YtDlp.UpdateWorkerTest do
use Pinchflat.DataCase
alias Pinchflat.Settings
alias Pinchflat.YtDlp.UpdateWorker
describe "perform/1" do
test "calls the yt-dlp runner to update yt-dlp" do
expect(YtDlpRunnerMock, :update, fn -> {:ok, ""} end)
expect(YtDlpRunnerMock, :version, fn -> {:ok, ""} end)
perform_job(UpdateWorker, %{})
end
test "saves the new version to the database" do
expect(YtDlpRunnerMock, :update, fn -> {:ok, ""} end)
expect(YtDlpRunnerMock, :version, fn -> {:ok, "1.2.3"} end)
perform_job(UpdateWorker, %{})
assert {:ok, "1.2.3"} = Settings.get(:yt_dlp_version)
end
end
end
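A plausible shape for the worker these two tests exercise. This is a sketch: the queue name and the :yt_dlp_runner application-env key are assumptions, mirroring how other backends in this diff are swapped for mocks; only the update/version/Settings.set flow is taken from the tests themselves.

# Sketch of the update worker: run yt-dlp's self-update, then record the
# reported version in settings.
defmodule Pinchflat.YtDlp.UpdateWorkerSketch do
  use Oban.Worker, queue: :local_metadata

  alias Pinchflat.Settings

  @impl Oban.Worker
  def perform(%Oban.Job{}) do
    runner = Application.get_env(:pinchflat, :yt_dlp_runner, Pinchflat.YtDlp.CommandRunner)

    {:ok, _output} = runner.update()
    {:ok, version} = runner.version()

    Settings.set(yt_dlp_version: version)

    :ok
  end
end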


@ -75,6 +75,16 @@ defmodule PinchflatWeb.Sources.SourceLive.IndexTableLiveTest do
assert render_element(view, "tbody tr:first-child") =~ source1.custom_name
assert render_element(view, "tbody tr:last-child") =~ source2.custom_name
end
test "name is sorted without case sensitivity", %{conn: conn} do
source1 = source_fixture(custom_name: "Source_B")
source2 = source_fixture(custom_name: "source_a")
{:ok, view, _html} = live_isolated(conn, IndexTableLive, session: create_session())
assert render_element(view, "tbody tr:first-child") =~ source2.custom_name
assert render_element(view, "tbody tr:last-child") =~ source1.custom_name
end
end
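Case-insensitive ordering like the assertion above is typically handled in the query layer; a sketch using Ecto (the exact query the table component runs is an assumption):

# Sketch: sort sources by the lower-cased custom name so "source_a" precedes "Source_B".
import Ecto.Query

def sources_sorted_by_name do
  from s in Pinchflat.Sources.Source,
    order_by: [asc: fragment("LOWER(?)", s.custom_name)]
end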
describe "when testing pagination" do


@ -10,8 +10,6 @@ Application.put_env(:pinchflat, :http_client, HTTPClientMock)
Mox.defmock(UserScriptRunnerMock, for: Pinchflat.Lifecycle.UserScripts.UserScriptCommandRunner)
Application.put_env(:pinchflat, :user_script_runner, UserScriptRunnerMock)
if System.get_env("EX_CHECK"), do: Code.put_compiler_option(:warnings_as_errors, true)
ExUnit.start()
Ecto.Adapters.SQL.Sandbox.mode(Pinchflat.Repo, :manual)
Faker.start()


@ -18,6 +18,7 @@
{:sobelow, "mix sobelow --config"},
{:prettier_formatting, "yarn run lint:check", fix: "yarn run lint:fix"},
{:npm_test, false},
{:gettext, false},
{:ex_unit, env: %{"MIX_ENV" => "test", "EX_CHECK" => "1"}}
## curated tools may be disabled (e.g. the check for compilation warnings)