Compare commits


1709 commits
v0.2.1 ... main

Author SHA1 Message Date
SergeantPanda
8521df94ad
Merge pull request #868 from DawtCom:main
Move caching to client to remove burden on dispatch server
2026-01-18 17:26:49 -06:00
DawtCom
c970cfcf9a Move caching to client to remove burden on dispatch server 2026-01-18 00:49:17 -06:00
SergeantPanda
fe60c4f3bc Enhancement: Update frontend tests workflow to ensure proper triggering on push and pull request events only when frontend code changes.
2026-01-17 18:30:13 -06:00
SergeantPanda
7cf7aecdf2
Merge pull request #857 from Dispatcharr/dev
Dev
2026-01-15 09:05:06 -06:00
SergeantPanda
54644df9a3 Test Fix: Fixed SettingsUtils frontend tests for new grouped settings architecture. Updated test suite to properly verify grouped JSON settings (stream_settings, dvr_settings, etc.) instead of individual CharField settings, including tests for type conversions, array-to-CSV transformations, and special handling of proxy_settings and network_access. Frontend tests GitHub workflow now uses Node.js 24 (matching Dockerfile) and runs on both main and dev branch pushes and pull requests for comprehensive CI coverage.
2026-01-15 08:55:38 -06:00
SergeantPanda
38fa0fe99d Bug Fix: Fixed NumPy baseline detection in Docker entrypoint. Now calls numpy.show_config() directly with case-insensitive grep instead of incorrectly wrapping the output.
2026-01-14 17:10:06 -06:00
SergeantPanda
a772f5c353 changelog: Update missed close on 0.17.0 changelog.
2026-01-13 17:03:30 -06:00
GitHub Actions
da186bcb9d Release v0.17.0
2026-01-13 22:48:13 +00:00
SergeantPanda
75df00e329
Merge pull request #849 from Dispatcharr/dev
Version 0.17.0
2026-01-13 16:45:33 -06:00
SergeantPanda
d0ed682b3d Bug Fix: Fixed bulk channel profile membership update endpoint silently ignoring channels without existing membership records. The endpoint now creates missing memberships automatically (matching single-channel endpoint behavior), validates that all channel IDs exist before processing, and provides detailed response feedback including counts of updated vs. created memberships. Added comprehensive Swagger documentation with request/response schemas.
2026-01-13 15:43:44 -06:00
SergeantPanda
60955a39c7 changelog: Update changelog for settings refactor. 2026-01-13 14:35:01 -06:00
SergeantPanda
6c15ae940d changelog: Update changelog for PR 837 2026-01-13 14:30:25 -06:00
SergeantPanda
516d0e02aa
Merge pull request #837 from mdellavo/bulk-update-fix 2026-01-13 14:26:04 -06:00
Marc DellaVolpe
6607cef5d4 fix bulk update error on unmatched first entry, add tests to cover bulk update api 2026-01-13 15:05:40 -05:00
SergeantPanda
2f9b544519
Merge pull request #848 from Dispatcharr/settings-refactor
Refactor CoreSettings to use JSONField for value storage and update r…
2026-01-13 13:34:50 -06:00
SergeantPanda
36967c10ce Refactor CoreSettings to use JSONField for value storage and update related logic for proper type handling. Adjusted serializers and forms to accommodate new data structure, ensuring seamless integration across the application. 2026-01-13 12:18:34 -06:00
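A minimal sketch of why this refactor matters (standard library only; the model and field names in the real code are not shown here): a CharField flattens every setting to a string, while a JSONField round-trips booleans, integers, and lists with their types intact.

```python
import json

# CharField-style storage: the value arrives back as a string and every
# caller has to guess how to cast it.
char_stored = str(True)  # "True" -- type information lost

# JSONField-style storage: types survive the round trip.
json_stored = json.dumps({"enabled": True, "max_streams": 4})
settings = json.loads(json_stored)

assert char_stored == "True"
assert settings["enabled"] is True      # still a bool
assert settings["max_streams"] == 4     # still an int
```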
SergeantPanda
4bfdd15b37 Bug Fix: Fixed PostgreSQL backup restore not completely cleaning database before restoration. The restore process now drops and recreates the entire public schema before running pg_restore, ensuring a truly clean restore that removes all tables, functions, and other objects not present in the backup file. This prevents leftover database objects from persisting when restoring backups from older branches or versions. Added --no-owner flag to pg_restore to avoid role permission errors when the backup was created by a different PostgreSQL user.
2026-01-12 16:38:20 -06:00
SergeantPanda
2a3d0db670 Enhancement: Loading feedback for all confirmation dialogs: Extended visual loading indicators across all confirmation dialogs throughout the application. Delete, cleanup, and bulk operation dialogs now show an animated dots loader and disabled state during async operations, providing consistent user feedback for backups (restore/delete), channels, EPGs, logos, VOD logos, M3U accounts, streams, users, groups, filters, profiles, batch operations, and network access changes.
2026-01-12 13:53:44 -06:00
SergeantPanda
43636a84d0 Enhancement: Added visual loading indicator to the backup restore confirmation dialog. When clicking "Restore", the button now displays an animated dots loader and becomes disabled, providing clear feedback that the restore operation is in progress. 2026-01-12 13:22:24 -06:00
SergeantPanda
6d5d16d667 Enhancement: Add check for existing NumPy baseline support before reinstalling legacy NumPy to avoid unnecessary installations. 2026-01-12 12:29:54 -06:00
SergeantPanda
f821dabe8e Enhancement: Users can now rename existing channel profiles and create duplicates with automatic channel membership cloning. Each profile action (edit, duplicate, delete) is available in the profile dropdown for quick access. 2026-01-12 11:29:33 -06:00
SergeantPanda
564dceb210 Bug fix: Fixed TV Guide loading overlay not disappearing after navigating from DVR page. The fetchRecordings() function in the channels store was setting isLoading: true on start but never resetting it to false on successful completion, causing the Guide page's loading overlay to remain visible indefinitely when accessed after the DVR page.
2026-01-11 20:51:26 -06:00
SergeantPanda
2e9280cf59
Merge pull request #834 from justinforlenza/stream-profile-argument-parse
Fix: Support quoted stream profile arguments
2026-01-11 20:27:15 -06:00
SergeantPanda
7594ba0a08 changelog: Update changelog for command line parsing PR. 2026-01-11 20:24:59 -06:00
Justin
e8d949db86 replaced standard str.split() with shlex.split() 2026-01-11 20:40:18 -05:00
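The difference the commit above relies on can be shown in a few lines: `str.split()` breaks quoted arguments apart, while `shlex.split()` honors shell-style quoting.

```python
import shlex

# A stream profile command line with a quoted user agent.
cmd = '-i {url} -user_agent "Mozilla/5.0 (X11; Linux)" -f mpegts'

naive = cmd.split()        # splits inside the quotes
parsed = shlex.split(cmd)  # keeps the quoted value as one argument

assert '"Mozilla/5.0' in naive
assert "Mozilla/5.0 (X11; Linux)" in parsed
```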
SergeantPanda
a9a433bc5b changelog: Update changelog for test PR submitted. 2026-01-11 19:24:18 -06:00
SergeantPanda
e72e0215cb
Merge pull request #841 from nick4810/tests/frontend-unit-tests
Tests/frontend unit tests
2026-01-11 19:20:37 -06:00
SergeantPanda
b8374fcc68 Refactor/Enhancement: Refactored channel numbering dialogs into a unified CreateChannelModal component that now includes channel profile selection alongside channel number assignment for both single and bulk channel creation. Users can choose to add channels to all profiles, no profiles, or specific profiles with mutual exclusivity between special options ("All Profiles", "None") and specific profile selections. Profile selection defaults to the current table filter for intuitive workflow. 2026-01-11 19:05:07 -06:00
SergeantPanda
6b873be3cf Bug Fix: Fixed bulk and manual channel creation not refreshing channel profile memberships in the UI for all connected clients. WebSocket channels_created event now calls fetchChannelProfiles() to ensure profile membership updates are reflected in real-time for all users without requiring a page refresh. 2026-01-11 17:51:00 -06:00
SergeantPanda
edfa497203 Enhancement: Channel Profile membership control for manual channel creation and bulk operations: Extended the existing channel_profile_ids parameter from POST /api/channels/from-stream/ to also support POST /api/channels/ (manual creation) and bulk creation tasks with the same flexible semantics:
- Omitted parameter (default): Channels are added to ALL profiles (preserves backward compatibility)
- Empty array `[]`: Channels are added to NO profiles
- Sentinel value `[0]`: Channels are added to ALL profiles (explicit)
- Specific IDs `[1, 2, ...]`: Channels are added only to the specified profiles

This allows API consumers to control profile membership across all channel creation methods without requiring all channels to be added to every profile by default.
2026-01-11 17:31:15 -06:00
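The four cases above can be sketched as a small resolver (illustrative only; the real logic lives in the Django views and bulk-creation tasks):

```python
def resolve_profile_ids(channel_profile_ids, all_profile_ids):
    """Map the channel_profile_ids parameter to the set of target profiles."""
    if channel_profile_ids is None:   # omitted -> all profiles (default)
        return set(all_profile_ids)
    if channel_profile_ids == [0]:    # sentinel -> all profiles (explicit)
        return set(all_profile_ids)
    # [] -> no profiles; [1, 2, ...] -> only those that actually exist
    return set(channel_profile_ids) & set(all_profile_ids)

all_ids = [1, 2, 3]
assert resolve_profile_ids(None, all_ids) == {1, 2, 3}
assert resolve_profile_ids([0], all_ids) == {1, 2, 3}
assert resolve_profile_ids([], all_ids) == set()
assert resolve_profile_ids([2], all_ids) == {2}
```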
Nick Sandstrom
0242eb69ee Updated tests for mocked regex 2026-01-10 20:22:36 -08:00
Nick Sandstrom
93f74c9d91 Squashed commit of the following:
commit df18a89d0562edc8fd8fb5bc4cac702aefb5272c
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sat Jan 10 19:18:23 2026 -0800

    Updated tests

commit 90240344b89717fbad0e16fe209dbf00c567b1a8
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sun Jan 4 03:18:41 2026 -0800

    Updated tests

commit 525b7cb32bc8d235613706d6795795a0177ea24b
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sun Jan 4 03:18:31 2026 -0800

    Extracted component and util logic

commit e54ea2c3173c0ce3cfb0a2d70d76fdd0a66accc8
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Wed Dec 31 11:55:40 2025 -0800

    Updated tests

commit 5cbe164cb9818d8eab607af037da5faee2c1556f
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Wed Dec 31 11:55:14 2025 -0800

    Minor changes

    Exporting UiSettingsForm as default
    Reverted admin level type check

commit f9ab0d2a06091a2eed3ee6f34268c81bfd746f1e
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Tue Dec 30 23:31:29 2025 -0800

    Extracted component and util logic

commit a705a4db4a32d0851d087a984111837a0a83f722
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sun Dec 28 00:47:29 2025 -0800

    Updated tests

commit a72c6720a3980d0f279edf050b6b51eaae11cdbd
Merge: e8dcab6f 43525ca3
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sun Dec 28 00:04:24 2025 -0800

    Merge branch 'enhancement/component-cleanup' into test/component-cleanup

commit e8dcab6f832570cb986f114cfa574db4994b3aab
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sat Dec 27 22:35:59 2025 -0800

    Updated tests

commit 0fd230503844fba0c418ab0a03c46dc878697a55
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sat Dec 27 22:35:53 2025 -0800

    Added plugins store

commit d987f2de72272f24e26b1ed5bc04bb5c83033868
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sat Dec 27 22:35:43 2025 -0800

    Extracted component and util logic

commit 5a3138370a468a99c9f1ed0a36709a173656d809
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Wed Dec 24 23:13:07 2025 -0800

    Lazy-loading button modals

commit ac6945b5b55e0e16d050d4412a20c82f19250c4b
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Wed Dec 24 22:41:51 2025 -0800

    Extracted notification util

commit befe159fc06b67ee415f7498b5400fee0dc82528
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Wed Dec 24 22:28:12 2025 -0800

    Extracted component and util logic

commit ec10a3a4200a0c94cae29691a9fe06e5c4317bb7
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Wed Dec 24 22:22:09 2025 -0800

    Updated tests

commit c1c7214c8589c0ce7645ea24418d9dd978ac8c1f
Merge: eba6dce7 9c9cbab9
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Tue Dec 23 12:41:25 2025 -0800

    Merge branch 'enhancement/component-cleanup' into test/component-cleanup

commit eba6dce786495e352d4696030500db41d028036e
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sun Dec 21 10:12:19 2025 -0800

    Updated style props

commit 2024b0b267b849a5f100e5543b9188e8ad6dd3d9
Merge: b3700956 1029eb5b
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Sun Dec 21 09:27:21 2025 -0800

    Merge branch 'enhancement/component-cleanup' into test/component-cleanup

commit b3700956a4c2f473f1e977826f9537d27ea018ae
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Thu Dec 18 07:45:36 2025 -0800

    Reverted Channels change

commit 137cbb02473b7f2f41488601e3b64e5ff45ac656
Merge: 644ed001 2a0df81c
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Wed Dec 17 13:36:05 2025 -0800

    Merge branch 'enhancement/component-cleanup' into test/component-cleanup

commit 644ed00196c41eaa44df1b98236b7e5cc3124d82
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Wed Dec 17 13:29:13 2025 -0800

    Updated tests

commit c62d1bd0534aa19be99b8f87232ba872420111a0
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Tue Dec 16 14:12:31 2025 -0800

    Updated tests

commit 0cc0ee31d5ad84c59d8eba9fc4424f118f5e0ee2
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Tue Dec 16 13:44:55 2025 -0800

    Extracted component and util logic

commit 25d1b112af250b5ccebb1006511bff8e4387fc76
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Tue Dec 16 13:44:11 2025 -0800

    Added correct import for Text component

commit d8a04c6c09edf158220d3073939c9fb60069745c
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Tue Dec 16 13:43:55 2025 -0800

    Fixed component syntax

commit 59e35d3a4d0da8ed8476560cedacadf76162ea43
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Tue Dec 16 13:43:39 2025 -0800

    Fixed cache_url fallback

commit d2a170d2efd3d2b0e6078c9eebeb8dcea237be3b
Merge: b8f7e435 6c1b0f9a
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Tue Dec 16 12:00:45 2025 -0800

    Merge branch 'enhancement/component-cleanup' into test/component-cleanup

commit b8f7e4358a23f2e3a902929b57ab7a7d115241c5
Merge: 5b12c68a d97f0c90
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Mon Dec 15 07:42:06 2025 -0800

    Merge branch 'enhancement/component-cleanup' into test/component-cleanup

commit 5b12c68ab8ce429adc8d1355632aa411007d365b
Merge: eff58126 c63cb75b
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Mon Dec 8 16:56:14 2025 -0800

    Merge branch 'enhancement/unit-tests' into stage

commit eff58126fb6aba4ebe9a0c67eee65773bffb8ae9
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Mon Dec 8 16:49:43 2025 -0800

    Update .gitignore

commit c63cb75b8cad204d48a392a28d8a5bdf8c270496
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Mon Dec 8 16:28:03 2025 -0800

    Added unit tests for pages

commit 75306a6181ddeb2eaeb306387ba2b44c7fcfd5e3
Author: Nick Sandstrom <32273437+nick4810@users.noreply.github.com>
Date:   Mon Dec 8 16:27:19 2025 -0800

    Added Actions workflow
2026-01-10 19:36:23 -08:00
Nick Sandstrom
e2e6f61dee Merge remote-tracking branch 'upstream/dev' into tests/frontend-unit-tests 2026-01-10 19:35:40 -08:00
SergeantPanda
719a975210 Enhancement: Visual stale indicators for streams and groups: Added is_stale field to Stream and both is_stale and last_seen fields to ChannelGroupM3UAccount models to track items in their retention grace period. Stale groups display with orange buttons and a warning tooltip, while stale streams show with a red background color matching the visual treatment of empty channels.
2026-01-09 14:57:07 -06:00
SergeantPanda
a84553d15c Enhancement: Stale status indicators for streams and groups: Added is_stale field to both Stream and ChannelGroupM3UAccount models to track items in their grace period (seen in previous refresh but not current). 2026-01-09 13:53:01 -06:00
SergeantPanda
cc9d38212e Enhancement: Groups now follow the same stale retention logic as streams, using the account's stale_stream_days setting. Groups that temporarily disappear from an M3U source are retained for the configured retention period instead of being immediately deleted, preserving user settings and preventing data loss when providers temporarily remove/re-add groups. (Closes #809) 2026-01-09 12:03:55 -06:00
SergeantPanda
caf56a59f3 Bug Fix: Fixed manual channel creation not adding channels to channel profiles. Manually created channels are now added to the selected profile if one is active, or to all profiles if "All" is selected, matching the behavior of channels created from streams.
2026-01-09 10:41:04 -06:00
SergeantPanda
ba5aa861e3 Bug Fix: Fixed Channel Profile filter incorrectly applying profile membership filtering even when "Show Disabled" was enabled, preventing all channels from being displayed. Profile filter now only applies when hiding disabled channels. (Fixes #825) 2026-01-09 10:26:09 -06:00
SergeantPanda
312fa11cfb More cleanup of base image.
2026-01-08 14:53:25 -06:00
SergeantPanda
ad334347a9 More cleanup of base image.
2026-01-08 14:52:58 -06:00
SergeantPanda
74a9d3d0cb
Merge pull request #823 from patchy8736/uwsgi-socket-timeout
Bug Fix: Add socket-timeout to uWSGI production config to prevent VOD stream timeouts/streams vanishing from stats page
2026-01-08 13:36:18 -06:00
SergeantPanda
fa6315de33 changelog: Fix VOD streams disappearing from stats page during playback by updating uWSGI config to prevent premature cleanup 2026-01-08 13:35:38 -06:00
SergeantPanda
d6c1a2369b Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/patchy8736/823 2026-01-08 13:27:42 -06:00
SergeantPanda
72d9125c36
Merge pull request #811 from nick4810/enhancement/component-cleanup
Extracted component and util logic
2026-01-08 13:07:41 -06:00
SergeantPanda
6e74c370cb changelog: Document refactor of Stats and VOD pages for improved readability and maintainability 2026-01-08 13:06:30 -06:00
SergeantPanda
10447f8c86 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/nick4810/811 2026-01-08 11:50:39 -06:00
SergeantPanda
1a2d39de91 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2026-01-08 10:28:15 -06:00
SergeantPanda
f389420251 Add optional legacy NumPy support for older CPUs in Docker configurations 2026-01-08 10:27:58 -06:00
SergeantPanda
3f6eff96fc changelog: Update changelog for USE_LEGACY_NUMPY support 2026-01-08 10:26:08 -06:00
SergeantPanda
02faa1a4a7
Merge pull request #827 from Dispatcharr/numpy-none-baseline
Enhance Docker setup for legacy NumPy support and streamline installa…
2026-01-08 10:04:58 -06:00
SergeantPanda
c5a3a2af81 Enhance Docker setup for legacy NumPy support and streamline installation process
2026-01-08 10:02:29 -06:00
SergeantPanda
01370e8892 Bug fix: Fixed duplicate key constraint violations by treating TMDB/IMDB ID values of 0 or '0' as invalid (some providers use this to indicate "no ID"), converting them to NULL to prevent multiple items from incorrectly sharing the same ID. (Fixes #813)
2026-01-07 16:38:09 -06:00
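The normalization described above amounts to something like the following (a sketch; the function name is illustrative, not the actual code): values a provider uses to mean "no ID" become `None`, so a unique constraint sees NULL instead of many rows sharing the same `0`.

```python
def normalize_external_id(value):
    """Treat 0 / '0' / '' / None as 'no ID' so duplicates never share a fake ID."""
    if value in (0, "0", "", None):
        return None
    return str(value)

assert normalize_external_id(0) is None
assert normalize_external_id("0") is None
assert normalize_external_id("tt0133093") == "tt0133093"  # IMDB-style ID kept
assert normalize_external_id(603) == "603"                # TMDB-style ID kept
```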
SergeantPanda
8cbb55c44b Bug Fix: Fixed Channels table EPG column showing "Not Assigned" on initial load for users with large EPG datasets. Added tvgsLoaded flag to EPG store to track when EPG data has finished loading, ensuring the table waits for EPG data before displaying. EPG cells now show animated skeleton placeholders while loading instead of incorrectly showing "Not Assigned". (Fixes #810) 2026-01-07 16:08:53 -06:00
patchy8736
0441dd7b7e Bug Fix: Add socket-timeout to uWSGI production config to prevent VOD stream timeouts during client buffering
The production uWSGI configuration (docker/uwsgi.ini) was missing the socket-timeout directive, causing it to default to 4 seconds. When clients (e.g., VLC) buffer VOD streams and temporarily stop reading from the HTTP socket, uWSGI's write operations timeout after 4 seconds, triggering premature stream cleanup and causing VOD streams to disappear from the stats page.

The fix adds socket-timeout = 600 to match the existing http-timeout = 600 value, giving uWSGI sufficient time to wait for clients to resume reading from buffered sockets. This prevents:
- uwsgi_response_write_body_do() TIMEOUT !!! errors in logs
- GeneratorExit exceptions and premature stream cleanup
- VOD streams vanishing from the stats page when clients buffer

The debug config already had socket-timeout = 3600, which is why the issue wasn't observed in debug mode. This fix aligns production behavior with the debug config while maintaining the production-appropriate 10-minute timeout duration.
2026-01-07 14:10:17 +01:00
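Based on the description above, the relevant production fragment would look roughly like this (a sketch of `docker/uwsgi.ini`; only the two timeout values are taken from the commit message):

```ini
[uwsgi]
; Give buffering clients (e.g. VLC on VOD) time to resume reading
; before uWSGI aborts the response write.
http-timeout = 600
socket-timeout = 600
```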
SergeantPanda
30d093a2d3 Fixed bulk_create and bulk_update errors during VOD content refresh by pre-checking object existence with optimized bulk queries (3 queries total instead of N per batch) before creating new objects. This ensures all movie/series objects have primary keys before relation operations, preventing "prohibited to prevent data loss due to unsaved related object" errors. (Fixes #813)
2026-01-06 16:12:50 -06:00
SergeantPanda
518c93c398 Enhance Docker setup for legacy NumPy support and streamline installation process 2026-01-06 14:07:37 -06:00
SergeantPanda
cc09c89156
Merge pull request #812 from Dispatcharr/React-Hooke-Form
Refactor forms to use react-hook-form and Yup validation
2026-01-04 21:03:10 -06:00
Nick Sandstrom
21c0758cc9 Extracted component and util logic 2026-01-04 18:51:09 -08:00
SergeantPanda
f664910bf4 changelog: Add RHF and removeTrailingZeros changes. 2026-01-04 20:49:39 -06:00
SergeantPanda
bc19bf8629 Remove "removeTrailingZeros" prop from the Channel Edit Form 2026-01-04 20:45:52 -06:00
SergeantPanda
16bbc1d875 Refactor forms to use react-hook-form and Yup for validation
- Replaced Formik with react-hook-form in Logo, M3UGroupFilter, M3UProfile, Stream, StreamProfile, and UserAgent components.
- Integrated Yup for schema validation in all updated forms.
- Updated form submission logic to accommodate new form handling methods.
- Adjusted state management and error handling to align with react-hook-form's API.
- Ensured compatibility with existing functionality while improving code readability and maintainability.
2026-01-04 20:40:16 -06:00
SergeantPanda
9612a67412 Change: VOD upstream read timeout reduced from 30 seconds to 10 seconds to minimize lock hold time when clients disconnect during connection phase
2026-01-04 15:21:22 -06:00
SergeantPanda
4e65ffd113 Bug fix: Fixed VOD profile connection count not being decremented when stream connection fails (timeout, 404, etc.), preventing profiles from reaching capacity limits and rejecting valid stream requests 2026-01-04 15:00:08 -06:00
SergeantPanda
6031885537 Bug Fix: M3UMovieRelation.get_stream_url() and M3UEpisodeRelation.get_stream_url() to use XC client's _normalize_url() method instead of simple rstrip('/'). This properly handles malformed M3U account URLs (e.g., containing /player_api.php or query parameters) before constructing VOD stream endpoints, matching behavior of live channel URL building. (Closes #722) 2026-01-04 14:36:03 -06:00
SergeantPanda
8ae1a98a3b Bug Fix: Fixed onboarding message appearing in the Channels Table when filtered results are empty. The onboarding message now only displays when there are no channels created at all, not when channels exist but are filtered out by current filters. 2026-01-04 14:05:30 -06:00
SergeantPanda
48bdcfbd65 Bug fix: Release workflow Docker tagging: Fixed issue where latest and version tags (e.g., 0.16.0) were creating separate manifests instead of pointing to the same image digest, which caused old latest tags to become orphaned/untagged after new releases. Now creates a single multi-arch manifest with both tags, maintaining proper tag relationships and download statistics visibility on GitHub. 2026-01-04 12:05:01 -06:00
GitHub Actions
e151da27b9 Release v0.16.0
2026-01-04 01:15:46 +00:00
SergeantPanda
fdca1fd165
Merge pull request #803 from Dispatcharr/dev
Version 0.16.0
2026-01-03 19:15:10 -06:00
SergeantPanda
9cc90354ee changelog: Update changelog for region code addition.
2026-01-02 15:45:05 -06:00
SergeantPanda
62b6cfa2fb
Merge pull request #423 from bigpandaaaa/uk-region
Add 'UK' region
2026-01-02 15:39:27 -06:00
SergeantPanda
3f46f28a70 Bug Fix: Auto Channel Sync Force EPG Source feature not properly forcing "No EPG" assignment - When selecting "Force EPG Source" > "No EPG (Disabled)", channels were still being auto-matched to EPG data instead of forcing dummy/no EPG. Now correctly sets force_dummy_epg flag to prevent unwanted EPG assignment. (Fixes #788) 2026-01-02 15:22:25 -06:00
SergeantPanda
058de26bdf
Merge pull request #787 from nick4810/enhancement/component-cleanup
Enhancement/component cleanup
2026-01-02 13:56:08 -06:00
SergeantPanda
f51463162c
Merge pull request #794 from sethwv:dev
Fix root-owned __pycache__ by running Django commands as non-root user
2026-01-02 12:06:14 -06:00
SergeantPanda
0cb189acba changelog: Document Docker container file permissions update for Django management commands 2026-01-02 12:03:42 -06:00
SergeantPanda
3fe5ff9130 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/sethwv/794 2026-01-02 11:54:14 -06:00
SergeantPanda
131ebf9f55 changelog: Updated changelog for new refactor. 2026-01-02 11:29:01 -06:00
SergeantPanda
2ed784e8c4 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/nick4810/787 2026-01-02 11:27:22 -06:00
Nick Sandstrom
2e0aa90cd6 Merge remote-tracking branch 'upstream/dev' into enhancement/component-cleanup 2026-01-02 08:33:06 -08:00
SergeantPanda
a363d9f0e6
Merge pull request #796 from patchy8736:dev
Fix episode processing issues in VOD tasks (#770)
2026-01-02 10:13:48 -06:00
SergeantPanda
6a985d7a7d changelog: Update changelog for PR 2026-01-02 10:13:01 -06:00
SergeantPanda
1a67f3c8ec Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/patchy8736/796 2026-01-02 09:53:54 -06:00
SergeantPanda
6bd8a0c12d Enhance error logging for invalid season and episode numbers in batch_process_episodes 2026-01-02 09:53:45 -06:00
Nick Sandstrom
6678311fa7 Added loading overlay while programs are fetching 2026-01-02 02:03:50 -08:00
SergeantPanda
e8c9432f65 changelog: Update changelog for VOD category filtering.
2026-01-01 18:29:54 -06:00
SergeantPanda
33f988b2c6
Merge pull request #784 from Vitekant:fix/vod-category-pipe-parsing
Fix VOD category filtering for names containing pipe "|" characters
2026-01-01 18:28:07 -06:00
SergeantPanda
13e4b19960 changelog: Add change for settings/logo refactor. 2026-01-01 18:21:52 -06:00
SergeantPanda
042c34eecc
Merge pull request #795 from nick4810/enhancement/component-cleanup-logos-settings 2026-01-01 18:15:47 -06:00
SergeantPanda
ded785de54
Merge pull request #789 from nick4810/fix/standard-users-signal-logos-ready
2026-01-01 17:37:01 -06:00
patchy8736
c57f9fd7e7 Fix episode processing issues in VOD tasks
- Ensure season and episode numbers are properly converted to integers with error handling
- Remove zero-padding from debug log format for season/episode numbers
- Add validation to filter out relations with unsaved episodes that have no primary key
- Add proper logging for skipped relations when episode is not saved to database

These changes address potential crashes when the API returns string values instead of integers,
and prevent database errors when bulk-create operations fail silently due to conflicts.

Fixes issue #770
2026-01-01 15:57:27 +01:00
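The integer-coercion guard this commit describes can be sketched as follows; the function and argument names are illustrative, not Dispatcharr's actual code.

```python
# Hypothetical sketch of the season/episode coercion described above.
# Returns (None, None) when either value cannot be treated as an integer,
# so callers can skip the episode instead of crashing.
def coerce_episode_numbers(raw_season, raw_episode):
    try:
        return int(raw_season), int(raw_episode)
    except (TypeError, ValueError):
        # Providers sometimes send "", None, or non-numeric strings
        return None, None
```

Skipped relations can then be logged at the call site, matching the "proper logging for skipped relations" bullet above.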
Nick Sandstrom
b4b0774189 Including notification util changes 2025-12-31 13:20:09 -08:00
Nick Sandstrom
7b1a85617f Minor changes
Exporting UiSettingsForm as default
Reverted admin level type check
2025-12-31 13:12:24 -08:00
Nick Sandstrom
a6361a07d2 Extracted component and util logic 2025-12-31 13:12:24 -08:00
sethwv-alt
b157159b87
Fix root-owned __pycache__ by running Django commands as non-root user 2025-12-31 12:16:19 -05:00
Nick Sandstrom
d9fc0e68d6 Signaling ready when no StreamTable rendered 2025-12-29 22:18:42 -08:00
Nick Sandstrom
43525ca32a Moved RecordingList outside of DVRPage
Helps to prevent renders
2025-12-27 23:49:06 -08:00
Nick Sandstrom
ffa1331c3b Updated to use util functions 2025-12-27 23:17:42 -08:00
Nick Sandstrom
26d9dbd246 Added plugins store 2025-12-27 22:45:48 -08:00
Nick Sandstrom
f97399de07 Extracted component and util logic 2025-12-27 22:45:48 -08:00
Nick Sandstrom
a5688605cd Lazy-loading button modals 2025-12-27 22:45:48 -08:00
Nick Sandstrom
ca96adf781 Extracted notification util 2025-12-27 22:45:48 -08:00
Nick Sandstrom
61247a452a Extracted component and util logic 2025-12-27 22:45:48 -08:00
Nick Sandstrom
fda188e738 Updated style props 2025-12-27 22:45:48 -08:00
SergeantPanda
57a6a842b2 Bug Fix/Enhancement:
- M3U and EPG URLs now correctly preserve non-standard HTTPS ports (e.g., `:8443`) when accessed behind reverse proxies that forward the port in headers — `get_host_and_port()` now properly checks `X-Forwarded-Port` header before falling back to other detection methods (Fixes #704)
- M3U stream URLs now use `build_absolute_uri_with_port()` for consistency with EPG and logo URLs, ensuring uniform port handling across all M3U file URLs
2025-12-27 09:57:36 -06:00
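The header-precedence logic described above can be sketched like this; the real `get_host_and_port()` lives inside Dispatcharr and works on Django request objects, whereas this illustration assumes a plain dict of headers.

```python
# Simplified sketch: consult X-Forwarded-Port before any other detection so
# a non-standard HTTPS port (e.g. :8443) set by a reverse proxy survives.
def get_host_and_port(headers, default_port="9191"):
    host = headers.get("Host", "localhost").rsplit(":", 1)[0]
    port = headers.get("X-Forwarded-Port")
    if not port and ":" in headers.get("Host", ""):
        port = headers["Host"].rsplit(":", 1)[1]  # fall back to Host header
    return host, port or default_port
```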
SergeantPanda
f1c096bc94 Bug Fix: XtreamCodes M3U files now correctly set x-tvg-url and url-tvg headers to reference XC EPG URL (xmltv.php) instead of standard EPG endpoint when downloaded via XC API (Fixes #629) 2025-12-27 08:19:58 -06:00
Vitek
5a4be532fd Fix VOD category filtering for names containing pipe characters
Use rsplit('|', 1) instead of split('|', 1) to split from the right,
preserving any pipe characters in category names (e.g., "PL | BAJKI",
"EN | MOVIES"). This ensures the category_type is correctly extracted
as the last segment while keeping the full category name intact.

Fixes MovieFilter, SeriesFilter, and UnifiedContentViewSet category parsing.
2025-12-27 00:21:42 +01:00
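In isolation the change is a split-direction fix; the sketch below assumes a combined "name|type" value where only the final segment is the category type, as the commit describes.

```python
# rsplit('|', 1) splits at the LAST pipe, so pipes inside the category
# name ("PL | BAJKI") are preserved; split('|', 1) would cut at the first.
def split_category(value):
    name, ctype = value.rsplit("|", 1)
    return name.strip(), ctype.strip()
```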
SergeantPanda
cc3ed80e1a changelog: Add thanks for errorboundary.
2025-12-26 16:16:25 -06:00
SergeantPanda
af88756197
Merge pull request #761 from nick4810/enhancement/component-cleanup
Enhancement/component cleanup
2025-12-26 16:08:49 -06:00
SergeantPanda
1b1f360705 Enhancement: Channel number inputs in stream-to-channel creation modals no longer have a maximum value restriction, allowing users to enter any valid channel number supported by the database 2025-12-26 15:55:25 -06:00
SergeantPanda
bc3ef1a3a9 Bug Fix: M3U and EPG manager page no longer crashes when a playlist references a deleted channel group (Fixes screen blank on navigation)
2025-12-26 14:58:02 -06:00
SergeantPanda
81af73a086 changelog: Add entry for stream validation continuing GET request on HEAD failure 2025-12-26 13:53:48 -06:00
SergeantPanda
0abacf1fef
Merge pull request #783 from kvnnap/dev 2025-12-26 13:51:03 -06:00
SergeantPanda
36a39cd4de Bug fix: XtreamCodes EPG limit parameter now properly converted to integer to prevent type errors when accessing EPG listings (Fixes #781) 2025-12-26 13:34:53 -06:00
SergeantPanda
46413b7e3a changelog: Update changelog for code refactoring and logo changes. 2025-12-26 12:44:26 -06:00
SergeantPanda
874e981449 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/nick4810/761 2025-12-26 12:37:57 -06:00
SergeantPanda
f5c6d2b576 Enhancement: Implement event-driven logo loading orchestration on Channels page
Introduce gated logo loading system that ensures logos render after both
ChannelsTable and StreamsTable have completed their initial data fetch,
preventing visual race conditions and ensuring proper paint order.

Changes:
- Add `allowLogoRendering` flag to logos store to gate logo fetching
- Implement `onReady` callbacks in ChannelsTable and StreamsTable
- Add orchestration logic in Channels.jsx to coordinate table readiness
- Use double requestAnimationFrame to defer logo loading until after browser paint
- Remove background logo loading from App.jsx (now page-specific)
- Simplify fetchChannelAssignableLogos to reuse fetchAllLogos
- Remove logos dependency from ChannelsTable columns to prevent re-renders

This ensures visual loading order: Channels → EPG → Streams → Logos,
regardless of network speed or data size, without timer-based hacks.
2025-12-26 12:30:08 -06:00
Kevin Napoli
1ef5a9ca13
Fix: Continue GET request if HEAD fails with the peer closing the connection without returning a response 2025-12-26 15:27:51 +01:00
SergeantPanda
2d31eca93d changelog: Correct formatting of the thanks credit in the event viewer arrow direction fix entry
2025-12-24 16:15:23 -06:00
SergeantPanda
510c9fc617
Merge pull request #757 from sethwv/dev 2025-12-24 16:14:29 -06:00
SergeantPanda
8f63659ad7 changelog: Update changelog for VLC support 2025-12-24 16:11:52 -06:00
SergeantPanda
31b9868bfd Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/sethwv/757 2025-12-24 16:04:04 -06:00
SergeantPanda
da4597ac95 Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-12-24 15:49:20 -06:00
SergeantPanda
523a127c81 Enhancement: Add VLC dependencies to DispatcharrBase image.
2025-12-24 15:48:56 -06:00
SergeantPanda
ec3093d9af Changelog: Update changelog to include client IP display in network access warning modal 2025-12-24 15:43:33 -06:00
SergeantPanda
5481b18d8a
Merge pull request #779 from damien-alt-sudo/feature/ui-network-access-clientip 2025-12-24 15:39:31 -06:00
Damien
bfca663870 Feature: Add client_ip response from settings check api to UI 2025-12-24 19:30:03 +00:00
Damien
11b3320277 Feature: Add client_ip response from settings check API 2025-12-24 19:27:38 +00:00
SergeantPanda
44a122924f advanced filtering for hiding disabled channels and viewing only empty channels
(cherry picked from commit ea38c0b4b8)
Closes #182
2025-12-23 17:37:38 -06:00
SergeantPanda
48ebaffadd Cleanup dockerfile a bit. 2025-12-23 17:04:09 -06:00
SergeantPanda
daa919c764 Refactor logging messages in StreamManager for clarity and consistency. Also removed redundant parsing. 2025-12-23 15:52:56 -06:00
SergeantPanda
8f811f2ed3 Correct profile name casing for FFmpeg, Streamlink, and VLC in fixtures.json 2025-12-23 15:17:50 -06:00
SergeantPanda
ff7298a93e Enhance StreamManager for efficient log parsing and update VLC stream profile naming 2025-12-23 15:07:25 -06:00
Nick Sandstrom
9c9cbab94c Reverted lazy load of StreamsTable 2025-12-23 12:27:29 -08:00
SergeantPanda
904500906c Bug Fix: Update stream validation to return original URL instead of redirected URL when using redirect profile. 2025-12-23 09:51:02 -06:00
SergeantPanda
106ea72c9d Changelog: Fix event viewer arrow direction for corrected UI behavior 2025-12-22 17:38:55 -06:00
drnikcuk
eea84cfd8b
Update Stats.jsx (#773)
* Update Stats.jsx

Adds fix for stats control arrows direction swap
2025-12-22 17:33:26 -06:00
GitHub Actions
c7590d204e Release v0.15.1 2025-12-22 22:58:41 +00:00
SergeantPanda
7a0af3445a
Merge pull request #774 from Dispatcharr/dev
Version 0.15.1
2025-12-22 16:55:59 -06:00
SergeantPanda
18645fc08f Bug Fix: Re-apply failed merge to fix clients that don't have IPv6 support. 2025-12-22 16:39:09 -06:00
Seth Van Niekerk
aa5db6c3f4
Squash: Log Parsing Refactor & Enhancing 2025-12-22 15:14:46 -05:00
Nick Sandstrom
1029eb5b5c Table length checking if data is already set 2025-12-19 19:19:04 -08:00
SergeantPanda
ee183a9f75 Bug Fix: XtreamCodes EPG has_archive field now returns integer 0 instead of string "0" for proper JSON type consistency 2025-12-19 18:39:43 -06:00
nick4810
63daa3ddf2
Merge branch 'dev' into enhancement/component-cleanup 2025-12-19 16:35:26 -08:00
Nick Sandstrom
4cd63bc898 Reverted LoadingOverlay 2025-12-19 16:33:21 -08:00
GitHub Actions
05b62c22ad Release v0.15.0 2025-12-20 00:08:41 +00:00
SergeantPanda
2c12e8b872
Merge pull request #767 from Dispatcharr/dev
Version 0.15.0
2025-12-19 17:55:40 -06:00
SergeantPanda
20182c7ebf
Merge branch 'main' into dev 2025-12-19 17:53:06 -06:00
SergeantPanda
f0a9a3fc15 Bug Fix: Docker init script now validates DISPATCHARR_PORT is an integer before using it, preventing sed errors when Kubernetes sets it to a service URL like tcp://10.98.37.10:80. Falls back to default port 9191 when invalid (Fixes #737) 2025-12-19 17:00:30 -06:00
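The actual guard lives in the Docker init shell script; sketched in Python for illustration, the check reduces to "accept only a bare integer, otherwise use 9191":

```python
# Accept DISPATCHARR_PORT only when it is a plain integer; Kubernetes can
# inject a service URL like "tcp://10.98.37.10:80", which must be rejected.
def resolve_port(value, default=9191):
    return int(value) if value and value.isdigit() else default
```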
nick4810
097551ccf7
Merge branch 'dev' into enhancement/component-cleanup 2025-12-19 14:11:01 -08:00
Nick Sandstrom
22527b085d Checking if data has been fetched before displaying empty channels 2025-12-19 14:09:17 -08:00
SergeantPanda
944736612b Bug Fix: M3U profile form resets local state for search and replace patterns after saving, preventing validation errors when adding multiple profiles in a row 2025-12-19 15:49:18 -06:00
SergeantPanda
abc6ae94e5 Enhancement: Update SuperuserForm to include logo, version info, and improved layout 2025-12-19 10:44:39 -06:00
SergeantPanda
5371519d8a Enhancement: Update default backup settings to enable backups and set retention count to 3 2025-12-19 10:40:56 -06:00
SergeantPanda
b83f12809f Enhancement: Add HEADER_HEIGHT and ERROR_HEIGHT constants for improved layout calculations in FloatingVideo component 2025-12-18 17:18:44 -06:00
SergeantPanda
601f7d0297 changelog: Update changelog for DVR bug fix. 2025-12-18 16:57:43 -06:00
SergeantPanda
de31826137 refactor: externalize Redis and Celery configuration via environment variables
Replace hardcoded localhost:6379 values throughout codebase with environment-based configuration. Add REDIS_PORT support and allow REDIS_URL override for external Redis services. Configure Celery broker/result backend to use Redis settings with environment variable overrides.

Closes #762
2025-12-18 16:54:59 -06:00
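The resolution order described above (explicit REDIS_URL override first, then host/port parts) can be sketched as follows; the variable names REDIS_HOST/REDIS_PORT/REDIS_URL come from the commit, while the `/0` database index is an assumption for illustration.

```python
import os

# REDIS_URL wins outright (external Redis services); otherwise the URL is
# assembled from REDIS_HOST/REDIS_PORT with the old localhost:6379 defaults.
def redis_url(env=os.environ):
    explicit = env.get("REDIS_URL")
    if explicit:
        return explicit
    host = env.get("REDIS_HOST", "localhost")
    port = env.get("REDIS_PORT", "6379")
    return f"redis://{host}:{port}/0"  # DB index 0 assumed for illustration
```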
SergeantPanda
e78c18c473 Bug Fix: XC get_simple_data_table now returns the database id of the program, and epg_id now carries the EPG id from the matched EPG. 2025-12-18 16:11:26 -06:00
SergeantPanda
73956924f5 Enhancement: Stream group as available hash option: Users can now select 'Group' as a hash key option in Settings → Stream Settings → M3U Hash Key, allowing streams to be differentiated by their group membership in addition to name, URL, TVG-ID, and M3U ID 2025-12-18 15:26:08 -06:00
SergeantPanda
0a4d27c236 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-12-18 14:47:04 -06:00
SergeantPanda
45ea63e9cf chore: update dependencies in package.json
- Bump eslint from ^9.21.0 to ^9.27.0
- Upgrade vite from ^6.2.0 to ^7.1.7
- Add overrides for js-yaml to ^4.1.1
2025-12-18 14:45:55 -06:00
Dispatcharr
1510197bf0 Floating Video
Added handles on the corners of FloatingVideo to allow resizing
2025-12-18 14:19:51 -06:00
SergeantPanda
9623dff6b1 Enhancement: Updated dependencies: Django (5.2.4 → 5.2.9) includes CVE security patch, psycopg2-binary (2.9.10 → 2.9.11), celery (5.5.3 → 5.6.0), djangorestframework (3.16.0 → 3.16.1), requests (2.32.4 → 2.32.5), psutil (7.0.0 → 7.1.3), gevent (25.5.1 → 25.9.1), rapidfuzz (3.13.0 → 3.14.3), torch (2.7.1 → 2.9.1), sentence-transformers (5.1.0 → 5.2.0), lxml (6.0.0 → 6.0.2) (Closes #662) 2025-12-18 13:19:18 -06:00
SergeantPanda
3ddcadb50d changelog: Give acknowledgement and reference issue. 2025-12-18 11:07:13 -06:00
SergeantPanda
1e42aa1011
Merge pull request #763 from Dispatcharr/OCI-docker-labels
Enhancement: Refactor Docker workflows to use docker/metadata-action …
2025-12-18 11:02:50 -06:00
SergeantPanda
ee0502f559
Merge branch 'dev' into OCI-docker-labels 2025-12-18 11:01:55 -06:00
SergeantPanda
f43de44946 Enhancement: Refactor Docker workflows to use docker/metadata-action for cleaner OCI label management 2025-12-18 10:58:48 -06:00
Nick Sandstrom
2b1d5622a6 Setting User before fetch settings completes 2025-12-18 07:47:18 -08:00
Nick Sandstrom
bd148a7f14 Reverted Channels change for initial render 2025-12-18 07:46:21 -08:00
SergeantPanda
a76a81c7f4
Merge pull request #738 from jdblack/flexible_devbuild
Give arguments to docker/build-dev.sh
2025-12-18 08:57:45 -06:00
SergeantPanda
bd57ee3f3c
Merge branch 'dev' into flexible_devbuild 2025-12-18 08:56:58 -06:00
SergeantPanda
2558ea0b0b Enhancement: Add VOD client stop functionality to Stats page 2025-12-17 16:54:10 -06:00
Nick Sandstrom
2a0df81c59 Lazy loading components 2025-12-17 13:35:12 -08:00
Nick Sandstrom
1906c9955e Updated to default export 2025-12-17 13:34:53 -08:00
Nick Sandstrom
4c60ce0c28 Extracted Series and Movie components 2025-12-17 13:34:20 -08:00
Dispatcharr
865ba432d3 Updated url path
#697 Encode tvg_id for DVR series rule deletions
2025-12-16 23:06:49 -06:00
Dispatcharr
7ea843956b Updated FloatingVideo.jsx
Added resizing of the floating video
Fixed floating video dragging
2025-12-16 21:52:35 -06:00
SergeantPanda
98a016a418 Enhance series info retrieval to return unique episodes and improve relation handling for active M3U accounts 2025-12-16 15:54:33 -06:00
Nick Sandstrom
36ec2fb1b0 Extracted component and util logic 2025-12-16 13:46:24 -08:00
Nick Sandstrom
dd75b5b21a Added correct import for Text component 2025-12-16 13:46:24 -08:00
Nick Sandstrom
38033da90f Fixed component syntax 2025-12-16 13:46:24 -08:00
Nick Sandstrom
7c45542332 Fixed cache_url fallback 2025-12-16 13:46:24 -08:00
SergeantPanda
748d5dc72d Bug Fix: When multiple M3U episode relations existed for a requested episode, the XC API would fail. (Fixes #569) 2025-12-16 15:44:42 -06:00
SergeantPanda
48e7060cdb Bug Fix: VOD episode processing now correctly handles duplicate episodes from the same provider. (Fixes #556) 2025-12-16 15:24:16 -06:00
Nick Sandstrom
6c1b0f9a60 Extracted component and util logic 2025-12-16 11:55:22 -08:00
Nick Sandstrom
ffd8d9fe6b Using util for getPosterUrl 2025-12-16 11:53:45 -08:00
Nick Sandstrom
0ba22df233 Updated Component syntax 2025-12-16 11:53:26 -08:00
Seth Van Niekerk
bc72b2d4a3
Add VLC and streamlink codec parsing support 2025-12-15 20:09:54 -05:00
Seth Van Niekerk
88c10e85c3
Add VLC TS demux output detection for codec parsing 2025-12-15 20:09:54 -05:00
Seth Van Niekerk
1ad8d6cdfd
Add VLC profile to fixtures with correct parameter order 2025-12-15 20:09:54 -05:00
Seth Van Niekerk
ee7a39fe21
Add VLC stream profile migration with correct parameters 2025-12-15 20:09:54 -05:00
Seth Van Niekerk
3b7f6dadaa
Add VLC packages and environment variables to DispatcharrBase 2025-12-15 20:09:54 -05:00
SergeantPanda
41642cd479 Improve orphaned CrontabSchedule cleanup logic to avoid deleting in-use schedules 2025-12-15 16:54:12 -06:00
SergeantPanda
1b27472c81 changelog: Add automated configuration backup/restore system to changelog 2025-12-15 16:22:38 -06:00
SergeantPanda
a60fd530f3 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-12-15 16:17:53 -06:00
SergeantPanda
4878e92f44
Merge pull request #488 from stlalpha/feature/automated-backups
Enhancement: Add automated configuration backups
2025-12-15 16:17:33 -06:00
Nick Sandstrom
3bf8ddf376 Removed unused imports 2025-12-15 09:19:54 -08:00
Nick Sandstrom
65dbc5498d Fixed handler arrow functions 2025-12-15 08:21:00 -08:00
Nick Sandstrom
85390a078c Removed unused imports 2025-12-15 07:48:24 -08:00
Jim McBride
bd6cf287dc
Clean up orphaned CrontabSchedule records
- Add _cleanup_orphaned_crontab() helper function
  - Delete old crontab when disabling backup schedule
  - Delete old crontab when schedule settings change
  - Prevents database bloat from unused CrontabSchedule records
2025-12-13 19:02:36 -06:00
Jim McBride
662c5ff89a Reorganize simple mode backup scheduler layout
- Row 1: Frequency, Day (if weekly), Hour, Minute, Period (if 12h)
- Row 2: Retention, Save button
- Use wrap=nowrap to keep time selectors on same row
2025-12-13 18:49:36 -06:00
Jim McBride
1dc7700a62
Add timezone support for backup scheduling
- Set CrontabSchedule timezone to system timezone for accurate scheduling
  - Replace time TextInput with hour/minute Select dropdowns for cleaner UX
  - Remove UTC/local time conversion logic (handled by Celery)
  - Add tests for timezone functionality in simple and advanced modes
2025-12-13 13:27:56 -06:00
Nick Sandstrom
d97f0c907f Updated DVR for extracted logic 2025-12-13 06:33:28 -08:00
Nick Sandstrom
ae60f81314 Extracted DVR utils 2025-12-13 06:32:16 -08:00
Nick Sandstrom
bfcc47c331 Extracted DVR components 2025-12-13 06:31:56 -08:00
SergeantPanda
679adb324c changelog: Update changelog to reference github issue. 2025-12-12 17:26:24 -06:00
SergeantPanda
58a6cdedf7 Bug Fix: Fix handling of None values in xc_get_epg output to prevent AttributeError when title and/or description are None. 2025-12-12 17:23:02 -06:00
SergeantPanda
dedd898a29 Changelog: Document removal of unreachable code path in m3u output 2025-12-12 16:44:09 -06:00
SergeantPanda
0b09cd18b9
Merge pull request #725 from DawtCom/main
Removing unreachable code
2025-12-12 16:19:52 -06:00
dekzter
3537c9ee09
Merge pull request #741 from Dispatcharr/hide-disabled-channels
Advanced Filtering
2025-12-12 08:42:34 -05:00
dekzter
97930c3de8
Merge pull request #740 from Dispatcharr/revert-736-hide-disabled-channels
Revert "Advanced Filtering"
2025-12-12 08:41:20 -05:00
dekzter
c51916b40c
Revert "Advanced Filtering" 2025-12-12 08:30:17 -05:00
dekzter
ed61ac656a
Merge pull request #736 from Dispatcharr/hide-disabled-channels
Advanced Filtering
2025-12-12 08:05:21 -05:00
James Blackwell
56cf37d637 Give arguments to docker/build-dev.sh
This commit improves docker/build-dev.sh, providing a variety of
arguments to assist in building images

 -h for help

 -p push the build

 -r Specify a different registry, such as myname on dockerhub, or
  myregistry.local

 -a arch[,arch] cross-build to one or more architectures; e.g. -a
   linux/arm64,linux/amd64
2025-12-12 12:59:03 +07:00
dekzter
ea38c0b4b8 advanced filtering for hiding disabled channels and viewing only empty channels 2025-12-11 11:54:41 -05:00
Nick Sandstrom
dd5ae8450d Updated pages to utilize error boundary 2025-12-10 22:31:04 -08:00
Nick Sandstrom
0070d9e500 Added ErrorBoundary component 2025-12-10 22:29:48 -08:00
Nick Sandstrom
aea888238a Removed unused pages 2025-12-10 21:38:46 -08:00
Nick Sandstrom
700d0d2383 Moved error logic to separate component 2025-12-10 20:38:28 -08:00
dekzter
0bfd06a5a3 Merge remote-tracking branch 'origin/dev' into hide-disabled-channels 2025-12-10 17:32:50 -05:00
Jim McBride
8388152d79
Use system timezone for backup filenames
Updated create_backup to use the system's configured timezone for backup
filenames instead of always using UTC. This makes filenames more intuitive
and matches users' local time expectations.

Changes:
- Import pytz and CoreSettings
- Get system timezone from CoreSettings.get_system_time_zone()
- Convert current UTC time to system timezone for filename timestamp
- Fallback to UTC if timezone conversion fails
- Internal metadata timestamps remain UTC for consistency

Example:
- System timezone: America/New_York (EST)
- Created at 3:00 PM EST
- Old filename: dispatcharr-backup-2025.12.09.20.00.00.zip (UTC time)
- New filename: dispatcharr-backup-2025.12.09.15.00.00.zip (local time)

This aligns with the timezone-aware scheduling already implemented.
2025-12-09 09:06:22 -06:00
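The conversion described above can be sketched with the stdlib zoneinfo module (the commit itself imports pytz); the filename pattern and the UTC fallback follow the commit's description.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Render the backup filename in the system's configured timezone, falling
# back to UTC when the zone name cannot be resolved.
def backup_filename(utc_now, tz_name):
    try:
        local = utc_now.astimezone(ZoneInfo(tz_name))
    except Exception:
        local = utc_now  # invalid timezone: keep UTC
    return local.strftime("dispatcharr-backup-%Y.%m.%d.%H.%M.%S.zip")
```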
Jim McBride
795934dafe
Give Created column flexible width to prevent wrapping
Changed Created column from fixed size (160px) to flexible minSize (180px)
with whiteSpace: nowrap to ensure date/time displays on a single line.

The column can now expand as needed while maintaining readability of
timestamps in various date/time format combinations.
2025-12-09 08:58:38 -06:00
Jim McBride
70e574e25a
Add tests for cron expression functionality
Added comprehensive test coverage for cron expression support:

New tests in BackupSchedulerTestCase:
- test_cron_expression_stores_value: Verify cron_expression persists correctly
- test_cron_expression_creates_correct_schedule: Verify CrontabSchedule creation from cron
- test_cron_expression_invalid_format: Verify validation rejects malformed expressions
- test_cron_expression_empty_uses_simple_mode: Verify fallback to frequency/time mode
- test_cron_expression_overrides_simple_settings: Verify cron takes precedence

Updated existing tests to include cron_expression field:
- test_get_schedule_settings_defaults: Now checks cron_expression default
- test_get_schedule_success: Added cron_expression to mock response
- test_update_schedule_success: Added cron_expression to mock response

All tests verify the new cron functionality works correctly alongside
existing simple scheduling mode.
2025-12-09 08:54:23 -06:00
Jim McBride
3c76c72479
Add Enabled/Disabled label to Advanced toggle
Added dynamic label to the Advanced (Cron Expression) switch to match
the Scheduled Backups toggle above it.

Now displays:
  Scheduled Backups          [Enabled]
  Advanced (Cron Expression) [Enabled]

Provides consistent UI pattern and clearer status indication.
2025-12-09 08:38:57 -06:00
Jim McBride
53159bd420
Improve Advanced toggle layout alignment
Changed Advanced (Cron Expression) from a labeled switch to a proper
Group with space-between layout matching the Scheduled Backups row above.

Now displays as:
  Scheduled Backups          [Enabled toggle]
  Advanced (Cron Expression) [Toggle]

This creates consistent visual alignment with both text labels on the left
and toggle switches on the right.
2025-12-09 08:35:55 -06:00
Jim McBride
901cc09e38
Align Advanced toggle below Scheduled Backups header
Moved the Advanced (Cron Expression) switch outside the scheduleLoading
conditional and wrapped it in its own Group for proper alignment.

Layout is now:
- Scheduled Backups header with Enabled/Disabled switch
- Advanced (Cron Expression) toggle (aligned left)
- Schedule configuration inputs (conditional based on mode)

This provides clearer visual hierarchy and better UX.
2025-12-09 08:32:49 -06:00
Jim McBride
d4fbc9dc61
Honor user date/time format preferences for backup timestamps
- Import dayjs for date formatting
- Read date-format setting from localStorage ('mdy' or 'dmy')
- Move formatDate function into component to access user preferences
- Format dates according to user's date and time format settings:
  - MDY: MM/DD/YYYY
  - DMY: DD/MM/YYYY
  - 12h: h:mm:ss A
  - 24h: HH:mm:ss

The Created column now respects the same date/time format preferences
used throughout the app (Guide, Stats, DVR, SystemEvents, etc).
2025-12-09 08:29:28 -06:00
Jim McBride
1a350e79e0
Fix cron validation to support */N step notation
Updated regex to properly support step notation with asterisk (e.g., */2, */5).

Now supports all common cron patterns:
- * (wildcard)
- */2 (every 2 units - step notation)
- 5 (specific value)
- 1-5 (range)
- 1-5/2 (step within range)
- 1,3,5 (list)
- 10-20/5 (step within range)

Changed regex from:
  /^(\*|(\d+(-\d+)?(,\d+(-\d+)?)*)(\/\d+)?)$/
To:
  /^(\*\/\d+|\*|\d+(-\d+)?(\/\d+)?(,\d+(-\d+)?(\/\d+)?)*)$/

The key change is adding \*\/\d+ as the first alternative to explicitly
match step notation like */2, */5, */10, etc.

Backend already supports this via Django Celery Beat's CrontabSchedule,
which accepts standard cron syntax including step notation.
2025-12-09 08:22:20 -06:00
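The new per-field pattern can be exercised directly; below it is transcribed verbatim into Python (JS delimiters dropped) to show that */N now passes while malformed fields still fail.

```python
import re

# Per-field cron pattern from the commit above; the leading \*\/\d+
# alternative is what admits step notation like */2 and */5.
CRON_FIELD = re.compile(r"^(\*\/\d+|\*|\d+(-\d+)?(\/\d+)?(,\d+(-\d+)?(\/\d+)?)*)$")

def field_ok(part):
    # Syntax only; numeric range checks (minute 0-59 etc.) are a separate step.
    return CRON_FIELD.match(part) is not None
```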
Jim McBride
e71e6bc3d7
Fix backup timestamp display to use UTC timezone
The list_backups function was creating timezone-naive datetime objects,
which caused the frontend to incorrectly interpret timestamps.

Now uses datetime.UTC when creating timestamps from file modification time
(consistent with other usage in this file on lines 186, 216), so the ISO
string includes timezone info (+00:00). This allows the browser to properly
convert UTC timestamps to the user's local timezone for display.

Before: Backend sends "2025-12-09T14:12:44" (ambiguous timezone)
After: Backend sends "2025-12-09T14:12:44+00:00" (explicit UTC)

The frontend's toLocaleString() will now correctly convert to local time.
2025-12-09 08:16:04 -06:00
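The naive-versus-aware difference is easy to demonstrate: only an aware datetime serializes with the explicit +00:00 offset the browser needs. A minimal sketch of the timestamp construction described above:

```python
from datetime import datetime, timezone

# Build the timestamp from a file's mtime as an AWARE UTC datetime so the
# ISO string carries +00:00 and the browser can localize it correctly.
def backup_timestamp(mtime):
    return datetime.fromtimestamp(mtime, tz=timezone.utc).isoformat()
```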
Jim McBride
c65df2de89
Add real-time validation for cron expressions
- Add validateCronExpression function with comprehensive validation:
  - Checks for exactly 5 parts (minute hour day month weekday)
  - Validates cron syntax (*, ranges, lists, steps)
  - Validates numeric ranges (minute 0-59, hour 0-23, etc.)
  - Returns detailed error messages for each validation failure

- Add cronError state to track validation errors
- Validate on input change with handleScheduleChange
- Display error message below input field
- Disable Save button when cron expression is invalid
- Auto-validate when switching to advanced mode
- Clear errors when switching back to simple mode

User gets immediate feedback on cron syntax errors before attempting to save.
2025-12-09 08:09:56 -06:00
Jim McBride
5fbcaa91e0
Add custom cron expression support for backup scheduling
Frontend changes:
- Add advanced mode toggle switch for cron expressions
- Show cron expression input with helpful examples when enabled
- Display format hints: "minute hour day month weekday"
- Provide common examples (daily, weekly, every 6 hours, etc.)
- Conditionally render simple or advanced scheduling UI
- Support switching between simple and advanced modes

Backend changes:
- Add cron_expression to schedule settings (SETTING_KEYS, DEFAULTS)
- Update get_schedule_settings to include cron_expression
- Update update_schedule_settings to handle cron_expression
- Extend _sync_periodic_task to parse and use cron expressions
- Parse 5-part cron format: minute hour day_of_month month_of_year day_of_week
- Create CrontabSchedule from cron expression or simple frequency
- Add validation and error handling for invalid cron expressions

This addresses maintainer feedback for "custom scheduler (cron style) for more control".
Users can now schedule backups with full cron flexibility beyond daily/weekly.
2025-12-09 07:55:47 -06:00
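On the backend, the 5-part expression maps onto the fields django-celery-beat's `CrontabSchedule` expects. A minimal sketch of that field mapping (function name assumed; the real `_sync_periodic_task` also handles validation and the simple-frequency path):

```python
def cron_to_crontab_kwargs(expr: str) -> dict:
    # Split a 5-part cron expression into CrontabSchedule keyword arguments.
    # Field order follows cron convention:
    #   minute hour day_of_month month_of_year day_of_week
    fields = expr.split()
    if len(fields) != 5:
        raise ValueError("cron expression must have exactly 5 fields")
    minute, hour, day_of_month, month_of_year, day_of_week = fields
    return {
        "minute": minute,
        "hour": hour,
        "day_of_month": day_of_month,
        "month_of_year": month_of_year,
        "day_of_week": day_of_week,
    }
```

The resulting dict would be passed as `CrontabSchedule.objects.get_or_create(**kwargs)` in a django-celery-beat setup.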
Jim McBride
d718e5a142
Implement timezone-aware backup scheduling
- Add timezone conversion functions (utcToLocal, localToUtc)
- Use user's configured timezone from Settings (localStorage 'time-zone')
- Convert times to UTC when saving to backend
- Convert times from UTC to local when loading from backend
- Display timezone info showing user's timezone and scheduled time
- Helper text shows: "Timezone: America/New_York • Backup will run at 03:00"

This addresses maintainer feedback to handle timezone properly:
backend stores/schedules in UTC, frontend displays/edits in user's local time.
2025-12-09 07:52:53 -06:00
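The `utcToLocal`/`localToUtc` helpers live in the frontend; the same round-trip can be sketched in Python with `zoneinfo`. A reference date is needed because the UTC offset varies with DST (names and the fixed reference date are illustrative):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def utc_to_local(hhmm: str, tz_name: str, on: str = "2025-01-15") -> str:
    # A UTC wall-clock time ("03:00") rendered in the user's timezone.
    dt = datetime.fromisoformat(f"{on}T{hhmm}").replace(tzinfo=timezone.utc)
    return dt.astimezone(ZoneInfo(tz_name)).strftime("%H:%M")

def local_to_utc(hhmm: str, tz_name: str, on: str = "2025-01-15") -> str:
    # Inverse: the user's local time converted to UTC for storage/scheduling.
    dt = datetime.fromisoformat(f"{on}T{hhmm}").replace(tzinfo=ZoneInfo(tz_name))
    return dt.astimezone(timezone.utc).strftime("%H:%M")
```

So a backend-stored `08:00` UTC displays as `03:00` for an `America/New_York` user in January, and editing it back preserves the stored value.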
Jim McBride
806f78244d
Add proper ConfirmationDialog usage to BackupManager
- Import useWarningsStore from warnings store
- Add suppressWarning hook to component
- Add actionKey props to restore and delete confirmation dialogs
- Add onSuppressChange callback to enable "Don't ask again" functionality

This aligns BackupManager with the project's standard confirmation dialog pattern
used throughout the codebase (ChannelsTable, EPGsTable, etc).
2025-12-09 07:49:31 -06:00
DawtCom
e8fb01ebdd Removing unreachable code 2025-12-08 21:50:13 -06:00
SergeantPanda
514e7e06e4 Bug fix: EPG API now returns correct date/time format for start/end fields and proper string types for timestamps and channel_id 2025-12-08 20:50:50 -06:00
SergeantPanda
69f9ecd93c Bug Fix: Remove ipv6 binding from nginx config if ipv6 is not available. 2025-12-08 20:12:44 -06:00
GitHub Actions
4df4e5f963 Release v0.14.0 2025-12-09 00:01:50 +00:00
SergeantPanda
ecbef65891
Merge pull request #723 from Dispatcharr:dev
Version 0.14.0
2025-12-08 17:59:12 -06:00
SergeantPanda
98b29f97a1 changelog: Update verbiage 2025-12-08 17:49:40 -06:00
SergeantPanda
62f5c32609 Remove DJANGO_SECRET_KEY environment variable from uwsgi configuration files 2025-12-08 17:27:07 -06:00
dekzter
43b55e2d99 first run at hiding disabled channels in channel profiles 2025-12-08 08:38:39 -05:00
SergeantPanda
c03ddf60a0 Fixed verbiage for epg parsing status. 2025-12-07 21:28:04 -06:00
SergeantPanda
ce70b04097 changelog: update changelog 2025-12-07 20:56:59 -06:00
SergeantPanda
e2736babaa Reset umask after creating secret file. 2025-12-07 20:04:58 -06:00
SergeantPanda
2155229d7f Fix uwsgi command path in entrypoint script 2025-12-07 19:40:32 -06:00
SergeantPanda
cf37c6fd98 changelog: Updated changelog for 0.13.1 2025-12-07 19:06:45 -06:00
SergeantPanda
3512c3a623 Add DJANGO_SECRET_KEY environment variable to uwsgi configuration files 2025-12-07 19:05:31 -06:00
dekzter
d0edc3fa07 remove permission lines to see if this resolves lack of django secret key in environment profile.d 2025-12-07 07:54:30 -05:00
dekzter
b18bc62983 merged in from main 2025-12-06 14:13:06 -05:00
GitHub Actions
a912055255 Release v0.13.1 2025-12-06 18:43:16 +00:00
dekzter
10f329d673 release notes for build 2025-12-06 13:42:48 -05:00
dekzter
f3a901cb3a Security Fix - generate JWT on application init 2025-12-06 13:40:10 -05:00
SergeantPanda
759569b871 Enhancement: Add a priority field to EPGSource and prefer higher-priority sources when matching channels. Also ignore EPG sources where is_active is false during matching, and update serializers/forms/frontend accordingly.(Closes #603, #672) 2025-12-05 09:54:11 -06:00
SergeantPanda
c1d960138e Fix: Bulk channel editor confirmation dialog now shows the correct stream profile that will be set. 2025-12-05 09:02:03 -06:00
SergeantPanda
0d177e44f8 changelog: Recategorize updated entry as a bug fix instead of a change. 2025-12-04 15:45:09 -06:00
SergeantPanda
3b34fb11ef Fix: Fixes bug where Updated column wouldn't update in the EPG table without a webui refresh. 2025-12-04 15:43:33 -06:00
SergeantPanda
6c8270d0e5 Enhancement: Add support for 'extracting' status and display additional progress information in EPGsTable 2025-12-04 15:28:21 -06:00
SergeantPanda
5693ee7f9e perf: optimize EPG program parsing and implement atomic database updates to reduce I/O overhead and prevent partial data visibility 2025-12-04 14:57:57 -06:00
SergeantPanda
256ac2f55a Enhancement: Clean up orphaned programs for unmapped EPG entries 2025-12-04 14:25:44 -06:00
SergeantPanda
2a8ba9125c perf: optimize EPG program parsing for multi-channel sources
Dramatically improve EPG refresh performance by parsing the XML file once
per source instead of once per channel. The new implementation:

- Pre-filters to only process EPG entries mapped to actual channels
- Parses the entire XML file in a single pass
- Uses O(1) set lookups to skip unmapped channel programmes
- Skips non-mapped channels entirely with minimal overhead

For EPG sources with many channels but few mapped (e.g., 10,000 channels
with 100 mapped to channels), this provides approximately:
- 99% reduction in file open operations
- 99% reduction in XML file scans
- Proportional reduction in CPU and I/O overhead

The parse_programs_for_tvg_id() function is retained for single-channel
use cases (e.g., when a new channel is mapped via signals).

Fixes inefficient repeated file parsing that was occurring with large
EPG sources.
2025-12-04 14:07:28 -06:00
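The single-pass approach can be sketched with a streaming XMLTV parse (function name and tuple shape are illustrative, not the actual parser):

```python
import xml.etree.ElementTree as ET
from io import BytesIO

def parse_programs_single_pass(xml_file, mapped_tvg_ids):
    # One streaming pass over the XMLTV file, skipping <programme> elements
    # whose channel attribute isn't in the mapped set (an O(1) membership
    # test) instead of re-opening and re-scanning the file once per channel.
    programs = {tvg_id: [] for tvg_id in mapped_tvg_ids}
    for _, elem in ET.iterparse(xml_file, events=("end",)):
        if elem.tag == "programme":
            ch = elem.get("channel")
            if ch in mapped_tvg_ids:
                programs[ch].append((elem.get("start"), elem.findtext("title")))
            elem.clear()  # release parsed programme nodes as we stream
    return programs

xmltv = BytesIO(b"""<tv>
  <programme channel="a" start="1400"><title>News</title></programme>
  <programme channel="b" start="1400"><title>Skipped</title></programme>
  <programme channel="a" start="1500"><title>Film</title></programme>
</tv>""")
programs = parse_programs_single_pass(xmltv, {"a"})
```

With 10,000 channels and 100 mapped, the file is scanned once instead of 100 times, and unmapped programmes cost only a set lookup.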
SergeantPanda
2de6ac5da1 changelog: Add sort buttons for 'Group' and 'M3U' columns in Streams table 2025-12-03 17:31:16 -06:00
SergeantPanda
6a96b6b485
Merge pull request #707 from bobey6/main
Enhancement: Add sort by 'Group' or 'M3U' buttons to Streams
2025-12-03 17:27:42 -06:00
SergeantPanda
5fce83fb51 style: Adjust table header and input components for consistent width 2025-12-03 17:13:50 -06:00
SergeantPanda
81b6570366 Fix name not sorting. 2025-12-03 17:03:58 -06:00
SergeantPanda
042612c677 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/bobey6/707 2025-12-03 16:49:21 -06:00
Jim McBride
e64002dfc4
Refactor BackupManager to match app table conventions 2025-12-02 22:19:20 -06:00
Jim McBride
70cf8928c4
Use CustomTable component for backup list 2025-12-02 22:01:59 -06:00
Jim McBride
3f9fd424e2
Update backup feature based on PR feedback
- Simplify to database-only backups (remove data directory backup)
- Update UI to match app styling patterns:
  - Use ActionIcon with transparent variant for table actions
  - Match icon/color conventions (SquareMinus/red.9, RotateCcw/yellow.5, Download/blue.5)
  - Use standard button bar layout with Paper/Box/Flex
  - Green "Create Backup" button matching "Add" pattern
  - Remove Card wrapper, Alert, and Divider for cleaner layout
  - Update to Mantine v8 Table syntax
- Use standard ConfirmationDialog (remove unused color prop)
- Update tests to remove get_data_dirs references
2025-12-02 19:33:27 -06:00
SergeantPanda
f38fb36eba Skip builds during documentation updates. 2025-12-02 15:03:34 -06:00
SergeantPanda
5e1ae23c4e docs: Update CHANGELOG 2025-12-02 14:58:22 -06:00
SergeantPanda
53a50474ba
Merge pull request #701 from jordandalley/nginx-add-ipv6-bind 2025-12-02 14:49:49 -06:00
SergeantPanda
92ced69bfd
Merge pull request #698 from adrianmace/fix-ipv6-access-issues
fix: Allow all IPv6 CIDRs by default
2025-12-02 14:36:51 -06:00
SergeantPanda
f1320c9a5d Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/stlalpha/488 2025-12-02 13:39:06 -06:00
GitHub Actions
5b193249a8 Release v0.13.0 2025-12-02 19:25:22 +00:00
SergeantPanda
0571c6801a
Merge pull request #709 from Dispatcharr:dev
Version 0.13.0
2025-12-02 13:24:28 -06:00
SergeantPanda
c57c7d64de Docs: Improved wording for DVR date/time formats. 2025-12-02 13:10:34 -06:00
SergeantPanda
0bf3499917 Add update_changelog script and integrate it into release workflow 2025-12-02 12:14:50 -06:00
SergeantPanda
3cb695279a Docs: Added issue references to changelog. 2025-12-02 11:49:32 -06:00
SergeantPanda
2c5fbaffb4 Update changelog with current unreleased changes. 2025-12-02 11:09:09 -06:00
SergeantPanda
85b5b18a57 Add CHANGELOG.md to document project updates and notable changes 2025-12-02 09:40:31 -06:00
SergeantPanda
be0409bfc2 Add referrerpolicy to YouTube iframes for series and VOD modals. 2025-12-01 18:21:19 -06:00
SergeantPanda
bd3709463a Bug fix: Use UUID instead of ID for episode URL. This fixes links not working in the series modal. 2025-12-01 16:30:51 -06:00
root
cf08e54bd8 Fix sorting functionality for Group and M3U columns
- Add missing header properties to group and m3u columns
- Fix layout issues with sort buttons (proper flex layout, remove blocking onClick)
- Fix sorting state initialization (use boolean instead of empty string)
- Fix sorting comparison operators (use strict equality)
- Fix 3rd click behavior to return to default sort instead of clearing
- Map frontend column IDs to backend field names for proper API requests
2025-12-01 18:11:58 +00:00
GitHub Copilot
641dcfc21e Add sorting functionality to Group and M3U columns in Streams table
- Added m3u_account__name to backend ordering_fields in StreamViewSet
- Implemented field mapping in frontend to convert column IDs to backend field names
- Added sort buttons to both Group and M3U columns with proper icons
- Sort buttons show current sort state (ascending/descending/none)
- Maintains consistent UX with existing Name column sorting
2025-11-30 19:20:25 +00:00
3l3m3nt
43949c3ef4 Added IPv6 port bind to nginx.conf 2025-11-30 19:30:47 +13:00
Adrian Mace
6a9b5282cd
fix: allow all IPv6 CIDRs by default
This change ensures that by default, IPv6 clients can
connect to the service unless explicitly denied.

Fixes #593
2025-11-30 00:39:30 +11:00
SergeantPanda
b791190e3b Enhancement: Add scrollable modal support for M3UFilters and M3UProfiles components to improve usability on mobile devices. 2025-11-28 12:05:08 -06:00
SergeantPanda
1d23ed3685 Enhancement: Allow scrolling when edit m3u account modal is open on mobile. 2025-11-28 11:10:11 -06:00
Jim McBride
3fb18ecce8 Enhancement: Respect user's 12h/24h time format preference in backup scheduler
- Read time-format setting from UI Settings via useLocalStorage
- Show 12-hour time input with AM/PM selector when user prefers 12h
- Show 24-hour time input when user prefers 24h
- Backend always stores 24-hour format (no API changes)
2025-11-27 08:49:29 -06:00
Jim McBride
3eaa76174e Feature: Automated configuration backups with scheduling
- Create/Download/Upload/Restore database backups (PostgreSQL and SQLite)
- Configurable data directory backups (via settings.py)
- Scheduled backups (daily/weekly) via Celery Beat
- Retention policy (keep last N backups)
- Token-based auth for async task polling
- X-Accel-Redirect support for nginx file serving
- Comprehensive tests
2025-11-26 21:11:13 -06:00
SergeantPanda
2b58d7d46e Enhancement: Ensure "Uncategorized" categories and relations exist for VOD accounts. This improves content management for movies and series without assigned categories. Closes #627 2025-11-25 17:14:51 -06:00
SergeantPanda
fb084d013b Bug Fix: Filter out non-existent channel groups in M3UGroupFilter component. This fixes a bug where if a group was removed and you attempt to edit an m3u before the frontend was notified, the webui would go blank. 2025-11-25 14:51:41 -06:00
SergeantPanda
8754839c81 Enhancement: Add search icon to name headers for the channels and streams tables. Closes #686 2025-11-25 13:45:25 -06:00
SergeantPanda
13ad62d3e1 Bug Fix: Fix bug where previewing a stream would always select the default m3u profile regardless of utilization. 2025-11-25 13:07:49 -06:00
SergeantPanda
0997cd7a9d Enhancement: Improved minimum horizontal size in the stats page for better usability on smaller displays. 2025-11-25 10:43:02 -06:00
SergeantPanda
962d5e965b Enhancement: Hide drop-downs for the system event viewer when in the collapsed state and on a smaller display. 2025-11-25 10:04:52 -06:00
SergeantPanda
7673cd0793 fix: Convert float channel numbers to integers for XC client compatibility
XC (Xtream Codes) clients require integer channel numbers and fail to parse
float values (e.g., 100.5). This change implements collision-free mapping
that converts floats to integers while preserving existing integer channel
numbers where possible.

Changes:
- Implemented two-pass collision detection algorithm that assigns integers
  to float channel numbers by incrementing until an unused number is found
- Applied mapping to all XC client interfaces: live streams API, EPG API,
  and XMLTV endpoint
- Detection: XC clients identified by authenticated user (user is not None)
- Regular M3U/EPG clients (user is None) continue to receive float channel
  numbers without modification

Example: Channels 100, 100.5, 100.9, 101 become 100, 101, 102, 103 for XC
clients, ensuring no duplicate channel numbers and full compatibility.
2025-11-22 09:01:46 -06:00
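One way to reproduce the worked example above is a single ordered pass that gives each channel the smallest unused integer at or above its floor (a sketch inferred from the example, not the actual two-pass implementation):

```python
def map_channel_numbers(numbers):
    # Walk channel numbers in ascending order; each gets the smallest
    # unused integer >= floor(number). Existing integers keep their value
    # unless an earlier float has already claimed it, in which case they
    # increment past the collision.
    used = set()
    mapping = {}
    for num in sorted(numbers):
        candidate = int(num)
        while candidate in used:
            candidate += 1
        used.add(candidate)
        mapping[num] = candidate
    return mapping
```

This reproduces the documented example: `100, 100.5, 100.9, 101` maps to `100, 101, 102, 103` with no duplicates.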
SergeantPanda
aae7b1bc14 Enhancement: Refactor xc_player_api to streamline action handling and ensure consistent responses for unknown actions 2025-11-21 16:49:45 -06:00
SergeantPanda
e7700b60f3 Enhancement: Add validation for EPG objects and payloads in updateEPG functions to prevent errors from invalid data 2025-11-21 15:10:54 -06:00
SergeantPanda
aa9fa09822 Fix: Improve date parsing logic in generate_custom_dummy_programs to handle empty or invalid inputs 2025-11-21 14:19:35 -06:00
SergeantPanda
c5f6d8ccf3 Enhancement: Update xc_player_api to return server_info for unknown actions to align with provider behavior 2025-11-21 10:57:35 -06:00
SergeantPanda
cb1953baf2 Enhancement: Implement comprehensive logging for user authentication events and network access restrictions 2025-11-21 10:50:48 -06:00
SergeantPanda
d94d615d76 Fix: Handle missing channel profiles in m3u and EPG generation with appropriate error logging 2025-11-20 18:54:32 -06:00
SergeantPanda
05f98e9275
Bug fix: Fixes DVR cards not respecting users preference for date and time formats.
UI now reflects date and time formats chosen by user
2025-11-20 18:22:43 -06:00
SergeantPanda
db276f6d32
Merge pull request #679 from Dispatcharr/event-viewer
Enhancement: Add system event logging and viewer with M3U/EPG endpoint caching
2025-11-20 17:44:29 -06:00
SergeantPanda
89a23164ff Enhancement: Add system event logging and viewer with M3U/EPG endpoint caching
System Event Logging:
- Add SystemEvent model with 15 event types tracking channel operations, client connections, M3U/EPG activities, and buffering events
- Log detailed metrics for M3U/EPG refresh operations (streams/programs created/updated/deleted)
- Track M3U/EPG downloads with client information (IP address, user agent, profile, channel count)
- Record channel lifecycle events (start, stop, reconnect) with stream and client details
- Monitor client connections/disconnections and buffering events with stream metadata

Event Viewer UI:
- Add SystemEvents component with real-time updates via WebSocket
- Implement pagination, filtering by event type, and configurable auto-refresh
- Display events with color-coded badges and type-specific icons
- Integrate event viewer into Stats page with modal display
- Add event management settings (retention period, refresh rate)

M3U/EPG Endpoint Optimizations:
- Implement content caching with 5-minute TTL to reduce duplicate processing
- Add client-based event deduplication (2-second window) using IP and user agent hashing
- Support HEAD requests for efficient preflight checks
- Cache streamed EPG responses while maintaining streaming behavior for first request
2025-11-20 17:41:06 -06:00
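The client-based deduplication described above (hashing IP and user agent, suppressing repeats inside a short window) can be sketched as follows; names, the hash choice, and the in-memory store are illustrative (the real implementation presumably uses its cache backend):

```python
import hashlib
import time

def make_client_key(ip: str, user_agent: str) -> str:
    # Hash IP + user agent so repeat hits from the same client collapse
    # to one key without storing raw identifiers.
    return hashlib.sha256(f"{ip}|{user_agent}".encode()).hexdigest()

_recent = {}  # client key -> timestamp of last logged event

def should_log_event(ip: str, user_agent: str, window: float = 2.0, now=None) -> bool:
    # Suppress a second event from the same client inside the window.
    now = time.monotonic() if now is None else now
    key = make_client_key(ip, user_agent)
    last = _recent.get(key)
    _recent[key] = now
    return last is None or (now - last) >= window
```

A client that issues a HEAD preflight followed immediately by a GET then produces a single download event rather than two.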
Biologisten
1f0fe00cbf UI now reflects date and time formats chosen by user 2025-11-20 17:34:03 +01:00
SergeantPanda
204a5a0c76
Merge pull request #659 from FiveBoroughs/fix/channel-stream-order-serialization
Fix: Preserve stream order in ChannelSerializer PATCH/PUT responses
2025-11-19 15:59:38 -06:00
GitHub Actions
fea7c99021 Release v0.12.0 2025-11-19 03:39:13 +00:00
SergeantPanda
3e77259b2c
Merge pull request #670 from Dispatcharr/dev 2025-11-18 21:38:34 -06:00
SergeantPanda
968a8f1cd0 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-11-18 10:02:47 -06:00
SergeantPanda
b6c3234e96 Enhancement: Improve zombie channel handling in ProxyServer by checking client connections and cleaning up orphaned metadata, ensuring better resource management and stability. 2025-11-18 10:02:35 -06:00
SergeantPanda
dc2a408041
Merge pull request #666 from Dispatcharr:XC-Profile-rate-limit
Enhancement: Implement background profile refresh task with rate limiting to prevent provider bans during account profile updates.
2025-11-18 09:28:36 -06:00
SergeantPanda
afedce5cb2 Enhancement: Implement background profile refresh task with rate limiting to prevent provider bans during account profile updates. 2025-11-18 09:27:22 -06:00
SergeantPanda
d8df848136 Enhancement: Add success notification for channel updates in API, improving user feedback on successful operations. 2025-11-17 17:57:04 -06:00
SergeantPanda
1b16df4482 Enhancement: Improve batch EPG association in ChannelViewSet by adding validation for associations and implementing bulk updates, enhancing performance and error handling. 2025-11-17 17:41:52 -06:00
SergeantPanda
1560afab97 Enhancement: Optimize bulk channel editing in ChannelViewSet by validating updates first and applying them in a single transaction, improving performance by about 50% and error handling. 2025-11-17 17:33:10 -06:00
FiveBoroughs
bbe1f6364b Fix: Preserve stream order in ChannelSerializer PATCH/PUT responses
The ChannelSerializer.to_representation() method was not respecting the
ChannelStream.order field when serializing PATCH/PUT responses. This
caused streams to be returned in an arbitrary order rather than the
order specified in the request.

The update() method correctly saves the stream order to the database
using the ChannelStream.order field, and GET requests (with
include_streams=True) correctly return ordered streams via
get_streams(). However, standard PATCH/PUT responses were using
PrimaryKeyRelatedField which doesn't respect the ordering.

This fix ensures that all representations (GET, PATCH, PUT) return
streams ordered by the channelstream__order field.

Impact:
- PATCH/PUT responses now correctly reflect the stream order saved
- Clients can trust the response data without needing a follow-up GET
- No breaking changes - only fixes inconsistent behavior

Tested with:
- PATCH request with ordered stream IDs
- Verified response matches request order
- Verified GET request confirms order persisted to database
2025-11-16 23:29:17 +01:00
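Stripped of the Django specifics, the fix amounts to sorting by the through-table's `order` column rather than returning the M2M set as-is. A plain-Python sketch (dict shape is illustrative; the real code orders a queryset by `channelstream__order`):

```python
def ordered_stream_ids(channel_streams):
    # channel_streams: rows from the Channel<->Stream through table.
    # Sort by the explicit `order` column so PATCH/PUT responses reflect
    # the order the client saved, not arbitrary database order.
    return [cs["stream_id"] for cs in sorted(channel_streams, key=lambda cs: cs["order"])]
```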
SergeantPanda
6bd5958c3c Enhancement: Improve channel shutdown logic in ProxyServer to handle connection timeouts and grace periods more effectively, ensuring proper channel management based on client connections. 2025-11-15 14:22:26 -06:00
SergeantPanda
0700cf29ea Enhancement: Add copy link functionality to SeriesModal and VODModal, allowing users to easily copy episode and VOD links to clipboard with notifications for success or failure. 2025-11-14 20:13:40 -06:00
SergeantPanda
2514528337 Enhancement: Update channel state handling in ProxyServer and views to include 'STOPPING' state, ensuring proper cleanup and preventing reinitialization during shutdown. 2025-11-14 19:57:59 -06:00
SergeantPanda
827501c9f7 Better spacing for version text. 2025-11-14 18:00:08 -06:00
SergeantPanda
23e2814fe7 Enhancement: Add loading state and dynamic text to submit buttons in Channel forms. Also remove old unused "Channels.jsx" form 2025-11-14 17:52:57 -06:00
SergeantPanda
5160ead093
Merge pull request #166 from maluueu/dev
Improve handling POST requests for M3U generation and add tests
2025-11-14 17:35:32 -06:00
SergeantPanda
acbcc46a91 Revert "docker: init: 02-postgres.sh: allow DB user to create new DB (for tests)"
This reverts commit 7e5be6094f.
2025-11-14 17:32:05 -06:00
SergeantPanda
ed7e16483b Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/maluueu/166 2025-11-14 17:11:00 -06:00
SergeantPanda
a3be679acf Enhancement: Add loading state to login button for better user feedback 2025-11-14 16:52:26 -06:00
SergeantPanda
4f29f7f3f9 Enhancement: Conditionally render Sidebar based on authentication status 2025-11-14 16:45:38 -06:00
SergeantPanda
7321a6d7f8 Enhancement: Allow saving password to local storage. 2025-11-14 16:42:40 -06:00
SergeantPanda
761ee42396 Enhancement: Add version to login form. 2025-11-14 16:02:10 -06:00
SergeantPanda
6dab5e3cf3 Enhancement: Add "Remember Me" checkbox to the login form that will save the current username to local storage. 2025-11-14 15:25:56 -06:00
SergeantPanda
b2a041c7c4 Enhancement: Add forgot password link to login form with instructions on how to properly reset a forgotten password. 2025-11-14 15:18:57 -06:00
SergeantPanda
575b764487 Refactor LoginForm: Enhance UI with logo, updated title, and improved layout for better user experience. 2025-11-14 15:06:33 -06:00
SergeantPanda
325c836510
Merge pull request #625 from 0x53c65c0a8bd30fff/force-epg-menu-alphabetical-sorting 2025-11-14 13:51:08 -06:00
SergeantPanda
0360292b94 Refactor LoginForm: Restore navigation effect and streamline login handling. Also remove "Loading..." from M3U & EPG Manager that would clear forms during initial login. 2025-11-14 12:00:22 -06:00
SergeantPanda
cc7cd32c90 Improved synchronization of timeline and guide for mobile touch (including momentum) 2025-11-13 21:19:54 -06:00
SergeantPanda
4b5d3047bb Enhancement: Add wheel scrolling support for TV guide and synchronize scrolling with timeline 2025-11-13 19:47:39 -06:00
SergeantPanda
6e79b37a66 When stream_type is UDP do not add user_agent to FFmpeg command. 2025-11-13 14:04:46 -06:00
SergeantPanda
4720e045a3 Implement manual redirect for non-HTTP protocols (RTSP/RTP/UDP) in stream URL handling 2025-11-12 17:31:27 -06:00
SergeantPanda
79895a1ce4 Enhancement: Update URL validation to support authentication for non-FQDN hostnames in rtsp/rtp/udp URLs 2025-11-12 16:58:07 -06:00
SergeantPanda
a3c16d48ec Skip HTTP validation for non-HTTP protocols (UDP/RTP/RTSP) in stream URL validation 2025-11-12 16:17:06 -06:00
SergeantPanda
431ea6da32
Merge pull request #640 from Dispatcharr:udp-stream-support
Enhancement: Adds support for UDP streams. Closes #617
2025-11-11 18:32:09 -06:00
SergeantPanda
b9e819e343 Enhancement: Adds support for UDP streams. Closes #617 2025-11-11 18:30:59 -06:00
SergeantPanda
a7f449f746
Merge pull request #565 from ragchuck:enable-rtsp
feat: added support for rtsp
2025-11-11 17:29:03 -06:00
SergeantPanda
b608af1d51 Auto detect RTSP streams when proxy profile is selected and force FFmpeg. 2025-11-11 17:26:56 -06:00
SergeantPanda
21723e29bc Update URL validation to support FQDNs for rtsp/rtp protocols and improve regex pattern for flexibility. 2025-11-11 17:17:01 -06:00
SergeantPanda
dc22dff713 Bug Fix: Refactor parse_extinf_line to improve attribute extraction and display name handling. Fixes #637 2025-11-11 16:11:42 -06:00
SergeantPanda
9a5e04af0e Better confirmation dialog messages for vod logo table. 2025-11-07 16:16:07 -06:00
SergeantPanda
6037c158f4 Fix header checkbox not clearing on bulk delete in logo tables. 2025-11-07 15:25:09 -06:00
SergeantPanda
860c671f8c Fix delete button not activating for vod logo table. 2025-11-07 15:05:43 -06:00
SergeantPanda
4701456a46 Enhancement: Force fetch all logos after logo cleanup. 2025-11-07 14:43:41 -06:00
SergeantPanda
c3153f6b93
Merge pull request #628 from Dispatcharr:vod-logos
Separate VOD and channel logos into distinct tables with dedicated management UI
2025-11-07 14:07:59 -06:00
SergeantPanda
da628705df Separate VOD and channel logos into distinct tables with dedicated management UI
- Created VODLogo model for movies/series, separate from Logo (channels only)
- Added database migration to create vodlogo table and migrate existing VOD logos
- Implemented VODLogoViewSet with pagination, filtering (used/unused/movies/series), and bulk operations
- Built VODLogosTable component with server-side pagination matching channel logos styling
- Added VOD logos tab with on-demand loading to Logos page
- Fixed orphaned VOD content cleanup to always remove unused entries
- Removed redundant channel_assignable filtering from channel logos
2025-11-07 13:19:18 -06:00
0x68732f6e69622fff
ed86eb2274 Ensures that in the groups section of M3U playlist management, the EPG Source dropdown for the 'Force EPG Source' option displays entries sorted alphabetically by name 2025-11-06 14:29:13 +00:00
SergeantPanda
871f9f953e Another attempt for the get_host_and_port function to better handle port detection behind reverse proxies. 2025-11-05 16:23:13 -06:00
SergeantPanda
77e98508fb Enhancement: Refactor get_host_and_port for smarter port selection when using reverse proxies. Fixes #618 2025-11-04 19:08:31 -06:00
SergeantPanda
e6146e5243 Bug fix: Reduce websocket message size when processing epgs. Also remove unnecessary console logging during epg refresh. Fixes [Bug]: Page goes blank if sending too many requests / responses
Fixes #327
2025-11-04 18:23:45 -06:00
GitHub Actions
d0ebfb57c4 Release v0.11.2 2025-11-04 18:25:03 +00:00
SergeantPanda
81639c0f15
Merge pull request #615 from Dispatcharr/dev
Version 0.11.2
2025-11-04 12:23:35 -06:00
SergeantPanda
93f074241d Bug fix/enhancement: Check if we're the owner of /data and set ownership if not. Also pre-create /data/models folder 2025-11-04 09:03:39 -06:00
SergeantPanda
d15d8f6644 Bug Fix: Fix bug for the frontend custom dummy EPG examples where DST was calculated on today's date instead of the date of the event. 2025-11-01 14:08:36 -05:00
SergeantPanda
12aae44672 Enhancement: For custom dummy epg's, use correct date for placeholders if provided. 2025-11-01 13:39:55 -05:00
SergeantPanda
60f77c85da Enhancement: Support month strings for date parsing. 2025-10-31 14:08:07 -05:00
SergeantPanda
c7e955b4a8
Merge pull request #604 from Dispatcharr:proxy-client-ttl
Enhance client record TTL settings and periodic refresh during streaming
2025-10-31 13:00:07 -05:00
SergeantPanda
6715bc7c5c Enhancement: Update TTL settings for client records and implement periodic refresh during active streaming 2025-10-31 11:53:16 -05:00
SergeantPanda
1b282f1987
Merge pull request #594 from 0x53c65c0a8bd30fff/truncated-channel-titles-fix
Channel name truncation when containing apostrophes or special characters
2025-10-30 16:53:30 -05:00
SergeantPanda
16c44ea851
Merge pull request #600 from lasharor/patch-2
Check for "SERVER_PORT" header before falling back to 9191
2025-10-30 16:30:32 -05:00
lasharor
400c77f258
Update views.py
I moved on to an LXC install on PVE. The x-tvg-url defaults back to port 9191 whilst in reality the port is defined as :80, which obviously breaks this functionality.

Check request.META.get("SERVER_PORT") before falling back to defaults. This should capture the actual server port from the request.
2025-10-30 22:25:13 +01:00
SergeantPanda
9d4fd63cde Enhancement: Add date placeholders for custom dummy EPG based on output and source timezones. Closes #597 2025-10-30 16:13:45 -05:00
SergeantPanda
0741e45ce6 Enhancement: Add custom fallback templates for dummy EPG generation to handle unmatched patterns 2025-10-30 14:57:32 -05:00
SergeantPanda
4284955412 Enhancement: Adds the ability to use an existing custom dummy EPG as a template for a new EPG. 2025-10-29 17:38:47 -05:00
SergeantPanda
c9d7e66545 Enhancement: Add {starttime_long} and {endtime_long} to custom dummy EPG. 2025-10-29 17:21:37 -05:00
SergeantPanda
28c211cd56 Enhancement: Add {endtime} as an available output for custom dummy epg. Convert {time} to {starttime}
Closes #590
2025-10-29 17:10:35 -05:00
0x68732f6e69622fff
1fde8e4600 Fixes the issue where channel titles are truncated early after an apostrophe 2025-10-29 13:38:51 +00:00
SergeantPanda
5c27bd2c10 Enhancement: Increase maximum URL length for Stream model to 4096 characters (from 2000) and add validation for URL length in processing tasks.
Fixes #585
2025-10-28 15:02:36 -05:00
SergeantPanda
3e2e704765 Enhancement: Add 'Include New Tag' option to mark programs as new in custom dummy EPG output 2025-10-25 11:47:54 -05:00
SergeantPanda
423c56f582 Enhancement: Sort groups during xc_get_live_categories in the order that they first appear in the channel list (lowest channel number) 2025-10-25 09:47:39 -05:00
SergeantPanda
2042274f10 Enhancement: Bulk assign custom dummy epgs.
Enhancement: Display a dynamic warning dialog when saving batch channel edits that will display what will be changed.
2025-10-25 09:27:31 -05:00
SergeantPanda
0fd464cb96 Enhancement: Adds ability to set a custom poster and channel logo with regex for custom EPG dummies. 2025-10-24 12:55:25 -05:00
SergeantPanda
a1834d9885 Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-10-22 17:03:34 -05:00
SergeantPanda
57b99e3900 Enhancement: Change sub_title field in ProgramData model from CharField to TextField. This will allow for longer than 255 character sub titles. (Closes #579) 2025-10-22 17:02:40 -05:00
GitHub Actions
29c46eeb0a Release v0.11.1 2025-10-22 21:35:39 +00:00
SergeantPanda
2de5acf12c
Merge pull request #580 from Dispatcharr/dev
# 🚀 Release Notes – Version 0.11.1
2025-10-22 16:35:02 -05:00
SergeantPanda
0a6f9eb8e1 Bug fix: Fixes uWSGI not getting environmental variables passed to it and LXC not being able to access daemons launched by uWSGI.
Fixes #575, #576, #577
2025-10-22 15:31:12 -05:00
SergeantPanda
73bb1ecd2d Trying a different approach to fix LXC issues. 2025-10-22 08:26:15 -05:00
SergeantPanda
645c1ec9df Testing redis potential fix. Launch redis in entrypoint before uWSGI 2025-10-21 21:47:15 -05:00
GitHub Actions
dd5f0d0753 Release v0.11.0 2025-10-22 01:16:18 +00:00
SergeantPanda
d5de69cd6a
Merge pull request #574 from Dispatcharr/dev
Version 0.11.0
2025-10-21 19:55:15 -05:00
SergeantPanda
119b222428 Enhancement: Allow setting both Celery and UWSGI nice levels. Default nice levels are UWSGI=0, Celery=5. Added documentation to the compose files on how to use it. Using nice levels for UWSGI allows us to have streaming run at a high priority. 2025-10-21 10:11:53 -05:00
SergeantPanda
92d499a274 Enhancement: Switch regex compilation from re to regex module
Use the 'regex' package instead of Python's standard 're' module for pattern
compilation in custom dummy EPG generation. This enables variable-width
lookbehind support, matching JavaScript regex behavior and removing the
fixed-width limitation of the standard library.

This fixes issues where patterns like (?<=\d{1,2}...) would fail with
"look-behind requires fixed-width pattern" errors.
2025-10-19 19:39:53 -05:00
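The limitation is easy to demonstrate: the stdlib `re` module refuses variable-width lookbehind at compile time, while the third-party `regex` package accepts it (the pattern below is an illustrative example, not one from the EPG code):

```python
import re

# The stdlib `re` module rejects variable-width lookbehind outright:
stdlib_rejects = False
try:
    re.compile(r"(?<=\d{1,2}:)\d{2}")
except re.error:
    stdlib_rejects = True  # "look-behind requires fixed-width pattern"

# The third-party `regex` package (not imported here in case it isn't
# installed) compiles the same pattern without complaint, matching the
# JavaScript lookbehind semantics the frontend preview uses:
#   import regex
#   regex.compile(r"(?<=\d{1,2}:)\d{2}")
```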
SergeantPanda
97c24dbea3 Bug Fix: Fix inconsistency with how 24 hour time is displayed between frontend and backend. If minutes were > 0 and using 24 hour time, hours 0-9 would show as single digit but if minute was 0 they would show double digit. 2025-10-19 17:16:49 -05:00
SergeantPanda
4b74673795 Bug fix: Use search instead of match for checking if the title matches the title_match regex for custom dummy. 2025-10-19 17:06:44 -05:00
SergeantPanda
6a85475402 Enhancement: Adds ability to specify categories (comma separated) as well as include date and live tags in custom dummy EPG output. 2025-10-19 10:06:48 -05:00
SergeantPanda
6e0e646938 Enhancement: EPG Dummy - Convert time based on timezone for preview. 2025-10-19 09:00:13 -05:00
SergeantPanda
937c20c082 Enhancement: Add upcoming and ended previews for custom dummy epg. 2025-10-19 08:53:49 -05:00
SergeantPanda
75215cfdc6 Enhancement: Convert <time> and <time24> in the frontend. 2025-10-19 08:42:27 -05:00
SergeantPanda
163b1dd7cf Move buttons to the right side and correctly load in output timezone for custom dummy epg. 2025-10-18 21:19:54 -05:00
SergeantPanda
603c9f9269 Enhancement: Added ability to specify output timezone for custom epg dummy. 2025-10-18 21:12:20 -05:00
SergeantPanda
fe540045fc Enhancement: Add a {time} and {time24} output for better output formatting of the time. 2025-10-18 21:06:27 -05:00
SergeantPanda
dee672287b Bug fix: Fixes bug where matching would fail for custom dummy if the minute was not specified. 2025-10-18 20:50:56 -05:00
SergeantPanda
c21ea5ecbe Bug Fix: Use event timezone for date calculation in custom dummy channels
Previously used UTC date which caused events to be scheduled a day late when current UTC time had crossed midnight but the event timezone hadn't.
2025-10-18 20:46:06 -05:00
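A sketch of the failure mode described above (a fixed UTC−5 offset stands in for the event timezone):

```python
from datetime import datetime, timezone, timedelta

eastern = timezone(timedelta(hours=-5))  # stand-in for the event timezone

# Shortly after midnight UTC, the UTC date has already rolled over...
now_utc = datetime(2025, 10, 19, 1, 30, tzinfo=timezone.utc)
utc_date = now_utc.date()
# ...but the event timezone is still on the previous day, so scheduling
# against the UTC date pushes the event a day late.
event_date = now_utc.astimezone(eastern).date()
print(utc_date, event_date)  # 2025-10-19 2025-10-18
```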
SergeantPanda
d456051eb3 Verify /app also has the correct permissions. 2025-10-18 19:52:22 -05:00
SergeantPanda
9b07f013a4 Refactor directory creation and ownership management in init script for clarity and maintainability. Will only chown recursively if we are not the owner. This should help improve boot speeds. 2025-10-18 19:33:23 -05:00
SergeantPanda
7cbdb61f2c Enhancement: Ensure root's .bashrc sources Dispatcharr profile scripts for interactive non-login shells. This will help when running root commands from the container cli 2025-10-18 19:08:02 -05:00
SergeantPanda
8494f615d0 Bug fix: Use the correct name when the stream name is supposed to be used to build the dummy EPG. 2025-10-18 17:29:44 -05:00
SergeantPanda
0d987aae99 Enhancement: If a stream profile is set for a custom stream, when previewing the stream Dispatcharr will now use the assigned stream profile. Fixes #186 2025-10-18 16:24:47 -05:00
SergeantPanda
81276bfc16 Bug fix: Current settings for Stream Profile and Group were not displayed when opening the edit form for custom streams 2025-10-18 16:04:55 -05:00
SergeantPanda
1a541bd133 Bug fix: Unable to preview custom streams. 2025-10-18 15:52:48 -05:00
SergeantPanda
fa2a90fab4 Bug fix: Remove bottom horizontal scroll bar from TV Guide, it was not synced with the guide and served no purpose. 2025-10-18 14:07:34 -05:00
SergeantPanda
91eaa64ebb Enhancement: Force a specific EPG for auto channel sync channels. 2025-10-18 13:43:49 -05:00
SergeantPanda
0a4c7cae25 Bug fix: Fix bug where channel logo would revert back to provider logo after a refresh (for auto channel sync channels) 2025-10-18 12:38:43 -05:00
SergeantPanda
ba695ebbe9
Merge pull request #568 from Dispatcharr/dummy-epgs
Enhancement: Add Custom Dummy EPG with Dynamic Pattern Matching
2025-10-18 12:25:37 -05:00
SergeantPanda
22fb0b3bdd Enhancement: Add Custom Dummy EPG with Dynamic Pattern Matching and Name Source Selection
This enhancement introduces a powerful custom dummy EPG system that allows users to generate EPG programs on-demand by parsing channel or stream names using configurable regex patterns.

Key Features:
- Custom Pattern Matching: Define regex patterns to extract information from channel/stream names (teams, leagues, times, dates, etc.)
- Flexible Name Source: Choose to parse either the channel name or a specific stream name (by index)
- Timezone-Aware Scheduling: Automatic DST handling using pytz timezone names (e.g., 'US/Eastern', 'Europe/London')
- Time Format Support: Parse both 12-hour (AM/PM) and 24-hour time formats
- Date Parsing: Extract dates from names with flexible month/day/year patterns
- Custom Templates: Format EPG titles and descriptions using captured groups with {placeholder} syntax
- Upcoming/Ended Customization: Define custom titles and descriptions for programs before and after scheduled events
- Live Preview: Test patterns and templates in real-time with sample input
- Smart Program Generation: Automatically creates "Upcoming" and "Ended" programs around scheduled events

Use Cases:
- Sports channels with event details in stream names (e.g., "NHL 01: Bruins VS Leafs @ 8:00PM ET")
- Movie channels with genre/title/year information
- Racing events with driver/track/series details
- Any scenario where EPG data is embedded in channel/stream naming conventions

Technical Implementation:
- Backend: Pattern matching engine with timezone conversion and program scheduling logic
- Frontend: Interactive form with validation, pattern testing, and visual group preview
- Name Source Options: Parse from channel name or selectable stream index (1-based)
- Fallback Behavior: Uses standard dummy EPG if patterns don't match
- Custom Properties: Stores all configuration in EPGSource.custom_properties JSON field

Configuration Options:
- Title Pattern: Extract primary information (required)
- Time Pattern: Extract hour/minute/AM-PM (optional)
- Date Pattern: Extract month/day/year (optional)
- Timezone: Event timezone with automatic DST support
- Program Duration: Length of generated programs in minutes
- Title Template: Format EPG title using captured groups
- Description Template: Format EPG description using captured groups
- Upcoming Title Template: Custom title for programs before event starts (optional)
- Upcoming Description Template: Custom description for programs before event starts (optional)
- Ended Title Template: Custom title for programs after event ends (optional)
- Ended Description Template: Custom description for programs after event ends (optional)
- Name Source: Channel name or stream name
- Stream Index: Which stream to use when parsing stream names (1, 2, 3, etc.)

Closes #293
2025-10-18 12:08:56 -05:00

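The pattern-plus-template flow described above can be sketched as follows (the input, pattern, and template are illustrative, not shipped defaults; the feature itself relies on the third-party `regex` package, but `re` suffices here):

```python
import re

name = "NHL 01: Bruins VS Leafs @ 8:00PM ET"
title_pattern = r"(?P<league>\w+) \d+: (?P<away>\w+) VS (?P<home>\w+)"
title_template = "{league}: {away} at {home}"

match = re.search(title_pattern, name)
# Captured named groups feed the {placeholder} template; when nothing
# matches, the feature falls back to the standard dummy EPG instead.
title = title_template.format(**match.groupdict()) if match else None
print(title)  # NHL: Bruins at Leafs
```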
SergeantPanda
ca8e9d0143 Enhancement: Add custom logo support for channel groups in Auto Sync Channels.
Closes #555
2025-10-17 10:03:21 -05:00
SergeantPanda
d3d7f3c733
Merge pull request #531 from jordandalley/fix-ipv6-validation
Fix: Add IPv6 CIDR validation in Settings
2025-10-16 18:04:36 -05:00
SergeantPanda
7744d7287b Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/jordandalley/531 2025-10-16 17:57:39 -05:00
SergeantPanda
ec21e8329d Enhancement: Increase time for a client to search for an available connection from 1.5 seconds to 3 seconds. This will help when clients are changing channels and release the old connection AFTER attempting to start the new connection. Closes #503
Bug Fix: Fix a bug where searching for an available stream could clear out stream locks for streams that it never acquired a lock for.
2025-10-16 17:38:23 -05:00
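The longer search window amounts to polling with a deadline; a hypothetical sketch (names and intervals are assumptions, not the shipped code):

```python
import time

def wait_for_connection(acquire, timeout=3.0, interval=0.1):
    # Poll for a free connection until the deadline; the fix extends this
    # window from 1.5s to 3s so a client's old connection has time to release.
    deadline = time.monotonic() + timeout
    while True:
        conn = acquire()
        if conn is not None:
            return conn
        if time.monotonic() >= deadline:
            return None
        time.sleep(interval)

# Simulate a slot that frees up on the third poll.
attempts = {"n": 0}
def acquire():
    attempts["n"] += 1
    return "conn-1" if attempts["n"] >= 3 else None

result = wait_for_connection(acquire)
print(result)  # conn-1
```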
SergeantPanda
0031d55bab Bug Fix: Resizing columns in the channel table may cause the page to crash.
Fixes #516
2025-10-16 16:49:13 -05:00
SergeantPanda
b9a0aaa574 Re-align tables and buttons after adjusting button layout. 2025-10-16 14:33:15 -05:00
SergeantPanda
9b2ebf169b Better database connection cleanup. 2025-10-16 14:22:19 -05:00
Ragchuck
ae8b85a3e2 feat: added support for rtsp 2025-10-15 22:06:01 +02:00
SergeantPanda
4df2f79bcf Bug fix: Fixes bug where if there were no channel profiles other than ALL, streamer and standard accounts could not stream any channels even if they had ALL profiles selected. 2025-10-14 15:03:42 -05:00
SergeantPanda
ed065f718d Enhancement: Implement caching for proxy settings to improve performance and reduce database load. Also, ensure database connections are closed after use in both config and stream manager. 2025-10-14 13:44:28 -05:00
SergeantPanda
90d065df80 Enhancement: Show "Saved Successfully" message when changing stream settings and DVR settings, and only show it when the API actually returns a success. 2025-10-13 17:00:18 -05:00
SergeantPanda
87d2131789 Bug fix: Fixes saving settings returning error.
Fixes [Bug]: Error saving Stream Settings
Fixes #535
2025-10-13 16:45:22 -05:00
SergeantPanda
071561c570
Merge pull request #553 from Dispatcharr:Proxy-changes
Enhance HTTP streaming and timeout configurations
2025-10-12 09:46:21 -05:00
SergeantPanda
404d2f82a3 Switch HTTP streamer to a thread and pipe its output to a local pipe where the fetch chunk can access it the same way our transcode processes would be accessed. Simplifies the code. 2025-10-12 09:42:15 -05:00
SergeantPanda
74280baa85 Changed read timeout for HTTP connections to the proxy server to 10 seconds to avoid unnecessary timeouts.
Improved try_next_stream to not fail if the returned stream is the same. It will now try a different stream.
Force a client to jump ahead in the buffer if they fall too far behind.
2025-10-11 19:45:21 -05:00
SergeantPanda
fa08216600 Enhancement: Add chunk timeout configuration in ConfigHelper. Improve StreamManager timeout handling for consistency. Only 1 heartbeat thread per worker should be started now. Timeout on proxy reduced from 60 seconds to 5. 2025-10-11 18:08:20 -05:00
SergeantPanda
fbd83e61b7 Enhancement: Added tooltips to stream table fields. Also removed unneeded imports for logos. 2025-10-11 10:54:11 -05:00
SergeantPanda
6acb0da933 Bug fix/Enhancement: Selecting many streams will no longer cause the stream table to create a new row for buttons. Also reordered buttons in the stream table slightly. 2025-10-11 10:46:52 -05:00
SergeantPanda
d32abecb25 Enhancement: Add confirmation dialog to stream delete functions. 2025-10-11 10:36:04 -05:00
SergeantPanda
d5f9ba7e5e Sort EPG output by channel number. 2025-10-10 17:55:51 -05:00
SergeantPanda
f58bc81c36 Enhancement: Optimize EPG program fetching by implementing chunked retrieval and explicit ordering to improve performance and reduce memory issues. 2025-10-10 17:52:05 -05:00
SergeantPanda
fefab4c4c6 Enhancement: Improve resource cleanup in ProxyServer and StreamManager classes to avoid "SystemError: (libev) error creating signal/async pipe: Too many open files" errors 2025-10-10 15:26:02 -05:00
SergeantPanda
9dc54fdcff Fix: Ensure channel_id and channel.uuid are converted to strings before processing. This fixes an issue where sending a stream switch event would fail if the event was sent from a non owning worker.
Fixes [Bug]: Manually switching active stream not working when using XC client.
Fixes #269
2025-10-09 19:10:38 -05:00
SergeantPanda
85fdfedabe
Merge pull request #543 from Dispatcharr/Auto-disable-new-categories 2025-10-09 15:32:57 -05:00
SergeantPanda
951af5f3fb Enhancement: Add auto-enable settings for new groups and categories in M3U and VOD components
Bug Fix: Remove orphaned categories for VOD and Series
Fixes #540
Closes #208
2025-10-09 15:28:37 -05:00
GitHub Actions
fe58594a36 Release v0.10.4 2025-10-08 00:55:25 +00:00
SergeantPanda
072201016c
Merge pull request #534 from Dispatcharr/dev
Dispatcharr – Version 0.10.4
2025-10-07 19:54:34 -05:00
SergeantPanda
8794156767 Enhancement: Only fetch playlists and channel profiles after a successful M3U refresh, not on every m3u_refresh status update. This should reduce network traffic and improve efficiency.
This may fix #327
2025-10-07 10:02:26 -05:00
SergeantPanda
da245c409a Bug fix: Backend now notifies frontend when a new playlist is created. This should fix cases where accounts would sometimes only show fetching groups after a new account is added. 2025-10-07 09:55:35 -05:00
SergeantPanda
171bb004c4 Set XC as default for new M3U accounts 2025-10-07 09:20:01 -05:00
SergeantPanda
99ad0ecb7b Bug fix: Fixes bug where adding multiple M3U accounts in a row would only modify the first M3U account that was added unless you refresh the web UI. 2025-10-07 09:18:43 -05:00
3l3m3nt
a959ba1748 Fix: Add IPv6 CIDR validation in Settings. Add ::0/0 as a default as well as 0.0.0.0/0 2025-10-07 20:02:47 +13:00
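Validation covering both address families can be sketched with the standard library (the function name is hypothetical):

```python
import ipaddress

def is_valid_cidr(value: str) -> bool:
    # ip_network() accepts both IPv4 and IPv6 notations, so one check
    # covers the new "::0/0" default alongside "0.0.0.0/0".
    try:
        ipaddress.ip_network(value, strict=False)
        return True
    except ValueError:
        return False

print(is_valid_cidr("::0/0"), is_valid_cidr("0.0.0.0/0"), is_valid_cidr("bogus"))
```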
SergeantPanda
d1aa9fe441 Bug fix: Add logo URL validation to prevent PostgreSQL btree index errors during channel creation from stream 2025-10-06 21:22:16 -05:00
SergeantPanda
3326b9fbdc Bug fix: Add logo URL validation to prevent PostgreSQL btree index errors during bulk channel creation
Fixes #519
2025-10-06 21:11:53 -05:00
SergeantPanda
13874d64ad Bug fix: If no streams are found during an XC account refresh we were not releasing the task lock and we weren't processing vods.
Fixes #449
2025-10-06 20:04:22 -05:00
SergeantPanda
bc574c272c Bug fix: convert commas to decimal points and verify float before saving VOD/series to the database.
Fixes #526
2025-10-06 19:45:43 -05:00
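A sketch of the normalization described above (the helper name is hypothetical):

```python
def to_float_or_none(value):
    # Ratings can arrive as strings with a decimal comma ("7,5");
    # normalize to a dot and refuse anything that still isn't a float.
    try:
        return float(str(value).replace(",", "."))
    except (TypeError, ValueError):
        return None

print(to_float_or_none("7,5"), to_float_or_none("8.1"), to_float_or_none("N/A"))
```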
SergeantPanda
144a861142 Bug fix: When using direct urls in m3u output, use the correct stream based on the order for the channel.
Fixes #528
2025-10-06 18:18:05 -05:00
SergeantPanda
e01338f055 Enhancement: Properly track channel creation time and channel update times in the database. XC Get Live streams will now use this for the added field. 2025-10-06 18:07:51 -05:00
SergeantPanda
22493c2797 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-10-06 16:50:28 -05:00
SergeantPanda
a4a677a6fb Bug fix: Streamers with ALL assigned will now get all channels they have access to instead of all profiles. Previously, if there was no profile other than default, the streamer would not get any channels. 2025-10-06 16:50:17 -05:00
OkinawaBoss
8db9689999
Merge branch 'DVR-Update' into dev 2025-10-06 06:25:40 -07:00
Dispatcharr
dea6411e1c Time Zones
- Added time zone settings
2025-10-06 07:46:23 -05:00
SergeantPanda
a31feee311 Bug fix: Ensure distinct channel results in generate_m3u, generate_epg, and xc_get_live_streams functions. Fixes duplicate channels output for streamer profiles that were set to "All" 2025-10-05 19:32:40 -05:00
SergeantPanda
ad4143d035
Merge pull request #514 from Blarm1959/dev
Replaces #499 - debian_install.sh - Update installed packages, node and create missing folders
2025-10-05 18:54:56 -05:00
SergeantPanda
0406c868bc
Merge pull request #515 from EmeraldPi/set-name-logo 2025-10-05 18:51:30 -05:00
SergeantPanda
cfd235ba34
Merge pull request #457 from EmeraldPi/logo-batch-edit 2025-10-05 18:47:29 -05:00
SergeantPanda
cedd0d6f4d Finish merge 2025-10-05 18:44:26 -05:00
SergeantPanda
e6a9672a14 Merge branch 'dev' into pr/csmith1210/457 2025-10-05 18:40:59 -05:00
SergeantPanda
d7735255ec Enhancement: Add confirmation dialogs for setting names, logos, TVG-IDs, and clearing EPG assignments in ChannelBatchForm 2025-10-04 17:39:56 -05:00
SergeantPanda
67a6ad1168 Enhancement: Add clear EPG button to ChannelBatchForm for resetting EPG assignments 2025-10-04 17:35:01 -05:00
SergeantPanda
18dc73cbcb
Merge pull request #518 from Dispatcharr/Assign-tvg-id-from-epg 2025-10-04 17:24:49 -05:00
SergeantPanda
25d6322186 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into Assign-tvg-id-from-epg 2025-10-04 16:59:53 -05:00
GitHub Actions
882be5cdf8 Release v0.10.3 2025-10-04 21:54:01 +00:00
SergeantPanda
9173f0b876
Merge pull request #517 from Dispatcharr/dev 2025-10-04 16:52:13 -05:00
SergeantPanda
29ee837b24 Add recover=True to iterparse for parse_programs_for_tvg_id as well to fix cloudflare script injection. 2025-10-04 16:36:49 -05:00
SergeantPanda
94f966e027 Reverted to old method for parsing xml. Still will not break if Cloudflare adds a new root element. 2025-10-04 16:25:35 -05:00
SergeantPanda
d1ac5b11e5 Add EPG TVG-ID setting functionality
- Implemented API endpoint to set channel TVG-IDs from EPG data.
- Created Celery task to handle TVG-ID updates for multiple channels.
- Added frontend methods to initiate TVG-ID setting from EPG for both single and batch channel forms.
- Enhanced notifications for task status updates.
2025-10-04 15:20:35 -05:00
Connor Smith
23209e79e9 [Enhancement] Set logo name from url 2025-10-04 14:23:53 -04:00
Blarm1959
a502d309f1 Update installed packages, node and create missing folders 2025-10-04 19:11:30 +01:00
SergeantPanda
d316173ab6
Merge pull request #512 from Dispatcharr/revert-511-revert-458-set-logo-url-v1
Revert "Revert "Add option to add a logo by a URL in Channel editor/creator""
2025-10-04 09:23:01 -05:00
SergeantPanda
23be065c52
Revert "Revert "Add option to add a logo by a URL in Channel editor/creator"" 2025-10-04 09:22:09 -05:00
SergeantPanda
9947f36286
Merge pull request #511 from Dispatcharr/revert-458-set-logo-url-v1
Revert "Add option to add a logo by a URL in Channel editor/creator"
2025-10-04 09:12:13 -05:00
SergeantPanda
e4a6d19c17
Revert "Add option to add a logo by a URL in Channel editor/creator" 2025-10-04 09:11:38 -05:00
SergeantPanda
3128c116e8
Merge pull request #458 from csmith1210/set-logo-url-v1 2025-10-04 08:54:27 -05:00
Connor Smith
ff894acf4d Fix: Need onSuccess in file upload to update Channel.jsx logo 2025-10-03 21:15:45 -04:00
SergeantPanda
d0e31e8acd Rebuilt FFmpeg base container 2025-10-03 17:11:00 -05:00
SergeantPanda
4aafece68e Updated GitHub Actions workflow for building base image. Changed runner version, added native runner builds instead of qemu 2025-10-03 17:02:10 -05:00
GitHub Actions
edda2ca3a5 Release v0.10.2 2025-10-03 21:08:54 +00:00
SergeantPanda
d0413e63be Push releases to Dockerhub. 2025-10-03 16:05:56 -05:00
SergeantPanda
9b2402a421
Merge pull request #510 from Dispatcharr/dev
Release 0.10.2
2025-10-03 15:39:15 -05:00
SergeantPanda
22409b4f01
Merge branch 'main' into dev 2025-10-03 15:36:28 -05:00
SergeantPanda
172bb204f4 Refactor release workflow to run on native architecture. 2025-10-03 14:33:18 -05:00
SergeantPanda
0a15e09805 Refactor XMLTV parsing to use iterparse for <tv> element. This fixes issues where Cloudflare is adding a root element and breaking the xml.
Fixes #497: Cloudflare hosted EPG not supported in dispatcharr
2025-10-03 13:30:20 -05:00
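The incremental-parsing approach can be sketched with the standard library (later commits add `recover=True`, an lxml feature; the wrapper element below mimics what Cloudflare injects):

```python
import io
import xml.etree.ElementTree as ET

# An XMLTV feed wrapped in an unexpected extra root element.
doc = b"<injected><tv><channel id='one'/><channel id='two'/></tv></injected>"

channel_ids = []
# iterparse streams the document and lets us react to <channel> elements
# wherever they appear, instead of assuming <tv> is the document root.
for _event, elem in ET.iterparse(io.BytesIO(doc), events=("end",)):
    if elem.tag == "channel":
        channel_ids.append(elem.get("id"))
        elem.clear()  # release memory as we go

print(channel_ids)  # ['one', 'two']
```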
SergeantPanda
d06c5bfdf3 Add recover to xmltv parser if there are errors. 2025-10-03 10:15:24 -05:00
Connor Smith
68c8d4dc23 Change logo upload to link to Logo.jsx form 2025-10-03 07:05:56 -04:00
SergeantPanda
c73271c617 Bug Fix: Sort order while bulk creating channels will now be in the order they were selected in not reverse order. 2025-10-02 13:23:24 -05:00
SergeantPanda
5bb2b57a4e Remove redundant h from 12 hour time in settings page. 2025-10-02 11:48:16 -05:00
SergeantPanda
d8ad33ff77 Fix bug where m3u hash settings wouldn't save properly. 2025-10-02 11:45:39 -05:00
SergeantPanda
e841343e5b Don't convert m3u_id to string for hash. 2025-10-02 11:06:10 -05:00
SergeantPanda
f90b24db40 Change default M3U hash key to URL only for new installs. 2025-10-02 09:28:02 -05:00
SergeantPanda
a3e4f23891 Enhancement: Add m3u_id parameter to generate_hash_key and update related calls 2025-10-02 09:14:22 -05:00
SergeantPanda
c99456f77d Remove sha_short tag to reduce number of tags. 2025-09-30 20:12:33 -05:00
SergeantPanda
9de6f79016 More organization changes. 2025-09-30 20:03:02 -05:00
SergeantPanda
b74b5b9b1b Add DOCKERHUB_ORGANIZATION secret 2025-09-30 19:55:47 -05:00
SergeantPanda
c7c9607071 Remove comments 2025-09-30 19:32:58 -05:00
SergeantPanda
041cb69bb8 Push to dockerhub as well. 2025-09-30 19:17:35 -05:00
SergeantPanda
b5223c13e7 Use the same timestamp for all images. 2025-09-30 18:30:27 -05:00
SergeantPanda
7136f8392d Create combined manifest 2025-09-30 18:07:05 -05:00
SergeantPanda
58924e6834 Only build for current architecture. 2025-09-30 14:54:23 -05:00
SergeantPanda
44478804f8 Split install and build into separate tasks. 2025-09-30 10:25:39 -05:00
SergeantPanda
37b05f1dde Attempt to fix epipe error in docker build. 2025-09-30 10:01:11 -05:00
SergeantPanda
9da20b1941 Switch back to node 24, switch to npm for building frontend. Remove node_modules before starting frontend build. Fixes bug with NPM 2025-09-30 09:22:05 -05:00
SergeantPanda
0e6889c6c1 Try node 20 (again) 2025-09-28 18:43:11 -05:00
SergeantPanda
e072fb88f9 Another test. 2025-09-28 18:30:28 -05:00
SergeantPanda
bbb559d1cd Attempt to work around known npm bug for ARM. 2025-09-28 17:38:15 -05:00
SergeantPanda
a0745fcaa5 Switch from yarn to npm for building 2025-09-28 11:57:13 -05:00
SergeantPanda
f3de398c89 Remove extra build tools. 2025-09-28 11:49:24 -05:00
SergeantPanda
836871f102 Add installation of build dependencies in Dockerfile 2025-09-28 11:36:59 -05:00
SergeantPanda
cd5135ba27 Add testing-library/dom to package-lock 2025-09-28 11:30:14 -05:00
SergeantPanda
c5029213fc Add testing-library/dom to dev dependencies 2025-09-28 10:20:30 -05:00
SergeantPanda
deb532071b Switch from emulated arm building to native. 2025-09-28 10:06:58 -05:00
SergeantPanda
01971fb91a Switch from emulated arm building to native. 2025-09-28 10:04:34 -05:00
SergeantPanda
e281041458 Get logs from yarn install if failed. 2025-09-28 09:38:01 -05:00
SergeantPanda
19f0088b40 Switch from slim build to full build of node 24. 2025-09-28 09:30:26 -05:00
SergeantPanda
db5713a050 Update Dockerfile to use Node.js 24-slim for frontend build 2025-09-27 17:53:15 -05:00
SergeantPanda
563f890cf4 Yarn apparently no longer supports ignoring optional dependencies. 2025-09-27 17:47:23 -05:00
SergeantPanda
09b79bd174 Fix: Update yarn install command to remove fallback option 2025-09-27 17:41:30 -05:00
SergeantPanda
857471e1ad Attempt to get arm builds working 2025-09-27 17:39:00 -05:00
SergeantPanda
1493eaf28b Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-09-27 17:33:25 -05:00
SergeantPanda
4a4d93767e Separate steps to debug failed builds 2025-09-27 17:28:34 -05:00
SergeantPanda
134093b18e Enhancement: Add 'nice -n 5' to celery commands in configuration files for better process priority management 2025-09-27 15:32:29 -05:00
SergeantPanda
93dd37e822 Enhancement: Add x-tg-url and url-tvg URL generation with preserved query parameters to M3U content 2025-09-26 14:10:42 -05:00
SergeantPanda
ec193813b4 Enhancement: Update logo options to include the default logo 2025-09-26 10:24:00 -05:00
SergeantPanda
70f7484fb5 Better connection tracking for apps that do not reuse connections during seeking operations. 2025-09-26 09:18:43 -05:00
SergeantPanda
7bb4df78c8 Enhancement: Update M3U profile retrieval to include current connection counts and session handling during vod session start. 2025-09-26 09:18:42 -05:00
SergeantPanda
d961d4cad1 Bug fix: VOD will now select the correct M3U profile while starting. Fixes #461 2025-09-26 09:18:22 -05:00
SergeantPanda
0cdce1a81b Enhancement: Refactor stream selection logic when all available profiles have max connections used. Will retry faster. 2025-09-25 17:49:05 -05:00
SergeantPanda
23f397c805 Enhancement: Add exact Gracenote ID matching to EPG channel mapping 2025-09-25 11:51:10 -05:00
GitHub Actions
70aaa2a04c Release v0.10.1 2025-09-24 21:40:25 +00:00
SergeantPanda
86344b43ba
Merge pull request #468 from Dispatcharr:dev
# Dispatcharr Release – Version 0.10.1
2025-09-24 16:39:30 -05:00
SergeantPanda
a12bfeab46 Enhancement: Increase max_length of URL field in EPGSource model to 1000 characters 2025-09-24 16:12:17 -05:00
SergeantPanda
6fa12f90c5 Improved URL transformation logic to support more advanced regex during profile refreshes. 2025-09-24 15:51:26 -05:00
Connor Smith
fd9038463b Allow for batch editing channel logos 2025-09-22 20:29:32 -04:00
SergeantPanda
75fbf9639a Enhancement: Update channel and program mapping to support multiple channels per TVG ID 2025-09-21 16:23:07 -05:00
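The one-to-many mapping can be sketched as follows (the channel data is illustrative):

```python
from collections import defaultdict

channels = [("ESPN", "espn.us"), ("ESPN Alternate", "espn.us"), ("BBC One", "bbc1.uk")]

# Map each TVG ID to every channel that claims it, rather than letting a
# second channel silently overwrite the first in a one-to-one dict.
by_tvg_id = defaultdict(list)
for name, tvg_id in channels:
    by_tvg_id[tvg_id].append(name)

print(dict(by_tvg_id))
```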
SergeantPanda
4db8eca391
Merge pull request #448 from stlalpha/fix-438
Fix #438: Virtualize TV Guide rendering to improve large channel count performance and add regression tests
2025-09-21 15:50:28 -05:00
SergeantPanda
08e5b6f36f Apply prettier formatting. 2025-09-21 15:17:15 -05:00
SergeantPanda
6f79845b21 Enhancement: Only grab first display name for a channel during epg scanning. 2025-09-21 12:40:20 -05:00
SergeantPanda
99122cac7c Bug fix: If URL for Channel element in EPG is longer than 500 characters parsing would fail. Added validation during scanning. 2025-09-21 12:23:48 -05:00
SergeantPanda
3f7edd840e Change whiteSpace style from 'nowrap' to 'pre' in StreamsTable for better text formatting.
Users can now reliably copy a name or group from the stream table and use it as a filter for M3U accounts.
2025-09-21 11:17:14 -05:00
SergeantPanda
63729fb0ea Improved logging for stream filters. 2025-09-21 10:42:23 -05:00
SergeantPanda
207613c00b Fix frontend saving case sensitive setting as json string. 2025-09-21 10:06:53 -05:00
Jim McBride
323f1d5c05
Add TV guide utility tests and vitest setup 2025-09-21 10:00:15 -05:00
Jim McBride
00b8119b81 Revert "Virtualize TV guide rendering"
This reverts commit db024130be.
2025-09-21 01:25:29 -05:00
Jim McBride
db024130be Virtualize TV guide rendering 2025-09-21 01:02:32 -05:00
Dispatcharr
6536f35dc0 Fixed bug
Fixed bug that stopped stream from ending
2025-09-19 19:47:59 -05:00
Dispatcharr
eee4ab0725 Update for recurring rules 2025-09-19 16:21:28 -05:00
OkinawaBoss
02ac0d8de5
Merge pull request #430 from Dispatcharr/dev
Syncing with dev
2025-09-19 12:38:57 -07:00
GitHub Actions
773e8e7d54 Release v0.10.0 2025-09-18 22:28:47 +00:00
SergeantPanda
9eade91958
Merge pull request #424 from Dispatcharr/dev 2025-09-18 17:27:31 -05:00
BigPanda
0dbc5221b2 Add 'UK' region
I'm not sure if this was intentional, but the UK seems to be missing from
the region list.
2025-09-18 21:20:52 +01:00
SergeantPanda
b3debcd014 Fix bug during VOD cleanup where all VODs not from the current m3u scan would be deleted. 2025-09-18 14:14:04 -05:00
SergeantPanda
48a2f2da39 Simplify VOD and series access for all authenticated users by removing user-level restrictions on M3U accounts. 2025-09-18 11:17:49 -05:00
SergeantPanda
f4f29a0e27 Break out of channel child elements once required data is acquired. 2025-09-18 10:34:51 -05:00
Dispatcharr
424a450654 DVR Features and bug fixes
Added ability to use custom comskip.ini
Added series recording without reliance on EPG
Fixed comskip bug
Fixed timezone mismatch when scheduling DVR recordings

No migrations completed yet
2025-09-18 10:23:16 -05:00
SergeantPanda
edc18e07fe Auto-focus filter for epg. 2025-09-16 20:10:49 -05:00
SergeantPanda
00da233322 Rename logos variable to channelLogos to avoid future confusion. 2025-09-16 19:49:41 -05:00
SergeantPanda
9ef2aa966d Requery channels when setting channel names from epg. 2025-09-16 19:44:41 -05:00
SergeantPanda
ab3350d08d Search all logos instead of just channel assignable. 2025-09-16 19:41:11 -05:00
SergeantPanda
2e5280c46a Remove unneeded logo call. 2025-09-16 19:17:31 -05:00
SergeantPanda
8b740fc3ac Move buttons for use EPG name and use EPG logo. 2025-09-16 18:49:02 -05:00
SergeantPanda
7e13e51198 Update the frontend on logo change. 2025-09-16 17:55:55 -05:00
SergeantPanda
3cb5a061c9 Show progress as notifications. 2025-09-16 17:35:38 -05:00
SergeantPanda
d2d1984797 Switch bulk epg name and logo to backend celery tasks for efficiency and scrape epg channel logo during epg scanning. 2025-09-16 17:17:07 -05:00
SergeantPanda
8607d626fa Update logo store when bulk changing logos. 2025-09-16 16:32:32 -05:00
SergeantPanda
388d9e7171 Fix logos not being set. 2025-09-16 16:25:50 -05:00
SergeantPanda
64a019597d Add ability to channel edit form and bulk channel editor to set logos and names from assigned epg.
Closes #157 [Feature]: Logo from EPG
2025-09-16 16:20:16 -05:00
SergeantPanda
cc03ad7d64
Merge pull request #413 from Dispatcharr/epg-auto-match-refactor 2025-09-16 15:58:56 -05:00
SergeantPanda
a846b09ad3 Minor formatting adjustment. 2025-09-16 14:39:04 -05:00
SergeantPanda
60e378b1ce Add support for matching selected channels with EPG data
- Updated API to accept optional channel IDs for EPG matching.
- Enhanced match_epg method to process only specified channels if provided.
- Implemented new task for matching selected channels in the backend.
- Updated frontend to trigger EPG matching for selected channels with notifications.
2025-09-16 14:38:16 -05:00
SergeantPanda
20685b8344 Lower regional bonus. Remove epg_match script. 2025-09-16 14:27:07 -05:00
SergeantPanda
c7235f66ba Use stricter matching for bulk matching. 2025-09-16 14:12:45 -05:00
SergeantPanda
6384f4f56f Add progress notifications for EPG matching process 2025-09-16 13:47:59 -05:00
SergeantPanda
d6bb9e40b2 Implement memory cleanup for ML models after channel matching operations 2025-09-16 13:15:32 -05:00
SergeantPanda
c55dcfd26a Remove unnecessary checking of cache directories. Lets sentence transformers handle it. 2025-09-16 13:01:43 -05:00
SergeantPanda
fedc98f848 Removed unneeded debug logging. 2025-09-16 12:54:19 -05:00
SergeantPanda
d2085d57f8 Add sentence transformers to new matching function. 2025-09-16 12:43:21 -05:00
SergeantPanda
f6be6bc3a9 Don't use matching script 2025-09-16 09:18:41 -05:00
SergeantPanda
f1739f2394 Add EPG auto-match functionality for specific channels and update UI 2025-09-16 08:55:10 -05:00
SergeantPanda
eccc5d709a Sort categories by name during api retrieval. 2025-09-15 20:38:18 -05:00
SergeantPanda
f4e91013f2
Remove local data volume binding
Removed local data volume binding from docker-compose.
2025-09-15 20:14:02 -05:00
SergeantPanda
56aa5c77d2 Filter out profiles during db query that are inactive. 2025-09-15 20:02:40 -05:00
SergeantPanda
ed0b291237 Skip disabled m3u accounts when choosing streams during playback.
Closes #402
2025-09-15 17:36:31 -05:00
SergeantPanda
dfaae6e617 Enhance UserViewSet queryset to prefetch related channel_profiles for improved performance 2025-09-14 19:47:40 -05:00
SergeantPanda
22e1a8cc05 Fix: "ReferenceError: setIsInitialized is not defined" when logging into web ui. 2025-09-14 19:34:32 -05:00
SergeantPanda
5e661ea208 Fix VOD page not displaying correct order while changing pages. 2025-09-14 19:27:11 -05:00
SergeantPanda
f8e91155e2 Fix bug: cannot access local variable 'total_chunks' where it is not associated with a value 2025-09-14 15:19:51 -05:00
SergeantPanda
0c507988b6 Remove max size on group and epg columns. Increase default column size for epg 2025-09-14 14:53:55 -05:00
SergeantPanda
d7129d6195 Change EPG column to have (EPG ID - TVG-ID) and standardize header label. 2025-09-14 14:35:51 -05:00
SergeantPanda
f84a347514 Don't put channel name in EPG name when no epg is assigned. 2025-09-14 13:53:02 -05:00
SergeantPanda
661d5f9d43
Merge pull request #392 from Dispatcharr:bulk-channel-creation-refactor
Enhance channel creation process with starting channel number and bulk creation updates
2025-09-14 13:37:50 -05:00
SergeantPanda
a1d35a8dad Additional check if channel number is in use before creating. 2025-09-14 13:34:03 -05:00
SergeantPanda
6c1dbff91c Send websocket when channel is created. 2025-09-14 13:27:31 -05:00
SergeantPanda
58a3da386a Allow single channel creation from stream to specify channel number 2025-09-14 13:24:11 -05:00
SergeantPanda
4886426ea0 New feature: Allow specifying channel number during channel creation.
Closes #377
Closes #169 [Feature]: Starting channel number/continue auto numbering from channel number X (user defined)
2025-09-14 12:46:30 -05:00
SergeantPanda
0411fe003a Add ability to specify starting channel numbers to the api. 2025-09-14 12:30:24 -05:00
SergeantPanda
4f49636899 Remove old bulk create view. 2025-09-14 12:19:23 -05:00
SergeantPanda
8ab9a508d4 Remove deprecated createChannelsFromStreams method 2025-09-13 20:32:25 -05:00
SergeantPanda
05641cfb02 Remove duplicate websocket notification 2025-09-13 20:30:18 -05:00
SergeantPanda
7866eed613 Fetch channels after successful creation of channels. 2025-09-13 20:21:27 -05:00
SergeantPanda
301a162e71 Switch to normal notifications 2025-09-13 20:16:14 -05:00
SergeantPanda
cf7ea09341 Implement asynchronous bulk channel creation from stream IDs with WebSocket progress updates 2025-09-13 19:59:43 -05:00
SergeantPanda
5875c31750 Add POST endpoint to retrieve streams by IDs and enhance channel creation process 2025-09-13 19:10:47 -05:00
SergeantPanda
d09a5efe80 Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-09-13 18:24:04 -05:00
SergeantPanda
fe2edf16b0 Better rendering of horizontal scroll bar. 2025-09-13 18:17:09 -05:00
SergeantPanda
40cbb745cd Fix bug where not setting a channel number would cause an error when creating a channel. 2025-09-13 17:15:59 -05:00
SergeantPanda
92a26caf03 Fix bug where clicking add channel with a channel selected opened the edit form. 2025-09-13 17:09:52 -05:00
SergeantPanda
81520293f4 Fix bug where a channel created could use streams from another channel. The form wasn't being fully cleared. 2025-09-13 17:02:40 -05:00
SergeantPanda
0d659f4435 Set name to expand/contract automatically when growing/shrinking the stream and channel table. 2025-09-13 16:55:50 -05:00
SergeantPanda
d55909989c Remove unnecessary destructuring in StreamsTable component definition 2025-09-13 16:39:11 -05:00
SergeantPanda
33700ccc06
Merge pull request #379 from Dispatcharr:table-column-resizing
Implement column resizing and improve table layouts
2025-09-13 12:16:41 -05:00
GitHub Actions
082bbdd82b Release v0.9.1 2025-09-13 17:10:01 +00:00
SergeantPanda
59379ae59a
Merge pull request #385 from Dispatcharr:dev
Dispatcharr Release Notes – Version 0.9.1
2025-09-13 12:09:20 -05:00
SergeantPanda
41d7066d6e Fix incorrect paths for DVR and Plugins. 2025-09-13 11:49:04 -05:00
SergeantPanda
75816b5d8e Change folder creation for Recordings in entrypoint from 'recordings' to 'Recordings' 2025-09-13 09:07:08 -05:00
SergeantPanda
892fde89b5 Fixed broken migration for plugins 2025-09-13 08:52:53 -05:00
SergeantPanda
55fdc28450 Better sizing on user-agents table. 2025-09-12 17:10:49 -05:00
SergeantPanda
b578969b8e Better sizing of stream profiles table 2025-09-12 17:04:52 -05:00
SergeantPanda
5e2794a62e Fix sizing on logo table and user table. 2025-09-12 17:00:59 -05:00
SergeantPanda
65e947c95b Rename type columns. 2025-09-12 16:40:30 -05:00
SergeantPanda
900ce73200 Better sizing for epg and m3u tables. Rearranged m3u table slightly. 2025-09-12 16:38:18 -05:00
GitHub Actions
42f52b7484 Release v0.9.0 2025-09-12 18:43:30 +00:00
SergeantPanda
d88003b542
Merge pull request #378 from Dispatcharr/dev
# Dispatcharr – Version 0.9.0
2025-09-12 13:42:33 -05:00
SergeantPanda
fa41b8eb23 Save divider location in local storage. 2025-09-12 11:12:27 -05:00
SergeantPanda
448ef16987 Expand header width to fill container. 2025-09-12 10:56:17 -05:00
SergeantPanda
a9ca6502ed Better calculation for horizontal scroll bar in channel and stream tables. 2025-09-12 10:31:35 -05:00
SergeantPanda
51352ba734 Save stream table header sizing to local storage. 2025-09-12 10:24:23 -05:00
SergeantPanda
09ba42d36c Enhance column resizing functionality in ChannelsTable, CustomTableHeader, and CustomTableBody components 2025-09-12 10:08:46 -05:00
SergeantPanda
22f0a4078b Implement column resizing functionality in Channels Table, Streams Table and CustomTableHeader 2025-09-12 09:22:02 -05:00
Dispatcharr
c85316b912 Added DVR offset
Added pre/post offset for DVR recordings.
2025-09-11 20:58:42 -05:00
SergeantPanda
ab36b28b51 Fix logos not loading in channel edit form. 2025-09-11 15:03:26 -05:00
SergeantPanda
3d2873fd4f Add EPG Channel name to epg filter.
Closes #286 [Feature]: 2 Value/Column EPG in Channel Edit
2025-09-11 13:48:12 -05:00
SergeantPanda
0a8b8919b9 Add EPG source name before epg name in channel edit form. 2025-09-11 13:26:40 -05:00
SergeantPanda
43afbd4505 Shrink width of group column. 2025-09-11 12:57:25 -05:00
SergeantPanda
e5ee64c575 Allow filtering by EPG assignment.
Closes #155
2025-09-11 12:43:44 -05:00
SergeantPanda
7925b601f4 Add epg column to channels table. 2025-09-11 12:10:27 -05:00
SergeantPanda
f797e6fff5 Persist page size in localStorage and update page size change handler in VODsPage component 2025-09-11 11:00:47 -05:00
SergeantPanda
92b4e9e348 Reset category filter on type change in VODsPage component 2025-09-11 10:56:41 -05:00
SergeantPanda
5f00027425 Reorganize VODModal buttons: move Play button next to Watch Trailer and update styles 2025-09-11 10:55:32 -05:00
SergeantPanda
6cc67851b3 Remove account type badge from modals. 2025-09-11 10:43:22 -05:00
SergeantPanda
307bd25e3e Filter category options based on selected type in VODsPage component 2025-09-11 10:36:06 -05:00
SergeantPanda
45817c699f Sort combined VODs alphabetically by name/title and update fetch logic for 'all' type 2025-09-11 10:34:28 -05:00
SergeantPanda
7a95555bd5 Add page size to vod page. 2025-09-11 10:26:31 -05:00
SergeantPanda
d1a3b667fe Store file modification time in Redis for uploaded logo files 2025-09-11 09:50:21 -05:00
SergeantPanda
4c3aef196d Refactor VOD to separate out series and vod modals. 2025-09-11 09:19:27 -05:00
SergeantPanda
74aff3fb1a Enhance M3U profile management: add optional notes field, improve validation for default profiles, and update UI to display notes where applicable.
Also adds the ability to modify the name of the default profile.
Closes #280 [Feature]: add general text field in m3u/xs
2025-09-10 17:26:18 -05:00
SergeantPanda
f218eaad51
Merge pull request #366 from Dispatcharr:XC-account-info
Enhance M3U account profile with custom properties and account info retrieval
2025-09-09 18:28:07 -05:00
SergeantPanda
893fdcade3 Use playlist store 2025-09-09 18:22:47 -05:00
SergeantPanda
84c752761a Ability to refresh account info from account info modal and provide notification on results. 2025-09-09 18:15:33 -05:00
SergeantPanda
d1a5143312 Convert refresh time to local time. 2025-09-09 16:43:57 -05:00
SergeantPanda
c25de2a191 Add AccountInfoModal component and integrate account information display in M3UProfiles 2025-09-09 16:29:24 -05:00
SergeantPanda
c023cde8fe Add custom properties to M3UAccountProfile and implement account info retrieval 2025-09-09 16:05:58 -05:00
SergeantPanda
c0ddec6b4b Reduced cleanup time on vod from 10 seconds to 1. 2025-09-09 13:06:40 -05:00
SergeantPanda
64e500c524 Add logging for profile connection cleanup and implement delayed cleanup for idle Redis connections 2025-09-09 12:58:35 -05:00
OkinawaBoss
3fb8e0ebd1
Merge pull request #363 from Dispatcharr/Plugins
Plugins
2025-09-08 09:11:19 -05:00
SergeantPanda
0938a3c592 Better headers for the client, including a Content-Range header that also reports the total length. 2025-09-07 18:51:58 -05:00
SergeantPanda
b45c6eda38 Better content-type detection 2025-09-07 18:25:39 -05:00
Dispatcharr
1200d7d894 Plugin discovery fix
Fixed problem where plugins were trying to load before DB was ready.
2025-09-07 17:38:34 -05:00
SergeantPanda
be9823c5ce Added debug logs for active clients 2025-09-07 17:14:29 -05:00
SergeantPanda
adf960753c Changed user-agent for head request to use m3u account 2025-09-07 15:17:19 -05:00
Dispatcharr
5b31440018 Updated Plugins
Added Import + Delete
Added Modal confirmations
Safer Defaults
2025-09-07 11:54:22 -05:00
SergeantPanda
c239f0300f Better progress tracking for clients that start a new session on every seek instead of reusing existing session. 2025-09-06 15:36:14 -05:00
SergeantPanda
4ca6bf763e Add position calculation 2025-09-06 10:16:54 -05:00
SergeantPanda
80c3d2fa58 Reformat vod card a bit. 2025-09-06 08:34:28 -05:00
SergeantPanda
aeb1933abb Separate every vod connection even if same vod is being played. 2025-09-06 08:13:28 -05:00
SergeantPanda
f135c6ae8b
Merge pull request #357 from Dispatcharr:vod-stats
Rename Active Channels to Active Streams and enhance VOD features
2025-09-05 21:31:40 -05:00
SergeantPanda
1817aab2da Fix channel cards not closing when channel stops. 2025-09-05 21:09:03 -05:00
SergeantPanda
db80a2fa09 Made vod card similar to channel card 2025-09-05 21:04:41 -05:00
SergeantPanda
d4b7b3d3d2 Combine vod and streams into same view 2025-09-05 20:40:55 -05:00
SergeantPanda
6b9d42fec1 Refactor VODStatsView to use correct field names for content type and M3U profile ID, and update user agent handling 2025-09-05 20:34:43 -05:00
SergeantPanda
1a8763731b Remove unneeded headers 2025-09-05 20:26:35 -05:00
SergeantPanda
18b8462a5f Refactor user-agent handling to use 'client_user_agent' for consistency across VOD connection management 2025-09-05 20:15:11 -05:00
SergeantPanda
7c4d7865ea Fixed not using user-agent from m3u (was using client user-agent) 2025-09-05 19:36:08 -05:00
SergeantPanda
1080b1fb94 Refactor stats and vod proxy 2025-09-05 19:30:13 -05:00
Dispatcharr
e9a11588c4 Init Plugins 2025-09-05 17:10:11 -05:00
SergeantPanda
4712a4305c Rename "Active Channels" to "Active Streams" and enhance automatic refresh interval UI 2025-09-05 12:22:06 -05:00
SergeantPanda
b68b904838
Merge pull request #356 from Dispatcharr:stats-direct-api
Switch to api calls instead of celery beat for stats
2025-09-05 12:10:10 -05:00
SergeantPanda
f1196bb988 Send websocket update on client disconnect and connect. 2025-09-05 12:04:30 -05:00
SergeantPanda
98f485bac9 Fixed issue with recurring api calls. 2025-09-05 11:37:49 -05:00
SergeantPanda
870e77b137 Disable fetch-channel-statuses 2025-09-05 10:23:25 -05:00
SergeantPanda
d709d92936 Refactor channel stats fetching and enhance settings UI for better user experience 2025-09-05 09:42:52 -05:00
SergeantPanda
ca79cc1a1d
Merge pull request #354 from Dispatcharr:DVR-Refresh
DVR enhancements and Comskip integration
2025-09-04 17:30:29 -05:00
SergeantPanda
67939ba44b Auto cleanup orphaned vods and series during scan. 2025-09-04 16:57:32 -05:00
Dispatcharr
8c364d3eb8 Fix DVR
Fixed bug where connection wouldn't release
2025-09-04 16:00:01 -05:00
SergeantPanda
e80e1b9014 Add comskip to base image for upcoming DVR feature/overhaul. 2025-09-04 14:50:32 -05:00
Dispatcharr
7401b4c8d3 Combined migrations 2025-09-04 14:36:14 -05:00
Dispatcharr
f652d2b233 Comskip Update 2025-09-04 13:45:25 -05:00
SergeantPanda
fbeca53cd7 Minor fixes to VOD API: trailer returned null instead of an empty string; removed decimal from added time. 2025-09-04 13:44:15 -05:00
Dispatcharr
c76d68f382 Updated guide
Updated series rules button to match the rest of the layout.
Fixed problem where the remove button would not remove the rule
2025-09-04 08:40:48 -05:00
Dispatcharr
41e32bc08a DVR Updates
Added fallback settings.
Added subtitles to cards.
Add data volume mount to Docker container.
2025-09-04 08:22:13 -05:00
Dispatcharr
00cc83882a DVR update 2025-09-03 21:35:45 -05:00
SergeantPanda
5806464406 Fix unique constraint violations when rehashing streams 2025-09-03 16:23:58 -05:00
SergeantPanda
55f54081b1
Merge pull request #343 from Dispatcharr:Channel-Batch-Update
Add batch channel name updates with regex support
2025-09-02 19:51:22 -05:00
Dispatcharr
c4f9a998e5 Added batch channel name updates 2025-09-02 17:23:18 -05:00
SergeantPanda
5ccef5db32
Merge pull request #342 from Dispatcharr:vod-endpoint-optimizations
Vod-endpoint-optimizations
2025-09-02 15:54:49 -05:00
SergeantPanda
eff4b665b1 Fix bug where multiple relations would error out get_vod_info 2025-09-02 15:52:40 -05:00
SergeantPanda
ffe6ce5a6b Removed unnecessary fall backs for trailer. 2025-09-02 15:35:04 -05:00
SergeantPanda
ac5df3fd28 Optimize VOD streams retrieval by using prefetch_related to eliminate N+1 queries 2025-09-02 14:59:33 -05:00
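The N+1 elimination above relies on Django's `prefetch_related`, which issues one batched query and joins the results in memory instead of querying per row. The idea can be sketched in plain Python — the model and function names here are illustrative, not Dispatcharr's actual code:

```python
from collections import defaultdict

# Tiny in-memory "database" of (movie_id, stream) rows for illustration.
STREAMS = [(1, "provider-a"), (1, "provider-b"), (2, "provider-c")]

def fetch_streams_naive(movies):
    # N+1 pattern: one lookup per movie.
    return {m: [s for mid, s in STREAMS if mid == m] for m in movies}

def fetch_streams_batched(movies):
    # One bulk lookup, then group in memory --
    # the same shape of work prefetch_related does.
    wanted = set(movies)
    grouped = defaultdict(list)
    for movie_id, stream in STREAMS:
        if movie_id in wanted:
            grouped[movie_id].append(stream)
    return {m: grouped[m] for m in movies}
```

Both return the same mapping; the batched version just does it with a single scan of the data source.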
SergeantPanda
e7201b4429
Merge pull request #341 from Dispatcharr:convert-custom_properties-from-text-to-jsonb-fields-
Convert-custom_properties-from-text-to-jsonb-fields-
2025-09-02 10:35:27 -05:00
SergeantPanda
6c1bfaf052 Increase server URL max length from 200 to 1000 characters to accommodate tokens in the URL 2025-09-02 10:20:23 -05:00
SergeantPanda
8cae13f845 Remove unneeded migration. 2025-09-02 09:59:06 -05:00
SergeantPanda
9c41960405 Added conversion to jsonb for users table. 2025-09-02 09:52:39 -05:00
SergeantPanda
246b00e2bf Convert backend to use jsonb fields. 2025-09-02 09:48:54 -05:00
SergeantPanda
6f6c28ca7c Convert custom_properties to jsonb in the backend. 2025-09-02 09:41:51 -05:00
SergeantPanda
a87f7c875d Merge custom properties when refreshing groups instead of overwriting. 2025-09-02 09:10:52 -05:00
SergeantPanda
071e74124e Remove relations to groups that no longer exist in the m3u account. Also delete groups that are left orphaned after the relation is removed. 2025-09-01 21:09:50 -05:00
SergeantPanda
7ea10064d7 Update group custom_properties during m3u update.
Fixes issue when converting from M3U account to XC. Will add the xc_id now.
2025-09-01 19:48:30 -05:00
SergeantPanda
648e2bb2dd Applied our Prettier formatting to all frontend code. 2025-08-31 09:50:37 -05:00
SergeantPanda
59c6b0565e Use buffering_speed from settings to determine the color for the speed badge instead of the hardcoded 1.0
Fixes [Bug]: Stat's "Current Speed" does not reflect "Buffering Speed" setting
Fixes #311
2025-08-31 09:28:30 -05:00
SergeantPanda
17cb303273 Refactor: Standardize string quotes and improve code formatting in ChannelCard component 2025-08-31 09:12:17 -05:00
SergeantPanda
10d5d487c3 Fix regex replacement pattern for M3U profile consistency in auto channel sync 2025-08-31 09:03:11 -05:00
SergeantPanda
3aa68c1a36 Enhancement: Allow overriding of default stream profile for auto channel sync. 2025-08-31 08:49:45 -05:00
SergeantPanda
b17bc21159 Clean up orphaned channels when stale stream retention deleted a channel's last stream and the channel was created by auto channel sync for the current M3U. 2025-08-31 08:30:17 -05:00
SergeantPanda
fc2caa0e12 Cleaned up code for fetching. 2025-08-30 20:21:19 -05:00
SergeantPanda
22498de395 Better error detection and logging for the user when m3u downloads fail. 2025-08-30 20:14:58 -05:00
SergeantPanda
58446fdd33
Merge pull request #334 from RocketCaptain/main
indentation in the `clone_dispatcharr_repo` function for consistency …
2025-08-30 18:48:04 -05:00
SergeantPanda
12c2b8c7a7 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-08-30 10:53:47 -05:00
SergeantPanda
0228d8d7f4 Fix auto sync channel settings not saving. 2025-08-30 10:48:30 -05:00
SergeantPanda
144a71a636
Merge pull request #338 from Dispatcharr:m3u-batch-processing
M3u-batch-processing
2025-08-30 09:56:12 -05:00
SergeantPanda
b5e32421bd Cleanup unused code. 2025-08-30 09:33:56 -05:00
SergeantPanda
41bd372b14 Save all custom_properties for xc. 2025-08-29 19:55:02 -05:00
SergeantPanda
c24da847fc Get all live streams at once and filter by category. 2025-08-29 19:18:25 -05:00
SergeantPanda
d2b6096570 Changes to XC processing for faster parsing. 2025-08-29 16:39:57 -05:00
SergeantPanda
e3f988b071 Remove shared task from batch m3u processing. 2025-08-29 15:51:42 -05:00
SergeantPanda
3a81227a73 Move tuner calculation to central location and use it to dynamically calculate tuner counts for XC api user_info 2025-08-28 15:26:51 -05:00
SergeantPanda
62c4e1625b Dynamically calculate expiration date 90 days out instead of hardcoded. 2025-08-28 15:12:13 -05:00
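Computing the expiration dynamically instead of hardcoding it is a one-liner with `timedelta`; XC-style panels typically report expiry as a Unix timestamp. The function name and the injectable `now` parameter are illustrative, not Dispatcharr's actual API:

```python
from datetime import datetime, timedelta, timezone

def expiration_timestamp(days_out=90, now=None):
    # Expiry reported as a Unix timestamp, `days_out` days from now.
    # `now` is injectable to keep the function testable.
    now = now or datetime.now(timezone.utc)
    return int((now + timedelta(days=days_out)).timestamp())
```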
SergeantPanda
5c6d1fe6fd
Merge pull request #336 from Dispatcharr/vod-relationtest
Implement Video on Demand (VOD) functionality
2025-08-28 13:39:24 -05:00
SergeantPanda
2026eab7f2 Default VOD categories to disabled. 2025-08-28 13:17:22 -05:00
SergeantPanda
ad774db59e Fixed episode image failing to load. 2025-08-28 11:45:16 -05:00
SergeantPanda
11f54f3cda Fix VOD categories not loading on new accounts without webui refresh. 2025-08-28 11:20:37 -05:00
SergeantPanda
2b484a94ec Applied formatting to floating video. 2025-08-28 10:44:11 -05:00
SergeantPanda
d2b852c9a2 Fix cyclic error with logos loading. 2025-08-28 10:42:42 -05:00
SergeantPanda
73065ed319 Load logos after all initialization is done. True background logo loading. 2025-08-26 17:54:28 -05:00
SergeantPanda
fe350bda91 Smarter tracking of logos loading throughout different pages. 2025-08-26 17:26:59 -05:00
SergeantPanda
774c80e2a4 Convert a couple more functions to fetch ALL logos. 2025-08-26 16:42:16 -05:00
SergeantPanda
59dd6383eb Fetch all logos when running cleanup. 2025-08-26 15:48:51 -05:00
SergeantPanda
c35c02c64b Updated verbiage for cleanup unused logo function. 2025-08-26 15:37:05 -05:00
SergeantPanda
a0ab0e8688 Don't fetch all logos when uploading a new logo. 2025-08-26 15:33:35 -05:00
SergeantPanda
2645166c8c Don't fetch all logos if editing. 2025-08-26 15:18:44 -05:00
SergeantPanda
fd01d1b6af Fix bulk logo cleanup. 2025-08-26 14:59:05 -05:00
SergeantPanda
0e52117e78 Add channel logo selection and background loading functionality
- Introduced `useChannelLogoSelection` hook for managing channel-assignable logos.
- Updated `ChannelForm` and `ChannelsForm` components to use the new logo selection hook.
- Implemented background loading of channel-assignable logos after user login in the auth store.
- Enhanced logos store to handle separate state for channel logos and added fetching logic.
2025-08-26 14:30:41 -05:00
SergeantPanda
8ad1c6ab1a Better badge labeling depending on logo type. 2025-08-26 13:25:12 -05:00
SergeantPanda
b92d7c2c21 Major rework of how logos are handled. We will no longer load logos on login; they will only be loaded into the store when needed. 2025-08-26 13:19:29 -05:00
SergeantPanda
85a0b584a5 Added formatting to logos table and user agents table. 2025-08-26 08:53:35 -05:00
Michael Bateson
4942393842 indentation in the clone_dispatcharr_repo function for consistency and readability:
- Unindented `EOSU` marker and `else` block
- Aligned code structure within the conditional blocks
2025-08-26 18:26:17 +10:00
SergeantPanda
5eedceedc3 Combine migrations. 2025-08-25 17:27:15 -05:00
SergeantPanda
07a29916de Reset formatting. 2025-08-25 17:11:41 -05:00
SergeantPanda
2d49b72f06 Reset Blame 2025-08-25 17:10:32 -05:00
dekzter
11bc2e57a9 optimized vod parsing, added in vod category filtering, added UI individual tabs for movies vs series VOD category filters 2025-08-25 14:37:20 -04:00
SergeantPanda
3ecd7137ff Added missing bulk fetch function for vod categories. 2025-08-22 17:12:50 -05:00
dekzter
c80752a21d attempting to fix updated reference name 2025-08-22 17:36:57 -04:00
dekzter
efcf4c7920 missed commit 2025-08-22 17:31:22 -04:00
dekzter
a19bd14a84 added vod category filtering 2025-08-22 16:59:00 -04:00
SergeantPanda
e45082f5d6 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into vod-relationtest 2025-08-22 14:53:17 -05:00
SergeantPanda
0a5e7a3231 Enhance logo upload functionality: allow custom logo names and update handling in LogoForm component.
Fixes [Bug]: Logo Manager not allowing a change to name field
Fixes #320
2025-08-22 13:34:17 -05:00
SergeantPanda
6746588c15 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-08-22 13:24:58 -05:00
SergeantPanda
22e9e56a5b Add JSONParser to LogoViewSet parser_classes for improved request handling
Fixes [Bug]: Updating Logo throws error
2025-08-22 13:24:50 -05:00
SergeantPanda
48c4cd8ca9 Merge branch 'logosstore' of https://github.com/Dispatcharr/Dispatcharr into vod-relationtest 2025-08-22 12:09:45 -05:00
SergeantPanda
52d96201fe
Merge pull request #328 from Dispatcharr/logosstore 2025-08-22 12:07:30 -05:00
SergeantPanda
39598b4ff4 Refactor logo handling: Introduce useLogosStore, implement lazy loading for logos, and update components to use new store methods. 2025-08-22 09:56:20 -05:00
SergeantPanda
3bea39a117 Fix VOD video player to allow clicking seek bar (overlay was interfering) 2025-08-21 16:48:28 -05:00
SergeantPanda
f1a80c3389 Check for null values before saving to database and scrub empty string. 2025-08-21 16:31:41 -05:00
SergeantPanda
3590265836 Show all available options for VODs and their corresponding quality. 2025-08-21 15:04:47 -05:00
SergeantPanda
a4df1f1fb8 Allow specifying which account to play from. 2025-08-21 13:14:16 -05:00
SergeantPanda
a95ed79e93 Handle issue with relations. 2025-08-21 12:24:06 -05:00
SergeantPanda
90eccd6edf When getting detailed VOD and series info, an added IMDB or TMDB ID could cause duplicate key violations; we now merge the two VODs, deleting the one not being accessed at the time, so the user never gets invalid links. 2025-08-21 11:41:58 -05:00
SergeantPanda
a03744e24e Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into vod-relationtest 2025-08-21 09:35:40 -05:00
SergeantPanda
56af468320 Merge migrations 2025-08-21 09:35:25 -05:00
SergeantPanda
a95d2a77d8
Merge pull request #322 from Biologisten:dev
Added date and time formats to UI Settings
2025-08-21 09:31:39 -05:00
Biologisten
81a293e7dd Added date and time formats to settings 2025-08-21 02:31:08 +02:00
SergeantPanda
24f876d09f Add priority for providers so VOD's can be auto selected based on the priority. 2025-08-20 17:38:21 -05:00
SergeantPanda
fa2b3fbe3e Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into vod-relationtest 2025-08-19 18:24:27 -05:00
SergeantPanda
e12ce4e596 Swagger endpoint can now be accessed with or without a trailing slash. 2025-08-19 18:16:21 -05:00
SergeantPanda
3a91994549 Rollback accidental change to uwsgi debug config. 2025-08-19 18:07:46 -05:00
SergeantPanda
c87bd79051 Cleanup the rest of the redis keys on connection close. 2025-08-19 17:52:59 -05:00
SergeantPanda
083eb264e6 Properly track m3u profile connections. 2025-08-19 17:45:09 -05:00
SergeantPanda
97b82e5520 Use redis to track provider connections to work with multi-worker uwsgi. 2025-08-19 17:35:51 -05:00
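Tracking provider connections in Redis works across uwsgi workers because `INCR`/`DECR` are atomic on the server. The counting discipline can be sketched with a plain dict standing in for Redis — the key names and class are illustrative, not Dispatcharr's actual keys:

```python
class ConnectionTracker:
    """Sketch of per-profile connection counting.

    The real implementation uses Redis INCR/DECR so the counts are
    shared by all uwsgi workers; a plain dict stands in here.
    """

    def __init__(self):
        self.store = {}

    def connect(self, profile_id):
        key = f"profile_connections:{profile_id}"
        self.store[key] = self.store.get(key, 0) + 1
        return self.store[key]

    def disconnect(self, profile_id):
        key = f"profile_connections:{profile_id}"
        count = max(self.store.get(key, 0) - 1, 0)
        if count == 0:
            # Delete the key when the last client leaves,
            # mirroring the Redis-key cleanup in these commits.
            self.store.pop(key, None)
        else:
            self.store[key] = count
        return count
```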
SergeantPanda
d04fdb6f69 Fix categories not linking. 2025-08-19 15:42:30 -05:00
SergeantPanda
9137bd47c9 Switch to pulling ALL vods, series and categories instead of an api call for each category. 2025-08-19 14:53:43 -05:00
SergeantPanda
2903773c86 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into vod-relationtest 2025-08-19 12:39:21 -05:00
GitHub Actions
750bf90576 Release v0.8.0 2025-08-19 17:05:08 +00:00
SergeantPanda
0843d32388
Merge pull request #313 from Dispatcharr/dev
Dev
2025-08-19 12:04:23 -05:00
SergeantPanda
fa19525ab9 Remove url from movie table and save custom_properties for series. 2025-08-19 11:25:56 -05:00
SergeantPanda
54404339c2 Fix batch processing using simpler method for duplicate detection. 2025-08-19 10:01:37 -05:00
SergeantPanda
0f8fe82f17 Skip logos if they are long. 2025-08-18 14:22:40 -05:00
SergeantPanda
1bf947caf2 Fix not detecting duplicates based on imdb and tmdb ids. 2025-08-14 13:40:31 -05:00
SergeantPanda
d92ba0c7e1 Fix VOD page not using pagination properly. 2025-08-14 12:51:56 -05:00
SergeantPanda
0cf2c74b40 Change from individual database calls to batch. 2025-08-14 12:51:39 -05:00
SergeantPanda
1ed4e97167 Let celery[redis] choose redis version. Update sentence-transformers. 2025-08-14 10:42:46 -05:00
SergeantPanda
eec53d5874
Merge pull request #294 from pantherale0/feat/chunked-extractor
Fix: Prevent memory exhaustion when extracting large compressed files
2025-08-14 09:50:52 -05:00
SergeantPanda
6a0871183c
Merge pull request #277 from deku-m/patch-1
Update debian script
2025-08-14 09:50:11 -05:00
SergeantPanda
e481ea4457 Bump celery to 5.5.3 from 5.3.1 2025-08-14 09:42:01 -05:00
SergeantPanda
dac1490acc Attempt to match incoming connections to sessions that are running with no clients. This will help with clients that disconnect with every timeshift and connect to the original URL. 2025-08-12 21:34:30 -05:00
SergeantPanda
3e16614eab Store full content-length if using a HEAD request to initialize. 2025-08-12 17:32:07 -05:00
SergeantPanda
5eeb51585d Support HEAD requests, keep connection alive. 2025-08-12 15:27:39 -05:00
SergeantPanda
be51be7872 Switched to connection tracking by url instead of parameter and 301 redirect instead of 302. 2025-08-12 14:48:35 -05:00
SergeantPanda
a5db9d98e9 Track number of connections to avoid shutting down connection during time shifting. Lower grace period to 10 seconds if client disconnects. 2025-08-12 13:26:32 -05:00
SergeantPanda
19017317f6 Fix invalid import. 2025-08-12 11:48:59 -05:00
SergeantPanda
2632e71815 Increased grace period to 30 seconds before closing provider connection 2025-08-12 11:11:03 -05:00
SergeantPanda
6addcebaf5 Fix duplicate connection counting in redis. 2025-08-12 10:59:40 -05:00
SergeantPanda
4acdfa99f9 Properly cleanup redis keys on client disconnect. 2025-08-12 10:51:06 -05:00
SergeantPanda
310f3c455e Properly close connection if client disconnects and is not seeking a new position. 2025-08-12 10:42:59 -05:00
SergeantPanda
b7fb9336be Reuse connections when seeking. 2025-08-12 09:56:30 -05:00
SergeantPanda
07966424f8 Fix seeking not working. 2025-08-12 08:29:26 -05:00
Jordan
72fee02ec4
ensure chunk is either null or empty to exit loop 2025-08-12 11:26:50 +01:00
SergeantPanda
7e94ee5d49 Fix duration display after converting from minutes to seconds in the table. 2025-08-11 16:45:44 -05:00
SergeantPanda
920201312e Fix episodes api call for series endpoint. 2025-08-11 16:07:41 -05:00
Jordan
429b01b569
prevent memory issues by implementing a chunked extractor 2025-08-11 16:28:23 +01:00
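PR #294's chunked extractor replaces whole-file reads with a bounded loop, so memory stays flat regardless of the decompressed size; the follow-up commit makes the empty-chunk check the loop's only exit. A minimal gzip-only sketch (the real code presumably handles other archive formats too):

```python
import gzip
import io

CHUNK_SIZE = 64 * 1024  # read 64 KiB at a time instead of the whole file

def extract_chunked(compressed_bytes, sink):
    """Decompress gzip data in fixed-size chunks into `sink`."""
    with gzip.GzipFile(fileobj=io.BytesIO(compressed_bytes)) as src:
        while True:
            chunk = src.read(CHUNK_SIZE)
            if not chunk:  # None or b"" -> done; the only loop exit
                break
            sink.write(chunk)
```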
SergeantPanda
d4a93b8e4b Populate seasons array. 2025-08-10 21:56:39 -05:00
SergeantPanda
46a5b3355a Lots of fixes for series xc api responses. 2025-08-10 21:30:05 -05:00
SergeantPanda
3942517032 Save custom properties that aren't stream specific for episodes. 2025-08-09 16:55:26 -05:00
SergeantPanda
4d7987214b Fixed some vod apis, converted duration to duration_secs, add custom_properties to vod_episode 2025-08-09 16:31:33 -05:00
SergeantPanda
3ac84e530a Switch to smarter extraction of the port; if that fails, take a guess. 2025-08-09 13:55:49 -05:00
SergeantPanda
21b7f80d42 Add forward port to headers. 2025-08-09 10:20:58 -05:00
SergeantPanda
b90956c49f Couple fixes for vod api. 2025-08-08 13:18:20 -05:00
SergeantPanda
19e59ae178 Switch to movie and episode id instead of relations. 2025-08-08 12:48:02 -05:00
SergeantPanda
4a80cb6d5c Fix get_vod_info 2025-08-08 11:34:27 -05:00
SergeantPanda
d054e2cac5 Add XC endpoints for VOD. 2025-08-08 09:03:25 -05:00
SergeantPanda
00361c15b9 Merged all migrations. 2025-08-08 08:41:43 -05:00
dekzter
44fecff2f2 return new object from store call 2025-08-08 09:40:02 -04:00
SergeantPanda
345247df11 Fix vod streaming. 2025-08-08 08:35:59 -05:00
SergeantPanda
22bc573c10 Always display play button for movies. 2025-08-08 08:15:02 -05:00
SergeantPanda
a332678cfb Remove URL from episode relation table. 2025-08-08 08:06:28 -05:00
SergeantPanda
84dfe69f0a Add series badge to series for easy distinction between the two. 2025-08-08 07:50:37 -05:00
SergeantPanda
5feb843d73 Add imdb and tmdb links to episodes if available. 2025-08-08 07:40:36 -05:00
SergeantPanda
b229c9628d Fix movies not loading on first modal open. 2025-08-07 20:59:45 -05:00
SergeantPanda
430a486438 Add imdb_id to movie api output 2025-08-07 20:52:24 -05:00
SergeantPanda
325d832c1b Fixes movie details after db changes. 2025-08-07 20:42:20 -05:00
SergeantPanda
3741d28565 Fix typo in air date parsing. 2025-08-07 19:41:10 -05:00
SergeantPanda
8d37c678d3 Fix air_date not populating for series. 2025-08-07 19:28:42 -05:00
SergeantPanda
56922e1c01 Fix custom properties. 2025-08-07 16:10:45 -05:00
SergeantPanda
656cd3007b Fix release date and description 2025-08-07 15:37:08 -05:00
SergeantPanda
5b931f6272 Add missing functions for processing episodes. 2025-08-07 14:57:28 -05:00
SergeantPanda
a6bace9241 Add custom properties for movies and series. Refactor update logic. 2025-08-07 14:22:27 -05:00
SergeantPanda
0e388968c4 Convert to relation tables to support multiple providers for each vod. 2025-08-07 12:31:05 -05:00
dekzter
5a92487d38 fixed group filtering 2025-08-06 14:45:55 -04:00
SergeantPanda
44a2cf518c Track active connections the same way as ts_proxy 2025-08-05 21:35:04 -05:00
SergeantPanda
d18817acb0 Track VOD connections in Redis. 2025-08-05 21:24:41 -05:00
SergeantPanda
5e7987ce1a Add IMDB and TMDB links to modal and fix cast not displaying. 2025-08-05 21:04:49 -05:00
SergeantPanda
8275e33223 Move episode count to series modal. 2025-08-05 20:35:11 -05:00
SergeantPanda
5e5b5e9797 Fix logos resetting when opening series. 2025-08-05 20:23:05 -05:00
SergeantPanda
0565e0ae50 Fix backdrop and release date for series. 2025-08-05 20:01:29 -05:00
SergeantPanda
22b7a3efb2 Fix youtube trailers for series. 2025-08-05 16:43:51 -05:00
SergeantPanda
4b6792ccbe Add ability to expand an episode row to view additional information. 2025-08-05 16:19:50 -05:00
SergeantPanda
cc8064877d Fix episode number not displaying. 2025-08-05 15:55:29 -05:00
SergeantPanda
0120419d09 Get full release date. 2025-08-05 15:50:28 -05:00
SergeantPanda
a66028ff02 Fixes tabs not changing view. 2025-08-05 14:03:05 -05:00
SergeantPanda
4754a0dcd7 Show episodes nested under a season tab. 2025-08-05 14:00:05 -05:00
SergeantPanda
b11270ce4a Fix not populating episodes. 2025-08-05 13:53:43 -05:00
SergeantPanda
3c2534d8aa Combine a couple api calls. 2025-08-05 13:28:58 -05:00
SergeantPanda
8584aae675 Better naming for different functions. 2025-08-05 12:53:22 -05:00
SergeantPanda
151f654dd9 Only query series info if data is stale 2025-08-05 11:18:06 -05:00
SergeantPanda
9a927b1fc7 Update migration to track last series refresh. 2025-08-05 10:55:44 -05:00
SergeantPanda
4b27f95b30 Keep track of last time a series was refreshed. 2025-08-05 10:45:02 -05:00
SergeantPanda
ce331abdbc Don't merge provider details with vod store. 2025-08-05 10:23:49 -05:00
dekzter
b7abdac800 merged in dev 2025-08-05 10:56:01 -04:00
dekzter
2f1da5d76a fixed modal closing issues 2025-08-05 10:55:15 -04:00
SergeantPanda
fa29c679f5 Fix eslint error. 2025-08-04 18:33:42 -05:00
SergeantPanda
36450af23f Fix youtube links not loading. 2025-08-04 18:30:33 -05:00
SergeantPanda
b19efd2f75 Use backdrop image as background for modal. 2025-08-04 18:24:45 -05:00
SergeantPanda
d917a3a915 Rearranged data. 2025-08-04 18:21:18 -05:00
SergeantPanda
10ab3e4bd8 Fix movie link not building correctly for web player. 2025-08-04 17:28:20 -05:00
SergeantPanda
4accd2be85 Pull advanced info from provider when opening a movie. 2025-08-03 22:02:34 -05:00
SergeantPanda
e1f5cb24ec Smart sizing based on screen sizing. 2025-08-03 20:03:45 -05:00
SergeantPanda
9f8054c9de Fix video player in dev mode. 2025-08-03 19:59:10 -05:00
SergeantPanda
3baaf8c170 More flexible playback for vods. 2025-08-03 19:53:22 -05:00
SergeantPanda
88a9f8b14e Fix vods not displaying in the webui. 2025-08-03 10:40:17 -05:00
dekzter
f300da6eff case sensitive flag and other possible custom properties for filters 2025-08-03 08:40:00 -04:00
SergeantPanda
d9192d003d Remove json dumps from custom_properties as now the fields are jsonb in postgres. Also fix category matching. 2025-08-02 19:03:39 -05:00
SergeantPanda
f9a9d5d336 Convert postgres to UTF8 from ASCII 2025-08-02 19:02:57 -05:00
deku-m
5010cae068
Merge branch 'Dispatcharr:main' into patch-1 2025-08-02 21:53:07 +02:00
SergeantPanda
0585fc1dfe Separate Movies, Series and Episodes into their own tables 2025-08-02 13:32:44 -05:00
SergeantPanda
0b5906538e More robust parsing of year. 2025-08-02 12:59:02 -05:00
SergeantPanda
0ece58572e Extract year from other data fields if available, otherwise attempt name. 2025-08-02 12:33:41 -05:00
SergeantPanda
bcebcadfaa Add ability to scan for vods during m3u refresh. 2025-08-02 12:01:26 -05:00
SergeantPanda
386a03381c Initial frontend commit for vods. 2025-08-02 10:48:48 -05:00
SergeantPanda
84aa631196 Initial backend commit for vod 2025-08-02 10:42:36 -05:00
SergeantPanda
1c47b7f84a Adds ability to reverse the sort order for auto channel sync. 2025-08-01 16:42:42 -05:00
dekzter
f1752cc720 fixed editing 2025-08-01 17:12:06 -04:00
dekzter
1612df14c1 fixed deleting filter not removing it from the ui 2025-08-01 15:12:19 -04:00
dekzter
ead76fe661 first run at m3u filtering 2025-08-01 15:02:43 -04:00
SergeantPanda
7b5a617bf8 Use a custom validator for URL fields to allow non-FQDN hostnames.
Fixes #63
2025-08-01 11:28:51 -05:00
SergeantPanda
953db79476 Display stream logo and name in channel card when previewing streams. 2025-07-31 21:26:59 -05:00
SergeantPanda
6945cecaca Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-07-31 20:15:42 -05:00
SergeantPanda
42c792cbb7
Merge pull request #278 from Dispatcharr/Updated-dependencies-
Update dependencies in requirements.txt
2025-07-31 20:13:58 -05:00
SergeantPanda
20651a8d59 Update dependencies in requirements.txt for compatibility and improvements 2025-07-31 20:10:36 -05:00
SergeantPanda
59b75c18fc Add ability to preview streams under a channel. 2025-07-31 15:54:24 -05:00
deku-m
a9aac72a60
Update debian_install.sh 2025-07-31 22:14:35 +02:00
SergeantPanda
406ac37fb9 Delete temp folder if it exists during upgrade. 2025-07-31 15:01:28 -05:00
SergeantPanda
108a992643 Detect mismatched Postgres version and automatically run pg_upgrade 2025-07-31 14:53:55 -05:00
SergeantPanda
826c824084
Bump Postgres to version 17 2025-07-31 14:03:37 -05:00
SergeantPanda
5a887cc55a Bump Postgres to version 17. 2025-07-31 13:54:20 -05:00
SergeantPanda
e029cd8b3d Fix XML escaping for channel ID in generate_dummy_epg function 2025-07-31 10:22:43 -05:00
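Escaping the channel ID matters because channel names and IDs can contain `&`, `<`, or quotes, which would produce invalid XMLTV. The stdlib covers both attribute and text contexts; the element layout below is a guess at a dummy-EPG entry, not Dispatcharr's exact `generate_dummy_epg` output:

```python
from xml.sax.saxutils import escape, quoteattr

def dummy_epg_channel(channel_id, display_name):
    # quoteattr escapes AND wraps the attribute value in quotes;
    # escape handles text nodes.
    return (
        f"<channel id={quoteattr(channel_id)}>"
        f"<display-name>{escape(display_name)}</display-name>"
        f"</channel>"
    )
```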
SergeantPanda
4ae66e0bc9 Add membership creation in UpdateChannelMembershipAPIView if not found.
Fixes #275
2025-07-31 09:52:02 -05:00
SergeantPanda
06d8d01fd9 Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-07-29 16:22:48 -05:00
SergeantPanda
64ec6de414
Merge pull request #274 from Dispatcharr/stream-stats
Stream stats
2025-07-29 16:21:37 -05:00
GitHub Actions
26e237f2d1 Release v0.7.1 2025-07-29 21:06:06 +00:00
SergeantPanda
38813e6c27
Merge pull request #273 from Dispatcharr/dev
Release Notes – Version 0.7.1
2025-07-29 16:05:23 -05:00
SergeantPanda
613c0d8bb5 Add input_bitrate to technical for future use. 2025-07-29 15:43:44 -05:00
SergeantPanda
e26ecad013 Move m3u and url badges to same line as stream name. 2025-07-29 15:17:18 -05:00
SergeantPanda
7551869a2e Remove audio bitrate from basic stats. 2025-07-29 15:12:14 -05:00
SergeantPanda
44f8d45768 Add the ability to see advanced stats for channel streams. 2025-07-29 15:02:35 -05:00
SergeantPanda
cb49172e98 Fix bug where creating a channel from a stream that isn't displayed in the table has invalid stream name. 2025-07-29 15:02:13 -05:00
SergeantPanda
72ecb88c76 Fix bug where creating a channel from a stream that isn't displayed in the table has invalid stream name. 2025-07-29 14:59:03 -05:00
SergeantPanda
5b076fd00b Allow for clicking url badge and copy to clipboard. 2025-07-29 11:35:22 -05:00
SergeantPanda
b222dbe5a3 Display stream qualities in channel table. 2025-07-29 11:21:44 -05:00
SergeantPanda
6180b4ffef Save stream stats to database. 2025-07-28 21:40:29 -05:00
SergeantPanda
db6e09370c Fix bug when sorting by name. 2025-07-28 18:09:43 -05:00
SergeantPanda
7c442064e6 Implement natural sorting for channel names in auto channel sync 2025-07-28 18:01:50 -05:00
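Natural sorting, as referenced above, orders "Channel 2" before "Channel 10", which plain lexicographic sorting does not. A sketch of the usual key-function approach (names here are illustrative, not Dispatcharr's actual code):

```python
import re

def natural_key(name: str):
    # Split into digit and non-digit runs so numeric parts compare as ints:
    # "Channel 10" then sorts after "Channel 2" instead of before it.
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", name)]

channels = ["Channel 10", "Channel 2", "channel 1"]
print(sorted(channels, key=natural_key))
```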
SergeantPanda
336a0d2558 Add ability to sort auto channel sync order by either provider order (default), name, TVG ID or updated at. 2025-07-28 17:43:19 -05:00
SergeantPanda
491b0cea29 Fetch channel profiles on successful m3u refresh. 2025-07-27 19:27:50 -05:00
SergeantPanda
7313cafa1c Adds the ability to have auto channel sync create channels in specified channel profiles.
Closes #255
2025-07-27 19:14:39 -05:00
SergeantPanda
39ec4d39fc Allow for multiple profiles to be added with the API. This will later be used for auto channel sync. 2025-07-27 17:48:09 -05:00
SergeantPanda
8f529bf495 Only add to selected profile if profile is selected during channel creation. 2025-07-27 17:33:23 -05:00
SergeantPanda
9c32cc68d9 Properly track uploaded logos in Redis to avoid redundant messages from the file scanner. 2025-07-27 14:55:02 -05:00
SergeantPanda
a7b9d278c2 In the logo manager, don't save when file is selected, wait for create button to be clicked. 2025-07-27 14:44:20 -05:00
SergeantPanda
1475ca70ab Fixes being unable to add a new logo via URL. 2025-07-27 14:18:48 -05:00
SergeantPanda
7eef45f1c0 Only use whole numbers when looking for a number not in use. 2025-07-20 20:13:59 -05:00
SergeantPanda
5148a5a79b Use channel name for display name in EPG output. 2025-07-20 20:01:51 -05:00
GitHub Actions
c54b847738 Release v0.7.0 2025-07-19 20:43:30 +00:00
SergeantPanda
da90ae5436
Merge pull request #253 from Dispatcharr/dev
Dispatcharr v0.7.0
2025-07-19 15:42:34 -05:00
SergeantPanda
9fcce85ea7 Reduced spacing between fields. 2025-07-19 15:31:45 -05:00
SergeantPanda
f6fa90178d Adds the ability to auto channel sync to only add channels where the stream name matches a regex. 2025-07-19 15:23:10 -05:00
SergeantPanda
935f4c1da2 Filter out stale streams from auto channel creation. 2025-07-19 14:30:57 -05:00
SergeantPanda
924d44c573 Add advanced options dropdown for auto channel sync. 2025-07-19 14:09:37 -05:00
SergeantPanda
003ad4c54f Add channel name regex match/replace for auto sync. 2025-07-19 13:13:05 -05:00
SergeantPanda
0b2e050e63 Page sizes for the stream table are now stored in local storage. 2025-07-18 21:54:39 -05:00
SergeantPanda
2df377b7f5 Fixes channel table not pulling from local storage for page size. 2025-07-18 21:32:41 -05:00
SergeantPanda
fc9b179e9a Updated styling for group manager to better match other forms. 2025-07-18 19:38:27 -05:00
SergeantPanda
0996d396cf
Merge pull request #251 from Dispatcharr/logo-manager
Logo manager
2025-07-18 15:43:32 -05:00
SergeantPanda
91c8999021 Rename to logo manager. 2025-07-18 15:41:56 -05:00
SergeantPanda
26881f41d6 Adds ability to delete local files during cleanup. 2025-07-18 15:36:13 -05:00
SergeantPanda
bc08cb1270 Ask to delete local files as well. 2025-07-18 15:23:30 -05:00
SergeantPanda
d926d90dd9 Fix websocket message. 2025-07-18 15:14:11 -05:00
SergeantPanda
e876af1aa2 Scan sub folders for logos. 2025-07-18 15:04:34 -05:00
SergeantPanda
479826709b Fetch logos when logos are added by filesystem scan. 2025-07-18 15:01:26 -05:00
SergeantPanda
13672919d0 Fetch Playlists on successful m3u update. 2025-07-18 14:26:09 -05:00
SergeantPanda
1ece74a0b0 Scan logos folder for new logos. 2025-07-18 14:07:58 -05:00
SergeantPanda
e27f45809b Allow /data/logos as a url. 2025-07-18 13:47:50 -05:00
SergeantPanda
0fcb8b9f2e Don't convert urls in the store. 2025-07-18 13:44:00 -05:00
SergeantPanda
e7771d5b67 Allow deleting logos that are assigned to channels. 2025-07-18 11:36:15 -05:00
SergeantPanda
23bd5484ee Enlarge logo on hover. 2025-07-17 21:12:05 -05:00
SergeantPanda
5d82fd17c2 Treat local files as valid urls 2025-07-17 21:09:05 -05:00
SergeantPanda
8e2309ac58 Fixes logo uploads 2025-07-17 21:02:50 -05:00
SergeantPanda
bd1831e226 Fix edits not saving 2025-07-17 20:49:05 -05:00
SergeantPanda
05539794e3 Set better sizing. 2025-07-17 20:45:04 -05:00
SergeantPanda
cebc4c8ca9 Add pagination. 2025-07-17 20:32:24 -05:00
SergeantPanda
122b902f0f Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into logo-manager 2025-07-17 19:56:11 -05:00
SergeantPanda
f40e9fb9be Update playlist store when auto sync settings change. 2025-07-17 19:24:27 -05:00
SergeantPanda
b406a3b504 Move force dummy epg to top. 2025-07-17 19:02:03 -05:00
SergeantPanda
8fa27904e7 Add ability to override group name during auto channel creation. 2025-07-17 18:42:42 -05:00
SergeantPanda
e9055a5ad6 Activate setting 2025-07-17 17:36:32 -05:00
SergeantPanda
a5f7a88ba0 Add option to force dummy epg. 2025-07-17 16:54:37 -05:00
SergeantPanda
1413c7a3d2 Show unused count when deleting. 2025-07-15 22:03:14 -05:00
SergeantPanda
3949a2ed5c Fixes cleanup unused logos. 2025-07-15 22:00:18 -05:00
SergeantPanda
fa470bee35 Add bulk delete abilities and cleanup unused action. 2025-07-15 21:39:17 -05:00
SergeantPanda
6a7abbaa82 Fix text filtering. 2025-07-15 20:36:32 -05:00
SergeantPanda
cd30f6da66 Enhance Logo management with filtering and usage details in API and UI 2025-07-15 20:26:02 -05:00
SergeantPanda
5f08d0fbbf Add padding to logos. 2025-07-15 20:14:34 -05:00
SergeantPanda
500df533bb Center logos in the column. 2025-07-15 20:12:25 -05:00
SergeantPanda
2bba31940d Use our custom table for displaying logos 2025-07-15 20:02:21 -05:00
SergeantPanda
6afd5a38c9 Add timeouts to logo fetching to avoid hanging the UI if a logo is unreachable. Also add a default user-agent to requests to prevent servers from denying them.
Fixes #217 and Fixes #101
2025-07-15 18:45:21 -05:00
SergeantPanda
cea078f6ef Use default user-agent and adjust timeouts. 2025-07-15 18:37:22 -05:00
SergeantPanda
489851906e Logo manager initial commit. 2025-07-15 18:19:10 -05:00
SergeantPanda
7292ad460f If the previous channel number had a decimal, don't continue with decimals. 2025-07-14 16:58:27 -05:00
SergeantPanda
e7eab09dfc Only update channels that changed. Don't delete and re-add all; otherwise UUIDs/links will break. 2025-07-14 16:42:26 -05:00
SergeantPanda
1a475a29d0
Merge pull request #246 from Dispatcharr/auto-sync-groups
Auto sync groups
2025-07-13 19:19:35 -05:00
SergeantPanda
3f7e92ae44 Requery channels on successful m3u refresh. 2025-07-13 19:12:45 -05:00
SergeantPanda
df23d3660b Fetch channels after m3u refresh. 2025-07-13 18:51:04 -05:00
SergeantPanda
18d4c39291 Combined migrations. 2025-07-13 18:09:30 -05:00
SergeantPanda
12c086dcd1 Delete all auto-synced channels and create new ones. 2025-07-13 18:02:26 -05:00
SergeantPanda
b91a2286e4 Add auto-created tracking to Channel model and related serializers
- Introduced `auto_created` and `auto_created_by` fields in the Channel model to track channels created via M3U auto channel sync.
- Updated ChannelSerializer to include these new fields and added a method to retrieve the name of the M3U account that created the channel.
- Modified sync_auto_channels task to set these fields when creating channels and to filter existing channels accordingly.
2025-07-13 16:32:52 -05:00
SergeantPanda
ea81cfb1af Add auto channel sync settings to ChannelGroupM3UAccount and update related components
- Introduced `auto_channel_sync` and `auto_sync_channel_start` fields in the ChannelGroupM3UAccount model.
- Added API endpoint to update M3U group settings.
- Updated M3UGroupFilter component to manage auto sync settings.
- Enhanced M3URefreshNotification and M3U components for better user guidance.
- Created a Celery task for automatic channel synchronization after M3U refresh.
2025-07-13 15:59:25 -05:00
SergeantPanda
2cf9ade105
Merge pull request #244 from Dispatcharr/group-management
Group management
2025-07-12 19:20:33 -05:00
SergeantPanda
69f8f426a6 Refactor menu items in ChannelTableHeader to fix HTML error. 2025-07-12 19:10:59 -05:00
SergeantPanda
c4e5710b48 When adding a group, fetch groups after. 2025-07-12 19:05:06 -05:00
SergeantPanda
8b361ee646 Fix eslint issues. 2025-07-12 18:12:25 -05:00
SergeantPanda
35d95c47c7 Fixed z index issue when stream table was refreshing. 2025-07-12 17:48:56 -05:00
SergeantPanda
2da8273de6 Add confirmation for deleting and cleaning up groups. 2025-07-12 17:41:35 -05:00
SergeantPanda
f10a6cb403 Add filtering based on group membership. 2025-07-12 17:37:24 -05:00
SergeantPanda
171d64841a Changed some colors to match our theme better. 2025-07-12 17:28:04 -05:00
SergeantPanda
9b7aa0c894 Add ability to cleanup all unused groups. 2025-07-12 17:05:48 -05:00
SergeantPanda
adc6604fa2 Disable buttons that can't be used. 2025-07-12 16:57:05 -05:00
SergeantPanda
9cb05a0ae1 Add search functionality to GroupManager for filtering groups 2025-07-12 16:27:49 -05:00
SergeantPanda
a1d9a7cbbe Fixed performance issue while creating group. 2025-07-12 16:21:40 -05:00
SergeantPanda
fcce1a36b2 Implement group management features: add GroupManager component. 2025-07-12 16:08:04 -05:00
SergeantPanda
3006209ecc Add rehash dialog type state and enhance confirmation handling for settings save and rehash 2025-07-11 17:45:19 -05:00
SergeantPanda
846418b63b Refactor settings save logic to remove unnecessary rehash execution 2025-07-11 17:14:44 -05:00
SergeantPanda
f476164ec7 Add confirmation dialog for rehashing streams and handle warning suppression 2025-07-11 16:55:55 -05:00
SergeantPanda
3a60526fbd Add rehashStreams method to API and update SettingsPage to trigger stream rehash 2025-07-11 16:38:20 -05:00
SergeantPanda
34a3f75c1c Remove garbage collection from websocket update. 2025-07-11 16:23:34 -05:00
SergeantPanda
db10e90801 Enhance rehash_streams task to send WebSocket notifications for blocked rehash attempts. 2025-07-11 15:51:19 -05:00
SergeantPanda
073fe72a49 Acquire locks when rehash starts. 2025-07-11 15:13:29 -05:00
SergeantPanda
8ec489d26f Send websocket updates during rehash. 2025-07-11 14:47:12 -05:00
SergeantPanda
1c7fa21b86 Add rehash streams endpoint and UI integration for triggering stream rehashing 2025-07-11 14:11:41 -05:00
SergeantPanda
fafd93e958 Refactor XC Client usage to improve error handling and resource management via context managers. Implement connection pooling for better performance. 2025-07-10 19:14:43 -05:00
SergeantPanda
65da85991c Enhance error handling in API requests by checking for common blocking responses and improving JSON decode error logging. 2025-07-10 18:07:25 -05:00
SergeantPanda
b392788d5f Improve error handling for API responses by checking for empty content and handling JSON decode errors. 2025-07-10 16:22:16 -05:00
SergeantPanda
d24520d3d8 Enhance EPG XML generation with additional metadata extraction and improved handling for keywords, languages, ratings, and credits. 2025-07-10 13:22:42 -05:00
GitHub Actions
8b6acf2375 Release v0.6.2 2025-07-10 15:41:58 +00:00
SergeantPanda
b9637f166b
Merge pull request #240 from Dispatcharr/dev
Dispatcharr Release Notes - v0.6.2
2025-07-10 10:41:17 -05:00
SergeantPanda
9f8a2db500 Include channel ID in more logs. 2025-07-09 16:44:00 -05:00
SergeantPanda
2284d47f9f If provider is slow but responsive, don't get locked up. 2025-07-08 17:10:55 -05:00
SergeantPanda
d6605e7119 Add timeout for chunks. 2025-07-08 15:57:11 -05:00
SergeantPanda
374aa82e22 Refactor editChannel to use selectedTableIds directly from the table state and remove unused selection clearing effects. 2025-07-07 17:08:53 -05:00
SergeantPanda
01d4b25303 Health monitor thread will no longer attempt to reconnect; it will only notify the main thread of issues. 2025-07-03 14:10:03 -05:00
SergeantPanda
55e19f05aa Check if stopping before adding chunks during transcoding. 2025-07-03 11:18:03 -05:00
SergeantPanda
580aa1975c Add process management for safe connection handling in StreamManager
- Introduced _wait_for_existing_processes_to_close method to ensure all existing processes and connections are fully closed before establishing new ones.
- Updated _establish_transcode_connection and _establish_http_connection methods to check for and close lingering processes and connections.
- Enhanced logging for better debugging and monitoring of connection states.
2025-07-03 11:02:07 -05:00
SergeantPanda
8e2c6c7780 Check for any state to determine if channel is running. 2025-07-01 10:57:07 -05:00
SergeantPanda
2b97a958cd Check if a transcode process is running also to determine if we should close sockets. 2025-07-01 10:27:24 -05:00
SergeantPanda
5ff474d322 Fixes being unable to close web player on touch screens. 2025-07-01 09:47:31 -05:00
SergeantPanda
e9d60cdb1e Allow setting blank XC password. 2025-06-28 09:42:00 -05:00
SergeantPanda
6a57d4a7c7
Merge pull request #231 from Dispatcharr/users-table
Users table
2025-06-28 09:30:12 -05:00
SergeantPanda
a45c800718 Removed unused imports and variables 2025-06-28 09:25:28 -05:00
SergeantPanda
f6825418da Show first name if available. Remove placeholder link that was dead. 2025-06-28 09:10:19 -05:00
SergeantPanda
d8ffec474c Set minimum size 2025-06-28 09:04:28 -05:00
SergeantPanda
77d8ab8d55 Table formatting 2025-06-28 09:01:57 -05:00
SergeantPanda
ac07a5217f Set last_login when successful login occurs. 2025-06-28 08:55:07 -05:00
SergeantPanda
7dcb853c6c Add first and last name to user form. 2025-06-28 08:49:09 -05:00
SergeantPanda
1a8bbb6bb8 Reorder columns 2025-06-27 21:59:56 -05:00
SergeantPanda
615956d502 Standardized headers. 2025-06-27 21:45:27 -05:00
SergeantPanda
1e91dd7597 Added all available fields. 2025-06-27 21:43:50 -05:00
SergeantPanda
d50a6ebce5 Converted users to our custom table 2025-06-27 21:03:04 -05:00
SergeantPanda
c3d1600c07 Additional logging. 2025-06-27 20:22:49 -05:00
GitHub Actions
955176f45a Release v0.6.1 2025-06-27 21:18:17 +00:00
SergeantPanda
8dc6b12e8b
Merge pull request #228 from Dispatcharr/dev
Dispatcharr Release Notes - v0.6.1
2025-06-27 16:17:29 -05:00
SergeantPanda
f2a238915a Use streaming response during EPG generation to avoid clients timing out.
Closes #179
2025-06-27 14:54:47 -05:00
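Streaming the EPG, as described above, means emitting the XMLTV document in chunks rather than building the whole file before the first byte is sent. A framework-agnostic sketch of the generator idea (channel data here is illustrative; in Django such a generator would typically be wrapped in `StreamingHttpResponse` rather than `HttpResponse`):

```python
def generate_epg(channel_ids):
    # Yield the XMLTV document piece by piece so the HTTP layer can flush
    # bytes to the client immediately, keeping slow clients from timing out.
    yield '<?xml version="1.0" encoding="UTF-8"?>\n<tv>\n'
    for cid in channel_ids:
        yield f'  <channel id="{cid}"><display-name>{cid}</display-name></channel>\n'
    yield '</tv>\n'

body = "".join(generate_epg(["ch1", "ch2"]))
```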
SergeantPanda
47d270aa9f Better sizing for link forms. 2025-06-27 11:17:31 -05:00
SergeantPanda
fe2df9b530 Render link forms overtop of all elements. 2025-06-27 10:56:24 -05:00
SergeantPanda
7b23b0a4df Slide text right when program starts before current view.
Closes #223
2025-06-27 10:24:08 -05:00
SergeantPanda
66b95f2ef8 Fixes channels not actually filtering based on selected group. 2025-06-27 10:06:54 -05:00
SergeantPanda
6d13aa5314 Fix console error. 2025-06-27 10:04:33 -05:00
SergeantPanda
8c47f7b0e6 Fixes groups not loading in TV Guide. 2025-06-27 09:59:23 -05:00
SergeantPanda
81d0c9472f When 1 channel is selected open correct channel editor. 2025-06-26 15:52:33 -05:00
SergeantPanda
5a38a56dc6 Fix setting stream profile to 'use default' 2025-06-26 13:56:12 -05:00
SergeantPanda
855578bf05 Default group to 'No change' on initial load. 2025-06-26 13:53:36 -05:00
SergeantPanda
99f2b5b4b1 Add no change options to bulk edit. 2025-06-26 13:42:56 -05:00
SergeantPanda
00073698b3 Update front end when channels are edited. 2025-06-26 13:30:52 -05:00
SergeantPanda
ba6012b28c Fixes bulk channel editor not saving.
Fixes #222
2025-06-26 13:15:00 -05:00
SergeantPanda
65e0be80e0 Refactor channel selection handling to use table state and clear selection on data change 2025-06-26 11:50:22 -05:00
SergeantPanda
f6339b691c Set better sizes for stream profile table for mobile support. 2025-06-26 10:55:25 -05:00
SergeantPanda
f8ef219665 Set better sizes for user-agent table for mobile support. 2025-06-26 10:17:48 -05:00
SergeantPanda
23e63ba4a0 Add minimum width to settings for better mobile support. 2025-06-26 09:56:31 -05:00
SergeantPanda
c04ca0a804 Add buffering as an active state. 2025-06-25 17:01:21 -05:00
SergeantPanda
c44c380bb2
Merge pull request #221 from Dispatcharr/channels-table-formatting
Channels table formatting
2025-06-24 18:17:00 -05:00
SergeantPanda
a307c63b0b Better sizes for mobile 2025-06-24 18:13:13 -05:00
SergeantPanda
587ab4afe0 Gets rid of unnecessary blank space at the bottom. 2025-06-24 17:56:05 -05:00
SergeantPanda
cc41731ae1 Minimum widths set 2025-06-23 19:23:16 -05:00
SergeantPanda
e2b93d3e6f Better stats card filling. 2025-06-23 17:27:34 -05:00
SergeantPanda
58a1304ddc Always show 2 decimal places for FFmpeg speed. 2025-06-22 13:25:43 -05:00
SergeantPanda
49f141d64a Better calculation for number of cards per column.
Fixes #218
2025-06-22 13:22:11 -05:00
SergeantPanda
4b214a49ee Move buttons to label row 2025-06-22 01:02:32 -05:00
SergeantPanda
77590367ac Remove unnecessary padding. 2025-06-22 00:30:07 -05:00
SergeantPanda
0b6175bac6 Fix background color to match rest of the theme. 2025-06-22 00:19:31 -05:00
SergeantPanda
384609ae77 Better sizing. 2025-06-21 23:46:58 -05:00
SergeantPanda
dae143a724 Better rendering for small screens on m3u and epg tables. 2025-06-21 22:54:52 -05:00
SergeantPanda
c43231c492 Add confirmation dialog for user deletion with warning suppression 2025-06-21 14:09:53 -05:00
SergeantPanda
7f1bdd0129 Add support for 'num' property in channel number extraction 2025-06-21 14:00:41 -05:00
SergeantPanda
f4e4fb1d13 Add confirmation dialog for deleting a channel profile. 2025-06-21 13:24:32 -05:00
SergeantPanda
3fb2433d3a Add confirmation dialog for m3u profile deletion. 2025-06-21 13:19:19 -05:00
SergeantPanda
d4688fa4e4 Add M3U and EPG URL configuration options with dynamic parameters
Closes #207
2025-06-21 13:00:11 -05:00
GitHub Actions
de811b6f68 Release v0.6.0 2025-06-19 20:09:48 +00:00
SergeantPanda
d8ecd1f452
Merge pull request #204 from Dispatcharr/dev
Dispatcharr v0.6.0
2025-06-19 14:57:59 -05:00
SergeantPanda
03f39d7d48 Rename FFMPEG_BITRATE to FFMPEG_OUTPUT_BITRATE for clarity. 2025-06-18 17:10:08 -05:00
SergeantPanda
2e118a3f8d Smarter parsing for input vs output streams. 2025-06-18 16:59:12 -05:00
SergeantPanda
ae88141c36 Change matching pattern for detecting input vs output streams in FFmpeg. 2025-06-17 21:33:26 -05:00
SergeantPanda
573ed96e82 Bug fix - Wait until FFmpeg is closed before marking the connection as closed. May fix some stuck FFmpeg processes. 2025-06-17 17:27:41 -05:00
SergeantPanda
b6cde2fec8 Only update redis if stream switch was successful. 2025-06-17 16:21:10 -05:00
SergeantPanda
29a5e93cfd Don't count currently used profile in used connections. 2025-06-17 15:51:03 -05:00
SergeantPanda
d130de3c80 Check connections remaining before switching streams. 2025-06-17 13:59:22 -05:00
SergeantPanda
838a373bea Use correct m3u_profile_id when changing streams. 2025-06-17 13:11:03 -05:00
SergeantPanda
3448a3b494 Add buffering as an active state. 2025-06-17 12:30:20 -05:00
SergeantPanda
42747e743c Converted epg_channel_id to a string as well. 2025-06-16 18:05:04 -05:00
dekzter
0b63b1286f breakclass command to reset network access settings 2025-06-16 10:57:54 -04:00
SergeantPanda
db7ca1a0c8 Convert "category_id" to string (but not "category_ids") 2025-06-15 18:21:41 -05:00
SergeantPanda
85a639ec5f Order XC live streams output by channel number. 2025-06-15 17:45:47 -05:00
SergeantPanda
7b16ca6ff7
Merge pull request #198 from Dispatcharr/proxy-settings
Proxy settings
2025-06-15 13:18:03 -05:00
SergeantPanda
a9ef41478c Adds reset to defaults for proxy settings. 2025-06-15 13:15:20 -05:00
SergeantPanda
ed346dd952 Better verbiage for a couple settings. 2025-06-15 13:09:40 -05:00
SergeantPanda
94b3255c93 Fixes settings getting overwritten when loading the settings page. 2025-06-15 13:06:10 -05:00
SergeantPanda
9757f6a48d Fix key error with react. 2025-06-15 11:28:57 -05:00
SergeantPanda
e80d30689c Settings load correctly during first open. 2025-06-14 13:42:01 -05:00
SergeantPanda
9c5a174409 Move regions to constants. 2025-06-14 13:08:56 -05:00
SergeantPanda
51ce2d241c Use integers and floats instead of strings for proxy settings. 2025-06-14 12:42:49 -05:00
SergeantPanda
0c4d320dc2 Add descriptions to proxy settings 2025-06-14 12:39:07 -05:00
dekzter
72542bf0fd default cidrs in network access form, added proxy settings to ui 2025-06-14 08:52:56 -04:00
SergeantPanda
fa3ee35d4d Prepopulate settings in database. 2025-06-13 14:51:56 -05:00
SergeantPanda
cfff51a9eb Add missing import. 2025-06-13 14:40:28 -05:00
SergeantPanda
c4a6b1469e Change to JSON settings 2025-06-13 14:36:08 -05:00
SergeantPanda
d0cefd3813 Fixes being unable to edit stream profile. 2025-06-13 12:48:04 -05:00
SergeantPanda
d53fbef444
Merge pull request #178 from xham3/automatch-fix
Fix process communicate() deadlock when EPG match data overfills the subprocess.PIPE buffer
2025-06-13 10:46:09 -05:00
SergeantPanda
10cc9de31b Merge branch 'proxy-settings' of https://github.com/Dispatcharr/Dispatcharr into proxy-settings 2025-06-13 10:28:00 -05:00
SergeantPanda
8eec41cfbb Fixes a bug where heartbeat thread will exit if channel is in shutdown delay.
This may also fix #129
2025-06-13 10:27:51 -05:00
dekzter
c4c1b8a629 Merge remote-tracking branch 'origin/dev' into proxy-settings 2025-06-13 10:33:40 -04:00
xham3
5360f38b14 Fix process communicate() deadlock when EPG match data overfills the subprocess.PIPE buffer. 2025-06-12 15:34:24 -07:00
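The deadlock fixed above is a classic subprocess pitfall: writing a large payload to `stdin` and then reading `stdout` sequentially blocks once either pipe buffer (typically ~64 KB) fills. `communicate()` services both ends concurrently. A self-contained demonstration (the echo child here stands in for the real matching subprocess):

```python
import subprocess
import sys

def run_matcher(payload: str) -> str:
    # communicate() writes stdin and drains stdout concurrently; a naive
    # proc.stdin.write(payload) followed by proc.stdout.read() can deadlock
    # once either PIPE buffer fills.
    proc = subprocess.Popen(
        [sys.executable, "-c", "import sys; sys.stdout.write(sys.stdin.read())"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    out, _ = proc.communicate(payload)
    return out

big = "x" * 1_000_000  # far larger than a pipe buffer
assert run_matcher(big) == big
```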
SergeantPanda
1e9ab54609 Use new methods for getting settings. 2025-06-12 16:11:43 -05:00
SergeantPanda
2f91e0ce1c Properly populate default values. 2025-06-12 16:02:08 -05:00
SergeantPanda
b4ae6911c9 Pull settings from database 2025-06-12 15:42:26 -05:00
SergeantPanda
08c04e710a Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-06-12 11:17:23 -05:00
SergeantPanda
a99a6431b2 More merge fixes. 2025-06-12 10:45:57 -05:00
SergeantPanda
ada1d51aaa Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into proxy-settings 2025-06-12 10:45:44 -05:00
SergeantPanda
59e4a28b31 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-06-11 17:36:47 -05:00
SergeantPanda
1bf16355c1
Merge pull request #183 from Dispatcharr/ffmpeg-stats
Ffmpeg stats
2025-06-11 17:35:19 -05:00
SergeantPanda
34ceffb86a Merge branch 'ffmpeg-stats' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-06-11 17:33:22 -05:00
SergeantPanda
bd53837f80 Better detection of input vs output stream information. 2025-06-11 17:20:27 -05:00
SergeantPanda
743cf4e297 Smarter parsing of ffmpeg stats output. 2025-06-11 16:55:14 -05:00
dekzter
1864b5d34c
Merge pull request #180 from Dispatcharr/user-management
User and Access Management
2025-06-11 08:39:59 -04:00
dekzter
30b2a19eb0 merged in main 2025-06-11 08:38:00 -04:00
dekzter
788667b687 better error checking, only warn for UI blocking 2025-06-11 08:24:32 -04:00
dekzter
e2e8b7088a Fixed bad merge conflict 2025-06-11 08:23:51 -04:00
SergeantPanda
a19f29b464
Merge pull request #177 from Dispatcharr/ffmpeg-stats
Ffmpeg stats
2025-06-10 21:26:16 -05:00
SergeantPanda
2add2c1dd2 Add new settings to database. 2025-06-10 21:23:04 -05:00
SergeantPanda
7812a410b3 Allow users to change proxy settings. 2025-06-10 21:17:30 -05:00
SergeantPanda
e753d9b9f8 Fixes a bug where stream profile name wouldn't update in stats. (Was outputting name string instead of ID.) 2025-06-10 19:16:52 -05:00
SergeantPanda
a2c7fc3046 [New feature]
Switch streams when buffering is detected.
2025-06-10 17:43:37 -05:00
SergeantPanda
2ec7d21bb9 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-06-10 15:50:04 -05:00
SergeantPanda
d850166a80 Add conditional channel_group_id assignment in ChannelViewSet. This fixes an issue where creating a channel from a stream without an assigned group would fail.
Closes #122
2025-06-10 15:47:27 -05:00
SergeantPanda
4fc306620a
Merge pull request #167 from maluueu/fix/ip-geo_rate-limit
core: api_views.py: add fallback IP geo provider
2025-06-10 15:20:48 -05:00
SergeantPanda
11d3d7a15a Add case-insensitive attribute lookup for M3U parsing 2025-06-10 14:45:49 -05:00
SergeantPanda
ae01440a15 Better error messaging for unsupported codecs in the web player. Also don't block controls with error messages. 2025-06-10 13:58:34 -05:00
SergeantPanda
0a994bbe3f Merge branch 'ffmpeg-stats' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-06-10 11:42:51 -05:00
SergeantPanda
62ac134359 Move stream type to after audio. 2025-06-10 11:38:52 -05:00
SergeantPanda
a56efa9ef5
Merge pull request #174 from Dispatcharr/ffmpeg-stats
Ffmpeg stats
2025-06-10 10:26:09 -05:00
SergeantPanda
3522066867 Changed some badge colors and added tooltips. 2025-06-10 10:17:28 -05:00
SergeantPanda
1f6f15ed73 Add stream type to stats page. 2025-06-10 10:10:05 -05:00
SergeantPanda
f869daa37c Add stream type for stream (HLS/MPEGTS, ETC) 2025-06-10 09:51:56 -05:00
SergeantPanda
b8992bde64 Add audio channels to stats page. 2025-06-10 09:23:43 -05:00
SergeantPanda
efaa64d00b Fix resolution not always parsing correctly. 2025-06-10 09:08:04 -05:00
dekzter
a1576bd493 merged in dev 2025-06-10 08:55:14 -04:00
dekzter
82f35d2aef check and warn before saving a network access setting that could block current client access 2025-06-10 08:46:36 -04:00
SergeantPanda
8acb31bbe8
Merge pull request #172 from Dispatcharr/ffmpeg-stats
Ffmpeg stats
2025-06-09 20:02:42 -05:00
SergeantPanda
cd47e76245 Add FFmpeg speed display and audio codec to channel card 2025-06-09 19:56:37 -05:00
SergeantPanda
0fed65a478 Add FFmpeg speed and audio codec information to channel details 2025-06-09 19:55:10 -05:00
SergeantPanda
677fbba1ac FFmpeg stats added to channel card. 2025-06-09 19:42:58 -05:00
SergeantPanda
71079aead3 Add ffmpeg stats to channel status api. 2025-06-09 19:36:27 -05:00
SergeantPanda
47500daafa Moved some functions to channel_service 2025-06-09 19:10:52 -05:00
SergeantPanda
7e25be0717 Store video and audio information in redis. 2025-06-09 18:57:36 -05:00
SergeantPanda
7d0c32ef3f Fix line breaking for stats. 2025-06-09 17:45:00 -05:00
SergeantPanda
9d8e011e2c Store FFmpeg stats in Redis. 2025-06-09 17:36:01 -05:00
SergeantPanda
44c8189c29 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into ffmpeg-stats 2025-06-09 15:42:31 -05:00
Marlon Alkan
18a6c428c1
core: api_views.py: add fallback IP geo provider
Fixes #127

- add ip-api.com as fallback geo provider
- fix silent JSONException by parsing only if HTTP 200 OK
- add logger and log error if IP geo can't be fetched
2025-06-08 19:28:56 +02:00
Marlon Alkan
192edda48e
apps: output: change body detection logic and add tests 2025-06-08 17:29:20 +02:00
Marlon Alkan
7e5be6094f
docker: init: 02-postgres.sh: allow DB user to create new DB (for tests) 2025-06-08 17:29:13 +02:00
dekzter
789d29c97a proper cidr validation server-side 2025-06-08 08:29:25 -04:00
SergeantPanda
7acc31ec97 Allow for tuners as low as 1 instead of 2.
Closes #149
2025-06-07 19:27:59 -05:00
SergeantPanda
8ae314fbb0 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-06-06 19:05:35 -05:00
SergeantPanda
8ee68a2349 Deny POST if the body isn't empty. 2025-06-06 19:04:19 -05:00
SergeantPanda
9b053ce1a5
Merge pull request #112 from maluueu/iptv-smarters-support_csrf-exception
apps: output: views.py: exclude M3U endpoint from CSRF protection
2025-06-06 19:03:47 -05:00
SergeantPanda
708a269ae5 Adds the ability to add 'direct=true' to the M3U output URL. Doing so will use the URL from the provider instead of the Dispatcharr URL. 2025-06-06 12:21:08 -05:00
SergeantPanda
343ecfbca6 Allow 'days' to be set on the EPG URL, which limits the number of days of EPG data to output. 2025-06-06 12:00:08 -05:00
SergeantPanda
0ec5ffff33 Fix status message not updating the frontend during M3U refresh. 2025-06-06 08:00:19 -05:00
SergeantPanda
767c42a1fe Properly keep track of removed streams. 2025-06-05 21:34:14 -05:00
SergeantPanda
2359818d7b Calculate stale time from start of scan not start of stale cleanup function. 2025-06-05 21:26:11 -05:00
SergeantPanda
5e2757f578 Set back to a minimum of 1. Task needs to be updated first or all streams get removed. 2025-06-04 17:42:11 -05:00
SergeantPanda
8814f21d59 Don't auto populate stale field to 7 days if 0 is set.
Closes #123
2025-06-04 17:29:49 -05:00
SergeantPanda
a767f28eb6 Allow stale stream days to be set to 0 to disable retention completely. 2025-06-04 17:06:00 -05:00
SergeantPanda
353a51d07c
Merge pull request #146 from slamanna212/main
Remove Redundant Issue Form Labels | Triage Feature Requests
2025-06-04 16:38:03 -05:00
SergeantPanda
e7bf8cbede Added support for LIVE tag and dd_progrid numbering systems for epg. 2025-06-03 22:00:17 -05:00
SergeantPanda
722965b987 Replaced old images with ghcr images. 2025-06-03 21:32:24 -05:00
Sam LaManna
2a9a98cad7 Remove redundant labels now that we have types fully setup 2025-06-03 22:20:00 -04:00
Sam LaManna
abef4620d0 add triage to feature requests too 2025-06-03 22:19:28 -04:00
SergeantPanda
5cb2be7c93 Add support for dynamic tvg-id source selection in EPG generation
tvg_id_source accepts tvg_id, gracenote, or channel_number (the default if nothing is selected)
2025-06-03 19:37:57 -05:00
SergeantPanda
1f0e643954 Add support for dynamic tvg-id source selection in M3U generation
`tvg_id_source` accepts `tvg_id`, `gracenote`, or `channel_number` (the default if nothing is selected)
2025-06-03 19:10:14 -05:00
dekzter
9f96529707 merged in main 2025-06-03 17:49:24 -04:00
GitHub Actions
e6c30f178f Release v0.5.2 2025-06-03 21:35:26 +00:00
SergeantPanda
7b39ad4893
Merge pull request #135 from Dispatcharr/dev
v0.5.2
2025-06-03 16:34:38 -05:00
SergeantPanda
a72eaf118f Refactor channel info retrieval for safer decoding and improved error logging. Hopefully fixes stats not showing sometimes. 2025-06-03 10:59:53 -05:00
SergeantPanda
fb586a9f5a Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-06-03 07:55:21 -05:00
SergeantPanda
996b9848b4 Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-06-03 07:55:10 -05:00
SergeantPanda
5422595136
Merge pull request #131 from Dispatcharr/main
Sync main to dev
2025-06-02 20:47:08 -05:00
SergeantPanda
6ce387b0b0 Auto-scales Celery based on demand. Should lower overall memory and CPU usage while allowing high-CPU tasks to complete quickly.
Closes #111
2025-06-02 18:03:32 -05:00
SergeantPanda
58f5287a53 Improved logging for M3U processing. 2025-06-02 10:44:30 -05:00
SergeantPanda
18e08d3690
Merge pull request #119 from slamanna212/main
Add Support for Github Organization Issue Type
2025-06-01 14:26:46 -05:00
Sam LaManna
39a06f9ba2 Add Support for Github Organization Issue Type 2025-06-01 14:10:50 -04:00
SergeantPanda
dbf5acdcde
Merge pull request #117 from slamanna212/main
Add Issue Forms , Resubmit of PR 113
2025-06-01 12:02:10 -05:00
Sam LaManna
70e4e43d88 Add Issue Forms 2025-06-01 12:53:17 -04:00
dekzter
3f445607e0 looooots of updates for user-management, initial commit of access control 2025-05-31 18:01:46 -04:00
Marlon Alkan
669120e35a
apps: output: views.py: exclude M3U endpoint from CSRF protection
Excludes the M3U generation view from CSRF protection. This allows IPTV Smarters apps (I only tested on Android) to access the playlist. Otherwise a 403 error is returned because, instead of a GET request, these apps send a POST request with an empty body to the endpoint.
2025-05-31 01:31:19 +02:00
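Why exempting the view fixes the 403 can be shown with a toy model of Django's CSRF middleware, which skips enforcement for views marked `csrf_exempt`. This dependency-free sketch mimics that check; all names here are illustrative, not Django's internals:

```python
# Toy model of CSRF enforcement: unsafe methods need a token unless the
# view opted out via a csrf_exempt-style marker (names are illustrative).
def csrf_exempt(view):
    view.csrf_exempt = True
    return view

def dispatch(view, method, has_token=False):
    # POST without a CSRF token is rejected unless the view is exempt.
    if method == "POST" and not has_token and not getattr(view, "csrf_exempt", False):
        return 403
    return 200

@csrf_exempt  # mirrors the commit: empty-body POSTs from IPTV apps now pass
def generate_m3u():
    return "#EXTM3U"
```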
SergeantPanda
a9cdc9e37a Support ?cachedlogos=false for epg as well. 2025-05-30 14:40:06 -05:00
SergeantPanda
d339c322ed Support using direct logos add '?cachedlogos=false' to end of url 2025-05-30 14:23:03 -05:00
SergeantPanda
67663e2946 Add remove_blank_text=True to lxml parser. Fixes crashes related to poorly formatted xmltv files. 2025-05-30 10:25:47 -05:00
SergeantPanda
906fbef9c2 Output errors from parser if found. 2025-05-29 13:56:20 -05:00
dekzter
6504db3bd4 merged in dev 2025-05-28 18:34:45 -04:00
GitHub Actions
d378653983 Release v0.5.1 2025-05-28 21:29:07 +00:00
SergeantPanda
dc4ffe7a8c
Merge pull request #110 from Dispatcharr/dev
Release - v0.5.1
2025-05-28 16:27:43 -05:00
SergeantPanda
4c79af1f30 Log ffmpeg stats 2025-05-28 15:26:42 -05:00
SergeantPanda
50048518a9 Fixes bug where multiple channel initializations can occur, which leads to choppy streams and zombie channels. 2025-05-27 19:05:26 -05:00
SergeantPanda
4dbb363211 Removed display-name from logging as it may not exist. 2025-05-27 18:00:35 -05:00
SergeantPanda
68a7f7f4fd Add profile name to hdhr friendly name and device id. 2025-05-27 10:46:49 -05:00
SergeantPanda
3fa5301894 If the cached file doesn't exist for a compressed mapped file, properly re-extract it. 2025-05-27 09:54:28 -05:00
SergeantPanda
45239b744c Delete cached files when deleting epg account. 2025-05-26 16:19:57 -05:00
SergeantPanda
5d2c604a4a Peek inside gz files to see if they contain xml files. Check the file list of zip files to see if they contain xml files. 2025-05-26 15:49:51 -05:00
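The detection described in the commit above can be sketched with the standard library alone — decompress just enough of a `.gz` to recognize an XMLTV header, and scan a `.zip`'s member list for `.xml` entries. The exact heuristics the project uses are an assumption:

```python
# Hedged sketch of archive peeking for XMLTV detection (stdlib only).
import gzip
import io
import zipfile

def gz_contains_xml(data):
    # Decompress only the first bytes, enough to spot an XML/XMLTV header.
    with gzip.open(io.BytesIO(data)) as f:
        head = f.read(64).lstrip()
    return head.startswith(b"<?xml") or head.startswith(b"<tv")

def zip_contains_xml(data):
    # A zip's central directory lists members without extracting them.
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        return any(name.lower().endswith(".xml") for name in zf.namelist())
```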
SergeantPanda
8f4e05b0b8 Add extracted_file_path to EPGSource model and update extraction logic 2025-05-26 15:10:54 -05:00
SergeantPanda
182a009d69 Track extracted files for mapped epg files. 2025-05-25 18:14:27 -05:00
SergeantPanda
7dbd41afa8 Extract compressed files after downloading and delete original. 2025-05-25 17:39:41 -05:00
SergeantPanda
d270e988bd Greatly improve filetype detection. 2025-05-25 16:50:03 -05:00
SergeantPanda
0322a5c904 Smarter logging for file type. 2025-05-25 15:47:22 -05:00
SergeantPanda
363a1a8080 Fixes selecting a profile leads to webui crash. 2025-05-25 15:01:04 -05:00
SergeantPanda
1ab04e31a4 Show channels without epg data. 2025-05-25 14:52:05 -05:00
SergeantPanda
c1eb3a6ecf Add Zip file support for EPG. 2025-05-25 14:05:59 -05:00
SergeantPanda
391a1d9707 A little more cleanup. 2025-05-25 13:29:39 -05:00
SergeantPanda
d7023bcdac Some more code cleanup 2025-05-25 13:19:49 -05:00
SergeantPanda
6c3102a60c Removed sleep from debugging. 2025-05-24 17:46:17 -05:00
SergeantPanda
f77b3b9756 Little more cleanup. 2025-05-24 17:40:45 -05:00
SergeantPanda
3f17c90a8b Cleaned up some old unneeded code. 2025-05-24 17:31:52 -05:00
SergeantPanda
6854af23eb Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-05-24 15:07:56 -05:00
SergeantPanda
759cbe2f7d Search the entire epg for channels to accommodate non-standard epg files. 2025-05-24 15:07:49 -05:00
dekzter
eb1bbdd299 merged in dev 2025-05-24 07:19:43 -04:00
dekzter
2c521b69ae tzlocal for upcoming MRs 2025-05-24 07:18:08 -04:00
dekzter
31293ebc44 wrap and tooltip for parameters 2025-05-24 07:17:46 -04:00
dekzter
f4381c9740 fixed DVR page, swapped clients table in stats for custom table 2025-05-23 17:01:02 -04:00
dekzter
9daa764fbb completely removed mantine-react-table, added empty / dummy output for VOD endpoints 2025-05-23 13:40:11 -04:00
dekzter
0b0373f4ee allow 'all' for streamer class by relating no profiles 2025-05-23 08:21:31 -04:00
SergeantPanda
f87ab4b071 Rolled back some earlier memory optimizations that were causing issues with extremely large m3us. 2025-05-22 21:52:28 -05:00
SergeantPanda
925850a012 Fix change_streams and unchanged_streams possibly not existing when trying to clean up. 2025-05-22 16:51:20 -05:00
SergeantPanda
8302acd78a Additional cleanup while processing batches. 2025-05-22 15:34:20 -05:00
dekzter
e95c0859ab user custom properties, xc has its own password, properly checking xc permissions for streaming 2025-05-22 15:21:43 -04:00
SergeantPanda
d01a69828a Added additional logging for m3u processing 2025-05-22 11:58:44 -05:00
SergeantPanda
48e76273d1 Attempt at fixing timezone issues. 2025-05-22 10:23:59 -05:00
SergeantPanda
448f9bc6cf Decreased line height for status messages to look better on smaller screens. 2025-05-22 09:46:34 -05:00
SergeantPanda
57298eb811 Added some padding for parsing status messaging. 2025-05-21 16:47:00 -05:00
SergeantPanda
0fcab93ac3 Enhance WebSocket update handling by limiting data size and improving garbage collection for large operations 2025-05-21 16:28:15 -05:00
SergeantPanda
e816fa6afd Update epg form to be a closer match to m3u form. 2025-05-21 15:53:52 -05:00
SergeantPanda
6de565857d Set default refresh interval for files added via mapping to 0 since they will auto-update when modified. 2025-05-21 14:59:24 -05:00
dekzter
e3553b04ad more user management features and bug fixes 2025-05-21 15:33:54 -04:00
dekzter
e979113935 merged in dev 2025-05-21 15:24:30 -04:00
SergeantPanda
1087568de7 Fix bug when using file upload for m3u and not being able to set "Use Default" 2025-05-21 14:22:01 -05:00
SergeantPanda
5cae2e595e Add tvc-guide-id (gracenote ID) to channel when using bulk creation. 2025-05-21 12:52:30 -05:00
SergeantPanda
500e0941e2 Fixed calling process when it may not exist (log level higher than debug) 2025-05-21 12:11:03 -05:00
SergeantPanda
303123f3ec Buffer overflow error. 2025-05-21 09:44:09 -05:00
SergeantPanda
ba8eee16ee Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-05-21 09:02:20 -05:00
SergeantPanda
34374045a8 Improved channel cards formatting for different screen resolutions. 2025-05-21 09:02:10 -05:00
dekzter
a96c5f0f5c merged in dev 2025-05-21 08:16:35 -04:00
SergeantPanda
be622f48af
Merge pull request #104 from Dispatcharr/EPG-Processing
Epg processing
2025-05-20 20:43:51 -05:00
SergeantPanda
72783090cd Delete all instead of using a loop. 2025-05-20 20:36:04 -05:00
SergeantPanda
422bd0577a Removed cleanup from celery task. 2025-05-20 20:13:21 -05:00
SergeantPanda
55089044fa Cleaned up code and logging a bit more. 2025-05-20 20:07:10 -05:00
SergeantPanda
451c892457 Changed logging levels. 2025-05-20 19:49:55 -05:00
SergeantPanda
56be7b7194 Added missing dc dependency for auto-match. 2025-05-20 19:17:01 -05:00
SergeantPanda
4141aa11e5 Removed memory-profiler from requirements. 2025-05-20 18:55:05 -05:00
SergeantPanda
a74160a0b6 Add lxml to base image and set base to build automatically if requirements changes. 2025-05-20 18:51:40 -05:00
SergeantPanda
cfad5621ce Found the problem. I hate lxml. 2025-05-20 18:43:43 -05:00
SergeantPanda
ae823ae8ea Even more debug logging. 2025-05-20 15:48:21 -05:00
SergeantPanda
7d6ef38bce Added helpful comments. 2025-05-20 15:34:03 -05:00
SergeantPanda
d52ff40db1 More logging 2025-05-20 15:15:00 -05:00
SergeantPanda
cc060bbed6 More memory logging to debug. 2025-05-20 14:49:29 -05:00
SergeantPanda
8bf093d79b Increased logging and cleanup 2025-05-20 13:53:02 -05:00
SergeantPanda
06b1dec2b6 Better logic for cleanup task. Skip gathering memory if we aren't going to log it anyway. 2025-05-19 10:02:42 -05:00
SergeantPanda
eb223e1df2 Enable logging for core utils. 2025-05-19 09:53:52 -05:00
SergeantPanda
6087ecadf0 Cleaning up added gc's 2025-05-19 09:42:21 -05:00
SergeantPanda
7c809931d7 Reworked celery memory cleanup logic. 2025-05-18 20:57:37 -05:00
SergeantPanda
f821743163 Created a utility to clean up django memory. 2025-05-18 19:46:52 -05:00
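A utility like the one the commit above mentions might look like the following toy sketch — optionally clear Django's per-request query log, then force a full garbage-collection pass. The function and parameter names are illustrative, not the project's actual code:

```python
# Toy sketch of a Django/Celery memory-cleanup helper (names are illustrative).
import gc

def cleanup_memory(reset_queries=None):
    if reset_queries is not None:
        reset_queries()     # e.g. django.db.reset_queries when DEBUG query logging is on
    return gc.collect()     # number of unreachable objects collected
```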
SergeantPanda
b84e3f77f3 Fixed custom props not being loaded. 2025-05-18 18:56:30 -05:00
SergeantPanda
e641cef6f1 Fixed errors 2025-05-18 18:28:01 -05:00
dekzter
f229ca42b8 logout icon, better loading of channel page before auth user is loaded 2025-05-18 18:08:05 -04:00
SergeantPanda
ed665584e9 The struggle is real 2025-05-18 17:05:03 -05:00
dekzter
79104affe3 fixed migration issues 2025-05-18 11:53:09 -04:00
dekzter
74d58515d0 user management, user levels, user level channel access 2025-05-18 11:19:34 -04:00
SergeantPanda
8133af5d20 Remove old parser reference. 2025-05-17 17:26:18 -05:00
SergeantPanda
1174e2e0c7 EPG processing enhancements. Celery memory management. 2025-05-17 16:42:37 -05:00
SergeantPanda
7fe618b037 Much better memory usage. About half as much 2025-05-16 22:15:21 -05:00
SergeantPanda
f18ca4de37 Initial rework of EPG processing. 2025-05-16 19:26:06 -05:00
SergeantPanda
eecf879119 Added random descriptions for dummy channels in tv guide. 2025-05-16 11:08:52 -05:00
SergeantPanda
a2299d0c52 Improved dummy output. 2025-05-16 10:17:06 -05:00
SergeantPanda
1508c2902e Switched logging to match the rest of the application. 2025-05-16 09:46:55 -05:00
SergeantPanda
0843776b6b
Merge pull request #99 from Dispatcharr/Decimals-in-channels
Decimals in channels
2025-05-15 16:54:23 -05:00
SergeantPanda
8c86f7656a Convert HDHR floats correctly. 2025-05-15 16:21:14 -05:00
SergeantPanda
8394fc2ed4 EPG and M3U support decimals if the channel has a decimal otherwise use integer. 2025-05-15 16:13:08 -05:00
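The decimal rule the commit above describes — output an integer when the channel number is whole, otherwise keep the decimal — can be sketched in a few lines (a hypothetical helper, not Dispatcharr's actual function):

```python
# Sketch of decimal-aware channel-number formatting for EPG/M3U output.
def format_channel_number(num):
    num = float(num)
    # Whole numbers render as integers; fractional numbers keep the decimal.
    return str(int(num)) if num.is_integer() else str(num)
```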
SergeantPanda
8ee1581588 Forms support floats now. 2025-05-15 15:32:21 -05:00
SergeantPanda
3b2250895d Fixed migrations. 2025-05-15 14:38:28 -05:00
SergeantPanda
baabea5006
Merge pull request #93 from MooseyOnTheLoosey/main
Channel Number to Float
2025-05-15 14:21:42 -05:00
SergeantPanda
5c3fdc1354 Enhance JWT authentication error handling and user redirection on token issues. 2025-05-15 14:12:31 -05:00
SergeantPanda
e5d353ec7f Handle corrupted JSON cache files in refresh_single_m3u_account function 2025-05-15 14:05:57 -05:00
SergeantPanda
1772bc7257 Configure Redis to not write to disk and also run in unprotect mode if running in debug mode. 2025-05-15 13:16:18 -05:00
SergeantPanda
1aac0f8011 Separated beat schedules. Scan files now only runs every 20 seconds. 2025-05-15 12:12:41 -05:00
GitHub Actions
692d7bfb88 Release v0.5.0 2025-05-15 15:35:56 +00:00
SergeantPanda
cc1878bdd7
Merge pull request #98 from Dispatcharr/dev
v0.5.0
2025-05-15 10:33:04 -05:00
SergeantPanda
790e3710a6 Created missing migrations. 2025-05-14 20:10:26 -05:00
SergeantPanda
2c831bd756 Fixes remote debugging. 2025-05-14 19:11:12 -05:00
SergeantPanda
44a79d2a8a Log UWSGI if debug is set (DISPATCHARR_DEBUG=true) 2025-05-14 18:49:46 -05:00
SergeantPanda
14c3944578 Don't hardcode dispatch as user for nginx. Use postgres_user. 2025-05-14 10:19:55 -05:00
SergeantPanda
cecc057ea4 Make sure we can change render group gid to match host and capture errors so we don't crash server. 2025-05-14 09:32:24 -05:00
SergeantPanda
e7439a074f Enhance WebSocket connection handling based on authentication status 2025-05-13 16:57:39 -05:00
SergeantPanda
0a222f27af Move fetchChannelProfiles to initdata 2025-05-13 16:52:21 -05:00
SergeantPanda
cbbb2a6d59 Fix Profiles not loading. 2025-05-13 16:13:44 -05:00
SergeantPanda
cff68625e0 Move hardware detection to end of entrypoint so it's easier to find. 2025-05-13 12:44:32 -05:00
SergeantPanda
0a9250c3d5 Vastly improved logic for detecting recommended acceleration methods. 2025-05-13 12:32:49 -05:00
SergeantPanda
dd54a13bdd Fix script exiting entrypoint if no devices were found. 2025-05-13 00:37:07 -05:00
SergeantPanda
40765ed46d One more attempt at proper intel detection. 2025-05-12 21:19:43 -05:00
SergeantPanda
bb73da1cb9 Fixes invalid regex detection for intel gpu models. 2025-05-12 20:55:35 -05:00
SergeantPanda
08493321dd Fixes intel devices not matching correctly for libva_driver recommendations. 2025-05-12 20:27:38 -05:00
SergeantPanda
b43f096ea6 Cleaned up unneeded comment for group_add and added log level. 2025-05-12 20:06:16 -05:00
SergeantPanda
bee2226e75 Use base for debug 2025-05-12 20:04:32 -05:00
SergeantPanda
1ab3dcac48 Use base for dev. 2025-05-12 20:04:02 -05:00
SergeantPanda
3098b96919 Removed render group creation as now it's done in the base build. 2025-05-12 20:03:25 -05:00
SergeantPanda
0953e044b7 Huge improvement to hardware acceleration script. Renamed for accuracy. 2025-05-12 19:46:23 -05:00
SergeantPanda
634d16d402 Improved logic for assigned GID's and group memberships for video and render groups. 2025-05-12 17:23:10 -05:00
SergeantPanda
397ec499fe Assign GID 109 to render group 2025-05-12 17:02:10 -05:00
SergeantPanda
9f1d382472 Don't force GID for render group. 2025-05-12 16:02:24 -05:00
SergeantPanda
74152406d1 Add Render group to base. 2025-05-12 15:57:25 -05:00
SergeantPanda
d25d57c162 Add render group creation for hardware acceleration support in Dockerfile 2025-05-12 14:26:32 -05:00
SergeantPanda
ade8a625bc Refactor environment variable handling in entrypoint.sh for consistency and efficiency 2025-05-12 13:57:22 -05:00
SergeantPanda
da9a78c875 Only write LIBVA_DRIVER_NAME to dispatcharr.sh if it's set by the user. 2025-05-12 13:16:15 -05:00
SergeantPanda
87798f4434 Only set LIBVA_DRIVER_NAME if the user has it specified. Fixes bug where FFmpeg wasn't auto-detecting the correct driver for VAAPI and QSV. 2025-05-12 12:52:11 -05:00
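The entrypoint behavior described in the commit above can be sketched as a small shell function — persist `LIBVA_DRIVER_NAME` only when the user set it, so FFmpeg can auto-detect the right VAAPI/QSV driver otherwise. The function name and output-file parameter are assumptions for illustration:

```shell
# Hypothetical sketch: write the export line only if the user supplied a value.
write_libva_env() {
  out="$1"
  if [ -n "${LIBVA_DRIVER_NAME:-}" ]; then
    echo "export LIBVA_DRIVER_NAME=${LIBVA_DRIVER_NAME}" >> "$out"
  fi
}
```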
SergeantPanda
8dd3bd6878 Use variables in export. 2025-05-12 09:10:38 -05:00
SergeantPanda
b85a5be023 Add LD_LIBRARY_PATH to dispatch user and global. 2025-05-12 08:44:07 -05:00
SergeantPanda
3a06b12f2c Hopefully fixes QSV hardware acceleration. 2025-05-11 12:45:57 -05:00
SergeantPanda
3a4631e9ad Add extra note if model isn't detected. 2025-05-10 16:51:42 -05:00
SergeantPanda
3c1157d330 Enhance user group verification messages to include the username for clarity 2025-05-10 16:45:01 -05:00
SergeantPanda
decd54dba9 Add NVIDIA device name to summary 2025-05-10 15:45:08 -05:00
SergeantPanda
4afa3166ba Improved group checking logic. 2025-05-10 15:35:48 -05:00
SergeantPanda
e59521ae94 Add Dispatch to video and render groups if they exist. 2025-05-10 15:29:25 -05:00
SergeantPanda
3a5dcab919 Also add LIBVA_DRIVERS_PATH to global environment. 2025-05-10 15:07:38 -05:00
SergeantPanda
67b2178978 Add LIBVA_DRIVER_PATH to environmental variables for all users. 2025-05-10 14:44:47 -05:00
SergeantPanda
ae2af82d1a Update docker compose to show new log level. 2025-05-10 14:33:52 -05:00
SergeantPanda
bb755c2351 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-05-10 14:18:09 -05:00
SergeantPanda
ee8cef5aa9 Cleaned up logging. 2025-05-10 14:13:43 -05:00
SergeantPanda
d5b64a56d6 Smarter logging for file change imports. 2025-05-10 13:56:55 -05:00
SergeantPanda
67aca64420 Properly set EV for all profiles so uWSGI daemons can see it. 2025-05-10 13:25:03 -05:00
SergeantPanda
ecd9d146c6 Changed several logging levels. 2025-05-10 10:29:17 -05:00
SergeantPanda
24fba3c2b1 Change some celery tasks from info to debug. 2025-05-10 09:58:57 -05:00
SergeantPanda
d3615e1a66 Huge overhaul of logging. More standardized, and we now capture logs from celery tasks and send them to the console.
Also adds a new environment variable, DISPATCHARR_LOG_LEVEL; available log levels: TRACE, DEBUG, INFO, WARNING, ERROR, CRITICAL
2025-05-10 09:29:06 -05:00
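Honoring a log-level environment variable like the one the commit above introduces, including a custom TRACE level below DEBUG, could look like this sketch (the numeric value 5 for TRACE and the fallback behavior are assumptions about the project's choices):

```python
# Sketch of reading DISPATCHARR_LOG_LEVEL, with a custom TRACE level (assumed = 5).
import logging
import os

TRACE = 5
logging.addLevelName(TRACE, "TRACE")

def configured_level(default="INFO"):
    name = os.environ.get("DISPATCHARR_LOG_LEVEL", default).upper()
    if name == "TRACE":
        return TRACE
    # Unknown names fall back to INFO rather than crashing at startup.
    return getattr(logging, name, logging.INFO)
```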
dekzter
2dba3aca24 fixed dependencies for onEdit callback 2025-05-10 09:39:03 -04:00
dekzter
9c9e546f80 websockets behind auth, cleaned up errors and bad state handling in websocket.jsx 2025-05-10 08:40:53 -04:00
SergeantPanda
23b678bb03 Add pciutils for better hardware detection. 2025-05-09 20:06:28 -05:00
SergeantPanda
2ddc6beb15 Better messaging since nvidia variables will always exist in our container. 2025-05-09 18:31:16 -05:00
SergeantPanda
18bc422077 Better detection. 2025-05-09 17:37:39 -05:00
SergeantPanda
aff93591fd Detect hardware acceleration capabilities and provide suggestions. 2025-05-09 17:19:46 -05:00
SergeantPanda
f762e1b923 Capture and display transcode stderr. 2025-05-09 13:44:49 -05:00
SergeantPanda
04cbfa5f26 Build arm test. 2025-05-09 12:54:52 -05:00
SergeantPanda
178dc61e94 Better error handling for debug wrapper when timeouts occur. 2025-05-09 12:20:29 -05:00
SergeantPanda
25fc69d453 Properly disable celery task if m3u is disabled. 2025-05-09 12:06:43 -05:00
SergeantPanda
6afccadf74 Delete celery tasks properly during m3u account deletion. 2025-05-09 12:00:57 -05:00
SergeantPanda
d96a0d93ab
Merge pull request #96 from Dispatcharr/dev
Allow manual runs of CI Pipeline.
2025-05-09 11:15:48 -05:00
SergeantPanda
8779e6a8cf Allow manual runs of CI Pipeline. 2025-05-09 11:14:34 -05:00
SergeantPanda
8b49036cc2
Merge pull request #95 from Dispatcharr/dev
Dev
2025-05-09 11:03:59 -05:00
SergeantPanda
713350713f
Merge pull request #94 from SergeantPanda/docker-refactor
Docker refactor
2025-05-09 10:49:21 -05:00
SergeantPanda
2fcd2a6b80 Rebuild on base-image.yml change. 2025-05-09 08:43:44 -05:00
SergeantPanda
d25573cae3 Fixes unwanted changes. 2025-05-09 08:40:18 -05:00
SergeantPanda
2835a53e30 Fix base not pushing. 2025-05-09 08:37:08 -05:00
SergeantPanda
d6637d30a6 Build 2025-05-09 08:23:42 -05:00
SergeantPanda
60a8bed65a Properly pass build arguments. 2025-05-09 08:16:43 -05:00
SergeantPanda
e88e3928dd If EPG source is not found when task is run, delete the task so it doesn't keep happening. 2025-05-08 20:17:22 -05:00
SergeantPanda
c3cad47e4c Properly disable celery task when epg is disabled. 2025-05-08 20:02:03 -05:00
SergeantPanda
58773c015c Modify logging levels of uwsgi and fix epg tasks not being deleted when epg is deleted. 2025-05-08 19:47:05 -05:00
MooseOnTheLoose
88a7312397
Merge branch 'dev' into main 2025-05-08 19:16:49 -05:00
SergeantPanda
daa40033fc Reorganize base image build arguments in Dockerfile 2025-05-08 17:16:57 -05:00
SergeantPanda
1ff748d3d1 Dynamically build base image repo. 2025-05-08 17:10:27 -05:00
SergeantPanda
f34cf9e086 New workflow to make base image. 2025-05-08 17:00:20 -05:00
SergeantPanda
70313bce69 Cleaned up unnecessary layer. 2025-05-08 16:41:14 -05:00
SergeantPanda
1f4ade0be7 Slight refactor. 2025-05-08 14:23:05 -05:00
SergeantPanda
84acf743b9 Try cleaning up frontend folder. 2025-05-08 12:58:45 -05:00
SergeantPanda
daf0685355 Switch to ffmpeg as base. 2025-05-08 12:36:19 -05:00
dekzter
4ebfde2797 auto-assign numbers now configurable by selection, channel enable switch now applies to all selected 2025-05-08 12:43:27 -04:00
SergeantPanda
54bb3da486 Added missing libraries for ffmpeg. 2025-05-08 11:25:28 -05:00
SergeantPanda
cd5c6dff5f Add Timestamp 2025-05-08 10:56:05 -05:00
SergeantPanda
26d33f0f92 Added nginx config copy 2025-05-08 10:50:10 -05:00
SergeantPanda
38b113800c build 2025-05-08 10:28:55 -05:00
SergeantPanda
7dc3ed10cc build 2025-05-08 10:21:27 -05:00
MooseyOnTheLoosey
5bae7997c0 Removing more debug 2025-05-08 09:50:19 -05:00
MooseyOnTheLoosey
f6ea1b41b3 Removing debug 2025-05-08 09:15:08 -05:00
SergeantPanda
912a11da22 Use base 2025-05-08 09:13:05 -05:00
MooseyOnTheLoosey
88866cc905 Updated channel numbers from integer to float 2025-05-08 09:12:10 -05:00
SergeantPanda
92f30f5c4a Test build 2025-05-08 08:41:47 -05:00
SergeantPanda
6c94bbb0c2 base build 2025-05-08 08:05:34 -05:00
SergeantPanda
2b3f17972a Rename group to dispatch if it isn't already. 2025-05-07 21:10:56 -05:00
SergeantPanda
37572b9bdf build 2025-05-07 20:48:37 -05:00
SergeantPanda
3860ab62e3 build 2025-05-07 20:42:40 -05:00
SergeantPanda
3d3abab8b6 build 2025-05-07 20:03:52 -05:00
SergeantPanda
52cd19ada7 build 2025-05-07 19:56:39 -05:00
SergeantPanda
dc2b676cbc python... 2025-05-07 19:32:46 -05:00
SergeantPanda
2bcd9d24d7 again.. 2025-05-07 19:21:07 -05:00
SergeantPanda
147572869b Another attempt 2025-05-07 19:15:18 -05:00
SergeantPanda
ae5a189feb Build 2025-05-07 18:55:29 -05:00
SergeantPanda
05d70c894d build 2025-05-07 18:47:13 -05:00
SergeantPanda
51891a59ec Build 2025-05-07 18:36:31 -05:00
SergeantPanda
dc493073af build attempt 2025-05-07 18:32:50 -05:00
SergeantPanda
082ee90cc1 Switch to ubuntu as base 2025-05-07 18:20:41 -05:00
SergeantPanda
9fed32b715 ffmpeg as base 2025-05-07 18:10:13 -05:00
SergeantPanda
ff05c8de16 Switch to lscr ffmpeg. 2025-05-07 17:36:49 -05:00
SergeantPanda
2bdcc9417d Trying a different method for ffmpeg. 2025-05-07 16:51:40 -05:00
SergeantPanda
ca27b58eec Missed xz-utils 2025-05-07 16:20:56 -05:00
SergeantPanda
6de85d3326 Repo is looking for amd64 2025-05-07 16:06:12 -05:00
SergeantPanda
e35041d92d More detection issues... 2025-05-07 15:55:21 -05:00
SergeantPanda
b6f6506030 Arch detection again... 2025-05-07 15:38:24 -05:00
SergeantPanda
3aaefd752b Change arch detection. 2025-05-07 15:32:29 -05:00
SergeantPanda
e1cb86f832 Install FFmpeg from John Van Sickle Git repo. 2025-05-07 15:19:39 -05:00
SergeantPanda
e63a66bf57 Change status to disabled when epg or m3u disabled. Change to idle when enabled. 2025-05-07 13:50:10 -05:00
SergeantPanda
b84d1cc61a Enhance m3u profile creation to show live regex results. 2025-05-07 13:24:41 -05:00
SergeantPanda
c198c4c590 Sort profiles in M3UProfiles component to prioritize default profiles and order others alphabetically by name 2025-05-07 13:11:46 -05:00
SergeantPanda
51784b9073 Fix for "will-change memory consumption is too high" error 2025-05-07 13:06:19 -05:00
SergeantPanda
95277679b0 Improve error handling and validation for playlist profiles in M3UProfiles component 2025-05-07 10:57:21 -05:00
SergeantPanda
72ed1ff4f4 Error handling for m3u profile. 2025-05-07 10:40:50 -05:00
SergeantPanda
88c27ac8ae Refactor Button component to use inline styles for width based on submission state 2025-05-07 10:31:23 -05:00
SergeantPanda
b0f26d96b2 Apparently width=100% isn't recommended for React, so switched it to "style={{ width: '100%' }}" 2025-05-07 09:39:05 -05:00
SergeantPanda
9ba03fdfaf Found more references to the problematic "fullWidth" and replaced with width="100%" 2025-05-07 09:31:55 -05:00
SergeantPanda
c98b73835b Replaced problematic "fullWidth" with width="100%" to get rid of DOM errors in the console. 2025-05-07 09:22:06 -05:00
SergeantPanda
f5f47005aa Move is active to bottom of form. 2025-05-07 09:13:13 -05:00
SergeantPanda
b5cdf3ceb2 Add information on disabling scheduling. 2025-05-07 09:10:56 -05:00
SergeantPanda
312eacf64c Add warning dialog when deleting epg or m3u. 2025-05-07 08:59:12 -05:00
SergeantPanda
7825019ed1 Add descriptions to form fields in EPG and M3U components for better clarity 2025-05-07 08:30:54 -05:00
SergeantPanda
7ead75f5b6 Add unique key prop to Button component in M3UGroupFilter for improved rendering 2025-05-06 20:54:59 -05:00
SergeantPanda
63b260698d Removed old websocket call. 2025-05-06 20:50:33 -05:00
SergeantPanda
992c238388 Add fullWidth property to New button in M3UProfiles component 2025-05-06 20:24:56 -05:00
SergeantPanda
4966bb61bc Enhance M3U fetching process with status updates and detailed progress messages 2025-05-06 19:18:10 -05:00
SergeantPanda
14ad165b8e
Merge pull request #91 from Dispatcharr/xc-refactor
Enhance XCClient with robust error handling and user agent management…
2025-05-06 18:48:50 -05:00
SergeantPanda
92d09eea3b Enhance XCClient with robust error handling and user agent management; improve M3U processing logic for better error reporting and validation. 2025-05-06 18:00:37 -05:00
OkinawaBoss
5be2b9ab56
Update README.md
Added documentation link
2025-05-06 12:51:39 -05:00
SergeantPanda
728ab45534 Fixes XC login. 2025-05-06 10:52:48 -05:00
SergeantPanda
3440dfa28f Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-05-06 10:34:22 -05:00
SergeantPanda
c4bdb53749 Allow edit groups if user is on a different screen. 2025-05-06 10:33:51 -05:00
SergeantPanda
e9c5a719b1 Better handling for notifying of groups needing to be edited. 2025-05-06 10:09:59 -05:00
SergeantPanda
c225edc3ec Fixes updated time not updating the local state. 2025-05-06 08:19:05 -05:00
SergeantPanda
452e2ee213 Fixes status not changing to fetching and parsing when those operations are occurring. 2025-05-06 08:13:59 -05:00
SergeantPanda
95ac700b3c Fixes all refreshes showing notification for pending setup. 2025-05-06 08:11:16 -05:00
dekzter
c6c1f2d7a4 properly handle data during the editChannel lifecycle 2025-05-06 08:47:23 -04:00
SergeantPanda
b3534833d7 Removed special character. 2025-05-05 20:28:31 -05:00
SergeantPanda
ae93bc2a1f Adds a "Pending Setup" status so the user knows they need to select groups before their streams are imported. 2025-05-05 19:50:20 -05:00
SergeantPanda
898224dc72 Fix potential issue during stream switches. 2025-05-05 17:02:28 -05:00
SergeantPanda
2b13d97196 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-05-05 10:08:32 -05:00
SergeantPanda
77d837eb7d Enhance 'Active' column in M3UTable: set minimum size, add sorting function, and align text to the left. 2025-05-05 09:25:05 -05:00
dekzter
995ae3008c honor m3u account user-agent in xc client 2025-05-05 10:24:37 -04:00
SergeantPanda
3a68a66e70 Changed sizing to not overflow text in to other rows. 2025-05-05 09:08:01 -05:00
SergeantPanda
5e7427c378
Merge pull request #88 from Dispatcharr/epg-table-refactor
Epg and m3u table refactor
2025-05-04 21:51:07 -05:00
SergeantPanda
82cee02962 Updated never if no date. 2025-05-04 21:46:30 -05:00
SergeantPanda
852993149f Sort by name and keep disabled at the bottom. 2025-05-04 21:43:30 -05:00
SergeantPanda
43903a98a6 Fixes rows not expanding for epg table as well. 2025-05-04 21:39:03 -05:00
SergeantPanda
086208cbab Fixes expanding rows. 2025-05-04 21:34:58 -05:00
SergeantPanda
89a86c159a Status messages working in status column again but row isn't expanding. 2025-05-04 21:20:03 -05:00
SergeantPanda
323045cef7 Standardized the style to match other tables. Size follows user setting. 2025-05-04 20:38:20 -05:00
SergeantPanda
30e82fb302 Lots of status message fixes. 2025-05-04 19:33:03 -05:00
SergeantPanda
b713b516b4 Convert m3u accounts to a format similar to epg for status updates. Renamed DB field last_error to last_message so we can use it for more messaging. Migration to change the updated_at field so it is not updated every time the m3u changes; it should only update updated_at when we successfully update the m3u. 2025-05-04 17:51:57 -05:00
SergeantPanda
5dac5858f2 Refactor updated_at handling in stream processing to only update changed streams for both xc accounts and m3u accounts. 2025-05-04 14:30:27 -05:00
SergeantPanda
a329ae0c87 Fixes not setting updated_at if only custom properties changed. 2025-05-04 14:22:02 -05:00
SergeantPanda
d8f0700ff6 Fix last_seen not being written to properly. 2025-05-04 13:59:06 -05:00
SergeantPanda
de130dcca0 Show initializing refresh when the refresh button is pressed. 2025-05-04 13:36:01 -05:00
SergeantPanda
d26dd5f6b3 Add better display in the m3u table for the user to see what we are doing. 2025-05-04 13:26:29 -05:00
SergeantPanda
4619d09efe Fixes logos not showing properly. 2025-05-04 12:35:44 -05:00
SergeantPanda
0697ef55ed
Merge pull request #87 from SergeantPanda/weberrors-websockets
Weberrors websockets
2025-05-04 12:08:19 -05:00
SergeantPanda
7418abb31c Attempt to fix websockets for production mode. 2025-05-04 11:56:15 -05:00
SergeantPanda
509f2be3a8 Fixes a lot of "You provided a value prop to a form field without an onChange handler" errors.
Reworks websocket connection to be more robust and notify user of connection errors.

Will retry if websocket connection dies.
2025-05-04 10:58:38 -05:00
SergeantPanda
d2c8389f74 Add component prop to Text to fix react error in dev. 2025-05-04 09:55:04 -05:00
SergeantPanda
d1e9de6cdb Fix react error when loading in dev:
Change Text component to div in DraggableRow for better semantic structure
2025-05-04 09:51:15 -05:00
SergeantPanda
4803a3605e Fixes user state not updating when editing channels. 2025-05-04 09:45:55 -05:00
SergeantPanda
42db972e07
Merge pull request #86 from Dispatcharr/tvc-guide-stationid
Tvc guide stationid
2025-05-04 09:25:12 -05:00
SergeantPanda
f3abbaeb27 Add gracenote ID when creating channel if available. Fixes m3u output if gracenote ID doesn't exist. 2025-05-04 09:23:02 -05:00
SergeantPanda
2243e2470b
Merge pull request #84 from rykr/GraceNoteId
Add support for Channels DVR tvc-guide-stationid
2025-05-04 08:06:06 -05:00
SergeantPanda
4030c8d774 Show file path if local file. 2025-05-03 19:28:36 -05:00
Reggie Burnett
fcd1722d3a Add support for Channels DVR tvc-guide-stationid 2025-05-03 19:24:51 -05:00
SergeantPanda
c89c962374 Updated the look and reordered all columns. 2025-05-03 19:17:15 -05:00
SergeantPanda
0493dd37a1 Re-order columns 2025-05-03 19:07:30 -05:00
SergeantPanda
4595cea400 Status update overhaul for epg files. 2025-05-03 18:39:59 -05:00
SergeantPanda
9f52b22432 Show all HTTP errors to the user during epg update. 2025-05-03 16:26:30 -05:00
SergeantPanda
826883424c Completely remove size limits for file uploads. 2025-05-03 15:05:40 -05:00
SergeantPanda
e667299806 Stream tables and Channels table now load at the same time instead of streams waiting for all logos to download before loading. 2025-05-03 15:03:16 -05:00
SergeantPanda
c41d948fc9 Reset page number to 1 when changing filters to avoid page errors. 2025-05-03 14:42:50 -05:00
SergeantPanda
e65fd59a49 Add confirmations for deleting channels. 2025-05-03 14:21:48 -05:00
SergeantPanda
a387130d23 Fixes channel creation form resetting if also creating a group.
Also fixes group creation form not closing when saving.
2025-05-03 13:28:01 -05:00
dekzter
693d33e18d merged in xtream branch 2025-05-03 13:52:42 -04:00
SergeantPanda
104bad8f04 Added last seen as an api output for streams. 2025-05-03 12:38:45 -05:00
SergeantPanda
dbb3eb1664 Better look for hovering. 2025-05-03 10:47:40 -05:00
SergeantPanda
d4ae98900f Fixes rows not having hover effect if they are shaded red. 2025-05-03 10:41:18 -05:00
SergeantPanda
5fa9a1e64d Fixes bug where you may click delete on one channel and another deletes. Also adds shading to channels with no streams added. 2025-05-03 10:20:31 -05:00
dekzter
1a0d065eca ui settings with configurable table size, added setting to set / rehash stream hashes 2025-05-03 08:01:22 -04:00
SergeantPanda
3b4edde90f Fixes wrong edit and stream previews playing when filtering.
Include dependencies in useCallback for onEdit and onDelete, and update dependency array in StreamsTable for useTable
2025-05-02 21:56:30 -05:00
SergeantPanda
2fff015206 Enhance refresh_single_m3u_account to track completed tasks and ensure DB transactions are committed before cleanup 2025-05-02 17:33:51 -05:00
dekzter
3de768d954 fixed conflicting migration 2025-05-02 15:38:26 -04:00
SergeantPanda
35579e79fb Enhance FloatingVideo component with loading and error handling overlays 2025-05-02 13:02:34 -05:00
SergeantPanda
5a844e4aaa Fixed channels not showing if they were mapped to the same epg id. 2025-05-02 12:16:38 -05:00
SergeantPanda
307728b7bc Improve file existence checks and error handling in parse_channels_only and parse_programs_for_source 2025-05-02 11:59:38 -05:00
SergeantPanda
98ef0d49f7 Add file existence checks and error handling in parse_channels_only 2025-05-02 11:23:36 -05:00
SergeantPanda
d3eb00fdf4 Add null check for channel name in notifications 2025-05-02 11:00:09 -05:00
SergeantPanda
e67f314656 Better error handling for web video player. 2025-05-02 10:22:19 -05:00
SergeantPanda
246ab0a22d Fixes console error when playing a stream preview due to channelID being null. 2025-05-02 10:02:35 -05:00
dekzter
06879ed8ef merged in dev 2025-05-02 09:33:09 -04:00
dekzter
e81b6e3189 option to add epg source with xc account 2025-05-02 09:30:24 -04:00
SergeantPanda
6f1bae8195 Fixes stream preview in stream table. 2025-05-02 08:23:11 -05:00
SergeantPanda
0e54062c73 Added drivers for hardware acceleration. 2025-05-01 17:42:23 -05:00
dekzter
5bf5e1e1c2 updated last_seen of existing streams 2025-05-01 17:30:19 -04:00
SergeantPanda
d26944a7a5 Add stale_stream_days field to M3UAccount model and update related logic
- Introduced stale_stream_days field to M3UAccount to specify the retention period for streams.
- Updated cleanup_streams task to remove streams not seen within the specified stale_stream_days.
- Enhanced M3U form to include stale_stream_days input for user configuration.
2025-05-01 16:01:08 -05:00
GitHub Actions
4cb2cb7b20 Release v0.4.1 2025-05-01 18:49:59 +00:00
SergeantPanda
78b32ca639
Release Notes - v0.4.1
## Performance Improvements
- Optimized uWSGI configuration settings for better server performance
- Improved asynchronous processing by converting additional timers to gevent
## Network Improvements
- Enhanced EPG (Electronic Program Guide) downloading with proper user agent headers
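As a rough illustration of the user-agent change, only the standard library is needed; the helper name `build_epg_request` and the agent string below are illustrative assumptions, not Dispatcharr's actual code:

```python
import urllib.request

def build_epg_request(url, user_agent="Dispatcharr/0.4 (illustrative)"):
    # Attach an explicit User-Agent so upstream EPG providers that reject
    # default library agents still serve the feed.
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

req = build_epg_request("http://example.invalid/epg.xml")
print(req.get_header("User-agent"))
```

Note that `urllib` stores header keys capitalized, hence `User-agent` in the lookup.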
## UI Improvements
- Fixed issue with "add streams to channel" functionality to correctly follow disabled state logic
2025-05-01 13:48:53 -05:00
SergeantPanda
90c1c3d2ed uwsgi config tuning. 2025-05-01 13:10:49 -05:00
SergeantPanda
78fc7d9f2b Use proper user agents when downloading epgs. And a little more robust user agent selection for m3u downloads. 2025-05-01 12:42:48 -05:00
dekzter
a2055827ce Merge remote-tracking branch 'origin/dev' into xtream 2025-05-01 13:02:28 -04:00
SergeantPanda
b3c4ff8f2d Finding more timers that can be converted to gevents. 2025-05-01 10:43:07 -05:00
dekzter
091d9a6823 immediately prompt for group filtering when using an XC account 2025-05-01 11:22:38 -04:00
SergeantPanda
e8ee59cf00 Not sure why it didn't push. 2025-05-01 09:31:26 -05:00
SergeantPanda
c11ce048c7 Disable monkey patching. 2025-05-01 09:24:16 -05:00
dekzter
c5dd351bf1 Merge remote-tracking branch 'origin/dev' into xtream 2025-05-01 10:15:42 -04:00
SergeantPanda
b811a3d45b More sleep events. 2025-05-01 09:05:51 -05:00
SergeantPanda
a86ae715b9 Fixes add streams to channel to follow correct logic of being disabled. 2025-04-30 21:11:41 -05:00
GitHub Actions
c6c5662472 Release v0.4.0 2025-05-01 01:09:08 +00:00
SergeantPanda
c65b431eba
Rolled back after failed release. 2025-04-30 20:05:37 -05:00
GitHub Actions
8219773a68 Release v0.6.0 2025-05-01 00:54:35 +00:00
SergeantPanda
4a00d2edd5
Merge pull request #81 from Dispatcharr/dev
Switch to Yarn for building frontend
2025-04-30 19:53:43 -05:00
SergeantPanda
e975a13c0f Another attempt at using yarn. 2025-04-30 19:30:53 -05:00
GitHub Actions
da1fae89a9 Release v0.5.0 2025-05-01 00:30:25 +00:00
SergeantPanda
7a67479e38 Back to NPM but use ignore scripts and rebuild. 2025-04-30 19:21:22 -05:00
SergeantPanda
3381cb8695 Switch to yarn for building frontend. 2025-04-30 19:14:18 -05:00
SergeantPanda
2b44c122e7 Update version display to include timestamp instead of build 2025-04-30 18:46:10 -05:00
GitHub Actions
a6087d1010 Release v0.4.0 2025-04-30 23:27:49 +00:00
SergeantPanda
0f915b71c1
v0.4.0
## What's New in This Release

This update brings significant improvements to performance, user experience, and stability across the application. We've revamped tables for better responsiveness, enhanced streaming capabilities, and fixed numerous bugs.

### 🔧 Major Improvements

- **Table System Rewrite**: Completely refactored channel and stream tables for dramatically improved performance with large datasets
- **Manual Stream Switching**: Added ability to manually switch between streams for a channel
- **EPG Auto-Match Notifications**: Users now receive feedback about how many matches were found during auto-matching
- **Improved Concurrency**: Replaced time.sleep with gevent.sleep for better performance when handling multiple streams
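The `gevent.sleep` change can be illustrated with a stdlib analogy (assumption: `asyncio.sleep` stands in for `gevent.sleep` here, since both yield cooperatively instead of blocking the whole thread the way `time.sleep` does):

```python
import asyncio
import time

async def stream_worker(name, delay):
    # Cooperative sleep: control returns to the event loop, so other
    # workers make progress while this one waits.
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.monotonic()
    names = await asyncio.gather(*(stream_worker(f"stream-{i}", 0.05) for i in range(10)))
    return names, time.monotonic() - start

names, elapsed = asyncio.run(main())
# Ten 0.05 s waits overlap instead of serializing into 0.5 s
print(len(names), round(elapsed, 2))
```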

### 🖥️ UI/UX Enhancements

- Added URL copy buttons for stream and channel URLs
- Fixed spacing and padding in EPG and M3U tables for better readability on smaller displays
- Added informative tooltips throughout the interface, including stream profiles and user-agent details
- Improved table interactions:
  - Restored alternating row colors and hover effects
  - Added shift-click support for multiple row selection
  - Preserved drag-and-drop functionality
- Adjusted logo display to prevent layout shifts with different sized logos
- Improved sticky headers in tables

### 📺 Streaming & Channel Management

- Fixed stream ordering in channel selection
- Added M3U profile name to stream names for better identification
- Fixed channel form not updating some properties after saving
- Improved stream URL handling for search/replace patterns
- Enhanced stream lock management for better reliability
- Added stream name to channel status for better visibility
- Properly track current stream ID during stream switches
- Fixed issue with setting logos to default

### 📊 Stats & Monitoring

- Added display of connected time for each client
- Fixed issues with channel statistics randomly not working
- Added current M3U profile information to stats
- Added better logging for which channel clients are getting chunks from

### 🔄 EPG & M3U Processing

- Improved EPG cache handling and cleanup of old cache files
- Corrected content type for M3U file (using m3u instead of m3u8)
- Fixed logo URL handling in M3U generation
- Enhanced tuner count calculation to include only active M3U accounts

### 🐛 Bug Fixes

- Fixed channel creation from streams
- Fixed channel group saving
- Improved error handling throughout the application
- Fixed bugs in deleting stream profiles
- Resolved mimetype detection issues
- Fixed form display issues
- Added proper requerying after form submissions and item deletions
- Fixed bug overwriting tvg-id when loading TV Guide
- Fixed bug that prevented large m3u's and epg's from uploading
- Fixed typo in Stream Profile header column for Description (Thank you LoudSoftware for the PR!)
- Fixed typo in m3u input processing (tv-chno instead of tvg-chno) (Thank you @www2a for the PR!)

## Technical Improvements

- Increased thread stack size in uwsgi configuration
- Changed proxy to use uwsgi socket 
- Added build timestamp to version information
- Reduced excessive logging during M3U/EPG file importing
- Improved store variable handling to increase application efficiency

---

**The Dispatcharr Team** – Happy Streaming! 🚀
2025-04-30 18:27:13 -05:00
SergeantPanda
a50a7372c1 Removed unnecessary elif for invalid flag. 2025-04-30 18:22:27 -05:00
SergeantPanda
5475177cf4
Merge pull request #77 from www2000/dev-m3u-fix
Fix channel issues that I have with m3u files coming from tvheadend.
2025-04-30 18:19:26 -05:00
SergeantPanda
2943f84e46
Merge pull request #74 from LoudSoftware/patch-1
Fixed minor typo in UserAgentsTable.jsx
2025-04-30 18:01:57 -05:00
SergeantPanda
6adda8209f Ensure cache directory exists before saving EPG data 2025-04-30 17:54:48 -05:00
SergeantPanda
c058c4ed10 Fixes spacing and padding in epg and m3u tables. 2025-04-30 17:03:36 -05:00
SergeantPanda
e8355a78c6 Fetch channels when auto-match is complete. 2025-04-30 16:51:48 -05:00
dekzter
79392bb129 fixed channel form not updating some properties after saving 2025-04-30 16:58:16 -04:00
dekzter
a9ea30d862 channel number sorting restored, reset stream table default page size 2025-04-30 16:36:16 -04:00
dekzter
91f5e2ad7c fixed stream URL sample for search / replace patterns 2025-04-30 16:15:00 -04:00
SergeantPanda
4f0c8333c6 Add return statement in get_cache_file method of EPGSource model 2025-04-30 14:42:32 -05:00
dekzter
3ea8c05466 Merge remote-tracking branch 'origin/dev' into xtream 2025-04-30 15:10:59 -04:00
SergeantPanda
80fe7e02f8 Added missing _attempt_health_recovery. 2025-04-30 13:43:01 -05:00
SergeantPanda
423020861c Replace time.sleep with gevent.sleep for improved concurrency 2025-04-30 13:32:16 -05:00
SergeantPanda
b7c543b5f5 Use gevent sleep instead of sleep. 2025-04-30 12:48:50 -05:00
SergeantPanda
bdb8d326a5 Add better logging for which channel clients are getting chunks from. 2025-04-30 12:17:11 -05:00
Jean-Paul Acneaux
d61de87fff Fix channel issues that I have with m3u files coming from tvheadend.
And add tvg-chno tag for finding channel numbers.
2025-04-30 05:10:31 +02:00
SergeantPanda
418bf01449 Notify user of how many matches auto-match found.
Add batch EPG association endpoint and improve EPG matching logic

- Implemented a new API endpoint to associate multiple channels with EPG data in a single request.
- Enhanced the EPG matching process to normalize TVG IDs and log relevant information.
- Updated frontend to handle batch EPG associations efficiently, falling back to legacy methods when necessary.
2025-04-29 18:13:42 -05:00
SergeantPanda
2f23909bed Fixed bug overwriting tvg-id when loading TV Guide. 2025-04-29 15:13:15 -05:00
SergeantPanda
d27e4b7e8a Release stream lock before returning url if using redirect profile. 2025-04-29 14:14:40 -05:00
SergeantPanda
9be42ce532 Don't select text when shift is held 2025-04-29 12:47:16 -05:00
SergeantPanda
7503ef9dd2 Allows holding shift and selecting rows. 2025-04-29 12:20:26 -05:00
SergeantPanda
342797d2ec Add key prop to row Box in CustomTableBody for improved rendering 2025-04-29 11:10:26 -05:00
SergeantPanda
9b443a0a3e Adds m3u profile name to stream name. 2025-04-29 09:50:57 -05:00
SergeantPanda
ee2c2194f8 Mimetype guessing as a fallback for remote images. 2025-04-28 20:36:09 -05:00
SergeantPanda
4cf4a0d68d Reverted unintended change. 2025-04-28 20:26:54 -05:00
SergeantPanda
06d3066783 Improve logo handling in LogoViewSet: set default content type and add Content-Disposition for inline display. 2025-04-28 20:22:44 -05:00
SergeantPanda
482803b241 Removed unnecessary logs. 2025-04-28 17:29:27 -05:00
SergeantPanda
cd1da5a61c Added a new channel model to update m3u profile counts and utilize it during stream switches. 2025-04-28 17:25:03 -05:00
SergeantPanda
b439eb810c Cleanup channel lock instead of stream lock. 2025-04-28 15:05:58 -05:00
Nicolas Znamenski
248ef90629
Fixed minor typo 2025-04-28 14:08:04 -04:00
SergeantPanda
0b8f20dc22 Fixes not being able to set logo to default. 2025-04-27 19:19:48 -05:00
SergeantPanda
164f0cdbb5 Increase thread stack size in uwsgi configuration for improved performance 2025-04-27 19:15:00 -05:00
SergeantPanda
07edf270fb Refactor CI workflow to update version.py with build timestamp in Dockerfile 2025-04-27 18:53:39 -05:00
SergeantPanda
a81daaea44 IDK WHY THAT KEEPS GETTING DELETED 2025-04-27 18:32:01 -05:00
SergeantPanda
202ef265de Fix sed command delimiter for updating timestamp in version.py 2025-04-27 18:30:15 -05:00
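A standalone sketch of the delimiter problem this commit addresses (the placeholder line and timestamp format are hypothetical): with `/` as the sed delimiter, replacement text containing `/` breaks the expression, so an alternate delimiter such as `|` is used.

```shell
# A timestamp with '/' characters would terminate a 's/old/new/' expression early
TIMESTAMP="2025/04/27 18:30:15"
LINE="__TIMESTAMP__ = 'PLACEHOLDER'"
# Using '|' as the delimiter keeps the slashes in the replacement literal
echo "$LINE" | sed "s|PLACEHOLDER|${TIMESTAMP}|"
```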
SergeantPanda
cb62a13c40 Attempt at fixing timestamp not being added to version. 2025-04-27 18:27:33 -05:00
SergeantPanda
a8a6322e30 Missed closing if statement. 2025-04-27 17:50:25 -05:00
SergeantPanda
c049e48c08 Use timestamp instead of build number increase. 2025-04-27 17:46:27 -05:00
GitHub Actions
768ca0e353 Increment build number to 36 [skip ci] 2025-04-27 20:53:27 +00:00
SergeantPanda
d59c8a9e33 Properly track current stream id during stream switches. 2025-04-27 15:52:10 -05:00
GitHub Actions
9e99de77ec Increment build number to 35 [skip ci] 2025-04-27 19:51:37 +00:00
SergeantPanda
9c112e6d8e Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-27 14:51:18 -05:00
SergeantPanda
c51f3136dd Add tooltip for stream profile and add clarity to user-agent. 2025-04-27 14:51:13 -05:00
GitHub Actions
77c92d52bd Increment build number to 34 [skip ci] 2025-04-27 19:43:43 +00:00
SergeantPanda
2b83515405 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-27 14:43:20 -05:00
SergeantPanda
88f27d62f1 Adds current m3u profile to stats. 2025-04-27 14:43:09 -05:00
dekzter
1ccf24fe5f Fixed caching path for non-xc only 2025-04-27 11:08:12 -04:00
dekzter
f295ee219c removed db index here, I don't think it's needed 2025-04-27 10:54:39 -04:00
dekzter
bfaa52ea13 fixed username field 2025-04-27 10:38:30 -04:00
dekzter
3054cf2ae9 initial xtreamcodes support 2025-04-27 10:32:29 -04:00
GitHub Actions
e02e1458fa Increment build number to 33 [skip ci] 2025-04-26 13:37:06 +00:00
SergeantPanda
d3a7dbca10 Imported missing os 2025-04-26 08:36:46 -05:00
GitHub Actions
7cfe7c2998 Increment build number to 32 [skip ci] 2025-04-26 13:29:51 +00:00
SergeantPanda
b2aad67921 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-26 08:29:35 -05:00
SergeantPanda
81fecde3b5 Add stream name to channel status. 2025-04-26 08:29:18 -05:00
GitHub Actions
ff0deffe36 Increment build number to 31 [skip ci] 2025-04-25 21:56:53 +00:00
SergeantPanda
8b057818b7 Adjusted logo box so data didn't shift with different size logos. 2025-04-25 16:55:30 -05:00
GitHub Actions
a2b499d453 Increment build number to 30 [skip ci] 2025-04-25 20:50:04 +00:00
SergeantPanda
b6fe53ba2b Removed no buffer for uwsgi. Caused issues. 2025-04-25 15:49:42 -05:00
GitHub Actions
00a8609b35 Increment build number to 29 [skip ci] 2025-04-25 20:32:59 +00:00
SergeantPanda
575d696c35 Changed proxy to use uwsgi socket and increased client_max_body_size to 128MB. 2025-04-25 15:32:34 -05:00
GitHub Actions
e4cb4bd1d2 Increment build number to 28 [skip ci] 2025-04-25 17:47:25 +00:00
dekzter
9c6e19fb3b clean up old cache files when we refresh epg from remote source 2025-04-25 13:47:09 -04:00
dekzter
3dcc4902fa fixed epg cache 2025-04-25 13:47:09 -04:00
GitHub Actions
530ac8727d Increment build number to 27 [skip ci] 2025-04-25 16:00:24 +00:00
SergeantPanda
4c3ff1cdbd Fixes streams being in incorrect order (for real this time) 2025-04-25 10:59:56 -05:00
GitHub Actions
7d0ea15924 Increment build number to 26 [skip ci] 2025-04-25 14:47:10 +00:00
dekzter
d7b7a32396 fixed bad library ref 2025-04-25 10:46:56 -04:00
GitHub Actions
133e2c6787 Increment build number to 25 [skip ci] 2025-04-25 13:46:57 +00:00
SergeantPanda
83bd0b1fe0 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-25 08:46:36 -05:00
SergeantPanda
e3100e2751 Use correct content type for m3u file instead of m3u8 2025-04-25 08:46:21 -05:00
GitHub Actions
cafb588e4b Increment build number to 24 [skip ci] 2025-04-25 13:42:26 +00:00
dekzter
5ac27043e4 fixed mimetypes 2025-04-25 09:42:07 -04:00
GitHub Actions
1924b71dea Increment build number to 23 [skip ci] 2025-04-25 13:13:23 +00:00
dekzter
d64215b5a6 another attempt 2025-04-25 09:13:06 -04:00
GitHub Actions
64b7b8ab95 Increment build number to 22 [skip ci] 2025-04-25 12:20:04 +00:00
dekzter
51e3c7cc51 fixed channelgroups value 2025-04-25 08:19:47 -04:00
GitHub Actions
8bdc027c93 Increment build number to 21 [skip ci] 2025-04-25 01:55:55 +00:00
SergeantPanda
d15ff0d7c5 Added tooltips. 2025-04-24 20:55:28 -05:00
SergeantPanda
44ea86e59a Show connected time for each client. 2025-04-24 20:30:04 -05:00
GitHub Actions
6f82eb4274 Increment build number to 20 [skip ci] 2025-04-25 01:08:22 +00:00
SergeantPanda
f573173c3e Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-24 20:07:43 -05:00
SergeantPanda
1a1c5dea9e Uses correct API to select specific stream ID. 2025-04-24 20:07:36 -05:00
GitHub Actions
6873164478 Increment build number to 19 [skip ci] 2025-04-25 00:36:34 +00:00
SergeantPanda
c61c7573bd Show the correct currently connected stream and display the available streams to switch to in the correct order. 2025-04-24 19:35:34 -05:00
SergeantPanda
f6905f26cd Manual stream switching is working. 2025-04-24 19:23:38 -05:00
SergeantPanda
059254dcd8 Fixes logo not showing actual channel logo. 2025-04-24 19:07:02 -05:00
SergeantPanda
3837ceab5e Hopefully fixes stats randomly not working. 2025-04-24 18:52:52 -05:00
GitHub Actions
c7caf1e1a3 Increment build number to 18 [skip ci] 2025-04-24 23:27:49 +00:00
SergeantPanda
b004c8de52 Fixes channel stats. 2025-04-24 18:27:26 -05:00
GitHub Actions
1c0c9c41cb Increment build number to 17 [skip ci] 2025-04-24 23:15:10 +00:00
SergeantPanda
58a121e0b6 Add new packages to package-lock 2025-04-24 18:14:50 -05:00
GitHub Actions
e120c11a45 Increment build number to 16 [skip ci] 2025-04-24 23:01:52 +00:00
dekzter
d28e3f4491 Attempting to fix blank screen from form 2025-04-24 19:01:33 -04:00
GitHub Actions
3cc7366301 Increment build number to 15 [skip ci] 2025-04-24 22:15:32 +00:00
SergeantPanda
70265455cf Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-24 17:15:15 -05:00
SergeantPanda
5be46c6e36 Greatly reduced number of logs for m3u/epg file importing/monitoring. 2025-04-24 17:15:00 -05:00
GitHub Actions
4340f82d38 Increment build number to 14 [skip ci] 2025-04-24 22:06:58 +00:00
dekzter
c5f7b183f1 update all row IDs in the store so stream table changes reflect in channel table 2025-04-24 18:06:41 -04:00
GitHub Actions
8625f2e524 Increment build number to 13 [skip ci] 2025-04-24 21:58:29 +00:00
dekzter
8e6f8d59e0 fixed channel creation from stream 2025-04-24 17:58:11 -04:00
GitHub Actions
cb18c4266c Increment build number to 12 [skip ci] 2025-04-24 21:54:47 +00:00
dekzter
bcd302ea3f fixed reference to channel streams 2025-04-24 17:54:29 -04:00
GitHub Actions
75a0262f5a Increment build number to 11 [skip ci] 2025-04-24 21:44:47 +00:00
dekzter
2de165833c missing packages 2025-04-24 17:44:32 -04:00
GitHub Actions
b16c9169e0 Increment build number to 10 [skip ci] 2025-04-24 21:42:40 +00:00
dekzter
cb5308fff5 bumped 2025-04-24 17:42:22 -04:00
GitHub Actions
874ddc8da8 Increment build number to 8 [skip ci] 2025-04-24 21:41:22 +00:00
SergeantPanda
70207ac711 Build bump 2025-04-24 16:41:01 -05:00
GitHub Actions
812efcd56a Increment build number to 6 [skip ci] 2025-04-24 21:14:43 +00:00
dekzter
8fe83fd9ee channel streams optimizations, bug fixes in tables, parallel api calls for paginated data fetching 2025-04-24 17:14:25 -04:00
dekzter
1d63075660 fixed sticky header 2025-04-24 17:14:25 -04:00
GitHub Actions
a6f2bbaa25 Increment build number to 5 [skip ci] 2025-04-24 17:33:34 +00:00
dekzter
202fbeec7a Fixing table data population 2025-04-24 13:33:19 -04:00
GitHub Actions
54094433ab Increment build number to 4 [skip ci] 2025-04-24 17:07:27 +00:00
dekzter
dd9c37ab07 merged in table-rewrite branch 2025-04-24 13:07:05 -04:00
dekzter
abab931711 tooltips, fixed sticky header 2025-04-24 11:51:47 -04:00
dekzter
29d2db0f8e testing out virtualized, fixed some more bugs 2025-04-24 11:20:36 -04:00
dekzter
5fd3d95a14 fixed copy button 2025-04-24 07:59:14 -04:00
dekzter
d9787ea075 removed unused variables, pass in all row ids for state management 2025-04-24 07:02:21 -04:00
dekzter
0dc62fb039 refactored streams table as well, broke out custom table into smaller components 2025-04-23 18:04:00 -04:00
dekzter
03f6c77391 alternating row colors are restored, row hover restored, fixed drag and drop functionality in streams table 2025-04-23 13:02:01 -04:00
dekzter
bdc36bf5a0 restored channel group filter 2025-04-23 11:22:54 -04:00
dekzter
4506280400 restored support for adding streams from streams table 2025-04-23 11:20:00 -04:00
dekzter
260b57576a initial commit 2025-04-23 11:02:28 -04:00
dekzter
9f15c99c01 better error handling 2025-04-23 11:02:06 -04:00
dekzter
3e2f91abf8 hopefully finalizing table rewrite 2025-04-23 11:02:00 -04:00
dekzter
5eae8bd603 attempt to use localstorage for saving preferences 2025-04-23 10:50:23 -04:00
GitHub Actions
0846b9b42c Increment build number to 3 [skip ci] 2025-04-23 13:39:06 +00:00
SergeantPanda
8f29707a0b Fix logo URL handling in M3U generation to convert filesystem paths to web URLs 2025-04-23 08:38:29 -05:00
dekzter
40a86203d1 copy buttons for stream and channel urls 2025-04-21 18:33:01 -04:00
dekzter
21c67b999d fixing when to render onboarding channel section 2025-04-21 16:23:55 -04:00
dekzter
4df796ac7f requery on channel delete 2025-04-21 15:59:29 -04:00
dekzter
ebf514cbed fixed channel group saving 2025-04-21 08:21:21 -04:00
dekzter
fb56e4d3f5 Requery on form submission 2025-04-20 10:44:34 -04:00
dekzter
3fc37f8f4f cache url for logos in m3u and epg 2025-04-20 10:06:08 -04:00
dekzter
c4f470e8f7 Fixed bug in deleting stream profiles 2025-04-20 09:54:16 -04:00
dekzter
656c9e9e14 Fixed refresh token bug 2025-04-20 09:21:53 -04:00
dekzter
a199eeab92 bug fixes, display 'no data' on empty streams table 2025-04-19 18:04:48 -04:00
dekzter
eb9419ddd2 proper handling of store variables so we now aren't listening on any change from the state of a store 2025-04-19 08:49:04 -04:00
dekzter
ccdb8ab00d more table bug fixes, query optimizations, re-added channel expansion stream table with reworked drag-and-drop 2025-04-19 08:37:43 -04:00
dekzter
8d1bfcb975 Merge remote-tracking branch 'origin/dev' into table-rewrite 2025-04-19 08:36:47 -04:00
GitHub Actions
ca96921d24 Increment build number to 2 [skip ci] 2025-04-18 22:09:17 +00:00
SergeantPanda
5e0f81522c Enhance tuner count calculation to include only active M3U accounts 2025-04-18 17:08:47 -05:00
GitHub Actions
641a543e78 Increment build number to 1 [skip ci] 2025-04-18 14:22:25 +00:00
GitHub Actions
5220a2fe00 Release v0.3.3 2025-04-18 13:42:13 +00:00
SergeantPanda
d75a7ec84a
Release 0.3.3
# Release Notes

## Bug Fixes
- Fixed an issue with dummy EPG calculating hour values above 24, ensuring times remain within a valid 24-hour format
- Fixed auto import functionality to properly process old files that hadn't been imported yet, rather than ignoring them
2025-04-18 08:38:47 -05:00
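The dummy-EPG hour fix above amounts to modular arithmetic; a hypothetical sketch (function name and shape assumed, not the project's actual code):

```python
def wrap_hour(start_hour, offset_hours):
    # Keep a computed hour inside the valid 0-23 range instead of
    # letting it run past 24 (e.g. 22:00 + 5h -> 03:00, not 27:00).
    return (start_hour + offset_hours) % 24

print(wrap_hour(22, 5))  # → 3
```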
dekzter
b74b388f7d finishing up table refactor for channels 2025-04-17 17:57:33 -04:00
dekzter
c6d83bf1f9 Merge remote-tracking branch 'origin/dev' into table-rewrite 2025-04-17 17:56:55 -04:00
dekzter
153e5824a4 refactor, using custom table, efficient so far 2025-04-17 16:05:24 -04:00
GitHub Actions
84287b948f Increment build number to 3 [skip ci] 2025-04-17 08:34:57 +00:00
SergeantPanda
e57c712943 Fixes dummy epg calculating hours above 24 2025-04-17 03:34:25 -05:00
GitHub Actions
403eb3654e Increment build number to 2 [skip ci] 2025-04-17 05:40:53 +00:00
SergeantPanda
e7ac7bf5db Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-17 00:40:36 -05:00
SergeantPanda
9b76358320 Fixes auto import ignoring old files that hadn't been imported yet. 2025-04-17 00:40:00 -05:00
GitHub Actions
6f4309d24a Increment build number to 1 [skip ci] 2025-04-17 04:01:00 +00:00
dekzter
6a0ce574b0 rewrite with tanstack table 2025-04-16 13:06:51 -04:00
GitHub Actions
526fafb087 Release v0.3.2 2025-04-16 14:46:30 +00:00
SergeantPanda
4341840e51
Merge pull request #62 from Dispatcharr/dev
Dev
2025-04-16 09:45:33 -05:00
GitHub Actions
e44ec6b951 Increment build number to 2 [skip ci] 2025-04-16 14:43:17 +00:00
SergeantPanda
1297ca4c26 bump build 2025-04-16 09:42:56 -05:00
SergeantPanda
2d04158b21 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-16 09:42:26 -05:00
SergeantPanda
5511cf64da Merge branch 'main' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-16 09:41:22 -05:00
GitHub Actions
b5a5a6868f Increment build number to 7 [skip ci] 2025-04-16 14:41:07 +00:00
SergeantPanda
c5748346eb Fixes stream ordering. 2025-04-16 09:40:29 -05:00
GitHub Actions
3a975f5f7f Release v0.3.1 2025-04-16 01:17:11 +00:00
SergeantPanda
a032051c95
Merge pull request #60 from Dispatcharr/dev
Dev
2025-04-15 20:16:35 -05:00
GitHub Actions
336ccac0d5 Increment build number to 6 [skip ci] 2025-04-16 01:04:19 +00:00
SergeantPanda
b09a4baae4 Add dummy epg channels to tv guide. 2025-04-15 20:03:55 -05:00
GitHub Actions
27b0cc0e66 Increment build number to 5 [skip ci] 2025-04-16 00:46:53 +00:00
SergeantPanda
cc4b0d557f Fixes not being able to set dummy epg. 2025-04-15 19:46:23 -05:00
GitHub Actions
70c9ffe195 Increment build number to 4 [skip ci] 2025-04-16 00:26:17 +00:00
SergeantPanda
3a846a7958 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-15 19:25:58 -05:00
SergeantPanda
29b10607ab Added a key to the nav links on the sidebar to get rid of a DOM error when loading the web UI. 2025-04-15 19:25:50 -05:00
GitHub Actions
7c906dab74 Increment build number to 3 [skip ci] 2025-04-16 00:20:04 +00:00
SergeantPanda
9b66dd1139 Fix channel numbers not saving. 2025-04-15 19:19:41 -05:00
dekzter
af638326e1 attempt at table rewrite for efficient virtualized table 2025-04-15 13:48:06 -04:00
GitHub Actions
4a6db3f414 Increment build number to 2 [skip ci] 2025-04-15 17:45:06 +00:00
SergeantPanda
b4ddf08e4a Fixes: notifications that don't give good error messages, EPGs not refreshing when linking an EPG to a channel. 2025-04-15 12:22:00 -05:00
SergeantPanda
a7e5090c72
Update release.yml
Removed armv7 tag as it's not supported.
2025-04-15 11:19:13 -05:00
GitHub Actions
7f35278a18 Increment build number to 1 [skip ci] 2025-04-15 15:14:23 +00:00
GitHub Actions
60d9133498 Release v0.3.0 2025-04-15 15:10:17 +00:00
SergeantPanda
8bb0487d1b
Merge pull request #59 from Dispatcharr/dev
0.3.0
2025-04-15 10:09:12 -05:00
GitHub Actions
8a2245bded Increment build number to 8 [skip ci] 2025-04-15 15:08:50 +00:00
SergeantPanda
cccee4b4bf
Merge branch 'main' into dev 2025-04-15 10:08:36 -05:00
SergeantPanda
bb7bac48cf Fixed broken links to compose files (again) [skip ci] 2025-04-15 09:31:54 -05:00
GitHub Actions
8b89f45510 Increment build number to 7 [skip ci] 2025-04-15 13:29:12 +00:00
SergeantPanda
1f181f052c Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-15 08:28:54 -05:00
SergeantPanda
14621598f6 Added url validation for redirect profile. 2025-04-15 08:28:47 -05:00
dekzter
eb48083cce channels pagination 2025-04-15 09:03:24 -04:00
GitHub Actions
d165437129 Increment build number to 6 [skip ci] 2025-04-15 12:44:43 +00:00
SergeantPanda
73d0d89422 Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into dev 2025-04-15 07:44:26 -05:00
SergeantPanda
5bb6539586 Fixes multiple streams in a row being dead. 2025-04-15 07:44:08 -05:00
SergeantPanda
02b5fb6fc0 Changed logging level of channel state checks for client. 2025-04-14 21:04:33 -05:00
GitHub Actions
20baa0ddcf Increment build number to 5 [skip ci] 2025-04-15 02:03:57 +00:00
SergeantPanda
60fd5afd94 More robust stream switches. Has client wait if in switching state. 2025-04-14 21:03:33 -05:00
GitHub Actions
95a51d71a0 Increment build number to 4 [skip ci] 2025-04-14 23:11:45 +00:00
SergeantPanda
90cc65eb7d Increase workers and threads for uwsgi. 2025-04-14 18:11:20 -05:00
dekzter
85835dbf08 attempting to optimize virtualized rows 2025-04-14 09:44:11 -04:00
GitHub Actions
55fce02469 Increment build number to 3 [skip ci] 2025-04-14 01:59:17 +00:00
SergeantPanda
2921588b23 Add custom stream count to calculation. 2025-04-13 20:58:46 -05:00
GitHub Actions
e936b56d3b Increment build number to 2 [skip ci] 2025-04-14 01:46:18 +00:00
SergeantPanda
cb891461bf Dynamically create TunerCount for HDHR based on profile max connections. Minimum of 2, unlimited is 10. 2025-04-13 20:31:10 -05:00
Dispatcharr
5670cb66b4 Added install script 2025-04-13 13:00:16 -05:00
OkinawaBoss
32da82adfd
Update README.md
Fixed docker links
2025-04-13 12:33:52 -05:00
GitHub Actions
d0073637ef Increment build number to 1 [skip ci] 2025-04-13 15:39:34 +00:00
378 changed files with 87386 additions and 10086 deletions


@@ -11,6 +11,10 @@
**/.toolstarget
**/.vs
**/.vscode
**/.history
**/media
**/models
**/static
**/*.*proj.user
**/*.dbmdl
**/*.jfm
@@ -26,3 +30,5 @@
**/values.dev.yaml
LICENSE
README.md
data/
docker/data/

.github/ISSUE_TEMPLATE/bug_report.yml vendored Normal file

@@ -0,0 +1,64 @@
name: Bug Report
description: I have an issue with Dispatcharr
title: "[Bug]: "
labels: ["Triage"]
type: "Bug"
projects: []
assignees: []
body:
  - type: markdown
    attributes:
      value: |
        Please make sure you search for similar issues before submitting. Thank you for your bug report!
  - type: textarea
    id: describe-the-bug
    attributes:
      label: Describe the bug
      description: Make sure to attach screenshots if possible!
      placeholder: Tell us what you see!
      value: "A clear and concise description of what the bug is. What did you expect to happen?"
    validations:
      required: true
  - type: textarea
    id: reproduce
    attributes:
      label: How can we recreate this bug?
      description: Be detailed!
      placeholder: Tell us what you see!
      value: "1. Go to '...' 2. Click on '....' 3. Scroll down to '....' 4. See error"
    validations:
      required: true
  - type: input
    id: dispatcharr-version
    attributes:
      label: Dispatcharr Version
      description: What version of Dispatcharr are you running?
      placeholder: Located bottom left of main screen
    validations:
      required: true
  - type: input
    id: docker-version
    attributes:
      label: Docker Version
      description: What version of Docker are you running?
      placeholder: docker --version
    validations:
      required: true
  - type: textarea
    id: docker-compose
    attributes:
      label: What's in your Docker Compose file?
      description: Please share your docker-compose.yml file
      placeholder: Tell us what you see!
      value: "If not using Docker Compose just put not using."
    validations:
      required: true
  - type: textarea
    id: client-info
    attributes:
      label: Client Information
      description: What are you using to view the streams from Dispatcharr?
      placeholder: Tell us what you see!
      value: "Device, App, Versions for both, etc..."
    validations:
      required: true

.github/ISSUE_TEMPLATE/config.yml vendored Normal file

@@ -0,0 +1 @@
blank_issues_enabled: false


@@ -0,0 +1,39 @@
name: Feature request
description: I want to suggest a new feature for Dispatcharr
title: "[Feature]: "
labels: ["Triage"]
type: "Feature"
projects: []
assignees: []
body:
  - type: markdown
    attributes:
      value: |
        Thank you for helping to make Dispatcharr better!
  - type: textarea
    id: describe-problem
    attributes:
      label: Is your feature request related to a problem?
      description: Make sure to attach screenshots if possible!
      placeholder: Tell us what you see!
      value: "A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]"
    validations:
      required: true
  - type: textarea
    id: describe-solution
    attributes:
      label: Describe the solution you'd like
      description: A clear and concise description of what you want to happen.
      placeholder: Tell us what you see!
      value: "Describe here."
    validations:
      required: true
  - type: textarea
    id: extras
    attributes:
      label: Additional context
      description: Anything else you want to add?
      placeholder: Tell us what you see!
      value: "Nothing Extra"
    validations:
      required: true

.github/workflows/base-image.yml vendored Normal file

@@ -0,0 +1,250 @@
name: Base Image Build
on:
  push:
    branches: [main, dev]
    paths:
      - 'docker/DispatcharrBase'
      - '.github/workflows/base-image.yml'
      - 'requirements.txt'
  pull_request:
    branches: [main, dev]
    paths:
      - 'docker/DispatcharrBase'
      - '.github/workflows/base-image.yml'
      - 'requirements.txt'
  workflow_dispatch: # Allow manual triggering
permissions:
  contents: write # For managing releases and pushing tags
  packages: write # For publishing to GitHub Container Registry
jobs:
  prepare:
    runs-on: ubuntu-24.04
    outputs:
      repo_owner: ${{ steps.meta.outputs.repo_owner }}
      repo_name: ${{ steps.meta.outputs.repo_name }}
      branch_tag: ${{ steps.meta.outputs.branch_tag }}
      timestamp: ${{ steps.timestamp.outputs.timestamp }}
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Generate timestamp for build
        id: timestamp
        run: |
          TIMESTAMP=$(date -u +'%Y%m%d%H%M%S')
          echo "timestamp=${TIMESTAMP}" >> $GITHUB_OUTPUT
      - name: Set repository and image metadata
        id: meta
        run: |
          # Get lowercase repository owner
          REPO_OWNER=$(echo "${{ github.repository_owner }}" | tr '[:upper:]' '[:lower:]')
          echo "repo_owner=${REPO_OWNER}" >> $GITHUB_OUTPUT
          # Get repository name
          REPO_NAME=$(echo "${{ github.repository }}" | cut -d '/' -f 2 | tr '[:upper:]' '[:lower:]')
          echo "repo_name=${REPO_NAME}" >> $GITHUB_OUTPUT
          # Determine branch name
          if [[ "${{ github.ref }}" == "refs/heads/main" ]]; then
            echo "branch_tag=base" >> $GITHUB_OUTPUT
          elif [[ "${{ github.ref }}" == "refs/heads/dev" ]]; then
            echo "branch_tag=base-dev" >> $GITHUB_OUTPUT
          else
            # For other branches, use the branch name
            BRANCH=$(echo "${{ github.ref }}" | sed 's/refs\/heads\///' | sed 's/[^a-zA-Z0-9]/-/g')
            echo "branch_tag=base-${BRANCH}" >> $GITHUB_OUTPUT
          fi
  docker:
    needs: [prepare]
    strategy:
      fail-fast: false
      matrix:
        platform: [amd64, arm64]
        include:
          - platform: amd64
            runner: ubuntu-24.04
          - platform: arm64
            runner: ubuntu-24.04-arm
    runs-on: ${{ matrix.runner }}
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Configure Git
        run: |
          git config user.name "GitHub Actions"
          git config user.email "actions@github.com"
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Login to GitHub Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Login to Docker Hub
        uses: docker/login-action@v2
        with:
          registry: docker.io
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Extract metadata for Docker
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: |
            ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}
            docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}
          labels: |
            org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}
            org.opencontainers.image.description=Your ultimate IPTV & stream Management companion.
            org.opencontainers.image.url=https://github.com/${{ github.repository }}
            org.opencontainers.image.source=https://github.com/${{ github.repository }}
            org.opencontainers.image.version=${{ needs.prepare.outputs.branch_tag }}-${{ needs.prepare.outputs.timestamp }}
            org.opencontainers.image.created=${{ needs.prepare.outputs.timestamp }}
            org.opencontainers.image.revision=${{ github.sha }}
            org.opencontainers.image.licenses=See repository
            org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/
            org.opencontainers.image.vendor=${{ needs.prepare.outputs.repo_owner }}
            org.opencontainers.image.authors=${{ github.actor }}
            maintainer=${{ github.actor }}
            build_version=DispatcharrBase version: ${{ needs.prepare.outputs.branch_tag }}-${{ needs.prepare.outputs.timestamp }}
- name: Build and push Docker base image
uses: docker/build-push-action@v4
with:
context: .
file: ./docker/DispatcharrBase
push: ${{ github.event_name != 'pull_request' }}
platforms: linux/${{ matrix.platform }}
tags: |
ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.branch_tag }}-${{ matrix.platform }}
ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.branch_tag }}-${{ needs.prepare.outputs.timestamp }}-${{ matrix.platform }}
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.branch_tag }}-${{ matrix.platform }}
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.branch_tag }}-${{ needs.prepare.outputs.timestamp }}-${{ matrix.platform }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
REPO_OWNER=${{ needs.prepare.outputs.repo_owner }}
REPO_NAME=${{ needs.prepare.outputs.repo_name }}
BRANCH=${{ github.ref_name }}
REPO_URL=https://github.com/${{ github.repository }}
TIMESTAMP=${{ needs.prepare.outputs.timestamp }}
create-manifest:
needs: [prepare, docker]
runs-on: ubuntu-24.04
if: ${{ github.event_name != 'pull_request' }}
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to GitHub Container Registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
registry: docker.io
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Create multi-arch manifest tags
run: |
set -euo pipefail
OWNER=${{ needs.prepare.outputs.repo_owner }}
REPO=${{ needs.prepare.outputs.repo_name }}
BRANCH_TAG=${{ needs.prepare.outputs.branch_tag }}
TIMESTAMP=${{ needs.prepare.outputs.timestamp }}
echo "Creating multi-arch manifest for ${OWNER}/${REPO}"
# GitHub Container Registry manifests
# branch tag (e.g. base or base-dev)
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${BRANCH_TAG}-${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=DispatcharrBase version: ${BRANCH_TAG}-${TIMESTAMP}" \
--tag ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG} \
ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-amd64 ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-arm64
# branch + timestamp tag
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${BRANCH_TAG}-${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=DispatcharrBase version: ${BRANCH_TAG}-${TIMESTAMP}" \
--tag ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-${TIMESTAMP} \
ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-${TIMESTAMP}-amd64 ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-${TIMESTAMP}-arm64
# Docker Hub manifests
# branch tag (e.g. base or base-dev)
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${BRANCH_TAG}-${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=DispatcharrBase version: ${BRANCH_TAG}-${TIMESTAMP}" \
--tag docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG} \
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG}-amd64 docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG}-arm64
# branch + timestamp tag
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${BRANCH_TAG}-${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=DispatcharrBase version: ${BRANCH_TAG}-${TIMESTAMP}" \
--tag docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG}-${TIMESTAMP} \
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG}-${TIMESTAMP}-amd64 docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG}-${TIMESTAMP}-arm64
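
The tag scheme the create-manifest job stitches together can be sketched by printing the names the matrix jobs push and the manifest tags built from them (owner, repo, and timestamp below are example values; the real workflow runs `docker buildx imagetools create` against the registries rather than echoing names):

```shell
#!/usr/bin/env bash
# Sketch of the base-image tag layout (names only, example values).
OWNER="dispatcharr"; REPO="dispatcharr"
BRANCH_TAG="base"; TIMESTAMP="20260118000000"

# Per-arch tags pushed by each matrix build job:
for arch in amd64 arm64; do
  echo "ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-${arch}"
  echo "ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-${TIMESTAMP}-${arch}"
done

# Multi-arch manifest tags created from those per-arch tags:
echo "ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}"
echo "ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-${TIMESTAMP}"
```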

@@ -2,18 +2,86 @@ name: CI Pipeline
on:
push:
branches: [ dev ]
branches: [dev]
paths-ignore:
- '**.md'
pull_request:
branches: [ dev ]
branches: [dev]
workflow_dispatch:
# Add explicit permissions for the workflow
permissions:
contents: write # For managing releases and pushing tags
packages: write # For publishing to GitHub Container Registry
contents: write
packages: write
jobs:
build:
runs-on: ubuntu-latest
prepare:
runs-on: ubuntu-24.04
# compute a single timestamp, version, and repo metadata for the entire workflow
outputs:
repo_owner: ${{ steps.meta.outputs.repo_owner }}
repo_name: ${{ steps.meta.outputs.repo_name }}
branch_tag: ${{ steps.meta.outputs.branch_tag }}
version: ${{ steps.version.outputs.version }}
timestamp: ${{ steps.timestamp.outputs.timestamp }}
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
token: ${{ secrets.GITHUB_TOKEN }}
- name: Generate timestamp for build
id: timestamp
run: |
TIMESTAMP=$(date -u +'%Y%m%d%H%M%S')
echo "timestamp=${TIMESTAMP}" >> $GITHUB_OUTPUT
- name: Extract version info
id: version
run: |
VERSION=$(python -c "import version; print(version.__version__)")
echo "version=${VERSION}" >> $GITHUB_OUTPUT
- name: Set repository and image metadata
id: meta
run: |
REPO_OWNER=$(echo "${{ github.repository_owner }}" | tr '[:upper:]' '[:lower:]')
echo "repo_owner=${REPO_OWNER}" >> $GITHUB_OUTPUT
REPO_NAME=$(echo "${{ github.repository }}" | cut -d '/' -f 2 | tr '[:upper:]' '[:lower:]')
echo "repo_name=${REPO_NAME}" >> $GITHUB_OUTPUT
if [[ "${{ github.ref }}" == "refs/heads/main" ]]; then
echo "branch_tag=latest" >> $GITHUB_OUTPUT
echo "is_main=true" >> $GITHUB_OUTPUT
elif [[ "${{ github.ref }}" == "refs/heads/dev" ]]; then
echo "branch_tag=dev" >> $GITHUB_OUTPUT
echo "is_main=false" >> $GITHUB_OUTPUT
else
BRANCH=$(echo "${{ github.ref }}" | sed 's/refs\/heads\///' | sed 's/[^a-zA-Z0-9]/-/g')
echo "branch_tag=${BRANCH}" >> $GITHUB_OUTPUT
echo "is_main=false" >> $GITHUB_OUTPUT
fi
if [[ "${{ github.event.pull_request.head.repo.fork }}" == "true" ]]; then
echo "is_fork=true" >> $GITHUB_OUTPUT
else
echo "is_fork=false" >> $GITHUB_OUTPUT
fi
docker:
needs: [prepare]
strategy:
fail-fast: false
matrix:
platform: [amd64, arm64]
include:
- platform: amd64
runner: ubuntu-24.04
- platform: arm64
runner: ubuntu-24.04-arm
runs-on: ${{ matrix.runner }}
# no per-job outputs here; shared metadata comes from the `prepare` job
steps:
- uses: actions/checkout@v3
with:
@@ -44,73 +112,162 @@ jobs:
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Increment Build Number
if: steps.check_actor.outputs.is_bot != 'true'
id: increment_build
run: |
python scripts/increment_build.py
BUILD=$(python -c "import version; print(version.__build__)")
echo "build=${BUILD}" >> $GITHUB_OUTPUT
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
registry: docker.io
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Commit Build Number Update
if: steps.check_actor.outputs.is_bot != 'true'
run: |
git add version.py
git commit -m "Increment build number to ${{ steps.increment_build.outputs.build }} [skip ci]"
git push
- name: Extract version info
id: version
run: |
VERSION=$(python -c "import version; print(version.__version__)")
BUILD=$(python -c "import version; print(version.__build__)")
echo "version=${VERSION}" >> $GITHUB_OUTPUT
echo "build=${BUILD}" >> $GITHUB_OUTPUT
echo "sha_short=${GITHUB_SHA::7}" >> $GITHUB_OUTPUT
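
The `sha_short` value comes from bash's substring expansion, which can be checked in isolation (the SHA below is an example value):

```shell
#!/usr/bin/env bash
# ${GITHUB_SHA::7} takes the first 7 characters of the commit SHA,
# matching the short-SHA tags the workflow publishes.
GITHUB_SHA="c970cfcf9a1234567890abcdef1234567890abcd"  # example value
SHA_SHORT=${GITHUB_SHA::7}
echo "$SHA_SHORT"   # c970cfc
```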
- name: Set repository and image metadata
- name: Extract metadata for Docker
id: meta
run: |
# Get lowercase repository owner
REPO_OWNER=$(echo "${{ github.repository_owner }}" | tr '[:upper:]' '[:lower:]')
echo "repo_owner=${REPO_OWNER}" >> $GITHUB_OUTPUT
# Get repository name
REPO_NAME=$(echo "${{ github.repository }}" | cut -d '/' -f 2 | tr '[:upper:]' '[:lower:]')
echo "repo_name=${REPO_NAME}" >> $GITHUB_OUTPUT
# Determine branch name
if [[ "${{ github.ref }}" == "refs/heads/main" ]]; then
echo "branch_tag=latest" >> $GITHUB_OUTPUT
echo "is_main=true" >> $GITHUB_OUTPUT
elif [[ "${{ github.ref }}" == "refs/heads/dev" ]]; then
echo "branch_tag=dev" >> $GITHUB_OUTPUT
echo "is_main=false" >> $GITHUB_OUTPUT
else
# For other branches, use the branch name
BRANCH=$(echo "${{ github.ref }}" | sed 's/refs\/heads\///' | sed 's/[^a-zA-Z0-9]/-/g')
echo "branch_tag=${BRANCH}" >> $GITHUB_OUTPUT
echo "is_main=false" >> $GITHUB_OUTPUT
fi
# Determine if this is from a fork
if [[ "${{ github.event.pull_request.head.repo.fork }}" == "true" ]]; then
echo "is_fork=true" >> $GITHUB_OUTPUT
else
echo "is_fork=false" >> $GITHUB_OUTPUT
fi
uses: docker/metadata-action@v5
with:
images: |
ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}
labels: |
org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}
org.opencontainers.image.description=Your ultimate IPTV & stream Management companion.
org.opencontainers.image.url=https://github.com/${{ github.repository }}
org.opencontainers.image.source=https://github.com/${{ github.repository }}
org.opencontainers.image.version=${{ needs.prepare.outputs.version }}-${{ needs.prepare.outputs.timestamp }}
org.opencontainers.image.created=${{ needs.prepare.outputs.timestamp }}
org.opencontainers.image.revision=${{ github.sha }}
org.opencontainers.image.licenses=See repository
org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/
org.opencontainers.image.vendor=${{ needs.prepare.outputs.repo_owner }}
org.opencontainers.image.authors=${{ github.actor }}
maintainer=${{ github.actor }}
build_version=Dispatcharr version: ${{ needs.prepare.outputs.version }}-${{ needs.prepare.outputs.timestamp }}
- name: Build and push Docker image
uses: docker/build-push-action@v4
with:
context: .
push: ${{ github.event_name != 'pull_request' }}
platforms: linux/amd64 # Fast build - amd64 only
# Build only the platform for this matrix job to avoid running amd64
# stages under qemu on an arm64 runner (and vice-versa). This makes
# the matrix runner's platform the one built by buildx.
platforms: linux/${{ matrix.platform }}
# push arch-specific tags from each matrix job (they will be combined
# into a multi-arch manifest in a follow-up job)
tags: |
ghcr.io/${{ steps.meta.outputs.repo_owner }}/${{ steps.meta.outputs.repo_name }}:${{ steps.meta.outputs.branch_tag }}
ghcr.io/${{ steps.meta.outputs.repo_owner }}/${{ steps.meta.outputs.repo_name }}:${{ steps.version.outputs.version }}-${{ steps.version.outputs.build }}
ghcr.io/${{ steps.meta.outputs.repo_owner }}/${{ steps.meta.outputs.repo_name }}:${{ steps.version.outputs.sha_short }}
ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.branch_tag }}-${{ matrix.platform }}
ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.version }}-${{ needs.prepare.outputs.timestamp }}-${{ matrix.platform }}
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.branch_tag }}-${{ matrix.platform }}
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.version }}-${{ needs.prepare.outputs.timestamp }}-${{ matrix.platform }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
REPO_OWNER=${{ needs.prepare.outputs.repo_owner }}
REPO_NAME=${{ needs.prepare.outputs.repo_name }}
BASE_TAG=base
BRANCH=${{ github.ref_name }}
REPO_URL=https://github.com/${{ github.repository }}
TIMESTAMP=${{ needs.prepare.outputs.timestamp }}
file: ./docker/Dockerfile
create-manifest:
# wait for prepare and all matrix builds to finish
needs: [prepare, docker]
runs-on: ubuntu-24.04
if: ${{ github.event_name != 'pull_request' }}
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to GitHub Container Registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
registry: docker.io
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Create multi-arch manifest tags
run: |
set -euo pipefail
OWNER=${{ needs.prepare.outputs.repo_owner }}
REPO=${{ needs.prepare.outputs.repo_name }}
BRANCH_TAG=${{ needs.prepare.outputs.branch_tag }}
VERSION=${{ needs.prepare.outputs.version }}
TIMESTAMP=${{ needs.prepare.outputs.timestamp }}
echo "Creating multi-arch manifest for ${OWNER}/${REPO}"
# branch tag (e.g. latest or dev)
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${BRANCH_TAG}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=Dispatcharr version: ${VERSION}-${TIMESTAMP}" \
--tag ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG} \
ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-amd64 ghcr.io/${OWNER}/${REPO}:${BRANCH_TAG}-arm64
# version + timestamp tag
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${VERSION}-${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=Dispatcharr version: ${VERSION}-${TIMESTAMP}" \
--tag ghcr.io/${OWNER}/${REPO}:${VERSION}-${TIMESTAMP} \
ghcr.io/${OWNER}/${REPO}:${VERSION}-${TIMESTAMP}-amd64 ghcr.io/${OWNER}/${REPO}:${VERSION}-${TIMESTAMP}-arm64
# also create Docker Hub manifests using the same username
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${BRANCH_TAG}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=Dispatcharr version: ${VERSION}-${TIMESTAMP}" \
--tag docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG} \
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG}-amd64 docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${BRANCH_TAG}-arm64
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${VERSION}-${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=Dispatcharr version: ${VERSION}-${TIMESTAMP}" \
--tag docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${VERSION}-${TIMESTAMP} \
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${VERSION}-${TIMESTAMP}-amd64 docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${VERSION}-${TIMESTAMP}-arm64

.github/workflows/frontend-tests.yml (new file, 41 lines)

@@ -0,0 +1,41 @@
name: Frontend Tests
on:
push:
branches: [main, dev]
paths:
- 'frontend/**'
- '.github/workflows/frontend-tests.yml'
pull_request:
branches: [main, dev]
paths:
- 'frontend/**'
- '.github/workflows/frontend-tests.yml'
jobs:
test:
runs-on: ubuntu-latest
defaults:
run:
working-directory: ./frontend
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '24'
cache: 'npm'
cache-dependency-path: './frontend/package-lock.json'
- name: Install dependencies
run: npm ci
# - name: Run linter
# run: npm run lint
- name: Run tests
run: npm test

@@ -15,16 +15,22 @@ on:
# Add explicit permissions for the workflow
permissions:
contents: write # For managing releases and pushing tags
packages: write # For publishing to GitHub Container Registry
contents: write # For managing releases and pushing tags
packages: write # For publishing to GitHub Container Registry
jobs:
release:
runs-on: ubuntu-latest
prepare:
runs-on: ubuntu-24.04
outputs:
new_version: ${{ steps.update_version.outputs.new_version }}
repo_owner: ${{ steps.meta.outputs.repo_owner }}
repo_name: ${{ steps.meta.outputs.repo_name }}
timestamp: ${{ steps.timestamp.outputs.timestamp }}
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
token: ${{ secrets.GITHUB_TOKEN }}
- name: Configure Git
run: |
@@ -38,14 +44,55 @@ jobs:
NEW_VERSION=$(python -c "import version; print(f'{version.__version__}')")
echo "new_version=${NEW_VERSION}" >> $GITHUB_OUTPUT
- name: Set lowercase repo owner
id: repo_owner
- name: Update Changelog
run: |
python scripts/update_changelog.py ${{ steps.update_version.outputs.new_version }}
- name: Set repository metadata
id: meta
run: |
REPO_OWNER=$(echo "${{ github.repository_owner }}" | tr '[:upper:]' '[:lower:]')
echo "lowercase=${REPO_OWNER}" >> $GITHUB_OUTPUT
echo "repo_owner=${REPO_OWNER}" >> $GITHUB_OUTPUT
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
REPO_NAME=$(echo "${{ github.repository }}" | cut -d '/' -f 2 | tr '[:upper:]' '[:lower:]')
echo "repo_name=${REPO_NAME}" >> $GITHUB_OUTPUT
- name: Generate timestamp for build
id: timestamp
run: |
TIMESTAMP=$(date -u +'%Y%m%d%H%M%S')
echo "timestamp=${TIMESTAMP}" >> $GITHUB_OUTPUT
- name: Commit and Tag
run: |
git add version.py CHANGELOG.md
git commit -m "Release v${{ steps.update_version.outputs.new_version }}"
git tag -a "v${{ steps.update_version.outputs.new_version }}" -m "Release v${{ steps.update_version.outputs.new_version }}"
git push origin main --tags
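
The commit-and-tag step above can be rehearsed in a throwaway repository (a sketch with an example version number; the real step derives the version from version.py, also commits CHANGELOG.md, and pushes to origin, which is omitted here):

```shell
#!/usr/bin/env bash
# Rehearse the release tagging in a temporary git repo (no push).
set -euo pipefail
DIR=$(mktemp -d)
cd "$DIR"
git init -q
git config user.name "GitHub Actions"
git config user.email "actions@github.com"
NEW_VERSION="0.9.0"  # example value; the workflow reads version.py
echo "__version__ = '${NEW_VERSION}'" > version.py
git add version.py
git commit -q -m "Release v${NEW_VERSION}"
git tag -a "v${NEW_VERSION}" -m "Release v${NEW_VERSION}"
git tag --list   # prints v0.9.0
```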
docker:
needs: [prepare]
strategy:
fail-fast: false
matrix:
platform: [amd64, arm64]
include:
- platform: amd64
runner: ubuntu-24.04
- platform: arm64
runner: ubuntu-24.04-arm
runs-on: ${{ matrix.runner }}
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
token: ${{ secrets.GITHUB_TOKEN }}
ref: main
- name: Configure Git
run: |
git config user.name "GitHub Actions"
git config user.email "actions@github.com"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
@@ -57,37 +104,134 @@ jobs:
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Commit and Tag
run: |
git add version.py
git commit -m "Release v${{ steps.update_version.outputs.new_version }}"
git tag -a "v${{ steps.update_version.outputs.new_version }}" -m "Release v${{ steps.update_version.outputs.new_version }}"
git push origin main --tags
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
registry: docker.io
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and Push Release Image
- name: Extract metadata for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: |
ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}
labels: |
org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}
org.opencontainers.image.description=Your ultimate IPTV & stream Management companion.
org.opencontainers.image.url=https://github.com/${{ github.repository }}
org.opencontainers.image.source=https://github.com/${{ github.repository }}
org.opencontainers.image.version=${{ needs.prepare.outputs.new_version }}
org.opencontainers.image.created=${{ needs.prepare.outputs.timestamp }}
org.opencontainers.image.revision=${{ github.sha }}
org.opencontainers.image.licenses=See repository
org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/
org.opencontainers.image.vendor=${{ needs.prepare.outputs.repo_owner }}
org.opencontainers.image.authors=${{ github.actor }}
maintainer=${{ github.actor }}
build_version=Dispatcharr version: ${{ needs.prepare.outputs.new_version }} Build date: ${{ needs.prepare.outputs.timestamp }}
- name: Build and push Docker image
uses: docker/build-push-action@v4
with:
context: .
push: true
platforms: linux/amd64,linux/arm64, #linux/arm/v7 # Multi-arch support for releases
platforms: linux/${{ matrix.platform }}
tags: |
ghcr.io/${{ steps.repo_owner.outputs.lowercase }}/dispatcharr:latest
ghcr.io/${{ steps.repo_owner.outputs.lowercase }}/dispatcharr:${{ steps.update_version.outputs.new_version }}
ghcr.io/${{ steps.repo_owner.outputs.lowercase }}/dispatcharr:latest-amd64
ghcr.io/${{ steps.repo_owner.outputs.lowercase }}/dispatcharr:latest-arm64
ghcr.io/${{ steps.repo_owner.outputs.lowercase }}/dispatcharr:${{ steps.update_version.outputs.new_version }}-amd64
ghcr.io/${{ steps.repo_owner.outputs.lowercase }}/dispatcharr:${{ steps.update_version.outputs.new_version }}-arm64
ghcr.io/${{ steps.repo_owner.outputs.lowercase }}/dispatcharr:${{ steps.update_version.outputs.new_version }}-armv7
ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}:latest-${{ matrix.platform }}
ghcr.io/${{ needs.prepare.outputs.repo_owner }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.new_version }}-${{ matrix.platform }}
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}:latest-${{ matrix.platform }}
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${{ needs.prepare.outputs.repo_name }}:${{ needs.prepare.outputs.new_version }}-${{ matrix.platform }}
labels: ${{ steps.meta.outputs.labels }}
build-args: |
REPO_OWNER=${{ needs.prepare.outputs.repo_owner }}
REPO_NAME=${{ needs.prepare.outputs.repo_name }}
BRANCH=${{ github.ref_name }}
REPO_URL=https://github.com/${{ github.repository }}
file: ./docker/Dockerfile
create-manifest:
needs: [prepare, docker]
runs-on: ubuntu-24.04
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to GitHub Container Registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
registry: docker.io
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Create multi-arch manifest tags
run: |
set -euo pipefail
OWNER=${{ needs.prepare.outputs.repo_owner }}
REPO=${{ needs.prepare.outputs.repo_name }}
VERSION=${{ needs.prepare.outputs.new_version }}
TIMESTAMP=${{ needs.prepare.outputs.timestamp }}
echo "Creating multi-arch manifest for ${OWNER}/${REPO}"
# GitHub Container Registry manifests
# Create one manifest with both latest and version tags
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${VERSION}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=Dispatcharr version: ${VERSION} Build date: ${TIMESTAMP}" \
--tag ghcr.io/${OWNER}/${REPO}:latest \
--tag ghcr.io/${OWNER}/${REPO}:${VERSION} \
ghcr.io/${OWNER}/${REPO}:${VERSION}-amd64 ghcr.io/${OWNER}/${REPO}:${VERSION}-arm64
# Docker Hub manifests
# Create one manifest with both latest and version tags
docker buildx imagetools create \
--annotation "index:org.opencontainers.image.title=${{ needs.prepare.outputs.repo_name }}" \
--annotation "index:org.opencontainers.image.description=Your ultimate IPTV & stream Management companion." \
--annotation "index:org.opencontainers.image.url=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.source=https://github.com/${{ github.repository }}" \
--annotation "index:org.opencontainers.image.version=${VERSION}" \
--annotation "index:org.opencontainers.image.created=${TIMESTAMP}" \
--annotation "index:org.opencontainers.image.revision=${{ github.sha }}" \
--annotation "index:org.opencontainers.image.licenses=See repository" \
--annotation "index:org.opencontainers.image.documentation=https://dispatcharr.github.io/Dispatcharr-Docs/" \
--annotation "index:org.opencontainers.image.vendor=${OWNER}" \
--annotation "index:org.opencontainers.image.authors=${{ github.actor }}" \
--annotation "index:maintainer=${{ github.actor }}" \
--annotation "index:build_version=Dispatcharr version: ${VERSION} Build date: ${TIMESTAMP}" \
--tag docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:latest \
--tag docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${VERSION} \
docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${VERSION}-amd64 docker.io/${{ secrets.DOCKERHUB_ORGANIZATION }}/${REPO}:${VERSION}-arm64
create-release:
needs: [prepare, create-manifest]
runs-on: ubuntu-24.04
steps:
- name: Create GitHub Release
uses: softprops/action-gh-release@v1
with:
tag_name: v${{ needs.prepare.outputs.new_version }}
name: Release v${{ needs.prepare.outputs.new_version }}
draft: false
prerelease: false
token: ${{ secrets.GITHUB_TOKEN }}

3
.gitignore vendored

@@ -18,4 +18,5 @@ dump.rdb
debugpy*
uwsgi.sock
package-lock.json
models
.idea

1014
CHANGELOG.md Normal file

File diff suppressed because it is too large

286
Plugins.md Normal file

@@ -0,0 +1,286 @@
# Dispatcharr Plugins
This document explains how to build, install, and use Python plugins in Dispatcharr. It covers discovery, the plugin interface, settings, actions, how to access application APIs, and examples.
---
## Quick Start
1) Create a folder under `/app/data/plugins/my_plugin/` (host path `data/plugins/my_plugin/` in the repo).
2) Add a `plugin.py` file exporting a `Plugin` class:
```python
# /app/data/plugins/my_plugin/plugin.py
class Plugin:
    name = "My Plugin"
    version = "0.1.0"
    description = "Does something useful"

    # Settings fields rendered by the UI and persisted by the backend
    fields = [
        {"id": "enabled", "label": "Enabled", "type": "boolean", "default": True},
        {"id": "limit", "label": "Item limit", "type": "number", "default": 5},
        {"id": "mode", "label": "Mode", "type": "select", "default": "safe",
         "options": [
             {"value": "safe", "label": "Safe"},
             {"value": "fast", "label": "Fast"},
         ]},
        {"id": "note", "label": "Note", "type": "string", "default": ""},
    ]

    # Actions appear as buttons. Clicking one calls run(action, params, context)
    actions = [
        {"id": "do_work", "label": "Do Work", "description": "Process items"},
    ]

    def run(self, action: str, params: dict, context: dict):
        settings = context.get("settings", {})
        logger = context.get("logger")
        if action == "do_work":
            limit = int(settings.get("limit", 5))
            mode = settings.get("mode", "safe")
            logger.info(f"My Plugin running with limit={limit}, mode={mode}")
            # Do a small amount of work here. Schedule Celery tasks for heavy work.
            return {"status": "ok", "processed": limit, "mode": mode}
        return {"status": "error", "message": f"Unknown action {action}"}
```
3) Open the Plugins page in the UI, click the refresh icon to reload discovery, then configure and run your plugin.
---
## Where Plugins Live
- Default directory: `/app/data/plugins` inside the container.
- Override with env var: `DISPATCHARR_PLUGINS_DIR`.
- Each plugin is a directory containing either:
- `plugin.py` exporting a `Plugin` class, or
- a Python package (`__init__.py`) exporting a `Plugin` class.
The directory name (lowercased, spaces as `_`) is used as the registry key and module import path (e.g. `my_plugin.plugin`).
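That naming rule can be sketched in a few lines (a hypothetical helper for illustration; the actual logic lives in `apps/plugins/loader.py`):

```python
def registry_key(directory_name: str) -> str:
    # Lowercase the folder name and replace spaces with underscores,
    # mirroring the rule described above.
    return directory_name.lower().replace(" ", "_")

print(registry_key("My Plugin"))  # my_plugin
```

So a folder named `My Plugin` would be imported as `my_plugin.plugin`.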
---
## Discovery & Lifecycle
- Discovery runs at server startup and on-demand when:
- Fetching the plugins list from the UI
- Hitting `POST /api/plugins/plugins/reload/`
- The loader imports each plugin module and instantiates `Plugin()`.
- Metadata (name, version, description) and a per-plugin settings JSON are stored in the DB.
Backend code:
- Loader: `apps/plugins/loader.py`
- API Views: `apps/plugins/api_views.py`
- API URLs: `apps/plugins/api_urls.py`
- Model: `apps/plugins/models.py` (stores `enabled` flag and `settings` per plugin)
---
## Plugin Interface
Export a `Plugin` class. Supported attributes and behavior:
- `name` (str): Human-readable name.
- `version` (str): Semantic version string.
- `description` (str): Short description.
- `fields` (list): Settings schema used by the UI to render controls.
- `actions` (list): Available actions; the UI renders a Run button for each.
- `run(action, params, context)` (callable): Invoked when a user clicks an action.
### Settings Schema
Supported field `type`s:
- `boolean`
- `number`
- `string`
- `select` (requires `options`: `[{"value": ..., "label": ...}, ...]`)
Common field keys:
- `id` (str): Settings key.
- `label` (str): Label shown in the UI.
- `type` (str): One of above.
- `default` (any): Default value used until saved.
- `help_text` (str, optional): Shown under the control.
- `options` (list, for select): List of `{value, label}`.
The UI automatically renders settings and persists them. The backend stores settings in `PluginConfig.settings`.
Read settings in `run` via `context["settings"]`.
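The "default until saved" behavior can be pictured as saved settings overlaid on schema defaults. This helper is a sketch for illustration, not the backend's actual implementation:

```python
def effective_settings(fields, saved=None):
    # Start from each field's declared default, then overlay any saved values.
    merged = {f["id"]: f.get("default") for f in fields}
    merged.update(saved or {})
    return merged

fields = [
    {"id": "limit", "type": "number", "default": 5},
    {"id": "mode", "type": "select", "default": "safe"},
]
print(effective_settings(fields, {"limit": 10}))  # {'limit': 10, 'mode': 'safe'}
```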
### Actions
Each action is a dict:
- `id` (str): Unique action id.
- `label` (str): Button label.
- `description` (str, optional): Helper text.
Clicking an action calls your plugin's `run(action, params, context)` and shows a notification with the result or error.
### Action Confirmation (Modal)
Developers can request a confirmation modal per action using the `confirm` key on the action. Options:
- Boolean: `confirm: true` will show a default confirmation modal.
- Object: `confirm: { required: true, title: '...', message: '...' }` to customize the modal title and message.
Example:
```python
actions = [
    {
        "id": "danger_run",
        "label": "Do Something Risky",
        "description": "Runs a job that affects many records.",
        "confirm": {"required": True, "title": "Proceed?", "message": "This will modify many records."},
    }
]
```
---
## Accessing Dispatcharr APIs from Plugins
Plugins are server-side Python code running within the Django application. You can:
- Import models and run queries/updates:
```python
from apps.m3u.models import M3UAccount
from apps.epg.models import EPGSource
from apps.channels.models import Channel
from core.models import CoreSettings
```
- Dispatch Celery tasks for heavy work (recommended):
```python
from apps.m3u.tasks import refresh_m3u_accounts  # apps/m3u/tasks.py
from apps.epg.tasks import refresh_all_epg_data  # apps/epg/tasks.py

refresh_m3u_accounts.delay()
refresh_all_epg_data.delay()
```
- Send WebSocket updates:
```python
from core.utils import send_websocket_update

send_websocket_update('updates', 'update', {"type": "plugin", "plugin": "my_plugin", "message": "Done"})
```
- Use transactions:
```python
from django.db import transaction

with transaction.atomic():
    # bulk updates here
    ...
```
- Log via provided context or standard logging:
```python
def run(self, action, params, context):
    logger = context.get("logger")  # already configured
    logger.info("running action %s", action)
```
Prefer Celery tasks (`.delay()`) to keep `run` fast and non-blocking.
---
## REST Endpoints (for UI and tooling)
- List plugins: `GET /api/plugins/plugins/`
- Response: `{ "plugins": [{ key, name, version, description, enabled, fields, settings, actions }, ...] }`
- Reload discovery: `POST /api/plugins/plugins/reload/`
- Import plugin: `POST /api/plugins/plugins/import/` with form-data file field `file`
- Update settings: `POST /api/plugins/plugins/<key>/settings/` with `{"settings": {...}}`
- Run action: `POST /api/plugins/plugins/<key>/run/` with `{"action": "id", "params": {...}}`
- Enable/disable: `POST /api/plugins/plugins/<key>/enabled/` with `{"enabled": true|false}`
Notes:
- When disabled, a plugin cannot run actions; backend returns HTTP 403.
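As a sketch of calling the run endpoint from tooling, using only the standard library (the base URL and port here are assumptions for your deployment, and whatever auth headers your instance requires are omitted):

```python
import json
from urllib import request

def build_run_request(base_url: str, key: str, action: str, params: dict) -> request.Request:
    # POST /api/plugins/plugins/<key>/run/ with {"action": ..., "params": ...}
    url = f"{base_url}/api/plugins/plugins/{key}/run/"
    body = json.dumps({"action": action, "params": params}).encode()
    return request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_run_request("http://localhost:9191", "my_plugin", "do_work", {})
print(req.full_url)  # http://localhost:9191/api/plugins/plugins/my_plugin/run/
```

Send it with `request.urlopen(req)` once the appropriate auth is attached.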
---
## Importing Plugins
- In the UI, click the Import button on the Plugins page and upload a `.zip` containing a plugin folder.
- The archive should contain either `plugin.py` or a Python package (`__init__.py`).
- On success, the UI shows the plugin name/description and lets you enable it immediately (plugins are disabled by default).
---
## Enabling / Disabling Plugins
- Each plugin has a persisted `enabled` flag (default: disabled) and `ever_enabled` flag in the DB (`apps/plugins/models.py`).
- New plugins are disabled by default and require an explicit enable.
- The first time a plugin is enabled, the UI shows a trust warning modal explaining that plugins can run arbitrary server-side code.
- The Plugins page shows a toggle in the card header. Turning it off dims the card and disables the Run button.
- Backend enforcement: Attempts to run an action for a disabled plugin return HTTP 403.
---
## Example: Refresh All Sources Plugin
Path: `data/plugins/refresh_all/plugin.py`
```python
class Plugin:
    name = "Refresh All Sources"
    version = "1.0.0"
    description = "Force refresh all M3U accounts and EPG sources."

    fields = [
        {"id": "confirm", "label": "Require confirmation", "type": "boolean", "default": True,
         "help_text": "If enabled, the UI should ask before running."}
    ]

    actions = [
        {"id": "refresh_all", "label": "Refresh All M3Us and EPGs",
         "description": "Queues background refresh for all active M3U accounts and EPG sources."}
    ]

    def run(self, action: str, params: dict, context: dict):
        if action == "refresh_all":
            from apps.m3u.tasks import refresh_m3u_accounts
            from apps.epg.tasks import refresh_all_epg_data

            refresh_m3u_accounts.delay()
            refresh_all_epg_data.delay()
            return {"status": "queued", "message": "Refresh jobs queued"}
        return {"status": "error", "message": f"Unknown action: {action}"}
```
---
## Best Practices
- Keep `run` short and schedule heavy operations via Celery tasks.
- Validate and sanitize `params` received from the UI.
- Use database transactions for bulk or related updates.
- Log actionable messages for troubleshooting.
- Only write files under `/data` or `/app/data` paths.
- Treat plugins as trusted code: they run with full app permissions.
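The "validate and sanitize `params`" point can be sketched as a small coerce-and-clamp helper (hypothetical names; adapt to your plugin's fields):

```python
def validated_limit(params: dict, default: int = 5, maximum: int = 100) -> int:
    # Coerce a user-supplied limit to int and clamp it; fall back on bad input.
    try:
        limit = int(params.get("limit", default))
    except (TypeError, ValueError):
        return default
    return max(1, min(limit, maximum))

print(validated_limit({"limit": "25"}))   # 25
print(validated_limit({"limit": "abc"}))  # 5
print(validated_limit({"limit": 10000}))  # 100
```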
---
## Troubleshooting
- Plugin not listed: ensure the folder exists and contains `plugin.py` with a `Plugin` class.
- Import errors: the folder name is the import name; avoid spaces or exotic characters.
- No confirmation prompt: include a boolean field with `id: "confirm"` and set it (or its default) to `True`.
- HTTP 403 on run: the plugin is disabled; enable it from the toggle or via the `enabled/` endpoint.
---
## Contributing
- Keep dependencies minimal. Vendoring small helpers into the plugin folder is acceptable.
- Use the existing task and model APIs where possible; propose extensions if you need new capabilities.
---
## Internals Reference
- Loader: `apps/plugins/loader.py`
- API Views: `apps/plugins/api_views.py`
- API URLs: `apps/plugins/api_urls.py`
- Model: `apps/plugins/models.py`
- Frontend page: `frontend/src/pages/Plugins.jsx`
- Sidebar entry: `frontend/src/components/Sidebar.jsx`


@@ -22,6 +22,7 @@ Dispatcharr has officially entered **BETA**, bringing powerful new features and
📊 **Real-Time Stats Dashboard** — Live insights into stream health and client activity\
🧠 **EPG Auto-Match** — Match program data to channels automatically\
⚙️ **Streamlink + FFmpeg Support** — Flexible backend options for streaming and recording\
🎬 **VOD Management** — Full Video on Demand support with movies and TV series\
🧼 **UI & UX Enhancements** — Smoother, faster, more responsive interface\
🛁 **Output Compatibility** — HDHomeRun, M3U, and XMLTV EPG support for Plex, Jellyfin, and more
@@ -31,6 +32,7 @@ Dispatcharr has officially entered **BETA**, bringing powerful new features and
**Full IPTV Control** — Import, organize, proxy, and monitor IPTV streams on your own terms\
**Smart Playlist Handling** — M3U import, filtering, grouping, and failover support\
**VOD Content Management** — Organize movies and TV series with metadata and streaming\
**Reliable EPG Integration** — Match and manage TV guide data with ease\
**Clean & Responsive Interface** — Modern design that gets out of your way\
**Fully Self-Hosted** — Total control, zero reliance on third-party services
@@ -74,9 +76,9 @@ docker run -d \
| Use Case | File | Description |
| --------------------------- | ------------------------------------------------------- | ------------------------------------------------------------------------------------------------------ |
| **All-in-One Deployment** | [docker-compose-aio.yml](docker/docker-compose-aio.yml) | ⭐ Recommended! A simple, all-in-one solution — everything runs in a single container for quick setup. |
| **All-in-One Deployment** | [docker-compose.aio.yml](docker/docker-compose.aio.yml) | ⭐ Recommended! A simple, all-in-one solution — everything runs in a single container for quick setup. |
| **Modular Deployment** | [docker-compose.yml](docker/docker-compose.yml) | Separate containers for Dispatcharr, Celery, and Postgres — perfect if you want more granular control. |
| **Development Environment** | [docker-compose-dev.yml](docker/docker-compose-dev.yml) | Developer-friendly setup with pre-configured ports and settings for contributing and testing. |
| **Development Environment** | [docker-compose.dev.yml](docker/docker-compose.dev.yml) | Developer-friendly setup with pre-configured ports and settings for contributing and testing. |
---
@@ -104,7 +106,7 @@ Here's how you can join the party:
## 📚 Roadmap & Documentation
- 📚 **Roadmap:** Coming soon!
- 📖 **Wiki:** In progress — tutorials, API references, and advanced setup guides on the way!
- 📖 **Documentation:** [Dispatcharr Docs](https://dispatcharr.github.io/Dispatcharr-Docs/)
---


@@ -1,41 +1,39 @@
from django.urls import path, include
from rest_framework.routers import DefaultRouter
from .api_views import (
AuthViewSet, UserViewSet, GroupViewSet,
list_permissions, initialize_superuser
AuthViewSet,
UserViewSet,
GroupViewSet,
TokenObtainPairView,
TokenRefreshView,
list_permissions,
initialize_superuser,
)
from rest_framework_simplejwt import views as jwt_views
app_name = 'accounts'
app_name = "accounts"
# 🔹 Register ViewSets with a Router
router = DefaultRouter()
router.register(r'users', UserViewSet, basename='user')
router.register(r'groups', GroupViewSet, basename='group')
router.register(r"users", UserViewSet, basename="user")
router.register(r"groups", GroupViewSet, basename="group")
# 🔹 Custom Authentication Endpoints
auth_view = AuthViewSet.as_view({
'post': 'login'
})
auth_view = AuthViewSet.as_view({"post": "login"})
logout_view = AuthViewSet.as_view({
'post': 'logout'
})
logout_view = AuthViewSet.as_view({"post": "logout"})
# 🔹 Define API URL patterns
urlpatterns = [
# Authentication
path('auth/login/', auth_view, name='user-login'),
path('auth/logout/', logout_view, name='user-logout'),
path("auth/login/", auth_view, name="user-login"),
path("auth/logout/", logout_view, name="user-logout"),
# Superuser API
path('initialize-superuser/', initialize_superuser, name='initialize_superuser'),
path("initialize-superuser/", initialize_superuser, name="initialize_superuser"),
# Permissions API
path('permissions/', list_permissions, name='list-permissions'),
path('token/', jwt_views.TokenObtainPairView.as_view(), name='token_obtain_pair'),
path('token/refresh/', jwt_views.TokenRefreshView.as_view(), name='token_refresh'),
path("permissions/", list_permissions, name="list-permissions"),
path("token/", TokenObtainPairView.as_view(), name="token_obtain_pair"),
path("token/refresh/", TokenRefreshView.as_view(), name="token_refresh"),
]
# 🔹 Include ViewSet routes


@@ -2,16 +2,110 @@ from django.contrib.auth import authenticate, login, logout
from django.contrib.auth.models import Group, Permission
from django.http import JsonResponse, HttpResponse
from django.views.decorators.csrf import csrf_exempt
from rest_framework.decorators import api_view, permission_classes
from rest_framework.permissions import IsAuthenticated, AllowAny
from rest_framework.decorators import api_view, permission_classes, action
from rest_framework.response import Response
from rest_framework import viewsets
from rest_framework import viewsets, status
from drf_yasg.utils import swagger_auto_schema
from drf_yasg import openapi
import json
from .permissions import IsAdmin, Authenticated
from dispatcharr.utils import network_access_allowed
from .models import User
from .serializers import UserSerializer, GroupSerializer, PermissionSerializer
from rest_framework_simplejwt.views import TokenObtainPairView, TokenRefreshView
class TokenObtainPairView(TokenObtainPairView):
def post(self, request, *args, **kwargs):
# Custom logic here
if not network_access_allowed(request, "UI"):
# Log blocked login attempt due to network restrictions
from core.utils import log_system_event
username = request.data.get("username", 'unknown')
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='login_failed',
user=username,
client_ip=client_ip,
user_agent=user_agent,
reason='Network access denied',
)
return Response({"error": "Forbidden"}, status=status.HTTP_403_FORBIDDEN)
# Get the response from the parent class first
username = request.data.get("username")
# Log login attempt
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
try:
response = super().post(request, *args, **kwargs)
# If login was successful, update last_login and log success
if response.status_code == 200:
if username:
from django.utils import timezone
try:
user = User.objects.get(username=username)
user.last_login = timezone.now()
user.save(update_fields=['last_login'])
# Log successful login
log_system_event(
event_type='login_success',
user=username,
client_ip=client_ip,
user_agent=user_agent,
)
except User.DoesNotExist:
pass # User doesn't exist, but login somehow succeeded
else:
# Log failed login attempt
log_system_event(
event_type='login_failed',
user=username or 'unknown',
client_ip=client_ip,
user_agent=user_agent,
reason='Invalid credentials',
)
return response
except Exception as e:
# If parent class raises an exception (e.g., validation error), log failed attempt
log_system_event(
event_type='login_failed',
user=username or 'unknown',
client_ip=client_ip,
user_agent=user_agent,
reason=f'Authentication error: {str(e)[:100]}',
)
raise # Re-raise the exception to maintain normal error flow
class TokenRefreshView(TokenRefreshView):
def post(self, request, *args, **kwargs):
# Custom logic here
if not network_access_allowed(request, "UI"):
# Log blocked token refresh attempt due to network restrictions
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='login_failed',
user='token_refresh',
client_ip=client_ip,
user_agent=user_agent,
reason='Network access denied (token refresh)',
)
return Response({"error": "Unauthorized"}, status=status.HTTP_403_FORBIDDEN)
return super().post(request, *args, **kwargs)
@csrf_exempt # In production, consider CSRF protection strategies or ensure this endpoint is only accessible when no superuser exists.
def initialize_superuser(request):
@@ -26,56 +120,114 @@ def initialize_superuser(request):
password = data.get("password")
email = data.get("email", "")
if not username or not password:
return JsonResponse({"error": "Username and password are required."}, status=400)
return JsonResponse(
{"error": "Username and password are required."}, status=400
)
# Create the superuser
User.objects.create_superuser(username=username, password=password, email=email)
User.objects.create_superuser(
username=username, password=password, email=email, user_level=10
)
return JsonResponse({"superuser_exists": True})
except Exception as e:
return JsonResponse({"error": str(e)}, status=500)
# For GET requests, indicate no superuser exists
return JsonResponse({"superuser_exists": False})
# 🔹 1) Authentication APIs
class AuthViewSet(viewsets.ViewSet):
"""Handles user login and logout"""
def get_permissions(self):
"""
Login doesn't require auth, but logout does
"""
if self.action == 'logout':
from rest_framework.permissions import IsAuthenticated
return [IsAuthenticated()]
return []
@swagger_auto_schema(
operation_description="Authenticate and log in a user",
request_body=openapi.Schema(
type=openapi.TYPE_OBJECT,
required=['username', 'password'],
required=["username", "password"],
properties={
'username': openapi.Schema(type=openapi.TYPE_STRING),
'password': openapi.Schema(type=openapi.TYPE_STRING, format=openapi.FORMAT_PASSWORD)
"username": openapi.Schema(type=openapi.TYPE_STRING),
"password": openapi.Schema(
type=openapi.TYPE_STRING, format=openapi.FORMAT_PASSWORD
),
},
),
responses={200: "Login successful", 400: "Invalid credentials"},
)
def login(self, request):
"""Logs in a user and returns user details"""
username = request.data.get('username')
password = request.data.get('password')
username = request.data.get("username")
password = request.data.get("password")
user = authenticate(request, username=username, password=password)
# Get client info for logging
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
if user:
login(request, user)
return Response({
"message": "Login successful",
"user": {
"id": user.id,
"username": user.username,
"email": user.email,
"groups": list(user.groups.values_list('name', flat=True))
# Update last_login timestamp
from django.utils import timezone
user.last_login = timezone.now()
user.save(update_fields=['last_login'])
# Log successful login
log_system_event(
event_type='login_success',
user=username,
client_ip=client_ip,
user_agent=user_agent,
)
return Response(
{
"message": "Login successful",
"user": {
"id": user.id,
"username": user.username,
"email": user.email,
"groups": list(user.groups.values_list("name", flat=True)),
},
}
})
)
# Log failed login attempt
log_system_event(
event_type='login_failed',
user=username or 'unknown',
client_ip=client_ip,
user_agent=user_agent,
reason='Invalid credentials',
)
return Response({"error": "Invalid credentials"}, status=400)
@swagger_auto_schema(
operation_description="Log out the current user",
responses={200: "Logout successful"}
responses={200: "Logout successful"},
)
def logout(self, request):
"""Logs out the authenticated user"""
# Log logout event before actually logging out
from core.utils import log_system_event
username = request.user.username if request.user and request.user.is_authenticated else 'unknown'
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='logout',
user=username,
client_ip=client_ip,
user_agent=user_agent,
)
logout(request)
return Response({"message": "Logout successful"})
@@ -83,13 +235,19 @@ class AuthViewSet(viewsets.ViewSet):
# 🔹 2) User Management APIs
class UserViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for Users"""
queryset = User.objects.all()
queryset = User.objects.all().prefetch_related('channel_profiles')
serializer_class = UserSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
if self.action == "me":
return [Authenticated()]
return [IsAdmin()]
@swagger_auto_schema(
operation_description="Retrieve a list of users",
responses={200: UserSerializer(many=True)}
responses={200: UserSerializer(many=True)},
)
def list(self, request, *args, **kwargs):
return super().list(request, *args, **kwargs)
@@ -110,17 +268,28 @@ class UserViewSet(viewsets.ModelViewSet):
def destroy(self, request, *args, **kwargs):
return super().destroy(request, *args, **kwargs)
@swagger_auto_schema(
method="get",
operation_description="Get active user information",
)
@action(detail=False, methods=["get"], url_path="me")
def me(self, request):
user = request.user
serializer = UserSerializer(user)
return Response(serializer.data)
# 🔹 3) Group Management APIs
class GroupViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for Groups"""
queryset = Group.objects.all()
serializer_class = GroupSerializer
permission_classes = [IsAuthenticated]
permission_classes = [Authenticated]
@swagger_auto_schema(
operation_description="Retrieve a list of groups",
responses={200: GroupSerializer(many=True)}
responses={200: GroupSerializer(many=True)},
)
def list(self, request, *args, **kwargs):
return super().list(request, *args, **kwargs)
@@ -144,12 +313,12 @@ class GroupViewSet(viewsets.ModelViewSet):
# 🔹 4) Permissions List API
@swagger_auto_schema(
method='get',
method="get",
operation_description="Retrieve a list of all permissions",
responses={200: PermissionSerializer(many=True)}
responses={200: PermissionSerializer(many=True)},
)
@api_view(['GET'])
@permission_classes([IsAuthenticated])
@api_view(["GET"])
@permission_classes([Authenticated])
def list_permissions(request):
"""Returns a list of all available permissions"""
permissions = Permission.objects.all()


@@ -1,6 +1,7 @@
from django.apps import AppConfig
class AccountsConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'apps.accounts'
default_auto_field = "django.db.models.BigAutoField"
name = "apps.accounts"
verbose_name = "Accounts & Authentication"


@@ -0,0 +1,43 @@
# Generated by Django 5.1.6 on 2025-05-18 15:47
from django.db import migrations, models
def set_user_level_to_10(apps, schema_editor):
User = apps.get_model("accounts", "User")
User.objects.update(user_level=10)
class Migration(migrations.Migration):
dependencies = [
("accounts", "0001_initial"),
("dispatcharr_channels", "0021_channel_user_level"),
]
operations = [
migrations.RemoveField(
model_name="user",
name="channel_groups",
),
migrations.AddField(
model_name="user",
name="channel_profiles",
field=models.ManyToManyField(
blank=True,
related_name="users",
to="dispatcharr_channels.channelprofile",
),
),
migrations.AddField(
model_name="user",
name="user_level",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="user",
name="custom_properties",
field=models.TextField(blank=True, null=True),
),
migrations.RunPython(set_user_level_to_10),
]


@@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-09-02 14:30
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0002_remove_user_channel_groups_user_channel_profiles_and_more'),
]
operations = [
migrations.AlterField(
model_name='user',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, null=True),
),
]


@@ -2,17 +2,26 @@
from django.db import models
from django.contrib.auth.models import AbstractUser, Permission
class User(AbstractUser):
"""
Custom user model for Dispatcharr.
Inherits from Django's AbstractUser to add additional fields if needed.
"""
class UserLevel(models.IntegerChoices):
STREAMER = 0, "Streamer"
STANDARD = 1, "Standard User"
ADMIN = 10, "Admin"
avatar_config = models.JSONField(default=dict, blank=True, null=True)
channel_groups = models.ManyToManyField(
'dispatcharr_channels.ChannelGroup', # Updated reference to renamed model
channel_profiles = models.ManyToManyField(
"dispatcharr_channels.ChannelProfile",
blank=True,
related_name="users"
related_name="users",
)
user_level = models.IntegerField(default=UserLevel.STREAMER)
custom_properties = models.JSONField(default=dict, blank=True, null=True)
def __str__(self):
return self.username


@@ -0,0 +1,56 @@
from rest_framework.permissions import IsAuthenticated
from .models import User
from dispatcharr.utils import network_access_allowed
class Authenticated(IsAuthenticated):
def has_permission(self, request, view):
is_authenticated = super().has_permission(request, view)
network_allowed = network_access_allowed(request, "UI")
return is_authenticated and network_allowed
class IsStandardUser(Authenticated):
def has_permission(self, request, view):
if not super().has_permission(request, view):
return False
return request.user and request.user.user_level >= User.UserLevel.STANDARD
class IsAdmin(Authenticated):
def has_permission(self, request, view):
if not super().has_permission(request, view):
return False
return request.user.user_level >= 10
class IsOwnerOfObject(Authenticated):
def has_object_permission(self, request, view, obj):
if not super().has_permission(request, view):
return False
is_admin = IsAdmin().has_permission(request, view)
is_owner = request.user in obj.users.all()
return is_admin or is_owner
permission_classes_by_action = {
"list": [IsStandardUser],
"create": [IsAdmin],
"retrieve": [IsStandardUser],
"update": [IsAdmin],
"partial_update": [IsAdmin],
"destroy": [IsAdmin],
}
permission_classes_by_method = {
"GET": [IsStandardUser],
"POST": [IsAdmin],
"PATCH": [IsAdmin],
"PUT": [IsAdmin],
"DELETE": [IsAdmin],
}


@@ -1,13 +1,14 @@
from rest_framework import serializers
from django.contrib.auth.models import Group, Permission
from .models import User
from apps.channels.models import ChannelProfile
# 🔹 Fix for Permission serialization
class PermissionSerializer(serializers.ModelSerializer):
class Meta:
model = Permission
fields = ['id', 'name', 'codename']
fields = ["id", "name", "codename"]
# 🔹 Fix for Group serialization
@@ -18,15 +19,61 @@ class GroupSerializer(serializers.ModelSerializer):
class Meta:
model = Group
fields = ['id', 'name', 'permissions']
fields = ["id", "name", "permissions"]
# 🔹 Fix for User serialization
class UserSerializer(serializers.ModelSerializer):
groups = serializers.SlugRelatedField(
many=True, queryset=Group.objects.all(), slug_field="name"
) # ✅ Fix ManyToMany `_meta` error
password = serializers.CharField(write_only=True)
channel_profiles = serializers.PrimaryKeyRelatedField(
queryset=ChannelProfile.objects.all(), many=True, required=False
)
class Meta:
model = User
fields = ['id', 'username', 'email', 'groups']
fields = [
"id",
"username",
"email",
"user_level",
"password",
"channel_profiles",
"custom_properties",
"avatar_config",
"is_active",
"is_staff",
"is_superuser",
"last_login",
"date_joined",
"first_name",
"last_name",
]
def create(self, validated_data):
channel_profiles = validated_data.pop("channel_profiles", [])
user = User(**validated_data)
user.set_password(validated_data["password"])
user.is_active = True
user.save()
user.channel_profiles.set(channel_profiles)
return user
def update(self, instance, validated_data):
password = validated_data.pop("password", None)
channel_profiles = validated_data.pop("channel_profiles", None)
for attr, value in validated_data.items():
setattr(instance, attr, value)
if password:
instance.set_password(password)
instance.save()
if channel_profiles is not None:
instance.channel_profiles.set(channel_profiles)
return instance


@@ -5,6 +5,7 @@ from django.db.models.signals import post_save
from django.dispatch import receiver
from .models import User
@receiver(post_save, sender=User)
def handle_new_user(sender, instance, created, **kwargs):
if created:


@@ -1,11 +1,10 @@
from django.urls import path, include, re_path
from drf_yasg.views import get_schema_view
from drf_yasg import openapi
from rest_framework.permissions import AllowAny
app_name = 'api'
# Configure Swagger Schema
schema_view = get_schema_view(
openapi.Info(
title="Dispatcharr API",
@@ -26,6 +25,9 @@ urlpatterns = [
path('hdhr/', include(('apps.hdhr.api_urls', 'hdhr'), namespace='hdhr')),
path('m3u/', include(('apps.m3u.api_urls', 'm3u'), namespace='m3u')),
path('core/', include(('core.api_urls', 'core'), namespace='core')),
path('plugins/', include(('apps.plugins.api_urls', 'plugins'), namespace='plugins')),
path('vod/', include(('apps.vod.api_urls', 'vod'), namespace='vod')),
path('backups/', include(('apps.backups.api_urls', 'backups'), namespace='backups')),
# path('output/', include(('apps.output.api_urls', 'output'), namespace='output')),
#path('player/', include(('apps.player.api_urls', 'player'), namespace='player')),
#path('settings/', include(('apps.settings.api_urls', 'settings'), namespace='settings')),
@@ -34,7 +36,7 @@ urlpatterns = [
# Swagger Documentation api_urls
re_path(r'^swagger/?$', schema_view.with_ui('swagger', cache_timeout=0), name='schema-swagger-ui'),
path('redoc/', schema_view.with_ui('redoc', cache_timeout=0), name='schema-redoc'),
path('swagger.json', schema_view.without_ui(cache_timeout=0), name='schema-json'),
]

apps/backups/__init__.py Normal file (0 lines)

apps/backups/api_urls.py Normal file (18 lines)

@@ -0,0 +1,18 @@
from django.urls import path
from . import api_views
app_name = "backups"
urlpatterns = [
path("", api_views.list_backups, name="backup-list"),
path("create/", api_views.create_backup, name="backup-create"),
path("upload/", api_views.upload_backup, name="backup-upload"),
path("schedule/", api_views.get_schedule, name="backup-schedule-get"),
path("schedule/update/", api_views.update_schedule, name="backup-schedule-update"),
path("status/<str:task_id>/", api_views.backup_status, name="backup-status"),
path("<str:filename>/download-token/", api_views.get_download_token, name="backup-download-token"),
path("<str:filename>/download/", api_views.download_backup, name="backup-download"),
path("<str:filename>/delete/", api_views.delete_backup, name="backup-delete"),
path("<str:filename>/restore/", api_views.restore_backup, name="backup-restore"),
]

apps/backups/api_views.py Normal file (364 lines)

@@ -0,0 +1,364 @@
import hashlib
import hmac
import logging
import os
from pathlib import Path
from celery.result import AsyncResult
from django.conf import settings
from django.http import HttpResponse, StreamingHttpResponse, Http404
from rest_framework import status
from rest_framework.decorators import api_view, permission_classes, parser_classes
from rest_framework.permissions import IsAdminUser, AllowAny
from rest_framework.parsers import MultiPartParser, FormParser
from rest_framework.response import Response
from . import services
from .tasks import create_backup_task, restore_backup_task
from .scheduler import get_schedule_settings, update_schedule_settings
logger = logging.getLogger(__name__)
def _generate_task_token(task_id: str) -> str:
"""Generate a signed token for task status access without auth."""
secret = settings.SECRET_KEY.encode()
return hmac.new(secret, task_id.encode(), hashlib.sha256).hexdigest()[:32]
def _verify_task_token(task_id: str, token: str) -> bool:
"""Verify a task token is valid."""
expected = _generate_task_token(task_id)
return hmac.compare_digest(expected, token)
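The two helpers above sign an identifier with the server's `SECRET_KEY` and truncate the HMAC-SHA256 digest to 32 hex characters. A standalone sketch of the same round trip, with a hypothetical secret standing in for Django's `settings.SECRET_KEY`:

```python
import hashlib
import hmac

SECRET_KEY = "example-secret"  # hypothetical; the app uses settings.SECRET_KEY

def generate_task_token(task_id: str) -> str:
    # First 32 hex chars of HMAC-SHA256(secret, task_id)
    return hmac.new(SECRET_KEY.encode(), task_id.encode(), hashlib.sha256).hexdigest()[:32]

def verify_task_token(task_id: str, token: str) -> bool:
    # compare_digest runs in constant time, avoiding timing side channels
    return hmac.compare_digest(generate_task_token(task_id), token)

token = generate_task_token("abc-123")
```

Because the token is derived from the id alone, any process holding the same secret can verify it without a database lookup.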
@api_view(["GET"])
@permission_classes([IsAdminUser])
def list_backups(request):
"""List all available backup files."""
try:
backups = services.list_backups()
return Response(backups, status=status.HTTP_200_OK)
except Exception as e:
return Response(
{"detail": f"Failed to list backups: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@api_view(["POST"])
@permission_classes([IsAdminUser])
def create_backup(request):
"""Create a new backup (async via Celery)."""
try:
task = create_backup_task.delay()
return Response(
{
"detail": "Backup started",
"task_id": task.id,
"task_token": _generate_task_token(task.id),
},
status=status.HTTP_202_ACCEPTED,
)
except Exception as e:
return Response(
{"detail": f"Failed to start backup: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@api_view(["GET"])
@permission_classes([AllowAny])
def backup_status(request, task_id):
"""Check the status of a backup/restore task.
Requires either:
- Valid admin authentication, OR
- Valid task_token query parameter
"""
# Check for token-based auth (for restore when session is invalidated)
token = request.query_params.get("token")
if token:
if not _verify_task_token(task_id, token):
return Response(
{"detail": "Invalid task token"},
status=status.HTTP_403_FORBIDDEN,
)
else:
# Fall back to admin auth check
if not request.user.is_authenticated or not request.user.is_staff:
return Response(
{"detail": "Authentication required"},
status=status.HTTP_401_UNAUTHORIZED,
)
try:
result = AsyncResult(task_id)
if result.ready():
task_result = result.get()
if task_result.get("status") == "completed":
return Response({
"state": "completed",
"result": task_result,
})
else:
return Response({
"state": "failed",
"error": task_result.get("error", "Unknown error"),
})
elif result.failed():
return Response({
"state": "failed",
"error": str(result.result),
})
else:
return Response({
"state": result.state.lower(),
})
except Exception as e:
return Response(
{"detail": f"Failed to get task status: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@api_view(["GET"])
@permission_classes([IsAdminUser])
def get_download_token(request, filename):
"""Get a signed token for downloading a backup file."""
try:
# Security: prevent path traversal
if ".." in filename or "/" in filename or "\\" in filename:
raise Http404("Invalid filename")
backup_dir = services.get_backup_dir()
backup_file = backup_dir / filename
if not backup_file.exists():
raise Http404("Backup file not found")
token = _generate_task_token(filename)
return Response({"token": token})
except Http404:
raise
except Exception as e:
return Response(
{"detail": f"Failed to generate token: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@api_view(["GET"])
@permission_classes([AllowAny])
def download_backup(request, filename):
"""Download a backup file.
Requires either:
- Valid admin authentication, OR
- Valid download_token query parameter
"""
# Check for token-based auth (avoids CORS preflight issues)
token = request.query_params.get("token")
if token:
if not _verify_task_token(filename, token):
return Response(
{"detail": "Invalid download token"},
status=status.HTTP_403_FORBIDDEN,
)
else:
# Fall back to admin auth check
if not request.user.is_authenticated or not request.user.is_staff:
return Response(
{"detail": "Authentication required"},
status=status.HTTP_401_UNAUTHORIZED,
)
try:
# Security: prevent path traversal by checking for suspicious characters
if ".." in filename or "/" in filename or "\\" in filename:
raise Http404("Invalid filename")
backup_dir = services.get_backup_dir()
backup_file = (backup_dir / filename).resolve()
# Security: ensure the resolved path is still within backup_dir
if not str(backup_file).startswith(str(backup_dir.resolve())):
raise Http404("Invalid filename")
if not backup_file.exists() or not backup_file.is_file():
raise Http404("Backup file not found")
file_size = backup_file.stat().st_size
# Use X-Accel-Redirect for nginx (AIO container) - nginx serves file directly
# Fall back to streaming for non-nginx deployments
use_nginx_accel = os.environ.get("USE_NGINX_ACCEL", "").lower() == "true"
logger.info(f"[DOWNLOAD] File: {filename}, Size: {file_size}, USE_NGINX_ACCEL: {use_nginx_accel}")
if use_nginx_accel:
# X-Accel-Redirect: Django returns immediately, nginx serves file
logger.info(f"[DOWNLOAD] Using X-Accel-Redirect: /protected-backups/{filename}")
response = HttpResponse()
response["X-Accel-Redirect"] = f"/protected-backups/{filename}"
response["Content-Type"] = "application/zip"
response["Content-Length"] = file_size
response["Content-Disposition"] = f'attachment; filename="{filename}"'
return response
else:
# Streaming fallback for non-nginx deployments
logger.info(f"[DOWNLOAD] Using streaming fallback (no nginx)")
def file_iterator(file_path, chunk_size=2 * 1024 * 1024):
with open(file_path, "rb") as f:
while chunk := f.read(chunk_size):
yield chunk
response = StreamingHttpResponse(
file_iterator(backup_file),
content_type="application/zip",
)
response["Content-Length"] = file_size
response["Content-Disposition"] = f'attachment; filename="{filename}"'
return response
except Http404:
raise
except Exception as e:
return Response(
{"detail": f"Download failed: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
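`download_backup` layers two defenses before serving anything: it rejects filenames containing traversal characters, then resolves the path and confirms it still lives inside the backup directory, finally streaming the file in 2 MiB chunks when nginx acceleration is off. A self-contained sketch of those two pieces (using `Path.relative_to` for the containment check, a variant of the view's string-prefix comparison):

```python
import tempfile
from pathlib import Path

def is_within(base: Path, candidate: Path) -> bool:
    # Resolve symlinks/.. segments, then check containment
    try:
        candidate.resolve().relative_to(base.resolve())
        return True
    except ValueError:
        return False

def iter_chunks(path, chunk_size=2 * 1024 * 1024):
    # Lazily yield the file in fixed-size chunks, as the streaming fallback does
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            yield chunk

base = Path(tempfile.mkdtemp())
f = base / "backup.zip"
f.write_bytes(b"x" * 5)
```

The resolve-then-compare step matters because a filename like `..%2Fetc` can survive naive character checks after URL decoding.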
@api_view(["DELETE"])
@permission_classes([IsAdminUser])
def delete_backup(request, filename):
"""Delete a backup file."""
try:
# Security: prevent path traversal
if ".." in filename or "/" in filename or "\\" in filename:
raise Http404("Invalid filename")
services.delete_backup(filename)
return Response(
{"detail": "Backup deleted successfully"},
status=status.HTTP_204_NO_CONTENT,
)
except FileNotFoundError:
raise Http404("Backup file not found")
except Exception as e:
return Response(
{"detail": f"Delete failed: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@api_view(["POST"])
@permission_classes([IsAdminUser])
@parser_classes([MultiPartParser, FormParser])
def upload_backup(request):
"""Upload a backup file for restoration."""
uploaded = request.FILES.get("file")
if not uploaded:
return Response(
{"detail": "No file uploaded"},
status=status.HTTP_400_BAD_REQUEST,
)
try:
backup_dir = services.get_backup_dir()
filename = uploaded.name or "uploaded-backup.zip"
# Ensure unique filename
backup_file = backup_dir / filename
counter = 1
while backup_file.exists():
name_parts = filename.rsplit(".", 1)
if len(name_parts) == 2:
backup_file = backup_dir / f"{name_parts[0]}-{counter}.{name_parts[1]}"
else:
backup_file = backup_dir / f"{filename}-{counter}"
counter += 1
# Save uploaded file
with backup_file.open("wb") as f:
for chunk in uploaded.chunks():
f.write(chunk)
return Response(
{
"detail": "Backup uploaded successfully",
"filename": backup_file.name,
},
status=status.HTTP_201_CREATED,
)
except Exception as e:
return Response(
{"detail": f"Upload failed: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
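The upload handler disambiguates name collisions by inserting `-1`, `-2`, … before the extension until a free name is found. The same loop extracted as a standalone helper for illustration (`unique_name` is a hypothetical name, not part of the app):

```python
import tempfile
from pathlib import Path

def unique_name(backup_dir: Path, filename: str) -> Path:
    # Append -1, -2, ... before the extension until the name is free
    backup_file = backup_dir / filename
    counter = 1
    while backup_file.exists():
        parts = filename.rsplit(".", 1)
        if len(parts) == 2:
            backup_file = backup_dir / f"{parts[0]}-{counter}.{parts[1]}"
        else:
            backup_file = backup_dir / f"{filename}-{counter}"
        counter += 1
    return backup_file

d = Path(tempfile.mkdtemp())
(d / "b.zip").touch()
(d / "b-1.zip").touch()
result = unique_name(d, "b.zip")
```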
@api_view(["POST"])
@permission_classes([IsAdminUser])
def restore_backup(request, filename):
"""Restore from a backup file (async via Celery). WARNING: This will flush the database!"""
try:
# Security: prevent path traversal
if ".." in filename or "/" in filename or "\\" in filename:
raise Http404("Invalid filename")
backup_dir = services.get_backup_dir()
backup_file = backup_dir / filename
if not backup_file.exists():
raise Http404("Backup file not found")
task = restore_backup_task.delay(filename)
return Response(
{
"detail": "Restore started",
"task_id": task.id,
"task_token": _generate_task_token(task.id),
},
status=status.HTTP_202_ACCEPTED,
)
except Http404:
raise
except Exception as e:
return Response(
{"detail": f"Failed to start restore: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@api_view(["GET"])
@permission_classes([IsAdminUser])
def get_schedule(request):
"""Get backup schedule settings."""
try:
settings = get_schedule_settings()
return Response(settings)
except Exception as e:
return Response(
{"detail": f"Failed to get schedule: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@api_view(["PUT"])
@permission_classes([IsAdminUser])
def update_schedule(request):
"""Update backup schedule settings."""
try:
settings = update_schedule_settings(request.data)
return Response(settings)
except ValueError as e:
return Response(
{"detail": str(e)},
status=status.HTTP_400_BAD_REQUEST,
)
except Exception as e:
return Response(
{"detail": f"Failed to update schedule: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)

apps/backups/apps.py Normal file (7 lines)

@@ -0,0 +1,7 @@
from django.apps import AppConfig
class BackupsConfig(AppConfig):
default_auto_field = "django.db.models.BigAutoField"
name = "apps.backups"
verbose_name = "Backups"


apps/backups/models.py Normal file (0 lines)

apps/backups/scheduler.py Normal file (202 lines)

@@ -0,0 +1,202 @@
import json
import logging
from django_celery_beat.models import PeriodicTask, CrontabSchedule
from core.models import CoreSettings
logger = logging.getLogger(__name__)
BACKUP_SCHEDULE_TASK_NAME = "backup-scheduled-task"
DEFAULTS = {
"schedule_enabled": True,
"schedule_frequency": "daily",
"schedule_time": "03:00",
"schedule_day_of_week": 0, # Sunday
"retention_count": 3,
"schedule_cron_expression": "",
}
def _get_backup_settings():
"""Get all backup settings from CoreSettings grouped JSON."""
try:
settings_obj = CoreSettings.objects.get(key="backup_settings")
return settings_obj.value if isinstance(settings_obj.value, dict) else DEFAULTS.copy()
except CoreSettings.DoesNotExist:
return DEFAULTS.copy()
def _update_backup_settings(updates: dict) -> None:
"""Update backup settings in the grouped JSON."""
obj, created = CoreSettings.objects.get_or_create(
key="backup_settings",
defaults={"name": "Backup Settings", "value": DEFAULTS.copy()}
)
current = obj.value if isinstance(obj.value, dict) else {}
current.update(updates)
obj.value = current
obj.save()
def get_schedule_settings() -> dict:
"""Get all backup schedule settings."""
settings = _get_backup_settings()
return {
"enabled": bool(settings.get("schedule_enabled", DEFAULTS["schedule_enabled"])),
"frequency": str(settings.get("schedule_frequency", DEFAULTS["schedule_frequency"])),
"time": str(settings.get("schedule_time", DEFAULTS["schedule_time"])),
"day_of_week": int(settings.get("schedule_day_of_week", DEFAULTS["schedule_day_of_week"])),
"retention_count": int(settings.get("retention_count", DEFAULTS["retention_count"])),
"cron_expression": str(settings.get("schedule_cron_expression", DEFAULTS["schedule_cron_expression"])),
}
def update_schedule_settings(data: dict) -> dict:
"""Update backup schedule settings and sync the PeriodicTask."""
# Validate
if "frequency" in data and data["frequency"] not in ("daily", "weekly"):
raise ValueError("frequency must be 'daily' or 'weekly'")
if "time" in data:
try:
hour, minute = data["time"].split(":")
int(hour)
int(minute)
except (ValueError, AttributeError):
raise ValueError("time must be in HH:MM format")
if "day_of_week" in data:
day = int(data["day_of_week"])
if day < 0 or day > 6:
raise ValueError("day_of_week must be 0-6 (Sunday-Saturday)")
if "retention_count" in data:
count = int(data["retention_count"])
if count < 0:
raise ValueError("retention_count must be >= 0")
# Update settings with proper key names
updates = {}
if "enabled" in data:
updates["schedule_enabled"] = bool(data["enabled"])
if "frequency" in data:
updates["schedule_frequency"] = str(data["frequency"])
if "time" in data:
updates["schedule_time"] = str(data["time"])
if "day_of_week" in data:
updates["schedule_day_of_week"] = int(data["day_of_week"])
if "retention_count" in data:
updates["retention_count"] = int(data["retention_count"])
if "cron_expression" in data:
updates["schedule_cron_expression"] = str(data["cron_expression"])
_update_backup_settings(updates)
# Sync the periodic task
_sync_periodic_task()
return get_schedule_settings()
def _sync_periodic_task() -> None:
"""Create, update, or delete the scheduled backup task based on settings."""
settings = get_schedule_settings()
if not settings["enabled"]:
# Delete the task if it exists
task = PeriodicTask.objects.filter(name=BACKUP_SCHEDULE_TASK_NAME).first()
if task:
old_crontab = task.crontab
task.delete()
_cleanup_orphaned_crontab(old_crontab)
logger.info("Backup schedule disabled, removed periodic task")
return
# Get old crontab before creating new one
old_crontab = None
try:
old_task = PeriodicTask.objects.get(name=BACKUP_SCHEDULE_TASK_NAME)
old_crontab = old_task.crontab
except PeriodicTask.DoesNotExist:
pass
# Check if using cron expression (advanced mode)
if settings["cron_expression"]:
# Parse cron expression: "minute hour day month weekday"
try:
parts = settings["cron_expression"].split()
if len(parts) != 5:
raise ValueError("Cron expression must have 5 parts: minute hour day month weekday")
minute, hour, day_of_month, month_of_year, day_of_week = parts
crontab, _ = CrontabSchedule.objects.get_or_create(
minute=minute,
hour=hour,
day_of_week=day_of_week,
day_of_month=day_of_month,
month_of_year=month_of_year,
timezone=CoreSettings.get_system_time_zone(),
)
except Exception as e:
logger.error(f"Invalid cron expression '{settings['cron_expression']}': {e}")
raise ValueError(f"Invalid cron expression: {e}")
else:
# Use simple frequency-based scheduling
# Parse time
hour, minute = settings["time"].split(":")
# Build crontab based on frequency
system_tz = CoreSettings.get_system_time_zone()
if settings["frequency"] == "daily":
crontab, _ = CrontabSchedule.objects.get_or_create(
minute=minute,
hour=hour,
day_of_week="*",
day_of_month="*",
month_of_year="*",
timezone=system_tz,
)
else: # weekly
crontab, _ = CrontabSchedule.objects.get_or_create(
minute=minute,
hour=hour,
day_of_week=str(settings["day_of_week"]),
day_of_month="*",
month_of_year="*",
timezone=system_tz,
)
# Create or update the periodic task
task, created = PeriodicTask.objects.update_or_create(
name=BACKUP_SCHEDULE_TASK_NAME,
defaults={
"task": "apps.backups.tasks.scheduled_backup_task",
"crontab": crontab,
"enabled": True,
"kwargs": json.dumps({"retention_count": settings["retention_count"]}),
},
)
# Clean up old crontab if it changed and is orphaned
if old_crontab and old_crontab.id != crontab.id:
_cleanup_orphaned_crontab(old_crontab)
action = "Created" if created else "Updated"
logger.info(f"{action} backup schedule: {settings['frequency']} at {settings['time']}")
def _cleanup_orphaned_crontab(crontab_schedule):
"""Delete old CrontabSchedule if no other tasks are using it."""
if crontab_schedule is None:
return
# Check if any other tasks are using this crontab
if PeriodicTask.objects.filter(crontab=crontab_schedule).exists():
logger.debug(f"CrontabSchedule {crontab_schedule.id} still in use, not deleting")
return
logger.debug(f"Cleaning up orphaned CrontabSchedule: {crontab_schedule.id}")
crontab_schedule.delete()
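In advanced mode the scheduler accepts a five-field cron expression (`minute hour day month weekday`) and raises `ValueError` for anything else. The parsing step in isolation (`parse_cron` is an illustrative helper, not part of the module):

```python
def parse_cron(expr: str) -> dict:
    # "minute hour day month weekday", the order the scheduler expects
    parts = expr.split()
    if len(parts) != 5:
        raise ValueError("Cron expression must have 5 parts: minute hour day month weekday")
    minute, hour, day_of_month, month_of_year, day_of_week = parts
    return {
        "minute": minute,
        "hour": hour,
        "day_of_month": day_of_month,
        "month_of_year": month_of_year,
        "day_of_week": day_of_week,
    }

# "At 03:00 every Sunday" in cron terms
fields = parse_cron("0 3 * * 0")
```

These field names map one-to-one onto the keyword arguments of django-celery-beat's `CrontabSchedule.objects.get_or_create` call above.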

apps/backups/services.py Normal file (350 lines)

@@ -0,0 +1,350 @@
import datetime
import json
import os
import shutil
import subprocess
import tempfile
from pathlib import Path
from zipfile import ZipFile, ZIP_DEFLATED
import logging
import pytz
from django.conf import settings
from core.models import CoreSettings
logger = logging.getLogger(__name__)
def get_backup_dir() -> Path:
"""Get the backup directory, creating it if necessary."""
backup_dir = Path(settings.BACKUP_ROOT)
backup_dir.mkdir(parents=True, exist_ok=True)
return backup_dir
def _is_postgresql() -> bool:
"""Check if we're using PostgreSQL."""
return settings.DATABASES["default"]["ENGINE"] == "django.db.backends.postgresql"
def _get_pg_env() -> dict:
"""Get environment variables for PostgreSQL commands."""
db_config = settings.DATABASES["default"]
env = os.environ.copy()
env["PGPASSWORD"] = db_config.get("PASSWORD", "")
return env
def _get_pg_args() -> list[str]:
"""Get common PostgreSQL command arguments."""
db_config = settings.DATABASES["default"]
return [
"-h", db_config.get("HOST", "localhost"),
"-p", str(db_config.get("PORT", 5432)),
"-U", db_config.get("USER", "postgres"),
"-d", db_config.get("NAME", "dispatcharr"),
]
def _dump_postgresql(output_file: Path) -> None:
"""Dump PostgreSQL database using pg_dump."""
logger.info("Dumping PostgreSQL database with pg_dump...")
cmd = [
"pg_dump",
*_get_pg_args(),
"-Fc", # Custom format for pg_restore
"-v", # Verbose
"-f", str(output_file),
]
result = subprocess.run(
cmd,
env=_get_pg_env(),
capture_output=True,
text=True,
)
if result.returncode != 0:
logger.error(f"pg_dump failed: {result.stderr}")
raise RuntimeError(f"pg_dump failed: {result.stderr}")
logger.debug(f"pg_dump output: {result.stderr}")
def _clean_postgresql_schema() -> None:
"""Drop and recreate the public schema to ensure a completely clean restore."""
logger.info("[PG_CLEAN] Dropping and recreating public schema...")
# Commands to drop and recreate schema
sql_commands = "DROP SCHEMA IF EXISTS public CASCADE; CREATE SCHEMA public; GRANT ALL ON SCHEMA public TO public;"
cmd = [
"psql",
*_get_pg_args(),
"-c", sql_commands,
]
result = subprocess.run(
cmd,
env=_get_pg_env(),
capture_output=True,
text=True,
)
if result.returncode != 0:
logger.error(f"[PG_CLEAN] Failed to clean schema: {result.stderr}")
raise RuntimeError(f"Failed to clean PostgreSQL schema: {result.stderr}")
logger.info("[PG_CLEAN] Schema cleaned successfully")
def _restore_postgresql(dump_file: Path) -> None:
"""Restore PostgreSQL database using pg_restore."""
logger.info("[PG_RESTORE] Starting pg_restore...")
logger.info(f"[PG_RESTORE] Dump file: {dump_file}")
# Drop and recreate schema to ensure a completely clean restore
_clean_postgresql_schema()
pg_args = _get_pg_args()
logger.info(f"[PG_RESTORE] Connection args: {pg_args}")
cmd = [
"pg_restore",
"--no-owner", # Skip ownership commands (we already created schema)
*pg_args,
"-v", # Verbose
str(dump_file),
]
logger.info(f"[PG_RESTORE] Running command: {' '.join(cmd)}")
result = subprocess.run(
cmd,
env=_get_pg_env(),
capture_output=True,
text=True,
)
logger.info(f"[PG_RESTORE] Return code: {result.returncode}")
# pg_restore may return non-zero even on partial success
# Check for actual errors vs warnings
if result.returncode != 0:
# Some errors during restore are expected (e.g., "does not exist" when cleaning)
# Only fail on critical errors
stderr = result.stderr.lower()
if "fatal" in stderr or "could not connect" in stderr:
logger.error(f"[PG_RESTORE] Failed critically: {result.stderr}")
raise RuntimeError(f"pg_restore failed: {result.stderr}")
else:
logger.warning(f"[PG_RESTORE] Completed with warnings: {result.stderr[:500]}...")
logger.info("[PG_RESTORE] Completed successfully")
def _dump_sqlite(output_file: Path) -> None:
"""Dump SQLite database using sqlite3 .backup command."""
logger.info("Dumping SQLite database with sqlite3 .backup...")
db_path = Path(settings.DATABASES["default"]["NAME"])
if not db_path.exists():
raise FileNotFoundError(f"SQLite database not found: {db_path}")
# Use sqlite3 .backup command via stdin for reliable execution
result = subprocess.run(
["sqlite3", str(db_path)],
input=f".backup '{output_file}'\n",
capture_output=True,
text=True,
)
if result.returncode != 0:
logger.error(f"sqlite3 backup failed: {result.stderr}")
raise RuntimeError(f"sqlite3 backup failed: {result.stderr}")
# Verify the backup file was created
if not output_file.exists():
raise RuntimeError("sqlite3 backup failed: output file not created")
logger.info(f"sqlite3 backup completed successfully: {output_file}")
def _restore_sqlite(dump_file: Path) -> None:
"""Restore SQLite database by replacing the database file."""
logger.info("Restoring SQLite database...")
db_path = Path(settings.DATABASES["default"]["NAME"])
backup_current = None
# Backup current database before overwriting
if db_path.exists():
backup_current = db_path.with_suffix(".db.bak")
shutil.copy2(db_path, backup_current)
logger.info(f"Backed up current database to {backup_current}")
# Ensure parent directory exists
db_path.parent.mkdir(parents=True, exist_ok=True)
# The backup file from _dump_sqlite is a complete SQLite database file
# We can simply copy it over the existing database
shutil.copy2(dump_file, db_path)
# Verify the restore worked by checking if sqlite3 can read it
result = subprocess.run(
["sqlite3", str(db_path)],
input=".tables\n",
capture_output=True,
text=True,
)
if result.returncode != 0:
logger.error(f"sqlite3 verification failed: {result.stderr}")
# Try to restore from backup
if backup_current and backup_current.exists():
shutil.copy2(backup_current, db_path)
logger.info("Restored original database from backup")
raise RuntimeError(f"sqlite3 restore verification failed: {result.stderr}")
logger.info("sqlite3 restore completed successfully")
def create_backup() -> Path:
"""
Create a backup archive containing database dump and data directories.
Returns the path to the created backup file.
"""
backup_dir = get_backup_dir()
# Use system timezone for filename (user-friendly), but keep internal timestamps as UTC
system_tz_name = CoreSettings.get_system_time_zone()
try:
system_tz = pytz.timezone(system_tz_name)
now_local = datetime.datetime.now(datetime.UTC).astimezone(system_tz)
timestamp = now_local.strftime("%Y.%m.%d.%H.%M.%S")
except Exception as e:
logger.warning(f"Failed to use system timezone {system_tz_name}: {e}, falling back to UTC")
timestamp = datetime.datetime.now(datetime.UTC).strftime("%Y.%m.%d.%H.%M.%S")
backup_name = f"dispatcharr-backup-{timestamp}.zip"
backup_file = backup_dir / backup_name
logger.info(f"Creating backup: {backup_name}")
with tempfile.TemporaryDirectory(prefix="dispatcharr-backup-") as temp_dir:
temp_path = Path(temp_dir)
# Determine database type and dump accordingly
if _is_postgresql():
db_dump_file = temp_path / "database.dump"
_dump_postgresql(db_dump_file)
db_type = "postgresql"
else:
db_dump_file = temp_path / "database.sqlite3"
_dump_sqlite(db_dump_file)
db_type = "sqlite"
# Create ZIP archive with compression and ZIP64 support for large files
with ZipFile(backup_file, "w", compression=ZIP_DEFLATED, allowZip64=True) as zip_file:
# Add database dump
zip_file.write(db_dump_file, db_dump_file.name)
# Add metadata
metadata = {
"format": "dispatcharr-backup",
"version": 2,
"database_type": db_type,
"database_file": db_dump_file.name,
"created_at": datetime.datetime.now(datetime.UTC).isoformat(),
}
zip_file.writestr("metadata.json", json.dumps(metadata, indent=2))
logger.info(f"Backup created successfully: {backup_file}")
return backup_file
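`create_backup` packages the database dump alongside a `metadata.json` describing the archive, which lets `restore_backup` validate format, version, and database type before touching anything. A minimal round trip of that archive layout, using a fake dump file in a temp directory:

```python
import json
import tempfile
from pathlib import Path
from zipfile import ZIP_DEFLATED, ZipFile

tmp = Path(tempfile.mkdtemp())
backup_file = tmp / "dispatcharr-backup-test.zip"
dump = tmp / "database.sqlite3"
dump.write_bytes(b"fake dump")  # stand-in for a real sqlite3 .backup output

metadata = {
    "format": "dispatcharr-backup",
    "version": 2,
    "database_type": "sqlite",
    "database_file": dump.name,
}

# Write: dump + metadata, compressed, with ZIP64 allowed for large dumps
with ZipFile(backup_file, "w", compression=ZIP_DEFLATED, allowZip64=True) as zf:
    zf.write(dump, dump.name)
    zf.writestr("metadata.json", json.dumps(metadata, indent=2))

# Read back what restore would consume
with ZipFile(backup_file) as zf:
    restored = json.loads(zf.read("metadata.json"))
```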
def restore_backup(backup_file: Path) -> None:
"""
Restore from a backup archive.
WARNING: This will overwrite the database!
"""
if not backup_file.exists():
raise FileNotFoundError(f"Backup file not found: {backup_file}")
logger.info(f"Restoring from backup: {backup_file}")
with tempfile.TemporaryDirectory(prefix="dispatcharr-restore-") as temp_dir:
temp_path = Path(temp_dir)
# Extract backup
logger.debug("Extracting backup archive...")
with ZipFile(backup_file, "r") as zip_file:
zip_file.extractall(temp_path)
# Read metadata
metadata_file = temp_path / "metadata.json"
if not metadata_file.exists():
raise ValueError("Invalid backup: missing metadata.json")
with open(metadata_file) as f:
metadata = json.load(f)
# Restore database
_restore_database(temp_path, metadata)
logger.info("Restore completed successfully")
def _restore_database(temp_path: Path, metadata: dict) -> None:
"""Restore database from backup."""
db_type = metadata.get("database_type", "postgresql")
db_file = metadata.get("database_file", "database.dump")
dump_file = temp_path / db_file
if not dump_file.exists():
raise ValueError(f"Invalid backup: missing {db_file}")
current_db_type = "postgresql" if _is_postgresql() else "sqlite"
if db_type != current_db_type:
raise ValueError(
f"Database type mismatch: backup is {db_type}, "
f"but current database is {current_db_type}"
)
if db_type == "postgresql":
_restore_postgresql(dump_file)
else:
_restore_sqlite(dump_file)
def list_backups() -> list[dict]:
"""List all available backup files with metadata."""
backup_dir = get_backup_dir()
backups = []
for backup_file in sorted(backup_dir.glob("dispatcharr-backup-*.zip"), reverse=True):
# Use UTC timezone so frontend can convert to user's local time
created_time = datetime.datetime.fromtimestamp(backup_file.stat().st_mtime, datetime.UTC)
backups.append({
"name": backup_file.name,
"size": backup_file.stat().st_size,
"created": created_time.isoformat(),
})
return backups
def delete_backup(filename: str) -> None:
"""Delete a backup file."""
backup_dir = get_backup_dir()
backup_file = backup_dir / filename
if not backup_file.exists():
raise FileNotFoundError(f"Backup file not found: {filename}")
if not backup_file.is_file():
raise ValueError(f"Invalid backup file: {filename}")
backup_file.unlink()
logger.info(f"Deleted backup: {filename}")

apps/backups/tasks.py Normal file (106 lines)

@@ -0,0 +1,106 @@
import logging
import traceback
from celery import shared_task
from . import services
logger = logging.getLogger(__name__)
def _cleanup_old_backups(retention_count: int) -> int:
"""Delete old backups, keeping only the most recent N. Returns count deleted."""
if retention_count <= 0:
return 0
backups = services.list_backups()
if len(backups) <= retention_count:
return 0
# Backups are sorted newest first, so delete from the end
to_delete = backups[retention_count:]
deleted = 0
for backup in to_delete:
try:
services.delete_backup(backup["name"])
deleted += 1
logger.info(f"[CLEANUP] Deleted old backup: {backup['name']}")
except Exception as e:
logger.error(f"[CLEANUP] Failed to delete {backup['name']}: {e}")
return deleted
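`_cleanup_old_backups` relies on `list_backups()` returning entries newest-first, so retention reduces to a list slice: keep the first N, delete the rest. The selection logic in isolation (`select_for_deletion` is a hypothetical helper):

```python
def select_for_deletion(backups: list[dict], retention_count: int) -> list[dict]:
    # Backups arrive newest-first; keep the first retention_count, drop the tail
    if retention_count <= 0 or len(backups) <= retention_count:
        return []
    return backups[retention_count:]

backups = [{"name": f"backup-{i}.zip"} for i in range(5)]  # index 0 = newest
doomed = select_for_deletion(backups, 3)
```

Note that `retention_count <= 0` means "keep everything" here, matching the early return in the task above.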
@shared_task(bind=True)
def create_backup_task(self):
"""Celery task to create a backup asynchronously."""
try:
logger.info(f"[BACKUP] Starting backup task {self.request.id}")
backup_file = services.create_backup()
logger.info(f"[BACKUP] Task {self.request.id} completed: {backup_file.name}")
return {
"status": "completed",
"filename": backup_file.name,
"size": backup_file.stat().st_size,
}
except Exception as e:
logger.error(f"[BACKUP] Task {self.request.id} failed: {str(e)}")
logger.error(f"[BACKUP] Traceback: {traceback.format_exc()}")
return {
"status": "failed",
"error": str(e),
}
@shared_task(bind=True)
def restore_backup_task(self, filename: str):
"""Celery task to restore a backup asynchronously."""
try:
logger.info(f"[RESTORE] Starting restore task {self.request.id} for {filename}")
backup_dir = services.get_backup_dir()
backup_file = backup_dir / filename
logger.info(f"[RESTORE] Backup file path: {backup_file}")
services.restore_backup(backup_file)
logger.info(f"[RESTORE] Task {self.request.id} completed successfully")
return {
"status": "completed",
"filename": filename,
}
except Exception as e:
logger.error(f"[RESTORE] Task {self.request.id} failed: {str(e)}")
logger.error(f"[RESTORE] Traceback: {traceback.format_exc()}")
return {
"status": "failed",
"error": str(e),
}
@shared_task(bind=True)
def scheduled_backup_task(self, retention_count: int = 0):
"""Celery task for scheduled backups with optional retention cleanup."""
try:
logger.info(f"[SCHEDULED] Starting scheduled backup task {self.request.id}")
# Create backup
backup_file = services.create_backup()
logger.info(f"[SCHEDULED] Backup created: {backup_file.name}")
# Cleanup old backups if retention is set
deleted = 0
if retention_count > 0:
deleted = _cleanup_old_backups(retention_count)
logger.info(f"[SCHEDULED] Cleanup complete, deleted {deleted} old backup(s)")
return {
"status": "completed",
"filename": backup_file.name,
"size": backup_file.stat().st_size,
"deleted_count": deleted,
}
except Exception as e:
logger.error(f"[SCHEDULED] Task {self.request.id} failed: {str(e)}")
logger.error(f"[SCHEDULED] Traceback: {traceback.format_exc()}")
return {
"status": "failed",
"error": str(e),
}

apps/backups/tests.py Normal file (1163 lines)

File diff suppressed because it is too large.


@@ -6,11 +6,21 @@ from .api_views import (
ChannelGroupViewSet,
BulkDeleteStreamsAPIView,
BulkDeleteChannelsAPIView,
BulkDeleteLogosAPIView,
CleanupUnusedLogosAPIView,
LogoViewSet,
ChannelProfileViewSet,
UpdateChannelMembershipAPIView,
BulkUpdateChannelMembershipAPIView,
RecordingViewSet,
RecurringRecordingRuleViewSet,
GetChannelStreamsAPIView,
SeriesRulesAPIView,
DeleteSeriesRuleAPIView,
EvaluateSeriesRulesAPIView,
BulkRemoveSeriesRecordingsAPIView,
BulkDeleteUpcomingRecordingsAPIView,
ComskipConfigAPIView,
)
app_name = 'channels' # for DRF routing
@ -22,13 +32,24 @@ router.register(r'channels', ChannelViewSet, basename='channel')
router.register(r'logos', LogoViewSet, basename='logo')
router.register(r'profiles', ChannelProfileViewSet, basename='profile')
router.register(r'recordings', RecordingViewSet, basename='recording')
router.register(r'recurring-rules', RecurringRecordingRuleViewSet, basename='recurring-rule')
urlpatterns = [
# Bulk delete is a single APIView, not a ViewSet
path('streams/bulk-delete/', BulkDeleteStreamsAPIView.as_view(), name='bulk_delete_streams'),
path('channels/bulk-delete/', BulkDeleteChannelsAPIView.as_view(), name='bulk_delete_channels'),
path('logos/bulk-delete/', BulkDeleteLogosAPIView.as_view(), name='bulk_delete_logos'),
path('logos/cleanup/', CleanupUnusedLogosAPIView.as_view(), name='cleanup_unused_logos'),
path('channels/<int:channel_id>/streams/', GetChannelStreamsAPIView.as_view(), name='get_channel_streams'),
path('profiles/<int:profile_id>/channels/<int:channel_id>/', UpdateChannelMembershipAPIView.as_view(), name='update_channel_membership'),
path('profiles/<int:profile_id>/channels/bulk-update/', BulkUpdateChannelMembershipAPIView.as_view(), name='bulk_update_channel_membership'),
# DVR series rules (order matters: specific routes before catch-all slug)
path('series-rules/', SeriesRulesAPIView.as_view(), name='series_rules'),
path('series-rules/evaluate/', EvaluateSeriesRulesAPIView.as_view(), name='evaluate_series_rules'),
path('series-rules/bulk-remove/', BulkRemoveSeriesRecordingsAPIView.as_view(), name='bulk_remove_series_recordings'),
path('series-rules/<path:tvg_id>/', DeleteSeriesRuleAPIView.as_view(), name='delete_series_rule'),
path('recordings/bulk-delete-upcoming/', BulkDeleteUpcomingRecordingsAPIView.as_view(), name='bulk_delete_upcoming_recordings'),
path('dvr/comskip-config/', ComskipConfigAPIView.as_view(), name='comskip_config'),
]
urlpatterns += router.urls
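The "order matters" comment above is because Django resolves URL patterns top to bottom: if the catch-all `series-rules/<path:tvg_id>/` route came first, it would swallow `series-rules/evaluate/` and `series-rules/bulk-remove/`, treating "evaluate" as a `tvg_id`. A framework-free sketch of first-match routing makes the failure mode concrete (the tiny resolver here is illustrative, not Django's actual implementation):

```python
import re

def resolve(patterns, path):
    """Return the name of the first pattern that matches,
    mimicking top-to-bottom URL resolution."""
    for regex, name in patterns:
        if re.fullmatch(regex, path):
            return name
    return None

# Specific routes before the catch-all, as in the urlpatterns above.
good_order = [
    (r"series-rules/evaluate/", "evaluate_series_rules"),
    (r"series-rules/(?P<tvg_id>.+)/", "delete_series_rule"),
]
# With the catch-all first, "evaluate/" is misread as a tvg_id.
bad_order = list(reversed(good_order))
```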

File diff suppressed because it is too large

@ -14,6 +14,13 @@ class ChannelGroupForm(forms.ModelForm):
# Channel Form
#
class ChannelForm(forms.ModelForm):
# Explicitly define channel_number as FloatField to ensure decimal values work
channel_number = forms.FloatField(
required=False,
widget=forms.NumberInput(attrs={'step': '0.1'}), # Allow decimal steps
help_text="Channel number can include decimals (e.g., 1.1, 2.5)"
)
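Because `channel_number` is a float, display code typically wants `1.1` to keep its decimal while `2.0` renders as `2`. A hedged formatting helper in that spirit (not part of the form above, just an illustration of handling decimal channel numbers):

```python
def format_channel_number(number):
    """Render a float channel number the way guides usually show it:
    drop a trailing .0 but keep real decimals like 1.1."""
    if number is None:
        return ""
    if float(number).is_integer():
        return str(int(number))
    # %g-style formatting avoids trailing zeros (1.10 -> "1.1")
    return f"{number:g}"
```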
channel_group = forms.ModelChoiceField(
queryset=ChannelGroup.objects.all(),
required=False,


@ -0,0 +1,38 @@
# Generated by Django 5.1.6 on 2025-04-18 16:21
from django.db import migrations, models
from django.db.models import Count
def remove_duplicate_channel_streams(apps, schema_editor):
ChannelStream = apps.get_model('dispatcharr_channels', 'ChannelStream')
# Find duplicates by (channel, stream)
duplicates = (
ChannelStream.objects
.values('channel', 'stream')
.annotate(count=Count('id'))
.filter(count__gt=1)
)
for dupe in duplicates:
# Get all duplicates for this pair
dups = ChannelStream.objects.filter(
channel=dupe['channel'],
stream=dupe['stream']
).order_by('id')
# Keep the first one, delete the rest
dups.exclude(id=dups.first().id).delete()
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0015_recording_custom_properties'),
]
operations = [
migrations.RunPython(remove_duplicate_channel_streams),
migrations.AddConstraint(
model_name='channelstream',
constraint=models.UniqueConstraint(fields=('channel', 'stream'), name='unique_channel_stream'),
),
]
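The data migration above must run before `AddConstraint`, since the unique constraint would fail while duplicate `(channel, stream)` rows exist. Its keep-first-by-id logic, sketched without the ORM:

```python
def duplicate_ids_to_delete(rows):
    """Given rows as (id, channel, stream) tuples, return the ids to
    delete: for each (channel, stream) pair, every row except the
    lowest id is a duplicate."""
    kept = {}
    to_delete = []
    for row_id, channel, stream in sorted(rows):  # ascending id first
        key = (channel, stream)
        if key in kept:
            to_delete.append(row_id)
        else:
            kept[key] = row_id
    return to_delete
```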


@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-04-21 20:47
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0016_channelstream_unique_channel_stream'),
]
operations = [
migrations.AlterField(
model_name='channelgroup',
name='name',
field=models.TextField(db_index=True, unique=True),
),
]


@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-04-27 14:12
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0017_alter_channelgroup_name'),
]
operations = [
migrations.AddField(
model_name='channelgroupm3uaccount',
name='custom_properties',
field=models.TextField(blank=True, null=True),
),
]


@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-05-04 00:02
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0018_channelgroupm3uaccount_custom_properties_and_more'),
]
operations = [
migrations.AddField(
model_name='channel',
name='tvc_guide_stationid',
field=models.CharField(blank=True, max_length=255, null=True),
),
]


@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-05-15 19:37
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0019_channel_tvc_guide_stationid'),
]
operations = [
migrations.AlterField(
model_name='channel',
name='channel_number',
field=models.FloatField(db_index=True),
),
]


@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-05-18 14:31
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0020_alter_channel_channel_number'),
]
operations = [
migrations.AddField(
model_name='channel',
name='user_level',
field=models.IntegerField(default=0),
),
]


@ -0,0 +1,35 @@
# Generated by Django 5.1.6 on 2025-07-13 23:08
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0021_channel_user_level'),
('m3u', '0012_alter_m3uaccount_refresh_interval'),
]
operations = [
migrations.AddField(
model_name='channel',
name='auto_created',
field=models.BooleanField(default=False, help_text='Whether this channel was automatically created via M3U auto channel sync'),
),
migrations.AddField(
model_name='channel',
name='auto_created_by',
field=models.ForeignKey(blank=True, help_text='The M3U account that auto-created this channel', null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='auto_created_channels', to='m3u.m3uaccount'),
),
migrations.AddField(
model_name='channelgroupm3uaccount',
name='auto_channel_sync',
field=models.BooleanField(default=False, help_text='Automatically create/delete channels to match streams in this group'),
),
migrations.AddField(
model_name='channelgroupm3uaccount',
name='auto_sync_channel_start',
field=models.FloatField(blank=True, help_text='Starting channel number for auto-created channels in this group', null=True),
),
]


@ -0,0 +1,23 @@
# Generated by Django 5.1.6 on 2025-07-29 02:39
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0022_channel_auto_created_channel_auto_created_by_and_more'),
]
operations = [
migrations.AddField(
model_name='stream',
name='stream_stats',
field=models.JSONField(blank=True, help_text='JSON object containing stream statistics like video codec, resolution, etc.', null=True),
),
migrations.AddField(
model_name='stream',
name='stream_stats_updated_at',
field=models.DateTimeField(blank=True, db_index=True, help_text='When stream statistics were last updated', null=True),
),
]


@ -0,0 +1,19 @@
# Generated by Django 5.2.4 on 2025-08-22 20:14
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0023_stream_stream_stats_stream_stream_stats_updated_at'),
]
operations = [
migrations.AlterField(
model_name='channelgroupm3uaccount',
name='channel_group',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='m3u_accounts', to='dispatcharr_channels.channelgroup'),
),
]


@ -0,0 +1,28 @@
# Generated by Django 5.2.4 on 2025-09-02 14:30
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0024_alter_channelgroupm3uaccount_channel_group'),
]
operations = [
migrations.AlterField(
model_name='channelgroupm3uaccount',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, null=True),
),
migrations.AlterField(
model_name='recording',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, null=True),
),
migrations.AlterField(
model_name='stream',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, null=True),
),
]


@ -0,0 +1,31 @@
# Generated by Django 5.0.14 on 2025-09-18 14:56
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0025_alter_channelgroupm3uaccount_custom_properties_and_more'),
]
operations = [
migrations.CreateModel(
name='RecurringRecordingRule',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('days_of_week', models.JSONField(default=list)),
('start_time', models.TimeField()),
('end_time', models.TimeField()),
('enabled', models.BooleanField(default=True)),
('name', models.CharField(blank=True, max_length=255)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('channel', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='recurring_rules', to='dispatcharr_channels.channel')),
],
options={
'ordering': ['channel', 'start_time'],
},
),
]


@ -0,0 +1,23 @@
# Generated by Django 5.2.4 on 2025-10-05 20:50
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0026_recurringrecordingrule'),
]
operations = [
migrations.AddField(
model_name='recurringrecordingrule',
name='end_date',
field=models.DateField(blank=True, null=True),
),
migrations.AddField(
model_name='recurringrecordingrule',
name='start_date',
field=models.DateField(blank=True, null=True),
),
]


@ -0,0 +1,25 @@
# Generated by Django 5.2.4 on 2025-10-06 22:55
import django.utils.timezone
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0027_recurringrecordingrule_end_date_and_more'),
]
operations = [
migrations.AddField(
model_name='channel',
name='created_at',
field=models.DateTimeField(auto_now_add=True, default=django.utils.timezone.now, help_text='Timestamp when this channel was created'),
preserve_default=False,
),
migrations.AddField(
model_name='channel',
name='updated_at',
field=models.DateTimeField(auto_now=True, help_text='Timestamp when this channel was last updated'),
),
]


@ -0,0 +1,54 @@
# Generated migration to backfill stream_hash for existing custom streams
from django.db import migrations
import hashlib
def backfill_custom_stream_hashes(apps, schema_editor):
"""
Generate stream_hash for all custom streams that don't have one.
Uses stream ID to create a stable hash that won't change when name/url is edited.
"""
Stream = apps.get_model('dispatcharr_channels', 'Stream')
custom_streams_without_hash = Stream.objects.filter(
is_custom=True,
stream_hash__isnull=True
)
updated_count = 0
for stream in custom_streams_without_hash:
# Generate a stable hash using the stream's ID
# This ensures the hash never changes even if name/url is edited
unique_string = f"custom_stream_{stream.id}"
stream.stream_hash = hashlib.sha256(unique_string.encode()).hexdigest()
stream.save(update_fields=['stream_hash'])
updated_count += 1
if updated_count > 0:
print(f"Backfilled stream_hash for {updated_count} custom streams")
else:
print("No custom streams needed stream_hash backfill")
def reverse_backfill(apps, schema_editor):
"""
Reverse migration - clear stream_hash for custom streams.
Note: This will break preview functionality for custom streams.
"""
Stream = apps.get_model('dispatcharr_channels', 'Stream')
custom_streams = Stream.objects.filter(is_custom=True)
count = custom_streams.update(stream_hash=None)
print(f"Cleared stream_hash for {count} custom streams")
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0028_channel_created_at_channel_updated_at'),
]
operations = [
migrations.RunPython(backfill_custom_stream_hashes, reverse_backfill),
]
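The backfill derives each custom stream's hash from its primary key alone, so editing the name or URL later never changes the hash. The core computation, extracted for illustration:

```python
import hashlib

def custom_stream_hash(stream_id):
    """Stable hash for a custom stream, derived only from its id,
    mirroring the backfill migration above."""
    unique_string = f"custom_stream_{stream_id}"
    return hashlib.sha256(unique_string.encode()).hexdigest()
```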


@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-10-28 20:00
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0029_backfill_custom_stream_hashes'),
]
operations = [
migrations.AlterField(
model_name='stream',
name='url',
field=models.URLField(blank=True, max_length=4096, null=True),
),
]


@ -0,0 +1,29 @@
# Generated by Django 5.2.9 on 2026-01-09 18:19
import datetime
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('dispatcharr_channels', '0030_alter_stream_url'),
]
operations = [
migrations.AddField(
model_name='channelgroupm3uaccount',
name='is_stale',
field=models.BooleanField(db_index=True, default=False, help_text='Whether this group relationship is stale (not seen in recent refresh, pending deletion)'),
),
migrations.AddField(
model_name='channelgroupm3uaccount',
name='last_seen',
field=models.DateTimeField(db_index=True, default=datetime.datetime.now, help_text='Last time this group was seen in the M3U source during a refresh'),
),
migrations.AddField(
model_name='stream',
name='is_stale',
field=models.BooleanField(db_index=True, default=False, help_text='Whether this stream is stale (not seen in recent refresh, pending deletion)'),
),
]
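The `is_stale`/`last_seen` pair enables a mark-and-sweep refresh: everything seen in the current M3U fetch gets a fresh `last_seen`, and anything whose `last_seen` predates the refresh start is flagged stale, pending deletion. A sketch of that pattern over plain dicts (the field names mirror the migration; the sweep logic itself is an assumption about how a refresh would use them):

```python
from datetime import datetime, timedelta

def sweep_stale(records, refresh_started_at):
    """Mark records whose last_seen predates the refresh as stale,
    and return the stale ones."""
    for rec in records:
        rec["is_stale"] = rec["last_seen"] < refresh_started_at
    return [r for r in records if r["is_stale"]]

now = datetime(2026, 1, 9, 18, 0)
records = [
    {"name": "seen", "last_seen": now, "is_stale": False},
    {"name": "gone", "last_seen": now - timedelta(days=2), "is_stale": False},
]
stale = sweep_stale(records, refresh_started_at=now - timedelta(hours=1))
```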


@ -1,6 +1,5 @@
from django.db import models
from django.core.exceptions import ValidationError
from core.models import StreamProfile
from django.conf import settings
from core.models import StreamProfile, CoreSettings
from core.utils import RedisClient
@ -10,12 +9,14 @@ from datetime import datetime
import hashlib
import json
from apps.epg.models import EPGData
from apps.accounts.models import User
logger = logging.getLogger(__name__)
# If you have an M3UAccount model in apps.m3u, you can still import it:
from apps.m3u.models import M3UAccount
# Add fallback functions if Redis isn't available
def get_total_viewers(channel_id):
"""Get viewer count from Redis or return 0 if Redis isn't available"""
@ -26,8 +27,9 @@ def get_total_viewers(channel_id):
except Exception:
return 0
class ChannelGroup(models.Model):
name = models.CharField(max_length=100, unique=True)
name = models.TextField(unique=True, db_index=True)
def related_channels(self):
# local import if needed to avoid cyc. Usually fine in a single file though
@ -46,12 +48,14 @@ class ChannelGroup(models.Model):
return created_objects
class Stream(models.Model):
"""
Represents a single stream (e.g. from an M3U source or custom URL).
"""
name = models.CharField(max_length=255, default="Default Stream")
url = models.URLField(max_length=2000, blank=True, null=True)
url = models.URLField(max_length=4096, blank=True, null=True)
m3u_account = models.ForeignKey(
M3UAccount,
on_delete=models.CASCADE,
@ -61,7 +65,7 @@ class Stream(models.Model):
)
logo_url = models.TextField(blank=True, null=True)
tvg_id = models.CharField(max_length=255, blank=True, null=True)
local_file = models.FileField(upload_to='uploads/', blank=True, null=True)
local_file = models.FileField(upload_to="uploads/", blank=True, null=True)
current_viewers = models.PositiveIntegerField(default=0)
updated_at = models.DateTimeField(auto_now=True)
channel_group = models.ForeignKey(
@ -69,18 +73,18 @@ class Stream(models.Model):
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='streams'
related_name="streams",
)
stream_profile = models.ForeignKey(
StreamProfile,
null=True,
blank=True,
on_delete=models.SET_NULL,
related_name='streams'
related_name="streams",
)
is_custom = models.BooleanField(
default=False,
help_text="Whether this is a user-created stream or from an M3U account"
help_text="Whether this is a user-created stream or from an M3U account",
)
stream_hash = models.CharField(
max_length=255,
@ -90,30 +94,48 @@ class Stream(models.Model):
db_index=True,
)
last_seen = models.DateTimeField(db_index=True, default=datetime.now)
custom_properties = models.TextField(null=True, blank=True)
is_stale = models.BooleanField(
default=False,
db_index=True,
help_text="Whether this stream is stale (not seen in recent refresh, pending deletion)"
)
custom_properties = models.JSONField(default=dict, blank=True, null=True)
# Stream statistics fields
stream_stats = models.JSONField(
null=True,
blank=True,
help_text="JSON object containing stream statistics like video codec, resolution, etc."
)
stream_stats_updated_at = models.DateTimeField(
null=True,
blank=True,
help_text="When stream statistics were last updated",
db_index=True
)
class Meta:
# If you use m3u_account, you might do unique_together = ('name','url','m3u_account')
verbose_name = "Stream"
verbose_name_plural = "Streams"
ordering = ['-updated_at']
ordering = ["-updated_at"]
def __str__(self):
return self.name or self.url or f"Stream ID {self.id}"
@classmethod
def generate_hash_key(cls, name, url, tvg_id, keys=None):
def generate_hash_key(cls, name, url, tvg_id, keys=None, m3u_id=None, group=None):
if keys is None:
keys = CoreSettings.get_m3u_hash_key().split(",")
stream_parts = {
"name": name, "url": url, "tvg_id": tvg_id
}
stream_parts = {"name": name, "url": url, "tvg_id": tvg_id, "m3u_id": m3u_id, "group": group}
hash_parts = {key: stream_parts[key] for key in keys if key in stream_parts}
# Serialize and hash the dictionary
serialized_obj = json.dumps(hash_parts, sort_keys=True) # sort_keys ensures consistent ordering
serialized_obj = json.dumps(
hash_parts, sort_keys=True
) # sort_keys ensures consistent ordering
hash_object = hashlib.sha256(serialized_obj.encode())
return hash_object.hexdigest()
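`generate_hash_key` hashes only the fields named in the configured hash key, and serializes with `sort_keys=True` so key order never affects the digest. The same idea in isolation:

```python
import hashlib
import json

def hash_selected_fields(parts, keys):
    """Hash only the chosen subset of fields, with deterministic
    JSON serialization so insertion order is irrelevant."""
    subset = {k: parts[k] for k in keys if k in parts}
    serialized = json.dumps(subset, sort_keys=True)
    return hashlib.sha256(serialized.encode()).hexdigest()

parts = {"name": "HBO", "url": "http://example/1", "tvg_id": "hbo.us"}
```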
@ -129,13 +151,23 @@ class Stream(models.Model):
return stream, False # False means it was updated, not created
except cls.DoesNotExist:
# If it doesn't exist, create a new object with the given hash
fields_to_update['stream_hash'] = hash_value # Make sure the hash field is set
fields_to_update["stream_hash"] = (
hash_value # Make sure the hash field is set
)
stream = cls.objects.create(**fields_to_update)
return stream, True # True means it was created
# @TODO: honor stream's stream profile
def get_stream_profile(self):
stream_profile = StreamProfile.objects.get(id=CoreSettings.get_default_stream_profile_id())
"""
Get the stream profile for this stream.
Uses the stream's own profile if set, otherwise returns the default.
"""
if self.stream_profile:
return self.stream_profile
stream_profile = StreamProfile.objects.get(
id=CoreSettings.get_default_stream_profile_id()
)
return stream_profile
@ -153,7 +185,9 @@ class Stream(models.Model):
m3u_account = self.m3u_account
m3u_profiles = m3u_account.profiles.all()
default_profile = next((obj for obj in m3u_profiles if obj.is_default), None)
profiles = [default_profile] + [obj for obj in m3u_profiles if not obj.is_default]
profiles = [default_profile] + [
obj for obj in m3u_profiles if not obj.is_default
]
for profile in profiles:
logger.info(profile)
@ -168,13 +202,19 @@ class Stream(models.Model):
if profile.max_streams == 0 or current_connections < profile.max_streams:
# Start a new stream
redis_client.set(f"channel_stream:{self.id}", self.id)
redis_client.set(f"stream_profile:{self.id}", profile.id) # Store only the matched profile
redis_client.set(
f"stream_profile:{self.id}", profile.id
) # Store only the matched profile
# Increment connection count for profiles with limits
if profile.max_streams > 0:
redis_client.incr(profile_connections_key)
return self.id, profile.id, None # Return newly assigned stream and matched profile
return (
self.id,
profile.id,
None,
) # Return newly assigned stream and matched profile
# 4. No available streams
return None, None, None
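The slot check above (`max_streams == 0` meaning unlimited) is the heart of profile selection: the default profile is tried first, then the others, and a counter is bumped only for capped profiles. Sketched against an in-memory counter, with a dict standing in for the Redis `profile_connections:<id>` keys:

```python
def pick_profile(profiles, connections):
    """Return the first profile with a free slot, or None.

    `profiles` is an iterable of (profile_id, max_streams) with the
    default profile first; `connections` maps profile_id -> current
    connection count (a stand-in for Redis).
    """
    for profile_id, max_streams in profiles:
        current = connections.get(profile_id, 0)
        if max_streams == 0 or current < max_streams:
            if max_streams > 0:
                # Only capped profiles track connection counts.
                connections[profile_id] = current + 1
            return profile_id
    return None
```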
@ -195,7 +235,9 @@ class Stream(models.Model):
redis_client.delete(f"stream_profile:{stream_id}") # Remove profile association
profile_id = int(profile_id)
logger.debug(f"Found profile ID {profile_id} associated with stream {stream_id}")
logger.debug(
f"Found profile ID {profile_id} associated with stream {stream_id}"
)
profile_connections_key = f"profile_connections:{profile_id}"
@ -204,45 +246,45 @@ class Stream(models.Model):
if current_count > 0:
redis_client.decr(profile_connections_key)
class ChannelManager(models.Manager):
def active(self):
return self.all()
class Channel(models.Model):
channel_number = models.IntegerField()
channel_number = models.FloatField(db_index=True)
name = models.CharField(max_length=255)
logo = models.ForeignKey(
'Logo',
"Logo",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='channels',
related_name="channels",
)
# M2M to Stream now in the same file
streams = models.ManyToManyField(
Stream,
blank=True,
through='ChannelStream',
related_name='channels'
Stream, blank=True, through="ChannelStream", related_name="channels"
)
channel_group = models.ForeignKey(
'ChannelGroup',
"ChannelGroup",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='channels',
help_text="Channel group this channel belongs to."
related_name="channels",
help_text="Channel group this channel belongs to.",
)
tvg_id = models.CharField(max_length=255, blank=True, null=True)
tvc_guide_stationid = models.CharField(max_length=255, blank=True, null=True)
epg_data = models.ForeignKey(
EPGData,
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='channels'
related_name="channels",
)
stream_profile = models.ForeignKey(
@ -250,16 +292,41 @@ class Channel(models.Model):
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name='channels'
related_name="channels",
)
uuid = models.UUIDField(default=uuid.uuid4, editable=False, unique=True, db_index=True)
uuid = models.UUIDField(
default=uuid.uuid4, editable=False, unique=True, db_index=True
)
user_level = models.IntegerField(default=0)
auto_created = models.BooleanField(
default=False,
help_text="Whether this channel was automatically created via M3U auto channel sync"
)
auto_created_by = models.ForeignKey(
"m3u.M3UAccount",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name="auto_created_channels",
help_text="The M3U account that auto-created this channel"
)
created_at = models.DateTimeField(
auto_now_add=True,
help_text="Timestamp when this channel was created"
)
updated_at = models.DateTimeField(
auto_now=True,
help_text="Timestamp when this channel was last updated"
)
def clean(self):
# Enforce unique channel_number within a given group
existing = Channel.objects.filter(
channel_number=self.channel_number,
channel_group=self.channel_group
channel_number=self.channel_number, channel_group=self.channel_group
).exclude(id=self.id)
if existing.exists():
raise ValidationError(
@ -271,7 +338,7 @@ class Channel(models.Model):
@classmethod
def get_next_available_channel_number(cls, starting_from=1):
used_numbers = set(cls.objects.all().values_list('channel_number', flat=True))
used_numbers = set(cls.objects.all().values_list("channel_number", flat=True))
n = starting_from
while n in used_numbers:
n += 1
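`get_next_available_channel_number` is a linear probe over the set of used numbers; the same logic as a standalone function:

```python
def next_available_number(used_numbers, starting_from=1):
    """Return the first number >= starting_from not already in use."""
    n = starting_from
    while n in used_numbers:
        n += 1
    return n
```

Note that since `channel_number` is now a float, membership compares by value (`1 in {1.1}` is false), so the probe yields whole numbers and only skips integers that are exactly taken.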
@ -281,7 +348,9 @@ class Channel(models.Model):
def get_stream_profile(self):
stream_profile = self.stream_profile
if not stream_profile:
stream_profile = StreamProfile.objects.get(id=CoreSettings.get_default_stream_profile_id())
stream_profile = StreamProfile.objects.get(
id=CoreSettings.get_default_stream_profile_id()
)
return stream_profile
@ -311,44 +380,55 @@ class Channel(models.Model):
profile_id = int(profile_id_bytes)
return stream_id, profile_id, None
except (ValueError, TypeError):
logger.debug(f"Invalid profile ID retrieved from Redis: {profile_id_bytes}")
logger.debug(
f"Invalid profile ID retrieved from Redis: {profile_id_bytes}"
)
except (ValueError, TypeError):
logger.debug(f"Invalid stream ID retrieved from Redis: {stream_id_bytes}")
logger.debug(
f"Invalid stream ID retrieved from Redis: {stream_id_bytes}"
)
# No existing active stream, attempt to assign a new one
has_streams_but_maxed_out = False
has_active_profiles = False
# Iterate through channel streams and their profiles
for stream in self.streams.all().order_by('channelstream__order'):
for stream in self.streams.all().order_by("channelstream__order"):
# Retrieve the M3U account associated with the stream.
m3u_account = stream.m3u_account
if not m3u_account:
logger.debug(f"Stream {stream.id} has no M3U account")
continue
m3u_profiles = m3u_account.profiles.all()
default_profile = next((obj for obj in m3u_profiles if obj.is_default), None)
if not default_profile:
logger.debug(f"M3U account {m3u_account.id} has no default profile")
if m3u_account.is_active == False:
logger.debug(f"M3U account {m3u_account.id} is inactive, skipping.")
continue
profiles = [default_profile] + [obj for obj in m3u_profiles if not obj.is_default]
m3u_profiles = m3u_account.profiles.filter(is_active=True)
default_profile = next(
(obj for obj in m3u_profiles if obj.is_default), None
)
if not default_profile:
logger.debug(f"M3U account {m3u_account.id} has no active default profile")
continue
profiles = [default_profile] + [
obj for obj in m3u_profiles if not obj.is_default
]
for profile in profiles:
# Skip inactive profiles
if not profile.is_active:
logger.debug(f"Skipping inactive profile {profile.id}")
continue
has_active_profiles = True
profile_connections_key = f"profile_connections:{profile.id}"
current_connections = int(redis_client.get(profile_connections_key) or 0)
current_connections = int(
redis_client.get(profile_connections_key) or 0
)
# Check if profile has available slots (or unlimited connections)
if profile.max_streams == 0 or current_connections < profile.max_streams:
if (
profile.max_streams == 0
or current_connections < profile.max_streams
):
# Start a new stream
redis_client.set(f"channel_stream:{self.id}", stream.id)
redis_client.set(f"stream_profile:{stream.id}", profile.id)
@ -357,17 +437,23 @@ class Channel(models.Model):
if profile.max_streams > 0:
redis_client.incr(profile_connections_key)
return stream.id, profile.id, None # Return newly assigned stream and matched profile
return (
stream.id,
profile.id,
None,
) # Return newly assigned stream and matched profile
else:
# This profile is at max connections
has_streams_but_maxed_out = True
logger.debug(f"Profile {profile.id} at max connections: {current_connections}/{profile.max_streams}")
logger.debug(
f"Profile {profile.id} at max connections: {current_connections}/{profile.max_streams}"
)
# No available streams - determine specific reason
if has_streams_but_maxed_out:
error_reason = "All M3U profiles have reached maximum connection limits"
error_reason = "All active M3U profiles have reached maximum connection limits"
elif has_active_profiles:
error_reason = "No compatible profile found for any assigned stream"
error_reason = "No compatible active profile found for any assigned stream"
else:
error_reason = "No active profiles found for any assigned stream"
@ -387,7 +473,9 @@ class Channel(models.Model):
redis_client.delete(f"channel_stream:{self.id}") # Remove active stream
stream_id = int(stream_id)
logger.debug(f"Found stream ID {stream_id} associated with channel stream {self.id}")
logger.debug(
f"Found stream ID {stream_id} associated with channel stream {self.id}"
)
# Get the matched profile for cleanup
profile_id = redis_client.get(f"stream_profile:{stream_id}")
@ -398,7 +486,9 @@ class Channel(models.Model):
redis_client.delete(f"stream_profile:{stream_id}") # Remove profile association
profile_id = int(profile_id)
logger.debug(f"Found profile ID {profile_id} associated with stream {stream_id}")
logger.debug(
f"Found profile ID {profile_id} associated with stream {stream_id}"
)
profile_connections_key = f"profile_connections:{profile_id}"
@ -407,17 +497,70 @@ class Channel(models.Model):
if current_count > 0:
redis_client.decr(profile_connections_key)
def update_stream_profile(self, new_profile_id):
"""
Updates the profile for the current stream and adjusts connection counts.
Args:
new_profile_id: The ID of the new stream profile to use
Returns:
bool: True if successful, False otherwise
"""
redis_client = RedisClient.get_client()
# Get current stream ID
stream_id_bytes = redis_client.get(f"channel_stream:{self.id}")
if not stream_id_bytes:
logger.debug("No active stream found for channel")
return False
stream_id = int(stream_id_bytes)
# Get current profile ID
current_profile_id_bytes = redis_client.get(f"stream_profile:{stream_id}")
if not current_profile_id_bytes:
logger.debug("No profile found for current stream")
return False
current_profile_id = int(current_profile_id_bytes)
# Don't do anything if the profile is already set to the requested one
if current_profile_id == new_profile_id:
return True
# Decrement connection count for old profile
old_profile_connections_key = f"profile_connections:{current_profile_id}"
old_count = int(redis_client.get(old_profile_connections_key) or 0)
if old_count > 0:
redis_client.decr(old_profile_connections_key)
# Update the profile mapping
redis_client.set(f"stream_profile:{stream_id}", new_profile_id)
# Increment connection count for new profile
new_profile_connections_key = f"profile_connections:{new_profile_id}"
redis_client.incr(new_profile_connections_key)
logger.info(
f"Updated stream {stream_id} profile from {current_profile_id} to {new_profile_id}"
)
return True
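`update_stream_profile` effectively moves one connection from the old profile's counter to the new one. The decrement-then-increment bookkeeping, again with a dict standing in for the Redis counters:

```python
def swap_profile(connections, old_profile_id, new_profile_id):
    """Move one connection count from old profile to new profile;
    a no-op when they are the same, and never decrements below zero."""
    if old_profile_id == new_profile_id:
        return
    if connections.get(old_profile_id, 0) > 0:
        connections[old_profile_id] -= 1
    connections[new_profile_id] = connections.get(new_profile_id, 0) + 1
```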
class ChannelProfile(models.Model):
name = models.CharField(max_length=100, unique=True)
class ChannelProfileMembership(models.Model):
channel_profile = models.ForeignKey(ChannelProfile, on_delete=models.CASCADE)
channel = models.ForeignKey(Channel, on_delete=models.CASCADE)
enabled = models.BooleanField(default=True) # Track if the channel is enabled for this group
enabled = models.BooleanField(
default=True
) # Track if the channel is enabled for this group
class Meta:
unique_together = ('channel_profile', 'channel')
unique_together = ("channel_profile", "channel")
class ChannelStream(models.Model):
channel = models.ForeignKey(Channel, on_delete=models.CASCADE)
@ -425,23 +568,45 @@ class ChannelStream(models.Model):
order = models.PositiveIntegerField(default=0) # Ordering field
class Meta:
ordering = ['order'] # Ensure streams are retrieved in order
ordering = ["order"] # Ensure streams are retrieved in order
constraints = [
models.UniqueConstraint(
fields=["channel", "stream"], name="unique_channel_stream"
)
]
class ChannelGroupM3UAccount(models.Model):
channel_group = models.ForeignKey(
ChannelGroup,
on_delete=models.CASCADE,
related_name='m3u_account'
ChannelGroup, on_delete=models.CASCADE, related_name="m3u_accounts"
)
m3u_account = models.ForeignKey(
M3UAccount,
on_delete=models.CASCADE,
related_name='channel_group'
M3UAccount, on_delete=models.CASCADE, related_name="channel_group"
)
custom_properties = models.JSONField(default=dict, blank=True, null=True)
enabled = models.BooleanField(default=True)
auto_channel_sync = models.BooleanField(
default=False,
help_text='Automatically create/delete channels to match streams in this group'
)
auto_sync_channel_start = models.FloatField(
null=True,
blank=True,
help_text='Starting channel number for auto-created channels in this group'
)
last_seen = models.DateTimeField(
default=datetime.now,
db_index=True,
help_text='Last time this group was seen in the M3U source during a refresh'
)
is_stale = models.BooleanField(
default=False,
db_index=True,
help_text='Whether this group relationship is stale (not seen in recent refresh, pending deletion)'
)
class Meta:
unique_together = ('channel_group', 'm3u_account')
unique_together = ("channel_group", "m3u_account")
def __str__(self):
return f"{self.channel_group.name} - {self.m3u_account.name} (Enabled: {self.enabled})"
@ -454,12 +619,47 @@ class Logo(models.Model):
def __str__(self):
return self.name
class Recording(models.Model):
channel = models.ForeignKey("Channel", on_delete=models.CASCADE, related_name="recordings")
channel = models.ForeignKey(
"Channel", on_delete=models.CASCADE, related_name="recordings"
)
start_time = models.DateTimeField()
end_time = models.DateTimeField()
task_id = models.CharField(max_length=255, null=True, blank=True)
custom_properties = models.TextField(null=True, blank=True)
custom_properties = models.JSONField(default=dict, blank=True, null=True)
def __str__(self):
return f"{self.channel.name} - {self.start_time} to {self.end_time}"
class RecurringRecordingRule(models.Model):
"""Rule describing a recurring manual DVR schedule."""
channel = models.ForeignKey(
"Channel",
on_delete=models.CASCADE,
related_name="recurring_rules",
)
days_of_week = models.JSONField(default=list)
start_time = models.TimeField()
end_time = models.TimeField()
enabled = models.BooleanField(default=True)
name = models.CharField(max_length=255, blank=True)
start_date = models.DateField(null=True, blank=True)
end_date = models.DateField(null=True, blank=True)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
class Meta:
ordering = ["channel", "start_time"]
def __str__(self):
channel_name = getattr(self.channel, "name", str(self.channel_id))
return f"Recurring rule for {channel_name}"
def cleaned_days(self):
try:
return sorted({int(d) for d in (self.days_of_week or []) if 0 <= int(d) <= 6})
except Exception:
return []
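The `cleaned_days` helper above normalizes whatever is stored in `days_of_week`: it dedupes, bounds-checks to 0-6, and sorts, falling back to an empty list on bad input. A standalone sketch of the same normalization (outside the model, for illustration only):

```python
def cleaned_days(days_of_week):
    # Mirror RecurringRecordingRule.cleaned_days: keep unique ints in 0..6, sorted.
    # Any non-numeric entry invalidates the whole list, matching the model's
    # broad exception handling.
    try:
        return sorted({int(d) for d in (days_of_week or []) if 0 <= int(d) <= 6})
    except (TypeError, ValueError):
        return []
```

Note that strings like `"1"` are accepted (coerced to int), while out-of-range values are silently dropped rather than raising.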


@@ -1,110 +1,234 @@
import json
from datetime import datetime
from rest_framework import serializers
from .models import Stream, Channel, ChannelGroup, ChannelStream, ChannelGroupM3UAccount, Logo, ChannelProfile, ChannelProfileMembership, Recording
from .models import (
Stream,
Channel,
ChannelGroup,
ChannelStream,
ChannelGroupM3UAccount,
Logo,
ChannelProfile,
ChannelProfileMembership,
Recording,
RecurringRecordingRule,
)
from apps.epg.serializers import EPGDataSerializer
from core.models import StreamProfile
from apps.epg.models import EPGData
from django.urls import reverse
from rest_framework import serializers
from django.utils import timezone
from core.utils import validate_flexible_url
class LogoSerializer(serializers.ModelSerializer):
cache_url = serializers.SerializerMethodField()
channel_count = serializers.SerializerMethodField()
is_used = serializers.SerializerMethodField()
channel_names = serializers.SerializerMethodField()
class Meta:
model = Logo
fields = ['id', 'name', 'url', 'cache_url']
fields = ["id", "name", "url", "cache_url", "channel_count", "is_used", "channel_names"]
def validate_url(self, value):
"""Validate that the URL is unique for creation or update"""
if self.instance and self.instance.url == value:
return value
if Logo.objects.filter(url=value).exists():
raise serializers.ValidationError("A logo with this URL already exists.")
return value
def create(self, validated_data):
"""Handle logo creation with proper URL validation"""
return Logo.objects.create(**validated_data)
def update(self, instance, validated_data):
"""Handle logo updates"""
for attr, value in validated_data.items():
setattr(instance, attr, value)
instance.save()
return instance
def get_cache_url(self, obj):
# return f"/api/channels/logos/{obj.id}/cache/"
request = self.context.get('request')
request = self.context.get("request")
if request:
return request.build_absolute_uri(reverse('api:channels:logo-cache', args=[obj.id]))
return reverse('api:channels:logo-cache', args=[obj.id])
return request.build_absolute_uri(
reverse("api:channels:logo-cache", args=[obj.id])
)
return reverse("api:channels:logo-cache", args=[obj.id])
def get_channel_count(self, obj):
"""Get the number of channels using this logo"""
return obj.channels.count()
def get_is_used(self, obj):
"""Check if this logo is used by any channels"""
return obj.channels.exists()
def get_channel_names(self, obj):
"""Get the names of channels using this logo (limited to first 5)"""
names = []
# Get channel names
channels = obj.channels.all()[:5]
for channel in channels:
names.append(f"Channel: {channel.name}")
# Calculate total count for "more" message
total_count = self.get_channel_count(obj)
if total_count > 5:
names.append(f"...and {total_count - 5} more")
return names
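`get_channel_names` caps the list at five entries and appends a summary line so the admin UI stays compact. The truncation logic in isolation (a sketch with plain strings standing in for channel objects):

```python
def summarize_names(names, limit=5):
    # Show at most `limit` entries, then an "...and N more" suffix,
    # mirroring LogoSerializer.get_channel_names.
    shown = [f"Channel: {n}" for n in names[:limit]]
    if len(names) > limit:
        shown.append(f"...and {len(names) - limit} more")
    return shown
```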
#
# Stream
#
class StreamSerializer(serializers.ModelSerializer):
url = serializers.CharField(
required=False,
allow_blank=True,
allow_null=True,
validators=[validate_flexible_url]
)
stream_profile_id = serializers.PrimaryKeyRelatedField(
queryset=StreamProfile.objects.all(),
source='stream_profile',
source="stream_profile",
allow_null=True,
required=False
required=False,
)
read_only_fields = ['is_custom', 'm3u_account', 'stream_hash']
read_only_fields = ["is_custom", "m3u_account", "stream_hash"]
class Meta:
model = Stream
fields = [
'id',
'name',
'url',
'm3u_account', # Uncomment if using M3U fields
'logo_url',
'tvg_id',
'local_file',
'current_viewers',
'updated_at',
'stream_profile_id',
'is_custom',
'channel_group',
'stream_hash',
"id",
"name",
"url",
"m3u_account", # Uncomment if using M3U fields
"logo_url",
"tvg_id",
"local_file",
"current_viewers",
"updated_at",
"last_seen",
"is_stale",
"stream_profile_id",
"is_custom",
"channel_group",
"stream_hash",
"stream_stats",
"stream_stats_updated_at",
]
def get_fields(self):
fields = super().get_fields()
# Unable to edit specific properties if this stream was created from an M3U account
if self.instance and getattr(self.instance, 'm3u_account', None) and not self.instance.is_custom:
fields['id'].read_only = True
fields['name'].read_only = True
fields['url'].read_only = True
fields['m3u_account'].read_only = True
fields['tvg_id'].read_only = True
fields['channel_group'].read_only = True
if (
self.instance
and getattr(self.instance, "m3u_account", None)
and not self.instance.is_custom
):
fields["id"].read_only = True
fields["name"].read_only = True
fields["url"].read_only = True
fields["m3u_account"].read_only = True
fields["tvg_id"].read_only = True
fields["channel_group"].read_only = True
return fields
class ChannelGroupM3UAccountSerializer(serializers.ModelSerializer):
m3u_accounts = serializers.IntegerField(source="m3u_accounts.id", read_only=True)
enabled = serializers.BooleanField()
auto_channel_sync = serializers.BooleanField(default=False)
auto_sync_channel_start = serializers.FloatField(allow_null=True, required=False)
custom_properties = serializers.JSONField(required=False)
class Meta:
model = ChannelGroupM3UAccount
fields = ["m3u_accounts", "channel_group", "enabled", "auto_channel_sync", "auto_sync_channel_start", "custom_properties", "is_stale", "last_seen"]
def to_representation(self, instance):
data = super().to_representation(instance)
custom_props = instance.custom_properties or {}
return data
def to_internal_value(self, data):
# Accept both dict and JSON string for custom_properties (for backward compatibility)
val = data.get("custom_properties")
if isinstance(val, str):
try:
data["custom_properties"] = json.loads(val)
except Exception:
pass
return super().to_internal_value(data)
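The `to_internal_value` override accepts `custom_properties` as either a dict or a JSON string for backward compatibility, leaving unparseable strings alone so DRF's own field validation reports the error. The coercion can be sketched in isolation:

```python
import json

def coerce_custom_properties(data):
    # Accept both dict and JSON-string forms of custom_properties.
    # Unparseable strings are left untouched for downstream validation.
    val = data.get("custom_properties")
    if isinstance(val, str):
        try:
            data["custom_properties"] = json.loads(val)
        except ValueError:
            pass
    return data
```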
#
# Channel Group
#
class ChannelGroupSerializer(serializers.ModelSerializer):
channel_count = serializers.SerializerMethodField()
m3u_account_count = serializers.SerializerMethodField()
m3u_accounts = ChannelGroupM3UAccountSerializer(
many=True,
read_only=True
)
class Meta:
model = ChannelGroup
fields = ['id', 'name']
fields = ["id", "name", "channel_count", "m3u_account_count", "m3u_accounts"]
def get_channel_count(self, obj):
"""Get count of channels in this group"""
return obj.channels.count()
def get_m3u_account_count(self, obj):
"""Get count of M3U accounts associated with this group"""
return obj.m3u_accounts.count()
class ChannelProfileSerializer(serializers.ModelSerializer):
channels = serializers.SerializerMethodField()
class Meta:
model = ChannelProfile
fields = ['id', 'name', 'channels']
fields = ["id", "name", "channels"]
def get_channels(self, obj):
memberships = ChannelProfileMembership.objects.filter(channel_profile=obj)
return [
{
'id': membership.channel.id,
'enabled': membership.enabled
}
for membership in memberships
]
memberships = ChannelProfileMembership.objects.filter(
channel_profile=obj, enabled=True
)
return [membership.channel.id for membership in memberships]
class ChannelProfileMembershipSerializer(serializers.ModelSerializer):
class Meta:
model = ChannelProfileMembership
fields = ['channel', 'enabled']
fields = ["channel", "enabled"]
class ChanneProfilelMembershipUpdateSerializer(serializers.Serializer):
channel_id = serializers.IntegerField() # Ensure channel_id is an integer
enabled = serializers.BooleanField()
class BulkChannelProfileMembershipSerializer(serializers.Serializer):
channels = serializers.ListField(
child=ChanneProfilelMembershipUpdateSerializer(), # Use the nested serializer
allow_empty=False
allow_empty=False,
)
def validate_channels(self, value):
@@ -112,142 +236,228 @@ class BulkChannelProfileMembershipSerializer(serializers.Serializer):
raise serializers.ValidationError("At least one channel must be provided.")
return value
#
# Channel
#
class ChannelSerializer(serializers.ModelSerializer):
# Show nested group data, or ID
channel_number = serializers.IntegerField(allow_null=True, required=False)
channel_group = ChannelGroupSerializer(read_only=True)
channel_group_id = serializers.PrimaryKeyRelatedField(
queryset=ChannelGroup.objects.all(),
source="channel_group",
write_only=True,
required=False
# Ensure channel_number is explicitly typed as FloatField and properly validated
channel_number = serializers.FloatField(
allow_null=True,
required=False,
error_messages={"invalid": "Channel number must be a valid decimal number."},
)
channel_group_id = serializers.PrimaryKeyRelatedField(
queryset=ChannelGroup.objects.all(), source="channel_group", required=False
)
epg_data = EPGDataSerializer(read_only=True)
epg_data_id = serializers.PrimaryKeyRelatedField(
queryset=EPGData.objects.all(),
source="epg_data",
write_only=True,
required=False,
allow_null=True,
)
stream_profile_id = serializers.PrimaryKeyRelatedField(
queryset=StreamProfile.objects.all(),
source='stream_profile',
source="stream_profile",
allow_null=True,
required=False,
)
streams = serializers.SerializerMethodField()
stream_ids = serializers.PrimaryKeyRelatedField(
queryset=Stream.objects.all(), many=True, write_only=True, required=False
streams = serializers.PrimaryKeyRelatedField(
queryset=Stream.objects.all(), many=True, required=False
)
logo = LogoSerializer(read_only=True)
logo_id = serializers.PrimaryKeyRelatedField(
queryset=Logo.objects.all(),
source='logo',
source="logo",
allow_null=True,
required=False,
write_only=True,
)
auto_created_by_name = serializers.SerializerMethodField()
class Meta:
model = Channel
fields = [
'id',
'channel_number',
'name',
'channel_group',
'channel_group_id',
'tvg_id',
'epg_data',
'epg_data_id',
'streams',
'stream_ids',
'stream_profile_id',
'uuid',
'logo',
'logo_id',
"id",
"channel_number",
"name",
"channel_group_id",
"tvg_id",
"tvc_guide_stationid",
"epg_data_id",
"streams",
"stream_profile_id",
"uuid",
"logo_id",
"user_level",
"auto_created",
"auto_created_by",
"auto_created_by_name",
]
def get_streams(self, obj):
"""Retrieve ordered stream objects for GET requests."""
ordered_streams = obj.streams.all().order_by('channelstream__order')
return StreamSerializer(ordered_streams, many=True).data
def to_representation(self, instance):
include_streams = self.context.get("include_streams", False)
if include_streams:
self.fields["streams"] = serializers.SerializerMethodField()
return super().to_representation(instance)
else:
# Fix: For PATCH/PUT responses, ensure streams are ordered
representation = super().to_representation(instance)
if "streams" in representation:
representation["streams"] = list(
instance.streams.all()
.order_by("channelstream__order")
.values_list("id", flat=True)
)
return representation
def get_logo(self, obj):
return LogoSerializer(obj.logo).data
# def get_stream_ids(self, obj):
# """Retrieve ordered stream IDs for GET requests."""
# return list(obj.streams.all().order_by('channelstream__order').values_list('id', flat=True))
def get_streams(self, obj):
"""Retrieve ordered stream IDs for GET requests."""
return StreamSerializer(
obj.streams.all().order_by("channelstream__order"), many=True
).data
def create(self, validated_data):
stream_ids = validated_data.pop('streams', [])
channel_number = validated_data.pop('channel_number', Channel.get_next_available_channel_number())
streams = validated_data.pop("streams", [])
channel_number = validated_data.pop(
"channel_number", Channel.get_next_available_channel_number()
)
validated_data["channel_number"] = channel_number
channel = Channel.objects.create(**validated_data)
# Add streams in the specified order
for index, stream_id in enumerate(stream_ids):
ChannelStream.objects.create(channel=channel, stream_id=stream_id, order=index)
for index, stream in enumerate(streams):
ChannelStream.objects.create(
channel=channel, stream_id=stream.id, order=index
)
return channel
def update(self, instance, validated_data):
streams = validated_data.pop('stream_ids', None)
streams = validated_data.pop("streams", None)
# Update the actual Channel fields
instance.channel_number = validated_data.get('channel_number', instance.channel_number)
instance.name = validated_data.get('name', instance.name)
instance.tvg_id = validated_data.get('tvg_id', instance.tvg_id)
instance.epg_data = validated_data.get('epg_data', None)
# If serializer allows changing channel_group or stream_profile:
if 'channel_group' in validated_data:
instance.channel_group = validated_data['channel_group']
if 'stream_profile' in validated_data:
instance.stream_profile = validated_data['stream_profile']
if 'logo' in validated_data:
instance.logo = validated_data['logo']
# Update standard fields
for attr, value in validated_data.items():
setattr(instance, attr, value)
instance.save()
# Handle the many-to-many 'streams'
if streams is not None:
# Clear existing relationships
instance.channelstream_set.all().delete()
# Add new streams in order
for index, stream in enumerate(streams):
print(f'Setting stream {stream.id} to index {index}')
ChannelStream.objects.create(channel=instance, stream_id=stream.id, order=index)
# Normalize stream IDs
normalized_ids = [
stream.id if hasattr(stream, "id") else stream for stream in streams
]
print(normalized_ids)
# Get current mapping of stream_id -> ChannelStream
current_links = {
cs.stream_id: cs for cs in instance.channelstream_set.all()
}
# Track existing stream IDs
existing_ids = set(current_links.keys())
new_ids = set(normalized_ids)
# Delete any links not in the new list
to_remove = existing_ids - new_ids
if to_remove:
instance.channelstream_set.filter(stream_id__in=to_remove).delete()
# Update or create with new order
for order, stream_id in enumerate(normalized_ids):
if stream_id in current_links:
cs = current_links[stream_id]
if cs.order != order:
cs.order = order
cs.save(update_fields=["order"])
else:
ChannelStream.objects.create(
channel=instance, stream_id=stream_id, order=order
)
return instance
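The rewritten `update` no longer deletes and recreates every `ChannelStream` link; it diffs the current mapping against the requested ID list and only deletes, reorders, or creates what actually changed. The same reconciliation on plain dicts (hypothetical stand-ins for the ORM rows):

```python
def reconcile_stream_links(current_links, new_ids):
    # current_links: {stream_id: order} as stored; new_ids: desired ordering.
    # Returns (ids to delete, {id: order} to re-save, {id: order} to create),
    # mirroring the diff-based logic in ChannelSerializer.update.
    to_delete = set(current_links) - set(new_ids)
    to_update, to_create = {}, {}
    for order, stream_id in enumerate(new_ids):
        if stream_id in current_links:
            if current_links[stream_id] != order:
                to_update[stream_id] = order
        else:
            to_create[stream_id] = order
    return to_delete, to_update, to_create
```

Only rows whose order actually changed are re-saved (`update_fields=["order"]` in the real code), which keeps bulk reorders cheap.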
class ChannelGroupM3UAccountSerializer(serializers.ModelSerializer):
enabled = serializers.BooleanField()
def validate_channel_number(self, value):
"""Ensure channel_number is properly processed as a float"""
if value is None:
return value
class Meta:
model = ChannelGroupM3UAccount
fields = ['id', 'channel_group', 'enabled']
try:
# Ensure it's processed as a float
return float(value)
except (ValueError, TypeError):
raise serializers.ValidationError(
"Channel number must be a valid decimal number."
)
# Optionally, if you only need the id of the ChannelGroup, you can customize it like this:
# channel_group = serializers.PrimaryKeyRelatedField(queryset=ChannelGroup.objects.all())
def validate_stream_profile(self, value):
"""Handle special case where empty/0 values mean 'use default' (null)"""
if value == "0" or value == 0 or value == "" or value is None:
return None
return value # PrimaryKeyRelatedField will handle the conversion to object
def get_auto_created_by_name(self, obj):
"""Get the name of the M3U account that auto-created this channel."""
if obj.auto_created_by:
return obj.auto_created_by.name
return None
class RecordingSerializer(serializers.ModelSerializer):
class Meta:
model = Recording
fields = '__all__'
read_only_fields = ['task_id']
fields = "__all__"
read_only_fields = ["task_id"]
def validate(self, data):
start_time = data.get('start_time')
end_time = data.get('end_time')
from core.models import CoreSettings
start_time = data.get("start_time")
end_time = data.get("end_time")
if start_time and timezone.is_naive(start_time):
start_time = timezone.make_aware(start_time, timezone.get_current_timezone())
data["start_time"] = start_time
if end_time and timezone.is_naive(end_time):
end_time = timezone.make_aware(end_time, timezone.get_current_timezone())
data["end_time"] = end_time
# If this is an EPG-based recording (program provided), apply global pre/post offsets
try:
cp = data.get("custom_properties") or {}
is_epg_based = isinstance(cp, dict) and isinstance(cp.get("program"), (dict,))
except Exception:
is_epg_based = False
if is_epg_based and start_time and end_time:
try:
pre_min = int(CoreSettings.get_dvr_pre_offset_minutes())
except Exception:
pre_min = 0
try:
post_min = int(CoreSettings.get_dvr_post_offset_minutes())
except Exception:
post_min = 0
from datetime import timedelta
try:
if pre_min and pre_min > 0:
start_time = start_time - timedelta(minutes=pre_min)
except Exception:
pass
try:
if post_min and post_min > 0:
end_time = end_time + timedelta(minutes=post_min)
except Exception:
pass
# write back adjusted times so scheduling uses them
data["start_time"] = start_time
data["end_time"] = end_time
now = timezone.now() # timezone-aware current time
@@ -256,8 +466,61 @@ class RecordingSerializer(serializers.ModelSerializer):
if start_time < now:
# Optional: Adjust start_time if it's in the past but end_time is in the future
data['start_time'] = now # or: timezone.now() + timedelta(seconds=1)
if end_time <= data['start_time']:
data["start_time"] = now # or: timezone.now() + timedelta(seconds=1)
if end_time <= data["start_time"]:
raise serializers.ValidationError("End time must be after start time.")
return data
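For EPG-based recordings, the validator above widens the window by the global DVR pre/post offsets before scheduling. A minimal sketch of just that adjustment (offset values here are illustrative, not the CoreSettings defaults):

```python
from datetime import datetime, timedelta, timezone

def apply_dvr_offsets(start_time, end_time, pre_min=0, post_min=0):
    # Widen the recording window: start pre_min minutes earlier and
    # end post_min minutes later; non-positive offsets are ignored.
    if pre_min and pre_min > 0:
        start_time -= timedelta(minutes=pre_min)
    if post_min and post_min > 0:
        end_time += timedelta(minutes=post_min)
    return start_time, end_time
```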
class RecurringRecordingRuleSerializer(serializers.ModelSerializer):
class Meta:
model = RecurringRecordingRule
fields = "__all__"
read_only_fields = ["created_at", "updated_at"]
def validate_days_of_week(self, value):
if not value:
raise serializers.ValidationError("Select at least one day of the week")
cleaned = []
for entry in value:
try:
iv = int(entry)
except (TypeError, ValueError):
raise serializers.ValidationError("Days of week must be integers 0-6")
if iv < 0 or iv > 6:
raise serializers.ValidationError("Days of week must be between 0 (Monday) and 6 (Sunday)")
cleaned.append(iv)
return sorted(set(cleaned))
def validate(self, attrs):
start = attrs.get("start_time") or getattr(self.instance, "start_time", None)
end = attrs.get("end_time") or getattr(self.instance, "end_time", None)
start_date = attrs.get("start_date") if "start_date" in attrs else getattr(self.instance, "start_date", None)
end_date = attrs.get("end_date") if "end_date" in attrs else getattr(self.instance, "end_date", None)
if start_date is None:
existing_start = getattr(self.instance, "start_date", None)
if existing_start is None:
raise serializers.ValidationError("Start date is required")
if start_date and end_date and end_date < start_date:
raise serializers.ValidationError("End date must be on or after start date")
if end_date is None:
existing_end = getattr(self.instance, "end_date", None)
if existing_end is None:
raise serializers.ValidationError("End date is required")
if start and end and start_date and end_date:
start_dt = datetime.combine(start_date, start)
end_dt = datetime.combine(end_date, end)
if end_dt <= start_dt:
raise serializers.ValidationError("End datetime must be after start datetime")
elif start and end and end == start:
raise serializers.ValidationError("End time must be different from start time")
# Normalize empty strings to None for dates
if attrs.get("end_date") == "":
attrs["end_date"] = None
if attrs.get("start_date") == "":
attrs["start_date"] = None
return super().validate(attrs)
def create(self, validated_data):
return super().create(validated_data)
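The cross-field validation combines the rule's dates and times to reject windows that end at or before they start. The core check, extracted for illustration:

```python
from datetime import date, datetime, time

def window_is_valid(start_date, start, end_date, end):
    # A recurring window is valid only if the combined end datetime
    # falls strictly after the combined start datetime.
    start_dt = datetime.combine(start_date, start)
    end_dt = datetime.combine(end_date, end)
    return end_dt > start_dt
```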

View file

@@ -8,7 +8,7 @@ from .models import Channel, Stream, ChannelProfile, ChannelProfileMembership, R
from apps.m3u.models import M3UAccount
from apps.epg.tasks import parse_programs_for_tvg_id
import logging, requests, time
from .tasks import run_recording
from .tasks import run_recording, prefetch_recording_artwork
from django.utils.timezone import now, is_aware, make_aware
from datetime import timedelta
@@ -45,19 +45,36 @@ def set_default_m3u_account(sender, instance, **kwargs):
else:
raise ValueError("No default M3UAccount found.")
@receiver(post_save, sender=Channel)
def refresh_epg_programs(sender, instance, created, **kwargs):
if instance.epg_data:
parse_programs_for_tvg_id.delay(instance.epg_data.id)
@receiver(post_save, sender=Stream)
def generate_custom_stream_hash(sender, instance, created, **kwargs):
"""
Generate a stable stream_hash for custom streams after creation.
Uses the stream's ID to ensure the hash never changes even if name/url is edited.
"""
if instance.is_custom and not instance.stream_hash and created:
import hashlib
# Use stream ID for a stable, unique hash that never changes
unique_string = f"custom_stream_{instance.id}"
instance.stream_hash = hashlib.sha256(unique_string.encode()).hexdigest()
# Use update to avoid triggering signals again
Stream.objects.filter(id=instance.id).update(stream_hash=instance.stream_hash)
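Because the hash is derived from the immutable primary key rather than the name or URL, renaming a custom stream never changes it. The derivation itself is a one-liner:

```python
import hashlib

def custom_stream_hash(stream_id):
    # Stable, unique hash keyed only on the stream's primary key,
    # as generated by the post_save signal for custom streams.
    return hashlib.sha256(f"custom_stream_{stream_id}".encode()).hexdigest()
```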
@receiver(post_save, sender=Channel)
def add_new_channel_to_groups(sender, instance, created, **kwargs):
if created:
profiles = ChannelProfile.objects.all()
ChannelProfileMembership.objects.bulk_create([
ChannelProfileMembership(channel_profile=profile, channel=instance)
for profile in profiles
])
def refresh_epg_programs(sender, instance, created, **kwargs):
"""
When a channel is saved, check if the EPG data has changed.
If so, trigger a refresh of the program data for the EPG.
"""
# Check if this is an update (not a new channel) and the epg_data has changed
if not created and kwargs.get('update_fields') and 'epg_data' in kwargs['update_fields']:
logger.info(f"Channel {instance.id} ({instance.name}) EPG data updated, refreshing program data")
if instance.epg_data:
logger.info(f"Triggering EPG program refresh for {instance.epg_data.tvg_id}")
parse_programs_for_tvg_id.delay(instance.epg_data.id)
# For new channels with EPG data, also refresh
elif created and instance.epg_data:
logger.info(f"New channel {instance.id} ({instance.name}) created with EPG data, refreshing program data")
parse_programs_for_tvg_id.delay(instance.epg_data.id)
@receiver(post_save, sender=ChannelProfile)
def create_profile_memberships(sender, instance, created, **kwargs):
@@ -70,8 +87,9 @@ def create_profile_memberships(sender, instance, created, **kwargs):
def schedule_recording_task(instance):
eta = instance.start_time
# Pass recording_id first so task can persist metadata to the correct row
task = run_recording.apply_async(
args=[instance.channel_id, str(instance.start_time), str(instance.end_time)],
args=[instance.id, instance.channel_id, str(instance.start_time), str(instance.end_time)],
eta=eta
)
return task.id
@@ -120,6 +138,11 @@ def schedule_task_on_save(sender, instance, created, **kwargs):
instance.save(update_fields=['task_id'])
else:
print("Start time is in the past. Not scheduling.")
# Kick off poster/artwork prefetch to enrich Upcoming cards
try:
prefetch_recording_artwork.apply_async(args=[instance.id], countdown=1)
except Exception as e:
print("Error scheduling artwork prefetch:", e)
except Exception as e:
import traceback
print("Error in post_save signal:", e)

File diff suppressed because it is too large



@@ -0,0 +1,211 @@
from django.test import TestCase
from django.contrib.auth import get_user_model
from rest_framework.test import APIClient
from rest_framework import status
from apps.channels.models import Channel, ChannelGroup
User = get_user_model()
class ChannelBulkEditAPITests(TestCase):
def setUp(self):
# Create a test admin user (user_level >= 10) and authenticate
self.user = User.objects.create_user(username="testuser", password="testpass123")
self.user.user_level = 10 # Set admin level
self.user.save()
self.client = APIClient()
self.client.force_authenticate(user=self.user)
self.bulk_edit_url = "/api/channels/channels/edit/bulk/"
# Create test channel group
self.group1 = ChannelGroup.objects.create(name="Test Group 1")
self.group2 = ChannelGroup.objects.create(name="Test Group 2")
# Create test channels
self.channel1 = Channel.objects.create(
channel_number=1.0,
name="Channel 1",
tvg_id="channel1",
channel_group=self.group1
)
self.channel2 = Channel.objects.create(
channel_number=2.0,
name="Channel 2",
tvg_id="channel2",
channel_group=self.group1
)
self.channel3 = Channel.objects.create(
channel_number=3.0,
name="Channel 3",
tvg_id="channel3"
)
def test_bulk_edit_success(self):
"""Test successful bulk update of multiple channels"""
data = [
{"id": self.channel1.id, "name": "Updated Channel 1"},
{"id": self.channel2.id, "name": "Updated Channel 2", "channel_number": 22.0},
]
response = self.client.patch(self.bulk_edit_url, data, format="json")
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data["message"], "Successfully updated 2 channels")
self.assertEqual(len(response.data["channels"]), 2)
# Verify database changes
self.channel1.refresh_from_db()
self.channel2.refresh_from_db()
self.assertEqual(self.channel1.name, "Updated Channel 1")
self.assertEqual(self.channel2.name, "Updated Channel 2")
self.assertEqual(self.channel2.channel_number, 22.0)
def test_bulk_edit_with_empty_validated_data_first(self):
"""
Test the bug fix: when first channel has empty validated_data.
This was causing: ValueError: Field names must be given to bulk_update()
"""
# Create a channel with data that will be "unchanged" (empty validated_data)
# We'll send the same data it already has
data = [
# First channel: no actual changes (this would create empty validated_data)
{"id": self.channel1.id},
# Second channel: has changes
{"id": self.channel2.id, "name": "Updated Channel 2"},
]
response = self.client.patch(self.bulk_edit_url, data, format="json")
# Should not crash with ValueError
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data["message"], "Successfully updated 2 channels")
# Verify the channel with changes was updated
self.channel2.refresh_from_db()
self.assertEqual(self.channel2.name, "Updated Channel 2")
def test_bulk_edit_all_empty_updates(self):
"""Test when all channels have empty updates (no actual changes)"""
data = [
{"id": self.channel1.id},
{"id": self.channel2.id},
]
response = self.client.patch(self.bulk_edit_url, data, format="json")
# Should succeed without calling bulk_update
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data["message"], "Successfully updated 2 channels")
def test_bulk_edit_mixed_fields(self):
"""Test bulk update where different channels update different fields"""
data = [
{"id": self.channel1.id, "name": "New Name 1"},
{"id": self.channel2.id, "channel_number": 99.0},
{"id": self.channel3.id, "tvg_id": "new_tvg_id", "name": "New Name 3"},
]
response = self.client.patch(self.bulk_edit_url, data, format="json")
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data["message"], "Successfully updated 3 channels")
# Verify all updates
self.channel1.refresh_from_db()
self.channel2.refresh_from_db()
self.channel3.refresh_from_db()
self.assertEqual(self.channel1.name, "New Name 1")
self.assertEqual(self.channel2.channel_number, 99.0)
self.assertEqual(self.channel3.tvg_id, "new_tvg_id")
self.assertEqual(self.channel3.name, "New Name 3")
def test_bulk_edit_with_channel_group(self):
"""Test bulk update with channel_group_id changes"""
data = [
{"id": self.channel1.id, "channel_group_id": self.group2.id},
{"id": self.channel3.id, "channel_group_id": self.group1.id},
]
response = self.client.patch(self.bulk_edit_url, data, format="json")
self.assertEqual(response.status_code, status.HTTP_200_OK)
# Verify group changes
self.channel1.refresh_from_db()
self.channel3.refresh_from_db()
self.assertEqual(self.channel1.channel_group, self.group2)
self.assertEqual(self.channel3.channel_group, self.group1)
def test_bulk_edit_nonexistent_channel(self):
"""Test bulk update with a channel that doesn't exist"""
nonexistent_id = 99999
data = [
{"id": nonexistent_id, "name": "Should Fail"},
{"id": self.channel1.id, "name": "Should Still Update"},
]
response = self.client.patch(self.bulk_edit_url, data, format="json")
# Should return 400 with errors
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertIn("errors", response.data)
self.assertEqual(len(response.data["errors"]), 1)
self.assertEqual(response.data["errors"][0]["channel_id"], nonexistent_id)
self.assertEqual(response.data["errors"][0]["error"], "Channel not found")
# The valid channel should still be updated
self.assertEqual(response.data["updated_count"], 1)
def test_bulk_edit_validation_error(self):
"""Test bulk update with invalid data (validation error)"""
data = [
{"id": self.channel1.id, "channel_number": "invalid_number"},
]
response = self.client.patch(self.bulk_edit_url, data, format="json")
# Should return 400 with validation errors
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertIn("errors", response.data)
self.assertEqual(len(response.data["errors"]), 1)
self.assertIn("channel_number", response.data["errors"][0]["errors"])
def test_bulk_edit_empty_channel_updates(self):
"""Test bulk update with empty list"""
data = []
response = self.client.patch(self.bulk_edit_url, data, format="json")
# Empty list is accepted and returns success with 0 updates
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data["message"], "Successfully updated 0 channels")
def test_bulk_edit_missing_channel_updates(self):
"""Test bulk update without proper format (dict instead of list)"""
data = {"channel_updates": {}}
response = self.client.patch(self.bulk_edit_url, data, format="json")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(response.data["error"], "Expected a list of channel updates")
def test_bulk_edit_preserves_other_fields(self):
"""Test that bulk update only changes specified fields"""
original_channel_number = self.channel1.channel_number
original_tvg_id = self.channel1.tvg_id
data = [
{"id": self.channel1.id, "name": "Only Name Changed"},
]
response = self.client.patch(self.bulk_edit_url, data, format="json")
self.assertEqual(response.status_code, status.HTTP_200_OK)
# Verify only name changed, other fields preserved
self.channel1.refresh_from_db()
self.assertEqual(self.channel1.name, "Only Name Changed")
self.assertEqual(self.channel1.channel_number, original_channel_number)
self.assertEqual(self.channel1.tvg_id, original_tvg_id)


@@ -0,0 +1,40 @@
from datetime import datetime, timedelta
from django.test import TestCase
from django.utils import timezone
from apps.channels.models import Channel, RecurringRecordingRule, Recording
from apps.channels.tasks import sync_recurring_rule_impl, purge_recurring_rule_impl
class RecurringRecordingRuleTasksTests(TestCase):
def test_sync_recurring_rule_creates_and_purges_recordings(self):
now = timezone.now()
channel = Channel.objects.create(channel_number=1, name='Test Channel')
start_time = (now + timedelta(minutes=15)).time().replace(second=0, microsecond=0)
end_time = (now + timedelta(minutes=75)).time().replace(second=0, microsecond=0)
rule = RecurringRecordingRule.objects.create(
channel=channel,
days_of_week=[now.weekday()],
start_time=start_time,
end_time=end_time,
)
created = sync_recurring_rule_impl(rule.id, drop_existing=True, horizon_days=1)
self.assertEqual(created, 1)
recording = Recording.objects.filter(custom_properties__rule__id=rule.id).first()
self.assertIsNotNone(recording)
self.assertEqual(recording.channel, channel)
self.assertEqual(recording.custom_properties.get('rule', {}).get('id'), rule.id)
expected_start = timezone.make_aware(
datetime.combine(recording.start_time.date(), start_time),
timezone.get_current_timezone(),
)
self.assertLess(abs((recording.start_time - expected_start).total_seconds()), 60)
removed = purge_recurring_rule_impl(rule.id)
self.assertEqual(removed, 1)
self.assertFalse(Recording.objects.filter(custom_properties__rule__id=rule.id).exists())
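The expected-start arithmetic in the test above (combine the recording's date with the rule's wall-clock start, attach a timezone, allow up to 60 seconds of drift) can be reproduced with the stdlib alone; `datetime.timezone.utc` stands in here for Django's current timezone.

```python
from datetime import datetime, time, timedelta, timezone

def expected_start(recording_date, wall_clock_start, tzinfo):
    # Mirrors timezone.make_aware(datetime.combine(date, start_time), tz)
    return datetime.combine(recording_date, wall_clock_start).replace(tzinfo=tzinfo)

def within_one_minute(actual, expected):
    # The test asserts the scheduled recording starts within 60s of the rule's slot
    return abs((actual - expected).total_seconds()) < 60

slot = expected_start(datetime(2026, 1, 18).date(), time(10, 30), timezone.utc)
recording_start = datetime(2026, 1, 18, 10, 30, 25, tzinfo=timezone.utc)
```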

@@ -2,47 +2,66 @@ import logging, os
from rest_framework import viewsets, status
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework.permissions import IsAuthenticated
from rest_framework.decorators import action
from drf_yasg.utils import swagger_auto_schema
from drf_yasg import openapi
from django.utils import timezone
from datetime import timedelta
from .models import EPGSource, ProgramData, EPGData # Added ProgramData
from .serializers import (
ProgramDataSerializer,
EPGSourceSerializer,
EPGDataSerializer,
) # Updated serializer
from .tasks import refresh_epg_data
from apps.accounts.permissions import (
Authenticated,
permission_classes_by_action,
permission_classes_by_method,
)
logger = logging.getLogger(__name__)
# ─────────────────────────────
# 1) EPG Source API (CRUD)
# ─────────────────────────────
class EPGSourceViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for EPG sources"""
"""
API endpoint that allows EPG sources to be viewed or edited.
"""
queryset = EPGSource.objects.all()
serializer_class = EPGSourceSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
def list(self, request, *args, **kwargs):
logger.debug("Listing all EPG sources.")
return super().list(request, *args, **kwargs)
@action(detail=False, methods=["post"])
def upload(self, request):
if "file" not in request.FILES:
return Response(
{"error": "No file uploaded"}, status=status.HTTP_400_BAD_REQUEST
)
file = request.FILES["file"]
file_name = file.name
file_path = os.path.join("/data/uploads/epgs", file_name)
os.makedirs(os.path.dirname(file_path), exist_ok=True)
with open(file_path, "wb+") as destination:
for chunk in file.chunks():
destination.write(chunk)
new_obj_data = request.data.copy()
new_obj_data["file_path"] = file_path
serializer = self.get_serializer(data=new_obj_data)
serializer.is_valid(raise_exception=True)
@@ -50,47 +69,293 @@ class EPGSourceViewSet(viewsets.ModelViewSet):
return Response(serializer.data, status=status.HTTP_201_CREATED)
def partial_update(self, request, *args, **kwargs):
"""Handle partial updates with special logic for is_active field"""
instance = self.get_object()
# Check if we're toggling is_active
if (
"is_active" in request.data
and instance.is_active != request.data["is_active"]
):
# Set appropriate status based on new is_active value
if request.data["is_active"]:
request.data["status"] = "idle"
else:
request.data["status"] = "disabled"
# Continue with regular partial update
return super().partial_update(request, *args, **kwargs)
# ─────────────────────────────
# 2) Program API (CRUD)
# ─────────────────────────────
class ProgramViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for EPG programs"""
queryset = ProgramData.objects.all()
serializer_class = ProgramDataSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
def list(self, request, *args, **kwargs):
logger.debug("Listing all EPG programs.")
return super().list(request, *args, **kwargs)
# ─────────────────────────────
# 3) EPG Grid View
# ─────────────────────────────
class EPGGridAPIView(APIView):
"""Returns all programs airing in the next 24 hours including currently running ones and recent ones"""
def get_permissions(self):
try:
return [
perm() for perm in permission_classes_by_method[self.request.method]
]
except KeyError:
return [Authenticated()]
@swagger_auto_schema(
operation_description="Retrieve programs from the previous hour, currently running and upcoming for the next 24 hours",
responses={200: ProgramDataSerializer(many=True)},
)
def get(self, request, format=None):
# Use current time instead of midnight
now = timezone.now()
one_hour_ago = now - timedelta(hours=1)
twenty_four_hours_later = now + timedelta(hours=24)
logger.debug(f"EPGGridAPIView: Querying programs between {one_hour_ago} and {twenty_four_hours_later}.")
logger.debug(
f"EPGGridAPIView: Querying programs between {one_hour_ago} and {twenty_four_hours_later}."
)
# Use select_related to prefetch EPGData and include programs from the last hour
programs = ProgramData.objects.select_related("epg").filter(
# Programs that end after one hour ago (includes recently ended programs)
end_time__gt=one_hour_ago,
# AND start before the end time window
start_time__lt=twenty_four_hours_later,
)
count = programs.count()
logger.debug(f"EPGGridAPIView: Found {count} program(s), including recently ended, currently running, and upcoming shows.")
serializer = ProgramDataSerializer(programs, many=True)
return Response({'data': serializer.data}, status=status.HTTP_200_OK)
logger.debug(
f"EPGGridAPIView: Found {count} program(s), including recently ended, currently running, and upcoming shows."
)
# Generate dummy programs for channels that have no EPG data OR dummy EPG sources
from apps.channels.models import Channel
from apps.epg.models import EPGSource
from django.db.models import Q
# Get channels with no EPG data at all (standard dummy)
channels_without_epg = Channel.objects.filter(Q(epg_data__isnull=True))
# Get channels with custom dummy EPG sources (generate on-demand with patterns)
channels_with_custom_dummy = Channel.objects.filter(
epg_data__epg_source__source_type='dummy'
).distinct()
# Log what we found
without_count = channels_without_epg.count()
custom_count = channels_with_custom_dummy.count()
if without_count > 0:
channel_names = [f"{ch.name} (ID: {ch.id})" for ch in channels_without_epg]
logger.debug(
f"EPGGridAPIView: Channels needing standard dummy EPG: {', '.join(channel_names)}"
)
if custom_count > 0:
channel_names = [f"{ch.name} (ID: {ch.id})" for ch in channels_with_custom_dummy]
logger.debug(
f"EPGGridAPIView: Channels needing custom dummy EPG: {', '.join(channel_names)}"
)
logger.debug(
f"EPGGridAPIView: Found {without_count} channels needing standard dummy, {custom_count} needing custom dummy EPG."
)
# Serialize the regular programs
serialized_programs = ProgramDataSerializer(programs, many=True).data
# Humorous program descriptions based on time of day - same as in output/views.py
time_descriptions = {
(0, 4): [
"Late Night with {channel} - Where insomniacs unite!",
"The 'Why Am I Still Awake?' Show on {channel}",
"Counting Sheep - A {channel} production for the sleepless",
],
(4, 8): [
"Dawn Patrol - Rise and shine with {channel}!",
"Early Bird Special - Coffee not included",
"Morning Zombies - Before coffee viewing on {channel}",
],
(8, 12): [
"Mid-Morning Meetings - Pretend you're paying attention while watching {channel}",
"The 'I Should Be Working' Hour on {channel}",
"Productivity Killer - {channel}'s daytime programming",
],
(12, 16): [
"Lunchtime Laziness with {channel}",
"The Afternoon Slump - Brought to you by {channel}",
"Post-Lunch Food Coma Theater on {channel}",
],
(16, 20): [
"Rush Hour - {channel}'s alternative to traffic",
"The 'What's For Dinner?' Debate on {channel}",
"Evening Escapism - {channel}'s remedy for reality",
],
(20, 24): [
"Prime Time Placeholder - {channel}'s finest not-programming",
"The 'Netflix Was Too Complicated' Show on {channel}",
"Family Argument Avoider - Courtesy of {channel}",
],
}
# Generate and append dummy programs
dummy_programs = []
# Import the function from output.views
from apps.output.views import generate_dummy_programs as gen_dummy_progs
# Handle channels with CUSTOM dummy EPG sources (with patterns)
for channel in channels_with_custom_dummy:
# For dummy EPGs, ALWAYS use channel UUID to ensure unique programs per channel
# This prevents multiple channels assigned to the same dummy EPG from showing identical data
# Each channel gets its own unique program data even if they share the same EPG source
dummy_tvg_id = str(channel.uuid)
try:
# Get the custom dummy EPG source
epg_source = channel.epg_data.epg_source if channel.epg_data else None
logger.debug(f"Generating custom dummy programs for channel: {channel.name} (ID: {channel.id})")
# Determine which name to parse based on custom properties
name_to_parse = channel.name
if epg_source and epg_source.custom_properties:
custom_props = epg_source.custom_properties
name_source = custom_props.get('name_source')
if name_source == 'stream':
# Get the stream index (1-based from user, convert to 0-based)
stream_index = custom_props.get('stream_index', 1) - 1
# Get streams ordered by channelstream order
channel_streams = channel.streams.all().order_by('channelstream__order')
if channel_streams.exists() and 0 <= stream_index < channel_streams.count():
stream = list(channel_streams)[stream_index]
name_to_parse = stream.name
logger.debug(f"Using stream name for parsing: {name_to_parse} (stream index: {stream_index})")
else:
logger.warning(f"Stream index {stream_index} not found for channel {channel.name}, falling back to channel name")
elif name_source == 'channel':
logger.debug(f"Using channel name for parsing: {name_to_parse}")
# Generate programs using custom patterns from the dummy EPG source
# Use the same tvg_id that will be set in the program data
generated = gen_dummy_progs(
channel_id=dummy_tvg_id,
channel_name=name_to_parse,
num_days=1,
program_length_hours=4,
epg_source=epg_source
)
# Custom dummy should always return data (either from patterns or fallback)
if generated:
logger.debug(f"Generated {len(generated)} custom dummy programs for {channel.name}")
# Convert generated programs to API format
for program in generated:
dummy_program = {
"id": f"dummy-custom-{channel.id}-{program['start_time'].hour}",
"epg": {"tvg_id": dummy_tvg_id, "name": channel.name},
"start_time": program['start_time'].isoformat(),
"end_time": program['end_time'].isoformat(),
"title": program['title'],
"description": program['description'],
"tvg_id": dummy_tvg_id,
"sub_title": None,
"custom_properties": None,
}
dummy_programs.append(dummy_program)
else:
logger.warning(f"No programs generated for custom dummy EPG channel: {channel.name}")
except Exception as e:
logger.error(
f"Error creating custom dummy programs for channel {channel.name} (ID: {channel.id}): {str(e)}"
)
# Handle channels with NO EPG data (standard dummy with humorous descriptions)
for channel in channels_without_epg:
# For channels with no EPG, use UUID to ensure uniqueness (matches frontend logic)
# The frontend uses: tvgRecord?.tvg_id ?? channel.uuid
# Since there's no EPG data, it will fall back to UUID
dummy_tvg_id = str(channel.uuid)
try:
logger.debug(f"Generating standard dummy programs for channel: {channel.name} (ID: {channel.id})")
# Create programs every 4 hours for the next 24 hours with humorous descriptions
for hour_offset in range(0, 24, 4):
# Use timedelta for time arithmetic instead of replace() to avoid hour overflow
start_time = now + timedelta(hours=hour_offset)
# Set minutes/seconds to zero for clean time blocks
start_time = start_time.replace(minute=0, second=0, microsecond=0)
end_time = start_time + timedelta(hours=4)
# Get the hour for selecting a description
hour = start_time.hour
day = 0 # Use 0 as we're only doing 1 day
# Find the appropriate time slot for description
for time_range, descriptions in time_descriptions.items():
start_range, end_range = time_range
if start_range <= hour < end_range:
# Pick a description using the sum of the hour and day as seed
# This makes it somewhat random but consistent for the same timeslot
description = descriptions[
(hour + day) % len(descriptions)
].format(channel=channel.name)
break
else:
# Fallback description if somehow no range matches
description = f"Placeholder program for {channel.name} - EPG data went on vacation"
# Create a dummy program in the same format as regular programs
dummy_program = {
"id": f"dummy-standard-{channel.id}-{hour_offset}",
"epg": {"tvg_id": dummy_tvg_id, "name": channel.name},
"start_time": start_time.isoformat(),
"end_time": end_time.isoformat(),
"title": f"{channel.name}",
"description": description,
"tvg_id": dummy_tvg_id,
"sub_title": None,
"custom_properties": None,
}
dummy_programs.append(dummy_program)
except Exception as e:
logger.error(
f"Error creating standard dummy programs for channel {channel.name} (ID: {channel.id}): {str(e)}"
)
# Combine regular and dummy programs
all_programs = list(serialized_programs) + dummy_programs
logger.debug(
f"EPGGridAPIView: Returning {len(all_programs)} total programs (including {len(dummy_programs)} dummy programs)."
)
return Response({"data": all_programs}, status=status.HTTP_200_OK)
# ─────────────────────────────
# 4) EPG Import View
@@ -98,15 +363,41 @@ class EPGGridAPIView(APIView):
class EPGImportAPIView(APIView):
"""Triggers an EPG data refresh"""
def get_permissions(self):
try:
return [
perm() for perm in permission_classes_by_method[self.request.method]
]
except KeyError:
return [Authenticated()]
@swagger_auto_schema(
operation_description="Triggers an EPG data import",
responses={202: "EPG data import initiated"}
responses={202: "EPG data import initiated"},
)
def post(self, request, format=None):
logger.info("EPGImportAPIView: Received request to import EPG data.")
epg_id = request.data.get("id", None)
# Check if this is a dummy EPG source
try:
from .models import EPGSource
epg_source = EPGSource.objects.get(id=epg_id)
if epg_source.source_type == 'dummy':
logger.info(f"EPGImportAPIView: Skipping refresh for dummy EPG source {epg_id}")
return Response(
{"success": False, "message": "Dummy EPG sources do not require refreshing."},
status=status.HTTP_400_BAD_REQUEST,
)
except EPGSource.DoesNotExist:
pass # Let the task handle the missing source
refresh_epg_data.delay(epg_id) # Trigger Celery task
logger.info("EPGImportAPIView: Task dispatched to refresh EPG data.")
return Response(
{"success": True, "message": "EPG data import initiated."},
status=status.HTTP_202_ACCEPTED,
)
# ─────────────────────────────
@@ -116,6 +407,13 @@ class EPGDataViewSet(viewsets.ReadOnlyModelViewSet):
"""
API endpoint that allows EPGData objects to be viewed.
"""
queryset = EPGData.objects.all()
serializer_class = EPGDataSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
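The dummy-schedule filler in `EPGGridAPIView` above picks a description per 4-hour block with `(hour + day) % len(descriptions)`, so a given timeslot always maps to the same line. A trimmed, standalone sketch of that selection (bucket contents abbreviated):

```python
TIME_DESCRIPTIONS = {
    (0, 4): ["Late Night with {channel} - Where insomniacs unite!"],
    (4, 8): ["Dawn Patrol - Rise and shine with {channel}!"],
    (8, 12): ["The 'I Should Be Working' Hour on {channel}"],
    (12, 16): ["Lunchtime Laziness with {channel}"],
    (16, 20): ["Rush Hour - {channel}'s alternative to traffic"],
    (20, 24): [
        "Prime Time Placeholder - {channel}'s finest not-programming",
        "Family Argument Avoider - Courtesy of {channel}",
    ],
}

def pick_description(hour, day, channel_name):
    # Deterministic but varied: the same (hour, day) always yields the same text
    for (start, end), descriptions in TIME_DESCRIPTIONS.items():
        if start <= hour < end:
            return descriptions[(hour + day) % len(descriptions)].format(
                channel=channel_name
            )
    # Fallback mirrors the view's "EPG data went on vacation" branch
    return f"Placeholder program for {channel_name} - EPG data went on vacation"
```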

@@ -0,0 +1,23 @@
# Generated by Django
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0006_epgsource_refresh_interval_epgsource_refresh_task'),
]
operations = [
migrations.AddField(
model_name='epgsource',
name='status',
field=models.CharField(choices=[('idle', 'Idle'), ('fetching', 'Fetching'), ('parsing', 'Parsing'), ('error', 'Error'), ('success', 'Success')], default='idle', max_length=20),
),
migrations.AddField(
model_name='epgsource',
name='last_error',
field=models.TextField(blank=True, null=True),
),
]

@@ -0,0 +1,14 @@
# Generated by Django 5.1.6 on 2025-05-03 21:47
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('epg', '0007_epgsource_status_epgsource_last_error'),
('epg', '0009_alter_epgsource_created_at_and_more'),
]
operations = [
]

@@ -0,0 +1,42 @@
# Generated by Django 5.1.6 on 2025-05-04 21:43
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0010_merge_20250503_2147'),
]
operations = [
# Change updated_at field
migrations.AlterField(
model_name='epgsource',
name='updated_at',
field=models.DateTimeField(blank=True, help_text='Time when this source was last successfully refreshed', null=True),
),
# Add new last_message field
migrations.AddField(
model_name='epgsource',
name='last_message',
field=models.TextField(blank=True, help_text='Last status message, including success results or error information', null=True),
),
# Copy data from last_error to last_message
migrations.RunPython(
code=lambda apps, schema_editor: apps.get_model('epg', 'EPGSource').objects.all().update(
last_message=models.F('last_error')
),
reverse_code=lambda apps, schema_editor: apps.get_model('epg', 'EPGSource').objects.all().update(
last_error=models.F('last_message')
),
),
# Remove the old field
migrations.RemoveField(
model_name='epgsource',
name='last_error',
),
]

@@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-05-15 01:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0011_update_epgsource_fields'),
]
operations = [
migrations.AlterField(
model_name='epgsource',
name='status',
field=models.CharField(choices=[('idle', 'Idle'), ('fetching', 'Fetching'), ('parsing', 'Parsing'), ('error', 'Error'), ('success', 'Success'), ('disabled', 'Disabled')], default='idle', max_length=20),
),
]

@@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-05-21 19:58
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0012_alter_epgsource_status'),
]
operations = [
migrations.AlterField(
model_name='epgsource',
name='refresh_interval',
field=models.IntegerField(default=0),
),
]

@@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-05-26 15:48
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0013_alter_epgsource_refresh_interval'),
]
operations = [
migrations.AddField(
model_name='epgsource',
name='extracted_file_path',
field=models.CharField(blank=True, help_text='Path to extracted XML file after decompression', max_length=1024, null=True),
),
]

@@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-09-02 14:30
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0014_epgsource_extracted_file_path'),
]
operations = [
migrations.AlterField(
model_name='programdata',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, null=True),
),
]

@@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-09-16 22:01
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0015_alter_programdata_custom_properties'),
]
operations = [
migrations.AddField(
model_name='epgdata',
name='icon_url',
field=models.URLField(blank=True, max_length=500, null=True),
),
]

@@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-09-24 21:07
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0016_epgdata_icon_url'),
]
operations = [
migrations.AlterField(
model_name='epgsource',
name='url',
field=models.URLField(blank=True, max_length=1000, null=True),
),
]

@@ -0,0 +1,23 @@
# Generated by Django 5.2.4 on 2025-10-17 17:02
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0017_alter_epgsource_url'),
]
operations = [
migrations.AddField(
model_name='epgsource',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, help_text='Custom properties for dummy EPG configuration (regex patterns, timezone, duration, etc.)', null=True),
),
migrations.AlterField(
model_name='epgsource',
name='source_type',
field=models.CharField(choices=[('xmltv', 'XMLTV URL'), ('schedules_direct', 'Schedules Direct API'), ('dummy', 'Custom Dummy EPG')], max_length=20),
),
]

@@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-10-22 21:59
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0018_epgsource_custom_properties_and_more'),
]
operations = [
migrations.AlterField(
model_name='programdata',
name='sub_title',
field=models.TextField(blank=True, null=True),
),
]

@@ -0,0 +1,119 @@
# Generated migration to replace {time} placeholders with {starttime}
import re
from django.db import migrations
def migrate_time_placeholders(apps, schema_editor):
"""
Replace {time} with {starttime} and {time24} with {starttime24}
in all dummy EPG source custom_properties templates.
"""
EPGSource = apps.get_model('epg', 'EPGSource')
# Fields that contain templates with placeholders
template_fields = [
'title_template',
'description_template',
'upcoming_title_template',
'upcoming_description_template',
'ended_title_template',
'ended_description_template',
'channel_logo_url',
'program_poster_url',
]
# Get all dummy EPG sources
dummy_sources = EPGSource.objects.filter(source_type='dummy')
updated_count = 0
for source in dummy_sources:
if not source.custom_properties:
continue
modified = False
custom_props = source.custom_properties.copy()
for field in template_fields:
if field in custom_props and custom_props[field]:
original_value = custom_props[field]
# Replace {time24} first (before {time}) to avoid double replacement
# e.g., {time24} shouldn't become {starttime24} via {time} -> {starttime}
new_value = original_value
new_value = re.sub(r'\{time24\}', '{starttime24}', new_value)
new_value = re.sub(r'\{time\}', '{starttime}', new_value)
if new_value != original_value:
custom_props[field] = new_value
modified = True
if modified:
source.custom_properties = custom_props
source.save(update_fields=['custom_properties'])
updated_count += 1
if updated_count > 0:
print(f"Migration complete: Updated {updated_count} dummy EPG source(s) with new placeholder names.")
else:
print("No dummy EPG sources needed placeholder updates.")
def reverse_migration(apps, schema_editor):
"""
Reverse the migration by replacing {starttime} back to {time}.
"""
EPGSource = apps.get_model('epg', 'EPGSource')
template_fields = [
'title_template',
'description_template',
'upcoming_title_template',
'upcoming_description_template',
'ended_title_template',
'ended_description_template',
'channel_logo_url',
'program_poster_url',
]
dummy_sources = EPGSource.objects.filter(source_type='dummy')
updated_count = 0
for source in dummy_sources:
if not source.custom_properties:
continue
modified = False
custom_props = source.custom_properties.copy()
for field in template_fields:
if field in custom_props and custom_props[field]:
original_value = custom_props[field]
# Reverse the replacements
new_value = original_value
new_value = re.sub(r'\{starttime24\}', '{time24}', new_value)
new_value = re.sub(r'\{starttime\}', '{time}', new_value)
if new_value != original_value:
custom_props[field] = new_value
modified = True
if modified:
source.custom_properties = custom_props
source.save(update_fields=['custom_properties'])
updated_count += 1
if updated_count > 0:
print(f"Reverse migration complete: Reverted {updated_count} dummy EPG source(s) to old placeholder names.")
class Migration(migrations.Migration):
dependencies = [
('epg', '0019_alter_programdata_sub_title'),
]
operations = [
migrations.RunPython(migrate_time_placeholders, reverse_migration),
]
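The forward and reverse `RunPython` steps above are symmetric regex rewrites; extracted as plain functions (names here are illustrative, not repo code) they round-trip cleanly, since each pattern requires the closing brace and so rewrites each placeholder exactly once.

```python
import re

def to_new_placeholders(template):
    # {time24} is handled before {time}, matching the migration's ordering
    template = re.sub(r"\{time24\}", "{starttime24}", template)
    return re.sub(r"\{time\}", "{starttime}", template)

def to_old_placeholders(template):
    # Reverse step, as used by the migration's reverse_code
    template = re.sub(r"\{starttime24\}", "{time24}", template)
    return re.sub(r"\{starttime\}", "{time}", template)
```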

@@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-12-05 15:24
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('epg', '0020_migrate_time_to_starttime_placeholders'),
]
operations = [
migrations.AddField(
model_name='epgsource',
name='priority',
field=models.PositiveIntegerField(default=0, help_text='Priority for EPG matching (higher numbers = higher priority). Used when multiple EPG sources have matching entries for a channel.'),
),
]

@@ -1,39 +1,144 @@
from django.db import models
from django.utils import timezone
from django_celery_beat.models import PeriodicTask
from django.conf import settings
import os
class EPGSource(models.Model):
SOURCE_TYPE_CHOICES = [
('xmltv', 'XMLTV URL'),
('schedules_direct', 'Schedules Direct API'),
('dummy', 'Custom Dummy EPG'),
]
STATUS_IDLE = 'idle'
STATUS_FETCHING = 'fetching'
STATUS_PARSING = 'parsing'
STATUS_ERROR = 'error'
STATUS_SUCCESS = 'success'
STATUS_DISABLED = 'disabled'
STATUS_CHOICES = [
(STATUS_IDLE, 'Idle'),
(STATUS_FETCHING, 'Fetching'),
(STATUS_PARSING, 'Parsing'),
(STATUS_ERROR, 'Error'),
(STATUS_SUCCESS, 'Success'),
(STATUS_DISABLED, 'Disabled'),
]
name = models.CharField(max_length=255, unique=True)
source_type = models.CharField(max_length=20, choices=SOURCE_TYPE_CHOICES)
url = models.URLField(max_length=1000, blank=True, null=True) # For XMLTV
api_key = models.CharField(max_length=255, blank=True, null=True) # For Schedules Direct
is_active = models.BooleanField(default=True)
file_path = models.CharField(max_length=1024, blank=True, null=True)
extracted_file_path = models.CharField(max_length=1024, blank=True, null=True,
help_text="Path to extracted XML file after decompression")
refresh_interval = models.IntegerField(default=0)
refresh_task = models.ForeignKey(
PeriodicTask, on_delete=models.SET_NULL, null=True, blank=True
)
custom_properties = models.JSONField(
default=dict,
blank=True,
null=True,
help_text="Custom properties for dummy EPG configuration (regex patterns, timezone, duration, etc.)"
)
priority = models.PositiveIntegerField(
default=0,
help_text="Priority for EPG matching (higher numbers = higher priority). Used when multiple EPG sources have matching entries for a channel."
)
status = models.CharField(
max_length=20,
choices=STATUS_CHOICES,
default=STATUS_IDLE
)
last_message = models.TextField(
null=True,
blank=True,
help_text="Last status message, including success results or error information"
)
created_at = models.DateTimeField(
auto_now_add=True,
help_text="Time when this source was created"
)
updated_at = models.DateTimeField(
null=True, blank=True,
help_text="Time when this source was last successfully refreshed"
)
def __str__(self):
return self.name
def get_cache_file(self):
import mimetypes
# Use a temporary extension for initial download
# The actual extension will be determined after content inspection
file_ext = ".tmp"
# If file_path is already set and contains an extension, use that
# This handles cases where we've already detected the proper type
if self.file_path and os.path.exists(self.file_path):
_, existing_ext = os.path.splitext(self.file_path)
if existing_ext:
file_ext = existing_ext
else:
# Try to detect the MIME type and map to extension
mime_type, _ = mimetypes.guess_type(self.file_path)
if mime_type:
if mime_type == 'application/gzip' or mime_type == 'application/x-gzip':
file_ext = '.gz'
elif mime_type == 'application/zip':
file_ext = '.zip'
elif mime_type == 'application/xml' or mime_type == 'text/xml':
file_ext = '.xml'
# For files without mime type detection, try peeking at content
else:
try:
with open(self.file_path, 'rb') as f:
header = f.read(5)
# Check for gzip magic number (1f 8b)
if header[:2] == b'\x1f\x8b':
file_ext = '.gz'
# Check for zip magic number (PK..)
elif header[:2] == b'PK':
file_ext = '.zip'
# Check for an XML declaration or a bare <tv> root
elif header[:5] == b'<?xml' or header[:4] == b'<tv>':
file_ext = '.xml'
except Exception:
# If we can't read the file, just keep the default extension
pass
filename = f"{self.id}{file_ext}"
# Build full path in MEDIA_ROOT/cached_epg
cache_dir = os.path.join(settings.MEDIA_ROOT, "cached_epg")
# Create directory if it doesn't exist
os.makedirs(cache_dir, exist_ok=True)
cache = os.path.join(cache_dir, filename)
return cache
def save(self, *args, **kwargs):
# Prevent auto_now behavior by handling updated_at manually
if 'update_fields' in kwargs and 'updated_at' not in kwargs['update_fields']:
# Don't modify updated_at for regular updates
kwargs.setdefault('update_fields', [])
if 'updated_at' in kwargs['update_fields']:
kwargs['update_fields'].remove('updated_at')
super().save(*args, **kwargs)
class EPGData(models.Model):
# Removed the Channel foreign key. We now just store the original tvg_id
# and a name (which might simply be the tvg_id if no real channel exists).
tvg_id = models.CharField(max_length=255, null=True, blank=True, db_index=True)
name = models.CharField(max_length=255)
icon_url = models.URLField(max_length=500, null=True, blank=True)
epg_source = models.ForeignKey(
EPGSource,
on_delete=models.CASCADE,
@@ -54,10 +159,10 @@ class ProgramData(models.Model):
start_time = models.DateTimeField()
end_time = models.DateTimeField()
title = models.CharField(max_length=255)
sub_title = models.TextField(blank=True, null=True)
description = models.TextField(blank=True, null=True)
tvg_id = models.CharField(max_length=255, null=True, blank=True)
custom_properties = models.JSONField(default=dict, blank=True, null=True)
def __str__(self):
return f"{self.title} ({self.start_time} - {self.end_time})"
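`get_cache_file` above falls back to sniffing leading bytes when MIME detection fails; the same checks as a standalone helper (gzip files begin with `1f 8b`, zip archives with `PK`). Note that five bytes must be read for the `<?xml` comparison to ever match.

```python
def sniff_extension(header: bytes, default: str = ".tmp") -> str:
    """Guess a cache-file extension from the first bytes of the payload."""
    if header[:2] == b"\x1f\x8b":  # gzip magic number
        return ".gz"
    if header[:2] == b"PK":  # zip magic number (PK\x03\x04)
        return ".zip"
    # XML declaration, or a bare <tv> root as emitted by some XMLTV feeds
    if header[:5] == b"<?xml" or header[:4] == b"<tv>":
        return ".xml"
    return default
```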

@@ -1,17 +1,41 @@
from core.utils import validate_flexible_url
from rest_framework import serializers
from .models import EPGSource, EPGData, ProgramData
from apps.channels.models import Channel
class EPGSourceSerializer(serializers.ModelSerializer):
epg_data_ids = serializers.SerializerMethodField()
epg_data_count = serializers.SerializerMethodField()
read_only_fields = ['created_at', 'updated_at']
url = serializers.CharField(
required=False,
allow_blank=True,
allow_null=True,
validators=[validate_flexible_url]
)
class Meta:
model = EPGSource
fields = [
'id',
'name',
'source_type',
'url',
'api_key',
'is_active',
'file_path',
'refresh_interval',
'priority',
'status',
'last_message',
'created_at',
'updated_at',
'custom_properties',
'epg_data_count'
]
def get_epg_data_ids(self, obj):
return list(obj.epgs.values_list('id', flat=True))
def get_epg_data_count(self, obj):
"""Return the count of EPG data entries instead of all IDs to prevent large payloads"""
return obj.epgs.count()
class ProgramDataSerializer(serializers.ModelSerializer):
class Meta:
@@ -31,5 +55,6 @@ class EPGDataSerializer(serializers.ModelSerializer):
'id',
'tvg_id',
'name',
'icon_url',
'epg_source',
]

@@ -1,21 +1,88 @@
from django.db.models.signals import post_save, post_delete, pre_save
from django.dispatch import receiver
from .models import EPGSource, EPGData
from .tasks import refresh_epg_data, delete_epg_refresh_task_by_id
from django_celery_beat.models import PeriodicTask, IntervalSchedule
from core.utils import is_protected_path, send_websocket_update
import json
import logging
import os
logger = logging.getLogger(__name__)
@receiver(post_save, sender=EPGSource)
def trigger_refresh_on_new_epg_source(sender, instance, created, **kwargs):
# Trigger refresh only if the source is newly created and active
if created and instance.is_active:
# Trigger refresh only if the source is newly created, active, and not a dummy EPG
if created and instance.is_active and instance.source_type != 'dummy':
refresh_epg_data.delay(instance.id)
@receiver(post_save, sender=EPGSource)
def create_dummy_epg_data(sender, instance, created, **kwargs):
"""
Automatically create EPGData for dummy EPG sources when they are created.
This allows channels to be assigned to dummy EPGs immediately without
requiring a refresh first.
"""
if instance.source_type == 'dummy':
# Ensure dummy EPGs always have idle status and no status message
if instance.status != EPGSource.STATUS_IDLE or instance.last_message:
instance.status = EPGSource.STATUS_IDLE
instance.last_message = None
instance.save(update_fields=['status', 'last_message'])
# Create a URL-friendly tvg_id from the dummy EPG name
# Replace spaces and special characters with underscores
friendly_tvg_id = instance.name.replace(' ', '_').replace('-', '_')
# Remove any characters that aren't alphanumeric or underscores
friendly_tvg_id = ''.join(c for c in friendly_tvg_id if c.isalnum() or c == '_')
# Convert to lowercase for consistency
friendly_tvg_id = friendly_tvg_id.lower()
# Prefix with 'dummy_' to make it clear this is a dummy EPG
friendly_tvg_id = f"dummy_{friendly_tvg_id}"
# Create or update the EPGData record
epg_data, data_created = EPGData.objects.get_or_create(
tvg_id=friendly_tvg_id,
epg_source=instance,
defaults={
'name': instance.name,
'icon_url': None
}
)
# Update name if it changed and record already existed
if not data_created and epg_data.name != instance.name:
epg_data.name = instance.name
epg_data.save(update_fields=['name'])
if data_created:
logger.info(f"Auto-created EPGData for dummy EPG source: {instance.name} (ID: {instance.id})")
# Send websocket update to notify frontend that EPG data has been created
# This allows the channel form to immediately show the new dummy EPG without refreshing
send_websocket_update('updates', 'update', {
'type': 'epg_data_created',
'source_id': instance.id,
'source_name': instance.name,
'epg_data_id': epg_data.id
})
else:
logger.debug(f"EPGData already exists for dummy EPG source: {instance.name} (ID: {instance.id})")
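The tvg_id sanitization in `create_dummy_epg_data` can be isolated into a small helper for illustration (`make_dummy_tvg_id` is a hypothetical name, not part of the diff):

```python
def make_dummy_tvg_id(name: str) -> str:
    # Spaces and hyphens become underscores, everything that is not
    # alphanumeric or an underscore is dropped, and the result is
    # lowercased and prefixed with 'dummy_'.
    friendly = name.replace(' ', '_').replace('-', '_')
    friendly = ''.join(c for c in friendly if c.isalnum() or c == '_')
    return f"dummy_{friendly.lower()}"

print(make_dummy_tvg_id("My EPG - HD!"))  # -> dummy_my_epg___hd
```

Consecutive separators are kept as repeated underscores, which keeps the mapping simple at the cost of slightly noisy IDs.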
@receiver(post_save, sender=EPGSource)
def create_or_update_refresh_task(sender, instance, **kwargs):
"""
Create or update a Celery Beat periodic task when an EPGSource is created/updated.
Skip creating tasks for dummy EPG sources as they don't need refreshing.
"""
# Skip task creation for dummy EPGs
if instance.source_type == 'dummy':
# If there's an existing task, disable it
if instance.refresh_task:
instance.refresh_task.enabled = False
instance.refresh_task.save(update_fields=['enabled'])
return
task_name = f"epg_source-refresh-{instance.id}"
interval, _ = IntervalSchedule.objects.get_or_create(
every=int(instance.refresh_interval),
@@ -26,7 +93,7 @@ def create_or_update_refresh_task(sender, instance, **kwargs):
"interval": interval,
"task": "apps.epg.tasks.refresh_epg_data",
"kwargs": json.dumps({"source_id": instance.id}),
"enabled": instance.refresh_interval != 0,
"enabled": instance.refresh_interval != 0 and instance.is_active,
})
update_fields = []
@@ -36,8 +103,11 @@
if task.interval != interval:
task.interval = interval
update_fields.append("interval")
if task.enabled != (instance.refresh_interval != 0):
task.enabled = instance.refresh_interval != 0
# Check both refresh_interval and is_active to determine if task should be enabled
should_be_enabled = instance.refresh_interval != 0 and instance.is_active
if task.enabled != should_be_enabled:
task.enabled = should_be_enabled
update_fields.append("enabled")
if update_fields:
@@ -45,12 +115,82 @@
if instance.refresh_task != task:
instance.refresh_task = task
instance.save(update_fields=update_fields)
instance.save(update_fields=["refresh_task"]) # Fixed field name
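The enablement rule introduced above is a plain conjunction of the refresh interval and the active flag; a minimal sketch (the helper name is illustrative):

```python
def task_should_be_enabled(refresh_interval: int, is_active: bool) -> bool:
    # A Celery Beat refresh task runs only when the interval is
    # non-zero AND the source itself is active.
    return refresh_interval != 0 and is_active

print(task_should_be_enabled(24, True))   # True
print(task_should_be_enabled(0, True))    # False: refresh disabled
print(task_should_be_enabled(24, False))  # False: source deactivated
```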
@receiver(post_delete, sender=EPGSource)
def delete_refresh_task(sender, instance, **kwargs):
"""
Delete the associated Celery Beat periodic task when a Channel is deleted.
Delete the associated Celery Beat periodic task when an EPGSource is deleted.
"""
if instance.refresh_task:
instance.refresh_task.delete()
try:
# The helper deletes the periodic task by source ID, so it works
# whether or not the refresh_task foreign key is still populated;
# log when the foreign key is available.
if instance.refresh_task:
logger.info(f"Found task via foreign key: {instance.refresh_task.id} for EPGSource {instance.id}")
delete_epg_refresh_task_by_id(instance.id)
except Exception as e:
logger.error(f"Error in delete_refresh_task signal handler: {str(e)}", exc_info=True)
@receiver(pre_save, sender=EPGSource)
def update_status_on_active_change(sender, instance, **kwargs):
"""
When an EPGSource's is_active field changes, update the status accordingly.
For dummy EPGs, always ensure status is idle and no status message.
"""
# Dummy EPGs should always be idle with no status message
if instance.source_type == 'dummy':
instance.status = EPGSource.STATUS_IDLE
instance.last_message = None
return
if instance.pk: # Only for existing records, not new ones
try:
# Get the current record from the database
old_instance = EPGSource.objects.get(pk=instance.pk)
# If is_active changed, update the status
if old_instance.is_active != instance.is_active:
if instance.is_active:
# When activating, set status to idle
instance.status = 'idle'
else:
# When deactivating, set status to disabled
instance.status = 'disabled'
except EPGSource.DoesNotExist:
# New record, will use default status
pass
@receiver(post_delete, sender=EPGSource)
def delete_cached_files(sender, instance, **kwargs):
"""
Delete cached files associated with an EPGSource when it's deleted.
Only deletes files that aren't in protected directories.
"""
# Check and delete the main file path if not protected
if instance.file_path and os.path.exists(instance.file_path):
if is_protected_path(instance.file_path):
logger.info(f"Skipping deletion of protected file: {instance.file_path}")
else:
try:
os.remove(instance.file_path)
logger.info(f"Deleted cached file: {instance.file_path}")
except OSError as e:
logger.error(f"Error deleting cached file {instance.file_path}: {e}")
# Check and delete the extracted file path if it exists, is different from main path, and not protected
if instance.extracted_file_path and os.path.exists(instance.extracted_file_path) and instance.extracted_file_path != instance.file_path:
if is_protected_path(instance.extracted_file_path):
logger.info(f"Skipping deletion of protected extracted file: {instance.extracted_file_path}")
else:
try:
os.remove(instance.extracted_file_path)
logger.info(f"Deleted extracted file: {instance.extracted_file_path}")
except OSError as e:
logger.error(f"Error deleting extracted file {instance.extracted_file_path}: {e}")
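The guarded-deletion pattern in `delete_cached_files` can be sketched standalone. Here `is_protected_path` is replaced by a simple prefix check, which is an assumption made for illustration, not the real implementation in `core.utils`:

```python
import os
import tempfile

def safe_remove(path: str, protected_prefixes=("/data/protected",)) -> bool:
    # Stand-in for core.utils.is_protected_path: refuse to touch
    # anything under an assumed protected prefix.
    if any(path.startswith(prefix) for prefix in protected_prefixes):
        return False
    try:
        os.remove(path)
        return True
    except OSError:
        # Missing file, permissions, etc. -- log-and-continue territory.
        return False

fd, tmp = tempfile.mkstemp()
os.close(fd)
removed = safe_remove(tmp)
print(removed)  # True
```

Returning a boolean instead of raising mirrors the signal handler's behavior: a failed cleanup is logged, never allowed to abort the delete.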

File diff suppressed because it is too large

View file

@@ -1,12 +1,14 @@
from rest_framework import viewsets, status
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework.permissions import IsAuthenticated
from apps.accounts.permissions import Authenticated, permission_classes_by_action
from django.http import JsonResponse, HttpResponseForbidden, HttpResponse
import logging
from drf_yasg.utils import swagger_auto_schema
from drf_yasg import openapi
from django.shortcuts import get_object_or_404
from apps.channels.models import Channel, ChannelProfile
from django.db import models
from apps.channels.models import Channel, ChannelProfile, Stream
from .models import HDHRDevice
from .serializers import HDHRDeviceSerializer
from django.contrib.auth.decorators import login_required
@@ -16,18 +18,29 @@ from django.utils.decorators import method_decorator
from django.contrib.auth.decorators import login_required
from django.views.decorators.csrf import csrf_exempt
# Configure logger
logger = logging.getLogger(__name__)
@login_required
def hdhr_dashboard_view(request):
"""Render the HDHR management page."""
hdhr_devices = HDHRDevice.objects.all()
return render(request, "hdhr/hdhr.html", {"hdhr_devices": hdhr_devices})
# 🔹 1) HDHomeRun Device API
class HDHRDeviceViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for HDHomeRun devices"""
queryset = HDHRDevice.objects.all()
serializer_class = HDHRDeviceSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
# 🔹 2) Discover API
@@ -36,27 +49,37 @@ class DiscoverAPIView(APIView):
@swagger_auto_schema(
operation_description="Retrieve HDHomeRun device discovery information",
responses={200: openapi.Response("HDHR Discovery JSON")}
responses={200: openapi.Response("HDHR Discovery JSON")},
)
def get(self, request, profile=None):
uri_parts = ["hdhr"]
if profile is not None:
uri_parts.append(profile)
base_url = request.build_absolute_uri(f'/{"/".join(uri_parts)}/').rstrip('/')
base_url = request.build_absolute_uri(f'/{"/".join(uri_parts)}/').rstrip("/")
device = HDHRDevice.objects.first()
# Calculate tuner count using centralized function
from apps.m3u.utils import calculate_tuner_count
tuner_count = calculate_tuner_count(minimum=1, unlimited_default=10)
# Create a unique DeviceID for the HDHomeRun device based on profile ID or a default value
device_ID = "12345678" # Default DeviceID
friendly_name = "Dispatcharr HDHomeRun"
if profile is not None:
device_ID = f"dispatcharr-hdhr-{profile}"
friendly_name = f"Dispatcharr HDHomeRun - {profile}"
if not device:
data = {
"FriendlyName": "Dispatcharr HDHomeRun",
"FriendlyName": friendly_name,
"ModelNumber": "HDTC-2US",
"FirmwareName": "hdhomerun3_atsc",
"FirmwareVersion": "20200101",
"DeviceID": "12345678",
"DeviceID": device_ID,
"DeviceAuth": "test_auth_token",
"BaseURL": base_url,
"LineupURL": f"{base_url}/lineup.json",
"TunerCount": 10,
"TunerCount": tuner_count,
}
else:
data = {
@@ -68,7 +91,7 @@ class DiscoverAPIView(APIView):
"DeviceAuth": "test_auth_token",
"BaseURL": base_url,
"LineupURL": f"{base_url}/lineup.json",
"TunerCount": 10,
"TunerCount": tuner_count,
}
return JsonResponse(data)
@@ -79,28 +102,38 @@ class LineupAPIView(APIView):
@swagger_auto_schema(
operation_description="Retrieve the available channel lineup",
responses={200: openapi.Response("Channel Lineup JSON")}
responses={200: openapi.Response("Channel Lineup JSON")},
)
def get(self, request, profile=None):
if profile is not None:
channel_profile = ChannelProfile.objects.get(name=profile)
channels = Channel.objects.filter(
channelprofilemembership__channel_profile=channel_profile,
channelprofilemembership__enabled=True
).order_by('channel_number')
channelprofilemembership__enabled=True,
).order_by("channel_number")
else:
channels = Channel.objects.all().order_by('channel_number')
channels = Channel.objects.all().order_by("channel_number")
lineup = [
{
"GuideNumber": str(ch.channel_number),
"GuideName": ch.name,
"URL": request.build_absolute_uri(f"/proxy/ts/stream/{ch.uuid}"),
"Guide_ID": str(ch.channel_number),
"Station": str(ch.channel_number),
}
for ch in channels
]
lineup = []
for ch in channels:
# Format channel number as integer if it has no decimal component
if ch.channel_number is not None:
if ch.channel_number == int(ch.channel_number):
formatted_channel_number = str(int(ch.channel_number))
else:
formatted_channel_number = str(ch.channel_number)
else:
formatted_channel_number = ""
lineup.append(
{
"GuideNumber": formatted_channel_number,
"GuideName": ch.name,
"URL": request.build_absolute_uri(f"/proxy/ts/stream/{ch.uuid}"),
"Guide_ID": formatted_channel_number,
"Station": formatted_channel_number,
}
)
return JsonResponse(lineup, safe=False)
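The channel-number formatting introduced for the lineup (render 5.0 as "5" but keep 5.5 as "5.5", and map a missing number to an empty string) reduces to a small helper; the function name is illustrative:

```python
def format_channel_number(num) -> str:
    # None -> ""; whole numbers lose their decimal part; fractional
    # channel numbers (common for HDHomeRun subchannels) are kept.
    if num is None:
        return ""
    if num == int(num):
        return str(int(num))
    return str(num)

print(format_channel_number(5.0))  # 5
print(format_channel_number(5.5))  # 5.5
```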
@@ -110,14 +143,14 @@ class LineupStatusAPIView(APIView):
@swagger_auto_schema(
operation_description="Retrieve the HDHomeRun lineup status",
responses={200: openapi.Response("Lineup Status JSON")}
responses={200: openapi.Response("Lineup Status JSON")},
)
def get(self, request, profile=None):
data = {
"ScanInProgress": 0,
"ScanPossible": 0,
"Source": "Cable",
"SourceList": ["Cable"]
"SourceList": ["Cable"],
}
return JsonResponse(data)
@@ -128,10 +161,10 @@ class HDHRDeviceXMLAPIView(APIView):
@swagger_auto_schema(
operation_description="Retrieve the HDHomeRun device XML configuration",
responses={200: openapi.Response("HDHR Device XML")}
responses={200: openapi.Response("HDHR Device XML")},
)
def get(self, request):
base_url = request.build_absolute_uri('/hdhr/').rstrip('/')
base_url = request.build_absolute_uri("/hdhr/").rstrip("/")
xml_response = f"""<?xml version="1.0" encoding="utf-8"?>
<root>

View file

@@ -2,6 +2,7 @@ import os
import socket
import threading
import time
import gevent  # cooperative sleep used by the broadcaster loop below
from django.conf import settings
# SSDP Multicast Address and Port
@@ -59,7 +60,7 @@ def ssdp_broadcaster(host_ip):
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
while True:
sock.sendto(notify.encode("utf-8"), (SSDP_MULTICAST, SSDP_PORT))
time.sleep(30)
gevent.sleep(30)  # gevent.sleep yields to other greenlets; time.sleep would block the whole worker
def start_ssdp():
host_ip = get_host_ip()
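The switch from `time.sleep` to `gevent.sleep` matters because the SSDP broadcaster runs inside a gevent-based worker: `time.sleep` blocks the whole OS thread, while `gevent.sleep` yields to the hub so other greenlets keep running. A minimal demonstration (assumes gevent is installed, as it is in this project):

```python
import gevent

beats = []

def broadcaster(name, count):
    # gevent.sleep hands control back to the hub, letting the other
    # greenlet run; time.sleep here would serialize the two loops.
    for _ in range(count):
        gevent.sleep(0.001)
        beats.append(name)

a = gevent.spawn(broadcaster, "a", 3)
b = gevent.spawn(broadcaster, "b", 3)
gevent.joinall([a, b])
print(beats)  # both broadcasters complete, interleaved on one thread
```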

View file

@@ -1,7 +1,7 @@
from rest_framework import viewsets, status
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework.permissions import IsAuthenticated
from apps.accounts.permissions import Authenticated, permission_classes_by_action
from django.http import JsonResponse, HttpResponseForbidden, HttpResponse
from drf_yasg.utils import swagger_auto_schema
from drf_yasg import openapi
@@ -16,18 +16,26 @@ from django.utils.decorators import method_decorator
from django.contrib.auth.decorators import login_required
from django.views.decorators.csrf import csrf_exempt
@login_required
def hdhr_dashboard_view(request):
"""Render the HDHR management page."""
hdhr_devices = HDHRDevice.objects.all()
return render(request, "hdhr/hdhr.html", {"hdhr_devices": hdhr_devices})
# 🔹 1) HDHomeRun Device API
class HDHRDeviceViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for HDHomeRun devices"""
queryset = HDHRDevice.objects.all()
serializer_class = HDHRDeviceSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
# 🔹 2) Discover API
@@ -36,10 +44,10 @@ class DiscoverAPIView(APIView):
@swagger_auto_schema(
operation_description="Retrieve HDHomeRun device discovery information",
responses={200: openapi.Response("HDHR Discovery JSON")}
responses={200: openapi.Response("HDHR Discovery JSON")},
)
def get(self, request):
base_url = request.build_absolute_uri('/hdhr/').rstrip('/')
base_url = request.build_absolute_uri("/hdhr/").rstrip("/")
device = HDHRDevice.objects.first()
if not device:
@@ -52,7 +60,7 @@ class DiscoverAPIView(APIView):
"DeviceAuth": "test_auth_token",
"BaseURL": base_url,
"LineupURL": f"{base_url}/lineup.json",
"TunerCount": "10",
"TunerCount": 10,
}
else:
data = {
@@ -64,7 +72,7 @@ class DiscoverAPIView(APIView):
"DeviceAuth": "test_auth_token",
"BaseURL": base_url,
"LineupURL": f"{base_url}/lineup.json",
"TunerCount": "10",
"TunerCount": 10,
}
return JsonResponse(data)
@@ -75,15 +83,15 @@ class LineupAPIView(APIView):
@swagger_auto_schema(
operation_description="Retrieve the available channel lineup",
responses={200: openapi.Response("Channel Lineup JSON")}
responses={200: openapi.Response("Channel Lineup JSON")},
)
def get(self, request):
channels = Channel.objects.all().order_by('channel_number')
channels = Channel.objects.all().order_by("channel_number")
lineup = [
{
"GuideNumber": str(ch.channel_number),
"GuideName": ch.name,
"URL": request.build_absolute_uri(f"/proxy/ts/stream/{ch.uuid}")
"URL": request.build_absolute_uri(f"/proxy/ts/stream/{ch.uuid}"),
}
for ch in channels
]
@@ -96,14 +104,14 @@ class LineupStatusAPIView(APIView):
@swagger_auto_schema(
operation_description="Retrieve the HDHomeRun lineup status",
responses={200: openapi.Response("Lineup Status JSON")}
responses={200: openapi.Response("Lineup Status JSON")},
)
def get(self, request):
data = {
"ScanInProgress": 0,
"ScanPossible": 0,
"Source": "Cable",
"SourceList": ["Cable"]
"SourceList": ["Cable"],
}
return JsonResponse(data)
@@ -114,10 +122,10 @@ class HDHRDeviceXMLAPIView(APIView):
@swagger_auto_schema(
operation_description="Retrieve the HDHomeRun device XML configuration",
responses={200: openapi.Response("HDHR Device XML")}
responses={200: openapi.Response("HDHR Device XML")},
)
def get(self, request):
base_url = request.build_absolute_uri('/hdhr/').rstrip('/')
base_url = request.build_absolute_uri("/hdhr/").rstrip("/")
xml_response = f"""<?xml version="1.0" encoding="utf-8"?>
<root>

View file

@@ -1,6 +1,8 @@
from django.contrib import admin
from django.utils.html import format_html
from .models import M3UAccount, M3UFilter, ServerGroup, UserAgent
from .models import M3UAccount, M3UFilter, ServerGroup, UserAgent, M3UAccountProfile
import json
class M3UFilterInline(admin.TabularInline):
model = M3UFilter
@@ -8,50 +10,181 @@ class M3UFilterInline(admin.TabularInline):
verbose_name = "M3U Filter"
verbose_name_plural = "M3U Filters"
@admin.register(M3UAccount)
class M3UAccountAdmin(admin.ModelAdmin):
list_display = ('name', 'server_url', 'server_group', 'max_streams', 'is_active', 'user_agent_display', 'uploaded_file_link', 'created_at', 'updated_at')
list_filter = ('is_active', 'server_group')
search_fields = ('name', 'server_url', 'server_group__name')
list_display = (
"name",
"server_url",
"server_group",
"max_streams",
"priority",
"is_active",
"user_agent_display",
"uploaded_file_link",
"created_at",
"updated_at",
)
list_filter = ("is_active", "server_group")
search_fields = ("name", "server_url", "server_group__name")
inlines = [M3UFilterInline]
actions = ['activate_accounts', 'deactivate_accounts']
actions = ["activate_accounts", "deactivate_accounts"]
# Handle both ForeignKey and ManyToManyField cases for UserAgent
def user_agent_display(self, obj):
if hasattr(obj, 'user_agent'): # ForeignKey case
if hasattr(obj, "user_agent"): # ForeignKey case
return obj.user_agent.user_agent if obj.user_agent else "None"
elif hasattr(obj, 'user_agents'): # ManyToManyField case
elif hasattr(obj, "user_agents"): # ManyToManyField case
return ", ".join([ua.user_agent for ua in obj.user_agents.all()]) or "None"
return "None"
user_agent_display.short_description = "User Agent(s)"
def vod_enabled_display(self, obj):
"""Display whether VOD is enabled for this account"""
if obj.custom_properties:
custom_props = obj.custom_properties or {}
return "Yes" if custom_props.get('enable_vod', False) else "No"
return "No"
vod_enabled_display.short_description = "VOD Enabled"
vod_enabled_display.boolean = True
def uploaded_file_link(self, obj):
if obj.uploaded_file:
return format_html("<a href='{}' target='_blank'>Download M3U</a>", obj.uploaded_file.url)
return format_html(
"<a href='{}' target='_blank'>Download M3U</a>", obj.uploaded_file.url
)
return "No file uploaded"
uploaded_file_link.short_description = "Uploaded File"
@admin.action(description='Activate selected accounts')
@admin.action(description="Activate selected accounts")
def activate_accounts(self, request, queryset):
queryset.update(is_active=True)
@admin.action(description='Deactivate selected accounts')
@admin.action(description="Deactivate selected accounts")
def deactivate_accounts(self, request, queryset):
queryset.update(is_active=False)
# Add ManyToManyField for Django Admin (if applicable)
if hasattr(M3UAccount, 'user_agents'):
filter_horizontal = ('user_agents',) # Only for ManyToManyField
if hasattr(M3UAccount, "user_agents"):
filter_horizontal = ("user_agents",) # Only for ManyToManyField
@admin.register(M3UFilter)
class M3UFilterAdmin(admin.ModelAdmin):
list_display = ('m3u_account', 'filter_type', 'regex_pattern', 'exclude')
list_filter = ('filter_type', 'exclude')
search_fields = ('regex_pattern',)
ordering = ('m3u_account',)
list_display = ("m3u_account", "filter_type", "regex_pattern", "exclude")
list_filter = ("filter_type", "exclude")
search_fields = ("regex_pattern",)
ordering = ("m3u_account",)
@admin.register(ServerGroup)
class ServerGroupAdmin(admin.ModelAdmin):
list_display = ('name',)
search_fields = ('name',)
list_display = ("name",)
search_fields = ("name",)
@admin.register(M3UAccountProfile)
class M3UAccountProfileAdmin(admin.ModelAdmin):
list_display = (
"name",
"m3u_account",
"is_default",
"is_active",
"max_streams",
"current_viewers",
"account_status_display",
"account_expiration_display",
"last_refresh_display",
)
list_filter = ("is_active", "is_default", "m3u_account__account_type")
search_fields = ("name", "m3u_account__name")
readonly_fields = ("account_info_display",)
def account_status_display(self, obj):
"""Display account status from custom properties"""
status = obj.get_account_status()
if status:
# Create colored status display
color_map = {
'Active': 'green',
'Expired': 'red',
'Disabled': 'red',
'Banned': 'red',
}
color = color_map.get(status, 'black')
return format_html(
'<span style="color: {};">{}</span>',
color,
status
)
return "Unknown"
account_status_display.short_description = "Account Status"
def account_expiration_display(self, obj):
"""Display account expiration from custom properties"""
expiration = obj.get_account_expiration()
if expiration:
from datetime import datetime
if expiration < datetime.now():
return format_html(
'<span style="color: red;">{}</span>',
expiration.strftime('%Y-%m-%d %H:%M')
)
else:
return format_html(
'<span style="color: green;">{}</span>',
expiration.strftime('%Y-%m-%d %H:%M')
)
return "Unknown"
account_expiration_display.short_description = "Expires"
def last_refresh_display(self, obj):
"""Display last refresh time from custom properties"""
last_refresh = obj.get_last_refresh()
if last_refresh:
return last_refresh.strftime('%Y-%m-%d %H:%M:%S')
return "Never"
last_refresh_display.short_description = "Last Refresh"
def account_info_display(self, obj):
"""Display formatted account information from custom properties"""
if not obj.custom_properties:
return "No account information available"
html_parts = []
# User Info
user_info = obj.custom_properties.get('user_info', {})
if user_info:
html_parts.append("<h3>User Information:</h3>")
html_parts.append("<ul>")
for key, value in user_info.items():
if key == 'exp_date' and value:
try:
from datetime import datetime
exp_date = datetime.fromtimestamp(float(value))
value = exp_date.strftime('%Y-%m-%d %H:%M:%S')
except (ValueError, TypeError):
pass
html_parts.append(f"<li><strong>{key}:</strong> {value}</li>")
html_parts.append("</ul>")
# Server Info
server_info = obj.custom_properties.get('server_info', {})
if server_info:
html_parts.append("<h3>Server Information:</h3>")
html_parts.append("<ul>")
for key, value in server_info.items():
html_parts.append(f"<li><strong>{key}:</strong> {value}</li>")
html_parts.append("</ul>")
# Last Refresh
last_refresh = obj.custom_properties.get('last_refresh')
if last_refresh:
html_parts.append(f"<p><strong>Last Refresh:</strong> {last_refresh}</p>")
return format_html(''.join(html_parts)) if html_parts else "No account information available"
account_info_display.short_description = "Account Information"
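The `exp_date` handling in `account_info_display` converts an Xtream Codes-style Unix-timestamp string into a readable datetime and falls back to the raw value when parsing fails. A standalone sketch (UTC is used here for a deterministic result; the admin code above uses the server's local time):

```python
from datetime import datetime, timezone

def format_exp_date(value):
    # XC panels report exp_date as a Unix-timestamp string; anything
    # unparseable (None, "unlimited", ...) is passed through as-is.
    try:
        stamp = datetime.fromtimestamp(float(value), tz=timezone.utc)
        return stamp.strftime('%Y-%m-%d %H:%M:%S')
    except (ValueError, TypeError):
        return value

print(format_exp_date("0"))        # 1970-01-01 00:00:00
print(format_exp_date("unlimited"))  # unlimited
```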

View file

@@ -1,18 +1,44 @@
from django.urls import path, include
from rest_framework.routers import DefaultRouter
from .api_views import M3UAccountViewSet, M3UFilterViewSet, ServerGroupViewSet, RefreshM3UAPIView, RefreshSingleM3UAPIView, UserAgentViewSet, M3UAccountProfileViewSet
from .api_views import (
M3UAccountViewSet,
M3UFilterViewSet,
ServerGroupViewSet,
RefreshM3UAPIView,
RefreshSingleM3UAPIView,
RefreshAccountInfoAPIView,
UserAgentViewSet,
M3UAccountProfileViewSet,
)
app_name = 'm3u'
app_name = "m3u"
router = DefaultRouter()
router.register(r'accounts', M3UAccountViewSet, basename='m3u-account')
router.register(r'accounts\/(?P<account_id>\d+)\/profiles', M3UAccountProfileViewSet, basename='m3u-account-profiles')
router.register(r'filters', M3UFilterViewSet, basename='m3u-filter')
router.register(r'server-groups', ServerGroupViewSet, basename='server-group')
router.register(r"accounts", M3UAccountViewSet, basename="m3u-account")
router.register(
r"accounts\/(?P<account_id>\d+)\/profiles",
M3UAccountProfileViewSet,
basename="m3u-account-profiles",
)
router.register(
r"accounts\/(?P<account_id>\d+)\/filters",
M3UFilterViewSet,
basename="m3u-filters",
)
router.register(r"server-groups", ServerGroupViewSet, basename="server-group")
urlpatterns = [
path('refresh/', RefreshM3UAPIView.as_view(), name='m3u_refresh'),
path('refresh/<int:account_id>/', RefreshSingleM3UAPIView.as_view(), name='m3u_refresh_single'),
path("refresh/", RefreshM3UAPIView.as_view(), name="m3u_refresh"),
path(
"refresh/<int:account_id>/",
RefreshSingleM3UAPIView.as_view(),
name="m3u_refresh_single",
),
path(
"refresh-account-info/<int:profile_id>/",
RefreshAccountInfoAPIView.as_view(),
name="m3u_refresh_account_info",
),
]
urlpatterns += router.urls
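The nested profile and filter routes embed `account_id` in the router regex; the pattern itself can be sanity-checked with plain `re` (note the `\/` escapes are redundant in Python regexes but harmless):

```python
import re

# Same pattern as the profiles route registered above.
pattern = re.compile(r"accounts\/(?P<account_id>\d+)\/profiles")

m = pattern.match("accounts/42/profiles")
print(m.group("account_id"))            # 42
print(pattern.match("accounts/abc/profiles"))  # None: \d+ rejects non-digits
```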

View file

@@ -1,7 +1,11 @@
from rest_framework import viewsets, status
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework.permissions import IsAuthenticated
from apps.accounts.permissions import (
Authenticated,
permission_classes_by_action,
permission_classes_by_method,
)
from drf_yasg.utils import swagger_auto_schema
from drf_yasg import openapi
from django.shortcuts import get_object_or_404
@@ -10,13 +14,15 @@ from django.core.cache import cache
import os
from rest_framework.decorators import action
from django.conf import settings
from .tasks import refresh_m3u_groups
import json
# Import all models, including UserAgent.
from .models import M3UAccount, M3UFilter, ServerGroup, M3UAccountProfile
from core.models import UserAgent
from apps.channels.models import ChannelGroupM3UAccount
from core.serializers import UserAgentSerializer
# Import all serializers, including the UserAgentSerializer.
from apps.vod.models import M3UVODCategoryRelation
from .serializers import (
M3UAccountSerializer,
M3UFilterSerializer,
@@ -24,130 +30,455 @@ from .serializers import (
M3UAccountProfileSerializer,
)
from .tasks import refresh_single_m3u_account, refresh_m3u_accounts
from django.core.files.storage import default_storage
from django.core.files.base import ContentFile
from .tasks import refresh_single_m3u_account, refresh_m3u_accounts, refresh_account_info
import json
class M3UAccountViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for M3U accounts"""
queryset = M3UAccount.objects.prefetch_related('channel_group')
queryset = M3UAccount.objects.prefetch_related("channel_group")
serializer_class = M3UAccountSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
def create(self, request, *args, **kwargs):
# Handle file upload first, if any
file_path = None
if 'file' in request.FILES:
file = request.FILES['file']
if "file" in request.FILES:
file = request.FILES["file"]
file_name = file.name
file_path = os.path.join('/data/uploads/m3us', file_name)
file_path = os.path.join("/data/uploads/m3us", file_name)
os.makedirs(os.path.dirname(file_path), exist_ok=True)
with open(file_path, 'wb+') as destination:
with open(file_path, "wb+") as destination:
for chunk in file.chunks():
destination.write(chunk)
# Add file_path to the request data so it's available during creation
request.data._mutable = True # Allow modification of the request data
request.data['file_path'] = file_path # Include the file path if a file was uploaded
request.data.pop('server_url')
request.data["file_path"] = (
file_path # Include the file path if a file was uploaded
)
# Handle the user_agent field - convert "null" string to None
if "user_agent" in request.data and request.data["user_agent"] == "null":
request.data["user_agent"] = None
# Handle server_url appropriately
if "server_url" in request.data and not request.data["server_url"]:
request.data.pop("server_url")
request.data._mutable = False # Make the request data immutable again
# Now call super().create() to create the instance
response = super().create(request, *args, **kwargs)
account_type = response.data.get("account_type")
account_id = response.data.get("id")
# Notify frontend that a new playlist was created
from core.utils import send_websocket_update
send_websocket_update('updates', 'update', {
'type': 'playlist_created',
'playlist_id': account_id
})
if account_type == M3UAccount.Types.XC:
refresh_m3u_groups(account_id)
# Check if VOD is enabled
enable_vod = request.data.get("enable_vod", False)
if enable_vod:
from apps.vod.tasks import refresh_categories
refresh_categories(account_id)
# After the instance is created, return the response
return response
def update(self, request, *args, **kwargs):
instance = self.get_object()
old_vod_enabled = False
# Check current VOD setting
if instance.custom_properties:
custom_props = instance.custom_properties or {}
old_vod_enabled = custom_props.get("enable_vod", False)
# Handle file upload first, if any
file_path = None
if 'file' in request.FILES:
file = request.FILES['file']
if "file" in request.FILES:
file = request.FILES["file"]
file_name = file.name
file_path = os.path.join('/data/uploads/m3us', file_name)
file_path = os.path.join("/data/uploads/m3us", file_name)
os.makedirs(os.path.dirname(file_path), exist_ok=True)
with open(file_path, 'wb+') as destination:
with open(file_path, "wb+") as destination:
for chunk in file.chunks():
destination.write(chunk)
# Add file_path to the request data so it's available during creation
request.data._mutable = True # Allow modification of the request data
request.data['file_path'] = file_path # Include the file path if a file was uploaded
request.data.pop('server_url')
request.data["file_path"] = (
file_path # Include the file path if a file was uploaded
)
# Handle the user_agent field - convert "null" string to None
if "user_agent" in request.data and request.data["user_agent"] == "null":
request.data["user_agent"] = None
# Handle server_url appropriately
if "server_url" in request.data and not request.data["server_url"]:
request.data.pop("server_url")
request.data._mutable = False # Make the request data immutable again
if instance.file_path and os.path.exists(instance.file_path):
os.remove(instance.file_path)
# Now call super().create() to create the instance
# Now call super().update() to update the instance
response = super().update(request, *args, **kwargs)
# After the instance is created, return the response
# Check if VOD setting changed and trigger refresh if needed
new_vod_enabled = request.data.get("enable_vod", old_vod_enabled)
if (
instance.account_type == M3UAccount.Types.XC
and not old_vod_enabled
and new_vod_enabled
):
# Create Uncategorized categories immediately so they're available in the UI
from apps.vod.models import VODCategory, M3UVODCategoryRelation
# Create movie Uncategorized category
movie_category, _ = VODCategory.objects.get_or_create(
name="Uncategorized",
category_type="movie",
defaults={}
)
# Create series Uncategorized category
series_category, _ = VODCategory.objects.get_or_create(
name="Uncategorized",
category_type="series",
defaults={}
)
# Create relations for both categories (disabled by default until first refresh)
account_custom_props = instance.custom_properties or {}
auto_enable_new = account_custom_props.get("auto_enable_new_groups_vod", True)
M3UVODCategoryRelation.objects.get_or_create(
category=movie_category,
m3u_account=instance,
defaults={
'enabled': auto_enable_new,
'custom_properties': {}
}
)
M3UVODCategoryRelation.objects.get_or_create(
category=series_category,
m3u_account=instance,
defaults={
'enabled': auto_enable_new,
'custom_properties': {}
}
)
# Trigger full VOD refresh
from apps.vod.tasks import refresh_vod_content
refresh_vod_content.delay(instance.id)
# After the instance is updated, return the response
return response
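The bootstrap above leans on `get_or_create` being idempotent: running the update twice must not duplicate the Uncategorized categories. A minimal sketch of that contract, with a plain dict standing in for the database (`ensure_category` is a hypothetical helper, not part of the codebase):

```python
def ensure_category(store, name, category_type):
    """Idempotent stand-in for VODCategory.objects.get_or_create:
    returns (obj, created), keyed on (name, category_type)."""
    key = (name, category_type)
    if key in store:
        return store[key], False
    store[key] = {"name": name, "category_type": category_type}
    return store[key], True
```

Calling it a second time with the same key returns the existing entry and `created=False`, which is why the movie and series "Uncategorized" categories above can be created unconditionally.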
def partial_update(self, request, *args, **kwargs):
"""Handle partial updates with special logic for is_active field"""
instance = self.get_object()
# Check if we're toggling is_active
if (
"is_active" in request.data
and instance.is_active != request.data["is_active"]
):
# Set appropriate status based on new is_active value
if request.data["is_active"]:
request.data["status"] = M3UAccount.Status.IDLE
else:
request.data["status"] = M3UAccount.Status.DISABLED
# Continue with regular partial update
return super().partial_update(request, *args, **kwargs)
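The toggle above pairs each `is_active` value with a status from `M3UAccount.Status` (`idle`/`disabled` per the choices later in this diff). A standalone sketch of the mapping (`status_for_toggle` is a hypothetical name):

```python
def status_for_toggle(is_active: bool) -> str:
    """Mirror the partial_update logic: activating an account resets it
    to idle; deactivating it marks it disabled."""
    return "idle" if is_active else "disabled"
```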
@action(detail=True, methods=["post"], url_path="refresh-vod")
def refresh_vod(self, request, pk=None):
"""Trigger VOD content refresh for XtreamCodes accounts"""
account = self.get_object()
if account.account_type != M3UAccount.Types.XC:
return Response(
{"error": "VOD refresh is only available for XtreamCodes accounts"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if VOD is enabled
vod_enabled = False
if account.custom_properties:
custom_props = account.custom_properties or {}
vod_enabled = custom_props.get("enable_vod", False)
if not vod_enabled:
return Response(
{"error": "VOD is not enabled for this account"},
status=status.HTTP_400_BAD_REQUEST,
)
try:
from apps.vod.tasks import refresh_vod_content
refresh_vod_content.delay(account.id)
return Response(
{"message": f"VOD refresh initiated for account {account.name}"},
status=status.HTTP_202_ACCEPTED,
)
except Exception as e:
return Response(
{"error": f"Failed to initiate VOD refresh: {str(e)}"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@action(detail=True, methods=["patch"], url_path="group-settings")
def update_group_settings(self, request, pk=None):
"""Update auto channel sync settings for M3U account groups"""
account = self.get_object()
group_settings = request.data.get("group_settings", [])
category_settings = request.data.get("category_settings", [])
try:
for setting in group_settings:
group_id = setting.get("channel_group")
enabled = setting.get("enabled", True)
auto_sync = setting.get("auto_channel_sync", False)
sync_start = setting.get("auto_sync_channel_start")
custom_properties = setting.get("custom_properties", {})
if group_id:
ChannelGroupM3UAccount.objects.update_or_create(
channel_group_id=group_id,
m3u_account=account,
defaults={
"enabled": enabled,
"auto_channel_sync": auto_sync,
"auto_sync_channel_start": sync_start,
"custom_properties": custom_properties,
},
)
for setting in category_settings:
category_id = setting.get("id")
enabled = setting.get("enabled", True)
custom_properties = setting.get("custom_properties", {})
if category_id:
M3UVODCategoryRelation.objects.update_or_create(
category_id=category_id,
m3u_account=account,
defaults={
"enabled": enabled,
"custom_properties": custom_properties,
},
)
return Response({"message": "Group settings updated successfully"})
except Exception as e:
return Response(
{"error": f"Failed to update group settings: {str(e)}"},
status=status.HTTP_400_BAD_REQUEST,
)
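For reference, a hedged example of the PATCH body the `group-settings` action above consumes; the keys mirror the handler, while all IDs and values are illustrative:

```python
# Illustrative PATCH body for the group-settings action; IDs are made up.
payload = {
    "group_settings": [
        {
            "channel_group": 12,            # ChannelGroup primary key
            "enabled": True,
            "auto_channel_sync": True,
            "auto_sync_channel_start": 100,  # starting channel number
            "custom_properties": {},
        }
    ],
    "category_settings": [
        {"id": 3, "enabled": False, "custom_properties": {}},
    ],
}
```

Entries missing `channel_group` or `id` are silently skipped by the handler, and omitted keys fall back to the defaults shown in the `setting.get(...)` calls.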
class M3UFilterViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for M3U filters"""
queryset = M3UFilter.objects.all()
serializer_class = M3UFilterSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
def get_queryset(self):
m3u_account_id = self.kwargs["account_id"]
return M3UFilter.objects.filter(m3u_account_id=m3u_account_id)
def perform_create(self, serializer):
# Get the account ID from the URL
account_id = self.kwargs["account_id"]
# # Get the M3UAccount instance for the account_id
# m3u_account = M3UAccount.objects.get(id=account_id)
# Save the 'm3u_account' in the serializer context
serializer.context["m3u_account"] = account_id
# Perform the actual save
serializer.save(m3u_account_id=account_id)
class ServerGroupViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for Server Groups"""
queryset = ServerGroup.objects.all()
serializer_class = ServerGroupSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
class RefreshM3UAPIView(APIView):
"""Triggers refresh for all active M3U accounts"""
def get_permissions(self):
try:
return [
perm() for perm in permission_classes_by_method[self.request.method]
]
except KeyError:
return [Authenticated()]
@swagger_auto_schema(
operation_description="Triggers a refresh of all active M3U accounts",
responses={202: "M3U refresh initiated"},
)
def post(self, request, format=None):
refresh_m3u_accounts.delay()
return Response(
{"success": True, "message": "M3U refresh initiated."},
status=status.HTTP_202_ACCEPTED,
)
class RefreshSingleM3UAPIView(APIView):
"""Triggers refresh for a single M3U account"""
def get_permissions(self):
try:
return [
perm() for perm in permission_classes_by_method[self.request.method]
]
except KeyError:
return [Authenticated()]
@swagger_auto_schema(
operation_description="Triggers a refresh of a single M3U account",
responses={202: "M3U account refresh initiated"},
)
def post(self, request, account_id, format=None):
refresh_single_m3u_account.delay(account_id)
return Response(
{
"success": True,
"message": f"M3U account {account_id} refresh initiated.",
},
status=status.HTTP_202_ACCEPTED,
)
class RefreshAccountInfoAPIView(APIView):
"""Triggers account info refresh for a single M3U account"""
def get_permissions(self):
try:
return [
perm() for perm in permission_classes_by_method[self.request.method]
]
except KeyError:
return [Authenticated()]
@swagger_auto_schema(
operation_description="Triggers a refresh of account information for a specific M3U profile",
responses={202: "Account info refresh initiated", 400: "Profile not found or not XtreamCodes"},
)
def post(self, request, profile_id, format=None):
try:
from .models import M3UAccountProfile
profile = M3UAccountProfile.objects.get(id=profile_id)
account = profile.m3u_account
if account.account_type != M3UAccount.Types.XC:
return Response(
{
"success": False,
"error": "Account info refresh is only available for XtreamCodes accounts",
},
status=status.HTTP_400_BAD_REQUEST,
)
refresh_account_info.delay(profile_id)
return Response(
{
"success": True,
"message": f"Account info refresh initiated for profile {profile.name}.",
},
status=status.HTTP_202_ACCEPTED,
)
except M3UAccountProfile.DoesNotExist:
return Response(
{
"success": False,
"error": "Profile not found",
},
status=status.HTTP_404_NOT_FOUND,
)
class UserAgentViewSet(viewsets.ModelViewSet):
"""Handles CRUD operations for User Agents"""
queryset = UserAgent.objects.all()
serializer_class = UserAgentSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
class M3UAccountProfileViewSet(viewsets.ModelViewSet):
queryset = M3UAccountProfile.objects.all()
serializer_class = M3UAccountProfileSerializer
permission_classes = [IsAuthenticated]
def get_permissions(self):
try:
return [perm() for perm in permission_classes_by_action[self.action]]
except KeyError:
return [Authenticated()]
def get_queryset(self):
m3u_account_id = self.kwargs["account_id"]
return M3UAccountProfile.objects.filter(m3u_account_id=m3u_account_id)
def perform_create(self, serializer):
# Get the account ID from the URL
account_id = self.kwargs["account_id"]
# Get the M3UAccount instance for the account_id
m3u_account = M3UAccount.objects.get(id=account_id)
# Save the 'm3u_account' in the serializer context
serializer.context["m3u_account"] = m3u_account
# Perform the actual save
serializer.save(m3u_account=m3u_account)


@@ -4,6 +4,13 @@ from .models import M3UAccount, M3UFilter
import re
class M3UAccountForm(forms.ModelForm):
enable_vod = forms.BooleanField(
required=False,
initial=False,
label="Enable VOD Content",
help_text="Parse and import VOD (movies/series) content for XtreamCodes accounts"
)
class Meta:
model = M3UAccount
fields = [
@@ -13,8 +20,34 @@ class M3UAccountForm(forms.ModelForm):
'server_group',
'max_streams',
'is_active',
'enable_vod',
]
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
# Set initial value for enable_vod from custom_properties
if self.instance and self.instance.custom_properties:
custom_props = self.instance.custom_properties or {}
self.fields['enable_vod'].initial = custom_props.get('enable_vod', False)
def save(self, commit=True):
instance = super().save(commit=False)
# Handle enable_vod field
enable_vod = self.cleaned_data.get('enable_vod', False)
# Parse existing custom_properties
custom_props = instance.custom_properties or {}
# Update VOD preference
custom_props['enable_vod'] = enable_vod
instance.custom_properties = custom_props
if commit:
instance.save()
return instance
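The `save()` override above reduces to a dict merge that preserves any other stored keys. A framework-free sketch (`merge_enable_vod` is a hypothetical helper):

```python
def merge_enable_vod(custom_properties, enable_vod):
    """Return custom_properties with the enable_vod flag applied,
    preserving any other stored keys (mirrors M3UAccountForm.save)."""
    props = dict(custom_properties or {})  # tolerate None/empty
    props["enable_vod"] = bool(enable_vod)
    return props
```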
def clean_uploaded_file(self):
uploaded_file = self.cleaned_data.get('uploaded_file')
if uploaded_file:


@@ -3,6 +3,7 @@
from django.db import migrations
from core.models import CoreSettings
def create_custom_account(apps, schema_editor):
default_user_agent_id = CoreSettings.get_default_user_agent_id()
@@ -18,7 +19,7 @@ def create_custom_account(apps, schema_editor):
M3UAccountProfile = apps.get_model("m3u", "M3UAccountProfile")
M3UAccountProfile.objects.create(
m3u_account=m3u_account,
name=f"{m3u_account.name} Default",
max_streams=m3u_account.max_streams,
is_default=True,
is_active=True,
@@ -26,10 +27,12 @@ def create_custom_account(apps, schema_editor):
replace_pattern="$1",
)
class Migration(migrations.Migration):
dependencies = [
("m3u", "0002_m3uaccount_locked"),
("core", "0004_preload_core_settings"),
]
operations = [


@@ -7,24 +7,29 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("django_celery_beat", "0019_alter_periodictasks_options"),
("m3u", "0004_m3uaccount_stream_profile"),
]
operations = [
migrations.AddField(
model_name="m3uaccount",
name="custom_properties",
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name="m3uaccount",
name="refresh_interval",
field=models.IntegerField(default=24),
),
migrations.AddField(
model_name="m3uaccount",
name="refresh_task",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="django_celery_beat.periodictask",
),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 5.1.6
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0007_remove_m3uaccount_uploaded_file_m3uaccount_file_path'),
]
operations = [
migrations.AddField(
model_name='m3uaccount',
name='stale_stream_days',
field=models.PositiveIntegerField(default=7, help_text='Number of days after which a stream will be removed if not seen in the M3U source.'),
),
]


@@ -0,0 +1,28 @@
# Generated by Django 5.1.6 on 2025-04-27 12:56
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0008_m3uaccount_stale_stream_days'),
]
operations = [
migrations.AddField(
model_name='m3uaccount',
name='account_type',
field=models.CharField(choices=[('STD', 'Standard'), ('XC', 'Xtream Codes')], default='STD'),
),
migrations.AddField(
model_name='m3uaccount',
name='password',
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name='m3uaccount',
name='username',
field=models.CharField(blank=True, max_length=255, null=True),
),
]


@@ -0,0 +1,28 @@
# Generated by Django 5.1.6 on 2025-05-04 21:43
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0009_m3uaccount_account_type_m3uaccount_password_and_more'),
]
operations = [
migrations.AddField(
model_name='m3uaccount',
name='last_message',
field=models.TextField(blank=True, null=True, help_text="Last status message, including success results or error information"),
),
migrations.AddField(
model_name='m3uaccount',
name='status',
field=models.CharField(choices=[('idle', 'Idle'), ('fetching', 'Fetching'), ('parsing', 'Parsing'), ('error', 'Error'), ('success', 'Success')], default='idle', max_length=20),
),
migrations.AlterField(
model_name='m3uaccount',
name='updated_at',
field=models.DateTimeField(blank=True, help_text='Time when this account was last successfully refreshed', null=True),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-05-15 01:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0010_add_status_fields_and_remove_auto_now'),
]
operations = [
migrations.AlterField(
model_name='m3uaccount',
name='status',
field=models.CharField(choices=[('idle', 'Idle'), ('fetching', 'Fetching'), ('parsing', 'Parsing'), ('error', 'Error'), ('success', 'Success'), ('pending_setup', 'Pending Setup'), ('disabled', 'Disabled')], default='idle', max_length=20),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-05-21 19:58
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0011_alter_m3uaccount_status'),
]
operations = [
migrations.AlterField(
model_name='m3uaccount',
name='refresh_interval',
field=models.IntegerField(default=0),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 5.1.6 on 2025-07-22 21:16
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0012_alter_m3uaccount_refresh_interval'),
]
operations = [
migrations.AlterField(
model_name='m3ufilter',
name='filter_type',
field=models.CharField(choices=[('group', 'Group'), ('name', 'Stream Name'), ('url', 'Stream URL')], default='group', help_text='Filter based on either group title or stream name.', max_length=50),
),
]


@@ -0,0 +1,22 @@
# Generated by Django 5.1.6 on 2025-07-31 17:14
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0013_alter_m3ufilter_filter_type'),
]
operations = [
migrations.AlterModelOptions(
name='m3ufilter',
options={'ordering': ['order']},
),
migrations.AddField(
model_name='m3ufilter',
name='order',
field=models.PositiveIntegerField(default=0),
),
]


@@ -0,0 +1,22 @@
# Generated by Django 5.2.4 on 2025-08-02 16:06
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0014_alter_m3ufilter_options_m3ufilter_order'),
]
operations = [
migrations.AlterModelOptions(
name='m3ufilter',
options={},
),
migrations.AddField(
model_name='m3ufilter',
name='custom_properties',
field=models.TextField(blank=True, null=True),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-08-20 22:35
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0015_alter_m3ufilter_options_m3ufilter_custom_properties'),
]
operations = [
migrations.AddField(
model_name='m3uaccount',
name='priority',
field=models.PositiveIntegerField(default=0, help_text='Priority for VOD provider selection (higher numbers = higher priority). Used when multiple providers offer the same content.'),
),
]


@@ -0,0 +1,28 @@
# Generated by Django 5.2.4 on 2025-09-02 15:19
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0016_m3uaccount_priority'),
]
operations = [
migrations.AlterField(
model_name='m3uaccount',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, null=True),
),
migrations.AlterField(
model_name='m3uaccount',
name='server_url',
field=models.URLField(blank=True, help_text='The base URL of the M3U server (optional if a file is uploaded)', max_length=1000, null=True),
),
migrations.AlterField(
model_name='m3ufilter',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, null=True),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-09-09 20:57
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('m3u', '0017_alter_m3uaccount_custom_properties_and_more'),
]
operations = [
migrations.AddField(
model_name='m3uaccountprofile',
name='custom_properties',
field=models.JSONField(blank=True, default=dict, help_text='Custom properties for storing account information from provider (e.g., XC account details, expiration dates)', null=True),
),
]


@@ -7,73 +7,98 @@ from apps.channels.models import StreamProfile
from django_celery_beat.models import PeriodicTask
from core.models import CoreSettings, UserAgent
CUSTOM_M3U_ACCOUNT_NAME = "custom"
class M3UAccount(models.Model):
class Types(models.TextChoices):
STADNARD = "STD", "Standard"
XC = "XC", "Xtream Codes"
class Status(models.TextChoices):
IDLE = "idle", "Idle"
FETCHING = "fetching", "Fetching"
PARSING = "parsing", "Parsing"
ERROR = "error", "Error"
SUCCESS = "success", "Success"
PENDING_SETUP = "pending_setup", "Pending Setup"
DISABLED = "disabled", "Disabled"
"""Represents an M3U Account for IPTV streams."""
name = models.CharField(
max_length=255, unique=True, help_text="Unique name for this M3U account"
)
server_url = models.URLField(
max_length=1000,
blank=True,
null=True,
help_text="The base URL of the M3U server (optional if a file is uploaded)",
)
file_path = models.CharField(max_length=255, blank=True, null=True)
server_group = models.ForeignKey(
"ServerGroup",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name="m3u_accounts",
help_text="The server group this M3U account belongs to",
)
max_streams = models.PositiveIntegerField(
default=0, help_text="Maximum number of concurrent streams (0 for unlimited)"
)
is_active = models.BooleanField(
default=True, help_text="Set to false to deactivate this M3U account"
)
created_at = models.DateTimeField(
auto_now_add=True, help_text="Time when this account was created"
)
updated_at = models.DateTimeField(
null=True,
blank=True,
help_text="Time when this account was last successfully refreshed",
)
status = models.CharField(
max_length=20, choices=Status.choices, default=Status.IDLE
)
last_message = models.TextField(
null=True,
blank=True,
help_text="Last status message, including success results or error information",
)
user_agent = models.ForeignKey(
"core.UserAgent",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name="m3u_accounts",
help_text="The User-Agent associated with this M3U account.",
)
locked = models.BooleanField(
default=False, help_text="Protected - can't be deleted or modified"
)
stream_profile = models.ForeignKey(
StreamProfile,
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name="m3u_accounts",
)
account_type = models.CharField(choices=Types.choices, default=Types.STADNARD)
username = models.CharField(max_length=255, null=True, blank=True)
password = models.CharField(max_length=255, null=True, blank=True)
custom_properties = models.JSONField(default=dict, blank=True, null=True)
refresh_interval = models.IntegerField(default=0)
refresh_task = models.ForeignKey(
PeriodicTask, on_delete=models.SET_NULL, null=True, blank=True
)
stale_stream_days = models.PositiveIntegerField(
default=7,
help_text="Number of days after which a stream will be removed if not seen in the M3U source.",
)
priority = models.PositiveIntegerField(
default=0,
help_text="Priority for VOD provider selection (higher numbers = higher priority). Used when multiple providers offer the same content.",
)
def __str__(self):
return self.name
@@ -104,10 +129,21 @@ class M3UAccount(models.Model):
def get_user_agent(self):
user_agent = self.user_agent
if not user_agent:
user_agent = UserAgent.objects.get(
id=CoreSettings.get_default_user_agent_id()
)
return user_agent
def save(self, *args, **kwargs):
# Prevent auto_now behavior by handling updated_at manually
if "update_fields" in kwargs and "updated_at" not in kwargs["update_fields"]:
# Don't modify updated_at for regular updates
kwargs.setdefault("update_fields", [])
if "updated_at" in kwargs["update_fields"]:
kwargs["update_fields"].remove("updated_at")
super().save(*args, **kwargs)
# def get_channel_groups(self):
# return ChannelGroup.objects.filter(m3u_account__m3u_account=self)
@@ -119,35 +155,40 @@ class M3UAccount(models.Model):
# """Return all streams linked to this account with enabled ChannelGroups."""
# return self.streams.filter(channel_group__in=ChannelGroup.objects.filter(m3u_account__enabled=True))
class M3UFilter(models.Model):
"""Defines filters for M3U accounts based on stream name or group title."""
FILTER_TYPE_CHOICES = (
("group", "Group"),
("name", "Stream Name"),
("url", "Stream URL"),
)
m3u_account = models.ForeignKey(
M3UAccount,
on_delete=models.CASCADE,
related_name="filters",
help_text="The M3U account this filter is applied to.",
)
filter_type = models.CharField(
max_length=50,
choices=FILTER_TYPE_CHOICES,
default="group",
help_text="Filter based on either group title or stream name.",
)
regex_pattern = models.CharField(
max_length=200, help_text="A regex pattern to match streams or groups."
)
exclude = models.BooleanField(
default=True,
help_text="If True, matching items are excluded; if False, only matches are included.",
)
order = models.PositiveIntegerField(default=0)
custom_properties = models.JSONField(default=dict, blank=True, null=True)
def applies_to(self, stream_name, group_name):
target = group_name if self.filter_type == "group" else stream_name
return bool(re.search(self.regex_pattern, target, re.IGNORECASE))
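The `applies_to` method above selects its target by filter type and runs a case-insensitive regex search. The same check as a standalone function (`filter_matches` is a hypothetical name):

```python
import re

def filter_matches(filter_type, pattern, stream_name, group_name):
    """Mirror M3UFilter.applies_to: group filters test the group title,
    anything else tests the stream name, case-insensitively."""
    target = group_name if filter_type == "group" else stream_name
    return bool(re.search(pattern, target, re.IGNORECASE))
```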
def clean(self):
@@ -157,7 +198,9 @@ class M3UFilter(models.Model):
raise ValidationError(f"Invalid regex pattern: {self.regex_pattern}")
def __str__(self):
filter_type_display = dict(self.FILTER_TYPE_CHOICES).get(
self.filter_type, "Unknown"
)
exclude_status = "Exclude" if self.exclude else "Include"
return f"[{self.m3u_account.name}] {filter_type_display}: {self.regex_pattern} ({exclude_status})"
@@ -183,40 +226,35 @@ class M3UFilter(models.Model):
class ServerGroup(models.Model):
"""Represents a logical grouping of servers or channels."""
name = models.CharField(
max_length=100, unique=True, help_text="Unique name for this server group."
)
def __str__(self):
return self.name
from django.db import models
class M3UAccountProfile(models.Model):
"""Represents a profile associated with an M3U Account."""
m3u_account = models.ForeignKey(
"M3UAccount",
on_delete=models.CASCADE,
related_name="profiles",
help_text="The M3U account this profile belongs to.",
)
name = models.CharField(
max_length=255, help_text="Name for the M3U account profile"
)
is_default = models.BooleanField(
default=False, help_text="Set to false to deactivate this profile"
)
max_streams = models.PositiveIntegerField(
default=0, help_text="Maximum number of concurrent streams (0 for unlimited)"
)
is_active = models.BooleanField(
default=True, help_text="Set to false to deactivate this profile"
)
search_pattern = models.CharField(
max_length=255,
@@ -225,22 +263,95 @@ class M3UAccountProfile(models.Model):
max_length=255,
)
current_viewers = models.PositiveIntegerField(default=0)
custom_properties = models.JSONField(
default=dict,
blank=True,
null=True,
help_text="Custom properties for storing account information from provider (e.g., XC account details, expiration dates)"
)
class Meta:
constraints = [
models.UniqueConstraint(
fields=["m3u_account", "name"], name="unique_account_name"
)
]
def __str__(self):
return f"{self.name} ({self.m3u_account.name})"
def get_account_expiration(self):
"""Get account expiration date from custom properties if available"""
if not self.custom_properties:
return None
user_info = self.custom_properties.get('user_info', {})
exp_date = user_info.get('exp_date')
if exp_date:
try:
from datetime import datetime
# XC exp_date is typically a Unix timestamp
if isinstance(exp_date, (int, float)):
return datetime.fromtimestamp(exp_date)
elif isinstance(exp_date, str):
# Try to parse as timestamp first, then as ISO date
try:
return datetime.fromtimestamp(float(exp_date))
except ValueError:
return datetime.fromisoformat(exp_date)
except (ValueError, TypeError):
pass
return None
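`get_account_expiration` above tries a numeric Unix timestamp, then a numeric string, then an ISO-8601 date. The fallback chain can be sketched independently (`parse_exp_date` is a hypothetical helper; `OSError` is added because out-of-range timestamps raise it on some platforms):

```python
from datetime import datetime

def parse_exp_date(exp_date):
    """Mirror the parsing order: numeric timestamp first, then a string
    tried as a timestamp, then as an ISO-8601 date; None on failure."""
    try:
        if isinstance(exp_date, (int, float)):
            return datetime.fromtimestamp(exp_date)
        if isinstance(exp_date, str):
            try:
                return datetime.fromtimestamp(float(exp_date))
            except ValueError:
                return datetime.fromisoformat(exp_date)
    except (ValueError, TypeError, OSError):
        pass
    return None
```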
def get_account_status(self):
"""Get account status from custom properties if available"""
if not self.custom_properties:
return None
user_info = self.custom_properties.get('user_info', {})
return user_info.get('status')
def get_max_connections(self):
"""Get maximum connections from custom properties if available"""
if not self.custom_properties:
return None
user_info = self.custom_properties.get('user_info', {})
return user_info.get('max_connections')
def get_active_connections(self):
"""Get active connections from custom properties if available"""
if not self.custom_properties:
return None
user_info = self.custom_properties.get('user_info', {})
return user_info.get('active_cons')
def get_last_refresh(self):
"""Get last refresh timestamp from custom properties if available"""
if not self.custom_properties:
return None
last_refresh = self.custom_properties.get('last_refresh')
if last_refresh:
try:
from datetime import datetime
return datetime.fromisoformat(last_refresh)
except (ValueError, TypeError):
pass
return None
@receiver(models.signals.post_save, sender=M3UAccount)
def create_profile_for_m3u_account(sender, instance, created, **kwargs):
"""Automatically create an M3UAccountProfile when M3UAccount is created."""
if created:
M3UAccountProfile.objects.create(
m3u_account=instance,
name=f"{instance.name} Default",
max_streams=instance.max_streams,
is_default=True,
is_active=True,
@@ -253,6 +364,5 @@ def create_profile_for_m3u_account(sender, instance, created, **kwargs):
is_default=True,
)
profile.max_streams = instance.max_streams
profile.save()


@@ -1,41 +1,106 @@
from core.utils import validate_flexible_url
from rest_framework import serializers, status
from rest_framework.response import Response
from .models import M3UAccount, M3UFilter, ServerGroup, M3UAccountProfile
from core.models import UserAgent
from apps.channels.models import ChannelGroup, ChannelGroupM3UAccount
from apps.channels.serializers import (
ChannelGroupM3UAccountSerializer,
)
import logging
import json
logger = logging.getLogger(__name__)
class M3UFilterSerializer(serializers.ModelSerializer):
"""Serializer for M3U Filters"""
class Meta:
model = M3UFilter
fields = [
"id",
"filter_type",
"regex_pattern",
"exclude",
"order",
"custom_properties",
]
from rest_framework import serializers
from .models import M3UAccountProfile
class M3UAccountProfileSerializer(serializers.ModelSerializer):
account = serializers.SerializerMethodField()
def get_account(self, obj):
"""Include basic account information for frontend use"""
return {
'id': obj.m3u_account.id,
'name': obj.m3u_account.name,
'account_type': obj.m3u_account.account_type,
'is_xtream_codes': obj.m3u_account.account_type == 'XC'
}
class Meta:
model = M3UAccountProfile
fields = [
"id",
"name",
"max_streams",
"is_active",
"is_default",
"current_viewers",
"search_pattern",
"replace_pattern",
"custom_properties",
"account",
]
read_only_fields = ["id", "account"]
extra_kwargs = {
'search_pattern': {'required': False, 'allow_blank': True},
'replace_pattern': {'required': False, 'allow_blank': True},
}
def create(self, validated_data):
m3u_account = self.context.get("m3u_account")
# Use the m3u_account when creating the profile
validated_data["m3u_account_id"] = m3u_account.id
return super().create(validated_data)
    def validate(self, data):
        """Custom validation to handle default profiles"""
        # For updates to existing instances
        if self.instance and self.instance.is_default:
            # For default profiles, search_pattern and replace_pattern are not required
            # and we don't want to validate them since they shouldn't be changed
            return data

        # For non-default profiles or new profiles, ensure required fields are present
        if not data.get('search_pattern'):
            raise serializers.ValidationError({
                'search_pattern': ['This field is required for non-default profiles.']
            })

        if not data.get('replace_pattern'):
            raise serializers.ValidationError({
                'replace_pattern': ['This field is required for non-default profiles.']
            })

        return data
    def update(self, instance, validated_data):
        if instance.is_default:
            # For default profiles, only allow updating name and custom_properties (for notes)
            allowed_fields = {'name', 'custom_properties'}

            # Remove any fields that aren't allowed for default profiles
            disallowed_fields = set(validated_data.keys()) - allowed_fields
            if disallowed_fields:
                raise serializers.ValidationError(
                    f"Default profiles can only modify name and notes. "
                    f"Cannot modify: {', '.join(disallowed_fields)}"
                )

        return super().update(instance, validated_data)
    def destroy(self, request, *args, **kwargs):
@@ -43,13 +108,15 @@ class M3UAccountProfileSerializer(serializers.ModelSerializer):
        if instance.is_default:
            return Response(
                {"error": "Default profiles cannot be deleted."},
                status=status.HTTP_400_BAD_REQUEST,
            )

        return super().destroy(request, *args, **kwargs)
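The rules the profile serializer enforces above can be summarized outside of DRF. The following is a hypothetical, framework-free sketch (the function name `check_profile_update` is illustrative, not from the repo): default profiles accept only `name` and `custom_properties` edits, while non-default profiles must supply both regex patterns.

```python
def check_profile_update(is_default, incoming):
    """Return the incoming data if the update is allowed, else raise ValueError."""
    if is_default:
        # Default profiles: only the name and notes (custom_properties) may change.
        disallowed = set(incoming) - {"name", "custom_properties"}
        if disallowed:
            raise ValueError(
                "Default profiles can only modify name and notes. "
                f"Cannot modify: {', '.join(sorted(disallowed))}"
            )
        return incoming
    # Non-default profiles must carry both transform patterns.
    for field in ("search_pattern", "replace_pattern"):
        if not incoming.get(field):
            raise ValueError(f"{field} is required for non-default profiles.")
    return incoming
```

In the real serializer the same checks are split between `validate()` (required patterns) and `update()` (default-profile field allowlist), with `serializers.ValidationError` instead of `ValueError`.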
class M3UAccountSerializer(serializers.ModelSerializer):
    """Serializer for M3U Account"""

    filters = serializers.SerializerMethodField()
    # Include user_agent as a mandatory field using its primary key.
    user_agent = serializers.PrimaryKeyRelatedField(
        queryset=UserAgent.objects.all(),
@@ -57,21 +124,96 @@ class M3UAccountSerializer(serializers.ModelSerializer):
        allow_null=True,
    )
    profiles = M3UAccountProfileSerializer(many=True, read_only=True)
    read_only_fields = ["locked", "created_at", "updated_at"]
    # channel_groups = serializers.SerializerMethodField()
    channel_groups = ChannelGroupM3UAccountSerializer(
        source="channel_group", many=True, required=False
    )
    server_url = serializers.CharField(
        required=False,
        allow_blank=True,
        allow_null=True,
        validators=[validate_flexible_url],
    )
    enable_vod = serializers.BooleanField(required=False, write_only=True)
    auto_enable_new_groups_live = serializers.BooleanField(required=False, write_only=True)
    auto_enable_new_groups_vod = serializers.BooleanField(required=False, write_only=True)
    auto_enable_new_groups_series = serializers.BooleanField(required=False, write_only=True)

    class Meta:
        model = M3UAccount
        fields = [
            "id",
            "name",
            "server_url",
            "file_path",
            "server_group",
            "max_streams",
            "is_active",
            "created_at",
            "updated_at",
            "filters",
            "user_agent",
            "profiles",
            "locked",
            "channel_groups",
            "refresh_interval",
            "custom_properties",
            "account_type",
            "username",
            "password",
            "stale_stream_days",
            "priority",
            "status",
            "last_message",
            "enable_vod",
            "auto_enable_new_groups_live",
            "auto_enable_new_groups_vod",
            "auto_enable_new_groups_series",
        ]
        extra_kwargs = {
            "password": {
                "required": False,
                "allow_blank": True,
            },
        }
    def to_representation(self, instance):
        data = super().to_representation(instance)

        # Parse custom_properties to get VOD preference and auto_enable_new_groups settings
        custom_props = instance.custom_properties or {}
        data["enable_vod"] = custom_props.get("enable_vod", False)
        data["auto_enable_new_groups_live"] = custom_props.get("auto_enable_new_groups_live", True)
        data["auto_enable_new_groups_vod"] = custom_props.get("auto_enable_new_groups_vod", True)
        data["auto_enable_new_groups_series"] = custom_props.get("auto_enable_new_groups_series", True)

        return data
    def update(self, instance, validated_data):
        # Handle enable_vod preference and auto_enable_new_groups settings
        enable_vod = validated_data.pop("enable_vod", None)
        auto_enable_new_groups_live = validated_data.pop("auto_enable_new_groups_live", None)
        auto_enable_new_groups_vod = validated_data.pop("auto_enable_new_groups_vod", None)
        auto_enable_new_groups_series = validated_data.pop("auto_enable_new_groups_series", None)

        # Get existing custom_properties
        custom_props = instance.custom_properties or {}

        # Update preferences
        if enable_vod is not None:
            custom_props["enable_vod"] = enable_vod
        if auto_enable_new_groups_live is not None:
            custom_props["auto_enable_new_groups_live"] = auto_enable_new_groups_live
        if auto_enable_new_groups_vod is not None:
            custom_props["auto_enable_new_groups_vod"] = auto_enable_new_groups_vod
        if auto_enable_new_groups_series is not None:
            custom_props["auto_enable_new_groups_series"] = auto_enable_new_groups_series

        validated_data["custom_properties"] = custom_props

        # Pop out channel group memberships so we can handle them manually
        channel_group_data = validated_data.pop("channel_group", [])

        # First, update the M3UAccount itself
        for attr, value in validated_data.items():
@@ -81,13 +223,12 @@ class M3UAccountSerializer(serializers.ModelSerializer):
        # Prepare a list of memberships to update
        memberships_to_update = []
        for group_data in channel_group_data:
            group = group_data.get("channel_group")
            enabled = group_data.get("enabled")

            try:
                membership = ChannelGroupM3UAccount.objects.get(
                    m3u_account=instance, channel_group=group
                )
                membership.enabled = enabled
                memberships_to_update.append(membership)
@@ -96,13 +237,39 @@ class M3UAccountSerializer(serializers.ModelSerializer):
        # Perform the bulk update
        if memberships_to_update:
            ChannelGroupM3UAccount.objects.bulk_update(
                memberships_to_update, ["enabled"]
            )

        return instance
    def create(self, validated_data):
        # Handle enable_vod preference and auto_enable_new_groups settings during creation
        enable_vod = validated_data.pop("enable_vod", False)
        auto_enable_new_groups_live = validated_data.pop("auto_enable_new_groups_live", True)
        auto_enable_new_groups_vod = validated_data.pop("auto_enable_new_groups_vod", True)
        auto_enable_new_groups_series = validated_data.pop("auto_enable_new_groups_series", True)

        # Parse existing custom_properties or create new
        custom_props = validated_data.get("custom_properties", {})

        # Set preferences (default to True for auto_enable_new_groups)
        custom_props["enable_vod"] = enable_vod
        custom_props["auto_enable_new_groups_live"] = auto_enable_new_groups_live
        custom_props["auto_enable_new_groups_vod"] = auto_enable_new_groups_vod
        custom_props["auto_enable_new_groups_series"] = auto_enable_new_groups_series

        validated_data["custom_properties"] = custom_props

        return super().create(validated_data)

    def get_filters(self, obj):
        filters = obj.filters.order_by("order")
        return M3UFilterSerializer(filters, many=True).data
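The `to_representation`, `update`, and `create` methods above all route the write-only boolean fields through the `custom_properties` JSON blob. A framework-free sketch of that round-trip, using the same defaults the serializer uses (`enable_vod` defaults to `False`, the `auto_enable_*` flags to `True`; the helper names are illustrative, not from the repo):

```python
AUTO_FLAGS = (
    "auto_enable_new_groups_live",
    "auto_enable_new_groups_vod",
    "auto_enable_new_groups_series",
)

def merge_vod_prefs(custom_props, updates):
    """Fold submitted preference flags into the stored custom_properties dict."""
    merged = dict(custom_props or {})
    for key in ("enable_vod",) + AUTO_FLAGS:
        # Only overwrite keys the client actually submitted (None means "not sent").
        if updates.get(key) is not None:
            merged[key] = updates[key]
    return merged

def read_vod_prefs(custom_props):
    """Expose stored preferences with the serializer's read-side defaults."""
    props = custom_props or {}
    out = {"enable_vod": props.get("enable_vod", False)}
    for key in AUTO_FLAGS:
        out[key] = props.get(key, True)
    return out
```

This mirrors why the fields are declared `write_only=True`: they never serialize directly from model columns, so `to_representation` re-derives them from the JSON blob on every read.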
class ServerGroupSerializer(serializers.ModelSerializer):
    """Serializer for Server Group"""

    class Meta:
        model = ServerGroup
        fields = ["id", "name"]


@@ -1,10 +1,13 @@
# apps/m3u/signals.py
from django.db.models.signals import post_save, post_delete, pre_save
from django.dispatch import receiver
from .models import M3UAccount
from .tasks import refresh_single_m3u_account, refresh_m3u_groups, delete_m3u_refresh_task_by_id
from django_celery_beat.models import PeriodicTask, IntervalSchedule
import json
import logging

logger = logging.getLogger(__name__)
@receiver(post_save, sender=M3UAccount)
def refresh_account_on_save(sender, instance, created, **kwargs):
@@ -13,7 +16,7 @@ def refresh_account_on_save(sender, instance, created, **kwargs):
    call a Celery task that fetches & parses that single account
    if it is active or newly created.
    """
    if created and instance.account_type != M3UAccount.Types.XC:
        refresh_m3u_groups.delay(instance.id)
@receiver(post_save, sender=M3UAccount)
@@ -28,21 +31,17 @@ def create_or_update_refresh_task(sender, instance, **kwargs):
        period=IntervalSchedule.HOURS
    )

    # Task should be enabled only if refresh_interval != 0 AND account is active
    should_be_enabled = (instance.refresh_interval != 0) and instance.is_active

    # First check if the task already exists to avoid validation errors
    try:
        task = PeriodicTask.objects.get(name=task_name)
        # Task exists, just update it
        updated_fields = []

        if task.enabled != should_be_enabled:
            task.enabled = should_be_enabled
            updated_fields.append("enabled")

        if task.interval != interval:
@@ -52,11 +51,60 @@ def create_or_update_refresh_task(sender, instance, **kwargs):
        if updated_fields:
            task.save(update_fields=updated_fields)

        # Ensure instance has the task
        if instance.refresh_task_id != task.id:
            M3UAccount.objects.filter(id=instance.id).update(refresh_task=task)
    except PeriodicTask.DoesNotExist:
        # Create new task if it doesn't exist
        refresh_task = PeriodicTask.objects.create(
            name=task_name,
            interval=interval,
            task="apps.m3u.tasks.refresh_single_m3u_account",
            kwargs=json.dumps({"account_id": instance.id}),
            enabled=should_be_enabled,
        )
        M3UAccount.objects.filter(id=instance.id).update(refresh_task=refresh_task)
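The enable decision in the handler above is applied in both the update and the create branch, so it is worth stating on its own. A minimal sketch (the function name is illustrative, not from the repo): the Celery Beat task should run only when a non-zero refresh interval is configured and the account itself is active.

```python
def refresh_task_should_be_enabled(refresh_interval, is_active):
    """Mirror of `should_be_enabled` in the signal handler:
    a zero interval disables scheduling, and so does an inactive account."""
    return (refresh_interval != 0) and is_active
```

Keeping the rule in one expression is what lets the handler flip `task.enabled` in place (via `update_fields=["enabled"]`) without rebuilding the PeriodicTask.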
@receiver(post_delete, sender=M3UAccount)
def delete_refresh_task(sender, instance, **kwargs):
    """
    Delete the associated Celery Beat periodic task when an M3UAccount is deleted.
    """
    try:
        # First try the foreign key relationship to find the task ID
        task = None
        if instance.refresh_task:
            logger.info(f"Found task via foreign key: {instance.refresh_task.id} for M3UAccount {instance.id}")
            task = instance.refresh_task

        # Use the helper function to delete the task
        if task:
            delete_m3u_refresh_task_by_id(instance.id)
        else:
            # Otherwise use the helper function
            delete_m3u_refresh_task_by_id(instance.id)
    except Exception as e:
        logger.error(f"Error in delete_refresh_task signal handler: {str(e)}", exc_info=True)
@receiver(pre_save, sender=M3UAccount)
def update_status_on_active_change(sender, instance, **kwargs):
    """
    When an M3UAccount's is_active field changes, update the status accordingly.
    """
    if instance.pk:  # Only for existing records, not new ones
        try:
            # Get the current record from the database
            old_instance = M3UAccount.objects.get(pk=instance.pk)

            # If is_active changed, update the status
            if old_instance.is_active != instance.is_active:
                if instance.is_active:
                    # When activating, set status to idle
                    instance.status = M3UAccount.Status.IDLE
                else:
                    # When deactivating, set status to disabled
                    instance.status = M3UAccount.Status.DISABLED
        except M3UAccount.DoesNotExist:
            # New record, will use default status
            pass
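The status transition performed by the `pre_save` handler above reduces to a small pure function. A sketch under stated assumptions: `"idle"` and `"disabled"` stand in for `M3UAccount.Status.IDLE` and `M3UAccount.Status.DISABLED`, and the function name is illustrative, not from the repo.

```python
def status_after_active_change(old_is_active, new_is_active, current_status):
    """Return the status an account should carry after a save that may
    have toggled is_active."""
    if old_is_active == new_is_active:
        # is_active did not change; leave whatever status the account had.
        return current_status
    # Activating resets to idle; deactivating marks the account disabled.
    return "idle" if new_is_active else "disabled"
```

Doing this in `pre_save` (rather than `post_save`) means the corrected status is written in the same UPDATE as the `is_active` change, so no second save is needed.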
