Merge branch 'dev' of https://github.com/Dispatcharr/Dispatcharr into pr/bobey6/707

SergeantPanda committed 042612c677 on 2025-12-03 16:49:21 -06:00
49 changed files with 2837 additions and 307 deletions

View file

@@ -3,6 +3,8 @@ name: CI Pipeline
on:
push:
branches: [dev]
paths-ignore:
- '**.md'
pull_request:
branches: [dev]
workflow_dispatch:

View file

@@ -43,6 +43,10 @@ jobs:
NEW_VERSION=$(python -c "import version; print(f'{version.__version__}')")
echo "new_version=${NEW_VERSION}" >> $GITHUB_OUTPUT
- name: Update Changelog
run: |
python scripts/update_changelog.py ${{ steps.update_version.outputs.new_version }}
- name: Set repository metadata
id: meta
run: |
@@ -54,7 +58,7 @@ jobs:
- name: Commit and Tag
run: |
git add version.py
git add version.py CHANGELOG.md
git commit -m "Release v${{ steps.update_version.outputs.new_version }}"
git tag -a "v${{ steps.update_version.outputs.new_version }}" -m "Release v${{ steps.update_version.outputs.new_version }}"
git push origin main --tags
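The new "Update Changelog" step runs `scripts/update_changelog.py`, which is not included in this diff. A minimal sketch of what such a script might do, assuming it promotes the `[Unreleased]` section of `CHANGELOG.md` to a dated version heading (the names and behavior below are assumptions, not the repository's actual implementation):

```python
# Hypothetical sketch of scripts/update_changelog.py -- the real script is not shown here.
import sys
from datetime import date

def promote_unreleased(version: str, path: str = "CHANGELOG.md") -> None:
    """Move the current [Unreleased] notes under a new dated version heading."""
    with open(path) as f:
        text = f.read()
    stamp = f"## [{version}] - {date.today().isoformat()}"
    # Insert the dated heading directly below a fresh, empty [Unreleased] stub.
    with open(path, "w") as f:
        f.write(text.replace("## [Unreleased]", f"## [Unreleased]\n\n{stamp}", 1))

if __name__ == "__main__":
    promote_unreleased(sys.argv[1])
```

This matches how the workflow invokes it (`python scripts/update_changelog.py <new_version>`) and why `CHANGELOG.md` is now committed alongside `version.py` in the release step.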

CHANGELOG.md (new file, 844 lines added)
View file

@@ -0,0 +1,844 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [Unreleased]
### Changed
- IPv6 access now allowed by default with all IPv6 CIDRs accepted - Thanks [@adrianmace](https://github.com/adrianmace)
- nginx.conf updated to bind to both IPv4 and IPv6 ports - Thanks [@jordandalley](https://github.com/jordandalley)
## [0.13.0] - 2025-12-02
### Added
- `CHANGELOG.md` file following Keep a Changelog format to document all notable changes and project history
- System event logging and viewer: Comprehensive logging system that tracks internal application events (M3U refreshes, EPG updates, stream switches, errors) with a dedicated UI viewer for filtering and reviewing historical events. Improves monitoring, troubleshooting, and understanding system behavior
- M3U/EPG endpoint caching: Implements intelligent caching for frequently requested M3U playlists and EPG data to reduce database load and improve response times for clients.
- Search icon to name headers for the channels and streams tables (#686)
- Comprehensive logging for user authentication events and network access restrictions
- Validation for EPG objects and payloads in updateEPG functions to prevent errors from invalid data
- `referrerpolicy` attribute on YouTube iframes in series and VOD modals for better compatibility
### Changed
- XC player API now returns server_info for unknown actions to align with provider behavior
- XC player API refactored to streamline action handling and ensure consistent responses
- Date parsing logic in generate_custom_dummy_programs improved to handle empty or invalid inputs
- DVR cards now reflect date and time formats chosen by user - Thanks [@Biologisten](https://github.com/Biologisten)
- "Uncategorized" categories and relations now automatically created for VOD accounts to improve content management (#627)
- Improved minimum horizontal size in the stats page for better usability on smaller displays
- M3U and EPG generation now handles missing channel profiles with appropriate error logging
### Fixed
- Episode URLs in series modal now use UUID instead of ID, fixing broken links (#684, #694)
- Stream preview now respects selected M3U profile instead of always using default profile (#690)
- Channel groups filter in M3UGroupFilter component now filters out non-existent groups (prevents blank webui when editing M3U after a group was removed)
- Stream order now preserved in PATCH/PUT responses from ChannelSerializer, ensuring consistent ordering across all API operations - Thanks [@FiveBoroughs](https://github.com/FiveBoroughs) (#643)
- XC client compatibility: float channel numbers now converted to integers
- M3U account and profile modals now scrollable on mobile devices for improved usability
## [0.12.0] - 2025-11-19
### Added
- RTSP stream support with automatic protocol detection when a proxy profile requires it. The proxy now forces FFmpeg for RTSP sources and properly handles RTSP URLs - Thanks [@ragchuck](https://github.com/ragchuck) (#184)
- UDP stream support, including correct handling when a proxy profile specifies a UDP source. The proxy now skips HTTP-specific headers (like `user_agent`) for non-HTTP protocols and performs manual redirect handling to improve reliability (#617)
- Separate VOD logos system with a new `VODLogo` model, database migration, dedicated API/viewset, and server-paginated UI. This separates movie/series logos from channel logos, making cleanup safer and enabling independent bulk operations
### Changed
- Background profile refresh now uses a rate-limiting/backoff strategy to avoid provider bans
- Bulk channel editing now validates all requested changes up front and applies updates in a single database transaction
- ProxyServer shutdown & ghost-client handling improved to avoid initializing channels for transient clients and prevent duplicate reinitialization during rapid reconnects
- URL / Stream validation expanded to support credentials on non-FQDN hosts, skips HTTP-only checks for RTSP/RTP/UDP streams, and improved host/port normalization
- TV guide scrolling & timeline synchronization improved with mouse-wheel scrolling, synchronized timeline position with guide navigation, and improved mobile momentum scrolling (#252)
- EPG Source dropdown now sorts alphabetically - Thanks [@0x53c65c0a8bd30fff](https://github.com/0x53c65c0a8bd30fff)
- M3U POST handling restored and improved for clients (e.g., Smarters) that request playlists using HTTP POST - Thanks [@maluueu](https://github.com/maluueu)
- Login form revamped with branding, cleaner layout, loading state, "Remember Me" option, and focused sign-in flow
- Series & VOD now have copy-link buttons in modals for easier URL sharing
- `get_host_and_port` now prioritizes verified port sources and handles reverse-proxy edge cases more accurately (#618)
### Fixed
- EXTINF parsing overhauled to correctly extract attributes such as `tvg-id`, `tvg-name`, and `group-title`, even when values include quotes or commas (#637)
- Websocket payload size reduced during EPG processing to avoid UI freezes, blank screens, or memory spikes in the browser (#327)
- Logo management UI fixes including confirmation dialogs, header checkbox reset, delete button reliability, and full client refetch after cleanup
## [0.11.2] - 2025-11-04
### Added
- Custom Dummy EPG improvements:
  - Support for using an existing Custom Dummy EPG as a template for creating new EPGs
  - Custom fallback templates for unmatched patterns
  - `{endtime}` as an available output placeholder; renamed `{time}` → `{starttime}` (#590)
  - Support for date placeholders that respect both source and output timezones (#597)
  - Ability to bulk assign Custom Dummy EPGs to multiple channels
  - "Include New Tag" option to mark programs as new in Dummy EPG output
  - Support for month strings in date parsing
  - Ability to set custom posters and channel logos via regex patterns for Custom Dummy EPGs
  - Improved DST handling by calculating offsets based on the actual program date, not today's date
### Changed
- Stream model maximum URL length increased from 2000 to 4096 characters (#585)
- Groups now sorted during `xc_get_live_categories` based on the order they first appear (by lowest channel number)
- Client TTL settings updated and periodic refresh implemented during active streaming to maintain accurate connection tracking
- `ProgramData.sub_title` field changed from `CharField` to `TextField` to allow subtitles longer than 255 characters (#579)
- Startup improved by verifying `/data` directory ownership and automatically fixing permissions if needed. Pre-creates `/data/models` during initialization (#614)
- Port detection enhanced to check `request.META.get("SERVER_PORT")` before falling back to defaults, ensuring correct port when generating M3U, EPG, and logo URLs - Thanks [@lasharor](https://github.com/lasharor)
### Fixed
- Custom Dummy EPG frontend DST calculation now uses program date instead of current date
- Channel titles no longer truncated early after an apostrophe - Thanks [@0x53c65c0a8bd30fff](https://github.com/0x53c65c0a8bd30fff)
## [0.11.1] - 2025-10-22
### Fixed
- uWSGI not receiving environmental variables
- LXC unable to access daemons launched by uWSGI ([#575](https://github.com/Dispatcharr/Dispatcharr/issues/575), [#576](https://github.com/Dispatcharr/Dispatcharr/issues/576), [#577](https://github.com/Dispatcharr/Dispatcharr/issues/577))
## [0.11.0] - 2025-10-22
### Added
- Custom Dummy EPG system (#293):
  - Regex pattern matching and name source selection
  - Support for custom upcoming and ended programs
  - Timezone-aware with source and local timezone selection
  - Option to include categories and date/live tags in Dummy EPG output
- Auto-Enable & Category Improvements:
  - Auto-enable settings for new groups and categories in M3U and VOD components (#208)
- IPv6 CIDR validation in Settings - Thanks [@jordandalley](https://github.com/jordandalley) (#236)
- Custom logo support for channel groups in Auto Sync Channels (#555)
- Tooltips added to the Stream Table
### Changed
- Celery and uWSGI now have configurable `nice` levels (defaults: `uWSGI=0`, `Celery=5`) to prioritize streaming when needed. (#571)
- Directory creation and ownership management refactored in init scripts to avoid unnecessary recursive `chown` operations and improve boot speed
- HTTP streamer switched to threaded model with piped output for improved robustness
- Chunk timeout configuration improved and StreamManager timeout handling enhanced
- Proxy timeout values reduced to avoid unnecessary waiting
- Resource cleanup improved to prevent "Too many open files" errors
- Proxy settings caching implemented and database connections properly closed after use
- EPG program fetching optimized with chunked retrieval and explicit ordering to reduce memory usage during output
- EPG output now sorted by channel number for consistent presentation
- Stream Table buttons reordered for better usability
- Database connection handling improved throughout the codebase to reduce overall connection count
### Fixed
- Crash when resizing columns in the Channel Table (#516)
- Errors when saving stream settings (#535)
- Preview and edit bugs for custom streams where profile and group selections did not display correctly
- `channel_id` and `channel.uuid` now converted to strings before processing to fix manual switching when the uWSGI worker was not the stream owner (#269)
- Stream locking and connection search issues when switching channels; increased search timeout to reduce premature failures (#503)
- Stream Table buttons no longer shift into multiple rows when selecting many streams
- Custom stream previews
- Custom Stream settings not loading properly (#186)
- Orphaned categories now automatically removed for VOD and Series during M3U refresh (#540)
## [0.10.4] - 2025-10-08
### Added
- "Assign TVG-ID from EPG" functionality with frontend actions for single-channel and batch operations
- Confirmation dialogs in `ChannelBatchForm` for setting names, logos, TVG-IDs, and clearing EPG assignments
- "Clear EPG" button to `ChannelBatchForm` for easy reset of assignments
- Batch editing of channel logos - Thanks [@EmeraldPi](https://github.com/EmeraldPi)
- Ability to set logo name from URL - Thanks [@EmeraldPi](https://github.com/EmeraldPi)
- Proper timestamp tracking for channel creation and updates; `XC Get Live Streams` now uses this information
- Time Zone Settings added to the application ([#482](https://github.com/Dispatcharr/Dispatcharr/issues/482), [#347](https://github.com/Dispatcharr/Dispatcharr/issues/347))
- Comskip settings support including comskip.ini upload and custom directory selection (#418)
- Manual recording scheduling for channels without EPG data (#162)
### Changed
- Default M3U account type is now set to XC for new accounts
- Performance optimization: Only fetch playlists and channel profiles after a successful M3U refresh (rather than every status update)
- Playlist retrieval now includes current connection counts and improved session handling during VOD session start
- Improved stream selection logic when all profiles have reached max connections (retries faster)
### Fixed
- Large EPGs now fully parse all channels
- Duplicate channel outputs for streamer profiles set to "All"
- Streamer profiles with "All" assigned now receive all eligible channels
- PostgreSQL btree index errors from logo URL validation during channel creation (#519)
- M3U processing lock not releasing when no streams found during XC refresh, which also skipped VOD scanning (#449)
- Float conversion errors by normalizing decimal format during VOD scanning (#526)
- Direct URL ordering in M3U output to use correct stream sequence (#528)
- Adding multiple M3U accounts without refreshing modified only the first entry (#397)
- UI state bug where new playlist creation was not notified to frontend ("Fetching Groups" stuck)
- Minor FFmpeg task and stream termination bugs in DVR module
- Input escaping issue where single quotes were interpreted as code delimiters (#406)
## [0.10.3] - 2025-10-04
### Added
- Logo management UI improvements where Channel editor now uses the Logo Manager modal, allowing users to add logos by URL directly from the edit form - Thanks [@EmeraldPi](https://github.com/EmeraldPi)
### Changed
- FFmpeg base container rebuilt with improved native build support - Thanks [@EmeraldPi](https://github.com/EmeraldPi)
- GitHub Actions workflow updated to use native runners instead of QEMU emulation for more reliable multi-architecture builds
### Fixed
- EPG parsing stability when large EPG files would not fully parse all channels. Parser now uses `iterparse` with `recover=True` for both channel and program-level parsing, ensuring complete and resilient XML processing even when Cloudflare injects additional root elements
## [0.10.2] - 2025-10-03
### Added
- `m3u_id` parameter to `generate_hash_key` and updated related calls
- Support for `x-tvg-url` and `url-tvg` generation with preserved query parameters (#345)
- Exact Gracenote ID matching for EPG channel mapping (#291)
- Recovery handling for XMLTV parser errors
- `nice -n 5` added to Celery commands for better process priority management
### Changed
- Default M3U hash key changed to URL only for new installs
- M3U profile retrieval now includes current connection counts and improved session handling during VOD session start
- Improved stream selection logic when all profiles have reached max connections (retries faster)
- XMLTV parsing refactored to use `iterparse` for `<tv>` element
- Release workflow refactored to run on native architecture
- Docker build system improvements:
  - Split install/build steps
  - Switch from Yarn → NPM
  - Updated to Node.js 24 (frontend build)
  - Improved ARM build reliability
  - Pushes to DockerHub with combined manifest
  - Removed redundant tags and improved build organization
### Fixed
- Cloudflare-hosted EPG feeds breaking parsing (#497)
- Bulk channel creation now preserves the order channels were selected in (no longer reversed)
- M3U hash settings not saving properly
- VOD selecting the wrong M3U profile at session start (#461)
- Redundant `h` removed from 12-hour time format in settings page
## [0.10.1] - 2025-09-24
### Added
- Virtualized rendering for TV Guide for smoother performance when displaying large guides - Thanks [@stlalpha](https://github.com/stlalpha) (#438)
- Enhanced channel/program mapping to reuse EPG data across multiple channels that share the same TVG-ID
### Changed
- `URL` field length in EPGSource model increased from 200 → 1000 characters to support long URLs with tokens
- Improved URL transformation logic with more advanced regex during profile refreshes
- During EPG scanning, the first display name for a channel is now used instead of the last
- `whiteSpace` style changed from `nowrap` → `pre` in StreamsTable for better text formatting
### Fixed
- EPG channel parsing failure when channel `URL` exceeded 500 characters by adding validation during scanning (#452)
- Frontend incorrectly saving case-sensitive setting as a JSON string for stream filters
## [0.10.0] - 2025-09-18
### Added
- Channel Creation Improvements:
  - Ability to specify channel number during channel creation ([#377](https://github.com/Dispatcharr/Dispatcharr/issues/377), [#169](https://github.com/Dispatcharr/Dispatcharr/issues/169))
  - Asynchronous bulk channel creation from stream IDs with WebSocket progress updates
  - WebSocket notifications when channels are created
- EPG Auto-Matching (Rewritten & Enhanced):
  - Completely refactored for improved accuracy and efficiency
  - Can now be applied to selected channels or triggered directly from the channel edit form
  - Uses stricter matching logic with support from sentence transformers
  - Added progress notifications during the matching process
  - Implemented memory cleanup for ML models after matching operations
  - Removed deprecated matching scripts
- Logo & EPG Management:
  - Ability in channel edit form and bulk channel editor to set logos and names from assigned EPG (#157)
  - Improved logo update flow: frontend refreshes on changes, store updates after bulk changes, progress shown via notifications
- Table Enhancements:
  - All tables now support adjustable column resizing (#295)
  - Channels and Streams tables persist column widths and center divider position to local storage
  - Improved sizing and layout for user-agents, stream profiles, logos, M3U, and EPG tables
### Changed
- Simplified VOD and series access: removed user-level restrictions on M3U accounts
- Skip disabled M3U accounts when choosing streams during playback (#402)
- Enhanced `UserViewSet` queryset to prefetch related channel profiles for better performance
- Auto-focus added to EPG filter input
- Category API retrieval now sorts by name
- Increased default column size for EPG fields and removed max size on group/EPG columns
- Standardized EPG column header to display `(EPG ID - TVG-ID)`
### Fixed
- Bug during VOD cleanup where all VODs not from the current M3U scan could be deleted
- Logos not being set correctly in some cases
- Bug where not setting a channel number caused an error when creating a channel (#422)
- Bug where clicking "Add Channel" with a channel selected opened the edit form instead
- Bug where a newly created channel could reuse streams from another channel due to form not clearing properly
- VOD page not displaying correct order while changing pages
- `ReferenceError: setIsInitialized is not defined` when logging into web UI
- `cannot access local variable 'total_chunks' where it is not associated with a value` during VOD refresh
## [0.9.1] - 2025-09-13
### Fixed
- Broken migrations affecting the plugins system
- DVR and plugin paths to ensure proper functionality (#381)
## [0.9.0] - 2025-09-12
### Added
- **Video on Demand (VOD) System:**
  - Complete VOD infrastructure with support for movies and TV series
  - Advanced VOD metadata including IMDB/TMDB integration, trailers, cast information
  - Smart VOD categorization with filtering by type (movies vs series)
  - Multi-provider VOD support with priority-based selection
  - VOD streaming proxy with connection tracking and statistics
  - Season/episode organization for TV series with expandable episode details
  - VOD statistics and monitoring integrated with existing stats dashboard
  - Optimized VOD parsing and category filtering
  - Dedicated VOD page with movies and series tabs
  - Rich VOD modals with backdrop images, trailers, and metadata
  - Episode management with season-based organization
  - Play button integration with external player support
  - VOD statistics cards similar to channel cards
- **Plugin System:**
  - Extensible Plugin Framework - Developers can build custom functionality without modifying Dispatcharr core
  - Plugin Discovery & Management - Automatic detection of installed plugins, with enable/disable controls in the UI
  - Backend API Support - New APIs for listing, loading, and managing plugins programmatically
  - Plugin Registry - Structured models for plugin metadata (name, version, author, description)
  - UI Enhancements - Dedicated Plugins page in the admin panel for centralized plugin management
  - Documentation & Scaffolding - Initial documentation and scaffolding to accelerate plugin development
- **DVR System:**
  - Refreshed DVR page for managing scheduled and completed recordings
  - Global pre/post padding controls surfaced in Settings
  - Playback support for completed recordings directly in the UI
  - DVR table view includes title, channel, time, and padding adjustments for clear scheduling
  - Improved population of DVR listings, fixing intermittent blank screen issues
  - Comskip integration for automated commercial detection and skipping in recordings
  - User-configurable comskip toggle in Settings
- **Enhanced Channel Management:**
  - EPG column added to channels table for better organization
  - EPG filtering by channel assignment and source name
  - Channel batch renaming for efficient bulk channel name updates
  - Auto channel sync improvements with custom stream profile override
  - Channel logo management overhaul with background loading
  - Date and time format customization in settings - Thanks [@Biologisten](https://github.com/Biologisten)
  - Auto-refresh intervals for statistics with better UI controls
  - M3U profile notes field for better organization
  - XC account information retrieval and display with account refresh functionality and notifications
### Changed
- JSONB field conversion for custom properties (replacing text fields) for better performance
- Database encoding converted from ASCII to UTF8 for better character support
- Batch processing for M3U updates and channel operations
- Query optimization with prefetch_related to eliminate N+1 queries
- Reduced API calls by fetching all data at once instead of per-category
- Buffering speed setting now affects UI indicators
- Swagger endpoint accessible with or without trailing slash
- EPG source names displayed before channel names in edit forms
- Logo loading improvements with background processing
- Channel card enhancements with better status indicators
- Group column width optimization
- Better content-type detection for streams
- Improved headers with content-range and total length
- Enhanced user-agent handling for M3U accounts
- HEAD request support with connection keep-alive
- Progress tracking improvements for clients with new sessions
- Server URL length increased to 1000 characters for token support
- Prettier formatting applied to all frontend code
- String quote standardization and code formatting improvements
### Fixed
- Logo loading issues in channel edit forms resolved
- M3U download error handling and user feedback improved
- Unique constraint violations fixed during stream rehashing
- Channel stats fetching moved from Celery beat task to configurable API calls
- Speed badge colors now use configurable buffering speed setting
- Channel cards properly close when streams stop
- "Active Channels" label updated to "Active Streams"
- WebSocket updates for client connect/disconnect events
- Null value handling before database saves
- Empty string scrubbing for cleaner data
- Group relationship cleanup for removed M3U groups
- Logo cleanup for unused files with proper batch processing
- Recordings start 5 mins after show starts (#102)
### Closed
- [#350](https://github.com/Dispatcharr/Dispatcharr/issues/350): Allow DVR recordings to be played via the UI
- [#349](https://github.com/Dispatcharr/Dispatcharr/issues/349): DVR screen doesn't populate consistently
- [#340](https://github.com/Dispatcharr/Dispatcharr/issues/340): Global find and replace
- [#311](https://github.com/Dispatcharr/Dispatcharr/issues/311): Stat's "Current Speed" does not reflect "Buffering Speed" setting
- [#304](https://github.com/Dispatcharr/Dispatcharr/issues/304): Name ignored when uploading logo
- [#300](https://github.com/Dispatcharr/Dispatcharr/issues/300): Updating Logo throws error
- [#286](https://github.com/Dispatcharr/Dispatcharr/issues/286): 2 Value/Column EPG in Channel Edit
- [#280](https://github.com/Dispatcharr/Dispatcharr/issues/280): Add general text field in M3U/XS profiles
- [#190](https://github.com/Dispatcharr/Dispatcharr/issues/190): Show which stream is being used and allow it to be altered in channel properties
- [#155](https://github.com/Dispatcharr/Dispatcharr/issues/155): Additional column with EPG assignment information / Allow filtering by EPG assignment
- [#138](https://github.com/Dispatcharr/Dispatcharr/issues/138): Bulk Channel Edit Functions
## [0.8.0] - 2025-08-19
### Added
- Channel & Stream Enhancements:
  - Preview streams under a channel, with stream logo and name displayed in the channel card
  - Advanced stats for channel streams
  - Stream qualities displayed in the channel table
  - Stream stats now saved to the database
  - URL badges can now be clicked to copy stream links to the clipboard
- M3U Filtering for Streams:
  - Streams for an M3U account can now be filtered using flexible parameters
  - Apply filters based on stream name, group title, or stream URL (via regex)
  - Filters support both inclusion and exclusion logic for precise control
  - Multiple filters can be layered with a priority order for complex rules
- Ability to reverse the sort order for auto channel sync
- Custom validator for URL fields now allows non-FQDN hostnames (#63)
- Membership creation added in `UpdateChannelMembershipAPIView` if not found (#275)
### Changed
- Bumped Postgres to version 17
- Updated dependencies in `requirements.txt` for compatibility and improvements
- Improved chunked extraction to prevent memory issues - Thanks [@pantherale0](https://github.com/pantherale0)
### Fixed
- XML escaping for channel ID in `generate_dummy_epg` function
- Bug where creating a channel from a stream not displayed in the table used an invalid stream name
- Debian install script - Thanks [@deku-m](https://github.com/deku-m)
## [0.7.1] - 2025-07-29
### Added
- Natural sorting for channel names during auto channel sync
- Ability to sort auto sync order by provider order (default), channel name, TVG ID, or last updated time
- Auto-created channels can now be assigned to specific channel profiles (#255)
- Channel profiles are now fetched automatically after a successful M3U refresh
- Uses only whole numbers when assigning the next available channel number
### Changed
- Logo upload behavior changed to wait for the Create button before saving
- Uses the channel name as the display name in EPG output for improved readability
- Ensures channels are only added to a selected profile if one is explicitly chosen
### Fixed
- Logo Manager prevents redundant messages from the file scanner by properly tracking uploaded logos in Redis
- Fixed an issue preventing logo uploads via URL
- Adds internal support for assigning multiple profiles via API
## [0.7.0] - 2025-07-19
### Added
- **Logo Manager:**
  - Complete logo management system with filtering, search, and usage tracking
  - Upload logos directly through the UI
  - Automatically scan `/data/logos` for existing files (#69)
  - View which channels use each logo
  - Bulk delete unused logos with cleanup
  - Enhanced display with hover effects and improved sizing
  - Improved logo fetching with timeouts and user-agent headers to prevent hanging
- **Group Manager:**
  - Comprehensive group management interface (#128)
  - Search and filter groups with ease
  - Bulk operations for cleanup
  - Filter channels by group membership
  - Automatically clean up unused groups
- **Auto Channel Sync:**
  - Automatic channel synchronization from M3U sources (#147)
  - Configure auto-sync settings per M3U account group
  - Set starting channel numbers by group
  - Override group names during sync
  - Apply regex match and replace for channel names
  - Filter channels by regex match on stream name
  - Track auto-created vs manually added channels
  - Smart updates preserve UUIDs and existing links
- Stream rehashing with WebSocket notifications
- Better error handling for blocked rehash attempts
- Lock acquisition to prevent conflicts
- Real-time progress tracking
### Changed
- Persist table page sizes in local storage (streams & channels)
- Smoother pagination and improved UX
- Fixed z-index issues during table refreshes
- Improved XC client with connection pooling
- Better error handling for API and JSON decode failures
- Smarter handling of empty content and blocking responses
- Improved EPG XML generation with richer metadata
- Better support for keywords, languages, ratings, and credits
- Better form layouts and responsive buttons
- Enhanced confirmation dialogs and feedback
### Fixed
- Channel table now correctly restores page size from local storage
- Resolved WebSocket message formatting issues
- Fixed logo uploads and edits
- Corrected ESLint issues across the codebase
- Fixed HTML validation errors in menus
- Optimized logo fetching with proper timeouts and headers ([#101](https://github.com/Dispatcharr/Dispatcharr/issues/101), [#217](https://github.com/Dispatcharr/Dispatcharr/issues/217))
## [0.6.2] - 2025-07-10
### Fixed
- **Streaming & Connection Stability:**
  - Provider timeout issues - Slow but responsive providers no longer cause channel lockups
  - Added chunk and process timeouts - Prevents hanging during stream processing and transcoding
  - Improved connection handling - Enhanced process management and socket closure detection for safer streaming
  - Enhanced health monitoring - Health monitor now properly notifies main thread without attempting reconnections
- **User Interface & Experience:**
  - Touch screen compatibility - Web player can now be properly closed on touch devices
  - Improved user management - Added support for first/last names, login tracking, and standardized table formatting
  - Improved logging - Enhanced log messages with channel IDs for better debugging
  - Code cleanup - Removed unused imports, variables, and dead links
## [0.6.1] - 2025-06-27
### Added
- Dynamic parameter options for M3U and EPG URLs (#207)
- Support for 'num' property in channel number extraction (fixes channel creation from XC streams not having channel numbers)
### Changed
- EPG generation now uses streaming responses to prevent client timeouts during large EPG file generation (#179)
- Improved reliability when downloading EPG data from external sources
- Better program positioning - Programs that start before the current view now have proper text positioning (#223)
- Better mobile support - Improved sizing and layout for mobile devices across multiple tables
- Responsive stats cards - Better calculation for card layout and improved filling on different screen sizes (#218)
- Enhanced table rendering - M3U and EPG tables now render better on small screens
- Optimized spacing - Removed unnecessary padding and blank space throughout the interface
- Better settings layout - Improved minimum widths and mobile support for settings pages
- Always show 2 decimal places for FFmpeg speed values
### Fixed
- TV Guide now properly filters channels based on selected channel group
- Resolved loading issues - Fixed channels and groups not loading correctly in the TV Guide
- Stream profile fixes - Resolved issue with setting stream profile to 'use default'
- Single channel editing - When only one channel is selected, the correct channel editor now opens
- Bulk edit improvements - Added "no change" options for bulk editing operations
- Bulk channel editor now properly saves changes (#222)
- Link form improvements - Better sizing and rendering of link forms with proper layering
- Confirmation dialogs added with warning suppression for user deletion, channel profile deletion, and M3U profile deletion
## [0.6.0] - 2025-06-19
### Added
- **User Management & Access Control:**
  - Complete user management system with user levels and channel access controls
  - Network access control with CIDR validation and IP-based restrictions
  - Logout functionality and improved loading states for authenticated users
- **Xtream Codes Output:**
  - Xtream Codes support enables easy output to IPTV clients (#195)
- **Stream Management & Monitoring:**
  - FFmpeg statistics integration - Real-time display of video/audio codec info, resolution, speed, and stream type
  - Automatic stream switching when buffering is detected
  - Enhanced stream profile management with better connection tracking
  - Improved stream state detection, including buffering as an active state
- **Channel Management:**
  - Bulk channel editing for channel group, stream profile, and user access level
- **Enhanced M3U & EPG Features:**
  - Dynamic `tvg-id` source selection for M3U and EPG (`tvg_id`, `gracenote`, or `channel_number`)
  - Direct URL support in M3U output via `direct=true` parameter
  - Flexible EPG output with a configurable day limit via `days=#` parameter
  - Support for LIVE tags and `dd_progrid` numbering in EPG processing
- Proxy settings configuration with UI integration and improved validation
- Stream retention controls - Set stale stream days to `0` to disable retention completely (#123)
- Tuner flexibility - Minimum of 1 tuner now allowed for HDHomeRun output
- Fallback IP geolocation provider (#127) - Thanks [@maluueu](https://github.com/maluueu)
- POST method now allowed for M3U output, enabling support for Smarters IPTV - Thanks [@maluueu](https://github.com/maluueu)
### Changed
- Improved channel cards with better status indicators and tooltips
- Clearer error messaging for unsupported codecs in the web player
- Network access warnings to prevent accidental lockouts
- Case-insensitive M3U parsing for improved compatibility
- Better EPG processing with improved channel matching
- Replaced Mantine React Table with custom implementations
- Improved tooltips and parameter wrapping for cleaner interfaces
- Better badge colors and status indicators
- Stronger form validation and user feedback
- Streamlined settings management using JSON configs
- Default value population for clean installs
- Environment-specific configuration support for multiple deployment scenarios
### Fixed
- FFmpeg process cleanup - Ensures FFmpeg fully exits before marking connection closed
- Resolved stream profile update issues in statistics display
- Fixed M3U profile ID behavior when switching streams
- Corrected stream switching logic - Redis is only updated on successful switches
- Fixed connection counting - Excludes the current profile from available connection counts
- Fixed custom stream channel creation when no group is assigned (#122)
- Resolved EPG auto-matching deadlock when many channels match simultaneously - Thanks [@xham3](https://github.com/xham3)
## [0.5.2] - 2025-06-03
### Added
- Direct Logo Support: Added ability to bypass logo caching by adding `?cachedlogos=false` to the end of M3U and EPG URLs (#109)
### Changed
- Dynamic Resource Management: Auto-scales Celery workers based on demand, reducing overall memory and CPU usage while still allowing high-demand tasks to complete quickly (#111)
- Enhanced Logging:
  - Improved logging for M3U processing
  - Better error output from XML parser for easier troubleshooting
### Fixed
- XMLTV Parsing: Added `remove_blank_text=True` to lxml parser to prevent crashes with poorly formatted XMLTV files (#115)
- Stats Display: Refactored channel info retrieval for safer decoding and improved error logging, fixing intermittent issues with statistics not displaying properly
## [0.5.1] - 2025-05-28
### Added
- Support for ZIP-compressed EPG files
- Automatic extraction of compressed files after downloading
- Intelligent file type detection for EPG sources:
  - Reads the first bits of files to determine file type
  - If a compressed file is detected, it peeks inside to find XML files
- Random descriptions for dummy channels in the TV guide
- Support for decimal channel numbers (converted from integer to float) - Thanks [@MooseyOnTheLoosey](https://github.com/MooseyOnTheLoosey)
- Show channels without EPG data in TV Guide
- Profile name added to HDHR-friendly name and device ID (allows adding multiple HDHR profiles to Plex)
### Changed
- About 30% faster EPG processing
- Significantly improved memory usage for large EPG files
- Improved timezone handling
- Cleaned up cached files when deleting EPG sources
- Performance improvements when processing extremely large M3U files
- Improved batch processing with better cleanup
- Enhanced WebSocket update handling for large operations
- Redis configured for better performance (no longer saves to disk)
- Improved memory management for Celery tasks
- Separated beat schedules with a file scanning interval set to 20 seconds
- Improved authentication error handling with user redirection to the login page
- Improved channel card formatting for different screen resolutions (can now actually read the channel stats card on mobile)
- Decreased line height for status messages in the EPG and M3U tables for better appearance on smaller screens
- Updated the EPG form to match the M3U form for consistency
### Fixed
- Profile selection issues that previously caused WebUI crashes
- Issue with `tvc-guide-id` (Gracenote ID) in bulk channel creation
- Bug when uploading an M3U with the default user-agent set
- Bug where multiple channel initializations could occur, causing zombie streams and performance issues (choppy streams)
- Better error handling for buffer overflow issues
- Fixed various memory leaks
- Bug in the TV Guide that would crash the web UI when selecting a profile to filter by
- Multiple minor bug fixes and code cleanup
## [0.5.0] - 2025-05-15
### Added
- **XtreamCodes Support:**
  - Initial XtreamCodes client support
  - Option to add EPG source with XC account
  - Improved XC login and authentication
  - Improved error handling for XC connections
- **Hardware Acceleration:**
  - Detection of hardware acceleration capabilities with recommendations (available in logs after startup)
  - Improved support for NVIDIA, Intel (QSV), and VAAPI acceleration methods
  - Added necessary drivers and libraries for hardware acceleration
  - Automatically assigns required permissions for hardware acceleration
  - Thanks to [@BXWeb](https://github.com/BXWeb), @chris.r3x, [@rykr](https://github.com/rykr), @j3111, [@jesmannstl](https://github.com/jesmannstl), @jimmycarbone, [@gordlaben](https://github.com/gordlaben), [@roofussummers](https://github.com/roofussummers), [@slamanna212](https://github.com/slamanna212)
- **M3U and EPG Management:**
  - Enhanced M3U profile creation with live regex results
  - Added stale stream detection with configurable thresholds
  - Improved status messaging for M3U and EPG operations:
    - Shows download speed with estimated time remaining
    - Shows parsing time remaining
    - Added "Pending Setup" status for M3Us requiring group selection
  - Improved handling of M3U group filtering
- **UI Improvements:**
  - Added configurable table sizes
  - Enhanced video player with loading and error states
  - Improved WebSocket connection handling with authentication
  - Added confirmation dialogs for critical operations
  - Auto-assign numbers now configurable by selection
  - Added bulk editing of channel profile membership (select multiple channels, then click the profile toggle on any selected channel to apply the change to all)
- **Infrastructure & Performance:**
  - Standardized and improved the logging system
  - New environment variable to set logging level: `DISPATCHARR_LOG_LEVEL` (default: `INFO`, available: `TRACE`, `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`)
  - Introduced a new base image build process: updates are now significantly smaller (typically under 15MB unless the base image changes)
  - Improved environment variable handling in container
  - Support for Gracenote ID (`tvc-guide-stationid`) - Thanks [@rykr](https://github.com/rykr)
  - Improved file upload handling with size limits removed
### Fixed
- Issues with profiles not loading correctly
- Problems with stream previews in tables
- Channel creation and editing workflows
- Logo display issues
- WebSocket connection problems
- Multiple React-related errors and warnings
- Pagination and filtering issues in tables
## [0.4.1] - 2025-05-01
### Changed
- Optimized uWSGI configuration settings for better server performance
- Improved asynchronous processing by converting additional timers to gevent
- Enhanced EPG (Electronic Program Guide) downloading with proper user agent headers
### Fixed
- "Add streams to channel" functionality now correctly follows disabled state logic
## [0.4.0] - 2025-05-01
### Added
- URL copy buttons for stream and channel URLs
- Manual stream switching ability
- EPG auto-match notifications - Users now receive feedback about how many matches were found
- Informative tooltips throughout the interface, including stream profiles and user-agent details
- Display of connected time for each client
- Current M3U profile information to stats
- Better logging for which channel clients are getting chunks from
### Changed
- Table System Rewrite: Completely refactored channel and stream tables for dramatically improved performance with large datasets
- Improved Concurrency: Replaced time.sleep with gevent.sleep for better performance when handling multiple streams
- Improved table interactions:
- Restored alternating row colors and hover effects
- Added shift-click support for multiple row selection
- Preserved drag-and-drop functionality
- Adjusted logo display to prevent layout shifts with different sized logos
- Improved sticky headers in tables
- Fixed spacing and padding in EPG and M3U tables for better readability on smaller displays
- Stream URL handling improved for search/replace patterns
- Enhanced stream lock management for better reliability
- Added stream name to channel status for better visibility
- Properly track current stream ID during stream switches
- Improved EPG cache handling and cleanup of old cache files
- Corrected content type for M3U file (using m3u instead of m3u8)
- Fixed logo URL handling in M3U generation
- Enhanced tuner count calculation to include only active M3U accounts
- Increased thread stack size in uwsgi configuration
- Changed proxy to use uwsgi socket
- Added build timestamp to version information
- Reduced excessive logging during M3U/EPG file importing
- Improved store variable handling to increase application efficiency
- Frontend now being built by Yarn instead of NPM
### Fixed
- Issues with channel statistics randomly not working
- Stream ordering in channel selection
- M3U profile name added to stream names for better identification
- Channel form not updating some properties after saving
- Issue with setting logos to default
- Channel creation from streams
- Channel group saving
- Improved error handling throughout the application
- Bugs in deleting stream profiles
- Resolved mimetype detection issues
- Fixed form display issues
- Added proper requerying after form submissions and item deletions
- Bug overwriting tvg-id when loading TV Guide
- Bug that prevented large M3Us and EPGs from uploading
- Typo in Stream Profile header column for Description - Thanks [@LoudSoftware](https://github.com/LoudSoftware)
- Typo in m3u input processing (tv-chno instead of tvg-chno) - Thanks @www2a
## [0.3.3] - 2025-04-18
### Fixed
- Issue with dummy EPG calculating hours above 24, ensuring time values remain within valid 24-hour format
- Auto import functionality to properly process old files that hadn't been imported yet, rather than ignoring them
## [0.3.2] - 2025-04-16
### Fixed
- Issue with stream ordering for channels - resolved problem where stream objects were incorrectly processed when assigning order in channel configurations
## [0.3.1] - 2025-04-16
### Added
- `key` prop on sidebar navigation links to resolve DOM errors when loading web UI
- Channels set to the "dummy" EPG now shown in the TV Guide
### Fixed
- Issue preventing dummy EPG from being set
- Channel numbers not saving properly
- EPGs not refreshing when linking EPG to channel
- Improved error messages in notifications
## [0.3.0] - 2025-04-15
### Added
- URL validation for redirect profile:
- Validates stream URLs before redirecting clients
- Prevents clients from being redirected to unavailable streams
- Now tries alternate streams when primary stream validation fails
- Dynamic tuner configuration for HDHomeRun devices:
- TunerCount is now dynamically created based on profile max connections
- Sets minimum of 2 tuners, up to 10 for unlimited profiles
### Changed
- More robust stream switching:
- Clients now wait properly if a stream is in the switching state
- Improved reliability during stream transitions
- Performance enhancements:
- Increased workers and threads for uwsgi for better concurrency
### Fixed
- Issue with multiple dead streams in a row - System now properly handles cases where several sequential streams are unavailable
- Broken links to compose files in documentation
## [0.2.1] - 2025-04-13
### Fixed
- Stream preview (not channel)
- Streaming wouldn't work when using default user-agent for an M3U
- WebSockets and M3U profile form issues
## [0.2.0] - 2025-04-12
Initial beta public release.

View file

@@ -20,30 +20,88 @@ class TokenObtainPairView(TokenObtainPairView):
def post(self, request, *args, **kwargs):
# Custom logic here
if not network_access_allowed(request, "UI"):
# Log blocked login attempt due to network restrictions
from core.utils import log_system_event
username = request.data.get("username", 'unknown')
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='login_failed',
user=username,
client_ip=client_ip,
user_agent=user_agent,
reason='Network access denied',
)
return Response({"error": "Forbidden"}, status=status.HTTP_403_FORBIDDEN)
# Get the response from the parent class first
response = super().post(request, *args, **kwargs)
username = request.data.get("username")
# If login was successful, update last_login
if response.status_code == 200:
username = request.data.get("username")
if username:
from django.utils import timezone
try:
user = User.objects.get(username=username)
user.last_login = timezone.now()
user.save(update_fields=['last_login'])
except User.DoesNotExist:
pass # User doesn't exist, but login somehow succeeded
# Log login attempt
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
return response
try:
response = super().post(request, *args, **kwargs)
# If login was successful, update last_login and log success
if response.status_code == 200:
if username:
from django.utils import timezone
try:
user = User.objects.get(username=username)
user.last_login = timezone.now()
user.save(update_fields=['last_login'])
# Log successful login
log_system_event(
event_type='login_success',
user=username,
client_ip=client_ip,
user_agent=user_agent,
)
except User.DoesNotExist:
pass # User doesn't exist, but login somehow succeeded
else:
# Log failed login attempt
log_system_event(
event_type='login_failed',
user=username or 'unknown',
client_ip=client_ip,
user_agent=user_agent,
reason='Invalid credentials',
)
return response
except Exception as e:
# If parent class raises an exception (e.g., validation error), log failed attempt
log_system_event(
event_type='login_failed',
user=username or 'unknown',
client_ip=client_ip,
user_agent=user_agent,
reason=f'Authentication error: {str(e)[:100]}',
)
raise # Re-raise the exception to maintain normal error flow
class TokenRefreshView(TokenRefreshView):
def post(self, request, *args, **kwargs):
# Custom logic here
if not network_access_allowed(request, "UI"):
# Log blocked token refresh attempt due to network restrictions
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='login_failed',
user='token_refresh',
client_ip=client_ip,
user_agent=user_agent,
reason='Network access denied (token refresh)',
)
return Response({"error": "Unauthorized"}, status=status.HTTP_403_FORBIDDEN)
return super().post(request, *args, **kwargs)
@@ -80,6 +138,15 @@ def initialize_superuser(request):
class AuthViewSet(viewsets.ViewSet):
"""Handles user login and logout"""
def get_permissions(self):
"""
Login doesn't require auth, but logout does
"""
if self.action == 'logout':
from rest_framework.permissions import IsAuthenticated
return [IsAuthenticated()]
return []
@swagger_auto_schema(
operation_description="Authenticate and log in a user",
request_body=openapi.Schema(
@@ -100,6 +167,11 @@ class AuthViewSet(viewsets.ViewSet):
password = request.data.get("password")
user = authenticate(request, username=username, password=password)
# Get client info for logging
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
if user:
login(request, user)
# Update last_login timestamp
@@ -107,6 +179,14 @@ class AuthViewSet(viewsets.ViewSet):
user.last_login = timezone.now()
user.save(update_fields=['last_login'])
# Log successful login
log_system_event(
event_type='login_success',
user=username,
client_ip=client_ip,
user_agent=user_agent,
)
return Response(
{
"message": "Login successful",
@@ -118,6 +198,15 @@ class AuthViewSet(viewsets.ViewSet):
},
}
)
# Log failed login attempt
log_system_event(
event_type='login_failed',
user=username or 'unknown',
client_ip=client_ip,
user_agent=user_agent,
reason='Invalid credentials',
)
return Response({"error": "Invalid credentials"}, status=400)
@swagger_auto_schema(
@@ -126,6 +215,19 @@ class AuthViewSet(viewsets.ViewSet):
)
def logout(self, request):
"""Logs out the authenticated user"""
# Log logout event before actually logging out
from core.utils import log_system_event
username = request.user.username if request.user and request.user.is_authenticated else 'unknown'
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='logout',
user=username,
client_ip=client_ip,
user_agent=user_agent,
)
logout(request)
return Response({"message": "Logout successful"})

View file

@@ -294,8 +294,17 @@ class ChannelSerializer(serializers.ModelSerializer):
if include_streams:
self.fields["streams"] = serializers.SerializerMethodField()
return super().to_representation(instance)
return super().to_representation(instance)
else:
# Fix: For PATCH/PUT responses, ensure streams are ordered
representation = super().to_representation(instance)
if "streams" in representation:
representation["streams"] = list(
instance.streams.all()
.order_by("channelstream__order")
.values_list("id", flat=True)
)
return representation
def get_logo(self, obj):
return LogoSerializer(obj.logo).data
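The ordering fix above re-reads the stream IDs with an explicit `order_by` because an M2M traversal without one returns rows in unspecified order. A sketch of the relationship it assumes (field names inferred from `order_by("channelstream__order")`; the through model itself is not shown in this diff):

```python
# Assumed through model for the Channel<->Stream M2M, inferred from
# order_by("channelstream__order") in the serializer fix above.
from django.db import models

class ChannelStream(models.Model):
    channel = models.ForeignKey("Channel", on_delete=models.CASCADE)
    stream = models.ForeignKey("Stream", on_delete=models.CASCADE)
    order = models.PositiveIntegerField(default=0)  # user-defined stream priority
```

Ordering declared on a through model is not applied automatically when traversing the M2M manager, which is why the serializer must order explicitly in PATCH/PUT responses.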

View file

@@ -1434,6 +1434,18 @@ def run_recording(recording_id, channel_id, start_time_str, end_time_str):
logger.info(f"Starting recording for channel {channel.name}")
# Log system event for recording start
try:
from core.utils import log_system_event
log_system_event(
'recording_start',
channel_id=channel.uuid,
channel_name=channel.name,
recording_id=recording_id
)
except Exception as e:
logger.error(f"Could not log recording start event: {e}")
# Try to resolve the Recording row up front
recording_obj = None
try:
@@ -1827,6 +1839,20 @@ def run_recording(recording_id, channel_id, start_time_str, end_time_str):
# After the loop, the file and response are closed automatically.
logger.info(f"Finished recording for channel {channel.name}")
# Log system event for recording end
try:
from core.utils import log_system_event
log_system_event(
'recording_end',
channel_id=channel.uuid,
channel_name=channel.name,
recording_id=recording_id,
interrupted=interrupted,
bytes_written=bytes_written
)
except Exception as e:
logger.error(f"Could not log recording end event: {e}")
# Remux TS to MKV container
remux_success = False
try:

View file

@@ -24,7 +24,7 @@ from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from .models import EPGSource, EPGData, ProgramData
from core.utils import acquire_task_lock, release_task_lock, send_websocket_update, cleanup_memory
from core.utils import acquire_task_lock, release_task_lock, send_websocket_update, cleanup_memory, log_system_event
logger = logging.getLogger(__name__)
@@ -1496,6 +1496,15 @@ def parse_programs_for_source(epg_source, tvg_id=None):
epg_source.updated_at = timezone.now()
epg_source.save(update_fields=['status', 'last_message', 'updated_at'])
# Log system event for EPG refresh
log_system_event(
event_type='epg_refresh',
source_name=epg_source.name,
programs=program_count,
channels=channel_count,
updated=updated_count,
)
# Send completion notification with status
send_epg_update(epg_source.id, "parsing_programs", 100,
status="success",

View file

@@ -152,6 +152,46 @@ class M3UAccountViewSet(viewsets.ModelViewSet):
and not old_vod_enabled
and new_vod_enabled
):
# Create Uncategorized categories immediately so they're available in the UI
from apps.vod.models import VODCategory, M3UVODCategoryRelation
# Create movie Uncategorized category
movie_category, _ = VODCategory.objects.get_or_create(
name="Uncategorized",
category_type="movie",
defaults={}
)
# Create series Uncategorized category
series_category, _ = VODCategory.objects.get_or_create(
name="Uncategorized",
category_type="series",
defaults={}
)
# Create relations for both categories (disabled by default until first refresh)
account_custom_props = instance.custom_properties or {}
auto_enable_new = account_custom_props.get("auto_enable_new_groups_vod", True)
M3UVODCategoryRelation.objects.get_or_create(
category=movie_category,
m3u_account=instance,
defaults={
'enabled': auto_enable_new,
'custom_properties': {}
}
)
M3UVODCategoryRelation.objects.get_or_create(
category=series_category,
m3u_account=instance,
defaults={
'enabled': auto_enable_new,
'custom_properties': {}
}
)
# Trigger full VOD refresh
from apps.vod.tasks import refresh_vod_content
refresh_vod_content.delay(instance.id)
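`refresh_vod_content.delay(...)` queues the refresh on a Celery worker rather than running it inside the request. The task body lives in `apps/vod/tasks.py` and is not shown here; its assumed shape:

```python
# Assumed shape of the queued task -- apps/vod/tasks.py is not part of this diff.
from celery import shared_task

@shared_task
def refresh_vod_content(account_id):
    # Rescan the account's VOD catalog; items without a provider category
    # can then fall back to the "Uncategorized" relations created above.
    ...
```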

View file

@@ -24,6 +24,7 @@ from core.utils import (
acquire_task_lock,
release_task_lock,
natural_sort_key,
log_system_event,
)
from core.models import CoreSettings, UserAgent
from asgiref.sync import async_to_sync
@@ -2840,6 +2841,17 @@ def refresh_single_m3u_account(account_id):
account.updated_at = timezone.now()
account.save(update_fields=["status", "last_message", "updated_at"])
# Log system event for M3U refresh
log_system_event(
event_type='m3u_refresh',
account_name=account.name,
elapsed_time=round(elapsed_time, 2),
streams_created=streams_created,
streams_updated=streams_updated,
streams_deleted=streams_deleted,
total_processed=streams_processed,
)
# Send final update with complete metrics and explicitly include success status
send_m3u_update(
account_id,

View file

@@ -23,23 +23,86 @@ from django.db.models.functions import Lower
import os
from apps.m3u.utils import calculate_tuner_count
import regex
from core.utils import log_system_event
import hashlib
logger = logging.getLogger(__name__)
def get_client_identifier(request):
"""Get client information including IP, user agent, and a unique hash identifier
Returns:
tuple: (client_id_hash, client_ip, user_agent)
"""
# Get client IP (handle proxies)
x_forwarded_for = request.META.get('HTTP_X_FORWARDED_FOR')
if x_forwarded_for:
client_ip = x_forwarded_for.split(',')[0].strip()
else:
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
# Get user agent
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
# Create a hash for a shorter cache key
client_str = f"{client_ip}:{user_agent}"
client_id_hash = hashlib.md5(client_str.encode()).hexdigest()[:12]
return client_id_hash, client_ip, user_agent
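# Usage note (illustrative, not part of the diff): the 12-character hash gives a short,
# cache-safe per-client key, e.g.
#   client_id, client_ip, user_agent = get_client_identifier(request)
#   cache.set(f"m3u_client:{client_id}", ..., timeout=60)  # hypothetical key and timeout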
def m3u_endpoint(request, profile_name=None, user=None):
logger.debug("m3u_endpoint called: method=%s, profile=%s", request.method, profile_name)
if not network_access_allowed(request, "M3U_EPG"):
# Log blocked M3U download
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='m3u_blocked',
profile=profile_name or 'all',
reason='Network access denied',
client_ip=client_ip,
user_agent=user_agent,
)
return JsonResponse({"error": "Forbidden"}, status=403)
# Handle HEAD requests efficiently without generating content
if request.method == "HEAD":
logger.debug("Handling HEAD request for M3U")
response = HttpResponse(content_type="audio/x-mpegurl")
response["Content-Disposition"] = 'attachment; filename="channels.m3u"'
return response
return generate_m3u(request, profile_name, user)
def epg_endpoint(request, profile_name=None, user=None):
logger.debug("epg_endpoint called: method=%s, profile=%s", request.method, profile_name)
if not network_access_allowed(request, "M3U_EPG"):
# Log blocked EPG download
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='epg_blocked',
profile=profile_name or 'all',
reason='Network access denied',
client_ip=client_ip,
user_agent=user_agent,
)
return JsonResponse({"error": "Forbidden"}, status=403)
# Handle HEAD requests efficiently without generating content
if request.method == "HEAD":
logger.debug("Handling HEAD request for EPG")
response = HttpResponse(content_type="application/xml")
response["Content-Disposition"] = 'attachment; filename="Dispatcharr.xml"'
response["Cache-Control"] = "no-cache"
return response
return generate_epg(request, profile_name, user)
@csrf_exempt
@require_http_methods(["GET", "POST"])
@require_http_methods(["GET", "POST", "HEAD"])
def generate_m3u(request, profile_name=None, user=None):
"""
Dynamically generate an M3U file from channels.
@ -47,7 +110,19 @@ def generate_m3u(request, profile_name=None, user=None):
Supports both GET and POST methods for compatibility with IPTVSmarters.
"""
logger.debug("Generating M3U for profile: %s, user: %s, method: %s", profile_name, user.username if user else "Anonymous", request.method)
# Check cache for recent identical request (helps with double-GET from browsers)
from django.core.cache import cache
cache_params = f"{profile_name or 'all'}:{user.username if user else 'anonymous'}:{request.GET.urlencode()}"
content_cache_key = f"m3u_content:{cache_params}"
cached_content = cache.get(content_cache_key)
if cached_content:
logger.debug("Serving M3U from cache")
response = HttpResponse(cached_content, content_type="audio/x-mpegurl")
response["Content-Disposition"] = 'attachment; filename="channels.m3u"'
return response
# Check if this is a POST request with data (which we don't want to allow)
if request.method == "POST" and request.body:
if request.body.decode() != '{}':
@ -76,14 +151,22 @@ def generate_m3u(request, profile_name=None, user=None):
else:
if profile_name is not None:
try:
channel_profile = ChannelProfile.objects.get(name=profile_name)
except ChannelProfile.DoesNotExist:
logger.warning("Requested channel profile (%s) during m3u generation does not exist", profile_name)
raise Http404(f"Channel profile '{profile_name}' not found")
channels = Channel.objects.filter(
channelprofilemembership__channel_profile=channel_profile,
channelprofilemembership__enabled=True
).order_by('channel_number')
else:
if profile_name is not None:
try:
channel_profile = ChannelProfile.objects.get(name=profile_name)
except ChannelProfile.DoesNotExist:
logger.warning("Requested channel profile (%s) during m3u generation does not exist", profile_name)
raise Http404(f"Channel profile '{profile_name}' not found")
channels = Channel.objects.filter(
channelprofilemembership__channel_profile=channel_profile,
channelprofilemembership__enabled=True,
@ -184,6 +267,23 @@ def generate_m3u(request, profile_name=None, user=None):
m3u_content += extinf_line + stream_url + "\n"
# Cache the generated content for 2 seconds to handle double-GET requests
cache.set(content_cache_key, m3u_content, 2)
# Log system event for M3U download (with deduplication based on client)
client_id, client_ip, user_agent = get_client_identifier(request)
event_cache_key = f"m3u_download:{user.username if user else 'anonymous'}:{profile_name or 'all'}:{client_id}"
if not cache.get(event_cache_key):
log_system_event(
event_type='m3u_download',
profile=profile_name or 'all',
user=user.username if user else 'anonymous',
channels=channels.count(),
client_ip=client_ip,
user_agent=user_agent,
)
cache.set(event_cache_key, True, 2) # Prevent duplicate events for 2 seconds
response = HttpResponse(m3u_content, content_type="audio/x-mpegurl")
response["Content-Disposition"] = 'attachment; filename="channels.m3u"'
return response
@ -564,28 +664,39 @@ def generate_custom_dummy_programs(channel_id, channel_name, now, num_days, cust
try:
# Support various date group names: month, day, year
month_str = date_groups.get('month', '')
day_str = date_groups.get('day', '')
year_str = date_groups.get('year', '')
# Parse day - default to current day if empty or invalid
day = int(day_str) if day_str else now.day
# Parse year - default to current year if empty or invalid (matches frontend behavior)
year = int(year_str) if year_str else now.year
# Parse month - can be numeric (1-12) or text (Jan, January, etc.)
month = None
if month_str:
if month_str.isdigit():
month = int(month_str)
else:
# Try to parse text month names
import calendar
month_str_lower = month_str.lower()
# Check full month names
for i, month_name in enumerate(calendar.month_name):
if month_name.lower() == month_str_lower:
month = i
break
# Check abbreviated month names if not found
if month is None:
for i, month_abbr in enumerate(calendar.month_abbr):
if month_abbr.lower() == month_str_lower:
month = i
break
# Default to current month if not extracted or invalid
if month is None:
month = now.month
if month and 1 <= month <= 12 and 1 <= day <= 31:
date_info = {'year': year, 'month': month, 'day': day}
@ -1126,8 +1237,22 @@ def generate_epg(request, profile_name=None, user=None):
by their associated EPGData record.
This version filters data based on the 'days' parameter and sends keep-alives during processing.
"""
# Check cache for recent identical request (helps with double-GET from browsers)
from django.core.cache import cache
cache_params = f"{profile_name or 'all'}:{user.username if user else 'anonymous'}:{request.GET.urlencode()}"
content_cache_key = f"epg_content:{cache_params}"
cached_content = cache.get(content_cache_key)
if cached_content:
logger.debug("Serving EPG from cache")
response = HttpResponse(cached_content, content_type="application/xml")
response["Content-Disposition"] = 'attachment; filename="Dispatcharr.xml"'
response["Cache-Control"] = "no-cache"
return response
def epg_generator():
"""Generator function that yields EPG data with keep-alives during processing""" # Send initial HTTP headers as comments (these will be ignored by XML parsers but keep connection alive)
"""Generator function that yields EPG data with keep-alives during processing"""
# Send initial HTTP headers as comments (these will be ignored by XML parsers but keep connection alive)
xml_lines = []
xml_lines.append('<?xml version="1.0" encoding="UTF-8"?>')
@ -1158,7 +1283,11 @@ def generate_epg(request, profile_name=None, user=None):
)
else:
if profile_name is not None:
try:
channel_profile = ChannelProfile.objects.get(name=profile_name)
except ChannelProfile.DoesNotExist:
logger.warning("Requested channel profile (%s) during epg generation does not exist", profile_name)
raise Http404(f"Channel profile '{profile_name}' not found")
channels = Channel.objects.filter(
channelprofilemembership__channel_profile=channel_profile,
channelprofilemembership__enabled=True,
@ -1190,16 +1319,45 @@ def generate_epg(request, profile_name=None, user=None):
now = django_timezone.now()
cutoff_date = now + timedelta(days=num_days) if num_days > 0 else None
# Build collision-free channel number mapping for XC clients (if user is authenticated)
# XC clients require integer channel numbers, so we need to ensure no conflicts
channel_num_map = {}
if user is not None:
# This is an XC client - build collision-free mapping
used_numbers = set()
# First pass: assign integers for channels that already have integer numbers
for channel in channels:
if channel.channel_number == int(channel.channel_number):
num = int(channel.channel_number)
channel_num_map[channel.id] = num
used_numbers.add(num)
# Second pass: assign integers for channels with float numbers
for channel in channels:
if channel.channel_number != int(channel.channel_number):
candidate = int(channel.channel_number)
while candidate in used_numbers:
candidate += 1
channel_num_map[channel.id] = candidate
used_numbers.add(candidate)
# Process channels for the <channel> section
for channel in channels:
# For XC clients (user is not None), use collision-free integer mapping
# For regular clients (user is None), use original formatting logic
if user is not None:
# XC client - use collision-free integer
formatted_channel_number = channel_num_map[channel.id]
else:
formatted_channel_number = ""
# Regular client - format channel number as integer if it has no decimal component
if channel.channel_number is not None:
if channel.channel_number == int(channel.channel_number):
formatted_channel_number = int(channel.channel_number)
else:
formatted_channel_number = channel.channel_number
else:
formatted_channel_number = ""
# Determine the channel ID based on the selected source
if tvg_id_source == 'tvg_id' and channel.tvg_id:
@ -1286,7 +1444,8 @@ def generate_epg(request, profile_name=None, user=None):
xml_lines.append(" </channel>")
# Send all channel definitions
channel_xml = '\n'.join(xml_lines) + '\n'
yield channel_xml
xml_lines = [] # Clear to save memory
# Process programs for each channel
@ -1298,14 +1457,20 @@ def generate_epg(request, profile_name=None, user=None):
elif tvg_id_source == 'gracenote' and channel.tvc_guide_stationid:
channel_id = channel.tvc_guide_stationid
else:
# For XC clients (user is not None), use collision-free integer mapping
# For regular clients (user is None), use original formatting logic
if user is not None:
# XC client - use collision-free integer from map
formatted_channel_number = channel_num_map[channel.id]
else:
formatted_channel_number = ""
# Regular client - format channel number as before
if channel.channel_number is not None:
if channel.channel_number == int(channel.channel_number):
formatted_channel_number = int(channel.channel_number)
else:
formatted_channel_number = channel.channel_number
else:
formatted_channel_number = ""
# Default to channel number
channel_id = str(formatted_channel_number) if formatted_channel_number != "" else str(channel.id)
@ -1676,7 +1841,8 @@ def generate_epg(request, profile_name=None, user=None):
# Send batch when full or send keep-alive
if len(program_batch) >= batch_size:
batch_xml = '\n'.join(program_batch) + '\n'
yield batch_xml
program_batch = []
# Move to next chunk
@ -1684,12 +1850,40 @@ def generate_epg(request, profile_name=None, user=None):
# Send remaining programs in batch
if program_batch:
batch_xml = '\n'.join(program_batch) + '\n'
yield batch_xml
# Send final closing tag and completion message
yield "</tv>\n" # Return streaming response
yield "</tv>\n"
# Log system event for EPG download after streaming completes (with deduplication based on client)
client_id, client_ip, user_agent = get_client_identifier(request)
event_cache_key = f"epg_download:{user.username if user else 'anonymous'}:{profile_name or 'all'}:{client_id}"
if not cache.get(event_cache_key):
log_system_event(
event_type='epg_download',
profile=profile_name or 'all',
user=user.username if user else 'anonymous',
channels=channels.count(),
client_ip=client_ip,
user_agent=user_agent,
)
cache.set(event_cache_key, True, 2) # Prevent duplicate events for 2 seconds
# Wrapper generator that collects content for caching
def caching_generator():
collected_content = []
for chunk in epg_generator():
collected_content.append(chunk)
yield chunk
# After streaming completes, cache the full content
full_content = ''.join(collected_content)
cache.set(content_cache_key, full_content, 300)
logger.debug("Cached EPG content (%d bytes)", len(full_content))
# Return streaming response
response = StreamingHttpResponse(
streaming_content=caching_generator(),
content_type="application/xml"
)
response["Content-Disposition"] = 'attachment; filename="Dispatcharr.xml"'
@ -1777,45 +1971,31 @@ def xc_player_api(request, full=False):
if user is None:
return JsonResponse({'error': 'Unauthorized'}, status=401)
if action == "get_live_categories":
return JsonResponse(xc_get_live_categories(user), safe=False)
if action == "get_live_streams":
elif action == "get_live_streams":
return JsonResponse(xc_get_live_streams(request, user, request.GET.get("category_id")), safe=False)
if action == "get_short_epg":
elif action == "get_short_epg":
return JsonResponse(xc_get_epg(request, user, short=True), safe=False)
if action == "get_simple_data_table":
elif action == "get_simple_data_table":
return JsonResponse(xc_get_epg(request, user, short=False), safe=False)
elif action == "get_vod_categories":
return JsonResponse(xc_get_vod_categories(user), safe=False)
elif action == "get_vod_streams":
return JsonResponse(xc_get_vod_streams(request, user, request.GET.get("category_id")), safe=False)
elif action == "get_series_categories":
return JsonResponse(xc_get_series_categories(user), safe=False)
elif action == "get_series":
return JsonResponse(xc_get_series(request, user, request.GET.get("category_id")), safe=False)
elif action == "get_series_info":
return JsonResponse(xc_get_series_info(request, user, request.GET.get("series_id")), safe=False)
elif action == "get_vod_info":
return JsonResponse(xc_get_vod_info(request, user, request.GET.get("vod_id")), safe=False)
else:
# For any other action (including get_account_info or unknown actions),
# return server_info/account_info to match provider behavior
server_info = xc_get_info(request)
return JsonResponse(server_info, safe=False)
def xc_panel_api(request):
@ -1832,12 +2012,34 @@ def xc_panel_api(request):
def xc_get(request):
if not network_access_allowed(request, 'XC_API'):
# Log blocked M3U download
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='m3u_blocked',
user=request.GET.get('username', 'unknown'),
reason='Network access denied (XC API)',
client_ip=client_ip,
user_agent=user_agent,
)
return JsonResponse({'error': 'Forbidden'}, status=403)
action = request.GET.get("action")
user = xc_get_user(request)
if user is None:
# Log blocked M3U download due to invalid credentials
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='m3u_blocked',
user=request.GET.get('username', 'unknown'),
reason='Invalid XC credentials',
client_ip=client_ip,
user_agent=user_agent,
)
return JsonResponse({'error': 'Unauthorized'}, status=401)
return generate_m3u(request, None, user)
@ -1845,11 +2047,33 @@ def xc_get(request):
def xc_xmltv(request):
if not network_access_allowed(request, 'XC_API'):
# Log blocked EPG download
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='epg_blocked',
user=request.GET.get('username', 'unknown'),
reason='Network access denied (XC API)',
client_ip=client_ip,
user_agent=user_agent,
)
return JsonResponse({'error': 'Forbidden'}, status=403)
user = xc_get_user(request)
if user is None:
# Log blocked EPG download due to invalid credentials
from core.utils import log_system_event
client_ip = request.META.get('REMOTE_ADDR', 'unknown')
user_agent = request.META.get('HTTP_USER_AGENT', 'unknown')
log_system_event(
event_type='epg_blocked',
user=request.GET.get('username', 'unknown'),
reason='Invalid XC credentials',
client_ip=client_ip,
user_agent=user_agent,
)
return JsonResponse({'error': 'Unauthorized'}, status=401)
return generate_epg(request, None, user)
@ -1924,10 +2148,38 @@ def xc_get_live_streams(request, user, category_id=None):
channel_group__id=category_id, user_level__lte=user.user_level
).order_by("channel_number")
# Build collision-free mapping for XC clients (which require integers)
# This ensures channels with float numbers don't conflict with existing integers
channel_num_map = {} # Maps channel.id -> integer channel number for XC
used_numbers = set() # Track all assigned integer channel numbers
# First pass: assign integers for channels that already have integer numbers
for channel in channels:
if channel.channel_number == int(channel.channel_number):
# Already an integer, use it directly
num = int(channel.channel_number)
channel_num_map[channel.id] = num
used_numbers.add(num)
# Second pass: assign integers for channels with float numbers
# Find next available number to avoid collisions
for channel in channels:
if channel.channel_number != int(channel.channel_number):
# Has decimal component, need to find available integer
# Start from truncated value and increment until we find an unused number
candidate = int(channel.channel_number)
while candidate in used_numbers:
candidate += 1
channel_num_map[channel.id] = candidate
used_numbers.add(candidate)
# Build the streams list with the collision-free channel numbers
for channel in channels:
channel_num_int = channel_num_map[channel.id]
streams.append(
{
"num": int(channel.channel_number) if channel.channel_number.is_integer() else channel.channel_number,
"num": channel_num_int,
"name": channel.name,
"stream_type": "live",
"stream_id": channel.id,
@ -1939,7 +2191,7 @@ def xc_get_live_streams(request, user, category_id=None):
reverse("api:channels:logo-cache", args=[channel.logo.id])
)
),
"epg_channel_id": str(int(channel.channel_number)) if channel.channel_number.is_integer() else str(channel.channel_number),
"epg_channel_id": str(channel_num_int),
"added": int(channel.created_at.timestamp()),
"is_adult": 0,
"category_id": str(channel.channel_group.id),
@ -1988,6 +2240,35 @@ def xc_get_epg(request, user, short=False):
if not channel:
raise Http404()
# Calculate the collision-free integer channel number for this channel
# This must match the logic in xc_get_live_streams to ensure consistency
# Get all channels in the same category for collision detection
category_channels = Channel.objects.filter(
channel_group=channel.channel_group
).order_by("channel_number")
channel_num_map = {}
used_numbers = set()
# First pass: assign integers for channels that already have integer numbers
for ch in category_channels:
if ch.channel_number == int(ch.channel_number):
num = int(ch.channel_number)
channel_num_map[ch.id] = num
used_numbers.add(num)
# Second pass: assign integers for channels with float numbers
for ch in category_channels:
if ch.channel_number != int(ch.channel_number):
candidate = int(ch.channel_number)
while candidate in used_numbers:
candidate += 1
channel_num_map[ch.id] = candidate
used_numbers.add(candidate)
# Get the mapped integer for this specific channel
channel_num_int = channel_num_map.get(channel.id, int(channel.channel_number))
limit = request.GET.get('limit', 4)
if channel.epg_data:
# Check if this is a dummy EPG that generates on-demand
@ -2020,6 +2301,7 @@ def xc_get_epg(request, user, short=False):
programs = generate_dummy_programs(channel_id=channel_id, channel_name=channel.name, epg_source=None)
output = {"epg_listings": []}
for program in programs:
id = "0"
epg_id = "0"
@ -2037,7 +2319,7 @@ def xc_get_epg(request, user, short=False):
"start": start.strftime("%Y%m%d%H%M%S"),
"end": end.strftime("%Y%m%d%H%M%S"),
"description": base64.b64encode(description.encode()).decode(),
"channel_id": int(channel.channel_number) if channel.channel_number.is_integer() else channel.channel_number,
"channel_id": channel_num_int,
"start_timestamp": int(start.timestamp()),
"stop_timestamp": int(end.timestamp()),
"stream_id": f"{channel_id}",

View file

@ -34,6 +34,10 @@ class ClientManager:
self.heartbeat_interval = ConfigHelper.get('CLIENT_HEARTBEAT_INTERVAL', 10)
self.last_heartbeat_time = {}
# Get ProxyServer instance for ownership checks
from .server import ProxyServer
self.proxy_server = ProxyServer.get_instance()
# Start heartbeat thread for local clients
self._start_heartbeat_thread()
self._registered_clients = set() # Track already registered client IDs
@ -337,16 +341,30 @@ class ClientManager:
self._notify_owner_of_activity()
# Check if we're the owner - if so, handle locally; if not, publish event
am_i_owner = self.proxy_server and self.proxy_server.am_i_owner(self.channel_id)
if am_i_owner:
# We're the owner - handle the disconnect directly
logger.debug(f"Owner handling CLIENT_DISCONNECTED for client {client_id} locally (not publishing)")
if remaining == 0:
# Trigger shutdown check directly via ProxyServer method
logger.debug(f"No clients left - triggering immediate shutdown check")
# Spawn greenlet to avoid blocking
import gevent
gevent.spawn(self.proxy_server.handle_client_disconnect, self.channel_id)
else:
# We're not the owner - publish event so owner can handle it
logger.debug(f"Non-owner publishing CLIENT_DISCONNECTED event for client {client_id} on channel {self.channel_id} from worker {self.worker_id}")
event_data = json.dumps({
"event": EventType.CLIENT_DISCONNECTED,
"channel_id": self.channel_id,
"client_id": client_id,
"worker_id": self.worker_id or "unknown",
"timestamp": time.time(),
"remaining_clients": remaining
})
self.redis_client.publish(RedisKeys.events_channel(self.channel_id), event_data)
# Trigger channel stats update via WebSocket
self._trigger_stats_update()
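The branch above captures the ownership split used throughout the proxy: the owning worker acts locally, everyone else publishes for the owner to handle. Condensed into one sketch (helper names taken from this file; the payload shape mirrors the event built above):
import json
import gevent

def dispatch_disconnect(proxy_server, redis_client, channel_id, payload):
    if proxy_server.am_i_owner(channel_id):
        # Owner: act directly, in a greenlet, skipping the pub/sub round trip
        gevent.spawn(proxy_server.handle_client_disconnect, channel_id)
    else:
        # Non-owner: hand the event to the owner over Redis pub/sub
        redis_client.publish(RedisKeys.events_channel(channel_id), json.dumps(payload))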

View file

@ -19,7 +19,7 @@ import gevent # Add gevent import
from typing import Dict, Optional, Set
from apps.proxy.config import TSConfig as Config
from apps.channels.models import Channel, Stream
from core.utils import RedisClient, log_system_event
from redis.exceptions import ConnectionError, TimeoutError
from .stream_manager import StreamManager
from .stream_buffer import StreamBuffer
@ -194,35 +194,11 @@ class ProxyServer:
self.redis_client.delete(disconnect_key)
elif event_type == EventType.CLIENT_DISCONNECTED:
logger.debug(f"Owner received {EventType.CLIENT_DISCONNECTED} event for channel {channel_id}")
# Check if any clients remain
if channel_id in self.client_managers:
# VERIFY REDIS CLIENT COUNT DIRECTLY
client_set_key = RedisKeys.clients(channel_id)
total = self.redis_client.scard(client_set_key) or 0
if total == 0:
logger.debug(f"No clients left after disconnect event - stopping channel {channel_id}")
# Set the disconnect timer for other workers to see
disconnect_key = RedisKeys.last_client_disconnect(channel_id)
self.redis_client.setex(disconnect_key, 60, str(time.time()))
# Get configured shutdown delay or default
shutdown_delay = ConfigHelper.channel_shutdown_delay()
if shutdown_delay > 0:
logger.info(f"Waiting {shutdown_delay}s before stopping channel...")
gevent.sleep(shutdown_delay) # REPLACE: time.sleep(shutdown_delay)
# Re-check client count before stopping
total = self.redis_client.scard(client_set_key) or 0
if total > 0:
logger.info(f"New clients connected during shutdown delay - aborting shutdown")
self.redis_client.delete(disconnect_key)
return
# Stop the channel directly
self.stop_channel(channel_id)
client_id = data.get("client_id")
worker_id = data.get("worker_id")
logger.debug(f"Owner received {EventType.CLIENT_DISCONNECTED} event for channel {channel_id}, client {client_id} from worker {worker_id}")
# Delegate to dedicated method
self.handle_client_disconnect(channel_id)
elif event_type == EventType.STREAM_SWITCH:
@ -646,6 +622,29 @@ class ProxyServer:
logger.info(f"Created StreamManager for channel {channel_id} with stream ID {channel_stream_id}")
self.stream_managers[channel_id] = stream_manager
# Log channel start event
try:
channel_obj = Channel.objects.get(uuid=channel_id)
# Get stream name if stream_id is available
stream_name = None
if channel_stream_id:
try:
stream_obj = Stream.objects.get(id=channel_stream_id)
stream_name = stream_obj.name
except Exception:
pass
log_system_event(
'channel_start',
channel_id=channel_id,
channel_name=channel_obj.name,
stream_name=stream_name,
stream_id=channel_stream_id
)
except Exception as e:
logger.error(f"Could not log channel start event: {e}")
# Create client manager with channel_id, redis_client AND worker_id (only if not already exists)
if channel_id not in self.client_managers:
client_manager = ClientManager(
@ -800,6 +799,44 @@ class ProxyServer:
logger.error(f"Error cleaning zombie channel {channel_id}: {e}", exc_info=True)
return False
def handle_client_disconnect(self, channel_id):
"""
Handle client disconnect event - check if channel should shut down.
Can be called directly by owner or via PubSub from non-owner workers.
"""
if channel_id not in self.client_managers:
return
try:
# VERIFY REDIS CLIENT COUNT DIRECTLY
client_set_key = RedisKeys.clients(channel_id)
total = self.redis_client.scard(client_set_key) or 0
if total == 0:
logger.debug(f"No clients left after disconnect event - stopping channel {channel_id}")
# Set the disconnect timer for other workers to see
disconnect_key = RedisKeys.last_client_disconnect(channel_id)
self.redis_client.setex(disconnect_key, 60, str(time.time()))
# Get configured shutdown delay or default
shutdown_delay = ConfigHelper.channel_shutdown_delay()
if shutdown_delay > 0:
logger.info(f"Waiting {shutdown_delay}s before stopping channel...")
gevent.sleep(shutdown_delay)
# Re-check client count before stopping
total = self.redis_client.scard(client_set_key) or 0
if total > 0:
logger.info(f"New clients connected during shutdown delay - aborting shutdown")
self.redis_client.delete(disconnect_key)
return
# Stop the channel directly
self.stop_channel(channel_id)
except Exception as e:
logger.error(f"Error handling client disconnect for channel {channel_id}: {e}")
def stop_channel(self, channel_id):
"""Stop a channel with proper ownership handling"""
try:
@ -847,6 +884,41 @@ class ProxyServer:
self.release_ownership(channel_id)
logger.info(f"Released ownership of channel {channel_id}")
# Log channel stop event (after cleanup, once ownership has been released)
try:
channel_obj = Channel.objects.get(uuid=channel_id)
# Calculate runtime and get total bytes from metadata
runtime = None
total_bytes = None
if self.redis_client:
metadata_key = RedisKeys.channel_metadata(channel_id)
metadata = self.redis_client.hgetall(metadata_key)
if metadata:
# Calculate runtime from init_time
if b'init_time' in metadata:
try:
init_time = float(metadata[b'init_time'].decode('utf-8'))
runtime = round(time.time() - init_time, 2)
except Exception:
pass
# Get total bytes transferred
if b'total_bytes' in metadata:
try:
total_bytes = int(metadata[b'total_bytes'].decode('utf-8'))
except Exception:
pass
log_system_event(
'channel_stop',
channel_id=channel_id,
channel_name=channel_obj.name,
runtime=runtime,
total_bytes=total_bytes
)
except Exception as e:
logger.error(f"Could not log channel stop event: {e}")
# Always clean up local resources - WITH SAFE CHECKS
if channel_id in self.stream_managers:
del self.stream_managers[channel_id]
@ -968,6 +1040,13 @@ class ProxyServer:
# If in connecting or waiting_for_clients state, check grace period
if channel_state in [ChannelState.CONNECTING, ChannelState.WAITING_FOR_CLIENTS]:
# Check if channel is already stopping
if self.redis_client:
stop_key = RedisKeys.channel_stopping(channel_id)
if self.redis_client.exists(stop_key):
logger.debug(f"Channel {channel_id} is already stopping - skipping monitor shutdown")
continue
# Get connection_ready_time from metadata (indicates if channel reached ready state)
connection_ready_time = None
if metadata and b'connection_ready_time' in metadata:
@ -1048,6 +1127,13 @@ class ProxyServer:
logger.info(f"Channel {channel_id} activated with {total_clients} clients after grace period")
# If active and no clients, start normal shutdown procedure
elif channel_state not in [ChannelState.CONNECTING, ChannelState.WAITING_FOR_CLIENTS] and total_clients == 0:
# Check if channel is already stopping
if self.redis_client:
stop_key = RedisKeys.channel_stopping(channel_id)
if self.redis_client.exists(stop_key):
logger.debug(f"Channel {channel_id} is already stopping - skipping monitor shutdown")
continue
# Check if there's a pending no-clients timeout
disconnect_key = RedisKeys.last_client_disconnect(channel_id)
disconnect_time = None
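Both monitor paths above consult a short-lived Redis key so that only one worker runs the shutdown sequence. A sketch of that claim, using an atomic set (the 60-second TTL is an assumption based on the disconnect key above):
def try_begin_shutdown(redis_client, channel_id):
    stop_key = RedisKeys.channel_stopping(channel_id)
    # nx=True makes check-and-claim atomic across workers
    claimed = redis_client.set(stop_key, str(time.time()), nx=True, ex=60)
    return bool(claimed)  # False -> another worker is already stopping it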

View file

@ -14,6 +14,7 @@ from ..server import ProxyServer
from ..redis_keys import RedisKeys
from ..constants import EventType, ChannelState, ChannelMetadataField
from ..url_utils import get_stream_info_for_switch
from core.utils import log_system_event
logger = logging.getLogger("ts_proxy")
@ -598,7 +599,7 @@ class ChannelService:
def _update_stream_stats_in_db(stream_id, **stats):
"""Update stream stats in database"""
from django.db import connection
try:
from apps.channels.models import Stream
from django.utils import timezone
@ -624,7 +625,7 @@ class ChannelService:
except Exception as e:
logger.error(f"Error updating stream stats in database for stream {stream_id}: {e}")
return False
finally:
# Always close database connection after update
try:
@ -700,6 +701,7 @@ class ChannelService:
RedisKeys.events_channel(channel_id),
json.dumps(switch_request)
)
return True
@staticmethod

View file

@ -8,6 +8,8 @@ import logging
import threading
import gevent # Add this import at the top of your file
from apps.proxy.config import TSConfig as Config
from apps.channels.models import Channel
from core.utils import log_system_event
from .server import ProxyServer
from .utils import create_ts_packet, get_logger
from .redis_keys import RedisKeys
@ -88,6 +90,20 @@ class StreamGenerator:
if not self._setup_streaming():
return
# Log client connect event
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'client_connect',
channel_id=self.channel_id,
channel_name=channel_obj.name,
client_ip=self.client_ip,
client_id=self.client_id,
user_agent=self.client_user_agent[:100] if self.client_user_agent else None
)
except Exception as e:
logger.error(f"Could not log client connect event: {e}")
# Main streaming loop
for chunk in self._stream_data_generator():
yield chunk
@ -439,6 +455,22 @@ class StreamGenerator:
total_clients = client_manager.get_total_client_count()
logger.info(f"[{self.client_id}] Disconnected after {elapsed:.2f}s (local: {local_clients}, total: {total_clients})")
# Log client disconnect event
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'client_disconnect',
channel_id=self.channel_id,
channel_name=channel_obj.name,
client_ip=self.client_ip,
client_id=self.client_id,
user_agent=self.client_user_agent[:100] if self.client_user_agent else None,
duration=round(elapsed, 2),
bytes_sent=self.bytes_sent
)
except Exception as e:
logger.error(f"Could not log client disconnect event: {e}")
# Schedule channel shutdown if no clients left
if not stream_released: # Only if we haven't already released the stream
self._schedule_channel_shutdown_if_needed(local_clients)

View file

@ -16,6 +16,7 @@ from apps.proxy.config import TSConfig as Config
from apps.channels.models import Channel, Stream
from apps.m3u.models import M3UAccount, M3UAccountProfile
from core.models import UserAgent, CoreSettings
from core.utils import log_system_event
from .stream_buffer import StreamBuffer
from .utils import detect_stream_type, get_logger
from .redis_keys import RedisKeys
@ -260,6 +261,20 @@ class StreamManager:
# Store connection start time to measure success duration
connection_start_time = time.time()
# Log reconnection event if this is a retry (not first attempt)
if self.retry_count > 0:
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'channel_reconnect',
channel_id=self.channel_id,
channel_name=channel_obj.name,
attempt=self.retry_count + 1,
max_attempts=self.max_retries
)
except Exception as e:
logger.error(f"Could not log reconnection event: {e}")
# Successfully connected - read stream data until disconnect/error
self._process_stream_data()
# If we get here, the connection was closed/failed
@ -289,6 +304,20 @@ class StreamManager:
if self.retry_count >= self.max_retries:
url_failed = True
logger.warning(f"Maximum retry attempts ({self.max_retries}) reached for URL: {self.url} for channel: {self.channel_id}")
# Log connection error event
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'channel_error',
channel_id=self.channel_id,
channel_name=channel_obj.name,
error_type='connection_failed',
url=self.url[:100] if self.url else None,
attempts=self.max_retries
)
except Exception as e:
logger.error(f"Could not log connection error event: {e}")
else:
# Wait with linear backoff before retrying
timeout = min(.25 * self.retry_count, 3) # Cap at 3 seconds
@ -302,6 +331,21 @@ class StreamManager:
if self.retry_count >= self.max_retries:
url_failed = True
# Log connection error event with exception details
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'channel_error',
channel_id=self.channel_id,
channel_name=channel_obj.name,
error_type='connection_exception',
error_message=str(e)[:200],
url=self.url[:100] if self.url else None,
attempts=self.max_retries
)
except Exception as log_error:
logger.error(f"Could not log connection error event: {log_error}")
else:
# Wait with linear backoff before retrying
timeout = min(.25 * self.retry_count, 3) # Cap at 3 seconds
@ -702,6 +746,19 @@ class StreamManager:
# Reset buffering state
self.buffering = False
self.buffering_start_time = None
# Log failover event
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'channel_failover',
channel_id=self.channel_id,
channel_name=channel_obj.name,
reason='buffering_timeout',
duration=buffering_duration
)
except Exception as e:
logger.error(f"Could not log failover event: {e}")
else:
logger.error(f"Failed to switch to next stream for channel {self.channel_id} after buffering timeout")
else:
@ -709,6 +766,19 @@ class StreamManager:
self.buffering = True
self.buffering_start_time = time.time()
logger.warning(f"Buffering started for channel {self.channel_id} - speed: {ffmpeg_speed}x")
# Log system event for buffering
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'channel_buffering',
channel_id=self.channel_id,
channel_name=channel_obj.name,
speed=ffmpeg_speed
)
except Exception as e:
logger.error(f"Could not log buffering event: {e}")
# Log buffering warning
logger.debug(f"FFmpeg speed on channel {self.channel_id} is below {self.buffering_speed} ({ffmpeg_speed}x) - buffering detected")
# Set channel state to buffering
@ -1004,6 +1074,19 @@ class StreamManager:
except Exception as e:
logger.warning(f"Failed to reset buffer position: {e}")
# Log stream switch event
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'stream_switch',
channel_id=self.channel_id,
channel_name=channel_obj.name,
new_url=new_url[:100] if new_url else None,
stream_id=stream_id
)
except Exception as e:
logger.error(f"Could not log stream switch event: {e}")
return True
except Exception as e:
logger.error(f"Error during URL update for channel {self.channel_id}: {e}", exc_info=True)
@ -1122,6 +1205,19 @@ class StreamManager:
if connection_result:
self.connection_start_time = time.time()
logger.info(f"Reconnect successful for channel {self.channel_id}")
# Log reconnection event
try:
channel_obj = Channel.objects.get(uuid=self.channel_id)
log_system_event(
'channel_reconnect',
channel_id=self.channel_id,
channel_name=channel_obj.name,
reason='health_monitor'
)
except Exception as e:
logger.error(f"Could not log reconnection event: {e}")
return True
else:
logger.warning(f"Reconnect failed for channel {self.channel_id}")
@ -1199,25 +1295,17 @@ class StreamManager:
logger.debug(f"Error closing socket for channel {self.channel_id}: {e}")
pass
# Enhanced transcode process cleanup with immediate termination
if self.transcode_process:
try:
logger.debug(f"Killing transcode process for channel {self.channel_id}")
self.transcode_process.kill()
# Give it a very short time to die
try:
self.transcode_process.wait(timeout=0.5)
except subprocess.TimeoutExpired:
logger.error(f"Failed to kill transcode process even with force for channel {self.channel_id}")
except Exception as e:
logger.debug(f"Error terminating transcode process for channel {self.channel_id}: {e}")

View file

@ -39,6 +39,8 @@ def generate_stream_url(channel_id: str) -> Tuple[str, str, bool, Optional[int]]
# Handle direct stream preview (custom streams)
if isinstance(channel_or_stream, Stream):
from core.utils import RedisClient
stream = channel_or_stream
logger.info(f"Previewing stream directly: {stream.id} ({stream.name})")
@ -48,12 +50,43 @@ def generate_stream_url(channel_id: str) -> Tuple[str, str, bool, Optional[int]]
logger.error(f"Stream {stream.id} has no M3U account")
return None, None, False, None
# Get active profiles for this M3U account
m3u_profiles = m3u_account.profiles.filter(is_active=True)
default_profile = next((obj for obj in m3u_profiles if obj.is_default), None)
if not default_profile:
logger.error(f"No default active profile found for M3U account {m3u_account.id}")
return None, None, False, None
# Check profiles in order: default first, then others
profiles = [default_profile] + [obj for obj in m3u_profiles if not obj.is_default]
# Try to find an available profile with connection capacity
redis_client = RedisClient.get_client()
selected_profile = None
for profile in profiles:
logger.debug(f"Considering M3U profile {profile.id} for stream preview")
# Check connection availability
if redis_client:
profile_connections_key = f"profile_connections:{profile.id}"
current_connections = int(redis_client.get(profile_connections_key) or 0)
# Check if profile has available slots (or unlimited connections)
if profile.max_streams == 0 or current_connections < profile.max_streams:
selected_profile = profile
logger.debug(f"Selected profile {profile.id} with {current_connections}/{profile.max_streams} connections for stream preview")
break
else:
logger.debug(f"Profile {profile.id} at max connections: {current_connections}/{profile.max_streams}")
else:
# No Redis available, use first active profile
selected_profile = profile
break
if not selected_profile:
logger.error(f"No profiles available with connection capacity for M3U account {m3u_account.id}")
return None, None, False, None
# Get the appropriate user agent
@ -62,8 +95,8 @@ def generate_stream_url(channel_id: str) -> Tuple[str, str, bool, Optional[int]]
stream_user_agent = UserAgent.objects.get(id=CoreSettings.get_default_user_agent_id())
logger.debug(f"No user agent found for account, using default: {stream_user_agent}")
# Get stream URL with the selected profile's URL transformation
stream_url = transform_url(stream.url, selected_profile.search_pattern, selected_profile.replace_pattern)
# Check if the stream has its own stream_profile set, otherwise use default
if stream.stream_profile:
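transform_url is defined elsewhere in the repo; for readers of this hunk, a plausible sketch of its contract (an assumption, not the actual implementation) is a regex search-and-replace that leaves the URL untouched when no pattern is configured:
import re

def transform_url(url, search_pattern, replace_pattern):
    # Assumed behavior: an empty pattern means "no transformation"
    if not search_pattern:
        return url
    return re.sub(search_pattern, replace_pattern or '', url)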

View file

@ -18,7 +18,7 @@ from apps.accounts.permissions import (
)
from .models import (
Series, VODCategory, Movie, Episode, VODLogo,
M3USeriesRelation, M3UMovieRelation, M3UEpisodeRelation, M3UVODCategoryRelation
)
from .serializers import (
MovieSerializer,
@ -476,6 +476,59 @@ class VODCategoryViewSet(viewsets.ReadOnlyModelViewSet):
except KeyError:
return [Authenticated()]
def list(self, request, *args, **kwargs):
"""Override list to ensure Uncategorized categories and relations exist for all XC accounts with VOD enabled"""
from apps.m3u.models import M3UAccount
# Ensure Uncategorized categories exist
movie_category, _ = VODCategory.objects.get_or_create(
name="Uncategorized",
category_type="movie",
defaults={}
)
series_category, _ = VODCategory.objects.get_or_create(
name="Uncategorized",
category_type="series",
defaults={}
)
# Get all active XC accounts with VOD enabled
xc_accounts = M3UAccount.objects.filter(
account_type=M3UAccount.Types.XC,
is_active=True
)
for account in xc_accounts:
if account.custom_properties:
custom_props = account.custom_properties or {}
vod_enabled = custom_props.get("enable_vod", False)
if vod_enabled:
# Ensure relations exist for this account
auto_enable_new = custom_props.get("auto_enable_new_groups_vod", True)
M3UVODCategoryRelation.objects.get_or_create(
category=movie_category,
m3u_account=account,
defaults={
'enabled': auto_enable_new,
'custom_properties': {}
}
)
M3UVODCategoryRelation.objects.get_or_create(
category=series_category,
m3u_account=account,
defaults={
'enabled': auto_enable_new,
'custom_properties': {}
}
)
# Now proceed with normal list operation
return super().list(request, *args, **kwargs)
class UnifiedContentViewSet(viewsets.ReadOnlyModelViewSet):
"""ViewSet that combines Movies and Series for unified 'All' view"""

View file

@ -127,6 +127,37 @@ def refresh_movies(client, account, categories_by_provider, relations, scan_star
"""Refresh movie content using single API call for all movies"""
logger.info(f"Refreshing movies for account {account.name}")
# Ensure "Uncategorized" category exists for movies without a category
uncategorized_category, created = VODCategory.objects.get_or_create(
name="Uncategorized",
category_type="movie",
defaults={}
)
# Ensure there's a relation for the Uncategorized category
account_custom_props = account.custom_properties or {}
auto_enable_new = account_custom_props.get("auto_enable_new_groups_vod", True)
uncategorized_relation, rel_created = M3UVODCategoryRelation.objects.get_or_create(
category=uncategorized_category,
m3u_account=account,
defaults={
'enabled': auto_enable_new,
'custom_properties': {}
}
)
if created:
logger.info(f"Created 'Uncategorized' category for movies")
if rel_created:
logger.info(f"Created relation for 'Uncategorized' category (enabled={auto_enable_new})")
# Add uncategorized category to relations dict for easy access
relations[uncategorized_category.id] = uncategorized_relation
# Add to categories_by_provider with a special key for items without category
categories_by_provider['__uncategorized__'] = uncategorized_category
# Get all movies in a single API call
logger.info("Fetching all movies from provider...")
all_movies_data = client.get_vod_streams() # No category_id = get all movies
@ -150,6 +181,37 @@ def refresh_series(client, account, categories_by_provider, relations, scan_star
"""Refresh series content using single API call for all series"""
logger.info(f"Refreshing series for account {account.name}")
# Ensure "Uncategorized" category exists for series without a category
uncategorized_category, created = VODCategory.objects.get_or_create(
name="Uncategorized",
category_type="series",
defaults={}
)
# Ensure there's a relation for the Uncategorized category
account_custom_props = account.custom_properties or {}
auto_enable_new = account_custom_props.get("auto_enable_new_groups_series", True)
uncategorized_relation, rel_created = M3UVODCategoryRelation.objects.get_or_create(
category=uncategorized_category,
m3u_account=account,
defaults={
'enabled': auto_enable_new,
'custom_properties': {}
}
)
if created:
logger.info(f"Created 'Uncategorized' category for series")
if rel_created:
logger.info(f"Created relation for 'Uncategorized' category (enabled={auto_enable_new})")
# Add uncategorized category to relations dict for easy access
relations[uncategorized_category.id] = uncategorized_relation
# Add to categories_by_provider with a special key for items without category
categories_by_provider['__uncategorized__'] = uncategorized_category
# Get all series in a single API call
logger.info("Fetching all series from provider...")
all_series_data = client.get_series() # No category_id = get all series
@ -240,6 +302,7 @@ def batch_create_categories(categories_data, category_type, account):
M3UVODCategoryRelation.objects.bulk_create(relations_to_create, ignore_conflicts=True)
# Delete orphaned category relationships (categories no longer in the M3U source)
# Exclude "Uncategorized" from cleanup as it's a special category we manage
current_category_ids = set(existing_categories[name].id for name in category_names)
existing_relations = M3UVODCategoryRelation.objects.filter(
m3u_account=account,
@ -248,7 +311,7 @@ def batch_create_categories(categories_data, category_type, account):
relations_to_delete = [
rel for rel in existing_relations
if rel.category_id not in current_category_ids and rel.category.name != "Uncategorized"
]
if relations_to_delete:
@ -331,7 +394,16 @@ def process_movie_batch(account, batch, categories, relations, scan_start_time=N
logger.debug("Skipping disabled category")
continue
else:
logger.warning(f"No category ID provided for movie {name}")
# Assign to Uncategorized category if no category_id provided
logger.debug(f"No category ID provided for movie {name}, assigning to 'Uncategorized'")
category = categories.get('__uncategorized__')
if category:
movie_data['_category_id'] = category.id
# Check if uncategorized is disabled
relation = relations.get(category.id, None)
if relation and not relation.enabled:
logger.debug("Skipping disabled 'Uncategorized' category")
continue
# Extract metadata
year = extract_year_from_data(movie_data, 'name')
@ -633,7 +705,16 @@ def process_series_batch(account, batch, categories, relations, scan_start_time=
logger.debug("Skipping disabled category")
continue
else:
logger.warning(f"No category ID provided for series {name}")
# Assign to Uncategorized category if no category_id provided
logger.debug(f"No category ID provided for series {name}, assigning to 'Uncategorized'")
category = categories.get('__uncategorized__')
if category:
series_data['_category_id'] = category.id
# Check if uncategorized is disabled
relation = relations.get(category.id, None)
if relation and not relation.enabled:
logger.debug("Skipping disabled 'Uncategorized' category")
continue
# Extract metadata
year = extract_year(series_data.get('releaseDate', ''))

View file

@ -2,7 +2,16 @@
from django.urls import path, include
from rest_framework.routers import DefaultRouter
from .api_views import (
UserAgentViewSet,
StreamProfileViewSet,
CoreSettingsViewSet,
environment,
version,
rehash_streams_endpoint,
TimezoneListView,
get_system_events
)
router = DefaultRouter()
router.register(r'useragents', UserAgentViewSet, basename='useragent')
@ -13,5 +22,6 @@ urlpatterns = [
path('version/', version, name='version'),
path('rehash-streams/', rehash_streams_endpoint, name='rehash_streams'),
path('timezones/', TimezoneListView.as_view(), name='timezones'),
path('system-events/', get_system_events, name='system_events'),
path('', include(router.urls)),
]

View file

@ -396,3 +396,64 @@ class TimezoneListView(APIView):
'grouped': grouped,
'count': len(all_timezones)
})
# ─────────────────────────────
# System Events API
# ─────────────────────────────
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def get_system_events(request):
"""
Get recent system events (channel start/stop, buffering, client connections, etc.)
Query Parameters:
limit: Number of events to return per page (default: 100, max: 1000)
offset: Number of events to skip (for pagination, default: 0)
event_type: Filter by specific event type (optional)
"""
from core.models import SystemEvent
try:
# Get pagination params
limit = min(int(request.GET.get('limit', 100)), 1000)
offset = int(request.GET.get('offset', 0))
# Start with all events
events = SystemEvent.objects.all()
# Filter by event_type if provided
event_type = request.GET.get('event_type')
if event_type:
events = events.filter(event_type=event_type)
# Get total count before applying pagination
total_count = events.count()
# Apply offset and limit for pagination
events = events[offset:offset + limit]
# Serialize the data
events_data = [{
'id': event.id,
'event_type': event.event_type,
'event_type_display': event.get_event_type_display(),
'timestamp': event.timestamp.isoformat(),
'channel_id': str(event.channel_id) if event.channel_id else None,
'channel_name': event.channel_name,
'details': event.details
} for event in events]
return Response({
'events': events_data,
'count': len(events_data),
'total': total_count,
'offset': offset,
'limit': limit
})
except Exception as e:
logger.error(f"Error fetching system events: {e}")
return Response({
'error': 'Failed to fetch system events'
}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
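A usage example for the endpoint above (host, port, URL prefix, and auth scheme are deployment-specific assumptions):
import requests

resp = requests.get(
    'http://localhost:9191/api/core/system-events/',
    params={'limit': 50, 'offset': 0, 'event_type': 'channel_error'},
    headers={'Authorization': 'Bearer <access-token>'},
)
events = resp.json()['events']  # each item carries event_type, timestamp, details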

View file

@ -0,0 +1,28 @@
# Generated by Django 5.2.4 on 2025-11-20 20:47
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0016_update_dvr_template_paths'),
]
operations = [
migrations.CreateModel(
name='SystemEvent',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('event_type', models.CharField(choices=[('channel_start', 'Channel Started'), ('channel_stop', 'Channel Stopped'), ('channel_buffering', 'Channel Buffering'), ('channel_failover', 'Channel Failover'), ('channel_reconnect', 'Channel Reconnected'), ('channel_error', 'Channel Error'), ('client_connect', 'Client Connected'), ('client_disconnect', 'Client Disconnected'), ('recording_start', 'Recording Started'), ('recording_end', 'Recording Ended'), ('stream_switch', 'Stream Switched'), ('m3u_refresh', 'M3U Refreshed'), ('m3u_download', 'M3U Downloaded'), ('epg_refresh', 'EPG Refreshed'), ('epg_download', 'EPG Downloaded')], db_index=True, max_length=50)),
('timestamp', models.DateTimeField(auto_now_add=True, db_index=True)),
('channel_id', models.UUIDField(blank=True, db_index=True, null=True)),
('channel_name', models.CharField(blank=True, max_length=255, null=True)),
('details', models.JSONField(blank=True, default=dict)),
],
options={
'ordering': ['-timestamp'],
'indexes': [models.Index(fields=['-timestamp'], name='core_system_timesta_c6c3d1_idx'), models.Index(fields=['event_type', '-timestamp'], name='core_system_event_t_4267d9_idx')],
},
),
]

View file

@ -0,0 +1,18 @@
# Generated by Django 5.2.4 on 2025-11-21 15:59
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0017_systemevent'),
]
operations = [
migrations.AlterField(
model_name='systemevent',
name='event_type',
field=models.CharField(choices=[('channel_start', 'Channel Started'), ('channel_stop', 'Channel Stopped'), ('channel_buffering', 'Channel Buffering'), ('channel_failover', 'Channel Failover'), ('channel_reconnect', 'Channel Reconnected'), ('channel_error', 'Channel Error'), ('client_connect', 'Client Connected'), ('client_disconnect', 'Client Disconnected'), ('recording_start', 'Recording Started'), ('recording_end', 'Recording Ended'), ('stream_switch', 'Stream Switched'), ('m3u_refresh', 'M3U Refreshed'), ('m3u_download', 'M3U Downloaded'), ('epg_refresh', 'EPG Refreshed'), ('epg_download', 'EPG Downloaded'), ('login_success', 'Login Successful'), ('login_failed', 'Login Failed'), ('logout', 'User Logged Out'), ('m3u_blocked', 'M3U Download Blocked'), ('epg_blocked', 'EPG Download Blocked')], db_index=True, max_length=50),
),
]

View file

@ -375,3 +375,48 @@ class CoreSettings(models.Model):
return rules
except Exception:
return rules
class SystemEvent(models.Model):
"""
Tracks system events like channel start/stop, buffering, failover, client connections.
Maintains a rolling history based on max_system_events setting.
"""
EVENT_TYPES = [
('channel_start', 'Channel Started'),
('channel_stop', 'Channel Stopped'),
('channel_buffering', 'Channel Buffering'),
('channel_failover', 'Channel Failover'),
('channel_reconnect', 'Channel Reconnected'),
('channel_error', 'Channel Error'),
('client_connect', 'Client Connected'),
('client_disconnect', 'Client Disconnected'),
('recording_start', 'Recording Started'),
('recording_end', 'Recording Ended'),
('stream_switch', 'Stream Switched'),
('m3u_refresh', 'M3U Refreshed'),
('m3u_download', 'M3U Downloaded'),
('epg_refresh', 'EPG Refreshed'),
('epg_download', 'EPG Downloaded'),
('login_success', 'Login Successful'),
('login_failed', 'Login Failed'),
('logout', 'User Logged Out'),
('m3u_blocked', 'M3U Download Blocked'),
('epg_blocked', 'EPG Download Blocked'),
]
event_type = models.CharField(max_length=50, choices=EVENT_TYPES, db_index=True)
timestamp = models.DateTimeField(auto_now_add=True, db_index=True)
channel_id = models.UUIDField(null=True, blank=True, db_index=True)
channel_name = models.CharField(max_length=255, null=True, blank=True)
details = models.JSONField(default=dict, blank=True)
class Meta:
ordering = ['-timestamp']
indexes = [
models.Index(fields=['-timestamp']),
models.Index(fields=['event_type', '-timestamp']),
]
def __str__(self):
return f"{self.event_type} - {self.channel_name or 'N/A'} @ {self.timestamp}"

View file

@ -388,3 +388,48 @@ def validate_flexible_url(value):
# If it doesn't match our flexible patterns, raise the original error
raise ValidationError("Enter a valid URL.")
def log_system_event(event_type, channel_id=None, channel_name=None, **details):
"""
Log a system event and maintain the configured max history.
Args:
event_type: Type of event (e.g., 'channel_start', 'client_connect')
channel_id: Optional UUID of the channel
channel_name: Optional name of the channel
**details: Additional details to store in the event (stored as JSON)
Example:
log_system_event('channel_start', channel_id=uuid, channel_name='CNN',
stream_url='http://...', user='admin')
"""
from core.models import SystemEvent, CoreSettings
try:
# Create the event
SystemEvent.objects.create(
event_type=event_type,
channel_id=channel_id,
channel_name=channel_name,
details=details
)
# Get max events from settings (default 100)
try:
max_events_setting = CoreSettings.objects.filter(key='max-system-events').first()
max_events = int(max_events_setting.value) if max_events_setting else 100
except Exception:
max_events = 100
# Delete old events beyond the limit (keep it efficient with a single query)
total_count = SystemEvent.objects.count()
if total_count > max_events:
# Get the ID of the event at the cutoff point
cutoff_event = SystemEvent.objects.values_list('id', flat=True)[max_events]
# Delete all events with ID less than cutoff (older events)
SystemEvent.objects.filter(id__lt=cutoff_event).delete()
except Exception as e:
# Don't let event logging break the main application
logger.error(f"Failed to log system event {event_type}: {e}")

View file

@@ -44,7 +44,7 @@ def network_access_allowed(request, settings_key):
cidrs = (
network_access[settings_key].split(",")
if settings_key in network_access
else ["0.0.0.0/0"]
else ["0.0.0.0/0", "::/0"]
)
network_allowed = False
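
The effect of the new default can be sketched with the stdlib ipaddress module (a simplified model of the check, not the function's actual body):

import ipaddress

def ip_allowed(client_ip, cidrs=("0.0.0.0/0", "::/0")):
    """Simplified check: allow if client_ip falls inside any configured CIDR."""
    addr = ipaddress.ip_address(client_ip)
    for cidr in cidrs:
        net = ipaddress.ip_network(cidr.strip(), strict=False)
        # An address only matches networks of its own IP version,
        # which is why IPv6 needs its own catch-all alongside 0.0.0.0/0
        if addr.version == net.version and addr in net:
            return True
    return False

assert ip_allowed("192.168.1.10")                            # via 0.0.0.0/0
assert ip_allowed("2001:db8::1")                             # via ::/0 only
assert not ip_allowed("2001:db8::1", cidrs=("0.0.0.0/0",))   # the old default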

View file

@@ -3,6 +3,7 @@ proxy_cache_path /app/logo_cache levels=1:2 keys_zone=logo_cache:10m
server {
listen NGINX_PORT;
listen [::]:NGINX_PORT;
proxy_connect_timeout 75;
proxy_send_timeout 300;

View file

@@ -569,14 +569,22 @@ export const WebsocketProvider = ({ children }) => {
break;
case 'epg_refresh':
// Update the store with progress information
updateEPGProgress(parsedEvent.data);
// If we have source/account info, update the EPG source status
// If we have source/account info, check if EPG exists before processing
if (parsedEvent.data.source || parsedEvent.data.account) {
const sourceId =
parsedEvent.data.source || parsedEvent.data.account;
const epg = epgs[sourceId];
// Only update progress if the EPG still exists in the store
// This prevents crashes when receiving updates for deleted EPGs
if (epg) {
// Update the store with progress information
updateEPGProgress(parsedEvent.data);
} else {
// EPG was deleted, ignore this update
console.debug(`Ignoring EPG refresh update for deleted EPG ${sourceId}`);
break;
}
if (epg) {
// Check for any indication of an error (either via status or error field)

View file

@@ -170,7 +170,7 @@ export default class API {
static async logout() {
return await request(`${host}/api/accounts/auth/logout/`, {
auth: false,
auth: true, // Send JWT token so backend can identify the user
method: 'POST',
});
}
@@ -1053,8 +1053,20 @@
}
static async updateEPG(values, isToggle = false) {
// Validate that values is an object
if (!values || typeof values !== 'object') {
console.error('updateEPG called with invalid values:', values);
return;
}
const { id, ...payload } = values;
// Validate that we have an ID and payload is an object
if (!id || typeof payload !== 'object') {
console.error('updateEPG: invalid id or payload', { id, payload });
return;
}
try {
// If this is just toggling the active state, make a simpler request
if (
@@ -2481,4 +2493,21 @@
errorNotification('Failed to update playback position', e);
}
}
static async getSystemEvents(limit = 100, offset = 0, eventType = null) {
try {
const params = new URLSearchParams();
params.append('limit', limit);
params.append('offset', offset);
if (eventType) {
params.append('event_type', eventType);
}
const response = await request(
`${host}/api/core/system-events/?${params.toString()}`
);
return response;
} catch (e) {
errorNotification('Failed to retrieve system events', e);
}
}
}
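
The new endpoint can also be exercised outside the web UI; a sketch with python-requests, where the host, port, and token are placeholders and the {events, total} response shape is inferred from the SystemEvents component later in this diff:

import requests  # third-party: pip install requests

HOST = "http://localhost:9191"   # placeholder Dispatcharr URL
TOKEN = "<JWT access token>"     # placeholder; the API expects an authenticated user

resp = requests.get(
    f"{HOST}/api/core/system-events/",
    params={"limit": 50, "offset": 0, "event_type": "channel_failover"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
data = resp.json()
print(f"{data['total']} events total, {len(data['events'])} returned")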

View file

@@ -928,7 +928,8 @@ const SeriesModal = ({ series, opened, onClose }) => {
src={trailerUrl}
title="YouTube Trailer"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share"
referrerPolicy="strict-origin-when-cross-origin"
allowFullScreen
style={{
position: 'absolute',

View file

@@ -188,8 +188,8 @@ const Sidebar = ({ collapsed, toggleDrawer, drawerWidth, miniDrawerWidth }) => {
}
};
const onLogout = () => {
logout();
const onLogout = async () => {
await logout();
window.location.reload();
};

View file

@@ -0,0 +1,333 @@
import React, { useState, useEffect, useCallback } from 'react';
import {
ActionIcon,
Box,
Button,
Card,
Group,
NumberInput,
Pagination,
Select,
Stack,
Text,
Title,
} from '@mantine/core';
import { useElementSize } from '@mantine/hooks';
import {
ChevronDown,
CirclePlay,
Download,
Gauge,
HardDriveDownload,
List,
LogIn,
LogOut,
RefreshCw,
Shield,
ShieldAlert,
SquareX,
Timer,
Users,
Video,
XCircle,
} from 'lucide-react';
import dayjs from 'dayjs';
import API from '../api';
import useLocalStorage from '../hooks/useLocalStorage';
const SystemEvents = () => {
const [events, setEvents] = useState([]);
const [totalEvents, setTotalEvents] = useState(0);
const [isExpanded, setIsExpanded] = useState(false);
const { ref: cardRef, width: cardWidth } = useElementSize();
const isNarrow = cardWidth < 650;
const [isLoading, setIsLoading] = useState(false);
const [dateFormatSetting] = useLocalStorage('date-format', 'mdy');
const dateFormat = dateFormatSetting === 'mdy' ? 'MM/DD' : 'DD/MM';
const [eventsRefreshInterval, setEventsRefreshInterval] = useLocalStorage(
'events-refresh-interval',
0
);
const [eventsLimit, setEventsLimit] = useLocalStorage('events-limit', 100);
const [currentPage, setCurrentPage] = useState(1);
// Calculate offset based on current page and limit
const offset = (currentPage - 1) * eventsLimit;
const totalPages = Math.ceil(totalEvents / eventsLimit);
const fetchEvents = useCallback(async () => {
try {
setIsLoading(true);
const response = await API.getSystemEvents(eventsLimit, offset);
if (response && response.events) {
setEvents(response.events);
setTotalEvents(response.total || 0);
}
} catch (error) {
console.error('Error fetching system events:', error);
} finally {
setIsLoading(false);
}
}, [eventsLimit, offset]);
// Fetch events on mount and when eventsRefreshInterval changes
useEffect(() => {
fetchEvents();
// Set up polling if interval is set and events section is expanded
if (eventsRefreshInterval > 0 && isExpanded) {
const interval = setInterval(fetchEvents, eventsRefreshInterval * 1000);
return () => clearInterval(interval);
}
}, [fetchEvents, eventsRefreshInterval, isExpanded]);
// Reset to first page when limit changes
useEffect(() => {
setCurrentPage(1);
}, [eventsLimit]);
const getEventIcon = (eventType) => {
switch (eventType) {
case 'channel_start':
return <CirclePlay size={16} />;
case 'channel_stop':
return <SquareX size={16} />;
case 'channel_reconnect':
return <RefreshCw size={16} />;
case 'channel_buffering':
return <Timer size={16} />;
case 'channel_failover':
return <HardDriveDownload size={16} />;
case 'client_connect':
return <Users size={16} />;
case 'client_disconnect':
return <Users size={16} />;
case 'recording_start':
return <Video size={16} />;
case 'recording_end':
return <Video size={16} />;
case 'stream_switch':
return <HardDriveDownload size={16} />;
case 'm3u_refresh':
return <RefreshCw size={16} />;
case 'm3u_download':
return <Download size={16} />;
case 'epg_refresh':
return <RefreshCw size={16} />;
case 'epg_download':
return <Download size={16} />;
case 'login_success':
return <LogIn size={16} />;
case 'login_failed':
return <ShieldAlert size={16} />;
case 'logout':
return <LogOut size={16} />;
case 'm3u_blocked':
return <XCircle size={16} />;
case 'epg_blocked':
return <XCircle size={16} />;
default:
return <Gauge size={16} />;
}
};
const getEventColor = (eventType) => {
switch (eventType) {
case 'channel_start':
case 'client_connect':
case 'recording_start':
case 'login_success':
return 'green';
case 'channel_reconnect':
return 'yellow';
case 'channel_stop':
case 'client_disconnect':
case 'recording_end':
case 'logout':
return 'gray';
case 'channel_buffering':
return 'yellow';
case 'channel_failover':
case 'channel_error':
return 'orange';
case 'stream_switch':
return 'blue';
case 'm3u_refresh':
case 'epg_refresh':
return 'cyan';
case 'm3u_download':
case 'epg_download':
return 'teal';
case 'login_failed':
case 'm3u_blocked':
case 'epg_blocked':
return 'red';
default:
return 'gray';
}
};
return (
<Card
ref={cardRef}
shadow="sm"
padding="sm"
radius="md"
withBorder
style={{
color: '#fff',
backgroundColor: '#27272A',
width: '100%',
maxWidth: isExpanded ? '100%' : '800px',
marginLeft: 'auto',
marginRight: 'auto',
transition: 'max-width 0.3s ease',
}}
>
<Group justify="space-between" mb={isExpanded ? 'sm' : 0}>
<Group gap="xs">
<Gauge size={20} />
<Title order={4}>System Events</Title>
</Group>
<Group gap="xs">
{(isExpanded || !isNarrow) && (
<>
<NumberInput
size="xs"
label="Events Per Page"
value={eventsLimit}
onChange={(value) => setEventsLimit(value || 10)}
min={10}
max={1000}
step={10}
style={{ width: 130 }}
/>
<Select
size="xs"
label="Auto Refresh"
value={eventsRefreshInterval.toString()}
onChange={(value) => setEventsRefreshInterval(parseInt(value))}
data={[
{ value: '0', label: 'Manual' },
{ value: '5', label: '5s' },
{ value: '10', label: '10s' },
{ value: '30', label: '30s' },
{ value: '60', label: '1m' },
]}
style={{ width: 120 }}
/>
<Button
size="xs"
variant="subtle"
onClick={fetchEvents}
loading={isLoading}
style={{ marginTop: 'auto' }}
>
Refresh
</Button>
</>
)}
<ActionIcon
variant="subtle"
onClick={() => setIsExpanded(!isExpanded)}
>
<ChevronDown
size={18}
style={{
transform: isExpanded ? 'rotate(180deg)' : 'rotate(0deg)',
transition: 'transform 0.2s',
}}
/>
</ActionIcon>
</Group>
</Group>
{isExpanded && (
<>
{totalEvents > eventsLimit && (
<Group justify="space-between" align="center" mt="sm" mb="xs">
<Text size="xs" c="dimmed">
Showing {offset + 1}-
{Math.min(offset + eventsLimit, totalEvents)} of {totalEvents}
</Text>
<Pagination
total={totalPages}
value={currentPage}
onChange={setCurrentPage}
size="sm"
/>
</Group>
)}
<Stack
gap="xs"
mt="sm"
style={{
maxHeight: '60vh',
overflowY: 'auto',
}}
>
{events.length === 0 ? (
<Text size="sm" c="dimmed" ta="center" py="xl">
No events recorded yet
</Text>
) : (
events.map((event) => (
<Box
key={event.id}
p="xs"
style={{
backgroundColor: '#1A1B1E',
borderRadius: '4px',
borderLeft: `3px solid var(--mantine-color-${getEventColor(event.event_type)}-6)`,
}}
>
<Group justify="space-between" wrap="nowrap">
<Group gap="xs" style={{ flex: 1, minWidth: 0 }}>
<Box c={`${getEventColor(event.event_type)}.6`}>
{getEventIcon(event.event_type)}
</Box>
<Stack gap={2} style={{ flex: 1, minWidth: 0 }}>
<Group gap="xs" wrap="nowrap">
<Text size="sm" fw={500}>
{event.event_type_display || event.event_type}
</Text>
{event.channel_name && (
<Text
size="sm"
c="dimmed"
truncate
style={{ maxWidth: '300px' }}
>
{event.channel_name}
</Text>
)}
</Group>
{event.details &&
Object.keys(event.details).length > 0 && (
<Text size="xs" c="dimmed">
{Object.entries(event.details)
.filter(
([key]) =>
!['stream_url', 'new_url'].includes(key)
)
.map(([key, value]) => `${key}: ${value}`)
.join(', ')}
</Text>
)}
</Stack>
</Group>
<Text size="xs" c="dimmed" style={{ whiteSpace: 'nowrap' }}>
{dayjs(event.timestamp).format(`${dateFormat} HH:mm:ss`)}
</Text>
</Group>
</Box>
))
)}
</Stack>
</>
)}
</Card>
);
};
export default SystemEvents;

View file

@@ -694,7 +694,8 @@ const VODModal = ({ vod, opened, onClose }) => {
src={trailerUrl}
title="YouTube Trailer"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share"
referrerPolicy="strict-origin-when-cross-origin"
allowFullScreen
style={{
position: 'absolute',

View file

@@ -728,6 +728,16 @@ const DummyEPGForm = ({ epg, isOpen, onClose }) => {
const handleSubmit = async (values) => {
try {
if (epg?.id) {
// Validate that we have a valid EPG object before updating
if (!epg || typeof epg !== 'object' || !epg.id) {
notifications.show({
title: 'Error',
message: 'Invalid EPG data. Please close and reopen this form.',
color: 'red',
});
return;
}
await API.updateEPG({ ...values, id: epg.id });
notifications.show({
title: 'Success',

View file

@@ -15,6 +15,7 @@ import {
Text,
} from '@mantine/core';
import { isNotEmpty, useForm } from '@mantine/form';
import { notifications } from '@mantine/notifications';
const EPG = ({ epg = null, isOpen, onClose }) => {
const [sourceType, setSourceType] = useState('xmltv');
@@ -40,6 +41,16 @@ const EPG = ({ epg = null, isOpen, onClose }) => {
const values = form.getValues();
if (epg?.id) {
// Validate that we have a valid EPG object before updating
if (!epg || typeof epg !== 'object' || !epg.id) {
notifications.show({
title: 'Error',
message: 'Invalid EPG data. Please close and reopen this form.',
color: 'red',
});
return;
}
await API.updateEPG({ id: epg.id, ...values });
} else {
await API.addEPG(values);

View file

@@ -226,7 +226,17 @@ const M3U = ({
return (
<>
<Modal size={700} opened={isOpen} onClose={close} title="M3U Account">
<Modal
size={700}
opened={isOpen}
onClose={close}
title="M3U Account"
scrollAreaComponent={Modal.NativeScrollArea}
lockScroll={false}
withinPortal={true}
trapFocus={false}
yOffset="2vh"
>
<LoadingOverlay
visible={form.submitting}
overlayBlur={2}

View file

@@ -253,7 +253,16 @@ const M3UFilters = ({ playlist, isOpen, onClose }) => {
return (
<>
<Modal opened={isOpen} onClose={onClose} title="Filters" size="lg">
<Modal
opened={isOpen}
onClose={onClose}
title="Filters"
size="lg"
scrollAreaComponent={Modal.NativeScrollArea}
lockScroll={false}
withinPortal={true}
yOffset="2vh"
>
<Alert
icon={<Info size={16} />}
color="blue"

View file

@@ -77,27 +77,29 @@ const M3UGroupFilter = ({ playlist = null, isOpen, onClose }) => {
}
setGroupStates(
playlist.channel_groups.map((group) => {
// Parse custom_properties if present
let customProps = {};
if (group.custom_properties) {
try {
customProps =
typeof group.custom_properties === 'string'
? JSON.parse(group.custom_properties)
: group.custom_properties;
} catch (e) {
customProps = {};
playlist.channel_groups
.filter((group) => channelGroups[group.channel_group]) // Filter out groups that don't exist
.map((group) => {
// Parse custom_properties if present
let customProps = {};
if (group.custom_properties) {
try {
customProps =
typeof group.custom_properties === 'string'
? JSON.parse(group.custom_properties)
: group.custom_properties;
} catch (e) {
customProps = {};
}
}
}
return {
...group,
name: channelGroups[group.channel_group].name,
auto_channel_sync: group.auto_channel_sync || false,
auto_sync_channel_start: group.auto_sync_channel_start || 1.0,
custom_properties: customProps,
};
})
return {
...group,
name: channelGroups[group.channel_group].name,
auto_channel_sync: group.auto_channel_sync || false,
auto_sync_channel_start: group.auto_sync_channel_start || 1.0,
custom_properties: customProps,
};
})
);
}, [playlist, channelGroups]);
@@ -184,6 +186,10 @@ const M3UGroupFilter = ({ playlist = null, isOpen, onClose }) => {
title="M3U Group Filter & Auto Channel Sync"
size={1000}
styles={{ content: { '--mantine-color-body': '#27272A' } }}
scrollAreaComponent={Modal.NativeScrollArea}
lockScroll={false}
withinPortal={true}
yOffset="2vh"
>
<LoadingOverlay visible={isLoading} overlayBlur={2} />
<Stack>

View file

@@ -192,7 +192,15 @@ const M3UProfiles = ({ playlist = null, isOpen, onClose }) => {
return (
<>
<Modal opened={isOpen} onClose={onClose} title="Profiles">
<Modal
opened={isOpen}
onClose={onClose}
title="Profiles"
scrollAreaComponent={Modal.NativeScrollArea}
lockScroll={false}
withinPortal={true}
yOffset="2vh"
>
{profilesArray
.sort((a, b) => {
// Always put default profile first

View file

@@ -23,6 +23,7 @@ import {
ArrowUpNarrowWide,
ArrowUpDown,
ArrowDownWideNarrow,
Search,
} from 'lucide-react';
import {
Box,
@@ -949,6 +950,7 @@ const ChannelsTable = ({}) => {
size="xs"
variant="unstyled"
className="table-input-header"
leftSection={<Search size={14} opacity={0.5} />}
/>
<Center>
{React.createElement(sortingIcon, {

View file

@@ -131,6 +131,12 @@ const EPGsTable = () => {
const toggleActive = async (epg) => {
try {
// Validate that epg is a valid object with an id
if (!epg || typeof epg !== 'object' || !epg.id) {
console.error('toggleActive called with invalid epg:', epg);
return;
}
// Send only the is_active field to trigger our special handling
await API.updateEPG(
{

View file

@@ -13,6 +13,7 @@ import {
ArrowUpDown,
ArrowUpNarrowWide,
ArrowDownWideNarrow,
Search,
} from 'lucide-react';
import {
TextInput,
@@ -745,6 +746,7 @@ const StreamsTable = () => {
size="xs"
variant="unstyled"
className="table-input-header"
leftSection={<Search size={14} opacity={0.5} />}
/>
<Center>
{React.createElement(sortingIcon, {

View file

@@ -102,6 +102,16 @@ const RECURRING_DAY_OPTIONS = [
{ value: 5, label: 'Sat' },
];
const useDateTimeFormat = () => {
const [timeFormatSetting] = useLocalStorage('time-format', '12h');
const [dateFormatSetting] = useLocalStorage('date-format', 'mdy');
// Use user preference for time format
const timeFormat = timeFormatSetting === '12h' ? 'h:mma' : 'HH:mm';
const dateFormat = dateFormatSetting === 'mdy' ? 'MMM D' : 'D MMM';
return [timeFormat, dateFormat]
};
// Short preview that triggers the details modal when clicked
const RecordingSynopsis = ({ description, onOpen }) => {
const truncated = description?.length > 140;
@@ -139,6 +149,7 @@ const RecordingDetailsModal = ({
const { toUserTime, userNow } = useTimeHelpers();
const [childOpen, setChildOpen] = React.useState(false);
const [childRec, setChildRec] = React.useState(null);
const [timeformat, dateformat] = useDateTimeFormat();
const safeRecording = recording || {};
const customProps = safeRecording.custom_properties || {};
@@ -320,7 +331,7 @@
)}
</Group>
<Text size="xs">
{start.format('MMM D, YYYY h:mma')} {end.format('h:mma')}
{start.format(`${dateformat}, YYYY ${timeformat}`)} {end.format(timeformat)}
</Text>
</Stack>
<Group gap={6}>
@@ -498,7 +509,7 @@
</Group>
</Group>
<Text size="sm">
{start.format('MMM D, YYYY h:mma')} {end.format('h:mma')}
{start.format(`${dateformat}, YYYY ${timeformat}`)} {end.format(timeformat)}
</Text>
{rating && (
<Group gap={8}>
@@ -558,6 +569,7 @@ const RecurringRuleModal = ({ opened, onClose, ruleId, onEditOccurrence }) => {
const fetchRecordings = useChannelsStore((s) => s.fetchRecordings);
const recordings = useChannelsStore((s) => s.recordings);
const { toUserTime, userNow } = useTimeHelpers();
const [timeformat, dateformat] = useDateTimeFormat();
const [saving, setSaving] = useState(false);
const [deleting, setDeleting] = useState(false);
@@ -892,10 +904,10 @@
<Group justify="space-between" align="center">
<Stack gap={2} style={{ flex: 1 }}>
<Text fw={600} size="sm">
{occStart.format('MMM D, YYYY')}
{occStart.format(`${dateformat}, YYYY`)}
</Text>
<Text size="xs" c="dimmed">
{occStart.format('h:mma')} {occEnd.format('h:mma')}
{occStart.format(timeformat)} {occEnd.format(timeformat)}
</Text>
</Stack>
<Group gap={6}>
@@ -937,6 +949,7 @@ const RecordingCard = ({ recording, onOpenDetails, onOpenRecurring }) => {
const showVideo = useVideoStore((s) => s.showVideo);
const fetchRecordings = useChannelsStore((s) => s.fetchRecordings);
const { toUserTime, userNow } = useTimeHelpers();
const [timeformat, dateformat] = useDateTimeFormat();
const channel = channels?.[recording.channel];
@@ -1221,7 +1234,7 @@
{isSeriesGroup ? 'Next recording' : 'Time'}
</Text>
<Text size="sm">
{start.format('MMM D, YYYY h:mma')} {end.format('h:mma')}
{start.format(`${dateformat}, YYYY ${timeformat}`)} {end.format(timeformat)}
</Text>
</Group>
@@ -1698,4 +1711,4 @@
);
};
export default DVRPage;
export default DVRPage;

View file

@@ -278,14 +278,15 @@ const SettingsPage = () => {
const networkAccessForm = useForm({
mode: 'controlled',
initialValues: Object.keys(NETWORK_ACCESS_OPTIONS).reduce((acc, key) => {
acc[key] = '0.0.0.0/0,::0/0';
acc[key] = '0.0.0.0/0,::/0';
return acc;
}, {}),
validate: Object.keys(NETWORK_ACCESS_OPTIONS).reduce((acc, key) => {
acc[key] = (value) => {
const cidrs = value.split(',');
const ipv4CidrRegex = /^([0-9]{1,3}\.){3}[0-9]{1,3}\/\d+$/;
const ipv6CidrRegex = /(?:(?:(?:[A-F0-9]{1,4}:){6}|(?=(?:[A-F0-9]{0,4}:){0,6}(?:[0-9]{1,3}\.){3}[0-9]{1,3}(?![:.\w]))(([0-9A-F]{1,4}:){0,5}|:)((:[0-9A-F]{1,4}){1,5}:|:)|::(?:[A-F0-9]{1,4}:){5})(?:(?:25[0-5]|2[0-4][0-9]|1[0-9][0-9]|[1-9]?[0-9])\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)|(?:[A-F0-9]{1,4}:){7}[A-F0-9]{1,4}|(?=(?:[A-F0-9]{0,4}:){0,7}[A-F0-9]{0,4}(?![:.\w]))(([0-9A-F]{1,4}:){1,7}|:)((:[0-9A-F]{1,4}){1,7}|:)|(?:[A-F0-9]{1,4}:){7}:|:(:[A-F0-9]{1,4}){7})(?![:.\w])\/(?:12[0-8]|1[01][0-9]|[1-9]?[0-9])/;
const ipv6CidrRegex =
/(?:(?:(?:[A-F0-9]{1,4}:){6}|(?=(?:[A-F0-9]{0,4}:){0,6}(?:[0-9]{1,3}\.){3}[0-9]{1,3}(?![:.\w]))(([0-9A-F]{1,4}:){0,5}|:)((:[0-9A-F]{1,4}){1,5}:|:)|::(?:[A-F0-9]{1,4}:){5})(?:(?:25[0-5]|2[0-4][0-9]|1[0-9][0-9]|[1-9]?[0-9])\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)|(?:[A-F0-9]{1,4}:){7}[A-F0-9]{1,4}|(?=(?:[A-F0-9]{0,4}:){0,7}[A-F0-9]{0,4}(?![:.\w]))(([0-9A-F]{1,4}:){1,7}|:)((:[0-9A-F]{1,4}){1,7}|:)|(?:[A-F0-9]{1,4}:){7}:|:(:[A-F0-9]{1,4}){7})(?![:.\w])\/(?:12[0-8]|1[01][0-9]|[1-9]?[0-9])/;
for (const cidr of cidrs) {
if (cidr.match(ipv4CidrRegex) || cidr.match(ipv6CidrRegex)) {
continue;
@@ -357,7 +358,7 @@
);
networkAccessForm.setValues(
Object.keys(NETWORK_ACCESS_OPTIONS).reduce((acc, key) => {
acc[key] = networkAccessSettings[key] || '0.0.0.0/0,::0/0';
acc[key] = networkAccessSettings[key] || '0.0.0.0/0,::/0';
return acc;
}, {})
);
@@ -1093,6 +1094,46 @@
</Accordion.Panel>
</Accordion.Item>
<Accordion.Item value="system-settings">
<Accordion.Control>System Settings</Accordion.Control>
<Accordion.Panel>
<Stack gap="md">
{generalSettingsSaved && (
<Alert
variant="light"
color="green"
title="Saved Successfully"
/>
)}
<Text size="sm" c="dimmed">
Configure how many system events (channel start/stop,
buffering, etc.) to keep in the database. Events are
displayed on the Stats page.
</Text>
<NumberInput
label="Maximum System Events"
description="Number of events to retain (minimum: 10, maximum: 1000)"
value={form.values['max-system-events'] || 100}
onChange={(value) => {
form.setFieldValue('max-system-events', value);
}}
min={10}
max={1000}
step={10}
/>
<Flex mih={50} gap="xs" justify="flex-end" align="flex-end">
<Button
onClick={form.onSubmit(onSubmit)}
disabled={form.submitting}
variant="default"
>
Save
</Button>
</Flex>
</Stack>
</Accordion.Panel>
</Accordion.Item>
<Accordion.Item value="user-agents">
<Accordion.Control>User-Agents</Accordion.Control>
<Accordion.Panel>

View file

@@ -8,6 +8,7 @@ import {
Container,
Flex,
Group,
Pagination,
Progress,
SimpleGrid,
Stack,
@@ -25,9 +26,11 @@ import useLogosStore from '../store/logos';
import logo from '../images/logo.png';
import {
ChevronDown,
CirclePlay,
Gauge,
HardDriveDownload,
HardDriveUpload,
RefreshCw,
SquareX,
Timer,
Users,
@@ -44,6 +47,7 @@ import { useLocation } from 'react-router-dom';
import { notifications } from '@mantine/notifications';
import { CustomTable, useTable } from '../components/tables/CustomTable';
import useLocalStorage from '../hooks/useLocalStorage';
import SystemEvents from '../components/SystemEvents';
dayjs.extend(duration);
dayjs.extend(relativeTime);
@@ -1482,108 +1486,132 @@ const ChannelsPage = () => {
}, [channelHistory, vodConnections]);
return (
<Box style={{ overflowX: 'auto' }}>
<Box style={{ padding: '10px', borderBottom: '1px solid #444' }}>
<Group justify="space-between" align="center">
<Title order={3}>Active Connections</Title>
<Group align="center">
<Text size="sm" c="dimmed">
{Object.keys(channelHistory).length} stream
{Object.keys(channelHistory).length !== 1 ? 's' : ''} {' '}
{vodConnections.reduce(
(total, vodContent) =>
total + (vodContent.connections?.length || 0),
0
)}{' '}
VOD connection
{vodConnections.reduce(
(total, vodContent) =>
total + (vodContent.connections?.length || 0),
0
) !== 1
? 's'
: ''}
</Text>
<Group align="center" gap="xs">
<Text size="sm">Refresh Interval (seconds):</Text>
<NumberInput
value={refreshIntervalSeconds}
onChange={(value) => setRefreshIntervalSeconds(value || 0)}
min={0}
max={300}
step={1}
size="xs"
style={{ width: 120 }}
/>
{refreshIntervalSeconds === 0 && (
<>
<Box style={{ overflowX: 'auto' }}>
<Box style={{ minWidth: '520px' }}>
<Box style={{ padding: '10px', borderBottom: '1px solid #444' }}>
<Group justify="space-between" align="center">
<Title order={3}>Active Connections</Title>
<Group align="center">
<Text size="sm" c="dimmed">
Refreshing disabled
{Object.keys(channelHistory).length} stream
{Object.keys(channelHistory).length !== 1 ? 's' : ''} {' '}
{vodConnections.reduce(
(total, vodContent) =>
total + (vodContent.connections?.length || 0),
0
)}{' '}
VOD connection
{vodConnections.reduce(
(total, vodContent) =>
total + (vodContent.connections?.length || 0),
0
) !== 1
? 's'
: ''}
</Text>
)}
<Group align="center" gap="xs">
<Text size="sm">Refresh Interval (seconds):</Text>
<NumberInput
value={refreshIntervalSeconds}
onChange={(value) => setRefreshIntervalSeconds(value || 0)}
min={0}
max={300}
step={1}
size="xs"
style={{ width: 120 }}
/>
{refreshIntervalSeconds === 0 && (
<Text size="sm" c="dimmed">
Refreshing disabled
</Text>
)}
</Group>
{isPollingActive && refreshInterval > 0 && (
<Text size="sm" c="dimmed">
Refreshing every {refreshIntervalSeconds}s
</Text>
)}
<Button
size="xs"
variant="subtle"
onClick={() => {
fetchChannelStats();
fetchVODStats();
}}
loading={false}
>
Refresh Now
</Button>
</Group>
</Group>
{isPollingActive && refreshInterval > 0 && (
<Text size="sm" c="dimmed">
Refreshing every {refreshIntervalSeconds}s
</Text>
)}
<Button
size="xs"
variant="subtle"
onClick={() => {
fetchChannelStats();
fetchVODStats();
}}
loading={false}
>
Refresh Now
</Button>
</Group>
</Group>
</Box>
<div
style={{
display: 'grid',
gap: '1rem',
padding: '10px',
gridTemplateColumns: 'repeat(auto-fill, minmax(500px, 1fr))',
}}
>
{combinedConnections.length === 0 ? (
</Box>
<Box
style={{
gridColumn: '1 / -1',
textAlign: 'center',
padding: '40px',
display: 'grid',
gap: '1rem',
padding: '10px',
paddingBottom: '120px',
gridTemplateColumns: 'repeat(auto-fill, minmax(500px, 1fr))',
minHeight: 'calc(100vh - 250px)',
alignContent: 'start',
}}
>
<Text size="xl" color="dimmed">
No active connections
</Text>
{combinedConnections.length === 0 ? (
<Box
style={{
gridColumn: '1 / -1',
textAlign: 'center',
padding: '40px',
}}
>
<Text size="xl" color="dimmed">
No active connections
</Text>
</Box>
) : (
combinedConnections.map((connection) => {
if (connection.type === 'stream') {
return (
<ChannelCard
key={connection.id}
channel={connection.data}
clients={clients}
stopClient={stopClient}
stopChannel={stopChannel}
logos={logos}
channelsByUUID={channelsByUUID}
/>
);
} else if (connection.type === 'vod') {
return (
<VODCard key={connection.id} vodContent={connection.data} />
);
}
return null;
})
)}
</Box>
) : (
combinedConnections.map((connection) => {
if (connection.type === 'stream') {
return (
<ChannelCard
key={connection.id}
channel={connection.data}
clients={clients}
stopClient={stopClient}
stopChannel={stopChannel}
logos={logos}
channelsByUUID={channelsByUUID}
/>
);
} else if (connection.type === 'vod') {
return (
<VODCard key={connection.id} vodContent={connection.data} />
);
}
return null;
})
)}
</div>
</Box>
</Box>
</Box>
{/* System Events Section - Fixed at bottom */}
<Box
style={{
position: 'fixed',
bottom: 0,
left: 'var(--app-shell-navbar-width, 0)',
right: 0,
zIndex: 100,
padding: '0 1rem 1rem 1rem',
pointerEvents: 'none',
}}
>
<Box style={{ pointerEvents: 'auto' }}>
<SystemEvents />
</Box>
</Box>
</>
);
};

View file

@@ -134,13 +134,21 @@ const useAuthStore = create((set, get) => ({
return false; // Add explicit return for when data.access is not available
} catch (error) {
console.error('Token refresh failed:', error);
get().logout();
await get().logout();
return false; // Add explicit return after error
}
},
// Action to logout
logout: () => {
logout: async () => {
// Call backend logout endpoint to log the event
try {
await API.logout();
} catch (error) {
// Continue with logout even if API call fails
console.error('Logout API call failed:', error);
}
set({
accessToken: null,
refreshToken: null,

View file

@@ -50,9 +50,17 @@ const useEPGsStore = create((set) => ({
})),
updateEPG: (epg) =>
set((state) => ({
epgs: { ...state.epgs, [epg.id]: epg },
})),
set((state) => {
// Validate that epg is an object with an id
if (!epg || typeof epg !== 'object' || !epg.id) {
console.error('updateEPG called with invalid epg:', epg);
return state;
}
return {
epgs: { ...state.epgs, [epg.id]: epg },
};
}),
removeEPGs: (epgIds) =>
set((state) => {
@@ -66,6 +74,12 @@
updateEPGProgress: (data) =>
set((state) => {
// Validate that data is an object with a source
if (!data || typeof data !== 'object' || !data.source) {
console.error('updateEPGProgress called with invalid data:', data);
return state;
}
// Early exit if source doesn't exist in our EPGs store
if (!state.epgs[data.source] && !data.status) {
return state;

View file

@@ -364,7 +364,7 @@ const useVODStore = create((set, get) => ({
name: seriesInfo.name,
},
type: 'episode',
uuid: episode.id, // Use the stream ID as UUID for playback
uuid: episode.uuid,
logo: episode.movie_image ? { url: episode.movie_image } : null,
air_date: episode.air_date || null,
movie_image: episode.movie_image || null,

View file

@@ -0,0 +1,59 @@
#!/usr/bin/env python
"""
Updates the CHANGELOG.md file for a new release.
Renames [Unreleased] section to the new version with current date.
Usage: python update_changelog.py <version>
"""
import re
import sys
from datetime import datetime
from pathlib import Path
def update_changelog(version):
"""Update CHANGELOG.md with new version and date."""
changelog_file = Path(__file__).parent.parent / "CHANGELOG.md"
if not changelog_file.exists():
print("CHANGELOG.md not found")
sys.exit(1)
content = changelog_file.read_text(encoding='utf-8')
# Check if there's an Unreleased section
if '## [Unreleased]' not in content:
print("No [Unreleased] section found in CHANGELOG.md")
sys.exit(1)
# Get current date in YYYY-MM-DD format
today = datetime.now().strftime('%Y-%m-%d')
# Replace [Unreleased] with new version and date, and add new [Unreleased] section
# This pattern preserves everything after [Unreleased] until the next version or end
new_content = re.sub(
r'## \[Unreleased\]',
f'## [Unreleased]\n\n## [{version}] - {today}',
content,
count=1
)
if new_content == content:
print("Failed to update CHANGELOG.md")
sys.exit(1)
changelog_file.write_text(new_content, encoding='utf-8')
print(f"CHANGELOG.md updated for version {version} ({today})")
return True
if __name__ == "__main__":
if len(sys.argv) < 2:
print("Usage: python update_changelog.py <version>")
print("Example: python update_changelog.py 0.13.0")
sys.exit(1)
version = sys.argv[1]
# Remove 'v' prefix if present
version = version.lstrip('v')
update_changelog(version)
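
Concretely, the substitution leaves the pending notes under the new version heading and a fresh, empty [Unreleased] on top; a standalone illustration with a hardcoded version and date:

import re

before = (
    "## [Unreleased]\n\n"
    "### Added\n"
    "- Some feature\n"
)
after = re.sub(
    r"## \[Unreleased\]",
    "## [Unreleased]\n\n## [0.13.0] - 2025-12-02",
    before,
    count=1,
)
assert after == (
    "## [Unreleased]\n\n"
    "## [0.13.0] - 2025-12-02\n\n"
    "### Added\n"
    "- Some feature\n"
)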

View file

@@ -1,5 +1,5 @@
"""
Dispatcharr version information.
"""
__version__ = '0.12.0' # Follow semantic versioning (MAJOR.MINOR.PATCH)
__version__ = '0.13.0' # Follow semantic versioning (MAJOR.MINOR.PATCH)
__timestamp__ = None # Set during CI/CD build process