diff --git a/CHANGELOG.md b/CHANGELOG.md
index 0cb610fa..9d75289e 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -7,6 +7,55 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [Unreleased]
 
+### Added
+
+- Advanced filtering for Channels table: Filter menu now allows toggling the visibility of disabled channels (when a profile is selected) and filtering to show only empty channels without streams (Closes #182)
+- Network Access warning modal now displays the client's IP address for better transparency when network restrictions are being enforced - Thanks [@damien-alt-sudo](https://github.com/damien-alt-sudo) (Closes #778)
+- VLC streaming support - Thanks [@sethwv](https://github.com/sethwv)
+  - Added `cvlc` as an alternative streaming backend alongside FFmpeg and Streamlink
+  - Log parser refactoring: Introduced `LogParserFactory` and stream-specific parsers (`FFmpegLogParser`, `VLCLogParser`, `StreamlinkLogParser`) to enable codec and resolution detection from multiple streaming tools
+  - VLC log parsing for stream information: Detects video/audio codecs from TS demux output, supports both stream-copy and transcode modes with resolution/FPS extraction from transcode output
+  - Locked, read-only VLC stream profile configured for headless operation with intelligent audio/video codec detection
+  - VLC and required plugins installed in Docker environment with headless configuration
+- ErrorBoundary component for handling frontend errors gracefully with a generic error message - Thanks [@nick4810](https://github.com/nick4810)
+
+### Changed
+
+- Fixed event viewer arrow direction (previously inverted) - Thanks [@drnikcuk](https://github.com/drnikcuk) (Closes #772)
+- Region code options now intentionally include both `GB` (ISO 3166-1 standard) and `UK` (commonly used by EPG/XMLTV providers) to accommodate real-world EPG data variations. Many providers use `UK` in channel identifiers (e.g., `BBCOne.uk`) despite `GB` being the official ISO country code. Users should select the region code that matches their specific EPG provider's convention for optimal region-based EPG matching bonuses - Thanks [@bigpandaaaa](https://github.com/bigpandaaaa)
+- Channel number inputs in stream-to-channel creation modals no longer have a maximum value restriction, allowing users to enter any valid channel number supported by the database
+- Stream log parsing refactored to use a factory pattern: Simplified `ChannelService.parse_and_store_stream_info()` to route parsing through specialized log parsers instead of inline program-specific logic (~150 lines of code removed)
+- Stream profile names in fixtures updated to use proper capitalization (ffmpeg → FFmpeg, streamlink → Streamlink)
+- Frontend component refactoring for improved code organization and maintainability - Thanks [@nick4810](https://github.com/nick4810)
+  - Extracted large nested components into separate files (RecordingCard, RecordingDetailsModal, RecurringRuleModal, RecordingSynopsis, GuideRow, HourTimeline, PluginCard, ProgramRecordingModal, SeriesRecordingModal, Field)
+  - Moved business logic from components into dedicated utility files (dateTimeUtils, RecordingCardUtils, RecordingDetailsModalUtils, RecurringRuleModalUtils, DVRUtils, guideUtils, PluginsUtils, PluginCardUtils, notificationUtils)
+  - Lazy loaded heavy components (SuperuserForm, RecordingDetailsModal, ProgramRecordingModal, SeriesRecordingModal, PluginCard) with loading fallbacks
+  - Removed unused Dashboard and Home pages
+  - Guide page refactoring: Extracted GuideRow and HourTimeline components, moved grid calculations and utility functions to guideUtils.js, added loading states for initial data fetching, improved performance through better memoization
+  - Plugins page refactoring: Extracted PluginCard and Field components, added Zustand store for plugin state management, improved plugin action confirmation handling, better separation of concerns between UI and business logic
+- Logo loading optimization: Logos now load only after both the Channels and Streams tables complete loading to prevent blocking the initial page render, with rendering gated by table readiness to ensure data loads before visual elements
+- M3U stream URLs now use `build_absolute_uri_with_port()` for consistency with EPG and logo URLs, ensuring uniform port handling across all M3U file URLs
+- Settings and Logos page refactoring for improved readability and separation of concerns - Thanks [@nick4810](https://github.com/nick4810)
+  - Extracted individual settings forms (DVR, Network Access, Proxy, Stream, System, UI) into separate components with dedicated utility files
+  - Moved larger nested components into their own files
+  - Moved business logic into corresponding utils/ files
+  - Extracted larger in-line component logic into its own function
+  - Each panel in Settings now uses its own form state with the parent component handling active state management
+
+### Fixed
+
+- Auto Channel Sync Force EPG Source feature not properly forcing "No EPG" assignment - When selecting "Force EPG Source" > "No EPG (Disabled)", channels were still being auto-matched to EPG data instead of forcing dummy/no EPG. Now correctly sets the `force_dummy_epg` flag to prevent unwanted EPG assignment. (Fixes #788)
+- VOD episode processing now properly handles season and episode numbers from APIs that return string values instead of integers, with comprehensive error logging to track data quality issues - Thanks [@patchy8736](https://github.com/patchy8736) (Fixes #770)
+- VOD episode-to-stream relations are now validated to ensure episodes have been saved to the database before creating relations, preventing integrity errors when bulk_create operations encounter conflicts - Thanks [@patchy8736](https://github.com/patchy8736)
+- VOD category filtering now correctly handles category names containing pipe "|" characters (e.g., "PL | BAJKI", "EN | MOVIES") by using `rsplit()` to split from the right instead of the left, ensuring the category type is correctly extracted as the last segment - Thanks [@Vitekant](https://github.com/Vitekant)
+- M3U and EPG URLs now correctly preserve non-standard HTTPS ports (e.g., `:8443`) when accessed behind reverse proxies that forward the port in headers: `get_host_and_port()` now properly checks the `X-Forwarded-Port` header before falling back to other detection methods (Fixes #704)
+- M3U and EPG manager page no longer crashes when a playlist references a deleted channel group (previously caused a blank screen on navigation)
+- Stream validation now returns the original URL instead of the redirected URL to prevent issues with temporary redirect URLs that expire before clients can connect
+- XtreamCodes EPG limit parameter now properly converted to an integer to prevent type errors when accessing EPG listings (Fixes #781)
+- Docker container file permissions: Django management commands (`migrate`, `collectstatic`) now run as the non-root user to prevent root-owned `__pycache__` and static files from causing permission issues - Thanks [@sethwv](https://github.com/sethwv)
+- Stream validation now continues with a GET request if the HEAD request fails due to connection issues - Thanks [@kvnnap](https://github.com/kvnnap) (Fixes #782)
+- XtreamCodes M3U files now correctly set the `x-tvg-url` and `url-tvg` attributes to reference the XC EPG URL (`xmltv.php`) instead of the standard EPG endpoint when downloaded via the XC API (Fixes #629)
+
 ## [0.15.1] - 2025-12-22
 
 ### Fixed
diff --git a/apps/channels/api_views.py b/apps/channels/api_views.py
index 1f98358e..aebb74a3 100644
--- a/apps/channels/api_views.py
+++ b/apps/channels/api_views.py
@@ -8,6 +8,7 @@ from drf_yasg.utils import swagger_auto_schema
 from drf_yasg import openapi
 from django.shortcuts import get_object_or_404, get_list_or_404
 from django.db import transaction
+from django.db.models import Q
 import os, json, requests, logging
 from urllib.parse import unquote
 from apps.accounts.permissions import (
@@ -420,10 +421,36 @@ class ChannelViewSet(viewsets.ModelViewSet):
             group_names = channel_group.split(",")
             qs = qs.filter(channel_group__name__in=group_names)
 
-        if self.request.user.user_level < 10:
-            qs = qs.filter(user_level__lte=self.request.user.user_level)
+        filters = {}
+        q_filters = Q()
 
-        return qs
+        channel_profile_id = self.request.query_params.get("channel_profile_id")
+        show_disabled_param = self.request.query_params.get("show_disabled", None)
+        only_streamless = self.request.query_params.get("only_streamless", None)
+
+        if channel_profile_id:
+            try:
+                profile_id_int = int(channel_profile_id)
+                filters["channelprofilemembership__channel_profile_id"] = profile_id_int
+
+                if show_disabled_param is None:
+                    filters["channelprofilemembership__enabled"] = True
+            except (ValueError, TypeError):
+                # Ignore invalid profile id values
+                pass
+
+        if
only_streamless: + q_filters &= Q(streams__isnull=True) + + if self.request.user.user_level < 10: + filters["user_level__lte"] = self.request.user.user_level + + if filters: + qs = qs.filter(**filters) + if q_filters: + qs = qs.filter(q_filters) + + return qs.distinct() def get_serializer_context(self): context = super().get_serializer_context() diff --git a/apps/output/views.py b/apps/output/views.py index 635bb9d9..aa7fd1bb 100644 --- a/apps/output/views.py +++ b/apps/output/views.py @@ -174,16 +174,26 @@ def generate_m3u(request, profile_name=None, user=None): tvg_id_source = request.GET.get('tvg_id_source', 'channel_number').lower() # Build EPG URL with query parameters if needed - epg_base_url = build_absolute_uri_with_port(request, reverse('output:epg_endpoint', args=[profile_name]) if profile_name else reverse('output:epg_endpoint')) + # Check if this is an XC API request (has username/password in GET params and user is authenticated) + xc_username = request.GET.get('username') + xc_password = request.GET.get('password') - # Optionally preserve certain query parameters - preserved_params = ['tvg_id_source', 'cachedlogos', 'days'] - query_params = {k: v for k, v in request.GET.items() if k in preserved_params} - if query_params: - from urllib.parse import urlencode - epg_url = f"{epg_base_url}?{urlencode(query_params)}" + if user is not None and xc_username and xc_password: + # This is an XC API request - use XC-style EPG URL + base_url = build_absolute_uri_with_port(request, '') + epg_url = f"{base_url}/xmltv.php?username={xc_username}&password={xc_password}" else: - epg_url = epg_base_url + # Regular request - use standard EPG endpoint + epg_base_url = build_absolute_uri_with_port(request, reverse('output:epg_endpoint', args=[profile_name]) if profile_name else reverse('output:epg_endpoint')) + + # Optionally preserve certain query parameters + preserved_params = ['tvg_id_source', 'cachedlogos', 'days'] + query_params = {k: v for k, v in request.GET.items() if k in preserved_params} + if query_params: + from urllib.parse import urlencode + epg_url = f"{epg_base_url}?{urlencode(query_params)}" + else: + epg_url = epg_base_url # Add x-tvg-url and url-tvg attribute for EPG URL m3u_content = f'#EXTM3U x-tvg-url="{epg_url}" url-tvg="{epg_url}"\n' @@ -247,12 +257,10 @@ def generate_m3u(request, profile_name=None, user=None): stream_url = first_stream.url else: # Fall back to proxy URL if no direct URL available - base_url = request.build_absolute_uri('/')[:-1] - stream_url = f"{base_url}/proxy/ts/stream/{channel.uuid}" + stream_url = build_absolute_uri_with_port(request, f"/proxy/ts/stream/{channel.uuid}") else: # Standard behavior - use proxy URL - base_url = request.build_absolute_uri('/')[:-1] - stream_url = f"{base_url}/proxy/ts/stream/{channel.uuid}" + stream_url = build_absolute_uri_with_port(request, f"/proxy/ts/stream/{channel.uuid}") m3u_content += extinf_line + stream_url + "\n" @@ -2258,7 +2266,7 @@ def xc_get_epg(request, user, short=False): # Get the mapped integer for this specific channel channel_num_int = channel_num_map.get(channel.id, int(channel.channel_number)) - limit = request.GET.get('limit', 4) + limit = int(request.GET.get('limit', 4)) if channel.epg_data: # Check if this is a dummy EPG that generates on-demand if channel.epg_data.epg_source and channel.epg_data.epg_source.source_type == 'dummy': @@ -2932,19 +2940,16 @@ def get_host_and_port(request): if xfh: if ":" in xfh: host, port = xfh.split(":", 1) - # Omit standard ports from URLs, or omit if port doesn't 
match standard for scheme - # (e.g., HTTPS but port is 9191 = behind external reverse proxy) + # Omit standard ports from URLs if port == standard_port: return host, None - # If port doesn't match standard and X-Forwarded-Proto is set, likely behind external RP - if request.META.get("HTTP_X_FORWARDED_PROTO"): - host = xfh.split(":")[0] # Strip port, will check for proper port below - else: - return host, port + # Non-standard port in X-Forwarded-Host - return it + # This handles reverse proxies on non-standard ports (e.g., https://example.com:8443) + return host, port else: host = xfh - # Check for X-Forwarded-Port header (if we didn't already find a valid port) + # Check for X-Forwarded-Port header (if we didn't find a port in X-Forwarded-Host) port = request.META.get("HTTP_X_FORWARDED_PORT") if port: # Omit standard ports from URLs @@ -2962,22 +2967,28 @@ def get_host_and_port(request): else: host = raw_host - # 3. Check if we're behind a reverse proxy (X-Forwarded-Proto or X-Forwarded-For present) + # 3. Check for X-Forwarded-Port (when Host header has no port but we're behind a reverse proxy) + port = request.META.get("HTTP_X_FORWARDED_PORT") + if port: + # Omit standard ports from URLs + return host, None if port == standard_port else port + + # 4. Check if we're behind a reverse proxy (X-Forwarded-Proto or X-Forwarded-For present) # If so, assume standard port for the scheme (don't trust SERVER_PORT in this case) if request.META.get("HTTP_X_FORWARDED_PROTO") or request.META.get("HTTP_X_FORWARDED_FOR"): return host, None - # 4. Try SERVER_PORT from META (only if NOT behind reverse proxy) + # 5. Try SERVER_PORT from META (only if NOT behind reverse proxy) port = request.META.get("SERVER_PORT") if port: # Omit standard ports from URLs return host, None if port == standard_port else port - # 5. Dev fallback: guess port 5656 + # 6. Dev fallback: guess port 5656 if os.environ.get("DISPATCHARR_ENV") == "dev" or host in ("localhost", "127.0.0.1"): return host, "5656" - # 6. Final fallback: assume standard port for scheme (omit from URL) + # 7. Final fallback: assume standard port for scheme (omit from URL) return host, None def build_absolute_uri_with_port(request, path): diff --git a/apps/proxy/ts_proxy/services/channel_service.py b/apps/proxy/ts_proxy/services/channel_service.py index 6484cd3f..4c4a73ac 100644 --- a/apps/proxy/ts_proxy/services/channel_service.py +++ b/apps/proxy/ts_proxy/services/channel_service.py @@ -15,6 +15,7 @@ from ..redis_keys import RedisKeys from ..constants import EventType, ChannelState, ChannelMetadataField from ..url_utils import get_stream_info_for_switch from core.utils import log_system_event +from .log_parsers import LogParserFactory logger = logging.getLogger("ts_proxy") @@ -419,124 +420,51 @@ class ChannelService: @staticmethod def parse_and_store_stream_info(channel_id, stream_info_line, stream_type="video", stream_id=None): - """Parse FFmpeg stream info line and store in Redis metadata and database""" + """ + Parse stream info from FFmpeg/VLC/Streamlink logs and store in Redis/DB. + Uses specialized parsers for each streaming tool. + """ try: - if stream_type == "input": - # Example lines: - # Input #0, mpegts, from 'http://example.com/stream.ts': - # Input #0, hls, from 'http://example.com/stream.m3u8': + # Use factory to parse the line based on stream type + parsed_data = LogParserFactory.parse(stream_type, stream_info_line) + + if not parsed_data: + return - # Extract input format (e.g., "mpegts", "hls", "flv", etc.) 
- input_match = re.search(r'Input #\d+,\s*([^,]+)', stream_info_line) - input_format = input_match.group(1).strip() if input_match else None + # Update Redis and database with parsed data + ChannelService._update_stream_info_in_redis( + channel_id, + parsed_data.get('video_codec'), + parsed_data.get('resolution'), + parsed_data.get('width'), + parsed_data.get('height'), + parsed_data.get('source_fps'), + parsed_data.get('pixel_format'), + parsed_data.get('video_bitrate'), + parsed_data.get('audio_codec'), + parsed_data.get('sample_rate'), + parsed_data.get('audio_channels'), + parsed_data.get('audio_bitrate'), + parsed_data.get('stream_type') + ) - # Store in Redis if we have valid data - if input_format: - ChannelService._update_stream_info_in_redis(channel_id, None, None, None, None, None, None, None, None, None, None, None, input_format) - # Save to database if stream_id is provided - if stream_id: - ChannelService._update_stream_stats_in_db(stream_id, stream_type=input_format) - - logger.debug(f"Input format info - Format: {input_format} for channel {channel_id}") - - elif stream_type == "video": - # Example line: - # Stream #0:0: Video: h264 (Main), yuv420p(tv, progressive), 1280x720 [SAR 1:1 DAR 16:9], q=2-31, 2000 kb/s, 29.97 fps, 90k tbn - - # Extract video codec (e.g., "h264", "mpeg2video", etc.) - codec_match = re.search(r'Video:\s*([a-zA-Z0-9_]+)', stream_info_line) - video_codec = codec_match.group(1) if codec_match else None - - # Extract resolution (e.g., "1280x720") - be more specific to avoid hex values - # Look for resolution patterns that are realistic video dimensions - resolution_match = re.search(r'\b(\d{3,5})x(\d{3,5})\b', stream_info_line) - if resolution_match: - width = int(resolution_match.group(1)) - height = int(resolution_match.group(2)) - # Validate that these look like reasonable video dimensions - if 100 <= width <= 10000 and 100 <= height <= 10000: - resolution = f"{width}x{height}" - else: - width = height = resolution = None - else: - width = height = resolution = None - - # Extract source FPS (e.g., "29.97 fps") - fps_match = re.search(r'(\d+(?:\.\d+)?)\s*fps', stream_info_line) - source_fps = float(fps_match.group(1)) if fps_match else None - - # Extract pixel format (e.g., "yuv420p") - pixel_format_match = re.search(r'Video:\s*[^,]+,\s*([^,(]+)', stream_info_line) - pixel_format = None - if pixel_format_match: - pf = pixel_format_match.group(1).strip() - # Clean up pixel format (remove extra info in parentheses) - if '(' in pf: - pf = pf.split('(')[0].strip() - pixel_format = pf - - # Extract bitrate if present (e.g., "2000 kb/s") - video_bitrate = None - bitrate_match = re.search(r'(\d+(?:\.\d+)?)\s*kb/s', stream_info_line) - if bitrate_match: - video_bitrate = float(bitrate_match.group(1)) - - # Store in Redis if we have valid data - if any(x is not None for x in [video_codec, resolution, source_fps, pixel_format, video_bitrate]): - ChannelService._update_stream_info_in_redis(channel_id, video_codec, resolution, width, height, source_fps, pixel_format, video_bitrate, None, None, None, None, None) - # Save to database if stream_id is provided - if stream_id: - ChannelService._update_stream_stats_in_db( - stream_id, - video_codec=video_codec, - resolution=resolution, - source_fps=source_fps, - pixel_format=pixel_format, - video_bitrate=video_bitrate - ) - - logger.info(f"Video stream info - Codec: {video_codec}, Resolution: {resolution}, " - f"Source FPS: {source_fps}, Pixel Format: {pixel_format}, " - f"Video Bitrate: {video_bitrate} kb/s") - - elif 
stream_type == "audio": - # Example line: - # Stream #0:1[0x101]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 64 kb/s - - # Extract audio codec (e.g., "aac", "mp3", etc.) - codec_match = re.search(r'Audio:\s*([a-zA-Z0-9_]+)', stream_info_line) - audio_codec = codec_match.group(1) if codec_match else None - - # Extract sample rate (e.g., "48000 Hz") - sample_rate_match = re.search(r'(\d+)\s*Hz', stream_info_line) - sample_rate = int(sample_rate_match.group(1)) if sample_rate_match else None - - # Extract channel layout (e.g., "stereo", "5.1", "mono") - # Look for common channel layouts - channel_match = re.search(r'\b(mono|stereo|5\.1|7\.1|quad|2\.1)\b', stream_info_line, re.IGNORECASE) - channels = channel_match.group(1) if channel_match else None - - # Extract audio bitrate if present (e.g., "64 kb/s") - audio_bitrate = None - bitrate_match = re.search(r'(\d+(?:\.\d+)?)\s*kb/s', stream_info_line) - if bitrate_match: - audio_bitrate = float(bitrate_match.group(1)) - - # Store in Redis if we have valid data - if any(x is not None for x in [audio_codec, sample_rate, channels, audio_bitrate]): - ChannelService._update_stream_info_in_redis(channel_id, None, None, None, None, None, None, None, audio_codec, sample_rate, channels, audio_bitrate, None) - # Save to database if stream_id is provided - if stream_id: - ChannelService._update_stream_stats_in_db( - stream_id, - audio_codec=audio_codec, - sample_rate=sample_rate, - audio_channels=channels, - audio_bitrate=audio_bitrate - ) + if stream_id: + ChannelService._update_stream_stats_in_db( + stream_id, + video_codec=parsed_data.get('video_codec'), + resolution=parsed_data.get('resolution'), + source_fps=parsed_data.get('source_fps'), + pixel_format=parsed_data.get('pixel_format'), + video_bitrate=parsed_data.get('video_bitrate'), + audio_codec=parsed_data.get('audio_codec'), + sample_rate=parsed_data.get('sample_rate'), + audio_channels=parsed_data.get('audio_channels'), + audio_bitrate=parsed_data.get('audio_bitrate'), + stream_type=parsed_data.get('stream_type') + ) except Exception as e: - logger.debug(f"Error parsing FFmpeg {stream_type} stream info: {e}") + logger.debug(f"Error parsing {stream_type} stream info: {e}") @staticmethod def _update_stream_info_in_redis(channel_id, codec, resolution, width, height, fps, pixel_format, video_bitrate, audio_codec=None, sample_rate=None, channels=None, audio_bitrate=None, input_format=None): diff --git a/apps/proxy/ts_proxy/services/log_parsers.py b/apps/proxy/ts_proxy/services/log_parsers.py new file mode 100644 index 00000000..95ee7a06 --- /dev/null +++ b/apps/proxy/ts_proxy/services/log_parsers.py @@ -0,0 +1,410 @@ +"""Log parsers for FFmpeg, Streamlink, and VLC output.""" +import re +import logging +from abc import ABC, abstractmethod +from typing import Optional, Dict, Any + +logger = logging.getLogger(__name__) + + +class BaseLogParser(ABC): + """Base class for log parsers""" + + # Map of stream_type -> method_name that this parser handles + STREAM_TYPE_METHODS: Dict[str, str] = {} + + @abstractmethod + def can_parse(self, line: str) -> Optional[str]: + """ + Check if this parser can handle the line. + Returns the stream_type if it can parse, None otherwise. 
+ e.g., 'video', 'audio', 'vlc_video', 'vlc_audio', 'streamlink' + """ + pass + + @abstractmethod + def parse_input_format(self, line: str) -> Optional[Dict[str, Any]]: + pass + + @abstractmethod + def parse_video_stream(self, line: str) -> Optional[Dict[str, Any]]: + pass + + @abstractmethod + def parse_audio_stream(self, line: str) -> Optional[Dict[str, Any]]: + pass + + +class FFmpegLogParser(BaseLogParser): + """Parser for FFmpeg log output""" + + STREAM_TYPE_METHODS = { + 'input': 'parse_input_format', + 'video': 'parse_video_stream', + 'audio': 'parse_audio_stream' + } + + def can_parse(self, line: str) -> Optional[str]: + """Check if this is an FFmpeg line we can parse""" + lower = line.lower() + + # Input format detection + if lower.startswith('input #'): + return 'input' + + # Stream info (only during input phase, but we'll let stream_manager handle phase tracking) + if 'stream #' in lower: + if 'video:' in lower: + return 'video' + elif 'audio:' in lower: + return 'audio' + + return None + + def parse_input_format(self, line: str) -> Optional[Dict[str, Any]]: + """Parse FFmpeg input format (e.g., mpegts, hls)""" + try: + input_match = re.search(r'Input #\d+,\s*([^,]+)', line) + input_format = input_match.group(1).strip() if input_match else None + + if input_format: + logger.debug(f"Input format info - Format: {input_format}") + return {'stream_type': input_format} + except Exception as e: + logger.debug(f"Error parsing FFmpeg input format: {e}") + + return None + + def parse_video_stream(self, line: str) -> Optional[Dict[str, Any]]: + """Parse FFmpeg video stream info""" + try: + result = {} + + # Extract codec, resolution, fps, pixel format, bitrate + codec_match = re.search(r'Video:\s*([a-zA-Z0-9_]+)', line) + if codec_match: + result['video_codec'] = codec_match.group(1) + + resolution_match = re.search(r'\b(\d{3,5})x(\d{3,5})\b', line) + if resolution_match: + width = int(resolution_match.group(1)) + height = int(resolution_match.group(2)) + if 100 <= width <= 10000 and 100 <= height <= 10000: + result['resolution'] = f"{width}x{height}" + result['width'] = width + result['height'] = height + + fps_match = re.search(r'(\d+(?:\.\d+)?)\s*fps', line) + if fps_match: + result['source_fps'] = float(fps_match.group(1)) + + pixel_format_match = re.search(r'Video:\s*[^,]+,\s*([^,(]+)', line) + if pixel_format_match: + pf = pixel_format_match.group(1).strip() + if '(' in pf: + pf = pf.split('(')[0].strip() + result['pixel_format'] = pf + + bitrate_match = re.search(r'(\d+(?:\.\d+)?)\s*kb/s', line) + if bitrate_match: + result['video_bitrate'] = float(bitrate_match.group(1)) + + if result: + logger.info(f"Video stream info - Codec: {result.get('video_codec')}, " + f"Resolution: {result.get('resolution')}, " + f"Source FPS: {result.get('source_fps')}, " + f"Pixel Format: {result.get('pixel_format')}, " + f"Video Bitrate: {result.get('video_bitrate')} kb/s") + return result + + except Exception as e: + logger.debug(f"Error parsing FFmpeg video stream info: {e}") + + return None + + def parse_audio_stream(self, line: str) -> Optional[Dict[str, Any]]: + """Parse FFmpeg audio stream info""" + try: + result = {} + + codec_match = re.search(r'Audio:\s*([a-zA-Z0-9_]+)', line) + if codec_match: + result['audio_codec'] = codec_match.group(1) + + sample_rate_match = re.search(r'(\d+)\s*Hz', line) + if sample_rate_match: + result['sample_rate'] = int(sample_rate_match.group(1)) + + channel_match = re.search(r'\b(mono|stereo|5\.1|7\.1|quad|2\.1)\b', line, re.IGNORECASE) + if channel_match: + 
result['audio_channels'] = channel_match.group(1) + + bitrate_match = re.search(r'(\d+(?:\.\d+)?)\s*kb/s', line) + if bitrate_match: + result['audio_bitrate'] = float(bitrate_match.group(1)) + + if result: + return result + + except Exception as e: + logger.debug(f"Error parsing FFmpeg audio stream info: {e}") + + return None + + +class VLCLogParser(BaseLogParser): + """Parser for VLC log output""" + + STREAM_TYPE_METHODS = { + 'vlc_video': 'parse_video_stream', + 'vlc_audio': 'parse_audio_stream' + } + + def can_parse(self, line: str) -> Optional[str]: + """Check if this is a VLC line we can parse""" + lower = line.lower() + + # VLC TS demux codec detection + if 'ts demux debug' in lower and 'type=' in lower: + if 'video' in lower: + return 'vlc_video' + elif 'audio' in lower: + return 'vlc_audio' + + # VLC decoder output + if 'decoder' in lower and ('channels:' in lower or 'samplerate:' in lower or 'x' in line or 'fps' in lower): + if 'audio' in lower or 'channels:' in lower or 'samplerate:' in lower: + return 'vlc_audio' + else: + return 'vlc_video' + + # VLC transcode output for resolution/FPS + if 'stream_out_transcode' in lower and ('source fps' in lower or ('source ' in lower and 'x' in line)): + return 'vlc_video' + + return None + + def parse_input_format(self, line: str) -> Optional[Dict[str, Any]]: + return None + + def parse_video_stream(self, line: str) -> Optional[Dict[str, Any]]: + """Parse VLC TS demux output and decoder info for video""" + try: + lower = line.lower() + result = {} + + # Codec detection from TS demux + video_codec_map = { + ('avc', 'h.264', 'type=0x1b'): "h264", + ('hevc', 'h.265', 'type=0x24'): "hevc", + ('mpeg-2', 'type=0x02'): "mpeg2video", + ('mpeg-4', 'type=0x10'): "mpeg4" + } + + for patterns, codec in video_codec_map.items(): + if any(p in lower for p in patterns): + result['video_codec'] = codec + break + + # Extract FPS from transcode output: "source fps 30/1" + fps_fraction_match = re.search(r'source fps\s+(\d+)/(\d+)', lower) + if fps_fraction_match: + numerator = int(fps_fraction_match.group(1)) + denominator = int(fps_fraction_match.group(2)) + if denominator > 0: + result['source_fps'] = numerator / denominator + + # Extract resolution from transcode output: "source 1280x720" + source_res_match = re.search(r'source\s+(\d{3,4})x(\d{3,4})', lower) + if source_res_match: + width = int(source_res_match.group(1)) + height = int(source_res_match.group(2)) + if 100 <= width <= 10000 and 100 <= height <= 10000: + result['resolution'] = f"{width}x{height}" + result['width'] = width + result['height'] = height + else: + # Fallback: generic resolution pattern + resolution_match = re.search(r'(\d{3,4})x(\d{3,4})', line) + if resolution_match: + width = int(resolution_match.group(1)) + height = int(resolution_match.group(2)) + if 100 <= width <= 10000 and 100 <= height <= 10000: + result['resolution'] = f"{width}x{height}" + result['width'] = width + result['height'] = height + + # Fallback: try to extract FPS from generic format + if 'source_fps' not in result: + fps_match = re.search(r'(\d+\.?\d*)\s*fps', lower) + if fps_match: + result['source_fps'] = float(fps_match.group(1)) + + return result if result else None + + except Exception as e: + logger.debug(f"Error parsing VLC video stream info: {e}") + + return None + + def parse_audio_stream(self, line: str) -> Optional[Dict[str, Any]]: + """Parse VLC TS demux output and decoder info for audio""" + try: + lower = line.lower() + result = {} + + # Codec detection from TS demux + audio_codec_map = { + 
('type=0xf', 'adts'): "aac", + ('type=0x03', 'type=0x04'): "mp3", + ('type=0x06', 'type=0x81'): "ac3", + ('type=0x0b', 'lpcm'): "pcm" + } + + for patterns, codec in audio_codec_map.items(): + if any(p in lower for p in patterns): + result['audio_codec'] = codec + break + + # VLC decoder format: "AAC channels: 2 samplerate: 48000" + if 'channels:' in lower: + channels_match = re.search(r'channels:\s*(\d+)', lower) + if channels_match: + num_channels = int(channels_match.group(1)) + # Convert number to name + channel_names = {1: 'mono', 2: 'stereo', 6: '5.1', 8: '7.1'} + result['audio_channels'] = channel_names.get(num_channels, str(num_channels)) + + if 'samplerate:' in lower: + samplerate_match = re.search(r'samplerate:\s*(\d+)', lower) + if samplerate_match: + result['sample_rate'] = int(samplerate_match.group(1)) + + # Try to extract sample rate (Hz format) + sample_rate_match = re.search(r'(\d+)\s*hz', lower) + if sample_rate_match and 'sample_rate' not in result: + result['sample_rate'] = int(sample_rate_match.group(1)) + + # Try to extract channels (word format) + if 'audio_channels' not in result: + channel_match = re.search(r'\b(mono|stereo|5\.1|7\.1|quad|2\.1)\b', lower) + if channel_match: + result['audio_channels'] = channel_match.group(1) + + return result if result else None + + except Exception as e: + logger.error(f"[VLC AUDIO PARSER] Error parsing VLC audio stream info: {e}") + + return None + + +class StreamlinkLogParser(BaseLogParser): + """Parser for Streamlink log output""" + + STREAM_TYPE_METHODS = { + 'streamlink': 'parse_video_stream' + } + + def can_parse(self, line: str) -> Optional[str]: + """Check if this is a Streamlink line we can parse""" + lower = line.lower() + + if 'opening stream:' in lower or 'available streams:' in lower: + return 'streamlink' + + return None + + def parse_input_format(self, line: str) -> Optional[Dict[str, Any]]: + return None + + def parse_video_stream(self, line: str) -> Optional[Dict[str, Any]]: + """Parse Streamlink quality/resolution""" + try: + quality_match = re.search(r'(\d+p|\d+x\d+)', line) + if quality_match: + quality = quality_match.group(1) + + if 'x' in quality: + resolution = quality + width, height = map(int, quality.split('x')) + else: + resolutions = { + '2160p': ('3840x2160', 3840, 2160), + '1080p': ('1920x1080', 1920, 1080), + '720p': ('1280x720', 1280, 720), + '480p': ('854x480', 854, 480), + '360p': ('640x360', 640, 360) + } + resolution, width, height = resolutions.get(quality, ('1920x1080', 1920, 1080)) + + return { + 'video_codec': 'h264', + 'resolution': resolution, + 'width': width, + 'height': height, + 'pixel_format': 'yuv420p' + } + + except Exception as e: + logger.debug(f"Error parsing Streamlink video info: {e}") + + return None + + def parse_audio_stream(self, line: str) -> Optional[Dict[str, Any]]: + return None + + +class LogParserFactory: + """Factory to get the appropriate log parser""" + + _parsers = { + 'ffmpeg': FFmpegLogParser(), + 'vlc': VLCLogParser(), + 'streamlink': StreamlinkLogParser() + } + + @classmethod + def _get_parser_and_method(cls, stream_type: str) -> Optional[tuple[BaseLogParser, str]]: + """Determine parser and method from stream_type""" + # Check each parser to see if it handles this stream_type + for parser in cls._parsers.values(): + method_name = parser.STREAM_TYPE_METHODS.get(stream_type) + if method_name: + return (parser, method_name) + + return None + + @classmethod + def parse(cls, stream_type: str, line: str) -> Optional[Dict[str, Any]]: + """ + Parse a log line based 
on stream type. + Returns parsed data or None if parsing fails. + """ + result = cls._get_parser_and_method(stream_type) + if not result: + return None + + parser, method_name = result + method = getattr(parser, method_name, None) + if method: + return method(line) + + return None + + @classmethod + def auto_parse(cls, line: str) -> Optional[tuple[str, Dict[str, Any]]]: + """ + Automatically detect which parser can handle this line and parse it. + Returns (stream_type, parsed_data) or None if no parser can handle it. + """ + # Try each parser to see if it can handle this line + for parser in cls._parsers.values(): + stream_type = parser.can_parse(line) + if stream_type: + # Parser can handle this line, now parse it + parsed_data = cls.parse(stream_type, line) + if parsed_data: + return (stream_type, parsed_data) + + return None diff --git a/apps/proxy/ts_proxy/stream_manager.py b/apps/proxy/ts_proxy/stream_manager.py index bbeb4bb7..e7f752d8 100644 --- a/apps/proxy/ts_proxy/stream_manager.py +++ b/apps/proxy/ts_proxy/stream_manager.py @@ -107,6 +107,10 @@ class StreamManager: # Add this flag for tracking transcoding process status self.transcode_process_active = False + # Track stream command for efficient log parser routing + self.stream_command = None + self.parser_type = None # Will be set when transcode process starts + # Add tracking for data throughput self.bytes_processed = 0 self.last_bytes_update = time.time() @@ -476,6 +480,21 @@ class StreamManager: # Build and start transcode command self.transcode_cmd = stream_profile.build_command(self.url, self.user_agent) + # Store stream command for efficient log parser routing + self.stream_command = stream_profile.command + # Map actual commands to parser types for direct routing + command_to_parser = { + 'ffmpeg': 'ffmpeg', + 'cvlc': 'vlc', + 'vlc': 'vlc', + 'streamlink': 'streamlink' + } + self.parser_type = command_to_parser.get(self.stream_command.lower()) + if self.parser_type: + logger.debug(f"Using {self.parser_type} parser for log parsing (command: {self.stream_command})") + else: + logger.debug(f"Unknown stream command '{self.stream_command}', will use auto-detection for log parsing") + # For UDP streams, remove any user_agent parameters from the command if hasattr(self, 'stream_type') and self.stream_type == StreamType.UDP: # Filter out any arguments that contain the user_agent value or related headers @@ -645,35 +664,51 @@ class StreamManager: if content_lower.startswith('output #') or 'encoder' in content_lower: self.ffmpeg_input_phase = False - # Only parse stream info if we're still in the input phase - if ("stream #" in content_lower and - ("video:" in content_lower or "audio:" in content_lower) and - self.ffmpeg_input_phase): + # Route to appropriate parser based on known command type + from .services.log_parsers import LogParserFactory + from .services.channel_service import ChannelService - from .services.channel_service import ChannelService - if "video:" in content_lower: - ChannelService.parse_and_store_stream_info(self.channel_id, content, "video", self.current_stream_id) - elif "audio:" in content_lower: - ChannelService.parse_and_store_stream_info(self.channel_id, content, "audio", self.current_stream_id) + parse_result = None + + # If we know the parser type, use direct routing for efficiency + if self.parser_type: + # Get the appropriate parser and check what it can parse + parser = LogParserFactory._parsers.get(self.parser_type) + if parser: + stream_type = parser.can_parse(content) + if stream_type: + # Parser 
can handle this line, parse it directly + parsed_data = LogParserFactory.parse(stream_type, content) + if parsed_data: + parse_result = (stream_type, parsed_data) + else: + # Unknown command type - use auto-detection as fallback + parse_result = LogParserFactory.auto_parse(content) + + if parse_result: + stream_type, parsed_data = parse_result + # For FFmpeg, only parse during input phase + if stream_type in ['video', 'audio', 'input']: + if self.ffmpeg_input_phase: + ChannelService.parse_and_store_stream_info(self.channel_id, content, stream_type, self.current_stream_id) + else: + # VLC and Streamlink can be parsed anytime + ChannelService.parse_and_store_stream_info(self.channel_id, content, stream_type, self.current_stream_id) # Determine log level based on content if any(keyword in content_lower for keyword in ['error', 'failed', 'cannot', 'invalid', 'corrupt']): - logger.error(f"FFmpeg stderr for channel {self.channel_id}: {content}") + logger.error(f"Stream process error for channel {self.channel_id}: {content}") elif any(keyword in content_lower for keyword in ['warning', 'deprecated', 'ignoring']): - logger.warning(f"FFmpeg stderr for channel {self.channel_id}: {content}") + logger.warning(f"Stream process warning for channel {self.channel_id}: {content}") elif content.startswith('frame=') or 'fps=' in content or 'speed=' in content: # Stats lines - log at trace level to avoid spam - logger.trace(f"FFmpeg stats for channel {self.channel_id}: {content}") + logger.trace(f"Stream stats for channel {self.channel_id}: {content}") elif any(keyword in content_lower for keyword in ['input', 'output', 'stream', 'video', 'audio']): # Stream info - log at info level - logger.info(f"FFmpeg info for channel {self.channel_id}: {content}") - if content.startswith('Input #0'): - # If it's input 0, parse stream info - from .services.channel_service import ChannelService - ChannelService.parse_and_store_stream_info(self.channel_id, content, "input", self.current_stream_id) + logger.info(f"Stream info for channel {self.channel_id}: {content}") else: # Everything else at debug level - logger.debug(f"FFmpeg stderr for channel {self.channel_id}: {content}") + logger.debug(f"Stream process output for channel {self.channel_id}: {content}") except Exception as e: logger.error(f"Error logging stderr content for channel {self.channel_id}: {e}") diff --git a/apps/proxy/ts_proxy/url_utils.py b/apps/proxy/ts_proxy/url_utils.py index 3b05c9f2..8b467b7f 100644 --- a/apps/proxy/ts_proxy/url_utils.py +++ b/apps/proxy/ts_proxy/url_utils.py @@ -462,16 +462,21 @@ def validate_stream_url(url, user_agent=None, timeout=(5, 5)): session.headers.update(headers) # Make HEAD request first as it's faster and doesn't download content - head_response = session.head( - url, - timeout=timeout, - allow_redirects=True - ) + head_request_success = True + try: + head_response = session.head( + url, + timeout=timeout, + allow_redirects=True + ) + except requests.exceptions.RequestException as e: + head_request_success = False + logger.warning(f"Request error (HEAD), assuming HEAD not supported: {str(e)}") # If HEAD not supported, server will return 405 or other error - if 200 <= head_response.status_code < 300: + if head_request_success and (200 <= head_response.status_code < 300): # HEAD request successful - return True, head_response.url, head_response.status_code, "Valid (HEAD request)" + return True, url, head_response.status_code, "Valid (HEAD request)" # Try a GET request with stream=True to avoid downloading all content 
get_response = session.get( @@ -484,7 +489,7 @@ def validate_stream_url(url, user_agent=None, timeout=(5, 5)): # IMPORTANT: Check status code first before checking content if not (200 <= get_response.status_code < 300): logger.warning(f"Stream validation failed with HTTP status {get_response.status_code}") - return False, get_response.url, get_response.status_code, f"Invalid HTTP status: {get_response.status_code}" + return False, url, get_response.status_code, f"Invalid HTTP status: {get_response.status_code}" # Only check content if status code is valid try: @@ -538,7 +543,7 @@ def validate_stream_url(url, user_agent=None, timeout=(5, 5)): get_response.close() # If we have content, consider it valid even with unrecognized content type - return is_valid, get_response.url, get_response.status_code, message + return is_valid, url, get_response.status_code, message except requests.exceptions.Timeout: return False, url, 0, "Timeout connecting to stream" diff --git a/apps/vod/api_views.py b/apps/vod/api_views.py index 8cc55a11..3bd984e6 100644 --- a/apps/vod/api_views.py +++ b/apps/vod/api_views.py @@ -62,7 +62,7 @@ class MovieFilter(django_filters.FilterSet): # Handle the format 'category_name|category_type' if '|' in value: - category_name, category_type = value.split('|', 1) + category_name, category_type = value.rsplit('|', 1) return queryset.filter( m3u_relations__category__name=category_name, m3u_relations__category__category_type=category_type @@ -219,7 +219,7 @@ class SeriesFilter(django_filters.FilterSet): # Handle the format 'category_name|category_type' if '|' in value: - category_name, category_type = value.split('|', 1) + category_name, category_type = value.rsplit('|', 1) return queryset.filter( m3u_relations__category__name=category_name, m3u_relations__category__category_type=category_type @@ -588,7 +588,7 @@ class UnifiedContentViewSet(viewsets.ReadOnlyModelViewSet): if category: if '|' in category: - cat_name, cat_type = category.split('|', 1) + cat_name, cat_type = category.rsplit('|', 1) if cat_type == 'movie': where_conditions[0] += " AND movies.id IN (SELECT movie_id FROM vod_m3umovierelation mmr JOIN vod_vodcategory c ON mmr.category_id = c.id WHERE c.name = %s)" where_conditions[1] = "1=0" # Exclude series diff --git a/apps/vod/tasks.py b/apps/vod/tasks.py index d42be946..4eb9fadc 100644 --- a/apps/vod/tasks.py +++ b/apps/vod/tasks.py @@ -1292,8 +1292,17 @@ def batch_process_episodes(account, series, episodes_data, scan_start_time=None) try: episode_id = str(episode_data.get('id')) episode_name = episode_data.get('title', 'Unknown Episode') - season_number = episode_data['_season_number'] - episode_number = episode_data.get('episode_num', 0) + # Ensure season and episode numbers are integers (API may return strings) + try: + season_number = int(episode_data['_season_number']) + except (ValueError, TypeError) as e: + logger.warning(f"Invalid season_number '{episode_data.get('_season_number')}' for episode '{episode_name}': {e}") + season_number = 0 + try: + episode_number = int(episode_data.get('episode_num', 0)) + except (ValueError, TypeError) as e: + logger.warning(f"Invalid episode_num '{episode_data.get('episode_num')}' for episode '{episode_name}': {e}") + episode_number = 0 info = episode_data.get('info', {}) # Extract episode metadata @@ -1324,7 +1333,7 @@ def batch_process_episodes(account, series, episodes_data, scan_start_time=None) # Check if we already have this episode pending creation (multiple streams for same episode) if not episode and episode_key in 
episodes_pending_creation: episode = episodes_pending_creation[episode_key] - logger.debug(f"Reusing pending episode for S{season_number:02d}E{episode_number:02d} (stream_id: {episode_id})") + logger.debug(f"Reusing pending episode for S{season_number}E{episode_number} (stream_id: {episode_id})") if episode: # Update existing episode @@ -1432,6 +1441,21 @@ def batch_process_episodes(account, series, episodes_data, scan_start_time=None) if key in episode_pk_map: relation.episode = episode_pk_map[key] + # Filter out relations with unsaved episodes (no PK) + # This can happen if bulk_create had a conflict and ignore_conflicts=True didn't save the episode + valid_relations_to_create = [] + for relation in relations_to_create: + if relation.episode.pk is not None: + valid_relations_to_create.append(relation) + else: + season_num = relation.episode.season_number + episode_num = relation.episode.episode_number + logger.warning( + f"Skipping relation for episode S{season_num}E{episode_num} " + f"- episode not saved to database" + ) + relations_to_create = valid_relations_to_create + # Update existing episodes if episodes_to_update: Episode.objects.bulk_update(episodes_to_update, [ diff --git a/core/api_views.py b/core/api_views.py index c50d7fa6..e3459a38 100644 --- a/core/api_views.py +++ b/core/api_views.py @@ -142,8 +142,12 @@ class CoreSettingsViewSet(viewsets.ModelViewSet): }, status=status.HTTP_200_OK, ) - - return Response(in_network, status=status.HTTP_200_OK) + + response_data = { + **in_network, + "client_ip": str(client_ip) + } + return Response(response_data, status=status.HTTP_200_OK) return Response({}, status=status.HTTP_200_OK) diff --git a/core/fixtures/initial_data.json b/core/fixtures/initial_data.json index c037fa78..889f0d24 100644 --- a/core/fixtures/initial_data.json +++ b/core/fixtures/initial_data.json @@ -23,7 +23,7 @@ "model": "core.streamprofile", "pk": 1, "fields": { - "name": "ffmpeg", + "name": "FFmpeg", "command": "ffmpeg", "parameters": "-i {streamUrl} -c:v copy -c:a copy -f mpegts pipe:1", "is_active": true, @@ -34,11 +34,22 @@ "model": "core.streamprofile", "pk": 2, "fields": { - "name": "streamlink", + "name": "Streamlink", "command": "streamlink", "parameters": "{streamUrl} best --stdout", "is_active": true, "user_agent": "1" } + }, + { + "model": "core.streamprofile", + "pk": 3, + "fields": { + "name": "VLC", + "command": "cvlc", + "parameters": "-vv -I dummy --no-video-title-show --http-user-agent {userAgent} {streamUrl} --sout #standard{access=file,mux=ts,dst=-}", + "is_active": true, + "user_agent": "1" + } } ] diff --git a/core/migrations/0019_add_vlc_stream_profile.py b/core/migrations/0019_add_vlc_stream_profile.py new file mode 100644 index 00000000..c3f72592 --- /dev/null +++ b/core/migrations/0019_add_vlc_stream_profile.py @@ -0,0 +1,42 @@ +# Generated migration to add VLC stream profile + +from django.db import migrations + +def add_vlc_profile(apps, schema_editor): + StreamProfile = apps.get_model("core", "StreamProfile") + UserAgent = apps.get_model("core", "UserAgent") + + # Check if VLC profile already exists + if not StreamProfile.objects.filter(name="VLC").exists(): + # Get the TiviMate user agent (should be pk=1) + try: + tivimate_ua = UserAgent.objects.get(pk=1) + except UserAgent.DoesNotExist: + # Fallback: get first available user agent + tivimate_ua = UserAgent.objects.first() + if not tivimate_ua: + # No user agents exist, skip creating profile + return + + StreamProfile.objects.create( + name="VLC", + command="cvlc", + parameters="-vv -I 
dummy --no-video-title-show --http-user-agent {userAgent} {streamUrl} --sout #standard{access=file,mux=ts,dst=-}", + is_active=True, + user_agent=tivimate_ua, + locked=True, # Make it read-only like ffmpeg/streamlink + ) + +def remove_vlc_profile(apps, schema_editor): + StreamProfile = apps.get_model("core", "StreamProfile") + StreamProfile.objects.filter(name="VLC").delete() + +class Migration(migrations.Migration): + + dependencies = [ + ('core', '0018_alter_systemevent_event_type'), + ] + + operations = [ + migrations.RunPython(add_vlc_profile, remove_vlc_profile), + ] diff --git a/docker/Dockerfile b/docker/Dockerfile index dc437227..bfb35c11 100644 --- a/docker/Dockerfile +++ b/docker/Dockerfile @@ -35,9 +35,6 @@ RUN rm -rf /app/frontend # Copy built frontend assets COPY --from=frontend-builder /app/frontend/dist /app/frontend/dist -# Run Django collectstatic -RUN python manage.py collectstatic --noinput - # Add timestamp argument ARG TIMESTAMP diff --git a/docker/entrypoint.sh b/docker/entrypoint.sh index 72eb5928..5de9bf0a 100755 --- a/docker/entrypoint.sh +++ b/docker/entrypoint.sh @@ -100,7 +100,7 @@ export POSTGRES_DIR=/data/db if [[ ! -f /etc/profile.d/dispatcharr.sh ]]; then # Define all variables to process variables=( - PATH VIRTUAL_ENV DJANGO_SETTINGS_MODULE PYTHONUNBUFFERED + PATH VIRTUAL_ENV DJANGO_SETTINGS_MODULE PYTHONUNBUFFERED PYTHONDONTWRITEBYTECODE POSTGRES_DB POSTGRES_USER POSTGRES_PASSWORD POSTGRES_HOST POSTGRES_PORT DISPATCHARR_ENV DISPATCHARR_DEBUG DISPATCHARR_LOG_LEVEL REDIS_HOST REDIS_DB POSTGRES_DIR DISPATCHARR_PORT @@ -174,9 +174,9 @@ else pids+=("$nginx_pid") fi -cd /app -python manage.py migrate --noinput -python manage.py collectstatic --noinput +# Run Django commands as non-root user to prevent permission issues +su - $POSTGRES_USER -c "cd /app && python manage.py migrate --noinput" +su - $POSTGRES_USER -c "cd /app && python manage.py collectstatic --noinput" # Select proper uwsgi config based on environment if [ "$DISPATCHARR_ENV" = "dev" ] && [ "$DISPATCHARR_DEBUG" != "true" ]; then diff --git a/docker/init/03-init-dispatcharr.sh b/docker/init/03-init-dispatcharr.sh index 03fe6816..0c317017 100644 --- a/docker/init/03-init-dispatcharr.sh +++ b/docker/init/03-init-dispatcharr.sh @@ -15,6 +15,7 @@ DATA_DIRS=( APP_DIRS=( "/app/logo_cache" "/app/media" + "/app/static" ) # Create all directories diff --git a/fixtures.json b/fixtures.json index 2d42f84e..3c31f926 100644 --- a/fixtures.json +++ b/fixtures.json @@ -36,7 +36,7 @@ "model": "core.streamprofile", "pk": 1, "fields": { - "profile_name": "ffmpeg", + "profile_name": "FFmpeg", "command": "ffmpeg", "parameters": "-i {streamUrl} -c:a copy -c:v copy -f mpegts pipe:1", "is_active": true, @@ -46,13 +46,23 @@ { "model": "core.streamprofile", "fields": { - "profile_name": "streamlink", + "profile_name": "Streamlink", "command": "streamlink", "parameters": "{streamUrl} best --stdout", "is_active": true, "user_agent": "1" } }, + { + "model": "core.streamprofile", + "fields": { + "profile_name": "VLC", + "command": "cvlc", + "parameters": "-vv -I dummy --no-video-title-show --http-user-agent {userAgent} {streamUrl} --sout #standard{access=file,mux=ts,dst=-}", + "is_active": true, + "user_agent": "1" + } + }, { "model": "core.coresettings", "fields": { diff --git a/frontend/src/App.jsx b/frontend/src/App.jsx index 3c7c3877..f22d408f 100644 --- a/frontend/src/App.jsx +++ b/frontend/src/App.jsx @@ -19,7 +19,6 @@ import Users from './pages/Users'; import LogosPage from './pages/Logos'; import VODsPage from 
'./pages/VODs'; import useAuthStore from './store/auth'; -import useLogosStore from './store/logos'; import FloatingVideo from './components/FloatingVideo'; import { WebsocketProvider } from './WebSocket'; import { Box, AppShell, MantineProvider } from '@mantine/core'; @@ -40,8 +39,6 @@ const defaultRoute = '/channels'; const App = () => { const [open, setOpen] = useState(true); - const [backgroundLoadingStarted, setBackgroundLoadingStarted] = - useState(false); const isAuthenticated = useAuthStore((s) => s.isAuthenticated); const setIsAuthenticated = useAuthStore((s) => s.setIsAuthenticated); const logout = useAuthStore((s) => s.logout); @@ -81,11 +78,7 @@ const App = () => { const loggedIn = await initializeAuth(); if (loggedIn) { await initData(); - // Start background logo loading after app is fully initialized (only once) - if (!backgroundLoadingStarted) { - setBackgroundLoadingStarted(true); - useLogosStore.getState().startBackgroundLoading(); - } + // Logos are now loaded at the end of initData, no need for background loading } else { await logout(); } @@ -96,7 +89,7 @@ const App = () => { }; checkAuth(); - }, [initializeAuth, initData, logout, backgroundLoadingStarted]); + }, [initializeAuth, initData, logout]); return ( Something went wrong; + } + return this.props.children; + } +} + +export default ErrorBoundary; \ No newline at end of file diff --git a/frontend/src/components/Field.jsx b/frontend/src/components/Field.jsx new file mode 100644 index 00000000..1293bf7b --- /dev/null +++ b/frontend/src/components/Field.jsx @@ -0,0 +1,47 @@ +import { NumberInput, Select, Switch, TextInput } from '@mantine/core'; +import React from 'react'; + +export const Field = ({ field, value, onChange }) => { + const common = { label: field.label, description: field.help_text }; + const effective = value ?? field.default; + switch (field.type) { + case 'boolean': + return ( + onChange(field.id, e.currentTarget.checked)} + label={field.label} + description={field.help_text} + /> + ); + case 'number': + return ( + onChange(field.id, v)} + {...common} + /> + ); + case 'select': + return ( + + + ({ + value: String(opt.value), + label: opt.label, + }))} + searchable + clearable + /> + + + + + + + + + + + + + + + + + + Upcoming occurrences + + {upcomingOccurrences.length} + + {upcomingOccurrences.length === 0 ? ( + + No future airings currently scheduled. 
+ + ) : } + + + + ); +}; + +export default RecurringRuleModal; \ No newline at end of file diff --git a/frontend/src/components/forms/SeriesRecordingModal.jsx b/frontend/src/components/forms/SeriesRecordingModal.jsx new file mode 100644 index 00000000..3d890971 --- /dev/null +++ b/frontend/src/components/forms/SeriesRecordingModal.jsx @@ -0,0 +1,91 @@ +import React from 'react'; +import { Modal, Stack, Text, Flex, Group, Button } from '@mantine/core'; +import useChannelsStore from '../../store/channels.jsx'; +import { deleteSeriesAndRule } from '../../utils/cards/RecordingCardUtils.js'; +import { evaluateSeriesRulesByTvgId, fetchRules } from '../../pages/guideUtils.js'; +import { showNotification } from '../../utils/notificationUtils.js'; + +export default function SeriesRecordingModal({ + opened, + onClose, + rules, + onRulesUpdate +}) { + const handleEvaluateNow = async (r) => { + await evaluateSeriesRulesByTvgId(r.tvg_id); + try { + await useChannelsStore.getState().fetchRecordings(); + } catch (error) { + console.warn('Failed to refresh recordings after evaluation', error); + } + showNotification({ + title: 'Evaluated', + message: 'Checked for episodes', + }); + }; + + const handleRemoveSeries = async (r) => { + await deleteSeriesAndRule({ tvg_id: r.tvg_id, title: r.title }); + try { + await useChannelsStore.getState().fetchRecordings(); + } catch (error) { + console.warn('Failed to refresh recordings after bulk removal', error); + } + const updated = await fetchRules(); + onRulesUpdate(updated); + }; + + return ( + + + {(!rules || rules.length === 0) && ( + + No series rules configured + + )} + {rules && rules.map((r) => ( + + + {r.title || r.tvg_id} —{' '} + {r.mode === 'new' ? 'New episodes' : 'Every episode'} + + + + + + + ))} + + + ); +} diff --git a/frontend/src/components/forms/settings/DvrSettingsForm.jsx b/frontend/src/components/forms/settings/DvrSettingsForm.jsx new file mode 100644 index 00000000..f03bdf66 --- /dev/null +++ b/frontend/src/components/forms/settings/DvrSettingsForm.jsx @@ -0,0 +1,263 @@ +import useSettingsStore from '../../../store/settings.jsx'; +import React, { useEffect, useState } from 'react'; +import { + getChangedSettings, + parseSettings, + saveChangedSettings, +} from '../../../utils/pages/SettingsUtils.js'; +import { showNotification } from '../../../utils/notificationUtils.js'; +import { + Alert, + Button, + FileInput, + Flex, + Group, + NumberInput, + Stack, + Switch, + Text, + TextInput, +} from '@mantine/core'; +import { + getComskipConfig, + getDvrSettingsFormInitialValues, + uploadComskipIni, +} from '../../../utils/forms/settings/DvrSettingsFormUtils.js'; +import { useForm } from '@mantine/form'; + +const DvrSettingsForm = React.memo(({ active }) => { + const settings = useSettingsStore((s) => s.settings); + const [saved, setSaved] = useState(false); + const [comskipFile, setComskipFile] = useState(null); + const [comskipUploadLoading, setComskipUploadLoading] = useState(false); + const [comskipConfig, setComskipConfig] = useState({ + path: '', + exists: false, + }); + + const form = useForm({ + mode: 'controlled', + initialValues: getDvrSettingsFormInitialValues(), + }); + + useEffect(() => { + if (!active) setSaved(false); + }, [active]); + + useEffect(() => { + if (settings) { + const formValues = parseSettings(settings); + + form.setValues(formValues); + + if (formValues['dvr-comskip-custom-path']) { + setComskipConfig((prev) => ({ + path: formValues['dvr-comskip-custom-path'], + exists: prev.exists, + })); + } + } + }, [settings]); + + 
useEffect(() => { + const loadComskipConfig = async () => { + try { + const response = await getComskipConfig(); + if (response) { + setComskipConfig({ + path: response.path || '', + exists: Boolean(response.exists), + }); + if (response.path) { + form.setFieldValue('dvr-comskip-custom-path', response.path); + } + } + } catch (error) { + console.error('Failed to load comskip config', error); + } + }; + loadComskipConfig(); + }, []); + + const onComskipUpload = async () => { + if (!comskipFile) { + return; + } + + setComskipUploadLoading(true); + try { + const response = await uploadComskipIni(comskipFile); + if (response?.path) { + showNotification({ + title: 'comskip.ini uploaded', + message: response.path, + autoClose: 3000, + color: 'green', + }); + form.setFieldValue('dvr-comskip-custom-path', response.path); + useSettingsStore.getState().updateSetting({ + ...(settings['dvr-comskip-custom-path'] || { + key: 'dvr-comskip-custom-path', + name: 'DVR Comskip Custom Path', + }), + value: response.path, + }); + setComskipConfig({ path: response.path, exists: true }); + } + } catch (error) { + console.error('Failed to upload comskip.ini', error); + } finally { + setComskipUploadLoading(false); + setComskipFile(null); + } + }; + + const onSubmit = async () => { + setSaved(false); + + const changedSettings = getChangedSettings(form.getValues(), settings); + + // Update each changed setting in the backend (create if missing) + try { + await saveChangedSettings(settings, changedSettings); + + setSaved(true); + } catch (error) { + // Error notifications are already shown by API functions + // Just don't show the success message + console.error('Error saving settings:', error); + } + }; + + return ( +
+ + {saved && ( + + )} + + + + + + + + {comskipConfig.exists && comskipConfig.path + ? `Using ${comskipConfig.path}` + : 'No custom comskip.ini uploaded.'} + + + + + + + + + + + +
+ ); +}); + +export default DvrSettingsForm; \ No newline at end of file diff --git a/frontend/src/components/forms/settings/NetworkAccessForm.jsx b/frontend/src/components/forms/settings/NetworkAccessForm.jsx new file mode 100644 index 00000000..1d2c42e7 --- /dev/null +++ b/frontend/src/components/forms/settings/NetworkAccessForm.jsx @@ -0,0 +1,161 @@ +import { NETWORK_ACCESS_OPTIONS } from '../../../constants.js'; +import useSettingsStore from '../../../store/settings.jsx'; +import React, { useEffect, useState } from 'react'; +import { useForm } from '@mantine/form'; +import { + checkSetting, + updateSetting, +} from '../../../utils/pages/SettingsUtils.js'; +import { Alert, Button, Flex, Stack, Text, TextInput } from '@mantine/core'; +import ConfirmationDialog from '../../ConfirmationDialog.jsx'; +import { + getNetworkAccessFormInitialValues, + getNetworkAccessFormValidation, +} from '../../../utils/forms/settings/NetworkAccessFormUtils.js'; + +const NetworkAccessForm = React.memo(({ active }) => { + const settings = useSettingsStore((s) => s.settings); + + const [networkAccessError, setNetworkAccessError] = useState(null); + const [saved, setSaved] = useState(false); + const [networkAccessConfirmOpen, setNetworkAccessConfirmOpen] = + useState(false); + const [netNetworkAccessConfirmCIDRs, setNetNetworkAccessConfirmCIDRs] = + useState([]); + const [clientIpAddress, setClientIpAddress] = useState(null); + + const networkAccessForm = useForm({ + mode: 'controlled', + initialValues: getNetworkAccessFormInitialValues(), + validate: getNetworkAccessFormValidation(), + }); + + useEffect(() => { + if(!active) setSaved(false); + }, [active]); + + useEffect(() => { + const networkAccessSettings = JSON.parse( + settings['network-access'].value || '{}' + ); + networkAccessForm.setValues( + Object.keys(NETWORK_ACCESS_OPTIONS).reduce((acc, key) => { + acc[key] = networkAccessSettings[key] || '0.0.0.0/0,::/0'; + return acc; + }, {}) + ); + }, [settings]); + + const onNetworkAccessSubmit = async () => { + setSaved(false); + setNetworkAccessError(null); + const check = await checkSetting({ + ...settings['network-access'], + value: JSON.stringify(networkAccessForm.getValues()), + }); + + if (check.error && check.message) { + setNetworkAccessError(`${check.message}: ${check.data}`); + return; + } + + // Store the client IP + setClientIpAddress(check.client_ip); + + // For now, only warn if we're blocking the UI + const blockedAccess = check.UI; + if (blockedAccess.length === 0) { + return saveNetworkAccess(); + } + + setNetNetworkAccessConfirmCIDRs(blockedAccess); + setNetworkAccessConfirmOpen(true); + }; + + const saveNetworkAccess = async () => { + setSaved(false); + try { + await updateSetting({ + ...settings['network-access'], + value: JSON.stringify(networkAccessForm.getValues()), + }); + setSaved(true); + setNetworkAccessConfirmOpen(false); + } catch (e) { + const errors = {}; + for (const key in e.body.value) { + errors[key] = `Invalid CIDR(s): ${e.body.value[key]}`; + } + networkAccessForm.setErrors(errors); + } + }; + + return ( + <> +
+ + {saved && ( + + )} + {networkAccessError && ( + + )} + + {Object.entries(NETWORK_ACCESS_OPTIONS).map(([key, config]) => ( + + ))} + + + + + +
+ + setNetworkAccessConfirmOpen(false)} + onConfirm={saveNetworkAccess} + title={`Confirm Network Access Blocks`} + message={ + <> + + Your client {clientIpAddress && `(${clientIpAddress}) `}is not + included in the allowed networks for the web UI. Are you sure you + want to proceed? + + +
    + {netNetworkAccessConfirmCIDRs.map((cidr) => ( +
  • {cidr}
  • + ))} +
+ + } + confirmLabel="Save" + cancelLabel="Cancel" + size="md" + /> + + ); +}); + +export default NetworkAccessForm; \ No newline at end of file diff --git a/frontend/src/components/forms/settings/ProxySettingsForm.jsx b/frontend/src/components/forms/settings/ProxySettingsForm.jsx new file mode 100644 index 00000000..7fc2d0cb --- /dev/null +++ b/frontend/src/components/forms/settings/ProxySettingsForm.jsx @@ -0,0 +1,166 @@ +import useSettingsStore from '../../../store/settings.jsx'; +import React, { useEffect, useState } from 'react'; +import { useForm } from '@mantine/form'; +import { updateSetting } from '../../../utils/pages/SettingsUtils.js'; +import { + Alert, + Button, + Flex, + NumberInput, + Stack, + TextInput, +} from '@mantine/core'; +import { PROXY_SETTINGS_OPTIONS } from '../../../constants.js'; +import { + getProxySettingDefaults, + getProxySettingsFormInitialValues, +} from '../../../utils/forms/settings/ProxySettingsFormUtils.js'; + +const ProxySettingsOptions = React.memo(({ proxySettingsForm }) => { + const isNumericField = (key) => { + // Determine if this field should be a NumberInput + return [ + 'buffering_timeout', + 'redis_chunk_ttl', + 'channel_shutdown_delay', + 'channel_init_grace_period', + ].includes(key); + }; + const isFloatField = (key) => { + return key === 'buffering_speed'; + }; + const getNumericFieldMax = (key) => { + return key === 'buffering_timeout' + ? 300 + : key === 'redis_chunk_ttl' + ? 3600 + : key === 'channel_shutdown_delay' + ? 300 + : 60; + }; + return ( + <> + {Object.entries(PROXY_SETTINGS_OPTIONS).map(([key, config]) => { + if (isNumericField(key)) { + return ( + + ); + } else if (isFloatField(key)) { + return ( + + ); + } else { + return ( + + ); + } + })} + + ); +}); + +const ProxySettingsForm = React.memo(({ active }) => { + const settings = useSettingsStore((s) => s.settings); + + const [saved, setSaved] = useState(false); + + const proxySettingsForm = useForm({ + mode: 'controlled', + initialValues: getProxySettingsFormInitialValues(), + }); + + useEffect(() => { + if(!active) setSaved(false); + }, [active]); + + useEffect(() => { + if (settings) { + if (settings['proxy-settings']?.value) { + try { + const proxySettings = JSON.parse(settings['proxy-settings'].value); + proxySettingsForm.setValues(proxySettings); + } catch (error) { + console.error('Error parsing proxy settings:', error); + } + } + } + }, [settings]); + + const resetProxySettingsToDefaults = () => { + proxySettingsForm.setValues(getProxySettingDefaults()); + }; + + const onProxySettingsSubmit = async () => { + setSaved(false); + + try { + const result = await updateSetting({ + ...settings['proxy-settings'], + value: JSON.stringify(proxySettingsForm.getValues()), + }); + // API functions return undefined on error + if (result) { + setSaved(true); + } + } catch (error) { + // Error notifications are already shown by API functions + console.error('Error saving proxy settings:', error); + } + }; + + return ( +
+ + {saved && ( + + )} + + + + + + + + +
+ ); +}); + +export default ProxySettingsForm; \ No newline at end of file diff --git a/frontend/src/components/forms/settings/StreamSettingsForm.jsx b/frontend/src/components/forms/settings/StreamSettingsForm.jsx new file mode 100644 index 00000000..1b6b466d --- /dev/null +++ b/frontend/src/components/forms/settings/StreamSettingsForm.jsx @@ -0,0 +1,306 @@ +import useSettingsStore from '../../../store/settings.jsx'; +import useWarningsStore from '../../../store/warnings.jsx'; +import useUserAgentsStore from '../../../store/userAgents.jsx'; +import useStreamProfilesStore from '../../../store/streamProfiles.jsx'; +import { REGION_CHOICES } from '../../../constants.js'; +import React, { useEffect, useState } from 'react'; +import { + getChangedSettings, + parseSettings, + rehashStreams, + saveChangedSettings, +} from '../../../utils/pages/SettingsUtils.js'; +import { + Alert, + Button, + Flex, + Group, + MultiSelect, + Select, + Switch, + Text, +} from '@mantine/core'; +import ConfirmationDialog from '../../ConfirmationDialog.jsx'; +import { useForm } from '@mantine/form'; +import { + getStreamSettingsFormInitialValues, + getStreamSettingsFormValidation, +} from '../../../utils/forms/settings/StreamSettingsFormUtils.js'; + +const StreamSettingsForm = React.memo(({ active }) => { + const settings = useSettingsStore((s) => s.settings); + const suppressWarning = useWarningsStore((s) => s.suppressWarning); + const isWarningSuppressed = useWarningsStore((s) => s.isWarningSuppressed); + const userAgents = useUserAgentsStore((s) => s.userAgents); + const streamProfiles = useStreamProfilesStore((s) => s.profiles); + const regionChoices = REGION_CHOICES; + + // Store pending changed settings when showing the dialog + const [pendingChangedSettings, setPendingChangedSettings] = useState(null); + + const [saved, setSaved] = useState(false); + const [rehashingStreams, setRehashingStreams] = useState(false); + const [rehashSuccess, setRehashSuccess] = useState(false); + const [rehashConfirmOpen, setRehashConfirmOpen] = useState(false); + + // Add a new state to track the dialog type + const [rehashDialogType, setRehashDialogType] = useState(null); // 'save' or 'rehash' + + const form = useForm({ + mode: 'controlled', + initialValues: getStreamSettingsFormInitialValues(), + validate: getStreamSettingsFormValidation(), + }); + + useEffect(() => { + if (!active) { + setSaved(false); + setRehashSuccess(false); + } + }, [active]); + + useEffect(() => { + if (settings) { + const formValues = parseSettings(settings); + + form.setValues(formValues); + } + }, [settings]); + + const executeSettingsSaveAndRehash = async () => { + setRehashConfirmOpen(false); + setSaved(false); + + // Use the stored pending values that were captured before the dialog was shown + const changedSettings = pendingChangedSettings || {}; + + // Update each changed setting in the backend (create if missing) + try { + await saveChangedSettings(settings, changedSettings); + + // Clear the pending values + setPendingChangedSettings(null); + setSaved(true); + } catch (error) { + // Error notifications are already shown by API functions + // Just don't show the success message + console.error('Error saving settings:', error); + setPendingChangedSettings(null); + } + }; + + const executeRehashStreamsOnly = async () => { + setRehashingStreams(true); + setRehashSuccess(false); + setRehashConfirmOpen(false); + + try { + await rehashStreams(); + setRehashSuccess(true); + setTimeout(() => setRehashSuccess(false), 5000); + } catch (error) { + 
console.error('Error rehashing streams:', error); + } finally { + setRehashingStreams(false); + } + }; + + const onRehashStreams = async () => { + // Skip warning if it's been suppressed + if (isWarningSuppressed('rehash-streams')) { + return executeRehashStreamsOnly(); + } + + setRehashDialogType('rehash'); // Set dialog type to rehash + setRehashConfirmOpen(true); + }; + + const handleRehashConfirm = () => { + if (rehashDialogType === 'save') { + executeSettingsSaveAndRehash(); + } else { + executeRehashStreamsOnly(); + } + }; + + const onSubmit = async () => { + setSaved(false); + + const values = form.getValues(); + const changedSettings = getChangedSettings(values, settings); + + const m3uHashKeyChanged = + settings['m3u-hash-key']?.value !== values['m3u-hash-key'].join(','); + + // If M3U hash key changed, show warning (unless suppressed) + if (m3uHashKeyChanged && !isWarningSuppressed('rehash-streams')) { + // Store the changed settings before showing dialog + setPendingChangedSettings(changedSettings); + setRehashDialogType('save'); // Set dialog type to save + setRehashConfirmOpen(true); + return; + } + + // Update each changed setting in the backend (create if missing) + try { + await saveChangedSettings(settings, changedSettings); + + setSaved(true); + } catch (error) { + // Error notifications are already shown by API functions + // Just don't show the success message + console.error('Error saving settings:', error); + } + }; + + return ( + <> +
+ {saved && ( + + )} + ({ + value: `${option.id}`, + label: option.name, + }))} + /> + onUISettingsChange('table-size', val)} + data={[ + { + value: 'default', + label: 'Default', + }, + { + value: 'compact', + label: 'Compact', + }, + { + value: 'large', + label: 'Large', + }, + ]} + /> + onUISettingsChange('date-format', val)} + data={[ + { + value: 'mdy', + label: 'MM/DD/YYYY', + }, + { + value: 'dmy', + label: 'DD/MM/YYYY', + }, + ]} + /> + - - ({ - value: String(opt.value), - label: opt.label, - }))} - searchable - clearable - /> - - - form.setFieldValue('start_date', value || dayjs().toDate()) - } - valueFormat="MMM D, YYYY" - /> - form.setFieldValue('end_date', value)} - valueFormat="MMM D, YYYY" - minDate={form.values.start_date || undefined} - /> - - - - form.setFieldValue('start_time', toTimeString(value)) - } - withSeconds={false} - format="12" - amLabel="AM" - pmLabel="PM" - /> - - form.setFieldValue('end_time', toTimeString(value)) - } - withSeconds={false} - format="12" - amLabel="AM" - pmLabel="PM" - /> - - - - - - - - - - - Upcoming occurrences - - {upcomingOccurrences.length} - - {upcomingOccurrences.length === 0 ? ( - - No future airings currently scheduled. - - ) : ( - - {upcomingOccurrences.map((occ) => { - const occStart = toUserTime(occ.start_time); - const occEnd = toUserTime(occ.end_time); - return ( - - - - - {occStart.format(`${dateformat}, YYYY`)} - - - {occStart.format(timeformat)} – {occEnd.format(timeformat)} - - - - - - - - - ); - })} - - )} - - - - ); -}; - -const RecordingCard = ({ recording, onOpenDetails, onOpenRecurring }) => { - const channels = useChannelsStore((s) => s.channels); - const env_mode = useSettingsStore((s) => s.environment.env_mode); - const showVideo = useVideoStore((s) => s.showVideo); - const fetchRecordings = useChannelsStore((s) => s.fetchRecordings); - const { toUserTime, userNow } = useTimeHelpers(); - const [timeformat, dateformat] = useDateTimeFormat(); - - const channel = channels?.[recording.channel]; - - const deleteRecording = (id) => { - // Optimistically remove immediately from UI - try { - useChannelsStore.getState().removeRecording(id); - } catch (error) { - console.error('Failed to optimistically remove recording', error); - } - // Fire-and-forget server delete; websocket will keep others in sync - API.deleteRecording(id).catch(() => { - // On failure, fallback to refetch to restore state - try { - useChannelsStore.getState().fetchRecordings(); - } catch (error) { - console.error('Failed to refresh recordings after delete', error); - } - }); - }; - - const customProps = recording.custom_properties || {}; - const program = customProps.program || {}; - const recordingName = program.title || 'Custom Recording'; - const subTitle = program.sub_title || ''; - const description = program.description || customProps.description || ''; - const isRecurringRule = customProps?.rule?.type === 'recurring'; - - // Poster or channel logo - const posterLogoId = customProps.poster_logo_id; - let posterUrl = posterLogoId - ? 
`/api/channels/logos/${posterLogoId}/cache/` - : customProps.poster_url || channel?.logo?.cache_url || '/logo.png'; - // Prefix API host in dev if using a relative path - if (env_mode === 'dev' && posterUrl && posterUrl.startsWith('/')) { - posterUrl = `${window.location.protocol}//${window.location.hostname}:5656${posterUrl}`; - } - - const start = toUserTime(recording.start_time); - const end = toUserTime(recording.end_time); - const now = userNow(); - const status = customProps.status; - const isTimeActive = now.isAfter(start) && now.isBefore(end); - const isInterrupted = status === 'interrupted'; - const isInProgress = isTimeActive; // Show as recording by time, regardless of status glitches - const isUpcoming = now.isBefore(start); - const isSeriesGroup = Boolean( - recording._group_count && recording._group_count > 1 - ); - // Season/Episode display if present - const season = customProps.season ?? program?.custom_properties?.season; - const episode = customProps.episode ?? program?.custom_properties?.episode; - const onscreen = - customProps.onscreen_episode ?? - program?.custom_properties?.onscreen_episode; - const seLabel = - season && episode - ? `S${String(season).padStart(2, '0')}E${String(episode).padStart(2, '0')}` - : onscreen || null; - - const handleWatchLive = () => { - if (!channel) return; - let url = `/proxy/ts/stream/${channel.uuid}`; - if (env_mode === 'dev') { - url = `${window.location.protocol}//${window.location.hostname}:5656${url}`; - } - showVideo(url, 'live'); - }; - - const handleWatchRecording = () => { - // Only enable if backend provides a playable file URL in custom properties - let fileUrl = customProps.file_url || customProps.output_file_url; - if (!fileUrl) return; - if (env_mode === 'dev' && fileUrl.startsWith('/')) { - fileUrl = `${window.location.protocol}//${window.location.hostname}:5656${fileUrl}`; - } - showVideo(fileUrl, 'vod', { - name: recordingName, - logo: { url: posterUrl }, - }); - }; - - const handleRunComskip = async (e) => { - e?.stopPropagation?.(); - try { - await API.runComskip(recording.id); - notifications.show({ - title: 'Removing commercials', - message: 'Queued comskip for this recording', - color: 'blue.5', - autoClose: 2000, - }); - } catch (error) { - console.error('Failed to queue comskip for recording', error); - } - }; - - // Cancel handling for series groups - const [cancelOpen, setCancelOpen] = React.useState(false); - const [busy, setBusy] = React.useState(false); - const handleCancelClick = (e) => { - e.stopPropagation(); - if (isRecurringRule) { - onOpenRecurring?.(recording, true); - return; - } - if (isSeriesGroup) { - setCancelOpen(true); - } else { - deleteRecording(recording.id); - } - }; - - const seriesInfo = (() => { - const cp = customProps || {}; - const pr = cp.program || {}; - return { tvg_id: pr.tvg_id, title: pr.title }; - })(); - - const removeUpcomingOnly = async () => { - try { - setBusy(true); - await API.deleteRecording(recording.id); - } finally { - setBusy(false); - setCancelOpen(false); - try { - await fetchRecordings(); - } catch (error) { - console.error('Failed to refresh recordings', error); - } - } - }; - - const removeSeriesAndRule = async () => { - try { - setBusy(true); - const { tvg_id, title } = seriesInfo; - if (tvg_id) { - try { - await API.bulkRemoveSeriesRecordings({ - tvg_id, - title, - scope: 'title', - }); - } catch (error) { - console.error('Failed to remove series recordings', error); - } - try { - await API.deleteSeriesRule(tvg_id); - } catch (error) { - 
console.error('Failed to delete series rule', error); - } - } - } finally { - setBusy(false); - setCancelOpen(false); - try { - await fetchRecordings(); - } catch (error) { - console.error( - 'Failed to refresh recordings after series removal', - error - ); - } - } - }; - - const MainCard = ( - { - if (isRecurringRule) { - onOpenRecurring?.(recording, false); - } else { - onOpenDetails?.(recording); - } - }} - > - - - - {isInterrupted - ? 'Interrupted' - : isInProgress - ? 'Recording' - : isUpcoming - ? 'Scheduled' - : 'Completed'} - - {isInterrupted && } - - - - {recordingName} - - {isSeriesGroup && ( - - Series - - )} - {isRecurringRule && ( - - Recurring - - )} - {seLabel && !isSeriesGroup && ( - - {seLabel} - - )} - - - - -
- - e.stopPropagation()} - onClick={handleCancelClick} - > - - - -
-
- - - {recordingName} - - {!isSeriesGroup && subTitle && ( - - - Episode - - - {subTitle} - - - )} - - - Channel - - - {channel ? `${channel.channel_number} • ${channel.name}` : '—'} - - - - - - {isSeriesGroup ? 'Next recording' : 'Time'} - - - {start.format(`${dateformat}, YYYY ${timeformat}`)} – {end.format(timeformat)} - - - - {!isSeriesGroup && description && ( - onOpenDetails?.(recording)} - /> - )} - - {isInterrupted && customProps.interrupted_reason && ( - - {customProps.interrupted_reason} - - )} - - - {isInProgress && ( - - )} - - {!isUpcoming && ( - - - - )} - {!isUpcoming && - customProps?.status === 'completed' && - (!customProps?.comskip || - customProps?.comskip?.status !== 'completed') && ( - - )} - - - - {/* If this card is a grouped upcoming series, show count */} - {recording._group_count > 1 && ( - - Next of {recording._group_count} - - )} -
- ); - if (!isSeriesGroup) return MainCard; - - // Stacked look for series groups: render two shadow layers behind the main card - return ( - - setCancelOpen(false)} - title="Cancel Series" - centered - size="md" - zIndex={9999} - > - - This is a series rule. What would you like to cancel? - - - - - - - - - {MainCard} - - ); +const RecordingList = ({ list, onOpenDetails, onOpenRecurring }) => { + return list.map((rec) => ( + + )); }; const DVRPage = () => { @@ -1441,86 +105,42 @@ const DVRPage = () => { // Categorize recordings const { inProgress, upcoming, completed } = useMemo(() => { - const inProgress = []; - const upcoming = []; - const completed = []; - const list = Array.isArray(recordings) - ? recordings - : Object.values(recordings || {}); - - // ID-based dedupe guard in case store returns duplicates - const seenIds = new Set(); - for (const rec of list) { - if (rec && rec.id != null) { - const k = String(rec.id); - if (seenIds.has(k)) continue; - seenIds.add(k); - } - const s = toUserTime(rec.start_time); - const e = toUserTime(rec.end_time); - const status = rec.custom_properties?.status; - if (status === 'interrupted' || status === 'completed') { - completed.push(rec); - } else { - if (now.isAfter(s) && now.isBefore(e)) inProgress.push(rec); - else if (now.isBefore(s)) upcoming.push(rec); - else completed.push(rec); - } - } - - // Deduplicate in-progress and upcoming by program id or channel+slot - const dedupeByProgramOrSlot = (arr) => { - const out = []; - const sigs = new Set(); - for (const r of arr) { - const cp = r.custom_properties || {}; - const pr = cp.program || {}; - const sig = - pr?.id != null - ? `id:${pr.id}` - : `slot:${r.channel}|${r.start_time}|${r.end_time}|${pr.title || ''}`; - if (sigs.has(sig)) continue; - sigs.add(sig); - out.push(r); - } - return out; - }; - - const inProgressDedup = dedupeByProgramOrSlot(inProgress).sort( - (a, b) => toUserTime(b.start_time) - toUserTime(a.start_time) - ); - - // Group upcoming by series title+tvg_id (keep only next episode) - const grouped = new Map(); - const upcomingDedup = dedupeByProgramOrSlot(upcoming).sort( - (a, b) => toUserTime(a.start_time) - toUserTime(b.start_time) - ); - for (const rec of upcomingDedup) { - const cp = rec.custom_properties || {}; - const prog = cp.program || {}; - const key = `${prog.tvg_id || ''}|${(prog.title || '').toLowerCase()}`; - if (!grouped.has(key)) { - grouped.set(key, { rec, count: 1 }); - } else { - const entry = grouped.get(key); - entry.count += 1; - } - } - const upcomingGrouped = Array.from(grouped.values()).map((e) => { - const item = { ...e.rec }; - item._group_count = e.count; - return item; - }); - completed.sort((a, b) => toUserTime(b.end_time) - toUserTime(a.end_time)); - return { - inProgress: inProgressDedup, - upcoming: upcomingGrouped, - completed, - }; + return categorizeRecordings(recordings, toUserTime, now); }, [recordings, now, toUserTime]); + const handleOnWatchLive = () => { + const rec = detailsRecording; + const now = userNow(); + const s = toUserTime(rec.start_time); + const e = toUserTime(rec.end_time); + if(isAfter(now, s) && isBefore(now, e)) { + // call into child RecordingCard behavior by constructing a URL like there + const channel = channels[rec.channel]; + if (!channel) return; + const url = getShowVideoUrl(channel, useSettingsStore.getState().environment.env_mode); + useVideoStore.getState().showVideo(url, 'live'); + } + } + + const handleOnWatchRecording = () => { + const url = getRecordingUrl( + detailsRecording.custom_properties, 
useSettingsStore.getState().environment.env_mode); + if(!url) return; + useVideoStore.getState().showVideo(url, 'vod', { + name: + detailsRecording.custom_properties?.program?.title || + 'Recording', + logo: { + url: getPosterUrl( + detailsRecording.custom_properties?.poster_logo_id, + undefined, + channels[detailsRecording.channel]?.logo?.cache_url + ) + }, + }); + } return ( - + - +
Currently Recording @@ -1550,14 +170,11 @@ const DVRPage = () => { { maxWidth: '36rem', cols: 1 }, ]} > - {inProgress.map((rec) => ( - - ))} + {} {inProgress.length === 0 && ( Nothing recording right now. @@ -1579,14 +196,11 @@ const DVRPage = () => { { maxWidth: '36rem', cols: 1 }, ]} > - {upcoming.map((rec) => ( - - ))} + {} {upcoming.length === 0 && ( No upcoming recordings. @@ -1608,14 +222,11 @@ const DVRPage = () => { { maxWidth: '36rem', cols: 1 }, ]} > - {completed.map((rec) => ( - - ))} + {} {completed.length === 0 && ( No completed recordings yet. @@ -1648,64 +259,28 @@ const DVRPage = () => { {/* Details Modal */} {detailsRecording && ( - { - const rec = detailsRecording; - const now = userNow(); - const s = toUserTime(rec.start_time); - const e = toUserTime(rec.end_time); - if (now.isAfter(s) && now.isBefore(e)) { - // call into child RecordingCard behavior by constructing a URL like there - const channel = channels[rec.channel]; - if (!channel) return; - let url = `/proxy/ts/stream/${channel.uuid}`; - if (useSettingsStore.getState().environment.env_mode === 'dev') { - url = `${window.location.protocol}//${window.location.hostname}:5656${url}`; - } - useVideoStore.getState().showVideo(url, 'live'); - } - }} - onWatchRecording={() => { - let fileUrl = - detailsRecording.custom_properties?.file_url || - detailsRecording.custom_properties?.output_file_url; - if (!fileUrl) return; - if ( - useSettingsStore.getState().environment.env_mode === 'dev' && - fileUrl.startsWith('/') - ) { - fileUrl = `${window.location.protocol}//${window.location.hostname}:5656${fileUrl}`; - } - useVideoStore.getState().showVideo(fileUrl, 'vod', { - name: - detailsRecording.custom_properties?.program?.title || - 'Recording', - logo: { - url: - (detailsRecording.custom_properties?.poster_logo_id - ? `/api/channels/logos/${detailsRecording.custom_properties.poster_logo_id}/cache/` - : channels[detailsRecording.channel]?.logo?.cache_url) || - '/logo.png', - }, - }); - }} - onEdit={(rec) => { - setEditRecording(rec); - closeDetails(); - }} - /> + + Loading...}> + { + setEditRecording(rec); + closeDetails(); + }} + /> + + )} ); diff --git a/frontend/src/pages/Dashboard.jsx b/frontend/src/pages/Dashboard.jsx deleted file mode 100644 index c3c0fb61..00000000 --- a/frontend/src/pages/Dashboard.jsx +++ /dev/null @@ -1,27 +0,0 @@ -// src/components/Dashboard.js -import React, { useState } from 'react'; - -const Dashboard = () => { - const [newStream, setNewStream] = useState(''); - - return ( -
-

Dashboard Page

- setNewStream(e.target.value)} - placeholder="Enter Stream" - /> - -

Streams:

-
    - {state.streams.map((stream, index) => ( -
  • {stream}
  • - ))} -
-
- ); -}; - -export default Dashboard; diff --git a/frontend/src/pages/Guide.jsx b/frontend/src/pages/Guide.jsx index dbeaf431..ac0fdf82 100644 --- a/frontend/src/pages/Guide.jsx +++ b/frontend/src/pages/Guide.jsx @@ -5,248 +5,94 @@ import React, { useEffect, useRef, useCallback, + Suspense, } from 'react'; -import dayjs from 'dayjs'; -import API from '../api'; import useChannelsStore from '../store/channels'; import useLogosStore from '../store/logos'; -import logo from '../images/logo.png'; import useVideoStore from '../store/useVideoStore'; // NEW import -import { notifications } from '@mantine/notifications'; import useSettingsStore from '../store/settings'; import { - Title, - Box, - Flex, - Button, - Text, - Paper, - Group, - TextInput, - Select, ActionIcon, + Box, + Button, + Flex, + Group, + LoadingOverlay, + Paper, + Select, + Text, + TextInput, + Title, Tooltip, - Transition, - Modal, - Stack, } from '@mantine/core'; -import { Search, X, Clock, Video, Calendar, Play } from 'lucide-react'; +import { Calendar, Clock, Search, Video, X } from 'lucide-react'; import './guide.css'; import useEPGsStore from '../store/epgs'; -import useLocalStorage from '../hooks/useLocalStorage'; import { useElementSize } from '@mantine/hooks'; import { VariableSizeList } from 'react-window'; import { - PROGRAM_HEIGHT, - EXPANDED_PROGRAM_HEIGHT, buildChannelIdMap, - mapProgramsByChannel, + calculateDesiredScrollPosition, + calculateEarliestProgramStart, + calculateEnd, + calculateHourTimeline, + calculateLatestProgramEnd, + calculateLeftScrollPosition, + calculateNowPosition, + calculateScrollPosition, + calculateScrollPositionByTimeClick, + calculateStart, + CHANNEL_WIDTH, computeRowHeights, + createRecording, + createSeriesRule, + evaluateSeriesRule, + EXPANDED_PROGRAM_HEIGHT, + fetchPrograms, + fetchRules, + filterGuideChannels, + formatTime, + getGroupOptions, + getProfileOptions, + getRuleByProgram, + HOUR_WIDTH, + mapChannelsById, + mapProgramsByChannel, + mapRecordingsByProgramId, + matchChannelByTvgId, + MINUTE_BLOCK_WIDTH, + MINUTE_INCREMENT, + PROGRAM_HEIGHT, + sortChannels, } from './guideUtils'; - -/** Layout constants */ -const CHANNEL_WIDTH = 120; // Width of the channel/logo column -const HOUR_WIDTH = 450; // Increased from 300 to 450 to make each program wider -const MINUTE_INCREMENT = 15; // For positioning programs every 15 min -const MINUTE_BLOCK_WIDTH = HOUR_WIDTH / (60 / MINUTE_INCREMENT); - -const GuideRow = React.memo(({ index, style, data }) => { - const { - filteredChannels, - programsByChannelId, - expandedProgramId, - rowHeights, - logos, - hoveredChannelId, - setHoveredChannelId, - renderProgram, - handleLogoClick, - contentWidth, - } = data; - - const channel = filteredChannels[index]; - if (!channel) { - return null; - } - - const channelPrograms = programsByChannelId.get(channel.id) || []; - const rowHeight = - rowHeights[index] ?? - (channelPrograms.some((program) => program.id === expandedProgramId) - ? EXPANDED_PROGRAM_HEIGHT - : PROGRAM_HEIGHT); - - return ( -
- - handleLogoClick(channel, event)} - onMouseEnter={() => setHoveredChannelId(channel.id)} - onMouseLeave={() => setHoveredChannelId(null)} - > - {hoveredChannelId === channel.id && ( - - - - )} - - - - {channel.name} - - - - {channel.channel_number || '-'} - - - - - - {channelPrograms.length > 0 ? ( - channelPrograms.map((program) => - renderProgram(program, undefined, channel) - ) - ) : ( - <> - {Array.from({ length: Math.ceil(24 / 2) }).map( - (_, placeholderIndex) => ( - - No program data - - ) - )} - - )} - - -
- ); -}); +import { + getShowVideoUrl, +} from '../utils/cards/RecordingCardUtils.js'; +import { + add, + convertToMs, + format, + getNow, + initializeTime, + startOfDay, + useDateTimeFormat, +} from '../utils/dateTimeUtils.js'; +import GuideRow from '../components/GuideRow.jsx'; +import HourTimeline from '../components/HourTimeline'; +const ProgramRecordingModal = React.lazy(() => + import('../components/forms/ProgramRecordingModal')); +const SeriesRecordingModal = React.lazy(() => + import('../components/forms/SeriesRecordingModal')); +import { showNotification } from '../utils/notificationUtils.js'; +import ErrorBoundary from '../components/ErrorBoundary.jsx'; export default function TVChannelGuide({ startDate, endDate }) { const channels = useChannelsStore((s) => s.channels); const recordings = useChannelsStore((s) => s.recordings); const channelGroups = useChannelsStore((s) => s.channelGroups); const profiles = useChannelsStore((s) => s.profiles); + const isLoading = useChannelsStore((s) => s.isLoading); + const [isProgramsLoading, setIsProgramsLoading] = useState(true); const logos = useLogosStore((s) => s.logos); const tvgsById = useEPGsStore((s) => s.tvgsById); @@ -254,8 +100,7 @@ export default function TVChannelGuide({ startDate, endDate }) { const [programs, setPrograms] = useState([]); const [guideChannels, setGuideChannels] = useState([]); - const [filteredChannels, setFilteredChannels] = useState([]); - const [now, setNow] = useState(dayjs()); + const [now, setNow] = useState(getNow()); const [expandedProgramId, setExpandedProgramId] = useState(null); // Track expanded program const [recordingForProgram, setRecordingForProgram] = useState(null); const [recordChoiceOpen, setRecordChoiceOpen] = useState(false); @@ -290,81 +135,38 @@ export default function TVChannelGuide({ startDate, endDate }) { // Load program data once useEffect(() => { - if (!Object.keys(channels).length === 0) { + if (Object.keys(channels).length === 0) { console.warn('No channels provided or empty channels array'); - notifications.show({ title: 'No channels available', color: 'red.5' }); + showNotification({ title: 'No channels available', color: 'red.5' }); + setIsProgramsLoading(false); return; } - const fetchPrograms = async () => { - console.log('Fetching program grid...'); - const fetched = await API.getGrid(); // GETs your EPG grid - console.log(`Received ${fetched.length} programs`); + const sortedChannels = sortChannels(channels); + setGuideChannels(sortedChannels); - // Include ALL channels, sorted by channel number - don't filter by EPG data - const sortedChannels = Object.values(channels).sort( - (a, b) => - (a.channel_number || Infinity) - (b.channel_number || Infinity) - ); - - console.log(`Using all ${sortedChannels.length} available channels`); - - const processedPrograms = fetched.map((program) => { - const start = dayjs(program.start_time); - const end = dayjs(program.end_time); - return { - ...program, - startMs: start.valueOf(), - endMs: end.valueOf(), - }; + fetchPrograms() + .then((data) => { + setPrograms(data); + setIsProgramsLoading(false); + }) + .catch((error) => { + console.error('Failed to fetch programs:', error); + setIsProgramsLoading(false); }); - - setGuideChannels(sortedChannels); - setFilteredChannels(sortedChannels); // Initialize filtered channels - setPrograms(processedPrograms); - }; - - fetchPrograms(); }, [channels]); // Apply filters when search, group, or profile changes - useEffect(() => { - if (!guideChannels.length) return; + const filteredChannels = 
useMemo(() => { + if (!guideChannels.length) return []; - let result = [...guideChannels]; - - // Apply search filter - if (searchQuery) { - const query = searchQuery.toLowerCase(); - result = result.filter((channel) => - channel.name.toLowerCase().includes(query) - ); - } - - // Apply channel group filter - if (selectedGroupId !== 'all') { - result = result.filter( - (channel) => channel.channel_group_id === parseInt(selectedGroupId) - ); - } - - // Apply profile filter - if (selectedProfileId !== 'all') { - // Get the profile's enabled channels - const profileChannels = profiles[selectedProfileId]?.channels || []; - // Check if channels is a Set (from the error message, it likely is) - const enabledChannelIds = Array.isArray(profileChannels) - ? profileChannels.filter((pc) => pc.enabled).map((pc) => pc.id) - : profiles[selectedProfileId]?.channels instanceof Set - ? Array.from(profiles[selectedProfileId].channels) - : []; - - result = result.filter((channel) => - enabledChannelIds.includes(channel.id) - ); - } - - setFilteredChannels(result); + return filterGuideChannels( + guideChannels, + searchQuery, + selectedGroupId, + selectedProfileId, + profiles + ); }, [ searchQuery, selectedGroupId, @@ -374,61 +176,44 @@ export default function TVChannelGuide({ startDate, endDate }) { ]); // Use start/end from props or default to "today at midnight" +24h - const defaultStart = dayjs(startDate || dayjs().startOf('day')); - const defaultEnd = endDate ? dayjs(endDate) : defaultStart.add(24, 'hour'); + const defaultStart = initializeTime(startDate || startOfDay(getNow())); + const defaultEnd = endDate + ? initializeTime(endDate) + : add(defaultStart, 24, 'hour'); // Expand timeline if needed based on actual earliest/ latest program - const earliestProgramStart = useMemo(() => { - if (!programs.length) return defaultStart; - return programs.reduce((acc, p) => { - const s = dayjs(p.start_time); - return s.isBefore(acc) ? s : acc; - }, defaultStart); - }, [programs, defaultStart]); + const earliestProgramStart = useMemo( + () => calculateEarliestProgramStart(programs, defaultStart), + [programs, defaultStart] + ); - const latestProgramEnd = useMemo(() => { - if (!programs.length) return defaultEnd; - return programs.reduce((acc, p) => { - const e = dayjs(p.end_time); - return e.isAfter(acc) ? e : acc; - }, defaultEnd); - }, [programs, defaultEnd]); + const latestProgramEnd = useMemo( + () => calculateLatestProgramEnd(programs, defaultEnd), + [programs, defaultEnd] + ); - const start = earliestProgramStart.isBefore(defaultStart) - ? earliestProgramStart - : defaultStart; - const end = latestProgramEnd.isAfter(defaultEnd) - ? 
latestProgramEnd - : defaultEnd; + const start = calculateStart(earliestProgramStart, defaultStart); + const end = calculateEnd(latestProgramEnd, defaultEnd); const channelIdByTvgId = useMemo( () => buildChannelIdMap(guideChannels, tvgsById, epgs), [guideChannels, tvgsById, epgs] ); - const channelById = useMemo(() => { - const map = new Map(); - guideChannels.forEach((channel) => { - map.set(channel.id, channel); - }); - return map; - }, [guideChannels]); + const channelById = useMemo( + () => mapChannelsById(guideChannels), + [guideChannels] + ); const programsByChannelId = useMemo( () => mapProgramsByChannel(programs, channelIdByTvgId), [programs, channelIdByTvgId] ); - const recordingsByProgramId = useMemo(() => { - const map = new Map(); - (recordings || []).forEach((recording) => { - const programId = recording?.custom_properties?.program?.id; - if (programId != null) { - map.set(programId, recording); - } - }); - return map; - }, [recordings]); + const recordingsByProgramId = useMemo( + () => mapRecordingsByProgramId(recordings), + [recordings] + ); const rowHeights = useMemo( () => @@ -445,62 +230,19 @@ export default function TVChannelGuide({ startDate, endDate }) { [rowHeights] ); - const [timeFormatSetting] = useLocalStorage('time-format', '12h'); - const [dateFormatSetting] = useLocalStorage('date-format', 'mdy'); - // Use user preference for time format - const timeFormat = timeFormatSetting === '12h' ? 'h:mm A' : 'HH:mm'; - const dateFormat = dateFormatSetting === 'mdy' ? 'MMMM D' : 'D MMMM'; + const [timeFormat, dateFormat] = useDateTimeFormat(); // Format day label using relative terms when possible (Today, Tomorrow, etc) const formatDayLabel = useCallback( - (time) => { - const today = dayjs().startOf('day'); - const tomorrow = today.add(1, 'day'); - const weekLater = today.add(7, 'day'); - - const day = time.startOf('day'); - - if (day.isSame(today, 'day')) { - return 'Today'; - } else if (day.isSame(tomorrow, 'day')) { - return 'Tomorrow'; - } else if (day.isBefore(weekLater)) { - // Within a week, show day name - return time.format('dddd'); - } else { - // Beyond a week, show month and day - return time.format(dateFormat); - } - }, + (time) => formatTime(time, dateFormat), [dateFormat] ); // Hourly marks with day labels - const hourTimeline = useMemo(() => { - const hours = []; - let current = start; - let currentDay = null; - - while (current.isBefore(end)) { - // Check if we're entering a new day - const day = current.startOf('day'); - const isNewDay = !currentDay || !day.isSame(currentDay, 'day'); - - if (isNewDay) { - currentDay = day; - } - - // Add day information to our hour object - hours.push({ - time: current, - isNewDay, - dayLabel: formatDayLabel(current), - }); - - current = current.add(1, 'hour'); - } - return hours; - }, [start, end, formatDayLabel]); + const hourTimeline = useMemo( + () => calculateHourTimeline(start, end, formatDayLabel), + [start, end, formatDayLabel] + ); useEffect(() => { const node = guideRef.current; @@ -542,17 +284,16 @@ export default function TVChannelGuide({ startDate, endDate }) { // Update "now" every second useEffect(() => { const interval = setInterval(() => { - setNow(dayjs()); + setNow(getNow()); }, 1000); return () => clearInterval(interval); }, []); // Pixel offset for the "now" vertical line - const nowPosition = useMemo(() => { - if (now.isBefore(start) || now.isAfter(end)) return -1; - const minutesSinceStart = now.diff(start, 'minute'); - return (minutesSinceStart / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH; - }, [now, 
start, end]); + const nowPosition = useMemo( + () => calculateNowPosition(now, start, end), + [now, start, end] + ); useEffect(() => { const tvGuide = tvGuideRef.current; @@ -765,31 +506,14 @@ export default function TVChannelGuide({ startDate, endDate }) { // Scroll to the nearest half-hour mark ONLY on initial load useEffect(() => { if (programs.length > 0 && !initialScrollComplete) { - const roundedNow = - now.minute() < 30 - ? now.startOf('hour') - : now.startOf('hour').add(30, 'minute'); - const nowOffset = roundedNow.diff(start, 'minute'); - const scrollPosition = - (nowOffset / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH - - MINUTE_BLOCK_WIDTH; - - const scrollPos = Math.max(scrollPosition, 0); - syncScrollLeft(scrollPos); + syncScrollLeft(calculateScrollPosition(now, start)); setInitialScrollComplete(true); } }, [programs, start, now, initialScrollComplete, syncScrollLeft]); const findChannelByTvgId = useCallback( - (tvgId) => { - const channelIds = channelIdByTvgId.get(String(tvgId)); - if (!channelIds || channelIds.length === 0) { - return null; - } - // Return the first channel that matches this TVG ID - return channelById.get(channelIds[0]) || null; - }, + (tvgId) => matchChannelByTvgId(channelIdByTvgId, channelById, tvgId), [channelById, channelIdByTvgId] ); @@ -798,19 +522,14 @@ export default function TVChannelGuide({ startDate, endDate }) { setRecordChoiceProgram(program); setRecordChoiceOpen(true); try { - const rules = await API.listSeriesRules(); - const rule = (rules || []).find( - (r) => - String(r.tvg_id) === String(program.tvg_id) && - (!r.title || r.title === program.title) - ); + const rules = await fetchRules(); + const rule = getRuleByProgram(rules, program); setExistingRuleMode(rule ? rule.mode : null); } catch (error) { console.warn('Failed to fetch series rules metadata', error); } - const existingRecording = recordingsByProgramId.get(program.id) || null; - setRecordingForProgram(existingRecording); + setRecordingForProgram(recordingsByProgramId.get(program.id) || null); }, [recordingsByProgramId] ); @@ -819,7 +538,7 @@ export default function TVChannelGuide({ startDate, endDate }) { async (program) => { const channel = findChannelByTvgId(program.tvg_id); if (!channel) { - notifications.show({ + showNotification({ title: 'Unable to schedule recording', message: 'No channel found for this program.', color: 'red.6', @@ -827,24 +546,15 @@ export default function TVChannelGuide({ startDate, endDate }) { return; } - await API.createRecording({ - channel: `${channel.id}`, - start_time: program.start_time, - end_time: program.end_time, - custom_properties: { program }, - }); - notifications.show({ title: 'Recording scheduled' }); + await createRecording(channel, program); + showNotification({ title: 'Recording scheduled' }); }, [findChannelByTvgId] ); const saveSeriesRule = useCallback(async (program, mode) => { - await API.createSeriesRule({ - tvg_id: program.tvg_id, - mode, - title: program.title, - }); - await API.evaluateSeriesRules(program.tvg_id); + await createSeriesRule(program, mode); + await evaluateSeriesRule(program); try { await useChannelsStore.getState().fetchRecordings(); } catch (error) { @@ -853,7 +563,7 @@ export default function TVChannelGuide({ startDate, endDate }) { error ); } - notifications.show({ + showNotification({ title: mode === 'new' ? 
'Record new episodes' : 'Record all episodes', }); }, []); @@ -861,7 +571,7 @@ export default function TVChannelGuide({ startDate, endDate }) { const openRules = useCallback(async () => { setRulesOpen(true); try { - const r = await API.listSeriesRules(); + const r = await fetchRules(); setRules(r); } catch (error) { console.warn('Failed to load series rules', error); @@ -878,12 +588,7 @@ export default function TVChannelGuide({ startDate, endDate }) { return; } - let vidUrl = `/proxy/ts/stream/${matched.uuid}`; - if (env_mode === 'dev') { - vidUrl = `${window.location.protocol}//${window.location.hostname}:5656${vidUrl}`; - } - - showVideo(vidUrl); + showVideo(getShowVideoUrl(matched, env_mode)); }, [env_mode, findChannelByTvgId, showVideo] ); @@ -892,12 +597,7 @@ export default function TVChannelGuide({ startDate, endDate }) { (channel, event) => { event.stopPropagation(); - let vidUrl = `/proxy/ts/stream/${channel.uuid}`; - if (env_mode === 'dev') { - vidUrl = `${window.location.protocol}//${window.location.hostname}:5656${vidUrl}`; - } - - showVideo(vidUrl); + showVideo(getShowVideoUrl(channel, env_mode)); }, [env_mode, showVideo] ); @@ -906,13 +606,6 @@ export default function TVChannelGuide({ startDate, endDate }) { (program, event) => { event.stopPropagation(); - const programStartMs = - program.startMs ?? dayjs(program.start_time).valueOf(); - const startOffsetMinutes = (programStartMs - start.valueOf()) / 60000; - const leftPx = - (startOffsetMinutes / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH; - const desiredScrollPosition = Math.max(0, leftPx - 20); - if (expandedProgramId === program.id) { setExpandedProgramId(null); setRecordingForProgram(null); @@ -921,6 +614,9 @@ export default function TVChannelGuide({ startDate, endDate }) { setRecordingForProgram(recordingsByProgramId.get(program.id) || null); } + const leftPx = calculateLeftScrollPosition(program, start); + const desiredScrollPosition = calculateDesiredScrollPosition(leftPx); + const guideNode = guideRef.current; if (guideNode) { const currentScrollPosition = guideNode.scrollLeft; @@ -948,16 +644,7 @@ export default function TVChannelGuide({ startDate, endDate }) { return; } - const roundedNow = - now.minute() < 30 - ? 
now.startOf('hour') - : now.startOf('hour').add(30, 'minute'); - const nowOffset = roundedNow.diff(start, 'minute'); - const scrollPosition = - (nowOffset / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH - MINUTE_BLOCK_WIDTH; - - const scrollPos = Math.max(scrollPosition, 0); - syncScrollLeft(scrollPos, 'smooth'); + syncScrollLeft(calculateScrollPosition(now, start), 'smooth'); }, [now, nowPosition, start, syncScrollLeft]); const handleTimelineScroll = useCallback(() => { @@ -1000,44 +687,26 @@ export default function TVChannelGuide({ startDate, endDate }) { const handleTimeClick = useCallback( (clickedTime, event) => { - const rect = event.currentTarget.getBoundingClientRect(); - const clickPositionX = event.clientX - rect.left; - const percentageAcross = clickPositionX / rect.width; - const minuteWithinHour = Math.floor(percentageAcross * 60); - - let snappedMinute; - if (minuteWithinHour < 7.5) { - snappedMinute = 0; - } else if (minuteWithinHour < 22.5) { - snappedMinute = 15; - } else if (minuteWithinHour < 37.5) { - snappedMinute = 30; - } else if (minuteWithinHour < 52.5) { - snappedMinute = 45; - } else { - snappedMinute = 0; - clickedTime = clickedTime.add(1, 'hour'); - } - - const snappedTime = clickedTime.minute(snappedMinute); - const snappedOffset = snappedTime.diff(start, 'minute'); - const scrollPosition = - (snappedOffset / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH; - - syncScrollLeft(scrollPosition, 'smooth'); + syncScrollLeft( + calculateScrollPositionByTimeClick(event, clickedTime, start), + 'smooth' + ); }, [start, syncScrollLeft] ); const renderProgram = useCallback( (program, channelStart = start, channel = null) => { - const programStartMs = - program.startMs ?? dayjs(program.start_time).valueOf(); - const programEndMs = program.endMs ?? dayjs(program.end_time).valueOf(); - const programStart = dayjs(programStartMs); - const programEnd = dayjs(programEndMs); + const { + programStart, + programEnd, + startMs: programStartMs, + endMs: programEndMs, + isLive, + isPast, + } = program; const startOffsetMinutes = - (programStartMs - channelStart.valueOf()) / 60000; + (programStartMs - convertToMs(channelStart)) / 60000; const durationMinutes = (programEndMs - programStartMs) / 60000; const leftPx = (startOffsetMinutes / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH; @@ -1048,10 +717,7 @@ export default function TVChannelGuide({ startDate, endDate }) { const recording = recordingsByProgramId.get(program.id); - const isLive = now.isAfter(programStart) && now.isBefore(programEnd); - const isPast = now.isAfter(programEnd); const isExpanded = expandedProgramId === program.id; - const rowHeight = isExpanded ? EXPANDED_PROGRAM_HEIGHT : PROGRAM_HEIGHT; const MIN_EXPANDED_WIDTH = 450; const expandedWidthPx = Math.max(widthPx, MIN_EXPANDED_WIDTH); @@ -1069,36 +735,61 @@ export default function TVChannelGuide({ startDate, endDate }) { textOffsetLeft = Math.min(visibleStart, maxOffset); } + const RecordButton = () => { + return ( + + ); + }; + const WatchNow = () => { + return ( + + ); + }; return ( handleProgramClick(program, event)} > {recording && ( @@ -1154,8 +850,8 @@ export default function TVChannelGuide({ startDate, endDate }) { overflow: 'hidden', }} > - {programStart.format(timeFormat)} -{' '} - {programEnd.format(timeFormat)} + {format(programStart, timeFormat)} -{' '} + {format(programEnd, timeFormat)}
@@ -1169,13 +865,13 @@ export default function TVChannelGuide({ startDate, endDate }) { {program.description} @@ -1183,37 +879,11 @@ export default function TVChannelGuide({ startDate, endDate }) { )} {isExpanded && ( - + - {!isPast && ( - - )} + {!isPast && } - {isLive && ( - - )} + {isLive && } )} @@ -1296,49 +966,13 @@ export default function TVChannelGuide({ startDate, endDate }) { }, [searchQuery, selectedGroupId, selectedProfileId]); // Create group options for dropdown - but only include groups used by guide channels - const groupOptions = useMemo(() => { - const options = [{ value: 'all', label: 'All Channel Groups' }]; - - if (channelGroups && guideChannels.length > 0) { - // Get unique channel group IDs from the channels that have program data - const usedGroupIds = new Set(); - guideChannels.forEach((channel) => { - if (channel.channel_group_id) { - usedGroupIds.add(channel.channel_group_id); - } - }); - // Only add groups that are actually used by channels in the guide - Object.values(channelGroups) - .filter((group) => usedGroupIds.has(group.id)) - .sort((a, b) => a.name.localeCompare(b.name)) // Sort alphabetically - .forEach((group) => { - options.push({ - value: group.id.toString(), - label: group.name, - }); - }); - } - return options; - }, [channelGroups, guideChannels]); + const groupOptions = useMemo( + () => getGroupOptions(channelGroups, guideChannels), + [channelGroups, guideChannels] + ); // Create profile options for dropdown - const profileOptions = useMemo(() => { - const options = [{ value: 'all', label: 'All Profiles' }]; - - if (profiles) { - Object.values(profiles).forEach((profile) => { - if (profile.id !== '0') { - // Skip the 'All' default profile - options.push({ - value: profile.id.toString(), - label: profile.name, - }); - } - }); - } - - return options; - }, [profiles]); + const profileOptions = useMemo(() => getProfileOptions(profiles), [profiles]); // Clear all filters const clearFilters = () => { @@ -1357,40 +991,45 @@ export default function TVChannelGuide({ startDate, endDate }) { setSelectedProfileId(value || 'all'); }; + const handleClearSearchQuery = () => { + setSearchQuery(''); + }; + const handleChangeSearchQuery = (e) => { + setSearchQuery(e.target.value); + }; + return ( {/* Sticky top bar */} {/* Title and current time */} - + <Title order={3} fw={'bold'}> TV Guide - {now.format(`dddd, ${dateFormat}, YYYY • ${timeFormat}`)} + {format(now, `dddd, ${dateFormat}, YYYY • ${timeFormat}`)} setSearchQuery(e.target.value)} - style={{ width: '250px' }} // Reduced width from flex: 1 + onChange={handleChangeSearchQuery} + w={'250px'} // Reduced width from flex: 1 leftSection={} rightSection={ searchQuery ? 
( setSearchQuery('')} + onClick={handleClearSearchQuery} variant="subtle" color="gray" size="sm" @@ -1433,7 +1072,7 @@ export default function TVChannelGuide({ startDate, endDate }) { data={groupOptions} value={selectedGroupId} onChange={handleGroupChange} // Use the new handler - style={{ width: '220px' }} + w={'220px'} clearable={true} // Allow clearing the selection /> @@ -1442,7 +1081,7 @@ export default function TVChannelGuide({ startDate, endDate }) { data={profileOptions} value={selectedProfileId} onChange={handleProfileChange} // Use the new handler - style={{ width: '180px' }} + w={'180px'} clearable={true} // Allow clearing the selection /> @@ -1460,14 +1099,14 @@ export default function TVChannelGuide({ startDate, endDate }) { onClick={openRules} style={{ backgroundColor: '#245043', - border: '1px solid #3BA882', - color: '#FFFFFF', }} + bd={'1px solid #3BA882'} + color='#FFFFFF' > Series Rules - + {filteredChannels.length}{' '} {filteredChannels.length === 1 ? 'channel' : 'channels'} @@ -1477,34 +1116,34 @@ export default function TVChannelGuide({ startDate, endDate }) { {/* Guide container with headers and scrollable content */} {/* Logo header - Sticky, non-scrollable */} {/* Logo header cell - sticky in both directions */} {/* Timeline header with its own scrollbar */} @@ -1512,122 +1151,33 @@ export default function TVChannelGuide({ startDate, endDate }) { style={{ flex: 1, overflow: 'hidden', - position: 'relative', }} + pos='relative' > - {' '} - {hourTimeline.map((hourData) => { - const { time, isNewDay } = hourData; - - return ( - handleTimeClick(time, e)} - > - {/* Remove the special day label for new days since we'll show day for all hours */} - - {/* Position time label at the left border of each hour block */} - - {/* Show day above time for every hour using the same format */} - - {formatDayLabel(time)}{' '} - {/* Use same formatDayLabel function for all hours */} - - {time.format(timeFormat)} - - {/*time.format('A')*/} - - - - {/* Hour boundary marker - more visible */} - - - {/* Quarter hour tick marks */} - - {[15, 30, 45].map((minute) => ( - - ))} - - - ); - })} + @@ -1638,22 +1188,23 @@ export default function TVChannelGuide({ startDate, endDate }) { ref={guideContainerRef} style={{ flex: 1, - position: 'relative', overflow: 'hidden', }} + pos='relative' > + {nowPosition >= 0 && ( )} @@ -1674,13 +1225,7 @@ export default function TVChannelGuide({ startDate, endDate }) { {GuideRow} ) : ( - + No channels match your filters - - - {recordingForProgram && ( - <> - - - - )} - {existingRuleMode && ( - - )} - - + + }> + setRecordChoiceOpen(false)} + program={recordChoiceProgram} + recording={recordingForProgram} + existingRuleMode={existingRuleMode} + onRecordOne={() => recordOne(recordChoiceProgram)} + onRecordSeriesAll={() => saveSeriesRule(recordChoiceProgram, 'all')} + onRecordSeriesNew={() => saveSeriesRule(recordChoiceProgram, 'new')} + onExistingRuleModeChange={setExistingRuleMode} + /> + + )} {/* Series rules modal */} {rulesOpen && ( - setRulesOpen(false)} - title="Series Recording Rules" - centered - radius="md" - zIndex={9999} - overlayProps={{ color: '#000', backgroundOpacity: 0.55, blur: 0 }} - styles={{ - content: { backgroundColor: '#18181B', color: 'white' }, - header: { backgroundColor: '#18181B', color: 'white' }, - title: { color: 'white' }, - }} - > - - {(!rules || rules.length === 0) && ( - - No series rules configured - - )} - {rules && - rules.map((r) => ( - - - {r.title || r.tvg_id} —{' '} - {r.mode === 'new' ? 
'New episodes' : 'Every episode'} - - - - - - - ))} - - + + }> + setRulesOpen(false)} + rules={rules} + onRulesUpdate={setRules} + /> + + )} ); diff --git a/frontend/src/pages/Home.jsx b/frontend/src/pages/Home.jsx deleted file mode 100644 index e9751d8d..00000000 --- a/frontend/src/pages/Home.jsx +++ /dev/null @@ -1,14 +0,0 @@ -// src/components/Home.js -import React, { useState } from 'react'; - -const Home = () => { - const [newChannel, setNewChannel] = useState(''); - - return ( -
-    <div>
-      <div>Home Page</div>
-    </div>
- ); -}; - -export default Home; diff --git a/frontend/src/pages/Login.jsx b/frontend/src/pages/Login.jsx index 262d4c35..3c2cf869 100644 --- a/frontend/src/pages/Login.jsx +++ b/frontend/src/pages/Login.jsx @@ -1,13 +1,21 @@ -import React from 'react'; +import React, { lazy, Suspense } from 'react'; import LoginForm from '../components/forms/LoginForm'; -import SuperuserForm from '../components/forms/SuperuserForm'; +const SuperuserForm = lazy(() => import('../components/forms/SuperuserForm')); import useAuthStore from '../store/auth'; +import ErrorBoundary from '../components/ErrorBoundary.jsx'; +import { Text } from '@mantine/core'; const Login = ({}) => { const superuserExists = useAuthStore((s) => s.superuserExists); if (!superuserExists) { - return ; + return ( + + Loading...
}> + + + + ); } return ; diff --git a/frontend/src/pages/Logos.jsx b/frontend/src/pages/Logos.jsx index 889e32c9..f95212d6 100644 --- a/frontend/src/pages/Logos.jsx +++ b/frontend/src/pages/Logos.jsx @@ -1,34 +1,34 @@ import React, { useEffect, useCallback, useState } from 'react'; -import { Box, Tabs, Flex, Text } from '@mantine/core'; -import { notifications } from '@mantine/notifications'; +import { Box, Tabs, Flex, Text, TabsList, TabsTab } from '@mantine/core'; import useLogosStore from '../store/logos'; import useVODLogosStore from '../store/vodLogos'; import LogosTable from '../components/tables/LogosTable'; import VODLogosTable from '../components/tables/VODLogosTable'; +import { showNotification } from '../utils/notificationUtils.js'; const LogosPage = () => { - const { fetchAllLogos, needsAllLogos, logos } = useLogosStore(); - const { totalCount } = useVODLogosStore(); + const logos = useLogosStore(s => s.logos); + const totalCount = useVODLogosStore(s => s.totalCount); const [activeTab, setActiveTab] = useState('channel'); - - const channelLogosCount = Object.keys(logos).length; - const vodLogosCount = totalCount; + const logoCount = activeTab === 'channel' + ? Object.keys(logos).length + : totalCount; const loadChannelLogos = useCallback(async () => { try { // Only fetch all logos if we haven't loaded them yet - if (needsAllLogos()) { - await fetchAllLogos(); + if (useLogosStore.getState().needsAllLogos()) { + await useLogosStore.getState().fetchAllLogos(); } } catch (err) { - notifications.show({ + showNotification({ title: 'Error', message: 'Failed to load channel logos', color: 'red', }); console.error('Failed to load channel logos:', err); } - }, [fetchAllLogos, needsAllLogos]); + }, []); useEffect(() => { // Always load channel logos on mount @@ -39,51 +39,41 @@ const LogosPage = () => { {/* Header with title and tabs */} Logos - ({activeTab === 'channel' ? channelLogosCount : vodLogosCount}{' '} - logo - {(activeTab === 'channel' ? channelLogosCount : vodLogosCount) !== - 1 - ? 's' - : ''} - ) + ({logoCount} {logoCount !== 1 ? 
'logos' : 'logo'}) - - Channel Logos - VOD Logos - + + Channel Logos + VOD Logos + diff --git a/frontend/src/pages/Plugins.jsx b/frontend/src/pages/Plugins.jsx index f2902523..21df7faf 100644 --- a/frontend/src/pages/Plugins.jsx +++ b/frontend/src/pages/Plugins.jsx @@ -1,353 +1,108 @@ -import React, { useEffect, useState } from 'react'; +import React, { + Suspense, + useCallback, + useEffect, + useRef, + useState, +} from 'react'; import { - AppShell, - Box, + ActionIcon, Alert, + AppShellMain, + Box, Button, - Card, + Divider, + FileInput, Group, Loader, + Modal, + SimpleGrid, Stack, Switch, Text, - TextInput, - NumberInput, - Select, - Divider, - ActionIcon, - SimpleGrid, - Modal, - FileInput, } from '@mantine/core'; import { Dropzone } from '@mantine/dropzone'; -import { RefreshCcw, Trash2 } from 'lucide-react'; -import API from '../api'; -import { notifications } from '@mantine/notifications'; +import { showNotification, updateNotification, } from '../utils/notificationUtils.js'; +import { usePluginStore } from '../store/plugins.jsx'; +import { + deletePluginByKey, + importPlugin, + runPluginAction, + setPluginEnabled, + updatePluginSettings, +} from '../utils/pages/PluginsUtils.js'; +import { RefreshCcw } from 'lucide-react'; +import ErrorBoundary from '../components/ErrorBoundary.jsx'; +const PluginCard = React.lazy(() => + import('../components/cards/PluginCard.jsx')); -const Field = ({ field, value, onChange }) => { - const common = { label: field.label, description: field.help_text }; - const effective = value ?? field.default; - switch (field.type) { - case 'boolean': - return ( - onChange(field.id, e.currentTarget.checked)} - label={field.label} - description={field.help_text} - /> - ); - case 'number': - return ( - onChange(field.id, v)} - {...common} - /> - ); - case 'select': - return ( - onUISettingsChange('table-size', val)} - data={[ - { - value: 'default', - label: 'Default', - }, - { - value: 'compact', - label: 'Compact', - }, - { - value: 'large', - label: 'Large', - }, - ]} - /> - onUISettingsChange('date-format', val)} - data={[ - { - value: 'mdy', - label: 'MM/DD/YYYY', - }, - { - value: 'dmy', - label: 'DD/MM/YYYY', - }, - ]} - /> - ({ - value: `${option.id}`, - label: option.name, - }))} - /> - ({ - label: r.label, - value: `${r.value}`, - }))} - /> + + DVR + + + }> + + + + + - - - Auto-Import Mapped Files - - - + + Stream Settings + + + }> + + + + + - + + System Settings + + + }> + + + + + - {rehashSuccess && ( - - )} + + User-Agents + + + }> + + + + + - - - - - - - + + Stream Profiles + + + }> + + + + + - - System Settings - - - {generalSettingsSaved && ( - - )} - - Configure how many system events (channel start/stop, - buffering, etc.) to keep in the database. Events are - displayed on the Stats page. - - { - form.setFieldValue('max-system-events', value); - }} - min={10} - max={1000} - step={10} - /> - - - - - - - - - User-Agents - - - - - - - Stream Profiles - - - - - - - + + Network Access - {accordianValue == 'network-access' && ( + {accordianValue === 'network-access' && ( Comma-Delimited CIDR ranges )} - - -
- - {networkAccessSaved && ( - - )} - {networkAccessError && ( - - )} - {Object.entries(NETWORK_ACCESS_OPTIONS).map( - ([key, config]) => { - return ( - - ); - } - )} + + + + }> + + + + + - - - - -
-
-
- - - + + Proxy Settings - - -
- - {proxySettingsSaved && ( - - )} - {Object.entries(PROXY_SETTINGS_OPTIONS).map( - ([key, config]) => { - // Determine if this field should be a NumberInput - const isNumericField = [ - 'buffering_timeout', - 'redis_chunk_ttl', - 'channel_shutdown_delay', - 'channel_init_grace_period', - ].includes(key); + + + + }> + + + + + - const isFloatField = key === 'buffering_speed'; - - if (isNumericField) { - return ( - - ); - } else if (isFloatField) { - return ( - - ); - } else { - return ( - - ); - } - } - )} - - - - - - -
-
-
- - - Backup & Restore - - - - + + Backup & Restore + + + }> + + + + + )}
- - { - setRehashConfirmOpen(false); - setRehashDialogType(null); - // Clear pending values when dialog is cancelled - setPendingChangedSettings(null); - }} - onConfirm={handleRehashConfirm} - title={ - rehashDialogType === 'save' - ? 'Save Settings and Rehash Streams' - : 'Confirm Stream Rehash' - } - message={ -
- {`Are you sure you want to rehash all streams? - -This process may take a while depending on the number of streams. -Do not shut down Dispatcharr until the rehashing is complete. -M3U refreshes will be blocked until this process finishes. - -Please ensure you have time to let this complete before proceeding.`} -
- } - confirmLabel={ - rehashDialogType === 'save' ? 'Save and Rehash' : 'Start Rehash' - } - cancelLabel="Cancel" - actionKey="rehash-streams" - onSuppressChange={suppressWarning} - size="md" - /> - - setNetworkAccessConfirmOpen(false)} - onConfirm={saveNetworkAccess} - title={`Confirm Network Access Blocks`} - message={ - <> - - Your client is not included in the allowed networks for the web - UI. Are you sure you want to proceed? - - -
    - {netNetworkAccessConfirmCIDRs.map((cidr) => ( -
  • {cidr}
  • - ))} -
- - } - confirmLabel="Save" - cancelLabel="Cancel" - size="md" - /> ); }; diff --git a/frontend/src/pages/Stats.jsx b/frontend/src/pages/Stats.jsx index e7e3043a..8ec576a8 100644 --- a/frontend/src/pages/Stats.jsx +++ b/frontend/src/pages/Stats.jsx @@ -481,8 +481,8 @@ const VODCard = ({ vodContent, stopVODClient }) => { size={16} style={{ transform: isClientExpanded - ? 'rotate(180deg)' - : 'rotate(0deg)', + ? 'rotate(0deg)' + : 'rotate(180deg)', transition: 'transform 0.2s', }} /> diff --git a/frontend/src/pages/Users.jsx b/frontend/src/pages/Users.jsx index 570e49c1..e69f07f8 100644 --- a/frontend/src/pages/Users.jsx +++ b/frontend/src/pages/Users.jsx @@ -1,55 +1,25 @@ -import React, { useState } from 'react'; import UsersTable from '../components/tables/UsersTable'; import { Box } from '@mantine/core'; import useAuthStore from '../store/auth'; -import { USER_LEVELS } from '../constants'; +import ErrorBoundary from '../components/ErrorBoundary'; -const UsersPage = () => { +const PageContent = () => { const authUser = useAuthStore((s) => s.user); - - const [selectedUser, setSelectedUser] = useState(null); - const [userModalOpen, setUserModalOpen] = useState(false); - const [confirmDeleteOpen, setConfirmDeleteOpen] = useState(false); - const [deleteTarget, setDeleteTarget] = useState(null); - const [userToDelete, setUserToDelete] = useState(null); - - if (!authUser.id) { - return <>; - } - - const closeUserModal = () => { - setSelectedUser(null); - setUserModalOpen(false); - }; - const editUser = (user) => { - setSelectedUser(user); - setUserModalOpen(true); - }; - - const deleteUser = (id) => { - // Get user details for the confirmation dialog - const user = users.find((u) => u.id === id); - setUserToDelete(user); - setDeleteTarget(id); - - // Skip warning if it's been suppressed - if (isWarningSuppressed('delete-user')) { - return executeDeleteUser(id); - } - - setConfirmDeleteOpen(true); - }; - - const executeDeleteUser = async (id) => { - await API.deleteUser(id); - setConfirmDeleteOpen(false); - }; + if (!authUser.id) throw new Error(); return ( - + ); +} + +const UsersPage = () => { + return ( + + + + ); }; export default UsersPage; diff --git a/frontend/src/pages/guideUtils.js b/frontend/src/pages/guideUtils.js index 1f4ff671..68bb74b2 100644 --- a/frontend/src/pages/guideUtils.js +++ b/frontend/src/pages/guideUtils.js @@ -1,7 +1,26 @@ -import dayjs from 'dayjs'; +import { + convertToMs, + initializeTime, + startOfDay, + isBefore, + isAfter, + isSame, + add, + diff, + format, + getNow, + getNowMs, + roundToNearest +} from '../utils/dateTimeUtils.js'; +import API from '../api.js'; export const PROGRAM_HEIGHT = 90; export const EXPANDED_PROGRAM_HEIGHT = 180; +/** Layout constants */ +export const CHANNEL_WIDTH = 120; // Width of the channel/logo column +export const HOUR_WIDTH = 450; // Increased from 300 to 450 to make each program wider +export const MINUTE_INCREMENT = 15; // For positioning programs every 15 min +export const MINUTE_BLOCK_WIDTH = HOUR_WIDTH / (60 / MINUTE_INCREMENT); export function buildChannelIdMap(channels, tvgsById, epgs = {}) { const map = new Map(); @@ -38,25 +57,32 @@ export function buildChannelIdMap(channels, tvgsById, epgs = {}) { return map; } -export function mapProgramsByChannel(programs, channelIdByTvgId) { +export const mapProgramsByChannel = (programs, channelIdByTvgId) => { if (!programs?.length || !channelIdByTvgId?.size) { return new Map(); } const map = new Map(); + const nowMs = getNowMs(); + programs.forEach((program) => { const channelIds = 
channelIdByTvgId.get(String(program.tvg_id)); if (!channelIds || channelIds.length === 0) { return; } - const startMs = program.startMs ?? dayjs(program.start_time).valueOf(); - const endMs = program.endMs ?? dayjs(program.end_time).valueOf(); + const startMs = program.startMs ?? convertToMs(program.start_time); + const endMs = program.endMs ?? convertToMs(program.end_time); const programData = { ...program, startMs, endMs, + programStart: initializeTime(program.startMs), + programEnd: initializeTime(program.endMs), + // Precompute live/past status + isLive: nowMs >= program.startMs && nowMs < program.endMs, + isPast: nowMs >= program.endMs, }; // Add this program to all channels that share the same TVG ID @@ -73,7 +99,7 @@ export function mapProgramsByChannel(programs, channelIdByTvgId) { }); return map; -} +}; export function computeRowHeights( filteredChannels, @@ -94,3 +120,282 @@ export function computeRowHeights( return expanded ? expandedHeight : defaultHeight; }); } + +export const fetchPrograms = async () => { + console.log('Fetching program grid...'); + const fetched = await API.getGrid(); // GETs your EPG grid + console.log(`Received ${fetched.length} programs`); + + return fetched.map((program) => { + return { + ...program, + startMs: convertToMs(program.start_time), + endMs: convertToMs(program.end_time), + }; + }); +}; + +export const sortChannels = (channels) => { + // Include ALL channels, sorted by channel number - don't filter by EPG data + const sortedChannels = Object.values(channels).sort( + (a, b) => + (a.channel_number || Infinity) - (b.channel_number || Infinity) + ); + + console.log(`Using all ${sortedChannels.length} available channels`); + return sortedChannels; +} + +export const filterGuideChannels = (guideChannels, searchQuery, selectedGroupId, selectedProfileId, profiles) => { + return guideChannels.filter((channel) => { + // Search filter + if (searchQuery) { + if (!channel.name.toLowerCase().includes(searchQuery.toLowerCase())) return false; + } + + // Channel group filter + if (selectedGroupId !== 'all') { + if (channel.channel_group_id !== parseInt(selectedGroupId)) return false; + } + + // Profile filter + if (selectedProfileId !== 'all') { + const profileChannels = profiles[selectedProfileId]?.channels || []; + const enabledChannelIds = Array.isArray(profileChannels) + ? profileChannels.filter((pc) => pc.enabled).map((pc) => pc.id) + : profiles[selectedProfileId]?.channels instanceof Set + ? Array.from(profiles[selectedProfileId].channels) + : []; + + if (!enabledChannelIds.includes(channel.id)) return false; + } + + return true; + }); +} + +export const calculateEarliestProgramStart = (programs, defaultStart) => { + if (!programs.length) return defaultStart; + return programs.reduce((acc, p) => { + const s = initializeTime(p.start_time); + return isBefore(s, acc) ? s : acc; + }, defaultStart); +} + +export const calculateLatestProgramEnd = (programs, defaultEnd) => { + if (!programs.length) return defaultEnd; + return programs.reduce((acc, p) => { + const e = initializeTime(p.end_time); + return isAfter(e, acc) ? e : acc; + }, defaultEnd); +} + +export const calculateStart = (earliestProgramStart, defaultStart) => { + return isBefore(earliestProgramStart, defaultStart) + ? earliestProgramStart + : defaultStart; +} + +export const calculateEnd = (latestProgramEnd, defaultEnd) => { + return isAfter(latestProgramEnd, defaultEnd) ? 
latestProgramEnd : defaultEnd; +} + +export const mapChannelsById = (guideChannels) => { + const map = new Map(); + guideChannels.forEach((channel) => { + map.set(channel.id, channel); + }); + return map; +} + +export const mapRecordingsByProgramId = (recordings) => { + const map = new Map(); + (recordings || []).forEach((recording) => { + const programId = recording?.custom_properties?.program?.id; + if (programId != null) { + map.set(programId, recording); + } + }); + return map; +} + +export const formatTime = (time, dateFormat) => { + const today = startOfDay(getNow()); + const tomorrow = add(today, 1, 'day'); + const weekLater = add(today, 7, 'day'); + const day = startOfDay(time); + + if (isSame(day, today, 'day')) { + return 'Today'; + } else if (isSame(day, tomorrow, 'day')) { + return 'Tomorrow'; + } else if (isBefore(day, weekLater)) { + // Within a week, show day name + return format(time, 'dddd'); + } else { + // Beyond a week, show month and day + return format(time, dateFormat); + } +} + +export const calculateHourTimeline = (start, end, formatDayLabel) => { + const hours = []; + let current = start; + let currentDay = null; + + while (isBefore(current, end)) { + // Check if we're entering a new day + const day = startOfDay(current); + const isNewDay = !currentDay || !isSame(day, currentDay, 'day'); + + if (isNewDay) { + currentDay = day; + } + + // Add day information to our hour object + hours.push({ + time: current, + isNewDay, + dayLabel: formatDayLabel(current), + }); + + current = add(current, 1, 'hour'); + } + return hours; +} + +export const calculateNowPosition = (now, start, end) => { + if (isBefore(now, start) || isAfter(now, end)) return -1; + const minutesSinceStart = diff(now, start, 'minute'); + return (minutesSinceStart / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH; +}; + +export const calculateScrollPosition = (now, start) => { + const roundedNow = roundToNearest(now, 30); + const nowOffset = diff(roundedNow, start, 'minute'); + const scrollPosition = + (nowOffset / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH - MINUTE_BLOCK_WIDTH; + + return Math.max(scrollPosition, 0); +}; + +export const matchChannelByTvgId = (channelIdByTvgId, channelById, tvgId) => { + const channelIds = channelIdByTvgId.get(String(tvgId)); + if (!channelIds || channelIds.length === 0) { + return null; + } + // Return the first channel that matches this TVG ID + return channelById.get(channelIds[0]) || null; +} + +export const fetchRules = async () => { + return await API.listSeriesRules(); +} + +export const getRuleByProgram = (rules, program) => { + return (rules || []).find( + (r) => + String(r.tvg_id) === String(program.tvg_id) && + (!r.title || r.title === program.title) + ); +} + +export const createRecording = async (channel, program) => { + await API.createRecording({ + channel: `${channel.id}`, + start_time: program.start_time, + end_time: program.end_time, + custom_properties: { program }, + }); +} + +export const createSeriesRule = async (program, mode) => { + await API.createSeriesRule({ + tvg_id: program.tvg_id, + mode, + title: program.title, + }); +} + +export const evaluateSeriesRule = async (program) => { + await API.evaluateSeriesRules(program.tvg_id); +} + +export const calculateLeftScrollPosition = (program, start) => { + const programStartMs = + program.startMs ?? 
convertToMs(program.start_time); + const startOffsetMinutes = (programStartMs - convertToMs(start)) / 60000; + + return (startOffsetMinutes / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH; +}; + +export const calculateDesiredScrollPosition = (leftPx) => { + return Math.max(0, leftPx - 20); +} + +export const calculateScrollPositionByTimeClick = (event, clickedTime, start) => { + const rect = event.currentTarget.getBoundingClientRect(); + const clickPositionX = event.clientX - rect.left; + const percentageAcross = clickPositionX / rect.width; + const minuteWithinHour = percentageAcross * 60; + + const snappedMinute = Math.round(minuteWithinHour / 15) * 15; + + const adjustedTime = (snappedMinute === 60) + ? add(clickedTime, 1, 'hour').minute(0) + : clickedTime.minute(snappedMinute); + + const snappedOffset = diff(adjustedTime, start, 'minute'); + return (snappedOffset / MINUTE_INCREMENT) * MINUTE_BLOCK_WIDTH; +}; + +export const getGroupOptions = (channelGroups, guideChannels) => { + const options = [{ value: 'all', label: 'All Channel Groups' }]; + + if (channelGroups && guideChannels.length > 0) { + // Get unique channel group IDs from the channels that have program data + const usedGroupIds = new Set(); + guideChannels.forEach((channel) => { + if (channel.channel_group_id) { + usedGroupIds.add(channel.channel_group_id); + } + }); + // Only add groups that are actually used by channels in the guide + Object.values(channelGroups) + .filter((group) => usedGroupIds.has(group.id)) + .sort((a, b) => a.name.localeCompare(b.name)) // Sort alphabetically + .forEach((group) => { + options.push({ + value: group.id.toString(), + label: group.name, + }); + }); + } + return options; +} + +export const getProfileOptions = (profiles) => { + const options = [{ value: 'all', label: 'All Profiles' }]; + + if (profiles) { + Object.values(profiles).forEach((profile) => { + if (profile.id !== '0') { + // Skip the 'All' default profile + options.push({ + value: profile.id.toString(), + label: profile.name, + }); + } + }); + } + + return options; +} + +export const deleteSeriesRuleByTvgId = async (tvg_id) => { + await API.deleteSeriesRule(tvg_id); +} + +export const evaluateSeriesRulesByTvgId = async (tvg_id) => { + await API.evaluateSeriesRules(tvg_id); +} \ No newline at end of file diff --git a/frontend/src/store/auth.jsx b/frontend/src/store/auth.jsx index b1d60a1a..8fe943b7 100644 --- a/frontend/src/store/auth.jsx +++ b/frontend/src/store/auth.jsx @@ -7,7 +7,6 @@ import useEPGsStore from './epgs'; import useStreamProfilesStore from './streamProfiles'; import useUserAgentsStore from './userAgents'; import useUsersStore from './users'; -import useLogosStore from './logos'; import API from '../api'; import { USER_LEVELS } from '../constants'; @@ -43,6 +42,8 @@ const useAuthStore = create((set, get) => ({ throw new Error('Unauthorized'); } + set({ user, isAuthenticated: true }); + // Ensure settings are loaded first await useSettingsStore.getState().fetchSettings(); @@ -63,7 +64,8 @@ const useAuthStore = create((set, get) => ({ await Promise.all([useUsersStore.getState().fetchUsers()]); } - set({ user, isAuthenticated: true }); + // Note: Logos are loaded after the Channels page tables finish loading + // This is handled by the tables themselves signaling completion } catch (error) { console.error('Error initializing data:', error); } diff --git a/frontend/src/store/logos.jsx b/frontend/src/store/logos.jsx index 4634f672..5843b113 100644 --- a/frontend/src/store/logos.jsx +++ b/frontend/src/store/logos.jsx @@ -9,16 
+9,10 @@ const useLogosStore = create((set, get) => ({ hasLoadedAll: false, // Track if we've loaded all logos hasLoadedChannelLogos: false, // Track if we've loaded channel logos error: null, + allowLogoRendering: false, // Gate to prevent logo rendering until tables are ready - // Basic CRUD operations - setLogos: (logos) => { - set({ - logos: logos.reduce((acc, logo) => { - acc[logo.id] = { ...logo }; - return acc; - }, {}), - }); - }, + // Enable logo rendering (call this after tables have loaded and painted) + enableLogoRendering: () => set({ allowLogoRendering: true }), addLogo: (newLogo) => set((state) => { @@ -73,6 +67,9 @@ const useLogosStore = create((set, get) => ({ // Smart loading methods fetchLogos: async (pageSize = 100) => { + // Don't fetch if logo fetching is not allowed yet + if (!get().allowLogoFetching) return []; + set({ isLoading: true, error: null }); try { const response = await api.getLogos({ page_size: pageSize }); @@ -163,59 +160,28 @@ const useLogosStore = create((set, get) => ({ }, fetchChannelAssignableLogos: async () => { - const { backgroundLoading, hasLoadedChannelLogos, channelLogos } = get(); + const { hasLoadedChannelLogos, channelLogos } = get(); - // Prevent concurrent calls - if ( - backgroundLoading || - (hasLoadedChannelLogos && Object.keys(channelLogos).length > 0) - ) { + // Return cached if already loaded + if (hasLoadedChannelLogos && Object.keys(channelLogos).length > 0) { return Object.values(channelLogos); } - set({ backgroundLoading: true, error: null }); - try { - // Load all channel logos (no special filtering needed - all Logo entries are for channels) - const response = await api.getLogos({ - no_pagination: 'true', // Get all channel logos - }); + // Fetch all logos and cache them as channel logos + const logos = await get().fetchAllLogos(); - // Handle both paginated and non-paginated responses - const logos = Array.isArray(response) ? 
response : response.results || []; + set({ + channelLogos: logos.reduce((acc, logo) => { + acc[logo.id] = { ...logo }; + return acc; + }, {}), + hasLoadedChannelLogos: true, + }); - console.log(`Fetched ${logos.length} channel logos`); - - // Store in both places, but this is intentional and only when specifically requested - set({ - logos: { - ...get().logos, // Keep existing logos - ...logos.reduce((acc, logo) => { - acc[logo.id] = { ...logo }; - return acc; - }, {}), - }, - channelLogos: logos.reduce((acc, logo) => { - acc[logo.id] = { ...logo }; - return acc; - }, {}), - hasLoadedChannelLogos: true, - backgroundLoading: false, - }); - - return logos; - } catch (error) { - console.error('Failed to fetch channel logos:', error); - set({ - error: 'Failed to load channel logos.', - backgroundLoading: false, - }); - throw error; - } + return logos; }, fetchLogosByIds: async (logoIds) => { - if (!logoIds || logoIds.length === 0) return []; - try { // Filter out logos we already have const missingIds = logoIds.filter((id) => !get().logos[id]); diff --git a/frontend/src/store/plugins.jsx b/frontend/src/store/plugins.jsx new file mode 100644 index 00000000..e8d0b065 --- /dev/null +++ b/frontend/src/store/plugins.jsx @@ -0,0 +1,41 @@ +import { create } from 'zustand'; +import API from '../api'; + +export const usePluginStore = create((set, get) => ({ + plugins: [], + loading: false, + error: null, + + fetchPlugins: async () => { + set({ loading: true, error: null }); + try { + const response = await API.getPlugins(); + set({ plugins: response || [], loading: false }); + } catch (error) { + set({ error, loading: false }); + } + }, + + updatePlugin: (key, updates) => { + set((state) => ({ + plugins: state.plugins.map((p) => + p.key === key ? { ...p, ...updates } : p + ), + })); + }, + + addPlugin: (plugin) => { + set((state) => ({ plugins: [...state.plugins, plugin] })); + }, + + removePlugin: (key) => { + set((state) => ({ + plugins: state.plugins.filter((p) => p.key !== key), + })); + }, + + invalidatePlugins: () => { + set({ plugins: [] }); + get().fetchPlugins(); + }, +})); \ No newline at end of file diff --git a/frontend/src/utils/cards/PluginCardUtils.js b/frontend/src/utils/cards/PluginCardUtils.js new file mode 100644 index 00000000..8752e019 --- /dev/null +++ b/frontend/src/utils/cards/PluginCardUtils.js @@ -0,0 +1,24 @@ +export const getConfirmationDetails = (action, plugin, settings) => { + const actionConfirm = action.confirm; + const confirmField = (plugin.fields || []).find((f) => f.id === 'confirm'); + let requireConfirm = false; + let confirmTitle = `Run ${action.label}?`; + let confirmMessage = `You're about to run "${action.label}" from "${plugin.name}".`; + + if (actionConfirm) { + if (typeof actionConfirm === 'boolean') { + requireConfirm = actionConfirm; + } else if (typeof actionConfirm === 'object') { + requireConfirm = actionConfirm.required !== false; + if (actionConfirm.title) confirmTitle = actionConfirm.title; + if (actionConfirm.message) confirmMessage = actionConfirm.message; + } + } else if (confirmField) { + const settingVal = settings?.confirm; + const effectiveConfirm = + (settingVal !== undefined ? settingVal : confirmField.default) ?? 
false; + requireConfirm = !!effectiveConfirm; + } + + return { requireConfirm, confirmTitle, confirmMessage }; +}; diff --git a/frontend/src/utils/cards/RecordingCardUtils.js b/frontend/src/utils/cards/RecordingCardUtils.js new file mode 100644 index 00000000..65b3da3a --- /dev/null +++ b/frontend/src/utils/cards/RecordingCardUtils.js @@ -0,0 +1,92 @@ +import API from '../../api.js'; +import useChannelsStore from '../../store/channels.jsx'; + +export const removeRecording = (id) => { + // Optimistically remove immediately from UI + try { + useChannelsStore.getState().removeRecording(id); + } catch (error) { + console.error('Failed to optimistically remove recording', error); + } + // Fire-and-forget server delete; websocket will keep others in sync + API.deleteRecording(id).catch(() => { + // On failure, fallback to refetch to restore state + try { + useChannelsStore.getState().fetchRecordings(); + } catch (error) { + console.error('Failed to refresh recordings after delete', error); + } + }); +}; + +export const getPosterUrl = (posterLogoId, customProperties, posterUrl) => { + let purl = posterLogoId + ? `/api/channels/logos/${posterLogoId}/cache/` + : customProperties?.poster_url || posterUrl || '/logo.png'; + if ( + typeof import.meta !== 'undefined' && + import.meta.env && + import.meta.env.DEV && + purl && + purl.startsWith('/') + ) { + purl = `${window.location.protocol}//${window.location.hostname}:5656${purl}`; + } + return purl; +}; + +export const getShowVideoUrl = (channel, env_mode) => { + let url = `/proxy/ts/stream/${channel.uuid}`; + if (env_mode === 'dev') { + url = `${window.location.protocol}//${window.location.hostname}:5656${url}`; + } + return url; +}; + +export const runComSkip = async (recording) => { + await API.runComskip(recording.id); +}; + +export const deleteRecordingById = async (recordingId) => { + await API.deleteRecording(recordingId); +}; + +export const deleteSeriesAndRule = async (seriesInfo) => { + const { tvg_id, title } = seriesInfo; + if (tvg_id) { + try { + await API.bulkRemoveSeriesRecordings({ + tvg_id, + title, + scope: 'title', + }); + } catch (error) { + console.error('Failed to remove series recordings', error); + } + try { + await API.deleteSeriesRule(tvg_id); + } catch (error) { + console.error('Failed to delete series rule', error); + } + } +}; + +export const getRecordingUrl = (customProps, env_mode) => { + let fileUrl = customProps?.file_url || customProps?.output_file_url; + if (fileUrl && env_mode === 'dev' && fileUrl.startsWith('/')) { + fileUrl = `${window.location.protocol}//${window.location.hostname}:5656${fileUrl}`; + } + return fileUrl; +}; + +export const getSeasonLabel = (season, episode, onscreen) => { + return season && episode + ? 
`S${String(season).padStart(2, '0')}E${String(episode).padStart(2, '0')}` + : onscreen || null; +}; + +export const getSeriesInfo = (customProps) => { + const cp = customProps || {}; + const pr = cp.program || {}; + return { tvg_id: pr.tvg_id, title: pr.title }; +}; \ No newline at end of file diff --git a/frontend/src/utils/dateTimeUtils.js b/frontend/src/utils/dateTimeUtils.js new file mode 100644 index 00000000..64a50947 --- /dev/null +++ b/frontend/src/utils/dateTimeUtils.js @@ -0,0 +1,258 @@ +import { useCallback, useEffect } from 'react'; +import dayjs from 'dayjs'; +import duration from 'dayjs/plugin/duration'; +import relativeTime from 'dayjs/plugin/relativeTime'; +import utc from 'dayjs/plugin/utc'; +import timezone from 'dayjs/plugin/timezone'; +import useSettingsStore from '../store/settings'; +import useLocalStorage from '../hooks/useLocalStorage'; + +dayjs.extend(duration); +dayjs.extend(relativeTime); +dayjs.extend(utc); +dayjs.extend(timezone); + +export const convertToMs = (dateTime) => dayjs(dateTime).valueOf(); + +export const initializeTime = (dateTime) => dayjs(dateTime); + +export const startOfDay = (dateTime) => dayjs(dateTime).startOf('day'); + +export const isBefore = (date1, date2) => dayjs(date1).isBefore(date2); + +export const isAfter = (date1, date2) => dayjs(date1).isAfter(date2); + +export const isSame = (date1, date2, unit = 'day') => + dayjs(date1).isSame(date2, unit); + +export const add = (dateTime, value, unit) => dayjs(dateTime).add(value, unit); + +export const diff = (date1, date2, unit = 'millisecond') => + dayjs(date1).diff(date2, unit); + +export const format = (dateTime, formatStr) => + dayjs(dateTime).format(formatStr); + +export const getNow = () => dayjs(); + +export const getNowMs = () => Date.now(); + +export const roundToNearest = (dateTime, minutes) => { + const current = initializeTime(dateTime); + const minute = current.minute(); + const snappedMinute = Math.round(minute / minutes) * minutes; + + return snappedMinute === 60 + ? current.add(1, 'hour').minute(0) + : current.minute(snappedMinute); +}; + +export const useUserTimeZone = () => { + const settings = useSettingsStore((s) => s.settings); + const [timeZone, setTimeZone] = useLocalStorage( + 'time-zone', + dayjs.tz?.guess + ? dayjs.tz.guess() + : Intl.DateTimeFormat().resolvedOptions().timeZone + ); + + useEffect(() => { + const tz = settings?.['system-time-zone']?.value; + if (tz && tz !== timeZone) { + setTimeZone(tz); + } + }, [settings, timeZone, setTimeZone]); + + return timeZone; +}; + +export const useTimeHelpers = () => { + const timeZone = useUserTimeZone(); + + const toUserTime = useCallback( + (value) => { + if (!value) return dayjs.invalid(); + try { + return initializeTime(value).tz(timeZone); + } catch (error) { + return initializeTime(value); + } + }, + [timeZone] + ); + + const userNow = useCallback(() => getNow().tz(timeZone), [timeZone]); + + return { timeZone, toUserTime, userNow }; +}; + +export const RECURRING_DAY_OPTIONS = [ + { value: 6, label: 'Sun' }, + { value: 0, label: 'Mon' }, + { value: 1, label: 'Tue' }, + { value: 2, label: 'Wed' }, + { value: 3, label: 'Thu' }, + { value: 4, label: 'Fri' }, + { value: 5, label: 'Sat' }, +]; + +export const useDateTimeFormat = () => { + const [timeFormatSetting] = useLocalStorage('time-format', '12h'); + const [dateFormatSetting] = useLocalStorage('date-format', 'mdy'); + // Use user preference for time format + const timeFormat = timeFormatSetting === '12h' ? 
'h:mma' : 'HH:mm'; + const dateFormat = dateFormatSetting === 'mdy' ? 'MMM D' : 'D MMM'; + + return [timeFormat, dateFormat]; +}; + +export const toTimeString = (value) => { + if (!value) return '00:00'; + if (typeof value === 'string') { + const parsed = dayjs(value, ['HH:mm', 'HH:mm:ss', 'h:mm A'], true); + if (parsed.isValid()) return parsed.format('HH:mm'); + return value; + } + const parsed = initializeTime(value); + return parsed.isValid() ? parsed.format('HH:mm') : '00:00'; +}; + +export const parseDate = (value) => { + if (!value) return null; + const parsed = dayjs(value, ['YYYY-MM-DD', dayjs.ISO_8601], true); + return parsed.isValid() ? parsed.toDate() : null; +}; + +const TIMEZONE_FALLBACKS = [ + 'UTC', + 'America/New_York', + 'America/Chicago', + 'America/Denver', + 'America/Los_Angeles', + 'America/Phoenix', + 'America/Anchorage', + 'Pacific/Honolulu', + 'Europe/London', + 'Europe/Paris', + 'Europe/Berlin', + 'Europe/Madrid', + 'Europe/Warsaw', + 'Europe/Moscow', + 'Asia/Dubai', + 'Asia/Kolkata', + 'Asia/Shanghai', + 'Asia/Tokyo', + 'Asia/Seoul', + 'Australia/Sydney', +]; + +const getSupportedTimeZones = () => { + try { + if (typeof Intl.supportedValuesOf === 'function') { + return Intl.supportedValuesOf('timeZone'); + } + } catch (error) { + console.warn('Unable to enumerate supported time zones:', error); + } + return TIMEZONE_FALLBACKS; +}; + +const getTimeZoneOffsetMinutes = (date, timeZone) => { + try { + const dtf = new Intl.DateTimeFormat('en-US', { + timeZone, + year: 'numeric', + month: '2-digit', + day: '2-digit', + hour: '2-digit', + minute: '2-digit', + second: '2-digit', + hourCycle: 'h23', + }); + const parts = dtf.formatToParts(date).reduce((acc, part) => { + if (part.type !== 'literal') acc[part.type] = part.value; + return acc; + }, {}); + const asUTC = Date.UTC( + Number(parts.year), + Number(parts.month) - 1, + Number(parts.day), + Number(parts.hour), + Number(parts.minute), + Number(parts.second) + ); + return (asUTC - date.getTime()) / 60000; + } catch (error) { + console.warn(`Failed to compute offset for ${timeZone}:`, error); + return 0; + } +}; + +const formatOffset = (minutes) => { + const rounded = Math.round(minutes); + const sign = rounded < 0 ? 
'-' : '+'; + const absolute = Math.abs(rounded); + const hours = String(Math.floor(absolute / 60)).padStart(2, '0'); + const mins = String(absolute % 60).padStart(2, '0'); + return `UTC${sign}${hours}:${mins}`; +}; + +export const buildTimeZoneOptions = (preferredZone) => { + const zones = getSupportedTimeZones(); + const referenceYear = new Date().getUTCFullYear(); + const janDate = new Date(Date.UTC(referenceYear, 0, 1, 12, 0, 0)); + const julDate = new Date(Date.UTC(referenceYear, 6, 1, 12, 0, 0)); + + const options = zones + .map((zone) => { + const janOffset = getTimeZoneOffsetMinutes(janDate, zone); + const julOffset = getTimeZoneOffsetMinutes(julDate, zone); + const currentOffset = getTimeZoneOffsetMinutes(new Date(), zone); + const minOffset = Math.min(janOffset, julOffset); + const maxOffset = Math.max(janOffset, julOffset); + const usesDst = minOffset !== maxOffset; + const labelParts = [`now ${formatOffset(currentOffset)}`]; + if (usesDst) { + labelParts.push( + `DST range ${formatOffset(minOffset)} to ${formatOffset(maxOffset)}` + ); + } + return { + value: zone, + label: `${zone} (${labelParts.join(' | ')})`, + numericOffset: minOffset, + }; + }) + .sort((a, b) => { + if (a.numericOffset !== b.numericOffset) { + return a.numericOffset - b.numericOffset; + } + return a.value.localeCompare(b.value); + }); + if ( + preferredZone && + !options.some((option) => option.value === preferredZone) + ) { + const currentOffset = getTimeZoneOffsetMinutes(new Date(), preferredZone); + options.push({ + value: preferredZone, + label: `${preferredZone} (now ${formatOffset(currentOffset)})`, + numericOffset: currentOffset, + }); + options.sort((a, b) => { + if (a.numericOffset !== b.numericOffset) { + return a.numericOffset - b.numericOffset; + } + return a.value.localeCompare(b.value); + }); + } + return options; +}; + +export const getDefaultTimeZone = () => { + try { + return Intl.DateTimeFormat().resolvedOptions().timeZone || 'UTC'; + } catch (error) { + return 'UTC'; + } +}; \ No newline at end of file diff --git a/frontend/src/utils/forms/RecordingDetailsModalUtils.js b/frontend/src/utils/forms/RecordingDetailsModalUtils.js new file mode 100644 index 00000000..805bc006 --- /dev/null +++ b/frontend/src/utils/forms/RecordingDetailsModalUtils.js @@ -0,0 +1,87 @@ +export const getStatRows = (stats) => { + return [ + ['Video Codec', stats.video_codec], + [ + 'Resolution', + stats.resolution || + (stats.width && stats.height ? 
`${stats.width}x${stats.height}` : null), + ], + ['FPS', stats.source_fps], + ['Video Bitrate', stats.video_bitrate && `${stats.video_bitrate} kb/s`], + ['Audio Codec', stats.audio_codec], + ['Audio Channels', stats.audio_channels], + ['Sample Rate', stats.sample_rate && `${stats.sample_rate} Hz`], + ['Audio Bitrate', stats.audio_bitrate && `${stats.audio_bitrate} kb/s`], + ].filter(([, v]) => v !== null && v !== undefined && v !== ''); +}; + +export const getRating = (customProps, program) => { + return ( + customProps.rating || + customProps.rating_value || + (program && program.custom_properties && program.custom_properties.rating) + ); +}; + +const filterByUpcoming = (arr, tvid, titleKey, toUserTime, userNow) => { + return arr.filter((r) => { + const cp = r.custom_properties || {}; + const pr = cp.program || {}; + + if ((pr.tvg_id || '') !== tvid) return false; + if ((pr.title || '').toLowerCase() !== titleKey) return false; + const st = toUserTime(r.start_time); + return st.isAfter(userNow()); + }); +} + +const dedupeByProgram = (filtered) => { + // Deduplicate by program.id if present, else by time+title + const seen = new Set(); + const deduped = []; + + for (const r of filtered) { + const cp = r.custom_properties || {}; + const pr = cp.program || {}; + // Prefer season/episode or onscreen code; else fall back to sub_title; else program id/slot + const season = cp.season ?? pr?.custom_properties?.season; + const episode = cp.episode ?? pr?.custom_properties?.episode; + const onscreen = + cp.onscreen_episode ?? pr?.custom_properties?.onscreen_episode; + + let key = null; + if (season != null && episode != null) key = `se:${season}:${episode}`; + else if (onscreen) key = `onscreen:${String(onscreen).toLowerCase()}`; + else if (pr.sub_title) key = `sub:${(pr.sub_title || '').toLowerCase()}`; + else if (pr.id != null) key = `id:${pr.id}`; + else + key = `slot:${r.channel}|${r.start_time}|${r.end_time}|${pr.title || ''}`; + + if (seen.has(key)) continue; + seen.add(key); + deduped.push(r); + } + return deduped; +} + +export const getUpcomingEpisodes = ( + isSeriesGroup, + allRecordings, + program, + toUserTime, + userNow +) => { + if (!isSeriesGroup) return []; + + const arr = Array.isArray(allRecordings) + ? allRecordings + : Object.values(allRecordings || {}); + const tvid = program.tvg_id || ''; + const titleKey = (program.title || '').toLowerCase(); + + const filtered = filterByUpcoming(arr, tvid, titleKey, toUserTime, userNow); + + return dedupeByProgram(filtered).sort( + (a, b) => toUserTime(a.start_time) - toUserTime(b.start_time) + ); +}; diff --git a/frontend/src/utils/forms/RecurringRuleModalUtils.js b/frontend/src/utils/forms/RecurringRuleModalUtils.js new file mode 100644 index 00000000..1eb9194a --- /dev/null +++ b/frontend/src/utils/forms/RecurringRuleModalUtils.js @@ -0,0 +1,66 @@ +import API from '../../api.js'; +import { toTimeString } from '../dateTimeUtils.js'; +import dayjs from 'dayjs'; + +export const getChannelOptions = (channels) => { + return Object.values(channels || {}) + .sort((a, b) => { + const aNum = Number(a.channel_number) || 0; + const bNum = Number(b.channel_number) || 0; + if (aNum === bNum) { + return (a.name || '').localeCompare(b.name || ''); + } + return aNum - bNum; + }) + .map((item) => ({ + value: `${item.id}`, + label: item.name || `Channel ${item.id}`, + })); +}; + +export const getUpcomingOccurrences = ( + recordings, + userNow, + ruleId, + toUserTime +) => { + const list = Array.isArray(recordings) + ? 
recordings + : Object.values(recordings || {}); + const now = userNow(); + return list + .filter( + (rec) => + rec?.custom_properties?.rule?.id === ruleId && + toUserTime(rec.start_time).isAfter(now) + ) + .sort( + (a, b) => + toUserTime(a.start_time).valueOf() - toUserTime(b.start_time).valueOf() + ); +}; + +export const updateRecurringRule = async (ruleId, values) => { + await API.updateRecurringRule(ruleId, { + channel: values.channel_id, + days_of_week: (values.days_of_week || []).map((d) => Number(d)), + start_time: toTimeString(values.start_time), + end_time: toTimeString(values.end_time), + start_date: values.start_date + ? dayjs(values.start_date).format('YYYY-MM-DD') + : null, + end_date: values.end_date + ? dayjs(values.end_date).format('YYYY-MM-DD') + : null, + name: values.rule_name?.trim() || '', + enabled: Boolean(values.enabled), + }); +}; + +export const deleteRecurringRuleById = async (ruleId) => { + await API.deleteRecurringRule(ruleId); +}; + +export const updateRecurringRuleEnabled = async (ruleId, checked) => { + await API.updateRecurringRule(ruleId, { enabled: checked }); +}; \ No newline at end of file diff --git a/frontend/src/utils/forms/settings/DvrSettingsFormUtils.js b/frontend/src/utils/forms/settings/DvrSettingsFormUtils.js new file mode 100644 index 00000000..7fa272d0 --- /dev/null +++ b/frontend/src/utils/forms/settings/DvrSettingsFormUtils.js @@ -0,0 +1,22 @@ +import API from '../../../api.js'; + +export const getComskipConfig = async () => { + return await API.getComskipConfig(); +}; + +export const uploadComskipIni = async (file) => { + return await API.uploadComskipIni(file); +}; + +export const getDvrSettingsFormInitialValues = () => { + return { + 'dvr-tv-template': '', + 'dvr-movie-template': '', + 'dvr-tv-fallback-template': '', + 'dvr-movie-fallback-template': '', + 'dvr-comskip-enabled': false, + 'dvr-comskip-custom-path': '', + 'dvr-pre-offset-minutes': 0, + 'dvr-post-offset-minutes': 0, + }; +}; \ No newline at end of file diff --git a/frontend/src/utils/forms/settings/NetworkAccessFormUtils.js b/frontend/src/utils/forms/settings/NetworkAccessFormUtils.js new file mode 100644 index 00000000..fe1eea8a --- /dev/null +++ b/frontend/src/utils/forms/settings/NetworkAccessFormUtils.js @@ -0,0 +1,29 @@ +import { NETWORK_ACCESS_OPTIONS } from '../../../constants.js'; +import { IPV4_CIDR_REGEX, IPV6_CIDR_REGEX } from '../../networkUtils.js'; + +export const getNetworkAccessFormInitialValues = () => { + return Object.keys(NETWORK_ACCESS_OPTIONS).reduce((acc, key) => { + acc[key] = '0.0.0.0/0,::/0'; + return acc; + }, {}); +}; + +export const getNetworkAccessFormValidation = () => { + return Object.keys(NETWORK_ACCESS_OPTIONS).reduce((acc, key) => { + acc[key] = (value) => { + if ( + value + .split(',') + .some( + (cidr) => + !(cidr.match(IPV4_CIDR_REGEX) || cidr.match(IPV6_CIDR_REGEX)) + ) + ) { + return 'Invalid CIDR range'; + } + + return null; + }; + return acc; + }, {}); +}; \ No newline at end of file diff --git a/frontend/src/utils/forms/settings/ProxySettingsFormUtils.js b/frontend/src/utils/forms/settings/ProxySettingsFormUtils.js new file mode 100644 index 00000000..864dd9b1 --- /dev/null +++ b/frontend/src/utils/forms/settings/ProxySettingsFormUtils.js @@ -0,0 +1,18 @@ +import { PROXY_SETTINGS_OPTIONS } from '../../../constants.js'; + +export const getProxySettingsFormInitialValues = () => { + return Object.keys(PROXY_SETTINGS_OPTIONS).reduce((acc, key) => { + acc[key] = ''; + return acc; + }, {}); +}; + +export const getProxySettingDefaults = () 
=> { + return { + buffering_timeout: 15, + buffering_speed: 1.0, + redis_chunk_ttl: 60, + channel_shutdown_delay: 0, + channel_init_grace_period: 5, + }; +}; \ No newline at end of file diff --git a/frontend/src/utils/forms/settings/StreamSettingsFormUtils.js b/frontend/src/utils/forms/settings/StreamSettingsFormUtils.js new file mode 100644 index 00000000..2ff5dd55 --- /dev/null +++ b/frontend/src/utils/forms/settings/StreamSettingsFormUtils.js @@ -0,0 +1,19 @@ +import { isNotEmpty } from '@mantine/form'; + +export const getStreamSettingsFormInitialValues = () => { + return { + 'default-user-agent': '', + 'default-stream-profile': '', + 'preferred-region': '', + 'auto-import-mapped-files': true, + 'm3u-hash-key': [], + }; +}; + +export const getStreamSettingsFormValidation = () => { + return { + 'default-user-agent': isNotEmpty('Select a user agent'), + 'default-stream-profile': isNotEmpty('Select a stream profile'), + 'preferred-region': isNotEmpty('Select a region'), + }; +}; \ No newline at end of file diff --git a/frontend/src/utils/forms/settings/SystemSettingsFormUtils.js b/frontend/src/utils/forms/settings/SystemSettingsFormUtils.js new file mode 100644 index 00000000..75c4f513 --- /dev/null +++ b/frontend/src/utils/forms/settings/SystemSettingsFormUtils.js @@ -0,0 +1,5 @@ +export const getSystemSettingsFormInitialValues = () => { + return { + 'max-system-events': 100, + }; +}; diff --git a/frontend/src/utils/forms/settings/UiSettingsFormUtils.js b/frontend/src/utils/forms/settings/UiSettingsFormUtils.js new file mode 100644 index 00000000..79e99d96 --- /dev/null +++ b/frontend/src/utils/forms/settings/UiSettingsFormUtils.js @@ -0,0 +1,14 @@ +import { createSetting, updateSetting } from '../../pages/SettingsUtils.js'; + +export const saveTimeZoneSetting = async (tzValue, settings) => { + const existing = settings['system-time-zone']; + if (existing?.id) { + await updateSetting({ ...existing, value: tzValue }); + } else { + await createSetting({ + key: 'system-time-zone', + name: 'System Time Zone', + value: tzValue, + }); + } +}; \ No newline at end of file diff --git a/frontend/src/utils/networkUtils.js b/frontend/src/utils/networkUtils.js new file mode 100644 index 00000000..8562face --- /dev/null +++ b/frontend/src/utils/networkUtils.js @@ -0,0 +1,4 @@ +export const IPV4_CIDR_REGEX = /^([0-9]{1,3}\.){3}[0-9]{1,3}\/\d+$/; + +export const IPV6_CIDR_REGEX = + /(?:(?:(?:[A-F0-9]{1,4}:){6}|(?=(?:[A-F0-9]{0,4}:){0,6}(?:[0-9]{1,3}\.){3}[0-9]{1,3}(?![:.\w]))(([0-9A-F]{1,4}:){0,5}|:)((:[0-9A-F]{1,4}){1,5}:|:)|::(?:[A-F0-9]{1,4}:){5})(?:(?:25[0-5]|2[0-4][0-9]|1[0-9][0-9]|[1-9]?[0-9])\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)|(?:[A-F0-9]{1,4}:){7}[A-F0-9]{1,4}|(?=(?:[A-F0-9]{0,4}:){0,7}[A-F0-9]{0,4}(?![:.\w]))(([0-9A-F]{1,4}:){1,7}|:)((:[0-9A-F]{1,4}){1,7}|:)|(?:[A-F0-9]{1,4}:){7}:|:(:[A-F0-9]{1,4}){7})(?![:.\w])\/(?:12[0-8]|1[01][0-9]|[1-9]?[0-9])/; diff --git a/frontend/src/utils/notificationUtils.js b/frontend/src/utils/notificationUtils.js new file mode 100644 index 00000000..ba965343 --- /dev/null +++ b/frontend/src/utils/notificationUtils.js @@ -0,0 +1,9 @@ +import { notifications } from '@mantine/notifications'; + +export function showNotification(notificationObject) { + return notifications.show(notificationObject); +} + +export function updateNotification(notificationId, notificationObject) { + return notifications.update(notificationId, notificationObject); +} \ No newline at end of file diff --git a/frontend/src/utils/pages/DVRUtils.js b/frontend/src/utils/pages/DVRUtils.js 
new file mode 100644 index 00000000..139988d2 --- /dev/null +++ b/frontend/src/utils/pages/DVRUtils.js @@ -0,0 +1,90 @@ +// Deduplicate in-progress and upcoming by program id or channel+slot +const dedupeByProgramOrSlot = (arr) => { + const out = []; + const sigs = new Set(); + + for (const r of arr) { + const cp = r.custom_properties || {}; + const pr = cp.program || {}; + const sig = + pr?.id != null + ? `id:${pr.id}` + : `slot:${r.channel}|${r.start_time}|${r.end_time}|${pr.title || ''}`; + + if (sigs.has(sig)) continue; + sigs.add(sig); + out.push(r); + } + return out; +}; + +const dedupeById = (list, toUserTime, completed, now, inProgress, upcoming) => { + // ID-based dedupe guard in case store returns duplicates + const seenIds = new Set(); + for (const rec of list) { + if (rec && rec.id != null) { + const k = String(rec.id); + if (seenIds.has(k)) continue; + seenIds.add(k); + } + + const s = toUserTime(rec.start_time); + const e = toUserTime(rec.end_time); + const status = rec.custom_properties?.status; + + if (status === 'interrupted' || status === 'completed') { + completed.push(rec); + } else { + if (now.isAfter(s) && now.isBefore(e)) inProgress.push(rec); + else if (now.isBefore(s)) upcoming.push(rec); + else completed.push(rec); + } + } +} + +export const categorizeRecordings = (recordings, toUserTime, now) => { + const inProgress = []; + const upcoming = []; + const completed = []; + const list = Array.isArray(recordings) + ? recordings + : Object.values(recordings || {}); + + dedupeById(list, toUserTime, completed, now, inProgress, upcoming); + + const inProgressDedup = dedupeByProgramOrSlot(inProgress).sort( + (a, b) => toUserTime(b.start_time) - toUserTime(a.start_time) + ); + + // Group upcoming by series title+tvg_id (keep only next episode) + const upcomingDedup = dedupeByProgramOrSlot(upcoming).sort( + (a, b) => toUserTime(a.start_time) - toUserTime(b.start_time) + ); + const grouped = new Map(); + + for (const rec of upcomingDedup) { + const cp = rec.custom_properties || {}; + const prog = cp.program || {}; + const key = `${prog.tvg_id || ''}|${(prog.title || '').toLowerCase()}`; + if (!grouped.has(key)) { + grouped.set(key, { rec, count: 1 }); + } else { + const entry = grouped.get(key); + entry.count += 1; + } + } + + const upcomingGrouped = Array.from(grouped.values()).map((e) => { + const item = { ...e.rec }; + item._group_count = e.count; + return item; + }); + + completed.sort((a, b) => toUserTime(b.end_time) - toUserTime(a.end_time)); + + return { + inProgress: inProgressDedup, + upcoming: upcomingGrouped, + completed, + }; +} \ No newline at end of file diff --git a/frontend/src/utils/pages/PluginsUtils.js b/frontend/src/utils/pages/PluginsUtils.js new file mode 100644 index 00000000..bae98e93 --- /dev/null +++ b/frontend/src/utils/pages/PluginsUtils.js @@ -0,0 +1,17 @@ +import API from '../../api.js'; + +export const updatePluginSettings = async (key, settings) => { + return await API.updatePluginSettings(key, settings); +}; +export const runPluginAction = async (key, actionId) => { + return await API.runPluginAction(key, actionId); +}; +export const setPluginEnabled = async (key, next) => { + return await API.setPluginEnabled(key, next); +}; +export const importPlugin = async (importFile) => { + return await API.importPlugin(importFile); +}; +export const deletePluginByKey = (key) => { + return API.deletePlugin(key); +}; \ No newline at end of file diff --git a/frontend/src/utils/pages/SettingsUtils.js b/frontend/src/utils/pages/SettingsUtils.js new file mode 
100644 index 00000000..e6179f06 --- /dev/null +++ b/frontend/src/utils/pages/SettingsUtils.js @@ -0,0 +1,104 @@ +import API from '../../api.js'; + +export const checkSetting = async (values) => { + return await API.checkSetting(values); +}; + +export const updateSetting = async (values) => { + return await API.updateSetting(values); +}; + +export const createSetting = async (values) => { + return await API.createSetting(values); +}; + +export const rehashStreams = async () => { + return await API.rehashStreams(); +}; + +export const saveChangedSettings = async (settings, changedSettings) => { + for (const updatedKey in changedSettings) { + const existing = settings[updatedKey]; + if (existing?.id) { + const result = await updateSetting({ + ...existing, + value: changedSettings[updatedKey], + }); + // API functions return undefined on error + if (!result) { + throw new Error('Failed to update setting'); + } + } else { + const result = await createSetting({ + key: updatedKey, + name: updatedKey.replace(/-/g, ' '), + value: changedSettings[updatedKey], + }); + // API functions return undefined on error + if (!result) { + throw new Error('Failed to create setting'); + } + } + } +}; + +export const getChangedSettings = (values, settings) => { + const changedSettings = {}; + + for (const settingKey in values) { + // Only compare against existing value if the setting exists + const existing = settings[settingKey]; + + // Convert array values (like m3u-hash-key) to comma-separated strings + const stringValue = Array.isArray(values[settingKey]) + ? values[settingKey].join(',') + : `${values[settingKey]}`; + + // Skip empty values to avoid validation errors + if (!stringValue) { + continue; + } + + if (!existing) { + // Create new setting on save + changedSettings[settingKey] = stringValue; + } else if (stringValue !== String(existing.value)) { + // If the user changed the setting's value from what's in the DB: + changedSettings[settingKey] = stringValue; + } + } + return changedSettings; +}; + +export const parseSettings = (settings) => { + return Object.entries(settings).reduce((acc, [key, value]) => { + // Modify each value based on its own properties + switch (value.value) { + case 'true': + value.value = true; + break; + case 'false': + value.value = false; + break; + } + + let val = null; + switch (key) { + case 'm3u-hash-key': + // Split comma-separated string, filter out empty strings + val = value.value ? value.value.split(',').filter((v) => v) : []; + break; + case 'dvr-pre-offset-minutes': + case 'dvr-post-offset-minutes': + val = Number.parseInt(value.value || '0', 10); + if (Number.isNaN(val)) val = 0; + break; + default: + val = value.value; + break; + } + + acc[key] = val; + return acc; + }, {}); +}; \ No newline at end of file
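
For context on the guide refactor earlier in this patch, here is a minimal sketch of how the extracted `getGroupOptions`, `getProfileOptions`, and `calculateNowPosition` helpers from `guideUtils.js` might be consumed. The data shapes and import paths are illustrative (assuming a caller under `frontend/src/pages/`); only the helper signatures come from the patch.

```js
import {
  getGroupOptions,
  getProfileOptions,
  calculateNowPosition,
} from './guideUtils.js';
import { getNow, startOfDay, add } from '../utils/dateTimeUtils.js';

// Illustrative store shapes: groups keyed by id, guide channels as an array.
const channelGroups = { 1: { id: 1, name: 'News' }, 2: { id: 2, name: 'Sports' } };
const guideChannels = [{ id: 10, name: 'BBC One', channel_group_id: 1 }];

// Only groups actually used by guide channels are offered, after the 'all' entry.
getGroupOptions(channelGroups, guideChannels);
// → [{ value: 'all', label: 'All Channel Groups' }, { value: '1', label: 'News' }]

// The default 'All' profile (id '0') is skipped.
const profiles = { 0: { id: '0', name: 'All' }, 3: { id: 3, name: 'Living room' } };
getProfileOptions(profiles);
// → [{ value: 'all', label: 'All Profiles' }, { value: '3', label: 'Living room' }]

// The "now" indicator: -1 outside the guide window, otherwise a pixel offset
// derived from MINUTE_INCREMENT and MINUTE_BLOCK_WIDTH.
const start = startOfDay(getNow());
const end = add(start, 24, 'hour');
console.log(calculateNowPosition(getNow(), start, end));
```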
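
The reworked Plugins page pairs the new Zustand plugin store with a lazily loaded card component behind an ErrorBoundary. The sketch below shows that pattern under the assumption that `PluginCard` accepts a `plugin` prop and that a three-column grid is acceptable; only the store API (`plugins`, `loading`, `fetchPlugins`) and the import paths are taken from the patch.

```jsx
import React, { Suspense, useEffect } from 'react';
import { Loader, SimpleGrid } from '@mantine/core';
import { usePluginStore } from '../store/plugins.jsx';
import ErrorBoundary from '../components/ErrorBoundary.jsx';

const PluginCard = React.lazy(() => import('../components/cards/PluginCard.jsx'));

function PluginGrid() {
  const plugins = usePluginStore((s) => s.plugins);
  const loading = usePluginStore((s) => s.loading);

  // The store owns fetching; components only trigger it and subscribe to the result.
  useEffect(() => {
    usePluginStore.getState().fetchPlugins();
  }, []);

  if (loading) return <Loader />;

  return (
    <ErrorBoundary>
      <Suspense fallback={<Loader />}>
        <SimpleGrid cols={3}>
          {plugins.map((plugin) => (
            <PluginCard key={plugin.key} plugin={plugin} />
          ))}
        </SimpleGrid>
      </Suspense>
    </ErrorBoundary>
  );
}

export default PluginGrid;
```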
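
`SettingsUtils.js` centralises the diff-and-save logic that each settings panel now reuses. Below is a minimal sketch of a submit handler built on `getChangedSettings` and `saveChangedSettings`; the notification wording and the exact shape of the form values are assumptions, and the import paths assume a caller under `frontend/src/pages/`.

```js
import { getChangedSettings, saveChangedSettings } from '../utils/pages/SettingsUtils.js';
import { showNotification } from '../utils/notificationUtils.js';

// `settings` is the keyed object from the settings store ({ key: { id, value, ... } });
// `values` is the current form state for one panel.
export const submitPanel = async (settings, values) => {
  // Array values (e.g. m3u-hash-key) are joined to CSV and compared as strings.
  const changed = getChangedSettings(values, settings);
  if (Object.keys(changed).length === 0) return;

  try {
    // Existing settings are updated by id, new keys are created; throws if the API returns nothing.
    await saveChangedSettings(settings, changed);
    showNotification({ title: 'Saved', message: 'Settings updated', color: 'green' });
  } catch (error) {
    showNotification({ title: 'Error', message: 'Failed to save settings', color: 'red' });
  }
};
```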
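
The DVR grouping helper in `DVRUtils.js` expects the caller to supply time-zone-aware helpers rather than importing dayjs itself. A hedged sketch of wiring it up with `useTimeHelpers` follows; the `recordings` selector on the channels store is an assumption.

```jsx
import React from 'react';
import { useTimeHelpers } from '../utils/dateTimeUtils.js';
import { categorizeRecordings } from '../utils/pages/DVRUtils.js';
import useChannelsStore from '../store/channels.jsx';

function RecordingBuckets() {
  const { toUserTime, userNow } = useTimeHelpers();
  // Assumed selector; the store may expose recordings as an array or keyed object,
  // and categorizeRecordings accepts either.
  const recordings = useChannelsStore((s) => s.recordings);

  // Splits into in-progress / upcoming / completed, deduplicating by program or slot
  // and keeping only the next episode per upcoming series.
  const { inProgress, upcoming, completed } = categorizeRecordings(
    recordings,
    toUserTime,
    userNow()
  );

  return (
    <div>
      {inProgress.length} recording now, {upcoming.length} scheduled, {completed.length} finished
    </div>
  );
}

export default RecordingBuckets;
```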
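
The network-access form validation checks every comma-delimited entry against the CIDR regexes in `networkUtils.js`. A small standalone sketch of the same check; the sample inputs are illustrative.

```js
import { IPV4_CIDR_REGEX, IPV6_CIDR_REGEX } from '../utils/networkUtils.js';

// Mirrors getNetworkAccessFormValidation: every entry must match one of the regexes.
const validateCidrList = (value) =>
  value
    .split(',')
    .every((cidr) => IPV4_CIDR_REGEX.test(cidr) || IPV6_CIDR_REGEX.test(cidr))
    ? null
    : 'Invalid CIDR range';

validateCidrList('192.168.1.0/24,10.0.0.0/8'); // → null (valid)
validateCidrList('192.168.1.0'); // → 'Invalid CIDR range' (missing prefix length)
```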
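
`PluginCardUtils.js` resolves whether a plugin action needs a confirmation dialog, preferring action-level metadata over a plugin-defined `confirm` field. The objects below are illustrative shapes based only on the properties the helper reads.

```js
import { getConfirmationDetails } from '../utils/cards/PluginCardUtils.js';

const action = {
  id: 'purge',
  label: 'Purge cache',
  confirm: { required: true, title: 'Purge?' },
};
const plugin = { name: 'Cache Tools', fields: [] };

const { requireConfirm, confirmTitle, confirmMessage } = getConfirmationDetails(
  action,
  plugin,
  {} // current plugin settings
);
// requireConfirm → true, confirmTitle → 'Purge?',
// confirmMessage → You're about to run "Purge cache" from "Cache Tools".
```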
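
The logos store now gates rendering behind `allowLogoRendering`, flipped by `enableLogoRendering()` once the tables are ready. A speculative sketch of how a table might signal that readiness; the hook name and where it is called from are assumptions, only the store action comes from the patch.

```js
import { useEffect } from 'react';
import useLogosStore from '../store/logos';

// Intended to be called from the Channels/Streams tables once their initial fetch
// has completed, so logo images do not compete with the first table paint.
export function useEnableLogosAfterLoad(isTableReady) {
  useEffect(() => {
    if (isTableReady) {
      useLogosStore.getState().enableLogoRendering();
    }
  }, [isTableReady]);
}
```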