mirror of
https://github.com/chrisbenincasa/tunarr.git
synced 2026-04-18 09:03:35 -04:00
Dev -> Main (1.2.0) (#1648)
3 .gitignore (vendored)
@@ -40,4 +40,5 @@ tunarr-openapi.json
web/.tanstack
:memory:*
.serena/
@@ -1,3 +1,27 @@
# EPG

Tunarr generates an [XMLTV](https://wiki.xmltv.org/index.php/XMLTVFormat)-formatted Electronic Program Guide (EPG) for all active channels. This guide data can be consumed by media clients (Plex, Jellyfin, Emby, Channels DVR, etc.) to display program schedules and metadata alongside the live stream.

## XMLTV Output

Tunarr writes XMLTV data to an `.xml` file in the data directory, but clients should always consume it via the API endpoint rather than reading the file directly:

```
http://<tunarr-host>:<port>/api/xmltv.xml
```

The full URL is also available in the Tunarr web UI under **Settings > HDHR / M3U**. Point your client's guide data source at this URL to populate the program guide.

## Program Metadata

Tunarr populates the XMLTV output with program metadata sourced from your media libraries, including:

- Program title and episode title
- Description / plot summary
- Air date and year
- Content rating
- Cast and crew credits
- **Genres** — included under the XMLTV `<category>` tag, sourced from the genre metadata attached to each program in your media library. Clients that display genre information in their guide (e.g., Channels DVR) will pick these up automatically.

!!! info

    Genre data is only available for programs that have genre metadata in the originating media source. If a program's entry in your media server has no genres attached, the `<category>` tag will not be present for that program in the XMLTV output.
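For illustration, a single program entry in the XMLTV output looks roughly like the following. The channel ID, times, and field values are invented, and the exact set of attributes Tunarr emits may differ; this only sketches the standard XMLTV shape that clients consume:

```xml
<programme start="20240101200000 +0000" stop="20240101203000 +0000" channel="1">
  <title>30 Rock</title>
  <sub-title>Pilot</sub-title>
  <desc>Liz Lemon adjusts to a new boss at her sketch show.</desc>
  <category>Comedy</category>
  <rating>
    <value>TV-14</value>
  </rating>
</programme>
```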
@@ -40,7 +40,19 @@ This mode does not perform any stream normalization. When the channel m3u8 playl
#### Things to consider

Because this mode does not perform stream normalization, there may be issues when transitioning between programs; the mode requires clients to essentially "reset" themselves between each program for transitions to function as expected. Some clients that are known to work in this mode are Jellyfin and MPV, but there are almost certainly others.

### HLS Direct v2

#### How does it work?

HLS Direct v2 works like standard HLS mode, but without transcoding. Rather than returning a single-item m3u8 pointing directly to the program stream (as HLS Direct does), it produces a proper continuous HLS playlist of segmented chunks — the same structure as the regular HLS mode. No codec conversion or normalization is applied; the source content is remuxed directly into the segments.

#### Things to consider

Users who want direct streaming without transcoding may find this mode more compatible than HLS Direct, particularly with clients that do not rely on FFmpeg's HLS demuxer implementation. Because the output is a standard segmented HLS playlist, clients that expect that structure should handle program transitions more gracefully than they would with HLS Direct.

As with HLS Direct, no stream normalization is applied, so mixed source formats (different codecs, resolutions, frame rates) across lineup items may cause playback issues depending on the client.

### MPEG-TS
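The structural difference between the two modes can be sketched with hypothetical playlists (URLs, segment names, and durations are invented for illustration; Tunarr's actual output will differ):

```
# HLS Direct: a single-item playlist pointing at the program source
#EXTM3U
#EXT-X-VERSION:3
#EXTINF:-1,
http://<tunarr-host>:<port>/stream/program-source

# HLS Direct v2: a continuous playlist of remuxed segments
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:4.000,
segment00120.ts
#EXTINF:4.000,
segment00121.ts
```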
@@ -30,13 +30,13 @@ Configure output video parameters including hardware acceleration, format, resol
Hardware acceleration offloads video encoding and decoding from the CPU to dedicated hardware (GPU or media engine), dramatically reducing CPU usage and allowing real-time transcoding on modest hardware.

| Mode | Platform | When to Use |
|------|----------|-------------|
| **None** | Any | Software-only transcoding. CPU-intensive. Only use if no hardware acceleration is available or for debugging. |
| **CUDA** | NVIDIA GPUs (Linux/Windows) | Requires an NVIDIA GPU with NVENC support. The most reliable hardware path on Linux and Windows with NVIDIA hardware. |
| **VAAPI** | Intel / AMD GPUs (Linux) | Uses the Video Acceleration API. Defaults to `/dev/dri/renderD128`. Supported drivers: `system`, `ihd`, `i965`, `radeonsi`, `nouveau`. **Recommended for Intel and AMD users on Linux.** |
| **QSV** | Intel GPUs (Linux/Windows) | Intel Quick Sync Video. Cannot hardware-decode 10-bit H.264 or HEVC content. Generally less stable than VAAPI on Linux; may work well on Windows. |
| **VideoToolbox** | macOS | Apple's hardware video codec framework. Works on both Apple Silicon and Intel Macs. Recommended for all macOS users. |

| Mode | Platform | Tonemapping | When to Use |
|------|----------|-------------|-------------|
| **None** | Any | Software | Software-only transcoding. CPU-intensive. Only use if no hardware acceleration is available or for debugging. |
| **CUDA** | NVIDIA GPUs (Linux/Windows) | Hardware (Vulkan) / Software fallback | Requires an NVIDIA GPU with NVENC support. The most reliable hardware path on Linux and Windows with NVIDIA hardware. |
| **VAAPI** | Intel / AMD GPUs (Linux) | `tonemap_vaapi` / `tonemap_opencl` / Software fallback | Uses the Video Acceleration API. Defaults to `/dev/dri/renderD128`. Supported drivers: `system`, `ihd`, `i965`, `radeonsi`, `nouveau`. **Recommended for Intel and AMD users on Linux.** |
| **QSV** | Intel GPUs (Linux/Windows) | Hardware (experimental) | Intel Quick Sync Video. Cannot hardware-decode 10-bit H.264 or HEVC content. Generally less stable than VAAPI on Linux; may work well on Windows. |
| **VideoToolbox** | macOS | Not supported | Apple's hardware video codec framework. Works on both Apple Silicon and Intel Macs. Recommended for all macOS users. |

!!! info "Linux with Intel GPU: VAAPI vs QSV"

    If you are running Tunarr on Linux with an Intel GPU, **VAAPI is the recommended choice** over QSV. VAAPI is generally more stable and better tested with Tunarr. QSV may work for some users but can produce compatibility issues with certain codecs and bit depths.
@@ -46,6 +46,8 @@ When **VAAPI** or **QSV** is selected, additional options appear:
- **VAAPI Driver** — Select the driver your system uses (`system`, `ihd`, `i965`, `radeonsi`, `nouveau`). Leave as `system` unless you know you need a specific driver.
- **VAAPI Device Path** — Path to the DRI render device (default: `/dev/dri/renderD128`). Change this only if your device is at a non-standard path (e.g., `/dev/dri/renderD129` when multiple GPUs are present).

When **VAAPI** is selected, Tunarr will automatically use hardware-accelerated pad filters (`pad_vaapi`, or `pad_opencl` as a fallback) for letterboxing and pillarboxing operations. This keeps padding on the GPU and avoids a round-trip to software. If your hardware does not support these filters or you observe artifacts, you can force software padding by setting `TUNARR_DISABLE_VAAPI_PAD=true`.

### Video Format

Controls the codec used to encode the output video stream.
@@ -85,6 +87,29 @@ When enabled, forces the output stream to a consistent frame rate (e.g., 23.976,
Applies a deinterlace filter to the output. Enable this if your source content is interlaced — common with recordings from broadcast or cable TV. The specific deinterlace filter used is configured globally in FFmpeg settings. This setting has no effect on progressive (non-interlaced) source content.

### HDR Tonemapping

!!! warning "Experimental"

    HDR tonemapping is an experimental feature. Results may vary depending on hardware, driver version, and source content. It may cause stream errors on some configurations. Monitor logs closely when first enabling it.

When Tunarr encounters HDR content (HDR10 or HLG), it can convert it to SDR during transcoding. This is useful when playback devices or clients do not support HDR, or when a consistent SDR output is required regardless of source content.

Tonemapping is **disabled by default**. To enable it, set the `TUNARR_TONEMAP_ENABLED` environment variable (see [Environment Variables](../../getting-started/run.md#transcoding)).

When enabled, Tunarr automatically selects the best available tonemapping method for the configured hardware acceleration mode:

| Hardware Mode | Method | Notes |
|---|---|---|
| **VAAPI** | `tonemap_vaapi` → `tonemap_opencl` → software | Falls back through the chain based on what filters are supported by your hardware and FFmpeg build. |
| **CUDA** | Hardware tonemapping via Vulkan | Requires Vulkan support on the host. If Vulkan is unavailable or causing stream errors, set `TUNARR_DISABLE_VULKAN=true` to fall back to software tonemapping. |
| **QSV** | Hardware tonemapping | Experimental. May not work reliably on all hardware. |
| **None (Software)** | Software tonemapping | Used as the final fallback for all modes. More CPU-intensive than hardware paths. |

Tonemapping is only applied to content Tunarr identifies as HDR (HDR10 or HLG). SDR content is unaffected.

!!! info "Color Metadata"

    Tonemapping relies on color metadata (color space, color transfer, color primaries) stored in Tunarr's database for each program. This metadata is populated automatically when media is scanned or imported from a media source. If tonemapping is not being applied to content you expect to be HDR, try re-scanning the relevant library to ensure the metadata has been recorded.
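Conceptually, the software fallback performs the well-known FFmpeg HDR-to-SDR filter chain. Tunarr builds its filter graph internally, so the exact command it runs will differ; the following is only a standalone sketch of the technique, with hypothetical input/output filenames:

```
ffmpeg -i hdr-input.mkv \
  -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" \
  -c:v libx264 sdr-output.mkv
```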
---

## Audio
@@ -136,6 +161,23 @@ Adjusts the output volume relative to the source.
!!! info

    This setting has no effect when **Audio Format** is set to **Copy**, since the audio stream is passed through without processing.

### Loudness Normalization

Tunarr supports [EBU R128](https://en.wikipedia.org/wiki/EBU_R_128) loudness normalization via FFmpeg's `loudnorm` filter. This provides more perceptually consistent loudness leveling across different programs, compared to the simple volume percentage adjustment above.

When enabled, the following parameters can be configured:

| Parameter | Range | Default | Description |
|---|---|---|---|
| **Integrated Loudness (I)** | -70.0 to -5.0 LUFS | -24.0 | Target integrated loudness. EBU R128 broadcast standard is -23 LUFS; -24 is a common streaming target. |
| **Loudness Range (LRA)** | 1.0 to 50.0 LU | 7.0 | Target loudness range (dynamic range). Lower values compress dynamics more aggressively. |
| **Max True Peak (TP)** | -9.0 to 0.0 dBTP | -2.0 | Maximum allowed true peak level, preventing clipping. |

Loudness normalization and **Audio Volume %** can be used independently of each other.

!!! info

    Loudness normalization has no effect when **Audio Format** is set to **Copy**, since the audio stream is passed through without processing.
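These parameters map onto FFmpeg's `loudnorm` filter options of the same names. A standalone command equivalent to the defaults above (the actual command Tunarr generates differs, and the filenames here are placeholders) would be:

```
ffmpeg -i input.mkv -af loudnorm=I=-24:LRA=7:TP=-2 -c:v copy output.mkv
```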
---

## Advanced Options
@@ -18,6 +18,24 @@ If we instead wanted to air these two episodes, then have the channel play Flex


See below for an example of our schedule now that we have Flex after our two episodes air. Now it will alternate Show 1 Day 1, Show 2 Day 1, Show 1 Day 2, Show 2 Day 2, etc.




## Padding

Padding controls how Tunarr handles the gap between when a program finishes and when the next scheduled slot is due to start. When a program ends before its slot's start time, Tunarr fills the gap with [Flex](/configure/channels/flex) content (or silence if no filler is configured) so that the next program begins at exactly the scheduled time.

### Global Pad Time

The **Pad Times** setting applies a uniform pad duration to all slots in the schedule. This is the primary way to enforce "hard" start times: if an episode finishes 8 minutes before the next slot, those 8 minutes are filled with flex content.

### Per-Slot Padding

Individual slots can override the global pad time with their own value. This is useful when different slots have different tolerance requirements — for example, a morning block that needs tight padding while an evening block can be more flexible.

To set per-slot padding, open the slot's options in the Time Slot editor. When a slot has its own pad time set, it takes precedence over the global **Pad Times** value for that slot only. Slots without a per-slot override continue to use the global value.

### Max Lateness

The **Max Lateness** setting is a companion to padding. It defines how far a program is allowed to run past the slot's scheduled start time before the slot is skipped and the next one begins. For example, with a max lateness of 5 minutes, an episode that would end 4 minutes into the next slot will still be allowed to play in full; one that would end 6 minutes into the next slot will be cut off at the slot boundary.
@@ -23,6 +23,37 @@ Additionally, Tunarr has custom levels for HTTP traffic:
| `http` | 25 | Incoming HTTP request logging |
| `http_out` | 15 | Outgoing HTTP request logging (to Plex, Jellyfin, etc.) |

## Per-Category Log Levels

In addition to the global log level, Tunarr supports setting independent log levels for specific subsystems. This is useful when debugging a particular area without flooding the logs with output from the entire application.

The following log categories are available:

| Category | Description |
|----------|-------------|
| `scheduling` | Logs from the channel scheduling and guide generation subsystem |
| `streaming` | Logs from the streaming and transcode session subsystem |

A category log level overrides the global level for that category only. For example, you can keep the global level at `info` while setting `streaming` to `debug` to get detailed stream logs without extra noise from elsewhere.

Per-category log levels are configured in **Settings > System > Logging**, or via the API:

```bash
curl -X PUT "http://localhost:8000/api/system/settings" \
  -H "Content-Type: application/json" \
  -d '{
    "logging": {
      "logLevel": "info",
      "categoryLogLevel": {
        "scheduling": "debug",
        "streaming": "trace"
      }
    }
  }'
```

Set a category's value to `null` or omit it to fall back to the global log level.

## Configuration

### Via Environment Variables
@@ -42,6 +73,7 @@ LOG_DIRECTORY=/path/to/logs
Navigate to **Settings > System > Logging** to configure:

- Log level
- Per-category log levels
- Log file directory
- Log rolling settings

1 docs/generated/tunarr-v1.2.0-dev.1-openapi.json (new file)
File diff suppressed because one or more lines are too long
@@ -67,9 +67,14 @@ Tunarr has various command line / environment variables for configuration. These
| Environment Variable | Command Line Flag | Default | Description |
| -------------------- | ----------------- | ------- | ----------- |
| `LOG_LEVEL` | N/A | `info` | Sets the log level. Valid values: `trace`, `debug`, `info`, `warn`, `error`, `fatal`, `silent`. Overrides the UI setting. |
| `TUNARR_LOG_LEVEL` | N/A | `info` | Sets the log level. Valid values: `trace`, `debug`, `info`, `warn`, `error`, `fatal`, `silent`. Overrides the UI setting. |
| `LOG_DIRECTORY` | N/A | (data dir) | Sets a custom directory for log files. |

### Streaming

| Environment Variable | Command Line Flag | Default | Description |
| -------------------- | ----------------- | ------- | ----------- |
| `TUNARR_SESSION_CLEANUP_DELAY_SECONDS` | N/A | `15` | How long to wait before cleaning up a transcode session after the last disconnect |

### Search (Meilisearch)

| Environment Variable | Command Line Flag | Default | Description |
@@ -81,6 +86,20 @@ Tunarr has various command line / environment variables for configuration. These
| `TUNARR_SEARCH_REDUCE_INDEXER_MEMORY_USAGE` | N/A | `true` | Reduces Meilisearch memory usage during indexing. See [Meilisearch docs](https://www.meilisearch.com/docs/learn/self_hosted/configure_meilisearch_at_launch#reduce-indexing-memory-usage). May [impact file storage](https://github.com/chrisbenincasa/tunarr/issues/1558). Not available on Windows. |
| `TUNARR_DISABLE_SEARCH_SNAPSHOT_IN_BACKUP` | N/A | `false` | When set to `true`, excludes Meilisearch snapshots from backups. The search index will be rebuilt on restore. |

### Artwork

| Environment Variable | Command Line Flag | Default | Description |
| -------------------- | ----------------- | ------- | ----------- |
| `TUNARR_PROXY_ARTWORK` | N/A | `false` | When set to `true`, Tunarr proxies artwork requests through itself rather than redirecting clients directly to the media server URL. Useful when clients cannot reach your media server directly (e.g., behind a firewall or on a different network segment). |

### Transcoding

| Environment Variable | Command Line Flag | Default | Description |
| -------------------- | ----------------- | ------- | ----------- |
| `TUNARR_TONEMAP_ENABLED` | N/A | `false` | Enable experimental HDR tonemapping. When set to `true`, Tunarr will apply HDR-to-SDR tonemapping when HDR source content is detected. See [HDR Tonemapping](../configure/ffmpeg/transcode_config.md#hdr-tonemapping). |
| `TUNARR_DISABLE_VULKAN` | N/A | `false` | Disable Vulkan-based tonemapping in the CUDA pipeline. Use if Vulkan is not available on your system or is causing stream failures; Tunarr will fall back to software tonemapping. |
| `TUNARR_DISABLE_VAAPI_PAD` | N/A | `false` | Disable hardware pad filters (`pad_vaapi`/`pad_opencl`) for VAAPI and fall back to software padding. Use if hardware padding causes artifacts or stream errors on your hardware. |
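For Docker users, these transcoding variables can be set in a compose file. The service name and image tag below are illustrative; use whatever your existing installation defines:

```yaml
services:
  tunarr:
    image: chrisbenincasa/tunarr:latest
    environment:
      - TUNARR_TONEMAP_ENABLED=true
      # Uncomment if Vulkan tonemapping causes stream errors on CUDA:
      # - TUNARR_DISABLE_VULKAN=true
```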
### Performance

| Environment Variable | Command Line Flag | Default | Description |
@@ -21,6 +21,7 @@ Tunarr's search feature different typed fields, such as `string`, `number`, and
| `<` or `<=` | Starts With | `title <= A` |
| `!=` | Not Equals | `title != "Sesame Street"` |
| `~` | Contains | `title ~ Hours` |
| `!~` | Not Contains | `title !~ "Sesame"` |
| `in` | Set includes | `title IN ["30 Rock", "Arrested Development"]` |
| `not in` | Set excludes | `genre NOT IN [comedy, horror]` |
@@ -74,4 +75,6 @@ Fields available for search:
| `show_title` | `string` | Title of the show a program belongs to (only applicable to episodes) | 30 Rock |
| `show_genre` | `string` | Genre of the show a program belongs to (only applicable to episodes) | comedy |
| `show_tags` | `string` | Tag on the show the program belongs to (only applicable to episodes) | - |
| `show_studio` | `string` | Studio on the show the program belongs to | - |
| `media_source_name` | `string` | Name of the media source (Plex server, Jellyfin server, etc.) the program was imported from | `My Plex` |
| `library_name` | `string` | Name of the library within the media source the program belongs to | `TV Shows` |
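These fields combine with the operators documented above. For example (the titles and names here are placeholders):

```
show_title IN ["30 Rock", "Arrested Development"]
library_name ~ Shows
media_source_name != "My Plex"
```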
11 package.json
@@ -1,6 +1,6 @@
{
  "name": "tunarr",
  "version": "1.1.3",
  "version": "1.2.0-dev.1",
  "description": "Create LiveTV channels from your Plex media",
  "type": "module",
  "author": "chrisbenincasa",
@@ -27,8 +27,8 @@
"@semantic-release/changelog": "^6.0.3",
"@types/node": "22.10.7",
"@types/semver": "^7.7.1",
"@typescript-eslint/eslint-plugin": "^8.21.0",
"@typescript-eslint/parser": "^8.21.0",
"@typescript-eslint/eslint-plugin": "catalog:",
"@typescript-eslint/parser": "catalog:",
"@vitest/coverage-v8": "^3.2.4",
"esbuild": "^0.21.5",
"eslint": "catalog:",
@@ -62,9 +62,8 @@
  "kysely": "patches/kysely.patch"
},
"overrides": {
  "eslint": "9.39.2",
  "@types/node": "22.10.7",
  "typescript": "5.9.3"
  "eslint": "catalog:",
  "@types/node": "22.10.7"
},
"onlyBuiltDependencies": [
  "@swc/core",
1655 pnpm-lock.yaml (generated)
File diff suppressed because it is too large
@@ -5,11 +5,13 @@ packages:
  - shared

catalog:
  '@typescript-eslint/eslint-plugin': ^8.55.0
  '@typescript-eslint/parser': ^8.55.0
  dayjs: ^1.11.14
  eslint: 9.17.0
  eslint: 9.39.2
  lodash-es: ^4.17.21
  random-js: 2.1.0
  typescript: 5.7.3
  zod: ^4.1.5
  typescript: 5.9.3
  zod: ^4.3.6

enablePrePostScripts: true
59 scripts/bump-version.sh (executable file)
@@ -0,0 +1,59 @@
#!/usr/bin/env bash

set -euo pipefail

if [[ $# -ne 1 ]]; then
  echo "Usage: $0 <version>"
  echo "Example: $0 1.2.0"
  exit 1
fi

VERSION="$1"

# Validate version format (basic semver check)
if ! [[ "$VERSION" =~ ^[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9.]+)?$ ]]; then
  echo "Error: Invalid version format. Expected semver (e.g., 1.2.0 or 1.2.0-beta.1)"
  exit 1
fi

# Check for jq
if ! command -v jq &> /dev/null; then
  echo "Error: jq is required but not installed."
  exit 1
fi

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT_DIR="$(dirname "$SCRIPT_DIR")"

# Package.json files to update (relative to root)
PACKAGE_FILES=(
  "package.json"
  "server/package.json"
  "shared/package.json"
  "types/package.json"
  "web/package.json"
)

echo "Updating version to $VERSION in all package.json files..."

for file in "${PACKAGE_FILES[@]}"; do
  filepath="$ROOT_DIR/$file"
  if [[ -f "$filepath" ]]; then
    tmp=$(mktemp)
    jq --arg version "$VERSION" '.version = $version' "$filepath" > "$tmp"
    mv "$tmp" "$filepath"
    echo "  Updated $file"
  else
    echo "  Warning: $file not found, skipping"
  fi
done

echo ""
echo "Committing changes..."

cd "$ROOT_DIR"
git add "${PACKAGE_FILES[@]}"
git commit -m "chore: bump version to $VERSION"

echo ""
echo "Done! Version bumped to $VERSION and committed."
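The script's semver guard can be exercised on its own. This hypothetical snippet reuses the same regex to classify a few candidate versions:

```shell
#!/usr/bin/env bash
# Same validation pattern as bump-version.sh, applied to sample inputs
pattern='^[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9.]+)?$'
for ver in 1.2.0 1.2.0-beta.1 1.2 v1.2.0; do
  if [[ "$ver" =~ $pattern ]]; then
    echo "$ver: valid"
  else
    echo "$ver: invalid"
  fi
done
# Prints "valid" for the first two versions and "invalid" for the last two.
```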
@@ -1,6 +1,6 @@
{
  "name": "@tunarr/server",
  "version": "1.1.3",
  "version": "1.2.0-dev.1",
  "description": "Create LiveTV channels from your Plex media",
  "license": "Zlib",
  "private": true,
@@ -79,7 +79,7 @@
"pino": "^9.9.1",
"pino-pretty": "^11.3.0",
"pino-roll": "^1.3.0",
"random-js": "2.1.0",
"random-js": "catalog:",
"reflect-metadata": "^0.2.2",
"retry": "^0.13.1",
"sonic-boom": "4.2.0",
@@ -88,7 +88,7 @@
"tslib": "^2.8.1",
"uuid": "^9.0.1",
"yargs": "^17.7.2",
"zod": "^4.1.5"
"zod": "catalog:"
},
"devDependencies": {
  "@faker-js/faker": "^9.9.0",
@@ -131,7 +131,7 @@
"tsconfig-paths": "^4.2.0",
"tsx": "^4.20.5",
"typed-emitter": "^2.1.0",
"typescript": "5.7.3",
"typescript": "catalog:",
"typescript-eslint": "^8.41.0",
"vitest": "^3.2.4"
},
@@ -1,4 +1,5 @@
import '@dotenvx/dotenvx/config';
import dotenv from '@dotenvx/dotenvx';
dotenv.config({ debug: false, quiet: true, ignore: ['MISSING_ENV_FILE'] });

import esbuild from 'esbuild';
import fg from 'fast-glob';
@@ -6,6 +6,7 @@ import cors from '@fastify/cors';
import fastifyMultipart from '@fastify/multipart';
import fpStatic from '@fastify/static';
import fastifySwagger from '@fastify/swagger';
import glob from 'fast-glob';
import fastify, { FastifySchema } from 'fastify';
import fastifyGracefulShutdown from 'fastify-graceful-shutdown';
import fp from 'fastify-plugin';
@@ -21,6 +22,7 @@ import type { RouteOptions } from 'fastify/types/route.js';
import { inject, injectable } from 'inversify';
import {
  isArray,
  isBoolean,
  isNumber,
  isString,
  isUndefined,
@@ -37,10 +39,12 @@ import { HdhrApiRouter } from './api/hdhrApi.js';
import { apiRouter } from './api/index.js';
import { streamApi } from './api/streamApi.js';
import { videoApiRouter } from './api/videoApi.js';
import { defaultHlsOptions } from './ffmpeg/builder/constants.ts';
import { type ServerOptions, serverOptions } from './globals.js';
import { IWorkerPool } from './interfaces/IWorkerPool.ts';
import { ServerContext, ServerRequestContext } from './ServerContext.js';
import { TUNARR_ENV_VARS } from './util/env.ts';
import { Result } from './types/result.ts';
import { getBooleanEnvVar, TUNARR_ENV_VARS } from './util/env.ts';
import { filename, isDev, run, timeoutPromise } from './util/index.js';
import { type Logger } from './util/logging/LoggerFactory.js';
@@ -63,6 +67,7 @@ export class Server {
  trustProxy: this.serverOptions.trustProxy,
})
  .setValidatorCompiler(validatorCompiler)
  // eslint-disable-next-line @typescript-eslint/no-unsafe-argument
  .setSerializerCompiler(serializerCompiler)
  .withTypeProvider<ZodTypeProvider>();
@@ -214,7 +219,16 @@ export class Server {
);

this.app.addHook('onResponse', (req, rep, done) => {
  if (req.routeOptions.config.disableRequestLogging) {
  if (
    isBoolean(req.routeOptions.config.disableRequestLogging) &&
    req.routeOptions.config.disableRequestLogging
  ) {
    return;
  }

  const onlyErrors =
    req.routeOptions.config.disableRequestLogging === 'only-errors';
  if (onlyErrors && rep.raw.statusCode < 400) {
    return;
  }
@@ -482,10 +496,10 @@ export class Server {
  level: 'warning',
});
} catch (e) {
  this.logger.debug(e, 'Error sending shutdown signal to frontend');
  this.logger.warn(e, 'Error sending shutdown signal to frontend');
}

this.logger.debug('Canceling all active scans');
this.logger.info('Canceling all active scans');
this.serverContext.mediaSourceScanCoordinator.cancelAll();

try {
@@ -500,13 +514,13 @@ export class Server {
this.serverContext.searchService.stop();

try {
  this.logger.debug('Pausing all on-demand channels');
  this.logger.info('Pausing all on-demand channels');
  await this.serverContext.onDemandChannelService.pauseAllChannels();
} catch (e) {
  this.logger.error(e, 'Error pausing on-demand channels');
}

this.logger.debug('Shutting down all sessions');
this.logger.info('Shutting down all sessions');
for (const session of values(
  this.serverContext.sessionManager.allSessions(),
)) {
@@ -522,17 +536,31 @@ export class Server {
  }
}

this.logger.debug('Shutting down all workers');
try {
  await container.get<IWorkerPool>(KEYS.WorkerPool).shutdown(5_000);
} catch (e) {
  this.logger.error(e, 'Error shutting down workers');
// TODO: This is a bug because in theory
// sessions can override this. But for the most part,
// it works
const baseStreamsDir = path.join(
  serverOptions().databaseDirectory,
  defaultHlsOptions.segmentBaseDirectory,
);
const transcodeDirs = await glob(path.join(`${baseStreamsDir}/*`));
await Promise.all(
  transcodeDirs.map((dir) => Result.attemptAsync(() => fs.rmdir(dir))),
);

if (getBooleanEnvVar(TUNARR_ENV_VARS.USE_WORKER_POOL_ENV_VAR, false)) {
  this.logger.info('Shutting down all workers');
  try {
    await container.get<IWorkerPool>(KEYS.WorkerPool).shutdown(5_000);
  } catch (e) {
    this.logger.error(e, 'Error shutting down workers');
  }
}

this.serverContext.eventService.close();

try {
  this.logger.debug('Waiting for pending jobs to complete!');
  this.logger.info('Waiting for pending jobs to complete!');
  await Promise.race([
    schedule.gracefulShutdown(),
    new Promise<boolean>((resolve) => {
5 server/src/api/ApiController.ts (new file)
@@ -0,0 +1,5 @@
import type { RouterPluginAsyncCallback } from '../types/serverType.js';

export interface ApiController {
  mount: RouterPluginAsyncCallback;
}
@@ -1,8 +1,11 @@
import { isNonEmptyString } from '@tunarr/shared/util';
import { Person } from '@tunarr/types';
import { Person as PersonSchema } from '@tunarr/types/schemas';
import axios, { AxiosHeaders, isAxiosError } from 'axios';
import type { HttpHeader } from 'fastify/types/utils.js';
import { inject, injectable } from 'inversify';
import { trimStart } from 'lodash-es';
import { isNull, omitBy, trimStart } from 'lodash-es';
import type stream from 'node:stream';
import { match } from 'ts-pattern';
import z from 'zod';
import { ArtworkTypes } from '../db/schema/Artwork.ts';
@@ -11,6 +14,7 @@ import { DrizzleDBAccess } from '../db/schema/index.ts';
import { globalOptions } from '../globals.ts';
import { KEYS } from '../types/inject.ts';
import { RouterPluginAsyncCallback } from '../types/serverType.js';
import { getBooleanEnvVar, TUNARR_ENV_VARS } from '../util/env.ts';

@injectable()
export class CreditsApiController {
@@ -172,7 +176,37 @@ export class CreditsApiController {
// break;
// }

return res.redirect(url.toString());
const fullUrl = url.toString();

if (getBooleanEnvVar(TUNARR_ENV_VARS.PROXY_ARTWORK_ENV_VAR, false)) {
  try {
    const proxyRes = await axios.request<stream.Readable>({
      url: fullUrl,
      responseType: 'stream',
    });

    let headers: Partial<Record<HttpHeader, string | string[]>>;
    if (proxyRes.headers instanceof AxiosHeaders) {
      headers = {
        ...proxyRes.headers
      };
    } else {
      headers = { ...omitBy(proxyRes.headers, isNull) };
    }

    return res
      .status(200)
      .headers(headers)
      .send(proxyRes.data);
  } catch (e) {
    if (isAxiosError(e) && e.response?.status === 404) {
      return res.status(404).send();
    }
    throw e;
  }
}

return res.redirect(fullUrl);
} else {
  return res.sendFile(art.sourcePath);
}
@@ -168,7 +168,7 @@ export const ffmpegSettingsRouter: RouterPluginCallback = (
    '/transcode_configs',
    {
      schema: {
-       tags: ['Settings'],
+       tags: ['Settings', 'Transcode Configs'],
        response: {
          200: z.array(TranscodeConfigSchema),
        },
@@ -185,7 +185,7 @@ export const ffmpegSettingsRouter: RouterPluginCallback = (
    '/transcode_configs/:id',
    {
      schema: {
-       tags: ['Settings'],
+       tags: ['Settings', 'Transcode Configs'],
        params: z.object({
          id: z.string().uuid(),
        }),
@@ -211,6 +211,7 @@ export const ffmpegSettingsRouter: RouterPluginCallback = (
    '/transcode_configs/:id/copy',
    {
      schema: {
+       tags: ['Settings', 'Transcode Configs'],
        params: z.object({
          id: z.uuid(),
        }),
@@ -244,7 +245,7 @@ export const ffmpegSettingsRouter: RouterPluginCallback = (
    '/transcode_configs',
    {
      schema: {
-       tags: ['Settings'],
+       tags: ['Settings', 'Transcode Configs'],
        body: TranscodeConfigSchema.omit({
          id: true,
        }),
@@ -265,7 +266,7 @@ export const ffmpegSettingsRouter: RouterPluginCallback = (
    '/transcode_configs/:id',
    {
      schema: {
-       tags: ['Settings'],
+       tags: ['Settings', 'Transcode Configs'],
        body: TranscodeConfigSchema,
        params: IdPathParamSchema,
        response: {
@@ -286,7 +287,7 @@ export const ffmpegSettingsRouter: RouterPluginCallback = (
    '/transcode_configs/:id',
    {
      schema: {
-       tags: ['Settings'],
+       tags: ['Settings', 'Transcode Configs'],
        params: IdPathParamSchema,
        response: {
          200: z.void(),
@@ -31,6 +31,7 @@ import { mediaSourceRouter } from './mediaSourceApi.js';
import { metadataApiRouter } from './metadataApi.js';
import { plexApiRouter } from './plexApi.ts';
import { plexSettingsRouter } from './plexSettingsApi.js';
+ import { ProgramGroupingApiController } from './programGroupingApi.ts';
import { programmingApi } from './programmingApi.js';
import { sessionApiRouter } from './sessionApi.js';
import { settingsApi } from './settingsApi.ts';
@@ -79,7 +80,8 @@ export const apiRouter: RouterPluginAsyncCallback = async (fastify) => {
    .register(settingsApi)
    .register(trashApi)
    .register(container.get(SmartCollectionsApiController).mount)
-   .register(container.get(CreditsApiController).mount);
+   .register(container.get(CreditsApiController).mount)
+   .register(container.get(ProgramGroupingApiController).mount);

  fastify.get(
    '/version',
@@ -34,6 +34,7 @@ import type { MarkOptional, StrictExtract } from 'ts-essentials';
import { match, P } from 'ts-pattern';
import { v4 } from 'uuid';
import z from 'zod/v4';
+ import { DeleteMediaSourceCommand } from '../commands/media_source/DeleteMediaSourceCommand.ts';
import { container } from '../container.ts';
import type { MediaSourceWithRelations } from '../db/schema/derivedTypes.js';
import { EntityMutex } from '../services/EntityMutex.ts';
@@ -734,10 +735,9 @@ export const mediaSourceRouter: RouterPluginAsyncCallback = async (
    },
    async (req, res) => {
      try {
-       const { deletedServer } =
-         await req.serverCtx.mediaSourceDB.deleteMediaSource(
-           tag(req.params.id),
-         );
+       const deletedServer = await container
+         .get<DeleteMediaSourceCommand>(DeleteMediaSourceCommand)
+         .run(tag(req.params.id));

        // Are these useful? What do they even do?
        req.serverCtx.eventService.push({
server/src/api/programGroupingApi.ts (new file)
@@ -0,0 +1,64 @@
import { tag } from '@tunarr/types';
import { ProgramGroupingSchema } from '@tunarr/types/schemas';
import { inject, injectable } from 'inversify';
import z from 'zod';
import { MaterializeProgramGroupings } from '../commands/MaterializeProgramGroupings.ts';
import { container } from '../container.ts';
import { ProgramGroupingDB } from '../db/ProgramGroupingDB.ts';
import {
  MediaSourceId,
  RemoteSourceType,
  RemoteSourceTypes,
} from '../db/schema/base.ts';
import { BatchLookupExternalProgrammingSchema } from '../types/schemas.ts';
import { RouterPluginAsyncCallback } from '../types/serverType.js';
import { groupByUniq, inConstArr } from '../util/index.ts';
import { ApiController } from './ApiController.ts';

@injectable()
export class ProgramGroupingApiController implements ApiController {
  constructor(
    @inject(ProgramGroupingDB) private programGroupingDB: ProgramGroupingDB,
  ) {}

  // eslint-disable-next-line @typescript-eslint/require-await
  mount: RouterPluginAsyncCallback = async (fastify) => {
    fastify.post(
      '/program_groupings/batch/lookup',
      {
        schema: {
          tags: ['Program Groupings'],
          operationId: 'batchGetProgramGroupingsByExternalIds',
          body: BatchLookupExternalProgrammingSchema,
          response: {
            200: z.record(z.string(), ProgramGroupingSchema),
          },
        },
      },
      async (req, res) => {
        const ids = req.body.externalIds
          .values()
          .filter(([source]) => inConstArr(RemoteSourceTypes, source))
          .map(
            ([source, sourceId, id]) =>
              [source, tag<MediaSourceId>(sourceId), id] as [
                RemoteSourceType,
                MediaSourceId,
                string,
              ],
          )
          .toArray();

        const results = await this.programGroupingDB.lookupByExternalIds(
          new Set(ids),
        );

        const materialized = await container
          .get<MaterializeProgramGroupings>(MaterializeProgramGroupings)
          .execute(results);

        return res.send(groupByUniq(materialized, (p) => p.uuid));
      },
    );
  };
}
@@ -3,7 +3,11 @@ import { ProgramType } from '@/db/schema/Program.js';
import { ProgramGroupingType } from '@/db/schema/ProgramGrouping.js';
import { JellyfinApiClient } from '@/external/jellyfin/JellyfinApiClient.js';
import { PlexApiClient } from '@/external/plex/PlexApiClient.js';
- import { PagingParams, TruthyQueryParam } from '@/types/schemas.js';
+ import {
+   BatchLookupExternalProgrammingSchema,
+   PagingParams,
+   TruthyQueryParam,
+ } from '@/types/schemas.js';
import type { RouterPluginAsyncCallback } from '@/types/serverType.js';
import {
  groupByUniq,
@@ -13,6 +17,7 @@ import {
  isHttpUrl,
  isNonEmptyString,
} from '@/util/index.js';
+ import { getBooleanEnvVar, TUNARR_ENV_VARS } from '@/util/env.js';
import { LoggerFactory } from '@/util/logging/LoggerFactory.js';
import { seq } from '@tunarr/shared/util';
import type { Episode, MusicAlbum, MusicTrack, Season } from '@tunarr/types';
@@ -34,13 +39,11 @@ import type { HttpHeader } from 'fastify/types/utils.js';
import { jsonArrayFrom } from 'kysely/helpers/sqlite';
import {
  compact,
- every,
  find,
  first,
  head,
  isNil,
  isNull,
  isUndefined,
  map,
  omitBy,
  trimStart,
@@ -49,7 +52,6 @@ import {
import type stream from 'node:stream';
import z from 'zod/v4';
import { container } from '../container.ts';
- import { programSourceTypeFromString } from '../db/custom_types/ProgramSourceType.ts';
import {
  AllProgramFields,
  AllProgramGroupingFields,
@@ -80,23 +82,6 @@ const LookupExternalProgrammingSchema = z.object({
    .transform((s) => s.split('|', 3) as [string, string, string]),
});

- const BatchLookupExternalProgrammingSchema = z.object({
-   externalIds: z
-     .array(z.string())
-     .transform(
-       (s) =>
-         new Set(
-           [...s].map((s0) => s0.split('|', 3) as [string, string, string]),
-         ),
-     )
-     .refine((set) => {
-       return every(
-         [...set],
-         (tuple) => !isUndefined(programSourceTypeFromString(tuple[0])),
-       );
-     }),
- });
-
// eslint-disable-next-line @typescript-eslint/require-await
export const programmingApi: RouterPluginAsyncCallback = async (fastify) => {
  const logger = LoggerFactory.child({
@@ -459,7 +444,37 @@ export const programmingApi: RouterPluginAsyncCallback = async (fastify) => {
        }
      }

-     return res.redirect(url.toString());
+     const fullUrl = url.toString();
+
+     if (getBooleanEnvVar(TUNARR_ENV_VARS.PROXY_ARTWORK_ENV_VAR, false)) {
+       try {
+         const proxyRes = await axios.request<stream.Readable>({
+           url: fullUrl,
+           responseType: 'stream',
+         });
+
+         let headers: Partial<Record<HttpHeader, string | string[]>>;
+         if (proxyRes.headers instanceof AxiosHeaders) {
+           headers = {
+             ...proxyRes.headers,
+           };
+         } else {
+           headers = { ...omitBy(proxyRes.headers, isNull) };
+         }
+
+         return res
+           .status(200)
+           .headers(headers)
+           .send(proxyRes.data);
+       } catch (e) {
+         if (isAxiosError(e) && e.response?.status === 404) {
+           return res.status(404).send();
+         }
+         throw e;
+       }
+     }
+
+     return res.redirect(fullUrl);
    } else {
      return res.sendFile(art.sourcePath);
    }
@@ -698,7 +713,7 @@ export const programmingApi: RouterPluginAsyncCallback = async (fastify) => {
      }

      return res
-       .status(proxyRes.status)
+       .status(200)
        .headers(headers)
        .send(proxyRes.data);
    } catch (e) {
@@ -3,9 +3,10 @@ import { inject, injectable } from 'inversify';
import z from 'zod';
import { SmartCollectionsDB } from '../db/SmartCollectionsDB.ts';
import { RouterPluginAsyncCallback } from '../types/serverType.js';
+ import { ApiController } from './ApiController.ts';

@injectable()
- export class SmartCollectionsApiController {
+ export class SmartCollectionsApiController implements ApiController {
  constructor(
    @inject(SmartCollectionsDB) private smartCollectionDB: SmartCollectionsDB,
  ) {}
@@ -77,6 +77,7 @@ export const streamApi: RouterPluginAsyncCallback = async (fastify) => {
      case 'hls':
      case 'hls_slower':
      case 'hls_direct':
+     case 'hls_direct_v2':
        return res.redirect(
          `/stream/channels/${channel.uuid}.m3u8?${params.toString()}`,
        );
@@ -246,19 +247,23 @@
          sessionType: ChannelStreamModeSchema.refine(
            (typ) => typ !== 'mpegts',
          ),
-         id: z.string().uuid(),
+         id: z.uuid(),
          file: z.string(),
        }),
      },
      config: {
-       disableRequestLogging: true,
+       disableRequestLogging: 'only-errors',
      },
    },
    async (req, res) => {
      let session: Maybe<BaseHlsSession>;
      switch (req.params.sessionType) {
        case 'hls':
-         session = req.serverCtx.sessionManager.getHlsSession(req.params.id);
+       case 'hls_direct_v2':
+         session = req.serverCtx.sessionManager.getHlsSession(
+           req.params.id,
+           req.params.sessionType,
+         );
          break;
        case 'hls_slower':
          session = req.serverCtx.sessionManager.getHlsSlowerSession(
@@ -278,6 +283,7 @@ export const streamApi: RouterPluginAsyncCallback = async (fastify) => {
      }

      session.recordHeartbeat(req.ip);
+     session.onSegmentRequested(req.ip, req.params.file);

      return res.sendFile(req.params.file, session.workingDirectory);
    },
@@ -286,6 +292,9 @@ export const streamApi: RouterPluginAsyncCallback = async (fastify) => {
  fastify.route({
    url: '/stream/channels/:id.m3u8',
    method: ['HEAD', 'GET'],
+   config: {
+     disableRequestLogging: 'only-errors',
+   },
    schema: {
      tags: ['Streaming'],
      description:
@@ -327,14 +336,15 @@ export const streamApi: RouterPluginAsyncCallback = async (fastify) => {
      let sessionResult: Result<FastifyReply>;
      switch (mode) {
        case 'hls':
+       case 'hls_direct_v2':
          sessionResult = await req.serverCtx.sessionManager
-           .getOrCreateHlsSession(channelId, req.ip, connectionDetails, {})
+           .getOrCreateHlsSession(channelId, req.ip, connectionDetails, {
+             streamMode: mode,
+           })
            .then((result) =>
              result.mapAsync(async (session) => {
                session.recordHeartbeat(req.ip);
-               const playlistResult = await session.trimPlaylist(
-                 dayjs().subtract(30, 'seconds'),
-               );
+               const playlistResult = await session.trimPlaylist();

                if (playlistResult.isFailure()) {
                  logger.error(playlistResult.error);
@@ -3,7 +3,7 @@ import { serverOptions } from '@/globals.js';
import { scheduleBackupJobs } from '@/services/Scheduler.js';
import type { RouterPluginAsyncCallback } from '@/types/serverType.js';
import { getDefaultLogLevel } from '@/util/defaults.js';
- import { ifDefined } from '@/util/index.js';
+ import { ifDefined, mapToObj } from '@/util/index.js';
import {
  getEnvironmentLogLevel,
  getPrettyStreamOpts,
@@ -177,6 +177,22 @@ export const systemApiRouter: RouterPluginAsyncCallback = async (
          req.body.logging?.logLevel ?? getDefaultLogLevel(false);
      }

+     if (!req.body.logging?.categoryLogLevel?.scheduling) {
+       delete system.logging.categoryLogLevel?.scheduling;
+     } else {
+       system.logging.categoryLogLevel ??= {};
+       system.logging.categoryLogLevel.scheduling =
+         req.body.logging.categoryLogLevel.scheduling;
+     }
+
+     if (!req.body.logging?.categoryLogLevel?.streaming) {
+       delete system.logging.categoryLogLevel?.streaming;
+     } else {
+       system.logging.categoryLogLevel ??= {};
+       system.logging.categoryLogLevel.streaming =
+         req.body.logging.categoryLogLevel.streaming;
+     }
+
      if (!isUndefined(req.body.backup)) {
        system.backup = req.body.backup;
        scheduleBackupJobs(req.body.backup);
@@ -475,6 +491,14 @@ export const systemApiRouter: RouterPluginAsyncCallback = async (
    },
  );

+ fastify.get('/system/debug/loggers', (_, res) => {
+   return res.send(
+     mapToObj([...LoggerFactory.traverseHierarchy()], ([k, v]) => ({
+       [k]: v,
+     })),
+   );
+ });
+
function getSystemSettingsResponse(
  settings: DeepReadonly<SystemSettings>,
): SystemSettingsResponse {
@@ -176,7 +176,10 @@ export const videoApiRouter: RouterPluginAsyncCallback = async (fastify) => {
        req.query.mode,
        settings.hlsDirectOutputFormat,
      ])
-     .with([P.union('hls', 'mpegts'), P._], () => 'video/mp2t')
+     .with(
+       [P.union('hls', 'mpegts', 'hls_direct_v2'), P._],
+       () => 'video/mp2t',
+     )
      .with(['hls_slower', P._], () => 'video/nut')
      .with(['hls_direct', 'mpegts'], () => 'video/mp2t')
      .with(['hls_direct', 'mkv'], () => 'video/matroska')
server/src/commands/Command.ts (new file)
@@ -0,0 +1,3 @@
export interface Command<Request, Result = void> {
  run(request: Request): Promise<Result>;
}
@@ -11,16 +11,17 @@ import {
import { KEYS } from '../types/inject.ts';
import { Result } from '../types/result.ts';
import { Maybe } from '../types/util.ts';
+ import { Command } from './Command.ts';

@injectable()
- export class ForceScanCommand {
+ export class ForceScanCommand implements Command<string, Result<void>> {
  constructor(
    @inject(KEYS.ProgramDB) private programDB: IProgramDB,
    @inject(MediaSourceScanCoordinator)
    private mediaSourceScanCoordinator: MediaSourceScanCoordinator,
  ) {}

- async run(id: string) {
+ async run(id: string): Promise<Result<void>> {
    let program: Maybe<ProgramGroupingOrm | ProgramOrm> =
      await this.programDB.getProgramById(id);
    if (!program) {
server/src/commands/media_source/DeleteMediaSourceCommand.ts (new file)
@@ -0,0 +1,32 @@
import { inject, injectable } from 'inversify';
import { IChannelDB } from '../../db/interfaces/IChannelDB.ts';
import { MediaSourceDB } from '../../db/mediaSourceDB.ts';
import { MediaSourceId } from '../../db/schema/base.ts';
import { MediaSourceWithRelations } from '../../db/schema/derivedTypes.ts';
import { MediaSourceApiFactory } from '../../external/MediaSourceApiFactory.ts';
import { MeilisearchService } from '../../services/MeilisearchService.ts';
import { KEYS } from '../../types/inject.ts';
import { Command } from '../Command.ts';

@injectable()
export class DeleteMediaSourceCommand
  implements Command<MediaSourceId, MediaSourceWithRelations>
{
  constructor(
    @inject(MediaSourceDB) private mediaSourceDB: MediaSourceDB,
    @inject(MeilisearchService)
    private searchService: MeilisearchService,
    @inject(KEYS.MediaSourceApiFactory)
    private mediaSourceApiFactory: () => MediaSourceApiFactory,
    @inject(KEYS.ChannelDB) private channelDB: IChannelDB,
  ) {}

  async run(request: MediaSourceId): Promise<MediaSourceWithRelations> {
    const { programIds, groupingIds, deletedServer } =
      await this.mediaSourceDB.deleteMediaSource(request);
    await this.channelDB.removeProgramsFromAllLineups(programIds);
    await this.searchService.deleteByIds(programIds.concat(groupingIds));
    this.mediaSourceApiFactory().deleteCachedClient(deletedServer);
    return deletedServer;
  }
}
@@ -17,7 +17,7 @@ import { HealthCheckModule } from '@/services/health_checks/HealthCheckModule.js
import { StreamModule } from '@/stream/StreamModule.js';
import { TasksModule } from '@/tasks/TasksModule.js';
import { FixerModule } from '@/tasks/fixers/FixerModule.js';
- import { KEYS } from '@/types/inject.js';
+ import { autoFactoryKey, KEYS } from '@/types/inject.js';
import type { Maybe } from '@/types/util.js';
import type { Logger } from '@/util/logging/LoggerFactory.js';
import { LoggerFactory } from '@/util/logging/LoggerFactory.js';
@@ -45,6 +45,7 @@ import { SystemDevicesService } from './services/SystemDevicesService.ts';
import { TunarrWorkerPool } from './services/TunarrWorkerPool.ts';
import { DynamicChannelsModule } from './services/dynamic_channels/DynamicChannelsModule.ts';
import { TimeSlotSchedulerService } from './services/scheduling/TimeSlotSchedulerService.ts';
+ import { SearchParser } from './services/search/SearchParser.ts';
import { ChannelLineupMigratorStartupTask } from './services/startup/ChannelLineupMigratorStartupTask.ts';
import { ClearM3uCacheStartupTask } from './services/startup/ClearM3uCacheStartupTask.ts';
import { GenerateGuideStartupTask } from './services/startup/GenerateGuideStartupTask.ts';
@@ -59,6 +60,7 @@ import { FixerRunner } from './tasks/fixers/FixerRunner.ts';
import { ChildProcessHelper } from './util/ChildProcessHelper.ts';
import { Timer } from './util/Timer.ts';
import { getBooleanEnvVar, USE_WORKER_POOL_ENV_VAR } from './util/env.ts';
+ import type { LoggingDefinition } from './util/logging/loggingDef.ts';

const container = new Container({ autoBindInjectable: true });

@@ -87,9 +89,13 @@ const RootModule = new ContainerModule((bind) => {
  bind<Logger>(KEYS.Logger).toDynamicValue((ctx) => {
    const impl =
      ctx.currentRequest.parentRequest?.bindings?.[0]?.implementationType;
+   const loggingDef = impl
+     ? (Reflect.get(impl, 'tunarr:log_def') as LoggingDefinition)
+     : null;
    return LoggerFactory.child({
      className: impl ? (Reflect.get(impl, 'name') as string) : 'Unknown',
      worker: isMainThread ? undefined : true,
+     category: loggingDef?.category,
    });
  });

@@ -169,6 +175,7 @@ const RootModule = new ContainerModule((bind) => {
  bind(App).toSelf().inSingletonScope();

  bind(search.SearchParser).to(search.SearchParser);
+ bind(autoFactoryKey(SearchParser)).toAutoFactory(SearchParser);
});

container.load(RootModule);
@@ -1524,18 +1524,28 @@ export class ChannelDB implements IChannelDB {
  async loadAllRawLineups(): Promise<Record<string, ChannelAndRawLineup>> {
    return asyncMapToRecord(
      await this.getAllChannels(),
-     async (channel) => ({
-       channel,
-       lineup: jsonSchema.parse(
-         JSON.parse(
-           (
-             await fs.readFile(
-               this.fileSystemService.getChannelLineupPath(channel.uuid),
-             )
-           ).toString('utf-8'),
-         ),
-       ),
-     }),
+     async (channel) => {
+       if (
+         !(await fileExists(
+           this.fileSystemService.getChannelLineupPath(channel.uuid),
+         ))
+       ) {
+         await this.createLineup(channel.uuid);
+       }
+
+       return {
+         channel,
+         lineup: jsonSchema.parse(
+           JSON.parse(
+             (
+               await fs.readFile(
+                 this.fileSystemService.getChannelLineupPath(channel.uuid),
+               )
+             ).toString('utf-8'),
+           ),
+         ),
+       };
+     },
      ({ channel }) => channel.uuid,
    );
  }
server/src/db/CustomShowDB.test.ts (new file, 765 lines)
@@ -0,0 +1,765 @@
import { faker } from '@faker-js/faker';
import type { ContentProgram } from '@tunarr/types';
import { tag } from '@tunarr/types';
import dayjs from 'dayjs';
import tmp from 'tmp-promise';
import { v4 } from 'uuid';
import { test as baseTest, describe, expect, vi } from 'vitest';
import { bootstrapTunarr } from '../bootstrap.ts';
import { setGlobalOptionsUnchecked } from '../globals.ts';
import { LoggerFactory } from '../util/logging/LoggerFactory.ts';
import { CustomShowDB } from './CustomShowDB.ts';
import { DBAccess } from './DBAccess.ts';
import { ProgramDB } from './ProgramDB.ts';
import type { MediaSourceId, MediaSourceName } from './schema/base.ts';
import { CustomShow } from './schema/CustomShow.ts';
import type { NewCustomShowContent } from './schema/CustomShowContent.ts';
import { CustomShowContent } from './schema/CustomShowContent.ts';
import { DrizzleDBAccess } from './schema/index.ts';
import { MediaSource } from './schema/MediaSource.ts';
import type { NewProgramDao } from './schema/Program.ts';
import { Program } from './schema/Program.ts';

// Suppress background task scheduling that fires in setImmediate after upsertContentPrograms.
// These callbacks try to resolve Inversify bindings that aren't set up in tests.
vi.mock('@/services/Scheduler.js', () => ({
  GlobalScheduler: { scheduleOneOffTask: vi.fn(), removeTask: vi.fn() },
}));
vi.mock('@/tasks/TaskQueue.js', () => ({
  PlexTaskQueue: {
    pause: vi.fn(),
    resume: vi.fn(),
    add: vi.fn().mockResolvedValue(undefined),
  },
  JellyfinTaskQueue: {
    pause: vi.fn(),
    resume: vi.fn(),
    add: vi.fn().mockResolvedValue(undefined),
  },
}));
type Fixture = {
  db: string;
  mediaSourceId: MediaSourceId;
  customShowDb: CustomShowDB;
  drizzle: DrizzleDBAccess;
};

const test = baseTest.extend<Fixture>({
  db: async ({}, use) => {
    const dbResult = await tmp.dir({ unsafeCleanup: true });
    const opts = setGlobalOptionsUnchecked({
      database: dbResult.path,
      log_level: 'debug',
      verbose: 0,
    });
    await bootstrapTunarr(opts, ':memory:');
    await use(dbResult.path);
    // const dbPath = `${dbResult.path}/db.db`;
    // await DBAccess.instance.closeConnection(dbPath);
    await dbResult.cleanup();
  },
  mediaSourceId: async ({ db: _ }, use) => {
    const drizzle = DBAccess.instance.getConnection(':memory:')!.drizzle!;
    const uuid = v4() as MediaSourceId;
    const now = +dayjs();
    await drizzle.insert(MediaSource).values({
      uuid,
      name: tag<MediaSourceName>('Test Plex Server'),
      accessToken: 'test-token',
      index: 0,
      type: 'plex',
      uri: 'http://localhost:32400',
      createdAt: now,
      updatedAt: now,
    });
    await use(uuid);
  },
  customShowDb: async ({ db: _ }, use) => {
    const dbAccess = DBAccess.instance;
    const logger = LoggerFactory.child({ className: 'ProgramDB' });

    const mockTaskFactory = () => ({ enqueue: async () => {} }) as any;

    // Minimal stub of ProgramDaoMinter — only implements what upsertContentPrograms
    // needs when converting non-persisted ContentPrograms to DB rows.
    const mockMinterFactory = () => ({
      contentProgramDtoToDao(
        program: ContentProgram,
      ): NewProgramDao | undefined {
        if (!program.canonicalId) return undefined;
        const now = +dayjs();
        return {
          uuid: v4(),
          sourceType: program.externalSourceType,
          externalSourceId: tag<MediaSourceName>(program.externalSourceName),
          mediaSourceId: tag<MediaSourceId>(program.externalSourceId),
          externalKey: program.externalKey,
          canonicalId: program.canonicalId,
          libraryId: null,
          duration: program.duration,
          title: program.title,
          type: program.subtype,
          state: 'ok',
          createdAt: now,
          updatedAt: now,
          rating: program.rating ?? null,
          summary: program.summary ?? null,
          originalAirDate: program.date ?? null,
          year: program.year ?? null,
          episode: null,
          plexRatingKey: null,
          plexFilePath: null,
          filePath: null,
          parentExternalKey: null,
          grandparentExternalKey: null,
          showTitle: null,
          seasonNumber: null,
        };
      },
      mintExternalIds() {
        return [];
      },
    });

    const programDb = new ProgramDB(
      logger,
      mockTaskFactory,
      mockTaskFactory,
      dbAccess.getKyselyDatabase(':memory:')!,
      mockMinterFactory as any,
      dbAccess.getConnection(':memory:')?.drizzle!,
    );

    const customShowDb = new CustomShowDB(
      programDb,
      dbAccess.getKyselyDatabase(':memory:')!,
      dbAccess.getConnection(':memory:')!.drizzle!,
    );

    await use(customShowDb);
  },
  drizzle: async ({ db: _ }, use) => {
    await use(DBAccess.instance.getConnection(':memory:')!.drizzle!);
  },
});
function createProgram(
  mediaSourceId: MediaSourceId,
  overrides?: Partial<NewProgramDao>,
): NewProgramDao {
  const now = +dayjs();
  return {
    uuid: v4(),
    canonicalId: v4(),
    createdAt: now,
    updatedAt: now,
    albumName: null,
    albumUuid: null,
    artistName: null,
    artistUuid: null,
    duration: faker.number.int({ min: 60000, max: 7200000 }),
    episode: null,
    episodeIcon: null,
    externalKey: faker.string.alphanumeric(10),
    externalSourceId: tag<MediaSourceName>(faker.string.alphanumeric(8)),
    mediaSourceId,
    libraryId: null,
    localMediaFolderId: null,
    localMediaSourcePathId: null,
    filePath: null,
    grandparentExternalKey: null,
    icon: null,
    originalAirDate: null,
    parentExternalKey: null,
    plexFilePath: null,
    plexRatingKey: null,
    rating: null,
    seasonIcon: null,
    seasonNumber: null,
    seasonUuid: null,
    showIcon: null,
    showTitle: null,
    sourceType: 'plex',
    summary: null,
    plot: null,
    tagline: null,
    title: faker.word.words(3),
    tvShowUuid: null,
    type: 'movie',
    year: faker.date.past().getFullYear(),
    state: 'ok',
    ...overrides,
  };
}

async function insertProgram(drizzle: DrizzleDBAccess, program: NewProgramDao) {
  await drizzle.insert(Program).values(program);
  return program;
}

function makePersistedContentProgram(program: NewProgramDao): ContentProgram {
  return {
    type: 'content',
    subtype: 'movie',
    id: program.uuid,
    persisted: true,
    duration: program.duration,
    title: program.title!,
    externalSourceType: program.sourceType as 'plex',
    externalSourceName: String(program.externalSourceId),
    externalSourceId: String(program.mediaSourceId),
    externalKey: program.externalKey,
    uniqueId: program.uuid,
    externalIds: [],
  };
}
function makeNewContentProgram(
  mediaSourceId: MediaSourceId,
  overrides?: Partial<ContentProgram>,
): ContentProgram {
  const externalKey = faker.string.alphanumeric(10);
  return {
    type: 'content',
    subtype: 'movie',
    persisted: false,
    duration: faker.number.int({ min: 60000, max: 7200000 }),
    title: faker.word.words(3),
    externalSourceType: 'plex',
    externalSourceName: 'Test Plex Server',
    externalSourceId: mediaSourceId,
    externalKey,
    uniqueId: `plex|${mediaSourceId}|${externalKey}`,
    externalIds: [],
    canonicalId: v4(),
    ...overrides,
  };
}

async function createCustomShow(
  drizzle: DrizzleDBAccess,
  name?: string,
): Promise<typeof CustomShow.$inferSelect> {
  const now = +dayjs();
  const show = {
    uuid: v4(),
    createdAt: now,
    updatedAt: now,
    name: name ?? faker.word.words(2),
  };
  await drizzle.insert(CustomShow).values(show);
  return show;
}

async function insertCustomShowContent(
  drizzle: DrizzleDBAccess,
  rows: NewCustomShowContent[],
) {
  if (rows.length > 0) {
    await drizzle.insert(CustomShowContent).values(rows);
  }
}
describe('CustomShowDB', () => {
  describe('duplicate programs in custom shows', () => {
    test('should allow the same program to appear multiple times with different indexes', async ({
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show = await createCustomShow(drizzle);

      const rows: NewCustomShowContent[] = [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 1 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 2 },
      ];

      // This should NOT throw — the new composite PK includes index
      await expect(
        insertCustomShowContent(drizzle, rows),
      ).resolves.not.toThrow();

      const content = await drizzle.query.customShowContent.findMany({
        where: (fields, { eq }) => eq(fields.customShowUuid, show.uuid),
        orderBy: (fields, { asc }) => asc(fields.index),
      });

      expect(content).toHaveLength(3);
      expect(content.map((c) => c.index)).toEqual([0, 1, 2]);
      expect(content.every((c) => c.contentUuid === program.uuid)).toBe(true);
    });

    test('should still enforce uniqueness on (contentUuid, customShowUuid, index) triple', async ({
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show = await createCustomShow(drizzle);

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
      ]);

      // Inserting the same (show, program, index) should fail
      await expect(
        insertCustomShowContent(drizzle, [
          { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
        ]),
      ).rejects.toThrow();
    });

    test('should allow different programs at the same index in different custom shows', async ({
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show1 = await createCustomShow(drizzle, 'Show 1');
|
||||
const show2 = await createCustomShow(drizzle, 'Show 2');
|
||||
|
||||
await expect(
|
||||
insertCustomShowContent(drizzle, [
|
||||
{ customShowUuid: show1.uuid, contentUuid: program.uuid, index: 0 },
|
||||
{ customShowUuid: show2.uuid, contentUuid: program.uuid, index: 0 },
|
||||
]),
|
||||
).resolves.not.toThrow();
|
||||
});
|
||||
});
|
||||
|
  describe('getShowProgramsOrm', () => {
    test('should return programs in index order', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const programA = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { title: 'Program A' }),
      );
      const programB = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { title: 'Program B' }),
      );
      const programC = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { title: 'Program C' }),
      );

      const show = await createCustomShow(drizzle);

      // Insert in non-sequential order to verify sorting
      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: programC.uuid, index: 2 },
        { customShowUuid: show.uuid, contentUuid: programA.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: programB.uuid, index: 1 },
      ]);

      const programs = await customShowDb.getShowProgramsOrm(show.uuid);

      expect(programs).toHaveLength(3);
      expect(programs[0]!.title).toBe('Program A');
      expect(programs[1]!.title).toBe('Program B');
      expect(programs[2]!.title).toBe('Program C');
    });

    test('should return duplicate programs preserving order', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const programA = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { title: 'Repeat Me' }),
      );
      const programB = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { title: 'Other' }),
      );

      const show = await createCustomShow(drizzle);

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: programA.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: programB.uuid, index: 1 },
        { customShowUuid: show.uuid, contentUuid: programA.uuid, index: 2 },
      ]);

      const programs = await customShowDb.getShowProgramsOrm(show.uuid);

      expect(programs).toHaveLength(3);
      expect(programs[0]!.title).toBe('Repeat Me');
      expect(programs[1]!.title).toBe('Other');
      expect(programs[2]!.title).toBe('Repeat Me');
    });

    test('should return empty array for nonexistent show', async ({
      customShowDb,
    }) => {
      const programs = await customShowDb.getShowProgramsOrm(v4());
      expect(programs).toHaveLength(0);
    });
  });

  describe('getShow', () => {
    test('should return show with program data', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show = await createCustomShow(drizzle);

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 1 },
      ]);

      const result = await customShowDb.getShow(show.uuid);

      expect(result).toBeDefined();
      expect(result!.uuid).toBe(show.uuid);
      expect(result!.name).toBe(show.name);
    });

    test('should return undefined for nonexistent show', async ({
      customShowDb,
    }) => {
      const result = await customShowDb.getShow(v4());
      expect(result).toBeUndefined();
    });
  });

  describe('deleteShow', () => {
    test('should delete a show and its content', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show = await createCustomShow(drizzle);

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 1 },
      ]);

      const deleted = await customShowDb.deleteShow(show.uuid);
      expect(deleted).toBe(true);

      // Verify show is gone
      const result = await customShowDb.getShow(show.uuid);
      expect(result).toBeUndefined();

      // Verify content is gone
      const content = await drizzle.query.customShowContent.findMany({
        where: (fields, { eq }) => eq(fields.customShowUuid, show.uuid),
      });
      expect(content).toHaveLength(0);
    });

    test('should return false for nonexistent show', async ({
      customShowDb,
    }) => {
      const deleted = await customShowDb.deleteShow(v4());
      expect(deleted).toBe(false);
    });
  });

  describe('getAllShowsInfo', () => {
    test('should return correct content count with duplicate programs', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show = await createCustomShow(drizzle, 'Duplicates Show');

      // Same program at 3 different indexes
      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 1 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 2 },
      ]);

      const shows = await customShowDb.getAllShowsInfo();
      const found = shows.find((s) => s.id === show.uuid);

      expect(found).toBeDefined();
      // With the .distinct() removed, count now correctly reflects total entries
      expect(found!.count).toBe(3);
    });

    test('should return multiple shows with correct counts', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const programA = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const programB = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );

      const show1 = await createCustomShow(drizzle, 'Show One');
      const show2 = await createCustomShow(drizzle, 'Show Two');

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show1.uuid, contentUuid: programA.uuid, index: 0 },
        { customShowUuid: show1.uuid, contentUuid: programB.uuid, index: 1 },
      ]);

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show2.uuid, contentUuid: programA.uuid, index: 0 },
      ]);

      const shows = await customShowDb.getAllShowsInfo();
      const found1 = shows.find((s) => s.id === show1.uuid);
      const found2 = shows.find((s) => s.id === show2.uuid);

      expect(found1).toBeDefined();
      expect(found1!.count).toBe(2);

      expect(found2).toBeDefined();
      expect(found2!.count).toBe(1);
    });

    test('should calculate total duration across duplicate entries', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const duration = 120000; // 2 minutes
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { duration }),
      );
      const show = await createCustomShow(drizzle, 'Duration Test');

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 1 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 2 },
      ]);

      const shows = await customShowDb.getAllShowsInfo();
      const found = shows.find((s) => s.id === show.uuid);

      expect(found).toBeDefined();
      // totalDuration uses a correlated subquery per row, so with 3 content
      // rows pointing to the same program, we get 3 * duration
      expect(found!.totalDuration).toBe(duration * 3);
    });
  });

  describe('getShows', () => {
    test('should return shows with correct content counts via Drizzle path', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show = await createCustomShow(drizzle, 'Drizzle Show');

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 1 },
      ]);

      const shows = await customShowDb.getShows([show.uuid]);

      expect(shows).toHaveLength(1);
      // The Drizzle path uses result.content.length, which correctly counts all entries
      expect(shows[0]!.contentCount).toBe(2);
    });

    test('should return empty array for empty input', async ({
      customShowDb,
    }) => {
      const shows = await customShowDb.getShows([]);
      expect(shows).toHaveLength(0);
    });
  });

  describe('cascade deletes', () => {
    test('should delete custom show content when a program is deleted', async ({
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show = await createCustomShow(drizzle);

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 1 },
      ]);

      // Delete the program — content should cascade
      await drizzle.delete(Program).where(
        // Using drizzle eq
        (await import('drizzle-orm')).eq(Program.uuid, program.uuid),
      );

      const content = await drizzle.query.customShowContent.findMany({
        where: (fields, { eq }) => eq(fields.customShowUuid, show.uuid),
      });

      expect(content).toHaveLength(0);
    });

    test('should delete custom show content when a custom show is deleted', async ({
      drizzle,
      mediaSourceId,
    }) => {
      const program = await insertProgram(
        drizzle,
        createProgram(mediaSourceId),
      );
      const show = await createCustomShow(drizzle);

      await insertCustomShowContent(drizzle, [
        { customShowUuid: show.uuid, contentUuid: program.uuid, index: 0 },
      ]);

      const { eq } = await import('drizzle-orm');
      await drizzle.delete(CustomShow).where(eq(CustomShow.uuid, show.uuid));

      const content = await drizzle.query.customShowContent.findMany({
        where: (fields, ops) => ops.eq(fields.customShowUuid, show.uuid),
      });

      expect(content).toHaveLength(0);
    });
  });

  describe('upsertCustomShowContent (via createShow)', () => {
    test('should save persisted programs to a new custom show', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const program1 = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { title: 'Persisted 1' }),
      );
      const program2 = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { title: 'Persisted 2' }),
      );

      const showId = await customShowDb.createShow({
        name: 'Persisted Only Show',
        programs: [
          makePersistedContentProgram(program1),
          makePersistedContentProgram(program2),
        ],
      });

      const content = await drizzle.query.customShowContent.findMany({
        where: (fields, { eq }) => eq(fields.customShowUuid, showId),
        orderBy: (fields, { asc }) => asc(fields.index),
      });

      expect(content).toHaveLength(2);
      expect(content[0]!.contentUuid).toBe(program1.uuid);
      expect(content[1]!.contentUuid).toBe(program2.uuid);
    });

    test('should upsert and save new (non-persisted) programs to a new custom show', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const programs = [
        makeNewContentProgram(mediaSourceId, { externalKey: 'key-aaa' }),
        makeNewContentProgram(mediaSourceId, { externalKey: 'key-bbb' }),
      ];

      const showId = await customShowDb.createShow({
        name: 'New Programs Show',
        programs,
      });

      const content = await drizzle.query.customShowContent.findMany({
        where: (fields, { eq }) => eq(fields.customShowUuid, showId),
        orderBy: (fields, { asc }) => asc(fields.index),
      });

      expect(content).toHaveLength(2);

      // Both programs should now exist in the program table
      const savedPrograms = await drizzle.query.program.findMany({
        where: (fields, { inArray }) =>
          inArray(
            fields.uuid,
            content.map((c) => c.contentUuid),
          ),
      });
      expect(savedPrograms).toHaveLength(2);
    });

    test('should save both persisted and new programs to a new custom show', async ({
      customShowDb,
      drizzle,
      mediaSourceId,
    }) => {
      const existingProgram = await insertProgram(
        drizzle,
        createProgram(mediaSourceId, { title: 'Already In DB' }),
      );
      const newProgram = makeNewContentProgram(mediaSourceId, {
        title: 'Needs Inserting',
        externalKey: 'brand-new-key',
      });

      const showId = await customShowDb.createShow({
        name: 'Mixed Show',
        programs: [makePersistedContentProgram(existingProgram), newProgram],
      });

      const content = await drizzle.query.customShowContent.findMany({
        where: (fields, { eq }) => eq(fields.customShowUuid, showId),
        orderBy: (fields, { asc }) => asc(fields.index),
      });

      expect(content).toHaveLength(2);

      // First entry is the pre-existing program
      expect(content[0]!.contentUuid).toBe(existingProgram.uuid);

      // Second entry was newly inserted
      const insertedProgram = await drizzle.query.program.findFirst({
        where: (fields, { eq }) => eq(fields.uuid, content[1]!.contentUuid),
      });
      expect(insertedProgram).toBeDefined();
      expect(insertedProgram!.title).toBe('Needs Inserting');
    });
  });
});
@@ -1,5 +1,12 @@
import { KEYS } from '@/types/inject.js';
import { isNonEmptyString, programExternalIdString } from '@/util/index.js';
import { isNonEmptyString } from '@/util/index.js';
import { createExternalId } from '@tunarr/shared';
import {
  ContentProgram,
  isContentProgram,
  isCustomProgram,
  tag,
} from '@tunarr/types';
import {
  CreateCustomShowRequest,
  UpdateCustomShowRequest,
@@ -7,14 +14,14 @@ import {
import dayjs from 'dayjs';
import { inject, injectable } from 'inversify';
import { Kysely } from 'kysely';
import { chunk, filter, isNil, map } from 'lodash-es';
import { chunk, isNil, orderBy, partition } from 'lodash-es';
import { v4 } from 'uuid';
import { ProgramDB } from './ProgramDB.ts';
import { createPendingProgramIndexMap } from './programHelpers.ts';
import { IProgramDB } from './interfaces/IProgramDB.ts';
import {
  AllProgramJoins,
  withCustomShowPrograms,
} from './programQueryHelpers.ts';
import { MediaSourceId, MediaSourceType } from './schema/base.ts';
import type { NewCustomShow } from './schema/CustomShow.ts';
import type { NewCustomShowContent } from './schema/CustomShowContent.ts';
import { DB } from './schema/db.ts';
@@ -27,7 +34,7 @@ import { DrizzleDBAccess } from './schema/index.ts';
@injectable()
export class CustomShowDB {
  constructor(
    @inject(KEYS.ProgramDB) private programDB: ProgramDB,
    @inject(KEYS.ProgramDB) private programDB: IProgramDB,
    @inject(KEYS.Database) private db: Kysely<DB>,
    @inject(KEYS.DrizzleDB) private drizzle: DrizzleDBAccess,
  ) {}
@@ -63,23 +70,6 @@ export class CustomShowDB {
        contentCount: result.content.length,
      }));
    });

    return this.db
      .selectFrom('customShow')
      .where('customShow.uuid', 'in', ids)
      .innerJoin(
        'customShowContent',
        'customShowContent.customShowUuid',
        'customShow.uuid',
      )
      .selectAll('customShow')
      .select((eb) =>
        eb.fn
          .count<number>('customShowContent.contentUuid')
          .distinct()
          .as('contentCount'),
      )
      .execute();
  }

  async getShowPrograms(id: string): Promise<ProgramWithRelations[]> {
@@ -118,62 +108,15 @@ export class CustomShowDB {
      return null;
    }

    if (updateRequest.programs) {
      const programIndexById = createPendingProgramIndexMap(
        updateRequest.programs,
      );

      const persisted = filter(
        updateRequest.programs,
        (p) => p.persisted && isNonEmptyString(p.id),
      );

      const upsertedPrograms = await this.programDB.upsertContentPrograms(
        updateRequest.programs,
      );

      const persistedCustomShowContent = map(
        persisted,
        (p) =>
          ({
            customShowUuid: show.uuid,
            contentUuid: p.id!,
            index: programIndexById[p.id!]!,
          }) satisfies NewCustomShowContent,
      );

      const newCustomShowContent = map(
        upsertedPrograms,
        (p) =>
          ({
            customShowUuid: show.uuid,
            contentUuid: p.uuid,
            index: programIndexById[programExternalIdString(p)]!,
          }) satisfies NewCustomShowContent,
      );

      await this.db.transaction().execute(async (tx) => {
        await tx
          .deleteFrom('customShowContent')
          .where('customShowContent.customShowUuid', '=', show.uuid)
          .execute();
        await Promise.all(
          chunk(
            [...persistedCustomShowContent, ...newCustomShowContent],
            1_000,
          ).map((csc) =>
            tx.insertInto('customShowContent').values(csc).execute(),
          ),
        );
      });
    if (updateRequest.programs && updateRequest.programs.length > 0) {
      await this.upsertCustomShowContent(show.uuid, updateRequest.programs);
    }

    if (updateRequest.name) {
      await this.db
        .updateTable('customShow')
        .where('uuid', '=', show.uuid)
        // TODO: Blocked on https://github.com/oven-sh/bun/issues/16909
        // .limit(1)
        .limit(1)
        .set({ name: updateRequest.name })
        .execute();
    }
@@ -190,45 +133,9 @@ export class CustomShowDB {
      name: createRequest.name,
    } satisfies NewCustomShow;

    const programIndexById = createPendingProgramIndexMap(
      createRequest.programs,
    );

    const persisted = filter(createRequest.programs, (p) => p.persisted);

    const upsertedPrograms = await this.programDB.upsertContentPrograms(
      createRequest.programs,
    );

    await this.db.insertInto('customShow').values(show).execute();

    const persistedCustomShowContent = map(
      persisted,
      (p) =>
        ({
          customShowUuid: show.uuid,
          contentUuid: p.id!,
          index: programIndexById[p.id!]!,
        }) satisfies NewCustomShowContent,
    );
    const newCustomShowContent = map(
      upsertedPrograms,
      (p) =>
        ({
          customShowUuid: show.uuid,
          contentUuid: p.uuid,
          index: programIndexById[programExternalIdString(p)]!,
        }) satisfies NewCustomShowContent,
    );

    await Promise.all(
      chunk(
        [...persistedCustomShowContent, ...newCustomShowContent],
        1_000,
      ).map((csc) =>
        this.db.insertInto('customShowContent').values(csc).execute(),
      ),
    );
    await this.upsertCustomShowContent(show.uuid, createRequest.programs);

    return show.uuid;
  }
@@ -278,10 +185,7 @@ export class CustomShowDB {
      )
      .groupBy('customShow.uuid')
      .select((eb) => [
        eb.fn
          .count<number>('customShowContent.contentUuid')
          .distinct()
          .as('contentCount'),
        eb.fn.count<number>('customShowContent.contentUuid').as('contentCount'),
        eb.fn
          .sum<number>(
            eb
@@ -299,4 +203,104 @@ export class CustomShowDB {
      totalDuration: f.totalDuration,
    }));
  }

  private async upsertCustomShowContent(
    customShowId: string,
    programs: ContentProgram[],
  ) {
    if (programs.length === 0) {
      return;
    }
    const newProgramIndexesById = new Map<string, number[]>();
    for (let i = 0; i < programs.length; i++) {
      const program = programs[i]!;
      if (
        (program.persisted ||
          isCustomProgram(program) ||
          program.externalSourceType === 'local') &&
        isNonEmptyString(program.id)
      ) {
        const existing = newProgramIndexesById.get(program.id) ?? [];
        existing.push(i);
        newProgramIndexesById.set(program.id, existing);
      } else if (
        isContentProgram(program) &&
        program.externalSourceType !== 'local'
      ) {
        const key = createExternalId(
          program.externalSourceType,
          tag(program.externalSourceId),
          program.externalKey,
        );
        const existing = newProgramIndexesById.get(key) ?? [];
        existing.push(i);
        newProgramIndexesById.set(key, existing);
      }
    }

    const [persisted, needsPersist] = partition(
      programs,
      (p) => p.persisted && isNonEmptyString(p.id),
    );
    const upsertedPrograms =
      await this.programDB.upsertContentPrograms(needsPersist);
    const allPrograms: {
      uuid: string;
      sourceType: MediaSourceType;
      mediaSourceId: MediaSourceId;
      externalKey: string;
    }[] = persisted
      .map((p) => ({
        uuid: p.id!,
        sourceType: p.externalSourceType,
        mediaSourceId: tag<MediaSourceId>(p.externalSourceId),
        externalKey: p.externalKey,
      }))
      .concat(upsertedPrograms);

    const allNewCustomContent = orderBy(
      allPrograms.flatMap((program) => {
        let indexes = newProgramIndexesById.get(program.uuid);
        if (!indexes && program.sourceType !== 'local') {
          const externalId = createExternalId(
            program.sourceType,
            program.mediaSourceId,
            program.externalKey,
          );
          indexes = newProgramIndexesById.get(externalId);
        }
        if (!indexes) {
          return [];
        }
        return indexes.map(
          (index) =>
            ({
              customShowUuid: customShowId,
              contentUuid: program.uuid,
              index,
            }) satisfies NewCustomShowContent,
        );
      }),
      (csc) => csc.index,
      'asc',
    ).map((csc, idx) => {
      csc.index = idx;
      return csc;
    });

    await this.db.transaction().execute(async (tx) => {
      if (allNewCustomContent.length > 0) {
        await tx
          .deleteFrom('customShowContent')
          .where('customShowContent.customShowUuid', '=', customShowId)
          .execute();
        for (const contentChunk of chunk(allNewCustomContent, 1_000)) {
          await tx
            .insertInto('customShowContent')
            .values(contentChunk)
            .execute();
        }
      }
    });
  }
}

@@ -8,6 +8,7 @@ import { ContainerModule } from 'inversify';
import type { Kysely } from 'kysely';
import { DBAccess } from './DBAccess.ts';
import { FillerDB } from './FillerListDB.ts';
import { ProgramPlayHistoryDB } from './ProgramPlayHistoryDB.ts';
import { ProgramDaoMinter } from './converters/ProgramMinter.ts';
import type { DB } from './schema/db.ts';
import type { DrizzleDBAccess } from './schema/index.ts';
@@ -29,6 +30,7 @@ const DBModule = new ContainerModule((bind) => {
    KEYS.DrizzleDatabaseFactory,
  ).toAutoFactory(KEYS.DrizzleDB);
  bind(KEYS.FillerListDB).to(FillerDB).inSingletonScope();
  bind(ProgramPlayHistoryDB).toSelf().inSingletonScope();

  bind(ProgramDaoMinter).toSelf();
  bind<interfaces.AutoFactory<ProgramDaoMinter>>(

@@ -1005,7 +1005,7 @@ export class ProgramDB implements IProgramDB {
  async upsertContentPrograms(
    programs: ChannelProgram[],
    programUpsertBatchSize: number = 100,
  ) {
  ): Promise<MarkNonNullable<ProgramDao, 'mediaSourceId'>[]> {
    if (isEmpty(programs)) {
      return [];
    }

71
server/src/db/ProgramGroupingDB.ts
Normal file
@@ -0,0 +1,71 @@
import { MediaSourceId } from '@tunarr/shared';
import { seq } from '@tunarr/shared/util';
import { inject, injectable } from 'inversify';
import { chunk } from 'lodash-es';
import { MarkRequired } from 'ts-essentials';
import { KEYS } from '../types/inject.ts';
import { RemoteSourceType } from './schema/base.ts';
import { ProgramGroupingOrmWithRelations } from './schema/derivedTypes.ts';
import { DrizzleDBAccess } from './schema/index.ts';

@injectable()
export class ProgramGroupingDB {
  constructor(@inject(KEYS.DrizzleDB) private db: DrizzleDBAccess) {}

  async lookupByExternalIds(
    ids:
      | Set<[RemoteSourceType, MediaSourceId, string]>
      | Set<readonly [RemoteSourceType, MediaSourceId, string]>,
    chunkSize: number = 200,
  ) {
    const allIds = [...ids];
    const programs: MarkRequired<
      ProgramGroupingOrmWithRelations,
      'externalIds'
    >[] = [];
    for (const idChunk of chunk(allIds, chunkSize)) {
      const results = await this.db.query.programGroupingExternalId.findMany({
        where: (fields, { or, and, eq }) => {
          const ands = idChunk.map(([ps, es, ek]) =>
            and(
              eq(fields.externalKey, ek),
              eq(fields.sourceType, ps),
              eq(fields.mediaSourceId, es),
            ),
          );
          return or(...ands);
        },
        columns: {},
        with: {
          grouping: {
            with: {
              artist: true,
              show: true,
              externalIds: true,
              tags: {
                with: {
                  tag: true,
                },
              },
              artwork: true,
              credits: true,
              genres: {
                with: {
                  genre: true,
                },
              },
              studios: {
                with: {
                  studio: true,
                },
              },
            },
          },
        },
      });
      programs.push(...seq.collect(results, (r) => r.grouping));
    }

    return programs;
  }
}
192
server/src/db/ProgramPlayHistoryDB.ts
Normal file
@@ -0,0 +1,192 @@
||||
import { KEYS } from '@/types/inject.js';
|
||||
import { isNonEmptyString } from '@tunarr/shared/util';
|
||||
import { and, eq, gt } from 'drizzle-orm';
|
||||
import { inject, injectable } from 'inversify';
|
||||
import { v4 } from 'uuid';
|
||||
import {
|
||||
NewProgramPlayHistoryDrizzle,
|
||||
ProgramPlayHistory,
|
||||
} from './schema/ProgramPlayHistory.ts';
|
||||
import { DrizzleDBAccess } from './schema/index.ts';
|
||||
|
||||
@injectable()
|
||||
export class ProgramPlayHistoryDB {
|
||||
constructor(@inject(KEYS.DrizzleDB) private drizzle: DrizzleDBAccess) {}
|
||||
|
||||
getById(id: string) {
|
||||
return this.drizzle.query.programPlayHistory.findFirst({
|
||||
where: (fields, { eq }) => eq(fields.uuid, id),
|
||||
});
|
||||
}
|
||||
|
||||
getByProgramId(programId: string) {
|
||||
return this.drizzle.query.programPlayHistory.findMany({
|
||||
where: (fields, { eq }) => eq(fields.programUuid, programId),
|
||||
orderBy: (fields, { desc }) => desc(fields.playedAt),
|
||||
});
|
||||
}
|
||||
|
||||
getByChannelId(channelId: string, limit?: number) {
|
||||
return this.drizzle.query.programPlayHistory.findMany({
|
||||
where: (fields, { eq }) => eq(fields.channelUuid, channelId),
|
||||
orderBy: (fields, { desc }) => desc(fields.playedAt),
|
||||
limit,
|
||||
});
|
||||
}
|
||||
|
||||
getByChannelAndProgram(channelId: string, programId: string) {
|
||||
return this.drizzle.query.programPlayHistory.findMany({
|
||||
where: (fields, { eq, and }) =>
|
||||
and(
|
||||
eq(fields.channelUuid, channelId),
|
||||
eq(fields.programUuid, programId),
|
||||
),
|
||||
orderBy: (fields, { desc }) => desc(fields.playedAt),
|
||||
});
|
||||
}
|
||||
|
||||
getAll(limit?: number) {
|
||||
return this.drizzle.query.programPlayHistory.findMany({
|
||||
orderBy: (fields, { desc }) => desc(fields.playedAt),
|
||||
limit,
|
||||
});
|
||||
}
|
||||
|
  create(
    data: Omit<NewProgramPlayHistoryDrizzle, 'uuid' | 'createdAt'> & {
      uuid?: string;
      createdAt?: Date;
    },
  ) {
    const now = new Date();
    const record: NewProgramPlayHistoryDrizzle = {
      uuid: data.uuid ?? v4(),
      programUuid: data.programUuid,
      channelUuid: data.channelUuid,
      playedAt: data.playedAt,
      playedDuration: data.playedDuration,
      createdAt: data.createdAt ?? now,
      fillerListId: data.fillerListId,
    };

    return this.drizzle
      .insert(ProgramPlayHistory)
      .values(record)
      .returning()
      .then((rows) => rows[0]);
  }

  async update(
    id: string,
    data: Partial<
      Pick<
        NewProgramPlayHistoryDrizzle,
        'playedAt' | 'playedDuration' | 'fillerListId'
      >
    >,
  ) {
    const rows = await this.drizzle
      .update(ProgramPlayHistory)
      .set(data)
      .where(eq(ProgramPlayHistory.uuid, id))
      .returning();
    return rows[0];
  }

  async delete(id: string) {
    const rows = await this.drizzle
      .delete(ProgramPlayHistory)
      .where(eq(ProgramPlayHistory.uuid, id))
      .returning();
    return rows.length > 0;
  }

  async deleteByChannelId(channelId: string) {
    const rows = await this.drizzle
      .delete(ProgramPlayHistory)
      .where(eq(ProgramPlayHistory.channelUuid, channelId))
      .returning();
    return rows.length;
  }

  async deleteByProgramId(programId: string) {
    const rows = await this.drizzle
      .delete(ProgramPlayHistory)
      .where(eq(ProgramPlayHistory.programUuid, programId))
      .returning();
    return rows.length;
  }

  async deleteByChannelIdAfter(channelId: string, after: Date) {
    const rows = await this.drizzle
      .delete(ProgramPlayHistory)
      .where(
        and(
          eq(ProgramPlayHistory.channelUuid, channelId),
          gt(ProgramPlayHistory.playedAt, after),
        ),
      )
      .returning();
    return rows.length;
  }

  /**
   * Gets the most recent play history entry for a program on a channel.
   * Returns the entry if found, or undefined if no play history exists.
   */
  async getMostRecentPlay(
    channelId: string,
    programId: string,
    fillerListId?: string,
  ) {
    return await this.drizzle.query.programPlayHistory.findFirst({
      where: (fields, { eq, and }) =>
        and(
          eq(fields.channelUuid, channelId),
          eq(fields.programUuid, programId),
          isNonEmptyString(fillerListId)
            ? eq(fields.fillerListId, fillerListId)
            : undefined,
        ),
      orderBy: (fields, { desc }) => desc(fields.playedAt),
    });
  }

  async getFillerHistory(channelId: string) {
    return await this.drizzle.query.programPlayHistory.findMany({
      where: (fields, { isNotNull, and }) =>
        and(eq(fields.channelUuid, channelId), isNotNull(fields.fillerListId)),
      orderBy: (fields, { desc }) => desc(fields.playedAt),
    });
  }

  async getFillerLastPlay(channelId: string, fillerId: string) {
    return await this.drizzle.query.programPlayHistory.findFirst({
      where: (fields, { eq, and }) =>
        and(
          eq(fields.channelUuid, channelId),
          eq(fields.fillerListId, fillerId),
        ),
      orderBy: (fields, { desc }) => desc(fields.playedAt),
    });
  }

  /**
   * Checks if a program is currently playing on a channel.
   * A program is considered "currently playing" if there's a play history entry
   * where playedAt + playedDuration > timestamp.
   */
  async isProgramCurrentlyPlaying(
    channelId: string,
    programId: string,
    timestamp: number,
  ): Promise<boolean> {
    const mostRecent = await this.getMostRecentPlay(channelId, programId);
    if (!mostRecent || !mostRecent.playedDuration) {
      return false;
    }
    const playEndTime =
      mostRecent.playedAt.getTime() + mostRecent.playedDuration;
    return timestamp < playEndTime;
  }
}
@@ -196,78 +196,6 @@ export class TranscodeConfigDB implements ITranscodeConfigDB {
        .limit(1)
        .execute();
    });
    // const numConfigs = await tx
    // .selectFrom('transcodeConfig')
    // .select((eb) => eb.fn.count<number>('uuid').as('count'))
    // .executeTakeFirst()
    // .then((res) => res?.count ?? 0);

    // // If there are no configs (should be impossible) create a default, assign it to all channels
    // // and move on.
    // if (numConfigs === 0) {
    // const { uuid: newDefaultConfigId } =
    // await this.insertDefaultConfiguration(tx);
    // await tx
    // .updateTable('channel')
    // .set('transcodeConfigId', newDefaultConfigId)
    // .execute();
    // return;
    // }

    // const configToDelete = await tx
    // .selectFrom('transcodeConfig')
    // .where('uuid', '=', id)
    // .selectAll()
    // .limit(1)
    // .executeTakeFirst();

    // if (!configToDelete) {
    // return;
    // }

    // // If this is the last config, we'll need a new one and will have to assign it
    // if (numConfigs === 1) {
    // const { uuid: newDefaultConfigId } =
    // await this.insertDefaultConfiguration(tx);
    // await tx
    // .updateTable('channel')
    // .set('transcodeConfigId', newDefaultConfigId)
    // .execute();
    // await tx
    // .deleteFrom('transcodeConfig')
    // .where('uuid', '=', id)
    // .limit(1)
    // .execute();
    // return;
    // }

    // // We're deleting the default config. Pick a random one to make the new default. Not great!
    // if (configToDelete.isDefault) {
    // const newDefaultConfig = await tx
    // .selectFrom('transcodeConfig')
    // .where('uuid', '!=', id)
    // .where('isDefault', '=', 0)
    // .select('uuid')
    // .limit(1)
    // .executeTakeFirstOrThrow();
    // await tx
    // .updateTable('transcodeConfig')
    // .set('isDefault', 1)
    // .where('uuid', '=', newDefaultConfig.uuid)
    // .limit(1)
    // .execute();
    // await tx
    // .updateTable('channel')
    // .set('transcodeConfigId', newDefaultConfig.uuid)
    // .execute();
    // }

    // await tx
    // .deleteFrom('transcodeConfig')
    // .where('uuid', '=', id)
    // .limit(1)
    // .execute();
    // });
  }

  private async insertDefaultConfiguration(db: DrizzleDBAccess = this.drizzle) {

@@ -257,7 +257,10 @@ export class ProgramDaoMinter {
      uuid: v4(),
      bitsPerSample: stream.bitDepth,
      channels: stream.channels,
      // TODO: color
      colorRange: stream.colorRange ?? null,
      colorSpace: stream.colorSpace ?? null,
      colorTransfer: stream.colorTransfer ?? null,
      colorPrimaries: stream.colorPrimaries ?? null,
      default: booleanToNumber(stream.default ?? false),
      //TODO: forced: stream.forced
      language: stream.languageCodeISO6392,

@@ -1,9 +1,5 @@
import type { TranscodeConfig } from '@tunarr/types';
import { numberToBoolean } from '../../util/sqliteUtil.ts';
import type {
  TranscodeConfig as TranscodeConfigDAO,
  TranscodeConfigOrm,
} from '../schema/TranscodeConfig.ts';
import type { TranscodeConfigOrm } from '../schema/TranscodeConfig.ts';

export function transcodeConfigOrmToDto(
  config: TranscodeConfigOrm,
@@ -11,13 +7,6 @@ export function transcodeConfigOrmToDto(
  return {
    ...config,
    id: config.uuid,
    // disableChannelOverlay: numberToBoolean(config.disableChannelOverlay),
    // normalizeFrameRate: numberToBoolean(config.normalizeFrameRate),
    // deinterlaceVideo: numberToBoolean(config.deinterlaceVideo),
    // isDefault: numberToBoolean(config.isDefault),
    // disableHardwareDecoder: numberToBoolean(config.disableHardwareDecoder),
    // disableHardwareEncoding: numberToBoolean(config.disableHardwareEncoding),
    // disableHardwareFilters: numberToBoolean(config.disableHardwareFilters),
    disableChannelOverlay: config.disableChannelOverlay ?? false,
    normalizeFrameRate: config.normalizeFrameRate ?? false,
    deinterlaceVideo: config.deinterlaceVideo ?? false,
@@ -25,21 +14,6 @@ export function transcodeConfigOrmToDto(
    disableHardwareDecoder: config.disableHardwareDecoder ?? false,
    disableHardwareEncoding: config.disableHardwareEncoding ?? false,
    disableHardwareFilters: config.disableHardwareFilters ?? false,
  } satisfies TranscodeConfig;
}

export function legacyTranscodeConfigToDto(
  config: TranscodeConfigDAO,
): TranscodeConfig {
  return {
    ...config,
    id: config.uuid,
    disableChannelOverlay: numberToBoolean(config.disableChannelOverlay),
    normalizeFrameRate: numberToBoolean(config.normalizeFrameRate),
    deinterlaceVideo: numberToBoolean(config.deinterlaceVideo),
    isDefault: numberToBoolean(config.isDefault),
    disableHardwareDecoder: numberToBoolean(config.disableHardwareDecoder),
    disableHardwareEncoding: numberToBoolean(config.disableHardwareEncoding),
    disableHardwareFilters: numberToBoolean(config.disableHardwareFilters),
    audioLoudnormConfig: config.audioLoudnormConfig ?? undefined,
  } satisfies TranscodeConfig;
}

@@ -45,7 +45,11 @@ export function isProgramLineupItem(
export function isContentBackedLineupItem(
  item: StreamLineupItem,
): item is ContentBackedStreamLineupItem {
  return isCommercialLineupItem(item) || isProgramLineupItem(item);
  return (
    isCommercialLineupItem(item) ||
    isProgramLineupItem(item) ||
    item.type === 'fallback'
  );
}

export function isPlexBackedLineupItem(
@@ -86,7 +90,8 @@ export function isLocalBackedLineupItem(

export type ContentBackedStreamLineupItem =
  | CommercialStreamLineupItem
  | ProgramStreamLineupItem;
  | ProgramStreamLineupItem
  | FallbackStreamLineupItem;

export type MinimalContentStreamLineupItem = {
  programId: string;
@@ -130,7 +135,11 @@ type BaseContentBackedStreamLineupItem = BaseStreamLineupItem & {

export type CommercialStreamLineupItem = BaseContentBackedStreamLineupItem & {
  type: 'commercial';
  fillerId: string;
  fillerListId: string;
};

export type FallbackStreamLineupItem = BaseContentBackedStreamLineupItem & {
  type: 'fallback';
};

export type ProgramStreamLineupItem = BaseContentBackedStreamLineupItem & {
@@ -153,7 +162,8 @@ export type StreamLineupItem =
  | CommercialStreamLineupItem
  | OfflineStreamLineupItem
  | RedirectStreamLineupItem
  | ErrorStreamLineupItem;
  | ErrorStreamLineupItem
  | FallbackStreamLineupItem;

export function createOfflineStreamLineupItem(
  duration: number,

@@ -1,4 +1,3 @@
import { type IChannelDB } from '@/db/interfaces/IChannelDB.js';
import { KEYS } from '@/types/inject.js';
import { Maybe } from '@/types/util.js';
import { isNonEmptyString } from '@/util/index.js';
@@ -25,7 +24,6 @@ import { MarkRequired } from 'ts-essentials';
import { v4 } from 'uuid';
import { MediaSourceApiFactory } from '../external/MediaSourceApiFactory.ts';
import { MediaSourceLibraryRefresher } from '../services/MediaSourceLibraryRefresher.ts';
import { MeilisearchService } from '../services/MeilisearchService.ts';
import {
  withProgramChannels,
  withProgramCustomShows,
@@ -60,7 +58,6 @@ type MediaSourceUserInfo = {
@injectable()
export class MediaSourceDB {
  constructor(
    @inject(KEYS.ChannelDB) private channelDb: IChannelDB,
    @inject(KEYS.MediaSourceApiFactory)
    private mediaSourceApiFactory: () => MediaSourceApiFactory,
    @inject(KEYS.Database) private db: Kysely<DB>,
@@ -68,12 +65,10 @@ export class MediaSourceDB {
    private mediaSourceLibraryRefresher: interfaces.AutoFactory<MediaSourceLibraryRefresher>,
    @inject(KEYS.DrizzleDB)
    private drizzleDB: DrizzleDBAccess,
    @inject(MeilisearchService)
    private searchService: MeilisearchService,
  ) {}

  async getAll(): Promise<MediaSourceWithRelations[]> {
    return this.drizzleDB.query.mediaSource.findMany({
    return await this.drizzleDB.query.mediaSource.findMany({
      with: {
        libraries: true,
        paths: true,
@@ -215,15 +210,10 @@ export class MediaSourceDB {
      .limit(1)
      .execute();

    // This cannot happen in the transaction because the DB would be locked.
    const programIds = allPrograms.map((p) => p.uuid);
    await this.channelDb.removeProgramsFromAllLineups(programIds);
    const groupingIds = allGroupings.map((p) => p.uuid);
    await this.searchService.deleteByIds(programIds.concat(groupingIds));

    this.mediaSourceApiFactory().deleteCachedClient(deletedServer);

    return { deletedServer };
    return { deletedServer, programIds, groupingIds };
  }

  async updateMediaSource(updateReq: UpdateMediaSourceRequest) {

@@ -1,10 +1,12 @@
import type { InferSelectModel } from 'drizzle-orm';
import { relations } from 'drizzle-orm';
import { inArray, relations } from 'drizzle-orm';
import {
  check,
  getTableConfig,
  integer,
  sqliteTable,
  text,
  uniqueIndex,
} from 'drizzle-orm/sqlite-core';
import type { Insertable, Selectable, Updateable } from 'kysely';
import {
@@ -18,30 +20,41 @@ import { ChannelCustomShow } from './ChannelCustomShow.ts';
import { ChannelFillerShow } from './ChannelFillerShow.ts';
import { ChannelPrograms } from './ChannelPrograms.ts';
import type { KyselifyBetter } from './KyselifyBetter.ts';
import { ProgramPlayHistory } from './ProgramPlayHistory.ts';
import { TranscodeConfig } from './TranscodeConfig.ts';

export const Channel = sqliteTable('channel', {
  uuid: text().primaryKey(),
  createdAt: integer(),
  updatedAt: integer(),
  disableFillerOverlay: integer({ mode: 'boolean' }).default(false),
  duration: integer().notNull(),
  fillerRepeatCooldown: integer(),
  groupTitle: text(),
  guideFlexTitle: text(),
  guideMinimumDuration: integer().notNull(),
  icon: text({ mode: 'json' }).$type<ChannelIcon>().notNull(),
  name: text().notNull(),
  number: integer().notNull().unique(),
  offline: text({ mode: 'json' }).$type<ChannelOfflineSettings>().notNull(),
  startTime: integer().notNull(),
  stealth: integer({ mode: 'boolean' }).default(false),
  streamMode: text({ enum: ChannelStreamModes }).default('hls').notNull(),
  transcoding: text({ mode: 'json' }).$type<ChannelTranscodingSettings>(),
  transcodeConfigId: text().notNull(),
  watermark: text({ mode: 'json' }).$type<ChannelWatermark>(),
  subtitlesEnabled: integer({ mode: 'boolean' }).default(false),
});
export const Channel = sqliteTable(
  'channel',
  {
    uuid: text().primaryKey(),
    createdAt: integer(),
    updatedAt: integer(),
    disableFillerOverlay: integer({ mode: 'boolean' }).default(false),
    duration: integer().notNull(),
    fillerRepeatCooldown: integer(),
    groupTitle: text(),
    guideFlexTitle: text(),
    guideMinimumDuration: integer().notNull(),
    icon: text({ mode: 'json' }).$type<ChannelIcon>().notNull(),
    name: text().notNull(),
    number: integer().notNull().unique(),
    offline: text({ mode: 'json' }).$type<ChannelOfflineSettings>().notNull(),
    startTime: integer().notNull(),
    stealth: integer({ mode: 'boolean' }).default(false),
    streamMode: text({ enum: ChannelStreamModes }).default('hls').notNull(),
    transcoding: text({ mode: 'json' }).$type<ChannelTranscodingSettings>(),
    transcodeConfigId: text().notNull(),
    watermark: text({ mode: 'json' }).$type<ChannelWatermark>(),
    subtitlesEnabled: integer({ mode: 'boolean' }).default(false),
  },
  (table) => [
    uniqueIndex('channel_number_unique').on(table.number),
    check(
      'channel_stream_mode_check',
      inArray(table.streamMode, table.streamMode.enumValues).inlineParams(),
    ),
  ],
);

export type ChannelTable = KyselifyBetter<typeof Channel>;

@@ -65,6 +78,7 @@ export const ChannelRelations = relations(Channel, ({ many, one }) => ({
  channelPrograms: many(ChannelPrograms),
  channelCustomShows: many(ChannelCustomShow),
  channelFillerShow: many(ChannelFillerShow),
  playHistory: many(ProgramPlayHistory),
  transcodeConfig: one(TranscodeConfig, {
    fields: [Channel.transcodeConfigId],
    references: [TranscodeConfig.uuid],

@@ -1,5 +1,6 @@
import { relations } from 'drizzle-orm';
import {
  foreignKey,
  integer,
  primaryKey,
  sqliteTable,
@@ -22,7 +23,19 @@ export const CustomShowContent = sqliteTable(
    index: integer().notNull(),
  },
  (table) => [
    primaryKey({ columns: [table.contentUuid, table.customShowUuid] }),
    primaryKey({
      columns: [table.contentUuid, table.customShowUuid, table.index],
    }),
    foreignKey({
      name: 'custom_show_content_content_uuid_foreign',
      columns: [table.contentUuid],
      foreignColumns: [Program.uuid],
    }).onDelete('cascade'),
    foreignKey({
      name: 'custom_show_content_custom_show_uuid_foreign',
      columns: [table.customShowUuid],
      foreignColumns: [CustomShow.uuid],
    }).onDelete('cascade'),
  ],
);


@@ -23,6 +23,7 @@ import { MediaSource } from './MediaSource.ts';
import { MediaSourceLibrary } from './MediaSourceLibrary.ts';
import { ProgramExternalId } from './ProgramExternalId.ts';
import { ProgramGrouping } from './ProgramGrouping.ts';
import { ProgramPlayHistory } from './ProgramPlayHistory.ts';
import { ProgramSubtitles } from './ProgramSubtitles.ts';
import { ProgramVersion } from './ProgramVersion.ts';
import { StudioEntity } from './Studio.ts';
@@ -166,6 +167,7 @@ export const ProgramRelations = relations(Program, ({ many, one }) => ({
  genres: many(EntityGenre),
  studios: many(StudioEntity),
  tags: many(TagRelations),
  playHistory: many(ProgramPlayHistory),
}));

export type ProgramTable = KyselifyBetter<typeof Program>;

63
server/src/db/schema/ProgramPlayHistory.ts
Normal file
@@ -0,0 +1,63 @@
import type { InferInsertModel, InferSelectModel } from 'drizzle-orm';
import { relations } from 'drizzle-orm';
import { index, integer, sqliteTable, text } from 'drizzle-orm/sqlite-core';
import type { Insertable, Selectable } from 'kysely';
import { Channel } from './Channel.ts';
import { FillerShow } from './FillerShow.ts';
import type { KyselifyBetter } from './KyselifyBetter.ts';
import { Program } from './Program.ts';

export const ProgramPlayHistory = sqliteTable(
  'program_play_history',
  {
    uuid: text().primaryKey(),
    programUuid: text()
      .notNull()
      .references(() => Program.uuid, { onDelete: 'cascade' }),
    channelUuid: text()
      .notNull()
      .references(() => Channel.uuid, { onDelete: 'cascade' }),
    // Timestamp when this program started playing (milliseconds since epoch)
    playedAt: integer({ mode: 'timestamp_ms' }).notNull(),
    // How long the program was played in milliseconds (useful for tracking partial plays)
    playedDuration: integer(),
    createdAt: integer({ mode: 'timestamp_ms' }).notNull(),
    fillerListId: text().references(() => FillerShow.uuid),
  },
  (table) => [
    index('program_play_history_program_uuid_index').on(table.programUuid),
    index('program_play_history_channel_uuid_index').on(
      table.channelUuid,
      table.programUuid,
      table.fillerListId,
    ),
    index('program_play_history_played_at_index').on(table.playedAt),
    // Composite index for querying play history by channel ordered by time
    index('program_play_history_channel_played_at_index').on(
      table.channelUuid,
      table.playedAt,
    ),
  ],
);

export type ProgramPlayHistoryTable = KyselifyBetter<typeof ProgramPlayHistory>;
export type ProgramPlayHistoryDao = Selectable<ProgramPlayHistoryTable>;
export type ProgramPlayHistoryOrm = InferSelectModel<typeof ProgramPlayHistory>;
export type NewProgramPlayHistoryDao = Insertable<ProgramPlayHistoryTable>;
export type NewProgramPlayHistoryDrizzle = InferInsertModel<
  typeof ProgramPlayHistory
>;

export const ProgramPlayHistoryRelations = relations(
  ProgramPlayHistory,
  ({ one }) => ({
    channel: one(Channel, {
      fields: [ProgramPlayHistory.channelUuid],
      references: [Channel.uuid],
    }),
    program: one(Program, {
      fields: [ProgramPlayHistory.programUuid],
      references: [Program.uuid],
    }),
  }),
);
@@ -1,4 +1,5 @@
import type { Resolution, TupleToUnion } from '@tunarr/types';
import type { LoudnormConfig } from '@tunarr/types';
import { type Resolution, type TupleToUnion } from '@tunarr/types';
import type { InferInsertModel, InferSelectModel } from 'drizzle-orm';
import { inArray } from 'drizzle-orm';
import { check, integer, sqliteTable, text } from 'drizzle-orm/sqlite-core';
@@ -8,6 +9,15 @@ import { VideoFormats } from '../../ffmpeg/builder/constants.ts';
import { booleanToNumber } from '../../util/sqliteUtil.ts';
import { type KyselifyBetter } from './KyselifyBetter.ts';

export const AllKnownHardwareAccelerationModes = [
  'none',
  'cuda',
  'vaapi',
  'qsv',
  'videotoolbox',
  'vulkan',
] as const;

export const HardwareAccelerationModes = [
  'none',
  'cuda',
@@ -17,7 +27,7 @@ export const HardwareAccelerationModes = [
] as const;

export type HardwareAccelerationMode = TupleToUnion<
  typeof HardwareAccelerationModes
  typeof AllKnownHardwareAccelerationModes
>;

export const HardwareAccelerationMode: Record<
@@ -29,6 +39,7 @@ export const HardwareAccelerationMode: Record<
  Qsv: 'qsv' as const,
  Videotoolbox: 'videotoolbox' as const,
  Vaapi: 'vaapi' as const,
  Vulkan: 'vulkan' as const,
} as const;

export const VaapiDrivers = [
@@ -146,6 +157,7 @@ export const TranscodeConfig = sqliteTable(
    audioBufferSize: integer().notNull(),
    audioSampleRate: integer().notNull(),
    audioVolumePercent: integer().notNull().default(100), // Default 100
    audioLoudnormConfig: text({ mode: 'json' }).$type<LoudnormConfig>(),

    normalizeFrameRate: integer({ mode: 'boolean' }).default(false),
    deinterlaceVideo: integer({ mode: 'boolean' }).default(true),

@@ -41,6 +41,7 @@ export const ChannelStreamModes = [
  'hls_slower',
  'mpegts',
  'hls_direct',
  'hls_direct_v2',
] as const;
export type ChannelStreamMode = TupleToUnion<typeof ChannelStreamModes>;


@@ -70,6 +70,10 @@ import {
  ProgramMediaStream,
  ProgramMediaStreamRelations,
} from './ProgramMediaStream.ts';
import {
  ProgramPlayHistory,
  ProgramPlayHistoryRelations,
} from './ProgramPlayHistory.ts';
import {
  ProgramSubtitles,
  ProgramSubtitlesRelations,
@@ -112,6 +116,8 @@ export const schema = {
  programGroupingRelations: ProgramGroupingRelations,
  programExternalId: ProgramExternalId,
  programExternalIdRelations: ProgramExternalIdRelations,
  programPlayHistory: ProgramPlayHistory,
  programPlayHistoryRelations: ProgramPlayHistoryRelations,
  programGroupingExternalId: ProgramGroupingExternalId,
  programGroupingExternalIdRelations: ProgramGroupingExternalIdRelations,
  programMediaStream: ProgramMediaStream,

@@ -20,7 +20,10 @@ export class FfmpegPlaybackParamsCalculator {
  ) {}

  calculateForStream(streamDetails: StreamDetails): FfmpegPlaybackParams {
    if (this.streamMode === ChannelStreamModes.HlsDirect) {
    if (
      this.streamMode === ChannelStreamModes.HlsDirect ||
      this.streamMode === ChannelStreamModes.HlsDirectV2
    ) {
      return {
        hwAccel: HardwareAccelerationMode.None,
        audioFormat: TranscodeAudioOutputFormat.Copy,

@@ -32,6 +32,7 @@ import {
} from './builder/MediaStream.ts';
import type { OutputFormat } from './builder/constants.ts';
import { MpegTsOutputFormat, VideoFormats } from './builder/constants.ts';
import { ColorFormat } from './builder/format/ColorFormat.ts';
import type { PixelFormat } from './builder/format/PixelFormat.ts';
import {
  KnownPixelFormats,
@@ -192,6 +193,7 @@ export class FfmpegStreamFactory extends IFFMPEG {
      providedSampleAspectRatio: '1:1',
      displayAspectRatio: '1:1',
      inputKind: 'video',
      colorFormat: ColorFormat.bt709,
    });

    const videoInputSource = VideoInputSource.withStream(
@@ -350,6 +352,12 @@ export class FfmpegStreamFactory extends IFFMPEG {
        width: videoStreamDetails.width,
      }),
      frameRate: videoStreamDetails.framerate?.toString(),
      colorFormat: new ColorFormat({
        colorRange: videoStreamDetails.colorRange ?? null,
        colorSpace: videoStreamDetails.colorSpace ?? null,
        colorTransfer: videoStreamDetails.colorTransfer ?? null,
        colorPrimaries: videoStreamDetails.colorPrimaries ?? null,
      }),
    });

    videoInputSource = new VideoInputSource(streamSource, [videoStream]);
@@ -383,6 +391,8 @@ export class FfmpegStreamFactory extends IFFMPEG {
      // Check if audio and video are coming from same location
      audioDuration:
        streamMode === 'hls_direct' ? null : duration.asMilliseconds(),
      normalizeLoudness: false, // !!this.transcodeConfig.audioLoudnormConfig,
      // loudnormConfig: this.transcodeConfig.audioLoudnormConfig,
    });

    let audioInput: AudioInputSource;
@@ -409,7 +419,11 @@ export class FfmpegStreamFactory extends IFFMPEG {
    }

    let watermarkSource: Nullable<WatermarkInputSource> = null;
    if (streamMode !== ChannelStreamModes.HlsDirect && watermark?.enabled) {
    if (
      streamMode !== ChannelStreamModes.HlsDirect &&
      streamMode !== ChannelStreamModes.HlsDirectV2 &&
      watermark?.enabled
    ) {
      const watermarkUrl = watermark.url ?? makeLocalUrl('/images/tunarr.png');
      watermarkSource = new WatermarkInputSource(
        new HttpStreamSource(watermarkUrl),
@@ -709,6 +723,7 @@ export class FfmpegStreamFactory extends IFFMPEG {
        providedSampleAspectRatio: '1:1',
        displayAspectRatio: '1:1',
        pixelFormat: PixelFormatUnknown(),
        colorFormat: ColorFormat.unknown,
      }),
    );


@@ -1,5 +1,8 @@
import { VideoStream } from '@/ffmpeg/builder/MediaStream.js';
import { PixelFormatYuv420P } from '@/ffmpeg/builder/format/PixelFormat.js';
import {
  PixelFormatYuv420P,
  PixelFormatYuv420P10Le,
} from '@/ffmpeg/builder/format/PixelFormat.js';
import { FrameSize } from '@/ffmpeg/builder/types.js';

describe('MediaStream', () => {
@@ -48,3 +51,49 @@ describe('MediaStream', () => {
    expect(stream.sampleAspectRatio).toEqual(`16:${(1.5).toFixed(12)}`);
  });
});

describe('VideoStream.isDolbyVision', () => {
  function createStream(codec: string, profile?: string) {
    return VideoStream.create({
      codec,
      profile,
      frameSize: FrameSize.FHD,
      index: 0,
      providedSampleAspectRatio: null,
      displayAspectRatio: '16:9',
      pixelFormat: new PixelFormatYuv420P10Le(),
    });
  }

  test('returns true for dvhe codec', () => {
    expect(createStream('dvhe').isDolbyVision()).toBe(true);
  });

  test('returns true for dvh1 codec', () => {
    expect(createStream('dvh1').isDolbyVision()).toBe(true);
  });

  test('returns true for hevc codec with dolby vision profile string', () => {
    expect(
      createStream('hevc', 'dolby vision / hevc main 10').isDolbyVision(),
    ).toBe(true);
  });

  test('returns true for hevc codec with mixed-case dolby vision profile', () => {
    expect(createStream('hevc', 'Dolby Vision Profile 5').isDolbyVision()).toBe(
      true,
    );
  });

  test('returns false for hevc codec with non-DV profile', () => {
    expect(createStream('hevc', 'main 10').isDolbyVision()).toBe(false);
  });

  test('returns false for hevc codec with no profile', () => {
    expect(createStream('hevc').isDolbyVision()).toBe(false);
  });

  test('returns false for h264 codec', () => {
    expect(createStream('h264').isDolbyVision()).toBe(false);
  });
});

@@ -1,10 +1,11 @@
import type { ExcludeByValueType, Nullable } from '@/types/util.js';
import type { DataProps, ExcludeByValueType, Nullable } from '@/types/util.js';
import { isNonEmptyString } from '@tunarr/shared/util';
import { isEmpty, isNull, merge, nth } from 'lodash-es';
import type { AnyFunction, MarkOptional, StrictOmit } from 'ts-essentials';
import { VideoFormats } from './constants.ts';
import { ColorFormat } from './format/ColorFormat.js';
import { PixelFormatUnknown, type PixelFormat } from './format/PixelFormat.ts';
import type { DataProps, StreamKind } from './types.ts';
import type { StreamKind } from './types.ts';
import { FrameSize } from './types.ts';

export type MediaStream = {
@@ -58,6 +59,7 @@ export class VideoStream implements MediaStream {
  frameSize: FrameSize;
  frameRate?: string;
  inputKind: VideoInputKind = 'video' as const;
  colorFormat: Nullable<ColorFormat>;
  providedSampleAspectRatio: Nullable<string>;
  displayAspectRatio: string;

@@ -77,6 +79,18 @@ export class VideoStream implements MediaStream {
    return this.pixelFormat?.bitDepth ?? 8;
  }

  isHdr() {
    return this.colorFormat?.isHdr ?? false;
  }

  isDolbyVision(): boolean {
    return (
      this.codec === VideoFormats.Dvhe ||
      this.codec === VideoFormats.Dvh1 ||
      (this.profile?.toLowerCase().includes('dolby vision') ?? false)
    );
  }

  get sampleAspectRatio(): string {
    const inputSar = this.providedSampleAspectRatio;
    if (isNull(inputSar) || isEmpty(inputSar) || inputSar === '0:0') {
@@ -195,6 +209,7 @@ type StillImageStreamFields = MarkOptional<
    | 'sampleAspectRatio'
    | 'displayAspectRatio'
    | 'inputKind'
    | 'colorFormat'
  >,
  'pixelFormat'
>;
@@ -209,6 +224,7 @@ export class StillImageStream extends VideoStream {
      displayAspectRatio: '1:1',
      ...fields,
      pixelFormat: fields.pixelFormat ?? PixelFormatUnknown(),
      colorFormat: ColorFormat.unknown, // TODO: Is this necessary
    });
  }


||||
@@ -6,6 +6,7 @@ export class FfmpegCapabilities {
    // FFmpeg name to encoder details
    private videoEncoders: ReadonlyMap<string, FfmpegEncoder>,
    private filters: ReadonlySet<string>,
    private hwAccels: ReadonlySet<string>,
  ) {}

  allOptions(): Set<string> {
@@ -31,6 +32,10 @@ export class FfmpegCapabilities {
  hasFilter(filter: string) {
    return this.filters.has(filter);
  }

  hasHardwareAccel(hwAccel: string) {
    return this.hwAccels.has(hwAccel);
  }
}

// Used for testing
@@ -38,4 +43,5 @@ export const EmptyFfmpegCapabilities = new FfmpegCapabilities(
  new Set(),
  new Map(),
  new Set(),
  new Set(),
);

@@ -20,7 +20,7 @@ export const defaultHlsOptions: DeepRequired<HlsOptions> = {
  hlsDeleteThreshold: 3,
  segmentBaseDirectory: 'streams', // Relative to cwd
  streamBasePath: 'stream_%v',
  segmentNameFormat: 'data%05d.ts',
  segmentNameFormat: 'data%06d.ts',
  streamNameFormat: 'stream.m3u8',
  streamBaseUrl: 'hls/',
  deleteThreshold: 3,
@@ -44,6 +44,8 @@ export const defaultMpegDashOptions: DeepRequired<MpegDashOptions> = {
export const VideoFormats = {
  Hevc: 'hevc',
  H264: 'h264',
  Dvhe: 'dvhe',
  Dvh1: 'dvh1',
  Mpeg1Video: 'mpeg1video',
  Mpeg2Video: 'mpeg2video',
  MsMpeg4V2: 'msmpeg4v2',
@@ -78,10 +80,51 @@ export const OutputFormatTypes = {
  MpegTs: 'mpegts',
  Mp4: 'mp4',
  Hls: 'hls',
  HlsDirect: 'hls_direct',
  HlsDirectV2: 'hls_direct_v2',
  Nut: 'nut',
  Dash: 'dash',
} as const;

export const ColorRanges = {
  Tv: 'tv',
};

// https://trac.ffmpeg.org/wiki/colorspace#color_primaries
export const ColorSpaces = {
  Rgb: 'rgb',
  Bt709: 'bt709',
  Bt709bg: 'bt709bg',
  Smpte170m: 'smpte170m',
  Smpte240m: 'smpte240m',
  Bt2020nc: 'bt2020nc',
  Bt2020c: 'bt2020c',
  Smpte2085: 'smpte2085',
};

export const ColorPrimaries = {
  Bt709: 'bt709',
  Bt709m: 'bt709m',
  Bt709bg: 'bt709bg',
  Smpte170m: 'smpte170m',
  Smpte240m: 'smpte240m',
  Bt2020: 'bt2020',
};

export const ColorTransferFormats = {
  Bt709: 'bt709',
  Unknown: 'unknown',
  Gamma22: 'gamma22',
  Gamma28: 'gamma28',
  Smpte170m: 'smpte170m',
  Smpte240m: 'smpte240m',
  Smpte2084: 'smpte2084',
  AribStdB67: 'arib-std-b67',
} as const;

export type ColorTransferFormat =
  (typeof ColorTransferFormats)[keyof typeof ColorTransferFormats];

export type OutputLocation = Lowercase<keyof typeof OutputLocation>;

export type HlsOutputFormat = {
@@ -89,6 +132,11 @@ export type HlsOutputFormat = {
  hlsOptions: HlsOptions;
};

export type HlsDirectOutputFormat = {
  type: typeof OutputFormatTypes.HlsDirectV2;
  hlsOptions: HlsOptions;
};

export type MpegDashOutputFormat = {
  type: 'dash';
  options?: Partial<MpegDashOptions>;
@@ -133,6 +181,13 @@ export function HlsOutputFormat(opts: HlsOptions): HlsOutputFormat {
  };
}

export function HlsDirectOutputFormat(opts: HlsOptions): HlsDirectOutputFormat {
  return {
    type: OutputFormatTypes.HlsDirectV2,
    hlsOptions: opts,
  };
}

export function MpegDashOutputFormat(
  opts?: Partial<MpegDashOptions>,
): MpegDashOutputFormat {
@@ -144,8 +199,12 @@ export function MpegDashOutputFormat(

export type OutputFormat =
  | HlsOutputFormat
  | HlsDirectOutputFormat
  | NutOutputFormat
  | MkvOutputFormat
  | MpegDashOutputFormat
  | Mp4OutputFormat
  | MpegTsOutputFormat;

export const OneDayMillis = 7 * 24 * 60 * 60 * 1000;
export const FiveMinutesMillis = 5 * 60 * 60 * 1000;

11
server/src/ffmpeg/builder/decoder/VulkanDecoder.ts
Normal file
@@ -0,0 +1,11 @@
import { FrameDataLocation } from '../types.ts';
import { BaseDecoder } from './BaseDecoder.ts';

export class VulkanDecoder extends BaseDecoder {
  readonly name = 'implicit_vulkan';
  protected _outputFrameDataLocation = FrameDataLocation.Hardware;

  options(): string[] {
    return ['-hwaccel_output_format', 'vulkan'];
  }
}
17
server/src/ffmpeg/builder/filter/HardwareFilterOption.ts
Normal file
@@ -0,0 +1,17 @@
import type { FrameState } from '../state/FrameState.ts';
import { FilterOption } from './FilterOption.ts';

export abstract class HardwareFilterOption extends FilterOption {
  protected preprocessFilters: FilterOption[] = [];
  protected postProcessFilters: FilterOption[] = [];

  constructor(protected currentState: FrameState) {
    super();
  }

  public get filter(): string {
    return '';
  }

  protected abstract filterInternal(): string;
}
5
server/src/ffmpeg/builder/filter/HdrDetection.ts
Normal file
@@ -0,0 +1,5 @@
import type { VideoStream } from '@/ffmpeg/builder/MediaStream.js';

export function isHdrContent(videoStream: VideoStream): boolean {
  return videoStream.isHdr() || videoStream.isDolbyVision();
}
29
server/src/ffmpeg/builder/filter/LibplaceboTonemapFilter.ts
Normal file
@@ -0,0 +1,29 @@
import { ColorFormat } from '../format/ColorFormat.ts';
import type { PixelFormat } from '../format/PixelFormat.ts';
import type { FrameState } from '../state/FrameState.ts';
import { FrameDataLocation } from '../types.ts';
import { FilterOption } from './FilterOption.ts';

export class LibplaceboTonemapFilter extends FilterOption {
  public readonly affectsFrameState: boolean = true;
  private targetPixelFormat: PixelFormat;

  constructor(targetPixelFormat: PixelFormat) {
    super();
    this.targetPixelFormat =
      targetPixelFormat.toHardwareFormat() ?? targetPixelFormat;
  }

  public get filter(): string {
    // TODO: Allow setting tonemapping algo
    return `libplacebo=tonemapping=auto:colorspace=bt709:color_primaries=bt709:color_trc=bt709:format=${this.targetPixelFormat.name},hwupload_cuda`;
  }

  nextState(currentState: FrameState): FrameState {
    return currentState.update({
      pixelFormat: this.targetPixelFormat,
      frameDataLocation: FrameDataLocation.Hardware,
      colorFormat: ColorFormat.bt709,
    });
  }
}
19
server/src/ffmpeg/builder/filter/LoudnormFilter.ts
Normal file
@@ -0,0 +1,19 @@
import type { LoudnormConfig } from '@tunarr/types';
import { isDefined } from '../../../util/index.ts';
import { FilterOption } from './FilterOption.ts';

export class LoudnormFilter extends FilterOption {
  constructor(
    private loudnormConfig: LoudnormConfig,
    private sampleRate: number,
  ) {
    super();
  }

  public get filter(): string {
    const gain = isDefined(this.loudnormConfig.offsetGain)
      ? `:offset=${this.loudnormConfig.offsetGain}`
      : '';
    return `loudnorm=I=${this.loudnormConfig.i}:LRA=${this.loudnormConfig.lra}:TP=${this.loudnormConfig.tp}${gain},aresample=${this.sampleRate}`;
  }
}
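The string assembly in `LoudnormFilter.filter` above can be sketched as a standalone function. This is a hypothetical helper for illustration only, not part of the Tunarr codebase; the `LoudnormConfig` field names (`i`, `lra`, `tp`, `offsetGain`) are assumed from the usage in the class.

```typescript
// Hypothetical standalone sketch of the loudnorm filter-string assembly above.
// Field names assumed from LoudnormFilter's usage of LoudnormConfig.
type LoudnormConfig = { i: number; lra: number; tp: number; offsetGain?: number };

function loudnormFilter(cfg: LoudnormConfig, sampleRate: number): string {
  // `offset` is only appended when an offset gain is configured.
  const gain = cfg.offsetGain !== undefined ? `:offset=${cfg.offsetGain}` : '';
  // loudnorm resamples internally, so an explicit aresample restores the target rate.
  return `loudnorm=I=${cfg.i}:LRA=${cfg.lra}:TP=${cfg.tp}${gain},aresample=${sampleRate}`;
}
```

For example, `loudnormFilter({ i: -24, lra: 7, tp: -2 }, 48000)` yields `loudnorm=I=-24:LRA=7:TP=-2,aresample=48000`.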
@@ -1,6 +1,6 @@
import type { FfmpegState } from '@/ffmpeg/builder/state/FfmpegState.js';
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import type { FrameSize } from '@/ffmpeg/builder/types.js';
import { FrameDataLocation, type FrameSize } from '@/ffmpeg/builder/types.js';
import { FilterOption } from './FilterOption.ts';

export class ScaleFilter extends FilterOption {
@@ -48,7 +48,9 @@ export class ScaleFilter extends FilterOption {
      scaleFilter = `scale=${this.desiredPaddedSize.width}:${this.desiredPaddedSize.height}:flags=${this.ffmpegState.softwareScalingAlgorithm}${aspectRatio},setsar=1`;
    }

    // TODO: hwdownload if needed
    if (this.currentState.frameDataLocation === FrameDataLocation.Hardware) {
      scaleFilter = `hwdownload,${scaleFilter}`;
    }

    return scaleFilter;
  }
@@ -58,6 +60,7 @@ export class ScaleFilter extends FilterOption {
      scaledSize: this.desiredScaledSize,
      paddedSize: this.desiredScaledSize,
      isAnamorphic: false,
      frameDataLocation: FrameDataLocation.Software,
    });
  }
}

192
server/src/ffmpeg/builder/filter/TonemapFilter.test.ts
Normal file
@@ -0,0 +1,192 @@
import {
  ColorPrimaries,
  ColorRanges,
  ColorSpaces,
  ColorTransferFormats,
} from '@/ffmpeg/builder/constants.js';
import { TonemapFilter } from '@/ffmpeg/builder/filter/TonemapFilter.js';
import { ColorFormat } from '@/ffmpeg/builder/format/ColorFormat.js';
import {
  PixelFormatYuv420P,
  PixelFormatYuv420P10Le,
} from '@/ffmpeg/builder/format/PixelFormat.js';
import { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import { FrameDataLocation, FrameSize } from '@/ffmpeg/builder/types.js';

describe('TonemapFilter', () => {
  test('filter string when frame data is in software (no hwdownload prefix)', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Software,
    });

    const filter = new TonemapFilter(currentState);

    expect(filter.filter).to.eq(
      'zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,' +
        'tonemap=tonemap=hable:desat=0,' +
        'zscale=s=bt709:t=bt709:m=bt709:r=tv,format=yuv420p',
    );
  });

  test('filter string when frame data is on hardware (hwdownload prefix present)', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapFilter(currentState);

    expect(filter.filter).to.eq(
      'hwdownload,format=p010le|nv12,' +
        'zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,' +
        'tonemap=tonemap=hable:desat=0,' +
        'zscale=s=bt709:t=bt709:m=bt709:r=tv,format=yuv420p',
    );
  });

  test('affectsFrameState is true', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      frameDataLocation: FrameDataLocation.Software,
    });

    const filter = new TonemapFilter(currentState);

    expect(filter.affectsFrameState).toBe(true);
  });

  test('nextState sets colorFormat to bt709', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Software,
    });

    const filter = new TonemapFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.colorFormat?.colorSpace).to.eq(ColorSpaces.Bt709);
    expect(nextState.colorFormat?.colorTransfer).to.eq(
      ColorTransferFormats.Bt709,
    );
    expect(nextState.colorFormat?.colorPrimaries).to.eq(ColorPrimaries.Bt709);
    expect(nextState.colorFormat?.colorRange).to.eq(ColorRanges.Tv);
  });

  test('nextState sets frameDataLocation to Software', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.frameDataLocation).to.eq(FrameDataLocation.Software);
  });

  test('nextState sets pixelFormat to PixelFormatYuv420P', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.pixelFormat).toMatchPixelFormat(new PixelFormatYuv420P());
  });

  test('filter includes tin=smpte2084 when colorTransfer is smpte2084 (HDR10)', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Software,
      colorFormat: new ColorFormat({
        colorTransfer: ColorTransferFormats.Smpte2084,
        colorSpace: ColorSpaces.Bt2020nc,
        colorPrimaries: ColorPrimaries.Bt2020,
        colorRange: ColorRanges.Tv,
      }),
    });

    const filter = new TonemapFilter(currentState);

    expect(filter.filter).toContain('zscale=t=linear:tin=smpte2084:npl=100');
  });

  test('filter includes tin=arib-std-b67 when colorTransfer is arib-std-b67 (HLG)', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Software,
      colorFormat: new ColorFormat({
        colorTransfer: ColorTransferFormats.AribStdB67,
        colorSpace: ColorSpaces.Bt2020nc,
        colorPrimaries: ColorPrimaries.Bt2020,
        colorRange: ColorRanges.Tv,
      }),
    });

    const filter = new TonemapFilter(currentState);

    expect(filter.filter).toContain('zscale=t=linear:tin=arib-std-b67:npl=100');
  });

  test('filter omits tin= when colorTransfer is null', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Software,
    });

    const filter = new TonemapFilter(currentState);

    expect(filter.filter).toContain('zscale=t=linear:npl=100');
    expect(filter.filter).not.toContain('tin=');
  });

  test('hardware location with smpte2084 includes hwdownload prefix and tin=smpte2084', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
      colorFormat: new ColorFormat({
        colorTransfer: ColorTransferFormats.Smpte2084,
        colorSpace: ColorSpaces.Bt2020nc,
        colorPrimaries: ColorPrimaries.Bt2020,
        colorRange: ColorRanges.Tv,
      }),
    });

    const filter = new TonemapFilter(currentState);

    expect(filter.filter).toMatch(/^hwdownload,format=p010le\|nv12,/);
    expect(filter.filter).toContain('zscale=t=linear:tin=smpte2084:npl=100');
  });
});
31
server/src/ffmpeg/builder/filter/TonemapFilter.ts
Normal file
@@ -0,0 +1,31 @@
import { FilterOption } from '@/ffmpeg/builder/filter/FilterOption.js';
import { PixelFormatYuv420P } from '@/ffmpeg/builder/format/PixelFormat.js';
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
import { ColorFormat } from '../format/ColorFormat.ts';

export class TonemapFilter extends FilterOption {
  constructor(private currentState: FrameState) {
    super();
  }

  public readonly affectsFrameState = true;

  public get filter(): string {
    const transfer = this.currentState.colorFormat?.colorTransfer;
    // DV Profile 5 may have color_transfer = null/unknown; only specify tin when it is known so zscale does not incorrectly convert as PQ when the color transfer is unknown
    const tinParam = transfer ? `:tin=${transfer}` : '';
    const tonemap = `zscale=t=linear${tinParam}:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=s=bt709:t=bt709:m=bt709:r=tv,format=yuv420p`;
    return this.currentState.frameDataLocation === FrameDataLocation.Hardware
      ? `hwdownload,format=p010le|nv12,${tonemap}`
      : tonemap;
  }

  nextState(currentState: FrameState): FrameState {
    return currentState.update({
      colorFormat: ColorFormat.bt709,
      frameDataLocation: FrameDataLocation.Software,
      pixelFormat: new PixelFormatYuv420P(),
    });
  }
}
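The branching in `TonemapFilter.filter` above reduces to a small pure function. This is a hypothetical standalone sketch, not part of the codebase: `onHardware` stands in for the frame-data-location check, and `tin` for the optional input color transfer.

```typescript
// Standalone sketch of the TonemapFilter string logic above (hypothetical
// helper, not part of Tunarr). `tin` is only appended when the input color
// transfer is known, so zscale does not assume PQ for unknown transfers.
function tonemapChain(onHardware: boolean, tin?: string): string {
  const tinParam = tin ? `:tin=${tin}` : '';
  const tonemap =
    `zscale=t=linear${tinParam}:npl=100,format=gbrpf32le,zscale=p=bt709,` +
    `tonemap=tonemap=hable:desat=0,` +
    `zscale=s=bt709:t=bt709:m=bt709:r=tv,format=yuv420p`;
  // Frames resident in GPU memory must first be downloaded to a software
  // format the zscale/tonemap software chain can read.
  return onHardware ? `hwdownload,format=p010le|nv12,${tonemap}` : tonemap;
}
```

This matches the expectations in the test file above: the software path emits the bare zscale/tonemap chain, while the hardware path prefixes `hwdownload,format=p010le|nv12,`.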
@@ -23,7 +23,6 @@ export class HardwareUploadCudaFilter extends FilterOption {
      return '';
    } else {
      let fmtPart = '';
      console.log(this.currentState);
      if (
        !this.currentState.pixelFormat ||
        this.currentState.pixelFormat.name === PixelFormats.Unknown

@@ -111,7 +111,6 @@ export class ScaleCudaFilter extends FilterOption {
    const filters = [scale];
    if (this.currentState.frameDataLocation === FrameDataLocation.Software) {
      this.uploadFilter = new HardwareUploadCudaFilter(this.currentState);
      console.log('apply upload filter');
      filters.unshift(this.uploadFilter.filter);
    }

41
server/src/ffmpeg/builder/filter/opencl/PadOpenclFilter.ts
Normal file
@@ -0,0 +1,41 @@
import type { FrameState } from '../../state/FrameState.ts';
import type { FrameSize } from '../../types.ts';
import { FrameDataLocation } from '../../types.ts';
import { FilterOption } from '../FilterOption.ts';
import { HardwareUploadVaapiFilter } from '../vaapi/HardwareUploadVaapiFilter.ts';

export class PadOpenclFilter extends FilterOption {
  private preprocessFilters: FilterOption[] = [];

  constructor(
    private currentState: FrameState,
    private paddedSize: FrameSize,
  ) {
    super();
  }

  get filter(): string {
    if (this.currentState.frameDataLocation === FrameDataLocation.Software) {
      this.preprocessFilters.push(new HardwareUploadVaapiFilter(true));
    }

    const pad = `hwmap=derive_device=opencl,w=${this.paddedSize.width}:h=${this.paddedSize.height}:x=-1:y=-1:color=black`;

    return this.preprocessFilters
      .map((filter) => filter.filter)
      .concat([pad])
      .join(',');
  }

  nextState(currentState: FrameState): FrameState {
    currentState = this.preprocessFilters.reduce(
      (prev, filter) => filter.nextState(prev),
      currentState,
    );
    return currentState.update({
      paddedSize: this.paddedSize,
    });
  }

  public affectsFrameState: boolean = true;
}
@@ -0,0 +1,165 @@
import {
  ColorRanges,
  ColorTransferFormats,
} from '@/ffmpeg/builder/constants.js';
import { TonemapOpenclFilter } from '@/ffmpeg/builder/filter/opencl/TonemapOpenclFilter.js';
import {
  PixelFormatNv12,
  PixelFormatUnknown,
  PixelFormatVaapi,
  PixelFormatYuv420P,
  PixelFormatYuv420P10Le,
} from '@/ffmpeg/builder/format/PixelFormat.js';
import { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import { FrameDataLocation, FrameSize } from '@/ffmpeg/builder/types.js';

describe('TonemapOpenclFilter', () => {
  test('filter string when frame data is on hardware', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapOpenclFilter(currentState);

    expect(filter.filter).to.eq(
      'hwmap=derive_device=opencl,tonemap_opencl=tonemap=hable:desat=0:t=bt709:m=bt709:p=bt709:format=nv12,hwmap=derive_device=vaapi:reverse=1',
    );
  });

  test('filter string when frame data is in software', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Software,
    });

    const filter = new TonemapOpenclFilter(currentState);

    expect(filter.filter).to.eq(
      'format=vaapi|nv12|p010le,hwmap=derive_device=opencl,tonemap_opencl=tonemap=hable:desat=0:t=bt709:m=bt709:p=bt709:format=nv12,hwmap=derive_device=vaapi:reverse=1',
    );
  });

  test('affectsFrameState is true', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapOpenclFilter(currentState);

    expect(filter.affectsFrameState).toBe(true);
  });

  test('nextState sets color properties to bt709 and tv range', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapOpenclFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.colorFormat?.colorSpace).to.eq(ColorTransferFormats.Bt709);
    expect(nextState.colorFormat?.colorTransfer).to.eq(
      ColorTransferFormats.Bt709,
    );
    expect(nextState.colorFormat?.colorPrimaries).to.eq(
      ColorTransferFormats.Bt709,
    );
    expect(nextState.colorFormat?.colorRange).to.eq(ColorRanges.Tv);
  });

  test('nextState sets frame data location to hardware', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P(),
      frameDataLocation: FrameDataLocation.Software,
    });

    const filter = new TonemapOpenclFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.frameDataLocation).to.eq(FrameDataLocation.Hardware);
  });

  test('nextState wraps existing pixel format in PixelFormatNv12', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapOpenclFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.pixelFormat).toMatchPixelFormat(
      new PixelFormatNv12(new PixelFormatYuv420P10Le()),
    );
  });

  test('nextState wraps unknown pixel format in PixelFormatNv12 when no current pixel format', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapOpenclFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.pixelFormat).toMatchPixelFormat(
      new PixelFormatNv12(PixelFormatUnknown()),
    );
  });

  test('nextState does not double-wrap nv12 pixel format in another nv12', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatNv12(new PixelFormatYuv420P()),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapOpenclFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.pixelFormat).toMatchPixelFormat(
      new PixelFormatNv12(new PixelFormatYuv420P()),
    );
  });

  test('nextState does not wrap vaapi pixel format in nv12 without unwrapping', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatVaapi(new PixelFormatYuv420P()),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapOpenclFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.pixelFormat).toMatchPixelFormat(
      new PixelFormatNv12(new PixelFormatYuv420P()),
    );
  });
});
@@ -0,0 +1,37 @@
import { FilterOption } from '@/ffmpeg/builder/filter/FilterOption.js';
import {
  PixelFormatNv12,
  PixelFormatUnknown,
} from '@/ffmpeg/builder/format/PixelFormat.js';
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
import { ColorFormat } from '../../format/ColorFormat.ts';

export class TonemapOpenclFilter extends FilterOption {
  constructor(private currentState: FrameState) {
    super();
  }

  public get filter(): string {
    const tonemap =
      'hwmap=derive_device=opencl,tonemap_opencl=tonemap=hable:desat=0:t=bt709:m=bt709:p=bt709:format=nv12,hwmap=derive_device=vaapi:reverse=1';
    return this.currentState.frameDataLocation === FrameDataLocation.Hardware
      ? tonemap
      : `format=vaapi|nv12|p010le,${tonemap}`;
  }

  public readonly affectsFrameState: boolean = true;

  nextState(currentState: FrameState): FrameState {
    const currentPixelFormat =
      currentState.pixelFormat?.toSoftwareFormat() ?? currentState.pixelFormat;

    return currentState.update({
      colorFormat: ColorFormat.bt709,
      frameDataLocation: FrameDataLocation.Hardware,
      pixelFormat: new PixelFormatNv12(
        currentPixelFormat ?? PixelFormatUnknown(currentState.bitDepth),
      ),
    });
  }
}
@@ -1,5 +1,7 @@
import { FilterOption } from '@/ffmpeg/builder/filter/FilterOption.js';
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import { FrameDataLocation } from '../../types.ts';
import { HardwareUploadQsvFilter } from './HardwareUploadQsvFilter.ts';

export class DeinterlaceQsvFilter extends FilterOption {
  readonly filter: string;
@@ -14,14 +16,14 @@ export class DeinterlaceQsvFilter extends FilterOption {
  nextState(currentState: FrameState): FrameState {
    return currentState.update({
      deinterlace: false,
      frameDataLocation: 'hardware',
      frameDataLocation: FrameDataLocation.Hardware,
    });
  }

  private generateFilter(currentState: FrameState): string {
    const prelude =
      currentState.frameDataLocation === 'hardware'
        ? 'hwupload=extra_hw_frames=64,'
      currentState.frameDataLocation !== FrameDataLocation.Hardware
        ? `${new HardwareUploadQsvFilter(64).filter},`
        : '';
    return `${prelude}deinterlace_qsv`;
  }

@@ -1,6 +1,5 @@
import { FilterOption } from '@/ffmpeg/builder/filter/FilterOption.js';
import { PixelFormats } from '@/ffmpeg/builder/format/PixelFormat.js';
import type { VideoStream } from '@/ffmpeg/builder/MediaStream.js';
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import type { FrameSize } from '@/ffmpeg/builder/types.js';
import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
@@ -11,7 +10,6 @@ export class ScaleQsvFilter extends FilterOption {
  readonly affectsFrameState: boolean = true;

  constructor(
    private videoStream: VideoStream,
    private currentState: FrameState,
    private scaledSize: FrameSize,
  ) {
@@ -51,12 +49,10 @@ export class ScaleQsvFilter extends FilterOption {

    if (!this.currentState.scaledSize.equals(this.scaledSize)) {
      const targetSize = `w=${this.scaledSize.width}:h=${this.scaledSize.height}`;
      const sarValue =
        this.videoStream.sampleAspectRatio?.replace(':', '/') ?? '1/1';
      let squareScale = '';
      let format = '';
      if (this.currentState.isAnamorphic) {
        squareScale = `vpp_qsv=w=iw*${sarValue}:h=ih,setsar=1,`;
        squareScale = `vpp_qsv=w=iw*sar:h=ih,setsar=1,`;
      } else {
        format = `,setsar=1`;
      }

15
server/src/ffmpeg/builder/filter/qsv/TonemapQsvFilter.ts
Normal file
@@ -0,0 +1,15 @@
import { ColorFormat } from '../../format/ColorFormat.ts';
import type { FrameState } from '../../state/FrameState.ts';
import { FilterOption } from '../FilterOption.ts';

export class TonemapQsvFilter extends FilterOption {
  get filter() {
    return `vpp_qsv=tonemap=1`;
  }

  nextState(currentState: FrameState): FrameState {
    return currentState.update({
      colorFormat: ColorFormat.bt709,
    });
  }
}
41
server/src/ffmpeg/builder/filter/vaapi/PadVaapiFilter.ts
Normal file
@@ -0,0 +1,41 @@
import type { FrameState } from '../../state/FrameState.ts';
import type { FrameSize } from '../../types.ts';
import { FrameDataLocation } from '../../types.ts';
import { FilterOption } from '../FilterOption.ts';
import { HardwareUploadVaapiFilter } from './HardwareUploadVaapiFilter.ts';

export class PadVaapiFilter extends FilterOption {
  private preprocessFilters: FilterOption[] = [];

  constructor(
    private currentState: FrameState,
    private paddedSize: FrameSize,
  ) {
    super();
  }

  get filter(): string {
    if (this.currentState.frameDataLocation === FrameDataLocation.Software) {
      this.preprocessFilters.push(new HardwareUploadVaapiFilter(true));
    }

    const pad = `pad_vaapi=w=${this.paddedSize.width}:h=${this.paddedSize.height}:x=-1:y=-1:color=black`;

    return this.preprocessFilters
      .map((filter) => filter.filter)
      .concat([pad])
      .join(',');
  }

  nextState(currentState: FrameState): FrameState {
    currentState = this.preprocessFilters.reduce(
      (prev, filter) => filter.nextState(prev),
      currentState,
    );
    return currentState.update({
      paddedSize: this.paddedSize,
    });
  }

  public affectsFrameState: boolean = true;
}
@@ -0,0 +1,127 @@
import { TonemapVaapiFilter } from '@/ffmpeg/builder/filter/vaapi/TonemapVaapiFilter.js';
import {
  ColorRanges,
  ColorTransferFormats,
} from '@/ffmpeg/builder/constants.js';
import {
  PixelFormatNv12,
  PixelFormatYuv420P,
  PixelFormatYuv420P10Le,
} from '@/ffmpeg/builder/format/PixelFormat.js';
import { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import { FrameDataLocation, FrameSize } from '@/ffmpeg/builder/types.js';

describe('TonemapVaapiFilter', () => {
  test('filter string when frame data is on hardware', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapVaapiFilter(currentState);

    expect(filter.filter).to.eq(
      'tonemap_vaapi=format=nv12:t=bt709:m=bt709:p=bt709',
    );
  });

  test('filter string when frame data is in software', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Software,
    });

    const filter = new TonemapVaapiFilter(currentState);

    expect(filter.filter).to.eq(
      'format=vaapi|nv12|p010le,tonemap_vaapi=format=nv12:t=bt709:m=bt709:p=bt709',
    );
  });

  test('affectsFrameState is true', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapVaapiFilter(currentState);

    expect(filter.affectsFrameState).toBe(true);
  });

  test('nextState sets color properties to bt709 and tv range', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapVaapiFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.colorFormat?.colorSpace).to.eq(ColorTransferFormats.Bt709);
    expect(nextState.colorFormat?.colorTransfer).to.eq(
      ColorTransferFormats.Bt709,
    );
    expect(nextState.colorFormat?.colorPrimaries).to.eq(
      ColorTransferFormats.Bt709,
    );
    expect(nextState.colorFormat?.colorRange).to.eq(ColorRanges.Tv);
  });

  test('nextState sets frame data location to hardware', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P(),
      frameDataLocation: FrameDataLocation.Software,
    });

    const filter = new TonemapVaapiFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.frameDataLocation).to.eq(FrameDataLocation.Hardware);
  });

  test('nextState wraps existing pixel format in PixelFormatNv12', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapVaapiFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.pixelFormat).toMatchPixelFormat(
      new PixelFormatNv12(new PixelFormatYuv420P10Le()),
    );
  });

  test('nextState sets pixel format to null when no current pixel format', () => {
    const currentState = new FrameState({
      scaledSize: FrameSize.FHD,
      paddedSize: FrameSize.FHD,
      isAnamorphic: false,
      frameDataLocation: FrameDataLocation.Hardware,
    });

    const filter = new TonemapVaapiFilter(currentState);
    const nextState = filter.nextState(currentState);

    expect(nextState.pixelFormat).toBeNull();
  });
});
31 server/src/ffmpeg/builder/filter/vaapi/TonemapVaapiFilter.ts Normal file
@@ -0,0 +1,31 @@
import { FilterOption } from '@/ffmpeg/builder/filter/FilterOption.js';
import { PixelFormatNv12 } from '@/ffmpeg/builder/format/PixelFormat.js';
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
import { ColorFormat } from '../../format/ColorFormat.ts';

export class TonemapVaapiFilter extends FilterOption {
  constructor(private currentState: FrameState) {
    super();
  }

  public get filter(): string {
    const tonemap = 'tonemap_vaapi=format=nv12:t=bt709:m=bt709:p=bt709';
    return this.currentState.frameDataLocation === FrameDataLocation.Hardware
      ? tonemap
      : `format=vaapi|nv12|p010le,${tonemap}`;
  }

  public readonly affectsFrameState: boolean = true;

  nextState(currentState: FrameState): FrameState {
    const currentPixelFormat = currentState.pixelFormat;
    return currentState.update({
      colorFormat: ColorFormat.bt709,
      frameDataLocation: FrameDataLocation.Hardware,
      pixelFormat: currentPixelFormat
        ? new PixelFormatNv12(currentPixelFormat)
        : null,
    });
  }
}
60 server/src/ffmpeg/builder/format/ColorFormat.ts Normal file
@@ -0,0 +1,60 @@
import { isEmpty } from 'lodash-es';
import type { Nullable } from '../../../types/util.ts';
import {
  ColorPrimaries,
  ColorRanges,
  ColorSpaces,
  ColorTransferFormats,
} from '../constants.ts';

type ColorFormatCtor = {
  colorRange: Nullable<string>;
  colorSpace: Nullable<string>;
  colorTransfer: Nullable<string>;
  colorPrimaries: Nullable<string>;
};

export class ColorFormat {
  readonly colorRange: Nullable<string>;
  readonly colorSpace: Nullable<string>;
  readonly colorTransfer: Nullable<string>;
  readonly colorPrimaries: Nullable<string>;

  static bt709: ColorFormat = new ColorFormat({
    colorRange: ColorRanges.Tv,
    colorSpace: ColorSpaces.Bt709,
    colorPrimaries: ColorPrimaries.Bt709,
    colorTransfer: ColorTransferFormats.Bt709,
  });

  static unknown: ColorFormat = new ColorFormat({
    colorPrimaries: null,
    colorRange: ColorRanges.Tv,
    colorSpace: null,
    colorTransfer: null,
  });

  constructor(params: ColorFormatCtor) {
    this.colorPrimaries = params.colorPrimaries;
    this.colorRange = params.colorRange;
    this.colorSpace = params.colorSpace;
    this.colorTransfer = params.colorTransfer;
  }

  get isHdr() {
    return (
      this.colorTransfer === ColorTransferFormats.Smpte2084 ||
      this.colorTransfer === ColorTransferFormats.AribStdB67
    );
  }

  get isBt709() {
    return (
      this.colorRange === ColorRanges.Tv &&
      this.colorSpace === ColorSpaces.Bt709 &&
      (isEmpty(this.colorTransfer) ||
        this.colorTransfer === ColorTransferFormats.Bt709) &&
      (isEmpty(this.colorPrimaries) ||
        this.colorPrimaries === ColorPrimaries.Bt709)
    );
  }
}
@@ -1,5 +1,6 @@
import { VideoStream } from '@/ffmpeg/builder/MediaStream.js';
import type { FrameSize } from '@/ffmpeg/builder/types.js';
import { ColorFormat } from '../format/ColorFormat.ts';
import type {
  InputSourceContinuity,
  InputSourceType,
@@ -29,6 +30,7 @@ export class ConcatInputSource extends InputSource<VideoStream> {
        providedSampleAspectRatio: null,
        displayAspectRatio: '1:1',
        inputKind: 'video',
        colorFormat: ColorFormat.unknown,
      }),
    ];
  }
@@ -6,6 +6,7 @@ import { LavfiInputOption } from '@/ffmpeg/builder/options/input/LavfiInputOptio
import { FrameSize } from '@/ffmpeg/builder/types.js';
import type { HasFilterOption } from '@/ffmpeg/builder/types/PipelineStep.js';
import { FilterStreamSource } from '@/stream/types.js';
import { ColorFormat } from '../format/ColorFormat.ts';
import { VideoInputSource } from './VideoInputSource.ts';

export class LavfiVideoInputSource extends VideoInputSource {
@@ -23,6 +24,7 @@ export class LavfiVideoInputSource extends VideoInputSource {
        providedSampleAspectRatio: null,
        displayAspectRatio: '1:1',
        frameSize: size,
        colorFormat: ColorFormat.unknown,
      }),
    ]);
    this.addOption(new LavfiInputOption());
@@ -1,6 +1,7 @@
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import type { GlobalOptionPipelineStep } from '@/ffmpeg/builder/types/PipelineStep.js';
import { isString } from 'lodash-es';
import type { NonEmptyArray } from 'ts-essentials';

export abstract class GlobalOption implements GlobalOptionPipelineStep {
  readonly type = 'global';
@@ -15,11 +16,11 @@ export abstract class GlobalOption implements GlobalOptionPipelineStep {
}

export abstract class ConstantGlobalOption extends GlobalOption {
  private _options: [string, ...string[]];
  private _options: NonEmptyArray<string>;

  constructor(options: string);
  constructor(options: [string, ...string[]]);
  constructor(options: string | [string, ...string[]]) {
  constructor(options: NonEmptyArray<string>);
  constructor(options: string | NonEmptyArray<string>) {
    super();
    this._options = isString(options) ? [options] : options;
  }
58 server/src/ffmpeg/builder/options/HlsDirectOutputFormat.ts Normal file
@@ -0,0 +1,58 @@
import { OutputOption } from './OutputOption.ts';

/**
 * HLS output format for direct stream copy mode (no transcoding).
 * Unlike HlsOutputFormat, this does NOT include keyframe-forcing options
 * (-g, -keyint_min, -force_key_frames) which are incompatible with -c:v copy.
 * Segments will be created at source keyframe boundaries.
 */
export class HlsDirectOutputFormat extends OutputOption {
  public static SegmentSeconds = 4;

  constructor(
    private playlistPath: string,
    private segmentTemplate: string,
    private baseStreamUrl: string,
    private isFirstTranscode: boolean,
  ) {
    super();
  }

  options(): string[] {
    const opts = [
      '-f',
      'hls',
      '-hls_time',
      `${HlsDirectOutputFormat.SegmentSeconds}`,
      '-hls_list_size',
      '0',
      '-segment_list_flags',
      '+live',
      '-hls_segment_type',
      'mpegts',
      '-hls_segment_filename',
      this.segmentTemplate,
      '-hls_base_url',
      this.baseStreamUrl,
      '-copyts', // Preserve original timestamps
    ];

    if (this.isFirstTranscode) {
      opts.push(
        '-hls_flags',
        'program_date_time+append_list+omit_endlist+independent_segments',
        this.playlistPath,
      );
    } else {
      opts.push(
        '-hls_flags',
        'program_date_time+append_list+discont_start+omit_endlist+independent_segments',
        '-mpegts_flags',
        '+initial_discontinuity',
        this.playlistPath,
      );
    }

    return opts;
  }
}
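The doc comment's direct-copy behavior can be sketched as a plain function — a hypothetical, self-contained mirror of the `options()` logic above (not the real class, which extends `OutputOption`) — showing how the flag list differs between the first transcode session and later appends to the same playlist:

```typescript
// Hypothetical standalone sketch of HlsDirectOutputFormat.options().
function hlsDirectOptions(
  playlistPath: string,
  segmentTemplate: string,
  baseStreamUrl: string,
  isFirstTranscode: boolean,
): string[] {
  const opts = [
    '-f', 'hls',
    '-hls_time', '4', // HlsDirectOutputFormat.SegmentSeconds
    '-hls_list_size', '0',
    '-segment_list_flags', '+live',
    '-hls_segment_type', 'mpegts',
    '-hls_segment_filename', segmentTemplate,
    '-hls_base_url', baseStreamUrl,
    '-copyts', // preserve source timestamps; no keyframe-forcing options with -c:v copy
  ];
  if (isFirstTranscode) {
    opts.push(
      '-hls_flags',
      'program_date_time+append_list+omit_endlist+independent_segments',
      playlistPath,
    );
  } else {
    // Later sessions append to an existing playlist, so the join point is
    // marked as a discontinuity for players.
    opts.push(
      '-hls_flags',
      'program_date_time+append_list+discont_start+omit_endlist+independent_segments',
      '-mpegts_flags', '+initial_discontinuity',
      playlistPath,
    );
  }
  return opts;
}
```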
@@ -6,4 +6,10 @@ export const KnownFfmpegOptions = {
export const KnownFfmpegFilters = {
  ScaleNpp: 'scale_npp',
  ScaleCuda: 'scale_cuda',
  ScaleVulkan: 'scale_vulkan',
  TonemapVaapi: 'tonemap_vaapi',
  TonemapOpencl: 'tonemap_opencl',
  Libplacebo: 'libplacebo',
  PadVaapi: 'pad_vaapi',
  PadOpencl: 'pad_opencl',
};
@@ -68,6 +68,9 @@ export const NoSceneDetectOutputOption = (
export const TimeLimitOutputOption = (finish: Duration): ConstantOutputOption =>
  makeConstantOutputOption(['-t', `${finish.asMilliseconds()}ms`]);

export const TransocdeUntilOutputOption = (ms: number): ConstantOutputOption =>
  makeConstantOutputOption(['-to', `${ms}ms`]);

export const VideoBitrateOutputOption = (
  bitrate: number,
): ConstantOutputOption =>
@@ -1,7 +1,18 @@
import { ConstantGlobalOption } from '@/ffmpeg/builder/options/GlobalOption.js';

export class CudaHardwareAccelerationOption extends ConstantGlobalOption {
  constructor() {
    super(['-hwaccel', 'cuda']);
  constructor(initVulkanDevice: boolean) {
    if (initVulkanDevice) {
      super([
        '-init_hw_device',
        'cuda=nv',
        '-init_hw_device',
        'vulkan=vk@nv',
        '-hwaccel',
        'vulkan',
      ]);
    } else {
      super(['-init_hw_device', 'cuda', '-hwaccel', 'cuda']);
    }
  }
}
@@ -12,6 +12,7 @@ export class ReadrateInputOption extends InputOption {
  constructor(
    private capabilities: FfmpegCapabilities,
    private initialBurstSeconds: number,
    private readrate: number = 1.0,
  ) {
    super();
  }
@@ -32,7 +33,7 @@ export class ReadrateInputOption extends InputOption {
  }

  options(): string[] {
    const opts = ['-readrate', '1.0'];
    const opts = ['-readrate', `${this.readrate}`];
    if (this.shouldBurst()) {
      opts.push('-readrate_initial_burst', `${this.initialBurstSeconds}`);
    }
@@ -0,0 +1,16 @@
import type { InputSource } from '../../input/InputSource.ts';
import { InputOption } from './InputOption.ts';

export class RealtimeBufferSizeInputOption extends InputOption {
  constructor(private size: string) {
    super();
  }

  appliesToInput(input: InputSource): boolean {
    return input.type === 'video' || input.type === 'audio';
  }

  options(): string[] {
    return ['-rtbufsize', this.size];
  }
}
@@ -1,6 +1,7 @@
import { FileStreamSource } from '../../../stream/types.ts';
import { EmptyFfmpegCapabilities } from '../capabilities/FfmpegCapabilities.ts';
import { AudioVolumeFilter } from '../filter/AudioVolumeFilter.ts';
import { LoudnormFilter } from '../filter/LoudnormFilter.ts';
import { PixelFormatYuv420P } from '../format/PixelFormat.ts';
import { AudioInputSource } from '../input/AudioInputSource.ts';
import { VideoInputSource } from '../input/VideoInputSource.ts';
@@ -127,4 +128,269 @@ describe('BasePipelineBuilder', () => {

    expect(volumeFilter).toBeUndefined();
  });

  test('add loudnorm filter when loudnormConfig is set', () => {
    const audio = AudioInputSource.withStream(
      new FileStreamSource('/path/to/song.flac'),
      AudioStream.create({
        channels: 2,
        codec: 'flac',
        index: 0,
      }),
      AudioState.create({
        audioBitrate: 192,
        audioBufferSize: 192 * 2,
        audioChannels: 2,
        loudnormConfig: { i: -24, lra: 7, tp: -2 },
      }),
    );

    const pipeline = new NoopPipelineBuilder(
      video,
      audio,
      null,
      null,
      null,
      EmptyFfmpegCapabilities,
    );

    const result = pipeline.build(state, frameState, DefaultPipelineOptions);

    const loudnormFilter = result.inputs.audioInput?.filterSteps.find(
      (step) => step instanceof LoudnormFilter,
    );

    expect(loudnormFilter).toBeDefined();
    expect(loudnormFilter?.filter).toEqual(
      'loudnorm=I=-24:LRA=7:TP=-2,aresample=48000',
    );
  });

  test('add loudnorm filter with custom offset gain', () => {
    const audio = AudioInputSource.withStream(
      new FileStreamSource('/path/to/song.flac'),
      AudioStream.create({
        channels: 2,
        codec: 'flac',
        index: 0,
      }),
      AudioState.create({
        audioBitrate: 192,
        audioBufferSize: 192 * 2,
        audioChannels: 2,
        loudnormConfig: { i: -16, lra: 11, tp: -1, offsetGain: 3 },
      }),
    );

    const pipeline = new NoopPipelineBuilder(
      video,
      audio,
      null,
      null,
      null,
      EmptyFfmpegCapabilities,
    );

    const result = pipeline.build(state, frameState, DefaultPipelineOptions);

    const loudnormFilter = result.inputs.audioInput?.filterSteps.find(
      (step) => step instanceof LoudnormFilter,
    );

    expect(loudnormFilter).toBeDefined();
    expect(loudnormFilter?.filter).toEqual(
      'loudnorm=I=-16:LRA=11:TP=-1:offset=3,aresample=48000',
    );
  });

  test('use custom sample rate in loudnorm filter when audioSampleRate is set', () => {
    const audio = AudioInputSource.withStream(
      new FileStreamSource('/path/to/song.flac'),
      AudioStream.create({
        channels: 2,
        codec: 'flac',
        index: 0,
      }),
      AudioState.create({
        audioBitrate: 192,
        audioBufferSize: 192 * 2,
        audioChannels: 2,
        audioSampleRate: 44100,
        loudnormConfig: { i: -24, lra: 7, tp: -2 },
      }),
    );

    const pipeline = new NoopPipelineBuilder(
      video,
      audio,
      null,
      null,
      null,
      EmptyFfmpegCapabilities,
    );

    const result = pipeline.build(state, frameState, DefaultPipelineOptions);

    const loudnormFilter = result.inputs.audioInput?.filterSteps.find(
      (step) => step instanceof LoudnormFilter,
    );

    expect(loudnormFilter).toBeDefined();
    expect(loudnormFilter?.filter).toEqual(
      'loudnorm=I=-24:LRA=7:TP=-2,aresample=44100',
    );
  });

  test('do not add loudnorm filter when audio encoder is copy', () => {
    const audio = AudioInputSource.withStream(
      new FileStreamSource('/path/to/song.flac'),
      AudioStream.create({
        channels: 2,
        codec: 'flac',
        index: 0,
      }),
      AudioState.create({
        audioEncoder: 'copy',
        loudnormConfig: { i: -24, lra: 7, tp: -2 },
      }),
    );

    const pipeline = new NoopPipelineBuilder(
      video,
      audio,
      null,
      null,
      null,
      EmptyFfmpegCapabilities,
    );

    const result = pipeline.build(state, frameState, DefaultPipelineOptions);

    const loudnormFilter = result.inputs.audioInput?.filterSteps.find(
      (step) => step instanceof LoudnormFilter,
    );

    expect(loudnormFilter).toBeUndefined();
  });

  test.each([
    { desc: 'i too low', config: { i: -70.1, lra: 7, tp: -2 } },
    { desc: 'i too high', config: { i: -4.9, lra: 7, tp: -2 } },
    { desc: 'lra too low', config: { i: -24, lra: 0.9, tp: -2 } },
    { desc: 'lra too high', config: { i: -24, lra: 50.1, tp: -2 } },
    { desc: 'tp too low', config: { i: -24, lra: 7, tp: -9.1 } },
    { desc: 'tp too high', config: { i: -24, lra: 7, tp: 0.1 } },
  ])(
    'do not add loudnorm filter when $desc',
    ({ config }) => {
      const audio = AudioInputSource.withStream(
        new FileStreamSource('/path/to/song.flac'),
        AudioStream.create({
          channels: 2,
          codec: 'flac',
          index: 0,
        }),
        AudioState.create({
          audioBitrate: 192,
          audioBufferSize: 192 * 2,
          audioChannels: 2,
          loudnormConfig: config,
        }),
      );

      const pipeline = new NoopPipelineBuilder(
        video,
        audio,
        null,
        null,
        null,
        EmptyFfmpegCapabilities,
      );

      const result = pipeline.build(state, frameState, DefaultPipelineOptions);

      const loudnormFilter = result.inputs.audioInput?.filterSteps.find(
        (step) => step instanceof LoudnormFilter,
      );

      expect(loudnormFilter).toBeUndefined();
    },
  );

  test.each([
    { desc: 'i at lower bound', config: { i: -70, lra: 7, tp: -2 } },
    { desc: 'i at upper bound', config: { i: -5, lra: 7, tp: -2 } },
    { desc: 'lra at lower bound', config: { i: -24, lra: 1, tp: -2 } },
    { desc: 'lra at upper bound', config: { i: -24, lra: 50, tp: -2 } },
    { desc: 'tp at lower bound', config: { i: -24, lra: 7, tp: -9 } },
    { desc: 'tp at upper bound', config: { i: -24, lra: 7, tp: 0 } },
  ])(
    'add loudnorm filter when $desc',
    ({ config }) => {
      const audio = AudioInputSource.withStream(
        new FileStreamSource('/path/to/song.flac'),
        AudioStream.create({
          channels: 2,
          codec: 'flac',
          index: 0,
        }),
        AudioState.create({
          audioBitrate: 192,
          audioBufferSize: 192 * 2,
          audioChannels: 2,
          loudnormConfig: config,
        }),
      );

      const pipeline = new NoopPipelineBuilder(
        video,
        audio,
        null,
        null,
        null,
        EmptyFfmpegCapabilities,
      );

      const result = pipeline.build(state, frameState, DefaultPipelineOptions);

      const loudnormFilter = result.inputs.audioInput?.filterSteps.find(
        (step) => step instanceof LoudnormFilter,
      );

      expect(loudnormFilter).toBeDefined();
    },
  );

  test('do not add loudnorm filter when loudnormConfig is not set', () => {
    const audio = AudioInputSource.withStream(
      new FileStreamSource('/path/to/song.flac'),
      AudioStream.create({
        channels: 2,
        codec: 'flac',
        index: 0,
      }),
      AudioState.create({
        audioBitrate: 192,
        audioBufferSize: 192 * 2,
        audioChannels: 2,
      }),
    );

    const pipeline = new NoopPipelineBuilder(
      video,
      audio,
      null,
      null,
      null,
      EmptyFfmpegCapabilities,
    );

    const result = pipeline.build(state, frameState, DefaultPipelineOptions);

    const loudnormFilter = result.inputs.audioInput?.filterSteps.find(
      (step) => step instanceof LoudnormFilter,
    );

    expect(loudnormFilter).toBeUndefined();
  });
});
@@ -1,4 +1,7 @@
|
||||
import { HardwareAccelerationMode } from '@/db/schema/TranscodeConfig.js';
|
||||
import {
|
||||
HardwareAccelerationMode,
|
||||
TranscodeAudioOutputFormat,
|
||||
} from '@/db/schema/TranscodeConfig.js';
|
||||
import {
|
||||
SubtitleMethods,
|
||||
type AudioStream,
|
||||
@@ -29,6 +32,7 @@ import type { ConcatInputSource } from '@/ffmpeg/builder/input/ConcatInputSource
|
||||
import type { VideoInputSource } from '@/ffmpeg/builder/input/VideoInputSource.js';
|
||||
import type { WatermarkInputSource } from '@/ffmpeg/builder/input/WatermarkInputSource.js';
|
||||
import { HlsConcatOutputFormat } from '@/ffmpeg/builder/options/HlsConcatOutputFormat.js';
|
||||
import { HlsDirectOutputFormat } from '@/ffmpeg/builder/options/HlsDirectOutputFormat.js';
|
||||
import { HlsOutputFormat } from '@/ffmpeg/builder/options/HlsOutputFormat.js';
|
||||
import { LogLevelOption } from '@/ffmpeg/builder/options/LogLevelOption.js';
|
||||
import { NoStatsOption } from '@/ffmpeg/builder/options/NoStatsOption.js';
|
||||
@@ -45,13 +49,12 @@ import type {
|
||||
PipelineOptions,
|
||||
} from '@/ffmpeg/builder/state/FfmpegState.js';
|
||||
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
|
||||
import type { DataProps } from '@/ffmpeg/builder/types.js';
|
||||
import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
|
||||
import type {
|
||||
IPipelineStep,
|
||||
PipelineStep,
|
||||
} from '@/ffmpeg/builder/types/PipelineStep.js';
|
||||
import type { Nilable, Nullable } from '@/types/util.js';
|
||||
import type { DataProps, Nilable, Nullable } from '@/types/util.js';
|
||||
import { ifDefined, isNonEmptyString } from '@/util/index.js';
|
||||
import type { Logger } from '@/util/logging/LoggerFactory.js';
|
||||
import { LoggerFactory } from '@/util/logging/LoggerFactory.js';
|
||||
@@ -79,6 +82,7 @@ import { Mpeg2VideoEncoder } from '../encoder/Mpeg2VideoEncoder.ts';
|
||||
import { RawVideoEncoder } from '../encoder/RawVideoEncoder.ts';
|
||||
import { AudioVolumeFilter } from '../filter/AudioVolumeFilter.ts';
|
||||
import type { FilterOption } from '../filter/FilterOption.ts';
|
||||
import { LoudnormFilter } from '../filter/LoudnormFilter.ts';
|
||||
import { StreamSeekFilter } from '../filter/StreamSeekFilter.ts';
|
||||
import type { SubtitlesInputSource } from '../input/SubtitlesInputSource.ts';
|
||||
import {
|
||||
@@ -111,10 +115,12 @@ import {
|
||||
OutputTsOffsetOption,
|
||||
PipeProtocolOutputOption,
|
||||
TimeLimitOutputOption,
|
||||
TransocdeUntilOutputOption,
|
||||
VideoBitrateOutputOption,
|
||||
VideoBufferSizeOutputOption,
|
||||
VideoTrackTimescaleOutputOption,
|
||||
} from '../options/OutputOption.ts';
|
||||
import { RealtimeBufferSizeInputOption } from '../options/input/RealtimeBufferSizeInputOption.ts';
|
||||
import { FrameRateOutputOption } from '../options/output/FrameRateOutputOption.ts';
|
||||
import { Pipeline } from './Pipeline.ts';
|
||||
import type { PipelineBuilder } from './PipelineBuilder.ts';
|
||||
@@ -291,7 +297,10 @@ export abstract class BasePipelineBuilder implements PipelineBuilder {
|
||||
input.addOption(new ConcatHttpReconnectOptions());
|
||||
}
|
||||
|
||||
input.addOption(new ReadrateInputOption(this.ffmpegCapabilities, 0));
|
||||
input.addOption(new RealtimeBufferSizeInputOption('15M'));
|
||||
input.addOption(
|
||||
new ReadrateInputOption(this.ffmpegCapabilities, 0 /*, 1.5*/),
|
||||
);
|
||||
if (state.metadataServiceName) {
|
||||
pipelineSteps.push(
|
||||
MetadataServiceNameOutputOption(state.metadataServiceName),
|
||||
@@ -382,7 +391,18 @@ export abstract class BasePipelineBuilder implements PipelineBuilder {
|
||||
this.setStreamSeek();
|
||||
|
||||
if (this.ffmpegState.duration && +this.ffmpegState.duration > 0) {
|
||||
this.pipelineSteps.push(TimeLimitOutputOption(this.ffmpegState.duration));
|
||||
if (this.ffmpegState.outputFormat.type !== 'hls_direct_v2') {
|
||||
this.pipelineSteps.push(
|
||||
TimeLimitOutputOption(this.ffmpegState.duration),
|
||||
);
|
||||
} else {
|
||||
const seek = this.ffmpegState.start?.asMilliseconds() ?? 0;
|
||||
this.pipelineSteps.push(
|
||||
TransocdeUntilOutputOption(
|
||||
seek + this.ffmpegState.duration.asMilliseconds(),
|
||||
),
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
if (
|
||||
@@ -634,6 +654,32 @@ export abstract class BasePipelineBuilder implements PipelineBuilder {
|
||||
this.audioInputSource?.filterSteps.push(new AudioPadFilter());
|
||||
}
|
||||
}
|
||||
|
||||
if (
|
||||
!isNull(this.desiredAudioState.loudnormConfig) &&
|
||||
encoder.name !== TranscodeAudioOutputFormat.Copy
|
||||
) {
|
||||
if (
|
||||
this.desiredAudioState.loudnormConfig.i < -70.0 ||
|
||||
this.desiredAudioState.loudnormConfig.i > -5.0 ||
|
||||
this.desiredAudioState.loudnormConfig.lra < 1.0 ||
|
||||
this.desiredAudioState.loudnormConfig.lra > 50.0 ||
|
||||
this.desiredAudioState.loudnormConfig.tp < -9.0 ||
|
||||
this.desiredAudioState.loudnormConfig.tp > 0
|
||||
) {
|
||||
this.logger.warn(
|
||||
'Loudnorm config is not valid: %O',
|
||||
this.desiredAudioState.loudnormConfig,
|
||||
);
|
||||
} else {
|
||||
this.audioInputSource?.filterSteps.push(
|
||||
new LoudnormFilter(
|
||||
this.desiredAudioState.loudnormConfig,
|
||||
this.desiredAudioState.audioSampleRate ?? 48_000,
|
||||
),
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
protected abstract setupVideoFilters(): void;
|
||||
@@ -762,12 +808,6 @@ export abstract class BasePipelineBuilder implements PipelineBuilder {
|
||||
}
|
||||
|
||||
protected setOutputFormat() {
|
||||
// this.context.pipelineSteps.push(
|
||||
// this.context.ffmpegState.outputFormat === OutputFormats.Mkv
|
||||
// ? MatroskaOutputFormatOption()
|
||||
// : MpegTsOutputFormatOption(),
|
||||
// PipeProtocolOutputOption(),
|
||||
// );
|
||||
switch (this.ffmpegState.outputFormat.type) {
|
||||
case OutputFormatTypes.Mkv:
|
||||
this.pipelineSteps.push(MatroskaOutputFormatOption());
|
||||
@@ -808,11 +848,32 @@ export abstract class BasePipelineBuilder implements PipelineBuilder {
|
||||
}
|
||||
break;
|
||||
}
|
||||
case OutputFormatTypes.HlsDirectV2: {
|
||||
if (
|
||||
isNonEmptyString(this.ffmpegState.hlsPlaylistPath) &&
|
||||
isNonEmptyString(this.ffmpegState.hlsSegmentTemplate) &&
|
||||
isNonEmptyString(this.ffmpegState.hlsBaseStreamUrl)
|
||||
) {
|
||||
this.pipelineSteps.push(
|
||||
new HlsDirectOutputFormat(
|
||||
this.ffmpegState.hlsPlaylistPath,
|
||||
this.ffmpegState.hlsSegmentTemplate,
|
||||
this.ffmpegState.hlsBaseStreamUrl,
|
||||
isNil(this.ffmpegState.ptsOffset) ||
|
||||
this.ffmpegState.ptsOffset === 0,
|
||||
),
|
||||
);
|
||||
}
|
||||
break;
|
||||
}
|
||||
case OutputFormatTypes.Dash:
|
||||
throw new Error('MPEG-DASH streaming is not yet implemented');
|
||||
}
|
||||
|
||||
if (this.ffmpegState.outputFormat.type !== OutputFormatTypes.Hls) {
|
||||
if (
|
||||
this.ffmpegState.outputFormat.type !== OutputFormatTypes.Hls &&
|
||||
this.ffmpegState.outputFormat.type !== OutputFormatTypes.HlsDirectV2
|
||||
) {
|
||||
switch (this.ffmpegState.outputLocation) {
|
||||
case OutputLocation.Stdout:
|
||||
this.pipelineSteps.push(PipeProtocolOutputOption());
|
||||
|
||||
@@ -1,12 +1,26 @@
import { FileStreamSource } from '../../../../stream/types.ts';
import { FfmpegCapabilities } from '../../capabilities/FfmpegCapabilities.ts';
import { TUNARR_ENV_VARS } from '../../../../util/env.ts';
import { EmptyFfmpegCapabilities } from '../../capabilities/FfmpegCapabilities.ts';
import {
  VaapiEntrypoint,
  VaapiHardwareCapabilities,
  VaapiProfileEntrypoint,
  VaapiProfiles,
} from '../../capabilities/VaapiHardwareCapabilities.ts';
import { PixelFormatYuv420P } from '../../format/PixelFormat.ts';
import {
  ColorPrimaries,
  ColorRanges,
  ColorSpaces,
  ColorTransferFormats,
} from '../../constants.ts';
import { HardwareUploadQsvFilter } from '../../filter/qsv/HardwareUploadQsvFilter.ts';
import { TonemapQsvFilter } from '../../filter/qsv/TonemapQsvFilter.ts';
import { TonemapFilter } from '../../filter/TonemapFilter.ts';
import { ColorFormat } from '../../format/ColorFormat.ts';
import {
  PixelFormatYuv420P,
  PixelFormatYuv420P10Le,
} from '../../format/PixelFormat.ts';
import { SubtitlesInputSource } from '../../input/SubtitlesInputSource.ts';
import { VideoInputSource } from '../../input/VideoInputSource.ts';
import { WatermarkInputSource } from '../../input/WatermarkInputSource.ts';
@@ -27,11 +41,6 @@ import { QsvPipelineBuilder } from './QsvPipelineBuilder.ts';
describe('QsvPipelineBuilder', () => {
  test('should work', () => {
    const capabilities = new VaapiHardwareCapabilities([]);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
      VideoStream.create({
@@ -41,6 +50,7 @@ describe('QsvPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: ColorFormat.unknown,
      }),
    );

@@ -63,7 +73,7 @@ describe('QsvPipelineBuilder', () => {

    const builder = new QsvPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -112,11 +122,6 @@ describe('QsvPipelineBuilder', () => {
        VaapiEntrypoint.Encode,
      ),
    ]);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
      VideoStream.create({
@@ -126,6 +131,7 @@ describe('QsvPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: ColorFormat.unknown,
      }),
    );

@@ -148,7 +154,7 @@ describe('QsvPipelineBuilder', () => {

    const builder = new QsvPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -197,11 +203,6 @@ describe('QsvPipelineBuilder', () => {
        VaapiEntrypoint.Encode,
      ),
    ]);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
      VideoStream.create({
@@ -212,6 +213,7 @@ describe('QsvPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: ColorFormat.unknown,
      }),
    );

@@ -234,7 +236,7 @@ describe('QsvPipelineBuilder', () => {

    const builder = new QsvPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -279,11 +281,6 @@ describe('QsvPipelineBuilder', () => {
        VaapiEntrypoint.Encode,
      ),
    ]);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
      VideoStream.create({
@@ -294,6 +291,7 @@ describe('QsvPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: ColorFormat.unknown,
      }),
    );

@@ -316,7 +314,7 @@ describe('QsvPipelineBuilder', () => {

    const builder = new QsvPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -353,4 +351,231 @@ describe('QsvPipelineBuilder', () => {

    console.log(out.getCommandArgs().join(' '));
  });

  describe('tonemapping', () => {
    const ffmpegVersion = {
      versionString: 'n7.0.2-15-g0458a86656-20240904',
      majorVersion: 7,
      minorVersion: 0,
      patchVersion: 2,
      isUnknown: false,
    } as const;

    const hdrColorFormat = new ColorFormat({
      colorRange: ColorRanges.Tv,
      colorSpace: ColorSpaces.Bt2020nc,
      colorTransfer: ColorTransferFormats.Smpte2084,
      colorPrimaries: ColorPrimaries.Bt2020,
    });

    const fullCapabilities = new VaapiHardwareCapabilities([
      new VaapiProfileEntrypoint(
        VaapiProfiles.H264Main,
        VaapiEntrypoint.Decode,
      ),
      new VaapiProfileEntrypoint(
        VaapiProfiles.H264Main,
        VaapiEntrypoint.Encode,
      ),
      new VaapiProfileEntrypoint(
        VaapiProfiles.HevcMain10,
        VaapiEntrypoint.Decode,
      ),
      new VaapiProfileEntrypoint(
        VaapiProfiles.HevcMain10,
        VaapiEntrypoint.Encode,
      ),
    ]);

    afterEach(() => {
      vi.unstubAllEnvs();
    });

    function makeH264VideoInput() {
      return VideoInputSource.withStream(
        new FileStreamSource('/path/to/video.mkv'),
        VideoStream.create({
          codec: 'h264',
          profile: 'main',
          displayAspectRatio: '16:9',
          frameSize: FrameSize.FHD,
          index: 0,
          pixelFormat: new PixelFormatYuv420P(),
          providedSampleAspectRatio: null,
          colorFormat: ColorFormat.unknown,
        }),
      );
    }

    function makeHevc10BitVideoInput() {
      return VideoInputSource.withStream(
        new FileStreamSource('/path/to/hdr-video.mkv'),
        VideoStream.create({
          codec: 'hevc',
          displayAspectRatio: '16:9',
          frameSize: FrameSize.FHD,
          index: 0,
          pixelFormat: new PixelFormatYuv420P10Le(),
          providedSampleAspectRatio: null,
          colorFormat: hdrColorFormat,
          profile: 'main 10',
        }),
      );
    }

    function makeDesiredFrameState(video: VideoInputSource) {
      return new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
      });
    }

    test('does not apply tonemap when TUNARR_TONEMAP_ENABLED is not set', () => {
      const video = makeH264VideoInput();

      const builder = new QsvPipelineBuilder(
        fullCapabilities,
        EmptyFfmpegCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        makeDesiredFrameState(video),
        DefaultPipelineOptions,
      );

      const tonemapFilter = out
        .getComplexFilter()
        ?.filterChain.videoFilterSteps.find(
          (step) => step instanceof TonemapQsvFilter,
        );

      expect(tonemapFilter).toBeUndefined();
    });

    test('applies TonemapQsvFilter when tonemap is enabled and frame is already on hardware', () => {
      vi.stubEnv(TUNARR_ENV_VARS.TONEMAP_ENABLED, 'true');

      const video = makeHevc10BitVideoInput();

      const builder = new QsvPipelineBuilder(
        fullCapabilities,
        EmptyFfmpegCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        makeDesiredFrameState(video),
        DefaultPipelineOptions,
      );

      const videoFilterSteps =
        out.getComplexFilter()?.filterChain.videoFilterSteps ?? [];

      const tonemapIdx = videoFilterSteps.findIndex(
        (step) => step instanceof TonemapQsvFilter,
      );
      expect(tonemapIdx).toBeGreaterThan(-1);

      // No hwupload should appear before the tonemap filter since the frame is
      // already on hardware from the QSV decoder
      const hwUploadBeforeTonemap = videoFilterSteps
        .slice(0, tonemapIdx)
        .some((step) => step instanceof HardwareUploadQsvFilter);
      expect(hwUploadBeforeTonemap, out.getCommandArgs().join(' ')).toBe(false);
    });

    test('uploads to hardware before applying TonemapQsvFilter when frame is on software', () => {
      vi.stubEnv(TUNARR_ENV_VARS.TONEMAP_ENABLED, 'true');

      // Hardware decoding is disabled below, so the frame is still on
      // software when setTonemap is called
      const video = makeHevc10BitVideoInput();

      const builder = new QsvPipelineBuilder(
        fullCapabilities,
        EmptyFfmpegCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        makeDesiredFrameState(video),
        { ...DefaultPipelineOptions, disableHardwareDecoding: true },
      );

      const videoFilterSteps =
        out.getComplexFilter()?.filterChain.videoFilterSteps ?? [];

      const tonemapIdx = videoFilterSteps.findIndex(
        (step) => step instanceof TonemapQsvFilter,
      );
      expect(tonemapIdx).toBeGreaterThan(-1);

      const hwUploadIdx = videoFilterSteps.findIndex(
        (step) => step instanceof HardwareUploadQsvFilter,
      );
      expect(hwUploadIdx).toBeGreaterThan(-1);

      // hwupload must precede tonemap
      expect(hwUploadIdx).toBeLessThan(tonemapIdx);
      expect(
        (videoFilterSteps[hwUploadIdx] as HardwareUploadQsvFilter).filter,
      ).toBe('hwupload=extra_hw_frames=64');
    });

    test('downloads hardware frame and falls back to software tonemap when disableHardwareFilters is true', () => {
      vi.stubEnv(TUNARR_ENV_VARS.TONEMAP_ENABLED, 'true');

      // HEVC 10-bit HDR input: hardware decode still runs
      // (disableHardwareFilters does not affect decoding), so the frame is
      // on hardware when setTonemap is called
      const video = makeHevc10BitVideoInput();

      const builder = new QsvPipelineBuilder(
        fullCapabilities,
        EmptyFfmpegCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        makeDesiredFrameState(video),
        { ...DefaultPipelineOptions, disableHardwareFilters: true },
      );

      const videoFilterSteps =
        out.getComplexFilter()?.filterChain.videoFilterSteps ?? [];

      // Hardware tonemap should NOT be applied when hardware filters are disabled
      const tonemapFilter = videoFilterSteps.find(
        (step) => step instanceof TonemapQsvFilter,
      );
      expect(tonemapFilter).toBeUndefined();

      const softwareTonemapFilter = videoFilterSteps.find(
        (step) => step instanceof TonemapFilter,
      );
      expect(softwareTonemapFilter).toBeDefined();
    });
  });
});
@@ -20,9 +20,9 @@ import { WatermarkScaleFilter } from '@/ffmpeg/builder/filter/watermark/Watermar
import {
  PixelFormatNv12,
  PixelFormatP010,
  PixelFormats,
  PixelFormatYuv420P10Le,
  PixelFormatYuva420P,
  PixelFormats,
} from '@/ffmpeg/builder/format/PixelFormat.js';
import type { AudioInputSource } from '@/ffmpeg/builder/input/AudioInputSource.js';
import type { ConcatInputSource } from '@/ffmpeg/builder/input/ConcatInputSource.js';
@@ -39,6 +39,7 @@ import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
import type { Nullable } from '@/types/util.js';
import { isDefined, isNonEmptyString } from '@/util/index.js';
import { every, head, inRange, isNull, some } from 'lodash-es';
import { getBooleanEnvVar, TUNARR_ENV_VARS } from '../../../../util/env.ts';
import { H264QsvEncoder } from '../../encoder/qsv/H264QsvEncoder.ts';
import { HevcQsvEncoder } from '../../encoder/qsv/HevcQsvEncoder.ts';
import { Mpeg2QsvEncoder } from '../../encoder/qsv/Mpeg2QsvEncoder.ts';
@@ -47,6 +48,8 @@ import { ResetPtsFilter } from '../../filter/ResetPtsFilter.ts';
import { SetFpsFilter } from '../../filter/SetFpsFilter.ts';
import { SubtitleFilter } from '../../filter/SubtitleFilter.ts';
import { SubtitleOverlayFilter } from '../../filter/SubtitleOverlayFilter.ts';
import { HardwareUploadQsvFilter } from '../../filter/qsv/HardwareUploadQsvFilter.ts';
import { TonemapQsvFilter } from '../../filter/qsv/TonemapQsvFilter.ts';
import type { SubtitlesInputSource } from '../../input/SubtitlesInputSource.ts';
import { CopyTimestampInputOption } from '../../options/input/CopyTimestampInputOption.ts';
import { FrameRateOutputOption } from '../../options/output/FrameRateOutputOption.ts';
@@ -94,8 +97,7 @@ export class QsvPipelineBuilder extends SoftwarePipelineBuilder {

    if (
      canDecode &&
      (this.context.videoStream.codec === VideoFormats.H264 ||
        this.context.videoStream.codec === VideoFormats.Hevc) &&
      this.context.videoStream.codec === VideoFormats.H264 &&
      this.context.videoStream.pixelFormat?.bitDepth === 10
    ) {
      canDecode = false;
@@ -191,6 +193,7 @@ export class QsvPipelineBuilder extends SoftwarePipelineBuilder {

    currentState = this.setDeinterlace(currentState);
    currentState = this.setScale(currentState);
    currentState = this.setTonemap(currentState);
    currentState = this.setPad(currentState);
    this.setStillImageLoop();

@@ -248,10 +251,12 @@ export class QsvPipelineBuilder extends SoftwarePipelineBuilder {
  protected setDeinterlace(currentState: FrameState): FrameState {
    let nextState = currentState;
    if (this.context.shouldDeinterlace) {
      const filter =
        currentState.frameDataLocation === FrameDataLocation.Software
          ? new DeinterlaceFilter(this.ffmpegState, currentState)
          : new DeinterlaceQsvFilter(currentState);
      let filter: FilterOption;
      if (this.context.pipelineOptions.disableHardwareFilters) {
        filter = new DeinterlaceFilter(this.ffmpegState, currentState);
      } else {
        filter = new DeinterlaceQsvFilter(currentState);
      }
      nextState = filter.nextState(nextState);
      this.videoInputSource.filterSteps.push(filter);
    }
@@ -263,7 +268,7 @@ export class QsvPipelineBuilder extends SoftwarePipelineBuilder {
      return currentState;
    }

    const { videoStream, ffmpegState, desiredState } = this.context;
    const { ffmpegState, desiredState } = this.context;
    let nextState = currentState;
    const needsScale = !currentState.scaledSize.equals(desiredState.scaledSize);
    const noHardware =
@@ -285,11 +290,7 @@ export class QsvPipelineBuilder extends SoftwarePipelineBuilder {
        desiredState.paddedSize,
      );
    } else {
      scaleFilter = new ScaleQsvFilter(
        videoStream,
        nextState,
        desiredState.scaledSize,
      );
      scaleFilter = new ScaleQsvFilter(nextState, desiredState.scaledSize);
    }

    if (isNonEmptyString(scaleFilter.filter)) {
@@ -538,6 +539,38 @@ export class QsvPipelineBuilder extends SoftwarePipelineBuilder {
    return currentState;
  }

  protected setTonemap(currentState: FrameState): FrameState {
    if (!this.context.videoStream?.isHdr()) {
      return currentState;
    }

    if (!getBooleanEnvVar(TUNARR_ENV_VARS.TONEMAP_ENABLED, false)) {
      return currentState;
    }

    if (this.context.pipelineOptions.disableHardwareFilters) {
      if (currentState.frameDataLocation === FrameDataLocation.Hardware) {
        const hwDownload = new HardwareDownloadFilter(currentState);
        currentState = hwDownload.nextState(currentState);
        this.videoInputSource.addFilter(hwDownload);
      }
      // TODO: refactor this into a "strategy"
      return super.setTonemap(currentState);
    }

    if (currentState.frameDataLocation === FrameDataLocation.Software) {
      const hwUpload = new HardwareUploadQsvFilter(64);
      currentState = hwUpload.nextState(currentState);
      this.videoInputSource.addFilter(hwUpload);
    }

    const tonemap = new TonemapQsvFilter();
    currentState = tonemap.nextState(currentState);
    this.videoInputSource.addFilter(tonemap);

    return currentState;
  }

  protected getIsIntelQsvOrVaapi(): boolean {
    return (
      this.ffmpegState.decoderHwAccelMode === HardwareAccelerationMode.Qsv ||
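The `setTonemap` changes above branch on the frame's current location and the `disableHardwareFilters` option: disabled hardware filters force a download plus software tonemap, while the hardware path uploads first only when the frame is still in system memory. A rough standalone sketch of that decision (simplified types and illustrative filter strings, not Tunarr's real classes):

```typescript
// Simplified model of the setTonemap frame-location handling above.
type FrameDataLocation = 'software' | 'hardware';

interface TonemapPlan {
  steps: string[];
  location: FrameDataLocation;
}

function planTonemap(
  location: FrameDataLocation,
  disableHardwareFilters: boolean,
): TonemapPlan {
  const steps: string[] = [];
  if (disableHardwareFilters) {
    // Hardware filters disabled: pull the frame back to system memory
    // (if needed) and fall back to the software tonemap path.
    if (location === 'hardware') {
      steps.push('hwdownload');
    }
    steps.push('tonemap (software)');
    return { steps, location: 'software' };
  }
  // Hardware path: upload first when the frame is still in software.
  if (location === 'software') {
    steps.push('hwupload=extra_hw_frames=64');
  }
  steps.push('tonemap_qsv');
  return { steps, location: 'hardware' };
}
```

This mirrors why the tests above assert that `hwupload` precedes `TonemapQsvFilter` for software frames, and that no QSV tonemap appears when hardware filters are disabled.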
@@ -1,3 +1,4 @@
import { TONEMAP_ENABLED, TUNARR_ENV_VARS } from '@/util/env.js';
import { FileStreamSource } from '../../../../stream/types.ts';
import { FfmpegCapabilities } from '../../capabilities/FfmpegCapabilities.ts';
import {
@@ -6,9 +7,16 @@ import {
  VaapiProfileEntrypoint,
  VaapiProfiles,
} from '../../capabilities/VaapiHardwareCapabilities.ts';
import {
  ColorPrimaries,
  ColorRanges,
  ColorSpaces,
  ColorTransferFormats,
} from '../../constants.ts';
import {
  PixelFormatRgba,
  PixelFormatYuv420P,
  PixelFormatYuv420P10Le,
} from '../../format/PixelFormat.ts';
import { AudioInputSource } from '../../input/AudioInputSource.ts';
import { SubtitlesInputSource } from '../../input/SubtitlesInputSource.ts';
@@ -21,6 +29,7 @@ import {
  SubtitleMethods,
  VideoStream,
} from '../../MediaStream.ts';
import { KnownFfmpegFilters } from '../../options/KnownFfmpegOptions.ts';
import { AudioState } from '../../state/AudioState.ts';
import {
  DefaultPipelineOptions,
@@ -29,6 +38,7 @@ import {
import { FrameState } from '../../state/FrameState.ts';
import { FrameSize } from '../../types.ts';
import { VaapiPipelineBuilder } from './VaapiPipelineBuilder.ts';
import { ColorFormat } from '@/ffmpeg/builder/format/ColorFormat.js';

describe('VaapiPipelineBuilder', () => {
  test('should work', () => {
@@ -37,6 +47,7 @@ describe('VaapiPipelineBuilder', () => {
      new Set(),
      new Map(),
      new Set(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
@@ -47,6 +58,7 @@ describe('VaapiPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: null,
      }),
    );

@@ -96,7 +108,7 @@ describe('VaapiPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
      }),
@@ -121,6 +133,7 @@ describe('VaapiPipelineBuilder', () => {
      new Set(),
      new Map(),
      new Set(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
@@ -131,6 +144,7 @@ describe('VaapiPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: null,
      }),
    );

@@ -180,7 +194,7 @@ describe('VaapiPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        videoFormat: 'h264',
@@ -206,6 +220,7 @@ describe('VaapiPipelineBuilder', () => {
      new Set(),
      new Map(),
      new Set(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
@@ -217,6 +232,7 @@ describe('VaapiPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: null,
      }),
    );

@@ -267,7 +283,7 @@ describe('VaapiPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        videoFormat: 'h264',
@@ -293,6 +309,7 @@ describe('VaapiPipelineBuilder', () => {
      new Set(),
      new Map(),
      new Set(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
@@ -304,6 +321,7 @@ describe('VaapiPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: null,
      }),
    );

@@ -353,7 +371,7 @@ describe('VaapiPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        videoFormat: 'h264',
@@ -379,6 +397,7 @@ describe('VaapiPipelineBuilder', () => {
      new Set(),
      new Map(),
      new Set(),
      new Set(),
    );

    const video = VideoInputSource.withStream(
@@ -429,7 +448,7 @@ describe('VaapiPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        videoFormat: 'h264',
@@ -440,3 +459,616 @@ describe('VaapiPipelineBuilder', () => {
    console.log(out.getCommandArgs().join(' '));
  });
});

describe('VaapiPipelineBuilder pad', () => {
  const originalEnv = process.env;

  beforeEach(() => {
    process.env = { ...originalEnv };
  });

  afterEach(() => {
    process.env = originalEnv;
  });

  const fakeVersion = {
    versionString: 'n7.0.2',
    majorVersion: 7,
    minorVersion: 0,
    patchVersion: 2,
    isUnknown: false,
  };

  // 4:3 video that needs pillarboxing to fit in 16:9 FHD:
  // squarePixelFrameSize(FHD) = 1440x1080, paddedSize = 1920x1080
  function create43VideoStream(): VideoStream {
    return VideoStream.create({
      index: 0,
      codec: 'h264',
      profile: 'main',
      pixelFormat: new PixelFormatYuv420P(),
      frameSize: FrameSize.withDimensions(640, 480),
      displayAspectRatio: '4:3',
      providedSampleAspectRatio: null,
      colorFormat: null,
    });
  }

  function buildWithPad(opts: {
    videoStream: VideoStream;
    binaryCapabilities?: FfmpegCapabilities;
    disableHardwareDecoding?: boolean;
    disableHardwareEncoding?: boolean;
  }) {
    const capabilities = new VaapiHardwareCapabilities([
      new VaapiProfileEntrypoint(
        VaapiProfiles.H264Main,
        VaapiEntrypoint.Decode,
      ),
      new VaapiProfileEntrypoint(
        VaapiProfiles.H264Main,
        VaapiEntrypoint.Encode,
      ),
    ]);

    const binaryCapabilities =
      opts.binaryCapabilities ??
      new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set([KnownFfmpegFilters.PadVaapi]),
        new Set(),
      );

    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
      opts.videoStream,
    );

    const builder = new VaapiPipelineBuilder(
      capabilities,
      binaryCapabilities,
      video,
      null,
      null,
      null,
      null,
    );

    const state = FfmpegState.create({ version: fakeVersion });
    const videoStream = video.streams[0]!;

    return builder.build(
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: videoStream.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        videoFormat: 'h264',
      }),
      {
        ...DefaultPipelineOptions,
        vaapiDevice: '/dev/dri/renderD128',
        disableHardwareDecoding: opts.disableHardwareDecoding ?? false,
        disableHardwareEncoding: opts.disableHardwareEncoding ?? false,
      },
    );
  }

  test('uses pad_vaapi when capability is available and content is SDR', () => {
    const pipeline = buildWithPad({ videoStream: create43VideoStream() });

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).toContain('pad_vaapi=w=1920:h=1080');
    expect(args).not.toContain('pad=1920');
  });

  test('falls back to software pad when pad_vaapi capability is not available', () => {
    const pipeline = buildWithPad({
      videoStream: create43VideoStream(),
      binaryCapabilities: new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set(),
        new Set(),
      ),
    });

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).not.toContain('pad_vaapi');
    expect(args).toContain('pad=1920:1080');
  });

  test('uses software pad for HDR content even when pad_vaapi capability is available', () => {
    const hdrStream = VideoStream.create({
      index: 0,
      codec: 'h264',
      profile: 'main',
      pixelFormat: new PixelFormatYuv420P(),
      frameSize: FrameSize.withDimensions(640, 480),
      displayAspectRatio: '4:3',
      providedSampleAspectRatio: null,
      colorFormat: new ColorFormat({
        colorRange: ColorRanges.Tv,
        colorSpace: ColorSpaces.Bt2020nc,
        colorPrimaries: ColorPrimaries.Bt2020,
        colorTransfer: ColorTransferFormats.Smpte2084,
      }),
    });

    const pipeline = buildWithPad({ videoStream: hdrStream });

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).not.toContain('pad_vaapi');
    expect(args).toContain('pad=1920:1080');
  });

  test('pad_vaapi includes hwupload when frame data is in software', () => {
    const pipeline = buildWithPad({
      videoStream: create43VideoStream(),
      disableHardwareDecoding: true,
    });

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).toContain('pad_vaapi');
    const hwuploadIndex = args.indexOf('hwupload');
    const padVaapiIndex = args.indexOf('pad_vaapi');
    expect(hwuploadIndex).toBeGreaterThan(-1);
    expect(hwuploadIndex).toBeLessThan(padVaapiIndex);
  });

  test('falls back to software pad when TUNARR_DISABLE_VAAPI_PAD=true, even when pad_vaapi is available', () => {
    process.env[TUNARR_ENV_VARS.DISABLE_VAAPI_PAD] = 'true';

    const pipeline = buildWithPad({ videoStream: create43VideoStream() });

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).not.toContain('pad_vaapi');
    expect(args).toContain('pad=1920:1080');
  });

  test('falls back to software pad when TUNARR_DISABLE_VAAPI_PAD=true and only pad_opencl is available', () => {
    process.env[TUNARR_ENV_VARS.DISABLE_VAAPI_PAD] = 'true';

    const pipeline = buildWithPad({
      videoStream: create43VideoStream(),
      binaryCapabilities: new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set([KnownFfmpegFilters.PadOpencl]),
        new Set(),
      ),
    });

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).not.toContain('pad_opencl');
    expect(args).toContain('pad=1920:1080');
  });

  test('uses pad_vaapi when TUNARR_DISABLE_VAAPI_PAD is not set', () => {
    delete process.env[TUNARR_ENV_VARS.DISABLE_VAAPI_PAD];

    const pipeline = buildWithPad({ videoStream: create43VideoStream() });

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).toContain('pad_vaapi=w=1920:h=1080');
  });

  test('uses pad_vaapi when TUNARR_DISABLE_VAAPI_PAD=false', () => {
    process.env[TUNARR_ENV_VARS.DISABLE_VAAPI_PAD] = 'false';

    const pipeline = buildWithPad({ videoStream: create43VideoStream() });

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).toContain('pad_vaapi=w=1920:h=1080');
    expect(args).not.toContain('pad=1920:1080');
  });
});

describe('VaapiPipelineBuilder tonemap', () => {
  const originalEnv = process.env;
  const fakeVersion = {
    versionString: 'n7.0.2',
    majorVersion: 7,
    minorVersion: 0,
    patchVersion: 2,
    isUnknown: false,
  };

  beforeEach(() => {
    process.env = { ...originalEnv };
  });

  afterEach(() => {
    process.env = originalEnv;
  });

  function createHdrVideoStream(
    colorFormat: ColorFormat = new ColorFormat({
      colorRange: ColorRanges.Tv,
      colorSpace: ColorSpaces.Bt2020nc,
      colorPrimaries: ColorPrimaries.Bt2020,
      colorTransfer: ColorTransferFormats.Smpte2084,
    }),
  ): VideoStream {
    return VideoStream.create({
      index: 0,
      codec: 'hevc',
      profile: 'main 10',
      pixelFormat: new PixelFormatYuv420P10Le(),
      frameSize: FrameSize.FourK,
      displayAspectRatio: '16:9',
      providedSampleAspectRatio: '1:1',
      colorFormat: colorFormat,
    });
  }

  function buildWithTonemap(opts: {
    videoStream: VideoStream;
    binaryCapabilities?: FfmpegCapabilities;
    disableHardwareFilters?: boolean;
  }) {
    const capabilities = new VaapiHardwareCapabilities([
      new VaapiProfileEntrypoint(
        VaapiProfiles.HevcMain10,
        VaapiEntrypoint.Decode,
      ),
      new VaapiProfileEntrypoint(
        VaapiProfiles.HevcMain,
        VaapiEntrypoint.Encode,
      ),
    ]);

    const binaryCapabilities =
      opts.binaryCapabilities ??
      new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set([KnownFfmpegFilters.TonemapVaapi]),
        new Set(),
      );

    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
      opts.videoStream,
    );

    const builder = new VaapiPipelineBuilder(
      capabilities,
      binaryCapabilities,
      video,
      null,
      null,
      null,
      null,
    );

    const state = FfmpegState.create({ version: fakeVersion });

    const pipeline = builder.build(
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: FrameSize.FHD,
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        videoFormat: 'hevc',
      }),
      {
        ...DefaultPipelineOptions,
        vaapiDevice: '/dev/dri/renderD128',
        disableHardwareFilters: opts.disableHardwareFilters ?? false,
      },
    );

    return pipeline;
  }

  function hasTonemapFilter(pipeline: ReturnType<typeof buildWithTonemap>) {
    const args = pipeline.getCommandArgs().join(' ');
    return args.includes('tonemap_vaapi');
  }

  function hasOpenclTonemapFilter(
    pipeline: ReturnType<typeof buildWithTonemap>,
  ) {
    const args = pipeline.getCommandArgs().join(' ');
    return args.includes('tonemap_opencl');
  }

  function hasSoftwareTonemapFilter(
    pipeline: ReturnType<typeof buildWithTonemap>,
  ) {
    const args = pipeline.getCommandArgs().join(' ');
    return args.includes('zscale') && args.includes('tonemap=tonemap=hable');
  }

  test('applies tonemap filter for HDR10 (smpte2084) content', () => {
    process.env[TONEMAP_ENABLED] = 'true';

    const pipeline = buildWithTonemap({
      videoStream: createHdrVideoStream(
        new ColorFormat({
          colorRange: ColorRanges.Tv,
          colorSpace: ColorSpaces.Bt2020nc,
          colorPrimaries: ColorPrimaries.Bt2020,
          colorTransfer: ColorTransferFormats.Smpte2084,
        }),
      ),
    });

    expect(hasTonemapFilter(pipeline)).to.eq(true);

    const args = pipeline.getCommandArgs().join(' ');
    expect(args).toContain('tonemap_vaapi=format=nv12:t=bt709:m=bt709:p=bt709');
  });

  test('applies tonemap filter for HLG (arib-std-b67) content', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(
|
||||
new ColorFormat({
|
||||
colorRange: ColorRanges.Tv,
|
||||
colorSpace: ColorSpaces.Bt2020nc,
|
||||
colorPrimaries: ColorPrimaries.Bt2020,
|
||||
colorTransfer: ColorTransferFormats.AribStdB67,
|
||||
}),
|
||||
),
|
||||
});
|
||||
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(true);
|
||||
});
|
||||
|
||||
test('skips tonemap when TONEMAP_ENABLED is false', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'false';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(),
|
||||
});
|
||||
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(false);
|
||||
});
|
||||
|
||||
test('skips tonemap when content is SDR', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const sdrStream = VideoStream.create({
|
||||
index: 0,
|
||||
codec: 'hevc',
|
||||
profile: 'main 10',
|
||||
pixelFormat: new PixelFormatYuv420P10Le(),
|
||||
frameSize: FrameSize.FHD,
|
||||
displayAspectRatio: '16:9',
|
||||
providedSampleAspectRatio: '1:1',
|
||||
colorFormat: new ColorFormat({
|
||||
colorRange: ColorRanges.Tv,
|
||||
colorSpace: ColorSpaces.Bt709,
|
||||
colorPrimaries: ColorPrimaries.Bt709,
|
||||
colorTransfer: ColorTransferFormats.Bt709,
|
||||
}),
|
||||
});
|
||||
|
||||
const pipeline = buildWithTonemap({ videoStream: sdrStream });
|
||||
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(false);
|
||||
});
|
||||
|
||||
test('falls back to software tonemap when neither tonemap_vaapi nor tonemap_opencl is available', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(),
|
||||
binaryCapabilities: new FfmpegCapabilities(
|
||||
new Set(),
|
||||
new Map(),
|
||||
new Set(),
|
||||
new Set(),
|
||||
),
|
||||
});
|
||||
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasOpenclTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasSoftwareTonemapFilter(pipeline)).to.eq(true);
|
||||
});
|
||||
|
||||
test('skips hardware tonemap but applies software tonemap when hardware filters are disabled', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(),
|
||||
disableHardwareFilters: true,
|
||||
});
|
||||
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasOpenclTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasSoftwareTonemapFilter(pipeline)).to.eq(true);
|
||||
});
|
||||
|
||||
test('tonemap filter appears before scale in the filter chain', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(),
|
||||
});
|
||||
|
||||
const args = pipeline.getCommandArgs().join(' ');
|
||||
const tonemapIndex = args.indexOf('tonemap_vaapi');
|
||||
const scaleIndex = args.indexOf('scale_vaapi');
|
||||
|
||||
expect(tonemapIndex).toBeGreaterThan(-1);
|
||||
expect(scaleIndex).toBeGreaterThan(-1);
|
||||
expect(tonemapIndex).toBeLessThan(scaleIndex);
|
||||
});
|
||||
|
||||
test('falls back to tonemap_opencl when tonemap_vaapi is unavailable', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(),
|
||||
binaryCapabilities: new FfmpegCapabilities(
|
||||
new Set(),
|
||||
new Map(),
|
||||
new Set([KnownFfmpegFilters.TonemapOpencl]),
|
||||
new Set(),
|
||||
),
|
||||
});
|
||||
|
||||
const args = pipeline.getCommandArgs().join(' ');
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasOpenclTonemapFilter(pipeline)).to.eq(true);
|
||||
expect(args).toContain('tonemap_opencl=tonemap=hable');
|
||||
expect(args).toContain('hwmap=derive_device=opencl');
|
||||
expect(args).toContain('hwmap=derive_device=vaapi:reverse=1');
|
||||
});
|
||||
|
||||
test('prefers tonemap_vaapi over tonemap_opencl when both are available', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(),
|
||||
binaryCapabilities: new FfmpegCapabilities(
|
||||
new Set(),
|
||||
new Map(),
|
||||
new Set([
|
||||
KnownFfmpegFilters.TonemapVaapi,
|
||||
KnownFfmpegFilters.TonemapOpencl,
|
||||
]),
|
||||
new Set(),
|
||||
),
|
||||
});
|
||||
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(true);
|
||||
expect(hasOpenclTonemapFilter(pipeline)).to.eq(false);
|
||||
});
|
||||
|
||||
test('opencl tonemap filter appears before scale in the filter chain', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(),
|
||||
binaryCapabilities: new FfmpegCapabilities(
|
||||
new Set(),
|
||||
new Map(),
|
||||
new Set([KnownFfmpegFilters.TonemapOpencl]),
|
||||
new Set(),
|
||||
),
|
||||
});
|
||||
|
||||
const args = pipeline.getCommandArgs().join(' ');
|
||||
const tonemapIndex = args.indexOf('tonemap_opencl');
|
||||
const scaleIndex = args.indexOf('scale_vaapi');
|
||||
|
||||
expect(tonemapIndex).toBeGreaterThan(-1);
|
||||
expect(scaleIndex).toBeGreaterThan(-1);
|
||||
expect(tonemapIndex).toBeLessThan(scaleIndex);
|
||||
});
|
||||
|
||||
test('skips opencl tonemap when hardware filters are disabled but applies software tonemap', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: createHdrVideoStream(),
|
||||
binaryCapabilities: new FfmpegCapabilities(
|
||||
new Set(),
|
||||
new Map(),
|
||||
new Set([KnownFfmpegFilters.TonemapOpencl]),
|
||||
new Set(),
|
||||
),
|
||||
disableHardwareFilters: true,
|
||||
});
|
||||
|
||||
expect(hasOpenclTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasSoftwareTonemapFilter(pipeline)).to.eq(true);
|
||||
});
|
||||
|
||||
test('applies tonemap_vaapi for Dolby Vision content (dvhe codec)', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const dvStream = VideoStream.create({
|
||||
index: 0,
|
||||
codec: 'dvhe',
|
||||
profile: 'dvhe.08.09',
|
||||
pixelFormat: new PixelFormatYuv420P10Le(),
|
||||
frameSize: FrameSize.FourK,
|
||||
displayAspectRatio: '16:9',
|
||||
providedSampleAspectRatio: '1:1',
|
||||
colorFormat: new ColorFormat({
|
||||
colorRange: ColorRanges.Tv,
|
||||
colorSpace: ColorSpaces.Bt2020nc,
|
||||
colorPrimaries: ColorPrimaries.Bt2020,
|
||||
colorTransfer: ColorTransferFormats.Smpte2084,
|
||||
}),
|
||||
});
|
||||
|
||||
const pipeline = buildWithTonemap({ videoStream: dvStream });
|
||||
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(true);
|
||||
expect(hasOpenclTonemapFilter(pipeline)).to.eq(false);
|
||||
});
|
||||
|
||||
test('applies software tonemap for Dolby Vision (dvhe codec) when hardware filters are disabled', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const dvStream = VideoStream.create({
|
||||
index: 0,
|
||||
codec: 'dvhe',
|
||||
profile: 'dvhe.08.09',
|
||||
pixelFormat: new PixelFormatYuv420P10Le(),
|
||||
frameSize: FrameSize.FourK,
|
||||
displayAspectRatio: '16:9',
|
||||
providedSampleAspectRatio: '1:1',
|
||||
colorFormat: new ColorFormat({
|
||||
colorRange: ColorRanges.Tv,
|
||||
colorSpace: ColorSpaces.Bt2020nc,
|
||||
colorPrimaries: ColorPrimaries.Bt2020,
|
||||
colorTransfer: ColorTransferFormats.Smpte2084,
|
||||
}),
|
||||
});
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: dvStream,
|
||||
disableHardwareFilters: true,
|
||||
});
|
||||
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasOpenclTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasSoftwareTonemapFilter(pipeline)).to.eq(true);
|
||||
});
|
||||
|
||||
test('applies software tonemap for Dolby Vision with profile string (hevc codec)', () => {
|
||||
process.env[TONEMAP_ENABLED] = 'true';
|
||||
|
||||
const dvStream = VideoStream.create({
|
||||
index: 0,
|
||||
codec: 'hevc',
|
||||
profile: 'dolby vision / hevc main 10',
|
||||
pixelFormat: new PixelFormatYuv420P10Le(),
|
||||
frameSize: FrameSize.FourK,
|
||||
displayAspectRatio: '16:9',
|
||||
providedSampleAspectRatio: '1:1',
|
||||
colorFormat: new ColorFormat({
|
||||
colorRange: ColorRanges.Tv,
|
||||
colorSpace: ColorSpaces.Bt2020nc,
|
||||
colorPrimaries: ColorPrimaries.Bt2020,
|
||||
colorTransfer: ColorTransferFormats.Smpte2084,
|
||||
}),
|
||||
});
|
||||
|
||||
const pipeline = buildWithTonemap({
|
||||
videoStream: dvStream,
|
||||
binaryCapabilities: new FfmpegCapabilities(
|
||||
new Set(),
|
||||
new Map(),
|
||||
new Set(),
|
||||
new Set(),
|
||||
),
|
||||
});
|
||||
|
||||
// No hardware tonemap filter, falls back to software
|
||||
expect(hasTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasOpenclTonemapFilter(pipeline)).to.eq(false);
|
||||
expect(hasSoftwareTonemapFilter(pipeline)).to.eq(true);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -7,12 +7,15 @@ import { VaapiDecoder } from '@/ffmpeg/builder/decoder/vaapi/VaapiDecoder.js';
import { DeinterlaceFilter } from '@/ffmpeg/builder/filter/DeinterlaceFilter.js';
import type { FilterOption } from '@/ffmpeg/builder/filter/FilterOption.js';
import { HardwareDownloadFilter } from '@/ffmpeg/builder/filter/HardwareDownloadFilter.js';
import { isHdrContent } from '@/ffmpeg/builder/filter/HdrDetection.js';
import { TonemapOpenclFilter } from '@/ffmpeg/builder/filter/opencl/TonemapOpenclFilter.js';
import { PadFilter } from '@/ffmpeg/builder/filter/PadFilter.js';
import { PixelFormatFilter } from '@/ffmpeg/builder/filter/PixelFormatFilter.js';
import { ScaleFilter } from '@/ffmpeg/builder/filter/ScaleFilter.js';
import { DeinterlaceVaapiFilter } from '@/ffmpeg/builder/filter/vaapi/DeinterlaceVaapiFilter.js';
import { HardwareUploadVaapiFilter } from '@/ffmpeg/builder/filter/vaapi/HardwareUploadVaapiFilter.js';
import { ScaleVaapiFilter } from '@/ffmpeg/builder/filter/vaapi/ScaleVaapiFilter.js';
import { TonemapVaapiFilter } from '@/ffmpeg/builder/filter/vaapi/TonemapVaapiFilter.js';
import { VaapiFormatFilter } from '@/ffmpeg/builder/filter/vaapi/VaapiFormatFilter.js';
import { OverlayWatermarkFilter } from '@/ffmpeg/builder/filter/watermark/OverlayWatermarkFilter.js';
import { WatermarkOpacityFilter } from '@/ffmpeg/builder/filter/watermark/WatermarkOpacityFilter.js';
@@ -26,12 +29,18 @@ import { ExtraHardwareFramesOption } from '@/ffmpeg/builder/options/hardwareAcce
import { VaapiHardwareAccelerationOption } from '@/ffmpeg/builder/options/hardwareAcceleration/VaapiOptions.js';
import { DoNotIgnoreLoopInputOption } from '@/ffmpeg/builder/options/input/DoNotIgnoreLoopInputOption.js';
import { InfiniteLoopInputOption } from '@/ffmpeg/builder/options/input/InfiniteLoopInputOption.js';
import { KnownFfmpegFilters } from '@/ffmpeg/builder/options/KnownFfmpegOptions.js';
import { isVideoPipelineContext } from '@/ffmpeg/builder/pipeline/BasePipelineBuilder.js';
import { SoftwarePipelineBuilder } from '@/ffmpeg/builder/pipeline/software/SoftwarePipelineBuilder.js';
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
import type { Nullable } from '@/types/util.js';
import type { Maybe, Nullable } from '@/types/util.js';
import {
  TONEMAP_ENABLED,
  TUNARR_ENV_VARS,
  getBooleanEnvVar,
} from '@/util/env.js';
import { isDefined, isNonEmptyString } from '@/util/index.js';
import { every, head, inRange, isUndefined } from 'lodash-es';
import { every, head, inRange } from 'lodash-es';
import { P, match } from 'ts-pattern';
import {
  H264VaapiEncoder,
@@ -39,8 +48,10 @@ import {
  Mpeg2VaapiEncoder,
} from '../../encoder/vaapi/VaapiEncoders.ts';
import { ImageScaleFilter } from '../../filter/ImageScaleFilter.ts';
import { PadOpenclFilter } from '../../filter/opencl/PadOpenclFilter.ts';
import { SubtitleFilter } from '../../filter/SubtitleFilter.ts';
import { SubtitleOverlayFilter } from '../../filter/SubtitleOverlayFilter.ts';
import { PadVaapiFilter } from '../../filter/vaapi/PadVaapiFilter.ts';
import { ScaleSubtitlesVaapiFilter } from '../../filter/vaapi/ScaleSubtitlesVaapiFilter.ts';
import { VaapiOverlayFilter } from '../../filter/vaapi/VaapiOverlayFilter.ts';
import { VaapiSubtitlePixelFormatFilter } from '../../filter/vaapi/VaapiSubtitlePixelFormatFilter.ts';
@@ -159,11 +170,13 @@ export class VaapiPipelineBuilder extends SoftwarePipelineBuilder {
      scaledSize: videoStream.frameSize,
      paddedSize: videoStream.frameSize,
      pixelFormat: videoStream.pixelFormat,
      colorFormat: videoStream.colorFormat,
    });

    currentState = this.decoder?.nextState(currentState) ?? currentState;

    currentState = this.setDeinterlace(currentState);
    currentState = this.setTonemap(currentState);
    currentState = this.setScale(currentState);
    currentState = this.setPad(currentState);
    this.setStillImageLoop();
@@ -336,6 +349,41 @@ export class VaapiPipelineBuilder extends SoftwarePipelineBuilder {
    return nextState;
  }

  protected setTonemap(currentState: FrameState): FrameState {
    if (!isVideoPipelineContext(this.context)) {
      return currentState;
    }

    const { videoStream, pipelineOptions } = this.context;

    if (
      !getBooleanEnvVar(TONEMAP_ENABLED, false) ||
      !isHdrContent(videoStream)
    ) {
      return currentState;
    }

    let filter: FilterOption | undefined;
    if (!pipelineOptions.disableHardwareFilters) {
      if (this.ffmpegCapabilities.hasFilter(KnownFfmpegFilters.TonemapVaapi)) {
        filter = new TonemapVaapiFilter(currentState);
      } else if (
        this.ffmpegCapabilities.hasFilter(KnownFfmpegFilters.TonemapOpencl)
      ) {
        filter = new TonemapOpenclFilter(currentState);
      }
    }

    if (filter) {
      const nextState = filter.nextState(currentState);
      this.videoInputSource.filterSteps.push(filter);
      return nextState;
    }

    // Fall back to software tonemap
    return super.setTonemap(currentState);
  }

  protected setScale(currentState: FrameState): FrameState {
    let nextState = currentState;
    const { desiredState, ffmpegState, shouldDeinterlace } = this.context;
@@ -379,16 +427,49 @@ export class VaapiPipelineBuilder extends SoftwarePipelineBuilder {
  }

  protected setPad(currentState: FrameState) {
    let nextState = currentState;
    if (
      isUndefined(this.desiredState.croppedSize) &&
      !currentState.paddedSize.equals(this.desiredState.paddedSize)
      this.desiredState.croppedSize &&
      currentState.paddedSize.equals(this.desiredState.paddedSize)
    ) {
      const padFilter = PadFilter.create(currentState, this.desiredState);
      nextState = padFilter.nextState(currentState);
      this.videoInputSource.filterSteps.push(padFilter);
      return currentState;
    }
    return nextState;

    if (!this.context.videoStream) {
      return currentState;
    }

    // Enabled by default
    const disableHardwarePad = getBooleanEnvVar(
      TUNARR_ENV_VARS.DISABLE_VAAPI_PAD,
      false,
    );
    let padFilter: Maybe<FilterOption>;
    if (this.context.videoStream.isHdr()) {
      padFilter = PadFilter.create(currentState, this.desiredState);
    } else if (
      !disableHardwarePad &&
      this.ffmpegCapabilities.hasFilter(KnownFfmpegFilters.PadVaapi)
    ) {
      padFilter = new PadVaapiFilter(
        currentState,
        this.desiredState.paddedSize,
      );
    } else if (
      !disableHardwarePad &&
      this.ffmpegCapabilities.hasFilter(KnownFfmpegFilters.PadOpencl)
    ) {
      padFilter = new PadOpenclFilter(
        currentState,
        this.desiredState.paddedSize,
      );
    } else {
      padFilter = PadFilter.create(currentState, this.desiredState);
    }

    currentState = padFilter.nextState(currentState);
    this.videoInputSource.filterSteps.push(padFilter);

    return currentState;
  }

  protected addSubtitles(currentState: FrameState): FrameState {
@@ -1,9 +1,24 @@
import { FileStreamSource } from '../../../../stream/types.ts';
import { TUNARR_ENV_VARS } from '../../../../util/env.ts';
import { LoggerFactory } from '../../../../util/logging/LoggerFactory.ts';
import { FfmpegCapabilities } from '../../capabilities/FfmpegCapabilities.ts';
import {
  EmptyFfmpegCapabilities,
  FfmpegCapabilities,
} from '../../capabilities/FfmpegCapabilities.ts';
import { NvidiaHardwareCapabilities } from '../../capabilities/NvidiaHardwareCapabilities.ts';
import {
  ColorPrimaries,
  ColorRanges,
  ColorSpaces,
  ColorTransferFormats,
} from '../../constants.ts';
import { DeinterlaceFilter } from '../../filter/DeinterlaceFilter.ts';
import { PixelFormatYuv420P } from '../../format/PixelFormat.ts';
import { LibplaceboTonemapFilter } from '../../filter/LibplaceboTonemapFilter.ts';
import { ColorFormat } from '../../format/ColorFormat.ts';
import {
  PixelFormatYuv420P,
  PixelFormatYuv420P10Le,
} from '../../format/PixelFormat.ts';
import { SubtitlesInputSource } from '../../input/SubtitlesInputSource.ts';
import { VideoInputSource } from '../../input/VideoInputSource.ts';
import { WatermarkInputSource } from '../../input/WatermarkInputSource.ts';
@@ -25,11 +40,6 @@ import { NvidiaPipelineBuilder } from './NvidiaPipelineBuilder.ts';
describe('NvidiaPipelineBuilder', () => {
  test('should work', () => {
    const capabilities = new NvidiaHardwareCapabilities('RTX 2080 Ti', 75);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
      VideoStream.create({
@@ -39,6 +49,7 @@ describe('NvidiaPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: ColorFormat.unknown,
      }),
    );

@@ -61,7 +72,7 @@ describe('NvidiaPipelineBuilder', () => {

    const builder = new NvidiaPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -88,7 +99,7 @@ describe('NvidiaPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
      }),
@@ -103,11 +114,6 @@ describe('NvidiaPipelineBuilder', () => {

  test('should work software decode', () => {
    const capabilities = new NvidiaHardwareCapabilities('RTX 2080 Ti', 75);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mp2'),
      VideoStream.create({
@@ -118,12 +124,13 @@ describe('NvidiaPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: '1:1',
        colorFormat: ColorFormat.unknown,
      }),
    );

    const builder = new NvidiaPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -146,7 +153,7 @@ describe('NvidiaPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        deinterlace: true,
@@ -164,11 +171,6 @@ describe('NvidiaPipelineBuilder', () => {

  test('should work with hardware filters disabled', () => {
    const capabilities = new NvidiaHardwareCapabilities('RTX 2080 Ti', 75);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );
    const video = VideoInputSource.withStream(
      new FileStreamSource('/path/to/video.mkv'),
      VideoStream.create({
@@ -178,6 +180,7 @@ describe('NvidiaPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: ColorFormat.unknown,
      }),
    );

@@ -200,7 +203,7 @@ describe('NvidiaPipelineBuilder', () => {

    const builder = new NvidiaPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -226,7 +229,7 @@ describe('NvidiaPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
      }),
@@ -241,11 +244,6 @@ describe('NvidiaPipelineBuilder', () => {

  test('updates pixel format for non-scaled input', () => {
    const capabilities = new NvidiaHardwareCapabilities('RTX 2080 Ti', 75);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );

    const videoSource = new FileStreamSource('/path/to/video.mkv');

@@ -258,6 +256,7 @@ describe('NvidiaPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: ColorFormat.unknown,
      }),
    );

@@ -280,7 +279,7 @@ describe('NvidiaPipelineBuilder', () => {

    const builder = new NvidiaPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -307,7 +306,7 @@ describe('NvidiaPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        deinterlace: true,
@@ -318,13 +317,251 @@ describe('NvidiaPipelineBuilder', () => {
    console.log(out.getCommandArgs().join(' '));
  });

  describe('HDR tonemapping', () => {
    const hdrColorFormat = new ColorFormat({
      colorRange: ColorRanges.Tv,
      colorSpace: ColorSpaces.Bt2020nc,
      colorTransfer: ColorTransferFormats.Smpte2084,
      colorPrimaries: ColorPrimaries.Bt2020,
    });

    const capabilities = new NvidiaHardwareCapabilities('RTX 2080 Ti', 75);

    const ffmpegVersion = {
      versionString: 'n7.0.2-15-g0458a86656-20240904',
      majorVersion: 7,
      minorVersion: 0,
      patchVersion: 2,
      isUnknown: false,
    } as const;

    function makeHdrVideoInput() {
      return VideoInputSource.withStream(
        new FileStreamSource('/path/to/hdr-video.mkv'),
        VideoStream.create({
          codec: 'hevc',
          displayAspectRatio: '16:9',
          frameSize: FrameSize.FHD,
          index: 0,
          pixelFormat: new PixelFormatYuv420P10Le(),
          providedSampleAspectRatio: null,
          colorFormat: hdrColorFormat,
        }),
      );
    }

    function makeDesiredFrameState(video: VideoInputSource) {
      return new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
      });
    }

    afterEach(() => {
      vi.unstubAllEnvs();
    });

    test('uses LibplaceboTonemapFilter for HDR content with Vulkan and libplacebo capabilities', () => {
      vi.stubEnv(TUNARR_ENV_VARS.TONEMAP_ENABLED, 'true');
      vi.stubEnv(TUNARR_ENV_VARS.DISABLE_VULKAN, 'false');

      const binaryCapabilities = new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set(['libplacebo']),
        new Set(['vulkan']),
      );
      const video = makeHdrVideoInput();

      const builder = new NvidiaPipelineBuilder(
        capabilities,
        binaryCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        makeDesiredFrameState(video),
        DefaultPipelineOptions,
      );

      const tonemapFilter = out
        .getComplexFilter()
        ?.filterChain.videoFilterSteps.find(
          (step) => step instanceof LibplaceboTonemapFilter,
        );

      expect(tonemapFilter).toBeInstanceOf(LibplaceboTonemapFilter);
      expect(tonemapFilter?.filter).toContain('libplacebo=tonemapping=auto');
    });

    test('does not tonemap HDR content when Vulkan hwaccel is not available', () => {
      vi.stubEnv('TUNARR_DISABLE_VULKAN', 'true');

      const noVulkanCapabilities = new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set(['libplacebo']),
        new Set(), // no vulkan
      );
      const video = makeHdrVideoInput();

      const builder = new NvidiaPipelineBuilder(
        capabilities,
        noVulkanCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        makeDesiredFrameState(video),
        DefaultPipelineOptions,
      );

      const tonemapFilter = out
        .getComplexFilter()
        ?.filterChain.videoFilterSteps.find(
          (step) => step instanceof LibplaceboTonemapFilter,
        );

      expect(tonemapFilter).toBeUndefined();
    });

    test('does not tonemap HDR content when libplacebo filter is not available', () => {
      vi.stubEnv('TUNARR_DISABLE_VULKAN', 'true');

      const noLibplaceboCapabilities = new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set(), // no libplacebo
        new Set(['vulkan']),
      );
      const video = makeHdrVideoInput();

      const builder = new NvidiaPipelineBuilder(
        capabilities,
        noLibplaceboCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        makeDesiredFrameState(video),
        DefaultPipelineOptions,
      );

      const tonemapFilter = out
        .getComplexFilter()
        ?.filterChain.videoFilterSteps.find(
          (step) => step instanceof LibplaceboTonemapFilter,
        );

      expect(tonemapFilter).toBeUndefined();
    });

    test('does not tonemap SDR content even with Vulkan and libplacebo capabilities', () => {
      vi.stubEnv('TUNARR_DISABLE_VULKAN', 'true');

      const binaryCapabilities = new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set(['libplacebo']),
        new Set(['vulkan']),
      );
      const video = VideoInputSource.withStream(
        new FileStreamSource('/path/to/sdr-video.mkv'),
        VideoStream.create({
          codec: 'hevc',
          displayAspectRatio: '16:9',
          frameSize: FrameSize.FHD,
          index: 0,
          pixelFormat: new PixelFormatYuv420P(),
          providedSampleAspectRatio: null,
          colorFormat: ColorFormat.bt709,
        }),
      );

      const builder = new NvidiaPipelineBuilder(
        capabilities,
        binaryCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        new FrameState({
          isAnamorphic: false,
          scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
          paddedSize: FrameSize.FHD,
          pixelFormat: new PixelFormatYuv420P(),
        }),
        DefaultPipelineOptions,
      );

      const tonemapFilter = out
        .getComplexFilter()
        ?.filterChain.videoFilterSteps.find(
          (step) => step instanceof LibplaceboTonemapFilter,
        );

      expect(tonemapFilter).toBeUndefined();
    });

    test('does not tonemap HDR content when TUNARR_DISABLE_VULKAN is not set', () => {
      const binaryCapabilities = new FfmpegCapabilities(
        new Set(),
        new Map(),
        new Set(['libplacebo']),
        new Set(['vulkan']),
      );
      const video = makeHdrVideoInput();

      const builder = new NvidiaPipelineBuilder(
        capabilities,
        binaryCapabilities,
        video,
        null,
        null,
        null,
        null,
      );

      const out = builder.build(
        FfmpegState.create({ version: ffmpegVersion }),
        makeDesiredFrameState(video),
        DefaultPipelineOptions,
      );

      const tonemapFilter = out
        .getComplexFilter()
        ?.filterChain.videoFilterSteps.find(
          (step) => step instanceof LibplaceboTonemapFilter,
        );

      expect(tonemapFilter).toBeUndefined();
    });
  });

  test('intermittent watermark, set format on hardware scale, do not set format on hwdownload', async () => {
    const capabilities = new NvidiaHardwareCapabilities('RTX 2080 Ti', 75);
    const binaryCapabilities = new FfmpegCapabilities(
      new Set(),
      new Map(),
      new Set(),
    );

    const videoSource = new FileStreamSource('/path/to/video.mkv');

@@ -337,6 +574,7 @@ describe('NvidiaPipelineBuilder', () => {
        index: 0,
        pixelFormat: new PixelFormatYuv420P(),
        providedSampleAspectRatio: null,
        colorFormat: ColorFormat.unknown,
      }),
    );

@@ -359,7 +597,7 @@ describe('NvidiaPipelineBuilder', () => {

    const builder = new NvidiaPipelineBuilder(
      capabilities,
      binaryCapabilities,
      EmptyFfmpegCapabilities,
      video,
      null,
      null,
@@ -382,7 +620,7 @@ describe('NvidiaPipelineBuilder', () => {
      state,
      new FrameState({
        isAnamorphic: false,
        scaledSize: video.streams[0].squarePixelFrameSize(FrameSize.FHD),
        scaledSize: video.streams[0]!.squarePixelFrameSize(FrameSize.FHD),
        paddedSize: FrameSize.FHD,
        pixelFormat: new PixelFormatYuv420P(),
        deinterlace: false,

@@ -27,9 +27,11 @@ import { isVideoPipelineContext } from '@/ffmpeg/builder/pipeline/BasePipelineBu
|
||||
import { SoftwarePipelineBuilder } from '@/ffmpeg/builder/pipeline/software/SoftwarePipelineBuilder.js';
|
||||
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
|
||||
import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
|
||||
import type { Nullable } from '@/types/util.js';
|
||||
import type { Maybe, Nullable } from '@/types/util.js';
|
||||
import { isDefined, isNonEmptyString } from '@/util/index.js';
|
||||
import { head, isEmpty, isNil, isNull, reject, some } from 'lodash-es';
|
||||
import { getBooleanEnvVar, TUNARR_ENV_VARS } from '../../../../util/env.ts';
|
||||
import { VulkanDecoder } from '../../decoder/VulkanDecoder.ts';
|
||||
import {
|
||||
ImplicitNvidiaDecoder,
|
||||
NvidiaDecoder,
|
||||
@@ -40,6 +42,7 @@ import {
|
||||
} from '../../encoder/nvidia/NvidiaEncoders.ts';
|
||||
import { HardwareDownloadFilter } from '../../filter/HardwareDownloadFilter.ts';
|
||||
import { ImageScaleFilter } from '../../filter/ImageScaleFilter.ts';
|
||||
import { LibplaceboTonemapFilter } from '../../filter/LibplaceboTonemapFilter.ts';
|
||||
import { SubtitleFilter } from '../../filter/SubtitleFilter.ts';
|
||||
import { SubtitleOverlayFilter } from '../../filter/SubtitleOverlayFilter.ts';
|
||||
import { NvidiaCropBottomBitstreamFilter } from '../../filter/nvidia/NvidiaCropBottomBitstreamFilter.ts';
|
||||
@@ -111,8 +114,20 @@ export class NvidiaPipelineBuilder extends SoftwarePipelineBuilder {
|
||||
canDecode = false;
|
||||
}
|
||||
|
||||
const needsTonemapWithVulkan =
|
||||
getBooleanEnvVar(TUNARR_ENV_VARS.TONEMAP_ENABLED, false) &&
|
||||
canDecode &&
|
||||
this.ffmpegCapabilities.hasHardwareAccel(
|
||||
HardwareAccelerationMode.Vulkan,
|
||||
) &&
|
||||
this.ffmpegCapabilities.hasFilter(KnownFfmpegFilters.Libplacebo) &&
|
||||
!!this.videoInputSource.streams?.[0]?.isHdr() &&
|
||||
!getBooleanEnvVar(TUNARR_ENV_VARS.DISABLE_VULKAN, false);
|
||||
|
||||
if (canDecode) {
|
||||
pipelineSteps.push(new CudaHardwareAccelerationOption());
|
||||
pipelineSteps.push(
|
||||
new CudaHardwareAccelerationOption(needsTonemapWithVulkan),
|
||||
);
|
||||
}
|
||||
|
||||
ffmpegState.decoderHwAccelMode = canDecode
|
||||
@@ -121,6 +136,7 @@ export class NvidiaPipelineBuilder extends SoftwarePipelineBuilder {
|
||||
ffmpegState.encoderHwAccelMode = canEncode
|
||||
? HardwareAccelerationMode.Cuda
|
||||
: HardwareAccelerationMode.None;
|
||||
ffmpegState.tonemapHdr = needsTonemapWithVulkan;
|
||||
}
|
||||
|
||||
protected setupDecoder(): Nullable<Decoder> {
|
||||
@@ -130,7 +146,12 @@ export class NvidiaPipelineBuilder extends SoftwarePipelineBuilder {
|
||||
|
||||
const { ffmpegState } = this.context;
|
||||
let decoder: Nullable<Decoder> = null;
|
||||
if (ffmpegState.decoderHwAccelMode === HardwareAccelerationMode.Cuda) {
|
||||
if (ffmpegState.tonemapHdr) {
|
||||
decoder = new VulkanDecoder();
|
||||
this.videoInputSource.addOption(decoder);
|
||||
} else if (
|
||||
ffmpegState.decoderHwAccelMode === HardwareAccelerationMode.Cuda
|
||||
) {
|
||||
decoder = new ImplicitNvidiaDecoder();
|
||||
this.videoInputSource.addOption(decoder);
|
||||
} else {
|
||||
@@ -174,6 +195,10 @@ export class NvidiaPipelineBuilder extends SoftwarePipelineBuilder {
|
||||
this.videoInputSource.addFilter(filter);
|
||||
}
|
||||
|
||||
if (this.ffmpegState.tonemapHdr) {
|
||||
currentState = this.setTonemap(currentState);
|
||||
}
|
||||
|
||||
currentState = NvidiaDeinterlacer.setDeinterlace(
|
||||
this.context,
|
||||
this.videoInputSource,
|
||||
@@ -774,4 +799,29 @@ export class NvidiaPipelineBuilder extends SoftwarePipelineBuilder {
|
||||
|
||||
return currentState;
|
||||
}
|
||||
|
||||
protected setTonemap(currentState: FrameState): FrameState {
|
||||
if (!this.context.videoStream?.isHdr()) {
|
||||
return currentState;
|
||||
}
|
||||
|
||||
if (!this.desiredState.pixelFormat) {
|
||||
return currentState;
|
||||
}
|
||||
|
||||
let tonemapFilter: Maybe<FilterOption>;
|
||||
if (this.ffmpegState.tonemapHdr) {
|
||||
tonemapFilter = new LibplaceboTonemapFilter(
|
||||
this.desiredState.pixelFormat,
|
||||
);
|
||||
}
|
||||
// TODO: Software
|
||||
|
||||
if (tonemapFilter) {
|
||||
this.videoInputSource.filterSteps.push(tonemapFilter);
|
||||
currentState = tonemapFilter.nextState(currentState);
|
||||
}
|
||||
|
||||
return currentState;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -120,6 +120,7 @@ describe('NvidiaScaler', () => {
|
||||
expect(nextState).toStrictEqual(
|
||||
currentState.update({
|
||||
scaledSize: FrameSize.FHD,
|
||||
frameDataLocation: FrameDataLocation.Software,
|
||||
}),
|
||||
);
|
||||
});
|
||||
|
||||
@@ -1,13 +1,20 @@
|
||||
import { VideoFormats } from '@/ffmpeg/builder/constants.js';
|
||||
import {
|
||||
ColorTransferFormats,
|
||||
VideoFormats,
|
||||
} from '@/ffmpeg/builder/constants.js';
|
||||
import { Encoder } from '@/ffmpeg/builder/encoder/Encoder.js';
|
||||
import { DeinterlaceFilter } from '@/ffmpeg/builder/filter/DeinterlaceFilter.js';
|
||||
import type { FilterOption } from '@/ffmpeg/builder/filter/FilterOption.js';
|
||||
import { PadFilter } from '@/ffmpeg/builder/filter/PadFilter.js';
|
||||
import { ScaleFilter } from '@/ffmpeg/builder/filter/ScaleFilter.js';
|
||||
import { isHdrContent } from '@/ffmpeg/builder/filter/HdrDetection.js';
|
||||
import { TonemapFilter } from '@/ffmpeg/builder/filter/TonemapFilter.js';
|
||||
import { OverlayWatermarkFilter } from '@/ffmpeg/builder/filter/watermark/OverlayWatermarkFilter.js';
|
||||
import { ColorFormat } from '@/ffmpeg/builder/format/ColorFormat.js';
|
||||
import { PixelFormatOutputOption } from '@/ffmpeg/builder/options/OutputOption.js';
|
||||
import type { FrameState } from '@/ffmpeg/builder/state/FrameState.js';
|
||||
import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
|
||||
import { TONEMAP_ENABLED, getBooleanEnvVar } from '@/util/env.js';
|
||||
import dayjs from '@/util/dayjs.js';
|
||||
import type { Watermark } from '@tunarr/types';
|
||||
import { filter, first, isEmpty, isNull, some } from 'lodash-es';
|
||||
@@ -38,10 +45,12 @@ export class SoftwarePipelineBuilder extends BasePipelineBuilder {
|
||||
isAnamorphic: videoStream.isAnamorphic,
|
||||
scaledSize: videoStream.frameSize,
|
||||
paddedSize: videoStream.frameSize,
|
||||
colorFormat: videoStream.colorFormat,
|
||||
});
|
||||
|
||||
if (desiredState.videoFormat !== VideoFormats.Copy) {
|
||||
currentState = this.setDeinterlace(currentState);
|
||||
currentState = this.setTonemap(currentState);
|
||||
currentState = this.setScale(currentState);
|
||||
currentState = this.setPad(currentState);
|
||||
currentState = this.addSubtitles(currentState);
|
||||
@@ -268,4 +277,33 @@ export class SoftwarePipelineBuilder extends BasePipelineBuilder {
|
||||
|
||||
return currentState;
|
||||
}
|
||||
|
||||
protected setTonemap(currentState: FrameState): FrameState {
|
||||
if (!isVideoPipelineContext(this.context)) {
|
||||
return currentState;
|
||||
}
|
||||
const { videoStream } = this.context;
|
||||
if (
|
||||
!getBooleanEnvVar(TONEMAP_ENABLED, false) ||
|
||||
!isHdrContent(videoStream)
|
||||
) {
|
||||
return currentState;
|
||||
}
|
||||
// DV Profile 5 may report color_transfer = null even though it uses PQ
|
||||
// encoding. Explicitly set smpte2084 so zscale can correctly invert the curve.
|
||||
const effectiveState =
|
||||
videoStream.isDolbyVision() && !currentState.colorFormat?.colorTransfer
|
||||
? currentState.update({
|
||||
colorFormat: new ColorFormat({
|
||||
colorTransfer: ColorTransferFormats.Smpte2084,
|
||||
colorRange: currentState.colorFormat?.colorRange ?? null,
|
||||
colorSpace: currentState.colorFormat?.colorSpace ?? null,
|
||||
colorPrimaries: currentState.colorFormat?.colorPrimaries ?? null,
|
||||
}),
|
||||
})
|
||||
: currentState;
|
||||
const filter = new TonemapFilter(effectiveState);
|
||||
this.videoInputSource.filterSteps.push(filter);
|
||||
return filter.nextState(currentState);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,4 +1,5 @@
|
||||
import type { ExcludeByValueType, Nullable } from '@/types/util.js';
|
||||
import type { LoudnormConfig } from '@tunarr/types';
|
||||
import { isNil, omitBy } from 'lodash-es';
|
||||
import type { AnyFunction } from 'ts-essentials';
|
||||
import type { TranscodeAudioOutputFormat } from '../../../db/schema/TranscodeConfig.ts';
|
||||
@@ -13,6 +14,8 @@ const DefaultAudioState: AudioState = {
|
||||
audioSampleRate: null,
|
||||
audioDuration: null,
|
||||
audioVolume: null,
|
||||
normalizeLoudness: false,
|
||||
loudnormConfig: null,
|
||||
};
|
||||
|
||||
export class AudioState {
|
||||
@@ -23,6 +26,8 @@ export class AudioState {
|
||||
audioSampleRate: Nullable<number>;
|
||||
audioDuration: Nullable<number>;
|
||||
audioVolume: Nullable<number>;
|
||||
normalizeLoudness: boolean;
|
||||
loudnormConfig: Nullable<LoudnormConfig>;
|
||||
|
||||
private constructor(fields: Partial<AudioStateFields> = {}) {
|
||||
const merged: AudioStateFields = {
|
||||
@@ -36,6 +41,8 @@ export class AudioState {
|
||||
this.audioSampleRate = merged.audioSampleRate;
|
||||
this.audioDuration = merged.audioDuration;
|
||||
this.audioVolume = merged.audioVolume;
|
||||
this.normalizeLoudness = merged.normalizeLoudness ?? false;
|
||||
this.loudnormConfig = merged.loudnormConfig;
|
||||
}
|
||||
|
||||
static create(fields: Partial<AudioStateFields> = {}) {
|
||||
|
||||
@@ -1,7 +1,6 @@
|
||||
import { HardwareAccelerationMode } from '@/db/schema/TranscodeConfig.js';
|
||||
import type { DataProps } from '@/ffmpeg/builder/types.js';
|
||||
import type { FfmpegVersionResult } from '@/ffmpeg/ffmpegInfo.js';
|
||||
import type { Maybe, Nullable } from '@/types/util.js';
|
||||
import type { DataProps, Maybe, Nullable } from '@/types/util.js';
|
||||
import type { FfmpegLogLevel } from '@tunarr/types/schemas';
|
||||
import type { Duration } from 'dayjs/plugin/duration.js';
|
||||
import { merge } from 'lodash-es';
|
||||
@@ -75,10 +74,14 @@ export class FfmpegState {
|
||||
outputFormat: OutputFormat = MpegTsOutputFormat; // TODO: No
|
||||
outputLocation: OutputLocation = OutputLocation.Stdout;
|
||||
ptsOffset?: number;
|
||||
tonemapHdr: boolean = false;
|
||||
|
||||
// HLS
|
||||
get hlsPlaylistPath(): Maybe<string> {
|
||||
if (this.outputFormat.type === OutputFormatTypes.Hls) {
|
||||
if (
|
||||
this.outputFormat.type === OutputFormatTypes.Hls ||
|
||||
this.outputFormat.type === OutputFormatTypes.HlsDirectV2
|
||||
) {
|
||||
return path.join(
|
||||
this.outputFormat.hlsOptions.segmentBaseDirectory,
|
||||
this.outputFormat.hlsOptions.streamBasePath,
|
||||
@@ -89,7 +92,10 @@ export class FfmpegState {
|
||||
}
|
||||
|
||||
get hlsSegmentTemplate(): Maybe<string> {
|
||||
if (this.outputFormat.type === OutputFormatTypes.Hls) {
|
||||
if (
|
||||
this.outputFormat.type === OutputFormatTypes.Hls ||
|
||||
this.outputFormat.type === OutputFormatTypes.HlsDirectV2
|
||||
) {
|
||||
return path.join(
|
||||
this.outputFormat.hlsOptions.segmentBaseDirectory,
|
||||
this.outputFormat.hlsOptions.streamBasePath,
|
||||
@@ -100,7 +106,10 @@ export class FfmpegState {
|
||||
}
|
||||
|
||||
get hlsBaseStreamUrl() {
|
||||
if (this.outputFormat.type === OutputFormatTypes.Hls) {
|
||||
if (
|
||||
this.outputFormat.type === OutputFormatTypes.Hls ||
|
||||
this.outputFormat.type === OutputFormatTypes.HlsDirectV2
|
||||
) {
|
||||
return this.outputFormat.hlsOptions.streamBaseUrl;
|
||||
}
|
||||
return;
|
||||
|
||||
@@ -2,12 +2,13 @@ import {
|
||||
PixelFormatUnknown,
|
||||
type PixelFormat,
|
||||
} from '@/ffmpeg/builder/format/PixelFormat.js';
|
||||
import type { DataProps, FrameSize } from '@/ffmpeg/builder/types.js';
|
||||
import type { FrameSize } from '@/ffmpeg/builder/types.js';
|
||||
import { FrameDataLocation } from '@/ffmpeg/builder/types.js';
|
||||
import type { Nullable } from '@/types/util.js';
|
||||
import type { DataProps, Nullable } from '@/types/util.js';
|
||||
import { isEqual, merge } from 'lodash-es';
|
||||
import type { MarkOptional } from 'ts-essentials';
|
||||
import type { VideoFormat } from '../constants.ts';
|
||||
import { ColorFormat } from '../format/ColorFormat.ts';
|
||||
|
||||
type FrameStateFields = DataProps<FrameState>;
|
||||
|
||||
@@ -28,6 +29,7 @@ export const DefaultFrameState: Omit<
|
||||
deinterlace: false,
|
||||
pixelFormat: null,
|
||||
bitDepth: 8,
|
||||
colorFormat: ColorFormat.unknown,
|
||||
forceSoftwareOverlay: false,
|
||||
infiniteLoop: false,
|
||||
};
|
||||
@@ -48,6 +50,7 @@ export class FrameState {
|
||||
frameDataLocation: FrameDataLocation;
|
||||
deinterlace: boolean;
|
||||
pixelFormat: Nullable<PixelFormat>;
|
||||
colorFormat: Nullable<ColorFormat>;
|
||||
infiniteLoop: boolean = false;
|
||||
|
||||
forceSoftwareOverlay = false;
|
||||
|
||||
@@ -1,8 +1,5 @@
|
||||
import type { ExcludeByValueType, TupleToUnion } from '@/types/util.js';
|
||||
import type { DataProps, TupleToUnion } from '@/types/util.js';
|
||||
import type { Resolution } from '@tunarr/types';
|
||||
import type { AnyFunction } from 'ts-essentials';
|
||||
|
||||
export type DataProps<T> = ExcludeByValueType<T, AnyFunction>;
|
||||
|
||||
export const StreamKinds = [
|
||||
'audio',
|
||||
|
||||
@@ -22,6 +22,7 @@ export const ConcatStreamModeToChildMode: Record<
|
||||
hls_slower_concat: 'hls_slower',
|
||||
mpegts_concat: 'mpegts',
|
||||
hls_direct_concat: 'hls_direct',
|
||||
hls_direct_v2_concat: 'hls_direct_v2',
|
||||
} as const;
|
||||
|
||||
export abstract class IFFMPEG {
|
||||
|
||||
Some files were not shown because too many files have changed in this diff.