Compare commits

...

101 Commits
v6.1.2 ... main

Author SHA1 Message Date
Bel LaPointe a1adf97bfb no dry 2025-09-25 15:56:10 -04:00
Bel LaPointe 8dea290dd1 nosqueaky 2025-09-25 15:00:14 -04:00
Bel LaPointe 90358f0176 allowlist 2025-09-25 14:55:50 -04:00
Bel LaPointe 96239d6704 runs 2025-09-25 14:55:11 -04:00
Bel LaPointe e8e6d6469f at least it bombs now 2025-09-25 14:40:30 -04:00
Bel LaPointe 0dc338df6f quote 2025-09-25 14:36:56 -04:00
Bel LaPointe 72df1cd0ad env 2025-09-25 14:36:06 -04:00
Bel LaPointe 8af7fe00f9 me 2025-09-25 14:34:41 -04:00
Luis Garcia 6eefedfc40 Tag version 8.3.0
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-09-11 17:43:32 -06:00
Luigi311 2914dbb81c
Merge pull request #311 from luigi311/deps
Update dependencies
2025-09-11 17:42:39 -06:00
Luis Garcia 52c780d8a7 Update dependencies
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-09-11 17:39:18 -06:00
Luigi311 0276e7c8eb
Merge pull request #309 from luigi311/pathlib
Utilize pathlib for universal location file extraction
2025-09-11 17:31:28 -06:00
Luis Garcia bf50defcb5 Use pathlib to extract file/folder to fix windows paths
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-09-10 21:37:17 -06:00
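The pathlib fix above boils down to using pure path classes, which parse path syntax without touching the filesystem. A minimal sketch of the idea (an illustrative helper, not the project's code; its real counterpart appears in the src/functions.py diff below):

```python
from pathlib import PurePosixPath, PureWindowsPath

def extract_name(path: str) -> str:
    # Backslashes without forward slashes suggest a Windows-style path
    if "\\" in path and "/" not in path:
        return PureWindowsPath(path).name
    return PurePosixPath(path).name

print(extract_name(r"C:\Media\Shows\Pilot.mkv"))  # Pilot.mkv
print(extract_name("/media/shows/Pilot.mkv"))     # Pilot.mkv
```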
Luigi311 71d753878e
Merge pull request #298 from luigi311/identifies_logging
Identifies logging
2025-07-15 01:09:07 -06:00
Luis Garcia 21c530d956 Plex: Log missing identifiers information
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-14 23:10:06 +00:00
Luis Garcia 142c9df6e9 Jellyfin/Emby: Log more missing identifiers information
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-14 22:53:00 +00:00
Luis Garcia 629f50ecdc Tag version 8.2.0
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-14 19:10:36 +00:00
Luigi311 3e2450b5fd
Merge pull request #296 from luigi311/fix_emby
Jellyfin/Emby: Add fallback for played percentage if missing
2025-07-14 13:09:58 -06:00
Luis Garcia 0de5e86837 Jellyfin/Emby: Add fallback for played percentage if missing
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-14 17:21:55 +00:00
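A hedged sketch of what such a fallback can look like: when the server omits a played percentage, derive one from the playback position and runtime. PlaybackPositionTicks and RunTimeTicks are standard Jellyfin/Emby API fields, but this helper is an illustration, not the project's actual code:

```python
def played_percentage(user_data: dict, runtime_ticks: int | None) -> float:
    # Prefer the percentage the server reports, when present
    if user_data.get("PlayedPercentage") is not None:
        return float(user_data["PlayedPercentage"])
    # Fallback: derive it from position / runtime
    position = user_data.get("PlaybackPositionTicks", 0)
    if runtime_ticks:
        return position / runtime_ticks * 100
    return 0.0
```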
Luis Garcia 33a719f693 Tag version 8.1.0
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-12 00:17:46 +00:00
Luigi311 9ff985a848
Merge pull request #292 from luigi311/sync_timestamps
Jellyfin/Emby: Sync across the view times
2025-07-11 18:16:56 -06:00
Luis Garcia 5501e21aa8 Jellyfin/Emby: Use the same endpoint for marking as for partials to fix emby
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-12 00:14:27 +00:00
Luis Garcia 2208d91d07 README: Add sync view dates
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 23:48:33 +00:00
Luis Garcia 75f7f576ac Jellyfin/Emby: Sync across the view times
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 23:48:23 +00:00
Luis Garcia 24f56769f9 Tag version 8.0.0
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 18:31:19 +00:00
Luigi311 29e4f224dc
Merge pull request #233 from luigi311/reuse_server1
Reuse server_1_watched
2025-07-11 12:29:35 -06:00
Luis Garcia bdb58918e7 Jellyfin/Emby: Log missing identifiers to debug
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 18:06:20 +00:00
Luis Garcia c3be980eea Reuse server_1_watched history to avoid duplication
Keeps the server_1_watched history so it does not need to fetch
the same results again each time it syncs to another server

Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 17:35:53 +00:00
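A minimal sketch of the caching idea in c3be980eea, with hypothetical get_watched/update_watched methods: fetch the source server's history once and reuse it for every destination:

```python
def sync_all(server_1, destinations):
    server_1_watched = None  # fetched lazily, then reused for each pair
    for server_2 in destinations:
        if server_1_watched is None:
            # The expensive fetch happens once instead of once per destination
            server_1_watched = server_1.get_watched()
        server_2.update_watched(server_1_watched)
```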
Luis Garcia c1a26dd73b README: Add ENV_FILE option
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 17:01:59 +00:00
Luigi311 e5d5f11f33
Merge pull request #291 from luigi311/deps
Deps: Update deps
2025-07-11 10:50:28 -06:00
Luis Garcia 616ca92d5e Deps: Update deps
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 16:43:55 +00:00
Luigi311 b2b214c987
Merge pull request #290 from luigi311/docker_push
CI: Simplify docker
2025-07-11 10:40:58 -06:00
Luis Garcia 07542b498e CI: UV sync frozen and no extra tools
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 10:37:26 -06:00
Luis Garcia 9e53c0f8e2 CI: Simplify docker build push
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-07-11 10:32:52 -06:00
Luigi311 98266de678
Merge pull request #279 from luigi311/env_file
Add env file support, set via ENV_FILE
2025-07-11 10:24:40 -06:00
Luis Garcia 9d4f3dd432 Move generate locations/guids to the class level
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-06-05 17:04:05 -06:00
Luis Garcia cc9b84fefa Update .env.sample
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-05-18 20:38:19 +00:00
Luis Garcia c76bb3b355 CI: Use ENV_FILE
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-05-18 20:25:13 +00:00
Luis Garcia 544649effd Add env file support, set via ENV_FILE
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-05-18 20:25:13 +00:00
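A hedged sketch of the mechanism: python-dotenv's dotenv_values reads a file into a plain dict without mutating os.environ, and an ENV_FILE variable can point it at a custom location (the fallback to ".env" here is an assumption):

```python
import os

from dotenv import dotenv_values  # python-dotenv

env_file = os.getenv("ENV_FILE", ".env")  # assumed default
env = dotenv_values(env_file)  # dict of settings; os.environ is left untouched
print(env.get("DRYRUN"))
```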
Luis Garcia 46b60bb866 Tag version 7.0.4
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-05-16 04:06:05 +00:00
Luigi311 5670c3ad97
Merge pull request #278 from luigi311/update_deps
Update dependencies
2025-05-15 22:05:19 -06:00
Luis Garcia 7e0f4babda Update dependencies
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-05-16 04:01:59 +00:00
Luigi311 d5c36c61ec
Merge pull request #277 from luigi311/error_raised
Do not fail on some errors
2025-05-15 18:25:11 -06:00
Luis Garcia 69cd73d965 Functions: Remove fstring from mark_file file path
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-05-15 03:46:51 +00:00
Luis Garcia 229ab59b44 Do not fail on some errors
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-05-15 03:38:33 +00:00
Luigi311 3e474a4593
Merge pull request #267 from masesisaac/main
fix: case-insensitive library name check for jellyfin/emby
2025-04-10 16:28:19 -06:00
masesisaac 69958a257b fix: case-insensitive library name check 2025-04-11 00:34:10 +03:00
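The fix in 69958a257b amounts to normalizing case before comparing library names; a minimal illustration with a hypothetical helper:

```python
def library_matches(library_name: str, wanted: list[str]) -> bool:
    # Case-insensitive membership test: "tv shows" matches "TV Shows"
    return library_name.lower() in (w.lower() for w in wanted)
```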
Luis Garcia 64c1823e5b tag 7.0.3
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-04-03 19:25:20 +00:00
Luigi311 446f6df470
Merge pull request #259 from luigi311/fallback_library_types
Jellyfin/Emby: Add fallback to media files for library types
2025-04-03 13:23:41 -06:00
Luis Garcia 91ea5d76f6 Jellyfin/Emby: Add fallback to media files for library types
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-04-02 04:27:39 +00:00
Luigi311 dc26b9a7b1
Merge pull request #249 from luigi311/simplify_get_watched
Jellyfin/Emby: Simplify get watched
2025-03-07 16:35:11 -07:00
Luis Garcia d98b7c3e09 Jellyfin/Emby: Simplify get watched
Shouldn't need to do library type checks as that is handled in the
get libraries function and then used with the sync libraries name
check

Signed-off-by: Luis Garcia <git@luigi311.com>
2025-03-07 23:29:36 +00:00
Luigi311 93d9471333
Merge pull request #248 from luigi311/reliable
Improve reliability
2025-03-07 16:27:55 -07:00
Luis Garcia e6fa8ae745 Treewide: MyPy type fixes
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-03-07 23:24:33 +00:00
Luis Garcia 5b644a54a2 Plex: Better reliability 2025-03-07 20:23:02 +00:00
Luis Garcia 5a17c5f7a1 Jellyfin/Emby: Better reliability 2025-03-07 19:34:37 +00:00
Luigi311 61e3dddd6b
Merge pull request #245 from luigi311/none_title
Watched: Allow None for mediaidentifier title
2025-02-26 23:14:20 -07:00
Luis Garcia aaaa7eba70 Watched: Allow None for mediaidentifier title 2025-02-26 23:13:49 +00:00
Luigi311 991355716d
Merge pull request #240 from luigi311/user_name
Plex: Use username for watch key if exists
2025-02-26 15:48:32 -07:00
Luis Garcia 54bd6e836f Plex: Use username for watch key if exists
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-26 22:46:16 +00:00
Luigi311 57c41f41bc
Merge pull request #244 from luigi311/watched_trace
Move server watched log to trace
2025-02-26 13:27:09 -07:00
Luis Garcia ea85a31d9c Move server watched log to trace
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-26 20:23:40 +00:00
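For context, loguru's TRACE level sits below DEBUG, so payloads logged with logger.trace stay hidden unless a sink opts in. A small sketch:

```python
import sys

from loguru import logger

logger.remove()                        # drop the default DEBUG-level sink
logger.add(sys.stdout, level="TRACE")  # opt in to TRACE output
logger.trace("Server watched: {}", {"user": ["episode 1"]})
```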
Luigi311 80d5c9e54c
Merge pull request #242 from BadCoder2/patch-1
Update README.md to reflect uv update; make environment setting more clear
2025-02-25 23:05:30 -07:00
Nathan Hoffman 5828701944
Update README.md to uv; make environment setting more clear 2025-02-25 23:44:26 -06:00
Luigi311 81ba9bd7f9
Merge pull request #237 from luigi311/rotate_logging
Reconfigure the logger on each loop so the logs are rotated on each run
2025-02-23 12:46:14 -07:00
Luis Garcia d15759570e Reconfigure the logger on each loop so the logs are rotated on each run
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-23 19:42:55 +00:00
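A minimal sketch of the rotate-per-run pattern in d15759570e: remove all sinks, then re-add the file sink with mode="w" so the log starts fresh each loop (paths are illustrative):

```python
from loguru import logger

def configure_logger(log_file: str = "log.log") -> None:
    logger.remove()                 # detach any existing sinks
    logger.add(log_file, mode="w")  # mode="w" truncates the file

for _ in range(2):  # each iteration begins with an empty log file
    configure_logger()
    logger.info("run starting")
```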
Luigi311 1b88ecf2eb
Merge pull request #236 from luigi311/logging
More logging
2025-02-22 17:18:10 -07:00
Luis Garcia c62809c615 More logging
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-23 00:15:14 +00:00
Luigi311 899a6b05a4
Merge pull request #235 from luigi311/docker_defaults
Docker: Set default env values to prevent issues
2025-02-21 23:41:12 -07:00
Luis Garcia fcd6103e17 Docker: Set default env values to prevent issues
Set default values to prevent issues where it thinks values are set,
causing JSON read errors

Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 23:36:19 -07:00
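A sketch of the failure mode fcd6103e17 guards against: mapping variables are parsed as JSON, so an empty-string default must be treated as unset rather than handed to json.loads (the parsing details here are assumptions):

```python
import json
import os

raw = os.getenv("USER_MAPPING", "")
# An empty default means "not configured"; json.loads("") would raise JSONDecodeError
user_mapping = json.loads(raw) if raw else None
```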
Luigi311 ac5be474f8
Merge pull request #234 from luigi311/cleanup
Cleanup
2025-02-21 19:55:09 -07:00
Luis Garcia d15f29b772 Do not include test in docker image
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 19:52:12 -07:00
Luis Garcia c9944866f8 Remove extra print
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 19:51:34 -07:00
Luigi311 846e18fffe
Merge pull request #232 from luigi311/test_update
Update test validation to use loguru
2025-02-21 19:39:07 -07:00
Luis Garcia eb09de2bdf CI: Better validate logging 2025-02-21 19:37:04 -07:00
Luis Garcia c0e207924c CI: Enable trace
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 19:36:48 -07:00
Luigi311 e48533dfbd
Merge pull request #230 from luigi311/file_logfile
Fix logfile
2025-02-21 17:01:35 -07:00
Luis Garcia 8503b087b2 Fix logfile
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 16:59:50 -07:00
Luigi311 305fea8f9a
Merge pull request #229 from luigi311/loguru
Switch logging to loguru
2025-02-21 16:09:15 -07:00
Luis Garcia 588c23ce41 Switch logging to loguru
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 16:03:29 -07:00
Luigi311 8f4a2e2690
Merge pull request #228 from luigi311/misc
Misc
2025-02-21 14:54:10 -07:00
Luis Garcia 38e65f5a17 vscode: Fix deprecated python debug type
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 14:47:14 -07:00
Luis Garcia de32d59aa1 Better initial library filtering
Filter for only tv show and movie type libraries in plex and jellyfin.
Jellyfin no longer requires pulling in multiple different items and
instead uses the actual library category

Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 14:41:42 -07:00
Luis Garcia 998f2b1209 Formatting
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 14:41:19 -07:00
Luis Garcia 0b02f531c1 Force python 3.12 or greater
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-21 12:10:53 -07:00
Luigi311 e589935b37
Merge pull request #226 from luigi311/plex_episodes
Plex: only fetch watched or partially watched episodes
2025-02-19 20:09:41 -07:00
Luigi311 031d43e980
Merge pull request #227 from luigi311/uv
Swap to UV
2025-02-19 20:09:27 -07:00
Luis Garcia ba6cad13f6 Swap to UV 2025-02-19 19:48:16 -07:00
Luis Garcia f3801a0bd2 Plex: only fetch watched or partially watched episodes
Instead of fetching all episodes and checking each for watch status or
view time, it is faster to search for only watched and partially watched episodes.

Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-19 13:52:05 -07:00
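A hedged sketch of the idea with plexapi: ask the server for only watched or in-progress episodes instead of iterating all of them. The advanced-filter syntax (">>" meaning greater than) follows plexapi's documented operators, but the exact query is an assumption, not the project's code:

```python
from plexapi.server import PlexServer

plex = PlexServer("http://localhost:32400", "token")  # placeholder URL/token
shows = plex.library.section("TV Shows")

# Only episodes with a view count or a resume offset, not every episode
episodes = shows.search(
    libtype="episode",
    filters={"or": [{"viewCount>>": 0}, {"viewOffset>>": 0}]},
)
```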
Luigi311 196a49fca4
Merge pull request #224 from luigi311/pydantic
Use Pydantic for watch structure
2025-02-19 13:20:45 -07:00
Luis Garcia 4d0f1d303f Plex: Remove logging if locations or guids
Remove the logging of whether an item has locations or guids, as that
forces a fetch of that data, which defeats the purpose of the generate
guid/location variables

Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-19 12:44:21 -07:00
Luis Garcia ce5b810a5b Use pydantic for structure
Complete redesign of everything using pydantic to create the
watched structure. This brings in type checking support and
simplifies a lot of things

Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-19 10:51:10 -07:00
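A minimal sketch of what a pydantic-based watched structure can look like; the field names are guesses for illustration, not the project's actual models:

```python
from pydantic import BaseModel

class MediaStatus(BaseModel):
    completed: bool
    time: int  # playback position, e.g. in milliseconds

class MediaItem(BaseModel):
    title: str | None = None
    locations: list[str] = []
    status: MediaStatus

item = MediaItem(title="Pilot", locations=["Pilot.mkv"],
                 status={"completed": True, "time": 0})
print(item.model_dump())
```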
Luigi311 a1e1ccde42
Merge pull request #210 from luigi311/more_cleanup
More cleanup
2025-02-18 18:09:57 -07:00
Luis Garcia bf633c75d1 More typing
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-18 17:59:43 -07:00
Luis Garcia 46fa5e7c9a Use list instead of tuple for servers
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-18 17:59:43 -07:00
Luis Garcia 170757aca1 Add lots of static typing
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-18 17:59:43 -07:00
Luis Garcia 9786e9e27d Test: Split out black and white mapping tests
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-18 17:59:43 -07:00
Luis Garcia 8b691b7bfa Test: Remove duplicate black and white list test
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-18 17:59:38 -07:00
Luis Garcia e1c65fc082 Jellyfin/emby: Combine info/version, add typing for versions
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-18 17:45:21 -07:00
Luis Garcia 58749a4fb8 Plex: Improve variable names to avoid confusion
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-18 17:45:21 -07:00
Luis Garcia 51ec69f651 Simplify search_mapping
Signed-off-by: Luis Garcia <git@luigi311.com>
2025-02-18 17:45:21 -07:00
38 changed files with 3427 additions and 2633 deletions

.dockerignore

@@ -1,3 +1,4 @@
.venv
.dockerignore
.env
.env.sample
@@ -9,7 +10,4 @@
Dockerfile*
README.md
test
venv

117
.env Normal file

@@ -0,0 +1,117 @@
# Global Settings
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "False"
## Debugging level, "info" is default, "debug" is more verbose
#DEBUG_LEVEL = "DEBUG"
DEBUG_LEVEL = "INFO"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"
## How often to run the script in seconds
SLEEP_DURATION = "60"
## Log file where all output will be written to
LOG_FILE = "/mnt/log.log"
## Mark file where all shows/movies that have been marked as played will be written to
MARK_FILE = "/mnt/mark.log"
## Timeout for requests for jellyfin
REQUEST_TIMEOUT = 300
## Max threads for processing
MAX_THREADS = 1
## Generate guids/locations
## These are slow processes, so this is a way to speed things up
## If media servers are using the same files then you can enable only generate locations
## If media servers are using different files then you can enable only generate guids
## Default is to generate both
GENERATE_GUIDS = "True"
GENERATE_LOCATIONS = "True"
## Map usernames between servers in the event that they are different, order does not matter
## Comma separated for multiple options
# jellyfin: plex,plex
#USER_MAPPING = { "belandbroc": "debila,belan49", "debila,belan49": "belandbroc", "debila": "belandbroc", "belan49": "belandbroc" }
USER_MAPPING = { "belandbroc":"debila", "debila":"belandbroc", "debila":"belandbroc" }
## Map libraries between servers in the event that they are different, order does not matter
## Comma separated for multiple options
LIBRARY_MAPPING = { "TV Shows": "Scratch TV Shows", "Scratch TV Shows": "TV Shows" }
## Blacklisting/Whitelisting libraries, library types such as Movies/TV Shows, and users. Mappings apply, so if a mapping for the user or library exists then both will be excluded.
## Comma separated for multiple options
#BLACKLIST_LIBRARY = ""
WHITELIST_LIBRARY = "TV Shows,Scratch TV Shows,Movies"
#BLACKLIST_LIBRARY_TYPE = ""
#WHITELIST_LIBRARY_TYPE = ""
#BLACKLIST_USERS = ""
WHITELIST_USERS = "belandbroc,debila"
# Plex
## Recommended to use a token, as connecting directly to the server is faster than going through the plex servers
## URL of the plex server, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
PLEX_BASEURL = "http://192.168.0.86:32400"
## Plex token https://support.plex.tv/articles/204059436-finding-an-authentication-token-x-plex-token/
## Comma separated list for multiple servers
# PLEX_TOKEN = "vPGyuy6zWVCz6ZFyy8x1"
# # debila=debilapointe@gmail
PLEX_TOKEN = "S7gbVzAzH4ypN-4K7ta5"
# me
## If not using plex token then use username and password of the server admin along with the servername
## Comma separated for multiple options
#PLEX_USERNAME = "squeaky2x3@gmail.com"
#PLEX_PASSWORD = "qoDuGNsGsWRurOd5QFdRy2@"
#PLEX_SERVERNAME = "Scratch"
## Skip hostname validation for ssl certificates.
## Set to True if running into ssl certificate errors
SSL_BYPASS = "True"
# Jellyfin
## Jellyfin server URL, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
JELLYFIN_BASEURL = "https://jellyfin.home.blapointe.com"
## Jellyfin api token, created manually by logging in to the jellyfin server admin dashboard and creating an api key
## Comma separated list for multiple servers
JELLYFIN_TOKEN = "1dc766ce6ca44c53b773263a06889b96"
# # Emby
#
# ## Emby server URL, use hostname or IP address if the hostname is not resolving correctly
# ## Comma separated list for multiple servers
# EMBY_BASEURL = "http://localhost:8097"
#
# ## Emby api token, created manually by logging in to the Emby server admin dashboard and creating an api key
# ## Comma separated list for multiple servers
# EMBY_TOKEN = "SuperSecretToken"
# Syncing Options
## control the direction of syncing. e.g. SYNC_FROM_PLEX_TO_JELLYFIN set to true will cause updates from plex
## to be applied in jellyfin. SYNC_FROM_PLEX_TO_PLEX set to true will sync updates between multiple plex servers
SYNC_FROM_PLEX_TO_JELLYFIN = "True"
SYNC_FROM_PLEX_TO_PLEX = "False"
#SYNC_FROM_PLEX_TO_EMBY = "False"
SYNC_FROM_JELLYFIN_TO_PLEX = "False"
SYNC_FROM_JELLYFIN_TO_JELLYFIN = "False"
#SYNC_FROM_JELLYFIN_TO_EMBY = "False"
#SYNC_FROM_EMBY_TO_PLEX = "False"
#SYNC_FROM_EMBY_TO_JELLYFIN = "False"
#SYNC_FROM_EMBY_TO_EMBY = "False"

.env.sample

@@ -3,11 +3,8 @@
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "False"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "info"
DEBUG_LEVEL = "INFO"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "False"
@@ -16,7 +13,7 @@ RUN_ONLY_ONCE = "False"
SLEEP_DURATION = "3600"
## Log file where all output will be written to
LOGFILE = "log.log"
LOG_FILE = "log.log"
## Mark file where all shows/movies that have been marked as played will be written to
MARK_FILE = "mark.log"
@@ -24,26 +21,24 @@ MARK_FILE = "mark.log"
## Timeout for requests for jellyfin
REQUEST_TIMEOUT = 300
## Generate guids
## Generating guids is a slow process, so this is a way to speed up the process
## by using the location only, useful when using same files on multiple servers
GENERATE_GUIDS = "True"
## Generate locations
## Generating locations is a slow process, so this is a way to speed up the process
## by using the guid only, useful when using different files on multiple servers
GENERATE_LOCATIONS = "True"
## Max threads for processing
MAX_THREADS = 2
MAX_THREADS = 1
## Generate guids/locations
## These are slow processes, so this is a way to speed things up
## If media servers are using the same files then you can enable only generate locations
## If media servers are using different files then you can enable only generate guids
## Default is to generate both
GENERATE_GUIDS = "True"
GENERATE_LOCATIONS = "True"
## Map usernames between servers in the event that they are different, order does not matter
## Comma separated for multiple options
#USER_MAPPING = { "testuser2": "testuser3", "testuser1":"testuser4" }
USER_MAPPING = { "Username": "User", "Second User": "User Dos" }
## Map libraries between servers in the event that they are different, order does not matter
## Comma separated for multiple options
#LIBRARY_MAPPING = { "Shows": "TV Shows", "Movie": "Movies" }
LIBRARY_MAPPING = { "Shows": "TV Shows", "Movie": "Movies" }
## Blacklisting/Whitelisting libraries, library types such as Movies/TV Shows, and users. Mappings apply, so if a mapping for the user or library exists then both will be excluded.
## Comma separated for multiple options
@@ -52,7 +47,7 @@ MAX_THREADS = 2
#BLACKLIST_LIBRARY_TYPE = ""
#WHITELIST_LIBRARY_TYPE = ""
#BLACKLIST_USERS = ""
WHITELIST_USERS = "testuser1,testuser2"
#WHITELIST_USERS = ""
# Plex
@@ -96,7 +91,7 @@ EMBY_BASEURL = "http://localhost:8097"
## Emby api token, created manually by logging in to the Emby server admin dashboard and creating an api key
## Comma separated list for multiple servers
EMBY_TOKEN = "ed9507cba8d14d469ae4d58e33afc515"
EMBY_TOKEN = "SuperSecretToken"
# Syncing Options


@@ -19,28 +19,36 @@ jobs:
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
- name: Install uv
uses: astral-sh/setup-uv@v6
- name: "Set up Python"
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}
python-version-file: ".python-version"
- name: "Install dependencies"
run: pip install -r requirements.txt && pip install -r test/requirements.txt
run: uv sync --frozen
- name: "Run tests"
run: pytest -vvv
run: uv run pytest -vvv
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
- name: Install uv
uses: astral-sh/setup-uv@v6
- name: "Set up Python"
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}
python-version-file: ".python-version"
- name: "Install dependencies"
run: |
pip install -r requirements.txt
uv sync --frozen
sudo apt update && sudo apt install -y docker-compose
- name: "Checkout JellyPlex-Watched-CI"
@@ -62,54 +70,48 @@ jobs:
- name: "Test Plex"
run: |
mv test/ci_plex.env .env
python main.py
python test/validate_ci_marklog.py --plex
ENV_FILE="test/ci_plex.env" uv run main.py
uv run test/validate_ci_marklog.py --plex
rm mark.log
- name: "Test Jellyfin"
run: |
mv test/ci_jellyfin.env .env
python main.py
python test/validate_ci_marklog.py --jellyfin
ENV_FILE="test/ci_jellyfin.env" uv run main.py
uv run test/validate_ci_marklog.py --jellyfin
rm mark.log
- name: "Test Emby"
run: |
mv test/ci_emby.env .env
python main.py
python test/validate_ci_marklog.py --emby
ENV_FILE="test/ci_emby.env" uv run main.py
uv run test/validate_ci_marklog.py --emby
rm mark.log
- name: "Test Guids"
run: |
mv test/ci_guids.env .env
python main.py
python test/validate_ci_marklog.py --guids
ENV_FILE="test/ci_guids.env" uv run main.py
uv run test/validate_ci_marklog.py --guids
rm mark.log
- name: "Test Locations"
run: |
mv test/ci_locations.env .env
python main.py
python test/validate_ci_marklog.py --locations
ENV_FILE="test/ci_locations.env" uv run main.py
uv run test/validate_ci_marklog.py --locations
rm mark.log
- name: "Test writing to the servers"
run: |
# Test writing to the servers
mv test/ci_write.env .env
python main.py
ENV_FILE="test/ci_write.env" uv run main.py
# Test again to test if it can handle existing data
python main.py
ENV_FILE="test/ci_write.env" uv run main.py
python test/validate_ci_marklog.py --write
uv run test/validate_ci_marklog.py --write
rm mark.log
@@ -170,6 +172,7 @@ jobs:
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
if: "${{ env.DOCKER_USERNAME != '' }}"
id: docker_login
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
@@ -183,26 +186,14 @@ jobs:
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build
id: build
if: "${{ steps.docker_meta.outputs.tags == '' }}"
uses: docker/build-push-action@v5
with:
context: .
file: ${{ matrix.dockerfile }}
platforms: linux/amd64,linux/arm64
push: false
tags: jellyplex-watched:action
- name: Build Push
id: build_push
if: "${{ steps.docker_meta.outputs.tags != '' }}"
uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with:
context: .
file: ${{ matrix.dockerfile }}
platforms: linux/amd64,linux/arm64
push: true
push: ${{ steps.docker_login.outcome == 'success' }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}

3
.gitignore vendored

@@ -84,9 +84,6 @@ target/
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies

1
.python-version Normal file

@@ -0,0 +1 @@
3.13

4
.vscode/launch.json vendored

@@ -6,7 +6,7 @@
"configurations": [
{
"name": "Python: Main",
"type": "python",
"type": "debugpy",
"request": "launch",
"program": "main.py",
"console": "integratedTerminal",
@@ -14,7 +14,7 @@
},
{
"name": "Pytest",
"type": "python",
"type": "debugpy",
"request": "launch",
"module": "pytest",
"args": [


@@ -1,4 +1,4 @@
FROM python:3.13-alpine
FROM ghcr.io/astral-sh/uv:python3.13-alpine
ENV PUID=1000
ENV PGID=1000
@@ -36,14 +36,72 @@ RUN set -eux; \
WORKDIR /app
COPY ./requirements.txt ./
# Enable bytecode compilation
ENV UV_COMPILE_BYTECODE=1
RUN pip install --no-cache-dir -r requirements.txt
ENV UV_LINK_MODE=copy
# Set the cache directory to /tmp instead of root
ENV UV_CACHE_DIR=/tmp/.cache/uv
# Install the project's dependencies using the lockfile and settings
RUN --mount=type=cache,target=/tmp/.cache/uv \
--mount=type=bind,source=uv.lock,target=uv.lock \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
uv sync --frozen --no-install-project --no-dev
# Then, add the rest of the project source code and install it
# Installing separately from its dependencies allows optimal layer caching
COPY . /app
RUN --mount=type=cache,target=/tmp/.cache/uv \
uv sync --frozen --no-dev
# Place executables in the environment at the front of the path
ENV PATH="/app/.venv/bin:$PATH"
COPY . .
RUN chmod +x *.sh && \
dos2unix *.sh
# Set default values to prevent issues
ENV DRYRUN="True"
ENV DEBUG_LEVEL="INFO"
ENV RUN_ONLY_ONCE="False"
ENV SLEEP_DURATION=3600
ENV LOG_FILE="log.log"
ENV MARK_FILE="mark.log"
ENV REQUEST_TIME=300
ENV GENERATE_GUIDS="True"
ENV GENERATE_LOCATIONS="True"
ENV MAX_THREADS=1
ENV USER_MAPPING=""
ENV LIBRARY_MAPPING=""
ENV BLACKLIST_LIBRARY=""
ENV WHITELIST_LIBRARY=""
ENV BLACKLIST_LIBRARY_TYPE=""
ENV WHITELIST_LIBRARY_TYPE=""
ENV BLACKLIST_USERS=""
ENV WHITELIST_USERS=""
ENV PLEX_BASEURL=""
ENV PLEX_TOKEN=""
ENV PLEX_USERNAME=""
ENV PLEX_PASSWORD=""
ENV PLEX_SERVERNAME=""
ENV SSL_BYPASS="False"
ENV JELLYFIN_BASEURL=""
ENV JELLYFIN_TOKEN=""
ENV EMBY_BASEURL=""
ENV EMBY_TOKEN=""
ENV SYNC_FROM_PLEX_TO_JELLYFIN="True"
ENV SYNC_FROM_PLEX_TO_PLEX="True"
ENV SYNC_FROM_PLEX_TO_EMBY="True"
ENV SYNC_FROM_JELLYFIN_TO_PLEX="True"
ENV SYNC_FROM_JELLYFIN_TO_JELLYFIN="True"
ENV SYNC_FROM_JELLYFIN_TO_EMBY="True"
ENV SYNC_FROM_EMBY_TO_PLEX="True"
ENV SYNC_FROM_EMBY_TO_JELLYFIN="True"
ENV SYNC_FROM_EMBY_TO_EMBY="True"
ENTRYPOINT ["tini", "--", "/app/entrypoint.sh"]
CMD ["python", "-u", "main.py"]


@@ -1,4 +1,4 @@
FROM python:3.13-slim
FROM ghcr.io/astral-sh/uv:bookworm-slim
ENV PUID=1000
ENV PGID=1000
@@ -10,14 +10,72 @@ RUN apt-get update && \
WORKDIR /app
COPY ./requirements.txt ./
# Enable bytecode compilation
ENV UV_COMPILE_BYTECODE=1
RUN pip install --no-cache-dir -r requirements.txt
ENV UV_LINK_MODE=copy
COPY . .
# Set the cache directory to /tmp instead of root
ENV UV_CACHE_DIR=/tmp/.cache/uv
ENV UV_PYTHON_INSTALL_DIR=/app/.bin
# Install the project's dependencies using the lockfile and settings
RUN --mount=type=cache,target=/tmp/.cache/uv \
--mount=type=bind,source=uv.lock,target=uv.lock \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
uv sync --frozen --no-install-project --no-dev
# Then, add the rest of the project source code and install it
# Installing separately from its dependencies allows optimal layer caching
COPY . /app
RUN --mount=type=cache,target=/tmp/.cache/uv \
uv sync --frozen --no-dev
# Place executables in the environment at the front of the path
ENV PATH="/app/.venv/bin:$PATH"
RUN chmod +x *.sh && \
dos2unix *.sh
# Set default values to prevent issues
ENV DRYRUN="True"
ENV DEBUG_LEVEL="INFO"
ENV RUN_ONLY_ONCE="False"
ENV SLEEP_DURATION=3600
ENV LOG_FILE="log.log"
ENV MARK_FILE="mark.log"
ENV REQUEST_TIME=300
ENV GENERATE_GUIDS="True"
ENV GENERATE_LOCATIONS="True"
ENV MAX_THREADS=1
ENV USER_MAPPING=""
ENV LIBRARY_MAPPING=""
ENV BLACKLIST_LIBRARY=""
ENV WHITELIST_LIBRARY=""
ENV BLACKLIST_LIBRARY_TYPE=""
ENV WHITELIST_LIBRARY_TYPE=""
ENV BLACKLIST_USERS=""
ENV WHITELIST_USERS=""
ENV PLEX_BASEURL=""
ENV PLEX_TOKEN=""
ENV PLEX_USERNAME=""
ENV PLEX_PASSWORD=""
ENV PLEX_SERVERNAME=""
ENV SSL_BYPASS="False"
ENV JELLYFIN_BASEURL=""
ENV JELLYFIN_TOKEN=""
ENV EMBY_BASEURL=""
ENV EMBY_TOKEN=""
ENV SYNC_FROM_PLEX_TO_JELLYFIN="True"
ENV SYNC_FROM_PLEX_TO_PLEX="True"
ENV SYNC_FROM_PLEX_TO_EMBY="True"
ENV SYNC_FROM_JELLYFIN_TO_PLEX="True"
ENV SYNC_FROM_JELLYFIN_TO_JELLYFIN="True"
ENV SYNC_FROM_JELLYFIN_TO_EMBY="True"
ENV SYNC_FROM_EMBY_TO_PLEX="True"
ENV SYNC_FROM_EMBY_TO_JELLYFIN="True"
ENV SYNC_FROM_EMBY_TO_EMBY="True"
ENTRYPOINT ["/bin/tini", "--", "/app/entrypoint.sh"]
CMD ["python", "-u", "main.py"]

README.md

@@ -19,6 +19,7 @@ Keep in sync all your users watched history between jellyfin, plex and emby serv
- [x] One way/multi way sync
- [x] Sync watched
- [x] Sync in progress
- [ ] Sync view dates
### Jellyfin
@@ -29,6 +30,8 @@ Keep in sync all your users watched history between jellyfin, plex and emby serv
- [x] One way/multi way sync
- [x] Sync watched
- [x] Sync in progress
- [x] Sync view dates
### Emby
@@ -39,6 +42,8 @@ Keep in sync all your users watched history between jellyfin, plex and emby serv
- [x] One way/multi way sync
- [x] Sync watched
- [x] Sync in progress
- [x] Sync view dates
## Configuration
@@ -48,20 +53,18 @@ Full list of configuration options can be found in the [.env.sample](.env.sample
### Baremetal
- Setup virtualenv of your choice
- [Install uv](https://docs.astral.sh/uv/getting-started/installation/)
- Install dependencies
```bash
pip install -r requirements.txt
```
- Create a .env file similar to .env.sample, uncomment whitelist and blacklist if needed, fill in baseurls and tokens
- Create a .env file similar to .env.sample; fill in baseurls and tokens, and **remember to uncomment anything you wish to use** (e.g., user mapping, library mapping, black/whitelist). If you want to store your .env file somewhere else or under a different name, use the ENV_FILE variable to specify its location.
- Run
```bash
python main.py
uv run main.py
```
```bash
ENV_FILE="Test.env" uv run main.py
```
### Docker
@@ -104,6 +107,7 @@ Full list of configuration options can be found in the [.env.sample](.env.sample
- Configuration
- Do not use quotes around variables in docker compose
- If you are not running all 3 supported servers (Plex, Jellyfin, and Emby) simultaneously, make sure to comment out the server url and token of any server you aren't using.
## Contributing

7
entrypoint.sh Normal file → Executable file

@@ -50,12 +50,13 @@ echo "Starting JellyPlex-Watched with UID: $PUID and GID: $PGID"
# If root run as the created user
if [ "$(id -u)" = '0' ]; then
chown -R "$PUID:$PGID" /app/.venv
chown -R "$PUID:$PGID" "$LOG_DIR"
chown -R "$PUID:$PGID" "$MARK_DIR"
# Run the application as the created user
exec gosu "$PUID:$PGID" "$@"
else
# Run the application as the current user
exec "$@"
fi
# Run the application as the current user
exec "$@"

main.py

@@ -1,9 +1,9 @@
import sys
if __name__ == "__main__":
# Check python version 3.9 or higher
if not (3, 9) <= tuple(map(int, sys.version_info[:2])):
print("This script requires Python 3.9 or higher")
# Check python version 3.12 or higher
if not (3, 12) <= tuple(map(int, sys.version_info[:2])):
print("This script requires Python 3.12 or higher")
sys.exit(1)
from src.main import main

24
pyproject.toml Normal file

@@ -0,0 +1,24 @@
[project]
name = "jellyplex-watched"
version = "8.3.0"
description = "Sync watched between media servers locally"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
"loguru>=0.7.3",
"packaging==25.0",
"plexapi==4.17.1",
"pydantic==2.11.7",
"python-dotenv==1.1.1",
"requests==2.32.5",
]
[dependency-groups]
lint = [
"ruff>=0.12.3",
]
dev = [
"mypy>=1.16.1",
"pytest>=8.4.1",
"types-requests>=2.32.0.20250611",
]

Binary file not shown.

10
run.sh Normal file

@@ -0,0 +1,10 @@
#! /usr/bin/env bash
d=/tmp/jellyplex.d
mkdir -p $d
docker run --rm -it -v "$d":/mnt $(
if [ "${PWD##*/}" == JellyPlex-Watched ]; then
echo "-v $PWD/src:/app/src"
fi
) -v $PWD/.env:/app/.env \
luigi311/jellyplex-watched:latest

src/black_white.py

@ -1,16 +1,18 @@
from src.functions import logger, search_mapping
from loguru import logger
from src.functions import search_mapping
def setup_black_white_lists(
blacklist_library: str,
whitelist_library: str,
blacklist_library_type: str,
whitelist_library_type: str,
blacklist_users: str,
whitelist_users: str,
library_mapping=None,
user_mapping=None,
):
blacklist_library: list[str] | None,
whitelist_library: list[str] | None,
blacklist_library_type: list[str] | None,
whitelist_library_type: list[str] | None,
blacklist_users: list[str] | None,
whitelist_users: list[str] | None,
library_mapping: dict[str, str] | None = None,
user_mapping: dict[str, str] | None = None,
) -> tuple[list[str], list[str], list[str], list[str], list[str], list[str]]:
blacklist_library, blacklist_library_type, blacklist_users = setup_x_lists(
blacklist_library,
blacklist_library_type,
@@ -40,53 +42,44 @@ def setup_black_white_lists(
def setup_x_lists(
xlist_library,
xlist_library_type,
xlist_users,
xlist_type,
library_mapping=None,
user_mapping=None,
):
xlist_library: list[str] | None,
xlist_library_type: list[str] | None,
xlist_users: list[str] | None,
xlist_type: str | None,
library_mapping: dict[str, str] | None = None,
user_mapping: dict[str, str] | None = None,
) -> tuple[list[str], list[str], list[str]]:
out_library: list[str] = []
if xlist_library:
if len(xlist_library) > 0:
xlist_library = xlist_library.split(",")
xlist_library = [x.strip() for x in xlist_library]
if library_mapping:
temp_library = []
for library in xlist_library:
library_other = search_mapping(library_mapping, library)
if library_other:
temp_library.append(library_other)
out_library = [x.strip() for x in xlist_library]
if library_mapping:
temp_library: list[str] = []
for library in xlist_library:
library_other = search_mapping(library_mapping, library)
if library_other:
temp_library.append(library_other)
xlist_library = xlist_library + temp_library
else:
xlist_library = []
logger(f"{xlist_type}list Library: {xlist_library}", 1)
out_library = out_library + temp_library
logger.info(f"{xlist_type}list Library: {xlist_library}")
out_library_type: list[str] = []
if xlist_library_type:
if len(xlist_library_type) > 0:
xlist_library_type = xlist_library_type.split(",")
xlist_library_type = [x.lower().strip() for x in xlist_library_type]
else:
xlist_library_type = []
logger(f"{xlist_type}list Library Type: {xlist_library_type}", 1)
out_library_type = [x.lower().strip() for x in xlist_library_type]
logger.info(f"{xlist_type}list Library Type: {out_library_type}")
out_users: list[str] = []
if xlist_users:
if len(xlist_users) > 0:
xlist_users = xlist_users.split(",")
xlist_users = [x.lower().strip() for x in xlist_users]
if user_mapping:
temp_users = []
for user in xlist_users:
user_other = search_mapping(user_mapping, user)
if user_other:
temp_users.append(user_other)
out_users = [x.lower().strip() for x in xlist_users]
if user_mapping:
temp_users: list[str] = []
for user in out_users:
user_other = search_mapping(user_mapping, user)
if user_other:
temp_users.append(user_other)
xlist_users = xlist_users + temp_users
else:
xlist_users = []
else:
xlist_users = []
logger(f"{xlist_type}list Users: {xlist_users}", 1)
out_users = out_users + temp_users
return xlist_library, xlist_library_type, xlist_users
logger.info(f"{xlist_type}list Users: {out_users}")
return out_library, out_library_type, out_users

src/connection.py

@@ -1,68 +1,65 @@
import os
from dotenv import load_dotenv
from typing import Literal
from loguru import logger
from src.functions import logger, str_to_bool
from src.functions import str_to_bool, get_env_value
from src.plex import Plex
from src.jellyfin import Jellyfin
from src.emby import Emby
load_dotenv(override=True)
def jellyfin_emby_server_connection(
env,
server_baseurl: str,
server_token: str,
server_type: Literal["jellyfin", "emby"],
) -> list[Jellyfin | Emby]:
servers: list[Jellyfin | Emby] = []
server: Jellyfin | Emby
def jellyfin_emby_server_connection(server_baseurl, server_token, server_type):
servers = []
server_baseurls = server_baseurl.split(",")
server_tokens = server_token.split(",")
server_baseurl = server_baseurl.split(",")
server_token = server_token.split(",")
if len(server_baseurl) != len(server_token):
if len(server_baseurls) != len(server_tokens):
raise Exception(
f"{server_type.upper()}_BASEURL and {server_type.upper()}_TOKEN must have the same number of entries"
)
for i, baseurl in enumerate(server_baseurl):
baseurl = baseurl.strip()
if baseurl[-1] == "/":
baseurl = baseurl[:-1]
for i, base_url in enumerate(server_baseurls):
base_url = base_url.strip()
if base_url[-1] == "/":
base_url = base_url[:-1]
if server_type == "jellyfin":
server = Jellyfin(baseurl=baseurl, token=server_token[i].strip())
servers.append(
(
"jellyfin",
server,
)
server = Jellyfin(
env=env, base_url=base_url, token=server_tokens[i].strip()
)
servers.append(server)
elif server_type == "emby":
server = Emby(baseurl=baseurl, token=server_token[i].strip())
servers.append(
(
"emby",
server,
)
)
server = Emby(env=env, base_url=base_url, token=server_tokens[i].strip())
servers.append(server)
else:
raise Exception("Unknown server type")
logger(f"{server_type} Server {i} info: {server.info()}", 3)
logger.debug(f"{server_type} Server {i} info: {server.info()}")
return servers
def generate_server_connections():
servers = []
def generate_server_connections(env) -> list[Plex | Jellyfin | Emby]:
servers: list[Plex | Jellyfin | Emby] = []
plex_baseurl = os.getenv("PLEX_BASEURL", None)
plex_token = os.getenv("PLEX_TOKEN", None)
plex_username = os.getenv("PLEX_USERNAME", None)
plex_password = os.getenv("PLEX_PASSWORD", None)
plex_servername = os.getenv("PLEX_SERVERNAME", None)
ssl_bypass = str_to_bool(os.getenv("SSL_BYPASS", "False"))
plex_baseurl_str: str | None = get_env_value(env, "PLEX_BASEURL", None)
plex_token_str: str | None = get_env_value(env, "PLEX_TOKEN", None)
plex_username_str: str | None = get_env_value(env, "PLEX_USERNAME", None)
plex_password_str: str | None = get_env_value(env, "PLEX_PASSWORD", None)
plex_servername_str: str | None = get_env_value(env, "PLEX_SERVERNAME", None)
ssl_bypass = str_to_bool(get_env_value(env, "SSL_BYPASS", "False"))
if plex_baseurl and plex_token:
plex_baseurl = plex_baseurl.split(",")
plex_token = plex_token.split(",")
print(f"if plex_baseurl_str={plex_baseurl_str} and plex_token_str={plex_token_str}")
if plex_baseurl_str and plex_token_str:
plex_baseurl = plex_baseurl_str.split(",")
plex_token = plex_token_str.split(",")
if len(plex_baseurl) != len(plex_token):
raise Exception(
@@ -70,28 +67,25 @@ def generate_server_connections():
)
for i, url in enumerate(plex_baseurl):
print(f"Plex({url.strip()}, {plex_token[i].strip()})")
server = Plex(
baseurl=url.strip(),
env,
base_url=url.strip(),
token=plex_token[i].strip(),
username=None,
user_name=None,
password=None,
servername=None,
server_name=None,
ssl_bypass=ssl_bypass,
)
logger(f"Plex Server {i} info: {server.info()}", 3)
logger.debug(f"Plex Server {i} info: {server.info()}")
servers.append(
(
"plex",
server,
)
)
servers.append(server)
if plex_username and plex_password and plex_servername:
plex_username = plex_username.split(",")
plex_password = plex_password.split(",")
plex_servername = plex_servername.split(",")
if plex_username_str and plex_password_str and plex_servername_str:
plex_username = plex_username_str.split(",")
plex_password = plex_password_str.split(",")
plex_servername = plex_servername_str.split(",")
if len(plex_username) != len(plex_password) or len(plex_username) != len(
plex_servername
@@ -102,38 +96,32 @@ def generate_server_connections():
for i, username in enumerate(plex_username):
server = Plex(
baseurl=None,
env,
base_url=None,
token=None,
username=username.strip(),
user_name=username.strip(),
password=plex_password[i].strip(),
servername=plex_servername[i].strip(),
server_name=plex_servername[i].strip(),
ssl_bypass=ssl_bypass,
)
logger(f"Plex Server {i} info: {server.info()}", 3)
servers.append(
(
"plex",
server,
)
)
jellyfin_baseurl = os.getenv("JELLYFIN_BASEURL", None)
jellyfin_token = os.getenv("JELLYFIN_TOKEN", None)
logger.debug(f"Plex Server {i} info: {server.info()}")
servers.append(server)
jellyfin_baseurl = get_env_value(env, "JELLYFIN_BASEURL", None)
jellyfin_token = get_env_value(env, "JELLYFIN_TOKEN", None)
if jellyfin_baseurl and jellyfin_token:
servers.extend(
jellyfin_emby_server_connection(
jellyfin_baseurl, jellyfin_token, "jellyfin"
env, jellyfin_baseurl, jellyfin_token, "jellyfin"
)
)
emby_baseurl = os.getenv("EMBY_BASEURL", None)
emby_token = os.getenv("EMBY_TOKEN", None)
emby_baseurl = get_env_value(env, "EMBY_BASEURL", None)
emby_token = get_env_value(env, "EMBY_TOKEN", None)
if emby_baseurl and emby_token:
servers.extend(
jellyfin_emby_server_connection(emby_baseurl, emby_token, "emby")
jellyfin_emby_server_connection(env, emby_baseurl, emby_token, "emby")
)
return servers

src/emby.py

@@ -1,9 +1,10 @@
from src.jellyfin_emby import JellyfinEmby
from packaging import version
from packaging.version import parse, Version
from loguru import logger
class Emby(JellyfinEmby):
def __init__(self, baseurl, token):
def __init__(self, env, base_url: str, token: str) -> None:
authorization = (
"Emby , "
'Client="JellyPlex-Watched", '
@@ -18,8 +19,14 @@ class Emby(JellyfinEmby):
}
super().__init__(
server_type="Emby", baseurl=baseurl, token=token, headers=headers
env, server_type="Emby", base_url=base_url, token=token, headers=headers
)
def is_partial_update_supported(self, server_version):
return server_version > version.parse("4.4")
def is_partial_update_supported(self, server_version: Version) -> bool:
if not server_version >= parse("4.4"):
logger.info(
f"{self.server_type}: Server version {server_version} does not support updating playback position.",
)
return False
return True

src/functions.py

@@ -1,40 +1,12 @@
import os
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import Future, ThreadPoolExecutor
from typing import Any, Callable
from dotenv import load_dotenv
import re
from pathlib import PureWindowsPath, PurePosixPath
load_dotenv(override=True)
log_file = os.getenv("LOG_FILE", os.getenv("LOGFILE", "log.log"))
mark_file = os.getenv("MARK_FILE", os.getenv("MARKFILE", "mark.log"))
def logger(message: str, log_type=0):
debug = str_to_bool(os.getenv("DEBUG", "False"))
debug_level = os.getenv("DEBUG_LEVEL", "info").lower()
output = str(message)
if log_type == 0:
pass
elif log_type == 1 and (debug and debug_level in ("info", "debug")):
output = f"[INFO]: {output}"
elif log_type == 2:
output = f"[ERROR]: {output}"
elif log_type == 3 and (debug and debug_level == "debug"):
output = f"[DEBUG]: {output}"
elif log_type == 4:
output = f"[WARNING]: {output}"
elif log_type == 5:
output = f"[MARK]: {output}"
elif log_type == 6:
output = f"[DRYRUN]: {output}"
else:
output = None
if output is not None:
print(output)
with open(f"{log_file}", "a", encoding="utf-8") as file:
file.write(output + "\n")
def log_marked(
server_type: str,
@@ -42,12 +14,10 @@ def log_marked(
username: str,
library: str,
movie_show: str,
episode: str = None,
duration=None,
):
if mark_file is None:
return
episode: str | None = None,
duration: float | None = None,
mark_file: str = "mark.log",
) -> None:
output = f"{server_type}/{server_name}/{username}/{library}/{movie_show}"
if episode:
@@ -56,35 +26,29 @@
if duration:
output += f"/{duration}"
with open(f"{mark_file}", "a", encoding="utf-8") as file:
with open(mark_file, "a", encoding="utf-8") as file:
file.write(output + "\n")
def get_env_value(env, key: str, default: Any = None):
if env and key in env:
return env[key]
elif os.getenv(key):
return os.getenv(key)
else:
return default
# Reimplementation of distutils.util.strtobool due to it being deprecated
# Source: https://github.com/PostHog/posthog/blob/01e184c29d2c10c43166f1d40a334abbc3f99d8a/posthog/utils.py#L668
def str_to_bool(value: any) -> bool:
def str_to_bool(value: str | None) -> bool:
if not value:
return False
return str(value).lower() in ("y", "yes", "t", "true", "on", "1")
# Search for nested element in list
def contains_nested(element, lst):
if lst is None:
return None
for i, item in enumerate(lst):
if item is None:
continue
if element in item:
return i
elif element == item:
return i
return None
# Get mapped value
def search_mapping(dictionary: dict, key_value: str):
def search_mapping(dictionary: dict[str, str], key_value: str) -> str | None:
if key_value in dictionary.keys():
return dictionary[key_value]
elif key_value.lower() in dictionary.keys():
@@ -100,8 +64,10 @@ def search_mapping(dictionary: dict, key_value: str):
# Return list of objects that exist in both lists including mappings
def match_list(list1, list2, list_mapping=None):
output = []
def match_list(
list1: list[str], list2: list[str], list_mapping: dict[str, str] | None = None
) -> list[str]:
output: list[str] = []
for element in list1:
if element in list2:
output.append(element)
@@ -114,35 +80,59 @@
def future_thread_executor(
args: list, threads: int = None, override_threads: bool = False
):
futures_list = []
results = []
args: list[tuple[Callable[..., Any], ...]],
threads: int | None = None,
override_threads: bool = False,
max_threads: int | None = None,
) -> list[Any]:
results: list[Any] = []
workers = min(int(os.getenv("MAX_THREADS", 32)), os.cpu_count() * 2)
if threads:
# Determine the number of workers, defaulting to 1 if os.cpu_count() returns None
cpu_threads: int = os.cpu_count() or 1 # Default to 1 if os.cpu_count() is None
workers: int = min(max_threads, cpu_threads * 2) if max_threads else cpu_threads * 2
# Adjust workers based on threads parameter and override_threads flag
if threads is not None:
workers = min(threads, workers)
if override_threads:
workers = threads
workers = threads if threads is not None else workers
# If only one worker, run in main thread to avoid overhead
if workers == 1:
results = []
for arg in args:
results.append(arg[0](*arg[1:]))
return results
with ThreadPoolExecutor(max_workers=workers) as executor:
futures_list: list[Future[Any]] = []
for arg in args:
# * arg unpacks the list into actual arguments
futures_list.append(executor.submit(*arg))
for future in futures_list:
for out in futures_list:
try:
result = future.result()
result = out.result()
results.append(result)
except Exception as e:
raise Exception(e)
return results
def parse_string_to_list(string: str | None) -> list[str]:
output: list[str] = []
if string and len(string) > 0:
output = string.split(",")
return output
_WINDOWS_DRIVE = re.compile(r"^[A-Za-z]:") # e.g. C: D:
def filename_from_any_path(p: str) -> str:
# Windows-y if UNC (\\server\share), drive letter, or has backslashes
if p.startswith("\\\\") or _WINDOWS_DRIVE.match(p) or ("\\" in p and "/" not in p):
return PureWindowsPath(p).name
return PurePosixPath(p).name

src/jellyfin.py

@@ -1,9 +1,10 @@
from src.jellyfin_emby import JellyfinEmby
from packaging import version
from packaging.version import parse, Version
from loguru import logger
class Jellyfin(JellyfinEmby):
def __init__(self, baseurl, token):
def __init__(self, env, base_url: str, token: str) -> None:
authorization = (
"MediaBrowser , "
'Client="JellyPlex-Watched", '
@@ -18,8 +19,14 @@ class Jellyfin(JellyfinEmby):
}
super().__init__(
server_type="Jellyfin", baseurl=baseurl, token=token, headers=headers
env, server_type="Jellyfin", base_url=base_url, token=token, headers=headers
)
def is_partial_update_supported(self, server_version):
return server_version >= version.parse("10.9.0")
def is_partial_update_supported(self, server_version: Version) -> bool:
if not server_version >= parse("10.9.0"):
logger.info(
f"{self.server_type}: Server version {server_version} does not support updating playback position.",
)
return False
return True

File diff suppressed because it is too large.

src/library.py

@@ -1,19 +1,24 @@
from loguru import logger
from src.functions import (
logger,
match_list,
search_mapping,
)
from src.emby import Emby
from src.jellyfin import Jellyfin
from src.plex import Plex
def check_skip_logic(
library_title,
library_type,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping=None,
):
library_title: str,
library_type: str,
blacklist_library: list[str],
whitelist_library: list[str],
blacklist_library_type: list[str],
whitelist_library_type: list[str],
library_mapping: dict[str, str] | None = None,
) -> str | None:
skip_reason = None
library_other = None
if library_mapping:
@@ -48,12 +53,12 @@ def check_skip_logic(
def check_blacklist_logic(
library_title,
library_type,
blacklist_library,
blacklist_library_type,
library_other=None,
):
library_title: str,
library_type: str,
blacklist_library: list[str],
blacklist_library_type: list[str],
library_other: str | None = None,
) -> str | None:
skip_reason = None
if isinstance(library_type, (list, tuple, set)):
for library_type_item in library_type:
@@ -84,12 +89,12 @@ def check_blacklist_logic(
def check_whitelist_logic(
library_title,
library_type,
whitelist_library,
whitelist_library_type,
library_other=None,
):
library_title: str,
library_type: str,
whitelist_library: list[str],
whitelist_library_type: list[str],
library_other: str | None = None,
) -> str | None:
skip_reason = None
if len(whitelist_library_type) > 0:
if isinstance(library_type, (list, tuple, set)):
@@ -131,14 +136,14 @@ def check_whitelist_logic(
def filter_libaries(
server_libraries,
blacklist_library,
blacklist_library_type,
whitelist_library,
whitelist_library_type,
library_mapping=None,
):
filtered_libaries = []
server_libraries: dict[str, str],
blacklist_library: list[str],
blacklist_library_type: list[str],
whitelist_library: list[str],
whitelist_library_type: list[str],
library_mapping: dict[str, str] | None = None,
) -> list[str]:
filtered_libaries: list[str] = []
for library in server_libraries:
skip_reason = check_skip_logic(
library,
@@ -151,7 +156,7 @@ def filter_libaries(
)
if skip_reason:
logger(f"Skipping library {library}: {skip_reason}", 1)
logger.info(f"Skipping library {library}: {skip_reason}")
continue
filtered_libaries.append(library)
@@ -160,18 +165,19 @@ def filter_libaries(
def setup_libraries(
server_1,
server_2,
blacklist_library,
blacklist_library_type,
whitelist_library,
whitelist_library_type,
library_mapping=None,
):
server_1: Plex | Jellyfin | Emby,
server_2: Plex | Jellyfin | Emby,
blacklist_library: list[str],
blacklist_library_type: list[str],
whitelist_library: list[str],
whitelist_library_type: list[str],
library_mapping: dict[str, str] | None = None,
) -> tuple[list[str], list[str]]:
server_1_libraries = server_1.get_libraries()
server_2_libraries = server_2.get_libraries()
logger(f"Server 1 libraries: {server_1_libraries}", 1)
logger(f"Server 2 libraries: {server_2_libraries}", 1)
logger.debug(f"{server_1.server_type}: Libraries and types {server_1_libraries}")
logger.debug(f"{server_2.server_type}: Libraries and types {server_2_libraries}")
# Filter out all blacklist, whitelist libraries
filtered_server_1_libraries = filter_libaries(
@@ -199,139 +205,3 @@ def setup_libraries(
)
return output_server_1_libaries, output_server_2_libaries
def show_title_dict(user_list: dict):
try:
show_output_dict = {}
show_output_dict["locations"] = []
show_counter = 0 # Initialize a counter for the current show position
show_output_keys = user_list.keys()
show_output_keys = [dict(x) for x in list(show_output_keys)]
for show_key in show_output_keys:
for provider_key, provider_value in show_key.items():
# Skip title
if provider_key.lower() == "title":
continue
if provider_key.lower() not in show_output_dict:
show_output_dict[provider_key.lower()] = [None] * show_counter
if provider_key.lower() == "locations":
show_output_dict[provider_key.lower()].append(provider_value)
else:
show_output_dict[provider_key.lower()].append(
provider_value.lower()
)
show_counter += 1
for key in show_output_dict:
if len(show_output_dict[key]) < show_counter:
show_output_dict[key].append(None)
return show_output_dict
except Exception:
return {}
def episode_title_dict(user_list: dict):
try:
episode_output_dict = {}
episode_output_dict["completed"] = []
episode_output_dict["time"] = []
episode_output_dict["locations"] = []
episode_output_dict["show"] = []
episode_counter = 0 # Initialize a counter for the current episode position
# Iterate through the shows and episodes in user_list
for show in user_list:
for episode in user_list[show]:
# Add the show title to the episode_output_dict if it doesn't exist
if "show" not in episode_output_dict:
episode_output_dict["show"] = [None] * episode_counter
# Add the show title to the episode_output_dict
episode_output_dict["show"].append(dict(show))
# Iterate through the keys and values in each episode
for episode_key, episode_value in episode.items():
# If the key is not "status", add the key to episode_output_dict if it doesn't exist
if episode_key != "status":
if episode_key.lower() not in episode_output_dict:
# Initialize the list with None values up to the current episode position
episode_output_dict[episode_key.lower()] = [
None
] * episode_counter
# If the key is "locations", append each location to the list
if episode_key == "locations":
episode_output_dict[episode_key.lower()].append(episode_value)
# If the key is "status", append the "completed" and "time" values
elif episode_key == "status":
episode_output_dict["completed"].append(
episode_value["completed"]
)
episode_output_dict["time"].append(episode_value["time"])
# For other keys, append the value to the list
else:
episode_output_dict[episode_key.lower()].append(
episode_value.lower()
)
# Increment the episode_counter
episode_counter += 1
# Extend the lists in episode_output_dict with None values to match the current episode_counter
for key in episode_output_dict:
if len(episode_output_dict[key]) < episode_counter:
episode_output_dict[key].append(None)
return episode_output_dict
except Exception:
return {}
def movies_title_dict(user_list: dict):
try:
movies_output_dict = {}
movies_output_dict["completed"] = []
movies_output_dict["time"] = []
movies_output_dict["locations"] = []
movie_counter = 0 # Initialize a counter for the current movie position
for movie in user_list:
for movie_key, movie_value in movie.items():
if movie_key != "status":
if movie_key.lower() not in movies_output_dict:
movies_output_dict[movie_key.lower()] = []
if movie_key == "locations":
movies_output_dict[movie_key.lower()].append(movie_value)
elif movie_key == "status":
movies_output_dict["completed"].append(movie_value["completed"])
movies_output_dict["time"].append(movie_value["time"])
else:
movies_output_dict[movie_key.lower()].append(movie_value.lower())
movie_counter += 1
for key in movies_output_dict:
if len(movies_output_dict[key]) < movie_counter:
movies_output_dict[key].append(None)
return movies_output_dict
except Exception:
return {}
def generate_library_guids_dict(user_list: dict):
# Handle the case where user_list is empty or does not contain the expected keys and values
if not user_list:
return {}, {}, {}
show_output_dict = show_title_dict(user_list)
episode_output_dict = episode_title_dict(user_list)
movies_output_dict = movies_title_dict(user_list)
return show_output_dict, episode_output_dict, movies_output_dict

src/main.py

@@ -1,114 +1,156 @@
import os, traceback, json
from dotenv import load_dotenv
import os
import traceback
import json
import sys
from dotenv import dotenv_values
from time import sleep, perf_counter
from loguru import logger
from src.emby import Emby
from src.jellyfin import Jellyfin
from src.plex import Plex
from src.library import setup_libraries
from src.functions import (
logger,
parse_string_to_list,
str_to_bool,
get_env_value,
)
from src.users import setup_users
from src.watched import (
cleanup_watched,
merge_server_watched,
)
from src.black_white import setup_black_white_lists
from src.connection import generate_server_connections
load_dotenv(override=True)
def configure_logger(log_file: str = "log.log", debug_level: str = "INFO") -> None:
# Remove default logger to configure our own
logger.remove()
# Validate the requested log level; add a stdout sink first so the error below is visible.
if debug_level not in ["INFO", "DEBUG", "TRACE"]:
logger.add(sys.stdout)
raise Exception(
f"Invalid DEBUG_LEVEL {debug_level}, please choose between INFO, DEBUG, TRACE"
)
# Add a sink for file logging and the console.
logger.add(log_file, level=debug_level, mode="w")
logger.add(sys.stdout, level=debug_level)
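A brief usage sketch (argument values are illustrative): because the file sink is added with mode="w", each call truncates the log, which is why main() below reconfigures the logger at the top of every loop.

configure_logger(log_file="log.log", debug_level="DEBUG")
logger.debug("written to log.log and echoed to stdout")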
def should_sync_server(server_1_type, server_2_type):
def should_sync_server(
env,
server_1: Plex | Jellyfin | Emby,
server_2: Plex | Jellyfin | Emby,
) -> bool:
sync_from_plex_to_jellyfin = str_to_bool(
os.getenv("SYNC_FROM_PLEX_TO_JELLYFIN", "True")
get_env_value(env, "SYNC_FROM_PLEX_TO_JELLYFIN", "True")
)
sync_from_plex_to_plex = str_to_bool(
get_env_value(env, "SYNC_FROM_PLEX_TO_PLEX", "True")
)
sync_from_plex_to_emby = str_to_bool(
get_env_value(env, "SYNC_FROM_PLEX_TO_EMBY", "True")
)
sync_from_plex_to_plex = str_to_bool(os.getenv("SYNC_FROM_PLEX_TO_PLEX", "True"))
sync_from_plex_to_emby = str_to_bool(os.getenv("SYNC_FROM_PLEX_TO_EMBY", "True"))
sync_from_jelly_to_plex = str_to_bool(
os.getenv("SYNC_FROM_JELLYFIN_TO_PLEX", "True")
get_env_value(env, "SYNC_FROM_JELLYFIN_TO_PLEX", "True")
)
sync_from_jelly_to_jellyfin = str_to_bool(
os.getenv("SYNC_FROM_JELLYFIN_TO_JELLYFIN", "True")
get_env_value(env, "SYNC_FROM_JELLYFIN_TO_JELLYFIN", "True")
)
sync_from_jelly_to_emby = str_to_bool(
os.getenv("SYNC_FROM_JELLYFIN_TO_EMBY", "True")
get_env_value(env, "SYNC_FROM_JELLYFIN_TO_EMBY", "True")
)
sync_from_emby_to_plex = str_to_bool(os.getenv("SYNC_FROM_EMBY_TO_PLEX", "True"))
sync_from_emby_to_plex = str_to_bool(
get_env_value(env, "SYNC_FROM_EMBY_TO_PLEX", "True")
)
sync_from_emby_to_jellyfin = str_to_bool(
os.getenv("SYNC_FROM_EMBY_TO_JELLYFIN", "True")
get_env_value(env, "SYNC_FROM_EMBY_TO_JELLYFIN", "True")
)
sync_from_emby_to_emby = str_to_bool(
get_env_value(env, "SYNC_FROM_EMBY_TO_EMBY", "True")
)
sync_from_emby_to_emby = str_to_bool(os.getenv("SYNC_FROM_EMBY_TO_EMBY", "True"))
if server_1_type == "plex":
if server_2_type == "jellyfin" and not sync_from_plex_to_jellyfin:
logger("Sync from plex -> jellyfin is disabled", 1)
if isinstance(server_1, Plex):
if isinstance(server_2, Jellyfin) and not sync_from_plex_to_jellyfin:
logger.info("Sync from plex -> jellyfin is disabled")
return False
if server_2_type == "emby" and not sync_from_plex_to_emby:
logger("Sync from plex -> emby is disabled", 1)
if isinstance(server_2, Emby) and not sync_from_plex_to_emby:
logger.info("Sync from plex -> emby is disabled")
return False
if server_2_type == "plex" and not sync_from_plex_to_plex:
logger("Sync from plex -> plex is disabled", 1)
if isinstance(server_2, Plex) and not sync_from_plex_to_plex:
logger.info("Sync from plex -> plex is disabled")
return False
if server_1_type == "jellyfin":
if server_2_type == "plex" and not sync_from_jelly_to_plex:
logger("Sync from jellyfin -> plex is disabled", 1)
if isinstance(server_1, Jellyfin):
if isinstance(server_2, Plex) and not sync_from_jelly_to_plex:
logger.info("Sync from jellyfin -> plex is disabled")
return False
if server_2_type == "jellyfin" and not sync_from_jelly_to_jellyfin:
logger("Sync from jellyfin -> jellyfin is disabled", 1)
if isinstance(server_2, Jellyfin) and not sync_from_jelly_to_jellyfin:
logger.info("Sync from jellyfin -> jellyfin is disabled")
return False
if server_2_type == "emby" and not sync_from_jelly_to_emby:
logger("Sync from jellyfin -> emby is disabled", 1)
if isinstance(server_2, Emby) and not sync_from_jelly_to_emby:
logger.info("Sync from jellyfin -> emby is disabled")
return False
if server_1_type == "emby":
if server_2_type == "plex" and not sync_from_emby_to_plex:
logger("Sync from emby -> plex is disabled", 1)
if isinstance(server_1, Emby):
if isinstance(server_2, Plex) and not sync_from_emby_to_plex:
logger.info("Sync from emby -> plex is disabled")
return False
if server_2_type == "jellyfin" and not sync_from_emby_to_jellyfin:
logger("Sync from emby -> jellyfin is disabled", 1)
if isinstance(server_2, Jellyfin) and not sync_from_emby_to_jellyfin:
logger.info("Sync from emby -> jellyfin is disabled")
return False
if server_2_type == "emby" and not sync_from_emby_to_emby:
logger("Sync from emby -> emby is disabled", 1)
if isinstance(server_2, Emby) and not sync_from_emby_to_emby:
logger.info("Sync from emby -> emby is disabled")
return False
return True
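All nine SYNC_FROM_*_TO_* flags default to "True", so a single env entry is enough to disable one direction while leaving the reverse active. A minimal sketch with a plain dict standing in for the parsed env file (the truthy-string parsing mirrors what str_to_bool is assumed to do):

env = {"SYNC_FROM_PLEX_TO_JELLYFIN": "False"}
plex_to_jelly = env.get("SYNC_FROM_PLEX_TO_JELLYFIN", "True").lower() == "true"
jelly_to_plex = env.get("SYNC_FROM_JELLYFIN_TO_PLEX", "True").lower() == "true"
print(plex_to_jelly)  # False -> plex -> jellyfin is skipped
print(jelly_to_plex)  # True  -> jellyfin -> plex still runs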
def main_loop():
log_file = os.getenv("LOG_FILE", os.getenv("LOGFILE", "log.log"))
# Delete log_file if it exists
if os.path.exists(log_file):
os.remove(log_file)
def main_loop(env) -> None:
dryrun = str_to_bool(get_env_value(env, "DRYRUN", "False"))
logger.info(f"Dryrun: {dryrun}")
dryrun = str_to_bool(os.getenv("DRYRUN", "False"))
logger(f"Dryrun: {dryrun}", 1)
user_mapping_env = get_env_value(env, "USER_MAPPING", None)
user_mapping = None
if user_mapping_env:
user_mapping = json.loads(user_mapping_env.lower())
logger.info(f"User Mapping: {user_mapping}")
user_mapping = os.getenv("USER_MAPPING")
if user_mapping:
user_mapping = json.loads(user_mapping.lower())
logger(f"User Mapping: {user_mapping}", 1)
library_mapping = os.getenv("LIBRARY_MAPPING")
if library_mapping:
library_mapping = json.loads(library_mapping)
logger(f"Library Mapping: {library_mapping}", 1)
library_mapping_env = get_env_value(env, "LIBRARY_MAPPING", None)
library_mapping = None
if library_mapping_env:
library_mapping = json.loads(library_mapping_env)
logger.info(f"Library Mapping: {library_mapping}")
# Create (black/white)lists
logger("Creating (black/white)lists", 1)
blacklist_library = os.getenv("BLACKLIST_LIBRARY", None)
whitelist_library = os.getenv("WHITELIST_LIBRARY", None)
blacklist_library_type = os.getenv("BLACKLIST_LIBRARY_TYPE", None)
whitelist_library_type = os.getenv("WHITELIST_LIBRARY_TYPE", None)
blacklist_users = os.getenv("BLACKLIST_USERS", None)
whitelist_users = os.getenv("WHITELIST_USERS", None)
logger.info("Creating (black/white)lists")
blacklist_library = parse_string_to_list(
get_env_value(env, "BLACKLIST_LIBRARY", None)
)
whitelist_library = parse_string_to_list(
get_env_value(env, "WHITELIST_LIBRARY", None)
)
blacklist_library_type = parse_string_to_list(
get_env_value(env, "BLACKLIST_LIBRARY_TYPE", None)
)
whitelist_library_type = parse_string_to_list(
get_env_value(env, "WHITELIST_LIBRARY_TYPE", None)
)
blacklist_users = parse_string_to_list(get_env_value(env, "BLACKLIST_USERS", None))
whitelist_users = parse_string_to_list(get_env_value(env, "WHITELIST_USERS", None))
(
blacklist_library,
@ -129,86 +171,97 @@ def main_loop():
)
# Create server connections
logger("Creating server connections", 1)
servers = generate_server_connections()
logger.info("Creating server connections")
servers = generate_server_connections(env)
for server_1 in servers:
# If server is the final server in the list, then we are done with the loop
if server_1 == servers[-1]:
break
# Store a copy of server_1_watched that way it can be used multiple times without having to regather everyones watch history every single time
server_1_watched = None
# Start server_2 at the next server in the list
for server_2 in servers[servers.index(server_1) + 1 :]:
# Check if server 1 and server 2 are going to be synced in either direction, skip if not
if not should_sync_server(
server_1[0], server_2[0]
) and not should_sync_server(server_2[0], server_1[0]):
env, server_1, server_2
) and not should_sync_server(env, server_2, server_1):
continue
logger(f"Server 1: {server_1[0].capitalize()}: {server_1[1].info()}", 0)
logger(f"Server 2: {server_2[0].capitalize()}: {server_2[1].info()}", 0)
logger.info(f"Server 1: {type(server_1)}: {server_1.info()}")
logger.info(f"Server 2: {type(server_2)}: {server_2.info()}")
# Create users list
logger("Creating users list", 1)
logger.info("Creating users list")
server_1_users, server_2_users = setup_users(
server_1, server_2, blacklist_users, whitelist_users, user_mapping
)
server_1_libraries, server_2_libraries = setup_libraries(
server_1[1],
server_2[1],
server_1,
server_2,
blacklist_library,
blacklist_library_type,
whitelist_library,
whitelist_library_type,
library_mapping,
)
logger.info(f"Server 1 syncing libraries: {server_1_libraries}")
logger.info(f"Server 2 syncing libraries: {server_2_libraries}")
logger("Creating watched lists", 1)
server_1_watched = server_1[1].get_watched(
server_1_users, server_1_libraries
logger.info("Creating watched lists", 1)
server_1_watched = server_1.get_watched(
server_1_users, server_1_libraries, server_1_watched
)
logger("Finished creating watched list server 1", 1)
logger.info("Finished creating watched list server 1")
server_2_watched = server_2[1].get_watched(
server_2_users, server_2_libraries
)
logger("Finished creating watched list server 2", 1)
server_2_watched = server_2.get_watched(server_2_users, server_2_libraries)
logger.info("Finished creating watched list server 2")
logger(f"Server 1 watched: {server_1_watched}", 3)
logger(f"Server 2 watched: {server_2_watched}", 3)
logger.trace(f"Server 1 watched: {server_1_watched}")
logger.trace(f"Server 2 watched: {server_2_watched}")
logger("Cleaning Server 1 Watched", 1)
logger.info("Cleaning Server 1 Watched", 1)
server_1_watched_filtered = cleanup_watched(
server_1_watched, server_2_watched, user_mapping, library_mapping
)
logger("Cleaning Server 2 Watched", 1)
logger.info("Cleaning Server 2 Watched", 1)
server_2_watched_filtered = cleanup_watched(
server_2_watched, server_1_watched, user_mapping, library_mapping
)
logger(
logger.debug(
f"server 1 watched that needs to be synced to server 2:\n{server_1_watched_filtered}",
)
logger(
logger.debug(
f"server 2 watched that needs to be synced to server 1:\n{server_2_watched_filtered}",
)
if should_sync_server(server_2[0], server_1[0]):
logger(f"Syncing {server_2[1].info()} -> {server_1[1].info()}", 0)
server_1[1].update_watched(
if should_sync_server(env, server_2, server_1):
logger.info(f"Syncing {server_2.info()} -> {server_1.info()}")
# Add server_2_watched_filtered to server_1_watched that way the stored version isn't stale for the next server
if not dryrun:
server_1_watched = merge_server_watched(
server_1_watched,
server_2_watched_filtered,
user_mapping,
library_mapping,
)
server_1.update_watched(
server_2_watched_filtered,
user_mapping,
library_mapping,
dryrun,
)
if should_sync_server(server_1[0], server_2[0]):
logger(f"Syncing {server_1[1].info()} -> {server_2[1].info()}", 0)
server_2[1].update_watched(
if should_sync_server(env, server_1, server_2):
logger.info(f"Syncing {server_1.info()} -> {server_2.info()}")
server_2.update_watched(
server_1_watched_filtered,
user_mapping,
library_mapping,
@ -216,43 +269,55 @@ def main_loop():
)
def main():
run_only_once = str_to_bool(os.getenv("RUN_ONLY_ONCE", "False"))
sleep_duration = float(os.getenv("SLEEP_DURATION", "3600"))
times = []
@logger.catch
def main() -> None:
# Get environment variables
env_file = get_env_value(None, "ENV_FILE", ".env")
env = dotenv_values(env_file)
run_only_once = str_to_bool(get_env_value(env, "RUN_ONLY_ONCE", "False"))
sleep_duration = float(get_env_value(env, "SLEEP_DURATION", "3600"))
log_file = get_env_value(env, "LOG_FILE", "log.log")
debug_level = get_env_value(env, "DEBUG_LEVEL", "INFO")
if debug_level:
debug_level = debug_level.upper()
times: list[float] = []
while True:
try:
start = perf_counter()
main_loop()
# Reconfigure the logger on each loop so the logs are rotated on each run
configure_logger(log_file, debug_level)
main_loop(env)
end = perf_counter()
times.append(end - start)
if len(times) > 0:
logger(f"Average time: {sum(times) / len(times)}", 0)
logger.info(f"Average time: {sum(times) / len(times)}")
if run_only_once:
break
logger(f"Looping in {sleep_duration}")
logger.info(f"Looping in {sleep_duration}")
sleep(sleep_duration)
except Exception as error:
if isinstance(error, list):
for message in error:
logger(message, log_type=2)
logger.error(message)
else:
logger(error, log_type=2)
logger.error(error)
logger(traceback.format_exc(), 2)
logger.error(traceback.format_exc())
if run_only_once:
break
logger(f"Retrying in {sleep_duration}", log_type=0)
logger.info(f"Retrying in {sleep_duration}")
sleep(sleep_duration)
except KeyboardInterrupt:
if len(times) > 0:
logger(f"Average time: {sum(times) / len(times)}", 0)
logger("Exiting", log_type=0)
logger.info(f"Average time: {sum(times) / len(times)}")
logger.info("Exiting")
os._exit(0)
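Note that main() now parses configuration with dotenv_values instead of load_dotenv, so the ENV_FILE contents land in a plain dict rather than in os.environ. A short sketch (file name and keys are illustrative; get_env_value is assumed to be a thin lookup-with-default helper over that dict):

from dotenv import dotenv_values

env = dotenv_values("ci.env")  # e.g. {"DRYRUN": "True", "SLEEP_DURATION": "60"}
sleep_duration = float(env.get("SLEEP_DURATION", "3600"))
run_only_once = env.get("RUN_ONLY_ONCE", "False").lower() == "true"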

File diff suppressed because it is too large.


@ -1,30 +1,35 @@
from src.functions import (
logger,
search_mapping,
)
from plexapi.myplex import MyPlexAccount, MyPlexUser
from loguru import logger
from src.emby import Emby
from src.jellyfin import Jellyfin
from src.plex import Plex
from src.functions import search_mapping
def generate_user_list(server):
def generate_user_list(server: Plex | Jellyfin | Emby) -> list[str]:
# generate list of users from server 1 and server 2
server_type = server[0]
server_connection = server[1]
server_users = []
if server_type == "plex":
for user in server_connection.users:
server_users: list[str] = []
if isinstance(server, Plex):
for user in server.users:
server_users.append(
user.username.lower() if user.username else user.title.lower()
)
elif server_type in ["jellyfin", "emby"]:
server_users = [key.lower() for key in server_connection.users.keys()]
elif isinstance(server, (Jellyfin, Emby)):
server_users = [key.lower() for key in server.users.keys()]
return server_users
def combine_user_lists(server_1_users, server_2_users, user_mapping):
def combine_user_lists(
server_1_users: list[str],
server_2_users: list[str],
user_mapping: dict[str, str] | None,
) -> dict[str, str]:
# combined list of overlapping users from plex and jellyfin
users = {}
users: dict[str, str] = {}
for server_1_user in server_1_users:
if user_mapping:
@ -49,13 +54,15 @@ def combine_user_lists(server_1_users, server_2_users, user_mapping):
return users
def filter_user_lists(users, blacklist_users, whitelist_users):
users_filtered = {}
def filter_user_lists(
users: dict[str, str], blacklist_users: list[str], whitelist_users: list[str]
) -> dict[str, str]:
users_filtered: dict[str, str] = {}
for user in users:
# whitelist_user is not empty and user lowercase is not in whitelist lowercase
if len(whitelist_users) > 0:
if user not in whitelist_users and users[user] not in whitelist_users:
logger(f"{user} or {users[user]} is not in whitelist", 1)
logger.info(f"{user} or {users[user]} is not in whitelist")
continue
if user not in blacklist_users and users[user] not in blacklist_users:
@ -64,12 +71,13 @@ def filter_user_lists(users, blacklist_users, whitelist_users):
return users_filtered
def generate_server_users(server, users):
server_users = None
if server[0] == "plex":
server_users = []
for plex_user in server[1].users:
def generate_server_users(
server: Plex | Jellyfin | Emby,
users: dict[str, str],
) -> list[MyPlexAccount] | dict[str, str] | None:
if isinstance(server, Plex):
plex_server_users: list[MyPlexAccount] = []
for plex_user in server.users:
username_title = (
plex_user.username if plex_user.username else plex_user.title
)
@ -78,45 +86,56 @@ def generate_server_users(server, users):
username_title.lower() in users.keys()
or username_title.lower() in users.values()
):
server_users.append(plex_user)
elif server[0] in ["jellyfin", "emby"]:
server_users = {}
for jellyfin_user, jellyfin_id in server[1].users.items():
plex_server_users.append(plex_user)
return plex_server_users
elif isinstance(server, (Jellyfin, Emby)):
jelly_emby_server_users: dict[str, str] = {}
for jellyfin_user, jellyfin_id in server.users.items():
if (
jellyfin_user.lower() in users.keys()
or jellyfin_user.lower() in users.values()
):
server_users[jellyfin_user] = jellyfin_id
jelly_emby_server_users[jellyfin_user] = jellyfin_id
return server_users
return jelly_emby_server_users
return None
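Callers therefore branch on the return type: a list of Plex account objects, or a name-to-id dict for Jellyfin/Emby. The Jellyfin/Emby filtering reduces to a comprehension; a minimal sketch with invented users:

users_filtered = {"alice": "alice", "bob": "bob"}
jellyfin_users = {"Alice": "id-1", "Carol": "id-3"}  # stand-in for server.users
selected = {
    name: uid
    for name, uid in jellyfin_users.items()
    if name.lower() in users_filtered or name.lower() in users_filtered.values()
}
print(selected)  # {'Alice': 'id-1'}; Carol is on neither side of the mapping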
def setup_users(
server_1, server_2, blacklist_users, whitelist_users, user_mapping=None
):
server_1: Plex | Jellyfin | Emby,
server_2: Plex | Jellyfin | Emby,
blacklist_users: list[str],
whitelist_users: list[str],
user_mapping: dict[str, str] | None = None,
) -> tuple[
list[MyPlexAccount | MyPlexUser] | dict[str, str],
list[MyPlexAccount | MyPlexUser] | dict[str, str],
]:
server_1_users = generate_user_list(server_1)
server_2_users = generate_user_list(server_2)
logger(f"Server 1 users: {server_1_users}", 1)
logger(f"Server 2 users: {server_2_users}", 1)
logger.debug(f"Server 1 users: {server_1_users}")
logger.debug(f"Server 2 users: {server_2_users}")
users = combine_user_lists(server_1_users, server_2_users, user_mapping)
logger(f"User list that exist on both servers {users}", 1)
logger.debug(f"User list that exist on both servers {users}")
users_filtered = filter_user_lists(users, blacklist_users, whitelist_users)
logger(f"Filtered user list {users_filtered}", 1)
logger.debug(f"Filtered user list {users_filtered}")
output_server_1_users = generate_server_users(server_1, users_filtered)
output_server_2_users = generate_server_users(server_2, users_filtered)
# Check if users is none or empty
if output_server_1_users is None or len(output_server_1_users) == 0:
logger(
f"No users found for server 1 {server_1[0]}, users: {server_1_users}, overlapping users {users}, filtered users {users_filtered}, server 1 users {server_1[1].users}"
logger.warning(
f"No users found for server 1 {type(server_1)}, users: {server_1_users}, overlapping users {users}, filtered users {users_filtered}, server 1 users {server_1.users}"
)
if output_server_2_users is None or len(output_server_2_users) == 0:
logger(
f"No users found for server 2 {server_2[0]}, users: {server_2_users}, overlapping users {users} filtered users {users_filtered}, server 2 users {server_2[1].users}"
logger.warning(
f"No users found for server 2 {type(server_2)}, users: {server_2_users}, overlapping users {users} filtered users {users_filtered}, server 2 users {server_2.users}"
)
if (
@ -127,7 +146,7 @@ def setup_users(
):
raise Exception("No users found for one or both servers")
logger(f"Server 1 users: {output_server_1_users}", 1)
logger(f"Server 2 users: {output_server_2_users}", 1)
logger.info(f"Server 1 users: {output_server_1_users}")
logger.info(f"Server 2 users: {output_server_2_users}")
return output_server_1_users, output_server_2_users


@ -1,55 +1,219 @@
import copy
from datetime import datetime
from pydantic import BaseModel, Field
from loguru import logger
from typing import Any
from src.functions import logger, search_mapping, contains_nested
from src.library import generate_library_guids_dict
from src.functions import search_mapping
def check_remove_entry(video, library, video_index, library_watched_list_2):
if video_index is not None:
if (
library_watched_list_2["completed"][video_index]
== video["status"]["completed"]
) and (library_watched_list_2["time"][video_index] == video["status"]["time"]):
logger(
f"Removing {video['title']} from {library} due to exact match",
3,
class MediaIdentifiers(BaseModel):
title: str | None = None
# File information, will be folder for series and media file for episode/movie
locations: tuple[str, ...] = tuple()
# Guids
imdb_id: str | None = None
tvdb_id: str | None = None
tmdb_id: str | None = None
class WatchedStatus(BaseModel):
completed: bool
time: int
viewed_date: datetime
class MediaItem(BaseModel):
identifiers: MediaIdentifiers
status: WatchedStatus
class Series(BaseModel):
identifiers: MediaIdentifiers
episodes: list[MediaItem] = Field(default_factory=list)
class LibraryData(BaseModel):
title: str
movies: list[MediaItem] = Field(default_factory=list)
series: list[Series] = Field(default_factory=list)
class UserData(BaseModel):
libraries: dict[str, LibraryData] = Field(default_factory=dict)
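The Pydantic models replace the old parallel-list dictionaries with typed, nested objects. A minimal construction sketch (titles, paths, and IDs invented):

from datetime import datetime

episode = MediaItem(
    identifiers=MediaIdentifiers(title="Episode 1", locations=("S01E01.mkv",)),
    status=WatchedStatus(completed=True, time=0, viewed_date=datetime(2025, 1, 1)),
)
show = Series(
    identifiers=MediaIdentifiers(title="Some Show", tvdb_id="12345"),
    episodes=[episode],
)
user = UserData(
    libraries={"TV Shows": LibraryData(title="TV Shows", series=[show])}
)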
def merge_mediaitem_data(ep1: MediaItem, ep2: MediaItem) -> MediaItem:
"""
Merge two MediaItem episodes by comparing their watched status.
If one is completed while the other isn't, choose the completed one.
If both are completed or both are not, choose the one with the higher time.
"""
if ep1.status.completed != ep2.status.completed:
return ep1 if ep1.status.completed else ep2
return ep1 if ep1.status.time >= ep2.status.time else ep2
def merge_series_data(series1: Series, series2: Series) -> Series:
"""
Merge two Series objects by combining their episodes.
For duplicate episodes (determined by check_same_identifiers), merge their watched status.
"""
merged_series = copy.deepcopy(series1)
for ep in series2.episodes:
for idx, merged_ep in enumerate(merged_series.episodes):
if check_same_identifiers(ep.identifiers, merged_ep.identifiers):
merged_series.episodes[idx] = merge_mediaitem_data(merged_ep, ep)
break
else:
merged_series.episodes.append(copy.deepcopy(ep))
return merged_series
def merge_library_data(lib1: LibraryData, lib2: LibraryData) -> LibraryData:
"""
Merge two LibraryData objects by extending movies and merging series.
For series, duplicates are determined using check_same_identifiers.
"""
merged = copy.deepcopy(lib1)
# Merge movies.
for movie in lib2.movies:
for idx, merged_movie in enumerate(merged.movies):
if check_same_identifiers(movie.identifiers, merged_movie.identifiers):
merged.movies[idx] = merge_mediaitem_data(merged_movie, movie)
break
else:
merged.movies.append(copy.deepcopy(movie))
# Merge series.
for series2 in lib2.series:
for idx, series1 in enumerate(merged.series):
if check_same_identifiers(series1.identifiers, series2.identifiers):
merged.series[idx] = merge_series_data(series1, series2)
break
else:
merged.series.append(copy.deepcopy(series2))
return merged
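All three merge helpers use Python's for/else: the else clause runs only when the inner loop finishes without a break, i.e. when no duplicate was found and the item is genuinely new. A standalone illustration of the pattern:

merged = ["a", "b"]
for candidate in ["b", "c"]:
    for existing in merged:
        if existing == candidate:  # duplicate found -> merged/replaced instead
            break
    else:
        merged.append(candidate)  # no break fired -> append as new
print(merged)  # ['a', 'b', 'c']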
def merge_user_data(user1: UserData, user2: UserData) -> UserData:
"""
Merge two UserData objects by merging their libraries.
If a library exists in both, merge its content; otherwise, add the new library.
"""
merged_libraries = copy.deepcopy(user1.libraries)
for lib_key, lib_data in user2.libraries.items():
if lib_key in merged_libraries:
merged_libraries[lib_key] = merge_library_data(
merged_libraries[lib_key], lib_data
)
return True
elif (
library_watched_list_2["completed"][video_index] == True
and video["status"]["completed"] == False
):
logger(
f"Removing {video['title']} from {library} due to being complete in one library and not the other",
3,
)
return True
elif (
library_watched_list_2["completed"][video_index] == False
and video["status"]["completed"] == False
) and (video["status"]["time"] < library_watched_list_2["time"][video_index]):
logger(
f"Removing {video['title']} from {library} due to more time watched in one library than the other",
3,
)
return True
elif (
library_watched_list_2["completed"][video_index] == True
and video["status"]["completed"] == True
):
logger(
f"Removing {video['title']} from {library} due to being complete in both libraries",
3,
else:
merged_libraries[lib_key] = copy.deepcopy(lib_data)
return UserData(libraries=merged_libraries)
def merge_server_watched(
watched_list_1: dict[str, UserData],
watched_list_2: dict[str, UserData],
user_mapping: dict[str, str] | None = None,
library_mapping: dict[str, str] | None = None,
) -> dict[str, UserData]:
"""
Merge two dictionaries of UserData while taking into account possible
differences in user and library keys via the provided mappings.
"""
merged_watched = copy.deepcopy(watched_list_1)
for user_2, user_data in watched_list_2.items():
# Determine matching user key.
user_key = user_mapping.get(user_2, user_2) if user_mapping else user_2
if user_key not in merged_watched:
merged_watched[user_key] = copy.deepcopy(user_data)
continue
for lib_key, lib_data in user_data.libraries.items():
mapped_lib_key = (
library_mapping.get(lib_key, lib_key) if library_mapping else lib_key
)
if mapped_lib_key not in merged_watched[user_key].libraries:
merged_watched[user_key].libraries[mapped_lib_key] = copy.deepcopy(lib_data)
else:
merged_watched[user_key].libraries[mapped_lib_key] = merge_library_data(
merged_watched[user_key].libraries[mapped_lib_key],
lib_data,
)
return merged_watched
def check_same_identifiers(item1: MediaIdentifiers, item2: MediaIdentifiers) -> bool:
# Check for duplicate based on file locations:
if item1.locations and item2.locations:
if set(item1.locations) & set(item2.locations):
return True
# Check for duplicate based on GUIDs:
if (
(item1.imdb_id and item2.imdb_id and item1.imdb_id == item2.imdb_id)
or (item1.tvdb_id and item2.tvdb_id and item1.tvdb_id == item2.tvdb_id)
or (item1.tmdb_id and item2.tmdb_id and item1.tmdb_id == item2.tmdb_id)
):
return True
return False
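Two items match if any file location overlaps or any single GUID pair agrees; a field missing on either side simply skips that check. For example (IDs invented):

a = MediaIdentifiers(title="Tears of Steel", locations=("tears.mkv",))
b = MediaIdentifiers(title="Tears of Steel (2012)", imdb_id="tt0000001")
c = MediaIdentifiers(locations=("tears.mkv",), imdb_id="tt0000001")
print(check_same_identifiers(a, b))  # False: no shared location, no comparable GUID
print(check_same_identifiers(a, c))  # True: location overlap
print(check_same_identifiers(b, c))  # True: imdb_id agrees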
def check_remove_entry(item1: MediaItem, item2: MediaItem) -> bool:
"""
Returns True if item1 (from watched_list_1) should be removed
in favor of item2 (from watched_list_2), based on:
- Duplicate criteria:
* They match if any file location is shared OR
at least one of imdb_id, tvdb_id, or tmdb_id matches.
- Watched status:
* If one is complete and the other is not, remove the incomplete one.
* If both are incomplete, remove the one with lower progress (time).
* If both are complete, remove item1 as duplicate.
"""
if not check_same_identifiers(item1.identifiers, item2.identifiers):
return False
# Compare watched statuses.
status1 = item1.status
status2 = item2.status
# If one is complete and the other isn't, remove the one that's not complete.
if status1.completed != status2.completed:
if not status1.completed and status2.completed:
return True # Remove item1 since it's not complete.
else:
return False # Do not remove item1; it's complete.
# Both have the same completed status.
if not status1.completed and not status2.completed:
# Both incomplete: remove the one with lower progress (time)
if status1.time < status2.time:
return True # Remove item1 because it has watched less.
elif status1.time > status2.time:
return False # Keep item1 because it has more progress.
else:
# Same progress; Remove duplicate
return True
# If both are complete, consider item1 the duplicate and remove it.
return True
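Given a matching pair, the watched status then decides whether the copy in list 1 is dropped; a worked example continuing the sketch above (status values invented):

partial = MediaItem(
    identifiers=MediaIdentifiers(locations=("S01E01.mkv",)),
    status=WatchedStatus(completed=False, time=120000, viewed_date=datetime(2025, 1, 1)),
)
done = MediaItem(
    identifiers=MediaIdentifiers(locations=("S01E01.mkv",)),
    status=WatchedStatus(completed=True, time=0, viewed_date=datetime(2025, 1, 2)),
)
print(check_remove_entry(partial, done))  # True: incomplete loses to complete
print(check_remove_entry(done, partial))  # False: the completed copy is kept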
def cleanup_watched(
watched_list_1, watched_list_2, user_mapping=None, library_mapping=None
):
watched_list_1: dict[str, UserData],
watched_list_2: dict[str, UserData],
user_mapping: dict[str, str] | None = None,
library_mapping: dict[str, str] | None = None,
) -> dict[str, UserData]:
modified_watched_list_1 = copy.deepcopy(watched_list_1)
# remove entries from watched_list_1 that are in watched_list_2
@ -61,195 +225,99 @@ def cleanup_watched(
if user_2 is None:
continue
for library_1 in watched_list_1[user_1]:
for library_1_key in watched_list_1[user_1].libraries:
library_other = None
if library_mapping:
library_other = search_mapping(library_mapping, library_1)
library_2 = get_other(watched_list_2[user_2], library_1, library_other)
if library_2 is None:
library_other = search_mapping(library_mapping, library_1_key)
library_2_key = get_other(
watched_list_2[user_2].libraries, library_1_key, library_other
)
if library_2_key is None:
continue
(
_,
episode_watched_list_2_keys_dict,
movies_watched_list_2_keys_dict,
) = generate_library_guids_dict(watched_list_2[user_2][library_2])
library_1 = watched_list_1[user_1].libraries[library_1_key]
library_2 = watched_list_2[user_2].libraries[library_2_key]
# Movies
if isinstance(watched_list_1[user_1][library_1], list):
for movie in watched_list_1[user_1][library_1]:
movie_index = get_movie_index_in_dict(
movie, movies_watched_list_2_keys_dict
)
if movie_index is not None:
if check_remove_entry(
movie,
library_1,
movie_index,
movies_watched_list_2_keys_dict,
):
modified_watched_list_1[user_1][library_1].remove(movie)
filtered_movies = []
for movie in library_1.movies:
remove_flag = False
for movie2 in library_2.movies:
if check_remove_entry(movie, movie2):
logger.trace(f"Removing movie: {movie.identifiers.title}")
remove_flag = True
break
if not remove_flag:
filtered_movies.append(movie)
modified_watched_list_1[user_1].libraries[
library_1_key
].movies = filtered_movies
# TV Shows
elif isinstance(watched_list_1[user_1][library_1], dict):
for show_key_1 in watched_list_1[user_1][library_1].keys():
show_key_dict = dict(show_key_1)
filtered_series_list = []
for series1 in library_1.series:
matching_series = None
for series2 in library_2.series:
if check_same_identifiers(series1.identifiers, series2.identifiers):
matching_series = series2
break
# Filter the episode_watched_list_2_keys_dict dictionary to handle cases
# where episode location names are not unique such as S01E01.mkv
filtered_episode_watched_list_2_keys_dict = (
filter_episode_watched_list_2_keys_dict(
episode_watched_list_2_keys_dict, show_key_dict
if matching_series is None:
# No matching show in watched_list_2; keep the series as is.
filtered_series_list.append(series1)
else:
# We have a matching show; now clean up the episodes.
filtered_episodes = []
for ep1 in series1.episodes:
remove_flag = False
for ep2 in matching_series.episodes:
if check_remove_entry(ep1, ep2):
logger.trace(
f"Removing episode '{ep1.identifiers.title}' from show '{series1.identifiers.title}'",
)
remove_flag = True
break
if not remove_flag:
filtered_episodes.append(ep1)
# Only keep the series if there are remaining episodes.
if filtered_episodes:
modified_series1 = copy.deepcopy(series1)
modified_series1.episodes = filtered_episodes
filtered_series_list.append(modified_series1)
else:
logger.trace(
f"Removing entire show '{series1.identifiers.title}' as no episodes remain after cleanup.",
)
)
for episode in watched_list_1[user_1][library_1][show_key_1]:
episode_index = get_episode_index_in_dict(
episode, filtered_episode_watched_list_2_keys_dict
)
if episode_index is not None:
if check_remove_entry(
episode,
library_1,
episode_index,
episode_watched_list_2_keys_dict,
):
modified_watched_list_1[user_1][library_1][
show_key_1
].remove(episode)
modified_watched_list_1[user_1].libraries[
library_1_key
].series = filtered_series_list
# Remove empty shows
if len(modified_watched_list_1[user_1][library_1][show_key_1]) == 0:
if show_key_1 in modified_watched_list_1[user_1][library_1]:
logger(
f"Removing {show_key_dict['title']} because it is empty",
3,
)
del modified_watched_list_1[user_1][library_1][show_key_1]
for user_1 in watched_list_1:
for library_1 in watched_list_1[user_1]:
if library_1 in modified_watched_list_1[user_1]:
# If library is empty then remove it
if len(modified_watched_list_1[user_1][library_1]) == 0:
logger(f"Removing {library_1} from {user_1} because it is empty", 1)
del modified_watched_list_1[user_1][library_1]
if user_1 in modified_watched_list_1:
# If user is empty delete user
if len(modified_watched_list_1[user_1]) == 0:
logger(f"Removing {user_1} from watched list 1 because it is empty", 1)
del modified_watched_list_1[user_1]
# After processing, remove any library that is completely empty.
for user, user_data in modified_watched_list_1.items():
new_libraries = {}
for lib_key, library in user_data.libraries.items():
if library.movies or library.series:
new_libraries[lib_key] = library
else:
logger.trace(f"Removing empty library '{lib_key}' for user '{user}'")
user_data.libraries = new_libraries
return modified_watched_list_1
def get_other(watched_list, object_1, object_2):
def get_other(
watched_list: dict[str, Any], object_1: str, object_2: str | None
) -> str | None:
if object_1 in watched_list:
return object_1
elif object_2 in watched_list:
if object_2 and object_2 in watched_list:
return object_2
else:
logger(f"{object_1} and {object_2} not found in watched list 2", 1)
return None
logger.info(
f"{object_1}{' and ' + object_2 if object_2 else ''} not found in watched list 2"
)
def get_movie_index_in_dict(movie, movies_watched_list_2_keys_dict):
# Iterate through the keys and values of the movie dictionary
for movie_key, movie_value in movie.items():
# If the key is "locations", check if the "locations" key is present in the movies_watched_list_2_keys_dict dictionary
if movie_key == "locations":
if "locations" in movies_watched_list_2_keys_dict.keys():
# Iterate through the locations in the movie dictionary
for location in movie_value:
# If the location is in the movies_watched_list_2_keys_dict dictionary, return index of the key
return contains_nested(
location, movies_watched_list_2_keys_dict["locations"]
)
# If the key is not "locations", check if the movie_key is present in the movies_watched_list_2_keys_dict dictionary
else:
if movie_key in movies_watched_list_2_keys_dict.keys():
# If the movie_value is in the movies_watched_list_2_keys_dict dictionary, return True
if movie_value in movies_watched_list_2_keys_dict[movie_key]:
return movies_watched_list_2_keys_dict[movie_key].index(movie_value)
# If the loop completes without finding a match, return False
return None
def filter_episode_watched_list_2_keys_dict(
episode_watched_list_2_keys_dict, show_key_dict
):
# If the episode_watched_list_2_keys_dict dictionary is empty, missing show then return an empty dictionary
if (
len(episode_watched_list_2_keys_dict) == 0
or "show" not in episode_watched_list_2_keys_dict.keys()
):
return {}
# Filter the episode_watched_list_2_keys_dict dictionary to only include values for the correct show
filtered_episode_watched_list_2_keys_dict = {}
show_indecies = []
# Iterate through episode_watched_list_2_keys_dict["show"] and find the indecies that match show_key_dict
for show_index, show_value in enumerate(episode_watched_list_2_keys_dict["show"]):
# Iterate through the keys and values of the show_value dictionary and check if they match show_key_dict
for show_key, show_key_value in show_value.items():
if show_key == "locations":
# Iterate through the locations in the show_value dictionary
for location in show_key_value:
# If the location is in the episode_watched_list_2_keys_dict dictionary, return index of the key
if (
contains_nested(location, show_key_dict["locations"])
is not None
):
show_indecies.append(show_index)
break
else:
if show_key in show_key_dict.keys():
if show_key_value == show_key_dict[show_key]:
show_indecies.append(show_index)
break
# lists
indecies = list(set(show_indecies))
# If there are no indecies that match the show, return an empty dictionary
if len(indecies) == 0:
return {}
# Create a copy of the dictionary with indecies that match the show and none that don't
for key, value in episode_watched_list_2_keys_dict.items():
if key not in filtered_episode_watched_list_2_keys_dict:
filtered_episode_watched_list_2_keys_dict[key] = []
for index, _ in enumerate(value):
if index in indecies:
filtered_episode_watched_list_2_keys_dict[key].append(value[index])
else:
filtered_episode_watched_list_2_keys_dict[key].append(None)
return filtered_episode_watched_list_2_keys_dict
def get_episode_index_in_dict(episode, episode_watched_list_2_keys_dict):
# Iterate through the keys and values of the episode dictionary
for episode_key, episode_value in episode.items():
if episode_key in episode_watched_list_2_keys_dict.keys():
if episode_key == "locations":
# Iterate through the locations in the episode dictionary
for location in episode_value:
# If the location is in the episode_watched_list_2_keys_dict dictionary, return index of the key
return contains_nested(
location, episode_watched_list_2_keys_dict["locations"]
)
else:
# If the episode_value is in the episode_watched_list_2_keys_dict dictionary, return True
if episode_value in episode_watched_list_2_keys_dict[episode_key]:
return episode_watched_list_2_keys_dict[episode_key].index(
episode_value
)
# If the loop completes without finding a match, return False
return None


@ -3,11 +3,8 @@
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
DEBUG_LEVEL = "trace"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"


@ -3,11 +3,8 @@
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
DEBUG_LEVEL = "trace"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"


@ -3,11 +3,8 @@
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
DEBUG_LEVEL = "trace"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"


@ -3,11 +3,8 @@
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
DEBUG_LEVEL = "trace"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"


@ -3,11 +3,8 @@
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
DEBUG_LEVEL = "trace"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"


@ -3,11 +3,8 @@
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "False"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
DEBUG_LEVEL = "trace"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"


@ -18,12 +18,12 @@ from src.black_white import setup_black_white_lists
def test_setup_black_white_lists():
# Simple
blacklist_library = "library1, library2"
whitelist_library = "library1, library2"
blacklist_library_type = "library_type1, library_type2"
whitelist_library_type = "library_type1, library_type2"
blacklist_users = "user1, user2"
whitelist_users = "user1, user2"
blacklist_library = ["library1", "library2"]
whitelist_library = ["library1", "library2"]
blacklist_library_type = ["library_type1", "library_type2"]
whitelist_library_type = ["library_type1", "library_type2"]
blacklist_users = ["user1", "user2"]
whitelist_users = ["user1", "user2"]
(
results_blacklist_library,
@ -48,6 +48,15 @@ def test_setup_black_white_lists():
assert return_blacklist_users == ["user1", "user2"]
assert return_whitelist_users == ["user1", "user2"]
def test_library_mapping_black_white_list():
blacklist_library = ["library1", "library2"]
whitelist_library = ["library1", "library2"]
blacklist_library_type = ["library_type1", "library_type2"]
whitelist_library_type = ["library_type1", "library_type2"]
blacklist_users = ["user1", "user2"]
whitelist_users = ["user1", "user2"]
# Library Mapping and user mapping
library_mapping = {"library1": "library3"}
user_mapping = {"user1": "user3"}


@ -21,10 +21,6 @@ from src.library import (
check_skip_logic,
check_blacklist_logic,
check_whitelist_logic,
show_title_dict,
episode_title_dict,
movies_title_dict,
generate_library_guids_dict,
)
blacklist_library = ["TV Shows"]
@ -280,45 +276,3 @@ def test_check_whitelist_logic():
)
assert skip_reason is None
def test_show_title_dict():
show_titles_dict = show_title_dict(show_list)
assert show_titles_dict == show_titles
def test_episode_title_dict():
episode_titles_dict = episode_title_dict(show_list)
assert episode_titles_dict == episode_titles
def test_movies_title_dict():
movies_titles_dict = movies_title_dict(movie_list)
assert movies_titles_dict == movie_titles
def test_generate_library_guids_dict():
# Test with shows
(
show_titles_dict,
episode_titles_dict,
movies_titles_dict,
) = generate_library_guids_dict(show_list)
assert show_titles_dict == show_titles
assert episode_titles_dict == episode_titles
assert movies_titles_dict == {}
# Test with movies
(
show_titles_dict,
episode_titles_dict,
movies_titles_dict,
) = generate_library_guids_dict(movie_list)
assert show_titles_dict == {}
assert episode_titles_dict == {}
assert movies_titles_dict == movie_titles


@ -1,78 +0,0 @@
import sys
import os
# getting the name of the directory
# where the this file is present.
current = os.path.dirname(os.path.realpath(__file__))
# Getting the parent directory name
# where the current directory is present.
parent = os.path.dirname(current)
# adding the parent directory to
# the sys.path.
sys.path.append(parent)
from src.black_white import setup_black_white_lists
def test_setup_black_white_lists():
# Simple
blacklist_library = "library1, library2"
whitelist_library = "library1, library2"
blacklist_library_type = "library_type1, library_type2"
whitelist_library_type = "library_type1, library_type2"
blacklist_users = "user1, user2"
whitelist_users = "user1, user2"
(
results_blacklist_library,
return_whitelist_library,
return_blacklist_library_type,
return_whitelist_library_type,
return_blacklist_users,
return_whitelist_users,
) = setup_black_white_lists(
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
)
assert results_blacklist_library == ["library1", "library2"]
assert return_whitelist_library == ["library1", "library2"]
assert return_blacklist_library_type == ["library_type1", "library_type2"]
assert return_whitelist_library_type == ["library_type1", "library_type2"]
assert return_blacklist_users == ["user1", "user2"]
assert return_whitelist_users == ["user1", "user2"]
# Library Mapping and user mapping
library_mapping = {"library1": "library3"}
user_mapping = {"user1": "user3"}
(
results_blacklist_library,
return_whitelist_library,
return_blacklist_library_type,
return_whitelist_library_type,
return_blacklist_users,
return_whitelist_users,
) = setup_black_white_lists(
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
library_mapping,
user_mapping,
)
assert results_blacklist_library == ["library1", "library2", "library3"]
assert return_whitelist_library == ["library1", "library2", "library3"]
assert return_blacklist_library_type == ["library_type1", "library_type2"]
assert return_whitelist_library_type == ["library_type1", "library_type2"]
assert return_blacklist_users == ["user1", "user2", "user3"]
assert return_whitelist_users == ["user1", "user2", "user3"]

File diff suppressed because it is too large.


@ -1,28 +1,37 @@
# Check the mark.log file that is generated by the CI to make sure it contains the expected values
import argparse
import os
import sys
from loguru import logger
from collections import Counter
import os, argparse
class MarkLogError(Exception):
"""Custom exception for mark.log validation failures."""
pass
def parse_args():
parser = argparse.ArgumentParser(
description="Check the mark.log file that is generated by the CI to make sure it contains the expected values"
)
parser.add_argument(
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument(
"--guids", action="store_true", help="Check the mark.log file for guids"
)
parser.add_argument(
group.add_argument(
"--locations", action="store_true", help="Check the mark.log file for locations"
)
parser.add_argument(
group.add_argument(
"--write", action="store_true", help="Check the mark.log file for write-run"
)
parser.add_argument(
group.add_argument(
"--plex", action="store_true", help="Check the mark.log file for Plex"
)
parser.add_argument(
group.add_argument(
"--jellyfin", action="store_true", help="Check the mark.log file for Jellyfin"
)
parser.add_argument(
group.add_argument(
"--emby", action="store_true", help="Check the mark.log file for Emby"
)
@ -31,51 +40,47 @@ def parse_args():
def read_marklog():
marklog = os.path.join(os.getcwd(), "mark.log")
with open(marklog, "r") as f:
lines = f.readlines()
return lines
try:
with open(marklog, "r") as f:
lines = [line.strip() for line in f if line.strip()]
return lines
except Exception as e:
raise MarkLogError(f"Error reading {marklog}: {e}")
def check_marklog(lines, expected_values):
try:
# Check to make sure the marklog contains all the expected values and nothing else
found_values = []
for line in lines:
# Remove the newline character
line = line.strip()
if line not in expected_values:
raise Exception("Line not found in marklog: " + line)
found_counter = Counter(lines)
expected_counter = Counter(expected_values)
found_values.append(line)
# Determine missing and extra items by comparing counts
missing = expected_counter - found_counter
extra = found_counter - expected_counter
# Check to make sure the marklog contains the same number of values as the expected values
if len(found_values) != len(expected_values):
raise Exception(
"Marklog did not contain the same number of values as the expected values, found "
+ str(len(found_values))
+ " values, expected "
+ str(len(expected_values))
+ " values\n"
+ "\n".join(found_values)
)
if missing or extra:
if missing:
logger.error("Missing expected entries (with counts):")
for entry, count in missing.items():
logger.error(f" {entry}: missing {count} time(s)")
if extra:
logger.error("Unexpected extra entries found (with counts):")
for entry, count in extra.items():
logger.error(f" {entry}: found {count} extra time(s)")
# Check that the two lists contain the same values
if sorted(found_values) != sorted(expected_values):
raise Exception(
"Marklog did not contain the same values as the expected values, found:\n"
+ "\n".join(sorted(found_values))
+ "\n\nExpected:\n"
+ "\n".join(sorted(expected_values))
)
logger.error(
f"Entry count mismatch: found {len(lines)} entries, expected {len(expected_values)} entries."
)
logger.error("Full mark.log content:")
for line in sorted(lines):
logger.error(f" {line}")
raise MarkLogError("mark.log validation failed.")
return True
except Exception as e:
print(e)
return False
return True
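The rewritten check relies on Counter subtraction, which is multiset difference: it reports entries whose counts differ, something a plain set comparison would miss when a line is duplicated. A standalone illustration:

from collections import Counter

found = Counter(["a", "a", "b"])
expected = Counter(["a", "b", "b"])
print(expected - found)  # Counter({'b': 1}) -> one expected entry missing
print(found - expected)  # Counter({'a': 1}) -> one unexpected extra entry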
def main():
args = parse_args()
# Expected values defined for each check
expected_jellyfin = [
"Plex/JellyPlex-CI/jellyplex_watched/Custom Movies/Movie Two (2021)",
"Plex/JellyPlex-CI/jellyplex_watched/Custom TV Shows/Greatest Show Ever 3000/Episode 2",
@ -128,8 +133,7 @@ def main():
expected_locations = expected_emby + expected_plex + expected_jellyfin
# Remove Custom Movies/TV Shows as they should not have guids
expected_guids = [item for item in expected_locations if "Custom" not in item ]
expected_guids = [item for item in expected_locations if "Custom" not in item]
expected_write = [
"Plex/JellyPlex-CI/jellyplex_watched/Custom Movies/Movie Two (2021)",
@ -171,41 +175,42 @@ def main():
"Jellyfin/Jellyfin-Server/JellyUser/Custom Movies/Movie Three (2022)",
"Jellyfin/Jellyfin-Server/JellyUser/Custom TV Shows/Greatest Show Ever (3000)/S01E03",
"Jellyfin/Jellyfin-Server/JellyUser/Movies/Tears of Steel",
"Jellyfin/Jellyfin-Server/JellyUser/Shows/Monarch: Legacy of Monsters/Parallels and Interiors/4"
"Jellyfin/Jellyfin-Server/JellyUser/Shows/Monarch: Legacy of Monsters/Parallels and Interiors/4",
]
# Expected values for the mark.log file, dry-run is slightly different than write-run
# due to some of the items being copied over from one server to another and now being there
# for the next server run.
# Determine which expected values to use based on the command-line flag
if args.guids:
expected_values = expected_guids
check_type = "GUIDs"
elif args.locations:
expected_values = expected_locations
check_type = "locations"
elif args.write:
expected_values = expected_write
check_type = "write-run"
elif args.plex:
expected_values = expected_plex
check_type = "Plex"
elif args.jellyfin:
expected_values = expected_jellyfin
check_type = "Jellyfin"
elif args.emby:
expected_values = expected_emby
check_type = "Emby"
else:
print("No server specified")
exit(1)
raise MarkLogError("No server specified")
lines = read_marklog()
if not check_marklog(lines, expected_values):
print("Failed to validate marklog")
for line in lines:
# Remove the newline character
line = line.strip()
logger.info(f"Validating mark.log for {check_type}...")
print(line)
try:
lines = read_marklog()
check_marklog(lines, expected_values)
except MarkLogError as e:
logger.error(e)
sys.exit(1)
exit(1)
print("Successfully validated marklog")
exit(0)
logger.success("Successfully validated mark.log")
sys.exit(0)
if __name__ == "__main__":

uv.lock (new file, 407 lines)

@ -0,0 +1,407 @@
version = 1
revision = 2
requires-python = ">=3.12"
[[package]]
name = "annotated-types"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" },
]
[[package]]
name = "certifi"
version = "2025.8.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/dc/67/960ebe6bf230a96cda2e0abcf73af550ec4f090005363542f0765df162e0/certifi-2025.8.3.tar.gz", hash = "sha256:e564105f78ded564e3ae7c923924435e1daa7463faeab5bb932bc53ffae63407", size = 162386, upload-time = "2025-08-03T03:07:47.08Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e5/48/1549795ba7742c948d2ad169c1c8cdbae65bc450d6cd753d124b17c8cd32/certifi-2025.8.3-py3-none-any.whl", hash = "sha256:f6c12493cfb1b06ba2ff328595af9350c65d6644968e5d3a2ffd78699af217a5", size = 161216, upload-time = "2025-08-03T03:07:45.777Z" },
]
[[package]]
name = "charset-normalizer"
version = "3.4.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/83/2d/5fd176ceb9b2fc619e63405525573493ca23441330fcdaee6bef9460e924/charset_normalizer-3.4.3.tar.gz", hash = "sha256:6fce4b8500244f6fcb71465d4a4930d132ba9ab8e71a7859e6a5d59851068d14", size = 122371, upload-time = "2025-08-09T07:57:28.46Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e9/5e/14c94999e418d9b87682734589404a25854d5f5d0408df68bc15b6ff54bb/charset_normalizer-3.4.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e28e334d3ff134e88989d90ba04b47d84382a828c061d0d1027b1b12a62b39b1", size = 205655, upload-time = "2025-08-09T07:56:08.475Z" },
{ url = "https://files.pythonhosted.org/packages/7d/a8/c6ec5d389672521f644505a257f50544c074cf5fc292d5390331cd6fc9c3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0cacf8f7297b0c4fcb74227692ca46b4a5852f8f4f24b3c766dd94a1075c4884", size = 146223, upload-time = "2025-08-09T07:56:09.708Z" },
{ url = "https://files.pythonhosted.org/packages/fc/eb/a2ffb08547f4e1e5415fb69eb7db25932c52a52bed371429648db4d84fb1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c6fd51128a41297f5409deab284fecbe5305ebd7e5a1f959bee1c054622b7018", size = 159366, upload-time = "2025-08-09T07:56:11.326Z" },
{ url = "https://files.pythonhosted.org/packages/82/10/0fd19f20c624b278dddaf83b8464dcddc2456cb4b02bb902a6da126b87a1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3cfb2aad70f2c6debfbcb717f23b7eb55febc0bb23dcffc0f076009da10c6392", size = 157104, upload-time = "2025-08-09T07:56:13.014Z" },
{ url = "https://files.pythonhosted.org/packages/16/ab/0233c3231af734f5dfcf0844aa9582d5a1466c985bbed6cedab85af9bfe3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1606f4a55c0fd363d754049cdf400175ee96c992b1f8018b993941f221221c5f", size = 151830, upload-time = "2025-08-09T07:56:14.428Z" },
{ url = "https://files.pythonhosted.org/packages/ae/02/e29e22b4e02839a0e4a06557b1999d0a47db3567e82989b5bb21f3fbbd9f/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:027b776c26d38b7f15b26a5da1044f376455fb3766df8fc38563b4efbc515154", size = 148854, upload-time = "2025-08-09T07:56:16.051Z" },
{ url = "https://files.pythonhosted.org/packages/05/6b/e2539a0a4be302b481e8cafb5af8792da8093b486885a1ae4d15d452bcec/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:42e5088973e56e31e4fa58eb6bd709e42fc03799c11c42929592889a2e54c491", size = 160670, upload-time = "2025-08-09T07:56:17.314Z" },
{ url = "https://files.pythonhosted.org/packages/31/e7/883ee5676a2ef217a40ce0bffcc3d0dfbf9e64cbcfbdf822c52981c3304b/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:cc34f233c9e71701040d772aa7490318673aa7164a0efe3172b2981218c26d93", size = 158501, upload-time = "2025-08-09T07:56:18.641Z" },
{ url = "https://files.pythonhosted.org/packages/c1/35/6525b21aa0db614cf8b5792d232021dca3df7f90a1944db934efa5d20bb1/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:320e8e66157cc4e247d9ddca8e21f427efc7a04bbd0ac8a9faf56583fa543f9f", size = 153173, upload-time = "2025-08-09T07:56:20.289Z" },
{ url = "https://files.pythonhosted.org/packages/50/ee/f4704bad8201de513fdc8aac1cabc87e38c5818c93857140e06e772b5892/charset_normalizer-3.4.3-cp312-cp312-win32.whl", hash = "sha256:fb6fecfd65564f208cbf0fba07f107fb661bcd1a7c389edbced3f7a493f70e37", size = 99822, upload-time = "2025-08-09T07:56:21.551Z" },
{ url = "https://files.pythonhosted.org/packages/39/f5/3b3836ca6064d0992c58c7561c6b6eee1b3892e9665d650c803bd5614522/charset_normalizer-3.4.3-cp312-cp312-win_amd64.whl", hash = "sha256:86df271bf921c2ee3818f0522e9a5b8092ca2ad8b065ece5d7d9d0e9f4849bcc", size = 107543, upload-time = "2025-08-09T07:56:23.115Z" },
{ url = "https://files.pythonhosted.org/packages/65/ca/2135ac97709b400c7654b4b764daf5c5567c2da45a30cdd20f9eefe2d658/charset_normalizer-3.4.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:14c2a87c65b351109f6abfc424cab3927b3bdece6f706e4d12faaf3d52ee5efe", size = 205326, upload-time = "2025-08-09T07:56:24.721Z" },
{ url = "https://files.pythonhosted.org/packages/71/11/98a04c3c97dd34e49c7d247083af03645ca3730809a5509443f3c37f7c99/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41d1fc408ff5fdfb910200ec0e74abc40387bccb3252f3f27c0676731df2b2c8", size = 146008, upload-time = "2025-08-09T07:56:26.004Z" },
{ url = "https://files.pythonhosted.org/packages/60/f5/4659a4cb3c4ec146bec80c32d8bb16033752574c20b1252ee842a95d1a1e/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:1bb60174149316da1c35fa5233681f7c0f9f514509b8e399ab70fea5f17e45c9", size = 159196, upload-time = "2025-08-09T07:56:27.25Z" },
{ url = "https://files.pythonhosted.org/packages/86/9e/f552f7a00611f168b9a5865a1414179b2c6de8235a4fa40189f6f79a1753/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:30d006f98569de3459c2fc1f2acde170b7b2bd265dc1943e87e1a4efe1b67c31", size = 156819, upload-time = "2025-08-09T07:56:28.515Z" },
{ url = "https://files.pythonhosted.org/packages/7e/95/42aa2156235cbc8fa61208aded06ef46111c4d3f0de233107b3f38631803/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:416175faf02e4b0810f1f38bcb54682878a4af94059a1cd63b8747244420801f", size = 151350, upload-time = "2025-08-09T07:56:29.716Z" },
{ url = "https://files.pythonhosted.org/packages/c2/a9/3865b02c56f300a6f94fc631ef54f0a8a29da74fb45a773dfd3dcd380af7/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6aab0f181c486f973bc7262a97f5aca3ee7e1437011ef0c2ec04b5a11d16c927", size = 148644, upload-time = "2025-08-09T07:56:30.984Z" },
{ url = "https://files.pythonhosted.org/packages/77/d9/cbcf1a2a5c7d7856f11e7ac2d782aec12bdfea60d104e60e0aa1c97849dc/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabf8315679312cfa71302f9bd509ded4f2f263fb5b765cf1433b39106c3cc9", size = 160468, upload-time = "2025-08-09T07:56:32.252Z" },
{ url = "https://files.pythonhosted.org/packages/f6/42/6f45efee8697b89fda4d50580f292b8f7f9306cb2971d4b53f8914e4d890/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:bd28b817ea8c70215401f657edef3a8aa83c29d447fb0b622c35403780ba11d5", size = 158187, upload-time = "2025-08-09T07:56:33.481Z" },
{ url = "https://files.pythonhosted.org/packages/70/99/f1c3bdcfaa9c45b3ce96f70b14f070411366fa19549c1d4832c935d8e2c3/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:18343b2d246dc6761a249ba1fb13f9ee9a2bcd95decc767319506056ea4ad4dc", size = 152699, upload-time = "2025-08-09T07:56:34.739Z" },
{ url = "https://files.pythonhosted.org/packages/a3/ad/b0081f2f99a4b194bcbb1934ef3b12aa4d9702ced80a37026b7607c72e58/charset_normalizer-3.4.3-cp313-cp313-win32.whl", hash = "sha256:6fb70de56f1859a3f71261cbe41005f56a7842cc348d3aeb26237560bfa5e0ce", size = 99580, upload-time = "2025-08-09T07:56:35.981Z" },
{ url = "https://files.pythonhosted.org/packages/9a/8f/ae790790c7b64f925e5c953b924aaa42a243fb778fed9e41f147b2a5715a/charset_normalizer-3.4.3-cp313-cp313-win_amd64.whl", hash = "sha256:cf1ebb7d78e1ad8ec2a8c4732c7be2e736f6e5123a4146c5b89c9d1f585f8cef", size = 107366, upload-time = "2025-08-09T07:56:37.339Z" },
{ url = "https://files.pythonhosted.org/packages/8e/91/b5a06ad970ddc7a0e513112d40113e834638f4ca1120eb727a249fb2715e/charset_normalizer-3.4.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3cd35b7e8aedeb9e34c41385fda4f73ba609e561faedfae0a9e75e44ac558a15", size = 204342, upload-time = "2025-08-09T07:56:38.687Z" },
{ url = "https://files.pythonhosted.org/packages/ce/ec/1edc30a377f0a02689342f214455c3f6c2fbedd896a1d2f856c002fc3062/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b89bc04de1d83006373429975f8ef9e7932534b8cc9ca582e4db7d20d91816db", size = 145995, upload-time = "2025-08-09T07:56:40.048Z" },
{ url = "https://files.pythonhosted.org/packages/17/e5/5e67ab85e6d22b04641acb5399c8684f4d37caf7558a53859f0283a650e9/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2001a39612b241dae17b4687898843f254f8748b796a2e16f1051a17078d991d", size = 158640, upload-time = "2025-08-09T07:56:41.311Z" },
{ url = "https://files.pythonhosted.org/packages/f1/e5/38421987f6c697ee3722981289d554957c4be652f963d71c5e46a262e135/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8dcfc373f888e4fb39a7bc57e93e3b845e7f462dacc008d9749568b1c4ece096", size = 156636, upload-time = "2025-08-09T07:56:43.195Z" },
{ url = "https://files.pythonhosted.org/packages/a0/e4/5a075de8daa3ec0745a9a3b54467e0c2967daaaf2cec04c845f73493e9a1/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:18b97b8404387b96cdbd30ad660f6407799126d26a39ca65729162fd810a99aa", size = 150939, upload-time = "2025-08-09T07:56:44.819Z" },
{ url = "https://files.pythonhosted.org/packages/02/f7/3611b32318b30974131db62b4043f335861d4d9b49adc6d57c1149cc49d4/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ccf600859c183d70eb47e05a44cd80a4ce77394d1ac0f79dbd2dd90a69a3a049", size = 148580, upload-time = "2025-08-09T07:56:46.684Z" },
{ url = "https://files.pythonhosted.org/packages/7e/61/19b36f4bd67f2793ab6a99b979b4e4f3d8fc754cbdffb805335df4337126/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:53cd68b185d98dde4ad8990e56a58dea83a4162161b1ea9272e5c9182ce415e0", size = 159870, upload-time = "2025-08-09T07:56:47.941Z" },
{ url = "https://files.pythonhosted.org/packages/06/57/84722eefdd338c04cf3030ada66889298eaedf3e7a30a624201e0cbe424a/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:30a96e1e1f865f78b030d65241c1ee850cdf422d869e9028e2fc1d5e4db73b92", size = 157797, upload-time = "2025-08-09T07:56:49.756Z" },
{ url = "https://files.pythonhosted.org/packages/72/2a/aff5dd112b2f14bcc3462c312dce5445806bfc8ab3a7328555da95330e4b/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d716a916938e03231e86e43782ca7878fb602a125a91e7acb8b5112e2e96ac16", size = 152224, upload-time = "2025-08-09T07:56:51.369Z" },
{ url = "https://files.pythonhosted.org/packages/b7/8c/9839225320046ed279c6e839d51f028342eb77c91c89b8ef2549f951f3ec/charset_normalizer-3.4.3-cp314-cp314-win32.whl", hash = "sha256:c6dbd0ccdda3a2ba7c2ecd9d77b37f3b5831687d8dc1b6ca5f56a4880cc7b7ce", size = 100086, upload-time = "2025-08-09T07:56:52.722Z" },
{ url = "https://files.pythonhosted.org/packages/ee/7a/36fbcf646e41f710ce0a563c1c9a343c6edf9be80786edeb15b6f62e17db/charset_normalizer-3.4.3-cp314-cp314-win_amd64.whl", hash = "sha256:73dc19b562516fc9bcf6e5d6e596df0b4eb98d87e4f79f3ae71840e6ed21361c", size = 107400, upload-time = "2025-08-09T07:56:55.172Z" },
{ url = "https://files.pythonhosted.org/packages/8a/1f/f041989e93b001bc4e44bb1669ccdcf54d3f00e628229a85b08d330615c5/charset_normalizer-3.4.3-py3-none-any.whl", hash = "sha256:ce571ab16d890d23b5c278547ba694193a45011ff86a9162a71307ed9f86759a", size = 53175, upload-time = "2025-08-09T07:57:26.864Z" },
]

[[package]]
name = "colorama"
version = "0.4.6"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
]

[[package]]
name = "idna"
version = "3.10"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" },
]

[[package]]
name = "iniconfig"
version = "2.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" },
]

[[package]]
name = "jellyplex-watched"
version = "8.3.0"
source = { virtual = "." }
dependencies = [
{ name = "loguru" },
{ name = "packaging" },
{ name = "plexapi" },
{ name = "pydantic" },
{ name = "python-dotenv" },
{ name = "requests" },
]

[package.dev-dependencies]
dev = [
{ name = "mypy" },
{ name = "pytest" },
{ name = "types-requests" },
]
lint = [
{ name = "ruff" },
]

[package.metadata]
requires-dist = [
{ name = "loguru", specifier = ">=0.7.3" },
{ name = "packaging", specifier = "==25.0" },
{ name = "plexapi", specifier = "==4.17.1" },
{ name = "pydantic", specifier = "==2.11.7" },
{ name = "python-dotenv", specifier = "==1.1.1" },
{ name = "requests", specifier = "==2.32.5" },
]

[package.metadata.requires-dev]
dev = [
{ name = "mypy", specifier = ">=1.16.1" },
{ name = "pytest", specifier = ">=8.4.1" },
{ name = "types-requests", specifier = ">=2.32.0.20250611" },
]
lint = [{ name = "ruff", specifier = ">=0.12.3" }]

[[package]]
name = "loguru"
version = "0.7.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "win32-setctime", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/3a/05/a1dae3dffd1116099471c643b8924f5aa6524411dc6c63fdae648c4f1aca/loguru-0.7.3.tar.gz", hash = "sha256:19480589e77d47b8d85b2c827ad95d49bf31b0dcde16593892eb51dd18706eb6", size = 63559, upload-time = "2024-12-06T11:20:56.608Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/0c/29/0348de65b8cc732daa3e33e67806420b2ae89bdce2b04af740289c5c6c8c/loguru-0.7.3-py3-none-any.whl", hash = "sha256:31a33c10c8e1e10422bfd431aeb5d351c7cf7fa671e3c4df004162264b28220c", size = 61595, upload-time = "2024-12-06T11:20:54.538Z" },
]

[[package]]
name = "mypy"
version = "1.18.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "mypy-extensions" },
{ name = "pathspec" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/14/a3/931e09fc02d7ba96da65266884da4e4a8806adcdb8a57faaacc6edf1d538/mypy-1.18.1.tar.gz", hash = "sha256:9e988c64ad3ac5987f43f5154f884747faf62141b7f842e87465b45299eea5a9", size = 3448447, upload-time = "2025-09-11T23:00:47.067Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e7/14/1c3f54d606cb88a55d1567153ef3a8bc7b74702f2ff5eb64d0994f9e49cb/mypy-1.18.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:502cde8896be8e638588b90fdcb4c5d5b8c1b004dfc63fd5604a973547367bb9", size = 12911082, upload-time = "2025-09-11T23:00:41.465Z" },
{ url = "https://files.pythonhosted.org/packages/90/83/235606c8b6d50a8eba99773add907ce1d41c068edb523f81eb0d01603a83/mypy-1.18.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7509549b5e41be279afc1228242d0e397f1af2919a8f2877ad542b199dc4083e", size = 11919107, upload-time = "2025-09-11T22:58:40.903Z" },
{ url = "https://files.pythonhosted.org/packages/ca/25/4e2ce00f8d15b99d0c68a2536ad63e9eac033f723439ef80290ec32c1ff5/mypy-1.18.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5956ecaabb3a245e3f34100172abca1507be687377fe20e24d6a7557e07080e2", size = 12472551, upload-time = "2025-09-11T22:58:37.272Z" },
{ url = "https://files.pythonhosted.org/packages/32/bb/92642a9350fc339dd9dcefcf6862d171b52294af107d521dce075f32f298/mypy-1.18.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8750ceb014a96c9890421c83f0db53b0f3b8633e2864c6f9bc0a8e93951ed18d", size = 13340554, upload-time = "2025-09-11T22:59:38.756Z" },
{ url = "https://files.pythonhosted.org/packages/cd/ee/38d01db91c198fb6350025d28f9719ecf3c8f2c55a0094bfbf3ef478cc9a/mypy-1.18.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fb89ea08ff41adf59476b235293679a6eb53a7b9400f6256272fb6029bec3ce5", size = 13530933, upload-time = "2025-09-11T22:59:20.228Z" },
{ url = "https://files.pythonhosted.org/packages/da/8d/6d991ae631f80d58edbf9d7066e3f2a96e479dca955d9a968cd6e90850a3/mypy-1.18.1-cp312-cp312-win_amd64.whl", hash = "sha256:2657654d82fcd2a87e02a33e0d23001789a554059bbf34702d623dafe353eabf", size = 9828426, upload-time = "2025-09-11T23:00:21.007Z" },
{ url = "https://files.pythonhosted.org/packages/e4/ec/ef4a7260e1460a3071628a9277a7579e7da1b071bc134ebe909323f2fbc7/mypy-1.18.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d70d2b5baf9b9a20bc9c730015615ae3243ef47fb4a58ad7b31c3e0a59b5ef1f", size = 12918671, upload-time = "2025-09-11T22:58:29.814Z" },
{ url = "https://files.pythonhosted.org/packages/a1/82/0ea6c3953f16223f0b8eda40c1aeac6bd266d15f4902556ae6e91f6fca4c/mypy-1.18.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b8367e33506300f07a43012fc546402f283c3f8bcff1dc338636affb710154ce", size = 11913023, upload-time = "2025-09-11T23:00:29.049Z" },
{ url = "https://files.pythonhosted.org/packages/ae/ef/5e2057e692c2690fc27b3ed0a4dbde4388330c32e2576a23f0302bc8358d/mypy-1.18.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:913f668ec50c3337b89df22f973c1c8f0b29ee9e290a8b7fe01cc1ef7446d42e", size = 12473355, upload-time = "2025-09-11T23:00:04.544Z" },
{ url = "https://files.pythonhosted.org/packages/98/43/b7e429fc4be10e390a167b0cd1810d41cb4e4add4ae50bab96faff695a3b/mypy-1.18.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1a0e70b87eb27b33209fa4792b051c6947976f6ab829daa83819df5f58330c71", size = 13346944, upload-time = "2025-09-11T22:58:23.024Z" },
{ url = "https://files.pythonhosted.org/packages/89/4e/899dba0bfe36bbd5b7c52e597de4cf47b5053d337b6d201a30e3798e77a6/mypy-1.18.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c378d946e8a60be6b6ede48c878d145546fb42aad61df998c056ec151bf6c746", size = 13512574, upload-time = "2025-09-11T22:59:52.152Z" },
{ url = "https://files.pythonhosted.org/packages/f5/f8/7661021a5b0e501b76440454d786b0f01bb05d5c4b125fcbda02023d0250/mypy-1.18.1-cp313-cp313-win_amd64.whl", hash = "sha256:2cd2c1e0f3a7465f22731987fff6fc427e3dcbb4ca5f7db5bbeaff2ff9a31f6d", size = 9837684, upload-time = "2025-09-11T22:58:44.454Z" },
{ url = "https://files.pythonhosted.org/packages/bf/87/7b173981466219eccc64c107cf8e5ab9eb39cc304b4c07df8e7881533e4f/mypy-1.18.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:ba24603c58e34dd5b096dfad792d87b304fc6470cbb1c22fd64e7ebd17edcc61", size = 12900265, upload-time = "2025-09-11T22:59:03.4Z" },
{ url = "https://files.pythonhosted.org/packages/ae/cc/b10e65bae75b18a5ac8f81b1e8e5867677e418f0dd2c83b8e2de9ba96ebd/mypy-1.18.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ed36662fb92ae4cb3cacc682ec6656208f323bbc23d4b08d091eecfc0863d4b5", size = 11942890, upload-time = "2025-09-11T23:00:00.607Z" },
{ url = "https://files.pythonhosted.org/packages/39/d4/aeefa07c44d09f4c2102e525e2031bc066d12e5351f66b8a83719671004d/mypy-1.18.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:040ecc95e026f71a9ad7956fea2724466602b561e6a25c2e5584160d3833aaa8", size = 12472291, upload-time = "2025-09-11T22:59:43.425Z" },
{ url = "https://files.pythonhosted.org/packages/c6/07/711e78668ff8e365f8c19735594ea95938bff3639a4c46a905e3ed8ff2d6/mypy-1.18.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:937e3ed86cb731276706e46e03512547e43c391a13f363e08d0fee49a7c38a0d", size = 13318610, upload-time = "2025-09-11T23:00:17.604Z" },
{ url = "https://files.pythonhosted.org/packages/ca/85/df3b2d39339c31d360ce299b418c55e8194ef3205284739b64962f6074e7/mypy-1.18.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:1f95cc4f01c0f1701ca3b0355792bccec13ecb2ec1c469e5b85a6ef398398b1d", size = 13513697, upload-time = "2025-09-11T22:58:59.534Z" },
{ url = "https://files.pythonhosted.org/packages/b1/df/462866163c99ea73bb28f0eb4d415c087e30de5d36ee0f5429d42e28689b/mypy-1.18.1-cp314-cp314-win_amd64.whl", hash = "sha256:e4f16c0019d48941220ac60b893615be2f63afedaba6a0801bdcd041b96991ce", size = 9985739, upload-time = "2025-09-11T22:58:51.644Z" },
{ url = "https://files.pythonhosted.org/packages/e0/1d/4b97d3089b48ef3d904c9ca69fab044475bd03245d878f5f0b3ea1daf7ce/mypy-1.18.1-py3-none-any.whl", hash = "sha256:b76a4de66a0ac01da1be14ecc8ae88ddea33b8380284a9e3eae39d57ebcbe26e", size = 2352212, upload-time = "2025-09-11T22:59:26.576Z" },
]

[[package]]
name = "mypy-extensions"
version = "1.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" },
]

[[package]]
name = "packaging"
version = "25.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
]

[[package]]
name = "pathspec"
version = "0.12.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" },
]

[[package]]
name = "plexapi"
version = "4.17.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/2a/02/1bebd67c3cd94a45f6c3520da971791b66457535c9771d8e0068746d7bc2/plexapi-4.17.1.tar.gz", hash = "sha256:1e5bfb486bb150e058a80ff4fb9aff9e3efce644c56d52bb5297272e005d8241", size = 154746, upload-time = "2025-08-26T00:11:02.819Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c3/1c/9fdaa0e1f797dde3c3cb56d7b222109009f70380e7f49fc0ff42d5705409/plexapi-4.17.1-py3-none-any.whl", hash = "sha256:9d51adb112a2b0b7aa91a928c8b5c0dfffc0d51108cea67d86fea08cee06c998", size = 166861, upload-time = "2025-08-26T00:11:00.89Z" },
]

[[package]]
name = "pluggy"
version = "1.6.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
]

[[package]]
name = "pydantic"
version = "2.11.7"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "annotated-types" },
{ name = "pydantic-core" },
{ name = "typing-extensions" },
{ name = "typing-inspection" },
]
sdist = { url = "https://files.pythonhosted.org/packages/00/dd/4325abf92c39ba8623b5af936ddb36ffcfe0beae70405d456ab1fb2f5b8c/pydantic-2.11.7.tar.gz", hash = "sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db", size = 788350, upload-time = "2025-06-14T08:33:17.137Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6a/c0/ec2b1c8712ca690e5d61979dee872603e92b8a32f94cc1b72d53beab008a/pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b", size = 444782, upload-time = "2025-06-14T08:33:14.905Z" },
]

[[package]]
name = "pydantic-core"
version = "2.33.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195, upload-time = "2025-04-23T18:33:52.104Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000, upload-time = "2025-04-23T18:31:25.863Z" },
{ url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996, upload-time = "2025-04-23T18:31:27.341Z" },
{ url = "https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957, upload-time = "2025-04-23T18:31:28.956Z" },
{ url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199, upload-time = "2025-04-23T18:31:31.025Z" },
{ url = "https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296, upload-time = "2025-04-23T18:31:32.514Z" },
{ url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109, upload-time = "2025-04-23T18:31:33.958Z" },
{ url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028, upload-time = "2025-04-23T18:31:39.095Z" },
{ url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044, upload-time = "2025-04-23T18:31:41.034Z" },
{ url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881, upload-time = "2025-04-23T18:31:42.757Z" },
{ url = "https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034, upload-time = "2025-04-23T18:31:44.304Z" },
{ url = "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187, upload-time = "2025-04-23T18:31:45.891Z" },
{ url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628, upload-time = "2025-04-23T18:31:47.819Z" },
{ url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866, upload-time = "2025-04-23T18:31:49.635Z" },
{ url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894, upload-time = "2025-04-23T18:31:51.609Z" },
{ url = "https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688, upload-time = "2025-04-23T18:31:53.175Z" },
{ url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808, upload-time = "2025-04-23T18:31:54.79Z" },
{ url = "https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580, upload-time = "2025-04-23T18:31:57.393Z" },
{ url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859, upload-time = "2025-04-23T18:31:59.065Z" },
{ url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810, upload-time = "2025-04-23T18:32:00.78Z" },
{ url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498, upload-time = "2025-04-23T18:32:02.418Z" },
{ url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", size = 2000611, upload-time = "2025-04-23T18:32:04.152Z" },
{ url = "https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924, upload-time = "2025-04-23T18:32:06.129Z" },
{ url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196, upload-time = "2025-04-23T18:32:08.178Z" },
{ url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389, upload-time = "2025-04-23T18:32:10.242Z" },
{ url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223, upload-time = "2025-04-23T18:32:12.382Z" },
{ url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473, upload-time = "2025-04-23T18:32:14.034Z" },
{ url = "https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269, upload-time = "2025-04-23T18:32:15.783Z" },
{ url = "https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921, upload-time = "2025-04-23T18:32:18.473Z" },
{ url = "https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162, upload-time = "2025-04-23T18:32:20.188Z" },
{ url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560, upload-time = "2025-04-23T18:32:22.354Z" },
{ url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777, upload-time = "2025-04-23T18:32:25.088Z" },
]

[[package]]
name = "pygments"
version = "2.19.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" },
]

[[package]]
name = "pytest"
version = "8.4.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "iniconfig" },
{ name = "packaging" },
{ name = "pluggy" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a3/5c/00a0e072241553e1a7496d638deababa67c5058571567b92a7eaa258397c/pytest-8.4.2.tar.gz", hash = "sha256:86c0d0b93306b961d58d62a4db4879f27fe25513d4b969df351abdddb3c30e01", size = 1519618, upload-time = "2025-09-04T14:34:22.711Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a8/a4/20da314d277121d6534b3a980b29035dcd51e6744bd79075a6ce8fa4eb8d/pytest-8.4.2-py3-none-any.whl", hash = "sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79", size = 365750, upload-time = "2025-09-04T14:34:20.226Z" },
]

[[package]]
name = "python-dotenv"
version = "1.1.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f6/b0/4bc07ccd3572a2f9df7e6782f52b0c6c90dcbb803ac4a167702d7d0dfe1e/python_dotenv-1.1.1.tar.gz", hash = "sha256:a8a6399716257f45be6a007360200409fce5cda2661e3dec71d23dc15f6189ab", size = 41978, upload-time = "2025-06-24T04:21:07.341Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5f/ed/539768cf28c661b5b068d66d96a2f155c4971a5d55684a514c1a0e0dec2f/python_dotenv-1.1.1-py3-none-any.whl", hash = "sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc", size = 20556, upload-time = "2025-06-24T04:21:06.073Z" },
]

[[package]]
name = "requests"
version = "2.32.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "charset-normalizer" },
{ name = "idna" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
]

[[package]]
name = "ruff"
version = "0.13.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6e/1a/1f4b722862840295bcaba8c9e5261572347509548faaa99b2d57ee7bfe6a/ruff-0.13.0.tar.gz", hash = "sha256:5b4b1ee7eb35afae128ab94459b13b2baaed282b1fb0f472a73c82c996c8ae60", size = 5372863, upload-time = "2025-09-10T16:25:37.917Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ac/fe/6f87b419dbe166fd30a991390221f14c5b68946f389ea07913e1719741e0/ruff-0.13.0-py3-none-linux_armv6l.whl", hash = "sha256:137f3d65d58ee828ae136a12d1dc33d992773d8f7644bc6b82714570f31b2004", size = 12187826, upload-time = "2025-09-10T16:24:39.5Z" },
{ url = "https://files.pythonhosted.org/packages/e4/25/c92296b1fc36d2499e12b74a3fdb230f77af7bdf048fad7b0a62e94ed56a/ruff-0.13.0-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:21ae48151b66e71fd111b7d79f9ad358814ed58c339631450c66a4be33cc28b9", size = 12933428, upload-time = "2025-09-10T16:24:43.866Z" },
{ url = "https://files.pythonhosted.org/packages/44/cf/40bc7221a949470307d9c35b4ef5810c294e6cfa3caafb57d882731a9f42/ruff-0.13.0-py3-none-macosx_11_0_arm64.whl", hash = "sha256:64de45f4ca5441209e41742d527944635a05a6e7c05798904f39c85bafa819e3", size = 12095543, upload-time = "2025-09-10T16:24:46.638Z" },
{ url = "https://files.pythonhosted.org/packages/f1/03/8b5ff2a211efb68c63a1d03d157e924997ada87d01bebffbd13a0f3fcdeb/ruff-0.13.0-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2b2c653ae9b9d46e0ef62fc6fbf5b979bda20a0b1d2b22f8f7eb0cde9f4963b8", size = 12312489, upload-time = "2025-09-10T16:24:49.556Z" },
{ url = "https://files.pythonhosted.org/packages/37/fc/2336ef6d5e9c8d8ea8305c5f91e767d795cd4fc171a6d97ef38a5302dadc/ruff-0.13.0-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4cec632534332062bc9eb5884a267b689085a1afea9801bf94e3ba7498a2d207", size = 11991631, upload-time = "2025-09-10T16:24:53.439Z" },
{ url = "https://files.pythonhosted.org/packages/39/7f/f6d574d100fca83d32637d7f5541bea2f5e473c40020bbc7fc4a4d5b7294/ruff-0.13.0-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dcd628101d9f7d122e120ac7c17e0a0f468b19bc925501dbe03c1cb7f5415b24", size = 13720602, upload-time = "2025-09-10T16:24:56.392Z" },
{ url = "https://files.pythonhosted.org/packages/fd/c8/a8a5b81d8729b5d1f663348d11e2a9d65a7a9bd3c399763b1a51c72be1ce/ruff-0.13.0-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:afe37db8e1466acb173bb2a39ca92df00570e0fd7c94c72d87b51b21bb63efea", size = 14697751, upload-time = "2025-09-10T16:24:59.89Z" },
{ url = "https://files.pythonhosted.org/packages/57/f5/183ec292272ce7ec5e882aea74937f7288e88ecb500198b832c24debc6d3/ruff-0.13.0-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0f96a8d90bb258d7d3358b372905fe7333aaacf6c39e2408b9f8ba181f4b6ef2", size = 14095317, upload-time = "2025-09-10T16:25:03.025Z" },
{ url = "https://files.pythonhosted.org/packages/9f/8d/7f9771c971724701af7926c14dab31754e7b303d127b0d3f01116faef456/ruff-0.13.0-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:94b5e3d883e4f924c5298e3f2ee0f3085819c14f68d1e5b6715597681433f153", size = 13144418, upload-time = "2025-09-10T16:25:06.272Z" },
{ url = "https://files.pythonhosted.org/packages/a8/a6/7985ad1778e60922d4bef546688cd8a25822c58873e9ff30189cfe5dc4ab/ruff-0.13.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03447f3d18479df3d24917a92d768a89f873a7181a064858ea90a804a7538991", size = 13370843, upload-time = "2025-09-10T16:25:09.965Z" },
{ url = "https://files.pythonhosted.org/packages/64/1c/bafdd5a7a05a50cc51d9f5711da704942d8dd62df3d8c70c311e98ce9f8a/ruff-0.13.0-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:fbc6b1934eb1c0033da427c805e27d164bb713f8e273a024a7e86176d7f462cf", size = 13321891, upload-time = "2025-09-10T16:25:12.969Z" },
{ url = "https://files.pythonhosted.org/packages/bc/3e/7817f989cb9725ef7e8d2cee74186bf90555279e119de50c750c4b7a72fe/ruff-0.13.0-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:a8ab6a3e03665d39d4a25ee199d207a488724f022db0e1fe4002968abdb8001b", size = 12119119, upload-time = "2025-09-10T16:25:16.621Z" },
{ url = "https://files.pythonhosted.org/packages/58/07/9df080742e8d1080e60c426dce6e96a8faf9a371e2ce22eef662e3839c95/ruff-0.13.0-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:d2a5c62f8ccc6dd2fe259917482de7275cecc86141ee10432727c4816235bc41", size = 11961594, upload-time = "2025-09-10T16:25:19.49Z" },
{ url = "https://files.pythonhosted.org/packages/6a/f4/ae1185349197d26a2316840cb4d6c3fba61d4ac36ed728bf0228b222d71f/ruff-0.13.0-py3-none-musllinux_1_2_i686.whl", hash = "sha256:b7b85ca27aeeb1ab421bc787009831cffe6048faae08ad80867edab9f2760945", size = 12933377, upload-time = "2025-09-10T16:25:22.371Z" },
{ url = "https://files.pythonhosted.org/packages/b6/39/e776c10a3b349fc8209a905bfb327831d7516f6058339a613a8d2aaecacd/ruff-0.13.0-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:79ea0c44a3032af768cabfd9616e44c24303af49d633b43e3a5096e009ebe823", size = 13418555, upload-time = "2025-09-10T16:25:25.681Z" },
{ url = "https://files.pythonhosted.org/packages/46/09/dca8df3d48e8b3f4202bf20b1658898e74b6442ac835bfe2c1816d926697/ruff-0.13.0-py3-none-win32.whl", hash = "sha256:4e473e8f0e6a04e4113f2e1de12a5039579892329ecc49958424e5568ef4f768", size = 12141613, upload-time = "2025-09-10T16:25:28.664Z" },
{ url = "https://files.pythonhosted.org/packages/61/21/0647eb71ed99b888ad50e44d8ec65d7148babc0e242d531a499a0bbcda5f/ruff-0.13.0-py3-none-win_amd64.whl", hash = "sha256:48e5c25c7a3713eea9ce755995767f4dcd1b0b9599b638b12946e892123d1efb", size = 13258250, upload-time = "2025-09-10T16:25:31.773Z" },
{ url = "https://files.pythonhosted.org/packages/e1/a3/03216a6a86c706df54422612981fb0f9041dbb452c3401501d4a22b942c9/ruff-0.13.0-py3-none-win_arm64.whl", hash = "sha256:ab80525317b1e1d38614addec8ac954f1b3e662de9d59114ecbf771d00cf613e", size = 12312357, upload-time = "2025-09-10T16:25:35.595Z" },
]

[[package]]
name = "types-requests"
version = "2.32.4.20250809"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ed/b0/9355adb86ec84d057fea765e4c49cce592aaf3d5117ce5609a95a7fc3dac/types_requests-2.32.4.20250809.tar.gz", hash = "sha256:d8060de1c8ee599311f56ff58010fb4902f462a1470802cf9f6ed27bc46c4df3", size = 23027, upload-time = "2025-08-09T03:17:10.664Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2b/6f/ec0012be842b1d888d46884ac5558fd62aeae1f0ec4f7a581433d890d4b5/types_requests-2.32.4.20250809-py3-none-any.whl", hash = "sha256:f73d1832fb519ece02c85b1f09d5f0dd3108938e7d47e7f94bbfa18a6782b163", size = 20644, upload-time = "2025-08-09T03:17:09.716Z" },
]

[[package]]
name = "typing-extensions"
version = "4.15.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
]

[[package]]
name = "typing-inspection"
version = "0.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f8/b1/0c11f5058406b3af7609f121aaa6b609744687f1d158b3c3a5bf4cc94238/typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28", size = 75726, upload-time = "2025-05-21T18:55:23.885Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/17/69/cd203477f944c353c31bade965f880aa1061fd6bf05ded0726ca845b6ff7/typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51", size = 14552, upload-time = "2025-05-21T18:55:22.152Z" },
]

[[package]]
name = "urllib3"
version = "2.5.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" },
]

[[package]]
name = "win32-setctime"
version = "1.2.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b3/8f/705086c9d734d3b663af0e9bb3d4de6578d08f46b1b101c2442fd9aecaa2/win32_setctime-1.2.0.tar.gz", hash = "sha256:ae1fdf948f5640aae05c511ade119313fb6a30d7eabe25fef9764dca5873c4c0", size = 4867, upload-time = "2024-12-07T15:28:28.314Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e1/07/c6fe3ad3e685340704d314d765b7912993bcb8dc198f0e7a89382d37974b/win32_setctime-1.2.0-py3-none-any.whl", hash = "sha256:95d644c4e708aba81dc3704a116d8cbc974d70b3bdb8be1d150e36be6e9d1390", size = 4083, upload-time = "2024-12-07T15:28:26.465Z" },
]