308 Commits

Author SHA1 Message Date
Luigi311
b3175305bd Merge pull request #164 from luigi311/dev
Dev
2024-04-15 15:06:24 -06:00
Luis Garcia
5b1933cb08 format 2024-04-15 15:03:18 -06:00
Luis Garcia
ae71ca0940 Jellyfin/Plex: Log when guid items are missing 2024-04-14 17:44:31 -06:00
Luis Garcia
9b38729b95 Watched: Use get for season
Use get to avoid a KeyError if the season doesn't exist.
2024-04-14 17:08:49 -06:00
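A minimal sketch of the pattern this commit describes (the dictionary shape is hypothetical, not the project's actual data structure):

```python
# Indexing a season that was never recorded raises KeyError; .get() returns
# a default instead, so the caller can skip the season cleanly.
watched_show = {1: ["S01E01", "S01E02"]}  # hypothetical watched data

episodes = watched_show.get(2, [])  # season 2 missing -> [] instead of KeyError
for episode in episodes:
    print(episode)
```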
Luis Garcia
402c286742 Plex: format 2024-04-14 17:06:37 -06:00
Luis Garcia
dcd4ac1d36 Gitignore: expand .env 2024-04-14 17:06:29 -06:00
Luis Garcia
e6fbf746d8 CI: Increase wait 2024-04-14 17:06:07 -06:00
Luis Garcia
803d248cb8 Jellyfin: Skip if UserData is not available 2024-04-05 00:47:24 -06:00
Luis Garcia
713be6970c Jellyfin: Fix error status code 2024-04-05 00:26:24 -06:00
Luigi311
62509f16db Merge pull request #151 from luigi311/dev
Return empty if season/show are missing from episode_watched_list
2024-02-13 15:27:44 -07:00
Luigi311
84899aef50 Return empty if season/show are missing from episode_watched_list
Signed-off-by: Luigi311 <git@luigi311.com>
2024-02-11 02:21:42 -07:00
Luigi311
86b30e1887 Merge pull request #147 from luigi311/dev
Plex: Use username
2024-01-25 17:53:38 -07:00
Luigi311
033ef76cfe Plex: Use username
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-25 17:41:43 -07:00
Luigi311
815596379c Merge pull request #141 from luigi311/dev
Dev
2024-01-18 16:07:10 -07:00
Luigi311
bc5e8bc65d Update requirements
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-18 15:40:02 -07:00
Luigi311
b32de7259b Jellyfin: Swap out is not
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-18 15:39:52 -07:00
Luigi311
29cb0cebd5 Merge pull request #140 from luigi311/fixes
Fixes
2024-01-17 15:01:59 -07:00
Luigi311
6744ebcb5b Jellyfin: Skip season if no indexnumber
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-16 04:07:40 -07:00
Luigi311
c6b026a82d Jellyfin: Remove redundant keys call
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-16 03:22:08 -07:00
Luigi311
cc706938ce Jellyfin: Add generate_guids/locations. Cleanup
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-16 02:53:36 -07:00
Luigi311
84b98db36b Jellyfin: Add timeout
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-16 00:50:22 -07:00
Luigi311
01ad15e2bd CI: Update action versions
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-16 00:27:01 -07:00
Luigi311
54adf0e56f Jellyfin: Remove async
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-16 00:21:58 -07:00
Luigi311
025e40b098 Merge pull request #139 from luigi311/dev
Dev
2024-01-12 19:19:23 -07:00
Luigi311
4534854001 Merge pull request #136 from luigi311/plex_optimize
Plex: Add GENERATE_GUIDS, remove recursive thread calls
2024-01-12 19:08:45 -07:00
Luigi311
362d54b471 plex: guids
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-12 17:55:28 -07:00
Luigi311
fa533ff65e Test Guids/Locations/both
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-12 17:22:48 -07:00
Luigi311
96fe367562 Add GENERATE_LOCATIONS
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-12 17:22:48 -07:00
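GENERATE_GUIDS and GENERATE_LOCATIONS (documented in the .env.sample diff further down) let users skip building one of the two identifier sets to speed up matching. A hedged sketch of how such toggles could gate the work; the env parsing, helper, and item fields here are illustrative, not the project's exact code:

```python
import os

def str_to_bool(value: str) -> bool:
    # Illustrative helper; the project has its own boolean parsing.
    return value.strip().lower() in ("true", "1", "yes")

GENERATE_GUIDS = str_to_bool(os.getenv("GENERATE_GUIDS", "True"))
GENERATE_LOCATIONS = str_to_bool(os.getenv("GENERATE_LOCATIONS", "True"))

def identifiers(item) -> dict:
    # Collect only what the user asked for; skipping the GUID lookups is
    # the speed win when the same files exist on every server.
    ids = {}
    if GENERATE_GUIDS:
        ids["guids"] = getattr(item, "guids", [])
    if GENERATE_LOCATIONS:
        ids["locations"] = getattr(item, "locations", [])
    return ids
```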
Luigi311
9566ffa384 CI: Test twice
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-12 17:22:48 -07:00
Luigi311
f5835e1e72 Add GENERATE_GUIDS environment 2024-01-12 17:22:48 -07:00
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-12 17:22:48 -07:00
Luigi311
fe65716706 Plex: Remove recursive thread calls
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-12 17:22:16 -07:00
Luigi311
873735900f Functions: Add override_threads
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-12 17:22:16 -07:00
Luigi311
28c166146e Merge pull request #138 from remos/plex-guid-fix
Plex: fix guid lookup for X -> Plex sync
2024-01-12 12:07:42 -07:00
Somer Hayter
c6affc3108 Plex: add logging for failed find_video + get_video_status 2024-01-13 00:10:34 +11:00
Somer Hayter
59b49fd0df Plex: Fix guid lookup in find_video and get_video_status 2024-01-13 00:10:34 +11:00
Luigi311
6ec003f899 Merge pull request #135 from luigi311/dev
Dev
2024-01-06 04:45:19 -07:00
Luigi311
95f2a9ad30 If only one worker, run in main thread to avoid overhead 2024-01-06 01:13:15 -07:00
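A hedged sketch of the "one worker, no pool" optimization this commit describes; the function and variable names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def run_tasks(tasks, workers: int):
    # A pool with a single worker only adds scheduling overhead, so run
    # the tasks directly in the main thread in that case.
    if workers <= 1:
        return [task() for task in tasks]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(task) for task in tasks]
        return [future.result() for future in futures]
```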
Luigi311
7317e8533d Watched: Use enumerate 2024-01-06 00:16:13 -07:00
Luigi311
f80c20d70c Watched: Remove deepcopy due to performance 2024-01-05 23:46:15 -07:00
Luigi311
01fc13c3e0 Merge branch 'dev' of github.com:luigi311/JellyPlex-Watched into dev 2024-01-05 22:45:19 -07:00
Luigi311
1edfecae42 Cleanup 2024-01-05 22:44:56 -07:00
Luigi311
9dab9a4632 Merge branch 'main' into dev 2024-01-05 15:12:54 -07:00
Luigi311
98a824bfdc Plex: Format 2024-01-05 14:58:24 -07:00
Luigi311
8fa9351ef1 Plex: Only partially watched more than 1 min 2024-01-05 14:58:24 -07:00
Roberto Banić
64b2197844 Remove unnecessary check 2024-01-05 14:58:24 -07:00
Roberto Banić
26f1f80be7 Refactor get_user_library_watched 2024-01-05 14:58:24 -07:00
Roberto Banić
2e4c2a6817 Refactor get_user_library_watched_show 2024-01-05 14:58:24 -07:00
Roberto Banić
9498335e22 Deduplicate get_movie_guids and get_episode_guids 2024-01-05 14:58:24 -07:00
Roberto Banić
26f40110d0 Bump minimum Python version to 3.9 2024-01-05 14:58:24 -07:00
Luigi311
9375d482b0 CI: Improve mark validation 2024-01-05 14:58:24 -07:00
Luigi311
de9180a124 Handle episode names that are not unique 2024-01-05 14:58:24 -07:00
Luigi311
ba480d2cb7 CI: Add workflow dispatch 2024-01-05 14:58:24 -07:00
Luigi311
5014748ee1 CI: Speedup start containers 2024-01-05 14:58:24 -07:00
Luigi311
4e25ae5539 CI: Validate mark log 2024-01-05 14:58:24 -07:00
Jan Willhaus
a2b802a5de Add tini for sigterm handling 2024-01-05 14:58:24 -07:00
Luigi311
9739b27718 Remove failed message from show/episode/movie dict
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:55 -07:00
Luigi311
bdf6476689 Watched: combine_watched_dicts check types
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:55 -07:00
Luigi311
b8b627be1a Use season number instead of season name
Using season name is not reliable as it can vary between servers
and can be overridden by the user.

Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:55 -07:00
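A small illustration of why keying by season number is more stable than keying by the display name (the data below is made up):

```python
# The same season can be titled differently per server ("Season 1" vs
# "Series 1") or renamed by the user, but its index number stays constant.
plex_show = {"Season 1": ["E1", "E2"]}      # keyed by display name
jellyfin_show = {"Series 1": ["E1", "E2"]}  # different name, same season

# Keying both sides by the season number makes the comparison reliable:
plex_by_number = {1: plex_show["Season 1"]}
jellyfin_by_number = {1: jellyfin_show["Series 1"]}
assert plex_by_number.keys() == jellyfin_by_number.keys()
```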
Luigi311
03cad668aa README: Add troubleshooting/Issues
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:55 -07:00
Luigi311
2e0ec9aa38 Plex: Use updateTimeline instead of updateProgress
Not all accounts have access to updateProgress, so use updateTimeline instead

Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:55 -07:00
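The switch is from plexapi's updateProgress to updateTimeline. A minimal sketch under the assumption that the item exposes plexapi's updateTimeline(time, state, duration) with times in milliseconds; treat the exact signature as an assumption, not a guarantee:

```python
def mark_partial(video, viewed_ms: int) -> None:
    # `video` is assumed to be a plexapi Episode/Movie. updateProgress is
    # not available to every account type, so report the resume position
    # through updateTimeline instead (time values in milliseconds).
    video.updateTimeline(viewed_ms, state="stopped", duration=video.duration)
```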
Luigi311
4b02aae889 Show average time on exit
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:55 -07:00
Luigi311
c91ba0b1b3 Action: Add test
Spins up jellyfin and plex containers to test against
2024-01-05 14:57:55 -07:00
dependabot[bot]
6b7f8b04e6 Bump aiohttp from 3.8.6 to 3.9.0
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.8.6 to 3.9.0.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.8.6...v3.9.0)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-05 14:57:55 -07:00
Luigi311
5472baab51 Action: Limit ghcr push to luigi311
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:55 -07:00
Luigi311
d5b6859bf8 Action: Add default variant 2024-01-05 14:57:55 -07:00
Luigi311
8e23847c79 README: Change configuration to point to .env.sample 2024-01-05 14:57:55 -07:00
Luigi311
0c1579bae7 Use non root for containers 2024-01-05 14:57:55 -07:00
Luigi311
3dc50fff95 Docker-compose: Add markfile. Add user mapping ex 2024-01-05 14:57:55 -07:00
Luigi311
b8273f50c2 MARKFILE match LOGFILE 2024-01-05 14:57:55 -07:00
Luigi311
dbea28e9c6 Docker: Add RUN_ONLY_ONCE and MARKFILE 2024-01-05 14:57:55 -07:00
Luigi311
a1b11ab039 Add unraid to type 2024-01-05 14:57:55 -07:00
Luigi311
1841b0dea6 Jellyfin: Remove headers append 2024-01-05 14:57:55 -07:00
Luigi311
b311bf2770 Add MARK/DRYRUN logger levels 2024-01-05 14:57:55 -07:00
Luigi311
df13cef760 Add mark list support 2024-01-05 14:57:55 -07:00
Luigi311
76ac264b25 Add example baseurl/token to docker-compose 2024-01-05 14:57:44 -07:00
Luigi311
93bc94add5 Pin to 3.11 due to 3.12 issues 2024-01-05 14:57:44 -07:00
neofright
79325b8c61 Update README.md
Remove another unnecessary capitalisation.
2024-01-05 14:57:44 -07:00
neofright
58c1eb7004 Improve README.md
- Inprogress is not a word in English, but two separate words.
- Many words are unnecessarily capitalised as they are not names or at the beginning of sentences.
- Prefer 'usernames' to 'usersnames'
2024-01-05 14:57:44 -07:00
neofright
466f292feb Typo in .env.sample 2024-01-05 14:57:44 -07:00
Luigi311
4de25a0d4a Print server info
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:44 -07:00
Luigi311
43d6bc0d82 Timeout issues (#103)
* Add timeout support for jellyfin requests

Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:57:44 -07:00
Luigi311
b53d7c9ecc Add docker compose to types
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Luigi311
116d50a75a Add max_threads
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Luigi311
e1fb365096 Update apis
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Luigi311
03617dacfc Jellyfin: Remove isPlayed, Use get for name
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Luigi311
e6b33f1bc9 Jellyfin: Fix locations logic
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Luigi311
d9e6a554f6 Disable fast fail
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Luigi311
7ef37fe848 Jellyfin: Check for provider_source in episode
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Luigi311
dd64617cbd Jellyfin: Handle missing paths
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Lai Jiang
a227c01a7f Fix a type 2024-01-05 14:55:56 -07:00
Luigi311
da53609385 Jellyfin: Remove reassigning jellyfin_episode_id
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
Luigi311
e94a8fb2c3 Jellyfin: Fix errors with missing matches
Signed-off-by: Luigi311 <git@luigi311.com>
2024-01-05 14:55:56 -07:00
dependabot[bot]
d87542ab78 Bump requests from 2.28.2 to 2.31.0
Bumps [requests](https://github.com/psf/requests) from 2.28.2 to 2.31.0.
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](https://github.com/psf/requests/compare/v2.28.2...v2.31.0)

---
updated-dependencies:
- dependency-name: requests
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-05 14:55:56 -07:00
Luigi311
945ffb2fb3 Plex: Cleanup username_title 2024-01-05 14:55:56 -07:00
Luigi311
da808ba25e CI: Add back in dev based on alpine 2024-01-05 14:55:56 -07:00
Luigi311
e4b4c7ba39 plex: Fix username/title 2024-01-05 14:55:56 -07:00
Luigi311
43ead4bb0f Plex: Fix username/title selection 2024-01-05 14:55:56 -07:00
Luigi311
c4a2f8af39 Users: Default to username and fall back to title 2024-01-05 14:55:56 -07:00
Luigi311
fd281a50b6 Log both servers users instead of exiting immediately 2024-01-05 14:55:56 -07:00
Luigi311
f8ef4fe6c9 Add docker-compose file 2024-01-05 14:55:56 -07:00
Luigi311
faef0ae246 Merge pull request #125 from luigi311/dev
Use season number instead of season name
2023-12-10 21:37:59 -07:00
Luigi311
117932e272 Use season number instead of season name
Using season name is not reliable as it can vary between servers
and can be overridden by the user.

Signed-off-by: Luigi311 <git@luigi311.com>
2023-12-10 10:41:59 -07:00
Luigi311
4297708d3e Merge pull request #124 from luigi311/dev
Dev
2023-12-10 09:56:19 -07:00
Luigi311
2d00d8cb3e README: Add troubleshooting/Issues
Signed-off-by: Luigi311 <git@luigi311.com>
2023-12-10 09:51:47 -07:00
Luigi311
0190788658 Plex: Use updateTimeline instead of updateProgress
Not all accounts have access to updateProgress, so use updateTimeline instead

Signed-off-by: Luigi311 <git@luigi311.com>
2023-12-10 09:38:19 -07:00
Luigi311
b46d4a7166 Show average time on exit
Signed-off-by: Luigi311 <git@luigi311.com>
2023-12-10 09:38:19 -07:00
Luigi311
994d529f59 Action: Add test
Spins up jellyfin and plex containers to test against
2023-12-10 09:38:19 -07:00
Luigi311
7f347ae186 Merge pull request #120 from luigi311/dev
Dev
2023-12-06 14:25:35 -07:00
Luigi311
cd4ce186ca Merge pull request #119 from luigi311/CI-Testing
Action: Add test
2023-12-06 14:14:27 -07:00
Luigi311
ca5403f97b Action: Add test
Spins up jellyfin and plex containers to test against
2023-12-06 14:11:50 -07:00
Luigi311
7bb76f62a5 Merge pull request #117 from luigi311/dependabot/pip/aiohttp-3.9.0
Bump aiohttp from 3.8.6 to 3.9.0
2023-12-06 12:07:40 -07:00
Luigi311
dcdbe44648 Action: Limit ghcr push to luigi311
Signed-off-by: Luigi311 <git@luigi311.com>
2023-12-06 12:04:32 -07:00
dependabot[bot]
f91005f0ba Bump aiohttp from 3.8.6 to 3.9.0
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.8.6 to 3.9.0.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.8.6...v3.9.0)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-11-28 01:04:37 +00:00
Luigi311
5baea978ab Merge pull request #114 from luigi311/dev
Add markfile, Fix documentation, Add default variant, Non-root docker container
2023-11-18 04:05:28 -07:00
Luigi311
9cc1f96eea Merge pull request #113 from luigi311/user
User
2023-11-13 03:51:27 -07:00
Luigi311
2a65c4b5ca Action: Add default variant 2023-11-13 03:48:05 -07:00
Luigi311
e1ef6615cc README: Change configuration to point to .env.sample 2023-11-13 03:39:29 -07:00
Luigi311
d607c9c821 Use non root for containers 2023-11-13 03:36:10 -07:00
Luigi311
f6b2186824 Docker-compose: Add markfile. Add user mapping ex 2023-11-13 02:49:14 -07:00
Luigi311
a3fc53059c MARKFILE match LOGFILE 2023-11-13 02:30:11 -07:00
Luigi311
6afe123947 Docker: Add RUN_ONLY_ONCE and MARKFILE 2023-11-13 02:28:40 -07:00
Luigi311
7e9c6bb338 Add unraid to type 2023-11-13 02:05:47 -07:00
Luigi311
89a2768fc9 Jellyfin: Remove headers append 2023-11-13 01:59:18 -07:00
Luigi311
9ff3bdf302 Add MARK/DRYRUN logger levels 2023-11-13 01:48:07 -07:00
Luigi311
2c48e89435 Add mark list support 2023-11-13 01:12:08 -07:00
Luigi311
6ccb68aeb3 Add example baseurl/token to docker-compose 2023-11-13 01:09:11 -07:00
Luigi311
032243de0a Pin to 3.11 due to 3.12 issues 2023-11-13 00:16:44 -07:00
Luigi311
5b1b9ec222 Merge pull request #112 from neofright/dev
Typos / formatting
2023-11-03 19:57:12 -06:00
neofright
375c6b23a5 Update README.md
Remove another unnecessary capitalisation.
2023-11-03 13:21:41 +00:00
neofright
b378dff0dc Improve README.md
- Inprogress is not a word in English, but two separate words.
- Many words are unnecessarily capitalised as they are not names or at the beginning of sentences.
- Prefer 'usernames' to 'usersnames'
2023-11-03 13:21:07 +00:00
neofright
23f2d287d6 Typo in .env.sample 2023-11-03 13:14:46 +00:00
Luigi311
3cd73e54a1 Merge pull request #109 from luigi311/dev
Update Deps, Add max_threads
2023-09-28 20:12:43 -06:00
Luigi311
bf5d875079 Print server info
Signed-off-by: Luigi311 <git@luigi311.com>
2023-09-28 20:00:47 -06:00
Luigi311
aef884523b Merge branch 'main' into dev 2023-09-28 19:24:04 -06:00
Luigi311
2a59f38faf Add docker compose to types
Signed-off-by: Luigi311 <git@luigi311.com>
2023-09-28 10:45:02 -06:00
Luigi311
3a0e60c772 Add max_threads
Signed-off-by: Luigi311 <git@luigi311.com>
2023-09-28 10:00:07 -06:00
Luigi311
fb657d41db Update apis
Signed-off-by: Luigi311 <git@luigi311.com>
2023-09-28 09:47:34 -06:00
Luigi311
ac7f389563 Timeout issues (#103)
* Add timeout support for jellyfin requests

Signed-off-by: Luigi311 <git@luigi311.com>
2023-09-25 01:59:16 -06:00
Luigi311
237e82eceb Merge pull request #96 from luigi311/dev
Jellyfin: Remove isPlayed, Use get for name
2023-08-16 19:16:35 -06:00
Luigi311
8fab4304a4 Jellyfin: Remove isPlayed, Use get for name
Signed-off-by: Luigi311 <git@luigi311.com>
2023-08-16 19:00:17 -06:00
Luigi311
971c9e9147 Merge pull request #94 from luigi311/dependabot/pip/aiohttp-3.8.5
Bump aiohttp from 3.8.4 to 3.8.5
2023-07-24 15:09:10 -06:00
dependabot[bot]
cacbca5a07 Bump aiohttp from 3.8.4 to 3.8.5
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.8.4 to 3.8.5.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/v3.8.5/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.8.4...v3.8.5)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-07-20 16:25:18 +00:00
Luigi311
e4dbd8adfb Merge pull request #93 from luigi311/dev 2023-07-19 12:07:40 -06:00
Luigi311
19f77c89e7 Jellyfin: Fix locations logic
Signed-off-by: Luigi311 <git@luigi311.com>
2023-07-18 16:27:13 -06:00
Luigi311
ce1b922f71 Merge pull request #90 from luigi311/dev
Fix missing paths and providers
2023-06-30 19:41:16 -06:00
Luigi311
81e967864d Disable fast fail
Signed-off-by: Luigi311 <git@luigi311.com>
2023-06-28 16:55:56 -06:00
Luigi311
29f55104bc Jellyfin: Check for provider_source in episode
Signed-off-by: Luigi311 <git@luigi311.com>
2023-06-28 16:52:23 -06:00
Luigi311
ff2e2deb20 Jellyfin: Handle missing paths
Signed-off-by: Luigi311 <git@luigi311.com>
2023-06-28 16:21:07 -06:00
Luigi311
3fa55cb41b Merge pull request #80 from jianglai/patch-1 2023-06-08 13:41:24 -06:00
Lai Jiang
aa5d97a0d5 Fix a type 2023-05-29 21:08:12 -04:00
Luigi311
89c4f15ae8 Merge pull request #79 from luigi311/dev
Jellyfin: Fix errors with missing matches
2023-05-23 15:39:29 -06:00
Luigi311
1351bfc1cf Jellyfin: Remove reassigning jellyfin_episode_id
Signed-off-by: Luigi311 <git@luigi311.com>
2023-05-23 14:33:42 -06:00
Luigi311
32cc76f043 Merge pull request #78 from luigi311/dependabot/pip/requests-2.31.0
Bump requests from 2.28.2 to 2.31.0
2023-05-23 14:26:43 -06:00
dependabot[bot]
968cb2091d Bump requests from 2.28.2 to 2.31.0
Bumps [requests](https://github.com/psf/requests) from 2.28.2 to 2.31.0.
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](https://github.com/psf/requests/compare/v2.28.2...v2.31.0)

---
updated-dependencies:
- dependency-name: requests
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-05-23 06:45:31 +00:00
Luigi311
8986c1037b Jellyfin: Fix errors with missing matches
Signed-off-by: Luigi311 <git@luigi311.com>
2023-05-22 01:22:34 -06:00
Luigi311
87b4a950f1 Merge pull request #75 from luigi311/dev
Variants, Pin versions, CI, Plex usernames
2023-05-17 13:38:25 -06:00
Luigi311
9f61c7338d Plex: Cleanup username_title 2023-05-17 13:22:00 -06:00
Luigi311
ffc81dad69 CI: Add back in dev based on alpine 2023-05-15 15:12:25 -06:00
Luigi311
7eba46b5cb plex: Fix username/title 2023-05-15 14:57:46 -06:00
Luigi311
aa177666a5 Plex: Fix username/title selection 2023-05-15 11:17:28 -06:00
Luigi311
7de7b42fd2 Users: Default to username and fall back to title 2023-05-15 11:10:03 -06:00
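A hedged sketch of the fallback this commit describes, assuming the Plex user object exposes `username` and `title` attributes as in plexapi:

```python
def display_name(user) -> str:
    # Managed/home users can lack a username and only expose a title,
    # so fall back to the title when the username is empty or missing.
    username = getattr(user, "username", None)
    return username if username else user.title
```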
Luigi311
03d1fd8019 Log both servers users instead of exiting immediately 2023-05-15 10:44:30 -06:00
Luigi311
485ec5fe2d Add docker-compose file 2023-04-29 20:31:24 -06:00
Luigi311
59bfbd9811 Merge pull request #71 from luigi311/fix-docker-build/push
Do not publish on PR, fix condition check on build
2023-04-13 13:02:55 -06:00
Luigi311
1e485b37f8 Do not publish on PR, fix condition check on build 2023-04-13 12:56:52 -06:00
Luigi311
4adf94f24b Update ci.yml
Action: Use github.repository and github.actor instead
2023-04-13 10:28:01 -06:00
Luigi311
1a0fab36d3 Merge pull request #66 from Nicba1010/main
General build improvements
2023-04-13 09:50:59 -06:00
Roberto Banić
a1ef3b5a8d Add conditional to DockerHub login 2023-04-13 16:45:05 +02:00
Luigi311
0c47ee7119 Merge pull request #68 from Nicba1010/refactor-black-white
Refactor black/whitelist processing
2023-04-13 08:37:38 -06:00
Roberto Banić
e51cf6e482 Refactor black/whitelist processing 2023-04-13 12:56:28 +02:00
Roberto Banić
24d5de813d Remove DOCKER_USERNAME environment variable from docker_meta step 2023-04-13 11:23:32 +02:00
Roberto Banić
9921b2a355 Change is_default_branch to other default branch check 2023-04-13 11:21:28 +02:00
Roberto Banić
faa378c75e Add is_default_branch conditional to latest tag 2023-04-13 11:20:19 +02:00
Roberto Banić
26199100dc Update tags 2023-04-13 11:19:56 +02:00
Roberto Banić
bee854f059 Exclude DockerHub in case there is no username set 2023-04-13 10:48:03 +02:00
Roberto Banić
73c1ebf3ed Pin pytest version 2023-04-13 02:26:12 +02:00
Roberto Banić
397dd17429 Specify Python version 2023-04-13 02:26:11 +02:00
Roberto Banić
73d18dad92 Rename Dockerfile to Dockerfile.alpine 2023-04-13 02:26:10 +02:00
Roberto Banić
94d63a3fdb Add ghcr.io image name to the docker metadata action step 2023-04-13 02:26:09 +02:00
Roberto Banić
120d89e8be Add dashes to tags 2023-04-13 02:26:08 +02:00
Roberto Banić
eb5534c61c Add ghcr.io registry 2023-04-13 02:26:07 +02:00
Roberto Banić
99d217e8f1 Update ci.yml to perform a multi-variant build 2023-04-13 02:26:05 +02:00
Roberto Banić
f7e3f8ae2a Update Dockerfile to use the alpine Python 3 base image 2023-04-13 02:26:04 +02:00
Roberto Banić
2cebd2d73d Pin dependency versions to enable reproducible builds 2023-04-13 02:25:13 +02:00
Luigi311
18df322c41 Merge pull request #65 from luigi311/dev
Dev
2023-04-11 09:29:08 -06:00
Luigi311
fc80f50560 Fix codeql issues 2023-04-11 08:57:49 -06:00
Luigi311
4870ff9e7a Cleanup 2023-04-11 08:48:30 -06:00
Luigi311
58337bd38c Test: Use is None 2023-04-10 23:05:22 -06:00
Luigi311
e6d1e0933a Merge pull request #64 from luigi311/fix_indexing
Fix indexing with check_remove_entry
2023-04-10 17:20:36 -06:00
Luigi311
68e3f25ba4 Fix indexing 2023-04-10 16:59:54 -06:00
Luigi311
c981426db6 Merge pull request #62 from agustinmorantes/dev
Add "RUN_ONLY_ONCE" option
2023-04-10 11:54:17 -06:00
Agustín Morantes
916b16b12c Add "RUN_ONLY_ONCE" option 2023-04-10 14:39:28 -03:00
Luigi311
a178d230de Jellyfin: Fix more issues with ids 2023-04-07 17:31:25 -06:00
Luigi311
fffb04728a Jellyfin: Fix issue with ids. Do not show marked for partial 2023-04-07 15:17:00 -06:00
Luigi311
658361383a Update README.md 2023-04-07 13:41:39 -06:00
Luigi311
3330026de6 Merge pull request #57 from luigi311/partial_watch
Partially implement in progress syncing
2023-03-31 12:14:53 -06:00
Luigi311
25fe426720 Plex: Implement partial play syncing 2023-03-26 23:55:56 -06:00
Luigi311
8d53b5b8c0 Take into account comparing two partially watched/one watched video 2023-03-23 22:50:13 -06:00
Luigi311
0774735f0f Plex: Add title to episode_guids 2023-03-23 22:49:14 -06:00
Luigi311
a5540b94d5 Gather partially watched movie/episodes with todo for processing. 2023-03-22 19:48:19 -06:00
Luigi311
c69d59858d Merge pull request #54 from luigi311/dev
Fix variable overwrites, Fix errors when plex user has no access
2023-03-22 11:29:36 -06:00
Luigi311
962b1149ad Plex: Use token, Check for token on mark 2023-03-18 12:15:59 -06:00
Luigi311
a8edee0354 Jellyfin: Fix user_watched_temp overwrite issues. 2023-03-18 12:12:12 -06:00
Luigi311
3627dde64d Plex: Do not error if user has no access 2023-03-18 11:56:56 -06:00
Luigi311
80ec0e42c2 Dockerfile: Add sync directions to ENV 2023-03-16 14:57:57 -06:00
Luigi311
fd64088bde Merge pull request #51 from luigi311/dev
Add sync direction flags, separate out functions, better logging for jellyfin queries
2023-03-09 12:52:24 -07:00
Luigi311
7832e41a3b Add sync from to to readme 2023-03-09 01:32:27 -07:00
Luigi311
cadd65d69b Update issue templates (#50)
* Update issue templates
2023-03-09 01:29:11 -07:00
Luigi311
9f004797fc Force format on save in vscode 2023-03-09 00:53:07 -07:00
Luigi311
9041fee7ad Format 2023-03-09 00:48:29 -07:00
Luigi311
9af6c9057c Simplify plex update_user_watched 2023-03-09 00:36:55 -07:00
Luigi311
757ce91138 Merge pull request #49 from luigi311/seperate_functions
Separate functions
2023-03-08 23:55:53 -07:00
Luigi311
98f96ed5c7 Fix user being added when it shouldn't. Add test_users 2023-03-08 23:48:54 -07:00
Luigi311
3e15120e2a Fix library whitelist, add library tests 2023-03-08 23:17:54 -07:00
Luigi311
5824e6c0cc cleanup 2023-03-08 22:21:40 -07:00
Luigi311
7087d75efb Fix exception 2023-03-08 22:15:03 -07:00
Luigi311
b2a06b8fd3 Add tests for black_white and watched 2023-03-08 22:05:32 -07:00
Luigi311
1ee055faf5 format 2023-03-08 22:05:32 -07:00
Luigi311
404089dfca Separate generate_library_guids_dict 2023-03-08 22:05:32 -07:00
Luigi311
ed24948dee Better logging on library skip 2023-03-08 22:05:32 -07:00
Luigi311
1f16fcb8eb Separate check_skip_logic, append reasons 2023-03-08 22:05:32 -07:00
Luigi311
03de3affd7 Cleanup, separate black/white lists setup 2023-03-08 22:05:32 -07:00
Luigi311
2bad887659 Separate out functions into separate scripts. 2023-03-08 22:04:48 -07:00
Luigi311
796be47a63 Move lots of setup_users to functions 2023-03-08 22:03:48 -07:00
Luigi311
dc1fe11590 Check for response status 200 on jellyfin query 2023-03-08 21:49:56 -07:00
Luigi311
13b4ff3215 Merge pull request #48 from JChris246/main
[Feature] Add flags to control the direction of syncing between the servers
2023-03-08 20:46:57 -07:00
JChris246
dca54cf4fb feat:add flags to control the direction of syncing 2023-03-08 21:30:28 -04:00
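The four SYNC_FROM_*_TO_* flags introduced here (visible later in the .env.sample diff) gate each server-pair direction. A rough sketch of how such flags could be consulted before syncing; the helper and its names are illustrative:

```python
import os

def should_sync(source: str, destination: str) -> bool:
    # e.g. ("plex", "jellyfin") reads SYNC_FROM_PLEX_TO_JELLYFIN; the
    # flags default to enabled, matching the .env.sample defaults.
    flag = f"SYNC_FROM_{source.upper()}_TO_{destination.upper()}"
    return os.getenv(flag, "True").strip().lower() in ("true", "1", "yes")

if should_sync("plex", "jellyfin"):
    print("would push Plex watch states to Jellyfin")
```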
Luigi311
a4365e59f3 Merge pull request #44 from luigi311/dev
Fix issues with certain libraries failing
2023-02-26 13:32:26 -07:00
Luigi311
b960bccb86 Plex: Fix guids error on mark 2023-02-25 18:42:07 -07:00
Luigi311
218037200c Jellyfin: Fix tv show searching for watched 2023-02-25 18:27:01 -07:00
Luigi311
4ac670e837 Plex: Do not error if guids can not be gathered. Parallelize show processing for get watched. 2023-02-25 16:58:57 -07:00
Luigi311
96eff65c3e Do not error if failed to get library watched 2023-02-25 15:03:27 -07:00
Luigi311
45471607c8 Merge pull request #43 from JChris246/chore/spelling
Correct some spelling issues
2023-02-22 09:51:42 -07:00
JChris246
14885744b1 fix: correct some spelling issues 2023-02-22 00:09:30 -04:00
Luigi311
d1fd61f1d1 Merge pull request #38 from luigi311/dev
Fix issue with nested folders
2023-01-31 16:27:54 -07:00
Luigi311
6c1ee4a7dc Log server users 2023-01-30 11:56:27 -07:00
Luigi311
9a8e799e68 Recursive all the things. Use includeItemType 2023-01-30 11:46:12 -07:00
Luigi311
ffec4e2f28 Support multiple library types 2023-01-28 16:33:36 -07:00
Luigi311
00102891a5 Catch None for types 2023-01-27 23:45:03 -07:00
Luigi311
aa76b83428 Use isinstance instead of type 2023-01-27 12:21:38 -07:00
Luigi311
a644189ea5 Use isinstance instead of type 2023-01-27 12:18:15 -07:00
Luigi311
c5d987a8c9 Update .env.sample and README 2023-01-27 11:23:58 -07:00
Luigi311
bdd68ad68d If user is type str get plex object 2023-01-27 11:02:15 -07:00
Luigi311
2d86bca781 Update github actions 2023-01-27 10:48:52 -07:00
Luigi311
1b01ff6ec2 Log if multiple types and continue instead of error 2023-01-27 10:45:46 -07:00
Luigi311
f08ec43507 Skip library before erroring for multiple types. 2023-01-27 10:43:50 -07:00
Luigi311
7f9424260a Format 2023-01-26 14:03:13 -07:00
Luigi311
5f21943353 Exclude folders, use recursive. 2023-01-26 13:55:50 -07:00
Luigi311
a5a795f43c Exclude Folders from list 2023-01-26 13:42:35 -07:00
Luigi311
fcb6d7625f Fix invalid library types, raise mixed types 2023-01-26 13:31:57 -07:00
Luigi311
fd2179998f Fix ssl_bypass for plex 2023-01-26 11:23:47 -07:00
Luigi311
654e7f20e1 Merge pull request #33 from luigi311/dev
Lots of fixes and simplification
2022-12-23 23:13:22 -07:00
Luigi311
1eb92cf7c1 black formatting 2022-12-23 23:11:38 -07:00
Luigi311
111e284cc8 Cleanup 2022-12-23 23:10:51 -07:00
Luigi311
1a4e3f4ec4 Move setup_black_white_list to functions. Fix trailing slash on jellyfin baseurl 2022-12-23 23:02:53 -07:00
Luigi311
4066228e57 Add more debug logging. Do not enable debug by default 2022-12-19 14:07:56 -07:00
Luigi311
59c6d278e3 Add more logging to debug 2022-12-19 13:57:20 -07:00
Luigi311
39b33f3d43 Fix missing logging when using debug level 2022-12-19 13:22:42 -07:00
Luigi311
e8faf52b2b Do not mark shows/movies that do not exist 2022-12-19 01:35:16 -07:00
Luigi311
370e9bac63 change get user watched name to avoid mistakes 2022-12-18 22:39:03 -07:00
Luigi311
d0746cec5a Fix server 2 always running async runner. Speedup plex get watched 2022-12-18 22:27:42 -07:00
Luigi311
251937431b Move cleanup_watched to functions and simplify 2022-12-18 01:50:45 -07:00
Luigi311
50faf061af Remove dockerfile defaults 2022-11-21 18:21:29 -07:00
Luigi311
9ffbc49ad3 Merge pull request #30 from luigi311/dev
Add ssl_bypass to skip hostname validation.
2022-11-21 17:39:00 -07:00
Luigi311
644dc8e3af Merge pull request #29 from lgtm-migrator/codeql
Add CodeQL workflow for GitHub code scanning
2022-11-21 17:38:45 -07:00
Luigi311
47bc4e94dc Fix dockerfile 2022-11-21 17:31:47 -07:00
LGTM Migrator
f17d39fe17 Add CodeQL workflow for GitHub code scanning 2022-11-10 14:41:07 +00:00
Luigi311
966dcacf8d Add ssl_bypass to skip hostname validation. 2022-09-25 14:16:01 -06:00
Luigi311
9afc00443c Merge pull request #27 from luigi311/dev
Cleanup issues
2022-08-18 00:46:00 -06:00
Luigi311
3ec177ea64 rename test_main 2022-08-18 00:17:32 -06:00
Luigi311
b360c9fd0b Remove unnecessary deepcopy 2022-08-18 00:15:42 -06:00
Luigi311
1ed791b1ed Fix jellyfin 2022-08-17 23:49:05 -06:00
Luigi311
f19b1a3063 Cleanup length and functions instead of methods 2022-08-17 23:34:45 -06:00
Luigi311
190a72bd3c Cleanup 2022-08-17 22:53:27 -06:00
Luigi311
c848106ce7 Black cleanup 2022-08-17 22:31:23 -06:00
Luigi311
dd319271bd Cleanup 2022-08-17 22:09:11 -06:00
Luigi311
16879cc728 Merge pull request #26 from luigi311/dev
Use async for jellyfin
2022-08-17 21:49:34 -06:00
Luigi311
942ec3533f Cleanup log file on runs 2022-08-17 21:43:51 -06:00
Luigi311
9f6edfc91a Merge branch 'main' into dev 2022-08-17 21:40:25 -06:00
Luigi311
827ace2e97 cleanup 2022-08-17 21:20:28 -06:00
Luigi311
f6b57a1b4d Update README.md 2022-07-10 01:38:42 -06:00
Luigi311
88a7526721 Use async for jellyfin (#23)
* Use async

* Massive jellyfin watched speedup

Co-authored-by: Luigi311 <luigi311.lg@gmail.com>
2022-07-10 01:30:12 -06:00
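The speedup in this change came from issuing Jellyfin HTTP requests concurrently. A minimal aiohttp sketch of that idea with a placeholder base URL and token; the header name and endpoints shown are assumptions for illustration, not the project's exact queries:

```python
import asyncio
import aiohttp

BASEURL = "http://localhost:8096"               # placeholder server
HEADERS = {"X-Emby-Token": "SuperSecretToken"}  # auth header name assumed

async def fetch_json(session: aiohttp.ClientSession, path: str):
    async with session.get(f"{BASEURL}{path}", headers=HEADERS) as response:
        return await response.json()

async def main():
    # Issuing the per-user/per-library queries concurrently instead of
    # sequentially is where the watched-list speedup comes from.
    async with aiohttp.ClientSession() as session:
        paths = ["/Users", "/Library/MediaFolders"]  # illustrative endpoints
        return await asyncio.gather(*(fetch_json(session, p) for p in paths))

if __name__ == "__main__":
    asyncio.run(main())
```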
luigi311
1efb4d8543 Fix debug 2022-07-06 17:22:35 -06:00
Luigi311
7571e9a343 Merge pull request #22 from luigi311/dev
Fix errors on certain edge cases
2022-07-05 21:23:14 -06:00
Luigi311
7640e9ee03 fix typo 2022-07-05 19:26:58 -06:00
Luigi311
50ed3d6400 Fix user_name in plex 2022-07-05 19:26:22 -06:00
Luigi311
c9a373851f Remove indexnumber from logging 2022-07-05 19:16:25 -06:00
Luigi311
a3f3db8f4e Use generate_library_guids_dict instead of library type 2022-07-05 18:09:08 -06:00
Luigi311
de619de923 Add more logging, fix username in jellyfin mark. 2022-07-05 16:35:22 -06:00
Luigi311
852d8dc3c3 Merge pull request #18 from luigi311/dev
Dev
2022-06-21 02:53:32 -06:00
Luigi311
c104973f95 Add location based matching 2022-06-20 21:12:02 -06:00
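Location-based matching compares file paths when provider IDs are not available. A hedged sketch of the idea, comparing basenames only so differing mount points on each server don't matter; the data shapes are made up:

```python
import os

def same_video(locations_1, locations_2) -> bool:
    # Compare only the file names, since each server typically mounts the
    # same media under different absolute paths.
    names_1 = {os.path.basename(p) for p in locations_1}
    names_2 = {os.path.basename(p) for p in locations_2}
    return bool(names_1 & names_2)

print(same_video(["/data/tv/Show/S01E01.mkv"], ["/mnt/media/Show/S01E01.mkv"]))  # True
```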
Luigi311
8b7fc5e323 Merge pull request #17 from luigi311/pytest
Add Pytest
2022-06-20 16:10:29 -06:00
Luigi311
afb71d8e00 Handle locations in generate_library_guids_dict 2022-06-20 16:07:52 -06:00
Luigi311
34d97f8dde Add pytest action 2022-06-20 16:05:05 -06:00
Luigi311
2ad6b3afdf Add pytest 2022-06-20 15:48:07 -06:00
Luigi311
7cd492dc98 Remove worker=1 2022-06-19 03:03:17 -06:00
Luigi311
74b5ea7b5e Fix username differences in watch list. Add python version check. More error handling. 2022-06-19 02:56:50 -06:00
Luigi311
21fe4875eb Add ENVs to dockerfile 2022-06-15 13:38:28 -06:00
Luigi311
aeb86f6b85 Fix user when using plex login. Fix sleep duration 2022-06-15 13:21:03 -06:00
Luigi311
70ef31ff47 Fix threading 2022-06-15 12:51:09 -06:00
Luigi311
0584a85f90 Add parallel threading 2022-06-14 22:36:44 -06:00
Luigi311
beb4e667ae Cleanup 2022-06-13 22:43:56 -06:00
Luigi311
7695994ec2 Support x many servers of any combination 2022-06-13 22:30:41 -06:00
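Supporting any number of servers follows from the comma-separated values in .env (see PLEX_BASEURL/PLEX_TOKEN in the .env.sample diff below). A rough sketch of pairing them up; the parsing details are illustrative:

```python
import os

def env_list(name: str) -> list[str]:
    raw = os.getenv(name, "")
    return [part.strip() for part in raw.split(",") if part.strip()]

baseurls = env_list("PLEX_BASEURL")
tokens = env_list("PLEX_TOKEN")

if len(baseurls) != len(tokens):
    raise ValueError("PLEX_BASEURL and PLEX_TOKEN need the same number of entries")

# Each base URL pairs with the token in the same position.
servers = list(zip(baseurls, tokens))
```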
Luigi311
04a8da6478 Merge pull request #14 from luigi311/dev
Improve matching via IDs
2022-06-13 18:43:46 -06:00
Luigi311
7ef2986bde Remove unused variable 2022-06-13 18:41:43 -06:00
Luigi311
c18f0a2582 Add debug_level option 2022-06-13 18:16:55 -06:00
Luigi311
4657097f6d Cleanup 2022-06-13 16:46:29 -06:00
Luigi311
ca84bbb19d Improve show/movie matching by using IDs only 2022-06-13 14:49:37 -06:00
35 changed files with 5812 additions and 1734 deletions


@@ -1 +1,15 @@
.env
.dockerignore
.env
.env.sample
.git
.github
.gitignore
.idea
.vscode
Dockerfile*
README.md
test
venv


@@ -1,38 +1,96 @@
# Global Settings
## Do not mark any shows/movies as played and instead just output to log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "True"
DEBUG = "False"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "info"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "False"
## How often to run the script in seconds
SLEEP_DURATION = "3600"
## Log file where all output will be written to
LOGFILE = "log.log"
## Map usernames between plex and jellyfin in the event that they are different, order does not matter
#USER_MAPPING = { "testuser2": "testuser3" }
## Map libraries between plex and jellyfin in the event that they are different, order does not matter
#LIBRARY_MAPPING = { "Shows": "TV Shows" }
## Mark file where all shows/movies that have been marked as played will be written to
MARK_FILE = "mark.log"
## Recommended to use token as it is faster to connect as it is direct to the server instead of going through the plex servers
## URL of the plex server, use hostname or IP address if the hostname is not resolving correctly
PLEX_BASEURL = "http://localhost:32400"
## Plex token https://support.plex.tv/articles/204059436-finding-an-authentication-token-x-plex-token/
PLEX_TOKEN = "SuperSecretToken"
## If not using plex token then use username and password of the server admin along with the servername
#PLEX_USERNAME = ""
#PLEX_PASSWORD = ""
#PLEX_SERVERNAME = "Plex Server"
## Timeout for requests for jellyfin
REQUEST_TIMEOUT = 300
## Generate guids
## Generating guids is a slow process, so this is a way to speed up the process
## by using the location only, useful when using same files on multiple servers
GENERATE_GUIDS = "True"
## Jellyfin server URL, use hostname or IP address if the hostname is not resolving correctly
JELLYFIN_BASEURL = "http://localhost:8096"
## Jellyfin api token, created manually by logging in to the jellyfin server admin dashboard and creating an api key
JELLYFIN_TOKEN = "SuperSecretToken"
## Generate locations
## Generating locations is a slow process, so this is a way to speed up the process
## by using the guid only, useful when using different files on multiple servers
GENERATE_LOCATIONS = "True"
## Max threads for processing
MAX_THREADS = 32
## Map usernames between servers in the event that they are different, order does not matter
## Comma separated for multiple options
#USER_MAPPING = { "testuser2": "testuser3", "testuser1":"testuser4" }
## Map libraries between servers in the event that they are different, order does not matter
## Comma separated for multiple options
#LIBRARY_MAPPING = { "Shows": "TV Shows", "Movie": "Movies" }
## Blacklisting/Whitelisting libraries, library types such as Movies/TV Shows, and users. Mappings apply so if the mapping for the user or library exists then both will be excluded.
## Comma separated for multiple options
#BLACKLIST_LIBRARY = ""
#WHITELIST_LIBRARY = ""
#BLACKLIST_LIBRARY_TYPE = ""
#BLACKLIST_LIBRARY_TYPE = ""
#WHITELIST_LIBRARY_TYPE = ""
#BLACKLIST_USERS = ""
WHITELIST_USERS = "testuser1,testuser2"
# Plex
## Recommended to use token as it is faster to connect as it is direct to the server instead of going through the plex servers
## URL of the plex server, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
PLEX_BASEURL = "http://localhost:32400, https://nas:32400"
## Plex token https://support.plex.tv/articles/204059436-finding-an-authentication-token-x-plex-token/
## Comma separated list for multiple servers
PLEX_TOKEN = "SuperSecretToken, SuperSecretToken2"
## If not using plex token then use username and password of the server admin along with the servername
## Comma separated for multiple options
#PLEX_USERNAME = "PlexUser, PlexUser2"
#PLEX_PASSWORD = "SuperSecret, SuperSecret2"
#PLEX_SERVERNAME = "Plex Server1, Plex Server2"
## Skip hostname validation for ssl certificates.
## Set to True if running into ssl certificate errors
SSL_BYPASS = "False"
## control the direction of syncing. e.g. SYNC_FROM_PLEX_TO_JELLYFIN set to true will cause the updates from plex
## to be updated in jellyfin. SYNC_FROM_PLEX_TO_PLEX set to true will sync updates between multiple plex servers
SYNC_FROM_PLEX_TO_JELLYFIN = "True"
SYNC_FROM_JELLYFIN_TO_PLEX = "True"
SYNC_FROM_PLEX_TO_PLEX = "True"
SYNC_FROM_JELLYFIN_TO_JELLYFIN = "True"
# Jellyfin
## Jellyfin server URL, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
JELLYFIN_BASEURL = "http://localhost:8096, http://nas:8096"
## Jellyfin api token, created manually by logging in to the jellyfin server admin dashboard and creating an api key
## Comma separated list for multiple servers
JELLYFIN_TOKEN = "SuperSecretToken, SuperSecretToken2"

33
.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file

@@ -0,0 +1,33 @@
---
name: Bug report
about: Create a report to help us improve
title: "[BUG]"
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Logs**
If applicable, add logs to help explain your problem ideally with DEBUG set to true, be sure to remove sensitive information
**Type:**
- [ ] Docker Compose
- [ ] Docker
- [ ] Unraid
- [ ] Native
**Additional context**
Add any other context about the problem here.


@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: "[Feature Request]"
labels: ''
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.


@@ -1,74 +1,172 @@
name: CI
on:
push:
paths-ignore:
- .gitignore
- "*.md"
pull_request:
paths-ignore:
- .gitignore
- "*.md"
jobs:
docker:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Docker meta
id: docker_meta
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
if: "${{ env.DOCKER_USERNAME != '' }}"
uses: docker/metadata-action@v4
with:
images: ${{ secrets.DOCKER_USERNAME }}/jellyplex-watched # list of Docker images to use as base name for tags
tags: |
type=raw,value=latest,enable={{is_default_branch}}
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=sha
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to DockerHub
if: "${{ steps.docker_meta.outcome == 'success' }}"
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Build
id: build
if: "${{ steps.docker_meta.outcome == 'skipped' }}"
uses: docker/build-push-action@v2
with:
context: .
file: ./Dockerfile
platforms: linux/amd64,linux/arm64
push: false
tags: jellyplex-watched:action
- name: Build Push
id: build_push
if: "${{ steps.docker_meta.outcome == 'success' }}"
uses: docker/build-push-action@v2
with:
context: .
file: ./Dockerfile
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}
# Echo digest so users can validate their image
- name: Image digest
if: "${{ steps.docker_meta.outcome == 'success' }}"
run: echo "${{ steps.build_push.outputs.digest }}"
name: CI
on:
workflow_dispatch:
push:
paths-ignore:
- .gitignore
- "*.md"
pull_request:
paths-ignore:
- .gitignore
- "*.md"
jobs:
pytest:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: "Install dependencies"
run: pip install -r requirements.txt && pip install -r test/requirements.txt
- name: "Run tests"
run: pytest -vvv
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: "Install dependencies"
run: |
pip install -r requirements.txt
sudo apt update && sudo apt install -y docker-compose
- name: "Checkout JellyPlex-Watched-CI"
uses: actions/checkout@v4
with:
repository: luigi311/JellyPlex-Watched-CI
path: JellyPlex-Watched-CI
- name: "Start containers"
run: |
export PGID=$(id -g)
export PUID=$(id -u)
sudo chown -R $PUID:$PGID JellyPlex-Watched-CI
docker pull lscr.io/linuxserver/plex &
docker pull lscr.io/linuxserver/jellyfin &
wait
docker-compose -f JellyPlex-Watched-CI/plex/docker-compose.yml up -d
docker-compose -f JellyPlex-Watched-CI/jellyfin/docker-compose.yml up -d
# Wait for containers to start
sleep 10
docker-compose -f JellyPlex-Watched-CI/plex/docker-compose.yml logs
docker-compose -f JellyPlex-Watched-CI/jellyfin/docker-compose.yml logs
- name: "Run tests"
run: |
# Test ci1
mv test/ci1.env .env
python main.py
# Test ci2
mv test/ci2.env .env
python main.py
# Test ci3
mv test/ci3.env .env
python main.py
# Test again to test if it can handle existing data
python main.py
cat mark.log
python test/validate_ci_marklog.py
docker:
runs-on: ubuntu-latest
needs:
- pytest
- test
env:
DEFAULT_VARIANT: alpine
strategy:
fail-fast: false
matrix:
include:
- dockerfile: Dockerfile.alpine
variant: alpine
- dockerfile: Dockerfile.slim
variant: slim
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Docker meta
id: docker_meta
uses: docker/metadata-action@v5
with:
images: |
${{ secrets.DOCKER_USERNAME }}/jellyplex-watched,enable=${{ secrets.DOCKER_USERNAME != '' }}
# Do not push to ghcr.io on PRs due to permission issues, only push if the owner is luigi311 so it doesnt fail on forks
ghcr.io/${{ github.repository }},enable=${{ github.event_name != 'pull_request' && github.repository_owner == 'luigi311'}}
tags: |
type=raw,value=latest,enable=${{ matrix.variant == env.DEFAULT_VARIANT && github.ref_name == github.event.repository.default_branch }}
type=raw,value=dev,enable=${{ matrix.variant == env.DEFAULT_VARIANT && github.ref_name == 'dev' }}
type=raw,value=latest,suffix=-${{ matrix.variant }},enable={{ is_default_branch }}
type=ref,event=branch,suffix=-${{ matrix.variant }}
type=ref,event=branch,enable=${{ matrix.variant == env.DEFAULT_VARIANT }}
type=ref,event=pr,suffix=-${{ matrix.variant }}
type=ref,event=pr,enable=${{ matrix.variant == env.DEFAULT_VARIANT }}
type=semver,pattern={{ version }},suffix=-${{ matrix.variant }}
type=semver,pattern={{ version }},enable=${{ matrix.variant == env.DEFAULT_VARIANT }}
type=semver,pattern={{ major }}.{{ minor }},suffix=-${{ matrix.variant }}
type=semver,pattern={{ major }}.{{ minor }},enable=${{ matrix.variant == env.DEFAULT_VARIANT }}
type=sha,suffix=-${{ matrix.variant }}
type=sha,enable=${{ matrix.variant == env.DEFAULT_VARIANT }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
if: "${{ env.DOCKER_USERNAME != '' }}"
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Login to GitHub Container Registry
if: "${{ steps.docker_meta.outcome == 'success' }}"
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build
id: build
if: "${{ steps.docker_meta.outputs.tags == '' }}"
uses: docker/build-push-action@v5
with:
context: .
file: ${{ matrix.dockerfile }}
platforms: linux/amd64,linux/arm64
push: false
tags: jellyplex-watched:action
- name: Build Push
id: build_push
if: "${{ steps.docker_meta.outputs.tags != '' }}"
uses: docker/build-push-action@v5
with:
context: .
file: ${{ matrix.dockerfile }}
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}
# Echo digest so users can validate their image
- name: Image digest
if: "${{ steps.docker_meta.outcome == 'success' }}"
run: echo "${{ steps.build_push.outputs.digest }}"

41
.github/workflows/codeql.yml vendored Normal file

@@ -0,0 +1,41 @@
name: "CodeQL"
on:
push:
branches: [ "main" ]
pull_request:
branches: [ "main" ]
schedule:
- cron: "23 20 * * 6"
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ python ]
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
queries: +security-and-quality
- name: Autobuild
uses: github/codeql-action/autobuild@v2
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
with:
category: "/language:${{ matrix.language }}"

263
.gitignore vendored

@@ -1,131 +1,132 @@
.env
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
**.env*
*.prof
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/

27
.vscode/launch.json vendored Normal file

@@ -0,0 +1,27 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "Python: Main",
"type": "python",
"request": "launch",
"program": "main.py",
"console": "integratedTerminal",
"justMyCode": true
},
{
"name": "Pytest",
"type": "python",
"request": "launch",
"module": "pytest",
"args": [
"-vv"
],
"console": "integratedTerminal",
"justMyCode": true
}
]
}

7
.vscode/settings.json vendored Normal file

@@ -0,0 +1,7 @@
{
"[python]" : {
"editor.formatOnSave": true,
},
"python.formatting.provider": "black",
}


@@ -1,10 +0,0 @@
FROM python:3-slim
WORKDIR /app
COPY ./requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "-u", "main.py"]

53
Dockerfile.alpine Normal file

@@ -0,0 +1,53 @@
FROM python:3.11-alpine
ENV DRYRUN 'True'
ENV DEBUG 'True'
ENV DEBUG_LEVEL 'INFO'
ENV RUN_ONLY_ONCE 'False'
ENV SLEEP_DURATION '3600'
ENV LOGFILE 'log.log'
ENV MARKFILE 'mark.log'
ENV USER_MAPPING ''
ENV LIBRARY_MAPPING ''
ENV PLEX_BASEURL ''
ENV PLEX_TOKEN ''
ENV PLEX_USERNAME ''
ENV PLEX_PASSWORD ''
ENV PLEX_SERVERNAME ''
ENV JELLYFIN_BASEURL ''
ENV JELLYFIN_TOKEN ''
ENV SYNC_FROM_PLEX_TO_JELLYFIN 'True'
ENV SYNC_FROM_JELLYFIN_TO_PLEX 'True'
ENV SYNC_FROM_PLEX_TO_PLEX 'True'
ENV SYNC_FROM_JELLYFIN_TO_JELLYFIN 'True'
ENV BLACKLIST_LIBRARY ''
ENV WHITELIST_LIBRARY ''
ENV BLACKLIST_LIBRARY_TYPE ''
ENV WHITELIST_LIBRARY_TYPE ''
ENV BLACKLIST_USERS ''
ENV WHITELIST_USERS ''
RUN apk add --no-cache tini && \
addgroup --system jellyplex_user && \
adduser --system --no-create-home jellyplex_user --ingroup jellyplex_user && \
mkdir -p /app && \
chown -R jellyplex_user:jellyplex_user /app
WORKDIR /app
COPY --chown=jellyplex_user:jellyplex_user ./requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY --chown=jellyplex_user:jellyplex_user . .
USER jellyplex_user
ENTRYPOINT ["/sbin/tini", "--"]
CMD ["python", "-u", "main.py"]

56
Dockerfile.slim Normal file

@@ -0,0 +1,56 @@
FROM python:3.11-slim
ENV DRYRUN 'True'
ENV DEBUG 'True'
ENV DEBUG_LEVEL 'INFO'
ENV RUN_ONLY_ONCE 'False'
ENV SLEEP_DURATION '3600'
ENV LOGFILE 'log.log'
ENV MARKFILE 'mark.log'
ENV USER_MAPPING ''
ENV LIBRARY_MAPPING ''
ENV PLEX_BASEURL ''
ENV PLEX_TOKEN ''
ENV PLEX_USERNAME ''
ENV PLEX_PASSWORD ''
ENV PLEX_SERVERNAME ''
ENV JELLYFIN_BASEURL ''
ENV JELLYFIN_TOKEN ''
ENV SYNC_FROM_PLEX_TO_JELLYFIN 'True'
ENV SYNC_FROM_JELLYFIN_TO_PLEX 'True'
ENV SYNC_FROM_PLEX_TO_PLEX 'True'
ENV SYNC_FROM_JELLYFIN_TO_JELLYFIN 'True'
ENV BLACKLIST_LIBRARY ''
ENV WHITELIST_LIBRARY ''
ENV BLACKLIST_LIBRARY_TYPE ''
ENV WHITELIST_LIBRARY_TYPE ''
ENV BLACKLIST_USERS ''
ENV WHITELIST_USERS ''
RUN apt-get update && \
apt-get install tini --yes --no-install-recommends && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* && \
addgroup --system jellyplex_user && \
adduser --system --no-create-home jellyplex_user --ingroup jellyplex_user && \
mkdir -p /app && \
chown -R jellyplex_user:jellyplex_user /app
WORKDIR /app
COPY --chown=jellyplex_user:jellyplex_user ./requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY --chown=jellyplex_user:jellyplex_user . .
USER jellyplex_user
ENTRYPOINT ["/bin/tini", "--"]
CMD ["python", "-u", "main.py"]

1348
LICENSE

File diff suppressed because it is too large


@@ -1,28 +1,64 @@
# JellyPlex-Watched
[![Codacy Badge](https://app.codacy.com/project/badge/Grade/26b47c5db63942f28f02f207f692dc85)](https://www.codacy.com/gh/luigi311/JellyPlex-Watched/dashboard?utm_source=github.com&amp;utm_medium=referral&amp;utm_content=luigi311/JellyPlex-Watched&amp;utm_campaign=Badge_Grade)
[![Codacy Badge](https://app.codacy.com/project/badge/Grade/26b47c5db63942f28f02f207f692dc85)](https://www.codacy.com/gh/luigi311/JellyPlex-Watched/dashboard?utm_source=github.com\&utm_medium=referral\&utm_content=luigi311/JellyPlex-Watched\&utm_campaign=Badge_Grade)
Sync watched between jellyfin and plex
Sync watched between jellyfin and plex locally
## Description
Keep in sync all your users watched history between jellyfin and plex locally. This uses the imdb ids and any other matching id to find the correct episode/movie between the two. This is not perfect but it works for most cases.
Keep in sync all your users watched history between jellyfin and plex servers locally. This uses file names and provider ids to find the correct episode/movie between the two. This is not perfect but it works for most cases. You can use this for as many servers as you want by entering multiple options in the .env plex/jellyfin section separated by commas.
## Features
### Plex
* \[x] Match via filenames
* \[x] Match via provider ids
* \[x] Map usernames
* \[x] Use single login
* \[x] One way/multi way sync
* \[x] Sync watched
* \[x] Sync in progress
### Jellyfin
* \[x] Match via filenames
* \[x] Match via provider ids
* \[x] Map usernames
* \[x] Use single login
* \[x] One way/multi way sync
* \[x] Sync watched
* \[ ] Sync in progress
### Emby
* \[ ] Match via filenames
* \[ ] Match via provider ids
* \[ ] Map usernames
* \[ ] Use single login
* \[ ] One way/multi way sync
* \[ ] Sync watched
* \[ ] Sync in progress
## Configuration
Full list of configuration options can be found in the [.env.sample](.env.sample)
## Installation
### Baremeta
### Baremetal
- Setup virtualenv of your choice
* Setup virtualenv of your choice
- Install dependencies
* Install dependencies
```bash
pip install -r requirements.txt
```
- Create a .env file similar to .env.sample, uncomment whitelist and blacklist if needed, fill in baseurls and tokens
* Create a .env file similar to .env.sample, uncomment whitelist and blacklist if needed, fill in baseurls and tokens
- Run
* Run
```bash
python main.py
@@ -30,13 +66,13 @@ Keep in sync all your users watched history between jellyfin and plex locally. T
### Docker
- Build docker image
* Build docker image
```bash
docker build -t jellyplex-watched .
```
- or use pre-built image
* or use pre-built image
```bash
docker pull luigi311/jellyplex-watched:latest
@@ -44,7 +80,7 @@ Keep in sync all your users watched history between jellyfin and plex locally. T
#### With variables
- Run
* Run
```bash
docker run --rm -it -e PLEX_TOKEN='SuperSecretToken' luigi311/jellyplex-watched:latest
@@ -52,17 +88,26 @@ Keep in sync all your users watched history between jellyfin and plex locally. T
#### With .env
- Create a .env file similar to .env.sample and set the MNEMONIC variable to your seed phrase
* Create a .env file similar to .env.sample and set the variables to match your setup
- Run
* Run
```bash
docker run --rm -it -v "$(pwd)/.env:/app/.env" luigi311/jellyplex-watched:latest
```
## Troubleshooting/Issues
* Jellyfin
* Attempt to decode JSON with unexpected mimetype, make sure you enable remote access or add your docker subnet to lan networks in jellyfin settings
* Configuration
* Do not use quotes around variables in docker compose
## Contributing
I am open to receiving pull requests. If you are submitting one, please run it locally for a day or two to make sure it is working as expected and stable. Make all pull requests against the dev branch; nothing will be merged into main without going through the lower branches.
## License

docker-compose.yml Normal file

@@ -0,0 +1,32 @@
version: '3'
services:
jellyplex-watched:
image: luigi311/jellyplex-watched:latest
container_name: jellyplex-watched
restart: always
environment:
- DRYRUN=True
- DEBUG=True
- DEBUG_LEVEL=info
- RUN_ONLY_ONCE=False
- SLEEP_DURATION=3600
- LOGFILE=/tmp/log.log
- MARKFILE=/tmp/mark.log
- USER_MAPPING={"user1":"user2"}
- LIBRARY_MAPPING={"TV Shows":"Shows"}
- BLACKLIST_LIBRARY=
- WHITELIST_LIBRARY=
- BLACKLIST_LIBRARY_TYPE=
- WHITELIST_LIBRARY_TYPE=
- BLACKLIST_USERS=
- WHITELIST_USERS=
- PLEX_BASEURL=https://localhost:32400
- PLEX_TOKEN=plex_token
- JELLYFIN_BASEURL=http://localhost:8096
- JELLYFIN_TOKEN=jelly_token
- SSL_BYPASS=True
- SYNC_FROM_PLEX_TO_JELLYFIN=True
- SYNC_FROM_JELLYFIN_TO_PLEX=True
- SYNC_FROM_PLEX_TO_PLEX=True
- SYNC_FROM_JELLYFIN_TO_JELLYFIN=True
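The USER_MAPPING and LIBRARY_MAPPING values above are JSON objects and are looked up in both directions, so either side's name can appear first. A short hedged illustration using the search_mapping helper from src/functions.py shown further down (names are placeholders):
```python
# Hedged illustration: the mapping is bidirectional, so either name resolves to the other.
from src.functions import search_mapping

user_mapping = {"user1": "user2"}
print(search_mapping(user_mapping, "user1"))   # -> "user2"
print(search_mapping(user_mapping, "user2"))   # -> "user1"
print(search_mapping(user_mapping, "nobody"))  # -> None
```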

main.py

@@ -1,294 +1,11 @@
import copy, os, traceback, json
from dotenv import load_dotenv
from time import sleep
from src.functions import logger, str_to_bool, search_mapping
from src.plex import Plex
from src.jellyfin import Jellyfin
load_dotenv(override=True)
def cleanup_watched(watched_list_1, watched_list_2, user_mapping=None, library_mapping=None):
modified_watched_list_1 = copy.deepcopy(watched_list_1)
# remove entries from plex_watched that are in jellyfin_watched
for user_1 in watched_list_1:
user_other = None
if user_mapping:
user_other = search_mapping(user_mapping, user_1)
if user_1 in modified_watched_list_1:
if user_1 in watched_list_2:
user_2 = user_1
elif user_other in watched_list_2:
user_2 = user_other
else:
logger(f"User {user_1} and {user_other} not found in watched list 2", 1)
continue
for library_1 in watched_list_1[user_1]:
library_other = None
if library_mapping:
library_other = search_mapping(library_mapping, library_1)
if library_1 in modified_watched_list_1[user_1]:
if library_1 in watched_list_2[user_2]:
library_2 = library_1
elif library_other in watched_list_2[user_2]:
library_2 = library_other
else:
logger(f"User {library_1} and {library_other} not found in watched list 2", 1)
continue
for item in watched_list_1[user_1][library_1]:
if item in modified_watched_list_1[user_1][library_1]:
# Movies
if isinstance(watched_list_1[user_1][library_1], list):
for watch_list_1_key, watch_list_1_value in item.items():
for watch_list_2_item in watched_list_2[user_2][library_2]:
for watch_list_2_item_key, watch_list_2_item_value in watch_list_2_item.items():
if watch_list_1_key == watch_list_2_item_key and watch_list_1_value == watch_list_2_item_value:
if item in modified_watched_list_1[user_1][library_1]:
modified_watched_list_1[user_1][library_1].remove(item)
# TV Shows
elif isinstance(watched_list_1[user_1][library_1], dict):
if item in watched_list_2[user_2][library_2]:
for season in watched_list_1[user_1][library_1][item]:
if season in watched_list_2[user_2][library_2][item]:
for episode in watched_list_1[user_1][library_1][item][season]:
for watch_list_1_episode_key, watch_list_1_episode_value in episode.items():
for watch_list_2_episode in watched_list_2[user_2][library_2][item][season]:
for watch_list_2_episode_key, watch_list_2_episode_value in watch_list_2_episode.items():
if watch_list_1_episode_key == watch_list_2_episode_key and watch_list_1_episode_value == watch_list_2_episode_value:
if episode in modified_watched_list_1[user_1][library_1][item][season]:
modified_watched_list_1[user_1][library_1][item][season].remove(episode)
# If season is empty, remove season
if len(modified_watched_list_1[user_1][library_1][item][season]) == 0:
if season in modified_watched_list_1[user_1][library_1][item]:
del modified_watched_list_1[user_1][library_1][item][season]
# If the show is empty, remove the show
if len(modified_watched_list_1[user_1][library_1][item]) == 0:
if item in modified_watched_list_1[user_1][library_1]:
del modified_watched_list_1[user_1][library_1][item]
# If library is empty then remove it
if len(modified_watched_list_1[user_1][library_1]) == 0:
if library_1 in modified_watched_list_1[user_1]:
del modified_watched_list_1[user_1][library_1]
# If user is empty delete user
if len(modified_watched_list_1[user_1]) == 0:
del modified_watched_list_1[user_1]
return modified_watched_list_1
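# A worked example (hypothetical data): cleanup_watched returns the entries of
# watched_list_1 that are not already present in watched_list_2, i.e. what still
# needs to be synced to the other server.
#   watched_list_1 = {"alice": {"Movies": [{"imdb": "tt0111161"}, {"imdb": "tt0068646"}]}}
#   watched_list_2 = {"alice": {"Movies": [{"imdb": "tt0111161"}]}}
#   cleanup_watched(watched_list_1, watched_list_2)
#   -> {"alice": {"Movies": [{"imdb": "tt0068646"}]}}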
def setup_black_white_lists(library_mapping=None):
blacklist_library = os.getenv("BLACKLIST_LIBRARY")
if blacklist_library:
if len(blacklist_library) > 0:
blacklist_library = blacklist_library.split(",")
blacklist_library = [x.strip() for x in blacklist_library]
if library_mapping:
temp_library = []
for library in blacklist_library:
library_other = search_mapping(library_mapping, library)
if library_other:
temp_library.append(library_other)
blacklist_library = blacklist_library + temp_library
else:
blacklist_library = []
logger(f"Blacklist Library: {blacklist_library}", 1)
whitelist_library = os.getenv("WHITELIST_LIBRARY")
if whitelist_library:
if len(whitelist_library) > 0:
whitelist_library = whitelist_library.split(",")
whitelist_library = [x.strip() for x in whitelist_library]
if library_mapping:
temp_library = []
for library in whitelist_library:
library_other = search_mapping(library_mapping, library)
if library_other:
temp_library.append(library_other)
whitelist_library = whitelist_library + temp_library
else:
whitelist_library = []
logger(f"Whitelist Library: {whitelist_library}", 1)
blacklist_library_type = os.getenv("BLACKLIST_LIBRARY_TYPE")
if blacklist_library_type:
if len(blacklist_library_type) > 0:
blacklist_library_type = blacklist_library_type.split(",")
blacklist_library_type = [x.lower().strip() for x in blacklist_library_type]
else:
blacklist_library_type = []
logger(f"Blacklist Library Type: {blacklist_library_type}", 1)
whitelist_library_type = os.getenv("WHITELIST_LIBRARY_TYPE")
if whitelist_library_type:
if len(whitelist_library_type) > 0:
whitelist_library_type = whitelist_library_type.split(",")
whitelist_library_type = [x.lower().strip() for x in whitelist_library_type]
else:
whitelist_library_type = []
logger(f"Whitelist Library Type: {whitelist_library_type}", 1)
blacklist_users = os.getenv("BLACKLIST_USERS")
if blacklist_users:
if len(blacklist_users) > 0:
blacklist_users = blacklist_users.split(",")
blacklist_users = [x.lower().strip() for x in blacklist_users]
else:
blacklist_users = []
logger(f"Blacklist Users: {blacklist_users}", 1)
whitelist_users = os.getenv("WHITELIST_USERS")
if whitelist_users:
if len(whitelist_users) > 0:
whitelist_users = whitelist_users.split(",")
whitelist_users = [x.lower().strip() for x in whitelist_users]
else:
whitelist_users = []
else:
whitelist_users = []
logger(f"Whitelist Users: {whitelist_users}", 1)
return blacklist_library, whitelist_library, blacklist_library_type, whitelist_library_type, blacklist_users, whitelist_users
def setup_users(plex, jellyfin, blacklist_users, whitelist_users, user_mapping=None):
# generate list of users from plex.users
plex_users = [ x.title.lower() for x in plex.users ]
jellyfin_users = [ key.lower() for key in jellyfin.users.keys() ]
# combined list of overlapping users from plex and jellyfin
users = {}
for plex_user in plex_users:
if user_mapping:
jellyfin_plex_mapped_user = search_mapping(user_mapping, plex_user)
if jellyfin_plex_mapped_user:
users[plex_user] = jellyfin_plex_mapped_user
continue
if plex_user in jellyfin_users:
users[plex_user] = plex_user
for jellyfin_user in jellyfin_users:
if user_mapping:
plex_jellyfin_mapped_user = search_mapping(user_mapping, jellyfin_user)
if plex_jellyfin_mapped_user:
users[plex_jellyfin_mapped_user] = jellyfin_user
continue
if jellyfin_user in plex_users:
users[jellyfin_user] = jellyfin_user
logger(f"User list that exist on both servers {users}", 1)
users_filtered = {}
for user in users:
# whitelist_user is not empty and user lowercase is not in whitelist lowercase
if len(whitelist_users) > 0:
if user not in whitelist_users and users[user] not in whitelist_users:
logger(f"{user} or {users[user]} is not in whitelist", 1)
continue
if user not in blacklist_users and users[user] not in blacklist_users:
users_filtered[user] = users[user]
logger(f"Filtered user list {users_filtered}", 1)
plex_users = []
for plex_user in plex.users:
if plex_user.title.lower() in users_filtered.keys() or plex_user.title.lower() in users_filtered.values():
plex_users.append(plex_user)
jellyfin_users = {}
for jellyfin_user, jellyfin_id in jellyfin.users.items():
if jellyfin_user.lower() in users_filtered.keys() or jellyfin_user.lower() in users_filtered.values():
jellyfin_users[jellyfin_user] = jellyfin_id
if len(plex_users) == 0:
raise Exception(f"No plex users found, users found {users} filtered users {users_filtered}")
if len(jellyfin_users) == 0:
raise Exception(f"No jellyfin users found, users found {users} filtered users {users_filtered}")
logger(f"plex_users: {plex_users}", 1)
logger(f"jellyfin_users: {jellyfin_users}", 1)
return plex_users, jellyfin_users
def main():
logfile = os.getenv("LOGFILE","log.log")
# Delete logfile if it exists
if os.path.exists(logfile):
os.remove(logfile)
dryrun = str_to_bool(os.getenv("DRYRUN", "False"))
logger(f"Dryrun: {dryrun}", 1)
user_mapping = os.getenv("USER_MAPPING")
if user_mapping:
user_mapping = json.loads(user_mapping.lower())
logger(f"User Mapping: {user_mapping}", 1)
library_mapping = os.getenv("LIBRARY_MAPPING")
if library_mapping:
library_mapping = json.loads(library_mapping)
logger(f"Library Mapping: {library_mapping}", 1)
plex = Plex()
jellyfin = Jellyfin()
# Create (black/white)lists
blacklist_library, whitelist_library, blacklist_library_type, whitelist_library_type, blacklist_users, whitelist_users = setup_black_white_lists(library_mapping)
# Create users list
plex_users, jellyfin_users = setup_users(plex, jellyfin, blacklist_users, whitelist_users, user_mapping)
plex_watched = plex.get_plex_watched(plex_users, blacklist_library, whitelist_library, blacklist_library_type, whitelist_library_type, library_mapping)
jellyfin_watched = jellyfin.get_jellyfin_watched(jellyfin_users, blacklist_library, whitelist_library, blacklist_library_type, whitelist_library_type, library_mapping)
# clone watched so it isnt modified in the cleanup function so all duplicates are actually removed
plex_watched_filtered = copy.deepcopy(plex_watched)
jellyfin_watched_filtered = copy.deepcopy(jellyfin_watched)
plex_watched = cleanup_watched(plex_watched_filtered, jellyfin_watched_filtered, user_mapping, library_mapping)
logger(f"plex_watched that needs to be synced to jellyfin:\n{plex_watched}", 1)
jellyfin_watched = cleanup_watched(jellyfin_watched_filtered, plex_watched_filtered, user_mapping, library_mapping)
logger(f"jellyfin_watched that needs to be synced to plex:\n{jellyfin_watched}", 1)
# Update watched status
plex.update_watched(jellyfin_watched, user_mapping, library_mapping, dryrun)
jellyfin.update_watched(plex_watched, user_mapping, library_mapping, dryrun)
if __name__ == "__main__":
sleep_timer = float(os.getenv("SLEEP_TIMER", "3600"))
while(True):
try:
main()
logger(f"Looping in {sleep_timer}")
except Exception as error:
if isinstance(error, list):
for message in error:
logger(message, log_type=2)
else:
logger(error, log_type=2)
logger(traceback.format_exc(), 2)
logger(f"Retrying in {sleep_timer}", log_type=0)
except KeyboardInterrupt:
logger("Exiting", log_type=0)
os._exit(0)
sleep(sleep_timer)
import sys
if __name__ == "__main__":
# Check python version 3.9 or higher
if not (3, 9) <= tuple(map(int, sys.version_info[:2])):
print("This script requires Python 3.9 or higher")
sys.exit(1)
from src.main import main
main()

requirements.txt

@@ -1,3 +1,3 @@
plexapi
requests
python-dotenv
PlexAPI==4.15.7
requests==2.31.0
python-dotenv==1.0.0

src/black_white.py Normal file

@@ -0,0 +1,92 @@
from src.functions import logger, search_mapping
def setup_black_white_lists(
blacklist_library: str,
whitelist_library: str,
blacklist_library_type: str,
whitelist_library_type: str,
blacklist_users: str,
whitelist_users: str,
library_mapping=None,
user_mapping=None,
):
blacklist_library, blacklist_library_type, blacklist_users = setup_x_lists(
blacklist_library,
blacklist_library_type,
blacklist_users,
"Black",
library_mapping,
user_mapping,
)
whitelist_library, whitelist_library_type, whitelist_users = setup_x_lists(
whitelist_library,
whitelist_library_type,
whitelist_users,
"White",
library_mapping,
user_mapping,
)
return (
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
)
def setup_x_lists(
xlist_library,
xlist_library_type,
xlist_users,
xlist_type,
library_mapping=None,
user_mapping=None,
):
if xlist_library:
if len(xlist_library) > 0:
xlist_library = xlist_library.split(",")
xlist_library = [x.strip() for x in xlist_library]
if library_mapping:
temp_library = []
for library in xlist_library:
library_other = search_mapping(library_mapping, library)
if library_other:
temp_library.append(library_other)
xlist_library = xlist_library + temp_library
else:
xlist_library = []
logger(f"{xlist_type}list Library: {xlist_library}", 1)
if xlist_library_type:
if len(xlist_library_type) > 0:
xlist_library_type = xlist_library_type.split(",")
xlist_library_type = [x.lower().strip() for x in xlist_library_type]
else:
xlist_library_type = []
logger(f"{xlist_type}list Library Type: {xlist_library_type}", 1)
if xlist_users:
if len(xlist_users) > 0:
xlist_users = xlist_users.split(",")
xlist_users = [x.lower().strip() for x in xlist_users]
if user_mapping:
temp_users = []
for user in xlist_users:
user_other = search_mapping(user_mapping, user)
if user_other:
temp_users.append(user_other)
xlist_users = xlist_users + temp_users
else:
xlist_users = []
else:
xlist_users = []
logger(f"{xlist_type}list Users: {xlist_users}", 1)
return xlist_library, xlist_library_type, xlist_users
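# A hedged usage sketch (made-up values): the env strings arrive comma-separated and
# come back as cleaned-up lists, with mapped library names appended when a mapping exists.
#   setup_black_white_lists(
#       blacklist_library="4K Movies",
#       whitelist_library="Movies,TV Shows",
#       blacklist_library_type="",
#       whitelist_library_type="movie,show",
#       blacklist_users="",
#       whitelist_users="Alice",
#       library_mapping={"TV Shows": "Shows"},
#   )
#   -> (["4K Movies"], ["Movies", "TV Shows", "Shows"], [], ["movie", "show"], [], ["alice"])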

src/functions.py

@@ -1,75 +1,128 @@
import os
from dotenv import load_dotenv
load_dotenv(override=True)
logfile = os.getenv("LOGFILE","log.log")
def logger(message, log_type=0):
debug = str_to_bool(os.getenv("DEBUG", "True"))
output = str(message)
if log_type == 0:
pass
elif log_type == 1 and debug:
output = f"[INFO]: {output}"
elif log_type == 2:
output = f"[ERROR]: {output}"
else:
output = None
if output is not None:
print(output)
file = open(logfile, "a", encoding="utf-8")
file.write(output + "\n")
# Reimplementation of distutils.util.strtobool due to it being deprecated
# Source: https://github.com/PostHog/posthog/blob/01e184c29d2c10c43166f1d40a334abbc3f99d8a/posthog/utils.py#L668
def str_to_bool(value: any) -> bool:
if not value:
return False
return str(value).lower() in ("y", "yes", "t", "true", "on", "1")
# Get mapped value
def search_mapping(dictionary: dict, key_value: str):
if key_value in dictionary.keys():
return dictionary[key_value]
elif key_value.lower() in dictionary.keys():
return dictionary[key_value]
elif key_value in dictionary.values():
return list(dictionary.keys())[list(dictionary.values()).index(key_value)]
elif key_value.lower() in dictionary.values():
return list(dictionary.keys())[list(dictionary.values()).index(key_value)]
else:
return None
def check_skip_logic(library_title, library_type, blacklist_library, whitelist_library, blacklist_library_type, whitelist_library_type, library_mapping):
skip_reason = None
if library_type.lower() in blacklist_library_type:
skip_reason = "is blacklist_library_type"
if library_title.lower() in [x.lower() for x in blacklist_library]:
skip_reason = "is blacklist_library"
library_other = None
if library_mapping:
library_other = search_mapping(library_mapping, library_title)
if library_other:
if library_other.lower() in [x.lower() for x in blacklist_library]:
skip_reason = "is blacklist_library"
if len(whitelist_library_type) > 0:
if library_type.lower() not in whitelist_library_type:
skip_reason = "is not whitelist_library_type"
# if whitelist is not empty and library is not in whitelist
if len(whitelist_library) > 0:
if library_title.lower() not in [x.lower() for x in whitelist_library]:
skip_reason = "is not whitelist_library"
if library_other:
if library_other.lower() not in [x.lower() for x in whitelist_library]:
skip_reason = "is not whitelist_library"
return skip_reason
import os
from concurrent.futures import ThreadPoolExecutor
from dotenv import load_dotenv
load_dotenv(override=True)
logfile = os.getenv("LOGFILE", "log.log")
markfile = os.getenv("MARKFILE", "mark.log")
def logger(message: str, log_type=0):
debug = str_to_bool(os.getenv("DEBUG", "False"))
debug_level = os.getenv("DEBUG_LEVEL", "info").lower()
output = str(message)
if log_type == 0:
pass
elif log_type == 1 and (debug and debug_level in ("info", "debug")):
output = f"[INFO]: {output}"
elif log_type == 2:
output = f"[ERROR]: {output}"
elif log_type == 3 and (debug and debug_level == "debug"):
output = f"[DEBUG]: {output}"
elif log_type == 4:
output = f"[WARNING]: {output}"
elif log_type == 5:
output = f"[MARK]: {output}"
elif log_type == 6:
output = f"[DRYRUN]: {output}"
else:
output = None
if output is not None:
print(output)
file = open(logfile, "a", encoding="utf-8")
file.write(output + "\n")
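# Quick reference for log_type, matching the branches above: 0 prints the message as-is,
# 1 -> [INFO] (only when DEBUG is true and DEBUG_LEVEL is info/debug), 2 -> [ERROR],
# 3 -> [DEBUG] (only when DEBUG is true and DEBUG_LEVEL is debug), 4 -> [WARNING],
# 5 -> [MARK], 6 -> [DRYRUN]; any other value is dropped. Example:
#   logger("Skipping library Anime", 1)  # prints "[INFO]: Skipping library Anime" and appends it to LOGFILE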
def log_marked(
username: str, library: str, movie_show: str, episode: str = None, duration=None
):
if markfile is None:
return
output = f"{username}/{library}/{movie_show}"
if episode:
output += f"/{episode}"
if duration:
output += f"/{duration}"
file = open(f"{markfile}", "a", encoding="utf-8")
file.write(output + "\n")
# Reimplementation of distutils.util.strtobool due to it being deprecated
# Source: https://github.com/PostHog/posthog/blob/01e184c29d2c10c43166f1d40a334abbc3f99d8a/posthog/utils.py#L668
def str_to_bool(value: any) -> bool:
if not value:
return False
return str(value).lower() in ("y", "yes", "t", "true", "on", "1")
# Search for nested element in list
def contains_nested(element, lst):
if lst is None:
return None
for i, item in enumerate(lst):
if item is None:
continue
if element in item:
return i
elif element == item:
return i
return None
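# Example: returns the index of the first entry that contains (or equals) the element,
# which is how file basenames are matched against tuples of locations elsewhere.
#   contains_nested("episode.mkv", [("episode.mkv", "episode.srt"), ("other.mkv",)])  -> 0
#   contains_nested("missing.mkv", [("episode.mkv",)])                                -> None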
# Get mapped value
def search_mapping(dictionary: dict, key_value: str):
if key_value in dictionary.keys():
return dictionary[key_value]
elif key_value.lower() in dictionary.keys():
return dictionary[key_value.lower()]
elif key_value in dictionary.values():
return list(dictionary.keys())[list(dictionary.values()).index(key_value)]
elif key_value.lower() in dictionary.values():
return list(dictionary.keys())[
list(dictionary.values()).index(key_value.lower())
]
else:
return None
def future_thread_executor(
args: list, threads: int = None, override_threads: bool = False
):
futures_list = []
results = []
workers = min(int(os.getenv("MAX_THREADS", 32)), os.cpu_count() * 2)
if threads:
workers = min(threads, workers)
if override_threads:
workers = threads
# If only one worker, run in main thread to avoid overhead
if workers == 1:
results = []
for arg in args:
results.append(arg[0](*arg[1:]))
return results
with ThreadPoolExecutor(max_workers=workers) as executor:
for arg in args:
# * arg unpacks the list into actual arguments
futures_list.append(executor.submit(*arg))
for future in futures_list:
try:
result = future.result()
results.append(result)
except Exception as e:
raise Exception(e)
return results
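# A small usage sketch (hypothetical callable): each entry of args is [callable, *arguments]
# and results come back in submission order; with a single worker everything runs inline.
#   def add(a, b):
#       return a + b
#   future_thread_executor([[add, 1, 2], [add, 3, 4]], threads=2)  -> [3, 7]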

File diff suppressed because it is too large

src/library.py Normal file

@@ -0,0 +1,278 @@
from src.functions import (
logger,
search_mapping,
)
def check_skip_logic(
library_title,
library_type,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping=None,
):
skip_reason = None
library_other = None
if library_mapping:
library_other = search_mapping(library_mapping, library_title)
skip_reason_black = check_blacklist_logic(
library_title,
library_type,
blacklist_library,
blacklist_library_type,
library_other,
)
skip_reason_white = check_whitelist_logic(
library_title,
library_type,
whitelist_library,
whitelist_library_type,
library_other,
)
# Combine skip reasons
if skip_reason_black:
skip_reason = skip_reason_black
if skip_reason_white:
if skip_reason:
skip_reason = skip_reason + " and " + skip_reason_white
else:
skip_reason = skip_reason_white
return skip_reason
def check_blacklist_logic(
library_title,
library_type,
blacklist_library,
blacklist_library_type,
library_other=None,
):
skip_reason = None
if isinstance(library_type, (list, tuple, set)):
for library_type_item in library_type:
if library_type_item.lower() in blacklist_library_type:
skip_reason = f"{library_type_item} is in blacklist_library_type"
else:
if library_type.lower() in blacklist_library_type:
skip_reason = f"{library_type} is in blacklist_library_type"
if library_title.lower() in [x.lower() for x in blacklist_library]:
if skip_reason:
skip_reason = (
skip_reason + " and " + f"{library_title} is in blacklist_library"
)
else:
skip_reason = f"{library_title} is in blacklist_library"
if library_other:
if library_other.lower() in [x.lower() for x in blacklist_library]:
if skip_reason:
skip_reason = (
skip_reason + " and " + f"{library_other} is in blacklist_library"
)
else:
skip_reason = f"{library_other} is in blacklist_library"
return skip_reason
def check_whitelist_logic(
library_title,
library_type,
whitelist_library,
whitelist_library_type,
library_other=None,
):
skip_reason = None
if len(whitelist_library_type) > 0:
if isinstance(library_type, (list, tuple, set)):
for library_type_item in library_type:
if library_type_item.lower() not in whitelist_library_type:
skip_reason = (
f"{library_type_item} is not in whitelist_library_type"
)
else:
if library_type.lower() not in whitelist_library_type:
skip_reason = f"{library_type} is not in whitelist_library_type"
# if whitelist is not empty and library is not in whitelist
if len(whitelist_library) > 0:
if library_other:
if library_title.lower() not in [
x.lower() for x in whitelist_library
] and library_other.lower() not in [x.lower() for x in whitelist_library]:
if skip_reason:
skip_reason = (
skip_reason
+ " and "
+ f"{library_title} is not in whitelist_library"
)
else:
skip_reason = f"{library_title} is not in whitelist_library"
else:
if library_title.lower() not in [x.lower() for x in whitelist_library]:
if skip_reason:
skip_reason = (
skip_reason
+ " and "
+ f"{library_title} is not in whitelist_library"
)
else:
skip_reason = f"{library_title} is not in whitelist_library"
return skip_reason
def show_title_dict(user_list: dict):
try:
show_output_dict = {}
show_output_dict["locations"] = []
show_counter = 0 # Initialize a counter for the current show position
show_output_keys = user_list.keys()
show_output_keys = [dict(x) for x in list(show_output_keys)]
for show_key in show_output_keys:
for provider_key, provider_value in show_key.items():
# Skip title
if provider_key.lower() == "title":
continue
if provider_key.lower() not in show_output_dict:
show_output_dict[provider_key.lower()] = [None] * show_counter
if provider_key.lower() == "locations":
show_output_dict[provider_key.lower()].append(provider_value)
else:
show_output_dict[provider_key.lower()].append(
provider_value.lower()
)
show_counter += 1
for key in show_output_dict:
if len(show_output_dict[key]) < show_counter:
show_output_dict[key].append(None)
return show_output_dict
except Exception:
logger("Skipping show_output_dict ", 1)
return {}
def episode_title_dict(user_list: dict):
try:
episode_output_dict = {}
episode_output_dict["completed"] = []
episode_output_dict["time"] = []
episode_output_dict["locations"] = []
episode_output_dict["show"] = []
episode_output_dict["season"] = []
episode_counter = 0 # Initialize a counter for the current episode position
# Iterate through the shows, seasons, and episodes in user_list
for show in user_list:
for season in user_list[show]:
for episode in user_list[show][season]:
# Add the show title to the episode_output_dict if it doesn't exist
if "show" not in episode_output_dict:
episode_output_dict["show"] = [None] * episode_counter
# Add the season number to the episode_output_dict if it doesn't exist
if "season" not in episode_output_dict:
episode_output_dict["season"] = [None] * episode_counter
# Add the show title to the episode_output_dict
episode_output_dict["show"].append(dict(show))
# Add the season number to the episode_output_dict
episode_output_dict["season"].append(season)
# Iterate through the keys and values in each episode
for episode_key, episode_value in episode.items():
# If the key is not "status", add the key to episode_output_dict if it doesn't exist
if episode_key != "status":
if episode_key.lower() not in episode_output_dict:
# Initialize the list with None values up to the current episode position
episode_output_dict[episode_key.lower()] = [
None
] * episode_counter
# If the key is "locations", append each location to the list
if episode_key == "locations":
episode_output_dict[episode_key.lower()].append(
episode_value
)
# If the key is "status", append the "completed" and "time" values
elif episode_key == "status":
episode_output_dict["completed"].append(
episode_value["completed"]
)
episode_output_dict["time"].append(episode_value["time"])
# For other keys, append the value to the list
else:
episode_output_dict[episode_key.lower()].append(
episode_value.lower()
)
# Increment the episode_counter
episode_counter += 1
# Extend the lists in episode_output_dict with None values to match the current episode_counter
for key in episode_output_dict:
if len(episode_output_dict[key]) < episode_counter:
episode_output_dict[key].append(None)
return episode_output_dict
except Exception:
logger("Skipping episode_output_dict", 1)
return {}
def movies_title_dict(user_list: dict):
try:
movies_output_dict = {}
movies_output_dict["completed"] = []
movies_output_dict["time"] = []
movies_output_dict["locations"] = []
movie_counter = 0 # Initialize a counter for the current movie position
for movie in user_list:
for movie_key, movie_value in movie.items():
if movie_key != "status":
if movie_key.lower() not in movies_output_dict:
movies_output_dict[movie_key.lower()] = []
if movie_key == "locations":
movies_output_dict[movie_key.lower()].append(movie_value)
elif movie_key == "status":
movies_output_dict["completed"].append(movie_value["completed"])
movies_output_dict["time"].append(movie_value["time"])
else:
movies_output_dict[movie_key.lower()].append(movie_value.lower())
movie_counter += 1
for key in movies_output_dict:
if len(movies_output_dict[key]) < movie_counter:
movies_output_dict[key].append(None)
return movies_output_dict
except Exception:
logger("Skipping movies_output_dict failed", 1)
return {}
def generate_library_guids_dict(user_list: dict):
# Handle the case where user_list is empty or does not contain the expected keys and values
if not user_list:
return {}, {}, {}
show_output_dict = show_title_dict(user_list)
episode_output_dict = episode_title_dict(user_list)
movies_output_dict = movies_title_dict(user_list)
return show_output_dict, episode_output_dict, movies_output_dict
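# A hedged example of the columnar output (made-up ids): a movie user_list such as
#   [{"title": "Example Movie", "locations": ("Example.Movie.2020.mkv",),
#     "status": {"completed": True, "time": 0}, "imdb": "tt1234567"}]
# comes back from movies_title_dict as
#   {"completed": [True], "time": [0], "locations": [("Example.Movie.2020.mkv",)],
#    "title": ["example movie"], "imdb": ["tt1234567"]}
# so later lookups can match by provider id or file name at the same index.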

src/main.py Normal file

@@ -0,0 +1,426 @@
import os, traceback, json
from dotenv import load_dotenv
from time import sleep, perf_counter
from src.functions import (
logger,
str_to_bool,
)
from src.users import (
generate_user_list,
combine_user_lists,
filter_user_lists,
generate_server_users,
)
from src.watched import (
cleanup_watched,
)
from src.black_white import setup_black_white_lists
from src.plex import Plex
from src.jellyfin import Jellyfin
load_dotenv(override=True)
def setup_users(
server_1, server_2, blacklist_users, whitelist_users, user_mapping=None
):
server_1_users = generate_user_list(server_1)
server_2_users = generate_user_list(server_2)
logger(f"Server 1 users: {server_1_users}", 1)
logger(f"Server 2 users: {server_2_users}", 1)
users = combine_user_lists(server_1_users, server_2_users, user_mapping)
logger(f"User list that exist on both servers {users}", 1)
users_filtered = filter_user_lists(users, blacklist_users, whitelist_users)
logger(f"Filtered user list {users_filtered}", 1)
output_server_1_users = generate_server_users(server_1, users_filtered)
output_server_2_users = generate_server_users(server_2, users_filtered)
# Check if users is none or empty
if output_server_1_users is None or len(output_server_1_users) == 0:
logger(
f"No users found for server 1 {server_1[0]}, users: {server_1_users}, overlapping users {users}, filtered users {users_filtered}, server 1 users {server_1[1].users}"
)
if output_server_2_users is None or len(output_server_2_users) == 0:
logger(
f"No users found for server 2 {server_2[0]}, users: {server_2_users}, overlapping users {users} filtered users {users_filtered}, server 2 users {server_2[1].users}"
)
if (
output_server_1_users is None
or len(output_server_1_users) == 0
or output_server_2_users is None
or len(output_server_2_users) == 0
):
raise Exception("No users found for one or both servers")
logger(f"Server 1 users: {output_server_1_users}", 1)
logger(f"Server 2 users: {output_server_2_users}", 1)
return output_server_1_users, output_server_2_users
def generate_server_connections():
servers = []
plex_baseurl = os.getenv("PLEX_BASEURL", None)
plex_token = os.getenv("PLEX_TOKEN", None)
plex_username = os.getenv("PLEX_USERNAME", None)
plex_password = os.getenv("PLEX_PASSWORD", None)
plex_servername = os.getenv("PLEX_SERVERNAME", None)
ssl_bypass = str_to_bool(os.getenv("SSL_BYPASS", "False"))
if plex_baseurl and plex_token:
plex_baseurl = plex_baseurl.split(",")
plex_token = plex_token.split(",")
if len(plex_baseurl) != len(plex_token):
raise Exception(
"PLEX_BASEURL and PLEX_TOKEN must have the same number of entries"
)
for i, url in enumerate(plex_baseurl):
server = Plex(
baseurl=url.strip(),
token=plex_token[i].strip(),
username=None,
password=None,
servername=None,
ssl_bypass=ssl_bypass,
)
logger(f"Plex Server {i} info: {server.info()}", 3)
servers.append(
(
"plex",
server,
)
)
if plex_username and plex_password and plex_servername:
plex_username = plex_username.split(",")
plex_password = plex_password.split(",")
plex_servername = plex_servername.split(",")
if len(plex_username) != len(plex_password) or len(plex_username) != len(
plex_servername
):
raise Exception(
"PLEX_USERNAME, PLEX_PASSWORD and PLEX_SERVERNAME must have the same number of entries"
)
for i, username in enumerate(plex_username):
server = Plex(
baseurl=None,
token=None,
username=username.strip(),
password=plex_password[i].strip(),
servername=plex_servername[i].strip(),
ssl_bypass=ssl_bypass,
)
logger(f"Plex Server {i} info: {server.info()}", 3)
servers.append(
(
"plex",
server,
)
)
jellyfin_baseurl = os.getenv("JELLYFIN_BASEURL", None)
jellyfin_token = os.getenv("JELLYFIN_TOKEN", None)
if jellyfin_baseurl and jellyfin_token:
jellyfin_baseurl = jellyfin_baseurl.split(",")
jellyfin_token = jellyfin_token.split(",")
if len(jellyfin_baseurl) != len(jellyfin_token):
raise Exception(
"JELLYFIN_BASEURL and JELLYFIN_TOKEN must have the same number of entries"
)
for i, baseurl in enumerate(jellyfin_baseurl):
baseurl = baseurl.strip()
if baseurl[-1] == "/":
baseurl = baseurl[:-1]
server = Jellyfin(baseurl=baseurl, token=jellyfin_token[i].strip())
logger(f"Jellyfin Server {i} info: {server.info()}", 3)
servers.append(
(
"jellyfin",
server,
)
)
return servers
def get_server_watched(
server_connection: list,
users: dict,
blacklist_library: list,
whitelist_library: list,
blacklist_library_type: list,
whitelist_library_type: list,
library_mapping: dict,
):
if server_connection[0] == "plex":
return server_connection[1].get_watched(
users,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
)
elif server_connection[0] == "jellyfin":
return server_connection[1].get_watched(
users,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
)
def update_server_watched(
server_connection: list,
server_watched_filtered: dict,
user_mapping: dict,
library_mapping: dict,
dryrun: bool,
):
if server_connection[0] == "plex":
server_connection[1].update_watched(
server_watched_filtered, user_mapping, library_mapping, dryrun
)
elif server_connection[0] == "jellyfin":
server_connection[1].update_watched(
server_watched_filtered, user_mapping, library_mapping, dryrun
)
def should_sync_server(server_1_type, server_2_type):
sync_from_plex_to_jellyfin = str_to_bool(
os.getenv("SYNC_FROM_PLEX_TO_JELLYFIN", "True")
)
sync_from_jelly_to_plex = str_to_bool(
os.getenv("SYNC_FROM_JELLYFIN_TO_PLEX", "True")
)
sync_from_plex_to_plex = str_to_bool(os.getenv("SYNC_FROM_PLEX_TO_PLEX", "True"))
sync_from_jelly_to_jellyfin = str_to_bool(
os.getenv("SYNC_FROM_JELLYFIN_TO_JELLYFIN", "True")
)
if (
server_1_type == "plex"
and server_2_type == "plex"
and not sync_from_plex_to_plex
):
logger("Sync between plex and plex is disabled", 1)
return False
if (
server_1_type == "plex"
and server_2_type == "jellyfin"
and not sync_from_jelly_to_plex
):
logger("Sync from jellyfin to plex disabled", 1)
return False
if (
server_1_type == "jellyfin"
and server_2_type == "jellyfin"
and not sync_from_jelly_to_jellyfin
):
logger("Sync between jellyfin and jellyfin is disabled", 1)
return False
if (
server_1_type == "jellyfin"
and server_2_type == "plex"
and not sync_from_plex_to_jellyfin
):
logger("Sync from plex to jellyfin is disabled", 1)
return False
return True
def main_loop():
logfile = os.getenv("LOGFILE", "log.log")
# Delete logfile if it exists
if os.path.exists(logfile):
os.remove(logfile)
dryrun = str_to_bool(os.getenv("DRYRUN", "False"))
logger(f"Dryrun: {dryrun}", 1)
user_mapping = os.getenv("USER_MAPPING")
if user_mapping:
user_mapping = json.loads(user_mapping.lower())
logger(f"User Mapping: {user_mapping}", 1)
library_mapping = os.getenv("LIBRARY_MAPPING")
if library_mapping:
library_mapping = json.loads(library_mapping)
logger(f"Library Mapping: {library_mapping}", 1)
# Create (black/white)lists
logger("Creating (black/white)lists", 1)
blacklist_library = os.getenv("BLACKLIST_LIBRARY", None)
whitelist_library = os.getenv("WHITELIST_LIBRARY", None)
blacklist_library_type = os.getenv("BLACKLIST_LIBRARY_TYPE", None)
whitelist_library_type = os.getenv("WHITELIST_LIBRARY_TYPE", None)
blacklist_users = os.getenv("BLACKLIST_USERS", None)
whitelist_users = os.getenv("WHITELIST_USERS", None)
(
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
) = setup_black_white_lists(
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
library_mapping,
user_mapping,
)
# Create server connections
logger("Creating server connections", 1)
servers = generate_server_connections()
for server_1 in servers:
# If server is the final server in the list, then we are done with the loop
if server_1 == servers[-1]:
break
# Start server_2 at the next server in the list
for server_2 in servers[servers.index(server_1) + 1 :]:
logger(f"Server 1: {server_1[0].capitalize()}: {server_1[1].info()}", 0)
logger(f"Server 2: {server_2[0].capitalize()}: {server_2[1].info()}", 0)
# Create users list
logger("Creating users list", 1)
server_1_users, server_2_users = setup_users(
server_1, server_2, blacklist_users, whitelist_users, user_mapping
)
logger("Creating watched lists", 1)
server_1_watched = get_server_watched(
server_1,
server_1_users,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
)
logger("Finished creating watched list server 1", 1)
server_2_watched = get_server_watched(
server_2,
server_2_users,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
)
logger("Finished creating watched list server 2", 1)
logger(f"Server 1 watched: {server_1_watched}", 3)
logger(f"Server 2 watched: {server_2_watched}", 3)
logger("Cleaning Server 1 Watched", 1)
server_1_watched_filtered = cleanup_watched(
server_1_watched, server_2_watched, user_mapping, library_mapping
)
logger("Cleaning Server 2 Watched", 1)
server_2_watched_filtered = cleanup_watched(
server_2_watched, server_1_watched, user_mapping, library_mapping
)
logger(
f"server 1 watched that needs to be synced to server 2:\n{server_1_watched_filtered}",
1,
)
logger(
f"server 2 watched that needs to be synced to server 1:\n{server_2_watched_filtered}",
1,
)
if should_sync_server(server_1[0], server_2[0]):
update_server_watched(
server_1,
server_2_watched_filtered,
user_mapping,
library_mapping,
dryrun,
)
if should_sync_server(server_2[0], server_1[0]):
update_server_watched(
server_2,
server_1_watched_filtered,
user_mapping,
library_mapping,
dryrun,
)
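# Note: the nested server_1/server_2 loop above is effectively
# itertools.combinations(servers, 2); with servers A, B and C it syncs the pairs
# (A, B), (A, C) and (B, C), so every configured server is compared with every
# other exactly once per run.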
def main():
run_only_once = str_to_bool(os.getenv("RUN_ONLY_ONCE", "False"))
sleep_duration = float(os.getenv("SLEEP_DURATION", "3600"))
times = []
while True:
try:
start = perf_counter()
main_loop()
end = perf_counter()
times.append(end - start)
if len(times) > 0:
logger(f"Average time: {sum(times) / len(times)}", 0)
if run_only_once:
break
logger(f"Looping in {sleep_duration}")
sleep(sleep_duration)
except Exception as error:
if isinstance(error, list):
for message in error:
logger(message, log_type=2)
else:
logger(error, log_type=2)
logger(traceback.format_exc(), 2)
if run_only_once:
break
logger(f"Retrying in {sleep_duration}", log_type=0)
sleep(sleep_duration)
except KeyboardInterrupt:
if len(times) > 0:
logger(f"Average time: {sum(times) / len(times)}", 0)
logger("Exiting", log_type=0)
os._exit(0)

src/plex.py

@@ -1,216 +1,638 @@
import re, os
from dotenv import load_dotenv
from src.functions import logger, search_mapping, check_skip_logic
from plexapi.server import PlexServer
from plexapi.myplex import MyPlexAccount
load_dotenv(override=True)
plex_baseurl = os.getenv("PLEX_BASEURL")
plex_token = os.getenv("PLEX_TOKEN")
username = os.getenv("PLEX_USERNAME")
password = os.getenv("PLEX_PASSWORD")
servername = os.getenv("PLEX_SERVERNAME")
# class plex accept base url and token and username and password but default with none
class Plex:
def __init__(self):
self.baseurl = plex_baseurl
self.token = plex_token
self.username = username
self.password = password
self.servername = servername
self.plex = self.plex_login()
self.admin_user = self.plex.myPlexAccount()
self.users = self.get_plex_users()
def plex_login(self):
try:
if self.baseurl and self.token:
# Login via token
plex = PlexServer(self.baseurl, self.token)
elif self.username and self.password and self.servername:
# Login via plex account
account = MyPlexAccount(self.username, self.password)
plex = account.resource(self.servername).connect()
else:
raise Exception("No complete plex credentials provided")
return plex
except Exception as e:
if self.username or self.password:
msg = f"Failed to login via plex account {self.username}"
logger(f"Plex: Failed to login, {msg}, Error: {e}", 2)
else:
logger(f"Plex: Failed to login, Error: {e}", 2)
return None
def get_plex_users(self):
users = self.plex.myPlexAccount().users()
# append self to users
users.append(self.plex.myPlexAccount())
return users
def get_plex_user_watched(self, user, library):
if self.admin_user == user:
user_plex = self.plex
else:
user_plex = PlexServer(self.baseurl, user.get_token(self.plex.machineIdentifier))
watched = None
if library.type == "movie":
watched = []
library_videos = user_plex.library.section(library.title)
for video in library_videos.search(unmatched=False, unwatched=False):
guids = {}
for guid in video.guids:
guid_source = re.search(r'(.*)://', guid.id).group(1).lower()
guid_id = re.search(r'://(.*)', guid.id).group(1)
guids[guid_source] = guid_id
watched.append(guids)
elif library.type == "show":
watched = {}
library_videos = user_plex.library.section(library.title)
for show in library_videos.search(unmatched=False, unwatched=False):
for season in show.seasons():
guids = []
for episode in season.episodes():
if episode.viewCount > 0:
guids_temp = {}
for guid in episode.guids:
# Extract after :// from guid.id
guid_source = re.search(r'(.*)://', guid.id).group(1).lower()
guid_id = re.search(r'://(.*)', guid.id).group(1)
guids_temp[guid_source] = guid_id
guids.append(guids_temp)
if guids:
# append show, season, episode
if show.title not in watched:
watched[show.title] = {}
if season.title not in watched[show.title]:
watched[show.title][season.title] = {}
watched[show.title][season.title] = guids
return watched
def get_plex_watched(self, users, blacklist_library, whitelist_library, blacklist_library_type, whitelist_library_type, library_mapping):
# Get all libraries
libraries = self.plex.library.sections()
users_watched = {}
# for not in blacklist
for library in libraries:
library_title = library.title
library_type = library.type
skip_reason = check_skip_logic(library_title, library_type, blacklist_library, whitelist_library, blacklist_library_type, whitelist_library_type, library_mapping)
if skip_reason:
logger(f"Plex: Skipping library {library_title} {skip_reason}", 1)
continue
for user in users:
logger(f"Plex: Generating watched for {user.title} in library {library_title}", 0)
user_name = user.title.lower()
watched = self.get_plex_user_watched(user, library)
if watched:
if user_name not in users_watched:
users_watched[user_name] = {}
if library_title not in users_watched[user_name]:
users_watched[user_name][library_title] = []
users_watched[user_name][library_title] = watched
return users_watched
def update_watched(self, watched_list, user_mapping=None, library_mapping=None, dryrun=False):
for user, libraries in watched_list.items():
if user_mapping:
user_other = None
if user in user_mapping.keys():
user_other = user_mapping[user]
elif user in user_mapping.values():
user_other = search_mapping(user_mapping, user)
if user_other:
logger(f"Swapping user {user} with {user_other}", 1)
user = user_other
for index, value in enumerate(self.users):
if user.lower() == value.title.lower():
user = self.users[index]
break
if self.admin_user == user:
user_plex = self.plex
else:
user_plex = PlexServer(self.baseurl, user.get_token(self.plex.machineIdentifier))
for library, videos in libraries.items():
if library_mapping:
library_other = None
if library in library_mapping.keys():
library_other = library_mapping[library]
elif library in library_mapping.values():
library_other = search_mapping(library_mapping, library)
if library_other:
logger(f"Swapping library {library} with {library_other}", 1)
library = library_other
# if library in plex library list
library_list = user_plex.library.sections()
if library.lower() not in [x.title.lower() for x in library_list]:
logger(f"Library {library} not found in Plex library list", 2)
continue
logger(f"Plex: Updating watched for {user.title} in library {library}", 1)
library_videos = user_plex.library.section(library)
if library_videos.type == "movie":
for movies_search in library_videos.search(unmatched=False, unwatched=True):
for guid in movies_search.guids:
guid_source = re.search(r'(.*)://', guid.id).group(1).lower()
guid_id = re.search(r'://(.*)', guid.id).group(1)
for video in videos:
for video_keys, video_id in video.items():
if video_keys == guid_source and video_id == guid_id:
if movies_search.viewCount == 0:
msg = f"{movies_search.title} as watched for {user.title} in {library} for Plex"
if not dryrun:
logger(f"Marked {msg}", 0)
movies_search.markWatched()
else:
logger(f"Dryrun {msg}", 0)
break
elif library_videos.type == "show":
for show_search in library_videos.search(unmatched=False, unwatched=True):
if show_search.title in videos:
for season_search in show_search.seasons():
for episode_search in season_search.episodes():
for guid in episode_search.guids:
guid_source = re.search(r'(.*)://', guid.id).group(1).lower()
guid_id = re.search(r'://(.*)', guid.id).group(1)
for show in videos:
for season in videos[show]:
for episode in videos[show][season]:
for episode_keys, episode_id in episode.items():
if episode_keys == guid_source and episode_id == guid_id:
if episode_search.viewCount == 0:
msg = f"{show_search.title} {season_search.title} {episode_search.title} as watched for {user.title} in {library} for Plex"
if not dryrun:
logger(f"Marked {msg}", 0)
episode_search.markWatched()
else:
logger(f"Dryrun {msg}", 0)
break
import os, requests, traceback
from dotenv import load_dotenv
from typing import Dict, Union, FrozenSet
from urllib3.poolmanager import PoolManager
from math import floor
from requests.adapters import HTTPAdapter as RequestsHTTPAdapter
from plexapi.video import Show, Episode, Movie
from plexapi.server import PlexServer
from plexapi.myplex import MyPlexAccount
from src.functions import (
logger,
search_mapping,
future_thread_executor,
contains_nested,
log_marked,
str_to_bool,
)
from src.library import (
check_skip_logic,
generate_library_guids_dict,
)
load_dotenv(override=True)
generate_guids = str_to_bool(os.getenv("GENERATE_GUIDS", "True"))
generate_locations = str_to_bool(os.getenv("GENERATE_LOCATIONS", "True"))
# Bypass hostname validation for ssl. Taken from https://github.com/pkkid/python-plexapi/issues/143#issuecomment-775485186
class HostNameIgnoringAdapter(RequestsHTTPAdapter):
def init_poolmanager(self, connections, maxsize, block=..., **pool_kwargs):
self.poolmanager = PoolManager(
num_pools=connections,
maxsize=maxsize,
block=block,
assert_hostname=False,
**pool_kwargs,
)
def extract_guids_from_item(item: Union[Movie, Show, Episode]) -> Dict[str, str]:
# If GENERATE_GUIDS is set to False, then return an empty dict
if not generate_guids:
return {}
guids: Dict[str, str] = dict(
guid.id.split("://")
for guid in item.guids
if guid.id is not None and len(guid.id.strip()) > 0
)
if len(guids) == 0:
logger(
f"Plex: Failed to get any guids for {item.title}",
1,
)
return guids
def get_guids(item: Union[Movie, Episode], completed=True):
if not item.locations:
logger(
f"Plex: {item.title} has no locations",
1,
)
if not item.guids:
logger(
f"Plex: {item.title} has no guids",
1,
)
return {
"title": item.title,
"locations": (
tuple([location.split("/")[-1] for location in item.locations])
if generate_locations
else tuple()
),
"status": {
"completed": completed,
"time": item.viewOffset,
},
} | extract_guids_from_item(
item
) # Merge the metadata and guid dictionaries
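# A hedged example of the returned shape for a fully watched movie (made-up ids):
#   {"title": "Example Movie", "locations": ("Example.Movie.2020.mkv",),
#    "status": {"completed": True, "time": 0},
#    "imdb": "tt1234567", "tmdb": "12345"}
# i.e. the basic metadata merged with whatever provider guids Plex exposes for the item.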
def get_user_library_watched_show(show, process_episodes, threads=None):
try:
show_guids: FrozenSet = frozenset(
(
{
"title": show.title,
"locations": (
tuple([location.split("/")[-1] for location in show.locations])
if generate_locations
else tuple()
),
}
| extract_guids_from_item(show)
).items() # Merge the metadata and guid dictionaries
)
episode_guids_args = []
for episode in process_episodes:
episode_guids_args.append([get_guids, episode, episode.isWatched])
episode_guids_results = future_thread_executor(
episode_guids_args, threads=threads
)
episode_guids = {}
for index, episode in enumerate(process_episodes):
if episode.parentIndex not in episode_guids:
episode_guids[episode.parentIndex] = []
episode_guids[episode.parentIndex].append(episode_guids_results[index])
return show_guids, episode_guids
except Exception:
return {}, {}
def get_user_library_watched(user, user_plex, library):
user_name: str = user.username.lower() if user.username else user.title.lower()
try:
logger(
f"Plex: Generating watched for {user_name} in library {library.title}",
0,
)
library_videos = user_plex.library.section(library.title)
if library.type == "movie":
watched = []
args = [
[get_guids, video, video.isWatched]
for video in library_videos.search(unwatched=False)
+ library_videos.search(inProgress=True)
if video.isWatched or video.viewOffset >= 60000
]
for guid in future_thread_executor(args, threads=len(args)):
logger(f"Plex: Adding {guid['title']} to {user_name} watched list", 3)
watched.append(guid)
elif library.type == "show":
watched = {}
# Get all watched shows and partially watched shows
parallel_show_task = []
parallel_episodes_task = []
for show in library_videos.search(unwatched=False) + library_videos.search(
inProgress=True
):
process_episodes = []
for episode in show.episodes():
if episode.isWatched or episode.viewOffset >= 60000:
process_episodes.append(episode)
# Shows with 24 or more episodes have their episodes processed in parallel
# Shows with fewer than 24 episodes have their episodes processed serially, but the shows themselves are processed in parallel
if len(process_episodes) >= 24:
parallel_episodes_task.append(
[
get_user_library_watched_show,
show,
process_episodes,
len(process_episodes),
]
)
else:
parallel_show_task.append(
[get_user_library_watched_show, show, process_episodes, 1]
)
for show_guids, episode_guids in future_thread_executor(
parallel_show_task, threads=len(parallel_show_task)
) + future_thread_executor(parallel_episodes_task, threads=1):
if show_guids and episode_guids:
watched[show_guids] = episode_guids
logger(
f"Plex: Added {episode_guids} to {user_name} {show_guids} watched list",
3,
)
else:
watched = None
logger(f"Plex: Got watched for {user_name} in library {library.title}", 1)
logger(f"Plex: {watched}", 3)
return {user_name: {library.title: watched} if watched is not None else {}}
except Exception as e:
logger(
f"Plex: Failed to get watched for {user_name} in library {library.title}, Error: {e}",
2,
)
return {}
def find_video(plex_search, video_ids, videos=None):
try:
if not generate_guids and not generate_locations:
return False, []
if generate_locations:
for location in plex_search.locations:
if (
contains_nested(location.split("/")[-1], video_ids["locations"])
is not None
):
episode_videos = []
if videos:
for show, seasons in videos.items():
show = {k: v for k, v in show}
if (
contains_nested(
location.split("/")[-1], show["locations"]
)
is not None
):
for season in seasons.values():
for episode in season:
episode_videos.append(episode)
return True, episode_videos
if generate_guids:
for guid in plex_search.guids:
guid_source, guid_id = guid.id.split("://")
# If show provider source and show provider id are in videos_shows_ids exactly, then the show is in the list
if guid_source in video_ids.keys():
if guid_id in video_ids[guid_source]:
episode_videos = []
if videos:
for show, seasons in videos.items():
show = {k: v for k, v in show}
if guid_source in show.keys():
if guid_id == show[guid_source]:
for season in seasons.values():
for episode in season:
episode_videos.append(episode)
return True, episode_videos
return False, []
except Exception:
return False, []
def get_video_status(plex_search, video_ids, videos):
try:
if not generate_guids and not generate_locations:
return None
if generate_locations:
for location in plex_search.locations:
if (
contains_nested(location.split("/")[-1], video_ids["locations"])
is not None
):
for video in videos:
if (
contains_nested(location.split("/")[-1], video["locations"])
is not None
):
return video["status"]
if generate_guids:
for guid in plex_search.guids:
guid_source, guid_id = guid.id.split("://")
# If show provider source and show provider id are in videos_shows_ids exactly, then the show is in the list
if guid_source in video_ids.keys():
if guid_id in video_ids[guid_source]:
for video in videos:
if guid_source in video.keys():
if guid_id == video[guid_source]:
return video["status"]
return None
except Exception:
return None
def update_user_watched(user, user_plex, library, videos, dryrun):
try:
logger(f"Plex: Updating watched for {user.title} in library {library}", 1)
(
videos_shows_ids,
videos_episodes_ids,
videos_movies_ids,
) = generate_library_guids_dict(videos)
logger(
f"Plex: mark list\nShows: {videos_shows_ids}\nEpisodes: {videos_episodes_ids}\nMovies: {videos_movies_ids}",
1,
)
library_videos = user_plex.library.section(library)
if videos_movies_ids:
for movies_search in library_videos.search(unwatched=True):
video_status = get_video_status(
movies_search, videos_movies_ids, videos
)
if video_status:
if video_status["completed"]:
msg = f"Plex: {movies_search.title} as watched for {user.title} in {library}"
if not dryrun:
logger(msg, 5)
movies_search.markWatched()
else:
logger(msg, 6)
log_marked(user.title, library, movies_search.title, None, None)
elif video_status["time"] > 60_000:
msg = f"Plex: {movies_search.title} as partially watched for {floor(video_status['time'] / 60_000)} minutes for {user.title} in {library}"
if not dryrun:
logger(msg, 5)
movies_search.updateTimeline(video_status["time"])
else:
logger(msg, 6)
log_marked(
user.title,
library,
movies_search.title,
duration=video_status["time"],
)
else:
logger(
f"Plex: Skipping movie {movies_search.title} as it is not in mark list for {user.title}",
1,
)
if videos_shows_ids and videos_episodes_ids:
for show_search in library_videos.search(unwatched=True):
show_found, episode_videos = find_video(
show_search, videos_shows_ids, videos
)
if show_found:
for episode_search in show_search.episodes():
video_status = get_video_status(
episode_search, videos_episodes_ids, episode_videos
)
if video_status:
if video_status["completed"]:
msg = f"Plex: {show_search.title} {episode_search.title} as watched for {user.title} in {library}"
if not dryrun:
logger(msg, 5)
episode_search.markWatched()
else:
logger(msg, 6)
log_marked(
user.title,
library,
show_search.title,
episode_search.title,
)
else:
msg = f"Plex: {show_search.title} {episode_search.title} as partially watched for {floor(video_status['time'] / 60_000)} minutes for {user.title} in {library}"
if not dryrun:
logger(msg, 5)
episode_search.updateTimeline(video_status["time"])
else:
logger(msg, 6)
log_marked(
user.title,
library,
show_search.title,
episode_search.title,
video_status["time"],
)
else:
logger(
f"Plex: Skipping episode {episode_search.title} as it is not in mark list for {user.title}",
3,
)
else:
logger(
f"Plex: Skipping show {show_search.title} as it is not in mark list for {user.title}",
3,
)
if not videos_movies_ids and not videos_shows_ids and not videos_episodes_ids:
logger(
f"Jellyfin: No videos to mark as watched for {user.title} in library {library}",
1,
)
except Exception as e:
logger(
f"Plex: Failed to update watched for {user.title} in library {library}, Error: {e}",
2,
)
logger(traceback.format_exc(), 2)
# The Plex class accepts either a baseurl/token pair or a username/password/servername combination; all credentials default to None
class Plex:
def __init__(
self,
baseurl=None,
token=None,
username=None,
password=None,
servername=None,
ssl_bypass=False,
session=None,
):
self.baseurl = baseurl
self.token = token
self.username = username
self.password = password
self.servername = servername
self.ssl_bypass = ssl_bypass
if ssl_bypass:
# Session for ssl bypass
session = requests.Session()
# By pass ssl hostname check https://github.com/pkkid/python-plexapi/issues/143#issuecomment-775485186
session.mount("https://", HostNameIgnoringAdapter())
self.session = session
self.plex = self.login(self.baseurl, self.token)
self.admin_user = self.plex.myPlexAccount()
self.users = self.get_users()
def login(self, baseurl, token):
try:
if baseurl and token:
plex = PlexServer(baseurl, token, session=self.session)
elif self.username and self.password and self.servername:
# Login via plex account
account = MyPlexAccount(self.username, self.password)
plex = account.resource(self.servername).connect()
else:
raise Exception("No complete plex credentials provided")
return plex
except Exception as e:
if self.username or self.password:
msg = f"Failed to login via plex account {self.username}"
logger(f"Plex: Failed to login, {msg}, Error: {e}", 2)
else:
logger(f"Plex: Failed to login, Error: {e}", 2)
raise Exception(e)
def info(self) -> str:
return f"{self.plex.friendlyName}: {self.plex.version}"
def get_users(self):
try:
users = self.plex.myPlexAccount().users()
# append self to users
users.append(self.plex.myPlexAccount())
return users
except Exception as e:
logger(f"Plex: Failed to get users, Error: {e}", 2)
raise Exception(e)
def get_watched(
self,
users,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
):
try:
# Get all libraries
users_watched = {}
for user in users:
if self.admin_user == user:
user_plex = self.plex
else:
token = user.get_token(self.plex.machineIdentifier)
if token:
user_plex = self.login(
self.plex._baseurl,
token,
)
else:
logger(
f"Plex: Failed to get token for {user.title}, skipping",
2,
)
users_watched[user.title] = {}
continue
libraries = user_plex.library.sections()
for library in libraries:
library_title = library.title
library_type = library.type
skip_reason = check_skip_logic(
library_title,
library_type,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
)
if skip_reason:
logger(
f"Plex: Skipping library {library_title}: {skip_reason}", 1
)
continue
user_watched = get_user_library_watched(user, user_plex, library)
for user_watched, user_watched_temp in user_watched.items():
if user_watched not in users_watched:
users_watched[user_watched] = {}
users_watched[user_watched].update(user_watched_temp)
return users_watched
except Exception as e:
logger(f"Plex: Failed to get watched, Error: {e}", 2)
raise Exception(e)
def update_watched(
self, watched_list, user_mapping=None, library_mapping=None, dryrun=False
):
try:
args = []
for user, libraries in watched_list.items():
user_other = None
# Map the user to its counterpart on the other server if a user mapping is provided
if user_mapping:
if user in user_mapping.keys():
user_other = user_mapping[user]
elif user in user_mapping.values():
user_other = search_mapping(user_mapping, user)
for index, value in enumerate(self.users):
username_title = (
value.username.lower()
if value.username
else value.title.lower()
)
if user.lower() == username_title:
user = self.users[index]
break
elif user_other and user_other.lower() == username_title:
user = self.users[index]
break
if self.admin_user == user:
user_plex = self.plex
else:
if isinstance(user, str):
logger(
f"Plex: {user} is not a plex object, attempting to get object for user",
4,
)
user = self.plex.myPlexAccount().user(user)
token = user.get_token(self.plex.machineIdentifier)
if token:
user_plex = PlexServer(
self.plex._baseurl,
token,
session=self.session,
)
else:
logger(
f"Plex: Failed to get token for {user.title}, skipping",
2,
)
continue
for library, videos in libraries.items():
library_other = None
if library_mapping:
if library in library_mapping.keys():
library_other = library_mapping[library]
elif library in library_mapping.values():
library_other = search_mapping(library_mapping, library)
# Check that the library (or its mapped name) exists in the Plex library list
library_list = user_plex.library.sections()
if library.lower() not in [x.title.lower() for x in library_list]:
if library_other:
if library_other.lower() in [
x.title.lower() for x in library_list
]:
logger(
f"Plex: Library {library} not found, but {library_other} found, using {library_other}",
1,
)
library = library_other
else:
logger(
f"Plex: Library {library} or {library_other} not found in library list",
1,
)
continue
else:
logger(
f"Plex: Library {library} not found in library list",
1,
)
continue
args.append(
[
update_user_watched,
user,
user_plex,
library,
videos,
dryrun,
]
)
future_thread_executor(args)
except Exception as e:
logger(f"Plex: Failed to update watched, Error: {e}", 2)
raise Exception(e)
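For orientation, a minimal sketch of how this wrapper could be driven end to end; the connection values are placeholders (the real ones come from the .env settings further down) and the module path src/plex.py is assumed from the layout of the other new files:

from src.plex import Plex  # assumed module path

# Placeholder connection details; ssl_bypass mirrors the SSL_BYPASS setting below.
server = Plex(
    baseurl="https://localhost:32400",
    token="PLACEHOLDER_TOKEN",
    ssl_bypass=True,
)

# Pull watched state for every known user, with no black/whitelists or mappings applied.
watched = server.get_watched(
    users=server.users,
    blacklist_library=[],
    whitelist_library=[],
    blacklist_library_type=[],
    whitelist_library_type=[],
    library_mapping=None,
)

# Push a (possibly cleaned-up) watched list back, as a dry run only.
server.update_watched(watched, dryrun=True)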

src/users.py (new file, 91 lines)
@@ -0,0 +1,91 @@
from src.functions import (
logger,
search_mapping,
)
def generate_user_list(server):
# Generate the list of usernames for a server; called once for server 1 and once for server 2
server_type = server[0]
server_connection = server[1]
server_users = []
if server_type == "plex":
for user in server_connection.users:
server_users.append(
user.username.lower() if user.username else user.title.lower()
)
elif server_type == "jellyfin":
server_users = [key.lower() for key in server_connection.users.keys()]
return server_users
def combine_user_lists(server_1_users, server_2_users, user_mapping):
# Build the combined list of overlapping users between the two servers
users = {}
for server_1_user in server_1_users:
if user_mapping:
mapped_user = search_mapping(user_mapping, server_1_user)
if mapped_user in server_2_users:
users[server_1_user] = mapped_user
continue
if server_1_user in server_2_users:
users[server_1_user] = server_1_user
for server_2_user in server_2_users:
if user_mapping:
mapped_user = search_mapping(user_mapping, server_2_user)
if mapped_user in server_1_users:
users[mapped_user] = server_2_user
continue
if server_2_user in server_1_users:
users[server_2_user] = server_2_user
return users
def filter_user_lists(users, blacklist_users, whitelist_users):
users_filtered = {}
for user in users:
# If the whitelist is not empty, skip users whose name (or mapped name) is not in it
if len(whitelist_users) > 0:
if user not in whitelist_users and users[user] not in whitelist_users:
logger(f"{user} or {users[user]} is not in whitelist", 1)
continue
if user not in blacklist_users and users[user] not in blacklist_users:
users_filtered[user] = users[user]
return users_filtered
def generate_server_users(server, users):
server_users = None
if server[0] == "plex":
server_users = []
for plex_user in server[1].users:
username_title = (
plex_user.username if plex_user.username else plex_user.title
)
if (
username_title.lower() in users.keys()
or username_title.lower() in users.values()
):
server_users.append(plex_user)
elif server[0] == "jellyfin":
server_users = {}
for jellyfin_user, jellyfin_id in server[1].users.items():
if (
jellyfin_user.lower() in users.keys()
or jellyfin_user.lower() in users.values()
):
server_users[jellyfin_user] = jellyfin_id
return server_users
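A small worked example of how these helpers fit together, using made-up usernames and a made-up mapping (search_mapping is assumed to resolve a name from either side of the mapping, as the tests further down exercise):

from src.users import combine_user_lists, filter_user_lists

server_1_users = ["alice", "bob"]       # e.g. from generate_user_list(("plex", plex_server))
server_2_users = ["alice_jf", "bob"]    # e.g. from generate_user_list(("jellyfin", jellyfin_server))
user_mapping = {"alice": "alice_jf"}    # hypothetical cross-server mapping

users = combine_user_lists(server_1_users, server_2_users, user_mapping)
# users == {"alice": "alice_jf", "bob": "bob"}

users = filter_user_lists(users, blacklist_users=[], whitelist_users=[])
# Nothing is black/whitelisted, so everything is kept.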

src/watched.py (new file, 317 lines)
@@ -0,0 +1,317 @@
import copy
from src.functions import logger, search_mapping, contains_nested
from src.library import generate_library_guids_dict
def combine_watched_dicts(dicts: list):
# Ensure that the input is a list of dictionaries
if not all(isinstance(d, dict) for d in dicts):
raise ValueError("Input must be a list of dictionaries")
combined_dict = {}
for single_dict in dicts:
for key, value in single_dict.items():
if key not in combined_dict:
combined_dict[key] = {}
for subkey, subvalue in value.items():
if subkey in combined_dict[key]:
# If the subkey already exists in the combined dictionary,
# check if the values are different and raise an exception if they are
if combined_dict[key][subkey] != subvalue:
raise ValueError(
f"Conflicting values for subkey '{subkey}' under key '{key}'"
)
else:
# If the subkey does not exist in the combined dictionary, add it
combined_dict[key][subkey] = subvalue
return combined_dict
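A tiny illustration of combine_watched_dicts with invented data; it merges the per-library results gathered in parallel into one dictionary per user:

from src.watched import combine_watched_dicts

a = {"userA": {"Movies": [{"title": "Example Movie"}]}}
b = {"userA": {"Shows": {}}}
combined = combine_watched_dicts([a, b])
# combined == {"userA": {"Movies": [{"title": "Example Movie"}], "Shows": {}}}
# Repeating "Movies" for "userA" with a different value would raise ValueError instead.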
def check_remove_entry(video, library, video_index, library_watched_list_2):
if video_index is not None:
if (
library_watched_list_2["completed"][video_index]
== video["status"]["completed"]
) and (library_watched_list_2["time"][video_index] == video["status"]["time"]):
logger(
f"Removing {video['title']} from {library} due to exact match",
3,
)
return True
elif (
library_watched_list_2["completed"][video_index] == True
and video["status"]["completed"] == False
):
logger(
f"Removing {video['title']} from {library} due to being complete in one library and not the other",
3,
)
return True
elif (
library_watched_list_2["completed"][video_index] == False
and video["status"]["completed"] == False
) and (video["status"]["time"] < library_watched_list_2["time"][video_index]):
logger(
f"Removing {video['title']} from {library} due to more time watched in one library than the other",
3,
)
return True
elif (
library_watched_list_2["completed"][video_index] == True
and video["status"]["completed"] == True
):
logger(
f"Removing {video['title']} from {library} due to being complete in both libraries",
3,
)
return True
return False
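In short, an entry is dropped from list 1 whenever list 2 is already at least as far along. A minimal call with made-up data:

from src.watched import check_remove_entry

video = {"title": "Example Episode", "status": {"completed": True, "time": 0}}
other_library = {"completed": [True], "time": [0]}  # state of the same episode on the other server
check_remove_entry(video, "TV Shows", 0, other_library)  # True: exact match, nothing left to sync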
def cleanup_watched(
watched_list_1, watched_list_2, user_mapping=None, library_mapping=None
):
modified_watched_list_1 = copy.deepcopy(watched_list_1)
# remove entries from watched_list_1 that are in watched_list_2
for user_1 in watched_list_1:
user_other = None
if user_mapping:
user_other = search_mapping(user_mapping, user_1)
user_2 = get_other(watched_list_2, user_1, user_other)
if user_2 is None:
continue
for library_1 in watched_list_1[user_1]:
library_other = None
if library_mapping:
library_other = search_mapping(library_mapping, library_1)
library_2 = get_other(watched_list_2[user_2], library_1, library_other)
if library_2 is None:
continue
(
_,
episode_watched_list_2_keys_dict,
movies_watched_list_2_keys_dict,
) = generate_library_guids_dict(watched_list_2[user_2][library_2])
# Movies
if isinstance(watched_list_1[user_1][library_1], list):
for movie in watched_list_1[user_1][library_1]:
movie_index = get_movie_index_in_dict(
movie, movies_watched_list_2_keys_dict
)
if movie_index is not None:
if check_remove_entry(
movie,
library_1,
movie_index,
movies_watched_list_2_keys_dict,
):
modified_watched_list_1[user_1][library_1].remove(movie)
# TV Shows
elif isinstance(watched_list_1[user_1][library_1], dict):
for show_key_1 in watched_list_1[user_1][library_1].keys():
show_key_dict = dict(show_key_1)
for season in watched_list_1[user_1][library_1][show_key_1]:
# Filter the episode_watched_list_2_keys_dict dictionary to handle cases
# where episode location names are not unique such as S01E01.mkv
filtered_episode_watched_list_2_keys_dict = (
filter_episode_watched_list_2_keys_dict(
episode_watched_list_2_keys_dict, show_key_dict, season
)
)
for episode in watched_list_1[user_1][library_1][show_key_1][
season
]:
episode_index = get_episode_index_in_dict(
episode, filtered_episode_watched_list_2_keys_dict
)
if episode_index is not None:
if check_remove_entry(
episode,
library_1,
episode_index,
episode_watched_list_2_keys_dict,
):
modified_watched_list_1[user_1][library_1][
show_key_1
][season].remove(episode)
# Remove empty seasons
if (
len(
modified_watched_list_1[user_1][library_1][show_key_1][
season
]
)
== 0
):
if (
season
in modified_watched_list_1[user_1][library_1][
show_key_1
]
):
logger(
f"Removing {season} from {show_key_dict['title']} because it is empty",
3,
)
del modified_watched_list_1[user_1][library_1][
show_key_1
][season]
# Remove empty shows
if len(modified_watched_list_1[user_1][library_1][show_key_1]) == 0:
if show_key_1 in modified_watched_list_1[user_1][library_1]:
logger(
f"Removing {show_key_dict['title']} because it is empty",
3,
)
del modified_watched_list_1[user_1][library_1][show_key_1]
for user_1 in watched_list_1:
for library_1 in watched_list_1[user_1]:
if library_1 in modified_watched_list_1[user_1]:
# If library is empty then remove it
if len(modified_watched_list_1[user_1][library_1]) == 0:
logger(f"Removing {library_1} from {user_1} because it is empty", 1)
del modified_watched_list_1[user_1][library_1]
if user_1 in modified_watched_list_1:
# If user is empty delete user
if len(modified_watched_list_1[user_1]) == 0:
logger(f"Removing {user_1} from watched list 1 because it is empty", 1)
del modified_watched_list_1[user_1]
return modified_watched_list_1
def get_other(watched_list, object_1, object_2):
if object_1 in watched_list:
return object_1
elif object_2 in watched_list:
return object_2
else:
logger(f"{object_1} and {object_2} not found in watched list 2", 1)
return None
def get_movie_index_in_dict(movie, movies_watched_list_2_keys_dict):
# Iterate through the keys and values of the movie dictionary
for movie_key, movie_value in movie.items():
# If the key is "locations", check if the "locations" key is present in the movies_watched_list_2_keys_dict dictionary
if movie_key == "locations":
if "locations" in movies_watched_list_2_keys_dict.keys():
# Iterate through the locations in the movie dictionary
for location in movie_value:
# If the location is in the movies_watched_list_2_keys_dict dictionary, return index of the key
return contains_nested(
location, movies_watched_list_2_keys_dict["locations"]
)
# If the key is not "locations", check if the movie_key is present in the movies_watched_list_2_keys_dict dictionary
else:
if movie_key in movies_watched_list_2_keys_dict.keys():
# If the movie_value is in the movies_watched_list_2_keys_dict dictionary, return its index
if movie_value in movies_watched_list_2_keys_dict[movie_key]:
return movies_watched_list_2_keys_dict[movie_key].index(movie_value)
# If the loop completes without finding a match, return None
return None
def filter_episode_watched_list_2_keys_dict(
episode_watched_list_2_keys_dict, show_key_dict, season
):
# If the episode_watched_list_2_keys_dict dictionary is empty or missing the season or show keys, return an empty dictionary
if (
len(episode_watched_list_2_keys_dict) == 0
or "season" not in episode_watched_list_2_keys_dict.keys()
or "show" not in episode_watched_list_2_keys_dict.keys()
):
return {}
# Filter the episode_watched_list_2_keys_dict dictionary to only include values for the correct show and season
filtered_episode_watched_list_2_keys_dict = {}
show_indices = []
season_indices = []
# Iterate through episode_watched_list_2_keys_dict["season"] and find the indices that match season
for season_index, season_value in enumerate(
episode_watched_list_2_keys_dict.get("season")
):
if season_value == season:
season_indices.append(season_index)
# Iterate through episode_watched_list_2_keys_dict["show"] and find the indices that match show_key_dict
for show_index, show_value in enumerate(episode_watched_list_2_keys_dict["show"]):
# Iterate through the keys and values of the show_value dictionary and check if they match show_key_dict
for show_key, show_key_value in show_value.items():
if show_key == "locations":
# Iterate through the locations in the show_value dictionary
for location in show_key_value:
# If the location is in the episode_watched_list_2_keys_dict dictionary, return index of the key
if (
contains_nested(location, show_key_dict["locations"])
is not None
):
show_indices.append(show_index)
break
else:
if show_key in show_key_dict.keys():
if show_key_value == show_key_dict[show_key]:
show_indices.append(show_index)
break
# Find the intersection of the show_indices and season_indices lists
indices = list(set(show_indices) & set(season_indices))
# If there are no indices that match the show and season, return an empty dictionary
if len(indices) == 0:
return {}
# Create a copy of the dictionary keeping the values at indices that match the show and season and None everywhere else
for key, value in episode_watched_list_2_keys_dict.items():
if key not in filtered_episode_watched_list_2_keys_dict:
filtered_episode_watched_list_2_keys_dict[key] = []
for index, _ in enumerate(value):
if index in indices:
filtered_episode_watched_list_2_keys_dict[key].append(value[index])
else:
filtered_episode_watched_list_2_keys_dict[key].append(None)
return filtered_episode_watched_list_2_keys_dict
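A hypothetical example of the filtering above; the shapes are invented (and the locations branch is not exercised), but it shows how non-matching entries are padded with None so the original indices are preserved:

from src.watched import filter_episode_watched_list_2_keys_dict

keys = {
    "season": [1, 1, 2],
    "show": [{"title": "Example Show"}, {"title": "Other Show"}, {"title": "Example Show"}],
    "imdb": ["tt0000001", "tt0000002", "tt0000003"],  # made-up ids
}
filter_episode_watched_list_2_keys_dict(keys, {"title": "Example Show"}, 1)
# == {"season": [1, None, None],
#     "show": [{"title": "Example Show"}, None, None],
#     "imdb": ["tt0000001", None, None]}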
def get_episode_index_in_dict(episode, episode_watched_list_2_keys_dict):
# Iterate through the keys and values of the episode dictionary
for episode_key, episode_value in episode.items():
if episode_key in episode_watched_list_2_keys_dict.keys():
if episode_key == "locations":
# Iterate through the locations in the episode dictionary
for location in episode_value:
# If the location is in the episode_watched_list_2_keys_dict dictionary, return index of the key
return contains_nested(
location, episode_watched_list_2_keys_dict["locations"]
)
else:
# If the episode_value is in the episode_watched_list_2_keys_dict dictionary, return its index
if episode_value in episode_watched_list_2_keys_dict[episode_key]:
return episode_watched_list_2_keys_dict[episode_key].index(
episode_value
)
# If the loop completes without finding a match, return None
return None
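Putting the module together, cleanup_watched prunes anything the other server already reflects. A minimal, movies-only sketch with invented identifiers, assuming generate_library_guids_dict from src/library.py behaves as the tests further down show:

from src.watched import cleanup_watched

movie = {
    "imdb": "tt0000001",  # made-up id
    "title": "Example Movie",
    "locations": ("Example Movie.mkv",),
    "status": {"completed": True, "time": 0},
}
list_1 = {"userA": {"Movies": [dict(movie)]}}
list_2 = {"userA": {"Movies": [dict(movie)]}}

cleanup_watched(list_1, list_2)
# == {}  The only movie matches on both sides, so the library and then the user are pruned away.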

test/ci1.env (new file, 96 lines)
@@ -0,0 +1,96 @@
# Global Settings
## Do not mark any shows/movies as played and instead just output to the log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"
## How often to run the script in seconds
SLEEP_DURATION = 10
## Log file where all output will be written to
LOG_FILE = "log.log"
## Mark file where all shows/movies that have been marked as played will be written to
MARK_FILE = "mark.log"
## Timeout for requests for jellyfin
REQUEST_TIMEOUT = 300
## Max threads for processing
MAX_THREADS = 2
## Generate guids
## Generating guids is a slow process, so this is a way to speed up the process
## by using the location only, useful when using the same files on multiple servers
GENERATE_GUIDS = "False"
## Generate locations
## Generating locations is a slow process, so this is a way to speed up the process
## by using the guid only, useful when using different files on multiple servers
GENERATE_LOCATIONS = "True"
## Map usernames between servers in the event that they are different, order does not matter
## Comma separated for multiple options
USER_MAPPING = {"JellyUser":"jellyplex_watched"}
## Map libraries between servers in the event that they are different, order does not matter
## Comma separated for multiple options
LIBRARY_MAPPING = { "Shows": "TV Shows" }
## Blacklisting/Whitelisting libraries, library types such as Movies/TV Shows, and users. Mappings apply, so if the mapping for the user or library exists then both will be excluded.
## Comma separated for multiple options
#BLACKLIST_LIBRARY = ""
#WHITELIST_LIBRARY = "Movies"
#BLACKLIST_LIBRARY_TYPE = "Series"
#WHITELIST_LIBRARY_TYPE = "Movies, movie"
#BLACKLIST_USERS = ""
WHITELIST_USERS = "jellyplex_watched"
# Plex
## Using a token is recommended as it is faster, connecting directly to the server instead of going through the plex servers
## URL of the plex server, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
PLEX_BASEURL = "https://localhost:32400"
## Plex token https://support.plex.tv/articles/204059436-finding-an-authentication-token-x-plex-token/
## Comma separated list for multiple servers
PLEX_TOKEN = "mVaCzSyd78uoWkCBzZ_Y"
## If not using a plex token then use the username and password of the server admin along with the servername
## Comma separated for multiple options
#PLEX_USERNAME = "PlexUser, PlexUser2"
#PLEX_PASSWORD = "SuperSecret, SuperSecret2"
#PLEX_SERVERNAME = "Plex Server1, Plex Server2"
## Skip hostname validation for ssl certificates.
## Set to True if running into ssl certificate errors
SSL_BYPASS = "True"
## Control the direction of syncing, e.g. SYNC_FROM_PLEX_TO_JELLYFIN set to true will push updates from plex
## to jellyfin. SYNC_FROM_PLEX_TO_PLEX set to true will sync updates between multiple plex servers
SYNC_FROM_PLEX_TO_JELLYFIN = "True"
SYNC_FROM_JELLYFIN_TO_PLEX = "True"
SYNC_FROM_PLEX_TO_PLEX = "True"
SYNC_FROM_JELLYFIN_TO_JELLYFIN = "True"
# Jellyfin
## Jellyfin server URL, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
JELLYFIN_BASEURL = "http://localhost:8096"
## Jellyfin API token, created manually by logging in to the jellyfin server admin dashboard and creating an API key
## Comma separated list for multiple servers
JELLYFIN_TOKEN = "d773c4db3ecc4b028fc0904d9694804c"

test/ci2.env (new file, 96 lines)
@@ -0,0 +1,96 @@
# Global Settings
## Do not mark any shows/movies as played and instead just output to the log if they would have been marked.
DRYRUN = "True"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"
## How often to run the script in seconds
SLEEP_DURATION = 10
## Log file where all output will be written to
LOG_FILE = "log.log"
## Mark file where all shows/movies that have been marked as played will be written to
MARK_FILE = "mark.log"
## Timeout for requests for jellyfin
REQUEST_TIMEOUT = 300
## Max threads for processing
MAX_THREADS = 2
## Generate guids
## Generating guids is a slow process, so this is a way to speed up the process
## by using the location only, useful when using the same files on multiple servers
GENERATE_GUIDS = "True"
## Generate locations
## Generating locations is a slow process, so this is a way to speed up the process
## by using the guid only, useful when using different files on multiple servers
GENERATE_LOCATIONS = "False"
## Map usernames between servers in the event that they are different, order does not matter
## Comma separated for multiple options
USER_MAPPING = {"JellyUser":"jellyplex_watched"}
## Map libraries between servers in the event that they are different, order does not matter
## Comma separated for multiple options
LIBRARY_MAPPING = { "Shows": "TV Shows" }
## Blacklisting/Whitelisting libraries, library types such as Movies/TV Shows, and users. Mappings apply, so if the mapping for the user or library exists then both will be excluded.
## Comma separated for multiple options
#BLACKLIST_LIBRARY = ""
#WHITELIST_LIBRARY = "Movies"
#BLACKLIST_LIBRARY_TYPE = "Series"
#WHITELIST_LIBRARY_TYPE = "Movies, movie"
#BLACKLIST_USERS = ""
WHITELIST_USERS = "jellyplex_watched"
# Plex
## Using a token is recommended as it is faster, connecting directly to the server instead of going through the plex servers
## URL of the plex server, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
PLEX_BASEURL = "https://localhost:32400"
## Plex token https://support.plex.tv/articles/204059436-finding-an-authentication-token-x-plex-token/
## Comma separated list for multiple servers
PLEX_TOKEN = "mVaCzSyd78uoWkCBzZ_Y"
## If not using a plex token then use the username and password of the server admin along with the servername
## Comma separated for multiple options
#PLEX_USERNAME = "PlexUser, PlexUser2"
#PLEX_PASSWORD = "SuperSecret, SuperSecret2"
#PLEX_SERVERNAME = "Plex Server1, Plex Server2"
## Skip hostname validation for ssl certificates.
## Set to True if running into ssl certificate errors
SSL_BYPASS = "True"
## Control the direction of syncing, e.g. SYNC_FROM_PLEX_TO_JELLYFIN set to true will push updates from plex
## to jellyfin. SYNC_FROM_PLEX_TO_PLEX set to true will sync updates between multiple plex servers
SYNC_FROM_PLEX_TO_JELLYFIN = "True"
SYNC_FROM_JELLYFIN_TO_PLEX = "True"
SYNC_FROM_PLEX_TO_PLEX = "True"
SYNC_FROM_JELLYFIN_TO_JELLYFIN = "True"
# Jellyfin
## Jellyfin server URL, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
JELLYFIN_BASEURL = "http://localhost:8096"
## Jellyfin API token, created manually by logging in to the jellyfin server admin dashboard and creating an API key
## Comma separated list for multiple servers
JELLYFIN_TOKEN = "d773c4db3ecc4b028fc0904d9694804c"

test/ci3.env (new file, 96 lines)
@@ -0,0 +1,96 @@
# Global Settings
## Do not mark any shows/movies as played and instead just output to the log if they would have been marked.
DRYRUN = "False"
## Additional logging information
DEBUG = "True"
## Debugging level, "info" is default, "debug" is more verbose
DEBUG_LEVEL = "debug"
## If set to true then the script will only run once and then exit
RUN_ONLY_ONCE = "True"
## How often to run the script in seconds
SLEEP_DURATION = 10
## Log file where all output will be written to
LOG_FILE = "log.log"
## Mark file where all shows/movies that have been marked as played will be written to
MARK_FILE = "mark.log"
## Timeout for requests for jellyfin
REQUEST_TIMEOUT = 300
## Max threads for processing
MAX_THREADS = 2
## Generate guids
## Generating guids is a slow process, so this is a way to speed up the process
## by using the location only, useful when using the same files on multiple servers
GENERATE_GUIDS = "True"
## Generate locations
## Generating locations is a slow process, so this is a way to speed up the process
## by using the guid only, useful when using different files on multiple servers
GENERATE_LOCATIONS = "True"
## Map usernames between servers in the event that they are different, order does not matter
## Comma separated for multiple options
USER_MAPPING = {"JellyUser":"jellyplex_watched"}
## Map libraries between servers in the event that they are different, order does not matter
## Comma separated for multiple options
LIBRARY_MAPPING = { "Shows": "TV Shows" }
## Blacklisting/Whitelisting libraries, library types such as Movies/TV Shows, and users. Mappings apply, so if the mapping for the user or library exists then both will be excluded.
## Comma separated for multiple options
#BLACKLIST_LIBRARY = ""
#WHITELIST_LIBRARY = "Movies"
#BLACKLIST_LIBRARY_TYPE = "Series"
#WHITELIST_LIBRARY_TYPE = "Movies, movie"
#BLACKLIST_USERS = ""
WHITELIST_USERS = "jellyplex_watched"
# Plex
## Using a token is recommended as it is faster, connecting directly to the server instead of going through the plex servers
## URL of the plex server, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
PLEX_BASEURL = "https://localhost:32400"
## Plex token https://support.plex.tv/articles/204059436-finding-an-authentication-token-x-plex-token/
## Comma separated list for multiple servers
PLEX_TOKEN = "mVaCzSyd78uoWkCBzZ_Y"
## If not using a plex token then use the username and password of the server admin along with the servername
## Comma separated for multiple options
#PLEX_USERNAME = "PlexUser, PlexUser2"
#PLEX_PASSWORD = "SuperSecret, SuperSecret2"
#PLEX_SERVERNAME = "Plex Server1, Plex Server2"
## Skip hostname validation for ssl certificates.
## Set to True if running into ssl certificate errors
SSL_BYPASS = "True"
## Control the direction of syncing, e.g. SYNC_FROM_PLEX_TO_JELLYFIN set to true will push updates from plex
## to jellyfin. SYNC_FROM_PLEX_TO_PLEX set to true will sync updates between multiple plex servers
SYNC_FROM_PLEX_TO_JELLYFIN = "True"
SYNC_FROM_JELLYFIN_TO_PLEX = "True"
SYNC_FROM_PLEX_TO_PLEX = "True"
SYNC_FROM_JELLYFIN_TO_JELLYFIN = "True"
# Jellyfin
## Jellyfin server URL, use hostname or IP address if the hostname is not resolving correctly
## Comma separated list for multiple servers
JELLYFIN_BASEURL = "http://localhost:8096"
## Jellyfin API token, created manually by logging in to the jellyfin server admin dashboard and creating an API key
## Comma separated list for multiple servers
JELLYFIN_TOKEN = "d773c4db3ecc4b028fc0904d9694804c"

test/requirements.txt (new file, 1 line)
@@ -0,0 +1 @@
pytest==7.3.0

test/test_black_white.py (new file, 78 lines)
@@ -0,0 +1,78 @@
import sys
import os
# Getting the name of the directory
# where this file is present.
current = os.path.dirname(os.path.realpath(__file__))
# Getting the parent directory name
# where the current directory is present.
parent = os.path.dirname(current)
# adding the parent directory to
# the sys.path.
sys.path.append(parent)
from src.black_white import setup_black_white_lists
def test_setup_black_white_lists():
# Simple
blacklist_library = "library1, library2"
whitelist_library = "library1, library2"
blacklist_library_type = "library_type1, library_type2"
whitelist_library_type = "library_type1, library_type2"
blacklist_users = "user1, user2"
whitelist_users = "user1, user2"
(
results_blacklist_library,
return_whitelist_library,
return_blacklist_library_type,
return_whitelist_library_type,
return_blacklist_users,
return_whitelist_users,
) = setup_black_white_lists(
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
)
assert results_blacklist_library == ["library1", "library2"]
assert return_whitelist_library == ["library1", "library2"]
assert return_blacklist_library_type == ["library_type1", "library_type2"]
assert return_whitelist_library_type == ["library_type1", "library_type2"]
assert return_blacklist_users == ["user1", "user2"]
assert return_whitelist_users == ["user1", "user2"]
# Library Mapping and user mapping
library_mapping = {"library1": "library3"}
user_mapping = {"user1": "user3"}
(
results_blacklist_library,
return_whitelist_library,
return_blacklist_library_type,
return_whitelist_library_type,
return_blacklist_users,
return_whitelist_users,
) = setup_black_white_lists(
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
library_mapping,
user_mapping,
)
assert results_blacklist_library == ["library1", "library2", "library3"]
assert return_whitelist_library == ["library1", "library2", "library3"]
assert return_blacklist_library_type == ["library_type1", "library_type2"]
assert return_whitelist_library_type == ["library_type1", "library_type2"]
assert return_blacklist_users == ["user1", "user2", "user3"]
assert return_whitelist_users == ["user1", "user2", "user3"]

test/test_library.py (new file, 327 lines)
@@ -0,0 +1,327 @@
import sys
import os
# Getting the name of the directory
# where this file is present.
current = os.path.dirname(os.path.realpath(__file__))
# Getting the parent directory name
# where the current directory is present.
parent = os.path.dirname(current)
# adding the parent directory to
# the sys.path.
sys.path.append(parent)
from src.functions import (
search_mapping,
)
from src.library import (
check_skip_logic,
check_blacklist_logic,
check_whitelist_logic,
show_title_dict,
episode_title_dict,
movies_title_dict,
generate_library_guids_dict,
)
blacklist_library = ["TV Shows"]
whitelist_library = ["Movies"]
blacklist_library_type = ["episodes"]
whitelist_library_type = ["movies"]
library_mapping = {"Shows": "TV Shows", "Movie": "Movies"}
show_list = {
frozenset(
{
("locations", ("The Last of Us",)),
("tmdb", "100088"),
("imdb", "tt3581920"),
("tvdb", "392256"),
("title", "The Last of Us"),
}
): {
"Season 1": [
{
"imdb": "tt11957006",
"tmdb": "2181581",
"tvdb": "8444132",
"locations": (
(
"The Last of Us - S01E01 - When You're Lost in the Darkness WEBDL-1080p.mkv",
)
),
"status": {"completed": True, "time": 0},
}
]
}
}
movie_list = [
{
"title": "Coco",
"imdb": "tt2380307",
"tmdb": "354912",
"locations": [("Coco (2017) Remux-2160p.mkv", "Coco (2017) Remux-1080p.mkv")],
"status": {"completed": True, "time": 0},
}
]
show_titles = {
"imdb": ["tt3581920"],
"locations": [("The Last of Us",)],
"tmdb": ["100088"],
"tvdb": ["392256"],
}
episode_titles = {
"imdb": ["tt11957006"],
"locations": [
("The Last of Us - S01E01 - When You're Lost in the Darkness WEBDL-1080p.mkv",)
],
"tmdb": ["2181581"],
"tvdb": ["8444132"],
"completed": [True],
"time": [0],
"season": ["Season 1"],
"show": [
{
"imdb": "tt3581920",
"locations": ("The Last of Us",),
"title": "The Last of Us",
"tmdb": "100088",
"tvdb": "392256",
}
],
}
movie_titles = {
"imdb": ["tt2380307"],
"locations": [
[
(
"Coco (2017) Remux-2160p.mkv",
"Coco (2017) Remux-1080p.mkv",
)
]
],
"title": ["coco"],
"tmdb": ["354912"],
"completed": [True],
"time": [0],
}
def test_check_skip_logic():
# Fails
library_title = "Test"
library_type = "movies"
skip_reason = check_skip_logic(
library_title,
library_type,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
)
assert skip_reason == "Test is not in whitelist_library"
library_title = "Shows"
library_type = "episodes"
skip_reason = check_skip_logic(
library_title,
library_type,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
)
assert (
skip_reason
== "episodes is in blacklist_library_type and TV Shows is in blacklist_library and "
+ "episodes is not in whitelist_library_type and Shows is not in whitelist_library"
)
# Passes
library_title = "Movie"
library_type = "movies"
skip_reason = check_skip_logic(
library_title,
library_type,
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
library_mapping,
)
assert skip_reason is None
def test_check_blacklist_logic():
# Fails
library_title = "Shows"
library_type = "episodes"
library_other = search_mapping(library_mapping, library_title)
skip_reason = check_blacklist_logic(
library_title,
library_type,
blacklist_library,
blacklist_library_type,
library_other,
)
assert (
skip_reason
== "episodes is in blacklist_library_type and TV Shows is in blacklist_library"
)
library_title = "TV Shows"
library_type = "episodes"
library_other = search_mapping(library_mapping, library_title)
skip_reason = check_blacklist_logic(
library_title,
library_type,
blacklist_library,
blacklist_library_type,
library_other,
)
assert (
skip_reason
== "episodes is in blacklist_library_type and TV Shows is in blacklist_library"
)
# Passes
library_title = "Movie"
library_type = "movies"
library_other = search_mapping(library_mapping, library_title)
skip_reason = check_blacklist_logic(
library_title,
library_type,
blacklist_library,
blacklist_library_type,
library_other,
)
assert skip_reason is None
library_title = "Movies"
library_type = "movies"
library_other = search_mapping(library_mapping, library_title)
skip_reason = check_blacklist_logic(
library_title,
library_type,
blacklist_library,
blacklist_library_type,
library_other,
)
assert skip_reason is None
def test_check_whitelist_logic():
# Fails
library_title = "Shows"
library_type = "episodes"
library_other = search_mapping(library_mapping, library_title)
skip_reason = check_whitelist_logic(
library_title,
library_type,
whitelist_library,
whitelist_library_type,
library_other,
)
assert (
skip_reason
== "episodes is not in whitelist_library_type and Shows is not in whitelist_library"
)
library_title = "TV Shows"
library_type = "episodes"
library_other = search_mapping(library_mapping, library_title)
skip_reason = check_whitelist_logic(
library_title,
library_type,
whitelist_library,
whitelist_library_type,
library_other,
)
assert (
skip_reason
== "episodes is not in whitelist_library_type and TV Shows is not in whitelist_library"
)
# Passes
library_title = "Movie"
library_type = "movies"
library_other = search_mapping(library_mapping, library_title)
skip_reason = check_whitelist_logic(
library_title,
library_type,
whitelist_library,
whitelist_library_type,
library_other,
)
assert skip_reason is None
library_title = "Movies"
library_type = "movies"
library_other = search_mapping(library_mapping, library_title)
skip_reason = check_whitelist_logic(
library_title,
library_type,
whitelist_library,
whitelist_library_type,
library_other,
)
assert skip_reason is None
def test_show_title_dict():
show_titles_dict = show_title_dict(show_list)
assert show_titles_dict == show_titles
def test_episode_title_dict():
episode_titles_dict = episode_title_dict(show_list)
assert episode_titles_dict == episode_titles
def test_movies_title_dict():
movies_titles_dict = movies_title_dict(movie_list)
assert movies_titles_dict == movie_titles
def test_generate_library_guids_dict():
# Test with shows
(
show_titles_dict,
episode_titles_dict,
movies_titles_dict,
) = generate_library_guids_dict(show_list)
assert show_titles_dict == show_titles
assert episode_titles_dict == episode_titles
assert movies_titles_dict == {}
# Test with movies
(
show_titles_dict,
episode_titles_dict,
movies_titles_dict,
) = generate_library_guids_dict(movie_list)
assert show_titles_dict == {}
assert episode_titles_dict == {}
assert movies_titles_dict == movie_titles

test/test_main.py (new file, 78 lines)
@@ -0,0 +1,78 @@
import sys
import os
# Getting the name of the directory
# where this file is present.
current = os.path.dirname(os.path.realpath(__file__))
# Getting the parent directory name
# where the current directory is present.
parent = os.path.dirname(current)
# adding the parent directory to
# the sys.path.
sys.path.append(parent)
from src.black_white import setup_black_white_lists
def test_setup_black_white_lists():
# Simple
blacklist_library = "library1, library2"
whitelist_library = "library1, library2"
blacklist_library_type = "library_type1, library_type2"
whitelist_library_type = "library_type1, library_type2"
blacklist_users = "user1, user2"
whitelist_users = "user1, user2"
(
results_blacklist_library,
return_whitelist_library,
return_blacklist_library_type,
return_whitelist_library_type,
return_blacklist_users,
return_whitelist_users,
) = setup_black_white_lists(
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
)
assert results_blacklist_library == ["library1", "library2"]
assert return_whitelist_library == ["library1", "library2"]
assert return_blacklist_library_type == ["library_type1", "library_type2"]
assert return_whitelist_library_type == ["library_type1", "library_type2"]
assert return_blacklist_users == ["user1", "user2"]
assert return_whitelist_users == ["user1", "user2"]
# Library Mapping and user mapping
library_mapping = {"library1": "library3"}
user_mapping = {"user1": "user3"}
(
results_blacklist_library,
return_whitelist_library,
return_blacklist_library_type,
return_whitelist_library_type,
return_blacklist_users,
return_whitelist_users,
) = setup_black_white_lists(
blacklist_library,
whitelist_library,
blacklist_library_type,
whitelist_library_type,
blacklist_users,
whitelist_users,
library_mapping,
user_mapping,
)
assert results_blacklist_library == ["library1", "library2", "library3"]
assert return_whitelist_library == ["library1", "library2", "library3"]
assert return_blacklist_library_type == ["library_type1", "library_type2"]
assert return_whitelist_library_type == ["library_type1", "library_type2"]
assert return_blacklist_users == ["user1", "user2", "user3"]
assert return_whitelist_users == ["user1", "user2", "user3"]

test/test_users.py (new file, 39 lines)
@@ -0,0 +1,39 @@
import sys
import os
# Getting the name of the directory
# where this file is present.
current = os.path.dirname(os.path.realpath(__file__))
# Getting the parent directory name
# where the current directory is present.
parent = os.path.dirname(current)
# adding the parent directory to
# the sys.path.
sys.path.append(parent)
from src.users import (
combine_user_lists,
filter_user_lists,
)
def test_combine_user_lists():
server_1_users = ["test", "test3", "luigi311"]
server_2_users = ["luigi311", "test2", "test3"]
user_mapping = {"test2": "test"}
combined = combine_user_lists(server_1_users, server_2_users, user_mapping)
assert combined == {"luigi311": "luigi311", "test": "test2", "test3": "test3"}
def test_filter_user_lists():
users = {"luigi311": "luigi311", "test": "test2", "test3": "test3"}
blacklist_users = ["test3"]
whitelist_users = ["test", "luigi311"]
filtered = filter_user_lists(users, blacklist_users, whitelist_users)
assert filtered == {"test": "test2", "luigi311": "luigi311"}

test/test_watched.py (new file, 684 lines)
@@ -0,0 +1,684 @@
import sys
import os
# Getting the name of the directory
# where this file is present.
current = os.path.dirname(os.path.realpath(__file__))
# Getting the parent directory name
# where the current directory is present.
parent = os.path.dirname(current)
# adding the parent directory to
# the sys.path.
sys.path.append(parent)
from src.watched import cleanup_watched, combine_watched_dicts
tv_shows_watched_list_1 = {
frozenset(
{
("locations", ("Doctor Who (2005) {tvdb-78804} {imdb-tt0436992}",)),
("imdb", "tt0436992"),
("tmdb", "57243"),
("tvdb", "78804"),
("title", "Doctor Who (2005)"),
}
): {
1: [
{
"imdb": "tt0563001",
"tmdb": "968589",
"tvdb": "295296",
"title": "The Unquiet Dead",
"locations": ("S01E03.mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt0562985",
"tmdb": "968590",
"tvdb": "295297",
"title": "Aliens of London (1)",
"locations": ("S01E04.mkv",),
"status": {"completed": False, "time": 240000},
},
{
"imdb": "tt0563003",
"tmdb": "968592",
"tvdb": "295298",
"title": "World War Three (2)",
"locations": ("S01E05.mkv",),
"status": {"completed": True, "time": 0},
},
]
},
frozenset(
{
("title", "Monarch: Legacy of Monsters"),
("imdb", "tt17220216"),
("tvdb", "422598"),
("tmdb", "202411"),
(
"locations",
("Monarch - Legacy of Monsters {tvdb-422598} {imdb-tt17220216}",),
),
}
): {
1: [
{
"imdb": "tt21255044",
"tmdb": "4661246",
"tvdb": "10009418",
"title": "Secrets and Lies",
"locations": ("S01E03.mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt21255050",
"tmdb": "4712059",
"tvdb": "10009419",
"title": "Parallels and Interiors",
"locations": ("S01E04.mkv",),
"status": {"completed": False, "time": 240000},
},
{
"imdb": "tt23787572",
"tmdb": "4712061",
"tvdb": "10009420",
"title": "The Way Out",
"locations": ("S01E05.mkv",),
"status": {"completed": True, "time": 0},
},
]
},
frozenset(
{
("tmdb", "125928"),
("imdb", "tt14681924"),
("tvdb", "403172"),
(
"locations",
("My Adventures with Superman {tvdb-403172} {imdb-tt14681924}",),
),
("title", "My Adventures with Superman"),
}
): {
1: [
{
"imdb": "tt15699926",
"tmdb": "3070048",
"tvdb": "8438181",
"title": "Adventures of a Normal Man (1)",
"locations": ("S01E01.mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt20413322",
"tmdb": "4568681",
"tvdb": "9829910",
"title": "Adventures of a Normal Man (2)",
"locations": ("S01E02.mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt20413328",
"tmdb": "4497012",
"tvdb": "9870382",
"title": "My Interview with Superman",
"locations": ("S01E03.mkv",),
"status": {"completed": True, "time": 0},
},
]
},
}
tv_shows_watched_list_2 = {
frozenset(
{
("locations", ("Doctor Who (2005) {tvdb-78804} {imdb-tt0436992}",)),
("imdb", "tt0436992"),
("tmdb", "57243"),
("title", "Doctor Who"),
("tvdb", "78804"),
("tvrage", "3332"),
}
): {
1: [
{
"tvdb": "295294",
"imdb": "tt0562992",
"title": "Rose",
"locations": ("S01E01.mkv",),
"status": {"completed": True, "time": 0},
},
{
"tvdb": "295295",
"imdb": "tt0562997",
"title": "The End of the World",
"locations": ("S01E02.mkv",),
"status": {"completed": False, "time": 300670},
},
{
"tvdb": "295298",
"imdb": "tt0563003",
"title": "World War Three (2)",
"locations": ("S01E05.mkv",),
"status": {"completed": True, "time": 0},
},
]
},
frozenset(
{
("title", "Monarch: Legacy of Monsters"),
("imdb", "tt17220216"),
("tvdb", "422598"),
("tmdb", "202411"),
(
"locations",
("Monarch - Legacy of Monsters {tvdb-422598} {imdb-tt17220216}",),
),
}
): {
1: [
{
"tvdb": "9959300",
"imdb": "tt20412166",
"title": "Aftermath",
"locations": ("S01E01.mkv",),
"status": {"completed": True, "time": 0},
},
{
"tvdb": "10009417",
"imdb": "tt22866594",
"title": "Departure",
"locations": ("S01E02.mkv",),
"status": {"completed": False, "time": 300741},
},
{
"tvdb": "10009420",
"imdb": "tt23787572",
"title": "The Way Out",
"locations": ("S01E05.mkv",),
"status": {"completed": True, "time": 0},
},
]
},
frozenset(
{
("tmdb", "125928"),
("imdb", "tt14681924"),
("tvdb", "403172"),
(
"locations",
("My Adventures with Superman {tvdb-403172} {imdb-tt14681924}",),
),
("title", "My Adventures with Superman"),
}
): {
1: [
{
"tvdb": "8438181",
"imdb": "tt15699926",
"title": "Adventures of a Normal Man (1)",
"locations": ("S01E01.mkv",),
"status": {"completed": True, "time": 0},
},
{
"tvdb": "9829910",
"imdb": "tt20413322",
"title": "Adventures of a Normal Man (2)",
"locations": ("S01E02.mkv",),
"status": {"completed": True, "time": 0},
},
{
"tvdb": "9870382",
"imdb": "tt20413328",
"title": "My Interview with Superman",
"locations": ("S01E03.mkv",),
"status": {"completed": True, "time": 0},
},
]
},
}
expected_tv_show_watched_list_1 = {
frozenset(
{
("locations", ("Doctor Who (2005) {tvdb-78804} {imdb-tt0436992}",)),
("imdb", "tt0436992"),
("tmdb", "57243"),
("tvdb", "78804"),
("title", "Doctor Who (2005)"),
}
): {
1: [
{
"imdb": "tt0563001",
"tmdb": "968589",
"tvdb": "295296",
"title": "The Unquiet Dead",
"locations": ("S01E03.mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt0562985",
"tmdb": "968590",
"tvdb": "295297",
"title": "Aliens of London (1)",
"locations": ("S01E04.mkv",),
"status": {"completed": False, "time": 240000},
},
]
},
frozenset(
{
("title", "Monarch: Legacy of Monsters"),
("imdb", "tt17220216"),
("tvdb", "422598"),
("tmdb", "202411"),
(
"locations",
("Monarch - Legacy of Monsters {tvdb-422598} {imdb-tt17220216}",),
),
}
): {
1: [
{
"imdb": "tt21255044",
"tmdb": "4661246",
"tvdb": "10009418",
"title": "Secrets and Lies",
"locations": ("S01E03.mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt21255050",
"tmdb": "4712059",
"tvdb": "10009419",
"title": "Parallels and Interiors",
"locations": ("S01E04.mkv",),
"status": {"completed": False, "time": 240000},
},
]
},
}
expected_tv_show_watched_list_2 = {
frozenset(
{
("locations", ("Doctor Who (2005) {tvdb-78804} {imdb-tt0436992}",)),
("imdb", "tt0436992"),
("tmdb", "57243"),
("title", "Doctor Who"),
("tvdb", "78804"),
("tvrage", "3332"),
}
): {
1: [
{
"tvdb": "295294",
"imdb": "tt0562992",
"title": "Rose",
"locations": ("S01E01.mkv",),
"status": {"completed": True, "time": 0},
},
{
"tvdb": "295295",
"imdb": "tt0562997",
"title": "The End of the World",
"locations": ("S01E02.mkv",),
"status": {"completed": False, "time": 300670},
},
]
},
frozenset(
{
("title", "Monarch: Legacy of Monsters"),
("imdb", "tt17220216"),
("tvdb", "422598"),
("tmdb", "202411"),
(
"locations",
("Monarch - Legacy of Monsters {tvdb-422598} {imdb-tt17220216}",),
),
}
): {
1: [
{
"tvdb": "9959300",
"imdb": "tt20412166",
"title": "Aftermath",
"locations": ("S01E01.mkv",),
"status": {"completed": True, "time": 0},
},
{
"tvdb": "10009417",
"imdb": "tt22866594",
"title": "Departure",
"locations": ("S01E02.mkv",),
"status": {"completed": False, "time": 300741},
},
]
},
}
movies_watched_list_1 = [
{
"imdb": "tt1254207",
"tmdb": "10378",
"tvdb": "12352",
"title": "Big Buck Bunny",
"locations": ("Big Buck Bunny.mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt16431870",
"tmdb": "1029575",
"tvdb": "351194",
"title": "The Family Plan",
"locations": ("The Family Plan (2023).mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt5537002",
"tmdb": "466420",
"tvdb": "135852",
"title": "Killers of the Flower Moon",
"locations": ("Killers of the Flower Moon (2023).mkv",),
"status": {"completed": False, "time": 240000},
},
]
movies_watched_list_2 = [
{
"imdb": "tt16431870",
"tmdb": "1029575",
"title": "The Family Plan",
"locations": ("The Family Plan (2023).mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt4589218",
"tmdb": "507089",
"title": "Five Nights at Freddy's",
"locations": ("Five Nights at Freddy's (2023).mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt10545296",
"tmdb": "695721",
"tmdbcollection": "131635",
"title": "The Hunger Games: The Ballad of Songbirds & Snakes",
"locations": ("The Hunger Games The Ballad of Songbirds & Snakes (2023).mkv",),
"status": {"completed": False, "time": 301215},
},
]
expected_movie_watched_list_1 = [
{
"imdb": "tt1254207",
"tmdb": "10378",
"tvdb": "12352",
"title": "Big Buck Bunny",
"locations": ("Big Buck Bunny.mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt5537002",
"tmdb": "466420",
"tvdb": "135852",
"title": "Killers of the Flower Moon",
"locations": ("Killers of the Flower Moon (2023).mkv",),
"status": {"completed": False, "time": 240000},
},
]
expected_movie_watched_list_2 = [
{
"imdb": "tt4589218",
"tmdb": "507089",
"title": "Five Nights at Freddy's",
"locations": ("Five Nights at Freddy's (2023).mkv",),
"status": {"completed": True, "time": 0},
},
{
"imdb": "tt10545296",
"tmdb": "695721",
"tmdbcollection": "131635",
"title": "The Hunger Games: The Ballad of Songbirds & Snakes",
"locations": ("The Hunger Games The Ballad of Songbirds & Snakes (2023).mkv",),
"status": {"completed": False, "time": 301215},
},
]
# Test to see if objects get deleted all the way up to the root.
tv_shows_2_watched_list_1 = {
frozenset(
{
("tvdb", "75710"),
("title", "Criminal Minds"),
("imdb", "tt0452046"),
("locations", ("Criminal Minds",)),
("tmdb", "4057"),
}
): {
"Season 1": [
{
"imdb": "tt0550489",
"tmdb": "282843",
"tvdb": "176357",
"title": "Extreme Aggressor",
"locations": (
"Criminal Minds S01E01 Extreme Aggressor WEBDL-720p.mkv",
),
"status": {"completed": True, "time": 0},
},
]
}
}
def test_simple_cleanup_watched():
user_watched_list_1 = {
"user1": {
"TV Shows": tv_shows_watched_list_1,
"Movies": movies_watched_list_1,
"Other Shows": tv_shows_2_watched_list_1,
},
}
user_watched_list_2 = {
"user1": {
"TV Shows": tv_shows_watched_list_2,
"Movies": movies_watched_list_2,
"Other Shows": tv_shows_2_watched_list_1,
}
}
expected_watched_list_1 = {
"user1": {
"TV Shows": expected_tv_show_watched_list_1,
"Movies": expected_movie_watched_list_1,
}
}
expected_watched_list_2 = {
"user1": {
"TV Shows": expected_tv_show_watched_list_2,
"Movies": expected_movie_watched_list_2,
}
}
return_watched_list_1 = cleanup_watched(user_watched_list_1, user_watched_list_2)
return_watched_list_2 = cleanup_watched(user_watched_list_2, user_watched_list_1)
assert return_watched_list_1 == expected_watched_list_1
assert return_watched_list_2 == expected_watched_list_2
def test_mapping_cleanup_watched():
user_watched_list_1 = {
"user1": {
"TV Shows": tv_shows_watched_list_1,
"Movies": movies_watched_list_1,
"Other Shows": tv_shows_2_watched_list_1,
},
}
user_watched_list_2 = {
"user2": {
"Shows": tv_shows_watched_list_2,
"Movies": movies_watched_list_2,
"Other Shows": tv_shows_2_watched_list_1,
}
}
expected_watched_list_1 = {
"user1": {
"TV Shows": expected_tv_show_watched_list_1,
"Movies": expected_movie_watched_list_1,
}
}
expected_watched_list_2 = {
"user2": {
"Shows": expected_tv_show_watched_list_2,
"Movies": expected_movie_watched_list_2,
}
}
user_mapping = {"user1": "user2"}
library_mapping = {"TV Shows": "Shows"}
return_watched_list_1 = cleanup_watched(
user_watched_list_1,
user_watched_list_2,
user_mapping=user_mapping,
library_mapping=library_mapping,
)
return_watched_list_2 = cleanup_watched(
user_watched_list_2,
user_watched_list_1,
user_mapping=user_mapping,
library_mapping=library_mapping,
)
assert return_watched_list_1 == expected_watched_list_1
assert return_watched_list_2 == expected_watched_list_2
def test_combine_watched_dicts():
input_watched = [
{
"test3": {
"Anime Movies": [
{
"title": "Ponyo",
"tmdb": "12429",
"imdb": "tt0876563",
"locations": ("Ponyo (2008) Bluray-1080p.mkv",),
"status": {"completed": True, "time": 0},
},
{
"title": "Spirited Away",
"tmdb": "129",
"imdb": "tt0245429",
"locations": ("Spirited Away (2001) Bluray-1080p.mkv",),
"status": {"completed": True, "time": 0},
},
{
"title": "Castle in the Sky",
"tmdb": "10515",
"imdb": "tt0092067",
"locations": ("Castle in the Sky (1986) Bluray-1080p.mkv",),
"status": {"completed": True, "time": 0},
},
]
}
},
{"test3": {"Anime Shows": {}}},
{"test3": {"Cartoon Shows": {}}},
{
"test3": {
"Shows": {
frozenset(
{
("tmdb", "64464"),
("tvdb", "301824"),
("tvrage", "45210"),
("title", "11.22.63"),
("locations", ("11.22.63",)),
("imdb", "tt2879552"),
}
): {
"Season 1": [
{
"imdb": "tt4460418",
"title": "The Rabbit Hole",
"locations": (
"11.22.63 S01E01 The Rabbit Hole Bluray-1080p.mkv",
),
"status": {"completed": True, "time": 0},
}
]
}
}
}
},
{"test3": {"Subbed Anime": {}}},
]
expected = {
"test3": {
"Anime Movies": [
{
"title": "Ponyo",
"tmdb": "12429",
"imdb": "tt0876563",
"locations": ("Ponyo (2008) Bluray-1080p.mkv",),
"status": {"completed": True, "time": 0},
},
{
"title": "Spirited Away",
"tmdb": "129",
"imdb": "tt0245429",
"locations": ("Spirited Away (2001) Bluray-1080p.mkv",),
"status": {"completed": True, "time": 0},
},
{
"title": "Castle in the Sky",
"tmdb": "10515",
"imdb": "tt0092067",
"locations": ("Castle in the Sky (1986) Bluray-1080p.mkv",),
"status": {"completed": True, "time": 0},
},
],
"Anime Shows": {},
"Cartoon Shows": {},
"Shows": {
frozenset(
{
("tmdb", "64464"),
("tvdb", "301824"),
("tvrage", "45210"),
("title", "11.22.63"),
("locations", ("11.22.63",)),
("imdb", "tt2879552"),
}
): {
"Season 1": [
{
"imdb": "tt4460418",
"title": "The Rabbit Hole",
"locations": (
"11.22.63 S01E01 The Rabbit Hole Bluray-1080p.mkv",
),
"status": {"completed": True, "time": 0},
}
]
}
},
"Subbed Anime": {},
}
}
assert combine_watched_dicts(input_watched) == expected


@@ -0,0 +1,76 @@
# Check the mark.log file that is generated by the CI to make sure it contains the expected values
import os
def read_marklog():
marklog = os.path.join(os.getcwd(), "mark.log")
with open(marklog, "r") as f:
lines = f.readlines()
return lines
def check_marklog(lines, expected_values):
try:
# Check to make sure the marklog contains all the expected values and nothing else
found_values = []
for line in lines:
# Remove the newline character
line = line.strip()
if line not in expected_values:
raise Exception("Line not found in marklog: " + line)
found_values.append(line)
# Check to make sure the marklog contains the same number of values as the expected values
if len(found_values) != len(expected_values):
raise Exception(
"Marklog did not contain the same number of values as the expected values, found "
+ str(len(found_values))
+ " values, expected "
+ str(len(expected_values))
+ " values"
)
# Check that the two lists contain the same values
if sorted(found_values) != sorted(expected_values):
raise Exception(
"Marklog did not contain the same values as the expected values, found:\n"
+ "\n".join(sorted(found_values))
+ "\n\nExpected:\n"
+ "\n".join(sorted(expected_values))
)
return True
except Exception as e:
print(e)
return False
def main():
expected_values = [
"jellyplex_watched/Movies/Five Nights at Freddy's",
"jellyplex_watched/Movies/The Hunger Games: The Ballad of Songbirds & Snakes/301215",
"jellyplex_watched/TV Shows/Doctor Who (2005)/Rose",
"jellyplex_watched/TV Shows/Doctor Who (2005)/The End of the World/300670",
"jellyplex_watched/TV Shows/Monarch: Legacy of Monsters/Aftermath",
"jellyplex_watched/TV Shows/Monarch: Legacy of Monsters/Departure/300741",
"JellyUser/Movies/Big Buck Bunny",
"JellyUser/Shows/Doctor Who/The Unquiet Dead",
"JellyUser/Shows/Monarch: Legacy of Monsters/Secrets and Lies",
]
# Triple the expected values because the CI runs three times
expected_values = expected_values * 3
lines = read_marklog()
if not check_marklog(lines, expected_values):
print("Failed to validate marklog")
exit(1)
print("Successfully validated marklog")
exit(0)
if __name__ == "__main__":
main()