I want to run this on a large TV and Movie collection which currently does not have ANY subtitles (except the files which already have them embedded), and doing so will exceed my daily download limit considerably (even as a VIP member). I must therefore download the subtitles over a period of a few days.
However, calling the script daily comes with a requirement:
If you call this script repeatedly on the same folders or files then you MUST SET --def maxAgeDays to 30 days or less and call it no more than once per day.
The problem I have is that all my shows have the file creation date set to the show's air date, so if I set --def maxAgeDays to 30 days or less then it won't download any subtitles for my older shows, e.g. Star Trek.
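For reference, a maxAgeDays-restricted run would look something like this (using the 30-day value from the rule above):
Shell:
filebot -script fn:suball /path/to/media --def maxAgeDays=30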
If I have understood this thread correctly then this command might be the answer I'm looking for, as the script would scan the /path/to/media folder and only download an EXACT match for MISSING subtitles - skipping embedded and existing subtitles (both subtitles downloaded by previous runs of the script and any new media which might already come with existing srt files)
Shell:
filebot -script fn:suball /path/to/media --action duplicate --conflict skip
If that is the case, would you still need to add a --def maxAgeDays value? As it would only ever download subtitles that were missing - if the script can't find an exact match, or the subtitle is embedded / already exists, then nothing would be downloaded?
My thinking is:
The script COULD run daily (without setting --def maxAgeDays) and, over a period of a few days, download all the subtitles for all the shows/films - skipping subtitles I already downloaded the day before.
Then, once that has been done, the daily cron run of the script would simply find any subtitles for newly downloaded content which were not available on release (download) day, and if no new subtitles were found or nothing was missing the script would just exit without downloading anything.
So, long post short - would setting the following 2 cron jobs to run once a day achieve my goal of:
1st Downloading all missing subtitles for my TV and Movie collection over a period of a few days and then
2nd Checking once a day for any missing subtitles (i.e. where new content was added recently and no subtitles were available on release/download day)
3rd Not getting banned!
Shell:
filebot -script fn:suball /mnt/user/TV --action duplicate --conflict skip
Shell:
filebot -script fn:suball /mnt/user/Films --action duplicate --conflict skip
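In crontab form that would look something like this (the run times are just placeholders, staggered so the two jobs don't overlap):
Shell:
# run suball once a day against each library
0 3 * * * filebot -script fn:suball /mnt/user/TV --action duplicate --conflict skip
0 4 * * * filebot -script fn:suball /mnt/user/Films --action duplicate --conflict skip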
Assuming for a moment the above cron jobs are safe to add, I would also like exact match subtitles to be downloaded at rename time if they are available, and if they are not available - download nothing at all and wait until the daily cron finds them in x days time.
I use uTorrent to download, but because it runs as nobody it sets some weird file permissions, so I call filebot via a custom postprocess.sh script which basically runs chmod on uTorrent's completed folder and then calls amc with $1 $2 $3 etc.
Shell:
# fix permissions and ownership on uTorrent's completed folder so everything downstream can read the files
sudo -su root chmod -R 777 "/mnt/cache/completed"
sudo -su root chown -R nobody:users "/mnt/cache/completed"
# hand the download off to the amc script inside the filebot docker container, passing uTorrent's variables through as $1..$6
/usr/bin/docker exec filebot /opt/filebot/filebot -script fn:amc --output "/mnt/cache/" --log-file "/mnt/appdata/logs/amc.log" --action move --conflict auto -non-strict --def movieDB=TheMovieDB seriesDB=TheTVDB movieFormat=@/mnt/appdata/config/Filebot/MovieFormat.groovy seriesFormat=@/mnt/appdata/config/Filebot/SeriesFormat.groovy animeFormat=@/mnt/appdata/config/Filebot/AnimeFormat.groovy musicFormat=@/mnt/appdata/config/Filebot/MusicFormat.groovy plex=192.168.0.250:################### minFileSize=0 deleteAfterExtract=y clean=y "ut_dir=$1" "ut_file=$2" "ut_kind=$3" "ut_title=$4" "ut_label=$5" "ut_state=$6" --apply date
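For context, postprocess.sh is called from uTorrent's "Run Program on Completion" setting, with the placeholders passed in roughly this order (matching the ut_dir/ut_file/... arguments above):
Shell:
postprocess.sh "%D" "%F" "%K" "%N" "%L" "%S"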
Now, when I was searching the forums for help, I read that if you want to ignore embedded and existing subtitles on the command line you must first disable the amc script's --def subtitles=en option and then call filebot -script fn:suball before you call the amc script.
Since I am using a custom script to call filebot this should be easy. My question is how to pass the input files in my custom script, as some downloads will be single files and others folders (ut_kind=single/multi):
Shell:
/usr/bin/docker exec filebot /opt/filebot/filebot -script fn:suball "ut_dir=$1" "ut_file=$2" "ut_kind=$3" "ut_title=$4" "ut_label=$5" "ut_state=$6" --action duplicate --conflict skip --log-file "/mnt/appdata/logs/suball.log"
or
Shell:
/usr/bin/docker exec filebot /opt/filebot/filebot -script fn:suball "$1" "$2" "$3" "$4" "$5" "$6" --action duplicate --conflict skip --log-file "/mnt/appdata/logs/suball.log"