To make this work, we make type a proper segment field.
We also tell get_best_segments to ignore temp segments, since they might go away
before we can actually use them.
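A minimal sketch of that filtering, assuming segments are parsed into objects with a `type` attribute (the real field layout may differ):

```python
# Sketch only: assumes each parsed segment carries a `type` attribute
# with values like "full", "partial" or "temp".
def drop_temp_segments(segments):
    """Exclude temp segments, which may be renamed or deleted before we use them."""
    return [segment for segment in segments if segment.type != "temp"]
```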
Exposes a way to read all rows, and write a single cell.
We need to read all columns of each row so we know what would be modified,
and then only update the individual cells that aren't already the correct value.
This keeps us from putting too much load on the sheet with constant writes,
which I suspect could be an issue even when the written values are unchanged.
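A sketch of the resulting write path (the client, method and argument names here are illustrative, not the real sheets API):

```python
def write_changed_cells(client, sheet_id, row_index, current_row, desired):
    """Write only the cells whose value actually differs.

    `current_row` is the row as last read from the sheet; `desired` maps
    column index -> the value we want. All names are illustrative.
    """
    for column, value in desired.items():
        if current_row[column] != value:
            client.write_cell(sheet_id, row_index, column, value)
```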
This allows manual uploads to work without needing to fill all the edit fields
with junk.
We also set a constraint on uploader asserting that any videos from CLAIMED onwards have a known uploader.
Again, an exception is made for DONE to allow manual uploads.
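Expressed as a CHECK constraint, the uploader rule looks roughly like this (the table, column and state names are assumptions about the schema, not taken from it):

```python
# Assumed schema: an `events` table with `state` and `uploader` columns,
# where the only states before CLAIMED are UNEDITED and EDITED.
UPLOADER_CONSTRAINT = """
    ALTER TABLE events ADD CONSTRAINT uploader_when_claimed CHECK (
        uploader IS NOT NULL
        OR state IN ('UNEDITED', 'EDITED')  -- not yet claimed
        OR state = 'DONE'                   -- manual uploads may lack an uploader
    )
"""
```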
These can happen if a downloader or backfiller dies suddenly.
We treat them similarly to partials, just lacking any hash.
At some point in the future we should probably have something
to find any temp segments, hash them and rename them to partials.
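That future tool might look something like this sketch; the filename layout and hash encoding here are guesses, not the real naming scheme:

```python
import base64
import hashlib
import os

def promote_temp_segments(segment_dir):
    """Hash any temp segments and rename them as partials (illustrative only)."""
    for name in os.listdir(segment_dir):
        base, ext = os.path.splitext(name)
        if not base.endswith("-temp"):
            continue
        path = os.path.join(segment_dir, name)
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).digest()
        hash_str = base64.urlsafe_b64encode(digest).decode().rstrip("=")
        new_name = "{}-partial-{}{}".format(base[:-len("-temp")], hash_str, ext)
        os.rename(path, os.path.join(segment_dir, new_name))
```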
We wrap direct dateutil calls to handle two distinct cases:
* `common.dateutil.parse()`: We want to handle arbitrary timestamps including tz info,
then convert them to UTC.
This is used in HLS parsing, and for command-line input to the backfiller.
* `common.dateutil.parse_utc_only()`: We want to only handle UTC timestamps,
but datetime.strptime isn't flexible enough (e.g. it can't handle a missing fractional-seconds component).
This is used for restreamer request params.
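A sketch of both wrappers, with assumed signatures (the real versions may validate input more strictly):

```python
import dateutil.parser
import dateutil.tz

def parse(timestamp):
    """Parse an arbitrary timestamp, tz-aware or not, and return naive UTC."""
    dt = dateutil.parser.parse(timestamp)
    if dt.tzinfo is not None:
        dt = dt.astimezone(dateutil.tz.tzutc()).replace(tzinfo=None)
    return dt

def parse_utc_only(timestamp):
    """Parse a UTC timestamp, tolerating a missing fractional-seconds part.

    Note this sketch simply drops any timezone suffix rather than rejecting it.
    """
    return dateutil.parser.parse(timestamp, ignoretz=True)
```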
Each method is fairly complicated, but is self-contained and can be examined independently.
cut_jobs in particular contains several extra helpers and directs control flow
via some iterators. This is unfortunately necessary due to the requests interface (sketched below).
This commit only lays out the main loop, showing the high-level flow
and defining shared utilities. This is for clarity.
The actual methods that do the work will be implemented separately.
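To illustrate the awkward part: requests consumes a streamed upload body by iterating a generator, so errors raised mid-cut surface inside requests' read loop and have to be smuggled back out. A hedged sketch of that pattern (URL and error handling are illustrative):

```python
import requests

def upload_cut(upload_url, chunks):
    """Upload chunks as a streamed body, recovering any error raised mid-cut."""
    errors = []
    def body():
        try:
            for chunk in chunks:
                yield chunk
        except Exception as e:
            errors.append(e)
            return  # stop feeding requests; deal with the error below
    resp = requests.post(upload_url, data=body())
    if errors:
        raise errors[0]
    resp.raise_for_status()
    return resp
```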
It runs on an interval, fetching all videos in TRANSCODING from the DB,
checking them against youtube, and then updating any that are done.
It should be noted that youtube somewhat lies about what being "done" means,
but this is a better approximation than nothing.
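The loop shape is roughly as follows (a sketch; the DB and youtube helpers are hypothetical stand-ins, not real method names):

```python
import time

def run_check_loop(dbmanager, youtube, interval=60):
    """Poll on an interval, marking finished TRANSCODING videos as DONE."""
    while True:
        for video in dbmanager.get_videos_in_state("TRANSCODING"):  # hypothetical
            if youtube.is_done_transcoding(video.video_id):  # hypothetical
                dbmanager.set_state(video.id, "DONE")
        time.sleep(interval)
```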
Provides basic youtube api calls, and gets passed into both transcode checker and cutter.
The official youtube client library is many orders of magnitude larger and more complicated,
and can't actually do what we want (stream an upload of unknown size).
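A sketch of the kind of thin wrapper meant here; the auth handling and any endpoints are assumptions rather than real googleapis details:

```python
import requests

class YoutubeAPI(object):
    """Minimal client sketch: one authenticated session shared by all calls."""

    def __init__(self, access_token):
        self.session = requests.Session()
        self.session.headers["Authorization"] = "Bearer {}".format(access_token)

    def stream_upload(self, upload_url, data):
        """Upload a body of unknown length by streaming it.

        `data` may be a file-like object or a generator; requests sends it
        chunked, so the total size never needs to be known up front.
        """
        resp = self.session.post(upload_url, data=data)
        resp.raise_for_status()
        return resp
```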
Note this moves over the 'experimental' cutter and deletes the original cutter
that concatenates entire videos before cutting.
We may eventually want to revive that method if the experimental cutter turns out
to introduce too many issues.
We move most of the code over verbatim, but adjust it such that it acts
as a generic iterator that can be used in a variety of contexts.
Some other changes made during the move include telling ffmpeg to be quieter
(don't output version info and junk, only log if something goes wrong),
and avoiding errors during cleanup.
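For reference, the quieter ffmpeg invocation and the iterator shape look roughly like this (argument list heavily abridged; the real cut options are omitted):

```python
import subprocess

def cut_segment(input_path, start, duration):
    """Yield the cut video as chunks of bytes (abridged sketch)."""
    args = [
        "ffmpeg",
        "-hide_banner", "-loglevel", "error",  # no version banner, errors only
        "-i", input_path,
        "-ss", str(start), "-t", str(duration),
        "-f", "mpegts", "-",  # write the result to stdout
    ]
    proc = subprocess.Popen(args, stdout=subprocess.PIPE)
    try:
        for chunk in iter(lambda: proc.stdout.read(16 * 1024), b""):
            yield chunk
    finally:
        # avoid errors during cleanup: the process may already have exited
        try:
            proc.kill()
        except OSError:
            pass
```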
This is a performance optimization, allowing us to fail out early (potentially avoiding a LOT
of work) if we know we're going to reject any result that contains holes.
We add a new exception ContainsHoles that is raised in this condition.
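A simplified sketch of the early-out; here segments are just (start, end) pairs, whereas the real function works on segment files, but the idea is the same:

```python
class ContainsHoles(Exception):
    """Raised when the requested range can't be covered without gaps."""

def best_segments(segments, start, end, allow_holes=True):
    """Return a covering list with None marking holes; `segments` is time-ordered."""
    result = []
    covered_to = start
    for seg_start, seg_end in segments:
        if seg_end <= covered_to:
            continue  # already covered or before the range
        if seg_start > covered_to:
            if not allow_holes:
                raise ContainsHoles  # bail out before doing any more work
            result.append(None)
        result.append((seg_start, seg_end))
        covered_to = seg_end
    if covered_to < end:
        if not allow_holes:
            raise ContainsHoles
        result.append(None)
    return result
```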
The cutter has two jobs:
* To cut videos, taking them through states EDITED -> TRANSCODING
* To monitor TRANSCODING videos for when they're complete
We run these as separate greenlets with their own DB connections,
and if either dies we gracefully shut down the other.
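A sketch of that arrangement with gevent (the work inside each loop is elided, and each job would hold its own DB connection):

```python
import gevent
import gevent.event

def run_cutter(stop):
    while not stop.is_set():
        # ... find EDITED videos and cut them ...
        gevent.sleep(5)

def run_transcode_checker(stop):
    while not stop.is_set():
        # ... poll TRANSCODING videos against youtube ...
        gevent.sleep(5)

def main():
    stop = gevent.event.Event()
    jobs = [
        gevent.spawn(run_cutter, stop),
        gevent.spawn(run_transcode_checker, stop),
    ]
    # if either greenlet exits (or dies), signal the other to finish up
    gevent.wait(jobs, count=1)
    stop.set()
    gevent.wait(jobs)
```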
This should help prevent changing state to EDITED with any of these fields unset,
which would blow up the cutter.
We also fix up upload_location, which was set up as a sheet input (NOT NULL DEFAULT ''),
and add a similar constraint saying any DONE rows must have a non-NULL video link.
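In CHECK form, these two rules might look like the following (table, column and state names are assumptions about the schema):

```python
EDITED_REQUIRES_EDIT_FIELDS = """
    ALTER TABLE events ADD CONSTRAINT edited_has_edit_fields CHECK (
        state IN ('UNEDITED', 'DONE')  -- DONE excepted for manual uploads
        OR (video_start IS NOT NULL AND video_end IS NOT NULL
            AND video_title IS NOT NULL AND upload_location IS NOT NULL)
    )
"""

DONE_REQUIRES_VIDEO_LINK = """
    ALTER TABLE events ADD CONSTRAINT done_has_video_link CHECK (
        state != 'DONE' OR video_link IS NOT NULL
    )
"""
```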