We have an AWS server that encodes multiple videos at a time (currently 6 at once). On startup, an EBS volume holding all the source videos to be encoded is mounted to the encoding instance. We currently use a simple bash script to encode a single video ("encode [login to view URL]"), which produces a final MP4 file.
The goal here is to further automate the process.
What we already have:
1. A script, plus all the required applications/software, that encodes a single video: "encode [login to view URL]"
2. A CSV list containing all 6 source filenames to encode (the CSV can be in whatever format the developer needs)
We need to do the following:
1. Mount the instance storage (ephemeral) [login to view URL] (this can be baked into the AMI if preferred)
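For reference, mounting the ephemeral store at boot usually looks something like the sketch below. The device name and mount point are assumptions (instance-store device names vary by instance type; check with `lsblk`), not values from this brief:

```shell
#!/usr/bin/env bash
# Sketch: format and mount the ephemeral instance store at boot.
# ASSUMPTIONS: /dev/nvme1n1 and /mnt/scratch are placeholders --
# the actual device name depends on the instance type (run `lsblk`).
set -euo pipefail

DEVICE=/dev/nvme1n1      # instance-store device (placeholder)
MOUNT_POINT=/mnt/scratch # working directory for encoding (placeholder)

sudo mkfs.ext4 -F "$DEVICE"   # instance store comes up blank on each boot
sudo mkdir -p "$MOUNT_POINT"
sudo mount "$DEVICE" "$MOUNT_POINT"
```

This could run from user data or be baked into the AMI as preferred.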
2. Copy all source videos on the CSV list (6 files) from the EBS volume to the instance storage. Each source filename includes the directory where the file is stored, so the directory structure is:
/cat
/dog
/elephant
and the filenames are [login to view URL], [login to view URL], [login to view URL]
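Since each CSV entry carries its own directory, the copy step can recreate that structure on the instance store. A minimal sketch, assuming a one-path-per-line CSV and placeholder mount points (`/mnt/ebs`, `/mnt/scratch`):

```shell
#!/usr/bin/env bash
# Sketch: copy each file listed in the CSV from the EBS mount to the
# instance store, recreating the /cat, /dog, /elephant directory layout.
# ASSUMPTION: the CSV holds one relative path per line (e.g. cat/video.mp4).

# copy_sources CSV SRC_ROOT DST_ROOT
copy_sources() {
  local csv=$1 src=$2 dst=$3 path
  while IFS= read -r path; do
    [ -z "$path" ] && continue                 # skip blank lines
    mkdir -p "$dst/$(dirname "$path")"         # recreate the subdirectory
    cp "$src/$path" "$dst/$path"
  done < "$csv"
}
```

Usage would be along the lines of `copy_sources sources.csv /mnt/ebs /mnt/scratch`.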
3. Create a script or process that sends all 6 files for encoding in parallel at the same time:
'encode [login to view URL]'
'encode [login to view URL]'
'encode [login to view URL]'
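Plain bash job control is enough for this step: launch each `encode` as a background job and `wait` for all of them. A sketch, assuming the existing `encode` script is on the PATH and the same one-path-per-line CSV as above:

```shell
#!/usr/bin/env bash
# Sketch: run every encode in parallel and block until all finish.
# ASSUMPTION: `encode` is the existing single-file script on the PATH.

# encode_all CSV
encode_all() {
  local csv=$1 path
  while IFS= read -r path; do
    [ -z "$path" ] && continue
    encode "$path" &   # one background job per source file
  done < "$csv"
  wait                 # returns only after every encode has exited
}
```

With 6 files this launches exactly 6 jobs; if the list ever grows beyond what the instance can handle, `xargs -P` or GNU parallel would cap concurrency.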
4. Copy the final MP4 files to S3.
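The upload step is a single AWS CLI call. The bucket name and output directory below are placeholders (the instance also needs an IAM role or credentials with `s3:PutObject` on the bucket):

```shell
#!/usr/bin/env bash
# Sketch: push the finished MP4s to S3 with the AWS CLI.
# ASSUMPTIONS: directory and bucket URI are placeholders.

# upload_outputs DIR BUCKET_URI
upload_outputs() {
  aws s3 cp "$1" "$2" --recursive --exclude '*' --include '*.mp4'
}
```

For example: `upload_outputs /mnt/scratch s3://your-output-bucket/encoded/`. The `--exclude '*' --include '*.mp4'` pair makes the recursive copy pick up only the encoded MP4s, not the source files sitting in the same tree.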
This project needs to be done ASAP.
We know there are much better ways to accomplish this using an SQS queue and other AWS services; this is a "baby step" for us.