Docker enables you to create highly customized images that are used to execute your jobs. These images let you easily share complex applications between teams and even organizations.
2. • Technical Evangelist, Developer Advocate, Software Engineer
• My @home is in Finland
• Previously:
• Solutions Architect @AWS
• Lead Cloud Architect @Dreambroker
• Director of Engineering, Software Engineer, DevOps, Manager, ... @Hdm
• Researcher @Nokia Research Center
• and a bunch of other stuff.
• Love climbing and ginger shots.
3. What to expect from this session
• Batch processing overview
• AWS Batch platform walkthrough
• API overview
• Demo(s)
• Show me the code!
• Usage patterns
5. What is batch computing?
Run jobs asynchronously and automatically across one or more
computers.
Jobs may have dependencies, making the sequencing and scheduling of
multiple jobs complex and challenging.
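The sequencing problem above can be illustrated with a topological sort: jobs run only after every job they depend on has completed. This is a minimal stdlib sketch, not AWS Batch's actual scheduler, and the job names are hypothetical:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical job graph: each job maps to the jobs it depends on.
jobs = {
    "extract": [],
    "transform": ["extract"],
    "load": ["transform"],
    "report": ["load", "transform"],
}

# A valid execution order lists every job after all of its dependencies.
order = list(TopologicalSorter(jobs).static_order())
```

A real scheduler additionally runs independent jobs in parallel and re-evaluates the graph as jobs finish or fail, but the ordering constraint is the same.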
6. Early Batch APIs (19th Century)
• Processing of data stored on decks of punch cards
• Tabulating machine by Herman Hollerith,
used for the 1890 United States Census.
• Each card stored a separate record of data
with different fields.
• Cards were processed by the machine one
by one, all in the same way, as a batch.
IBM Type 285 tabulators (1936) being used for batch
processing of punch cards (in stack on each machine) with
human operators at U.S. Social Security Administration
8. Batch in Linux
$ echo "cc -o foo foo.c" | at 1145 jan 31
job 1 at Wed Jan 31 11:45:00 2018
9. Batch in Linux
$ at 1145 jan 31
at> cc -o foo foo.c
at> ^D
$ atq            # list queued jobs
$ atrm <job_number>
10. Batch computing today
• In-house compute clusters powered by open source or
commercial job schedulers.
• Often composed of a large array of identical,
undifferentiated processors, all of the same vintage and
built to the same specifications.
11. Batch computing today …
It’s like trying to fit a square into a circle
13. AWS Batch in a nutshell
• Fully managed batch primitives
• Focus on your applications
• Shell scripts,
• Linux executables,
• Docker images
• and their resource requirements
• We take care of the rest!
17. Jobs
Jobs are the unit of work executed by AWS Batch as containerized
applications running on Amazon EC2.
Containerized jobs can reference a container image, command, and
parameters.
Or, users can fetch a .zip containing their application and run it in an
Amazon Linux container.
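A containerized job submission can be sketched as a plain request body. The field names below follow the public SubmitJob API, but every value (queue, definition, image command, parameter names) is a hypothetical example:

```python
# Sketch of an AWS Batch SubmitJob request body (values are hypothetical).
submit_job_request = {
    "jobName": "render-frame-001",
    "jobQueue": "my-queue",               # queue the job is submitted to
    "jobDefinition": "my-job-def:1",      # name:revision of a job definition
    "containerOverrides": {
        # Override the definition's default command and environment.
        "command": ["render", "--frame", "Ref::frame"],
        "environment": [{"name": "OUTPUT_BUCKET", "value": "my-bucket"}],
    },
    # Substituted into Ref:: placeholders in the command.
    "parameters": {"frame": "1"},
}
```

With boto3 this dict would be passed as keyword arguments to `batch_client.submit_job(**submit_job_request)`.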
20. Job States
Jobs submitted to a queue can have the following states:
SUBMITTED: Accepted into the queue, but not yet evaluated for execution
PENDING: Your job has dependencies on other jobs which have not yet completed
RUNNABLE: Your job has been evaluated by the scheduler and is ready to run
STARTING: Your job is in the process of being scheduled to a compute resource
RUNNING: Your job is currently running
SUCCEEDED: Your job has finished with exit code 0
FAILED: Your job finished with a non-zero exit code, or was cancelled or terminated.
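The lifecycle can be sketched as a small transition table. The edges below are inferred from the state descriptions on this slide, not taken verbatim from AWS documentation:

```python
# Job lifecycle sketch: each state maps to the states it can move to.
# Cancellation/termination can also push a job to FAILED from any
# non-terminal state; only the main path is shown here.
TRANSITIONS = {
    "SUBMITTED": {"PENDING", "RUNNABLE", "FAILED"},  # PENDING if it has dependencies
    "PENDING":   {"RUNNABLE", "FAILED"},             # once dependencies complete
    "RUNNABLE":  {"STARTING", "FAILED"},
    "STARTING":  {"RUNNING", "FAILED"},
    "RUNNING":   {"SUCCEEDED", "FAILED"},
    "SUCCEEDED": set(),                              # terminal
    "FAILED":    set(),                              # terminal
}

def is_terminal(state: str) -> bool:
    # A state with no outgoing transitions is terminal.
    return not TRANSITIONS[state]
```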
21. Job Definition
AWS Batch job definitions specify how jobs are to be run.
Some of the attributes specified in a job definition:
• IAM role associated with the job
• vCPU and memory requirements
• Mount points
• Container properties
• Environment variables
• Retry strategy
• While each job must reference a job definition, many parameters
can be overridden.
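The attributes above map onto a RegisterJobDefinition request, which can be sketched as a dict. Field names follow the public API; the image URI, role ARN, and all other values are hypothetical:

```python
# Sketch of an AWS Batch RegisterJobDefinition request (hypothetical values).
job_definition = {
    "jobDefinitionName": "my-app",
    "type": "container",
    "containerProperties": {
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest",
        "vcpus": 2,                    # vCPU requirement
        "memory": 2048,                # memory requirement in MiB
        "jobRoleArn": "arn:aws:iam::123456789012:role/my-batch-job-role",
        "environment": [{"name": "LOG_LEVEL", "value": "info"}],
        "mountPoints": [],             # container mount points, if any
    },
    "retryStrategy": {"attempts": 3},  # retry failed jobs up to 3 times
}
```

At submit time, `containerOverrides` on the SubmitJob call can override the command, environment, and resource values set here.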
23. Job Queue
Jobs are submitted to a job queue, where they reside until they are
able to be scheduled to a compute resource. Information related to
completed jobs persists in the queue for 24 hours.
Job queues support priorities and multiple queues can schedule work
to the same compute environment.
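A queue with a priority and a mapping to a compute environment can be sketched as a CreateJobQueue request body. Field names follow the public API; the queue name and compute environment ARN are hypothetical:

```python
# Sketch of an AWS Batch CreateJobQueue request (hypothetical values).
job_queue = {
    "jobQueueName": "high-priority",
    "state": "ENABLED",
    "priority": 10,  # higher number = scheduled ahead of lower-priority queues
    "computeEnvironmentOrder": [
        # Multiple queues may list the same compute environment;
        # "order" sets preference when a queue maps to several.
        {
            "order": 1,
            "computeEnvironment": (
                "arn:aws:batch:us-east-1:123456789012:"
                "compute-environment/shared-ce"
            ),
        },
    ],
}
```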
25. Job Scheduler
The scheduler evaluates when, where, and how to run jobs
that have been submitted to a job queue.
Jobs run in approximately the order in which they are
submitted, as long as all dependencies on other jobs have
been met.
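Dependencies between jobs are expressed at submit time via the `dependsOn` field of SubmitJob; the scheduler holds a dependent job in PENDING until its dependencies succeed. A minimal sketch, with a hypothetical job ID standing in for the ID returned by an earlier submission:

```python
# Hypothetical: the ID returned when the "extract" job was submitted.
extract_job_id = "11111111-2222-3333-4444-555555555555"

# Sketch of a SubmitJob request that waits on the earlier job.
dependent_job = {
    "jobName": "load",
    "jobQueue": "my-queue",
    "jobDefinition": "my-job-def:1",
    # "load" stays PENDING until the extract job reaches SUCCEEDED.
    "dependsOn": [{"jobId": extract_job_id}],
}
```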
26. Compute Environment
Job queues are mapped to one or more compute environments.
Managed compute environments enable you to describe your business
requirements (instance types, min/max/desired vCPUs, and Spot
Instance bid as a % of the On-Demand price) and we launch and scale
resources on your behalf.
You can choose specific instance types or choose “optimal” and AWS
Batch launches appropriately sized instances.
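A managed compute environment expressing those business requirements can be sketched as a CreateComputeEnvironment request. Field names follow the public API; the subnet, security group, and role ARNs are hypothetical placeholders:

```python
# Sketch of an AWS Batch CreateComputeEnvironment request (hypothetical values).
compute_environment = {
    "computeEnvironmentName": "spot-ce",
    "type": "MANAGED",                 # AWS Batch launches/scales instances
    "computeResources": {
        "type": "SPOT",
        "minvCpus": 0,                 # scale to zero when the queue is empty
        "desiredvCpus": 0,
        "maxvCpus": 256,
        "instanceTypes": ["optimal"],  # let AWS Batch pick instance sizes
        "bidPercentage": 60,           # max Spot price as % of On-Demand
        "subnets": ["subnet-aaaa1111"],
        "securityGroupIds": ["sg-bbbb2222"],
        "instanceRole": (
            "arn:aws:iam::123456789012:instance-profile/ecsInstanceRole"
        ),
    },
    "serviceRole": "arn:aws:iam::123456789012:role/AWSBatchServiceRole",
}
```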
28. Customer Provided AMIs
Customer Provided AMIs let you set the AMI that is
launched as part of a managed compute environment.
This makes it possible to configure Docker settings, mount
EBS/EFS volumes, and install drivers for GPU jobs.
AMIs must be Linux-based, HVM, and have a working ECS
agent installed.
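Pointing a managed compute environment at a custom AMI is done through the `imageId` field of `computeResources`. A small sketch, with a hypothetical AMI ID:

```python
# Sketch: computeResources fragment selecting a customer-provided AMI
# (hypothetical values; the AMI must be Linux/HVM with the ECS agent).
compute_resources_custom_ami = {
    "type": "EC2",
    "minvCpus": 0,
    "maxvCpus": 64,
    "instanceTypes": ["p3.2xlarge"],        # e.g. GPU instances
    "imageId": "ami-0abcdef1234567890",     # custom AMI with GPU drivers
    "subnets": ["subnet-aaaa1111"],
    "securityGroupIds": ["sg-bbbb2222"],
    "instanceRole": (
        "arn:aws:iam::123456789012:instance-profile/ecsInstanceRole"
    ),
}
```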
36. AWS Batch Use Cases
High Performance Computing
Post-Trade Analytics
Fraud Surveillance
Drug Screening
DNA Sequencing
Rendering
Transcoding
Media Supply Chain
38. Life Sciences: Drug Screening for Biopharma
Rapidly search libraries of small molecules for drug discovery.
39. Digital Media: Visual Effects Rendering
Automate content-rendering workloads and reduce the need for human intervention
in handling execution dependencies and resource scheduling.
Business priorities drive architectural configurations:
• Cost – not overly time-sensitive; cost is the primary concern
• Reserved Instances (RIs) – if the customer is already paying for resources, AWS Batch can help ensure they get fully utilized
• Time (SLA-oriented)