This course runs for 3 days.
Classes run daily from 12:00 p.m. to 8:00 p.m. EST.
Class Location: Virtual, live, instructor-led classroom.
Big Data on AWS introduces you to cloud-based big data solutions such as Amazon Elastic MapReduce (EMR), Amazon Redshift, Amazon Kinesis, and the rest of the AWS big data platform. In this course, we show you how to use Amazon EMR to process data with the broad ecosystem of Hadoop tools such as Hive and Hue. We also teach you how to create big data environments; work with Amazon DynamoDB, Amazon Redshift, and Amazon Kinesis; and apply best practices to design big data environments for security and cost-effectiveness.
Skills Gained
This course teaches you how to:
Fit AWS solutions inside a big data ecosystem
Leverage Apache Hadoop in the context of Amazon EMR
Identify the components of an Amazon EMR cluster
Launch and configure an Amazon EMR cluster
Leverage common programming frameworks available for Amazon EMR including Hive, Pig, and Streaming
Leverage Hue to improve the ease of use of Amazon EMR
Use in-memory analytics with Spark and Spark SQL on Amazon EMR
Choose appropriate AWS data storage options
Identify the benefits of using Amazon Kinesis for near real-time big data processing
Define data warehousing and columnar database concepts
Leverage Amazon Redshift to efficiently store and analyze data
Comprehend and manage costs and security for Amazon EMR and Amazon Redshift deployments
Identify options for ingesting, transferring, and compressing data
Use visualization software to depict data and queries
Orchestrate big data workflows using AWS Data Pipeline
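To give a flavor of the "Launch and configure an Amazon EMR cluster" objective above, here is a minimal sketch of an EMR cluster configuration, using the parameter names accepted by boto3's `run_job_flow` API. The cluster name, EMR release label, and instance types are illustrative assumptions, not values prescribed by the course.

```python
# Illustrative EMR cluster configuration (names and sizes are hypothetical).
# The applications listed mirror the tools covered in this course:
# Hadoop, Hive, Pig, Spark, and Hue.
cluster_config = {
    "Name": "course-lab-cluster",   # hypothetical cluster name
    "ReleaseLabel": "emr-6.15.0",   # an EMR release bundling the applications below
    "Applications": [
        {"Name": app} for app in ("Hadoop", "Hive", "Pig", "Spark", "Hue")
    ],
    "Instances": {
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        # Keep the cluster running after steps finish, for interactive use.
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    # Default EMR roles; these must exist in the target AWS account.
    "JobFlowRole": "EMR_EC2_DefaultRole",
    "ServiceRole": "EMR_DefaultRole",
}

# With boto3 installed and AWS credentials configured, the cluster could
# then be launched with:
#   import boto3
#   emr = boto3.client("emr", region_name="us-east-1")
#   response = emr.run_job_flow(**cluster_config)
```

The configuration is built as a plain dictionary so it can be inspected, versioned, or templated before any AWS call is made.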
Who Should Attend
This course is intended for:
Individuals responsible for designing and implementing big data solutions, namely Solutions Architects and SysOps Administrators
Data Scientists and Data Analysts interested in learning about big data solutions on AWS
Prerequisites
We recommend that attendees of this course have the following prerequisites:
Basic familiarity with big data technologies, including Apache Hadoop (MapReduce and HDFS) and SQL/NoSQL querying
Students should complete the Big Data Technology Fundamentals web-based training or have equivalent experience
Working knowledge of core AWS services and public cloud implementation
Students should complete the AWS Essentials course or have equivalent experience
Basic understanding of data warehousing, relational database systems, and database design