Big Data on AWS
5 Labs · 50 Credits · 5 hours 11 minutes
Use Case (Experienced)
This quest is designed to teach you how to work with AWS services to perform big data analytics on the cloud.
The lab demonstrates how to use Amazon Redshift to create a cluster, load data, run queries, and monitor performance. Note: Students will download a free SQL client as part of this lab.
advanced · 10 Credits · 45 minutes
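As a rough illustration of the cluster-creation step this lab covers, the sketch below builds the parameters for boto3's `redshift` client `create_cluster` call. The identifier, node type, and credentials are placeholder assumptions, not the lab's actual values.

```python
# Hedged sketch: assemble keyword arguments for boto3's
# redshift.create_cluster(). All concrete values are illustrative.
def cluster_params(identifier, user, password, nodes=2):
    """Build the kwargs for creating a small Redshift cluster."""
    params = {
        "ClusterIdentifier": identifier,
        "NodeType": "dc2.large",        # assumed node type
        "MasterUsername": user,
        "MasterUserPassword": password,
    }
    if nodes > 1:
        params["ClusterType"] = "multi-node"
        params["NumberOfNodes"] = nodes
    else:
        params["ClusterType"] = "single-node"
    return params

# Usage (requires AWS credentials and permissions):
# import boto3
# boto3.client("redshift").create_cluster(
#     **cluster_params("demo-cluster", "admin", "ChangeMe123"))
```

Separating parameter assembly from the API call keeps the cluster configuration easy to inspect before anything is provisioned.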
This lab demonstrates how to launch an Amazon Elastic MapReduce (EMR) cluster for Big Data processing and use Hive with SQL-style queries to analyze data. You will create a Hadoop cluster using Amazon EMR, which will allow you to run interactive Hive queries against data stored in Amazon S3. You will use Hive to normalize the data into a more useful form, and you will run queries to analyze the data.
expert · 15 Credits · 1 hour
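To give a flavor of how Hive exposes S3 data to SQL-style queries, the sketch below builds a HiveQL `CREATE EXTERNAL TABLE` statement over an S3 prefix. The table name, columns, and bucket path are illustrative assumptions, not the lab's actual dataset.

```python
# Minimal sketch of the kind of HiveQL DDL used to map S3 data into Hive.
# Table name, column list, and S3 location below are assumed examples.
def hive_external_table_ddl(table, columns, s3_location, delimiter="\\t"):
    """Build a CREATE EXTERNAL TABLE statement over files in S3."""
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {table} ({cols}) "
        f"ROW FORMAT DELIMITED FIELDS TERMINATED BY '{delimiter}' "
        f"LOCATION '{s3_location}';"
    )

ddl = hive_external_table_ddl(
    "clickstream",
    [("user_id", "STRING"), ("url", "STRING"), ("ts", "BIGINT")],
    "s3://my-bucket/clickstream/",
)
```

Because the table is external, dropping it in Hive removes only the metadata; the underlying S3 objects stay in place.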
In this lab, you will deploy a fully functional Hadoop cluster, ready to analyze log data in just a few minutes. You will start by launching an Amazon EMR cluster and then use a HiveQL script to process sample log data stored in an Amazon S3 bucket. HiveQL is a SQL-like scripting language for data warehousing and analysis. You can then use a similar setup to analyze your own log files.
Introductory · Free · 40 minutes
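A HiveQL script for log analysis typically boils down to aggregations like the one below; the table and column names here are assumptions standing in for the lab's sample log schema.

```python
# Illustrative HiveQL aggregation of the kind a log-processing script runs.
# The access_logs table and host column are assumed, not the lab's schema.
LOG_SUMMARY_QUERY = """
SELECT host, COUNT(*) AS requests
FROM access_logs
GROUP BY host
ORDER BY requests DESC
LIMIT 10;
"""
```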
In this lab, you will take a close look at different types of table layout and schema design. You will create tables using various methods for data compression and distribution, and analyze which methods work best, including incorporating Amazon Redshift recommendations. You will conclude the lab by building five different versions of the same table and analyzing how the differences impact storage requirements and query performance. Pre-requisites: To successfully complete this lab, you should be familiar with Redshift concepts, which you can gain by taking the "Introduction to Amazon Redshift" and "Working with Amazon Redshift" labs at qwiklabs.com. Knowledge of SQL programming is required, although full solution code is provided.
expert · 15 Credits · 1 hour 45 minutes
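To make the "different versions of the same table" idea concrete, the sketch below generates two Redshift `CREATE TABLE` variants: a baseline with no tuning, and one with column encodings, a distribution key, and a sort key. All encoding and key choices here are assumptions for illustration, not the lab's solution.

```python
# Hedged sketch: build Redshift CREATE TABLE variants differing in
# compression encodings and distribution/sort settings (all assumed).
def redshift_table_ddl(name, columns, distkey=None, sortkey=None):
    """Build a CREATE TABLE statement; columns are (name, type, encoding)."""
    cols = ", ".join(
        f"{cname} {ctype}" + (f" ENCODE {enc}" if enc else "")
        for cname, ctype, enc in columns
    )
    ddl = f"CREATE TABLE {name} ({cols})"
    if distkey:
        ddl += f" DISTKEY({distkey})"   # implies DISTSTYLE KEY
    if sortkey:
        ddl += f" SORTKEY({sortkey})"
    return ddl + ";"

baseline = redshift_table_ddl(
    "orders_raw",
    [("order_id", "BIGINT", None), ("region", "VARCHAR(16)", None)],
)
tuned = redshift_table_ddl(
    "orders_tuned",
    [("order_id", "BIGINT", "AZ64"), ("region", "VARCHAR(16)", "ZSTD")],
    distkey="region", sortkey="order_id",
)
```

Comparing `SVV_TABLE_INFO` and query timings across such variants is how the storage and performance differences show up in practice.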
In this lab, you will experiment with and compare different types of data loading using Amazon Redshift. You will create tables, load data from Amazon S3 and from remote hosts, and practice troubleshooting data-loading errors. For the lab to function as written, please DO NOT change the auto-assigned region.
advanced · 10 Credits · 45 minutes
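Loading from S3 in Redshift centers on the `COPY` command; the sketch below assembles one. The table, bucket path, and IAM role ARN are placeholder assumptions, not values from the lab.

```python
# Hedged sketch of a Redshift COPY statement for loading from S3.
# Table, S3 path, and role ARN below are placeholders.
def copy_from_s3(table, s3_path, iam_role,
                 options="FORMAT AS CSV IGNOREHEADER 1"):
    """Build a COPY statement that loads a table from files in S3."""
    return (f"COPY {table} FROM '{s3_path}' "
            f"IAM_ROLE '{iam_role}' {options};")

stmt = copy_from_s3(
    "sales",
    "s3://my-bucket/sales/",
    "arn:aws:iam::123456789012:role/RedshiftCopyRole",  # placeholder ARN
)
```

When a load fails, the `STL_LOAD_ERRORS` system table is the usual first stop for troubleshooting, which matches the error-handling practice this lab covers.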