Edureka
Global
Course
Online
January 29, 2022
6 weeks
Course price
449 USD

Apache Spark and Scala

Skills you will gain:
Scala, SQL, Apache Spark, Big Data, Machine Learning, Apache Kafka

Designed to meet industry benchmarks, Edureka’s Apache Spark and Scala certification is curated by top industry experts. This Scala certification training helps you master Apache Spark and the Spark ecosystem, including Spark RDD, Spark SQL, and Spark MLlib. The training is live and instructor-led, with hands-on demonstrations of key Apache Spark concepts, and is fully immersive: you learn and interact with the instructor and your peers. Enroll now in this Scala online training.

Spark is one of the fastest-growing and most widely used tools for Big Data and analytics. It has been adopted by companies across many domains around the globe and therefore offers promising career opportunities. To take advantage of these opportunities, you need structured training aligned with the Cloudera Hadoop and Spark Developer Certification (CCA175) and with current industry requirements and best practices.

Who is this course for

  • Developers and Architects.
  • BI/ETL/DW Professionals.
  • Senior IT Professionals.
  • Testing Professionals.
  • Mainframe Professionals.
  • Freshers.
  • Big Data Enthusiasts.
  • Software Architects, Engineers, and Developers.
  • Data Scientists and Analytics Professionals.

The Program

  1. Introduction to Big Data Hadoop and Spark.
  2. Introduction to Scala for Apache Spark.
  3. Functional Programming and OOPs Concepts in Scala.
  4. Deep Dive into Apache Spark Framework.
  5. Playing with Spark RDDs.
  6. DataFrames and Spark SQL.
  7. Machine Learning using Spark MLlib.
  8. Deep Dive into Spark MLlib.
  9. Understanding Apache Kafka and Apache Flume.
  10. Apache Spark Streaming — Processing Multiple Batches.
  11. Apache Spark Streaming — Data Sources.

What will you learn

  • Write Scala Programs to build Spark Applications.
  • Master the concepts of HDFS.
  • Understand Hadoop 2.x Architecture.
  • Understand Spark and its Ecosystem.
  • Implement Spark operations on Spark Shell.
  • Implement Spark applications on YARN (Hadoop).
  • Write Spark Applications using Spark RDD concepts.
  • Learn data ingestion using Sqoop.
  • Perform SQL queries using Spark SQL (see the sketch after this list).
  • Implement various machine learning algorithms, including clustering, with the Spark MLlib API.
  • Explain Kafka and its components.
  • Understand Flume and its components.
  • Integrate Kafka with real-time streaming systems like Flume.
  • Use Kafka to produce and consume messages.
  • Build Spark Streaming applications.
  • Process multiple batches in Spark Streaming.
  • Implement different streaming data sources.
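
To make the RDD and Spark SQL outcomes above concrete, here is a minimal sketch of the kind of Scala code such a course typically works with: an RDD word count and a Spark SQL query over a DataFrame. It is not taken from the Edureka material; the object name, the SparkSession settings, and the `people` data are illustrative assumptions (Spark 2.x or later on the classpath).

```scala
import org.apache.spark.sql.SparkSession

object SparkScalaSketch {
  def main(args: Array[String]): Unit = {
    // Illustrative local session; on a real cluster the same app would run on YARN.
    val spark = SparkSession.builder()
      .appName("spark-scala-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // RDD API: word count over a small in-memory collection
    val lines = spark.sparkContext.parallelize(Seq("spark and scala", "spark streaming"))
    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach(println)

    // DataFrame + Spark SQL: register a temp view and query it
    // (the "people" data is made up for illustration)
    val people = Seq(("Alice", 34), ("Bob", 29)).toDF("name", "age")
    people.createOrReplaceTempView("people")
    spark.sql("SELECT name FROM people WHERE age > 30").show()

    spark.stop()
  }
}
```

Local mode (`master("local[*]")`) is enough for experimenting on a laptop; submitting the same application to YARN, as listed in the outcomes above, only changes how the job is launched, not the code.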