This Professional Certificate is for anyone who wants to develop the job-ready skills and portfolio needed for an entry-level data engineer position. Throughout the self-paced online courses, you will immerse yourself in the role of a data engineer and acquire the essential skills you need to work with a range of tools and databases to design, deploy, and manage structured and unstructured data.
By the end of this Professional Certificate, you will be able to explain and perform the key tasks required in a data engineering role. You will use the Python programming language and Linux/UNIX shell scripts to extract, transform, and load (ETL) data. You will work with relational databases (RDBMS) and query data using SQL statements, and you will use NoSQL databases to manage unstructured data. You will be introduced to Big Data and work with Big Data engines like Hadoop and Spark. You will gain experience creating Data Warehouses and using Business Intelligence (BI) tools to analyze data and extract insights.
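The extract-transform-load pattern described above can be sketched in a few lines of Python. This is a minimal illustration only; the source data, field names, and `staff` table are hypothetical stand-ins for whatever a real pipeline would read and write:

```python
import csv
import io
import sqlite3

# Hypothetical raw CSV data standing in for a source file
raw = io.StringIO("id,name,salary\n1,ada,100000\n2,linus,90000\n")

# Extract: read rows from the CSV source
rows = list(csv.DictReader(raw))

# Transform: cast types and clean up the name field
for r in rows:
    r["id"] = int(r["id"])
    r["salary"] = int(r["salary"])
    r["name"] = r["name"].strip().title()

# Load: insert the cleaned rows into a SQLite table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staff (id INTEGER, name TEXT, salary INTEGER)")
conn.executemany("INSERT INTO staff VALUES (:id, :name, :salary)", rows)
print(conn.execute("SELECT COUNT(*) FROM staff").fetchone()[0])  # → 2
```

In practice, the courses apply this same extract/transform/load structure to larger sources and targets, with tools like Airflow orchestrating the steps.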
- Introduction to Data Engineering.
- Python for Data Science, AI & Development.
- Python Project for Data Engineering.
- Introduction to Relational Databases (RDBMS).
- Databases and SQL for Data Science with Python.
- Hands-on Introduction to Linux Commands and Shell Scripting.
- Relational Database Administration (DBA).
- ETL and Data Pipelines with Shell, Airflow and Kafka.
- Getting Started with Data Warehousing and BI Analytics.
- Introduction to NoSQL Databases.
- Introduction to Big Data with Spark and Hadoop.
- Data Engineering and Machine Learning using Spark.
- Data Engineering Capstone Project.
What you will learn
- Design, create, and manage relational databases and apply database administration (DBA) concepts to RDBMSs such as MySQL, PostgreSQL & IBM Db2.
- Demonstrate working knowledge of NoSQL & Big Data using MongoDB, Cassandra, Cloudant, Hadoop, Apache Spark, Spark SQL, Spark ML, Spark Streaming.
- Develop and execute SQL queries using SELECT, INSERT, UPDATE, and DELETE statements, database functions, stored procedures, nested queries, and JOINs.
- Implement ETL & Data Pipelines with Bash, Airflow & Kafka; architect, populate, deploy Data Warehouses; create BI reports & interactive dashboards.
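The SQL skills listed above — SELECT statements, JOINs, and nested queries — can be tried out directly from Python. The sketch below uses SQLite with a hypothetical employee/department schema invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical schema: a departments table and an employees table
conn.executescript("""
CREATE TABLE dept (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE emp  (id INTEGER PRIMARY KEY, name TEXT,
                   dept_id INTEGER, salary INTEGER);
INSERT INTO dept VALUES (1, 'Data'), (2, 'Ops');
INSERT INTO emp  VALUES (10, 'Ada', 1, 120000),
                        (11, 'Grace', 1, 110000),
                        (12, 'Linus', 2, 100000);
""")

# JOIN: list each employee alongside their department name
join_rows = conn.execute("""
    SELECT emp.name, dept.name
    FROM emp JOIN dept ON emp.dept_id = dept.id
    ORDER BY emp.name
""").fetchall()

# Nested query: employees earning above the company-wide average salary
above_avg = conn.execute("""
    SELECT name FROM emp
    WHERE salary > (SELECT AVG(salary) FROM emp)
""").fetchall()
print(above_avg)  # the average is 110000, so only Ada qualifies
```

The same statements run unchanged (or nearly so) against the RDBMSs covered in the program, such as MySQL, PostgreSQL, and IBM Db2.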