WHAT YOU WILL LEARN
Write SQL queries to work with relational databases, including CREATE TABLE, SELECT, INSERT, UPDATE, DELETE, ORDER BY, JOINs, functions, and more (a minimal SQL sketch follows this list)
Execute commonly used Linux commands; automate Extract, Transform and Load (ETL) jobs and data pipelines using BASH scripts, Apache Airflow & Kafka
Design Data Warehouses using star and snowflake schemas, load and verify data in staging areas, and build cubes, rollups, and materialized views/tables
Analyze data in warehouses with interactive reports and dashboards using BI tools such as Cognos Analytics
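To make the SQL statements in the first outcome above concrete, here is a minimal sketch using Python's built-in sqlite3 module, chosen only so the example runs anywhere (the courses themselves use PostgreSQL, MySQL and Db2). The table and column names are invented for illustration and are not taken from the course material.

import sqlite3

# In-memory database purely for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# CREATE TABLE: a small pair of related tables (names are hypothetical).
cur.execute("CREATE TABLE department (dept_id INTEGER PRIMARY KEY, dept_name TEXT)")
cur.execute("""
    CREATE TABLE employee (
        emp_id   INTEGER PRIMARY KEY,
        emp_name TEXT,
        dept_id  INTEGER REFERENCES department(dept_id)
    )
""")

# INSERT sample rows.
cur.executemany("INSERT INTO department VALUES (?, ?)",
                [(1, "Engineering"), (2, "Analytics")])
cur.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                [(10, "Aisha", 1), (11, "Bruno", 2), (12, "Chen", 1)])

# SELECT with a JOIN and ORDER BY.
cur.execute("""
    SELECT e.emp_name, d.dept_name
    FROM employee e
    JOIN department d ON d.dept_id = e.dept_id
    ORDER BY d.dept_name, e.emp_name
""")
for row in cur.fetchall():
    print(row)

conn.close()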
SKILLS YOU WILL GAIN
- Shell Script
- Bash (Unix Shell)
- Linux
- Cloud Databases
- Python Programming
- IPython
- Relational Database Management System (RDBMS)
- SQL
- Extraction, Transformation, and Loading (ETL)
- Apache Kafka
- Apache Airflow
- Data Pipelines
About this Specialization
Professionals with SQL, ETL, Enterprise Data Warehousing (EDW), Business Intelligence (BI) and Data Analysis skills are in great demand. This Specialization is designed to provide career-relevant knowledge and skills for anyone wanting to pursue a job role in domains such as Data Engineering, Data Management, BI or Data Analytics. The program consists of four online courses. In the first course you learn the basics of SQL and how to query relational databases with this powerful language. Next you learn to use essential Linux commands and create basic shell scripts. You continue your journey by learning to build and automate ETL, ELT and data pipelines using BASH scripts, Apache Airflow and Apache Kafka. In the final course you learn about Data Lakes and Data Marts as well as work with Data Warehouses. You also create interactive reports and dashboards to derive insights from data in your warehouse. Note that this specialization places a significant emphasis on hands-on practice using the real tools that data professionals work with. Every course has numerous hands-on labs as well as a course project. While you will benefit from some prior programming experience, it is not strictly required. The only prerequisites for this specialization are basic computer and data literacy, and a passion for self-learning online.
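As a rough illustration of the pipeline automation described above, the sketch below defines a minimal Apache Airflow DAG, assuming Airflow 2.4 or later; the DAG id, task names and bash commands are placeholders rather than anything taken from the courses. It chains three BashOperator tasks into an extract, transform, load sequence that runs daily.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A toy three-step ETL pipeline; the bash commands are placeholders.
with DAG(
    dag_id="toy_etl_pipeline",   # hypothetical name, not from the course
    start_date=datetime(2024, 1, 1),
    schedule="@daily",           # run once per day
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo 'extracting data'")
    transform = BashOperator(task_id="transform", bash_command="echo 'transforming data'")
    load = BashOperator(task_id="load", bash_command="echo 'loading data'")

    # Task dependencies: extract runs first, then transform, then load.
    extract >> transform >> load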
Applied Learning Project
Each course provides plenty of practice through hands-on labs and projects in cloud-based environments with real tools. Hands-on exercises include: running Linux commands and pipes, creating shell scripts, scheduling jobs using cron, building ETL and data pipelines, creating & monitoring Airflow DAGs, working with streaming data using Kafka, designing data warehouses with star and snowflake schemas, verifying data quality, loading staging and production warehouses, writing SQL queries and joins with PostgreSQL, MySQL & DB2 databases, developing cubes, rollups and materialized views/tables, creating interactive reports & dashboards, and analyzing warehouse data using BI tools like Cognos Analytics.
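To illustrate the star-schema pattern mentioned in the exercises above, here is a minimal sketch, again using Python's sqlite3 module for portability (the course projects use PostgreSQL, MySQL and Db2; the fact and dimension tables below are invented for illustration). A fact table of sales rows is joined to date and store dimensions and aggregated, which is the basic pattern behind cubes and rollups; in warehouses that support it, GROUP BY ROLLUP or CUBE would add subtotal levels as well.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables (hypothetical star schema, not from the course project).
cur.execute("CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")
cur.execute("CREATE TABLE dim_store (store_id INTEGER PRIMARY KEY, city TEXT)")

# Fact table referencing the dimensions.
cur.execute("""
    CREATE TABLE fact_sales (
        date_id  INTEGER REFERENCES dim_date(date_id),
        store_id INTEGER REFERENCES dim_store(store_id),
        amount   REAL
    )
""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)", [(1, 2024, 1), (2, 2024, 2)])
cur.executemany("INSERT INTO dim_store VALUES (?, ?)", [(1, "Toronto"), (2, "Austin")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 100.0), (1, 2, 150.0), (2, 1, 120.0), (2, 2, 90.0)])

# Aggregate the fact table by dimension attributes -- the pattern behind cubes
# and rollups. In PostgreSQL or Db2, GROUP BY ROLLUP (d.year, d.month, s.city)
# would also produce subtotal rows.
cur.execute("""
    SELECT d.year, d.month, s.city, SUM(f.amount) AS total_sales
    FROM fact_sales f
    JOIN dim_date  d ON d.date_id  = f.date_id
    JOIN dim_store s ON s.store_id = f.store_id
    GROUP BY d.year, d.month, s.city
    ORDER BY d.year, d.month, s.city
""")
for row in cur.fetchall():
    print(row)

conn.close()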
Getting Started with Data Warehousing and BI Analytics
Data is one of an organization’s most valuable commodities. But how can organizations best use their data? And how does the organization determine which data is the most recent, accurate, and useful for business decision making at the highest level?
After taking this course, you will be able to describe different kinds of repositories, including data marts, data lakes, and data reservoirs, and explain their functions and uses. A data warehouse is a large repository of data that has been cleaned to a consistent quality. Not all data repositories are used in the same way or require the same rigor when choosing what data to store. Data warehouses are designed to enable rapid business decision making through accurate and flexible reporting and data analysis. A data warehouse is one of the most fundamental business intelligence tools in use today, and one that successful Data Engineers must understand. You will also be able to describe how data warehouses serve as a single source of truth for an organization's current and historical data. Organizations create data value using analytics and business intelligence applications. Now that you have experienced the ELT process, you will gain hands-on analytics and business intelligence experience using IBM Cognos and its reporting and dashboarding features, including visualization capabilities. Finally, you will complete a shareable final project that enables you to demonstrate the skills you acquired in each module.