
Apache Kafka Certification Training ( L048 )

4.5 (25,859 student ratings)

IQ Training’s Apache Kafka course offers an in-depth understanding of the Kafka APIs, creating Kafka clusters, architecture, configuration, and installation, as well as integration with Hadoop, Spark, and Storm, through real-time use cases.

 

Course Price: ₹18,418 (₹20,464, 10% off)

Available: Live Instructor-led | Self-Paced

Think Bigger Advantage

Live Online Classes

All our classes are live, instructor-led online sessions. You can attend from the comfort of your own place by logging in to our classes.

LMS (Learning Management System)

The LMS helps you organize all your training material and session videos so you can review them at a later date. You can access the LMS anytime to review your completed classes. If you miss a class, you can review the recording of the missed class in the LMS.

Flexible Schedule

If for any reason you cannot attend a class, we can enroll you in the next available batch. We assure flexibility in class schedules.

Lifetime Access to Learning Platform

You will get free lifetime access to the LMS (Learning Management System), including all videos, classroom assignments, quizzes, and projects. You will also get free video sessions for life.

Highest Completion Rate

We have the highest course completion rate in the industry. If you miss a class, you can attend the missed class in a different batch. We assure you the best possible training to help you succeed.

Certificate of Completion

We provide an industry-recognized certificate of course completion. This certificate may also help you get your training expenses reimbursed by your company.

Training Schedule
Batch Start Date | Availability | Days of Training | Batch Type | Timings
28-Mar-2020 | Available | Sat & Sun (5 weeks) | Weekend | 11:00 AM - 02:00 PM (EST)
17-Apr-2020 | Available | Sat & Sun (5 weeks) | Weekend | 09:30 PM - 12:30 AM (EST)
 
 
 
 
 

Course Curriculum

Goal: In this module, you will understand where Kafka fits in the Big Data space, and learn Kafka architecture. In addition, you will learn about the Kafka cluster, its components, and how to configure a cluster.

Skills:

  • Kafka Concepts
  • Kafka Installation
  • Configuring Kafka Cluster

Objectives: 

In this module, you will be able to: 

  • Explain what Big Data is
  • Understand the importance of Big Data analytics
  • Describe the need for Kafka
  • Understand the role of each Kafka component
  • Know the role of ZooKeeper
  • Install ZooKeeper and Kafka
  • Classify the different types of Kafka clusters
  • Work with a Single Node-Single Broker Cluster

Topics:

  • Introduction to Big Data
  • Big Data Analytics
  • Need for Kafka
  • What is Kafka? 
  • Kafka Features
  • Kafka Concepts
  • Kafka Architecture
  • Kafka Components 
  • ZooKeeper
  • Where is Kafka Used?
  • Kafka Installation
  • Kafka Cluster 
  • Types of Kafka Clusters
  • Configuring Single Node Single Broker Cluster

Hands-on:

  • Kafka Installation
  • Implementing Single Node-Single Broker Cluster

Goal: In this module, you will work with the Kafka Producer APIs. Kafka producers send records to topics; these records are sometimes referred to as messages.

 

Skills:

  • Configure Kafka Producer
  • Constructing Kafka Producer
  • Kafka Producer APIs
  • Handling Partitions

 

Objectives:

In this module, you’ll be able to:

  • Construct a Kafka Producer
  • Send messages to Kafka
  • Send messages Synchronously & Asynchronously
  • Configure Producers
  • Serialize Using Apache Avro
  • Create & handle Partitions

 

Topics:

  • Configuring Single Node Multi Broker Cluster
  • Constructing a Kafka Producer
  • Sending a Message to Kafka
  • Producing Keyed and Non-Keyed Messages 
  • Sending a Message Synchronously & Asynchronously
  • Configuring Producers
  • Serializers
  • Serializing Using Apache Avro
  • Partitions
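The partitioning topic above can be sketched without a broker. Kafka's default partitioner hashes the serialized key (with murmur2) modulo the partition count; the sketch below substitutes an MD5-based hash purely for illustration, so the exact partition numbers differ from a real broker's, but the invariant is the same: equal keys always land in the same partition, which is what preserves per-key ordering.

```python
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Simplified stand-in for Kafka's default partitioner.

    Real Kafka hashes the key bytes with murmur2; MD5 is used here
    only to get a stable hash for the sketch. The guarantee shown is
    identical: the same key always maps to the same partition.
    """
    h = int.from_bytes(hashlib.md5(key).digest()[:4], "big")
    return h % num_partitions

# Equal keys are always routed to the same partition.
assert partition_for(b"user-42", 6) == partition_for(b"user-42", 6)

partitions = {k: partition_for(k, 6) for k in [b"a", b"b", b"user-42"]}
print(partitions)
```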

 

Hands-On:

  • Working with Single Node Multi Broker Cluster
  • Creating a Kafka Producer
  • Configuring a Kafka Producer
  • Sending a Message Synchronously & Asynchronously
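The synchronous-versus-asynchronous distinction in the hands-on above comes down to what you do with the future that a send returns. This broker-free sketch imitates that contract with a plain thread pool (`fake_send` and the executor are stand-ins, not the real `KafkaProducer` API): a synchronous send blocks on the result, while an asynchronous send registers a callback and keeps producing.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for a broker round-trip: a real producer's send() likewise
# returns a future that resolves to record metadata.
def fake_send(topic: str, value: str) -> dict:
    return {"topic": topic, "partition": 0, "value": value}

executor = ThreadPoolExecutor(max_workers=2)

# Synchronous style: block until the "broker" acknowledges.
future = executor.submit(fake_send, "orders", "order-1")
metadata = future.result()          # blocks; would raise on failure
print("sync ack:", metadata)

# Asynchronous style: attach a callback and move on.
acks = []
def on_done(fut):
    acks.append(fut.result())

executor.submit(fake_send, "orders", "order-2").add_done_callback(on_done)
executor.shutdown(wait=True)        # drain in-flight sends, like flush()
print("async acks:", acks)
```

Synchronous sends give per-record error handling at the cost of throughput; asynchronous sends batch far better, which is why the callback style dominates in production code.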

Goal: In this module, you will learn how to construct a Kafka consumer, process messages from Kafka with the consumer, run a Kafka consumer, and subscribe to topics. Applications that need to read data from Kafka use a Kafka consumer to subscribe to topics and receive messages from them.

Skills:

  • Configure Kafka Consumer
  • Kafka Consumer API
  • Constructing Kafka Consumer

Objectives: 

In this module, you should be able to:

  • Perform Operations on Kafka
  • Define Kafka Consumer and Consumer Groups
  • Explain how Partition Rebalance occurs 
  • Describe how Partitions are assigned to Kafka Broker
  • Configure Kafka Consumer
  • Create a Kafka consumer and subscribe to Topics
  • Describe & implement different Types of Commit
  • Deserialize the received messages

Topics:

  • Consumers and Consumer Groups
  • Standalone Consumer
  • Consumer Groups and Partition Rebalance
  • Creating a Kafka Consumer
  • Subscribing to Topics
  • The Poll Loop
  • Configuring Consumers
  • Commits and Offsets
  • Rebalance Listeners
  • Consuming Records with Specific Offsets
  • Deserializers

Hands-On:

  • Creating a Kafka Consumer
  • Configuring a Kafka Consumer
  • Working with Offsets
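Commits and offsets can be illustrated without a broker. In this in-memory sketch (a stand-in, not the real consumer API), a "consumer" polls records from a partition's log and commits the next offset to read; after a crash, it resumes from the last committed offset rather than the last processed one, which is exactly the at-least-once reprocessing window the module discusses.

```python
log = [f"msg-{i}" for i in range(10)]   # one partition's record log
committed = 0                            # last committed offset

def poll(offset, max_records=3):
    """Return up to max_records starting at offset, like one poll() call."""
    return log[offset:offset + max_records]

# First run: process two batches but commit only after the first.
position = committed
batch = poll(position)
position += len(batch)
committed = position                     # commit offset 3
batch = poll(position)
position += len(batch)                   # processed through offset 6...
# ...then the consumer crashes before committing.

# Restart: we resume from the committed offset, so offsets 3-5 are
# redelivered -- the at-least-once reprocessing window.
resumed = poll(committed)
print("resume at", committed, "->", resumed)
```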

Goal: For handling real-time data feeds, Apache Kafka provides a unified, high-throughput, low-latency platform. In this module, you will learn how to tune Kafka to meet your high-performance needs.

 

Skills:

  • Kafka APIs
  • Kafka Storage 
  • Configure Broker

 

Objectives: 

After completion of this course module, you will be able to:

  • Understand Kafka Internals
  • Explain how Replication works in Kafka
  • Differentiate between In-Sync and Out-of-Sync Replicas
  • Understand the Partition Allocation
  • Classify and Describe Requests in Kafka
  • Configure the Producer, Broker, and Consumer for a Reliable System
  • Validate System Reliabilities
  • Configure Kafka for Performance Tuning

 

Topics:

  • Cluster Membership
  • The Controller
  • Replication
  • Request Processing
  • Physical Storage
  • Reliability 
  • Broker Configuration
  • Using Producers in a Reliable System
  • Using Consumers in a Reliable System
  • Validating System Reliability
  • Performance Tuning in Kafka

 

Hands-On:

  • Create a topic with 3 partitions and a replication factor of 3, and execute it on a multi-broker cluster
  • Demonstrate fault tolerance by shutting down one broker and serving its partitions from another broker
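The fault-tolerance exercise can be sketched in miniature. This toy model (illustrative Python, not Kafka code) tracks one partition with replication factor 3: when the leader's broker is shut down, a new leader is elected from the in-sync replica set, so the partition keeps serving.

```python
# Toy model of one partition with replication factor 3.
replicas = [1, 2, 3]        # broker ids hosting a replica
leader = 1
isr = {1, 2, 3}             # in-sync replica set
alive = {1, 2, 3}

def shut_down(broker):
    """Fail a broker; elect a new leader from the ISR if needed."""
    global leader
    alive.discard(broker)
    isr.discard(broker)
    if leader == broker:
        leader = min(isr)   # real Kafka also elects from the ISR,
                            # though not necessarily the lowest id

shut_down(1)
print("new leader:", leader, "isr:", sorted(isr))
# The partition keeps serving as long as the ISR is non-empty.
assert leader in alive
```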

Goal: To maintain load balance, a Kafka cluster typically consists of multiple brokers, and ZooKeeper is used for coordinating and managing them. In this module, you will learn about Kafka multi-cluster architecture, Kafka brokers, topics, partitions, consumer groups, mirroring, and ZooKeeper coordination.

 

Skills: 

  • Administer Kafka

 

Objectives:

 

In this module, you will be able to:

  • Understand Use Cases of Cross-Cluster Mirroring
  • Learn Multi-cluster Architectures
  • Explain Apache Kafka’s MirrorMaker
  • Perform Topic Operations
  • Understand Consumer Groups
  • Describe Dynamic Configuration Changes
  • Learn Partition Management
  • Understand Consuming and Producing
  • Explain Unsafe Operations

 

Topics:

  • Use Cases - Cross-Cluster Mirroring
  • Multi-Cluster Architectures
  • Apache Kafka’s MirrorMaker
  • Other Cross-Cluster Mirroring Solutions
  • Topic Operations
  • Consumer Groups
  • Dynamic Configuration Changes
  • Partition Management
  • Consuming and Producing
  • Unsafe Operations

 

Hands-on:

  • Topic Operations
  • Consumer Group Operations
  • Partition Operations
  • Consumer and Producer Operations
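Partition management starts with how replicas are spread over brokers. As a rough sketch (real Kafka's assignment is also rack-aware and staggers follower placement), round-robin assignment puts each partition's leader on the next broker and its followers on the brokers after it:

```python
def assign_replicas(num_partitions, brokers, replication_factor):
    """Round-robin sketch of spreading partition replicas over brokers.

    The goals match real Kafka's assignment: every broker leads about
    the same number of partitions, and no broker holds two copies of
    the same partition.
    """
    assignment = {}
    n = len(brokers)
    for p in range(num_partitions):
        assignment[p] = [brokers[(p + r) % n] for r in range(replication_factor)]
    return assignment

layout = assign_replicas(num_partitions=6, brokers=[101, 102, 103], replication_factor=2)
for partition, replica_list in layout.items():
    print(f"partition {partition}: leader={replica_list[0]} followers={replica_list[1:]}")
```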

Goal: In this module, you will learn in depth about the Kafka Connect API and Kafka monitoring. Kafka Connect is a scalable tool for reliably streaming data between Apache Kafka and other systems.

 

Skills: 

  • Kafka Connect
  • Metrics Concepts
  • Monitoring Kafka

Objectives: 

 

After completion of this course module, you should be able to:

  • Explain the Metrics of Kafka Monitoring
  • Understand Kafka Connect
  • Build Data pipelines using Kafka Connect
  • Understand when to use Kafka Connect vs Consumer/Producer API 
  • Perform File source and sink using Kafka Connect

 

Topics:

  • Considerations When Building Data Pipelines
  • Metric Basics
  • Kafka Broker Metrics
  • Client Monitoring
  • Lag Monitoring
  • End-to-End Monitoring
  • Kafka Connect
  • When to Use Kafka Connect?
  • Kafka Connect Properties

 

Hands-on:

  • Kafka Connect
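A minimal Kafka Connect example is the file source connector that ships with Kafka. The sketch below only builds the JSON payload you would submit to a Connect worker's REST API; the file path, connector name, and topic are illustrative placeholders, and an actual run needs a Connect worker listening on the usual port.

```python
import json

# Config for the FileStreamSource connector bundled with Kafka.
# The connector name, file path, and topic are placeholders.
connector = {
    "name": "file-source-demo",
    "config": {
        "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
        "tasks.max": "1",
        "file": "/tmp/input.txt",
        "topic": "file-lines",
    },
}

payload = json.dumps(connector, indent=2)
print(payload)
# With a Connect worker running (default port 8083), you would submit it:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @connector.json http://localhost:8083/connectors
```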

Goal: In this module, you will learn about the Kafka Streams API. Kafka Streams is a client library for building mission-critical real-time applications and microservices, where the input and/or output data is stored in Kafka clusters.

 

Skills: 

  • Stream Processing using Kafka

Objectives:

In this course module, you should be able to:

  • Describe What is Stream Processing
  • Learn Different Types of Programming Paradigms
  • Describe Stream Processing Design Patterns
  • Explain Kafka Streams & Kafka Streams API

Topics:

  • Stream Processing
  • Stream-Processing Concepts
  • Stream-Processing Design Patterns
  • Kafka Streams by Example
  • Kafka Streams: Architecture Overview

Hands-on:

  • Kafka Streams
  • Word Count Stream Processing
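The word-count exercise is the canonical Kafka Streams example. Stripped of the broker, its core is a stateful aggregation over a stream of lines; this plain-Python sketch mirrors the flatMap, groupBy, and count steps that the Streams DSL expresses.

```python
from collections import Counter

def word_count(lines):
    """Split each line into words, group by word, and count -- the same
    shape as the Kafka Streams WordCount topology, minus the broker."""
    counts = Counter()
    for line in lines:                 # records arriving on the input topic
        for word in line.lower().split():
            counts[word] += 1          # update the state store
    return counts

stream = ["hello kafka", "hello streams", "kafka streams kafka"]
print(dict(word_count(stream)))
# In Kafka Streams, the running counts would be emitted to an output
# topic as a changelog instead of returned at the end.
```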

Goal: In this module, you will learn about Apache Hadoop, Hadoop architecture, Apache Storm, Storm configuration, and the Spark ecosystem. In addition, you will learn how to configure a Spark cluster and how to integrate Kafka with Hadoop, Storm, and Spark.

 

Skills: 

  • Kafka Integration with Hadoop
  • Kafka Integration with Storm
  • Kafka Integration with Spark

Objectives:

After completion of this course module, you will be able to:

  • Understand What is Hadoop
  • Explain Hadoop 2.x Core Components
  • Integrate Kafka with Hadoop
  • Understand What is Apache Storm
  • Explain Storm Components
  • Integrate Kafka with Storm
  • Understand What is Spark
  • Describe RDDs
  • Explain Spark Components
  • Integrate Kafka with Spark

Topics:

  • Apache Hadoop Basics
  • Hadoop Configuration
  • Kafka Integration with Hadoop
  • Apache Storm Basics
  • Configuration of Storm 
  • Integration of Kafka with Storm
  • Apache Spark Basics
  • Spark Configuration
  • Kafka Integration with Spark

Hands-On:

  • Kafka integration with Hadoop
  • Kafka integration with Storm
  • Kafka integration with Spark

Goal: Here you will learn how to integrate Kafka with Flume, Cassandra, and Talend.

 

Skills:

  • Kafka Integration with Flume
  • Kafka Integration with Cassandra
  • Kafka Integration with Talend

 

Objectives:

After completion of this course module, you will be able to:

  • Understand Flume
  • Explain Flume Architecture and its Components
  • Setup a Flume Agent
  • Integrate Kafka with Flume
  • Understand Cassandra
  • Learn Cassandra Database Elements
  • Create a Keyspace in Cassandra
  • Integrate Kafka with Cassandra
  • Understand Talend
  • Create Talend Jobs
  • Integrate Kafka with Talend

Topics:

  • Flume Basics
  • Integration of Kafka with Flume
  • Cassandra Basics, such as KeySpace and Table Creation
  • Integration of Kafka with Cassandra
  • Talend Basics
  • Integration of Kafka with Talend

Hands-On:

  • Kafka demo with Flume
  • Kafka demo with Cassandra
  • Kafka demo with Talend

Goal: In this module, you will work on a project that gathers messages from multiple sources.

 

Scenario:

In the e-commerce industry, you must have seen how frequently catalogs change. The most typical problem these companies face is: "How do we keep our inventory and prices consistent?"

 

Prices appear in various places on Amazon, Flipkart, and other e-commerce sites: the search page, the product description page, and ads on Facebook or Google. You will often find mismatches in the price and availability of products. From the user's point of view this is very disappointing: they spend time finding a better product and, in the end, may not purchase it simply because of the inconsistency.

 

Here you have to build a system that is consistent by design. For example, if you receive product feeds, whether from a flat file or an event stream, you have to thoroughly check that you don't lose any events related to a product, especially inventory and price.

 

Price and availability should always be consistent, because the product might be sold out, or the seller may no longer want to sell it, among other reasons. Attributes like name and description, however, cause much less trouble if they are not updated on time.

 

Problem Statement

You are given a set of sample products. Once products arrive at the consumer, you have to consume them and push them to Cassandra/MySQL. You need to save the fields listed below in Cassandra.

  1. PogId
  2. Supc
  3. Brand
  4. Description
  5. Size
  6. Category
  7. Sub Category
  8. Country
  9. Seller Code

 

In MySQL, you have to store

  1. PogId
  2. Supc
  3. Price
  4. Quantity

This Project enables you to gain Hands-On experience on the concepts that you have learned in this Apache Kafka course.

You can email the solution to our team within 2 weeks from the course completion date. IQ Training will evaluate the solution and award a certificate with a performance-based grade.

Problem Statement:

You are working for the website techreview.com, which provides reviews of different technologies. The company has decided to add a new feature that lets users compare the popularity or latest trend of multiple technologies based on Twitter feeds, and they want this comparison to happen in real time. So, as a big data developer at the company, you have to implement the following:

  • Near real-time streaming of data from Twitter to display the last minute's count of people tweeting about a particular technology.
  • Store the Twitter count data in Cassandra.
Like the course? Enroll Now

Structure your learning and get a certificate to prove it.

Course Details

After the completion of Real-Time Analytics with Apache Kafka course at IQ Training, you should be able to:

  • Learn Kafka and its components, and set up an end-to-end Kafka cluster along with YARN and Hadoop clusters
  • Combine Kafka with real-time streaming systems like Spark and Storm.
  • Describe the basic and advanced features which are involved in designing and developing a high throughput messaging system
  • Use Kafka to produce and consume messages from different sources including real-time streaming sources like Twitter
  • Get insight into the Kafka APIs
  • Understand Kafka Stream APIs
  • Work on a real-life project, Implementing Twitter Streaming with Kafka, Hadoop, Flume & Storm.

This course is crafted for professionals who want to learn Kafka techniques and wish to apply them to Big Data. It is highly recommended for:

Developers who aspire to accelerate their careers as a "Kafka Big Data Developer"

  • Minimum RAM: 4 GB (8 GB suggested)
  • Minimum free disk space: 25 GB
  • Processor: i3 or above
  • 64-bit operating system
  • Participant's machine must support a 64-bit VirtualBox guest image.

We will help you set up a virtual machine on your system with local access. A detailed guide for setting up the environment is provided in the LMS. For any doubts, the 24x7 support team will assist you promptly. The virtual machine can be installed on a Mac or Windows machine.

Case Study 1:

Stock Profit Ltd, India's first discount broker, offers unlimited online share trading and zero brokerage on equity cash. Design a real-time system to capture live stock data from a source (e.g., yahoo.com) and calculate profit and loss for customers subscribed to the tool. Finally, store the result in HDFS.

Case Study 2:

You are an SEO specialist at a company, and management has emailed you the task of finding the top trending keywords. You have to write a topology that consumes keywords from Kafka. You are given a file containing different search keywords across multiple verticals.

Case Study 3:

You have to build a system that is consistent by design. For example, if you receive product feeds, whether through a flat file or any other event stream, you have to make sure you don't lose any events related to a product, especially inventory and price, as they are very important.

Price and availability should always be consistent, because the product might be sold out, or the seller may no longer want to sell it, among other reasons. Attributes like name and description, however, cause much less trouble if they are not updated on time.

Case Study 4:

John wants to build an e-commerce portal like Amazon, Flipkart, or eBay. He will ask sellers and local brands to upload all their products to the portal so that users can visit it online and make purchases. John doesn't have much knowledge of such systems, so he has hired you to build a reliable and scalable solution where buyers and sellers can easily update their products on the site.

Apache Kafka Certification Training Certificate

Apache Kafka Certification Training Reviews

Total number of reviews: 25,859

Aggregate review score: 4.5

Course completion rate: 80%


Apache Kafka Certification Training FAQs

You will never miss a class at IQ Online Training! You can choose either of the two options:

  1. View the recorded session of the class available in your LMS or
  2. You can attend the missed session in any other live batch.

After enrollment, LMS access is provided instantly and is available for a lifetime; it includes the complete set of previous class recordings, PPTs, PDFs, and assignments. You can start learning right away.

You have lifetime access to the support team, which will help you resolve queries during and after the course.

Yes. Once you have enrolled in the course, access to the course material is available for a lifetime.

You can call our support numbers listed on the site, or email us at info@iqtrainings.com.

You can view in-depth sample class recordings before enrolling, so you experience the complete learning process with our experts instead of a demo session.

All instructors are industry experts with a minimum of 10-12 years of relevant IT experience. They are subject matter experts, well trained to provide an excellent learning experience to participants.

Drop us a Query

+91 97846 54326

Available 24x7 for your queries