Apache Kafka

Training Cost: $295.00
Training Type: Instructor-Based Online Training
Schedule: 24-March-2018, 7:30 AM PST | 9:30 AM EST | 7:00 PM IST

About The Course:
The Apache Kafka training is designed to give students the skills and knowledge needed to become an Apache Kafka professional. Kafka is a distributed, real-time message broker that lets applications publish and subscribe to streams of records. The course covers a wide range of topics, from the Kafka API and Kafka clusters to advanced material such as integrating Kafka with Hadoop, Spark, Storm, and Maven, including their installation and configuration.
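The publish/subscribe model described above can be sketched without any Kafka client library at all. The `MiniBroker` class and its in-memory dictionaries below are illustrative stand-ins, not part of any real Kafka API:

```python
# A minimal, library-agnostic sketch of publish/subscribe messaging.
# MiniBroker is a made-up name for illustration; a real Kafka broker
# persists messages durably and distributes them across a cluster.
from collections import defaultdict

class MiniBroker:
    def __init__(self):
        self.topics = defaultdict(list)       # topic name -> ordered message log
        self.subscribers = defaultdict(list)  # topic name -> subscriber callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        self.topics[topic].append(message)        # messages are appended in order
        for callback in self.subscribers[topic]:  # every subscriber sees every message
            callback(message)

broker = MiniBroker()
received = []
broker.subscribe("orders", received.append)
broker.publish("orders", {"id": 1, "item": "book"})
```

The key property the sketch preserves is decoupling: producers publish to a named topic without knowing who, if anyone, is subscribed.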

Audience & Pre-Requisites
  • The program is best suited for project managers, Big Data Hadoop developers, messaging and queuing system professionals.
  • The only pre-requisite is basic knowledge of Java programming language.

Why Take Apache Kafka Training Course?
Apache Kafka is a high-throughput, scalable messaging system and a powerful distributed streaming platform for working with enormous volumes of data. Because Kafka integrates easily with Hadoop, Spark, Storm, and similar tools, it is well suited to solving Big Data problems in messaging systems. Its scalability empowers a Kafka professional to manage hundreds of petabytes of data for a huge number of clients. As a leading technology in the messaging field, Kafka is driving a steep rise in job opportunities, and this training gives students the skills needed to work in challenging roles in the Kafka domain.

Objective Of The Course

  • Understanding Kafka, its features and components
  • Learning real-time Kafka streaming integration with Spark & Storm
  • Deploying a Kafka cluster on YARN and Hadoop
  • Introduction to the Kafka API
  • Storing records with Kafka in a fault-tolerant way
  • Solving Big Data issues in messaging systems
  • Implementing Twitter streaming with Kafka, Storm & Hadoop
  • Advanced learning of Kafka throughput, fault tolerance, scalability and durability
  • Project work using Kafka based on real-world business scenarios


Certification: COSO IT Certified
Curriculum

1. Fundamentals of Apache Kafka

  • Understanding the concepts of Apache Kafka, Kafka architecture, common use cases, and implementing Kafka on a single node.
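A single-node Kafka implementation like the one covered here typically needs only a few broker settings. The fragment below uses real `server.properties` keys, but the specific values (paths, ports) are illustrative defaults, not the course's exact configuration:

```properties
# Minimal illustrative server.properties for a single-node Kafka broker
broker.id=0
listeners=PLAINTEXT://localhost:9092
log.dirs=/tmp/kafka-logs
zookeeper.connect=localhost:2181
num.partitions=1
```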

2. Multi-Broker Implementation of Kafka & Deep Dive Into the Kafka Cluster

  • Learning the core Kafka terminology, deploying Kafka on a single node with ZooKeeper, adding replication in Kafka, etc.
  • Understanding Kafka consumers, working with partitioning and brokers, the terminology of Kafka writes, and Kafka's failure-handling scenarios.
  • Introduction to setting up a multi-node Kafka cluster, partition balancing and leadership balancing, expanding a cluster, administration commands, removing a broker, shutting down Kafka brokers and tasks, etc.
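The partitioning idea behind these topics can be sketched in a few lines: a record's key determines its partition, so all records with the same key stay in order on one broker. Kafka's default partitioner hashes keys with murmur2; the sketch below substitutes `md5` purely for illustration:

```python
# Simplified sketch of key-based partitioning. Kafka's real default
# partitioner uses a murmur2 hash; md5 stands in here for illustration.
import hashlib

def choose_partition(key: bytes, num_partitions: int) -> int:
    digest = hashlib.md5(key).digest()
    # Interpret the first 4 bytes as an integer, then wrap into range.
    return int.from_bytes(digest[:4], "big") % num_partitions

# The same key always maps to the same partition, which is what
# preserves per-key ordering even across a multi-broker cluster.
p1 = choose_partition(b"user-42", 6)
p2 = choose_partition(b"user-42", 6)
```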

3. Understanding Optimisation Of Kafka Operations & Performance 

  • Learning more about Kafka monitoring and common issues, performance tuning in Kafka, offsets, design and hardware monitoring, reading data from Kafka, and programming exercises that combine these Kafka concepts.
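Offsets, mentioned above, are central to reading data from Kafka: each consumer tracks its position in a partition's log and can commit it or rewind. The toy classes below are illustrative stand-ins for that mechanic, not a Kafka client API:

```python
# Toy sketch of consumer offsets: the consumer tracks its position in an
# ordered log, commits progress, and can seek back to reprocess records.
class MiniLog:
    def __init__(self, records):
        self.records = list(records)  # an ordered, append-only log

class MiniConsumer:
    def __init__(self, log):
        self.log = log
        self.position = 0   # next offset to read
        self.committed = 0  # last committed offset

    def poll(self, max_records=2):
        batch = self.log.records[self.position:self.position + max_records]
        self.position += len(batch)
        return batch

    def commit(self):
        self.committed = self.position

    def seek(self, offset):
        self.position = offset  # e.g. rewind to reprocess after a failure

log = MiniLog(["a", "b", "c", "d"])
consumer = MiniConsumer(log)
first = consumer.poll()            # reads "a", "b"
consumer.commit()                  # committed offset is now 2
second = consumer.poll()           # reads "c", "d"
consumer.seek(consumer.committed)  # rewind; "c", "d" would be re-read
```

Committed offsets are what let a restarted consumer resume where it left off instead of re-reading the whole log.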

4. Integration of Kafka With Hadoop, Storm & Flume

  • Understanding the Hadoop cluster and its integration with Kafka, understanding Apache Storm, deploying spouts and bolts, and using Kafka with a Storm-based spout.
  • Understanding why Apache Kafka is integrated with Apache Flume, and the steps for integrating Flume with Kafka as a source.
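Using Kafka as a Flume source, as described above, is configured in the Flume agent file. The property keys below are Flume's real Kafka-source settings; the agent, channel, and topic names are illustrative only:

```properties
# Illustrative Flume agent config with Kafka as a source
agent.sources = kafka-source
agent.channels = mem-channel

agent.sources.kafka-source.type = org.apache.flume.source.kafka.KafkaSource
agent.sources.kafka-source.kafka.bootstrap.servers = localhost:9092
agent.sources.kafka-source.kafka.topics = flume-in
agent.sources.kafka-source.channels = mem-channel

agent.channels.mem-channel.type = memory
```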

5. Learning About Kafka API And Producers & Consumers

  • In-depth learning of Kafka–Flume integration, deploying Kafka as a Flume sink and as a channel, along with an introduction to the PyKafka API and setting up a PyKafka environment.
  • Connecting to Kafka with PyKafka, writing Kafka producers and consumers, writing a consumer that reads messages from a topic, writing a JSON-based producer, writing a consumer that stores topic data in a file, etc.
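The JSON-producer and file-writing-consumer exercises above can be sketched in a library-agnostic way. A plain list stands in for the Kafka topic (a real exercise would use a PyKafka producer and consumer against a running broker); only the JSON serialization and file handling mirror the real workflow:

```python
# Library-agnostic sketch: serialize records as JSON on the producer
# side, then consume the topic and store its data in a file.
# The `topic` list is a stand-in for a real Kafka topic.
import json
import os
import tempfile

topic = []

def produce_json(record):
    # Producers send bytes, so records are serialized before publishing.
    topic.append(json.dumps(record).encode("utf-8"))

def consume_to_file(path):
    # A consumer reads every message, deserializes it, and persists it.
    with open(path, "w") as out:
        for message in topic:
            record = json.loads(message.decode("utf-8"))
            out.write(f"{record}\n")

produce_json({"user": "alice", "action": "login"})
produce_json({"user": "bob", "action": "logout"})
path = os.path.join(tempfile.gettempdir(), "topic_dump.txt")
consume_to_file(path)
```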

6. Kafka-Based Project