Working with Apache Kafka

Apache Kafka is a distributed platform for building real-time data pipelines. Scalability, fault tolerance, high throughput, and fluid integrations are key hallmarks that make Kafka an integral part of many enterprise data architectures. In this lab-intensive two-day course, you will learn how to use Kafka to build streaming solutions.

    Sep 16 2021

    Date: 09/16/2021 - 09/17/2021 (Thursday - Friday) | 10:00 AM - 6:00 PM (EST)
    Location: ONLINE (Virtual Classroom Live)
    Delivery Format: Virtual Classroom Live

    Nov 18 2021

    Date: 11/18/2021 - 11/19/2021 (Thursday - Friday) | 10:00 AM - 6:00 PM (EST)
    Location: ONLINE (Virtual Classroom Live)
    Delivery Format: Virtual Classroom Live

Introduction to Streaming Systems

  • Fast data
  • Streaming architecture
  • Lambda architecture
  • Message queues
  • Streaming processors
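To give a first taste of the streaming style of computation, the JDK's own Stream API can express the same map/filter pipeline shape that dedicated streaming processors apply to unbounded event streams. This is an illustrative sketch only, not Kafka code; the sensor readings, threshold, and alert format are invented for the example:

```java
import java.util.List;
import java.util.stream.Collectors;

// Toy illustration of a streaming transformation pipeline using the JDK's
// Stream API. Real streaming processors (Kafka Streams, Flink, etc.) apply
// the same kind of map/filter operations to unbounded event streams.
class StreamDemo {
    static List<String> alerts(List<Integer> readings, int threshold) {
        return readings.stream()
                .filter(r -> r > threshold)          // drop normal readings
                .map(r -> "ALERT: reading=" + r)     // transform to alert events
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // prints [ALERT: reading=95, ALERT: reading=120]
        System.out.println(alerts(List.of(10, 95, 42, 120), 90));
    }
}
```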

Introduction to Kafka

  • Architecture
  • Comparing Kafka with other queue systems (JMS / MQ)
  • Kafka concepts: Messages, Topics, Partitions, Brokers, Producers, and commit logs
  • Kafka and Zookeeper
  • Producing messages
  • Consuming messages (Consumers and Consumer Groups)
  • Message retention
  • Scaling Kafka
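One concept from the list above worth a concrete sketch is how messages land in partitions: the producer hashes the message key, so all messages with the same key go to the same partition, preserving per-key ordering. The code below is a simplified stand-in — Kafka's real default partitioner uses murmur2 hashing, while `String.hashCode()` is used here only to keep the example dependency-free:

```java
// Simplified sketch of key-to-partition assignment. Not Kafka's actual
// algorithm (that uses murmur2), but the mapping idea is the same:
// deterministic hash of the key, modulo the partition count.
class ToyPartitioner {
    static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the result is non-negative, then take
        // the remainder to pick a partition in [0, numPartitions).
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // The same key always maps to the same partition: prints true
        System.out.println(
            partitionFor("user-42", 6) == partitionFor("user-42", 6));
    }
}
```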

Programming With Kafka

  • Configuration parameters
  • Producer API (Sending messages to Kafka)
  • Consumer API (consuming messages from Kafka)
  • Commits, Offsets, Seeking
  • Schema with Avro
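The commit/offset/seek model above can be illustrated with a toy in-memory log. This is a conceptual sketch of the mechanics behind `commit()`, `position()`, and `seek()`, not the real Consumer API:

```java
import java.util.ArrayList;
import java.util.List;

// Toy in-memory model of one partition's commit log: an append-only list
// indexed by offset. A consumer tracks its read position, can commit that
// position, and can seek backward or forward to replay or skip records.
class ToyLog {
    private final List<String> records = new ArrayList<>();
    private int position = 0;    // next offset this consumer will read
    private int committed = 0;   // last committed offset

    int append(String record) {  // producer side: returns the new offset
        records.add(record);
        return records.size() - 1;
    }

    String poll() {              // consumer side: read one record, advance
        if (position >= records.size()) return null;
        return records.get(position++);
    }

    void commit() { committed = position; }
    void seek(int offset) { position = offset; }   // rewind or skip ahead
    int committedOffset() { return committed; }

    public static void main(String[] args) {
        ToyLog log = new ToyLog();
        log.append("order-1");
        log.append("order-2");
        log.poll();              // consume "order-1"
        log.commit();            // committed offset is now 1
        log.seek(0);             // replay from the beginning
        System.out.println(log.poll());  // prints order-1
    }
}
```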

Kafka Streams

  • Streams overview and architecture
  • Streams use cases and comparison with other platforms
  • Learning Kafka Streams concepts (KStream, KTable, and state stores)
  • KStream operations (transformations, filters, joins, and aggregations)
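The classic word-count example shows the kind of per-key stateful aggregation a KStream feeds into a KTable. In the sketch below, a plain HashMap stands in for Kafka Streams' fault-tolerant state store so the example stays self-contained; the real library performs this continuously over unbounded, partitioned streams:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of a KTable-style aggregation: counting occurrences per key.
// A HashMap substitutes for Kafka Streams' state store; the grouping and
// per-key update logic is the part this example illustrates.
class WordCountSketch {
    static Map<String, Long> count(List<String> words) {
        Map<String, Long> counts = new HashMap<>();
        for (String w : words) {
            // Update the per-key state: insert 1, or add 1 to the old count
            counts.merge(w.toLowerCase(), 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // prints {streams=1, kafka=2}
        System.out.println(count(List.of("Kafka", "kafka", "streams")));
    }
}
```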

Administering Kafka

  • Hardware/Software requirements
  • Deploying Kafka
  • Configuration of brokers/topics/partitions/producers/consumers
  • Security: how to secure a Kafka cluster and client communications (SASL and Kerberos)
  • Monitoring: monitoring tools
  • Capacity Planning: estimating usage and demand
  • Troubleshooting: failure scenarios and recovery
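A broker is configured through its `server.properties` file. The fragment below shows a few commonly tuned settings of the kind covered in this section; the values are illustrative assumptions for a small deployment, not recommendations:

```properties
# Illustrative broker settings (server.properties); values are examples only.
broker.id=0
listeners=PLAINTEXT://localhost:9092
log.dirs=/var/lib/kafka/logs
# Defaults for newly created topics
num.partitions=6
default.replication.factor=3
# Retain messages for 7 days before deletion
log.retention.hours=168
zookeeper.connect=localhost:2181
```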

Monitoring and Instrumenting Kafka

  • Monitoring Kafka
  • Instrumenting with Metrics library
  • Instrumenting Kafka applications and monitoring their performance
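Application-level instrumentation of the sort listed above boils down to maintaining counters and timers around the operations you care about. The toy meter below illustrates the idea with only the JDK; a real metrics library (the outline's "Metrics library" is not named here, but Dropwizard Metrics is a common choice) adds histograms, rates, and reporters for monitoring systems:

```java
import java.util.concurrent.atomic.AtomicLong;

// Toy metric: counts operations and tracks a running mean latency.
// Illustrative only; a real metrics library provides richer types
// (meters, histograms, timers) and exports them to monitoring tools.
class ToyMeter {
    private final AtomicLong count = new AtomicLong();
    private final AtomicLong totalMillis = new AtomicLong();

    void record(long elapsedMillis) {   // call after each timed operation
        count.incrementAndGet();
        totalMillis.addAndGet(elapsedMillis);
    }

    long count() { return count.get(); }

    double meanMillis() {
        long n = count.get();
        return n == 0 ? 0.0 : (double) totalMillis.get() / n;
    }

    public static void main(String[] args) {
        ToyMeter sendLatency = new ToyMeter();
        sendLatency.record(10);
        sendLatency.record(30);
        System.out.println(sendLatency.meanMillis());  // prints 20.0
    }
}
```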

*Please Note: Course Outline is subject to change without notice. Exact course outline will be provided at time of registration.

Join an engaging hands-on learning environment, where you’ll explore:

  • Overview of Streaming technologies
  • Kafka concepts and architecture
  • Programming using Kafka API
  • Kafka Streams
  • Monitoring Kafka
  • Tuning/Troubleshooting Kafka

This course is 50% hands-on labs and 50% lecture, with engaging instruction, demos, group discussions, labs, and project work.

Before attending this course, you should:

  • Be comfortable with Java
  • Have experience working with databases
  • Be able to navigate the Linux command line
  • Have basic knowledge of Linux editors (such as VI/nano) for editing code

 

This course is intended for experienced Java developers with database experience.

Ready to Jumpstart Your IT Career?

CONTACT US NOW!