Kafka interview questions and answers (MCQ): Test Your Knowledge!

 

How does Kafka handle consumer lag?
By using ZooKeeper
By monitoring the difference between the last produced offset and the last consumed offset
By compressing messages
By using SSL/TLS
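As the question above suggests, consumer lag is the gap between the last produced offset (the log end offset) and the last committed consumer offset for each partition. Below is a minimal sketch of computing that difference with the Java consumer API, assuming a recent kafka-clients version; the broker address, group id, and topic name are placeholders.

```java
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.util.*;

public class LagCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address
        props.put("group.id", "example-group");              // placeholder consumer group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            Set<TopicPartition> partitions = new HashSet<>();
            consumer.partitionsFor("example-topic")           // placeholder topic
                    .forEach(pi -> partitions.add(new TopicPartition(pi.topic(), pi.partition())));

            Map<TopicPartition, Long> endOffsets = consumer.endOffsets(partitions);
            Map<TopicPartition, OffsetAndMetadata> committed = consumer.committed(partitions);

            for (TopicPartition tp : partitions) {
                long end = endOffsets.getOrDefault(tp, 0L);
                OffsetAndMetadata c = committed.get(tp);
                long consumed = (c == null) ? 0L : c.offset();
                // Lag = last produced offset minus last committed consumer offset.
                System.out.printf("%s lag = %d%n", tp, end - consumed);
            }
        }
    }
}
```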
What is the difference between a push and pull model in Kafka?
Consumers pull data, producers push data
Consumers push data, producers pull data
Both consumers and producers push data
Both consumers and producers pull data
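The push/pull split shows up directly in the client APIs: producers push records with send(), while consumers pull by repeatedly calling poll() at their own pace. A hedged sketch of the pull side is below (the producer push side appears in later examples); the broker address, group id, and topic are placeholders.

```java
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class PullConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder
        props.put("group.id", "pull-demo");                  // placeholder
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("example-topic"));
            while (true) {
                // The consumer pulls: it asks the broker for data at its own pace,
                // rather than the broker pushing data to it.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                records.forEach(r -> System.out.printf("%s -> %s%n", r.key(), r.value()));
            }
        }
    }
}
```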
Which Kafka feature allows for exactly-once delivery semantics?
Idempotent Producer
Transactional Producer
Exactly-Once Consumer
Guaranteed Delivery
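Exactly-once delivery builds on the idempotent producer plus transactions. A hedged sketch of a transactional producer is below; the transactional.id, topic, and broker address are placeholders.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class TransactionalSend {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");      // placeholder
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("enable.idempotence", "true");                // idempotent writes per partition
        props.put("transactional.id", "orders-tx-1");           // placeholder; unique per producer instance

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            try {
                producer.beginTransaction();
                producer.send(new ProducerRecord<>("orders", "order-1", "created"));
                producer.send(new ProducerRecord<>("orders", "order-2", "created"));
                producer.commitTransaction();   // both records become visible atomically
            } catch (Exception e) {
                producer.abortTransaction();    // neither record is exposed to read_committed consumers
                throw e;
            }
        }
    }
}
```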
Which Kafka feature allows for processing streams of data incrementally?
Kafka Streams
Kafka Connect
Kafka REST Proxy
Kafka Mirror Maker
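Kafka Streams processes records incrementally as they arrive, maintaining state such as running counts. A minimal word-count sketch is below; the application id, broker address, and topic names are placeholders.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Arrays;
import java.util.Properties;

public class WordCountApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-demo");       // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> lines = builder.stream("text-input");           // placeholder topic
        KTable<String, Long> counts = lines
                .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
                .groupBy((key, word) -> word)
                .count();                                      // state is updated incrementally per record
        counts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```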
Which Kafka protocol is used for communication between producers and brokers?
Kafka Protocol
Producer Protocol
Broker Protocol
Client-Server Protocol
What is the purpose of a Kafka topic?
To categorize and organize messages
To authenticate users
To compress data
To manage consumer groups
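Topics are the named categories that producers write to and consumers read from; each is created with a partition count and replication factor. A hedged sketch using the Admin client; the topic name and sizing are placeholders.

```java
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Collections;
import java.util.Properties;

public class CreateTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");                 // placeholder
        try (Admin admin = Admin.create(props)) {
            // A topic named "orders" with 3 partitions, replicated once (placeholder sizing).
            NewTopic orders = new NewTopic("orders", 3, (short) 1);
            admin.createTopics(Collections.singletonList(orders)).all().get();
        }
    }
}
```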
How does Kafka handle data stream filtering with predicates?
Using Kafka Streams
Automatically in brokers
Only in producers
Not supported
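Filtering happens in client-side processing rather than on the broker, typically with the Kafka Streams DSL, where filter() takes a predicate over key and value. A small sketch follows; the topic names and application id are placeholders, and the setup mirrors the word-count example above.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class FilterApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "filter-demo");        // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> events = builder.stream("events");            // placeholder topic
        // The predicate runs per record; only matching records continue downstream.
        events.filter((key, value) -> value != null && value.contains("ERROR"))
              .to("error-events");                                            // placeholder topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```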
In which domain is Kafka commonly used for real-time log aggregation and analysis?
IT operations and monitoring
Only in financial services
Exclusively in e-commerce
Kafka is not used for log aggregation
Which Kafka feature allows for exactly-once processing semantics in stream processing?
Transactional Producer
Idempotent Consumer
Exactly-Once Streams
Guaranteed Processing
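In Kafka Streams, exactly-once processing is switched on through the processing.guarantee setting (exactly_once_v2 in recent versions, Kafka 2.8+), which wraps the whole read-process-write cycle in transactions. A hedged configuration sketch; the application id and broker address are placeholders.

```java
import org.apache.kafka.streams.StreamsConfig;

import java.util.Properties;

public class ExactlyOnceStreamsConfig {
    public static Properties build() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "eos-demo");            // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder
        // Consumption, state updates, and produced output commit atomically per task.
        props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2);
        return props;
    }
}
```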
What is the purpose of a graceful shutdown in Kafka consumers?
To ensure proper offset commits and group rebalancing
To compress remaining messages
To delete consumer group metadata
To notify producers of shutdown
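A graceful consumer shutdown typically calls wakeup() from a shutdown hook so the poll loop exits cleanly, commits its final offsets, and leaves the group so a rebalance can hand its partitions to other members. A hedged sketch of this pattern; the broker address, group id, and topic are placeholders.

```java
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.errors.WakeupException;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class GracefulConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder
        props.put("group.id", "graceful-demo");              // placeholder
        props.put("enable.auto.commit", "false");            // commit explicitly in the loop
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        Thread mainThread = Thread.currentThread();
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            consumer.wakeup();                 // interrupts a blocked poll()
            try { mainThread.join(); } catch (InterruptedException ignored) { }
        }));

        try {
            consumer.subscribe(Collections.singletonList("example-topic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                records.forEach(r -> System.out.println(r.value()));
                consumer.commitSync();         // record progress for the group
            }
        } catch (WakeupException e) {
            // Expected on shutdown: fall through to close.
        } finally {
            consumer.close();                  // leaves the group cleanly so partitions are reassigned
        }
    }
}
```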
What are some examples of data that are transferred through Apache Kafka?
Logs
Metrics
Clickstream data
All of the above
How can you secure Kafka?
By using SSL/TLS
By using data encryption
By using access control lists (ACLs)
All of the above
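On the client side, encryption in transit and authentication are configured through security properties, while authorization is enforced by broker-side ACLs. A hedged client configuration sketch using SASL_SSL; every hostname, file path, and credential below is a placeholder.

```java
import java.util.Properties;

public class SecureClientConfig {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.example.com:9093");               // placeholder
        props.put("security.protocol", "SASL_SSL");                               // TLS encryption + SASL authentication
        props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks"); // placeholder path
        props.put("ssl.truststore.password", "changeit");                         // placeholder secret
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                + "username=\"app-user\" password=\"app-secret\";");              // placeholder credentials
        return props;
    }
}
```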
Which Kafka feature allows for dynamically updating broker configurations without restart?
Dynamic Configuration
Rolling Restart
Configuration Management
Hot Reload
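Dynamic broker configuration lets certain settings be changed at runtime, for example through the Admin API's incrementalAlterConfigs (or the kafka-configs.sh tool), without restarting the broker. A hedged sketch; the broker id and the specific setting being altered are placeholders.

```java
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;

import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.Properties;

public class DynamicBrokerConfig {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");                        // placeholder
        try (Admin admin = Admin.create(props)) {
            ConfigResource broker0 = new ConfigResource(ConfigResource.Type.BROKER, "0"); // placeholder broker id
            AlterConfigOp setCleanerThreads =
                    new AlterConfigOp(new ConfigEntry("log.cleaner.threads", "2"),        // dynamically updatable setting
                                      AlterConfigOp.OpType.SET);
            Map<ConfigResource, Collection<AlterConfigOp>> updates =
                    Map.of(broker0, Collections.singletonList(setCleanerThreads));
            admin.incrementalAlterConfigs(updates).all().get();                  // applied without a broker restart
        }
    }
}
```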
Which Kafka tool is used for verifying the consistency of the logs?
kafka-verify-logs.sh
kafka-check-logs.sh
kafka-validate-logs.sh
kafka-log-checker.sh
In Kafka, what does the leader.imbalance.check.interval.seconds configuration do?
Controls how often to check for partition leader imbalance
Sets the consumer poll interval
Defines producer batch interval
Configures broker heartbeat interval
How does Kafka handle message timestamps?
By using ZooKeeper
By adding timestamps to each message at the producer level
By compressing messages
By using SSL/TLS
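Each record carries a timestamp: by default the producer stamps it at send time (CreateTime), a timestamp can also be set explicitly, and topics can instead use broker append time (LogAppendTime). A hedged sketch of setting and reading the timestamp; the topic name and key/value are placeholders.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class TimestampedSend {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Explicit timestamp (ms since epoch); if null, the producer fills in the current time.
            long eventTime = System.currentTimeMillis();
            producer.send(new ProducerRecord<>("events", null, eventTime, "sensor-1", "42"));
        }
        // On the consumer side, record.timestamp() returns this value
        // (or the broker append time if the topic uses message.timestamp.type=LogAppendTime).
    }
}
```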
What are the required Kafka properties?
bootstrap.servers, key.serializer, value.serializer
bootstrap.servers, broker.id, log.dirs
broker.id, log.dirs, zookeeper.connect
zookeeper.connect, key.serializer, value.serializer
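For a producer, the properties that must always be set are bootstrap.servers, key.serializer, and value.serializer (consumers analogously need deserializers and usually a group.id). A minimal sketch using only those required settings; the broker address and topic are placeholders.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class MinimalProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");                        // placeholder
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("example-topic", "key", "value")); // placeholder topic
        }
    }
}
```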
Which of the following is not a valid Kafka producer retry backoff behavior?
Exponential backoff
Linear backoff
Jitter
Constant
How does Kafka handle data stream denormalization with custom serdes?
Using Kafka Streams API
Automatically in brokers
Only in producers
Not supported
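Denormalization (enriching a stream by joining in reference data) is done with the Kafka Streams API, and structured record values need a custom Serde so they can be read and written. A hedged sketch of a Jackson-based Serde for a hypothetical Order type, assuming a recent kafka-clients version where Serializer and Deserializer can be written as lambdas; the topic name is a placeholder.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.Serializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;

public class OrderSerdeExample {
    // Hypothetical value type carried on the topic.
    public static class Order {
        public String orderId;
        public String customerId;
        public double amount;
    }

    static Serde<Order> orderSerde() {
        ObjectMapper mapper = new ObjectMapper();
        Serializer<Order> ser = (topic, order) -> {
            try { return mapper.writeValueAsBytes(order); }
            catch (Exception e) { throw new RuntimeException(e); }
        };
        Deserializer<Order> de = (topic, bytes) -> {
            try { return bytes == null ? null : mapper.readValue(bytes, Order.class); }
            catch (Exception e) { throw new RuntimeException(e); }
        };
        return Serdes.serdeFrom(ser, de);
    }

    // Wire a stream of JSON orders using the custom Serde; starting the topology
    // follows the same KafkaStreams setup as the word-count sketch earlier.
    public static void buildTopology(StreamsBuilder builder) {
        KStream<String, Order> orders =
                builder.stream("orders", Consumed.with(Serdes.String(), orderSerde())); // placeholder topic
        // A join against a customers KTable here would denormalize each order
        // with customer details before writing to an enriched output topic.
        orders.foreach((key, order) -> System.out.println(order.orderId + " " + order.amount));
    }
}
```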
What is the difference between a message broker and a message queue?
Brokers manage queues
Brokers do not manage queues
Queues manage brokers
No difference
How does Kafka handle message compression for producers?
Compresses messages before sending
Doesn't support compression
Only compresses large messages
Only supports specific algorithms
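Producer-side compression is controlled by compression.type: batches are compressed before being sent and typically stay compressed on the broker and over the wire to consumers. The supported codecs are gzip, snappy, lz4, and zstd. A hedged configuration sketch; the broker address and codec choice are placeholders.

```java
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class CompressedProducerConfig {
    public static Properties build() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");               // gzip | snappy | lz4 | zstd | none
        // Larger batches compress better; linger.ms lets the producer fill batches before sending.
        props.put(ProducerConfig.LINGER_MS_CONFIG, "20");
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, "65536");
        return props;
    }
}
```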
How do you handle message ordering in Spring AMQP and Spring Pub-Sub?
By using partition keys
By using message priorities
By using consumer groups
Spring AMQP does not support ordering; Spring Pub-Sub supports ordering within a topic
What is the purpose of the group.initial.rebalance.delay.ms configuration in Kafka?
To delay the initial consumer rebalance
To set the frequency of consumer rebalances
To set the timeout for consumer rebalances
To set the maximum time for a consumer rebalance
Which of the following is NOT a typical use case for Apache Kafka?
Real-time streaming
Log aggregation
Metrics collection
Relational database replacement
What is the purpose of Kafka's __consumer_offsets topic compaction?
To reduce storage size of consumer offsets
To encrypt consumer data
To improve read performance
To handle authentication
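The __consumer_offsets topic stores the latest committed offset per group/topic/partition key, and log compaction keeps only the most recent value for each key, which bounds its size. A hedged sketch that reads the topic's cleanup.policy with the Admin client; the broker address is a placeholder.

```java
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.Config;
import org.apache.kafka.common.config.ConfigResource;

import java.util.Collections;
import java.util.Properties;

public class OffsetsTopicConfig {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");                        // placeholder
        try (Admin admin = Admin.create(props)) {
            ConfigResource offsetsTopic =
                    new ConfigResource(ConfigResource.Type.TOPIC, "__consumer_offsets");
            Config config = admin.describeConfigs(Collections.singletonList(offsetsTopic))
                                 .all().get().get(offsetsTopic);
            // Expected to report cleanup.policy=compact: older commits for the same
            // group/topic/partition key are discarded, keeping only the latest offset.
            System.out.println(config.get("cleanup.policy"));
        }
    }
}
```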