In Avro, adding an element to an enum without a default is a __ schema evolution
What happens if you write the following code in your producer? producer.send(producerRecord).get()
If I want to send binary data through the REST proxy, it needs to be base64 encoded. Which component needs to encode the binary data into base 64?
In the Kafka consumer metrics it is observed that fetch-rate is very high and each fetch is small. What steps will you take to increase throughput?
What is the risk of increasing max.in.flight.requests.per.connection while also enabling retries in a producer?
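As context for this question, a producer config sketch (values are illustrative, not a recommendation): with retries enabled and more than one in-flight request per connection, a batch that fails and is retried can land after a later batch that succeeded on the first attempt, reordering messages within a partition.

```properties
# Illustrative producer settings showing the interaction in question
retries=2147483647
max.in.flight.requests.per.connection=5   # >1 allows a retried batch to arrive after a later one
enable.idempotence=true                    # preserves per-partition ordering despite retries
```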
Your manager would like to have topic availability over consistency. Which setting do you need to change in order to enable that?
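The setting the question points at, as a broker/topic config sketch: allowing an out-of-sync replica to become leader keeps the partition available at the cost of potential data loss.

```properties
# true = availability over consistency (an out-of-sync replica may be elected leader)
unclean.leader.election.enable=true
```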
If I supply the setting compression.type=snappy to my producer, what will happen? (select two)
Select all the ways for one consumer to subscribe simultaneously to the following topics: topic.history, topic.sports, topic.politics. (select two)
You are running a Kafka Streams application in a Docker container managed by Kubernetes. Upon application restart, it takes a long time for the container to replicate the state and get back to processing data. How can you dramatically improve the application restart time?
We would like to be in an at-most-once consuming scenario. Which offset commit strategy would you recommend?

A consumer failed to process record #10 but succeeded in processing record #11. Select the course of action you should choose to guarantee at-least-once processing.
A Zookeeper ensemble contains 5 servers. What is the maximum number of servers that can go missing and the ensemble still run?
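The quorum arithmetic behind this question can be sketched directly: a ZooKeeper ensemble stays available as long as a strict majority of servers is up, so an ensemble of N tolerates the loss of everything beyond that majority.

```python
def tolerated_failures(ensemble_size: int) -> int:
    """Number of servers that can fail while a strict majority remains."""
    quorum = ensemble_size // 2 + 1  # strict majority needed for the ensemble to run
    return ensemble_size - quorum

print(tolerated_failures(5))  # → 2
```

This is also why ensembles are deployed with an odd number of servers: going from 5 to 6 raises the quorum from 3 to 4 without increasing the number of tolerated failures.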
To get acknowledgement of writes to only the leader partition, we need to use the config...
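A producer config sketch of the acknowledgement levels this question is about:

```properties
# acks=0: fire-and-forget; acks=1: leader-only acknowledgement; acks=all: wait for all in-sync replicas
acks=1
```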
You are using JDBC source connector to copy data from a table to Kafka topic. There is one connector created with max.tasks equal to 2 deployed on a cluster of 3 workers. How many tasks are launched?
In Avro, removing a field that does not have a default is a __ schema evolution
I am producing Avro data to my Kafka cluster, which is integrated with the Confluent Schema Registry. After an incompatible schema change, I know my data will be rejected. Which component will reject the data?
Which of the following is true regarding thread safety in the Java Kafka Clients?
Which of these joins does not require input topics to be sharing the same number of partitions?
To allow consumers in a group to resume at the previously committed offset, I need to set the proper value for...
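A consumer config sketch for context (the group name is a hypothetical placeholder): committed offsets are stored per group, topic, and partition, so reusing the same group identity is what lets consumers resume where the group left off.

```properties
# Offsets are committed per (group.id, topic, partition); restarting with the
# same group.id resumes from the last committed offset
group.id=my-consumer-group
```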
How will you find out all the partitions where one or more of the replicas for the partition are not in-sync with the leader?
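A command-line sketch, assuming shell access to a broker host and a reachable bootstrap server at the address shown:

```shell
# List partitions whose in-sync replica set is smaller than the full replica set
kafka-topics.sh --bootstrap-server localhost:9092 --describe --under-replicated-partitions
```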
Which of the following statements are true regarding the number of partitions of a topic?
When using plain JSON data with Connect, you see the following error message: org.apache.kafka.connect.errors.DataException: JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields. How will you fix the error?
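A Connect worker config sketch for context: when the incoming JSON does not embed a schema envelope, the JsonConverter must be told not to expect one.

```properties
# Plain JSON without the "schema"/"payload" envelope: disable schema expectation
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
key.converter.schemas.enable=false
```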
You are doing complex calculations using a machine learning framework on records fetched from a Kafka topic. It takes about 6 minutes to process a record batch, and the consumer triggers rebalances even though it is still running. How can you improve this scenario?
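A consumer config sketch relevant here (the exact values are illustrative assumptions, to be tuned per workload): a consumer that does not call poll() within max.poll.interval.ms is considered failed and evicted from the group, so slow batch processing is addressed by raising that interval or shrinking the batch.

```properties
# Illustrative values; tune to the actual processing time per batch
max.poll.interval.ms=600000   # allow up to 10 minutes between poll() calls
max.poll.records=50           # fetch smaller batches so each cycle finishes sooner
```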
Which of the following errors are retriable from a producer perspective? (select two)
A customer has many consumer applications that process messages from a Kafka topic. Each consumer application can only process 50 MB/s. Your customer wants to achieve a target throughput of 1 GB/s. What is the minimum number of partitions you will suggest to the customer for that particular topic?
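The arithmetic behind this question can be sketched as follows: within a consumer group, each partition is consumed by at most one consumer, so aggregate throughput is capped at the per-consumer rate times the partition count.

```python
import math

def min_partitions(target_mb_per_s: float, per_consumer_mb_per_s: float) -> int:
    """Minimum partitions so that one consumer per partition meets the target."""
    return math.ceil(target_mb_per_s / per_consumer_mb_per_s)

print(min_partitions(1000, 50))  # 1 GB/s target, 50 MB/s per consumer → 20
```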
You are sending messages with keys to a topic. To increase throughput, you decide to increase the number of partitions of the topic. Select all that apply.
Compaction is enabled for a topic in Kafka by setting log.cleanup.policy=compact. What is true about log compaction?
If I want to send binary data through the REST proxy to topic "test_binary", it needs to be base64 encoded. A consumer connecting directly to the Kafka topic "test_binary" will receive:
A. binary data
B. avro data
C. json data
D. base64 encoded data, and it will need to decode it
We want the average of all events in every five-minute window updated every minute. What kind of Kafka Streams window will be required on the stream?
A bank uses a Kafka cluster for credit card payments. What should be the value of the property unclean.leader.election.enable?
You have a Kafka cluster and all the topics have a replication factor of 3. An intern at your company stopped a broker and accidentally deleted all of that broker's data on disk. What will happen when the broker is restarted?