How to process the historical data queued up on our topics

We don't want to process all this data in a single batch, as that is harder to do (and if it fails, it has to start again!).

Ideally, we would like a config such as maxBatchSize and minBatchSize, where we can simply set the number of records we need per batch.


Comments

  • There is a config option for this, maxOffsetsPerTrigger:

    Rate limit on maximum number of offsets processed per trigger interval. The specified total number of offsets will be proportionally split across topicPartitions of different volume.

    Note that if you have a checkpoint directory with start and end offsets, then the application will process the offsets in the directory for the first batch, thus ignoring this config. (The next batch will respect it).
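    For example, a backfill job might cap each micro-batch like this. This is a minimal sketch; the topic name, broker address, output path, and checkpoint location are all assumptions, not values from your setup:

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.streaming.Trigger

    object BackfillFromKafka {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("backfill-from-kafka")
          .getOrCreate()

        // Start from the beginning of the topic and cap each micro-batch
        // at roughly 10,000 offsets, split proportionally across partitions.
        val stream = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092") // assumed broker address
          .option("subscribe", "events")                    // assumed topic name
          .option("startingOffsets", "earliest")
          .option("maxOffsetsPerTrigger", "10000")
          .load()

        stream.writeStream
          .format("parquet")
          .option("path", "/data/events")                      // assumed output path
          .option("checkpointLocation", "/checkpoints/events") // assumed checkpoint dir
          .trigger(Trigger.ProcessingTime("1 minute"))
          .start()
          .awaitTermination()
      }
    }
    ```

    Because progress is checkpointed per micro-batch, a failure only reprocesses the current batch rather than the whole backlog, which addresses the "if it fails, it has to start again" concern.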

