Java Code Examples for org.apache.kafka.common.header.Headers#add()
The following examples show how to use org.apache.kafka.common.header.Headers#add(). Each example notes its source file, project, and license.
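Before diving into the examples, note that Headers#add() has two overloads: add(Header header) and add(String key, byte[] value). The minimal sketch below is not taken from any of the projects that follow; the topic, key, and header names are invented for illustration. It shows both overloads being used to attach headers to an outgoing record:

import java.nio.charset.StandardCharsets;

import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.header.internals.RecordHeader;
import org.apache.kafka.common.header.internals.RecordHeaders;

public class HeadersAddSketch {

    public static ProducerRecord<String, String> recordWithHeaders() {
        Headers headers = new RecordHeaders();
        // Overload 1: add(String key, byte[] value)
        headers.add("trace-id", "abc-123".getBytes(StandardCharsets.UTF_8));
        // Overload 2: add(Header header)
        headers.add(new RecordHeader("content-type", "application/json".getBytes(StandardCharsets.UTF_8)));
        // Attach the headers to an outgoing record ("example-topic" is a made-up name)
        return new ProducerRecord<>("example-topic", null, "key-1", "value-1", headers);
    }
}

Keep in mind that add() appends: header keys are not unique, so calling add() twice with the same key leaves two headers on the record. Example 12 below gets replace semantics by calling remove() before add().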
Example 1
Source File: OutgoingKafkaRecord.java From smallrye-reactive-messaging with Apache License 2.0

/**
 * Creates a new outgoing Kafka Message with a header added to the header list.
 *
 * @param key the header key
 * @param content the header value, must not be {@code null}
 * @return the updated Kafka Message.
 */
public OutgoingKafkaRecord<K, T> withHeader(String key, byte[] content) {
    Headers headers = getHeaders();
    Headers copy = new RecordHeaders(headers);
    copy.add(new Header() {
        @Override
        public String key() {
            return key;
        }

        @Override
        public byte[] value() {
            return content;
        }
    });
    return new OutgoingKafkaRecord<>(getTopic(), getKey(), getPayload(), getTimestamp(), getPartition(), copy,
            getAck(), getNack(), getMetadata());
}
Example 2
Source File: OutgoingKafkaRecord.java From smallrye-reactive-messaging with Apache License 2.0

/**
 * Creates a new outgoing Kafka Message with a header added to the header list.
 *
 * @param key the header key
 * @param content the header value, must not be {@code null}
 * @return the updated Kafka Message.
 */
public OutgoingKafkaRecord<K, T> withHeader(String key, String content) {
    Headers headers = getHeaders();
    Headers copy = new RecordHeaders(headers);
    copy.add(new Header() {
        @Override
        public String key() {
            return key;
        }

        @Override
        public byte[] value() {
            return content.getBytes();
        }
    });
    return new OutgoingKafkaRecord<>(getTopic(), getKey(), getPayload(), getTimestamp(), getPartition(), copy,
            getAck(), getNack(), getMetadata());
}
Example 3
Source File: OutgoingKafkaRecord.java From smallrye-reactive-messaging with Apache License 2.0

/**
 * Creates a new outgoing Kafka Message with a header added to the header list.
 *
 * @param key the header key
 * @param content the header value, must not be {@code null}
 * @param enc the encoding, must not be {@code null}
 * @return the updated Kafka Message.
 */
public OutgoingKafkaRecord<K, T> withHeader(String key, String content, Charset enc) {
    Headers headers = getHeaders();
    Headers copy = new RecordHeaders(headers);
    copy.add(new Header() {
        @Override
        public String key() {
            return key;
        }

        @Override
        public byte[] value() {
            return content.getBytes(enc);
        }
    });
    return new OutgoingKafkaRecord<>(getTopic(), getKey(), getPayload(), getTimestamp(), getPartition(), copy,
            getAck(), getNack(), getMetadata());
}
Example 4
Source File: JsonSchemaKafkaSerializer.java From apicurio-registry with Apache License 2.0

/**
 * Adds appropriate information to the Headers so that the deserializer can function properly.
 *
 * @param headers msg headers
 * @param artifactId artifact id
 * @param globalId global id
 */
protected void addSchemaHeaders(Headers headers, String artifactId, long globalId) {
    // we never actually set this requirement for the globalId to be non-negative ... but it mostly is ...
    if (globalId >= 0) {
        ByteBuffer buff = ByteBuffer.allocate(8);
        buff.putLong(globalId);
        headers.add(JsonSchemaSerDeConstants.HEADER_GLOBAL_ID, buff.array());
    } else {
        headers.add(JsonSchemaSerDeConstants.HEADER_ARTIFACT_ID, IoUtil.toBytes(artifactId));
    }
}
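The example above encodes the long globalId as eight bytes via ByteBuffer before adding it to the headers. The matching read side is sketched below as an illustration only; it is not taken from apicurio-registry, and the header-name constant is a placeholder standing in for JsonSchemaSerDeConstants.HEADER_GLOBAL_ID:

import java.nio.ByteBuffer;

import org.apache.kafka.common.header.Header;
import org.apache.kafka.common.header.Headers;

final class GlobalIdHeaderReader {

    // Placeholder for JsonSchemaSerDeConstants.HEADER_GLOBAL_ID; use the real constant in practice.
    private static final String HEADER_GLOBAL_ID = "global-id";

    /**
     * Reads the 8-byte global id written by addSchemaHeaders, or returns -1 if the header is absent.
     */
    static long readGlobalId(Headers headers) {
        Header header = headers.lastHeader(HEADER_GLOBAL_ID);
        if (header == null || header.value() == null) {
            return -1L;
        }
        return ByteBuffer.wrap(header.value()).getLong();
    }
}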
Example 5
Source File: LoggingAuditHeadersInjector.java From singer with Apache License 2.0

@Override
public Headers addHeaders(Headers headers, LogMessage logMessage) {
    try {
        headers.add(HEADER_KEY, SER.serialize(logMessage.getLoggingAuditHeaders()));
    } catch (TException e) {
        Stats.incr(SingerMetrics.NUMBER_OF_SERIALIZING_HEADERS_ERRORS);
        LOG.warn("Exception thrown while serializing loggingAuditHeaders", e);
    }
    return headers;
}
Example 6
Source File: MessagePublisherImpl.java From core-ng-project with Apache License 2.0

private void linkContext(Headers headers) {
    headers.add(MessageHeaders.HEADER_CLIENT, Strings.bytes(LogManager.APP_NAME));

    ActionLog actionLog = LogManager.CURRENT_ACTION_LOG.get();
    if (actionLog == null) return;  // publisher may be used without action log context

    headers.add(MessageHeaders.HEADER_CORRELATION_ID, Strings.bytes(actionLog.correlationId()));
    if (actionLog.trace) headers.add(MessageHeaders.HEADER_TRACE, Strings.bytes("true"));
    headers.add(MessageHeaders.HEADER_REF_ID, Strings.bytes(actionLog.id));
}
Example 7
Source File: MirusSourceTaskTest.java From mirus with BSD 3-Clause "New" or "Revised" License

@Test
public void testSourceRecordsWorksWithHeaders() {
    final String topic = "topica";
    final int partition = 0;
    final int offset = 123;
    final long timestamp = 314159;

    Map<TopicPartition, List<ConsumerRecord<byte[], byte[]>>> records = new HashMap<>();
    Headers headers = new RecordHeaders();
    headers.add("h1", "v1".getBytes(StandardCharsets.UTF_8));
    headers.add("h2", "v2".getBytes(StandardCharsets.UTF_8));
    records.put(
        new TopicPartition(topic, partition),
        Collections.singletonList(newConsumerRecord(topic, partition, offset, timestamp, headers)));
    ConsumerRecords<byte[], byte[]> pollResult = new ConsumerRecords<>(records);

    List<SourceRecord> result = mirusSourceTask.sourceRecords(pollResult);

    assertThat(
        StreamSupport.stream(result.get(0).headers().spliterator(), false)
            .map(Header::key)
            .collect(Collectors.toList()),
        hasItems("h1", "h2"));
    assertThat(
        StreamSupport.stream(result.get(0).headers().spliterator(), false)
            .map(Header::value)
            .collect(Collectors.toList()),
        hasItems("v1".getBytes(StandardCharsets.UTF_8), "v2".getBytes(StandardCharsets.UTF_8)));
}
Example 8
Source File: ProducerRawRecordsTest.java From zerocode with Apache License 2.0

@Test
public void testDeser_headers() {
    Headers headers = new RecordHeaders();
    headers.add("headerKey1", "headerValue1".getBytes());
    headers.add("headerKey2", "headerValue2".getBytes());

    ProducerRecord producerRecord = new ProducerRecord("topic2", null, "key-123", "Hello", headers);

    String jsonBack = gson.toJson(producerRecord);

    JSONAssert.assertEquals(
        "{\"topic\":\"topic2\",\"headers\":{\"headerKey1\":\"headerValue1\",\"headerKey2\":\"headerValue2\"},\"key\":\"key-123\",\"value\":\"Hello\"}",
        jsonBack,
        LENIENT);
}
Example 9
Source File: HeaderUtils.java From extension-kafka with Apache License 2.0

/**
 * Creates a new {@link org.apache.kafka.common.header.internals.RecordHeader} based on {@code key} and
 * {@code value} and adds it to {@code headers}. The {@code value} is converted to bytes and follows this logic:
 * <ul>
 * <li>Instant - calls {@link Instant#toEpochMilli()}</li>
 * <li>Number - calls {@link HeaderUtils#toBytes}</li>
 * <li>String/custom object - calls {@link String#toString()}</li>
 * <li>null - <code>null</code></li>
 * </ul>
 *
 * @param headers the Kafka {@code headers} to add a {@code key}/{@code value} pair to
 * @param key     the key you want to add to the {@code headers}
 * @param value   the value you want to add to the {@code headers}
 */
public static void addHeader(Headers headers, String key, Object value) {
    notNull(headers, () -> "headers may not be null");
    if (value instanceof Instant) {
        headers.add(key, toBytes((Number) ((Instant) value).toEpochMilli()));
    } else if (value instanceof Number) {
        headers.add(key, toBytes((Number) value));
    } else if (value instanceof String) {
        headers.add(key, ((String) value).getBytes(UTF_8));
    } else if (value == null) {
        headers.add(key, null);
    } else {
        headers.add(key, value.toString().getBytes(UTF_8));
    }
}
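A rough usage sketch of the conversion rules described above follows. It is not part of the extension-kafka sources; the header keys and values are invented, and the import path for HeaderUtils is assumed from the project layout:

import java.time.Instant;

import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.header.internals.RecordHeaders;
// Package path assumed; adjust to wherever HeaderUtils lives in your version of extension-kafka.
import org.axonframework.extensions.kafka.eventhandling.HeaderUtils;

final class AddHeaderUsage {

    static Headers exampleHeaders() {
        Headers headers = new RecordHeaders();
        HeaderUtils.addHeader(headers, "ts", Instant.now());        // Instant -> epoch-milli long, as bytes
        HeaderUtils.addHeader(headers, "retries", 3);               // Number -> toBytes(Number)
        HeaderUtils.addHeader(headers, "source", "order-service");  // String -> UTF-8 bytes
        HeaderUtils.addHeader(headers, "absent", null);             // null values are allowed
        return headers;
    }
}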
Example 10
Source File: HeadersMapExtractAdapterTest.java From java-kafka-client with Apache License 2.0

@Test
public void verifyNullHeaderHandled() {
    Headers headers = new RecordHeaders();
    headers.add("test_null_header", null);
    HeadersMapExtractAdapter headersMapExtractAdapter = new HeadersMapExtractAdapter(headers);
    Entry<String, String> header = headersMapExtractAdapter.iterator().next();
    assertNotNull(header);
    assertEquals(header.getKey(), "test_null_header");
    assertNull(header.getValue());
}
Example 11
Source File: KafkaAvroSerializer.java From registry with Apache License 2.0

@Override
public byte[] serialize(String topic, Headers headers, Object data) {
    if (useRecordHeader) {
        final MessageAndMetadata context = messageAndMetadataAvroSerializer.serialize(data, createSchemaMetadata(topic));
        headers.add(isKey ? keySchemaVersionIdHeaderName : valueSchemaVersionIdHeaderName, context.metadata());
        return context.payload();
    } else {
        return serialize(topic, data);
    }
}
Example 12
Source File: KafkaHeaders.java From brave with Apache License 2.0

static void replaceHeader(Headers headers, String key, String value) {
    headers.remove(key);
    headers.add(key, value.getBytes(UTF_8));
}
Example 13
Source File: MirusSourceTaskTest.java From mirus with BSD 3-Clause "New" or "Revised" License

@Test
public void testSourceRecordsWorksWithHeadersWithHeaderConverter() {
    final String topic = "topica";
    final int partition = 0;
    final int offset = 123;
    final long timestamp = 314159;

    Map<String, String> properties = mockTaskProperties();
    properties.put(
        SourceConfigDefinition.SOURCE_HEADER_CONVERTER.getKey(),
        "org.apache.kafka.connect.json.JsonConverter");
    mirusSourceTask.start(properties);

    Map<TopicPartition, List<ConsumerRecord<byte[], byte[]>>> records = new HashMap<>();
    Headers headers = new RecordHeaders();
    headers.add(
        "h1",
        "{\"schema\": {\"type\": \"struct\",\"fields\": [{\"type\": \"string\",\"optional\": true,\"field\": \"version\"}],\"optional\": false},\"payload\": {\"version\": \"v1\"}}"
            .getBytes(StandardCharsets.UTF_8));
    headers.add(
        "h2",
        "{\"schema\": {\"type\": \"struct\",\"fields\": [{\"type\": \"string\",\"optional\": true,\"field\": \"version\"}],\"optional\": false},\"payload\": {\"version\": \"v2\"}}"
            .getBytes(StandardCharsets.UTF_8));
    records.put(
        new TopicPartition(topic, partition),
        Collections.singletonList(newConsumerRecord(topic, partition, offset, timestamp, headers)));
    ConsumerRecords<byte[], byte[]> pollResult = new ConsumerRecords<>(records);

    List<SourceRecord> result = mirusSourceTask.sourceRecords(pollResult);

    Iterator<Header> connectHeaders = result.get(0).headers().iterator();

    Header header1 = connectHeaders.next();
    assertThat(header1.key(), is("h1"));
    assertThat(header1.value(), instanceOf(Struct.class));
    Struct header1Value = (Struct) header1.value();
    assertThat(header1Value.getString("version"), is("v1"));

    Header header2 = connectHeaders.next();
    assertThat(header2.key(), is("h2"));
    assertThat(header2.value(), instanceOf(Struct.class));
    Struct header2Value = (Struct) header2.value();
    assertThat(header2Value.getString("version"), is("v2"));
}
Example 14
Source File: CryptoDeserializer.java From kafka-encryption with Apache License 2.0

/**
 * Deserializes the data, decrypting it if needed.
 * The key reference used to deserialize the data is added to the {@link KafkaCryptoConstants#KEY_REF_HEADER} header (it may be {@code null}).
 *
 * @param topic   the topic
 * @param headers enriched with the {@link KafkaCryptoConstants#KEY_REF_HEADER} header
 * @param data    the raw (possibly encrypted) payload
 * @return the deserialized data
 */
@Override
public T deserialize(String topic, Headers headers, byte[] data) {
    if (data == null) {
        return null;
    }
    DecryptedDataWithKeyRef decryptedDataWithKeyRef = decrypt(data);
    T deserializedValue = rawDeserializer.deserialize(topic, headers, decryptedDataWithKeyRef.decryptedData);
    headers.add(KEY_REF_HEADER, decryptedDataWithKeyRef.keyRef);
    return deserializedValue;
}
Example 15
Source File: CryptoAwareSerializerWrapper.java From kafka-encryption with Apache License 2.0

/**
 * Calls the KeyReferenceExtractor with the topic and the data, and stores the result in the
 * {@link KafkaCryptoConstants#KEY_REF_HEADER} Kafka header.
 * <p>
 * This method is called by the Kafka producer.
 *
 * @param topic   the topic
 * @param headers the record headers
 * @param data    the data to serialize
 * @return the result of the underlying serializer
 */
@Override
public byte[] serialize(String topic, Headers headers, T data) {
    headers.add(KafkaCryptoConstants.KEY_REF_HEADER, keyReferenceExtractor.extractKeyReference(topic, data));
    return this.rawSerializer.serialize(topic, headers, data);
}
Example 16
Source File: JsonSchemaKafkaSerializer.java From apicurio-registry with Apache License 2.0

/**
 * Adds appropriate information to the Headers so that the deserializer can function properly.
 *
 * @param headers the headers
 * @param data the msg data
 */
protected void addTypeHeaders(Headers headers, T data) {
    headers.add(JsonSchemaSerDeConstants.HEADER_MSG_TYPE, IoUtil.toBytes(data.getClass().getName()));
}