MINOR: Document how to create source streams and tables
Originally reviewed as part of apache#3490.
Author: Eno Thereska <eno.thereska@gmail.com>
Reviewers: Damian Guy <damian.guy@gmail.com>
Closes apache#3701 from enothereska/minor-docs-create-source-streams
If these records were a KStream and the stream processing application summed the values, it would return <code>4</code>. If these records were a KTable or GlobalKTable, the returned value would be <code>3</code>, since the last record would be considered an update.
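The stream-versus-table semantics above can be sketched in plain Java (the key name <code>"alice"</code> is illustrative; the text only specifies two records for one key with values 1 and 3):

```java
import java.util.*;

public class StreamVsTableSemantics {
    public static void main(String[] args) {
        // Two records with the same (hypothetical) key "alice": values 1 and 3.
        List<Map.Entry<String, Integer>> records = List.of(
                Map.entry("alice", 1),
                Map.entry("alice", 3));

        // KStream semantics: every record is an independent event,
        // so all values are summed: 1 + 3 = 4.
        int streamSum = records.stream().mapToInt(Map.Entry::getValue).sum();

        // KTable/GlobalKTable semantics: a later record for the same key is
        // an update that replaces the earlier value, so only 3 remains.
        Map<String, Integer> table = new LinkedHashMap<>();
        for (Map.Entry<String, Integer> r : records) {
            table.put(r.getKey(), r.getValue());
        }
        int tableSum = table.values().stream().mapToInt(Integer::intValue).sum();

        System.out.println(streamSum + " " + tableSum); // 4 3
    }
}
```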
-    <h4><a id="streams_dsl_source" href="#streams_dsl_source">Create Source Streams from Kafka</a></h4>
+    <h4><a id="streams_dsl_source" href="#streams_dsl_source">Creating Source Streams from Kafka</a></h4>

     <p>
-    Either a <b>record stream</b> (defined as <code>KStream</code>) or a <b>changelog stream</b> (defined as <code>KTable</code> or <code>GlobalKTable</code>)
-    can be created as a source stream from one or more Kafka topics (for <code>KTable</code> and <code>GlobalKTable</code> you can only create the source stream
-    from a single topic).
+    You can easily read data from Kafka topics into your application. We support the following operations.
     "word-counts-global-store" /* table/store name */);
+    </pre>
+
+    When to provide serdes explicitly:
+    <ul>
+        <li>If you do not specify serdes explicitly, the default serdes from the configuration are used.</li>
+        <li>You must specify serdes explicitly if the key and/or value types of the records in the Kafka input topic do not
+        match the configured default serdes.</li>
+    </ul>
+    Several variants of <code>globalTable</code> exist, e.g. to specify explicit serdes.
+
+    </td>
+    </tbody>
+    </table>
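The two call styles can be sketched as follows. This is a sketch only: it assumes the 0.11-era <code>KStreamBuilder</code> overloads of <code>globalTable</code> that the surrounding snippet appears to use, and the topic name is illustrative (the store name is taken from the snippet above):

```java
// Sketch: assumes Kafka Streams 0.11.x KStreamBuilder; the topic name is hypothetical.
KStreamBuilder builder = new KStreamBuilder();

// Variant 1: no serdes given, so the default serdes from the configuration apply.
GlobalKTable<String, Long> wordCounts = builder.globalTable(
        "word-counts-input-topic",
        "word-counts-global-store");

// Variant 2: explicit serdes, needed when the topic's key/value types
// do not match the configured default serdes.
GlobalKTable<String, Long> wordCountsExplicit = builder.globalTable(
        Serdes.String(), /* key serde */
        Serdes.Long(),   /* value serde */
        "word-counts-input-topic",
        "word-counts-global-store");
```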
     <h4><a id="streams_dsl_windowing" href="#streams_dsl_windowing">Windowing a stream</a></h4>

     A stream processor may need to divide data records into time buckets, i.e. to <b>window</b> the stream by time. This is usually needed for join and aggregation operations, etc. Kafka Streams currently defines the following types of windows:
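The time-bucketing idea can be sketched in plain Java with tumbling (non-overlapping, fixed-size) windows; the 5-minute window size and the event timestamps are made up for illustration:

```java
import java.util.*;

public class TumblingWindowSketch {
    public static void main(String[] args) {
        long windowSizeMs = 5 * 60 * 1000L; // assumed 5-minute window size

        // Hypothetical record timestamps in milliseconds.
        long[] timestamps = {10_000L, 20_000L, 310_000L};

        // Bucket each record by its window start: floor(ts / size) * size.
        Map<Long, Long> countsPerWindow = new TreeMap<>();
        for (long ts : timestamps) {
            long windowStart = (ts / windowSizeMs) * windowSizeMs;
            countsPerWindow.merge(windowStart, 1L, Long::sum);
        }

        // Window [0, 300000) holds two records; [300000, 600000) holds one.
        System.out.println(countsPerWindow); // {0=2, 300000=1}
    }
}
```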