Latency-aware placement of stream processing operators
Publication Type
Conference Paper
Publication Date
2024
Language
English
Author
Ecker, Raphael
Ebrahimi, Elmira
Number in series
14351
Start Page
30
End Page
41
Citation
Euro-Par 2023: Parallel Processing Workshops. Lecture Notes in Computer Science, vol. 14351. Springer, Cham, 2024, pp. 30-41
Publisher
Springer
ISBN
978-3-031-50683-3
The rise of the Internet of Things and fog computing has substantially increased the number of interconnected devices at the edge of the network. As a result, many computations are now performed in the fog, generating vast amounts of data. To process this data in near real time, stream processing is typically employed due to its efficiency in handling continuous streams of information in a scalable manner. However, most stream processing approaches do not consider the underlying network devices as candidate resources for processing data. Moreover, many existing works do not account for the network latency incurred when computations are performed on multiple devices in a distributed way. Consequently, existing stream processing approaches may not fully exploit the available fog computing resources. To avoid this, we formulate an optimization problem for utilizing the existing fog resources, and we design heuristics for solving this problem efficiently. Furthermore, we integrate our heuristics into Apache Storm, and we perform experiments that show latency-related benefits compared to alternatives.
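The abstract describes heuristics for placing stream processing operators on fog nodes while accounting for network latency. As a rough illustration of this idea (not the paper's actual formulation), the sketch below shows a hypothetical greedy heuristic: each operator in a linear pipeline is assigned to the feasible node with the lowest link latency from the previous operator's node. All node names, latencies, capacities, and demands are invented for the example.

```python
# Hypothetical greedy heuristic for latency-aware operator placement.
# This is an illustrative sketch only; the paper's optimization problem
# and heuristics are more general and are not reproduced here.

def place_operators(operators, nodes, latency, capacity, demand):
    """Assign each operator (in pipeline order) to the node with the
    lowest network latency from the previous operator's node, subject
    to the node's remaining processing capacity."""
    remaining = dict(capacity)
    placement = {}
    prev = "source"  # the data source feeding the first operator
    for op in operators:
        candidates = [n for n in nodes if remaining[n] >= demand[op]]
        if not candidates:
            raise RuntimeError(f"no node has capacity for {op}")
        best = min(candidates, key=lambda n: latency[(prev, n)])
        placement[op] = best
        remaining[best] -= demand[op]
        prev = best
    return placement


# Invented example: two small fog nodes near the source and a distant cloud.
nodes = ["fog1", "fog2", "cloud"]
capacity = {"fog1": 2, "fog2": 1, "cloud": 10}
operators = ["filter", "aggregate", "sink_writer"]
demand = {"filter": 1, "aggregate": 1, "sink_writer": 2}
latency = {  # one-way link latency in ms between placements
    ("source", "fog1"): 2, ("source", "fog2"): 3, ("source", "cloud"): 25,
    ("fog1", "fog1"): 0, ("fog1", "fog2"): 5, ("fog1", "cloud"): 25,
    ("fog2", "fog1"): 5, ("fog2", "fog2"): 0, ("fog2", "cloud"): 25,
    ("cloud", "fog1"): 25, ("cloud", "fog2"): 25, ("cloud", "cloud"): 0,
}

placement = place_operators(operators, nodes, latency, capacity, demand)
# The first two operators fit on the nearby fog node; the heavier
# sink_writer overflows to the cloud once fog capacity is exhausted.
```

A greedy scheme like this ignores downstream consequences of each choice, which is exactly why the paper formulates the problem as an optimization and designs dedicated heuristics; in Apache Storm, such a policy would typically be realized as a custom scheduler.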
Keywords
Apache Storm
Edge Computing
Fog Computing
Internet of Things
Stream Processing
DDC Class
005: Computer Programming, Programs, Data and Security