Abstract
Hundreds of millions of network cameras have been installed throughout the world. Each is capable of providing a vast amount of real-time data. Analyzing the massive amounts of data generated by these cameras requires significant computational resources, and the demands may vary over time. Cloud computing shows the most promise for providing the needed resources on demand. In this article, we investigate how to allocate cloud resources when analyzing real-time data streams from network cameras. A resource manager considers many factors that affect its decisions, including the types of analysis, the number of data streams, and the locations of the cameras. The manager then selects the most cost-efficient types of cloud instances (e.g., CPU vs. GPGPU) to meet the computational demands of analyzing the streams. We evaluate the effectiveness of our approach using Amazon Web Services. Experiments demonstrate a cost reduction of more than 50% for real workloads.
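To illustrate the instance-selection idea the abstract describes, here is a minimal sketch of picking the cheapest instance configuration that can keep up with a given number of camera streams. The instance names, hourly prices, and per-stream throughput figures are illustrative assumptions, not values or an algorithm taken from the article.

```python
# Minimal sketch of cost-aware instance selection for stream analysis.
# Instance types, prices, and throughput numbers are assumed for illustration;
# they are not measurements from the article.
from dataclasses import dataclass
from math import ceil


@dataclass
class InstanceType:
    name: str
    hourly_cost: float          # USD per hour (assumed)
    streams_per_instance: int   # streams one instance can analyze in real time (assumed)


def cheapest_allocation(num_streams, instance_types):
    """Return (instance type, count, hourly cost) with the lowest total cost
    that can sustain `num_streams` real-time camera streams."""
    best = None
    for itype in instance_types:
        count = ceil(num_streams / itype.streams_per_instance)
        cost = count * itype.hourly_cost
        if best is None or cost < best[2]:
            best = (itype, count, cost)
    return best


if __name__ == "__main__":
    # Hypothetical CPU vs. GPU instance profiles for one analysis program.
    candidates = [
        InstanceType("cpu.large", hourly_cost=0.10, streams_per_instance=4),
        InstanceType("gpu.xlarge", hourly_cost=0.90, streams_per_instance=48),
    ]
    itype, count, cost = cheapest_allocation(100, candidates)
    print(f"{count} x {itype.name} at ${cost:.2f}/hour")
```

In this toy setting the manager compares the effective cost per stream of each instance type and keeps the cheapest configuration; the article's resource manager additionally accounts for factors such as analysis type and camera location.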
| Original language | English |
| --- | --- |
| Article number | 8594612 |
| Pages (from-to) | 31-41 |
| Number of pages | 11 |
| Journal | IEEE Multimedia |
| Volume | 26 |
| Issue number | 3 |
| DOIs | |
| State | Published - Jul 1 2019 |
| Externally published | Yes |
ASJC Scopus Subject Areas
- Software
- Signal Processing
- Media Technology
- Hardware and Architecture
- Computer Science Applications
Keywords
- Computer Vision
- Cloud Computing
- Resource Optimization
Disciplines
- Computer Sciences
- Systems Architecture