With recent advances, smartphones have become suitable for use as
smart cameras in a visual sensor network.
They provide cameras with lenses and CCD sensors of acceptable quality, the
required processing power, and WiFi or Bluetooth interfaces for communication.
This paper describes the architecture of the SMEyeL (Smart Mobile Eyes for Localization)
system.
It is designed to utilize both smartphones and PCs as processing nodes of a visual
sensor network.
It aims to achieve high spatial and temporal resolution, and to optimize energy
consumption via computation offloading and distributed processing.
The SMEyeL project is open source: its source code and measurement data sets are available
on the web.
We present a detailed description of the architecture, together with experimental results
on timing accuracy and localization precision.