A pipeline architecture for analyzing multiple streams of video is
embodied, in part, in a layer of application program interfaces (APIs) to
each stage of processing. Buffer queuing is used between some stages,
which helps moderate the load on the CPU(s). Through the layer of APIs,
numerous video analysis applications can access and analyze video data
flowing through the pipeline and, based on the analyses performed, can
annotate portions of the video data (e.g., frames and groups of frames)
with information that describes those portions. These annotated
frames and groups flow through the pipeline to subsequent stages of
processing, at which increasingly complex analyses can be performed. At
each stage, portions of the video data that are of little or no interest
are removed from the video data. Ultimately, "events" are constructed and
stored in a database, from which cross-event and historical analyses may
be performed and associations with, and among, events may be made.
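The staged flow described above can be sketched in a minimal form: buffer
queues sit between stages, an early stage annotates frames and discards
those of little interest, and a later stage groups the surviving frames
into "events" for storage. The frame fields, the motion metric, and the
threshold below are illustrative assumptions, not part of the described
architecture.

```python
from collections import deque

MOTION_THRESHOLD = 0.1  # assumed cutoff separating "of interest" from not


def stage_annotate(in_q, out_q):
    """Early stage: cheap per-frame analysis.

    Annotates each frame with a (hypothetical) motion score and removes
    frames of little or no interest before they reach later stages.
    """
    while in_q:
        frame = in_q.popleft()
        motion = frame["pixels_changed"] / frame["total_pixels"]
        frame["annotations"] = {"motion": motion}
        if motion >= MOTION_THRESHOLD:
            out_q.append(frame)  # only interesting frames flow onward


def stage_build_events(in_q, db):
    """Later stage: more complex analysis over surviving frames.

    Groups runs of consecutive interesting frames into events and stores
    them in a database (modeled here as a plain list).
    """
    event = None
    while in_q:
        frame = in_q.popleft()
        if event and frame["id"] == event["frames"][-1] + 1:
            event["frames"].append(frame["id"])  # extend the current event
        else:
            event = {"frames": [frame["id"]]}    # start a new event
            db.append(event)


def run_pipeline(frames):
    # Buffer queues between stages moderate load by decoupling producers
    # from consumers; a real system might service them from worker threads.
    q1, q2, db = deque(frames), deque(), []
    stage_annotate(q1, q2)
    stage_build_events(q2, db)
    return db
```

For example, feeding six synthetic frames where frames 1, 2, 4, and 5
exceed the motion threshold yields two stored events, one for each run of
consecutive interesting frames:

```python
frames = [
    {"id": i, "pixels_changed": c, "total_pixels": 100}
    for i, c in enumerate([0, 20, 30, 0, 50, 60])
]
db = run_pipeline(frames)  # two events: frames [1, 2] and frames [4, 5]
```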