Pipelines process the images to produce catalogs, which are then made accessible to the community via open interfaces in a Virtual Observatory model. Because new data will be collected nightly throughout the 10-year duration of Rubin Observatory's Legacy Survey of Space and Time (LSST), and the scientific algorithms will evolve over that period, significant re-processing will occur. This must be taken into account both in sizing Rubin Observatory's technology resources and in keeping the Rubin Observatory middleware easily extensible.
The pipelines can be categorized as "near real-time" or "static," depending on how stringent the associated latency and throughput deadlines are. Examples of near real-time pipelines include data quality assessments that provide feedback to telescope operations, instrument calibration processing, and time-domain/transient science analysis. These pipelines execute at the mountain/base facility in order to avoid the latency associated with long-haul transmission of the raw data. The static pipelines include deep image co-addition (stacking of multiple exposures), the weak lensing shear processing needed for dark energy and dark matter science, and object cataloging. These pipelines execute at the archive center, which also performs re-processing of the near real-time pipelines.
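The categorization and routing logic described above can be sketched in a few lines of Python. This is an illustrative model only, not the actual Rubin middleware API: the class names, the `latency_deadline_s` field, and the rule "a hard latency deadline implies near real-time execution at the base facility" are assumptions made for this sketch.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Site(Enum):
    """Execution sites described in the text."""
    BASE_FACILITY = "mountain/base facility"   # low latency, near the telescope
    ARCHIVE_CENTER = "archive center"          # bulk processing and re-processing


@dataclass(frozen=True)
class Pipeline:
    name: str
    latency_deadline_s: Optional[float]  # None means no hard real-time deadline


def classify(p: Pipeline) -> str:
    # Hypothetical rule: any pipeline with a hard deadline is "near real-time".
    return "near real-time" if p.latency_deadline_s is not None else "static"


def route(p: Pipeline) -> Site:
    # Near real-time work runs at the base facility to avoid long-haul
    # transmission latency; static work (and all re-processing) runs at
    # the archive center.
    if classify(p) == "near real-time":
        return Site.BASE_FACILITY
    return Site.ARCHIVE_CENTER


# Example pipelines drawn from the text (deadline values are illustrative).
alerts = Pipeline("transient science analysis", latency_deadline_s=60.0)
coadd = Pipeline("deep image co-addition", latency_deadline_s=None)
```

Routing on the presence of a deadline, rather than on a hard-coded list of pipeline names, keeps the sketch extensible as new pipelines are added.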