3.3. Frame Samples and Pipelines

3.3.1. Base Classes

3.3.1.1. Frame Sampler

For higher dimensional observers (>0D), frame samplers determine how the individual pixels that make up the observer's frame are sampled. For example, a sampler may sample the full frame or adaptively select pixels based on their noise level.

class raysect.optical.observer.base.sampler.FrameSampler1D

Base class for 1D frame samplers.

generate_tasks()

Generates a list of tuples that selects the pixels to render.

Must return a list of tuples where each tuple contains the id of a pixel to render. For example:

tasks = [(1,), (5,), (512,), ...]

This is a virtual method and must be implemented in a sub class.

Parameters:pixels (int) – The number of pixels in the frame.
Return type:list
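
As a minimal sketch, a sampler that schedules every pixel of a 1D frame could implement generate_tasks() as follows (the class name is illustrative, not part of the documented API):

from raysect.optical.observer.base.sampler import FrameSampler1D


class AllPixelsSampler1D(FrameSampler1D):
    """Illustrative sampler that schedules every pixel in a 1D frame."""

    def generate_tasks(self, pixels):
        # one task tuple per pixel id: [(0,), (1,), ..., (pixels - 1,)]
        return [(pixel,) for pixel in range(pixels)]
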
class raysect.optical.observer.base.sampler.FrameSampler2D

Base class for 2D frame samplers.

generate_tasks()

Generates a list of tuples that selects the pixels to render.

Must return a list of tuples where each tuple contains the (x, y) id of a pixel to render. For example:

tasks = [(1, 10), (5, 53), (512, 354), ...]

This is a virtual method and must be implemented in a sub class.

Parameters:pixels (tuple) – Contains the (x, y) pixel dimensions of the frame.
Return type:list
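
The 2D equivalent iterates over both pixel dimensions. A minimal sketch (illustrative class name):

from raysect.optical.observer.base.sampler import FrameSampler2D


class AllPixelsSampler2D(FrameSampler2D):
    """Illustrative sampler that schedules every pixel in a 2D frame."""

    def generate_tasks(self, pixels):
        nx, ny = pixels
        # one (x, y) task tuple per pixel in the frame
        return [(x, y) for x in range(nx) for y in range(ny)]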

3.3.1.2. Pixel Processor

A pixel processor handles the processing of the spectra for each pixel sampled by the observer.

class raysect.optical.observer.base.processor.PixelProcessor

Base class for pixel processors.

To optimally use computing resources, observers may use parallel processes to sample the world.

Raysect observers launch multiple worker processes to sample the world; these processes send their results back to a single process that combines them into a frame.

In order to distribute the processing of the returned spectra, it is necessary to perform the data processing on each worker.

Each worker is given a pixel id and a number of spectral samples to collect for that pixel. The worker launches a ray to collect a sample and a spectrum is returned. When a spectrum is obtained, the worker calls add_sample() on each pixel processor associated with the pipelines attached to the observer. The pixel processor processes the spectrum and accumulates the results in its internal buffer.

When the pixel samples are complete, the worker calls pack_results() on each pixel processor. These results are sent back to the process handling the frame assembly.

add_sample()

Processes a spectrum and adds it to internal buffer.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • spectrum (Spectrum) – The sampled spectrum.
  • etendue (float) – The pixel etendue.
pack_results()

Packs the contents of the internal buffer.

This method must return a tuple. The contents and length of the tuple are entirely defined by the needs of the pipeline.

This is a virtual method and must be implemented in a sub class.

Return type:tuple
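
As a sketch of how these two methods work together, the processor below integrates each spectrum to a power value and accumulates a running sum. The class name is illustrative, and the use of Spectrum.total() to integrate the samples over wavelength is an assumption about the Spectrum API:

from raysect.optical.observer.base.processor import PixelProcessor


class MeanPowerProcessor(PixelProcessor):
    """Illustrative processor accumulating the power of each sampled spectrum."""

    def __init__(self):
        self._power_sum = 0.0
        self._samples = 0

    def add_sample(self, spectrum, etendue):
        # integrate radiance over wavelength (assumed Spectrum.total() helper)
        # and convert to power using the pixel etendue
        self._power_sum += spectrum.total() * etendue
        self._samples += 1

    def pack_results(self):
        # the receiving pipeline's update() defines how this tuple is interpreted
        return (self._power_sum, self._samples)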

3.3.1.3. Pipeline

Pipelines define how spectra are processed by observers into the form desired by the user. For example, the power pipelines define how the measured spectrum is integrated over the spectral range to give the overall power in W arriving at the observing surfaces. Pipelines also control the display and visualisation of results.

class raysect.optical.observer.base.pipeline.Pipeline0D

The base class for 0D pipelines.

finalise()

Finalises the results when rendering has finished.

This method is called when all workers have finished sampling and the results need to undergo any final processing.

If this pipeline implements some form of visualisation, use this method to plot the final visualisation of results.

This is a virtual method and must be implemented in a sub class.

initialise()

Initialises containers for the pipeline’s processing output.

The deriving class should use this method to perform any initialisation needed for their calculations.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • min_wavelength (float) – The minimum wavelength in the spectral range.
  • max_wavelength (float) – The maximum wavelength in the spectral range.
  • spectral_bins (int) – Number of spectral samples across wavelength range.
  • spectral_slices (list) – List of spectral sub-ranges for cases where the number of spectral rays is > 1 (i.e. dispersion effects turned on).
  • quiet (bool) – When True, suppresses output to the terminal.
pixel_processor()

Initialise and return a pixel processor for this pipeline.

This method is called by worker threads; each worker will request a pixel processor from each pipeline for processing the output of its work.

This is a virtual method and must be implemented in a sub class.

Parameters:slice_id (int) – The integer identifying the spectral slice being worked on by the requesting worker thread.
Return type:PixelProcessor
update()

Updates the internal results array with packed results from the pixel processor.

After a worker thread has observed the world and used the pixel processor to process the spectra into packed results, the worker passes the packed results to the pipeline with the update() method.

If this pipeline implements some form of visualisation, update the visualisation at the end of this method.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • slice_id (int) – The integer identifying the spectral slice being worked on by the worker thread.
  • packed_result (tuple) – The tuple of results generated by this pipeline’s PixelProcessor.
  • samples (int) – The number of samples taken by the worker. Needed for ensuring accuracy of statistical errors when combining new samples with previous results.
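
Tying these methods together, a minimal 0D pipeline might look like the sketch below. It reuses the illustrative MeanPowerProcessor from section 3.3.1.2 and prints the mean collected power when rendering finishes; the method signatures follow the documentation above, everything else is an assumption:

from raysect.optical.observer.base.pipeline import Pipeline0D


class TotalPowerPipeline0D(Pipeline0D):
    """Illustrative 0D pipeline accumulating the mean collected power."""

    def initialise(self, min_wavelength, max_wavelength, spectral_bins,
                   spectral_slices, quiet):
        self._power = 0.0
        self._samples = 0
        self._quiet = quiet

    def pixel_processor(self, slice_id):
        # MeanPowerProcessor is the illustrative processor sketched in 3.3.1.2
        return MeanPowerProcessor()

    def update(self, slice_id, packed_result, samples):
        power_sum, sample_count = packed_result
        self._power += power_sum
        self._samples += sample_count

    def finalise(self):
        if not self._quiet and self._samples:
            print('mean power: {} W'.format(self._power / self._samples))
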
class raysect.optical.observer.base.pipeline.Pipeline1D

The base class for 1D pipelines.

finalise()

Finalises the results when rendering has finished.

This method is called when all workers have finished sampling and the results need to undergo any final processing.

If this pipeline implements some form of visualisation, use this method to plot the final visualisation of results.

This is a virtual method and must be implemented in a sub class.

initialise()

Initialises containers for the pipeline’s processing output.

The deriving class should use this method to perform any initialisation needed for their calculations.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • pixels (tuple) – A tuple defining the pixel dimensions being sampled (e.g. (256,)).
  • pixel_samples (int) – The number of samples being taken per pixel. Needed for statistical calculations.
  • min_wavelength (float) – The minimum wavelength in the spectral range.
  • max_wavelength (float) – The maximum wavelength in the spectral range.
  • spectral_bins (int) – Number of spectral samples across wavelength range.
  • spectral_slices (list) – List of spectral sub-ranges for cases where the number of spectral rays is > 1 (i.e. dispersion effects turned on).
  • quiet (bool) – When True, suppresses output to the terminal.
pixel_processor()

Initialise and return a pixel processor for this pipeline and pixel coordinate.

This method is called by worker threads; each worker will request a pixel processor for the pixel it is processing.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • pixel (int) – The pixel coordinate of the pixel being sampled by the worker.
  • slice_id (int) – The integer identifying the spectral slice being worked on by the requesting worker thread.
Return type:PixelProcessor

update()

Updates the internal results array with packed results from the pixel processor.

After a worker thread has observed the world and used the pixel processor to process the spectra into packed results, the worker passes the packed results for the current pixel to the pipeline with the update() method. This method should add the results for this pixel to the pipeline’s results array.

If this pipeline implements some form of visualisation, update the visualisation at the end of this method.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • pixel (int) – The integer identifying the pixel being worked on.
  • slice_id (int) – The integer identifying the spectral slice being worked on by the worker thread.
  • packed_result (tuple) – The tuple of results generated by this pipeline’s PixelProcessor.
class raysect.optical.observer.base.pipeline.Pipeline2D

The base class for 2D pipelines.

finalise()

Finalises the results when rendering has finished.

This method is called when all workers have finished sampling and the results need to undergo any final processing.

If this pipeline implements some form of visualisation, use this method to plot the final visualisation of results.

This is a virtual method and must be implemented in a sub class.

initialise()

Initialises containers for the pipeline’s processing output.

The deriving class should use this method to perform any initialisation needed for their calculations.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • pixels (tuple) – A tuple defining the pixel dimensions being sampled (e.g. (512, 512)).
  • pixel_samples (int) – The number of samples being taken per pixel. Needed for statistical calculations.
  • min_wavelength (float) – The minimum wavelength in the spectral range.
  • max_wavelength (float) – The maximum wavelength in the spectral range.
  • spectral_bins (int) – Number of spectral samples across wavelength range.
  • spectral_slices (list) – List of spectral sub-ranges for cases where the number of spectral rays is > 1 (i.e. dispersion effects turned on).
  • quiet (bool) – When True, suppresses output to the terminal.
pixel_processor()

Initialise and return a pixel processor for this pipeline and pixel coordinate.

This method is called by worker threads; each worker will request a pixel processor for the pixel it is currently processing.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • x (int) – The x pixel coordinate (x, y) of the pixel being sampled by the worker.
  • y (int) – The y pixel coordinate (x, y) of the pixel being sampled by the worker.
  • slice_id (int) – The integer identifying the spectral slice being worked on by the requesting worker thread.
Return type:PixelProcessor

update()

Updates the internal results array with packed results from the pixel processor.

After a worker thread has observed the world and used the pixel processor to process the spectra into packed results, the worker passes the packed results for the current pixel to the pipeline with the update() method. This method should add the results for this pixel to the pipeline’s results array.

If this pipeline implements some form of visualisation, update the visualisation at the end of this method.

This is a virtual method and must be implemented in a sub class.

Parameters:
  • x (int) – The x pixel coordinate (x, y) of the pixel being sampled by the worker.
  • y (int) – The y pixel coordinate (x, y) of the pixel being sampled by the worker.
  • slice_id (int) – The integer identifying the spectral slice being worked on by the worker thread.
  • packed_result (tuple) – The tuple of results generated by this pipeline’s PixelProcessor.
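
To clarify the order in which an observer exercises these methods, the function below is a simplified, single-process driver for a Pipeline2D. Real observers run this loop in parallel worker processes and per spectral slice; sample_pixel is a hypothetical callable standing in for the observer's ray sampling:

def render_frame(pipeline, frame_sampler, sample_pixel, pixels, pixel_samples,
                 min_wavelength, max_wavelength, spectral_bins, spectral_slices,
                 slice_id=0, quiet=True):
    """Illustrative single-process driver showing the Pipeline2D call sequence."""

    pipeline.initialise(pixels, pixel_samples, min_wavelength, max_wavelength,
                        spectral_bins, spectral_slices, quiet)

    for x, y in frame_sampler.generate_tasks(pixels):
        processor = pipeline.pixel_processor(x, y, slice_id)
        # sample_pixel(x, y) is a hypothetical helper yielding (spectrum, etendue)
        for spectrum, etendue in sample_pixel(x, y):
            processor.add_sample(spectrum, etendue)
        pipeline.update(x, y, slice_id, processor.pack_results())

    pipeline.finalise()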

3.3.2. RGB

class raysect.optical.observer.pipeline.rgb.RGBPipeline2D

Bases: raysect.optical.observer.base.pipeline.Pipeline2D

2D pipeline of sRGB colour values.

Converts the measured spectrum from each pixel into sRGB colour space values. See the colour module for more information. The RGBPipeline2D class is the workhorse pipeline for visualisation of scenes with Raysect and the default pipeline for most 2D observers.

Parameters:
  • display_progress (bool) – Toggles the display of live render progress (default=True).
  • display_update_time (float) – Time in seconds between preview display updates (default=15 seconds).
  • accumulate (bool) – Whether to accumulate samples with subsequent calls to observe() (default=True).
  • display_auto_exposure (bool) – Toggles the use of automatic exposure of final images (default=True).
  • display_sensitivity (float) – The sensitivity of the camera, effectively the inverse of the exposure time (default=1.0).
  • display_unsaturated_fraction (float) – Fraction of pixels that must not be saturated. Display values will be scaled to satisfy this value (default=1.0).
  • name (str) – User friendly name for this pipeline.
display()

Plot the RGB frame.

display_auto_exposure

Toggles the use of automatic exposure on final image.

Return type:bool
display_sensitivity

The sensitivity of the camera, effectively the inverse of the exposure time.

Return type:float
display_unsaturated_fraction

Fraction of pixels that must not be saturated. Display values will be scaled to satisfy this value.

Return type:float
display_update_time

Time in seconds between preview display updates.

Return type:float
save()

Saves the display image to a png file.

The current display settings (exposure, gamma, etc.) are used to process the image prior to saving.

Parameters:filename (str) – Image path and filename.
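
A typical usage sketch is shown below. It assumes a pre-built world scenegraph and a PinholeCamera observer accepting pipelines and parent keyword arguments; only the pipeline arguments and the save() call are taken from the documentation above:

from raysect.optical import World
from raysect.optical.observer import PinholeCamera
from raysect.optical.observer.pipeline.rgb import RGBPipeline2D

world = World()
# ... add geometry, materials and light sources to the world here ...

rgb = RGBPipeline2D(display_update_time=10, display_unsaturated_fraction=0.98,
                    name="sRGB image")
camera = PinholeCamera((512, 512), pipelines=[rgb], parent=world)

camera.observe()          # render the scene
rgb.save("render.png")    # write the displayed image to disk
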
class raysect.optical.observer.pipeline.rgb.XYZPixelProcessor

Bases: raysect.optical.observer.base.processor.PixelProcessor

PixelProcessor that converts each pixel’s spectrum into three XYZ colourspace values.

class raysect.optical.observer.pipeline.rgb.RGBAdaptiveSampler2D

Bases: raysect.optical.observer.base.sampler.FrameSampler2D

FrameSampler that dynamically adjusts a camera’s pixel samples based on the noise level in each RGB pixel value.

Pixels that have high noise levels will receive extra samples until the desired noise threshold is achieved across the whole image.

Parameters:
  • pipeline (RGBPipeline2D) – The specific RGB pipeline to use for feedback control.
  • fraction (float) – The fraction of frame pixels to receive extra sampling (default=0.2).
  • ratio (float) –
  • min_samples (int) – Minimum number of pixel samples across the image before turning on adaptive sampling (default=1000).
  • cutoff (double) – Noise threshold at which extra sampling will be aborted and rendering will complete (default=0.0).
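
Because the sampler is driven by the statistics of a specific RGBPipeline2D, the same pipeline instance is passed to both the sampler and the observer. A sketch, again assuming a PinholeCamera observer that accepts pipelines and frame_sampler keyword arguments:

from raysect.optical import World
from raysect.optical.observer import PinholeCamera
from raysect.optical.observer.pipeline.rgb import (RGBPipeline2D,
                                                   RGBAdaptiveSampler2D)

world = World()
# ... build the scene as usual ...

rgb = RGBPipeline2D()
sampler = RGBAdaptiveSampler2D(rgb, fraction=0.2, min_samples=1000, cutoff=0.01)

camera = PinholeCamera((512, 512), pipelines=[rgb], frame_sampler=sampler,
                       parent=world)

# repeated observations accumulate samples, concentrating effort on noisy pixels
for _ in range(10):
    camera.observe()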

3.3.3. Bayer

class raysect.optical.observer.pipeline.bayer.BayerPipeline2D

Bases: raysect.optical.observer.base.pipeline.Pipeline2D

A 2D pipeline simulating a Bayer filter.

Many commercial cameras use a Bayer filter for converting measured spectra into a 2D image of RGB values. The 2D sensor pixel array is covered with a mosaic of alternating red, green and blue filters. Thus each pixel in the array is responsive to only one of the colour filters, simulating the response of the human eye. The final image is represented by a 2D grid of only red, green and blue values. The eye interpolates these values to create other colours. See Wikipedia for more information.

Parameters:
  • red_filter (SpectralFunction) – The spectral function representing the red pixel filter.
  • green_filter (SpectralFunction) – The spectral function representing the green pixel filter.
  • blue_filter (SpectralFunction) – The spectral function representing the blue pixel filter.
  • display_progress (bool) – Toggles the display of live render progress (default=True).
  • display_update_time (float) – Time in seconds between preview display updates (default=15 seconds).
  • accumulate (bool) – Whether to accumulate samples with subsequent calls to observe() (default=True).
  • display_auto_exposure (bool) – Toggles the use of automatic exposure of final images (default=True).
  • display_black_point (float) – Lower clamp point for pixel to appear black (default=0.0).
  • display_white_point (float) – Upper clamp point for pixel saturation (default=1.0).
  • display_unsaturated_fraction (float) – Fraction of pixels that must not be saturated. Display values will be scaled to satisfy this value (default=1.0).
  • display_gamma (float) – Gamma exponent to account for non-linear response of display screens (default=2.2).
  • name (str) – User friendly name for this pipeline (default=”Bayer Pipeline”).
display()

Plot the RGB frame.

display_auto_exposure

Toggles the use of automatic exposure on final image.

Return type:bool
display_black_point

Lower clamp point for pixel to appear black.

Return type:float
display_gamma

Power law exponent to approximate non-linear human eye response.

Each pixel value will be raised to power gamma:

\[V_{out} = V_{in}^{\gamma}\]

For more information see Wikipedia.

Return type:float
display_unsaturated_fraction

Fraction of pixels that must not be saturated. Display values will be scaled to satisfy this value.

Return type:float
display_update_time

Time in seconds between preview display updates.

Return type:float
display_white_point

Upper clamp point for pixel colour saturation.

Return type:float
save()

Saves the display image to a png file.

The current display settings (exposure, gamma, etc.) are used to process the image prior to saving.

Parameters:filename (str) – Image path and filename.
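
A usage sketch is given below. The filter curves are crude placeholder top-hat responses, not real sensor data; InterpolatedSF (a spectral function built from tabulated samples) and the PinholeCamera keyword arguments are assumptions about the wider Raysect API:

from raysect.optical import World, InterpolatedSF
from raysect.optical.observer import PinholeCamera
from raysect.optical.observer.pipeline.bayer import BayerPipeline2D

# crude top-hat filter responses (placeholder data, for illustration only)
wavelengths = [400, 480, 490, 570, 580, 720]
red_filter = InterpolatedSF(wavelengths, [0, 0, 0, 0, 1, 1])
green_filter = InterpolatedSF(wavelengths, [0, 0, 1, 1, 0, 0])
blue_filter = InterpolatedSF(wavelengths, [1, 1, 0, 0, 0, 0])

bayer = BayerPipeline2D(red_filter=red_filter, green_filter=green_filter,
                        blue_filter=blue_filter, display_gamma=2.2,
                        name="Bayer sensor")
camera = PinholeCamera((512, 512), pipelines=[bayer], parent=World())
camera.observe()
bayer.save("bayer.png")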

3.3.4. Power

class raysect.optical.observer.pipeline.power.PowerPipeline0D

Bases: raysect.optical.observer.base.pipeline.Pipeline0D

A power pipeline for 0D observers.

The raw spectrum collected by the observer is multiplied by a spectral filter and integrated to give the total power collected.

The measured value and error are accessed at self.value.mean and self.value.error respectively.

Parameters:
  • filter (SpectralFunction) – A filter function to be multiplied with the measured spectrum.
  • accumulate (bool) –
  • name (str) – User friendly name for this pipeline.
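
A usage sketch for a 0D observer follows; the Pixel observer class and its keyword arguments are assumptions not documented in this section, while value.mean and value.error are as documented above:

from raysect.optical import World
from raysect.optical.observer import Pixel
from raysect.optical.observer.pipeline.power import PowerPipeline0D

world = World()
# ... build the scene ...

power = PowerPipeline0D(name="collected power")
detector = Pixel(pipelines=[power], parent=world)
detector.observe()

print(power.value.mean, power.value.error)  # total power [W] and standard error
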
class raysect.optical.observer.pipeline.power.PowerPipeline2D

Bases: raysect.optical.observer.base.pipeline.Pipeline2D

A power pipeline for 2D observers.

The raw spectrum collected at each pixel by the observer is multiplied by a spectral filter and integrated to give the total power collected at that pixel.

The measured value and error for each pixel are accessed at self.frame.mean and self.frame.error respectively.

Parameters:
  • filter (SpectralFunction) – A filter function to be multiplied with the measured spectrum.
  • display_progress (bool) – Toggles the display of live render progress (default=True).
  • display_update_time (float) – Time in seconds between preview display updates (default=15 seconds).
  • accumulate (bool) – Whether to accumulate samples with subsequent calls to observe() (default=True).
  • display_auto_exposure (bool) – Toggles the use of automatic exposure of final images (default=True).
  • display_black_point (float) –
  • display_white_point (float) –
  • display_unsaturated_fraction (float) – Fraction of pixels that must not be saturated. Display values will be scaled to satisfy this value (default=1.0).
  • display_gamma (float) –
  • name (str) – User friendly name for this pipeline.
display_auto_exposure

Toggles the use of automatic exposure on final image.

Return type:bool
display_unsaturated_fraction

Fraction of pixels that must not be saturated. Display values will be scaled to satisfy this value.

Return type:float
display_update_time

Time in seconds between preview display updates.

Return type:float
save()

Saves the display image to a png file.

The current display settings (exposure, gamma, etc.) are used to process the image prior to saving.

Parameters:filename (str) – Image path and filename.
class raysect.optical.observer.pipeline.power.PowerPixelProcessor

Bases: raysect.optical.observer.base.processor.PixelProcessor

PixelProcessor that converts each pixel’s spectrum into total power by integrating over the spectrum and multiplying the resulting radiance value by the pixel’s etendue.

class raysect.optical.observer.pipeline.power.PowerAdaptiveSampler2D

Bases: raysect.optical.observer.base.sampler.FrameSampler2D

FrameSampler that dynamically adjusts a camera’s pixel samples based on the noise level in each pixel’s power value.

Pixels that have high noise levels will receive extra samples until the desired noise threshold is achieved across the whole image.

Parameters:
  • pipeline (PowerPipeline2D) – The specific power pipeline to use for feedback control.
  • fraction (float) – The fraction of frame pixels to receive extra sampling (default=0.2).
  • ratio (float) –
  • min_samples (int) – Minimum number of pixel samples across the image before turning on adaptive sampling (default=1000).
  • cutoff (double) – Normalised noise threshold at which extra sampling will be aborted and rendering will complete (default=0.0). The standard error is normalised to 1 so that a cutoff of 0.01 corresponds to 1% standard error.
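
As with the RGB adaptive sampler, the power sampler monitors a specific PowerPipeline2D instance. A sketch, assuming the same PinholeCamera keyword arguments as before; frame.mean and frame.error are as documented above:

from raysect.optical import World
from raysect.optical.observer import PinholeCamera
from raysect.optical.observer.pipeline.power import (PowerPipeline2D,
                                                     PowerAdaptiveSampler2D)

world = World()
# ... build the scene ...

power = PowerPipeline2D(name="pixel power")
sampler = PowerAdaptiveSampler2D(power, fraction=0.2, cutoff=0.05)

camera = PinholeCamera((256, 256), pipelines=[power], frame_sampler=sampler,
                       parent=world)
camera.observe()

mean_power = power.frame.mean    # per-pixel mean power [W]
power_error = power.frame.error  # per-pixel standard error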

3.3.5. Spectral

class raysect.optical.observer.pipeline.spectral.SpectralPipeline0D

Bases: raysect.optical.observer.base.pipeline.Pipeline0D

A basic spectrum pipeline for 0D observers.

The raw spectrum for the observer is stored along with the associated error on each wavelength bin.

Spectral values and errors are available through the self.frame attribute.

Parameters:
  • accumulate (bool) – Whether to accumulate samples with subsequent calls to observe() (default=True).
  • name (str) – User friendly name for this pipeline.
class raysect.optical.observer.pipeline.spectral.SpectralPipeline2D

Bases: raysect.optical.observer.base.pipeline.Pipeline2D

A basic spectrum pipeline for 2D observers.

The raw spectrum for each pixel is stored along with the associated error on each wavelength bin in a 2D frame object.

Spectral values and errors are available through the self.frame attribute.

Parameters:
  • accumulate (bool) – Whether to accumulate samples with subsequent calls to observe() (default=True).
  • name (str) – User friendly name for this pipeline.
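
A usage sketch; the PinholeCamera keyword arguments are assumptions, and the mean/error attribute names on the frame object are assumed by analogy with the power pipelines (only the existence of self.frame is documented above):

from raysect.optical import World
from raysect.optical.observer import PinholeCamera
from raysect.optical.observer.pipeline.spectral import SpectralPipeline2D

world = World()
# ... build the scene ...

spectral = SpectralPipeline2D(name="per-pixel spectra")
camera = PinholeCamera((128, 128), pipelines=[spectral], parent=world)
camera.observe()

# per-pixel spectral results; attribute names assumed by analogy with PowerPipeline2D
spectrum_mean = spectral.frame.mean
spectrum_error = spectral.frame.error
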
class raysect.optical.observer.pipeline.spectral.SpectralPixelProcessor

Bases: raysect.optical.observer.base.processor.PixelProcessor

PixelProcessor that stores the spectrum observed by each pixel.