Viewer Core

This object stores the data that the Viewer interacts with.

class mesmerize.viewer.core.ViewerWorkEnv(imgdata: Optional[mesmerize.viewer.core.data_types.ImgData] = None, sample_id='', UUID=None, meta=None, stim_maps=None, roi_manager=None, roi_states=None, comments='', origin_file='', custom_cols=None, history_trace: Optional[list] = None, additional_data: Optional[dict] = None, misc: Optional[dict] = None, **kwargs)[source]

UUID: if opened from a project Sample, refers to the ImgUUID

__init__(imgdata: Optional[mesmerize.viewer.core.data_types.ImgData] = None, sample_id='', UUID=None, meta=None, stim_maps=None, roi_manager=None, roi_states=None, comments='', origin_file='', custom_cols=None, history_trace: Optional[list] = None, additional_data: Optional[dict] = None, misc: Optional[dict] = None, **kwargs)[source]

A class that encapsulates the main work environment objects (img sequence, ROIs, and ROI associated curves) of the viewer. Allows for a work environment to be easily spawned from different types of sources and allows for a work environment to be easily saved in different ways regardless of the type of original data source.

  • roi_states (dict) – roi states from ROI Manager module

  • stim_maps (dict) – {‘units’: str, ‘dataframe’: pd.DataFrame}

  • history_trace (list) – list of dicts containing a traceable history of what was done with the work environment, such as params used by modules to process the data
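For illustration, a history trace of this form can be queried with plain Python. This is a minimal sketch: the `get_params` helper and the abbreviated param dicts are hypothetical, but the single-key `{module_name: params}` entry structure matches the example output shown later on this page.

```python
# Sketch of a history_trace: a list of single-key dicts, {module_name: params}
history_trace = [
    {'caiman_motion_correction': {'max_shifts_x': 32, 'max_shifts_y': 32}},
    {'cnmfe': {'min_corr': 0.96, 'min_pnr': 10}},
]

def get_params(trace: list, module: str) -> dict:
    """Return the params dict logged by `module`, or an empty dict (hypothetical helper)."""
    for entry in trace:
        if module in entry:
            return entry[module]
    return {}

get_params(history_trace, 'cnmfe')  # {'min_corr': 0.96, 'min_pnr': 10}
```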


list of weak references to the object (if defined)

static _organize_meta(meta: dict, origin: str) -> dict[source]

Organize an input meta data dict into a uniform structure.

  • meta – meta data dict, originating from a json file for example

  • origin – name of the origin source of the meta data, such as a program or microscope

Returns: dict organized with keys that are used throughout Mesmerize.
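As a sketch of what such an organizer might look like: the output keys ('origin', 'fps', 'date', 'orig_meta') mirror those shown by get_meta() later on this page, while the organizer function itself and the input key names are hypothetical, not Mesmerize's actual implementation.

```python
# Hypothetical organizer in the spirit of _organize_meta(): maps an
# origin-specific meta dict onto the uniform key structure.
def organize_awesomeimager(meta: dict) -> dict:
    return {
        'origin': meta['source'],
        'fps': meta['framerate'],
        'date': meta['date'] + '_' + meta['time'],
        'orig_meta': meta,  # keep the raw dict for provenance
    }

raw = {'source': 'AwesomeImager', 'framerate': 10.0,
       'date': '20190426', 'time': '152034'}
organize_awesomeimager(raw)['date']  # '20190426_152034'
```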


Cleanup of the work environment

classmethod from_mesfile(mesfile_object: mesmerize.viewer.core.mesfile.MES, img_reference: str)[source]

Return instance of work environment with MesmerizeCore.ImgData class object using seq returned from MES.load_img from MesmerizeCore.FileInput module and any stimulus map that the user may have specified.

classmethod from_pickle(pickle_file_path: str, tiff_path: Optional[str] = None)[source]

Get pickled image data from a pickle file & image sequence from a npz or tiff. Used after motion correction & to view a sample from a project DataFrame. Create ImgData class object (See MesmerizeCore.DataTypes) and return instance of the work environment.


pickle_file_path: full path to the pickle containing image metadata, stim maps, and roi_states


tiff_path: str of the full path to a tiff file containing the image sequence

classmethod from_tiff(path: str, method: str, meta_path: Optional[str] = None, axes_order: Optional[str] = None, meta_format: Optional[str] = None)[source]

Return instance of work environment with ImgData.seq set from the tiff file.

  • path – path to the tiff file

  • method – one of ‘imread’, ‘asarray’, or ‘asarray-multi’. Refers to usage of either tifffile.imread or tifffile.asarray. ‘asarray-multi’ will load multi-page tiff files.

  • meta_path – path to a file containing meta data

  • meta_format – meta data format, must correspond to the name of a function in viewer.core.organize_meta

  • axes_order – Axes order as a 3 or 4 letter string for 2D or 3D data respectively. Axes order is assumed to be “txy” or “tzxy” if not specified.
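To make the axes_order convention concrete, here is a small numpy sketch showing how a stack stored in some other order can be brought into the assumed "txy" order. The `to_txy` helper is hypothetical (not part of Mesmerize's API); it only illustrates the transpose implied by an axes_order string.

```python
import numpy as np

def to_txy(stack: np.ndarray, axes_order: str) -> np.ndarray:
    """Transpose a 3D stack stored in `axes_order` into 'txy' order (hypothetical helper)."""
    assert sorted(axes_order) == sorted('txy')
    return stack.transpose(*(axes_order.index(ax) for ax in 'txy'))

# a stack saved as [y, x, t]: 512 rows, 256 columns, 1000 frames
stack = np.zeros((512, 256, 1000))
to_txy(stack, 'yxt').shape  # (1000, 256, 512)
```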


history log

imgdata: mesmerize.viewer.core.data_types.ImgData

ImgData instance


Return True if the work environment is empty

static load_mesfile(path: str) -> mesmerize.viewer.core.mesfile.MES[source]

Just passes the path of a .mes file to the constructor of class MES in MesmerizeCore.FileInput. Loads .mes file & constructs MES obj from which individual images & their respective metadata can be loaded to construct viewer work environments using the classmethod viewerWorkEnv.from_mesfile.


path – full path to a single .mes file.


reference to the back-end ROI Manager that is currently in use


SampleID, if opened from a project Sample


Stimulus maps

to_pandas(proj_path: str, modify_options: Optional[dict] = None) -> list[source]

Used for saving the work environment as a project Sample.

  • proj_path – Root path of the current project

  • modify_options


list of dicts that each correspond to a single curve that can be appended as rows to the project dataframe

to_pickle(dir_path: str, filename: Optional[str] = None, save_img_seq=True, UUID=None) -> str[source]

Package the current work environment's ImgData class object (See MesmerizeCore.DataTypes) and any parameters, such as those for motion correction, into a pickle & image seq array. Used for batch motion correction and for saving the current sample to the project. The image sequence is saved as a tiff and other information about the image is saved in a pickle.


class mesmerize.viewer.core.data_types.ImgData(seq: Optional[numpy.ndarray] = None, meta: Optional[dict] = None)[source]

Object that stores the image sequence and meta data from the imaging source

__init__(seq: Optional[numpy.ndarray] = None, meta: Optional[dict] = None)[source]
  • seq – Image sequence as a numpy array, shape is [x, y, t] or [x, y, t, z]

  • meta – Meta data dict from the imaging source.


The Viewer is usually not interacted with directly from modules outside of the viewer (such as viewer modules). They instead use the ViewerUtils class, which includes helper functions and a reference to the viewer.

class mesmerize.viewer.core.ViewerUtils(viewer_reference: mesmerize.pyqtgraphCore.imageview.ImageView)[source]

Some utility functions for interfacing viewer.core.ViewerWorkEnv with the pyqtgraphCore.ImageView widget

__init__(viewer_reference: mesmerize.pyqtgraphCore.imageview.ImageView)[source]

Cleanup of the ViewerWorkEnv and ImageView widget


Ask the user if they want to discard their work environment. If Yes, calls _clear_workEnv()


Set the status bar message in the viewer window.


msg – text to display in the status bar


Update the ImageView widget with the ViewerWorkEnv


reference to the pyqtgraph ImageView widget instance (viewer)


ViewerWorkEnv instance


class mesmerize.viewer.core.mesfile.MES(filename: str)[source]

Handles opening .mes files and organizing the images and meta data. The load_img() method returns a 3D array (dims are [time, cols, rows]) of the image sequence and its associated meta data.

Usage: Create a MES instance by passing the path of your mes file, example:

mesfile = MES('/path/to/mesfile/experiment_Feb_31.mes')

Call the get_image_references() method to get a list of references for images that can be loaded.

To load an image that is available in the instance, just pass one of the references from get_image_references() to the load_img method:

img_array, meta_dict = mesfile.load_img('IF0001_0001')

__init__(filename: str)[source]

filename – full path of a single .mes file


list of weak references to the object (if defined)

get_image_references() -> list[source]

Get a list of all image references available in the instance

load_img(img_reference: str) -> (numpy.ndarray, dict)[source]

img_reference – The image reference, usually something like IFxxxx_xxxx or Ifxxxx_xxxx


(image sequence array, meta data dict)


These examples can be run in the Viewer Console.

Working with meta data

# view meta data
>>> get_meta()

{'origin': 'AwesomeImager', 'version': '4107ff58a0c3d4d5d3c15c3d6a69f8798a20e3de', 'fps': 10.0, 'date': '20190426_152034', 'vmin': 323, 'vmax': 1529, 'orig_meta': {'source': 'AwesomeImager', 'version': '4107ff58a0c3d4d5d3c15c3d6a69f8798a20e3de', 'level_min': 323, 'stims': {}, 'time': '152034', 'date': '20190426', 'framerate': 10.0, 'level_max': 1529}}

# manually set meta data entries
>>> get_meta()['fps'] = 30.0

Open image

Use the Viewer Core API to open any arbitrary image

This example loads an image stored as a numpy array (.npy file), but this is applicable to images stored in any format that can eventually be represented as a numpy array in Python. For example, you could also load image files stored in HDF5 format and extract the numpy array that represents your image sequence.

import numpy as np

# clear the viewer work environment
clear_workEnv()

a = np.load('/path_to_image.npy')

# check what the axes order is
a.shape
# (1000, 512, 512) # for example
# looks like this is in [t, x, y]
# this can be transposed so we get [x, y, t]
# ImgData takes either [x, y, t] or [x, y, t, z] axes order

# Define a meta data dict
meta = \
    {
        "origin":      "Tutorial example",
        "fps":         10.0,
        "date":        "20200629_171823",
        "scanner_pos": [0, 1, 2, 3, 4, 5, 6]
    }

# Create ImgData instance
imgdata = ImgData(a.T, meta)  # use a.T to get [x, y, t]

# Create a work environment instance
work_env = ViewerWorkEnv(imgdata)

# Set the current Viewer Work Environment from this new instance
vi.viewer.workEnv = work_env

# Update the viewer with the new work environment
# this MUST be run whenever you replace the viewer work environment (the previous line)
update_workEnv()

Image data

Image sequences are simply numpy arrays. For example, extract the image sequence between frames 1000 and 2000.

# Get the current image sequence
seq = get_image()

# Trim the image sequence
trim = seq[:, :, 1000:2000]

# Set the viewer work environment image sequence to the trimmed one
vi.viewer.workEnv.imgdata.seq = trim

# Update the GUI with the new work environment
update_workEnv()

View analysis log

View the analysis log, such as batch manager processing history.

>>> get_workEnv().history_trace

[{'caiman_motion_correction': {'max_shifts_x': 32, 'max_shifts_y': 32, 'iters_rigid': 1, 'name_rigid': 'Does not matter', 'max_dev': 20, 'strides': 196, 'overlaps': 98, 'upsample': 4, 'name_elas': 'a1_t2', 'output_bit_depth': 'Do not convert', 'bord_px': 5}}, {'cnmfe': {'Input': 'Current Work Environment', 'frate': 10.0, 'gSig': 10, 'bord_px': 5, 'min_corr': 0.9600000000000001, 'min_pnr': 10, 'min_SNR': 1, 'r_values_min': 0.7, 'decay_time': 2, 'rf': 80, 'stride': 40, 'gnb': 8, 'nb_patch': 8, 'k': 8, 'name_corr_pnr': 'a8_t1', 'name_cnmfe': 'a1_t2', 'do_corr_pnr': False, 'do_cnmfe': True}}, {'cnmfe': {'Input': 'Current Work Environment', 'frate': 10.0, 'gSig': 10, 'bord_px': 5, 'min_corr': 0.9600000000000001, 'min_pnr': 14, 'min_SNR': 1, 'r_values_min': 0.7, 'decay_time': 4, 'rf': 80, 'stride': 40, 'gnb': 8, 'nb_patch': 8, 'k': 8, 'name_corr_pnr': '', 'name_cnmfe': 'a1_t2', 'do_corr_pnr': False, 'do_cnmfe': True}}]
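Since the trace is plain Python data (a list of single-key `{module_name: params}` dicts), it can be inspected directly in the console. This sketch uses abbreviated param dicts for brevity; for example, to list every min_pnr value tried across the cnmfe runs above:

```python
# Abbreviated trace in the same shape as the output above
history_trace = [
    {'caiman_motion_correction': {'max_shifts_x': 32, 'iters_rigid': 1}},
    {'cnmfe': {'min_pnr': 10, 'decay_time': 2}},
    {'cnmfe': {'min_pnr': 14, 'decay_time': 4}},
]

# collect a parameter across all runs of one module
min_pnrs = [params['min_pnr']
            for entry in history_trace
            for module, params in entry.items()
            if module == 'cnmfe']
min_pnrs  # [10, 14]
```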

Running scripts

You can use the Script Editor to run scripts in the Viewer console for automating tasks such as batch creation. It basically allows you to use the viewer console more conveniently with a text editor. The execution environment of the viewer console and script editor are identical.

Some examples are provided for caiman modules and stimulus mapping.