
ACCELERATING DISCOVERY

Increasing throughput for a given project means reducing the processing time per sample. Stated another way, increasing throughput means processing more data in a given amount of time.

Throughput can be increased in three main ways:

1

Parallel processing

Parallel processing multiplies the number of operations that run at the same time.

2

Faster processing

The existing processes are simply sped up by brute force.

3

Switching to a new workflow

This method involves devising new approaches and thinking beyond what is customary.

Most current systems aim to increase throughput by speeding up an existing workflow in one of three ways: moving the sample to the plane of interest, moving the optical components to the plane of interest, or using widefield tracking.

Celerity Nine introduces a new way of thinking about cell imaging to increase throughput.

A good high throughput imaging system must have three main components:

01.

Controlled light delivery and detection system

02.

Suitable data storage

03.

Automation methods that allow for increased throughput (e.g., automation through robotics)

Currently several advanced, yet very expensive, high throughput systems are available with extensive and powerful optics. As powerful as these systems are, they have two primary drawbacks:

01.

Prohibitively high cost

02.

The limitations imposed by the kinetic motion of robotic components, most pertinently the inability to increase throughput to the millisecond time scale

Celerity Nine uses a novel solid-state Newtonian optical setup to increase imaging throughput at an affordable price and without the restrictions inherent in kinetic robotic movement. This makes possible near-simultaneous, in vitro, in-incubator recordings of multiple biological samples undergoing dynamic processes only milliseconds in duration, such as a cardiac contraction. This opens the door to many new physiological discoveries: for example, measuring the arrhythmogenic properties of various compounds on cardiac monolayers.

PARALLEL MEASUREMENTS

The main goal of an experiment is to determine whether a certain manipulation has a specific effect. To achieve this goal, the experimental group is compared to a control group. Because it is impractical to measure the entire population of interest, we use samples to estimate the statistical parameters of that population. Biological experiments aim to determine the effect of a specific manipulation on a population's mean and variance, and statistics and probability are used to estimate and analyze these quantities from the samples.

A good experiment detects the smallest possible effect of a manipulation with the greatest possible accuracy. This is generally achieved by reducing variance and increasing sample size. Imaging multiple samples under physiological conditions over a long time allows us to do both. This becomes particularly pertinent when the dynamics under study change faster than the switch time between samples.
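As a rough illustration of this relationship (the sketch below is ours, not from the source, and the beat-rate numbers are hypothetical), the smallest mean difference a standard two-sample test can reliably detect shrinks as the per-group sample size grows or the variance falls:

```python
import math

def min_detectable_effect(sigma, n, z_alpha=1.96, z_beta=0.84):
    """Smallest true mean difference a two-sample test can reliably
    detect (two-sided alpha = 0.05, power = 80%), assuming both groups
    share variance sigma**2 and have n samples each."""
    se = math.sqrt(2 * sigma**2 / n)   # standard error of the difference
    return (z_alpha + z_beta) * se

# Hypothetical numbers: beat-rate standard deviation of 4 bpm.
print(min_detectable_effect(sigma=4.0, n=6))    # few wells per condition
print(min_detectable_effect(sigma=4.0, n=24))   # more wells: smaller effect detectable
print(min_detectable_effect(sigma=2.0, n=6))    # lower variance: same benefit
```

More wells per condition, or tighter control of variability, both translate directly into sensitivity to smaller effects, which is the statistical case for parallel, long-duration imaging.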

In a microscopy system that uses robotics to move samples under the objective, even at an accelerated rate, most samples are not being imaged most of the time; they are waiting in line. This is a problem when the dynamics of interest are rare and last a shorter time than the typical wait: they may simply be missed.
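A back-of-the-envelope model (ours, with hypothetical numbers) makes the cost of waiting concrete. If a well is only under the objective a fraction of the time, and events are brief relative to one imaging cycle and randomly timed, each event is captured with probability equal to that duty cycle:

```python
def p_capture(duty_cycle, n_events):
    """Chance that at least one of n_events brief, randomly timed events
    falls within the observed windows. Simplifying assumptions: events
    are much shorter than one imaging cycle and independent of the
    imaging schedule."""
    return 1.0 - (1.0 - duty_cycle) ** n_events

# Hypothetical scenario: 24 wells imaged sequentially, so each well is
# under the objective only 1/24 of the time, and 6 arrhythmic bursts occur.
sequential = p_capture(1 / 24, 6)   # ~0.23: most bursts are missed
parallel   = p_capture(1.0, 6)      # 1.0: continuous imaging records them all
print(sequential, parallel)
```

Under these assumptions a sequential system misses every burst in roughly three runs out of four, while parallel imaging captures them by construction.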

Additionally, measuring multiple samples simultaneously under biological conditions reduces variability between samples, thereby increasing the statistical strength of any experiment.

Taken together, the main benefit is that the device lets us control for more factors, cofactors, and lurking variables, decreasing variability, while also allowing more samples to be analyzed, which reduces the standard error and increases the power of the test. In other words, imaging multiple samples simultaneously for extended periods inside an incubator avoids the variability that would inevitably be introduced by imaging different passages, or the same passage, at different time points sequentially. The ability to make measurements in parallel is thus one of the device's greatest benefits. This is particularly pertinent when fast processes that develop over long periods of time, such as arrhythmias in cardiac monolayers, need to be studied and compared.

ACHIEVEMENTS

This device was developed and tested over several years. One such test used cardiac monolayers to detect cardiac arrhythmias. Arrhythmic events in monolayer cultures are rare in two ways: first, factors that lead to arrhythmia only change the probability of arrhythmias and are not deterministic in nature. Second, in a sample that demonstrates arrhythmias, the duration of ectopic events can vary, and they can appear and disappear spontaneously, randomly, and only infrequently. What this translates to, in addition to the need for greater sample numbers and reduced variability discussed above, is a need for extended recording periods: without long durations, rare abnormalities can easily be missed. Such long-duration recording is only possible because our device can function inside an incubator, so the cells are imaged under physiological conditions. Overall, the results showed that the device was able to distinguish between different arrhythmia rates and regularity, between different confluency levels, and between different time periods.

Figure: Representative recordings on Day 2 and Day 4 at confluency levels 1X, 2X, and 3X (Wells 1-4).
