Tuesday, February 3, 2015

Cytotoxicity Testing



DEFINITION

Cytotoxicity is a term that describes how toxic, or poisonous, a substance is to cells. Exposure to cytotoxic substances can result in permanent cellular damage or even cell death. To determine levels of cytotoxicity, laboratory tests and assays are often conducted on substances or ingredients that will be included in a medication or medical device. As for its etymology, the term “cytotoxicity” combines two Greek words: “kytos,” which refers to the cell, and “toxikon,” which pertains to poison.
Substances described as cytotoxic can include certain chemicals or even other types of cells. Among chemicals, some naturally produced cytotoxins occur in animal venom, such as that of some spiders and snakes. Vipers, for example, are known to release a type of cytotoxin called a haemotoxin, which can rupture red blood cells and cause internal bleeding and organ damage. Another dangerous cytotoxin is the cardiotoxin, which is often associated with the king cobra’s venomous bite. This toxin attaches itself to the muscle cells of the heart, causing the organ to stop pumping blood, which can result in death.

MEASURING CYTOTOXICITY

Cytotoxicity assays are widely used by the pharmaceutical industry to screen for cytotoxicity in compound libraries. Researchers may look for cytotoxic compounds if they are interested in developing a therapeutic that targets rapidly dividing cancer cells, for instance, or they may screen "hits" from initial high-throughput drug screens for unwanted cytotoxic effects before investing in their development as pharmaceuticals.
Assessing cell membrane integrity is one of the most common ways to measure cell viability and cytotoxic effects. Compounds that have cytotoxic effects often compromise cell membrane integrity. Vital dyes, such as trypan blue or propidium iodide, are normally excluded from the inside of healthy cells; however, if the cell membrane has been compromised, they freely cross the membrane and stain intracellular components. Alternatively, membrane integrity can be assessed by monitoring the passage of substances that are normally sequestered inside cells to the outside. One such molecule, lactate dehydrogenase (LDH), is commonly measured using the LDH release assay. Protease biomarkers have also been identified that allow researchers to measure the relative numbers of live and dead cells within the same cell population. The live-cell protease is only active in cells that have a healthy cell membrane, and loses activity once the cell is compromised and the protease is exposed to the external environment. The dead-cell protease cannot cross the cell membrane, and can only be measured in culture media after cells have lost their membrane integrity.
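For readers who want to see how LDH-release data are typically turned into a number, here is a minimal sketch of the usual percent-cytotoxicity calculation in Python; the function name and the plate-reader absorbance values are made up for illustration.

# Minimal sketch of the standard LDH-release calculation; values are hypothetical.
def percent_cytotoxicity(experimental, spontaneous, maximum):
    """Percent of cells with compromised membranes.
    experimental - absorbance of compound-treated wells
    spontaneous  - absorbance of untreated wells (baseline LDH release)
    maximum      - absorbance of fully lysed untreated wells (total LDH)"""
    return 100.0 * (experimental - spontaneous) / (maximum - spontaneous)

# Hypothetical plate-reader absorbances at 490 nm
print(percent_cytotoxicity(experimental=0.82, spontaneous=0.15, maximum=1.60))  # ~46%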
Cytotoxicity can also be monitored using the 3-(4,5-dimethyl-2-thiazolyl)-2,5-diphenyl-2H-tetrazolium bromide (MTT) or MTS assay. These assays measure the reducing potential of the cell using a colorimetric reaction: viable cells reduce the MTT or MTS reagent to a colored formazan product. A similar redox-based assay has also been developed using the fluorescent dye resazurin. In addition to using dyes to indicate the redox potential of cells in order to monitor their viability, researchers have developed assays that use ATP content as a marker of viability. Such ATP-based assays include bioluminescent assays in which ATP is the limiting reagent for the luciferase reaction.
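The MTT/MTS readout is usually reported as viability relative to untreated controls. A minimal sketch of that calculation, again with hypothetical absorbance values:

# Minimal sketch of viability relative to untreated controls; values are hypothetical.
def percent_viability(a_treated, a_control, a_blank):
    """Viability of treated cells as a percentage of the untreated control,
    after subtracting the no-cell blank from both readings."""
    return 100.0 * (a_treated - a_blank) / (a_control - a_blank)

# Hypothetical formazan absorbances at 570 nm
print(percent_viability(a_treated=0.48, a_control=0.95, a_blank=0.05))  # ~48%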
Cytotoxicity can also be measured with the sulforhodamine B (SRB) assay, the WST assay, and the clonogenic assay.
A label-free approach to following the cytotoxic response of adherent animal cells in real time is based on electric impedance measurements when the cells are grown on gold-film electrodes. This technology is referred to as electric cell-substrate impedance sensing (ECIS). Label-free real-time techniques provide the kinetics of the cytotoxic response rather than just a snapshot, as many colorimetric endpoint assays do.
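As a rough illustration of how such kinetic data are handled, the sketch below normalizes a hypothetical impedance trace to the reading taken at the moment of compound addition, so a cytotoxic response shows up as a drop below 1.0. The function name and values are invented for this example and are not tied to any particular ECIS instrument or software.

# Minimal sketch: normalize an impedance trace to the reading at compound addition,
# so a cytotoxic response appears as a drop below 1.0. Values are hypothetical.
def normalize_to_addition(impedance, addition_index):
    baseline = impedance[addition_index]
    return [z / baseline for z in impedance]

# Hypothetical impedance readings (ohms), one every 30 min; compound added at index 3
trace = [1800, 1850, 1900, 1920, 1700, 1400, 1100, 950]
print([round(z, 2) for z in normalize_to_addition(trace, addition_index=3)])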

Saturday, January 10, 2015

ATOMIC ABSORPTION SPECTROSCOPY

               Atomic absorption spectroscopy (AAS) is a common technique used in many analytical chemistry protocols, as well as in applications requiring a high degree of precision and accuracy, such as food and drug safety, clinical diagnostics, and environmental monitoring.

                In order to analyze a sample for its atomic constituents, the sample must first be atomized. The atoms are then irradiated with optical radiation, which can come from an element-specific line source or from a continuum source. The radiation then passes through a monochromator, which separates the element-specific radiation from any other radiation emitted by the source, before it is finally measured by a detector.
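To make the measurement step concrete, here is a minimal sketch of how a detector reading is converted into a concentration: absorbance from the Beer-Lambert relation, and a linear calibration curve built from standards. All intensities, concentrations, and function names are hypothetical and only meant to illustrate the arithmetic.

import math

# Minimal sketch of AAS quantification: absorbance from transmitted intensity
# (Beer-Lambert law) and a linear calibration curve built from standards.
def absorbance(i_incident, i_transmitted):
    """A = log10(I0 / I)."""
    return math.log10(i_incident / i_transmitted)

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = m*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return m, my - m * mx

# Hypothetical standards: concentration (mg/L) vs. measured absorbance
conc = [0.0, 1.0, 2.0, 4.0]
absb = [0.002, 0.105, 0.210, 0.418]
slope, intercept = linear_fit(conc, absb)

a_sample = absorbance(i_incident=100.0, i_transmitted=55.0)  # ~0.26
print((a_sample - intercept) / slope)                        # estimated concentration, ~2.5 mg/L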

  

Atomizers

                The atomizers most commonly used nowadays are (spectroscopic) flames and electrothermal (graphite tube) atomizers. Other atomizers, such as glow-discharge atomization, hydride atomization, or cold-vapor atomization might be used for special purposes.


Background Absorption and Background Correction

                The relatively small number of atomic absorption lines (compared to atomic emission lines) and their narrow width (a few pm) make spectral overlap rare; only a few cases are known in which an absorption line of one element overlaps with that of another. Molecular absorption, in contrast, is much broader, so it is more likely that some molecular absorption band will overlap with an atomic line. This kind of absorption might be caused by un-dissociated molecules of concomitant elements in the sample or by the flame gases. We have to distinguish between the spectra of di-atomic molecules, which exhibit a pronounced fine structure, and those of larger (usually tri-atomic) molecules that don’t show such fine structure.

Background correction techniques in LS AAS

Deuterium background correction
This is the oldest and still most commonly used technique, particularly for flame AAS. In this case, a separate source (a deuterium lamp) with broad emission is used to measure the background absorption over the entire width of the exit slit of the spectrometer.
Smith-Hieftje background correction
This technique (named after its inventors) is based on the line-broadening and self-reversal of emission lines from the hollow cathode lamp (HCL) when a high current is applied. Total absorption is measured with normal lamp current, i.e., with a narrow emission line, and background absorption after application of a high-current pulse with the profile of the self-reversed line, which has little emission at the original wavelength but strong emission on both sides of the analytical line.
Zeeman-effect background correction
An alternating magnetic field is applied at the atomizer (graphite furnace) to split the absorption line into three components: the π component, which remains at the same position as the original absorption line, and two σ components, which are shifted to higher and lower wavelengths, respectively. Total absorption is measured without the magnetic field and background absorption with the magnetic field on. The π component has to be removed in this case, e.g. using a polarizer, and the σ components do not overlap with the emission profile of the lamp, so only the background absorption is measured.
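All three techniques above rely on the same arithmetic: one measurement captures analyte plus background, the other captures (essentially) background alone, and the difference is attributed to the analyte. A minimal sketch, with hypothetical absorbance values:

# Minimal sketch of the arithmetic shared by the three correction schemes above;
# absorbance values are hypothetical.
def corrected_absorbance(a_total, a_background):
    """Deuterium: a_total from the HCL, a_background from the D2 lamp.
    Smith-Hieftje: normal-current vs. high-current (self-reversed) measurement.
    Zeeman: field-off vs. field-on measurement."""
    return a_total - a_background

print(corrected_absorbance(a_total=0.350, a_background=0.120))  # 0.23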

Background correction techniques in HR-CS AAS

Background correction using correction pixels
It has already been mentioned that in HR-CS AAS lamp flicker noise is eliminated using correction pixels. In fact, any increase or decrease in radiation intensity that is observed to the same extent at all pixels chosen for correction is eliminated by the correction algorithm. This obviously also includes a reduction of the measured intensity due to radiation scattering or molecular absorption, which is corrected in the same way.
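A rough sketch of that idea: every pixel reading is divided by the mean intensity at pixels chosen away from the analytical line, so any change that affects all pixels equally (lamp flicker, scattering, broad molecular absorption) cancels out. The pixel values and the function name are hypothetical.

# Rough sketch: divide every pixel by the mean intensity at the correction pixels,
# so changes that affect all pixels equally (lamp flicker, broad background) cancel.
def correct_common_mode(spectrum, correction_pixels):
    reference = sum(spectrum[i] for i in correction_pixels) / len(correction_pixels)
    return [value / reference for value in spectrum]

# Hypothetical detector counts; pixels 0, 1, 8 and 9 lie away from the analytical line
raw = [1000, 1005, 998, 950, 700, 690, 940, 995, 1002, 999]
print([round(v, 3) for v in correct_common_mode(raw, [0, 1, 8, 9])])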
Background correction using a least-squares algorithm
The above technique obviously cannot correct for a background with fine structure, as in this case the absorbance will be different at each of the correction pixels. In this case, HR-CS AAS offers the possibility of measuring correction spectra of the molecule(s) responsible for the background and storing them in the computer. These spectra are then scaled by a factor to match the intensity of the sample spectrum and subtracted pixel by pixel, spectrum by spectrum, from the sample spectrum using a least-squares algorithm.
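As an illustration of the least-squares step, the sketch below scales a stored reference spectrum of the interfering molecule to best match the sample spectrum, here fitting only pixels away from the analytical line so the analyte does not bias the fit, and subtracts it pixel by pixel. The spectra, pixel choices, and function name are invented for this example.

# Rough sketch of the least-squares subtraction; spectra are hypothetical absorbance
# values on a common pixel grid.
def subtract_reference(sample, reference, fit_pixels):
    # Scale factor minimizing the squared residuals over fit_pixels; the
    # analytical-line pixel is left out so the analyte does not bias the fit.
    num = sum(sample[i] * reference[i] for i in fit_pixels)
    den = sum(reference[i] ** 2 for i in fit_pixels)
    c = num / den
    return [s - c * r for s, r in zip(sample, reference)]

molecular = [0.00, 0.02, 0.05, 0.08, 0.05, 0.02, 0.00]   # stored reference spectrum
sample    = [0.00, 0.04, 0.10, 0.26, 0.10, 0.04, 0.00]   # analyte line at the centre pixel
print([round(v, 3) for v in subtract_reference(sample, molecular, fit_pixels=[0, 1, 2, 4, 5, 6])])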