Particle size
Particle size is a notion introduced for comparing dimensions of solid particles (flecks), liquid particles (droplets), or gaseous particles (bubbles). The notion applies to particles in colloids, particles in ecology, particles present in granular material (whether airborne or not), and particles that form a granular material (see also grain size).

Particle size measurement
There are several methods for measuring particle size and particle size distribution. Some of them are based on light, others on ultrasound, electric field, gravity, or centrifugation.
Size-exclusion chromatography
Size-exclusion chromatography (SEC), also known as molecular sieve chromatography, is a chromatographic method in which molecules in solution are separated by their size, and in some cases molecular weight. It is usually applied to large molecules or macromolecular complexes such as proteins and industrial polymers. Typically, when an aqueous solution is used to transport the sample through the column, the technique is known as gel-filtration chromatography, whereas the name gel permeation chromatography is used when an organic solvent serves as the mobile phase.
Rank–size distribution
Rank–size distribution is the distribution of size by rank, in decreasing order of size. For example, if a data set consists of items of sizes 5, 100, 5, and 8, the rank–size distribution is 100, 8, 5, 5 (ranks 1 through 4). This is also known as the rank–frequency distribution when the source data come from a frequency distribution. Such distributions are of particular interest when the data vary significantly in scale, as with city sizes or word frequencies.
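A minimal sketch of how the ranking in the example above could be computed; the data are the same four sizes, and nothing beyond sorting is involved:

```python
# Build a rank-size distribution: sort the sizes in decreasing order,
# so that rank 1 is the largest item.
sizes = [5, 100, 5, 8]
rank_size = sorted(sizes, reverse=True)

for rank, size in enumerate(rank_size, start=1):
    print(rank, size)
# 1 100
# 2 8
# 3 5
# 4 5
```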
Single particle analysis
Single particle analysis is a group of related computerized image processing techniques used to analyze images from transmission electron microscopy (TEM). These methods were developed to improve and extend the information obtainable from TEM images of particulate samples, typically proteins or other large biological entities such as viruses. Individual images of stained or unstained particles are very noisy, and so hard to interpret. Combining several digitized images of similar particles together gives an image with stronger and more easily interpretable features.
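As a schematic illustration only (synthetic data rather than real TEM micrographs; the image size, particle shape, noise level, and number of copies are arbitrary assumptions), averaging many noisy, already-aligned copies of the same motif raises the signal-to-noise ratio roughly as the square root of the number of images:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "particle": a bright square on a 64x64 field.
true_particle = np.zeros((64, 64))
true_particle[24:40, 24:40] = 1.0

# 500 noisy, already-aligned copies of the same particle.
noisy_stack = true_particle + rng.normal(0.0, 2.0, size=(500, 64, 64))

# Averaging suppresses the noise (its std falls roughly as 1/sqrt(N)).
class_average = noisy_stack.mean(axis=0)
print(np.abs(noisy_stack[0] - true_particle).mean())  # error of a single image
print(np.abs(class_average - true_particle).mean())   # much smaller after averaging
```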
Cryogenic electron microscopy
Cryogenic electron microscopy (cryo-EM) is a cryomicroscopy technique applied to samples cooled to cryogenic temperatures. For biological specimens, the structure is preserved by embedding in an environment of vitreous ice. An aqueous sample solution is applied to a grid mesh and plunge-frozen in liquid ethane or a mixture of liquid ethane and propane. While development of the technique began in the 1970s, recent advances in detector technology and software algorithms have allowed for the determination of biomolecular structures at near-atomic resolution.
Scanning transmission electron microscopy
A scanning transmission electron microscope (STEM) is a type of transmission electron microscope (TEM); the pronunciation is [stɛm] or [ɛsti:i:ɛm]. As with a conventional transmission electron microscope (CTEM), images are formed by electrons passing through a sufficiently thin specimen. However, unlike CTEM, in STEM the electron beam is focused to a fine spot (with a typical spot size of 0.05–0.2 nm) which is then scanned over the sample in a raster illumination system constructed so that the sample is illuminated at each point with the beam parallel to the optical axis.
Transmission electron cryomicroscopy
Transmission electron cryomicroscopy (CryoTEM), commonly known as cryo-EM, is a form of cryogenic electron microscopy, more specifically a type of transmission electron microscopy (TEM) in which the sample is studied at cryogenic temperatures (generally liquid-nitrogen temperatures). Cryo-EM is gaining popularity in structural biology. The utility of transmission electron cryomicroscopy stems from the fact that it allows the observation of specimens that have not been stained or fixed in any way, showing them in their native environment.
Sample size determination
Sample size determination is the act of choosing the number of observations or replicates to include in a statistical sample. The sample size is an important feature of any empirical study in which the goal is to make inferences about a population from a sample. In practice, the sample size used in a study is usually determined based on the cost, time, or convenience of collecting the data, and the need for it to offer sufficient statistical power.
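For one hedged, concrete illustration: a standard textbook calculation for estimating a proportion under simple random sampling (the function name and default values below are choices made here, not taken from the passage above):

```python
import math

def sample_size_for_proportion(p=0.5, z=1.96, margin=0.05):
    """Sample size needed to estimate a proportion p to within +/- margin,
    where z is the standard-normal quantile for the desired confidence level
    (1.96 for 95%). p = 0.5 is the most conservative planning value."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

print(sample_size_for_proportion())  # 385 respondents for 95% confidence, +/-5% margin
```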
Binomial distribution
In probability theory and statistics, the binomial distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent experiments, each asking a yes–no question, and each with its own Boolean-valued outcome: success (with probability p) or failure (with probability q = 1 − p). A single success/failure experiment is also called a Bernoulli trial or Bernoulli experiment, and a sequence of outcomes is called a Bernoulli process; for a single trial, i.e. n = 1, the binomial distribution is a Bernoulli distribution.
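A small sketch of the probability mass function this describes, using only the Python standard library (the example numbers are arbitrary):

```python
from math import comb

def binomial_pmf(k, n, p):
    """Probability of exactly k successes in n independent trials,
    each succeeding with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(3, 10, 0.5))  # 0.1171875
```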
Dynamic light scattering
Dynamic light scattering (DLS) is a technique in physics that can be used to determine the size distribution profile of small particles in suspension or polymers in solution. In the scope of DLS, temporal fluctuations are usually analyzed using the intensity or photon autocorrelation function (the technique is also known as photon correlation spectroscopy, PCS, or quasi-elastic light scattering, QELS). In time-domain analysis, the autocorrelation function (ACF) usually decays starting from zero delay time, and faster dynamics due to smaller particles lead to faster decorrelation of the scattered intensity trace.
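A rough sketch of the time-domain analysis described above, run on synthetic data (the toy trace, baseline, and correlation time are arbitrary assumptions, not a real DLS measurement):

```python
import numpy as np

# Normalized intensity autocorrelation g2(tau) of a scattered-intensity trace I(t).
# In DLS, smaller (faster) particles make g2 decay toward its baseline at shorter lags.
def g2(intensity, max_lag):
    I = np.asarray(intensity, dtype=float)
    mean_sq = I.mean() ** 2
    return np.array([np.mean(I[:-tau] * I[tau:]) / mean_sq
                     for tau in range(1, max_lag + 1)])

rng = np.random.default_rng(1)
# Toy trace: exponentially correlated fluctuations on a constant baseline.
kernel = np.exp(-np.arange(200) / 20.0)
trace = 10.0 + np.convolve(rng.normal(size=100_000), kernel, mode="same")

corr = g2(trace, 100)
print(corr[0], corr[-1])  # short-lag value exceeds 1; long-lag value approaches 1
```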
Transmission electron microscopy
Transmission electron microscopy (TEM) is a microscopy technique in which a beam of electrons is transmitted through a specimen to form an image. The specimen is most often an ultrathin section less than 100 nm thick or a suspension on a grid. An image is formed from the interaction of the electrons with the sample as the beam is transmitted through the specimen. The image is then magnified and focused onto an imaging device, such as a fluorescent screen, a layer of photographic film, or a sensor such as a scintillator attached to a charge-coupled device.
Static light scattering
Static light scattering is a technique in physical chemistry that measures the intensity of the scattered light to obtain the average molecular weight Mw of a macromolecule like a polymer or a protein in solution. Measurement of the scattering intensity at many angles allows calculation of the root mean square radius, also called the radius of gyration Rg. By measuring the scattering intensity for many samples of various concentrations, the second virial coefficient, A2, can be calculated.
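One standard way these quantities are connected is the Zimm relation in the low-angle (Guinier) regime; the symbols K for the optical constant, c for the concentration, ΔR(θ) for the excess Rayleigh ratio, and q for the scattering vector are conventional and not taken from the passage above:

\[
\frac{Kc}{\Delta R(\theta)} \approx \frac{1}{M_w}\left(1 + \frac{q^2 R_g^2}{3}\right) + 2A_2 c,
\qquad
q = \frac{4\pi n_0}{\lambda}\sin\frac{\theta}{2},
\]

where n_0 is the solvent refractive index and λ the vacuum wavelength of the light. Extrapolating to zero angle and zero concentration gives M_w, the angular dependence gives R_g, and the concentration dependence gives A_2.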
Weibull distribution
In probability theory and statistics, the Weibull distribution /ˈwaɪbʊl/ is a continuous probability distribution. It models a broad range of random variables, largely in the nature of a time to failure or time between events. Examples are maximum one-day rainfalls and the time a user spends on a web page. The distribution is named after Swedish mathematician Waloddi Weibull, who described it in detail in 1939, although it was first identified by Maurice René Fréchet and first applied by Rosin & Rammler (1933) to describe a particle size distribution.
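For reference, its density and distribution functions, with shape parameter k > 0 and scale parameter λ > 0, are:

\[
f(x; k, \lambda) = \frac{k}{\lambda}\left(\frac{x}{\lambda}\right)^{k-1} e^{-(x/\lambda)^k},
\qquad
F(x; k, \lambda) = 1 - e^{-(x/\lambda)^k},
\qquad x \ge 0,
\]

with f(x) = F(x) = 0 for x < 0.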
Powder diffraction
Powder diffraction is a scientific technique using X-ray, neutron, or electron diffraction on powder or microcrystalline samples for structural characterization of materials. An instrument dedicated to performing such powder measurements is called a powder diffractometer. Powder diffraction stands in contrast to single crystal diffraction techniques, which work best with a single, well-ordered crystal. The most common type of powder diffraction is with X-rays, the focus of this article, although some aspects of neutron powder diffraction are mentioned.
X-ray crystallography
X-ray crystallography is the experimental science of determining the atomic and molecular structure of a crystal, in which the crystalline structure causes a beam of incident X-rays to diffract into many specific directions. By measuring the angles and intensities of these diffracted beams, a crystallographer can produce a three-dimensional picture of the density of electrons within the crystal. From this electron density, the mean positions of the atoms in the crystal can be determined, as well as their chemical bonds, their crystallographic disorder, and various other information.
Log-normal distribution
In probability theory, a log-normal (or lognormal) distribution is a continuous probability distribution of a random variable whose logarithm is normally distributed. Thus, if the random variable X is log-normally distributed, then Y = ln(X) has a normal distribution. Equivalently, if Y has a normal distribution, then the exponential function of Y, X = exp(Y), has a log-normal distribution. A random variable which is log-normally distributed takes only positive real values.
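Its density, with μ and σ the mean and standard deviation of ln X, is:

\[
f(x) = \frac{1}{x \sigma \sqrt{2\pi}} \exp\!\left(-\frac{(\ln x - \mu)^2}{2\sigma^2}\right),
\qquad x > 0.
\]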
Pareto distribution
The Pareto distribution, named after the Italian civil engineer, economist, and sociologist Vilfredo Pareto, is a power-law probability distribution used to describe social, quality-control, scientific, geophysical, actuarial, and many other types of observable phenomena. The principle was originally applied to describing the distribution of wealth in a society, fitting the trend that a large portion of wealth is held by a small fraction of the population.
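In its standard (Type I) form, with scale x_m > 0 (the minimum possible value) and shape α > 0, the distribution is given by:

\[
\Pr(X > x) = \left(\frac{x_m}{x}\right)^{\alpha},
\qquad
f(x) = \frac{\alpha x_m^{\alpha}}{x^{\alpha+1}},
\qquad x \ge x_m.
\]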
Electron microscope
An electron microscope is a microscope that uses a beam of electrons as a source of illumination. It uses electron optics that are analogous to the glass lenses of an optical light microscope. As the wavelength of an electron can be up to 100,000 times shorter than that of visible light, electron microscopes have a higher resolution of about 0.1 nm, which compares to about 200 nm for light microscopes.
X-ray scattering techniques
X-ray scattering techniques are a family of non-destructive analytical techniques which reveal information about the crystal structure, chemical composition, and physical properties of materials and thin films. These techniques are based on observing the scattered intensity of an X-ray beam hitting a sample as a function of incident and scattered angle, polarization, and wavelength or energy.
Moving average
In statistics, a moving average (rolling average or running average) is a calculation to analyze data points by creating a series of averages of different selections of the full data set. It is also called a moving mean (MM) or rolling mean and is a type of finite impulse response filter. Variations include simple, cumulative, and weighted forms (described below). A moving average filter is sometimes called a boxcar filter, especially when followed by decimation.
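A minimal sketch of the simple form (the window length and data here are arbitrary):

```python
def moving_average(data, window):
    """Simple moving average: the mean of each consecutive length-`window` run."""
    if not 1 <= window <= len(data):
        raise ValueError("window must be between 1 and len(data)")
    return [sum(data[i:i + window]) / window
            for i in range(len(data) - window + 1)]

print(moving_average([1, 2, 3, 4, 5, 6], 3))  # [2.0, 3.0, 4.0, 5.0]
```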