Spectroscopy
Spectroscopy is the field of study that measures and interprets the electromagnetic spectra that result from the interaction between electromagnetic radiation and matter as a function of the wavelength or frequency of the radiation. Matter waves and acoustic waves can also be considered forms of radiative energy, and recently gravitational waves have been associated with a spectral signature in the context of the Laser Interferometer Gravitational-Wave Observatory (LIGO).
Absorption spectroscopy
Absorption spectroscopy refers to spectroscopic techniques that measure the absorption of electromagnetic radiation, as a function of frequency or wavelength, due to its interaction with a sample. The sample absorbs energy, i.e., photons, from the radiating field. The intensity of the absorption varies as a function of frequency, and this variation is the absorption spectrum. Absorption spectroscopy is performed across the electromagnetic spectrum.
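One widely used quantitative relation behind such measurements (a standard textbook result, not specific to any instrument described here) is the Beer–Lambert law, which links the measured attenuation at a given wavelength to the concentration of the absorbing species:

```latex
% Beer–Lambert law: absorbance A at wavelength \lambda
% I_0: incident intensity, I: transmitted intensity,
% \varepsilon: molar attenuation coefficient, c: concentration, \ell: path length
A(\lambda) = \log_{10}\!\frac{I_0(\lambda)}{I(\lambda)} = \varepsilon(\lambda)\, c\, \ell
```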
Raman spectroscopy
Raman spectroscopy (/ˈrɑːmən/; named after Indian physicist C. V. Raman) is a spectroscopic technique typically used to determine vibrational modes of molecules, although rotational and other low-frequency modes of systems may also be observed. Raman spectroscopy is commonly used in chemistry to provide a structural fingerprint by which molecules can be identified. Raman spectroscopy relies upon inelastic scattering of photons, known as Raman scattering.
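Raman shifts are conventionally reported in wavenumbers; a standard conversion from excitation and scattered wavelengths (quoted from general practice, not from this article) is:

```latex
% Raman shift in wavenumbers (cm^-1), with wavelengths given in nm
% \lambda_0: excitation wavelength, \lambda_1: Raman-scattered wavelength
\Delta\tilde{\nu} = \left(\frac{1}{\lambda_0} - \frac{1}{\lambda_1}\right) \times 10^{7}\ \text{cm}^{-1}
```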
Reduction (complexity)
In computability theory and computational complexity theory, a reduction is an algorithm for transforming one problem into another problem. A sufficiently efficient reduction from one problem to another may be used to show that the second problem is at least as difficult as the first. Intuitively, problem A is reducible to problem B if an algorithm for solving problem B efficiently (if it existed) could also be used as a subroutine to solve problem A efficiently. When this is true, solving A cannot be harder than solving B.
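A minimal sketch of the idea (the tiny example graph and the brute-force "solver" below are assumptions made purely for illustration): the classic reduction from Independent Set to Vertex Cover uses the fact that S is an independent set of a graph exactly when the remaining vertices form a vertex cover, so one call to a Vertex Cover routine answers an Independent Set question.

```python
from itertools import combinations

def has_vertex_cover(vertices, edges, size):
    """Hypothetical black-box decision procedure for Vertex Cover (brute force here)."""
    return any(all(u in cover or v in cover for u, v in edges)
               for cover in combinations(vertices, size))

def has_independent_set(vertices, edges, k):
    # Reduction: G has an independent set of size >= k
    # iff G has a vertex cover of size <= |V| - k.
    return has_vertex_cover(vertices, edges, len(vertices) - k)

if __name__ == "__main__":
    V = ["a", "b", "c", "d"]
    E = [("a", "b"), ("b", "c"), ("c", "d")]   # a path on four vertices
    print(has_independent_set(V, E, 2))        # True: {a, c} is independent
```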
Infrared spectroscopy
Infrared spectroscopy (IR spectroscopy or vibrational spectroscopy) is the measurement of the interaction of infrared radiation with matter by absorption, emission, or reflection. It is used to study and identify chemical substances or functional groups in solid, liquid, or gaseous forms. It can be used to characterize new materials or identify and verify known and unknown samples. The method or technique of infrared spectroscopy is conducted with an instrument called an infrared spectrometer (or spectrophotometer) which produces an infrared spectrum.
Complexity class
In computational complexity theory, a complexity class is a set of computational problems "of related resource-based complexity". The two most commonly analyzed resources are time and memory. In general, a complexity class is defined in terms of a type of computational problem, a model of computation, and a bounded resource like time or memory. In particular, most complexity classes consist of decision problems that are solvable with a Turing machine, and are differentiated by their time or space (memory) requirements.
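For example, in standard textbook notation (not taken from this article), the class P is obtained by fixing the problem type (decision problems), the model (deterministic Turing machines), and a polynomial bound on time:

```latex
% P: decision problems decidable by a deterministic Turing machine in polynomial time
\mathsf{P} \;=\; \bigcup_{k \ge 1} \mathsf{DTIME}\!\left(n^{k}\right)
```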
Near-infrared spectroscopy
Near-infrared spectroscopy (NIRS) is a spectroscopic method that uses the near-infrared region of the electromagnetic spectrum (from 780 nm to 2500 nm). Typical applications include medical and physiological diagnostics and research including blood sugar, pulse oximetry, functional neuroimaging, sports medicine, elite sports training, ergonomics, rehabilitation, neonatal research, brain–computer interfaces, urology (bladder contraction), and neurology (neurovascular coupling).
Computational complexity
In computer science, the computational complexity or simply complexity of an algorithm is the amount of resources required to run it. Particular focus is given to computation time (generally measured by the number of needed elementary operations) and memory storage requirements. The complexity of a problem is the complexity of the best algorithms that solve the problem. The study of the complexity of explicitly given algorithms is called analysis of algorithms, while the study of the complexity of problems is called computational complexity theory.
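A small sketch of what "counting elementary operations" means in practice (the function names and the choice of comparisons as the counted operation are assumptions for illustration): linear search performs up to n comparisons, while binary search on sorted input performs on the order of log2(n).

```python
# Counting comparisons as the elementary operation for two search algorithms.

def linear_search(items, target):
    comparisons = 0
    for x in items:
        comparisons += 1
        if x == target:
            return True, comparisons
    return False, comparisons              # worst case: n comparisons

def binary_search(sorted_items, target):
    comparisons, lo, hi = 0, 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_items[mid] == target:
            return True, comparisons
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, comparisons              # worst case: about log2(n) comparisons

if __name__ == "__main__":
    data = list(range(1024))
    print(linear_search(data, -1))          # (False, 1024)
    print(binary_search(data, -1))          # (False, 10)
```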
Fourier-transform spectroscopy
Fourier-transform spectroscopy is a measurement technique whereby spectra are collected based on measurements of the coherence of a radiative source, using time-domain or space-domain measurements of the radiation, electromagnetic or not. It can be applied to a variety of types of spectroscopy including optical spectroscopy, infrared spectroscopy (FTIR, FT-NIRS), nuclear magnetic resonance (NMR) and magnetic resonance spectroscopic imaging (MRSI), mass spectrometry and electron spin resonance spectroscopy.
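A minimal numerical sketch of the core step (the simulated two-line source and the sampling parameters are assumptions, not a real instrument): a signal recorded in the time domain is Fourier-transformed to recover its spectrum.

```python
import numpy as np

# Simulate a time-domain record containing two spectral lines, then recover
# the spectrum with a discrete Fourier transform.
n, dt = 4096, 1.0 / 4096                 # samples and sampling interval (arbitrary units)
t = np.arange(n) * dt
signal = np.cos(2 * np.pi * 800 * t) + 0.5 * np.cos(2 * np.pi * 1200 * t)

spectrum = np.abs(np.fft.rfft(signal))   # magnitude spectrum
freqs = np.fft.rfftfreq(n, d=dt)

for peak in np.argsort(spectrum)[-2:]:
    print(f"line near {freqs[peak]:.0f} (arbitrary frequency units)")
```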
Computational complexity theory
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer; it is solvable by mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used.
Mössbauer spectroscopy
Mössbauer spectroscopy is a spectroscopic technique based on the Mössbauer effect. This effect, discovered by Rudolf Mössbauer (sometimes written "Moessbauer", German: "Mößbauer") in 1958, consists of the nearly recoil-free emission and absorption of nuclear gamma rays in solids. The consequent nuclear spectroscopy method is exquisitely sensitive to small changes in the chemical environment of certain nuclei.
X-ray spectroscopy
X-ray spectroscopy is a general term for several spectroscopic techniques for characterization of materials by using x-ray radiation. When an electron from the inner shell of an atom is excited by the energy of a photon, it moves to a higher energy level. When it returns to the lower energy level, the energy it previously gained by the excitation is emitted as a photon with a wavelength that is characteristic of the element (there may be several characteristic wavelengths per element).
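The dependence of these characteristic frequencies on the element is captured by Moseley's law (a standard empirical relation quoted here from general physics, not from this article):

```latex
% Moseley's law: the characteristic X-ray frequency \nu grows with atomic number Z
% k_1 and the screening constant k_2 depend on the spectral line considered
% (for the K_\alpha line, k_2 is approximately 1)
\sqrt{\nu} = k_1 \left( Z - k_2 \right)
```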
Parameterized complexity
In computer science, parameterized complexity is a branch of computational complexity theory that focuses on classifying computational problems according to their inherent difficulty with respect to multiple parameters of the input or output. The complexity of a problem is then measured as a function of those parameters. This allows the classification of NP-hard problems on a finer scale than in the classical setting, where the complexity of a problem is only measured as a function of the number of bits in the input.
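A standard illustration (the example graph and function names below are assumptions for this sketch) is the bounded-search-tree algorithm for Vertex Cover parameterized by the solution size k: the running time is roughly O(2^k · m) for m edges, exponential only in the parameter rather than in the input size.

```python
# Bounded search tree for Vertex Cover parameterized by solution size k.
# Each uncovered edge (u, v) forces u or v into the cover, so the recursion
# branches twice per level and has depth at most k.

def vertex_cover_at_most_k(edges, k):
    if not edges:
        return True                        # every edge is covered
    if k == 0:
        return False                       # edges remain but no budget left
    u, v = edges[0]
    without_u = [(a, b) for a, b in edges if a != u and b != u]
    without_v = [(a, b) for a, b in edges if a != v and b != v]
    return (vertex_cover_at_most_k(without_u, k - 1)
            or vertex_cover_at_most_k(without_v, k - 1))

if __name__ == "__main__":
    E = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]   # a 4-cycle
    print(vertex_cover_at_most_k(E, 2))    # True: {b, d} covers all edges
    print(vertex_cover_at_most_k(E, 1))    # False
```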
Sagnac effect
The Sagnac effect, also called Sagnac interference, named after French physicist Georges Sagnac, is a phenomenon encountered in interferometry that is elicited by rotation. The Sagnac effect manifests itself in a setup called a ring interferometer or Sagnac interferometer. A beam of light is split and the two beams are made to follow the same path but in opposite directions. On return to the point of entry the two light beams are allowed to exit the ring and undergo interference.
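For a loop enclosing area A and rotating at angular rate Ω, the standard first-order expressions for the resulting delay and phase difference between the counter-propagating beams are (quoted from general treatments of the effect, not from this article):

```latex
% Sagnac effect, to first order in the rotation rate \Omega
% A: enclosed area, c: speed of light, \lambda: wavelength of the light
\Delta t = \frac{4 A \,\Omega}{c^{2}},
\qquad
\Delta\phi = \frac{8 \pi A \,\Omega}{\lambda c}
```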
Space complexity
The space complexity of an algorithm or a computer program is the amount of memory space required to solve an instance of the computational problem as a function of characteristics of the input. It is the memory required by an algorithm until it executes completely. This includes the memory space used by its inputs, called input space, and any other (auxiliary) memory it uses during execution, which is called auxiliary space. Similar to time complexity, space complexity is often expressed asymptotically in big O notation, such as O(n), O(n log n), O(2^n), etc.
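A small sketch of the input-space versus auxiliary-space distinction (the function names are illustrative assumptions): both functions below receive O(n) input, but one allocates O(n) auxiliary space while the other uses O(1).

```python
# Two ways to reverse a sequence, differing only in auxiliary space.

def reversed_copy(items):
    # Builds a new list: O(n) auxiliary space on top of the O(n) input space.
    return [items[i] for i in range(len(items) - 1, -1, -1)]

def reverse_in_place(items):
    # Swaps elements pairwise: O(1) auxiliary space (just two indices).
    lo, hi = 0, len(items) - 1
    while lo < hi:
        items[lo], items[hi] = items[hi], items[lo]
        lo, hi = lo + 1, hi - 1
    return items

if __name__ == "__main__":
    print(reversed_copy([1, 2, 3, 4]))      # [4, 3, 2, 1]
    print(reverse_in_place([1, 2, 3, 4]))   # [4, 3, 2, 1]
```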
Frequency comb
In optics, a frequency comb is a laser source whose spectrum consists of a series of discrete, equally spaced frequency lines. Frequency combs can be generated by a number of mechanisms, including periodic modulation (in amplitude and/or phase) of a continuous-wave laser, four-wave mixing in nonlinear media, or stabilization of the pulse train generated by a mode-locked laser. Much work has been devoted to this last mechanism, which was developed around the turn of the 21st century and ultimately led to one half of the Nobel Prize in Physics being shared by John L. Hall and Theodor W. Hänsch in 2005.
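The comb lines of a mode-locked source are conventionally written in terms of the pulse repetition rate and the carrier-envelope offset frequency (a standard parameterization, not taken from this article):

```latex
% Optical frequency of the n-th comb line
% f_rep: pulse repetition rate, f_CEO: carrier-envelope offset frequency, n: integer mode number
f_n = f_{\mathrm{CEO}} + n\, f_{\mathrm{rep}}
```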
Common-path interferometer
A common-path interferometer is a class of interferometers in which the reference beam and sample beams travel along the same path. Examples include the Sagnac interferometer, Zernike phase-contrast interferometer, and the point diffraction interferometer. A common-path interferometer is generally more robust to environmental vibrations than a "double-path interferometer" such as the Michelson interferometer or the Mach–Zehnder interferometer.
Interferometry
Interferometry is a technique which uses the interference of superimposed waves to extract information. Interferometry typically uses electromagnetic waves and is an important investigative technique in the fields of astronomy, fiber optics, engineering metrology, optical metrology, oceanography, seismology, spectroscopy (and its applications to chemistry), quantum mechanics, nuclear and particle physics, plasma physics, biomolecular interactions, surface profiling, microfluidics, mechanical stress/strain measurement, velocimetry, optometry, and making holograms.
Equalization (audio)
Equalization, or simply EQ, in sound recording and reproduction is the process of adjusting the volume of different frequency bands within an audio signal. The circuit or equipment used to achieve this is called an equalizer. Most hi-fi equipment uses relatively simple filters to make bass and treble adjustments. Graphic and parametric equalizers have much more flexibility in tailoring the frequency content of an audio signal.
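A sketch of what a single parametric-EQ band does (the coefficients follow the widely used Audio EQ Cookbook peaking-filter formulas; the sample rate, centre frequency, Q, and gain below are arbitrary example values, not taken from this article):

```python
import math

# One band of a parametric equalizer: a second-order (biquad) peaking filter
# that boosts or cuts the signal around a centre frequency f0.
def peaking_eq_coefficients(fs, f0, q, gain_db):
    a_lin = 10 ** (gain_db / 40)                 # square root of the linear gain
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1 + alpha * a_lin, -2 * math.cos(w0), 1 - alpha * a_lin]
    a = [1 + alpha / a_lin, -2 * math.cos(w0), 1 - alpha / a_lin]
    return [c / a[0] for c in b], [c / a[0] for c in a]   # normalize so a[0] == 1

def biquad_filter(b, a, samples):
    # Difference equation: y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1, y2, y1 = x1, x, y1, y
        out.append(y)
    return out

if __name__ == "__main__":
    b, a = peaking_eq_coefficients(fs=48000, f0=1000, q=1.0, gain_db=6.0)
    tone = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(4800)]
    boosted = biquad_filter(b, a, tone)
    print(max(abs(v) for v in boosted))          # roughly 2x the input amplitude (+6 dB)
```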
Hammar experiment
The Hammar experiment was an experiment designed and conducted by Gustaf Wilhelm Hammar (1935) to test the aether drag hypothesis. Its negative result refuted some specific aether drag models and confirmed special relativity. Experiments such as the Michelson–Morley experiment of 1887 (and later other experiments such as the Trouton–Noble experiment in 1903 or the Trouton–Rankine experiment in 1908) presented evidence against the theory of a medium for light propagation known as the luminiferous aether, a theory that had been an established part of science for nearly one hundred years at the time.