Real-time computing: Real-time computing (RTC) is the computer science term for hardware and software systems subject to a "real-time constraint", for example, operational deadlines from event to system response. Real-time programs must guarantee response within specified time constraints, often referred to as "deadlines". Real-time responses are often understood to be on the order of milliseconds, and sometimes microseconds. A system not specified as operating in real time cannot usually guarantee a response within any timeframe, although typical or expected response times may be given.
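The deadline idea can be sketched in Python (the 50 ms budget and the `respond_within_deadline` helper are illustrative; a general-purpose OS cannot actually guarantee such a bound):

```python
import time

DEADLINE_MS = 50  # assumed deadline, for illustration only

def respond_within_deadline(handler, deadline_ms=DEADLINE_MS):
    """Run a handler and report whether it met its deadline."""
    start = time.monotonic()
    result = handler()
    elapsed_ms = (time.monotonic() - start) * 1000
    return result, elapsed_ms <= deadline_ms

# A trivial handler easily meets a 50 ms deadline on any modern machine.
result, met = respond_within_deadline(lambda: 2 + 2)
```

A hard real-time system would treat a missed deadline as a failure; a soft real-time system would merely log degraded quality of service.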
LHCb experiment: The LHCb (Large Hadron Collider beauty) experiment is a particle physics detector experiment collecting data at the Large Hadron Collider at CERN. LHCb is a specialized b-physics experiment, designed primarily to measure the parameters of CP violation in the interactions of b-hadrons (heavy particles containing a bottom quark). Such studies can help to explain the matter-antimatter asymmetry of the Universe. The detector is also able to perform measurements of production cross sections, exotic hadron spectroscopy, charm physics and electroweak physics in the forward region.
ATLAS experiment: ATLAS is the largest general-purpose particle detector experiment at the Large Hadron Collider (LHC), a particle accelerator at CERN (the European Organization for Nuclear Research) in Switzerland. The experiment is designed to take advantage of the unprecedented energy available at the LHC and observe phenomena that involve highly massive particles which were not observable using earlier lower-energy accelerators. ATLAS was one of the two LHC experiments involved in the discovery of the Higgs boson in July 2012.
Real-time operating system: A real-time operating system (RTOS) is an operating system (OS) for real-time computing applications that processes data and events that have critically defined time constraints. An RTOS is distinct from a time-sharing operating system, such as Unix, which manages the sharing of system resources with a scheduler, data buffers, or fixed task prioritization in a multitasking or multiprogramming environment. Processing time requirements need to be fully understood and bound rather than just kept as a minimum.
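A minimal sketch, in Python, of the fixed-priority dispatch that many RTOS schedulers perform (the task names and the `schedule_next` helper are hypothetical; a real RTOS also handles preemption, blocking, and precise timing):

```python
import heapq

# Each ready task is (priority, name); a lower number means higher
# priority, mirroring fixed-priority scheduling in many RTOSes.
ready_queue = []

def make_ready(priority, name):
    heapq.heappush(ready_queue, (priority, name))

def schedule_next():
    """Dispatch the highest-priority ready task."""
    priority, name = heapq.heappop(ready_queue)
    return name

make_ready(3, "logger")
make_ready(1, "motor_control")  # hard deadline: highest priority
make_ready(2, "sensor_poll")
```

Regardless of arrival order, `motor_control` is dispatched first, then `sensor_poll`, then `logger` — the bounded, predictable ordering that distinguishes an RTOS from fair time-sharing.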
Pentaquark: A pentaquark is a human-made subatomic particle, consisting of four quarks and one antiquark bound together; they are not known to occur naturally, or exist outside of experiments specifically carried out to create them. As quarks have a baryon number of +1/3, and antiquarks of −1/3, the pentaquark would have a total baryon number of 1, and thus would be a baryon. Further, because it has five quarks instead of the usual three found in regular baryons ('triquarks'), it is classified as an exotic baryon.
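The baryon-number arithmetic works out as follows (a small Python check; the `baryon_number` helper is just for illustration):

```python
from fractions import Fraction

QUARK = Fraction(1, 3)       # baryon number of one quark
ANTIQUARK = Fraction(-1, 3)  # baryon number of one antiquark

def baryon_number(n_quarks, n_antiquarks):
    return n_quarks * QUARK + n_antiquarks * ANTIQUARK

pentaquark = baryon_number(4, 1)  # four quarks + one antiquark
triquark = baryon_number(3, 0)    # ordinary baryon
```

Both come out to exactly 1, which is why the pentaquark, despite its exotic five-quark content, still counts as a baryon.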
Exotic hadron: Exotic hadrons are subatomic particles composed of quarks and gluons, but which – unlike "well-known" hadrons such as protons, neutrons and mesons – consist of more than three valence quarks. By contrast, "ordinary" hadrons contain just two or three quarks. Hadrons with explicit valence gluon content would also be considered exotic. In theory, there is no limit on the number of quarks in a hadron, as long as the hadron's color charge is white, or color-neutral.
Large Hadron Collider: The Large Hadron Collider (LHC) is the world's largest and highest-energy particle collider. It was built by the European Organization for Nuclear Research (CERN) between 1998 and 2008 in collaboration with over 10,000 scientists and hundreds of universities and laboratories, as well as more than 100 countries. It lies in a tunnel 27 kilometres (17 mi) in circumference and as deep as 175 metres (574 ft) beneath the France–Switzerland border near Geneva. The first collisions were achieved in 2010 at an energy of 3.5 teraelectronvolts (TeV) per beam.
B meson: In particle physics, B mesons are mesons composed of a bottom antiquark and either an up (B+), down (B0), strange (B0s) or charm quark (B+c). The combination of a bottom antiquark and a top quark is not thought to be possible because of the top quark's short lifetime. The combination of a bottom antiquark and a bottom quark is not a B meson, but rather bottomonium, which is something else entirely. Each B meson has an antiparticle that is composed of a bottom quark and an up (B−), down (anti-B0), strange (anti-B0s) or charm (B−c) antiquark respectively.
DØ experiment: The DØ experiment (sometimes written D0 experiment, or DZero experiment) was a worldwide collaboration of scientists conducting research on the fundamental nature of matter. DØ was one of two major experiments (the other was the CDF experiment) located at the Tevatron Collider at Fermilab in Batavia, Illinois. The Tevatron was the world's highest-energy accelerator from 1983 until 2009, when its energy was surpassed by the Large Hadron Collider. The DØ experiment stopped taking data in 2011, when the Tevatron shut down, but data analysis is still ongoing.
Sampling (statistics): In statistics, quality assurance, and survey methodology, sampling is the selection of a subset or a statistical sample (termed sample for short) of individuals from within a statistical population to estimate characteristics of the whole population. Statisticians attempt to collect samples that are representative of the population. Sampling has lower costs and faster data collection compared to recording data from the entire population, and thus, it can provide insights in cases where it is infeasible to measure an entire population.
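A simple random sample can be drawn in a few lines of Python (the population and sample size are illustrative):

```python
import random

random.seed(42)  # fixed seed so the example is reproducible

population = list(range(10_000))           # hypothetical population
sample = random.sample(population, k=100)  # simple random sample, no replacement

# The sample mean estimates the population mean at 1% of the cost.
sample_mean = sum(sample) / len(sample)
population_mean = sum(population) / len(population)
```

Here every individual has the same probability of selection, which is what makes the sample mean an unbiased estimator of the population mean.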
Real-time computer graphics: Real-time computer graphics or real-time rendering is the sub-field of computer graphics focused on producing and analyzing images in real time. The term can refer to anything from rendering an application's graphical user interface (GUI) to real-time image analysis, but is most often used in reference to interactive 3D computer graphics, typically using a graphics processing unit (GPU). One example of this concept is a video game that rapidly renders changing 3D environments to produce an illusion of motion.
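The basic loop behind such rendering can be sketched as follows (a toy Python skeleton; `update` and `render` stand in for real game-state and GPU work, and the 60 fps budget is illustrative):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # about 16.7 ms per frame

def run_frames(n_frames, update, render):
    """Skeleton of a real-time render loop: update state, draw the
    frame, then sleep away whatever is left of the frame budget."""
    for frame in range(n_frames):
        start = time.monotonic()
        update(frame)
        render(frame)
        leftover = FRAME_BUDGET - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)

frames_drawn = []
run_frames(3, update=lambda f: None, render=frames_drawn.append)
```

If `update` plus `render` overruns the budget, the frame rate drops — which is why real-time rendering trades image quality for bounded per-frame cost.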
Classical physics: Classical physics is a group of physics theories that predate modern, more complete, or more widely applicable theories. If a currently accepted theory is considered to be modern, and its introduction represented a major paradigm shift, then the previous theories, or new theories based on the older paradigm, will often be referred to as belonging to the area of "classical physics". As such, the definition of a classical theory depends on context. Classical physical concepts are often used when modern theories are unnecessarily complex for a particular situation.
Stratified sampling: In statistics, stratified sampling is a method of sampling from a population which can be partitioned into subpopulations. In statistical surveys, when subpopulations within an overall population vary, it could be advantageous to sample each subpopulation (stratum) independently. Stratification is the process of dividing members of the population into homogeneous subgroups before sampling. The strata should define a partition of the population.
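The procedure can be sketched in Python (the strata labels, the 80/20 split, and the `stratified_sample` helper are illustrative):

```python
import random

random.seed(0)  # fixed seed so the example is reproducible

# A population of 1,000 members in two strata: 800 in "A", 200 in "B".
population = [("A", i) for i in range(800)] + [("B", i) for i in range(200)]

def stratified_sample(population, key, frac):
    """Partition the population into strata, then sample each stratum
    independently at the same fraction (proportionate allocation)."""
    strata = {}
    for item in population:
        strata.setdefault(key(item), []).append(item)
    sample = []
    for members in strata.values():
        sample.extend(random.sample(members, k=round(len(members) * frac)))
    return sample

sample = stratified_sample(population, key=lambda item: item[0], frac=0.1)
```

A 10% proportionate sample contains exactly 80 members from stratum A and 20 from stratum B, so each stratum is represented in the sample exactly as it is in the population.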
Physics engine: A physics engine is computer software that provides an approximate simulation of certain physical systems, such as rigid body dynamics (including collision detection), soft body dynamics, and fluid dynamics, of use in the domains of computer graphics, video games and film. Their main uses are in video games (typically as middleware), in which case the simulations are in real time. The term is sometimes used more generally to describe any software system for simulating physical phenomena, such as high-performance scientific simulation.
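The core update a physics engine performs each tick can be sketched as a semi-implicit Euler step (a toy one-dimensional example; real engines integrate full rigid-body state and resolve collisions):

```python
# A body dropped from 10 m, advanced one fixed timestep at a time —
# the kind of per-tick update a physics engine runs. Values illustrative.
GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60     # fixed timestep: one simulation tick

def step(y, vy):
    """Semi-implicit Euler: update velocity first, then position."""
    vy += GRAVITY * DT
    y += vy * DT
    return y, vy

y, vy = 10.0, 0.0
for _ in range(60):  # simulate one second of fall
    y, vy = step(y, vy)
```

After one simulated second the body is moving at about −9.81 m/s and has fallen roughly 5 m, close to the analytic ½gt². Updating velocity before position keeps the integrator stable at game frame rates, which is why this variant is popular in real-time engines.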
Physics: Physics is the natural science of matter, involving the study of matter, its fundamental constituents, its motion and behavior through space and time, and the related entities of energy and force. Physics is one of the most fundamental scientific disciplines, with its main goal being to understand how the universe behaves. A scientist who specializes in the field of physics is called a physicist. Physics is one of the oldest academic disciplines and, through its inclusion of astronomy, perhaps the oldest.
Particle detector: In experimental and applied particle physics, nuclear physics, and nuclear engineering, a particle detector, also known as a radiation detector, is a device used to detect, track, and/or identify ionizing particles, such as those produced by nuclear decay, cosmic radiation, or reactions in a particle accelerator. Detectors can measure the particle energy and other attributes such as momentum, spin, charge, and particle type, in addition to merely registering the presence of the particle.
Semiconductor detector: A semiconductor detector in ionizing radiation detection physics is a device that uses a semiconductor (usually silicon or germanium) to measure the effect of incident charged particles or photons. Semiconductor detectors find broad application for radiation protection, gamma and X-ray spectrometry, and as particle detectors. In semiconductor detectors, ionizing radiation is measured by the number of charge carriers it sets free in the detector material, which is arranged between two electrodes.
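The relationship between deposited energy and freed carriers can be illustrated with the commonly quoted figure of about 3.6 eV per electron-hole pair in silicon (a toy Python calculation; the 60 keV gamma is an arbitrary example):

```python
# One electron-hole pair is created per ~3.6 eV of energy deposited in
# silicon (the standard figure; germanium is lower, ~2.9 eV).
PAIR_ENERGY_SI_EV = 3.6

def carriers_freed(deposited_energy_ev):
    """Approximate number of electron-hole pairs set free."""
    return deposited_energy_ev / PAIR_ENERGY_SI_EV

# Example: a 60 keV gamma ray fully absorbed in silicon.
n_pairs = carriers_freed(60_000)
```

Roughly 16,700 pairs drift to the electrodes under the applied field; because the count scales linearly with deposited energy, the collected charge directly measures the photon energy, which is what makes these detectors useful for spectrometry.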
Detector (radio): In radio, a detector is a device or circuit that extracts information from a modulated radio frequency current or voltage. The term dates from the first three decades of radio (1888–1918). Unlike modern radio stations which transmit sound (an audio signal) on an uninterrupted carrier wave, early radio stations transmitted information by radiotelegraphy. The transmitter was switched on and off to produce long or short periods of radio waves, spelling out text messages in Morse code.
Crystal detector: A crystal detector is an obsolete electronic component used in some early 20th century radio receivers that consists of a piece of crystalline mineral which rectifies the alternating current radio signal. It was employed as a detector (demodulator) to extract the audio modulation signal from the modulated carrier, to produce the sound in the earphones. It was the first type of semiconductor diode, and one of the first semiconductor electronic devices.
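The detector's two-stage action — rectification followed by smoothing — can be simulated in a few lines of Python (the signal parameters and the one-pole filter are illustrative stand-ins for the crystal and the earphone/RC network):

```python
import math

# An AM signal: a 100 kHz carrier whose amplitude follows a 1 kHz tone.
FS = 1_000_000        # sample rate, Hz
CARRIER_HZ = 100_000
AUDIO_HZ = 1_000

def am_signal(n):
    t = n / FS
    audio = 0.5 * math.sin(2 * math.pi * AUDIO_HZ * t)
    return (1.0 + audio) * math.sin(2 * math.pi * CARRIER_HZ * t)

def detect(samples, alpha=0.01):
    """Half-wave rectify, then smooth with a one-pole low-pass filter."""
    out, y = [], 0.0
    for s in samples:
        rectified = max(s, 0.0)       # the crystal conducts only one way
        y += alpha * (rectified - y)  # simple RC-style smoothing
        out.append(y)
    return out

envelope = detect([am_signal(n) for n in range(5000)])
```

The rectified-and-smoothed output follows the slow 1 kHz amplitude envelope while suppressing the 100 kHz carrier — the audio the earphones reproduce.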
Sampling bias: In statistics, sampling bias is a bias in which a sample is collected in such a way that some members of the intended population have a lower or higher sampling probability than others. It results in a biased sample of a population (or non-human factors) in which all individuals, or instances, were not equally likely to have been selected. If this is not accounted for, results can be erroneously attributed to the phenomenon under study rather than to the method of sampling.
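The effect can be demonstrated numerically: giving some members a higher selection probability skews the sample away from the true population composition (the group sizes and 2:1 weighting are illustrative):

```python
import random

random.seed(1)  # fixed seed so the example is reproducible

# True population: 50% group "x", 50% group "y". The biased scheme is
# twice as likely to pick an "x" member each draw.
population = ["x"] * 500 + ["y"] * 500
weights = [2 if g == "x" else 1 for g in population]

biased = random.choices(population, weights=weights, k=10_000)
unbiased = random.choices(population, k=10_000)

biased_share_x = biased.count("x") / len(biased)        # near 2/3, not 1/2
unbiased_share_x = unbiased.count("x") / len(unbiased)  # near 1/2
```

The biased sample suggests "x" makes up about two-thirds of the population when the true share is one-half — a conclusion about the population that is really an artifact of the sampling method.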