Iterative reconstruction: Iterative reconstruction refers to iterative algorithms used to reconstruct 2D and 3D images in certain imaging techniques. For example, in computed tomography an image must be reconstructed from projections of an object. Here, iterative reconstruction techniques are usually a better but computationally more expensive alternative to the common filtered back projection (FBP) method, which directly calculates the image in a single reconstruction step.
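One classic member of this family is the Kaczmarz method, the core of the algebraic reconstruction technique (ART). Below is a minimal sketch, assuming the imaging system has already been discretized into a projection matrix `A` and a measurement vector `b`; the names are illustrative, not from any particular toolkit.

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=200, relax=1.0):
    """Kaczmarz / ART iteration for A x = b: sweep over the projection
    rows, correcting the current image estimate by each row's residual."""
    x = np.zeros(A.shape[1])
    row_norms = (A ** 2).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] == 0.0:
                continue
            residual = b[i] - A[i] @ x
            x += relax * (residual / row_norms[i]) * A[i]
    return x

# Toy stand-in for a projection matrix and noiseless measurements.
rng = np.random.default_rng(0)
A = rng.normal(size=(40, 20))
x_true = rng.normal(size=20)
b = A @ x_true
print(np.linalg.norm(kaczmarz(A, b) - x_true))  # reconstruction error, near 0
```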
Tokamak: A tokamak (/ˈtoʊkəmæk/; Russian: токамак) is a device which uses a powerful magnetic field to confine plasma in the shape of a torus. The tokamak is one of several types of magnetic confinement devices being developed to produce controlled thermonuclear fusion power, and it has long been the leading candidate for a practical fusion reactor. Tokamaks were initially conceptualized in the 1950s by Soviet physicists Igor Tamm and Andrei Sakharov, inspired by a letter by Oleg Lavrentiev. The first working tokamak was attributed to the work of Natan Yavlinsky on the T-1 in 1958.
Magnetic confinement fusion: Magnetic confinement fusion is an approach to generate thermonuclear fusion power that uses magnetic fields to confine fusion fuel in the form of a plasma. Magnetic confinement is one of two major branches of fusion energy research, along with inertial confinement fusion. The magnetic approach began in the 1940s and absorbed the majority of subsequent development. Fusion reactions combine light atomic nuclei such as hydrogen to form heavier ones such as helium, producing energy.
Edge-localized mode: An edge-localized mode (ELM) is a plasma instability occurring in the edge region of a tokamak plasma due to periodic relaxations of the edge transport barrier in high-confinement mode. Each ELM burst is associated with expulsion of particles and energy from the confined plasma into the scrape-off layer. This phenomenon was first observed in the ASDEX tokamak in 1981. Diamagnetic effects in the model equations expand the size of the parameter space in which solutions with repeated ELMs can be recovered, compared to a resistive MHD model.
Tomographic reconstruction: Tomographic reconstruction is a type of multidimensional inverse problem where the challenge is to estimate a specific system from a finite number of its projections. The mathematical basis for tomographic imaging was laid down by Johann Radon. A notable example of its applications is the reconstruction of computed tomography (CT), where cross-sectional images of patients are obtained in a non-invasive manner.
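As a concrete illustration, the toy sketch below shows the forward and inverse steps on a small phantom. It is a sketch under stated assumptions: a discrete stand-in for the Radon transform built by rotating and summing the image, and plain unfiltered back-projection rather than a production FBP implementation.

```python
import numpy as np
from scipy.ndimage import rotate

def sinogram(image, angles_deg):
    """Forward projection: rotate the image to each angle and sum along
    one axis -- a discrete stand-in for the Radon transform."""
    return np.stack([rotate(image, a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def back_project(sino, angles_deg, size):
    """Unfiltered back-projection: smear each 1D projection across the
    plane at its angle and accumulate.  FBP would first apply a ramp
    filter to each projection to undo the resulting blur."""
    recon = np.zeros((size, size))
    for proj, a in zip(sino, angles_deg):
        recon += rotate(np.tile(proj, (size, 1)), -a, reshape=False, order=1)
    return recon / len(angles_deg)

phantom = np.zeros((64, 64))
phantom[24:40, 28:36] = 1.0                      # simple rectangular object
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
recon = back_project(sinogram(phantom, angles), angles, 64)
```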
Plasma-facing material: In nuclear fusion power research, the plasma-facing material (or materials) (PFM) is any material used to construct the plasma-facing components (PFC), those components exposed to the plasma within which nuclear fusion occurs, and particularly the material used for lining the first wall or divertor region of the reactor vessel. Plasma-facing materials for fusion reactor designs must support the overall steps of energy generation: generating heat through fusion, capturing heat in the first wall, and transferring heat away at a faster rate than it is captured.
Confidence interval: In frequentist statistics, a confidence interval (CI) is a range of estimates for an unknown parameter. A confidence interval is computed at a designated confidence level; the 95% confidence level is most common, but other levels, such as 90% or 99%, are sometimes used. The confidence level, degree of confidence or confidence coefficient represents the long-run proportion of CIs (at the given confidence level) that theoretically contain the true value of the parameter; this is tantamount to the nominal coverage probability.
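A minimal sketch of the most common case, assuming normally distributed data and the standard Student-t interval for the mean (the function name is illustrative):

```python
import numpy as np
from scipy.stats import t

def mean_ci(sample, level=0.95):
    """Two-sided confidence interval for the mean, using the
    Student-t quantile with n - 1 degrees of freedom."""
    n = len(sample)
    se = np.std(sample, ddof=1) / np.sqrt(n)
    half = t.ppf(0.5 + level / 2.0, df=n - 1) * se
    m = np.mean(sample)
    return m - half, m + half

rng = np.random.default_rng(1)
print(mean_ci(rng.normal(loc=10.0, scale=2.0, size=30)))
```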
Maximum likelihood estimation: In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
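The sketch below illustrates the idea numerically for a normal model, minimizing the negative log-likelihood with a generic optimizer and comparing against the known closed-form answer. It is a toy under those assumptions, not a recommended estimation pipeline.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
data = rng.normal(loc=3.0, scale=1.5, size=500)

def neg_log_likelihood(params):
    mu, log_sigma = params        # optimize log(sigma) so sigma stays > 0
    return -norm.logpdf(data, loc=mu, scale=np.exp(log_sigma)).sum()

fit = minimize(neg_log_likelihood, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
# For the normal model the MLE has a closed form: the sample mean and
# the (1/n, not 1/(n-1)) sample standard deviation.
print((mu_hat, sigma_hat), (data.mean(), data.std()))
```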
Pinch (plasma physics): A pinch (also called a Bennett pinch, after Willard Harrison Bennett, an electromagnetic pinch, magnetic pinch, pinch effect, or plasma pinch) is the compression of an electrically conducting filament by magnetic forces, or a device that does so. The conductor is usually a plasma, but could also be a solid or liquid metal. Pinches were the first type of device used for experiments in controlled nuclear fusion power. Pinches occur naturally in electrical discharges such as lightning bolts, planetary auroras, current sheets, and solar flares.
Interval estimation: In statistics, interval estimation is the use of sample data to estimate an interval of possible values of a parameter of interest. This is in contrast to point estimation, which gives a single value. The most prevalent forms of interval estimation are confidence intervals (a frequentist method) and credible intervals (a Bayesian method); less common forms include likelihood intervals and fiducial intervals.
Prediction interval: In statistical inference, specifically predictive inference, a prediction interval is an estimate of an interval in which a future observation will fall, with a certain probability, given what has already been observed. Prediction intervals are often used in regression analysis.
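For simple linear regression the textbook formula gives a prediction interval directly. A minimal sketch, assuming Gaussian errors (function and variable names are illustrative):

```python
import numpy as np
from scipy.stats import t

def prediction_interval(x, y, x_new, level=0.95):
    """Prediction interval for a future observation y at x_new under
    the simple linear regression model y = b0 + b1*x + noise."""
    n = len(x)
    b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    b0 = y.mean() - b1 * x.mean()
    resid = y - (b0 + b1 * x)
    s = np.sqrt((resid ** 2).sum() / (n - 2))     # residual standard error
    sxx = ((x - x.mean()) ** 2).sum()
    se = s * np.sqrt(1.0 + 1.0 / n + (x_new - x.mean()) ** 2 / sxx)
    half = t.ppf(0.5 + level / 2.0, df=n - 2) * se
    y_hat = b0 + b1 * x_new
    return y_hat - half, y_hat + half

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=50)
print(prediction_interval(x, y, x_new=5.0))
```

Note the leading 1 under the square root: unlike a confidence interval for the regression line, a prediction interval must also absorb the noise of the single future observation, so it is always wider.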
Confidence distribution: In statistical inference, the concept of a confidence distribution (CD) has often been loosely referred to as a distribution function on the parameter space that can represent confidence intervals of all levels for a parameter of interest. Historically, it has typically been constructed by inverting the upper limits of lower-sided confidence intervals of all levels, and it was also commonly associated with a fiducial interpretation (fiducial distribution), although it is a purely frequentist concept.
Euler method: In mathematics and computational science, the Euler method (also called the forward Euler method) is a first-order numerical procedure for solving ordinary differential equations (ODEs) with a given initial value. It is the most basic explicit method for numerical integration of ordinary differential equations and is the simplest Runge–Kutta method. The Euler method is named after Leonhard Euler, who first proposed it in his book Institutionum calculi integralis (published 1768–1770).
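A minimal sketch of the forward Euler step for an initial value problem y' = f(t, y), y(t0) = y0:

```python
import math

def euler(f, t0, y0, h, n_steps):
    """Forward Euler: advance y' = f(t, y) from y(t0) = y0 by
    repeatedly stepping y <- y + h * f(t, y)."""
    t, y = t0, y0
    for _ in range(n_steps):
        y += h * f(t, y)
        t += h
    return y

# y' = y with y(0) = 1 has the exact solution e^t; being first order,
# Euler's global error shrinks roughly linearly as h is reduced.
approx = euler(lambda t, y: y, 0.0, 1.0, h=0.001, n_steps=1000)
print(approx, math.e)
```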
Tolerance interval: A tolerance interval (TI) is a statistical interval within which, with some confidence level, a specified sampled proportion of a population falls. "More specifically, a 100×p%/100×(1−α) tolerance interval provides limits within which at least a certain proportion (p) of the population falls with a given level of confidence (1−α)." "A (p, 1−α) tolerance interval (TI) based on a sample is constructed so that it would include at least a proportion p of the sampled population with confidence 1−α; such a TI is usually referred to as a p-content, (1−α)-coverage TI."
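The factor k in mean ± k·s is not a simple quantile, because it must account for the sampling error of both the mean and the standard deviation. For a normal population a widely used approximation is Howe's formula; the sketch below implements that approximation, not an exact tabulated factor.

```python
import numpy as np
from scipy.stats import norm, chi2

def tolerance_factor(n, p=0.95, conf=0.95):
    """Howe's approximation to the two-sided normal tolerance factor k:
    mean +/- k * sd covers at least proportion p of the population
    with confidence conf."""
    df = n - 1
    z = norm.ppf(0.5 + p / 2.0)
    return z * np.sqrt(df * (1.0 + 1.0 / n) / chi2.ppf(1.0 - conf, df))

rng = np.random.default_rng(4)
sample = rng.normal(size=20)
k = tolerance_factor(len(sample))     # about 2.75 for n = 20, p = conf = 0.95
m, s = sample.mean(), sample.std(ddof=1)
print(k, (m - k * s, m + k * s))
```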
Machine learning: Machine learning (ML) is an umbrella term for solving problems for which developing algorithms by human programmers would be cost-prohibitive; instead, the problems are solved by helping machines 'discover' their 'own' algorithms, without needing to be explicitly told what to do by any human-developed algorithm. Recently, generative artificial neural networks have been able to surpass the results of many previous approaches.
Spherical tokamak: A spherical tokamak is a type of fusion power device based on the tokamak principle. It is notable for its very narrow profile, or aspect ratio. A traditional tokamak has a toroidal confinement area that gives it an overall shape similar to a donut, complete with a large hole in the middle. The spherical tokamak reduces the size of the hole as much as possible, resulting in a plasma shape that is almost spherical, often compared to a cored apple. The spherical tokamak is sometimes referred to as a spherical torus and often shortened to ST.
Coverage probability: In statistics, the coverage probability, or coverage for short, is the probability that a confidence interval or confidence region will include the true value (parameter) of interest. It can be defined as the proportion of instances where the interval surrounds the true value as assessed by long-run frequency. The fixed degree of certainty pre-specified by the analyst, referred to as the confidence level or confidence coefficient of the constructed interval, is effectively the nominal coverage probability of the procedure for constructing confidence intervals.
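Coverage is straightforward to check by simulation: generate many samples, build the interval each time, and count how often it contains the true parameter. A minimal sketch for the t-interval of a normal mean, where the empirical fraction should approach the nominal level:

```python
import numpy as np
from scipy.stats import t

def coverage(n=15, mu=0.0, sigma=1.0, level=0.95, trials=20000, seed=5):
    """Monte Carlo estimate of the coverage probability of the
    standard t-interval for a normal mean."""
    rng = np.random.default_rng(seed)
    tcrit = t.ppf(0.5 + level / 2.0, df=n - 1)
    hits = 0
    for _ in range(trials):
        x = rng.normal(mu, sigma, size=n)
        half = tcrit * x.std(ddof=1) / np.sqrt(n)
        hits += (x.mean() - half <= mu <= x.mean() + half)
    return hits / trials

print(coverage())   # close to the nominal 0.95
```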
Thermal radiation: Thermal radiation is electromagnetic radiation generated by the thermal motion of particles in matter. Thermal radiation is generated when heat from the movement of charges in the material (electrons and protons in common forms of matter) is converted to electromagnetic radiation. All matter with a temperature greater than absolute zero emits thermal radiation. At room temperature, most of the emission is in the infrared (IR) spectrum. Particle motion results in charge-acceleration or dipole oscillation which produces electromagnetic radiation.
Iterative method: In computational mathematics, an iterative method is a mathematical procedure that uses an initial value to generate a sequence of improving approximate solutions for a class of problems, in which the n-th approximation is derived from the previous ones. A specific implementation with termination criteria for a given iterative method like gradient descent, hill climbing, Newton's method, or quasi-Newton methods like BFGS, is an algorithm of the iterative method.
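Newton's method is a canonical example: a minimal sketch with an explicit termination criterion, turning the abstract method into a concrete algorithm.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: each approximation is derived from the previous
    one via x <- x - f(x) / f'(x); stop when successive iterates agree
    to within tol (the termination criterion)."""
    x = x0
    for _ in range(max_iter):
        x_next = x - f(x) / fprime(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# The square root of 2 as the positive root of f(x) = x^2 - 2.
print(newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0))
```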
Fiducial inference: Fiducial inference is one of a number of different types of statistical inference. These are rules, intended for general application, by which conclusions can be drawn from samples of data. In modern statistical practice, attempts to work with fiducial inference have fallen out of fashion in favour of frequentist inference, Bayesian inference and decision theory. However, fiducial inference is important in the history of statistics since its development led to the parallel development of concepts and tools in theoretical statistics that are widely used.