Lactate dehydrogenase
Lactate dehydrogenase (LDH or LD) is an enzyme found in nearly all living cells. LDH catalyzes the conversion of pyruvate to lactate and back, as it converts NAD+ to NADH and back. A dehydrogenase is an enzyme that transfers a hydride from one molecule to another. LDH exists in four distinct enzyme classes. This article is specifically about the NAD(P)-dependent L-lactate dehydrogenase. Other LDHs act on D-lactate and/or are dependent on cytochrome c: D-lactate dehydrogenase (cytochrome) and L-lactate dehydrogenase (cytochrome).
Estimation theory
Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component. The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
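As a minimal sketch of this idea (with made-up data), the unknown parameter below is the true mean of a Gaussian distribution, and an estimator — here simply the sample mean — approximates it from noisy measurements:

```python
import random

def estimate_mean(measurements):
    # The sample mean is a simple estimator of the unknown parameter
    # (here, the true mean of the distribution generating the data).
    return sum(measurements) / len(measurements)

random.seed(0)
true_mean = 5.0  # the unknown parameter of the underlying setting
# Measured data = parameter + random component
data = [true_mean + random.gauss(0, 1) for _ in range(10000)]
estimate = estimate_mean(data)
```

With many measurements, the estimate lands close to the true parameter even though each individual measurement is noisy.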
Lactic acidosis
Lactic acidosis is a medical condition characterized by a build-up of lactate (especially L-lactate) in the body, resulting in an excessively low pH in the bloodstream. It is a form of metabolic acidosis, in which excessive acid accumulates due to a problem with the body's oxidative metabolism. Lactic acidosis is typically the result of an underlying acute or chronic medical condition, medication, or poisoning. The symptoms are generally attributable to these underlying causes, but may include nausea, vomiting, Kussmaul breathing (laboured and deep), and generalised weakness.
Standard error
The standard error (SE) of a statistic (usually an estimate of a parameter) is the standard deviation of its sampling distribution or an estimate of that standard deviation. If the statistic is the sample mean, it is called the standard error of the mean (SEM). The sampling distribution of a mean is generated by repeated sampling from the same population and recording of the sample means obtained. This forms a distribution of different means, and this distribution has its own mean and variance.
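The SEM is usually estimated from a single sample as s/√n, where s is the sample standard deviation and n the sample size. A short sketch with made-up data:

```python
import math

def standard_error_of_mean(sample):
    # SEM estimated from one sample: s / sqrt(n),
    # using Bessel's correction (n - 1) for the sample variance.
    n = len(sample)
    mean = sum(sample) / n
    variance = sum((x - mean) ** 2 for x in sample) / (n - 1)
    return math.sqrt(variance / n)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # illustrative values
sem = standard_error_of_mean(data)
```

Because the denominator grows with √n, quadrupling the sample size halves the standard error of the mean.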
Dorsolateral prefrontal cortex
The dorsolateral prefrontal cortex (DLPFC or DL-PFC) is an area in the prefrontal cortex of the primate brain. It is one of the most recently derived parts of the human brain. It undergoes a prolonged period of maturation which lasts into adulthood. The DLPFC is not an anatomical structure, but rather a functional one. It lies in the middle frontal gyrus of humans (i.e., lateral part of Brodmann's area (BA) 9 and 46). In macaque monkeys, it is around the principal sulcus (i.e., in Brodmann's area 46).
Prefrontal cortex
In mammalian brain anatomy, the prefrontal cortex (PFC) covers the front part of the frontal lobe of the cerebral cortex. The PFC contains the Brodmann areas BA8, BA9, BA10, BA11, BA12, BA13, BA14, BA24, BA25, BA32, BA44, BA45, BA46, and BA47. The basic activity of this brain region is considered to be orchestration of thoughts and actions in accordance with internal goals. Many authors have indicated an integral link between a person's will to live, personality, and the functions of the prefrontal cortex.
Efficiency (statistics)
In statistics, efficiency is a measure of quality of an estimator, of an experimental design, or of a hypothesis testing procedure. Essentially, a more efficient estimator needs fewer input data or observations than a less efficient one to achieve the Cramér–Rao bound. An efficient estimator is characterized by having the smallest possible variance, indicating that there is a small deviation between the estimated value and the "true" value in the L2 norm sense.
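A classic illustration (sketched below by simulation, with arbitrary sample sizes) is that for normally distributed data the sample mean is a more efficient estimator of the center than the sample median: its sampling variance is smaller by a factor approaching π/2 ≈ 1.57:

```python
import random
import statistics

random.seed(42)

def sampling_variance(estimator, n_samples=2000, n=101):
    # Variance of an estimator across repeated samples of size n
    # drawn from a standard normal distribution.
    estimates = [estimator([random.gauss(0, 1) for _ in range(n)])
                 for _ in range(n_samples)]
    return statistics.pvariance(estimates)

var_mean = sampling_variance(statistics.mean)
var_median = sampling_variance(statistics.median)
relative_efficiency = var_median / var_mean  # approaches pi/2 for normal data
```

The mean's smaller sampling variance means it needs fewer observations than the median to estimate the center equally well, for this distribution.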
Maximum likelihood estimation
In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
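A minimal worked example (with invented data): for Bernoulli observations, the MLE of the success probability p is the sample proportion. The grid search below maximizes the log-likelihood directly and recovers exactly that:

```python
import math

def log_likelihood(p, data):
    # Bernoulli log-likelihood for observed 0/1 data.
    return sum(math.log(p) if x == 1 else math.log(1 - p) for x in data)

data = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # 7 successes in 10 trials

# Maximize the likelihood over a fine grid of candidate parameters;
# the argmax lands at the sample proportion, 0.7.
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=lambda p: log_likelihood(p, data))
```

In practice the maximizer is usually found analytically or by numerical optimization rather than a grid, but the principle is the same: pick the parameter under which the observed data is most probable.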
Measurement
Measurement is the quantification of attributes of an object or event, which can be used to compare with other objects or events. In other words, measurement is a process of determining how large or small a physical quantity is as compared to a basic reference quantity of the same kind. The scope and application of measurement are dependent on the context and discipline. In natural sciences and engineering, measurements do not apply to nominal properties of objects or events, which is consistent with the guidelines of the International vocabulary of metrology published by the International Bureau of Weights and Measures.
Fisher information
In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information. The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth).
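The "variance of the score" definition can be checked numerically. As a sketch (with an arbitrarily chosen p), for a Bernoulli(p) observation the score is x/p − (1−x)/(1−p), and its variance equals the closed-form Fisher information 1/(p(1−p)):

```python
import random

random.seed(1)
p = 0.3  # arbitrary illustrative parameter value

def score(x, p):
    # Score of one Bernoulli observation: d/dp log f(x; p)
    return x / p - (1 - x) / (1 - p)

draws = [1 if random.random() < p else 0 for _ in range(200000)]
scores = [score(x, p) for x in draws]
mean_score = sum(scores) / len(scores)          # should be near 0
var_score = sum((s - mean_score) ** 2 for s in scores) / len(scores)

fisher_closed_form = 1 / (p * (1 - p))          # Fisher information of Bernoulli(p)
```

The score has expectation zero at the true parameter, so its variance — the Fisher information — measures how sharply the log-likelihood responds to changes in θ.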
Ventromedial prefrontal cortex
The ventromedial prefrontal cortex (vmPFC) is a part of the prefrontal cortex in the mammalian brain. The ventral medial prefrontal is located in the frontal lobe at the bottom of the cerebral hemispheres and is implicated in the processing of risk and fear, as it is critical in the regulation of amygdala activity in humans. It also plays a role in the inhibition of emotional responses, and in the process of decision-making and self-control. It is also involved in the cognitive evaluation of morality.
Lactic acid fermentation
Lactic acid fermentation is a metabolic process by which glucose or other six-carbon sugars (also, disaccharides of six-carbon sugars, e.g. sucrose or lactose) are converted into cellular energy and the metabolite lactate, which is lactic acid in solution. It is an anaerobic fermentation reaction that occurs in some bacteria and animal cells, such as muscle cells. If oxygen is present in the cell, many organisms will bypass fermentation and undergo cellular respiration; however, facultative anaerobic organisms will both ferment and undergo respiration in the presence of oxygen.
Double-precision floating-point format
Double-precision floating-point format (sometimes called FP64 or float64) is a floating-point number format, usually occupying 64 bits in computer memory; it represents a wide dynamic range of numeric values by using a floating radix point. Floating point is used to represent fractional values, or when a wider range is needed than is provided by fixed point (of the same bit width), even if at the cost of precision. Double precision may be chosen when the range or precision of single precision would be insufficient.
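The 64 bits of an IEEE 754 binary64 value split into 1 sign bit, 11 exponent bits, and 52 fraction bits. A short sketch inspecting the bit layout of 1.0 (whose biased exponent is 1023, i.e. 01111111111):

```python
import struct
import sys

# Pack a double into its 8 raw bytes, then view the 64 bits.
raw = struct.pack('>d', 1.0)
bits = ''.join(f'{b:08b}' for b in raw)
sign, exponent, fraction = bits[0], bits[1:12], bits[12:]

# Machine epsilon for binary64 is 2**-52, set by the 52-bit fraction.
eps = sys.float_info.epsilon
```

The 52-bit fraction gives roughly 15–17 significant decimal digits, which is why double precision is the default choice when single precision's roughly 7 digits would not suffice.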
Human brain
The human brain is the central organ of the human nervous system, and with the spinal cord makes up the central nervous system. The brain consists of the cerebrum, the brainstem and the cerebellum. It controls most of the activities of the body, processing, integrating, and coordinating the information it receives from the sense organs, and making decisions as to the instructions sent to the rest of the body. The brain is contained in, and protected by, the skull bones of the head.
Half-precision floating-point format
In computing, half precision (sometimes called FP16 or float16) is a binary floating-point computer number format that occupies 16 bits (two bytes in modern computers) in computer memory. It is intended for storage of floating-point values in applications where higher precision is not essential, in particular image processing and neural networks. Almost all modern uses follow the IEEE 754-2008 standard, where the 16-bit base-2 format is referred to as binary16, and the exponent uses 5 bits.
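binary16 has 1 sign bit, 5 exponent bits, and 10 fraction bits, which limits both its precision and its range (the largest finite value is 65504). A small sketch using Python's `struct` half-precision format character `'e'` to round-trip values through binary16:

```python
import struct

def to_half_and_back(x):
    # Round-trip a double through IEEE 754 binary16; values that do not
    # fit the 10-bit fraction come back rounded to the nearest half.
    return struct.unpack('>e', struct.pack('>e', x))[0]

exact = to_half_and_back(1.0)        # exactly representable
rounded = to_half_and_back(0.1)      # nearest binary16 value, not 0.1
largest = to_half_and_back(65504.0)  # largest finite binary16 value
```

The round-trip makes the precision loss visible: 0.1 survives only to about three decimal digits, which is acceptable for storage-oriented uses like the ones named above.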
Estimator
In statistics, an estimator is a rule for calculating an estimate of a given quantity based on observed data: thus the rule (the estimator), the quantity of interest (the estimand) and its result (the estimate) are distinguished. For example, the sample mean is a commonly used estimator of the population mean. There are point and interval estimators. The point estimators yield single-valued results. This is in contrast to an interval estimator, where the result would be a range of plausible values.
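The point/interval distinction can be sketched in a few lines (with made-up data and a normal-approximation interval, an assumption for illustration):

```python
import math

data = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.7, 5.3]  # illustrative sample

# Point estimator: the sample mean yields a single value.
n = len(data)
point_estimate = sum(data) / n

# Interval estimator: a range of plausible values, here an approximate
# 95% interval using the normal critical value 1.96 (an assumption;
# small samples would normally use a t critical value instead).
s = math.sqrt(sum((x - point_estimate) ** 2 for x in data) / (n - 1))
margin = 1.96 * s / math.sqrt(n)
interval_estimate = (point_estimate - margin, point_estimate + margin)
```

The same data feed both rules; only the form of the result differs — one number versus a range.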
Mixed acid fermentation
In biochemistry, mixed acid fermentation is the metabolic process by which a six-carbon sugar (e.g. glucose, C6H12O6) is converted into a complex and variable mixture of acids. It is an anaerobic (non-oxygen-requiring) fermentation reaction that is common in bacteria. It is characteristic of members of the Enterobacteriaceae, a large family of Gram-negative bacteria that includes E. coli. The mixture of end products produced by mixed acid fermentation includes lactate, acetate, succinate, formate, ethanol and the gases H2 and CO2.
Accuracy and precision
Accuracy and precision are two measures of observational error. Accuracy is how close a given set of measurements (observations or readings) are to their true value, while precision is how close the measurements are to each other. In other words, precision is a description of random errors, a measure of statistical variability. Accuracy has two definitions: more commonly, it is a description of only systematic errors, a measure of the statistical bias of a given measure of central tendency. Low accuracy in this sense causes a difference between a result and the true value; ISO calls this trueness.
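The distinction can be simulated with two hypothetical instruments (all values below are invented for illustration): one with a systematic offset but little scatter, and one centered on the true value but with large scatter:

```python
import random
import statistics

random.seed(7)
true_value = 10.0

# Instrument A: biased (systematic error) but precise (small random error).
a = [true_value + 0.5 + random.gauss(0, 0.05) for _ in range(1000)]
# Instrument B: unbiased (high trueness) but imprecise (large random error).
b = [true_value + random.gauss(0, 0.5) for _ in range(1000)]

bias_a = statistics.mean(a) - true_value   # ~0.5: low trueness
bias_b = statistics.mean(b) - true_value   # ~0: high trueness
spread_a = statistics.stdev(a)             # small: high precision
spread_b = statistics.stdev(b)             # large: low precision
```

Instrument A's readings cluster tightly in the wrong place; instrument B's scatter widely around the right place — precision and trueness fail independently.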
Orbitofrontal cortex
The orbitofrontal cortex (OFC) is a prefrontal cortex region in the frontal lobes of the brain which is involved in the cognitive process of decision-making. In non-human primates it consists of the association cortex areas Brodmann area 11, 12 and 13; in humans it consists of Brodmann area 10, 11 and 47. The OFC is functionally related to the ventromedial prefrontal cortex. Nevertheless, the region is distinguished by its distinct neural connections and the distinct functions it performs.
Successive over-relaxation
In numerical linear algebra, the method of successive over-relaxation (SOR) is a variant of the Gauss–Seidel method for solving a linear system of equations, resulting in faster convergence. A similar method can be used for any slowly converging iterative process. It was devised simultaneously by David M. Young Jr. and by Stanley P. Frankel in 1950 for the purpose of automatically solving linear systems on digital computers. Over-relaxation methods had been used before the work of Young and Frankel.
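A minimal sketch of the method (test matrix and relaxation factor chosen arbitrarily for illustration): each component update computes the usual Gauss–Seidel value, then steps past it by the factor ω, with 0 < ω < 2 required for convergence on symmetric positive-definite systems:

```python
def sor_solve(A, b, omega=1.25, iterations=100):
    # Successive over-relaxation: a Gauss-Seidel sweep whose update
    # is scaled by the relaxation factor omega (omega = 1 recovers
    # plain Gauss-Seidel; 1 < omega < 2 over-relaxes).
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        for i in range(n):
            sigma = sum(A[i][j] * x[j] for j in range(n) if j != i)
            gauss_seidel_value = (b[i] - sigma) / A[i][i]
            x[i] = x[i] + omega * (gauss_seidel_value - x[i])
    return x

# Diagonally dominant test system with known solution (3, 2, 1).
A = [[4.0, 1.0, 1.0],
     [1.0, 5.0, 1.0],
     [1.0, 1.0, 6.0]]
b = [15.0, 14.0, 11.0]
x = sor_solve(A, b)
```

Choosing ω well is the whole point of the method: for suitable systems an optimal ω reduces the number of sweeps needed compared with plain Gauss–Seidel.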