Histone acetylation and deacetylation
Histone acetylation and deacetylation are the processes by which the lysine residues within the N-terminal tail protruding from the histone core of the nucleosome are acetylated and deacetylated. These processes are essential parts of gene regulation and are typically catalysed by enzymes with "histone acetyltransferase" (HAT) or "histone deacetylase" (HDAC) activity. Acetylation is the process by which an acetyl functional group is transferred from one molecule (in this case, acetyl coenzyme A) to another.
Histone deacetylase
Histone deacetylases (HDACs) are a class of enzymes that remove acetyl groups (O=C-CH3) from an ε-N-acetyl lysine amino acid on both histone and non-histone proteins. HDACs allow histones to wrap the DNA more tightly. This is important because DNA is wrapped around histones, and DNA expression is regulated by acetylation and deacetylation. The action of HDACs is opposite to that of histone acetyltransferases. HDAC proteins are now also called lysine deacetylases (KDACs), to describe their function rather than their target, which also includes non-histone proteins.
Histone deacetylase inhibitor
Histone deacetylase inhibitors (HDAC inhibitors, HDACi, HDIs) are chemical compounds that inhibit histone deacetylases. HDIs have a long history of use in psychiatry and neurology as mood stabilizers and anti-epileptics. More recently, they have been investigated as possible treatments for cancers and for parasitic and inflammatory diseases. To carry out gene expression, a cell must control the coiling and uncoiling of DNA around histones.
Data
In common usage and statistics, data (US: /ˈdætə/; UK: /ˈdeɪtə/) is a collection of discrete or continuous values that convey information, describing quantity, quality, facts, statistics, or other basic units of meaning, or simply sequences of symbols that may be further interpreted formally. A datum is an individual value in a collection of data. Data is usually organized into structures such as tables that provide additional context and meaning, and which may themselves be used as data in larger structures.
Data management
Data management comprises all disciplines related to handling data as a valuable resource. The concept of data management arose in the 1980s as technology moved from sequential processing (first punched cards, then magnetic tape) to random-access storage. Since it was now possible to store a discrete fact and quickly access it using random-access disk technology, those suggesting that data management was more important than business process management used arguments such as "a customer's home address is stored in 75 (or some other large number) places in our computer systems".
Correlation
In statistics, correlation or dependence is any statistical relationship, whether causal or not, between two random variables or bivariate data. Although in the broadest sense "correlation" may indicate any type of association, in statistics it usually refers to the degree to which a pair of variables are linearly related. Familiar examples of dependent phenomena include the correlation between the height of parents and their offspring, and the correlation between the price of a good and the quantity consumers are willing to purchase, as depicted in the so-called demand curve.
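As a minimal sketch of what "the degree to which a pair of variables are linearly related" means in practice, the following snippet (assuming NumPy is available) computes the correlation for made-up parent and offspring heights; the values are purely illustrative and not from the source.

```python
import numpy as np

# Hypothetical illustration: heights (cm) of parents and their offspring.
# The numbers are invented solely to show the computation.
parent_height = np.array([160, 165, 170, 175, 180, 185])
offspring_height = np.array([162, 166, 169, 177, 179, 186])

# np.corrcoef returns the 2x2 correlation matrix; the off-diagonal entry
# is the correlation between the two variables (here close to +1, since
# the made-up heights rise together almost linearly).
r = np.corrcoef(parent_height, offspring_height)[0, 1]
print(f"correlation between parent and offspring height: {r:.3f}")
```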
Big data
Big data primarily refers to data sets that are too large or complex to be dealt with by traditional data-processing application software. Data with many entries (rows) offer greater statistical power, while data with higher complexity (more attributes or columns) may lead to a higher false discovery rate. Though the term is sometimes used loosely, partly because it lacks a formal definition, the interpretation that seems to describe big data best is that of a large body of information that could not be comprehended when used only in smaller amounts.
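One way to make the rows/columns claim concrete is the standard multiple-comparisons arithmetic; this worked equation is an illustration under the usual assumptions, not a statement from the source. If each of m attributes is tested at significance level α and none has any real effect, the expected number of false positives is

\[
\mathbb{E}[\text{false positives}] = m\,\alpha ,
\]

so a table with m = 1000 columns tested at α = 0.05 is expected to yield about 50 spurious "discoveries", while adding more rows mainly reduces the uncertainty of each individual estimate.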
Data science
Data science is an interdisciplinary academic field that uses statistics, scientific computing, scientific methods, processes, algorithms and systems to extract or extrapolate knowledge and insights from noisy, structured, and unstructured data. Data science also integrates domain knowledge from the underlying application domain (e.g., natural sciences, information technology, and medicine). Data science is multifaceted and can be described as a science, a research paradigm, a research method, a discipline, a workflow, and a profession.
Ossification
Ossification (also called osteogenesis or bone mineralization) in bone remodeling is the process of laying down new bone material by cells named osteoblasts. It is synonymous with bone tissue formation. There are two processes resulting in the formation of normal, healthy bone tissue: intramembranous ossification is the direct laying down of bone into the primitive connective tissue (mesenchyme), while endochondral ossification involves cartilage as a precursor.
Pearson correlation coefficient
In statistics, the Pearson correlation coefficient (PCC) is a correlation coefficient that measures linear correlation between two sets of data. It is the ratio between the covariance of two variables and the product of their standard deviations; thus, it is essentially a normalized measurement of the covariance, such that the result always has a value between −1 and 1. As with covariance itself, the measure can only reflect a linear correlation of variables, and ignores many other types of relationships or correlations.
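In standard notation (a restatement of the definition above, not text from the source), the population coefficient and its sample estimate for paired observations (x_i, y_i) are

\[
\rho_{X,Y} = \frac{\operatorname{cov}(X,Y)}{\sigma_X\,\sigma_Y},
\qquad
r = \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}
{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i-\bar{y})^2}} ,
\]

which makes the normalization explicit: dividing the covariance by both standard deviations confines the result to the interval [−1, 1].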
Bone
A bone is a rigid organ that constitutes part of the skeleton in most vertebrate animals. Bones protect the various other organs of the body, produce red and white blood cells, store minerals, provide structure and support for the body, and enable mobility. Bones come in a variety of shapes and sizes and have complex internal and external structures. They are lightweight yet strong and hard and serve multiple functions. Bone tissue (osseous tissue), which is also called bone in the uncountable sense of that word, is hard tissue, a type of specialised connective tissue.
Bone healing
Bone healing, or fracture healing, is a proliferative physiological process in which the body facilitates the repair of a bone fracture. Generally, bone fracture treatment consists of a doctor reducing (pushing) displaced bones back into place via relocation with or without anaesthetic, stabilizing their position to aid union, and then waiting for the bone's natural healing process to occur. Adequate nutrient intake has been found to significantly affect the integrity of the fracture repair.
Chromatin remodeling
Chromatin remodeling is the dynamic modification of chromatin architecture to allow access of condensed genomic DNA to the regulatory transcription machinery proteins, and thereby control gene expression. Such remodeling is principally carried out by 1) covalent histone modifications by specific enzymes, e.g., histone acetyltransferases (HATs), deacetylases, methyltransferases, and kinases, and 2) ATP-dependent chromatin remodeling complexes which either move, eject or restructure nucleosomes.
Data dredging
Data dredging (also known as data snooping or p-hacking) is the misuse of data analysis to find patterns in data that can be presented as statistically significant, thus dramatically increasing the risk of false positives while understating it. This is done by performing many statistical tests on the data and only reporting those that come back with significant results.
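A minimal simulation of this practice, assuming NumPy and SciPy are available; the sample sizes and the 0.05 threshold are arbitrary choices for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Purely random data: one outcome and 100 candidate predictors, no real effects.
n_samples, n_predictors = 50, 100
outcome = rng.normal(size=n_samples)
predictors = rng.normal(size=(n_samples, n_predictors))

# Test every predictor against the outcome and keep only the "significant" ones.
significant = []
for j in range(n_predictors):
    r, p = stats.pearsonr(predictors[:, j], outcome)
    if p < 0.05:
        significant.append((j, r, p))

# With 100 tests at the 0.05 level, roughly 5 spurious "findings" are expected,
# even though every variable here is pure noise; reporting only these is dredging.
print(f"'significant' predictors found: {len(significant)}")
```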
Coefficient of multiple correlation
In statistics, the coefficient of multiple correlation is a measure of how well a given variable can be predicted using a linear function of a set of other variables. It is the correlation between the variable's values and the best predictions that can be computed linearly from the predictive variables. The coefficient of multiple correlation takes values between 0 and 1.
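A small sketch of the definition, assuming NumPy is available: fit the best linear predictor by least squares, then correlate the observed values with the fitted ones. The toy data and coefficients are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: predict y from two explanatory variables (values are illustrative).
n = 200
X = rng.normal(size=(n, 2))
y = 1.5 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n)

# Best linear prediction of y (with an intercept) via least squares.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# The coefficient of multiple correlation is the correlation between y and
# its best linear prediction; by construction it lies between 0 and 1.
R = np.corrcoef(y, y_hat)[0, 1]
print(f"coefficient of multiple correlation: {R:.3f}")
```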
Protein folding
Protein folding is the physical process by which a protein chain acquires its native three-dimensional structure, typically a "folded" conformation, through which the protein becomes biologically functional. Via an expeditious and reproducible process, a polypeptide folds into its characteristic three-dimensional structure from a random coil. Each protein exists first as an unfolded polypeptide or random coil after being translated from a sequence of mRNA into a linear chain of amino acids.
Bone mineral
Bone mineral (also called inorganic bone phase, bone salt, or bone apatite) is the inorganic component of bone tissue. It gives bones their compressive strength. Bone mineral is formed predominantly from carbonated hydroxyapatite with lower crystallinity. It is formed from globular and plate structures distributed among the collagen fibrils of bone, together forming a yet larger structure. The bone salt and collagen fibers together constitute the extracellular matrix of bone tissue.
Data warehouse
In computing, a data warehouse (DW or DWH), also known as an enterprise data warehouse (EDW), is a system used for reporting and data analysis and is considered a core component of business intelligence. Data warehouses are central repositories of integrated data from one or more disparate sources. They store current and historical data in a single place and are used for creating analytical reports for workers throughout the enterprise. This is beneficial for companies, as it enables them to interrogate their data, draw insights from it, and make decisions.
Exploratory data analysis
In statistics, exploratory data analysis (EDA) is an approach of analyzing data sets to summarize their main characteristics, often using statistical graphics and other data visualization methods. A statistical model can be used or not, but primarily EDA is for seeing what the data can tell us beyond the formal modeling, and it thereby contrasts with traditional hypothesis testing. Exploratory data analysis has been promoted by John Tukey since 1970 to encourage statisticians to explore the data, and possibly formulate hypotheses that could lead to new data collection and experiments.
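A minimal sketch of what this can look like in practice, assuming pandas and Matplotlib are available; the tiny data set below is made up purely to illustrate the workflow, and real EDA would go well beyond it.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical measurements, invented solely to demonstrate the steps.
df = pd.DataFrame({
    "height_cm": [162, 170, 168, 181, 175, 159, 173, 166],
    "weight_kg": [55, 68, 64, 82, 74, 51, 70, 60],
    "age_years": [23, 35, 29, 41, 38, 21, 33, 27],
})

# Numerical summaries of the main characteristics.
print(df.describe())
print(df.corr())

# Simple graphics (histograms, a scatter plot) to see what the data suggest
# before committing to any formal model or hypothesis test.
df.hist(figsize=(8, 6))
df.plot.scatter(x="height_cm", y="weight_kg")
plt.show()
```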
Testing hypotheses suggested by the data
In statistics, hypotheses suggested by a given dataset, when tested with the same dataset that suggested them, are likely to be accepted even when they are not true. This is because circular reasoning (double dipping) would be involved: something seems true in the limited data set; therefore we hypothesize that it is true in general; therefore we wrongly test it on the same, limited data set, which seems to confirm that it is true.
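A small simulation of this double dipping, assuming NumPy and SciPy are available; all data are pure noise, so any "effect" found is spurious by construction.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Pure noise: 200 candidate variables, none truly related to the outcome.
n, m = 40, 200
outcome = rng.normal(size=n)
candidates = rng.normal(size=(n, m))

# Step 1: let the data suggest a hypothesis - pick the most correlated variable.
correlations = [stats.pearsonr(candidates[:, j], outcome)[0] for j in range(m)]
best = int(np.argmax(np.abs(correlations)))

# Step 2 (double dipping): "test" that hypothesis on the same data.
r_same, p_same = stats.pearsonr(candidates[:, best], outcome)

# Step 3 (the honest check): test the same hypothesis on fresh data drawn the same way.
new_outcome = rng.normal(size=n)
new_candidates = rng.normal(size=(n, m))
r_new, p_new = stats.pearsonr(new_candidates[:, best], new_outcome)

print(f"same data:  r={r_same:.2f}, p={p_same:.4f}  (looks convincing)")
print(f"fresh data: r={r_new:.2f}, p={p_new:.4f}  (the effect vanishes)")
```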