Nanoparticle: A nanoparticle or ultrafine particle is usually defined as a particle of matter that is between 1 and 100 nanometres (nm) in diameter. The term is sometimes used for larger particles, up to 500 nm, or fibers and tubes that are less than 100 nm in only two directions. At the lowest range, metal particles smaller than 1 nm are usually called atom clusters instead.
Silver nanoparticle: Silver nanoparticles are nanoparticles of silver of between 1 nm and 100 nm in size. While frequently described as being 'silver', some are composed of a large percentage of silver oxide due to their large ratio of surface to bulk silver atoms. Numerous shapes of nanoparticles can be constructed depending on the application at hand. Commonly used silver nanoparticles are spherical, but diamond-shaped, octagonal, and thin-sheet forms are also common. Their extremely large surface area permits the coordination of a vast number of ligands.
Colloidal gold: Colloidal gold is a sol or colloidal suspension of nanoparticles of gold in a fluid, usually water. The colloid is usually coloured either wine red (for spherical particles less than 100 nm) or blue-purple (for larger spherical particles or nanorods). Due to their optical, electronic, and molecular-recognition properties, gold nanoparticles are the subject of substantial research, with many potential or promised applications in a wide variety of areas, including electron microscopy, electronics, nanotechnology, materials science, and biomedicine.
Platinum nanoparticle: Platinum nanoparticles are usually in the form of a suspension or colloid of nanoparticles of platinum in a fluid, usually water. A colloid is technically defined as a stable dispersion of particles in a fluid medium (liquid or gas). Spherical platinum nanoparticles can be made with sizes between about 2 and 100 nanometres (nm), depending on reaction conditions. Platinum nanoparticles suspended in colloidal solution give it a brownish-red or black color. Nanoparticles come in a wide variety of shapes, including spheres, rods, cubes, and tetrahedra.
NP-completeness: In computational complexity theory, a problem is NP-complete when: (1) it is a decision problem, meaning that for any input to the problem, the output is either "yes" or "no"; (2) when the answer is "yes", this can be demonstrated through the existence of a short (polynomial-length) solution; and (3) the correctness of each solution can be verified quickly (namely, in polynomial time), and a brute-force search algorithm can find a solution by trying all possible solutions.
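The contrast between fast verification and slow exhaustive search can be made concrete with subset sum, a classic NP-complete decision problem. This is an illustrative sketch, not an efficient solver:

```python
from itertools import combinations

def verify(subset, target):
    # Polynomial-time verification: summing the certificate is linear
    # in its length.
    return sum(subset) == target

def brute_force_search(nums, target):
    # Exhaustive search: tries all 2^n subsets, hence exponential time.
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if verify(subset, target):
                return list(subset)
    return None

print(brute_force_search([3, 34, 4, 12, 5, 2], 9))  # -> [4, 5]
```

Checking a proposed answer is cheap; finding one without a hint is not, which is exactly the asymmetry the definition captures.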
Nanoparticle–biomolecule conjugate: A nanoparticle–biomolecule conjugate is a nanoparticle with biomolecules attached to its surface. Nanoparticles are minuscule particles, typically measured in nanometers (nm), that are used in nanobiotechnology to explore the functions of biomolecules. Because of their large surface-area-to-volume ratios, the properties of these ultrafine particles are characterized by their surface components to a greater extent than those of larger structures, such as cells. Large surface-area-to-volume ratios also optimize the potential for interactions with biomolecules.
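The scaling behind that point is easy to show: for a sphere, the surface-area-to-volume ratio is 3/r, so it grows rapidly as particle size shrinks. A small sketch with illustrative radii:

```python
import math

def surface_to_volume(radius_nm):
    # Sphere: area / volume = (4*pi*r^2) / ((4/3)*pi*r^3) = 3 / r.
    area = 4 * math.pi * radius_nm ** 2
    volume = (4 / 3) * math.pi * radius_nm ** 3
    return area / volume

for r in (1, 10, 100):
    print(f"r = {r:>3} nm  ->  A/V = {surface_to_volume(r):.3f} per nm")
# The ratio falls as 3/r (3.0, 0.3, 0.03): a 1 nm particle exposes
# proportionally 100x more surface than a 100 nm one.
```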
P versus NP problem: The P versus NP problem is a major unsolved problem in theoretical computer science. In informal terms, it asks whether every problem whose solution can be quickly verified can also be quickly solved. The informal term quickly, used above, means the existence of an algorithm solving the task that runs in polynomial time, such that the time to complete the task varies as a polynomial function of the size of the input to the algorithm (as opposed to, say, exponential time).
Well-founded relation: In mathematics, a binary relation R is called well-founded (or wellfounded or foundational) on a class X if every non-empty subset S ⊆ X has a minimal element with respect to R; that is, an element m ∈ S such that s R m does not hold (for instance, "s is not smaller than m") for any s ∈ S. In other words, a relation is well founded if (∀S ⊆ X)(S ≠ ∅ → (∃m ∈ S)(∀s ∈ S) ¬(s R m)). Some authors include an extra condition that R is set-like, i.e., that the elements less than any given element form a set.
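On a finite domain the definition can be checked directly by enumerating all non-empty subsets. A brute-force sketch, with the relation represented as a set of ordered pairs (a, b) meaning "a R b":

```python
from itertools import chain, combinations

def is_well_founded(X, R):
    # Every non-empty subset S must contain an element m with no s in S
    # satisfying s R m (an R-minimal element of S).
    elems = list(X)
    subsets = chain.from_iterable(
        combinations(elems, r) for r in range(1, len(elems) + 1))
    return all(any(all((s, m) not in R for s in S) for m in S)
               for S in subsets)

X = {0, 1, 2}
less_than = {(a, b) for a in X for b in X if a < b}
print(is_well_founded(X, less_than))   # -> True: every subset has a minimum
cycle = {(0, 1), (1, 2), (2, 0)}
print(is_well_founded(X, cycle))       # -> False: the 3-cycle has no minimal element
```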
Non-well-founded set theory: Non-well-founded set theories are variants of axiomatic set theory that allow sets to be elements of themselves and otherwise violate the rule of well-foundedness. In non-well-founded set theories, the foundation axiom of ZFC is replaced by axioms implying its negation. The study of non-well-founded sets was initiated by Dmitry Mirimanoff in a series of papers between 1917 and 1920, in which he formulated the distinction between well-founded and non-well-founded sets; he did not regard well-foundedness as an axiom.
Well-order: In mathematics, a well-order (or well-ordering or well-order relation) on a set S is a total order on S with the property that every non-empty subset of S has a least element in this ordering. The set S together with the well-order relation is then called a well-ordered set. In some academic articles and textbooks these terms are instead written as wellorder, wellordered, and wellordering or well order, well ordered, and well ordering. Every non-empty well-ordered set has a least element.
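For a finite set the two conditions — totality, and a least element in every non-empty subset — can be checked exhaustively. A sketch in which `le` is an assumed comparison callable (antisymmetry and transitivity are taken for granted):

```python
from itertools import chain, combinations

def is_well_order(S, le):
    # le(a, b) encodes "a <= b".  A well-order is a total order in which
    # every non-empty subset has a least element.
    elems = list(S)
    if not all(le(a, b) or le(b, a) for a in elems for b in elems):
        return False  # not a total order
    subsets = chain.from_iterable(
        combinations(elems, r) for r in range(1, len(elems) + 1))
    return all(any(all(le(m, s) for s in sub) for m in sub)
               for sub in subsets)

print(is_well_order({1, 3, 4, 5}, lambda a, b: a <= b))     # -> True
print(is_well_order({2, 3}, lambda a, b: b % a == 0))       # -> False: 2, 3 incomparable
```

Note that any total order on a finite set is a well-order; the distinction only becomes interesting for infinite sets (the natural numbers are well-ordered by ≤, the integers are not).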
Surface area: The surface area (symbol A) of a solid object is a measure of the total area that the surface of the object occupies. The mathematical definition of surface area in the presence of curved surfaces is considerably more involved than the definition of arc length of one-dimensional curves, or of the surface area for polyhedra (i.e., objects with flat polygonal faces), for which the surface area is the sum of the areas of its faces. Smooth surfaces, such as a sphere, are assigned surface area using their representation as parametric surfaces.
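Both cases can be illustrated numerically: a polyhedron's area is a finite sum of face areas, while a parametric surface is integrated. A rough sketch using the midpoint rule for the sphere (`n` controls accuracy; the 4πr² check is standard):

```python
import math

def cube_area(edge):
    # Polyhedron: surface area is the sum of the six square faces.
    return 6 * edge ** 2

def sphere_area(r, n=1000):
    # Parametric sphere in (theta, phi): the area element |r_theta x r_phi|
    # equals r^2 * sin(theta).  The phi integral contributes 2*pi, and the
    # theta integral is approximated with the midpoint rule.
    dtheta = math.pi / n
    total = sum(r * r * math.sin((i + 0.5) * dtheta) * dtheta
                for i in range(n))
    return 2 * math.pi * total

print(cube_area(3))       # -> 54
print(sphere_area(2.0))   # approaches 4*pi*r^2, about 50.265
```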
NP-hardness: In computational complexity theory, NP-hardness (non-deterministic polynomial-time hardness) is the defining property of a class of problems that are informally "at least as hard as the hardest problems in NP". A simple example of an NP-hard problem is the subset sum problem. A more precise specification is: a problem H is NP-hard when every problem L in NP can be reduced in polynomial time to H; that is, assuming a solution for H takes 1 unit time, H's solution can be used to solve L in polynomial time.
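A minimal sketch of such a reduction, using a brute-force subset-sum routine as a stand-in for the oracle: the PARTITION problem (can a multiset be split into two halves of equal sum?) reduces in polynomial time to subset sum.

```python
from itertools import combinations

def subset_sum(nums, target):
    # Stand-in oracle for the subset sum problem (itself brute force here).
    return any(sum(c) == target
               for r in range(len(nums) + 1)
               for c in combinations(nums, r))

def partition(nums):
    # Polynomial-time reduction: a multiset splits into two equal-sum
    # halves exactly when some subset sums to half the total.
    total = sum(nums)
    return total % 2 == 0 and subset_sum(nums, total // 2)

print(partition([1, 5, 11, 5]))  # -> True  ({1, 5, 5} vs {11})
print(partition([1, 2, 5]))      # -> False
```

The reduction itself does only arithmetic plus one oracle call, so a fast solution for subset sum would immediately yield one for partition.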
Naive set theory: Naive set theory is any of several theories of sets used in the discussion of the foundations of mathematics. Unlike axiomatic set theories, which are defined using formal logic, naive set theory is defined informally, in natural language. It describes the aspects of mathematical sets familiar in discrete mathematics (for example Venn diagrams and symbolic reasoning about their Boolean algebra), and suffices for the everyday use of set theory concepts in contemporary mathematics.
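The naive, element-based view of sets is exactly what most programming languages implement; Python's built-in `set`, for instance, supports the Boolean-algebra operations directly. A trivial illustration:

```python
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

print(A | B)       # union        -> {1, 2, 3, 4, 5, 6}
print(A & B)       # intersection -> {3, 4}
print(A - B)       # difference   -> {1, 2}
print(A <= A | B)  # subset test  -> True
```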
NP (complexity): In computational complexity theory, NP (nondeterministic polynomial time) is a complexity class used to classify decision problems. NP is the set of decision problems for which the problem instances, where the answer is "yes", have proofs verifiable in polynomial time by a deterministic Turing machine, or alternatively the set of problems that can be solved in polynomial time by a nondeterministic Turing machine.
Ligand (biochemistry): In biochemistry and pharmacology, a ligand is a substance that forms a complex with a biomolecule to serve a biological purpose. The etymology stems from Latin ligare, which means 'to bind'. In protein-ligand binding, the ligand is usually a molecule which produces a signal by binding to a site on a target protein. The binding typically results in a change of conformational isomerism (conformation) of the target protein. In DNA-ligand binding studies, the ligand can be a small molecule, ion, or protein which binds to the DNA double helix.
Directed set: In mathematics, a directed set (or a directed preorder or a filtered set) is a nonempty set A together with a reflexive and transitive binary relation ≤ (that is, a preorder), with the additional property that every pair of elements has an upper bound: for any a and b in A there must exist c in A with a ≤ c and b ≤ c. A directed set's preorder is called a direction. The notion defined above is sometimes called an upward directed set; a downward directed set is defined analogously, meaning that every pair of elements is bounded below.
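A finite preorder can be tested for directedness by brute force over all pairs. A sketch in which `le` is an assumed comparison callable:

```python
def is_directed(D, le):
    # le(a, b) encodes the preorder a <= b; directedness requires every
    # pair of elements to have an upper bound within D itself.
    return len(D) > 0 and all(
        any(le(a, c) and le(b, c) for c in D)
        for a in D for b in D
    )

# Divisibility on {1, 2, 3, 6}: every pair is bounded above by 6.
divides = lambda a, b: b % a == 0
print(is_directed({1, 2, 3, 6}, divides))  # -> True
print(is_directed({2, 3}, divides))        # -> False: 2 and 3 have no common bound
```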
Protein: Proteins are large biomolecules and macromolecules that comprise one or more long chains of amino acid residues. Proteins perform a vast array of functions within organisms, including catalysing metabolic reactions, DNA replication, responding to stimuli, providing structure to cells and organisms, and transporting molecules from one location to another. Proteins differ from one another primarily in their sequence of amino acids, which is dictated by the nucleotide sequence of their genes, and which usually results in protein folding into a specific 3D structure that determines its activity.
Peripheral membrane protein: Peripheral membrane proteins, or extrinsic membrane proteins, are membrane proteins that adhere only temporarily to the biological membrane with which they are associated. These proteins attach to integral membrane proteins, or penetrate the peripheral regions of the lipid bilayer. The regulatory protein subunits of many ion channels and transmembrane receptors, for example, may be defined as peripheral membrane proteins.
Set theory: Set theory is the branch of mathematical logic that studies sets, which can be informally described as collections of objects. Although objects of any kind can be collected into a set, set theory, as a branch of mathematics, is mostly concerned with those that are relevant to mathematics as a whole. The modern study of set theory was initiated by the German mathematicians Richard Dedekind and Georg Cantor in the 1870s. In particular, Georg Cantor is commonly considered the founder of set theory.
Single particle analysis: Single particle analysis is a group of related computerized image processing techniques used to analyze images from transmission electron microscopy (TEM). These methods were developed to improve and extend the information obtainable from TEM images of particulate samples, typically proteins or other large biological entities such as viruses. Individual images of stained or unstained particles are very noisy, and so hard to interpret. Combining several digitized images of similar particles together gives an image with stronger and more easily interpretable features.
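The noise-reduction step behind that combining can be sketched in a few lines: averaging N aligned images of the same particle shrinks independent noise by a factor of √N. This uses purely synthetic data; real pipelines align and classify the particle images first.

```python
import random

def average_images(images):
    # Pixel-wise mean over a stack of aligned images of the same particle.
    n = len(images)
    h, w = len(images[0]), len(images[0][0])
    return [[sum(img[i][j] for img in images) / n for j in range(w)]
            for i in range(h)]

random.seed(0)
truth = [[10.0, 0.0], [0.0, 10.0]]   # hypothetical 2x2 "particle"
stack = [[[truth[i][j] + random.gauss(0, 5.0) for j in range(2)]
          for i in range(2)] for _ in range(400)]

avg = average_images(stack)
# Each averaged pixel lies close to the true value: the noise standard
# deviation drops from 5.0 to roughly 5.0 / sqrt(400) = 0.25.
```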