Traffic flow
In mathematics and transportation engineering, traffic flow is the study of interactions between travellers (including pedestrians, cyclists, drivers, and their vehicles) and infrastructure (including highways, signage, and traffic control devices), with the aim of understanding and developing an optimal transport network with efficient movement of traffic and minimal traffic congestion problems.
Three-phase traffic theory
Three-phase traffic theory is a theory of traffic flow developed by Boris Kerner between 1996 and 2002. It focuses mainly on the explanation of the physics of traffic breakdown and resulting congested traffic on highways. Kerner describes three phases of traffic, while the classical theories based on the fundamental diagram of traffic flow have two phases: free flow and congested traffic.
Traffic
Traffic comprises pedestrians, vehicles, ridden or herded animals, trains, and other conveyances that use public ways (roads) for travel and transportation. Traffic laws govern and regulate traffic, while rules of the road include traffic laws and informal rules that may have developed over time to facilitate the orderly and timely flow of traffic. Organized traffic generally has well-established priorities, lanes, right-of-way, and traffic control at intersections.
Traffic light
Traffic lights, traffic signals, or stoplights – also known as robots in South Africa – are signalling devices positioned at road intersections, pedestrian crossings, and other locations in order to control the flow of traffic. Traffic lights consist normally of three signals, transmitting meaningful information to road users through colours and symbols including arrows and bicycles. The regular traffic light colours are red, yellow (also known as amber), and green, arranged vertically or horizontally in that order.
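As a rough illustration of the fixed red–yellow–green cycle described above, here is a minimal state-machine sketch in Python; the phase order shown is the common green-to-yellow-to-red sequence, but the durations are illustrative assumptions, not a real controller specification.

    # Minimal sketch of a fixed-cycle traffic light as a finite state machine.
    # Durations are illustrative assumptions only.
    from itertools import cycle

    PHASES = [
        ("green", 30),   # proceed
        ("yellow", 4),   # prepare to stop (amber)
        ("red", 30),     # stop
    ]

    def signal_sequence(num_changes):
        """Yield (colour, duration_s) for the next num_changes phase changes."""
        phases = cycle(PHASES)
        for _ in range(num_changes):
            yield next(phases)

    for colour, duration in signal_sequence(6):
        print(f"{colour:6s} for {duration:2d} s")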
Unsupervised learning
Unsupervised learning is a paradigm in machine learning where, in contrast to supervised learning and semi-supervised learning, algorithms learn patterns exclusively from unlabeled data. Neural network tasks are often categorized as discriminative (recognition) or generative (imagination). Often, but not always, discriminative tasks use supervised methods and generative tasks use unsupervised ones; however, the separation is very hazy. For example, object recognition favors supervised learning, but unsupervised learning can also cluster objects into groups.
K-means clustering
k-means clustering is a method of vector quantization, originally from signal processing, that aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean (cluster center or cluster centroid), serving as a prototype of the cluster. This results in a partitioning of the data space into Voronoi cells. k-means clustering minimizes within-cluster variances (squared Euclidean distances), but not regular Euclidean distances, which would be the more difficult Weber problem: the mean optimizes squared errors, whereas only the geometric median minimizes Euclidean distances.
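The alternating assignment-and-update procedure behind k-means can be sketched in a few lines. Below is a minimal implementation of Lloyd's algorithm, the standard heuristic; the random initialisation and fixed iteration cap are simplifying assumptions (real implementations usually use k-means++ seeding and multiple restarts).

    import numpy as np

    def kmeans(X, k, n_iter=100, seed=0):
        """Lloyd's algorithm: alternate nearest-centroid assignment and
        mean-update steps until the centroids stop moving.

        X: (n, d) array of observations; k: number of clusters.
        Returns (centroids, labels).
        """
        rng = np.random.default_rng(seed)
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iter):
            # Assign each point to its nearest centroid (squared Euclidean).
            d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
            labels = d2.argmin(axis=1)
            # Move each centroid to the mean of its assigned points.
            new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centroids[j] for j in range(k)])
            if np.allclose(new, centroids):
                break
            centroids = new
        return centroids, labels

Because each step minimises the within-cluster sum of squared distances, the procedure converges to a local optimum of exactly the squared-error objective described above, not of plain Euclidean distance.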
Big data
Big data primarily refers to data sets that are too large or complex to be dealt with by traditional data-processing application software. Data with many entries (rows) offer greater statistical power, while data with higher complexity (more attributes or columns) may lead to a higher false discovery rate. Though the term is sometimes used loosely, partly because of the lack of a formal definition, the interpretation that seems to best describe big data is that of a large body of information that could not be comprehended when used only in smaller amounts.
Cluster analysis
Cluster analysis or clustering is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar (in some sense) to each other than to those in other groups (clusters). It is a main task of exploratory data analysis, and a common technique for statistical data analysis, used in many fields, including pattern recognition, information retrieval, bioinformatics, data compression, computer graphics and machine learning.
Traffic model
A traffic model is a mathematical model of real-world traffic, usually, but not restricted to, road traffic. Traffic modeling draws heavily on theoretical foundations like network theory and certain theories from physics like the kinematic wave model. The quantity of interest being modeled and measured is the traffic flow, i.e. the throughput of mobile units (e.g. vehicles) per time and transportation medium capacity (e.g. road or lane width). Models can teach researchers and engineers how to ensure an optimal flow with a minimum number of traffic jams.
Traffic congestion
Traffic congestion is a condition in transport that is characterized by slower speeds, longer trip times, and increased vehicular queueing. Traffic congestion on urban road networks has increased substantially since the 1950s. When traffic demand is great enough that the interaction between vehicles slows the traffic stream, this results in congestion. While congestion is a possibility for any mode of transportation, this article will focus on automobile congestion on public roads.
Bus priority signal
Bus priority or transit signal priority (TSP) is a name for various techniques to improve service and reduce delay for mass transit vehicles at intersections (or junctions) controlled by traffic signals. TSP techniques are most commonly associated with buses, but can also be used along tram/streetcar or light rail lines, especially those that mix with or conflict with general vehicular traffic. Transit signal priority techniques can generally be classified as "active" or "passive".
Fundamental diagram of traffic flow
The fundamental diagram of traffic flow is a diagram that gives a relation between road traffic flux (vehicles/hour) and the traffic density (vehicles/km). A macroscopic traffic model involving traffic flux, traffic density and velocity forms the basis of the fundamental diagram. It can be used to predict the capability of a road system, or its behaviour when applying inflow regulation or speed limits. There is a connection between traffic density and vehicle velocity: the more vehicles are on a road, the slower their velocity will be.
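A classical worked instance of such a flux–density relation is the Greenshields model, in which speed is assumed to fall linearly with density; that linearity is the model's assumption, not a general law. With free-flow speed v_f and jam density k_j:

    v(k) = v_f\left(1 - \frac{k}{k_j}\right), \qquad
    q(k) = k\,v(k) = v_f\,k\left(1 - \frac{k}{k_j}\right)

Under this assumption the flux q is a parabola in the density k, peaking at k = k_j/2 with capacity q_max = v_f k_j / 4, which captures the qualitative statement above: past a critical density, adding vehicles reduces throughput rather than increasing it.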
Experiment
An experiment is a procedure carried out to support or refute a hypothesis, or determine the efficacy or likelihood of something previously untried. Experiments provide insight into cause-and-effect by demonstrating what outcome occurs when a particular factor is manipulated. Experiments vary greatly in goal and scale but always rely on repeatable procedure and logical analysis of the results. There also exist natural experimental studies.
Biclustering
Biclustering, block clustering, co-clustering, or two-mode clustering is a data mining technique which allows simultaneous clustering of the rows and columns of a matrix. The term was first introduced by Boris Mirkin to name a technique introduced many years earlier, in 1972, by John A. Hartigan. Given a set of m samples, each represented by an n-dimensional feature vector, the entire dataset can be represented as m rows in n columns (i.e., an m × n matrix). A biclustering algorithm generates biclusters: subsets of rows which exhibit similar behaviour across a subset of columns, or vice versa.
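To make the notion of a coherent row-and-column subset concrete, the sketch below scores a candidate bicluster with the mean squared residue used by Cheng and Church's classic biclustering algorithm; the random matrix and the chosen row/column subsets are assumptions for demonstration only.

    import numpy as np

    def mean_squared_residue(A, rows, cols):
        """Cheng-Church mean squared residue of the sub-matrix A[rows][:, cols].

        A perfectly coherent bicluster (constant rows plus constant columns)
        has residue 0; larger values mean a less coherent block.
        """
        sub = A[np.ix_(rows, cols)]
        row_means = sub.mean(axis=1, keepdims=True)
        col_means = sub.mean(axis=0, keepdims=True)
        overall = sub.mean()
        residue = sub - row_means - col_means + overall
        return float((residue ** 2).mean())

    # Hypothetical 6 x 5 data matrix with an additive pattern planted in the
    # top-left 3 x 3 block, used purely for demonstration.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(6, 5))
    A[:3, :3] = np.add.outer([0.0, 1.0, 2.0], [0.0, 0.5, 1.0])
    print(mean_squared_residue(A, [0, 1, 2], [0, 1, 2]))        # ~0: coherent
    print(mean_squared_residue(A, [0, 1, 2, 3], [0, 1, 2, 3]))  # larger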
Anomaly detection
In data analysis, anomaly detection (also referred to as outlier detection and sometimes as novelty detection) is generally understood to be the identification of rare items, events or observations which deviate significantly from the majority of the data and do not conform to a well defined notion of normal behaviour. Such examples may arouse suspicions of being generated by a different mechanism, or appear inconsistent with the remainder of that set of data.
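A minimal sketch of this idea is z-score thresholding, shown below; the threshold of 3 and the implicit Gaussian assumption are illustrative choices for a one-dimensional baseline, not a general-purpose detector.

    import numpy as np

    def zscore_outliers(x, threshold=3.0):
        """Flag points more than `threshold` standard deviations from the mean.

        A deliberately simple baseline: assumes roughly Gaussian data and a
        single numeric feature; practical detectors are usually more robust.
        """
        x = np.asarray(x, dtype=float)
        z = (x - x.mean()) / x.std()
        return np.abs(z) > threshold

    data = np.concatenate([np.random.default_rng(1).normal(0, 1, 500), [9.5]])
    print(np.flatnonzero(zscore_outliers(data)))  # flags the injected point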
Feature learning
In machine learning, feature learning or representation learning is a set of techniques that allows a system to automatically discover the representations needed for feature detection or classification from raw data. This replaces manual feature engineering and allows a machine to both learn the features and use them to perform a specific task. Feature learning is motivated by the fact that machine learning tasks such as classification often require input that is mathematically and computationally convenient to process.
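One simple concrete instance of unsupervised representation learning is principal component analysis. The sketch below is a minimal SVD-based PCA that learns a low-dimensional representation directly from raw data; the data shape and number of components are arbitrary illustrative choices.

    import numpy as np

    def pca_features(X, n_components):
        """Learn an n_components-dimensional representation of X via PCA.

        Centres the data, takes the top right-singular vectors as learned
        directions, and projects onto them. A minimal sketch, not a full
        PCA implementation (no whitening or variance reporting).
        """
        Xc = X - X.mean(axis=0)
        # Right-singular vectors of the centred data = principal directions.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T  # the learned low-dimensional features

    X = np.random.default_rng(0).normal(size=(200, 10))  # hypothetical raw data
    Z = pca_features(X, n_components=3)
    print(Z.shape)  # (200, 3)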
Thought experiment
A thought experiment is a hypothetical situation in which a hypothesis, theory, or principle is laid out for the purpose of thinking through its consequences. The ancient Greek deiknymi has been described as "the most ancient pattern of mathematical proof", and existed before Euclidean mathematics, where the emphasis was on the conceptual rather than the experimental part of a thought experiment. Johann Witt-Hansen established that Hans Christian Ørsted was the first to use the term Gedankenexperiment (German for 'thought experiment') circa 1812.
Identifier
An identifier is a name that identifies (that is, labels the identity of) either a unique object or a unique class of objects, where the "object" or class may be an idea, physical countable object (or class thereof), or physical noncountable substance (or class thereof). The abbreviation Id often refers to identity, identification (the process of identifying), or an identifier (that is, an instance of identification). An identifier may be a word, number, letter, symbol, or any combination of those.
Critical path method
The critical path method (CPM), or critical path analysis (CPA), is an algorithm for scheduling a set of project activities. A critical path is determined by identifying the longest stretch of dependent activities and measuring the time required to complete them from start to finish. It is commonly used in conjunction with the program evaluation and review technique (PERT). The CPM is a project-modeling technique developed in the late 1950s by Morgan R. Walker of DuPont and James E. Kelley Jr. of Remington Rand.
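To make the "longest stretch of dependent activities" concrete, here is a minimal sketch that finds the critical path of a small activity graph as a longest-duration chain through the dependency DAG; the task names and durations are invented for illustration.

    from graphlib import TopologicalSorter

    # Hypothetical project: activity -> (duration, prerequisites).
    ACTIVITIES = {
        "design":  (3, []),
        "build":   (5, ["design"]),
        "test":    (2, ["build"]),
        "docs":    (4, ["design"]),
        "release": (1, ["test", "docs"]),
    }

    def critical_path(activities):
        """Longest-duration chain through the dependency DAG."""
        deps = {a: set(spec[1]) for a, spec in activities.items()}
        finish, pred = {}, {}
        # Process activities so that every prerequisite is scheduled first.
        for a in TopologicalSorter(deps).static_order():
            dur, prereqs = activities[a]
            start = max((finish[p] for p in prereqs), default=0)
            finish[a] = start + dur
            # Remember which prerequisite determined the start time.
            pred[a] = max(prereqs, key=finish.get) if prereqs else None
        # Walk back from the activity that finishes last.
        end = max(finish, key=finish.get)
        path, node = [], end
        while node is not None:
            path.append(node)
            node = pred[node]
        return path[::-1], finish[end]

    print(critical_path(ACTIVITIES))
    # (['design', 'build', 'test', 'release'], 11)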
Aspect's experiment
Aspect's experiment was the first quantum mechanics experiment to demonstrate the violation of Bell's inequalities. Its 1982 result allowed for further validation of the quantum entanglement and locality principles. It also offered an experimental answer to Albert Einstein, Boris Podolsky, and Nathan Rosen's paradox, which had been proposed about fifty years earlier. The experiment was led by French physicist Alain Aspect at the École supérieure d'optique in Orsay between 1980 and 1982.