Weather radar: Weather radar, also called weather surveillance radar (WSR) and Doppler weather radar, is a type of radar used to locate precipitation, calculate its motion, and estimate its type (rain, snow, hail, etc.). Modern weather radars are mostly pulse-Doppler radars, capable of detecting the motion of rain droplets in addition to the intensity of the precipitation. Both types of data can be analyzed to determine the structure of storms and their potential to cause severe weather.
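As a rough illustration of how reflectivity relates to precipitation intensity, the sketch below inverts a Marshall-Palmer-style Z-R power law (Z = a R^b with a = 200, b = 1.6); these coefficients are assumptions for stratiform rain, and operational radars tune a and b per precipitation regime.

```python
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Estimate rain rate (mm/h) from radar reflectivity (dBZ) using an
    assumed Z-R power law Z = a * R**b (Marshall-Palmer-style values)."""
    z = 10.0 ** (dbz / 10.0)        # convert dBZ to linear reflectivity factor Z (mm^6/m^3)
    return (z / a) ** (1.0 / b)     # invert Z = a * R**b for the rain rate R

for dbz in (20, 35, 50):
    print(f"{dbz} dBZ -> ~{rain_rate_from_dbz(dbz):.1f} mm/h")
```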
Kalman filter: In statistics and control theory, Kalman filtering, also known as linear quadratic estimation (LQE), is an algorithm that uses a series of measurements observed over time, including statistical noise and other inaccuracies, to produce estimates of unknown variables that tend to be more accurate than those based on a single measurement alone, by estimating a joint probability distribution over the variables for each timeframe. The filter is named after Rudolf E. Kálmán, who was one of the primary developers of its theory.
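A minimal scalar sketch of the predict/update cycle, assuming a static state observed through Gaussian noise; the model (F, H) and the noise variances (Q, R) are illustrative placeholders, not values from any particular application.

```python
import numpy as np

def kalman_step(x, P, z, F=1.0, H=1.0, Q=1e-4, R=0.1):
    """One predict/update cycle of a scalar Kalman filter.
    x, P: prior estimate and its variance; z: new measurement."""
    # Predict: propagate the estimate and its uncertainty through the model
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update: blend prediction and measurement, weighted by their uncertainties
    K = P_pred * H / (H * P_pred * H + R)      # Kalman gain
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Track a constant true value of 1.0 seen through noisy measurements
rng = np.random.default_rng(0)
x, P = 0.0, 1.0
for z in 1.0 + 0.3 * rng.standard_normal(50):
    x, P = kalman_step(x, P, z)
print(round(x, 3))   # converges toward 1.0
```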
Tornado debris signature: A tornadic debris signature (TDS), often colloquially referred to as a debris ball, is an area of high reflectivity on weather radar caused by debris lofted into the air, usually associated with a tornado. A TDS may also be indicated by dual-polarization radar products, designated as a polarimetric tornado debris signature (PTDS). Polarimetric radar can discern between meteorological and nonmeteorological scatterers, and the co-location of a PTDS with the enhanced reflectivity of a debris ball is used by meteorologists as confirmation that a tornado is occurring.
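A toy sketch of how a PTDS-style check might combine those ingredients; the 40 dBZ reflectivity and 0.8 correlation-coefficient thresholds are illustrative assumptions, not operational criteria.

```python
def possible_ptds(reflectivity_dbz, correlation_coeff, rotation_detected,
                  dbz_min=40.0, cc_max=0.8):
    """Illustrative PTDS-style check: lofted debris scatters strongly
    (high reflectivity) but is irregularly shaped (low co-polar correlation
    coefficient), and must coincide with detected rotation before it is
    treated as confirmation of a tornado."""
    return (reflectivity_dbz >= dbz_min
            and correlation_coeff <= cc_max
            and rotation_detected)

print(possible_ptds(52.0, 0.65, True))   # True: debris-like gate inside rotation
print(possible_ptds(52.0, 0.97, True))   # False: high correlation suggests rain/hail, not debris
```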
Ensemble Kalman filter: The ensemble Kalman filter (EnKF) is a recursive filter suitable for problems with a large number of variables, such as discretizations of partial differential equations in geophysical models. The EnKF originated as a version of the Kalman filter for large problems (essentially, the covariance matrix is replaced by the sample covariance), and it is now an important data assimilation component of ensemble forecasting.
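A small sketch of the stochastic EnKF analysis step under that idea: the forecast covariance is formed from the ensemble sample covariance, and each member is updated against a perturbed copy of the observations. All array shapes and values here are illustrative.

```python
import numpy as np

def enkf_analysis(ensemble, y_obs, H, R, rng):
    """Stochastic EnKF analysis step.
    ensemble: (n_state, n_members); y_obs: (n_obs,); H: (n_obs, n_state); R: (n_obs, n_obs)."""
    n_state, n_members = ensemble.shape
    Pf = np.atleast_2d(np.cov(ensemble))               # sample covariance replaces the forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)     # Kalman gain from sample statistics
    analysis = np.empty_like(ensemble)
    for i in range(n_members):
        y_pert = y_obs + rng.multivariate_normal(np.zeros(len(y_obs)), R)  # perturbed observations
        analysis[:, i] = ensemble[:, i] + K @ (y_pert - H @ ensemble[:, i])
    return analysis

rng = np.random.default_rng(0)
ens = rng.normal(0.0, 1.0, size=(2, 20))    # 2 state variables, 20 ensemble members
H = np.array([[1.0, 0.0]])                  # only the first variable is observed
updated = enkf_analysis(ens, np.array([0.5]), H, np.array([[0.1]]), rng)
print(updated.mean(axis=1))                 # analysis ensemble mean
```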
Particle filter: Particle filters, or sequential Monte Carlo methods, are a set of Monte Carlo algorithms used to find approximate solutions to the filtering problem for nonlinear state-space systems, which arises in fields such as signal processing and Bayesian statistical inference. The filtering problem consists of estimating the internal states of dynamical systems when only partial observations are made and random perturbations are present in the sensors as well as in the dynamical system.
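A minimal bootstrap (sequential importance resampling) particle filter for a toy one-dimensional random-walk state observed with Gaussian noise; the model and noise levels are assumptions chosen only to keep the sketch short.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500, obs_noise=0.5,
                              proc_noise=0.2, rng=None):
    """Bootstrap particle filter for a toy 1-D random-walk state.
    Returns the posterior-mean estimate at each time step."""
    rng = rng or np.random.default_rng(0)
    particles = rng.normal(0.0, 1.0, n_particles)          # samples from the initial prior
    estimates = []
    for y in observations:
        # Propagate each particle through the (in general nonlinear) dynamics
        particles = particles + rng.normal(0.0, proc_noise, n_particles)
        # Weight particles by the likelihood of the new observation
        weights = np.exp(-0.5 * ((y - particles) / obs_noise) ** 2)
        weights /= weights.sum()
        # Resample to concentrate particles in high-probability regions
        particles = rng.choice(particles, size=n_particles, p=weights)
        estimates.append(particles.mean())
    return np.array(estimates)

obs = np.linspace(0.0, 2.0, 30) + np.random.default_rng(1).normal(0.0, 0.5, 30)
print(bootstrap_particle_filter(obs)[-1])   # tracks the slowly drifting state
```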
Algorithm: In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), eventually achieving automation.
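Euclid's algorithm is a standard concrete example of such a finite sequence of instructions, with a conditional controlling which route execution takes.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite sequence of well-defined steps that
    terminates with the greatest common divisor of two integers."""
    while b != 0:            # conditional decides whether another step is needed
        a, b = b, a % b      # replace the pair with (b, a mod b)
    return a

print(gcd(1071, 462))        # 21
```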
Synthetic-aperture radar: Synthetic-aperture radar (SAR) is a form of radar used to create two-dimensional images or three-dimensional reconstructions of objects, such as landscapes. SAR uses the motion of the radar antenna over a target region to provide finer spatial resolution than conventional stationary beam-scanning radars. SAR is typically mounted on a moving platform, such as an aircraft or spacecraft, and has its origins in an advanced form of side-looking airborne radar (SLAR).
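A back-of-the-envelope comparison of the two regimes, using the textbook approximations that a real-aperture radar resolves roughly λR/D in azimuth while a focused stripmap SAR resolves roughly D/2 independent of range; the C-band numbers below are illustrative assumptions.

```python
def real_aperture_resolution(wavelength_m, range_m, antenna_len_m):
    """Azimuth resolution of a conventional (real-aperture) side-looking radar:
    limited by the physical beamwidth, so it degrades linearly with range."""
    return wavelength_m * range_m / antenna_len_m

def sar_azimuth_resolution(antenna_len_m):
    """Idealized azimuth resolution of a focused stripmap SAR: about half the
    physical antenna length, because platform motion synthesizes a long aperture."""
    return antenna_len_m / 2.0

# Illustrative C-band spaceborne case: 5.6 cm wavelength, 800 km range, 10 m antenna
print(real_aperture_resolution(0.056, 800e3, 10.0))   # ~4480 m without aperture synthesis
print(sar_azimuth_resolution(10.0))                   # ~5 m with aperture synthesis
```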
Ice crystal: Ice crystals are solid ice in symmetrical shapes including hexagonal columns, hexagonal plates, and dendritic crystals. Ice crystals are responsible for various atmospheric optical displays and cloud formations. At ambient temperature and pressure, water molecules have a V shape: the two hydrogen atoms bond to the oxygen atom at a 105° angle. Ice crystals have a hexagonal crystal lattice, meaning the water molecules arrange themselves into layered hexagons upon freezing.
Tornado vortex signature: A tornadic vortex signature (TVS) is a rotation signature detected by a Pulse-Doppler weather radar algorithm that indicates the likely presence of a strong mesocyclone in some stage of tornadogenesis. It may give meteorologists the ability to pinpoint and track the location of tornadic rotation within a larger storm, and is one component of the National Weather Service's warning operations. The tornadic vortex signature was first identified by Donald W. Burgess, Leslie R. Lemon, and Rodger A. Brown.
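A toy sketch of the underlying idea, flagging a strong "gate-to-gate" velocity difference between adjacent azimuths at the same range; the 25 m/s threshold is a placeholder, and operational TVS algorithms apply range-dependent criteria and vertical-continuity checks.

```python
def gate_to_gate_shear(v_inbound_ms, v_outbound_ms, threshold_ms=25.0):
    """Illustrative TVS-style check: adjacent radar azimuths show strong
    inbound and outbound Doppler velocities, suggesting tight rotation.
    Returns (flagged, velocity difference in m/s)."""
    delta_v = abs(v_outbound_ms - v_inbound_ms)
    return delta_v >= threshold_ms, delta_v

flag, dv = gate_to_gate_shear(-32.0, 28.0)
print(flag, dv)   # True, 60.0 m/s gate-to-gate velocity difference
```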
Data: In common usage and statistics, data (US: /ˈdætə/; UK: /ˈdeɪtə/) is a collection of discrete or continuous values that convey information, describing quantity, quality, fact, statistics, or other basic units of meaning, or simply sequences of symbols that may be further interpreted formally. A datum is an individual value in a collection of data. Data is usually organized into structures such as tables that provide additional context and meaning, and which may themselves be used as data in larger structures.
Sorting algorithm: In computer science, a sorting algorithm is an algorithm that puts elements of a list into an order. The most frequently used orders are numerical order and lexicographical order, either ascending or descending. Efficient sorting is important for optimizing the efficiency of other algorithms (such as search and merge algorithms) that require input data to be in sorted lists. Sorting is also often useful for canonicalizing data and for producing human-readable output.
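A small example of a comparison sort into ascending numerical order (insertion sort), followed by a descending lexicographical sort using the language's built-in.

```python
def insertion_sort(items):
    """Simple comparison sort: grow a sorted prefix by inserting each new
    element at its correct position (ascending order here)."""
    items = list(items)
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0 and items[j] > key:   # shift larger elements one slot right
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items

print(insertion_sort([5, 2, 9, 1, 5]))           # [1, 2, 5, 5, 9]
print(sorted(["pear", "apple"], reverse=True))   # descending lexicographical order
```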
DisplayPort: DisplayPort (DP) is a digital display interface developed by a consortium of PC and chip manufacturers and standardized by the Video Electronics Standards Association (VESA). It is primarily used to connect a video source to a display device such as a computer monitor. It can also carry audio, USB, and other forms of data. DisplayPort was designed to replace VGA, FPD-Link, and Digital Visual Interface (DVI). It is backward compatible with other interfaces, such as HDMI and DVI, through the use of either active or passive adapters.
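A rough bandwidth sanity check of the kind this implies: the uncompressed payload of a video mode compared against commonly cited effective four-lane DisplayPort link rates. Blanking intervals and protocol overhead are ignored here, so the figures are only indicative.

```python
def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw uncompressed video payload in Gbit/s (ignores blanking and overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Commonly cited effective four-lane link rates after line coding (Gbit/s)
LINK_RATES = {"HBR2 (DP 1.2)": 17.28, "HBR3 (DP 1.4)": 25.92}

need = video_bitrate_gbps(5120, 2880, 60, 24)   # 5K, 60 Hz, 8-bit RGB
for name, rate in LINK_RATES.items():
    verdict = "fits" if need <= rate else "exceeds link"
    print(f"{name}: {verdict} (need ~{need:.1f} of {rate} Gbit/s)")
```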
Divide-and-conquer algorithm: In computer science, divide and conquer is an algorithm design paradigm. A divide-and-conquer algorithm recursively breaks down a problem into two or more sub-problems of the same or related type, until these become simple enough to be solved directly. The solutions to the sub-problems are then combined to give a solution to the original problem. The divide-and-conquer technique is the basis of efficient algorithms for many problems, such as sorting (e.g., quicksort, merge sort) and multiplying large numbers (e.g., the Karatsuba algorithm).
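Merge sort is a compact illustration of the paradigm: divide the list, conquer each half recursively, then combine the sorted halves.

```python
def merge_sort(items):
    """Divide and conquer: split, recursively sort each half, then merge."""
    if len(items) <= 1:                       # base case: trivially sorted
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])            # conquer the left half
    right = merge_sort(items[mid:])           # conquer the right half
    merged, i, j = [], 0, 0                   # combine: merge two sorted lists
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([8, 3, 5, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]
```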
Data analysis: Data analysis is the process of inspecting, cleansing, transforming, and modeling data with the goal of discovering useful information, informing conclusions, and supporting decision-making. Data analysis has multiple facets and approaches, encompassing diverse techniques under a variety of names, and is used in different business, science, and social science domains. In today's business world, data analysis plays a role in making decisions more scientific and helping businesses operate more effectively.
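A minimal sketch of the inspect/cleanse/transform/summarize flow on a tiny hypothetical table; the column names and values are invented for illustration.

```python
import pandas as pd

# Hypothetical sales records (illustrative only)
raw = pd.DataFrame({
    "region": ["north", "north", "south", "south", None],
    "units":  [12, None, 7, 9, 4],
    "price":  [3.5, 3.5, 4.0, 4.0, 4.0],
})

clean = raw.dropna(subset=["region", "units"])                   # cleanse: drop incomplete rows
clean = clean.assign(revenue=clean["units"] * clean["price"])    # transform: derive a new field
summary = clean.groupby("region")["revenue"].sum()               # summarize: aggregate by group
print(summary)
```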
Data warehouse: In computing, a data warehouse (DW or DWH), also known as an enterprise data warehouse (EDW), is a system used for reporting and data analysis and is considered a core component of business intelligence. Data warehouses are central repositories of integrated data from one or more disparate sources. They store current and historical data in a single place that is used for creating analytical reports for workers throughout the enterprise. This is beneficial for companies, as it enables them to interrogate their data, draw insights, and make decisions.
Environmental racism: Environmental racism, ecological racism, or ecological apartheid is a form of institutional racism that leads to landfills, incinerators, and hazardous waste disposal sites being disproportionately placed in communities of color. Internationally, it is also associated with extractivism, which places the environmental burdens of mining, oil extraction, and industrial agriculture on indigenous peoples and poorer nations largely inhabited by people of color.
Environmental justice: Environmental justice, or eco-justice, is a social movement to address environmental injustice, which occurs when poor and marginalized communities are harmed by hazardous waste, resource extraction, and other land uses from which they do not benefit. The movement has generated hundreds of studies showing that exposure to environmental harm is inequitably distributed. The movement began in the United States in the 1980s. It was heavily influenced by the American civil rights movement and focused on environmental racism within rich countries.
Selection algorithm: In computer science, a selection algorithm is an algorithm for finding the kth smallest value in a collection of ordered values, such as numbers. The value that it finds is called the kth order statistic. Selection includes as special cases the problems of finding the minimum, median, and maximum element in the collection. Selection algorithms include quickselect and the median of medians algorithm. When applied to a collection of n values, these algorithms take linear time, O(n), as expressed using big O notation.
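A short randomized quickselect sketch, which finds the kth order statistic in expected linear time without fully sorting the collection.

```python
import random

def quickselect(items, k):
    """Return the k-th smallest element (k = 0 gives the minimum).
    Expected linear time; the random pivot avoids adversarial worst cases."""
    items = list(items)
    while True:
        pivot = random.choice(items)
        lows   = [x for x in items if x < pivot]
        pivots = [x for x in items if x == pivot]
        highs  = [x for x in items if x > pivot]
        if k < len(lows):
            items = lows                       # answer lies among the smaller values
        elif k < len(lows) + len(pivots):
            return pivot                       # pivot is the k-th order statistic
        else:
            k -= len(lows) + len(pivots)       # discard the smaller values, adjust k
            items = highs

data = [7, 1, 9, 4, 4, 8, 2]
print(quickselect(data, 3))   # 4: the 4th smallest value (0-indexed k = 3)
```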
Software development effort estimation: In software development, effort estimation is the process of predicting the most realistic amount of effort (expressed in terms of person-hours or money) required to develop or maintain software based on incomplete, uncertain, and noisy input. Effort estimates may be used as input to project plans, iteration plans, budgets, investment analyses, pricing processes, and bidding rounds. Published surveys on estimation practice suggest that expert estimation is the dominant strategy when estimating software development effort.
Data assimilation: Data assimilation is a mathematical discipline that seeks to optimally combine theory (usually in the form of a numerical model) with observations. There may be a number of different goals sought, for example: to determine the optimal state estimate of a system, to determine initial conditions for a numerical forecast model, to interpolate sparse observation data using (e.g. physical) knowledge of the system being observed, or to set numerical parameters based on training a model from observed data.
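A minimal sketch of a single analysis step in that spirit, blending a background (model) state x_b with observations y through a gain built from assumed background and observation error covariances; all matrices and values here are illustrative.

```python
import numpy as np

def analysis_update(x_b, B, y, H, R):
    """One analysis step combining a model background with observations:
    x_a = x_b + K (y - H x_b), with K = B H^T (H B H^T + R)^-1.
    B and R encode how much the background and the observations are trusted."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain: weights the observation increment
    return x_b + K @ (y - H @ x_b)

x_b = np.array([10.0, 5.0])                  # background (model) state
B = np.array([[1.0, 0.5], [0.5, 1.0]])       # background error covariance
H = np.array([[1.0, 0.0]])                   # only the first variable is observed
y = np.array([12.0])                         # observation
R = np.array([[0.5]])                        # observation error covariance
print(analysis_update(x_b, B, y, H, R))      # first variable pulled toward 12; second adjusted via covariance
```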