Uncertainty

Uncertainty or incertitude refers to epistemic situations involving imperfect or unknown information. It applies to predictions of future events, to physical measurements that have already been made, or to the unknown. Uncertainty arises in partially observable or stochastic environments, as well as from ignorance, indolence, or both. It arises in any number of fields, including insurance, philosophy, physics, statistics, economics, finance, medicine, psychology, sociology, engineering, metrology, meteorology, ecology and information science.

Although the terms are used in various ways among the general public, many specialists in decision theory, statistics and other quantitative fields have defined uncertainty, risk, and their measurement as: the lack of certainty, a state of limited knowledge where it is impossible to exactly describe the existing state, a future outcome, or more than one possible outcome. In statistics and economics, second-order uncertainty

A counting measure over the set of all possible outcomes. Densities for absolutely continuous distributions are usually defined as this derivative with respect to the Lebesgue measure. If a theorem can be proved in this general setting, it holds for both discrete and continuous distributions as well as others; separate proofs are not required for discrete and continuous distributions. Certain random variables occur very often in probability theory because they describe many natural or physical processes well. Their distributions therefore have gained special importance in probability theory. Some fundamental discrete distributions are

A measure $P$ defined on $\mathcal{F}$ is called a probability measure if $P(\Omega) = 1$. If $\mathcal{F}$ is the Borel σ-algebra on the set of real numbers, then there

A sequence of independent and identically distributed random variables $X_k$ converges towards their common expectation (expected value) $\mu$, provided that the expectation of $|X_k|$ is finite. It is the form of convergence of the random variables that separates

A Dutch merchant is trying to decide whether to insure a cargo being sent from Amsterdam to St. Petersburg in winter. In his solution, he defines a utility function and computes expected utility rather than expected financial value. In the 20th century, interest was reignited by Abraham Wald's 1939 paper pointing out that the two central procedures of sampling-distribution-based statistical theory, namely hypothesis testing and parameter estimation, are special cases of

A book on the subject in 1657. In the 19th century, what is considered the classical definition of probability was completed by Pierre Laplace. Initially, probability theory mainly considered discrete events, and its methods were mainly combinatorial. Eventually, analytical considerations compelled the incorporation of continuous variables into the theory. This culminated in modern probability theory, on foundations laid by Andrey Nikolaevich Kolmogorov. Kolmogorov combined

A broader sense of uncertainty and how it should be approached from an ethics perspective: There are some things that you know to be true, and others that you know to be false; yet, despite this extensive knowledge that you have, there remain many things whose truth or falsity is not known to you. We say that you are uncertain about them. You are uncertain, to varying degrees, about everything in

A business-related sense, in an economic-development frame or a social-progress frame. The nature of these frames is to downplay or eliminate uncertainty, so when economic and scientific promise are focused on early in the issue cycle, as has happened with coverage of plant biotechnology and nanotechnology in the United States, the matter in question seems more definitive and certain. Sometimes, stockholders, owners, or advertising will pressure

A common and erroneous thought process that arises through heuristic thinking is the gambler's fallacy: believing that an isolated random event is affected by previous isolated random events. For example, if flips of a fair coin give repeated tails, the coin still has the same probability (i.e., 0.5) of tails on future flips, though intuitively it might seem that heads becomes more likely. In
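The coin's memorylessness is easy to check empirically. A minimal simulation sketch (hypothetical, not from the article): estimate the probability of tails on the flip immediately following three tails in a row; if prior flips mattered, the estimate would differ from 0.5.

```python
import random

rng = random.Random(42)
flips = [rng.random() < 0.5 for _ in range(1_000_000)]  # True means tails

hits = total = 0
for i in range(3, len(flips)):
    if flips[i - 3] and flips[i - 2] and flips[i - 1]:  # three tails in a row
        total += 1
        hits += flips[i]  # is the next flip tails too?

print(hits / total)  # comes out near 0.5, regardless of the preceding streak
```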

A continuous sample space. Classical definition: The classical definition breaks down when confronted with the continuous case. See Bertrand's paradox. Modern definition: If the sample space of a random variable X is the set of real numbers ($\mathbb{R}$) or a subset thereof, then a function called the cumulative distribution function (CDF) $F$ exists, defined by $F(x) = P(X \leq x)$. That is, $F(x)$ returns

A distribution of frequencies of multiple instances of the quantity, derived from observed data. In economics, in 1921 Frank Knight distinguished uncertainty from risk, with uncertainty being lack of knowledge which is immeasurable and impossible to calculate. Because of the absence of clearly defined statistics in most economic decisions where people face uncertainty, he believed that we cannot measure probabilities in such cases; this

A fixed universe of possibilities is that it considers the "known unknowns", not the "unknown unknowns": it focuses on expected variations, not on unforeseen events, which some argue have outsized impact and must be considered; significant events may be "outside model". This line of argument, called the ludic fallacy, is that there are inevitable imperfections in modeling the real world by particular models, and that unquestioning reliance on models blinds one to their limits. Probability theory or probability calculus

A logarithmic scale, for example. Uncertainty of a measurement can be determined by repeating a measurement to arrive at an estimate of the standard deviation of the values. Then, any single value has an uncertainty equal to the standard deviation. However, if the values are averaged, then the mean measurement value has a much smaller uncertainty, equal to the standard error of the mean, which
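A minimal sketch of this procedure (the readings are hypothetical): the single-reading uncertainty is the sample standard deviation, and the mean's uncertainty is that value divided by the square root of the number of readings.

```python
import statistics

readings = [10.3, 10.5, 10.4, 10.6, 10.4, 10.5, 10.3, 10.6]  # hypothetical data

sigma = statistics.stdev(readings)    # uncertainty of any single reading
sem = sigma / len(readings) ** 0.5    # standard error of the mean

print(f"single reading: ±{sigma:.3f}")
print(f"mean: {statistics.mean(readings):.3f} ± {sem:.3f}")
```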

A media organization to promote the business aspects of a scientific issue, and therefore any uncertainty claims which may compromise the business interests are downplayed or eliminated. In Western philosophy the first philosopher to embrace uncertainty was Pyrrho, resulting in the Hellenistic philosophies of Pyrrhonism and Academic Skepticism, the first schools of philosophical skepticism. Aporia and acatalepsy represent key concepts in ancient Greek philosophy regarding uncertainty. William MacAskill,

A minimum for any insurance coverage, then add onto that other operating costs and profit. Since many people are willing to buy insurance for many reasons, clearly the EOL alone is not the perceived value of avoiding the risk. Quantitative uses of the terms uncertainty and risk are fairly consistent among fields such as probability theory, actuarial science, and information theory. Some also create new terms without substantially changing

A mix; for example, the Cantor distribution has no positive probability for any single point, nor does it have a density. The modern approach to probability theory solves these problems using measure theory to define the probability space: Given any set $\Omega$ (also called the sample space) and a σ-algebra $\mathcal{F}$ on it,

A notation of uncertainty. They apply to the least significant digits. For instance, 1.00794(7) stands for 1.00794 ± 0.00007, while 1.00794(72) stands for 1.00794 ± 0.00072. This concise notation is used, for example, by IUPAC in stating the atomic mass of elements. The middle notation is used when the error is not symmetrical about the value, for example $3.4^{+0.3}_{-0.2}$. This can occur when using
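A small sketch of how the parenthesized notation expands (the helper `parse_concise` is hypothetical, not a standard function): the digits in parentheses are scaled to the place value of the least significant digits they annotate.

```python
def parse_concise(s: str) -> tuple[float, float]:
    """Expand e.g. '1.00794(7)' into (1.00794, 0.00007)."""
    value_part, unc_part = s.rstrip(")").split("(")
    decimals = len(value_part.split(".")[1]) if "." in value_part else 0
    # The parenthesized digits align with the last digits of the value.
    return float(value_part), int(unc_part) * 10.0 ** -decimals

print(parse_concise("1.00794(7)"))   # (1.00794, 7e-05)
print(parse_concise("1.00794(72)"))  # (1.00794, 0.00072)
```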

A pension scheme, giving them an income at some time in the future. What is the optimal thing to do? The answer depends partly on factors such as the expected rates of interest and inflation, the person's life expectancy, and their confidence in the pensions industry. However, even with all those factors taken into account, human behavior again deviates greatly from the predictions of prescriptive decision theory, leading to alternative models in which, for example, objective interest rates are replaced by subjective discount rates. Some decisions are difficult because of

A philosopher at Oxford University, has also discussed the concept of Moral Uncertainty. Moral Uncertainty is "uncertainty about how to act given lack of certainty in any one moral theory, as well as the study of how we ought to act given this uncertainty." Decision theory or the theory of rational choice is a branch of probability, economics, and analytic philosophy that uses

A procedural framework (e.g. Amos Tversky's elimination-by-aspects model) or an axiomatic framework (e.g. stochastic transitivity axioms), reconciling the Von Neumann–Morgenstern axioms with behavioral violations of the expected utility hypothesis, or they may explicitly give a functional form for time-inconsistent utility functions (e.g. Laibson's quasi-hyperbolic discounting). Prescriptive decision theory
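As an illustration of the time-inconsistent functional forms mentioned above, here is a minimal sketch of Laibson's quasi-hyperbolic ("beta-delta") discounting next to standard exponential discounting; the parameter values are hypothetical.

```python
def exponential_weight(t: int, delta: float = 0.95) -> float:
    return delta ** t

def quasi_hyperbolic_weight(t: int, beta: float = 0.7, delta: float = 0.95) -> float:
    # Beta-delta form: every future period gets one extra discount factor beta,
    # which makes "now versus later" trade-offs inconsistent over time.
    return 1.0 if t == 0 else beta * delta ** t

for t in range(4):
    print(t, round(exponential_weight(t), 3), round(quasi_hyperbolic_weight(t), 3))
```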

A random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability theory describing such behavior are the law of large numbers and the central limit theorem. As a mathematical foundation for statistics, probability theory is essential to many human activities that involve quantitative analysis of data. Methods of probability theory also apply to descriptions of complex systems given only partial knowledge of their state, as in statistical mechanics or sequential estimation. A great discovery of twentieth-century physics

A random value from a normal distribution with probability 1/2. It can still be studied to some extent by considering it to have a PDF of $(\delta[x] + \varphi(x))/2$, where $\delta[x]$ is the Dirac delta function. Other distributions may not even be

A set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in

A single source or without any context of previous research mean that the subject at hand is presented as more definitive and certain than it is in reality. There is often a "product over process" approach to science journalism that aids, too, in the downplaying of uncertainty. Finally, and most notably for this investigation, when science is framed by journalists as a triumphant quest, uncertainty

Is understood that 10.5 means 10.5 ± 0.05, and 10.50 means 10.50 ± 0.005, also written 10.50(5) and 10.500(5) respectively. But if the accuracy is within two tenths, the uncertainty is ± one tenth, and it must be stated explicitly: 10.5 ± 0.1 and 10.50 ± 0.01, or 10.5(1) and 10.50(1). The numbers in parentheses apply to the numeral to their left, and are not part of that number, but part of

Is 'resolvable'. If uncertainty arises from a lack of knowledge, and that lack of knowledge is resolvable by acquiring knowledge (such as by primary or secondary research), then it is not radical uncertainty. Only when there are no means available to acquire the knowledge which would resolve the uncertainty is it considered 'radical'. The most commonly used procedure for calculating measurement uncertainty

Is a form of uncertainty where even the possible outcomes have unclear meanings and interpretations. The statement "He returns from the bank" is ambiguous because its interpretation depends on whether the word 'bank' means "the side of a river" or "a financial institution". Ambiguity typically arises in situations where multiple analysts or observers have different interpretations of the same statements. At

Is a unique probability measure on $\mathcal{F}$ for any CDF, and vice versa. The measure corresponding to a CDF is said to be induced by the CDF. This measure coincides with the pmf for discrete variables and the PDF for continuous variables, making the measure-theoretic approach free of fallacies. The probability of a set $E$ in

Is an irreducible property of nature or if there are "hidden variables" that would describe the state of a particle even more exactly than Heisenberg's uncertainty principle allows. The term 'radical uncertainty' was popularised by John Kay and Mervyn King in their book Radical Uncertainty: Decision-Making for an Unknowable Future, published in March 2020. It is distinct from Knightian uncertainty by whether or not it

Is attached, which satisfies the following properties: the probability function $f(x)$ lies between zero and one for every value of $x$ in the sample space $\Omega$, and the sum of $f(x)$ over all values $x$ in the sample space $\Omega$ is equal to 1. An event is defined as any subset $E$ of the sample space $\Omega$. The probability of

Is concerned with predictions about behavior that positive decision theory produces to allow for further tests of the kind of decision-making that occurs in practice. In recent decades, there has also been increasing interest in "behavioral decision theory", contributing to a re-evaluation of what useful decision-making requires. The area of choice under uncertainty represents the heart of decision theory. Known from

Is described in the "Guide to the Expression of Uncertainty in Measurement" (GUM) published by ISO. Derived works include the National Institute of Standards and Technology (NIST) Technical Note 1297, "Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results", and the Eurachem/CITAC publication "Quantifying Uncertainty in Analytical Measurement". The uncertainty of

Is erroneously framed as "reducible and resolvable". Some media routines and organizational factors affect the overstatement of uncertainty; others help inflate the certainty of an issue. Because the general public (in the United States) generally trusts scientists, when science stories are covered without alarm-raising cues from special interest organizations (religious groups, environmental organizations, political factions, etc.), they are often covered in

Is given by the sum of the probabilities of the events. The probability that any one of the events {1,6}, {3}, or {2,4} will occur is 5/6. This is the same as saying that the probability of event {1,2,3,4,6} is 5/6. This event encompasses the possibility of any number except five being rolled. The mutually exclusive event {5} has a probability of 1/6, and the event {1,2,3,4,5,6} has a probability of 1, that is, absolute certainty. When doing calculations using
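These die-roll calculations can be written out directly; a minimal sketch with exact fractions, using the same event sets as the text:

```python
from fractions import Fraction

p = {outcome: Fraction(1, 6) for outcome in range(1, 7)}  # fair six-sided die

def prob(event):
    return sum(p[o] for o in event)

# Mutually exclusive events: the probability of their union is the sum.
print(prob({1, 6}) + prob({3}) + prob({2, 4}))  # 5/6
print(prob({1, 2, 3, 4, 6}))                    # 5/6, the same event
print(prob({5}), prob({1, 2, 3, 4, 5, 6}))      # 1/6 and 1 (certainty)
```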

Is in some sense fully rational. The practical application of this prescriptive approach (how people ought to make decisions) is called decision analysis and is aimed at finding tools, methodologies, and software (decision support systems) to help people make better decisions. In contrast, descriptive decision theory is concerned with describing observed behaviors, often under the assumption that those making decisions are behaving under some consistent rules. These rules may, for instance, have

Is known as the distinction bias. Heuristics are procedures for making a decision without working out the consequences of every option. Heuristics decrease the amount of evaluative thinking required for decisions, focusing on some aspects of the decision while ignoring others. While quicker than step-by-step processing, heuristic thinking is also more likely to involve fallacies or inaccuracies. One example of

Is not known. It is so fundamental, indeed, that … a known risk will not lead to any reward or special payment at all. Knight pointed out that the unfavorable outcomes of known risks can be insured against during the decision-making process, because such risks have a clearly defined expected probability distribution. Unknown risks have no known expected probability distribution, which can lead to extremely risky company decisions. Other taxonomies of uncertainties and decisions include

Is now referred to as Knightian uncertainty. Uncertainty must be taken in a sense radically distinct from the familiar notion of risk, from which it has never been properly separated.... The essential fact is that 'risk' means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in

Is only one of many alternatives and point to many examples where non-standard alternatives have been implemented with apparent success. Notably, probabilistic decision theory can sometimes be sensitive to assumptions about the probabilities of various events, whereas non-probabilistic rules, such as minimax, are robust in that they do not make such assumptions. A general criticism of decision theory based on

Is reported in the public sphere, discrepancies between outcomes of multiple scientific studies due to methodological differences could be interpreted by the public as a lack of consensus in a situation where a consensus does in fact exist. This interpretation may have even been intentionally promoted, as scientific uncertainty may be managed to reach certain goals. For example, climate change deniers took

Is represented in probability density functions over (first-order) probabilities. Opinions in subjective logic carry this type of uncertainty. There is a difference between uncertainty and variability. Uncertainty is quantified by a probability distribution which depends upon knowledge about the likelihood of what the single, true value of the uncertain quantity is. Variability is quantified by

Is severely biased by anchoring. Intertemporal choice is concerned with the kind of choice where different actions lead to outcomes that are realized at different stages over time. It is also described as cost-benefit decision making, since it involves choices between rewards that vary in magnitude and time of arrival. If someone received a windfall of several thousand dollars, they could spend it on an expensive holiday, giving them immediate pleasure, or they could invest it in

Is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to

Is the standard deviation divided by the square root of the number of measurements. This procedure neglects systematic errors, however. When the uncertainty represents the standard error of the measurement, then about 68.3% of the time the true value of the measured quantity falls within the stated uncertainty range. For example, it is likely that for 31.7% of the atomic mass values given on

The discrete uniform, Bernoulli, binomial, negative binomial, Poisson and geometric distributions. Important continuous distributions include the continuous uniform, normal, exponential, gamma and beta distributions. In probability theory, there are several notions of convergence for random variables. They are listed below in order of strength, i.e., any subsequent notion of convergence in

The identity function. This does not always work. For example, when flipping a coin the two possible outcomes are "heads" and "tails". In this example, the random variable X could assign to the outcome "heads" the number "0" ($X(\text{heads}) = 0$) and to the outcome "tails" the number "1" ($X(\text{tails}) = 1$). Discrete probability theory deals with events that occur in countable sample spaces. Examples: throwing dice, experiments with decks of cards, random walk, and tossing coins. Classical definition: Initially

The list of elements by atomic mass, the true value lies outside of the stated range. If the width of the interval is doubled, then probably only 4.6% of the true values lie outside the doubled interval, and if the width is tripled, probably only 0.3% lie outside. These values follow from the properties of the normal distribution, and they apply only if the measurement process produces normally distributed errors. In that case,
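The quoted percentages follow directly from the normal distribution; a minimal sketch computing them via the error function, since $P(|X - \mu| \le k\sigma) = \operatorname{erf}(k/\sqrt{2})$:

```python
import math

for k in (1, 2, 3):
    inside = math.erf(k / math.sqrt(2))
    print(f"±{k}σ: {inside:.1%} inside, {1 - inside:.1%} outside")
# ±1σ: 68.3% inside, 31.7% outside
# ±2σ: 95.4% inside,  4.6% outside
# ±3σ: 99.7% inside,  0.3% outside
```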

The weak and the strong law of large numbers. It follows from the LLN that if an event of probability p is observed repeatedly during independent experiments, the ratio of the observed frequency of that event to the total number of repetitions converges towards p. For example, if $Y_1, Y_2, \dots$ are independent Bernoulli random variables taking values 1 with probability p and 0 with probability $1 - p$, then $\mathrm{E}(Y_i) = p$ for all i, so that $\bar{Y}_n$ converges to p almost surely. The central limit theorem (CLT) explains
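A minimal simulation sketch of this convergence (p = 0.3 is an arbitrary choice): the running fraction of successes among independent Bernoulli draws approaches p as the number of trials grows.

```python
import random

rng = random.Random(0)
p = 0.3
draws = [rng.random() < p for _ in range(1_000_000)]  # Bernoulli(p) samples

for n in (10, 100, 10_000, 1_000_000):
    print(n, sum(draws[:n]) / n)  # tends toward 0.3 as n grows
```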

The 17th century (Blaise Pascal invoked it in his famous wager, which is contained in his Pensées, published in 1670), the idea of expected value is that, when faced with a number of actions, each of which could give rise to more than one possible outcome with different probabilities, the rational procedure is to identify all possible outcomes, determine their values (positive or negative) and
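A minimal sketch of this expected-value rule (the actions, probabilities, and payoffs are hypothetical): weight each outcome's value by its probability, sum per action, and pick the action with the highest total.

```python
actions = {
    "insure":      [(1.00, -200)],                 # pay a fixed premium
    "self-insure": [(0.95, 0), (0.05, -5_000)],    # small chance of a big loss
}

def expected_value(outcomes):
    return sum(p * v for p, v in outcomes)

for name, outcomes in actions.items():
    print(name, expected_value(outcomes))          # -200 vs -250

print("choose:", max(actions, key=lambda a: expected_value(actions[a])))
```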

The 6 have even numbers and each face has the same probability of appearing. Modern definition: The modern definition starts with a finite or countable set called the sample space, which relates to the set of all possible outcomes in the classical sense, denoted by $\Omega$. It is then assumed that for each element $x \in \Omega$, an intrinsic "probability" value $f(x)$

The advice of Frank Luntz to frame global warming as an issue of scientific uncertainty, which was a precursor to the conflict frame used by journalists when reporting the issue. "Indeterminacy can be loosely said to apply to situations in which not all the parameters of the system and their interactions are fully known, whereas ignorance refers to situations in which it is not known what is not known." These unknowns, indeterminacy and ignorance, that exist in science are often "transformed" into uncertainty when reported to

The bearings of the phenomena depending on which of the two is really present and operating.... It will appear that a measurable uncertainty, or 'risk' proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. There is a fundamental distinction between the reward for taking a known risk and that for assuming a risk whose value itself

The complexity of the organization that has to make them. Individuals making decisions are limited in resources (i.e. time and intelligence) and are therefore boundedly rational; the issue is thus, more than the deviation between real and optimal behavior, the difficulty of determining the optimal behavior in the first place. Decisions are also affected by whether options are framed together or separately; this

The concise notation for the ± notation. For example, to express a length of 10 1/2 meters in a scientific or engineering application, it could be written as 10.5 m or 10.50 m, by convention meaning accurate to within one tenth of a meter or one hundredth of a meter, respectively. The precision is symmetric around the last digit: here it is half a tenth up and half a tenth down, so 10.5 means between 10.45 and 10.55. Thus it

The cost of delays vs. outright cancellation, etc. Some may represent the risk in this example as the "expected opportunity loss" (EOL), or the chance of the loss multiplied by the amount of the loss (10% × $100,000 = $10,000). That is useful if the organizer of the event is "risk neutral", which most people are not. Most would be willing to pay a premium to avoid the loss. An insurance company, for example, would compute an EOL as
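A minimal sketch of the arithmetic (the 25% loading for costs and profit is hypothetical): the EOL from the text, plus the kind of markup an insurer might add on top of it.

```python
p_rain = 0.10
loss = 100_000

eol = p_rain * loss            # 10% of $100,000 = $10,000
loading = 0.25                 # hypothetical markup for costs and profit
premium = eol * (1 + loading)

print(f"EOL = ${eol:,.0f}, premium = ${premium:,.0f}")
```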

The definitions of uncertainty or risk. For example, surprisal is a variation on uncertainty sometimes used in information theory. But outside of the more mathematical uses of the term, usage may vary widely. In cognitive psychology, uncertainty can be real, or just a matter of perception, such as expectations, threats, etc. Vagueness is a form of uncertainty where the analyst is unable to clearly differentiate between two different classes, such as 'person of average height' and 'tall person'. This form of vagueness can be modelled by some variation on Zadeh's fuzzy logic or subjective logic. Ambiguity

The derivative gives us the CDF back again, then the random variable X is said to have a probability density function (PDF) or simply density $f(x) = \frac{dF(x)}{dx}$. For a set $E \subseteq \mathbb{R}$,

The discrete, continuous, a mix of the two, and more. Consider an experiment that can produce a number of outcomes. The set of all outcomes is called the sample space of the experiment. The power set of the sample space (or equivalently, the event space) is formed by considering all different collections of possible results. For example, rolling an honest die produces one of six possible results. One collection of possible results corresponds to getting an odd number. Thus,

The empirical study of economic behavior with less emphasis on rationality presuppositions. It describes a way by which people make decisions when all of the outcomes carry a risk. Kahneman and Tversky found three regularities in actual human decision-making: "losses loom larger than gains"; people focus more on changes in their utility states than they focus on absolute utilities; and the estimation of subjective probabilities

The event $E$ is defined as $P(E) = \sum_{x \in E} f(x)$. So, the probability of the entire sample space is 1, and the probability of the null event is 0. The function $f(x)$ mapping a point in the sample space to the "probability" value is called a probability mass function, abbreviated as pmf. Continuous probability theory deals with events that occur in

The event made up of all possible results (in our example, the event {1,2,3,4,5,6}) be assigned a value of one. To qualify as a probability distribution, the assignment of values must satisfy the requirement that if you look at a collection of mutually exclusive events (events that contain no common results, e.g., the events {1,6}, {3}, and {2,4} are all mutually exclusive), the probability that any of these events occurs

The foundations for the rational agent models used to mathematically model and analyze individuals in fields such as sociology, economics, criminology, cognitive science, and political science. Normative decision theory is concerned with identifying optimal decisions, where optimality is often determined by considering an ideal decision maker who is able to calculate with perfect accuracy and

The foundations of probability theory, but instead emerges from these foundations as a theorem. Since it links theoretically derived probabilities to their actual frequency of occurrence in the real world, the law of large numbers is considered a pillar in the history of statistical theory and has had widespread influence. The law of large numbers (LLN) states that the sample average of

The future; much of the past is hidden from you; and there is a lot of the present about which you do not have full information. Uncertainty is everywhere and you cannot escape from it. For example, if it is unknown whether or not it will rain tomorrow, then there is a state of uncertainty. If probabilities are applied to the possible outcomes using weather forecasts or even just a calibrated probability assessment,

The general decision problem. Wald's paper renewed and synthesized many concepts of statistical theory, including loss functions, risk functions, admissible decision rules, antecedent distributions, Bayesian procedures, and minimax procedures. The phrase "decision theory" itself was used in 1950 by E. L. Lehmann. The revival of subjective probability theory, from the work of Frank Ramsey, Bruno de Finetti, Leonard Savage and others, extended

The list implies convergence according to all of the preceding notions. As the names indicate, weak convergence is weaker than strong convergence. In fact, strong convergence implies convergence in probability, and convergence in probability implies weak convergence. The reverse statements are not always true. Common intuition suggests that if a fair coin is tossed many times, then roughly half of

The long run, heads and tails should occur equally often; people commit the gambler's fallacy when they use this heuristic to predict that a result of heads is "due" after a run of tails. Another example is that decision-makers may be biased towards preferring moderate alternatives to extreme ones. The compromise effect operates under a mindset that the most moderate option carries the most benefit. In an incomplete information scenario, as in most daily decisions,

The measure-theoretic treatment of probability is that it unifies the discrete and the continuous cases, and makes the difference a question of which measure is used. Furthermore, it covers distributions that are neither discrete nor continuous nor mixtures of the two. An example of such distributions could be a mix of discrete and continuous distributions, for example, a random variable that is 0 with probability 1/2, and takes

The moderate option will look more appealing than either extreme, independent of the context, based only on the fact that it has characteristics that can be found at either extreme. A highly controversial issue is whether one can replace the use of probability in decision theory with something else. Advocates for the use of probability theory point to: The proponents of fuzzy logic, possibility theory, Dempster–Shafer theory, and info-gap decision theory maintain that probability

The need to take into account how other people in the situation will respond to the decision that is taken. The analysis of such social decisions is often treated under decision theory, though it involves mathematical methods. In the emerging field of socio-cognitive engineering, the research is especially focused on the different types of distributed decision-making in human organizations, in normal and abnormal/emergency/crisis situations. Other areas of decision theory are concerned with decisions that are difficult simply because of their complexity, or

The notion of sample space, introduced by Richard von Mises, with measure theory, and presented his axiom system for probability theory in 1933. This became the mostly undisputed axiomatic basis for modern probability theory, but alternatives exist, such as the adoption of finite rather than countable additivity by Bruno de Finetti. Most introductions to probability theory treat discrete probability distributions and continuous probability distributions separately. The measure theory-based treatment of probability covers

The outcomes of an experiment, it is necessary that all those elementary events have a number assigned to them. This is done using a random variable. A random variable is a function that assigns to each elementary event in the sample space a real number. This function is usually denoted by a capital letter. In the case of a die, the assignment of a number to certain elementary events can be done using

The probabilities that will result from each course of action, and multiply the two to give an "expected value", or the average expectation for an outcome; the action to be chosen should be the one that gives rise to the highest total expected value. In 1738, Daniel Bernoulli published an influential paper entitled Exposition of a New Theory on the Measurement of Risk, in which he uses the St. Petersburg paradox to show that expected value theory must be normatively wrong. He gives an example in which
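Bernoulli's point can be made numerically. A minimal sketch (not his original presentation): the St. Petersburg game pays $2^k$ if the first heads appears on toss k, so the expected value grows without bound, while the expected log-utility converges.

```python
import math

def expected_value(terms: int) -> float:
    # Each term contributes (1/2)**k * 2**k = 1, so the sum equals `terms`: divergent.
    return sum(0.5 ** k * 2 ** k for k in range(1, terms + 1))

def expected_log_utility(terms: int) -> float:
    # With log utility the series converges to 2*ln(2) ≈ 1.386.
    return sum(0.5 ** k * math.log(2 ** k) for k in range(1, terms + 1))

print(expected_value(50))         # 50.0, and still growing with more terms
print(expected_log_utility(50))   # ≈ 1.386, essentially converged
```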

The probability of an event to occur was defined as the number of cases favorable for the event, over the number of total outcomes possible in an equiprobable sample space: see Classical definition of probability. For example, if the event is "occurrence of an even number when a die is rolled", the probability is given by $\tfrac{3}{6} = \tfrac{1}{2}$, since 3 faces out of

The probability of the random variable X being in $E$ is $P(X \in E) = \int_{x \in E} dF(x)$. In case the PDF exists, this can be written as $P(X \in E) = \int_{x \in E} f(x)\,dx$. Whereas the PDF exists only for continuous random variables, the CDF exists for all random variables (including discrete random variables) that take values in $\mathbb{R}$. These concepts can be generalized for multidimensional cases on $\mathbb{R}^n$ and other continuous sample spaces. The utility of

The probability that X will be less than or equal to x. The CDF necessarily satisfies the following properties. The random variable $X$ is said to have a continuous probability distribution if the corresponding CDF $F$ is continuous. If $F$ is absolutely continuous, i.e., its derivative exists and integrating
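A worked illustration of this CDF–density relationship (a standard textbook example, not taken from this article): for the exponential distribution with rate $\lambda > 0$, differentiating the CDF recovers the density.

```latex
F(x) = P(X \le x) =
  \begin{cases} 1 - e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0 \end{cases}
\qquad
f(x) = \frac{dF(x)}{dx} =
  \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0 \end{cases}
```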

The public in order to make issues more manageable, since scientific indeterminacy and ignorance are difficult concepts for scientists to convey without losing credibility. Conversely, uncertainty is often interpreted by the public as ignorance. The transformation of indeterminacy and ignorance into uncertainty may be related to the public's misinterpretation of uncertainty as ignorance. Journalists may inflate uncertainty (making

The public sphere than in the scientific community. This is due in part to the diversity of the public audience, and the tendency of scientists to misunderstand lay audiences and therefore not communicate ideas clearly and effectively. One example is explained by the information deficit model. Also, in the public realm, there are often many scientific voices giving input on a single topic. For example, depending on how an issue

The quoted standard errors are easily converted to 68.3% ("one sigma"), 95.4% ("two sigma"), or 99.7% ("three sigma") confidence intervals. In this context, uncertainty depends on both the accuracy and precision of the measurement instrument. The lower the accuracy and precision of an instrument, the larger the measurement uncertainty is. Precision is often determined as the standard deviation of

The repeated measures of a given value, namely using the same method described above to assess measurement uncertainty. However, this method is correct only when the instrument is accurate. When it is inaccurate, the uncertainty is larger than the standard deviation of the repeated measures, and it appears evident that the uncertainty does not depend only on instrumental precision. Uncertainty in science, and science in general, may be interpreted differently in

The result of a measurement generally consists of several components. The components are regarded as random variables, and may be grouped into two categories according to the method used to estimate their numerical values: By propagating the variances of the components through a function relating the components to the measurement result, the combined measurement uncertainty is given as the square root of
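A minimal sketch of this propagation step (the measurement function and numbers are hypothetical): approximate the partial derivatives numerically and combine independent component uncertainties in quadrature.

```python
import math

def combined_uncertainty(f, x, u, h=1e-6):
    """sqrt(sum((df/dx_i * u_i)**2)) with central-difference derivatives."""
    grads = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grads.append((f(*xp) - f(*xm)) / (2 * h))
    return math.sqrt(sum((g * ui) ** 2 for g, ui in zip(grads, u)))

# Hypothetical example: resistance R = V / I from two measured components.
R = lambda V, I: V / I
print(combined_uncertainty(R, [12.0, 2.0], [0.1, 0.02]))  # ≈ 0.078 (ohms)
```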

The resulting variance. The simplest form is the standard deviation of a repeated observation. In metrology, physics, and engineering, the uncertainty or margin of error of a measurement, when explicitly stated, is given by a range of values likely to enclose the true value. This may be denoted by error bars on a graph, or by the following notations: In the last notation, parentheses are

The science seem more uncertain than it really is) or downplay uncertainty (making the science seem more certain than it really is). One way that journalists inflate uncertainty is by describing new research that contradicts past research without providing context for the change. Journalists may give scientists with minority views equal weight with scientists holding majority views, without adequately describing or explaining

The scope of expected utility theory to situations where subjective probabilities can be used. At the time, von Neumann and Morgenstern's theory of expected utility proved that expected utility maximization followed from basic postulates about rational behavior. The work of Maurice Allais and Daniel Ellsberg showed that human behavior has systematic and sometimes important departures from expected-utility maximization (the Allais paradox and Ellsberg paradox). The prospect theory of Daniel Kahneman and Amos Tversky renewed

The sequence of random variables converges in distribution to a standard normal random variable. For some classes of random variables, the classic central limit theorem works rather fast, as illustrated in the Berry–Esseen theorem; for example, the distributions with finite first, second, and third moments from the exponential family. On the other hand, for some random variables of the heavy-tail and fat-tail variety, it works very slowly or may not work at all: in such cases one may use

The state of scientific consensus on the issue. In the same vein, journalists may give non-scientists the same amount of attention and importance as scientists. Journalists may downplay uncertainty by eliminating "scientists' carefully chosen tentative wording, and by losing these caveats the information is skewed and presented as more certain and conclusive than it really is". Also, stories with

The subatomic level, uncertainty may be a fundamental and unavoidable property of the universe. In quantum mechanics, the Heisenberg uncertainty principle puts limits on how much an observer can ever know about the position and velocity of a particle. This may not just be ignorance of potentially obtainable facts, but a case where there is no fact to be found. There is some controversy in physics as to whether such uncertainty

The subset {1,3,5} is an element of the power set of the sample space of dice rolls. These collections are called events. In this case, {1,3,5} is the event that the die falls on some odd number. If the results that actually occur fall in a given event, that event is said to have occurred. Probability is a way of assigning every "event" a value between zero and one, with the requirement that

The theory of stochastic processes. For example, to study Brownian motion, probability is defined on a space of functions. When it is convenient to work with a dominating measure, the Radon–Nikodym theorem is used to define a density as the Radon–Nikodym derivative of the probability distribution of interest with respect to this dominating measure. Discrete densities are usually defined as this derivative with respect to

The time it will turn up heads, and the other half it will turn up tails. Furthermore, the more often the coin is tossed, the more likely it should be that the ratio of the number of heads to the number of tails will approach unity. Modern probability theory provides a formal version of this intuitive idea, known as the law of large numbers. This law is remarkable because it is not assumed in

The tools of expected utility and probability to model how individuals would behave rationally under uncertainty. It differs from the cognitive and behavioral sciences in that it is mainly prescriptive and concerned with identifying optimal decisions for a rational agent, rather than describing how people actually make decisions. Despite this, the field is important to the study of real human behavior by social scientists, as it lays

The ubiquitous occurrence of the normal distribution in nature, and this theorem, according to David Williams, "is one of the great results of mathematics." The theorem states that the average of many independent and identically distributed random variables with finite variance tends towards a normal distribution irrespective of the distribution followed by the original random variables. Formally, let $X_1, X_2, \dots$ be independent random variables with mean $\mu$ and variance $\sigma^2 > 0$. Then
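A minimal simulation sketch of the CLT (uniform(0, 1) draws, so $\mu = 1/2$ and $\sigma^2 = 1/12$): standardized sample means behave like a standard normal, e.g. about 68.3% of them land within ±1.

```python
import random
import statistics

rng = random.Random(1)
n = 30                              # draws per sample mean
mu, sigma = 0.5, (1 / 12) ** 0.5    # uniform(0, 1) moments

z = [
    (statistics.mean(rng.random() for _ in range(n)) - mu) / (sigma / n ** 0.5)
    for _ in range(100_000)
]

print(sum(abs(v) <= 1 for v in z) / len(z))  # ≈ 0.683, as for a standard normal
```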

The uncertainty has been quantified. Suppose it is quantified as a 90% chance of sunshine. If there is a major, costly, outdoor event planned for tomorrow then there is a risk, since there is a 10% chance of rain, and rain would be undesirable. Furthermore, if this is a business event and $100,000 would be lost if it rains, then the risk has been quantified (a 10% chance of losing $100,000). These situations can be made even more realistic by quantifying light rain vs. heavy rain,

The σ-algebra $\mathcal{F}$ is defined as $P(E) = \int_{\omega \in E} \mu_F(d\omega)$, where the integration is with respect to the measure $\mu_F$ induced by $F$. Along with providing better understanding and unification of discrete and continuous probabilities, the measure-theoretic treatment also allows us to work on probabilities outside $\mathbb{R}^n$, as in

Was the probabilistic nature of physical phenomena at atomic scales, described in quantum mechanics. The modern mathematical theory of probability has its roots in attempts to analyze games of chance by Gerolamo Cardano in the sixteenth century, and by Pierre de Fermat and Blaise Pascal in the seventeenth century (for example the "problem of points"). Christiaan Huygens published
