{"id":609,"date":"2021-05-07T18:51:24","date_gmt":"2021-05-07T18:51:24","guid":{"rendered":"https:\/\/www.allendowney.com\/blog\/?p=609"},"modified":"2025-09-27T15:00:38","modified_gmt":"2025-09-27T15:00:38","slug":"founded-upon-an-error","status":"publish","type":"post","link":"https:\/\/www.allendowney.com\/blog\/2021\/05\/07\/founded-upon-an-error\/","title":{"rendered":"Founded Upon an Error"},"content":{"rendered":"\n<p><a href=\"https:\/\/www.reddit.com\/r\/statistics\/comments\/mmldkn\/d_why_was_bayes_theory_not_acceptedpopular\/\">A recent post on Reddit<\/a> asks, &#8220;Why was Bayes&#8217; Theory not accepted\/popular historically until the late 20th century?&#8221; <\/p>\n\n\n\n<p>Great question! As always, there are many answers to a question like this, and the good people of Reddit provide several. But the first and most popular answer is, in my humble opinion, wrong.<\/p>\n\n\n\n<p>The story goes something like this: &#8220;Bayesian methods are computationally expensive, so even though they were known in the early days of modern statistics, they were not practical until the availability of computational power and the recent development of efficient sampling algorithms.&#8221;<\/p>\n\n\n\n<p>This theory is appealing because, if we look at problems where Bayesian methods are currently used, many of them are large and complex, and would indeed have been impractical to solve just a few years ago.<\/p>\n\n\n\n<p>I think it is also appealing because it rationalizes the history of statistics. Ignoring Bayesian methods for almost 100 years wasn&#8217;t a mistake, we can tell ourselves; we were just waiting for the computers to catch up.<\/p>\n\n\n\n<p>Well, I&#8217;m sorry, but that&#8217;s bunk. 
In fact, we could have been doing Bayesian statistics all along, using <strong>conjugate priors<\/strong> and <strong>grid algorithms<\/strong>.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Conjugate Priors<\/h3>\n\n\n\n<p>A large fraction of common, practical problems in statistics can be solved using conjugate priors, and the solutions require almost no computation. For example:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Problems that involve estimating proportions can be solved using a beta prior and binomial likelihood function. In that case, a Bayesian update requires exactly two addition operations. <\/li>\n\n\n\n<li>In the multivariate case, with a Dirichlet prior and a multinomial likelihood function, the update consists of adding two vectors.<\/li>\n\n\n\n<li>Problems that involve estimating rates can be solved with a gamma prior and an exponential or Poisson likelihood function &#8212; and the update requires two additions.<\/li>\n\n\n\n<li>For problems that involve estimating the parameters of a normal distribution, things are a little more challenging: you have to compute the mean and standard deviation of the data, and then perform about a dozen arithmetic operations.<\/li>\n<\/ul>\n\n\n\n<p>For details, see <a href=\"https:\/\/allendowney.github.io\/ThinkBayes2\/chap18.html\">Chapter 18 of <em>Think Bayes<\/em><\/a>. And for even more examples, see <a href=\"https:\/\/en.wikipedia.org\/wiki\/Conjugate_prior#Table_of_conjugate_distributions\">this list of conjugate priors<\/a>. All of these could have been done with paper and pencil, or chalk and rock, at any point in the 20th century. <\/p>\n\n\n\n<p>And these methods would be sufficient to solve many common problems in statistics, including everything covered in an introductory statistics class, and a lot more. 
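As a concrete sketch of how cheap these updates are, here is a minimal Python version of the beta-binomial and gamma-Poisson cases (the function names and the numbers are illustrative choices of mine, not from the post):

```python
def update_beta(alpha, beta, k, n):
    """Beta(alpha, beta) prior, binomial likelihood: k successes in n trials.
    The posterior is Beta(alpha + k, beta + n - k): exactly two additions."""
    return alpha + k, beta + (n - k)


def update_gamma(alpha, beta, k, t):
    """Gamma(shape=alpha, rate=beta) prior, Poisson likelihood: k events
    observed over exposure t. The posterior is Gamma(alpha + k, beta + t),
    again two additions."""
    return alpha + k, beta + t


# Start with a uniform Beta(1, 1) prior and observe 140 successes in 250 trials.
a, b = update_beta(1, 1, 140, 250)
posterior_mean = a / (a + b)   # mean of a beta distribution is alpha / (alpha + beta)
```

The same pattern covers the Dirichlet-multinomial case, where the update adds the vector of category counts to the vector of prior parameters.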
In the time it takes for students to understand p-values and confidence intervals, you could teach them Bayesian methods that are more interesting, comprehensible, and useful.<\/p>\n\n\n\n<p>In terms of computational efficiency, updates with conjugate priors border on miraculous. But they are limited to problems where the prior and likelihood can be well modeled by simple analytic functions. For other problems, we need other methods.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Grid Algorithms<\/h3>\n\n\n\n<p>The idea behind grid algorithms is to enumerate all possible values for the parameters we want to estimate and, for each set of parameters:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Compute the prior probability,<\/li>\n\n\n\n<li>Compute the likelihood of the data,<\/li>\n\n\n\n<li>Multiply the priors and the likelihoods,<\/li>\n\n\n\n<li>Add up the products to get the total probability of the data, and<\/li>\n\n\n\n<li>Divide through to normalize the posterior distribution. <\/li>\n<\/ol>\n\n\n\n<p>If the parameters are continuous, we approximate the results by evaluating the prior and likelihood at a discrete set of values, often evenly spaced to form a <em>d<\/em>-dimensional grid, where <em>d<\/em> is the number of parameters.<\/p>\n\n\n\n<p>If there are <em>n<\/em> possible values and <em>m<\/em> elements in the dataset, the total amount of computation we need is proportional to the product <em>n m<\/em>, which is practical for most problems. And in many cases we can do even better by summarizing the data; then the computation we need is proportional to <em>n + m<\/em>.<\/p>\n\n\n\n<p>For problems with 1-2 parameters &#8212; which includes many useful, real-world problems &#8212; grid algorithms are efficient enough to run on my 1982 vintage Commodore 64.<\/p>\n\n\n\n<p>For problems with 3-4 parameters, we need a little more power. 
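The five steps above fit in a few lines of Python. Here is a minimal sketch for estimating a proportion with a uniform prior and a binomial likelihood (the grid size and the data are illustrative choices of mine):

```python
from math import comb

k, m = 140, 250                               # data: k successes in m trials
grid = [i / 100 for i in range(101)]          # candidate values of the proportion
prior = [1 / len(grid)] * len(grid)           # 1. compute the prior probability
like = [comb(m, k) * p**k * (1 - p)**(m - k)  # 2. compute the likelihood of the data
        for p in grid]
product = [pr * li for pr, li in zip(prior, like)]  # 3. multiply priors and likelihoods
total = sum(product)                          # 4. add up the products: P(data)
posterior = [x / total for x in product]      # 5. divide through to normalize
```

Note that the binomial likelihood summarizes the dataset as (k, m), so each grid point costs a constant amount of work; that is how the computation becomes proportional to n + m instead of n m.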
For example, in <a href=\"https:\/\/allendowney.github.io\/ThinkBayes2\/chap15.html#three-parameter-model\">Chapter 15 of <em>Think Bayes<\/em><\/a> I solve a problem with 3 parameters, which takes a few seconds on my laptop, and in <a href=\"https:\/\/allendowney.github.io\/ThinkBayes2\/chap17.html#the-update\">Chapter 17<\/a> I solve a problem that takes about a minute.<\/p>\n\n\n\n<p>With some optimization, you might be able to estimate 5-6 parameters using a coarse grid, but at that point you are probably better off with <a href=\"https:\/\/allendowney.github.io\/ThinkBayes2\/chap19.html\">Markov chain Monte Carlo<\/a> (MCMC) or <a href=\"https:\/\/allendowney.github.io\/ThinkBayes2\/chap20.html\">Approximate Bayesian Computation<\/a> (ABC). <\/p>\n\n\n\n<p>For more than six parameters, grid algorithms are not practical at all. But you can solve a lot of real-world problems with fewer than six parameters, using only the computational power that&#8217;s been available since 1970.<\/p>\n\n\n\n<p>So why didn&#8217;t we?<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Awful People, Bankrupt Ideas<\/h3>\n\n\n\n<p>In 1925, <a href=\"http:\/\/haghish.com\/resources\/materials\/Statistical_Methods_for_Research_Workers.pdf\">R.A. Fisher wrote<\/a>, &#8220;&#8230; it will be sufficient &#8230; to reaffirm my personal conviction &#8230; that the theory of inverse probability is founded upon an error, and must be wholly rejected.&#8221; By &#8220;inverse probability&#8221;, he meant what is now called Bayesian statistics, and this is probably the nicest thing he ever wrote about it.<\/p>\n\n\n\n<p>Unfortunately for Bayesianism, Fisher&#8217;s &#8220;personal conviction&#8221; carried more weight than most. 
Fisher was &#8220;the single most important figure in 20th century statistics&#8221;, at least <a href=\"https:\/\/projecteuclid.org\/journals\/statistical-science\/volume-13\/issue-2\/R-A-Fisher-in-the-21st-century-Invited-paper-presented\/10.1214\/ss\/1028905930.full\">according to this article<\/a>. He was also, according to contemporaneous accounts, a colossal jerk who sat on 20th century statistics like a 400-pound gorilla, <a href=\"https:\/\/www.ucl.ac.uk\/biosciences\/gee\/ucl-centre-computational-biology\/ronald-aylmer-fisher-1890-1962\">a raving eugenicist, even after World War II<\/a>, and a <a href=\"https:\/\/pubmed.ncbi.nlm.nih.gov\/2000852\/\">paid denier that smoking causes lung cancer<\/a>.<\/p>\n\n\n\n<p>For details of the story, I recommend <em><a href=\"https:\/\/yalebooks.yale.edu\/book\/9780300188226\/theory-would-not-die\">The Theory That Would Not Die<\/a><\/em>, where Sharon Bertsch McGrayne writes: &#8220;If Bayes\u2019 story were a TV melodrama, it would need a clear-cut villain, and Fisher would probably be the audience\u2019s choice by acclamation.&#8221;<\/p>\n\n\n\n<p>Among other failings, Fisher feuded endlessly with Karl Pearson, Egon Pearson, and Jerzy Neyman, to the detriment of statistics, science, and the world. But he and Neyman agreed about one thing: they were both rabid and influential anti-Bayesians.<\/p>\n\n\n\n<p>The focus of their animosity was the apparent subjectivity of Bayesian statistics, particularly in the choice of prior distributions. But this concern is, in my personal conviction, founded upon an error: the belief that frequentist methods are less subjective than Bayesian methods.<\/p>\n\n\n\n<p>All statistical methods are based on modeling decisions, and modeling decisions are subjective. With Bayesian methods, the modeling decisions are represented more explicitly, but that&#8217;s a feature, not a bug. 
As <a href=\"https:\/\/www.google.com\/books\/edition\/Good_Thinking\/SvCAPwAACAAJ?hl=en&amp;gbpv=1&amp;bsq=basks\">I.J. Good said<\/a>, &#8220;The subjectivist [Bayesian] states his judgements, whereas the objectivist [frequentist] sweeps them under the carpet by calling assumptions knowledge, and he basks in the glorious objectivity of science.&#8221;<\/p>\n\n\n\n<p>In summary, it would be nice to think it was reasonable to neglect Bayesian statistics for most of the 20th century because we didn&#8217;t have the computational power to make them practical. But that&#8217;s a rationalization. A much more substantial part of the reason is the open opposition of awful people with bankrupt ideas.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What does GPT think?<\/h2>\n\n\n\n<p>Update September 2025: I wrote this article before LLMs were in wide use. So I was curious to see what ChatGPT thinks of it.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Where it\u2019s strongest<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Conjugate priors and grid algorithms really were feasible<\/strong><br>The point that many standard Bayesian updates require just a few arithmetic operations is absolutely right. Beta\u2013binomial, gamma\u2013Poisson, normal\u2013normal, etc., were all within easy reach of anyone doing applied stats in the early 20th century. And grid methods for 1\u20132 parameters could have been done on slide rules or early calculators. So it\u2019s historically inaccurate to say \u201cBayes was impossible without modern computers.\u201d<\/li>\n\n\n\n<li><strong>The sociology of science matters<\/strong><br>The post rightly highlights how much Fisher\u2019s stature (and temperament) shaped the discipline. Statistics developed as a community, with gatekeepers, priorities, and academic politics. 
That context is often missing when people assume computational constraints were the decisive factor.<\/li>\n\n\n\n<li><strong>The critique of \u201cobjectivity\u201d<\/strong><br>The argument that frequentist methods were no less subjective than Bayesian ones is well put. Making assumptions explicit via priors is not inherently worse than burying assumptions in model form or test design.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\">Where it could be more nuanced<\/h3>\n\n\n\n<p><strong>Teaching and pedagogy<\/strong><br>Saying you could have taught Bayes instead of p-values in intro stats is true in principle, but history shows pedagogy is conservative. Frequentist methods were entrenched in textbooks, curricula, and exam systems. Switching to Bayesianism wasn\u2019t just a matter of arithmetic feasibility.<\/p>\n\n\n\n<p><strong>Complex problems were genuinely out of reach<\/strong><br>While it\u2019s true conjugate priors handled many bread-and-butter problems, once you move beyond simple distributions or low-dimensional parameter spaces, the practical work does explode. MCMC was a breakthrough precisely because real-world scientific models quickly exceeded the 2\u20133 parameter realm. So the \u201ccomputational barrier\u201d explanation is not wholly bunk\u2014it just doesn\u2019t account for the whole story.<\/p>\n\n\n\n<p><strong>Fisher wasn\u2019t the only factor<\/strong><br>Fisher looms large, but the frequentist program (Neyman\u2013Pearson hypothesis testing, confidence intervals, large-sample theory) also had its own intrinsic appeal: it offered procedures with guarantees that looked \u201cobjective\u201d and fit well with the needs of experimental science at the time. 
That institutional fit helped frequentism dominate, even apart from Fisher\u2019s personality.<\/p>\n","protected":false},"author":1,"categories":[1],"tags":[9,83]}