A Career BS Detector

BS Detector:

Noun. The ability to recognize when someone is exaggerating, lying, talking nonsense, etc.: If you try faking it on many questions, be warned—the average human has a very good bullshit detector.


Finding Poop In The Cereal
This is as good a place as any to sketch my career as a BS detector.

At about 5, I conducted an experiment proving Santa Claus to be a hoax.

In Grade 12, I was serious enough about calling BS on religion to engage in a formal debate with "Youth For Christ" and to write my English "term paper" on the subject.

In 1969, I was granted an M.Sc. in Math, Computing, and Statistics. From then on, I discovered that most of the big piles of BS would be statistical. Since I was being paid for this, I became a professional BS Detector. Although the degree doesn't mention it, my best subject was actually Philosophy, namely "logic," which would become increasingly important. I also demonstrated an ability to do a "deep dive" into a subject and reach truly independent conclusions based on research. This ability would become a theme of my subsequent 40-year career.

I was fired from my first job for calling BS on a line drawn through half a dozen random points. It was my first encounter with "reading tea leaves" in a statistical distribution. Before being fired, I created three computer models: crystalized function, railway supply chain, and the business itself.
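The statistical sin here is easy to reproduce. The sketch below is purely synthetic (random noise, not the original data) and simply shows that a straight line fitted through half a dozen random points can look like a trend by eye, while the fit statistics say nothing of the kind.

```python
# A minimal sketch, assuming nothing about the original data:
# fit a straight line through six purely random points.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = np.arange(6)            # six observations, as in the anecdote
y = rng.normal(size=6)      # pure noise -- there is no underlying trend

fit = stats.linregress(x, y)
print(f"slope = {fit.slope:.3f}, r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.3f}")
# A fitted line through this few points often looks convincing on a chart;
# the p-value is what tells you the "trend" may be indistinguishable from chance.
```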

As acting head of Airport Statistics in the Department of Transport, I showed that aircraft activity reports prepared by the Department of Statistics were misleading and useless for decision-making. This was my first time using a computer model to extract a real probability distribution from real data. Honestly, despite my supposed expertise in statistics, this was my first deep dive into the subject.

I had a years-long encounter with evangelical Christianity, reversing my high-school opinion. This is discussed in detail here.

As a consultant to the Parks Department, I showed that widely-used statistical methods to assess environmental impact were fundamentally invalid. My report was welcomed by the biologists. This was my introduction to the field of ecology. Statistical methods were not, in fact, used in the project we were working on, which twinned the Trans-Canada Highway through Banff National Park.

I assisted one of Canada's leading forensic psychiatrists in analyzing data collected on individuals in prison who were diagnosed with some form of mental illness. The problem was "tea leaf reading" after the fact. In other words, no conclusions could be reached just by sifting through the data and looking for patterns. I would see this mistake again and again. To mean anything, data needs to be collected for a purpose. Data collected for one purpose may be misleading for a different purpose. This issue pops up over and over in the climate debate.
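Why sifting data after the fact fails is easy to demonstrate. The sketch below is a synthetic illustration (random data, hypothetical subject and variable counts, nothing from the actual prison study): with enough unrelated variables, chance alone hands you "significant" patterns if you go looking for them.

```python
# A minimal sketch of after-the-fact "tea leaf reading":
# search random, unrelated data for correlations and count the "findings".
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_variables = 50, 40            # hypothetical sizes
data = rng.normal(size=(n_subjects, n_variables))   # no real relationships at all

spurious = 0
for i in range(n_variables):
    for j in range(i + 1, n_variables):
        r, p = stats.pearsonr(data[:, i], data[:, j])
        if p < 0.05:
            spurious += 1

print(f"'significant' correlations found by sifting: {spurious}")
# Roughly 5% of the 780 pairs pass the test by chance alone -- dozens of
# "findings" that mean nothing, because no hypothesis preceded the data.
```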

In my career supporting quality control in aviation, I showed that mandatory "reliability analysis" was statistically invalid and a waste of time. In one case where there was enough data to justify curiosity, I showed that the "problem" was over-inspection, not "reliability" at all. 

I looked at another mandatory procedure called "trend analysis," which was supposed to detect when an engine was due for an overhaul. In this case, engineers took great pride in interpreting "trends" in time series when the "trend" was more easily spotted by comparing observations with each other rather than with time. No great expertise was required to spot the combination of factors that called for an overhaul. In fact, such a situation could be determined by a single observation. Time sequences were not required.
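The point can be made concrete with a small sketch. The parameter names and limits below are hypothetical, not taken from the actual procedure; they only illustrate that when the overhaul condition is a combination of factors, a single observation checked against limits settles the question without any time series.

```python
# A minimal sketch, with hypothetical parameters and limits:
# one observation is compared against limits, not against earlier observations.
def overhaul_indicated(egt_margin_c: float, oil_consumption_qt_hr: float) -> bool:
    """Return True when a single observation shows the engine is due for overhaul."""
    EGT_MARGIN_LIMIT = 10.0        # hypothetical: remaining EGT margin, degrees C
    OIL_CONSUMPTION_LIMIT = 0.5    # hypothetical: quarts per hour

    return (egt_margin_c < EGT_MARGIN_LIMIT
            and oil_consumption_qt_hr > OIL_CONSUMPTION_LIMIT)

# One reading is enough to make the call -- no trend required:
print(overhaul_indicated(egt_margin_c=7.5, oil_consumption_qt_hr=0.7))   # True
print(overhaul_indicated(egt_margin_c=25.0, oil_consumption_qt_hr=0.2))  # False
```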

While supporting a fleet of three hundred military aircraft, I determined that at least 10% of the data collected for maintenance purposes was factually incorrect. Sadly, my client was not interested in a project to correct thousands of errors. This illustrates a general disbelief that "math" has any real-world impact.

In my role "computerizing" aircraft maintenance records, it was common to find at least one substantial error in the data fed into my software. The software itself was designed to spot such errors. One example was a fancy and expensive computer model that failed to track one engine, a situation that involved the FAA. More broadly, we became experts in government requirements and manufacturer recommendations, frequently challenging existing client procedures.

I acquired quite a bit of experience in the art of taking "time" out of models, whether in the math or in graphical presentations. I had also begun to think that valid use of "official" statistics was the exception rather than the rule.

In 40 years in the field, I have never encountered anyone outside the academic world with a basic knowledge of statistical concepts or insight into how data should be presented graphically.

Over time, I have become increasingly interested in the relationship between human experience and "reality in itself." This started me down a long road that has led to a project on "Reality Distortion Fields" (RDFs), a work in progress. Looking back, RDFs, including mine, have played an outsized role in human affairs.


