Notes by Philip Dhingra
Science

At some point, dying became an asceticism, i.e., a gauntlet with medicine and nature lashing you on both sides

The modern journey to death, with its lengthy battles against incurable diseases like cancer, has an ascetic, Judeo-Christian feel to it. For most, the end of life is a drawn-out gauntlet, a tunnel where technology and nature whip us on both sides. Meanwhile, we peer forward at the faint light at the end of it, a glimmer of a chance for extending our golden years.

# science living

Esoteric Threshold

50% of Americans reject evolution. The sad thing is that that number probably won't change for a long time, if at all. Scientific advances since the mid-1800s ceased to be advances in the lay understanding of science. All advances before then eventually and permanently enlightened humanity. 99% of people believe the Earth is round and that the Earth revolves around the sun, despite a lack of direct observation. But concepts like relativity or evolution either require an advanced understanding of science or a tremendous amount of faith in it. And most people's faith in God is stronger. Of the 50% of Americans who do believe in evolution, probably only 5% remember why they adopted that belief in the first place. And only 1% can explain the actual evidence for it.

You can probably take every academic discipline and point to a date in history when it breached that esoteric threshold, after which any further advances only enlightened experts and insiders. If you took the earliest of those dates, it would mark the beginning of the end for the "Renaissance Man," the person who could know the sum of human knowledge up to that point.

What does it mean if all future advances are increasingly esoteric? What if it gets to the point where even the vast majority of academics within a discipline can't keep up? One journalist reported that professors now rubber-stamp peer reviews simply because they don't have the time, energy, or expertise to wrap their heads around the papers.

There is a video on YouTube of the construction process for a Boeing 747, and it occurs to the viewer that probably not a single engineer working on the plane has a concept of how to build the whole thing. All the instruction manuals have been lost or are indecipherable. If you annihilated that factory and asked those engineers to build a Boeing 747 from scratch, they'd have to nearly reinvent it.

Likewise, computer chips are now manufactured by automated processes that are themselves programmed on computers. If we destroyed all the computers in the world today, how long would it take us to get back to a modern processor like a 2.4 GHz Intel Core 2 Duo? Would we have to recapitulate the history of the development of computers?

Could we ever get to a level of advancement and sophistication where we no longer have any idea how everything around us came to be? What would happen to our faith then?

# science technology history religion

For content, crawlability is the new immortality

One day the birth of the Internet will seem like the birth of history. The amount of available recorded information, if plotted on a line, would look like a cliff starting in the late 1990s.

Even though the “digital revolution” supposedly happened before the Internet with the advent of computers, it wasn't until all content delivery became digital that we encountered this cliff.

If an archivist wanted to save a newspaper today, post-Internet, they would most likely crawl the newspaper's website on their own. If the website were no longer online, they would ask the paper to type a few commands and email a database dump. And if, finally, that weren't possible, they would ask the paper for access to the published HTML files that once represented the website. If the archivist had to, they could then copy and paste the contents.
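
To make "crawl" concrete, here is a minimal sketch of that first option. The newspaper URL is a hypothetical placeholder, and a real archival crawl would also need politeness delays, robots.txt handling, and error recovery:

```python
# A minimal sketch of crawling a newspaper's website for archiving.
# The site URL is a made-up placeholder, not a real newspaper.
import urllib.parse
import requests
from bs4 import BeautifulSoup

SITE = "https://example-newspaper.test/"  # hypothetical site
seen, queue = set(), [SITE]

while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    html = requests.get(url, timeout=10).text
    # Save the page under a filename derived from its URL (sketch only).
    with open(urllib.parse.quote(url, safe="") + ".html", "w") as f:
        f.write(html)
    # Follow only same-site links so the crawl stays on the newspaper.
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urllib.parse.urljoin(url, a["href"])
        if link.startswith(SITE):
            queue.append(link)
```

The point is how little cooperation this requires: anything publicly crawlable is archivable by a stranger with a dozen lines of code.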

Before the Internet, even though a newspaper's contents were most likely digital somewhere, nobody would know where the files were. Or, if their location were known, retrieving them would require assistance from a technical employee who would have been laid off by then. Or the file formats might be printer-ready and arcane, instead of web-ready and accessible.

Ultimately, the newspaper would simply dump an incomplete box of back issues on the archivist's desk, only for it to continue collecting dust as the archivist moved on to much easier projects.

# history computer_science technology science

In the world of scientific inquiry, the wonderment of a child is just as important as the skepticism of an adult

Carl Sagan or Buckminster Fuller would suggest we adopt the mindset of a child who looks at the world in wonderment. Such a mindset leads us to stare at the night sky and dream about worlds beyond. But children also have an unusual capacity for boredom. They can be ferried across the Seven Wonders of the World, only to look up occasionally and apathetically from their portable video games. But is there wisdom in bored inquiry? It's wonderment that drives the quest to zoom the microscope further and further down to figure out what particles truly constitute reality. But it's disillusionment or dispassion that reasons that maybe there is nothing magical about existence at all. Wonderment is what drives people to look at the complexity of an organ like the eye and conjure an intelligent designer behind it. But it's the cool skeptic who sees wonderment as yet another human bias to be triumphed over and then wonders, "What if it's all just random mutation and natural selection?"

# science

Law of Hierarchical Returns

Technological progress is more often hierarchical than incremental. The older the technology, the more foundational it is. For example, DNA laid the groundwork for multicellular organisms, which laid the groundwork for sexual reproduction, which paved the way for the Cambrian Explosion. There has been tremendous biological innovation since then, but with only minor changes to the fundamental technology of DNA.

The invention of the Internet nests within the invention of computers, which nests within the harnessing of electricity. Facebook, Wikipedia, and Google nest within the invention of the World Wide Web, which nests within the invention of the Internet. It's more likely that future technologies will derive from previous technologies than that they will be whole new classes of technology.

The accelerating march towards the Singularity may, therefore, manifest itself less like a rocket taking off, and more like matryoshka dolls, with smaller and smaller changes having a greater and greater impact.

# futurism evolution technology science singularity

Scientism

If there is a charitable kernel to the philosophy of Flat Earthers, it's their opposition to Scientism. Scientism is the belief that science is our best tool for revealing truth. Anti-scientists believe in other sources of truth, typically firsthand experience or intuition. For example, a round Earth is not intuitive.

One could be forgiven for making it to adulthood and coming out against science. The biggest knock on science comes from healthcare, which is most people's primary contact with it. There, science gives us conflicting information. Every year, for example, a new food gets added to the "do eat" list, while another gets taken off. Every decade gets a new diet. Perhaps those fads are misapplications of science, but how is the layperson supposed to know what is a good application of science?

The information oncologists provide is conflicting, confusing, and uninformative. One study showed that most people's experience of cancer is not just a physical nightmare but a mental nightmare of spaghetti information. Meanwhile, "survivors" come out talking about how they underwent miracle cures. Who are we supposed to believe?

Doctors are trained in bedside-manner techniques that involve avoiding phrases like "I think" or "I believe." Instead, they are to state everything as if it were fact. Furthermore, it's better not to send mixed messages: don't retract something unless you're really sure you made an error. Any sign of waffling reduces authority.

Scientism isn't about the scientific method. Instead, it's a practice of trusting scientific authority. Scientism in healthcare may be a net positive today, but even blind faith in authority is something that same authority would strongly recommend against. You should always trust your body.

The domains outside of healthcare should be on stronger footing, though. Physics, for example, is ironclad. However, the media portrays physics as a field upended by revolutions, such as relativity; instead of appearing as an iterative discovery of the true nature of the universe, the field comes across as a war of ideas. Again, you could forgive the anti-scientists for watching these "scientific revolutions" unfold and coming out skeptical of any stability in science.

# philosophy science epistemology

Statistical Insignificance

The words "statistically significant" don't belong together, because there's nothing objective about significance. Something has statistical significance when it deviates from a normal distribution curve in a way that is rare. For most people "statistical" just translates into "scientific," leading the whole expression to mean, "scientists think it's important." If a sample of 30 men and 30 women show that men have 120 points and women have 125 points, the magazine article could say that the difference is statistically significant. Technically this means that there is likely a population difference between men and women, but to the average reader, it says Men Are from Mars, Women Are from Venus, leading to a new social order for a new generation.

# statistics education science

The definition of the g factor for intelligence is a no-brainer: Of course, a group of cognitive factors predicts career performance

In order to address perceived deficiencies in IQ tests, scientists unified the results from multiple intelligence tests and narrowed down intelligence to a single factor called g. g predicts success in a multitude of intelligence tests as well as in career performance. This points to the possibility of a single attribute or gene—a gene qua non perhaps—which coincides with lay intuition: some people are smarter than others.
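
As an illustration of how a single factor can fall out of a battery of tests, here is a toy sketch: it fabricates scores on five hypothetical tests that share one latent factor, then recovers that factor as the first principal component. Everything in it is invented for illustration; the psychometricians' actual procedure is factor analysis on real test batteries.

```python
# Toy model: five test scores that share one latent factor.
# All numbers are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_tests = 1000, 5

latent = rng.normal(size=n_people)               # the shared "g-like" factor
noise = rng.normal(size=(n_people, n_tests))     # test-specific variation
scores = 0.7 * latent[:, None] + 0.7 * noise     # each test = part g, part noise

corr = np.corrcoef(scores, rowvar=False)         # 5x5 correlation matrix
eigvals = np.linalg.eigvalsh(corr)               # ascending eigenvalues
share = eigvals[-1] / eigvals.sum()              # variance share of 1st component
print(f"First component explains {share:.0%} of the variance")
# One dominant component falls out because we built the tests that way:
# the statistics summarize the battery we chose to administer.
```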

But this bundle of intelligence tests, combined with the career-performance correlation, is biased in favor of a social notion of success. g summarizes a group of measurements of what we believe to be good in society. Doing well in a career is better than not. Identifying patterns in diagrams faster than others is better than not. But these are all tests valued by humans, not measurements by an independent cognitive judge.

g doesn't test feats of strength, nor does it test all mental capacities, such as artistic ability. Even though g does correlate with performance among creative professionals, if all careers were artistic in nature, then g would be unimportant. Or we would find another factor a that also grouped multiple tests of intelligence, but in that scenario, intelligence would be synonymous with artistic ability, and career performance would mean being good at art. g simply tells us that there is some social consensus as to what it means to succeed or not at tasks that rely on our minds.

# psychology science

The Limits of Collective Research and Reasoning

According to the Internet's best guess, everything causes cancer, and everything doesn't cause cancer. If you search, "Does such-and-such cause cancer?" you can find plenty of articles from supposedly reliable sources, such as WebMD, confirming the carcinogenic properties of almost anything. The same confusion holds for the side effects of drugs: everything causes headaches, nausea, etc. And if you followed the consensus advice on what to eat and what not to eat, you would be left with only five possible foods in your diet.

There is a medical truth out there, but the current quality of research and the collective reasoning skills of the Internet are insufficient to answer a broad swath of common questions.

# technology science medicine
9 entries