I'm experimenting with something other than the blog format to collect my ideas. So if you use this channel to keep tabs on all things Philosophistry, I recommend instead subscribing to my Twitter @philosophistry or visiting philosophistry.com for links to my latest thoughts. I will still use this blog to post project announcements and one-off web specials.

For example, one of my notes about quantum physics got some love on reddit, and another one of my notes about skill acquisition was inspired by my extended stay in Tahoe.


posted by phil on Saturday Mar 9, 2013 9:09 AM

When it comes to dress, men have either less freedom or more freedom than women do. True, anybody can dress however they want, adorning themselves with all sorts of shapes and colors that fit their imagination, but to dress stylishly, one has to consider the existing, acceptable stylish genres for one's gender.

For men in a cosmopolitan city, there are anywhere from 2-4 genres of attire that, when followed fully, lead to something that could be considered a stylish ensemble. For example, in Austin, TX, there are three genres to choose from. First, there is the hipster, adorned with a willfully eclectic mix of styles, retro or ironic sunglasses, and unusual but of-the-moment colors (at one point it was purple, and recently it was maroon). Second, there is the uniform that could be called "white liberal," worn by those who shop at Whole Foods, who like the hipness of hipsters but not how loud they are, and who, at the moment, tend to wear shoes sold by Toms, the company that donates a pair of shoes for every one you buy. Third, there is the stylish dress of the more traditionally employed, such as those working in finance, who tend to pick a blend of styles from the hipsters and white liberals, but from 5-10 years ago, while throwing in flairs of attitude and class, such as maintaining a popped collar or still wearing Lacoste shirts.

If a man doesn't dress in a genre associated with his demographic, he can only create a facsimile of style. He can match his colors, wear forms that fit well, and pay for a coherent haircut, but if he doesn't fit within those three genres, something will always seem off or incomplete about his ensemble.

Women also have genres to choose from. Two of the genres mentioned above for men also exist for women: the hipster and the white liberal. However, the number of genres for women is not on the order of 2-4, but on the order of 12-16. And since women spend more effort outdoing each other in fashion, appearing unique carries more currency for them. For a woman to appear stylish, she really has two options: either choose from a pre-made genre for her demographic (whether it's platinum-blonde inspired by The O.C. or a bangs-bedecked cutie-pie à la Zooey Deschanel), or create her own genre and ensemble, as long as it adheres to general principles of aesthetics.


posted by phil on Sunday Feb 10, 2013 8:24 PM

One of the worst historical misconceptions among mainstream Americans has to do with Vietnam. Vietnam appears in the public imagination as a singular blemish, the one time America wasn't living up to its true nature. The truth is that most wars before Vietnam were "Vietnams": wars led by an aggressive elite against significant public opposition. Anti-war critics in those wars were jailed and/or intimidated with McCarthyism-like tactics. As a result, these wars were conducted under false pretenses, with the public being force-fed glossy narratives.

"Nearly all wars" includes the Civil War, the Revolutionary War, and even World War I. There appears to be only one war beyond reproach, and that's World War II. But such a singularly slam-dunk-of-a-war situation is rare, and it probably only seems beyond reproach because of how much of a boon it was for the United States economy and its supremacy.


posted by phil on Wednesday Feb 6, 2013 10:46 PM

2013 is the first year of the 2000s that feels like a departure from the turn-of-the-millennium mindset. It even feels like the first real year of the decade.

Part of this has to do with the fact that nobody could ever settle on what to call the first decade of the 2000s (are they the "aughts"?). Another reason is 9/11, whose echoes the late aughts could never escape, thanks to the wars in Afghanistan and Iraq. And just as Obama got elected, which was supposed to be a palate cleanser, the Great Recession happened, anchoring the next four years to the aughts as well.

But 2013, unlike 2012, feels like a breakaway from the early 2000s. It's the first real year of the new tens. Any talk of apocalypse seems ridiculous now. When 2012 came and went without a Mayan apocalypse, it drove the final nail into the coffin of millennial Armageddon scenarios. Even predictions of an impending Singularity, which reached a zenith of attention in 1999, no longer seem "ten years away," but rather like something that may happen in 2065 or 2089. (Or will it happen at all?)


posted by phil on Monday Jan 28, 2013 9:14 PM

When you re-encounter someone you've spent a long time away from, their face always seems more compressed than normal. In our imaginations, the facial memory of our loved ones becomes elongated and exploded: features disappear, and the salient ones remain, like ornaments on a Christmas tree. Their real face is tighter and more whole. The first impression after a long time apart, as you see them through the windshield of their car when they pick you up from the airport, is an eerie one. Perhaps Picasso was onto something with his depictions of jigsaw faces.


posted by phil on Sunday Jan 27, 2013 6:36 PM


[Photo: Joseph Campbell]

I noticed two kinds of people at Stanford: those who got there naturally, and those who got there artificially. To get there artificially means to have set getting into a good school as a goal, strived for it, done the meta-learning and psyching up, and very deliberately architected your high school years toward getting there.

To get there naturally means to have worked hard in high school, yes, but without much personal strain. You did community service because you liked it, not because you wanted a line on your college apps. You worked hard for As but without sacrificing having fun and "being a kid." If anything, getting As was part of a fun social activity for your cohort in the AP and gifted classes.

Likewise, there are the successful who are the apparent result of their ambition. Think Hillary Clinton. Yes, they have some talents that are naturally suited to the positions they have attained, but the much greater source for their high station and accolades is their diligence and determination.

And then, there are the successful who are the apparent result of their natural talents. Think Joseph Campbell, who could have written more books and garnered millions in speaking fees, but was content to stay at Sarah Lawrence College for what seems like an eternity. His successes are a more authentic expression of his being, and more likely to have been garnered with joy.


posted by phil on Sunday Jan 27, 2013 6:31 PM

History is ultimately about the cannibalization of the Top 10 wealthiest by the next 90 wealthiest. The rest is just footnotes. This is the essential pattern that emerges from Howard Zinn's A People's History of the United States. The American Revolution was just a war between the New Rich (George Washington, Thomas Jefferson, et al.) and a few British landowners. The Civil War was just a war between the New Northern Rich and the few slaveholding landowners who owned most of the South. Continuing the pattern, the Great Recession of 2008 was just a war between Goldman Sachs and Lehman Brothers (with Goldman Sachs winning, gloriously).

Cash is inherently bubble-producing. "Nothing makes money like money," as the saying goes, and so as the rich become the ultra-rich, they eventually create a tumor. The situation is unsustainable, and when the bubble pops, the second layer of wealthy individuals is ready to reclaim the seized territory or the government handouts.

The rest of history is simply about the minimal goods that the rich can give the poor to keep the system intact.


posted by phil on Sunday Jan 27, 2013 12:56 AM

The idea behind user-generated reviews is that with a large enough sample size, you will eventually reach the truth about the quality of a product or establishment. What isn't often noticed, however, is the strange way that accuracy varies with sample size.

For example, a friend of mine in San Francisco complains that all the Yelp reviews for restaurants in a certain neighborhood are always 4 stars. Not 3.5 and not 4.5, but exactly 4. The reason is that every establishment there has hundreds of reviews, and at that sample size, you're actually just getting an over-representation of the legion of vocal supporters of their favorite local establishments.

But four hours away, in South Lake Tahoe, there is the opposite problem. Since most establishments have only a few reviews, the ratings are dominated by random stinkers from patrons who were burned by their experience.

The ideal sample size for reviews, then, is actually somewhere in the middle. If you have too many reviews, a biased sub-group will be over-represented, and if you have too few, there will be too much noise that doesn't get smoothed out.
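
To make the tradeoff concrete, here is a toy simulation. The specific numbers (a "true" quality of 3.5 stars, a 25% fan base that always leaves 5 stars, 1.5 stars of noise in ordinary ratings) are invented for illustration; the shape of the result is the point.

    import random

    # Toy model of review-score bias vs. noise. All numbers are
    # illustrative assumptions, not measurements.

    TRUE_QUALITY = 3.5   # the rating an unbiased patron centers on
    FAN_FRACTION = 0.25  # assumed share of devoted 5-star locals

    def one_review():
        if random.random() < FAN_FRACTION:
            return 5.0  # vocal supporter of a favorite establishment
        # ordinary patron: noisy rating, clamped to the 1-5 star scale
        return min(5.0, max(1.0, random.gauss(TRUE_QUALITY, 1.5)))

    def average_rating(n):
        return sum(one_review() for _ in range(n)) / n

    random.seed(0)
    for n in (5, 50, 500):
        averages = [average_rating(n) for _ in range(1000)]
        print(f"n={n:3d}: averages range {min(averages):.2f}-{max(averages):.2f}")

    # n=5:   the average swings by a couple of stars (Tahoe's problem).
    # n=500: the average settles near 3.8, stable but biased above the
    #        true 3.5 by the fan base (San Francisco's problem).

More reviews smooth out the noise but do nothing about the bias, which is why the sweet spot sits in the middle.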


posted by phil on Wednesday Jan 23, 2013 3:19 AM

So much of the excitement about futurism lies in answering age-old grand questions about life, such as "Are we alone?" However, there are really four types of alien life-forms we could encounter, most of which aren't that exciting.

At the most basic level, there are things like pulsing bacteria, oxidizing froth, and plants. Discovering life that is immobile or semi-immobile is nearly equivalent to discovering a planet with interesting chemicals on it, which happens every so often. Encountering this would simply tell us that automata are easier to evolve than we thought. It would be about as interesting as the discovery of multi-cellular organisms on Earth that don't need oxygen, a discovery that barely registered a blip in the news.

The next level of interestingness would be the discovery of living things like camels, reindeer, or fish: independently moving, non-vegetable-style animals, i.e., the kinds of things that could become pets. That would be interesting to some extent, but after the initial excitement, they would spark about as much curiosity in humans as strange marine life does. There are at least 750,000 undiscovered species in our oceans, and that number is likely to stay that way for a long time. At this point, we could say, "We're not alone," but try asking a solitary sailor far out on the ocean if they feel alone. They wouldn't get much solace from knowing there are strange creatures swimming beneath them.

The next level would be aliens that are sentient and advanced enough to have a culture. Perhaps they don't have any skills for space travel, so they're definitely less intelligent than we are. At this point, an answer to "Are we alone?" would be similar to the Europeans' discovery of the New World. During that era, the world must have seemed as large to humans as the universe seems to us today, and so to discover a whole new continent with previously unknown living people and cultures must have been mind-blowing. And yet, it's difficult to find stories about just how earth-shattering this was to the scientific community or even to ordinary people.

Finally, the fourth level would be sentient aliens that have already mastered space travel. In that case, they would discover us first, or they already have. That might provide us with scenes as wondrous as those in science-fiction movies like Contact. But this is the least likely scenario for discovering aliens. It's ten times more likely that we'd discover a new world of savages, a hundred times more likely that we'd discover a new ocean of marine life, and a thousand times more likely that we'd simply discover exotic bacteria.


posted by phil on Friday Jan 11, 2013 8:06 PM

One of the caveats about accelerating change is that we may reach physical limits on how fast we can make computers: at some point, processors will operate at the electron scale and can't be shrunk any further, or solving the overheating problem of CPUs may become intractable.

But there is another limit that could factor in: human demand. All technology is created to ultimately serve human consumption. Without human demand, there is no further development.

For example, at some point, we won't need anything beyond HD or Retina displays, because the human eye won't appreciate any further refinement. Already we're seeing disinterest in CPU speed on computers. In the 1990s, even casual computer consumers cared about how many MHz were in their machines. Now, the people who know how many GHz or cores their laptop has are a small minority. Instead, innovation is being driven by the miniaturization of CPUs. But at some point, we will have the thinnest possible phone. Already, some people complain that the iPhone 5 is too thin, and therefore slips too easily out of their hands. After thinness, what's next?
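
That first claim, about displays, can be sanity-checked with back-of-the-envelope arithmetic. Assuming the textbook figure of roughly one arcminute for the eye's resolving power and a phone held about 12 inches away (standard assumptions, not the spec of any particular device), the useful pixel density tops out around 300 ppi:

    import math

    # Pixel density beyond which the eye can't resolve individual
    # pixels. Assumes ~1 arcminute of visual acuity and a 12-inch
    # viewing distance -- textbook figures, not device measurements.

    acuity = math.radians(1 / 60)  # one arcminute, in radians
    viewing_distance = 12.0        # inches, typical for a phone

    smallest_feature = viewing_distance * math.tan(acuity)  # inches
    max_useful_ppi = 1 / smallest_feature

    print(f"smallest resolvable feature: {smallest_feature:.4f} in")
    print(f"max useful density: ~{max_useful_ppi:.0f} ppi")
    # ~286 ppi, which is why the iPhone 4's 326-ppi "Retina" display
    # already sits past the point of visible improvement.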

That hasn't stopped CPU innovation, though, because of the massive expansion of cloud computing and web servers. Consumer demand is still driving CPU innovation, but by proxy, via demands from businesses like Google and Facebook that service consumers.

At one point, it was video game consoles that pushed the envelope of processing power. But after the PlayStation 3, there isn't much more that the gamer needs. Theoretically, the PlayStation 4 or PlayStation 5 will have as much graphical processing power as is used to render a 3D-animated Pixar film, but gamers are drifting in the other direction, toward casual games on their iPhones, or are content with less graphically intense games on the Wii. So there is a step in the opposite direction: making less powerful CPUs at a cheaper cost.

There's also a limit to the number of hours of attention a human has. While a power consumer may own a laptop, a smartphone, and a tablet, their time is divided among all three. Can they add another device? Perhaps they will have backup smartphones, tablets, and eReaders, but again, that will just reduce the amount of time spent on each device. Each device's significance will be diminished by the introduction of another. Only so much entertainment can be consumed per day. Only so many words can be typed per day.

There is a pattern, though, where we sometimes think, "No more innovation will happen." For example, there is the famous (though apocryphal) quote from the Commissioner of the US Patent Office that "everything that can be invented has been invented." Or there's another famous (though also apocryphal) quote from Bill Gates, that "640K of memory should be enough for anybody." And just when we were perhaps getting to a point of boredom in 2009, Avatar came out in theaters, and 3D became the next big envelope-pusher. We thought, "Alas, the PlayStation 4 will need to be at least twice as fast to handle all the 3D games!" But since then, consumers have become lukewarm on 3D. So already, we're seeing a turning away from new technology.


posted by phil on Thursday Jan 10, 2013 10:21 PM
