Notes by Philip Dhingra
Technology

Advanced AI is just advanced us

One of the best cases for a radical technological singularity comes from potential advances in Artificial Intelligence (AI). If an AI were to get smart enough to make itself smarter, then given the world's processing power, it could create a superintelligence that is light-years ahead of where we are today. However, one argument against this upward spike is that we are already that superintelligence.

Even though humans have irrevocably changed the world, in many ways it's the same. For example, birds fly, and now we fly. Animals move in packs, and now we have freeways. Electricity is exotic, but the things we do with it, such as recreating villages that now span greater distances, are familiar. We are already computers, and we created AI in our image. The result may end up being the preservation of all our existing faculties, just greatly enhanced.

# futurism technology

A hundred years from now, despite the Singularity or Moore's Law, 25% of restaurant tables will still wobble

In Minority Report, despite Maglev cars and floating user interfaces, people still catch colds. Similarly, a hundred years from now, around 25% of restaurant tables will still wobble. Never mind the Singularity or Moore's Law reaching its zenith; this fact about tables will remain. Call this philosophy "The Banality of Futurism." Delving deeper into the wobbly-table issue makes it clear that some problems weren't meant to be solved.

The current solution to wobbly tables, besides using folded sugar packets, is to sell tables with adjustable screws. Many restaurants already have these tables, but because of how inconvenient it is to find someone to lift the table while you bend over and get your hands dirty, the solution is not utilized. This leads to Principle #1 of the Banality of Futurism: The future may already be here, but we don't use it.

This assumes restaurant owners even bought tables with adjustable screws. While wobbly tables are a collective nuisance, the owners are individuals who have to look at a catalog of restaurant tables and each come to the same conclusion: "I should pay for the premium tables so that my customers don't have wobbly tables." But because of the cost-saving motivation, combined with rationalizations such as "My floors in the new restaurant should be flat" or "We can just stick small wood chips underneath them," we end up with the situation we have today. Principle #2: The future may already be here, but the problem isn't annoying enough to solve at scale.

Continuing on to the immediate future, wobbly tables are still problematic. Let's say Apple designs "the perfect table," one that adjusts easily. Perhaps they're electrically adjustable with the push of a button. Or maybe the screws are designed such that you can easily adjust them with your toe. Again, by the same principle above, restaurant tables won't get the Apple treatment. Part of the problem is that it's a commodity item, like printers, so there's no incentive for one player to make the table and own the market. Principle #3: The future may already be here, but nobody wants to build it.

Further out, in the exotic future, if there were some cheap solution that involved fancy material science, we would have already found it. Imagine some hard, rubbery substance that expands or contracts based on continuous pressure, or the lack of it, over multiple days. The date of this material's discovery would be random and not linked to exponential increases in computing power or intelligence. Given how far we've already gone into material science, the discovery would have to come by luck or, barring that, by intense force. Furthermore, the difficulty of its discovery would likely imply a difficulty in manufacturing it as well, and so again, it won't be cheap.

There are many things like the wobbly table. For example, all of these will still be problems in 2116:

  • Pizza boxes that won't stay completely shut
  • Stray corners of paper towels left behind upon ripping
  • Old refrigerators that make weird noises
  • Stubbing our toes on the sharp edges of furniture

Perhaps the silver lining in the Banality of Futurism is that the room for growth won't be in fixing life's inconveniences, but rather in the human condition. If poverty is eliminated or if war becomes taboo, then maybe eating an apple pie on a wobbly table while blowing our noses won't be so bad.

# futurism technology singularity

AI and Dumbing Ourselves Down

Does technology make us smarter or dumber? It's an endless debate, and artificial intelligence is no exception. What is different about AI, though, is how dumbing-down is inherent to the design of that class of technology. For example, we are beginning to adapt our speech to the mediocre ears of Siri and Alexa. When we begin a phrase, "Hey Siri" or "OK Google," we usually follow it with a monotone set of instructions, tailored to maximize comprehension from the machine.

I once met a hand surgeon who had become so dependent on Dragon Dictate that half of our meeting involved him speaking into his headset. As he switched to a robotic cadence, I could see him mentally narrow his vocabulary to optimize for the cleanest transcription from his software. The act reminded me of how parents switch to baby talk to communicate effectively with newborns.

Building sophisticated AI is expensive, and the final 10% of sophistication can cost more than the first 90%. So if end users can make up the last 10% by adapting their behavior to the software, then as AI becomes ubiquitous, so will stooping down.

# futurism technology ai

Apple creates religious experiences, not by being consistently awesome, but by making us yearn for rare, sublime moments of awesomeness

Apple products aren't user-friendly. Average users frequently encounter frustrations trying to do simple things, and yet the reputation of user-friendliness persists for Apple. Without question, Apple's products are relatively more user-friendly than their competitors', but they get away with the "user-friendly" label because they add something extra to their user interfaces: the random religious experience.

Some of the time, when using an Apple product, there are magical "it just works" moments that are nearly absent in all of their competitors. The Apple experience then becomes like an actual religious experience in that it creates a separate, aspirational experience. Most churchgoers do not experience an epiphany every Sunday or when they pray. But most of them have had some significant, moving experience at some point in their lives, and it's the hoping for that experience that makes up the bulk of their day-to-day religious experience. The aspiration is enough to be a kind of religious experience in and of itself.

# design technology religion

Bitcoin is the first open-source government, with branches of power much like git trees, forking and merging, yet still producing a master

# computer_science politics technology

Bubble economics might be the path to socialism

While socialism has yet to arrive, it may come indirectly through economic bubbles such as the current venture capital market. 2015 saw the rise of the so-called "unicorns," startups with billion-dollar or greater valuations, with Uber as the poster child, having raised a cumulative $15 billion as of 2016, putting its valuation at $62.5 billion. But this so-called "bubble" might instead be a symptom of excess GDP growth. When there is more wealth than the wealthy can spend on yachts and low-yield bonds, their excess money has to find more exciting ventures, like startups. Money is like steam, and when pressure builds up in one area, it finds release somewhere else.

In a way, the startup bubble has increased tacit socialism. All the extra startup jobs have helped people with little formal education into lightweight desk work and relatively high salaries. Likewise, society benefits from free software and services pumped out by startups eager to build market share quickly, often in a vain attempt to achieve a monopoly and return value to investors. If 90% of what we consume comes from startups that will eventually fall short of that monopoly state, then in a way society gets a 90% discount on everything. For example, Netflix spends billions to win the original-programming race with HBO and Amazon Prime, and while the company may or may not go broke doing so, the consumer benefits from getting all this extra art on the cheap.

# economics technology politics

Craigslist is a superior job board because its only master is simplicity. By spending less time catering to employers and job-seekers, it ends up being cherished by both

When looking for jobs on craigslist, you receive more personal responses, as opposed to the automated or mass emails that come from other large job sites like Elance or Monster. Craigslist's competitors provide all sorts of tools to help advertisers re-post ads and to help job-hunters apply in bulk. On craigslist, there are no templates. There are no "apply to all" or "post to all" buttons. Everything is a one-shot event. One job post has one fee. And if the post or email gets buried over time by the competition, so be it. There's no gaming it, so the results are more genuine and personal, which benefits everyone.

# design technology ux

Creativity through limitation: Technically animated GIFs are just looping videos on mute, yet they're so much bigger than that

The best way to understand animated GIFs is in comparison to web videos. 80% of the time, web videos aren't set to auto-play, out of respect for people whose speakers are on, or for those who want to conserve bandwidth. Animated GIFs nearly always have no audio accompaniment and are on auto-play. They're also mostly guaranteed to be short and looping, so even if one has already started and you miss the sequence the first time, you'll have a second chance to catch what's happening. This unique structure makes them perfect as a form of pure ephemeral expression, like an advanced emoticon. This is why we find that the enduring home for animated GIFs is in forums and message boards, oftentimes serving as the perfect punchline.

# art technology

Delayed technophilia is the process by which we celebrate technological progress years after we've become inured to it

In the late 1990s, there were hundreds of millions of Internet users, and yet the Internet was dismissed at the time as a novelty. Even though in Silicon Valley it was described as a world-changing force, the popular understanding of it was as a place for time-wasters like pornography and cat photos. The Internet was conceived of as just one application out of many, something to sit alongside word processing or graphic design.

Now, the Internet is so integrated into our lives that for many people, it's the only way they stay in touch, and for many businesses, it's the only way they make money. Our livelihoods depend on it, and so, in retrospect, it seems like a monumental innovation, perhaps even greater than the invention of the computer.

Likewise, airplanes and automobiles were initially received as novelties. Everybody was familiar with cars at the time, but they were primarily owned by the rich, and they were a nuisance on the road. A drive in one, for most people, facilitated something that they used to do by other means, such as walking or riding in a horse and carriage. It wasn't until later, when nearly all transport took place via cars, that it became inconceivable to imagine the trappings of modern life without them.

New technologies initially focus only on acquiring new users; achieving integration into those same users' lives is a whole different challenge. Thus, groundbreaking technologies only get recognized as such much later on, even decades after actually breaking ground.

# futurism technology

Esoteric Threshold

50% of Americans reject evolution. The sad thing is that that number probably won't change for a long time, if at all. Scientific advances since the mid-1800s ceased to be advances in the lay understanding of science. All advances before then ultimately and permanently enlightened humanity. 99% of people believe the Earth is round and that the Earth revolves around the sun, despite a lack of direct observation. But concepts like relativity or evolution either require an advanced understanding of science or a tremendous amount of faith in it. And most people's faith in God is stronger. Of the 50% of Americans who do believe in evolution, probably only 5% remember why they adopted that belief in the first place. And only 1% can explain the actual evidence for it.

You can probably take every academic discipline and point to a date in history when it breached that esoteric threshold, after which any further advances only enlightened experts and insiders. If you took the earliest of those dates, they would mark the beginning of the end for the "Renaissance Man," the person who could know the sum of human knowledge up to that point.

What does it mean if all future advances are increasingly esoteric? What if it gets to the point where even the vast majority of academics within a discipline can't keep up? One journalist reported that professors now rubber-stamp peer reviews simply because they don't have the time, energy, or expertise to wrap their heads around the papers.

There is a video on YouTube of the construction process for a Boeing 747, and it occurs to the viewer that there probably doesn't exist a single engineer working on it who has a concept of how to build the whole thing. All the instruction manuals have been lost or are indecipherable. If you annihilated that factory and asked them to make a Boeing 747 from scratch, they'd have to nearly re-invent it.

Likewise, computer chips are now manufactured using automated processes that are themselves programmed on computers. If we destroyed all the computers in the world today, how long would it take for us to get back to a modern processor like the 2.4 GHz Intel Core 2 Duo? Would we have to recapitulate the history of the development of computers?

Could we ever get to a level of advancement and sophistication where we no longer have an idea how everything around us came to be? What would happen to our faith then?

# science technology history religion

Fearing that Google or Wikipedia makes us dumber is just as silly as fearing that abacuses or slide rules did the same

A frequent debate in American schools around the early 1990s was whether the reliance on pocket calculators would weaken math skills. The debate is similar to the debate about subsidizing dying industries. When technology replaces a job, life is hard for the unemployed, but we achieve a net benefit in the reduced cost of goods. We might say, "Well, then people won't know how to use multiplication or times tables," but we can replace that with the phrase, "Well, people won't know how to use this thing they don't need anymore." After all, we don't lament abacuses or slide rules. When we lose our facility with times tables, we more than compensate for it by doing other things mathematically.

A roundabout example is that calculators have made it easier to build computers and smartphones, which have increased technological literacy, and therefore mathematical literacy. Calculators have made it simpler to make complex video games, which require players to quickly tally health points and hit points without the aid of calculators. Calculators have made it easier to make computers, which have made screen interfaces more common, which has dramatically increased the number of characters, many of them numbers, that the average person encounters on a daily basis.

People know as much about times tables as they need to. If their situation requires them to multiply things quickly without having easy access to a calculator, then they will learn.

# technology education society

For content, crawlability is the new immortality

One day the birth of the Internet will seem like the birth of history. The amount of available recorded information, if plotted over time, would look like a cliff rising in the late 1990s.

Even though the “digital revolution” supposedly happened before the Internet with the advent of computers, it wasn't until all content delivery became digital that we encountered this cliff.

If an archivist wanted to save a newspaper today, post-Internet, they would most likely crawl the newspaper's website on their own. If the newspaper's website weren't online, they would ask the paper to type a few commands and email a database dump. And if, finally, that wasn't possible, they would ask the paper for access to the published HTML files that once represented their website. If the archivist had to, they could then copy and paste these contents.
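
To make the contrast concrete, here is a minimal sketch of what that do-it-yourself crawl might look like, assuming Python with the requests and BeautifulSoup libraries; the newspaper URL, output folder, and page limit are hypothetical placeholders, not a real archiving tool.

```python
# Minimal single-site crawl-and-save sketch (hypothetical URL and paths).
# Assumes the `requests` and `beautifulsoup4` packages are installed.
import os
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example-newspaper.com/"  # hypothetical newspaper site
OUT_DIR = "archive"

def crawl(start_url: str, out_dir: str, limit: int = 100) -> None:
    """Breadth-first crawl of one domain, saving raw HTML to disk."""
    os.makedirs(out_dir, exist_ok=True)
    domain = urlparse(start_url).netloc
    queue, seen = [start_url], set()
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        # Save the page under a filename derived from its path.
        name = urlparse(url).path.strip("/").replace("/", "_") or "index"
        with open(os.path.join(out_dir, name + ".html"), "w", encoding="utf-8") as f:
            f.write(resp.text)
        # Follow only links that stay on the same domain.
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            if urlparse(link).netloc == domain:
                queue.append(link)

if __name__ == "__main__":
    crawl(START_URL, OUT_DIR)
```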

Before the Internet, even though the newspapers' contents were most likely digital somewhere, nobody would know where the files were. Or if their location were known, they would require assistance from a technical employee who would have been laid off by then. Or the file formats might be printer-ready and arcane, instead of web-ready and accessible.

Ultimately, the newspaper would simply dump an incomplete box of back-issues on the archivist's desk, where it would continue to collect dust as the archivist moved on to much easier projects.

# history computer_science technology science

Gentrification might now be so fluid that no city will remain obscure enough to incubate any gritty charm worth gentrifying

The speed with which yuppies are colonizing once-obscure spots is unparalleled in history, thanks to the Internet. The Internet aids in discovery, as anybody can find articles like "The Top 10 Best Small Towns" or "Why Such-And-Such Neighborhood Is The New 'It' Spot." And the Internet, combined with laptops, has armed yuppies with mobile work so that they can move on a whim. However, the gentrification rate may now be so fast that no city will ever have enough runway to accumulate the gritty charm that attracted gentrification in the first place. Potentially, this marks the end of an era in urban growth.

# technology society

Hackers and whistleblowers have recovered the transparency that was undone by Citizens United

The long-foretold transparent society is nearly here, but not in the way we would have expected. Rather than voluntary transparency through an easing of inhibitions, we have an involuntary one, thanks to hackers and whistleblowers like WikiLeaks and Edward Snowden. Because of hacking and investigative journalism, campaign spending is now semi-transparent, thus muting the ill effects foretold after the Supreme Court's Citizens United ruling. Major leaks today arrive with Watergate's impact, just at ten times the frequency.

# technology society

Historians will split the Information Age into two phases: one when Moore's Law seemed unstoppable, and one when it didn't

Once you learn about Moore's Law, you start to see it everywhere. Not only is CPU speed increasing exponentially, but so are hard disk storage and network technologies. Beyond the direct participants in Moore's Law-like growth patterns, there are secondary and tertiary fields that have also been affected: weapons have become exponentially deadlier, and reaping food from the Earth has become exponentially easier. There's a compounding effect to all this. Faster CPU speed makes it easier to do research, which makes it easier to invent the Internet, which then makes it easier to do research, which then makes it easier to make bullets and farm.

But just as we see Moore's Law everywhere, we are also noticing where it's absent. Despite the proliferation of many Moore's Laws in many fields, there are just some things that won't change. For example, restaurant tables will still be uneven because the incentive to always bend over and adjust them isn't there, and the value of a stable table isn't high enough to justify the invention of affordable self-adjusting table technology. Exponentially increasing CPU speed won't mean that such self-adjusting table technology will all of a sudden become affordable, and even if it does, it may take 100+ years to get here. So don't hold your breath.

But that may just be a problem with the nature of human demand. We aren't demanding un-wobbly tables that much, and it doesn't affect the restaurant-going experience that much, so it doesn't necessarily get the fruits of technological progress as quickly as something core, like our ability to kill each other and farm food.

But even controlling for demand, we can find limits to Moore's Law on the supply side. For example, pervasive cell phone reception does not appear to be increasing exponentially. Even though cell towers are getting more powerful and cheaper to build, the cost to install them probably isn't changing much. Even as mobile phone bandwidth gets better in places where coverage already exists, it is still horrible on a cross-country drive across the United States, and it is likely to remain horrible for a couple more decades. Sure, there isn't enough demand in those rural areas to justify installing a cell tower, but the problem is also subject to labor costs, which are probably rising, as well as fixed resource costs, such as the fuel necessary to move installation equipment to those rural areas. It takes a certain number of calories to dig a post for a cell tower, and that cost just won't get automatically obliterated by exponential technological progress.

We have one foot in the rapidly changing future and one in a world that is becoming increasingly banal. Our impatience for technological advances that should be here by now is encapsulated perfectly by the question, "Where are our damn flying cars?"

# futurism technology

If an app is popular with teens, it could mean it's on trend just as much as it could mean it's niche

Writers have proclaimed the end of Facebook since the beginning of the service. The biggest threat came from Twitter, which prompted Facebook to add status updates. About half a year after the change, though, Mark Zuckerberg admitted that they had over-corrected, and Facebook's growth continued unchecked.

The latest threat comes from Snapchat, which has turned down billions of dollars because of its rising popularity with teenagers. Simultaneously with Snapchat's ascendancy, Facebook admitted its popularity with teens had declined. Does that mean Facebook is losing its cool? If so, does the whole house of cards fall with it?

But what if Snapchat is just a teen phenomenon? The supposed attraction of Snapchat is privacy, and adults probably care about privacy just as much as, if not more than, teenagers (if an adult makes a privacy flub, they could lose their job). But adults have much more control over their privacy than teenagers. If you leave your Facebook open on your screen as an adult, it's less likely that someone will snoop and bump into it. Or if you post something naughty online and your parents see it, nobody is going to ground you.

LiveJournal, 4chan, and Chatroulette are all services that were and are popular with teens but haven't achieved widespread adoption with adults. Their rapid growth was simply a reflection of how quickly ideas spread within the teen subculture, rather than a leading indicator of how they would spread to the rest of the world.

# business technology

If captive animals display stereotypies—i.e., repetitive but pointless movements—is it possible that modernity, with its incessant checking of Facebook and cell phones, is a form of captivity for humans?

Stereotypies are repetitive tasks or movements that are seemingly pointless. One way to assess animal welfare is to measure the frequency of stereotypies, such as how often a gerbil digs without making a tunnel. Animals in captivity tend to display stereotypies, the most common of which is pacing in figure eights, typical of lions held in cages.

Do modern humans exhibit stereotypies? OCD behavior, like incessant hand-washing, seems to fit the description of stereotypies. But what about more common ones, such as repeatedly checking your cell phone or Facebook? What about obsessive, repetitive thinking? If these are true stereotypies, then it's possible that modernity is a form of captivity. We are potentially so far removed from our natural habitat, that by nursing our anxieties, and by feeding the industries that depend on our behavioral patterns, we lock into loops with our consumer devices.

# technology biology

If literacy and literalism go hand-in-hand, then so must radicalism and the Internet

Text and literalism go together. Once a rule is in writing, it can always be referenced, and usually referenced in one specific way. Likewise, upon the arrival of the printing press, Christian literalism saw a resurgence in Protestantism and eventually Puritanism, which laid the early American cultural foundation. Could it be possible that the Internet, which is as significant an explosion in text as the printing press, has led to increased literalism? Although this hasn't resulted in religious puritanism, given that religiosity in modern times is at a nadir, it has led to politically polarized minds and a prevalence of conspiracy theories. The Internet helps people codify their beliefs by giving text to every position, both extreme and generic, leading to radicalism and rigidity.

# technology politics religion

It's possible that the announcements of a New Economy in the 1990s were early. If so, then everything today is extremely undervalued

Analysts justified the late-1990s dot-com bubble with predictions of the arrival of a New Economy, where the old rules of economics wouldn't apply. The subsequent crash not only squashed those predictions at the time, but also soured people on the possibility that the prediction could ever come true. Twenty years later, a visit to Silicon Valley is like a visit to the circus, with astronomical housing prices surrounding lavish corporate campuses brimming with absurd perks, provided by companies whose only claim to fame is making addictive time-wasters on social networks. But nobody is justifying this situation with announcements of a New anything, for fear of recapitulating that old, foolish bubble thinking. All of which collectively makes this trend invisible and, perhaps, exploitable.

# futurism technology

iTunes and Spotify are defenses against piracy, but victories for commercial user experiences over free ones

Just as piracy will always exist, so will barriers to piracy. Even though technically all media is free, free media is still inconvenient. iTunes rentals have perfect subtitles, are available instantly, don't require any special media players, aren't cluttered with ads for adult websites, and have clean metadata without duplicates. The $2.99 price tag pays for a faster, higher-quality alternative to piracy. Piracy won, in the sense that the conversation is now less about a media delivery mechanism being better than its competitors than it is about being better than piracy. Piracy lost, in the sense that there will always be a more user-friendly alternative, one that consumers are willing to buy.

# technology

Law of Diminishing Enthusiasm

One of the caveats about accelerating change is that we may reach physical limits to how fast we can make computers: at some point, all processors work at the scale of electrons and can't be reduced any further, or solving the overheating problem of CPUs may become intractable.

But there is another limit that could factor in: human demand. All technology is ultimately created to serve consumption. Without demand, there is no further development.

For example, at some point, we won't need anything after HD or Retina displays. The human eye won't appreciate any further refinements. Already we can see the waning interest in CPU speed on computers. In the 1990s, even casual computer consumers cared about how many megahertz their machines had. Now, the people who know how many GHz or cores their laptop has are a small minority. Instead, innovation is being driven by the miniaturization of CPUs. But at some point, we will have the thinnest possible phone. Already, some people complain that the iPhone 5 is too thin, and therefore too easily slips out of their hands. After thinness, what's next?

That hasn't stopped CPU innovation, though, because there has been a massive expansion of cloud computing and web servers. Consumer demand is still affecting CPU innovation, but by proxy, via demand from businesses like Google and Facebook that are servicing consumers.

At one point, it was video game consoles that were pushing the envelope of processing power. But after the PlayStation 3, there isn't much more that the gamer needs. Theoretically, the PlayStation 4 or PlayStation 5 will have as much graphical processing power as is used in rendering a 3D-animated Pixar film, but video gamers are drifting in the other direction, toward casual games on their iPhones, or are content with less graphically intense games on the Wii. So there is a step in the opposite direction: to make slightly slower CPUs at a lower cost.

There's also a limited number of hours a human has. While a power consumer may own a laptop, a smartphone, and a tablet, they divide their time among all three. Can they add another device? Perhaps they will have backup smartphones, tablets, and eReaders, but again, that will just reduce the amount of time they spend on each device. The introduction of another device will diminish the significance of each. We can only consume so much entertainment per day.

There is a pattern, though, where we sometimes think, "No more innovation will happen." For example, there is the famous (though false) quote from the Commissioner of the US patent office, who said, "Everything that can be invented has been invented." Or there's another famous (though false) quote from Bill Gates: "640K of memory should be enough for anybody." And just when we were perhaps getting bored in 2009, James Cameron released Avatar into theaters, and 3D became the next big envelope pusher. We thought, "Alas, the PlayStation 4 would need to be at least twice as fast to handle all the 3D games!" But since then, consumers have become lukewarm toward 3D. So already, we can see a turning away from new technology faster than we can innovate.

# futurism technology history

Law of Hierarchal Returns

Technological progress is more often hierarchal than incremental. The older the technology, the more foundational it is. For example, DNA laid the groundwork for multicellular organisms, which laid the groundwork for sexual reproduction, which paved the way for the Cambrian Explosion. There has been tremendous biological innovation since then, with only minor changes to the fundamental technology of DNA.

The invention of the Internet nests within the invention of computers, which nests within the harnessing of electricity. Facebook, Wikipedia, and Google nest within the invention of the World Wide Web, which nests within the invention of the Internet. It's more likely that future technologies will derive from previous technologies than be whole new classes of technology.

The accelerating march towards the Singularity may, therefore, manifest itself less like a rocket taking off, and more like matryoshka dolls, with smaller and smaller changes having a greater and greater impact.

# futurism evolution technology science singularity

Moore's Law is economics, not magic, because processor speeds actually scale to R&D budgets, which actually scale to human demand

Beneath Moore's Law is Rock's Law, which indicates that research and development (R&D) budgets for semiconductor companies are inexorably increasing alongside gains in processing speed. If R&D budgets remained the same, Moore's Law would probably break. But perhaps R&D growth is the real law undergirding Moore's Law. And R&D growth is justified by insatiable customer demand. Even if direct consumer demand levels off, which evidence suggests is already happening, our indirect need for cloud computing can grow indefinitely. There isn't enough technology to keep HD videos from buffering or to keep 3D worlds in massively multiplayer online games from streaming piecemeal.
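
As a rough sketch of what those two laws claim, here are their conventional textbook formulations in exponential form; the doubling periods are the commonly cited ones (transistor counts roughly every two years, fabrication-plant costs roughly every four), not figures taken from this note.

```latex
% Conventional formulations (commonly cited doubling periods, stated as assumptions):
% Moore's Law: transistor count N doubles roughly every two years.
% Rock's Law: the cost C of a chip fabrication plant doubles roughly every four years.
\begin{align*}
N(t) &\approx N_0 \cdot 2^{t/2} \\
C(t) &\approx C_0 \cdot 2^{t/4}
\end{align*}
% t is measured in years from a baseline of N_0 transistors and fab cost C_0.
```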

The real law is the indefinite human demand for computation. It is in our genes to seek more intelligent means. We seek finer tools, finer language, finer arts, and finer technology, which, in the case of the transistor, also helps us refine that same technology.

# technology

Nowadays, it takes an epidemiologist to tell the difference between fads and lasting products

The wild valuations for budding social networks, like Snapchat, are puzzling because the services alternately seem like fads and like nuclear reactors about to explode. After all, the consensus in Silicon Valley is that Facebook's billion-dollar purchase of Instagram was a bargain. Before social networks, fads only lasted for a season. Pogs, Magic cards, wrist slappers, Tamagotchis, and other toys for adolescents usually lasted a school year, enough time for everybody in the class to go through the cycle of excitement and boredom, in sync with everybody else.

While social networks resemble phenomena that come and go, they exist atop a cascade of fad-like events happening in pockets. In the original, single-fad model, excitement rises and falls collectively. In a fad wave, as the excitement is about to crest, an external force injects excitement back into the wave. In the case of Myspace, there was an initial three-month honeymoon, when a new user tried all the features and posted on their friends' walls. As their excitement died down, a new group of friends joined, changing the experience of the social network. They then went through their own excitement cycle, trying new features and posting on everybody's walls, meanwhile renewing that first user's interest. By the third month of the new group's cycle, another group of friends joined, extending everybody's cycle by a few months, the original user's included, until everybody found themselves using the network for a year and a half.
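
A toy calculation makes the mechanism visible. The sketch below assumes a simplified model in Python with made-up parameters (cohorts joining every three months, each excited for three months, and every new cohort renewing everyone before it); it is an illustration of the fad-wave idea, not a claim about Myspace's actual numbers.

```python
# Toy model of a "fad wave" (all numbers are illustrative assumptions, not data).
# Each cohort joins `gap` months after the previous one, stays excited for
# `excitement` months, and every new cohort renews all earlier cohorts.

def months_of_engagement(num_cohorts: int, gap: int = 3, excitement: int = 3) -> int:
    """Return how long the first cohort stays engaged, in months."""
    # The last cohort joins at month (num_cohorts - 1) * gap, and its excitement
    # carries everyone, including the first cohort, through `excitement` more months.
    return (num_cohorts - 1) * gap + excitement

if __name__ == "__main__":
    for cohorts in [1, 2, 6]:
        print(cohorts, "cohort(s):", months_of_engagement(cohorts), "months")
    # With 6 cohorts joining every 3 months, the first user stays for 18 months,
    # roughly the "year and a half" described above; once cohorts stop joining,
    # everyone's clock runs out at the same time and the network crashes.
```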

Because the phenomena outlast the typical timespan for a fad, many social networks seem like they are going to last. But every time the fad is renewed, the length of the renewal period for older users invisibly shrinks until everybody gets bored at the same time, no new groups join, and the network crashes.

# technology business

Off-Goal Targeting

A common strategy for achieving success is off-goal targeting. This is usually represented in templated expressions of the form, "Just focus on x, and then y will happen naturally." In other words, focus on a tangential objective that indirectly contributes to the other, "real" goal. A common example, as often expressed on the blog Daring Fireball, is to focus on delivering truly high-quality user experiences. If you make product interfaces an absolute joy to use, then those products will sell themselves.

The point of this exercise is two-fold. First, it gets your mind away from short-term expectations of success. If you step into product development with the idea that you want to make as much money as possible, you will likely cut corners and sacrifice quality in order to more efficiently maximize income. This mindset is likely to be self-defeating or only lead to short-term gains.

The second aspect of this goal is to focus on something you can control. It's much easier to control quality because you can measure it yourself, every day. Since there are many uncontrollable factors that play into how successful a product is (e.g., luck, timing, and competitors), focusing on quality lets you quiet distractions. Your anxiety will be reduced because you've narrowed your attention to a locus of control (as described in The 7 Habits of Highly Effective People). If your product doesn't do as well as you had hoped, at least you can rule out quality. This would encourage you to retry marketing, or to wait and see if market conditions improve. Whereas if you had made a shoddy product to begin with and the returns didn't come in, all your effort would have been for naught.

Off-goal targeting is ultimately a hedge against failure. If your off-goal target is to "learn programming" or "build things of quality" then even if those products don't succeed commercially, you'll have improved your overall experiences and built up your portfolio, all of which will outlast the initial gains from a quick success.

# success work technology

Pioneers usually come from outcasts, but it's only recently, with computers, that outcasts can reliably turn obscure wanderings into careers

There has been a resurgence in the pioneer's lottery, which is the age-old pursuit of being first. The first person to bump into the oil fields of Texas, the first person to discover gold veins in California, and the first person to land in the New World were always the first to reap the benefits of those new platforms. Now such opportunities for firsts are becoming more accessible and frequent. The first person to create the "million dollar webpage," where you can rent pixels for a dollar, became an instant millionaire. The first person to hold up a Bitcoin QR code on a placard at a baseball game received $50,000 instantly. Those who had apps on the iOS App Store on its first day were set financially for the rest of the year. The people who were maybe a year late to the App Store but created the first apps in a niche stood to profit for many more years.

Earlier forms of this lottery depended on a significant amount of luck. Sure, you could increase your chance of discovering the Midwest by having wanderlust, but for each pioneer who made an important discovery, there were thousands of wanderers who discovered nothing and were considered lost souls and outcasts.

But yesterday's lone rangers have become today's nerds. Initially, they seemed "lost" spending their teenage years affixed to screens playing games, only to become the first to create the same million-dollar games for Facebook. The frequency with which this is happening has come to the point where the armchair futurist can now rely on the exponential pace of technological change to continuously supply him with open vistas, ready to tap.

# futurism technology

Robots will replace mundane tasks only after a century of nerds ruling the world. Someone has to create and maintain said robots

When a machine replaces someone's job, their individual increase in unemployment is more than made up for by the reduced cost of goods spread across the rest of the economy—so the theory goes. For optimistic futurists, machines will solve every problem, leaving us free to live a life of leisure.

However, these futurists and economists gloss over the transition to utopia. Engineers have to create these automatons, and while we could consider their jobs temporary, how temporary is temporary? What if it takes a thousand years of computer programmers working to automate everybody else's jobs, such that the entire workforce becomes programmers?

Another scenario is that the future then becomes exclusively built for the machine-makers. Survival a hundred or so years from now could depend on having a minimum amount of technical literacy. This scenario could even come about in a roundabout way, whereby welfare becomes outsourced to companies like Google or Facebook, where in order to access your benefits, you have to perform certain technical maneuvers, like cashing in virtual money from playing games. Perhaps welfare is guaranteed to anybody who knows enough about technology to make a secure enough password to protect their Bitcoin wallets.

While the collective dream is one of an all-encompassing leisure life, there are so many roadblocks to getting there, that the roadblocks themselves could ultimately end up being the reality.

# futurism technology society

Silicon Valley's real innovation is to promise workers they can get filthy rich without selling their souls

Silicon Valley pushes the work-life harmony angle, persuading workers that they can "have it all." In every other industry, there is a tacit understanding that you are ultimately trading your time for money so you can buy the other things in life that are important. In some sectors, like Wall Street or Oil and Gas, there's an open acknowledgment that you are selling your soul for the sake of being a Master of the Universe. For the technocrats of California, there is no trade-off. One can be a Master of the Universe while creatively expressing oneself at work and affecting people's lives. Valley corporations have the three Ps covered: passion, purpose, and pay. Do what you love, help others, and profit handsomely.

Because of this, there is a palpable lack of cynicism while walking through the gourmet cafeterias of Zynga or Facebook. For a swath of workers, concentrated in one geographic region, utopia is already here.

# singularity technology

Software is like water, conforming its shape to its container, whether it's ice cube trays or a series of sprints

The reason Agile development works so well is that software is much like water. While it's important to architect and design code before building it, there are an infinite number of ways to get to the same business outcome. Whole features that were once thought critical may fall onto the editing-room floor if a gun were put to the developer's head. Short two-week sprints are exactly that gun.

When building a house, one can't just skip building bathrooms or skip building a foundation. When building software, though, sometimes a sketch that's re-sketched a few times becomes a minimally viable product, and a few iterations later, one stand-out feature becomes a whole company.

# computer_science technology

Soul-Shattering Epiphanies and Conspiracism

While studies don't make it clear whether the Internet is responsible for the recent uptick in conspiracism—or whether there is even an uptick at all—the recent measles outbreak tells a different story. What's happening is a direct result of a general increase in anti-vaccination interest, which is hard to imagine occurring before the Internet.

The prevailing theory for why the Internet is promoting vaccination conspiracies—and possibly other conspiracies—is that of "filter bubbles." Online social networks reinforce our own biases by steering us to mingle with like-minded people, all so that those social networks can sell more ads. However, the opposite could also be true. The world was more isolated before the Internet. Now we can be exposed to a more diverse set of opinions, such as anti-vaccination, than ever before.

For example, Patricia Steere, one of the stars of the Flat Earth documentary Behind the Curve, became a vegan after listening to the album Meat Is Murder by the Smiths. If you woke up one day and realized that you were a mass murderer, you might question other global truths. You would begin searching for things to upend until it became an obsession. Everybody is exposed to more information now. People have more chances to encounter world-changers like Meat Is Murder, which is one click away on Spotify. From there, they can keep going down the rabbit hole until the whole world is upside-down.

# society conspiracies technology

The Amish are actually very modern since it's only in modern times that the impracticality of being radically practical can be a point of pride

The Amish are usually considered a throwback, but it's more significant than that. Their lifestyle is a luxury. They are willfully recapitulating olden times, not just because older is more authentic, but because those ways aren't necessary anymore. If an Amish person were transported back in time to when subsistence farming was common and nobody had electricity, there would be nothing interesting or special about their religious posture. Their existence today is only interesting inasmuch as it is completely unnecessary to live with such chaste restrictions.

However, perhaps the Amish are actually on the cutting edge of retro-futurism. We are losing so many of our old traditions to productivity gains from technology that, at some point, people will have jobs simply to keep up the simulation of wage-based living. Perhaps a completely ludic economy, one where nobody has to work, will descend into anarchy, and so some restoration of the past will be necessary to keep order.

# futurism technology history

The Centralization of Technology

The current panic over the centralization of tech companies isn't new. While the fear that just a few players (Facebook, Google, Amazon, etc.) are dominating the tech space is a legitimate one, the forces underlying it are part of who we are. Centralization, technology, and capitalism are intertwined. In a way, all of those concepts are synonymous with leverage. Capitalism is the centralization of wealth that is then dispatched for singular aims. Technology is the means of transforming that capital into some value for society. Whether it's farming, weaponry, railroads, or industrialization, the first to seize on an invention is the first to capitalize on it. The subsequent rapid growth is then centralized by the wielder of that tech.

While Silicon Valley tech companies have a friendly face, they are the continuation of a long tradition of capitalists, from the East India Company, to Big Oil, to kingdoms and empires. All power ultimately stems from technology. Even the tribal chief's power comes from the technology of spears and clubs.

Eventually, the capitalist's leverage gets co-opted and commoditized by insurgents. Democracy undid the trusts of Big Oil. Guerrilla fighters with muskets undid the projection of power wrought by imperial powers. Eventually, an approximation of Google will emerge for less money. Eventually, Amazon's economies of scale won't be such a gross advantage. Facebook is a challenge to this model, though. The nature of its application comes from the size of its network, making it impossible for upstarts to compete. And splitting up Facebook wouldn't work like splitting up Big Oil, since Facebook's utility is in its all-encompassing nature. And yet, the incentive for upstarts to enclose people into smaller tribes that use different kinds of social networking experiences is too valuable. The king can only reign so long until the meek co-opt all of his advantages for their own power play.

# technology society

The emphasis on content in responsive design has turned the art of making web pages from the "anything goes" of Geocities back into the walled garden of AOL

The title "web designer" used to be cutting edge and creative but now (2015) it has become a paint-by-numbers routine. Most websites follow a few basic templates. Everything is simply a matter of filling in the navigation here, a title bar there, and so-on. This is a result of the web becoming more mobile-centric. There's simply only a handful of ways to build a responsive site, one that collapses correctly into a mobile-appropriate way. Thus the web is becoming more consistent, and potentially more content-focused.

# design technology ux

The end of hipsters coincided with the rise of iPhones, because as soon as they all had to have one, it exposed their monoculture

Although the hipsters (circa 2007) initially rebelled against monoculture, they ultimately created a new monoculture, one that existed solely in opposition to the mainstream monoculture.

Hipsters ostensibly rebelled against such confines, and thus they would never describe themselves by such a single-serving label. But a malaise emerged in the scene as their music became trite. A lot of what interested hipsters was novelty, but now novelty can be manufactured by throwing a little experimentalism into a catchy pop tune. Hipsters were into authenticity, but this can be simulated by reducing the production values of songs to achieve a "lo-fi" effect. And hipsters were into depth, which can be achieved by adding some difficult themes or moods to any genre, even rap.

Music was a leading indicator, and commentators derisively declared the early death of the hipster, even as hipsters themselves eschewed the label or ignored the signs of their extinction. What tipped the creeping malaise over the edge, though, was the introduction of the iPhone. Every hipster had an iPhone. Friends who didn't were chided when they couldn't play Words with Friends or properly receive group messages.

When you went for a smoke at a hipster club and saw the same iPhone in everybody's hand, it defeated the purpose of being in a place that was supposed to be celebrating diversity. It revealed the movement for what it was, which was all about representing things considered high-quality, pop, and "different"—all the things that Apple represents.

# technology society

The expression "writing software is like building a house" is a poor analogy, unless the house is made of Lego

# technology computer_science

The Limits of Collective Research and Reasoning

According to the Internet's best guess, everything causes cancer, and everything doesn't cause cancer. If you search, "Does such-and-such cause cancer?" you can find plenty of articles from supposedly reliable sources, such as WebMD, to confirm the carcinogenic properties of almost anything. The same confusion is true for the side effects of drugs. Everything causes headaches, nausea, etc. Or, if you follow the consensus advice on what to eat and what not to eat, you will be left with only five possible foods in your diet.

There is a medical truth out there, but our current quality of research and the current collective reasoning skills of the Internet are insufficient to answer a broad swath of common questions today.

# technology science medicine

The link is the point, not the content at the end of that link, which describes not only Google but also the mind

Hyperlinks and search results are eclipsing content in importance. We often share, comment on, or save links before fully reading the link's contents. Oftentimes we use search engines not so much to learn anything in particular, but to know which keywords reveal what kind of results.

Metaphors for mental phenomena are steeped in the technology of the era. Formerly, brains were compared to hard disks, with streams of 0s and 1s representing blocks of content. Perhaps a revised metaphor is that a brain is like a web browser's bookmarks. We are indexing content and making little notes in anticipation of some future date when we are asked a question and need to recall that that question is indeed answerable somewhere.

# technology futurism

The most successful businesses today anticipate Moore's Law; the most successful ones of tomorrow anticipate everybody doing the same

Computer manufacturing companies have long had dynamic business models that factored in Moore's Law. For example, Apple intentionally released an iPhone that the public couldn't afford because they knew the cost of storage and computer chips would continue to decline.

Moore's Law is perhaps an easy trend to anticipate, but increasingly it appears that disruptive businesses need to anticipate secondary or tertiary disruptions that come from exponential computing laws.

For example, the music industry doesn't have a Moore's Law per se, but it's conceivable that the "next BitTorrent" or the "next iTunes" is just around the corner, ready to change the economics of distributing music. This is why, when Spotify entered the market, their profit margins didn't just seem marginally lower than those of the then-market leader, iTunes, but lower than those of a future, unknown market leader. That market leader has yet to materialize, or Spotify has stayed well ahead of it, which is perhaps why Spotify is the current market leader.

# business technology futurism

The native vs. web app debate would be moot if browser cookies and caches truly kept us signed on with offline surfing

Native apps are superior to web apps in essentially two ways that a browser can fix very cheaply: consistent caching rules and naturally permanent cookies. The promise of the browser cache doesn't seem real, as it's impossible to surf your browser history without an Internet connection. Even though many users are on broadband, it would seem there's still an incentive to serve assets from a cache as often as possible, so as to keep web page load times down.

The promise of browser cookies also doesn't seem real, as it's impossible to go a day of surfing the web without having to log into a site again. This is the most bizarre of all patterns, because login failures equal lost business. But stricter caching and cookies that are more obviously permanent would be all it would take to give a native-app experience to the web. After all, isn't the main benefit of native apps that the UI works offline and you don't have to log in every time?
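
For what it's worth, both fixes are available to sites today. Below is a minimal sketch of the server side of that idea, assuming a Python/Flask app; the route name, one-year cache lifetime, ten-year cookie lifetime, and token value are arbitrary placeholders, not recommendations from the note above.

```python
# Minimal Flask sketch: aggressive caching plus a long-lived session cookie.
# The route, one-year max-age, and ten-year cookie lifetime are illustrative choices.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/app")
def app_shell():
    resp = make_response("<html><body>App shell</body></html>")
    # Tell the browser it may reuse this response for a year without re-fetching.
    resp.headers["Cache-Control"] = "public, max-age=31536000"
    # Keep the user signed in with a cookie that survives for ten years.
    resp.set_cookie(
        "session_token",
        "opaque-token-value",          # placeholder; a real app would sign this
        max_age=10 * 365 * 24 * 3600,  # lifetime in seconds
        httponly=True,
        secure=True,
    )
    return resp

if __name__ == "__main__":
    app.run()
```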

# design technology ux

The next phase for technology is to innovate without causing progress traps, thus marking the end of cynicism

Technology is progress in the sense that it advances human aims. And yet, progress has a bad reputation because of progress traps. A progress trap occurs when a technological breakthrough that is useful on a small scale becomes catastrophic at a larger scale. For example, advances in fishing technology are a boon to the individual fisherman, but ruinous for the world when everyone uses them.

But what if the pace of technological change speeds up so much that it solves secondary or tertiary problems before they happen? Perhaps innovations in fishing net design do trigger progress traps, but then a new method for increasing the fecundity of fish arrives a year or two later, or a new system for monitoring and regulating fishing boats is invented.

At this point, the most noticeable breakthroughs in technology will be when patterns and ceilings formerly inherent in technology become broken. One by one, all cautionary tales will become moot, and history will truly end.

# technology futurism

The second, decline phase for Moore's Law takes what were once stellar technological achievements and figures out how to milk them

# technology business

The shocking thing about the future is that future shock doesn't really happen: As change accelerates so does our boredom

Alvin Toffler predicted that Future Shock would be an increasingly common affliction, with people increasingly unable to cope with changes in their everyday lives. However, the reality is that modern humans are more or less capable of accepting rapid change. People who are upset by future shock have been, and continue to be, culled by society. For example, those who stuck to factory jobs in Detroit rather than switching cities or to different types of work have become disgruntled. Perhaps they are too old to resort to crime, but their disaffection, if it didn't motivate their Gen X or Millennial children to do differently, might have led to crime in their children, who were then jailed and removed from the reproductive and social pool.

Someone living in Alvin Toffler's time saw the rapid proliferation of automobiles followed by the ascendancy of airplanes. The logical conclusion back then was that we'd all be shuttling to space with our personal robots by now. If that pace of lifestyle change had continued to modern times, perhaps we'd all be drooling in future shock. But the reality is that while technological change continues at an accelerating pace, lifestyle change has slowed down a bit, and we've become more accustomed to these changes. Nobody believes that the tools they use today will exist tomorrow, but we know that we'll adapt. Instead of lying on the bed of a truck fantasizing about space travel, the young, bored hipster snaps his fingers and complains, "Where is our damn flying car!"

# futurism technology

The success of BuzzFeed and the success of foul-mouthed politicians share the same logic

The rise of foul-mouthed politicians like Donald Trump, Paul LePage (Governor of Maine), Rodrigo Duterte (President of the Philippines), and Boris Johnson (Mayor of London) is the result of social media and democratic apathy. Less than half of eligible voters participate in elections, and less than half of those voters are informed. As a result, the people are much like a sleeping giant, with only a small percentage of its conscious functions active at any given moment. Even if a politician says something widely unpopular and distasteful, he may inspire enough support in his base to overcome the unpopularity. For example, if a politician says something politically incorrect that irks the ears of 20% of the electorate but inspires or rings true for 12% of the electorate, he may still win if only a quarter of that 20% opposition shows up versus half of his 12% supporters.
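
To make that arithmetic explicit, here is a quick back-of-the-envelope check of the note's own hypothetical percentages, sketched in Python with an arbitrary electorate size.

```python
# Back-of-the-envelope check of the turnout math described above,
# using an arbitrary electorate of 100,000 people.
electorate = 100_000

offended = 0.20 * electorate   # 20% are irked by the remark
inspired = 0.12 * electorate   # 12% find that it rings true

votes_against = 0.25 * offended  # only a quarter of the offended turn out
votes_for = 0.50 * inspired      # half of the inspired turn out

print(int(votes_against), "votes against")  # 5000
print(int(votes_for), "votes for")          # 6000: the offensive remark still nets a win
```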

This math has existed as long as democratic apathy has, which is to say for at least fifty years. However, it wasn't really exploited until the rise of social media. More than half of people now receive their news and information from social media, whether it's Facebook, Twitter, or the original social medium: the email forward. In these environments, sensational rhetoric thrives. Before social media, the mainstream press could curate rhetoric to be the most pleasing to its audiences; today, everybody forwards the most incendiary sound bites to their friends, regardless of whether they support or oppose the position. Politicians are rewarded for inciting the most offense in their opposition, since even the opposition will help spread their words, which may ultimately lead to a net increase in support.

In other words, given the amount of slack in democracy, combined with the speed at which incendiary rhetoric bounces around social media, you can constantly say unpopular things in today's climate and still win a popularity contest, i.e., an election.

# politics technology

What we lose in zest by using passive voice, we gain in intrigue, as storms gather, and things happen

"The Internet was invented" sounds better than "Scientists invented the Internet" or "Al Gore invented the Internet." Even though passive voice is discouraged in English class, we crave it. Much of the human experience is about receiving things or things happening to us, with the actor or agent unknown. We can identify who invented the Internet, but doing so ignores the idea that technology itself has agency. "Technology wants something," with or without the specifically named inventors, and so the arrival of the Internet, in a way, is something that magically appeared.

Active voice frames conversations with causality, which makes sense in the plot of a spy thriller with a character who is pushing the events of the story forward. But even then, one could frame it in terms like, "conspiracies formed, commands were sent, and computers got hacked."

# rhetoric technology philosophy writing

While we haven't invented anything as profound as electricity, the number of things we've built from electricity is profound

The portable phones we carry are light-years ahead of the phones we had two decades ago, but consumer aircraft haven't changed much, if at all, in that same timespan. The miniaturization of electronics can be ascribed to Moore's Law, which describes the inexorable trend of packing transistors into smaller and smaller spaces; aircraft innovation, by contrast, hit a ceiling of material and fuel costs early, costs which have remained fixed or gradually increased over time. If one were to put the historical innards of cellphones on a time-lapse, it would show a box in which the battery expands like a tumor, since battery shrinkage has not progressed nearly as much as transistor shrinkage.

So while the acceleration of technology appears to be an unstoppable force, the shape of the change is more like the hands of a clock. The second hand represents tracks of innovation that are still moving swiftly, whereas the hour hand represents domains where technological growth has slowed or stopped. Collectively, all the hands are turning, and there is some noticeable sense that every couple of years technology is going to improve, like clockwork. But what has become clear is that there isn't some magic ensuring every technology gets better all the time. That idea started to die as soon as flying cars did not arrive on schedule. As inexorable as trends like Moore's Law seem in retrospect, at any point they could stop and become like the airplane: a giant leap for humankind we now take for granted.
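As a rough sketch of the second hand versus the hour hand, compare two decades of compounding at very different rates. The figures below are ballpark assumptions (density doubling every two years for transistors, a few percent a year for batteries), not measurements:

```python
# Two decades of compounding at very different rates (assumed, illustrative figures).
years = 20

doubling_period = 2                                  # assume transistor density doubles every ~2 years
transistor_factor = 2 ** (years / doubling_period)   # ~1024x

battery_annual_gain = 0.05                           # assume ~5% yearly energy-density improvement
battery_factor = (1 + battery_annual_gain) ** years  # ~2.7x

print(f"Transistor density: ~{transistor_factor:.0f}x over {years} years")
print(f"Battery energy density: ~{battery_factor:.1f}x over {years} years")
```

The exact numbers matter less than the gap between the two curves: one track compounds into orders of magnitude, the other into a modest multiple.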

# futurism technology

Wikipedia, along with other major data consolidators, has given us a Semantic Web, for better or for worse

It's amazing how often Wikipedia is a better salesman for a product than the business itself. Using myself as an example, I wanted to learn more about The Linq hotel in Las Vegas, particularly its storied history as the Imperial Palace, followed by a brief period as The Quad, a misguided re-branding in a vain attempt to appeal to Millennials. The hotel's own website mentions none of that; instead it offers a series of glossy photos showing the property in a flattering light, along with information about upcoming shows.

Wikipedia has filled the gap that the push for the Semantic Web was supposed to fill. The Semantic Web was supposed to be a web where everybody marked up their pages with structured data, data that could be more easily consumed and indexed, something like the Dewey Decimal System for the Internet. Instead, Wikipedia has given nearly every notable business a semi-structured dossier, albeit an imperfect one.

The same is true for other major data consolidators, such as Google Maps. Going back to the hotel example above, it took too many clicks on The Linq's website to find where the hotel sits in relation to the rest of the attractions on The Strip, no doubt because The Linq has no interest in sending business to its competitors. It took less time to go to Google Maps, type in "linq," and see where the hotel was in relation to the tram and other points of interest on the Vegas Strip.

So instead of the Semantic Web, we have a collection of semi-accurate, semi-structured mega-sites, each offering a crucial lens into a business (e.g., mapping from Google Maps, establishment info from Wikipedia, pricing from Hotels.com), leaving the owner's own website at best redundant and at worst distracting, and thus a source of lost business.

# technology

With Google, Facebook, and Reddit, is there still room for just surfing the web?

As much as Google dominates the revenue generated from information distribution on the Internet, it's still limited in the sense that it's only good for mining things people already know exist. People rarely experiment with keywords in the search field; rather, they narrow their keywords to tap into something they know exists or think might very well exist. But is it possible there is an untapped market for helping people find things they don't know exist? There are plenty of discovery tools. Facebook is chock-full of shared links that expose users to novelty, but that is a passive process. Before Google, there was a mystique to surfing the web. Perhaps that use case has been crowded out, or perhaps there is some as-yet-uninvented interface that will reveal it again.

# technology

You know cybersecurity is lacking proportionality when the utility company requires complex passwords that change every six months just to view and pay your bill

We should be emphasizing probabilistic security. If we went with the security experts' advice, all our passwords would look like "He8(38!!*K" and be unique for every website.
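A minimal sketch of what that advice looks like in practice, using Python's standard secrets module (the length and character set here are arbitrary choices):

```python
import secrets
import string

def random_password(length: int = 12) -> str:
    """Return a random password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(random_password())  # e.g. something in the spirit of "He8(38!!*K"
```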

But then you would need a password manager, which would have its own unique password, and that would create a single point of failure. Even if the probability is effectively zero that any of the top ten password-management companies gets hacked, it's still possible that someone could install a keylogger or look over your shoulder and gain access to all your other passwords. It would probably be better to have 1% of your low-value online accounts hacked every month than to risk such a catastrophic single-point failure.
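A back-of-the-envelope expected-loss comparison of the two strategies; every number here is a made-up assumption purely to illustrate the trade-off:

```python
# Scenario A: weak, memorable passwords; 1% of low-value accounts hacked each month.
low_value_accounts = 50
monthly_hack_rate = 0.01
cost_per_hack = 20                      # dollars of hassle per hacked low-value account
expected_loss_a = low_value_accounts * monthly_hack_rate * cost_per_hack  # $10/month

# Scenario B: everything behind one master password; a keylogger or shoulder-surfer
# compromises it with some small monthly probability.
p_master_compromise = 0.001
cost_of_total_compromise = 50_000       # every account, bank included, at once
expected_loss_b = p_master_compromise * cost_of_total_compromise          # $50/month

print(f"A (scattered small hacks): ${expected_loss_a:.2f}/month expected loss")
print(f"B (single point of failure): ${expected_loss_b:.2f}/month expected loss")
```

Under these invented numbers, the scattered small losses come out ahead of the rare catastrophic one, which is the point of the probabilistic framing.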

If your passwords are so unmemorable that you have to click "Reset Password" and email yourself a new one, then once again you have a single password: the one to your email account.

The complexity of security should be proportional to the value of the secured goods. In the real world, we put padlocks on sheds in small towns and multiple tiers of locks on apartment doors in New York City. In software, the same proportionality doesn't exist. Digital security vulnerabilities appear immediately exploitable and immediately patchable. The cost of materials in the real world creates naturally sane limits; software costs cannot be measured in kilograms of metal needed to produce a lock.

The local energy company may remind customers every six months to change their passwords. It might require that new passwords be longer and contain special characters, so as to be uncrackable. But ultimately, there isn't much value in stealing an encrypted database of energy bills. Whatever protection the company gains from these user-unfriendly measures, it more than loses in missed customer payments.

# technology