Sep 18, 2022
A future computer platform
Studying the careers of people who’ve done great work made me realize how often their output can be attributed to their ability to spot seismic shifts and ride the subsequent waves. While these shifts are usually obvious in hindsight, the real utility comes from being able to predict (or at least notice) them looking forward. That way, you can orient yourself to maximize your chances of catching a wave and taking full advantage of it.
People tend to be skeptical of the claim that you can predict them. But I think that’s overly pessimistic. It won’t work every time, but it’s worthwhile trying to optimize your expected productive output as best you can - even if you end up just orienting yourself in roughly the right direction.
Defining a computer platform shift
A technological [1] shift I felt during my life was the computer platform shift from laptops to smartphones. The phrase computer platform shift tends to be thrown around a lot. To avoid confusion, I’ve created my own framework to define it [2]. A computer platform shift begins as either:
- A new device (physical hardware), or
- New networking/communications infrastructure
But, to constitute a full platform shift, the above needs to:
- Provide applications, experiences, and services that
- Solve a set of deep problems in people’s lives [3]
- In a significantly better, faster and/or cheaper way
- For many millions of people
Shifts can occur simultaneously and independently and aren’t necessarily zero-sum - the rise of a new device or network doesn’t have to mean the fall of another. Instead, existing platforms become more specialized within their respective niches. [4]
Predicting what might come next
Since it isn’t the platforms themselves that determine adoption but the applications within them, your best bet is to look for the platform with the potential for useful applications to be built and to assume that nerds will eventually build them. Some of these (killer) applications do things so much better/faster/cheaper than the alternatives that the platform gets adopted wholly because of them.
A platform without a killer application looks funny, maybe even gimmicky, which is why it’s difficult to wrap your head around why nerds care so much about it. The danger is that lying underneath is a lethal application that’ll take you by surprise. [5]
Since you can’t use an objective (better, faster, cheaper) lens pre-killer-app, a good place to look for a potential new platform is whatever nerds find interesting. Since their interests largely stem from opinions, feelings and convictions, it can feel a little culty, but if enough nerds have come to the same conclusion independently, you can trust that there’s something in it. A useful heuristic is to find the platforms that people describe as feeling like the future. [6]
If you’re observant and lucky, you may be able to find early applications built with (or for) new tools that people really love. It’s safe to assume that whatever enabled that particular application to be built will enable others and, over time, turn the tool into a full platform. I think you can go a long way by asking people what new thing has made their lives significantly better. Most responses will be things like a dishwasher or mechanical keyboard, but if you ask a broad range of people over a large enough timespan you’re bound to get some interesting responses. With a little curiosity, you’ll soon find yourself looking straight at something that could become a future platform.
A future computer platform
The strongest case I’ve found is at the intersection of augmented reality (AR) and brain-computer interfaces (BCIs). It’s pretty early, so there isn’t a precise term for this intersection yet, but some contenders are AR + BCIs, biosensing + spatial computing, extended/mixed reality or some combination of the above. If you know of a better one, let me know, but for now I’ll refer to the intersection as personal brain-computers (PBCs).
The core property that enabled previous platforms to gain market share will be the same for PBCs - an order-of-magnitude increase in input and output bandwidth between humans and computers. [7]
PBCs increase output bandwidth (measured in bits/second) by giving you more screens, of any size, everywhere, customised to you, all at zero marginal cost. Whilst going from a 14” laptop display to several 30” displays is useful, it isn’t a productivity revolution - it’s an unimaginative use of this new superpower, constricted by what we’re used to in current paradigms (if paradigmist isn’t a word, it is now). We’re yet to realise just how significant the ability to add pixels to the real world, at will, really is.
The increase in input bandwidth is equally significant. For the past half-century, we’ve been limited to the speed at which our fingers can type and point. Not only is this inconvenient, it’s also far slower than the speed of our thoughts. Interacting with PBCs will be as quick and convenient as looking at an object (real or augmented), thinking about interacting with it and getting instant feedback. But again, this is paradigmist thinking. PBCs’ ability to measure eye, muscle and head movements, pupil dilation and brain activity will open up applications that have previously not been associated with conventional computing. The ability for a smartphone to call a taxi to your exact location with the press of a button (via GPS input) is a good analogy.
An overlooked aspect of bandwidth is its effective increase over longer periods of time. Whilst input bandwidth on a smartphone is slower on a bits/s basis vs a laptop, the fact that you can easily use it on the Tube increases your net input bandwidth (say, bits/day). Similarly, PBCs can record data for as long as you wear them and increase your ability to interact with computers throughout the day.
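To make the bits/day point concrete, here’s a rough back-of-envelope sketch. Every number in it (symbol set, typing speed, hours of use) is an illustrative assumption rather than a measurement - the only point is that availability can matter as much as raw speed.

```ts
// Back-of-envelope comparison of effective daily input bandwidth.
// Every number here is an illustrative assumption, not a measurement.

const BITS_PER_CHAR = Math.log2(50); // ~5.6 bits, assuming ~50 usable symbols

// Laptop keyboard: fast per second, but only while you're sat at a desk.
const laptopCharsPerSec = 5; // roughly 60 words per minute (assumed)
const laptopHoursPerDay = 5; // hours of hands-on-keyboard time (assumed)

// Wearable PBC: slower per second (gaze plus subtle inputs), but worn all day.
const pbcCharsPerSec = 2;  // assumed
const pbcHoursPerDay = 14; // most of the waking day (assumed)

const bitsPerDay = (charsPerSec: number, hours: number): number =>
  charsPerSec * BITS_PER_CHAR * hours * 3600;

console.log("laptop ≈", Math.round(bitsPerDay(laptopCharsPerSec, laptopHoursPerDay)), "bits/day");
console.log("PBC    ≈", Math.round(bitsPerDay(pbcCharsPerSec, pbcHoursPerDay)), "bits/day");
// With these assumptions, the slower-per-second device roughly matches (or beats)
// the faster one over a whole day.
```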
Doubts
To many, this will sound dystopian. Even if this type of pushback happens during every platform shift, the fact that we’re still struggling to contend with the side effects of excessive social media and smartphone usage suggests that people are right to be at least a little concerned. Mass adoption of PBCs will take place whether we like it or not, so it’s worth learning about the dangers to try to minimise them. I think a combination of an adjustment period plus the building of tools to help us combat unnecessary distractions and dopamine-driven UX will help mitigate most of the downsides. Since competing PBC companies are inclined to build things customers want, this is ultimately going to be a battle between our collective neocortexes and limbic systems [8] to make the correct product decisions - but I’m pretty optimistic that Team Neocortex will come out on top. [9]
There are additional doubts too. Even though virtual reality (VR) has been hyped for many years now, it hasn’t become a major computer platform like many initially thought. Most of us are familiar with how VR can enhance your sense of immersion. The reason VR itself hasn’t (and most likely won’t) become the next computer platform is that immersion-as-a-property alone isn’t strongly desired in our everyday lives. Even when immersion is by far the highest-ranking property in an application, the advantages of VR rarely outweigh the inconvenient, clunky and slow UX. This is clearly seen in its adoption within the gaming community - a medium perfectly situated to take full advantage of this capability. It turns out that only a small subset of gamers rate immersion as the single core property, so adoption has stayed low.
If VR applications can be thought of as replacing immersive applications on previous platforms, PBC applications can be thought of as replacing utilitarian ones - the way we work and interact with people and computers.
Solutions
Good predictions are time-bound. So, I’m going to suggest that core applications previously used on desktops and laptops will be replaced by better, faster, cheaper mass-market ones on PBCs by 2025. Applications previously used on smartphones will be replaced by ones built for PBCs by 2030. Naturally, hobbyists and early adopters will begin using the tech well before everyone else and we’re already starting to see this happening.
The improvement in PBC power and comfort can be explained crudely via Moore’s law. The reason people complain about the marginal improvements in laptops and smartphones every year is fundamentally because the form-factor cannot change without reducing usability. In other words, the laptop and smartphone form-factors have reached a local maximum. Over time (assuming Moore’s law continues to hold), the advantages gained from the PBC form-factor will outweigh its relative lack of power or comfort when compared to laptops and smartphones. We are a generation away from the hardware being comfortable enough to wear sitting at your desk all day, and we’ll eventually be able to do even the most intense work outside using the smartglasses form-factor.
Whilst compute is a solved problem, AR optics is notoriously difficult. Whereas optical see-through displays (to replace your smartphone) are still about a decade away from being a great experience outdoors, video see-through displays good enough to use sitting down (to replace your laptop) are just around the corner.
When people hear BCI they immediately think of implanting chips in the brain. This will happen, but we’re many decades away from it being a practical mass-market consumer product [10]. However, the ‘brain’ part of personal brain-computers lies on a spectrum going from our thoughts to our muscles. A laptop can be thought of as a BCI with a bunch of middlemen in between - converting thoughts into electrical signals (neurons) that become muscle movements (fingers) and kinetic energy (key presses) that then convert back to electrical signals in the form of bits. With this perspective, the pragmatic approach to a mass-market BCI is to remove as many of these middlemen as possible to get closer to the brain, reduce latency and increase bandwidth. A good example of this is skipping the need to move a cursor using your fingers by having it appear exactly where you’re already looking using sub-degree eye tracking.
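As a rough sanity check on that eye-tracking example, here’s a small sketch converting angular gaze error into on-screen cursor error. The tracking error, viewing distance and pixel density are assumed values chosen only to illustrate the scale.

```ts
// Rough check: how much on-screen error does "sub-degree" eye tracking imply?
// Tracking error, viewing distance and pixel density are assumed values.

const trackingErrorDeg = 0.5;  // assumed sub-degree gaze error
const viewingDistanceMm = 600; // ~60 cm to the display (assumed)
const pixelsPerMm = 3.5;       // roughly a 90 DPI monitor (assumed)

const errorMm = viewingDistanceMm * Math.tan((trackingErrorDeg * Math.PI) / 180);
const errorPx = errorMm * pixelsPerMm;

console.log(`~${errorMm.toFixed(1)} mm, or ~${errorPx.toFixed(0)} px, of cursor error`);
// ~5 mm / ~18 px: about the size of a button, so gaze alone can land the cursor
// on the right target and a small confirming input does the rest.
```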
What we are building
To get to this future we’re going to need great software. Today, nearly all our interactions with computers are either on web browsers or use web technologies [11]. So, we think the most important piece of the software puzzle is going to look like an open, cross-platform web browser, allowing you to work and play in AR using multiple types of input. The browsers used in XR today (of which there are very few) are just worse copies of browsers that exist on current platforms. We think that’s unimaginative. For PBCs to become the next platform, we need to be able to take full advantage of their potential. Alongside new features like multiplayer mode, we’re also working hard on making it easier for others to build powerful new applications to bring the web to AR.
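As a hint of how far existing web technologies already reach into AR, here’s a minimal sketch using the standard WebXR Device API to start an immersive AR session with an ordinary HTML overlay. It’s a generic web-platform example, not our browser’s own API, and the feature list is illustrative.

```ts
// A minimal WebXR sketch: start an immersive AR session and a render loop.
// Generic web-platform API usage; typings may require @types/webxr.

async function startAR(): Promise<void> {
  const xr = (navigator as any).xr;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.log("Immersive AR is not available on this browser/device.");
    return;
  }

  const session = await xr.requestSession("immersive-ar", {
    requiredFeatures: ["dom-overlay"], // float ordinary HTML over the real world
    domOverlay: { root: document.body },
    optionalFeatures: ["hit-test"],    // anchor content to real surfaces if supported
  });

  const refSpace = await session.requestReferenceSpace("local");

  session.requestAnimationFrame(function onFrame(_time: number, frame: any) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // pose.views tells us where the viewer is looking this frame,
      // i.e. where to draw pixels in the real world.
    }
    session.requestAnimationFrame(onFrame);
  });
}
```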
The ability to add pixels to the real world and increase i/o bandwidth by an order of magnitude will allow humans and computers to be more productive than ever. As the hardware gets smaller, and with practical AR just around the corner, we’re close to transforming PBCs from something nerds and hobbyists find interesting to something that also looks more like a computer platform.
[1]: As opposed to cultural, political, economic
[2]: It isn’t perfect so let me know if a better one exists
[3]: Usually meaning something people already spend significant time and money trying to fix - these problems tend to have their own category (entertainment, education, communication, work).
[4]: Laptops still exist, even if hyper-portable computing is more convenient on a smartphone.
[5]: Which is exactly how you get stuff like this.
[6]: I think this is where blockchain is today - although it’s difficult to tell how much is real interest and how much is market noise or financial speculation.
[7]: That new computer platforms always seek higher i/o bandwidth may even be something like a fundamental law of computing.
[8]: Our ability to plan for our long term happiness vs instant gratification
[9]: As the tools become more powerful, our ability to combat the negative side-effects gets more powerful too. BCIs will be able to monitor and help regulate our own shortcomings when it comes to optimising for things like long-term happiness.
[10]: Approximately as cheap, safe and convenient as an MRI scan
[11]: Something that seems to be accelerating