I write about a lot of near-future tech in my Singularity’s Children books. One example is the Spex, a combination of neural interface and augmented reality (AR) head-up display.
To get an idea of what it will look like from the inside, check out this over-the-top, but probably (sadly) all too accurate, video from Keiichi Matsuda.
I believe AR is going to be huge, but it will only really take off when combined with at least enough neural interfacing to extract sub-vocalized speech from the brain (or perhaps from nerves in the throat). Nobody is going to want to wander around in public mumbling to Alexa or Siri and risking eavesdropping!
Of course, it’s only a short step from extracting superficial speech to full-on brain-to-brain communication, a.k.a. telepathy.
And if this sounds a bit futuristic, scientists are already able to extract and reconstruct the images that subjects are observing.
Recording dreams is coming! How crazy is that?!
For even more, if you have a spare half hour, check out this super-interesting interview with Mary Lou Jepsen on Rob Reid’s ‘After On’ podcast.