On what we already know about minds that never look away — and what it might mean to wear the screen on your face.

A different kind of question
The first three posts in this series were about things we can measure. Eye strain, with tens of thousands of subjects across decades of optometry research. Inattentional blindness, with controlled studies in driving simulators and flight cockpits. Pedestrian deaths, with police accident reports and peer-reviewed papers. All of that is real. All of that is solid ground.
This last post is going to walk us off the solid ground a little, and I want to be honest about that up front. The question of what happens when augmented reality moves from a thing you sometimes use to a thing you always wear is, as of this writing, open. The glasses are not yet ubiquitous. The contact lenses don’t exist yet. The data set we’d need to answer the big version of the question hasn’t been collected, because the experiment hasn’t been run on a big enough population for long enough.
So I’m not going to make predictions. I’m not going to tell you what AR glasses are going to do to society in 2035. I have no idea, and anybody who tells you they do is selling you something. What I’m going to do instead is something a little sneakier and a lot more honest: I’m going to walk you through what we already know about what phones have done to human attention, memory, and presence — because phones are basically AR glasses that haven’t quite made it onto your face yet, and the research on phones is a lot further along than the research on glasses. Then I’ll let you do the math.
The mere presence of a smartphone
Here’s a finding that should make you sit up straight. In 2017, a group of researchers led by Adrian Ward at the University of Texas published a paper in the Journal of the Association for Consumer Research with the title “Brain Drain: The Mere Presence of One’s Own Smartphone Reduces Available Cognitive Capacity.” The study was simple and clean. They gave participants a series of cognitive tests measuring working memory and what researchers call “fluid intelligence” — basically, your ability to think on your feet and solve problems you haven’t seen before. What they varied between groups was where the participant’s smartphone was during the test. Some participants had their phone face-down on the desk in front of them. Some had it in their pocket or bag. Some had it in another room entirely.
The phone was off. The phone was silent. The phone was not buzzing or lighting up or doing anything to interrupt the test. The participants reported afterward that they had not thought about their phones during the test, and the researchers’ measurements confirmed it — these people were not consciously distracted in any measurable way.
Their cognitive performance got measurably worse anyway. The participants whose phones were on the desk in front of them performed worst. The participants whose phones were in another room performed best. The effect was largest for the people who reported the highest dependence on their phones in everyday life, but it showed up across the board.
Sit with that for a second. The mere physical presence of a powered-down smartphone in the room with you is enough to measurably reduce your working memory and your problem-solving ability. Not because you’re using it. Not because it’s interrupting you. Just because it’s there. Some part of your brain is allocating attention to it whether you want it to or not, and that allocation is coming out of the same budget you’d otherwise be spending on whatever you’re trying to think about.
Now consider that an AR headset is not in another room. It is not on your desk. It is bolted to your face.

The Google effect, or, what you stop remembering
There’s another line of research worth knowing about. In 2011, a paper published in Science by Betsy Sparrow and her colleagues introduced what they called the “Google effect” — sometimes called digital amnesia in the popular press. The finding was this: when people know they will have access to information later, they remember it less well. Not “less well than they would have if they were studying for a test” — less well than control groups who were given the same information but told they would not have access to it later.
The brain, it turns out, makes a strategic decision about what to commit to memory. If something is going to be available on demand, the brain treats it as already-stored and doesn’t bother to store it again. This is, in one sense, a perfectly rational allocation of limited resources. Why memorize a phone number when the phone will dial it for you? Why memorize a route when the GPS will guide you? Why memorize a recipe when you can pull it up in three seconds?
But the price of that rational allocation is that the thing itself — the phone number, the route, the recipe — quietly stops being something you know. It becomes something you can look up. And there is a real difference between knowing a thing and being able to look it up, and the difference shows up most clearly at the moments when you can’t look it up. The dead battery moment. The no-signal moment. The “I left my phone at home” moment. In those moments you discover that a part of your mind that used to hold things on its own is no longer in the habit of holding things, and it has to be re-trained to do the work, and that re-training takes time you don’t have.
This whole phenomenon has a more general name in the research literature: cognitive offloading. The basic idea is that when an external tool is reliable enough and convenient enough, the brain stops doing the work itself and starts treating the tool as part of its own cognitive system. The tool becomes, in a real sense, a piece of your mind that lives outside your skull. This is not a bad thing in moderation — humans have been doing it for thousands of years with everything from shopping lists to written language itself — but the modern version operates at a scale, and in a form, that no previous generation of cognitive offloading ever did.
What changes when the tool is on your face
Here’s where I let you do the math instead of doing it for you, because I promised at the beginning of this post that I wouldn’t make predictions, and I want to keep that promise.
We know from the brain drain research that the mere presence of a powered-down phone in the room with you measurably degrades your cognitive performance. Glasses are closer than the desk. Glasses are closer than your pocket. Glasses are on your face. What does the always-present version of that effect look like?
We know from the cognitive offloading research that when an external tool is reliable enough, the brain quietly stops doing the work itself. With phones, you have to make the choice to pull the phone out and use it, which is a small act of friction that your brain still has to overcome. With always-on AR glasses, the information is already in front of you. There is no friction. There is no decision point. The tool is just there, all the time, available without effort. What does maximum-convenience cognitive offloading look like, sustained over years?
We know from the Google effect research that information you know you’ll have access to later is information you don’t bother to remember. With phones, the access is fast but not instant. With glasses, the access is instant. There is no lag. There is no act of retrieval. The information appears in your field of view the moment you wonder about it. What happens to the memory of a generation that grows up never having to wonder about anything for more than half a second?
We know from research on phones in social settings — there’s a whole literature on this, going back at least a decade — that the mere presence of a phone on the table during a conversation reduces the perceived quality of that conversation, the depth of the topics that get discussed, and the empathic connection between the people involved. This is true even when the phone is never picked up. The phone is a signal, to both people, that one of them has somewhere else they could be. What happens to a conversation when one of the people is wearing the phone on their face the entire time, and the other person can never quite tell whether they’re being looked at or whether something more interesting is being looked at through them?
I told you I wouldn’t make predictions. I’m not making predictions. I’m just laying out the findings and pointing at the gaps where the new technology will fit. The math is yours to do.
A note on the IQ thing
There’s one piece of research I want to mention because it’s fresh and it’s a little spicy and it’s still under active debate in the field. For most of the twentieth century, average IQ scores in developed countries went steadily up — a phenomenon called the Flynn effect, named after the researcher who first documented it. Each generation, on average, scored higher than the previous one, by enough that the tests had to be recalibrated every few decades. Nobody fully understood why, but the leading theories had to do with better nutrition, better education, and a general environmental enrichment effect.
In the last fifteen years or so, in several developed countries, the Flynn effect has stalled. In a few countries, including the United States, the average has actually started to reverse. A group of researchers including the cognitive psychologist Barbara Oakley released a paper in 2025 called “The Memory Paradox” that proposed an explanation for the reversal. They argued — and I want to be clear that this is a hypothesis still being debated, not settled science — that the reversal correlates with two big shifts: a move away from direct instruction and memorization in schools, and a sharp rise in cognitive offloading to smartphones and search engines and now AI tools. Their argument was that the cognitive structures that produce intuitive reasoning and problem-solving ability are built through repeated retrieval and effortful recall, and that when people stop doing the effortful recall, the structures stop getting built.
I’m not going to tell you whether the Memory Paradox hypothesis is right. I genuinely don’t know, and the people who study this for a living don’t fully agree yet either. What I will tell you is that it is currently the most carefully argued explanation on the table for a real and measurable phenomenon, and that the mechanism it proposes is exactly the kind of thing that would get worse if everybody was wearing a search engine on their face all day.
I’ll let that sit there. You can decide what to do with it.
What I actually want you to take from this
Look. I’m not asking you to be afraid of AR glasses, and I’m not asking you to refuse to use them when they show up. I think they will probably show up, and I think a lot of you reading this will probably wear them at some point, and I think some of you will probably build them, and that’s good. The technology is real and exciting and full of legitimate promise.
What I’m asking you to do is hold two things in your mind at the same time, the way you hold the road and the speedometer in your attention at the same time when you’re learning to drive. The first thing is the excitement, which is real and which I share. The second thing is the awareness that this technology is going to change something inside of you, in ways the people selling it to you do not fully understand yet, and that the changes are not all going to be ones you would have chosen if anybody had asked you in advance.
The good news is that you get a kind of preview, because phones already did some of this. You can look at your own relationship with your phone and ask, honestly, how much of yourself you have already given to it. How much of your attention. How much of your memory. How much of your patience for boredom. How much of your willingness to sit alone with a thought for more than thirty seconds without reaching for a distraction. Whatever the honest answer to those questions is for you — and the answer is going to be different for everybody, and there are no wrong answers, only honest ones — that answer is data. That answer is your personal measurement of what a less-immersive version of this technology has already done to the inside of your head.
When the more-immersive version arrives, you will know better than anybody what to watch for, because you will already have been through one round of it. The people who came of age before phones did not have that advantage. You do. Use it.
A closing question I’m not going to answer
Here’s the question I’ll leave you with, and I promise I am not going to answer it for you, because I genuinely don’t know the answer and I don’t think anyone does yet.
We know what humans are like when they spend their days looking at the world. We have thousands of years of poetry and philosophy and art and science about what that experience is and what kind of mind it produces. We are starting to learn what humans are like when they spend their days looking at the world through a phone, and the early returns suggest the mind it produces is different — not necessarily worse in every way, but different, and different in ways the people producing it didn’t all sign up for.
We don’t know yet what humans are like when they spend their days looking at the world with the world’s information already overlaid on top of it, all the time, with no lag and no friction and no possibility of putting it down. Some of you reading this are going to be the generation that finds out. Either because you wear them, or because you build them, or both.
I just want you to know, before that happens, that the question is real, and that you are allowed to ask it, and that asking it does not make you a Luddite or a coward or somebody who’s afraid of the future. It makes you the kind of person the future actually needs — somebody who notices things and asks questions about them and refuses to let the brochures do their thinking for them.
Go try the AR. Try the headsets, try the apps, try the goofy demos at the science museum. Have fun with it. Build things. Be amazed by what’s possible. And every once in a while, when you take it off, sit quietly for a minute and notice what your mind is like without it.
That noticing — that quiet checking-in with yourself — is one of the most important skills you can develop in this era, and nobody is going to teach it to you in a class. You have to teach it to yourself. And the good news is that you already have everything you need to start. You have a mind, and you have a few quiet minutes, and you have whatever curiosity brought you to the end of this post.
That’s the whole tool kit. Go use it.
This is the final post in a four-part series on the cautionary side of AR. The series started with your eyes, moved through your attention and your body, and ended here at your mind. The lesson it accompanies is “Try AR yourself.” Now you know what to watch for. Have fun out there.