Pssst! Someone put a chip in your head.

Alexandra Samuel
7 min read · Feb 14, 2024

I love implanting computer chips in my friends’ brains.

At least, I did until recently, when Elon Musk announced he’d gotten in on my action by actually implanting the first Neuralink chip in a live human subject.

That’s when I realized Elon had killed what has long been one of my favorite conversational gambits: Would you put a computer chip in your head?

Almost everybody says no. But that’s where the fun begins! It usually doesn’t take me too long before I can pinpoint the ambition, dream or annoyance that makes my pals realize that actually, they would get a chip in their head:

Yeah, I guess I would, if it would let me write as well as my very favorite novelist.

Yes, if I would be able to instantly speak any language, no matter where I traveled in the world.

Hell yes, if it meant I could permanently keep track of every object I ever put down, and would never have to go looking for my keys ever again.

My own imaginary threshold for implantation has never been that high.

Yeah, if it would mean I could write by just thinking the words, without even typing or dictating.

Yes, if I could remember every Excel formula without having to look it up.

Hell yes, if I could stop using headphones and a remote control when I’m watching TV.

The power of the chip test

These conversations have always interested me because they get at the core pain points technology has still failed to solve and the great hopes it has so far failed to deliver on, and they establish the level of benefit any one person would have to get in order to cross the line to tech augmentation. (It’s different for every person, but it’s rare that I find someone who says no way, no how, no matter the safeguards or the hypothetical temptation.)

That changed, at least in my still chip-free head, with last week’s announcement. Whenever I take on the rhetorical challenge of trying to sell a friend on a hypothetical implant, trust is the primary reservation. The potential benefits of an implant, no matter how amazing, have to be weighed against the risks of implanting a technology that someone else could intercept or control.

I usually have to stipulate to all kinds of imaginary trust-enhancing conditions (“assume you can read all the code and know exactly how it works, so you’re confident it’s entirely in your control” and “it runs on a proprietary frequency that is unique to you, so nobody else will ever be able to access or intercept it”) before I can get someone to accept an imaginary implant, even if it promises to turn them into a jacked hybrid of Albert Einstein and Frida Kahlo. (If you had a brain implant, you would already instantly know what that person would look like and achieve, instead of sitting there stuck on the question of whose eyebrows they’d have.)

Think you’re chip-free?

It’s one thing to promise a trust-maximizing scenario. It’s another thing to witness its opposite. I mean, if you were going to ask me the hypothetical question, Who is the last person you’d allow to implant a computer chip in your brain?, Elon Musk would be pretty high on that list.

Which is why it’s so weird that so many of us have allowed him to do just that.

I don’t mean the Neuralink recipient. I mean all of us who continue to use Twitter — or for that matter, any other app or gadget that is a constant part of our daily lives, but created and run by people we don’t particularly trust. We’re letting these technologies, companies and people into our heads by adopting the apps and devices they market to us, accepting the default settings that determine our day-to-day experience, and often, handing over our data in the process.

So let me suggest three things you can do, starting today, to take back a little control.

Recognize every technology as a brain implant

The first and most fundamental shift: Recognize that every technology you use is analogous to that Neuralink implant.

Whenever you look at Facebook or LinkedIn or TikTok, you’re letting someone else’s algorithm determine who and what you see — in other words, letting it into the social part of your brain. Whenever you look for something on Google or Amazon, you’re letting the search results subtly influence your sense of what’s relevant or desirable or important. Whenever you send a text or draft an email or write a document in Word, you’re letting the interface guide your word choice or the form and pace of your self-expression.

None of this is new, of course: Our minds and ways of looking at the world have always been influenced by our tools. Writing with a fountain pen must have felt new and perhaps liberating to people used to writing with quills; the first libraries, and the first encyclopedias, changed the importance of keeping every piece of information in your own head.

But now it’s happening at a whole other scale and pace, and in a way that’s far less transparent. We let so many interfaces and companies and tech barons into our heads today, just to meet the baseline expectations of professional life or social interaction, that we don’t have time to notice how each one works its way into our ways of thinking or being. We don’t stop to ask: Is this the guy I want inside my head?

You probably won’t give up on technologies just because you don’t like the person or team behind them; I certainly don’t. But recognizing that you’re letting someone else into your head is the first step in resisting or mitigating the impact of that manipulation.

Audit the cognitive and emotional impact of your technologies

It’s easiest to see the impact of a new tool or network when you first start using it. I’ve really noticed how my thinking and writing processes have shifted over this first year of working with AI, but I’m betting in a year or two that will be so integrated into my workflow that I won’t even remember what’s changed. So whenever you start using a new app or gadget or platform, and especially when you start using a whole new type of technology (like AI), spend some time reflecting on how it changes your work process and consider jotting down the pros and cons for future reference.

You can and should perform this sort of audit on technologies you use regularly. Tech marketers love to draw our attention to all the time-saving features of our apps and gadgets, but the cognitive and emotional impact is just as profound as the productivity boost.

Perhaps Instagram leaves you feeling energized after five minutes of scrolling, but envious and drained after twenty. Maybe writing feels fluid and rewarding when you work in a barebones, distraction-free app like Ulysses, but tedious and painful in a word processor like Word. Consider whether working with AI leads you to chew through tough problems in record time, or leads you down time-wasting rabbit holes.

Yes, you could fall down one such rabbit hole by trying to answer this question with data or AI, but an easier approach is to make a habit of pausing every twenty or thirty minutes to notice how you are thinking and feeling (literally a ten-second self-check) and keep an eye out for recurring patterns. (Consider using an app like BreakTimer to prompt these reflections on a regular schedule.) Don’t just watch out for the stuff that grinds you down; look for the activities and interfaces that unleash your joy, energy and creativity.

Tweak your tools

The good news is that even if you’ve given the world’s biggest tech companies the keys to your brain, it’s not an all-or-nothing situation. One reason I’m such an inveterate tech tweaker is that I really like to adjust my tools so that they make me feel happy and focused, and to plug any holes that leave me feeling low.

Here are some examples of simple customizations that let me reclaim my brain and heart from the algorithm and interface designed by other people:

  • Extreme friend hiding. Whenever I find that a particular person’s Facebook posts leave me consistently irritable or blue (whether from jealousy at their good fortune or despair at their foolishness), I hide them from my feed. As a result, Facebook rarely leaves me feeling gross.
  • Home screen organization. The apps on my iPhone home screen aren’t the ones I need to use most often; they’re the ones that leave me feeling happy and energized instead of aggravated. Since they’re one click away, I use them more, and other things less. Plus it means that looking at my phone doesn’t produce an instant associative ick feeling.
  • DIY apps. I create a lot of my own work tools (like my task tracker and my story idea tracker) so that I can have the interface I want. I use whimsical doc names and icons that make even my tedious tasks feel more human and friendly.
  • Inbox restructuring. I’ve written about how much I like the email app Superhuman; one of its features is a “split inbox” that lets me channel different types of email into different views. Making “Workplace news” and “AI news” into menu items in my email app is an easy way to ensure I look at the key newsletters I want to see, not just whatever’s at the top of my newsletters pile when I happen to have a free moment.
  • Bartender. This little Mac menu bar utility lets me decide what appears in my menu bar so that it reflects what I think is important — not what Apple does.

Your life as a cyborg

Freaky moments like the recent brain implant news are useful, because they’re a prompt to reflect on how much technology is already effectively embedded in our heads. (And yes, mine even more than most!)

Precisely because we live in such a technologized world, it feels a little arbitrary to be spooked by the idea of a brain implant — like keeping the technology outside our bodies and brains is any proof against its influence. If technology has a dramatic impact on our productivity and accomplishments, that’s not despite the fact that it changes how we think and what we do, but because of how it reshapes our thoughts and activities.

Because we don’t just use technology: We fuse with it.

We are already cyborgs.



Alexandra Samuel

Speaker on hybrid & remote work. Author, Remote Inc. Contributor to Wall Street Journal & Harvard Business Review.