Our Best
Rambles

In this Ramble:
1. The best in ourselves and others
2. Living until you don’t
3. How decapitated worms may unlock memory—for better or worse
4. “Never Again” and “Not Again”
Image Credit: “Portrait of Alyssa” by Shane Scribner, www.scribnersgallery.com
Oh, the People You Meet…
Marcus Aurelius, Emperor of Rome, the last of the Five Good Emperors, and as close as one can come to a Philosopher King, is as famous for his “Meditations” as for his political achievements. While a good glimpse of Roman Stoicism, “Meditations” is a collection of notes Marcus jotted to himself on life and how to live it. Much closer to a diary than an essay on his philosophy. You can tell what he struggled with by how often a theme recurs in his “notes to self.” Turns out, Marcus struggled with other people.
Imagine that.
Imagine, further, if you can, that the Emperor of Rome had problems with people being honest with him. Weird, right? It seems some people wanted things from him and his political power—sometimes by any means necessary.
Marcus repeats to himself that the people he will deal with today will be vain, mendacious, violent, selfish and cunning. More cynic than stoic, and a little more detailed than our modern “people suck,” but relatable. Then the Philosopher King appears. Every time Marcus reminds himself that people will suck, he -immediately- tells himself they do so because they are not in touch with their Nature, the Logos (best thought of as the spirit of God, or at least the unifying, logical consciousness driving the universe). If they were, he reasons, they would not choose to be vain, mendacious, selfish and the rest. He often reminds himself not to judge them, but to remember his responsibility to help them see their error, by word or deed or example. To Aurelius, the wisdom of the Logos is to act with kindness and justice.
So why do people suck? How and why do we lose touch with our better nature?
Tough question. “Woe to the world because of the things that cause people to stumble!”
For the parents out there, have you noticed, since having children, how you sometimes look at an adult but think of the baby? No one ever warns you this happens when you have kids. But it does. I know I’m not the only one this has happened to. Even more odd, when this happens, it feels not like a burst of insight, but like recalling a memory. Something obvious, so obvious, that you had somehow forgotten.
“Yet, at one point,” you think, “that person was a smiling, happy little baby.” Likely with a mother who loved them very much. That much is absolutely true of them—even if they are as far from Aurelius’ wish for their better nature as they can be at the moment. We did not start out this way, acting the way we sometimes do. We all, -all-, really were more innocent, more simple in our wants. More honest.
We laugh at how blunt kids are at times. “How silly—this kid isn’t old enough yet to have learned to restrain her honesty. Kids really do say the darnedest things.”
That’s how we grow up. In size, certainly. In spirit, perhaps debatable. We use “acting like a baby” or “acting like a child” as a pejorative, probably doing a disservice to this innocence, simplicity and honesty. But still, “acting like a child” often enough captures the degeneration, the regression, of the behavior we are condemning.
After all, as we get older, we learn to want new and different things. Some are things we need. Some are not. Some we see, or hear, or taste, or touch and desire immediately. Some we learn to desire or are conditioned to desire. After all, we did not fuss for social media “likes” when in the crib. Desire warps us. At its excesses, desire can warp us beyond recognition. Take the addict. The addict desires release or a high, takes their chemical of choice to achieve it, and then gets stuck. Desire led to a compulsion that has pushed out every other good in their life—family, friends, job, the law. No one chooses to be an addict. But they did choose their desire, and their method of trying to fulfill it. One bad choice in pursuit of an illusory “good” can be all it takes. The stronger the desire, the stronger the temptation, the more we obsess and focus, elevating that desire to the exclusion of other goods, other needs, or more importantly, -others’- goods and -others’- needs. That is the moment you lose Marcus’ script. You lose the Logos. You lose your better nature.
No wonder all major religions agree that desire is THE motivation behind poor moral choices. At least, my limited understanding of Buddhism is that nirvana is found in the absence of desire and attachment to the material world. To the Abrahamic religions, desire sets the table for temptation, when the object of desire is available, but at the cost of sin: “Rather, each person is tempted when he is lured and enticed by his own desire. Then desire conceives and brings forth sin, and when sin reaches maturity it gives birth to death.”
Indeed, behind all seven deadly sins is desire. Pride, the desire to be better than, to have -your- will done no matter the cost to others. Greed, simply desire upon desire upon desire, never satisfied, always finding another want. Perhaps the exponential desire of Greed is why it is the sin Jesus sermonized against most often in the gospels. Lust, which sees others as objects first, to satisfy yourself with. We have even made lust synonymous with desire, and our modern “improvement” on the sin is to use lust in advertising to get you to desire products and buy them. Sex sells, right? Envy, a desire to see others brought down if you cannot rise up. A desire that human excellence should not exist if -you- are not its pinnacle. Gluttony, a desire for more than what you need—and a deadly sin when it deprives others of what they need. Wrath, the desire for violence, even the end of another. Sloth, a desire to shirk difficult but necessary duty.
Desire, desire, and desire.
But there is nothing new under the sun. Humans then as humans now see and -want-. Then, as now, we glorify ourselves in our desires. Give ourselves excuse and permission, particularly for those desires that we know might be bad for us if achieved, or bad for others. So, as from the dawn of humankind, we tell ourselves we deserve what we want. Our desire’s object is -owed- us.
If there is original sin, found in every human, it is -that- rationalization.
So we “grow” from the innocence and earnestness of our earliest beginnings to become our own little godling, desiring our own glory. Look closely, and honestly. Consult your doppelgänger. You will see it somewhere.
If not, congratulations on your forthcoming application for sainthood in the religion of your choice.
Speaking of the loss of earnestness as we age, look no further than our daily actions. How context-dependent are your reactions and interactions over the course of your day?
How often do we do more than just dress up for a formal event? How many of us have a “professional” mode at work—whose tone, language, dress and manner are different once you are off work? Are you the same on the first date as you are in the 7th year of marriage? How odd that society demands we put forward a very specific, “best” image in certain times, situations and places.
My 6-year-old son wouldn’t know the difference, or why maybe not -those- specific shoes for where we are going. Sometimes that is practical—boots are a better choice in the rain than crocs. But sometimes I know I am absolutely conditioning him: because society expects your shoes to match, and might otherwise judge you as disordered and possibly deranged if you walked into work with one cowboy boot and one Minecraft sneaker, you must match your shoes. His “why?” has good intention. My response is…well, superficial, and typically the tyranny of “because I said so” or “it doesn’t look right.” What’s the real message of the latter? Your earnestness and honesty matter less than the image society expects you to present.
Well, we are social animals. We must create a society, and that demands written or unwritten limits on the individual to preserve it. And we all -like- that like counter. Anyone who tells themselves that they don’t enjoy validation by society is deceiving themselves. Be it the active counter of your “likes” and “followers,” or merely the tacit acknowledgement that you are a good and decent, respected and acknowledged member of your society. So we make these small sacrifices of individual preference, sublimate ourselves, for those benefits of society.
“The people you deal with today will be vain, mendacious, violent, selfish and cunning.” Some will also be magnanimous, generous, peaceful and self-sacrificing. Some will be joyful and seeking to spread their joy. Some will be suffering.
You won’t necessarily know which. Some will be sublimating strategically, trapped by their desire, looking to secure it. Some will be sublimating to keep their best foot forward, because society demands it. Even those who suffer—those whom you would instantly leap forward to help, if you only knew.
When they meet us, how often are we doing the same?
How many people do you know who have -any- idea of the full list of struggles, failures, and triumphs you have had or still have in your life? We all know different joys. We can all look at different paintings and come to different conclusions on which is more beautiful. We all place slightly different values on things—and those may change depending on time and place for us. Suffering we all know. Failures, mistakes, we all know. Desire. Need. Yet how often do we hide those? Like an eternal first date, we are constantly trying to present only the -best- version of ourselves to the world.
The loss of innocence isn’t when we learn there is evil in the world. We lost innocence when we lost that earnestness and honesty we had with ourselves and others back when we were children.
So remember—people you meet today (or see on TV) will be generous, loving, peaceful, self-sacrificing, noble and virtuous. People you meet today are almost certainly suffering in some small way—if not, then share their good fortune and joy. People you meet today (or see on TV) will be vain, mendacious, deceitful, greedy, violent and so on, because desire has gotten the better of them, and they have lost the script. You won’t know which if you don’t connect better, more honestly, and dare to make the effort. Don’t despair of those lost to desire, acting badly or just in bad faith. Desire has gotten the better of you at least once, and caused you to lose your script and betray your better nature. -Remember-. Given the opportunity, help steer them back to their better nature. All while making sure that you, yourself, are not being led astray by your own desires. Given our infinite capacity for rationalization, we should be prepared to accept steering ourselves when it is obvious to others (but not to us) that desire is taking us off course too.
Easy enough. Right?
Live Until You Don’t
No one gets out alive.
No one.
When my father was in clinical practice and had to give a diagnosis of cancer, he would schedule two visits. He would say the same thing in both visits. In his experience, though, the first time you say the c-word to a patient, they don’t hear anything else. The entire first visit was a waste after Dad said “You have cancer.” Only on the second would the patient be listening and participating in how their diagnosis would be treated.
So many freeze and shut down when they hear “you have cancer.” But it’s not cancer itself that is the fear. Cancer merely makes the fear proximate.
The fear, the real fear, is death. We tell them “you have cancer”—but what they hear is “you are mortal—and in mortal danger.” “You have something you know from others and experience may very well end your life.” “You are in danger of dying.”
But it’s not just death, is it? If I say “chemotherapy” to you, the first images you conjure in your head are bald patients wheeling IVs through hospital halls, frail and obviously sick. You’ll see nausea and vomiting that don’t stop. Pain medication. “Poison that kills the tumor faster than the patient.” And we know that chemotherapy comes with cancer. So I say to you “you have cancer” and you hear “death.” “Death and suffering.”
We fear suffering too. We do not like to suffer, and we would very much prefer to be infinite, not mortal. The long history of accusation against religions is that they are all myth, claiming the infinite without evidence, to calm and soothe us from contemplating our terrible mortality. Yet the secular world tells itself similar myths. We invent a multiverse, an infinity of universes, where you, or some facsimile or approximation, goes on existing. We claim it is all a simulation, see, so you never really die, never really suffer. No one does. It was never real at all. Life wasn’t real. We have great minds of science and technology desperately seeking, like Ponce de León of old, the fountain of youth by chemically reversing aging. Others seek the Philosopher’s Stone, looking for some sort of mind-machine interface, to “download” our consciousness into silicon. Like the multiverse, distribute your consciousness among enough servers, and some shade of you will “live” forever. As if data does not corrupt, and silicon is indestructible.
See how we fear death. How we are outraged at the indignity, at the very idea, of death in the stories we tell, and the desperate measures of avoidance we take.
Yet the fact remains that more humans are dead than currently live. Many more.
No one gets out alive. Not me. Not you.
And to suffer. Oh, what many of us will do just to avoid suffering even a little. We run like children from the jab of the inoculation needle. We celebrate as the best of us those who suffer for others—for such is our fear and distaste of suffering that one who chooses to suffer for another is rare indeed.
We get so good at telling ourselves we won’t really die, and so good at avoiding suffering, or at least avoiding acknowledging it. So good. So good, in fact, that we freeze when told “you have cancer.”
Our bubble of self-imposed ignorance is burst in that moment. Yes, most of us go through our lives having forgotten, fortunately forgotten, that no one gets out alive. And you will suffer, inevitably, somewhere along the way. When that bubble is popped, when we are again confronted by our mortality, reminded that suffering exists and yes, we too may suffer…it is an abrupt violence.
Must it be?
I know patients with cancer who have gone down one of two roads. On the one, the disease becomes a blessing. Yes, a blessing. Even the suffering of treatment and disease becomes a blessing. The proximity of their mortality seems to sharpen their focus. Of course, there is at first the fear. The fear of the disease. The fear of death. The troubling reflections of that fear that they see in the eyes of their friends and family. But then the treatment starts, and an important principle comes into focus.
Nothing, if or when the situation you imagine actually occurs, is ever as perfect as you imagined it. And this cuts both ways. All of our fantasies, yes, even the sexual ones, are always most perfect in our heads. Everything goes right. Your partner, or partners depending on the audacity of your fantasies, always say and do exactly the right thing. Reality, though, is far different. They are other people, and what is perfect for you may be slightly imperfect to them. I have found, for example, that my interest in many women was not quite the same level of interest they had in me. In my imagination though, they always seemed interested. Frankly, enthusiastic. Alas, -reality.-
So it is with suffering. We imagine the very worst, the -very- worst, that can happen to us. We hear cancer, and picture ourselves withering away. Dying in terrible pain, slowly, inexorably. Of eternal nights spent praying to the porcelain gods. Of a life lived as a burden, and bounded by the distance from the hospital bed to the hospital bathroom. As perfect as we can imagine the best sex we would ever want to have, we can be equally perfect, just as exacting, with our dreads.
But then it starts. The battle with the disease begins. And just like your greatest sexual fantasy, you find it isn’t quite what you imagined. Or even find that you are tougher than you imagined. Perhaps the chemo is bad, but you don’t quite have all the side effects you dread. It appears the treatment has modernized. Fewer total cycles. For some, those lucky ones, they find a depth of connection with friends and family they may not have realized existed before. The prospect of loss brings forth from patients and their closest ones that wonderful turn of phrase—“the fierce urgency of now.” They start having conversations they should have, wondering why it took this to provoke them. They start all those tiny little nonverbal demonstrations of affection. A blanket. Tea, perhaps. Held hands while the TV drones on with something that is now suddenly mere background. They may even start to wonder why now, why did they waste all that time to make these gestures? Or realize they were there all along, all for the taking, and were instead taken for granted for so long?
We often blame God for either causing, or at least permitting, suffering and catastrophe. We focus on the calamity, never wondering, it seems, if God permits them as opportunities to rise. The chance to see, in ourselves, finally, what God sees in us.
What a commentary on human wisdom and insight that it takes the force and pressure of such extremes for us to see what we were truly meant -for-.
Just as natural disasters (earthquakes, wildfires, hurricanes, etc.) bring out our best, our kindness, our sympathy, our sacrifice, our courage—so cancer seems to for some patients, and some of their friends, and some of their family.
Other patients take a different approach. In these patients, the fear never leaves. Instead of asking themselves “Is that it? Is that what I had worked myself up over?” when the battle against cancer isn’t quite their imagined worst case, they seem to ask themselves “what’s next?” Fortune is clearly lying to them. The worst case merely hasn’t happened yet, and any relief they feel must be a trick, setting them up only to pull the rug out from under them and save the -really- nasty parts for later. In thinking this way, they hang the sword of Damocles above their own heads, and try to live underneath it. This dread of impending disaster, the conviction that anything good is just a lie designed to make the inevitable catastrophe worse, stalks them.
Now, I don’t say this in judgment. You can say what path you would -like- to be on, but until you have come to this crucible yourself, you don’t actually know. Most alarmingly, I can understand this path of constantly bracing for something worse to come. After one of the worst things that can happen—being told you have cancer—how could you go on -trusting- fortune?
However, once you start to tell yourself that any good, no matter how small, is but an illusion and something far worse is yet to come, you can, unfortunately, make such thinking a habit. You may rationalize this habit as steeling yourself for disaster, getting prepared for it. Didn’t the Stoics advocate consciously reminding themselves that fortune’s gifts are fleeting, and that by imagining the worst you brace yourself to endure it? Of course they did.
But only if, when you imagine the worst, you make the critical choice to -also- tell yourself that you -will- endure and triumph over it, and -how- you will do so. No matter how small the triumph. That is a plan to keep. Anything else is merely a plan to suffer as much as you -possibly- can.
So what ultimately happens is this. If you choose to tell yourself, over and over, that the cancer will come back. That remission is not cure. That the worst is yet to come, and when the cancer returns it will be unstoppable. You will be overwhelmed, suffer terribly, and die. If you choose to tell yourself these things, rather than how you will endure recurrence, how others will take courage in seeing you handle this adversity with courage and determination, you give cancer and death full power over your life. You will risk becoming bitter, raging at a fate you did not deserve, waiting for that call to the executioner’s chair in fear and dread. That fear and dread and bitterness can -only- make you lonely, and convince you that you, alone, are the only one who suffers like this. After all, -you- are the only one dying of -your- cancer. Others cannot “catch” it to truly know your suffering. So you tell yourself, and withdraw from even the smallest kindnesses of others, bit by bit.
This is no way to live.
Now, I have used cancer as an example, but more broadly, it is the implications of our own mortality that create these divergent paths. Cancer is merely the proximate cause—for some of us. The truth is, just as if we had cancer, we are all facing death. We will all suffer. The only differences are the manner, degree, and timing of suffering and death.
You have cancer. You have -always- had cancer, from the moment you were born.
No one gets out alive.
Now, if I think about it, if I -really- think about it, there are times I react to fear of death and my own fantastic imaginings of all the worst that can happen to me, and I rise through them. I respond like a cancer patient on the first path. My best comes forth. Other times, I am not so sure that I don’t take the road of self-pity, fail to see myself enduring, and take the second path. Once I see that there are these two paths, I can tell which one I am on. I can watch my thoughts, and make sure that I picture my own perseverance through my “worst case” ideas. I can see others on those paths, encourage those on the first, and forgive the lashings born of bitterness and self-pity, the desperation, of those on the second until they can see the first path. For I have suffered too, and I know their burdens. They are my own. I can decide to use my time more wisely, no matter what hand the future may deal me—confident.
Just live, -well-, until you don’t.
Regrowing Memories (and Heads) For Science
You can cut off a flatworm’s head, but you only slow it down. Decapitate it, and it will regrow its head.
But that’s not the crazy part. Oh no.
As the head regrows, so do the memories they had in the head you cut off.
So they may remember that -you- are the one who did the decapitating when they seek their unholy revenge.
Yes, meet the most bizarrely ignored scientific paper of the last 6 years:
https://jeb.biologists.org/content/216/20/3799.full
So in short, what these guys did is take a bunch of planarian flatworms to study their memory of a simple task—navigating across a petri dish to some food. Worms that have never been stuck in a petri dish before are timid, and stick to an edge, even if the food is there. After they get used to the new lab-made environment, they will go check out the food. But if a worm has been on a petri dish with food before, it will go looking for the food sooner and generally more accurately. Having confirmed that, these guys then cut the heads off some worms. For most creatures, this is a fatal problem. However, decapitation for a flatworm is merely inconvenient. The worm is very much still alive. Comparing decapitated worms that were used to being in petri dishes with decapitated worms that were not, there was no difference in how they eventually moved toward the food, although they don’t appear to “eat” it. Engaging with food requires a flatworm to have a head and “consciously” decide to go after that food. Next, they let the heads of the worms regrow, and here is where things take a turn for the bizarre.
If you took a worm that had been in a petri dish before, cut its head off, then let it finish regrowing, that worm was MUCH faster to find and eat food than a worm that had never been in a petri dish before, got decapitated, and regrew its head.
Regrowing the head also “regrew” the -memory- of being in a petri dish.
Think about that for a second. And another second.
Because the “how” is a total mystery.
Whatever new neural connections had been formed in the worm’s noggin to create its memory of a petri dish -somehow knew to come back the same way- when the head regrew. Even though the head where those memories were presumably located had been cut off and thrown out.
This raises interesting questions about where memory actually physically resides. We default to assuming the brain. Fair enough, there are specific areas of the brain that we know, if damaged, result in a loss of memory. Interestingly though, it’s not complete memory loss. I was taught in medical school that the brain structure most important for memory is the hippocampus. There are characteristic microscopic findings of Alzheimer’s in the hippocampus, such as plaques and tangles, and the hippocampus will physically shrink in size as neurons within it are lost. Chronic alcohol abuse can lead to thiamine deficiency, which in turn creates Korsakoff syndrome. These patients have trouble forming new memories, and typically will have memory loss of some older memories, often confabulating, which is medicalese for “making up stuff to fill in the gaps in their memory.” Acute use of many drugs can cause anterograde amnesia—such as getting “blackout drunk” with alcohol. Many drugs used in surgical anesthesia do this as an added “benefit”—you don’t remember any of the surgery, as part of the pain control.
What’s interesting though is that the hippocampus, while important for memory, largely handles memory access and processing. Alzheimer’s patients, for example, don’t suddenly forget a language they are fluent in. They can remember major historical markers, and can name a president if asked. The processing damage in their hippocampus just means they name the wrong one—not the current president, but someone from the past. Alzheimer’s patients will also famously have lucid intervals—periods of a few minutes or longer when the hippocampus somehow rallies and they can remember -everything-. Often, this is a cruelty, because they now clearly remember how their function is falling apart. But it is again consistent that their pathology is a processing and recall issue, NOT a storage problem. “Muscle” memory is rarely an issue for them until very late in the disease process, either. They largely remember how to walk, even how to drive. They just don’t remember where they are going, or remember it wrong, because their misfiring hippocampus has them trapped in their past.
So what the flatworm experiments suggest is that the actual substance of memory is stored somewhere else—somewhere outside the flatworm brain. The processing portion, the part which accesses memories and brings them to the fore, is what is being decapitated. When the processing portion regenerates, it is somehow able to access the memory of being in a petri dish—since that was stored somewhere else in the worm.
But where?
Same for humans, to some extent. The lucid intervals of Alzheimer’s suggest memories are physically someplace other than the hippocampus. The hippocampus is just the physical location for the “interface”—where the brain “goes” to request a specific memory. The hippocampus is more akin to the “finder” program on your computer, organizing, indexing, and helping access memory. Nor is it the only physical location for critical processing functions. For example, take Wernicke’s and Broca’s aphasias.
Image Credit: NIH publication 97-4257, https://www.nidcd.nih.gov/health/aphasia (https://www.nidcd.nih.gov/sites/default/files/Documents/health/voice/Aphasia6-1-16.pdf), Public Domain, https://commons.wikimedia.org/w/index.php?curid=352704
If you have an injury to those areas in the picture, which are sections just above and atop the temporal lobe of the brain, you get very specific defects in communication. In Broca’s aphasia, any language the patient is fluent in is understood quite well. They just can’t speak it, and will have “gaps” in their sentences or, in extreme cases, can only use single words to communicate. One particularly courageous patient has an interview posted on YouTube, so you can see what this looks like: https://www.youtube.com/watch?v=JWC-cVQmEmY. Patients with Broca’s aphasia are often frustrated, because they -know- the language, but can’t access the -memory- of precisely what they want to say.
Even though that memory is still there.
We have no idea, though, where the memory of which word goes with which idea physically -is-, in brain, or body, or cell.
Somewhat in contrast is Wernicke’s aphasia. These patients can speak and access all the sounds and words of their language, but cannot make sense of any of it. Wernicke’s area of the brain is where the sounds and words are connected to the actual ideas. When that connection is broken due to injury to Wernicke’s area, you get a patient who communicates in “word salad”—strings of perfectly pronounced words with no connection to each other. Another courageous patient has an interview here as an example of what Wernicke’s looks like: https://www.youtube.com/watch?v=3oef68YabD0. So these patients can access the memory of words and sounds, but cannot connect them to the memory of what those words and sounds mean, when Wernicke’s processing area is damaged.
We know where some discrete memory accessing and processing centers are located in the brain.
So where are memories themselves “physically” stored?
Interestingly, for muscle and sensory processing (“muscle memory” and “how your brain remembers that a particular nerve in your right arm is in your right arm”), we do have a good idea of physical location in the brain. They run in narrow strips along the top of your brain:
This is the “cortical homunculus” which, yes, looks a little like a horror show. But it diagrams the physical brain location for muscle and sensory control of the parts of the body as you move from the top of the brain (where knee and leg sit in the motor (red) and sensory (blue) sections of the diagram) down along the sides. Here live the neurons running all the way down to control those particular muscle and sensory groups—but those neurons are the minority in these regions. The rest exist to “vote” on direction, strength, and type of movement. For example, you can give a small electric shock to part of the red “shoulder, arm, elbow, wrist and hand” section, and get complex movements, like reaching out and grasping something. This is the “muscle memory” of how to reach out and grab something, and it has a known, physical brain location.
But of course, these are -physical- processes, which require direct connection of muscle and nerves to the brain, so the brain can direct your movements and respond to your environment.
We don’t have a similar homunculus for -concepts-. We don’t have a map of where all the “English Verbs” are stored in the brain, or “Simple Addition” or “Geometry”. There is no single consistent location in the brain where, if you have a stroke or other injury, you completely forget numbers. Instead, it seems the processing, organizing, and accessing of the memory of what numbers are, or history, or specific nouns, is more distributed. Differences in clinical presentation between two patients with Broca’s aphasia, or two patients with Wernicke’s aphasia, who have the same degree of damage to those areas, such as the specific words the Broca’s patients struggle to find, and their ability to recover some of the lost function over time (as the areas heal and regenerating processing centers reconnect to the “physical” location of the language memories), strongly imply a distributed location for those memories, with slight variation among individuals. At least to me. There could be someone with more knowledge of neuroscience than my “basic med school” screaming at me through their computer right now : )
So one of the interesting possibilities that decapitated flatworms with remembering, regenerated heads raise is that the physical location of some of that memory is -not- in the flatworm brain. Once the processing center regenerates in the brain, and reconnects to wherever the physical memory is located in the flatworm, the worm “remembers.”
So we ask: in humans, is the physical storage of memory more distributed within the brain, or is there also memory physically stored and distributed elsewhere?
These decapitated flatworms certainly suggest “elsewhere” is a real possibility for at least some kinds of memory.
But where and how?
Maybe it is as primitive as some of the systems that allow single-celled organisms, like bacteria and slime molds, to cooperate as colonies to explore and thrive in their environment. These use physical connections of the cell’s “skeleton” and molecular signals to nearby neighbors to “think” absent an actual defined brain. This has led some meetings of scientists studying these kinds of behaviors and systems to begin thinking of “liquid” and “solid” brains, and how cognitive function may arise from them: https://royalsocietypublishing.org/doi/10.1098/rstb.2019.0040. We know that our own organs, for their specialization and function, have different variations of physical arrangement, connection, and cell-to-cell signaling molecules. We also know there are nerve cells, particularly of the autonomic nervous system, all over the body. They are closely associated with the heart and the GI tract, for instance. And we all know that our mood can affect our heart rate as effectively as exercise—and can make us feel as queasy as a bad burrito. This isn’t to suggest that -memories- are distributed along this system. After all, you don’t injure your leg and forget your pet’s names as a result. But there is a distribution of cognitive function throughout the body, and its signals help us to identify a mood, or possibly even create it. Are we nervous first, and THEN the heart rate increases and we feel butterflies in the stomach—or do those changes happen in those organs first, get communicated back to the brain, and only then do we recognize we are feeling nervous?
We also know that smell is a very powerful sensory association with memory. Just a fleeting smell of something particular, especially if present at an emotionally powerful time, can cause the mind to bring that memory back to the fore. Cookies that smell like Grandma’s can cause us to remember Grandma. And in fact, a deteriorating sense of smell is a predictor for the onset of dementia. Is this because of processing impairment in the brain, with difficulty accessing memories of smells, or is there deterioration of olfactory neurons, which may be the physical storage location of “smell memory”? I don’t know.
Another interesting experiment done recently claims to transfer memories. In this study, snails were trained to have a very specific reflex reaction until they got good at it. Then RNA was extracted from trained snails and given to untrained snails, including what we call “non-coding” RNA (we will get to why that is important shortly). They then found that the untrained snails had some of the reflex action, despite never being trained to do it. The researchers claim that this is a transfer of “memory” and that memory is stored as what we call an “epigenetic” phenomenon.
Before we get -too- excited, a couple things. RNA is the “message” from your genes, which are stored as DNA. RNA comes in two major forms, coding and non-coding. Coding RNA is a copy of your genes, made from the DNA into the RNA. Coding RNA becomes proteins, which do the actual work of the cell by making up its machinery, from the “skeleton” of the cell to muscle to the enzymes that make and regulate the release of neurotransmitters in the brain. This is how DNA codes for you, and why differences in DNA from person to person result in everyone looking a little different. Non-coding RNA is not converted to protein, but it’s still important. The simple way to think of it is that non-coding RNA is still made from parts of your DNA. Non-coding RNA does not get turned into protein—but it does get to “vote” on whether coding RNA will turn into protein, what protein that will make, and what that protein will do. This is an “epigenetic” phenomenon, because the exact set of active genes is determined not by the genes themselves, but by other molecules like non-coding RNA “voting” on which genes are active, how they are active, and how active they are.
So here’s the thing. As they trained their snails, the snails learned how to do the reflex the researchers wanted. Mastery of that reflex means that a certain set of genes is primed to act in a specific way, and faster each time, as the snail remembers from training to do this reflex. That means a specific set of DNA was getting made into RNA, and a set of non-coding RNA was now selected by the training to be around and “on” to make the reflex faster. This is “muscle memory,” like riding a bike, where your body remembers how to do the balance, even if you haven’t done it in a while and -without you consciously recalling the memory of how you learned to ride a bike-. The RNA from coding and non-coding genes that makes “ride a bike” or “this snail’s reflex” possible could easily be an effect, not the cause, of the memory.
It is possible that transferring the RNA from trained to untrained snails brought the memory of “when the humans do this thing, I remember that I am supposed to do this reflex.” That’s still possible, as the researchers suggest. But we can’t ask the snails to know for sure. The alternative explanation is they transferred the “muscle memory” from the trained to the untrained snails—the set of coding and non-coding RNA the snail needs to do the reflex. It’s entirely possible the untrained snails are just doing the reflex, because all the coding and non-coding RNA is there, without “remembering” the training at all.
Yet this is the first tangible evidence that certain ideas from science fiction may be possible. For example, even if it’s just “muscle memory,” this means physical skills may be transferable. You should be able to teach a snail kung fu, extract the RNA from it, and transfer the muscle memory, at least, of all the kicks and punches to another snail. Whether the other snail will know how to use them together to actually -do- kung fu, or will just be really fluid with reflex-like, perfect kung fu kicks and punches, is another question. So you might not be able to wake up like Neo in “The Matrix” as a kung fu black belt—but you might be able to shave significant time off of training to be a black belt by passively absorbing all the “muscle memory,” so all your kung fu starts off with perfect black belt form. All you need to learn then is how to put all that great form together—memory-driven knowledge about when to kick, when to punch, when to block.
If the memory of the training itself is being transferred with the RNA, though, a lot changes. Specifically, if the untrained snail can now remember the humans doing this thing, and remembers learning and practicing the reflex as if it were the trained snail itself, then we can transfer -more- than just muscle memory. Harry Potter being teleported back into a memory that isn’t his, by Dumbledore transferring the actual memory to Harry, stops being magic. If you can modify the mix of RNA so that the person receiving the memory remembers it not as someone else, but as a memory of -themselves- in that situation, “Total Recall” stops being a movie and becomes a business. Transferring the muscle memory and the context can now transmit skills from human to human. You really can suddenly know kung fu. Or fluent Mandarin. To say this would revolutionize education is an understatement. Or what about criminal justice? What if the memory of the accused could be accessed, transferred to a cop, who can then “watch” the accused’s memory during the time the crime was committed—to see if the accused remembers doing it? Does the 5th Amendment extend to protecting yourself from self-incrimination -by your own memory-? Or think of the next time the bad guys capture James Bond.
Bond: “Do you expect me to talk?”
Goldfinger: “No, Mr Bond. I expect you to remember! Soon, I will remember what you remember. And then, of course, I expect you to die.”
And what if you could inject false memories to frame someone for a crime? Or simply control them? What if you could inject false memory on population scale? Editing political undesirables from old photos, like the Soviets used to do as they ate their own leadership, would be unnecessary. You could simply “adjust” the memories of the population to forget entirely who that was.
“Eternal Sunshine of the Spotless Mind”–on dystopian scale.
As fun as it is to brainstorm the possibilities of memory transfer, like a lot of neuroscience and consciousness research, we have more questions than answers at the moment. But flatworms that can “remember” when they regenerate their heads, and the marked absence in humans of specific lesions of the brain or elsewhere that correlate with the loss of a physical memory storage location for many kinds of memory (as opposed to defects in the processing and recall centers we know well and discussed), raise a really interesting question about where and how the body creates, stores, and accesses memory. Early work in snails, suggesting that at least some degree of muscle memory, and possibly more, can be transferred, raises even more questions.
On the flip side, if we cannot find where and how the human body creates, stores and accesses memory, we are further from in silico immortality than some in Silicon Valley believe. To “download” my personality requires access to my full range of memories. If we don’t know where and how the physical memory is kept, how will we transfer it to code? If it is a mixture of coding and non-coding RNA, how do you translate those epigenetic changes and patterns to code with high fidelity? How can we be sure that the computer copy of me has all the necessary memories, and is accessing them correctly? I admit I don’t know as much as I might like to about neuroscience and mind-machine interface technology. But I know this. Anyone talking about downloading “you” into a computer or network has a -big- challenge in ensuring and proving fidelity and integrity of the copy of “you.” If they are -not- talking about those challenges, I don’t think they fully understand the magnitude of what they are trying to achieve, and how little we understand the biology of our consciousness—despite living it every day.
Quick Hitters
I have little to add to these. The first is a “must read”. The second and third are “recommended reading”—but likely surprise no one.
Quick Hitter 1) This is the final report issued by an independent tribunal investigating 19 years’ worth of claims of forced organ harvesting from political prisoners in the People’s Republic of China.
This tribunal is chaired by Sir Geoffrey Nice, who prosecuted crimes against humanity during the Yugoslavian conflict in the 1990s. I’m going to present a couple of quotes here. I have redacted some portions in brackets to make a point:
Quote 1:
“Commission of Crimes Against Humanity against [ethnic group] has been proved beyond reasonable doubt by proof of one or more of the following, legally required component acts: murder; extermination; imprisonment or other severe deprivation of physical liberty in violation of fundamental rules of international law; torture; rape or any other form of sexual violence of comparable gravity; persecution on racial, national, ethnic, cultural or religious grounds that are universally recognised as impermissible under international law; and enforced disappearance.
in the course of a widespread and systematic attack or attacks against [ethnic group]”
Quote 2:
“As above stated, in execution of and in connection with the common plan mentioned in Count One, opponents of the [National Government] were exterminated and persecuted. These persecutions were directed against [Ethnic Group]. They were also directed against persons whose political belief or spiritual aspirations were deemed to be in conflict with the aims of the [Ruling Political Party of the National Government].
[Ethnic Group] were systematically persecuted since [Starting Date]; they were deprived of their liberty, thrown into concentration camps where they were murdered and ill-treated. Their property was confiscated. Hundreds of thousands of [Ethnic Group] were so treated before 1 September [Year].
Since 1 September [Year], the persecution of the [Ethnic Group] was redoubled: millions of [Ethnic Area] from [Accused Nation] and from the occupied [Region of Accused Nation] were sent to the [Region of Accused Nation] for extermination.
Particulars by way of example and without prejudice to the production of evidence of other cases are as follows:
The [Ruling Political Party of the National Government] murdered amongst others [Named Political Opponents]. They imprisoned in concentration camps numerous political and religious personages, for example [Named Political and Religious Opponents].
In November [Year], by orders of the Chief of the [National Police], anti-[Ethnic Group] demonstrations all over [Accused Nation] took place. [Ethnic Group] property was destroyed, 30,000 [Members of Ethnic Group] were arrested and sent to concentration camps and their property confiscated.
Under paragraph VIII (A), above, millions of the persons there mentioned as having been murdered and ill-treated were [Ethnic Group].”
One of those quotes is from the recent tribunal report on Chinese detention of Falun Gong practitioners and Muslim Uyghurs of Xinjiang province.
The other is from the Nuremberg war crimes tribunal for the Nazi genocide.
Which quote is from which document? Can you tell the difference?
When you said “never again”—did you mean it?
See also:
https://www.marketwatch.com/story/trade-war-watch-these-are-the-us-companies-with-the-most-at-stake-in-china-2018-03-29
https://www.forbes.com/sites/kenrapoza/2019/08/30/us-companies-china-unfair-but-we-dont-really-care/#11fa2a274adf
Oh, and one more quote from the China tribunal:
“In regard to the Uyghurs the Tribunal had evidence of medical testing on a scale that could allow them, amongst other uses, to become an ‘organ bank’.”
That’s Unit 731 level special right there.
China claimed earlier this month, under growing international pressure over these reports, that the Uyghur detainees have all been “happily” released: https://www.businessinsider.com/china-uighur-prison-camps-graduated-claim-without-evidence-2019-12. They have offered no evidence of this and, to my knowledge, have not allowed tours of the facilities or interviews of released detainees to prove it.
Quick Hitter 2) The Afghanistan Papers are worth a read. And if you want to prove to yourself that the US learned absolutely nothing from Vietnam, and has repeated literally every mistake made there, read the Afghanistan Papers, then read Hackworth’s “About Face.”
Quick Hitter 3) Speaking of the growing global dystopia, if you have the Facebook Messenger app, Zuckerberg was -absolutely- listening to you:
https://www.digitaltrends.com/news/facebook-admits-it-was-also-listening-to-your-private-conversations/
Public service announcement: You do not need the Messenger app to use Facebook Messenger. You can open up Facebook in a computer or laptop browser and use Messenger there. Just know, Facebook also reads your instant messages via Messenger.
Do not put anything in Gmail, Facebook Messenger, WhatsApp, DMs in any of the major social media that you would not mind posted on the front page of the internet.
Because all it will take is one unscrupulous employee for it ALL to be on the front page of the internet. And like the people Marcus Aurelius would meet, you can -expect- the people running and working at those companies to at least occasionally be vain, mendacious, violent, selfish and cunning.
I am just as guilty of carelessness with their products, I’m sure, but Karl Popper’s razor for politics applies in the modern era too. Popper said the best government was that which limited the damage the worst possible person in power could do—because eventually the worst possible person would get to power. The best approach to your personal data is to assume that the worst possible people will access it—and mind your data in a way that minimizes the damage they can do when they do.
Your Moment of Zen
Epstein didn’t kill himself. Look into it.
And Ben Hunt even had the absolutely perfect movie reference for the obvious: https://www.epsilontheory.com/im-a-superstitious-man/
Talk to you later.
Paladin