Monday, 30 April 2012

I don't believe in ghosts...

When people learn I'm a skeptic about things like ghosts and such, they'll sometimes relate to me some horrifically spooky experience they had, and then challenge me with "How do you explain THAT?" as if something supernatural is the only plausible explanation. Well, I can't always, but then, I rarely have enough information about the anecdotal situation to make sense of it, especially since it's been retold to me from the perspective of someone who has already chosen to see it in supernatural terms. So I sometimes respond with the following experience of my own, which took place a few years ago.


It’s never so quiet as right after a heavy fall of fluffy snow. I had just been visiting my parents one dark evening, and was walking out to my car, aware of the unnatural absence of the usual background noise of even this quiet residential neighbourhood, and listening intently to the only sound, the squeaky crunch of my shoes in the snow.

They say the ear can play tricks on you in such silence, so I didn’t quite know what to make of it when I heard my wife’s voice, faintly calling my name, as if from far away. I stopped dead in my tracks for a moment, then shook my head and continued. No, I knew my wife was at home and well out of earshot. But then I thought I heard it again.

I stopped and listened. Could I really have heard my wife’s voice? No. Of course not. The silence was messing with my imagination. After a few more long seconds of silence, I continued, a little faster, towards my car, when suddenly I heard my son’s voice, calling “Daaaaad!”

At that I froze. Something about being a parent makes one acutely sensitive to the voice of one’s own child. It’s absolutely unmistakable, and that was no mere trick of the imagination. My son was definitely calling for me.  I hurried to the car, started it up and drove out onto the main street, resisting the urge to go too fast.

Now, I’m not superstitious, and I don’t believe in ghosts or premonitions or anything of the sort. Yet it was difficult to avoid thinking in such terms. I tried to convince myself that I had not heard both my wife and son calling for me in the impossible silence, but it had been just too vivid to deny. I couldn’t shake a feeling of dread for what I might find when I arrived home.

Anxiously I pulled the car into the garage, hit the remote button to close the garage door behind me, and hastened to the house. Twenty paces from the back door, my cell phone rang in my breast pocket.

My cell phone. I stopped again, for just as long as it took to breathe a sigh of relief. As I mounted the steps of the back porch, my wife opened the door with a smug grin and the handset, my giggling son next to her. Somehow, while pulling on my winter parka, I had bumped against the speed dial button for our house, and not heard her “Hello?” over my crunching footsteps. So I HAD heard their voices calling, but in my early days of cell phone ownership, I was not yet in the habit of knowing I had it with me.

I made sure my next one was a flip phone.

Monday, 16 April 2012

Give Your Children a Voice in the Election


    A municipal election is upon us, and again, my son will be prevented from voting on the grounds that he is under 18 years of age.  Just about everywhere there are government elections, there is a minimum voting age. In most places it’s 18, though there are a few countries where it’s 21, and some where it’s as low as 16. The usual reason for imposing a minimum age is that younger people are deemed not to have the experience, wisdom or maturity necessary to cast a ballot responsibly. 


     Does it work? It isn’t hard to find someone who will complain about the foolishness of the majority of adult voters; almost anyone who voted for a losing candidate will do that. In any event, it is difficult to be sure that the quality of electoral results is better than it would be if we allowed minors to vote.
     In fact, there is something a little disturbing about any rule aimed at limiting the vote to only those we judge to be wise enough. Most modern democracies seem to have abandoned other competency tests for voting rights, and for good reason: it is very difficult to design a literacy test that does not include some form of political or cultural bias, and that would undermine the whole purpose of a free and fair election. 
     Whether or not they succeed in ensuring responsible choices at the ballot box, age limits do prevent young people from participating fully in the democratic process, which has its own drawbacks beyond simply depriving them of a voice. In countries where voting is not mandatory, voters in their twenties have historically poor turnout on election day, and the young are often seen as politically apathetic. But why would we expect anything else, when we’ve given them 18 or more years to get into the habit of not voting?
     In my household, we’ve adopted a simple solution to allow our son to participate, at least until he is old enough legally to vote himself. The three of us hold a miniature election around the kitchen table, and my wife and I agree to cast our ballots for whichever candidate wins a majority of our household votes. In principle, our house is like one of the states in the electoral college system for U.S. presidential elections; whoever wins a majority in the state gets all of the electoral college votes for that state.
     Since there are three of us in our household, and at least two of us are likely to agree on any given issue, we rarely have to worry about what to do in case of a tie. In practice, we discuss our choices well in advance of actually voting, and thus usually reach a consensus. Still, in Canada we have several major political parties, so the possibility of a three-way deadlock at our household election is quite real. It has never happened, but if it does, one possible way to resolve it would be the single transferable vote (STV).
     Most elections run on a simple “first-past-the-post” system, which means that whoever gets the most votes wins. If there are more than two candidates, you don’t need a majority; you only need more votes than any other single candidate. So in a three-way race where candidate A gets 40% of the vote while B and C get 30% each, A wins even though 60% of the voters voted against him.
     With the STV, voters do not simply choose the one candidate they prefer; they rank as many candidates as they like, in order of preference. If your favourite candidate doesn’t attract enough votes to remain in the running, your vote shifts to your second choice, and so on, until the candidate left with a majority of the votes is declared the winner.
     Such a system is also handy for resolving the three-way tie that could result in our household. If I vote for A with my second choice as B, my wife votes for B with her second choice as C, and my son votes for C with his second choice as B, then B wins as the compromise we can all accept. Of course, there will always be the possibility of an unresolvable tie, but there is less chance of it with the STV system than with first-past-the-post. In fact, STV can even resolve many deadlocks in a two-voter household.
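
To make the counting procedure concrete, here is a small sketch in Python. (The function name and the tie-breaking convention, eliminating the candidate with the fewest second-preference mentions, are my own choices for illustration; real STV rules vary on exactly how elimination ties are broken.)

```python
from collections import Counter

def irv_winner(ballots):
    """Single-winner STV (instant runoff): repeatedly eliminate the
    weakest candidate and transfer votes until someone holds a majority.
    Each ballot is a list of candidates in order of preference."""
    candidates = {c for b in ballots for c in b}
    while True:
        # Count each ballot for its highest-ranked surviving candidate.
        tallies = Counter()
        for ballot in ballots:
            for choice in ballot:
                if choice in candidates:
                    tallies[choice] += 1
                    break
        total = sum(tallies.values())
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > total:          # strict majority reached
            return leader
        # Eliminate the candidate with the fewest first-choice votes;
        # break elimination ties by fewest second-preference mentions
        # (one convention among several).
        seconds = Counter(b[1] for b in ballots if len(b) > 1)
        loser = min(candidates, key=lambda c: (tallies[c], seconds[c]))
        candidates.discard(loser)

# The household example: I vote A then B, my wife votes B then C,
# and my son votes C then B. B wins as the compromise candidate.
print(irv_winner([["A", "B"], ["B", "C"], ["C", "B"]]))
```

Run on the household ballots, the three-way first-preference tie is broken by second preferences, A is eliminated, and B collects a majority.
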
     It seems likely that age limits on voting will be with us for a long time. A baby born today will likely be old enough to vote under the current rules before any major reforms are implemented. Those of us who wish to involve our children in the democratic process need not wait until they are adults. By sharing our own votes with them, we can get them started towards becoming engaged, responsible citizens.

Wednesday, 11 April 2012

Evolution is Not Your Friend

In my last post, I mentioned a class of objections to the theory of evolution that do not betray gross misunderstandings of the theory. That is, they acknowledge that while evolutionary theory might have objective scientific merit, it leads to implications about the nature of reality that are intuitively, aesthetically or morally objectionable. These objections take a variety of forms.

For example, some find the Hobbesian view of our nature, red in tooth and claw, particularly depressing. The idea that all living things, including us, are merely the temporary survivors of a brutal struggle of each against every other is not a very positive one for those of us who believe deeply in the values of love, tolerance and cooperation. Indeed, many evolutionary biologists, going all the way back to Peter Kropotkin (who was 17 years old when Darwin published On the Origin of Species), have emphasized the role of cooperation and mutual aid, rather than competition and conflict, as an important part of the struggle for survival. Yet while evolution can and does produce altruism and human instincts for morality, it still seems unsatisfying somehow to see these things as ultimately rooted in the self-interest of the genes that produce them. We want to feel that noble self-sacrifice really is noble self-sacrifice, not merely some roundabout way of ensuring one's own survival, or worse, a mistake of misfiring instincts.

As unhappy as that sounds, it doesn't really bother me that much. Regardless of how we happened to end up with our imperfect instincts for morality and justice, we have them and they have provoked the philosophers among us to contemplate the logic of it, using the capacity for generalized intelligence we evolved for other purposes to pursue problems that our ancestral environment never "intended" for us. And personally, I am largely persuaded by the efforts of Kant and Mill that there really is an inherent logic "out there" to morality as distinct from the dictates of natural selection. That we got here by means of "survival of the fittest" in no way means that we must adopt that as our moral compass.

Nor am I particularly upset with the absence of a divinely ordained purpose for our existence offered by evolution. There are people who say that the reason they believe in a Creator is that they feel there must be a purpose, some reason we're here, and that we're not just some accident of no importance in the grand scheme of things. I suppose there are two reasons why this objection doesn't really resonate with me: One, I've never really felt the need for authoritative answers from above. Even as a very young child, I frequently doubted the pronouncements of my parents and teachers, and I've never been able to overcome the epistemological hurdle of some mortal human claiming to speak for God; just because they say God wants me to do this or that doesn't mean that's really what God wants. But two: I've never really understood why it would be such a terrible thing if there were no purpose. We exist, and most of us feel a sense of some kind of purpose, whether it's real or not; why do we need it to be on some absolutely solid foundation before we invest ourselves into it? Isn't our own sense of purpose enough, without having to insist that it be dictated by God to have any real meaning?

No, the implication of evolutionary theory I find more dismal is this: we aren't built to be happy, and to some extent we may actually be built to be unhappy. Think about it: our emotional and intellectual capacities were selected for by evolution because they happened to make it likelier that we'd have offspring who would share these capacities. The things that bring us pleasure and fulfilment are not there for our benefit, but rather simply because they tend to motivate us in certain reproductively advantageous directions. Nature doesn't give a damn whether we're happy, and in fact it's not really in our genes' interests for us to be too happy or fulfilled, because then we're likelier to slack off in our gene-propagation activities; it's desire that drives us to do stuff, not the satisfaction of those desires.

In our ancestral environment, our desires were rarely if ever entirely fulfilled. One of the reasons we love sweet or rich and fatty foods is to encourage us to stock up on them on those infrequent occasions they become available, such as when fruit comes into season, or we're lucky enough to kill some tasty animal. For most of our history we subsisted on vegetables, and so we tend to view leafy green stuff as something to eat if you're really hungry and there's nothing better available. But today, of course, we have virtually unlimited access to sweets and meats, and we have no built-in instinct to regulate how much of them we eat, because we never needed such an instinct in the nearly constant scarcity of our evolutionary past. Our appetites evolved to make us crave things, and never truly be satisfied. There are very good evolutionary reasons for this, but that doesn't make it any easier to resist overindulging in unhealthy diets.

And the same is true of most of our other biologically determined appetites and instincts. It's not enough that we be well-fed and healthy; we have also evolved as complex social creatures for whom status in the group is a key to reproductive success, so we crave being demonstrably better off than our neighbours, or at the very least not worse off. And this is a game that most people can never win, since for anyone to win means for everyone else to lose, to some extent.

So that's what I mean when I say we weren't built to be happy. It's not just that happiness is difficult to attain; chronic dissatisfaction may be fundamentally built into our very makeup. And so I am sympathetic to those who find the implications of Darwin's theory discouraging, even to the point of wanting to reject it.

Yet in my more cheerful moments, I find reason for optimism. Natural selection may have built us to be chronically dissatisfied, but at the same time, the products of human ingenuity are endlessly surprising. We have figured out ways to satiate ourselves with candy and pork chops. Our technology allows us to communicate with each other instantaneously from almost anywhere on the planet. We have devised ways to organize ourselves and relate to each other that our ancestors never could have imagined. And just as we worked out how to fly despite our lack of wings, we may yet figure out how to maximize human happiness in spite of our Darwinian legacy. And even the mere idea that such a thing could be possible should provide all the purpose anyone could need, divinely ordained or not.

Friday, 6 April 2012

Why Creationist Institutes Shouldn't Be Accredited to Grant Degrees in Science

A friend's Facebook status called my attention to a complaint that the Institute for Creation Research is being denied accreditation to grant degrees in biology, framed as a case of religious discrimination. I started to compose a reply to post in the comment thread there, but decided to write an essay here instead. But I'm going to start out by talking about a very important concept in physics: energy.

Energy's a pretty strange concept, when you think about it, and it doesn't help that the word is misused in so many ways. To a science geek like me, energy is measured in joules, kilowatt-hours or electron volts, so I bristle when someone starts explaining acupuncture and translates the word chi as "energy".

But even a simple, concrete unit like a joule is surprisingly abstract. It's not exactly an obvious quantity, like mass or distance or time. Nor is it directly observable; all our experiences of energy are inferred from our observations mediated by matter in some way. Energy is entirely a derived quantity, and the joule is a derived unit, defined as a kilogram meter squared per second squared. (You can get this by noting Einstein's famous formula, E=mc^2: mass times a squared velocity). We talk about potential energy, kinetic energy, thermal energy, energy stored in chemical bonds or atomic forces, the energy of photons, but in all cases we calculate the quantity through indirect means.
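
Just to make the derived-unit point concrete, here's the arithmetic for a single kilogram (a toy calculation for illustration, nothing more):

```python
# The joule is a derived unit: 1 J = 1 kg * m^2 / s^2.
# Plugging a mass into E = m * c^2 yields units of kg * (m/s)^2,
# i.e. joules; no separate "energy unit" is ever observed directly.
c = 299_792_458.0   # speed of light in m/s (exact by definition)
m = 1.0             # mass in kilograms
E = m * c**2        # energy in joules
print(E)            # on the order of 9 x 10^16 J for one kilogram
```
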

So ultimately, energy is an entirely theoretical quantity, never directly observed but so completely pervasive in everything we do that no one would dream of denying its existence. Physics just wouldn't make any sense at all if we didn't postulate this mysterious energy stuff. In fact, this theoretical quantity is so deeply established in our understanding of the world around us that most of us don't even realize it's a purely theoretical concept. Certainly no one would seriously say that energy is unscientific because no one's ever actually seen the stuff; anyone who did say that just doesn't understand what "scientific" means. It may well be that energy doesn't exist, that our current physical theories are just a happy accident that happens to give good results, and that a better, more parsimonious theory will come along some day with better predictions and fewer postulates. But anyone making the radical claim that modern physics is wrong because there's no such thing as energy would quite rightly be discounted as someone who just didn't understand physics. Unless she could provide that better, more parsimonious theory, which would be a staggeringly impressive accomplishment.

Now, the whole point of accreditation for academic institutions is to try to provide some assurance that a person who gets a degree in physics actually understands something about physics. That generally requires that the faculty providing the instruction should know what they're talking about, and if your faculty is trying to make a go of it without reference to energy, they may be teaching something but it probably isn't physics, and therefore should not be accredited to give out degrees in physics. (If they do have that more parsimonious theory, then they should not only be accredited but awarded Nobel prizes at the very least.)

It's important to note that this isn't, strictly speaking, about belief, but understanding. You don't need to believe in quantum mechanics (Einstein certainly didn't) or relativity to be a physicist, but you should at least understand the theories well enough to provide legitimate criticisms, or to admit that your objections to them are basically intuitive, aesthetic, moral or otherwise unscientific (as with Einstein's famous statement about God not playing dice with the universe). If you criticize a theory based on a clear misunderstanding of it, though, you call your expertise into question, just as someone who claims to be a physicist yet denies the existence of energy is probably just profoundly ignorant of physics.

Which brings us at last to evolution. Like energy, large-scale evolution has never been directly observed (though unlike energy, small-scale evolution is observed all the time, from antibiotic-resistant infections to African cichlid speciation). And it is no exaggeration to say that what energy is to physics, evolution is to modern biology: evolutionary theory provides the cognitive framework that gives order to, and makes sense of, all of the observed data so far.

Again, it's not about belief. You don't have to believe in evolution to call yourself a biologist, but if you clearly don't understand the theory, you have no right to be recognized as an expert. And this may sound unnecessarily harsh, but every single creationist criticism of evolutionary theory I've ever heard (with one exception that I'll get to in a moment) has been based on a profound (if in some cases subtle) misunderstanding of the theory. In other words, creationists who use these arguments demonstrate that they literally do not know what they're talking about. And that means they are not in any way qualified to hold degrees in biology, much less to be accredited to grant them.

I mentioned there was one exception. The only criticism of evolution I've ever encountered that didn't betray a grave misunderstanding of the theory goes something like this: "It may well be that the theory of evolution is the best one available so far to explain all the observed data, but it still feels wrong to me because it conflicts with my deeply held beliefs or intuitions that are not themselves scientific." That's essentially the same form as Einstein's objection to quantum mechanics; it just felt wrong to him that random chance could play so fundamental a role in the basic structure of the universe, notwithstanding that there was no solid scientific basis upon which to object to the theory. Likewise, one might feel intuitively that evolution is wrong because it's aesthetically unsatisfying, or believe it is false because one is committed to a literal reading of the Book of Genesis, or because one feels it has destructive moral implications, but none of these are valid scientific objections, and none of them stand in the way of actually understanding the theory even if one happens to believe it is wrong.

Nor do any of these objections stand in the way of accreditation. There are probably lots of biologists who feel intuitively uneasy about some of the implications of evolutionary theory (and I'm working on a post about one of these implications I may get around to finishing in a few days), but they understand and apply the theory, and teach it, and they know what they're talking about. There are undoubtedly physicists who are somehow, deep down, convinced that relativity or quantum mechanics or both just somehow must be wrong, but they know what they're talking about. There may even be physicists who doubt the existence of energy, but they still have to know what they're talking about.

The reason creationist organizations like ICR don't get accredited to grant degrees in biology is not that they're uncomfortable with evolution. It's simply that when it comes to biology, they don't know what they're talking about.

Thursday, 22 March 2012

The Noblest Predator

When we think of the kinds of predators that inspire respect and admiration, the creatures we put on coins and flags and coats of arms, it's often animals like lions, bears and eagles we choose. I'd like to suggest that these aren't necessarily the models of courage and nobility we should be inspired by, and nominate instead an unjustly maligned predator as a moral exemplar.

The lion, for instance, while certainly a majestic looking beast, is a cruel and murderous patriarch. When a dominant male (or usually a gang of two or three) takes over a pride (by the violent overthrow of the previous regime), the first order of business is to kill any nursing cubs, so as to free up the mothers to start a new litter with the invaders. Can we think of this as anything but reprehensible?

Such behaviour isn't limited to lions. Mother bears are well-known among mammals for their fierce defence of their young, but part of the reason the cubs need such defence is that other bears will think nothing of eating someone else's cubs.

But quite apart from the intraspecies conduct of these predators, there are moral issues with their predatory styles as well. For one thing, most predators are cowardly, when you think about it: they attack the sickly and frail, the most vulnerable prey they can find, as if their sharp teeth and claws weren't enough of an advantage. And many predators go after prey very much smaller than themselves, so much so that we don't even really think of them as predators; consider the baleen whales, who lazily scoop up vast numbers of invertebrates, small fish and anything else unfortunate enough to be in the wrong volume of seawater, swallowing their prey whole to be digested alive en masse. I sometimes suspect that the majority of individual animals in the world spend their last living moments in the stomachs of such predators as whales and anteaters, who simply gulp down tiny victims without even doing them the courtesy of a merciful, euthanizing chew.

It is, after all, generally only the predators with enough courage to take on prey closer to their own size who bother to kill before eating. Not out of any compassion for their victims, mind you; it's just easier to eat a zebra when it's no longer trying to kick your teeth out. And the methods used are far from humane, anyway; teeth and claws, drowning, suffocation, venom, even electric shock in the case of certain sea creatures. There are even predators, like the ichneumon wasp, who go out of their way to keep their prey alive as long as possible so as to keep it fresh for when the eggs laid inside the hapless victim hatch.

Even on those rare occasions when we humans, the most terrifyingly effective killer of other creatures on the planet, bother to concern ourselves with minimizing the suffering of our prey, the fact remains that we kill what we eat, and there really is no such thing as a nice way to kill someone.

So allow me to recommend as the most noble, heroic and courageous predator this: the lowly mosquito.

The mosquito lives most of its life as a vegetarian, after all. Loath to harm another sentient creature, it takes most of its sustenance from plant juices. It is only when a female finds herself with developing eggs that she is compelled to take a meal of animal blood, and not even for her own sake, but solely for the benefit of her children.

Yet what does the mother-to-be mosquito do when she needs protein for her babies? Unlike more cowardly predators, she does not seek out smaller, more vulnerable creatures to kill. No, she doesn't pick on someone her own size; she goes after prey that dwarfs her by many orders of magnitude. Indeed, she often goes after that very deadliest of animals, Homo sapiens. And not to kill, but to humanely harvest a tiny quantity of blood, ideally without causing the slightest discomfort whatsoever. (A mosquito does not want to be noticed, after all.) Less than a drop of blood is all she seeks, never to be missed by the donor, and that only to feed her babies. And to get it, she faces the grave risk of being smashed into paste by her gigantic prey.

Now, it's true, of course, that mosquitoes are blamed for many human deaths annually, as diseases like malaria are spread by insect bites. But it's not fair to hold the mosquito responsible for what is really done by the malaria parasite; the mosquito is, if anything, another victim of the parasite's exploitation. Indeed, in sheer numbers, the mosquitoes slain by pesticides and other measures aimed at controlling the spread of insect-borne diseases astronomically outnumber all the humans who will ever have lived, let alone the fraction affected by those diseases. I expect that if the malaria parasite were to go extinct tomorrow, few creatures would have more cause to celebrate than the mosquito.

So let us admire the courage and compassion of the noble mosquito, even if we don't stop swatting them. And best of all, these valorous little bloodsuckers don't sparkle!

Wednesday, 14 March 2012

Obsolete Ruminations on Obsolete Hominids

     I've always been intrigued by the fact that Neanderthals actually had brains that were, on average, bigger than ours, despite our habit of dismissing them as crude brutes who never had a chance against us refined and sophisticated Cro-Magnon Homo sapiens types. While it's a mistake to assume that intelligence is always directly correlated with brain size, it isn't completely ridiculous to wonder if maybe H. neanderthalensis was actually smarter in some ways than we are.
     I once read somewhere that there was some question as to whether or not our Neanderthal cousins had the capacity for language, based on an examination of their bones that seemed to suggest they didn't have the same vocalization abilities that we do. This got me to thinking about the role of language in our own species' success, how it could have helped us out-compete the Neanderthals, and its implications for our culture generally. Individually, they might well have been smarter. But without language, each individual Neanderthal would have had to figure out new innovations largely on her own, perhaps with some hands-on demonstration by others, but essentially by discovery anew each time. In contrast, the somewhat dimmer  H. sapiens might take longer to come up with a concept, but as soon as any member of a sapiens clan worked something out, everyone else would know it pretty quickly.
    As an example of this process, I think of my own experience in studying math in school. Although the rote memorization of the times table in elementary school bored me to tears, I eventually discovered the beauty of mathematics, and took great pleasure in exploring mathematical ideas. For most of junior high school and the first two years of high school, I did very well, ignoring the teachers for the most part and coming up with my own way of solving problems on the spot. Many of my classmates, however, struggled with trying to memorize and follow the teachers' instructions (and not "getting it" as intuitively and naturally as I did).
     This worked just fine for me, up until Grade 12, when the math suddenly seemed to get more complicated. It wasn't that I couldn't do it; I could still look at a problem, take it apart and derive a solution. But the complexity of the problems in calculus and polynomials was such that it would take me more than the available time to finish the exam. My grades plummeted, and while I did pass (barely), my classmates (who actually listened to the instructions) did much better, even if they didn't always have as deep a feel for the math as I did.
     So I wondered, then, if the Neanderthal predicament was like mine had been in math. On her own, she would have been more than a match for any individual modern human: stronger and smarter. But modern humans aren't on our own; we inherit, through language, a vast amount of knowledge that we don't have to rediscover for ourselves. We don't need to be so smart as individuals, because we share our smarts better.

     These speculations are, at least in the case of Neanderthals, obsolete. It turns out that they probably did have language, based in part on genetic evidence that they carried FOXP2, a gene associated with language. Yet even if my speculations don't actually solve the mystery of how our ancestors survived and the Neanderthals didn't, they have at least given me an interesting insight into the role of authority in human culture. The extent to which people are willing to defer to an authority, whether it be a charismatic leader, a peer group or a text, has always puzzled and even alarmed me. If it's written down, or if it's the wisdom of our ancestors handed down from times past, we're inclined to accept it without question, and I suspect this isn't just something we're taught to do; it seems to come naturally. (It's also taught, of course; I certainly remember the emphasis on citing authority in law school.)
     It's tempting to lament this as a curse, the stifling of individuality and slavish unthinking obedience to tradition, but it's not altogether a bad thing, and neither is it all there is to human nature. After all, we also need to have a certain amount of individual creativity to provide the ideas that are then transmitted and received this way. And I have to admit, I found that, when I took linear algebra in university, studying hard and following instructions really did help me to do better within the time constraints, even if it wasn't in my nature.
     But I do think it's important for us to be at least aware of our innate tendency to defer to authority, and maintain a sensible balance. Tradition is useful, and ideas that have been around for a long time can generally be assumed to have passed some kind of test, so we should be inclined to take them seriously, but we should never be afraid to examine them critically as well. A bit of the Neanderthal nature is healthy.

Thursday, 8 March 2012

Some Observations on Neuropathy

     I've recently finished with chemotherapy for colon cancer, which has afforded me the opportunity to experience some things that, while not necessarily pleasant, have been kind of interesting. In particular, right now I'm thinking about the composite nature of tactile senses in a way it never occurred to me to consider before.

     We all know that the colours we see are really just distinct blends of only three basic colours: red, blue and green. That's because our retinas contain photoreceptor cells that are particularly sensitive to one of these three ranges of frequencies. Primates have better colour vision than most other mammals, which only have two different colour receptors. Most birds have four, giving them a greater sensitivity to distinct colours.
     Likewise, the many flavours we can distinguish are generally made up of composite signals from just five types of receptors on our tastebuds: bitter, salty, sour, sweet and umami. Our brains recognize one particular proportion of these signals as garlic, another pattern as lemon, and so on.
     Indeed, hearing is also a composite sense; tiny hair cells at different positions along the cochlea resonate at different frequencies, and so each is sensitive to a different range of frequency inputs. What we distinguish as a single sound (a violin, a trumpet, a human voice) is a complex blend of the inputs of hundreds of different audio frequencies.

     Now, one of the side effects of oxaliplatin (one of the chemotherapy drugs I was on) is what they call peripheral neuropathy, or damage to the nerves leading to the remote (peripheral) parts of the body: fingers and toes. In short, my fingers and toes are uncomfortably numb. However, it seems that this neuropathy doesn't affect all receptors equally. While I've lost most of the pressure sensors in my fingertips, I can still receive signals from the pain and temperature receptors, which has had some interesting (if annoying) effects.
     First, the pressure receptors have by far the finest resolution, as Braille readers demonstrate. The pain and temperature receptors don't need to be as precise about location; it's enough that the brain knows a particular fingertip is risking damage without worrying about which square millimetre is at risk. This means I can't sense things as precisely with my fingertips as I could before the treatment. I have trouble buttoning my shirt, and I can't play the guitar without visually confirming I'm putting my fingers on the right strings and frets. It also means my fingers constantly feel kind of greasy to me, as if a fine layer of oil is preventing me from detecting the tiny variations in the height of a surface that would normally show up as small local variations in signals from pressure sensors.
     But I can still sense textures to some extent, and I think this is due in part to the role of what we call the pain sensors. The pain sensors are triggered by extremes, stresses of pressure or temperature that threaten to damage tissues, but they seem to have a wide range of sensitivity, and I think that normally they must play a role in sensing things that are not strictly speaking "painful". For example, when feeling the edge of a knife, one doesn't apply enough pressure to actually suffer any damage, but the relatively extreme stresses on the tissues do trigger a mild pain signal which, combined with the pressure and temperature signals, forms a composite tactile image of a sharp edge. The composite signal doesn't register as painful at all, but the presence of a certain amount of the pain signal is part of what gives the feeling of sharpness.
     I am finding I sense the sharpness of surfaces, like the edge of a fingernail, as sharper than normal. I believe this is due to the "pain" signal making up a bigger portion of the composite input, since the pressure sensors are providing very little signal. It doesn't hurt, exactly; it just feels like edges I touch are sharper than they were before the treatment.
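     The arithmetic behind this shift in proportions can be sketched with a little toy model. To be clear, this is purely illustrative; the function and the numeric weights below are invented for the sake of the example, not actual physiology.

```python
# Toy model (illustrative only): treat a tactile "image" as a blend of
# pressure, temperature and pain signals, and look at each channel's
# share of the composite.

def composite(pressure, temperature, pain):
    """Return each channel's fraction of the total tactile signal."""
    total = pressure + temperature + pain
    return {
        "pressure": pressure / total,
        "temperature": temperature / total,
        "pain": pain / total,
    }

# A healthy fingertip on a knife edge: strong pressure signal,
# mild pain component (invented weights).
healthy = composite(pressure=8.0, temperature=1.0, pain=1.0)

# With neuropathy: pressure sensors mostly silent, pain channel unchanged.
numb = composite(pressure=1.0, temperature=1.0, pain=1.0)

print(healthy["pain"])  # 0.1
print(numb["pain"])     # about 0.33
```

The pain channel's *share* of the composite grows from a tenth to a third, even though the pain signal itself is no stronger, which is roughly why the same edge would read as "sharper" than before.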
     In a way, it's sort of like the reaction to cold I had at the very first cycle. Touching cold things was startling; it's not that they felt colder than usual, but just more vividly cold. It was sort of like having just cleaned my glasses and then seeing the world much more clearly all of a sudden. (Subsequent rounds of chemo made the effect worse; instead of just feeling vividly cold, handling something right out of the fridge was like grabbing a live wire, a very nasty shock.) I am not sure how to interpret this experience, because the attenuated pressure sensitivity hadn't kicked in yet.

     Anyway, it's been fascinating to be able to take advantage of this experiment, much as I'd prefer to have learned this stuff second-hand.