Wednesday 1 September 2010

The contribution of homeopathy to medicine in the UK

Homeopathy was brought to London by an Edinburgh-educated physician, Dr Frederick Quin. He had been treated by an Italian physician and homeopath for an illness while on tour as the physician to an aristocratic party. Convinced of the efficacy of this new therapeutic approach, Quin set about learning more about it. When he set up his homeopathic practice in 1832 he attracted a significant clientele of the well-to-do, including dukes, duchesses and archbishops. By 1849 he had established the British Homeopathic Society (whose membership was almost entirely limited to qualified physicians), founded The British Journal of Homeopathy and opened the London Homeopathic Hospital, which continues to tarnish Great Ormond Street to this day. Quin despised lay homeopathic quacks, whose unscrupulous behaviour (e.g. parading ‘cured’ patients in public) he felt brought this new system of medicine into disrepute. His reputation was such (he was a close colleague of Hahnemann himself) that he was able to limit the activities of the growing number of lay homeopaths, and close down their hospitals, merely by refusing to be associated with them. Thus, homeopathy began in the UK – as on the continent – as a serious medical development that was available only to those who could afford to engage proper physicians.

By competing directly for the same patients, and on the same terms, Quin quickly became the enemy of regular physicians. He was despised by the Royal College of Physicians, who sent him several letters reminding him that practising medicine in London without being a member of their hallowed society was an illegal act. Members of the RCP also snubbed him socially, working in unison to blackball his attempts to join the better St James’s clubs. Later, after the formation of the BMA, this august organisation refused membership to homeopathically-inclined doctors and banned its members from associating with them:
A medical man has no right, under any circumstances whatever, to attend the call of a homoeopath, or knowingly to meet him at the bedside of the sick.
- British Medical Journal (1862)
The Lancet vociferously called for the removal of the few homeopaths who infiltrated medical schools, and campaigned against further such atrocities. During the years of medical reform in the late 1850s, the BMA and other medical societies and colleges called on parliament to pass acts preventing homeopathic physicians from practising at all.

In the face of such resistance, how could an educated physician contemplate practising this despised quackery?

As any doctor will tell you, and a great many controlled studies have shown, many diseases and ailments will resolve on their own. Furthermore, placebo treatments can accelerate the resolution of symptoms. At the time that Quin was building his practice, though, most physicians were prescribing quite violent treatments to their patients. These may have had no more effect on the body than sugar pills, and in some cases may have positively worsened symptoms. Bleeding was commonplace for any ailment, as is generally known. Pharmacological interventions of the day mostly fit into the category of ‘heroic’; all had obvious and theatrical effects on the body. [See, for example, counter-irritation, cupping, strychnine and arsenic, all in vogue at the time.] This was the era when drugs that made you pee, poo and vomit were considered to cleanse the body and balance the humours – all considered dangerous nonsense today. Many pharmacopoeias directed that certain drugs be administered in increasing doses until such effects occurred. Some patients would be very suspicious of a course of treatment that merely made them well. Some of the wealthy clients of the physicians enjoyed experimenting with their health and demanded tangible results. Similarly, physicians were highly sceptical of the homeopathic approach and its lack of obvious effect on the human body (in addition to its general implausibility).

Regular physicians accused their homeopathic colleagues of exclusively treating patients who required no medication, or who would get better eventually without any treatment. Naturally, the homeopaths countered with the argument that if this were the case, then regular physicians should stop unnecessarily poisoning their own, similarly afflicted patients. Homeopathy had been introduced by a trained medic, who had attracted a following of similarly-educated physicians. Quin and his colleagues all had well-to-do patients, who voted with their feet. They saw the effects of homeopathy to be as miraculous as the homeopaths themselves did. Despite the vigorous opposition of the RCP, the BMA and the GMC, homeopathy flourished so long as it attracted such custom.

Much like regular medicine, homeopathy was a popular parlour game for rich hypochondriacs in the 19th century. There was money to be made, and homeopathically-inclined patients would not be put off by the opinions of The Lancet, the RCP and the BMA, so long as their physician was properly qualified, and not some fairground quack. Many patients probably enjoyed the adventure of being treated by rebel physicians. Those adopting the unorthodox practice genuinely believed (as many deluded lay practitioners do today) that it worked, and was more than a mere placebo. As a system of medicine homeopathy was more than the handing out of sugar pills. A complex philosophy governed which treatments should be used for which ailments, as was the case for regular medicine at the time.

Finally, although regular physicians lobbied parliament to pass acts to prohibit the practice of homeopaths, the most significant of these acts (The Medical Act of 1858) utterly failed in this objective. Then, as now, homeopaths had too many friends in high places who could ensure that their business could always operate within the law. The best the regular physicians could achieve was to restrict the legitimacy of post-graduate homeopathic qualifications; this they managed to do for a century.

Over the course of half a century dedicated to opposing homeopathy, regular physicians learned a valuable lesson from their competitors: many ailments can self-resolve with adequate care and without a theatrical pharmacological sideshow. In the face of the success of sugar pills as therapeutics at the hands of homeopaths, medicine slowly became less heroic, and more introspective. It began to question practices and call for evidence of efficacy. By the end of the 19th century, homeopathy was in serious decline (in parallel with its aristocratic support base) and, having learnt its lesson, medicine was advancing on its way towards the modern discipline we know today. Although we can confidently denounce homeopathy as nonsense today (sometimes dangerously so), we should acknowledge the important role its historical proponents played in steering regular medicine away from some of the practices that we now prefer to overlook.

Further reading:
Phillip A Nicholls' Homoeopathy and the medical profession (1988) is probably the best academic exploration of homeopathy in the UK.


Rants from lay homeopaths will be deleted.

Friday 11 June 2010

Traditional Chinese Arsenic for Cancer

Arsenic is another favourite poison of crime writers and has a long history of medical use. As usual, it’s a bit difficult to imagine how it could ever have been considered a medicine, considering the effects of a toxic dose:

“Soon after taking it the sufferer experiences faintness, nausea, sickness, epigastric pain and tenderness. The symptoms quickly increase. The vomit is brown, and often streaked with blood; the pain is very severe; there is profuse diarrhoea, with much tenesmus [painful straining]; and there are cramps in the calves and legs. The vomiting becomes violent and incessant; there is a burning sensation in the throat, with intense thirst. Soon collapse sets in; the skin is cold, the pulse small and feeble, and the patient dies collapsed.”

- W. Hale White, Materia Medica (1892).

Understandably, Agatha Christie and her ilk tended to omit the more propulsive effects of arsenic on the upper and lower gastrointestinal tract, but otherwise described fatal arsenic poisoning pretty well. It’s violent, nasty and painful.

It’s difficult to keep an account of the use of arsenic in medicine short but thought-provoking (generally the aim), so long and interesting is the history of this poison. Wherever you turn, another fascinating connection pops up. Composing a pithy post is further complicated by the number of arsenic-containing compounds (“arsenicals”, rather than simple arsenic oxides and sulfides) that have also been used as therapeutics, each providing a springboard for an adventurous dive in a completely different direction.

Arsenic is a metal (technically, a metalloid) that occurs in nature, mainly in two sulphur-containing forms, red arsenic (realgar) and yellow arsenic (orpiment); if either is burned in air the trioxide form, white arsenic, is the result. It’s not hard to imagine that when the ancients found such rare, colourful and mysterious materials they would have thought them gifts from the gods. Hippocrates described where to find arsenic, and how he thought it could be used as a medicine (earlier uses are entirely feasible, though). Various later Greek physicians found that it killed lice and caused skin growths to slough off. Around the same time arsenic entered recorded use in far eastern Asian folk remedies, usually in mixtures with other herbal or animal components. TCM remedies are rightly treated with caution by “western” clinicians, since they frequently contain arsenic, and long-term exposure to low doses of arsenic is known to lead to a variety of tumours. In contrast to the arsenic that may naturally occur in soil and our water supplies, arsenic in TCM isn't a contaminant but a purposeful ingredient, and has been for hundreds of years.

Although the use of arsenic in medicine has a long history in Europe, it was during the 18th century that its use flourished, as the self-administration of poisons as therapeutics became fashionable. Most famously, the physician Thomas Fowler produced a solution of arsenic (in conjunction with Withering of digitalis fame) which, although initially indicated for "periodic fevers" and "agues" (malaria), inevitably became a popular panacea. As with so many poisonous preparations of the age, Fowler's Solution soon became a popular nerve and stomach tonic, a treatment for hysteria, dropsy, ulcers, and cancer. It was used externally to kill parasites, to treat skin conditions (e.g. psoriasis) and as an antiseptic. Every well-to-do home would have had a bottle, and felt safer for it. Physicians' recommendations for dosage varied widely, from a few drops to spoonfuls several times a day. This was the era of theatrical pharmacology: drugs that had a violent effect on the body were held to have equally potent effects on disease. Arsenic became known as "the mule", not only because its effects on the body were unpredictable and dangerous, but also because of the perception that it could perform in all diseases, under all circumstances. Amongst its other actions on the body, arsenic produced generalised swelling of tissues, creating the illusion of weight gain and a general improvement in health; many diabetics’ lives were shortened as a result.

Towards the end of the 19th century, the use of heroic dosing in general had declined and Fowler's Solution was used more cautiously. However, it was still held to be efficacious in a variety of clinical settings. Slowly, the use of such tonics declined as more clearly efficacious drugs came into use. There was one notable exception: arsenic remained in use for the treatment of leukaemia well into the 1930s, until chemotherapeutics and radiotherapy came to the fore. The 1936 edition of W. Hale White's Materia Medica notes that arsenic had been replaced by diathermy for skin conditions, while preparations of liver had surpassed it in the treatment of anaemia. In contrast to earlier editions, no other conditions rate a mention. Arsenic had finally had its day.

Or had it?

Chinese physicians began investigating TCM preparations as cancer treatments in the 1970s and reported their results to the world. One such remedy - Ailing-1 - appeared to be beneficial in a subgroup of patients who all had acute promyelocytic leukaemia (APL). Patients were followed for long periods, and survival rates were impressive. Ailing-1 is a typical TCM remedy composed of several ingredients, but the 1% arsenic stood out as a likely active ingredient (the next most likely being the mercury). This was indeed found to be the case, and arsenic has now been the subject of several trials worldwide for APL, the consensus being that it has efficacy in this cancer. In all cases, arsenic has been used in patients who have relapsed after successful treatment with, or have developed resistance to, conventional treatments. There is now ongoing interest in using arsenic again as an adjunct to existing therapies in other malignancies.

How does it work? As with any chemotherapy, selective toxicity is probably the simplest explanation. Different cells respond to poisons at different doses; APL cells are sensitive to low doses of arsenic. The molecular jiggery-pokery is a little too complex to describe here (there’s a link at the end for those who think they can stomach it).
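The principle can be put in back-of-the-envelope terms. Here is a minimal sketch of selective toxicity using a simple Hill-type dose-response curve; the sensitivities (EC50 values) and units are invented for illustration and are not clinical data.

```python
# A minimal sketch of "selective toxicity": two cell populations with
# different sensitivities to the same poison. EC50 values are invented.

def hill(dose, ec50, n=1.0):
    """Fraction of cells killed at a given dose (simple Hill curve)."""
    return dose**n / (dose**n + ec50**n)

# Hypothetical sensitivities: the tumour cells respond at a much
# lower dose than healthy cells (arbitrary units throughout).
apl_ec50, healthy_ec50 = 0.5, 50.0

dose = 2.0  # a "low" dose that sits between the two sensitivities
print(f"APL cells killed:     {hill(dose, apl_ec50):.0%}")
print(f"Healthy cells killed: {hill(dose, healthy_ec50):.0%}")
```

With the tumour cells a hundred times more sensitive than healthy tissue, a dose exists that kills most of the former while barely touching the latter, which is all "selective toxicity" really means.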

While it is easy to retort that TCM simply reignited interest in a known use for arsenic in this case, it’s not the first time that elements of a TCM remedy have been purified and employed by medicine. That said, there aren’t many such examples and most of TCM is probably dangerous nonsense.

More on the mechanism of action of arsenic in APL

Thursday 3 June 2010

Chillies for arthritis

If you’ve ever been complacent in the kitchen when chopping chillies, you’ll know just how irritating they can be to the more sensitive parts of the body. In centuries past, herbs with such obvious effects on the body were frequently experimented with as therapeutics, and various healing powers were imposed upon them. The 17th century herbalist Culpeper attributed many virtues to these pungent fruit, such as supposed effects on digestion and the kidneys. However, the crude extracts of chillies that appear in the British Pharmacopoeia of the late 19th century were intended mainly for external use. Modern medicine is still exploring the potential uses of the fiery chemical that chillies contain.

Many are probably familiar with capsaicin as the principal active constituent of chilli peppers, since it is employed by the police as an irritant to incapacitate violent offenders. Once concentrated capsaicin contacts the mucosal surfaces of the eyes, nose and throat, the pain is absolutely excruciating (I inadvertently got some up my nose once in the lab, and it was an amazing experience that I choose not to repeat). If you sprayed a similar solution of capsaicin on your hands and rubbed it in, the pain would be much less severe, possibly just a mild tingling sensation (since skin is an effective barrier), depending on the strength of the preparation. This tingling, burning sensation is still exploited in some over-the-counter pharmacy preparations (which come with warnings about avoiding getting it in your nose and eyes). These are hangers-on from the days when such preparations would have been used as counter-irritants. Capsaicin isn’t unique, though, and many similarly burning or irritating materials were employed in the same way.

A popular and long-lasting counter-irritant concoction appears in the Parke, Davis & Co. catalogues from early last century (mine is the 1921 edition). Their “Capsolin” ointment contained not only crude capsaicin but a mixture of similarly irritating herbal extracts:

Oleoresin of capsicum
Camphor
Oil of turpentine
Oil of cajuput
Oil of croton

All of these can be classed roughly as rubefacients: chemicals that cause reddening of the skin. Contemporary accounts of this potion suggest that it was very painful to apply, although this would depend, of course, on how much was applied to a given surface area of skin. Capsolin was initially indicated for use wherever a counter-irritant was required, i.e. in any disease in which an inflammatory reaction was held to be responsible for the symptoms (many diseases of the day). It probably had little actual therapeutic effect in diseases beyond being a distracting placebo. It would have felt invigorating on arthritic knees on a cold morning, and our modern experiences in this setting can tell us something about how effective it might have been (see below). Somewhere along the way Capsolin (and other rubefacients) fell into use by sporty types, who used it as a means to “warm” muscles. Capsolin was not alone – the less irritating oil of wintergreen is still used for this counter-irritant purpose by athletes.

A quick Google search reveals that Capsolin was popular with American baseball players, acquiring the moniker 'atomic balm'. Some players seem to have used heroic amounts of it – entire tubes during the course of a single game. Much as one becomes accustomed to the heat of chillies as a culinary additive, these players must have deadened their nerves to some extent. It is this additional effect of regular, heavy use of potent preparations that brings us to the modern interest in capsaicin.

We classify the different sensory nerves in the body according to the modalities they respond to (eg. heat, cold, pressure, stretch), how fast they conduct nervous impulses to the central nervous system, and what neurotransmitters they release. When you stub your toe, the first impulses to reach the brain are conducted by nerves that carry impulses very quickly. This initial sharp pain triggers the gut feeling of “Oh, this is going to hurt”, and a second or so later, impulses carried by slower nerve fibres reach the brain; these nerves are responsible for the longer-lasting, dull and throbbing pain. Anyone with children will recognise the slight lag in the arrival of these impulses via different nerve fibres, since the right parental reaction (distraction) in that window is the difference between “oops” and several minutes of screaming. Capsaicin acts on these slower sensory nerves. When applied for hours at low concentrations, capsaicin will deplete these sensory nerves of their neurotransmitters so that any impulses carried by them cannot be communicated to the central nervous system. At the same time, the pain induced elicits endorphin production in the brain. This combination of deadening peripheral nerves, as well as the release of endogenous pain-killers in the central nervous system, produces effects much like true analgesia. Prolonged use (weeks), or short term use of very high concentrations (much higher than those in Capsolin) may eventually lead to damage to sensory nerves, producing a longer term relief from pain at the site of application. This is probably the effect that baseballers achieved: they became insensitive to the dull, throbbing pain from their worn-out pitching shoulders. Few doctors would have recommended it over a proper rest, but professional athletes at the time could rarely afford to miss a game.
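That "second or so" lag is simple arithmetic: divide the distance travelled by each fibre type's conduction velocity. A rough sketch, using approximate textbook velocity ranges rather than measurements:

```python
# Back-of-the-envelope timing for the two waves of pain from a stubbed toe.
# Conduction velocities are rough textbook figures, not measurements.

def conduction_delay(distance_m, velocity_m_per_s):
    """Time for an impulse to travel a given distance along a nerve fibre."""
    return distance_m / velocity_m_per_s

distance = 1.2     # toe to brain, roughly, in metres
fast_fibre = 15.0  # sharp "first" pain fibres, roughly 5-30 m/s
slow_fibre = 1.0   # dull "second" pain fibres, roughly 0.5-2 m/s

lag = conduction_delay(distance, slow_fibre) - conduction_delay(distance, fast_fibre)
print(f"Lag between sharp and dull pain: {lag:.2f} s")
```

The sharp pain arrives in a few hundredths of a second; the dull, throbbing pain trails it by about a second, which is exactly the window a quick-thinking parent exploits.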

Once it became clear just how capsaicin acts on the body, and how specifically it desensitises or destroys subtypes of sensory nerves, there was renewed interest in it as a therapeutic. Capsaicin was touted as a potential treatment for chronically painful conditions such as osteoarthritis and neuropathic pain (pain arising from disease-addled nerve fibres), with the aim of desensitising or damaging sensory nerves. We can gain an insight into how effective counter-irritation might have been by considering the results of modern, controlled trials of capsaicin in these conditions. Regular application of low-dose capsaicin creams or patches does not appear to be consistently effective in relieving the pain of osteoarthritis, yet even these doses are high enough that some patients are put off by the burning heat. With continued application, some desensitisation occurs, but it is unclear whether this produces a clinically significant reduction in pain. Indeed, such low dose creams have been used as placebos in trials of stronger formulations. Where capsaicin appears to be useful is in neuropathic pain, where high dose applications have been shown to reduce pain indices. These doses probably approximate those used historically by baseball players, and are so painful that they can only be applied after the relevant area has been treated with a local anaesthetic. Such is the damage to sensory nerves after such an application that the degree of pain relief lasts for weeks. Whether patients with milder, arthritic pain will prefer such a treatment to conventional pain-killers remains to be seen. Nevertheless, lower dose formulations are still considered third-line agents for those who find they provide some relief, whether the effect is real, or just an old-fashioned, counter-irritant placebo.

Counter-irritation

The ancient and somewhat vague concept of counter-irritation persisted, in various guises, right through to the recent past. Through their history, counter-irritants have been either chemical (eg. extracts of chillies, mustards and the like) or physical (eg. cupping, ‘medical electricity’, or even medicine's brief flirtation with moxa) in nature, and aimed to distract the body in one way or another from pain or irritation. Like many practices from medicine's darker past, it could be deeply unpleasant: the desired effects of counter-irritants varied from mild itching through to severe inflammation and blistering.

Sometimes, a counter-irritant might be applied remote from the source of the problem. For example, during an attack of angina most patients experience shooting pains radiating away from the centre of the chest – typically down the left arm. Thus, one view held that a counter-irritant applied to the arm would distract (or “short-circuit”) the pain arising from the heart. In other cases, irritants were applied closer to the source of discomfort. Using angina as an example again, it was not uncommon for physicians to apply a mustard plaster to the chest to try to counter the pain within, or to try to somehow draw the inflammation out to the surface (it’s all a bit vague from our now more rigid perspective). These attempts to alleviate pain could be quite severe; it was not uncommon, and quite intentional, for painful blisters to develop. Counter-irritants would probably have been applied as frequently as (and as an adjunct to) copious blood letting. Although the practice of bleeding is now all but forgotten, the ghost of counter-irritation still walks. For example, clutching a hot water bottle to alleviate period pain is probably a hanger-on from this way of thinking. Whether this practice has any real therapeutic value would be difficult to determine with confidence; but few would deny it is soothing in some way. Similarly, acupuncture can be thought of as a counter-irritant, and many argue that this is how this largely unproven fad “works” (perhaps without realising it) to provide relief from pain. Many of us would bite a knuckle or part of the hand to counter pain elsewhere, especially as children. Counter-irritation makes tantalising sense at the lower end of the pain spectrum.

Until relatively recently, authors of medical texts tried to explain the mechanisms of counter-irritation by warping contemporary knowledge of neurophysiology around the supposed efficacy of the approach. For example, at one time it was thought that nerve impulses from pathologically painful sites could be short-circuited in the spinal cord or brain somehow by nerve impulses from the site of application of a counter-irritant. It’s not entirely absurd. We have probably all experienced vaguely similar phenomena when, for example, we are too preoccupied with a task to notice that we have cut ourselves in a way that would ordinarily be painful. Stories abound of soldiers enduring stressful combat experiences, only to notice a bullet wound that would have felled them had they merely been standing on guard. One current and popular explanation for such events would be that the brain produces endogenous analgesic substances such as endorphins in such situations. This is where the realms of actual efficacy and placebo effects intersect in the Venn diagram of pain: placebo effects of treatments have been consistently shown to depend on production of endorphins as well. Ironically, acupuncture is often touted on the basis that it induces endorphin release, as if this provides some scientific authority to this largely pointless practice, rather than evidence that it is nothing more than a placebo. And that is why the “ancient tradition” of acupuncture has been consistently found to be no better than just prodding someone randomly with a toothpick (placebo), but better than doing nothing at all.

The idea that sensory impulses can be short-circuited is still with us. There are a variety of devices that can be used to electrically stimulate the spinal cord in the vague hope of achieving pain relief. Some devices are surgically implanted, and patients can control when and how electrical stimulation is delivered. Despite the cost of equipment (and its implantation) there is very little evidence that this technique isn't just a placebo. At present it is indistinguishable from quackery: small, poorly-conducted clinical trials and anecdotes.

Monday 10 May 2010

A speedy cure for depression

One of the charges frequently thrown in the face of the pharmaceutical industry these days is that they have run out of diseases to target, and have started inventing maladies in order to increase profits. Sometimes dubbed 'disease-mongering', this practice is hardly new. The story of the use of amphetamine to treat minor depression is a classic reminder of how little has changed in the way that drug companies interact with medical researchers and practitioners to peddle drugs.

Amphetamine was first developed by an American by the name of Gordon Alles. Between the wars there had been an interest in developing drugs that acted like adrenaline, but had a longer duration of action. Adrenaline had become a useful drug to clinicians because it could reverse the symptoms of asthma, as well as increase blood pressure in cases of shock. But adrenaline doesn’t have a very long half-life, and had been surpassed in the late 1920s by ephedrine (more on this another time). Alles was trying to make a drug that improved on the longer-lasting effects of ephedrine. Amphetamine was one of the compounds he came up with in 1929, although it wasn’t known by this name until some time later (to organic chemists it is phenylisopropylamine). Alles tested it in animals and found that, much like ephedrine, it increased blood pressure and was active if swallowed. Before the close of 1929 he had tried it himself and had it tested on asthmatics. Both he and the asthmatics noted the marked stimulatory effect that amphetamine had on the central nervous system. Unfortunately, the asthmatics also noticed that it wasn't as good a remedy for their ailment as ephedrine. He and one of his clinician friends presented these findings at a meeting of the American Medical Association in the same year. Alles went on to pursue other projects in his bid to improve upon ephedrine but, dejected by the compound's lack of efficacy in asthmatics, doled out amphetamine to clinical colleagues from time to time for them to test on their patients with various conditions.

Smith Kline and French, as they then were (now GlaxoSmithKline), released Benzedrine onto the market in 1933. Benzedrine was amphetamine in all but name, and Alles had had the sense to patent his work. It’s not entirely clear to me what interaction Alles and SKF had, but by the end of 1933 Alles was working closely with the management of SKF and was receiving a 5% royalty on sales of Benzedrine. Some sort of understanding had clearly been reached. Benzedrine was initially marketed as an inhaled decongestant. It would have worked, in the same way that cocaine would have worked before it was usurped by adrenaline and then ephedrine. As a company trying to embrace science and move beyond more cosmetic products, SKF had bigger hopes for amphetamine. Much like Alles (and with his guidance), but with a more brutal commercial attitude, SKF started sending samples of amphetamine to any clinician working with patients who might conceivably benefit from the drug’s actions. This included trials of amphetamine in abating chest colds, dysmenorrhoea ("period pains") and heart conditions. What SKF wanted was evidence that the drug was more than a decongestant, so that they could legitimately advertise it for other indications. There aren’t many published reports of just how useless amphetamine was in however many conditions SKF considered since, then as now, drug companies were a bit too secretive about negative findings. It certainly didn’t work in dysmenorrhoea, which must have been a blow for SKF as this was (and still is) considered a cash cow. But they struck gold in the end.

Alles already knew from his clinician colleagues that amphetamine was useful for narcoleptic patients – people who fall asleep unpredictably. SKF obtained more data from additional trials and advertised the drug for this condition as well in 1935, but it is not a common condition. In 1936 SKF finally heard the news they were waiting for. Abraham Myerson, a psychiatrist, reported that amphetamine appeared to be of benefit in some cases of depression. This is where the disease-mongering began. Myerson had unusual ideas about the psychiatric disorders that were crudely recognised at the time, and they fitted perfectly with the central nervous system side effects of amphetamine (the very effects that Alles had tried to eliminate). Myerson had written two popular works, the titles of which suggest something about his views on the psychiatric problems of the day: When life loses its zest, and The nervous housewife. He adopted the term "anhedonia" for the ailment that he thought was raging through society and causing people to live in conflict with the world around them. In short, he thought they’d lost the "pep", "zest" or "energy feeling" that better-adjusted people retained. Amphetamine, he reasoned, filled the hole. By 1937 SKF started pushing amphetamine to treat any mood disorder that psychiatrists could identify, and the money started flowing in. Slowly a consistent picture emerged: amphetamine appeared to be useful in mild depression, but ineffective (or worse) in more severe cases and other psychiatric conditions. Initial AMA regulations insisted that amphetamine should be used only in patients already institutionalised, but many general practitioners prescribed the drug in response to SKF's advertising campaign.

Without amphetamine, anhedonia was just one popular psychiatrist's concept, and without anhedonia, Benzedrine was just a decongestant. Conservative estimates put Benzedrine sales at about a million pills per day in the US by 1945.

Further reading:

There is a very good, if rather long account of the amphetamine story in:
Rasmussen, N (2006) Making the first anti-depressant: amphetamine in American medicine 1929-1950. J Hist Med Allied Sci 61(3): 288-323.

Thursday 29 April 2010

Putting a spring in your step with strychnine

Everyone should be familiar with this favourite of crime fiction. Strychnine is not the most poisonous substance known to man (you need at least 50 mg to kill someone), but you'd have to go a long way to find something that provokes a more violent death. So why the hell would you try to use it as a medicine?

Strychnine has two notable actions. Firstly, its taste is intensely bitter, and can be detected in quite dilute solutions. Secondly, it blocks receptors for the neurotransmitter glycine, which is present in the spinal cord, brain and retina. Glycine is an inhibitory neurotransmitter, meaning that when it is released and binds to receptors on other neurons, its effect is to reduce neuronal activity. It is particularly important in the spinal cord, where inhibitory neurons are essential to the precise way the nervous system controls our muscles. When one muscle contracts, the neuronal circuitry in the spinal cord simultaneously ensures that any opposing muscles do not, using inhibitory neurotransmission to the neurons that control them. If you block glycine receptors, this delicate control system is lost and the result is violent, uncontrollable convulsions all over the body. Eventually, control of respiration ceases and death follows. Because glycine is also an inhibitory neurotransmitter in the retina, victims often notice visual disturbances as well.

Strychnine is isolated from nux vomica, the seeds of Strychnos nux-vomica. Nux vomica may ring a few bells, because it is a favourite of homeopaths. Of course, homeopaths dilute strychnine down until no active ingredient can possibly be left, and like all their potions, nux vomica is nothing but water. Conventional medicine, it turns out, has treated strychnine in an ironically similar way. It is the bitter taste of strychnine that is the key to this interesting story.
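The claim that nothing can be left is simple arithmetic, and worth spelling out. Here is an illustrative back-of-envelope calculation (my sketch, not from any homeopathic text): each "C" step in a homeopathic dilution is a 1-in-100 dilution, so a typical "30C" preparation dilutes the starting material by a factor of 100^30.

```python
# How many molecules of strychnine survive a homeopathic "30C" dilution?
AVOGADRO = 6.022e23  # molecules per mole

def molecules_after_dilution(moles_of_solute, c_dilutions):
    """Expected number of solute molecules left; each 'C' step dilutes 100-fold."""
    dilution_factor = 100 ** c_dilutions
    return moles_of_solute * AVOGADRO / dilution_factor

# Start with a whole mole of strychnine (about 334 g!) and dilute to 30C:
remaining = molecules_after_dilution(1, 30)
print(f"{remaining:.1e}")  # ~6.0e-37 - effectively zero molecules
```

Even starting from an absurdly large amount of strychnine, the expected number of molecules remaining is around 10^-37: the final preparation is, to any practical certainty, just water.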

Strychnine was promoted as a "tonic" (a dangerously vague term) well into the twentieth century, having first appeared in routine medical use in the 18th century. Its properties as a tonic were thought to be two-fold: stimulation of appetite and digestion, and an increase in the excitability of muscle. You can taste the bitterness of strychnine at quite dilute, and harmless, concentrations. Bitterness was long held to stimulate the stomach, and indeed there is something to this theory. Reflex (Pavlovian, if you like) stimulation of gastric secretion by a bitter taste in the mouth probably does occur, and 'digestive' drinks in many cultures have a bitter taste. But remember, this is all about bitterness, and not any effect on the central nervous system. The supposed effect on the excitability of muscles was extrapolated from observations of poisoning; it was thought that at lower doses a milder, useful version of the same events would unfold. Although it sounds reasonable, the doses of strychnine that were taken for the 'tonic' effect would have no effect on the nervous system at all. To quote my battered copy of Goodman and Gilman's The pharmacological basis of therapeutics (1975):

"To the drug have been ascribed properties that it does not possess, or that it exhibits only when administered in toxic doses."
People took this bitter placebo for a couple of hundred years, all the same.

Cocaine for asthma

Cocaine has two effects on the body, mediated in different ways. This isn't an uncommon thing for a drug to do - most drugs have side effects. The first thing that cocaine does is induce local anaesthesia when it is injected into tissues (or dropped into the eye). It was a very handy drug around the end of the nineteenth century for this effect, as it allowed minor surgery without pain. The second thing that cocaine does is interfere with neurotransmission - the way that neurons talk to other neurons (or other cells, like muscles).

Nerve terminals release neurotransmitters to communicate with other cells. These generally small chemicals diffuse across to the target cell and bind to receptors to elicit a response. The fate of released neurotransmitters is either degradation by enzymes in the vicinity or re-uptake by the nerve terminal. Re-uptake requires a transporter in the nerve terminal, and it is some of these transporters that cocaine blocks.



Now, in the brain cocaine blocks dopamine and noradrenaline re-uptake, leaving more of these neurotransmitters free to bind with receptors, causing the euphoric effects that cocaine is rather famous for. The story in the rest of the body is rather simpler, and this is where the connection to asthma comes in.

There are nerves that release noradrenaline all over your body, and they affect various processes without you having to think about it (in fact, you have no direct control over these nerves at all). These nerves constitute the sympathetic nervous system, and two of the things that they cause when stimulated are vasoconstriction (decreased diameter of blood vessels) and increased activity by the heart. Users of cocaine will be familiar with the increase in heart rate (and force of contraction) and physicians regularly warn against the increase in blood pressure that cocaine produces via its twin effects on the heart and the blood vessels. But how would these effects be of any clinical use in the treatment of asthma?

The answer is a historically interesting one. Around the mid-1870s, early attempts at bronchoscopy (looking inside the airways) revealed that during an asthma attack the lining of the airways (the mucosa) becomes red and swollen. It was (quite correctly) surmised that during an attack, blood flow to the mucosa increases, causing swelling that reduces the diameter of the airways and hence limits airflow. It didn't take too long for someone to reason that if you inhaled cocaine, it would constrict the blood vessels in the mucosa and reduce the swelling. It would have worked (but not as well as current treatments), and it had a brief life as an asthma treatment from about 1885 to 1900. Sufferers would have found the relief produced by cocaine better than the mixed bag of very strange advice given by physicians at the time.

We don't use any drugs for asthma that work in a similar way today. We tend to use drugs that cause relaxation of the muscles around the airways, rather than constriction of the muscles surrounding the blood vessels in the mucosa. We do have cocaine-like drugs in modern use though, and they have their uses. But that's another post.

Thursday 15 April 2010

Cupping

This is not technically a poison, but it's a good example of something we'd consider barmy today. (And I had to start somewhere). It's also entertaining because the practitioners of quack medicine are mad keen cuppers. One of the most startlingly obvious features of quack medical practices is that - irrespective of the modality - they are very often something that has been discarded by medicine during its evolution. Only the really whacky ones aren't.

Cupping is an ancient approach, that much is true. Egyptians were cupping around 1000 BC and the practice may be older still (and have evolved more than once, in different places). The Hippocratic and Galenic traditions that physicians still adhered to at the turn of the 19th century encouraged cupping for a variety of ailments, and the practice only died out over the following hundred years.

There were two forms of cupping, which were often mixed together. The basic process was common to both: heated cups were placed over the skin, and as the air cooled inside them (and the pressure dropped) they sucked the skin outwards. That much is dry cupping. Wet cupping involves scarifying the skin before applying the cup, and was a form of bloodletting. The two forms were often mixed, dry cupping being applied first to form a blister, which was subsequently lanced and re-cupped to be bled.
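The suction itself is straightforward physics. As a rough sketch (my illustration, with assumed temperatures, not anything from the historical sources): once the cup is sealed against the skin at roughly constant volume, the pressure of the trapped air falls in proportion to its absolute temperature as it cools (Gay-Lussac's law).

```python
# Why a cooling cup sucks the skin outward: at (roughly) constant volume,
# pressure is proportional to absolute temperature.
ATMOSPHERIC_KPA = 101.325

def sealed_cup_pressure(t_hot_c, t_cool_c):
    """Pressure inside a sealed cup once its air cools from t_hot_c to t_cool_c (Celsius)."""
    return ATMOSPHERIC_KPA * (t_cool_c + 273.15) / (t_hot_c + 273.15)

# Assume the air is heated to ~150 C and cools to skin temperature (~37 C):
inside = sealed_cup_pressure(150.0, 37.0)
suction = ATMOSPHERIC_KPA - inside
print(f"inside ~{inside:.0f} kPa, suction ~{suction:.0f} kPa")
```

With those assumed temperatures the pressure inside the cup drops by roughly a quarter of an atmosphere - more than enough partial vacuum to draw the skin up into a blister.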

How did it work? It almost certainly didn't do any therapeutic good whatsoever. The idea of bloodletting came from the ancients and was thought to restore the balance of humors in the body (they being blood, phlegm and yellow and black bile). Bloodletting in specific locations on a small scale was thought to draw inflammation away from nearby tissues. On a larger scale, patients would be bled from a large vein until they fainted. Since all conditions were thought to be due to an imbalance of the humors, this practice was quite widespread until the revolutions that occurred in medical practice during the 19th century. Untold unnecessary deaths resulted. On the other hand, more benign dry cupping probably arose from the notion of counter-irritants - the idea that a disease-induced irritation could be resolved by a deliberate one elsewhere.

Modern quacks promote dry cupping as an ancient and traditional Chinese healing system, similar to acupuncture. Western medicine employed it for thousands of years, and only discarded it (with a lot of other practices) when people started to pay careful attention to whether treatments actually worked or not.