The Scientific Activist (Archives)






Mar 24, 2006

The Scientific Activist Goes on the Road


In just a few hours, I’ll board a flight bound for India, where I’ll spend the next couple of weeks traveling with friends. Although I should be able to check on my blog occasionally, I probably won’t be doing any blogging while I’m gone. In the meantime, please enjoy the archives, or better yet check out all of the great sites on my blogroll.

I’ll be back in Oxford on 10 April, and thereafter The Scientific Activist will return in full force, featuring more analysis from the intersection of science and politics plus some thoughts on my travels. Incidentally, this trip to India is the main reason why the blogging has been so sporadic lately, as I’ve crammed a semester’s worth of research work into the last few weeks in anticipation of all of the work that I’m definitely not going to be doing while I’m gone.

If you want to be one of the first to know when the blogging resumes, though, I would encourage you to sign up for my email service, which will notify you when new material appears on the site.

See you then!

Mar 19, 2006

Genetic Engineering's Next Challenge: The Smiley Face

Genetic engineering holds a great deal of promise, from potentially curing a variety of human ailments to addressing nutritional deficiencies through transgenic crops. One project even aims to engineer into bacteria the ability to generate a variety of alternative fuels. When it comes to genetic engineering and its emerging potential, it seems that the only real limit to the field is that it can only be used to design or improve something that is actually alive.

Despite this “limitation,” some scientists have found that the raw material used in genetic engineering, DNA, could be an attractive building material for inanimate objects, on the nanoscale at least. Given DNA’s well-defined structure and fundamental symmetry, coupled with the virtually unlimited variability available through chains of different lengths and sequences of its four constituent bases—adenine, guanine, cytosine, and thymine—this interest isn’t very surprising. These aspects of DNA stem from the specific pairing of its bases: adenine always binds to thymine, guanine to cytosine. This fundamental property allows a living cell to replicate its DNA and read the embedded code that details the sequences of all of the cell’s RNA and proteins. This property also makes DNA potentially useful for a variety of technical applications.

This week, a research article and accompanying news piece in Nature (subscription required) detail a compelling, and arguably very cool, breakthrough in the field of DNA nanotechnology. The report, by Paul Rothemund of Caltech, details a simple but apparently effective technique of designing and building virtually any two-dimensional shape using one long strand and several short pieces of DNA.

The method is fairly straightforward, and after a few planning steps nature takes care of the rest. After the desired shape has been chosen, the shape is conceptually filled in with rows of parallel DNA double helixes. The next step involves mapping one long single strand of DNA onto this template so that it zigzags over the entire surface, providing the shape with a great deal of stability. This piece of DNA is only single-stranded, though, so short pieces of DNA are then designed to be complementary to specific stretches of this long strand, completing the double helixes. These short pieces provide additional cross-links as well. After the DNA has been designed and manufactured, the individual pieces just have to be mixed together, and they’ll self-assemble into the desired shape, whatever that may be.
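To make the design rule concrete, here is a minimal Python sketch of my own (it is not code from the paper, and the two-segment staple helper is a hypothetical simplification of the real design rules, which also have to place crossovers and work with the scaffold’s actual sequence). The only chemistry it encodes is the base-pairing rule described above.

# Watson-Crick complementarity: A pairs with T, G pairs with C.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Return the strand that pairs antiparallel with seq."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def staple_for(scaffold_segment_a, scaffold_segment_b):
    """Toy staple design: a short strand that bridges two scaffold
    regions is the reverse complement of each region, joined end to end."""
    return reverse_complement(scaffold_segment_b) + reverse_complement(scaffold_segment_a)

print(reverse_complement("ATGCGT"))    # prints ACGCAT
print(staple_for("ATGCGT", "TTAACG"))  # prints CGTTAAACGCAT

Because each staple is complementary to its own unique stretches of the scaffold, there is essentially only one arrangement that satisfies all of the base pairs at once, which is, roughly speaking, why simply mixing the strands is enough for the shape to assemble itself.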

This is all well and good in theory, but, as they say, the proof of the pudding is in the eating:


In this figure, the top two rows are diagrams of the planned shapes, and the bottom two rows are actual images taken with atomic force microscopy. The images in the third row from the top are each 165 nm wide (roughly 1/150,000 of an inch). My personal favorite here is the “disk with three holes,” which most people would probably call a smiley face (you’ve got to love the unnecessarily dry language in these reports).

Even cooler than these images, in my opinion, are the next set, where some of the short cross-linking pieces of DNA were changed to create pixels:


The bright spots are areas more dense in DNA, formed when the cross-links fold upon themselves to form a double helix that sticks out of the image.

This is pretty neat, but can this technology be used for anything useful, instead of just making small low-resolution microscopic images? The author concludes the paper with his vision for the technology:
An obvious application of patterned DNA origami would be the creation of a 'nanobreadboard', to which diverse components could be added. The attachment of proteins, for example, might allow novel biological experiments aimed at modelling complex protein assemblies and examining the effects of spatial organization, whereas molecular electronic or plasmonic circuits might be created by attaching nanowires, carbon nanotubes or gold nanoparticles. These ideas suggest that scaffolded DNA origami could find use in fields as diverse as molecular biology and device physics.

Rothemund’s use of the term “DNA origami” reminded me of a story I wrote for The Battalion last year entitled “Biochemical Origami”. There, I explored the search to understand how proteins fold into their three-dimensional shapes, which turns out to be a much more complicated area:
Our bodies are made up of different types of cells, which are themselves made up of different types of molecules. Some of the molecules, called proteins, are linear chains of different chemical blocks called amino acids and the true workhorses of the biochemical world.

A snake-like chain of amino acids without a distinct shape is virtually useless. Therefore, a protein must fold into a varying three-dimensional shape to function properly. This holds true in all life forms, from complex humans to bacteria made of only one tiny cell.

Nick Pace, professor of biochemistry at A&M, studies protein folding. He is one of many scientists trying to solve the "protein folding problem.”

"Protein folding means being able to predict the three-dimensional structure of a protein," Pace said.

Correct folding is important, Pace said, because "many diseases are protein folding diseases," including Alzheimer's disease, Huntington's disease and cystic fibrosis. In these diseases, incorrect folding of important proteins causes the disease's symptoms.

Think of protein folding as biochemical origami - a bland chemical chain masterfully folded into an elegant and efficient protein machine.

Theoretically, the specific order of amino acids should determine the three-dimensional shape of the protein, just like a sheet of paper able to fold itself into an elegant origami swan….

…In theory, protein folding is simple. Proteins are subject to the same physical laws as anything else. If one could understand the forces involved in protein folding, predicting the structure of new proteins should be fairly simple.

"It just turns out," Pace said, "that the physics involved in protein folding is a lot more complicated than we thought it would be."

Predicting the shapes of proteins—which are made up of combinations of twenty different amino acids and do not follow any simple geometric rules (i.e. no base pairing)—has proven extremely difficult. The basic shape of DNA, on the other hand, has been known since James Watson and Francis Crick first solved it in 1953. This discrepancy fits well with the differing roles DNA and proteins perform in the cell, with DNA functioning as a relatively stable linear code but proteins performing most of the varied and complex tasks that keep living things alive. It should not be surprising, then, that scientists have found these properties of DNA well suited to self-assembling nanostructures, capitalizing on billions of years of evolution to take nanotechnology another impressive step forward.

Mar 16, 2006

Tangled Bank #49 at Living the Scientific Life

The newest issue of the Tangled Bank is up at Living the Scientific Life. This week's edition is pretty extensive, and it's a great collection of science writing on a wide variety of topics.

Mar 13, 2006

Never Be in the Dark Again!

In today's crazy go-go world you might not always be able to check the web for the latest from The Scientific Activist. Just to make your life easier, you can now let The Scientific Activist come to you via email. If you sign up, using the form below or the one in the sidebar, you'll receive an email notice when a new post appears on the site.


Free Market Frenzy

If sellers are allowed to compete freely without any regulations, market forces will inevitably drive down prices and improve the quality of services so that everyone wins, even the consumer—or so the dogma goes. Life is rarely so simple, and markets don’t always behave so predictably. In the case of energy, in fact, deregulation has had the opposite effect, catalyzing massive price increases.

Although this is not a new phenomenon, Sunday’s Washington Post details some of the more recent problems consumers have faced after buying into deregulation programs that promised to drive down energy costs:
Maryland and District consumers angry at the record electric bills they will receive this summer might want to recall the promises made by proponents of deregulation seven years ago. If they do, they'll be even angrier.

At the time, in 1999, evangelists for deregulation described a competitive, efficient and lower-priced system of energy delivery that, for the most part, remains a fantasy in the Mid-Atlantic region and other parts of the country today, according to industry experts.

The District, Maryland and Virginia, along with much of the nation, are wrestling with the ramifications of deregulation at the same time that the cost of producing electricity is skyrocketing. But as energy prices have soared, electricity rates have gone up more in deregulated states than in regulated ones….

…Residential customers -- especially those in Maryland facing an average $743 yearly increase in their BGE bills -- are left wondering what deregulation was for, if not to reduce prices.

The basis of the argument for deregulation depended on price competition between power suppliers, but due to a lack of real competition in most areas, prices have only gone up. In fact, deregulation was sold as a silver bullet that would address a variety of issues beyond just lowering prices, including encouraging infrastructure updates and modernization. However, in real life…
Another promise was that suppliers, freed from state regulation, would sell their power to the highest bidder, creating market incentives for increased efficiency and investment in new technologies. But since 1999, older, less-efficient plants have remained profitable, discouraging investment in new plants. The Mid-Atlantic region still has a shortage of capacity, and during much of the year must import energy from outside the area, often at high prices.

Deregulation was also supposed to encourage development of a national grid system. Utilities and their would-be competitors could buy from the lowest-cost producer, no matter where the producer resided. Pepco could, in theory, buy power from a plant in Oklahoma using cheap natural gas. But the national grid hasn't been improved because power producers -- the most logical source of capital to improve the system -- don't want the competition a robust national grid would allow. Today it is nearly impossible to move large amounts of electricity over long distances.

So, energy deregulation was based more on fantasy than fact, and hopefully state legislatures will learn their lesson from this fiasco and find a more reasonable solution that won’t subject their constituents to such wild price increases. Putting that aside, though, there was something else about the Washington Post article that really caught my eye:
Under the old system, the price of electricity was strictly based on what it cost the power company to produce it. Now, prices are based on what several hundred highly sophisticated power suppliers and traders believe the market will bear, prices that can have only nominal relation to cost.

This sounds surprisingly similar to what’s happening in the pharmaceutical industry right now. Although drug companies have traditionally argued that high prices were justified based on the large investment in research, development, and testing required to bring a drug to the market, many examples are coming to light where the price of a drug has little to do with these costs. An article in Sunday’s New York Times discusses a few examples of this disturbing recent trend:
The medicine, also known as Mustargen, was developed more than 60 years ago and is among the oldest chemotherapy drugs. For decades, it has been blended into an ointment by pharmacists and used as a topical treatment for a cancer called cutaneous T-cell lymphoma, a form of cancer that mainly affects the skin.

Last August, Merck, which makes Mustargen, sold the rights to manufacture and market it and Cosmegen, another cancer drug, to Ovation Pharmaceuticals, a six-year-old company in Deerfield, Ill., that buys slow-selling medicines from big pharmaceutical companies.

The two drugs are used by fewer than 5,000 patients a year and had combined sales of about $1 million in 2004.

Now Ovation has raised the wholesale price of Mustargen roughly tenfold and that of Cosmegen even more, according to several pharmacists and patients.

Sean Nolan, vice president of commercial development for Ovation, said that the price increases were needed to invest in manufacturing facilities for the drugs. He said the company was petitioning insurers to obtain coverage for patients.

The increase has stunned doctors, who say it starkly illustrates two trends in the pharmaceutical industry: the soaring price of cancer medicines and the tendency for those prices to have little relation to the cost of developing or making the drugs.

Genentech, for example, has indicated it will effectively double the price of its colon cancer drug Avastin, to about $100,000, when Avastin's use is expanded to breast and lung cancer patients. As with Avastin, nothing about nitrogen mustard is changing but the price….

… And once a company sets a price, government agencies, private insurers and patients have little choice but to pay it. The Food & Drug Administration does not regulate prices, and Medicare is banned from considering price in deciding whether to cover treatments.

I have written previously about Avastin and its producer Genentech, which openly acknowledges that its pricing of Avastin had little to do with the cost of producing the drug. The current Times article gives another example of this from Pfizer:
But people who analyze drug pricing say they see the Mustargen situation as emblematic of an industry trend of basing drug prices on something other than the underlying costs. After years of defending high prices as necessary to cover the cost of research or production, industry executives increasingly point to the intrinsic value of their medicines as justification for prices.

Last year, in his book "A Call to Action," Henry A. McKinnell, the chairman of Pfizer, the world's largest drug company, wrote that drug prices were not driven by research spending or production costs.

"A number of factors go into the mix" of pricing, he wrote. "Those factors consider cost of business, competition, patent status, anticipated volume, and, most important, our estimation of the income generated by sales of the product."

The idea that the income generated from a drug is the most important factor at play here seems surprisingly cynical coming from an industry purported to have the humanitarian goal of alleviating human suffering from disease. Although I have already written at length about this, I should reiterate that the drug industry is indirectly, but heavily, subsidized through federal funding of biomedical research. I’m pretty sure that voters support this funding for the promise of medical breakthroughs and new medications, not to give big pharmaceutical companies new vehicles for making bundles of money.

Although there are plenty of differences between the energy and pharmaceutical industries, the drug companies’ previous use of research and development costs as justification for high prices isn’t that different from energy companies preaching the benefits of deregulation. As the logic of both of these arguments begins to break down, it appears that the lessons learned in one may have some relevance to the other. Regardless, based on how things have gone in these industries, the idea of paying a price for a good or service based on what it actually costs to deliver doesn’t sound all that unreasonable anymore.

Mar 12, 2006

Is Fusion in Our Future?

Humans have relied on nuclear fusion from the very beginning, but only indirectly, as the ongoing fusion reactions that power the sun have provided the light and warmth that made the earth habitable for life in the first place. Harnessing this power directly, though, is a matter of more controversy and greater difficulty. Although the use of nuclear fusion as an energy source has appeared possible, however remotely, since the United States detonated the first fusion bomb (also known as a hydrogen bomb or a thermonuclear weapon) in 1952, an article in this week’s Science (subscription required) outlines the logistical challenges that may preempt any use of this potential energy source.

The article—written by William E. Parkins, who passed away in October 2005 after submitting his paper—describes the barriers to nuclear fusion as engineering problems, not physics problems. Although the science behind nuclear fusion is sound, the engineers of a fusion power reactor would have to generate and then contain the extremely high temperatures necessary for fusion to occur, and then remove the heat released by the reaction efficiently enough both to generate power and to keep the reactor from overheating.

As the name implies, nuclear fusion involves joining two atoms into one, most commonly the fusion of two hydrogen atoms to form a helium atom (although the process really just involves the nuclei of the atoms, for the sake of simplicity I’m using the terms “atom” and “nucleus” interchangeably). Nuclear power reactors currently in use generate power by taking advantage of the opposite process, nuclear fission, which is the breaking up of larger atoms into smaller ones. It may at first seem counterintuitive that two opposing processes can both release energy. In fact, it seems to violate one of the most fundamental concepts learned in chemistry 101: if a reaction in one direction releases energy, the reaction in the opposite direction should consume energy. The reason this works, though, is that atoms of different sizes behave differently, as illustrated by this diagram from Lawrence Livermore National Laboratory Public Affairs:


The nuclear stability of an atom depends on the number of protons and neutrons (collectively called nucleons) in its nucleus. The most stable nuclei are those of iron (56 nucleons) and nickel (62 nucleons). As one moves away from this point, heading toward larger or smaller nuclei, nuclear stability decreases. Since transitioning from a less stable to a more stable nucleus releases energy, fission generates power by breaking up nuclei larger than iron or nickel, and fusion does the same by combining nuclei of smaller atoms. The largest single energy transition occurs between hydrogen and helium, enabling such powerful phenomena as stars and hydrogen bombs.
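To put a number on that largest transition, here is a back-of-the-envelope calculation of my own (it is not taken from the Science article), using the deuterium-tritium reaction that most reactor designs aim for and standard atomic masses rounded to five figures:

\[ {}^{2}\mathrm{H} + {}^{3}\mathrm{H} \rightarrow {}^{4}\mathrm{He} + n \]
\[ \Delta m = (2.01410 + 3.01605) - (4.00260 + 1.00866) = 0.01889\ \mathrm{u} \]
\[ E = \Delta m\, c^{2} \approx 0.01889 \times 931.5\ \mathrm{MeV/u} \approx 17.6\ \mathrm{MeV} \]

Most of that energy leaves as the kinetic energy of the neutron, roughly 14 MeV, which is why the cost estimate quoted below is built around absorbing 14 MeV neutrons in a thick blanket-shield.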

Nature has taken care of the physics for us, but has left us with quite an engineering problem: to make this reaction occur, temperatures of roughly 100,000,000°C must be generated. Not only is this a difficult feat in itself, but such high temperatures have to be contained and safely dispersed. The fusion reaction itself then generates heat, and although this heat is used to generate power, removing it efficiently enough to prevent the reactor from melting down is not trivial. All of these problems seem to require expensive solutions, as Parkins details:
A 1000 MWe plant requires a thermal power of about 3000 MW, 20% of which must be absorbed by the vessel wall. If we assume an average heat transfer rate of 0.3 MW/m2, the vessel wall and blanket-shield each must have an area of 2000 m2. To absorb the 14 MeV neutrons and to shield against the radiation produced requires a blanket-shield thickness of ~1.7 m of expensive materials. This is a volume of 3400 m3, which, at an average density of about 3 g/cm3, would weigh 10,000 metric tons. A conservative cost would be ~$180/kg, for a total blanket-shield cost of $1.8 billion. This amounts to $1800/kWe of rated capacity--more than nuclear fission reactor plants cost today. This does not include the vacuum vessel, magnetic field windings with their associated cryogenic system, and other systems for vacuum pumping, plasma heating, fueling, “ash” removal, and hydrogen isotope separation. Helium compressors, primary heat exchangers, and power conversion components would have to be housed outside of the steel containment building--required to prevent escape of radioactive tritium in the event of an accident. It will be at least twice the diameter of those common in nuclear plants because of the size of the fusion reactor.

Scaling of the construction costs from the Bechtel estimates suggests a total plant cost on the order of $15 billion, or $15,000/kWe of plant rating. At a plant factor of 0.8 and total annual charges of 17% against the capital investment, these capital charges alone would contribute 36 cents to the cost of generating each kilowatt hour. This is far outside the competitive price range.
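For anyone who wants to check that last number, the 36-cent figure follows directly from the values Parkins quotes (this is my own arithmetic, assuming the stated 1000 MWe rating, 0.8 plant factor, and 17% annual capital charge on a $15 billion plant):

\[ \frac{0.17 \times \$15 \times 10^{9}}{10^{6}\ \mathrm{kW} \times 0.8 \times 8760\ \mathrm{h}} = \frac{\$2.55 \times 10^{9}}{7.0 \times 10^{9}\ \mathrm{kWh}} \approx \$0.36\ \mathrm{per\ kWh} \]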

Although there is a lot of jargon here, the take-home message is that generating power from nuclear fusion will be expensive and inefficient, or at least it will be using current technologies. Based on this analysis, Parkins draws an interesting conclusion. Because fusion power seems unfeasible, funding priorities should be shifted toward studying fusion in terms of basic science:
New physics knowledge will emerge from this work. But its appeal to the U.S. Congress and the public has been based largely on its potential as a carbon-sparing technology. Even if a practical means of generating a sustained, net power-producing fusion reaction were found, prospects of excessive plant cost per unit of electric output, requirement for reactor vessel replacement, and need for remote maintenance for ensuring vessel vacuum integrity lie ahead. What executive would invest in a fusion power plant if faced with any one of these obstacles? It's time to sell fusion for physics, not power.

I appreciate his focus on basic science, and I have written previously that scientists need to work harder to sell their work on its intrinsic value, not just on potential applications. Still, I am reluctant to give up on nuclear fusion so quickly. If anything, we as a society have proven ourselves incredibly inept at predicting which new technologies will make the seemingly impossible possible.

In the meantime, though, it looks like we’re going to have to leave fusion up to the real expert: nature. As the sun continues to provide us with an almost unending potential source of energy, it would be a shame not to take full advantage of it and harness the most powerful force of nature in a different way. I imagine that some of the funding now being spent on fusion research, important as that research is, could go a long way toward further developing solar power and other renewable resources. With the need to cut greenhouse gas emissions, whatever the cost, becoming increasingly clear, we need to hedge our bets and develop a truly multifaceted strategy for developing new energy sources.

Mar 10, 2006

Eh, We Probably Know Enough About That Global Warming Stuff by Now Anyways....

Demonstrating that the United States is not the only country where environmental research finds itself under threat, Tuesday’s Independent reported on scientists and scientific organizations expressing concern about, and opposition to, proposed cuts to major United Kingdom wildlife research centers. Although the cuts don’t smell of the same political partisanship underlying the attacks in the U.S., the threatened research centers have a proven track record of producing important research, often with poignant implications for global warming:
A torrent of high-level opposition is building up to the proposals to scrap Britain's three leading wildlife research centres, which are due to be voted on tomorrow.

More than 1,000 formal objections have been received by the Natural Environment Research Council (Nerc) to its plans to close the centres at Monks Wood, Cambridgeshire, Winfrith in Dorset and Banchory near Aberdeen.

The scheme, which will also see 200 wildlife scientists sacked, has caused anger among environmentalists, many of whom believe more, not less, specialised wildlife research is needed to protect Britain's habitats and species from growing threats, especially climate change.

The centres have been responsible for many discoveries about the natural world and the pressures on it. These include the first proof that global warming is having an impact on the living environment - Monks Wood researchers have shown that spring now arrives in Britain three weeks earlier than 50 years ago.

The Independent article provides a thorough outline of the scientific accomplishments of the three research centers, and rather than reproducing it here, I would encourage you to go to the source for more details. Although the studies at the centers cover a hodgepodge of disparate topics of varying significance, The Independent focuses on those that have helped demonstrate the pervasive consequences of global warming. Beyond that focus, the article also details some of the more scientific objections to the proposed closures:
The Royal Society, Britain's science academy and the most prestigious scientific body in the land, says: "Of particular concern are the threats posed to the vitally important long-term environmental monitoring sites, programmes, and data sets that play such a key role in underpinning our understanding of the natural environment and environmental change."

The Government's own wildlife conservation agency, English Nature, says it has "major concerns over the scale of the proposed cuts in staff and facilities". It comments: "We are concerned that even if biodiversity research programmes, and work on long term research and data, are retained, closure of centres and relocation of staff may mean that key staff with skills and knowledge essential to such work may be lost. This risks compromising these vital programmes."

Regardless of the angle from which one approaches the issue, legitimate arguments against closing the centers abound. In light of the strong case against the proposal, and the significant opposition that has built up against it, proponents of the plan will have to make quite an exceptional case for the closures to win over support. At this point, though, it seems to be in the best interest of the environment, and of U.K. science in general, for the closures not to happen.

Mar 8, 2006

Finding a Gray Approach to Cutting Greenhouse Emissions

As the description of The Scientific Activist states, the truth isn’t always black or white. However, you wouldn’t gather that from the struggle over efforts to cut greenhouse gas emissions in the United Kingdom, where there seem to be two mutually exclusive options: fully embrace nuclear power, or focus only on alternative fuel sources and energy efficiency. This is, of course, a false dichotomy, but some UK officials apparently think otherwise, as Tuesday’s Independent reports:
Tony Blair's backing for nuclear power suffered a blow yesterday when the Government's own advisory body on sustainable development came down firmly against the building of a new generation of reactors.

Despite the Prime Minister's well-known support for the nuclear industry, the Sustainable Development Commission (SDC) concluded that a new nuclear programme was not the answer to the twin challenges of climate change and security of supply. In a hard-hitting report, the 15-strong Commission identified five "major disadvantages" to nuclear power.

The five objections are waste, cost, inflexibility, security risks, and distractions from energy efficiency.
A new nuclear power programme would send out a signal that a major technological fix is all that is required, says the report, and hurt efforts to encourage energy efficiency. This has largely been the approach of the Bush administration to climate change. Environmentalists would contend that this is a dangerous delusion, and that technical fixes such as nuclear power do nothing about the long-term problem. Only changing the energy system profoundly will make a real difference.

While the report is on the right track in that a long-term solution to global warming will surely require new and innovative solutions, we are unlikely to find a panacea anytime soon. Decreasing greenhouse gas emissions by any means, then, should be seen as a positive step. The report even notes that doubling the UK’s use of nuclear power alone would lower carbon emissions to 8% below 1990 levels. Although other truly clean energy sources have the potential to reduce emissions much further, this decrease would still be significant.

The commission reports that embracing nuclear power would distract the country from what should be the real focus: developing renewable energy sources combined with cutting energy use. Although this green strategy will be essential to truly combat global warming, it is unclear why nuclear power should be considered mutually exclusive with it. In fact, a truly comprehensive strategy would include both. The stakes are high, though, with the future of our planet in the balance. As global carbon dioxide levels continue to skyrocket, betting that an alleged loss of support for other solutions would cancel out gains from nuclear power is a dangerous gamble.

Mar 4, 2006

Scientific Activists in Oxford Still Making Headlines

One of the biggest recent stories in scientific activism, on this side of the pond at least, is still making waves a week later. Friday’s Guardian ran an excellent piece on Pro-Test—the pro-research student organization at Oxford—and how it fits into the larger conflict between scientists and animal rights activists. Last Saturday, Pro-Test staged a thousand-strong march through the streets of Oxford to demonstrate the popular support behind animal experimentation, to educate the public on the nature of animal research, and to protest against the tactics that the animal rights movement has resorted to.

Although the demonstration was a major development in the current battle over Oxford’s plans for a new biomedical research center, its implications extend much further, highlighting an additional step that scientists could take, but rarely do, to build support for their work. A quote by Iain Simpson, one of the group’s leaders, drives this point home:
"This is about academics feeling under siege and our concern that no one is defending them. For decades scientists have been vilified for conducting necessary animal research that has led to advances that have saved millions of lives. Because no one has spoken out on their behalf, and because they have been too afraid to defend their work, a culture has developed where people are suspicious of what they are doing.

"This started as a local issue, but on a macro scale we hope to turn the tide in terms of animal research. Scientists and academic institutions have been too afraid to engage in the debate and, therefore, have allowed activists to set the agenda. Now I feel it is right to draw a line in the sand and say, 'No more.' We want to get that debate out in the open and win it based on reason."

Interestingly, the Guardian article picks up on something I noticed at the demonstration as well: it was pretty impressive that Pro-Test succeeded in attracting so many people and so much enthusiasm to what was fundamentally a pro-establishment cause.
These few students are now at the centre of a movement that could have enormous implications for scientific research and for the safety of those involved in it. In some ways this is a strange movement - students campaigning to defend the establishment instead of attempting to bring it down - yet Pro-Test's supporters would argue that it also belongs in the finest traditions of protest: embracing debate and opposing intimidation.

While I marched in the demonstration, my mind wandered back to the anti-war protests I attended in 2002-03. At some of them, I remember seeing small pro-war counterprotests in response to the rally I was in. I used to wonder—despite knowing that the idea of war held a decent amount of popular support in America at that time—what would drive someone to actually go out and demonstrate in favor of the status quo, especially if that meant protesting in favor of an invasion that seemed virtually inevitable by that point. (Once, when I was at an anti-war rally in Houston, I saw what instantly became my favorite counterprotest slogan: “Give war a chance.” That only reinforced these feelings.)

Although this dissonance never quite left my mind, it increasingly became clear throughout the rally that Pro-Test’s movement differed considerably from the pro-war cause. In fact, due to its dedication to spreading information, increasing understanding, supporting progress, and challenging violence and intimidation, the pro-research movement is in many ways the antithesis of what the pro-war protesters were trying to achieve. Along these lines, the Guardian provides several examples of just what kind of tactics animal research advocates are up against:
Yesterday, one victim of intimidation, asking not to be named, described how it feels. "There are death threats by email, or threats to kidnap your children," he said. "They might slash your car tyres or throw paint stripper over it. Then there are telephone threats, some of which threaten violence and others that are strangely polite. And there are letters to your neighbours telling them you are a paedophile or a rapist.

"This brings about enormous psychological pressure on you and your family, but the threats of violence are rarely followed up. Most of it is noise and bluster. But I was attacked on my doorstep one morning and had a substance sprayed into my eyes and then some men began to rough me up. Fortunately, I fell backwards into my hall - in front of my wife and three-year-old daughter. Then they smashed my windows, leaving me lying there covered in glass."

Although actual violence only comes from the more extreme minority of animal rights activists, even the more “mainstream” organization SPEAK, which sponsors the weekly protests at the construction site of the new research center, resorts to underhanded tactics that include intimidation. Apparently, the organization also relies on outright lies to try to get its point across:
The university says that 98% of the research in the lab will be carried out on fish and rodents, with a futher [sic] 2% on higher mammals, and less than 1% on primates. But Speak, a local anti-vivisection group involved in the protests against the development, claims that "whole troupes" of primates will be subject to experiments. Robert Cogswell, Speak's co-founder, says he regards the student group as "irrelevant".

"It is not so much a group of pro-vivisection individuals as a collection of people who simply oppose the animal rights movement," he says, claiming that most of those on Saturday's march (he puts the number at "400 at most") were "hunters in hunting regalia, and there were hardly any students".

"Nevertheless, if they give us someone with whom to debate, I welcome them. We have always wanted a public debate because we feel we can win the argument." He says the group does not condone violence.

I cannot independently validate or refute SPEAK’s claims about the use of primates in the new labs (although I tend to regard the University of Oxford as a more reliable source of information), but there are plenty of other untruths in the previous quote. Although the estimates vary, at least 800 people, and possibly over 1,000, attended the Pro-Test event. Also, the vast majority of participants that I met there were current Oxford students (when I went to a SPEAK protest, I did not meet a single Oxford student in the crowd). And if actions speak louder than words, it is telling that SPEAK has not demonstrated a burning desire to engage in a rational debate on these topics. While I still believe that animal research is a topic on which we as a society should maintain an active dialogue, Pro-Test appears much more amenable to this dialogue than SPEAK, and the actions of Pro-Test probably have a greater chance of ensuring animal welfare in research labs than the tactics of SPEAK.

Putting the need to counter the animal rights movement aside, can others facing anti-science forces of a different nature in other countries learn anything from what Pro-Test has done in the United Kingdom? I think so. The key point in this case was that the majority of people agreed with the pro-research cause, but they felt that they had been silenced or marginalized by a vocal minority. This situation might sound familiar to those in the U.S. How about the support behind funding embryonic stem cell research? Taking action against global warming? Teaching evolution? These are all areas where vocal ideologues have hijacked the debate, marginalizing what is otherwise a perfectly mainstream and rational viewpoint.

Of course the U.S. government should fund embryonic stem cell research, and most people agree. In fact, there’s arguably a lot less grey area there than in the battle over animal research, although the parallels between the two are stunning. There’s only one key difference: animal research in the U.K. is still going strong, while embryonic stem cell research in the U.S. is hurting. Badly.

With that in mind, maybe it's time for a good old-fashioned protest. If the scientists plan it, the people will come.

Mar 3, 2006

Distract Yourself With This

I've been busy grading papers lately (and then there's been that little matter of working on my Ph.D. research as well...) so in lieu of some real scientific activism, I'll direct you toward some of the more interesting things I've seen in the science blogosphere over the last couple of days.
  • Over at The Intersection, Chris Mooney takes on President Bush's Science Advisor, John Marburger, after hearing a recent interview in which NPR "let Marburger off far too easily."
  • William Connolley of the blog Stoat reports on the leaking of a draft of the upcoming report by the Intergovernmental Panel on Climate Change (I first came across this one at A Concerned Scientist).
  • Finally, Janet Stemwedel opens an interesting discussion at Adventures in Ethics and Science about what makes her qualified to comment on scientific issues. The natural question here, though, is what on earth makes me qualified to talk about all of these things? According to my resume, I think my anticipated degree from Oxford, distinguished future scientific career, expected Nobel Prize in Chemistry and/or Literature, and potential two-term presidency of the United States of America speak for themselves. Don't worry, though. If all of that doesn't pan out, I promise I'll remember to "update" my resume to make the necessary changes.

Mar 1, 2006

Tangled Bank #48 at Aetiology

If you're looking for the best science writing on the web, all in one location, head over to Aetiology, where you'll find the latest version of the Tangled Bank.