Monday, August 16, 2010

Just Wondering

If you've lived for even two decades on this planet, you have already seen scientists announce that they have developed a cure for pretty much every known disease. Some diseases have multiple cures available, and new promising therapies are always just around the corner. The fine print of these announcements is that the cure has only been proven in rats/dogs/rabbits... pick your favorite animal. Unfortunately, when the cure is scaled up to humans, something goes wrong. Most of us accept this readily, either because we 1) understand the complexity of the differences between lower animals and humans, and/or 2) have seen the pattern repeated over time and learned to recognize it.

This pattern, however, never seems to occur when there is environmental exposure to chemical or physical hazards. When lower animals are exposed to (quite often unrealistically high) levels of some chemical and some degradation in health is found, the assumption is made that the same can and must occur in humans.

In some cases, this is appropriate: toxins seem to cut across the board, although LD50s can vary somewhat, and as always, it's the dose that makes the poison [*]. But in other cases where the exposure is small (pick any example of something leaching into food), shouldn't we question whether such a mandatory link exists? Or are we so cursed that all the potential good cures probably won't work out when going from animals to humans, yet all the potential bad exposures and consequences are guaranteed to pass on up?

[*] Consider botulinum toxin, the agent behind botulism. It is pretty much the deadliest poison known, yet when diluted considerably it is used in Botox injections.
