Monday, November 05, 2007
Surprisingly, it can. I'm not up on that area, but there are 17 Google Scholar hits for "negative intrinsic viscosity" and many are published in good journals, so it seems to be real. Or as I would argue, the data is reproducible. But as always in science, it is interpreting the results that is the challenge.
I haven't read any of the articles yet, so here's my big chance to insert my foot in my mouth. I will start collecting the references and will post what I learn, so public humiliation is a real possibility here. My guess is that the polymer is breaking up some sort of weak network in the solvent (hydrogen bonding, polar-polar interactions...) so that the solvent has a lower viscosity, thus leading to the negative intrinsic viscosity - and the need for a new theory to explain it. Yes, go with a new theory since the old theory made a certain set of assumptions that are no longer valid.
That said, I am now wondering about the possibility of this effect happening in reverse but not being noticed. I.e., the polymer forces more hydrogen bonding or what have you, leading to the solvent having a higher viscosity, but because the polymer also leads to a higher solution viscosity, this "addition" is not noticed since it's easy to attribute the viscosity increase strictly to the polymer. How could you determine this?
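To make the arithmetic concrete, here's a quick sketch (all drain times and concentrations are made up, taken from no paper in particular) of how a negative intrinsic viscosity falls straight out of the standard reduction of capillary drain times:

```python
def specific_viscosity(t_solution, t_solvent):
    """eta_sp = eta_rel - 1, computed from capillary drain times."""
    return t_solution / t_solvent - 1.0

# Hypothetical drain times (seconds). If the polymer breaks up a weak
# network in the solvent, the SOLUTION can drain faster than the pure
# solvent, and every quantity downstream goes negative:
t_solvent = 100.0
t_solution = 98.5        # made-up: solution LESS viscous than solvent
c = 0.5                  # g/dL, made-up concentration

eta_sp = specific_viscosity(t_solution, t_solvent)
reduced = eta_sp / c     # extrapolates to a negative intrinsic viscosity
print(eta_sp, reduced)
```

Nothing exotic in the arithmetic, in other words; the surprise is entirely in the physics that makes the solution drain faster than the solvent.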
First steps first: to the literature. As is often stated, 6 months in the lab will save you an afternoon in the library.
Friday, November 02, 2007
It's refreshing to see the emphasis on maintenance.
Thursday, October 11, 2007
Tuesday, October 02, 2007
Friday, September 14, 2007
Thursday, September 06, 2007
Elsevier has a site listing the top 25 downloaded articles in the various chemistry journals that they publish. I was surprised to find that in the 2nd quarter of this year, the top article, "Composition chimique des huiles essentielles d'Ageratum conyzoides du Burkina Faso" ("Chemical composition of the essential oils of Ageratum conyzoides from Burkina Faso"), was in French, as were the number 6 article, "Composition chimique, proprietes antimicrobiennes et activites sur les tiques de l'huile essentielle d'Eucalyptus tereticornis" ("Chemical composition, antimicrobial properties and anti-tick activities of the essential oil of Eucalyptus tereticornis"), and the number 22 article, "Proprietes antioxydantes de l'huile essentielle des feuilles de Clausena anisata (Wild) Hook" ("Antioxidant properties of the essential oil of the leaves of Clausena anisata (Wild) Hook"), all on essential oils. Why the sudden interest in essential oils, why are the French leading this research, and why is it published in French?
Lest I be accused of the "ugly American" syndrome - i.e., "why don't they publish it in English" - it is well established that researchers wishing to reach the broadest audience will publish their results in English, and since these are apparently hot articles, I would have expected them to be in English. By the way, I did not see any other top 25 articles in French for all of 2006 or the 1st quarter of 2007.
Wednesday, August 29, 2007
Background: As you may know, I used to work for a very large company. I used to think that the only value in it was technical: great labs, great library (online access from my desk to all the ACS, Elsevier, Wiley... polymer and chemistry journals you could imagine) and technically competent scientists. My wife also worked at this same company. Both of us suffered through the endless training and talks on business strategy, benchmarking, personality profiles...all the stuff that is fodder for Dilbert.
I was laid off from said company, and my wife quit, picked up a real estate license and is ecstatic helping real people through a huge transition in their lives. She started talking with another realtor who had been a realtor his whole career (call it 20 years). He had just started his own brokerage, and was talking to my wife about some of the challenges, and that's when the lights went on. This guy knows nothing about all the stuff that I had just complained about as being a waste. He was having to bring in consultants to help with strategies, personality profiles... and the broker thought this was all great, that he was at the cutting edge!
So maybe there was something at that earlier employer that I only now appreciate. Certainly my wife has a huge advantage over other career realtors, and that is apparent as she is already threatening the stability of the other established realtors, which was part of the motivation for that conversation with the broker - if you can't beat her, join her.
1) Removal of the bridge pieces is now going more rapidly since (a) all the bodies have been recovered, and (b) the NTSB has taken whatever samples it has deemed necessary.
2) It is already known that the collapse was particularly "catastrophic" (i.e., the whole thing fell all at once) as the bridge was built in a time period when a lack of redundancy in the design was common. Bridges prior to that period (sorry, I can't define the period exactly) and since that period all have redundancies so that if a part of the structure fails, the rest of the structure is strong enough to support the load.
3) The media's attention at this point is on the gusset plates which connect the girders together. MnDOT had looked into strengthening them earlier this year by adding additional rivets, but this was not done as concerns were raised that the additional holes would weaken the structure more than the rivets would help.
What I want to know: bridges are all rated on a scale of 0 to 120. This particular one had most recently been rated at 50, but bridges with far lower ratings are still standing. Obviously the rating is not deterministic, but rather statistical. So what's the standard deviation for any bridge's rating? Is it constant, or does it vary with the rating? How accurate is a rating given that ratings are based on visual inspection? How well do ratings correlate with reality?
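Obviously this is just a toy sketch with made-up numbers (I have no idea what the real standard deviation of a rating is; the 10-point value below is pure assumption), but it shows how a noisy visual rating muddies comparisons between bridges:

```python
import random

random.seed(0)

def observed_rating(true_condition, sigma):
    """One visual inspection: the true condition blurred by inspector noise."""
    return true_condition + random.gauss(0.0, sigma)

# Two hypothetical bridges on the 0-120 scale, true conditions 50 and 35,
# each inspected with an ASSUMED rating standard deviation of 10 points.
sigma = 10.0
n = 10_000
flips = sum(
    observed_rating(50.0, sigma) < observed_rating(35.0, sigma)
    for _ in range(n)
)
print(flips / n)  # fraction of inspections where the better bridge rates worse
```

With these assumed numbers, the better bridge scores worse in roughly one inspection out of seven, so a 15-point gap between two single ratings says less than you might hope.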
Thursday, August 02, 2007
“There but for the grace of God go I.” I’ve driven that bridge countless times, as recently as 4 weeks ago coming back from a cross-town visit to a client. It is in the center of the city.
That more people weren’t killed is truly a miracle. A school bus with 60 kids fell; only 10 of them are in the hospital. The rest are at home with their parents.
This isn’t supposed to happen here. Not in Minneapolis.
Identifying the full causes will take a long time. Just accessing the bridge is difficult as it is now in a 60-foot-deep ravine with steep walls on both sides. We have three metallurgists on staff here with plenty of grey hairs between them, and plenty of experience with failure analysis. None of them are in yet at this hour, but I’m anxious to prod them for insight.
More to follow...
Wednesday, July 11, 2007
What I find really creepy are the comments about the civil engineering community. From the NY Times: "During construction, the builders tested the strength of the bolts; when some failed, the problem was attributed to installation errors, not breakdown of the epoxy. 'The knowledge of the engineering community seems to be deficient,' said Bruce A. Magladry, director of the board's office of highway safety. With concrete, steel and asphalt, he said, 'once you test them for strength, they essentially keep that strength forever.' 'Epoxy is not that way, it's a different material,' Mr. Magladry said."
And from Forbes: "Magladry said there was no malice among those who built and oversaw the Big Dig. 'I don't think they understood creep at all,' he said. 'Although the epoxy used in the tunnel had acceptable short term strength, it was incapable of supporting much lower loads over an extended period of time,' Magladry said. 'If any of the entities involved in the ceiling design and installation had considered creep as a possibility, a different epoxy or a different anchoring system would have been used.'"
It's a very sad situation. I would have expected that the failure was the result of some interaction of stresses, something more difficult to predict in advance, not a simple design flaw that was quite predictable.
This is the kind of system that we often test here at work. I wish they would have called us.
Wednesday, June 27, 2007
I didn't even know that these chairs had a name, but they also have a fansite with pictures of the chairs in odd, interesting and intriguing spots around the world. They're something I've always taken for granted, but that is often the case with design.
Wednesday, June 20, 2007
The kicker here is that this is 6-year-old news. It was first published in August, 2001 in JACS. The last I knew, Cornell was in the state of New York. A Times reporter should have been able to walk to Cornell and back during the summer of 2002 (since the winters Upstate are way too severe for travel on foot) and still have filed this report 5 years ago. So does this mean that polymers have fallen to the realm of "slow news day" filler?
Editor: "Jimmy, think fast! There's no news on Paris, Britney or Lindsey. We need something to fill the paper. We can't have blank spots on the pages. What have you got?"
Jimmy: "Well, gee, there is this story that's I wrote 6 years ago about making plastics from greenhouse gases. I was going to delete...
Editor: "Don't you dare. It's perfect. Give it too me and then get down to the corner of 6th. I hear that Alex Baldwin is there yelling into his cellphone..."
Friday, June 15, 2007
Continuing on the theme of confusing polymer names, this week’s subject is “PVA”.
The “PV” is easy: “polyvinyl-“. The problem with this TLA (“three-letter acronym”) is that the “A” could be either “alcohol” or “acetate” (or actually both, since copolymers of the two are trivial to prepare). It could also stand for “acrylamide”, although that is pretty uncommon and usually goes by “Am”. Kind, gentle authors will use “PVAc” for the acetate, but that still is not common enough that you can assume it will ALWAYS be indicated as such.
PVA is unusual for a "vinyl" polymer in that it is not made by polymerizing "vinyl alcohol". Vinyl alcohol doesn't exist. Anytime you try to make it, it undergoes a keto-enol rearrangement to form acetaldehyde. (If you didn't know that, don't feel bad. Linus Pauling forgot about keto-enol rearrangements and consequently was wrong when he published his version of the structure of DNA. Too bad, as that would have yielded a THIRD Nobel Prize for the guy. Instead he just had to be happy with 2 Nobel Prizes.)
Anyway, PVA starts off as PVA which then undergoes a reaction to form PVA. Confused? Let me restate this. Polyvinyl alcohol starts off as polyvinyl acetate which then undergoes a (hydrolysis) reaction to form polyvinyl alcohol. By varying the degree of hydrolysis, you can end up with either straight polyvinyl alcohol or polyvinyl (acetate-co-alcohol).
Wednesday, June 06, 2007
This was one original guy. If you've read any of his books (Scaling Concepts in Polymer Physics) or articles, such as this one on cell entry of DNA through a pore or this one on smelling a rose, then you will be struck by the simplicity AND the power of the arguments. I know of no one else who can make such an impact in such a broad range of fields. He was truly an inspiration for what can be done in a manner different from most scientific approaches. I always took the time to read anything that had his name on it. He will be missed and I doubt I will ever see another like him.
Monday, June 04, 2007
Looks like the end of the line for the usenet, or at least sci.polymers. While sci.chem and sci.physics have degraded into festering cesspools of off-topic postings on every conspiracy theory known to any kook who knows a bit of English, sci.polymers is going out with a whimper. Looking at the posting activity, you can see a large drop in the posts with each passing year, even taking the seasonal variation into account. This last month – May – there were only 22 posts. The last time there were that few posts was October, 1993, when the newsgroup was only 3 months old and the number of internet users was 10% (?) of what it is today. Why is this happening? Colleagues have suggested that blogs are the new usenet, but I don’t see an equivalence. The usenet was a unique resource for accessing information from “experts”. Blogs don’t have that mechanism in place. On the other hand, to the extent that the usenet was used as a site for discussion of a topic, blogs that allow feedback can fill that need. I’ll probably linger around the usenet a little bit more, maybe to see if I can get my current email address up to the #2 spot on the “All-Time Top Posters” list (previous addresses are already at #1 and #9), but it certainly will sadden me to see it go. I very much enjoyed being able to help others as well as learn from the many experts that used to post there.
Tuesday, May 29, 2007
Given the wild diversity in the physical properties of polyurethanes, they are my default answer whenever someone tries to play “Stump the Polymer Expert” with me. (“Hey John, what’s this plastic?”) PU may not be the right answer, but it never is a wrong answer. That can go a long ways in establishing instant credibility.
Friday, May 25, 2007
I can’t imagine a more nondescript name for a polymer. (To be fair, polyureas, polyesters and all other condensation polymers suffer the same flaw, except that the options for the intra-urethane segments of a polyurethane are much more diverse than for any of these others.) I am constantly barraged by coworkers and clients bringing me a sample of a “polyurethane”, fully expecting me to know exactly what it is. You can make bowling balls, sponges, pressure sensitive adhesives and much, much more from polyurethanes: how am I supposed to know the exact chemistry when all you can tell me is that it is a polyurethane?
Thursday, May 24, 2007
Polyurethanes are a most versatile class of polymers. Between the urethane linkages lie potentially hundreds of different segments ranging widely in polarity, flexibility and connectivity. The result is bowling balls, sponges, pressure sensitive adhesives and much, much more.
Monday, May 14, 2007
How can measurements of an infinitely dilute polymer solution have any meaningful purpose at all, especially since nobody works with such dilute solutions in the real world? It turns out that the intrinsic viscosity can be related to the molecular weight by the Mark-Houwink equation: [η] = K·M^α, where [η] is the intrinsic viscosity, K is a constant, M is the molecular weight and α is another constant. You can do some theoretical development of this equation and find that α should be 0.5, which is usually not the case. But it is still an important value.
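As a concrete example, inverting the equation is a one-liner. The K and α below are only illustrative (roughly the ballpark of literature values for polystyrene in toluene), as is the measured [η]; look up the constants for your own polymer/solvent/temperature combination:

```python
def mark_houwink_mw(intrinsic_viscosity, K, alpha):
    """Invert [eta] = K * M**alpha for the viscosity-average molecular weight."""
    return (intrinsic_viscosity / K) ** (1.0 / alpha)

# Illustrative constants ONLY (ballpark for polystyrene in toluene);
# [eta] and K are in mL/g here, so watch your units.
K, alpha = 0.0092, 0.72
eta_intrinsic = 75.0     # mL/g, a hypothetical measurement
M_v = mark_houwink_mw(eta_intrinsic, K, alpha)
print(f"{M_v:.3g}")      # viscosity-average molecular weight, g/mol
```

With these made-up inputs, the viscosity-average molecular weight comes out in the neighborhood of 2.7 × 10^5 g/mol.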
Polymers in solution do not exist as long straight molecules, but instead are coiled up in a random coil. The size of this coil can be related to α, with larger values of α indicating that the random coil is larger than would be expected. A larger coil will interact with more solvent molecules and drag them along in the flow field, creating a larger viscosity. So a quick look at the value of α will tell you how good the solvent is that you are working with. α = 0.8 means it’s a terrific solvent; α = 0.5 means that it’s a lousy solvent. α less than 0.5 means that the polymer will soon be coming out of solution.
In an incredibly mind-blowing relationship, these values determined at infinite dilution can in fact be related to a pure polymer. Molten polymers also exist as random coils, and the size of the coil is exactly the same as when α = 0.5. How cool is that? An infinitely dilute solution and a 100% pure polymer have something in common.
Friday, May 11, 2007
So as you can tell, I don’t like MFI much as a QC test. Another less common test is IV – which in this case means inherent viscosity. Polyesters are commonly spec’ed on this basis. Here again, running the test at two different conditions can provide a huge increase in the data and what you can conclude from it.
Background: This test is also run with the polymer flowing through a capillary, but with big differences. First off, the capillary is glass. (A variety of different glasses are available, each with different advantages and disadvantages.) Also, instead of being a melt (100% polymer), the polymer is dissolved in a solvent, and the concentration is so low that the polymer molecules do not interact with one another because the distance between them is greater than the size of the molecules. The polymer solution flows down under the pull of gravity, and a stopwatch is used to measure the time that it takes for a certain volume of fluid to drain. The same measurement is also made for the pure solvent. It’s a very easy test to run – I’ve had high school students get fantastic results – and even automated systems exist. End of Background
The ratio of the drain time for the polymer solution divided by the drain time for pure solvent is the relative viscosity. Take the natural log of this, divide it by the concentration and you have the IV. There is also another “IV”, which is called the intrinsic viscosity. (Inherent/intrinsic viscosity – very confusing. I didn’t name these things, I just live with them.) The intrinsic viscosity is found by rerunning the inherent viscosity measurement at a different concentration and then extrapolating the inherent viscosity to zero concentration. The neat thing is that this can be self-correcting by doing a second set of independent calculations. Going back to the relative viscosity, subtracting “1” from it gives you the specific viscosity. Dividing this by the concentration gives you the reduced viscosity. Extrapolating the reduced viscosity to zero concentration also gives you the intrinsic viscosity. (The inherent viscosity line has a negative slope while the reduced viscosity line has a positive slope. If they don’t cross at the y-axis, something is wrong with the test.)
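All of that alphabet soup reduces to a few lines of code. Here's a sketch with hypothetical drain times; the self-check is that the two extrapolations should land on (nearly) the same intercept:

```python
import math

def viscosity_numbers(t_solution, t_solvent, c):
    """All the dilute-solution quantities defined above (c in g/dL)."""
    eta_rel = t_solution / t_solvent
    eta_sp = eta_rel - 1.0
    return {
        "relative": eta_rel,
        "inherent": math.log(eta_rel) / c,
        "specific": eta_sp,
        "reduced": eta_sp / c,
    }

def intercept_at_zero(c1, y1, c2, y2):
    """Two-point linear extrapolation back to c = 0."""
    slope = (y2 - y1) / (c2 - c1)
    return y1 - slope * c1

# Hypothetical drain times: pure solvent, then two concentrations (g/dL).
t0 = 100.0
runs = [(0.25, 110.0), (0.50, 120.5)]

inh = [viscosity_numbers(t, t0, c)["inherent"] for c, t in runs]
red = [viscosity_numbers(t, t0, c)["reduced"] for c, t in runs]

iv_from_inherent = intercept_at_zero(runs[0][0], inh[0], runs[1][0], inh[1])
iv_from_reduced = intercept_at_zero(runs[0][0], red[0], runs[1][0], red[1])
print(round(iv_from_inherent, 2), round(iv_from_reduced, 2))
```

With these made-up times, both extrapolations land at an intrinsic viscosity of about 0.39 dL/g, with the inherent line sloping down and the reduced line sloping up, just as described above.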
Like the MFI, measuring IV at just one point is of dubious value, as there are a large number of lines that can be drawn through that point. I know that it is an infinite number of lines, but for practical purposes, it is much less than this. First, we know that the lines have to have a negative slope. Also, intrinsic viscosity is determined to two decimal places, so lines that lead to changes in the intrinsic viscosity of less than 0.01 are grouped together. Nonetheless, there still is a large number of lines that can be drawn through a single IV measurement. Having a second IV measurement lets you calculate the slope AND the intrinsic viscosity AND also lets you calculate the reduced viscosity line too. All that from a second measurement. It's a heck of a deal!
The intrinsic viscosity is magical. Despite being a measurement made at infinite dilution (commonly called “a single molecule in a sea of solvent”), you can actually learn much about the melt properties of the polymer. But that will have to wait for the next post.
Thursday, May 10, 2007
Tuesday, May 08, 2007
The “melt flow index” is about as bad of a test method as you can get. In fact, I would rate it as # 2 on the list of all-time bad test methods. (I’ll save the discussion for #1 for a later date.)
Background: See ASTM D1238 for all the gory details. Basically, you put some resin in a tank with a floating lid, heat it to a specified temperature that is sufficient to melt it, add a specified mass to the lid and then let it drain out of a small circular hole at the bottom. The mass of material (in grams) that flows out in a specified time (usually 10 minutes) is the melt flow index. The softer, less viscous materials will flow out faster and have a higher melt index. End of background.
So what’s the problem? It’s an easy test to run, no advanced education is needed, and so the test results are used on specification sheets throughout the industry. 100’s of millions of pounds of xxPE, PP, PVC and other resins are bought and sold largely on this single test value. So what is the problem?
For starters, the test does not tell you anything fundamental about the resin. It is not a viscosity test even though it is commonly thought to be one. In fact, the test is deceptively close to a capillary viscometer test, in which the mass flow rate through a small circular tube is recorded along with the pressure drop. The big difference is that the capillary viscometer tube is much longer than that of the melt flow indexer. This is necessary for the flow in the tube to be fully developed and free of any entrance effects (irregular flow patterns on the upstream side of the die, such as eddies that result from the flow trying to squeeze into the small capillary). These entrance effects are retained by the polymer until they can relax out, something that takes time, which means a longer flow tube. Typically a length/diameter ratio of 20 is needed. The MFI die has an L/D of 4.
But wait, there’s more. Viscosity changes with shear rate – drastically in some cases, and in a nonlinear fashion. The MFI measures the “viscosity” at only one shear rate. How many curves can you draw through a single point? Plenty. Measuring the MFI at a second shear rate goes a long way towards understanding whether two lots of resin really have the same or different flow characteristics. Again, assuming that the MFI really does measure “viscosity”.
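Here's a sketch of the "many curves through one point" problem using the power-law viscosity model, eta = m * (shear rate)^(n - 1). All the parameters are made up: two hypothetical resins pinned to the same apparent viscosity at the single shear rate the test probes, yet wildly different at processing rates:

```python
def power_law_viscosity(shear_rate, m, n):
    """Power-law (Ostwald-de Waele) model: eta = m * shear_rate**(n - 1)."""
    return m * shear_rate ** (n - 1.0)

# One shear rate, two very different (hypothetical) resins pinned to the
# same apparent viscosity at that single point:
test_rate = 10.0                      # 1/s, the lone rate the test probes
resin_A = dict(m=1000.0, n=0.5)       # strongly shear-thinning
n_B = 0.9                             # nearly Newtonian
m_B = power_law_viscosity(test_rate, **resin_A) / test_rate ** (n_B - 1.0)
resin_B = dict(m=m_B, n=n_B)          # chosen to cross A exactly at test_rate

# Identical at the test point...
at_test = (power_law_viscosity(test_rate, **resin_A),
           power_law_viscosity(test_rate, **resin_B))
# ...but nothing alike at a processing-relevant rate:
at_1000 = (power_law_viscosity(1000.0, **resin_A),
           power_law_viscosity(1000.0, **resin_B))
print(at_test, at_1000)
```

Two lots could pass the same single-point spec and still behave completely differently in the extruder; the second measurement is what separates them.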
Tuesday, April 17, 2007
The tragedy at Blacksburg is still being told, but one of the comments from the administration that bothers me the most is about the difficulty of getting information out across a large campus. As a graduate of two very large land grant universities, I certainly understand the perception, but times have changed. Technology does exist and is already being used. My son's school district has a service that in the event of a snow day/attack/early closing/… will send out a message to all parents via automated phone calls and emails. And it's not just 1 email or one phone call in the immediate area adjacent to the school. It's multiple phone messages (both cell and land line) and multiple emails all across the metro area. It's a great service and works really well. Had such a service been used at Virginia Tech, perhaps some lives could have been saved.
Monday, March 05, 2007
The ACS is finally taking a step in that direction, albeit a baby step. Authors can now pay a fee so that the ACS will allow free access to their article. This option has been available since October, but it seems that only 3 authors have taken advantage of this wonderful service so far. At $1000-$3000 an article, the fees would quickly add up for authors with huge publication lists. With access to an article currently costing ~$30, the implication is that the average article is currently accessed fewer than ~30 times.
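The break-even arithmetic behind that quip, using the fee range and per-access price from above:

```python
def break_even_accesses(open_access_fee, per_access_price):
    """Number of pay-per-view downloads that equals the one-time fee."""
    return open_access_fee / per_access_price

# Fee range and per-access price quoted in the post:
print(break_even_accesses(1000.0, 30.0))  # ~33 accesses at the low end
print(break_even_accesses(3000.0, 30.0))  # 100 accesses at the high end
```

So unless an article will be viewed dozens of times, the open-access fee is the pricier route, which may explain the underwhelming uptake.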
You can tell from my satirical tone that I favor free access of all information, as both the author and the readers gain. The middle man (the publisher) is the one that is hurt. I don't have any easy solutions. One option that I am in favor of is to base the access fee on the popularity of the article: the more it is accessed, the more it costs to access it. Articles that no one reads would be available very cheaply.
Thursday, March 01, 2007
From the true tales of an MSDS: (I know, I know. A huge book could be written of all the humorous writings in MSDS's). This is from section 4 of an MSDS for gamma-glycidoxypropyltrimethoxysilane, a common coupling agent. The three methoxy groups are rather labile, and will be displaced when moisture is present.
"This product reacts with moisture in the acid contents of the stomach to form methanol. The combination of visual disturbances, metabolic acidosis and formic acid in the urine is evidence of methanol poisoning. The therapeutic intravenous administration of ethanol (10 ml per hour) allows it to be preferentially oxidized and reduces production of methanol metabolites."
So take a swig of this, and then go for the hooch. "Sorry boss, you can't fire me for drinking on the job. I have to. Doctor's orders!" Granted, 10 ml/hr won't get anybody drunk except for Calista Flockhart, but it still is an unusual therapy.
Friday, February 09, 2007
All the sciences (well, maybe not math, but all the others) have unique fears about certain aspects of the other sciences. I've seen biologists freak when I opened a bottle of toluene ("Don't you know that's a carcinogen!" It's not, but I can't convince them otherwise, and the fact that a number of independent biologists have stated it makes me fear how widespread the belief is), and I've talked with nuclear physicists paranoid about acrylamide monomer (as if it's a worse risk than the radioactive materials that they work with).
Chemists certainly have fears in these other sciences, but they also have a few chemical fears. The most widespread one is of hydrofluoric acid. These pictures give you some idea why. HF is a deceptively nasty acid. The deception lies in its actual acid strength. The halide acids (HF, HCl, HBr, HI) become stronger as you proceed down the periodic table, because the anions increase in size and are better able to disperse the charge. Look at the pKa's: HF 3.45, HCl -4, HBr -9, HI -10. HF is not quite as wimpy as acetic acid at 4.75, but pretty close.
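You can see just how wimpy HF is by solving the dissociation equilibrium Ka = x²/(C − x) for the pKa's quoted above. This is a rough sketch for 0.1 M solutions that ignores activity corrections (and HF's habit of forming HF2⁻), so take the exact percentages with a grain of salt:

```python
import math

def fraction_dissociated(pKa, c_total):
    """Fraction of a monoprotic acid HA dissociated at total concentration C.

    Solves Ka = x**2 / (C - x) for x = [A-]; activity effects are ignored.
    """
    Ka = 10.0 ** (-pKa)
    x = (-Ka + math.sqrt(Ka * Ka + 4.0 * Ka * c_total)) / 2.0
    return x / c_total

# 0.1 M solutions, pKa's as quoted above:
print(fraction_dissociated(3.45, 0.1))   # HF: only a few percent ionized
print(fraction_dissociated(-4.0, 0.1))   # HCl: essentially 100% ionized
```

At 0.1 M, the sketch puts HF at only about 6% dissociated while HCl is essentially fully ionized, which is exactly the "weak acid, terrifying consequences" paradox.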
Despite this, the acid is properly feared because it is merciless in attacking two important everyday materials: glass and human flesh. Glass is remarkably resistant to most acids, but is readily etched by the tiny HF acid, and in the human body, HF is a systemic poison going right for the calcium in your bones. I've never experienced such a burn, but I understand it gives a new meaning to pain.
Thursday, January 25, 2007
The most recent Proceedings of the National Academy of Sciences (Vol. 104, No. 4, pp 1130-1133) has a free access article which is visually striking, although I was really disappointed in the analysis. PDMS was exposed to a beam of Ga+ ions, which then forms a stiff upper layer (SiOx - a depletion of carbon is noted) atop the softer underlayer, with the upper layer forming a buckled surface at high enough Ga+ fluences.
Why? The authors are silent on the subject.
Having a stiff layer of anything over a softer layer doesn't mandate buckling unless there is a difference in stress/strain that is allowed to relax. I suspect that that is the ultimate cause here - residual stress from the manufacturing of the original PDMS. This would be easy to verify qualitatively.
Monday, January 22, 2007
And now GE Plastics is for sale…
As recently as the year 2000, GE, and in particular its CEO Jack Welch, could do no wrong. Jack would say that managers should pat their tummies three times a day and BOOM! that week he would be on the cover of 5 business weekly magazines, with everybody cooing and ahhing at the brilliant insight. GE managers were also considered nearly as brilliant…until they left and tried to pass off the GE model on other companies. Jim McNerney and Robert Nardelli immediately come to mind. 3M and Home Depot could not believe their luck at getting these guys and were willing to pay the price of a moonshot for them - $210 million in Nardelli's case. Both have now left after several years of accomplishing nothing, as the stock prices show.
So now the crib of Jack Welch and his protégé Jeff Immelt is up for sale. Considering the performance of GE stock under Immelt's reign (from ~$52 down to the current ~$37), maybe something else is the issue. Heck, anybody can look good if they sell off the slow performing divisions. Does that make someone a great CEO? Maybe so, since Jack's first nickname was Neutron Jack (he got rid of the people and left the buildings standing).
Wednesday, January 03, 2007
In my current employment situation, all my hours are potentially billable, much like the world of lawyers. (Our rates aren't as high, we have expensive toys in the lab, and we aren't … well,… we're more likeable.) This has given me a different perspective on my job than I had in the past, when I worked for corporations that simply had an internal overall lab R & D budget that my time was billed against. Back then, I would rarely use support staff to help me out with mundane tasks. Here it is essential for keeping a project on budget, since the support staff bill their hours at a significantly lower rate than I do. Another example: a colleague recently needed a walking treadmill for a project. He went to the local store and found a model that he liked. It was not stocked at that store but at another store across town. Doing the math in his head, it was cheaper to the budget to buy a more expensive treadmill now than to spend the extra time going around town to save a few hundred bucks.
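The treadmill math, sketched out with hypothetical numbers (the price difference, hours and billing rate below are all made up for illustration):

```python
def cheaper_for_the_budget(extra_price_local, hours_saved, billing_rate):
    """Compare an extra purchase cost against the billable time it saves."""
    time_cost = hours_saved * billing_rate
    return "buy local" if extra_price_local < time_cost else "drive across town"

# All numbers hypothetical: the in-stock treadmill costs $200 more, but
# the cross-town round trip would burn 2 hours billed at $150/hr.
print(cheaper_for_the_budget(200.0, 2.0, 150.0))  # -> buy local
```

At these assumed rates, eating the $200 premium saves the project $100, and the decision flips only when the price difference exceeds the cost of the billable hours spent chasing it.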
You get the point. The math is simple but that is not my issue here. The real question I have is this: should people in more traditional labs (i.e., those with only an internal R & D budget, and that really don't have hours directly charged against them) adopt this attitude as well? Why or why not?
Tuesday, January 02, 2007
I bought a box of Clementine oranges over the holidays. The packaging was not the usual "wood/cardboard crate-with-netting", but instead was made from polypropylene sheetstock. These particular oranges were distributed by Ocean Spray. (I don't think that they grow too many oranges in the bogs of New England and Wisconsin, but I've never been sold on this Global Warming thing anyway. Maybe I should pay more attention.) I'm surprised, as I would have thought that it would be more expensive than the older style packaging.