Tuesday, November 30, 2010

Black Hole Entropy, Loop Gravity, and Polymer Physics

I didn't invent that title, I am only reprinting it. And no, I do not understand it. Black holes I understand (or I think I understand them as much as a non-physicist can), but I have no idea what loop gravity is - my best guess would be the force you feel in centrifugal acceleration on the Rotor ride at an amusement park.

Somehow the author of the paper is able to establish an equivalence between the counting of the microstates in a black hole and the statistical mechanics of a polymer chain, and even another relationship between a rotating black hole and an elongated polymer. Please feel free to read the paper yourself. I certainly recognize the statistical mechanics equations from polymer science, but nothing of the black hole or loop gravity formulae.

Every once in a while [here and here] I've posted on weird connections between polymers, rheology and modern physics. This is probably one of the more practical applications -- if you consider black hole entropy to be a practical problem!

Monday, November 29, 2010

Time to Update the Weathering Mechanism?

There's an interesting new research article out that is getting some buzz in the popular press. It is an open access article if you register (for free) with the Royal Society of Chemistry. I'm withholding judgement at present, but it does seem to merit further investigation.

The article looks at the autooxidation mechanism that is commonly used to describe the (photo)oxidation of polymers and why they breakdown when weathered. It's a free-radical propagation scheme not too unlike a polymerization reaction.

I certainly had no idea that the mechanism was first proposed 65 years ago, and on that basis alone, it probably would be a good idea to re-examine it. I also was unaware that the mechanism had been used to describe the degradation of materials that are chemically distant from olefins (the materials that I am most familiar with in a degradation setting), even including acrylates. [1]

Using computer modeling [2], the researchers found that the existing mechanism is thermodynamically unfavorable in most cases, and "in most cases" here covers an impressively wide range of chemistries. They also found that temperature has essentially no influence at all. Instead, the authors looked at defects that occur during polymerization and found that these defective sites are where the thermodynamics are favorable for hydrogen abstraction. They then suggest that examining materials made with non-conventional, more constrained polymerization mechanisms may avoid these problems.

As I said at the start, I find this a very intriguing paper and one that needs more research. There is a tremendous amount of existing research on degradation that was not mentioned here, and it would be invaluable to look at it anew to see if the new mechanisms are supported by it.

I am greatly concerned that the influence of temperature was minimal. Whole books, journals and companies exist because of accelerated weathering, and temperature always plays a key role in it. (If reaction (1) above is the result of a photoinitiation, then that step is pretty much temperature independent, but all the remaining steps certainly have a temperature dependency.) Nonetheless, I still found the whole article worthwhile to read and hope that it leads to new insights quickly. I'd be curious if it leads to the development of new antioxidant chemistries to fight these "new" reactions.

[1] I'm glad that the researchers had the same thought I did in reading that: the proposed scheme won't work, because it is well known that oxygen inhibits acrylate polymerization - the peroxy radical in step 4 can't abstract a hydrogen. So either the mechanism above is correct, or our understanding of oxygen inhibition is correct, but they can't both be correct.

[2] Gas phase, and some liquid phase. Ever seen a gaseous polymer? Think it might behave just a little differently in a solid? Unfortunately, even with supercomputers, modeling is still challenging and gas phase analysis is just easier to perform.

The Shortcomings of Business Management Books

My last post on corporate cultures got me thinking a little more about businesses. Years ago I used to read at least major parts of business management books, the ones that always feature the hot trend of the year, usually published by some prof at the Harvard Business School. The ones where they look at a company or a bunch of companies and then generate data of some sort to prove their idea of why the companies succeeded or failed. The conclusion is always that everyone should be doing what was discovered.

Completely ignoring all the issues associated with the lack of a control group, the big problem is that they never play their results forward. If they really believe that X is the secret to unlocking value in a company, then find some company, install X in it, and see what happens. It's no different than what we do in science and engineering: if you think you have the answer, then run some experiments to check it.

Personally, I think that running a business is about the most complicated human undertaking ever attempted. As such, you can always find data to support anything you want to believe. For comparison, consider the 210 reasons for the Fall of the Roman Empire, all of which are supported by data. The real test is when you have to work with your results.

People always complain that weather forecasters have such an easy job since they can be wrong over half the time and still keep working. I'd say business profs have it easier yet since they never even try to see if they are right or wrong. They just need to keep cranking out new books.

Wednesday, November 24, 2010

That's all for this week

I alluded to Thanksgiving earlier this week. Given that most of the US will be stuffing themselves on turkey tomorrow and then doing all we can to stimulate the sluggish economy on Friday, this will be the last post of the week. That's no loss however, as I strongly expect readership to drop since about 60% of it is here in the US.

Happy Thanksgiving to all.

Corporate Cultures

There has been quite a bit of discussion this week in other blogs (In the Pipeline, The Chemjobber) about the new "Lab of the Future" as designed by Novartis. Obviously there is a strong angle in the discussions towards pharma, but all R & D organizations face the same challenge: how do you get people to work together effectively?

Working at Aspen Research, I've been fortunate to gain access to dozens of different companies, all with different floor plans and building architectures and meeting rooms and... I have not seen that the physical structures matter at all.

As much as I hate starting up buzzword bingo, to me the strongest sign of success is corporate culture. Within about an hour, you can tell which companies are going to succeed and which aren't. And that usually carries over into the project we are working on. I've had clients where I knew the work I was doing would just be lost or poorly implemented, and others where I knew it was going to be a great success.

I would be rich, extremely rich if I could tell you why the differences exist and what to change. (If it was easy, some business prof would have done it already.) However, some of the common threads I see in good businesses are:
  • Intelligent people. This can be broken into two subsets:
    • They have areas of expertise and know what they are
    • They also know what areas are not their expertise, and they publicly acknowledge it
  • Management that may or may not be present, but certainly show restraint in their involvement
  • Clear definition of both what the problem is and what a successful resolution would be (scope creep in a project is a great warning sign)
  • Open communication:
    • Bad news can be spoken about
    • Politics are very limited below the management levels
That's what's there off the top of my head. Again, no revolutionary ideas here.

Just so that I don't fall into the trap that I just accused others of, note that this is descriptive, not prescriptive. The real question is if a company doesn't have this culture, can it be made to have one? And how can it be done? I have no idea.

Tuesday, November 23, 2010

Why I Hate 1-Part Silicones (and Urethanes and...)

Personally, I really don't have much against the 1-part systems, it's just that I've seen other people use them willy-nilly because they don't like mixing, and then they get in trouble. Properly used, a 1-part, be it sealant, adhesive or whatever, can be great.

But one problem that I've seen repeatedly across countless industries [1] is people thinking that because these are moisture-cured products, as long as they have high humidity in the assembly area, everything will be fine. Seriously, I've heard people say "We're in East Texas where it is more humid than all-get-out. There must be something wrong with the adhesive since it isn't curing. Ship it back!"

The problem with this line of thought is that yes, moisture is needed to aid the cure [2], but there is that other end of the reaction to worry about called the products, or in this case, the byproducts. The water displaces some other molecule, often acetic acid or ethanol, but no matter what it is, that byproduct has to get out and disappear. And if you are putting the silicone into an enclosure of some sort, or even a partial enclosure, all that acetic acid may not be able to find its way out and that causes problems. If instead it is used in thin layers, everything should be fine.

Trapped inside, the byproduct can form hollow pockets or end up at the surface, where it causes adhesion problems. And thick sections like this can also take a long time to cure just because of the diffusivity issues. [3] I've seen all this too many times. Please, next time you reach for a 1-part, make sure that you've thought it all through.

[1] Medical devices, construction and telecommunication to name a few. Your industry is certainly on the list, I just haven't seen it yet.

[2] So you see, a 1-part silicone is actually a 2-part silicone; it's just that the other part is present in the atmosphere.

[3] Remember that with Fickian diffusion, time scales with the thickness squared.
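To put a rough number on that scaling, here's a minimal sketch. The 1 mm / 24 hour reference point is a made-up illustration, not data for any real sealant; only the thickness-squared scaling comes from the diffusion argument.

```python
# Rough sketch of the thickness-squared scaling for a diffusion-limited
# cure, t ~ L^2 / D. The 1 mm / 24 h reference values are made-up
# illustrative numbers, not measured silicone data.

def cure_time_hours(thickness_mm, ref_mm=1.0, ref_hours=24.0):
    """Scale a reference cure time by (L / L_ref)**2."""
    return ref_hours * (thickness_mm / ref_mm) ** 2

print(cure_time_hours(1.0))   # 24.0
print(cure_time_hours(5.0))   # 600.0 -- a 5x thicker bead takes 25x longer
```

So a bead only five times thicker needs nearly a month instead of a day, which is why thin layers behave so differently from enclosures.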

Monday, November 22, 2010

Bicycle Helmet meets Air Bag

The London Evening Standard has a short story about a new bicycling helmet. It is basically worn around the head like a collar, and it inflates (courtesy of stored compressed gas) when electronic sensors - probably some accelerometers - detect an appropriate situation. There is embedded video in the article which shows the helmet inflating when a crash-test dummy is rear-ended by a car.

It certainly is an interesting concept, but the whole approach seems to be one of vanity: You don't have to put up with helmet hair anymore! Feel the wind in your hair! No more funny tan lines on your face from helmet straps!

Being rear-ended is not what I would consider the most appropriate demo - hitting a large hole and doing an end-over would be better, as would sliding out in a corner.

I'm also curious about the material of construction and how it interacts with pavement. When bicycling helmets first became popular [*], they had a hard plastic shell and a foam liner. Lighter weight helmets quickly evolved, in part by removing the hard plastic. This was not without issues however, as rumors quickly spread that the foam would stick to the pavement and twist the rider's neck in the process, so the hard shells began to reappear. (The hard shells were rumored to bounce your head - is that really any better?) So what happens when this helmet meets the pavement?

[*] i.e., after the various bicycle racing organizations outlawed the leather hairnets (yes, I am really old enough to remember hairnets! Nailed-on cleats, too!)

Turkeys & Netting

For those of us in the US, this is a short work week. Thanksgiving is this Thursday, and Friday is a widespread holiday [*] for most businesses - anything except retail stores. The traditional meal for Thanksgiving is a turkey, and this is where we can begin to tie plastics in to the topic.

While I've certainly had turkeys that tasted like plastic, this is a more direct approach. A processed turkey can be difficult to handle. They are heavy, coated in a fairly slippery plastic that might be wet from condensation, and also fairly well rounded with few good areas to grip them. In order to allow for easier handling, they are most often put in plastic netting, something that I look at with a little more intimacy than the average person; you see, I worked very briefly for a company that made plastic netting, some of which was used on turkeys.

There are a wide range of options for making netting, although they all involve movable extrusion dies. The netting discussed above, also used for bagging onions and other food items, is made by extruding multiple strands through two coaxial circular dies that are rotating in opposite directions. The strands are also oriented at an angle to the machine direction, which gives the net its characteristic diamond pattern. Wherever the strands meet, they stick and the netting is formed.

Another option is to use an annular die, similar to what is used to make tubing or blown film. The difference is that the movable part of the die has slots cut into it and this part then moves back and forth against the rest of the die. When the movable part is against the die, the polymer can only flow through the slots, thereby creating the downweb strands. When the movable part retracts, the polymer flows through the entire annulus, creating the crossweb strands. Needless to say, this is quite a noisy operation, what with the die opening and closing with high frequency. This netting can be oriented with a tenter just like biaxial films are.

Yet another option is to just make a circular film and poke holes in it while it is still soft. Seen orange snow fence? Now you know how that is made.

[*] Strangely, the Friday after Thanksgiving is called "Black Friday". Normally a descriptor used for some tragic occasion, this is actually considered a good name, as so much shopping is done on this day that retailers are finally able to change financially from being in the red for the year to being in the black - or at least that is the urban myth. Regardless of the veracity of the name, everyone knows that it is the busiest shopping day of the year.

Thursday, November 18, 2010

Why the Cox-Merz Rule?

There are some aspects of blogging and Google searches that I do not understand (and I'm not trying to be disrespectful of the intelligence of my audience, who I don't think will understand them either): certain search terms attract people to this blog. The biggest ones: "cox-merz" and "cox-merz rule". Why? I have written about the rule just a little bit, but never would have expected it to be such a hit - always somewhere in the top five or so, depending on the details of the search.

But then try "rheology blog" and the top result is for the Zimbio blog which hasn't had a contribution in over 4 years. "The Rheol World" is near the top, but I can't find this blog anywhere.

Wednesday, November 17, 2010

ANTEC Craziness

Posting is getting to be pretty spotty this week, as most of my writing efforts are directed to papers that we (Aspen Research) are submitting to ANTEC. The deadline is Friday, and I have not only my own paper to finish writing up, but am also helping out on 2 others. They are a good set of papers, but you always wonder about the payoff for this work since 1) they might not be accepted, and 2) much of it is done after hours, as we can't let our regular work slide.

Tuesday, November 16, 2010

On Stirring Polymer Solutions

One other thought I had, following up on yesterday's post about mixing up the PAm/water concentrate, was that I knew I had a good batch when I saw the Weissenberg effect. That's the unusual situation that occurs when stirring a polymer solution: instead of the normal vortex forming in the center of the swirling liquid, the surface rises up and starts climbing the stirring rod.

Thank goodness we don't use polymer solutions in toilet tanks.

Monday, November 15, 2010

A Polymer Contradiction

Over the weekend I remembered an unusual proposal for polymers that seems to have fallen by the wayside and I don't know why.

Polymers of course have very high viscosities and their solvents have very low viscosities. Yet there are a few circumstances where mixing them results in an even lower viscosity (or at least an "apparent" viscosity). The specific example I am thinking of is high molecular weight polyethylene oxide (PEO) in water. At dilute concentrations, the polymer can stifle the formation of turbulent flow, thereby allowing more liquid to be pumped at a given pressure.

The application that was most discussed was fire fighting - more fluid for a given pressure would be a good thing, or (semi-)equivalently, being able to spray the water further would also be helpful.

For reasons that I never heard, this application never caught on. Anyone know why?

Polyacrylamide (PAm) of high molecular weight can also pull off this trick. When I was a TA at the University of Illinois for the undergrad ChemE unit ops lab, one of the experiments involved a large water tank that drained through a horizontal pipe at the bottom into a drain trench in the floor [1]. The tank was first filled with water, and the distance that the water shot was recorded as a function of the head. The tank was then refilled, some PAm was added [2] and dispersed, and amazingly the water shot out quite a bit further.

[1] Yes, the polymer went right down the drain. This was not a problem, as PAm is used in water treatment plants to help flocculate and settle particulates - we were simply helping out the water treatment plant.

[2] One of the responsibilities of the TA was to prepare a 5 wt% solution of the PAm in water. This could take a while to prepare - the guide from the profs was 4 hours - as the PAm had a tendency to clump together, as is common with many water-soluble polymers; if you tried to rush it, you ended up with a snotty mess and not a really thick, viscous liquid. Since I was studying polymers, I knew of a shortcut - use ice water, not tap water. The cold water limits the rate at which the polymer dissolves and prevents clumping. After everything is dispersed, you heat it up and you're done. The other TAs always wondered how I could mix the concentrate up in 30 minutes. (Shhh, a magician never tells his secrets!)

Friday, November 12, 2010

Jensen's Inequality

Jensen's inequality is a simple math relationship that everyone should know. Like most mathematical relationships, it can be written with enough symbols to make it impossible to understand, but the qualitative idea is very simple: if you have a nonlinear relationship between an input and an output, then for any two inputs, the average of the outputs is not equal to the output of the average of the inputs. Too wordy, I know. Let me give you an example.

The picture below is a good graphic of the situation. The two inputs are Z1 and Z2, and you can also see the corresponding values f(Z1) and f(Z2), which unfortunately are not labeled. Halfway between Z1 and Z2 is their average, < Z >, and halfway between f(Z1) and f(Z2) is < f(Z) >, which you can see is greater than f(< Z >). So this then is Jensen's inequality:
< f(Z) > ≥ f(< Z >)
In this case (and in the example above), the nonlinear relationship is concave up, so the "average" output is greater than the output of the average input. Certainly concave down relationships can happen, and the outcome will be the opposite.
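The inequality is easy to check numerically. Here is a minimal sketch; the choice of f(z) = z² is just an arbitrary concave-up example, not anything from the figure.

```python
# Numeric check of Jensen's inequality for a concave-up (convex)
# function. f(z) = z**2 is an arbitrary illustrative choice.

z1, z2 = 1.0, 3.0
f = lambda z: z ** 2

avg_of_outputs = (f(z1) + f(z2)) / 2    # <f(Z)> = (1 + 9) / 2 = 5
output_of_avg = f((z1 + z2) / 2)        # f(<Z>) = f(2) = 4

print(avg_of_outputs, output_of_avg)    # 5.0 4.0
```

Swap in a concave-down function (say, the square root) and the inequality flips, just as described above.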

Chemical reactions have a non-linear relationship with temperature (i.e., the Arrhenius equation). So if you first run a reaction for a certain amount of time at one temperature, T1, and then change the temperature to T2 and run it for the same amount of time, the extent of the reaction will be greater than if you had run it at the average temperature, (T1 + T2)/2. Why? Given the non-linear nature of the reaction to temperature, the reaction runs quite a bit faster at the higher temperature compared to how much the colder temperature slows it down.
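A minimal numeric check of that two-temperature claim, with made-up Arrhenius parameters (the Ea, A, temperatures, and times below are illustrative assumptions, not data for any real reaction):

```python
import math

# Sketch: extent of reaction after equal times at T1 and T2, versus the
# whole time spent at their average temperature. Ea and A are made-up
# illustrative values.

R = 8.314     # J/(mol K)
Ea = 80e3     # J/mol (assumed)
A = 1e10      # 1/s (assumed)

def k(T):
    """Arrhenius rate constant at absolute temperature T."""
    return A * math.exp(-Ea / (R * T))

T1, T2 = 280.0, 300.0   # K
t = 3600.0              # seconds spent at each temperature

# At low conversion, extent of reaction ~ k * t
extent_split = k(T1) * t + k(T2) * t
extent_avg = 2 * t * k((T1 + T2) / 2)

print(extent_split > extent_avg)   # True, by Jensen's inequality
```

With these numbers the split-temperature run reacts noticeably further, because the speed-up at 300 K far outweighs the slow-down at 280 K.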

I've used this relationship quite a bit in the past when I was working with perishable food items. The food was never stored at a constant temperature - even properly working refrigerators cycle around a set point - so we used data loggers to record the temperature at regular intervals. The software that came with the data loggers would calculate the "average" temperature, a value that you can now see is meaningless. Instead we would calculate a "mean kinetic temperature". Using the Arrhenius equation, we would determine the extent of degradation during each time interval, sum up the total extent of the reaction, and then back-calculate the temperature that would have given us that reaction extent. (This is trivial to set up in a spreadsheet - import the data, hit F9 and you're done.) The mean kinetic temperature was always higher, and the higher the activation energy for the food, the greater the deviation.[*]
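The spreadsheet procedure can be sketched in a few lines. Both the activation energy and the temperature log below are illustrative assumptions, not real data-logger output:

```python
import math

# Sketch of a mean kinetic temperature (MKT) calculation from a
# temperature log. Ea and the readings are illustrative assumptions.

R = 8.314    # J/(mol K)
Ea = 83e3    # J/mol (assumed; varies with the food and the reaction)

temps_K = [277.0, 279.0, 283.0, 281.0, 278.0]   # equally spaced readings

# Degradation extent in each interval ~ exp(-Ea/RT); average the rates,
# then back-calculate the one temperature giving the same total extent.
mean_rate = sum(math.exp(-Ea / (R * T)) for T in temps_K) / len(temps_K)
mkt = -Ea / (R * math.log(mean_rate))

arith_mean = sum(temps_K) / len(temps_K)
print(mkt > arith_mean)   # True: the MKT always sits above the average
```

The gap between the two numbers grows with Ea, which is exactly why the high-activation-energy foods fare worst after temperature abuse.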

I was directly reminded of the inequality today when I came across a paper that looked at the implications of Jensen's inequality as applied to Ca2+ oscillations in cells. Surprisingly, the cells do much better than when the output is steady, and this was shown to be a direct result of the inequality.

[*] This is what I was getting at a few weeks ago after the power went out and we had to triage the food in the fridge. I tossed all the food with high activation energies since Jensen's inequality greatly disfavored them after temperature abuse.

Thursday, November 11, 2010

Olefin Metathesis

The blog All Things Metathesis has a nice entry about a large scale metathesis reaction, and how the changing economy has impacted it.

Metathesis can be (over)simply stated as moving double bonds around. The reaction is actually a combination of breaking and reforming the double bonds, but from looking strictly at the inputs and outputs, it looks like the double bonds are moving around. This is obviously important to the polymer industry as the double bonds can be used in polymerization schemes.

What I never knew was that metathesis is an equilibrium process, so that it can be run backwards - an important part of the post. Read the entry and see why.

Wednesday, November 10, 2010

Flow-Induced Crystallization #4

A new development in the area of flow-induced crystallization is the discovery of "dormant" nuclei.

Back when I was last in the field, the thought was that the development of a stable crystallization nucleus was a balancing act between the surface energy of the crystal and the Gibbs energy released during crystallization. As a nucleus first forms, its surface area is relatively large, so the surface energy required to grow it initially works against the effort, even though the Gibbs energy favors crystallization. After reaching a certain size, however, the energetics for growth become more favorable and growth proceeds without further problems. The figure below shows this with the solid curve. Initially the slope of the thermodynamics (dΔG/dr) is against growth until the nucleus reaches the size rc, after which the slope becomes favorable. Unhindered growth only occurs after the curve drops below ΔG = 0.
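That solid curve is just textbook classical nucleation theory for a spherical nucleus, which can be sketched directly. The sigma and dGv values below are arbitrary placeholders, not constants for any real polymer:

```python
import math

# Classical nucleation theory sketch of the solid curve:
#   dG(r) = 4*pi*sigma*r**2 - (4/3)*pi*dGv*r**3
# with critical radius r_c = 2*sigma/dGv. The constants are arbitrary
# illustrative numbers, not polymer data.

sigma = 0.03    # surface energy, J/m^2 (assumed)
dGv = 1.0e7     # Gibbs energy released per unit volume, J/m^3 (assumed)

def delta_G(r):
    """Net Gibbs energy of forming a spherical nucleus of radius r."""
    return 4 * math.pi * sigma * r**2 - (4 / 3) * math.pi * dGv * r**3

r_c = 2 * sigma / dGv   # where the slope d(dG)/dr turns favorable

print(r_c)                                 # critical radius, m
print(delta_G(0.5 * r_c) < delta_G(r_c))   # True: still climbing the barrier
print(delta_G(2.0 * r_c) < 0)              # True: unhindered growth
```

Below rc the curve is still climbing (growth costs energy); past rc the slope is favorable, and once ΔG drops below zero the nucleus grows freely.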

You can also see on the figure the new understanding, which follows the dashed line. In the past, surface energy was taken as a constant, and so the solid line was the result. Surface energy is now understood to be somewhat variable, thereby creating the situation shown with the dashed line. Very, very small nuclei of size r1 are actually stable. These form quickly and easily under flow and stay there until the thermodynamics are favorable for growth, i.e., they are dormant.

Despite all these theoretical machinations, you can probably see that this phenomenon can impact real world problems. Just about all industrial polymer processes have some sort of shear and/or extensional flow and so all this unusual crystallization is out there lurking. Since the end product is crystallized, it's not like things will look different, but the potential still exists particularly if your temperatures are not very hot.

So with that, I'm going to wrap up this series of posts. There certainly is more information in the article, but I'll leave that for another time if it is relevant. My point here is that even in a relatively staid field like polymer science, (minor) revolutions in theoretical understanding can easily occur - especially when you disappear from the field (à la Rip van Winkle) for 20 years.

Tuesday, November 09, 2010

Can’t Touch This

Everybody is aware of the exploding red dye packs that banks throw into bags of stolen money; upon exploding, the money is permanently dyed and unusable. Now a new alternative has been invented, one that uses polymers, specifically polyurethanes. Apparently a quick-reacting foam is created which envelops and encases the money in an unmanageable blob. An interesting idea.

This was first reported in South Africa, which is a good thing. Here in the US, I can certainly imagine the scenario where the criminal gets some of the isocyanate on their skin, has a cardiopulmonary seizure, and then they (or their survivors) sue the bank (and their owners), the foam manufacturer (and their owners and their suppliers and the owners of the suppliers), the city, the state, and the Federal Reserve for 685 million dollars.

Tip of the hat to the Urethane Blog for the heads up.

Flow-Induced Crystallization #3

So the big question about flow-induced crystallization is "WHY?" Why does it occur? And this is where what I used to know and what is now known begin to diverge.

Any flow field takes the random coil that a polymer is in and begins to orient it to some degree. Extensional flow fields (a zone that necks down) are better at this than shear flows, but both will do fine. Given this, the old explanation was that the partially oriented polymer had a lower entropy, and so the entropy change upon crystallization would be less than before. At a phase change (such as crystallization), T = ΔH/ΔS, where ΔH is the heat of crystallization and ΔS is the entropy change upon crystallization. Since ΔS is now smaller and ΔH is constant, the temperature at which crystallization occurs is increased.
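To see why the old argument was so attractive, here is a minimal sketch of it. Keep in mind this illustrates the now-discredited reasoning, and the ΔH and temperature values are rough, made-up numbers, not measured polymer data:

```python
# Sketch of the old (now-discredited) entropy argument: at the
# transition T = dH/dS, so if flow orientation reduced dS, the
# crystallization temperature would rise. All numbers are rough
# illustrative values, not real polymer data.

dH = 280e3            # J/kg, heat of crystallization (assumed)
T_quiescent = 414.0   # K, quiescent transition temperature (assumed)
dS = dH / T_quiescent # entropy change implied by T = dH/dS

for reduction in (0.01, 0.05, 0.10):      # fractional drop in dS from flow
    T_flow = dH / (dS * (1 - reduction))
    print(f"{reduction:.0%} less entropy -> +{T_flow - T_quiescent:.1f} K")
```

Even a few percent reduction in ΔS would shift the transition by several to tens of degrees, which is why the idea hung around so long.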

Nice idea, but it now appears to be totally wrong. Note that the orientation does occur, and the formula given is also correct; it's just that the entropy change induced by the flow has now been shown to be too small to give a meaningful change in the crystallization temperature.

And so an old idea goes down in flames. But we're just getting started. A bigger blow will come tomorrow in the next post of this series.

Monday, November 08, 2010

Flow-Induced Crystallization #2

As promised, here is the first of a set of entries on different aspects of flow-induced crystallization.

The picture below shows the tremendous impact that flow can have on crystallization rates. It's a little confusing, but it will become clear shortly. This plot is for polypropylene, but you will find similar results for HDPE and others. Look first at the left-hand side. This shows the number of crystal nuclei formed (Nc, on a log scale) under non-flow conditions and at different temperatures. The cooler the temperature, the more crystals there are - pretty standard stuff. Any higher than 130 °C and the number of crystals is too small to measure (or at least too small to wait around for them to form!). Now look at the right-hand side. This data is all taken at 140 or 150 °C, higher than all the data on the left-hand side, and look at what is going on - the number of crystal nuclei is in the same range as under the quiescent conditions, but the temperatures are 50 degrees or more higher.

Considering that most polymer processes have some form of flow occurring in them, the applications for this can be extremely profound across a wide range of products and processes.

As I mentioned before, things have changed in this area of research in the last 20 years, but data such as this has not. In the next post, I'll start talking about the explanations for why this behavior occurs - what we used to think and what the current thinking is.

If All You Have is a Hammer...

The ChemJobber had an entry today about interviewing at a small company. I posted some comments there, but realized that I could have said more.

As CJ notes, small companies do not have complete analytical labs and instead use outside analytical labs to fill in the gaps. Even analytical labs don't have everything - we certainly don't - but I don't see that as a bad thing. It's great having the equipment on site, as you can play with it on your own to solve a question that is nagging you without having to worry about who's paying for it, but at the same time, the fact that you do have the equipment can limit you.

An example might be best. We don't have any NMR equipment. However, when we do need NMR, we have the whole world to choose from for the testing. Sometimes a simple setup is all we need, sometimes we need cutting-edge equipment - it's our choice as to who to work with.

Compare that with any instruments that we have on site. They are all good instruments, but as a result of a) easy access, b) laziness, c) financial incentives [*] and d) the "if all you have is a hammer, everything looks like a nail" syndrome, we often use them when sending a sample out could be the better choice.

That said, sending samples out can still be a hassle. Some companies have good turnaround, some don't. Some will give you great analysis, some will tell you "we conclude that the sample was 12% carbon" which you already know from looking at the data tabulated in the report. It takes a while to develop a good network, so I certainly ask colleagues for recommendations before sending anything.

[*] If we use our instrument, we can charge the client for the instrument time and keep all that money. If we use outside equipment, we only pass those charges through. The bill to the client is the same, but the amount we keep is different.

Friday, November 05, 2010

Flow-Induced Crystallization

My graduate research was in the title topic. It is actually a very interesting topic to study, as it is a very unusual phenomenon. In a nutshell, crystallizable polymers (such as PE, PP, ...) can crystallize extremely quickly at high temperatures when flowing - temperatures so high that the crystallization is non-existent in the absence of the flow. This happens in both the melt as well as in solutions.

After school, I left the topic alone. My work had been in ultra-dilute solutions (0.01 wt%), so the practical applications of that specific approach were negligible. However, I just ran across a review paper that came out back in September (open access until the end of November) and was shocked by how much the field has progressed. Basically, all the rationalizations that we used 20 years ago to explain why things were happening as they did have been proven wrong and replaced with better explanations. Wow. From my dissertation, you could basically take the background discussion, and many of the calculation results based on it, and just put a big X through them.

I'm a little short on time today to discuss this much further, but I will get into it more next week. For polymer melts, the ramifications of flow-induced crystallization can be quite severe, so it is not just an esoteric topic. 'Til then.

Thursday, November 04, 2010

Plastics in the "Economist"

It's always nice to see a mainstream publication discussing a more technical aspect of polymers, especially a respected one such as the Economist. Last week they discussed the new results (Biomacromolecules - subscription/pay-per-view required) from the Schiraldi group at Case Western, making a casein-based polymer reinforced with nanoclay. The hype here is that it is biodegradable, as it is protein based.

The summary in the Economist has a few laughable details - such as describing some of the processing conditions as "freezing it at 80°C below zero", and just to be sure this was not a one-time error, they repeat it again later: "It was then cured for 24 hours in an oven at 80°C above zero." (-80°C and 80°C are sufficient.) And somehow they are not concerned that it takes 4+ days to make a batch of this new material, obviously ignorant of how short a time period is involved in modern polymerization schemes.

The biodegradability of the material is (as expected) only measured in compost conditions, where 20% of the material was degraded in 18 days. The lack of widespread composting facilities was also overlooked by the Economist, which is especially poignant since they describe the compost conditions as a "dump-like environment".

You will recall that the Wall Street Journal made an attempt not too long ago to explain the glass transition temperature, an effort that came off a little better than this one.

Tuesday, November 02, 2010

Funding "Meaningless" Research

Again, staying on the scientific side of politics, it is very common for politicians of a certain persuasion to rant and rave about wasteful research supported by the federal government. Giving chickens Perrier water [*] is my favorite example; you'll have your own.

What never gets mentioned is what is really happening behind the research, no matter how silly it may seem or even how inane it really is. In every case, a student is getting an education and a higher degree - either a master's or a Ph.D. The only way those degrees come about (in the sciences and engineering at least) is through government support of research.

At least when I was in grad school, the payback to the government and society was pretty clear. At my first job, I was paying in taxes what I used to receive as my stipend; I am now a better educated engineer, able to contribute to my employer and our clients in ways far beyond what I could without the advanced degree. The actual work that I produced has had little practical value, but that is beside the point. It was only a vehicle for an advanced education, not an end in itself.

So the next time you hear a politician rail against money going to study birth defects in earthworms or some other silliness, think of the students behind the work bettering themselves. That is always a success, even if the research is a failure.

[*] I remember this one well. Illinois has a very big ag department. One of the researchers discovered that chickens laid poorer-quality eggs in the summer because the eggshells had less CaCO3 in them, a result of the chickens expiring CO2 in an effort to stay cool. (Like dogs, chickens don't sweat and instead cool themselves through increased respiration rates.) So one thought was to increase the CO2 intake of the chickens by giving them carbonated water. The local press actually called it "Perrier water", apparently taking that as a generic term. While this approach did work, it was found to be more cost effective to run cool water through the chickens' perches.

Election Day

This blog is not about politics. Today's post will briefly look at predictions for today's elections across the US, but only because there is one site that takes a scientific look at the polling data and uses it to make projections. That site is FiveThirtyEight.

The details of the methodology are somewhat unclear (certainly for proprietary reasons), but the blogger, Nate Silver, a baseball statistician, uses a Monte Carlo method to run simulations based on published polling data. This election cycle is actually his second time out - the first was 2 years ago and covered just the presidential election, and in that case his model was pretty accurate in predicting the electoral college results (hence the title of his blog). This time, he's looking at the governor races, the senate races and the representatives too. I think this will be much more challenging, as the number of polls for many of these races is quite a bit smaller.

Regardless, the outcome of the 100,000 simulations is a distribution with an average and a range of what we can expect, all of which is more meaningful to us scientifically inclined individuals than what is reported by the talking heads. If you've read his stuff in the past, you know he personally has a leftward bent in his politics, but to me he really tries to be as accurate as possible with these predictions. He is very up front that for every 10 races where he is 90% certain of the winner, he had better be wrong (or nearly wrong - that's just the way statistics works) in about one of them, or else his model is wrong.
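For the curious, the basic idea behind this kind of simulation is simple enough to sketch in a few lines. The win probabilities below are made up for illustration - nothing here comes from the actual FiveThirtyEight model, which is far more sophisticated (correlated errors between races, pollster weightings, and so on):

```python
import random

def simulate_senate(win_probs, n_sims=100_000, seed=42):
    """Toy Monte Carlo: treat each race as an independent coin flip
    weighted by its polling-derived win probability, and tally how
    many seats one party wins in each simulated election."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        seats = sum(1 for p in win_probs if rng.random() < p)
        totals.append(seats)
    mean = sum(totals) / len(totals)
    return mean, min(totals), max(totals)

# Hypothetical win probabilities for 10 contested races
probs = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.45, 0.3, 0.2, 0.1]
mean_seats, low, high = simulate_senate(probs)
```

Running this 100,000 times gives exactly what the post describes: not a single prediction, but a distribution of outcomes with an average and a range.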

I will be quite curious to see tomorrow how well the predictions come off. As of this moment, he has the Democrats as 93% favorites to control the Senate, but with only a 16% chance of controlling the House. He has odds for all the individual races as well, so check out your local ones for yourself.

Monday, November 01, 2010

2012 ANTEC back to Orlando - but with NPE

Courtesy of the "In the Hopper" blog, the Society of Plastics Engineers has announced that the 2012 ANTEC show will be in Orlando, but in a big change, it will be at the same time (April 1 - 5) and place as the NPE show.

ANTEC was just held in Orlando last year although it was in early May. I personally found the weather to be pretty uncomfortable, so I am looking forward to an April show instead. Plus I've never been to the NPE, so if I can just get a paper worked up and accepted, it could be a great trip.

Of course, I still have to work on this year's paper - something I am taking a break from to write this. Remember, the deadline for abstracts AND the complete paper is November 19th this year - considerably earlier than in the past.

Dynamic Mechanical Analysis of Polymers

I was running some PLA this last week and just fell in love with the stuff - it's basically as well behaved as acrylates are. The time-temperature superposition is very easy to run, with both G' and G" coming together beautifully. I also get a good amount of horizontal shifting, so the master curve really takes in a wide range of frequencies. Basically, I look like a genius because the data is so good.
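The horizontal shifting behind a master curve can be sketched with the WLF equation. The constants below are the classic "universal" textbook values, not anything fit to PLA, and the reference temperature and frequencies are purely illustrative; a real analysis fits C1 and C2 to the material:

```python
def wlf_shift_factor(T, T_ref, C1=17.44, C2=51.6):
    """Log10 of the WLF horizontal shift factor a_T, using the
    classic 'universal' constants (real fits are material-specific)."""
    return -C1 * (T - T_ref) / (C2 + (T - T_ref))

def shift_to_master_curve(freqs, T, T_ref):
    """Multiply measured frequencies by a_T to slide an isotherm
    onto the master curve at the reference temperature."""
    aT = 10 ** wlf_shift_factor(T, T_ref)
    return [f * aT for f in freqs]

# e.g. shift a 70 degC isotherm onto a 60 degC reference
shifted = shift_to_master_curve([0.1, 1.0, 10.0], T=70.0, T_ref=60.0)
```

Since the isotherm was measured above the reference temperature, the data slides to lower reduced frequencies - which is how a modest frequency sweep at several temperatures ends up spanning many decades on the master curve.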

The other extreme that I have to face on a very regular basis is a PVC-wood flour composite made by our parent corporation. It's just a nightmare - you've got to move fast and can't go too high in temperature before it starts burning and degrading (both the PVC and the wood!). I end up with lots of scatter, a limited output range and data that is only marginally helpful for reaching any conclusions.

I hate working with silicones as well, because they are so insensitive to temperature that you really have to make large changes before you see anything change.

So what are your dream and nightmare materials to test?