July 31, 2013
Last week I attended part of the Summer Institute run by the Center for Process Analytical Chemistry (CPAC – but not the political one!) (www.cpac.washington.edu). While I didn’t attend all the sessions (one of the privileges of being retired), I did hear some interesting talks. Kurt VandenBussche, UOP Honeywell, talked about big data in the petroleum industry. Apparently, companies are collecting billions of data points per day. These data are then analyzed to decide what the most profitable products are to make from a given source, whether crude oil or bio-refinery output. Sometimes this is done using science (Composition Based Modeling, Stephan Jaffe, ExxonMobil) based on properties of molecular segments, but sometimes it is done purely with statistics and probability, with no understanding of the underlying reason. So much for chemistry classes! I joked that UOP has become the NSA of refining, since they have so much data available. (Aside: My own suggestion to the NSA for preventing people from taking data out of the building on flash drives is to walk around the building chewing gum, and put gum in all the flash drive sockets.)
The trend worldwide is that crude is becoming, on average, heavier and higher in sulfur. Just the opposite is happening in North Dakota, where the crude is lighter and lower in sulfur.
Ray Chrisman of Atochemis srl (Italy) gave another talk, on continuous micro-processing (i.e., where microfluidics plays a role), a field in which he is an expert. He emphasized the need to spend more effort on separation and purification in small, easily-reproducible devices for extraction, crystallization, distillation, membrane separation, and chromatography.
Apparently, when making isobutanol in a biorefinery, the broth must be withdrawn, the isobutanol removed from it, and the broth recycled, since the isobutanol kills the organisms. Thus, attention goes beyond the chemical reactor and includes the separation system as an integral part of the reactor.
There were a few references to continuous micro-processing systems that have been scaled up by adding identical units in parallel. The idea of micro-processing is that you learn how to make one small device that performs all the mixing, reaction, and separation, then make many of them to run in parallel, taking advantage of the fabrication techniques developed by computer chip manufacturers. This is the subject of a book that contains an article written with my undergraduate research students about transport in microfluidic devices.
Bruce A. Finlayson, Pawel W. Drapala, Matt Gebhardt, Michael D. Harrison, Bryan Johnson, Marlina Lukman, Suwimol Kunaridtipol, Trevor Plaisted, Zachary Tyree, Jeremy VanBuren, Albert Witarsa, “Micro-component flow characterization,” chapter in Micro-Instrumentation (M. Koch, K. Vanden Bussche, R. Chrisman, eds.), Wiley, 2007.
More info about my microfluidic work is on my University web site, http://faculty.washington.edu/finlayso/ (choose “papers”). See also the book website, http://www.ChemEComp.com/, since many chapters of the book are devoted to flow and mass transfer in microdevices.