An education isn't how much you have committed to memory, or even how much you know. It's being able to differentiate between what you do know and what you don't. --Anatole France (1844 - 1924)
2007 Jul 27 From PHYSICS NEWS UPDATE 834 http://www.aip.org/pnu LASER ICEMAKING. Physicists at the University of Goettingen have for the first time gotten supercooled water to freeze using pulses of laser light. Supercooling occurs when a sample of water is chilled down through its normal freezing point (0 C) without crystallization occurring. This can happen in a small sample if no *nucleation* site presents itself around which solid ice (a crystal structure) can form. The incoming laser pulse brings about an optical breakdown: some of the water molecules are ionized, creating a momentary plasma. The hot plasma expands and forms a vapor bubble that collapses very rapidly. It is the pressure waves emitted by the tiny plasma and the bubble collapse which, the Goettingen scientists believe, trigger the rapid crystallization. Previously an acoustic equivalent of this process -- sonocrystallization -- had been observed, but this is the first time crystallization has been initiated by a laser pulse. One of the researchers, Robert Mettin (R.Mettin@physik3.gwdg.de), suggests that laser icemaking can be extended to studying the solidification of other materials. (Lindinger et al., Physical Review Letters, 27 July 2007)
2007 Jun 19 From PHYSICS NEWS UPDATE 829 http://www.aip.org/pnu TIME AND TIME AGAIN. The physics world accepts the idea of spacetime, a combined metrical entity which puts time on the same footing as the visible three spatial dimensions. Further spatial dimensions are added in some theories to help assimilate all physical forces into a unified model of reality. But what about adding an extra dimension of time too? Itzhak Bars and Yueh-Cheng Kuo of the University of Southern California do exactly that, and add an extra spatial dimension as well. Bars explains this proposal with a comparison. Just as a projection of a 3D object onto a 2D wall can have many different shapes, and each such shape is incapable of fully conveying all the properties of the 3D object, so the single-time description of dynamics in the standard formulation of physics is insufficient to capture many properties of dynamical systems which have remained mysterious or unnoticed. The addition of an extra time and an extra space dimension, together with a requirement that all motion in the enlarged space be symmetric under an interchange of position and momentum at any instant, reproduces all possible dynamics in ordinary spacetime, and brings to light many relationships and hidden symmetries that are actually present in our own universe. The hidden relationships among dynamical systems are akin to relationships that exist between the multiple shadows of a 3D object projected on a 2D wall. In this case the object is in a spacetime of 4 space and 2 time dimensions while the shadows are in 3 space and 1 time dimensions. The motion in 4+2 dimensions is actually much more symmetric and simpler than the complex motions of the shadows in 3+1 dimensions. Besides the general unification of dynamics described above, what does the addition of one extra time and one extra space dimension (in addition to all those extra space dimensions called for in string theory) accomplish that could not be achieved without it?
Bars (firstname.lastname@example.org) says that his theory explains CP conservation in the strong interactions described by QCD without the need for a new particle, the axion, which has not been found in experiments. It also explains the fact that the elliptical orbit of planets remains fixed (not counting well-known tiny precessions). This *Runge-Lenz* symmetry effect has remained somewhat mysterious in the study of celestial mechanics, but now could be understood as being due to the symmetry of rotations into the fourth space dimension. A similar symmetry observed in the spectrum of hydrogen would also be accounted for in 2-time physics, and again explained as a symmetry of rotations into the extra space and time dimensions. There are many such examples of hidden symmetries in the macroscopic classical world as well as in the microscopic quantum world, Bars argues, which can be addressed for the first time with the new 2T formulation of physics. There have been previous attempts to formulate theories with a second time axis, but Bars says that most of these efforts have been compromised by problems with unitarity (the need for the sum of all probabilities of occurrences to be no greater than 1) and causality (maintaining the thermodynamic arrow of time). The USC theorists have reformulated their model to fit into the ongoing supersymmetry version of the standard model and expect their ideas to be tested in computer simulations and in experiments yet to come. (Physical Review Letters, upcoming article; http://physics1.usc.edu/~bars/ )
2007 Apr 11 From PHYSICS NEWS UPDATE 819 http://www.aip.org/pnu NEWTON'S SECOND LAW OF MOTION, that pillar of classical physics, the formula that says the force on an object is proportional to its acceleration, has now been tested, and found to be valid, at the level of 5 x 10^-14 m/s^2. This is a thousandfold improvement in precision over the best previous test, one carried out 21 years ago (Physical Review D, vol 34, p 3240, 1986). The new test was performed by physicists at the University of Washington using a swiveling torsion pendulum, a special kind of pendulum in which the restoring force is not gravity (as in a hanging pendulum) but is provided by a very thin torsion fiber. One implication of Newton's law is that the pendulum's frequency (its tick-tock rate) should be independent of the amplitude of its swiveling (as long as the oscillation is small). Looking for a slight departure from this expected independence, the Washington researchers watched the pendulum at very small amplitudes; in fact the observed swivel was kept so small that the Brownian excitation of the pendulum was a considerable factor in interpreting the results. Newton's second law is expected to break down at subatomic size scales, where quantum uncertainty frustrates any precise definition of velocity. But for this experiment, where the pendulum has a mass of 70 g and consists of 10^24 atoms, quantum considerations were not important.
According to one of the scientists involved, Jens Gundlach (206-616-3012, email@example.com), this new affirmation that force is proportional to acceleration (at least for non-relativistic speeds) might influence further discussion of two anomalies: (1) oddities in the rotation curves for galaxies --characterizing the velocity of stars as a function of their distance from the galactic center-- suggest either that extra gravitational pull from as-yet-undetected dark matter is at work or that a modified form of Newton's second law could be operating (referred to as Modified Newtonian Dynamics, or MOND); and (2) the ongoing mystery surrounding the unaccounted-for accelerations apparently characterizing the trajectory of the Pioneer spacecraft (see http://www.aip.org/pnu/1998/split/pnu391-1.htm). (Gundlach et al., Physical Review Letters, upcoming article).
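The amplitude-independence the Washington group tested follows directly from the linearity of Newton's second law. Here is a minimal numerical sketch (with illustrative parameter values, not the actual apparatus): a torsion pendulum obeying I*theta'' = -kappa*theta has the same period at large and at tiny amplitudes.

```python
# Torsion pendulum sketch: integrate I*theta'' = -kappa*theta and time the
# oscillation period at two very different amplitudes. All parameter values
# here are illustrative placeholders, not the experiment's.
import math

def period(theta0, kappa=1.0, inertia=1.0, dt=1e-4):
    """Return the period found by timing successive downward zero crossings."""
    theta, omega, t = theta0, 0.0, 0.0
    crossings = []
    while len(crossings) < 2:
        omega += -(kappa / inertia) * theta * dt   # semi-implicit Euler step
        new_theta = theta + omega * dt
        if theta > 0.0 >= new_theta:               # downward zero crossing
            frac = theta / (theta - new_theta)     # linear interpolation
            crossings.append(t + frac * dt)
        theta, t = new_theta, t + dt
    return crossings[1] - crossings[0]

big, small = period(0.5), period(1e-4)
# both periods equal 2*pi*sqrt(I/kappa) to numerical accuracy; any
# amplitude dependence would signal a departure from F = ma
print(big, small)
```

Adding a small nonlinear term to the restoring torque would make the two periods differ; that is the kind of departure the experiment looked for, and did not find, at the 5 x 10^-14 m/s^2 level.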
2007 Mar 16 From PHYSICS NEWS UPDATE 815 http://www.aip.org/pnu PHYSICS AND PROGRESS. Why do science? To learn more about the universe and to improve the material and intellectual conditions of people. The recently concluded APS March meeting was a great arena for showcasing new fundamental ideas in physics and also for seeing how these ideas can be marshaled for producing practical commercial benefits. Here are three examples: 1. Metamaterials. The architecture of these artificial nanoscale-engineered materials made of tiny ring-, strip-, and rod-shaped components serves to enhance the magnetic interaction between light and matter. This results in the material having a negative index of refraction and consequently various novel optical properties. One practical goal of negative-index optical research is superlensing, a process in which a thin flat panel of the metamaterial would be able to image an object at a spatial resolution better than the wavelength of the illuminating light. Since metamaterials were first realized in the lab for microwave light, physicists have been pushing negative-index behavior to shorter and shorter wavelengths. To bring about a negative-index condition, the material's electric permittivity (a measure of a material's response to an applied electric field) must be negative, and in some cases also its magnetic permeability (a measure of the material's response to an applied magnetic field). (To read more about these parameters and early reports of metamaterials, see http://www.aip.org/pnu/2000/split/pnu476-1.htm.)
At last week's APS meeting Vladimir Shalaev (Purdue University, firstname.lastname@example.org) reported a negative-index material operating at a wavelength of 770 nm (at the end of the visible spectrum), the shortest wavelength observed for a single-negative (negative permittivity) material, and the same material (but with a different light polarization) operating at a wavelength of 815 nm, the shortest wavelength observed for a double-negative material (both negative permittivity and permeability). See Shalaev's review article in Nature Photonics, January 2007. 2. Graphene. These essentially one-atom-thick carbon sheets were presented at last year's meeting by no more than a few groups. Now there are dozens. The reasons for this are graphene's adaptable mechanical and electrical properties and the very unusual behavior of electrons moving through a graphene landscape: you increase the electrons' energy but you don't increase their velocity. It's as if the electrons were acting like slow-moving light waves. Pablo Jarillo-Herrero (Columbia Univ, email@example.com) reported the latest developments in this rapidly moving research area, including the useful development of graphene ribbons; the resistivity of the material changes according to the width of the ribbons, meaning that the semiconducting properties of graphene could be tailored to suit the application. He also summarized recent progress in the field, including the observation of superconducting graphene transistors (Delft), freely suspended graphene sheets, a room-temperature Hall effect, and room-temperature single-electron transistors made with graphene (Manchester). 3. Light-emitting diodes. Moving from two new topics (metamaterials and graphene) to a more mature field, the production of light by combining holes and electrons inside a semiconductor junction, we see that considerable forward strides are still possible.
George Craford (Lumileds/Philips) described a new record-setting white-light high-power LED: with an input current of 350 mA, the one-square-millimeter device produced light at a rate of 115 lumens per watt, the first time a high-power LED has exceeded the 100 lm/W mark. LEDs, because of their energy efficiency and their concentrated output, are already frequently used in traffic lights, brake lights, and in building lighting. Craford predicted that LEDs would soon be used in cellphone flashes, in daytime automobile running lights, and (later this year) in auto headlights.
2006 Mar 19 From NA Digest Volume 06 : Issue 12 Subject: Mauchly and Eckert fired This month is the anniversary of some important events in the history of computing. The two people with the most formidable claim to inventing the computer, John Mauchly and Presper Eckert, were fired by the University of Pennsylvania 60 years ago this month. Their proposal to build the first digital electronic calculator was ignored by university administrators until Lt. Herman Goldstine in April 1943 convinced the US Army to build what became known as ENIAC. Even though the project supported roughly a dozen people, the principals were not treated well: unlike some of his classmates, the electronics genius Eckert was never made a faculty member, and the visionary Mauchly had to give up teaching to work on his project at lower pay. The university ceded patent rights to Mauchly and Eckert until administrator Irven Travis reversed course. Dean Harold Pender then gave Mauchly and Eckert an ultimatum to sign either patent waivers or resignations on March 22, 1946. The immediate impact was to disband the talented group that Mauchly and Eckert had assembled. Perhaps only John von Neumann recognized the importance of personnel continuity in innovation when he hired Goldstine and tried to hire Eckert to build a programmable machine at the Institute for Advanced Study. The lead then shifted to England, where the first modern computers (digital, electronic, programmable) were built at Manchester and Cambridge based on lessons learned from COLOSSUS, ENIAC, and wartime radar research. Today it is not unusual to find millionaires among computer science faculty. The days of coddled technicians, stock options, venture capital, and IPOs were made possible by the industry Mauchly and Eckert created. An entertaining but emotional account of ENIAC is the book by Scott McCartney reviewed by William Aspray at http://www.siam.org/siamnews/12-99/eniac.htm . There is no balanced historical account of the early days of computers.
Scholarly books about individual machines are those by Nancy Stern about ENIAC, and by Aspray about von Neumann and the IAS computer. (Cheers, -- Joseph Grcar, firstname.lastname@example.org)
2006 Feb 28 From PHYSICS NEWS UPDATE Number 757 ATOM WIRES. The smallest wire width in mass-produced electronic devices is about 50 nm, or about 500 atoms across. The ultimate limit of thinness would be wires only one atom wide. Such wires can be made now, although not for any working electronic device, and it is useful to know their properties for future reference. Paul Snijders and Sven Rogge from the Kavli Institute of Nanoscience at the Delft University of Technology and Hanno Weitering from the University of Tennessee built the world's smallest gold necklaces by evaporating a puff of gold atoms onto a silicon substrate which had first been cleared of impurities by baking it at 1200 K. The crystalline surface was cut to form staircase corrugations. Left to themselves, the atoms then self-assemble into wires (aligned along the corrugations) of up to 150 atoms each (see figure at www.aip.org/png ). The researchers then lower the probe of a scanning tunneling microscope (STM) over the tiny causeway of gold atoms to study the nano-electricity moving in the chain; the probe both images the atoms and measures the energy states of the atoms' outermost electrons. What they see is the onset of charge density waves---normally, variations in the density of electrons along the wire moving in pulselike fashion. But in this case (owing to the curtailed length of the wire) a standing wave pattern results as the temperature is lowered. The wave is quantum mechanical; hence only certain wavelengths are allowed. In other words, the charge density wave is frozen in place, allowing the STM probe to measure the wave (the electron density) at many points along the wire. Surprisingly, two or more density waves could co-exist along the wire. The charge density disturbance can also be considered as a particlelike thing, including excitations which at times possess a fractional charge. (Snijders et al., Physical Review Letters, 24 February 2006; email@example.com)
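The quantization at work here can be sketched with a particle-in-a-box estimate (a simplification of my own, not the authors' model): a wave pinned at both ends of a finite wire can only take wavelengths lambda_n = 2L/n, so the frozen charge density wave must adopt one of a discrete set of standing-wave patterns, and patterns with different n can be superposed on the same wire.

```python
# Allowed standing-wave wavelengths on a finite atomic chain.
# The 150-atom chain length is from the article; the 0.3 nm atom spacing
# is an assumed order-of-magnitude value for illustration only.
n_atoms = 150
spacing_nm = 0.3
L = (n_atoms - 1) * spacing_nm          # wire length in nm

# lambda_n = 2L/n for a wave pinned at both ends
allowed_nm = [2 * L / n for n in range(1, 5)]
print(allowed_nm)
```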
2005 Dec 7 From PHYSICS NEWS UPDATE Number 757 FRACTAL-DOMINATED CHEMISTRY. Why does cream poured into coffee swirl the way it does? A new study of how chemical reactions proceed establishes new equations for reaction rates by taking mixing irregularities into account. Many existing equations assume efficient mixing of ingredients, but this is often far from the case. Before reactions can take place, proper mixing has to occur, and as two Hungarian physicists now show in their simulations of mixing under more realistic fluid flow conditions, reactions often occur along a fractal frontier. Indeed, much real-world fluid chemistry is chaotic in nature and takes place not in general solution but along a many-filamented fractal surface. Previous studies addressed the steady, time-independent fractality of chemical reactions occurring in open flows, those in which fluid continuously flows into and out of a container. According to Gyorgy Karolyi (Budapest University of Technology and Economics) and Tamas Tel (Eotvos University), their new study is the first to address the tougher problem of a closed flow, one in which the fluid remains in the container; in this case, the resultant filamentary fractal is not steady but instead evolves through time, gradually filling up more and more of the container volume. They derive the relation between reaction rate and fractal dimensionality (the extent to which the surface of the filaments lies between that of a two-dimensional and a three-dimensional object). Fractal mixing is suspected in several natural systems, such as plankton in the ocean, sea ice floating in the ocean, and cloud patterns (http://earthasart.gsfc.nasa.gov/vortices.html). Karolyi (firstname.lastname@example.org) suggests that the new equations might provide new insights for those who design microfluidic devices such as the micromixers used in printing and medical equipment. (Karolyi and Tel, Physical Review Letters, upcoming article)
2005 Nov 9 From PHYSICS NEWS UPDATE Number 753 ZEN AND THE ART OF TEMPERATURE MAINTENANCE. Scientists at Iwate University in Japan have shown that the skunk cabbage---a species of arum lily whose Japanese name, Zazen-sou, means Zen meditation plant---can maintain its own internal temperature at about 20 C, even on a freezing day (picture at www.aip.org/png/2005/239.htm). The plant occurs in East Asia and northeastern North America, where its English name comes from its bad smell and from the fact that its leaves are like those of cabbage. Unlike mammals, which maintain their body temperature by constant metabolism in cells all over the body, the skunk cabbage produces heat chiefly in the spadix, the plant's central spike-like flowering stalk, through chemical reactions in the cells' mitochondria. According to one of the authors of the new study, Takanori Ito (email@example.com), only one other plant species, the Asian sacred lotus, is homeothermic, that is, able to maintain its own body temperature at a certain level. Most other plants do not produce heat in this way because they seem to lack the thermogenic genes (the technical name for which, in abbreviated form, is SfUCPb). Moreover, the researchers, studying subtle oscillations in the plant's internal temperature, claim that the thermo-regulation process is chaotic and that this represents the first evidence for deterministic chaos among the higher plants. The resultant trajectory in the abstract phase space (where, typically, one plots the plant's temperature at one time versus the temperature at a slightly later time) is a strange attractor, which the authors refer to as a Zazen attractor, a "Zen meditation" attractor. (Physical Review E, November 2005)
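The phase-space reconstruction described in parentheses is a standard delay-embedding trick, and it can be sketched in a few lines. The plant's temperature record is not reproduced here, so a logistic-map series stands in as a generic chaotic signal (an assumption for illustration only).

```python
# Delay-coordinate embedding: plot a signal against a time-shifted copy of
# itself. For a chaotic signal the embedded points trace out an attractor
# rather than a single loop (periodic motion) or a featureless cloud (noise).
def logistic_series(x0=0.2, r=3.9, n=500):
    """Generate a chaotic stand-in signal with the logistic map."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

xs = logistic_series()
tau = 1  # delay, in samples
embedded = list(zip(xs[:-tau], xs[tau:]))  # points (x_t, x_{t+tau})
print(len(embedded))
```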
2005 Oct 26 From PHYSICS NEWS UPDATE Number 751 WALKING MOLECULES. A single molecule has been made to walk on two legs. Ludwig Bartels and his colleagues at the University of California at Riverside, guided by theorist Talat Rahman of Kansas State University, created a molecule---called 9,10-dithioanthracene (DTA)---with two "feet" configured in such a way that only one foot at a time can rest on the substrate. Activated by heat or the nudge of a scanning tunneling microscope tip, DTA will pull up one foot, put down the other, and thus walk in a straight line across a flat surface. The planted foot not only supplies support but also keeps the body of the molecule from veering or stumbling off course. In tests on a standard copper surface, such as the kind used to manufacture microchips, the molecule has taken 10,000 steps without faltering. According to Bartels (firstname.lastname@example.org, 951-827-2041), possible uses of an atomic-sized walker include guidance of molecular motion for molecule-based information storage or even computation. DTA moves along a straight line as if placed onto railroad tracks without the need to fabricate any nano-tracks; the naturally occurring copper surface is sufficient. The researchers now aim at developing a DTA-based molecule that can convert thermal energy into directed motion like a molecular-sized ratchet. (Kwon et al., Physical Review Letters, upcoming article; text at www.aip.org/physnews/select; see movie at www.chem.ucr.edu/groups/bartels/)
2005 Aug 19 From PHYSICS NEWS UPDATE Number 742 ROOM TEMPERATURE ICE is possible if the water molecules you're freezing are subjected to a high enough electric field. Some physicists had predicted that water could be coaxed into freezing at fields around 10^9 V/m. The fields are thought to trigger the formation of the ordered hydrogen bonding needed for crystallization. Now, for the first time, such freezing has been observed, in the lab of Heon Kang at Seoul National University in Korea, at room temperature and at a much lower field than was expected, only 10^6 V/m. Exploring this new freezing mechanism should lead to additional insights about ice formation in various natural settings, Kang believes (email@example.com). The field-assisted room-temperature freezing took place in cramped quarters: the water molecules were constrained to the essentially 2-dimensional enclosure between two surfaces, a gold substrate and the gold tip of a scanning tunneling microscope (STM). Nevertheless, the experimental conditions in this case, a modest electric field and a narrow spatial gap, might occur in nature. Fields of 10^6 V/m are thought to exist, for example, in thunderclouds, in some tiny rock crevices, and in certain nanometer-scale electrical devices. (Choi et al., Physical Review Letters, 19 August 2005; for another example of seemingly room-temperature ice, see http://www.aip.org/pnu/1995/split/pnu225-1.htm )
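For a sense of scale, the field in a parallel-gap geometry is just E = V/d, so the two thresholds above correspond to quite different voltage/gap combinations (the pairs below are illustrative, not the experiment's actual bias and gap).

```python
# E = V / d: what it takes to reach the predicted vs. the observed field.
def field_v_per_m(volts, gap_m):
    return volts / gap_m

predicted = field_v_per_m(1.0, 1e-9)   # ~1 V across a 1 nm gap
observed = field_v_per_m(1.0, 1e-6)    # ~1 V across a 1 micron gap
print(predicted, observed)             # ~10^9 and ~10^6 V/m
```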
2004 Aug 12 From PHYSICS NEWS UPDATE Number 696 THE MASSIVE NORTHEAST BLACKOUT of a year ago not only shut off electricity for 50 million people in the US and Canada, but also shut off the pollution coming from fossil-fired turbogenerators in the Ohio Valley. In effect, the power outage was an inadvertent experiment for gauging atmospheric repose with the grid gone for the better part of a day. And the results were impressive. On 15 August 2003, only 24 hours after the blackout, the air was cleaner by this amount: SO2 was down 90%, O3 down 50%, and light-scattering particles down 70% relative to "normal" conditions in the same area. The haze measurements were made by University of Maryland scientists scooping air samples with a light aircraft. The observed pollutant reductions exceeded expectations, causing the authors to suggest that the spectacular overnight improvements in air quality "may result from underestimation of emission from power plants, inaccurate representation of power plant effluent in emission models or unaccounted-for atmospheric chemical reactions." (Marufu et al., Geophysical Research Letters, vol 31, L13106, 2004.)
2004 Aug 12 From PHYSICS NEWS UPDATE Number 696 THE LONG-TERM DYNAMICS OF THE ELECTRICAL GRID are examined in new studies conducted by Ben Carreras and his colleagues at Oak Ridge National Lab, the University of Wisconsin, and the University of Alaska. Engineers at the utilities are of course always looking for ways to make their systems better, especially in the aftermath of large blackouts such as the event of August 14, 2003. These post-mortem studies typically locate the sources of the outage and suggest corrective measures to prevent that kind of collapse from happening again, often by strengthening the reliability of specific components. Carreras argues that a more effective approach to mitigating electrical disasters is to build more redundancy into the system. To do this, he says, you need to look at how the electrical grid, considered as a dynamic system subject to many forces, behaves over longer periods of time, and that in turn requires building into any grid model the social and business forces in addition to the physical forces that govern the movement of electricity. Thus the Oak Ridge model not only solves the equations (governed by the Kirchhoff laws) that determine how much power flows through specific lines in a simulated circuit, but also builds in the strain on the system over time caused by an increasing demand for power, the addition of new generators and transmission lines, and even elements of chance in the form of weather fluctuations and the occasional shorting caused by warm, sagging lines contacting untrimmed trees. The model proceeds to let the grid evolve, and for each "day" it computes possible solutions---in the form of successful combinations of power generation levels and subsequent transmission of that power over existing lines, some of which come in and out of service---for the continued running of the grid. The model derives a probability curve for blackouts which matches the observed outage data for North America quite well.
The Oak Ridge scientists believe that their model could be used by utility companies to test grid behavior for various network-configuration scenarios, particularly those where the grid is operating dangerously close to a cascade threshold. (Carreras et al., Chaos, September 2004; firstname.lastname@example.org)
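The cascade mechanism at the heart of such models can be illustrated with a toy load-redistribution sketch (a drastically simplified stand-in of my own, not the Oak Ridge model): when a line trips, its load is shared among the survivors, which can push them past their own limits.

```python
# Toy cascading-failure model: trip any line whose load exceeds its
# capacity, spread its load equally over the surviving lines, and repeat
# until stable. All numbers are illustrative.
def cascade(loads, capacity):
    """Return the number of lines that trip before the system stabilizes."""
    loads = list(loads)
    alive = [True] * len(loads)
    tripped = 0
    while True:
        over = [i for i, (a, x) in enumerate(zip(alive, loads))
                if a and x > capacity]
        if not over:
            return tripped
        for i in over:
            alive[i] = False
            tripped += 1
            survivors = [j for j, a in enumerate(alive) if a]
            for j in survivors:
                loads[j] += loads[i] / len(survivors)  # redistribute
            loads[i] = 0.0

loads = [1.0] * 9 + [1.5]             # one line starts overloaded
print(cascade(loads, capacity=1.3))   # ample headroom: failure is contained
print(cascade(loads, capacity=1.15))  # thin headroom: total blackout
```

The qualitative point carries over to the real grid: the closer lines run to capacity, the farther a single failure propagates, which is why Carreras emphasizes redundancy and the slow build-up of strain.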
2004 Jun 21 From PHYSICS NEWS UPDATE Number 689 AMORPHOUS STEEL with large cross-sections, long a goal of metallurgists, has been fabricated by scientists at Oak Ridge National Lab. The amorphous steel produced has a hardness and strength more than twice those of the best ultra-high-strength conventional steel. Some amorphous (glassy) iron-based alloys have been employed in making transformer cores, the electrical devices which transform electricity from one voltage to another, thereby reducing energy losses by two-thirds. But not until now has glassy steel of the kind used in building structures been made. Steel, an alloy of mostly iron atoms with varying amounts of carbon and other elements, is ordinarily a crystal, with an internal structure consisting of neat rows of atoms. If produced quickly from a liquid phase, however, a disordered solid can result. The trick is to find conditions---including the chemical content of the alloy, such as the addition of yttrium in this case---that favor the liquid phase and frustrate the onset of crystallization even as the solidification temperature is approached. The researchers (Zhou Ping Lu, 865-576-7196, email@example.com) have produced centimeter-sized pieces of the amorphous steel, and they feel that structural steel in bulk metallic glass form can be produced economically with traditional drop-casting methods, in which metallic glasses are made by pouring the hot liquid into a cold copper mold. (Lu et al., Physical Review Letters, 18 June 2004.) See mention of related work reported by a University of Virginia group (Ponnambalam et al., J. Mat. Res., 5 May 2004) and by a Caltech group (Xu et al., Physical Review Letters, 18 June 2004).
2003 Aug 13 From PHYSICS NEWS UPDATE Number 649 DETECTING PLASTIC EXPLOSIVES IN AIR at the parts-per-trillion level has been achieved by researchers at Oak Ridge National Laboratory and the University of Tennessee (Thomas Thundat, 865-574-6201, firstname.lastname@example.org), potentially leading to a fast, portable, and ultrasensitive plastic-bomb "sniffer." Plastic explosives such as pentaerythritol tetranitrate (PETN) and hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) pose serious threats because (1) they are easy to mold into desired shapes, (2) they remain highly stable until detonated, and (3) they can inflict significant damage even in small amounts. In fact, the infamous shoe bomber had PETN in his footwear. Most current plastic-bomb sensors are bulky and expensive. In contrast, the new sensor is a microelectromechanical system (MEMS), a tiny mechanical device with microscopic dimensions. Potentially cheap and easy to mass-produce, the bomb-sniffing MEMS device is a microcantilever, a 180-by-25-micron slab of silicon attached to a spring-loaded wire. Similar in structure to a diving board, the microcantilever is coated on one side with gold. On one end of the gold-coated surface is a single layer of 4-MBA (4-mercaptobenzoic acid), to which PETN and RDX both attach. Like hair that curls up on a humid day as water molecules adsorb to it, this specially coated cantilever bends by significant amounts when PETN and RDX molecules attach to it. A laser-microscope system can detect the degree of bending to nanometer precision. Placed in a vacuum-tight glass cell, the cantilever was exposed to a stream of ambient air with tiny traces of plastic explosive. Using a modified atomic force microscope to measure the deflections of the cantilever, the researchers determined that their MEMS device could detect the explosives at a level of 14 parts per trillion, after only 20 seconds of operation.
By another measure, the device becomes sensitive to plastic explosives even if only a few femtograms (1 fg = 10^-15 g) impinge upon it. A future step is to take the device out of the laboratory and develop it into a portable sensor. While much activity has centered on the development of sensors for detecting vapors from all kinds of explosives, this is, to the authors' knowledge, only the third device of its kind that uses MEMS. (Pinnaduwage et al., Applied Physics Letters, 18 August 2003)
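The bending the laser system measures can be put on an order-of-magnitude footing with a Stoney-type estimate relating adsorption-induced surface stress to tip deflection, dz = 3(1-nu)L^2/(E t^2) * dsigma. Only the 180-micron length comes from the article; the thickness, elastic constants, and surface stress below are assumptions for illustration.

```python
# Order-of-magnitude cantilever deflection from adsorption-induced
# differential surface stress (Stoney-type relation). Assumed values are
# marked; they are illustrative, not the actual device parameters.
L = 180e-6      # cantilever length, m (from the article)
t = 1e-6        # thickness, m (assumed)
E = 170e9       # Young's modulus of silicon, Pa (approximate)
nu = 0.28       # Poisson ratio of silicon (approximate)
dsigma = 1e-3   # differential surface stress, N/m (assumed)

dz = 3 * (1 - nu) * L**2 / (E * t**2) * dsigma
print(dz)  # a few tenths of a nanometer: resolvable at nanometer precision
```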
2002 Oct 2 From PHYSICS NEWS UPDATE Number 607 THE SEMICONDUCTOR LASER IS 40 YEARS OLD. DVDs, barcode scanners, high-speed fiber-optic telecommunications--these multi-billion- dollar technological tokens of the early 21st century all depend upon the semiconductor laser, which was invented 40 years ago in much humbler settings. One of the laser's most prominent children, the CD player, is also celebrating its 20th anniversary this autumn in the consumer market. In this design, also known as a "diode" laser, electrons and positively charged holes meet at a semiconductor interface to annihilate each other and create light. Semiconductors can convert electricity into light so efficiently that some physicists scoffed at early reports and complained that this design, very different from the original solid-state and gas lasers, required breaking the second law of thermodynamics in order to work as advertised. But starting in September 1962, scientists reported functioning diode lasers from four independent laboratories: GE (at two different research centers), IBM, and MIT's Lincoln Lab, where the corporate ethos of the day allowed physicists to pursue research on esoteric topics even without the prospect of likely applications. The four groups' results appeared within three months of each other in the journals Physical Review Letters and Applied Physics Letters, the latter journal then in its first year of publication. Technological development of the semiconductor laser continues to this day. Examples include quantum cascade lasers (see Updates 181, 322, 359), multi-wavelength lasers from a single material (Updates 132, 217, 407), surface emitting lasers (Update 229) and blue lasers (Update 50). Within the decade, blue lasers might replace red lasers in DVD players, enabling a six-fold increase in information on the same-sized disk. Hand in hand with diode lasers is the visible LED, also invented in 1962.
The low-power, high-efficiency LEDs have found their way into traffic signals and automobile and bus tail lights. Recently available white-light LEDs have become powerful enough to replace incandescent front headlights in some automobile models, a development that even some of the most optimistic LED designers would have found ludicrous a quarter of a century ago. (Also see AIP/OSA news release at www.insidescience.org/reports/2002/056.html )
2002 Jan 24 From PHYSICS NEWS UPDATE Number 574 THE SCIENCE OF ROUGHNESS is how Benoit Mandelbrot describes the use of fractal mathematics to understand objects in the real world. Euclid and the ancient Greeks may have assumed that lines are smooth and one dimensional, but many typical curves in nature are tortuously indented; however, their roughness can be expressed as a fractal dimensionality, one that is greater than one but less than two. In a trailblazing 1967 paper Mandelbrot showed, for instance, that the coastline length of Britain was anything but an "objective" constant. Instead it grew as one shrank the size of the ruler used to lay out the coast. In fact the number of ruler lengths N needed to trace the coastline (and many other perimeters) grows as N=(1/R)^D, so the measured length N x R grows as (1/R)^(D-1), where R is the ruler size and D is the fractal "dimensionality." This power-law relationship can also typify the "time series" behavior of many phenomena, such as volcanoes, earthquakes, and hurricanes. What this means is that a plot of the likelihood of an event of a certain magnitude (an earthquake's energy, say) versus the magnitude has a power-law shape. Formulating the chances of large-but-rare floods or earthquakes is obviously not merely of academic interest. Understanding the math behind large systems like a forest or a geologic fault has enormous implications for public safety and insurance underwriting. These are some of the issues that arose at a series of talks at the recent American Geophysical Union meeting in San Francisco, where the recently formed nonlinear geophysics committee (http://geomorphology.geo.arizona.edu/AGU/page.html) sponsored a variety of sessions on things that are "rough" in the fractal sense. Here is a sampling of results. 
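The ruler-shrinking experiment is easy to reproduce numerically. The sketch below is a hypothetical illustration (not from the Update): it walks dividers of several openings along a synthetic Koch curve, whose exact fractal dimension is log 4 / log 3 ~ 1.26, and recovers D from the slope of log-length versus log-ruler.

```python
import math

def koch(p, q, depth):
    """Recursively generate the points of a Koch curve from p to q."""
    if depth == 0:
        return [p]
    (x0, y0), (x1, y1) = p, q
    dx, dy = (x1 - x0) / 3, (y1 - y0) / 3
    a = (x0 + dx, y0 + dy)                   # one-third point
    b = (x0 + 2 * dx, y0 + 2 * dy)           # two-thirds point
    s60 = math.sin(math.pi / 3)
    c = (a[0] + 0.5 * dx - s60 * dy,         # apex of the triangular bump
         a[1] + 0.5 * dy + s60 * dx)
    pts = []
    for s, e in ((p, a), (a, c), (c, b), (b, q)):
        pts += koch(s, e, depth - 1)
    return pts

def divider_length(points, ruler):
    """Richardson's method: walk dividers of a fixed opening along the curve."""
    steps, anchor = 0, points[0]
    for pt in points[1:]:
        if math.dist(anchor, pt) >= ruler:
            anchor = pt
            steps += 1
    return steps * ruler

curve = koch((0.0, 0.0), (1.0, 0.0), 7) + [(1.0, 0.0)]
rulers = [0.01, 0.02, 0.05, 0.1]
xs = [math.log(r) for r in rulers]
ys = [math.log(divider_length(curve, r)) for r in rulers]
# least-squares slope of log L vs log R; since L ~ R**(1-D), D = 1 - slope
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
D = 1 - slope
print(round(D, 2))   # typically lands near log(4)/log(3) ~ 1.26
```

Shrinking the ruler from 0.1 to 0.01 makes the measured length grow, and the growth rate, not any single length, is what characterizes the curve's roughness.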
Bruce Malamud (King's College, London, email@example.com) and Donald Turcotte (Cornell University, firstname.lastname@example.org) argued that "fractal" assessments of natural hazards are often more realistic than older statistical models in predicting rare but large disasters. They cited as an example the great Mississippi flood of 1993; a fractal-based calculation for a flood of this magnitude predicts one every 100 years or so, while the more-often-used "log-Pearson" model predicts a period of about 1500 years. In the realm of earthquakes, John Rundle (who heads the Colorado Center for Chaos and Complexity at the University of Colorado, email@example.com, 303-492-1149) described a model in which the customary spring-loaded sliding blocks used to approximate individual faults have a more realistic built-in leeway (or "leaky thresholds," not unlike the "integrate-and-fire" provisions used in the study of neural networks) for simulating the way in which faults jerk past each other. Applying these ideas to seismically active southern California, the researchers define 3000 coarse-grained regions, each 10 km by 10 km (the typical size for a magnitude-6 quake). Then a coarse-grained wave function, analogous to those used in quantum field theory, is worked out for the region, and probabilities for when and where large quakes would occur are determined. Rundle claims to have good success in predicting, retroactively, the likelihood of southern-California earthquakes over the past decade and makes comparable prognostications for the coming decade. (See also Rundle et al., Physical Review Letters, 1 October 2001; and Rundle et al., PNAS, in press.) At the AGU meeting Mandelbrot himself delivered the first Lorenz Lecture, named for chaos pioneer Edward Lorenz. Mandelbrot discussed, among other things, how the process of diffusion-limited aggregation (DLA) is characterized by not one but two fractal dimensions. 
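The gap between the fractal and log-Pearson flood estimates comes down to tail shape: a power law falls off far more slowly than the near-exponential tails of classical models. The toy calculation below uses entirely hypothetical coefficients, chosen only so the two models agree on common, moderate floods; it shows how dramatically the predicted return period of a rare event then diverges.

```python
import math

# Both toy models are calibrated to agree on moderate events:
# a size-1 flood every 2 years and a size-2 flood every 8 years.
def power_law_period(size, b=2.0):
    """Fat-tailed ("fractal") model: exceedance rate = 0.5 * size**(-b) per year."""
    return 1.0 / (0.5 * size ** (-b))

def thin_tail_period(size, k=math.log(4)):
    """Thin-tailed model: exceedance rate = 0.5 * exp(-k * (size - 1)) per year."""
    return 1.0 / (0.5 * math.exp(-k * (size - 1)))

for s in (1, 2, 10):
    print(s, power_law_period(s), thin_tail_period(s))
# For a size-10 event the power law predicts a ~200-year return period;
# the thin-tailed model, calibrated to the very same small floods,
# predicts roughly 500,000 years -- the same qualitative disagreement
# as the fractal vs. log-Pearson estimates for the 1993 Mississippi flood.
```

The numbers are invented; the point is structural: with identical data on everyday events, the assumed tail alone decides whether a catastrophe looks like a once-a-century or a once-an-epoch occurrence.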
DLA plays a key role in many natural phenomena, such as the fingering that occurs when two fluids interpenetrate. In a DLA simulation, one begins with a single seed particle. Then other particles, after undergoing a "random walk," attach themselves to the cluster. This results in a branching, dendrite-like structure in which the placement of new particles is subject to the blockage of existing limbs. You can study the dimensionality of this structure by drawing a circle at some radius out from the original seed particle, counting the number of particles lying on the circle, and counting up the angular gaps between branches at that radius. For many years studies of DLA have been confused by conflicting reports as to the underlying fractal dimensionality. Now Mandelbrot (at both IBM, 914-945-1712, firstname.lastname@example.org, and at Yale, email@example.com), Boaz Kol, and Amnon Aharony (firstname.lastname@example.org, 972-3-640-8558, at the University of Tel Aviv) have shown, by employing a massive simulation involving 1000 clusters, each of 30 million particles (previous efforts had used no more than tens of thousands of particles), that two different dimensionalities are always present, but this only becomes apparent in huge simulations. Comparing a modest (10^5 particles) and a large (10^8 particles) simulation shows that the larger cluster is not merely a scaled-up version of the smaller (see figures at http://www.aip.org/mgr/png). These results (4 February 2002 issue of Physical Review Letters) are the first quantitative evidence for this type of nonlinear self-similarity.
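The seed-and-random-walk procedure described above is simple to sketch. The toy simulation below grows a few hundred on-lattice particles (nothing like the off-lattice 30-million-particle clusters of the actual study) and checks that the resulting dendritic cluster is far more spread out than a compact blob of the same mass would be.

```python
import math
import random

def dla(n_particles=300, seed=1):
    """Grow a small diffusion-limited aggregate on a square lattice."""
    random.seed(seed)
    stuck = {(0, 0)}                       # the seed particle
    r_max = 0                              # radius of the cluster so far
    moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
    while len(stuck) < n_particles:
        # launch a random walker on a circle just outside the cluster
        ang = random.uniform(0, 2 * math.pi)
        x = round((r_max + 5) * math.cos(ang))
        y = round((r_max + 5) * math.sin(ang))
        while True:
            dx, dy = random.choice(moves)
            x, y = x + dx, y + dy
            if x * x + y * y > (r_max + 20) ** 2:
                break                      # wandered too far away: relaunch
            if any((x + mx, y + my) in stuck for mx, my in moves):
                stuck.add((x, y))          # touched the cluster: stick here
                r_max = max(r_max, round(math.hypot(x, y)))
                break
    return stuck

cluster = dla(300)
r = max(math.hypot(x, y) for x, y in cluster)
# A compact disk of 300 lattice sites would have radius ~10; the DLA
# cluster reaches much further out because growing branches screen the
# interior from incoming walkers.
print(len(cluster), round(r, 1))
```

Even at this tiny scale the screening effect is visible: almost all new particles attach near branch tips, which is exactly why the structure's dimensionality is subtle to pin down and why the huge simulations cited above were needed.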
2001 Aug 1 From PHYSICS NEWS UPDATE Number 550 INSECT SENSES SUGGEST NOVEL NEURAL NETWORKS. Animals gather information about their environments when sensory neurons fire minute electrical signals in response to chemicals, light, sounds, and other stimuli. Studying networks of neurons in animals and insects can provide us with insight into the natural world as well as inspiration for manmade networks to aid in computing and other applications. A new model of neural networks, based on recent studies of fish and insect olfactory systems, suggests a way that neurons can be linked together to allow them to identify many more stimuli than is possible with conventional networks. Researchers from the Institute for Nonlinear Science at the University of California, San Diego (M. Rabinovich, email@example.com, 858-534-6753) propose that connections between neurons can cause one neuron to delay the firing of another neuron. As a result, a given stimulus leads to a specific time sequence of neural impulses. In essence, the interconnected neurons include time as another dimension of sensory systems through an encoding method called Winnerless Competition (WLC). Using a locust antennal lobe exposed to fragrances such as cherry and mint for comparison, the researchers found their model could identify roughly (N-1)! (equal to (N-1) x (N-2) x ... x 2) items with a network built of N neurons. That is, a ten-neuron WLC network should be able to identify hundreds of thousands of items, vastly more than a conventional ten-neuron network, and the benefits increase as networks grow. The WLC model helps explain how the senses of animals, insects, and even humans can accurately and robustly distinguish between so many stimuli. In other words, it is a mathematical rationale as to why a rose, by any other name, would smell as sweet---but doesn't smell like an onion. 
Ultimately, the WLC model may lead to high capacity, potent computing networks that resemble an insect antenna or a human nose more than a desktop PC. (M. Rabinovich et al, Physical Review Letters, 6 August 2001)
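The (N-1)! capacity claim is easy to put numbers on. A quick sketch (the comparison figure of roughly one item per neuron for a conventional winner-take-all network is our own simplifying assumption, not a number from the study):

```python
from math import factorial

def wlc_capacity(n_neurons):
    """Items a WLC network of n neurons can encode: each stimulus maps to
    a distinct temporal firing sequence, giving (n-1)! possible codes."""
    return factorial(n_neurons - 1)

for n in (5, 10, 15):
    print(n, wlc_capacity(n))
# n=10 gives 9! = 362,880 items -- hundreds of thousands, versus on the
# order of one item per neuron for a winner-take-all network -- and the
# advantage explodes with size: n=15 already gives about 87 billion.
```

The factorial growth is the whole story: adding five neurons multiplies the code space by a factor of hundreds of thousands, which is why temporal codes scale so much better than static ones.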
2001 July 13 From PHYSICS NEWS UPDATE Number 547 CAUTION: SLIPPAGE MIGHT OCCUR in tightly confined fluids. Fluid mechanics is one of the most mature and successful branches of physics---a study vital for understanding phenomena ranging from ocean currents to the lift generated by butterfly wings. It would likely startle many physicists to learn that one of the venerated discipline's fundamental assumptions is sometimes wrong. But that's exactly what a group of researchers from the Australian National University found in an experimental test of fluid behavior in small spaces. It has long been thought that the fluid molecules adjacent to a surface are stationary, regardless of the motion of the rest of the fluid. This no-slip boundary condition, as the assumption is known, leads to precise and detailed descriptions of nearly all fluid mechanical systems. Recently, however, a number of studies have suggested that the assumption breaks down for flow in confined spaces, such as the insides of capillaries or the channels of microfluidic chips used in cutting-edge bioanalysis. The Australian researchers (V. S. J. Craig, firstname.lastname@example.org, 61-2-6125-3359) set out to explicitly test the age-old conjecture by measuring the motion of a ten-micron silica sphere as they drove it through liquid toward a flat wall. Subsequent analysis showed that a fluid model incorporating boundary slip explained the data better than the classical no-slip model. No-slip boundaries are still close enough to the truth in most circumstances. The new study, however, reveals that descriptions of the blood moving through our capillaries, lubricants in nanomachines, and flow in other tiny systems must include boundary slip conditions to achieve precision at such small scales. (V. S. J. Craig et al., Physical Review Letters, 16 July)
2000 Sep 9 From PHYSICS NEWS UPDATE Number 501 THE FIRST HIGH-PERFORMANCE ALL-POLYMER INTEGRATED CIRCUITS have been developed by scientists at the Philips Research Laboratories in The Netherlands. Although not yet as fast as silicon circuits, polymer components have the virtue of being lightweight, flexible, and potentially easier to fabricate since the temperatures used are much lower, the substrate can be cheap plastic instead of expensive glass, and in some cases the ICs can be printed instead of etched. This will come in handy for low-end, high-volume microelectronics applications such as identification and product tagging and anti-theft stickers. The progress reported now by the Philips researchers (Gerwin Gelinck, email@example.com) features a great improvement in carrier speed and processing time by the use of a transistor geometry with gates on the bottom and vertical interconnects ("vias") patterned photochemically in the dielectric insulator layer. (Gelinck et al., Applied Physics Letters, 4 September 2000; Select Articles.) Philips recently announced that it had employed this technology in creating the first active matrix display in which each pixel is driven by an organic transistor ( www.research.philips.com/pressmedia/pictures/.000904.html)
2000 July 12 From PHYSICS NEWS UPDATE Number 493 A TIME CAPSULE FROM ARCHIMEDES is how historian Reviel Netz describes a tenth century manuscript bearing seven different works composed by Archimedes in the third century BCE. The manuscript, referred to as the "Archimedes Palimpsest" (from the Greek word for rescraped, since the parchment was partially scraped clean and then overwritten in the twelfth century with a number of prayers), rested in obscurity for centuries until 1906 when a scholar recognized the underlying text and geometrical drawings as being by Archimedes. The parchment quickly disappeared again, only to show up at auction in 1998. The purchaser, who remains anonymous, is now making the manuscript available for study. Of the seven different texts, several (such as Archimedes' treatises on buoyancy and on centers of gravity in planes) exist in other manuscripts, but in the case of two of the works, this manuscript represents the sole source. And one of these, a treatise called "Method," sets forth Archimedes' approach to using mathematical principles in solving physics problems and vice versa. For this reason, and because the Palimpsest "is, by a long stretch, the earliest evidence we have for Archimedes," Mr. Netz believes this is the most important manuscript associated with the man whom many regard as the most important scientist of antiquity. (Article by Netz in Physics Today, June 2000.)
2000 May From LeggMason Value Advisor, May 2000, p.1 *** Technology drives NASDAQ At its peak on March 9, the NASDAQ had added an astounding $3.4 trillion of equity value from last October's levels - a gain almost equivalent to the outstanding debt of the U.S. Treasury. We would also add that this incredible gain in equity value occurred, in large part, in many companies that were not public seven years ago. Underpinning the enormous rise in the NASDAQ were the ever-present signs of technology in our daily lives and in the economy.
2000 feb 7 From: Infobeat Newsletter firstname.lastname@example.org *** Computer creates fiction TROY, N.Y. (AP) - Brutus creates stories about lies, self-deception and acts of betrayal. There is no dark muse inspiring Brutus, though. No torturous plumbing of the writer's soul. Brutus is, after all, a computer blueprint. The program based on the blueprint, Brutus.1, can write stories because its creators have condensed the complexities of deceit and double-crosses into mathematical equations it can understand. Characters and facts of the story are fed into Brutus.1, and out come 500-word tales that read very much like human prose. See http://www.infobeat.com/stories/cgi/story.cgi?id=2563967536-87b
2000 jan 13 From: Infobeat Newsletter email@example.com *** Scientists create 'DNA computer' (AP) - Scientists have created a "DNA computer" from strands of synthetic DNA they coaxed into solving relatively complex calculations. The short-lived chemical computer has no immediate practical applications, but it nudges the fledgling technology of DNA computing further out of world of science fiction and into the realm of the possible, the University of Wisconsin-Madison researchers said. "It's kind of a non-automated computer - an abacus of sorts - but it's an approach we're confident can be automated like a conventional computer," said Lloyd Smith, a professor of chemistry. See http://www.infobeat.com/stories/cgi/story.cgi?id=2563223830-4d0
1999 nov 29 From: PHYSICS NEWS UPDATE #459 THE TOP PHYSICISTS IN HISTORY are, according to a poll of scientists conducted by Physics World magazine, 1. Albert Einstein, 2. Isaac Newton, 3. James Clerk Maxwell, 4. Niels Bohr, 5. Werner Heisenberg, 6. Galileo Galilei, 7. Richard Feynman, 8. Paul Dirac, 9. Erwin Schrodinger, and 10. Ernest Rutherford. Other highlights of Physics World's millennium canvass: the most important physics discoveries are Einstein's relativity theories, Newton's mechanics, and quantum mechanics. Most physicists polled (70%) said that if they had to do it all over again, they would choose to study physics once more. Most do not believe that progress in constructing unified field theories spells the end of physics. Ten great unsolved problems in physics: quantum gravity, understanding the nucleus, fusion energy, climate change, turbulence, glassy materials, high-temperature superconductivity, solar magnetism, complexity, and consciousness. (December issue of Physics World, published by the Institute of Physics, the British professional organization of physicists, which is celebrating its 125th anniversary this year.)
1999 nov 11 From: Infobeat Newsletter firstname.lastname@example.org *** NASA describes loss of Mars orbiter WASHINGTON (AP) - Failure to convert English measures to metric values caused the loss of the Mars Climate Orbiter, a spacecraft that smashed into the planet instead of reaching a safe orbit, a NASA investigation concluded Wednesday. The orbiter, a key craft in the space agency's exploration of the red planet, vanished after a rocket firing Sept. 23 that was supposed to put the spacecraft on station around Mars. An investigation board concluded that NASA engineers failed to convert English measures of rocket thrust to newtons, the metric unit of force. One English pound of force equals 4.45 newtons. The resulting discrepancy caused the spacecraft to approach Mars at too low an altitude, and the craft is thought to have smashed into the planet's atmosphere and been destroyed. See http://www.infobeat.com/stories/cgi/story.cgi?id=2562015332-40a
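The mix-up itself is a one-line conversion. A minimal sketch of the error (illustrative only; the actual trajectory software and its data interfaces were of course far more involved):

```python
LBF_TO_N = 4.4482216  # newtons in one English pound of force

def thrust_in_newtons(pounds_force):
    """Convert a thruster impulse from pounds-force to newtons."""
    return pounds_force * LBF_TO_N

# Ground software reported impulse in pound-force units; the navigation
# software consumed the same number as if it were newtons, silently
# under-counting the effect of every small burn by a factor of ~4.45.
reported = 1.0                                  # lbf-s, as produced
consumed = 1.0                                  # N-s, as (mis)interpreted
error_factor = thrust_in_newtons(reported) / consumed
print(round(error_factor, 2))   # 4.45
```

A per-burn factor of 4.45, accumulated over many small trajectory-correction firings, is exactly the kind of systematic bias that lowered the approach altitude fatally.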
1999 apr 29 From: AIP listserver
PHYSICS NEWS UPDATE Number 425 April 27, 1999 by Phillip F. Schewe and Ben Stein APS CENTENNIAL PHOTOGRAPHS. The American Physical Society (APS) Centennial meeting in Atlanta, Georgia, March 20-26, 1999 was attended by 11,400 physicists, making it the largest physics meeting in history. Highlights included the presence of more than 40 Nobel laureates, a talk by Stephen Hawking, the unveiling of the Centennial physics wall chart, an international banquet attended by physicists from over 60 nations, a series of public lectures on everyday physics, and numerous symposia and press conferences on some of the most important physics topics of the day. A gallery of photographs from these events can be viewed at: http://www.aip.org/physnews/graphics/aps100/apsphoto.html
1999 April 5 the BridgePath Newsletter - 4/5/99 (also at http://www.bridgepath.com) The Nation's Best and Worst Jobs According to the 1999 Jobs Rated Almanac, here are the best and worst jobs based on environment, income, employment outlook, physical demands, security, stress and travel.
The Best Jobs Overall: Website manager, Actuary, Computer systems analyst, Software engineer, Mathematician, Computer programmer, Accountant, Industrial designer
The Worst Jobs Overall: Roustabout, Lumberjack, Commercial fisherman, Construction laborer, Cowboy, Professional dancer, Sheet metal worker, Taxi driver
Best Jobs by Industry:
The Arts: Motion picture editor, Architectural drafter, Symphony conductor
Business/Finance: Accountant, Paralegal assistant, Financial planner
Communications: Technical writer, Broadcast technician, Publication editor
Social Sciences: Historian, Sociologist, Political scientist
Technical: Website manager, Computer systems analyst, Software engineer
1999 April Legg Mason Value Advisor In the early 1900's, 85% of our workers were in agriculture; now fewer than 3% are. In 1950 manufacturing claimed 73% of workers; today it is less than 15%. By 2000, an estimated 44% of all workers will be involved in data or information services.
1999 Feb 15 the BridgePath Newsletter According to eMarketer Inc., more than 3.4 trillion e-mail messages crossed the Internet in the U.S. in 1998. In comparison, there were about 107 billion pieces of First Class Mail delivered by the U.S. Post Office in 1998. They went on to say that more than 9.4 billion e-mails were exchanged every day of last year in the U.S.
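The quoted figures can be cross-checked in a couple of lines: 3.4 trillion messages a year works out to about 9.3 billion a day (slightly under the "more than 9.4 billion" also quoted) and roughly 32 times the First Class volume.

```python
annual_emails = 3.4e12        # eMarketer: U.S. e-mail messages, 1998
annual_first_class = 107e9    # USPS First Class pieces, 1998

per_day = annual_emails / 365
ratio = annual_emails / annual_first_class
print(round(per_day / 1e9, 1))   # 9.3 (billion messages per day)
print(round(ratio))              # 32 (times the First Class volume)
```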
1998 pocket pal calendar: My father always told me: Find a job you love and you'll never have to work a day in your life. -- Jim Fox
1997 November 18 From: InfoBeat
*** Doctors crack fiber code in colon cancer Medical experts said Tuesday they have discovered how fiber in the diet helps prevent colon cancer. The finding could have important implications for preventing the second biggest cancer killer in western countries and may lead to more effective treatments. Scientists have discovered that a substance called butyrate, which is produced when fiber is broken down by the large bowel, can shut down a gene called Bcl-2 that causes colon tumors to develop. See http://www.infobeat.com/stories/cgi/story.cgi?id=6019046-316
1997 July 11 NEWSpot East Coast Edition *** Antibiotics not best for ear infections - journal Antibiotics are not the best treatment for middle ear infections and doctors should stop routinely prescribing drugs for them, experts said Friday. Although middle ear infections can be painful, especially to children, antibiotics may only do harm, they said. Larry Culpepper of Boston University and a team of other doctors from the Netherlands, Britain and the U.S. said needlessly dosing children with drugs had helped antibiotic-resistant organisms to evolve. Their studies concluded antibiotics were a waste of time for most children. For the story, see http://www.merc.com/stories/cgi/story.cgi?id=3880170-518
1997 April 7 Computer World COMPUTERWORLD/@COMPUTERWORLD flash mail for the week of 4/07/97 "Ready For the Next Generation?" Look online and you'll increasingly see -- children. According to a recent report from Jupiter Communications, the number of children ages 2 to 17 using online services and the Internet from home reached 4 million -- nearly double 1995 levels. http://www.computerworld.com/emmerce/depts/kids.html
1997 Jan 15 USA Today According to the U.S.Bureau of Labor Statistics (summer 1996), the median annual income for men age 30 and older with a Bachelor's degree in Mathematics is $52,316. That figure is second highest among all majors (engineering is about $700 higher), and about $8500 above the median for all majors.
[source unknown] It's only time... Imagine there is a bank that credits your account each morning with $86,400. It carries over no balance from day to day. Every evening it deletes whatever part of the balance you failed to use during the day. What would you do? Draw out every cent, of course!
Each of us has such a bank. Its name is TIME. Every morning, it credits you with 86,400 seconds. Every night it writes off, as lost, whatever of this you have failed to invest to good purpose. It carries over no balance. It allows no overdraft. Each day it opens a new account for you. Each night it burns the remains of the day. If you fail to use the day's deposits, the loss is yours. There is no going back. There is no drawing against the "tomorrow". You must live in the present on today's deposits. Invest it so as to get from it the utmost in health, happiness and success! The clock is running. Make the most of today.
To realize the value of ONE YEAR, ask a student who failed a grade.
To realize the value of ONE MONTH, ask a mother who gave birth to a premature baby.
To realize the value of ONE WEEK, ask the editor of a weekly newspaper.
To realize the value of ONE HOUR, ask the lovers who are waiting to meet.
To realize the value of ONE MINUTE, ask a person who missed the train.
To realize the value of ONE SECOND, ask a person who just avoided an accident.
To realize the value of ONE MILLISECOND, ask the person who won a medal in the Olympics.
Treasure every moment that you have! And treasure it more because you shared it with someone special, special enough to spend your time with. And remember that time waits for no one.