Category Archives: Physics

First Ever Antimatter Spectroscopy in ALPHA-2

Peter Lobner

ALPHA-2 is a device at CERN, the European particle physics laboratory in Meyrin, Switzerland, used for collecting and analyzing antimatter, or more specifically, antihydrogen.  A common hydrogen atom is composed of an electron and a proton.  In contrast, an antihydrogen atom is made up of a positron bound to an antiproton.

Source: CERN

The ALPHA-2 project homepage is at the following link:

http://alpha.web.cern.ch

On 16 December 2016, the ALPHA-2 team reported the first ever optical spectroscopic observation of the 1S-2S (ground state – 1st excited state) transition of antihydrogen that had been trapped and excited by a laser.

“This is the first time a spectral line has been observed in antimatter. … This first result implies that the 1S-2S transition in hydrogen and antihydrogen are not too different, and the next steps are to measure the transition’s lineshape and increase the precision of the measurement.”

In the ALPHA-2 online news article, “Observation of the 1S-2S Transition in Trapped Antihydrogen Published in Nature,” you will find two short videos explaining how this experiment was conducted:

  • Antihydrogen formation and 1S-2S excitation in ALPHA
  • ALPHA first ever optical spectroscopy of a pure anti atom

These videos describe the process for creating antihydrogen within a magnetic trap (octupole & mirror coils) containing positrons and antiprotons. Selected screenshots from the first video are reproduced below to illustrate the process of creating and exciting antihydrogen and measuring the results.

Alpha2 mirror trap

The potentials along the trap are manipulated to allow the initially separated positron and antiproton populations to combine, interact and form antihydrogen.

Combining positron & antiproton (sequence of three frames)

If the magnetic trap is turned off, the antihydrogen atoms will drift into the inner wall of the device and immediately be annihilated, releasing pions that are detected by the “annihilation detectors” surrounding the magnetic trap. This 3-layer detector provides a means for counting antihydrogen atoms.

Detecting antihydrogen

A tuned laser is used to excite the antihydrogen atoms in the magnetic trap from the 1S (ground) state to the 2S (first excited) state. The interaction of the laser with the antihydrogen atoms is determined by counting the number of free antiprotons annihilating after photo ionization (an excited antihydrogen atom loses its positron) and counting all remaining antihydrogen atoms. Two cases were investigated: (1) laser tuned for resonance of the 1S-2S transition, and (2) laser detuned, not at resonance frequency. The observed differences between these two cases confirmed that, “the on-resonance laser light is interacting with the antihydrogen atoms via their 1S-2S transition.”

Exciting antihydrogen

The ALPHA-2 team reported that the accuracy of the current antihydrogen measurement of the 1S-2S transition is about “a few parts in 10 billion” (10^10). In comparison, this transition in common hydrogen has been measured to an accuracy of “a few parts in a thousand trillion” (10^15).
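
For a rough sense of these numbers, the 1S-2S interval can be estimated from the simple Bohr model, and the quoted fractional accuracies can be converted into absolute frequencies. The short Python sketch below uses rounded physical constants and the 243-nm two-photon scheme commonly used for 1S-2S spectroscopy; it is only an illustration and is not taken from the ALPHA-2 analysis.

    # Bohr-model estimate of the hydrogen (and antihydrogen) 1S-2S transition,
    # plus what the quoted fractional accuracies mean in absolute frequency.
    # Rounded constants; illustrative only.
    RYDBERG_EV = 13.6057       # hydrogen ground-state binding energy (eV)
    EV_TO_J    = 1.602177e-19  # J per eV
    H_PLANCK   = 6.62607e-34   # J*s
    C_LIGHT    = 2.99792e8     # m/s

    delta_e_ev = RYDBERG_EV * (1/1**2 - 1/2**2)   # 1S -> 2S energy gap, ~10.2 eV
    freq_hz    = delta_e_ev * EV_TO_J / H_PLANCK  # ~2.47e15 Hz
    two_photon_nm = 2 * C_LIGHT / freq_hz * 1e9   # ~243 nm (two photons supply the gap)

    print(f"1S-2S energy gap:       {delta_e_ev:.2f} eV")
    print(f"1S-2S frequency:        {freq_hz:.3e} Hz")
    print(f"Two-photon wavelength:  {two_photon_nm:.0f} nm")

    # "A few parts in 10 billion" vs. "a few parts in a thousand trillion":
    for label, frac in [("antihydrogen (1 part in 10^10)", 1e-10),
                        ("ordinary hydrogen (1 part in 10^15)", 1e-15)]:
        print(f"{label}: ~{freq_hz * frac:.0e} Hz absolute uncertainty")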

For more information, see the 19 December 2016 article by Adrian Cho, “Deep probe of antimatter puts Einstein’s special relativity to the test,” which is posted on the Sciencemag.org website at the following link:

http://www.sciencemag.org/news/2016/12/deep-probe-antimatter-puts-einstein-s-special-relativity-test?utm_campaign=news_daily_2016-12-19&et_rid=215579562&et_cid=1062570

Emergent Gravity Theory Passes its First Test

Peter Lobner

In 2010, Prof. Erik Verlinde, University of Amsterdam, Delta Institute for Theoretical Physics, published the paper, “On the Origin of Gravity and the Laws of Newton.” In this paper, the author concluded:

 “The results of this paper suggest gravity arises as an entropic force, once space and time themselves have emerged. If the gravity and space time can indeed be explained as emergent phenomena, this should have important implications for many areas in which gravity plays a central role. It would be especially interesting to investigate the consequences for cosmology. For instance, the way redshifts arise from entropy gradients could lead to many new insights.

The derivation of the Einstein equations presented in this paper is analogous to previous works, in particular [the 1995 paper by T. Jacobson, ‘Thermodynamics of space-time: The Einstein equation of state.’]. Also other authors have proposed that gravity has an entropic or thermodynamic origin, see for instance [the paper by T. Padmanabhan, ‘Thermodynamical Aspects of Gravity: New insights.’]. But we have added an important element that is new. Instead of only focusing on the equations that govern the gravitational field, we uncovered what is the origin of force and inertia in a context in which space is emerging. We identified a cause, a mechanism, for gravity. It is driven by differences in entropy, in whatever way defined, and a consequence of the statistical averaged random dynamics at the microscopic level. The reason why gravity has to keep track of energies as well as entropy differences is now clear. It has to, because this is what causes motion!”

You can download Prof. Verlinde’s 2010 paper at the following link:

https://arxiv.org/pdf/1001.0785.pdf
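
For readers who want the gist without working through the full paper, the 2010 argument can be compressed into a few lines. The equations below paraphrase the paper’s reasoning (entropy change near a holographic screen, the Unruh temperature, and equipartition over the screen’s bits); see the paper itself for the careful version.

    % Entropy change when a mass m moves a distance \Delta x toward a holographic
    % screen, and the Unruh temperature associated with an acceleration a:
    \Delta S = 2\pi k_B \,\frac{mc}{\hbar}\,\Delta x , \qquad
    k_B T = \frac{\hbar a}{2\pi c}
    \;\;\Longrightarrow\;\;
    F = T\,\frac{\Delta S}{\Delta x} = ma \quad \text{(Newton's second law)}

    % For a spherical screen of radius r enclosing mass M, with N = Ac^3/(G\hbar)
    % bits on area A = 4\pi r^2 and equipartition E = \tfrac{1}{2} N k_B T = Mc^2:
    k_B T = \frac{G\hbar M}{2\pi c\,r^2}
    \;\;\Longrightarrow\;\;
    F = T\,\frac{\Delta S}{\Delta x} = \frac{GMm}{r^2} \quad \text{(Newton's law of gravitation)}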

On 8 November 2016, Delta Institute announced that Prof. Verlinde had published a new research paper, “Emergent Gravity and the Dark Universe,” expanding on his previous work. You can read this announcement and see a short video by Prof. Verlinde on the Delta Institute website at the following link:

http://www.d-itp.nl/news/list/list/content/folder/press-releases/2016/11/new-theory-of-gravity-might-explain-dark-matter.html

You can download this new paper at the following link:

https://arxiv.org/abs/1611.02269

I found it helpful to start with Section 8, Discussion and Outlook, which is the closest you will find to a layman’s description of the theory.

On the Phys.org website, a short 8 November 2016 article, “New Theory of Gravity Might Explain Dark Matter,” provides a good synopsis of Verlinde’s emergent gravity theory:

“According to Verlinde, gravity is not a fundamental force of nature, but an emergent phenomenon. In the same way that temperature arises from the movement of microscopic particles, gravity emerges from the changes of fundamental bits of information, stored in the very structure of spacetime……

According to Erik Verlinde, there is no need to add a mysterious dark matter particle to the theory……Verlinde shows how his theory of gravity accurately predicts the velocities by which the stars rotate around the center of the Milky Way, as well as the motion of stars inside other galaxies.

One of the ingredients in Verlinde’s theory is an adaptation of the holographic principle, introduced by his tutor Gerard ‘t Hooft (Nobel Prize 1999, Utrecht University) and Leonard Susskind (Stanford University). According to the holographic principle, all the information in the entire universe can be described on a giant imaginary sphere around it. Verlinde now shows that this idea is not quite correct—part of the information in our universe is contained in space itself.

This extra information is required to describe that other dark component of the universe: Dark energy, which is believed to be responsible for the accelerated expansion of the universe. Investigating the effects of this additional information on ordinary matter, Verlinde comes to a stunning conclusion. Whereas ordinary gravity can be encoded using the information on the imaginary sphere around the universe, as he showed in his 2010 work, the result of the additional information in the bulk of space is a force that nicely matches that attributed to dark matter.”

Read the full Phys.org article at the following link:

http://phys.org/news/2016-11-theory-gravity-dark.html#jCp
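
As a concrete illustration of the rotation-velocity claim quoted above, the toy Python calculation below compares a baryons-only Newtonian rotation curve with one that adds the extra “apparent dark matter” acceleration often quoted from Verlinde’s paper for a point-like mass, g_D² ≈ (cH₀/6)·g_B. The exact formula and the Milky Way baryonic mass used here are simplifying assumptions for illustration, not the paper’s full calculation.

    # Toy rotation-curve comparison: Newtonian gravity from baryons alone vs.
    # baryons plus the point-mass emergent-gravity term g_D^2 ~ (c*H0/6)*g_B
    # (an approximate relation; the baryonic mass below is also an assumption).
    import math

    G    = 6.674e-11           # m^3 kg^-1 s^-2
    C    = 2.998e8             # m/s
    H0   = 67.4e3 / 3.086e22   # Hubble constant in s^-1 (67.4 km/s/Mpc)
    MSUN = 1.989e30            # kg
    KPC  = 3.086e19            # m

    M_baryon = 6e10 * MSUN     # assumed Milky Way baryonic mass (illustrative)

    for r_kpc in (5, 10, 20, 40):
        r   = r_kpc * KPC
        g_b = G * M_baryon / r**2              # Newtonian acceleration (baryons only)
        g_d = math.sqrt(C * H0 / 6 * g_b)      # emergent-gravity extra acceleration
        v_newton   = math.sqrt(g_b * r) / 1e3  # circular velocity, km/s
        v_emergent = math.sqrt((g_b + g_d) * r) / 1e3
        print(f"r = {r_kpc:2d} kpc: baryons only {v_newton:5.1f} km/s, "
              f"with emergent term {v_emergent:5.1f} km/s")

With the assumed numbers, the baryons-only curve keeps falling with radius, while the curve including the extra term levels off near 200 km/s, which is the qualitative flattening observed in real spiral galaxies.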

On 12 December 2016, a team from Leiden Observatory in The Netherlands reported favorable results of the first test of the emergent gravity theory. Their paper, “First Test of Verlinde’s Theory of Emergent Gravity Using Weak Gravitational Lensing Measurements,” was published in the Monthly Notices of the Royal Astronomical Society. The complete paper is available at the following link:

http://mnras.oxfordjournals.org/content/early/2016/12/09/mnras.stw3192

An example of a gravitational lens is shown in the following diagram.

Gravitational lensing by a galaxy cluster. Source: NASA, ESA & L. Calça

As seen from the Earth, the light from the galaxy at the left is bent by the gravitational forces of the galactic cluster in the center, much like light passing through an optical lens.
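
The amount of bending can be estimated from general relativity’s light-deflection formula for a ray passing a mass M with impact parameter b. As a familiar benchmark (not specific to the cluster shown above), light grazing the Sun is deflected by about 1.75 arcseconds:

    \alpha \simeq \frac{4GM}{c^{2}b}, \qquad
    \alpha_{\odot} = \frac{4\,(6.67\times10^{-11})(1.99\times10^{30})}
                          {(3.00\times10^{8})^{2}\,(6.96\times10^{8})}
    \approx 8.5\times10^{-6}\ \text{rad} \approx 1.75''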

The Leiden Observatory authors reported:

“We find that the prediction from EG, despite requiring no free parameters, is in good agreement with the observed galaxy-galaxy lensing profiles in four different stellar mass bins. Although this performance is remarkable, this study is only a first step. Further advancements on both the theoretical framework and observational tests of EG are needed before it can be considered a fully developed and solidly tested theory.”

These are exciting times! As noted in the Phys.org article, “We might be standing on the brink of a new scientific revolution that will radically change our views on the very nature of space, time and gravity.”

New Testable Theory on the Flow of Time and the Meaning of Now

Peter Lobner

Richard A. Muller, a professor of physics at the University of California, Berkeley, and Faculty Senior Scientist at Lawrence Berkeley Laboratory, is the author of an intriguing new book entitled, “NOW, the Physics of Time.”

NOW cover page  Source: W. W. Norton & Company

In Now, Muller addresses weaknesses in past theories about the flow of time and the meaning of “now.” He also presents his own revolutionary theory, one that makes testable predictions. He begins by describing the physics building blocks of his theory: relativity, entropy, entanglement, antimatter, and the Big Bang. Muller points out that the standard Big Bang theory explains the ongoing expansion of the universe as the continuous creation of new space. He argues that time is also expanding and that the leading edge of the new time is what we experience as “now.”

You’ll find a better explanation in the UC Berkeley short video, “Why does time advance?: Richard Muller’s new theory,” at the following link:

https://www.youtube.com/watch?v=FYxUzm7gQkY

In the video, Muller explains that his theory would have resulted in a measurable 1 millisecond delay in the “chirp” seen in the first gravitational wave signals, which were detected on 14 September 2015 by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and announced on 11 February 2016. LIGO’s current sensitivity precluded seeing the predicted small delay. If LIGO and other land-based gravity wave detector sensitivities are not adequate, a potentially more sensitive space-based gravity wave detection array, eLISA, should be in place in the 2020s to test Muller’s theory.

It’ll be interesting to see if LIGO, any of the other land-based gravity wave detectors, or eLISA will have the needed sensitivity to prove or disprove Muller’s theory.

For more information related to gravity wave detection, see my following posts:

  • 16 December 2015 post, “100th Anniversary of Einstein’s General Theory of Relativity and the Advent of a New Generation of Gravity Wave Detectors”
  • 11 February 2016 post, “NSF and LIGO Team Announce First Detection of Gravitational Waves”
  • 27 September 2016 post, “Space-based Gravity Wave Detection System to be Deployed by ESA”

The Universe is Isotropic

Peter Lobner, Updated 12 January 2021

The concepts of up and down appear to be relatively local conventions that can be applied at the levels of subatomic particles, planets and galaxies. However, the universe as a whole apparently does not have a preferred direction that would allow the concepts of up and down to be applied at such a grand scale.

A 7 September 2016 article entitled, “It’s official: You’re lost in a directionless universe,” by Adrian Cho, provides an overview of research that demonstrates, with a high level of confidence, that the universe is isotropic. The research was based on data from the Planck space observatory. In this article, Cho notes:

“Now, one team of cosmologists has used the oldest radiation there is, the afterglow of the big bang, or the cosmic microwave background (CMB), to show that the universe is “isotropic,” or the same no matter which way you look: There is no spin axis or any other special direction in space. In fact, they estimate that there is only a one-in-121,000 chance of a preferred direction—the best evidence yet for an isotropic universe. That finding should provide some comfort for cosmologists, whose standard model of the evolution of the universe rests on an assumption of such uniformity.”

The European Space Agency (ESA) developed the Planck space observatory to map the CMB in microwave and infrared frequencies at unprecedented levels of detail. Planck was launched on 14 May 2009 and was placed in a Lissajous orbit around the L2 Lagrange point, which is 1,500,000 km (930,000 miles) directly behind the Earth. L2 is a quiet place, with the Earth shielding Planck from noise from the Sun. The approximate geometry of the Earth-Moon-Sun system and a representative spacecraft trajectory (not Planck, specifically) to the L2 Lagrange point are shown in the following figure.

Lissajous orbit around L2. Source: Abestrobi / Wikimedia Commons
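
As a quick check of the 1,500,000 km figure quoted above, the distance from the Earth to L2 can be approximated with the Hill-sphere formula for the Sun-Earth system. The sketch below uses standard rounded constants and ignores the Moon and the orbit’s eccentricity, so it is only an order-of-magnitude estimate.

    # Approximate distance from Earth to the Sun-Earth L2 point:
    # r_L2 ~ a * (M_earth / (3 * M_sun))**(1/3), measured along the Sun-Earth line.
    AU      = 1.496e11   # Sun-Earth distance, m
    M_EARTH = 5.972e24   # kg
    M_SUN   = 1.989e30   # kg

    r_l2 = AU * (M_EARTH / (3 * M_SUN)) ** (1 / 3)
    print(f"L2 distance from Earth: ~{r_l2 / 1e9:.2f} million km")   # ~1.5 million km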

The Planck space observatory entered service on 3 July 2009. At the end of its service life, Planck departed its valuable position at L2, was placed in a heliocentric orbit, and was deactivated on 23 October 2013. During more than four years in service, Planck performed its CMB mapping mission with much greater resolution than NASA’s Wilkinson Microwave Anisotropy Probe (WMAP), which operated from 2001 to 2010.  Planck was designed to map the CMB with an angular resolution of 5-10 arc minutes and a sensitivity of a millionth of a degree.

One key result of the Planck mission is the all-sky survey shown below.

Planck all-sky survey 2013 CMB temperature map, showing anisotropies in the temperature of the CMB at the full resolution obtained by Planck. Source: ESA / Planck Collaboration

ESA characterizes this map as follows:

“The CMB is a snapshot of the oldest light in our Universe, imprinted on the sky when the Universe was just 380,000 years old. It shows tiny temperature fluctuations that correspond to regions of slightly different densities, representing the seeds of all future structure: the stars and galaxies of today.”

The researchers who reported that the universe was isotropic noted that an anisotropic universe would leave telltale patterns in the CMB. However, these researchers found that the actual CMB shows only random noise and no signs of such patterns.

In 2015, the ESA / Planck Collaboration used CMB data to estimate the age of the universe at 13.813 ± 0.038 billion years.  This was slightly higher than, but within the uncertainty band of, an estimate derived in 2012 from nine years of data from NASA’s WMAP spacecraft.

In July 2018, the ESA / Planck Collaboration published the “Planck Legacy” release of their results, which included the following two additional CMB sky survey maps.

Planck all-sky survey 2013 CMB smoothed temperature map (top) and smoothed temperature + polarization map (bottom). Source: ESA / Planck Collaboration

The ESA/Planck Collaboration described these two new maps as follows:

  • (In the top map), “the temperature anisotropies have been filtered to show mostly the signal detected on scales around 5º on the sky. The lower view shows the filtered temperature anisotropies with an added indication of the direction of the polarized fraction of the CMB.”
  • “A small fraction of the CMB is polarized – it vibrates in a preferred direction. This is a result of the last encounter of this light with electrons, just before starting its cosmic journey. For this reason, the polarization of the CMB retains information about the distribution of matter in the early Universe, and its pattern on the sky follows that of the tiny fluctuations observed in the temperature of the CMB” (in the 2013 map, above).

Using Planck CMB data, the ESA / Planck Collaboration team has estimated the value of the Hubble constant. Their latest estimate, in 2018, was 67.4 km / second / megaparsec with an uncertainty of less than 1%.  This is lower than the value derived from astrophysical measurements: 73.5 km / second / megaparsec with an uncertainty of 2%.
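
To put the two values side by side, the snippet below converts each Hubble-constant estimate into a naive “Hubble time,” 1/H₀. This is just an expansion timescale, not the model-dependent age estimate quoted earlier, but it shows the size of the roughly 9% tension between the two measurements.

    # Convert the two quoted Hubble-constant values into a simple Hubble time 1/H0.
    KM_PER_MPC  = 3.086e19   # km in one megaparsec
    SEC_PER_GYR = 3.156e16   # seconds in one billion years

    for label, h0 in [("Planck CMB (2018)", 67.4),
                      ("astrophysical measurements", 73.5)]:
        hubble_time_gyr = KM_PER_MPC / h0 / SEC_PER_GYR
        print(f"{label}: H0 = {h0} km/s/Mpc -> 1/H0 = {hubble_time_gyr:.1f} billion years")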

You’ll find more details on the Planck mission and scientific results on the ESA’s website at the following link: http://www.esa.int/Our_Activities/Space_Science/Planck

Space-based Gravity Wave Detection System to be Deployed by ESA

Peter Lobner

The first detection of gravitational waves occurred on 14 September 2015 at the land-based Laser Interferometer Gravitational-Wave Observatory (LIGO). Using optical folding techniques, LIGO has an effective baseline of 1,600 km (994 miles). See my 16 December 2015 and 11 February 2016 posts for more information on LIGO and other land-based gravitational wave detectors.

Significantly longer baselines, and theoretically greater sensitivity, can be achieved with gravitational wave detectors in space. Generically, such a space-based detector has become known as a Laser Interferometer Space Antenna (LISA). Three projects associated with space-based gravitational wave detection are:

  • LISA (the project name predated the current generic usage of LISA)
  • LISA Pathfinder (a space-based gravitational wave detection technology demonstrator, not a detector)
  • Evolved LISA (eLISA)

These projects are discussed below.

The science being addressed by space-based gravitational wave detectors is discussed in the eLISA white paper, “The Gravitational Universe.” You can download this whitepaper, a 1-page summary, and related gravitational wave science material at the following link:

https://www.elisascience.org/whitepaper/

LISA

The LISA project originally was planned as a joint European Space Agency (ESA) and National Aeronautics & Space Administration (NASA) project to detect gravitational waves using a very long baseline, triangular interferometric array of three spacecraft.

Each spacecraft was to contain a gravitational wave detector sensitive at frequencies between 0.03 mHz and 0.1 Hz and have the capability to precisely measure its distances to the other two spacecraft forming the array. The equilateral triangular array, which was to measure about 5 million km (3.1 million miles) on a side, was expected to be capable of measuring gravitational-wave induced strains in space-time by precisely measuring changes of the separation distance between pairs of test masses in the three spacecraft. In 2011, NASA dropped out of this project because of funding constraints.
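
For orientation, the strain h of a passing gravitational wave is roughly the fractional change in arm length, h ≈ ΔL/L, which is why a multi-million-km baseline relaxes the absolute displacement that must be resolved. The numbers below are purely illustrative (the displacement resolutions are assumptions, not LISA requirements):

    # Strain corresponding to a given displacement resolution on a 5-million-km arm,
    # plus the one-way light travel time the measurement scheme must contend with.
    ARM_LENGTH = 5.0e9    # m (5 million km)
    C_LIGHT    = 2.998e8  # m/s

    for dl in (1e-12, 1e-15):   # picometer and femtometer displacement resolution
        print(f"dL = {dl:.0e} m  ->  strain h ~ {dl / ARM_LENGTH:.0e}")

    print(f"One-way light travel time along an arm: {ARM_LENGTH / C_LIGHT:.1f} s")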

LISA Pathfinder

The LISA Pathfinder (LPF) is a single spacecraft intended to validate key technologies for space-based gravitational wave detection. It does not have the capability to detect gravity waves.

This mission was launched by ESA on 3 December 2015 and the spacecraft took station in a Lissajous orbit around the Sun-Earth L1 Lagrange point on 22 January 2016. L1 is directly between the Earth and the Sun, about 1.5 million km (932,000 miles) from Earth. An important characteristic of a Lissajous orbit is that the spacecraft will follow the L1 point without requiring any propulsion. This is important for minimizing external forces on the LISA Pathfinder experiment package. The approximate geometry of the Earth-Moon-Sun system and a representative spacecraft (not LPF, specifically) stationed at the L1 Lagrange point are shown in the following figure.

L1 Lagrange point. Source: Wikimedia Commons

The LISA Pathfinder’s mission is to validate the technologies used to shield two free-floating metal cubes (test masses), which form the core of the experiment package, from all internal and external forces that could contribute to noise in the gravitational wave measurement instruments. The on-board measurement instruments (inertial sensors and a laser interferometer) are designed to measure the relative position and orientation of the test masses, which are 38 cm (15 inches) apart, to an accuracy of less than 0.01 nanometers (10^-11 meters). This measurement accuracy is believed to be adequate for detecting gravitational waves using this technology on ESA’s follow-on mission, eLISA.

The first diagram below is an artist’s impression of the LISA Pathfinder technology package, showing the inertial sensors housing the test masses (gold) and the laser interferometer (middle platform). The second diagram provides a clearer view of the test masses and the laser interferometer.

LPF technology package 1

Source: ESA/ATG medialab, August 2015
LPF technology package 2. Source: ESA LISA Pathfinder briefing, 7 June 2016

You’ll find more general information in an ESA LISA Pathfinder overview, which you can download from NASA’s LISA website at the following link:

http://lisa.nasa.gov/Documentation/LISA-LPF-RP-0001_v1.1.pdf

LISA Pathfinder was commissioned and ready for scientific work on 1 March 2016. In a 7 June 2016 briefing, ESA reported very favorable performance results from LISA Pathfinder:

  • LPF successfully validated the technologies used in the local (in-spacecraft) instrument package (test masses, inertial sensors and interferometer).
  • LPF interferometer noise was a factor of 100 less than on the ground.
  • The measurement instruments can see femtometer motion of the test masses (LPF goal was picometer).
  • Performance is essentially at the level needed for the follow-on eLISA mission

You can watch this full (1+ hour) ESA briefing at the following link:

http://www.esa.int/Our_Activities/Space_Science/Watch_LISA_Pathfinder_briefing

eLISA

Evolved LISA, or eLISA, is ESA’s modern incarnation of the original LISA program described previously. ESA’s eLISA website home page is at the following link:

https://www.elisascience.org

As shown in the following diagrams, three eLISA spacecraft will form a very long baseline interferometric array that is expected to directly observe gravitational waves from sources anywhere in the universe. In essence, this array will be a low frequency microphone listening for the sounds of gravitational waves as they pass through the array.

eLISA constellation (two views). Source: ESA

As discussed previously, gravity wave detection depends on the ability to very precisely measure the distance between test masses that are isolated from their environment but subject to the influence of passing gravitational waves. Measuring the relative motion of a pair of test masses is considerably more complex for eLISA than it was for LPF. The relative motion measurements needed for a single leg of the eLISA triangular array are:

  • Test mass 1 to Spacecraft 1
  • Spacecraft 1 to Spacecraft 2
  • Spacecraft 2 to Test Mass 2

This needs to be done for each of the three legs of the array.

LPF validated the technology for making the test mass to spacecraft measurement. Significant development work remains to be done on the spacecraft-to-spacecraft laser system that must take precise measurements at very long distances (5 million km, 3.1 million miles) of the relative motion between each pair of spacecraft.

In the 7 June 2016 LISA Pathfinder briefing, LPF and ESA officials indicated that an eLISA launch is expected in the 2029 – 2032 time frame. When it reaches its assigned position in a trailing heliocentric orbit, eLISA will be a remarkable collaborative technical achievement and a new window on our universe.

Polymagnets® will Revolutionize the Ways in Which Magnets are Used

Peter Lobner

The U.S. firm Correlated Magnetics Research (CMR), Huntsville, AL, invented and is the sole manufacturer of Polymagnets®, which are precision-tailored magnets that enhance existing and new products with specific behaviors that go far beyond the simple attract-and-repel behavior of common magnets. CMR holds over 100 patents on Polymagnet technology. You can visit their website at the following link:

http://www.polymagnet.com

CMR describes Polymagnets® as follows:

“Essentially programmable magnets, Polymagnets are the first fundamental advance in magnets in 180 years, since the introduction of electromagnets. With Polymagnets, new products can have softer ‘feel’ or snappier or crisper closing or opening behavior, and may be given the sensation of a spring or latch”.

On a conventional magnet, there is a North (N) pole on one surface and a South (S) pole on the opposite surface. Magnetic field lines flow around the magnet from pole to pole. On a Polymagnet®, many small, polarized (N or S) magnetic pixels (“maxels”) are printed in a desired pattern on the same surface. The magnetic field lines are completed between the maxels on that surface, resulting in a very compact, strong magnetic field. This basic concept is shown in the following figure.

Polymagnet field comparison

The mechanical 3-D behavior of a Polymagnet® is determined by the pattern and strength of the maxels embedded on the surface of the magnet. These customizable behaviors include spring, latch, shear, align, snap, torque, hold, twist, soften and release. The very compact magnetic field reduces magnetic interference with other equipment, opening new applications for Polymagnets® where a conventional magnet wouldn’t be suitable.

The above figure is a screenshot from the Smarter Every Day 153 video, which you can view at the following link. Thanks to Mike Spaeth for sending me this 10-minute video, which I think you will enjoy.

https://www.youtube.com/watch?v=IANBoybVApQ
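
A toy model helps show why the printed pattern of alternating poles described above produces such a compact field: when neighboring maxels have opposite polarity, their far fields largely cancel, so the net field falls off much faster with distance than that of a conventional, uniformly magnetized face. The Python sketch below treats each maxel as an ideal point dipole; it is a physics illustration only, not CMR’s design method, and the grid size, pitch and moment are made-up values.

    # Compare the on-axis field above a uniformly magnetized face with the field
    # above a checkerboard of alternating "maxels", each modeled as a point dipole.
    import numpy as np

    MU0 = 4e-7 * np.pi   # vacuum permeability, T*m/A

    def bz_point_dipole(m, dx, dy, dz):
        """z-component of B from a point dipole m (A*m^2, along z) at displacement (dx, dy, dz)."""
        r2 = dx**2 + dy**2 + dz**2
        r = np.sqrt(r2)
        return MU0 * m / (4 * np.pi) * (3 * dz**2 / r2 - 1) / r**3

    def bz_array(z, n=8, pitch=1e-3, m=1e-3, alternating=True):
        """Field at height z above the center of an n x n grid of dipoles with spacing `pitch`."""
        coords = (np.arange(n) - (n - 1) / 2) * pitch
        total = 0.0
        for i, x in enumerate(coords):
            for j, y in enumerate(coords):
                sign = (-1) ** (i + j) if alternating else 1
                total += bz_point_dipole(sign * m, -x, -y, z)
        return total

    for z_mm in (1, 2, 4, 8):
        z = z_mm * 1e-3
        uniform      = bz_array(z, alternating=False)  # conventional: all poles alike
        checkerboard = bz_array(z, alternating=True)   # "maxel" pattern
        print(f"z = {z_mm} mm: uniform {uniform:.2e} T, alternating {checkerboard:.2e} T")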

More information on Polymagnet® technology, including short videos that demonstrate different mechanical behaviors, and a series of downloadable white papers, is available at the following link.

http://www.polymagnet.com/polymagnets/

This is remarkable new technology in search of novel applications. Many practical applications are identified on the Polymagnet® website. What are your ideas?

If you really want to look into this technology, you can buy a Polymagnet® demonstration kit at the following links:

https://www.magnetics.com/product.asp?ProductID=164

or,

http://www.mechanismsmarket.com/kits/

Polymagnet demo kit   Source: Mechanisms Market

The Invisible Man may be Blind!

Peter Lobner

Metamaterials are a class of engineered materials with properties that don’t occur naturally.

The first working demonstration of an “invisibility cloak” was achieved in 2006 at the Duke University Pratt School of Engineering using the complex metamaterial-based cloak shown below.

Duke 2006 metamaterial cloak. Source: screenshot from YouTube link below.

The cloak deflected an incoming microwave beam around an object and reconstituted the wave fronts on the downstream side of the cloak with little distortion. To a downstream observer, the object inside the cloak would be hidden.

Effect of Duke metamaterial cloak. Source: screenshot from YouTube link below.

You can view a video of this Duke invisibility cloak at the following link:

https://www.youtube.com/watch?v=Ja_fuZyHDuk

In a paper published in the 18 September 2015 issue of Science, researchers at UC Berkeley reported creating an ultra-thin, metamaterial-based optical cloak that was successful in concealing a small-scale, three-dimensional object. The abstract of this paper, “An ultrathin invisibility skin cloak for visible light”, by Ni et al., is reproduced below.

“Metamaterial-based optical cloaks have thus far used volumetric distribution of the material properties to gradually bend light and thereby obscure the cloaked region. Hence, they are bulky and hard to scale up and, more critically, typical carpet cloaks introduce unnecessary phase shifts in the reflected light, making the cloaks detectable. Here, we demonstrate experimentally an ultrathin invisibility skin cloak wrapped over an object. This skin cloak conceals a three-dimensional arbitrarily shaped object by complete restoration of the phase of the reflected light at 730-nanometer wavelength. The skin cloak comprises a metasurface with distributed phase shifts rerouting light and rendering the object invisible. In contrast to bulky cloaks with volumetric index variation, our device is only 80 nanometer (about one-ninth of the wavelength) thick and potentially scalable for hiding macroscopic objects.”

If you have a subscription to Science, you can read the full paper at the following link:

http://science.sciencemag.org/content/349/6254/1310
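
A rough way to see what the metasurface in this paper has to do: at normal incidence, reflection off a bump of height h picks up a round-trip phase of about 4πh/λ relative to the flat surface, so the cloak must supply the opposite local phase to make the reflected wavefront look flat. The sketch below uses the paper’s 730-nm wavelength but a made-up set of bump heights, and it ignores oblique incidence and polarization, so it is only a conceptual illustration.

    # Phase a reflective "skin cloak" metasurface would need to add (normal incidence)
    # to cancel the extra round-trip path over a bump of height h: phi = -4*pi*h/lambda.
    import math

    WAVELENGTH = 730e-9   # m, from the abstract above

    def required_phase(height_m):
        """Compensating metasurface phase, in radians, modulo 2*pi."""
        return (-4 * math.pi * height_m / WAVELENGTH) % (2 * math.pi)

    for h_nm in (0, 100, 250, 500, 1000):   # made-up bump heights
        phi = required_phase(h_nm * 1e-9)
        print(f"bump height {h_nm:4d} nm -> required phase {math.degrees(phi):6.1f} degrees")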

Eric Grundhauser writes on the Atlas Obscura website about an interesting quandary for users of an optical invisibility cloak.

“Since your vision is based on the light rays that enter your eyes, if all of these rays were diverted around someone under an invisibility cloak, the effect would be like being covered in a thick blanket. Total darkness.”

So, the Invisible Man is likely to be less of a threat than he appeared in the movies. You should be able to locate him as he stumbles around a room, bumping into everything he can’t see at visible light frequencies. However, he may be able to navigate and sense his adversary at other electromagnetic and/or audio frequencies that are less affected by his particular invisibility cloak.

You can read Eric Grundhauser’s complete article, “The Problem With Invisibility is Blindness,” at the following link:

http://www.atlasobscura.com/articles/the-problem-with-invisibility-is-the-blindness?utm_source=howtogeek&utm_medium=email&utm_campaign=newsletter

Recognizing this inconvenient aspect of an invisibility cloak, researchers from Yunnan University, China, have been investigating the concept of a “reciprocal cloak,” which they describe as, “an intriguing metamaterial device, in which a hidden antenna or a sensor can receive electromagnetic radiation from the outside but its presence will not be detected.” One approach is called an “open cloak,” which includes a means to, “open a window on the surface of a cloak, so that exchanging information and matter with the outside can be achieved.”

You can read the complete 2011 paper, “Electromagnetic Reciprocal Cloak with Only Axial Material Parameter Spatially Variant,” by Yang et al., at the following link:

http://www.hindawi.com/journals/ijap/2012/153086/

An all-aspect, broadband (wide range of operational frequencies) invisibility cloak is likely to remain in the realm of fantasy and science fiction. A 10 March 2016 article entitled, “Invisibility cloaks can never hide objects from all observers,” by Lisa Zyga, explains:

“….limitations imposed by special relativity mean that the best invisibility cloaks would only be able to render objects partially transparent because they would suffer from obvious visible distortions due to motion. The result would be less Harry Potter and more like the translucent creatures in the 1987 movie Predator.”

You can read the complete article at the following link:

http://phys.org/news/2016-03-invisibility-cloaks.html

Further complications are encountered when applying an invisibility cloak to a very high-speed vessel. A 28 January 2016 article, also by Lisa Zyga, explains:

“When the cloak is moving at high speeds with respect to an observer, relativistic effects shift the frequency of the light arriving at the cloak so that the light is no longer at the operational frequency. In addition, the light emerging from the cloak undergoes a change in direction that produces a further frequency shift, causing further image distortions for a stationary observer watching the cloak zoom by.”

You can read the complete article, “Fast-moving invisibility cloaks become visible,” at the following link:

http://phys.org/news/2016-01-fast-moving-invisibility-cloaks-visible.html

So, there you have it! The Invisible Man may be blind, the Predator’s cloak seems credible even when he’s moving, and a really fast-moving cloaked Klingon battlecruiser is vulnerable to detection.

Simulating Extreme Spacetimes

Peter Lobner

Thanks to Dave Groce for sending me the following link to the Caltech-Cornell Numerical Relativity collaboration, Simulating eXtreme Spacetimes (SXS):

http://www.black-holes.org

Caltech SXS. Source: SXS

From the actual website (not the image above), click on the yellow “Admit One” ticket and you’re on your way.

Under the “Movies” tab, you’ll find many video simulations that help visualize a range of interactions between two black holes and between a black hole and a neutron star. Following is a direct link:

http://www.black-holes.org/explore/movies

A movie visualizing GW150914, the first ever gravitational wave detection on 14 September 2015, is at the following SXS link:

https://www.black-holes.org/gw150914

At the above link, you also can listen to the sound of the GW150914 “in-spiral” event (two black holes spiraling in on each other).  You can read more about the detection of GW150914 in my 11 February 2016 post.

On the “Sounds” tab on the SXS website, you’ll find that different types of major cosmic events are expected to emit gravitational waves with waveforms that will help characterize the original event. You can listen to the expected sounds from a variety of extreme cosmic events at the following SXS link:

http://www.black-holes.org/explore/sounds

Have fun exploring SXS.

NSF and LIGO Team Announce First Detection of Gravitational Waves

Peter Lobner

Today, 11 February 2016, the National Science Foundation (NSF) and the Laser Interferometer Gravitational-Wave Observatory (LIGO) project team announced that the first detection of gravitational waves occurred on 14 September 2015. You can view a video of this announcement at the following link:

https://www.youtube.com/watch?v=_582rU6neLc

The first paper on this milestone event, “Observation of Gravitational Waves From a Binary Black Hole Merger,” was published in Physical Review Letters and is available at the following link:

http://journals.aps.org/prl/pdf/10.1103/PhysRevLett.116.061102

The recorded signals from the two LIGO sites, Livingston, LA and Hanford, WA, are shown below, with the Hanford data time shifted to account for the slightly later arrival time of the gravitational wave signal at that detector location. The magnitude of the gravitational wave signal was characterized as being just below the detection threshold of LIGO before installation of the new advanced detectors, which improve LIGO sensitivity by a factor of 3 to 10.

LIGO signals

Source: NSF/LIGO

This milestone occurred during the engineering testing phase of the advanced LIGO detectors, before the start of their first official “observing run” on 18 September 2015.

Analysis and simulations conducted on the data indicate that the observed gravitational wave signals were generated when two orbiting black holes coalesced into a single black hole of smaller total mass and ejected about three solar masses of energy as gravitational waves.
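
The “about three solar masses of energy” figure is easier to appreciate in joules. A quick conversion via E = mc², with the Sun’s light output as a yardstick:

    # Convert ~3 solar masses into energy and compare with the Sun's luminosity.
    M_SUN = 1.989e30   # kg
    C     = 2.998e8    # m/s
    L_SUN = 3.828e26   # W, solar luminosity
    SECONDS_PER_YEAR = 3.156e7

    e_gw = 3 * M_SUN * C**2
    print(f"Energy radiated as gravitational waves: ~{e_gw:.1e} J")
    print(f"Equivalent to ~{e_gw / (L_SUN * SECONDS_PER_YEAR):.1e} years of the Sun's light output")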

In the Physical Review Letters paper, the authors provide the following diagram, which gives a physical interpretation of the observed gravitational wave signals.

Binary black holes merge

Note the very short timescale of this extraordinarily dynamic process. The recorded gravitational wave signals yielded an audible “chirp” when the two black holes merged.

With only two LIGO detectors, the source of the observed gravitational waves could not be localized, but the LIGO team reported that the source was in the southern sky, most likely in the vicinity of the Magellanic Clouds.

Localization of black hole merger Source: NSF/LIGO

The ability to localize gravitational wave signals will improve when additional gravitational wave detectors become operational later in this decade.

For more information on the current status of LIGO and other new-generation gravitational wave detectors, see my 16 December 2015 post: “100th Anniversary of Einstein’s Theory of General Relativity and the Advent of a New Generation of Gravity Wave Detectors.”

Update: 3 October 2017

 Congratulations to Rainer Weiss, Barry C. Barish, and Kip S. Thorne, all members of the LIGO / VIRGO Collaboration, for their award of the 2017 Nobel Prize in Physics for the first direct observation of gravitational waves. You can read the press release from the Royal Swedish Academy of Sciences here:

https://www.nobelprize.org/nobel_prizes/physics/laureates/2017/press.html

You also can read the scientific background on this award on the Royal Swedish Academy of Sciences website at the following link:

https://www.nobelprize.org/nobel_prizes/physics/laureates/2017/advanced-physicsprize2017.pdf

Anyone Can Quantum

Peter Lobner

Nobel Laureate Dr. Richard Feynman is famously quoted as saying, “I think I can safely say that nobody understands quantum mechanics.” University of Southern California (USC) graduate student Chris Cantwell, the inventor of Quantum Chess, is seeking to change that view by demonstrating that, in the right framework, anyone can grapple with some of the basic concepts of quantum mechanics. In particular, Chris Cantwell views Quantum Chess as a means of “demystifying the quantum world through play.” In Quantum Chess, all of the conventional chess moves are allowed as well as certain quantum moves for all pieces except pawns.

Quantum Chess isn’t a game you can purchase right now, but the short video, “Anyone Can Quantum,” provides an entertaining demonstration of what quantum gameplay will be like in the near future. This video was created by Caltech’s Institute for Quantum Information and Matter (IQIM) (http://iqim.caltech.edu) in association with Trouper Productions (http://trouper.net). In the video, actor Paul Rudd (Ant Man) challenges Stephen Hawking to a game of Quantum Chess for the right to give the keynote address at Caltech’s 26 – 27 January 2016 special event, “One Entangled Evening: A Celebration of Richard Feynman’s Quantum Legacy.”

You can view the almost 12 minute video at the following link.

https://www.youtube.com/watch?v=Hi0BzqV_b44

Here are a few screenshots from the video.

Quantum chess match announcement

Quantum chess players

Quantum superposition is demonstrated by “Schrodinger’s king”, which could be in two places at one time.

Without superposition                                                      With superposition

Quantum entanglement of the king & bishop enabled a bishop to move through a king.

Without entanglement                                                  With entanglement

Resolution of the game required a quantum measurement to determine the winner.
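
For readers who want to poke at the underlying ideas directly, the short NumPy sketch below sets up a “king” in an equal superposition of two squares, an entangled king-bishop pair, and a Born-rule measurement. It is a generic two-state quantum illustration, not the actual Quantum Chess rule set or engine.

    # Minimal illustration of superposition, entanglement and measurement.
    import numpy as np

    rng = np.random.default_rng()

    # |king> = (|square A> + |square B>) / sqrt(2): equal superposition of two squares.
    king = np.array([1.0, 1.0]) / np.sqrt(2)
    probabilities = np.abs(king) ** 2                 # Born rule: [0.5, 0.5]
    outcome = rng.choice(["square A", "square B"], p=probabilities)
    print(f"P(A), P(B) = {probabilities}, king measured on {outcome}")

    # Entangled king-bishop state (|A,A> + |B,B>)/sqrt(2): measuring one piece's
    # square instantly fixes the other's, the effect the video dramatizes.
    bell = np.zeros(4)
    bell[0] = bell[3] = 1 / np.sqrt(2)
    joint_probs = np.abs(bell) ** 2
    print("Joint outcome probabilities (AA, AB, BA, BB):", joint_probs)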

For those of you who can’t wait to play a real game of Quantum Chess, Chris Cantwell has launched a Kickstarter funding program. Find out details at the following link:

https://www.kickstarter.com/projects/507726696/quantum-chess

You can find out more about the 26 – 27 January 2016 Caltech event, “One Entangled Evening: A Celebration of Richard Feynman’s Quantum Legacy,” at the following link:

https://www.caltech.edu/content/one-entangled-evening-celebration-richard-feynmans-quantum-legacy