Friday, February 12, 2016

Nikki Anne Schmutz writes



Our Universe

In the mirror of us 
we find vastness unexplored - 
reflections of stars 
multiplied by smiles as wide as light years. 
We travel on heartbeats, 
between galaxies of moments 
shared instantaneously 
as we delve into one another’s eyes. 
We breathe adoration, 
touch fingers to the birth of a star 
otherwise known as 
the supernova of us.


3 comments:

  1. This comment has been removed by the author.

  2. In 1931 astronomers Walter Baade and Fritz Zwicky coined the word "supernova" to describe the final evolutionary stage of a massive star's life: a titanic explosion marking the star's dramatic and catastrophic destruction. The sudden appearance of a bright "new" star briefly outshines the entire output of a typical galaxy and releases roughly as much energy as a Sun-like star radiates over its whole lifetime; the supernova then fades from sight over several weeks or months. One type results when a degenerate white dwarf star accumulates (via accretion or merging) enough material from a binary companion to raise its core temperature and trigger runaway nuclear fusion, completely disrupting the star. In the other, the core of a massive star undergoes a sudden gravitational collapse and releases enough gravitational potential energy to power the explosion. The astral catastrophe expels stellar material at up to 30,000 km/s (about 10% of the speed of light), driving an expanding, fast-moving shock wave into the surrounding space that can trigger the formation of new stars. Supernovae create, fuse, and eject most of the chemical elements produced by nucleosynthesis (the process that builds new atomic nuclei from pre-existing nucleons, mainly protons and neutrons), enrich the universe with heavier elements, and are a major source of primary cosmic rays and gravitational waves. They are the astrophysical analog of what the economist Joseph Schumpeter called "creative destruction."
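
The energy and speed figures above can be sanity-checked with a few lines of arithmetic. A minimal sketch, assuming a typical textbook supernova energy of about 10^44 joules (a value not stated in the comment itself):

```python
# Rough check: does a supernova's output match a Sun-like star's
# lifetime energy budget, and is 30,000 km/s really ~10% of c?
SUPERNOVA_ENERGY_J = 1e44           # assumed typical total output (textbook order of magnitude)
SOLAR_LUMINOSITY_W = 3.8e26         # the Sun's power output
SUN_LIFETIME_S = 10e9 * 3.15e7      # ~10 billion years, in seconds

sun_lifetime_output = SOLAR_LUMINOSITY_W * SUN_LIFETIME_S
print(f"Sun's lifetime output: {sun_lifetime_output:.1e} J")  # ~1.2e44 J, same order as a supernova

C_KM_S = 299_792                    # speed of light in km/s
EJECTA_KM_S = 30_000                # ejecta speed quoted in the comment
print(f"Ejecta speed / c: {EJECTA_KM_S / C_KM_S:.3f}")        # ~0.10, i.e. about 10% of c
```

Both figures in the comment come out consistent: the Sun's integrated output over ~10 billion years is on the order of 10^44 J, and 30,000 km/s is almost exactly a tenth of light speed.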

  3. Most rational people probably believe that certain states are associated with a definite value of any particular observable. These are eigenstates (from the German word for "inherent" or "characteristic"): a definite position, a definite momentum, a definite energy, a definite time of occurrence. On this view, a phenomenon cannot be caused by some event that did not exist until later. In physics, "realism" claims that even if the results of a possible measurement do not exist before the actual measurement, they are not the creation of the observer; and such a mind-independent property does not have to be the value of some physical variable such as position or momentum -- it may be a potentiality, in the way that glass objects tend to break, or are disposed to break, even if they never actually break. "Locality," on the other hand, means that an object is directly influenced only by its immediate surroundings: for something at one point to have an influence at another point, something in the space between the points (a field) must mediate the action, and something (a wave, a particle) must travel through that space. Albert Einstein's special theory of relativity limits the speed at which all such influences can travel, so the principle of locality implies that an event at one point cannot cause a simultaneous result at another point; in other words, information cannot travel faster than light. However, experiments in quantum mechanics suggest that these assumptions cannot both hold absolutely. Einstein himself, one of the main contributors to the development of quantum mechanics, could not accept many of its implications, especially its rejection of deterministic causality. ("God does not play dice," he famously said.) In particular he did not believe a single subatomic particle can occupy numerous regions of space at one time.
    In 1935, with Boris Podolsky and Nathan Rosen, Einstein argued that quantum mechanics could not be a complete local theory, since a measurement made on one of a pair of separated entangled particles causes the simultaneous collapse of the wavefunction of the remote particle. (Because of the probabilistic nature of wavefunction collapse, however, this violation of locality cannot be used to transmit information faster than light.) In 1964 John Stewart Bell showed that no theory satisfying both locality and realism can reproduce all the predictions of quantum mechanics, and beginning with the photon experiments of Stuart Freedman and John Clauser in 1972 and of Alain Aspect in the early 1980s, scientists have repeatedly found that the resulting "Bell inequality" is violated, just as quantum mechanics predicts. Critics pointed out, though, that most of these experiments contained "loopholes": problems in the experimental design or set-up that affect the validity of the findings. Chief among them is detection efficiency below 100% -- in optical experiments it is typically only around 5-30% -- which forces a "fair sampling assumption" (that the sample of detected pairs is representative of all the pairs emitted) that cannot be tested experimentally, since the number of emitted but undetected pairs is by definition unknown. In 2015, however, Dr. Ronald Hanson's group at TU Delft's Kavli Institute of Nanoscience performed what has been called the first loophole-free Bell test, measuring entangled electron spins in diamond chips separated by 1.3 kilometers.
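
Bell's result is often illustrated with the CHSH form of the inequality: any local realist theory bounds a certain combination of measurement correlations by 2, while quantum mechanics predicts 2√2 for two particles in a spin singlet. A minimal sketch -- the singlet correlation formula E(a, b) = -cos(a - b) and the measurement angles are standard textbook choices, not taken from the comment:

```python
import math

def E(a, b):
    # Quantum-mechanical correlation of spin measurements along directions
    # a and b (in radians) for a singlet pair: E = -cos(a - b).
    return -math.cos(a - b)

# One common choice of CHSH measurement angles.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# The CHSH combination; local realism requires |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # 2*sqrt(2) ≈ 2.828, exceeding the classical bound of 2
```

That the quantum prediction exceeds 2 is exactly what the photon and electron-spin experiments test: measuring |S| > 2 rules out any theory that is both local and realist.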

