Thursday, July 5, 2012

Higgs Boson



The Higgs boson is a hypothetical particle, a boson, that is the quantum of the Higgs field. The field and the particle provide a testable hypothesis for the origin of mass in elementary particles. In popular culture, the Higgs boson is also called the God particle, after the title of Nobel physicist Leon Lederman’s The God Particle: If the Universe Is the Answer, What Is the Question? (1993), which contained the author’s assertion that the discovery of the particle is crucial to a final understanding of the structure of matter.



The existence of the Higgs boson was predicted in 1964 to explain the Higgs mechanism—the mechanism by which elementary particles are given mass. While the Higgs mechanism is considered confirmed to exist, the boson itself—a cornerstone of the leading theory—had not been observed and its existence was unconfirmed. Its tentative discovery in 2012 may validate the Standard Model as essentially correct, as it is the final elementary particle predicted and required by the Standard Model that had not yet been observed in particle physics experiments. Alternative sources of the Higgs mechanism that do not need the Higgs boson are also possible and would be considered if the existence of the Higgs boson were ruled out. These are known as Higgsless models.



The Higgs boson is named after Peter Higgs, who was one of the six authors of the ground-breaking 1960s papers covering what is now known as the Higgs mechanism and describing the related Higgs field and boson. Technically, it is the quantum excitation of the Higgs field, and the non-zero value of the ground state of this field gives mass to other elementary particles such as quarks and electrons through the Higgs mechanism. The Standard Model completely fixes the properties of the Higgs boson, except for its mass. It is expected to have no spin and no electric or color charge, and it interacts with other particles through the weak interaction and through Yukawa-type interactions between the various fermions and the Higgs field.



Because the Higgs boson is a very massive particle and decays almost immediately when created, only a very high-energy particle accelerator can observe and record it. Experiments to confirm and determine the nature of the Higgs boson using the Large Hadron Collider (LHC) at CERN began in early 2010; similar searches had been performed at Fermilab's Tevatron until its closure in late 2011. Mathematical consistency of the Standard Model requires that any mechanism capable of generating the masses of elementary particles become visible at energies above 1.4 TeV; therefore, the LHC (designed to collide two 7 TeV proton beams, but currently running at 4 TeV each) was built to answer the question of whether or not the Higgs boson exists.


On 4 July 2012, the two main experiments at the LHC (ATLAS and CMS) both independently confirmed the existence of a previously unknown particle with a mass of about 125 GeV/c² (about 133 proton masses, on the order of 10^-25 kg), which is "consistent with the Higgs boson" and widely believed to be the Higgs boson. They acknowledged that further work would be needed to confirm that it is indeed the Higgs boson and not some other previously unknown particle (meaning that it has the theoretically predicted properties of the Higgs boson) and, if so, to determine which version of the Standard Model it best supports.
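As a rough cross-check of the figures in parentheses, the reported mass can be converted from GeV/c² into kilograms and proton masses. The following is a minimal sketch in Python using standard values for the constants; the numbers are illustrative and not taken from the CERN announcement itself.

# Rough conversion of the reported ~125 GeV/c^2 mass into SI units and proton masses.
GEV_IN_JOULES = 1.602176634e-10   # energy of 1 GeV in joules
C = 2.99792458e8                  # speed of light in m/s
PROTON_MASS_GEV = 0.93827         # proton mass in GeV/c^2

higgs_gev = 125.0
mass_kg = higgs_gev * GEV_IN_JOULES / C**2   # E = m c^2, so m = E / c^2
proton_masses = higgs_gev / PROTON_MASS_GEV

print(f"{mass_kg:.2e} kg")                   # ~2.2e-25 kg, i.e. of order 10^-25 kg
print(f"{proton_masses:.0f} proton masses")  # ~133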


Overview

In particle physics, elementary particles and forces give rise to the world around us. Physicists explain the behaviors of these particles and how they interact using the Standard Model—a widely accepted framework believed to explain most of the world we see around us. Initially, when these models were being developed and tested, it seemed that the mathematics behind them, although satisfactory in areas already tested, would also forbid elementary particles from having any mass, which showed clearly that these initial models were incomplete. In 1964 three groups of physicists almost simultaneously released papers describing how masses could be given to these particles, using approaches known as symmetry breaking. This approach allowed the particles to obtain mass without breaking other parts of particle physics theory that were already believed to be reasonably correct. The idea became known as the Higgs mechanism (not the same as the boson), and later experiments confirmed that such a mechanism does happen—but they could not show exactly how it happens.

The leading and simplest theory for how this effect takes place in nature was that if a particular kind of "field" (known as a Higgs Field) happened to permeate space, and if it could interact with fundamental particles in a particular way, then this would give rise to a Higgs Mechanism in nature, and would therefore create around us the phenomenon we call "mass". During the 1960s and 1970s the Standard Model of physics was developed on this basis, and it included a prediction and requirement that for these things to be true, there had to be an undiscovered boson—one of the fundamental particles—as the counterpart of this field. This would be the Higgs boson. If the Higgs boson was confirmed to exist, as the Standard Model suggested, then scientists could be satisfied that the Standard Model was fundamentally correct. If the Higgs boson was confirmed as not existing, then other theories would be considered as candidates instead.

The Standard Model also made clear that the Higgs boson would be very difficult to demonstrate. It exists for only a tiny fraction of a second before breaking up into other particles—so quickly that it cannot be directly detected—and can be detected only by identifying the products of its immediate decay and analyzing them to show that they were probably created by a Higgs boson rather than by some other process. The Higgs boson requires so much energy to create (compared with many other fundamental particles) that a massive particle accelerator is needed to produce collisions energetic enough to create it and record the traces of its decay. Given a suitable accelerator and appropriate detectors, scientists can record trillions of particle collisions, analyze the data for collisions likely to be a Higgs boson, and then perform further analysis to test how likely it is that the combined results show a Higgs boson does exist and are not just due to chance.
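The "not just due to chance" step can be illustrated with a toy counting argument: given an expected number of background collisions and an observed count, how improbable is the excess under background alone? The sketch below is purely illustrative Python with made-up numbers; the real ATLAS and CMS analyses are far more sophisticated.

from math import exp, log, lgamma

# Toy illustration of the counting logic described above (not the real LHC analysis).
# Given an expected background of b events and an observed count n, how unlikely is
# seeing at least n events if only background processes were present?
def poisson_p_value(n_observed, background):
    # P(N >= n) = 1 - sum over k < n of e^(-b) * b^k / k!, evaluated in log space for stability
    cdf = sum(exp(-background + k * log(background) - lgamma(k + 1))
              for k in range(n_observed))
    return max(0.0, 1.0 - cdf)

# Hypothetical numbers, purely for illustration.
background_expected = 100.0
events_observed = 160
print(poisson_p_value(events_observed, background_expected))  # tiny value => excess unlikely to be chance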

Experiments to show whether the Higgs boson did or did not exist began in the 1980s, but until the 2000s it could only be said that certain mass ranges were plausible or ruled out. In 2008 the Large Hadron Collider (LHC), the most powerful particle accelerator ever built, was inaugurated. It was designed especially for this experiment and for other very high-energy tests of the Standard Model. In 2010 it began its primary research role: to prove whether or not the Higgs boson existed.

In late 2011 two of the LHC's experiments independently began to suggest "hints" of a Higgs boson detection around 125 GeV (a unit used for particle mass). In July 2012 CERN announced[1] evidence of the discovery of a boson with an energy level and other properties consistent with those expected of a Higgs boson. The available data gave a high statistical likelihood that the Higgs boson had been detected. Further work is necessary for the evidence to be considered conclusive (or disproved). If the newly discovered particle is indeed the Higgs boson, attention will turn to whether its characteristics match one of the extant versions of the Standard Model. The CERN data also include clues that additional bosons or similar-mass particles may have been discovered as well as, or instead of, the Higgs itself. If a different boson were confirmed, it would allow and require the development of new theories to supplant the current Standard Model.

Peter Higgs



Peter Ware Higgs, FRS, FRSE, FKC (born 29 May 1929) is a British theoretical physicist and emeritus professor at the University of Edinburgh. He is best known for his 1960s proposal of broken symmetry in electroweak theory, explaining the origin of mass of elementary particles in general and of the W and Z bosons in particular. This so-called Higgs mechanism, which was proposed by several physicists besides Higgs at about the same time, predicts the existence of a new particle, the Higgs boson (often described as "the most sought-after particle in modern physics"). CERN announced on 4 July 2012 that they had experimentally established the existence of a Higgs-like boson, but further work is needed to analyse its properties and see if it has the properties expected from the Standard Model Higgs boson. The Higgs mechanism is generally accepted as an important ingredient in the Standard Model of particle physics, without which particles would have no mass. Higgs has been honoured with a number of awards in recognition of his work, including the 1997 Dirac Medal and Prize for outstanding contributions to theoretical physics from the Institute of Physics, the 1997 High Energy and Particle Physics Prize by the European Physical Society, the 2004 Wolf Prize in Physics, and the 2010 J. J. Sakurai Prize for Theoretical Particle Physics.






Monday, March 12, 2012

NIST hash function competition SHA-3

NIST SHA3

The NIST hash function competition is an open competition held by the US National Institute of Standards and Technology for a new SHA-3 function to replace the older SHA-1 and SHA-2; it was formally announced in the Federal Register on November 2, 2007. "NIST is initiating an effort to develop one or more additional hash algorithms through a public competition, similar to the development process for the Advanced Encryption Standard (AES)."

Submissions were due October 31, 2008, with a list of candidates accepted for the first round published December 9, 2008. NIST held a conference in late February 2009 where submitters gave presentations on their algorithms and NIST officials discussed criteria for narrowing down the field of candidates for Round 2. The list of 14 candidates accepted to Round 2 was published on July 24, 2009. Another conference was held August 23-24, 2010 (after CRYPTO 2010) at the University of California, Santa Barbara, where the second-round candidates were discussed. The announcement of the final round candidates occurred on December 10, 2010 and the proclamation of a winner and publication of the new standard are scheduled to take place in 2012.

Finalists

NIST has selected five SHA-3 candidate algorithms to advance to the third (and final) round:
BLAKE (Aumasson et al.)
Grøstl (Knudsen et al.)
JH (Wu)
Keccak (Keccak team, Daemen et al.)
Skein (Schneier et al.)


NIST noted some factors that figured into its selection as it announced the finalists:

Performance: "A couple of algorithms were wounded or eliminated by very large [hardware gate] area requirement – it seemed that the area they required precluded their use in too much of the potential application space."

Security: "We preferred to be conservative about security, and in some cases did not select algorithms with exceptional performance, largely because something about them made us 'nervous,' even though we knew of no clear attack against the full algorithm."

Analysis: "NIST eliminated several algorithms because of the extent of their second-round tweaks or because of a relative lack of reported cryptanalysis – either tended to create the suspicion that the design might not yet be fully tested and mature."

Diversity: The finalists included hashes based on different modes of operation, including the HAIFA and sponge hash constructions, and with different internal structures, including ones based on AES, bitslicing, and alternating XOR with addition.

NIST has released a report explaining its evaluation algorithm-by-algorithm.
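For context on what such a function looks like in practice, a digest from the existing SHA-2 family can be compared with a digest from Keccak, the finalist that was ultimately standardized as SHA-3. The sketch below assumes Python 3.6 or newer, where both are exposed through the standard hashlib module.

import hashlib

# Hash the same message with a SHA-2 function and with the Keccak-based SHA3-256.
message = b"NIST hash function competition"

print("SHA-256 :", hashlib.sha256(message).hexdigest())
print("SHA3-256:", hashlib.sha3_256(message).hexdigest())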

Meltem Sönmez Turan's Blog

Thursday, September 29, 2011

Demon Core



The Demon Core was the nickname given to a 6.2-kilogram (14 lb) subcritical mass of plutonium that accidentally went briefly critical in two separate accidents at the Los Alamos laboratory in 1945 and 1946. Each incident resulted in acute radiation poisoning and the subsequent death of a scientist. After these incidents, the sphere of plutonium was referred to as the Demon Core.

The core was used in an atomic bomb test in 1946, five weeks after the second fatal accident, and proved in practice to have a slightly increased yield over similar cores which had not been subjected to criticality excursions.

First incident

On August 21, 1945, the plutonium core produced a burst of neutron radiation that caught physicist Harry Daghlian in its path. Daghlian made a mistake while working alone, performing neutron-reflection experiments on the core. The core was placed within a stack of neutron-reflective tungsten carbide bricks, and the addition of each brick moved the assembly closer to criticality. While attempting to stack another brick around the assembly, Daghlian accidentally dropped it onto the core, causing the core to go critical. Despite quick action in moving the brick off the assembly, Daghlian received a fatal dose of radiation. He died 25 days later of acute radiation poisoning.

Another person who was in the lab at the time of the accident—Private Robert J. Hemmerly, a Special Engineer Detachment (SED) guard—received an exposure of approximately 31 roentgens of soft X-rays (80 kV equivalent) and less than 1 roentgen of gamma rays. Hemmerly died in 1978 (33 years after the accident) from acute myelogenous leukemia at the age of 62.

Second Incident

On May 21, 1946, physicist Louis Slotin and seven other scientists were in a Los Alamos laboratory conducting an experiment to verify the exact point at which a subcritical mass (core) of fissile material could be made critical by the positioning of neutron reflectors. The test was known as "tickling the dragon's tail" for its extreme risk. It required the operator to place two half-spheres of beryllium (a neutron reflector) around the core to be tested and manually lower the top reflector over the core via a thumb hole on the top. As the reflectors were manually moved closer and farther away from each other, scintillation counters measured the relative activity from the core. Allowing them to close completely would result in the instantaneous formation of a critical mass and a lethal power excursion. Under Slotin's unapproved protocol, the only thing preventing this was the blade of a standard flathead screwdriver, manipulated by the scientist's other hand. Slotin, who was given to bravado, became the local expert, performing the test almost a dozen separate times, often in his trademark bluejeans and cowboy boots, in front of a roomful of observers. Enrico Fermi reportedly told Slotin and others they would be "dead within a year" if they continued performing it.

While Slotin was lowering the top reflector, his screwdriver slipped a fraction of an inch, allowing the top reflector to fall into place around the core. Instantly there was a flash of blue light and a wave of heat across Slotin's skin; the core had become supercritical, releasing a massive burst of neutron radiation. He quickly knocked the two halves apart, stopping the chain reaction and likely saving the lives of the other men in the laboratory. Slotin's body positioning over the apparatus also shielded the others from much of the neutron radiation. He received a lethal dose in under a second and died nine days later from acute radiation poisoning. The nearest physicist to Slotin, Alvin C. Graves, was watching over Slotin's shoulder and was thus partially shielded by him, receiving a high but non-lethal radiation dose. Graves was hospitalized for several weeks with severe radiation poisoning, developed chronic neurological and vision problems as a result of the exposure, suffered a significant shortening of his lifespan, and died of a radiation-induced heart attack 20 years later. The other six people in the room were far enough away from the assembly to avoid fatal injury, but all suffered other complications as a result of the accident. Two suffered severe shortening of their lives and died years later from radiation-induced complications: one of leukemia (at age 42, 18 years after the accident) and one of clinical aplastic anemia.
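The speed of such an excursion can be illustrated with a toy point-model of neutron multiplication: each generation scales the neutron population by a factor k_eff, so even a value slightly above 1 grows explosively within microseconds. The generation time and k_eff values below are assumed for illustration only, not measurements from the Los Alamos assemblies.

# Toy point-model of neutron multiplication (illustrative only; the constants below
# are assumptions, not measurements from the Los Alamos assemblies).
GENERATION_TIME = 1e-8  # seconds per neutron generation in a fast assembly (assumed)

def population_after(k_eff, seconds, n0=1.0):
    # Each generation multiplies the neutron population by k_eff.
    generations = seconds / GENERATION_TIME
    return n0 * k_eff ** generations

for k_eff in (0.99, 1.00, 1.01):
    # Population after 10 microseconds: dies away, stays level, or grows explosively.
    print(k_eff, population_after(k_eff, 1e-5))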

Demon Core In Use

The Demon Core was put to use for the Able detonation test of the Crossroads series on July 1, 1946, demonstrating that the criticality excursions of the Daghlian and Slotin accidents had slightly increased the efficiency of the weapon.

Friday, July 1, 2011

Dark Flow

Dark flow is an astrophysical term describing a peculiar velocity of galaxy clusters. The actual measured velocity is the velocity predicted by Hubble's Law plus a small and unexplained (or "dark") velocity flowing in a common direction.
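In other words, the observed velocity decomposes into a Hubble-flow term plus a peculiar ("dark flow") term. The sketch below works through that decomposition for a hypothetical cluster; the Hubble constant, distance, and flow speed are assumed round numbers, with the flow speed merely chosen inside the range reported below.

# Decomposition described above: observed velocity = Hubble-flow velocity + peculiar velocity.
# All numbers are illustrative round values, not taken from the WMAP analysis.
H0 = 70.0              # Hubble constant in km/s per megaparsec (assumed)
distance_mpc = 500.0   # distance to a hypothetical galaxy cluster, in megaparsecs
dark_flow = 800.0      # peculiar "dark flow" velocity in km/s (assumed)

hubble_velocity = H0 * distance_mpc            # 35,000 km/s from cosmic expansion alone
observed_velocity = hubble_velocity + dark_flow

print(hubble_velocity, observed_velocity)      # the dark-flow term is only a ~2% correction here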

According to standard cosmological models, the motion of galaxy clusters with respect to the cosmic microwave background should be randomly distributed in all directions. However, analyzing the three-year WMAP data using the kinematic Sunyaev-Zel'dovich effect, the authors of the study found evidence of a "surprisingly coherent" 600–1000 km/s flow of clusters toward a 20-degree patch of sky between the constellations of Centaurus and Vela.

The authors (Alexander Kashlinsky, F. Atrio-Barandela, D. Kocevski, and H. Ebeling) suggest that the motion may be a remnant of the influence of no-longer-visible regions of the universe prior to inflation. Telescopes cannot see events earlier than about 380,000 years after the big bang, when the universe became transparent (the Cosmic Microwave Background); this corresponds to the particle horizon at a distance of about 46 billion (4.6×10^10) light years. Since the matter causing the net motion in this proposal is outside this range, it would in a certain sense be outside our visible universe; however, it would still be in our past light cone.

The results appeared in the October 20, 2008, issue of Astrophysical Journal Letters. Since then, the authors have extended their analysis to additional clusters and the recently released WMAP five-year data.

Location

The dark flow was determined to be flowing in the direction of the Centaurus and Hydra constellations. This corresponds with the direction of the Great Attractor, a gravitational mystery originally discovered in 1973. However, the source of the Great Attractor's pull was thought to be a massive cluster of galaxies called the Norma cluster, situated only 150-250 million light-years away. The dark flow may indicate that the source of that attraction lies much farther away, and that the Great Attractor itself is moving toward it.

In a study from March 2010, Kashlinsky extended his 2008 work by using the 5-year WMAP results rather than the 3-year results and by doubling the number of galaxy clusters observed from the original 700. The team also sorted the cluster catalog into four "slices" representing different distance ranges and then examined the preferred flow direction for the clusters within each slice. While the size and exact position of this direction display some variation, the overall trends among the slices exhibit remarkable agreement. "We detect motion along this axis, but right now our data cannot state as strongly as we'd like whether the clusters are coming or going," Kashlinsky said.

The team has so far catalogued the effect as far out as 2.5 billion light-years, and hope to expand their catalog out further still to twice the current distance.


Panoramic view of galaxies beyond the Milky Way, with the Norma cluster and Great Attractor indicated by a long blue arrow at the bottom right of the image, near the disk of the Milky Way.


NASA's Goddard Space Flight Center suggested this could be the effect of a sibling universe or of a region of space-time fundamentally different from the observable universe. Data on more than 1,000 galaxy clusters have been measured, including some as distant as 3 billion light-years. Alexander Kashlinsky claims these measurements show the universe's steady flow is clearly not a statistical fluke. Said Kashlinsky: "At this point we don't have enough information to see what it is, or to constrain it. We can only say with certainty that somewhere very far away the world is very different than what we see locally. Whether it's 'another universe' or a different fabric of space-time we don't know."

Sloan Great Wall



The Sloan Great Wall is a giant wall of galaxies (a galactic filament) and is, to date, the largest known structure in the universe. Its discovery was announced on October 20, 2003 by J. Richard Gott III of Princeton University, Mario Jurić, and their colleagues, based on data from the Sloan Digital Sky Survey.

The wall measures 1.37 billion light-years (1.30×10^25 m) in length, which is approximately 1/60 of the diameter of the observable universe, and is located approximately one billion light-years from Earth.
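The metric figure can be checked with a quick unit conversion; the light-year value used below is the standard one, not a number from the discovery paper.

# Quick check of the quoted length: 1.37 billion light-years expressed in metres.
LIGHT_YEAR_M = 9.4607e15          # metres in one light-year (standard value)

length_m = 1.37e9 * LIGHT_YEAR_M
print(f"{length_m:.2e} m")        # ~1.30e25 m, matching the figure quoted above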

The Sloan Great Wall, classified as the supercluster SCl 126 in SIMBAD, is nearly three times longer than the CfA2 Great Wall of galaxies, the previous record-holder, which was discovered by Margaret Geller and John Huchra of Harvard in 1989.

Monday, May 23, 2011

Flexible organic light-emitting diode (FOLED)

A flexible organic light emitting diode (FOLED) is a type of organic light-emitting diode (OLED) incorporating a flexible plastic substrate on which the electroluminescent organic semiconductor is deposited. This enables the device to be bent or rolled while still operating. Currently the focus of research in industrial and academic groups, flexible OLEDs form one method of fabricating a rollable display.



An OLED emits light due to the electroluminescence of thin films of organic semiconductors approximately 100 nm thick. Regular OLEDs are usually fabricated on a glass substrate, but by replacing glass with a flexible plastic such as polyethylene terephthalate (PET) among others, OLEDs can be made both bendable and lightweight.

Such substrates may not be suitable for comparable devices based on inorganic semiconductors because of the need for lattice matching and the high-temperature fabrication procedures involved.

In contrast, flexible OLED devices can be fabricated by deposition of the organic layer onto the substrate using a method derived from inkjet printing, allowing the inexpensive and roll-to-roll fabrication of printed electronics.

Flexible OLEDs may be used in the production of rollable displays, electronic paper, or bendable displays which can be integrated into clothing, wallpaper, or other curved surfaces. Companies such as Sony have exhibited prototype displays that can be rolled around the width of a pencil.