Monday, December 31, 2007

Douglas Skyrocket D-558-2

NASA Dryden Douglas Skyrocket

The Douglas Skyrocket (D-558-2, also written D-558-II) was a rocket- and jet-powered research aircraft built by the Douglas Aircraft Company for the U.S. Navy. On November 20, 1953, shortly before the 50th anniversary of powered flight, Scott Crossfield piloted the Douglas D-558-2 Skyrocket to Mach 2, more than 2,076 km/h, the first time an aircraft had exceeded twice the speed of sound.
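The Mach 2 figure can be sanity-checked with a quick calculation. This is a sketch that assumes a standard-atmosphere stratospheric temperature of about 216.65 K at the flight altitude; the actual local temperature is not given in the text.

```python
import math

# Speed of sound in dry air: a = sqrt(gamma * R * T)
gamma = 1.4          # ratio of specific heats for air
R = 287.05           # specific gas constant for air, J/(kg*K)
T = 216.65           # assumed ISA stratosphere temperature, K

a = math.sqrt(gamma * R * T)   # speed of sound, roughly 295 m/s
mach2_kmh = 2 * a * 3.6        # Mach 2 converted to km/h
```

The result, roughly 2,125 km/h, is consistent with the "more than 2,076 km/h" quoted above; the exact value depends on the air temperature along the flight path.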

All three of the Skyrockets had 35-degree swept wings.

Until configured for air launch, NACA 143 featured a Westinghouse J-34-40 turbojet engine rated at 3,000 pounds force (13 kN) static thrust. It carried 260 US gallons (980 L) of aviation gasoline and weighed 10,572 pounds (4,795 kg) at take-off.

NACA 144 (and NACA 143 after modification in 1955) was powered by an LR-8-RM-6 rocket engine rated at 6,000 pounds force (27 kN) static thrust. Its propellants were 345 US gallons (1,306 L) of liquid oxygen and 378 US gallons (1,431 L) of diluted ethyl alcohol. In its launch configuration, it weighed 15,787 pounds (7,161 kg).

NACA 145 had both an LR-8-RM-5 rocket engine rated at 6,000 pounds force (27 kN) static thrust and featured a Westinghouse J-34-40 turbojet engine rated at 3,000 pounds force (13 kN) static thrust. It carried 170 US gallons (644 L) of liquid oxygen, 192 US gallons (727 L) of diluted ethyl alcohol, and 260 US gallons (984 L) of aviation gasoline for a launch weight of 15,266 pounds (6,925 kg).

Aircraft serial numbers

* D-558-2 Skyrocket
o D-558-2 #1 - #37973 NACA-143, 123 flights
o D-558-2 #2 - #37974 NACA-144, 103 flights
o D-558-2 #3 - #37975 NACA-145, 87 flights

* Maximum speed: 720 mph, 1,250 mph when air-launched (1,160 km/h, 2,010 km/h when air-launched)
* Stall speed: 160.1 mph (257.7 km/h)
* Service ceiling: 16,500 ft (5,030 m)
* Rate of climb: 22,400 ft/min, 11,100 ft/min under rocket power only (6,830 m/min, 3,380 m/min under rocket power only)
* Wing loading: 87.2 lb/ft² (426 kg/m²)
* Thrust/weight (jet): 0.39

Friday, December 28, 2007


MIT WiTricity
Goodbye wires…

WiTricity, a portmanteau of wireless electricity, is a term coined by Dave Gerding in 2005 and used by an MIT research team led by Prof. Marin Soljačić in 2007 to describe the ability to provide electrical energy to remote objects without wires. WiTricity is based on strong coupling between electromagnetic resonant objects to transfer energy wirelessly between them. The system consists of WiTricity transmitters and receivers that contain magnetic loop antennas critically tuned to the same frequency. Because WiTricity operates in the electromagnetic near field, the receiving devices must be no more than about a quarter wavelength from the transmitter, which is a few meters at the few-MHz frequencies used by the system. In their first paper, the group also simulated GHz dielectric resonators.
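The quarter-wavelength figure is easy to verify. The sketch below assumes an operating frequency of 10 MHz, in line with the "few MHz" mentioned above; the exact frequency is an assumption.

```python
c = 299_792_458.0    # speed of light in vacuum, m/s
f = 10e6             # assumed operating frequency, 10 MHz

wavelength = c / f                  # roughly 30 m at 10 MHz
near_field_range = wavelength / 4   # quarter wavelength, roughly 7.5 m
```

A quarter wavelength of about 7.5 m matches the "few meters" range claimed for the system.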

Wireless power transmission is not a new idea: in 1964, William C. Brown demonstrated on the CBS Walter Cronkite news a microwave-powered model helicopter that received all the power it needed for flight from a microwave beam. Between 1969 and 1975, Brown was technical director of a JPL/Raytheon program that beamed 30 kW over a distance of 1 mile at 84% efficiency.

Unlike far-field wireless power transfer systems based on traveling electromagnetic waves, WiTricity employs near-field inductive coupling through magnetic fields, which interact far more weakly with surrounding objects, including biological tissue. It is not clear why this technology was not developed earlier; researchers attribute the delay to various reasons, ranging from the limitations of well-known physical laws to a simple lack of need. Only recently have consumers acquired large numbers of portable electronic devices that require batteries and plug-in chargers.

The MIT researchers successfully demonstrated the ability to power a 60-watt light bulb from a power source 2 meters (7 ft) away at roughly 40% efficiency. They used two capacitively loaded copper coils, 60 centimeters (24 in) in diameter, oriented along the same axis and designed to resonate together at 10 MHz. One was connected inductively to a power source, the other to a bulb. The setup lit the bulb even when the direct line of sight was blocked by a wooden panel. Aristeidis Karalis says that "the usual non-resonant magnetic induction would be almost 1 million times less efficient in this particular system".

The researchers plan to miniaturize the setup enough for commercial use in three to five years. The researchers suggest that the radiated power densities will be below the threshold for FCC safety regulations.

Further applications for this technology include transmission of information: since the underlying mechanism is near-field coupling rather than radiated electromagnetic waves, it would not interfere with radio transmissions and thus could be used as a cheap and efficient communication device without requiring a license or a government permit.

Tuesday, December 25, 2007

Advanced Encryption Standard


In cryptography, the Advanced Encryption Standard (AES), also known as Rijndael, is a block cipher adopted as an encryption standard by the U.S. government. It has been analyzed extensively and is now widely used worldwide, as was the case with its predecessor, the Data Encryption Standard (DES). AES was announced by the National Institute of Standards and Technology (NIST) as U.S. FIPS PUB 197 (FIPS 197) on November 26, 2001, after a 5-year standardization process (see Advanced Encryption Standard process for more details). It became effective as a standard on May 26, 2002. As of 2006, AES is one of the most popular algorithms used in symmetric-key cryptography, and it is available by choice in many different encryption packages.

The cipher was developed by two Belgian cryptographers, Joan Daemen and Vincent Rijmen, and submitted to the AES selection process under the name "Rijndael", a portmanteau of the names of the inventors.

As of 2006, the only successful attacks against AES have been side channel attacks. The National Security Agency (NSA) reviewed all the AES finalists, including Rijndael, and stated that all of them were secure enough for US Government non-classified data. In June 2003, the US Government announced that AES may be used for classified information:

"The design and strength of all key lengths of the AES algorithm (i.e., 128, 192 and 256) are sufficient to protect classified information up to the SECRET level. TOP SECRET information will require use of either the 192 or 256 key lengths. The implementation of AES in products intended to protect national security systems and/or information must be reviewed and certified by NSA prior to their acquisition and use."

This marks the first time that the public has had access to a cipher approved by NSA for encryption of TOP SECRET information. Many public products use 128-bit secret keys by default; it is possible that NSA suspects a fundamental weakness in keys this short, or they may simply prefer a safety margin for top secret documents (which may require security decades into the future).

Monday, December 24, 2007


Laser Interferometer Gravitational-Wave Observatory

LIGO stands for Laser Interferometer Gravitational-Wave Observatory. Cofounded in 1992 by Kip Thorne and Ronald Drever of Caltech and Rainer Weiss of MIT, LIGO is a joint project between scientists at MIT and Caltech, sponsored by the National Science Foundation (NSF). At a cost of $365 million (in 2002 USD), it was the largest and most ambitious project ever funded by the NSF (and still is as of 2007). The international LIGO Scientific Collaboration (LSC) is a growing group of researchers, some 400 individuals at roughly 40 institutions, working to analyze the data from LIGO and other detectors and working toward more sensitive future detectors.

Control Center

LIGO's mission is to directly observe gravitational waves of cosmic origin. These waves were first predicted by Einstein's Theory of General Relativity in 1916, when the technology necessary for their detection did not yet exist. Gravitational waves were indirectly confirmed to exist when observations were made of the binary pulsar PSR 1913+16, for which the Nobel Prize was awarded to Hulse and Taylor in 1993.

Direct detection of gravitational waves has long been sought, for it would open up a new branch of astronomy to complement electromagnetic telescopes and neutrino observatories. Joseph Weber pioneered the effort to detect gravitational waves in the 1960s through his work on resonant-mass bar detectors. Bar detectors continue to be used at six sites worldwide. By the 1970s, scientists including Rainer Weiss had realized the applicability of laser interferometry to gravitational wave measurements.

In August 2002, LIGO began its search for cosmic gravitational waves. Emissions of gravitational waves are expected from binary systems (collisions and coalescences of neutron stars or black holes), supernovae of massive stars (which form neutron stars and black holes), rotations of neutron stars with deformed crusts, and the remnants of gravitational radiation created by the birth of the universe. The observatory may in theory also observe more exotic, currently hypothetical phenomena, such as gravitational waves caused by oscillating cosmic strings or colliding domain walls. Since the early 1990s, physicists have believed that technology has reached the point where detection of gravitational waves of significant astrophysical interest is possible.

The European Gravitational Observatory

The European Gravitational Observatory, or EGO, is located in the countryside near Pisa in the commune of Cascina. In order to ensure the long-term scientific exploitation of the VIRGO interferometric antenna for gravitational-wave detection, as well as to foster European collaboration in this emerging field, the VIRGO funding institutions (CNRS for France and INFN for Italy) created a consortium called EGO (European Gravitational Observatory).

VIRGO is a 3 kilometer interferometer built through a French-Italian collaboration. This collaboration involves 11 laboratories in France and Italy and more than 150 scientists.

EGO is established under Italian law. Its governing body is the Council, composed of six members nominated by the funding institutions. The Council appoints a Director, who is the legal representative and chief executive of EGO. The Scientific and Technical Advisory Committee, composed of up to ten scientific personalities, advises the Council on the scientific and technical activities carried out by the Consortium.

Herbig-Haro Objects

Herbig-Haro object HH47, imaged by the Hubble Space Telescope. The scale bar represents 1,000 astronomical units, equivalent to about 20 times the size of our solar system, or 1,000 times the distance from the Earth to the Sun.

Herbig-Haro objects are small patches of nebulosity associated with newly-born stars, and are formed when gas ejected by young stars collides with clouds of gas and dust nearby at speeds of several hundred kilometres per second. Herbig-Haro objects are ubiquitous in star-forming regions, and several are often seen around a single star, aligned along its rotational axis. HH objects are transient phenomena, lasting only a few thousand years at most. They can evolve visibly over quite short timescales as they move rapidly away from their parent star into the gas clouds in interstellar space (the interstellar medium or ISM). Hubble Space Telescope observations reveal complex evolution of HH objects over a few years, as parts of them fade while others brighten as they collide with clumpy material in the interstellar medium. The objects were first observed in the late 19th century by Sherburne Wesley Burnham, but were not recognised as being a distinct type of emission nebula until the 1940s. The first astronomers to study them in detail were George Herbig and Guillermo Haro, after whom they have been named. Herbig and Haro were working independently on studies of star formation when they first analysed Herbig-Haro objects, and recognised that they were a by-product of the star formation process.

Images taken over five years reveal the motion of material in HH object HH47.

Over 400 individual HH objects or groups are now known. They are ubiquitous in star-forming H II regions, and are often found in large groups. They are typically observed near Bok globules (dark nebulae which contain very young stars) and often emanate from them. Frequently, several HH objects are seen near a single energy source, forming a string of objects along the line of the polar axis of the parent star.

The number of known HH objects has increased rapidly over the last few years, but is still thought to be a very small proportion of the total number existing in our galaxy. Estimates suggest that up to 150,000 exist, the vast majority of which are too far away to be resolved with current technological capabilities. Most HH objects lie within 0.5 parsecs of their parent star, with very few found more than 1 pc away. However, some are seen several parsecs away, perhaps implying that the interstellar medium is not very dense in their vicinity, allowing them to travel further from their source before dispersing.

Sunday, December 23, 2007

Shallow Water Equations

Output from a shallow water equation model of water in a bathtub. The water experiences five splashes which generate surface gravity waves that propagate away from the splash locations and reflect off of the bathtub walls.

The shallow water equations (also called Saint Venant equations after Adhémar Jean Claude Barré de Saint-Venant) are a set of equations that describe the flow below a horizontal pressure surface in a fluid. The flow these equations describe is the horizontal flow caused by changes in the height of the pressure surface of the fluid. Shallow water equations can be used in atmospheric and oceanic modelling, but are much simpler than the primitive equations. Shallow water equation models have only one vertical level, so they cannot encompass any factor that varies with height.

* u is the zonal velocity (or velocity in the x dimension).
* v is the meridional velocity (or velocity in the y dimension).
* H is the mean height of the horizontal pressure surface.
* η is the deviation of the horizontal pressure surface from its mean.
* g is the acceleration of gravity.
* f is the term corresponding to the Coriolis force, and is equal to 2Ω sin(φ), where Ω is the angular rotation rate of the Earth (π/12 radians/hour), and φ is the latitude.
* b is the viscous drag.
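With the symbols above, the equations themselves can be written out. The following is a sketch of the standard single-layer, rotating form with linear drag; it is consistent with the variable list but is not necessarily the exact form the original article presented.

```latex
\begin{aligned}
\frac{\partial u}{\partial t} &= -g\,\frac{\partial \eta}{\partial x} + f v - b u,\\[2pt]
\frac{\partial v}{\partial t} &= -g\,\frac{\partial \eta}{\partial y} - f u - b v,\\[2pt]
\frac{\partial \eta}{\partial t} &= -\frac{\partial}{\partial x}\bigl[(H+\eta)\,u\bigr]
  - \frac{\partial}{\partial y}\bigl[(H+\eta)\,v\bigr].
\end{aligned}
```

The first two equations are momentum balances (pressure gradient, Coriolis, and drag terms); the third expresses conservation of mass for the fluid column of total height H + η.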

F-35 Lightning II

The F-35 Lightning II takes off for its first flight at Naval Air Station Fort Worth Joint Reserve Base on 15 December 2006.


The F-35 Lightning II is a single-seat, single-engine, stealth-capable military strike fighter, a multi-role aircraft that can perform close air support, tactical bombing, and air-to-air combat. The F-35 is descended from the X-35 of the Joint Strike Fighter (JSF) program. Its development is being principally funded by the United States with the United Kingdom and other partner governments providing additional funding. It is being designed and built by an aerospace industry team led by Lockheed Martin with Northrop Grumman and BAE Systems as major partners. Demonstrator aircraft flew in 2000; a production model first took flight on 15 December 2006. The United States Air Force plans to acquire 1,763 aircraft.

Unit cost

F-35A: US$48 million
F-35B: US$62 million
F-35C: US$63 million

General characteristics

* Crew: 1
* Length: 50 ft 6 in (15.37 m)
* Wingspan: 35 ft 0 in (10.65 m)
* Height: 17 ft 4 in (5.28 m)
* Wing area: 459.6 ft² (42.7 m²)
* Empty weight: 26,000 lb (12,000 kg)
* Loaded weight: 44,400 lb (20,100 kg)
* Max takeoff weight: 60,000 lb (27,200 kg)
* Powerplant: 1× Pratt & Whitney F135 afterburning turbofan
o Dry thrust: 25,000 lbf[63] (111 kN)
o Thrust with afterburner: >40,000 lbf[63] (178 kN)
* Secondary Powerplant: 1× General Electric/Rolls-Royce F136 afterburning turbofan, >40,000 lbf (178 kN) [in development]
* Lift fan (STOVL): 1× Rolls-Royce Lift System driven from either F135 or F136 power plant, 18,000 lbf (80 kN)


* Maximum speed: Mach 1.6+[63] (1,200 mph, 1,931 km/h)
* Range: A: 1,200 nmi; B: 900 nmi; C: 1,400 nmi[63] (A: 2,200 km; B: 1,667 km; C: 2,593 km) on internal fuel
* Combat radius: 600 nmi (690 mi, 1,110 km)
* Rate of climb: classified (not publicly available)
* Wing loading: 91.4 lb/ft² (446 kg/m²)
* Thrust/weight:
o With full fuel: A: 0.89; B: 0.92; C: 0.81[63]
o With 50% fuel: A: 1.12; B: 1.10; C: 1.01[63]


* F-35A: +9G
* F-35B: +7G
* F-35C: +7.5G

Intel 80286

Intel 80286 cpu-info.com

Intel's 286, introduced on February 1, 1982 (originally named the 80286, and also called iAPX 286 in the programmer's manual), was a 16-bit x86 microprocessor with 134,000 transistors.

It was widely used in IBM PC compatible computers during the mid 1980s to early 1990s.

After the initial 6 and 8 MHz releases, it was subsequently scaled up to 12.5 MHz. (AMD and Harris later pushed the architecture to speeds as high as 20 MHz and 25 MHz, respectively.) On average, the 80286 executed about 0.21 instructions per clock. [2] The 6 MHz model operated at 0.9 MIPS, the 10 MHz model at 1.5 MIPS, and the 12 MHz model at 2.66 MIPS.
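The per-model MIPS figures can be turned back into instructions per clock with simple division. This is illustrative arithmetic on the numbers quoted above; the spread around the 0.21 average is present in the source figures themselves.

```python
# Clock speed in MHz -> quoted MIPS for that 80286 model
models = {6: 0.9, 10: 1.5, 12: 2.66}

# Implied instructions per clock: MIPS / MHz
ipc = {mhz: mips / mhz for mhz, mips in models.items()}
# The 6 and 10 MHz figures imply 0.15 IPC; the 12 MHz figure implies about 0.22
```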

The 80286's performance was more than twice that of its predecessors (the Intel 8086 and Intel 8088) per clock cycle. In fact, the performance increase per clock cycle may be the largest among the generations of x86 processors. Calculation of the more complex addressing modes (such as base+index) had less clock penalty because it was performed by a special circuit in the 286; the 8086, its predecessor, had to perform effective address calculation in the general ALU, taking many cycles. Also, complex mathematical operations (such as MUL/DIV) took fewer clock cycles compared to the 8086.

With its 24-bit address bus, the 286 was able to address up to 16 MB of RAM, in contrast to the 1 MB that the 8086 could work with directly. While DOS could utilize this additional RAM (extended memory) via a BIOS call (INT 15h, AH=87h), as a RAM disk, or through emulation of expanded memory, the cost of memory and the initial rarity of software utilizing extended memory meant that 286 computers were rarely equipped with more than a megabyte of RAM.

The 286 was designed to run multitasking applications, including communications (such as automated PBXs), real-time process control, and multi-user systems.

Designer: Intel
Manufacturers: Intel, AMD, Harris, SAB
Introduction date: February 1982
Introduction speed: 6 MHz
Maximum speed: 25 MHz
Cache: -
Transistor count: 134,000
Manufacturing process: 1.5 micron

Super Mario

Nintendo Super Mario

List of Mario games by system

Mario is a video game character created by Japanese game designer Shigeru Miyamoto and the official mascot of Nintendo. He has appeared in over 100 video games since his creation, more than any other character. Originally used for platforming games, he has also found his way into racing games, puzzle games, role-playing games, fighting games, sports games, and many others.

Mario first appeared in the video game Donkey Kong as a character named "Jumpman". The game was surprisingly successful. Mario also starred in an arcade game simply called Mario Bros., and when the Nintendo Entertainment System was released, Mario was given the starring role in the revolutionary Super Mario Bros.

"Jumpman", the protagonist of Donkey Kong, was called "Mario" in certain promotional materials for the game's release overseas. His namesake was Mario Segale, the landlord of Nintendo of America's office, who barged in on a meeting to demand an overdue rent payment. In Paper Mario: The Thousand-Year Door, Mario is given the stage name of "Great Gonzales" during his battles in Glitzville. Before a battle, one of the audience members refers to Mario as "Jumpman," a joke about Mario's first identity. Mario's nickname in Mario Hoops 3-on-3 is "The Jumpman", again making reference to his original name. Mario is currently voiced by Charles Martinet, who also voices Luigi, both their baby counterparts, Wario, Waluigi, and other characters such as Toadsworth.

Mario's appearance is a product of technical restrictions in the mid-1980s: with limited pixels and colors, the programmers could not animate Mario's movement without making his arms "disappear". Making his shirt a solid color and giving him overalls fixed this. They also did not have the space to give him a mouth or ears, and they could not animate hair, so Mario got a moustache, sideburns, and a cap to bypass these problems. Mario's creator, Shigeru Miyamoto, has stated when interviewed that Mario wears a cap because he finds it difficult to draw hair.

The surname "Mario" (which would make his full name Mario Mario) was first used in The Super Mario Bros. Super Show, and then in the 1993 feature film Super Mario Bros. This was meant to explain how both Mario and his brother Luigi could be known as the "Mario brothers", and it was later backed up by Nintendo of Europe's official Mario Megasite. However, the surname has never been employed officially by Nintendo of America, which holds that Mario and Luigi are collectively called the Mario Bros. simply because Mario is the headliner of the pair. No evidence can be found at this time on Nintendo of Japan's or Shigeru Miyamoto's position on the matter.

Mario has taken on the role of Nintendo's mascot and has since been extensively merchandised. Mario's major rival was Sega mascot Sonic the Hedgehog who debuted in the early 1990s; the two mascots competed head-to-head for nearly a decade afterwards, until around 2001 when a Sonic game (Sonic Adventure 2: Battle) showed up on a Nintendo console due to Sega's new third party status, ending a lengthy rivalry. Mario and Sonic officially appeared together in a crossover sports game, Mario & Sonic at the Olympic Games, and will be together again in Nintendo's Super Smash Bros. Brawl. Mario was one of the first video game characters to be honored at the Walk of Game in 2005, alongside Link and Sonic the Hedgehog.

Birthday Attack

A birthday attack is a type of cryptographic attack, so named because it exploits the mathematics behind the birthday paradox. Given a function f, the goal of the attack is to find two inputs x1, x2 such that f(x1) = f(x2). Such a pair x1, x2 is called a collision. The method used to find a collision is to simply evaluate the function f for different input values that may be chosen randomly or pseudorandomly until the same result is found more than once. Because of the birthday paradox, this method can be rather efficient. Specifically, if a function f(x) yields any of H different outputs with equal probability and H is sufficiently large, then we expect to obtain a pair of different arguments x1 and x2 with f(x1) = f(x2) after evaluating the function for about 1.25·√H different arguments on average.
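The collision search described above can be sketched in a few lines. This is a toy example: MD5 truncated to 16 bits stands in for f, so H = 2^16 and a collision is expected after roughly 1.25·√H ≈ 320 evaluations; the choice of hash and input encoding are illustrative, not part of any real attack.

```python
import hashlib

def f(x: int) -> str:
    # Toy hash with H = 2**16 possible outputs: first 4 hex digits of MD5
    return hashlib.md5(str(x).encode()).hexdigest()[:4]

def find_collision():
    seen = {}  # maps each output to the first input that produced it
    x = 0
    while True:
        h = f(x)
        if h in seen:
            return seen[h], x  # two different inputs, same output
        seen[h] = x
        x += 1

x1, x2 = find_collision()
# f(x1) == f(x2) although x1 != x2
```

For a full-size hash such as SHA-256 (H = 2^256) the same strategy needs about 2^128 evaluations, which is why doubling the output length relative to the brute-force security level defeats the attack.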

It is easy to see that if the outputs of the function are distributed unevenly, then a collision can be found even faster. The notion of 'balance' of a hash function quantifies the resistance of the function to birthday attacks and allows the vulnerability of popular hashes such as MD and SHA to be estimated.

Digital signatures can be susceptible to a birthday attack. A message m is typically signed by first computing f(m), where f is a cryptographic hash function, and then using some secret key to sign f(m). Suppose Alice wants to trick Bob into signing a fraudulent contract. Alice prepares a fair contract m and a fraudulent one m'. She then finds a number of positions where m can be changed without changing the meaning, such as inserting commas, empty lines, one versus two spaces after a sentence, replacing synonyms, etc. By combining these changes, she can create a huge number of variations on m which are all fair contracts. In a similar manner, she also creates a huge number of variations on the fraudulent contract m'. She then applies the hash function to all these variations until she finds a version of the fair contract and a version of the fraudulent contract which have the same hash value, f(m) = f(m'). She presents the fair version to Bob for signing. After Bob has signed, Alice takes the signature and attaches it to the fraudulent contract. This signature then "proves" that Bob signed the fraudulent contract. This differs slightly from the original birthday problem, as Alice gains nothing by finding two fair or two fraudulent contracts with the same hash. Alice's optimum strategy is to generate "pairs" of one fair and one fraudulent contract. Then Alice compares each freshly-generated pair to all other pairs; that is, she compares the new fair hash to all previous fraudulent hashes, and the new fraudulent contract to all previous fair hashes (but doesn't bother comparing fair hashes to fair or fraudulent to fraudulent). The birthday problem equations apply where "n" is the number of pairs. (The number of hashes Alice actually generates is 2n.)

To avoid this attack, the output length of the hash function used for a signature scheme can be chosen large enough so that the birthday attack becomes computationally infeasible, i.e. about twice as many bits as are needed to prevent an ordinary brute force attack.

Pollard's rho algorithm for logarithms is an example of an algorithm that uses a birthday attack to compute discrete logarithms.


MiniDisc

A MiniDisc (MD) is a magneto-optical disc-based data storage device initially intended for storage of up to 80 minutes of digitized audio. Today, in the form of Hi-MD, it has developed into a general-purpose storage medium in addition to greatly expanding its audio roots.

MiniDisc was announced by Sony in 1991 and introduced January 12, 1992. The music format was originally based exclusively on ATRAC audio compression. Recently, the option of linear PCM recording was introduced to attain truly CD-quality recordings. MiniDiscs are popular in Japan as a digital upgrade to cassette tapes, but have not been as popular world-wide.

The Sony MZ-NHF800, a 2004 Hi-MD model.

In January 2004, Sony announced the Hi-MD media storage format. With its release in late 2004 came the ability to use newly developed, high-capacity 1-gigabyte Hi-MD discs, sporting the same dimensions as regular MiniDiscs.

Amiga 500

Amiga 500 Specs and Photos

The Amiga 500, also known as the A500, was the first "low-end" Commodore Amiga 16/32-bit multimedia home/personal computer. It was announced at the winter Consumer Electronics Show in January 1987, at the same time as the high-end Amiga 2000, and competed directly against the Atari 520ST. The A500 was released in mid-1987 at a price of US$595.95 without a monitor. However, the high-end/low-end distinction did not truly apply until the advent of the A3000 and the AGA systems.

The original A500 proved to be Commodore’s best-selling Amiga model, enjoying particular success in Europe. Although popular with hobbyists, arguably its most widespread use was as a gaming machine, where its advanced graphics and sound were of significant benefit.

Case Type: Computer in a keyboard
Processor: Motorola 68000 @ 7.14 MHz
MMU: None
FPU: None
Chipset: OCS (more common) or ECS
Standard Chip RAM: 512 KB
RAM sockets: None
Hard Drive Controllers: None
Drive Bays: 1 × Custom Floppy Drive Bay



OSx86

OSx86 is a collaborative hacking project to run the Mac OS X computer operating system on non-Apple personal computers with x86 architecture processors. The effort started soon after the June 2005 Worldwide Developers Conference announcement that Apple would be transferring their personal computers from PowerPC to Intel microprocessors.

OSx86 is a portmanteau of OS X and x86. A computer built to run this type of Mac OS X is sometimes known as a Hackintosh, which is a recycled term originally denoting the modified Lisa 2/10 running Mac System.

Initial efforts revolved around leaked copies of the Development DVD that was released by Apple as part of the Developer Transition Kit that Apple made available to developers for $999. The first patches centered around circumventing the Trusted Platform Module (TPM) that was included on the motherboard of the Developer Transition Kits. The TPM was required by the Rosetta technology that allowed software compiled for the PowerPC architecture to run on Intel-based architecture. Removing this requirement allowed Mac OS X to be installed on non-Apple computers. Rosetta also required microprocessors that included SSE3 instructions. Patches were released to the community that emulated these instructions with SSE2 equivalents and allowed the installation on machines without SSE3 support (with a performance penalty).

In October 2005 Apple released a 10.4.3 update to developers that required NX bit microprocessor support. Patches were released to circumvent this.


Chaos Group V-Ray

V-Ray is a rendering engine that is used as an extension of certain 3D computer graphics software.

The core developers of V-Ray are Vladimir Koylazov and Peter Mitev of Chaos Software, a production studio established in 1997 and based in Sofia, Bulgaria.

It is a rendering engine that uses advanced techniques, for example global illumination algorithms such as path tracing, photon mapping, irradiance maps, and directly computed global illumination. The use of these techniques often makes it preferable to the conventional renderers provided as standard with 3D software, and renders produced with these techniques generally appear more photo-realistic to the human eye, as actual lighting effects are more realistically emulated. The engine is known to increase the necessary computational power and rendering times, due to the complicated nature and volume of the calculations required.

Reyes Rendering

Reyes Rendering Wikipedia

Reyes rendering is a computer software architecture used in 3D computer graphics to render photo-realistic images. It was developed in the mid-1980s by Lucasfilm's Computer Graphics Research Group, which is now Pixar. It was first used in 1982 to render images for the Genesis effect sequence in the movie Star Trek II: The Wrath of Khan. Pixar's PhotoRealistic RenderMan is one implementation of the Reyes algorithm. According to the original paper describing the algorithm, the Reyes image rendering system is "An architecture ... for fast high-quality rendering of complex images." Reyes was proposed as a collection of algorithms and data processing systems. However, the terms "algorithm" and "architecture" have come to be used synonymously and are used interchangeably in this article.

Reyes is an acronym for Renders Everything You Ever Saw (the name is also a pun on Point Reyes, California, near where Lucasfilm was located) and is suggestive of processes connected with optical imaging systems.
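The pipeline the Reyes paper describes (bound, split, dice into micropolygons, shade, sample) can be caricatured in a few lines. Everything below, from the patch representation to the grid size, is an illustrative assumption, not code from any real Reyes renderer.

```python
GRID_LIMIT = 16  # dice no finer than a 16x16 micropolygon grid per patch

def bound(patch):
    # Screen-space bounding box of a patch given as a list of corner points
    xs = [p[0] for p in patch]
    ys = [p[1] for p in patch]
    return min(xs), min(ys), max(xs), max(ys)

def dice(patch, n):
    # Dice an axis-aligned rectangular patch into an (n+1) x (n+1) vertex grid
    (x0, y0), (x1, y1) = patch[0], patch[2]
    return [[(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * j / n)
             for i in range(n + 1)]
            for j in range(n + 1)]

def shade(grid):
    # Placeholder per-vertex shading: a constant grey for every micropolygon vertex
    return [[(0.5, 0.5, 0.5) for _ in row] for row in grid]

patch = [(0, 0), (4, 0), (4, 4), (0, 4)]  # a small square patch in screen space
grid = dice(patch, GRID_LIMIT)
colors = shade(grid)
```

A real Reyes renderer would split patches whose bounds are too large before dicing, shade with programmable shaders, and then point-sample the shaded micropolygons against a jittered sample grid; the sketch only shows the skeleton of that flow.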

Reyes Renderers

The following renderers use the Reyes algorithm in one way or another, or at least allow users to select it to produce their images:

* Digits 'n Art's 3Delight
* Aqsis
* jrMan
* Pixar's RenderMan Pro Server & RenderMan for Maya
* Pixels 3d Renderer
* Pixie
* DotC Software's RenderDotC
* SideFX's VMantra
* e frontier Poser's FireFly



OpenEXR

OpenEXR is a high-dynamic-range (HDR) image file format, released as an open standard along with a set of software tools created by Industrial Light and Magic, under a free software license similar to the BSD license.

It is notable for supporting 16-bits-per-channel floating point values (half precision), with a sign bit, five bits of exponent, and a ten-bit mantissa. This allows a dynamic range of over thirty stops of exposure.
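The 1-5-10 bit layout can be inspected directly with Python's struct module, which supports IEEE 754 half precision via the 'e' format code. The thirty-stop figure follows from the ratio of the largest finite half value, 65504, to the smallest positive normal value, 2^-14.

```python
import math
import struct

def half_bits(value: float) -> int:
    # Bit pattern of value encoded as IEEE 754 half precision (1+5+10 bits)
    return struct.unpack('<H', struct.pack('<e', value))[0]

# 1.0 encodes as sign 0, exponent 01111 (bias 15), mantissa 0 -> 0x3C00
bits_one = half_bits(1.0)

# Dynamic range in photographic stops: log2(max finite / min positive normal)
stops = math.log2(65504 / 2**-14)   # close to 30 stops
```

Denormalized halves extend the usable range a further ten stops at reduced precision, which is one reason the format is well suited to storing linear scene-referred light values.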

OpenEXR is directly supported by Artizen HDR, Combustion, Smoke 2008, Blender, CinePaint, Cinelerra, Houdini, LightWave, modo, After Effects 7 Professional, Mental Ray, PRMan, Digital Fusion, Nuke, Toxik, Shake, Photoshop CS2, CINEMA 4D, Pixel Image Editor, and Synfig. It is also supported by the Cg programming language and by Mac OS X as of version 10.4.

Both lossless and lossy compression of high-dynamic-range data are supported.

OpenEXR, or simply EXR for short, is a deep raster format developed by ILM that is used very broadly in the CG industry, in both visual effects and animation.

OpenEXR's multi-resolution and arbitrary channel format makes it appealing for compositing. OpenEXR alleviates several painful elements of the compositing process. Since it can store arbitrary channels in one file, such as specular, diffuse, alpha, RGB, normals, and various others, it removes the need to store this information in separate files. The multi-channel concept also reduces the necessity to "bake" the aforementioned data into the final image. If a compositor is not happy with the current level of specularity, he or she can adjust that specific channel.
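A toy illustration of that workflow (ordinary Python with made-up pass names and values, not the OpenEXR API): because the diffuse and specular passes live in separate channels of one file, the specular contribution can be rescaled at composite time instead of being baked in:

```python
# One pixel's lighting passes, as they might be stored as channels in one file.
pixel_passes = {
    "diffuse":  [0.30, 0.25, 0.20],   # RGB
    "specular": [0.40, 0.40, 0.40],
}

def composite(passes, specular_gain=1.0):
    """Recombine the passes into a final RGB pixel, scaling specular on the fly."""
    return [d + specular_gain * s
            for d, s in zip(passes["diffuse"], passes["specular"])]

full = composite(pixel_passes)                     # full-strength highlights
dull = composite(pixel_passes, specular_gain=0.5)  # toned down, no re-render
```

Had the renderer baked diffuse and specular into a single RGB image, the only way to change the highlights would be a full re-render.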

OpenEXR's straightforward API also makes tool development relatively easy. Since almost no two production pipelines are the same, custom tools always need to be developed to address problems in the production process, and many of these tools deal with some type of image-manipulation issue. OpenEXR's library removes the pain of managing bulky header information and allows quick and easy access to the image's attributes, such as tiles and channels.


Laserdisc Wikipedia

Laserdisc (LD) was the first commercial optical disc storage medium, and was used primarily for movies for home viewing.

During its development, MCA, which owned the technology, referred to it as the Reflective Optical Videodisc System; it changed the name in 1969 to Disco-Vision and then again in 1978 to DiscoVision (without the hyphen), which became the official spelling. MCA owned the rights to the largest catalog of films in the world during this time, and it manufactured and distributed the DiscoVision releases of those films under the "MCA DiscoVision" label beginning on December 15, 1978.

Laserdisc (left) compared to a DVD/CD (right).

Pioneer Electronics also entered the optical disc market in 1978, manufacturing players and printing discs under the name Laser Videodisc. In 1980 the name was compressed into LaserDisc, and in 1981 the intercap was eliminated; "Laserdisc" became the final and common name for the format, supplanting "DiscoVision", which disappeared shortly thereafter. Titles released by MCA became MCA Laserdiscs or (later) MCA-Universal Laserdiscs. The format has been incorrectly referred to as LV or LaserVision, although that name actually refers to a line of Philips-brand players; the term VDP, or Video Disc Player, was a somewhat more common and more correct name for players in general.

During the early years, MCA also manufactured discs for other companies, including Paramount, Disney and Warner Bros. Some of them added their own names to the disc jacket to signify that the movie was not owned by MCA. When MCA merged into Universal years later, Universal began reissuing many of the early DiscoVision titles as MCA-Universal discs. The DiscoVision versions had largely been available only in pan and scan and had often utilized poor transfers; the newer versions improved greatly in terms of both audio and video quality.


UMD Wikipedia

The Universal Media Disc (UMD) is an optical disc medium developed by Sony for use on the PlayStation Portable. It can hold up to 1.8 gigabytes of data. It is considered the first optical disc format to be used for a handheld video game system.

ECMA-365: Data Interchange on 60 mm Read-Only ODC – Capacity: 1.8 GB (UMD)

* Dimensions: approx. 65 mm (W) × 64 mm (D) × 4.2 mm (H)
* Maximum capacity: 1.80 GB (dual layer), 900 MB (single-layer)
* Laser wavelength: 660 nm (red laser)
* Encryption: AES 128-bit

Despite Sony's efforts, the UMD format has been cracked. Using a combination of insecure firmware and reverse engineering, the Sony PSP can now run a variety of homebrew games and backup ISO images. Each disc uses a file system that follows the ISO 9660 standard. An ISO image can be stored on a Memory Stick and run via a special disc-emulator program, such as Devhook. The images cannot be burned to UMD discs, as writable UMDs and burners are not available. The same game loads much faster, and uses less power, when stored as an ISO image on a Memory Stick as opposed to the original UMD.
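The ISO 9660 layout mentioned above is easy to recognize programmatically: the primary volume descriptor sits at sector 16 (with 2048-byte sectors) and carries the standard identifier "CD001" at byte offset 1. The sketch below (the helper name is ours, not from any tool named here) checks for it against a synthetic in-memory image:

```python
SECTOR = 2048

def looks_like_iso9660(data: bytes) -> bool:
    """True if the buffer has the ISO 9660 "CD001" magic at sector 16, offset 1."""
    offset = 16 * SECTOR
    return data[offset + 1 : offset + 6] == b"CD001"

# Build a minimal fake image in memory just to exercise the check.
image = bytearray(17 * SECTOR)
image[16 * SECTOR] = 1                               # descriptor type: primary
image[16 * SECTOR + 1 : 16 * SECTOR + 6] = b"CD001"  # standard identifier
assert looks_like_iso9660(bytes(image))
assert not looks_like_iso9660(bytes(17 * SECTOR))    # all-zero buffer fails
```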

Sony has attempted to halt this type of exploitation by updating the firmware. Versions 1.51 and later of the PSP firmware have attempted to patch the exploit. Recent games also come with a 'software switch' that forces users to update before the game can be played. This too has been circumvented: some applications for 1.50 report the firmware version as being more recent than it actually is (firmware spoofing), bypassing the need to update. This has since been fixed by Sony and no longer works. Firmware versions 1.50 to 3.73 have been decrypted, and 1.50, 2.71, 3.02, 3.03, 3.10, 3.30, 3.40, 3.51, 3.52, 3.60, 3.71, 3.72, 3.73 and 3.80 (as of December 2007) have been converted into custom firmwares. These firmwares allow people to run ISOs that they own from the XMB interface, in addition to other available homebrew.


DTS (also known as Digital Theater Systems), owned by DTS, Inc., is a multi-channel digital surround sound format used for both commercial/theatrical and consumer-grade applications. It is used for in-movie sound both on film and on DVD, and during the last few years of the Laserdisc format's existence, several Laserdisc releases had DTS soundtracks.

One of the company's initial investors was film director Steven Spielberg, who felt that theatrical sound formats up until the company's founding were no longer state of the art, and as a result were no longer optimal for use on projects where quality sound reproduction was of the utmost importance. Work on the format started in 1991, four years after Dolby Labs started work on its new codec, Dolby Digital. The basic and most common version of the format is a 5.1 channel system, similar to a Dolby Digital setup, which encodes the audio as five primary (full-range) channels plus a special LFE (low-frequency effect) channel, for the subwoofer.

Other newer DTS variants are also currently available, including versions that support up to seven primary audio channels plus one LFE channel (DTS-ES). DTS's main competitors in multichannel theatrical audio are Dolby Digital and SDDS, although only Dolby Digital and DTS are used on DVDs and implemented in home theater hardware. Spielberg debuted the format with his 1993 production of Jurassic Park, which came slightly less than a year after the official theatrical debut of Dolby Digital (Batman Returns). Jurassic Park also became the first home video release to contain DTS sound when it was released on Laserdisc in January 1997, two years after the first Dolby Digital home video release (Clear and Present Danger on Laserdisc), which debuted in January 1995.

Jack Higgins

Jack Higgins Blog

Jack Higgins (b. July 27, 1929) is the principal pseudonym of UK novelist Harry Patterson. Higgins is the author of more than sixty novels. Most have been thrillers of various types and, since his breakthrough novel The Eagle Has Landed in 1975, nearly all have been bestsellers.

Non-Series Novels

* Sad Wind from the Sea (1959) (writing as Harry Patterson)
* Cry of the Hunter (1960) (writing as Harry Patterson)
* The Thousand Faces of Night (1961) (writing as Harry Patterson)
* Comes the Dark Stranger (1962) (writing as Harry Patterson)
* Hell Is Too Crowded (1962) (writing as Harry Patterson)
* The Dark Side of the Island (1963) (writing as Harry Patterson)
* Pay the Devil (1963) (writing as Harry Patterson)
* Seven Pillars to Hell (1963) (writing as Hugh Marlowe) aka Sheba
* Thunder At Noon (1964) (writing as Harry Patterson) aka Dillinger
* Passage By Night (1964) (writing as Hugh Marlowe)
* Wrath of the Lion (1964) (writing as Harry Patterson)
* A Phoenix in the Blood (1964) (writing as Harry Patterson)
* A Candle for the Dead (1966) (writing as Hugh Marlowe) aka The Violent Enemy
* The Iron Tiger (1966) (writing as Harry Patterson)
* East of Desolation (1968)
* In the Hour Before Midnight (1969) aka The Sicilian Heritage
* A Game for Heroes (1970) (writing as James Graham)
* Night Judgement At Sinos (1970)
* The Last Place God Made (1971)
* Toll for the Brave (1971) (writing as Harry Patterson)
* The Wrath of God (1971) (writing as James Graham)
* The Savage Day (1972)
* The Khufra Run (1972) (writing as James Graham)
* A Prayer for the Dying (1973)
* The Run to Morning (1974) (writing as James Graham) aka Bloody Passage
* Storm Warning (1976)
* The Valhalla Exchange (1976)
* To Catch a King (1979) (writing as Harry Patterson) aka The Judas Gate
* Solo (1980) aka The Cretan Lover
* Luciano's Luck (1981)
* Exocet (1983)
* A Season in Hell (1988)
* Memoirs of a Dance Hall Romeo (1989)
* Flight of Eagles (1998)
* Sure Fire (2006) (with Justin Richards)


A mid-infrared image of the debris disk around Vega. Spitzer Space Telescope/NASA.

Vega is the brightest star in the constellation Lyra, the fifth brightest star in the night sky and the second brightest star in the northern celestial hemisphere, after Arcturus. It is a relatively nearby star at only 25.3 light years from Earth, and, together with Arcturus and Sirius, one of the most luminous stars in the Sun's neighborhood.

Vega has been extensively studied by astronomers, leading it to be termed "arguably the next most important star in the sky after the Sun". Historically, Vega served as the pole star around 12,000 BCE and will do so again around 14,000 CE. Vega was the first star other than the Sun to have its photograph taken, and the first to have its spectrum photographed. It was also one of the first stars to have its distance estimated through parallax measurements. Vega has served as the baseline for calibrating the photometric brightness scale, and was one of the stars used to define the mean values for the UBV photometric system.

This star is relatively young compared to the Sun. It has an unusually low abundance of elements with higher atomic numbers than helium. Vega is also a suspected variable star that may vary slightly in magnitude in a periodic manner. It rotates rapidly, with a velocity of 274 km/s at the equator. This causes the equator to bulge outward because of centrifugal effects and, as a result, there is a variation of temperature across the star's photosphere that reaches its maximum at the poles. From the Earth, Vega is observed from the direction of one of these poles.

Based upon an excess emission of infrared radiation, Vega has a circumstellar disk of dust. This dust is likely the result of collisions between objects in an orbiting debris disk, which is analogous to the Kuiper belt in the Solar System. Stars that display an infrared excess because of dust emission are termed Vega-like stars. Irregularities in Vega's disk also suggest the presence of at least one planet, likely to be about the size of Jupiter, in orbit around Vega.


H.264 Wikipedia
IEEE Overview of the H.264/AVC Video Coding Standard (pdf)

H.264 is a standard for video compression. It is also known as MPEG-4 Part 10, or MPEG-4 AVC (for Advanced Video Coding). It was written by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC Moving Picture Experts Group (MPEG) as the product of a partnership effort known as the Joint Video Team (JVT). The ITU-T H.264 standard and the ISO/IEC MPEG-4 Part 10 standard (formally, ISO/IEC 14496-10) are jointly maintained so that they have identical technical content. The final drafting work on the first version of the standard was completed in May 2003.

The intent of the H.264/AVC project was to create a standard capable of providing good video quality at substantially lower bit rates than previous standards (e.g. half or less the bit rate of MPEG-2, H.263, or MPEG-4 Part 2), without increasing the complexity of design so much that it would be impractical or excessively expensive to implement. An additional goal was to provide enough flexibility to allow the standard to be applied to a wide variety of applications on a wide variety of networks and systems, including low and high bit rates, low and high resolution video, broadcast, DVD storage, RTP/IP packet networks, and ITU-T multimedia telephony systems.

The standardization of the first version of H.264/AVC was completed in May 2003. The JVT then developed extensions to the original standard that are known as the Fidelity Range Extensions (FRExt). These extensions enable higher-quality video coding by supporting increased sample bit depth precision and higher-resolution color information, including the sampling structures known as YUV 4:2:2 and YUV 4:4:4. Several other features are also included in the Fidelity Range Extensions project, such as adaptive switching between 4×4 and 8×8 integer transforms, encoder-specified perceptual-based quantization weighting matrices, efficient inter-picture lossless coding, and support for additional color spaces. The design work on the Fidelity Range Extensions was completed in July 2004, and the drafting work on them was completed in September 2004.

Further recent extensions of the standard have included adding five new profiles intended primarily for professional applications, adding extended-gamut color space support, defining additional aspect ratio indicators, defining two additional types of "supplemental enhancement information" (post-filter hint and tone mapping), and deprecating one of the prior FRExt profiles that industry feedback indicated should have been designed differently.

The H.264 name follows the ITU-T naming convention, where the standard is a member of the H.26x line of VCEG video coding standards; the MPEG-4 AVC name relates to the naming convention in ISO/IEC MPEG, where the standard is part 10 of ISO/IEC 14496, which is the suite of standards known as MPEG-4. The standard was developed jointly in a partnership of VCEG and MPEG, after earlier development work in the ITU-T as a VCEG project called H.26L. It is thus common to refer to the standard with names such as H.264/AVC, AVC/H.264, H.264/MPEG-4 AVC, or MPEG-4/H.264 AVC, to emphasize the common heritage. The name H.26L, referring to its ITU-T history, is less common, but still used. Occasionally, it is also referred to as "the JVT codec", in reference to the Joint Video Team (JVT) organization that developed it. (Such partnership and multiple naming is not uncommon—for example, the video codec standard known as MPEG-2 also arose from the partnership between MPEG and the ITU-T, where MPEG-2 video is known to the ITU-T community as H.262.)



Stage6 is a video sharing website first launched by DivX, Inc. in 2006 and currently in public beta. It is similar to sites like YouTube in allowing streaming video to be uploaded freely by anyone who is willing to register. Significant differences between Stage6 and other sites include better quality video through use of higher resolutions up to 1080p, few upload limitations, and the option to download media directly through the website or the DivX Web Player without the need to install browser extensions.

Unlike most video sharing websites, Stage6 requires the installation of the DivX Web Player to view videos. Since the DivX Web Player is designed specifically for viewing videos, streaming of extremely high quality, high resolution videos is made possible with low CPU overhead. The DivX content uploader is also bundled with the Web Player, enabling users to upload Stage6 compliant videos via web browser.

Stage6 accepts DivX- or Xvid-encoded files up to 1080p60. Under certain circumstances, Stage6 will reject videos encoded with unsupported audio formats. The upload size limit is under 2 gigabytes per video.

On July 24, 2007, DivX, Inc. announced that it would seek to separate Stage6 as a company from the rest of DivX, Inc. Co-founder and executive chairman Jordan Greenhall will switch from his current role as CEO to manage the separating Stage6; the separation, if successful, is expected to be completed later in 2007.

Tom and Jerry

Tom and Jerry is an Academy Award-winning animated cartoon series of Metro-Goldwyn-Mayer theatrical short subjects created, written and directed by animators William Hanna and Joseph Barbera. One hundred and fourteen Tom and Jerry cartoons were produced by the MGM cartoon studio in Hollywood from 1940 until 1957, when the animation unit was closed down. These shorts are notable for having won seven Academy Awards for Best Short Subject (Cartoons), tying the series with Walt Disney's Silly Symphonies as the most-awarded theatrical animated series. It is widely considered one of the best animated cartoon series ever.

The following cartoons won the Academy Award (Oscar) for Best Short Subject: Cartoons:

* 1943: The Yankee Doodle Mouse
* 1944: Mouse Trouble
* 1945: Quiet Please!
* 1946: The Cat Concerto
* 1948: The Little Orphan
* 1951: The Two Mouseketeers
* 1952: Johann Mouse

These cartoons were nominated for the Academy Award (Oscar) for Best Short Subject: Cartoons, but did not win:

* 1940: Puss Gets the Boot
* 1941: The Night Before Christmas
* 1947: Dr. Jekyll and Mr. Mouse
* 1949: Hatch Up Your Troubles
* 1950: Jerry's Cousin
* 1954: Touché, Pussy Cat!

These cartoons were nominated for the Annie Award in the Individual Achievements Category: Character Animation, but did not win:

* 1946: Springtime for Thomas
* 1955: That's My Mommy
* 1956: Muscle Beach Tom
* 2005: The KarateGuard

Motorola 68000

Pre-release XC68000 chip manufactured in 1979.

The Motorola 68000 is a 16/32-bit CISC microprocessor core designed and marketed by Freescale Semiconductor (formerly Motorola Semiconductor Products Sector). Introduced in 1979 as the first member of the successful 32-bit m68k family of microprocessors, it is generally software forward compatible with the rest of the line despite belonging to the 16-bit hardware technology generation. After twenty-seven years in production, the 68000 architecture remains a popular choice for new designs.

The 68000 grew out of the MACSS (Motorola Advanced Computer System on Silicon) project, begun in 1976 to develop an entirely new architecture without backward compatibility. It would be a higher-power sibling complementing the existing 8-bit 6800 line rather than a compatible successor. In the end, the 68000 did retain a bus protocol compatibility mode for existing 6800 peripheral devices, and a version with an 8-bit data bus was produced. However, the designers mainly focused on the future, or forward compatibility, which gave the M68K platform a head start against later 32-bit instruction set architectures. For instance, the CPU registers are 32 bits wide, though few self-contained structures in the processor itself operate on 32 bits at a time. The 68000 may be considered a 16-bit microprocessor which is microcoded to accelerate 32-bit tasks. The MACSS team drew heavily on the influence of minicomputer processor design, such as the PDP-11 and VAX systems, which were similarly microcoded.
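The "microcoded to accelerate 32-bit tasks" point can be made concrete with a sketch (plain Python standing in for microcode, not actual 68000 internals): a 32-bit add on 16-bit hardware is just two 16-bit adds chained through a carry:

```python
def add32_via_16bit_alu(a: int, b: int) -> int:
    """Add two 32-bit values using only 16-bit operations plus a carry."""
    lo = (a & 0xFFFF) + (b & 0xFFFF)               # low halves
    carry = lo >> 16                               # carry out of the low add
    hi = ((a >> 16) + (b >> 16) + carry) & 0xFFFF  # high halves plus carry
    return (hi << 16) | (lo & 0xFFFF)

assert add32_via_16bit_alu(0x0001FFFF, 0x00000001) == 0x00020000
assert add32_via_16bit_alu(0xFFFFFFFF, 0x00000001) == 0  # wraps modulo 2**32
```

This is why the 68000 could present 32-bit registers to software while the underlying datapath generation was 16-bit: wider operations simply take extra internal steps.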

In the mid-1970s, the 8-bit processor manufacturers raced to introduce the 16-bit generation. National Semiconductor had been first with its IMP-16 and PACE processors in 1973-1975, but these had issues with speed. The Intel 8086, introduced in 1978, quickly gained popularity. The decision to leapfrog the competition and introduce a hybrid 16/32-bit design was necessary, and Motorola turned it into a coherent mission. Arriving late to the 16-bit arena afforded the new processor more integration (roughly 70,000 transistors against the 29,000 in the 8086), higher performance per clock, and acclaimed general ease of use.

The original MC68000 was fabricated using an HMOS process with a 3.5-micron feature size. Initial engineering samples were released in late 1979. Production chips were available in 1980, with initial speed grades of 4, 6, and 8 MHz. 10 MHz chips became available during 1981, and 12.5 MHz chips during 1982. The 16.67 MHz "12F" version of the MC68000, the fastest version of the original HMOS chip, was not produced until the late 1980s.

The 68000 had many high-end design wins early on. It became the dominant CPU for Unix-based workstations, found its way into heralded computers such as the Amiga, Atari ST, Apple Lisa and Macintosh, and was used in the first generation of desktop laser printers. In 1982, the 68000 received an update to its ISA allowing it to support virtual memory by conforming to the Popek and Goldberg virtualization requirements. The updated chip was called the 68010. A further extended version, which exposed 31 bits of the address bus, was also produced in small quantities as the 68012.

To support lower-cost systems and control applications with smaller memory sizes, Motorola introduced the 8-bit compatible MC68008, also in 1982. This was a 68000 with an 8-bit data bus and a smaller (20-bit) address bus. After 1982, Motorola devoted more attention to the 68020 and 88000 projects.


Serial Advanced Technology Attachment (SATA) is a computer bus primarily designed for the transfer of data between a computer and storage devices (such as hard disk drives or optical drives).

The main benefits are faster transfers, ability to remove or add devices while operating (hot swapping), thinner cables that let air cooling work more efficiently, and more reliable operation with tighter data integrity checks than the older Parallel ATA interface.

It was designed as a successor to the legacy Advanced Technology Attachment standard (ATA), and is expected to eventually replace the older technology (retroactively renamed Parallel ATA or PATA). Serial ATA adapters and devices communicate over a high-speed serial cable.

Standardized in mid-2004, eSATA defined separate cables, connectors, and revised electrical requirements for external applications:

* Minimum transmit potential increased: Range is 500–600 mV instead of 400–600 mV.
* Minimum receive potential decreased: Range is 240–600 mV instead of 325–600 mV.
* Identical protocol and logical signaling (link/transport-layer and above), allowing native SATA devices to be deployed in external enclosures with minimal modification
* Maximum cable length of 2 m (USB and FireWire allow longer distances.)

eSATA enters an external storage market already served by the USB and FireWire interfaces. Most external hard disk drive cases with FireWire or USB interfaces use either PATA or SATA drives and "bridges" to translate between the drives' interfaces and the enclosures' external ports, and this bridging incurs some inefficiency. Some single disks can transfer almost 120 MB/s during real use, more than twice the maximum transfer rate of USB 2.0 or FireWire 400 (IEEE 1394a) and well in excess of the maximum transfer rate of FireWire 800, though the S3200 FireWire 1394b spec reaches ~400 MB/s. Finally, some low-level drive features, such as S.M.A.R.T., are not usable through USB or FireWire bridging. eSATA does not suffer from these issues.

Macintosh XL

Macintosh XL Specifications

The Macintosh XL was a modified version of the Apple Lisa personal computer made by Apple Computer. In the Macintosh XL configuration, the computer shipped with MacWorks XL, a Lisa program that allowed 64K Macintosh ROM emulation. An identical machine was sold as the Lisa 2/10 with the Lisa OS only.

The Macintosh XL had a 400K 3.5" floppy drive and an internal 10 MB proprietary Widget hard drive, with provision for an optional 5 or 10 MB external ProFile hard drive. At the time of release, the Macintosh XL was colloquially referred to as the "Hackintosh", although this name has also been used more generally to describe Macintosh computers assembled from unusual combinations of parts.

Processor: 68000, 5 MHz
FPU: none
Data Path: 16-bit, 5 MHz
L1 Cache: none
L2 Cache: none
2nd Processor: none
Slots: 3 Lisa slots

Logic Board: none
RAM Slots: 2, Lisa cards
Min - Max RAM: 0.5 MB - 2 MB
RAM Sizes: 512 K
Install in Groups of: 1

Wednesday, December 5, 2007


The Panopticon is a type of prison building designed by English philosopher Jeremy Bentham in the late eighteenth century. The concept of the design is to allow an observer to observe (-opticon) all (pan-) prisoners without the prisoners being able to tell if they are being observed or not, thus conveying a "sentiment of an invisible omniscience." In his own words, Bentham described the Panopticon as "a new mode of obtaining power of mind over mind, in a quantity hitherto without example."

The architecture incorporates a tower central to a circular building that is divided into cells, each cell extending the entire thickness of the building to allow inner and outer windows. The occupants of the cells are thus backlit, isolated from one another by walls, and subject to scrutiny both collectively and individually by an observer in the tower who remains unseen. Toward this end, Bentham envisioned not only venetian blinds on the tower observation ports but also maze-like connections among tower rooms to avoid glints of light or noise that might betray the presence of an observer (Ben and Marthalee Barton).

"Morals reformed - health preserved - industry invigorated - instruction diffused - public burthens lightened - Economy seated, as it were, upon a rock - the gordian knot of the poor-law not cut, but untied - all by a simple idea in Architecture!"

Bentham derived the idea from the plan of a military school in Paris designed for easy supervision, itself conceived by his brother Samuel, who arrived at it as a solution to the complexities involved in the handling of large numbers of men. Bentham supplemented this principle with the idea of contract management; that is, an administration by contract as opposed to trust, in which the director would have a pecuniary interest in lowering the average rate of mortality. The Panopticon was intended to be cheaper than the prisons of his time, as it required fewer staff; "Allow me to construct a prison on this model," Bentham requested of a Committee for the Reform of Criminal Law, "I will be the gaoler. You will see ... that the gaoler will have no salary -- will cost nothing to the nation." As the watchmen cannot be seen, they need not be on duty at all times, effectively leaving the watching to the watched. According to Bentham's design, the prisoners would also be used as menial labour, walking on wheels to spin looms or run a water wheel. This would decrease the cost of the prison and give a possible source of income.

Bentham devoted a large part of his time, and almost his whole fortune, to promoting the construction of a prison based on his scheme. After many years and innumerable political and financial difficulties, he eventually obtained a favourable sanction from Parliament for the purchase of a place to erect the prison, but in 1811, after Prime Minister Spencer Perceval (1809-1812) refused to authorise the purchase of the land, the project was finally abandoned. In 1813 he was awarded a sum of £23,000 in compensation for his monetary loss, which did little to alleviate Bentham's ensuing unhappiness.

While the design did not come to fruition during Bentham's time, it has been seen as an important development. For instance, the design was invoked by Michel Foucault (in Discipline and Punish) as a metaphor for modern "disciplinary" societies and their pervasive inclination to observe and normalise. Foucault proposes that not only prisons but all hierarchical structures like the army, the school, the hospital and the factory have evolved through history to resemble Bentham's Panopticon. The notoriety of the design today (although not its lasting influence in architectural realities) stems from Foucault's famous analysis of it.

Monday, December 3, 2007

Information Theory

Information Theory

Information theory is a branch of applied mathematics and engineering involving the quantification of information. Historically, information theory developed to find fundamental limits on compressing and reliably communicating data. Since its inception it has broadened to find applications in statistical inference, networks other than communication networks, biology, quantum information theory, data analysis, and other areas, although it is still widely used in the study of communication.

A key measure of information that comes up in the theory is known as information entropy, which is usually expressed by the average number of bits needed for storage or communication. Intuitively, entropy quantifies the uncertainty involved in a random variable. For example, a fair coin flip will have less entropy than a roll of a die.
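The coin-versus-die comparison is a one-line computation; the sketch below (plain Python) evaluates Shannon's formula H = -sum(p * log2(p)) for both cases:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

coin = entropy([0.5, 0.5])   # fair coin: exactly 1 bit of uncertainty
die = entropy([1 / 6] * 6)   # fair die: log2(6), about 2.585 bits
```

A certain outcome, such as `entropy([1.0])`, yields 0 bits: there is nothing to learn from observing it.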

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s), and channel coding (e.g. for DSL lines). The field is at the crossroads of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to success of the Voyager missions to deep space, the invention of the CD, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields. Important sub-fields of information theory are source coding, channel coding, algorithmic complexity theory, algorithmic information theory, and measures of information.

The most important quantities of information are entropy, the information in a random variable, and mutual information, the amount of information in common between two random variables.
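Mutual information can be computed directly from a joint probability table; the sketch below (plain Python, toy distributions of our own choosing) evaluates I(X;Y) = sum over x,y of p(x,y) * log2(p(x,y) / (p(x)p(y))):

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits, from a joint probability table joint[x][y]."""
    px = [sum(row) for row in joint]          # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]    # marginal distribution of Y
    return sum(p * log2(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)

# Y is an exact copy of a fair coin X: the full 1 bit of entropy is shared.
assert abs(mutual_information([[0.5, 0.0], [0.0, 0.5]]) - 1.0) < 1e-12
# Two independent fair coins share no information at all.
assert abs(mutual_information([[0.25, 0.25], [0.25, 0.25]])) < 1e-12
```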

Entropy of a Bernoulli trial as a function of success probability, often called the binary entropy function, Hb(p). The entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss.

A-5 Vigilante

A-5 Vigilante

The North American A-5 Vigilante was a powerful, highly advanced carrier-based supersonic bomber designed for the United States Navy. Its service in the nuclear strike role, replacing the A-3 Skywarrior, was very short.

Designated A3J-1, the Vigilante first entered squadron service with VAH-3 in June 1961, replacing the A-3 Skywarrior in the heavy attack role. All variants of the Vigilante were built at North American Aviation's facility at Port Columbus Airport in Columbus, Ohio, alongside the T-2 Buckeye and OV-10 Bronco.

Under the Tri-Service Designation plan implemented under Robert McNamara in September 1962, the Vigilante was redesignated A-5, with the initial A3J-1 becoming A-5A and the updated A3J-2 becoming A-5B. The subsequent reconnaissance version, originally A3J-3P, became the RA-5C.

The Vigilante's early service proved troublesome, with many teething problems for its advanced systems. It also arrived in service during a major policy shift in the Navy's strategic role, which switched to emphasize submarine-launched ballistic missiles rather than manned bombers. As a result, procurement of the A-5 ended in 1963 and the type was converted to the fast reconnaissance role. The first RA-5Cs were delivered in July 1963, with Vigilante squadrons redesignated RVAH.

Despite the Vigilante's useful service, it was expensive and complex to operate, and it was phased out after the end of the Vietnam War. Disestablishment of RVAH squadrons began in 1974, with the last Vigilantes completing their final deployment in September 1979.

The Vigilante did not end the career of the Skywarrior, which carried on as an electronic warfare platform and tanker, and fighters fitted with pods eventually replaced the RA-5C. Today, fighters such as the F/A-18E/F Super Hornet have grown into the same 62,950 lb weight class as the Vigilante, and the Super Hornet is also planned to cover the strike, reconnaissance, tanker, and electronic warfare roles of these old bomber types.

Although the Vigilante served in the attack and reconnaissance roles, its design and configuration were believed to be a major influence on one of the world's most famous postwar interceptors: the Soviet MiG-25 'Foxbat' was apparently heavily influenced by the A-5's design. (The MiG-25 would look even more familiar if the Vigilante had retained the twin vertical fins of the prototype; although North American originally specified two fins, like the later Foxbat, that part of the design was vetoed by the Navy in favor of one folding tailfin.) Other western aircraft such as the F-15 Eagle would also adopt a high-mounted wing and wedge-shaped intake geometry.