#51
Posted Tuesday, November 27, 2007 by Predator (Senior Member, Karachi)
Post: Liver

Liver


I -INTRODUCTION
The liver is the largest internal organ of the human body. Part of the digestive system, it performs more than 500 different functions, all of which are essential to life. Its essential functions include helping the body digest fats, storing reserves of nutrients, filtering poisons and wastes from the blood, synthesizing a variety of proteins, and regulating the levels of many chemicals found in the bloodstream. The liver is unique among the body’s vital organs in that it can regenerate, or grow back, cells that have been destroyed by some short-term injury or disease. But if the liver is damaged repeatedly over a long period of time, it may undergo irreversible changes that permanently interfere with its function.

II -STRUCTURE OF THE LIVER
The human liver is a dark red-brown organ with a soft, spongy texture. It is located at the top of the abdomen, on the right side of the body just below the diaphragm—a sheet of muscle tissue that separates the lungs from the abdominal organs. The lower part of the rib cage covers the liver, protecting it from injury. In a healthy adult, the liver weighs about 1.5 kg (3.3 lb) and is about 15 cm (6 in) thick.

Despite its many complex functions, the liver is relatively simple in structure. It consists of two main lobes, left and right, which overlap slightly. The right lobe has two smaller lobes attached to it, called the quadrate and caudate lobes.

Each lobe contains many thousands of units called lobules that are the building blocks of the liver. Lobules are six-sided structures each about 1 mm (0.04 in) across. A tiny vein runs through the center of each lobule and eventually drains into the hepatic vein, which carries blood out of the liver. Hundreds of cube-shaped liver cells, called hepatocytes, are arranged around each lobule's central vein in a radiating pattern. On the outside surface of each lobule are small veins, ducts, and arteries that carry fluids to and from the lobules. As the liver does its work, nutrients are collected, wastes are removed, and chemical substances are released into the body through these vessels.

Unlike most organs, which have a single blood supply, the liver receives blood from two sources. The hepatic artery delivers oxygen-rich blood from the heart, supplying about 25 percent of the liver's blood. The liver also receives oxygen-depleted blood from the hepatic portal vein. This vein, which is the source of 75 percent of the liver's blood supply, carries blood to the liver that has traveled from the digestive tract, where it collects nutrients as food is digested. These nutrients are delivered to the liver for further processing or storage.

Tiny blood vessel branches of the hepatic artery and the hepatic portal vein are found around each liver lobule. This network of blood vessels is responsible for the vast amount of blood that flows through the liver—about 1.4 liters (about 3 pt) every minute. Blood exits the liver through the hepatic vein, which eventually drains into the heart.
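
To put these figures in perspective, the numbers quoted above (a total flow of about 1.4 liters per minute, roughly 25 percent arterial and 75 percent portal) can be turned into a rough, illustrative calculation of how much blood passes through the liver in a day:

Code:
# Rough arithmetic from the figures quoted above (illustrative only)
flow_l_per_min = 1.4          # total hepatic blood flow, liters per minute
arterial_fraction = 0.25      # share arriving via the hepatic artery
portal_fraction = 0.75        # share arriving via the hepatic portal vein

print(f"Hepatic artery: {flow_l_per_min * arterial_fraction:.2f} L/min")
print(f"Portal vein:    {flow_l_per_min * portal_fraction:.2f} L/min")
print(f"Per day:        {flow_l_per_min * 60 * 24:.0f} L")   # about 2,000 liters of blood each day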

III -FUNCTIONS OF THE LIVER
One of the liver’s primary jobs is to store energy in the form of glycogen, which is made from a type of sugar called glucose. The liver removes glucose from the blood when blood glucose levels are high. Through a process called glycogenesis, the liver combines the glucose molecules in long chains to create glycogen, a carbohydrate that provides a stored form of energy. When the amount of glucose in the blood falls below the level required to meet the body’s needs, the liver reverses this reaction, transforming glycogen into glucose.

Another crucial function of the liver is the production of bile, a yellowish-brown liquid containing salts necessary for the digestion of lipids, or fats. These salts are produced within the lobules. Bile leaves the liver through a network of ducts and is transported to the gallbladder, which concentrates the bile and releases it into the small intestine. Vitamins are also stored in the liver. Drawing on the nutrient-rich blood in the hepatic portal vein, the liver collects and stores supplies of vitamins A, D, E, and K. The B vitamins are also stored here, including a two- to four-year supply of Vitamin B12.

The liver also functions as the body’s chemical factory. Several important proteins found in the blood are produced in the liver. One of these proteins, albumin, helps retain calcium and other important substances in the bloodstream. Albumin also helps regulate the movement of water from the bloodstream into the body’s tissues. The liver also produces globin, one of the two components that form hemoglobin—the oxygen-carrying substance in red blood cells. Certain globulins, a group of proteins that includes antibodies, are produced in the liver, as are the proteins that make up the complement system, a part of the immune system that combines with antibodies to fight invading microorganisms.

Many other chemicals are produced by the liver. These include fibrinogen and prothrombin, which help wounds to heal by enabling blood to form clots, and cholesterol, a key component of cell membranes that transports fats in the bloodstream to body tissues. In addition to manufacturing chemicals, the liver helps clear toxic substances, such as drugs and alcohol, from the bloodstream. It does this by absorbing the harmful substances, chemically altering them, and then excreting them in the bile.

IV -LIVER DISEASES
Although the liver is exposed to many potentially harmful substances, it is a remarkable organ that is able to regenerate, or repair or replace, injured tissue. Its construction, in which many lobules perform the same task, means that if one section of the liver is damaged, another section will perform the functions of the injured area indefinitely or until the damaged section is repaired. But the liver is subject to many diseases that can overwhelm its regeneration abilities, threatening a person’s health.

Diseases of the liver range from mild infection to life-threatening liver failure. For many of these ailments, the first sign of a problem is a condition called jaundice, characterized by a yellowish coloring of the skin and the whites of the eyes. It develops when liver cells lose their ability to process bilirubin, the yellowish-brown pigment found in bile.

The liver can be harmed whenever injury or disease affects the rest of the body. For example, cancer may spread from the stomach or intestines to the liver, and diabetes, if not properly treated, may result in damage to the liver. Some diseases caused by parasites, including amebiasis and schistosomiasis, can damage the liver. Drug use, including long-term use of some prescription medications as well as illegal drugs, can also cause liver damage. Poisons, especially those found in certain mushrooms, can easily damage liver cells and even cause complete liver failure.

One of the most common liver diseases is hepatitis, an inflammation of the liver. Hepatitis may be caused by exposure to certain chemicals, by autoimmune diseases, or by bacterial infections. But hepatitis is most often caused by one of several viruses. The hepatitis A virus (HAV) can produce flu-like symptoms and jaundice, but many people who contract it have no symptoms. The disease tends to resolve on its own. Because HAV lives in feces in the intestinal tract, hepatitis A is prevalent in areas where drinking water is contaminated with raw sewage. Good hygiene practices and a hepatitis A vaccination are effective measures of prevention. Hepatitis B is a more serious ailment. Unlike HAV, hepatitis B virus (HBV) may remain active in the body for many years after the time of infection, sometimes permanently damaging the liver. HBV is found in blood and other body fluids—tears, saliva, and semen—and is spread through unprotected sexual intercourse and the sharing of infected needles or other sharp objects that puncture the skin.

In developed countries, alcohol-induced liver diseases far outnumber hepatitis and all other liver disorders. Heavy alcohol use causes fat deposits to build up in the liver, possibly leading to chronic hepatitis, which causes scarring and destruction of liver cells. Over many years, scarring in the liver can progress to cirrhosis, a disease characterized by diminished blood flow through this important organ. When this occurs, toxins are not adequately removed from the blood, blood pressure increases in the hepatic portal vein, and substances produced by the liver, such as blood proteins, are not adequately regulated. Cirrhosis cannot be reversed, but liver function can significantly improve in people who stop consuming alcohol during the early stages of this condition. Beyond abstinence from alcohol, treatments for cirrhosis may include drug therapy or surgery to redirect blood flow.

For people with severe liver disease or impending liver failure, organ transplantation may be an option. Unlike some organ transplants, such as kidney transplants, liver transplants are complex procedures that have not had high long-term success rates. Fortunately, new techniques and drugs are improving the outcome of liver transplants. Current success rates range between 60 and 80 percent, with more than half of recent transplant recipients surviving more than five years. Most of these people have an excellent prognosis for leading healthy, normal lives.
#52
Posted Tuesday, November 27, 2007 by Predator (Senior Member, Karachi)
Post: Liver Picture

The largest internal organ in humans, the liver is also one of the most important. It has many functions, among them the synthesis of proteins, immune and clotting factors, and oxygen- and fat-carrying substances. Its chief digestive function is the secretion of bile, a solution critical to fat emulsification and absorption. The liver also removes excess glucose from circulation and stores it until it is needed. It converts excess amino acids into useful forms and filters drugs and poisons from the bloodstream, neutralizing them and excreting them in bile. The liver has two main lobes, located just under the diaphragm on the right side of the body. It can lose 75 percent of its tissue (to disease or surgery) without ceasing to function.
[Attached thumbnail: EDS- notes-liver.gif]
#53
Posted Tuesday, November 27, 2007 by Predator (Senior Member, Karachi)
Post: Healthy and Diseased Livers

The liver cells on the left are from a healthy liver, while the cells on the right came from the liver of a person with cirrhosis of the liver. Cirrhosis is usually caused by toxins (including alcohol) in the blood or by hepatitis. In cirrhosis, dead and damaged liver cells are replaced by fibrous tissue, which can form masses of scar tissue and dramatically change the structure of the liver. These fibrous areas can slow the flow of blood through the liver.
[Attached thumbnail: EDS- notes-liver2.gif]
#54
Posted Monday, December 03, 2007 by Predator (Senior Member, Karachi)
Post: Radio

RADIO



I -INTRODUCTION
Radio is a system of communication employing electromagnetic waves propagated through space. Because of their varying characteristics, radio waves of different lengths are employed for different purposes and are usually identified by their frequency. The shortest waves have the highest frequency, or number of cycles per second; the longest waves have the lowest frequency, or fewest cycles per second. In honor of the German radio pioneer Heinrich Hertz, his name has been given to the cycle per second (hertz, Hz); 1 kilohertz (kHz) is 1,000 cycles per second, 1 megahertz (MHz) is 1 million cycles per second, and 1 gigahertz (GHz) is 1 billion cycles per second. Radio waves range from a few kilohertz to several gigahertz; waves of visible light are much shorter. In a vacuum, all electromagnetic waves travel at a uniform speed of about 300,000 km (about 186,000 mi) per second.
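
Because all radio waves travel at the same speed in a vacuum, frequency and wavelength are linked by the relation wavelength = speed of light / frequency. The following short Python sketch illustrates the conversion; the example frequencies are chosen only for illustration:

Code:
# Free-space wavelength from frequency: wavelength = c / f
C = 3.0e8   # speed of light, about 300,000 km per second, in m/s

def wavelength_m(frequency_hz):
    """Return the free-space wavelength in meters for a frequency in hertz."""
    return C / frequency_hz

for label, f in [("AM broadcast, 1 MHz", 1.0e6),
                 ("FM broadcast, 100 MHz", 100.0e6),
                 ("Microwave link, 10 GHz", 10.0e9)]:
    print(f"{label}: {wavelength_m(f):.2f} m")
# 1 MHz -> 300 m, 100 MHz -> 3 m, 10 GHz -> 0.03 m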

Radio waves are used not only in radio broadcasting but also in wireless telegraphy, two-way communication for law enforcement, telephone transmission, wireless Internet, television, radar, navigational systems, GPS, and space communication. In the atmosphere, the physical characteristics of the air cause slight variations in velocity, which are sources of error in such radio-communications systems as radar. Also, storms or electrical disturbances produce anomalous phenomena in the propagation of radio waves.

Because electromagnetic waves in a uniform atmosphere travel in straight lines and because the earth's surface is approximately spherical, long-distance radio communication is made possible by the reflection of radio waves from the ionosphere. Radio waves shorter than about 10 m (about 33 ft) in wavelength—designated as very high, ultrahigh, and superhigh frequencies (VHF, UHF, and SHF)—are usually not reflected by the ionosphere; thus, in normal practice, such very short waves are received only within line-of-sight distances. Wavelengths shorter than a few centimeters are absorbed by water droplets or clouds; those shorter than 1.5 cm (0.6 in) may be absorbed selectively by the water vapor present in a clear atmosphere.

A typical radio communication system has two main components, a transmitter and a receiver. The transmitter generates electrical oscillations at a radio frequency called the carrier frequency. Either the amplitude or the frequency of the carrier wave may be modulated. An amplitude-modulated signal consists of the carrier frequency plus two sidebands resulting from the modulation. Frequency modulation produces more than one pair of sidebands for each modulation frequency. These modulated waves carry the complex variations that emerge as speech or other sound in radio broadcasting, and as variations of light and darkness in television broadcasting.
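
As a concrete illustration of sidebands: if a carrier is amplitude modulated by a single audio tone, the transmitted spectrum contains the carrier plus components at the carrier frequency minus and plus the tone frequency. A minimal sketch, with illustrative frequencies:

Code:
# AM sidebands: a carrier f_c modulated by a tone f_m produces components at f_c - f_m and f_c + f_m
f_carrier_khz = 1000.0   # illustrative carrier frequency (kHz)
f_tone_khz = 5.0         # illustrative audio tone (kHz)

lower = f_carrier_khz - f_tone_khz    # 995 kHz
upper = f_carrier_khz + f_tone_khz    # 1005 kHz
print(f"Sidebands at {lower} kHz and {upper} kHz")
print(f"Occupied bandwidth: {2 * f_tone_khz} kHz")   # twice the highest modulating frequency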

II -TRANSMITTER
Essential components of a radio transmitter include an oscillation generator for converting commercial electric power into oscillations of a predetermined radio frequency; amplifiers for increasing the intensity of these oscillations while retaining the desired frequency; and a transducer for converting the information to be transmitted into a varying electrical voltage proportional to each successive instantaneous intensity. For sound transmission a microphone is the transducer; for picture transmission the transducer is a photoelectric device.

Other important components of the radio transmitter are the modulator, which uses these proportionate voltages to control the variations in the oscillation intensity or the instantaneous frequency of the carrier, and the antenna, which radiates a similarly modulated carrier wave. Every antenna has some directional properties; that is, it radiates more energy in some directions than in others. The antenna can be modified, however, so that the radiation pattern varies from a comparatively narrow beam to a comparatively even distribution in all directions; the latter type of radiation is employed in broadcasting.

The particular method of designing and arranging the various components depends on the effects desired. The principal criteria of a radio in a commercial or military airplane, for example, are light weight and intelligibility; cost is a secondary consideration, and fidelity of reproduction is entirely unimportant. In a commercial broadcasting station, on the other hand, size and weight are of comparatively little importance; cost is of some importance; and fidelity is of the utmost importance, particularly for FM stations; rigid control of frequency is an absolute necessity. In the U.S., for example, a typical commercial station broadcasting on 1000 kHz is assigned a bandwidth of 10 kHz by the Federal Communications Commission, but this width may be used only for modulation; the carrier frequency itself must be kept precisely at 1000 kHz, for a deviation of one-hundredth of 1 percent would cause serious interference with even distant stations on the same frequency.
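
The tolerance mentioned above is tighter than it may sound: one-hundredth of 1 percent of a 1000 kHz carrier is only 100 Hz of permitted drift, as the following rough sketch shows:

Code:
# Allowed carrier drift: one-hundredth of 1 percent of a 1000 kHz carrier
carrier_hz = 1_000_000        # 1000 kHz
tolerance = 0.0001            # one-hundredth of 1 percent, as a fraction

print(f"Maximum permitted drift: {carrier_hz * tolerance:.0f} Hz")   # 100 Hz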

A -Oscillators
In a typical commercial broadcasting station the carrier frequency is generated by a carefully controlled quartz-crystal oscillator. The fundamental method of controlling frequencies in most radio work is by means of tank circuits, or tuned circuits, that have specific values of inductance and capacitance, and that therefore favor the production of alternating currents of a particular frequency and discourage the flow of currents of other frequencies. In cases where the frequency must be extremely stable, however, a quartz crystal with a definite natural frequency of electrical oscillation is used to stabilize the oscillations. The oscillations are actually generated at low power by an electron tube and are amplified in a series of power amplifiers that act as buffers to prevent interaction of the oscillator with the other components of the transmitter, because such interaction would alter the frequency.
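
The frequency such a tuned (tank) circuit favors is fixed by its inductance L and capacitance C through the standard resonance formula f = 1 / (2π√(LC)). A minimal sketch, with component values chosen only to illustrate the formula:

Code:
# Resonant frequency of an LC tank circuit: f = 1 / (2 * pi * sqrt(L * C))
import math

def resonant_frequency_hz(inductance_h, capacitance_f):
    """Resonant frequency in hertz for an inductance in henries and a capacitance in farads."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Illustrative values: 250 microhenries and 100 picofarads
f = resonant_frequency_hz(250e-6, 100e-12)
print(f"Resonant frequency: {f / 1e3:.0f} kHz")   # close to 1000 kHz, the broadcast example used above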

The crystal is shaped accurately to the dimensions required to give the desired frequency, which may then be modified slightly by adding a condenser to the circuit to give the exact frequency desired. In a well-designed circuit, such an oscillator does not vary by more than one-hundredth of 1 percent in frequency. Mounting the crystal in a vacuum at constant temperature and stabilizing the supply voltages may produce a frequency stability approaching one-millionth of 1 percent. Crystal oscillators are most useful in the ranges termed very low frequency, low frequency, and medium frequency (VLF, LF, and MF). When frequencies higher than about 10 MHz must be generated, the master oscillator is designed to generate a medium frequency, which is then doubled as often as necessary in special electronic circuits. In cases where rigid frequency control is not required, tuned circuits may be used with conventional electron tubes to generate oscillations up to about 1000 MHz, and reflex klystrons are used to generate the higher frequencies up to 30,000 MHz. Magnetrons are substituted for klystrons when even larger amounts of power must be generated.

B -Modulation
Modulation of the carrier wave so that it may carry impulses is performed either at low level or high level. In the former case the audio-frequency signal from the microphone, with little or no amplification, is used to modulate the output of the oscillator, and the modulated carrier frequency is then amplified before it is passed to the antenna; in the latter case the radio-frequency oscillations and the audio-frequency signal are independently amplified, and modulation takes place immediately before the oscillations are passed to the antenna. The signal may be impressed on the carrier either by frequency modulation (FM) or amplitude modulation (AM).
The simplest form of modulation is keying: interrupting the carrier wave at intervals with a key or switch to form the dots and dashes of continuous-wave radiotelegraphy.

The carrier wave may also be modulated by varying the amplitude, or strength, of the wave in accordance with the variations of frequency and intensity of a sound signal, such as a musical note. This form of modulation, AM, is used in many radiotelephony services including standard radiobroadcasts. AM is also employed for carrier current telephony, in which the modulated carrier is transmitted by wire, and in the transmission of still pictures by wire or radio.

In FM the frequency of the carrier wave is varied within a fixed range at a rate corresponding to the frequency of a sound signal. This form of modulation, perfected in the 1930s, has the advantage of yielding signals relatively free from noise and interference arising from such sources as automobile-ignition systems and thunderstorms, which seriously affect AM signals. As a result, FM broadcasting is done in the very-high-frequency (VHF) band (88 to 108 MHz), which is suitable for broad signals but has a limited reception range.
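
The difference between the two schemes can be made concrete with a short waveform sketch: in AM the envelope (amplitude) of the carrier follows the audio signal, while in FM the carrier's instantaneous frequency swings around its center value. The tone and carrier frequencies below are arbitrary illustration values, not broadcast frequencies:

Code:
# Toy AM and FM waveforms for a single modulating tone (illustrative values only)
import numpy as np

fs = 100_000                            # sample rate, Hz
t = np.arange(0, 0.01, 1.0 / fs)        # 10 ms of signal
f_carrier = 10_000                      # carrier frequency, Hz
f_tone = 500                            # modulating tone, Hz
tone = np.cos(2 * np.pi * f_tone * t)

# AM: carrier amplitude follows the tone (modulation index 0.5)
am = (1 + 0.5 * tone) * np.cos(2 * np.pi * f_carrier * t)

# FM: carrier phase deviates in step with the tone (peak frequency swing 2 kHz)
deviation = 2_000
fm = np.cos(2 * np.pi * f_carrier * t + (deviation / f_tone) * np.sin(2 * np.pi * f_tone * t))

print(am[:3], fm[:3])                   # first few samples of each waveform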

Carrier waves can also be modulated by varying the phase of the carrier in accordance with the amplitude of the signal. Phase modulation, however, has generally been limited to special equipment. The development of the technique of transmitting continuous waves in short bursts or pulses of extremely high power introduced the possibility of yet another form of modulation, pulse-time modulation, in which the spacing of the pulses is varied in accordance with the signal.

The information carried by a modulated wave is restored to its original form by a reverse process called demodulation or detection. Radio waves broadcast at low and medium frequencies are amplitude modulated. At higher frequencies both AM and FM are in use; in present-day commercial television, for example, the sound may be carried by FM, while the picture is carried by AM. In the super high-frequency range (above the ultrahigh-frequency range), in which broader bandwidths are available, the picture also may be carried by FM.

Digital radio (also called HD or high-definition radio) processes sounds into patterns of numbers instead of into patterns of electrical waves and can be used for both FM and AM broadcasts. The sound received by a radio listener is much clearer and virtually free from interference. The signals can be used to provide additional services, multiple channels, and interactive features. Satellite radio is also a form of digital radio but the signal is broadcast from communication satellites in orbit around Earth and not from local broadcast towers.

C -Antennas
The antenna of a transmitter need not be close to the transmitter itself. Commercial broadcasting at medium frequencies generally requires a very large antenna, which is best located at an isolated point far from cities, whereas the broadcasting studio is usually in the heart of the city. FM, television, and other very-high-frequency broadcasts must have very high antennas if appreciably long range is to be achieved, and it may not be convenient to locate such a high antenna near the broadcasting studio. In all such cases, the signals may be transmitted by wires. Ordinary telephone lines are satisfactory for most commercial radio broadcasts; if high fidelity or very high frequencies are required, coaxial or fiber optic cables are used.

III -RECEIVERS
The essential components of a radio receiver are an antenna for receiving the electromagnetic waves and converting them into electrical oscillations; amplifiers for increasing the intensity of these oscillations; detection equipment for demodulating; a speaker for converting the impulses into sound waves audible by the human ear (and in television a picture tube for converting the signal into visible light waves); and, in most radio receivers, oscillators to generate radio-frequency waves that can be “mixed” with the incoming waves.

The incoming signal from the antenna, consisting of a radio-frequency carrier oscillation modulated by an audio frequency or video-frequency signal containing the impulses, is generally very weak. The sensitivity of some modern radio receivers is so great that if the antenna signal can produce an alternating current involving the motion of only a few hundred electrons, this signal can be detected and amplified to produce an intelligible sound from the speaker. Most radio receivers can operate quite well with an input from the antenna of a few millionths of a volt. The dominant consideration in receiver design, however, is that very weak desired signals cannot be made useful by amplifying indiscriminately both the desired signal and undesired radio noise (see Noise below). Thus, the main task of the designer is to assure preferential reception of the desired signal.

Most modern radio receivers are of the superheterodyne type, in which an oscillator generates a radio-frequency wave that is mixed with the incoming wave, thereby producing a radio-frequency wave of lower frequency; the latter is called the intermediate frequency. To tune the receiver to different frequencies, the frequency of the oscillations is changed, but the intermediate frequency always remains the same (455 kHz for most AM receivers and 10.7 MHz for most FM receivers). The oscillator is tuned by altering the capacitance of the capacitor in its tank circuit; the antenna circuit is similarly tuned by a capacitor in its circuit. One or more stages of intermediate-frequency amplification are included in all receivers; in addition, one or more stages of radio-frequency amplification may be included.
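
A worked example of the mixing described above: to receive a station with a receiver whose intermediate frequency is 455 kHz, the local oscillator is commonly set 455 kHz above the incoming signal, so the difference frequency always lands on the IF. This is a hedged sketch assuming high-side injection:

Code:
# Superheterodyne tuning: oscillator frequency = station frequency + intermediate frequency
IF_AM_KHZ = 455.0    # standard AM intermediate frequency quoted above

def local_oscillator_khz(station_khz, if_khz=IF_AM_KHZ):
    """Oscillator setting that converts the station down to the fixed IF (high-side injection assumed)."""
    return station_khz + if_khz

for station in (540.0, 1000.0, 1600.0):
    lo = local_oscillator_khz(station)
    print(f"Station {station:.0f} kHz -> oscillator {lo:.0f} kHz, difference {lo - station:.0f} kHz")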

Auxiliary circuits such as automatic volume control (which operates by rectifying part of the output of one amplification circuit and feeding it back to the control element of the same circuit or of an earlier one) are usually included in the intermediate-frequency stage. The detector, often called the second detector (the mixer being called the first detector), is usually simply a diode acting as a rectifier and produces an audio-frequency signal. FM waves are demodulated, or detected, by circuits known as discriminators or ratio detectors that translate the varying frequencies into varying signal amplitudes.

Digital and satellite radio require special receivers that can change a digital signal into analog sound. The digital signal can carry additional information that can be displayed on a screen on the radio. The title of a music track and the artist can be provided, for example. Some radios can even record songs in MP3 format.

A -Amplifiers
Radio-frequency and intermediate-frequency amplifiers are voltage amplifiers, increasing the voltage of the signal. Radio receivers may also have one or more stages of audio-frequency voltage amplification. In addition, the last stage before the speaker must be a stage of power amplification. A high-fidelity receiver contains both the tuner and amplifier circuits of a radio. Alternatively, a high-fidelity radio may consist of a separate audio amplifier and a separate radio tuner.

The principal characteristics of a good radio receiver are high sensitivity, selectivity, fidelity, and low noise. Sensitivity is primarily achieved by having numerous stages of amplification and high amplification factors, but high amplification is useless unless reasonable fidelity and low noise can be obtained. The most sensitive receivers have one stage of tuned radio-frequency amplification. Selectivity is the ability of the receiver to obtain signals from one station and reject signals from another station operating on a nearby frequency. Excessive selectivity is not desirable, because a bandwidth of many kilohertz is necessary in order to receive the high-frequency components of the audio-frequency signals. A good broadcast-band receiver tuned to one station has a zero response to a station 20 kHz away. The selectivity depends principally on the circuits in the intermediate-frequency stage.

B -High-Fidelity Systems
Fidelity is the equality of response of the receiver to various audio-frequency signals modulated on the carrier. Extremely high fidelity, which means a flat frequency response (equal amplification of all audio frequencies) over the entire audible range from about 20 Hz to 20 kHz, is extremely difficult to obtain. A high-fidelity system is no stronger than its weakest link, and the links include not only all the circuits in the receiver, but also the speaker, the acoustic properties of the room in which the speaker is located, and the transmitter to which the receiver is tuned. Most AM radio stations do not reproduce faithfully sounds below 100 Hz or above 5 kHz; FM stations generally have a frequency range of 50 Hz to 15 kHz, the upper limit being set by Federal Communications Commission regulations. Digital and satellite radio can provide even better high fidelity over a larger range of frequencies. Digital FM approaches the sound quality of CDs. Digital AM radio should be comparable to regular FM in sound quality.

C -Distortion
A form of amplitude distortion is often introduced to a radio transmission by increasing the relative intensity of the higher audio frequencies. At the receiver, a corresponding amount of high-frequency attenuation is applied. The net effect of these two forms of distortion is a net reduction in high-frequency background noise or static at the receiver. Many receivers are also equipped with user-adjustable tone controls so that the amplification of high and low frequencies may be adjusted to suit the listener's taste. Another source of distortion is cross modulation, the transfer of signals from one circuit to another through improper shielding. Harmonic distortion caused by nonlinear transfer of signals through amplification stages can often be significantly reduced by the use of negative-feedback circuitry that tends to cancel most of the distortion generated in such amplification stages.

D -Noise
Noise is a serious problem in all radio receivers. Several different types of noise, each characterized by a particular type of sound and by a particular cause, have been given names. Among these are hum, a steady low-frequency note (about two octaves below middle C) commonly produced by the frequency of the alternating-current power supply (usually 60 Hz) becoming impressed onto the signal because of improper filtering or shielding; hiss, a steady high-frequency note; and whistle, a pure high-frequency note produced by unintentional audio-frequency oscillation, or by beats. These noises can be eliminated by proper design and construction. Certain types of noise, however, cannot be eliminated. The most important of these in ordinary AM low-frequency and medium-frequency sets is static, caused by electrical disturbances in the atmosphere.

Static may be due to the operation of nearby electrical equipment (such as automobile and airplane engines), but is most often caused by lightning. Radio waves produced by such atmospheric disturbances can travel thousands of kilometers with comparatively little attenuation, and inasmuch as a thunderstorm is almost always occurring somewhere within a few thousand kilometers of any radio receiver, static is almost always present. Static affects FM receivers to a much smaller degree, because the amplitude of the intermediate waves is limited in special circuits before discrimination, and this limiting removes effects of static, which influences the signal only by superimposing a random amplitude modulation on the wave. Digital and satellite radio greatly reduces static.

Another basic source of noise is thermal agitation of electrons. In any conductor at a temperature higher than absolute zero, electrons are moving about in a random manner. Because any motion of electrons constitutes an electric current, this thermal motion gives rise to noise when amplification is carried too far. Such noise can be avoided if the signal received from the antenna is considerably stronger than the current caused by thermal agitation; in any case, such noise can be minimized by suitable design. A theoretically perfect receiver at ordinary temperatures can receive speech intelligibly when the signal power in the antenna is only 4 × 10^-18 W (4 attowatts); in ordinary radio receivers, however, considerably greater signal strength is required.
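
The signal power quoted above is of the same order as the thermal noise power kTB delivered by a conductor at room temperature over a speech-width band. This is a rough sketch assuming a temperature of 290 K and a 1 kHz bandwidth; the exact figure depends on the receiver's actual bandwidth and temperature:

Code:
# Thermal (Johnson) noise power: P = k * T * B
BOLTZMANN = 1.38e-23    # J/K

def thermal_noise_w(temperature_k, bandwidth_hz):
    """Available thermal noise power in watts for a given temperature and bandwidth."""
    return BOLTZMANN * temperature_k * bandwidth_hz

p = thermal_noise_w(290.0, 1_000.0)     # assumed room temperature and 1 kHz bandwidth
print(f"Thermal noise floor: {p:.1e} W")   # about 4e-18 W, comparable to the figure in the text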

E -Power Supply
A radio has no moving parts except the speaker cone, which vibrates within a range of a few thousandths of a centimeter, so the only power required to operate the radio is electrical power to force electrons through the various circuits. When radios first came into general use in the 1920s, most were operated by batteries. Although batteries are widely used in portable sets today, a power supply from a power line has advantages, because it permits the designer more freedom in selecting circuit components. If the alternating-current (AC) power supply is 120 V, this current can be led directly to the primary coil of a transformer, and power with the desired voltage can be drawn off from the secondary coils. This secondary current must be rectified and filtered before it can be used, because transistors require direct current (DC) for proper operation. Electron tubes require DC for plate current; filaments may be heated either by DC or AC, but in the latter case hum may be created.

Transistorized radios do not require as high an operating DC voltage as did tube radios of the past, but power supplies are still needed to convert the AC voltage distributed by utility companies to DC, and to step the voltage up or down to the required value using transformers. Airplane and automobile radio sets that operate on 12 to 24 volts DC often contain circuits that convert the available DC voltage to AC, after which the voltage is stepped up or down to the required level and converted back to DC by a rectifier. The advent of transistors, integrated circuits, and other solid-state electronic devices, which are much smaller and require very little power, has greatly reduced the use of vacuum tubes in radio, television, and other types of communications equipment and devices.

IV -HISTORY
Although many discoveries in the field of electricity were necessary to the development of radio, the history of radio really began in 1873, with the publication by the British physicist James Clerk Maxwell of his theory of electromagnetic waves.

A -Late 19th Century
Maxwell's theory applied primarily to light waves. About 15 years later the German physicist Heinrich Hertz actually generated such waves electrically. He supplied an electric charge to a capacitor, and then short-circuited the capacitor through a spark gap. In the resulting electric discharge the current surged past the neutral point, building up an opposite charge on the capacitor, and then continued to surge back and forth, creating an oscillating electric discharge in the form of a spark. Some of the energy of this oscillation was radiated from the spark gap in the form of electromagnetic waves. Hertz measured several of the properties of these so-called Hertzian waves, including their wavelength and velocity.

The concept of using electromagnetic waves for the transmission of messages from one point to another was not new; the heliograph, for example, successfully transmitted messages via a beam of light rays, which could be modulated by means of a shutter to carry signals in the form of the dots and dashes of the Morse code. Radio has many advantages over light for this purpose, but these advantages were not immediately apparent. Radio waves, for example, can travel enormous distances; but microwaves (which Hertz used) cannot. Radio waves can be enormously attenuated and still be received, amplified, and detected; but good amplifiers were not available until the development of electron tubes. Although considerable progress was made in radiotelegraphy (for example, transatlantic communication was established in 1901), radiotelephony might never have become practical without the development of electronics. Historically, developments in radio and in electronics have been interdependent.

To detect the presence of electromagnetic radiation, Hertz used a loop of wire somewhat similar to a wire antenna. At about the same time the Anglo-American inventor David Edward Hughes discovered that a loose contact between a steel point and a carbon block would not conduct current, but that if electromagnetic waves were passed through the junction point, it conducted well. In 1879 Hughes demonstrated the reception of radio signals from a spark transmitter located some hundreds of meters away. In these experiments he conducted a current from a voltaic cell through a glass tube filled loosely with zinc and silver filings, which cohered when radio waves impinged on it. The same principle was later used by the British physicist Sir Oliver Joseph Lodge in a device called the coherer to detect the presence of radio waves.

The coherer, after becoming conductive, could again be made resistant by tapping it, causing the metal particles to separate. Although far more sensitive than a wire loop in the absence of an amplifier, the coherer gave only a single response to sufficiently strong radio waves of varying intensities, and could thus be used for telegraphy but not for telephony.
The Italian electrical engineer and inventor Guglielmo Marconi is generally credited with being the inventor of radio. Starting in 1895 he developed an improved coherer and connected it to a rudimentary form of antenna, with its lower end grounded. He also developed improved spark oscillators, connected to crude antennas. The transmitter was modulated with an ordinary telegraph key.

The coherer at the receiver actuated a telegraphic instrument through a relay, which functioned as a crude amplifier. In 1896 he transmitted signals for a distance exceeding 1.6 km (more than 1 mi), and applied for his first British patent. In 1897 he transmitted signals from shore to a ship at sea 29 km (18 mi) away. In 1899 he established commercial communication between England and France that operated in all types of weather; early in 1901 he sent signals 322 km (200 mi), and later in the same year succeeded in sending a single letter across the Atlantic Ocean. In 1902 messages were regularly sent across the Atlantic, and by 1905 many ships were using radio for communications with shore stations. For his pioneer work in the field of wireless telegraphy, Marconi shared the 1909 Nobel Prize in physics with the German physicist Karl Ferdinand Braun.

During this time various technical improvements were being made. Tank circuits, containing inductance and capacitance, were used for tuning. Antennas were improved, and their directional properties were discovered and used. Transformers were used to increase the voltage sent to the antenna. Other detectors were developed to supplement the coherer with its clumsy tapper; among these were a magnetic detector that depended on the ability of radio waves to demagnetize steel wires, a bolometer that measured the rise in temperature of a fine wire when radio waves were passed through it, and the so-called Fleming valve, the forerunner of the thermionic tube, or vacuum tube.

B -20th Century
The modern vacuum tube traces its development to the discovery made by the American inventor Thomas Alva Edison that a current will flow between the hot filament of an incandescent lamp and another electrode placed in the same lamp, and that this current will flow in only one direction. The Fleming valve was not essentially different from Edison's tube. It was developed by the British physicist and electrical engineer Sir John Ambrose Fleming in 1904 and was the first of the diodes, or two-element tubes, used in radios. This tube was then used as a detector, rectifier, and limiter. A revolutionary advance, which made possible the science of electronics, occurred in 1906 when the American inventor Lee De Forest mounted a third element, the grid, between the filament and the plate of a vacuum tube. De Forest's tube, which he called an audion but which is now called a triode (three-element tube), was first used only as a detector, but its potential as an amplifier and oscillator was soon developed, and by 1915 wireless telephony had advanced to such a point that communication was established between Virginia and Hawaii and between Virginia and Paris.

The rectifying properties of crystals were discovered in 1912 by the American electrical engineer and inventor Greenleaf Whittier Pickard, who pointed out that crystals can be used as detectors. This discovery gave rise to the so-called crystal sets popular about 1920. In 1912 the American electrical engineer Edwin Howard Armstrong discovered the regenerative circuit, by which part of the output of a tube is fed back to the same tube. This and certain other discoveries by Armstrong form the basis of many circuits in modern radio sets.

In 1902 the American electrical engineer Arthur Edwin Kennelly and the British physicist and electrician Oliver Heaviside, independently and almost simultaneously, announced the probable existence of a layer of ionized gas high in the atmosphere that affects the propagation of radio waves. This layer, formerly called the Heaviside or Kennelly-Heaviside layer, is one of several layers in the ionosphere. Although the ionosphere is transparent to the shortest radio wavelengths, it bends or reflects the longer waves. Because of this reflection, radio waves can be propagated far beyond the horizon. Propagation of radio waves in the ionosphere is strongly affected by time of day, season, and sunspot activity. Slight variations in the nature and altitude of the ionosphere, which can occur rapidly, can affect the quality of long-distance reception.

The ionosphere is also responsible for skip, the reception at a considerable distance of a signal that cannot be received at a closer point. This phenomenon occurs when the intervening ground has absorbed the ground ray and the ionospherically propagated ray is not reflected at an angle sufficiently steep to be received at short distances from the antenna.

C -Short-wave Radio
Although parts of the various radio bands—short-wave, long-wave, medium-wave, very-high frequency, and ultrahigh frequency—are allocated for a variety of purposes, the term short-wave radio generally refers to radiobroadcasts in the high-frequency range (3 to 30 MHz) beamed for long distances, especially in international communication. Microwave communication via satellite, however, provides signals with superior reliability and freedom from error.

Amateur, or “ham,” radio is also commonly thought of as short-wave, although amateur operators have been allotted frequencies in the medium-wave band, the very-high-frequency band, and the ultrahigh-frequency band as well as the short-wave band. Certain of these frequencies have restrictions designed to make them available to maximum numbers of users.
During the rapid development of radio after World War I, amateur operators executed such spectacular feats as the first transatlantic radio contact (1921). They have also provided valuable voluntary assistance during emergencies when normal communications are disrupted. Amateur radio organizations have launched a number of satellites piggyback with regular launches by the United States, the former Soviet Union, and the European Space Agency.

These satellites are usually called Oscar, for Orbiting Satellites Carrying Amateur Radio. The first, Oscar 1, orbited in 1961, was also the first nongovernmental satellite; the fourth, in 1965, provided the first direct-satellite communications between the U.S. and the Soviet Union. More than 1.5 million people worldwide were licensed amateur radio operators in the early 1980s.

The ability to webcast radio programs over the Internet had a major impact on shortwave broadcasting. In the early 2000s the BBC dropped their shortwave radio service to the United States, Canada, Australia, and other developed countries since their programs were available through computers over the World Wide Web. The widespread use of personal computers with Internet access to chat groups and personal Web pages also replaced some of the hobby aspects of amateur radio in popularity.

D -Radio today
Immense developments in radio communication technology after World War II helped make possible space exploration, most dramatically in the Apollo moon-landing missions (1969-72). Sophisticated transmitting and receiving equipment was part of the compact, very-high-frequency, communication system on board the command modules and the lunar modules. The system performed voice and ranging functions simultaneously, calculating the distance between the two vehicles by measuring the time lapse between the transmission of tones and the reception of the returns. The voice signals of the astronauts were also transmitted simultaneously around the world by a communications network.

In the 1990s cellular radio telephones (cell phones) became one of the most important and widespread uses of radio communication. By the early 21st century, billions of people worldwide had access to telephone service with lightweight portable cell phones capable of communicating worldwide through radio relays and satellite links. Cell phones have become particularly important in developing countries where landlines for telephones often do not exist outside of large cities. In remote rural areas an individual who owns a cell phone may charge a small fee to let others use the phone service. Such phone service can have a major economic impact in impoverished regions, permitting access to banking services, providing information on prices of crops, and creating small-business contacts.

Digital and satellite radio also greatly expanded the possibilities of radio. Not only does digital radio provide superior sound quality, but it permits such additional services as multiple audio-programming channels, on-demand audio services, and interactive features, as well as targeted advertising. Wireless Internet allows users of computers and portable media devices to access the World Wide Web from all kinds of locations. Personal digital assistants (PDAs) also use radio to access e-mail and other services, including GPS information from satellites. The transition to digital television is expected to free up a large part of the radio spectrum previously used to broadcast analog television. These frequencies may be available for many more wireless uses in the future.
#55
Posted Monday, December 03, 2007 by Predator (Senior Member, Karachi)
Post: Laser

LASER


I -INTRODUCTION
A laser is a device that produces and amplifies light. The word laser is an acronym for Light Amplification by Stimulated Emission of Radiation. Laser light is very pure in color, can be extremely intense, and can be directed with great accuracy. Lasers are used in many modern technological devices including bar code readers, compact disc (CD) players, and laser printers. Lasers can generate light beyond the range visible to the human eye, from the infrared through the X-ray range. Masers are similar devices that produce and amplify microwaves.

II -PRINCIPLES OF OPERATION
Lasers generate light by storing energy in particles called electrons inside atoms and then inducing the electrons to emit the absorbed energy as light. Atoms are the building blocks of all matter on Earth and are a thousand times smaller than viruses. Electrons are the underlying source of almost all light.
Light is composed of tiny packets of energy called photons. Lasers produce coherent light: light that is monochromatic (one color) and whose photons are “in step” with one another.

A -Excited Atoms
At the heart of an atom is a tightly bound cluster of particles called the nucleus. This cluster is made up of two types of particles: protons, which have a positive charge, and neutrons, which have no charge. The nucleus makes up more than 99.9 percent of the atom’s mass but occupies only a tiny part of the atom’s space. Enlarge an atom up to the size of Yankee Stadium and the equally magnified nucleus is only the size of a baseball.

Electrons, tiny particles that have a negative charge, whirl through the rest of the space inside atoms. Electrons travel in complex orbits and exist only in certain specific energy states or levels. Electrons can move from a low to a high energy level by absorbing energy. An atom with at least one electron that occupies a higher energy level than it normally would is said to be excited. An atom can become excited by absorbing a photon whose energy equals the difference between the two energy levels. A photon’s energy, color, frequency, and wavelength are directly related: All photons of a given energy are the same color and have the same frequency and wavelength.
Usually, electrons quickly jump back to the low energy level, giving off the extra energy as light (see Photoelectric Effect). Neon signs and fluorescent lamps glow with this kind of light as many electrons independently emit photons of different colors in all directions.
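
The link between a photon's energy and its wavelength is E = hc/λ, where h is Planck's constant and c is the speed of light. A small sketch for red laser light; the 633 nm wavelength is a standard helium-neon laser value used here only as an illustration:

Code:
# Photon energy: E = h * c / wavelength
PLANCK = 6.626e-34   # Planck's constant, J*s
C = 3.0e8            # speed of light, m/s

def photon_energy_j(wavelength_m):
    """Energy in joules of a single photon of the given wavelength."""
    return PLANCK * C / wavelength_m

e = photon_energy_j(633e-9)    # red helium-neon laser line, 633 nm
print(f"{e:.2e} J = {e / 1.602e-19:.2f} eV")   # about 3.1e-19 J, roughly 2 eV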

B -Stimulated Emission
Lasers are different from more familiar sources of light. Excited atoms in lasers collectively emit photons of a single color, all traveling in the same direction and all in step with one another. When two photons are in step, the peaks and troughs of their waves line up. The electrons in the atoms of a laser are first pumped, or energized, to an excited state by an energy source. An excited atom can then be “stimulated” by a photon of exactly the same color (or, equivalently, the same wavelength) as the photon this atom is about to emit spontaneously. If the photon approaches closely enough, the photon can stimulate the excited atom to immediately emit light that has the same wavelength and is in step with the photon that interacted with it. This stimulated emission is the key to laser operation. The new light adds to the existing light, and the two photons go on to stimulate other excited atoms to give up their extra energy, again in step. The phenomenon snowballs into an amplified, coherent beam of light: laser light.

In a gas laser, for example, the photons usually zip back and forth in a gas-filled tube with highly reflective mirrors facing inward at each end. As the photons bounce between the two parallel mirrors, they trigger further stimulated emissions and the light gets brighter and brighter with each pass through the excited atoms. One of the mirrors is only partially silvered, allowing a small amount of light to pass through rather than reflecting it all. The intense, directional, and single-colored laser light finally escapes through this slightly transparent mirror. The escaped light forms the laser beam.

Albert Einstein first proposed stimulated emission, the underlying process for laser action, in 1917. Translating the idea of stimulated emission into a working model, however, required more than four decades. The working principles of lasers were outlined by the American physicists Charles Hard Townes and Arthur Leonard Schawlow in a 1958 patent application. (Both men won Nobel Prizes in physics for their work, Townes in 1964 and Schawlow in 1981). The patent for the laser was granted to Townes and Schawlow, but it was later challenged by the American physicist and engineer Gordon Gould, who had written down some ideas and coined the word laser in 1957. Gould eventually won a partial patent covering several types of laser. In 1960 American physicist Theodore Maiman of Hughes Aircraft Corporation constructed the first working laser from a ruby rod.

III -TYPES OF LASERS
Lasers are generally classified according to the material, called the medium, they use to produce the laser light. Solid-state, gas, liquid, semiconductor, and free electron are all common types of lasers.

A -Solid-State Lasers
Solid-state lasers produce light by means of a solid medium. The most common solid laser media are rods of ruby crystals and neodymium-doped glasses and crystals. The ends of the rods are fashioned into two parallel surfaces coated with a highly reflecting nonmetallic film. Solid-state lasers offer the highest power output. They are usually pulsed to generate a very brief burst of light. Bursts as short as 12 × 10^-15 sec (12 femtoseconds) have been achieved. These short bursts are useful for studying physical phenomena of very brief duration. One method of exciting the atoms in lasers is to illuminate the solid laser material with higher-energy light than the laser produces. This procedure, called pumping, is achieved with brilliant strobe light from xenon flash tubes, arc lamps, or metal-vapor lamps.
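
To get a feel for how brief a 12-femtosecond burst is, multiply its duration by the speed of light: the whole pulse spans only a few micrometers of space. A rough sketch:

Code:
# Physical length of a light pulse: length = c * duration
C = 3.0e8                    # speed of light, m/s
pulse_duration_s = 12e-15    # 12 femtoseconds, as quoted above

print(f"Pulse length: {C * pulse_duration_s * 1e6:.1f} micrometers")   # about 3.6 micrometers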

B -Gas Lasers
The lasing medium of a gas laser can be a pure gas, a mixture of gases, or even metal vapor. The medium is usually contained in a cylindrical glass or quartz tube. Two mirrors are located outside the ends of the tube to form the laser cavity. Gas lasers can be pumped by ultraviolet light, electron beams, electric current, or chemical reactions. The helium-neon laser is known for its color purity and minimal beam spread. Carbon dioxide lasers are very efficient at turning the energy used to excite their atoms into laser light. Consequently, they are the most powerful continuous wave (CW) lasers—that is, lasers that emit light continuously rather than in pulses.

C -Liquid Lasers
The most common liquid laser media are organic dyes contained in glass vessels. They are pumped by intense flash lamps in a pulse mode or by a separate gas laser in the continuous wave mode. Some dye lasers are tunable, meaning that the color of the laser light they emit can be adjusted with the help of a prism located inside the laser cavity.

D -Semiconductor Lasers
Semiconductor lasers are the most compact lasers. Gallium arsenide is the most common semiconductor used. A typical semiconductor laser consists of a junction between two flat layers of gallium arsenide. One layer is treated with an impurity whose atoms provide an extra electron, and the other with an impurity whose atoms are one electron short. Semiconductor lasers are pumped by the direct application of electric current across the junction. They can be operated in the continuous wave mode with better than 50 percent efficiency. Only a small percentage of the energy used to excite most other lasers is converted into light.

Scientists have developed extremely tiny semiconductor lasers, called quantum-dot vertical-cavity surface-emitting lasers. These lasers are so tiny that more than a million of them can fit on a chip the size of a fingernail.
Common uses for semiconductor lasers include compact disc (CD) players and laser printers. Semiconductor lasers also form the heart of fiber-optics communication systems.

E -Free Electron Lasers
Free electron lasers employ an array of magnets to excite free electrons (electrons not bound to atoms). First developed in 1977, they are now becoming important research instruments. Free electron lasers are tunable over a broader range of energies than dye lasers. The devices become more difficult to operate at higher energies but generally work successfully from infrared through ultraviolet wavelengths. Theoretically, free electron lasers can function even in the X-ray range.

The free electron laser facility at the University of California at Santa Barbara uses intense far-infrared light to investigate mutations in DNA molecules and to study the properties of semiconductor materials. Free electron lasers should also eventually be able to produce very high-power radiation that is currently too expensive to generate by other means. At such high power, near-infrared beams from a free electron laser could defend against a missile attack.

IV -LASER APPLICATIONS
The use of lasers is restricted only by imagination. Lasers have become valuable tools in industry, scientific research, communications, medicine, the military, and the arts.

A -Industry
Powerful laser beams can be focused on a small spot to generate enormous temperatures. Consequently, the focused beams can readily and precisely heat, melt, or vaporize material. Lasers have been used, for example, to drill holes in diamonds, to shape machine tools, to trim microelectronics, to cut fashion patterns, to synthesize new material, and to attempt to induce controlled nuclear fusion. Highly directional laser beams are used for alignment in construction. Perfectly straight and uniformly sized tunnels, for example, may be dug using lasers for guidance. Powerful, short laser pulses also make high-speed photography with exposure times of only several trillionths of a second possible.

B -Scientific Research
Because laser light is highly directional and monochromatic, extremely small amounts of light scattering and small shifts in color caused by the interaction between laser light and matter can easily be detected. By measuring the scattering and color shifts, scientists can study molecular structures of matter. Chemical reactions can be selectively induced, and the existence of trace substances in samples can be detected. Lasers are also the most effective detectors of certain types of air pollution.

Scientists use lasers to make extremely accurate measurements. Lasers are used in this way for monitoring small movements associated with plate tectonics and for geographic surveys. Lasers have been used for precise determination (to within one inch) of the distance between Earth and the Moon, and in precise tests to confirm Einstein’s theory of relativity. Scientists also have used lasers to determine the speed of light to an unprecedented accuracy. Very fast laser-activated switches are being developed for use in particle accelerators. Scientists also use lasers to trap single atoms and subatomic particles in order to study these tiny bits of matter.
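
The lunar distance measurement mentioned above relies on simple arithmetic: a laser pulse is bounced off a reflector on the Moon, the round-trip time is measured, and the one-way distance is half the round-trip time multiplied by the speed of light. The Python sketch below illustrates the calculation; the 2.56-second round-trip time is an illustrative figure, not a value quoted in the text.

# One-way Earth-Moon distance from the round-trip time of a reflected laser pulse.
c = 299_792_458          # speed of light in metres per second
round_trip_time = 2.56   # seconds; illustrative value for an Earth-Moon shot

distance_m = c * round_trip_time / 2   # halve it: the pulse travels there and back
print(f"Earth-Moon distance: about {distance_m / 1000:,.0f} km")   # roughly 384,000 km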

C -Communications
Laser light can travel a large distance in outer space with little reduction in signal strength. In addition, high-energy laser light can carry 1,000 times as many television channels as the microwave signals used today. Lasers are therefore ideal for space communications. Low-loss optical fibers have been developed to transmit laser light for earthbound communication in telephone and computer systems. Laser techniques have also been used for high-density information recording. For instance, laser light simplifies the recording of a hologram, from which a three-dimensional image can be reconstructed with a laser beam. Lasers are also used to play audio CDs and videodiscs.

D -Medicine
Lasers have a wide range of medical uses. Intense, narrow beams of laser light can cut and cauterize certain body tissues in a small fraction of a second without damaging surrounding healthy tissues. Lasers have been used to “weld” the retina, bore holes in the skull, vaporize lesions, and cauterize blood vessels. Laser surgery has virtually replaced older surgical procedures for eye disorders. Laser techniques have also been developed for lab tests of small biological samples.


E -Military Applications
Laser guidance systems for missiles, aircraft, and satellites have been constructed. Guns can be fitted with laser sights and range finders. The use of laser beams to destroy hostile ballistic missiles has been proposed, as in the Strategic Defense Initiative urged by U.S. president Ronald Reagan and the Ballistic Missile Defense program supported by President George W. Bush. The ability of tunable dye lasers to selectively excite an atom or molecule may open up more efficient ways to separate isotopes for construction of nuclear weapons.

V -LASER SAFETY
Because the eye focuses laser light just as it does other light, the chief danger in working with lasers is eye damage. Therefore, laser light should not be viewed either directly or reflected. Lasers sold and used commercially in the United States must comply with a strict set of laws enforced by the Center for Devices and Radiological Health (CDRH), a department of the Food and Drug Administration. The CDRH has divided lasers into six groups, depending on their power output, their emission duration, and the energy of the photons they emit. The classification is then attached to the laser as a sticker. The higher the laser’s energy, the higher its potential to injure. High-powered lasers of the Class IV type (the highest classification) generate a beam of energy that can start fires, burn flesh, and cause permanent eye damage whether the light is direct, reflected, or diffused. Canada uses the same classification system, and laser use in Canada is overseen by Health Canada’s Radiation Protection Bureau.

Goggles blocking the specific color of photons that a laser produces are mandatory for the safe use of lasers. Even with goggles, direct exposure to laser light should be avoided.

Light Absorption and Emission

When a photon, or packet of light energy, is absorbed by an atom, the atom gains the energy of the photon, and one of the atom’s electrons may jump to a higher energy level. The atom is then said to be excited. When an electron of an excited atom falls to a lower energy level, the atom may emit the electron’s excess energy in the form of a photon. The energy levels, or orbitals, of the atoms shown here have been greatly simplified to illustrate these absorption and emission processes. For a more accurate depiction of electron orbitals, see the Atom article.
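
The size of the jump between energy levels fixes the energy, and hence the color, of the emitted photon through the relation E = hf = hc/λ. The short Python sketch below works through that arithmetic for the well-known 632.8 nm red line of the helium-neon laser, used here purely as an illustrative wavelength.

# Energy of a single photon: E = h*f = h*c / wavelength
h = 6.626e-34    # Planck's constant, joule-seconds
c = 3.0e8        # speed of light, metres per second (approximate)

wavelength = 632.8e-9            # metres; red helium-neon line, used as an example
frequency = c / wavelength       # hertz
energy_joules = h * frequency    # joules carried by one photon
energy_ev = energy_joules / 1.602e-19   # the same energy in electron volts

print(f"frequency: {frequency:.2e} Hz, photon energy: {energy_ev:.2f} eV")  # about 2 eV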

Laser and Incandescent Light

White light, such as that produced by an incandescent bulb, is composed of many colors of light—each with a different wavelength—and spreads out in all directions. Laser light consists of a single color (a single wavelength) and moves in one direction with the peaks and troughs of its waves in lockstep.

Antibiotics


I -INTRODUCTION
Antibiotics (Greek anti, “against”; bios, “life”) are chemical compounds used to kill or inhibit the growth of infectious organisms. Originally the term antibiotic referred only to organic compounds, produced by bacteria or molds, that are toxic to other microorganisms. The term is now used loosely to include synthetic and semisynthetic organic compounds. Antibiotic refers generally to antibacterials; however, because the term is loosely defined, it is preferable to specify compounds as being antimalarials, antivirals, or antiprotozoals. All antibiotics share the property of selective toxicity: They are more toxic to an invading organism than they are to an animal or human host. Penicillin is the most well-known antibiotic and has been used to fight many infectious diseases, including syphilis, gonorrhea, tetanus, and scarlet fever. Another antibiotic, streptomycin, has been used to combat tuberculosis.

II -HISTORY
Although the mechanisms of antibiotic action were not scientifically understood until the late 20th century, the principle of using organic compounds to fight infection has been known since ancient times. Crude plant extracts were used medicinally for centuries, and there is anecdotal evidence for the use of cheese molds for topical treatment of infection. The first observation of what would now be called an antibiotic effect was made in the 19th century by French chemist Louis Pasteur, who discovered that certain saprophytic bacteria can kill anthrax bacilli.

In the first decade of the 20th century, German physician and chemist Paul Ehrlich began experimenting with the synthesis of organic compounds that would selectively attack an infecting organism without harming the host organism. His experiments led to the development, in 1909, of salvarsan, a synthetic compound containing arsenic, which exhibited selective action against spirochetes, the bacteria that cause syphilis. Salvarsan remained the only effective treatment for syphilis until the purification of penicillin in the 1940s. In the 1920s British bacteriologist Sir Alexander Fleming, who later discovered penicillin, found a substance called lysozyme in many bodily secretions, such as tears and sweat, and in certain other plant and animal substances. Lysozyme has some antimicrobial activity, but it is not clinically useful.

Penicillin, the archetype of antibiotics, is a derivative of the mold Penicillium notatum. Penicillin was discovered accidentally in 1928 by Fleming, who showed its effectiveness in laboratory cultures against many disease-producing bacteria. This discovery marked the beginning of the development of antibacterial compounds produced by living organisms. Penicillin in its original form could not be given by mouth because it was destroyed in the digestive tract and the preparations had too many impurities for injection. No progress was made until the outbreak of World War II stimulated renewed research and the Australian pathologist Sir Howard Florey and German-British biochemist Ernst Chain purified enough of the drug to show that it would protect mice from infection. Florey and Chain then used the purified penicillin on a human patient who had staphylococcal and streptococcal septicemia with multiple abscesses and osteomyelitis. The patient, gravely ill and near death, was given intravenous injections of a partly purified preparation of penicillin every three hours. Because so little was available, the patient's urine was collected each day; the penicillin was extracted from the urine and used again. After five days the patient's condition improved vastly. However, with each passage through the body, some penicillin was lost. Eventually the supply ran out and the patient died.

The first antibiotic to be used successfully in the treatment of human disease was tyrothricin, isolated from certain soil bacteria by American bacteriologist Rene Dubos in 1939. This substance is too toxic for general use, but it is employed in the external treatment of certain infections. Other antibiotics produced by a group of soil bacteria called actinomycetes have proved more successful. One of these, streptomycin, discovered in 1944 by American biologist Selman Waksman and his associates, was, in its time, the major treatment for tuberculosis.

Since antibiotics came into general use in the 1950s, they have transformed the patterns of disease and death. Many diseases that once headed the mortality tables—such as tuberculosis, pneumonia, and septicemia—now hold lower positions. Surgical procedures, too, have been improved enormously, because lengthy and complex operations can now be carried out without a prohibitively high risk of infection. Chemotherapy has also been used in the treatment or prevention of protozoal and fungal diseases, especially malaria, a major killer in economically developing nations (see Third World). Slow progress is being made in the chemotherapeutic treatment of viral diseases. New drugs have been developed and used to treat shingles (see herpes) and chicken pox. There is also a continuing effort to find a cure for acquired immunodeficiency syndrome (AIDS), caused by the human immunodeficiency virus (HIV).

III -CLASSIFICATION
Antibiotics can be classified in several ways. The most common method classifies them according to their action against the infecting organism. Some antibiotics attack the cell wall; some disrupt the cell membrane; and the majority inhibit the synthesis of nucleic acids and proteins, the polymers that make up the bacterial cell. Another method classifies antibiotics according to which bacterial strains they affect: staphylococcus, streptococcus, or Escherichia coli, for example. Antibiotics are also classified on the basis of chemical structure, as penicillins, cephalosporins, aminoglycosides, tetracyclines, macrolides, or sulfonamides, among others.

A -Mechanisms of Action
Most antibiotics act by selectively interfering with the synthesis of one of the large-molecule constituents of the cell—the cell wall or proteins or nucleic acids. Some, however, act by disrupting the cell membrane (see Cell Death and Growth Suppression below). Some important and clinically useful drugs interfere with the synthesis of peptidoglycan, the most important component of the cell wall. These drugs include the β-lactam antibiotics, which are classified according to chemical structure into penicillins, cephalosporins, and carbapenems. All these antibiotics contain a β-lactam ring as a critical part of their chemical structure, and they inhibit synthesis of peptidoglycan, an essential part of the cell wall. They do not interfere with the synthesis of other intracellular components. The continuing buildup of materials inside the cell exerts ever-greater pressure on the membrane, which is no longer properly supported by peptidoglycan. The membrane gives way, the cell contents leak out, and the bacterium dies. These antibiotics do not affect human cells because human cells do not have cell walls.

Many antibiotics operate by inhibiting the synthesis of various intracellular bacterial molecules, including DNA, RNA, ribosomes, and proteins. The synthetic sulfonamides are among the antibiotics that indirectly interfere with nucleic acid synthesis. Nucleic-acid synthesis can also be stopped by antibiotics that inhibit the enzymes that assemble these polymers—for example, DNA polymerase or RNA polymerase. Examples of such antibiotics are actinomycin, rifamycin, and rifampicin, the last two being particularly valuable in the treatment of tuberculosis. The quinolone antibiotics inhibit synthesis of an enzyme responsible for the coiling and uncoiling of the chromosome, a process necessary for DNA replication and for transcription to messenger RNA. Some antibacterials affect the assembly of messenger RNA, thus causing its genetic message to be garbled. When these faulty messages are translated, the protein products are nonfunctional. There are also other mechanisms: The tetracyclines compete with incoming transfer-RNA molecules; the aminoglycosides cause the genetic message to be misread and a defective protein to be produced; chloramphenicol prevents the linking of amino acids to the growing protein; and puromycin causes the protein chain to terminate prematurely, releasing an incomplete protein.

B -Range of Effectiveness
In some species of bacteria the cell wall consists primarily of a thick layer of peptidoglycan. Other species have a much thinner layer of peptidoglycan and an outer as well as an inner membrane. When bacteria are subjected to Gram's stain, these differences in structure affect the differential staining of the bacteria with a dye called gentian violet. The differences in staining coloration (gram-positive bacteria appear purple and gram-negative bacteria appear colorless or reddish, depending on the process used) are the basis of the classification of bacteria into gram-positive (those with thick peptidoglycan) and gram-negative (those with thin peptidoglycan and an outer membrane), because the staining properties correlate with many other bacterial properties. Antibacterials can be further subdivided into narrow-spectrum and broad-spectrum agents. The narrow-spectrum penicillins act against many gram-positive bacteria. Aminoglycosides, also narrow-spectrum, act against many gram-negative as well as some gram-positive bacteria. The tetracyclines and chloramphenicol are both broad-spectrum drugs because they are effective against both gram-positive and gram-negative bacteria.

C -Cell Death and Growth Suppression

Antibiotics may also be classed as bactericidal (killing bacteria) or bacteriostatic (stopping bacterial growth and multiplication). Bacteriostatic drugs are nonetheless effective because bacteria that are prevented from growing will die off after a time or be killed by the defense mechanisms of the host. The tetracyclines and the sulfonamides are among the bacteriostatic antibiotics. Antibiotics that damage the cell membrane cause the cell's metabolites to leak out, thus killing the organism. Such compounds, including penicillins and cephalosporins, are therefore classed as bactericidal.
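
For revision purposes, the classification described in this section can be condensed into a small lookup table. The Python sketch below records only what the text above states (target of attack, bactericidal or bacteriostatic effect, and spectrum where one is given); entries left as None are simply not specified in this section, and the table is an illustration rather than an exhaustive reference.

# Summary of the antibiotic classes described in this section.
# None means the text above does not specify that property.
antibiotic_classes = {
    "penicillins":     {"target": "cell wall (peptidoglycan synthesis)",
                        "effect": "bactericidal", "spectrum": "mostly narrow (gram-positive)"},
    "cephalosporins":  {"target": "cell wall (beta-lactam ring)",
                        "effect": "bactericidal", "spectrum": None},
    "aminoglycosides": {"target": "protein synthesis (genetic message misread)",
                        "effect": None, "spectrum": "narrow (gram-negative and some gram-positive)"},
    "tetracyclines":   {"target": "protein synthesis (block transfer-RNA)",
                        "effect": "bacteriostatic", "spectrum": "broad"},
    "macrolides":      {"target": "protein synthesis (bind ribosomes)",
                        "effect": "bacteriostatic", "spectrum": None},
    "sulfonamides":    {"target": "nucleic-acid synthesis (indirect)",
                        "effect": "bacteriostatic", "spectrum": "broad"},
}

for name, info in antibiotic_classes.items():
    print(f"{name:15s} target: {info['target']}; effect: {info['effect']}; spectrum: {info['spectrum']}")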

IV -TYPES OF ANTIBIOTICS
Following is a list of some of the more common antibiotics and examples of some of their clinical uses. This section does not cover every antibiotic or every clinical application.

A -Penicillins
Penicillins are bactericidal, inhibiting formation of the cell wall. There are four types of penicillins: the narrow-spectrum penicillin-G types, ampicillin and its relatives, the penicillinase-resistants, and the extended spectrum penicillins that are active against pseudomonas. Penicillin-G types are effective against gram-positive strains of streptococci, staphylococci, and some gram-negative bacteria such as meningococcus. Penicillin-G is used to treat such diseases as syphilis, gonorrhea, meningitis, anthrax, and yaws. The related penicillin V has a similar range of action but is less effective. Ampicillin and amoxicillin have a range of effectiveness similar to that of penicillin-G, with a slightly broader spectrum, including some gram-negative bacteria. The penicillinase-resistants are penicillins that combat bacteria that have developed resistance to penicillin-G. The antipseudomonal penicillins are used against infections caused by gram-negative Pseudomonas bacteria, a particular problem in hospitals. They may be administered as a prophylactic in patients with compromised immune systems, who are at risk from gram-negative infections.

Side effects of the penicillins, while relatively rare, can include immediate and delayed allergic reactions—specifically, skin rashes, fever, and anaphylactic shock, which can be fatal.

B -Cephalosporins
Like the penicillins, cephalosporins have a β-lactam ring structure that interferes with synthesis of the bacterial cell wall and so are bactericidal. Cephalosporins are more effective than penicillin against gram-negative bacilli and equally effective against gram-positive cocci. Cephalosporins may be used to treat strains of meningitis and as a prophylactic for orthopedic, abdominal, and pelvic surgery. Rare hypersensitive reactions from the cephalosporins include skin rash and, less frequently, anaphylactic shock.

C -Aminoglycosides
Streptomycin is the oldest of the aminoglycosides. The aminoglycosides inhibit bacterial protein synthesis in many gram-negative and some gram-positive organisms. They are sometimes used in combination with penicillin. The members of this group tend to be more toxic than other antibiotics. Rare adverse effects associated with prolonged use of aminoglycosides include damage to the vestibular region of the ear, hearing loss, and kidney damage.

D -Tetracyclines
Tetracyclines are bacteriostatic, inhibiting bacterial protein synthesis. They are broad-spectrum antibiotics effective against strains of streptococci, gram-negative bacilli, rickettsia (the bacteria that cause typhus), and spirochetes (the bacteria that cause syphilis). They are also used to treat urinary-tract infections and bronchitis. Because of their wide range of effectiveness, tetracyclines can sometimes upset the balance of resident bacteria that are normally held in check by the body's immune system, leading to secondary infections in the gastrointestinal tract and vagina, for example. Tetracycline use is now limited because of the increase of resistant bacterial strains.

E -Macrolides
The macrolides are bacteriostatic, binding with bacterial ribosomes to inhibit protein synthesis. Erythromycin, one of the macrolides, is effective against gram-positive cocci and is often used as a substitute for penicillin against streptococcal and pneumococcal infections. Macrolides are also used to treat diphtheria and bacteremia. Side effects may include nausea, vomiting, and diarrhea; infrequently, there may be temporary auditory impairment.

F -Sulfonamides
The sulfonamides are synthetic bacteriostatic, broad-spectrum antibiotics, effective against most gram-positive and many gram-negative bacteria. However, because many gram-negative bacteria have developed resistance to the sulfonamides, these antibiotics are now used only in very specific situations, including treatment of urinary-tract infection, against meningococcal strains, and as a prophylactic for rheumatic fever. Side effects may include disruption of the gastrointestinal tract and hypersensitivity.

V -PRODUCTION
The production of a new antibiotic is lengthy and costly. First, the organism that makes the antibiotic must be identified and the antibiotic tested against a wide variety of bacterial species. Then the organism must be grown on a scale large enough to allow the purification and chemical analysis of the antibiotic and to demonstrate that it is unique. This is a complex procedure because there are several thousand compounds with antibiotic activity that have already been discovered, and these compounds are repeatedly rediscovered. After the antibiotic has been shown to be useful in the treatment of infections in animals, larger-scale preparation can be undertaken.

Commercial development requires a high yield and an economic method of purification. Extensive research may be needed to increase the yield by selecting improved strains of the organism or by changing the growth medium. The organism is then grown in large steel vats, in submerged cultures with forced aeration. The naturally fermented product may be modified chemically to produce a semisynthetic antibiotic. After purification, the effect of the antibiotic on the normal function of host tissues and organs (its pharmacology), as well as its possible toxic actions (toxicology), must be tested on a large number of animals of several species. In addition, the effective forms of administration must be determined. Antibiotics may be topical, applied to the surface of the skin, eye, or ear in the form of ointments or creams. They may be oral, or given by mouth, and either allowed to dissolve in the mouth or swallowed, in which case they are absorbed into the bloodstream through the intestines. Antibiotics may also be parenteral, or injected intramuscularly, intravenously, or subcutaneously; antibiotics are administered parenterally when fast absorption is required.

In the United States, once these steps have been completed, the manufacturer may file an Investigational New Drug Application with the Food and Drug Administration (FDA). If approved, the antibiotic can be tested on volunteers for toxicity, tolerance, absorption, and excretion. If subsequent tests on small numbers of patients are successful, the drug can be used on a larger group, usually in the hundreds. Finally a New Drug Application can be filed with the FDA, and, if this application is approved, the drug can be used generally in clinical medicine. These procedures, from the time the antibiotic is discovered in the laboratory until it undergoes clinical trial, usually extend over several years.

VI -RISKS AND LIMITATIONS
The use of antibiotics is limited because bacteria have evolved defenses against certain antibiotics. One of the main mechanisms of defense is inactivation of the antibiotic. This is the usual defense against penicillins and chloramphenicol, among others. Another form of defense involves a mutation that changes the bacterial enzyme affected by the drug in such a way that the antibiotic can no longer inhibit it. This is the main mechanism of resistance to the compounds that inhibit protein synthesis, such as the tetracyclines.

All these forms of resistance are transmitted genetically by the bacterium to its progeny. Genes that carry resistance can also be transmitted from one bacterium to another by means of plasmids, small loops of DNA separate from the bacterial chromosome that contain only a few genes, including the resistance gene. Some bacteria conjugate with others of the same species, forming temporary links during which the plasmids are passed from one to another. If two plasmids carrying resistance genes to different antibiotics are transferred to the same bacterium, their resistance genes can be assembled onto a single plasmid. The combined resistances can then be transmitted to another bacterium, where they may be combined with yet another type of resistance. In this way, plasmids are generated that carry resistance to several different classes of antibiotic. In addition, plasmids have evolved that can be transmitted from one species of bacteria to another, and these can transfer multiple antibiotic resistance between very dissimilar species of bacteria.

The problem of resistance has been exacerbated by the use of antibiotics as prophylactics, intended to prevent infection before it occurs. Indiscriminate and inappropriate use of antibiotics for the treatment of the common cold and other common viral infections, against which they have no effect, removes antibiotic-sensitive bacteria and allows the development of antibiotic-resistant bacteria. Similarly, the use of antibiotics in poultry and livestock feed has promoted the spread of drug resistance and has led to the widespread contamination of meat and poultry by drug-resistant bacteria such as Salmonella.

In the 1970s, tuberculosis seemed to have been nearly eradicated in the developed countries, although it was still prevalent in developing countries. Now its incidence is increasing, partly due to resistance of the tubercle bacillus to antibiotics. Some bacteria, particularly strains of staphylococci, are resistant to so many classes of antibiotics that the infections they cause are almost untreatable. When such a strain invades a surgical ward in a hospital, it is sometimes necessary to close the ward altogether for a time. Similarly, plasmodia, the causative organisms of malaria, have developed resistance to antibiotics, while, at the same time, the mosquitoes that carry plasmodia have become resistant to the insecticides that were once used to control them. Consequently, although malaria had been almost entirely eliminated, it is now again rampant in Africa, the Middle East, Southeast Asia, and parts of Latin America. Furthermore, the discovery of new antibiotics is now much less common than in the past.

Fertilizer



Fertilizer, natural or synthetic chemical substance or mixture used to enrich soil so as to promote plant growth. Plants do not require complex chemical compounds analogous to the vitamins and amino acids required for human nutrition, because plants are able to synthesize whatever compounds they need. They do require more than a dozen different chemical elements, and these elements must be present in forms that the plant can absorb and use. Within this restriction, nitrogen, for example, can be supplied with equal effectiveness in the form of urea, nitrates, ammonium compounds, or pure ammonia.

Virgin soil usually contains adequate amounts of all the elements required for proper plant nutrition. When a particular crop is grown on the same parcel of land year after year, however, the land may become exhausted of one or more specific nutrients. If such exhaustion occurs, nutrients in the form of fertilizers must be added to the soil. Plants can also be made to grow more lushly with suitable fertilizers.

Of the required nutrients, hydrogen, oxygen, and carbon are supplied in inexhaustible form by air and water. Sulfur, calcium, and iron are necessary nutrients that usually are present in soil in ample quantities. Lime (calcium) is often added to soil, but its function is primarily to reduce acidity and not, in the strict sense, to act as a fertilizer. Nitrogen is present in enormous quantities in the atmosphere, but plants are not able to use nitrogen in this form; bacteria provide nitrogen from the air to plants of the legume family through a process called nitrogen fixation. The three elements that most commonly must be supplied in fertilizers are nitrogen, phosphorus, and potassium. Certain other elements, such as boron, copper, and manganese, sometimes need to be included in small quantities.

Many fertilizers used since ancient times contain one or more of the three elements important to the soil. For example, manure and guano contain nitrogen. Bones contain small quantities of nitrogen and larger quantities of phosphorus. Wood ash contains appreciable quantities of potassium (depending considerably on the type of wood). Clover, alfalfa, and other legumes are grown as rotating crops and then plowed under, enriching the soil with nitrogen.

The term complete fertilizer often refers to any mixture containing all three important elements; such fertilizers are described by a set of three numbers. For example, 5-8-7 designates a fertilizer (usually in powder or granular form) containing 5 percent nitrogen, 8 percent phosphorus (calculated as phosphorus pentoxide), and 7 percent potassium (calculated as potassium oxide).
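
The grade numbers translate directly into weights of nutrients. The short Python sketch below works through the arithmetic for a hypothetical 50 kg bag of 5-8-7 fertilizer; the bag size is an assumption chosen only for illustration, and phosphorus and potassium are expressed as the oxides P2O5 and K2O, following the labelling convention described above.

# Nutrient content of a hypothetical 50 kg bag of 5-8-7 fertilizer.
bag_weight_kg = 50.0                      # assumed bag size, for illustration only
grade = {"nitrogen (N)": 5, "phosphorus (as P2O5)": 8, "potassium (as K2O)": 7}

for nutrient, percent in grade.items():
    kg = bag_weight_kg * percent / 100    # percentage of the bag weight
    print(f"{nutrient}: {kg:.1f} kg per bag")
# nitrogen 2.5 kg, phosphorus 4.0 kg, potassium 3.5 kg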

While fertilizers are essential to modern agriculture, their overuse can have harmful effects on plants and crops and on soil quality. In addition, the leaching of nutrients into bodies of water can lead to water pollution problems such as eutrophication, by causing excessive growth of vegetation.
The use of industrial waste materials in commercial fertilizers has been encouraged in the United States as a means of recycling waste products. The safety of this practice has recently been called into question. Its opponents argue that industrial wastes often contain elements that poison the soil and can introduce toxic chemicals into the food chain.

Enzymes

Protein molecules are built up by enzymes which join together tens or hundreds of amino acid molecules. These proteins are added to the cell membrane, to the cytoplasm or to the nucleus of the cell. They may also become the proteins which act as enzymes.

Enzymes are proteins that act as biological catalysts. They are made in all living cells. A catalyst is a chemical substance which speeds up a reaction but does not get used up during the reaction; thus, one enzyme molecule can be used many times over. Without these catalysts, which speed the rate of chemical reactions, metabolism would not occur at a fast enough rate to sustain life. For instance, if starch is mixed with water it will break down very slowly to sugar, taking several years. In your saliva, there is an enzyme called amylase which can break down starch to sugar in minutes or seconds.

Reactions in which large molecules are built up from smaller molecules are called anabolic reactions, whereas, reactions which split large molecules into smaller ones are called catabolic reactions.

Enzymes are specific
This means simply that an enzyme which normally acts on one substance will not act on a different one. The shape of an enzyme decides what substances it combines with. Each enzyme has a shape which exactly fits the substance on which it acts, but will not fit (or react with) substances of different shapes.
An enzyme molecule has a dent in it called the active site. This active site is exactly the right size and shape for a molecule of the substrate to fit into (much like a lock and key). Thus, an enzyme which breaks down starch to maltose will not also break down proteins to amino acids. Also, if a reaction takes place in stages, e.g.
starch → maltose (stage 1)
maltose → glucose (stage 2)
a different enzyme is needed for each stage.

The names of enzymes usually end with –ase and they are named according to the substance on which they act, or the reaction which they promote. For example, an enzyme which acts on proteins may be called a protease; one which removes hydrogen from a substance is a dehydrogenase.

The substance on which an enzyme acts is called its substrate. Thus, the enzyme sucrase acts on the substrate sucrose to produce the monosaccharides glucose and fructose.

Enzymes and temperature
A rise in temperature increases the rate of most chemical reactions; a fall in temperature slows them down. In many cases a rise of 10 degrees Celsius will roughly double the rate of reaction in a cell. This is equally true for enzyme-controlled reactions. Between 0 and 50 degrees Celsius, increasing the temperature increases the rate of reaction, because the enzyme molecules and substrate molecules move faster at higher temperatures and collide with each other more often. Above about 50 degrees Celsius, however, the enzymes, being proteins, are denatured: their shape is changed, so the substrate can no longer fit into the active site, and they stop working. A denatured enzyme cannot act as a catalyst.
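
The "doubling for every 10 degree rise" rule of thumb in the paragraph above can be written as a tiny model: below the denaturation temperature the rate doubles for each 10 degree Celsius rise, and above it the enzyme no longer works. The Python sketch below is only a simplified illustration of those two statements, not a real model of enzyme kinetics; the 50 degree cut-off is taken from the text and the baseline temperature is arbitrary.

# Toy model of the temperature rule above: the rate roughly doubles for each
# 10 degree Celsius rise, until the enzyme is denatured (about 50 C in the text).
def relative_rate(temp_c, base_temp_c=0.0, denature_temp_c=50.0):
    """Rate relative to the rate at base_temp_c; zero once the enzyme is denatured."""
    if temp_c > denature_temp_c:
        return 0.0                                   # denatured: no catalysis at all
    return 2 ** ((temp_c - base_temp_c) / 10.0)      # doubling per 10 degree rise

for t in (0, 10, 20, 30, 40, 50, 60):
    r = relative_rate(t)
    print(f"{t:2d} C -> " + (f"{r:.0f}x the rate at 0 C" if r else "denatured"))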

This is one of the reasons why organisms may be killed by prolonged exposure to high temperatures. The enzymes in their cells are denatured and the chemical reactions proceed too slowly to maintain life.

One way to test whether a substance is an enzyme is to heat it to the boiling point. If it can still carry out its reactions after this, it cannot be an enzyme. This technique is used as a ‘control’ in enzyme experiments.

Enzymes and pH
pH is a measure of how acidic or alkaline a solution is. The scale runs from 1 to 14. A pH of 7 is neutral. A pH below 7 is acidic and a pH above 7 is alkaline.
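
pH is the negative base-10 logarithm of the hydrogen-ion concentration, which is why each step on the scale represents a ten-fold change in acidity. The short Python sketch below illustrates the relation; the example concentrations are chosen only for illustration.

import math

# pH is the negative base-10 logarithm of the hydrogen-ion concentration (mol per litre).
def ph(hydrogen_ion_concentration):
    return -math.log10(hydrogen_ion_concentration)

print(ph(1e-7))   # 7.0 -> neutral, e.g. pure water
print(ph(1e-2))   # 2.0 -> strongly acidic, like the stomach conditions mentioned below
print(ph(1e-9))   # 9.0 -> alkaline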

Acid or alkaline conditions alter the chemical properties of proteins, including enzymes. For most enzymes, there is a small range of pH over which their molecules are exactly the right shape to catalyse their reaction. Above or below this pH, their molecules lose their shape, so the substrate can no longer fit into the enzyme’s active site and the enzyme cannot act as a catalyst. The protein-digesting enzyme in your stomach, for example, works well at an acidity of pH 2. At this pH, the enzyme amylase, from your saliva, cannot work at all. Inside cells, most enzymes work best in neutral conditions (pH 7).

Although changes in pH affect the activity of enzymes, these effects are usually reversible, i.e. an enzyme which is inactivated by a low pH will resume its normal activity when its optimum pH is restored. Extremes of pH, however, may denature some enzymes irreversibly.

N.B. An enzyme that has been denatured by extreme pH or temperature will not resume its normal activity when the pH or temperature is returned to its optimum value, because its molecules have permanently lost their shape.

The pH or temperature at which an enzyme works best is often called its optimum pH or temperature.

Rates of enzyme reactions
As explained above, the rate of an enzyme-controlled reaction depends on the temperature and pH. It also depends on the concentrations of the enzyme and its substrate. The more enzyme molecules produced by a cell, the faster the reaction will proceed, provided there are enough substrate molecules available. Similarly, an increase in the substrate concentration will speed up the reaction if there are enough enzyme molecules to cope with the additional substrate.
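
The dependence on enzyme and substrate concentration described above can be pictured with a simple saturating model: the rate is proportional to the amount of enzyme, and it rises with substrate concentration but levels off once every enzyme molecule is occupied. The Python sketch below is only an illustrative model consistent with this paragraph; the particular formula and constants are assumptions, not something stated in the text.

# Illustrative saturating model: the rate is proportional to the enzyme amount and
# rises with substrate concentration, levelling off once all enzymes are occupied.
def reaction_rate(substrate, enzyme, k_cat=10.0, k_half=2.0):
    """Relative rate for the given (arbitrary-unit) enzyme and substrate amounts."""
    return k_cat * enzyme * substrate / (k_half + substrate)

for s in (0.5, 1, 2, 5, 10, 50):
    print(f"substrate {s:4}: rate {reaction_rate(s, enzyme=1.0):6.2f}")

# Doubling the enzyme doubles the rate at any substrate level:
print(reaction_rate(5, enzyme=2.0), "=", 2 * reaction_rate(5, enzyme=1.0))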

Intra- and extracellular enzymes
All enzymes are made inside cells. Most of them remain inside the cell to speed up the reactions in the cytoplasm and nucleus. These are called intracellular enzymes (‘intra’ means ‘inside’). In a few cases, the enzymes made in the cells are let out of the cells to do their work outside. These are extracellular enzymes (‘extra’ means ‘outside’).
Examples:
1. Fungi and bacteria release extracellular enzymes in order to digest their food.
2. A mould growing on a piece of bread releases starch-digesting enzymes into the bread and absorbs sugars which the enzyme produces from the bread.
3. In the digestive system, extracellular enzymes are released into the stomach and intestines in order to digest the food.

Role of enzymes in the biological washing products
Biological washing powders contain enzymes (extracted from micro-organisms) such as proteases, lipases and amylases, often with high optimum temperatures, which help to break down protein and fat stains, such as blood and egg, into smaller molecules. These smaller molecules are colourless and soluble in water, so they can be washed away.

For example, the enzyme protease breaks down the colourful but insoluble protein molecules in the stains into simple amino acids. These are colourless and soluble simple molecules which can easily dissolve in water and be washed away. These powders are biodegradable and do not cause pollution.

Role of enzymes in seed germination
Before germination, a seed is dry and contains inactive enzymes and stored food in the form of complex molecules that the seed cannot yet use. When the seed is watered, it begins to germinate and absorbs water. Once sufficient water has been absorbed, the hydrolysing enzymes (hydrolases) present in the seed are activated. These enzymes break down (by hydrolysis) the food stored in the seed, converting it into small, soluble molecules which are transported to the growing parts of the plant and used in the growth of the seedling.

Role of enzymes in food industry
Food manufacturers often use enzymes. For example, when juice is squeezed out of apples to make a drink, an enzyme called pectinase is usually added. Pectinase breaks down the substance that holds the apple cell walls together. This makes it easier to squeeze out the juice and breaks down the substances that make apple juice cloudy, turning it into a clear liquid.

Another enzyme that is often used is lactase, which breaks down lactose, the sugar found in milk, into simpler sugars (glucose and galactose). If lactase is added to milk, it breaks down all the lactose, making the milk safe to drink for people who do not have lactase in their digestive system.