World War Ebooks Catalog
In the UK, interest in transsphenoidal surgery was rekindled after World War II, and hypophysectomy was used for endocrine control of the spread of hormone-sensitive secondary cancers, such as breast and prostate. As neurosurgery was developing in other directions at that time, otolaryngologists such as Angel-James in Bristol, Richards in Cardiff, and Williams in London developed the transethmoidal technique, gaining enormous experience at a time when endocrinologists were able to measure accurately and to a certain extent control pituitary diseases medically. It was only on the continent and in Canada, through the legacy of Cushing's trainees (Norman Dott, who taught Guiot in Paris, who taught Jules Hardy in Montreal), that transsphenoidal surgery remained in the neurosurgical domain.
Ecological research in the United States was not well funded until after World War II. With the advent of the Cold War, science was suddenly considered important for national welfare. In 1950 the U.S. Congress established the National Science Foundation, and ecologists were able to make the case for their research along with that of the other sciences. The Atomic Energy Commission had already begun to fund ecological research by 1947, and under its patronage the Oak Ridge Laboratory and the University of Georgia gradually became important centers for radiation ecology research.
Karl von Frisch enrolled at the University of Vienna as a medical student in 1905. Although he excelled in the study of anatomy and physiology, he realized he had no interest in clinical medicine. In 1907, he transferred to the Zoological Institute in Munich. After studying zoology with Richard von Hertwig and experimental biology with Hans Przibram in Vienna, Frisch studied marine biology at the Biological Institute for Marine Research in Trieste. He was awarded the Ph.D. in 1910 for his thesis on light and color perception in minnows. These studies led to experiments on color discrimination in bees. While serving as assistant to Richard von Hertwig at the Zoological Institute at the University of Munich, he earned his University Teaching Certificate in Zoology and Comparative Anatomy. By 1914, Frisch had proved that food stimuli could be used to train bees to respond to different colors. His research was interrupted by World War I, during which he worked in a Vienna hospital.
In 1940, Cole and Curtis, using careful electrode placement coupled with biophysical and mathematical analysis, obtained the first convincing evidence for a substantial transient increase in membrane conductivity during passage of the action potential. While they estimated a large conductance increase, it was not infinite, so without a direct measurement of membrane potential it was not possible to confirm or refute Bernstein's hypothesis. During a postdoctoral year in the U.S. in 1937-1938, Hodgkin established connections with Cole's group at Columbia and worked with them at Woods Hole in the summer. He and Curtis nearly succeeded in measuring V directly by threading a glass micropipette down the giant axon. When each succeeded later (separately, with other collaborators), they found, surprisingly, that V rose transiently toward zero, but with a substantial overshoot. This finding brought the hypothesis of Bernstein into serious question and provided much food for thought...
The Public Health Laboratory Service (PHLS) began as a network of bacteriology laboratories in England and Wales, the Emergency Public Health Laboratory Service (EPHLS), brought together in 1939 to combat the threat of epidemics during the Second World War (Williams, 1985). In 1946 a permanent service was established and subsequently enlarged to include 63 laboratories by 1969 to monitor and control the spread of infectious disease in peacetime. In 1946 a collection of reference laboratories was assembled on the site of the Government Lymph Establishment at Colindale in North West London, where previously smallpox vaccine was produced. This formed the Central Public Health Laboratory (CPHL) and included the Virus Reference Laboratory (VRL), the initial function of which was to set up diagnostic facilities for smallpox. The work of CPHL expanded, and in 1951 the building of the ''tower block'' (Fig. 4.1), was begun. VRL was housed on the third floor of this building, and much of the...
Most (but perhaps not all) cases of necrotic enteritis caused by C. perfringens type C are foodborne and involve malnourished individuals (5). Therefore, it is not surprising that this illness was first recognized in post-World War II Germany, where it was referred to as Darmbrand ("bowel fire").
In addition to medical devices being valued as commodities, the medical device industry appears to be resistant to economic slowdown. This may be because the Baby Boomers (those born after World War II, between 1946 and 1964) are more concerned about staying fit than previous generations, and are also more receptive to high-technology solutions, e.g. having a stent implanted rather than life-long pharmaceutical use. There will also be more alternative treatment sites as patient treatment and care is moved out of the traditional hospital setting to the home, assisted living facilities, and regional treatment centers. While this has become common for some medical devices (e.g. diagnostic devices at regional laboratories and remote reading of radiological images), as technology develops, the types of alternative treatment locations will grow even more.
Yet AIDS does not come close to something like smallpox or hepatitis B as a killer of human beings on a historical scale. It is coming close to, and may have already surpassed, the influenza epidemic that swept 30 million human beings from the face of the earth just after World War I. What is so frightening about AIDS is the speed with which it is spreading, the incredible rate of increase in the number of cases diagnosed each year, with absolutely no cure in sight.
Much of what is written in present-day biochemistry textbooks about the metabolism of glycogen was discovered between about 1925 and 1950 by the remarkable husband and wife team of Carl F. Cori and Gerty T. Cori. Both trained in medicine in Europe at the end of World War I (she completed premedical studies and medical school in one year). They left Europe together in 1922 to establish research laboratories in the United States, first for nine years in Buffalo, New York, at what is now the Roswell Park Memorial Institute, then from 1931 until the end of their lives at Washington University in St. Louis.
It has been estimated that the influenza epidemic of 1918 killed 675,000 Americans, including 43,000 servicemen mobilized for World War I [12]. The impact was so profound as to depress average life expectancy in the U.S. by more than 12 years (Fig. 1) [24], and may have played a significant role in ending the World War I conflict [12, 43].
Reprint. Boston: Houghton Mifflin, 1994. A hard look at the effects of insecticides and pesticides on songbird populations throughout the United States, whose declining numbers yielded the silence to which the title attests. Cremlyn, R. J. Agrochemicals. New York: Wiley, 1991. Discusses the growth in sophistication and application of chemical pesticides or agrochemicals since World War II. Physio-
Gation of the carotid artery bifurcation and intracranial internal carotid artery ligations using metallic clips. In 1905 Chiari revived the concept that occlusive disease of the extracranial arteries can result in stroke, describing ulcerating plaque in the bifurcation found during his pathological studies [18], and then Hunt [41] noted that the extracranial portions of the carotid arteries should be carefully examined in patients with TIA and ischemic stroke. With the development of cerebral angiography in 1927 by Moniz [51], it became apparent that extracranial occlusive disease was a major cause of ischemic stroke. Thereafter correction had to await the development of vascular surgery during World War II.
Even acute pain is not a simple matter of stimulus intensity in the clinical situation. Beecher (1946, 1959) observed, on the Anzio beachhead during World War II, that wounded soldiers did not typically report pain as they waited to be removed from the battlefield, in spite of gunshot and shrapnel wounds that eventually may have needed major surgery, amputation, and long-term convalescence. He contrasted the wounded soldier's mild euphoria with similarly injured civilians in a hospital emergency setting, who typically expressed considerable pain and suffering. The soldier knew he was going home and that he no longer had to fear being killed; for the civilian, the pain had socio-economic implications, fear of job loss, and so on. Subsequent studies have confirmed that acute pain is primarily mediated by anxiety (Sternbach, 1968). Beecher's (1959) emphasis on the manner in which the psychological significance of the wound modulates the pain it produces has led to the delineation of learning...
Merrill took a special interest in this particular patient. Merrill had been working with a medical equipment company on the refinement of an artificial kidney, what we would today call a renal dialysis machine. This machine, first developed in Holland during World War II, was showing great promise in being able to substitute for one of the most vital kidney functions: removing from the blood toxic substances that could cause precisely the symptoms this young man was experiencing. In fact, on his second visit to the hospital Richard was treated with one of the artificial kidneys and, as the doctors expected, showed great improvement.
The distinguished Russian geneticist Dr. Nikolai Ivanovich Vavilov (1887-1943), at one time president of the Lenin All-Union Academy of Agricultural Sciences in the Soviet Union, had collected a large number of seeds, fruits, and tubers from parts of Transcaucasia, Ethiopia, and Afghanistan that are resistant to rust and mildew. This collection and others from all over the world were preserved at the Vavilov Institute of Plant Industry in Leningrad. During the 880-day siege of the city by the Third Reich during World War II, tens of thousands of the city's residents starved to death. Eight of Vavilov's scientists died of starvation, although they could have eaten the seed collections that were readily at hand. These collections were preserved and guarded because the people's agricultural future depended on the diverse gene stocks (Davis, 2003; Webster, 2003).
The history of the plague and other violent bacterial or viral infections demonstrates not only that diseases with a very high mortality rate have periodically attacked mankind, but also that there were always surviving individuals who were not affected or became immunized. Since such episodes of infection have occurred for millions of years, it is reasonable to assume that the immune system is versatile enough to withstand similar attacks in the future, and that there is no danger that the entire human population will be wiped out. However, this may be different when an engineered mixture of deadly bacteria and viruses is deliberately released by man himself. Yet this case of biological warfare can no longer be classified as a natural disaster; it represents an internal danger, discussed in more detail below.
Even though the real risk of shark attack anywhere in the world is statistically very small, sharks have been known to be such brutal killers that interest in preventing shark attacks is widespread. Various chemical shark repellents, such as "shark chaser," have been tried. This water-soluble mixture of dye and copper acetate was given to U.S. military personnel during World War II for use if they were stranded in the sea after their ships were sunk or planes downed. It was, however, later shown to have little or no effect on sharks. Other techniques have included the cartridge-loaded bang-stick, which is probably more dangerous to the untrained user than to a shark. A more promising device is the shark screen, a floating plastic bag that can be filled with water and entered, masking the odors, sounds, and movements that might attract sharks.
Historically, cataract patients had an intracapsular cataract extraction followed by aphakic spectacles. The modern era of implantation of an artificial lens at the time of cataract extraction began with Harold Ridley. During World War II, many ophthalmologists had noted that perforating eye injuries from airplane canopies made from acrylic Perspex plastic often resulted in minimal intraocular irritation secondary to the material itself. This observation, in conjunction with complaints from aphakic patients about their poor quality of vision, prompted Harold Ridley to design the first intraocular lens [7]. The first IOL implantation occurred in 1949 in the UK. Subsequently, IOLs have undergone numerous modifications in the design and surgical implantation procedures.
Traditional views of herbivorous and detritivorous insects as destructive, or at least nuisances, and ecological communities as nonintegrated, random assemblages of species supported harsh control measures. Early approaches to insect control included arsenicals, although much classic research on population regulation by predators and parasites also occurred prior to World War II. With the advent of broad-spectrum, long-lived, chlorinated hydrocarbons and organophosphates, developed as nerve toxins and used for control of disease vectors in combat zones during World War II, management of insects seemed assured. However, reliance on these insecticides exposed many target species to intense selection over successive generations and led to rapid development of resistant populations of many species (Soderlund and Bloomquist 1990). Concurrently, movement of the toxins through food webs resulted in adverse environmental consequences that became widely known in the 1960s through publication...
Fleming apparently did not grasp the significance of his findings, and the findings did not particularly excite the medical profession until the outbreak of World War II some 10 years later. At that time, a team of British and American scientists at the Northern Regional Laboratories in Peoria, Illinois, began searching for strains of Penicillium that would yield more of the drug. The breakthrough in the research came, however, when a different species of Penicillium mold, one that yielded 25 times the penicillin produced by the original culture, was found on a moldy cantaloupe from a local market. The scientists set to work germinating individual spores of this new mold on culture media, and by careful selection, they eventually were able to isolate a strain that produced more than 80 times the original quantity of penicillin. Later, when this strain was subjected to X-radiation, still other forms were produced that upped the penicillin output to 225 times that of Fleming's mold. Today, most of the penicillin produced around the world comes from descendants of that cantaloupe...
Because peptidoglycans are unique to bacterial cell walls, with no known homologous structures in mammals, the enzymes responsible for their synthesis are ideal targets for antibiotic action. Antibiotics that hit specific bacterial targets are sometimes called "magic bullets." Penicillin and its many synthetic analogs have been used to treat bacterial infections since these drugs came into wide application in World War II.
The concept of post-traumatic stress disorder has had a rather checkered history. It has tended to emerge largely in the aftermath of war. During and after World War I there was discussion of 'shell shock'. The treatment then infantilized patients by removing those who could not function in combat as far from it as possible. They usually remained emotionally crippled for much of their lives because the premise was that they had been so neurologically damaged that there was no repairing them. This turned out to be a mistake. So in World War II the term was changed to traumatic neurosis, and the idea was to treat people 'within the sound of the guns' (Kardiner & Spiegel, 1947). This was a much better idea because it acknowledged the reality of the intense reaction but did not presume that you had to consolidate it by pulling the soldiers away from their combat duties. Most were able to respond, which was a major advance. However, with the development of the psychoanalytic model there was more...
Signal detection theory arose in part from concerns during World War II. Sailors on watch had to decide whether enemy submarines were nearby and then respond accordingly. If a submarine was nearby and a sailor noted this by activating the alarm for battle stations, this represented a hit. If there was no submarine nearby and the sailor sounded no alarm, a correct reject occurred. However, the sailor's errors could take two forms. If no submarine was present but the sailor saw an iceberg or a whale and mistakenly believed a submarine was present, sounding the call to battle stations represented a false alarm. The potentially more serious form of error happened when there was a submarine present and the sailor did not note this. Making no response placed the ship in jeopardy and constituted a miss. Obviously, in such circumstances, the relative payoffs and costs favored sounding the alarm whenever the possibility existed that an enemy submarine was in the area.
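The four outcomes described above (hit, miss, false alarm, correct reject) are exactly what signal detection theory tallies to separate sensitivity from response bias. As a minimal sketch, with invented watch-duty counts (not from the source), the standard sensitivity index d′ is the difference between the z-transformed hit rate and false-alarm rate:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejects):
    """Sensitivity index d' from the four signal-detection outcomes.

    hit rate   = hits / (hits + misses)                 (signal trials)
    fa rate    = false_alarms / (false_alarms + correct_rejects)  (noise trials)
    d'         = z(hit rate) - z(fa rate)
    """
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejects)
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts: 50 submarine-present trials, 50 submarine-absent trials
print(round(d_prime(hits=40, misses=10, false_alarms=5, correct_rejects=45), 2))
# → 2.12
```

A sailor biased toward sounding the alarm (because misses are costlier than false alarms, as the passage notes) would raise both the hit rate and the false-alarm rate; d′ stays fixed while the bias shifts, which is the point of the framework.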
The concept of the glovebox, used to protect a process from the operator, or to protect the operator from a process, is hardly new. Gloveboxes were first developed within the atomic weapons programme during World War II; development continued within the nuclear power industry up to the present day. Gloveboxes were also used almost from the beginning of sterile product manufacture, because operators were quickly recognised to be the major source of contamination. The use of gloveboxes declined when reliable panel high efficiency particulate air (HEPA) filters (see Chapter 2) became available (Agalloco 1995). These filters led to the development of cleanrooms, which have dominated sterile production until recently. Gloveboxes have also been developed for nonnuclear containment purposes, particularly where pathogenic organisms are involved, and clear standards exist for such containments. The Class III Biological Safety Cabinet defined in BS 5295, and latterly in EN 12469, is, of course, a...
Subsequently, ideas of statistical inference evolved in the 20th century, with the important notions being developed from the 1890s to the 1950s. The leaders in statistics at the beginning of the 20th century were Karl Pearson, Egon Pearson (Karl Pearson's son), Harald Cramér, Ronald Fisher, and Jerzy Neyman. They developed early statistical methodology and foundational theory. Later applications arose in engineering and the military (particularly during World War II).
The fruits of wild roses, called hips (Fig. 24.10), are exceptionally rich in vitamin C. In fact, they may contain as much as 60 times the vitamin C of a comparable quantity of citrus fruit. Native Americans from coast to coast included rose hips in their diet (except for members of a British Columbia tribe, who believed they gave one "an itchy seat"), and it is believed that this practice contributed to scurvy being unknown among Native Americans. During World War II when food supplies became scarce in some European countries, children in particular were kept healthy on diets that included wild rose hips. In addition to vitamin C, the hips
When the Japanese in World War II took control of Indonesia, they shut off the Allies' sole source of quinine, the drug of choice for the prevention and treatment of malaria. Cinchona tree plantations in Indonesia provided the bark from which quinine was extracted. In 1941, the world's demand for Cinchona bark exceeded 700 tons per year, and 90% of that supply came from Indonesia (1). Synthetic antimalarials like quinacrine were the only option available. Quinacrine was a synthetic antimalarial drug discovered in Germany in 1932. Winthrop Chemical in the US made this drug and sold it under the trade name Atabrine. (1) Slater, L.B. (2004) Malaria chemotherapy and the kaleidoscopic organization of biomedical research during World War II. Ambix 51(2): 107-134. (3) Joy, R.J.T. (1999) Malaria in American troops in the South and Southwest Pacific in World War II. Medical History 43(2): 192-207.
By far, the most important bryophytes to humans are the peat mosses. When allowed to absorb water, 1 kilogram (2.2 pounds) of dry peat moss will take up 25 kilograms (55 pounds) of water. Its extraordinary absorptive capacity has made it very useful as a soil conditioner in nurseries and as a component of potting mixtures. Live shellfish and other organisms are shipped in it. The natural acidity produced inhibits bacterial and fungal growth and gives it antiseptic properties. The absorbency, which is greater than that of cotton, combined with the antiseptic properties, has made it a useful poultice material for application to wounds. It was used for this purpose during the Crimean War of 1854 to 1856 and, as indicated in the chapter introduction, on an emergency basis during World War I. Extensive peat deposits have been formed from the remains of peat mosses that flourished in past eras. Peat, like the undecomposed peat mosses, is used around the world as a soil conditioner and as a...
Unfortunately, during the first decades of the twentieth century there were three revolutions and the First World War, which did not favor the development of plastic surgery in Russia. Nevertheless, in 1916 the internationally recognized work of V.P. Filatov was published, dedicated to the results of using the round fat-dermal flaps he had developed. This method was the only means of transplanting tissue complexes right up to the second half of the twentieth century, when flaps with axial blood supply came into use. Other famous Russian surgeons who played an important role in the development of plastic surgery are P.I. Diakonov, N.A. Bogoraz, A.A. Limberg, A.E. Rauer, B.I. Vozchek, B.S. Preobrazhenskii, I.M. Mikhelson, and many others. After World War II, a special system was organized for the treatment of burn patients; this played a particular role in the formation and development of plastic surgery in Russia. During this research and organization work several...
Laying the groundwork for her critique of psychology in her 1979 chapter entitled "Bias in Psychology," Carolyn Wood Sherif described psychology's hierarchy at the time of World War II, in which experimentalists occupied the topmost rung of the ladder of prestige and power, followed by testers, those psychologists engaged in mental measurement. The lower rungs consisted of developmentalists, social psychologists, and clinicians. The highest levels of the hierarchy were occupied predominantly by men, with the lower, more applied levels consisting of more women. Sherif noted that these hierarchies had persisted into the late 1970s when she wrote her chapter, with each subdiscipline attempting to prove its worth by appearing as scientific (i.e., experimental) as possible.
The mission of health care services today is not only to cure disease, restore function, and alleviate ailment, but also to prevent disease and promote health. After World War II the 'academic world' tried to reorient the concept of health by broadening its definition. The net result of these efforts was
The brown tree snake vividly illustrates the dangers of introducing an exotic species to an island ecosystem. The brown tree snake, a native of Australia, was accidentally introduced to Guam after World War II. It reproduced in incredible numbers and reached the highest density of any snake population on Earth, with up to sixteen to twenty thousand snakes per square mile. In the late 1970's, biologists noticed that birds were disappearing from Guam. Of the fourteen bird species endemic to the island, at least nine had become extinct by the 1990's. Many small animal species, as well as chickens raised by the island's inhabitants, also disappeared. The snakes harm humans as well: they bite viciously and cause frequent power outages by climbing on power lines. Millions of dollars have been spent on unsuccessful attempts to control the brown tree snake.
In the midst of heavy battles in France during World War I, nurses at what would now be called a M.A.S.H. unit on one occasion ran out of bandages for the wounded soldiers. In desperation, they substituted some soft green plant material they found growing in the water at the edge of a nearby lake. To their surprise, the material turned out to be a great substitute for the bandages: there were fewer infections in the wounds dressed with the plant material than in those dressed with cotton bandages.
The influenza pandemic that followed the First World War killed over 20 million people, more than the war itself. In 1957 the H1N1 subtype was suddenly replaced by a new subtype, H2N2, known as Asian flu because it originated in China. Within a year over 1 billion people had been infected, but fortunately the mortality was much lower than in 1918, probably because the strain was intrinsically less virulent, although the availability of antibiotics to treat bacterial superinfection undoubtedly saved many lives. In 1968 this subtype was in turn replaced by the Hong Kong flu (H3N2). Finally, in 1977 the H1N1 subtype mysteriously reappeared, and since then the two subtypes H3N2 and H1N1 have cocirculated
Susser and Stein (36,37) studied the effects of acute food scarcity during World War II on a previously healthy and nutritionally replete population. Between October 1944 and May 1945, during the German occupation of the Netherlands, the German army restricted food supplies into certain Dutch cities, resulting in a substantial reduction in average daily energy intake to fewer than 1000 kcal. Adjacent cities, in which food supplies were not curtailed by the Germans, were not affected by the famine. Fifty percent of women who were affected by the famine developed amenorrhea. The conception rate dropped to 53% of normal (based on control cities) and correlated with the decreased caloric ration. In addition to the decrease in fertility, undernutrition resulted in an increase in perinatal mortality, congenital malformations, schizophrenia, and obesity. These observations indicate that optimal caloric intake is essential for normal fertility and prenatal growth.
Inspectors to encourage the eradication of potential breeding sites for the mosquito. In the 1920s one agricultural officer made the observation that the lethargy of many local people may be due not to inherent indolence, but to the fact that they were suffering from malaria (MNA M2 5 14). How prevalent malaria was during the colonial period can be judged from the fact that out of the total number of cases of all diseases treated in hospitals in November 1939 (i.e. 83,672), some 20,434 (24 per cent) were due to malaria. It is not surprising, then, that one colonial official could complain that most medical doctors and administrators in the country had failed to realize that malaria was the most serious cause of ill health, and probably the greatest single obstacle to economic development in Africa. Most medical officers in Africa, he wrote, had little interest in malaria control (MNA M2 5 38). There is little to indicate that Malawians are particularly immune to malaria's ravages....
The influenza virus infects the upper respiratory tract and major central airways in humans, horses, birds, pigs, and even seals. In 1918-19, an influenza pandemic (worldwide epidemic) killed more than 20 million people, a toll surpassing the number of casualties in World War I. Some areas, such as Alaska and the Pacific Islands, lost more than half of their population during that pandemic.
The turn of the century brought about a tremendous increase in the number of zoos and, to a lesser extent, aquariums that lasted up through World War I. It was a period during which zoos and aquariums improved their programs in conservation, animal husbandry, research, and education. Beginning in the 1890's, conservation of wildlife became an important concern in the United States and the European colonies. Europe had been dealing with conservation issues for many centuries, but these other regions saw their seemingly limitless resources quickly disappear. The New York Zoological Park (the Bronx Zoo), along with other United States zoos, played a significant role in conserving the American bison, which had nearly become extinct in the wild. European zoos did likewise, saving the European bison (wisent), Père David's deer, and Przewalski's horse from extinction. World War I affected zoos because of a loss of employees to the war effort, a loss of revenues to operate the facilities, a loss of...
We are often interested only in investigating the relationship between two binary variables (e.g., a disease and an exposure); however, we have to control for confounders. A confounding variable is a variable that may be associated with either the disease or exposure or both. For example, in Example 1.2, a case-control study was undertaken to investigate the relationship between lung cancer and employment in shipyards during World War II among male residents of coastal Georgia. In this case, smoking is a confounder: it has been found to be associated with lung cancer, and it may be associated with employment because construction workers are likely to be smokers. Specifically, we want to know
With the Renaissance and the introduction of firearms, the number of war dead strongly increased. It is estimated that the Thirty Years' War (1618-1648) caused 7 million military and civilian deaths. Due to large-scale mechanization and further technological advances (machine guns, tanks, and airplanes) there were 20 million war dead in the First World War and 56 million in the Second World War. And because of associated social and political catastrophes (collectivization and the Holocaust) the total death tolls were easily doubled. During these wars and afterwards, even more formidable weapons of mass destruction were created: chemical and biological as well as atomic weapons. Judging from the enormous advances of weapons technology, mankind will have to face still more powerful and deadly inventions in the future.
It was not until 1880 that the active ingredient ephedrine was isolated, finally leading to its characterisation in 1920. Ephedrine was widely used in the treatment of asthma. Increased efforts to find a synthetic substitute led to the rediscovery of amphetamine, which had been synthesised 40 years before. Since then, many analogues of amphetamine have been developed and characterised, including the popular street drug methamphetamine, which was synthesised in 1912 in Darmstadt by Merck. During World War II, amphetamines came into use in the military as a means to keep pilots awake and vigilant during long flight hours. The first condition amphetamine was used for clinically was narcolepsy. Although not curative, it revolutionised therapy for this condition by making patients relatively symptom free.
Controls were selected from admissions to the four hospitals and from death certificates in the same period for diagnoses other than lung cancer, bladder cancer, or chronic lung disease. Data are tabulated separately for smokers and nonsmokers in Table 1.1. The exposure under investigation, shipbuilding, refers to employment in shipyards during World War II. By using a separate tabulation, with the first half of the table for nonsmokers and the second half for smokers, we treat smoking as a potential confounder. A confounder is a factor, an exposure by itself, not under investigation but related to both the disease (in this case, lung cancer) and the exposure (shipbuilding): previous studies have linked smoking to lung cancer, and construction workers are more likely to be smokers. The term exposure is used here to emphasize that employment in shipyards is a suspected risk factor; however, the term is also used in studies where the factor under investigation has beneficial effects.
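The stratified tabulation described above is usually summarized with a common odds ratio that pools the within-stratum comparisons, most often the Mantel-Haenszel estimator. A minimal sketch in Python, using made-up 2x2 counts for the nonsmoker and smoker strata (not the actual Table 1.1 values):

```python
def mh_odds_ratio(strata):
    """Mantel-Haenszel common odds ratio across 2x2 strata.

    Each stratum is a tuple (a, b, c, d):
      a = exposed cases,    b = unexposed cases,
      c = exposed controls, d = unexposed controls.
    OR_MH = sum(a*d/n) / sum(b*c/n), with n the stratum total.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical counts: (shipyard cases, other cases, shipyard controls, other controls)
strata = [
    (11, 50, 35, 203),   # nonsmokers
    (84, 313, 45, 270),  # smokers
]
print(round(mh_odds_ratio(strata), 2))
# → 1.53
```

Because the estimate is formed within each smoking stratum before pooling, smoking cannot distort the shipbuilding-lung cancer comparison the way it would in a single collapsed table; that is precisely the adjustment for confounding the passage motivates.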
However, the human animal is unique in that cannibalism is also practiced as a social and religious custom not involving subsistence and survival in the normal sense. One type of human cannibalism involves a genuine reverence by relatives for their dead. Called endocannibalism, this practice is based upon the belief that eating the flesh of departed relatives shows great respect and veneration of the dead. This type was practiced among the natives of islands of the southern Pacific Ocean until it was declared illegal following World War II.
It is interesting that much of the early data on energy and nutrient allocation by insects was a byproduct of studies during 1950 to 1970 on anticipated effects of nuclear war on radioisotope movement through ecosystems (e.g., Crossley and Howden 1961, Crossley and Witkamp 1964). Research also addressed effects of radioactive fallout on organisms that affect human health and food supply. Radiation effects on insects and other arthropods were perceived to be of special concern because of the recognized importance of these organisms to human health and crop production. Radioactive isotopes, such as 32P, 137Cs (assimilated and allocated as is K), and 85Sr (assimilated and allocated as is Ca), became useful tools for tracking the assimilation and allocation of nutrients through organisms, food webs, and ecosystems.
Bowlby's work evolved out of his observations, during World War II, of children who had been separated from their primary caregivers because of the war, and of the consequences of that deprivation of contact. Bowlby based his ideas on ethological theory, suggesting that the infant attachment bond is an instinctually guided behavioural system that has functioned throughout human evolution to protect the infant from danger and predators. According to Bowlby, attachment behaviours were part of a behavioural system that involved inherent motivation; in other words, it was not reducible to another drive.
The great advances in radio technology during the Second World War led in the 1950s to the construction of large radio telescopes, both in England (Jodrell Bank, near Manchester) and the United States (National Radio Astronomy Observatory, NRAO, at Green Bank, West Virginia). The prime aim of these telescopes was to look at a large variety of astronomical objects (galactic and extragalactic sources, interstellar gas, and the Sun) by using the radio window through the Earth's atmosphere. It is worth noting that, aside from the narrow optical window, the radio window in those days was the only other means of observing the universe from the Earth's surface. Today, through the use of satellites, this limitation imposed by the atmosphere no longer exists.
Few viruses have played a more central role in the historical development of virology than influenza. The pandemic that swept the world in 1918, just as the First World War ended, killed 20 million people, more than the war itself. The eventual isolation of the virus in ferrets in 1933 was a milestone in the development of virology as a laboratory science. During the ensuing two decades Burnet pioneered technological and conceptual approaches to the study of the virus in embryonated eggs. His system became the accepted laboratory model for the investigation of viral multiplication and genetic interactions until the early 1950s, when newly discovered cell culture techniques transferred the advantage to poliovirus. Hemagglutination, discovered accidentally by Hirst when he tore a blood vessel while harvesting influenza-infected chick allantoic fluid, provided a simple assay method, subsequently extended to many other viruses. The imaginative investigations of Webster and Laver...
In the early 1940's a new generation of herbicidal compounds emerged. In an attempt to mimic natural plant hormones, the defoliant 2,4-D was created. At low concentrations, 2,4-D promotes retention of fruit and leaves; at higher concentrations, it over-stimulates plant metabolism, causing leaves to drop off. A related chemical, 2,4,5-T, came into general use in 1948. The years after World War II saw the first large-scale application of herbicides in agriculture and other areas. The new defoliants rapidly gained acceptance because of their effectiveness against broad-leaved weeds in corn, sorghum, small grains, and grass pastures.
Hosts include a range of nematodes, arthropods, molluscs, algae, and plants. Several of the plant pathogens have impacted the cultural and economic history of humans. These include the causative agents for a variety of root rots, downy mildews, white rusts, and late blights. Downy mildews of grapes (Plasmopara) and tobacco (Peronospora) were responsible for the near-decimation of the French wine industry and the Cuban tobacco industry in the late 1870's and the 1980's, respectively. Similarly, Phytophthora infestans, the causative agent of potato late blight, was responsible for the Irish Potato Famine of the mid-1840's and, during World War I, for the starvation of German civilians in 1915-1916.
Physiology flourished during the 1900's, beginning with Frederick Frost Blackman's studies on respiration, Jagadis Chandra Bose's on biophysics, and Mikhail Semenovich Tsvet's on cytophysiology. In the 1930's Paul Jackson Kramer began investigating water usage, and Hans Adolf Krebs began investigating cyclic metabolic pathways. After World War II, François Jacob studied the functioning of DNA and RNA and the genetic control of enzymes, and helped develop the concept of the operon in cellular physiology. Melvin Calvin took advantage of the availability of radioactive tracers, par...

Ecology arose simultaneously with genetics but at a slower pace. Johannes Warming, Gottlieb Haberlandt, and Andreas Franz Wilhelm Schimper were botanists who laid a foundation for plant ecology shortly before 1900, and the self-consciously ecological researches by Christen Raunkiaer, Felix Eugen Fritsch, Frederic Edward Clements, and Henry Chandler Cowles built upon their foundation. Fritsch studied periodicity in...
The presence of huge Russian and Chinese Communist armies in Europe and Asia that overwhelmingly outnumbered those of the U.S. and its allies made it clear to Western leaders that America would be defenseless without atomic bombs. Only a shield of nuclear weapons, the thinking went, would deter the communists from sweeping over us. Anyone who questioned the making of more and more powerful bombs was thus seen at best as a communist dupe or fellow traveler and at worst as a traitor.
For years he had suffered personal attacks and professional setbacks because of his antibomb efforts. Now he was vindicated. "I gave over five hundred public lectures about radioactive fallout and nuclear war and the need for stopping the bomb tests in the atmosphere and the need for eliminating war ultimately," he later told an interviewer. "I was doing something that I didn't care to do very much, except for reasons of morality and conviction. So when I received word in 1963 that I had been given the Nobel Prize, I felt that showed that the sacrifice that I had made was worthwhile."
Bioterrorism is the use of biological organisms or their derivatives to sow terror in a civilian population. Bioterrorism is an offshoot of biological warfare, and like most progeny it differs from its parent. The main difference is that biological warfare is a highly organized aggressive activity carried out by one state against another, usually through a military arm, with the sole aim of killing or disabling people. Bioterrorism, while using many of the same agents and tactics as biological warfare, is a more ad hoc activity carried out by individuals or political groups against other political groups or states, with a mixture of objectives. Biological warfare itself has a long if occasionally crude history, including dipping arrowheads and spear points into rotting cadavers or feces, or lobbing entire diseased corpses over town or castle walls. The perpetrators obviously had little understanding of what they were doing, so it may be less than accurate to call this biological...
The malpractice crises of the 1970s and 1980s hit during the best of times (at least financially) for physicians and hospitals. The decades from World War II through the implementation of Medicare are generally considered American medicine's golden age. Faith in the beneficial potential of health care generated massive financial investments from both public and private sources, which were placed under the unfettered control of physicians. Dramatic increases in malpractice litigation toward the end of this period arguably sought to justify the public's trust. Lawsuits imposed real emotional and reputational costs on defendants but seldom constituted a severe financial burden. As studies from the 1980s demonstrated, even substantial increases in liability insurance premiums were quickly passed through to patients and payers as higher fees (6,7).
The only modern-day use of plague for biological warfare was by the Japanese during their occupation of China in World War II, when they released plague-infected fleas onto civilian populations. Both the United States and the Soviet Union pursued development of aerosolized Y. pestis, but these appear never to have been used. Aerosols would of course induce pneumonic plague and could be extraordinarily deadly. The WHO estimates that 50 kg of aerosolized Y. pestis spread over an urban population of 5 million would infect at least 150,000 people, causing at least 36,000 deaths.
Tularemia was investigated by several countries between 1930 and 1970 as a potential biological warfare agent. The bacterium can be concentrated into a paste, which can be freeze-dried and then milled into a fine powder suitable for distribution through the air. A WHO study estimated that 50 kg of bacteria (about 110 pounds) in aerosolized form, spread over a population of 5 million people, would incapacitate about 250,000 people and cause nearly 20,000 deaths. The United States retained stocks of F. tularensis through the late 1960s, but these were destroyed in the general obliteration of such stockpiles in the early 1970s. Current military research with this microbe is restricted to defensive strategies.
Around three hundred insects have been recorded as pests on the cotton plant. In a check list of cotton insect pests of Nyasaland, compiled by Colin Smee at the outbreak of the Second World War (1940), over eighty insects were recorded as attacking cotton - sometimes as seedling plants, as well as affecting the different parts of the plant, roots, shoots, leaves, flowers and bolls (MNA 51 30 40). Some twenty years later Charles Sweeney made a detailed study of the insect pests of cotton and recorded the following number of species attacking cotton.
As their main mechanism of action, these compounds inhibit protein biosynthesis by binding to the ribosome and inhibiting peptidyl transferase activity (see page 407). They also inhibit DNA biosynthesis. A major human condition known to be caused by trichothecenes is alimentary toxic aleukia (ATA), characterized by destruction of the skin, haemorrhaging, inflammation, sepsis, a decrease in red and white blood corpuscles, bone marrow atrophy, and a high mortality rate. A severe outbreak of ATA was recorded in the former Soviet Union shortly after the Second World War when food shortages necessitated the consumption of grain that had overwintered in the field. This had become badly contaminated with Fusarium sporotrichioides and hence T-2 toxin. It is estimated that tens of thousands died as a result.
Figure 2-15 The human pseudoachondroplasia phenotype, illustrated by a family of five sisters and two brothers. The phenotype is determined by a dominant allele, which we can call D, that interferes with bone growth during development. This photograph was taken upon the arrival of the family in Israel after the end of the Second World War. UPI Bettmann News Photos.
Scientists in the Western nations did not learn about gibberellin until after the end of World War II, in approximately 1950. As many as fifty gibberellins have thus far been isolated from fungi and from the healthy tissues of higher plants (buds, leaf primordia, immature seeds, fruit tissue, and roots). Gibberellin production in roots is generous, and gibberellin is translocated to other parts of the plant.
Smallpox was the first disease-causing microbe to be purged from the human species, through a worldwide immunization campaign launched by the WHO in 1967. By 1972, routine vaccinations had been discontinued in the United States because of a small risk of active disease from the vaccine itself. The fact that V. major could not retreat into an animal reservoir during this campaign was a major factor in its eradication. Today V. major officially exists only as frozen stockpiles at the CDC in Atlanta and in a former biological warfare research center near Novosibirsk, in Russia.
Aside from Saccharomyces cerevisiae, the yeast of the brewing industry, other species of Saccharomyces are used in making other beverages including sake, ginger beer, rum, and wine. Saccharomyces is also rich in B vitamins and protein (which makes up approximately 50 percent of its dry weight), making it a nourishing feed for livestock. During World War II, thousands of tons of yeast were produced in Germany as a source of protein. Suitable methods of yeast production for protein manufacture could make yeasts valuable resources for third world countries.
New epidemics, new diseases, new viruses, or new virus-disease associations. New viruses and new virus-disease associations continue to be discovered virtually every year. We need only make the point that over 90 percent of all the human viruses known today were completely unknown at the end of World War II. Opportunities are legion for astute clinicians as well as virologists and epidemiologists to be instrumental in such discoveries.
Although dengue fever has been known for over 200 years, prior to the 1950s outbreaks of dengue were rare, and because of the slow transport of viremic persons between tropical countries epidemics in any particular locality occurred at intervals of decades. During and after the Second World War millions of people in all countries of the developing world moved to the cities, resulting in rapid and unplanned urbanization and the expansion of breeding places for Aedes aegypti. Major epidemics involving hundreds of thousands of people have occurred in the Caribbean (1977-1981), South America (since the early 1980s), the Pacific (1979), and China (1978-1980), as well as in Southeast Asia and Africa. The simultaneous circulation of multiple serotypes led to the appearance of an essentially new disease, dengue hemorrhagic fever. Although in retrospect the disease probably occurred in northern Australia in 1898 and in Greece in the 1920s, the first outbreak of dengue hemorrhagic fever and...
In 1937, Lorenz taught comparative anatomy and animal psychology at the University of Vienna and became the coeditor of the leading journal for ethology. After World War II, Lorenz headed the Institute of Comparative Ethology at Altenberg, Austria. From 1961 to 1973, he served as the director of the Max Planck Institute for Behavior Physiology in Seewiesen, Austria. Lorenz developed the concepts of animal behavior that led to the modern scientific understanding of how behavioral patterns evolve in a species. He advanced the concepts of how these patterns develop and mature during the lifetime of an individual organism.
Northern white cedar, also known as arborvitae, is a favorite ornamental in temperate areas. The wood is pliable, and several Native American tribes used it for canoes. The Atlantic or southern white cedar was the first tree to be used for the construction of pipes for pipe organs in North America. During World War II, old logs of this species found in a swamp in New Jersey were milled and used in the construction of patrol torpedo boats.
A long time passed before this proposal was proven correct. Following World War II, isotopes of many elements, both radioactive and stable, became available for research. A heavy, nonradioactive isotope of oxygen known as O18 made it possible to supply photosynthesis with carbon dioxide and water in which the oxygen of the carbon dioxide differed from the oxygen of the water. A carefully arranged experiment was devised in which the oxygen of the carbon dioxide was O16 and the oxygen of the water was O18. This difference enabled researchers to trace the source of the oxygen in each compound formed. The chemical properties of the two types of oxygen are exactly alike; only their weights differ. In this way, it was clearly established that all the oxygen liberated in photosynthesis comes from the breakdown of water. An equation could now be written as shown in figure 9-6. Unlike the water on the left side of the equation, the water on the right side is manufactured water.
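The tracer experiment described above is usually summarized by the balanced photosynthesis equation with the water oxygen labeled. The form below is the standard textbook version, consistent with the description here, assuming figure 9-6 shows the same labeling (asterisk-free superscript 18 marks the heavy isotope):

```latex
6\,\mathrm{CO_2} \;+\; 12\,\mathrm{H_2\,{}^{18}O}
\;\longrightarrow\;
\mathrm{C_6H_{12}O_6} \;+\; 6\,{}^{18}\mathrm{O_2} \;+\; 6\,\mathrm{H_2O}
```

Twelve water molecules appear on the left rather than six so that all twelve labeled oxygen atoms can emerge in the six molecules of O2; the six unlabeled water molecules on the right are the newly manufactured water, whose oxygen comes from the CO2.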
Working dogs, such as the sled dogs who delivered diphtheria medicine to an ice-bound Alaskan town and Chips, a World War II military dog who located enemy snipers, have often become heroes celebrated in books and movies. In November, 2000, federal legislation was approved to discontinue the practice of automatically euthanizing retired American military working dogs. Memorials around the world have commemorated working dogs. Putney, William W. Always Faithful: A Memoir of the Marine Dogs of World War II. New York: The Free Press, 2001. A veterinarian's insights about the usefulness of dogs' keen senses for military applications.