Space Ships: Impulse Drive, Warp Drive and Dilithium Crystals

The impulse drive is powered by nuclear fusion. The problem here is that fusion turns only about one percent of the available mass into energy. You can work out how much fuel would be required to do the following simple manoeuvre: start from rest, go to half the speed of light and then stop. It turns out that you need about 7000 times the mass of the ship in fuel just to do that.

The galaxy is about 100,000 light years across. If you're travelling at mere light speed, it would take years just to get to the nearest star. So if you want to do any significant travel in the galaxy in a reasonable amount of time - say an episode - you have to travel much faster than the speed of light [which impulse drive cannot provide]. Enter the warp drive.

The warp drive is powered with matter and antimatter. Every elementary particle has an associated antiparticle with the same mass but opposite properties, such as charge, and when the two come together they annihilate to produce pure radiation. This is probably the best kind of rocket propulsion, because all of the mass is turned into energy.

The rate at which matter and antimatter interact in the warp drive is apparently regulated by dilithium crystals. That doesn't really make sense, because when matter and antimatter interact, it's all or nothing - you can't regulate the rate. Another problem is that they annihilate on a scale which is thousands, if not millions, of times smaller than the scale of the atoms in a crystal, so it's hard to imagine how any crystalline structure could channel matter and antimatter.

Warp Drive and Negative Energy

The warp drive is impossible the way the writers describe it. As everyone knows, Einstein says you can't go faster than the speed of light. But there is a way that warp drive could work. Although Einstein caused the problem, he also came to the rescue by inventing general relativity. In principle, general relativity allows you to move faster than the speed of light relative to distant objects while standing still locally. As we sit here, we're not moving relative to our local surroundings. But relative to a galaxy at the other end of the visible universe, we are moving away at the speed of light - and that galaxy is also standing still relative to its surroundings. What's happening is that the space between the two galaxies is expanding.

So if you wanted to have a warp drive, in principle you could let space do the work for you. Let's say you wanted to go to the nearest star. You'd fire up your chemical rockets and go up about 200 miles from the Earth's surface. Now you're about four light years away from the nearest star. Then you would arrange for the space between you and the star to catastrophically collapse, and the space between you and the Earth to expand. Suddenly you're 200 miles from the star and four light years away from Earth. Your clocks haven't changed, and no physical object has moved.

What's wonderful about general relativity is that it allows you to create designer space-times: you can take any kind of universe with any geometry and write it down mathematically. A few years ago a physicist named Miguel Alcubierre found a solution of Einstein's equations that would have all the properties of a warp drive but didn't violate general relativity. The question is - can you create the configuration of matter and energy that is required? Mathematically you can, but how about physically?
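For readers who want to see the mathematics, Alcubierre's solution is usually written as the following line element (in units where the speed of light c = 1). The bubble trajectory x_s(t) and the shaping function f are free choices, which is exactly the "designer space-time" freedom described above.

```latex
% Alcubierre's warp-bubble line element (1994), with c = 1.
% x_s(t): trajectory of the bubble centre;  v_s(t) = dx_s/dt: its speed;
% f(r_s): a smooth "top-hat" function, equal to 1 inside the bubble and falling to 0 far away.
ds^2 = -dt^2 + \bigl(dx - v_s(t)\, f(r_s)\, dt\bigr)^2 + dy^2 + dz^2,
\qquad
r_s(t) = \sqrt{\bigl(x - x_s(t)\bigr)^2 + y^2 + z^2}.
```

Inside the bubble (f = 1) and far outside it (f = 0) space-time is flat, so observers in either region are locally at rest; all of the distortion is confined to the bubble wall, which is why the configuration of matter and energy needed to build that wall is the real question.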
Gravity always pulls, so to make space expand you have to add a repulsion term. It turns out that you need something called negative energy. On small scales, negative energy configurations do exist. But can you create negative energy in a controlled way on a macroscopic scale? We don't know.

I think that interstellar travel will be impractical for a very long time because of the huge energy requirements. If we ever interact with extraterrestrial life, I think the last way we will do it is by sending spacecraft. Broadcasting our existence with radio messages would certainly be much cheaper.

What is Fusion?

Fusion is simply combining the nuclei of light elements to form a heavier element. This nuclear reaction releases large amounts of energy - typically a million times more energy than can be obtained by combining atoms chemically (such as by burning coal). In a fusion reaction, the total mass of the resultant nuclei is slightly less than the total mass of the original particles, and this difference is converted to energy as described by Einstein's famous equation, E=mc².

First-generation fusion reactors will use deuterium and tritium, isotopes of hydrogen, for fuel. Deuterium occurs naturally - about one part in 6000 of the hydrogen in ordinary water is deuterium. Tritium can be produced from lithium. Advanced fusion reactors will burn pure deuterium (or maybe even hydrogen), of which there is an essentially limitless supply. The deuterium-tritium fusion reaction - deuterium + tritium → alpha particle + neutron, with the alpha particle carrying 3.5 MeV and the neutron 14.1 MeV - results in an energy gain of about 450:1.

Explanation of Virtual Reality

Although VR is still in its infancy, the technology potentially represents a new medium for human communication, education and entertainment. VR arcades are already a reality: playtime costs $1.00 per minute. Virtual cadavers help future doctors explore the human body.

A VR system can be based on a personal computer. The computer controls several different sensory display devices to immerse you in a three-dimensional virtual environment. The most common sensory displays are head-mounted displays for 3D visuals and headphones for 3D audio. Since these displays need to be updated with new sensory information more than 20 times per second, it often helps to have additional processing power in the form of add-on 3D graphics cards and 3D sound cards.

A VR system needs to be able to track the position and orientation of your head in order to calculate the appropriate perspectives to display, as sketched below. Any other body parts, such as your hands, feet, or prehensile tails, that will play an active part in the virtual environment must also be tracked. The device that does this is called (surprisingly enough) a tracking device.

Input devices make up the final category of VR hardware. To interact with the virtual environment you may wish to use a joystick (sometimes called a wand in VR systems), an instrumented glove, a keyboard, voice recognition, or other types of input. These devices allow you to travel through the virtual environment, manipulate objects, and perhaps even build onto the virtual world. Tracking devices are sometimes used together with input devices to add a spatial (three-dimensional) component to their operation.
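As an illustration of the head-tracking step mentioned above, here is a minimal sketch, in Python with NumPy, of how a tracked head pose might be turned into the view transform used to draw the correct perspective. The names used here (read_head_pose, view_matrix) are hypothetical stand-ins, not the API of any particular VR system.

```python
import numpy as np

def view_matrix(head_position, head_rotation):
    """Build a 4x4 world-to-eye (view) matrix from a tracked head pose.

    head_position : (3,) array, head location in world coordinates (metres).
    head_rotation : (3, 3) array, rotation matrix giving the head's orientation.
    The view transform is the inverse of the head's pose: rotate by R^T,
    then translate by -R^T @ p.
    """
    r_inv = head_rotation.T                 # inverse of a rotation is its transpose
    view = np.eye(4)
    view[:3, :3] = r_inv
    view[:3, 3] = -r_inv @ head_position
    return view

def read_head_pose():
    """Stand-in for a tracking-device driver (hypothetical).

    A real tracker reports position and orientation 20+ times per second;
    here we just return a fixed pose: head 1.7 m up, turned 30 degrees to the left.
    """
    angle = np.radians(30.0)
    rotation = np.array([[ np.cos(angle), 0.0, np.sin(angle)],
                         [ 0.0,           1.0, 0.0          ],
                         [-np.sin(angle), 0.0, np.cos(angle)]])
    return np.array([0.0, 1.7, 0.0]), rotation

if __name__ == "__main__":
    position, rotation = read_head_pose()
    print(view_matrix(position, rotation))
```

In a complete system this matrix would be recomputed for each eye, more than 20 times per second, and handed to the 3D graphics hardware along with the scene geometry.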
In order to build virtual environments you often need auxiliary software for creating the objects that go into the virtual environment and setting their characteristics. Three-dimensional modelling software allows you to construct the geometry of the objects and specify some of their visual properties. Two-dimensional graphics software lets you manipulate textures to be applied to the objects, which can often greatly enhance their visual detail. Digital sound editing software lets you mix and edit the sounds that objects make. All these software packages have other commercial uses in addition to building VR, so there is a great variety to choose from.

The simulation software is what brings all the components together. It accepts data from the trackers and input devices, applies this information to the objects you have built, and updates the sensory displays (a simple sketch of this loop appears at the end of this passage). You use the simulation software to program how the objects behave and to set the rules that the virtual world follows.

Although they have improved dramatically in the last few years, VR simulations are not yet photorealistic, and a complete VR system is still quite expensive: VPL's systems cost upwards of $300,000, and the VR system GE built for the military cost $16 million.

VR hardware includes 3D audio-visual head-mounted displays (based on Ivan Sutherland's 1965 design for the "ultimate display") and real-time tracking devices like "datagloves" and "cybersuits". These are linked by umbilical cables to some very sophisticated software and some powerful computer hardware. Together, they immerse users in cyberspaces and places they then have the sense of participating in.

Current Applications

Practical applications of virtual reality are under active development by a variety of agencies and disciplines. The range of applications illustrates the enormous potential for this technology to address highly varied problems and needs.

Medicine: Virtual reality is used in planning radiation treatments for cancer patients at the University of North Carolina (Stewart, 1991). Using computerized scans of a patient's anatomy viewed through virtual reality, physicians can move proposed beams around by hand and position them so that they converge most effectively on a tumor. By combining ultrasound scanners with head-mounted display units, Robinett (1991) believes that physicians will soon be able to "see directly inside of living tissue" (p. 18). With half-silvered mirrors, the display allows the wearer to see through to the real world, with images from ultrasound data optically superimposed onto the patient. Using this "x-ray vision", an obstetrician could "see the woman, feel the fetus kick beneath her hands, and see the ultrasound image of the fetus appearing to hang in space inside her belly" (p. 18).

Chemistry: At the University of North Carolina, chemists use virtual reality to "see" protein structures in three dimensions and, holding a special joystick, find ways to design new drugs that will "dock" perfectly with enzyme molecules (Brooks, 1988; Stewart, 1992).

Architecture: With the aid of a treadmill and data sensors, architects can now "walk through" building designs before any actual construction takes place. Users can judge design features from any perspective they choose (Southwest Educational Development Laboratory, 1990).

Interior Design: Customers in Japan may design custom kitchens and use virtual reality to see the result. Wearing goggles and a glove, they can walk through their design and actually touch the "virtual appliances" (Peterson, 1992).
Military: For some time, there have been investigations among military agencies concerning the use of virtual reality in personnel training and in the design of new weapon systems. The technology is being applied to the design of tank simulators, flight simulators, and to aircraft design and repair (Lowenstein & Barbee, 1990).

Space Exploration: NASA has designed a virtual reality system which creates the illusion of flying over a Martian landscape accurately created from photographs of the planet's surface (Peterson, 1992). The Visualization for Planetary Exploration Project (also designed by NASA) employs virtual reality to allow users to explore the solar system (Ditlea, 1989). Current efforts are focusing on the use of virtual reality to prepare astronauts to live and work on orbiting space stations (Fritz, 1991) and to undertake construction and repair in a space environment (Southwest Educational Development Laboratory, 1990).

Robotics: One of the most practical and immediate applications for virtual reality is robotics. Simple, small hand movements in a DataGlove can be used to control complex robotics equipment.
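To make the earlier description of simulation software concrete, the sketch below shows the kind of update loop such software runs: poll the trackers and input devices, apply that information to the objects in the virtual world, and refresh the sensory displays, all more than 20 times per second. The Tracker, Wand and Scene classes here are hypothetical scaffolding, not the interface of any real VR package.

```python
import time

# Hypothetical stand-ins for VR hardware drivers and a rendering engine.
class Tracker:
    def read_pose(self):
        """Return the tracked head position and orientation (stubbed here)."""
        return (0.0, 1.7, 0.0), (0.0, 0.0, 0.0)

class Wand:
    def read_state(self):
        """Return joystick axes and button presses (stubbed here)."""
        return {"axes": (0.0, 0.0), "buttons": []}

class Scene:
    def apply_input(self, wand_state):
        """Move the viewpoint or manipulate objects based on the input device."""
        pass

    def render(self, head_position, head_orientation):
        """Redraw the head-mounted display and 3D audio for the current pose."""
        pass

def run_simulation(scene, tracker, wand, rate_hz=30):
    """Core VR loop: poll sensors, update the world, refresh the displays."""
    frame_time = 1.0 / rate_hz                        # at least 20 updates per second
    while True:
        start = time.monotonic()
        position, orientation = tracker.read_pose()   # where is the head?
        scene.apply_input(wand.read_state())          # user interaction
        scene.render(position, orientation)           # update the sensory displays
        # Sleep off whatever remains of this frame's time budget.
        time.sleep(max(0.0, frame_time - (time.monotonic() - start)))
```

Calling run_simulation(Scene(), Tracker(), Wand()) would spin this loop indefinitely; a real system would add an exit condition and replace the stubs with actual device drivers and a renderer.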