Videotape is magnetic tape used for storing video and sound. The information stored can be in the form of either an analog or a digital signal. Videotape is used in both video tape recorders and, more commonly, videocassette recorders and camcorders. Videotapes are also used for storing scientific or medical data, such as the data produced by an electrocardiogram. Because video signals have a very high bandwidth, and stationary heads would require extremely high tape speeds, in most cases a helical-scan video head rotates against the moving tape to record the data in two dimensions. Tape is a linear method of storing information and thus imposes delays in accessing a portion of the tape that is not under the heads. The early 2000s saw the introduction and rise to prominence of high-quality random-access video recording media such as hard disks and flash memory; since then, videotape has been relegated to archival and similar uses.

The electronics division of entertainer Bing Crosby's production company, Bing Crosby Enterprises (BCE), gave the world's first demonstration of a videotape recording in Los Angeles on November 11, 1951.
Developed by John T. Mullin and Wayne R. Johnson since 1950, the device produced what were described as "blurred and indistinct" images, using a modified Ampex 200 tape recorder and standard quarter-inch audio tape moving at 360 inches per second. A year later, an improved version using one-inch magnetic tape was shown to the press, who expressed amazement at the quality of the images, although they had a "persistent grainy quality that looked like a worn motion picture". Overall, the picture quality was still considered inferior to the best kinescope recordings on film. Bing Crosby Enterprises hoped to have a commercial version available in 1954, but none came forth. The BBC experimented from 1952 to 1958 with a high-speed linear videotape system called VERA, but this proved unfeasible; it used half-inch tape on 20-inch reels traveling at 200 inches per second. RCA demonstrated the magnetic tape recording of both black-and-white and color television programs at its Princeton laboratories on December 1, 1953.
The high-speed longitudinal tape system, called Simplex, in development since 1951, could record and play back only a few minutes of a television program. The color system used half-inch tape on 10-1/2 inch reels to record five tracks: one each for red, green and blue, plus synchronization and audio. The black-and-white system used quarter-inch tape on 10-1/2 inch reels with two tracks, one for video and one for audio. Both systems ran at 360 inches per second with 2,500 feet on a reel. RCA-owned NBC first used the system on The Jonathan Winters Show on October 23, 1956, when a prerecorded song sequence by Dorothy Collins in color was included in the otherwise live television program. In 1953, Dr. Norikazu Sawazaki developed a prototype helical scan video tape recorder. BCE demonstrated a color system in February 1955 using a longitudinal recording on half-inch tape. CBS, RCA's competitor, was about to order BCE machines when Ampex introduced the superior Quadruplex system. BCE was acquired by the 3M Company in 1956. In 1959, Toshiba released the first commercial helical scan video tape recorder.
The first commercial professional broadcast-quality videotape machines capable of replacing kinescopes were the two-inch quadruplex videotape machines introduced by Ampex on April 14, 1956, at the National Association of Broadcasters convention in Chicago. Quad employed a transverse four-head system on two-inch tape, with stationary heads for the soundtrack. CBS Television first used the Ampex VRX-1000 Mark IV at its Television City studios in Hollywood on November 30, 1956, to play a delayed broadcast of Douglas Edwards and the News from New York City to the Pacific Time Zone. On January 22, 1957, the NBC Television game show Truth or Consequences, produced in Hollywood, became the first program to be broadcast in all time zones from a prerecorded videotape. Ampex introduced a color videotape recorder in 1958 under a cross-licensing agreement with RCA, whose engineers had developed it from an Ampex black-and-white recorder. NBC's special An Evening With Fred Astaire is the oldest surviving television network color videotape, and has been restored by the UCLA Film and Television Archive.
On December 7, 1963, instant replay was used for the first time, during the live transmission of the Army–Navy Game, by its inventor, director Tony Verna. Although Quad became the industry standard for thirty years, it had drawbacks, such as an inability to freeze pictures and the lack of picture search. In early machines, a tape could reliably be played back only with the same set of hand-made tape heads, which wore out quickly. Despite these problems, Quad was capable of producing excellent images. Subsequent videotape systems have used helical scan, where the video heads record diagonal tracks onto the tape. Many early videotape recordings were not preserved. While much less expensive and more convenient than kinescope, the high cost of 3M Scotch 179 and other early videotapes meant that most broadcasters erased and reused them, regarding videotape as a better and more cost-effective means of time-delaying broadcasts than kinescopes; after all, it was the four time zones of the continental United States that had made the system desirable in the first place.
However, some classic television programs recorded on studio videotape still exist and are available on DVD – among them NBC's Peter Pan with Mary Martin as Peter, and several episodes o
Video is an electronic medium for the recording, playback and display of moving visual media. Video technology was first developed for mechanical television systems, which were replaced by cathode ray tube systems, which in turn were replaced by flat panel displays of several types. Video systems vary in display resolution, aspect ratio, refresh rate, color capabilities and other qualities. Analog and digital variants exist and can be carried on a variety of media, including radio broadcast, magnetic tape, optical discs, computer files and network streaming. Video was initially an exclusively live technology. Charles Ginsburg led an Ampex research team developing one of the first practical video tape recorders. In 1951, the first video tape recorder captured live images from television cameras by converting the camera's electrical impulses and saving the information onto magnetic video tape.
Video recorders were sold for US$50,000 in 1956, and videotapes cost US$300 per one-hour reel. However, prices dropped over the years. The use of digital techniques in video created digital video, which allows higher quality and much lower cost than earlier analog technology. After the invention of the DVD in 1997 and the Blu-ray Disc in 2006, sales of videotape and recording equipment plummeted. Advances in computer technology allow inexpensive personal computers and smartphones to capture, store and transmit digital video, further reducing the cost of video production and allowing program-makers and broadcasters to move to tapeless production. The advent of digital broadcasting and the subsequent digital television transition is in the process of relegating analog video to the status of a legacy technology in most parts of the world. As of 2015, with the increasing use of high-resolution video cameras with improved dynamic range and color gamuts, along with high-dynamic-range digital intermediate data formats with improved color depth, modern digital video technology is converging with digital film technology.
Frame rate, the number of still pictures per unit of time of video, ranges from six or eight frames per second for old mechanical cameras to 120 or more frames per second for new professional cameras. The PAL and SECAM standards specify 25 frames per second. Film is shot at the slower frame rate of 24 frames per second, which complicates the process of transferring a cinematic motion picture to video. The minimum frame rate to achieve a comfortable illusion of a moving image is about sixteen frames per second. Video can be interlaced or progressive. In progressive scan systems, each refresh period updates all scan lines in each frame in sequence. When displaying a natively progressive broadcast or recorded signal, the result is optimum spatial resolution of both the stationary and moving parts of the image. Interlacing was invented as a way to reduce flicker in early mechanical and CRT video displays without increasing the number of complete frames per second. Interlacing retains detail while requiring lower bandwidth compared to progressive scanning.
In interlaced video, the horizontal scan lines of each complete frame are treated as if numbered consecutively and captured as two fields: an odd field consisting of the odd-numbered lines and an even field consisting of the even-numbered lines. Analog display devices reproduce each frame in the same way, effectively doubling the frame rate as far as perceptible overall flicker is concerned. When the image capture device acquires the fields one at a time, rather than dividing up a complete frame after it is captured, the frame rate for motion is effectively doubled as well, resulting in smoother, more lifelike reproduction of moving parts of the image when viewed on an interlaced CRT display. NTSC, PAL and SECAM are interlaced formats. Abbreviated video resolution specifications include an i to indicate interlacing. For example, the PAL video format is described as 576i50, where 576 indicates the total number of horizontal scan lines, i indicates interlacing, and 50 indicates 50 fields per second. When displaying a natively interlaced signal on a progressive scan device, overall spatial resolution is degraded by simple line doubling, and artifacts such as flickering or "comb" effects in moving parts of the image appear unless special signal processing eliminates them.
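The abbreviated notation can be unpacked mechanically. The sketch below is a hypothetical helper (not any standard API) that splits a spec such as 576i50 into line count, scan type and rate, assuming the convention described above that the trailing number counts fields per second for interlaced video and frames per second for progressive video:

```python
import re

def parse_video_spec(spec):
    """Split an abbreviated video spec such as '576i50' or '720p60'
    into line count, scan type, and field/frame rate.
    (Illustrative helper only, not a standard API.)"""
    m = re.fullmatch(r"(\d+)([ip])(\d+(?:\.\d+)?)", spec)
    if not m:
        raise ValueError(f"unrecognized spec: {spec}")
    lines, scan, rate = int(m.group(1)), m.group(2), float(m.group(3))
    return {
        "lines": lines,
        "scan": "interlaced" if scan == "i" else "progressive",
        # For interlaced video the trailing number counts fields per
        # second, so the full-frame rate is half of it.
        "fields_per_second": rate if scan == "i" else None,
        "frames_per_second": rate / 2 if scan == "i" else rate,
    }
```

Applied to PAL's 576i50, this yields 576 lines, interlaced scanning, 50 fields per second and thus 25 complete frames per second.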
A procedure known as deinterlacing can optimize the display of an interlaced video signal from an analog, DVD or satellite source on a progressive scan device such as an LCD television, digital video projector or plasma panel. Deinterlacing cannot, however, produce video quality equivalent to true progressive scan source material. Aspect ratio describes the proportional relationship between the width and height of video screens and video picture elements. All popular video formats are rectangular, and so can be described by a ratio between width and height. The ratio of width to height for a traditional television screen is 4:3, or about 1.33:1. High-definition televisions use an aspect ratio of 16:9, or about 1.78:1. The aspect ratio of a full 35 mm film frame with soundtrack is 1.375:1. Pixels on computer monitors are square, but pixels used in digital video often have non-square aspect ratios, such as those used in the PAL and NTSC variants of the CCIR 601 digital video standard.
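The effect of non-square pixels can be made concrete with a little arithmetic: the display aspect ratio is the storage aspect ratio (the ratio of pixel counts) multiplied by the pixel aspect ratio. The sketch below uses the 704-pixel active areas and the 12:11 (PAL) and 10:11 (NTSC) pixel aspect ratios commonly quoted for CCIR 601 4:3 video; exact figures vary by convention, so treat these as illustrative assumptions:

```python
from fractions import Fraction

def display_aspect(width, height, pixel_aspect):
    """Display aspect ratio = storage aspect ratio x pixel aspect ratio."""
    return Fraction(width, height) * pixel_aspect

# PAL active picture: 704x576 with 12:11 pixels -> 4:3 display
pal = display_aspect(704, 576, Fraction(12, 11))

# NTSC active picture: 704x480 with 10:11 pixels -> 4:3 display
ntsc = display_aspect(704, 480, Fraction(10, 11))
```

Despite their different line counts and pixel shapes, both variants come out to the same 4:3 picture on screen, which is exactly why the non-square pixel aspect ratios were chosen.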
Spirit DataCine is a telecine and motion picture film scanner. The device can transfer 16mm and 35mm motion picture film to NTSC or PAL television standards or to one of many high-definition television standards. With the data transfer option, a Spirit DataCine can output DPX data files. The image pickup device is a solid-state charge-coupled device (CCD), which eliminated the need for the glass vacuum-tube CRTs used on older telecines. The units can transfer negative, intermediate and print film stock. One option is a Super 8 gate for the transfer of Super 8 mm film. With a sound pickup option, optical 16mm and 35mm soundtracks can be reproduced, as can 16mm magnetic-stripe sound. The unit can be controlled by a scene-by-scene color corrector. Ken Burns created a short documentary film, included in the DVD release of The Civil War, on how he used the Spirit DataCine to transfer and remaster that film. The operator of the unit is called a colorist assistant. The Spirit DataCine has become a standard for film scanning.
Over 370 units are used in post-production facilities around the world. Most current film productions are transferred on Spirit DataCines for TV, digital television, cable television, satellite television, direct-to-video, DVD, Blu-ray Disc, pay-per-view, in-flight entertainment, stock footage, film preservation, digital intermediate and digital cinema. The Spirit DataCine is made by DFT Digital Film Technology GmbH in Germany. All Spirit DataCines use continuous transport motion. An optional optical audio pickup system can be mounted in the capstan. All Spirit DataCines use a xenon lamp for illumination into a diffusion chamber to minimize dust and scratch visibility. With the standard 35mm lens gate, Super 35 mm and Academy 35 mm are supported, in 2-, 3- and 4-perf formats; VistaVision 8-perf and 6-perf are an option. The unit comes with select-a-speed, which allows selection of film speeds from 2.00 frames per second to 57.00 fps in SDTV and from 2.00 to 31.00 fps in HDTV interlace format. With the optional 16mm lens gate, standard 16mm and Super 16 mm are supported.
With the 16mm lens gate, an optional Super 8 mm film gate can be added. The 16mm audio system supports 16mm mag (magnetic-stripe) soundtracks on the motion picture film, which are picked up by a head and can be fed to an audio mixing console or to the VTR. Spirit DataCines use a charge-coupled device line array (CCD) for imaging. In print mode, a "white" light is shone through the exposed film image into a lens and then a prism, where color glass separates the image into the three primary colors: red, green and blue. Each beam of colored light is projected at a different CCD, one for each color. The CCD converts the light into an electrical signal, producing a modulated video signal that is color corrected and sized so it can be recorded onto videotape or a storage area network (SAN) hard disk array. Spirit DataCines can output to different TV standards, including HDTV. The Spatial Processor can change the size of the image for pan and scan or letterbox, make other aspect ratio and rotation changes, and produce interlaced video if needed.
The Spatial Processor also produces the 2:3 pulldown, if needed for the format. An optional Scream grain reducer can reduce film grain in all three color channels. The Spirit DataCine opened the door to the technology of digital intermediates, wherein telecine tools were not just used for video outputs, but could now be used for high-resolution data that would be recorded back out to film. The DFT Digital Film Technology Grass Valley Spirit 4K/2K/HD replaced the Spirit 2000 DataCine and uses both 2K and 4K line array CCDs. The SDC-2000 did not use color prisms or dichroic mirrors; color separation was done in the CCD. DFT revealed its newest scanner, Scanity, at the 2009 NAB Show. A Spirit DataCine outputting DPX files was used in the 2000 movie O Brother, Where Art Thou?. The DPX files were color corrected with a Pandora Int. Pogle color corrector with MegaDEF. A Kodak Lightning II film recorder was used to put the data output back onto film. To output the movie, the Spirit DataCine's Phantom Transfer Engine software, running on an SGI computer, was used to record the DPX files from the Spirit DataCine.
These files are stored on a SAN hard disk storage array. The Phantom Transfer Engine has since been replaced with Bones software running on a Linux-based PC. The first-generation interface for DPX data files used optical-fiber HIPPI cables; the next-generation interface is GSN (Gigabyte System Network) over fiber optics, originally known as HIPPI-6400 before being renamed GSN. The SAN hard disks are interfaced via dual Fibre Channel (FC) cables. The newest DPX output interface is InfiniBand. Most Spirit DataCines are controlled by a 2K or 2K Plus; some are controlled by a Pandora International Pogle, some with their MegaDEF or Pixi color grading system. A Spirit DataCine comes with a full-function control panel that can be used for control and color grading. Robert Bosch GmbH, Fernseh Division – which became BTS inc., then Philips Digital Video Systems, then Thomson's Grass Valley, and is now DFT Digital Film Technology – introduced the world's first CCD telecine, the FDL-60, in 1979. The FDL-60 was designed and made in Darmstadt, West Germany, and was the first all solid-state telecine.
FDL is short for Film Digital Line. The FDL-60 uses a three-CCD single-line array system, whereby three lines – red, green and blue, each with 1,024 pixels per line – record a single line of the film image. The FDL-60A uses three Fairchild Semiconductor CCD 133 chips for the image pickup.
Photographic processing or development is the chemical means by which photographic film or paper is treated after photographic exposure to produce a negative or positive image. Photographic processing transforms the latent image into a visible image, makes this permanent and renders it insensitive to light. All processes based upon the gelatin-silver process are similar, regardless of the film or paper's manufacturer. Exceptional variations include instant films, such as those made by Polaroid, and thermally developed films. Kodachrome required Kodak's proprietary K-14 process; Kodachrome film production ceased in 2009, and K-14 processing is no longer available as of December 30, 2010. Ilfochrome materials use the dye destruction process. All photographic processing uses a series of chemical baths. Processing, particularly the development stages, requires close control of temperature and time. The film may first be soaked in water to swell the gelatin layer, facilitating the action of the subsequent chemical treatments.
The developer converts the latent image to macroscopic particles of metallic silver. A stop bath,† typically a dilute solution of acetic acid or citric acid, halts the action of the developer; a rinse with clean water may be substituted. The fixer makes the image light-resistant by dissolving the remaining silver halide. A common fixer is hypo (ammonium thiosulfate). Washing in clean water removes any remaining fixer; residual fixer can corrode the silver image, leading to discolouration and fading. The washing time can be reduced, and the fixer more completely removed, if a hypo clearing agent is used after the fixer. Film may then be rinsed in a dilute solution of a non-ionic wetting agent to assist uniform drying, which eliminates drying marks caused by hard water. Film is dried in a dust-free environment and placed into protective sleeves. Once the film is processed, it is referred to as a negative, which may now be printed. Many different techniques can be used during the enlargement process; two examples are dodging and burning.
Alternatively, the negative may be scanned for digital printing or web viewing after adjustment and/or manipulation. † In modern automatic processing machines, the stop bath is replaced by mechanical squeegee or pinching rollers. These treatments remove much of the carried-over alkaline developer, and the acid, when used, neutralizes the alkalinity to reduce contamination of the fixing bath with the developer. Black-and-white reversal processing has three additional stages: following the stop bath, the film is bleached to remove the developed negative image; the film then contains a latent positive image formed from unexposed and undeveloped silver halide salts. The film is fogged, either chemically or by exposure to light, and the remaining silver halide salts are developed in the second developer, converting them into a positive image. The film is then fixed, washed and cut. Chromogenic materials use dye couplers to form colour images. Modern colour negative film is developed with the C-41 process, and colour negative print materials with the RA-4 process.
These processes are similar, with differences in the first chemical developer. The C-41 and RA-4 processes consist of the following steps: The colour developer develops the silver negative image, and its byproducts activate the dye couplers to form the colour dyes in each emulsion layer. A rehalogenising bleach converts the developed silver image into silver halides. A fixer removes the silver salts. The film is then washed, stabilised and cut. In the RA-4 process, the bleach and fix are combined; this is optional and reduces the number of processing steps. Transparency films, except Kodachrome, are developed using the E-6 process, which has the following stages: A black-and-white developer develops the silver in each image layer. Development is stopped with a stop bath. The film is fogged in the reversal step. The fogged silver halides are developed, and the oxidized developing agents couple with the dye couplers in each layer. The film is then bleached, fixed and dried as described above. In some old processes, the film emulsion was hardened during the process before the bleach.
Such a hardening bath used aldehydes, such as formaldehyde and glutaraldehyde. In modern processing, these hardening steps are unnecessary because the film emulsion is sufficiently hardened to withstand the processing chemicals. Black-and-white emulsions, both negative and positive, may be further processed. The image silver may be reacted with elements such as selenium or sulphur to increase image permanence and for aesthetic reasons, a process known as toning. In selenium toning, the image silver is changed to silver selenide; such compounds are more resistant to atmospheric oxidising agents than silver. If colour negative film is processed in conventional black-and-white developer, fixed, and then bleached with a bath containing hydrochloric acid and potassium dichromate solution, the resultant film, once exposed to light, can be redeveloped in colour developer to produce an unusual pastel colour effect. Before processing, the film must be removed from the camera and from its cassette, spool or holder in a light-proof room or container.
In amateur processing, the film is removed from the camera and wound onto a reel in complete darkness (usually inside a darkroom with the safelight turned off or a lightproof bag with
Telecine is the process of transferring motion picture film into video; it is performed in a color suite. The term is also used to refer to the equipment used in this post-production process. Telecine enables a motion picture, captured on film stock, to be viewed with standard video equipment, such as television sets, video cassette recorders, DVD or Blu-ray Disc players, or computers. This allowed television broadcasters to produce programmes using 16mm film stock, but transmit them in the same format and quality as other forms of television production. Furthermore, telecine allows film producers, television producers and film distributors working in the film industry to release their products on video, and allows producers to use video production equipment to complete their filmmaking projects. Within the film industry, it is also referred to as a TK, because TC is used to designate timecode. With the advent of popular broadcast television, producers realized they needed more than live television programming. By turning to film-originated material, they would have access to the wealth of films made for the cinema, in addition to recorded television programming on film that could be aired at different times.
However, the difference in frame rates between film and television meant that playing a film into a television camera would result in flickering. The kinescope was used to record the image from a television display to film, synchronized to the TV scan rate; this film could be re-played directly into a video camera for re-display. Non-live programming could be filmed using the same cameras, edited mechanically as normal, and played back for TV; as the film was run at the same speed as the television, the flickering was eliminated. Various displays, including projectors for these "video rate films", slide projectors and film cameras, were combined into a "film chain", allowing the broadcaster to cue up various forms of media and switch between them by moving a mirror or prism. Color was supported by using a multi-tube video camera and filters to separate the original color signal, feeding the red, green and blue signals to individual tubes. However, this still left film shot at cinema frame rates as a problem. The obvious solution is to speed up the film to match the television frame rate, but this, at least in the case of NTSC, is rather obvious to the eye and ear.
This problem is not difficult to fix, however. For NTSC, the difference in frame rates can be corrected by showing every fourth frame of film twice, although this does require the sound to be handled separately to avoid "skipping" effects. A more convincing technique is to use "2:3 pulldown", discussed below, which turns every second frame of the film into three fields of video and results in a much smoother display. PAL uses a similar system, "2:2 pulldown". However, during the analogue broadcasting period, 24 frame per second film was shown at a slightly faster 25 frames per second to match the PAL video signal; this resulted in fractionally higher-pitched audio, and in feature films having a shorter duration, as they were shown 1 frame per second faster. In recent decades, telecine has been a film-to-storage process, as opposed to film-to-air; changes since the 1950s have been in terms of equipment and physical formats. Home movies on film may be transferred to video tape using this technique, and it is not uncommon to find telecined DVDs where the source was recorded on videotape.
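The 2:3 pulldown described above can be sketched as a simple field-expansion rule: alternate film frames contribute two and three video fields, so every four film frames fill ten fields (five interlaced frames), stretching 24 fps film to NTSC's nominal 30 fps. A minimal illustrative sketch, ignoring odd/even field parity and the exact 29.97 Hz NTSC rate:

```python
def pulldown_2_3(frames):
    """Expand a sequence of film frames into video fields using
    2:3 pulldown: successive frames contribute 2, 3, 2, 3, ...
    fields each, so 4 film frames become 10 fields (5 video frames)."""
    fields = []
    for i, frame in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3
        fields.extend([frame] * copies)
    return fields

# Four film frames A B C D become ten fields: AA BBB CC DDD,
# and one second of film (24 frames) becomes 60 fields.
cadence = pulldown_2_3(["A", "B", "C", "D"])
```

Here `cadence` comes out as `['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']`, which is why the pattern is also written as AA-BBB-CC-DDD.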
The same is not true for modern DVDs of cinematic films, which are recorded in their original frame rate—in these cases the DVD player itself applies telecining as required to match the capabilities of the television receiver. The most complex part of telecine is the synchronization of the mechanical film motion and the electronic video signal; every time the video part of the telecine samples the light electronically, the film part of the telecine must have a frame in perfect registration and ready to photograph. This is easy when the film is photographed at the same frame rate as the video camera will sample, but when this is not true, a sophisticated procedure is required to change frame rate. To avoid the synchronization issues, higher-end establishments now use a scanning system rather than just a telecine system; this allows them to scan a distinct frame of digital video for each frame of film, providing higher quality than a telecine system would be able to achieve. Best results are achieved by using a smoothing rather than a frame duplication algorithm to adjust for speed differences between the film and video frame rate.
Similar issues occur when using vertical synchronization to prevent screen tearing, a different problem encountered when frame rates mismatch. In countries that use the PAL or SECAM video standards, film destined for television is photographed at 25 frames per second. The PAL video standard broadcasts at 25 frames per second, so the transfer from film to video is simple. Theatrical features photographed at 24 frame/s are simply shown at 25 frame/s. While this is not noticed in the picture, the 4% increase in playback speed causes a noticeable increase in audio pitch of about 0.7 semitones, sometimes corrected using a pitch shifter, though pitch shifting is a recent innovation and supersedes an alternative method of tele
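The pitch change from the PAL speed-up follows directly from the frame-rate ratio: in equal temperament, a frequency ratio r corresponds to 12·log2(r) semitones. A quick check of the numbers, including the runtime shortening mentioned earlier (the 120-minute figure is just an example duration):

```python
import math

# PAL playback rate over film rate: 25/24, i.e. about 4.17% fast.
speedup = 25 / 24

# Pitch rises by the same ratio; in equal-tempered semitones
# that is 12 * log2(25/24), roughly 0.71 of a semitone.
semitones = 12 * math.log2(speedup)

# The film also gets shorter: a 120-minute feature plays in
# 120 * 24/25 = 115.2 minutes at 25 fps.
runtime = 120 * 24 / 25
```

This is why the speed-up is audible to attentive viewers even though the 4% change in motion is rarely noticed in the picture.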
Xena: Warrior Princess
Xena: Warrior Princess is an American fantasy television series filmed on location in New Zealand. The series aired in first-run syndication from September 4, 1995, to June 18, 2001. Critics have praised the series for its strong female protagonist; it has acquired a strong cult following and attention in fandom and academia, and has influenced the direction of other television series. Writer-director-producer Robert Tapert created the series in 1995 under his production tag, Renaissance Pictures, with executive producers R. J. Stewart and Sam Raimi. The series narrative follows Xena, an infamous warrior on a quest to seek redemption for her past sins against the innocent by using her formidable fighting skills to help those who are unable to defend themselves. Xena is accompanied by Gabrielle, who during the series changes from a simple farm girl into an Amazon warrior and Xena's comrade-in-arms. The show is a spin-off of the television series Hercules: The Legendary Journeys.
Aware that the character of Xena had become successful with the public, the producers of that series decided to launch a spin-off based on her adventures. Xena became a successful show which has aired in more than 108 countries around the world since 1998. In 2004 and 2007, it ranked #9 and #10 on TV Guide's Top Cult Shows Ever, and the title character ranked #100 on Bravo's 100 Greatest TV Characters. Xena's success has led to hundreds of tie-in products, including comics, video games and conventions, held annually since 1998 in Pasadena, California, and in London. The series soared past its predecessor in popularity; in its second season it became the top-rated syndicated drama series on American television, and for all six years Xena remained in the top five. Cancellation of the series was announced in October 2000, and the series finale aired in the summer of 2001. On August 13, 2015, NBC Entertainment chairman Bob Greenblatt said a Xena reboot was in development, with Raimi and Tapert returning as executive producers and the show's debut expected sometime in 2016.
Javier Grillo-Marxuach was hired as writer and producer for the reboot, but left the project in April 2017 because of creative differences. In August 2017, NBC announced that it had cancelled its plans for the reboot for the foreseeable future. Xena: Warrior Princess is set in a fantasy version of ancient Greece and was filmed in New Zealand. Some filming locations are confidential, but many scenes were recorded in places such as the Waitakere Ranges Regional Park, part of the Auckland regional parks network credited at the end of the episodes. The Ancient Greece depicted in the show is derived from historical locations and customs, modifying known places and events – battles, trading routes and so on – to generate an attractive fictional world. The settlements are presented as a mixture of walled villages and rural hamlets set in a lush green, mountainous landscape; they are seen under attack from warlords, and travelling between them involves frequent encounters with small bands of outlaws. All of the main towns are named after historic towns of Ancient Greece and exhibit some of their essential characteristics – Amphipolis, Athens, Corinth and Cirra, the last burnt to the ground by Xena's army.
As the show progressed, events took place in more modern times and places, from Cleopatra's Alexandria to Julius Caesar's Rome. The mythology of the show transitioned from that of the Olympian gods to include Judeo-Christian elements. Eastern religions were touched on as well, with little regard for accurate time-and-place concerns. One episode, "The Way", which loosely interpreted elements of Hinduism as major plot points, generated controversy, requiring the producers to add a disclaimer at the head of the episode and a tag explaining the episode's intentions at its end. Mythological and supernatural locations are presented as real, physical places accessed through physical portals hidden in the landscape, such as lakes and caves; they include the Elysian Fields, the River Styx, Valhalla and Hell. The inhabitants of such places – gods, mythological beings and forces – are for the most part manifested as human characters who can move at will between their domains and the real world. Ares, the Greek god of war, for instance, is an egotistical man who wears studded black leather, while Aphrodite, goddess of love, is a California valley girl who uses typical valley girl slang and dresses in flowing, translucent pink gowns.
Xena is a historical fantasy set in ancient Greece, although the setting is flexible in both time and location and features Egyptian, Chinese, Central Asian and medieval European elements. The flexible fantasy framework of the show accommodates a considerable range of theatrical styles, from high melodrama to slapstick comedy, from the whimsical and musical to all-out action and adventure. While the show is set in ancient times, its themes are modern: it investigates the ideas of taking responsibility for past misdeeds, the value of human life, personal liberty and sacrifice, and friendship. The show also addresses ethical dilemmas, such as the morality of pacifism.
A flatbed editor is a type of machine used to edit film for a motion picture. Picture and sound rolls load onto separate motorized disks, called "plates". Each set of plates moves forward or backward separately, or can be locked together to maintain synchronization between picture and sound. A prism reflects the film image onto a viewing screen, while a magnetic playback head reads the magnetic audio tracks. The two most common configurations are the six-plate and eight-plate models. Most films are shot double-system: the picture is shot on film, while the sound is recorded separately on quarter-inch audiotape on a Nagra III, 4.2 or 4S, or a Stellavox SP7. For convenience during the editing process, the sound is transferred to a magnetic track – sprocketed recording film, i.e. filmstock coated with magnetic oxide instead of photo-sensitive emulsion. One "frame" on the magnetic film equals one frame of picture. The magnetic film is edge-coded: sequential numbers are stamped on the edge every few frames to facilitate locating particular frames or scenes.
Since picture and sound are recorded separately, the editor must synchronize them. The editor loads one picture roll onto a film plate and its corresponding magnetic roll onto a sound plate, then advances the film to find the frame where the two parts of the clapperboard came together. The editor repeats the process on the magnetic roll to find the frame with the clap sound. Once found, he marks the frame on both rolls as the synchronization point and switches the flatbed to interlock mode. From then on, both picture and sound rolls advance or reverse by the same amount to maintain synchronization. When the editor finds a point to cut one shot into another, he marks it on both picture and sound rolls, makes the cut, and splices in the next shot. One of the first and most popular film editing machines was the Moviola. With it, one could manage a thousand-foot, eleven-minute 35 mm roll, but it was difficult to use compared to flatbed machines because it did not have high-speed operation. European flatbeds came into more common use in the United States during the 1970s, although they never replaced the Moviola.
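The clapperboard sync-up amounts to finding a constant frame offset between the two rolls. A hypothetical sketch of the bookkeeping (the flatbed does this mechanically by marking and interlocking the plates, not in software; the function name and frame numbers are illustrative):

```python
def sync_offset(picture_clap_frame, sound_clap_frame):
    """Frames by which the sound roll must be shifted so the
    clapperboard's visual closure lines up with the audible clap.
    Positive means advance the sound roll relative to the picture."""
    return picture_clap_frame - sound_clap_frame

# If the slate closes on picture frame 412 but the clap sound sits at
# frame 398 on the magnetic roll, the sound roll is advanced 14 frames
# before the flatbed is switched to interlock mode.
offset = sync_offset(412, 398)
```

Because the offset is constant for a given take, marking a single synchronization point is enough; interlock mode then keeps both rolls moving by the same amount.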
By the mid-1990s, flatbeds had been replaced by computer-based non-linear systems, such as Avid and Lightworks. As of 2007, some film schools still use flatbed editors for their educational value. Feature films in the United States now use electronic non-linear systems exclusively. The two most common brands of flatbed editor, Steenbeck and K-E-M, were invented in Germany in the 1930s. There are also the Italian Prévost, the Dutch Oude Delft (Oldelft), the French Atlas and Moritone flatbeds; the U.K. produced the LEM, and America the Moviola flatbed and the 16 mm Showchron, of which 400 were produced in 4-, 6- or 8-plate configurations, 6 being the most common. All these machines employ a rotating prism rather than the Geneva-drive intermittent mechanism first used by the American upright Moviola. The rotating prism allows the editor to move the film smoothly and continuously, reducing mechanical noise and film wear, and makes high-speed operation feasible; some machines can move the film at up to ten times standard speed.
The K-E-M Universal, which has a modular construction, supports up to three picture heads and up to three sound tracks. Film editing Film splicer