1.
Windows Media Video
–
Windows Media Video (WMV) is the name of a series of video codecs and their corresponding video coding formats developed by Microsoft. It is part of the Windows Media framework. WMV consists of three distinct codecs: the original video compression technology, known simply as WMV, was originally designed for Internet streaming applications as a competitor to RealVideo, while the other two compression technologies, WMV Screen and WMV Image, cater for specialized content. After standardization by the Society of Motion Picture and Television Engineers (SMPTE), WMV version 9 was adopted for physical-delivery formats such as HD DVD and Blu-ray Disc and became known as VC-1. Microsoft also developed a digital container format called Advanced Systems Format (ASF) to store video encoded with Windows Media Video. In 2003, Microsoft drafted a video compression specification based on its WMV9 format; the standard was officially approved in March 2006 as SMPTE 421M, better known as VC-1, making the WMV9 format an open standard. VC-1 became one of the three video formats for the Blu-ray video disc, along with H.262/MPEG-2 Part 2 and H.264/MPEG-4 AVC. A WMV file uses the ASF container format to encapsulate the encoded multimedia content. While ASF can encapsulate multimedia in encodings other than those the WMV file standard specifies, such files should use the .ASF file extension rather than the .WMV file extension. Although WMV is generally packed into the ASF container format, it can also be put into the Matroska or AVI container format; the resulting files have the .MKV and .AVI file extensions, respectively. One common way to store WMV in an AVI file is to use the WMV9 Video Compression Manager codec implementation. Windows Media Video is the most recognized video compression format within the WMV family; usage of the term WMV often refers to this format only, and its main competitors are MPEG-4 AVC, AVS, RealVideo, and MPEG-4 ASP. The first version of the format, WMV7, was introduced in 1999. Continued proprietary development led to newer versions of the format, but the bit stream syntax was not frozen until WMV9. WMV9 also introduced a new profile, Windows Media Video 9 Professional, which is activated automatically whenever the video resolution exceeds 300,000 pixels; it is targeted towards high-definition video content at resolutions such as 720p and 1080p. The Simple and Main profile levels in WMV9 are compliant with the same levels in the VC-1 specification. The Advanced Profile of VC-1 is implemented in a newer WMV format called Windows Media Video 9 Advanced Profile; it improves compression efficiency for interlaced content and is transport-independent, so it can be encapsulated in an MPEG transport stream or RTP packet format. This format is not compatible with previous WMV9 formats. WMV is a mandatory video format for PlaysForSure-certified online stores and devices, as well as Portable Media Center devices; the Microsoft Zune, Xbox 360, and Windows Mobile-powered devices with Windows Media Player, as well as many uncertified devices, also support it. WMV HD mandates the use of WMV9 for its certification program, at quality levels specified by Microsoft. WMV used to be the only supported video format for the Microsoft Silverlight platform. Windows Media Video Screen refers to video formats that specialise in screencast content.
They can capture live screen content, or convert video from third-party screen-capture programs into WMV9 Screen files, and they work best when the source material is mainly static and contains a small color palette.
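Since WMV content normally lives inside the ASF container described above, a quick way to sanity-check a file is to look for the ASF Header Object signature at the start of the file. The sketch below assumes the well-known ASF Header Object GUID 75B22630-668E-11CF-A6D9-00AA0062CE6C (serialized little-endian on disk); file names passed on the command line are hypothetical examples.

```python
# Minimal sketch: check whether a file begins with the ASF Header Object GUID,
# which .wmv/.asf files stored in the ASF container are expected to start with.
ASF_HEADER_GUID = bytes.fromhex("3026b2758e66cf11a6d900aa0062ce6c")

def looks_like_asf(path: str) -> bool:
    """Return True if the file starts with the ASF container signature."""
    with open(path, "rb") as f:
        return f.read(16) == ASF_HEADER_GUID

if __name__ == "__main__":
    import sys
    for name in sys.argv[1:]:
        print(name, "ASF/WMV container" if looks_like_asf(name) else "not ASF")
```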
2.
Windows Media Audio
–
Windows Media Audio (WMA) is the name of a series of audio codecs and their corresponding audio coding formats developed by Microsoft. It is a technology that forms part of the Windows Media framework. WMA consists of four distinct codecs. The original WMA codec, known simply as WMA, was conceived as a competitor to the popular MP3 and RealAudio codecs. WMA Pro, a newer and more advanced codec, supports multichannel audio. A lossless codec, WMA Lossless, compresses audio data without loss of audio fidelity. WMA Voice, targeted at voice content, applies compression using a range of low bit rates. Microsoft has also developed a digital container format called Advanced Systems Format (ASF) to store audio encoded by WMA. The first WMA codec was based on work by Henrique Malvar, a researcher and manager of the Signal Processing Group at Microsoft Research. The first finalized codec was initially referred to as MSAudio 4.0; it was later officially released as Windows Media Audio, as part of Windows Media Technologies 4.0. Microsoft claimed that WMA could produce files that were half the size of equivalent-quality MP3 files; this claim, however, was rejected by some audiophiles, and RealNetworks also challenged Microsoft's claims regarding WMA's superior audio quality compared to RealAudio. Newer versions of WMA became available: Windows Media Audio 2 in 1999, Windows Media Audio 7 in 2000, Windows Media Audio 8 in 2001, and Windows Media Audio 9 in 2003. Microsoft first announced its plans to license WMA technology to third parties in 1999. Although earlier versions of Windows Media Player played WMA files, support for WMA file creation was not added until the seventh version. In 2003, Microsoft released new audio codecs that were not compatible with the original WMA codec: Windows Media Audio 9 Professional, Windows Media Audio 9 Lossless, and Windows Media Audio 9 Voice. All versions of WMA released since version 9.0 - namely 9.1, 9.2, and 10 - have been compatible with the original v9 decoder and are therefore not considered separate codecs. The sole exception is the WMA 10 Professional codec, whose Low Bit Rate mode is only compatible with older WMA Professional decoders at half the sampling rate; full-fidelity decoding of WMA 10 Professional LBR bitstreams requires a WMA version 10 or newer decoder. A WMA file is in most circumstances contained in the Advanced Systems Format, a proprietary Microsoft container format for digital audio or digital video. The ASF container format specifies how metadata about the file is to be encoded; metadata may include song name, track number, artist name, and audio normalization values. See Windows Media DRM for further information. Related industry standards such as DECE UltraViolet and MPEG-DASH have not standardized WMA as a supported audio codec, deciding in favor of the more industry-prevalent MPEG and Dolby audio codecs. Each WMA file features an audio track in one of the four sub-formats: WMA, WMA Pro, WMA Lossless, or WMA Voice.
3.
Microsoft
–
Its best known software products are the Microsoft Windows line of operating systems, the Microsoft Office suite, and the Internet Explorer and Edge web browsers. Its flagship hardware products are the Xbox video game consoles and the Microsoft Surface tablet lineup. As of 2016, it was the world's largest software maker by revenue, and one of the world's most valuable companies. Microsoft was founded by Paul Allen and Bill Gates on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800. It rose to dominate the personal computer operating system market with MS-DOS in the mid-1980s, followed by Microsoft Windows. The company's 1986 initial public offering, and the subsequent rise in its share price, created substantial wealth for many Microsoft employees. Since the 1990s, it has increasingly diversified from the operating system market and has made a number of corporate acquisitions. In May 2011, Microsoft acquired Skype Technologies for $8.5 billion, and in June 2012 it entered the personal computer production market for the first time with the launch of the Microsoft Surface, a line of tablet computers. The word Microsoft is a portmanteau of microcomputer and software. Paul Allen and Bill Gates, childhood friends with a passion for computer programming, sought to make a successful business utilizing their shared skills. In 1972 they founded their first company, named Traf-O-Data, which offered a computer that tracked and analyzed automobile traffic data. Allen went on to pursue a degree in computer science at Washington State University. The January 1975 issue of Popular Electronics featured Micro Instrumentation and Telemetry Systems's (MITS) Altair 8800 microcomputer, and Allen suggested that they could program a BASIC interpreter for the device. After a call from Gates claiming to have a working interpreter, MITS requested a demonstration. Since they didn't actually have one, Allen worked on a simulator for the Altair while Gates developed the interpreter. They officially established Microsoft on April 4, 1975, with Gates as the CEO; Allen came up with the name Micro-Soft, as recounted in a 1995 Fortune magazine article. In August 1977 the company formed an agreement with ASCII Magazine in Japan, resulting in its first international office, and the company moved to a new home in Bellevue, Washington in January 1979. Microsoft entered the operating system business in 1980 with its own version of Unix, called Xenix; however, it was MS-DOS that solidified the company's dominance. To fulfill its contract with IBM, Microsoft purchased a CP/M clone called 86-DOS from Seattle Computer Products and branded it as MS-DOS; following the release of the IBM PC in August 1981, Microsoft retained ownership of MS-DOS. Since IBM copyrighted the IBM PC BIOS, other companies had to reverse engineer it in order for non-IBM hardware to run as IBM PC compatibles. Due to various factors, such as MS-DOS's available software selection, Microsoft eventually became the leading PC operating system vendor. The company expanded into new markets with the release of the Microsoft Mouse in 1983, as well as with a publishing division named Microsoft Press. Paul Allen resigned from Microsoft in 1983 after developing Hodgkin's disease. While jointly developing a new OS with IBM in 1984, OS/2, Microsoft released Microsoft Windows, a graphical extension for MS-DOS, on November 20, 1985. Once Microsoft informed IBM of Windows NT, the OS/2 partnership deteriorated. In 1990, Microsoft introduced its office suite, Microsoft Office.
4.
Digital audio
–
Digital audio is technology that can be used to record, store, generate, manipulate, and reproduce sound using audio signals that have been encoded in digital form. A microphone converts sound to an electrical signal, then an analog-to-digital converter, typically using pulse-code modulation (PCM), converts the analog signal into a digital signal. This digital signal can then be recorded, edited, and modified using digital audio tools. Digital audio systems may include compression, storage, processing, and transmission components. Conversion to a digital format allows convenient manipulation, storage, and transmission, and modern online music distribution depends on digital recording and data compression. The availability of music as data files, rather than as physical objects, has significantly reduced the costs of distribution. Before digital audio, the music industry distributed and sold music by selling physical copies in the form of records. With digital audio and online distribution systems such as iTunes, companies sell digital files to consumers. This digital audio/Internet distribution model is less expensive than producing, packaging, and shipping physical copies of recordings. An analog audio system captures sounds and converts their physical waveforms into electrical representations of those waveforms by use of a transducer; the sounds are then stored, as on tape, or transmitted. The process is reversed for playback: the signal is amplified and converted back into sound. Analog audio retains its fundamental wave-like characteristics throughout its storage, transformation, and duplication, and analog audio signals are susceptible to noise and distortion, due to the innate characteristics of electronic circuits and associated devices. Disturbances in a digital system do not result in error unless the disturbance is so large as to result in a symbol being misinterpreted as another symbol or disturb the sequence of symbols. A digital audio signal may be encoded for correction of any errors that occur in the storage or transmission of the signal. This technique, known as channel coding, is essential for broadcast or recorded digital systems to maintain bit accuracy; the discrete time and level of the binary signal allow a decoder to recreate the analog signal upon replay. Eight-to-Fourteen Modulation is a channel code used in the audio Compact Disc. A digital audio system starts with an ADC that converts an analog signal to a digital signal. The ADC runs at a specified sampling rate and converts at a known bit resolution. CD audio, for example, has a sampling rate of 44.1 kHz and 16-bit resolution.
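The relationship between sampling rate, bit depth, and data rate can be made concrete with a small sketch. The example below, purely illustrative, quantizes a 1 kHz sine tone at the CD parameters mentioned above (44.1 kHz, 16-bit, stereo) and prints the resulting uncompressed bit rate; the tone frequency and duration are arbitrary choices.

```python
import math

# Illustrative PCM sketch: sample a 1 kHz sine wave at the CD rate of 44.1 kHz
# and quantize each sample to a 16-bit signed integer.
SAMPLE_RATE = 44_100      # samples per second (CD audio)
BIT_DEPTH = 16            # bits per sample
CHANNELS = 2              # stereo

def pcm_samples(freq_hz=1000.0, duration_s=0.01):
    n = int(SAMPLE_RATE * duration_s)
    max_amp = 2 ** (BIT_DEPTH - 1) - 1   # 32767 for 16-bit audio
    return [round(max_amp * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE))
            for t in range(n)]

# Uncompressed bit rate of CD audio: 44,100 * 16 * 2 = 1,411,200 bit/s.
print(SAMPLE_RATE * BIT_DEPTH * CHANNELS, "bit/s")
print(pcm_samples()[:5])   # first few quantized samples
```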
5.
Streaming media
–
Streaming media is multimedia that is constantly received by and presented to an end-user while being delivered by a provider. A client end-user can use their media player to begin playing the data file before the entire file has been transmitted. For example, in the 1930s, elevator music was among the earliest popularly available streaming media. The term streaming media can apply to media other than video and audio, such as live closed captioning, ticker tape, and real-time text, which are all considered streaming text. As of 2017, streaming is generally taken to refer to cases where a user watches digital video content or listens to audio content on a computer screen. With streaming content, the user does not have to download the digital video or digital audio file before starting to watch or listen to it. There are challenges with streaming content on the Internet. As of 2016, two popular streaming services are the video-sharing website YouTube, which contains video and audio files on a huge range of topics, and Netflix, which streams movies and TV shows. Live streaming refers to Internet content delivered in real time, as events happen. Live Internet streaming requires a form of source media, an encoder to digitize the content, a media publisher, and a content delivery network to distribute and deliver the content. Live streaming does not need to be recorded at the origination point. In the early 1920s, George O. Squier was granted patents for a system for the transmission and distribution of signals over electrical lines, which became the technical basis for what was later known as Muzak. Attempts to display media on computers date back to the earliest days of computing in the mid-20th century; however, little progress was made for several decades, primarily due to the high cost and limited capabilities of computer hardware. From the late 1980s through the 1990s, consumer-grade personal computers became powerful enough to display various media. These technological improvements facilitated the streaming of audio and video content to users in their homes and workplaces. The band Severe Tire Damage was the first group to perform live on the Internet. On June 24, 1993, the band was playing a gig at Xerox PARC while, elsewhere in the building, scientists were discussing new technology for broadcasting on the Internet; as proof of PARC's technology, the band's performance was broadcast and could be seen live in Australia and elsewhere. Microsoft Research developed a Microsoft TV application which was compiled under MS Windows Studio Suite. RealNetworks was also a pioneer in the streaming media markets, when it broadcast a baseball game between the New York Yankees and the Seattle Mariners over the Internet in 1995. The first symphonic concert on the Internet took place at the Paramount Theater in Seattle; the concert was a collaboration between the Seattle Symphony and various guest musicians such as Slash, Matt Cameron, and Barrett Martin. When Word Magazine launched in 1995, it featured the first-ever streaming soundtracks on the Internet. In June 1999 Apple also introduced a streaming media format in its QuickTime 4 application, which was later widely adopted on websites along with RealPlayer. In 2000, Industryview.com launched its world's largest streaming video archive website to help promote itself.
6.
Media Foundation
–
Media Foundation (MF) is a COM-based multimedia framework pipeline and infrastructure platform for digital media in Windows Vista, Windows 7, Windows 8, Windows 8.1, and Windows 10. It is intended to replace the existing DirectShow technology step by step, so for some time Media Foundation and DirectShow will coexist. Media Foundation is not available for previous Windows versions, including Windows XP. It integrates DXVA 2.0 for offloading more of the video processing pipeline to hardware, for better performance. Videos are processed in the colorspace they were encoded in and are handed off to the hardware as such, which avoids intermediate colorspace conversions and improves performance. MF includes a new renderer, called the Enhanced Video Renderer (EVR), which has better support for timing and synchronization. It uses the Multimedia Class Scheduler Service, a new service that prioritizes real-time multimedia processing, to reserve the resources required for playback without any tearing or glitches. The second release, included in Windows 7, introduces expanded media format support. The MF architecture is divided into the Control layer, the Core layer, and the Platform layer. The Core layer encapsulates most of the functionality of Media Foundation; it consists of the media foundation pipeline, which has three kinds of components: media sources, media sinks, and Media Foundation Transforms (MFTs). A media source is an object that acts as the source of multimedia data; it can encapsulate various data sources, like a file, a network server, or even a camcorder. A source object can use a source resolver object, which creates a source from a URI; support for non-standard protocols can be added by creating a source resolver for them. A source object can also use a sequencer object to use a sequence of sources or to coalesce multiple sources into a single logical source. A media sink is the recipient of processed multimedia data. A media sink can either be a renderer sink, which renders the content on an output device, or an archive sink, which saves the content onto a persistent storage system such as a file. A renderer sink takes uncompressed data as input, whereas an archive sink can take either compressed or uncompressed data. The data flowing from media sources to sinks is acted upon by MFTs, which are components that transform the data into another form; MFTs can include multiplexers and demultiplexers, codecs, or DSP effects like reverb. The Core layer uses services like file access, networking, and clock synchronization to time the multimedia rendering. Pausing, stopping, fast forward, reverse, or time compression can be achieved by controlling the presentation clock. However, the media pipeline components are not connected automatically; rather, they are presented as discrete components, and the application has to coordinate the flow of data between the pipeline components. The control layer has to pull samples from one pipeline component and pass them on to the next component in order to achieve data flow within the pipeline.
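The source-transform-sink topology and the control layer's pull model can be illustrated with a toy sketch. The code below is not the actual COM-based Media Foundation API; the class and function names are hypothetical stand-ins that only mirror the roles described above, with a simple loop playing the part of the control layer.

```python
# Toy illustration of the pipeline described above: a media source produces
# samples, MFT-like transforms process them, and a media sink consumes them,
# with a "control layer" loop pulling each sample through the chain.
class MediaSource:
    def __init__(self, samples):
        self._samples = iter(samples)
    def read_sample(self):
        return next(self._samples, None)    # None signals end of stream

class Transform:                            # stands in for an MFT (e.g. a codec or DSP effect)
    def __init__(self, func):
        self._func = func
    def process(self, sample):
        return self._func(sample)

class MediaSink:                            # e.g. a renderer sink or an archive sink
    def write_sample(self, sample):
        print("sink received:", sample)

def run_pipeline(source, transforms, sink):
    """Control-layer loop: pull a sample, push it through each transform, hand it to the sink."""
    while (sample := source.read_sample()) is not None:
        for t in transforms:
            sample = t.process(sample)
        sink.write_sample(sample)

run_pipeline(MediaSource(range(3)), [Transform(lambda s: s * 2)], MediaSink())
```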
7.
Byte
–
The byte is a unit of digital information that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text in a computer. The size of the byte has historically been hardware dependent, and no standards existed that mandated the size. The de facto standard of eight bits is a convenient power of two permitting the values 0 through 255 for one byte; the international standard IEC 80000-13 codified this common meaning. Many types of applications use information representable in eight or fewer bits, and the popularity of major commercial computing architectures has aided in the ubiquitous acceptance of the 8-bit size. The unit symbol for the byte was designated as the upper-case letter B by the IEC and IEEE, in contrast to the bit; internationally, the unit octet, symbol o, explicitly denotes a sequence of eight bits, eliminating the ambiguity of the byte. The word byte is a respelling of bite, chosen to avoid accidental mutation to bit. Early computers used a variety of four-bit binary-coded decimal representations and six-bit codes; these representations included alphanumeric characters and special graphical symbols and were used by the U.S. Government and universities during the 1960s. The prominence of the System/360 led to the ubiquitous adoption of the eight-bit storage size, even though the EBCDIC and ASCII encoding schemes differ in detail. In the early 1960s, AT&T introduced digital telephony, first on long-distance trunk lines; these used eight-bit µ-law encoding. This large investment promised to reduce costs for eight-bit data, and the development of microprocessors in the 1970s popularized this storage size. A four-bit quantity is called a nibble, also nybble. The term octet is used to unambiguously specify a size of eight bits and is used extensively in protocol definitions. Historically, the term octad or octade was also used to denote eight bits, at least in Western Europe; however, this usage is no longer common. The exact origin of the term is unclear, but it can be found in British, Dutch, and German sources of the 1960s and 1970s, and throughout the documentation of Philips mainframe computers. The unit symbol for the byte is specified in IEC 80000-13 and IEEE 1541. In the International System of Quantities, B is the symbol of the bel, a unit of logarithmic power ratios named after Alexander Graham Bell, creating a conflict with the IEC specification. However, little danger of confusion exists, because the bel is a rarely used unit.
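A quick sketch makes the 0-through-255 range of an eight-bit byte concrete; the use of the standard struct module here is only one way to demonstrate the limit.

```python
import struct

# An 8-bit byte (octet) can hold 2**8 = 256 distinct values, 0 through 255.
print(2 ** 8)                      # 256
print(struct.pack("B", 255))       # b'\xff' - the largest unsigned 8-bit value
# struct.pack("B", 256) would raise struct.error: the value no longer fits in one octet.
```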
8.
QuickTime
–
QuickTime is an extensible multimedia framework developed by Apple Inc., capable of handling various formats of digital video, picture, sound, panoramic images, and interactivity. First made in 1991, the latest Mac version, QuickTime X, is available on Mac OS X Snow Leopard and newer. Apple ceased support for the Windows version of QuickTime in 2016. As of Mac OS X Lion, the underlying media framework for QuickTime, QTKit, is deprecated in favor of a newer graphics framework, AV Foundation. Software development kits for QuickTime are available to the public with an Apple Developer Connection subscription. QuickTime is available free of charge for both macOS and Windows operating systems. There are some other free player applications that rely on the QuickTime framework; for example, iTunes can export audio in WAV, AIFF, MP3, AAC, and Apple Lossless. In addition, macOS has a simple AppleScript that can be used to play a movie in full-screen mode. QuickTime Player 7 is limited to only basic playback operations unless a QuickTime Pro license key is purchased from Apple. Until recently, Apple's professional applications included a QuickTime Pro license. Pro keys are specific to the major version of QuickTime for which they are purchased and unlock additional features of the QuickTime Player application on macOS or Windows. The Pro key does not require any additional downloads; entering the code immediately unlocks the hidden features, such as saving and exporting to any of the formats supported by QuickTime. QuickTime 7 includes presets for exporting video to a video-capable iPod and Apple TV. Another Pro feature is saving existing QuickTime movies from the web directly to a hard disk drive; this is often, but not always, either hidden or intentionally blocked in the standard mode. Two options exist for saving movies from a web browser: Save as source, which saves the embedded video in its original format, and Save as QuickTime movie, which saves the video in a .mov file format no matter what the original container is or was. Mac OS X Snow Leopard includes QuickTime X. QuickTime Player X lacks cut, copy, and paste and will only export to four formats, but its limited export feature is free. Otherwise, users have to install QuickTime 7 from the Optional Installs directory of the Snow Leopard DVD after installing the OS. Mac OS X Lion and later also include QuickTime X; no installer for QuickTime 7 is included with these releases. The QuickTime framework provides the following: encoding and transcoding video and audio; decoding video and audio, then sending the decoded stream to the graphics or audio subsystem for playback (in macOS, QuickTime sends video playback to the Quartz Extreme Compositor); and a component plug-in architecture for supporting additional third-party codecs. As of early 2008, the framework hides many older codecs from the user, although the option to Show legacy encoders exists in QuickTime Preferences to use them.
9.
Ogg
–
Ogg is a free, open container format maintained by the Xiph.Org Foundation. The creators of the Ogg format state that it is unrestricted by software patents and is designed to provide for efficient streaming and manipulation of high-quality digital multimedia. Its name is derived from ogging, jargon from the computer game Netrek. The Ogg container format can multiplex a number of independent streams for audio, video, and text. In the Ogg multimedia framework, Theora provides a lossy video layer, while the audio layer is most commonly provided by the music-oriented Vorbis format or its successor Opus; lossless audio compression formats include FLAC and OggPCM. Before 2007, the .ogg filename extension was used for all files whose content used the Ogg container format; since 2007, the Xiph.Org Foundation recommends that .ogg be used only for Ogg Vorbis audio files. As of August 4, 2011, the current version of the Xiph.Org Foundation's reference implementation is libogg 1.3.0; another version, libogg2, has been in development but is awaiting a rewrite as of 2008. Both software libraries are free software, released under the New BSD License. The Ogg reference implementation was separated from Vorbis on September 2, 2000. It is sometimes assumed that the name Ogg comes from the character of Nanny Ogg in Terry Pratchett's Discworld novels, but Ogg is derived from ogging, jargon from the computer game Netrek, which came to mean doing something forcefully, possibly without consideration of the drain on future resources. At its inception, the Ogg project was thought to be somewhat ambitious given the power of the PC hardware of the time; still, to quote the same reference, Vorbis, on the other hand, is named after the Terry Pratchett character from the book Small Gods. The Ogg Vorbis project started in 1993. It was originally named Squish, but that name was already trademarked, so the project underwent a name change; the new name, OggSquish, was used until 2001, when it was changed again to Ogg. Ogg has since come to refer to the container format, which is now part of the larger Xiph.org multimedia project, while Squish refers to an audio coding format typically used with the Ogg container format. The Ogg bitstream format was spearheaded by the Xiph.Org Foundation. The format consists of chunks of data, each called an Ogg page. Each page begins with the characters OggS, to identify the file as Ogg format; a serial number and page number in the page header identify each page as part of a series of pages making up a bitstream. Multiple bitstreams may be multiplexed in the file, where pages from each bitstream are ordered by the time of the contained data. Bitstreams may also be appended to existing files, a process known as chaining. A BSD-licensed library, called libvorbis, is available to encode and decode data from Vorbis streams. Independent Ogg implementations are used in several projects such as RealPlayer and a set of DirectShow filters. Mogg, the Multi-Track-Single-Logical-Stream Ogg-Vorbis, is the multi-channel or multi-track Ogg file format. The following is the first field in the layout of an Ogg page header: Capture pattern - 32 bits. The capture pattern or sync code is a magic number used to ensure synchronization when parsing Ogg files.
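The page structure described above (capture pattern, serial number, page number, and so on) can be parsed with a few lines of code. The sketch below follows the fixed 27-byte page header layout from RFC 3533; the file name is a hypothetical example, and only the first page is inspected.

```python
import struct

# Parse the fixed part of one Ogg page header (per RFC 3533): capture pattern
# "OggS", version, header type, granule position, bitstream serial number,
# page sequence number, CRC checksum, and the number of segment-table entries.
def parse_ogg_page_header(data: bytes):
    if data[:4] != b"OggS":
        raise ValueError("capture pattern not found - not an Ogg page")
    version, header_type, granule, serial, page_no, crc, n_segments = struct.unpack_from(
        "<BBqIIIB", data, 4)
    return {
        "version": version,
        "header_type": header_type,
        "granule_position": granule,
        "serial_number": serial,
        "page_sequence_number": page_no,
        "crc_checksum": crc,
        "segments": n_segments,
    }

if __name__ == "__main__":
    with open("example.ogg", "rb") as f:     # hypothetical file name
        print(parse_ogg_page_header(f.read(27)))
```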
10.
Web server
–
A web server is a computer system that processes requests via HTTP, the basic network protocol used to distribute information on the World Wide Web. The term can refer to the entire system, or specifically to the software that accepts and supervises the HTTP requests. The primary function of a web server is to store, process, and deliver web pages to clients; the communication between client and server takes place using the Hypertext Transfer Protocol. Pages delivered are most frequently HTML documents, which may include images, style sheets, and scripts in addition to the text content. The requested resource is typically a real file on the server's secondary storage, but this is not necessarily the case and depends on how the web server is implemented. While the primary function is to serve content, a full implementation of HTTP also includes ways of receiving content from clients; this feature is used for submitting web forms, including the uploading of files. Many generic web servers also support server-side scripting using Active Server Pages, PHP, or other scripting languages. This means that the behaviour of the web server can be scripted in separate files; usually, this function is used to generate HTML documents dynamically, as opposed to returning static documents. The former is used for retrieving or modifying information from databases; the latter is much faster and more easily cached, but cannot deliver dynamic content. Web servers are not only used for serving the World Wide Web: they can also be found embedded in devices such as printers, routers, and webcams, serving only a local network. The web server may then be used as a part of a system for monitoring or administering the device in question; this usually means that no additional software has to be installed on the client computer, since only a web browser is required. In 1989, Tim Berners-Lee proposed a new project to his employer CERN; the project resulted in Berners-Lee writing two programs in 1990: a browser called WorldWideWeb, and the world's first web server, later known as CERN httpd. In 1994, Berners-Lee decided to constitute the World Wide Web Consortium to regulate the development of the many technologies involved through a standardization process. To serve a request for a path such as /path/file.html, the web server appends the given path to the path of its document root directory. On an Apache server, this is commonly /home/www, so the result is the local file system resource /home/www/path/file.html. The web server then reads the file, if it exists, and sends a response to the client's web browser. The response will describe the content of the file and contain the file itself, or a message will be returned saying that the file does not exist or is unavailable. A web server can be incorporated into the OS kernel or run in user space like other applications. Web servers that run in user mode have to ask the system for permission to use more memory or more CPU resources; executing in user mode can also mean useless buffer copies, which are another handicap for user-mode web servers. When a web server is near to or over its limit, it becomes unresponsive. At any time, web servers can be overloaded due to causes such as excess legitimate web traffic.
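The path-mapping behaviour described above can be sketched in a few lines. This is not a full HTTP implementation, only an illustration of appending the request path to the document root (here /home/www, as in the Apache example) and returning the file or a 404-style result; the request path used in the example is hypothetical.

```python
from pathlib import Path

# Minimal sketch of static-file mapping: request path -> document root + path.
DOCUMENT_ROOT = Path("/home/www")

def serve(request_path: str):
    root = DOCUMENT_ROOT.resolve()
    # Resolve the target and refuse anything that escapes the document root.
    target = (root / request_path.lstrip("/")).resolve()
    if root not in target.parents and target != root:
        return 403, b"Forbidden"
    if not target.is_file():
        return 404, b"Not Found"
    return 200, target.read_bytes()

status, body = serve("/path/file.html")   # maps to /home/www/path/file.html
print(status, len(body) if status == 200 else body)
```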
11.
Hard disk drive
–
The platters are paired with magnetic heads, usually arranged on a moving actuator arm, which read and write data to the platter surfaces. Data is accessed in a random-access manner, meaning that individual blocks of data can be stored or retrieved in any order. HDDs are a type of non-volatile storage, retaining stored data even when powered off. Introduced by IBM in 1956, HDDs became the dominant secondary storage device for computers by the early 1960s. Continuously improved, HDDs have maintained this position into the modern era of servers and personal computers. More than 200 companies have produced HDDs historically, though after extensive industry consolidation most current units are manufactured by Seagate, Toshiba, and Western Digital. As of 2016, HDD production is growing, although unit shipments and sales revenues are declining. While SSDs have higher cost per bit, SSDs are replacing HDDs where speed, power consumption, small size, and durability are important. The primary characteristics of an HDD are its capacity and performance. Capacity is specified in unit prefixes corresponding to powers of 1000. The two most common form factors for modern HDDs are 3.5-inch, for desktop computers, and 2.5-inch, primarily for laptops. HDDs are connected to systems by standard interface cables such as PATA, SATA, USB, or SAS cables. Hard disk drives were introduced in 1956, as data storage for an IBM real-time transaction processing computer, and were developed for use with general-purpose mainframes and minicomputers. The first IBM drive, the 350 RAMAC in 1956, was approximately the size of two medium-sized refrigerators and stored five million six-bit characters on a stack of 50 disks. In 1962, the IBM 350 RAMAC disk storage unit was superseded by the IBM 1301 disk storage unit. Cylinder-mode read/write operations were supported, and the heads flew about 250 micro-inches above the platter surface; motion of the head array depended upon a binary system of hydraulic actuators which assured repeatable positioning. The 1301 cabinet was about the size of three home refrigerators placed side by side, storing the equivalent of about 21 million eight-bit bytes, and access time was about a quarter of a second. Also in 1962, IBM introduced the model 1311 disk drive; users could buy additional packs and interchange them as needed, much like reels of magnetic tape. Later models of removable-pack drives, from IBM and others, became the norm in most computer installations, and non-removable HDDs were called fixed disk drives. Some high-performance HDDs were manufactured with one head per track so that no time was lost physically moving the heads to a track; known as fixed-head or head-per-track disk drives, they were very expensive and are no longer in production. In 1973, IBM introduced a new type of HDD code-named Winchester; its primary distinguishing feature was that the disk heads were not withdrawn completely from the stack of disk platters when the drive was powered down. Instead, the heads were allowed to land on an area of the disk surface upon spin-down.
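Because drive capacity is quoted in powers of 1000 while operating systems often report sizes in powers of 1024, an advertised figure looks smaller once mounted. A quick arithmetic sketch, with an arbitrary 1 TB example drive:

```python
# Decimal (manufacturer) vs. binary (OS-reported) capacity for a nominal 1 TB drive.
advertised_tb = 1
bytes_total = advertised_tb * 1000**4          # 1,000,000,000,000 bytes
reported_gib = bytes_total / 1024**3           # size expressed in binary gibibytes
print(f"{advertised_tb} TB = {bytes_total:,} bytes, about {reported_gib:.1f} GiB")
```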
12.
ID3
–
ID3 is a metadata container most often used in conjunction with the MP3 audio file format. It allows information such as the title, artist, album, and track number to be stored in the file itself. ID3 is also specified by Apple as a timed metadata format in HTTP Live Streaming, carried as a PID in the main transport stream or in separate audio transport streams. There are two unrelated versions of ID3: ID3v1 and ID3v2. ID3v1 takes the form of a 128-byte segment at the end of an MP3 file containing a fixed set of data fields. ID3v1.1 is a modification which adds a track number field at the expense of a slight shortening of the comment field. ID3v2 is structurally different from ID3v1, consisting of an extensible set of frames located at the start of the file, each with a frame identifier; 83 types of frames are declared in the ID3v2.4 specification. There are standard frames for containing cover art, BPM, copyright and license, lyrics, and arbitrary text and URL data, as well as other things. Three versions of ID3v2 have been documented, each of which has extended the frame definitions. ID3 is a de facto standard for metadata in MP3 files; no standardization body was involved in its creation, nor has such an organization given it a formal approval status. It competes with the APE tag in this arena. The MP3 standard did not include a method for storing file metadata. In 1996, Eric Kemp had the idea to add a small chunk of data to the audio file; the method, now known as ID3v1, quickly became the de facto standard for storing metadata in MP3s. The format was released by Damaged Cybernetics, an underground group that specialized in cracking console gaming systems. There was no identifying information for any of the cracked console ROMs, so Eric and associates carried the same tagging approach over into MP3 files. This format was also used for a number of file formats unknown at that time. The ID3v1 tag occupies 128 bytes, beginning with the string TAG, 128 bytes from the end of the file. The tag was placed at the end of the file to maintain compatibility with older media players; some players would play a small burst of static when they read the tag, but most ignored it, and almost all modern players correctly skip it. This tag allows 30 bytes each for the title, artist, album, and a comment, four bytes for the year, and one byte for the genre. One improvement to ID3v1 was made by Michael Mutschler in 1997: since the comment field was too small to write anything useful, he shortened it by two bytes and used them to store the track number; such tags are referred to as ID3v1.1. Strings are either space- or zero-padded, and unset string entries are filled using an empty string. ID3v1 pre-defines a set of genres denoted by numerical codes; Winamp extended the list by adding more genres in its own music player, which were later adopted by others. However, support for the extended Winamp list is not universal; in some cases, only the genres up to 125 are supported.
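The fixed ID3v1 layout described above lends itself to a short parser: read the last 128 bytes, check for the TAG marker, then slice the 30-byte title, artist, album and comment fields, the 4-byte year, and the genre byte, with the ID3v1.1 track-number convention in the last two comment bytes. The file name in the sketch is a hypothetical example.

```python
# Hedged sketch of reading an ID3v1/ID3v1.1 tag from the end of an MP3 file.
def read_id3v1(path: str):
    with open(path, "rb") as f:
        f.seek(-128, 2)                      # 128 bytes from the end of the file
        tag = f.read(128)
    if tag[:3] != b"TAG":
        return None                          # no ID3v1 tag present

    def text(raw: bytes) -> str:
        return raw.split(b"\x00", 1)[0].decode("latin-1").strip()

    info = {
        "title": text(tag[3:33]),
        "artist": text(tag[33:63]),
        "album": text(tag[63:93]),
        "year": text(tag[93:97]),
        "comment": text(tag[97:127]),
        "genre": tag[127],
    }
    if tag[125] == 0 and tag[126] != 0:      # ID3v1.1: zero byte + track number
        info["track"] = tag[126]
    return info

print(read_id3v1("song.mp3"))                # hypothetical file name
```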
13.
MP3
–
Compared to CD-quality digital audio, MP3 compression commonly achieves a 75 to 95% reduction in size; MP3 files are thus 1/4 to 1/20 the size of the original digital audio stream. This is important for both transmission and storage concerns. The basis for such comparison is the CD digital audio format, which requires 1,411,200 bit/s. A commonly used MP3 encoding setting is CBR 128 kbit/s, resulting in a file roughly 1/11 the size of the original CD-quality file. MP3's lossy compression works by reducing the accuracy of certain parts of a continuous sound that are considered to be beyond the auditory resolution ability of most people. This method is referred to as perceptual coding, or psychoacoustics: it uses psychoacoustic models to discard or reduce the precision of components that are less audible to human hearing. MP3 was designed by the Moving Picture Experts Group as part of its MPEG-1 standard. The first subgroup for audio was formed by several teams of engineers at Fraunhofer IIS, University of Hanover, AT&T-Bell Labs, Thomson-Brandt, CCETT, and others. MPEG-1 Audio, which included MPEG-1 Audio Layer I, II, and III, was approved as a draft ISO/IEC standard in 1991 and finalised in 1992. A backwards-compatible MPEG-2 Audio extension with lower sample and bit rates was published in 1995. MP3 is a streaming or broadcast format, meaning that individual frames can be lost without affecting the ability to decode successfully delivered frames; storing an MP3 stream in a file enables time-shifted playback. The MP3 lossy audio data compression algorithm takes advantage of a perceptual limitation of human hearing called auditory masking. In 1894, the American physicist Alfred M. Mayer reported that a tone could be rendered inaudible by another tone of lower frequency, and in 1959 Richard Ehmer described a complete set of auditory curves regarding this phenomenon. Ernst Terhardt et al. created an algorithm describing auditory masking with high accuracy; this work added to a variety of reports from authors dating back to Fletcher, and to the work that initially determined critical ratios and critical bandwidths. A wide variety of audio compression algorithms were reported in IEEE's refereed Journal on Selected Areas in Communications. The genesis of the MP3 technology is described in a paper by Professor Hans Musmann, who chaired the ISO MPEG Audio group for several years. The immediate predecessors of MP3 included Optimum Coding in the Frequency Domain; the first practical implementation of an audio perceptual coder in hardware was a psychoacoustic transform coder based on Motorola 56000 DSP chips. Another predecessor of the MP3 format and technology is the perceptual codec MUSICAM, based on an integer-arithmetic 32-sub-band filterbank driven by a psychoacoustic model. It was primarily designed for Digital Audio Broadcasting and digital TV, and this codec, incorporated into a broadcasting system using COFDM modulation, was demonstrated on air and in the field together with Radio Canada and CRC Canada during the NAB show in 1991. As a doctoral student at Germany's University of Erlangen-Nuremberg, Karlheinz Brandenburg began working on music compression in the early 1980s.
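The size ratios quoted above follow directly from the bit rates. A back-of-the-envelope check, under the stated assumption of 1,411,200 bit/s CD audio and a 128 kbit/s constant-bit-rate encode (the 4-minute track length is an arbitrary example):

```python
# Check of the ratios mentioned above: CD audio vs. a 128 kbit/s CBR MP3.
cd_bitrate = 1_411_200          # bit/s: 44.1 kHz * 16 bit * 2 channels
mp3_bitrate = 128_000           # bit/s: common CBR setting

ratio = cd_bitrate / mp3_bitrate
print(f"MP3 at 128 kbit/s is about 1/{ratio:.0f} the size of CD audio")   # ~1/11

minutes = 4                      # an illustrative 4-minute track
mp3_bytes = mp3_bitrate * minutes * 60 / 8
print(f"a {minutes}-minute track is roughly {mp3_bytes / 1_000_000:.1f} MB at 128 kbit/s")
```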
14.
Digital rights management
–
Digital rights management (DRM) schemes are various access control technologies that are used to restrict the usage of proprietary hardware and copyrighted works. DRM technologies try to control the use, modification, and distribution of copyrighted works. The use of digital rights management is not universally accepted. Furthermore, works can become permanently inaccessible if the DRM scheme changes or if the service is discontinued, and the Electronic Frontier Foundation and the Free Software Foundation consider the use of DRM systems to be an anti-competitive practice. Worldwide, many laws have been created which criminalize the circumvention of DRM, as well as communication about such circumvention; such laws are part of the United States' Digital Millennium Copyright Act (DMCA) and the European Union's Copyright Directive. DRM is also referred to as copy protection, technical protection measures, copy prevention, or copy control. The advent of digital media and analog-to-digital conversion technologies has vastly increased the concerns of copyright-owning individuals. These concerns are particularly prevalent within the music and movie industries, because these sectors are partly or wholly dependent on the revenue generated from such works. This, combined with the Internet and popular file-sharing tools, has made unauthorized distribution of copies of copyrighted digital media much easier. DRM technologies enable content publishers to enforce their own access policies on content, such as restrictions on copying or viewing; these technologies have been criticized for restricting individuals from copying or using the content legally. DRM is in common use by the entertainment industry; however, Apple dropped DRM from all iTunes music files around 2009. In another example, tractor companies try to prevent DIY repair by the farmers who own the machines, relying on DRM laws such as the DMCA. Digital rights management techniques include restrictive licensing agreements, which control access to copyrighted materials; some restrictive licenses are imposed on consumers as a condition of entering a website or when downloading software. Another technique is encryption, the scrambling of expressive material and embedding of a tag, which is designed to control access and reproduction of information, including backup copies for personal use. Computer games sometimes use DRM technologies to limit the number of systems the game can be installed on by requiring authentication with an online server. Most games with this restriction allow three or five installs, although some allow an installation to be recovered when the game is uninstalled. In mid-2008, the publication of Mass Effect marked the start of a wave of titles primarily making use of SecuROM for DRM. The use of the DRM scheme in 2008's Spore backfired and there were protests, resulting in a considerable number of users seeking an unlicensed version instead; this backlash against the install limit was a significant factor in Spore becoming the most pirated game in 2008. Other games that use intrusive DRM, such as BioShock and Crysis Warhead, did not top such piracy lists. Although Ubisoft has not commented on the results of the experiment, Ubisoft formally announced a return to online authentication on 9 February 2010, through its Uplay online gaming platform, starting with Silent Hunter 5, The Settlers 7, and Assassin's Creed II.
Silent Hunter 5 was first reported to have been compromised within 24 hours of release. The Uplay system works by keeping the installed game on the local PC incomplete and then continuously downloading parts of the game code from Ubisoft's servers as the game progresses.
15.
Data Encryption Standard
–
The Data Encryption Standard (DES) is a symmetric-key algorithm for the encryption of electronic data. Although now considered insecure, it was highly influential in the advancement of modern cryptography. The publication of an NSA-approved encryption standard simultaneously resulted in its quick international adoption and widespread academic scrutiny. Controversies arose out of classified design elements, a relatively short key length of the symmetric-key block cipher design, and the involvement of the NSA, nourishing suspicions about a backdoor. The intense academic scrutiny the algorithm received over time led to the modern understanding of block ciphers and their cryptanalysis. DES is now considered to be insecure for many applications, mainly because the 56-bit key size is too small; in January 1999, distributed.net and the Electronic Frontier Foundation collaborated to publicly break a DES key in 22 hours and 15 minutes. There are also some analytical results which demonstrate theoretical weaknesses in the cipher, and the algorithm is believed to be practically secure in the form of Triple DES, although there are theoretical attacks. The cipher has been superseded by the Advanced Encryption Standard; furthermore, DES has been withdrawn as a standard by the National Institute of Standards and Technology. Some documentation makes a distinction between DES as a standard and DES as an algorithm, referring to the algorithm as the DEA. The origins of DES go back to the early 1970s. Accordingly, on 15 May 1973, after consulting with the NSA, the National Bureau of Standards solicited proposals for a cipher that would meet rigorous design criteria; none of the submissions, however, turned out to be suitable. A second request was issued on 27 August 1974, and this time IBM submitted a candidate which was deemed acceptable: a cipher developed during the period 1973-1974 based on an earlier algorithm, Horst Feistel's Lucifer cipher. On 17 March 1975, the proposed DES was published in the Federal Register; public comments were requested, and in the following year two open workshops were held to discuss the proposed standard. The suspicion was that the algorithm had been weakened by the intelligence agency so that they, but no one else, could easily read encrypted messages. Alan Konheim commented, "We sent the S-boxes off to Washington and they came back and were all different." The United States Senate Select Committee on Intelligence reviewed the NSA's actions to determine whether there had been any improper involvement; however, it also found that the NSA did not tamper with the design of the algorithm in any way. Another member of the DES team, Walter Tuchman, stated, "We developed the DES algorithm entirely within IBM using IBMers. The NSA did not dictate a single wire." In contrast, a declassified NSA book on cryptologic history states that the first offerings were disappointing, so NSA began working on its own algorithm; then Howard Rosenblum, deputy director for research and engineering, discovered that Walter Tuchman of IBM was working on a modification to Lucifer for general use. NSA gave Tuchman a clearance and brought him in to work jointly with the Agency on his Lucifer modification, and NSA worked closely with IBM to strengthen the algorithm against all except brute-force attacks and to strengthen the substitution tables. Conversely, NSA tried to convince IBM to reduce the length of the key from 64 to 48 bits.
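The "56-bit key size being too small" claim can be made concrete with simple arithmetic. In the sketch below, only the key-space size comes from the text; the keys-per-second figure is an illustrative assumption, not a measured value, chosen just to show how a fixed key space translates into search time.

```python
# Rough arithmetic on the 56-bit DES key space.
keyspace = 2 ** 56                          # about 7.2 * 10**16 possible keys
print(f"{keyspace:,} possible keys")

assumed_rate = 1e12                         # hypothetical: one trillion keys tested per second
seconds = keyspace / assumed_rate
print(f"exhaustive search at {assumed_rate:.0e} keys/s: about {seconds / 3600:.0f} hours")
```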
16.
RC4
–
In cryptography, RC4 is a stream cipher. While remarkable for its simplicity and speed in software, multiple vulnerabilities have been discovered in RC4; it is especially vulnerable when the beginning of the output keystream is not discarded, or when nonrandom or related keys are used. Particularly problematic uses of RC4 have led to very insecure protocols such as WEP. As of 2015, there is speculation that some state cryptologic agencies may possess the capability to break RC4 when it is used in the TLS protocol; the IETF has published RFC 7465 to prohibit the use of RC4 in TLS, and Mozilla and Microsoft have issued similar recommendations. In 2014, Ronald Rivest gave a talk and published a paper on an updated redesign called Spritz; a hardware accelerator of Spritz was published in Secrypt 2016. RC4 was designed by Ron Rivest of RSA Security in 1987. While it is officially termed Rivest Cipher 4, the RC acronym is alternatively understood to stand for Ron's Code. RC4 was initially a trade secret, but in September 1994 a description of it was anonymously posted to the Cypherpunks mailing list. It was soon posted on the sci.crypt newsgroup, and the leaked code was confirmed to be genuine, as its output was found to match that of proprietary software using licensed RC4. Because the algorithm is known, it is no longer a trade secret; the name RC4 is trademarked, however, so RC4 is often referred to as ARCFOUR or ARC4 to avoid trademark problems. The main factors in RC4's success over such a wide range of applications have been its speed and simplicity. RC4 generates a pseudorandom stream of bits (a keystream). As with any stream cipher, this keystream can be used for encryption by combining it with the plaintext using bit-wise exclusive-or. To generate the keystream, the cipher makes use of a secret internal state which consists of two parts: a permutation of all 256 possible bytes, and two 8-bit index pointers. The permutation is initialized with a variable-length key, typically between 40 and 2048 bits, using the key-scheduling algorithm; once this has been completed, the stream of bits is generated using the pseudo-random generation algorithm. First, the array S is initialized to the identity permutation; S is then processed for 256 iterations in a similar way to the main PRGA, but also mixes in bytes of the key at the same time. Each element of S is swapped with another element at least once every 256 iterations. In OpenBSD 5.5, released in May 2014, arc4random was modified to use ChaCha20; the implementations of arc4random in NetBSD and Linux's libbsd also use ChaCha20. However, the implementations of arc4random in FreeBSD and Mac OS X are still based on RC4, as of January 2015. Man pages for the new, ChaCha20-based arc4random include the backronym A Replacement Call for Random for ARC4 as a mnemonic. Proposed new random number generators are often compared to the RC4 random number generator. Several attacks on RC4 are able to distinguish its output from a random sequence. Many stream ciphers are based on linear feedback shift registers (LFSRs), which, while efficient in hardware, are less so in software. The design of RC4 avoids the use of LFSRs and is ideal for software implementation, as it requires only byte manipulations.
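The two stages described above are short enough to write out in full: the key-scheduling algorithm (KSA) initializes S to the identity permutation and mixes in the key, and the pseudo-random generation algorithm (PRGA) produces the keystream that is XORed with the data. The sketch below is for illustration only (RC4 is considered broken and should not be used for real security); the Key/Plaintext values are the widely published RC4 test vector.

```python
def rc4(key: bytes, data: bytes) -> bytes:
    # KSA: initialize S to the identity permutation, then mix in key bytes.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]

    # PRGA: generate one keystream byte per input byte and XOR them together.
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        k = S[(S[i] + S[j]) % 256]
        out.append(byte ^ k)
    return bytes(out)

ciphertext = rc4(b"Key", b"Plaintext")
print(ciphertext.hex())                 # bbf316e8d940af0ad3 (published test vector)
print(rc4(b"Key", ciphertext))          # b'Plaintext' - the same function decrypts
```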
17.
SHA-1
–
SHA-1 produces a 160-bit hash value known as a message digest. A SHA-1 hash value is typically rendered as a hexadecimal number, 40 digits long. SHA-1 is no longer considered secure against well-funded opponents; Microsoft, Google, Apple, and Mozilla have all announced that their respective browsers will stop accepting SHA-1 SSL certificates by 2017. On February 23, 2017, CWI Amsterdam and Google announced they had performed a collision attack against SHA-1, publishing two dissimilar PDF files which produce the same SHA-1 hash as a proof of concept. SHA-1 produces a message digest based on principles similar to those used by Ronald L. Rivest of MIT in the design of the MD4 and MD5 message digest algorithms. SHA-1 was developed as part of the U.S. Government's Capstone project. The original specification of the algorithm was published in 1993 under the title Secure Hash Standard, FIPS PUB 180, by the U.S. government standards agency NIST. This version is now often named SHA-0; it was withdrawn by the NSA shortly after publication and was superseded by the revised version, published in 1995 in FIPS PUB 180-1 and commonly designated SHA-1. SHA-1 differs from SHA-0 only by a single bitwise rotation in the message schedule of its compression function. According to the NSA, this was done to correct a flaw in the algorithm which reduced its cryptographic security, and publicly available techniques did indeed compromise SHA-0 before SHA-1. SHA-1 forms part of several widely used security applications and protocols, including TLS and SSL, PGP, SSH, S/MIME, and IPsec. Those applications can also use MD5; both MD5 and SHA-1 are descended from MD4. SHA-1 hashing is also used in distributed revision control systems like Git, Mercurial, and Monotone to identify revisions, and to detect data corruption or tampering. FIPS PUB 180-1 also encouraged adoption and use of SHA-1 by private and commercial organizations. A prime motivation for the publication of the Secure Hash Algorithm was the Digital Signature Standard, in which it is incorporated. The SHA hash functions have been used as the basis of the SHACAL block ciphers. Revision control systems such as Git and Mercurial use SHA-1 not for security but for ensuring that the data has not changed due to accidental corruption. Linus Torvalds said about Git: if you have disk corruption, if you have DRAM corruption, if you have any kind of problems at all, Git will notice them; it's not a question of if, it's a guarantee. You can have people who try to be malicious; they won't succeed. Nobody has been able to break SHA-1, but the point is that SHA-1, as far as Git is concerned, isn't even a security feature. The security parts are elsewhere, so a lot of people assume that since Git uses SHA-1 and SHA-1 is used for cryptographically secure stuff, they think that, okay, it must be a security feature; it has nothing at all to do with security, it's just the best hash you can get. One of the reasons I care is for the kernel: we had a break-in on one of the BitKeeper sites where people tried to corrupt the kernel source code repositories. This is called an attack and may or may not be practical depending on L.
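The 160-bit digest and its 40-hex-digit rendering are easy to demonstrate with the standard library; the input string is an arbitrary example, and, as noted above, SHA-1 should only be relied on for integrity checks, not collision resistance.

```python
import hashlib

# hashlib's sha1 yields a 20-byte (160-bit) digest, conventionally shown as 40 hex digits.
digest = hashlib.sha1(b"The quick brown fox jumps over the lazy dog").hexdigest()
print(digest)            # 2fd4e1c67a2d28fced849ee1bb76e7391b93eb12
print(len(digest))       # 40 hexadecimal digits = 160 bits
```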
18.
Library of Congress
–
The Library of Congress is the research library that officially serves the United States Congress and is the de facto national library of the United States. It is the oldest federal cultural institution in the United States. The Library is housed in three buildings on Capitol Hill in Washington, D.C.; it also maintains the Packard Campus in Culpeper, Virginia, which houses the National Audio-Visual Conservation Center. The Library of Congress claims to be the largest library in the world. Its collections are universal, not limited by subject, format, or national boundary, and include research materials from all parts of the world and in more than 450 languages; two-thirds of the books it acquires each year are in languages other than English. The Library of Congress moved to Washington in 1800, after sitting for years in the temporary national capitals of New York and Philadelphia. John J. Beckley, who became the first Librarian of Congress, was paid two dollars per day and was required to also serve as the Clerk of the House of Representatives. The small Congressional Library was housed in the United States Capitol for most of the 19th century, until the early 1890s. Most of the original collection had been destroyed by the British in 1814, during the War of 1812; to restore its collection, in 1815 the Library bought from former president Thomas Jefferson his entire personal collection of 6,487 books. After a period of growth, another fire struck the Library in its Capitol chambers in 1851, again destroying a large amount of the collection. The Library received, through the transference of the copyright function, the right to have two copies of all books, maps, illustrations, and diagrams printed in the United States deposited with it. It also began to build its collections of British and other European works, and its new building included several stories, built underground, of steel and cast-iron stacks. Although the Library is open to the public, only high-ranking government officials may check out books. The Library promotes literacy and American literature through projects such as the American Folklife Center, American Memory, the Center for the Book, and the Poet Laureate. James Madison is credited with the idea for creating a congressional library; part of the legislation appropriated $5,000 for the purchase of such books as may be necessary for the use of Congress, and for fitting up an apartment for containing them. Books were ordered from London, and the collection, consisting of 740 books and 3 maps, was housed in the new Capitol. As president, Thomas Jefferson played an important role in establishing the structure of the Library of Congress; the new law also extended to the president and vice president the ability to borrow books. These volumes had been left in the Senate wing of the Capitol when the British burned it in 1814. One of the only congressional volumes to have survived was a government account book of receipts; it was taken as a souvenir by a British commander, whose family later returned it to the United States government in 1940. Within a month, former president Jefferson offered to sell his library as a replacement.
19.
Google
–
Google is an American multinational technology company specializing in Internet-related services and products, which include online advertising technologies, search, cloud computing, software, and hardware. Google was founded in 1998 by Larry Page and Sergey Brin while they were Ph.D. students at Stanford University, in California. Together, they own about 14 percent of its shares. They incorporated Google as a privately held company on September 4, 1998, and an initial public offering took place on August 19, 2004. In August 2015, Google announced plans to reorganize its various interests as a conglomerate called Alphabet Inc.; Google, Alphabet's leading subsidiary, will continue to be the umbrella company for Alphabet's Internet interests. Upon completion of the restructuring, Sundar Pichai became CEO of Google, replacing Larry Page. Rapid growth since incorporation has triggered a chain of products, acquisitions, and partnerships beyond Google's core search engine. The company leads the development of the Android mobile operating system, the Google Chrome web browser, and Chrome OS. The new hardware chief, Rick Osterloh, stated that "a lot of the innovation that we want to do now ends up requiring controlling the end-to-end user experience." Google has also experimented with becoming an Internet carrier. Alexa, a company that monitors commercial web traffic, lists Google.com as the most visited website in the world, and several other Google services also figure in the top 100 most visited websites, including YouTube. Google's mission statement, from the outset, was "to organize the world's information and make it universally accessible and useful", and its unofficial slogan was "Don't be evil". In October 2015, the motto was replaced in the Alphabet corporate code of conduct by the phrase "Do the right thing". Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford University in Stanford, California. They called their new technology PageRank; it determined a website's relevance by the number of pages, and the importance of those pages, that linked back to the original site. Page and Brin originally nicknamed their new search engine BackRub, because the system checked backlinks to estimate the importance of a site. Originally, Google ran under Stanford University's website, with the domain google.stanford.edu. The domain name for Google was registered on September 15, 1997, and the company was incorporated on September 4, 1998. It was based in the garage of a friend in Menlo Park, California, and Craig Silverstein, a fellow PhD student at Stanford, was hired as the first employee. The first funding for Google was an August 1998 contribution of $100,000 from Andy Bechtolsheim, co-founder of Sun Microsystems, given before Google was incorporated. At least three other investors invested in 1998, among them Amazon.com founder Jeff Bezos and Stanford University computer science professor David Cheriton; author Ken Auletta claims that each invested $250,000. Early in 1999, Brin and Page decided they wanted to sell Google to Excite. They went to Excite CEO George Bell and offered to sell it to him for $1 million; Vinod Khosla, one of Excite's venture capitalists, talked the duo down to $750,000, but Bell still rejected the offer. Google's initial public offering took place five years later, on August 19, 2004; at that time, Larry Page, Sergey Brin, and Eric Schmidt agreed to work together at Google for 20 years, until the year 2024.
20.
Open-source software
–
Open-source software may be developed in a collaborative public manner. According to scientists who have studied it, open-source software is a prominent example of open collaboration. A 2008 report by the Standish Group states that adoption of open-source software models has resulted in savings of about $60 billion per year to consumers.

In the early days of computing, programmers and developers shared software in order to learn from each other; eventually the open-source notion was pushed to the wayside by the commercialization of software during the 1970s and 1980s. In 1997, Eric Raymond published The Cathedral and the Bazaar. In early 1998, Netscape released the source code of its Navigator browser, and this source code subsequently became the basis behind SeaMonkey, Mozilla Firefox, Thunderbird, and KompoZer. Netscape's act prompted Raymond and others to look into how to bring the Free Software Foundation's free-software ideas to the commercial software industry. The new term they chose was "open source," which was soon adopted by Bruce Perens, publisher Tim O'Reilly, Linus Torvalds, and others. The Open Source Initiative was founded in February 1998 to encourage use of the new term. A Microsoft executive publicly stated in 2001 that "open source is an intellectual property destroyer. I can't imagine something that could be worse than this for the software business." Today, IBM, Oracle, Google, and State Farm are just a few of the companies with a serious public stake in the competitive open-source market, and there has been a significant shift in corporate philosophy concerning the development of FOSS.

The free software movement was launched in 1983. In 1998, a group of individuals advocated that the term free software be replaced by open-source software as an expression that is less ambiguous. Software developers may want to publish their software with an open-source license; the Open Source Definition, notably, presents an open-source philosophy and further defines the terms of usage, modification, and redistribution of open-source software. Software licenses grant rights to users that would otherwise be reserved by law to the copyright holder, and several open-source licenses have qualified within the boundaries of the Open Source Definition. The open source label came out of a strategy session held on April 7, 1998 in Palo Alto, in reaction to Netscape's January 1998 announcement of a source code release for Navigator. The participants used the opportunity before the release of Navigator's source code to clarify a potential confusion caused by the ambiguity of the word "free" in English. Many people claim that the birth of the Internet, in 1969, started the open-source movement. The Free Software Foundation, started in 1985, intended the word "free" to mean freedom to distribute and not freedom from cost; since a great deal of free software already was free of charge, such software became associated with zero cost. The Open Source Initiative was formed in February 1998 by Eric Raymond and Bruce Perens; they sought to bring a higher profile to the practical benefits of freely available source code and wanted to bring major software businesses and other high-tech industries into open source. Perens attempted to register "open source" as a service mark for the OSI. The Open Source Initiative's definition is recognized by governments internationally as the standard or de facto definition, and OSI uses the Open Source Definition to determine whether it considers a software license open source.
21.
VirtualDub
–
VirtualDub is a free and open-source video capture and video processing utility for Microsoft Windows, written by Avery Lee. It is designed to process linear video streams, including filtering, and it uses the AVI container format to store captured video. The first version of VirtualDub, written for Windows 95, to be released on SourceForge was uploaded on August 20, 2000. In 2009, the third-party software print guide Learning VirtualDub referred to VirtualDub as "the leading free Open Source video capture and processing tool." Several hundred third-party plug-ins for VirtualDub exist, including some by professional software companies; furthermore, Debugmode Wax allows use of VirtualDub plug-ins in professional video editing software such as Adobe Premiere Pro and Vegas Pro. VirtualDub is designed for Microsoft Windows and may also run on Linux through compatibility layers, but native support for non-Windows systems is not available.

VirtualDub was made to operate exclusively on AVI files; however, appropriate video and audio codecs need to be installed. VirtualDub supports both DirectShow and Video for Windows for video capture. VirtualDub can help overcome problems with digital cameras that also record video: many models, especially Canon's, record in an M-JPEG format incompatible with Sony Vegas 6.0 and 7.0, and re-saving the AVI files as old-style AVI files allows them to appear in Vegas. VirtualDub supports DV capture from Type 2 FireWire controllers only; there is no DV batch capture, still image capture, or DV device control capability.

VirtualDub can create a video file from a series of image files in the Truevision TGA or Windows Bitmap formats. Individual frames must be given file names numbered in order without any gaps (a small illustration of this numbering requirement follows at the end of this entry). From those frames, the frame rate can be adjusted, and other modifications, such as the addition of a sound track, can be made. VirtualDub can also disassemble a video by extracting its sound tracks and saving its frames as Truevision TGA or Windows Bitmap files. VirtualDub can delete segments of a video file, append new segments, or reorder existing segments. Appended segments must have similar audio and video formats, dimensions, numbers of channels, and frame rates; otherwise, VirtualDub is incapable of mixing dissimilar video files or adding transition effects between segments. VirtualDub comes with a number of video-editing components known as filters. They can perform tasks such as arbitrary resizing, converting the video to grayscale, arbitrary rotation, cropping, or changing simple values like brightness. Filters may be used during assembly as well, and filter plug-ins further extend VirtualDub's capabilities; a plug-in SDK is available for developers to create their own video and audio filters.
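The gapless, zero-padded numbering requirement mentioned above can be illustrated with a small, hypothetical script; the directory names and the four-digit padding are assumptions, and the script is not part of VirtualDub itself.

    # Copy an arbitrary set of TGA frames into a gapless, zero-padded
    # sequence (frame0000.tga, frame0001.tga, ...) that an image-sequence
    # importer such as VirtualDub's can open. Paths are hypothetical.
    import os
    import shutil

    src_dir = "captured_frames"      # assumed input directory
    dst_dir = "numbered_frames"      # assumed output directory
    os.makedirs(dst_dir, exist_ok=True)

    frames = sorted(f for f in os.listdir(src_dir) if f.lower().endswith(".tga"))
    for index, name in enumerate(frames):
        shutil.copy(os.path.join(src_dir, name),
                    os.path.join(dst_dir, f"frame{index:04d}.tga"))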
22.
Internet Assigned Numbers Authority
–
The Internet Assigned Numbers Authority (IANA) oversees global IP address allocation, autonomous system number allocation, root zone management in the Domain Name System, and other Internet Protocol-related symbol and number registries. Following ICANN's transition to a global multistakeholder governance model, the IANA functions were transferred to Public Technical Identifiers, an affiliate of ICANN. In addition, five regional Internet registries delegate number resources to their customers: local Internet registries and Internet service providers. A local Internet registry is an organization that assigns parts of its allocation from a regional Internet registry to other customers; most local Internet registries are also Internet service providers.

IANA is broadly responsible for the allocation of globally unique names and numbers that are used in Internet protocols published as Request for Comments documents. These documents describe methods, behaviors, research, or innovations applicable to the working of the Internet. IANA maintains a close liaison with the Internet Engineering Task Force and the RFC Editorial team in fulfilling this function. IANA is responsible for the assignment of Internet numbers, which are numerical identifiers assigned to an Internet resource or used in the protocols of the Internet Protocol Suite; examples include IP addresses and autonomous system numbers. IANA delegates allocations of IP address blocks to regional Internet registries. Each RIR allocates addresses for a different area of the world. Collectively, the RIRs have created the Number Resource Organization, a body formed to represent their collective interests and ensure that policy statements are coordinated globally. The RIRs divide their allocated address pools into smaller blocks and delegate them to Internet service providers; since the exhaustion of the Internet Protocol version 4 address space, no further IPv4 address space is allocated by IANA. IANA administers the data in the root nameservers, which form the top of the hierarchical Domain Name System tree. This task involves liaising with top-level domain operators, the root nameserver operators, and ICANN's policy-making apparatus. IANA also administers many parameters of IETF protocols; examples include the names of uniform resource identifier schemes and the character encodings recommended for use on the Internet. This task is performed under the oversight of the Internet Architecture Board.

On March 26, 1972, Vint Cerf and Jon Postel at UCLA called for establishing a socket number catalog in RFC 322. Network administrators were asked to submit a note or place a phone call describing the function and socket numbers of the network services on their hosts. This catalog was published as RFC 433 in December 1972; in it Postel first proposed a registry of assignments of port numbers to network services, calling himself the czar of socket numbers. The first reference to the name IANA in the RFC series is in RFC 1083, published in December 1988 by Postel at USC-ISI. Later, there was widespread dissatisfaction with the concentration of control over domain-name registration in one company, and people looked to IANA for a solution. Postel wrote up a draft on IANA and the creation of new top-level domains; he was trying to institutionalize IANA. In retrospect, this would have been valuable, since he died about two years later. Jon Postel managed the IANA function from its inception on the ARPANET until his death in October 1998; through almost 30 years of selfless service, Postel created his de facto authority to manage key parts of the Internet infrastructure.
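As a small illustration of the port-number registry idea described above: most operating systems ship a local copy of the well-known service-to-port assignments derived from IANA's registry, and Python's standard library can query that local table (this reads the system's copy, not IANA's live registry).

    # Look up well-known service-to-port assignments; the local table these
    # calls consult is derived from the IANA port-number registry.
    import socket

    for service in ("http", "https", "smtp", "domain"):
        print(service, socket.getservbyname(service, "tcp"))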
23.
Multimedia
–
Multimedia is content that uses a combination of different content forms such as text, audio, images, animations, video, and interactive content. Multimedia contrasts with media that use only rudimentary computer displays, such as text-only presentations, or with traditional forms of printed or hand-produced material. Multimedia devices are electronic media devices used to store and experience multimedia content. Multimedia is distinguished from mixed media in fine art, and the term rich media is synonymous with interactive multimedia.

The term multimedia was coined by singer and artist Bob Goldstein to promote the July 1966 opening of his LightWorks at L'Oursin show at Southampton, Long Island. Goldstein was perhaps aware of an American artist named Dick Higgins and his earlier discussion of combining media. Two years later, in 1968, the term multimedia was re-appropriated to describe the work of a political consultant, David Sawyer, the husband of Iris Sawyer, one of Goldstein's producers at L'Oursin. In the intervening forty years, the word has taken on different meanings. In the late 1970s, the term referred to presentations consisting of multi-projector slide shows timed to an audio track. By the 1990s, however, multimedia took on its current meaning. In the 1993 first edition of Multimedia: Making It Work, Tay Vaughan declared, "Multimedia is any combination of text, graphic art, sound, animation, and video that is delivered by computer. When you allow the user – the viewer of the project – to control what and when these elements are delivered, it is interactive multimedia. When you provide a structure of linked elements through which the user can navigate, interactive multimedia becomes hypermedia." The German language society Gesellschaft für deutsche Sprache recognized the word's significance, summing up its rationale by stating that multimedia "has become a central word in the wonderful new media world."

In common usage, multimedia refers to an electronically delivered combination of media including video, still images, and audio; much of the content on the web today falls within this definition as understood by millions. That era also saw a boost in the production of educational multimedia CD-ROMs. The term video, if not used exclusively to describe motion photography, is ambiguous in multimedia terminology; video is often used to describe the file format or delivery format rather than the footage itself. Multiple forms of content are often not considered multimedia if they lack modern forms of presentation such as audio or video; likewise, single forms of content with single methods of information processing are sometimes called multimedia. Performing arts may also be considered multimedia, in that performers and props represent multiple forms of content and media. Multimedia presentations may be viewed in person on stage, projected, or transmitted; a broadcast may be a live or recorded multimedia presentation. Broadcasts and recordings can use analog or digital electronic media technology. Digital online multimedia may be downloaded or streamed, and streaming multimedia may be live or on-demand.
24.
Data compression
–
In signal processing, data compression, source coding, or bit-rate reduction involves encoding information using fewer bits than the original representation. Compression can be lossy or lossless. Lossless compression reduces bits by identifying and eliminating statistical redundancy; no information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information. The process of reducing the size of a data file is referred to as data compression; in the context of data transmission, it is called source coding, in opposition to channel coding. Compression is useful because it reduces the resources required to store and transmit data. Computational resources are consumed in the compression process and, usually, in the reversal of the process, so data compression is subject to a space–time complexity trade-off.

Lossless data compression algorithms usually exploit statistical redundancy to represent data without losing any information, so that the process is reversible. Lossless compression is possible because most real-world data exhibits statistical redundancy. For example, an image may have areas of color that do not change over several pixels; instead of coding "red pixel, red pixel, ..." the data may be encoded as "279 red pixels." This is a basic example of run-length encoding; there are many schemes to reduce file size by eliminating redundancy. The Lempel–Ziv compression methods are among the most popular algorithms for lossless storage. DEFLATE is a variation on LZ optimized for decompression speed and compression ratio, but compression can be slow. DEFLATE is used in PKZIP, Gzip, and PNG; LZW is used in GIF images. LZ methods use a table-based compression model in which table entries are substituted for repeated strings of data. For most LZ methods, this table is generated dynamically from earlier data in the input, and the table itself is often Huffman encoded. Current LZ-based coding schemes that perform well are Brotli and LZX; LZX is used in Microsoft's CAB format.

The best modern lossless compressors use probabilistic models, such as prediction by partial matching. The Burrows–Wheeler transform can also be viewed as a form of statistical modelling. The basic task of grammar-based codes is constructing a context-free grammar deriving a single string; Sequitur and Re-Pair are practical grammar-compression algorithms for which software is publicly available. In a further refinement of probabilistic modelling, statistical estimates can be coupled to arithmetic coding. Arithmetic coding is a more modern coding technique that uses the mathematical calculations of a finite-state machine to produce a string of encoded bits from a series of input data symbols.
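The run-length example above can be made concrete with a minimal sketch. Real codecs operate on binary data and handle many more cases, so this is only an illustration of the idea, not any particular format's encoder.

    # Minimal run-length encoding sketch: collapse runs of identical symbols
    # into (count, symbol) pairs, then expand them back. Illustrative only.
    from itertools import groupby

    def rle_encode(data):
        return [(len(list(group)), symbol) for symbol, group in groupby(data)]

    def rle_decode(pairs):
        return "".join(symbol * count for count, symbol in pairs)

    runs = rle_encode("rrrrrrgggbb")
    print(runs)               # [(6, 'r'), (3, 'g'), (2, 'b')]
    print(rle_decode(runs))   # rrrrrrgggbb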
25.
International Organization for Standardization
–
The International Organization for Standardization is an international standard-setting body composed of representatives from various national standards organizations. Founded on 23 February 1947, the organization promotes worldwide proprietary, industrial, and commercial standards. It is headquartered in Geneva, Switzerland, and as of March 2017 works in 162 countries. It was one of the first organizations granted general consultative status with the United Nations Economic and Social Council. ISO, the International Organization for Standardization, is an independent, non-governmental organization, the members of which are the standards organizations of the 162 member countries. It is the world's largest developer of international standards and facilitates world trade by providing common standards between nations. Nearly twenty thousand standards have been set, covering everything from manufactured products and technology to food safety. Use of the standards aids in the creation of products and services that are safe, reliable, and of good quality. The standards help businesses increase productivity while minimizing errors and waste; by enabling products from different markets to be directly compared, they help companies enter new markets and assist in the development of global trade on a fair basis. The standards also serve to safeguard consumers and the end users of products and services.

The three official languages of the ISO are English, French, and Russian. The name of the organization in French is Organisation internationale de normalisation. According to the ISO, because its name would have different abbreviations in different languages, the organization adopted ISO as its abbreviated name in reference to the Greek word isos, meaning "equal." However, during the meetings of the new organization, this Greek word was not invoked. Both the name ISO and the logo are registered trademarks. The organization today known as ISO began in 1926 as the International Federation of the National Standardizing Associations.

ISO is an organization whose members are recognized authorities on standards. Members meet annually at a General Assembly to discuss ISO's strategic objectives. The organization is coordinated by a Central Secretariat based in Geneva. A Council with a membership of 20 member bodies provides guidance and governance, and the Technical Management Board is responsible for over 250 technical committees. ISO has formed joint committees with the International Electrotechnical Commission to develop standards and terminology in the areas of electrical and electronic related technologies; ISO/IEC Joint Technical Committee 1 was created in 1987 to develop and maintain information technology standards. ISO has three membership categories. Member bodies are national bodies considered the most representative standards body in each country; these are the members of ISO that have voting rights. Correspondent members are countries that do not have their own standards organization; these members are informed about ISO's work but do not participate in standards promulgation. Subscriber members are countries with small economies; they pay reduced membership fees but can follow the development of standards.