Portable media player
A portable media player or digital audio player is a portable consumer electronics device capable of storing and playing digital media such as audio and video files. The data is stored on a CD, DVD, BD, flash memory, microdrive, or hard drive. Most portable media players are equipped with a 3.5 mm headphone jack, into which users can plug headphones or a cable connecting the device to a boombox or hi-fi system. In contrast, analogue portable audio players play music from non-digital media that use analogue signal storage, such as cassette tapes or vinyl records. Mobile digital audio players are often marketed and sold as "portable MP3 players", even if they also support other file formats and media types. Increasing sales of smartphones and tablet computers have led to a decline in sales of portable media players, and most devices have been phased out, though flagship devices like the Apple iPod and Sony Walkman are still in production. Portable DVD/BD players are still manufactured by brands across the world; this article focuses on portable devices.
The immediate predecessor in the marketplace of the digital audio player was the portable CD player and, prior to that, the personal stereo. In particular, Sony's Walkman and Discman are the ancestors of digital audio players such as Apple's iPod. British scientist Kane Kramer invented the first digital audio player, which he called the IXI. His 1979 prototypes were capable of one hour of audio playback but did not enter commercial production. His UK patent application was not filed until 1981, and patents were issued in 1985 in the UK and 1987 in the US. However, in 1988 Kramer's failure to raise the £60,000 required to renew the patent meant that it entered the public domain, though he still owns the designs. Apple Inc. hired Kramer as a consultant and presented his work as an example of prior art in the field of digital audio players during its litigation with Burst.com two decades later. In 2008, Apple acknowledged Kramer as the inventor of the digital audio player. In 1996, AT&T developed the FlashPAC digital audio player, which used AT&T Perceptual Audio Coding for music compression but switched to AAC in 1997.
At about the same time, AT&T developed an internal Web-based music streaming service that could download music to FlashPAC. AAC and such music downloading services formed the foundation for the Apple iPod and iTunes. The first portable MP3 player was launched in 1997 by Saehan Information Systems, which sold its "MPMan" player in Asia in spring 1998. In mid-1998, the South Korean company licensed the players for North American distribution to Eiger Labs, which rebranded them as the EigerMan F10 and F20; the flash-based players were available in 32 MB or 64 MB storage capacities and had an LCD screen that showed the current song. The first production-volume portable digital audio player was the Audible Player from Audible.com, available for sale in January 1998 for US$200. It only supported playback of digital audio in Audible's proprietary, low-bitrate format, developed for spoken-word recordings. Capacity was limited to 4 MB of internal flash memory, or about 2 hours of play, using a custom rechargeable battery pack.
The unit had rudimentary controls. The Rio PMP300 from Diamond Multimedia, introduced in September 1998, a few months after the MPMan, featured 32 MB of storage. It was a success during the holiday season, with sales exceeding expectations, and it subsequently spurred interest and investment in digital music. Because of the player's notoriety as the target of a major lawsuit, the Rio is often erroneously assumed to be the first digital audio player. In 1998, Compaq developed the Personal Jukebox, the first hard-drive-based DAP, using a 2.5" laptop drive. It was licensed to HanGo Electronics, which first sold it as the PJB-100 in 1999; the player had an initial capacity of 4.8 GB, with an advertised capacity of 1,200 songs. In 2000, Creative released the 6 GB hard-drive-based Creative NOMAD Jukebox; the name borrowed the jukebox metaphor popularised by Remote Solution and later used by Archos. Players in the Creative NOMAD range used microdrives rather than laptop drives. In October 2000, the South Korean software company Cowon Systems released its first MP3 player, the CW100, under the brand name iAUDIO.
In December 2000, some months after Creative's NOMAD Jukebox, Archos released its Jukebox 6000 with a 6 GB hard drive. On 23 October 2001, Apple Computer unveiled the first-generation iPod, a 5 GB hard-drive-based DAP with a 1.8" Toshiba hard drive and a 2" monochrome display. With its spartan user interface and smaller form factor, the iPod was popular within the Macintosh community. In July 2002, Apple introduced the second-generation iPod, which was compatible with Windows computers through Musicmatch Jukebox. In 2007, Apple introduced the first iPod with a multi-touch screen, whose media playback functions were split into separate Music and Videos apps. In 2002, Archos released the first "portable media player", the Archos Jukebox Multimedia, with a small 1.5" colour screen; manufacturers have since built video playback into their devices. The next year, Archos released another multimedia jukebox, the AV300, with a 3.8" screen and a 20 GB hard drive. In 2004, Microsoft attempted to take advantage of the growing PMP market by launching the Portable Media Center platform.
It was introduced at the 2004 Consumer Electronics Show with the announcement of the Zen Portable Media Center, co-developed by Creative. The Microsoft Zune series would later be based on the Portable Media Center platform.
Multimedia is content that uses a combination of different content forms such as text, images, animations, and interactive content. Multimedia contrasts with media that use only rudimentary computer displays, such as text-only displays, or traditional forms of printed or hand-produced material. Multimedia can be recorded and played, interacted with, or accessed by information content processing devices, such as computerized and electronic devices, but can also be part of a live performance. Multimedia devices are electronic media devices used to experience multimedia content. Multimedia is distinguished from mixed media in fine art. In the early years of multimedia, the term "rich media" was synonymous with interactive multimedia, and "hypermedia" was an application of multimedia. The term multimedia was coined by singer and artist Bob Goldstein to promote the July 1966 opening of his "LightWorks at L'Oursin" show at Southampton, Long Island. Goldstein was aware of an American artist named Dick Higgins, who two years earlier had discussed a new approach to art-making he called "intermedia".
On August 10, 1966, Richard Albarino of Variety borrowed the terminology, reporting: "Brainchild of songscribe-comic Bob Goldstein, the 'Lightworks' is the latest multi-media music-cum-visuals to debut as discothèque fare." Two years later, in 1968, the term "multimedia" was re-appropriated to describe the work of a political consultant, David Sawyer, the husband of Iris Sawyer, one of Goldstein's producers at L'Oursin. In the intervening forty years, the word has taken on different meanings. In the late 1970s, the term referred to presentations consisting of multi-projector slide shows timed to an audio track. However, by the 1990s, 'multimedia' took on its current meaning. In the 1993 first edition of Multimedia: Making It Work, Tay Vaughan declared, "Multimedia is any combination of text, graphic art, sound and video, delivered by computer. When you allow the user – the viewer of the project – to control what and when these elements are delivered, it is interactive multimedia. When you provide a structure of linked elements through which the user can navigate, interactive multimedia becomes hypermedia." The German language society Gesellschaft für deutsche Sprache recognized the word's significance and ubiquity in the 1990s by awarding it the title of German 'Word of the Year' in 1995.
The institute summed up its rationale by stating "[multimedia] has become a central word in the wonderful new media world". In common usage, multimedia refers to an electronically delivered combination of media, including video, still images, and text, presented in such a way that it can be accessed interactively. Much of the content on the web today falls within this definition. Some computers marketed in the 1990s were called "multimedia" computers because they incorporated a CD-ROM drive, which allowed for the delivery of several hundred megabytes of video and audio data. That era saw a boost in the production of educational multimedia CD-ROMs. The term "video", if not used exclusively to describe motion photography, is ambiguous in multimedia terminology: video is often used to describe the file format, delivery format, or presentation format, as opposed to "footage", which is used to distinguish motion photography from "animation" of rendered motion imagery. Multiple forms of information content are often not considered multimedia if they do not contain modern forms of presentation such as audio or video.
Likewise, single forms of information content with single methods of information processing are often called multimedia, perhaps to distinguish static media from active media. In the fine arts, for example, Leda Luss Luyken's ModulArt brings two key elements of musical composition and film into the world of painting: variation of a theme and movement of and within a picture, making ModulArt an interactive multimedia form of art. Performing arts may also be considered multimedia, given that performers and props constitute multiple forms of both content and media. Multimedia presentations may be viewed in person on stage, transmitted, or played locally with a media player. A broadcast may be a live or recorded multimedia presentation. Broadcasts and recordings can use either analogue or digital electronic media technology. Digital online multimedia may be downloaded or streamed; streaming multimedia may be live or on-demand. Multimedia games and simulations may be used in a physical environment with special effects, with multiple users in an online network, or locally with an offline computer, game system, or simulator.
The various formats of technological or digital multimedia may be intended to enhance the users' experience, for example to make it easier and faster to convey information, or, in entertainment and art, to transcend everyday experience. Enhanced levels of interactivity are made possible by combining multiple forms of media content. Online multimedia is increasingly becoming object-oriented and data-driven, enabling applications with collaborative end-user innovation and personalization on multiple forms of content over time. Examples range from multiple forms of content on websites, like photo galleries with both images and titles updated by users, to simulations whose coefficients, illustrations, animations, or videos are modifiable, allowing the multimedia "experience" to be altered without reprogramming. In addition to seeing and hearing, haptic technology enables virtual objects to be felt, and emerging technology involving illusions of taste and smell may further enhance the multimedia experience. Multimedia may be broadly divided into linear and non-linear categories: linear content progresses without navigational control for the viewer, as in a cinema presentation, while non-linear content offers interactivity, as in a video game or self-paced computer-based training.
A file manager or file browser is a computer program that provides a user interface to manage files and folders. The most common operations performed on files or groups of files include creating, renaming, moving or copying, and searching for files, as well as modifying file attributes and file permissions. Folders and files may be displayed in a hierarchical tree based on their directory structure. Some file managers contain features inspired by web browsers, including forward and back navigation buttons. Some file managers provide network connectivity via protocols such as FTP, HTTP, NFS, SMB, or WebDAV; this is achieved either by allowing the user to browse for a file server or by providing full client implementations for file server protocols. A term that predates the usage of file manager is directory editor. An early directory editor, DIRED, was developed circa 1974 at the Stanford Artificial Intelligence Laboratory by Stan Kugell. A directory editor was also written for EXEC 8 at the University of Maryland and was available to other users at that time.
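The common operations listed above can be sketched programmatically. The following Python snippet (standard library only; the file and folder names are hypothetical examples) performs the same create, rename, move, copy, search, and permission-change primitives that a file manager exposes through its interface:

```python
import shutil
import stat
import tempfile
from pathlib import Path

# Work in a throwaway directory so the demo has no side effects.
root = Path(tempfile.mkdtemp())

# Create a folder and a file inside it.
docs = root / "docs"
docs.mkdir()
note = docs / "note.txt"
note.write_text("hello")

# Rename (within the same directory), then move to another directory.
renamed = note.rename(docs / "readme.txt")
archive = root / "archive"
archive.mkdir()
moved = Path(shutil.move(str(renamed), archive))

# Copy, preserving metadata such as timestamps.
shutil.copy2(moved, docs / "readme-copy.txt")

# Search for files by pattern, recursively.
matches = sorted(p.name for p in root.rglob("*.txt"))

# Modify file permissions (owner read-only here).
moved.chmod(stat.S_IRUSR)

print(matches)          # ['readme-copy.txt', 'readme.txt']
shutil.rmtree(root)     # clean up
```

Graphical file managers ultimately invoke the same operating-system calls; the snippet just drives them directly.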
The term was used by other developers, including Jay Lepreau, who wrote the dired program in 1980, which ran on BSD. It was in turn inspired by an older program with the same name running on TOPS-20. Dired inspired other programs, including the dired editor script and ded. File-list file managers are older than orthodox file managers. One such file manager is flist, introduced sometime before 1980 on the Conversational Monitor System; it is a variant of fulist, which originated before late 1978, according to comments by its author, Theo Alkema. The flist program provided a list of files on the user's minidisk and allowed sorting by any file attribute; the file attributes could be passed to scripts or function-key definitions, making it simple to use flist as part of CMS EXEC, EXEC 2, or XEDIT scripts. The program ran only on IBM VM/SP CMS but inspired other programs, including filelist and programs running on other operating systems, such as a program called flist, which ran on OpenVMS, and fulist, which runs on Unix.
Orthodox file managers, or command-based file managers, are text-menu-based file managers that commonly have three windows. Orthodox file managers are one of the longest-running families of file managers, preceding graphical user interface-based types. Developers create applications that duplicate and extend the manager concept introduced by PathMinder and John Socha's famous Norton Commander for DOS. The concept dates to the mid-1980s: PathMinder was released in 1984, and Norton Commander version 1.0 was released in 1986. Despite the age of this concept, file managers based on Norton Commander are still actively developed, and dozens of implementations exist for DOS and Microsoft Windows. Nikolai Bezroukov publishes his own set of criteria for an OFM standard. An orthodox file manager has three windows. Two of the windows are positioned symmetrically at the top of the screen; the third is the command line, a minimized command window that can be expanded to full screen. Only one of the panels is active at a given time; the active panel contains the "file cursor".
Panels can be hidden. Files in the active panel serve as the source of file operations performed by the manager; for example, files can be copied or moved from the active panel to the location represented in the passive panel. This scheme is most effective for systems in which the keyboard is the sole input device. The active panel shows information about the current working directory and the files it contains; the passive panel shows the contents of another directory. Users may customize the display of columns, and the active and passive panels can be switched. The following features describe the class of orthodox file managers. They present the user with a two-panel directory view with a command line below. Either panel may be selected to be active; the active panel becomes the working area for delete and rename operations, while the passive panel serves as the target for copy and move operations. Panels may be shrunk until only the last line of the terminal window is visible. They provide close integration with an underlying OS shell via the command line, using the associated terminal window to view the results of shell commands entered on the command line.
They provide the user with extensive keyboard shortcuts, freeing the user from having to use the mouse. Users can create their own file associations and scripts that are invoked for certain file types, and organize these scripts into a hierarchical tree. Users can extend the functionality of the manager via a so-called User menu or Start menu and extensions menu. Other common features include: information on the "active" and "passive" panels may be used for constructing commands on the command line (examples include the path to the left panel, the path to the right panel, etc.); a built-in viewer for the most basic file types; and a built-in editor, which in many cases can extract certain elements of the panels into the text being edited.
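The active/passive panel scheme described above can be modelled in a short sketch. The following Python code (class and method names are our own invention, not from any real orthodox file manager) shows the essential mechanics: one panel is active at a time, the file cursor lives in the active panel, and a copy operation takes its source from the active panel and its destination from the passive one:

```python
import shutil
import tempfile
from pathlib import Path

class Panel:
    """One of the two directory panels; holds a directory and a file cursor."""
    def __init__(self, directory: Path):
        self.directory = directory
        self.cursor = 0                      # index of the highlighted file

    def files(self):
        return sorted(p for p in self.directory.iterdir() if p.is_file())

    def current_file(self):
        return self.files()[self.cursor]

class OrthodoxFileManager:
    """Minimal two-panel model: the active panel is the source of operations,
    the passive panel is the target for copy/move (a sketch, not a real OFM)."""
    def __init__(self, left: Path, right: Path):
        self.panels = [Panel(left), Panel(right)]
        self.active = 0                      # index of the active panel

    def switch_panels(self):                 # e.g. bound to the Tab key
        self.active = 1 - self.active

    def copy(self):                          # e.g. bound to F5
        src = self.panels[self.active].current_file()
        dst = self.panels[1 - self.active].directory / src.name
        shutil.copy2(src, dst)
        return dst

# Demo in a temporary directory
root = Path(tempfile.mkdtemp())
left, right = root / "left", root / "right"
left.mkdir(); right.mkdir()
(left / "a.txt").write_text("data")

fm = OrthodoxFileManager(left, right)
copied = fm.copy()                           # copies a.txt into the passive panel
print(copied.name)                           # a.txt
```

This keyboard-centric source/target convention is why the design works well when the keyboard is the sole input device: every operation is fully determined by the two panel states plus a single keystroke.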
H.264, or MPEG-4 Part 10, Advanced Video Coding, is a block-oriented, motion-compensation-based video compression standard. As of 2014, it is one of the most commonly used formats for the recording and distribution of video content, and it supports resolutions up to 8192×4320, including 8K UHD. The intent of the H.264/AVC project was to create a standard capable of providing good video quality at lower bit rates than previous standards, without increasing the complexity of design so much that it would be impractical or excessively expensive to implement. An additional goal was to provide enough flexibility to allow the standard to be applied to a wide variety of applications on a wide variety of networks and systems, including low and high bit rates, low and high resolution video, broadcast, DVD storage, RTP/IP packet networks, and ITU-T multimedia telephony systems. The H.264 standard can be viewed as a "family of standards" composed of a number of different profiles. A specific decoder decodes at least one, but not necessarily all, profiles.
The decoder specification describes which profiles can be decoded. H.264 is typically used for lossy compression, although it is also possible to create lossless-coded regions within lossy-coded pictures or to support rare use cases for which the entire encoding is lossless. H.264 was developed by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC JTC1 Moving Picture Experts Group (MPEG). The project partnership effort is known as the Joint Video Team (JVT). The ITU-T H.264 standard and the ISO/IEC MPEG-4 AVC standard are jointly maintained so that they have identical technical content. The final drafting work on the first version of the standard was completed in May 2003, and various extensions of its capabilities have been added in subsequent editions. High Efficiency Video Coding (HEVC), a.k.a. H.265 and MPEG-H Part 2, is a successor to H.264/MPEG-4 AVC developed by the same organizations, while earlier standards remain in common use. H.264 is best known as being one of the video encoding standards for Blu-ray Discs. It is also widely used by streaming Internet sources, such as videos from Vimeo, YouTube, and the iTunes Store, by Web software such as Adobe Flash Player and Microsoft Silverlight, and by various HDTV broadcasts over terrestrial and satellite systems.
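The profile a decoder must support is signalled directly in the bitstream's decoder configuration. As an illustrative sketch (the byte layout follows the AVCDecoderConfigurationRecord defined in ISO/IEC 14496-15; `parse_avcc_header` is our own helper and the sample bytes are hypothetical), the profile and level can be read from the first bytes of a stream's avcC record:

```python
# Profile names keyed by profile_idc (a subset; values from the H.264 spec).
PROFILES = {66: "Baseline", 77: "Main", 88: "Extended", 100: "High",
            110: "High 10", 122: "High 4:2:2", 244: "High 4:4:4 Predictive"}

def parse_avcc_header(data: bytes):
    """Parse the first four bytes of an AVCDecoderConfigurationRecord:
    configurationVersion, AVCProfileIndication, profile_compatibility,
    and AVCLevelIndication."""
    if len(data) < 4 or data[0] != 1:
        raise ValueError("not a version-1 avcC record")
    profile_idc, compat, level_idc = data[1], data[2], data[3]
    return {
        "profile": PROFILES.get(profile_idc, f"unknown ({profile_idc})"),
        "profile_idc": profile_idc,
        "level": level_idc / 10,            # e.g. level_idc 31 -> Level 3.1
    }

# Hypothetical header bytes for High profile, Level 4.0:
info = parse_avcc_header(bytes([1, 100, 0, 40]))
print(info["profile"], info["level"])       # High 4.0
```

A player can inspect these bytes before attempting playback, which is how software decides whether a given decoder supports a given file's profile.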
H.264 is protected by patents owned by various parties. A license covering most patents essential to H.264 is administered by the patent pool MPEG LA. Commercial use of patented H.264 technologies requires the payment of royalties to MPEG LA and other patent owners. MPEG LA has allowed the free use of H.264 technologies for streaming Internet video that is free to end users, and Cisco Systems pays royalties to MPEG LA on behalf of users of binaries of its open-source H.264 encoder. The H.264 name follows the ITU-T naming convention, where the standard is a member of the H.26x line of VCEG video coding standards. The standard was developed jointly in a partnership of VCEG and MPEG, after earlier development work in the ITU-T as a VCEG project called H.26L. It is thus common to refer to the standard with names such as H.264/AVC, AVC/H.264, H.264/MPEG-4 AVC, or MPEG-4/H.264 AVC, to emphasize the common heritage. It is also sometimes referred to as "the JVT codec", in reference to the Joint Video Team organization that developed it.
Some software programs internally identify this standard as AVC1. In early 1998, the Video Coding Experts Group issued a call for proposals on a project called H.26L, with the target of doubling the coding efficiency in comparison to any other existing video coding standard for a broad variety of applications. VCEG was chaired by Gary Sullivan. The first draft design for the new standard was adopted in August 1999, and in 2000, Thomas Wiegand became VCEG co-chair. In December 2001, VCEG and the Moving Picture Experts Group formed a Joint Video Team, with the charter to finalize the video coding standard. Formal approval of the specification came in March 2003. The JVT was chaired by Gary Sullivan, Thomas Wiegand, and Ajay Luthra. In June 2004, the Fidelity Range Extensions project was finalized. From January 2005 to November 2007, the JVT worked on an extension of H.264/AVC towards scalability via an annex called Scalable Video Coding, and the JVT management team was extended by Jens-Rainer Ohm. From July 2006 to November 2009, the JVT worked on Multiview Video Coding, an extension of H.264/AVC towards free-viewpoint television and 3D television.
That work included the development of two new profiles of the standard: the Multiview High Profile and the Stereo High Profile. The standardization of the first version of H.264/AVC was completed in May 2003. In the first project to extend the original standard, the JVT developed what was called the Fidelity Range Extensions; these extensions enabled higher-quality video coding by supporting increased sample bit-depth precision and higher-resolution colour information, and introduced the High, High 10, High 4:2:2, and High 4:4:4 profiles.
The Internet is the global system of interconnected computer networks that use the Internet protocol suite to link devices worldwide. It is a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web, electronic mail, and file sharing. Some publications no longer capitalize "internet". The origins of the Internet date back to research commissioned by the federal government of the United States in the 1960s to build robust, fault-tolerant communication with computer networks. The primary precursor network, the ARPANET, served as a backbone for interconnection of regional academic and military networks in the 1980s. The funding of the National Science Foundation Network as a new backbone in the 1980s, as well as private funding for other commercial extensions, led to worldwide participation in the development of new networking technologies and the merger of many networks.
The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet and generated sustained exponential growth as generations of institutional and mobile computers were connected to the network. Although the Internet had been widely used by academia since the 1980s, commercialization incorporated its services and technologies into virtually every aspect of modern life. Most traditional communication media, including telephony, television, paper mail, and newspapers, have been reshaped, redefined, or bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper and other print publishing are adapting to website technology or are being reshaped into blogging, web feeds, and online news aggregators. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially both for major retailers and for small businesses and entrepreneurs, as it enables firms to extend their "brick and mortar" presence to serve a larger market or sell goods and services entirely online.
Business-to-business and financial services on the Internet affect supply chains across entire industries. The Internet has no single centralized governance in either technological implementation or policies for access and usage; the overreaching definitions of the two principal name spaces in the Internet, the Internet Protocol address space and the Domain Name System, are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers. The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force, a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. In November 2006, the Internet was included on USA Today's list of New Seven Wonders. When the term Internet is used to refer to the specific global system of interconnected Internet Protocol networks, the word is a proper noun that should be written with an initial capital letter.
In common use and in the media, it is often not capitalized: "the internet". Some guides specify that the word should be capitalized when used as a noun but not when used as an adjective. The Internet is also often referred to as the Net, as a short form of network. As early as 1849, the word internetted was used uncapitalized as an adjective meaning interconnected or interwoven. The designers of early computer networks used internet both as a noun and as a verb, in shorthand form of internetwork or internetworking, meaning interconnecting computer networks. The terms Internet and World Wide Web are often used interchangeably in everyday speech; however, the World Wide Web, or the Web, is only one of a large number of Internet services. The Web is a collection of interconnected documents and other web resources, linked by hyperlinks and URLs. As another point of comparison, Hypertext Transfer Protocol, or HTTP, is the language used on the Web for information transfer, yet it is just one of many languages or protocols that can be used for communication on the Internet.
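The distinction between the Web's protocol and the Internet beneath it can be made concrete: an HTTP request is just structured text carried over the Internet's transport protocols. A minimal sketch in Python (the host name is a placeholder example; the request format follows HTTP/1.1):

```python
# Build the raw bytes of a minimal HTTP/1.1 GET request.
host, path = "example.com", "/index.html"
request = (
    f"GET {path} HTTP/1.1\r\n"   # request line: method, path, version
    f"Host: {host}\r\n"          # mandatory Host header in HTTP/1.1
    "Connection: close\r\n"      # ask the server to close after replying
    "\r\n"                       # blank line ends the header section
).encode("ascii")

print(request.decode().splitlines()[0])   # GET /index.html HTTP/1.1

# Sending it is just writing those bytes down a TCP socket (port 80), e.g.:
# import socket
# with socket.create_connection((host, 80)) as s:
#     s.sendall(request)
#     reply = s.recv(4096)
```

Everything below that socket write, such as TCP, IP, and routing, belongs to the Internet at large; HTTP is only one of many protocols riding on top of it.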
The term Interweb is a portmanteau of Internet and World Wide Web, typically used sarcastically to parody a technically unsavvy user. Research into packet switching, one of the fundamental Internet technologies, started in the early 1960s in the work of Paul Baran and Donald Davies, and packet-switched networks such as the NPL network, ARPANET, the Merit Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s. The ARPANET project led to the development of protocols for internetworking, by which multiple separate networks could be joined into a network of networks. ARPANET development began with two network nodes, interconnected on 29 October 1969 between the Network Measurement Center at the University of California, Los Angeles Henry Samueli School of Engineering and Applied Science, directed by Leonard Kleinrock, and the NLS system at SRI International, run by Douglas Engelbart in Menlo Park, California. The third site was the Culler-Fried Interactive Mathematics Center at the University of California, Santa Barbara, followed by the University of Utah.
Digital rights management
Digital rights management tools, or technological protection measures, are a set of access control technologies for restricting the use of proprietary hardware and copyrighted works. DRM technologies try to control the use and distribution of copyrighted works, as do systems within devices that enforce these policies. The use of digital rights management is not universally accepted. Proponents of DRM argue that it is necessary to prevent intellectual property from being copied, just as physical locks are needed to prevent personal property from being stolen, that it can help the copyright holder maintain artistic control, and that it can ensure continued revenue streams. Those opposed to DRM contend there is no evidence that DRM helps prevent copyright infringement, arguing instead that it serves only to inconvenience legitimate customers and that DRM helps big business stifle innovation and competition. Furthermore, works can become permanently inaccessible if the DRM scheme changes or if the service is discontinued.
DRM can restrict users from exercising their legal rights under copyright law, such as backing up copies of CDs or DVDs, lending materials out through a library, accessing works in the public domain, or using copyrighted materials for research and education under the fair use doctrine. The Electronic Frontier Foundation and the Free Software Foundation consider the use of DRM systems to be an anti-competitive practice. Worldwide, many laws have been created that criminalize the circumvention of DRM, communication about such circumvention, and the creation and distribution of tools used for such circumvention; such laws are part of the United States' Digital Millennium Copyright Act and the European Union's Copyright Directive. The rise of digital media and analog-to-digital conversion technologies has vastly increased the concerns of copyright-owning individuals and organizations within the music and movie industries. While analog media lose quality with each copy generation, and in some cases even during normal use, digital media files may be duplicated an unlimited number of times with no degradation in quality.
The rise of personal computers as household appliances has made it convenient for consumers to convert media from a physical, analog, or broadcast form into a universal digital form for portability or later viewing. This, combined with the Internet and popular file-sharing tools, has made unauthorized distribution of copies of copyrighted digital media much easier. An early implementation of digital rights management, in 1983, was the Software Service System (SSS) devised by the Japanese engineer Ryuichi Moriya and subsequently refined under the name superdistribution. The SSS was based on encryption, with specialized hardware that controlled decryption and enabled payments to be sent to the copyright holder. The underlying principle of the SSS, and subsequently of superdistribution, was that the distribution of encrypted digital products should be unrestricted, and that users of those products would not just be permitted to redistribute them but would actually be encouraged to do so. Common DRM techniques include restrictive licensing agreements, under which access to digital materials is granted to consumers only as a condition of entering a website or downloading software.
Other techniques include encryption, scrambling of expressive material, and embedding of a tag, all designed to control access to and reproduction of information, including backup copies for personal use. DRM technologies enable content publishers to enforce their own access policies on content, such as restrictions on copying or viewing; these technologies have been criticized for restricting individuals from copying or using the content legally, such as under fair use. DRM is in common use by the entertainment industry. Many online music stores, such as Apple's iTunes Store, and e-book publishers and vendors, such as OverDrive, use DRM, as do cable and satellite service operators, to prevent unauthorized use of content or services. However, Apple dropped DRM from all iTunes music files around 2009. Industry has also expanded the use of DRM to more traditional hardware products, such as Keurig's coffeemakers, Philips' light bulbs, mobile device power chargers, and John Deere's tractors. For instance, tractor companies try to prevent farmers from making DIY repairs by invoking DRM laws such as the DMCA.
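The encryption-based gating mentioned above can be sketched in miniature. The following Python toy (emphatically not a real DRM scheme: real systems use vetted ciphers such as AES plus tamper-resistant key storage, neither of which this sketch provides; the license key is a hypothetical example) derives a keystream from a per-user key and XORs it with the content, so only a key holder can recover the plaintext:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Toy counter-mode-style keystream derived with SHA-256.
    Illustration only: do not use this for real cryptography."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor(data: bytes, key: bytes) -> bytes:
    """Encrypts and decrypts symmetrically: applying it twice is identity."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

license_key = b"issued-to-one-user"         # hypothetical per-user key
plaintext = b"protected media content"
ciphertext = xor(plaintext, license_key)    # what actually gets distributed

# Only a holder of the license key recovers the content:
assert xor(ciphertext, license_key) == plaintext
assert xor(ciphertext, b"wrong-key") != plaintext
```

The superdistribution idea described earlier fits this shape: the ciphertext may circulate freely, because usefulness depends on obtaining a key, and key issuance is where payment is enforced.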
Computer games sometimes use DRM technologies to limit the number of systems the game can be installed on by requiring authentication with an online server. Most games with this restriction allow three or five installs, although some allow an installation to be 'recovered' when the game is uninstalled. This not only limits users who have more than three or five computers in their homes, but can also prove to be a problem if the user unexpectedly has to perform certain tasks, such as upgrading the operating system or reformatting the computer's hard drive; depending on how the DRM is implemented, such tasks may count a game's subsequent reinstall as a new installation, making the game unusable after a certain period even if it is only used on a single computer. In mid-2008, the Windows version of Mass Effect marked the start of a wave of titles making use of SecuROM for DRM and requiring authentication with a server; the use of this DRM scheme in 2008's Spore backfired and provoked protests, with many users seeking unlicensed copies instead.
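The install-limit mechanism described above can be modelled in a few lines. This Python sketch (class and method names are our own invention; real activation servers also verify hardware fingerprints and cryptographic signatures) caps activations per license key and lets a deactivation 'recover' a slot, mirroring the three-to-five-install schemes the text describes:

```python
class ActivationServer:
    """Toy model of an online activation server that caps installs per
    license key (a sketch, not any vendor's actual protocol)."""
    def __init__(self, max_installs: int = 3):
        self.max_installs = max_installs
        self.installs = {}            # license key -> set of machine ids

    def activate(self, key: str, machine_id: str) -> bool:
        machines = self.installs.setdefault(key, set())
        if machine_id in machines:    # reinstall on a known machine is free
            return True
        if len(machines) >= self.max_installs:
            return False              # limit reached; activation refused
        machines.add(machine_id)
        return True

    def deactivate(self, key: str, machine_id: str) -> None:
        """'Recovering' an install when the game is uninstalled."""
        self.installs.get(key, set()).discard(machine_id)

server = ActivationServer(max_installs=3)
ok = [server.activate("KEY-1", m) for m in ("pc-a", "pc-b", "pc-c", "pc-d")]
print(ok)                                  # [True, True, True, False]
server.deactivate("KEY-1", "pc-c")         # uninstalling frees a slot
print(server.activate("KEY-1", "pc-d"))    # True
```

The failure mode in the text falls out of the model directly: if an OS reinstall changes the machine identifier, the old slot is never recovered, and the user eventually exhausts the limit on a single physical computer.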