Open-source software is a type of computer software in which source code is released under a license in which the copyright holder grants users the rights to use, study, change, and distribute the software to anyone and for any purpose. Open-source software may be developed in a collaborative public manner, and is a prominent example of open collaboration. Open-source software development can generate a more diverse scope of design perspectives than any single company is capable of developing and sustaining long term. A 2008 report by the Standish Group stated that the adoption of open-source software models has resulted in savings of about $60 billion per year for consumers. In the early days of computing, programmers and developers shared software in order to learn from each other and evolve the field of computing; the open-source notion was pushed to the wayside by the commercialization of software during the 1970s and 1980s. However, academics still developed software collaboratively: for example, Donald Knuth in 1979 with the TeX typesetting system, and Richard Stallman in 1983 with the GNU operating system.
In 1997, Eric Raymond published The Cathedral and the Bazaar, a reflective analysis of the hacker community and free-software principles. The paper received significant attention in early 1998 and was one factor in motivating Netscape Communications Corporation to release its popular Netscape Communicator Internet suite as free software; this source code subsequently became the basis for SeaMonkey, Mozilla Firefox, and KompoZer. Netscape's act prompted Raymond and others to look into how to bring the Free Software Foundation's free software ideas and perceived benefits to the commercial software industry. They concluded that the FSF's social activism was not appealing to companies like Netscape, and looked for a way to rebrand the free software movement to emphasize the business potential of sharing and collaborating on software source code. The new term they chose was "open source", soon adopted by Bruce Perens, publisher Tim O'Reilly, Linus Torvalds, and others. The Open Source Initiative was founded in February 1998 to encourage use of the new term and evangelize open-source principles.
While the Open Source Initiative sought to encourage the use of the new term and evangelize the principles it adhered to, commercial software vendors found themselves threatened by the concept of distributed software and universal access to an application's source code. A Microsoft executive publicly stated in 2001 that "open source is an intellectual property destroyer. I can't imagine something that could be worse than this for the software business and the intellectual-property business." However, while free and open-source software has historically played a role outside of the mainstream of private software development, companies as large as Microsoft have begun to develop official open-source presences on the Internet. IBM, Oracle, and State Farm are just a few of the companies with a serious public stake in today's competitive open-source market, reflecting a significant shift in corporate philosophy concerning the development of FOSS. The free-software movement was launched in 1983; in 1998, a group of individuals advocated that the term free software be replaced by open-source software as an expression that is less ambiguous and more comfortable for the corporate world.
Software licenses grant rights to users which would otherwise be reserved by copyright law to the copyright holder. Several open-source software licenses have qualified within the boundaries of the Open Source Definition; the most prominent and popular example is the GNU General Public License (GPL), which "allows free distribution under the condition that further developments and applications are put under the same licence", and are thus also free. The "open source" label came out of a strategy session held on April 7, 1998 in Palo Alto, in reaction to Netscape's January 1998 announcement of a source code release for Navigator. The group of individuals at the session included Tim O'Reilly, Linus Torvalds, Tom Paquin, Jamie Zawinski, Larry Wall, Brian Behlendorf, Sameer Parekh, Eric Allman, Greg Olson, Paul Vixie, John Ousterhout, Guido van Rossum, Philip Zimmermann, John Gilmore, and Eric S. Raymond. They used the opportunity before the release of Navigator's source code to clarify a potential confusion caused by the ambiguity of the word "free" in English.
Many people claimed that the birth of the Internet in 1969 started the open-source movement, while others do not distinguish between the open-source and free software movements. The Free Software Foundation, founded in 1985, intended the word "free" to mean freedom to distribute, not freedom from cost.
Magnussoft Deutschland GmbH is a pan-European computer game developer and publisher headquartered near Dresden in eastern Germany. Magnussoft does not publish outside of Europe, leaving that work to local companies. In Europe, magnussoft is well known for its releases of collections of software for 8-bit computer systems that were popular in the 1980s: the Commodore 64, Commodore Amiga, Atari XL/XE, and Atari ST. All required emulators are included so that the software works on ordinary PCs, although the programs themselves are unaltered. There is a general collection called Retro-Classix that covers a bit of everything, as well as collections that specialize in one particular system, like Amiga Classix or C64 Classix. Several successors followed; the company has released more than 160 products over the past ten years. Its assortment includes adventure games, board games, and strategy games, as well as shoot 'em ups and jump 'n' run games. Magnussoft has also released computer applications and educational software.
The software was brought to market under various labels in Germany, Switzerland, the Benelux countries, Great Britain, and the United States of America. By 2008, magnussoft had gained access to the software market in the lower budget and middle price range, cooperating with established German partners such as "ak tronik Software & Services", "KOCH Media", and the "Verlagsgruppe Weltbild". In addition, magnussoft has founded further subsidiaries in other parts of Europe. Magnussoft built its profile through the release of ZETA, a broad range of retro games, and classic computer games like Aquanoid, Barkanoid, or PLOTS!. Its games include Amiga Classix, Aquanoid, Barkanoid, Boulder Match, Break It, C64 Classix, Colossus Chess, the Dr. Tool series, the Fix & Foxi series, Jacks Crazy Cong, Jump Jack, KLIX, METRIS, the MiniGolf Packs series, PLOTS!, Pool Island, Retro-Classix, Sokoman, and the Dr. Brain series. Its applications include the Dr. Tool series, Driver Cataloger, Easy Bootmanager, Typing Tutor, Deutsch, Englisch und Mathe für Zwerge (German, English, and maths for young children), the Deutsch- und Mathe compilation, and Fahrschule (a driving-school program). In 2006, magnussoft incurred public criticism for ceasing the distribution and funding of the BeOS replacement magnussoft ZETA because of its uncertain legal status.
A web browser is a software application for accessing information on the World Wide Web. Each individual web page, image, and video is identified by a distinct Uniform Resource Locator (URL), enabling browsers to retrieve these resources from a web server and display them on the user's device. A web browser is not the same thing as a search engine, though the two are often confused. For a user, a search engine is just a website, such as google.com, that stores searchable data about other websites. But to connect to a website's server and display its web pages, a user needs to have a web browser installed on their device; the most popular browsers are Chrome, Safari, Internet Explorer, and Edge. The first web browser, called WorldWideWeb, was invented in 1990 by Sir Tim Berners-Lee, who recruited Nicola Pellow to write the Line Mode Browser, which displayed web pages on dumb terminals. 1993 was a landmark year with the release of Mosaic, credited as "the world's first popular browser". Its innovative graphical interface made the World Wide Web system easy to use and thus more accessible to the average person.
This, in turn, sparked the Internet boom of the 1990s, when the Web grew at a rapid rate. Marc Andreessen, the leader of the Mosaic team, soon started his own company, which released the Mosaic-influenced Netscape Navigator in 1994. Navigator became the most popular browser. Microsoft debuted Internet Explorer in 1995. Microsoft was able to gain a dominant position for two reasons: it bundled Internet Explorer with its popular Microsoft Windows operating system, and did so as freeware with no restrictions on usage. The market share of Internet Explorer peaked at over 95% in 2002. In 1998, desperate to remain competitive, Netscape launched what would become the Mozilla Foundation to create a new browser using the open-source software model; this work evolved into Firefox, first released by Mozilla in 2004. Firefox reached a 28% market share in 2011. Apple released its Safari browser in 2003; it remains the dominant browser on Apple platforms. The last major entrant to the browser market was Google, whose Chrome browser debuted in 2008 and has been a huge success.
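To make the page-retrieval step described earlier concrete, the following is a minimal Python sketch of a browser-like client fetching a resource by its URL. The address is illustrative, and real browsers implement this step with far more machinery (caching, TLS session reuse, parallel connections).

```python
# A browser-like fetch: request the resource named by a URL over HTTP(S)
# and read back the bytes that a rendering engine would then lay out.
from urllib.request import urlopen

url = "https://example.com/"  # illustrative page address
with urlopen(url) as response:
    content_type = response.headers.get("Content-Type")
    body = response.read()

print(content_type)                                  # e.g. "text/html; charset=UTF-8"
print(body[:80].decode("utf-8", errors="replace"))   # first bytes of the page
```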
Once a web page has been retrieved, the browser's rendering engine displays it on the user's device, including any image and video formats supported by the browser. Web pages contain hyperlinks to other pages and resources; each link contains a URL, and when it is clicked, the browser navigates to the new resource. Thus the process of bringing content to the user begins again. To implement all of this, modern browsers are a combination of numerous software components. Web browsers can be configured with a built-in menu. Depending on the browser, the menu may be named Options or Preferences; the menu has different types of settings. For example, users can change their home page and default search engine, or change default web page colors and fonts. Various network connectivity and privacy settings are also usually available. During the course of browsing, cookies received from various websites are stored by the browser; some of them contain login credentials or site preferences. However, others are used for tracking user behavior over long periods of time, so browsers provide settings for removing cookies when exiting the browser.
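To make the cookie mechanics concrete, here is a minimal sketch using Python's standard-library cookie jar. The URL is illustrative, and real browsers implement far more elaborate storage, scoping, and expiry policies.

```python
# Sketch of how a browser-like client accumulates cookies across requests.
from http.cookiejar import CookieJar
from urllib.request import build_opener, HTTPCookieProcessor

jar = CookieJar()  # in-memory cookie store, like a browser's cookie database
opener = build_opener(HTTPCookieProcessor(jar))

opener.open("https://example.com/")    # server may set cookies via Set-Cookie
for cookie in jar:
    print(cookie.name, cookie.domain)  # e.g. session IDs or site preferences

jar.clear()  # analogous to a "remove cookies on exit" browser setting
```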
Finer-grained management of cookies usually requires a browser extension. The most popular browsers have a number of features in common: they allow users to browse in a private mode, they can be customized with extensions, and some of them provide a sync service. Most browsers have these user interface features:

- The ability to open multiple pages at the same time, either in different browser windows or in different tabs of the same window.
- Back and forward buttons to go back to the previous page or forward to the next one.
- A refresh or reload button to reload the current page.
- A stop button to cancel loading the page.
- A home button to return to the user's home page.
- An address bar to input the URL of a page and display it.
- A search bar to input terms into a search engine.

There are also niche browsers with distinct features; one example is text-only browsers, which can benefit people with slow Internet connections or those with visual impairments.
3D rendering is the 3D computer graphics process of automatically converting 3D wire-frame models into 2D images on a computer. 3D renders may include photorealistic effects or non-photorealistic styles. Rendering is the final process of creating the actual 2D image or animation from the prepared scene; this can be compared to taking a photo or filming the scene after the setup is finished in real life. Several different, specialized rendering methods have been developed; these range from the distinctly non-realistic wireframe rendering through polygon-based rendering to more advanced techniques such as scanline rendering, ray tracing, or radiosity. Rendering may take from fractions of a second to days for a single image/frame. In general, different methods are better suited for either photorealistic rendering or real-time rendering. Rendering for interactive media, such as games and simulations, is calculated and displayed in real time, at rates of approximately 20 to 120 frames per second. In real-time rendering, the goal is to show as much information as the eye can process in a fraction of a second.
The primary goal is to achieve as high a degree of photorealism as possible at an acceptable minimum rendering speed. In fact, exploitations can be applied in the way the eye 'perceives' the world; as a result, the final image presented is not necessarily that of the real world, but one close enough for the human eye to tolerate. Rendering software may simulate such visual effects as depth of field or motion blur; these are attempts to simulate visual phenomena resulting from the optical characteristics of cameras and of the human eye. These effects can lend an element of realism to a scene, even if the effect is merely a simulated artifact of a camera. This is the basic method employed in games, interactive worlds, and VRML. The rapid increase in computer processing power has allowed a progressively higher degree of realism for real-time rendering, including techniques such as HDR rendering. Real-time rendering is often polygonal and aided by the computer's GPU. Animations for non-interactive media, such as feature films and video, are rendered much more slowly.
Non-real-time rendering enables the leveraging of limited processing power in order to obtain higher image quality. Rendering times for individual frames may vary from a few seconds to several days for complex scenes. Rendered frames are stored on a hard disk and can then be transferred to other media such as motion picture film or optical disk; these frames are displayed sequentially at high frame rates, typically 24, 25, or 30 frames per second, to achieve the illusion of movement. When the goal is photorealism, techniques such as ray tracing, path tracing, photon mapping, or radiosity are employed; this is the basic method employed in artistic works. Techniques have been developed for the purpose of simulating other naturally occurring effects, such as the interaction of light with various forms of matter. Examples of such techniques include particle systems, volumetric sampling, and subsurface scattering. The rendering process is computationally expensive, given the complex variety of physical processes being simulated.
Computer processing power has increased over the years, allowing for a progressively higher degree of realistic rendering. Film studios that produce computer-generated animations typically make use of a render farm to generate images in a timely manner. However, falling hardware costs mean that it is possible to create small amounts of 3D animation on a home computer system. The output of the renderer is often used as only one small part of a completed motion-picture scene: many layers of material may be rendered separately and integrated into the final shot using compositing software. Models of reflection/scattering and shading are used to describe the appearance of a surface. Although these issues may seem like problems all on their own, they are studied almost exclusively within the context of rendering. Modern 3D computer graphics rely on a simplified reflection model called the Phong reflection model. In the refraction of light, an important concept is the refractive index; in most 3D programming implementations, the term for this value is "index of refraction".
Shading can be broken down into two different techniques, which are often studied independently:

- Surface shading: how light spreads across a surface
- Reflection/scattering: how light interacts with a surface at a given point

Popular surface shading algorithms in 3D computer graphics include:

- Flat shading: a technique that shades each polygon of an object based on the polygon's "normal" and the position and intensity of a light source.
- Gouraud shading: invented by H. Gouraud in 1971, a fast and resource-conscious vertex shading technique used to simulate smoothly shaded surfaces.
- Phong shading: invented by Bui Tuong Phong, used to simulate specular highlights and smooth shaded surfaces.
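As a concrete illustration of the reflection models discussed above, the following is a minimal sketch of the Phong reflection model evaluated at a single surface point. The coefficient values and vectors are illustrative; production renderers evaluate this model (or more sophisticated ones) per pixel or per sample.

```python
# Minimal sketch of the Phong reflection model at one surface point:
# intensity = ambient + diffuse (Lambertian) + specular highlight.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(normal, to_light, to_viewer, ka=0.1, kd=0.7, ks=0.4, shininess=32):
    n, l, v = normalize(normal), normalize(to_light), normalize(to_viewer)
    # Diffuse term: brightness falls off with the angle to the surface normal.
    diffuse = max(dot(n, l), 0.0)
    # Reflect the light direction about the normal: R = 2(N.L)N - L.
    r = tuple(2 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    specular = max(dot(r, v), 0.0) ** shininess if diffuse > 0 else 0.0
    return ka + kd * diffuse + ks * specular

# Head-on light gives a bright result; grazing light is mostly ambient.
print(phong((0, 0, 1), (0, 0, 1), (0, 0, 1)))    # ~1.2 before clamping
print(phong((0, 0, 1), (1, 0, 0.1), (0, 0, 1)))  # dim, mostly ambient
```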
Graphical user interface
A graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and visual indicators such as secondary notation, instead of text-based user interfaces, typed command labels, or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces, which require commands to be typed on a computer keyboard; the actions in a GUI are usually performed through direct manipulation of the graphical elements. Beyond computers, GUIs are used in many handheld mobile devices such as MP3 players, portable media players, gaming devices, and smaller household and industrial controls. The term GUI tends not to be applied to other lower-display-resolution types of interfaces, such as video games, or to interfaces not including flat screens, like volumetric displays, because the term is restricted to the scope of two-dimensional display screens able to describe generic information, in the tradition of the computer science research at the Xerox Palo Alto Research Center.
Designing the visual composition and temporal behavior of a GUI is an important part of software application programming in the area of human–computer interaction. Its goal is to enhance the efficiency and ease of use for the underlying logical design of a stored program, a design discipline named usability. Methods of user-centered design are used to ensure that the visual language introduced in the design is well-tailored to the tasks. The visible graphical interface features of an application are sometimes referred to as chrome or GUI. Users interact with information by manipulating visual widgets that allow for interactions appropriate to the kind of data they hold; the widgets of a well-designed interface are selected to support the actions necessary to achieve the goals of users. A model–view–controller allows flexible structures in which the interface is independent of, and indirectly linked to, application functions, so the GUI can be customized easily; this allows users to select or design a different skin at will, and eases the designer's work to change the interface as user needs evolve.
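A minimal sketch of that model–view–controller separation, with illustrative class names: the model holds application state, the view is the replaceable "skin", and the controller routes user actions between them.

```python
# Model-view-controller sketch: the view (the GUI "skin") is independent of
# the model, so it can be swapped without changing the application logic.
class CounterModel:
    def __init__(self):
        self.value = 0          # application state lives only in the model

class PlainView:
    def render(self, model):
        return f"count = {model.value}"

class FancyView:                # an alternative skin; the model is untouched
    def render(self, model):
        return f"*** {model.value} ***"

class Controller:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def increment(self):        # a user action routed through the controller
        self.model.value += 1
        return self.view.render(self.model)

ui = Controller(CounterModel(), PlainView())
print(ui.increment())           # count = 1
ui.view = FancyView()           # swap the skin at will
print(ui.increment())           # *** 2 ***
```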
Good user interface design relates more to users, and less to system architecture. Large widgets, such as windows, provide a frame or container for the main presentation content such as a web page, email message, or drawing; smaller ones act as user-input tools. A GUI may be designed for the requirements of a vertical market as an application-specific graphical user interface. Examples include automated teller machines, point-of-sale touchscreens at restaurants, self-service checkouts used in retail stores, airline self-ticketing and check-in, information kiosks in public spaces like train stations or museums, and monitors or control screens in embedded industrial applications which employ a real-time operating system. By the 1980s, cell phones and handheld game systems also employed application-specific touchscreen GUIs. Newer automobiles use GUIs in their navigation systems and multimedia centers, or navigation/multimedia center combinations. A GUI uses a combination of technologies and devices to provide a platform that users can interact with, for the tasks of gathering and producing information.
A series of elements conforming to a visual language have evolved to represent information stored in computers. This makes it easier for people with few computer skills to use computer software. The most common combination of such elements in GUIs is the windows, icons, menus, pointer (WIMP) paradigm, especially in personal computers. The WIMP style of interaction uses a virtual input device to represent the position of a pointing device, most commonly a mouse, and presents information organized in windows and represented with icons. Available commands are compiled together in menus, and actions are performed making gestures with the pointing device. A window manager facilitates the interactions between windows and the windowing system; the windowing system handles hardware devices such as pointing devices and graphics hardware, as well as the positioning of the pointer. In personal computers, all these elements are modeled through a desktop metaphor to produce a simulation called a desktop environment, in which the display represents a desktop on which documents and folders of documents can be placed.
Window managers and other software combine to simulate the desktop environment with varying degrees of realism. Smaller mobile devices such as personal digital assistants and smartphones typically use the WIMP elements with different unifying metaphors, due to constraints in space and available input devices. Applications for which WIMP is not well suited may use newer interaction techniques, collectively termed post-WIMP user interfaces. As of 2011, some touchscreen-based operating systems such as Apple's iOS and Android use the class of GUIs named post-WIMP; these support styles of interaction using more than one finger in contact with a display, which allows actions such as pinching and rotating, which are unsupported by one pointer and mouse. Human interface devices for efficient interaction with a GUI include a computer keyboard, used together with keyboard shortcuts, and pointing devices for cursor control: mouse, pointing stick, and trackball, as well as virtual keyboards and head-up displays. There are also actions performed by programs that affect the GUI.
For example, there are components like inotify or D-Bus to facilitate communication between computer programs. Ivan Sutherland developed Sketchpad in 1963, widely held as the first graphical computer-aided design program.
A wireless network is a computer network that uses wireless data connections between network nodes. Wireless networking is a method by which homes, telecommunications networks, and business installations avoid the costly process of introducing cables into a building, or as a connection between various equipment locations. Wireless telecommunications networks are generally implemented and administered using radio communication; this implementation takes place at the physical level of the OSI model network structure. Examples of wireless networks include cell phone networks, wireless local area networks, wireless sensor networks, satellite communication networks, and terrestrial microwave networks. The first professional wireless network was developed under the brand ALOHAnet in 1969 at the University of Hawaii and became operational in June 1971. The first commercial wireless network was the WaveLAN product family, developed by NCR in 1986. Key milestones include the first 2G cell phone network in 1991, the first release of the 802.11 "Wi-Fi" protocol in June 1997, and 802.11 VoIP integration in 1999. Terrestrial microwave – Terrestrial microwave communication uses Earth-based transmitters and receivers resembling satellite dishes.
Terrestrial microwaves are in the low gigahertz range, which limits all communications to line-of-sight; relay stations are spaced approximately 48 km (30 mi) apart. Communications satellites – Satellites communicate via microwave radio waves, which are not deflected by the Earth's atmosphere; the satellites are stationed in space, typically in geosynchronous orbit 35,400 km above the equator. These Earth-orbiting systems are capable of receiving and relaying voice, data, and TV signals. Cellular and PCS systems use several radio communications technologies; the systems divide the region covered into multiple geographic areas, and each area has a low-power transmitter or radio relay antenna device to relay calls from one area to the next. Radio and spread spectrum technologies – Wireless local area networks use a high-frequency radio technology similar to digital cellular and a low-frequency radio technology. Wireless LANs use spread spectrum technology to enable communication between multiple devices in a limited area. IEEE 802.11 defines a common flavor of open-standards wireless radio-wave technology known as Wi-Fi.
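A back-of-the-envelope sketch of why the relay spacing works out to roughly 48 km: under standard atmospheric refraction, the radio horizon in kilometres is approximately 4.12 times the square root of the antenna height in metres, and a line-of-sight link can span the sum of the two towers' horizons. The tower heights below are illustrative.

```python
# Radio horizon estimate for a line-of-sight terrestrial microwave link.
# With standard refraction: horizon_km ~= 4.12 * sqrt(height_m), and the
# maximum link length is the sum of the two stations' horizons.
import math

def radio_horizon_km(height_m):
    return 4.12 * math.sqrt(height_m)

tower_a, tower_b = 34, 34   # hypothetical antenna heights in metres
link = radio_horizon_km(tower_a) + radio_horizon_km(tower_b)
print(f"{link:.0f} km")     # ~48 km, matching the typical relay spacing
```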
Free-space optical communication uses visible or invisible light for communications. In most cases, line-of-sight propagation is used, which limits the physical positioning of communicating devices. Wireless personal area networks (WPANs) connect devices within a relatively small area, generally within a person's reach. For example, both Bluetooth radio and invisible infrared light provide a WPAN for interconnecting a headset to a laptop. ZigBee also supports WPAN applications. Wi-Fi PANs are becoming commonplace as equipment designers start to integrate Wi-Fi into a variety of consumer electronic devices. Intel "My WiFi" and Windows 7 "virtual Wi-Fi" capabilities have made Wi-Fi PANs simpler and easier to set up and configure. A wireless local area network links two or more devices over a short distance using a wireless distribution method, usually providing a connection through an access point for internet access. The use of spread-spectrum or OFDM technologies may allow users to move around within a local coverage area and still remain connected to the network.
Products using the IEEE 802.11 WLAN standards are marketed under the Wi-Fi brand name. Fixed wireless technology implements point-to-point links between computers or networks at two distant locations, using dedicated microwave or modulated laser light beams over line-of-sight paths; it is often used in cities to connect networks in two or more buildings without installing a wired link. To connect to a Wi-Fi network, devices such as a router are commonly used, or a hotspot can be created using a mobile smartphone. A wireless ad hoc network, also known as a wireless mesh network or mobile ad hoc network, is a wireless network made up of radio nodes organized in a mesh topology; each node forwards messages on behalf of the other nodes, and each node performs routing. Ad hoc networks can "self-heal", automatically re-routing around a failed node. Various network layer protocols are needed to realize ad hoc mobile networks, such as Destination-Sequenced Distance Vector routing, Associativity-Based Routing, Ad hoc On-demand Distance Vector routing, and Dynamic Source Routing.
Wireless metropolitan area networks are a type of wireless network that connects several wireless LANs. WiMAX is one such technology, described by the IEEE 802.16 standard. Wireless wide area networks are wireless networks that cover large areas, such as between neighbouring towns and cities, or a city and its suburbs; these networks can be used to connect branch offices of a business or as a public Internet access system. The wireless connections between access points are usually point-to-point microwave links using parabolic dishes on the 2.4 GHz and 5.8 GHz bands, rather than the omnidirectional antennas used with smaller networks. A typical system contains access points and wireless bridging relays; other configurations are mesh systems. When combined with renewable energy sources such as photovoltaic solar panels or wind systems, they can be stand-alone systems.
In a cellular network, each cell characteristically uses a different set of radio frequencies from its immediate neighbouring cells, to avoid interference. When joined together, these cells provide radio coverage over a wide geographic area; this enables a large number of portable transceivers (e.g. mobile phones) to communicate with each other and with fixed transceivers anywhere in the network.
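A small sketch of the frequency reuse arithmetic behind this cell layout, under the standard hexagonal-cell model: cells in a cluster of size N use disjoint channel sets, and the same set reappears only at distance D = R·sqrt(3N) from a given cell. The cell radius and cluster sizes below are illustrative.

```python
# Frequency reuse in the idealized hexagonal cell model: a channel set used
# in one cell is reused only at distance D = R * sqrt(3N), where R is the
# cell radius and N the cluster size.
import math

def reuse_distance_km(cell_radius_km, cluster_size):
    return cell_radius_km * math.sqrt(3 * cluster_size)

R = 2.0                   # hypothetical cell radius in km
for N in (3, 4, 7, 12):   # valid hexagonal cluster sizes (N = i*i + i*j + j*j)
    print(N, f"{reuse_distance_km(R, N):.1f} km")
# Larger clusters push co-channel cells further apart (less interference),
# at the cost of fewer channels available per cell.
```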
PowerPC is a reduced instruction set computing (RISC) instruction set architecture created by the 1991 Apple–IBM–Motorola alliance, known as AIM. PowerPC, as an evolving instruction set, has since 2006 been named Power ISA, while the old name lives on as a trademark for some implementations of Power Architecture-based processors. PowerPC was the cornerstone of AIM's PReP and Common Hardware Reference Platform initiatives in the 1990s. Originally intended for personal computers, the architecture is well known for being used by Apple's Power Macintosh, PowerBook, iMac, iBook, and Xserve lines from 1994 until 2006, when Apple migrated to Intel's x86. It has since become niche in personal computers, but remains popular for embedded and high-performance processors. Its use in seventh-generation video game consoles and embedded applications provided an array of uses. In addition, PowerPC CPUs are still used in third-party AmigaOS 4 personal computers. PowerPC is based on IBM's earlier POWER instruction set architecture and retains a high level of compatibility with it.
The history of RISC began with IBM's 801 research project, on which John Cocke was the lead developer, and where he developed the concepts of RISC in 1975–78. 801-based microprocessors were used in a number of IBM embedded products, eventually becoming the 16-register IBM ROMP processor used in the IBM RT PC. The RT PC was a rapid design implementing the RISC architecture. Between 1982 and 1984, IBM started a project to build the fastest microprocessor on the market; the result was the POWER instruction set architecture, introduced with the RISC System/6000 in early 1990. The original POWER microprocessor, one of the first superscalar RISC implementations, was a high-performance, multi-chip design. IBM soon realized that a single-chip microprocessor was needed in order to scale its RS/6000 line from lower-end to high-end machines, and work began on a one-chip POWER microprocessor, designated the RSC. In early 1991, IBM realized its design could become a high-volume microprocessor used across the industry. At the time, Apple had realized the limitations and risks of its dependency upon a single CPU vendor, while Motorola was falling behind on delivering the 68040 CPU.
Furthermore, Apple had conducted its own research and made an experimental quad-core CPU design called Aquarius, which convinced the company's technology leadership that the future of computing was in the RISC methodology. IBM approached Apple with the goal of collaborating on the development of a family of single-chip microprocessors based on the POWER architecture. Soon after, Apple, being one of Motorola's largest customers of desktop-class microprocessors, asked Motorola to join the discussions due to their long relationship, Motorola's more extensive experience with manufacturing high-volume microprocessors than IBM, and the need for a second source for the microprocessors. This three-way collaboration between Apple, IBM, and Motorola became known as the AIM alliance. In 1991, the PowerPC was just one facet of a larger alliance among these three companies. At the time, most of the personal computer industry was shipping systems based on the Intel 80386 and 80486 chips, which have a complex instruction set computer (CISC) architecture, and development of the Pentium processor was well underway.
The PowerPC chip was one of several joint ventures involving the three alliance members, in their efforts to counter the growing Microsoft–Intel dominance of personal computing. For Motorola, POWER looked like an unbelievable deal: it allowed the company to sell a tested and powerful RISC CPU for little design cash of its own, maintained ties with an important customer, and seemed to offer the possibility of adding IBM as a customer too, which might buy smaller versions from Motorola instead of making its own. At this point Motorola already had its own RISC design in the form of the 88000, which was doing poorly in the market. Motorola was doing well with its 68000 family, and the majority of its funding was focused on that line, so the 88000 effort was somewhat starved for resources. The 88000 was in production, however, and had achieved a number of embedded design wins in telecom applications. If the new POWER one-chip version could be made bus-compatible at a hardware level with the 88000, that would allow both Apple and Motorola to bring machines to market far faster, since they would not have to redesign their board architecture.
The result of these various requirements was the PowerPC specification; the differences between the earlier POWER instruction set and that of PowerPC are outlined in Appendix E of the manual for PowerPC ISA v.2.02. Since 1991, IBM had a long-standing desire for a unifying operating system that would host all existing operating systems as personalities upon one microkernel; from 1991 to 1995, the company designed and aggressively evangelized what would become Workplace OS, targeting PowerPC. When the first PowerPC products reached the market, they were met with enthusiasm. In addition to Apple, both IBM and the Motorola Computer Group offered systems built around the processors. Microsoft released Windows NT 3.51 for the architecture, which was used in Motorola's PowerPC servers.