Infrared radiation, sometimes called infrared light, is electromagnetic radiation with wavelengths longer than those of visible light and is therefore invisible to the human eye, although IR at wavelengths up to about 1050 nanometers from specially pulsed lasers can be seen by humans under certain conditions. IR wavelengths extend from the nominal red edge of the visible spectrum at 700 nanometers to 1 millimeter. Most of the thermal radiation emitted by objects near room temperature is infrared; as with all electromagnetic radiation, IR carries radiant energy and behaves both like a wave and like its quantum particle, the photon. Infrared radiation was discovered in 1800 by the astronomer Sir William Herschel, who detected a type of invisible radiation in the spectrum lower in energy than red light by means of its effect on a thermometer. More than half of the total energy from the Sun was eventually found to arrive on Earth in the form of infrared; the balance between absorbed and emitted infrared radiation has a critical effect on Earth's climate.
Infrared radiation is emitted or absorbed by molecules when they change their rotational-vibrational movements. It excites vibrational modes in a molecule through a change in the dipole moment, making it a useful frequency range for the study of these energy states in molecules of the proper symmetry. Infrared spectroscopy examines the transmission of photons in the infrared range. Infrared radiation is used in industrial, military, law-enforcement and medical applications. Night-vision devices using active near-infrared illumination allow people or animals to be observed without the observer being detected. Infrared astronomy uses sensor-equipped telescopes to penetrate dusty regions of space such as molecular clouds, to detect objects such as planets, and to view red-shifted objects from the early days of the universe. Infrared thermal-imaging cameras are used to detect heat loss in insulated systems, to observe changing blood flow in the skin, and to detect overheating of electrical apparatus. Extensive uses for military and civilian applications include target acquisition, night vision and tracking.
Humans at normal body temperature radiate chiefly at wavelengths around 10 μm. Non-military uses include thermal efficiency analysis, environmental monitoring, industrial facility inspections, detection of grow-ops, remote temperature sensing, short-range wireless communication and weather forecasting. Infrared radiation extends from the nominal red edge of the visible spectrum at 700 nanometers to 1 millimeter; this range of wavelengths corresponds to a frequency range of approximately 430 THz down to 300 GHz. Below infrared is the microwave portion of the electromagnetic spectrum. Sunlight, at an effective temperature of 5,780 kelvins, is composed of near-thermal-spectrum radiation, more than half of it infrared. At zenith, sunlight provides an irradiance of just over 1 kilowatt per square meter at sea level. Of this energy, 527 watts is infrared radiation, 445 watts is visible light, and 32 watts is ultraviolet radiation. Nearly all the infrared radiation in sunlight is at wavelengths shorter than 4 micrometers. On the surface of Earth, at far lower temperatures than the surface of the Sun, some thermal radiation consists of infrared in the mid-infrared region, at much longer wavelengths than in sunlight.
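The claim that more than half of sunlight's energy arrives as infrared can be checked numerically by integrating Planck's law for an ideal 5,780 K blackbody. The following is a minimal sketch under that blackbody assumption, using simple trapezoidal integration:

```python
import math

def planck(wavelength_m, temp_k):
    """Spectral radiance of an ideal blackbody (Planck's law)."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / wavelength_m**5) / math.expm1(h * c / (wavelength_m * k * temp_k))

def band_power(lo_m, hi_m, temp_k, steps=100_000):
    """Trapezoidal integral of the Planck curve over [lo_m, hi_m]."""
    dw = (hi_m - lo_m) / steps
    total = 0.5 * (planck(lo_m, temp_k) + planck(hi_m, temp_k))
    for i in range(1, steps):
        total += planck(lo_m + i * dw, temp_k)
    return total * dw

T = 5780  # effective temperature of the Sun, kelvins
total = band_power(10e-9, 100e-6, T)      # effectively the whole spectrum
infrared = band_power(700e-9, 100e-6, T)  # beyond the 700 nm red edge
print(f"infrared fraction: {infrared / total:.1%}")  # about 51%, i.e. more than half
```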
However, black-body or thermal radiation is continuous: it gives off radiation at all wavelengths. Of these natural thermal radiation processes, only lightning and natural fires are hot enough to produce much visible energy, and even fires produce far more infrared than visible-light energy. In general, objects emit infrared radiation across a spectrum of wavelengths, but sometimes only a limited region of the spectrum is of interest because sensors collect radiation only within a specific bandwidth. Thermal infrared radiation has a maximum emission wavelength that is inversely proportional to the absolute temperature of the object, in accordance with Wien's displacement law. The infrared band is therefore subdivided into smaller sections. In one commonly used sub-division scheme, NIR and SWIR are sometimes called "reflected infrared", whereas MWIR and LWIR are sometimes referred to as "thermal infrared". Due to the nature of the blackbody radiation curves, typical "hot" objects, such as exhaust pipes, appear brighter in the MW band than the same object viewed in the LW band.
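Wien's displacement law makes this inverse relationship concrete: the peak emission wavelength is λ_max = b/T, with b ≈ 2898 µm·K. A short illustrative sketch (blackbody assumption, values rounded):

```python
WIEN_B = 2898.0  # Wien's displacement constant, in µm·K

def peak_wavelength_um(temp_k):
    """Wavelength of maximum blackbody emission, in micrometers."""
    return WIEN_B / temp_k

print(peak_wavelength_um(310))   # human skin, ~310 K  -> ~9.3 µm (thermal infrared)
print(peak_wavelength_um(700))   # hot exhaust, ~700 K -> ~4.1 µm (MWIR)
print(peak_wavelength_um(5780))  # the Sun's surface   -> ~0.50 µm (visible)
```

This matches the figures above: human bodies radiate chiefly around 10 µm, while hotter objects such as exhaust pipes peak in the mid-wave band.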
The International Commission on Illumination (CIE) recommended the division of infrared radiation into three bands: IR-A (700 nm–1,400 nm), IR-B (1,400 nm–3,000 nm) and IR-C (3,000 nm–1 mm). ISO 20473 specifies near-infrared (0.78–3 µm), mid-infrared (3–50 µm) and far-infrared (50–1,000 µm). Astronomers typically divide the infrared spectrum into near- (0.7–5 µm), mid- (5–40 µm) and far-infrared (40–350 µm) regions. These divisions are not precise and can vary depending on the publication. The three regions are used for observation of different temperature ranges, and hence different environments in space; the most common photometric system used in astronomy allocates capital letters to different spectral regions according to the filters used. These letters are understood in reference to atmospheric windows and appear, for instance, in the titles of many papers. A third scheme divides up the band based on the response of various detectors: near-infrared, from 0.7 to 1.0 µm; short-wave infrared, 1.0 to 3 µm (InGaAs covers to about 1.8 µm); and mid-wave infrared, 3 to 5 µm (defined by the atmospheric window and covered by indium antimonide, mercury cadmium telluride and lead selenide detectors).
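As a toy illustration of how such a scheme might be applied in software, the function below classifies a wavelength under the ISO 20473 divisions given above. It is a minimal sketch using the stated boundaries, not a standards-grade implementation:

```python
def iso20473_band(wavelength_um):
    """Classify a wavelength (in micrometers) under the ISO 20473 infrared scheme."""
    if wavelength_um < 0.78:
        return "not infrared (visible or shorter)"
    if wavelength_um <= 3.0:
        return "near-infrared (NIR)"
    if wavelength_um <= 50.0:
        return "mid-infrared (MIR)"
    if wavelength_um <= 1000.0:
        return "far-infrared (FIR)"
    return "not infrared (microwave or longer)"

print(iso20473_band(9.3))  # thermal emission from human skin -> mid-infrared (MIR)
```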
In computing, a stylus is a small pen-shaped instrument used to input commands to a computer screen, mobile device or graphics tablet. With touchscreen devices, a user places a stylus on the surface of the screen to draw or to make selections by tapping the stylus on the screen. In this manner, the stylus can be used instead of a mouse or trackpad as a pointing device, a technique called pen computing. Pen-like input devices which are larger than a stylus and offer increased functionality, such as programmable buttons, pressure sensitivity and electronic erasers, are known as digital pens. The stylus was the primary input device for personal digital assistants, and it is used on the Nintendo DS and 3DS handheld game consoles and the Wii U's Wii U GamePad; some smartphones, such as Windows Mobile phones, require a stylus for accurate input. However, devices featuring multi-touch finger input have become more popular than stylus-driven devices in the smartphone market; the stylus is used in the well-known Galaxy Note series manufactured by Samsung Electronics.
Graphics tablets use a stylus containing circuitry that allows multi-function buttons on the barrel of the pen or stylus to transmit user actions to the tablet. Most tablets detect varying degrees of pressure, e.g. for use in a drawing program to vary line thickness or color density. Beyond the input side, there has also been interest in the physical output of the stylus: new pen-based interfaces, such as the RealPen Project, have been proposed to simulate realistic physical sensations on digital surfaces, allowing users to feel as if they were writing with an analog pen. The first use of a stylus in a computing device was the Styalator, demonstrated by Tom Dimond in 1957.
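As a hypothetical illustration of how a drawing program might use pressure sensitivity, the sketch below maps a normalized pressure reading to a stroke width. The function name and the linear mapping are assumptions for illustration, not any particular tablet's API:

```python
def stroke_width(pressure, min_width=0.5, max_width=8.0):
    """Map normalized stylus pressure (0.0-1.0) to a stroke width in pixels.

    The linear mapping is a simplification; real drawing programs often
    apply a user-selectable response curve instead.
    """
    pressure = max(0.0, min(1.0, pressure))  # clamp out-of-range readings
    return min_width + pressure * (max_width - min_width)

print(stroke_width(0.25))  # light touch   -> ~2.4 px
print(stroke_width(1.0))   # full pressure -> 8.0 px
```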
The Internet is the global system of interconnected computer networks that use the Internet protocol suite to link devices worldwide. It is a network of networks that consists of private, academic and government networks of local to global scope, linked by a broad array of electronic and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web, electronic mail and file sharing. Some publications no longer capitalize "internet". The origins of the Internet date back to research commissioned by the federal government of the United States in the 1960s to build robust, fault-tolerant communication with computer networks. The primary precursor network, the ARPANET, served as a backbone for the interconnection of regional academic and military networks in the 1980s. The funding of the National Science Foundation Network as a new backbone in the 1980s, as well as private funding for other commercial extensions, led to worldwide participation in the development of new networking technologies and the merger of many networks.
The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet and generated sustained exponential growth as generations of institutional and mobile computers were connected to the network. Although the Internet had been used by academia since the 1980s, commercialization incorporated its services and technologies into every aspect of modern life. Most traditional communication media, including telephony, television, paper mail and newspapers, have been reshaped, redefined, or bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers and video streaming websites. Newspaper and other print publishing have adapted to website technology or have been reshaped into blogging, web feeds and online news aggregators. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums and social networking. Online shopping has grown exponentially both for major retailers and for small businesses and entrepreneurs, as it enables firms to extend their "brick and mortar" presence to serve a larger market or to sell goods and services entirely online.
Business-to-business and financial services on the Internet affect supply chains across entire industries. The Internet has no single centralized governance in either technological implementation or policies for access and usage; only the overreaching definitions of the two principal name spaces in the Internet, the Internet Protocol address space and the Domain Name System, are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers. The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force, a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. When the term Internet is used to refer to the specific global system of interconnected Internet Protocol networks, the word is a proper noun that should be written with an initial capital letter.
In common use and in the media, it is often erroneously not capitalized, i.e. written as "the internet". Some guides specify that the word should be capitalized when used as a noun but not capitalized when used as an adjective. The Internet is also often referred to as the Net, as a short form of network. As early as 1849, the word internetted was used uncapitalized as an adjective, meaning interconnected or interwoven. The designers of early computer networks used internet both as a noun and as a verb, in shorthand form of internetwork or internetworking, meaning interconnecting computer networks. The terms Internet and World Wide Web are used interchangeably in everyday speech. However, the World Wide Web, or the Web, is only one of a large number of Internet services; the Web is a collection of interconnected documents and other web resources, linked by hyperlinks and URLs. As another point of comparison, Hypertext Transfer Protocol, or HTTP, is the language used on the Web for information transfer, yet it is just one of many languages or protocols that can be used for communication on the Internet.
The term Interweb is a portmanteau of Internet and World Wide Web, used sarcastically to parody a technically unsavvy user. Research into packet switching, one of the fundamental Internet technologies, started in the early 1960s with the work of Paul Baran and Donald Davies. Packet-switched networks such as the NPL network, ARPANET, the Merit Network, CYCLADES and Telenet were developed in the late 1960s and early 1970s. The ARPANET project led to the development of protocols for internetworking, by which multiple separate networks could be joined into a network of networks. ARPANET development began with two network nodes, interconnected on 29 October 1969 between the Network Measurement Center at the University of California, Los Angeles Henry Samueli School of Engineering and Applied Science, directed by Leonard Kleinrock, and the NLS system at SRI International, run by Douglas Engelbart in Menlo Park, California. The third site was the Culler-Fried Interactive Mathematics Center at the University of California, Santa Barbara, followed by the University of Utah Graphics Department.
The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB. The unit prefix mega is a multiplier of 1,000,000 in the International System of Units (SI). Therefore, one megabyte is one million bytes of information; this definition has been incorporated into the International System of Quantities. However, in the computer and information technology fields, several other definitions are used that arose for historical reasons of convenience. A common usage has been to designate one megabyte as 1,048,576 bytes, a measurement that conveniently expresses the binary multiples inherent in digital computer memory architectures. However, most standards bodies have deprecated this usage in favor of a set of binary prefixes, in which this quantity is designated by the unit mebibyte. Less common is a convention that uses the megabyte to mean 1000×1024 bytes. Thus, the megabyte is used to measure either 1000² bytes or 1024² bytes. The interpretation using base 1024 originated as compromise technical jargon for byte multiples that needed to be expressed by powers of 2 but lacked a convenient name.
As 1024 approximates 1000, corresponding to the SI prefix kilo-, it was a convenient term to denote the binary multiple. In 1998 the International Electrotechnical Commission (IEC) proposed standards for binary prefixes requiring the use of megabyte to denote 1000² bytes and mebibyte to denote 1024² bytes. By the end of 2009, the IEC standard had been adopted by the IEEE, EU, ISO and NIST. Nevertheless, the term megabyte continues to be used with different meanings. Base 10: 1 MB = 1,000,000 bytes is the definition recommended by the International System of Units and the IEC. This definition is used in networking contexts and for most storage media (hard drives, flash-based storage, DVDs), and is consistent with the other uses of the SI prefix in computing, such as CPU clock speeds or measures of performance. The Mac OS X 10.6 file manager is a notable example of this usage in software: since Snow Leopard, file sizes are reported in decimal units. In this convention, one thousand megabytes is equal to one gigabyte, where 1 GB is one billion bytes.
Base 2: 1 MB = 1,048,576 bytes is the definition used by Microsoft Windows in reference to computer memory, such as RAM. This definition is synonymous with the unambiguous binary prefix mebibyte. In this convention, one thousand and twenty-four megabytes is equal to one gigabyte, where 1 GB is 1024³ bytes. Mixed: 1 MB = 1,024,000 bytes is the definition used to describe the formatted capacity of the "1.44 MB" 3.5-inch HD floppy disk, which actually has a capacity of 1,474,560 bytes. Semiconductor memory doubles in size for each address line added to an integrated circuit package, which favors counts that are powers of two. The capacity of a disk drive, by contrast, is the product of the sector size, the number of sectors per track, the number of tracks per side and the number of disk platters in the drive, and changes in any of these factors would not necessarily double the size. Sector sizes were set as powers of two for convenience in processing, and it was a natural extension to give the capacity of a disk drive in multiples of the sector size, giving a mix of decimal and binary multiples when expressing total disk capacity.
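The three conventions, and the mixed floppy-disk case, are easy to verify with a few lines of arithmetic. A sketch, using the geometry of the standard 3.5-inch HD floppy described above:

```python
MB_DECIMAL = 1000 ** 2   # 1,000,000 bytes (SI / IEC megabyte)
MB_BINARY  = 1024 ** 2   # 1,048,576 bytes (mebibyte, MiB)
MB_MIXED   = 1000 * 1024 # 1,024,000 bytes (floppy-disk "MB")

# Formatted capacity of a 3.5-inch HD floppy as a product of its geometry:
# 512-byte sectors x 18 sectors/track x 80 tracks/side x 2 sides.
floppy_bytes = 512 * 18 * 80 * 2
print(floppy_bytes)             # 1,474,560 bytes
print(floppy_bytes / MB_MIXED)  # 1.44  -> the "1.44 MB" marketing figure
print(floppy_bytes / MB_DECIMAL)  # ~1.47 MB in decimal units
print(floppy_bytes / MB_BINARY)   # ~1.41 MiB in binary units
```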
Depending on compression methods and file format, a megabyte of data can be: a 1-megapixel bitmap image with 256 colors stored without any compression; a 4-megapixel JPEG image with normal compression; 1 minute of 128 kbit/s MP3-compressed music; 6 seconds of uncompressed CD audio; or a typical English book volume in plain text format. The human genome consists of DNA representing about 800 MB of data; the parts that differentiate one person from another can be compressed to about 4 MB.
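These figures follow from simple arithmetic, sketched below using the bit rates and sample formats stated above:

```python
# 1 minute of 128 kbit/s MP3: bits per second x seconds / 8 bits per byte
mp3_bytes = 128_000 * 60 // 8
print(mp3_bytes)             # 960,000 bytes, just under 1 MB (decimal)

# Uncompressed CD audio: 44,100 samples/s x 2 channels x 2 bytes per sample
cd_bytes_per_second = 44_100 * 2 * 2
print(6 * cd_bytes_per_second)  # 1,058,400 bytes for 6 seconds, about 1 MB

# 1-megapixel bitmap with 256 colors: exactly one byte per pixel, no compression
print(1_000_000 * 1)         # 1,000,000 bytes
```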
A modem is a hardware device that converts data into a format suitable for a transmission medium so that it can be transmitted from computer to computer. The goal is to produce a signal that can be transmitted easily and decoded to reproduce the original digital data. Modems can be used with almost any means of transmitting analog signals, from light-emitting diodes to radio. A common type of modem is one that turns the digital data of a computer into a modulated electrical signal for transmission over telephone lines, to be demodulated by another modem at the receiver side to recover the digital data. Modems are classified by the maximum amount of data they can send in a given unit of time, expressed in bits per second or bytes per second. Modems can also be classified by their symbol rate, measured in baud; the baud unit denotes symbols per second, or the number of times per second the modem sends a new signal. For example, the ITU V.21 standard used audio frequency-shift keying with two possible frequencies, corresponding to two distinct symbols, to carry 300 bits per second using 300 baud.
By contrast, the original ITU V.22 standard, which could transmit and receive four distinct symbols, transmitted 1,200 bits per second by sending 600 symbols per second using phase-shift keying. News wire services in the 1920s used multiplex devices that satisfied the definition of a modem. However, the modem function was incidental to the multiplexing function, so they are not included in the history of modems. Modems grew out of the need to connect teleprinters over ordinary phone lines instead of the more expensive leased lines, which had been used for current loop–based teleprinters and automated telegraphs. In 1941, the Allies developed a voice encryption system called SIGSALY, which used a vocoder to digitize speech, encrypted the speech with a one-time pad and encoded the digital data as tones using frequency-shift keying. Mass-produced modems in the United States began as part of the SAGE air-defense system in 1958, connecting terminals at various airbases, radar sites and command-and-control centers to the SAGE director centers scattered around the United States and Canada.
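The relationship between the two measures is bit rate = baud × log₂(number of distinct symbols). A quick sketch checking this against the two standards mentioned above:

```python
import math

def bit_rate(baud, distinct_symbols):
    """Bits per second carried by a modem sending `baud` symbols per second."""
    return baud * math.log2(distinct_symbols)

print(bit_rate(300, 2))  # ITU V.21: 300 baud, 2 symbols  ->  300 bit/s
print(bit_rate(600, 4))  # ITU V.22: 600 baud, 4 symbols  -> 1200 bit/s
```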
SAGE modems were described by AT&T's Bell Labs as conforming to their newly published Bell 101 dataset standard. While they ran on dedicated telephone lines, the devices at each end were no different from commercial acoustically coupled Bell 101, 110-baud modems. The 201A and 201B Data-Phones were synchronous modems using two-bit-per-baud phase-shift keying. The 201A operated half-duplex at 2,000 bit/s over normal phone lines, while the 201B provided full-duplex 2,400 bit/s service on four-wire leased lines, the send and receive channels each running on their own set of two wires. The famous Bell 103A dataset standard was introduced by AT&T in 1962. It provided full-duplex service at 300 bit/s over normal phone lines. Frequency-shift keying was used, with the call originator transmitting at 1,070 or 1,270 Hz and the answering modem transmitting at 2,025 or 2,225 Hz. The 103A2 gave an important boost to the use of remote low-speed terminals such as the Teletype Model 33 ASR and KSR and the IBM 2741.
AT&T reduced modem costs by introducing the answer-only 113B/C modems. For many years, the Bell System maintained a monopoly on the use of its phone lines and on what devices could be connected to them. However, in the FCC's seminal Carterfone decision of 1968, the FCC concluded that electronic devices could be connected to the telephone system as long as they used an acoustic coupler. Since most handsets were supplied by Western Electric and thus of a standard design, acoustic couplers were easy to build. Acoustically coupled Bell 103A-compatible 300 bit/s modems were common during the 1970s. Well-known models included the Novation CAT and the Anderson-Jacobson, the latter spun off from an in-house project at Stanford Research Institute. A lower-cost option was the Pennywhistle modem, designed to be built using parts from electronics scrap and surplus stores. In December 1972, Vadic introduced the VA3400, notable for full-duplex operation at 1,200 bit/s over the phone network. Like the 103A, it used different frequency bands for transmit and receive.
In November 1976, AT&T introduced the 212A modem to compete with Vadic. It used the lower frequency set for transmission, and one could also use the 212A with a 103A modem at 300 bit/s. According to Vadic, the change in frequency assignments made the 212 intentionally incompatible with acoustic coupling, thereby locking out many potential modem manufacturers. In 1977, Vadic responded with the VA3467 triple modem, an answer-only modem sold to computer-center operators that supported Vadic's 1,200 bit/s mode, AT&T's 212A mode and 103A operation. The Hush-a-Phone decision had applied only to mechanical connections, but the Carterfone decision of 1968 led the FCC to introduce a rule setting stringent AT&T-designed tests for electronically coupling a device to the phone lines. This opened the door to direct-connect modems that plugged directly into the phone line rather than via a handset. However, the cost of passing the tests was considerable, and acoustically coupled modems remained common into the early 1980s.
The falling prices of electronics in the late 1970s led to an increasing number of direct-connect models around 1980. In spite of being directly connected, these modems were operated like their earlier acoustic versions: dialing and other phone-control operations were completed by hand, using an attached handset.
USB is an industry standard that establishes specifications for cables and protocols for connection and power supply between personal computers and their peripheral devices. Released in 1996, the USB standard is maintained by the USB Implementers Forum. There have been three generations of USB specifications: USB 1.x, USB 2.0 and USB 3.x. USB was designed to standardize the connection of peripherals such as keyboards, pointing devices, digital still and video cameras, portable media players, disk drives and network adapters to personal computers, both to communicate and to supply electric power. It has replaced interfaces such as serial ports and parallel ports and has become commonplace on a wide range of devices; USB connectors have also been replacing other types as battery chargers for portable devices. The Universal Serial Bus was developed to simplify and improve the interface between personal computers and peripheral devices, compared with previously existing standard or ad-hoc proprietary interfaces.
From the computer user's perspective, the USB interface improved ease of use in several ways. The USB interface is self-configuring, so the user need not adjust settings on the device and interface for speed or data format, or configure interrupts, input/output addresses, or direct memory access channels. USB connectors are standardized at the host, so any peripheral can use any available receptacle. USB takes full advantage of the additional processing power that can be economically put into peripheral devices so that they can manage themselves. The USB interface is "hot-pluggable", meaning devices can be exchanged without rebooting the host computer. Small devices can be powered directly from the USB interface, displacing extra power supply cables. Because use of the USB logos is only permitted after compliance testing, the user can have confidence that a USB device will work as expected without extensive interaction with settings and configuration. Installation of a device relying on the USB standard requires minimal operator action.
When a device is plugged into a port on a running personal computer system, it is either automatically configured using existing device drivers, or the system prompts the user to locate a driver, which is then installed and configured automatically. For hardware manufacturers and software developers, the USB standard eliminates the requirement to develop proprietary interfaces to new peripherals. The wide range of transfer speeds available from a USB interface suits devices ranging from keyboards and mice up to streaming video interfaces. A USB interface can be designed to provide the best available latency for time-critical functions, or it can be set up to do background transfers of bulk data with little impact on system resources. The USB interface is generalized, with no signal lines dedicated to only one function of one device. USB cables are limited in length, as the standard was meant to connect to peripherals on the same table-top, not between rooms or between buildings. However, a USB port can be connected to a gateway that accesses distant devices.
USB has "master-slave" protocol for addressing peripheral devices. Some extension to this limitation is possible through USB On-The-Go. A host cannot "broadcast" signals to all peripherals at once, each must be addressed individually; some high speed peripheral devices require sustained speeds not available in the USB standard. While converters exist between certain "legacy" interfaces and USB, they may not provide full implementation of the legacy hardware. For a product developer, use of USB requires implementation of a complex protocol and implies an "intelligent" controller in the peripheral device. Developers of USB devices intended for public sale must obtain a USB ID which requires a fee paid to the Implementers' Forum. Developers of products that use the USB specification must sign an agreement with Implementer's Forum. Use of the USB logos on the product require annual fees and membership in the organization. A group of seven companies began the development of USB in 1994: Compaq, DEC, IBM, Microsoft, NEC, Nortel.
The goal was to make it fundamentally easier to connect external devices to PCs by replacing the multitude of connectors at the back of PCs, addressing the usability issues of existing interfaces, and simplifying the software configuration of all devices connected to USB, as well as permitting greater data rates for external devices. Ajay Bhatt and his team worked on the standard at Intel. The original USB 1.0 specification, introduced in January 1996, defined data transfer rates of 1.5 Mbit/s Low Speed and 12 Mbit/s Full Speed. Microsoft Windows 95 OSR 2.1 provided OEM support for the devices. The first widely used version of USB was 1.1, released in September 1998. The 12 Mbit/s data rate was intended for higher-speed devices such as disk drives, and the lower 1.5 Mbit/s rate for low-data-rate devices such as joysticks.
Palm OS is a discontinued mobile operating system initially developed by Palm, Inc. for personal digital assistants in 1996. Palm OS was designed for ease of use with a touchscreen-based graphical user interface, and it is provided with a suite of basic applications for personal information management. Later versions of the OS were extended to support smartphones, and several other licensees have manufactured devices powered by Palm OS. Following Palm's purchase of the Palm trademark, the version licensed from ACCESS was renamed Garnet OS. In 2007, ACCESS introduced the successor to Garnet OS, called the Access Linux Platform, and in 2009 the main licensee of Palm OS, Palm, Inc. switched from Palm OS to webOS for their forthcoming devices. Palm OS was developed under the direction of Jeff Hawkins at Palm Computing, Inc. Palm was acquired by U.S. Robotics Corp. which in turn was bought by 3Com, which made the Palm subsidiary an independent publicly traded company on March 2, 2000. In January 2002, Palm set up a wholly owned subsidiary to develop and license Palm OS, named PalmSource.
PalmSource was spun off from Palm as an independent company on October 28, 2003, and Palm became a regular licensee of Palm OS, no longer in control of the operating system. In September 2005, PalmSource announced that it was being acquired by ACCESS. In December 2006, Palm gained perpetual rights to the Palm OS source code from ACCESS; with this, Palm can modify the licensed operating system as needed without paying further royalties to ACCESS. Together with the May 2005 acquisition of full rights to the Palm brand name, only Palm can publish releases of the operating system under the name 'Palm OS'. As a consequence, on January 25, 2007, ACCESS announced a name change for their current Palm OS operating system, now titled Garnet OS. Palm OS is a proprietary mobile operating system. Designed in 1996 for Palm Computing, Inc.'s new Pilot PDA, it has been implemented on a wide array of mobile devices, including smartphones, wrist watches, handheld gaming consoles, barcode readers and GPS devices. Palm OS versions earlier than 5.0 run on Motorola/Freescale DragonBall processors.
From version 5.0 onwards, Palm OS runs on ARM architecture-based processors. The key features of the current Palm OS Garnet are: a simple, single-tasking environment that allows launching of full-screen applications with a basic, common GUI set; monochrome or color screens with resolutions up to 480x320 pixels; a handwriting recognition input system called Graffiti 2; HotSync technology for data synchronization with desktop computers; sound playback and record capabilities; a simple security model (the device can be locked by password, and arbitrary application records can be made private); TCP/IP network access; serial port/USB, Bluetooth and Wi-Fi connections; expansion memory card support; and a defined standard data format for personal information management (PIM) applications to store calendar, address and note entries, accessible by third-party applications. Included with the OS is a set of standard applications, with the most relevant ones covering the four mentioned PIM operations. Manufacturers are free to implement different features of the OS in their devices or to add new features.
This version history describes the licensed version from Palm/PalmSource/ACCESS. All versions prior to Palm OS 5 are based on top of the AMX 68000 kernel licensed from KADAK Products Ltd. While this kernel is technically capable of multitasking, the "terms and conditions of that license state that Palm may not expose the API for creating/manipulating tasks within the OS." Palm OS 1.0 is the original version, present on the Pilot 1000 and 5000. It was introduced in March 1996. Version 1.0 features the classic PIM applications Address, Date Book, Memo Pad and To Do List. Also included are a calculator and the Security tool to hide records for private use. Palm OS 1.0 does not support file system storage; applications are executed in place, and as no dedicated file system is supported, the operating system depends on constant RAM refresh cycles to keep its memory. The OS supports 160x160 monochrome output displays. User input is generated through the Graffiti handwriting recognition system or optionally through a virtual keyboard.
The system supports data synchronization to another PC via its HotSync technology over a serial interface. The latest bugfix release is version 1.0.7. Palm OS 2.0 was introduced in March 1997 with the PalmPilot Personal and Professional. This version adds TCP/IP networking, network HotSync and display backlight support; the last bugfix release is version 2.0.5. Two new applications, Mail and Expense, are added, and the standard PIM applications have been enhanced. Palm OS 3.0 was introduced in March 1998 with the launch of the Palm III series. This version adds IrDA infrared support and enhanced fonts, and features updated PIM applications and an update to the application launcher. Palm OS 3.1 adds only minor new features, like network HotSync support. It was introduced with the Palm IIIx and Palm V; the last bugfix release is version 3.1.1. Palm OS 3.2 adds Web Clipping support, an early Palm-specific solution for bringing web content to a small PDA screen. It was introduced with the Palm VII organizer. Palm OS 3.3 adds the ability to do infrared hotsyncing.
It was introduced with the Palm Vx organizer. Palm OS 3.5 is the first version to include native 8-bit color support. It also adds major convenience features that simplify operation, like a context-sensitive icon bar and simpler menu activation, and the datebook application is extended with an additional agenda view. This version was first introduced with the Palm IIIc device.