Stephen Gary "Woz" Wozniak is an American inventor, electronics engineer, programmer, and technology entrepreneur who co-founded Apple Inc. in 1976, which became the world's largest information technology company by revenue and the largest company in the world by market capitalization. He and Apple co-founder Steve Jobs are recognized as pioneers of the personal computer revolution of the 1970s and 1980s. In 1975, Wozniak started developing the Apple I, the computer that launched Apple when he and Jobs began marketing it the following year. He designed the Apple II in 1977, known as one of the first highly successful mass-produced microcomputers, while Jobs oversaw the development of its foam-molded plastic case and early Apple employee Rod Holt developed the switching power supply. With computer scientist Jef Raskin, Wozniak had major influence over the initial development of the original Apple Macintosh concepts from 1979 to 1981, when Jobs took over the project following Wozniak's brief departure from the company due to a traumatic airplane accident.
After permanently leaving Apple in 1985, Wozniak founded CL 9 and created the first programmable universal remote, released in 1987. He has pursued several other business and philanthropic ventures throughout his career, focusing on technology in K–12 schools. Wozniak is Chief Scientist at the data virtualization company Primary Data and has remained an employee of Apple in a ceremonial capacity. Steve Wozniak was born in San Jose, California, the son of Francis Jacob "Jerry" Wozniak from Michigan and Margaret Louise Wozniak from Washington state. He graduated from Homestead High School in 1968. The name on Wozniak's birth certificate is "Stephan Gary Wozniak", but Steve's mother said that she intended it to be spelled "Stephen", which is what he uses. Wozniak has mentioned his surname being Polish and Ukrainian and has spoken of his Polish descent, but stated that he does not know the origin of some other people with the Wozniak surname because he is "no heritage expert". In the early 1970s, Wozniak was known as "Berkeley Blue" in the phone phreak community, after he made a blue box.
Wozniak has credited watching Star Trek and attending Star Trek conventions in his youth as a source of inspiration for starting Apple Inc. In 1969, Wozniak returned to the Bay Area after being expelled from the University of Colorado Boulder in his first year for sending prank messages on the university's computer system. During this time, as a self-taught project, Wozniak designed and built a "Cream Soda" computer with his friend Bill Fernandez. He re-enrolled at De Anza College and transferred to the University of California, Berkeley in 1971. Before focusing his attention on Apple, he was employed at Hewlett-Packard, where he designed calculators. It was during this time that Wozniak was introduced to Jobs by Fernandez, who attended Homestead High School with Jobs, in 1971. Jobs and Wozniak became friends when Jobs worked for the summer at HP, where Wozniak too was employed, working on a mainframe computer. In a 2007 interview with ABC News, Wozniak recounted how and when he first met Jobs: "We first met in 1971 during my college years, while he was in high school.
A friend said, 'You should meet Steve Jobs because he likes electronics, and he plays pranks.' So he introduced us." In 1973, Jobs was working for arcade game company Atari, Inc. in Los Gatos, California. He was assigned to create a circuit board for the arcade video game Breakout. According to Atari co-founder Nolan Bushnell, Atari offered $100 for each chip eliminated from the machine. Jobs had little knowledge of circuit board design and made a deal with Wozniak to split the fee evenly between them if Wozniak could minimize the number of chips. Wozniak reduced the number of chips by using RAM for the brick representation. The design was too complex to be comprehended at the time, and since the prototype also had no scoring or coin mechanisms, it could not be used; Jobs was paid the full bonus regardless. Jobs told Wozniak that Atari gave them only $700 and that Wozniak's share was thus $350. Wozniak did not learn about the actual $5,000 bonus until ten years later, but said that if Jobs had told him about it and had said he needed the money, Wozniak would have given it to him.
In 1975, Wozniak began designing and developing the computer that would make him famous, the Apple I. On June 29 of that year, he tested his first working prototype, displaying a few letters and running sample programs; it was the first time in history that a character displayed on a TV screen was generated by a home computer. With the Apple I, he and Jobs were working to impress other members of the Palo Alto-based Homebrew Computer Club, a local group of electronics hobbyists interested in computing; the Club was one of several key centers that established the home hobbyist era, helping create the microcomputer industry over the next few decades. Unlike other Homebrew designs, the Apple had an easy-to-achieve video capability that drew a crowd when it was unveiled. In 1976, Wozniak completed the Apple I design; he alone designed the hardware, circuit board layout, and operating system for the computer. Wozniak offered the design to HP while working there, but the company turned it down on five different occasions.
Jobs instead had the idea of selling the Apple I with Wozniak as an assembled printed circuit board. Wozniak, at first skeptical, was convinced by Jobs that even if they were not successful, they could at least say to their grandkids that they had had their own company. Togethe
Computerworld is a decades-old professional publication that "went digital" in 2014. Its audience is information technology and business technology professionals, and it is available via a publication website and as a digital magazine. It is published in many countries around the world under similar names; each country's version of Computerworld is managed independently. The parent company of Computerworld US is IDG Communications; the first issue was published in 1967. IDG offers the "Computerworld" brand in 47 countries worldwide, though the name and frequency differ; when IDG established the Swedish edition in 1983, the title "Computerworld" had already been registered in Sweden by another publisher, so a different name was used. The Swedish edition is distributed as a morning newspaper in tabloid format, in 51,000 copies with an estimated 120,000 readers. From 1999 to 2008, it was published three days a week, but since 2009, it has been published only on Tuesdays and Fridays. In June 2014, Computerworld US abandoned its print edition, becoming a digital-only publication.
In late July 2014, Computerworld debuted the monthly Computerworld Digital Magazine. In 2017, Computerworld celebrated its 50th year in tech publishing with a number of features and stories highlighting the publication's history. Computerworld's website premiered nearly two decades before its last printed issue. Computerworld US serves IT and business management with coverage of information technology, emerging technologies, and analysis of technology trends. Computerworld publishes several notable special reports each year, including the 100 Best Places to Work in IT, the IT Salary Survey, the DATA+ Editors' Choice Awards, and the annual Forecast research report. Computerworld has in the past published stories highlighting the effects of immigration to the U.S. on U.S. software engineers. The executive editor of Computerworld in the U.S. is Ken Mingis, who leads a small staff of editors and freelancers covering a variety of enterprise IT topics.
MOS Technology 6502
The MOS Technology 6502 is an 8-bit microprocessor designed by a small team led by Chuck Peddle for MOS Technology. When it was introduced in 1975, the 6502 was, by a considerable margin, the least expensive microprocessor on the market; it sold for less than one-sixth the cost of competing designs from larger companies such as Motorola and Intel, and it caused rapid decreases in pricing across the entire processor market. Along with the Zilog Z80, it sparked a series of projects that resulted in the home computer revolution of the early 1980s. Popular home video game consoles and computers, such as the Atari 2600, Atari 8-bit family, Apple II, Nintendo Entertainment System, Commodore 64, Atari Lynx, BBC Micro and others, used the 6502 or variations of the basic design. Soon after the 6502's introduction, MOS Technology was purchased outright by Commodore International, which continued to sell the microprocessor and licenses to other manufacturers. In the early days of the 6502, it was second-sourced by Rockwell and Synertek, and later licensed to other companies.
In its CMOS form, developed by the Western Design Center, the 6502 family continues to be used in embedded systems, with estimated production volumes in the hundreds of millions. The 6502 was designed by many of the same engineers who had designed the Motorola 6800 microprocessor family. Motorola started the 6800 microprocessor project in 1971 with Tom Bennett as the main architect; the chip layout began in late 1972, the first 6800 chips were fabricated in February 1974, and the full family was released in November 1974. John Buchanan was the designer of the 6800 chip, and Rod Orgill, who later did the 6501, assisted Buchanan with circuit analyses and chip layout. Bill Mensch joined Motorola in June 1971 after graduating from the University of Arizona; his first assignment was helping define the peripheral ICs for the 6800 family, and he was the principal designer of the 6820 Peripheral Interface Adapter. Motorola's engineers could run digital simulations on an IBM 370-165 mainframe computer. Bennett hired Chuck Peddle in 1973 to do architectural support work on the 6800 family products in progress.
He contributed in many areas, including the design of the 6850 ACIA. Motorola's target customers were established electronics companies such as Hewlett-Packard, Tektronix, TRW, and Chrysler. In May 1972, Motorola's engineers began visiting select customers and sharing the details of their proposed 8-bit microprocessor system with ROM, RAM, and parallel and serial interfaces. In early 1974, they provided engineering samples of the chips so that customers could prototype their designs. Motorola's "total product family" strategy did not focus on the price of the microprocessor, but on reducing the customer's total design cost: it offered development software on a timeshare computer, the "EXORciser" debugging system, onsite training, and field application engineer support. Both Intel and Motorola had announced a $360 price for a single microprocessor; the actual price for production quantities was much less. Motorola offered a design kit containing the 6800 with six support chips for $300. Peddle, who would accompany the salespeople on customer visits, found that customers were put off by the high cost of the microprocessor chips.
To lower the price, the IC chip size would have to shrink so that more chips could be produced on each silicon wafer. This could be done by removing inessential features of the 6800 and using a newer fabrication technology, "depletion-mode" MOS transistors. Peddle and other team members started outlining the design of an improved-feature, reduced-size microprocessor. At that time, Motorola's new semiconductor fabrication facility in Austin, Texas, was having difficulty producing MOS chips, and mid-1974 was the beginning of a year-long recession in the semiconductor industry. Many of the Mesa, Arizona, employees were displeased with the upcoming relocation to Austin. Motorola Semiconductor Products Division's management was overwhelmed with problems and showed no interest in Peddle's low-cost microprocessor proposal. Chuck Peddle was frustrated with Motorola's management for missing this new opportunity. In a November 1975 interview, Motorola's chairman, Robert Galvin, agreed, saying, "We did not choose the right leaders in the Semiconductor Products division."
The division was reorganized and the management replaced. New group vice-president John Welty said, "The semiconductor sales organization lost its sensitivity to customer needs and couldn't make speedy decisions." Peddle began looking for a source of funding for this new project and found a small semiconductor company in Pennsylvania. In August 1974, Chuck Peddle, Bill Mensch, Rod Orgill, Harry Bawcom, Ray Hirt, Terry Holdt, and Wil Mathys left Motorola to join MOS Technology. Of the seventeen chip designers and layout people on the 6800 team, seven left. There were 30 to 40 application engineers and system engineers on the 6800 team. That December, Gary Daniels transferred into the 6800 microprocessor group; Tom Bennett did not want to leave the Phoenix area, so Daniels took over the microprocessor development in Austin. His first project was a "depletion-mode" version of the 6800. The faster parts were available in July 1976; this was followed by the 6802, which added 128 bytes of RAM and an on-chip clock oscillator circuit.
MOS Technology was formed in 1969 by three executives from General Instrument (Mort Jaffe, Don McLaughlin, and John Pavinen) to produce metal-oxide-semiconductor integrated circuits. Allen-Br
Central processing unit
A central processing unit (CPU), also called a central processor or main processor, is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logic, and input/output operations specified by the instructions. The computer industry has used the term "central processing unit" at least since the early 1960s. Traditionally, the term "CPU" refers to a processor, more specifically to its processing unit and control unit, distinguishing these core elements of a computer from external components such as main memory and I/O circuitry. The form and implementation of CPUs have changed over the course of their history, but their fundamental operation remains unchanged. Principal components of a CPU include the arithmetic logic unit (ALU) that performs arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that orchestrates the fetching and execution of instructions by directing the coordinated operations of the ALU, registers, and other components.
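This fetch-decode-execute orchestration can be made concrete with a small sketch. The machine below is entirely hypothetical – its opcodes, accumulator, and memory layout are invented for illustration, not taken from any real instruction set – but it shows the control unit's loop fetching each instruction and directing the ALU and registers to carry it out:

```c
/* Minimal sketch of the fetch-decode-execute cycle for a hypothetical
 * accumulator machine (invented opcodes, not a real ISA). */
#include <stdint.h>
#include <stdio.h>

enum { OP_HALT, OP_LOAD, OP_ADD, OP_STORE };   /* hypothetical opcodes */

int main(void) {
    uint8_t mem[256] = {
        OP_LOAD,  16,   /* acc = mem[16]   */
        OP_ADD,   17,   /* acc += mem[17]  */
        OP_STORE, 18,   /* mem[18] = acc   */
        OP_HALT,
    };
    mem[16] = 2; mem[17] = 3;                  /* the operands */

    uint8_t pc = 0, acc = 0;                   /* program counter, accumulator */
    for (;;) {                                 /* the control unit's loop */
        uint8_t op = mem[pc++];                /* fetch */
        switch (op) {                          /* decode and execute */
        case OP_LOAD:  acc = mem[mem[pc++]];       break;
        case OP_ADD:   acc += mem[mem[pc++]];      break;  /* ALU operation */
        case OP_STORE: mem[mem[pc++]] = acc;       break;
        case OP_HALT:  printf("mem[18] = %d\n", mem[18]); return 0;
        }
    }
}
```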
Most modern CPUs are microprocessors, meaning they are contained on a single integrated circuit (IC) chip. An IC that contains a CPU may also contain memory, peripheral interfaces, and other components of a computer. Some computers employ a multi-core processor, a single chip containing two or more CPUs called "cores". Array processors or vector processors have multiple processors that operate in parallel, with no unit considered central. There also exists the concept of virtual CPUs, which are an abstraction of dynamically aggregated computational resources. Early computers such as the ENIAC had to be physically rewired to perform different tasks, which caused these machines to be called "fixed-program computers". Since the term "CPU" is defined as a device for software execution, the earliest devices that could rightly be called CPUs came with the advent of the stored-program computer; the idea of a stored-program computer had been present in the design of J. Presper Eckert and John William Mauchly's ENIAC, but was omitted so that the machine could be finished sooner.
On June 30, 1945, before ENIAC was made, mathematician John von Neumann distributed the paper entitled First Draft of a Report on the EDVAC. It was the outline of a stored-program computer that would eventually be completed in August 1949. EDVAC was designed to perform a certain number of instructions of various types; significantly, the programs written for EDVAC were to be stored in high-speed computer memory rather than specified by the physical wiring of the computer. This overcame a severe limitation of ENIAC, the considerable time and effort required to reconfigure the computer to perform a new task. With von Neumann's design, the program that EDVAC ran could be changed simply by changing the contents of the memory. EDVAC, however, was not the first stored-program computer. Early CPUs were custom designs used as part of a larger and sometimes distinctive computer. However, this method of designing custom CPUs for a particular application has largely given way to the development of multi-purpose processors produced in large quantities; this standardization began in the era of discrete transistor mainframes and minicomputers and has accelerated with the popularization of the integrated circuit.
The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers. Both the miniaturization and standardization of CPUs have increased the presence of digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in electronic devices ranging from automobiles to cellphones, and sometimes even in toys. While von Neumann is most often credited with the design of the stored-program computer because of his design of EDVAC, and the design became known as the von Neumann architecture, others before him, such as Konrad Zuse, had suggested and implemented similar ideas. The so-called Harvard architecture of the Harvard Mark I, which was completed before EDVAC, also used a stored-program design, using punched paper tape rather than electronic memory. The key difference between the von Neumann and Harvard architectures is that the latter separates the storage and treatment of CPU instructions and data, while the former uses the same memory space for both.
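The difference can be illustrated with a toy model, under the simplifying assumption that each memory is just an array. In the von Neumann layout below, instructions and data share one address space, so a data store can rewrite "code" (which is also how self-modifying programs work); in the Harvard layout, instruction memory is separate, so a store can never clobber it:

```c
/* Toy contrast of the two memory models; the byte values are
 * arbitrary stand-ins for "instructions", not real machine code. */
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* von Neumann: one memory holds both program and data. */
    uint8_t unified[8] = {0x01, 0x02, 0x03};      /* "code" at 0..2 */
    unified[1] = 0x99;      /* a data store rewrites an instruction */

    /* Harvard: instructions and data live in separate spaces. */
    const uint8_t imem[8] = {0x01, 0x02, 0x03};   /* instruction memory */
    uint8_t dmem[8] = {0};                        /* data memory        */
    dmem[1] = 0x99;         /* cannot touch imem, which stays fixed */

    printf("unified[1]=%#x, imem[1]=%#x\n", unified[1], imem[1]);
    return 0;
}
```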
Most modern CPUs are primarily von Neumann in design, but CPUs with the Harvard architecture are seen as well, especially in embedded applications. In the earliest computers, relays and vacuum tubes were used as switching elements; the overall speed of a system is dependent on the speed of the switches. Tube computers like EDVAC tended to average eight hours between failures, whereas relay computers like the Harvard Mark I failed rarely. In the end, tube-based CPUs became dominant because the significant speed advantages they afforded generally outweighed the reliability problems. Most of these early synchronous CPUs ran at low clock rates compared to modern microelectronic designs. Clock signal frequencies ranging from 100 kHz to 4 MHz were common at this time, limited largely by the speed of the switching de
Static random-access memory
Static random-access memory (SRAM) is a type of semiconductor memory that uses bistable latching circuitry to store each bit. SRAM exhibits data remanence, but it is still volatile in the conventional sense that data is lost when the memory is not powered. The term static differentiates SRAM from DRAM; SRAM is faster and more expensive than DRAM.

Advantages:
- Simplicity – a refresh circuit is not needed
- Performance
- Reliability
- Low idle power consumption

Disadvantages:
- Price
- Density
- High operational power consumption

The power consumption of SRAM varies depending on how it is accessed. Static RAM used at a somewhat slower pace, such as in applications with moderately clocked microprocessors, draws little power and can have a nearly negligible power consumption when sitting idle – in the region of a few microwatts. Several techniques have been proposed to manage the power consumption of SRAM-based memory structures. SRAM appears, among other places:
- in general-purpose products with an asynchronous interface, such as the ubiquitous 28-pin 8K × 8 and 32K × 8 chips;
- in similar products, up to 16 Mbit per chip, with a synchronous interface, used for caches and other applications requiring burst transfers;
- integrated on chip, up to 18 Mbit per chip, as RAM or cache memory in microcontrollers;
- as the primary caches in powerful microprocessors, such as the x86 family and many others;
- to store the registers and parts of the state machines used in some microprocessors;
- on application-specific ICs (ASICs);
- in Field-Programmable Gate Arrays (FPGAs) and Complex Programmable Logic Devices (CPLDs).

Many categories of industrial and scientific subsystems, automotive electronics, and the like contain static RAM.
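The bistable latching mentioned above can be sketched in a few lines of C. This is a toy behavioral model, not a circuit simulation: the two variables stand in for the cross-coupled inverter nodes of an SRAM cell, and the feedback between them is why the stored bit is self-reinforcing and needs no refresh while power is applied:

```c
/* Toy model of a bistable latch: two cross-coupled "inverters"
 * reinforce each other, so the cell holds its bit indefinitely. */
#include <stdio.h>

static int q, qbar;                /* the two storage nodes */

static void settle(void) {         /* let the feedback loop stabilize */
    for (int i = 0; i < 2; i++) { q = !qbar; qbar = !q; }
}

static void write_bit(int bit) {   /* overdrive the nodes, then release */
    q = bit;
    qbar = !bit;
    settle();                      /* feedback now holds the state */
}

int main(void) {
    write_bit(1);
    printf("stored: %d\n", q);     /* stays 1 until the next write */
    write_bit(0);
    printf("stored: %d\n", q);     /* stays 0 */
    return 0;
}
```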
Some amount of SRAM is embedded in practically all modern appliances that implement an electronic user interface. Several megabytes may be used in complex products such as digital cameras, cell phones, etc. SRAM in its dual-ported form is sometimes used for realtime digital signal processing circuits. SRAM is used in personal computers, workstations, and peripheral equipment: CPU register files, internal CPU caches and external burst-mode SRAM caches, hard disk buffers, router buffers, etc. LCD screens and printers normally employ static RAM to hold the image displayed. Static RAM was used for the main memory of some early personal computers such as the ZX80, TRS-80 Model 100, and Commodore VIC-20. Hobbyists, specifically home-built-processor enthusiasts, prefer SRAM due to the ease of interfacing; it is much easier to work with than DRAM, as there are no refresh cycles and the address and data buses are directly accessible rather than multiplexed. In addition to buses and power connections, SRAM requires only three controls: Chip Enable, Write Enable, and Output Enable.
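A sketch of how those three control lines are typically sequenced for a classic asynchronous SRAM follows. The gpio_set and bus_* helpers are hypothetical placeholders for whatever a given platform provides, not a real API, and the control lines are assumed active-low, as is conventional; the signal ordering is the point:

```c
/* Sketch of asynchronous SRAM access via Chip Enable (CE), Write
 * Enable (WE), and Output Enable (OE). The helpers below are
 * hypothetical platform hooks, assumed here for illustration. */
#include <stdint.h>

extern void gpio_set(int pin, int level);     /* placeholder */
extern void bus_set_addr(uint32_t addr);      /* placeholder */
extern void bus_write_data(uint8_t value);    /* placeholder */
extern uint8_t bus_read_data(void);           /* placeholder */

enum { PIN_CE, PIN_WE, PIN_OE };              /* assumed active-low */

void sram_write(uint32_t addr, uint8_t value) {
    bus_set_addr(addr);
    bus_write_data(value);
    gpio_set(PIN_CE, 0);     /* select the chip             */
    gpio_set(PIN_WE, 0);     /* pulse Write Enable low...   */
    gpio_set(PIN_WE, 1);     /* ...data latches on the edge */
    gpio_set(PIN_CE, 1);     /* deselect                    */
}

uint8_t sram_read(uint32_t addr) {
    bus_set_addr(addr);
    gpio_set(PIN_CE, 0);     /* select the chip            */
    gpio_set(PIN_OE, 0);     /* let the SRAM drive the bus */
    uint8_t value = bus_read_data();
    gpio_set(PIN_OE, 1);
    gpio_set(PIN_CE, 1);
    return value;
}
```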
In synchronous SRAM, a Clock signal is also included. Non-volatile SRAMs, or nvSRAMs, have standard SRAM functionality, but they save the data when the power supply is lost, ensuring preservation of critical information. NvSRAMs are used in a wide range of situations – networking and medical, among many others – where the preservation of data is critical and where batteries are impractical. PSRAMs have a DRAM storage core combined with a self-refresh circuit; they appear externally as a slower SRAM. They have a density/cost advantage over true SRAM, without the access complexity of DRAM. SRAM designs may be classified by transistor type and by interface:
- Bipolar junction transistor – fast but consumes a lot of power
- MOSFET – low power and common today
- Asynchronous – independent of clock frequency
- Synchronous – address, data in, and other control signals are associated with the clock signals

In the 1990s, asynchronous SRAM was employed for its fast access time. Asynchronous SRAM was used as main memory for small cache-less embedded processors used in everything from industrial electronics and measurement systems to hard disks and networking equipment, among many other applications.
Nowadays, synchronous SRAM is more commonly employed, much as synchronous DRAM (DDR SDRAM) is used in preference to asynchronous DRAM. A synchronous memory interface is much faster because access time can be reduced by employing a pipeline architecture. Furthermore, as DRAM is much cheaper than SRAM, DRAM is preferred whenever a large volume of data is required. SRAM is, however, much faster for random access; therefore, SRAM is used for CPU caches, small on-chip memories, FIFOs, and other small buffers. Notable synchronous SRAM varieties include:
- Zero bus turnaround (ZBT) – the turnaround is the number of clock cycles it takes to change access to the SRAM from write to read and vice versa; for ZBT SRAMs, this latency between read and write cycles is zero
- SyncBurst – features synchronous burst write access to the SRAM to speed up write operations
- DDR SRAM – synchronous, single read/write port, double data rate I/O
- Quad Data Rate SRAM – synchronous, separate read and write ports, quadruple data rate I/O
- Binary SRAM
- Ternary SRAM

A typical SRAM cell is mad
Apple Inc. is an American multinational technology company headquartered in Cupertino, California, that designs and sells consumer electronics, computer software, and online services. It is considered one of the Big Four of technology along with Amazon, Google, and Facebook. The company's hardware products include the iPhone smartphone, the iPad tablet computer, the Mac personal computer, the iPod portable media player, the Apple Watch smartwatch, the Apple TV digital media player, and the HomePod smart speaker. Apple's software includes the macOS and iOS operating systems, the iTunes media player, the Safari web browser, the iLife and iWork creativity and productivity suites, as well as professional applications like Final Cut Pro, Logic Pro, and Xcode. Its online services include the iTunes Store, the iOS App Store, the Mac App Store, Apple Music, Apple TV+, iMessage, and iCloud. Other services include the Apple Store, Genius Bar, AppleCare, Apple Pay, Apple Pay Cash, and Apple Card. Apple was founded by Steve Jobs, Steve Wozniak, and Ronald Wayne in April 1976 to develop and sell Wozniak's Apple I personal computer, though Wayne sold his share back within 12 days.
It was incorporated as Apple Computer, Inc. in January 1977, and sales of its computers, including the Apple II, grew quickly. Within a few years, Jobs and Wozniak had hired a staff of computer designers and had a production line. Apple went public in 1980 to instant financial success. Over the next few years, Apple shipped new computers featuring innovative graphical user interfaces, such as the original Macintosh in 1984, and Apple's marketing advertisements for its products received widespread critical acclaim. However, the high price of its products and limited application library caused problems, as did power struggles between executives. In 1985, Wozniak departed Apple amicably and remained an honorary employee, while Jobs and others resigned to found NeXT. As the market for personal computers expanded and evolved through the 1990s, Apple lost market share to the lower-priced duopoly of Microsoft Windows on Intel PC clones. The board recruited CEO Gil Amelio for what would be a 500-day charge to rehabilitate the financially troubled company, reshaping it with layoffs, executive restructuring, and product focus.
In 1997, he led Apple to buy NeXT, solving the company's failed operating system strategy and bringing Jobs back. Jobs gradually regained leadership status, becoming CEO in 2000. Apple swiftly returned to profitability under the revitalizing Think different campaign, as Jobs rebuilt Apple's status by launching the iMac in 1998, opening the retail chain of Apple Stores in 2001, and acquiring numerous companies to broaden the software portfolio. In January 2007, Jobs renamed the company Apple Inc., reflecting its shifted focus toward consumer electronics, and launched the iPhone to great critical acclaim and financial success. In August 2011, Jobs resigned as CEO due to health complications, and Tim Cook became the new CEO. Two months later, Jobs died, marking the end of an era for the company. Apple is well known for its size and revenues; its worldwide annual revenue totaled $265 billion for the 2018 fiscal year. Apple is the world's largest information technology company by revenue and the world's third-largest mobile phone manufacturer after Samsung and Huawei.
In August 2018, Apple became the first public U.S. company to be valued at over $1 trillion. The company employs 123,000 full-time employees and maintains 504 retail stores in 24 countries as of 2018; it operates the iTunes Store, the world's largest music retailer. As of January 2018, more than 1.3 billion Apple products are in use worldwide. The company has a high level of brand loyalty and is ranked as the world's most valuable brand. However, Apple receives significant criticism regarding the labor practices of its contractors, its environmental practices, and unethical business practices, including anti-competitive behavior, as well as the origins of source materials. Apple Computer Company was founded on April 1, 1976, by Steve Jobs, Steve Wozniak, and Ronald Wayne. The company's first product was the Apple I, a computer designed and hand-built by Wozniak and first shown to the public at the Homebrew Computer Club. The Apple I was sold as a motherboard only – a base-kit concept that would not be considered a complete personal computer today.
The Apple I went on sale in July 1976 and was market-priced at $666.66. Apple Computer, Inc. was incorporated on January 3, 1977, without Wayne, who had left and sold his share of the company back to Jobs and Wozniak for $800 only twelve days after having co-founded Apple. Multimillionaire Mike Markkula provided essential business expertise and funding of $250,000 during the incorporation of Apple. During the first five years of operations, revenues grew exponentially, doubling about every four months. Between September 1977 and September 1980, yearly sales grew from $775,000 to $118 million, an average annual growth rate of 533%. The Apple II, invented by Wozniak, was introduced on April 16, 1977, at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because of its character cell-based color graphics and open architecture. While early Apple II models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1⁄4-inch floppy disk drive and interface called the Disk II.
The Apple II was chosen to be the desktop platform for the first "killer app" of the business world: VisiCalc, a spreadsheet program. VisiCalc created a business market for the Apple II and gave home users an additional reason to buy an Apple II: compatibility with the office. Before VisiCalc, Apple had been a distant third place c
The Parallax P8X32A Propeller is a multi-core microcontroller chip with a parallel computer architecture, featuring eight 32-bit reduced instruction set computer (RISC) central processing unit cores. Introduced in 2006, it is sold by Parallax, Inc. The Propeller microcontroller, Propeller assembly language, and Spin interpreter were designed by Parallax's cofounder and president, Chip Gracey; the Spin programming language and Propeller Tool integrated development environment were designed by Chip Gracey and Parallax's software engineer Jeff Martin. On August 6, 2014, Parallax Inc. released all of the Propeller 1 P8X32A hardware and tools as open-source hardware and software under the GNU General Public License 3.0. This included the Verilog code, top-level hardware description language files, Spin interpreter, PropellerIDE and SimpleIDE programming tools, and compilers. Each of the eight 32-bit cores (called cogs) has a central processing unit with access to 512 32-bit long words of instructions and data. Self-modifying code is possible and is used internally; for example, the boot loader overwrites itself with the Spin interpreter.
Subroutines in Spin use a call-return mechanism requiring use of a call stack; assembly code needs no call stack. Access to shared memory is controlled via round-robin scheduling by an internal computer bus controller termed the hub. Each cog also has access to two dedicated hardware counters and a special video generator for use in generating timing signals for Phase Alternating Line (PAL), National Television System Committee (NTSC), Video Graphics Array (VGA), servomechanism control, and others. The Propeller can be clocked using either an internal, on-chip oscillator or an external crystal oscillator or ceramic resonator. Only the external oscillator may be run through an on-chip phase-locked loop (PLL) clock multiplier, which may be set at 1x, 2x, 4x, 8x, or 16x. Both the on-board oscillator frequency and the PLL multiplier value may be changed at run-time; if used carefully, this can improve power efficiency. However, the utility of this technique is limited to situations where no other cog is executing timing-dependent code, since the effective clock rate is common to all cogs.
The effective clock rate ranges from 32 kHz up to 80 MHz. When running at 80 MHz, the proprietary interpreted Spin programming language executes approximately 80,000 instruction-tokens per second on each core, giving 8 times 80,000, or 640,000, high-level instructions per second. Most machine-language instructions take 4 clock cycles to execute, resulting in 20 million instructions per second (MIPS) per cog, or 160 MIPS total for an 8-cog Propeller. Power use can be reduced by lowering the clock rate to what is needed, by turning off unneeded cogs, or by reconfiguring I/O pins that are unneeded or can safely be placed in a high-impedance state as inputs. Pins can be reconfigured dynamically, but again, the change applies to all cogs, so synchronization is important for certain designs; some protection is available for situations where one core attempts to use a pin as an output while another attempts to use it as an input. Each cog has access to some dedicated counter-timer hardware and a special timing signal generator intended to simplify the design of video output stages, such as composite PAL or NTSC displays and Video Graphics Array monitors.
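The throughput figures above follow from simple arithmetic; the short program below just reproduces the bookkeeping, using only the numbers quoted in the text (it is not a benchmark):

```c
/* Propeller throughput arithmetic, using the figures quoted in the
 * text: 80 MHz clock, 8 cogs, ~4 cycles per machine instruction,
 * and ~80,000 Spin tokens per second per cog. */
#include <stdio.h>

int main(void) {
    const long clock_hz        = 80000000;  /* 80 MHz effective clock */
    const long cogs            = 8;
    const long cycles_per_insn = 4;         /* typical machine instruction */
    const long spin_tokens     = 80000;     /* per cog at 80 MHz */

    printf("native: %ld MIPS per cog, %ld MIPS total\n",
           clock_hz / cycles_per_insn / 1000000,
           cogs * clock_hz / cycles_per_insn / 1000000);
    printf("Spin:   %ld tokens/s total\n", cogs * spin_tokens);
    return 0;
}
```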
To support this video hardware, Parallax makes sample code available that can generate video signals using a minimum parts count consisting of the Propeller, a crystal oscillator, and a few resistors to form a crude digital-to-analog converter. The frequency of the oscillator is important, as the correction ability of the video timing hardware is limited by the clock rate; it is possible to use multiple cogs in parallel to generate a single video signal. More generally, the timing hardware can be used to implement various pulse-width modulation timing signals. In addition to the Spin interpreter and a boot loader, the built-in ROM provides some data which may be useful for certain sound, video, or mathematics applications: a bitmap font is provided, suitable for typical character generation applications, and mathematical lookup tables are intended to help compensate for the lack of a floating-point unit, as well as for more primitive missing operations, such as multiplication and division. The Propeller is a 32-bit processor, and these tables may have insufficient accuracy for higher-precision uses.
Spin is a multitasking high-level computer programming language created for Parallax's line of Propeller microcontrollers by Chip Gracey, who also designed the Propeller itself. Spin code is written in the Propeller Tool, a GUI-oriented software development p