1.
Calculator
–
An electronic calculator is a small, portable electronic device used to perform operations ranging from basic arithmetic to complex mathematics. The first solid-state electronic calculator was created in the early 1960s, building on the history of tools such as the abacus, and was developed in parallel with the computers of the day. Pocket-sized devices became available in the 1970s, especially after the first microprocessor was developed by Intel for the Japanese calculator company Busicom, and they later became commonly used within the petroleum industry. Modern electronic calculators vary from cheap, give-away, credit-card-sized models to sturdy desktop models with built-in printers. They became popular in the mid-1970s as integrated circuits reduced their size and cost, and by the end of that decade prices had dropped to the point where a basic calculator was affordable to most. In addition to general-purpose calculators, there are those designed for specific markets. For example, scientific calculators include trigonometric and statistical functions, and some calculators can even perform computer algebra. Graphing calculators can be used to graph functions defined on the real line. As of 2016, basic calculators cost little, but scientific and graphing models tend to cost more. In 1986, calculators still represented an estimated 41% of the world's general-purpose hardware capacity to compute information; by 2007, this had diminished to less than 0.05%.

Modern electronic calculators contain a keyboard with buttons for digits and arithmetical operations. Most basic calculators assign only one digit or operation to each button; in more specialized calculators, however, a button can perform multiple functions through key combinations. Large-sized figures and comma separators are used to improve readability, and various symbols for function commands may also be shown on the display. Fractions such as 1⁄3 are displayed as decimal approximations, for example rounded to 0.33333333. Also, some fractions can be difficult to recognize in decimal form; as a result, many scientific calculators can work with vulgar fractions or mixed numbers.

Calculators also have the ability to store numbers in memory. Basic types store only one number at a time; more specific types use variables, which can also be used for constructing formulas. Some models can extend memory capacity to store more numbers, where the extended memory address is termed an array index. Power sources for calculators include batteries, solar cells and mains electricity; some models even have no turn-off button but provide some other way to power down, for example by being left idle or by having the solar cell covered. Crank-powered calculators were also common in the early computer era.
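As a rough illustration of the display rounding and memory registers described above, here is a minimal C sketch; the eight-digit display and the ten-register memory array are assumptions chosen for the example, not features of any particular calculator.

    #include <stdio.h>

    int main(void) {
        double memory[10] = {0};     /* hypothetical bank of ten memory registers */

        /* 1/3 has no exact decimal form, so it is shown rounded to the display width */
        printf("1/3 -> %.8f\n", 1.0 / 3.0);   /* prints 0.33333333 */

        /* store a result in register 0 and recall it, like the M+ / MR keys */
        memory[0] = 1.0 / 3.0;
        printf("MR  -> %.8f\n", memory[0]);
        return 0;
    }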
2.
Computer science
–
Computer science is the study of the theory, experimentation, and engineering that form the basis for the design and use of computers. An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems, and the field can be divided into a variety of theoretical and practical disciplines. Some fields, such as computational complexity theory, are highly abstract, while others focus on the challenges of implementing computation. Human-computer interaction considers the challenges in making computers and computations useful, usable, and universally accessible to humans.

The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Further, algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner; Leibniz may be considered the first computer scientist and information theorist for, among other reasons, documenting the binary number system. Charles Babbage started developing his Analytical Engine in 1834, and in less than two years he had sketched out many of the salient features of the modern computer. A crucial step was the adoption of a punched-card system derived from the Jacquard loom, making it infinitely programmable. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information; when the machine was finished, some hailed it as Babbage's dream come true.

During the 1940s, as new and more powerful computing machines were developed, and as it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as an academic discipline in the 1950s. The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge in 1953, and the first computer science program in the United States was formed at Purdue University in 1962. Since practical computers became available, many applications of computing have become distinct areas of study in their own right, and it is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM 704 and later the IBM 709 computers, which were widely used during the exploration period of such devices. Still, working with the IBM machines was frustrating: if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again. During the late 1950s, the computer science discipline was very much in its developmental stages. Time has seen significant improvements in the usability and effectiveness of computing technology, and modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
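As a concrete example of such an ancient algorithm (the example is illustrative and not drawn from the article itself), Euclid's method for the greatest common divisor can be written as a short C function:

    #include <stdio.h>

    /* greatest common divisor by repeated remainder (Euclid's algorithm) */
    unsigned gcd(unsigned a, unsigned b) {
        while (b != 0) {
            unsigned r = a % b;
            a = b;
            b = r;
        }
        return a;
    }

    int main(void) {
        printf("gcd(1071, 462) = %u\n", gcd(1071, 462));  /* prints 21 */
        return 0;
    }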
3.
Hard disk drive
–
The platters are paired with magnetic heads, usually arranged on a moving actuator arm, which read and write data to the platter surfaces. Data is accessed in a random-access manner, meaning that individual blocks of data can be stored or retrieved in any order. HDDs are a type of non-volatile storage, retaining stored data even when powered off.

Introduced by IBM in 1956, HDDs became the dominant secondary storage device for computers by the early 1960s. Continuously improved, HDDs have maintained this position into the modern era of servers and personal computers. More than 200 companies have produced HDDs historically, though after extensive industry consolidation most current units are manufactured by Seagate, Toshiba and Western Digital. As of 2016, HDD production is growing, although unit shipments and sales revenues are declining. While SSDs have higher cost per bit, they are replacing HDDs where speed, power consumption, small size and durability are important.

The primary characteristics of an HDD are its capacity and performance. Capacity is specified in unit prefixes corresponding to powers of 1000. The two most common form factors for modern HDDs are 3.5-inch, for desktop computers, and 2.5-inch, primarily for laptops. HDDs are connected to systems by standard interface cables such as PATA, SATA, USB or SAS cables.

Hard disk drives were introduced in 1956, as data storage for an IBM real-time transaction-processing computer, and were developed for use with general-purpose mainframe and minicomputers. The first IBM drive, the 350 RAMAC in 1956, was approximately the size of two medium-sized refrigerators and stored five million six-bit characters on a stack of 50 disks. In 1962 the IBM 350 RAMAC disk storage unit was superseded by the IBM 1301 disk storage unit; cylinder-mode read/write operations were supported, and the heads flew about 250 micro-inches above the platter surface. Motion of the head array depended upon a binary system of hydraulic actuators which assured repeatable positioning. The 1301 cabinet was about the size of three home refrigerators placed side by side, storing the equivalent of about 21 million eight-bit bytes; access time was about a quarter of a second.

Also in 1962, IBM introduced the model 1311 disk drive, which used removable disk packs; users could buy additional packs and interchange them as needed, much like reels of magnetic tape. Later models of removable pack drives, from IBM and others, became the norm in most computer installations, while non-removable HDDs were called fixed disk drives. Some high-performance HDDs were manufactured with one head per track so that no time was lost physically moving the heads to a track; known as fixed-head or head-per-track disk drives, they were very expensive and are no longer in production. In 1973, IBM introduced a new type of HDD code-named Winchester. Its primary distinguishing feature was that the disk heads were not withdrawn completely from the stack of disk platters when the drive was powered down; instead, the heads were allowed to land on an area of the disk surface upon spin-down.
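To make the powers-of-1000 convention concrete, the following sketch (not part of the original article) shows why a drive marketed as 1 TB, i.e. 10^12 bytes, is reported by an operating system that uses binary units as roughly 931 GiB:

    #include <stdio.h>

    int main(void) {
        double bytes = 1e12;                                   /* "1 TB" = 10^12 bytes */
        double gib   = bytes / (1024.0 * 1024.0 * 1024.0);     /* bytes per GiB (2^30) */
        printf("1 TB = %.0f bytes = %.1f GiB\n", bytes, gib);  /* roughly 931.3 GiB    */
        return 0;
    }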
4.
Floppy disk
–
Floppy disks are read and written by a floppy disk drive. Floppy disks, initially produced as 8-inch media and later in 5¼-inch and 3½-inch sizes, were a common form of data storage and exchange from the mid-1970s into the mid-2000s. These formats are usually handled by older equipment. These disks and associated drives were produced and improved upon by IBM and other companies such as Memorex, Shugart Associates, and Burroughs Corporation. The term floppy disk appeared in print as early as 1970. In 1976, Shugart Associates introduced the first 5¼-inch FDD, and by 1978 there were more than 10 manufacturers producing such FDDs. There were competing floppy disk formats, with hard- and soft-sector versions and encoding schemes such as FM, MFM and GCR. The 5¼-inch format displaced the 8-inch one for most applications; the most common capacity of the 5¼-inch format in DOS-based PCs was 360 kB, and in 1984 IBM introduced the 1.2 MB dual-sided floppy disk along with its PC-AT model. IBM started using the 720 kB double-density 3½-inch microfloppy disk on its Convertible laptop computer in 1986, and these disk drives could be added to older PC models. In 1988 IBM introduced a drive for 2.88 MB DSED diskettes in its top-of-the-line PS/2 models.

Throughout the early 1980s, limitations of the 5¼-inch format became clear. Originally designed to be more practical than the 8-inch format, it was itself too large, and as the quality of recording media grew, data could be stored in a smaller area. A number of solutions were developed, with drives at 2-, 2½-, 3- and 3½-inch sizes offered by various companies, but the large market share of the 5¼-inch format made it difficult for these new formats to gain significant ground. A variant on the Sony design, introduced in 1982 by a number of manufacturers, was then rapidly adopted. By the end of the 1980s, 5¼-inch disks had been superseded by 3½-inch disks, and by the mid-1990s 5¼-inch drives had virtually disappeared, as the 3½-inch disk became the predominant floppy disk.

Floppy disks became ubiquitous during the 1980s and 1990s in their use with computers to distribute software and transfer data. Before hard disks became affordable to the general population, floppy disks were often used to store a computer's operating system. Most home computers from that period had a primary OS and BASIC stored in ROM. By the early 1990s, increasing software size meant that large packages like Windows or Adobe Photoshop required a dozen disks or more. In 1996, there were an estimated five billion standard floppy disks in use. Then, distribution of larger packages was gradually replaced by CD-ROMs and DVDs. External USB-based floppy disk drives are available, and many modern systems provide firmware support for booting from such drives.
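The capacities quoted above follow directly from the disk geometry: capacity = tracks x sides x sectors per track x bytes per sector. The sketch below uses the common PC geometries for these formats, stated here from general knowledge rather than taken from the text:

    #include <stdio.h>

    struct fmt { const char *name; int tracks, sides, sectors; };

    int main(void) {
        struct fmt f[] = {
            { "5.25-inch 360 kB",  40, 2,  9 },
            { "5.25-inch 1.2 MB",  80, 2, 15 },
            { "3.5-inch  720 kB",  80, 2,  9 },
            { "3.5-inch  1.44 MB", 80, 2, 18 },
            { "3.5-inch  2.88 MB", 80, 2, 36 },
        };
        int n = sizeof f / sizeof f[0];
        for (int i = 0; i < n; i++) {
            /* capacity = tracks x sides x sectors/track x 512 bytes/sector */
            long bytes = (long)f[i].tracks * f[i].sides * f[i].sectors * 512;
            printf("%-18s %8ld bytes\n", f[i].name, bytes);
        }
        return 0;
    }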
5.
Random-access memory
–
Random-access memory (RAM) is a form of computer data storage which stores frequently used program instructions to increase the general speed of a system. A random-access memory device allows data items to be read or written in almost the same amount of time irrespective of the location of the data inside the memory. RAM contains multiplexing and demultiplexing circuitry to connect the data lines to the addressed storage for reading or writing the entry. Usually more than one bit of storage is accessed by the same address, so RAM devices often have multiple data lines and are described as 8-bit or 16-bit devices. In today's technology, random-access memory takes the form of integrated circuits. RAM is normally associated with volatile types of memory, where stored information is lost if power is removed. Other types of non-volatile memory exist that allow random access for read operations but have limitations on writing; these include most types of ROM and a type of flash memory called NOR-Flash.

Integrated-circuit RAM chips came onto the market in the early 1970s, with the first commercially available DRAM chip, the Intel 1103, introduced in October 1970. Early computers used relays, mechanical counters or delay lines for main memory functions. Ultrasonic delay lines could only reproduce data in the order it was written, and drum memory could be expanded at relatively low cost, but efficient retrieval of memory items required knowledge of the physical layout of the drum to optimize speed. Latches built out of vacuum tube triodes, and later out of transistors, were used for smaller and faster memories such as registers, but such registers were relatively large and too costly to use for large amounts of data.

The first practical form of random-access memory was the Williams tube, starting in 1947. It stored data as electrically charged spots on the face of a cathode ray tube; since the electron beam of the CRT could read and write the spots on the tube in any order, memory was random access. The capacity of the Williams tube was a few hundred to around a thousand bits, but it was much smaller and faster than individual tube latches. The Williams tube provided the medium for the first electronically stored program, on the Manchester Small-Scale Experimental Machine (SSEM); in fact, rather than the Williams tube memory being designed for the SSEM, the SSEM served to demonstrate the reliability of the memory. Magnetic-core memory was invented in 1947 and developed up until the mid-1970s. It became a widespread form of random-access memory, relying on an array of magnetized rings. By changing the sense of each ring's magnetization, data could be stored with one bit per ring, and since every ring had a combination of address wires to select and read or write it, access to any memory location in any sequence was possible. Magnetic-core memory was the dominant form of memory until it was displaced by solid-state memory in integrated circuits. In early DRAM chips such as the 1103, data was stored in the tiny capacitance of each transistor and had to be periodically refreshed every few milliseconds before the charge could leak away.
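A small worked example of the addressing described above (the 10-address-line, 8-bit-wide chip is an assumption chosen for illustration): n address lines select one of 2^n locations, and each addressed location supplies several bits at once.

    #include <stdio.h>

    int main(void) {
        int address_lines = 10;                    /* e.g. pins A0..A9              */
        int width_bits    = 8;                     /* bits delivered per address    */
        long locations  = 1L << address_lines;     /* 2^10 = 1024 addressable words */
        long total_bits = locations * width_bits;  /* 8192 bits = 1024 bytes        */
        printf("%ld locations x %d bits = %ld bits (%ld bytes)\n",
               locations, width_bits, total_bits, total_bits / 8);
        return 0;
    }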
6.
Kernel panic
–
A kernel panic is an action taken by an operating system upon detecting an internal fatal error from which it cannot safely recover. The term is specific to Unix and Unix-like systems; for Microsoft Windows operating systems the equivalent term is Stop error. The information provided is of a technical nature and aims to assist a system administrator or software developer in diagnosing the problem. Kernel panics can also be caused by errors originating outside of kernel space; for example, many Unix operating systems panic if the init process, which runs in userspace, terminates.

The Unix kernel maintains internal consistency and runtime correctness with assertions as the detection mechanism; the basic assumption is that the hardware and the software should perform correctly, and a failed assertion results in a panic. The kernel panic was introduced in an early version of Unix and demonstrated a major difference between the design philosophies of Unix and its predecessor Multics. Unix developer Dennis Ritchie said: "We left all that stuff out. If there's an error, we have this routine called panic, and when it is called, the machine crashes, and you holler down the hall, 'Hey, reboot it.'" The source code of the panic function in V6 UNIX is reproduced, approximately, at the end of this entry.

As the Unix codebase was enhanced, panics came to cover a range of conditions: a panic may occur as a result of a hardware failure or a software bug in the operating system. In many cases, the system would be capable of continued operation after an error has occurred, but it stops instead to prevent further damage and to aid diagnosis. After recompiling a kernel binary image from source code, a kernel panic while booting the resulting kernel is a common problem if the kernel was not correctly configured, compiled or installed. Add-on hardware or malfunctioning RAM could also be sources of fatal kernel errors during start-up, and a kernel may also go into panic if it is unable to locate a root file system. During the final stages of kernel userspace initialization, a panic is typically triggered if the spawning of init fails. A less severe condition in Linux, reported as an "oops", normally lets the kernel continue to run after killing the offending process; because an oops could cause some subsystems or resources to become unavailable, it can later lead to a full kernel panic. In the Linux kernel, a kernel panic also causes the keyboard LEDs to blink as a visual indication of a critical condition.

When a kernel panic occurs in Mac OS X 10.2 through 10.7, the system displays a message asking the user to restart the computer; prior to 10.2, a more traditional Unix-style panic message was displayed, and in 10.8 and later, the computer automatically reboots and displays a message after the restart. The format of the message varies from version to version. In 10.0–10.1, the system displays text on the screen giving details about the error and becomes unresponsive. In 10.2, it rolls down a black transparent curtain and then displays a message on a white background informing the user that they should restart the computer; the message is shown in English, French, German and Japanese. In 10.3–10.5, the kernel panic screen is almost the same as in 10.2. In 10.6–10.7, the text was revised and now includes a Spanish translation. In 10.8 and later, the computer becomes unresponsive before it immediately reboots; when the computer starts back up, it shows a message for a few seconds explaining that it restarted because of a kernel panic.
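The panic() routine referenced above appears in the V6 UNIX kernel source; it is reproduced here approximately, since the listing is widely published, and minor details may differ from the original. It records the message, flushes the file system buffers, prints the message on the console and then loops forever:

    char *panicstr;                 /* holds the message from the last panic */

    panic(s)                        /* K&R-style C, as in the V6 kernel      */
    char *s;
    {
            panicstr = s;
            update();               /* flush file system buffers to disk     */
            printf("panic: %s\n", s);
            for(;;)
                    idle();         /* loop; the operator must reboot        */
    }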
7.
Linux
–
Linux is a Unix-like computer operating system assembled under the model of free and open-source software development and distribution. The defining component of Linux is the Linux kernel, an operating system kernel first released on September 17, 1991 by Linus Torvalds. The Free Software Foundation uses the name GNU/Linux to describe the operating system, which has led to some controversy. Linux was originally developed for personal computers based on the Intel x86 architecture. Because of the dominance of Android on smartphones, Linux has the largest installed base of all operating systems. Linux is also the leading operating system on servers and other big iron systems such as mainframe computers. It is used by around 2.3% of desktop computers, and the Chromebook, which runs the Linux-based Chrome OS, dominates the US K–12 education market and represents nearly 20% of sub-$300 notebook sales in the US. Linux also runs on embedded systems, devices whose operating system is built into the firmware and is highly tailored to the system; this includes TiVo and similar DVR devices, network routers, facility automation controls and televisions, while many smartphones and tablet computers run Android and other Linux derivatives.

The development of Linux is one of the most prominent examples of free and open-source software collaboration. The underlying source code may be used, modified and distributed, commercially or non-commercially, by anyone under the terms of its respective licenses, such as the GNU General Public License. Typically, Linux is packaged in a form known as a Linux distribution for both desktop and server use. Distributions intended to run on servers may omit all graphical environments from the standard install, and because Linux is freely redistributable, anyone may create a distribution for any intended use.

The Unix operating system was conceived and implemented in 1969 at AT&T's Bell Laboratories in the United States by Ken Thompson, Dennis Ritchie, Douglas McIlroy and others. First released in 1971, Unix was written entirely in assembly language, as was common practice at the time. Later, in a key pioneering approach in 1973, it was rewritten in the C programming language by Dennis Ritchie; the availability of a high-level language implementation of Unix made its porting to different computer platforms easier. Due to an earlier antitrust case forbidding it from entering the computer business, AT&T licensed the operating system's source code to anyone who asked, and as a result Unix grew quickly and became widely adopted by academic institutions and businesses. In 1984, AT&T divested itself of Bell Labs; freed of the legal obligation requiring free licensing, Bell Labs began selling Unix as a proprietary product.

The GNU Project, started in 1983 by Richard Stallman, has the goal of creating a complete Unix-compatible software system composed entirely of free software. Later, in 1985, Stallman started the Free Software Foundation. By the early 1990s, many of the programs required in an operating system were completed, although low-level elements such as device drivers, daemons and the kernel were stalled and incomplete. Linus Torvalds has stated that if the GNU kernel had been available at the time, he probably would not have written his own. Although not released until 1992 due to legal complications, development of 386BSD, from which NetBSD, OpenBSD and FreeBSD descended, predated that of Linux, and Torvalds has also stated that if 386BSD had been available at the time, he probably would not have created Linux. Although the complete source code of MINIX was freely available, its licensing terms prevented it from being free software until the licensing changed in April 2000.
8.
Unix
–
Among modern Unix systems, Apple's macOS is the Unix version with the largest installed base as of 2014, and many Unix-like operating systems have arisen over the years, of which Linux is the most popular. Unix was originally meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmer users. The system grew larger as it started spreading in academic circles and as users added their own tools to the system. Unix was designed to be portable, multi-tasking and multi-user in a time-sharing configuration; these and related design concepts, such as treating devices and certain types of inter-process communication as files, are collectively known as the Unix philosophy. By the early 1980s users began seeing Unix as a potential universal operating system.

Under Unix, the operating system consists of many utilities along with the master control program, the kernel. To mediate access to shared resources, the kernel has special rights, reflected in the division between user space and kernel space. The microkernel concept was introduced in an effort to reverse the trend towards larger kernels and return to a system in which most tasks were completed by smaller utilities. In an era when a standard computer consisted of a disk for storage and a data terminal for input and output, the Unix file model worked quite well. However, modern systems include networking and other new devices, and as graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues, and semaphores. In microkernel implementations, functions such as network protocols could be moved out of the kernel.

Multics introduced many innovations, but had many problems. Frustrated by the size and complexity of Multics, but not by its aims, the last Bell Labs researchers to leave Multics (Ken Thompson, Dennis Ritchie, M. D. McIlroy, and J. F. Ossanna) decided to redo the work on a much smaller scale. The name Unics, a pun on Multics, was suggested for the project in 1970; Peter H. Salus credits Peter Neumann with the pun, while Brian Kernighan claims the coining for himself. In 1972, Unix was rewritten in the C programming language. Bell Labs produced several versions of Unix that are referred to as Research Unix. In 1975, the first source license for UNIX was sold to faculty at the University of Illinois Department of Computer Science; UIUC graduate student Greg Chesson was instrumental in negotiating the terms of this license. During the late 1970s and early 1980s, the influence of Unix in academic circles led to its adoption by commercial startups such as Sequent and to commercial variants including HP-UX, Solaris, and AIX. In the late 1980s, AT&T Unix System Laboratories and Sun Microsystems developed System V Release 4, and in the 1990s Unix-like systems grew in popularity as Linux and BSD distributions were developed through collaboration by a worldwide network of programmers.
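As a small, self-contained illustration of one of the IPC mechanisms mentioned above (a sketch under POSIX, not code taken from the article), a pair of connected Unix domain sockets created with socketpair() lets a parent and child process exchange bytes:

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <sys/wait.h>

    int main(void) {
        int fds[2];
        if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) == -1) {
            perror("socketpair");
            return 1;
        }
        if (fork() == 0) {                      /* child: send one message   */
            close(fds[0]);
            const char *msg = "hello over a Unix domain socket";
            write(fds[1], msg, strlen(msg));
            close(fds[1]);
            _exit(0);
        }
        close(fds[1]);                          /* parent: read and print it */
        char buf[64] = {0};
        ssize_t n = read(fds[0], buf, sizeof(buf) - 1);
        if (n > 0)
            printf("parent received: %s\n", buf);
        close(fds[0]);
        wait(NULL);
        return 0;
    }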
9.
Acorn Computers
–
Acorn Computers Ltd. was a British computer company established in Cambridge, England, in 1978. The company produced a number of computers which were popular in the UK, including the Acorn Electron. Acorn's BBC Micro computer dominated the UK educational computer market during the 1980s, and the company is better known for its BBC Micro Model B computer than for its other products. Though the company was broken up into several independent operations in 1998, one of its operating systems, RISC OS, continues to be developed by RISC OS Open, and some of Acorn's former subsidiaries live on; ARM Holdings' technology is dominant in the mobile phone processor market. Acorn is sometimes referred to as the British Apple and has been compared to Fairchild Semiconductor for being a catalyst for start-ups. In 2010, the company was listed by David Meyer in ZDNet as number nine in a feature on the top ten fallen "dead" IT giants. Many British IT professionals gained their early experience on Acorns, which were often more technically advanced than commercially successful US hardware.

On 25 July 1961, Clive Sinclair founded Sinclair Radionics to develop and sell electronic products. The failure of the Black Watch wristwatch and the calculator market's move from LEDs to LCDs led to financial problems, and Sinclair approached the government body the National Enterprise Board (NEB) for help. After losing control of the company to the NEB, Sinclair encouraged Chris Curry to leave Radionics and get Science of Cambridge (SoC) up and running. In June 1978, SoC launched a microcomputer kit, the Mk 14, that Curry wanted to develop further. During the development of the Mk 14, Hermann Hauser, a friend of Curry's, had been visiting SoC's offices and had become interested in the product. Curry and Hauser decided to pursue their joint interest in microcomputers and, on 5 December 1978, set up Cambridge Processor Unit Ltd (CPU). CPU soon obtained a consultancy contract to develop a microprocessor-based controller for a fruit machine for Ace Coin Equipment (ACE) of Wales. The ACE project was started in office space obtained at 4a Market Hill in Cambridge. Initially, the ACE controller was based on a National Semiconductor SC/MP microprocessor, but the switch to a MOS Technology 6502 was soon made. CPU had financed the development of an SC/MP-based microcomputer system using the income from its design-and-build consultancy. This system was launched in January 1979 as the first product of Acorn Computer Ltd., a trading name used by CPU to keep the risks of the two different lines of business separate; the microcomputer kit was named the Acorn System 1. The name Acorn was chosen because the system was to be expandable; it also had the attraction of appearing before Apple Computer in a telephone directory. Around this time, CPU and Andy Hopper set up Orbis Ltd.; CPU later purchased Orbis, and Hopper's Orbis shares were exchanged for shares in CPU Ltd. CPU's role gradually changed as its Acorn brand grew, and soon CPU was simply the holding company while Acorn was responsible for development work.
10.
Computer program
–
A computer program is a collection of instructions that performs a specific task when executed by a computer. A computer requires programs to function, and typically executes a program's instructions in a central processing unit. A computer program is written by a computer programmer in a programming language. From the program in its human-readable form of source code, a compiler can derive machine code, a form consisting of instructions that the computer can directly execute. Alternatively, a program may be executed with the aid of an interpreter. A part of a program that performs a well-defined task is known as an algorithm. A collection of programs, libraries and related data is referred to as software. Computer programs may be categorized along functional lines, such as application software or system software.

The earliest programmable machines preceded the invention of the digital computer. In 1801, Joseph-Marie Jacquard devised a loom that would weave a pattern by following a series of perforated cards; patterns could be woven and repeated by arranging the cards. In 1837, Charles Babbage was inspired by Jacquard's loom to attempt to build the Analytical Engine. The names of the components of the device were borrowed from the textile industry, where yarn was brought from the store to be milled. The device would have had a "store", memory to hold 1,000 numbers of 40 decimal digits each, and numbers from the store would then have been transferred to the "mill" for processing. It was programmed using two sets of perforated cards, one to direct the operation and the other for the input variables. However, after more than 17,000 pounds of the British government's money had been spent, the thousands of cogged wheels and gears never fully worked together.

During a nine-month period in 1842–43, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea, which covered the Analytical Engine. The translation contained Note G, which completely detailed a method for calculating Bernoulli numbers using the Analytical Engine; this note is recognized by some historians as the world's first written computer program. In 1936, Alan Turing introduced the Universal Turing machine, a theoretical device that can model every computation that can be performed on a Turing-complete computing machine. It is a finite-state machine that has an infinitely long read/write tape; the machine can move the tape back and forth, changing its contents as it performs an algorithm.
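To make the tape-and-state picture concrete, here is a minimal, hypothetical simulation in C of a one-state machine that adds 1 to a binary number written on the tape; the machine, its alphabet and its three rules are chosen purely for illustration and are not taken from Turing's paper:

    #include <stdio.h>
    #include <string.h>

    #define TAPE_LEN 32

    int main(void) {
        char tape[TAPE_LEN];
        memset(tape, '_', TAPE_LEN);       /* '_' marks a blank tape cell          */
        memcpy(tape + 10, "1011", 4);      /* input: binary 1011 (decimal 11)      */
        int head = 13;                     /* head starts on least-significant bit */
        int running = 1;                   /* single machine state: "carrying"     */

        while (running) {
            switch (tape[head]) {
            case '1': tape[head] = '0'; head--;      break; /* 1+1=0, carry left */
            case '0': tape[head] = '1'; running = 0; break; /* absorb the carry  */
            case '_': tape[head] = '1'; running = 0; break; /* extend the number */
            }
        }
        printf("%.*s\n", TAPE_LEN, tape);  /* tape now holds ...1100... (= 12)     */
        return 0;
    }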
11.
Android (operating system)
–
Android is a mobile operating system developed by Google, based on the Linux kernel and designed primarily for touchscreen mobile devices such as smartphones and tablets. In addition to touchscreen devices, Google has further developed Android TV for televisions and Android Auto for cars. Variants of Android are also used on notebooks, game consoles, digital cameras and other electronics. Beginning with the first commercial Android device in September 2008, the operating system has gone through multiple major releases, with the current version being 7.0 Nougat, released in August 2016. Android applications can be downloaded from the Google Play store, which features over 2.7 million apps as of February 2017. Android has been the best-selling OS on tablets since 2013, and runs on the vast majority of smartphones. In September 2015, Android had 1.4 billion monthly active users. Android is popular with technology companies that require a ready-made, low-cost and customizable operating system for high-tech devices. The success of Android has made it a target for patent and copyright litigation.

Android Inc. was founded in Palo Alto, California in October 2003 by Andy Rubin, Rich Miner, Nick Sears, and Chris White. Rubin described the Android project as having "tremendous potential in developing smarter mobile devices that are more aware of its owner's location and preferences". The early intentions of the company were to develop an operating system for digital cameras. Despite the past accomplishments of the founders and early employees, Android Inc. operated secretly, and that same year Rubin ran out of money; Steve Perlman, a friend of Rubin, brought him $10,000 in cash in an envelope. In July 2005, Google acquired Android Inc. for at least $50 million, and its key employees, including Rubin, Miner and White, joined Google as part of the acquisition. Not much was known about Android at the time, with Rubin having only stated that they were making software for mobile phones. At Google, the team led by Rubin developed a mobile device platform powered by the Linux kernel. Google marketed the platform to handset makers and carriers on the promise of providing a flexible, upgradeable system, and it had lined up a series of hardware components and software partners and signaled to carriers that it was open to various degrees of cooperation.

Speculation about Google's intention to enter the mobile communications market continued to build through December 2006. In September 2007, InformationWeek covered an Evalueserve study reporting that Google had filed several patent applications in the area of mobile telephony. The first commercially available smartphone running Android was the HTC Dream, also known as the T-Mobile G1, announced on September 23, 2008. Since 2008, Android has seen numerous updates which have incrementally improved the operating system, adding new features. Each major release is named in alphabetical order after a dessert or sugary treat, with the first few Android versions being called Cupcake, Donut and Eclair. In 2010, Google launched its Nexus series of devices, a lineup in which Google partnered with different device manufacturers to produce new devices and introduce new Android versions.
12.
Twitter
–
Twitter is an online news and social networking service where users post and interact with messages, "tweets", restricted to 140 characters. Registered users can post tweets, but those who are unregistered can only read them. Users access Twitter through its website interface, SMS or a mobile device app. Twitter, Inc. is based in San Francisco, California, United States. Twitter was created in March 2006 by Jack Dorsey, Noah Glass, Biz Stone, and Evan Williams and launched in July of that year. The service rapidly gained worldwide popularity; in 2012, more than 100 million users posted 340 million tweets a day, and the service handled an average of 1.6 billion search queries per day. In 2013, it was one of the ten most-visited websites and has been described as "the SMS of the Internet". As of 2016, Twitter had more than 319 million monthly active users, and on the day of the 2016 U.S. presidential election, Twitter proved to be the largest source of breaking news.

Twitter's origins lie in a daylong brainstorming session held by board members of the podcasting company Odeo. Jack Dorsey, then a student at New York University, introduced the idea of an individual using an SMS service to communicate with a small group. The original project name for the service was twttr, an idea that Williams later ascribed to Noah Glass, inspired by Flickr. The developers initially considered "10958" as an SMS short code, but later changed it to "40404" for ease of use. Work on the project started on March 21, 2006, when Dorsey published the first Twitter message at 9:50 PM Pacific Standard Time. Dorsey has explained the origin of the Twitter title: "we came across the word 'twitter', and it was just perfect. The definition was 'a short burst of inconsequential information', and 'chirps from birds'. And that's exactly what the product was."

The first Twitter prototype, developed by Dorsey and contractor Florian Weber, was used as an internal service for Odeo employees. Williams fired Glass, who was silent about his part in Twitter's startup until 2011. Twitter spun off into its own company in April 2007. Williams provided insight into the ambiguity that defined this early period in a 2013 interview: "With Twitter, it wasn't clear what it was. They called it a social network, they called it microblogging, but it was hard to define, because it didn't replace anything. There was this path of discovery with something like that, where over time you figure out what it is. Twitter actually changed from what we thought it was in the beginning, which we described as status updates and a social utility. It is that, in part, but the insight we eventually came to was Twitter was really more of an information network than it is a social network." The tipping point for Twitter's popularity was the 2007 South by Southwest Interactive conference. During the event, Twitter usage increased from 20,000 tweets per day to 60,000. "The Twitter people cleverly placed two 60-inch plasma screens in the conference hallways, exclusively streaming Twitter messages," remarked Newsweek's Steven Levy.