Peer-to-peer (P2P) computing or networking is a distributed application architecture that partitions tasks or workloads between peers. Peers are equally privileged, equipotent participants in the application; they are said to form a peer-to-peer network of nodes. Peers make a portion of their resources, such as processing power, disk storage or network bandwidth, directly available to other network participants, without the need for central coordination by servers or stable hosts. Peers are both suppliers and consumers of resources, in contrast to the traditional client-server model, in which the consumption and supply of resources are divided. Emerging collaborative P2P systems are going beyond the era of peers doing similar things while sharing resources; they are looking for diverse peers that can bring unique resources and capabilities to a virtual community, thereby empowering it to engage in tasks beyond those that individual peers can accomplish, yet that are beneficial to all the peers.
While P2P systems had previously been used in many application domains, the architecture was popularized by the file-sharing system Napster, released in 1999. The concept has inspired new philosophies in many areas of human interaction. In such social contexts, peer-to-peer as a meme refers to the egalitarian social networking that has emerged throughout society, enabled by Internet technologies in general. The peer-to-peer movement allowed millions of Internet users to connect "directly, forming groups and collaborating to become user-created search engines, virtual supercomputers, and filesystems." The basic concept of peer-to-peer computing was envisioned in earlier software systems and networking discussions, reaching back to principles stated in the first Request for Comments, RFC 1. Tim Berners-Lee's vision for the World Wide Web was close to a P2P network in that it assumed each user of the web would be an active editor and contributor, creating and linking content to form an interlinked "web" of links.
The early Internet was more open than the present day: two machines connected to the Internet could send packets to each other without firewalls and other security measures. This contrasts with the broadcasting-like structure of the web as it has developed. As a precursor to the Internet, ARPANET was a successful client-server network where "every participating node could request and serve content." However, ARPANET was not self-organized, and it lacked the ability to "provide any means for context or content-based routing beyond 'simple' address-based routing." Therefore, Usenet, a distributed messaging system often described as an early peer-to-peer architecture, was established. It was developed in 1979 as a system that enforces a decentralized model of control. The basic model is a client-server model from the user or client perspective that offers a self-organizing approach to newsgroup servers. However, news servers communicate with one another as peers to propagate Usenet news articles over the entire group of network servers. The same consideration applies to SMTP email, in the sense that the core email-relaying network of mail transfer agents has a peer-to-peer character, while the periphery of e-mail clients and their direct connections is strictly a client-server relationship.
In May 1999, with millions more people on the Internet, Shawn Fanning introduced the music and file-sharing application called Napster. Napster was the beginning of peer-to-peer networks, as we know them today, where "participating users establish a virtual network independent from the physical network, without having to obey any administrative authorities or restrictions." A peer-to-peer network is designed around the notion of equal peer nodes functioning as both "clients" and "servers" to the other nodes on the network. This model of network arrangement differs from the client–server model where communication is to and from a central server. A typical example of a file transfer that uses the client-server model is the File Transfer Protocol service in which the client and server programs are distinct: the clients initiate the transfer, the servers satisfy these requests. Peer-to-peer networks implement some form of virtual overlay network on top of the physical network topology, where the nodes in the overlay form a subset of the nodes in the physical network.
Data is still exchanged directly over the underlying TCP/IP network, but at the application layer peers are able to communicate with each other directly via the logical overlay links. Overlays are used for indexing and peer discovery, and they make the P2P system independent of the physical network topology. Based on how the nodes are linked to each other within the overlay network, and on how resources are indexed and located, we can classify networks as unstructured or structured. Unstructured peer-to-peer networks do not impose a particular structure on the overlay network by design, but rather are formed by nodes that randomly form connections to each other. Because there is no structure globally imposed upon them, unstructured networks are easy to build and allow for localized optimizations to different regions of the overlay. Also, because the role of all peers in the network is the same, unstructured networks are highly robust in the face of high rates of "churn", that is, when large numbers of peers frequently join and leave the network.
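The two defining mechanics of an unstructured overlay, random neighbor links and flooding-based search, can be sketched in a few lines. This is an illustrative toy model, not code from any real P2P system; `build_overlay` and `flood_search` are invented names, and real systems add query identifiers, duplicate suppression and result routing.

```python
import random

def build_overlay(num_nodes, degree, seed=0):
    """Randomly wire peers together, as in an unstructured overlay:
    no global structure is imposed on who connects to whom."""
    rng = random.Random(seed)
    neighbors = {n: set() for n in range(num_nodes)}
    for n in range(num_nodes):
        while len(neighbors[n]) < degree:
            peer = rng.randrange(num_nodes)
            if peer != n:
                neighbors[n].add(peer)
                neighbors[peer].add(n)   # links are bidirectional
    return neighbors

def flood_search(neighbors, start, have_resource, ttl=4):
    """Flooding: forward the query to every neighbor, hop by hop,
    until the resource is found or the TTL budget is exhausted."""
    frontier, seen = {start}, {start}
    for _ in range(ttl):
        nxt = set()
        for node in frontier:
            for peer in neighbors[node]:
                if peer not in seen:
                    seen.add(peer)
                    nxt.add(peer)
        if any(have_resource(p) for p in nxt):
            return True
        frontier = nxt
    return False
```

Because every peer plays the same role, removing any subset of nodes (churn) only thins the random graph rather than destroying a designated index, which is why such networks tolerate churn well.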
IEEE 802.11 is part of the IEEE 802 set of LAN protocols, and specifies the set of media access control (MAC) and physical layer protocols for implementing wireless local area network (WLAN) Wi-Fi computer communication in various frequencies, including but not limited to the 2.4, 5 and 60 GHz frequency bands. They are the world's most widely used wireless computer networking standards, used in most home and office networks to allow laptops and smartphones to talk to each other and access the Internet without connecting wires. They are created and maintained by the Institute of Electrical and Electronics Engineers LAN/MAN Standards Committee. The base version of the standard was released in 1997 and has had subsequent amendments. The standard and amendments provide the basis for wireless network products using the Wi-Fi brand. While each amendment is officially revoked when it is incorporated in the latest version of the standard, the corporate world tends to market to the revisions because they concisely denote the capabilities of their products.
As a result, in the marketplace, each revision tends to become its own standard. The protocols are used in conjunction with IEEE 802.2, are designed to interwork seamlessly with Ethernet, and are often used to carry Internet Protocol traffic. Although the IEEE 802.11 specifications list channels that might be used, the allowed radio frequency spectrum varies by regulatory domain. The 802.11 family consists of a series of half-duplex over-the-air modulation techniques that use the same basic protocol. The 802.11 protocol family employs carrier-sense multiple access with collision avoidance (CSMA/CA), whereby equipment listens to a channel for other users before transmitting each packet. 802.11-1997 was the first wireless networking standard in the family, but 802.11b was the first widely accepted one, followed by 802.11a, 802.11g, 802.11n and 802.11ac. Other standards in the family are service amendments that are used to extend the current scope of the existing standard, which may also include corrections to a previous specification.
802.11b and 802.11g use the 2.4 GHz ISM band, operating in the United States under Part 15 of the U.S. Federal Communications Commission Rules and Regulations. Because of this choice of frequency band, 802.11b/g/n equipment may suffer interference in the 2.4 GHz band from microwave ovens, cordless telephones, Bluetooth devices and the like. 802.11b and 802.11g control their interference and susceptibility to interference by using direct-sequence spread spectrum and orthogonal frequency-division multiplexing signaling methods, respectively. 802.11a uses the 5 GHz U-NII band, which, for much of the world, offers at least 23 non-overlapping 20 MHz-wide channels, rather than the 2.4 GHz ISM band, which offers only three non-overlapping 20 MHz-wide channels, where other adjacent channels overlap; see the list of WLAN channels. Better or worse performance with higher or lower frequencies may be realized, depending on the environment. 802.11n can use either the 2.4 GHz or the 5 GHz band. The segment of the radio frequency spectrum used by 802.11 varies between countries. In the US, 802.11a and 802.11g devices may be operated without a license, as allowed in Part 15 of the FCC Rules and Regulations.
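The listen-before-transmit behavior of CSMA/CA can be sketched as a small simulation. This is an illustrative model, not a standard implementation: `csma_ca_attempts` is an invented name, and the contention-window defaults merely echo typical 802.11 values.

```python
import random

def csma_ca_attempts(channel_busy, rng, cw_min=15, cw_max=1023,
                     max_retries=7):
    """Return how many times the channel was sensed busy before we could
    transmit. Listen before talk: if the channel is idle, transmit at
    once; otherwise draw a random backoff from the contention window and
    double the window (binary exponential backoff) before sensing again."""
    cw = cw_min
    for attempt in range(max_retries + 1):
        if not channel_busy():                 # carrier sense
            return attempt                     # idle: transmit now
        backoff_slots = rng.randrange(cw + 1)  # slots we would wait
        cw = min(2 * cw + 1, cw_max)           # widen the window
    raise RuntimeError("gave up after max retries")
```

The random backoff is what makes collision *avoidance* work: two stations that both found the channel busy are unlikely to pick the same backoff, so they rarely retransmit at the same instant.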
Frequencies used by channels one through six of 802.11b and 802.11g fall within the 2.4 GHz amateur radio band. Licensed amateur radio operators may operate 802.11b/g devices under Part 97 of the FCC Rules and Regulations, allowing increased power output but not commercial content or encryption. 802.11 technology has its origins in a 1985 ruling by the U.S. Federal Communications Commission that released the ISM band for unlicensed use. In 1991, NCR Corporation/AT&T invented a precursor to 802.11 in the Netherlands. The inventors initially intended to use the technology for cashier systems; the first wireless products were brought to the market under the name WaveLAN, with raw data rates of 1 Mbit/s and 2 Mbit/s. Vic Hayes, who held the chair of IEEE 802.11 for 10 years and has been called the "father of Wi-Fi", was involved in designing the initial 802.11b and 802.11a standards within the IEEE. In 1999, the Wi-Fi Alliance was formed as a trade association to hold the Wi-Fi trademark, under which most products are sold.
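The 2.4 GHz channel numbering mentioned above follows a simple rule: each channel's center frequency is 2407 MHz plus 5 MHz per channel number, while each channel is about 20 MHz wide, which is why only channels 1, 6 and 11 are mutually non-overlapping. A small sketch (function names are illustrative):

```python
def channel_center_mhz(channel):
    """Center frequency of a 2.4 GHz 802.11 channel (channels 1-13):
    2407 MHz plus 5 MHz per channel number."""
    if not 1 <= channel <= 13:
        raise ValueError("channel out of range for the 2.4 GHz band")
    return 2407 + 5 * channel

def channels_overlap(a, b, width_mhz=20):
    """Two 20 MHz-wide channels overlap when their centers are closer
    than the channel width; hence only 1, 6 and 11 are disjoint."""
    return abs(channel_center_mhz(a) - channel_center_mhz(b)) < width_mhz
```

For example, channels 1 and 6 are 25 MHz apart and therefore clear of each other, while channels 1 and 4 are only 15 MHz apart and overlap.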
The major commercial breakthrough came with Apple Inc. adopting Wi-Fi for their iBook series of laptops in 1999. It was the first mass consumer product to offer Wi-Fi network connectivity, which Apple branded as AirPort. One year later, IBM followed with its ThinkPad 1300 series in 2000. The original version of the standard, IEEE 802.11, was released in 1997 and clarified in 1999, but is now obsolete. It specified two net bit rates of 1 or 2 megabits per second, plus forward error correction code, and it specified three alternative physical layer technologies: diffuse infrared operating at 1 Mbit/s, frequency-hopping spread spectrum, and direct-sequence spread spectrum. The latter two radio technologies used microwave transmission over the Industrial Scientific Medical frequency band at 2.4 GHz. Some earlier WLAN technologies used lower frequencies, such as the U.S. 900 MHz ISM band. Legacy 802.11 with direct-sequence spread spectrum was rapidly supplanted and popularized by 802.11b. 802.11a, published in 1999, uses the same data link layer protocol and frame format as the original standard, but an OFDM-based air interface.
It operates in the 5 GHz band with a maximum net data rate of 54 Mbit/s, plus error correction code, which yields realistic net achievable throughput in the mid-20 Mbit/s range.
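The gap between the 54 Mbit/s PHY rate and mid-20s goodput comes from fixed-duration per-frame overhead (preambles, interframe spaces, backoff, ACKs) whose relative cost grows as the data rate rises. A rough illustrative model; the 200 µs overhead figure is an assumption chosen for illustration, not a value from the standard:

```python
def effective_throughput_mbps(phy_rate_mbps, payload_bytes=1500,
                              overhead_us=200):
    """Rough goodput for one frame exchange. `overhead_us` lumps the
    preamble, interframe spaces, average backoff and the ACK into one
    fixed duration (an illustrative assumption); only the payload is
    sent at the PHY rate."""
    payload_us = payload_bytes * 8 / phy_rate_mbps  # bits / (Mbit/s) = µs
    return payload_bytes * 8 / (payload_us + overhead_us)
```

With these assumed numbers, a 1500-byte frame at 54 Mbit/s takes about 222 µs of airtime plus 200 µs of overhead, landing goodput near the upper 20s of Mbit/s.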
Home computers were a class of microcomputers that entered the market in 1977, starting with what Byte Magazine called the "trinity of 1977", and became common during the 1980s. They were marketed to consumers as affordable and accessible computers that, for the first time, were intended for the use of a single nontechnical user. These computers were a distinct market segment that typically cost much less than business, scientific or engineering-oriented computers of the time, such as the IBM PC, and were generally less powerful in terms of memory and expandability. However, a home computer often had better graphics and sound than contemporary business computers. Their most common use was playing video games, but they were also regularly used for word processing, doing homework, and programming. Home computers were usually not electronic kits; there were, however, commercial kits like the Sinclair ZX80, which were both home and home-built computers, since the purchaser could assemble the unit from a kit. Advertisements in the popular press for early home computers were rife with possibilities for their practical use in the home, from cataloging recipes to personal finance to home automation, but these were seldom realized in practice.
For example, using a typical 1980s home computer as a home automation appliance would require the computer to be kept powered on at all times and dedicated to this task. Personal finance and database use required tedious data entry. By contrast, advertisements in the specialty computer press often simply listed specifications. If no packaged software was available for a particular application, the home computer user could program one, provided they had invested the requisite hours to learn computer programming, as well as the idiosyncrasies of their system. Since most systems shipped with the BASIC programming language included on the system ROM, it was easy for users to get started creating their own simple applications. Many users found programming to be a fun and rewarding experience, and an excellent introduction to the world of digital technology. The line between 'business' and 'home' computer market segments blurred or vanished once IBM PC compatibles became commonly used in the home, since now both categories of computers use the same processor architectures, operating systems and applications.
The only difference may be the sales outlet through which they are purchased. Another change from the home computer era is that the once-common endeavour of writing one's own software programs has almost vanished from home computer use. As early as 1965, some experimental projects, such as Jim Sutherland's ECHO IV, explored the possible utility of a computer in the home. In 1969, the Honeywell Kitchen Computer was marketed as a luxury gift item, and would have inaugurated the era of home computing, but none were sold. Computers became affordable for the general public in the 1970s due to the mass production of the microprocessor, starting in 1971. Early microcomputers such as the Altair 8800 had front-mounted switches and diagnostic lights to control and indicate internal system status, and were often sold in kit form to hobbyists. These kits would contain an empty printed circuit board which the buyer would fill with the integrated circuits, other individual electronic components and connectors, and then hand-solder all the connections.
While some early home computers could be bought either in kit form or assembled, most home computers were only sold pre-assembled. They were enclosed in plastic or metal cases similar in appearance to typewriter or hi-fi equipment enclosures, which were more familiar and attractive to consumers than the industrial metal card-cage enclosures used by the Altair and similar computers. The keyboard, a feature lacking on the Altair, was usually built into the same case as the motherboard. Ports for plug-in peripheral devices such as a video display, cassette tape recorders and disk drives were either built in or available on expansion cards. Although the Apple II series had internal expansion slots, most other home computer models' expansion arrangements were through externally accessible 'expansion ports' that also served as a place to plug in cartridge-based games. The manufacturer would usually sell peripheral devices designed to be compatible with their computers as extra-cost accessories. Peripherals and software were often not interchangeable between different brands of home computer, or even between successive models of the same brand.
To save the cost of a dedicated monitor, the home computer would often connect through an RF modulator to the family TV set, which served as both video display and sound system. By 1982, an estimated 621,000 home computers were in American households, at an average sales price of US$530. After the success of the Radio Shack TRS-80, the Commodore PET and the Apple II in 1977, almost every manufacturer of consumer electronics rushed to introduce a home computer. Large numbers of new machines of all types began to appear during the early 1980s. Mattel, Texas Instruments and Timex, none of which had any previous connection to the computer industry, all had short-lived home computer lines in the early 1980s. Some home computers were more successful: the BBC Micro, Sinclair ZX Spectrum, Atari 800XL and Commodore 64 sold many units over several years and attracted third-party software development. Almost universally, home computers had a BASIC interpreter combined with a line editor in permanent read-only memory, which one could use to type in BASIC programs and execute them.
Norman Manuel Abramson is an American engineer and computer scientist, best known for developing the ALOHAnet system for wireless computer communication. Born in Boston, Massachusetts, he received an A.B. in physics from Harvard University, an M.A. in physics from UCLA, and a Ph.D. in electrical engineering from Stanford University. Abramson was a research engineer at the Hughes Aircraft Company until 1955, when he joined the faculty at Stanford University. He was a visiting professor at the University of California at Berkeley before moving to the University of Hawaii, where he served as professor of both Electrical Engineering and Computer Science and Director of Aloha Systems. In 1994, Abramson co-founded Aloha Networks in San Francisco, where he served as CTO. His early research concerned radar signal characteristics and sampling theory, as well as frequency modulation and digital communication channels, error-correcting codes, pattern recognition, machine learning, and computing for seismic analysis. In the late 1960s he worked on the ALOHAnet, and he continued to develop spread-spectrum techniques in the 1980s.
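The random-access idea behind ALOHAnet can be illustrated with a short simulation: stations transmit whenever they have data, and a frame survives only if no other frame overlaps it. The Monte-Carlo sketch below (an illustration, not Abramson's code) reproduces the classic pure-ALOHA result that throughput peaks near G·e^(-2G), about 18.4% of the channel at an offered load of G = 0.5.

```python
import math
import random

def pure_aloha_throughput(offered_load, frames=200_000, seed=0):
    """Monte-Carlo estimate of pure-ALOHA throughput. Frames start at
    random times at rate `offered_load` frames per frame time; a frame
    succeeds only if no other frame starts within one frame time before
    or after it (a vulnerable period of two frame times)."""
    rng = random.Random(seed)
    span = frames / offered_load            # simulated time, in frame times
    starts = sorted(rng.uniform(0, span) for _ in range(frames))
    ok = 0
    for i, t in enumerate(starts):
        clear_before = i == 0 or t - starts[i - 1] >= 1.0
        clear_after = i == frames - 1 or starts[i + 1] - t >= 1.0
        ok += clear_before and clear_after
    return ok / span                        # successes per frame time
```

Because the start times are sorted, it suffices to check only each frame's nearest neighbors for overlap.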
Abramson's honors include: the 1972 IEEE Sixth Region Achievement Award for contributions to information theory and coding; the 1980 IEEE Fellow Award for development of the ALOHA System; the 1992 Pacific Telecommunications Council 20th Anniversary Award for leadership in the PTC; the 1995 IEEE Koji Kobayashi Computers and Communications Award for development of the ALOHA System; the 1998 Golden Jubilee Award for Technological Innovation from the IEEE Information Theory Society, for "the invention of the first random-access communication protocol"; the 2000 Technology Award from the German Eduard Rhein Foundation; the 2007 IEEE Alexander Graham Bell Medal; and the 2011 C&C Prize.
Wi-Fi is a technology for radio wireless local area networking of devices based on the IEEE 802.11 standards. Wi‑Fi is a trademark of the Wi-Fi Alliance, which restricts the use of the term Wi-Fi Certified to products that successfully complete interoperability certification testing. Devices that can use Wi-Fi technologies include, among others, laptops, video game consoles, tablets, smart TVs, digital audio players, digital cameras and drones. Wi-Fi-compatible devices can connect to the Internet via a wireless access point; such an access point has a range of about 20 meters indoors and a greater range outdoors. Hotspot coverage can be as small as a single room with walls that block radio waves, or as large as many square kilometres, achieved by using multiple overlapping access points. Different versions of Wi-Fi exist, with different radio bands and speeds. Wi-Fi most commonly uses the 2.4 gigahertz UHF and 5 gigahertz SHF ISM radio bands. Each channel can be time-shared by multiple networks.
These wavelengths work best for line-of-sight use. Many common materials absorb or reflect them, which further restricts range, but this can tend to help minimise interference between different networks in crowded environments. At close range, some versions of Wi-Fi, running on suitable hardware, can achieve speeds of over 1 Gbit/s. Anyone within range with a wireless network interface controller can attempt to access a network. Wi-Fi Protected Access (WPA) is a family of technologies created to protect information moving across Wi-Fi networks, and includes solutions for personal and enterprise networks. Security features of WPA have included stronger protections and new security practices as the security landscape has changed over time. In 1971, ALOHAnet connected the Hawaiian Islands with a UHF wireless packet network. ALOHAnet and the ALOHA protocol were early forerunners to Ethernet and the IEEE 802.11 protocols, respectively. A 1985 ruling by the U.S. Federal Communications Commission released the ISM band for unlicensed use.
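The line-of-sight range behavior has a simple first-order model: free-space path loss grows with both distance and frequency, which is one reason 5 GHz networks tend to have shorter range than 2.4 GHz ones at the same transmit power. A small sketch of the standard formula (real indoor propagation adds wall and multipath losses on top of this):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB:
    FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / 299_792_458))
```

At the quoted 20 m indoor range, free space alone costs roughly 66 dB at 2.4 GHz, and about 6 dB more at 5 GHz, a factor of four in received power.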
These frequency bands are the same ones used by equipment such as microwave ovens, and are subject to interference. In 1991, NCR Corporation with AT&T Corporation invented the precursor to 802.11, intended for use in cashier systems, under the name WaveLAN. The Australian radio-astronomer Dr John O'Sullivan, with his colleagues Terence Percival, Graham Daniels, Diet Ostry and John Deane, developed a key patent used in Wi-Fi as a by-product of a Commonwealth Scientific and Industrial Research Organisation (CSIRO) research project, "a failed experiment to detect exploding mini black holes the size of an atomic particle". Dr O'Sullivan and his colleagues are credited with inventing Wi-Fi. In 1992 and 1996, CSIRO obtained patents for a method used in Wi-Fi to "unsmear" the signal. The first version of the 802.11 protocol was released in 1997 and provided up to 2 Mbit/s link speeds. This was updated in 1999 with 802.11b to permit 11 Mbit/s link speeds, which proved popular. In 1999, the Wi-Fi Alliance formed as a trade association to hold the Wi-Fi trademark under which most products are sold.
Wi-Fi uses a large number of patents held by many different organizations. In April 2009, 14 technology companies agreed to pay CSIRO $1 billion for infringements on CSIRO patents; this led to Australia labeling Wi-Fi as an Australian invention, though this has been the subject of some controversy. CSIRO won a further $220 million settlement for Wi-Fi patent infringements in 2012, with global firms in the United States required to pay CSIRO licensing rights estimated to be worth an additional $1 billion in royalties. In 2016, the wireless local area network Test Bed was chosen as Australia's contribution to the exhibition A History of the World in 100 Objects, held in the National Museum of Australia. The name Wi-Fi, commercially used at least as early as August 1999, was coined by the brand-consulting firm Interbrand. The Wi-Fi Alliance had hired Interbrand to create a name "a little catchier than 'IEEE 802.11b Direct Sequence'." Phil Belanger, a founding member of the Wi-Fi Alliance who presided over the selection of the name "Wi-Fi", has stated that Interbrand invented Wi-Fi as a pun on the word hi-fi, a term for high-quality audio technology.
Interbrand also created the Wi-Fi logo. The yin-yang Wi-Fi logo indicates the certification of a product for interoperability. The Wi-Fi Alliance used the advertising slogan "The Standard for Wireless Fidelity" for a short time after the brand name was created. While inspired by the term hi-fi, the name was never officially "Wireless Fidelity", although the Wi-Fi Alliance was called the "Wireless Fidelity Alliance Inc" in some publications. Non-Wi-Fi technologies intended for fixed points, such as Motorola Canopy, are usually described as fixed wireless. Alternative wireless technologies include mobile phone standards such as 2G, 3G, 4G and LTE. The name is sometimes written as WiFi, Wifi, or wifi, but these spellings are not approved by the Wi-Fi Alliance. The IEEE is a separate, but related, organization, and their website has stated that "WiFi is a short name for Wireless Fidelity". To connect to a Wi-Fi LAN, a computer has to be equipped with a wireless network interface controller; the combination of computer and interface controller is called a station.
A service set is the set of all the devices associated with a particular Wi-Fi network. The service set can be local, extended or mesh. Each service set has an associated identifier, the 32-byte Service Set Identifier (SSID), which identifies the particular network.
Bluetooth is a wireless technology standard for exchanging data between fixed and mobile devices over short distances, using short-wavelength UHF radio waves in the industrial, scientific and medical (ISM) radio bands from 2.400 to 2.485 GHz, and building personal area networks. It was originally conceived as a wireless alternative to RS-232 data cables. Bluetooth is managed by the Bluetooth Special Interest Group (SIG), which has more than 30,000 member companies in the areas of telecommunication, computing and consumer electronics. The IEEE standardized Bluetooth as IEEE 802.15.1, but no longer maintains the standard. The Bluetooth SIG oversees development of the specification, manages the qualification program, and protects the trademarks. A manufacturer must meet Bluetooth SIG standards to market a device as a Bluetooth device. A network of patents applies to the technology. The development of the "short-link" radio technology, later named Bluetooth, was initiated in 1989 by Nils Rydbeck, CTO at Ericsson Mobile in Lund, Sweden, and by Johan Ullman. The purpose was to develop wireless headsets, according to two inventions by Johan Ullman, SE 8902098-6, issued 1989-06-12, and SE 9202239, issued 1992-07-24.
Nils Rydbeck tasked Tord Wingren with specifying the technology and Jaap Haartsen and Sven Mattisson with developing it; both were working for Ericsson in Lund. The technology was invented by Dutch electrical engineer Jaap Haartsen, working for telecommunications company Ericsson, in 1994. The first consumer Bluetooth device launched in 1999: a hands-free mobile headset that earned the technology the "Best of Show Technology Award" at COMDEX. The first Bluetooth mobile phone was the Sony Ericsson T36, but it was the revised T39 model that actually made it to store shelves in 2001. The name Bluetooth is an Anglicised version of the Scandinavian Blåtand/Blåtann, the epithet of the tenth-century king Harald Bluetooth, who united dissonant Danish tribes into a single kingdom; the implication is that Bluetooth similarly unites communication protocols. The idea for the name was proposed in 1997 by Jim Kardach of Intel, who had developed a system that would allow mobile phones to communicate with computers. At the time of this proposal he was reading Frans G. Bengtsson's historical novel The Long Ships, about Vikings and King Harald Bluetooth.
The Bluetooth logo is a bind rune merging the Younger Futhark runes ᚼ and ᛒ, Harald's initials. Bluetooth operates at frequencies between 2402 and 2480 MHz, or 2400 and 2483.5 MHz including guard bands 2 MHz wide at the bottom end and 3.5 MHz wide at the top. This is in the globally unlicensed industrial, scientific and medical (ISM) 2.4 GHz short-range radio frequency band. Bluetooth uses a radio technology called frequency-hopping spread spectrum. Bluetooth divides transmitted data into packets, and transmits each packet on one of 79 designated Bluetooth channels; each channel has a bandwidth of 1 MHz. It usually performs 1600 hops per second, with adaptive frequency-hopping enabled. Bluetooth Low Energy uses 2 MHz spacing. Originally, Gaussian frequency-shift keying (GFSK) modulation was the only modulation scheme available. Since the introduction of Bluetooth 2.0+EDR, π/4-DQPSK and 8-DPSK modulation may also be used between compatible devices. Devices functioning with GFSK are said to be operating in basic rate (BR) mode, where an instantaneous bit rate of 1 Mbit/s is possible. The term Enhanced Data Rate (EDR) is used to describe the π/4-DPSK and 8-DPSK schemes, giving 2 and 3 Mbit/s respectively.
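The channel plan and hop rate can be sketched numerically. Note this is only an illustration: the real hop-selection kernel derives each channel from the master's clock and device address, whereas a seeded PRNG stands in for it here, and `hop_sequence` is an invented name.

```python
import random

BASE_MHZ, CHANNELS, HOPS_PER_SEC = 2402, 79, 1600

def hop_sequence(seed, hops):
    """Illustrative pseudo-random hop sequence over the 79 BR/EDR
    channels (2402..2480 MHz, 1 MHz apart). A real device computes
    the sequence from the piconet master's clock and address."""
    rng = random.Random(seed)
    return [BASE_MHZ + rng.randrange(CHANNELS) for _ in range(hops)]

DWELL_US = 1_000_000 / HOPS_PER_SEC   # 625 µs spent on each channel
```

At 1600 hops per second, the radio dwells on each channel for exactly one 625 µs slot, which is why the hop rate and the slot timing described below line up.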
The combination of these modes in Bluetooth radio technology is classified as a BR/EDR radio. Bluetooth is a packet-based protocol with a master/slave architecture. One master may communicate with up to seven slaves in a piconet. All devices share the master's clock. Packet exchange is based on the basic clock, defined by the master, which ticks at 312.5 µs intervals. Two clock ticks make up a slot of 625 µs, and two slots make up a slot pair of 1250 µs. In the simple case of single-slot packets, the master transmits in even slots and receives in odd slots; the slave, conversely, receives in even slots and transmits in odd slots. Packets may be 1, 3 or 5 slots long, but in all cases the master's transmission begins in even slots and the slave's in odd slots. The above excludes Bluetooth Low Energy, introduced in the 4.0 specification, which uses the same spectrum but somewhat differently. A master BR/EDR Bluetooth device can communicate with a maximum of seven devices in a piconet, though not all devices reach this maximum. The devices can switch roles by agreement, and the slave can become the master.
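The slot timing and the even/odd alternation described above reduce to a few constants and a parity check; a minimal sketch (the `transmitter` helper is an illustrative name, and it covers only the single-slot-packet case):

```python
TICK_US = 312.5            # basic clock tick, defined by the master
SLOT_US = 2 * TICK_US      # one slot: 625 µs
PAIR_US = 2 * SLOT_US      # one slot pair: 1250 µs

def transmitter(slot_index):
    """For single-slot packets, the master transmits in even-numbered
    slots and the slave answers in odd-numbered slots."""
    return "master" if slot_index % 2 == 0 else "slave"
```

This strict alternation is what lets every device in the piconet stay synchronized to the master's clock without any per-packet negotiation.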
The Bluetooth Core Specification provides for the connection of two or more piconets to form a scatternet, in which certain devices simultaneously play the master role in one piconet and the slave role in another. At any given time, data can be transferred between the master and one other device; the master chooses which slave device to address. Since it is the master that chooses which slave to address, whereas a slave is supposed to listen in each receive slot, being a master is a lighter burden than being a slave. Being a master of seven slaves is possible; being a slave of more than one master is also possible. The specification is vague as to required behavior in scatternets. Bluetooth is a standard wire-replacement communications protocol primarily designed for low power consumption, with a short range based on low-cost transceiver microchips in each device.
Wireless network interface controller
A wireless network interface controller (WNIC) is a network interface controller which connects to a wireless radio-based computer network, rather than a wired network such as Token Ring or Ethernet. A WNIC, just like other NICs, works on Layer 2 of the OSI model; the card uses an antenna to communicate via microwave radiation. A WNIC in a desktop computer is traditionally connected using the PCI bus. Other connectivity options are USB and PC Card. Integrated WNICs are also available. Early wireless network interface controllers were commonly implemented on expansion cards that plugged into a computer bus. The low cost and ubiquity of the Wi-Fi standard means that many newer mobile computers have a wireless network interface built into the motherboard. The term is usually applied to IEEE 802.11 adapters. An 802.11 WNIC can operate in two modes, known as infrastructure mode and ad hoc mode. In an infrastructure mode network, the WNIC needs a wireless access point: all data is transferred using the access point as the central hub.
All wireless nodes in an infrastructure mode network connect to an access point. All nodes connecting to the access point must have the same service set identifier (SSID) as the access point, and if any kind of wireless security is enabled on the access point, they must share the same keys or other authentication parameters. In an ad hoc mode network, the WNIC does not require an access point, but rather can interface with all other wireless nodes directly. All the nodes in an ad hoc network must use the same channel and SSID. The IEEE 802.11 standard sets out low-level specifications for how all 802.11 wireless networks operate. Earlier 802.11 interface controllers are only compatible with earlier variants of the standard, while newer cards support both current and older standards. Specifications used in marketing materials for WNICs include wireless data transfer rates, wireless transmit power, and supported wireless network standards. For example, 802.11g offers data transfer speeds equivalent to 802.11a (up to 54 Mbit/s) with the wider 300-foot range of 802.11b, and is backward compatible with 802.11b.
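The association rules for the two modes can be summarized as a small data model. This is a simplified sketch, not the 802.11 state machine: `Node`, `can_associate` and `adhoc_link` are invented names, and the key comparison stands in for the real authentication handshake.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    ssid: str
    channel: int
    key: Optional[str] = None   # shared key when security is enabled

def can_associate(station, ap):
    """Infrastructure mode: a station joins an access point only when
    the SSIDs match and, if the AP has security enabled, the keys
    (standing in for real authentication parameters) match too."""
    if station.ssid != ap.ssid:
        return False
    return ap.key is None or station.key == ap.key

def adhoc_link(a, b):
    """Ad hoc mode: no access point; two nodes talk directly when they
    share both the same SSID and the same channel."""
    return a.ssid == b.ssid and a.channel == b.channel
```

Note the asymmetry the text describes: in infrastructure mode the channel is dictated by the access point, while in ad hoc mode the peers themselves must already agree on it.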
Most Bluetooth cards do not implement any form of the 802.11 standard. Wireless range may be affected by objects in the path of the signal and by the quality of the antenna. Large electrical appliances, such as refrigerators, fuse boxes, metal plumbing and air conditioning units, can impede a wireless network signal. The theoretical maximum range of IEEE 802.11 is reached only under ideal circumstances; true effective range is about half of the theoretical range. The maximum throughput speed is only achieved at close range, because wireless devices dynamically negotiate the top speed at which they can communicate without dropping too many data packets. In an 802.11 WNIC, the MAC Sublayer Management Entity (MLME) can be implemented either in the NIC's hardware or firmware, or in host-based software executed on the main CPU. A WNIC that implements the MLME function in hardware or firmware is called a FullMAC WNIC or a HardMAC WNIC, and a NIC that implements it in host software is called a SoftMAC NIC.
A FullMAC device hides the complexity of the 802.11 protocol from the main CPU, instead providing an 802.3 (Ethernet) interface. FullMAC chips are typically used in mobile devices because they are easier to integrate into complete products, and because power is saved by having a specialized CPU perform the 802.11 processing. A popular example of a FullMAC chip is the one implemented on the Raspberry Pi 3. The Linux kernel's mac80211 framework provides capabilities for SoftMAC devices, as well as additional capabilities for devices with limited functionality, and FreeBSD also supports SoftMAC drivers.