Brian Harvey (lecturer)
Brian Keith Harvey is a former Lecturer SOE of computer science at the University of California, Berkeley. He and his students developed a free and open-source Logo interpreter for learners. He received his B.S. in Mathematics from MIT in 1969, an M.S. in Computer Science from Stanford University in 1975, and a Ph.D. in Science and Mathematics Education from UC Berkeley in 1985; he also received an M.A. in Clinical Psychology from New College of California in 1990. Until his retirement in July 2013, Harvey taught introductory computer science courses at Berkeley, as well as "CS 195, Social Implications of Computing", and he was involved in the development of Logo for use in K-12 education. Together with the German programmer Jens Mönig, Harvey designed BYOB and its successor Snap!, an extended version of Scratch that added higher-order functions and true object inheritance for first-class sprites. With "CS10, The Beauty and Joy of Computing" at Berkeley, he co-established the first course using BYOB and spread it to other colleges and high schools.
Awards: UC Berkeley Distinguished Teaching Award; Diane S. McEntyre Award for Excellence in Teaching; Jim and Donna Gray Faculty Award for Excellence in Undergraduate Teaching.
Selected works: "Why Structure and Interpretation of Computer Programs Matters" (2011); "Harmful to Children?" (2001); B. K. Harvey and M. Wright, Simply Scheme: Introducing Computer Science, 2nd ed., Cambridge, MA: MIT Press (1999); B. K. Harvey, Computer Science Logo Style, 2nd ed., vols. 1-3, Cambridge, MA: MIT Press (1997); "Reasoning with Computers" (1997); "Is Programming Obsolete?" (1994); "Avoiding Recursion" (1992); "Beyond Programming" (1992); "Symbolic Programming vs. the AP Curriculum" (1991); "Computer Hacking and Ethics" (1985); "Using Computers for Educational Freedom" (1980).
An L-system or Lindenmayer system is a parallel rewriting system and a type of formal grammar. An L-system consists of an alphabet of symbols that can be used to make strings, a collection of production rules that expand each symbol into some larger string of symbols, an initial "axiom" string from which to begin construction, and a mechanism for translating the generated strings into geometric structures. L-systems were introduced and developed in 1968 by Aristid Lindenmayer, a Hungarian theoretical biologist and botanist at the University of Utrecht. Lindenmayer used L-systems to describe the behaviour of plant cells and to model the growth processes of plant development. L-systems have also been used to model the morphology of a variety of organisms and can be used to generate self-similar fractals. As a biologist, Lindenmayer worked with yeast and filamentous fungi and studied the growth patterns of various types of bacteria, such as the cyanobacteria Anabaena catenula. L-systems were devised to provide a formal description of the development of such simple multicellular organisms and to illustrate the neighbourhood relationships between plant cells.
Later, this system was extended to describe higher plants and complex branching structures. The recursive nature of the L-system rules leads to self-similarity, and thereby fractal-like forms are easy to describe with an L-system. Plant models and natural-looking organic forms are easy to define, as by increasing the recursion level the form slowly 'grows' and becomes more complex. Lindenmayer systems are also popular in the generation of artificial life. L-system grammars are similar to the semi-Thue grammar. L-systems are now commonly known as parametric L-systems, defined as a tuple G = (V, ω, P), where V is a set of symbols containing both elements that can be replaced (variables) and those which cannot be replaced (constants); ω is a string of symbols from V defining the initial state of the system; and P is a set of production rules or productions defining the way variables can be replaced with combinations of constants and other variables. A production consists of two strings, the predecessor and the successor. For any symbol A, a member of the set V which does not appear on the left-hand side of a production in P, the identity production A → A is assumed.
The rules of the L-system grammar are applied iteratively starting from the initial state. As many rules as possible are applied simultaneously, per iteration; the fact that each iteration employs as many rules as possible differentiates an L-system from a formal language generated by a formal grammar, which applies only one rule per iteration. If the production rules were to be applied only one at a time, one would quite simply generate a language, rather than an L-system. Thus, L-systems are strict subsets of languages. An L-system is context-free if each production rule refers only to an individual symbol and not to its neighbours; context-free L-systems are thus specified by a context-free grammar. If a rule depends not only on a single symbol but also on its neighbours, it is termed a context-sensitive L-system. If there is exactly one production for each symbol, the L-system is said to be deterministic. If there are several, and each is chosen with a certain probability during each iteration, then it is a stochastic L-system. Using L-systems for generating graphical images requires that the symbols in the model refer to elements of a drawing on the computer screen.
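The parallel application of rules described above can be sketched in a few lines of Python. This is a minimal illustration, not tied to any particular L-system library; the function names and rule table are hypothetical:

```python
def step(string, rules):
    # Apply every production in parallel: each symbol in the string is
    # rewritten in the same pass. Symbols with no rule map to themselves,
    # which is exactly the identity production A -> A.
    return "".join(rules.get(symbol, symbol) for symbol in string)

def derive(axiom, rules, iterations):
    # Iterate the rewriting system, starting from the axiom.
    string = axiom
    for _ in range(iterations):
        string = step(string, rules)
    return string

# A deterministic, context-free example with variables A and B:
rules = {"A": "AB", "B": "A"}
print(derive("A", rules, 3))  # -> ABAAB
```

Because `step` rewrites every symbol in one pass, the sketch captures the parallelism that distinguishes an L-system from ordinary one-rule-at-a-time grammar derivation.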
For example, the program Fractint uses turtle graphics to produce screen images. It interprets each constant in an L-system model as a turtle command.

Lindenmayer's original L-system for modelling the growth of algae is:
variables: A B
constants: none
axiom: A
rules: (A → AB), (B → A)

which produces:
n = 0: A
n = 1: AB
n = 2: ABA
n = 3: ABAAB
n = 4: ABAABABA
n = 5: ABAABABAABAAB
n = 6: ABAABABAABAABABAABABA
n = 7: ABAABABAABAABABAABABAABAABABAABAAB

The derivation can also be read as a tree. At n = 0 there is the single starting A. At n = 1, rule (A → AB) spawns it into AB; rule (B → A) could not be applied. At n = 2, all rules are applied to AB: the A spawns into AB again, and the former B turns into A, giving ABA. At n = 3, every A produces a copy of itself followed by a B, while every B turns into an A, giving ABAAB; each newly produced A in turn starts to spawn in the following generation. The result is the sequence of Fibonacci words. If we count the length of each string, we obtain the famous Fibonacci sequence of numbers: 1 2 3 5 8 13 21 34 55 89...
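The algae derivation can be reproduced mechanically; a short Python sketch (the function name `algae` is illustrative, not from any library) generates the strings above and confirms that their lengths follow the Fibonacci sequence:

```python
def algae(n):
    # Lindenmayer's algae system: axiom A, rules A -> AB and B -> A,
    # applied in parallel to every symbol at each generation.
    s = "A"
    for _ in range(n):
        s = "".join({"A": "AB", "B": "A"}[c] for c in s)
    return s

print(algae(5))                            # -> ABAABABAABAAB
print([len(algae(n)) for n in range(10)])  # -> [1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
```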
For each string, if we count the k-th position from the left end of the string, the value is determined by whether a multiple of the golden ratio falls within the interval (k − 1, k). The ratio of A to B likewise converges to the golden mean. This example yields the same result (in terms of the length of each string, not the sequence of As and Bs) if the rule (A → AB) is replaced with (A → BA), except that the strings are mirrored. This sequence is a locally catenative sequence because G(n) = G(n − 1)G(n − 2), where G(n) is the n-th generation.
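Both the locally catenative property and the convergence of the A-to-B ratio are easy to check numerically. The following sketch regenerates the Fibonacci words (the helper `fib_word` is hypothetical, equivalent to iterating the algae rules) and verifies both claims:

```python
def fib_word(n):
    # n-th generation of the algae system: axiom A; A -> AB, B -> A.
    s = "A"
    for _ in range(n):
        s = "".join({"A": "AB", "B": "A"}[c] for c in s)
    return s

# Locally catenative: G(n) = G(n-1) G(n-2) for n >= 2.
for n in range(2, 10):
    assert fib_word(n) == fib_word(n - 1) + fib_word(n - 2)

# The ratio of As to Bs approaches the golden mean, about 1.6180339887.
s = fib_word(20)
print(s.count("A") / s.count("B"))
```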
Lexington is a town in Middlesex County, Massachusetts, United States. The population was 31,394 at the 2010 census, in nearly 11,100 households. Settled in 1641, it is celebrated as the site of the first shots of the American Revolutionary War, in the Battle of Lexington on April 19, 1775, and it is the sixth wealthiest small city in the United States. Lexington was first settled circa 1642 as part of Cambridge, Massachusetts. What is now Lexington was incorporated as a parish, called Cambridge Farms, in 1691; this allowed residents to have a separate church and minister, though they remained under the jurisdiction of the Town of Cambridge. Lexington was incorporated as a separate town in 1713, and it was then that it got the name Lexington. How it received its name is the subject of some controversy: some people believe that it was named in honor of an English peer, while others believe that it was named after Lexington in England. In the early colonial days, Vine Brook, which runs through Lexington and Bedford and empties into the Shawsheen River, was a focal point of the farming and industry of the town.
It provided for many types of mills and, in the 20th century, for farm irrigation. For decades, Lexington grew modestly while remaining a farming community, providing Boston with much of its produce; it always had a bustling downtown area. Lexington began to prosper, helped by its proximity to Boston and by a rail line that began serving its citizens and businesses in 1846. For many years, East Lexington was considered a separate village from the rest of the town, though it still had the same officers and Town Hall. Most of the farms of Lexington had become housing developments by the end of the 1960s. Lexington, like many of the towns along the Route 128 corridor, experienced a jump in population in the 1960s and 70s due to the high-tech boom. Property values in the town soared, and the school system became nationally recognized for its excellence. The town participates in the METCO program, which buses minority students from Boston to suburban towns to receive better educational opportunities than those available to them in the Boston Public Schools.
On April 19, 1775, what many regard as the first battle of the American Revolutionary War was fought at Lexington. After the rout, the British marched on toward Concord, where the militia had been allowed time to organize at the Old North Bridge; there they turned back the British and prevented them from capturing and destroying the militia's arms stores. Lexington was the Cold War location of the USAF "Experimental SAGE Subsector" for testing a prototype IBM computer that arrived in July 1955 for development of a computerized "national air defense network". Lexington is located at 42°26′39″N 71°13′36″W. According to the United States Census Bureau, the town has a total area of 16.5 square miles, of which 16.4 square miles is land and 0.1 square miles, or 0.85%, is water. Lexington borders the towns of Burlington, Winchester, Belmont, Waltham and Bedford, and it has a larger area than any of these neighboring municipalities. As of the 2010 census, there were 31,394 people, 11,530 households, and 8,807 families residing in the town.
The population density was 1,851.0 people per square mile. There were 12,019 housing units at an average density of 691.1 per square mile. The racial makeup of the town was 68.6% White, 25.4% Asian, 1.5% Black or African American, 0.1% Native American, 0.0% Pacific Islander, 0.5% from other races, and 2.6% from two or more races. Hispanic or Latino residents of any race were 2.3% of the population. There were 11,530 households, out of which 38.6% had children under the age of 18 living with them, 66.0% were married couples living together, 7.7% had a female householder with no husband present, and 24.1% were non-families. 20.8% of all households were made up of individuals, and 12.3% had someone living alone who was 65 years of age or older. The average household size was 2.66 and the average family size was 3.10. In the town, the population was spread out, with 26.4% under the age of 18, 3.5% from 18 to 24, 22.7% from 25 to 44, 28.5% from 45 to 64, and 19.0% who were 65 years of age or older. The median age was 44 years. For every 100 females, there were 88.7 males.
For every 100 females age 18 and over, there were 83.5 males. In 2013, the mean home price for detached houses was $852,953, and the median price of a house or condo was $718,300. According to a 2012 estimate, the median income for a household in the town was $191,350, and the median income for a family was $218,890. Males had a median income of $101,334 versus $77,923 for females; the per capita income for the town was $70,132. About 1.8% of families and 3.4% of the population were below the poverty line, including 3.2% of those under age 18 and 3.4% of those age 65 or over. By race, the median household income was highest for mixed-race households, at $263,321. Hispanic households had a median income of $233,875, Asian households $178,988, White households $154,533, Black households $139,398, and American Indian or Alaskan Native households $125,139. In 2010, 20% of the residents of Lexington were born outside of the United States.
Lexington's public education system
Cynthia Solomon is an American computer scientist known for her work in artificial intelligence and for popularizing computer science for students. She is a pioneer in the fields of artificial intelligence, computer science, and educational computing. While working as a researcher at the Massachusetts Institute of Technology, Dr. Solomon took it upon herself to understand and program in the programming language Lisp; as she began learning this language, she realized the need for a programming language more accessible and understandable for children. Throughout her research studies in education, Dr. Solomon worked full-time as a computer teacher in elementary and secondary schools; her work has focused on research on human-computer interaction and children as designers. While working at Bolt, Beranek and Newman, she created the first programming language for children, Logo, with Wally Feurzeig and Seymour Papert; the language was created to teach concepts of programming related to Lisp. Dr. Solomon has attained many accomplishments in her life, such as being the Vice President of R&D for Logo Computer Systems, Inc. when Apple Logo was developed, and serving as the Director of the Atari Cambridge Research Laboratory.
Dr. Solomon worked on the program committee of Constructing Modern Knowledge and the Marvin Minsky Institute for Artificial Intelligence in 2016. On top of this, she has published many writings based on research in the field of child education and technology in the classroom. Dr. Solomon has conducted workshops in elementary schools, high schools, and colleges regarding academic research and writing, and she continues to contribute to the field by speaking at conferences and working with the One Laptop Per Child Foundation. Cynthia has admitted to struggling through some of her classes during middle and high school, and has mentioned that she became motivated to continue her education through her passion for computer science and kind music teachers. Dr. Solomon received her bachelor's degree in history at Radcliffe College in the early 1960s. Subsequently, she studied at Boston University, where she received her master's degree in computer science in 1976, and she received her PhD in education at Harvard University in 1985.
Intermittently while completing her schooling, Solomon worked for several years as a researcher with Marvin Minsky and Seymour Papert at MIT and at Bolt, Beranek and Newman. After receiving her bachelor's degree at Radcliffe College, Dr. Solomon taught at Milton Academy for seven years; she later became the Technology Integration Coordinator at Monsignor Haddad Middle School in Needham, MA. In the 1980s, the Massachusetts Institute of Technology hired Dr. Solomon to lead the Atari Cambridge Research Laboratory due to her success in the development of Logo. Solomon maintained a long working relationship with the MIT Media Lab and the One Laptop Per Child Foundation. Dr. Solomon is still a leading worker for the One Laptop Per Child Foundation and directed the creation of educational materials for the foundation; she continued her teaching and scholarship while working with several esteemed research labs and foundations. In 2016, Solomon was awarded the National Center for Women & Information Technology Pioneer Award. Solomon also received a Lifetime Achievement Award at Constructionism 2016.
She introduced the Seymour Papert memorial lecture at Cross Roads 2018 and facilitated conversation about new uses for the program in education and for a new demographic of users. Together with Seymour Papert and Wally Feurzeig, Cynthia designed the Logo computer programming language in 1967; this programming language was for kids to experiment with words, solve math problems, make up stories, and create their own games. Logo is known for its use of turtle graphics, in which commands for movement and drawing produce line graphics, either on screen or with a small robot called a turtle. In the 1970s, a new development of Logo was introduced allowing the program to be viewed in multiple colors. The language was created to teach concepts of programming related to Lisp, a functional programming language. Logo enabled what Papert called "body-syntonic reasoning", where students could understand and reason about the turtle's motion by imagining what they would do if they were the turtle. There are substantial differences among the many dialects of Logo, and the situation is confused by the regular appearance of turtle-graphics programs that call themselves Logo.
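The body-syntonic idea is easy to see in code. Below is a minimal, dependency-free Python sketch of a turtle (a hypothetical toy class, not the original Logo implementation) that tracks position and heading the way a child might imagine walking the path:

```python
import math

class Turtle:
    """Toy turtle: just a position and a heading, as in Logo."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0          # degrees; 0 means facing along +x
        self.path = [(self.x, self.y)]

    def forward(self, distance):
        # Walk `distance` units in the direction the turtle is facing.
        rad = math.radians(self.heading)
        self.x += distance * math.cos(rad)
        self.y += distance * math.sin(rad)
        self.path.append((self.x, self.y))

    def left(self, angle):
        # Turn counterclockwise by `angle` degrees, in place.
        self.heading = (self.heading + angle) % 360

# Logo's classic square: REPEAT 4 [FORWARD 100 LEFT 90]
t = Turtle()
for _ in range(4):
    t.forward(100)
    t.left(90)
print(math.hypot(t.x, t.y) < 1e-9)  # True: the turtle is back at its start
```

Reasoning "forward, turn left, forward, turn left..." is exactly the walk a student would act out with their own body, which is what made the turtle such an effective teaching device.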
Dr. Solomon began the development of Logo after coming to the realization that children needed a programming language of their own. Solomon directed the creation of educational materials for the One Laptop per Child Foundation. Her doctoral research at Harvard led to the publication of the book Computer Environments for Children: A Reflection on Theories of Learning and Education, which explores the opportunities and challenges presented by having computers in learning environments. Focused on elementary school mathematics, Solomon discusses the role of computers in innovative learning theories. Solomon is also the co-author, with Allison Druin, of Designing Multimedia Environments for Children, along with many other research projects and writings contributing to the knowledge of children's learning environments in conjunction with technology.
Selected publications:
Solomon, Cynthia. Computer Environments for Children: A Reflection on Theories of Learning and Education.
Papert, Seymour, and Cynthia Solomon. Twenty Things to Do with a Computer. http://dspace.mit.edu/handle/1721.1/5836
Solomon, Cynthia. Leading a Child to a Computer Culture. ACM SIGCUE Outlook.
Solomon, Cynthia. Teaching young children to program in a LOGO turtle computer culture. ACM SIGCUE Outlook.
Solomon, Cynthia. Apple II - Apple Logo: Reference Manual & Intro.
In computer science, artificial intelligence, sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and animals. Computer science defines AI research as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of achieving its goals. Colloquially, the term "artificial intelligence" is used to describe machines that mimic "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving". As machines become increasingly capable, tasks considered to require "intelligence" are removed from the definition of AI, a phenomenon known as the AI effect. A quip in Tesler's Theorem says "AI is whatever hasn't been done yet." For instance, optical character recognition is excluded from things considered to be AI, having become a routine technology. Modern machine capabilities classified as AI include understanding human speech, competing at the highest level in strategic game systems, autonomously operating cars, intelligent routing in content delivery networks, and military simulations.
Artificial intelligence can be classified into three different types of systems: analytical, human-inspired, and humanized artificial intelligence. Analytical AI has only characteristics consistent with cognitive intelligence. Human-inspired AI has elements from cognitive as well as emotional intelligence. Humanized AI shows characteristics of all types of competencies, and is able to be self-conscious and self-aware in interactions with others. Artificial intelligence was founded as an academic discipline in 1956, and in the years since has experienced several waves of optimism, followed by disappointment and the loss of funding, followed by new approaches and renewed funding. For most of its history, AI research has been divided into subfields that often fail to communicate with each other; these subfields are based on technical considerations, such as particular goals, the use of particular tools, or deep philosophical differences. Subfields have also been based on social factors. The traditional problems of AI research include reasoning, knowledge representation, learning, natural language processing, and the ability to move and manipulate objects.
General intelligence is among the field's long-term goals. Approaches include statistical methods, computational intelligence, traditional symbolic AI. Many tools are used in AI, including versions of search and mathematical optimization, artificial neural networks, methods based on statistics and economics; the AI field draws upon computer science, information engineering, psychology, linguistics and many other fields. The field was founded on the claim that human intelligence "can be so described that a machine can be made to simulate it"; this raises philosophical arguments about the nature of the mind and the ethics of creating artificial beings endowed with human-like intelligence which are issues that have been explored by myth and philosophy since antiquity. Some people consider AI to be a danger to humanity if it progresses unabated. Others believe that AI, unlike previous technological revolutions, will create a risk of mass unemployment. In the twenty-first century, AI techniques have experienced a resurgence following concurrent advances in computer power, large amounts of data, theoretical understanding.
Thought-capable artificial beings appeared as storytelling devices in antiquity and have been common in fiction, as in Mary Shelley's Frankenstein or Karel Čapek's R.U.R. These characters and their fates raised many of the same issues now discussed in the ethics of artificial intelligence. The study of mechanical or "formal" reasoning began with philosophers and mathematicians in antiquity. The study of mathematical logic led directly to Alan Turing's theory of computation, which suggested that a machine, by shuffling symbols as simple as "0" and "1", could simulate any conceivable act of mathematical deduction. This insight, that digital computers can simulate any process of formal reasoning, is known as the Church–Turing thesis. Along with concurrent discoveries in neurobiology, information theory and cybernetics, this led researchers to consider the possibility of building an electronic brain. Turing proposed that if a human could not distinguish between responses from a machine and a human, the machine could be considered "intelligent".
The first work now generally recognized as AI was McCulloch and Pitts' 1943 formal design for Turing-complete "artificial neurons". The field of AI research was born at a workshop at Dartmouth College in 1956. Attendees Allen Newell, Herbert Simon, John McCarthy, Marvin Minsky and Arthur Samuel became the founders and leaders of AI research, and they and their students produced programs that the press described as "astonishing": computers were learning checkers strategies (and by 1959 were reportedly playing better than the average human).
Cambridge is a city in Middlesex County, Massachusetts, and part of the Boston metropolitan area. Situated directly north of Boston, across the Charles River, it was named in honor of the University of Cambridge in England, an important center of the Puritan theology embraced by the town's founders. Harvard University and the Massachusetts Institute of Technology are in Cambridge, as was Radcliffe College, a college for women until it merged with Harvard on October 1, 1999. According to the 2010 Census, the city's population was 105,162. As of July 2014, it was the fifth most populous city in the state, behind Boston, Worcester, Springfield and Lowell. Cambridge was one of two seats of Middlesex County until the county government was abolished in Massachusetts in 1997. In December 1630, the site of what would become Cambridge was chosen because it was safely upriver from Boston Harbor, making it defensible from attacks by enemy ships. Thomas Dudley, his daughter Anne Bradstreet, and her husband Simon were among the town's first settlers.
The first houses were built in the spring of 1631. The settlement was initially referred to as "the newe towne". Official Massachusetts records show the name rendered as Newe Towne by 1632, and as Newtowne by 1638. Located at the first convenient Charles River crossing west of Boston, Newe Towne was one of a number of towns founded by the 700 original Puritan colonists of the Massachusetts Bay Colony under Governor John Winthrop. Its first preacher was Thomas Hooker, who led many of its original inhabitants west in 1636 to found Hartford and the Connecticut Colony. The original village site is now within Harvard Square. The marketplace where farmers sold crops from surrounding towns at the edge of a salt marsh remains within a small park at the corner of John F. Kennedy and Winthrop Streets. The town comprised a much larger area than the present city, with various outlying parts becoming independent towns over the years: Cambridge Village in 1688, Cambridge Farms in 1712 or 1713, and Little or South Cambridge and Menotomy or West Cambridge in 1807.
In the late 19th century, various schemes for annexing Cambridge to Boston were pursued and rejected. In 1636, Newe College was founded by the colony to train ministers. According to Cotton Mather, Newe Towne was chosen for the site of the college by the Great and General Court for its proximity to the popular and respected Puritan preacher Thomas Shepard. In May 1638, the settlement's name was changed to Cambridge in honor of the university in Cambridge, England. Newtowne's ministers, including Shepard, as well as the college's first president, its major benefactor, and the first schoolmaster Nathaniel Eaton, were Cambridge alumni, as was the colony's governor John Winthrop. In 1629, Winthrop had led the signing of the founding document of the city of Boston, known as the Cambridge Agreement, after the university. In 1650, Governor Thomas Dudley signed the charter creating the corporation that still governs Harvard College. Cambridge grew as an agricultural village eight miles by road from Boston, the colony's capital.
By the American Revolution, most residents lived near the Common and Harvard College, with most of the town comprising farms and estates. Most inhabitants were descendants of the original Puritan colonists, but there was also a small elite of Anglican "worthies" who were not involved in village life, made their livings from estates and trade, and lived in mansions along "the Road to Watertown". Coming north from Virginia, George Washington took command of the volunteer American soldiers camped on Cambridge Common on July 3, 1775, an event now reckoned as the birthplace of the U.S. Army. Most of the Tory estates were confiscated after the Revolution. On January 24, 1776, Henry Knox arrived with artillery captured from Fort Ticonderoga, which enabled Washington to drive the British army out of Boston. Between 1790 and 1840, Cambridge grew with the construction of the West Boston Bridge in 1792, connecting Cambridge directly to Boston and making it no longer necessary to travel eight miles through the Boston Neck and Brookline to cross the Charles River.
A second bridge, the Canal Bridge, opened in 1809 alongside the new Middlesex Canal. The new bridges and roads made what were formerly estates and marshland into prime industrial and residential districts. In the mid-19th century, Cambridge was the center of a literary revolution; it was home to some of the famous Fireside Poets—so called because their poems would be read aloud by families in front of their evening fires. The Fireside Poets—Henry Wadsworth Longfellow, James Russell Lowell, and Oliver Wendell Holmes—were popular and influential in their day. Soon after, turnpikes were built: the Cambridge and Concord Turnpike, the Middlesex Turnpike, and what are today's Cambridge and Harvard Streets, connecting various areas of Cambridge to the bridges. In addition, the town was connected to the Boston & Maine Railroad, leading to the development of Porter Square as well as the creation of neighboring Somerville from the rural parts of Charlestown. Cambridge was incorporated as a city in 1846, despite persistent tensions between East Cambridge and Old Cambridge stemming from differences in culture, sources of income, and the national origins of the residents.