Industrial engineering is an interdisciplinary profession concerned with the optimization of complex processes, systems, or organizations by developing and implementing integrated systems of people, knowledge, equipment and materials. Industrial engineers use specialized knowledge and skills in the mathematical and social sciences, together with the principles and methods of engineering analysis and design, to specify and evaluate the results obtained from systems and processes. From these results they are able to create new systems, processes or situations for the useful coordination of people and machines, and to improve the quality and productivity of systems, physical or social. Depending on the sub-specialties involved, industrial engineering may overlap with operations research, systems engineering, manufacturing engineering, production engineering, management science, management engineering, financial engineering, ergonomics or human factors engineering, safety engineering, or others, depending on the viewpoint or motives of the user.
Though its underlying concepts overlap with certain business-oriented disciplines, such as operations management, industrial engineering is a longstanding engineering discipline subject to professional engineering licensure in most jurisdictions. There is a general consensus among historians that the roots of the industrial engineering profession date back to the Industrial Revolution; the technologies that helped mechanize traditional manual operations in the textile industry, including the flying shuttle, the spinning jenny and, most notably, the steam engine, generated economies of scale that made mass production in centralized locations attractive for the first time. The concept of the production system had its genesis in the factories created by these innovations. Adam Smith's concepts of the division of labour and the "invisible hand" of capitalism, introduced in his treatise "The Wealth of Nations", motivated many of the technological innovators of the Industrial Revolution to establish and implement factory systems.
The efforts of James Watt and Matthew Boulton led to the first integrated machine manufacturing facility in the world, including the implementation of concepts such as cost control systems to reduce waste and increase productivity, and the institution of skills training for craftsmen. Charles Babbage became associated with industrial engineering because of the concepts he introduced in his book "On the Economy of Machinery and Manufactures", which he wrote as a result of his visits to factories in England and the United States in the early 1800s; the book covers subjects such as the time required to perform a specific task, the effects of subdividing tasks into smaller and less detailed elements, and the advantages to be gained from repetitive tasks. Eli Whitney and Simeon North proved the feasibility of the notion of interchangeable parts in the manufacture of muskets and pistols for the US government. Under this system, individual parts were mass-produced to tolerances that enabled their use in any finished product.
The result was a significant reduction in the need for skill from specialized workers, which eventually gave rise to the industrial environment that would be studied later. Frederick Taylor is credited as being the father of the industrial engineering discipline; he earned a degree in mechanical engineering from Stevens Institute of Technology and held several patents for his inventions. His books Shop Management and The Principles of Scientific Management, published in the early 1900s, marked the beginning of industrial engineering. Under his methods, improvements in work efficiency were based on improving work methods, developing work standards, and reducing the time required to carry out the work. With an abiding faith in the scientific method, Taylor's contribution to "time study" sought a high level of precision and predictability for manual tasks; the husband-and-wife team of Frank Gilbreth and Lillian Gilbreth was the other cornerstone of the industrial engineering movement, and their work is housed at the Purdue University School of Industrial Engineering.
They categorized the elements of human motion into 18 basic elements called therbligs. This development permitted analysts to design jobs without knowledge of the time required to perform them; these developments were the beginning of a much broader field known as human factors and ergonomics. In 1908, the first course on industrial engineering was offered as an elective at Pennsylvania State University, which became a separate program in 1909 through the efforts of Hugo Diemer; the first doctoral degree in industrial engineering was awarded in 1933 by Cornell University. In 1912 Henry Laurence Gantt developed the Gantt chart, which outlines the actions of an organization along with their relationships; the chart was later given the form familiar to us today by Wallace Clark. With the development of assembly lines, the factory of Henry Ford accounted for a significant leap forward in the field. Ford reduced the assembly time of a car from more than 700 hours to 1.5 hours. In addition, he was a pioneer of welfare capitalism and of providing financial incentives for employees to increase productivity.
The comprehensive quality management system developed in the 1940s gained momentum after World War II and was part of the recovery of Japan after the war. The American Institute of Industrial Engineers was formed in 1948; the early work by F. W. Taylor and the Gilbreths was documented in papers presented to the American Society of Mechanical Engineers as interest grew from improving machine performance to the per
A database is an organized collection of data, stored and accessed electronically from a computer system. Where databases are more complex, they are often developed using formal design and modeling techniques; the database management system (DBMS) is the software that interacts with end users, applications and the database itself to capture and analyze the data. The DBMS software additionally encompasses the core facilities provided to administer the database; the sum total of the database, the DBMS and the associated applications can be referred to as a "database system". The term "database" is often used loosely to refer to any of the DBMS, the database system or an application associated with the database. Computer scientists may classify database-management systems according to the database models that they support. Relational databases became dominant in the 1980s; these model data as rows and columns in a series of tables, and the vast majority use SQL for writing and querying data. In the 2000s, non-relational databases became popular, referred to as NoSQL because they use different query languages.
Formally, a "database" refers to a set of related data and the way it is organized. Access to this data is usually provided by a "database management system" consisting of an integrated set of computer software that allows users to interact with one or more databases and provides access to all of the data contained in the database; the DBMS provides various functions that allow entry and retrieval of large quantities of information and provides ways to manage how that information is organized. Because of the close relationship between them, the term "database" is often used casually to refer to both a database and the DBMS used to manipulate it. Outside the world of professional information technology, the term database is often used to refer to any collection of related data, as size and usage requirements typically necessitate use of a database management system. Existing DBMSs provide various functions that allow management of a database and its data, which can be classified into four main functional groups: Data definition – Creation, modification and removal of definitions that define the organization of the data.
Update – Insertion, modification and deletion of the actual data. Retrieval – Providing information in a form directly usable or for further processing by other applications; the retrieved data may be made available in a form essentially the same as it is stored in the database or in a new form obtained by altering or combining existing data from the database. Administration – Registering and monitoring users, enforcing data security, monitoring performance, maintaining data integrity, dealing with concurrency control, and recovering information that has been corrupted by some event such as an unexpected system failure. Both a database and its DBMS conform to the principles of a particular database model. "Database system" refers collectively to the database model, the database management system and the database. Physically, database servers are dedicated computers that hold the actual databases and run only the DBMS and related software. Database servers are usually multiprocessor computers, with generous memory and RAID disk arrays used for stable storage.
RAID is used for recovery of data if any of the disks fail. Hardware database accelerators, connected to one or more servers via a high-speed channel, are also used in large-volume transaction processing environments. DBMSs are found at the heart of most database applications. DBMSs may be built around a custom multitasking kernel with built-in networking support, but modern DBMSs typically rely on a standard operating system to provide these functions. Since DBMSs comprise a significant market, computer and storage vendors often take into account DBMS requirements in their own development plans. Databases and DBMSs can be categorized according to the database model that they support, the type of computer they run on, the query language used to access the database, and their internal engineering, which affects performance, scalability and security. The sizes and performance of databases and their respective DBMSs have grown by orders of magnitude. These performance increases were enabled by technology progress in the areas of processors, computer memory, computer storage and computer networks.
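The four functional groups named above (data definition, update, retrieval, administration) can be made concrete with a minimal sketch using Python's built-in sqlite3 module; the table and column names here are invented for illustration, not taken from any particular system.

```python
import sqlite3

# In-memory database; a production server would use dedicated stable storage.
conn = sqlite3.connect(":memory:")

# Data definition: create the structures that organize the data.
conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")

# Update: insertion, modification and deletion of the actual data.
conn.execute("INSERT INTO employee (name, dept) VALUES (?, ?)", ("Ada", "IE"))
conn.execute("INSERT INTO employee (name, dept) VALUES (?, ?)", ("Grace", "CS"))
conn.execute("UPDATE employee SET dept = ? WHERE name = ?", ("SE", "Ada"))

# Retrieval: data may be returned as stored, or in a new form obtained by
# combining existing data (here, an aggregate count per department).
rows = conn.execute(
    "SELECT dept, COUNT(*) FROM employee GROUP BY dept ORDER BY dept"
).fetchall()
print(rows)  # [('CS', 1), ('SE', 1)]

# Administration-style concerns such as integrity are partly enforced by
# declared constraints (e.g. the PRIMARY KEY above); full administration
# (users, security, recovery) is handled by server-class DBMSs.
conn.close()
```

The GROUP BY query illustrates the point made above about retrieval: the result is not stored anywhere in the database in that form, but is derived from the stored rows.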
The development of database technology can be divided into three eras based on data model or structure: navigational, SQL/relational, and post-relational. The two main early navigational data models were the hierarchical model and the CODASYL model. The relational model, first proposed in 1970 by Edgar F. Codd, departed from this tradition by insisting that applications should search for data by content, rather than by following links; the relational model employs sets of ledger-style tables, each used for a different type of entity. Only in the mid-1980s did computing hardware become powerful enough to allow the wide deployment of relational systems. By the early 1990s, relational systems dominated in all large-scale data processing applications, and as of 2018 they remain dominant: IBM DB2, Oracle, MySQL and Microsoft SQL Server are the most searched DBMSs; the dominant database language, standardised SQL for the relational model, has influenced database languages for other data models. Object databases were developed in the 1980s to overcome the inconvenience of object-relational impedance mismatch, which led to the coining of the term "post-relational" and also the development of hybrid object-relational databases.
Systems engineering is an interdisciplinary field of engineering and engineering management that focuses on how to design and manage complex systems over their life cycles. At its core, systems engineering utilizes systems thinking principles to organize this body of knowledge; the individual outcome of such efforts, an engineered system, can be defined as a combination of components that work in synergy to collectively perform a useful function. Issues such as requirements engineering, logistics, coordination of different teams, testing and evaluation, maintainability and many other disciplines necessary for successful system development, design and ultimate decommission become more difficult when dealing with large or complex projects. Systems engineering deals with work-processes, optimization methods and risk management tools in such projects; it overlaps technical and human-centered disciplines such as industrial engineering, mechanical engineering, manufacturing engineering, control engineering, software engineering, electrical engineering, organizational studies, civil engineering and project management.
Systems engineering ensures that all aspects of a project or system are considered and integrated into a whole. The systems engineering process is a discovery process, quite unlike a manufacturing process. A manufacturing process is focused on repetitive activities that achieve high-quality outputs with minimum cost and time; the systems engineering process must begin by discovering the real problems that need to be resolved and identifying the most probable or highest-impact failures that can occur – systems engineering involves finding solutions to these problems. The term systems engineering can be traced back to Bell Telephone Laboratories in the 1940s; the need to identify and manipulate the properties of a system as a whole, which in complex engineering projects may differ from the sum of the parts' properties, motivated various industries, especially those developing systems for the U.S. military, to apply the discipline. When it was no longer possible to rely on design evolution to improve upon a system and the existing tools were not sufficient to meet growing demands, new methods began to be developed that addressed the complexity directly.
The continuing evolution of systems engineering comprises the development and identification of new methods and modeling techniques. These methods aid in a better comprehension of the design and developmental control of engineering systems as they grow more complex. Popular tools that are often used in the systems engineering context were developed during these times, including USL, UML, QFD and IDEF0. In 1990, a professional society for systems engineering, the National Council on Systems Engineering (NCOSE), was founded by representatives from a number of U.S. corporations and organizations. NCOSE was created to address the need for improvements in systems engineering practices and education; as a result of growing involvement from systems engineers outside of the U.S., the name of the organization was changed to the International Council on Systems Engineering (INCOSE) in 1995. Schools in several countries offer graduate programs in systems engineering, and continuing education options are also available for practicing engineers.
Systems engineering signifies only an approach and, more recently, a discipline in engineering. The aim of education in systems engineering is to formalize various approaches and, in doing so, to identify new methods and research opportunities similar to those that occur in other fields of engineering; as an approach, systems engineering is interdisciplinary in flavour. The traditional scope of engineering embraces the conception, development and operation of physical systems. Systems engineering, as originally conceived, falls within this scope. "Systems engineering", in this sense of the term, refers to the building of engineering concepts. The use of the term "systems engineer" has evolved over time to embrace a wider, more holistic concept of "systems" and of engineering processes; this evolution of the definition has been a subject of ongoing controversy, and the term continues to apply to both the narrower and broader scope. Traditional systems engineering was seen as a branch of engineering in the classical sense, that is, as applied only to physical systems, such as spacecraft and aircraft.
More recently, systems engineering has evolved to take on a broader meaning, especially when humans were seen as an essential component of a system. Checkland, for example, captures the broader meaning of systems engineering by stating that 'engineering' "can be read in its general sense". Enterprise systems engineering pertains to the view of enterprises, that is, organizations or combinations of organizations, as systems. Service systems engineering has to do with the engineering of service systems. Checkland defines a service system as a system that is conceived as serving another system. Most civil infrastructure systems are service systems. Systems engineering focuses on analyzing and eliciting customer needs and required functionality early in the development cycle, documenting requirements, then proceeding with design synthesis and system validation while considering the complete problem, the system lifecycle; this includes understanding all of the stakeholders involved. Oliver et al. claim that the systems engineerin
Logico-linguistic modeling is a method for building knowledge-based systems (KBS) with a learning capability, using conceptual models from soft systems methodology (SSM), modal predicate logic, and the Prolog artificial intelligence language. Logico-linguistic modeling is a six-stage method developed for building knowledge-based systems, but it also has application in manual decision support systems and information source analysis. Logico-linguistic models have a superficial similarity to John F. Sowa's conceptual graphs; however, logico-linguistic models are different in both logical form and in their method of construction. Logico-linguistic modeling was developed in order to solve theoretical problems found in the soft systems method for information system design; the main thrust of the research has been to show how soft systems methodology, a method of systems analysis, can be extended into artificial intelligence. SSM employs three modeling devices: rich pictures, root definitions, and conceptual models of human activity systems.
The root definitions and conceptual models are built by the stakeholders themselves in an iterative debate organized by a facilitator. The strengths of this method lie, firstly, in its flexibility and the fact that it can address any problem situation and, secondly, in the fact that the solution belongs to the people in the organization and is not imposed by an outside analyst. Information requirements analysis (IRA) took the basic SSM method a stage further and showed how the conceptual models could be developed into a detailed information system design. IRA calls for the addition of two modeling devices: "Information Categories", which show the required information inputs and outputs from the activities identified in an expanded conceptual model, and the "Maltese Cross". A completed Maltese Cross is sufficient for the detailed design of a transaction processing system; the initial impetus to the development of logico-linguistic modeling was a concern with the theoretical problem of how an information system can have a connection to the physical world.
This is a problem in both IRA and more established methods, because none of them base their information system design on models of the physical world. IRA designs are based on a notional conceptual model, and SSADM is based on models of the movement of documents; the solution to these problems provided a formula that is not limited to the design of transaction processing systems but can also be used for the design of KBS with learning capability. The logico-linguistic modeling method comprises six stages. In the first stage, logico-linguistic modeling uses SSM for systems analysis; this stage seeks to structure the problem in the client organization by identifying stakeholders, modelling organizational objectives and discussing possible solutions. At this stage it is not assumed that a KBS will be the solution, and logico-linguistic modeling can produce solutions that do not require a computerized KBS. Expert systems tend to capture the expertise of individuals in different organizations on the same topic. By contrast, a KBS produced by logico-linguistic modeling seeks to capture the expertise of individuals in the same organization on different topics.
The emphasis is on the elicitation of organizational or group knowledge rather than that of individual experts. In logico-linguistic modeling the stakeholders become the experts; the end point of this stage is an SSM-style conceptual model such as figure 1. According to the theory behind logico-linguistic modeling, the SSM conceptual model building process is a Wittgensteinian language-game in which the stakeholders build a language to describe the problem situation; the logico-linguistic model expresses this language as a set of definitions, see figure 2. After the model of the language has been built, putative knowledge about the real world can be added by the stakeholders. Traditional SSM conceptual models contain only one logical connective. In order to represent causal sequences, "sufficient conditions" and "necessary and sufficient conditions" are required. In logico-linguistic modeling this deficiency is remedied by two additional types of connective; the outcome of stage three is an empirical model, see figure 3.
Modal predicate logic is used as the formal method of knowledge representation. The connectives from the language model are logically true, while connectives added at the knowledge elicitation stage are possibly true. Before proceeding to stage 5, the models are expressed in logical formulae. Formulae in predicate logic translate readily into the Prolog artificial intelligence language; the modality is expressed by two different types of Prolog rules. Rules taken from the language creation stage of the model building process are treated as incorrigible, while rules from the knowledge elicitation stage are marked as hypothetical; the system thus has a built-in learning capability. A knowledge-based system built using this method verifies itself. Verification is not a one-off event; it is an ongoing process. If the stakeholders' beliefs about the real world are mistaken, this will be brought out by the addition of Prolog facts that conflict with the hypothetical rules; this operates in accordance with the classic principle of falsifiability found in the philosophy of science.
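The distinction between incorrigible and hypothetical rules, and the falsification of the latter by conflicting facts, can be sketched in a few lines of Python rather than Prolog. This is a simplified illustration of the general mechanism, not an implementation from the method's literature; the rules, predicates and the `not_` negation convention are all invented for the example.

```python
# Minimal sketch of a knowledge base that falsifies hypothetical rules.
# Incorrigible rules (from the language model) are never retracted;
# hypothetical rules (from knowledge elicitation) are dropped as soon as
# a stored fact contradicts one of their conclusions.

class KnowledgeBase:
    def __init__(self):
        self.facts = set()        # ground facts, e.g. ("bird", "tweety")
        self.incorrigible = []    # definitional rules: (premise, conclusion)
        self.hypothetical = []    # falsifiable rules: (premise, conclusion)

    def add_rule(self, rule, hypothetical=False):
        (self.hypothetical if hypothetical else self.incorrigible).append(rule)

    def add_fact(self, fact):
        self.facts.add(fact)
        # Learning step: retract every hypothetical rule the facts refute.
        self.hypothetical = [r for r in self.hypothetical if not self._refuted(r)]

    def _refuted(self, rule):
        premise, conclusion = rule
        # "premise -> conclusion" is refuted by an individual for which the
        # premise holds but the negated conclusion is also a known fact.
        for pred, individual in self.facts:
            if pred == premise and ("not_" + conclusion, individual) in self.facts:
                return True
        return False

kb = KnowledgeBase()
kb.add_rule(("bird", "animal"), hypothetical=False)  # definitional: birds are animals
kb.add_rule(("bird", "flies"), hypothetical=True)    # putative knowledge: birds fly
kb.add_fact(("bird", "tweety"))
kb.add_fact(("not_flies", "tweety"))                 # conflicting observation
print(len(kb.hypothetical))  # 0 – the "birds fly" rule has been falsified
```

The incorrigible rule survives the conflicting observation while the hypothetical one is retracted, mirroring the falsifiability principle described above.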
In systems engineering and software engineering, requirements analysis encompasses those tasks that go into determining the needs or conditions to meet for a new or altered product or project, taking account of the possibly conflicting requirements of the various stakeholders, and documenting and managing software or system requirements. Requirements analysis is critical to the success or failure of a systems or software project; the requirements should be documented, measurable, traceable, related to identified business needs or opportunities, and defined to a level of detail sufficient for system design. Conceptually, requirements analysis includes three types of activities: Eliciting requirements: e.g. business process documentation and stakeholder interviews; this is sometimes called requirements gathering or requirements discovery. Analyzing requirements: determining whether the stated requirements are clear, complete and unambiguous, and resolving any apparent conflicts. Recording requirements: Requirements may be documented in various forms, including a summary list, and may include natural-language documents, use cases, user stories, process specifications and a variety of models including data models.
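The qualities listed above for a well-formed requirement (documented, measurable, traceable, tied to a business need) can be sketched as a simple record structure. This is only an illustrative sketch, not a standard schema; every field name and example value is invented.

```python
from dataclasses import dataclass, field

# Illustrative record for one requirement, capturing the qualities named
# above: documented, measurable, traceable, related to a business need.
@dataclass
class Requirement:
    req_id: str                  # unique identifier, enabling traceability
    statement: str               # the documented requirement itself
    acceptance_criterion: str    # how the requirement will be measured/verified
    business_need: str           # identified business need or opportunity served
    source: str                  # stakeholder or document it was elicited from
    traces_to: list = field(default_factory=list)  # derived lower-level req ids

r = Requirement(
    req_id="REQ-001",
    statement="The system shall confirm an order within 2 seconds.",
    acceptance_criterion="95th-percentile confirmation latency <= 2 s under nominal load",
    business_need="Reduce checkout abandonment",
    source="Stakeholder interview, sales team",
)
r.traces_to.append("REQ-014")  # link to a derived lower-level requirement
print(r.req_id, len(r.traces_to))  # REQ-001 1
```

Keeping the acceptance criterion and trace links alongside the statement is one simple way to make a requirements list measurable and traceable rather than a bare shopping list of wishes.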
Requirements analysis can be a long and tiring process during which many delicate psychological skills are involved. Large systems may confront analysts with thousands of system requirements. New systems change the environment and relationships between people, so it is important to identify all the stakeholders, take into account all their needs and ensure they understand the implications of the new systems. Analysts can employ several techniques to elicit the requirements from the customer; these may include the development of scenarios, the identification of use cases, the use of workplace observation or ethnography, holding interviews or focus groups, and creating requirements lists. Prototyping may be used to develop an example system. Where necessary, the analyst will employ a combination of these methods to establish the exact requirements of the stakeholders, so that a system that meets the business needs is produced. Requirements quality can be improved through these and other methods: Visualization – using tools that promote better understanding of the desired end-product, such as visualization and simulation. Consistent use of templates – producing a consistent set of models and templates to document the requirements. Documenting dependencies – documenting dependencies and interrelationships among requirements, as well as any assumptions and constraints. See Stakeholder analysis for a discussion of the people or organizations that have a valid interest in the system; they may be affected by it either directly or indirectly. A major new emphasis in the 1990s was a focus on the identification of stakeholders; it is now widely recognized that stakeholders are not limited to the organization employing the analyst. Other stakeholders will include: anyone who operates the system, anyone who benefits from the system, and anyone involved in purchasing or procuring the system. In a mass-market product organization, product management and sometimes sales act as surrogate consumers to guide development of the product. Further stakeholders include organizations which regulate aspects of the system, people or organizations opposed to the system, and organizations responsible for systems which interface with the system under design.
Those organizations that integrate horizontally with the organization for which the analyst is designing the system are also stakeholders. Requirements often have cross-functional implications that are unknown to individual stakeholders and may be missed or incompletely defined during stakeholder interviews; these cross-functional implications can be elicited by conducting joint requirements development (JRD) sessions in a controlled environment, facilitated by a trained facilitator, wherein stakeholders participate in discussions to elicit requirements, analyze their details and uncover cross-functional implications. A dedicated scribe should be present to document the discussion, freeing up the business analyst to lead the discussion in a direction that generates appropriate requirements which meet the session objective. JRD sessions are analogous to joint application design (JAD) sessions. In the former, the sessions elicit requirements that guide design, whereas the latter elicit the specific design features to be implemented in satisfaction of elicited requirements.
One traditional way of documenting requirements has been contract-style requirement lists. In a complex system such requirements lists can run to hundreds of pages; an appropriate metaphor would be an extremely long shopping list. Such lists are very much out of favour in modern analysis. Their strengths are that they provide a checklist of requirements, provide a contract between the project sponsor and developers and, for a large system, can provide a high-level description from which lower-level requirements can be derived. Their weaknesses are that such lists can run to hundreds of pages and are not intended to serve as a reader-friendly description of the desired application; such requirements lists abstract all the requirements, so there is little context. The business analyst may include context for requirements in accompanying design documentation.
A data model is an abstract model that organizes elements of data and standardizes how they relate to one another and to the properties of real-world entities. For instance, a data model may specify that the data element representing a car be composed of a number of other elements which, in turn, represent the color and size of the car and define its owner; the term data model is used in two distinct but related senses. Sometimes it refers to an abstract formalization of the objects and relationships found in a particular application domain, for example the customers and orders found in a manufacturing organization. At other times it refers to the set of concepts used in defining such formalizations: for example concepts such as entities, relations, or tables. So the "data model" of a banking application may be defined using the entity-relationship "data model"; this article uses the term in both senses. A data model explicitly determines the structure of data. Data models are specified in a data modeling notation, which is often graphical in form.
A data model can sometimes be referred to as a data structure, especially in the context of programming languages. Data models are often complemented by function models, especially in the context of enterprise models. Managing large quantities of structured and unstructured data is a primary function of information systems. Data models describe the structure and integrity aspects of the data stored in data management systems such as relational databases; they typically do not describe unstructured data, such as word processing documents, email messages, digital audio and video. The main aim of data models is to support the development of information systems by providing the definition and format of data. According to West and Fowler, "if this is done consistently across systems then compatibility of data can be achieved. If the same data structures are used to store and access data then different applications can share data. The results of this are indicated above. However, systems and interfaces often cost more than they should, to build, operate, and maintain. They may also constrain the business rather than support it.
A major cause is that the quality of the data models implemented in systems and interfaces is poor". "Business rules, specific to how things are done in a particular place, are often fixed in the structure of a data model. This means that small changes in the way business is conducted lead to large changes in computer systems and interfaces". "Entity types are often not identified, or incorrectly identified. This can lead to replication of data, data structure and functionality, together with the attendant costs of that duplication in development and maintenance". "Data models for different systems are arbitrarily different. The result of this is that complex interfaces are required between systems that share data; these interfaces can account for between 25% and 70% of the cost of current systems". "Data cannot be shared electronically with customers and suppliers, because the structure and meaning of data has not been standardized. For example, engineering design data and drawings for process plant are still sometimes exchanged on paper"; the reason for these problems is a lack of standards that will ensure that data models will both meet business needs and be consistent.
A data model explicitly determines the structure of data. Typical applications of data models include database models, design of information systems, and enabling exchange of data. Data models are specified in a data modeling language. According to ANSI in 1975, a data model instance may be one of three kinds: Conceptual data model: describes the semantics of a domain, being the scope of the model. For example, it may be a model of the interest area of an industry; this consists of entity classes, representing kinds of things of significance in the domain, and relationship assertions about associations between pairs of entity classes. A conceptual schema specifies the kinds of facts or propositions that can be expressed using the model. In that sense, it defines the allowed expressions in an artificial 'language' with a scope that is limited by the scope of the model. Logical data model: describes the semantics, as represented by a particular data manipulation technology; this consists of descriptions of tables and columns, object-oriented classes, XML tags, among other things.
Physical data model: describes the physical means by which data are stored. This is concerned with partitions, CPUs, tablespaces and the like; the significance of this approach, according to ANSI, is that it allows the three perspectives to be relatively independent of each other. Storage technology can change without affecting the conceptual model; the table/column structure can change without affecting the conceptual model. In each case, of course, the structures must remain consistent with the other model; the table/column structure may be different from a direct translation of the entity classes and attributes, but it must ultimately carry out the objectives of the conceptual entity class structure. Early phases of many software development projects emphasize the design of a conceptual data model; such a design can be detailed into a logical data model. In later stages, this model may be translated into a physical data model. However, it is also possible to implement a conceptual model directly. One of the earliest pioneering works in modelling information systems was done by Young and Kent, who argued for "a precise and abstract way of specifying the informational and time characteristics of a data processing problem".
They wanted to create "a notation that should enable the analyst to organize the problem around any piece of hardware".
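The independence of the three perspectives can be illustrated with a small sketch. The entity classes, attributes, and table names below (Customer, Order, and so on) are hypothetical, chosen only to show how a conceptual entity class and relationship assertion might be detailed into a logical table description, while physical concerns stay out of the picture entirely:

```python
from dataclasses import dataclass, field

# Conceptual level: entity classes and a relationship assertion,
# with no commitment to any data manipulation technology.
@dataclass
class EntityClass:
    name: str
    attributes: list[str] = field(default_factory=list)

@dataclass
class Relationship:
    subject: EntityClass
    verb: str
    obj: EntityClass

customer = EntityClass("Customer", ["name", "email"])
order = EntityClass("Order", ["order_date", "total"])
places = Relationship(customer, "places", order)

# Logical level: the same semantics rendered as table/column
# descriptions for a relational technology.
def to_table_ddl(entity: EntityClass) -> str:
    cols = ", ".join(f"{a} TEXT" for a in entity.attributes)
    return f"CREATE TABLE {entity.name.lower()} (id INTEGER PRIMARY KEY, {cols});"

ddl = [to_table_ddl(customer), to_table_ddl(order)]
print(ddl[0])
# -> CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, email TEXT);
```

Note what is absent: nothing at either level mentions partitions, CPUs, or storage engines. Those physical choices could change without touching the conceptual entities or the logical table structure, which is exactly the independence the ANSI three-level scheme is after.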
Howard T. Odum
Howard Thomas Odum, cited as H. T. Odum, was an American ecologist. He is known for his pioneering work on ecosystem ecology and for his provocative proposals for additional laws of thermodynamics, informed by his work on general systems theory. Odum was the third child of Howard W. Odum, an American sociologist, and his wife Anna Louise Odum; he was the younger brother of Eugene Odum. Their father "encouraged his sons to go into science and to develop new techniques to contribute to social progress." Howard learned his early scientific lessons about birds from his brother, about fish and the philosophy of biology while working after school for the marine zoologist Robert Coker, and about electrical circuits from The Boy Electrician by Alfred Powell Morgan. Howard Thomas studied biology at the University of North Carolina at Chapel Hill, where he published his first paper while still an undergraduate. His education was interrupted for three years by his World War II service with the Army Air Force in Puerto Rico and the Panama Canal Zone, where he worked as a tropical meteorologist.
After the war, he returned to the University of North Carolina and completed his B.S. in zoology in 1947. In 1947, Odum married Virginia Wood. After her death in 1973, he married Elizabeth C. Odum in 1974. Odum's advice on how to manage a blended family was to be sure to keep talking. In 1950, Howard earned his Ph.D. in zoology at Yale University under the guidance of G. Evelyn Hutchinson; his dissertation was titled The Biogeochemistry of Strontium: With Discussion on the Ecological Integration of Elements. This step took him from his early interest in ornithology into the emerging field of systems ecology: he made a meteorological "analysis of the global circulation of strontium, anticipated in the late 1940s the view of the earth as one great ecosystem." While at Yale, Howard began his lifelong collaborations with his brother Eugene. In 1953, they published the first English-language textbook on systems ecology, Fundamentals of Ecology. Howard wrote the chapter on energetics, and they continued to collaborate in research as well as writing for the rest of their lives.
For Howard, his energy systems language was itself a collaborative tool. From 1956 to 1963, Odum worked as the Director of the Marine Institute of the University of Texas; during this time, he became aware of the interplay of economic forces. He then taught at the University of North Carolina at Chapel Hill, where he was a member of the Department of Zoology and one of the professors in the new Curriculum of Marine Sciences, until 1970. That year he moved to the University of Florida, where he taught in the Environmental Engineering Sciences Department and directed the Center for Environmental Policy. In 1973 he founded the University's Center for Wetlands, the first center of its kind in the world, which is still in operation today. Odum continued this work for 26 years until his retirement in 1996. In the 1960s and 1970s, Odum was chairman of the International Biological Program's Tropical Biome planning committee. He was supported by large contracts with the United States Atomic Energy Commission, resulting in participation by nearly 100 scientists, who conducted radiation studies of a tropical rainforest. His featured project at the University of Florida in the 1970s was recycling treated sewage into cypress swamps.
This was one of the first projects to explore the now widespread approach of using wetlands as water quality improvement ecosystems, and it is one of his most important contributions to the beginnings of the field of ecological engineering. In his last years, Odum was Graduate Research Professor Emeritus and Director of the Center for Environmental Policy. He was an avid birdwatcher in both his professional and personal life. The Ecological Society awarded Odum its Mercer Award to recognize his contributions to the study of the coral reef on Eniwetok Atoll. Odum also received the French Prix de Vie and the Crafoord Prize of the Royal Swedish Academy of Science, which is considered the Nobel equivalent for fields of bioscience not honored by a Nobel Prize. Charles A. S. Hall has described Odum as one of the most important thinkers of our time, noting that Howard Odum, either alone or with his brother Eugene, received all of the international prizes awarded to ecologists. The only higher education institute to award honorary degrees to both Odum brothers was The Ohio State University, which honored H.
T. in 1995 and Gene in 1999. Odum's contributions to this field have been recognised by the Mars Society, which named its experimental station the "H. T. Odum greenhouse" at the suggestion of his former student Patrick Kangas. Kangas and his student, David Blersch, made significant contributions to the design of the waste water recycling system on the station. Odum's students have furthered his work at institutions around the world, most notably Mark Brown at the University of Florida, David Tilley and Patrick Kangas at the University of Maryland, Daniel Campbell at the United States Environmental Protection Agency, Enrique Ortega at UNICAMP in Brazil, and Sergio Ulgiati at the University of Siena. Work done at these institutions continues to propagate Odum's concept of emergy. His former students Bill Mitsch, Robert Costanza, and Scott W. Nixon are among a cadre who have been recognized internationally for their contributions to ecological engineering and ecology.