Translate the text with a dictionary.



TEXT 1

A SHORT HISTORY OF COMPUTERS AND INFORMATION TECHNOLOGY

Almost 300 years passed between the invention of the first mechanical computer and the invention of the first electronic computer. In 1642 in France, Blaise Pascal, who was 19 at the time, grew tired of adding long columns of figures in his father's tax office and designed a mechanical device consisting of a series of numbered wheels with gears for decimal reckoning, which could add and subtract the long columns of figures. Thirty years later, a German, Gottfried Leibniz, invented the Leibniz wheel using similar principles; it could do not only subtraction and addition, but also multiplication and division.

Almost a hundred years passed before Charles Babbage designed the first universal automatic calculator. Again, it was a mechanical device using counting wheels, coping with 1000 words of 50 digits each, but with one vital difference: he used punched cards to control the programme. Punched cards were also used as input and output devices. The machine contained all the functions necessary in a modern computer - an input unit, a store or memory, an arithmetic unit, a control unit and an output unit. Improvements were made by Pehr Scheutz in Sweden, and a machine similar to Babbage's, capable of printing out its own tables, was built in 1854. Almost forty years passed before H. Hollerith, in America, developed a machine for tabulating population statistics for the 1890 census. Holes in punched cards were used to denote age, sex, etc., and the cards were made the size of a dollar bill. Another forty years later, Vannevar Bush in the USA developed an early analogue computer for solving differential equations, and analogue computers were built by several universities (e.g. Manchester University in the UK in 1934). The first electronic digital computers were not purely electronic but electromechanical. In 1937 Howard H. Aiken of Harvard University designed an electromechanical automatic sequence-controlled calculator, which was built by IBM and presented to Harvard seven years later. A relay-operated computer was built by Stibitz, of Bell Laboratories, at about the same time. The first truly electronic computer was the ENIAC (Electronic Numerical Integrator and Computer), begun in 1942 by the University of Pennsylvania and completed in 1946. It used 18,000 tubes and was 51 feet long and 8 feet high. The numbers used in this machine could be added in 200 microseconds and multiplied in 2,300 microseconds, making it the fastest calculator developed up to that time. From the mid-1940s, a series of computers were built, each using later electronic techniques as they were developed, i.e. tubes to transistors, transistors to integrated circuits, each becoming smaller and smaller, until present-day microcomputers were produced.

TEXT

COMPACT DISC

The invention of the compact disc (CD) was the result of research carried out on the video disc by the Dutch electronics company Philips NV. Under a joint licensing agreement by Philips and the Japanese company Sony, the CD was first developed in 1979. A process of digital recording is used, rather than the analogue recording process.

The signal is coded in binary form, using a series of 0s and 1s. The sound is reproduced by a laser beam. The compact disc has a diameter of 5 inches and can hold 75 minutes of music or sound on one side.
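As a simple illustration of digital recording (the sample values below are invented, not taken from any real disc), each sound sample is stored as a fixed-width binary number; a short Python sketch:

    # Digital recording: the signal is sampled and each sample is stored
    # as a binary number. These 8-bit sample values are hypothetical.
    samples = [0, 64, 128, 255]
    encoded = [format(s, "08b") for s in samples]
    print(encoded)   # ['00000000', '01000000', '10000000', '11111111']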

The CD was first marketed in 1983, and by 1991 it had outstripped both traditional forms of recorded music - records and tapes - in terms of unit sales and value. In the space of a few years, the CD has achieved incredible success, and its applications are many and varied. In 1984 Philips and Matsushita brought out prototypes of decoders that enabled fixed images, which had been stored on CDs alongside an audio signal, to be viewed on television. In 1985 the extensive storage capacity of CDs was applied to computers. CD players now have the capability of running a disc at twice the normal speed, which makes it possible to record an hour of music.

Text 2

Read the text

Information systems

The term information system (IS) sometimes refers to a system of persons, data records and activities that process the data and information in an organization, and it includes the organization's manual and automated processes. Computer-based information systems are the field of study for information technology, elements of which are sometimes called an "information system" as well, a usage some consider to be incorrect.

The term "information system" has different meanings:

In computer security, an information system is described by three objects.

In knowledge representation, an information system consists of three components: human, technology and organization. In this view, information is defined in terms of the three levels of semiotics. Data which can be automatically processed by the application system corresponds to the syntax level. In the context of an individual who interprets the data, they become information, which corresponds to the semantic level. Information becomes knowledge when an individual knows (understands) and evaluates the information (e.g., for a specific task); this corresponds to the pragmatic level.
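A minimal Python sketch of these three levels (the variable names and the temperature threshold are invented for illustration):

    # Syntax level: data as uninterpreted symbols.
    raw = b"23.5"
    value = float(raw)                  # well-formed data the system can process

    # Semantic level: an individual interprets the data, so it becomes information.
    information = {"temperature_c": value}

    # Pragmatic level: the information is evaluated for a task, becoming knowledge.
    action = "open window" if information["temperature_c"] > 22.0 else "do nothing"
    print(action)                       # open window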

In organizational informatics an information system is a system of communication between people. Information systems are systems involved in the gathering, processing, distribution and use of information and as such support human activity systems.

 

TEXT

INFORMATION TECHNOLOGY

The definition of information technology (IT) is as follows: the use of technology to provide the capture, storage, retrieval, analysis and communication of information, which can be done in the form of data, text, image or voice.

With the invention and exploitation of the integrated circuit or ‘chip’ since the 1960s, the growth of applications using electronics has been phenomenal. Modern electronic computers can process data, graphics and speech at extremely fast rates. The microprocessor is at the heart of what is known as the IT revolution.

Information and communications technologies are changing the way we work, study, do research, and educate our children and ourselves. They are influencing the way we do our banking, pay our bills, entertain ourselves and do business. New options (choices) are being provided for us in the fields of health care, education, environmental protection, culture and business. Computers control washing machines, cookers, televisions, telephones, home computers, cameras, video games, digital watches and many other devices. Offices and factories now use microprocessors in everyday life, as do cars, fax machines, aircraft flight control systems, railway signalling, police computer databases, etc.

The aim of the IT revolution has been to transform labour-intensive work, such as the mining, agriculture, iron, steel and cotton industries, hardware manufacturing, etc., into industries where a few highly skilled workers manage large factories with mainly automated labour.

Computers in our life

When Charles Babbage, a professor of Mathematics at Cambridge University, invented the first calculating machine in 1812, he could not have imagined the situation we find ourselves in today. Nearly everything we do in the world is helped, or even controlled, by computers, the complicated descendants of his simple machine. Computers are used more and more often in the world today, for the simple reason that they are far more efficient than human beings. They have much better memories and they can store vast amounts of information. No man alive can do 500,000 sums in one second, but a computer can. In fact, computers can do many of the things we do, but faster and better. They can predict the weather, and even play chess, write poetry or compose music.

Computers in medicine

Computers are of great importance in the modern hospital. The chief use of computers is storing and sorting the medical knowledge which has been acquired in the last 50 years. No doctor can possibly keep up with all the discoveries. The only solution to the problem is to store medical knowledge in a computer. Today there are medical computer centres where all existing knowledge of the symptoms of various diseases and of their treatment is stored. Doctors feed data on symptoms into the computer and get the necessary information on correct diagnosis and treatment.

 

SWS 3

Translate the text with a dictionary.

THE MOUSE

The mouse is a small device that slides in all directions on a desk, which makes it possible to interact naturally with the computer. Its use was popularized by Apple with the Lisa and the Macintosh models in 1983. However, it was the little-known American inventor Douglas Engelbart who conceived and designed the mouse at the Stanford Research Institute in the mid-1960s. His brilliant idea was to have the computer operator place his or her hand on a small box, or mouse. A sphere on the underside of the mouse is used to measure movements, which are then transmitted to the computer via a lead - the tail of the mouse. These movements are translated to the cursor on the screen: if the mouse is pushed to the right, the cursor goes to the right; if the mouse is pushed away from the user, the cursor moves up, and so on.
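A rough Python sketch of the translation the text describes (the screen positions are invented, and real mouse drivers are far more involved):

    # Translate mouse displacements into cursor movements on the screen.
    cursor_x, cursor_y = 400, 300       # hypothetical starting position

    def move_cursor(dx, dy):
        # dx > 0: mouse pushed right -> cursor moves right.
        # dy > 0: mouse pushed away from the user -> cursor moves up
        # (screen y coordinates usually grow downwards, hence the minus).
        global cursor_x, cursor_y
        cursor_x += dx
        cursor_y -= dy
        print(f"cursor at ({cursor_x}, {cursor_y})")

    move_cursor(10, 0)   # cursor at (410, 300)
    move_cursor(0, 5)    # cursor at (410, 295)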

Google Personalizes Search with SearchWiki

November 20, 2008

By Brad Stone

Google is set on Thursday to significantly change the way some people use its search engine. The company is introducing a new feature called SearchWiki that will allow people to modify and save their results for specific Google searches. They can move the sites that appear in the rankings up or down, take them out altogether, leave notes next to specific sites and suggest new sites that are not already in the results (or are buried too far down in the results to see). Users must be logged in to Google to use SearchWiki and can revisit their annotations when they perform the same search later.

Screenshot of Google's new SearchWiki feature.

The company is also making these annotations public, in a move that may either deter Google users from writing anything too personal on SearchWiki or encourage spammers to exploit the tool. At the bottom of every Google search results page, logged-in Google users will see a link that says, "See all notes for this SearchWiki." Clicking on it allows users to see how other people have re-ranked results or commented on sites. At least at first, there will not be any way to make these notes private, Google says, but users can change or delete their notes at any time.

SearchWiki may essentially allow users to rank and review the top sites for common searches - like "Indian restaurants in San Francisco," for example. That could spur users to evaluate businesses and push Google into direct competition with review sites such as Yelp.com and CitySearch. Marissa Mayer, Google's vice president of search product and user experience, says that in internal tests, people's notes in SearchWiki have tended to be more about the relevancy of the Web site to that particular search term.

TEXT

COMPUTER GENERATIONS

The first generation (1951-1959). The first generation of computers is usually thought of as beginning with the UNIVAC I in 1951. First-generation machines used vacuum tubes, their speed was measured in milliseconds (thousandths of a second), and data input and output were usually based on punched cards. First-generation computers typically filled a very large room and were used primarily for research. In early 1951 the first UNIVAC I became operational at the Census Bureau. When it displaced IBM punched-card equipment at the Census Bureau, Thomas J. Watson, the son of IBM's founder, reacted quickly to move IBM into the computer age. The first computer acquired for data processing and record keeping by a business organization was another UNIVAC I, installed in 1954 at General Electric's Appliance Park in Louisville, Kentucky. The IBM 650 entered service in Boston in late 1954. A comparatively inexpensive machine for that time, it was widely accepted, and it gave IBM the leadership in computer production in 1955.

In the period from 1954 to 1959, many businesses acquired computers for data processing purposes, even though these first-generation machines had been designed for scientific uses. Nonscientists generally saw the computer as an accounting tool, and the first business applications were designed to process routine tasks such as payrolls. The full potential of the computer was underestimated, and many firms used computers simply because it was the prestigious thing to do. But we shouldn't judge the early users of computers too harshly. They were pioneering in the use of a new tool. They had to staff their computer installations with a new breed of workers, and they had to prepare programmes in a tedious machine language. In spite of these obstacles, the computer was a fast and accurate processor of mountains of paper.

The second generation (1959-1964). The invention of the transistor led to computers that were both smaller and faster. During this period they were about the size of a closet, and operated in microseconds (millionths of a second). Internal memory was magnetic, and magnetic tapes and disks, as well as punched cards, were used for input, output and storage. Computers were still fairly specialized: although they could now be used for business as well as scientific applications, one computer could not perform both tasks.

The computers of the second generation, which began to appear in 1959, were made smaller and faster and had greater computing capacity. The practice of writing applications programmes in machine language gave way to the use of higher-level programming languages. And the vacuum tube, with its relatively short life, gave way to transistors, which had been developed at Bell Laboratories in 1947 by John Bardeen, William Shockley and Walter Brattain.

The third generation and beyond. There is general agreement that the third generation began in 1964 with the introduction of the IBM System/360, which could handle both scientific and business computing. Computers shrank to the size of a large desk, and processing time shrank to nanoseconds (billionths of a second). Instead of individual transistors, as in the second generation, third-generation computers used integrated circuits, or ICs, which combined hundreds or even thousands of transistors on a single silicon chip. Instead of having a single operator and doing just one task at a time, the computer could work with different people, giving them different tasks simultaneously.

Innovation and expansion have continued in the computer industry, but it is hard to name a specific development that marked the end of the third generation and the beginning of the fourth. Advances in chip design led to further modernisation, and ICs gave way to the microprocessor, the so-called 'computer on a chip'.

The personal computer revolution started in the mid-1970s, and in the last few years more and more PCs have begun to be connected to other PCs and to minicomputer systems.

 

 

Future Trends

The movement toward professionalization of the computer industry and software engineering is continuing. There has been renewed attention to ethical issues within the engineering and computing professions, with special attention given to the well-being of the client and the user of software engineering artifacts. The ACM and the IEEE Computer Society adopted the Software Engineering Code of Ethics and Professional Practice in 1998. Since that time it has been adopted by other computing professional organizations and by large and small software development companies as a standard of practice. This has occurred in spite of the fact that, in addition to not causing harm, the Code requires software engineers to do what they can to prevent harm. This means that a software engineer is expected to report another software engineer's faulty work. Adherence to the Code requires reporting any signs of danger from computer systems, whether or not the individual reporting the risk designed the system. There are also significant movements toward licensing software engineers and the codification of development standards. Clients are beginning to understand the responsibilities of software engineers and to hold them responsible to those standards. The next step in software engineering ethics is a change in culture in which there is a mutual understanding and expectation of ethical behavior of software engineers and those who are impacted by their products.

Robot

A robot is a mechanical or virtual artificial agent. In practice, it is usually an electro-mechanical system which, by its appearance or movements, conveys a sense that it has intent or agency of its own. The word robot can refer to both physical robots and virtual software agents, but the latter are usually referred to as bots. There is no consensus on which machines qualify as robots, but there is general agreement among experts and the public that robots tend to do some or all of the following: move around, operate a mechanical arm, sense and manipulate their environment, and exhibit intelligent behavior, especially behavior which mimics humans or animals.

Stories of artificial helpers and companions and attempts to create them have a long history, but fully autonomous machines only appeared in the 20th century. The first digitally operated and programmable robot, the Unimate, was installed in 1961 to lift hot pieces of metal from a die casting machine and stack them.

Today, commercial and industrial robots are in widespread use performing jobs more cheaply or with greater accuracy and reliability than humans. They are also employed for jobs which are too dirty, dangerous or dull to be suitable for humans. Robots are widely used in manufacturing, assembly and packing, transport, earth and space exploration, surgery, weaponry, laboratory research, and mass production of consumer and industrial goods.

PROGRAMMING LANGUAGES

Most courses in programming begin with a lesson on the binary system, although most programmers seldom have to use binary numbers in actual practice. The reason for studying the binary system is to understand the nature of the computer and the way it operates. Understanding the binary system and its correspondence to the switches inside the machine helps to take the mystery out of computers. Above all, it is the programmer who must realize that the machine is controlled by human beings - and that he is the one who is going to control and direct it.
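As a short illustration (not part of the original text), the correspondence between a decimal number and its binary "switch pattern" can be shown in a few lines of Python:

    # Convert a decimal number to binary by repeated division by 2 -
    # the classic first exercise in a lesson on the binary system.
    def to_binary(n):
        digits = []
        while n > 0:
            digits.append(str(n % 2))   # the remainder is the next binary digit
            n //= 2
        return "".join(reversed(digits)) or "0"

    print(to_binary(13))     # 1101, i.e. 8 + 4 + 0 + 1
    print(int("1101", 2))    # 13, converting back with a built-in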

 

There are many programming languages. They are known by such names as FORTRAN, COBOL, ALGOL, PL/1 and APL. FORTRAN is used primarily for scientific work, and COBOL is used for most commercial applications. COBOL and FORTRAN are the most common of the programming languages. Of the others, ALGOL and APL are used primarily for scientific work, while PL/1 is employed for general-purpose programming. How does the machine understand instructions in one of these languages if the only language to which the machine can react is machine language? It understands them by means of an interpreter, just as an American diplomat at the United Nations communicates with a Chinese diplomat through an interpreter. The interpreter for a computer is a systems program. The systems programmes are part of the software, but they are supplied by the manufacturer of the machine. One of the systems programmes is called the compiler. The computer takes each instruction in the programme and translates it into machine language - that is, into a binary equivalent. It is this translated programme that activates the millions of switches in the machine during processing. There is a separate compiler for each of the standard programming languages.
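As a toy illustration of that translation step (the instruction names and opcodes below are invented; no real compiler is this simple), each instruction can be mapped to a binary equivalent:

    # Toy "compiler": translate a tiny made-up language into binary opcodes.
    OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

    def compile_program(lines):
        machine_code = []
        for line in lines:
            op, operand = line.split()
            # Each instruction becomes an opcode plus a 4-bit operand.
            machine_code.append(OPCODES[op] + format(int(operand), "04b"))
        return machine_code

    print(compile_program(["LOAD 2", "ADD 3", "STORE 5"]))
    # ['00010010', '00100011', '00110101']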

COMPUTER SYSTEMS

A modern computer comprises two basic parts - the hardware and the software. The hardware comprises the computer and all its peripherals. This includes the monitor, keyboard, mouse, printer, etc. The software consists of the 'instructions' that control the functions of the computer. Software is generally stored on magnetic disks, CDs or tapes. Software can be subdivided into 'control' or 'operating system' software and applications software. In small modern PCs, the operating system may be 'DOS' (Disk Operating System) or 'Windows'. Software is not inexpensive. All digital computers operate by adding, subtracting, multiplying and dividing numbers at incredibly high speeds.

Great advances have been made in the storage or memory system of a computer since the invention of the first calculator. Instead of earlier systems such as magnetic tapes, cores and drums, disc storage is used in modern computers. Digital compact discs, invented in 1978 by Philips, are unaffected by dust, scratches and fingerprints and are now widely used because of their high storage capacity.

The integrated circuit memory chip consists of many thousands of transistors. It is able to store and extract a given pulse, or bit, when required. The chip is composed of a number of cells. Each cell contains a byte (a number composed of eight bits). There are two kinds of chips: ROM (Read Only Memory) and RAM (Random Access Memory) chips. They can give out their stored programs when needed. In the case of the ROM, this data is fixed, whereas the RAM chip can be instructed to remember new patterns.
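A small Python sketch of a byte as eight bits, and of the difference between fixed and rewritable contents (the values are illustrative only):

    # A byte is a number composed of eight bits.
    value = 0b10110010
    print(value)                  # 178 in decimal
    print(format(value, "08b"))   # 10110010 - the eight individual bits

    ROM_CELL = 0b01010101         # ROM: contents are fixed by convention
    ram_cell = 0b00000000
    ram_cell = 0b11111111         # RAM: can be instructed to remember a new pattern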

Calculations are normally performed by using binary arithmetic. Only two digits, 0 and 1, are used in the binary system, instead of the ten digits, 0 to 9, of the decimal system.
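A worked example in Python (the values are chosen arbitrarily): adding 5 and 3 in binary proceeds column by column with carries, just as decimal addition does:

    a = 0b0101      # 5
    b = 0b0011      # 3
    print(format(a + b, "04b"))   # 1000, i.e. 8: 0101 + 0011 = 1000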

As the computer operates in binary code, for a programming language it is necessary to convert words or instructions into suitable binary codes. With early low-level languages, every action of the computer had to be described in detail, but with modern high-level languages a simple instruction can result in complicated actions. To make sure that instructions are presented in a logical order, a flow chart is sometimes used to specify the program.

 

 

TYPES OF COMPUTERS

At present, there are four general categories of computers, which vary widely in cost and performance.

The Supercomputer. In the first and most expensive category are the 'super' computers, such as the CRAY, where not only is computation done at very high speeds, but many computation processes occur in parallel, enabling a great amount of work to be done.

These computers are used for tasks where an extremely large number of mathematical equations need to be solved numerically in a reasonably short time, often involving the simultaneous input of large volumes of data; such a task is, for example, weather forecasting.

The Main Frame. In the second category are the so-called 'main frame' computers; examples of these are the IBM 3033, Univac 90/30 and ICL 2900. In these machines computations are performed fast but with little parallelism. These machines are used for large payrolls and other accounting tasks, factory management, financial planning and for scientific and design problems where a large number of equations need to be solved.

The Minicomputer. In the third category are the minicomputers; calculations in these machines are still reasonably fast; however, the basic word length is short, typically 32 bits, therefore the accuracy of calculation for some purposes has to be improved by using several programming steps. These machines are used in the main1 for tasks of moderate complexity in the accounting, scientific and computer-aided design fields. They are also used for controlling large chemical plants, steel rolling mills2 and other complex continuous processes. Examples of these machines are VAX 750, GEC 4080 and HP 1000.

The Microcomputer (usually called the PC). In the fourth category are the microcomputers; their central processing unit is usually on a single chip. These computers have a wide range of applications, and today both computers and microprocessors are widely distributed throughout businesses. Microprocessors are the heart of home video games, and are used as the main processing element in many automatic machines, such as store checkout tills3 and industrial robots.

Computer Aided Design (CAD). Computer Aided Design means using computers to help the designer do his job efficiently; for example, in the electronics industry, computers are used to analyse the parameters of proposed circuits and translate the design onto a printed circuit board. Computers are widely used in the design of integrated circuits.

For mechanical design, computers are used for the mathematical representation of solid objects, allowing such properties as mass, centre of gravity, etc., to be evaluated. They can also be used to analyse stresses in components and assemblies; for example, bridges, aircraft components and structures.
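A short Python sketch of how mass and centre of gravity follow from such a representation (the part list is invented for illustration):

    # Centre of gravity of an assembly: the mass-weighted average of the
    # positions of its parts. The parts below are hypothetical.
    parts = [
        {"mass": 2.0, "x": 0.0},   # kg, metres along one axis
        {"mass": 1.0, "x": 3.0},
        {"mass": 1.0, "x": 5.0},
    ]
    total_mass = sum(p["mass"] for p in parts)
    centre = sum(p["mass"] * p["x"] for p in parts) / total_mass
    print(total_mass, centre)      # 4.0 kg, centre of gravity at x = 2.0 m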

NOTES: 1) in the main - mainly; 2) steel rolling mill - прокатный стан; 3) store checkout till - кассовый аппарат


INFORMATION TECHNOLOGY

The definition of information technology (IT) is as follows: the use of technology to provide the capture, retrieval, analysis and communication of information, which can be done either in the form of data, text, image or voice.

With the invention and exploitation of the integrated circuit or ‘chip’ since the 1960s, the growth of applications using electronics has been phenomenal. Modern electronic computers can process data, graphics and speech at extremely fast rates. The microprocessor is at the heart of what is known as the IT revolution.

Information and communications technologies are changing the way we work, study, do research, and educate our children and ourselves. They are influencing the way we do our banking, pay our bills, entertain ourselves and do business. New options (choices) are being provided for us in the fields of health care, education, environmental protection, culture and business. Computers control washing machines, cookers, televisions, telephones, home computers, cameras, video games, digital watches and many other devices.

Compact discs can record complete encyclopedias, as well as provide sound and pictures. The impact of this information revolution on our society cannot yet be fully measured or predicted. The combination of new and rapidly developing interactive multimedia computers and applications with electronic networks will require a restructuring of our traditional approach to strategic planning and organisational structure. It also means a considerable (great) change in the way we interact with each other, with business and with government.

Traceability

The software development process starts with an initial artifact, such as a customer statement of work, and ends with source code. As development progresses, being able to trace the links among successive artifacts is key. If you do not make explicit how an entity in the current phase evolved from a previous-phase entity, then it is unclear what the purpose of all that previous work was. Lack of traceability renders the past creations irrelevant, and we might as well have started with the current phase. It also makes it difficult for testers to show that the system complies with its requirements and for maintainers to assess the impact of a change. Therefore, it is essential that a precise link is made from use cases back to requirements, from design diagrams back to use cases, and from source code back to design diagrams. Traceability refers to the property of a software artifact, such as a use case or a class, of being traceable back to the original requirement.

Traceability must be maintained across the lifecycle. Maintaining traceability involves recording, structuring, linking, grouping, and maintaining dependencies between requirements and other software artifacts.
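As a minimal sketch (the artifact identifiers are invented, not from any real project), traceability links can be recorded as a simple mapping from each artifact to the artifacts it was derived from:

    # Each artifact records which earlier artifacts it traces back to.
    trace = {
        "UC-01 Withdraw cash": ["REQ-01 Dispense correct amount"],
        "ClassDiagram-ATM":    ["UC-01 Withdraw cash"],
        "atm.py":              ["ClassDiagram-ATM"],
    }

    def trace_back(artifact):
        """Follow links from an artifact back to its original requirements."""
        for parent in trace.get(artifact, []):
            print(artifact, "->", parent)
            trace_back(parent)

    trace_back("atm.py")
    # atm.py -> ClassDiagram-ATM
    # ClassDiagram-ATM -> UC-01 Withdraw cash
    # UC-01 Withdraw cash -> REQ-01 Dispense correct amount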

Text 9. HARDWARE, SOFTWARE, AND FIRMWARE

The units that are visible in any computer are the physical components of a data processing system, or hardware. Thus, the input, storage, processing and control devices are hardware. Not visible is the software - the set of computer programs, procedures, and associated documentation that make possible the effective operation of the computer system. Software programs are of two types: systems software and applications software.

Systems software are the programs designed to control the operation of a computer system. They do not solve specific problems. They are written to assist people in the use of the computer system by performing tasks such as controlling all of the operations required to move data into and out of a computer and all of the steps in executing an application program. The person who prepares systems software is referred to as a systems programmer. Systems programmers are highly trained specialists and important members of the architectural team.

Applications software are the programs written to solve specific problems (applications), such as payroll, inventory control and investment analysis. The word program usually refers to an application program, and the word programmer usually refers to a person who prepares applications software.

Often programs, particularly systems software, are stored in an area of memory not used for applications software. These protected programs are stored in an area of memory called read-only memory (ROM), which can be read from but not written on.

Automation

Automation (from the ancient Greek automatos, 'self-acting'), robotization, industrial automation or numerical control is the use of control systems such as computers to control industrial machinery and processes, reducing the need for human intervention. In the scope of industrialization, automation is a step beyond mechanization. Whereas mechanization provided human operators with machinery to assist them with the physical requirements of work, automation greatly reduces the need for human sensory and mental requirements as well. Processes and systems can also be automated.
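A minimal sketch of the idea of a control system (the set point, gain and process model are invented): a feedback loop adjusts a process without human intervention:

    # A feedback loop keeps a process variable near a set point.
    set_point = 100.0            # target temperature, hypothetical units
    temperature = 90.0

    for step in range(5):
        error = set_point - temperature
        heater_power = 0.5 * error          # proportional control
        temperature += heater_power         # process responds to the heater
        print(f"step {step}: temperature={temperature:.1f}")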

Automation plays an increasingly important role in the global economy and in daily experience. Engineers strive to combine automated devices with mathematical and organizational tools to create complex systems for a rapidly expanding range of applications and human activities.

Many roles for humans in industrial processes presently lie beyond the scope of automation. Human-level pattern recognition, language recognition and language production are well beyond the capabilities of modern mechanical and computer systems. Tasks requiring subjective assessment or the synthesis of complex sensory data, such as scents and sounds, as well as high-level tasks such as strategic planning, currently require human expertise. In many cases, the use of humans remains more cost-effective than mechanical approaches, even where the automation of industrial tasks is possible.

SOFTWARE ENGINEERING ETHICS

Software engineering ethics can be approached from three directions. First, it can describe the activity of software engineers making practical choices that affect other people in significant ways. Second, it can be used to describe a collection of principles, guidelines or ethical imperatives that guide or legislate action, and third, it can be used to name a discipline that studies the relationship between the other two senses of ethics.

Software engineering ethics is clearly both an activity and a body of principles. The discipline of software engineering ethics that studies this activity and formalizes these principles, however, is in its infancy.

Software Engineering Ethical Activity

To avoid confusion, "ethics", as understood here, addresses any intentional action that impacts negatively or positively the lives and values of others. Software engineering conceives of itself primarily as a technical discipline that develops software. There are a variety of names, such as information systems analyst, for those who engage in professional software development. Regardless of the title used, the focus of software engineering activity is primarily on the technical adequacy of the products developed. But the fact that roughly one billion people depend on software systems to effectively conduct their daily lives (Reed, 2000) has led many in computing to give more attention to the nontechnical aspects of the field and to wrestle with the ethical impact of their daily decisions and the values imbedded therein. The relationship between computers and ethics can be described as occurring when humans make decisions about computers, and those decisions affect people's lives. Human values are linked to technical decisions in this way. The activity of professional software engineering ethics takes place when any decision made by computing professionals during the design, development, construction and maintenance of computing artifacts affects other people. These decisions may be made by individuals, teams, management, or the profession. The software engineering decision about how to design the release of an airbag will affect the lives of others. This technical decision is also guided by an ethical decision about human values. Barry Boehm (1981) begins his work on software engineering economics with an anecdote about how the failure to consider human values affected the development of software for a high school attendance system. His work on this project led him to see that software engineers have "… an opportunity to make a significant positive impact on society, simply by becoming more sensitive to the long-term human relations implications of our work and incorporating this sensitivity into our software designs and products." (Boehm, 1981)

