Text 4.7. Read the text and identify the main trends in the development of supercomputer architecture. Explain why changes in the architecture are necessary.
New Design Strategies
To keep pace with the multiplicity and complexity of large-scale applications, tomorrow's macros will need increasingly higher throughputs and greater memory capacity while, at the same time, being easier to operate. The needed improvement is too great to be accomplished by piecemeal progress in components. Radical changes in basic architecture will be required.
New design strategies are already showing up in some extra-high performance machines, but the full impact of these changes will not be felt for several years.
Two key points must be emphasized when dealing with the problem of new designs: parallel processing and distributed computing.
Although continued progress is foreseen in the execution speed of circuit components, the dramatic progress needed to increase throughput cannot be achieved solely through improvements in circuitry. One approach that will help is parallelism.
Basically, parallel processing involves the use of parallel or redundant circuits to accomplish similar or different functions. In the first case, the computer achieves a higher throughput merely by having more circuits working at one time. In the case of different functions, throughput is increased by having different portions of the computer work on different aspects of a problem at the same time, instead of having the computer step through a series of functions sequentially.
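A minimal sketch of the first kind of parallelism, written in Python purely for illustration (the summing task, the four-way split, and all names below are the editor's assumptions, not part of the original text):

    # Toy parallel processing: identical "circuits" (worker processes)
    # each handle a slice of the same data at the same time.
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(chunk):
        # Work performed by one parallel unit on its own slice of the data.
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        chunks = [data[i::4] for i in range(4)]         # split the work four ways
        with ProcessPoolExecutor(max_workers=4) as pool:
            total = sum(pool.map(partial_sum, chunks))  # the four units run at once
        print(total)

The same result could be computed with a single sequential loop; the point of the sketch is only that the work is divided among units operating at one time.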
Whereas parallel processing is fundamentally an approach to solving problems, distributed computing refers to the form in which parallelism will most likely be executed. Although it is possible to design parallelism into the massive CPU of a mainframe macro, tomorrow's big computer will achieve this capability through combinations of separate processors, that is, through distributed computing.
The distribution concept will be patterned after today's computer networks. In the macros of the future, several small processors, each dedicated to specific specialized functions, will be interconnected in parallel or tied together by a large central processor. The various elements will be closely coordinated to solve large-scale problems and/or control complex processes.
With this computer configuration, the small processors operate semi-autonomously and are fairly intelligent in their own right. Thus, a computer can be made up of a collection of 16-bit units that are capable, together, of producing a 64-bit result every 10 ns. Each unit might control itself via microcoded instruction sets which allow it to tackle specific functions at its own speed. The various units communicate with each other and the main CPU only insofar as is necessary.
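The configuration described above can be caricatured in software. The sketch below is a rough Python analogy rather than microcode: semi-autonomous units, each dedicated to one specialized function, report to a central coordinator only when necessary. Every name and task in it is invented for illustration.

    # Semi-autonomous units, each dedicated to a specialized function,
    # communicate with the central "processor" only to deliver results.
    from multiprocessing import Process, Queue

    def arithmetic_unit(task, results):
        results.put(("arithmetic", sum(task)))            # this unit only adds

    def logic_unit(task, results):
        results.put(("logic", all(x > 0 for x in task)))  # this unit only tests

    if __name__ == "__main__":
        results = Queue()
        data = [3, 7, 11]
        units = [Process(target=arithmetic_unit, args=(data, results)),
                 Process(target=logic_unit, args=(data, results))]
        for u in units:
            u.start()
        combined = dict(results.get() for _ in units)     # central coordination
        for u in units:
            u.join()
        print(combined)   # {'arithmetic': 21, 'logic': True}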
Distributed computing will eventually make the traditional, single mainframe computer obsolete.
Text 4.8. Skim the text and point out the main advantages of computers in solving the complex problems discussed in it.
Big Problems Require Big Computers
The expanding role of the macro computer is due to the ever-increasing number of applications that transcend the capabilities of micros and minis. Certain real-time problems, such as the preparation, launch, and guidance of a space vehicle or satellite, require millions of calculations for each external stimulus, with a response time of only one or two seconds at most. The large on-line databases required to solve such problems and the interdependent nature of the calculations can be handled only by the huge memory capacities and high throughputs of large-scale computers.
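The arithmetic behind such requirements is easy to sketch. Assuming, purely for illustration, five million calculations per external stimulus and a two-second deadline (the exact figures are not given in the text):

    # Back-of-envelope estimate of the required processing rate.
    calculations_per_stimulus = 5_000_000   # assumed workload per stimulus
    deadline_seconds = 2                    # "one or two seconds at most"
    required_rate = calculations_per_stimulus / deadline_seconds
    print(f"{required_rate:,.0f} calculations per second")  # 2,500,000

A sustained rate of millions of operations per second, on top of a large on-line database, is what rules out micros and minis for such problems.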
Other problems are so complicated that millions of bytes of high-speed storage are necessary to fully describe them and solve them in time for the answers to be useful. A weather-prediction model and other complex simulations are cases in point.
For example, if weather prediction is to be possible, countless factors such as wind currents, solar effects, and even planetary configurations must be calculated, correlated, and simulated. Similar problems are involved in the mapping of ocean processes and the search for new energy sources.
Large-scale computers are also needed to do the complex processing required to create intricate electronic and photographic images from the coded data sent by spacecraft and satellites.
In the realm of pure science, macro computers may one day be used to model and bring some order to the incredibly complex realm of subatomic particles.
Some complex problems can be split into pieces and handled by several independent small computers or by a network of interconnected small computers. But when a multiplicity of operations must be accomplished simultaneously, and/or where a high degree of data integration is necessary, the only answer is a macro computer.
Text 4.9. a) Translate the text orally, without a dictionary. The meanings of the highlighted words can be inferred from context.
b) Comment on the author's statement:
"... the emergence of database technology is probably a revolutionary development in the world of information processing by computers."
Database Systems
Database systems were born and have evolved as an application technology out of the necessity of managing the large amounts of data produced in the real world. However, it was soon recognized that the emergence of the technology is one of the most significant features of the transition in computer application from data processing to information processing and further to knowledge processing. Work in this field has so far involved various topics: data models, database languages and query processing, database design, database system design, file organization, database system evaluation, integrity, database machines, distributed database systems, high-level database applications, and so on.
Database systems were the means by which computer technology began to make effective and systematic use of a permanent store, which had previously been an important feature of information-processing capability belonging only to human beings. In this sense, the emergence of database technology is probably a revolutionary development in the world of information processing by computers. It made computers more like human beings than ever and offered us a chance to reconsider information processing by computers in comparison with that of human beings. It is expected that analyzing the problem-solving process and creative activity of man will serve us in designing future information processing systems.
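As a deliberately small illustration of a program making systematic use of a permanent store, the sketch below uses Python's built-in sqlite3 module; the table, file name, and data are invented for the example:

    # A permanent store: data written here survives on disk and can be
    # queried again in a later session (database language + query processing).
    import sqlite3

    conn = sqlite3.connect("people.db")   # the file persists between runs
    conn.execute("CREATE TABLE IF NOT EXISTS people (name TEXT, born INTEGER)")
    conn.execute("INSERT INTO people VALUES (?, ?)", ("Ada", 1815))
    conn.commit()

    for name, born in conn.execute("SELECT name, born FROM people WHERE born < 1900"):
        print(name, born)
    conn.close()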
Knowledge representation has also become a crucial issue in the field of artificial intelligence. In fact, whichever system we consider, how to represent knowledge and then utilize it on a computer is a key problem for the realization of advanced information systems such as natural language processing, image or speech understanding, machine vision, intelligent information retrieval, and intelligent man-machine communication.
Text 4.10. Read the text and draw up its structural-logical outline in English.
Breaking the Man-Machine Communication Barrier
Technological advances in computers can be used to enrich communications between people. When a person edits a document or writes an electronic message, the computer is not the intended recipient of the result, but merely stores or transmits that information. In the paperless office of the future, most of the letters, memos, and reports that are currently printed on paper will instead be stored in the office computer system. But before it can fill this role successfully, the computer system must provide convenient ways to include figures and photographs in documents and allow comments to be "pencilled into the margin" of an electronic page. In other words, it must provide mechanisms for human communication that are at least as convenient and efficient as current paper-based communication systems.
Just as graphic displays suggested less obtrusive ways of notifying the user about error corrections, several supplements to written communication have also been made possible by recent advances in computer technology. One of them is voice annotation.
The recording and playback of digitized speech is now feasible, even for inexpensive computer systems, primarily as a result of the recent development of special-purpose integrated circuits intended for digital telephone systems. The obvious advantage of recorded speech is that it is faster and easier for the human user than the corresponding typed input; it is therefore well suited to the role of "pencilled notations" on existing documents. For example, the recipient of a document should be able to point to a portion of the text, record a spoken comment at that point, and return the document to its originator, who can replay the recorded message at his convenience.
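One way to picture the mechanism described above is as a data structure that anchors a recorded clip to a position in the text. The Python sketch below is hypothetical; neither the class names nor the fields come from the text:

    # A voice annotation "pencilled" onto a document: a digitized clip
    # anchored to a character offset, to be replayed by the recipient.
    from dataclasses import dataclass, field

    @dataclass
    class VoiceAnnotation:
        offset: int        # where in the text the spoken comment points
        author: str
        audio: bytes       # digitized speech samples

    @dataclass
    class Document:
        text: str
        annotations: list = field(default_factory=list)

        def annotate(self, offset, author, audio):
            # Record a spoken comment at a point in the text.
            self.annotations.append(VoiceAnnotation(offset, author, audio))

    doc = Document("Draft of the quarterly report ...")
    doc.annotate(offset=9, author="reviewer", audio=b"...")   # dummy audio bytes
    print(len(doc.annotations), "annotation(s) waiting to be replayed")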
Text 4.11. Skim the text and give it a title. Justify your choice of title. The meanings of the highlighted words can be inferred from context.
The transistor was the basis for the second generation of computers, a generation that lasted about 15 years. The third generation began in the mid-1960s and produced two types of offspring: the mainframe computer and the minicomputer. The mainframe computer looked like a computer. It was (and still is) big, required air conditioning or even direct liquid cooling to keep its hot electrical components working (as a direct consequence of its still-substantial power consumption), and had (and still has) to be attended by a group of specialists: systems programmers, applications programmers, and operators. All the advantages of integrated circuits were used in these systems to make them extremely powerful.
The new entrants upon the scene were the minicomputers. Much smaller and less expensive than mainframes, the minis first found application in the field of industrial process control and small-job data processing, but their capabilities continued to expand. They began to appear in places other than the Computer Center.
The fourth and present generation of computer was ushered in by the first commercial production of a microprocessor, the Intel 4004, in 1971. This was the first occasion in which the entire central processing unit (CPU), the "brains" of a computer, was put on a single chip. With a CPU chip and a few memory chips and other integrated circuits, a fully functional, general-purpose, stored-program computer can be built that weighs a few ounces and consumes a few watts of power.
In twenty years the computer has come a long way. At the upper end of the scale are supercomputers, such as the Soviet computer Elbrus-3, being developed by a group of young scientists and performing more than 20 million instructions per second. And there are more to come.
Text 4.12. Translate in writing, using a dictionary. Time allowed: 10 minutes.
High-Level Languages
High-level languages are to assembly- or machine-language programming what integrated circuits are to discrete logic: they collect small, related elements into neat modules. The benefits, too, are similar. Just as the hardware designer needs fewer components to build a system, the programmer thinking in a high-level language needs fewer lines of code to make a system go.
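The "fewer lines of code" point can be made concrete with a small comparison; Python stands in here for a high-level language, and the example is the editor's illustration rather than the author's:

    data = [4, 8, 15, 16, 23, 42]

    # Lower-level style: the programmer manages the index and the accumulator.
    total = 0
    for i in range(len(data)):
        total = total + data[i]

    # High-level style: a single expression states the intent directly.
    assert total == sum(data)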
Such languages are not the perfect solution for all programming problems. They require a lot of memory, for example, and in the case of microcomputers that was economically impractical until quite recently. But now they can often be used to cut expensive microcomputer firmware development time, especially if the user is aware of the languages' strengths and weaknesses.