



The History of Computers

During the 17th and 18th centuries mathematicians devised many easy ways of calculating. But it was not until the early 1800s that the first calculating machine appeared and, not too long after, Charles Babbage designed a machine which became the basis for building today's computers.

An American named Vannevar Bush built the first analog computer in 1930. Military men used this device in World War II to help aim guns. Prof. Howard Aiken and some people from IBM were responsible for the invention of the first digital computer, named MARK I, which they completed in 1944. This was the first machine that could figure out long lists of mathematical problems at a very fast rate. In 1946 two engineers at the University of Pennsylvania, J. Eckert and J. Mauchly, built the first digital computer using parts called vacuum tubes. They named their new invention ENIAC. Another important advancement in computers came in 1947, when John von Neumann developed the idea of keeping instructions for the computer inside the computer's memory. There were about 700 computers in the United States in 1955 and about 22,000 by 1964.

Engineers of the Massachusetts Institute of Technology developed the first electronic computer. It contained 19,000 vacuum tubes, some of which burnt out every few days. This computer completely filled a large room, and the cost of electricity to run it was enormous. Early computer memories required 10,000 to 20,000 watts of power, enough to run a small factory. Modern computers operate on 60 watts, about the same as a small light bulb.

Since then, computers have gone through four generations: digital computers using vacuum tubes in the 1950s, transistors in the early 1960s, integrated circuits in the mid-60s and a single chip in the 1970s. With integrated circuits computers have truly come of age*. Engineers found that the only way to shorten calculation time was simply to reduce the length of the electric circuits that electrons had to travel. They did it by compressing and miniaturizing the circuits themselves, with a technique known as "large-scale integration" - LSI.

A chip is a square or rectangular piece of silicon, usually from 0.1 to 0.25 inch, upon which several layers of an integrated circuit are imprinted, after which the chip is encapsulated in plastic, ceramic or metal. Fourth-generation computers can complete approximately 1,000,000 instructions per second.

 

Due to the personal computing revolution, the military, and advances in related fields, computers have shrunk from multi-ton mainframe monsters of vacuum tubes costing tens of thousands of dollars to ten-pound desktops and two-ounce handhelds. The widespread availability of computers has changed the world forever. The microchip technology which made personal computers possible has put chips not only into computers, but also into domestic appliances. Scientists and engineers rely on computers for solutions to problems in almost every field of the national economy.

 

*come of age - here: to achieve a new stage in its development



 

 

Personal Computers

 

Warm-up: What is California famous for?

 

What firms producing personal computers do you know? Are a personal computer and a word processor one and the same thing? Can you name the fields of human activity where personal computers are vitally important?

 

In 1980 IBM decided that there was a market for 250,000 PCs, so they set up a special team to develop the first IBM PC. It went on sales in 1981 and set a world-wide standard for IBM-compatibility which, over the next ten years, was only seriously challenged by one other company, Apple Computers. Since then, over seventy million PCs made by IBM and other manufacturers have been sold. Over this period, PCs have become commodity items.

 

The history of the multi-billion-dollar PC industry has been one of mistakes. Xerox Corporation funded the initial research on personal computers in their Palo Alto laboratory in California. However, the company failed to capitalize on this work, and the ideas that they put together went into the operating system developed for Apple's computers. This was a graphical interface: using a mouse, the user clicks on icons which represent the function to be performed. The engineers developed the first IBM PC using existing available electrical components. When IBM were looking for an operating system, they went initially to Digital Research, who were leaders in command-based operating systems (these are operating systems in which the users type in commands to perform a function). When the collaboration between IBM and Digital Research failed, IBM turned to Bill Gates, then 25 years old, to write their operating system. Bill Gates founded Microsoft on the basis of the development of MS-DOS, the initial operating system for the IBM PC. The original IBM PC had a minimum of 16K of memory, but this could be upgraded to 512K if necessary, and ran with a processor speed of 4.77 MHz. Ten years later, in 1991, IBM were making PCs with 16 MB of memory, expandable to 64 MB, running with a processor speed of 33 MHz.


An Artificial Tongue

 

Computers, Bill Gates is fond of pointing out, lack most of the basic senses that humans often take for granted. Some advanced machines can hear and speak - poorly - but most are blind and oblivious to touch, smell and temperature. Electronic devices may soon gain a sense of taste, however, thanks to tiny electromechanical machines, so-called "smart tongues", invented at Pennsylvania State University by husband-and-wife engineers, the Varadans, who presented their designs at a conference in San Diego. The Varadans predict that within a few years the devices will be cheap and sensitive enough to find myriad uses. Stuck inside milk cartons and juice bottles, some sensors might enable checkout scanners to detect unwanted bacteria. Slightly different versions could be mounted on aircraft wings to alert pilots when ice begins to form. There is good reason to believe that this is more than a daydream. Smart tongues are very small, made from simple materials - silicon, quartz, aluminum - using the same process by which the cheapest computer chips are produced. A second benefit is that they are wireless. Unlike most other micromachines, these sensors receive power from radio waves or microwaves. With no batteries and no tether to a computer, they can be placed almost anywhere.


Once upon a Time

In the early '70s there was no such device as a personal computer. The most common type of computer was the expensive mainframe, housed in glass-enclosed, cooled rooms ('glass houses') and used mostly by large organizations and corporations. These room-sized computers typically relied on large spools of magnetic tape and punch cards to process data. Few people had access to computers at this time; only elite groups of specialists were programming, operating and maintaining mainframes. All departments and workers in a company spent a lot of time negotiating with MIS (management information system) professionals if they wanted to write or change any program.

The introduction of VisiCalc* on the Apple II** in 1979 meant that accountants could compute their own numbers and not wait hours or longer while MIS specialists ran programs for them. Low-cost PCs also made it possible for people to have computers in their homes. And the PC revolution of the '80s laid the foundation for the Internet revolution of the '90s. Perhaps more than any other industry, the development of the personal computer business is a fascinating tale of companies that made millions of dollars while others went bankrupt and disappeared.

By 1980, IBM had decided to build personal computers and needed a PC operating system. (Computers are born empty; they need operating systems to be presentable.) So IBM hired Microsoft to build its operating system. They released the PC on the market in August 1981. Microsoft's DOS was one of the three official PC operating systems (including the Macintosh system from Apple) but quickly beat the other two. DOS was primitive at a time when the computer was wearing UNIX from Bell Labs or some variant of the revolutionary window-menu-mouse system that Xerox had introduced in the 1970s. But despite (or maybe because of) its stodginess, DOS established itself as the school uniform of computing. It was homely, but everyone needed it. Gates had brokered a marriage between other people's ideas and come up with a hit. DOS was even bigger than BASIC; Gates had it made.

Another computer company, Apple, released the Macintosh in January 1984: a sophisticated computer was now available to the masses. Henceforth DOS was not merely homely, it was obsolete. But it continued to rake in the money. In May 1990, Microsoft finally perfected its own version of Apple windows and called it Microsoft Windows 3.0 - another huge hit.

By the early '90s, electronic mail and the Internet had become big. Technologists forecast an Internet-centered view of computing called "mirror worlds" and the "information superhighway". The World Wide Web emerged in 1994, making browsers necessary, and Netscape was founded that same year. Sun Microsystems developed Java, the Internet programming language. Gates hung back. It wasn't until 1996 that Microsoft finally, according to Gates himself, "embraced the Internet wholeheartedly". Microsoft's first browser, Internet Explorer 1.0, was licensed from a company called Spyglass. Today Microsoft is the world's most powerful supplier of Web browsers.

In 1995 Gates published a book (co-authored with Nathan Myhrvold) called "The Road Ahead". Peering far into the future, he glimpsed a technology-rich dreamworld where you will be able to "watch 'Gone with the Wind' with your own face and voice replacing Vivien Leigh's or Clark Gable's". Apparently this is just what the public had been dying to do, for "The Road Ahead" became a runaway bestseller. Some people have the idea that Microsoft is fated to dominate technology forever. They had the same idea about IBM, once admired and feared nearly as much as Microsoft is today. As for Gates himself, he is a man who likes computers very much. Not their intellectual underpinnings, not the physics or electronics, not the art or philosophy or mathematics of software - just plain computers. He is crazy about them. It seems like an odd passion, but after all, some people are crazy about potato chips.

*VisiCalc - one of the first electronic spreadsheet programs; now out of date.

**the Apple II - a microcomputer.

The Elements of a Computer System

Computers are electronic machines which accept data in a certain form, process the data and give the results of the processing in a specified format as information. Three basic steps are involved in the process:

1. data is fed into the computer memory;

2. when the program is run, the computer performs a set of instructions and processes the data;

3. we can see the results (the output) on the screen or in printed form.

Information in the form of data and programs is known as software, and the electronic and mechanical parts that make up a computer system are called hardware.
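
The three steps above can be turned into a very short program. Below is a minimal sketch in Python; the function and variable names are illustrative assumptions, not part of the original text:

def process(data):
    # The "set of instructions" the program performs on the data.
    return [n * n for n in data]

def main():
    # Step 1: data is fed into the computer's memory.
    data = [1, 2, 3, 4]
    # Step 2: the program is run and the data is processed.
    output = process(data)
    # Step 3: the results (the output) appear on the screen.
    print(output)

if __name__ == "__main__":
    main()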

A standard computer system consists of three main parts: the CPU (Central Processing Unit), the main memory and the peripherals. The most influential component is the CPU. It executes program instructions and coordinates the activities of all the other units. In a way, it is the "brain" of the computer. The main memory holds the instructions and data which are currently being processed by the CPU. The peripherals are the physical units attached to the computer. They include storage devices and input/output devices. Storage devices (floppy or hard disks) provide permanent storage of both data and programs. Disk drives are used to handle one or more floppy disks. Input devices enable data to go into the computer's memory. The most common input devices are the mouse and the keyboard. Output devices enable us to extract the finished product from the system. The computer shows the output on the monitor or prints the results onto paper by means of a printer. On the rear panel of the computer there are several ports into which we can plug a wide range of peripherals: modems, fax machines, optical drives and scanners. These are the main physical units of a computer system, generally known as the configuration.
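The configuration described above can also be pictured as a simple data structure. The following is a hedged sketch in Python; the class names, fields and sample values are invented for illustration and do not come from the text:

from dataclasses import dataclass, field

@dataclass
class Peripheral:
    name: str
    kind: str  # "input", "output" or "storage"

@dataclass
class ComputerSystem:
    cpu_mhz: float        # the CPU executes instructions and coordinates the other units
    main_memory_mb: int   # holds the instructions and data currently being processed
    peripherals: list = field(default_factory=list)

pc = ComputerSystem(
    cpu_mhz=33.0,
    main_memory_mb=16,
    peripherals=[
        Peripheral("keyboard", "input"),
        Peripheral("mouse", "input"),
        Peripheral("monitor", "output"),
        Peripheral("printer", "output"),
        Peripheral("hard disk", "storage"),
    ],
)
print(pc)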

 


Robots

Warm-up: What do you know about actual robots?

Many people may think that the contemporary industrial robot is the product of high technology, but it was invented to meet a rather mundane* need. In the early '60s it became increasingly difficult to find people willing to do boring, repetitive and unpleasant jobs. What was needed was a machine which could perform sequences of precise movements of the arm and hand. Such sequences were relatively easily programmed into a computer memory, especially after the invention of the microprocessor. This invention made robots independent of the giant mainframe computers of the 1960s. Many machines can perform repetitive manipulation, but robots differ from them in that they make use of a manipulator arm analogous to the human arm, and because they can be reprogrammed to perform many different tasks without the need to redesign their mechanical components. Whatever its task, a robot is dependent for its effectiveness upon an automated environment, and this has so far restricted robot use to large-scale industry. Specialist machine shops, producing small batches of many different items, have little wish to set up the many devices which a robot requires. Automation achieves its really spectacular success when it abandons the attempt to do things in ways based on human skills and finds solutions that are quite new and intrinsically mechanical. Replacing wire circuits, which are impossible for machines to assemble, with printed circuits, which machines can manufacture with ease, is an obvious example. The robot provides an economic and relatively reliable substitute for human labor while also having a degree of flexibility that is attractive. What has yet to be established is that robots have the potential in them to advance from the status of blind preprogrammed serfs to that of a skilled and adaptive labor force, capable of learning new tricks and acting on their own initiative without the need for human interference at every stage.
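The passage's central idea - a stored sequence of movements that can be replaced without redesigning the hardware - can be sketched in a few lines of Python. Everything here (the commands, the task names) is invented for illustration:

def run_sequence(steps):
    # A real robot would drive motors here; we just print each step.
    for step in steps:
        print("arm:", step)

weld_task = ["move to joint A", "lower welder", "weld for 2 s", "raise welder"]
paint_task = ["move to panel", "spray left to right", "return home"]

run_sequence(weld_task)   # the robot performs one task...
run_sequence(paint_task)  # ...and is "reprogrammed" simply by storing a new sequence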

The serious purpose in promoting the advance of robot technology, which in recent years has not advanced as rapidly as it should have, lies behind the ingenious idea of organizing the World Robot Championships. This idea belongs to Dr Peter Mowforth from Glasgow's Turing Institute, which is one of the world's leading centers of research in artificial intelligence and robotics. Besides trade exhibitions, contests, seminars and workshops, the robots' designers suggest even giving a celebration concert performed by a robot orchestra.

*mundane - here: simple, plain, boring

 


Multiple Meaning

Warm-up: What makes English easy to learn? What makes it difficult?




