
Computer system



Introduction

A computer system is a machine system that receives and stores information according to human requirements, automatically processes and computes data, and outputs the resulting information. The computer is an extension and amplification of human mental power, and it is one of the major achievements of modern science.

The computer system consists of a hardware (sub)system and a software (sub)system. The former is an organic combination of various physical components based on the principles of electricity, magnetism, optics, and mechanics, and is the physical entity on which the system runs. The latter consists of the programs and documents that direct the entire system to work according to specified requirements.

Since the first electronic computer appeared in 1946, computer technology has made remarkable progress in components, hardware architecture, software systems, and applications. Modern computer systems range from microcomputers and personal computers to supercomputers and their networks. With their various forms and characteristics, they are widely used in scientific computing, transaction processing, and process control, are penetrating ever more fields of society, and have a profound impact on social progress.

Electronic computers fall into two categories: digital and analog. "Computer" generally means a digital computer, whose operations process data represented by discrete digital quantities. An analog computer processes data represented by continuous analog quantities. Compared with digital machines, analog machines are fast and interface simply with physical devices, but they have low accuracy, are difficult to use, have poor stability and reliability, and are expensive. Analog machines have therefore become obsolete and are used only where a fast response is needed but low accuracy is acceptable. The hybrid computer, which ingeniously combines the advantages of both, retains a certain vitality.

Features

The computer system is characterized by accurate, fast computation and judgment, good versatility, ease of use, and the ability to be connected into networks.

①Calculation: Almost all complex calculations can be realized by computers through arithmetic and logical operations.

②Judgment: The computer can distinguish different situations and choose among different ways of processing them, so it can be used in management, control, confrontation, decision-making, reasoning, and other fields.

③Storage: Computers can store huge amounts of information.

④Accuracy: As long as the word length is sufficient, calculation accuracy is theoretically unlimited (see the sketch after this list).

⑤Speed: A single computer operation can take as little as a few nanoseconds.

⑥General purpose: The computer is programmable, and different programs can realize different applications.

⑦Ease of use: Abundant high-performance software and intelligent human-machine interfaces make computers much easier to use.

⑧Networking: Multiple computer systems can transcend geographic boundaries and share remote information and software resources with the help of communication networks.
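As a small illustration of point ④, here is a hedged Python sketch (the factorial example is our own choice, not from the original): Python's built-in integers have unbounded word length, so the computation below is exact at any size.

    # Point ④: with sufficient word length, precision is theoretically unlimited.
    # Python integers grow as needed, so this exact value has no rounding error.
    import math

    n = 100
    exact = math.factorial(n)      # 100! computed exactly (158 digits)
    print(exact % 10 ** 20)        # the last 20 digits, still exact

    # A fixed 64-bit word would have overflowed long before 100!:
    print(exact > 2 ** 63 - 1)     # True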

Composition

Figure 1 shows the hierarchical structure of the computer system. At the core is the hardware system, the actual physical device that processes information. The outermost layer is the user, the person who uses the computer. The interface between the user and the hardware system is the software system, which can be roughly divided into three layers: system software, support software, and application software.

Hardware

The hardware system consists mainly of the central processing unit, memory, the input-output control system, and various external devices. The central processing unit is the main component for high-speed computation and information processing; its speed can reach hundreds of millions of operations per second. Memory stores programs, data, and files. It typically consists of fast internal memory (with a capacity of hundreds of megabytes or even gigabytes) and slower mass external memory (with a capacity of tens or hundreds of gigabytes or more). The various input and output devices are information converters between humans and the machine, and the input-output control system manages the exchange of information between the external devices and main memory (and the central processing unit).

Software

Software is divided into system software, support software, and application software. System software consists of the operating system, utility programs, compilers, and the like. The operating system manages and controls the various software and hardware resources. Utility programs, such as text editors, are designed for the user's convenience. A compiler translates a program written in assembly language or some high-level language into a machine language program the machine can execute. Support software includes interface software, tools, and environment databases; it supplies the machine's supporting environment and provides software development tools. Support software can also be considered part of the system software. Application software consists of special-purpose programs written by users for their own needs; it runs with the help of system and support software and forms the outermost layer of the software system.
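To make the compiler's translation step concrete, here is a small sketch using Python's built-in compile() and dis modules (the example program is our own; Python compiles to interpreter bytecode rather than native machine language, but the source-to-executable translation is analogous):

    import dis

    source = "result = (3 + 4) * 5"           # a tiny 'user program' in a high-level language
    code = compile(source, "<user>", "exec")  # translation: source text -> executable form

    dis.dis(code)    # inspect the generated instructions
    exec(code)       # the machine (here, the interpreter) runs the translated program
    print(result)    # -> 35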

Classification

Computer systems can be classified according to system functions, performance or architecture.

① Special-purpose and general-purpose computers: Early computers were designed for specific purposes and were special-purpose in nature. Beginning in the 1960s, general-purpose computers were manufactured that accommodated all three applications: scientific computing, transaction processing, and process control. In particular, the emergence of series machines, the adoption of standardized high-level programming languages, and the maturing of operating systems allowed a single model series to offer different hardware and software configurations to meet the different needs of users in various industries, further strengthening versatility. Special-purpose machines are still being developed, however, such as all-digital simulators for continuous dynamic systems and ultra-miniature special-purpose computers for spaceflight.

② Supercomputers, mainframes, medium-sized computers, minicomputers, and microcomputers: Computers developed with large and medium-sized machines as the main line. Minicomputers appeared in the late 1960s and microcomputers in the early 1970s; they are widely used because of their light weight, low price, strong functionality, and high reliability. In the 1970s, supercomputers capable of more than 50 million operations per second began to appear, used specifically to solve major problems in science, technology, national defense, and economic development. Supercomputers, mainframes, medium-sized, mini-, and microcomputers, as the tiers of the computer-system lineup, each have their own uses and are all developing rapidly.

③ Pipeline processors and parallel processors: With the speed of components and devices limited, these two kinds of processor were developed to achieve high-speed processing through system structure and organization. Both are aimed at operations over a set of data (also called a vector) of the form a_i θ b_i = c_i (i = 1, 2, 3, ..., n; θ is an operator). The pipeline processor is a single instruction stream, single data stream (SISD) machine; it uses the principle of overlap to process the elements of a vector in pipeline fashion and achieves a high processing rate. The parallel processor is a single instruction stream, multiple data stream (SIMD) machine; it uses the principle of parallelism, replicating multiple processing units that handle the elements of a vector simultaneously to obtain high speed (see parallel processing computer system). Pipelining and parallelism can also be combined, for example by replicating several pipeline units that work in parallel, to obtain still higher performance. Research on parallel algorithms is the key to making such processors effective. Correspondingly, high-level programming languages can be extended with vector statements that organize vector operations effectively, or a vector recognizer can be provided to identify vectorizable components in source programs automatically.
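The vector operation above is easy to sketch. A minimal Python illustration (all names are our own) contrasts element-by-element scalar processing with the whole-vector view of c_i = a_i θ b_i:

    import operator

    a = [1, 2, 3, 4]
    b = [10, 20, 30, 40]
    theta = operator.add           # the operator θ; any elementwise operation works

    # Scalar (SISD-style) view: one element pair per step.
    c = []
    for ai, bi in zip(a, b):
        c.append(theta(ai, bi))

    # Vector view: the whole operation expressed at once, the form a pipeline
    # or parallel processor is organized to exploit.
    c_vec = list(map(theta, a, b))

    assert c == c_vec == [11, 22, 33, 44]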

Equipping an ordinary host (a scalar machine) with an array processor (a special-purpose pipeline machine used only for high-speed vector operations) forms a host/attached-processor system, which can greatly increase the system's processing capacity; its performance-to-price ratio is high, and it is quite widely used.

④Multiprocessors and multicomputer systems, distributed processing systems, and computer networks: Multiprocessors and multicomputer systems are the necessary path for the further development of parallel technology and the main direction of development for supercomputers and mainframes. They are multiple instruction stream, multiple data stream (MIMD) systems: each machine processes its own instruction stream (process), the machines communicate with one another, and together they solve large problems. They offer a higher level of parallelism than parallel processors, with great potential and flexibility. Building a system from a large number of cheap microcomputers over an interconnection network to obtain high performance is one research direction for such systems. They require the study of parallel algorithms at a higher level (processes); high-level programming languages must provide means for creating and synchronizing concurrent processes; and the operating system becomes very complex, since it must solve the problems of communication, synchronization, and control among processes running on multiple machines.
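A hedged sketch of the MIMD idea using Python's standard multiprocessing module (the partial-sum task and all names are our own illustrative choices): several processes each run their own instruction stream on their own data and communicate their results through a queue.

    from multiprocessing import Process, Queue

    def worker(name, data, queue):
        # Each process runs its own instruction stream on its own data (MIMD).
        queue.put((name, sum(data)))

    if __name__ == "__main__":
        queue = Queue()
        chunks = {"p0": range(0, 500), "p1": range(500, 1000)}
        procs = [Process(target=worker, args=(n, d, queue)) for n, d in chunks.items()]
        for p in procs:
            p.start()
        # Communication and synchronization, here via a shared queue and join().
        results = dict(queue.get() for _ in procs)
        for p in procs:
            p.join()
        print(sum(results.values()))   # -> 499500, the same as sum(range(1000))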

A distributed system is a development of the multicomputer system: multiple physically dispersed, independent, interacting single computers cooperate to solve users' problems. Its system software is still more complex (see distributed computer system).

Modern mainframes are almost all multicomputer systems with distributed functions. In addition to a high-speed central processing unit, they contain input-output processors (or front-end computers) that manage input and output, communication control processors that manage remote terminals and network communications, maintenance and diagnosis machines for system-wide maintenance and diagnosis, and database processors for database management. This is a low-level form of distributed system.

Multiple geographically dispersed computer systems connected by communication lines and network protocols form a computer network, divided by distance into local area networks and remote (wide area) networks. Every computer on the network can share information resources and hardware and software resources with the others. Ticket booking systems and information retrieval systems are examples of computer network applications.
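A minimal sketch of network resource sharing with Python's standard library (the port number and message are our own choices): a trivial TCP service hands a piece of "shared information" to any client that connects over the network protocol.

    import socket
    import threading
    import time

    PORT = 50007    # an arbitrary free port; our own choice

    def serve_once():
        # A tiny server offering one piece of shared information.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind(("127.0.0.1", PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                conn.sendall(b"shared resource: timetable data")

    threading.Thread(target=serve_once, daemon=True).start()
    time.sleep(0.2)    # crude wait for the server to start listening

    # A 'remote' client connects and reads the shared resource.
    with socket.create_connection(("127.0.0.1", PORT)) as cli:
        print(cli.recv(1024).decode())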

⑤ Von Neumann and non-von Neumann machines: The stored-program, instruction-driven von Neumann machine still dominates. It executes instructions sequentially, which limits the parallelism inherent in the problems to be solved and restricts further increases in processing speed. Non-von Neumann machines break through this principle to develop parallelism at the architectural level and raise system throughput. Research in this area is ongoing. Dataflow computers driven by data flow, and highly parallel computers driven by reduction control and by demand, are all promising non-von Neumann computer systems.

Outlook

Computer systems are replaced roughly every 3 to 5 years, with the performance-to-price ratio rising tenfold and volume shrinking greatly. VLSI technology will continue to develop rapidly and will have a huge, far-reaching impact on computer systems of every kind. 32-bit microcomputers have appeared, and 64-bit microcomputers have also come out; putting 10 million components on a single chip is not far off. Research on devices 10 to 100 times faster than semiconductor integrated circuits, such as gallium arsenide devices, high-electron-mobility devices, Josephson junctions, and optical components, will yield important results. Micro-assembly technology, which raises packing density and shortens interconnections, is one of the key technologies of the next generation of computers. Optical fiber communication will be widely used. High-speed intelligent peripherals keep emerging, and the advent of the optical disc will transform auxiliary mass storage. Multiprocessor systems, multicomputer systems, and distributed processing systems will be the system structures that attract attention. Hardening software into firmware is another development trend, and new non-von Neumann machines, inference computers, and knowledge-base computers are beginning to enter practical use.

Software development will shed its backwardness and inefficiency. Software engineering is deepening, and software production is moving toward engineering discipline, formalization, automation, modularization, and integration. Research on new high-level languages such as logic languages and functional languages, together with artificial intelligence, will make the human-computer interface simple and natural (machines that can directly see, listen, speak, and draw). Database technology will develop greatly, and computer networks will be everywhere.

Next-generation computer systems characterized by enormous processing power (for example, 10 to 100 billion operations per second), vast knowledge and information bases, and high intelligence are being vigorously developed. Computer applications will become ever more widespread: computer-aided design, computer-controlled production lines, and intelligent robots will greatly raise social labor productivity, and offices, medical care, communications, education, and family life will all be computerized. The influence of computers on people's lives and on social organization will become ever broader and more profound.

Work process

The general process by which a user solves a problem on a computer system is as follows:

①Create an account through the system operator and obtain the right to use the system. The account identifies and protects the user's files (programs and data), and the system also uses it to keep automatic statistics on the user's resource consumption (accounting and billing).

②Study an algorithm for the problem to be solved, select an appropriate language, and write the source program, supplying the data to be processed and the related control information along with it.

③Transfer the result of ② onto a floppy disk using offline equipment and create the user file (this can also be done at an online terminal, with the file created directly in auxiliary storage; in that case step ④ is omitted).

④Use the floppy disk drive to read the user files on the disk into the computer; after processing, they are registered as a job and stored in auxiliary storage.

⑤When compilation is required, the operating system loads the job into main memory, calls the compiler for the selected language, compiles and links it (including any called subroutines), produces a target program executable by the machine, and stores it in auxiliary storage.

⑥When computation is required, the operating system loads the target program into main memory, the central processing unit processes it, and the results are stored in auxiliary storage.

⑦The operating system sends the results to an external device for output, in the format the user requires.

The computer's internal work (④ to ⑦) is a complex process under the control of the operating system. Usually several user jobs are in the computer at the same time; the operating system schedules them uniformly, and they run interleaved. This scheduling is transparent to users, and ordinary users need not know its internal details.
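To make steps ④ to ⑦ concrete, here is a minimal Python sketch of the job cycle (all file and variable names are our own; Python's compile() stands in for the compiler, and a directory stands in for auxiliary storage):

    import pathlib

    aux = pathlib.Path("aux_storage")      # stand-in for auxiliary storage
    aux.mkdir(exist_ok=True)

    # Step ④: the user's job is registered and stored in auxiliary storage.
    (aux / "job1.py").write_text("result = sum(range(10))\n")

    # Step ⑤: load the job and translate it into an executable target program.
    target = compile((aux / "job1.py").read_text(), "job1.py", "exec")

    # Step ⑥: the processor runs the target program; results return to storage.
    namespace = {}
    exec(target, namespace)
    (aux / "job1.out").write_text(str(namespace["result"]))

    # Step ⑦: the result is output in the format the user asked for.
    print((aux / "job1.out").read_text())  # -> 45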

Users can control the progress of ③ to ⑦ interactively from a terminal (time-sharing mode), or they can entrust ③ to ⑦ to an operator, with ④ to ⑦ carried out automatically by the computer (batch mode). Batch processing has a high degree of automation, but the user cannot observe the run or intervene midway. Time-sharing gives the user direct control and allows intervention and error correction at any time, but the degree of automation is lower. Most modern computer systems provide both modes, and the user chooses between them.

Operating system

Introduction

An operating system is system software (a collection of programs) that serves users and manages and controls the computer's hardware and software resources. From the user's point of view, the operating system can be seen as an extension of the computer hardware; from the perspective of human-computer interaction, it is the interface between user and machine; in terms of system structure, it is a hierarchical, modular set of programs, an ordered layering in which otherwise unordered modules are called in an orderly way. The design of an operating system embodies the combination of computing technology and management technology. The place of the operating system in the computer:

The operating system is software, specifically system software. Its role in the computer system can be understood from two aspects: internally, the operating system manages the computer system's various resources and extends the functions of the hardware; externally, it provides a good human-machine interface that makes the computer convenient to use. It occupies a linking position in the overall computer system.

The operating system is a large-scale software system with complex functions and an enormous structure. Viewed from different angles it looks different, much as "a ridge seen head-on becomes a peak seen from the side". Let us analyze it from the two most typical angles.

1. From a programmer's point of view

As mentioned earlier, without an operating system programmers would have to wrestle with complex hardware implementation details when developing software. Programmers have no wish to enter that forbidding territory, and the energy consumed by such repetitive, uncreative work would keep them from concentrating on more creative programming. What the programmer needs is a simple, highly abstract device to work with.

What isolates the hardware details from the programmer is, of course, the operating system.

From this perspective, the role of the operating system is to provide users with an equivalent extended machine, also called a virtual machine, which is easier to program than the underlying hardware.

2. From the user's point of view

From the user's point of view, the operating system is used to manage various parts of a complex system.

The operating system is responsible for the orderly control of the allocation of the CPU, memory, and other I/O devices among the programs competing for them.

For example, suppose three programs running on a computer all try to print their results on the same printer at the same time. The first few lines of output might come from program 1, the next few from program 2, then some from program 3, and so on; the final result would be a jumble. The operating system avoids this confusion by sending print output to a buffer on disk; after a program finishes, the operating system sends the file temporarily stored on disk to the printer for output.

From this perspective, the operating system is the resource manager of the system.
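The printer example above is essentially spooling. A minimal simulation, with all names our own invention: competing "programs" each write into their own spool buffer, and the "operating system" sends each completed job to the printer in turn, so output never interleaves.

    import queue

    spool = queue.Queue()              # the buffer area on disk

    def run_program(name, lines):
        # Each program writes into its own spool job, never to the printer.
        job = [f"{name}: {line}" for line in lines]
        spool.put(job)                 # hand the finished job to the OS

    for prog in ("prog1", "prog2", "prog3"):
        run_program(prog, ["line A", "line B"])

    # The OS drains the spool one whole job at a time, so output stays ordered.
    while not spool.empty():
        for line in spool.get():
            print(line)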

Development history

Let us review the development history of the operating system alongside the development of the computer itself.

1. The first generation of computers (1945-1955): vacuum tubes and plug-in boards

In the mid-1940s, researchers at Harvard University, the Institute for Advanced Study in Princeton, and the University of Pennsylvania used tens of thousands of vacuum tubes to build the world's first electronic computers, beginning the history of computer development. Machines of this period required a team to design, build, program, operate, and maintain each machine. Programming was done in machine language, and basic functions were controlled by hard-wiring plugboards.

At this early stage of computer development, even programming languages had not yet appeared, and operating systems were unheard of.

2. Second-generation computers (1955-1965): transistors and batch processing systems

In this period computers became more and more reliable; they stepped out of the research institutes and into commercial applications. Computers of this period, however, were mainly used for scientific calculations; they required specialized operators for maintenance and had to be programmed anew for each computing task.

Second-generation computers were used mainly for scientific and engineering calculation, programmed in FORTRAN and assembly language. In the later part of the period, prototypes of the operating system appeared: FMS (the FORTRAN Monitor System) and IBSYS (the operating system IBM supplied for the 7094).

3. Third-generation computers (1965-1980): integrated circuit chips and multiprogramming

In the early 1960s, computer manufacturers had divided their machines into two series according to application: one for scientific computing and one for commercial use.

As computer applications deepened, a need arose for computers that unified the two kinds of application. IBM tried to solve this problem by introducing the System/360.

In conjunction with this plan, IBM organized the development of the OS/360 operating system. The complex requirements, together with the low level of software engineering at the time, turned OS/360 into the most notorious "software development quagmire" in history and gave rise to the most famous treatise on the failure, The Mythical Man-Month. Although the development plan did not succeed, its ambition became a goal shared by computer manufacturers.

Around the same time, MIT, Bell Labs, and General Electric decided to develop a "public computer service system", MULTICS, hoping for a machine that could support hundreds of time-sharing users simultaneously. The difficulty of the project exceeded everyone's expectations, and the system ultimately failed; yet the ideas of MULTICS offered many hints to later operating systems.

In the late 1960s, Ken Thompson, a computer scientist at Bell Labs who had taken part in the MULTICS project, wrote a simplified, single-user version of MULTICS, which later led to the birth of the UNIX operating system.

The UNIX operating system came to dominate the minicomputer and workstation markets and remains one of the most influential operating systems; Linux, too, is a derivative of the UNIX system. In the next lecture we will introduce the history of UNIX in detail.

4. The fourth generation of computers (1980-present): personal computers

With the continuous renewal and development of computer technology, computers broke, almost magically, into people's lives: a computer with powerful computing capability could be had at a low price.

Once price was no longer the barrier to the popularization of computers, making computers easier to use became crucial. Because of its characteristics, the UNIX system was not suited to running on personal computers, so a new operating system was needed.

At this critical moment in history, IBM underestimated the PC market and did not compete in it with its full strength. Intel seized the opening and became the leader in microprocessors that it is today. At the same time Bill Gates, the president of Microsoft, who was good at seizing opportunities, entered the field in time: he bought a CP/M-compatible system, turned it into MS-DOS, and became the dominant force in personal computer operating systems.

Although Apple took the lead with the GUI, its closed, incompatible market strategy prevented it from extending that success. Microsoft then entered the GUI field in time with the Windows system and again came to rule the roost.

Composition

Generally speaking, the operating system consists of the following parts:

1) Process scheduling subsystem:

The process scheduling subsystem decides which process gets the CPU, scheduling and managing processes (a minimal illustration appears after this list).

2) Inter-process communication subsystem:

Responsible for the communication between each process.

3) Memory management subsystem:

Responsible for managing computer memory.

4) Device management subsystem:

Responsible for managing various computer peripherals, mainly composed of device drivers.

5) File subsystem:

Responsible for managing various files and directories on the disk.

6) Network subsystem:

Responsible for handling various network-related matters.
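As noted under item 1, here is a minimal round-robin scheduler simulation (entirely our own construction; real schedulers are far more elaborate): each process receives a fixed time slice of the CPU in turn until its work is finished.

    from collections import deque

    # Each 'process' is (name, remaining_work); QUANTUM is the time slice.
    ready = deque([("editor", 3), ("compiler", 5), ("shell", 2)])
    QUANTUM = 2

    while ready:
        name, remaining = ready.popleft()      # the scheduler picks the next process
        ran = min(QUANTUM, remaining)
        remaining -= ran
        print(f"{name} ran {ran} tick(s), {remaining} left")
        if remaining:
            ready.append((name, remaining))    # preempted: back to the ready queue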

Structural design

There are many implementation methods and design ideas for the operating system. The following only selects the most representative three to make a brief description.

1. Monolithic system

Monolithic system structure design:

This is the most commonly used organization; it is often called a "hodgepodge", and one can also say that the monolithic structure is "unstructured".

In this structure, to construct the final target operating system program, the developer first compiles a number of independent procedures, or files containing procedures, and then uses a linker to link them into a single object program.

The Linux operating system is designed as a monolithic system. On this basis, however, it adds mechanisms such as dynamic module loading, which improve flexibility and make up for shortcomings of the monolithic design.

2. Hierarchical system

Hierarchical system structure design:

This approach layers the system strictly, giving the whole a clear hierarchy. It has a strong academic flavor; in practice, few operating systems are designed entirely this way, and those that are have not been widely used.

It can be said that current operating system designs seek a balance between the monolithic and the hierarchical structure.

3. Micro-kernel system

Micro-kernel system structure design:

Microkernel design is a newer design concept; the most representative microkernel operating systems are Mach and QNX.

A microkernel system, as the name implies, has a very small kernel. For example, the QNX microkernel is responsible only for inter-process communication, low-level network communication, process scheduling, and first-level interrupt handling.
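A toy sketch of the microkernel idea, entirely our own construction: the "kernel" does nothing but pass messages between user-level servers, which implement everything else (here, a file server) outside the kernel.

    import queue

    # The microkernel: only message passing between registered servers.
    mailboxes = {"fs_server": queue.Queue(), "app": queue.Queue()}

    def kernel_send(dest, message):
        mailboxes[dest].put(message)   # the kernel's entire job

    # A user-level file server: policy lives outside the kernel.
    files = {"readme.txt": "hello from user space"}

    def fs_server_step():
        sender, path = mailboxes["fs_server"].get()
        kernel_send(sender, files.get(path, "<not found>"))

    # An application asks the file server for a file, via the kernel only.
    kernel_send("fs_server", ("app", "readme.txt"))
    fs_server_step()
    print(mailboxes["app"].get())      # -> hello from user space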

Horizontal comparison

Many operating systems have appeared in the history of computing. The waves have washed the sand, many have been ruthlessly eliminated, and only those that survived the test of the market remain:

1. Desktop operating system

1) MS-DOS: The earliest operating system on Intel x86-series PCs, a Microsoft product that once dominated this field. It has gradually been replaced by its sibling, the Windows series, and is now rarely seen outside some low-end machines.

2) Windows: A Microsoft product that grew out of Windows 1.0. It is now the main operating system on Intel x86-based PCs and the most widely installed personal computer operating system, aimed at desktops and individual users.

3) Mac OS: Owned by Apple, with a friendly interface and excellent performance, but its growth is limited because it runs only on Apple's own computers. Thanks to Apple's distinctive market positioning, however, it is still alive and well.

4) Linux: Linux is the name of a computer operating system and its kernel. It is also the most famous example of free software and open source development.

Strictly speaking, the word Linux itself refers only to the Linux kernel, but in practice people are used to using Linux to describe an entire operating system based on the Linux kernel that uses tools and libraries from the GNU project (also known as GNU/Linux). Linux systems built from these components are called Linux distributions. A Linux distribution generally contains a great deal of software, such as software development tools, databases, Web servers (such as Apache), the X Window System, desktop environments (such as GNOME and KDE), office suites, and so on.

2. Server operating systems

1) UNIX series: UNIX has a long history and is a genuinely stable, practical, and powerful operating system. However, because many manufacturers developed UNIX versions with their own characteristics on its basis, the whole was fragmented. Abroad, the UNIX system stands apart and is widely used in key fields such as scientific research, education, and finance. Because computing in China developed relatively late, the application of UNIX systems there lags somewhat behind.

2) Windows NT series: Microsoft products that leveraged the friendly Windows user interface to enter the server operating system market. Because a certain gap remains between them and UNIX in overall performance, efficiency, and stability, they are now used mainly in the small and medium-sized enterprise market.

3) Novell NetWare series: Novell's products, well known for being extremely suitable for small and medium-sized networks, have a very high market share in China's securities industry; with their distinctive features they remain an evergreen among server system software.

4) LINUX series: Linux is a free and open-source Unix-like operating system. There are many different Linux variants, but they all use the Linux kernel. Linux can be installed on a wide variety of computer hardware, from mobile phones, tablets, routers, and video game consoles to desktop computers, mainframes, and supercomputers. Linux is a leading operating system: the world's 10 fastest supercomputers all run Linux. Strictly speaking, the word Linux refers only to the Linux kernel, but people customarily use it for the entire operating system built on the Linux kernel with tools and libraries from the GNU project. Linux is named after Linus Torvalds, a computer hobbyist.

Flynn classification

Flynn's taxonomy classifies computer systems according to their instruction streams, data streams, and the multiplicity of each. There are four categories:

SISD: The instruction unit processes only one instruction at a time and controls the operation of only one processing unit, as in a conventional serial uniprocessor.

SIMD: A single instruction unit simultaneously controls multiple replicated processing units, which carry out the same instruction on different data, as in an array processor.

MISD: Multiple instruction units operate on successive processing stages of the same data. Such machines are rare.

MIMD: Multiple independent or relatively independent processors execute their own programs, jobs, or processes, as in a multiprocessor.
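A compact sketch of how three of the four categories differ when written as code (our own schematic; Python here models only the control structure, not real hardware parallelism, and MISD is omitted because such machines are rare):

    data = [1, 2, 3, 4]

    def square(x):
        return x * x

    def cube(x):
        return x ** 3

    # SISD: one instruction stream works through one data stream serially.
    sisd = [square(x) for x in data]

    # SIMD: one instruction is applied across a whole vector of data at once.
    simd = list(map(square, data))

    # MIMD: independent instruction streams, each with its own data.
    mimd = [square(data[0]), cube(data[1])]

    print(sisd, simd, mimd)   # [1, 4, 9, 16] [1, 4, 9, 16] [1, 8]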
