Installation Techniques
Operating System Structures
An operating system provides the environment within which programs are executed.
Internally, operating systems vary greatly in their makeup, since they are organized along many
different lines. The design of a new operating system is a major task. It is important that the
goals of the system be well defined before the design begins. These goals form the basis for
choices among various algorithms and strategies.
We can view an operating system from several vantage points. One view focuses on the
services that the system provides; another, on the interface that it makes available to users and
programmers; a third, on its components and their interconnections. In this section we explore
all three aspects of operating systems, showing the viewpoints of users, programmers, and
operating-system designers. We consider what services an operating system provides, how
they are provided, and what the various methodologies are for designing such systems. Finally,
we describe how operating systems are created and how a computer starts its operating
system.
Communication. One process may need to exchange information with another process. Such communication may occur between processes that are executing on the same computer or between processes that are executing on different computer systems tied together by a computer network. Communications may be implemented via shared memory or through message passing, in which packets of information are moved between processes by the operating system.
Error detection. The operating system needs to be constantly aware of possible errors.
Errors may occur in the CPU and memory hardware (such as a memory error or a power
failure), in I/O devices (such as a parity error on tape, a connection failure on a network,
or lack of paper in the printer), and in the user program (such as an arithmetic overflow,
an attempt to access an illegal memory location, or a too-great use of CPU time). For
each type of error, the operating system should take the appropriate action to ensure
correct and consistent computing. Debugging facilities can greatly enhance the users' and programmers' abilities to use the system efficiently.
Another set of operating-system functions exists not for helping the user but rather for
ensuring the efficient operation of the system itself. Systems with multiple users can gain
efficiency by sharing the computer resources among the users.
Resource allocation. When there are multiple users or multiple jobs running at the
same time, resources must be allocated to each of them. Many different types of
resources are managed by the operating system. Some (such as CPU cycles, main
memory, and file storage) may have special allocation code, whereas others (such as
I/O devices) may have much more general request and release code. For instance, in
determining how best to use the CPU, operating systems have CPU-scheduling routines
that take into account the speed of the CPU, the jobs that must be executed, the number
of registers available, and other factors. There may also be routines to allocate printers,
modems, USB storage drives, and other peripheral devices.
Accounting. We want to keep track of which users use how much and what kinds of
computer resources. This record keeping may be used for accounting (so that users can
be billed) or simply for accumulating usage statistics. Usage statistics may be a valuable
tool for researchers who wish to reconfigure the system to improve computing services.
Protection and security. The owners of information stored in a multiuser or networked
computer system may want to control use of that information. When several separate
processes execute concurrently, it should not be possible for one process to interfere
with the others or with the operating system itself. Protection involves ensuring that all
access to system resources is controlled. Security of the system from outsiders is also
important. Such security starts with requiring each user to authenticate himself or herself
to the system, usually by means of a password, to gain access to system resources. It
extends to defending external I/O devices, including modems and network adapters,
from invalid access attempts and to recording all such connections for detection of
break-ins. If a system is to be protected and secure, precautions must be instituted
throughout it. A chain is only as strong as its weakest link.
Some operating systems include the command interpreter in the kernel. Others,
such as Windows XP and UNIX, treat the command interpreter as a special program that
is running when a job is initiated or when a user first logs on (on interactive systems). On
systems with multiple command interpreters to choose from, the interpreters are known
as shells. For example, on UNIX and Linux systems, there are several different shells a user may choose from, including the Bourne shell, the C shell, the Bourne-Again shell, and the Korn shell. Most shells provide similar functionality with only minor differences; most
users choose a shell based upon personal preference.
The main function of the command interpreter is to get and execute the next
user-specified command. Many of the commands given at this level manipulate files:
create, delete, list, print, copy, execute, and so on.
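To make this concrete, the following is a minimal sketch, in Python and purely for illustration (the prompt name mysh and the exit command are invented here), of the get-and-execute loop at the heart of any command interpreter:

```python
# Minimal sketch of a command interpreter's main loop: read a user-specified
# command and execute it as a child process. Not any real shell.
import shlex
import subprocess

while True:
    try:
        line = input("mysh> ")             # get the next user-specified command
    except EOFError:
        break                              # end of input: exit the interpreter
    line = line.strip()
    if not line:
        continue
    if line == "exit":
        break
    try:
        subprocess.run(shlex.split(line))  # execute the command and wait for it
    except FileNotFoundError:
        print("mysh: command not found:", line.split()[0])
    except OSError as e:
        print("mysh:", e)
```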
As an example of how system calls are used, consider a program that copies data from one file to another. The program must first obtain the names of the two files, then open the input file and create the output file; each of these operations requires a system call, and there are possible error conditions for each operation. When the program tries to open the input file, it may find
that there is no file of that name or that the file is protected against access. In these cases, the
program should print a message on the console (another sequence of system calls) and then
terminate abnormally (another system call). If the input file exists, then we must create a new
output file. We may find that there is already an output file with the same name. This situation
may cause the program to abort (a system call), or we may delete the existing file (another
system call) and create a new one (another system call). Another option, in an interactive
system, is to ask the user (via a sequence of system calls to output the prompting message and
to read the response from the terminal) whether to replace the existing file or to abort the
program.
Now that both files are set up, we enter a loop that reads from the input file (a system
call) and writes to the output file (another system call). Each read and write must return status
information regarding various possible error conditions. On input, the program may find that the
end of the file has been reached or that there was a hardware failure in the read (such as a
parity error). The write operation may encounter various errors, depending on the output device
(no more disk space, printer out of paper, and so on).
Finally, after the entire file is copied, the program may close both files (another system
call), write a message to the console or window (more system calls), and finally terminate
normally (the final system call). As we can see, even simple programs may make heavy use of
the operating system. Frequently, systems execute thousands of system calls per second. This
system-call sequence is shown in the figure below.
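The same sequence can be sketched in code. The following is a rough illustration, assuming POSIX-style calls exposed through Python's os module (the file names are hypothetical); each os call below corresponds to one of the system calls described above:

```python
# Sketch of the file-copy example: open the input file, create the output file,
# loop reading and writing, then close both files. File names are made up.
import os
import sys

SRC, DST = "input.txt", "output.txt"

try:
    in_fd = os.open(SRC, os.O_RDONLY)                    # open input file
except OSError as e:
    print("cannot open", SRC, "-", e, file=sys.stderr)   # message to console
    sys.exit(1)                                          # terminate abnormally

try:
    # Create the output file; fail if a file with the same name already exists.
    out_fd = os.open(DST, os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o644)
except FileExistsError:
    print(DST, "already exists", file=sys.stderr)
    os.close(in_fd)
    sys.exit(1)

while True:
    chunk = os.read(in_fd, 4096)    # read from the input file
    if not chunk:                   # end of file reached
        break
    os.write(out_fd, chunk)         # write to the output file

os.close(in_fd)                     # close both files
os.close(out_fd)
print("copy complete")              # message to console or window
sys.exit(0)                         # terminate normally
```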
Most programmers never see this level of detail, however. Typically, application
developers design programs according to an application programming interface (API). The
API specifies a set of functions that are available to an application programmer, including the
parameters that are passed to each function and the return values the programmer can expect.
Three of the most common APIs available to application programmers are the Win32 API for
Windows systems, the POSIX API for POSIX-based systems (which includes virtually all
versions of UNIX, Linux, and Mac OS X), and the Java API for designing programs that run on
the Java virtual machine.
As an example of a standard API, consider the ReadFile() function in the Win32 API, a function for reading from a file. The API for this function appears in the figure below.
Figure 1.3.3 The handling of a user application invoking the Open() system call.
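As a rough, hedged illustration of how such an API is called from a program (not taken from the text; Windows only, and the file name is made up), ReadFile() can be reached from Python through the ctypes foreign-function interface. The declared argument and result types mirror the documented form BOOL ReadFile(HANDLE, LPVOID, DWORD, LPDWORD, LPOVERLAPPED):

```python
# Hedged sketch: calling the Win32 ReadFile() API via ctypes (Windows only).
import ctypes
from ctypes import wintypes

k32 = ctypes.WinDLL("kernel32", use_last_error=True)

# Declare signatures so ctypes marshals parameters and return values correctly.
k32.CreateFileW.argtypes = [wintypes.LPCWSTR, wintypes.DWORD, wintypes.DWORD,
                            wintypes.LPVOID, wintypes.DWORD, wintypes.DWORD,
                            wintypes.HANDLE]
k32.CreateFileW.restype = wintypes.HANDLE
k32.ReadFile.argtypes = [wintypes.HANDLE, wintypes.LPVOID, wintypes.DWORD,
                         ctypes.POINTER(wintypes.DWORD), wintypes.LPVOID]
k32.ReadFile.restype = wintypes.BOOL
k32.CloseHandle.argtypes = [wintypes.HANDLE]
k32.CloseHandle.restype = wintypes.BOOL

GENERIC_READ = 0x80000000
OPEN_EXISTING = 3
INVALID_HANDLE_VALUE = wintypes.HANDLE(-1).value

handle = k32.CreateFileW("input.txt", GENERIC_READ, 0, None, OPEN_EXISTING, 0, None)
if handle == INVALID_HANDLE_VALUE:
    raise ctypes.WinError(ctypes.get_last_error())

buf = ctypes.create_string_buffer(4096)
n_read = wintypes.DWORD(0)
if k32.ReadFile(handle, ctypes.cast(buf, wintypes.LPVOID), len(buf),
                ctypes.byref(n_read), None):
    print(buf.raw[:n_read.value])   # the bytes actually read

k32.CloseHandle(handle)
```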
1.3.4 Types of System Calls
System calls can be grouped roughly into five major categories: process control, file
manipulation, device manipulation, information maintenance, and communications. Most
of these system calls support, or are supported by, concepts and functions that are discussed in
later chapters.
1.3.5.3 Communication
There are two common models of interprocess communication: the message-passing
model and the shared-memory model. In the message-passing model, the communicating
processes exchange messages with one another to transfer information. Messages can be
exchanged between the processes either directly or indirectly through a common mailbox.
Before communication can take place, a connection must be opened. The name of the other
communicator must be known, be it another process on the same system or a process on
another computer connected by a communications network. Each computer in a network has a
host name by which it is commonly known. A host also has a network identifier, such as an IP
address. Similarly, each process has a process name, and this name is translated into an
identifier by which the operating system can refer to the process. The get hostid and get processid system calls do this translation. The identifiers are then passed to the general-purpose open and close calls provided by the file system or to specific open connection and close connection system calls, depending on the system's model of communication. The
recipient process usually must give its permission for communication to take place with an
accept connection call. Most processes that will be receiving connections are special-purpose
daemons, which are systems programs provided for that purpose. They execute a wait for
connection call and are awakened when a connection is made. The source of the
communication, known as the client, and the receiving daemon, known as a server, then
exchange messages by using read message and write message system calls. The close
connection call terminates the communication.
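As a hedged sketch of the message-passing style (the port number and message text are arbitrary; Python's socket module is used purely for illustration), a small server waits for a connection while a client opens a connection, and the two then exchange messages:

```python
# Minimal message-passing sketch over TCP sockets. Host, port and messages are
# made-up examples; run server() in one process and client() in another.
import socket

HOST, PORT = "127.0.0.1", 5000

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)                       # wait for connection
        conn, _addr = srv.accept()          # accept connection
        with conn:
            msg = conn.recv(1024)           # read message
            conn.sendall(b"got: " + msg)    # write message
                                            # connection closes on leaving "with"

def client():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))           # open connection
        cli.sendall(b"hello")               # write message
        print(cli.recv(1024))               # read message
```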
In the shared-memory model, processes use shared memory create and shared memory attach system calls to create and gain access to regions of memory owned by other
processes. Recall that, normally, the operating system tries to prevent one process from
accessing another process's memory. Shared memory requires that two or more processes
agree to remove this restriction. They can then exchange information by reading and writing
data in the shared areas. The form of the data and the location are determined by the processes
and are not under the operating system's control. The processes are also responsible for
ensuring that they are not writing to the same location simultaneously.
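A comparably small sketch of the shared-memory style uses Python's multiprocessing.shared_memory module (available from Python 3.8; the region name and contents are made up): one process creates a region and another attaches to it. As the text notes, real code would still have to coordinate concurrent access:

```python
# Minimal shared-memory sketch: create a region, attach to it by name, read it back.
from multiprocessing import shared_memory

# "Process A": create a shared region and write some data into it.
region = shared_memory.SharedMemory(name="demo_region", create=True, size=64)
region.buf[:5] = b"hello"

# "Process B" (normally a separate program): attach to the existing region by name.
other = shared_memory.SharedMemory(name="demo_region")
print(bytes(other.buf[:5]))      # -> b'hello'

# Cleanup: each process detaches; the creator destroys the region once.
other.close()
region.close()
region.unlink()
```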
Both of the models just discussed are common in operating systems, and most systems
implement both. Message passing is useful for exchanging smaller amounts of data, because
no conflicts need be avoided. It is also easier to implement than shared memory for intercomputer communication. Shared memory allows maximum speed and convenience of
communication, since it can be done at memory speeds when it takes place within a computer.
In addition to systems programs, most operating systems are supplied with programs
that are useful in solving common problems or performing common operations. Such programs
include web browsers, word processors and text formatters, spreadsheets, database systems,
compilers, plotting and statistical-analysis packages, and games. These programs are known as
system utilities or application programs.
General Warnings
Please employ common sense in using the material. There are millions of different PCs
out there and no two are exactly alike, so obviously you may run into recommendations or ideas
that don't apply to your situation. If this happens, use your own good judgment and modify the advice to suit your particular setup. If anything stated here conflicts with the specific
directions in your system or component manuals, use the directions in the manual unless you
know them to be inaccurate. Don't go inside the box or do other work on your PC if you don't
feel comfortable with what you are doing.
Is your PC under warranty? You should realize that some companies will void your
warranty if you open the PC during the warranty period. If you go to open the box and see any
plastic seals stating "warranty void if broken", then that is a good hint that this applies to you, but it may apply even if you don't see any seals. If you are having trouble with your PC, check your warranty if
you suspect your machine may still be covered. If it is, you may be better off performing a return
or using your warranty.
Electrical Precautions
Pull the plug. Do not work on a system, under any circumstances, while it is plugged
in. Don't just turn it off and think that's enough. Others have done this and fried things (or
themselves!) by accidentally turning the system on while working. Also, the wires that run from
the power supply to the switch at the front of the box on an AT form factor system carry live
voltage when the PC is plugged in, even when it is off. You could end up in the hospital over the
slip of a screwdriver.
Stay out of the power supply unless you know what you are doing. Similarly, do not open
up your monitor unless you are absolutely sure of what you are doing. You can electrocute
yourself even with the power disconnected when inside the monitor. There are components
inside monitors that can hold a charge for a long time after they have been unplugged.
Watch out for components being left inside the box. Dropping a screw inside your case
can be a hazard if it isn't removed before the power is applied, because it can cause
components to short-circuit.
Mechanical Precautions
Make sure you have a large, flat area to work on. That will minimize the chance of
components falling, getting bent, or getting lost.
Don't tighten screws too far or you may strip them or make it impossible to loosen them
later. Don't use screws longer than around 1/8" when mounting drives or you risk damaging the
data storage areas of the drive.
You might not think of this, but watch out for sharp edges inside the case. The frame of
most PCs is made from thin sheet metal, and the edges can cut your fingers if you aren't
careful. You can also cut or strip a wire. The cheaper the box, the more likely you are to find a
sharp edge.
Sometimes it makes sense to turn your machine on with the cover off the case, to see if
something works before replacing the cover. If you do this, be very careful to keep objects from
accidentally falling into the box.
Data Precautions
Back up your data before you open the box, even if the work you are doing seems
"simple". This applies doubly to any upgrades or repairs that involve changes to the
motherboard, processor or hard disk. So many people take chances and end up with disasters.
A surprisingly large number of problems can result in data loss, and unlike equipment, data is
irreplaceable.
Make sure you have at least one bootable system floppy in the event that you cannot
boot your hard drive.
Make a copy of your system's BIOS settings before doing any major work or changing
anything in the BIOS.
Before shutting off the power to your PC, always use the proper shut down procedure for
your operating system. For Windows 9x, Windows 2000 or Windows NT, this means using the
"Shut down" option on the Start menu. For Windows 3.x, close down to DOS first before
powering off.
Static Electricity Precautions
Static electricity can damage sensitive components, so ground yourself before handling them. One easy way to ground yourself is to touch the exterior metal box of your computer's power supply (near the fan) before you unplug it.
Warning: If you are going to use a grounding strap, buy one, don't try to make your own by
simply running a wire from your wrist or whatnot. Commercial grounding straps are specially
designed to incorporate a large resistor that protects you in the event that you touch live power
while grounded. Without it, you risk becoming the path of least resistance for that live power to
ground, and you may be electrocuted.
In general, handle all components by the edges. If you avoid touching any pins,
edges, chips, or anything else made of metal, you greatly decrease the chances that you
will zap or break anything. Smaller components such as loose RAM chips and processors are at
the greatest risk.
Whenever possible, leave static-sensitive devices in their original packaging. Transport
circuit boards and peripherals in an anti-static metallized bag if you do not have the original
packaging material. However, do not put this material inside your PC, or plug in a motherboard
while it is sitting on top of one of these bags. They are anti-static because they are partially
conductive; you don't want your motherboard shorted out by firing it up while several hundred
pins from its components are touching a partially conductive material.
Many PC users do not back up their data regularly, usually for one of two reasons:
They don't understand how important backups are, because they haven't had a disaster happen to them (yet).
They forget to do them because they don't have a routine for doing backups.
This section takes a full look at the matter of backups, and discusses how to do them,
how to set up a backup program, and what to do in the event of a disaster. Part of the focus is
on making backups easy to do so that you will remember to back up regularly without it taking
an inordinate amount of time and energy.
Floppy Disks
Floppy disks are not suitable as backup devices for a modern PC. Floppies are slow,
relatively unreliable, and far too small to make effective backup devices in the age of 1 GB hard
disks. Their only possible use for backup is for archiving small files.
The most important use of floppy disks is as a vehicle for storing critical information about your system, for use in the event of a system problem. Emergency boot disks are best stored on floppies so they can be used in the event of a hard disk problem.
Large Floppy Disk Equivalent Drives: These devices are suitable for backup
only if you have a small hard disk, or have the diligence and patience to do attended
backups or large numbers of partial backups. As hard disks increase in size to 4 GB
and beyond, trying to do backups to a device that is only a little more than 100 MB
becomes impractical, and quite expensive. Their performance is generally poor to average.
Removable Hard Disk Equivalent Drives: These are much more suitable for use
as backup devices due to their larger capacity, but even here things are becoming
stretched, since even 500 MB to 1 GB is becoming inadequate for unattended
backups. These drives have generally much higher performance than the smaller
drives, and much higher price tags to go with them.
CD-Rewriteable: This drive is really in the same category as the removable hard
disk equivalents listed above. CD-RW has a lot going for it as a general-purpose
medium, because of its flexibility: its media are reusable, and it can also burn CD-Rs that can be read in most CD-ROM drives and audio CD players. The capacity is only 650 MB.
Network Backup
The idea is fairly simple: copy data from one PC to another over the network. Duplicating
each PC's information provides a way to protect each individual PC.
In a way, this type of backup is most similar to in-place hard disk duplication in terms of
how it works. It is simple in the same way, and can be automated. It addresses some of the
concerns about that method: there isn't the same single point of failure in terms of virus attack
or hardware failure. However, depending on the location of the two PCs, theft, disaster and
sabotage can still be a big problem: if the two PCs are sitting on different desks in the same
office, you haven't gained much to protect against these threats. Also remember that file-infector
viruses can travel over a network.
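As a rough illustration (the share and directory names are hypothetical, and this assumes the other PC already exposes a shared folder), a simple network backup amounts to copying a directory tree across the network:

```python
# Minimal network-backup sketch: copy a local data directory to a share on
# another PC. Both paths are made-up examples.
import shutil
from datetime import date

SOURCE = r"C:\Data"                                     # local data to protect
DEST = r"\\OTHERPC\backup" + "\\" + str(date.today())   # dated folder on the other PC

shutil.copytree(SOURCE, DEST)   # recursively copies files and subdirectories
print("backup copied to", DEST)
```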
Some of the backup methods and devices described in this section do a much better job
than others of protecting against the risks to your data. The table below shows a general
summary of how the various methods stack up in terms of protecting you from the hazards that
threaten your valuable data. Remember that this is just a general guideline; also remember that
some of these risks are much more common than others are, depending on how you use your
PC:
Minimized Startup Cost: How well does the method minimize startup costs for
hardware or software?
Minimized Media Cost: Does the method allow additional backups at a reasonable
cost?
Simplicity / Convenience: How easy is the method to use? Is there any difficulty
associated with the method that would tend to discourage doing backups?
Universality: How common is the hardware used for the method? If you needed to
use the device in five years, how likely is it that you could find support or additional
media for it?
Performance: How fast is the hardware and software used for the method? How
much time will it take to do a backup?
Routine Potential: In general, how likely is it that, using this method, someone is
likely to settle into a backup routine and stick with it?
Here is the chart; for all items, "High" is better and "Low" is worse:
What To Back Up
To ensure that your backups are performed properly, in a way that ensures that you are
protected without taking so much of your time that they become a chore, you must determine
what files to back up and how often to back them up. Some files will need to be backed up more
often than others. This section takes a look at what files you will want to include in your backup
routine, and also the ones you will usually want to exclude.
Selective Backup: In a selective (or partial) backup, you select specific files and
directories to back up. This type of backup gives you more control over what is
backed up, at the expense of leaving part of the hard disk unprotected. Selective
backups make sense when some files are changing much more rapidly than others,
or when backup space is limited, although in many cases doing an incremental
backup is better and easier.
Incremental Backup: If you perform frequent backups, as you should, you may
find yourself backing up the same files over and over, even ones that do not change
over time. Instead, you may want to consider a mix of full backups and incremental
backups. An incremental backup is one where only the files that have changed since
the last backup are selected. It is like a selective backup, but the files are selected
based on whether they have changed recently or not, instead of an arbitrary
selection based on directory or file names. This gives the time- and space-saving
advantages of a selective backup while also ensuring that all changed files are
covered.
Incremental backups are supported by most decent backup software. They work using
the archive bit that exists for each file and directory. The backup software looks at this bit to
determine what files have been changed since the last backup, selects them for backup, and
then clears the bit for each file it backs up. Whenever a file is subsequently modified, the file system sets the bit again, so on the next incremental backup that file is selected once more, and so on. You must rely on this bit being managed properly, and I don't always like to do this.
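To make the mechanism concrete, here is a hedged, Windows-only sketch (the directory path is hypothetical) of how a backup tool might use the archive bit through the standard Win32 attribute calls:

```python
# Sketch of archive-bit-driven incremental selection (Windows only).
# FILE_ATTRIBUTE_ARCHIVE is the 0x20 bit reported by GetFileAttributesW.
import ctypes
from ctypes import wintypes
import os

FILE_ATTRIBUTE_ARCHIVE = 0x20
INVALID_FILE_ATTRIBUTES = 0xFFFFFFFF

kernel32 = ctypes.windll.kernel32
kernel32.GetFileAttributesW.restype = wintypes.DWORD   # keep the value unsigned

def changed_since_last_backup(root):
    """Yield files whose archive bit is set (modified since it was last cleared)."""
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            attrs = kernel32.GetFileAttributesW(path)
            if attrs != INVALID_FILE_ATTRIBUTES and attrs & FILE_ATTRIBUTE_ARCHIVE:
                yield path, attrs

for path, attrs in changed_since_last_backup(r"C:\Data"):
    # ... a real tool would back the file up here, then clear its archive bit ...
    kernel32.SetFileAttributesW(path, attrs & ~FILE_ATTRIBUTE_ARCHIVE)
```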
Which type of backup you do depends, again, on what is important to you, in terms of
time, media cost, and also ease of restoration.
Programs are a somewhat different story for two main reasons: first, they are static,
meaning once installed they do not change (with a few exceptions). Second, they are
recreatable; if your Microsoft Office directory gets wiped out, you can reinstall it from your
original CD-ROM disk. The combination of these characteristics suggests that backing up
programs is less important than backing up data, and this is true. Programs do not need to be
backed up as often as data does.
Note: Some PCs ship with their software preinstalled on the hard disk and no original disks or
CD-ROMs! This is a poor practice and I recommend that people avoid buying from companies
that do this, since it makes it very hard for you to reinstall software if you need it in case of
disaster. If you have no original disks, your installed programs should be treated as just as
unrecreatable as your data.
Most newer software will in fact automatically deselect the items above, unless you
override and tell it you want them included anyway. Many types of backup software will also let
you select classes of files, by file type, that you want to exclude for whatever reason.
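As a small illustration (the root directory and the excluded extensions are arbitrary examples), excluding classes of files by type boils down to filtering the file list before it is handed to the backup routine:

```python
# Sketch of excluding classes of files, by extension, from a backup selection.
import os

EXCLUDE_EXTS = {".tmp", ".swp", ".bak"}   # file types chosen for exclusion

def files_to_back_up(root):
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() not in EXCLUDE_EXTS:
                yield os.path.join(dirpath, name)

for path in files_to_back_up(r"C:\Data"):      # hypothetical data directory
    print(path)   # a real backup tool would copy or archive the file here
```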
Long Filenames
Long filenames use special tricks in the directory structure to store names of up to about
250 characters while maintaining compatibility with older DOS and Windows 3.x applications
that use standard short filenames.
In terms of backup, LFNs present no concern as long as you use a backup utility that
knows how to deal with them. You do have to exercise some caution when moving files between PCs, because the short filename aliases can change when files are moved between systems. For standard backup of a PC, long filenames should not pose a problem if
you are using the right software.
Tip: It is possible with some BIOSes to get the BIOS setup screens to print out on the printer.
Turn on the PC, and boot up DOS. Then hit {Ctrl}+{Alt}+{Delete} to reset the PC, and go into
BIOS setup. On some systems, the routine that handles the {Print Screen} key will still function,
and you can use it on each of the BIOS screens as a nifty shortcut!
CMOS Backup Utilities: There are small utilities that will go through the CMOS
settings and record them to a regular file, which can then be backed up through
normal means, or just copied to a couple of floppy disks. Some of these programs
also offer the option of restoring the CMOS settings from one of these files if the
CMOS ever gets wiped out.
How To Back Up
Backup Timing
Backup Software
Software Conflicts
Data Verification
Backup Compression
Media Storage
While backing up your data is in some ways a simple matter, there are in fact some special techniques that can come into play to make backups more effective and less of a hassle. This section takes a look at specific techniques and considerations for performing backups, some of which you might not think of. This includes a discussion of backup timing, scheduling, media storage, and how to ensure that your backups work and will protect you in the event that you need them.
Backup Timing
Selecting a time of day to perform backups is a matter of personal choice. It depends, as
usual, on how you use your PC, and also on how long it takes for you to perform backups. Most
people prefer to do unattended backups, and therefore, set their backups up to run when they
are not around. The most common times to run backups therefore are:
Overnight: start a backup before going to sleep, have it run overnight, and then see
the results in the morning. This makes sense because the PC isn't being used, and a
full backup of a loaded system, including time to perform verification, can take
several hours.
During the Day: If you work during the day you can set up your home machine to
perform backups while you are at the office. This is really not much different in concept from the overnight backup; both use slack time, when the PC isn't being actively used, to perform backups.
Backup Software
An important part of the backup puzzle is using the right software. The difference
between good and mediocre backup software can be the difference between backups that are
reliable and easy to use and ones that are not. The difference between mediocre software and
bad software can be the difference between backups that restore properly when you need them
to, and those that leave you high and dry!
Many backup devices ship with basic backup software, provided as a courtesy by the
hardware manufacturer. In many cases these are functional but stripped-down versions of
commercial packages. They will usually work, but may not be nearly as full-featured as a
package you would buy at the store. The following are abilities or features that you may want to
consider carefully when looking at PC backup software (not listed in any particular order):
Wide Device Support: Backup software varies significantly in its ability to support
backup devices. Generally speaking, it is more difficult to find software support for
newer devices than well-established ones. Some software companies will make
software updates available for their users to provide expanded support as new drives
hit the market; others will not. Do remember that while support for more devices
gives you more flexibility, ultimately the only device you really need support for is the
one that you are actually using.
Operating System Support: The software should support all of the features and
requirements of the operating system under which it runs. This means, for example,
that Windows 98 software should have full support for long filenames, backup of the
Windows 98 Registry, and backup of FAT32 partitions.
Backup Type Selection: All good backup software will let you choose between
doing full, selective and incremental backups. Better ones will let you select files and
directories based on search strings or patterns.
Media Spanning: The software should provide proper support for backing up to
multiple pieces of media in a media set. So if you did a backup to Zip disks and the
data took up 250 MB, the system should prompt you when it is time to switch disks,
etc. Strangely, some poor backup software has problems with this.
Disaster Recovery: A very important feature, and one that is often found only on more expensive products, is support for automatic disaster recovery. With this type of
software, sometimes called one-step recovery or single-step restore or similar, a
floppy disk is created with a special recovery program that will let you restore your
system simply. Without this feature, you often have to reinstall the entire operating
system before you do a restore, which can cost a lot of time and cause a lot of
problems.
Scheduling and Automatic Operation: Depending on how and when you do your
backups, it can be very helpful to have the software run automatically at a preset
time.
Backup Verification: Every decent backup package will allow you to enable a
verification mode. When active, the software will read back from the tape every file
that it backs up and compare it to the file on the hard disk, to ensure that the backup
is correct. This is important to ensure that your backups are viable.
Compression: Good backup software will give you the option of enabling software
compression, possibly at various levels, to enable you to save space on your backup
media.
Security: Better software packages will let you password-protect a backup set so
that the password is required to view or restore from the backup image. (Be very
careful before using something like this, you don't want to lose that password!)
Software Conflicts
When performing backups in a multitasking operating system, it is necessary to be wary
of possible conflicts between the backup software and any other software that may be running
simultaneously. Any programs that are running in the background that might write to files or
directories on the disk can confuse the backup software, especially when it goes to verify the
files it has backed up, because it may find different files in some directories at the end of the
backup compared to what was there at the beginning.
Another problem is with files that are locked due to another program having exclusive
access to the file. To prevent more than one application from changing a document at the same time, many applications will lock the files they have open so that no other application can use them. This can cause
the backup software to be unable to back up these files.
In most cases, the way to avoid these problems is simply to turn off other software while you are doing a backup. Disable your screen saver as well, to ensure that it doesn't cause interference.
Data Verification
The idea behind verification is simple: after the files are backed up, the backup software
reads back the information from the backup media and compares it back to the original files.
This ensures that the backup just made is readable, and that the files match what was just
copied. The only disadvantage is that it lengthens the amount of time that it takes to perform the
backup, but if you are backing up overnight or while away from the PC, this will have no effect
on you anyway.
There are two different levels of verification that you will sometimes find, depending on
the software you are using. The most secure level of verification is full verification, where each
and every file that is backed up is also verified by reading back from the backup medium. A
lesser type of verification is sampling verification. Here, instead of verifying everything that was
backed up, a sample of what was backed up is read back and verified. This makes the
verification take much less time, but of course doesn't do nearly as good a job.
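A hedged sketch of what full verification amounts to (the file pairs are hypothetical): read the backed-up copy back and compare it against the original, here by hashing both:

```python
# Sketch of full verification: hash each original file and its backup copy and
# compare the digests. The example paths are made up.
import hashlib

def file_digest(path, chunk_size=65536):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

pairs = [(r"C:\Data\report.doc", r"E:\Backup\Data\report.doc")]   # hypothetical

for original, copy in pairs:
    ok = file_digest(original) == file_digest(copy)
    print("OK" if ok else "MISMATCH", original)
```

Sampling verification would simply hash a random subset of the pairs instead of all of them.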
Backup Compression
Most backup systems support some type of compression. The idea behind compression
is simple: to save space and allow the backup of more data onto a given media set. Most
software supports compression, and in fact many backup devices even quote their capacity on
the assumption that compression will be used during backup.
Normally there is nothing wrong with using compression, and in fact I use it myself in
many cases. You do need to bear in mind a few things, however:
Compressibility: Not all files will compress equally well. If you are backing up a
large number of files that are already in a compressed format, it may make sense to
turn off compression, since it isn't going to do much for you anyway.
Proprietary Formats: Each software program will use its own compression
algorithm. This means that the tape written by one program may not be readable by
a different software package. This is not normally a problem since most people only
use one package on one PC, but it is something to bear in mind. The backup formats
themselves are reasonably universal if compression is not used.
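As a minimal sketch of software compression during backup (the directory and archive names are hypothetical), Python's tarfile module can write a gzip-compressed archive; as noted above, such an archive is only readable by tools that understand the same format:

```python
# Sketch of a compressed backup written as a gzip-compressed tar archive.
import tarfile

with tarfile.open(r"E:\Backup\data-backup.tar.gz", "w:gz") as archive:
    archive.add(r"C:\Data", arcname="Data")   # compresses the tree while writing

# Files that are already compressed (.zip, .jpg, .mp3, ...) gain little here,
# which is why turning compression off can make sense for such data.
```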
Media Storage
Your data backups are exactly as safe as the physical media that contain them. If you do a tape backup and then leave the tape lying on top of the PC box, you are partially defeating the purpose of doing backups: you give up the protection that tape offers against risks like theft, disaster or sabotage.
Backup media should be stored in a safe place, away from the PC.
Depending on the type of backup media you are using, you want to make sure that the
storage environment is appropriate. For magnetic media such as tapes and disks, you
want to ensure that the storage area offers protection from the hazards that threaten
them, including temperature, moisture, dirt, magnetic fields and the like.
Finally, pay attention to the matter of off-site storage. It is a good idea to ensure that one or more of the backup media sets in the media rotation system you are using is always stored off-site. This is important for safeguarding against a total disaster such as a hurricane.