 Access Control: Concept and Techniques
This article was published in Direct Access for Canada's Information Systems Professionals. Direct Access is published 25 times a year by Laurentian Technomedia Inc., a unit of Laurentian Publishing Group.

Prepared By Dr. Mir F. Ali

Introduction:
It is no secret that a new revolution in information technology is bringing us out of the industrial age. This has resulted in the storage and transmission of vast amounts of confidential data, and an increasing concern about unauthorized access to this data.

We are living in a dynamic era where technology is affecting our lifestyle every day and has had a penetrating impact on the way we do business. It is becoming increasingly important, in the interest of the integrity, accuracy and privacy of information, to have controls that prevent unauthorized persons from gaining access to sensitive and valuable information.

The use of computers in business, government, and industry is growing every day in the automation of accounting, purchasing, manufacturing, retailing, management, research, and other related functions. The information processed and maintained in these systems is necessary to daily operations.

The storage and processing of sensitive information, and the integration of computers into operations, provide incentives for different types of security breaches. On the one hand, there is the opportunity to access, acquire or modify vital information on a timely basis to support business objectives. On the other hand, this facility could be used for personal gain or out of spite in ways that could disrupt a computer system's operation and cause severe harm to the organization.

Cryptography:
Cryptology is one approach to privacy in the information age. It does not protect communications as such, but rather their content. Even if someone gets into the system, or into a part of it for which he is not authorized, he cannot gain the information contained in the encrypted files.

There are two standardized computer cryptographic systems. One, chiefly represented by the so-called RSA (Rivest, Shamir and Adleman) cipher developed at MIT, is a “public key” system which, by its structure, is ideally suited to a society based upon electronic mail. The other is the American Data Encryption Standard (DES), developed at IBM, which is featured in an increasing number of hardware products that are quick, but expensive for the everyday user.

The important point is that communications security must also satisfy other requirements:
  • Prevention of traffic analysis: Determination of frequency, volume, length, and origin-destination patterns of the message traffic;
  • Detection of message stream modifications;
  • Detection of denial of message service; and
  • Detection of attempts to establish unauthorized connections.

In addition, the following principles can be incorporated in security systems for computer networks:

  • Economy of security mechanism: The use of the simplest possible design that can achieve the desired effect;
  • Fail-safe property: Design approach where access granting is based on permission, not denial;
  • Complete mediation: Every access request is tested against access authorization tables;
  • Open design: The configuration and operation of the protection mechanism is not secret. Protection is not dependent on the ignorance of the attackers, but on the intrinsic strength of the mechanism;
  • Separation of privilege: A protection mechanism requiring two keys is more robust than a mechanism where a single key is used;
  • Least privilege: Every program or user, and every security mechanism should be granted only the absolute minimum of capabilities needed to perform their functions;
  • Least common mechanism: Limitation on the number of shared security mechanisms. Each represents a potential information path between users who may attempt to employ it for unauthorized communications; and
  • Psychological acceptability: The human interface must be designed for ease of use with as much transparency to the user as possible, so that users do not have incentives to avoid complying with security procedures.

The perceived need for computer and communications privacy has brought cryptology out of the shadowy world of espionage and secrecy with which it has traditionally been associated in the popular imagination. The advent of the micro, the PC, and the inexpensive modem makes the cryptology option a concern of everyday users as well as of the mainframe or mini installation. More and more cryptology products are likely to come on stream in the near future.

Investment in relevant and suitable access controls can: Prevent damage to computer equipment, installations, and premises; reduce the chances of information being tapped and fraud attempted; ensure reliable data processing by seeing to it that errors, inaccuracies, mishaps and omissions are detected more easily; and ensure that the situation is remedied as soon as possible, should anything happen.

One of the difficulties in implementing adequate access controls is that many designers lack awareness of the spectrum of controls available within information technologies. Thus, a fundamental consideration in any solution has to be the progressive adoption of increasingly capable computer security controls.

The most common control implemented in application programs is the “check digit”. This is a technique in which one digit within a large numeric field is mathematically derived from the remaining digits. When properly implemented, check digits can detect most errors, including transpositions.
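
As a hedged illustration, the sketch below derives a modulus-10 (Luhn) check digit in Python; the particular algorithm and sample numbers are illustrative choices, not taken from the article.

  def luhn_check_digit(digits: str) -> int:
      """Derive a single check digit from the remaining digits of the field."""
      total = 0
      for position, char in enumerate(reversed(digits)):
          value = int(char)
          if position % 2 == 0:        # double every second digit from the right
              value *= 2
              if value > 9:
                  value -= 9           # equivalent to summing the two digits
          total += value
      return (10 - total % 10) % 10

  def is_valid(field: str) -> bool:
      """Verify a numeric field whose last digit is the check digit."""
      return luhn_check_digit(field[:-1]) == int(field[-1])

  print(luhn_check_digit("7992739871"))   # 3
  print(is_valid("79927398713"))          # True
  print(is_valid("79927398731"))          # False: transposition detected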

Other application controls include:

  • Format Checks;
  • Range Checks;
  • Hash Totals;
  • Existent Code Checks;
  • Cross-Footing;
  • Audit Trails;
  • Time and Date Stamps;
  • Independent Reconciliation;
  • Digital Labels; and
  • Storage Volumes.

Hardware access controls include:

  • Content Checks;
  • Copy Protection Schemes;
  • Copyright Violations;
  • CPU State Registers;
  • Data Encryption Standards;
  • Data Integrity;
  • Data Security;
  • Digital Signatures;
  • Encryption;
  • Fingerprint Verification;
  • Passwords; and
  • System Kernels.

Operating System controls include:

  • Automated Logical Access Control Standards (ALACS); and
  • Trusted Computer Base (TCB).

Communications controls include:

  • Callback;
  • Eavesdropping Controls;
  • Checksums;
  • Signatures; and
  • Authentication.

To prevent anyone with access to a system from obtaining access to all the information in it, access must be restricted according to each person's information need.

Thus, different users, or user-groups, should be authorized for that part of the information or system needed for them to do their job.

Application Control Techniques:

Authorization: When it has been decided to restrict access to a certain system or to certain information, the first thing to do is to evaluate which parts of the information or system different user-groups are to have access to. Access may also depend on the classification of the information in question.

If a company has decided to use two levels of classification (e.g. Internal Use Only and Confidential), it might also be practical to give two levels of authorization to personnel, according to the level of classification to which they have been entrusted access. In addition to classification levels, access restrictions may depend on which part of the information or system a person needs to perform his or her job.
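
A minimal sketch of the idea, assuming just the two classification levels mentioned above; the user groups and clearance assignments are invented for illustration.

  # Authorization by classification level (illustrative names and groups).
  LEVELS = {"Internal Use Only": 1, "Confidential": 2}

  # Each user group is cleared up to a given classification level.
  CLEARANCE = {
      "clerical": LEVELS["Internal Use Only"],
      "finance":  LEVELS["Confidential"],
  }

  def may_access(group: str, document_level: str) -> bool:
      """A group may see information only if its clearance covers the level."""
      return CLEARANCE.get(group, 0) >= LEVELS[document_level]

  print(may_access("clerical", "Confidential"))      # False
  print(may_access("finance", "Internal Use Only"))  # True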

Password: Once an authorization system has been implemented, some kind of identification or password system is needed to ensure that only authorized personnel are granted access. A password is a unique combination of characters that gives a person access to predefined information, or allows certain rights or commands from an end-user terminal.

The password is a personal key, which helps to prevent unauthorized persons from gaining access. In a computer system, there might be various levels of passwords to access different parts of the information, or different levels of classification.

In general, passwords are used to gain access to: Terminals; computers or parts of computers; areas in the memory; programs or parts of programs; files or records; information categories; and specific commands.

The use of password systems makes it possible to restrict personnel from access beyond their real need. Such restrictions may be embedded in the operating system, in transaction programs and in application programs. Consideration should be given to restricting the use of information according to the various needs (e.g. read, update, write, delete).
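
To make the paragraph above concrete, here is a small, hedged sketch of an access-authorization table that restricts each user to the operations (read, update, write, delete) needed for the job; the user and resource names are hypothetical.

  # Every request is tested against the table (the "complete mediation" principle).
  PERMISSIONS = {
      ("j.smith", "payroll_file"): {"read"},
      ("a.jones", "payroll_file"): {"read", "update", "write"},
  }

  def check_access(user: str, resource: str, operation: str) -> bool:
      """Grant an operation only if it appears in the user's authorization set."""
      return operation in PERMISSIONS.get((user, resource), set())

  print(check_access("j.smith", "payroll_file", "read"))    # True
  print(check_access("j.smith", "payroll_file", "delete"))  # False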

It has become increasingly common to implement additional automated access controls:

  • A terminal may be automatically disconnected between certain hours, to prevent maintenance personnel and others with abnormal working hours from gaining access to the system;
  • When no activity has been registered on a terminal for more than a set number of minutes, the terminal may be automatically disconnected. Such disconnection should apply when a terminal operator leaves his desk for an extended period of time and simply forgets to log off; and
  • It may also be relevant to restrict access to certain transactions, or to special types of information, to predefined periods, so that important functions can be processed only at certain times during the day.
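
A brief sketch of the time-of-day and idle-timeout controls just listed, using only the Python standard library; the permitted hours and idle limit are illustrative values.

  import time

  PERMITTED_HOURS = range(7, 19)   # connections allowed 07:00-18:59 (illustrative)
  IDLE_LIMIT_SECONDS = 15 * 60     # disconnect after 15 idle minutes (illustrative)

  def connection_allowed(now=None):
      """Refuse sessions outside the permitted hours."""
      return time.localtime(now).tm_hour in PERMITTED_HOURS

  def should_disconnect(last_activity, now):
      """Drop the terminal when it has been idle for too long."""
      return (now - last_activity) > IDLE_LIMIT_SECONDS

  print(connection_allowed())
  print(should_disconnect(time.time() - 20 * 60, time.time()))  # True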

Communication Control Techniques:
Encryption: The word “encryption” is derived from the Greek “kryptos” meaning “hidden” or “secret”. Information can be protected against unauthorized use, by making it “hidden”. A person with legal access to such encrypted information must first decrypt the data to be able to read it.

Encryption is done by means of an encryption system, a combination of a key (code) and an algorithm. The key must be known to the receiver, and must be kept secret from unauthorized persons. The algorithm may vary depending upon quality requirements. The algorithm itself is normally open information, known to everyone. It is knowledge about the key that makes it possible to decrypt the data.
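
A hedged sketch of the key-plus-algorithm idea, assuming the third-party Python “cryptography” package is available (any modern symmetric cipher library would illustrate the same point): the algorithm is open, and secrecy rests entirely on the key shared with the receiver.

  from cryptography.fernet import Fernet

  key = Fernet.generate_key()      # the secret that must be shared with the receiver
  cipher = Fernet(key)             # the algorithm itself is open information

  token = cipher.encrypt(b"Quarterly results: Confidential")
  print(token)                     # unreadable without the key
  print(cipher.decrypt(token))     # b'Quarterly results: Confidential'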

The need to encrypt data is generally greatest in connection with transmission in data networks. It is possible to obtain equipment that provides the necessary protection for most applications.

Encryption of passwords is often recommended. This should be arranged so that if a person gets access to one legal password, and then gains access to the encrypted list of passwords in the computer memory, it should still not be possible to find the key to decrypt all the other passwords (one-way encryption).
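
The sketch below shows one common way to realize such one-way storage with the Python standard library; the salt size and iteration count are illustrative choices.

  import hashlib, hmac, os

  def store_password(password: str):
      """Keep only a salted one-way digest, never the password itself."""
      salt = os.urandom(16)
      digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
      return salt, digest

  def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
      candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
      return hmac.compare_digest(candidate, digest)   # constant-time comparison

  salt, digest = store_password("s3cret-phrase")
  print(verify_password("s3cret-phrase", salt, digest))   # True
  print(verify_password("wrong-guess", salt, digest))     # False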

Authentication: A relatively new concept developing out of cryptographic techniques is the ability to “seal” computer files or data streams. While providing no protection or discrete access limitations, seals reveal when data streams or files have been altered. Such seals (functionally, forms of encrypted hash totals) have been effectively applied by the banking industry in Sweden.

This technique can also be used to seal messages without encryption. This is a much older application of the technique, and has long been in use in telexed financial messages. In data communications terminology, the seal is known as an authenticator and accompanies the message. The authenticator is verified using a previously exchanged secret key.
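
One way such an authenticator can be realized today is with a keyed hash (HMAC) over the message, as in this hedged standard-library sketch; the shared key and the message are invented examples.

  import hmac, hashlib

  SHARED_SECRET = b"previously-exchanged-key"   # illustrative value

  def seal(message: bytes) -> str:
      """Compute the authenticator that travels alongside the clear-text message."""
      return hmac.new(SHARED_SECRET, message, hashlib.sha256).hexdigest()

  message = b"PAY 1,000.00 TO ACCOUNT 12345"
  authenticator = seal(message)

  # The receiver recomputes the seal and compares.
  print(hmac.compare_digest(authenticator, seal(message)))                           # True
  print(hmac.compare_digest(authenticator, seal(b"PAY 9,000.00 TO ACCOUNT 12345")))  # False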

Authentication has been particularly important to the financial industry because of the risk of undetectable encipherment errors in numeric messages. Such errors could cause unrecoverable transactions in erroneous monetary amounts, or to erroneous institutional account numbers.

Signatures: An interesting benefit of some Public Key Encryption (PKE) techniques is the ability to generate an encrypted signature. PKE techniques such as the RSA scheme use two keys: a public key for encryption and a private key for decryption. The encryption algorithm uses a “one-way” or “trapdoor” function that makes derivation of the private key a prohibitively lengthy computation.

The RSA scheme is invertible; i.e., data encrypted with the private decryption key may only be decrypted with the public encryption key. Thus a message may be “signed” by encryption of a signature with the private key. The receiver of the message can verify its authenticity by decrypting the signature. Fake messages can be detected by this technique, as it is a prohibitively long process to forge the encrypted signature, even knowing the decrypted value and the public key.
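
A textbook-scale sketch of the sign-with-private, verify-with-public idea, using deliberately tiny numbers purely for illustration; real systems use key sizes of at least 2048 bits and sign padded hashes rather than raw values.

  p, q = 61, 53
  n = p * q            # 3233, the public modulus
  e = 17               # public exponent
  d = 2753             # private exponent: e * d = 1 (mod (p-1)*(q-1))

  def sign(message: int) -> int:
      return pow(message, d, n)              # only the private-key holder can do this

  def verify(message: int, signature: int) -> bool:
      return pow(signature, e, n) == message # anyone with the public key can check

  m = 65               # a (hashed) message value, reduced mod n
  s = sign(m)
  print(verify(m, s))        # True
  print(verify(m + 1, s))    # False: a forged or altered message fails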

Checksums: Most computer communications hardware provides the transmission of checksums or block check characters with each data transmission “unit”. These characters are conditioned by the bit values of the other data in the unit, providing a parity check of each message unit. Degradation or failure in the communication link would likely be detected by checksum errors.
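
The sketch below shows a simple block check character computed as a longitudinal parity (XOR) over the unit; the message content is an invented example of the general technique rather than any particular link protocol.

  def block_check(data: bytes) -> int:
      """XOR of every byte in the unit; appended to the unit on the link."""
      bcc = 0
      for byte in data:
          bcc ^= byte
      return bcc

  unit = b"TRANSMISSION UNIT 001"
  bcc = block_check(unit)

  corrupted = b"TRANSMISSION UNIT 101"
  print(block_check(unit) == bcc)        # True: the unit arrived intact
  print(block_check(corrupted) == bcc)   # False: degradation detected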

Callback: Computer systems can now be protected against unauthorized access from the public switched network by “callback” devices. These devices prohibit dialed-in log-ins, but initiate a return telephone call to a user when activated by the user from the public switched network. The return call is placed to an authorized number from a list in the system. Thus, the sessions may only be initiated from previously authorized telephone sets.
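
A hedged sketch of the callback logic described above; hang_up() and place_call() are hypothetical stand-ins for the modem interface, and the numbers in the list are invented.

  AUTHORIZED_NUMBERS = {
      "m.ali": "613-555-0142",     # illustrative entries
      "j.doe": "416-555-0178",
  }

  def hang_up():
      print("dropping inbound call")

  def place_call(number: str):
      print(f"calling back authorized number {number}")

  def handle_dial_in(user_id: str):
      """Never grant a dialed-in session directly; call back a listed number."""
      number = AUTHORIZED_NUMBERS.get(user_id)
      if number is None:
          return                   # unknown user: no callback, no session
      hang_up()                    # the dialed-in connection is dropped
      place_call(number)           # the session starts only on the return call

  handle_dial_in("m.ali")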

Future Trends: Recognizing the difference between an authorized user and a fake is a difficult problem for machines. Traditional password systems can never be made perfectly secure and are expected to be replaced by smart card authentication systems in many applications.

The high level of security demanded by access control or payment applications can only be reached using cryptographic techniques. Today's smart card authentication schemes are usually based on a symmetric cryptographic algorithm (e.g. DES). It is well known, however, that the use of asymmetric algorithms, i.e. public key (e.g. RSA) or zero-knowledge schemes, would offer advantages: No global secret keys have to be stored on the terminal side (e.g. retailer, bank); and digital signatures can be utilized.

In the near future digital signatures are expected to be one of the most important security mechanisms for smart card applications. The general principle is that only the sender of the message is able to sign a message with his secret key. The receiver of a signed message can then use the corresponding public key to check the origin of the message as well as whether the message has been manipulated on the transport link.

Smart card applications based on asymmetric cryptographic algorithms will typically require the following security services: Mutual card/terminal authentication; and message authentication (signature generation/verification).
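
As a hedged sketch of mutual card/terminal authentication by challenge-response with digital signatures, the example below again assumes the third-party Python “cryptography” package; the key sizes, challenge lengths and role names are illustrative.

  import os
  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import rsa, padding

  # Each side holds its own key pair; the other side knows only the public key.
  card_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  terminal_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

  PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

  # Terminal challenges the card; the card signs the challenge with its secret key.
  challenge = os.urandom(16)
  response = card_key.sign(challenge, PSS, hashes.SHA256())
  card_key.public_key().verify(response, challenge, PSS, hashes.SHA256())  # raises if forged
  print("card authenticated")

  # The card challenges the terminal in the same way, completing mutual authentication.
  challenge = os.urandom(16)
  response = terminal_key.sign(challenge, PSS, hashes.SHA256())
  terminal_key.public_key().verify(response, challenge, PSS, hashes.SHA256())
  print("terminal authenticated")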

A smart card is a plastic card with an embedded integrated circuit chip. There are memory and microprocessor smart cards. Today, a typical chip for a smart card application offers an 8-bit microprocessor and a memory space of about 3Kbyte ROM, 2Kbyte EEPROM and 128 Byte RAM.

The automatic verification of claimed identity, or direct recognition of persons, has been utilized for the past 20 years. As a result of significant research over this time, commercial products have emerged during the past decade.

Several signature-based systems use dynamic signatures. There are also some systems for verifying “cold” or static signatures in, for example, the clearing of cheques for payment. A variety of physical characteristics have been suggested and investigated for “Automatic Personal Identification” (API), including:

  • Facial Features, full face and profile
  • Fingerprints
  • Palm prints
  • Footprints
  • Hand geometry (Shape)
  • Ear (Pinna)
  • Shape of Retinal Blood Vessels
  • Striation of the Iris
  • Surface Blood Vessels (In the wrist); and
  • Electrocardiac Waveforms

Promising solutions, leading perhaps to a biometrics smart card, are being worked on by a partnership between the British Technology Group and several equipment suppliers and card issuers. These solutions may result in a cost-effective biometrics smart card.

Canadian Government Initiatives:
Treasury Board Canada, together with Supply and Services Canada, initiated a project in 1989 dealing with electronic authorization and authentication for payment requisitions in the Canadian Government environment.

The major focus of this initiative was to ensure that the certification requirements under Schedule III were met. Schedule III states that certification:

  • Shall comply with the requirements of the Treasury Board for the control of financial transactions;
  • Shall be in such a form that it cannot easily be imitated, duplicated or emulated by a person other than the person authorized to certify; and
  • Shall be such that: It clearly identifies the person certifying the requisition; it involves the use of information which is personally generated at the time of certification by the person authorized to certify and is not originated from a stored location as part of an automated process; and it can be authenticated by the Receiver General before payment or settlement is made, and can be automated after payment or settlement is made.

Two pilots were conducted as a part of this initiative and different scenarios were tested through these pilots to finalize a policy for handling electronic authorization and authentication in the Canadian Government environment.
