Cybersecurity in the early 1990's

Back in the late 1980’s and early 1990’s, I remember collecting MS-DOS computer viruses. I’d download them from an online Bulletin Board System (BBS) and store them on floppy disks in compressed format to prevent them from executing. For fun, some friends and I would occasionally release them on one of our systems to see what they did, and then reinstall MS-DOS afterwards. Most viruses didn’t do very much, but some others (e.g. BBkiller) would remove or modify the contents of files.

During the early 1990’s, viruses and other computer-related security topics were rarely discussed, and certainly not taken seriously. For example, when I worked as a computer programmer for the University of Waterloo School of Accounting, we could request login credentials (username/password) for nearly any system we wanted access to, and in most cases we’d have full access to all of the data on that system. Many of us also shared our login credentials with fellow programmers so they could use our local workstations when they needed to. Login credentials were just a necessary nuisance. Sure, we had all seen the 1983 movie War Games where Matthew Broderick almost started a nuclear war by logging into a government computer. But we were creating boring accounting programs, not missile systems. And while mainstream media had already started using the word hacker to refer to malicious computer users, computer hacking headlines were sparse and typically involved banks or large organizations only.

Today, however, computer-related security (now called cybersecurity) is a key focus for anyone working in a technical field. Those who develop Web apps often find themselves performing regular audits against cybersecurity guidelines, and Information Technology (IT) administrators spend a large portion of their time securing systems, monitoring for security breaches, and testing for vulnerabilities (a process called penetration testing). The approach we use today often follows cybersecurity frameworks published by organizations such as the National Institute of Standards and Technology (NIST).

Recently, I came across a book entitled Network Security In the ‘90s at a used bookstore. Because it had a copyright date of 1992 on the inside cover, it was likely written between 1990 and 1991 since publishers always put a copyright date on technical books one year ahead of the actual release.

At 290 pages, it was rather thin for a security-focused book. So I joked to myself, “I guess there wasn’t a lot of security in the ‘90s.” But curiosity got the best of me, and I decided to give it a read to find out what the professional computer security landscape was like during the early 1990s, because quite frankly, it wasn’t a big focus in my life at the time. Moreover, I was curious to see how it compared to today’s cybersecurity landscape. Here’s what I found out:

Security-focused organizations were in their infancy

  • At the time, NIST had recently supplanted the National Bureau of Standards as the organization that would define security frameworks in the computing industry, and the Electronic Frontier Foundation (EFF) was formed to protect digital freedoms. However, these organizations had not matured yet, and few other security-focused organizations existed.
  • Today, there are dozens of organizations devoted to providing security frameworks for different technologies and environments (including NIST), as well as thousands more that provide cybersecurity services and software.

The general security approach was incredibly simple (steps 4-7 amount to a basic risk analysis; a short worked example follows the list):

  1. Decide what security means.
  2. Look for problems that actually exist.
  3. Don’t overlook the obvious.
  4. Decide how important the information is to your organization.
  5. Determine the potential loss if the data is lost/stolen.
  6. Determine the cost to correct any problems.
  7. Compare this cost to the loss potential.
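
As a rough illustration of steps 4-7, here is a minimal Python sketch of the cost-versus-loss comparison. The asset, the loss estimate, and the cost to fix are hypothetical numbers invented for this example:

    # Hypothetical risk analysis following steps 4-7: compare the potential
    # loss from a security problem against the cost of correcting it.
    def worth_fixing(potential_loss, cost_to_fix):
        """Return True when the fix costs less than the loss it prevents."""
        return cost_to_fix < potential_loss

    # Steps 4-5: the payroll database is critical; losing it is estimated at $250,000.
    potential_loss = 250_000

    # Step 6: adding access controls and nightly backups is estimated to cost $20,000.
    cost_to_fix = 20_000

    # Step 7: compare the cost to the loss potential.
    if worth_fixing(potential_loss, cost_to_fix):
        print("Implement the safeguard: it costs less than the potential loss.")
    else:
        print("Accept the risk: the safeguard costs more than the potential loss.")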

Security practices were categorized into four types of controls:

  • Directive controls consisted mainly of security-focused policies and guidelines.
  • Preventive controls consisted of encryption, access control, hashes, user education, callback systems for dial-in access, and tiger team testing (an early term for penetration testing).
  • Corrective controls involved following predefined procedures to fix a security problem (since specialized security software and network appliances had not been implemented yet).
  • Recovery controls involved restoring data and system files from backup.

Many basic cybersecurity terms used today were also common in the early 1990’s

  • These include confidentiality (restricting who can read data), authentication (validating user identity), data integrity (identifying when data is modified), non-repudiation (ensuring that the sender of data cannot deny having sent it) and access control (restricting access to a system or file).
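
Two of these terms are easy to illustrate with a short Python sketch using the standard hashlib and hmac modules: a plain hash detects modification (data integrity), while a keyed HMAC also ties a message to whoever holds the shared secret key (a simple form of authentication). This is my own modern example, not one from the book, and the message and key are made up:

    import hashlib
    import hmac

    message = b"Transfer $500 to account 1234"
    secret_key = b"shared-secret"   # hypothetical key, for illustration only

    # Data integrity: any change to the message produces a different hash.
    digest = hashlib.sha256(message).hexdigest()

    # Authentication: a valid HMAC tag can only be produced by someone who
    # holds the shared secret key.
    tag = hmac.new(secret_key, message, hashlib.sha256).hexdigest()

    tampered = b"Transfer $9999 to account 1234"
    print(hashlib.sha256(tampered).hexdigest() == digest)   # False: integrity check fails
    print(hmac.compare_digest(
        hmac.new(secret_key, tampered, hashlib.sha256).hexdigest(), tag))  # False: not authentic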

Computer security was divided into three main areas:

  • INFOSEC (information security) included mechanisms to prevent unauthorized access to data, such as encryption.
  • COMSEC (communications security) included mechanisms for securely transmitting data across networks (which at this time was primarily email, fax and online services such as CompuServe).
  • COMPUSEC (computer security) included mechanisms to prevent unauthorized access to computers and computer terminals.

The biggest perceived threats included insider access to data and viruses

  • Insider access is still a key threat today because it’s the easiest way to obtain data. However, viruses and worms are far less of a threat today compared to complex malware and zero-day software exploits that can perform high-level tasks.
  • Surprisingly, social engineering and weak passwords weren’t given a high priority, even though they were prevalent at the time (you can see them portrayed heavily in the 1992 movie Sneakers).

Encryption and access control were the primary defences against unauthorized data access

  • Encrypting data using the DES symmetric encryption algorithm was the preferred method for implementing INFOSEC (a short illustrative sketch follows this list).
  • If encryption could not be used, then using permission-based access control on files was recommended for restricting access to data, with mandatory access control (access granted by an administrator) being preferred over discretionary access control (access granted by users).
  • Encrypting data using the RSA asymmetric (public key) encryption algorithm was encouraged for COMSEC (although no concrete examples of this are given in the book since it predates the widespread adoption of SSL/HTTPS for Web traffic).
  • Novell file servers, UNIX systems and IBM mainframes required additional COMSEC focus, as they didn’t possess the network security measures implemented by IBM AS/400 and DEC VMS systems at the time.
  • The most common method of restricting access to a system (COMPUSEC) was login credentials, but some systems implemented fingerprint sensors or one-time password generators.
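
To make the symmetric side of this concrete, here is a minimal Python sketch of DES encryption and decryption. It assumes the third-party pycryptodome package (pip install pycryptodome), and DES is shown purely for historical illustration since it is long obsolete; RSA, by contrast, would use a key pair, with the public key encrypting and the private key decrypting:

    # A sketch of DES (symmetric) encryption of the kind the book recommends
    # for INFOSEC. DES is obsolete -- use AES for anything real.
    import os
    from Crypto.Cipher import DES
    from Crypto.Util.Padding import pad, unpad

    key = os.urandom(8)    # DES uses a 56-bit key stored in 8 bytes
    iv = os.urandom(8)     # 8-byte initialization vector for CBC mode

    plaintext = b"Quarterly payroll figures"
    cipher = DES.new(key, DES.MODE_CBC, iv)
    ciphertext = cipher.encrypt(pad(plaintext, DES.block_size))

    # Decryption requires the same key (and IV), which is why securely sharing
    # the key was itself a COMSEC problem.
    decipher = DES.new(key, DES.MODE_CBC, iv)
    recovered = unpad(decipher.decrypt(ciphertext), DES.block_size)
    assert recovered == plaintext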

Viruses were difficult to identify and remedy

  • Viruses were typically identified by users who knew how to recognize the common signs of a virus infection (vanishing files, slowness, etc.). Thus user education was key to identifying virus infections.
  • On systems shared by many different users (e.g. servers, mainframes), system administrators were encouraged to regularly check the hash (CRC) of critical files to identify files that were modified by a virus (a minimal sketch of such a check follows this list).
  • Remedying a virus infection was a lengthy (and costly) process that involved restoring data and system backups (likely because antivirus software wasn’t commonly implemented back then).
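
Here is a minimal sketch of that CRC-checking idea using Python’s built-in zlib.crc32. The file paths and the baseline file name are hypothetical:

    # Record CRC-32 checksums of critical files, then re-check them later to
    # detect unexpected modifications (e.g. by a virus).
    import json
    import zlib
    from pathlib import Path

    CRITICAL_FILES = ["/sbin/init", "/usr/bin/login"]   # hypothetical watch list
    BASELINE = Path("crc_baseline.json")                # hypothetical baseline file

    def crc32_of(path):
        """Return the CRC-32 checksum of a file's contents."""
        return zlib.crc32(Path(path).read_bytes())

    def record_baseline():
        """Record known-good checksums on a clean system."""
        BASELINE.write_text(json.dumps({f: crc32_of(f) for f in CRITICAL_FILES}))

    def check_against_baseline():
        """Report any file whose checksum no longer matches the baseline."""
        baseline = json.loads(BASELINE.read_text())
        for path, expected in baseline.items():
            if crc32_of(path) != expected:
                print(f"WARNING: {path} has been modified")

Run record_baseline() once on a known-clean system, then run check_against_baseline() on a schedule; a modern equivalent would use a cryptographic hash such as SHA-256 rather than a CRC.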

When writing network software, COMSEC was provided at Layer 7 of the OSI model only

  • The 7-layer Open Systems Interconnection (OSI) model identifies the function of different software components that send data from one computer to another across a network. Higher layers (Layers 7, 6 and 5) deal with the data that is transmitted, while lower layers deal with the protocols and methods used to access the network.
  • Today, attacks are detected at all layers of the OSI model. For example, malware transmission and sophisticated attacks are often detected at Layer 7, while Denial of Service (DoS) and Man in the Middle (MitM) attacks are often detected at lower layers. Consequently, all OSI layers are analyzed by different security software suites today.

Monitoring of log files and network traffic was not common practice

  • Today, however, these practices are among the most important for detecting security incidents, and they are a primary job responsibility for those who work as a Cybersecurity Analyst (a minimal log-scanning sketch follows this list).
  • Many of the software systems we use today to monitor for security events were not available in the early 1990s, including Intrusion Detection Systems (IDS) and Security Information and Event Management (SIEM) appliances.
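
For contrast, here is a minimal sketch of the kind of log scanning a Cybersecurity Analyst (or a SIEM) automates today: counting repeated failed logins per source address in an authentication log. The log path, line format, and threshold are assumptions for illustration, not taken from any particular product:

    # Scan an auth log and flag source addresses with repeated failed logins.
    import re
    from collections import Counter

    LOG_PATH = "/var/log/auth.log"                               # assumed log location
    PATTERN = re.compile(r"Failed password for .* from (\S+)")   # assumed line format
    THRESHOLD = 5                                                # flag 5+ failures

    failures = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            match = PATTERN.search(line)
            if match:
                failures[match.group(1)] += 1

    for source, count in failures.items():
        if count >= THRESHOLD:
            print(f"Possible brute-force attempt from {source}: {count} failed logins")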