Cryptography and Network Security

Cryptography and network security refers to the measures a business or organization takes to protect its computer networks and data using both hardware and software. It aims to ensure the privacy and availability of data and networks. Any company or organization that handles large amounts of data needs specific defenses against the many cyber threats it faces.


Common attacks on networks and cryptosystems include:

  1. Brute Force Attack.
  2. Man-in-the-Middle (MitM) Attack.
  3. Denial-of-Service (DoS) Attack.
  4. Phishing Attack.
  5. Cryptanalysis.


The principal security goals are:

  1. Confidentiality.
  2. Integrity.
  3. Authentication.
  4. Non-repudiation.
  5. Availability.


Common security mechanisms include:

  1. Encryption.
  2. Digital Signatures.
  3. Firewalls.
  4. Intrusion Detection Systems (IDS).
  5. Virtual Private Network (VPN).

Security Services

  1. Confidentiality.
  2. Integrity.
  3. Authentication.
  4. Nonrepudiation.
  5. Access Control.
  6. Availability.

Conventional Encryption

Conventional encryption is an encryption system in which the sender and the receiver use the same key: the sender uses it to encrypt the message, and the receiver uses it to decrypt the message. It was the only type of encryption in use prior to the development of public-key cryptography.

Due to its simplicity and speed, it remains the more widely used of the two types of encryption systems: because the same key serves for both encryption and decryption, the process is relatively fast. In this encryption model, the sender encrypts the plaintext with a secret key shared with the recipient, who then uses the same key to decrypt the ciphertext.

Classical Techniques

  1. Caesar Cipher.
  2. Vigenère Cipher.
  3. Playfair Cipher.
  4. Transposition Cipher.
  5. One-Time Pad.
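The Caesar cipher, the simplest of the classical techniques, can be sketched in a few lines of Python. This is an illustrative toy only; a 26-key cipher is trivially broken by exhaustive search:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)      # leave spaces and punctuation unchanged
    return ''.join(out)

cipher = caesar("ATTACK AT DAWN", 3)
print(cipher)                   # DWWDFN DW GDZQ
print(caesar(cipher, -3))       # ATTACK AT DAWN
```

Decryption is simply encryption with the negated shift, which previews the symmetric-key idea discussed below.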

Conventional Encryption Model

Symmetric encryption is also called conventional encryption or single-key encryption. It was the only type of encryption in use prior to the development of public-key cryptography. Symmetric and public-key encryption are by far the two most widely used types of encryption.

A symmetric encryption scheme has five ingredients:

  1. Plaintext: the original readable message or data that is fed into the algorithm as input.
  2. Encryption algorithm: performs various substitutions and transformations on the plaintext to convert it into ciphertext.
  3. Secret key: also input to the encryption algorithm. The key is a value independent of the plaintext and of the algorithm; the exact substitutions and transformations performed depend on the key, so different keys produce different outputs.
  4. Ciphertext: the scrambled message produced as output. It depends on the plaintext and the secret key; for a given message, two different keys produce two different ciphertexts. The ciphertext is an apparently random stream of data and is unintelligible as it stands.
  5. Decryption algorithm: essentially the encryption algorithm run in reverse. It takes the ciphertext and the secret key as input and produces the original plaintext.
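The five ingredients can be seen together in a deliberately insecure sketch. The repeating-key XOR "algorithm" below is a toy stand-in for a real cipher, chosen only because the same operation encrypts and decrypts:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR: the same operation both encrypts and decrypts.
    # Insecure toy - it only illustrates the five ingredients.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"meet me at noon"                   # 1. plaintext
secret_key = os.urandom(16)                      # 3. secret key, shared in advance
ciphertext = xor_cipher(plaintext, secret_key)   # 2./4. algorithm -> ciphertext
recovered = xor_cipher(ciphertext, secret_key)   # 5. decryption reverses it
print(recovered == plaintext)                    # True
```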


Steganography

Steganography is a means of hiding confidential information within plain, unclassified files, messages, or other media so that its very presence escapes detection; the hidden information can later be extracted at its destination. The word comes from the Greek steganos, meaning "covered" or "hidden", and graphein, "to write": hence "covered writing". Steganography can be used alongside encryption as an additional layer of protection for data.
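A common steganographic technique is least-significant-bit (LSB) embedding. The sketch below is a toy: the cover is just a byte sequence standing in for, say, the pixel bytes of an image, and the helper names are made up for illustration:

```python
def embed(cover: bytearray, message: bytes) -> bytearray:
    """Hide each bit of `message` in the least-significant bit of a cover byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small"
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the lowest bit
    return stego

def extract(stego: bytes, n_bytes: int) -> bytes:
    bits = [b & 1 for b in stego[:n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

cover = bytearray(range(64))          # stand-in for image pixel bytes
stego = embed(cover, b"hi")
print(extract(stego, 2))              # b'hi'
```

Because only the lowest bit of each byte changes, the cover looks essentially unchanged to a casual observer.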

Classical Encryption Techniques

There are two basic building blocks of all encryption techniques: substitution and transposition.


Substitution Techniques

A substitution technique is one in which the letters of the plaintext are replaced by other letters, numbers, or symbols. If the plaintext is viewed as a sequence of bits, substitution involves replacing plaintext bit patterns with ciphertext bit patterns.

  (i) Caesar cipher (shift cipher)
  (ii) Playfair cipher
  (iii) Polyalphabetic ciphers
  (iv) Vigenère cipher


Transposition Techniques

All of the techniques described so far involve substituting ciphertext symbols for plaintext symbols. A very different kind of mapping is achieved by performing some sort of permutation on the plaintext letters. This technique is referred to as a transposition cipher.
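A simple columnar transposition illustrates the idea: the text is written in rows under a keyword, and the columns are read out in the alphabetical order of the key letters. This sketch is illustrative only (no padding, encryption direction only):

```python
def transpose_encrypt(text: str, key: str) -> str:
    """Write the text row by row under the key, then read the columns out
    in the alphabetical order of the key letters."""
    ncols = len(key)
    rows = [text[i:i + ncols] for i in range(0, len(text), ncols)]
    order = sorted(range(ncols), key=lambda c: key[c])
    return ''.join(
        ''.join(row[c] for row in rows if c < len(row))
        for c in order
    )

print(transpose_encrypt("WEAREDISCOVERED", "ZEBRA"))   # EODASREIERCEWDV
```

Note that every plaintext letter survives unchanged; only the positions are scrambled, which is exactly what distinguishes transposition from substitution.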

Simplified DES

The S-DES encryption algorithm takes an 8-bit block of plaintext (for example, 10111101) and a 10-bit key as input and produces an 8-bit block of ciphertext as output. The S-DES decryption algorithm takes the 8-bit block of ciphertext and the same 10-bit key used to produce it, and yields the original 8-bit block of plaintext.

Block Cipher Principles

Many of the symmetric block cipher algorithms currently in use are based on a structure referred to as the Feistel block cipher [FEIS73]. For this reason, it is important to examine the design principles of the Feistel cipher. We begin by comparing stream ciphers and block ciphers, then discuss the motivation for the Feistel block cipher structure, and finally discuss some of its implications.

DES Standard

The Data Encryption Standard (DES) has been found vulnerable to very powerful attacks, and its popularity has declined somewhat as a result. DES is a block cipher that encrypts data in 64-bit blocks: 64 bits of plaintext are fed in, and 64 bits of ciphertext come out. The same algorithm and key are used for encryption and decryption, with minor differences. The key length is 56 bits.

Differential Cryptanalysis

Differential cryptanalysis is a form of cryptanalysis applicable to block ciphers, stream ciphers, and cryptographic hash functions. In the broadest sense, it is the study of how differences in the input can affect the resulting differences in the output. In the context of a block cipher, it refers to a set of techniques for tracing differences through the network of transformations, discovering where the cipher exhibits non-random behavior, and exploiting such properties to recover the secret key.

Linear Cryptanalysis

Linear cryptanalysis is a general form of cryptanalysis based on finding affine approximations to the action of a cipher. Attacks of this kind have been developed against both block ciphers and stream ciphers. Linear cryptanalysis is one of the two most widely used attacks on block ciphers; the other is differential cryptanalysis.

Block Cipher Design Principles

Block ciphers are based on the Feistel cipher structure. A block cipher applies a specific number of rounds, each with its own round key, to produce the ciphertext. Several design principles must be considered when assessing the strength of an algorithm.

Block Cipher Modes of Operations

Cryptographic algorithms fall into two categories based on how they process input: block ciphers and stream ciphers. A block cipher is an encryption algorithm that takes a fixed-size input, say b bits, and produces b bits of ciphertext; input longer than b bits is split into further blocks. To serve different applications and requirements, block ciphers can be used in several modes of operation.
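The difference between modes can be demonstrated with a toy one-block "cipher" (plain XOR standing in for a real block cipher such as AES; all names and keys here are made-up illustrations). ECB encrypts each block independently, so identical plaintext blocks leak as identical ciphertext blocks, while CBC chains each block to the previous ciphertext:

```python
def toy_block(block: bytes, key: bytes) -> bytes:
    # Stand-in for a real block cipher; plain XOR is used only so the
    # difference in mode behavior is easy to see.
    return bytes(b ^ k for b, k in zip(block, key))

def ecb_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Each 8-byte block is encrypted independently.
    return b''.join(toy_block(plaintext[i:i + 8], key)
                    for i in range(0, len(plaintext), 8))

def cbc_encrypt(plaintext: bytes, key: bytes, iv: bytes) -> bytes:
    # Each block is XORed with the previous ciphertext block before encryption.
    prev, out = iv, []
    for i in range(0, len(plaintext), 8):
        mixed = bytes(b ^ p for b, p in zip(plaintext[i:i + 8], prev))
        prev = toy_block(mixed, key)
        out.append(prev)
    return b''.join(out)

msg = b"SAMEDATA" * 2                  # two identical 8-byte blocks
key, iv = b"K" * 8, b"I" * 8
e = ecb_encrypt(msg, key)
c = cbc_encrypt(msg, key, iv)
print(e[:8] == e[8:16])    # True:  ECB leaks the repeated structure
print(c[:8] == c[8:16])    # False: chaining hides the repetition
```

This is exactly why ECB is discouraged for anything longer than one block.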

Conventional Encryption Algorithms

Conventional encryption algorithms, also known as symmetric encryption algorithms, are a type of cryptographic algorithm that use the same key for both encryption and decryption processes. These algorithms are widely used in cryptography and network security to ensure the confidentiality and integrity of data. Here are some commonly used conventional encryption algorithms:

  1. Data Encryption Standard (DES).
  2. Triple Data Encryption Standard (3DES).
  3. Advanced Encryption Standard (AES).
  4. Rivest Cipher (RC).
  5. Blowfish.
  6. Twofish.

Triple DES

Since about 1990, the feasibility of exhaustive searches of the DES key space has been a source of concern for DES users. However, users were reluctant to replace DES, because changing an encryption algorithm so widely embedded in large-scale security architectures would take an enormous amount of time and money.

The pragmatic approach was not to abandon DES entirely, but to change the manner in which it is used. This led to the modified scheme known as Triple DES (sometimes written 3DES).

There are two variants of Triple DES: 3-key Triple DES (3TDES) and 2-key Triple DES (2TDES).
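The EDE (encrypt-decrypt-encrypt) pattern used by Triple DES can be sketched with a toy XOR "cipher" standing in for DES (the function names are illustrative). Setting all three keys equal collapses EDE to a single encryption, which is how 3DES stays backward compatible with single-DES peers:

```python
def toy_e(block: bytes, key: bytes) -> bytes:
    # Toy stand-in for DES encryption; XOR is its own inverse, so toy
    # decryption (toy_d) is the same operation.
    return bytes(b ^ k for b, k in zip(block, key))

toy_d = toy_e

def ede(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
    """Triple DES EDE pattern: C = E(K3, D(K2, E(K1, P)))."""
    return toy_e(toy_d(toy_e(block, k1), k2), k3)

plain = b"8bytemsg"
k = b"KEYKEYKE"
# With K1 = K2 = K3, EDE collapses to a single encryption.
print(ede(plain, k, k, k) == toy_e(plain, k))   # True
```

In 2TDES, K3 is set equal to K1, trading some strength for a shorter total key.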


Blowfish

Blowfish is an encryption method developed by Bruce Schneier in 1993 as an alternative to DES. It is much faster than DES and provides a good encryption rate, and no effective cryptanalysis of it has been found to date. It was one of the first secure block ciphers not covered by any patents, and it is freely available for anyone to use.

International Data Encryption Algorithm

In cryptography, the International Data Encryption Algorithm (IDEA), originally called the Improved Proposed Encryption Standard (IPES), is a symmetric-key block cipher first described in 1991, designed by James Massey and Xuejia Lai at ETH Zurich. It was intended as a replacement for the Data Encryption Standard (DES). IDEA is a minor revision of an earlier design, the Proposed Encryption Standard (PES).
This cipher was developed as part of a research agreement with the Hasler Foundation, which became part of Ascom-Tech AG. This cipher was patented in several countries, but was freely available for non-commercial use. The name “IDEA” is also a trademark. The last patent expired in 2012, and IDEA is currently patent-free, so it’s completely free for all uses.
IDEA was used in Pretty Good Privacy (PGP) v2.0 and included after BassOmatic, the original cipher used in v1.0, was found to be insecure. IDEA is an optional algorithm in the OpenPGP standard.


RC5

RC5 was designed by Ron Rivest. It is a symmetric-key block cipher whose main feature is speed: it uses only primitive computer operations (addition, XOR, and rotation). It offers great flexibility through a variable number of rounds and a variable key size, and it requires little memory to run, which makes RC5 usable in a variety of settings, from desktop systems to smart cards.

Placement Function

The placement function refers to the process of determining where and how data should be stored or transmitted within a network to ensure its security. The main objective of the placement function is to protect data from unauthorized access, modification, or interception. It involves decisions about the location of cryptographic mechanisms, keys, and secure storage systems.

Encryption Function

Encryption is a fundamental technique used in cryptography to protect the confidentiality and integrity of data. The encryption function transforms plaintext (original data) into ciphertext (encrypted data) using an encryption algorithm and a cryptographic key. The encrypted data can only be decrypted back to its original form with the corresponding decryption algorithm and key.

Key Distribution

In cryptography, distributing public and private keys between senders and receivers is a very tedious task. If the key becomes known to a third party (forger/interceptor), the entire security mechanism becomes useless. Therefore, secure key exchange is required.
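Diffie-Hellman key exchange is the classic answer to this problem: the two parties derive the same shared secret without ever transmitting it. A textbook-sized sketch (p = 23 and g = 5 are demonstration values; real deployments use standardized primes of 2048 bits or more):

```python
import secrets

# Textbook-sized public parameters: prime modulus p and generator g.
p, g = 23, 5
a = secrets.randbelow(p - 2) + 1     # Alice's private value
b = secrets.randbelow(p - 2) + 1     # Bob's private value
A = pow(g, a, p)                     # Alice transmits A in the clear
B = pow(g, b, p)                     # Bob transmits B in the clear
shared_alice = pow(B, a, p)          # Alice computes g^(ab) mod p
shared_bob = pow(A, b, p)            # Bob computes the same value
print(shared_alice == shared_bob)    # True: same secret, never transmitted
```

An eavesdropper sees only p, g, A, and B; recovering the shared secret from these requires solving the discrete logarithm problem.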

Random number generation

Random number generation is essential to cryptography. One of the most difficult aspects of cryptographic algorithms is their dependence on truly random information or their generation. This is particularly problematic because there is no known way to obtain truly random data from a finite state machine such as a computer.

There are generally two types of random number generators. Nondeterministic random number generators are sometimes referred to as “true random number generators” (TRNGs), and deterministic random number generators are also referred to as pseudorandom number generators (PRNGs).
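Python's standard library illustrates the split: the `random` module is a deterministic PRNG (reproducible from a seed), while the `secrets` module draws from the operating system's entropy pool and is the right choice for keys and tokens:

```python
import random
import secrets

# A deterministic PRNG reproduces the same sequence from the same seed:
# useful for simulations, unacceptable for cryptographic keys.
rng1, rng2 = random.Random(42), random.Random(42)
seq1 = [rng1.randint(0, 9) for _ in range(5)]
seq2 = [rng2.randint(0, 9) for _ in range(5)]
print(seq1 == seq2)        # True

# secrets draws from the OS entropy source (a CSPRNG): use it for keys,
# tokens, and nonces.
token = secrets.token_hex(16)
print(len(token))          # 32 hex characters = 128 bits of randomness
```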

Hash Functions

A hash function is a function, commonly used in computing systems for tasks such as message integrity checking and information authentication, that maps input data to a fixed-size value. An ordinary hash function is considered cryptographically "weak" because collisions or preimages can be found in feasible (polynomial) time, even though doing so is not effortless.

Cryptographic hash functions add security properties to ordinary hash functions, making it difficult to learn the content of a message, or information about its sender and receiver, from the hash alone.

Message Authentication

Message authentication allows one party (the sender) to send a message to another party (the receiver) in such a way that, if the message is modified in transit, the receiver will almost certainly detect this. Message authentication is also called data-origin authentication. It protects the integrity of a message by ensuring that each message received and accepted arrives exactly as it was sent, with no bits inserted, missing, or altered.

Hash functions play a crucial role in cryptography and network security. They are mathematical algorithms that take an input (message) and produce a fixed-size output called a hash value or digest. Here are some key aspects of hash functions in the context of cryptography and network security:

  1. Data Integrity.
  2. Digital Signatures.
  3. Password Storage.
  4. Message Authentication Codes (MACs).
  5. Data Storage and Retrieval.
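The data-integrity use case is easy to demonstrate with SHA-256 from Python's hashlib (the messages are made-up examples): any change to the input yields a completely different digest, so a digest mismatch reveals tampering.

```python
import hashlib

message = b"transfer $100 to Alice"
digest = hashlib.sha256(message).hexdigest()

# Even a one-character change produces an unrelated digest.
tampered_digest = hashlib.sha256(b"transfer $900 to Alice").hexdigest()
print(digest != tampered_digest)   # True
print(len(digest))                 # 64 hex characters = 256 bits
```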

Authentication Requirements

Message authentication is a procedure to verify that received messages come from the alleged source and have not been altered. Message authentication may also verify sequencing and timeliness. A digital signature is an authentication technique that also includes measures to counter repudiation by either the source or the recipient. Any message-authentication or digital-signature mechanism has, fundamentally, two levels. At the lower level, there must be some function that produces an authenticator: a value used to authenticate a message. This lower-level function is then used as a primitive in a higher-level authentication protocol that enables a receiver to verify the authenticity of a message.

Authentication Functions

Authentication functions play a crucial role in verifying the identity of entities involved in a communication or transaction. These functions ensure that the parties involved are who they claim to be, preventing unauthorized access and protecting the integrity and confidentiality of the information exchanged. Here are some common authentication functions used in cryptography and network security:

  1. Password-based Authentication.
  2. Public Key Infrastructure (PKI).
  3. Challenge-Response Authentication.
  4. Biometric Authentication.
  5. Token-based Authentication.
  6. Multi-factor Authentication (MFA).
  7. Digital Signatures.

Message Authentication Code

Message Authentication Codes (MACs), also known as tags, are used to authenticate the origin and integrity of a message. A MAC is computed from the message and a secret key shared by the sender and receiver, and it lets the receiver verify the legitimacy of data transmitted across a network or from one party to another.

In other words, the MAC guarantees that the message came from the stated sender and has not been altered, so that data transmitted over the network or stored inside or outside a system can be trusted. MAC keys can be held in hardware security modules, devices used to manage sensitive cryptographic keys.
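Python's standard hmac module computes a MAC from a shared key and a message (the key and message below are made-up examples); the receiver recomputes the tag with the same key and compares in constant time to avoid timing attacks:

```python
import hashlib
import hmac

# Hypothetical shared key and message, for illustration only.
key = b"shared-secret-key"
msg = b"wire $250 to account 12345"
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

# The receiver, holding the same key, recomputes the tag and compares it
# in constant time.
expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, expected))   # True
```

Without the key, an attacker who alters the message cannot produce a matching tag.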

Hash Function Birthday Attacks

A birthday attack is a type of cryptographic attack based on the birthday paradox to find collisions in hash functions.

A hash function is a mathematical function that takes an input (called a message) and produces a fixed-size output (called a hash value, or simply a hash), usually rendered as a string of hexadecimal digits. One of the key properties of a cryptographic hash function is that it should be very difficult to find two different input messages that produce the same hash value. This property is called collision resistance.
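The birthday bound is easy to demonstrate by artificially truncating a hash (the `weak_hash` helper is a made-up illustration): with only 24 bits of output, a collision appears after roughly sqrt(2^24) ≈ 4096 random inputs, far fewer than the 2^24 a naive attack would need.

```python
import hashlib

def weak_hash(data: bytes) -> str:
    """SHA-256 truncated to 24 bits, so collisions are findable by brute force."""
    return hashlib.sha256(data).hexdigest()[:6]

seen = {}
collision = None
for i in range(100_000):          # far more tries than the ~4096 expected
    m = str(i).encode()
    h = weak_hash(m)
    if h in seen and seen[h] != m:
        collision = (seen[h], m)
        break
    seen[h] = m

print(collision is not None)      # True: two inputs share a 24-bit hash
```

This is why a hash of length n offers only about n/2 bits of collision resistance.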

Security of Hash Functions

The strength of a hash function against brute force attacks depends entirely on the length of the hash code generated by the algorithm. Recall our discussion of hash functions with three desirable properties.

  • One-way: for a given code h, it is computationally infeasible to find x such that H(x) = h.
  • Weak collision resistance: for a given block x, it is computationally infeasible to find y ≠ x such that H(y) = H(x).
  • Strong collision resistance: it is computationally infeasible to find any pair (x, y) such that H(x) = H(y).

For a hash code of length n, the level of effort required for a brute-force attack is proportional to 2^n for the one-way and weak-collision-resistance properties, and to 2^(n/2) for strong collision resistance (a consequence of the birthday paradox).

MD5 Message-Digest Algorithm

MD5 is the third message-digest algorithm created by Rivest. MD2, MD4, and MD5 have similar structures, but MD2 was optimized for 8-bit machines, whereas the two later algorithms were designed for 32-bit machines. MD5 is an extension of MD4, which critical review had shown to be fast but possibly insecure. MD5 is not quite as fast as MD4, but it was designed to offer much greater assurance of data security.

Secure Hash Algorithm (SHA)

The Secure Hash Algorithm is a cryptographic hash function that takes an input and produces a 160-bit (20-byte) hash value, called the message digest, usually represented as 40 hexadecimal digits. It is a US Federal Information Processing Standard developed by the US NSA. SHA-1 has been considered insecure since 2005, and by 2017 the major browsers from Microsoft, Google, Apple, and Mozilla had stopped accepting SHA-1 SSL certificates. Java computes cryptographic hash values through the MessageDigest class in the java.security package.
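The 160-bit, 40-hex-digit output is easy to verify. Using Python's hashlib rather than Java's MessageDigest (both compute the same standard SHA-1 values), the published test vector for the input "abc":

```python
import hashlib

digest = hashlib.sha1(b"abc").hexdigest()
print(digest)        # a9993e364706816aba3e25717850c26c9cd0d89d (FIPS test vector)
print(len(digest))   # 40 hexadecimal digits = 160 bits
```

SHA-1 remains fine for non-security uses such as checksumming, but should not be used where collision resistance matters.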

Digital Signatures

A digital signature is a public-key primitive for message authentication. In the physical world, handwritten signatures are commonly used on handwritten or typed messages to bind the signer to the message.

Similarly, digital signatures are a way to connect people/organizations to digital data. This binding can be independently verified by the recipient and third parties.

A digital signature is a cryptographic value calculated from data and a secret key known only to the signer.

In the real world, the recipient of a message must guarantee that the message belongs to the sender and must be unable to disprove the origin of the message. This requirement is critical for business applications because the potential for disputes over exchanged data is very high.
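The sign-with-private-key, verify-with-public-key idea can be sketched with textbook RSA on deliberately tiny numbers (p = 61, q = 53 are classroom values; real keys are 2048+ bits, and real schemes add padding such as PSS, so this is an illustration only):

```python
import hashlib

# Tiny textbook-RSA parameters, for illustration only.
p, q = 61, 53
n = p * q                # modulus, 3233
e = 17                   # public exponent
d = 2753                 # private exponent: (17 * 2753) % 3120 == 1

def sign(message: bytes) -> int:
    # Hash the message, reduce into the modulus, sign with the private key.
    h = int.from_bytes(hashlib.sha256(message).digest(), 'big') % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the public pair (n, e) can check the signature.
    h = int.from_bytes(hashlib.sha256(message).digest(), 'big') % n
    return pow(signature, e, n) == h

sig = sign(b"I agree to the contract")
print(verify(b"I agree to the contract", sig))   # True
```

Because only the signer holds d, a valid signature both authenticates the message and prevents the signer from later repudiating it.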

Authentication Protocol

An authentication protocol is a computer communication protocol or cryptographic protocol specifically designed to transfer authentication data between two entities. It allows the receiving entity to authenticate the connecting entity (for example, a client connecting to a server) and may also let the receiving entity authenticate itself to the connecting entity (the server to the client) by declaring the type and syntax required for authentication. It is the most important layer of protection needed for secure communication within computer networks.

Digital Signature Standard (DSS)

The Digital Signature Standard (DSS) is a Federal Information Processing Standard (FIPS) that defines the algorithms used to create digital signatures using the Secure Hash Algorithm (SHA) to authenticate electronic documents. DSS only provides digital signature capabilities, not encryption or key exchange strategies.

Network & System Security

Network and system security are critical aspects of cryptography and network security. They involve implementing measures to protect computer networks, systems, and data from unauthorized access, attacks, and vulnerabilities. Cryptography plays a crucial role in ensuring the confidentiality, integrity, and authenticity of data in these contexts.

Authentication Applications

Authentication is the act of establishing identity by providing information that allows a verifier to confirm that an entity is who or what it claims to be. This identity can belong to many kinds of entities, including:

  • People
  • Systems
  • Applications
  • Messages

Kerberos and X.509

X.509 is a standard, defined by the International Telecommunication Union (ITU), that specifies the format of public-key infrastructure (PKI) certificates. X.509 digital certificates provide a certificate-based authentication framework that can be used to secure transactions and sensitive information. They are used mainly for identification and security in computer networks and Internet communications.

Directory Authentication Service

You can use an external authentication directory service (also called a corporate directory or authentication logon domain) to provide single sign-on to groups of users instead of supporting individual local logon accounts. Each user in the group is assigned the same role (for example, Infrastructure Administrator). An example of an authenticated directory service is a corporate directory using Lightweight Directory Access Protocol (LDAP).

Electronic Mail Security

E-mail security can be defined as the use of various techniques to keep the sensitive information in e-mail messages and accounts safe, primarily against unauthorized access, loss, or compromise. It allows an individual or organization to protect one or more e-mail addresses or accounts.

Pretty Good Privacy (PGP)

PGP is an open-source software package for e-mail security, designed by Phil Zimmermann. It provides the basic cryptographic requirements for secure e-mail through the following services:

  1. Confidentiality
  2. Authentication
  3. Compression
  4. Segmentation and reassembly
  5. E-mail compatibility


S/MIME

S/MIME (Secure/Multipurpose Internet Mail Extensions) is a security-enhanced version of MIME. It uses public-key cryptography to digitally sign, encrypt, or decrypt e-mail. Users obtain a public/private key pair from a trusted authority and then use those keys with their mail application.


Network Security Architecture

A network security architecture includes both network and security elements such as:

Network elements: network nodes (computers, routers, etc.), communication protocols (TCP/IP, HTTP, DNS, etc.), connectivity (wired, wireless) and topology (bus, star, mesh, etc.).

Security elements: cybersecurity devices and software, secure communication protocols (e.g., IPsec VPN and TLS), and data-privacy technologies (classification, encryption, key management, etc.).

Authentication Header

The Authentication Header (AH) is a security mechanism designed to authenticate the origin of data packets (datagrams) transmitted over IP. The Authentication Header also plays an important role in ensuring the integrity of the information sent and received.

Encapsulating Security Payloads

The Encapsulating Security Payload (ESP) is a member of the IPsec protocol suite. It encrypts the contents of IP packets to provide confidentiality, and it can also provide data-origin authentication, connectionless integrity, and anti-replay protection, so that content travels between established endpoints over a secured channel.

Combining Security Associations

A single SA can implement either the AH protocol or the ESP protocol, but not both. Sometimes certain traffic flows require services provided by both AH and ESP. Additionally, certain traffic flows may require IPsec services between hosts and separate services between security gateways, such as firewalls, for the same flow. In all these cases, multiple SAs must be used for the same traffic flow to achieve the required IPsec service. The term “security association bundle” refers to a set of security associations whose traffic must be processed to provide a desired set of IPsec services. SAs in a bundle can terminate on different endpoints or on the same endpoint.

Key Management

Key management is a critical aspect of cryptography and network security. It involves the generation, distribution, storage, and destruction of cryptographic keys used in encryption and decryption processes. Effective key management is essential for ensuring the confidentiality, integrity, and authenticity of data and communications.