
SEMESTER-I

COURSE 1: COMPUTER FUNDAMENTALS AND OFFICE AUTOMATION


Theory Credits: 3 3 hrs/week

Course Objectives
1. Understand foundational computing concepts, including number systems, the evolution of
computers, block diagrams, and generational progress.
2. Develop knowledge of computer architecture, focusing on system organization and networking
fundamentals.
3. Acquire practical skills in document creation, formatting, and digital presentations using word
processing tools.
4. Gain proficiency in spreadsheet operations, such as data entry, formulas, functions, and charting
techniques.
5. Introduce data visualization and basic modelling principles, fostering analytical thinking in
structuring and interpreting data sets.
Course Outcomes

1. At the end of the course, students will be able to explain different number systems, the
historical evolution of computers, and identify key components in a block diagram.
2. Learners will demonstrate understanding of the basic building blocks of a computer and fundamental
networking knowledge.
3. Learners will create professional-level documents and design visually appealing presentations
using word processing software and presentation software.
4. Learners will manipulate data within spreadsheets, apply formulas, and generate accurate
summaries and visualizations.
5. Learners will apply data modelling techniques to analyze, organize, and represent data
effectively in various scenarios.

Syllabus:

Unit 1. Number Systems, Evolution, Block Diagram and Generations:


Number Systems: Binary, Decimal, Octal, Hexadecimal; conversions between number systems.
Evolution of Computers: History from early mechanical devices to modern-day systems.
Block Diagram of a Computer: Components like Input Unit, Output Unit, Memory, CPU (ALU + CU).
Generations of Computers: First to Fifth Generation with technologies, characteristics, examples.
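The number-system conversions listed above can be illustrated in a few lines of Python using its built-in base-conversion functions (the numeral 156 is an arbitrary example):

```python
# Decimal 156 rendered in binary, octal, and hexadecimal.
n = 156
print(bin(n))   # 0b10011100
print(oct(n))   # 0o234
print(hex(n))   # 0x9c

# Converting back: int(text, base) parses a string in any base.
assert int('10011100', 2) == 156
assert int('234', 8) == 156
assert int('9c', 16) == 156
```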

Unit 2. Basic Organization and Network Fundamentals:


Computer Organization: Functional components – storage types, memory hierarchy.
Types of Computers: Micro, Mini, Mainframe, and Supercomputers.
Networking Fundamentals: Definition, need for networks, types (LAN, WAN, MAN), topology (Star,
Ring, Bus).
Internet Basics: IP Address, Domain Name, Web Browser, Email, WWW.

Unit 3. Word Processing and presentations:


Word Processing Basics: Using MS Word/Google Docs – formatting, styles, tables, mail merge.
Applications - Creating resumes, reports. Keyboard Shortcuts - File Operations, Editing Operations,
Formatting Shortcuts, Navigation and Selection.
Presentation Tools: Using PowerPoint/Google Slides – slide design, animations, transitions.
Applications - Brochures, and presentations.
Unit 4. Spreadsheet Basics:
Spreadsheet Concepts: Understanding rows, columns, cells in tools like MS Excel/Google Sheets, cell
referencing.
Functions and Formulae: SUM, AVERAGE, IF, COUNT.
Charts and Graphs: Creating visual representations, Types of Charts.
Data Handling: Sorting, filtering, conditional formatting.
Text Functions: LEFT, RIGHT, MID, LEN, TRIM, CONCAT, TEXTJOIN
Advanced Functions: Logical (IF, AND, OR, IFERROR) and Lookup functions

Unit 5. Data Analysis and Visualization:


Conditional Formatting: Custom rules, Color scales, Icon sets, Data bars
Data Analysis Tools: Pivot Tables and Pivot Charts, Data Validation (Drop-downs, Input Messages,
Error Alerts).
What-If Analysis: Goal Seek, Scenario Manager, Data Tables
Charts and Dashboards: Creating Interactive Dashboards, Using slicers with Pivot Tables, Combo
Charts and Sparklines

Textbooks:

1. Fundamentals of Computers, Reema Thareja, Oxford University Press, Second Edition
2. Fundamentals of Computers, V. Rajaraman – PHI Learning
3. Introduction to Computers by Peter Norton – McGraw Hill
4. Microsoft Office 365 In Practice by Randy Nordell – McGraw Hill Education

References:

1. Excel 2021 Bible by Michael Alexander, Richard Kusleika – Wiley
2. Networking All-in-One For Dummies by Doug Lowe – Wiley
3. Microsoft Official Docs and Training: [Link]
4. Google Workspace Learning Center: [Link]

Activities:

Outcome: At the end of the course, students will be able to explain different number systems, the
historical evolution of computers, and identify key components in a block diagram.

Activity: Create a digital poster or infographic comparing number systems (binary, decimal, octal,
hexadecimal) and illustrating the timeline of computer generations with key innovations.

Evaluation Method: Rubric-based assessment of the poster presentation on a 10-point scale focusing on:

● Accuracy of number system conversions
● Correct identification of block diagram components
● Visual organization and creativity

Outcome: Learners will demonstrate understanding of the basic building blocks of a computer and
fundamental networking knowledge.

Activity: Design a concept map showing the internal architecture of a computer and types of networks
(LAN, WAN, MAN), including devices and topologies.

Evaluation Method: Checklist-based peer review and instructor validation:

● Completeness of the map
● Correctness of networking concepts
● Use of appropriate terminology
● Logical flow and structure of the map

Outcome: Learners will create professional-level documents and design visually appealing
presentations using word processing software and presentation software.

Activity: Prepare a formal report (e.g., project proposal) in a word processor and present it using a slide
deck with transitions, embedded media, and design elements.

Evaluation Method: Performance-based evaluation using a 10-point scoring scale:

● Formatting and structure of the document
● Presentation aesthetics and clarity
● Communication skills during presentation

Outcome: Learners will manipulate data within spreadsheets, apply formulas, and generate accurate
summaries and visualizations.

Activity: Analyze a dataset (e.g., student scores or sales data) using spreadsheet software. Apply
formulas (SUM, AVERAGE, IF, VLOOKUP) and create relevant charts.

Evaluation Method: Practical test with a rubric:

● Correct use of formulas
● Accuracy of data summaries

Outcome: Learners will apply data modeling techniques to analyze, organize, and represent data
effectively in various scenarios.

Activity: Prepare an interactive dashboard for a given data set using EXCEL.

Evaluation Method: Evaluation of the dashboard on a 10-point scoring scale:

● Presentation aesthetics and clarity
● Interactivity
● Communication skills during presentation

SEMESTER-I
COURSE 1: COMPUTER FUNDAMENTALS AND OFFICE AUTOMATION
Practical Credits: 1 2 hrs/week

List of Experiments:
1. Demonstration of Assembling and Disassembling of Computer Systems.
2. Identify and prepare notes on the type of Network topology of your institution.
3. Prepare your resume in Word.
4. Using Word, write a letter to your higher official seeking 10-days leave.
5. Prepare a presentation that contains text, audio and video.
6. Using a spreadsheet, prepare your class Time Table.
7. Using a spreadsheet, calculate the Gross and Net salary of employees (minimum 5) considering all the
allowances.
8. Generate the class-wise and subject-wise results for a class of 20 students. Also generate the highest
and lowest marks in each subject.
9. Using IF, AND, OR, and IFERROR to Automate Grade Evaluation.
a. Create a table of student scores in different subjects.
b. Use IF to assign grades (A/B/C/Fail).
c. Use IFERROR to handle missing scores or invalid data.
10. Employee Database Search Using VLOOKUP, HLOOKUP, XLOOKUP, INDEX, and
MATCH
a. Create a database of employees (Name, ID, Department, Salary).
b. Implement VLOOKUP to search by employee ID.
c. Use HLOOKUP to extract department heads by role.
d. Apply XLOOKUP for more flexible searches.
e. Use INDEX + MATCH as an alternative to VLOOKUP.
11. Sales Report Analysis Using Pivot Tables and Charts
a. Use a dataset of product sales (Product, Region, Date, Quantity, Revenue).
b. Create Pivot Tables to summarize data by region/product.
c. Insert Pivot Charts for visual analysis (e.g., bar, line).
d. Add slicers to make the dashboard interactive.
12. Designing a Data Entry Form with Drop-downs and Input Rules
a. Create a student registration form.
b. Add drop-down lists for course selection using Data Validation.
c. Add input messages to guide users.
d. Add error alerts for wrong entries.
13. Monthly Budget Planning using Goal Seek and Scenario Manager
a. Create a simple personal budget (income, expenses, savings).
b. Use Goal Seek to determine income needed to save a desired amount.
c. Use Scenario Manager to compare different budgeting scenarios (best/ worst/ realistic
case).
d. Create a one-variable Data Table to analyze how different expenses affect savings.
14. Dashboard Creation Using Combo Charts, Sparklines & Slicers
a. Use existing sales or attendance data.
b. Insert combo charts (e.g., column + line).
c. Add sparklines to show trends.
d. Use slicers with Pivot Tables to control dashboard elements.
e. Finalize and format for interactivity.
SEMESTER-I
COURSE 2: Mathematical, Physical and Computing Foundations of Quantum Computing
Theory Credits: 3 3 hrs/week

Course Objectives:
By the end of the course, students will be able to:
1. Understand foundational mathematical concepts such as vectors, matrices, and linear
transformations in quantum computing.
2. Explore the mathematical representation of quantum gates and qubits using linear algebra.
3. Apply complex numbers, eigenvalues, and eigenvectors to quantum computational models.
4. Relate abstract mathematical ideas to practical quantum computing operations.
5. Develop problem-solving skills for quantum mechanics and quantum circuit analysis.
6. To introduce the evolution of computer science and its mathematical basis.
Learning Outcomes:
Students will be able to:
1. Apply complex number operations in quantum computations.
2. Compute eigenvalues and eigenvectors for quantum state matrices.
3. Interpret the Bloch sphere for single-qubit representation.
4. Identify Hermitian and unitary operators as foundational elements in quantum mechanics.
5. Understand the evolution and fundamental experiments leading to quantum mechanics.
6. Explain the mathematical and physical foundations behind superposition and
entanglement.
7. Explain classical computational models and their historical development.

Unit I: Foundations of Quantum Mathematics (9 Periods)


Vectors and Linear Combinations: Euclidean vectors, magnitude and direction, scalar multiplication,
vector addition.
Superposition and Measurement: Understanding superposition states and measurement in quantum
systems.
Introduction to Matrices: Matrix definition, notation, simple operations (addition, transpose, scalar
multiplication).
Matrix Multiplication and Properties: Matrix-vector and matrix-matrix multiplication.
Quantum Gates and Circuit Models: Logic gates vs quantum gates, matrix representation of quantum
gates (NOT, AND, OR, Pauli-X).
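As a sketch of the matrix representation mentioned above (using NumPy, one of the tools assumed in the practical component), the Pauli-X gate acts as a quantum NOT on the computational basis states:

```python
import numpy as np

# Pauli-X (quantum NOT) gate as a 2x2 matrix.
X = np.array([[0, 1],
              [1, 0]])

# Computational basis states |0> and |1> as column vectors.
ket0 = np.array([1, 0])
ket1 = np.array([0, 1])

# X|0> = |1> and X|1> = |0>: matrix-vector multiplication flips the state.
print(X @ ket0)  # [0 1]  -> |1>
print(X @ ket1)  # [1 0]  -> |0>
```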

Unit II: Elementary Linear Algebra for Quantum Systems (9 Periods)


Sets, Functions, and Vector Spaces: Sets, Cartesian product, functions, fields, and vector spaces.
Subspaces, Basis, and Dimension: Linear independence, span, and basis of a vector space.
Linear Transformations: Definition, properties, and matrix representation.
Transformations Inspired by Euclid: Translation, rotation, and projection.
Linear Operators and Functionals: Matrix dependence on basis, commutator operations.

Unit III: Complex Numbers and Eigen Concepts (9 Periods)


Complex Number System: Cartesian, polar, and exponential forms; conjugates and modulus.
Complex Numbers in Quantum Mechanics: Bloch sphere representation and Euler’s formula.
Eigenvalues and Eigenvectors: Definitions, computation, and physical significance.
Determinants and Invertibility: Matrix inverse and the invertible matrix theorem.
Applications in Quantum State Analysis: Diagonalization, Hermitian and unitary operators.
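A brief NumPy sketch of the eigenvalue and operator concepts above, again using the Pauli-X matrix as the worked example (chosen for illustration; any Hermitian unitary would serve):

```python
import numpy as np

# Pauli-X is both Hermitian (A equals its conjugate transpose)
# and unitary (A times its conjugate transpose is the identity).
X = np.array([[0, 1], [1, 0]], dtype=complex)

assert np.allclose(X, X.conj().T)              # Hermitian check
assert np.allclose(X @ X.conj().T, np.eye(2))  # unitary check

# Eigenvalues of a Hermitian operator are real; for Pauli-X they are -1 and +1.
vals, vecs = np.linalg.eigh(X)
print(vals)  # [-1.  1.]
```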

Unit 4 – Quantum Mechanics and Its Principles


History of Quantum Mechanics: Classical physics foundations (Newton, Maxwell, Galileo); failures of
classical theory and emergence of quantum concepts; key contributors: Planck, Einstein, Bohr,
Heisenberg, Schrödinger, Dirac.
Atoms and Thermodynamics: Atomic theory, kinetic theory, and laws of thermodynamics. Statistical
Mechanics: Maxwell–Boltzmann distribution and entropy (Boltzmann's constant). Photoelectric Effect:
Einstein's explanation of quantized energy. Wave–Particle Duality: de Broglie's matter waves and
Bohr's model. Quantum Models: Schrödinger's wave equation, Heisenberg's matrix mechanics.
Copenhagen Interpretation: Bohr's complementarity and probabilistic measurement.
Key Principles for Quantum Computing: Linear algebra in quantum systems, superposition and
interference, Dirac (bra–ket) notation, quantum uncertainty (Heisenberg principle), entanglement and
non-locality, postulates of quantum mechanics.

Unit-5: Introduction to Computer Science


Historical Foundations, Turing Machines, Logic Circuits and Gates, Computational Complexity and
Efficiency, Energy, Reversibility, and Computation.

Text Books:

1. Essential Mathematics for Quantum Computing by Leonard S. Woody III (Packt, 2022).
2. Quantum Computing from the Ground Up by Riley Tipton Perry, World Scientific
Publishing Co. Pte. Ltd.
SEMESTER-I
COURSE 2: Mathematical, Physical and Computing Foundations of Quantum Computing
Practical Credits: 1 2 hrs/week

List of Experiments:
Tools: GeoGebra, Desmos, Python (NumPy), or any simple matrix calculator.

No. | Experiment Title | Concepts Covered | Tool/Method
1 | Vector Addition and Superposition | Represent two 2D vectors and visualize their superposition (quantum state analogy). | GeoGebra / Desmos
2 | Matrix Multiplication Basics | Multiply 2×2 matrices and visualize the transformation of a 2D point. | Python NumPy / GeoGebra
3 | Quantum NOT Gate Simulation | Implement the NOT gate matrix X = [[0, 1], [1, 0]] and apply it to |0⟩ and |1⟩. |
4 | Matrix Transpose and Symmetry Check | Create any 3×3 matrix and verify A = A^T (symmetric) or A ≠ A^T. | Simple math tool
5 | Vector Space Verification | Verify if given vectors form a subspace under addition and scalar multiplication. | GeoGebra / manual
6 | Linear Transformation Visualization | Use a 2×2 matrix to rotate or project points in 2D space. | GeoGebra
7 | Complex Number Plane Plotting | Plot complex numbers in Cartesian and polar forms; verify Euler's relation e^(iθ) = cos θ + i sin θ. | Desmos / Python
8 | Bloch Sphere Projection (Intro) | Represent |0⟩ and |1⟩ on the Bloch sphere. |

Tools: Logisim or Logisim Evolution

No. | Experiment Title | Concept | Tool/Method
1 | Binary to Decimal Converter | Implement conversion circuit using XOR, AND, and OR gates. | Logisim
2 | Half Adder and Full Adder Circuits | Design classical addition logic for two bits, then extend to full adder. | Logisim
3 | NOT, AND, OR, XOR Gates Demonstration | Build and verify truth tables for basic logic gates. | Logisim
4 | Turing Machine Emulator (Basic) | Simulate tape read/write using a simple state transition circuit. | Logisim
5 | Binary Pattern Recognizer | Design a finite automata circuit that outputs high for a specific binary pattern (like "101"). | Logisim
6 | Logic-to-Quantum Transition Concept | Compare Boolean logic gates to unitary matrix equivalents (e.g., NOT ↔ X gate). | Logisim + explanation
SEMESTER-II
COURSE 3: PROBLEM SOLVING USING C
Theory Credits: 3 3 hrs/week

Course Objectives:
1. Understand the fundamentals of computer programming; apply structured problem-solving approaches using
algorithms, flowcharts, and C programming constructs.
2. Develop efficient logic using decision-making, loop, and jump control statements.
3. Utilize derived data types like arrays and strings for modular program design.
4. Design and implement modular solutions using functions, recursive logic, pointer operations, and dynamic
memory management.
5. Handle complex data structures including structures, unions, and text file operations.
Course Outcomes:
At the end of the course, students will be able to:
1. Understand basic computing concepts, programming paradigms and write structured C programs.
2. Apply control flow statements to solve logical and repetitive tasks in C.
3. Implement arrays and string operations to manage and manipulate data efficiently.
4. Design modular code using functions, recursion, and appropriate parameter passing.
5. Utilize pointers and memory operations for effective data handling. Demonstrate competence in dynamic
memory allocation and text file processing.
Unit 1. Introduction to computer programming:
Introduction, Types of software, Compiler and interpreter, Concepts of machine-level, assembly-level and
high-level programming, Flowcharts and Algorithms. Fundamentals of C: History of C, Features of C, C Tokens -
variables, keywords, identifiers, constants and data types, Rules for constructing variable names, Operators,
Structure of a C program, Input/output statements in C - formatted and unformatted I/O
Unit 2. Control statements:
Decision making statements: if, if else, else if ladder, switch statements. Loop control statements: while loop, for
loop and do-while loop. Jump control statements: break, continue and goto.
Unit 3. Derived datatypes in C:
Arrays: One Dimensional arrays - Declaration, Initialization and Memory representation; Two Dimensional arrays
- Declaration, Initialization and Memory representation. Strings: Declaring & Initializing string variables; String
handling functions, Character handling functions
Unit 4. Functions:
Pointers: Pointer data type, Pointer declaration, initialization, accessing values using pointers. Pointer arithmetic,
Pointers and arrays.
Function Prototype, definition and calling. Return statement. Nesting of functions. Categories of functions.
Recursion (Basic Concept only). Parameter Passing by address & by value. Local and Global variables. Storage
classes: automatic, external, static and register.
Unit 5. Dynamic Memory Management:
Introduction, Functions - malloc, calloc, realloc, free Structures: Basics of structure, structure members, accessing
structure members, nested structures, array of structures, structure and functions, structures and pointers. Unions -
Union definition; difference between Structures and Unions. Working with text files - modes: opening, reading,
writing and closing text files.
Text Books:
1. Programming in ANSI C, E. Balagurusamy, Tata McGraw Hill, 6th Edn
2. Computer fundamentals and programming in C, Reema Thareja, Oxford University Press
Reference Books:
1. Let us C, Y. Kanetkar, BPB publications
2. Head First C: A Brain-Friendly Guide, David Griffiths, Dawn Griffiths
Activities:
Outcome: Understand basic computing concepts, programming paradigms and write structured C programs.
Activity: Create a concept map of computing fundamentals and programming paradigms (procedural, structured,
object-oriented). Then, they write a structured C program (e.g., a calculator or student grade system) using proper
syntax, indentation, and modular design.
Evaluation Method: Rubric-based Code Review & Viva to check:
○ Correctness of the concept map
○ Correct use of structure (main + functions)
○ Identification of paradigm used
○ Code readability and documentation
Outcome: Apply control flow statements to solve logical and repetitive tasks in C.
Activity: Implement a program that solves a logic puzzle (e.g., number guessing game, pattern
generation, or prime number finder) using if, switch, for, while, and do-while.
Evaluation Method: Automated Test Cases + Peer Review to check:
○ Correct use of control statements
○ Logical correctness of output
○ Efficiency and edge case handling
○ Peer feedback on clarity and logic
Outcome: Implement arrays and string operations to manage and manipulate data efficiently.
Activity: Build a program that stores and arranges student marks in ascending and descending order
using arrays and performs string operations like concatenation, comparing, and formatting names.
Evaluation Method: Functional Demonstration + Code Walkthrough to check:
○ Correct array and string usage
○ Memory efficiency
○ Handling of invalid inputs
○ Explanation of sorting/searching logic
Outcome: Design modular code using functions, recursion, and appropriate parameter passing.
Activity: Recursive Problem Solver - students write a modular program to solve a recursive problem (e.g.,
factorial, Fibonacci, or Tower of Hanoi) using functions with parameters and return values.
Evaluation Method: Code Trace + Written Quiz to check:
○ Correct function decomposition
○ Proper parameter passing (by value/reference)
○ Recursion depth and base case handling
○ Quiz on tracing recursive calls
Outcome: Utilize pointers and memory operations for effective data handling. Demonstrate competence
in dynamic memory allocation and text file processing.
Activity: Create a program that dynamically stores user input (e.g., survey responses) using pointers
and writes/reads the data to/from a text file.
Evaluation Method: Memory Debugging + File I/O Assessment to check:
○ Proper use of malloc, calloc, realloc, and free
○ Pointer arithmetic and dereferencing
○ File creation, reading, writing, and error handling
○ Use of tools like Valgrind or manual memory trace (optional, for Unix flavours)
SEMESTER-II
COURSE 3: PROBLEM SOLVING USING C
Practical Credits: 1 2 hrs/week

List of Experiments

1. Write a program to check whether the given number is Armstrong or not.


2. Write a program to find the sum of individual digits of a positive integer.
3. Write a program to generate the first n terms of the Fibonacci sequence.
4. Write a program to find both the largest and smallest number in a list of integer values.
5. Write a program to demonstrate change in parameter values while swapping two integer
variables using Call by Value & Call by Address.
6. Write a program to perform various string operations.
7. Write a program to search an element in a given list of values.
8. Write a program that uses functions to add two matrices.
9. Write a program to calculate factorial of a given integer value using recursive functions.
10. Write a program for multiplication of two N × N matrices.
11. Write a program to sort a given list of integers in ascending order.
12. Write a program to calculate the salaries of all employees using the Employee (ID, Name,
Designation, Basic Pay, DA, HRA, Gross Salary, Deduction, Net Salary) structure.
a. DA is 30% of Basic Pay
b. HRA is 15% of Basic Pay
c. Deduction is 10% of (Basic Pay + DA)
d. Gross Salary = Basic Pay + DA + HRA
e. Net Salary = Gross Salary – Deduction
13. Write a program to read/write the data from/to a file.
14. Write a program to reverse the contents of a file and store in another file.

15. Write a program to create Book (ISBN, Title, Author, Price, Pages, Publisher) structure and store
book details in a file and perform the following operations:
a. Add book details
b. Search book details for a given ISBN and display them, if available
c. Update book details using ISBN
d. Delete book details for a given ISBN and display list of remaining books.
SEMESTER-II
COURSE 4: Numerical Methods for Quantum Computing
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To understand the foundations of numerical computation and approximation.
2. To analyze different types and sources of numerical errors.
3. To apply numerical methods for solving algebraic and linear systems.
4. To compute numerical differentiation and integration using appropriate algorithms.
5. To solve ordinary differential equations using iterative and Runge-Kutta methods.

Course Learning Outcomes


After completing this course, students will be able to:
1. Explain the significance and limitations of numerical methods in solving engineering problems.
2. Identify and minimize different sources of numerical errors in computation.
3. Apply root-finding and linear system solving methods to practical problems.
4. Perform numerical differentiation and integration accurately.
5. Implement numerical algorithms to solve ordinary differential equations with controlled error.

Unit–I: Introduction to Numerical Methods and Errors (9 Periods)


Aim of Numerical Methods, Concept of Numerical Approximation, Measuring Errors: Absolute, Relative,
and Percentage Errors, Important Definitions and Theorems Related to Numerical Methods, Round-off and
Truncation Errors, Error Propagation and Machine Epsilon, Trade-off between Round-off and Truncation
Errors
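The round-off and machine-epsilon ideas above can be demonstrated in a few lines of Python (IEEE 754 float64 arithmetic assumed):

```python
import sys

# Machine epsilon: the smallest eps such that 1.0 + eps != 1.0 in float64.
eps = sys.float_info.epsilon
print(eps)                   # 2.220446049250313e-16
assert 1.0 + eps != 1.0
assert 1.0 + eps / 2 == 1.0  # below eps, the addition rounds back to 1.0

# Round-off in action: 0.1 + 0.2 is not exactly 0.3 in binary floating point.
print(0.1 + 0.2)             # 0.30000000000000004
```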

Unit–II: Solution of Algebraic and Transcendental Equations (9 Periods)


Classification of Equations, Bracketing Methods: Bisection and False Position Methods, Open Methods:
Fixed-Point Iteration, Newton-Raphson Method, Convergence Analysis and Order of Convergence,
Estimation of Errors and Convergence Rate, Backward and Forward Error Analysis
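A minimal Newton-Raphson sketch for the transcendental equation f(x) = e^(-x) - x = 0 (the same test function used in the lab list later); the helper name `newton` is illustrative, not prescribed:

```python
import math

# Newton-Raphson for f(x) = e^(-x) - x, with f'(x) = -e^(-x) - 1.
def newton(x, tol=1e-12, max_iter=50):
    for i in range(max_iter):
        fx = math.exp(-x) - x
        if abs(fx) < tol:
            return x, i                      # root and iteration count
        x -= fx / (-math.exp(-x) - 1)        # x_{n+1} = x_n - f(x_n)/f'(x_n)
    return x, max_iter

root, iterations = newton(0.5)
print(root)  # ~0.5671432904097838 (quadratic convergence: very few iterations)
```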

Unit–III: Systems of Linear Equations (9 Periods)


Matrix Representation of Linear Systems, Gaussian Elimination and Gauss-Jordan Methods, LU
Decomposition and Pivoting, Iterative Methods: Jacobi and Gauss-Seidel Methods, Convergence Criteria
and Error Estimation
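The Jacobi iteration described above can be sketched and checked against a direct solver; the 3×3 system below is an arbitrary diagonally dominant example (diagonal dominance guarantees convergence):

```python
import numpy as np

# Jacobi iteration for Ax = b, compared with NumPy's direct solver.
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])

x = np.zeros(3)
D = np.diag(A)                # diagonal entries a_ii
R = A - np.diagflat(D)        # off-diagonal part of A
for _ in range(50):
    x = (b - R @ x) / D       # x_i <- (b_i - sum_{j != i} a_ij x_j) / a_ii

print(np.allclose(x, np.linalg.solve(A, b)))  # True
```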

Unit–IV: Numerical Differentiation and Integration (9 Periods)


Numerical Differentiation using Finite Differences, Polynomial Interpolation and Lagrange Interpolation,
Newton-Cotes Quadrature Formulas (Trapezoidal, Simpson’s Rules), Romberg Integration and Richardson
Extrapolation, Gauss Quadrature Methods and Error Estimation
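A compact sketch of the two Newton-Cotes rules named above, applied to the integral of sin(x) over [0, π] (exact value 2); the function names are illustrative:

```python
import math

# Composite trapezoidal rule with n subintervals.
def trapezoid(f, a, b, n):
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

# Composite Simpson's rule; n must be even (coefficients 1, 4, 2, 4, ..., 1).
def simpson(f, a, b, n):
    h = (b - a) / n
    s = f(a) + f(b)
    s += sum((4 if i % 2 else 2) * f(a + i * h) for i in range(1, n))
    return s * h / 3

print(trapezoid(math.sin, 0, math.pi, 100))  # close to 2, second-order error
print(simpson(math.sin, 0, math.pi, 100))    # much closer, fourth-order error
```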

Unit–V: Numerical Solutions of Differential Equations (9 Periods)


Initial Value Problems (IVPs), Euler’s Method – Stability and Convergence, Runge-Kutta Methods (Second
and Fourth Order), Adaptive Methods for IVPs, Systems and Higher-Order Differential Equations, Global
and Local Truncation Errors

Text Books:

1. Numerical Methods for Engineering and Data Science (Wuthrich & El Ayoubi, CRC Press, 2025)
2. Numerical Methods Fundamentals by [Link]
SEMESTER-II
COURSE 4: Numerical Methods for Quantum Computing
Practical Credits: 1 2 hrs/week

List of Experiments:
Exp. 1: Study of Round-off and Truncation Errors
Objective: To understand the effect of finite precision and floating-point arithmetic on numerical results.
Tasks: Perform addition/subtraction of small & large numbers, observe round-off errors, compute machine epsilon (eps).
Tools: Octave / Python (sys.float_info.epsilon)

Exp. 2: Error Propagation and Significant Digits
Objective: To study how input data errors propagate through calculations.
Tasks: Evaluate propagation of input errors using addition/subtraction; demonstrate loss of significance.
Tools: Octave / Python

Exp. 3: Bisection Method for Nonlinear Equations
Objective: To find the root of a nonlinear equation using a bracketing method.
Tasks: Implement the Bisection Method for f(x) = x^3 - 4x + 1; plot error reduction per iteration.
Tools: Octave / Python

Exp. 4: Newton-Raphson and Fixed Point Iteration
Objective: To solve nonlinear equations and compare convergence speed.
Tasks: Apply Newton-Raphson and Fixed-Point Iteration to f(x) = e^(-x) - x; compare iteration count and error.
Tools: Octave / Python

Exp. 5: Solving Linear Systems using Gaussian Elimination
Objective: To solve a linear system using the direct elimination approach.
Tasks: Input a 3×3 system; perform forward elimination and back substitution; verify with a built-in solver.
Tools: Octave / Python ([Link]())

Exp. 6: LU Decomposition Method
Objective: To solve linear systems efficiently using LU factorization.
Tasks: Decompose matrix A = LU; solve for X given B; compare with direct methods.
Tools: Octave (lu()), Python ([Link]())

Exp. 7: Polynomial and Lagrange Interpolation
Objective: To approximate a function from discrete data points.
Tasks: Input (x, y) values; generate the interpolation polynomial; plot the interpolated curve and compare with the true function.
Tools: Octave / Python ([Link], matplotlib)

Exp. 8: Numerical Integration using Trapezoidal and Simpson's Rules
Objective: To compute definite integrals numerically and compare accuracy.
Tasks: Implement both rules for f(x) = sin(x) on [0, π]; compare with the analytical result.
Tools: Octave / Python ([Link])

Exp. 9: Euler's Method for Solving First-Order ODE
Objective: To solve an initial value problem numerically using Euler's method.
Tasks: Solve dy/dx = y - x^2 + 1; compare results for step sizes h = 0.1, 0.05; plot error vs. iteration.
Tools: Octave / Python

Exp. 10: Runge-Kutta (RK4) Method for IVPs
Objective: To apply the 4th order Runge-Kutta method and compare with Euler's method.
Tasks: Solve dy/dx = x + y, y(0) = 1; compute the RK4 solution; compare accuracy & convergence with Euler's method.
Tools: Octave / Python
SEMESTER-III
COURSE 5: Data Structures using Python
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce the concept and classification of data structures and their role in efficient
programming.
2. To understand algorithm design, analysis, and implementation using Python.
3. To study and apply fundamental data structures such as arrays, linked lists, stacks, queues,
trees, and graphs.
4. To analyze searching, sorting, and hashing techniques for problem-solving efficiency.
5. To implement data structures using Python and explore their practical applications in
computing.
Course Learning Outcomes:
After successful completion of this course, students will be able to:
1. Explain the fundamental concepts of data structures and their operations.
2. Design and implement algorithms for solving computational problems efficiently.
3. Apply and manipulate linear data structures such as arrays, stacks, queues, and linked lists.
4. Build and traverse hierarchical data structures like trees and graphs using Python.
5. Analyze the time and space complexity of algorithms and implement hashing for efficient
data access.
Unit 1: Introduction to Data Structures (9 Periods)
Definition, importance, and applications of data structures. Types of data structures: Linear
and Non-Linear, Static and Dynamic, Homogeneous and Non-Homogeneous, Primitive vs
Non-Primitive data types. Operations on data structures (creation, insertion, deletion,
traversal, sorting, etc.). Algorithm design and analysis: Time & Space complexity, Big-O
notation, Time-Space trade-off. Abstract Data Types (ADT)
Unit 2: Arrays and Lists (9 Periods)
Definition and representation of arrays/lists, Array initialization and address calculation,
Operations on arrays/lists (traversing, insertion, deletion, searching, merging),
Multidimensional arrays and sparse matrices, Applications of arrays/lists
Unit 3: Linked Lists (9 Periods)
Concept and memory representation of linked lists, Types of linked lists: singly, circular, and
doubly linked lists, Operations: creation, insertion, deletion, traversal, searching, Header
linked lists and applications. Polynomial representation using linked lists
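A minimal sketch of singly-linked-list creation, insertion, and traversal with Python classes (the class and method names are illustrative):

```python
# Singly linked list: node creation, insertion at the head, traversal.
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None

    def insert_front(self, data):       # O(1) insertion at the head
        node = Node(data)
        node.next = self.head
        self.head = node

    def traverse(self):                 # visit every node in order
        items, cur = [], self.head
        while cur:
            items.append(cur.data)
            cur = cur.next
        return items

lst = LinkedList()
for value in (3, 2, 1):                 # each insert pushes to the front
    lst.insert_front(value)
print(lst.traverse())  # [1, 2, 3]
```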
Unit 4: Stacks and Queues (9 Periods)
Stack definition, operations (push, pop, peek), and implementation using arrays/lists and linked
lists. Applications: expression conversion, postfix/prefix evaluation, recursion, parenthesis
balancing. Queue definition and operations (enqueue, dequeue). Types of queues: Circular
Queue, Priority Queue, Double-Ended Queue (Deque). Applications of queues
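The stack operations and the parenthesis-balancing application above can be sketched with a Python list as the stack (the helper name `is_balanced` is illustrative):

```python
# Stack via a Python list: append is push, pop is pop.
def is_balanced(expr):
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in expr:
        if ch in '([{':
            stack.append(ch)            # push the opener
        elif ch in pairs:
            if not stack or stack.pop() != pairs[ch]:
                return False            # mismatched or missing opener
    return not stack                    # leftover openers mean unbalanced

print(is_balanced('(a + [b * c])'))  # True
print(is_balanced('(a + b]'))        # False
```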
Unit 5: Trees, Graphs, and Hashing (9 Periods)
Trees: terminology, binary tree, binary search tree, AVL tree – representation and traversal
methods. Graphs: definitions, types, representation (adjacency matrix/list), BFS and DFS
traversal. Searching and Sorting overview: linear, binary, and interpolation search; insertion
and quick sort basics. Hashing: hash functions, hash tables, collision and resolution techniques.
Applications in file structures and indexing
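BFS over an adjacency-list representation, as described above, can be sketched as follows (the sample graph is an arbitrary example):

```python
from collections import deque

# Breadth-first search on a graph stored as a dict of neighbor lists.
def bfs(graph, start):
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        node = queue.popleft()          # FIFO queue drives level order
        order.append(node)
        for nbr in graph[node]:
            if nbr not in visited:
                visited.add(nbr)
                queue.append(nbr)
    return order

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
print(bfs(graph, 'A'))  # ['A', 'B', 'C', 'D']
```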
Text Books:
1. Data Structures and Program Design Using Python by Dheeraj Malhotra & Neha
Malhotra.
2. Data Structures and Algorithms in Java/C++ by Robert Lafore
SEMESTER-III
COURSE 5: Data Structures using Python
Practical Credits: 1 2 hrs/week

List of Experiments:

1. Write a Python program to create, insert, delete, and traverse elements of different primitive data structures. (Concepts: Basic data operations, ADT)
2. Develop and test simple algorithms (e.g., find max/min, average, factorial) and calculate their time complexity using Big-O notation. (Concepts: Algorithm design, complexity analysis)
3. Implement a Python program to perform insertion, deletion, and traversal in a list/array. (Concepts: Array operations and manipulation)
4. Develop a program to perform matrix addition, subtraction, and multiplication using 2-D arrays/lists. (Concepts: Multidimensional arrays and matrices)
5. Implement sparse matrix representation and display its triplet form. (Concepts: Sparse matrices representation)
6. Create and traverse a singly linked list using Python classes and objects. (Concepts: Node creation and traversal)
7. Implement insertion and deletion operations in singly and doubly linked lists. (Concepts: Dynamic memory and pointer manipulation)
8. Write a program to represent and add two polynomials using linked lists. (Concepts: Application of linked lists)
9. Implement a stack using Python lists and perform push, pop, and peek operations. (Concepts: Stack ADT and operations)
10. Convert an infix expression to postfix and evaluate the postfix expression using a stack. (Concepts: Expression conversion and evaluation)
11. Implement queue operations (enqueue, dequeue) using lists and linked lists; demonstrate Circular Queue and Priority Queue. (Concepts: Queue types and applications)
12. Implement creation and traversal (inorder, preorder, postorder) of a Binary Search Tree. (Concepts: Tree representation and traversal)
13. Write a program to perform insertion and rotation in an AVL tree. (Concepts: Height-balanced trees)
14. Represent a graph using adjacency matrix and adjacency list; implement BFS and DFS. (Concepts: Graph representation and traversal algorithms)
15. Implement hashing with linear probing and quadratic probing for collision resolution. (Concepts: Hash tables, hash functions, collision resolution)
SEMESTER-III
COURSE 6: Operating Systems
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To understand the fundamental principles, architecture, and functionality of operating
systems.
2. To explore process, thread, and CPU scheduling mechanisms for multitasking systems.
3. To study memory management, paging, segmentation, and virtual memory algorithms.
4. To understand file system architecture, I/O subsystems, and device management.
5. To gain hands-on experience with OS concepts using Linux system programming and shell
utilities.

Course Learning Outcomes:


After completing this course, students will be able to:
1. Explain the fundamental components, structure, and types of operating systems.
2. Analyze and simulate process management, scheduling, and synchronization techniques.
3. Apply memory management and virtual memory concepts to system design.
4. Demonstrate understanding of file systems, I/O handling, and system call mechanisms.
5. Evaluate operating system designs and compare real-world systems such as Linux,
Android, and Windows.

Unit I – Introduction to Operating Systems (9 Periods)


Definition, Objectives, and Functions of Operating Systems, OS as Extended Machine and
Resource Manager, Evolution: Batch, Multiprogramming, Time Sharing, Personal and Mobile
OS, Types: Mainframe, Server, Embedded, Real-Time, IoT, Smart Card OS, System Calls,
Operating System Structures (Monolithic, Microkernel, Client-Server, VM, Exokernel), OS
Research and Modern Trends

Unit II – Process and Thread Management (9 Periods)


Process Concept, States, PCB, Process Creation & Termination, Process Scheduling:
Preemptive and Non-Preemptive Scheduling Algorithms, Threads: User vs Kernel Threads,
Multithreading, POSIX Threads, Synchronization: Semaphores, Monitors, Mutexes, Classical
Problems: Producer-Consumer, Readers-Writers, Dining Philosophers
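The classical producer-consumer problem listed above can be modelled with Python's threading primitives (the lab work uses C; this is only an illustrative sketch of the two-semaphore-plus-mutex pattern):

```python
# Producer-consumer with a bounded buffer: `empty` counts free slots,
# `full` counts filled slots, and a mutex guards the buffer itself.
import threading
from collections import deque

CAPACITY = 3
buffer = deque()
empty = threading.Semaphore(CAPACITY)  # producer waits when buffer is full
full = threading.Semaphore(0)          # consumer waits when buffer is empty
mutex = threading.Lock()
consumed = []

def producer():
    for item in range(5):
        empty.acquire()
        with mutex:
            buffer.append(item)
        full.release()

def consumer():
    for _ in range(5):
        full.acquire()
        with mutex:
            consumed.append(buffer.popleft())
        empty.release()

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(consumed)  # [0, 1, 2, 3, 4]: no item lost or duplicated
```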

Unit III – Memory Management (9 Periods)


Contiguous and Non-contiguous Allocation, Paging, Segmentation, and Virtual Memory
Concepts, Page Table Structures, TLB, and Page Replacement Algorithms (FIFO, LRU,
Optimal, WSClock), Thrashing and Working Set Model, Linux Memory Management
Overview
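A FIFO page-replacement simulation, as covered above, fits in a few lines of Python (the reference string is an arbitrary example); LRU differs only in evicting the least recently used page instead of the oldest one:

```python
# FIFO page replacement: count page faults for a reference string
# with a fixed number of frames, always evicting the oldest page.
from collections import deque

def fifo_faults(pages, frames):
    memory, faults = deque(), 0
    for page in pages:
        if page not in memory:
            faults += 1
            if len(memory) == frames:
                memory.popleft()       # evict the oldest resident page
            memory.append(page)
    return faults

ref = [7, 0, 1, 2, 0, 3, 0, 4, 2, 3]
print(fifo_faults(ref, 3))  # 9 page faults with 3 frames
```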

Unit IV – File Systems and I/O Management (9 Periods)


File Concept: Structure, Access Methods, and Attributes, Directory Structure and File-System
Implementation, Disk-Space Management, Disk Scheduling (FCFS, SSTF, SCAN, C-SCAN)
I/O System: Device Controllers, Drivers, DMA, Interrupts, Case Study: UNIX File System,
Linux I/O Handling
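The disk-scheduling algorithms above can be compared by total head movement; a Python sketch using a commonly cited textbook request queue:

```python
# Total head movement (in cylinders) for FCFS vs. SSTF disk scheduling
# over the same request queue.

def fcfs(requests, head):
    total = 0
    for r in requests:               # serve requests in arrival order
        total += abs(head - r)
        head = r
    return total

def sstf(requests, head):
    pending, total = list(requests), 0
    while pending:                   # always serve the nearest request next
        nearest = min(pending, key=lambda r: abs(head - r))
        total += abs(head - nearest)
        head = nearest
        pending.remove(nearest)
    return total

queue = [98, 183, 37, 122, 14, 124, 65, 67]
print(fcfs(queue, 53))   # 640 cylinders
print(sstf(queue, 53))   # 236 cylinders
```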
Unit V – Deadlocks, Virtualization, and Security (9 Periods)
Deadlocks: Conditions, Detection, Prevention, and Avoidance (Banker’s
Algorithm), Multiprocessor and Distributed Operating Systems, Virtualization: Hypervisors,
Containers, Cloud OS Concepts, Security Principles: Authentication, Access Control,
Vulnerabilities, OS Hardening, Case Studies: Linux, Android, and Windows 11 Internals
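The Banker's algorithm safety check mentioned above can be sketched in Python (the resource matrices are the classic 5-process, 3-resource example from standard texts):

```python
# Banker's algorithm safety check: the state is safe if every process
# can finish in some order using available + released resources.

def is_safe(available, max_need, allocation):
    need = [[m - a for m, a in zip(mrow, arow)]
            for mrow, arow in zip(max_need, allocation)]
    work, finish = list(available), [False] * len(allocation)
    progress = True
    while progress:
        progress = False
        for i, done in enumerate(finish):
            if not done and all(n <= w for n, w in zip(need[i], work)):
                # process i can finish; it releases its allocation
                work = [w + a for w, a in zip(work, allocation[i])]
                finish[i] = True
                progress = True
    return all(finish)

available  = [3, 3, 2]
max_need   = [[7, 5, 3], [3, 2, 2], [9, 0, 2], [2, 2, 2], [4, 3, 3]]
allocation = [[0, 1, 0], [2, 0, 0], [3, 0, 2], [2, 1, 1], [0, 0, 2]]
print(is_safe(available, max_need, allocation))  # True: a safe sequence exists
```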
Text Books:
1. Andrew S. Tanenbaum & Herbert Bos, Modern Operating Systems, 5th Ed., Pearson, 2023.
2. Abraham Silberschatz, Peter Baer Galvin, Greg Gagne, Operating System Concepts, 10th
Ed., Wiley.
3. William Stallings, Operating Systems: Internals and Design Principles, 9th Ed., Pearson.
4. Daniel P. Bovet & Marco Cesati, Understanding the Linux Kernel, O’Reilly.
SEMESTER-III
COURSE 6: Operating Systems
Practical Credits: 1 2 hrs/week

List of Experiments:

1. Introduction to Linux Shell and System Calls: Explore file operations, user management, and system calls using open(), read(), write(), and fork().
2. Process Creation using fork(): Demonstrate the parent-child process relationship and process IDs.
3. Implementation of Scheduling Algorithms: Simulate FCFS, SJF, Priority, and Round Robin scheduling in C.
4. Inter-Process Communication: Implement message passing and shared memory communication.
5. Producer-Consumer Problem: Implement synchronization using semaphores or mutex locks.
6. Reader-Writer Problem: Demonstrate readers-writers synchronization using semaphores.
7. Memory Management Simulation: Simulate paging and page replacement algorithms (FIFO, LRU).
8. File System Commands and Manipulation: Implement file creation, deletion, and directory traversal using shell commands and C programs.
9. Disk Scheduling Algorithms: Implement disk scheduling algorithms (FCFS, SSTF, SCAN, C-SCAN).
10. Deadlock Detection and Avoidance: Simulate resource allocation and Banker’s algorithm in C.
11. Virtualization Demonstration: Demonstrate VirtualBox or Docker container setup and basic OS virtualization concepts.
12. Mini Project: Combine process, memory, or file management concepts into a single integrated OS simulation.
SEMESTER-III
COURSE 7: Foundations of Quantum Technologies
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce the foundational principles of quantum mechanics relevant to computing.
2. To understand various architectures and technologies used in quantum computers.
3. To explore core quantum phenomena such as superposition, entanglement, and quantum
gates.
4. To gain practical exposure to Qiskit and IBM Q for simulating and implementing
quantum algorithms.
5. To analyze quantum algorithms, communication, and applications across industries.

Course Learning Outcomes (CLOs):


After successful completion, learners will be able to:
1. Explain the fundamental quantum mechanical principles applied in computing.
2. Compare and contrast classical and quantum computing paradigms.
3. Design and implement quantum gates and circuits using Qiskit.
4. Apply quantum communication and error correction techniques for secure systems.
5. Evaluate the potential of quantum algorithms and their industrial applications.

Unit I – Principles of Quantum Computing (9 Periods)


Fundamentals of quantum behavior, Subatomic particles and quantum ensemble, Wave-particle
duality and uncertainty principle, Quantum entanglement and synchronization, Quantum
information, teleportation, and technologies, Quantum field theory, Schrödinger’s wave
mechanics, and interpretations (Copenhagen, Many-Worlds)

Unit II – Quantum Computers and Quantum States (9 Periods)


Types of quantum computers: D-Wave, IonQ, Honeywell, Microsoft, Xanadu, Rigetti,
Superconducting qubits and cryogenic cooling, Superposition and entanglement concepts,
Mathematical representation of quantum states, Decoherence, fault tolerance, and scalability
issues

Unit III – Quantum Gates, Circuits, and Qiskit (9 Periods)


Unitary matrices and Bloch sphere, Basic quantum gates: X, Y, Z, H, S, T, CNOT, SWAP,
Building and simulating circuits using Qiskit and IBM Q, Quantum Composer and Quantum
Lab, Grover’s search and Oracle in Qiskit
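Since the gates above are just unitary matrices acting on a state vector, their effect can be reproduced without any quantum library; the plain-Python sketch below (using the illustrative convention that basis states are ordered |q0 q1>) builds a Bell state from H and CNOT:

```python
# Apply H to qubit 0, then CNOT (control q0, target q1), to produce
# the Bell state (|00> + |11>) / sqrt(2) by direct matrix-vector math.
import math

def apply(gate, state):
    return [sum(gate[r][c] * state[c] for c in range(len(state)))
            for r in range(len(gate))]

h = 1 / math.sqrt(2)
H_on_q0 = [[h, 0, h, 0],   # H on the first qubit, identity on the second
           [0, h, 0, h],
           [h, 0, -h, 0],
           [0, h, 0, -h]]
CNOT = [[1, 0, 0, 0],      # control = first qubit, target = second
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]                  # start in |00>
state = apply(CNOT, apply(H_on_q0, state))
print([round(a, 3) for a in state])   # [0.707, 0.0, 0.0, 0.707]
```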

Unit IV – Quantum Communication and Error Correction (9 Periods)


Quantum superdense coding (Message 00–11), Quantum teleportation protocols, Quantum key
distribution (QKD) and post-quantum cryptography, Quantum error types and mitigation,
Shor’s Code and 3-qubit bit-flip/phase-flip correction
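The 3-qubit bit-flip code above has a simple classical analogue: triplicate the bit and decode by majority vote. The sketch below illustrates only that decoding logic, not the quantum syndrome measurement the real code uses:

```python
# Classical analogue of the 3-qubit bit-flip code: encode one logical
# bit as three copies; any single flip is corrected by majority vote.
from collections import Counter

def encode(bit):
    return [bit, bit, bit]

def decode(codeword):
    return Counter(codeword).most_common(1)[0][0]  # majority vote

word = encode(1)     # [1, 1, 1]
word[2] ^= 1         # a single bit-flip error: [1, 1, 0]
print(decode(word))  # 1: the logical bit survives the error
```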

Unit V – Quantum Algorithms and Industrial Applications (9 Periods)


Quantum algorithms: Phase Kickback, Grover’s, Shor’s, Deutsch-Jozsa, Bernstein–Vazirani,
Quantum computing in cryptography, finance, AI, pharmaceuticals, and materials science,
Challenges in post-quantum cryptography and standardization, Industry case studies: IBM,
Google, Microsoft, and Atom Computing

Text Books:
1. Sudeep Satheesan & Sri Mounica Kalidasu (2025). Quantum Computing: Concepts,
Fundamentals, Circuits, and Code. BPB Publications, India.
SEMESTER-III
COURSE 7: Foundations of Quantum Technologies
Practical Credits: 1 2 hrs/week

List of Experiments:
1. Introduction to IBM Quantum Lab and Qiskit Installation: Set up the Python environment and install the Qiskit library. (Outcome: successfully configure Qiskit and connect to an IBM Quantum backend)
2. Representation of Qubits and Bloch Sphere Visualization: Understand superposition and visualize single-qubit states. (Outcome: visualize single-qubit states on the Bloch sphere)
3. Implementation of Basic Quantum Gates (X, Y, Z, H, S, T): Study and apply single-qubit unitary gates. (Outcome: create and simulate gates in Qiskit to observe state transformations)
4. Construction of Two-Qubit Gates (CNOT, SWAP): Demonstrate entanglement and controlled operations. (Outcome: build simple two-qubit circuits and verify entangled output)
5. Simulation of Superposition and Entanglement States: Explore concepts of quantum parallelism. (Outcome: observe and measure Bell pair entanglement using Qiskit)
6. Quantum Circuit Design using IBM Quantum Composer: Create, visualize, and simulate circuits graphically. (Outcome: design quantum circuits for gate sequences and analyze output)
7. Grover’s Search Algorithm using Qiskit: Implement Grover’s algorithm for searching unsorted data. (Outcome: demonstrate quantum speedup over classical search)
8. Deutsch–Jozsa Algorithm Implementation: Distinguish between constant and balanced functions. (Outcome: implement the circuit and verify results through simulation)
9. Shor’s Algorithm Demonstration (Simulation): Understand integer factorization using quantum circuits. (Outcome: simulate small-number factorization using modular arithmetic)
10. Quantum Teleportation Protocol: Implement quantum state transfer using entanglement. (Outcome: successfully teleport a qubit state across quantum nodes)
11. Quantum Key Distribution (BB84 Protocol): Explore secure communication using quantum mechanics. (Outcome: implement and test a basic QKD communication setup)
12. Quantum Error Detection and Correction (3-Qubit Code): Apply bit-flip and phase-flip correction codes. (Outcome: demonstrate error detection and correction on qubits)
13. Design of Classical Logic Gates using Quantum Gates: Construct AND, OR, XOR using quantum equivalents. (Outcome: simulate logic gates and verify classical truth tables)
14. Implementation of Bernstein–Vazirani Algorithm: Learn to find hidden binary strings efficiently. (Outcome: build the circuit and verify output using a Qiskit backend)
15. Real Quantum Device Execution: Execute any quantum circuit on IBM’s real hardware. (Outcome: observe the difference between simulator and actual device output)
SEMESTER-IV
COURSE 8: Python for Quantum Computing
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce students to Python-based quantum computing frameworks for algorithm
simulation and experimentation.
2. To enable learners to use Cirq for quantum circuit creation, execution, and analysis.
3. To develop proficiency in LazyQML for hybrid quantum–classical machine learning
model design.
4. To explore quantum cryptographic techniques and quantum key distribution (QKD)
through Python libraries.
5. To equip students with the ability to build, test, and analyze secure quantum algorithms for
real-world applications.
Course Outcomes:
After completing this course, students will be able to:
1. Implement quantum circuits using Cirq and visualize qubit operations.
2. Design and execute quantum machine learning models using LazyQML.
3. Demonstrate understanding of quantum-safe cryptographic algorithms.
4. Integrate classical and quantum computation techniques in Python.
5. Apply quantum libraries to solve practical problems in quantum cyber security and
secure communication.
Unit I – Foundations of Quantum Programming in Python
Overview of quantum computing and its Python ecosystem, Installation and setup of quantum
libraries (Cirq, LazyQML, Qiskit, QuCrypt, etc.), Qubit representation, gates, and
measurement, Superposition, entanglement, and unitary operations in Python

Unit II – Quantum Circuit Design and Simulation with Cirq


Introduction to Cirq and Google’s quantum framework, Creating and visualizing quantum
circuits, Measurement outcomes and probability distributions, Quantum teleportation and
Grover’s algorithm in Cirq, Integration with TensorFlow Quantum for hybrid models

Unit III – Quantum Machine Learning using LazyQML


Overview of hybrid quantum-classical ML models, Installation and structure of LazyQML,
Quantum feature maps, kernels, and parameterized circuits, Training quantum neural networks
and classifiers, Comparison: LazyQML vs PennyLane and Qiskit Machine Learning

Unit IV – Python Libraries for Quantum Cyber Security


Introduction to Quantum Cryptography and Quantum Key Distribution (QKD), Quantum-safe
algorithms and post-quantum cryptography overview, Libraries: QuCrypt, PyQKD, and
SimulaQron, Implementing BB84 and E91 protocols in Python, Quantum random number
generation (QRNG) using Python
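The BB84 sifting step discussed above can be sketched classically in Python (a toy model with hypothetical variable names, not the PyQKD or SimulaQron API): Alice sends random bits in random bases, Bob measures in random bases, and only positions where the bases matched are kept.

```python
# Toy BB84 sifting: when bases match, Bob reads Alice's bit exactly;
# when they differ, his measurement result is random and is discarded.
import random

random.seed(7)                      # fixed seed for reproducibility
n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice('+x') for _ in range(n)]
bob_bases   = [random.choice('+x') for _ in range(n)]

bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Keep only positions where Alice's and Bob's bases agreed.
sifted_key = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]
print(sifted_key)
```

On the kept positions Alice's and Bob's bits agree by construction; in the full protocol a sample of them is compared publicly to detect eavesdropping.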
Unit V – Integrated Applications and Research Trends
Combining Cirq and LazyQML for quantum data protection, Quantum error correction and
noise simulation libraries, Case studies: Quantum-enhanced cybersecurity systems, Current
research in quantum-safe blockchain and authentication, Future trends in Python-based
quantum development
Suggested Textbooks / References
1. Google Cirq Documentation – [Link]
2. LazyQML Official Docs and Tutorials – [Link]
3. Qiskit Textbook (IBM Quantum) – “Learn Quantum Computation Using Qiskit” (2023
Edition)
4. K. Bharti et al., "Quantum Machine Learning: Theory and Implementations", Springer,
2022.
5. M. Mosca, "Quantum Cryptography and Cybersecurity", Cambridge University Press,
2023.
SEMESTER-IV
COURSE 8: Python for Quantum Computing
Practical Credits: 1 2 hrs/week

List of Experiments:

1. Implementing Basic Quantum Gates in Cirq (Tools: Cirq)
2. Bell State Creation and Measurement (Tools: Cirq)
3. Quantum Teleportation Protocol (Tools: Cirq)
4. Quantum Classifier using LazyQML (Tools: LazyQML)
5. Quantum Key Distribution (BB84) Simulation (Tools: PyQKD)
6. Quantum Random Number Generator (Tools: QuCrypt)
7. Quantum Noise Simulation and Error Correction (Tools: Cirq)
8. Hybrid Quantum–Classical Neural Network (Tools: LazyQML + TensorFlow Quantum)
9. Quantum Secure Chat using Python QKD (Tools: PyQKD)
10. Capstone: Quantum Cybersecurity Demo (Tools: All Libraries)
SEMESTER-IV
COURSE 9: Quantum Computing using QISKIT
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce fundamental concepts of quantum computation and information theory.
2. To provide practical skills for quantum programming using Python and Qiskit.
3. To develop an understanding of quantum states, gates, and circuits.
4. To explore mathematical foundations such as vectors, matrices, and Hilbert spaces used in
quantum systems.
5. To equip learners to simulate and execute basic quantum algorithms on quantum simulators
or IBM Quantum devices.

Course Outcomes:
After successful completion, students will be able to:
1. Describe the structure of quantum computing systems and their difference from classical
computers.
2. Write and execute quantum programs using Qiskit and Python.
3. Apply linear algebra concepts to understand qubit transformations and superposition.
4. Design and simulate quantum circuits for computation and measurement tasks.
5. Demonstrate practical implementations of simple quantum algorithms and interpret
simulation results.

Unit I – Foundations of Quantum Computing (9 Periods)


Introduction to Quantum Computing and Qiskit textbook overview, Classical bits vs Qubits:
representation and manipulation, The concept of superposition and entanglement,
Understanding quantum states and measurement, Basic Python and Jupyter Notebooks for
Quantum Programming, Installing and configuring Qiskit

Unit II – Programming with Qiskit (9 Periods)


QuantumCircuit, QuantumRegister, ClassicalRegister, Building and visualizing quantum
circuits using Qiskit, Single-qubit and multi-qubit gates (X, Y, Z, H, CX, etc.), Measurement
operations and simulation on QASM and Statevector simulators, Using Aer and IBMQ
providers for quantum backend access

Unit III – Mathematical Foundations for Quantum Computing (9 Periods)


Basics of Linear Algebra: vectors, matrices, and tensor products, Vector spaces, linear
dependence, basis, and dimension, Hermitian and Unitary matrices in quantum computation,
Inner product, normalization, and orthogonality in Hilbert spaces, Eigenvalues and
eigenvectors of quantum operators

Unit IV – Quantum Logic and Circuit Design (9 Periods)


Quantum gates as matrix operations, Quantum state transformations and unitary evolution,
Circuit composition and custom gate creation in Qiskit, Measurement and classical control,
Accessing and executing jobs on IBM Quantum hardware

Unit V – Quantum Algorithms and Applications (9 Periods)


Overview of basic quantum algorithms: Deutsch–Jozsa, Grover’s, and Bernstein–Vazirani,
Quantum teleportation and superdense coding, Quantum error and noise introduction,
Introduction to Variational Quantum Circuits (VQC) and hybrid quantum-classical approaches,
Future directions: Quantum machine learning and optimization
Textbook:
• Learn Quantum Computing Using Qiskit (Qiskit Textbook, IBM Quantum Community)

References:
1. Nielsen, M. A., & Chuang, I. L. (2010). Quantum Computation and Quantum Information.
2. Asfaw, A. et al. Qiskit Textbook Online Resource.
3. Benenti, G., Casati, G., & Strini, G. (2019). Principles of Quantum Computation and
Information.
4. Qiskit Documentation – [Link]
SEMESTER-IV
COURSE 9: Quantum Computing using QISKIT
Practical Credits: 1 2 hrs/week

List of Experiments:

1. Setting up the Qiskit Environment – Installing Anaconda, Python, and Qiskit; exploring Jupyter Notebook. (Tools: Qiskit, Jupyter Notebook)
2. Introduction to Qiskit Circuits – Create simple quantum circuits and visualize them using [Link](). (Tools: QuantumCircuit, draw())
3. Single-Qubit Gates Implementation – Apply X, Y, Z, and H gates on qubits and analyze results on statevector. (Tools: Quantum gates, statevector_simulator)
4. Two-Qubit Entanglement (Bell State) – Implement H + CX to create an entangled pair and measure results. (Tools: Entanglement, plot_histogram())
5. Measurement and Probabilistic Results – Measure multiple qubits and interpret output distributions. (Tools: Measurement, QASM simulator)
6. Custom Quantum Gate Creation – Build and append user-defined gates; observe their effect on quantum states. (Tools: to_instruction(), append())
7. Bloch Sphere Visualization – Represent qubit states and transformations using Bloch vector plots. (Tools: [Link], matplotlib)
8. Linear Algebra Simulation – Demonstrate vector and matrix operations using NumPy and apply to qubit states. (Tools: NumPy, Matrix algebra)
9. Quantum Teleportation Protocol – Implement teleportation using a 3-qubit circuit. (Tools: CNOT, CX, Measure, Conditional Gates)
10. Deutsch–Jozsa Algorithm – Identify balanced or constant function using quantum speedup. (Tools: Quantum Algorithms)
11. Grover’s Search Algorithm – Implement a 2-qubit search problem using Oracle and Diffusion operators. (Tools: Oracle design, Quantum search)
12. Running on Real Quantum Hardware – Execute selected circuits on ibmq_qasm_simulator and an IBM Quantum backend. (Tools: IBMQ provider, real-device execution)
SEMESTER-IV
COURSE 10: Quantum Computing Algorithms
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce the principles and mathematical foundations underlying quantum algorithms.
2. To explore the design and implementation of core quantum algorithms such as Shor’s and
Grover’s.
3. To apply quantum optimization and machine learning techniques for solving complex
computational problems.
4. To provide practical exposure through simulation frameworks like Qiskit, Cirq, and
PennyLane.

Course Learning Outcomes:


Upon successful completion of this course, students will be able to:
1. Explain the theoretical principles of major quantum algorithms.
2. Implement and simulate quantum algorithms using Python-based libraries.
3. Differentiate between classical and quantum optimization strategies.
4. Analyze the computational complexity and efficiency of quantum algorithms.
5. Apply quantum computing concepts to real-world domains such as cryptography,
optimization, and AI.

Unit-I: Introduction to Quantum Computing Algorithms


Quantum computation fundamentals: qubits, superposition, and entanglement, Quantum operators
and circuits overview, Quantum algorithm structure and complexity classes (BQP, BQNP),
Classical vs Quantum approaches in data processing, Overview of Quantum Information
Processing Framework
Unit-II: Foundational Quantum Algorithms
Deutsch–Jozsa algorithm: balanced vs constant functions, Simon’s algorithm: period-finding and
oracle problem, Shor’s algorithm: integer factorization and quantum Fourier transform, Grover’s
search algorithm: unstructured search and amplitude amplification
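Grover's amplitude amplification, listed above, can be simulated classically for N = 4, where a single oracle-plus-diffusion iteration is exact (the marked index is an arbitrary illustrative choice):

```python
# One Grover iteration on N = 4 items: the oracle flips the sign of the
# marked amplitude, and the diffusion step inverts all amplitudes about
# their mean, boosting the marked item's probability from 1/4 to 1.
import math

N, marked = 4, 2
state = [1 / math.sqrt(N)] * N           # uniform superposition

for _ in range(1):                       # ~ (pi/4) * sqrt(N) iterations
    state[marked] *= -1                  # oracle: mark the target
    mean = sum(state) / N                # diffusion: invert about the mean
    state = [2 * mean - a for a in state]

print([round(a * a, 3) for a in state])  # [0.0, 0.0, 1.0, 0.0]
```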
Unit-III: Quantum Optimization Algorithms
Introduction to quantum optimization and combinatorial problems, Approximate optimization
algorithms (QAOA, SDP), Quantum semidefinite programming, Quantum NP problems and
quantum annealing
Unit-IV: Advanced Quantum Algorithms and Eigen Solvers
Quantum least squares fitting, Quantum eigenvalue and eigenvector solvers, Quantum
semidefinite programming methods, Variational Quantum Eigensolver (VQE) concept
Unit-V: Quantum AI and Hybrid Algorithms
Quantum neural network and associative memory, Quantum deep learning and Boltzmann
machine, Quantum search and pattern recognition, Hybrid quantum-classical models for machine
learning

Text Books:
1. Bhagvan Kommadi, “Quantum Computing Solutions: Real-World Algorithms,” Apress,
2020.
2. E. Rieffel and W. Polak, “Programming Quantum Computers: Essential Algorithms and Code
Samples,” O’Reilly Media, Sebastopol, CA, USA, 2020.
3. D. Wang and H. Wang, “Quantum Computing Algorithms,” Packt Publishing, Birmingham, U.K.,
2022.
Recommended Tools:
• IBM Qiskit, Google Cirq, Microsoft Q#, PennyLane, and TensorFlow Quantum
SEMESTER-IV
COURSE 10: Quantum Computing Algorithms
Practical Credits: 1 2 hrs/week

List of Experiments:

1. Introduction to Quantum Circuits and Gates: Construct basic circuits using Hadamard, Pauli-X/Y/Z, and CNOT gates; observe superposition and entanglement. (Tools: Qiskit / Cirq)
2. Simulation of Deutsch–Jozsa Algorithm: Implement the algorithm to distinguish between constant and balanced functions using quantum oracles. (Tools: Qiskit)
3. Implementation of Simon’s Algorithm: Find the hidden bit string using quantum parallelism; demonstrate quantum advantage. (Tools: Qiskit / PennyLane)
4. Shor’s Algorithm for Integer Factorization: Perform modular exponentiation and use Quantum Fourier Transform (QFT) for factoring numbers. (Tools: Qiskit)
5. Grover’s Search Algorithm: Implement Grover’s algorithm for unstructured database search; compare with classical search complexity. (Tools: Cirq / Qiskit)
6. Quantum Approximate Optimization Algorithm (QAOA): Solve simple combinatorial optimization problems such as MaxCut or the Traveling Salesman Problem. (Tools: PennyLane / Qiskit)
7. Variational Quantum Eigensolver (VQE): Use VQE to estimate the ground-state energy of a molecule or matrix Hamiltonian. (Tools: PennyLane / TensorFlow Quantum)
8. Quantum Neural Network (QNN): Construct a basic QNN with parameterized quantum circuits; evaluate for simple pattern recognition tasks. (Tools: TensorFlow Quantum)
9. Quantum Support Vector Machine (QSVM): Implement a quantum kernel-based classifier for binary classification; compare with classical SVM. (Tools: PennyLane / Qiskit)
10. Quantum Cryptography and Security Experiment: Simulate Quantum Key Distribution (QKD) using the BB84 protocol and analyze post-quantum security. (Tools: Qiskit / Microsoft Q#)
SEMESTER-V
COURSE 11: Quantum Computing Applications
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce the foundational principles and integration of quantum computing with
emerging technologies.
2. To explore quantum computing applications across domains such as AI, IoT, cybersecurity,
and smart infrastructure.
3. To analyze how quantum algorithms revolutionize optimization, cryptography, and data
analytics.
4. To understand the role of quantum computing in healthcare, finance, and sustainable
systems.
5. To develop problem-solving skills using quantum-inspired and hybrid computing
approaches.
Course Outcomes:
After completing the course, learners will be able to:
1. Explain the key principles and frameworks that enable quantum computing applications.
2. Demonstrate understanding of how quantum computing enhances AI, IoT, and machine
learning systems.
3. Apply quantum cryptography and optimization concepts in secure and intelligent systems.
4. Analyze the benefits and challenges of quantum computing in industrial, healthcare, and
financial sectors.
5. Design application-oriented case studies integrating quantum computing with real-world
technologies.
Unit I – Fundamentals and Integration Applications (9 Periods)
Introduction to quantum computing and quantum mechanics primer, Quantum gates, qubits,
superposition, entanglement, and teleportation, Quantum computing architectures, processors,
and software tools (Qiskit, Cirq), Integration applications: role of quantum computing in
Industry 4.0 and Education 5.0 ecosystems.
Unit II – Quantum Computing in Artificial Intelligence (9 Periods)
Role of quantum computing in AI and deep learning, Quantum neural networks and quantum
machine learning models, Quantum annealing and variational quantum circuits, Explainable
AI and hybrid quantum-classical ML approaches.
Unit III – Quantum Computing for Cybersecurity and Cryptography (9 Periods)
Quantum cryptography and quantum key distribution (QKD), Quantum-safe and post-quantum
encryption, Blockchain and distributed ledger integration with quantum systems, Quantum
resilience for data privacy and digital trust.
Unit IV – Quantum Computing in FinTech and Healthcare (9 Periods)
FinTech applications: quantum-enhanced optimization and risk modeling, Quantum data
analytics in financial forecasting, Quantum applications in precision medicine and drug
discovery, AI-aided data analytics and IoT-driven healthcare ecosystems.
Unit V – Quantum Computing for Smart Infrastructure and Sustainability (9 Periods)
Smart grids, IoT-enabled energy systems, and renewable energy optimization, Smart city
management: waste reduction, transportation, and disaster response using quantum models,
Sustainable computing and green quantum technologies, Case studies: Quantum computing
applications in smart logistics and climate modeling.

Reference Book:

1. Khang, A. (2023). Applications and Principles of Quantum Computing. IGI Global Press.
DOI:10.4018/979-8-3693-1168-4
SEMESTER-V
COURSE 11: Quantum Computing Applications
Practical Credits: 1 2 hrs/week

List of Experiments:

1. Introduction to Qiskit and Quantum Circuit Design: Install Qiskit; create and simulate basic quantum circuits (Hadamard, Pauli, CNOT gates). (Outcome: understand quantum gates, qubits, and Bloch sphere representation)
2. Quantum Superposition and Entanglement Simulation: Create two- and three-qubit systems using superposition and entanglement operators. (Outcome: visualize entanglement and verify measurement correlations)
3. Implementation of Grover’s Search Algorithm: Simulate a quantum search algorithm and compare complexity with classical search. (Outcome: learn how quantum speedup works for unstructured data search)
4. Quantum Machine Learning using Cirq/Qiskit ML: Build a simple quantum classifier using Qiskit Machine Learning or TensorFlow Quantum. (Outcome: implement and analyze a hybrid quantum-classical machine learning model)
5. Quantum Cryptography – QKD Protocol Simulation: Demonstrate the BB84 protocol for secure quantum key distribution using a Python simulator. (Outcome: learn principles of quantum key exchange and post-quantum cryptography)
6. Quantum Optimization for Financial Data (QAOA): Use the Quantum Approximate Optimization Algorithm for portfolio or risk optimization. (Outcome: apply quantum optimization to financial applications (FinTech))
7. Quantum IoT Data Encryption and Communication: Simulate a small IoT network where quantum cryptography enhances data security. (Outcome: explore how quantum technologies improve IoT security and performance)
8. Quantum Smart City Simulation (Energy or Traffic Optimization): Model a quantum-assisted smart grid or traffic system optimization using hybrid models. (Outcome: understand how quantum computing supports sustainability and infrastructure optimization)
SEMESTER-V
COURSE 12A: Computer Networks
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce the fundamental concepts and architecture of computer networks.
2. To understand various network models, protocols, and communication layers.
3. To study data transmission, error detection, and flow control mechanisms.
4. To familiarize students with network addressing, routing, and security concepts.
5. To provide hands-on knowledge of emerging networking technologies and applications.
Course Learning Outcomes:
After successful completion of this course, the students will be able to:
1. Explain the basic concepts, components, and architectures of computer networks.
2. Describe various network models (OSI and TCP/IP) and their layer functionalities.
3. Demonstrate knowledge of data transmission techniques, switching, and routing
algorithms.
4. Analyze network performance and apply error detection and flow control methods.
5. Identify network security issues and understand modern technologies like IoT and 5G.

Unit I: Introduction to Computer Networks (9 Periods)


Definition and evolution of Computer Networks, Network Goals, Applications, and
Topologies, Network Hardware and Software components, Types of Networks: LAN, MAN,
WAN, PAN, Network Models: OSI Reference Model and TCP/IP Model comparison, Concept
of Protocols, Interfaces, and Standards

Unit II: Data Transmission and Physical Layer (9 Periods)


Data and Signals: Analog and Digital transmission, Transmission Media: Guided (Twisted
Pair, Coaxial, Fiber Optic) and Unguided Media, Switching Techniques: Circuit, Packet, and
Message Switching, Multiplexing: FDM, TDM, WDM, Error Detection and Correction:
Parity, CRC, Hamming Code, Performance Metrics: Bandwidth, Throughput, Latency
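The error-detection schemes above can be sketched over bit strings in Python (the generator polynomial below is the classic textbook example, not a specific standard's parameters):

```python
# Even parity and CRC over bit strings. The CRC remainder is computed by
# mod-2 (XOR) long division of the data (with appended zeros) by the
# generator; the receiver recomputes it to detect corruption.

def even_parity_bit(bits):
    return bits.count('1') % 2          # append 1 iff the count of ones is odd

def crc_remainder(data, divisor):
    data = list(data + '0' * (len(divisor) - 1))   # append zero bits
    for i in range(len(data) - len(divisor) + 1):
        if data[i] == '1':              # XOR the divisor in (mod-2 division)
            for j, d in enumerate(divisor):
                data[i + j] = str(int(data[i + j]) ^ int(d))
    return ''.join(data[-(len(divisor) - 1):])

print(even_parity_bit('1011'))               # 1
print(crc_remainder('1101011011', '10011'))  # '1110'
```

The sender transmits the data followed by the remainder; dividing the received frame by the same generator yields zero when no detectable error occurred.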

Unit III: Data Link Layer and MAC Sub-layer (9 Periods)


Framing and Flow Control: Stop and Wait, Sliding Window Protocols, Error Control
Mechanisms: ARQ, Media Access Control: CSMA/CD, CSMA/CA, LAN Technologies:
Ethernet, Token Ring, Wireless LAN (IEEE 802.11), VLANs and Concept of Switching
(Layer 2 Switches)

Unit IV: Network and Transport Layers (9 Periods)


Logical Addressing: IPv4, IPv6 structure and addressing, Routing Concepts and Algorithms:
Distance Vector, Link State, Internetworking Devices: Routers, Gateways, Firewalls,
Transport Layer Services: TCP and UDP protocols, Congestion Control and Quality of Service
(QoS)

Unit V: Application Layer and Network Security (9 Periods)


Application Layer Protocols: HTTP, FTP, SMTP, DNS, DHCP, SNMP, Socket Programming
Basics, Network Security Principles: Cryptography, Authentication, Firewalls, VPN,
Introduction to Cloud Networking, IoT Networking, and 5G Communication, Emerging
Trends: SDN, Network Virtualization, and Edge Computing

Text Books:
1. Andrew S. Tanenbaum, Computer Networks, 5th Edition, Pearson Education.
2. Behrouz A. Forouzan, Data Communications and Networking, 5th Edition, McGraw Hill.
3. William Stallings, Data and Computer Communications, 10th Edition, Pearson.
Reference Books:
1. James F. Kurose & Keith W. Ross, Computer Networking: A Top-Down Approach, 7th
Edition, Pearson.
2. Larry L. Peterson & Bruce S. Davie, Computer Networks: A Systems Approach, Morgan
Kaufmann.
3. Natalia Olifer & Victor Olifer, Computer Networks: Principles, Technologies and
Protocols for Network Design, Wiley.
SEMESTER-V
COURSE 12A: Computer Networks
Practical Credits: 1 2 hrs/week

Expt. 1: Study of basic network components and topologies (bus, star, ring, mesh)
   Tools: Cisco Packet Tracer / GNS3
   Outcome: Understand physical network setup and topological arrangements.
Expt. 2: Simulation of data transmission using guided and unguided media
   Tools: Packet Tracer / ns-3
   Outcome: Demonstrate basic data communication and media differences.
Expt. 3: Configuration of IP addressing and subnetting in a LAN
   Tools: Packet Tracer / Mininet
   Outcome: Learn IPv4 and IPv6 address configuration and subnet calculation.
Expt. 4: Study and analysis of OSI and TCP/IP layer interactions using Wireshark
   Tools: Wireshark
   Outcome: Identify headers and fields in Ethernet, IP, TCP, and UDP packets.
Expt. 5: Implementation of error detection techniques: Parity check, CRC, Hamming code
   Tools: Python / C program
   Outcome: Develop algorithms for error detection and correction.
Expt. 6: Simulation of flow control protocols – Stop-and-Wait and Sliding Window
   Tools: Python
   Outcome: Understand flow control mechanisms in the Data Link Layer.
Expt. 7: Study and configuration of MAC layer protocols (CSMA/CD and CSMA/CA)
   Tools: ns-3 / Mininet
   Outcome: Analyze contention-based medium access control methods.
Expt. 8: Configuration of a small LAN with multiple switches and routers
   Tools: Cisco Packet Tracer
   Outcome: Understand LAN design, VLAN configuration, and inter-VLAN routing.
Expt. 9: Implementation of static and dynamic routing protocols (RIP, OSPF)
   Tools: Packet Tracer / GNS3
   Outcome: Study packet forwarding and routing algorithm behavior.
Expt. 10: Socket Programming in Python – Client-Server chat application
   Tools: Python 3
   Outcome: Apply TCP/UDP concepts to create communication applications.
Expt. 11: Simulation of congestion control (TCP slow start, AIMD)
   Tools: ns-3
   Outcome: Analyze congestion avoidance mechanisms in the transport layer.
Expt. 12: Packet capture and analysis of HTTP, FTP, DNS, and SMTP protocols
   Tools: Wireshark
   Outcome: Understand application layer protocol headers and operations.
Expt. 13: Demonstration of basic firewall and port filtering configuration
   Tools: Ubuntu iptables / pfSense
   Outcome: Implement basic network security measures.
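The parity and CRC portion of Expt. 5 can be outlined in a few lines of Python. The sketch below is illustrative only, not a prescribed lab solution; the generator polynomial x^3 + x + 1 (bits 1011) is an arbitrary example choice.

```python
# Minimal sketch of two error-detection schemes from Expt. 5.
# The divisor 1011 (x^3 + x + 1) is an illustrative example polynomial.

def even_parity_bit(bits):
    """Return the bit that makes the total number of 1s even."""
    return sum(bits) % 2

def crc_remainder(data_bits, divisor_bits):
    """Compute the CRC remainder by binary long division (XOR)."""
    n = len(divisor_bits) - 1
    padded = list(data_bits) + [0] * n       # append n zero bits
    for i in range(len(data_bits)):
        if padded[i] == 1:                   # divide only where the leading bit is 1
            for j, d in enumerate(divisor_bits):
                padded[i + j] ^= d
    return padded[-n:]                       # last n bits are the remainder

data = [1, 0, 1, 1, 0, 1]
parity = even_parity_bit(data)
rem = crc_remainder(data, [1, 0, 1, 1])
print(parity, rem)
```

Appending the remainder to the data word makes the whole codeword divisible by the generator, which is exactly the check the receiver repeats.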
SEMESTER-V
COURSE 12B: Mathematics for Machine Learning
Theory Credits: 3 3 hrs/week

Course Objectives:
By the end of the course, students will be able to:
1. Understand key mathematical foundations that underpin modern machine learning
algorithms.
2. Apply concepts of linear algebra and vector calculus to model and analyze data
transformations.
3. Utilize probability and statistics to handle uncertainty and make predictions in ML models.
4. Explore optimization techniques essential for training machine learning models.
5. Integrate mathematical reasoning to implement and evaluate machine learning algorithms.
Course Learning Outcomes:
After successful completion of this course, students will be able to:
1. Demonstrate an understanding of vector spaces, eigenvalues, and matrix operations in ML.
2. Apply differential calculus and gradient-based optimization methods in model training.
3. Use probability distributions and statistical inference for data-driven decision-making.
4. Formulate and solve optimization problems related to regression and classification tasks.
5. Mathematically interpret the internal working of machine learning algorithms such as PCA,
SVM, and neural networks.
Unit I: Linear Algebra for Machine Learning (10 Periods)
Vectors, matrices, and tensor notation, Linear transformations and matrix operations,
Determinants, rank, inverse, and orthogonality, Eigenvalues and eigenvectors, Applications:
Dimensionality reduction (PCA), covariance matrix.

Unit II: Multivariable Calculus and Optimization (9 Periods)


Functions of several variables, limits, and continuity, Partial derivatives, gradient, Jacobian,
Hessian, Taylor series expansion and approximation, Gradient descent, stochastic gradient
descent, Convex functions and constrained optimization

Unit III: Probability and Random Variables (9 Periods)


Basics of probability theory: conditional probability, Bayes theorem, Random variables, PMF,
PDF, CDF, Expectation, variance, covariance, correlation, Common distributions: Bernoulli,
Binomial, Gaussian, Exponential, Law of large numbers, central limit theorem
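The Bayes theorem entry above can be grounded with a short worked computation. The disease-screening numbers below (1% prevalence, 95% sensitivity, 90% specificity) are illustrative assumptions, not syllabus content.

```python
# Worked Bayes-theorem example: P(disease | positive test).
# Prevalence, sensitivity, and specificity are made-up illustrative values.
p_d = 0.01            # prior P(disease)
sens = 0.95           # P(positive | disease)
spec = 0.90           # P(negative | no disease)

p_pos = sens * p_d + (1 - spec) * (1 - p_d)   # law of total probability
posterior = sens * p_d / p_pos                # Bayes' theorem
print(round(posterior, 3))
```

Even with a fairly accurate test, the low prior keeps the posterior below 9%, a standard illustration of why base rates matter.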

Unit IV: Statistics and Data Analysis (9 Periods)


Sampling, estimation, and hypothesis testing, Confidence intervals, p-values, chi-square test,
Correlation and regression analysis, Maximum likelihood estimation (MLE), Feature scaling
and normalization techniques

Unit V: Advanced Mathematical Tools for Machine Learning (8 Periods)


Vector spaces and inner product spaces, Singular Value Decomposition (SVD) and Matrix
Factorization, Information theory: Entropy, cross-entropy, KL divergence, Fourier and
Laplace transforms in ML signal processing. Numerical methods for large-scale ML
computations
Recommended Textbooks:
1. Deisenroth, M. P., Faisal, A. A., & Ong, C. S. (2020). Mathematics for Machine Learning.
Cambridge University Press.
2. Bishop, C. M. (2006). Pattern Recognition and Machine Learning. Springer.
3. Strang, G. (2019). Linear Algebra and Learning from Data. Wellesley-Cambridge Press.
4. Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press.
5. Ross, S. M. (2014). Introduction to Probability and Statistics for Engineers and Scientists.
Academic Press.
SEMESTER-V
COURSE 12B: Mathematics for Machine Learning
Practical Credits: 1 2 hrs/week

List of Practicals

1. Matrix operations using NumPy and visualization of transformations


2. Implementation of PCA for dimensionality reduction
3. Visualization of gradient descent optimization
4. Probability distribution fitting using real-world datasets
5. Naïve Bayes classifier on categorical data
6. Linear regression using least squares and gradient descent
7. Logistic regression with sigmoid function and cost minimization
8. Implementation of MLE on sample datasets
9. Correlation and covariance analysis
10. Entropy and information gain computation for decision trees
11. SVD and matrix factorization on recommendation data
12. Visualization of loss functions and convergence curves
13. Comparison of optimization algorithms (SGD, Adam, RMSProp)
14. Monte Carlo simulation for probabilistic estimation
15. Implementation of basic neural network mathematics manually
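Practicals 3 and 6 both rest on the gradient-descent update rule; a minimal NumPy sketch (synthetic data and illustrative hyperparameters, not a prescribed solution) is:

```python
import numpy as np

# Fit y = w*x + b by batch gradient descent on mean squared error.
# The data are synthetic: true w = 2, b = 1, with small Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.01, 50)

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    err = w * x + b - y                 # residuals
    w -= lr * 2 * np.mean(err * x)      # dMSE/dw
    b -= lr * 2 * np.mean(err)          # dMSE/db
print(round(w, 2), round(b, 2))
```

The loop converges toward the generating coefficients w = 2 and b = 1; plotting the MSE per iteration gives the convergence-curve visualization asked for in Practical 12.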
SEMESTER-V
COURSE 13A: Classical Cryptography & Network Security
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce the evolution, concepts, and principles of classical and modern cryptography.
2. To explain symmetric and asymmetric cryptographic techniques for securing data.
3. To analyze various network models, protocols, and cryptographic mechanisms in network
security.
4. To study web and email security protocols such as TLS, SSL, IPsec, and PGP.
5. To develop the ability to design secure communication systems resistant to common
attacks.
Course Outcomes:
After completion of the course, students will be able to:
1. Demonstrate understanding of classical ciphers and their cryptanalytic techniques.
2. Compare and contrast symmetric and asymmetric encryption algorithms.
3. Explain the design and function of network security architectures and protocols.
4. Implement cryptographic algorithms to achieve confidentiality, integrity, and
authentication.
5. Apply security mechanisms in web, email, and VPN communications effectively.

Unit I – Fundamentals of Cryptography and Classical Systems (9 Periods)


Introduction to Cryptography and Security Goals: Confidentiality, Integrity, Authenticity,
Non-repudiation, Cryptographic Terminologies: Plain text, Cipher text, Encryption,
Decryption, Key, Keystream, Classical Cryptography: Substitution and Transposition
Ciphers – Caesar, Vigenère, Playfair, Hill; Enigma Machine and Zimmermann Telegram
case study. Cryptanalysis: Frequency analysis, brute-force attacks, limitations of classical
ciphers.
Unit II – Symmetric Cryptography (9 Periods)
Concept of Symmetric Key Cryptography, Stream Ciphers vs Block Ciphers, Algorithms:
DES, Triple DES, IDEA, AES – Architecture and Operation, Modes of Operation: ECB, CBC,
CFB, OFB, CTR, Key Management and Distribution Issues.
Unit III – Asymmetric Cryptography and Hash Functions (9 Periods)
Principles of Public Key Cryptography, RSA Algorithm: Key generation, encryption, and
decryption, Diffie-Hellman Key Exchange, Elliptic Curve Cryptography (ECC) basics,
Message Authentication Codes (MAC), Digital Signatures, and Hash Functions (SHA-1, SHA-
2, SHA-3).
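The Diffie-Hellman exchange listed above can be run by hand with the standard toy textbook parameters p = 23, g = 5 (deliberately tiny and insecure; real deployments use groups of 2048 bits or more). The private exponents below are illustrative choices.

```python
# Toy Diffie-Hellman exchange with textbook-sized parameters (insecure).
p, g = 23, 5
a, b = 6, 15                 # Alice's and Bob's private keys (illustrative)

A = pow(g, a, p)             # Alice publishes g^a mod p
B = pow(g, b, p)             # Bob publishes g^b mod p

shared_alice = pow(B, a, p)  # (g^b)^a mod p
shared_bob = pow(A, b, p)    # (g^a)^b mod p
print(shared_alice == shared_bob, shared_alice)
```

Both sides arrive at the same secret without ever transmitting it; an eavesdropper who sees only A and B must solve a discrete logarithm, which is easy at this size but infeasible at cryptographic sizes.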
Unit IV – Network Security Models and Protocols (9 Periods)
OSI & TCP/IP Models – Structure, layers, and encapsulation/decapsulation process, Security
services at different layers, Network Security Components: Firewalls, IDS/IPS, VPN, NAT,
and Proxy Servers, Web Security Protocols: SSL, TLS (v1.2 & v1.3), HTTPS – Operation and
Handshake Process.
Unit V – Applied Security: VPN, Email, and Web Applications (9 Periods)
IPsec and its modes of operation – AH, ESP, and IPComp, Virtual Private Networks (VPN):
Remote access and site-to-site VPNs, Email Security: PGP, S/MIME, MIME, Web
Application Security and OWASP Overview.
Textbook:
Sandip Dholakia (2025). Modern Cryptography: Securing Data. Rheinwerk Publishing Inc.,
Boston (MA).
Suggested Reference Books:
1. William Stallings – Cryptography and Network Security: Principles and Practice, Pearson.
2. Behrouz A. Forouzan – Cryptography and Network Security, McGraw Hill.
3. Christof Paar & Jan Pelzl – Understanding Cryptography: From Established Symmetric
and Asymmetric Ciphers to Post-Quantum Algorithms, Springer, 2024.
SEMESTER-V
COURSE 13A: Classical Cryptography & Network Security
Practical Credits: 1 2 hrs/week

Exp. 1: Implementation of Substitution Ciphers
   Objective: Implement Caesar and Vigenère cipher algorithms for encryption and decryption.
   Tools: Python / CrypTool 2
   Outcome: Understand substitution encryption and key dependency.
Exp. 2: Implementation of Transposition Ciphers
   Objective: Implement Rail Fence and Columnar Transposition ciphers.
   Tools: Python / CrypTool 2
   Outcome: Differentiate substitution and transposition techniques.
Exp. 3: Cryptanalysis of Classical Ciphers
   Objective: Perform frequency analysis to break Caesar and monoalphabetic ciphers.
   Tools: Python / CrypTool 2
   Outcome: Learn basic code-breaking and cryptanalysis concepts.
Exp. 4: Simulation of DES Algorithm
   Objective: Encrypt and decrypt data using the DES algorithm.
   Tools: Python (PyCryptodome) / CrypTool
   Outcome: Understand Feistel rounds and key scheduling.
Exp. 5: Implementation of AES Algorithm
   Objective: Perform AES-128 and AES-256 encryption on text/files.
   Tools: Python / OpenSSL
   Outcome: Study modern symmetric encryption structures.
Exp. 6: Demonstration of Block Cipher Modes
   Objective: Compare ECB, CBC, CFB, OFB, CTR modes of operation.
   Tools: Python / CrypTool 2
   Outcome: Analyze diffusion and error propagation effects.
Exp. 7: Implementation of RSA Algorithm
   Objective: Generate keys, encrypt and decrypt data using RSA.
   Tools: Python / OpenSSL
   Outcome: Learn the public-key encryption mechanism.
Exp. 8: Diffie–Hellman Key Exchange
   Objective: Simulate secure key exchange between two users.
   Tools: Python / Manual Calculation
   Outcome: Understand shared secret generation in insecure channels.
Exp. 9: Hash Functions and Message Authentication
   Objective: Compute MD5, SHA-1, SHA-256, SHA-3 hashes; verify integrity with HMAC.
   Tools: Python / HashCalc
   Outcome: Learn hashing and authentication in data protection.
Exp. 10: Analyze HTTPS using Wireshark
   Objective: Capture HTTPS packets; analyze TLS/SSL handshake and cipher suites.
   Tools: Wireshark
   Outcome: Understand secure data transmission protocols.
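As a warm-up for Exp. 1, a minimal Caesar cipher in plain Python (an illustrative sketch; the lab itself may equally use CrypTool 2):

```python
import string

# Caesar cipher over the 26 uppercase letters; other characters pass through.
def caesar(text, shift):
    out = []
    for ch in text.upper():
        if ch in string.ascii_uppercase:
            out.append(chr((ord(ch) - 65 + shift) % 26 + 65))
        else:
            out.append(ch)
    return "".join(out)

cipher = caesar("ATTACK AT DAWN", 3)
plain = caesar(cipher, -3)   # decryption is a shift in the opposite direction
print(cipher, "|", plain)
```

Because there are only 25 useful shifts, brute force trivially breaks this cipher, which motivates the frequency-analysis work of Exp. 3.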
SEMESTER-V
COURSE 13B: Python for Machine Learning
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce the principles and workflow of machine learning using Python and Scikit-
Learn.
2. To understand supervised and unsupervised learning algorithms and their mathematical
foundations.
3. To apply model selection, feature engineering, and evaluation techniques for predictive
analytics.
4. To develop the ability to preprocess and visualize datasets for real-world problem-solving.
5. To implement and optimize ML models using Scikit-Learn for practical and research
applications.

Course Outcomes:
After successful completion of the course, students will be able to:
1. Demonstrate understanding of machine learning fundamentals and workflows using Scikit-
Learn.
2. Apply classification, regression, and clustering algorithms to real datasets.
3. Evaluate model performance using appropriate metrics and validation techniques.
4. Design preprocessing and feature engineering pipelines for ML projects.
5. Build, tune, and deploy predictive models for real-world applications.

Unit I:
Introduction to Machine Learning and Scikit-Learn
Definition of ML – Applications – ML Pipeline – Overview of Scikit-Learn architecture –
Installing and using Scikit-Learn – Data loading, preprocessing, and splitting – Feature scaling –
Data visualization using Matplotlib and Seaborn.
Unit II:
Supervised Learning Algorithms
Linear Regression – Polynomial Regression – Logistic Regression – Decision Trees – Random
Forests – K-Nearest Neighbors – Support Vector Machines – Performance metrics (accuracy,
precision, recall, F1-score).

Unit III:
Unsupervised Learning and Dimensionality Reduction
K-Means Clustering – Hierarchical Clustering – DBSCAN – PCA – t-SNE – Evaluation of
clustering results – Applications of unsupervised learning in anomaly detection and customer
segmentation.
Unit IV:
Model Selection and Optimization
Train-test split – Cross-validation – Bias-variance tradeoff – Hyperparameter tuning
(GridSearchCV, RandomizedSearchCV) – Regularization (L1, L2) – Ensemble methods
(Bagging, Boosting, AdaBoost, Gradient Boosting).
Unit V:
Practical Applications and Deployment
Building complete ML projects – Feature engineering pipelines – Saving and loading models
(joblib, pickle) – Case studies (house price prediction, image classification, text sentiment
analysis) – Model deployment basics (Flask, Streamlit).
Text Books:

1. Machine Learning with Scikit-Learn by Parag Saxena, 2024, ISBN: 978-8197223945


2. Python Data Science Cookbook (2025) by Taryn Voska
SEMESTER-V
COURSE 13B: Python for Machine Learning
Practical Credits: 1 2 hrs/week

Exp. 1: Installation and setup of Python, Scikit-Learn, Pandas, NumPy, and Matplotlib
   Objective: To familiarize students with the ML programming environment and verify successful installation.
Exp. 2: Loading and preprocessing datasets using Scikit-Learn
   Objective: To learn dataset loading, cleaning, encoding categorical data, and handling missing values.
Exp. 3: Data visualization and feature correlation analysis using Seaborn
   Objective: To visualize dataset distributions and relationships between variables.
Exp. 4: Implementation of Linear Regression using Scikit-Learn
   Objective: To build and evaluate a simple regression model and interpret model coefficients.
Exp. 5: Polynomial Regression for non-linear data fitting
   Objective: To demonstrate curve fitting using polynomial regression.
Exp. 6: Logistic Regression for binary classification
   Objective: To classify data into binary categories and evaluate accuracy and the confusion matrix.
Exp. 7: Decision Tree Classifier on a sample dataset
   Objective: To understand tree-based learning and interpret decision boundaries.
Exp. 8: Random Forest and feature importance analysis
   Objective: To implement ensemble learning and interpret feature importance rankings.
Exp. 9: K-Nearest Neighbors (KNN) Classification
   Objective: To explore distance-based classification and parameter tuning (K-value).
Exp. 10: K-Means Clustering and visualization
   Objective: To perform unsupervised learning and visualize clusters using PCA plots.
Exp. 11: Principal Component Analysis (PCA) for dimensionality reduction
   Objective: To reduce high-dimensional data and visualize principal components.
Exp. 12: Hyperparameter tuning using GridSearchCV
   Objective: To optimize model performance through systematic parameter search.
Exp. 13: Model evaluation using confusion matrix, precision, recall, F1-score, and ROC curve
   Objective: To compare model metrics for better understanding of classification performance.
Exp. 14: Saving and loading trained models using joblib/pickle
   Objective: To demonstrate model persistence and reusability.
Exp. 15: Mini Project: Build and deploy an ML model using Streamlit
   Objective: To design, train, evaluate, and deploy a real-world predictive model as a mini project.
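The clustering step of Exp. 10 can be sketched directly in NumPy so the assignment/update loop is visible; in the lab itself `sklearn.cluster.KMeans` would be used. The two well-separated synthetic blobs below are illustrative data.

```python
import numpy as np

# Bare-bones k-means: alternate assignment and centroid-update steps.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (30, 2)),    # blob around (0, 0)
                 rng.normal(5, 0.3, (30, 2))])   # blob around (5, 5)

centers = pts[[0, -1]].copy()                    # crude init: one point per blob
for _ in range(10):
    d = np.linalg.norm(pts[:, None] - centers[None], axis=2)
    labels = d.argmin(axis=1)                    # assignment step
    centers = np.array([pts[labels == k].mean(axis=0) for k in range(2)])
print(np.round(centers))
```

On separated blobs the loop converges in a couple of iterations; degenerate initializations (both centers in one blob) are one reason the Scikit-Learn version defaults to the k-means++ seeding.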
SEMESTER-V
COURSE 14A: Quantum Cryptography Protocols
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To introduce the fundamental principles of quantum mechanics as applied to information
security.
2. To understand quantum key distribution (QKD) and its role in secure communication.
3. To explore various quantum cryptographic protocols and their implementation
mechanisms.
4. To analyze the security advantages and challenges of quantum protocols over classical
systems.
5. To study post-quantum cryptographic approaches and the integration of quantum and
classical systems.
Course Outcomes:
After successful completion of this course, students will be able to:
1. Explain the principles of quantum mechanics relevant to cryptography.
2. Describe and compare different quantum key distribution (QKD) protocols such as BB84,
B92, and E91.
3. Analyze the security and performance of quantum cryptographic systems.
4. Implement basic simulations of quantum protocols using open-source quantum computing
frameworks.
5. Evaluate hybrid and post-quantum cryptographic schemes for secure communication.
Unit I:
Introduction to Quantum Cryptography
Overview of Classical vs Quantum Cryptography, Basic Postulates of Quantum Mechanics, Quantum Bits
and Superposition, Quantum Entanglement and Measurement Principles, Quantum No-Cloning Theorem
and Its Implications
Unit II:
Quantum Key Distribution (QKD)
Concept and Need for QKD, BB84 Protocol – Theory and Steps, B92 Protocol – Simplified QKD, E91
Entanglement-Based QKD Protocol, Security Analysis and Attack Models (Intercept-Resend, Photon
Number Splitting, etc.)
Unit III:
Advanced Quantum Cryptographic Protocols
Quantum Secret Sharing (QSS), Quantum Digital Signatures (QDS), Quantum Bit Commitment and
Oblivious Transfer, Quantum Authentication and Quantum Teleportation in Security, Device-Independent
Quantum Cryptography
Unit IV:
Implementation & Simulation of Quantum Protocols
Quantum Cryptography Hardware Components (Photon Sources, Detectors, Polarizers), Simulation Tools:
Qiskit, QuTiP, and IBM Quantum Experience, Hands-on: Simulating BB84 and E91 Protocols,
Error Correction and Privacy Amplification Techniques
Unit V:
Future Directions and Post-Quantum Security
Post-Quantum Cryptography vs Quantum Cryptography, Lattice-based, Code-based, and Multivariate
Cryptosystems, Quantum Network Security and Quantum Internet, Standardization Efforts and NIST PQC
Initiatives, Research Trends and Ethical Considerations in Quantum Security

Textbooks:

1. Nielsen, M.A. & Chuang, I.L. – Quantum Computation and Quantum Information,
Cambridge University Press.
2. Imre, S. & Balazs, F. – Quantum Computing and Communications: An Engineering
Approach, Wiley.
3. Gisin, N. et al. – Quantum Cryptography, Reviews of Modern Physics.
4. Menezes, A. – Handbook of Applied Cryptography, CRC Press.

Online Tools & Platforms:

• IBM Quantum Experience (Qiskit)


• Microsoft Quantum Development Kit (Q#)
• QuTiP (Quantum Toolbox in Python)
SEMESTER-V
COURSE 14A: Quantum Cryptography Protocols
Practical Credits: 1 2 hrs/week

Exp. 1: Introduction to Quantum Computing Environment
   Objective: To set up the Qiskit / QuTiP environment and understand quantum circuit basics.
   Tools: IBM Quantum Experience, Qiskit
Exp. 2: Implementation of Qubits and Quantum Gates
   Objective: To implement quantum states, apply gates (X, H, Z), and observe superposition effects.
   Tools: Qiskit Notebook
Exp. 3: Simulation of Qubit Measurement and No-Cloning Theorem
   Objective: To understand measurement effects and verify the no-cloning principle.
   Tools: Qiskit / QuTiP
Exp. 4: Implementation of BB84 Quantum Key Distribution Protocol
   Objective: To simulate secure key generation using polarization states in BB84.
   Tools: Qiskit / Python
Exp. 5: Analysis of Intercept-Resend Attack on BB84
   Objective: To simulate an eavesdropper and measure the quantum bit error rate (QBER).
   Tools: Qiskit / Python
Exp. 6: Implementation of B92 QKD Protocol
   Objective: To perform simplified two-state QKD and measure key efficiency.
   Tools: Qiskit / QuTiP
Exp. 7: Simulation of E91 Entanglement-Based QKD
   Objective: To implement the E91 protocol and study entanglement correlations.
   Tools: Qiskit / IBM Quantum Experience
Exp. 8: Visualization of Quantum Entanglement and Bell's Inequality
   Objective: To verify quantum non-locality using a Bell test simulation.
   Tools: Qiskit / Python
Exp. 9: Quantum Secret Sharing Protocol
   Objective: To distribute information securely among multiple parties using GHZ states.
   Tools: Qiskit / Python
Exp. 10: Quantum Bit Commitment and Quantum Digital Signature
   Objective: To understand and simulate a quantum bit commitment scheme and signature verification.
   Tools: Qiskit / QuTiP
Exp. 11: Quantum Teleportation and Authentication Protocol
   Objective: To implement quantum teleportation and analyze its security aspects.
   Tools: Qiskit / Python
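The sifting logic behind Exps. 4 and 5 can be illustrated classically before moving to Qiskit. In the sketch below, Python's seeded pseudo-random generator stands in for quantum state preparation and measurement, purely for illustration: with no eavesdropper, bits measured in matching bases always agree, so the sifted-key QBER is zero.

```python
import random

# Classical stand-in for the BB84 sifting step ('+' and 'x' denote the
# rectilinear and diagonal bases). Illustrative only; not a quantum simulation.
random.seed(42)
n = 200
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]
bob_bases   = [random.choice("+x") for _ in range(n)]

# Matching basis: Bob reads Alice's bit; mismatched basis: result is random.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

sifted = [(a, b) for a, b, ab, bb in
          zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
errors = sum(a != b for a, b in sifted)
print(len(sifted), errors)   # roughly n/2 sifted bits, zero errors
```

Adding an intercept-resend eavesdropper to the same loop raises the sifted-key error rate toward 25%, which is the detection signal measured in Exp. 5.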
SEMESTER-V
COURSE 14B: Deep Learning Fundamentals
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To understand the theoretical foundations and mathematics underlying deep learning
algorithms.
2. To explore neural network architectures and their optimization techniques.
3. To apply convolutional and recurrent neural networks to real-world data.
4. To understand the role of autoencoders, generative models, and transfer learning.
5. To develop practical deep learning models using open-source frameworks such as
TensorFlow or PyTorch.

Course Outcomes:
After successful completion of this course, students will be able to:
1. Explain the fundamental principles of neural networks and deep learning.
2. Implement and train deep neural networks for various data modalities.
3. Analyze and optimize network performance using gradient-based methods.
4. Apply advanced architectures such as CNNs, RNNs, and GANs to solve complex
problems.
5. Design, evaluate, and deploy end-to-end deep learning systems for practical applications.

Unit I: Introduction to Deep Learning and Neural Networks


Introduction to AI and Machine Learning – Motivation for Deep Learning – Biological Neurons
and Artificial Neurons – Perceptrons and Multilayer Perceptrons – Activation Functions – Loss
Functions – Backpropagation and Gradient Descent
Unit II: Optimization and Training Deep Networks
Challenges in Training Deep Networks – Vanishing and Exploding Gradients – Regularization
(Dropout, Batch Normalization) – Weight Initialization – Optimizers (SGD, Adam, RMSProp) –
Hyperparameter Tuning and Model Evaluation
Unit III: Convolutional Neural Networks (CNNs)
Convolution and Pooling Operations – CNN Architectures (LeNet, AlexNet, VGG, ResNet,
Inception) – Transfer Learning – Visualization and Interpretation – Applications in Image
Recognition and Computer Vision
Unit IV: Sequential Models and Recurrent Neural Networks (RNNs)
Sequence Modeling – RNNs and LSTMs – GRUs – Attention Mechanisms – Transformers
Overview – Applications in Natural Language Processing (Text, Speech)
Unit V: Generative and Advanced Deep Learning Models
Autoencoders – Variational Autoencoders (VAEs) – Generative Adversarial Networks (GANs) –
Reinforcement Learning Basics – Ethical and Societal Implications of Deep Learning

Text Books:

1. Fundamentals of Deep Learning (2nd Edition) by Nikhil Buduma, Nithin Buduma & Joe
Papa, O'Reilly Media.
2. Programming Neural Networks with Python by Joachim Steinwender and Roland
Schwaiger, Rheinwerk Computing.
SEMESTER-V
COURSE 14B: Deep Learning Fundamentals
Practical Credits: 1 2 hrs/week

Practical 1: Implement a Perceptron from Scratch
   Objective: Understand the basic neuron model and activation functions.
   Key Concepts / Tools: NumPy, dot product, step function
Practical 2: Train a Simple Neural Network for AND/OR Gates
   Objective: Learn supervised learning and error correction.
   Key Concepts / Tools: Feedforward, weight updates, loss function
Practical 3: Handwritten Digit Recognition using MNIST
   Objective: Implement your first deep neural network.
   Key Concepts / Tools: Keras/TensorFlow, softmax, dense layers
Practical 4: Visualize Activation Functions
   Objective: Compare sigmoid, tanh, and ReLU behaviors.
   Key Concepts / Tools: Matplotlib visualization, activation study
Practical 5: Build a Multi-Layer Perceptron (MLP) for Classification
   Objective: Introduce hidden layers and backpropagation.
   Key Concepts / Tools: Dense networks, batch training
Practical 6: Implement Gradient Descent from Scratch
   Objective: Learn the optimization concept manually.
   Key Concepts / Tools: Mean squared error, learning rate tuning
Practical 7: Image Classification with Convolutional Neural Networks
   Objective: Learn CNN architecture and filters.
   Key Concepts / Tools: Conv2D, MaxPooling, Flatten layers
Practical 8: Train a Neural Network with Dropout Regularization
   Objective: Prevent overfitting and improve generalization.
   Key Concepts / Tools: Dropout layers, training/testing accuracy
Practical 9: Predict House Prices Using a Regression Neural Network
   Objective: Apply deep learning to regression problems.
   Key Concepts / Tools: Normalization, linear output layer
Practical 10: Implement a Basic Reinforcement Learning Agent
   Objective: Understand neural networks in decision-making.
   Key Concepts / Tools: Q-learning, reward function, exploration
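Practicals 1 and 2 combine naturally into one short sketch: a single perceptron with a step activation learning the AND function via the perceptron update rule. The learning rate and epoch count are illustrative choices.

```python
import numpy as np

# Perceptron learning the AND truth table from scratch.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
for _ in range(20):                          # a few epochs suffice
    for xi, yi in zip(X, y):
        pred = int(np.dot(w, xi) + b > 0)    # step activation
        w += 0.1 * (yi - pred) * xi          # perceptron learning rule
        b += 0.1 * (yi - pred)

preds = [int(np.dot(w, xi) + b > 0) for xi in X]
print(preds)
```

Since AND is linearly separable, the perceptron convergence theorem guarantees this loop terminates with a correct separator; trying the same code on XOR fails, which motivates the hidden layers of Practical 5.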
SEMESTER-V
COURSE 15A: Quantum Simulation in Cyber Security
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To explore practical quantum simulation frameworks in cybersecurity contexts.
2. To develop skills for simulating cyber-attacks, network defenses, and cryptographic
systems using quantum tools.
3. To understand and simulate hybrid quantum–classical architectures for security.
4. To implement quantum machine learning (QML) and blockchain-based secure systems.
5. To examine real-world applications, standards, and future research in quantum-secure
infrastructures.
Course Outcomes:
After completing this course, learners will be able to:
• CO1: Apply quantum simulators for modeling cyber-attack and defense scenarios.
• CO2: Design quantum-safe and hybrid cybersecurity architectures.
• CO3: Implement simulation-based approaches for secure communication and blockchain
systems.
• CO4: Utilize quantum machine learning for intrusion and anomaly detection.
• CO5: Critically evaluate current research, ethical issues, and future trends in quantum
cybersecurity.
Unit I: Quantum Simulation Frameworks for Cybersecurity
Architecture of quantum simulators for cybersecurity research, Quantum circuit emulation,
noise modeling, and gate fidelity in security simulation, Frameworks and platforms: IBM
Qiskit Aer, Cirq, QuTiP, Xanadu PennyLane, Workflow for simulating quantum algorithms in
cybersecurity
Unit II: Simulation of Cyber Attacks and Quantum Defense Mechanisms
Simulation of brute-force and Grover-based quantum attacks, Modeling RSA and AES
vulnerabilities under quantum threats, Quantum-resistant protocols and defensive simulations,
Quantum replay and Man-in-the-Middle (MITM) simulation in communication networks,
Performance metrics: simulation accuracy, execution time, and qubit resources.
Unit III: Quantum-Safe Network and Cloud Security Simulation
Designing quantum-resilient network architectures, Quantum-safe cloud computing and access
control simulation, Secure key exchange and data integrity validation using quantum protocols,
Simulation of IoT and industrial control systems security (SCADA, IIoT), Hybrid
classical–quantum security system simulation and benchmarking.
Unit IV: Quantum Machine Learning and Blockchain for Cybersecurity
Quantum Machine Learning (QML) fundamentals and types (QNN, QSVM, QGAN), Simulation
of QML models for intrusion detection and anomaly analysis, Quantum Blockchain: principles,
consensus mechanisms, and cryptographic integration, Simulation of secure quantum-ledger
transactions and decentralized systems, Comparative study: classical vs quantum blockchain
performance.
Unit V: Emerging Research Trends, Standards, and Ethics in Quantum Cybersecurity
Quantum Internet and Quantum Cloud Security initiatives, Standardization efforts: NIST PQC
standards, ETSI quantum-safe working group, Future research: quantum-secure infrastructure,
post-quantum transition planning, Case studies on government and industry quantum
cybersecurity projects (e.g., IBM, Google, ISRO), Ethical, legal, and societal impacts of
quantum cybersecurity research.

Textbooks & References:


1. S. B. Goyal et al., Quantum Computing, Cyber Security and Cryptography: Issues,
Technologies, Algorithms, Programming and Strategies, 2025.
2. A. Joshi, Quantum Computing Models for Cybersecurity and Communications, O’Reilly,
2025.
3. IBM Quantum Team, Qiskit Textbook (2025 Update): Simulation of Quantum Systems
for Cybersecurity.
4. Narayanan, A., Post-Quantum Cryptography and Secure Simulation Frameworks,
Springer, 2025.
5. Recent papers (2023–2025) from IEEE, Springer, and Elsevier on Quantum Simulation
and Cybersecurity Applications.
SEMESTER-V
COURSE 15A: Quantum Simulation in Cyber Security
Practical Credits: 1 2 hrs/week

Exp. 1: Setting up the Quantum Simulation Environment & Basic Qubit Operations
   Objective: Install Qiskit/PennyLane, create and measure qubit states, visualize the Bloch sphere.
   Tools: Qiskit, Python, Jupyter Notebook
   Outcome: Students can prepare and measure qubit states and interpret the Bloch sphere and measurement probabilities.
   Mapped COs: CO1, CO2
Exp. 2: Quantum Gates & Noise Simulation
   Objective: Implement multiple gates and simulate noise models (depolarizing, amplitude damping) to observe fidelity degradation.
   Tools: Qiskit Aer, Cirq
   Outcome: Understanding of gate operations, noise impact, and error simulation in quantum systems.
   Mapped COs: CO1, CO2
Exp. 3: Simulation of Shor's Algorithm for Integer Factorization
   Objective: Simulate simplified Shor's algorithm to factor small integers and analyze implications on RSA security.
   Tools: Qiskit Algorithms, Python
   Outcome: Demonstrate factorization and understand resource requirements and quantum limits.
   Mapped COs: CO2, CO3
Exp. 4: Grover's Search Simulation – Quantum Brute-Force Attack
   Objective: Implement Grover's algorithm to simulate a quantum brute-force attack and compare with classical search.
   Tools: Qiskit / Cirq
   Outcome: Visualize quadratic speedup and evaluate attack feasibility on classical encryption.
   Mapped COs: CO2, CO3
Exp. 5: Quantum Key Distribution (BB84) Protocol Simulation
   Objective: Simulate QKD between Alice and Bob; introduce an eavesdropper and detect via QBER.
   Tools: Qiskit, Python
   Outcome: Simulate secure key exchange, compute QBER, and detect intrusion.
   Mapped COs: CO2, CO3
Exp. 6: Quantum Random Number Generation (QRNG)
   Objective: Generate random keys using quantum measurements and evaluate statistical randomness.
   Tools: Qiskit, Python (NIST randomness tests)
   Outcome: Demonstrate QRNG and validate randomness for cryptographic use.
   Mapped COs: CO2, CO4
Exp. 7: Simulation of Quantum-Safe Hybrid Network (PQC + QKD)
   Objective: Implement hybrid key exchange (post-quantum + quantum) for a secure network handshake simulation.
   Tools: Python, Qiskit
   Outcome: Understand hybrid encryption and the quantum-resilient handshake process.
   Mapped COs: CO3, CO4
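The randomness evaluation in Exp. 6 typically starts with the monobit frequency test from NIST SP 800-22. In the illustrative sketch below a seeded classical PRNG stands in for the Hadamard-measurement bits a real Qiskit run would produce; only the test itself is the point.

```python
import math
import random

# Monobit frequency test: is the count of 1s consistent with a fair source?
random.seed(7)
bits = [random.randint(0, 1) for _ in range(1000)]   # stand-in for QRNG output

s = sum(2 * b - 1 for b in bits)                # map {0,1} -> {-1,+1} and sum
s_obs = abs(s) / math.sqrt(len(bits))
p_value = math.erfc(s_obs / math.sqrt(2))       # two-sided tail probability
print(round(p_value, 3))                        # NIST pass threshold: p >= 0.01
```

A healthy generator yields p-values spread over (0, 1) across repeated runs; consistently tiny p-values indicate bias, and the full SP 800-22 suite applies many such tests before a source is trusted for key material.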
SEMESTER-V
COURSE 15B: Quantum Machine Learning
Theory Credits: 3 3 hrs/week

Course Objectives:
1. To understand the quantum representation of classical data and feature spaces.
2. To study quantum versions of supervised and unsupervised learning algorithms.
3. To apply hybrid quantum-classical approaches in machine learning.
4. To explore quantum reinforcement learning and optimization techniques.
5. To evaluate and apply quantum ML models in real-world case studies.
Course Outcomes:
1. Explain quantum data encoding and feature mapping methods.
2. Implement quantum supervised and unsupervised algorithms such as QSVM and quantum
clustering.
3. Design and simulate hybrid quantum-classical models.
4. Apply quantum reinforcement learning and optimization algorithms.
5. Develop and present quantum ML solutions for domain-specific applications.

Unit I:
Quantum Data Representation & Feature Spaces
Quantum data encoding – amplitude, basis, and angle encoding methods, Quantum feature maps and
kernels, Quantum distance measures and inner product evaluation, Quantum Principal Component
Analysis (qPCA) for dimensionality reduction.
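The encoding and kernel ideas above can be sketched for a single feature on one qubit: angle encoding prepares RY(x)|0> = cos(x/2)|0> + sin(x/2)|1>, and the kernel entry is the fidelity |<psi(x)|psi(y)>|^2. This is a minimal pure-Python illustration with our own function names (a real implementation would use an SDK's feature-map and kernel utilities):

```python
import math

def angle_encode(x: float):
    """Angle encoding of one feature on one qubit:
    RY(x)|0> = cos(x/2)|0> + sin(x/2)|1> (real amplitudes)."""
    return (math.cos(x / 2), math.sin(x / 2))

def fidelity_kernel(x: float, y: float) -> float:
    """Quantum kernel entry k(x, y) = |<psi(x)|psi(y)>|^2.
    For this encoding it simplifies to cos^2((x - y) / 2)."""
    a = angle_encode(x)
    b = angle_encode(y)
    inner = a[0] * b[0] + a[1] * b[1]  # inner product of real amplitude vectors
    return inner ** 2
```

Note that k(x, x) = 1 and the value decays as the two encoded states move apart on the Bloch sphere, which is the behaviour a kernel-based learner exploits.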
Unit II:
Quantum Supervised Learning Algorithms
Quantum Support Vector Machine (QSVM), Quantum Perceptron and Quantum Linear Regression,
Quantum Decision Trees and Ensemble methods, Performance metrics and complexity analysis.
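The QSVM decision function is a weighted sum of quantum-kernel evaluations against training points. The toy below is a deliberate simplification, not a trained SVM: it uses uniform weights, which reduces to a kernelized nearest-class-mean rule (all names are illustrative; an actual QSVM would learn support-vector coefficients):

```python
import math

def k(x, y):
    """Fidelity kernel of two angle-encoded single-qubit states:
    k(x, y) = |<RY(y)0|RY(x)0>|^2 = cos^2((x - y) / 2)."""
    return math.cos((x - y) / 2) ** 2

def kernel_classify(x, pos, neg):
    """Sign of mean kernel similarity to the positive class minus mean
    similarity to the negative class. A QSVM replaces these uniform
    weights with learned support-vector coefficients."""
    score = (sum(k(x, p) for p in pos) / len(pos)
             - sum(k(x, n) for n in neg) / len(neg))
    return +1 if score >= 0 else -1
```

Even this crude rule separates classes whose encodings land on distinct regions of the Bloch sphere, which motivates the complexity and performance-metric analysis in this unit.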
Unit III:
Quantum Unsupervised and Generative Models
Quantum k-Means and clustering approaches, Quantum Self-Organizing Maps, Quantum Autoencoders
for feature extraction, Quantum Generative Adversarial Networks (QGANs) for data generation.
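Quantum k-means replaces the Euclidean distance in Lloyd's algorithm with a state dissimilarity such as d(x, c) = 1 - F(x, c), where F is the fidelity between encoded states. The single-feature sketch below uses classical centroid updates on the raw feature values, a common hybrid simplification (function names and the update rule are our own illustrative choices):

```python
import math

def fidelity(x, y):
    """Fidelity of two angle-encoded single-qubit states:
    F(x, y) = cos^2((x - y) / 2)."""
    return math.cos((x - y) / 2) ** 2

def qkmeans(points, centroids, iters=10):
    """Lloyd-style k-means with quantum dissimilarity d = 1 - F for
    assignment and classical means for the centroid update."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            j = min(range(len(centroids)),
                    key=lambda i: 1 - fidelity(p, centroids[i]))
            clusters[j].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters
```

On well-separated one-dimensional data this converges to the classical cluster means; the quantum interest lies in evaluating the fidelity on hardware when the encoded states live in a larger Hilbert space.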
Unit IV:
Hybrid Quantum–Classical and Reinforcement Learning
Variational Quantum Circuits (VQC) and hybrid architectures, Quantum Approximate Optimization
Algorithm (QAOA), Quantum Reinforcement Learning (QRL) and Quantum Policy Gradient, TensorFlow
Quantum and PennyLane-based implementations.
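The core VQC training loop can be illustrated on one qubit: for the circuit RY(theta)|0>, the observable <Z> equals cos(theta), and the parameter-shift rule recovers its exact gradient from two shifted circuit evaluations. The pure-Python sketch below stands in for a TensorFlow Quantum or PennyLane optimizer loop (names, learning rate, and step count are illustrative assumptions):

```python
import math

def expectation_z(theta):
    """<Z> for the one-qubit variational circuit RY(theta)|0>: cos(theta)."""
    return math.cos(theta)

def parameter_shift_grad(theta):
    """Parameter-shift rule: dE/dtheta = (E(theta + pi/2) - E(theta - pi/2)) / 2.
    Here this evaluates exactly to -sin(theta)."""
    return (expectation_z(theta + math.pi / 2)
            - expectation_z(theta - math.pi / 2)) / 2

def train(theta=0.1, lr=0.4, steps=100):
    """Gradient descent on <Z>; the minimum <Z> = -1 sits at theta = pi."""
    for _ in range(steps):
        theta -= lr * parameter_shift_grad(theta)
    return theta
```

On hardware the two shifted evaluations are two extra circuit runs per parameter, which is why hybrid frameworks batch them; the rule itself remains exact, unlike finite differences.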
Unit V:
Applications, Case Studies & Future Directions
Applications in finance, drug discovery, healthcare, and image recognition, Performance benchmarking
and error mitigation in NISQ devices, Emerging quantum ML research (2025 trends), Ethical considerations
and societal impact.
Textbooks and References:

1. Karthikeyan, P., Akila, M., Sumathi, S., Poongodi, M. (2025). Quantum Machine
Learning: A Modern Approach. Routledge, CRC Press.
2. Packt Publishing (2025). A Practical Guide to Quantum Machine Learning and
Quantum Optimization.
3. Cambridge University Press (2025). Machine Learning in Quantum Sciences.
4. Nielsen, M.A. & Chuang, I.L. (2025). Quantum Computation and Quantum Information
– Updated Edition.
5. IBM Qiskit Documentation (2025) – [Link]
6. TensorFlow Quantum Documentation (2025) – [Link]
SEMESTER-V
COURSE 15B: Quantum Machine Learning
Practical Credits: 1 2 hrs/week

Exp. No. | Experiment Title                                           | Tools / SDKs
1        | Quantum Data Encoding and Visualization                    | IBM Qiskit
2        | Implement Quantum Feature Mapping                          | Qiskit / PennyLane
3        | Quantum Principal Component Analysis (qPCA)                | Qiskit
4        | Quantum Support Vector Machine (QSVM) for Classification   | Qiskit ML
5        | Quantum k-Means Clustering                                 | PennyLane
6        | Variational Quantum Circuit Optimization                   | TensorFlow Quantum
7        | Hybrid Quantum Neural Network                              | TensorFlow Quantum
8        | Quantum Reinforcement Learning Simulation                  | Cirq
9        | Quantum Generative Adversarial Network (QGAN)              | PennyLane
10       | Mini Project – Quantum ML for Real Dataset                 | Any SDK (IBM Cloud or Local)
