Module and Programme Catalogue

2016/17 Undergraduate Module Catalogue

COMP1211 Computer Architecture

10 credits | Class Size: 165

Module manager: Dr Kevin McEvoy
Email: K.McEvoy@leeds.ac.uk

Taught: Semester 1 (Sep to Jan)

Year running 2016/17

This module is not approved as a discovery module

Module summary

Computer architecture moves up from the chip-level details of computation to how chips and other components work together in a modern computer. This module introduces the basics of internal machine architecture, including how memory is accessed, how the various components are connected, and how to program in assembly language: the necessary preliminary to understanding how operating systems control computation and how compilers translate human-readable instructions into machine instructions.

Objectives

On completion of this module a student should be able to:
-Explain why, in a computer, everything (including instructions) is represented as data.
-Explain the reasons for using alternative formats to represent numerical data.
-Describe how negative integers are stored in sign-magnitude and two's-complement representations (see the first sketch after this list).
-Explain how fixed-length number representations affect accuracy and precision.
-Describe the internal representation of non-numeric data, such as characters, strings, records, and arrays.
-Convert numerical data from one format to another.
-Explain the organization of the classical von Neumann machine and its major functional units.
-Describe how an instruction is executed in a classical von Neumann machine, with extensions for threads, multiprocessor synchronization, and SIMD execution.
-Describe instruction level parallelism and hazards, and how they are managed in typical processor pipelines.
-Summarize how instructions are represented at both the machine level and in the context of a symbolic assembler.
-Demonstrate how to map high-level language patterns into assembly/machine language notations (see the accumulator-machine sketch after this list).
-Explain different instruction formats, such as addresses per instruction and variable length vs. fixed length formats.
-Explain how subroutine calls are handled at the assembly level.
-Explain the basic concepts of interrupts and I/O operations.
-Write simple assembly language program segments.
-Show how fundamental high-level programming constructs are implemented at the machine-language level.
-Describe how computing systems are constructed of layers upon layers, based on separation of concerns, with well-defined interfaces, hiding details of low layers from the higher layers.
-Explain the distinction between program errors, system errors, and hardware faults (e.g., bad memory) and exceptions (e.g., attempt to divide by zero).
-Articulate the distinction between detecting, handling, and recovering from faults, and the methods for their implementation.
-Describe the role of error correcting codes in providing error checking and correction techniques in memories, storage, and networks.
-Apply simple algorithms that exploit redundant information for data correction (see the Hamming-code sketch after this list).
-Compare different error detection and correction methods for their data overhead, implementation complexity, and relative execution time for encoding, detecting, and correcting errors.
-Define how finite computer resources (e.g., processor share, memory, storage and network bandwidth) are managed by their careful allocation to existing entities.
-Explain why it is important to isolate and protect the execution of individual programs and environments that share common underlying resources.
-Describe how the concept of indirection can create the illusion of a dedicated machine and its resources even when physically shared among multiple programs and environments.
-Measure the performance of two application instances running on separate virtual machines, and determine the effect of performance isolation.
-Explain how interrupts are used to implement I/O control and data transfers.
-Identify various types of buses in a computer system.
-Describe data access from a magnetic disk drive.
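
Several of these objectives lend themselves to short worked examples. First, the two integer representations named above can be made concrete in a few lines of Python. This is a minimal sketch assuming an 8-bit word; the helper names are illustrative and not part of the module materials.

    def sign_magnitude(n, bits=8):
        """Encode n as a sign bit followed by the magnitude."""
        assert -(2 ** (bits - 1)) < n < 2 ** (bits - 1)
        sign = '1' if n < 0 else '0'
        return sign + format(abs(n), f'0{bits - 1}b')

    def twos_complement(n, bits=8):
        """Encode n in two's complement: negatives wrap modulo 2**bits."""
        assert -(2 ** (bits - 1)) <= n < 2 ** (bits - 1)
        return format(n % (2 ** bits), f'0{bits}b')

    # The same value -5 has two different 8-bit encodings:
    print(sign_magnitude(-5))   # 10000101
    print(twos_complement(-5))  # 11111011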
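
Second, to suggest how high-level constructs map down to machine instructions, the sketch below simulates a single-accumulator machine in Python. The instruction set (LOAD, STORE, ADD, SUB, JUMPZ, JUMP) and the machine model are assumptions made for illustration; they are not the architecture taught in the module.

    def run(program, memory):
        """Tiny accumulator machine: one register (acc), a program
        counter (pc) and named memory cells."""
        acc, pc = 0, 0
        while pc < len(program):
            op, arg = program[pc]
            pc += 1
            if op == 'LOAD':
                acc = memory[arg]              # acc <- mem[arg]
            elif op == 'STORE':
                memory[arg] = acc              # mem[arg] <- acc
            elif op == 'ADD':
                acc += memory[arg]
            elif op == 'SUB':
                acc -= memory[arg]
            elif op == 'JUMPZ' and acc == 0:
                pc = arg                       # branch if acc == 0
            elif op == 'JUMP':
                pc = arg                       # unconditional branch
        return memory

    # High-level source:  while x != 0: total += x; x -= 1
    program = [
        ('LOAD',  'x'),       # 0: test the loop condition
        ('JUMPZ', 9),         # 1: exit when x == 0
        ('LOAD',  'total'),   # 2: total += x
        ('ADD',   'x'),       # 3:
        ('STORE', 'total'),   # 4:
        ('LOAD',  'x'),       # 5: x -= 1
        ('SUB',   'one'),     # 6:
        ('STORE', 'x'),       # 7:
        ('JUMP',  0),         # 8: back to the test
    ]
    print(run(program, {'x': 5, 'total': 0, 'one': 1})['total'])  # 15

Running it sums 5 + 4 + 3 + 2 + 1, mirroring the high-level loop shown in the comment.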
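
Finally, the error-correction objectives can be illustrated with the classic Hamming(7,4) code; the Python below is an illustrative sketch, not necessarily the form in which the module presents the material.

    def hamming74_encode(d):
        """Encode 4 data bits as 7 bits (Hamming(7,4)). Using 1-based
        positions, parity bits sit at positions 1, 2 and 4."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
        p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
        p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_correct(c):
        """Recompute parity; the syndrome is the 1-based position of a
        single flipped bit (0 means no error). Returns the fixed word."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # positions 1, 3, 5, 7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # positions 2, 3, 6, 7
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # positions 4, 5, 6, 7
        syndrome = s1 + 2 * s2 + 4 * s3
        if syndrome:
            c[syndrome - 1] ^= 1         # flip the faulty bit back
        return c

    word = hamming74_encode([1, 0, 1, 1])
    word[4] ^= 1                          # corrupt one bit in transit
    assert hamming74_correct(word) == hamming74_encode([1, 0, 1, 1])

Flipping any single bit of the 7-bit word yields a non-zero syndrome naming the corrupted position, so an overhead of 3 parity bits per 4 data bits buys single-bit correction.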

Learning outcomes
On completion of the year/programme students should have provided evidence of being able to:
- demonstrate a familiarity with the basic concepts, information, practical competencies and techniques which are standard features of the discipline;
- communicate the results of their work;
- interpret and evaluate the underlying concepts and principles of the discipline;
- evaluate qualitative and/or quantitative data;
- appreciate their strengths and weaknesses as learners;
- demonstrate an awareness of professional and disciplinary boundaries;
- demonstrate computational thinking including its relevance to everyday life;
- operate computing equipment effectively, taking into account its logical and physical properties.


Syllabus

Teaching methods

Delivery type | Number | Length (hours) | Student hours
Example Class | 5 | 1.00 | 5.00
Laboratory | 5 | 1.00 | 5.00
Class tests, exams and assessment | 1 | 2.00 | 2.00
Lecture | 22 | 1.00 | 22.00

Private study hours: 66.00
Total contact hours: 34.00
Total hours (100 hours per 10 credits): 100.00

Opportunities for Formative Feedback

Coursework and labs

Methods of assessment


Coursework
Assessment type | Notes | % of formal assessment
Assignment | Coursework | 15.00
Assignment | Coursework | 15.00
Total percentage (Assessment Coursework): 30.00


Exams
Exam type | Exam duration | % of formal assessment
Open Book exam | 2 hr | 70.00
Total percentage (Assessment Exams): 70.00

This module is re-assessed by exam only.

Reading list

The reading list is available from the Library website

Last updated: 27/02/2017

Errors, omissions, failed links etc. should be notified to the Catalogue Team.

© Copyright Leeds 2019