This course covers the fundamental principles and methods of calculus and linear algebra as applied to computing specialties. It uses the universal language of calculus and linear algebra to formulate and understand practical problems relevant to computing applications. Students gain mathematical knowledge and skills that are directly applicable to computing practice.
Prerequisites
Corequisites
None