Friday 8 August 2014

Getting It On With Linear Algebra

Linear algebra is the branch of mathematics concerned with vector spaces, which may be finite- or infinite-dimensional, and with linear mappings between such spaces. The subject is motivated by systems of linear equations in several unknowns, normally represented in the formalism of matrices and vectors (a small worked example appears at the end of this post).

Linear algebra is paramount to both pure and applied mathematics. Abstract algebra, for instance, arises by relaxing the axioms of a vector space, which leads to a number of generalizations. Functional analysis studies the infinite-dimensional version of the theory of vector spaces, and combined with calculus it facilitates the solution of linear systems of differential equations. The techniques of linear algebra are also used in analytic geometry, engineering, physics, the natural sciences, the social sciences (especially economics), and computer science. Because it is such a well-developed theory, nonlinear mathematical models are often approximated by linear ones.

The first studies of linear algebra emerged from the study of determinants, which were then used to solve systems of linear equations. Determinants were employed by Leibniz in 1693, and they paved the way for Cramer's rule for linear systems in the mid-1700s. Theories for solving linear systems followed with the help of Gaussian elimination (sketched in code below), which was initially listed as an advancement in geodesy. Matrix algebra first surfaced in the mid-1800s, with James Joseph Sylvester introducing the term "matrix" (Latin for "womb"). Matrix multiplication and matrix inverses resulted from studying the composition of linear transformations (see the short demonstration below); it was around this time that a single letter came to denote a matrix, treating the matrix itself as an aggregate object. Mathematicians also came to recognize the connection between matrices and determinants, whose theory preceded that of matrices. A more modern and precise definition of vector spaces was introduced by Peano in 1888, followed by a theory of linear transformations on finite-dimensional vector spaces. Linear algebra took its first modern form in the early twentieth century, when the ideas and methods of the previous centuries were generalized as abstract algebra. The use of matrices in quantum mechanics, special relativity, and statistics helped spread the subject beyond the realm of pure mathematics. The arrival of computers and their growing computing power encouraged research into efficient algorithms for Gaussian elimination and matrix decompositions, and these days linear algebra is an essential tool for modeling and simulation.

Because linear algebra is such a successful theory, its methods have been developed and generalized in other corners of mathematics. In module theory, for example, one replaces the field of scalars with a ring. The concepts of linear independence, span, basis, and dimension (referred to as rank in module theory) still apply, but many theorems of linear algebra fail in this setting: not every module has a basis, and the modules that do have one are called free modules.
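
To make the matrix-and-vector formalism above concrete, here is a minimal worked example. It is a sketch in Python with NumPy (my choice of tools, not anything the original discussion specifies): the system 2x + y = 5, x - 3y = -1 is written as a matrix equation A x = b, solved numerically, and then checked against Cramer's rule, the mid-1700s method mentioned above.

    import numpy as np

    # The system  2x + y = 5,  x - 3y = -1  in matrix form A @ x = b.
    A = np.array([[2.0, 1.0],
                  [1.0, -3.0]])
    b = np.array([5.0, -1.0])

    # Solve directly; NumPy factorizes A internally.
    x = np.linalg.solve(A, b)
    print(x)   # [2. 1.]

    # Cramer's rule: x_i = det(A_i) / det(A), where A_i is A with
    # column i replaced by b. Practical only for small systems.
    det_A = np.linalg.det(A)
    for i in range(2):
        A_i = A.copy()
        A_i[:, i] = b
        print(np.linalg.det(A_i) / det_A)   # 2.0, then 1.0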
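
Gaussian elimination itself can be sketched in a few lines. The routine below is an illustrative simplification (again Python/NumPy, with partial pivoting for numerical stability), not production code; in practice a library solver such as numpy.linalg.solve is the right choice.

    import numpy as np

    def gaussian_elimination(A, b):
        # Illustrative sketch: solve A x = b for square, nonsingular A
        # by forward elimination followed by back substitution.
        A = A.astype(float)
        b = b.astype(float)
        n = len(b)
        for k in range(n - 1):
            # Partial pivoting: swap in the row with the largest pivot.
            p = k + np.argmax(np.abs(A[k:, k]))
            A[[k, p]] = A[[p, k]]
            b[[k, p]] = b[[p, k]]
            # Eliminate the entries below the pivot.
            for i in range(k + 1, n):
                m = A[i, k] / A[k, k]
                A[i, k:] -= m * A[k, k:]
                b[i] -= m * b[k]
        # Back substitution on the resulting upper-triangular system.
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
        return x

    print(gaussian_elimination(np.array([[2.0, 1.0], [1.0, -3.0]]),
                               np.array([5.0, -1.0])))   # [2. 1.]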
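
Finally, the historical remark that matrix multiplication grew out of composing linear transformations is easy to demonstrate. In the hypothetical snippet below, applying a rotation and then a scaling to a vector gives the same result as applying the single product matrix once.

    import numpy as np

    # Rotation by 90 degrees and uniform scaling by 2, as 2x2 matrices.
    theta = np.pi / 2
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    S = np.array([[2.0, 0.0],
                  [0.0, 2.0]])
    v = np.array([1.0, 0.0])

    # "Rotate, then scale" as two separate maps...
    two_steps = S @ (R @ v)
    # ...equals one application of the product matrix S @ R.
    one_step = (S @ R) @ v
    print(np.allclose(two_steps, one_step))   # True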
