A vector space is a collection of objects that can be added and multiplied by scalars. The operations called addition and multiplication are not necessarily our familiar algebraic operations, but they must obey certain rules.
Ordinary vectors in three-dimensional space can be added using vector addition. Vector addition is different from ordinary addition, but it obeys the rules for the addition operation of a vector space.
In quantum mechanics, it is postulated that all possible states of a system form a vector space, i.e. they can be manipulated with two operations called addition and multiplication, which obey the rules for addition and multiplication in a vector space. The operations are obviously different from the operations of adding and multiplying ordinary numbers.
Inner-product spaces are vector spaces for which an additional operation is defined, namely taking the inner product of two vectors. This operation associates with each pair of vectors a scalar, i.e. a number, not a vector. The operation also must obey certain rules, but again, as long as it does obey the rules it can be defined quite differently in different vector spaces. The vector space of ordinary three-dimensional vectors is an inner-product space; the inner product is the dot product.
The vector space that all possible states belong to in QM is not three-dimensional, but infinite-dimensional. It is called a Hilbert space, and it is an inner-product space. In Dirac notation the inner product of a vector |ψ> with a vector |φ> is denoted by the symbol <ψ|φ>. This symbol denotes a number, not a vector. The inner product is quite different from ordinary multiplication; for example, <φ|ψ> is not equal to <ψ|φ> in general, but the inner product satisfies the rules for an inner-product space.
In Dirac notation kets represent the vectors. To every ket corresponds exactly one bra; there is a one-to-one correspondence. |ψ> is a ket; the corresponding bra is <ψ|. If |x> is a ket, the corresponding bra is <x|.
The vectors in the Hilbert space can be represented in various representations, i.e. we can choose different bases and give their components along the basis vectors. If we choose the coordinate representation, the basis is the set of all vectors {|x>}, and the component of a vector |ψ> along a vector |x> is given by the inner product <x|ψ> = ψ(x). If we evaluate <x|ψ> for all |x> we get the wave function. Because we want to interpret the absolute square |ψ(x)|^{2} of the wave function as a probability density, we require that the wave function can be normalized, and that if we integrate the absolute square of the normalized wave function over all space we get 1. The probability of finding the system somewhere in space is 1.
We require that the wave function is square-integrable. We therefore say that our Hilbert space is equivalent to the space of square-integrable functions.
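As a numerical illustration (not part of the original notes), one can sample a square-integrable function on a grid and rescale it so that the Riemann sum approximating the integral of |ψ(x)|^{2} equals 1; the Gaussian test function below is an arbitrary choice.

```python
import math

# Normalize a sampled wave function so that sum(|psi|^2) * dx = 1.
def normalize(psi, dx):
    norm_sq = sum(abs(p) ** 2 for p in psi) * dx   # Riemann-sum approximation
    scale = 1.0 / math.sqrt(norm_sq)
    return [scale * p for p in psi]

dx = 0.01
xs = [-10 + i * dx for i in range(2001)]
psi = [math.exp(-x ** 2) for x in xs]              # unnormalized Gaussian (arbitrary choice)
psi_n = normalize(psi, dx)

total = sum(abs(p) ** 2 for p in psi_n) * dx
print(total)   # ~1.0: the normalized wave function integrates to unit probability
```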
A linear vector space V is a set of elements, {V_{i}}, which may be added and multiplied by scalars {α_{i}} in such a way that
the operations yield only elements of V (closure);
addition and scalar multiplication obey the following rules:
i) addition is commutative and associative: V_{i} + V_{j} = V_{j} + V_{i}, (V_{i} + V_{j}) + V_{k} = V_{i} + (V_{j} + V_{k});
ii) there exists a null vector 0 such that V_{i} + 0 = V_{i}, and every V_{i} has an inverse -V_{i} such that V_{i} + (-V_{i}) = 0;
iii) scalar multiplication is distributive and associative: α(V_{i} + V_{j}) = αV_{i} + αV_{j}, (α + β)V_{i} = αV_{i} + βV_{i}, α(βV_{i}) = (αβ)V_{i}, and 1·V_{i} = V_{i}.
The domain of allowed scalars is called the field F over which V is defined. (Examples: F consists of all real numbers, or F consists of all complex numbers.)
Examples:
ordinary vectors in three-dimensional space;
the set L^{2} of square-integrable functions ψ(r,t), defined by the condition
∫ |ψ(r,t)|^{2} d^{3}r < ∞.
A set of vectors {V_{1}, V_{2}, V_{3}, ...} is linearly independent (LI) if there exists no linear relation of the form Σ_{i} α_{i}V_{i} = 0, except for the trivial one with all α_{i} = 0.
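The linear-independence condition can be tested numerically. The sketch below (an illustration, not from the notes) row-reduces a set of real vectors and checks whether the rank equals the number of vectors.

```python
def is_linearly_independent(vectors, tol=1e-12):
    """Return True if the only solution of sum_i a_i V_i = 0 is all a_i = 0,
    checked by Gaussian elimination: rank must equal the number of vectors."""
    rows = [list(map(float, v)) for v in vectors]
    rank = 0
    for col in range(len(rows[0])):
        # find a pivot row for this column among the unreduced rows
        pivot = next((r for r in range(rank, len(rows)) if abs(rows[r][col]) > tol), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and abs(rows[r][col]) > tol:
                f = rows[r][col] / rows[rank][col]
                rows[r] = [a - f * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank == len(rows)

print(is_linearly_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))  # True
print(is_linearly_independent([(1, 2, 3), (2, 4, 6)]))             # False: second = 2 * first
```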
A vector space is n-dimensional if it admits at most n LI vectors. The space of ordinary vectors in three-dimensional space is 3-dimensional. The space L^{2} is an infinite-dimensional vector space.
Given a set of n LI vectors in V^{n}, any other vector in V may be written as a linear combination of these. The unit vectors along the three coordinate axes are one example of a set of 3 LI vectors in 3 dimensions. One can also choose such a set for a denumerably or non-denumerably infinite-dimensional vector space. Any such set is called a basis that spans V. The expansion coefficients are called the components of a vector in this basis.
Assume {u_{i}(r), u_{i} ∈ L^{2}} forms a basis of L^{2}. Then every vector ψ in L^{2} may be written as
ψ(r) = Σ_{i} c_{i}u_{i}(r),
the c_{i} being the components of ψ(r) in this basis.
If all vectors are expanded in a given basis, then:
to add vectors, add their components;
to multiply a vector by α, multiply each component by α.
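A minimal sketch of these component-wise rules, representing vectors by Python lists of (possibly complex) components in some fixed basis:

```python
# Once vectors are expanded in a fixed basis, vector-space operations act
# component by component; scalars may be complex.
def add(v, w):
    return [a + b for a, b in zip(v, w)]

def scale(alpha, v):
    return [alpha * a for a in v]

v = [1 + 2j, 0, -1j]   # components of a vector in some chosen basis
w = [3, 1j, 2]

print(add(v, w))       # [(4+2j), 1j, (2-1j)]
print(scale(2j, v))    # [(-4+2j), 0j, (2+0j)]
```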
The inner product is a scalar function of two vectors satisfying the following rules:
i) <V_{i}|V_{i}> ≥ 0;
ii) <V_{i}|V_{j}> = <V_{j}|V_{i}>^{*};
iii) <V_{i}|αV_{j} + βV_{k}> = α<V_{i}|V_{j}> + β<V_{i}|V_{k}>.
Rules ii) and iii) combine to give <αV_{i} + βV_{j}|V_{k}> = α^{*}<V_{i}|V_{k}> + β^{*}<V_{j}|V_{k}>.
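These rules can be checked concretely for the standard inner product on C^{n}, <v|w> = Σ_{i} v_{i}^{*}w_{i}, used here as an illustrative stand-in for the abstract inner product (the specific vectors and scalars are arbitrary choices):

```python
# Standard inner product on C^n: conjugate the first argument.
def inner(v, w):
    return sum(a.conjugate() * b for a, b in zip(v, w))

v = [1 + 1j, 2]
w = [3j, 1 - 1j]
u = [2, -1j]
alpha, beta = 2j, 1 - 3j

# ii) conjugate symmetry
assert inner(v, w) == inner(w, v).conjugate()
# iii) linearity in the second argument
lhs = inner(v, [alpha * a + beta * b for a, b in zip(w, u)])
assert lhs == alpha * inner(v, w) + beta * inner(v, u)
# combined rule: antilinearity in the first argument
lhs2 = inner([alpha * a + beta * b for a, b in zip(v, w)], u)
assert lhs2 == alpha.conjugate() * inner(v, u) + beta.conjugate() * inner(w, u)
print("all inner-product rules hold")
```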
A vector space with an inner product is called an inner product space. The inner product in L^{2} is defined by
<φ|ψ> = ∫ φ^{*}(r)ψ(r) d^{3}r.
The norm of a vector V is defined by ||V|| = <V|V>^{1/2}. A unit vector has norm 1.
Two vectors are orthogonal if their inner product vanishes. A set of vectors {V_{i}} is called orthonormal if <V_{i}|V_{j}> = δ_{ij}. Assume the vectors {u_{i}(r)} are orthonormal and form a basis for L^{2}. Then, for ψ(r) = Σ_{i} c_{i}u_{i}(r),
<u_{j}|ψ> = Σ_{i} c_{i}<u_{j}|u_{i}> = Σ_{i} c_{i}δ_{ji} = c_{j}.
The component c_{j} is therefore equal to the scalar product of u_{j}(r) and ψ(r).
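A discrete illustration (not from the notes): in C^{2} with an orthonormal basis of two rotated unit vectors, the components of a vector are obtained as inner products with the basis vectors, and the expansion reconstructs the vector.

```python
import math

# Standard inner product on C^n, conjugating the first argument.
def inner(v, w):
    return sum(a.conjugate() * b for a, b in zip(v, w))

s = 1 / math.sqrt(2)
u1 = [s, s]                                 # arbitrary orthonormal basis of C^2
u2 = [s, -s]
assert abs(inner(u1, u2)) < 1e-15           # orthogonal
assert abs(inner(u1, u1) - 1) < 1e-15       # normalized

psi = [2 + 1j, -1j]
c = [inner(u1, psi), inner(u2, psi)]        # components c_j = <u_j|psi>

# reconstruct psi = sum_j c_j u_j
rec = [c[0] * a + c[1] * b for a, b in zip(u1, u2)]
assert all(abs(r - p) < 1e-12 for r, p in zip(rec, psi))
print(c)
```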
Let ψ(r) = Σ_{i} c_{i}u_{i}(r). The norm can then be expressed in terms of the components:
||ψ||^{2} = <ψ|ψ> = Σ_{i}Σ_{j} c_{i}^{*}c_{j}<u_{i}|u_{j}> = Σ_{i} |c_{i}|^{2}.
The inner product obeys the Schwarz inequality
|<V_{1}|V_{2}>| ≤ ||V_{1}|| ||V_{2}||.
The norm obeys the triangle inequality
||V_{1} + V_{2}|| ≤ ||V_{1}|| + ||V_{2}||.
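Both inequalities are easy to verify numerically for the standard inner product on C^{n}; the vectors below are arbitrary choices.

```python
import math

def inner(v, w):
    return sum(a.conjugate() * b for a, b in zip(v, w))

def norm(v):
    # <v|v> is real and non-negative; take the real part before the square root
    return math.sqrt(inner(v, v).real)

v = [1 + 2j, -3j, 0.5]
w = [2, 1 - 1j, 4j]

assert abs(inner(v, w)) <= norm(v) * norm(w) + 1e-12          # Schwarz inequality
s = [a + b for a, b in zip(v, w)]
assert norm(s) <= norm(v) + norm(w) + 1e-12                   # triangle inequality
print(abs(inner(v, w)), norm(v) * norm(w))
```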
It is sometimes convenient to introduce "bases" not belonging to V, but in terms of which any vector in V can nevertheless be expanded.
The set of functions
v_{p}(x) = (2πħ)^{-1/2} e^{ipx/ħ}
may be considered a basis not belonging to L^{2}_{x}, labeled by the continuous index p. We write
ψ(x) = (2πħ)^{-1/2} ∫ ψ̄(p) e^{ipx/ħ} dp, or ψ(x) = ∫ ψ̄(p) v_{p}(x) dp.
ψ(x) is an element of L^{2}_{x}; {v_{p}(x)} is the set of all plane waves with different values of p = ħk. The set {v_{p}} is "orthonormalized in the Dirac sense",
<v_{p}|v_{p'}> = δ(p - p').
(See Cohen-Tannoudji, appendix II, regarding the properties of the Dirac δ function.) We have
ψ̄(p) = <v_{p}|ψ> = (2πħ)^{-1/2} ∫ ψ(x) e^{-ipx/ħ} dx.
If we define the δ function through the relationship
∫ f(x) δ(x - x_{0}) dx = f(x_{0}),
then δ_{x0} = δ(x - x_{0}) may be considered a basis not belonging to L^{2}_{x}, labeled by the continuous index x_{0}, which spans L^{2}_{x}. We write
ψ(x) = ∫ ψ(x') δ(x - x') dx',
where the expansion coefficient ψ(x') is given by
ψ(x') = <δ_{x'}|ψ>.
The basis {δ(x - x_{0})} is "orthonormalized in the Dirac sense",
<δ_{x0}|δ_{x0'}> = δ(x_{0} - x_{0}').
Solution:
ψ̄(p) = (2πħ)^{-1/2} ∫ ψ(x) e^{-ipx/ħ} dx
by the definition of the Fourier transform. In particular, for ψ(x) = δ(x - x_{0}),
ψ̄(p) = (2πħ)^{-1/2} e^{-ipx_{0}/ħ}.
The inverse Fourier transform then yields
δ(x - x_{0}) = (2πħ)^{-1} ∫ e^{ip(x-x_{0})/ħ} dp.
This is an equivalent definition of the Dirac δ function.
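This integral representation can be illustrated numerically (a sketch, not from the notes): truncating the Fourier integral over a symmetric interval gives the kernel sin(Kx)/(πx), and integrating a smooth test function against it approaches the value of that function at 0 as the cutoff K grows. The Gaussian test function and grid parameters below are arbitrary choices.

```python
import math

# Truncating (1/2pi) * integral of e^{ikx} dk at |k| <= K gives the kernel
# sin(Kx)/(pi*x), which acts more and more like delta(x) as K grows.
def delta_kernel(x, K):
    return K / math.pi if x == 0 else math.sin(K * x) / (math.pi * x)

def smeared_value(f, K, L=20.0, n=200000):
    # Riemann-sum approximation of integral f(x) * delta_kernel(x, K) dx
    dx = 2 * L / n
    total = 0.0
    for i in range(n):
        x = -L + i * dx
        total += f(x) * delta_kernel(x, K) * dx
    return total

f = lambda x: math.exp(-x ** 2)            # smooth test function with f(0) = 1
results = {K: smeared_value(f, K) for K in (5, 20, 80)}
print(results)                             # values approach f(0) = 1 as K grows
```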
An ordinary vector in three-dimensional space may be represented by its components (A_{x}, A_{y}, A_{z}) or (A_{x'}, A_{y'}, A_{z'}), depending on the choice of basis. But if we write A, we specify the vector without explicitly choosing a basis. In Dirac notation we would label this vector |A>.
The quantum state of any physical system is characterized by a state vector, belonging to a space E, which is the state space of the system. If ψ(r) ∈ L^{2} then |ψ> ∈ E. We may consider ψ(r) to be one specific representation of |ψ>, namely the set of its components in the particular basis {δ_{r}}, r playing the role of an index.
A linear functional χ is an operation that associates a complex number χ(|ψ>) ∈ F with every ket |ψ> ∈ V, in a linear way:
χ(λ_{1}|ψ_{1}> + λ_{2}|ψ_{2}>) = λ_{1}χ(|ψ_{1}>) + λ_{2}χ(|ψ_{2}>).
The set of all linear functionals defined on V forms a vector space, which is called the dual space of V, denoted by V^{*}. Forming the inner product <χ|ψ> of the vector |χ> with other elements |ψ> in V is a linear functional: it associates with each vector |ψ> the complex number <χ|ψ>. Therefore this operation is an element of the dual space V^{*}. We denote this element by the symbol <χ| and call it a bra vector, or bra. To every ket in V corresponds a bra in V^{*}. This correspondence is antilinear.
Take the ket λ_{1}|χ_{1}> + λ_{2}|χ_{2}> = |φ>. Form the inner product of this ket with any other vector |ψ> in V:
<φ|ψ> = λ_{1}^{*}<χ_{1}|ψ> + λ_{2}^{*}<χ_{2}|ψ>.
The bra corresponding to |φ> is <φ| = λ_{1}^{*}<χ_{1}| + λ_{2}^{*}<χ_{2}|.
We therefore have that the bra corresponding to λ|ψ> = |λψ> is <λψ| = λ^{*}<ψ|.
Kets and bras are adjoints of each other. To find the adjoint, take the complex conjugate of all scalars and replace each ket (bra) by its corresponding bra (ket).
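A small sketch of this adjoint rule (an illustration with column-vector kets, not from the notes): the bra is the component-wise complex conjugate (with the transpose implicit), and the ket λ|ψ> indeed maps to the bra λ^{*}<ψ|.

```python
# Represent a ket as a list of complex components; the corresponding bra
# is the conjugate transpose (transpose left implicit for a flat list).
def bra(ket):
    return [a.conjugate() for a in ket]

psi = [1 + 1j, 2j]
lam = 3 - 1j

ket2 = [lam * a for a in psi]                        # the ket lambda|psi>
bra2 = bra(ket2)                                     # its bra <lambda psi|
expected = [lam.conjugate() * a for a in bra(psi)]   # lambda* <psi|
assert bra2 == expected                              # the correspondence is antilinear
print(bra2)
```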