Norm
An understanding of norms is needed before proceeding to linear isometries.
Every norm induces a metric (see Induced metric below), so normed spaces are a special case of metric spaces. See Subtypes of topological spaces for more information.
Definition
A norm on a vector space (V,F) (where F is either R or C) is a function ∥⋅∥:V→R such that[1][2][3][4] (see the warning notes [Note 1][Note 2]):
1. ∀x∈V: ∥x∥≥0
2. ∥x∥=0⟺x=0
3. ∀λ∈F, ∀x∈V: ∥λx∥=|λ|∥x∥, where |⋅| denotes the absolute value (the modulus when F=C)
4. ∀x,y∈V: ∥x+y∥≤∥x∥+∥y∥ - a form of the triangle inequality
Often parts 1 and 2 are combined into the single statement:
- ∥x∥≥0 and ∥x∥=0⟺x=0
so that only 3 requirements are stated. I don't like this (in line with the Doctrine of monotonic definition).
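As a quick sanity check, the four axioms can be verified numerically for the Euclidean norm on R³. This is only a sketch in Python (the function names are mine), not a proof:

```python
import math
import random

def euclidean_norm(x):
    """The 2-norm on R^n, used here to illustrate the axioms."""
    return math.sqrt(sum(xi * xi for xi in x))

random.seed(0)
for _ in range(100):
    x = [random.uniform(-10, 10) for _ in range(3)]
    y = [random.uniform(-10, 10) for _ in range(3)]
    lam = random.uniform(-5, 5)
    # 1. non-negativity
    assert euclidean_norm(x) >= 0
    # 3. absolute homogeneity (up to floating-point error)
    assert abs(euclidean_norm([lam * xi for xi in x]) - abs(lam) * euclidean_norm(x)) < 1e-9
    # 4. triangle inequality (small slack for floating-point error)
    assert euclidean_norm([a + b for a, b in zip(x, y)]) <= euclidean_norm(x) + euclidean_norm(y) + 1e-9
# 2. definiteness: the zero vector has norm 0
assert euclidean_norm([0.0, 0.0, 0.0]) == 0.0
```

Random sampling of course cannot establish the axioms; it only fails to refute them.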
Terminology
A vector space equipped with such a function is called a normed space[1].
Relation to inner product
Every inner product ⟨⋅,⋅⟩:V×V→(R or C) induces a norm given by:
- ∥x∥:=√⟨x,x⟩
TODO: See Inner product (norm induced by) for more details; on that page is a proof that ⟨x,x⟩≥0. (I cannot think of any complex norms!)
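For the standard inner product on Rⁿ, the induced norm ∥x∥:=√⟨x,x⟩ is just the Euclidean norm; a minimal sketch of the real case:

```python
import math

def inner(x, y):
    # the standard inner product (dot product) on R^n
    return sum(a * b for a, b in zip(x, y))

def induced_norm(x):
    # the norm induced by the inner product: ||x|| := sqrt(<x, x>)
    return math.sqrt(inner(x, x))

# the classic 3-4-5 triangle: <x, x> = 9 + 16 = 25
assert induced_norm([3.0, 4.0]) == 5.0
```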
Induced metric
To get a metric space from a norm, simply define[1][2]:
- d(x,y):=∥x−y∥
(See Subtypes of topological spaces for more information, this relationship is very important in Functional analysis)
TODO: Some sort of proof that this is never complex
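The induced metric recovers the usual notion of distance; a minimal sketch using the Euclidean norm (the names are illustrative):

```python
import math

def norm(x):
    # the Euclidean norm on R^n
    return math.sqrt(sum(t * t for t in x))

def d(x, y):
    # the induced metric: d(x, y) := ||x - y||
    return norm([a - b for a, b in zip(x, y)])

assert d([1.0, 1.0], [4.0, 5.0]) == 5.0   # the usual Euclidean distance
assert d([2.0, 3.0], [2.0, 3.0]) == 0.0   # d(x, x) = ||0|| = 0
# symmetry follows from ||-v|| = |-1| ||v|| = ||v||
assert d([0.0, 0.0], [1.0, 2.0]) == d([1.0, 2.0], [0.0, 0.0])
```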
Weaker and stronger norms
Given a norm ∥⋅∥_1 and another norm ∥⋅∥_2 on V we say:
- ∥⋅∥_1 is weaker than ∥⋅∥_2 if ∃C>0 such that ∀x∈V: ∥x∥_1≤C∥x∥_2
- ∥⋅∥_2 is stronger than ∥⋅∥_1 in this case
Equivalence of norms
Given two norms ∥⋅∥1 and ∥⋅∥2 on a vector space V we say they are equivalent if:
∃c,C∈R with c,C>0 ∀x∈V: c∥x∥1≤∥x∥2≤C∥x∥1
Theorem: this is an equivalence relation, so we may write ∥⋅∥_1∼∥⋅∥_2.
Note also that if ∥⋅∥_1 is both weaker and stronger than ∥⋅∥_2 then they are equivalent.
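For instance, on Rⁿ the 1-norm and ∞-norm satisfy ∥x∥_∞ ≤ ∥x∥_1 ≤ n∥x∥_∞, so they are equivalent with c=1 and C=n. A quick numerical check of this inequality (a sketch, not a proof):

```python
import random

def norm_1(x):
    # the 1-norm: sum of absolute values
    return sum(abs(t) for t in x)

def norm_inf(x):
    # the infinity-norm: largest absolute value
    return max(abs(t) for t in x)

random.seed(1)
n = 4
for _ in range(1000):
    x = [random.uniform(-100, 100) for _ in range(n)]
    # c = 1 and C = n witness the equivalence on R^n
    assert norm_inf(x) <= norm_1(x) <= n * norm_inf(x)
```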
Examples
- Any two norms on Rn are equivalent
- The norms ∥⋅∥L1 and ∥⋅∥∞ on C([0,1],R) are not equivalent.
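The second example can be seen concretely: a triangular spike of height 1 and shrinking base has sup-norm 1 but L¹-norm tending to 0, so no constant C can satisfy ∥f∥∞ ≤ C∥f∥L1 for all f. A numerical sketch (the spike family and discretisation are my own choices):

```python
def spike(k):
    # triangular spike of height 1 and base width 2/k, centred at 1/2
    return lambda x: max(0.0, 1.0 - k * abs(x - 0.5))

def l1_norm(f, steps=100000):
    # midpoint-rule approximation of the integral of |f| over [0, 1]
    h = 1.0 / steps
    return sum(abs(f((i + 0.5) * h)) for i in range(steps)) * h

def sup_norm(f, steps=100000):
    # approximation of sup |f| on [0, 1] by sampling a grid
    return max(abs(f(i / steps)) for i in range(steps + 1))

for k in (10, 100, 1000):
    f = spike(k)
    # the sup norm stays at 1 while the L1 norm shrinks like 1/k,
    # so the ratio sup/L1 is unbounded over the family
    assert abs(sup_norm(f) - 1.0) < 1e-9
    assert abs(l1_norm(f) - 1.0 / k) < 1e-6
```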
Common norms
Name | Norm | Notes |
---|---|---|
Norms on R^n | | |
1-norm | ∥x∥_1 = ∑_{i=1}^{n} |x_i| | A special case of the p-norm (p=1). |
2-norm | ∥x∥_2 = √(∑_{i=1}^{n} x_i²) | Also known as the Euclidean norm; a special case of the p-norm (p=2). |
p-norm | ∥x∥_p = (∑_{i=1}^{n} |x_i|^p)^{1/p} | (I use this notation because it can be easy to forget the p in the pth root.) |
∞-norm | ∥x∥_∞ = max_{1≤i≤n} |x_i| | Also called the sup-norm or max-norm. |
Norms on C([0,1], R) | | |
∥⋅∥_{L^p} | ∥f∥_{L^p} = (∫_0^1 |f(x)|^p dx)^{1/p} | NOTE: be careful extending to an interval [a,b], as the proof that this is a norm relies on having a unit measure. |
∞-norm | ∥f∥_∞ = sup_{x∈[0,1]} |f(x)| | Following the same spirit as the ∞-norm on R^n. |
∥⋅∥_{C^k} | ∥f∥_{C^k} = ∑_{i=0}^{k} sup_{x∈[0,1]} |f^{(i)}(x)| | Here f^{(i)} denotes the ith derivative, with f^{(0)} = f (the i=0 term is needed so that nonzero constant functions do not get norm 0). |
Induced norms | | |
Pullback norm | ∥u∥_U := ∥L(u)∥_V | For a linear isomorphism L:U→V, where V is a normed vector space. |
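The pullback construction can be sketched directly: given a linear isomorphism L:U→V and a norm on V, setting ∥u∥_U := ∥L(u)∥_V defines a norm on U. The particular map below is a made-up example:

```python
import math

def norm_V(v):
    # the Euclidean norm on V = R^2
    return math.sqrt(sum(t * t for t in v))

def L(u):
    # a linear isomorphism L : U -> V (an invented example; det = 2, so invertible)
    return (2.0 * u[0], u[0] + u[1])

def pullback_norm(u):
    # the pullback norm: ||u||_U := ||L(u)||_V
    return norm_V(L(u))

assert pullback_norm((0.0, 0.0)) == 0.0
assert pullback_norm((1.0, 1.0)) == math.sqrt(8.0)  # L((1,1)) = (2,2)
# homogeneity is inherited from the norm on V (up to floating-point error)
assert abs(pullback_norm((3.0, -6.0)) - 3.0 * pullback_norm((1.0, -2.0))) < 1e-9
```

Injectivity of L is what makes the pullback definite: if L(u)=0 implied only u=0, a non-injective L would give norm 0 to nonzero vectors.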
Notes
1. ↑ A lot of books, including the brilliant Analysis - Part 1: Elements - Krzysztof Maurin referenced here, state explicitly that it is possible to have ∥⋅∥:V→C; they are wrong - it is ∥⋅∥:V→R≥0. Other than this the references are valid. Note that this is 'obvious': if the image of ∥⋅∥ could lie in C then ∥x∥≥0 would make no sense - what ordering would you use? The canonical ordering for a product of two spaces (R×R in this case) is the lexicographic ordering, which would give 1+1j ≤ 1+1000j!
2. ↑ The other mistake books make is saying explicitly that the field of the vector space needs to be R.
References
1. Analysis - Part 1: Elements - Krzysztof Maurin
2. Functional Analysis - George Bachman and Lawrence Narici
3. Functional Analysis - A Gentle Introduction - Volume 1 - Dzung Minh Ha
4. Real and Abstract Analysis - Edwin Hewitt and Karl Stromberg