Graduate Entrance Exam Problems: UTokyo Graduate School of Mathematical Sciences, Heisei 30 (2018), Specialized Subject B


Basic policy for writing up entrance exam solutions

The problems can be viewed at the link below.
平成31(2019)年度修士課程入学試験について | 東京大学大学院数理科学研究科理学部数学科・理学部数学科

9

Nothing except (4) is particularly difficult. (1) and (2) are Banach space problems, but from (3) on you need the inner product, so some mental gear-shifting may be required. The key to (4) is noticing the orthogonal decomposition.

Assumption
It is assumed that for every g \in L^q(X) the following holds:

(1)   \begin{align*} \lim_{n \to \infty}\int_{X}f_ngd\mu = \int_{X}fgd\mu. \end{align*}

(1)
By Hölder’s inequality, for every g \in L^q(X) s.t. ||g||_q = 1, we have

(2)   \begin{align*} \int_{X}fg d\mu \leq ||f||_p.  \end{align*}


Hence \sup_{||g||_q = 1}\int_{X}fgd\mu \leq ||f||_p. Assuming f \neq 0 (the case f = 0 is trivial), define h(x) = (1/||f||^{p-1}_p)\mathrm{sgn}(f(x))|f(x)|^{p-1}, where

(3)   \begin{equation*} \mathrm{sgn}(x) = \left\{ \begin{aligned} 1, &\quad x > 0, \\ 0, &\quad x = 0, \\ -1, &\quad x < 0. \end{aligned} \right.  \end{equation*}


Then it holds that h \in L^q(X), \ ||h||_q = 1 and

(4)   \begin{align*} \int_{X}fhd\mu = ||f||_p,  \end{align*}


and therefore we obtain the result.
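
As a sanity check, here is a small numerical sketch of (1). It replaces (X, \mu) by a finite set with counting measure, which is an assumption made purely for illustration: random competitors g with ||g||_q = 1 never exceed ||f||_p, while the extremal element h above attains it.

import numpy as np

rng = np.random.default_rng(0)
d, p = 50, 3.0
q = p / (p - 1.0)                        # conjugate exponent, 1/p + 1/q = 1

f = rng.normal(size=d)
f_p = np.sum(np.abs(f) ** p) ** (1.0 / p)          # ||f||_p

best = -np.inf                            # sup over random g with ||g||_q = 1
for _ in range(2000):
    g = rng.normal(size=d)
    g /= np.sum(np.abs(g) ** q) ** (1.0 / q)
    best = max(best, np.sum(f * g))

h = np.sign(f) * np.abs(f) ** (p - 1) / f_p ** (p - 1)   # extremal element
print(np.sum(np.abs(h) ** q) ** (1.0 / q))               # ~ 1, so ||h||_q = 1
print(best, np.sum(f * h), f_p)                          # best <= <f,h> = ||f||_p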

(2)
By Hölder’s inequality, we have for every g \in L^q(X) s.t. ||g||_q = 1,

(5)   \begin{align*} \int_{X}f_ngd\mu \leq ||f_n||_p.   \end{align*}


Then from (1), it follows that

(6)   \begin{align*} \int_{X}fgd\mu \leq \liminf_{n \to \infty}||f_n||_p.  \end{align*}


Then, by part (1), we obtain the result.
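
The inequality can be strict. A minimal numerical sketch, assuming for illustration only that X = [0,1] with Lebesgue measure and p = 2: the oscillating sequence f_n = f + \sin(2\pi n x) satisfies the assumption (1) by Riemann–Lebesgue, while ||f_n||_p stays strictly above ||f||_p.

import numpy as np

x = np.linspace(0.0, 1.0, 200001)
p = 2.0
f = np.exp(-x)                       # a fixed f in L^p([0,1])
g = np.cos(3.0 * x)                  # a test function g in L^q([0,1])

def integral(u):
    return np.mean(u)                # crude quadrature on the uniform grid

def norm_p(u):
    return integral(np.abs(u) ** p) ** (1.0 / p)

for n in (5, 50, 500):
    f_n = f + np.sin(2.0 * np.pi * n * x)
    print(n, integral(f_n * g), norm_p(f_n))   # first -> int f*g, second stays large
print(integral(f * g), norm_p(f))              # here ||f||_p < liminf ||f_n||_p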

(3)
We denote the inner product of f,g \in L^2(X) as (f,g). Note that

(7)   \begin{align*} ||f_n - f||^2_2 = ||f_n||^2_2 + ||f||^2_2 - 2(f_n,f). \end{align*}


From the assumption and (1), we have \lim_{n \to \infty}||f_n||_2 = ||f||_2 and \lim_{n\to \infty}(f_n,f) = ||f||^2_2. Then from (7), we obtain the result.

(4)
We decompose f = g + h where g \in \overline{V},\ h \in \overline{V}^{\perp}. Then it follows that

(8)   \begin{align*} \int_{X}f_nfd\mu = \int_{X}f_ngd\mu  \end{align*}


Then from (1), the LHS of (8) converges to ||f||^2_2 as n \to \infty, and the RHS converges to (f,g) = ||g||^2_2. Hence ||f||^2_2 = ||g||^2_2, and since ||f||^2_2 = ||g||^2_2 + ||h||^2_2 by orthogonality, it follows that h = 0.
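
A finite-dimensional sketch of the identity (8) and the orthogonal decomposition, under the assumption (taken from the setting of the problem, which is not reproduced here) that each f_n lies in V; the subspace and vectors below are made up for illustration.

import numpy as np

rng = np.random.default_rng(1)
d, r = 8, 3
V = rng.normal(size=(d, r))             # columns span V (here V is closed)
P_V = V @ np.linalg.solve(V.T @ V, V.T) # orthogonal projection onto V

f = rng.normal(size=d)
g, h = P_V @ f, f - P_V @ f             # f = g + h, g in V, h in V^perp

f_n = V @ rng.normal(size=r)            # an arbitrary element of V
print(f_n @ f, f_n @ g)                 # equal: <f_n, f> = <f_n, g>, as in (8)
print(f @ f, g @ g + h @ h)             # ||f||^2 = ||g||^2 + ||h||^2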

11

(1)
Assume A is an orthogonal projection. Since A^2 = A, we have

(9)   \begin{align*}     P_1P_2 + P_2P_1 = 0.  \end{align*}


Then for every \xi \in \mathcal{H}, we have \mathrm{Re} \langle P_1\xi,P_2\xi \rangle = 0.
Take \xi \in \mathcal{H}_1 \cap \mathcal{H}_2. Then it follows that

(10)   \begin{align*}      0 = \mathrm{Re}\langle P_1\xi,P_2\xi\rangle = ||\xi||^2,  \end{align*}


and hence \xi = 0. Therefore we obtain \mathcal{H}_1 \cap \mathcal{H}_2 = \{ 0 \}.
For every \xi \in \mathcal{H}, it holds from (9) that P_1P_2\xi \in \mathcal{H}_1 \cap \mathcal{H}_2. Hence P_1P_2 = P_2P_1 = 0, and from this, \theta = \pi/2 easily follows.

Assume \theta = \pi/2. Then for every \xi, \eta \in \mathcal{H}, we have \langle P_1\xi,P_2\eta \rangle = 0, and it easily follows that P_1P_2 = P_2P_1 = 0. Hence A^2 = (P_1 + P_2)^2 = P_1 + P_2 = A, and since A is self-adjoint, A is an orthogonal projection.
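
A small numerical check of (1), assuming that \cos\theta stands for \sup\{ |\langle \xi,\eta \rangle| : \xi \in \mathcal{H}_1, \eta \in \mathcal{H}_2, ||\xi|| = ||\eta|| = 1 \} (this reading of \theta is inferred from how it is used above, not quoted from the problem): for generic subspaces P_1 + P_2 fails to be a projection and \cos\theta > 0, while for orthogonal subspaces both A^2 = A and \cos\theta = 0 hold.

import numpy as np

rng = np.random.default_rng(2)
d = 6

def proj(U):
    """Orthogonal projection onto the column span of U."""
    Q, _ = np.linalg.qr(U)
    return Q @ Q.T

# generic (non-orthogonal) subspaces: P_1 + P_2 is not a projection
P1, P2 = proj(rng.normal(size=(d, 2))), proj(rng.normal(size=(d, 2)))
A = P1 + P2
print(np.linalg.norm(A @ A - A))            # > 0: not a projection
print(np.linalg.norm(P1 @ P2, 2))           # cos(theta) > 0

# orthogonal subspaces: take H_2 inside the orthogonal complement of H_1
U1 = np.linalg.qr(rng.normal(size=(d, 2)))[0]
U2 = np.linalg.qr((np.eye(d) - U1 @ U1.T) @ rng.normal(size=(d, 2)))[0]
P1, P2 = U1 @ U1.T, U2 @ U2.T
A = P1 + P2
print(np.linalg.norm(A @ A - A))            # ~ 0: A is an orthogonal projection
print(np.linalg.norm(P1 @ P2, 2))           # ~ 0: cos(theta) = 0, theta = pi/2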

(2)
Note that

(11)   \begin{align*}     \mathcal{H} = (\mathcal{H}_1 \ominus \mathcal{H}_2) \oplus (\mathcal{H}_2 \ominus \mathcal{H}_1) \oplus (\mathcal{H}_1 \cap \mathcal{H}_2),  \end{align*}


where \mathcal{H}_i \ominus \mathcal{H}_j denotes \mathcal{H}_i \cap \mathcal{H}^{\perp}_j and \oplus denotes the orthogonal direct sum of Hilbert spaces.
It is enough to show that \mathcal{H}_i \ominus \mathcal{H}_j, \ \mathcal{H}_1 \cap \mathcal{H}_2 \subset B\mathcal{H} \  (i,j = 1,2, \ i \neq j).
Let \xi \in \mathcal{H}_i \ominus \mathcal{H}_j. Then we have B(B\xi) = A\xi = \xi.
Hence \xi \in B\mathcal{H}.
Let \xi \in \mathcal{H}_1 \cap \mathcal{H}_2. Then we have B(B\xi) = A\xi = 2\xi, and therefore \xi \in B\mathcal{H}.
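A toy example (chosen here for illustration, not taken from the problem) of the identities used above: in \mathbb{R}^4 with \mathcal{H}_1 = \mathrm{span}\{e_1,e_2\} and \mathcal{H}_2 = \mathrm{span}\{e_2,e_3\}, one has A\xi = \xi on \mathcal{H}_1 \ominus \mathcal{H}_2 and A\xi = 2\xi on \mathcal{H}_1 \cap \mathcal{H}_2.

import numpy as np

e = np.eye(4)
P1 = np.outer(e[0], e[0]) + np.outer(e[1], e[1])   # projection onto H_1
P2 = np.outer(e[1], e[1]) + np.outer(e[2], e[2])   # projection onto H_2
A = P1 + P2

print(A @ e[0])     # e1 lies in H_1 (-) H_2 = H_1 cap H_2^perp : A e1 = e1
print(A @ e[1])     # e2 lies in H_1 cap H_2                    : A e2 = 2 e2
print(A @ e[3])     # e4 is orthogonal to H_1 + H_2             : A e4 = 0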

(3)
Note that \langle A\xi, \xi \rangle = 0 for every \xi \in \mathcal{L}^{\perp}. Then, by part (2), it is enough to show that for every \xi \in \mathcal{H},

(12)   \begin{align*}     \langle AB\xi, B\xi \rangle \leq (1 + \cos \theta)||B\xi||^2.  \end{align*}


Since it holds that ||B\xi||^2 = ||P_1\xi||^2 + ||P_2\xi||^2, we have, writing \xi_1 = \frac{P_1\xi}{||P_1\xi||} and \xi_2 = \frac{P_2\xi}{||P_2\xi||} (if P_1\xi = 0 or P_2\xi = 0, then (12) is immediate),

(13)   \begin{align*}     \langle AB\xi, B\xi \rangle &= ||A\xi||^2   \\     &= (||P_1\xi||^2 + ||P_2\xi||^2) + 2\mathrm{Re} \langle P_1\xi,P_2\xi \rangle  \\     &= ||B\xi||^2 + \frac{2||P_1\xi|| ||P_2\xi||}{||P_1\xi||^2 + ||P_2\xi||^2}     \mathrm{Re} \langle \xi_1,\xi_2 \rangle ||B\xi||^2  \\     &\leq \left(1 + \frac{2||P_1\xi|| ||P_2\xi||}{||P_1\xi||^2 + ||P_2\xi||^2}\cos \theta \right)||B\xi||^2.     \end{align*}


Since \frac{2xy}{x^2 + y^2} \leq 1 for every (x,y) \neq (0,0) and \cos\theta \geq 0, we obtain the desired result.
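
A numerical check of (12), under two assumptions that are not visible in this excerpt: that B is the positive square root of A = P_1 + P_2 (consistent with B(B\xi) = A\xi used in (2)), and that \cos\theta is the largest cosine of a principal angle between \mathcal{H}_1 and \mathcal{H}_2.

import numpy as np

rng = np.random.default_rng(3)
d = 7

def orth(k):
    """Orthonormal basis of a random k-dimensional subspace of R^d."""
    return np.linalg.qr(rng.normal(size=(d, k)))[0]

U1, U2 = orth(2), orth(3)
P1, P2 = U1 @ U1.T, U2 @ U2.T
A = P1 + P2
cos_theta = np.linalg.norm(U1.T @ U2, 2)       # largest principal-angle cosine

# positive square root of A via its spectral decomposition
w, V = np.linalg.eigh(A)
B = V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

worst = -np.inf
for _ in range(10000):
    xi = rng.normal(size=d)
    Bxi = B @ xi
    lhs = Bxi @ (A @ Bxi)                      # <A B xi, B xi>
    rhs = (1.0 + cos_theta) * (Bxi @ Bxi)      # (1 + cos theta) ||B xi||^2
    worst = max(worst, lhs - rhs)
print(cos_theta, worst)                        # worst stays <= 0, as (12) claims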

14

(1) and (2) are quite easy and can be solved without much background in probability theory, but from (3) on it is hard without some prior knowledge.

Assumption
For every \epsilon > 0 and k \in \mathbb{N}, there exists N \in \mathbb{N} such that for every m,n \geq N, we have

(14)   \begin{align*}\sup_{x \in \mathbb{R}} | P[X_nS_k > x] - P[X_mS_k > x]| < \epsilon. \end{align*}

(1)
This is no more than a simple calculation:

(15)   \begin{align*}P[S_k] = 1, \ \text{and} \ P[(S_k - P[S_k])^2] = 1/k. \end{align*}

(2)
By (15) and Chebyshev’s inequality, we have for every \epsilon > 0,

(16)   \begin{align*}P[|S_k - 1| > \epsilon] \leq 1/(\epsilon^2k) \xrightarrow{k \to \infty} 0.\end{align*}


Hence S_k converges to 1 in probability.
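
A quick Monte Carlo illustration of (15) and (16). The precise definition of S_k is in the problem statement and not reproduced here; the sketch below assumes S_k is the sample mean of k i.i.d. Exp(1) variables, which has mean 1 and variance 1/k as in (15).

import numpy as np

rng = np.random.default_rng(4)
eps, N = 0.2, 200000

for k in (10, 100, 1000):
    # Gamma(k, 1/k) is the law of the mean of k i.i.d. Exp(1) variables
    S_k = rng.gamma(shape=k, scale=1.0 / k, size=N)
    print(k,
          S_k.mean(),                          # ~ 1,   i.e. P[S_k] = 1
          S_k.var(),                           # ~ 1/k
          np.mean(np.abs(S_k - 1.0) > eps),    # empirical P[|S_k - 1| > eps]
          1.0 / (eps ** 2 * k))                # Chebyshev bound 1/(eps^2 k)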

(3)
Fix k \in \mathbb{N}. By Helly’s selection theorem, there is a subsequence \{X_{n'}S_k\}_{n'} whose laws converge vaguely to a finite measure \mu_k. It is enough to show that \mu_k is a probability measure and that the law of X_nS_k converges to \mu_k as n \to \infty. Letting n \to \infty along the subsequence \{n'\} in (14), we see that for every \epsilon > 0 there exists N \in \mathbb{N} such that for every m \geq N and every continuity point x of \mu_k, |\mu_k(x,\infty) - P[X_mS_k > x]| \leq \epsilon; since the continuity points are dense and both \mu_k(x,\infty) and P[X_mS_k > x] are right-continuous in x, this extends to every x \in \mathbb{R}:

(17)   \begin{align*}\sup_{x \in \mathbb{R}}|\mu_k(x,\infty) - P[X_mS_k > x]| \leq \epsilon. \end{align*}


Fix \delta > 0. Taking x negative with |x| so large that P[X_NS_k > x] > 1 - \delta, we have from (17)

(18)   \begin{align*}\mu_k(x,\infty) \geq 1 - \delta - \epsilon. \end{align*}


Since \epsilon and \delta can be taken arbitrarily small, it follows that \mu_k(\mathbb{R}) = 1, so \mu_k is a probability measure.
Again from (17), we have

(19)   \begin{align*}\lim_{m \to \infty}\sup_{x \in \mathbb{R}}|\mu_k(x,\infty) - P[X_mS_k > x]| = 0, \end{align*}


and hence the law of X_mS_k converges weakly to \mu_k as m \to \infty.
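
To see (14) and (19) in action, here is a Monte Carlo sketch for one concrete, entirely assumed choice: X_n = Z + 1/n with Z standard normal, independent of S_k (again simulated as the mean of k i.i.d. Exp(1)). The empirical sup-distance between the survival functions of X_nS_k and a late reference index shrinks as n grows, up to Monte Carlo error.

import numpy as np

rng = np.random.default_rng(5)
k, N = 20, 200000
grid = np.linspace(-4.0, 4.0, 401)

def tail(n):
    """Empirical survival function x -> P[X_n S_k > x] on the grid."""
    Z = rng.normal(size=N)
    S = rng.gamma(shape=k, scale=1.0 / k, size=N)   # mean of k i.i.d. Exp(1)
    Y = (Z + 1.0 / n) * S                           # X_n = Z + 1/n (assumption)
    return np.array([np.mean(Y > x) for x in grid])

ref = tail(1000)
for n in (5, 20, 100, 1000):
    print(n, np.max(np.abs(tail(n) - ref)))   # decreases in n, up to MC noise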

(4)
First, we show the tightness of the sequence \{X_n\}_n. Fix \epsilon \in (0,1). Then for every R > 0 and k \in \mathbb{N},

(20)   \begin{align*}P[|X_n| > R] &= P[|X_nS_k| > S_kR ]  \\&\leq P[|S_k - 1| \geq \epsilon] + P[|X_nS_k| > (1 - \epsilon) R]. \end{align*}


By part (3), it follows that

(21)   \begin{align*}\lim_{R \to \infty}\limsup_{n \to \infty}P[|X_n| > R] \leq P[|S_k - 1| \geq \epsilon]. \end{align*}


Since k can be taken arbitrarily large and P[|S_k - 1| \geq \epsilon] \to 0 as k \to \infty by part (2), we obtain the tightness of \{X_n\}_n.
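
The event inclusion behind (20) can also be checked empirically, again for the assumed concrete choice X_n = Z + 1/n independent of S_k; note S_k > 0 here, which is what turns |X_n| > R into |X_nS_k| > S_kR.

import numpy as np

rng = np.random.default_rng(6)
N, k, n, eps, R = 200000, 50, 10, 0.2, 2.5

Z = rng.normal(size=N)
S = rng.gamma(shape=k, scale=1.0 / k, size=N)   # mean of k i.i.d. Exp(1), S > 0
X = Z + 1.0 / n                                 # X_n = Z + 1/n (assumption)

lhs = np.mean(np.abs(X) > R)
rhs = np.mean(np.abs(S - 1.0) >= eps) + np.mean(np.abs(X * S) > (1.0 - eps) * R)
print(lhs, rhs)                                 # lhs <= rhs, as in (20)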
Suppose that the laws of X_n converge to probability measures \nu and \nu' along subsequences \{X_{n'}\}_{n'} and \{X_{n''}\}_{n''}, respectively. Since, for every \epsilon > 0 and every x such that \mu_k\{x\} = 0 for all k, it holds that

(22)   \begin{align*}&P[X_nS_k > x] - P[X_mS_k > x]  \\\leq &P[X_n > x/(1 + \epsilon)] - P[X_m > x/(1 - \epsilon)] + 2P[|S_k - 1| > \epsilon], \end{align*}


by taking n and m to \infty along \{n'\} and \{n''\}, respectively, and then letting k \to \infty, we have

(23)   \begin{align*}0 \leq \nu[x/(1 + \epsilon), \infty) - \nu'(x/(1 - \epsilon), \infty). \end{align*}


Letting \epsilon \to 0, we obtain

(24)   \begin{align*}0 \leq \nu[x,\infty) - \nu'(x,\infty), \end{align*}


and similarly we can show \nu'[x,\infty) - \nu(x,\infty) \geq 0. Since the set of points x \in \mathbb{R} with \nu\{x\} > 0, \nu'\{x\} > 0 or \mu_k\{x\} > 0 for some k is at most countable, we obtain \nu = \nu'. Together with the tightness established above, this shows that the law of X_n converges weakly as n \to \infty.
