
/sci/ - Science & Math



File: 8 KB, 547x228, day.gif
No.4403819 [STICKY]

Putnam/Olympiad problem of the day from
http://math.harvard.edu/putnam/
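
The image gives the statement; reconstructed from the replies below, the problem appears to be: let [math] x_1, \dots, x_n [/math] be differentiable real-valued functions of [math] t [/math] satisfying

[math] \frac{dx_i}{dt} = \sum_{j=1}^{n} a_{ij} x_j \qquad (1 \leq i \leq n) [/math]

for some constants [math] a_{ij} > 0 [/math], with [math] x_i(t) \rightarrow 0 [/math] as [math] t \rightarrow \infty [/math] for every [math] i [/math]. Must [math] x_1, \dots, x_n [/math] be linearly dependent?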

>> No.4403828

First

>> No.4403847

Reported for homework

>> No.4403849

Doesn't x_i = e^{-ix} work?

>> No.4403856

>>4403849
a_ij > 0.

>> No.4404091

>>4403819
I'm a biologist and what is this?

>> No.4404176

>>4404091
Calculus.

>> No.4404292

>>4403819

I don't know what the going-to-zero condition is for, but by the exchange lemma we can write, for each function labeled j,

[math] x_j = \sum_{i=1}^{n} b_i \frac{dx_i}{dt} [/math]

Plugging this into the given relations, we find that the images of the functions under the differentiation map are linearly dependent. Since those images are linearly dependent and the differentiation map is a bijection, we're done.

>> No.4404426

The Wronskian is zero, so I suppose it is inconclusive.

>> No.4404830

>>4404292
>exchange lemma

O rly?

>> No.4404906

>>4404426
the wronskian of what?

>> No.4404921

man I like this one, let's give it a try

>> No.4404962

ok, if the matrix [math] (a_{ij})_{1\leq i,j \leq n} [/math] is diagonalizable, then it's settled, because in that case there exists a basis in which we can choose the coordinates independently. And if we can choose the coordinates independently...
I'll try something related to that.

>> No.4405162

>>4404292
>I don't know what the going to zero thing is all about
It's there to block counterexamples like this one:
Take n=2,
take [math]x_1 = t \mapsto 1[/math] and [math]x_2 = t \mapsto t[/math]:
[math]x_1'(t) = 0 = 0\cdot x_1(t) + 0\cdot x_2(t)[/math]
[math]x_2'(t) = 1 = 1\cdot x_1(t) + 0\cdot x_2(t)[/math]
but [math]x_1[/math] and [math]x_2[/math] are not linearly dependent.

This kind of simple trick can't be pulled if the functions all go to [math]0[/math].
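
A quick numerical check of this (my own numpy sketch, not from the thread):

import numpy as np

t = np.linspace(0.0, 5.0, 101)
x1 = np.ones_like(t)   # x_1(t) = 1
x2 = t                 # x_2(t) = t

# The displayed relations: x_1' = 0*x_1 + 0*x_2 and x_2' = 1*x_1 + 0*x_2.
assert np.allclose(np.zeros_like(t), 0 * x1 + 0 * x2)   # x_1' = 0
assert np.allclose(np.ones_like(t), 1 * x1 + 0 * x2)    # x_2' = 1

# Linear independence: sampling at two times gives a nonsingular matrix.
M = np.array([[x1[0], x2[0]],
              [x1[10], x2[10]]])
assert abs(np.linalg.det(M)) > 1e-12

# As >>4405168 points out, three of the four coefficients are 0 here,
# violating a_ij > 0 -- and x_2(t) = t does not go to 0 either.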

>> No.4405168

>>4405162
Bullshit. This is already excluded by a_{ij} > 0

>> No.4405176

>>4405168
Ah my bad, indeed... Let me try to see if there's another trivial counter-example when we don't assume the functions go to 0.

>> No.4405210

>>4404962
ok, by investigating the case n=2, I think it is impossible to have linearly independent solutions, and to prove it one only has to show that the matrix has at least one positive eigenvalue.
(1/3)
Which is easy. Let's call A the matrix with the aforementioned coefficients.
[math] \mathrm{Tr}(A) > 0 [/math] because all the coefficients on the diagonal are positive. As [math] \mathrm{Tr}(A) [/math] is the sum of all the eigenvalues of [math] A [/math], [math] A [/math] has at least one positive eigenvalue [math] \lambda [/math].
[math] A [/math] is at least triangularizable over [math] \mathbb{C} [/math], so there exist [math] P \in GL_{n}(\mathbb{C}) [/math] and an upper triangular matrix [math] T [/math] such that:
1) [math] T = P^{-1} A P [/math]
2) [math] T[n,n] = \lambda [/math]

Let [math] X(t) [/math] be a vector whose coordinates are a solution of the system.
Then [math] X' = A X = P T P^{-1} X [/math], so [math] P^{-1} X' = T P^{-1} X [/math].

Let [math] Y(t) = P^{-1} X(t) [/math].

Then we have [math] Y'(t) = T Y(t) [/math],

and more specifically (last row of an upper triangular matrix) [math] y_{n}'(t) = \lambda y_{n}(t) [/math], so there exists [math] C \in \mathbb{C} [/math] such that [math] y_{n}(t) = C e^{\lambda t} [/math].
If [math] C = 0 [/math], then [math] y_{n}(t) = 0 [/math] and it is settled: the n-th row of [math] P^{-1} [/math] gives a nontrivial relation [math] \sum_i P^{-1}[n,i]\, x_i = 0 [/math], so the functions are linearly dependent.
If [math] C \neq 0 [/math], let's go back to [math] X(t) [/math]:

[math] X(t) = P Y(t) [/math]
and we have [math] X(t) \rightarrow 0 [/math] as [math] t \rightarrow \infty [/math]:

>> No.4405209 [DELETED] 

What if
[math] x_1(t) = \cosh(t) + 2\sinh(t) [/math]
[math] x_2(t) = \sinh(t) + 2\cosh(t) [/math]

>> No.4405214

(2/3?)

and
[math] \forall i,\ 1 \leq i \leq n,\quad x_i = \sum_{k=1}^{n} P[i,k]\, y_k = P[i,n]\, y_n + \sum_{k=1}^{n-1} P[i,k]\, y_k [/math]

In fact none of the [math] y_i [/math] ever vanish: [math] \forall i, \forall t,\ y_i(t) \neq 0 [/math] (they're all sums of exponentials [math] e^{T[m,m] t} [/math], etc.). The important thing is that each of them goes either to 0 or to infinity.
As we cannot have all the [math] P[i,n] [/math] equal to 0 (or P wouldn't be invertible), we have [math] \exists j,\ P[j,n] \neq 0 [/math].
Then, as [math] x_j(t) \rightarrow 0 [/math], there must exist [math] (d_1, \dots, d_p) \in [1, n-1]^p [/math] and [math] (K_1, \dots, K_p) \in \mathbb{C}^p [/math] with [math] P[j,n]\, y_n(t) + \sum_m K_m P[j,d_m]\, y_{d_m}(t) \rightarrow 0 [/math]; in fact this should be equal to 0, since... I don't fucking know, I need to pause. But this is the last thing I have to prove, because it would imply the dependence of the [math] y_i [/math]'s, implying the dependence of some [math] x_i [/math]'s.

>> No.4405215

>>4405210
small correction: there isn't necessarily a positive eigenvalue, but at least one with a positive real part, which is enough for our purpose.
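
For the record, the route being taken here can be packaged compactly (a sketch, not necessarily the official solution): [math] \mathrm{Tr}(A) = \sum_i a_{ii} > 0 [/math], so some eigenvalue [math] \lambda [/math] of [math] A [/math] has [math] \mathrm{Re}(\lambda) > 0 [/math], and the stable subspace

[math] S = \{ v \in \mathbb{C}^{n} : e^{tA} v \rightarrow 0 \text{ as } t \rightarrow \infty \} [/math]

is a proper subspace of [math] \mathbb{C}^{n} [/math]. The solution vector [math] X(t) = e^{tA} X(0) [/math] tends to 0, so [math] X(0) \in S [/math], and since [math] S [/math] is [math] A [/math]-invariant, [math] X(t) \in S [/math] for all [math] t [/math]. Any nonzero [math] c [/math] vanishing on [math] S [/math] then gives [math] \sum_i c_i x_i(t) = c^{\top} X(t) = 0 [/math] for all [math] t [/math] (take real or imaginary parts for a real relation), so the functions are linearly dependent.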

>> No.4405223

>>4405176
One is
[math] dx_1/dt = x_1 + x_2 [/math]
[math] dx_2/dt = x_1 + x_2 [/math]
where
[math] x_1 = e^{2t} + 1 [/math]
[math] x_2 = e^{2t} - 1 [/math]

>> No.4405250

>>4405215
ok fuck this I looked at the solution, it was much easier.

>> No.4405321

>>4405223
Ah, thank you! I was pretty sure it was possible to figure out something.

>> No.4406270

I'm in IB Maths and what is this?

How do I get to learn to pull whatever I need whenever I need it?

HOW DO I INTO SKILL

>> No.4406294 [DELETED] 

Let [math] x_1 = -e^{-t} [/math] and [math] x_i = e^{-t} [/math] for [math] i > 1 [/math]. They clearly satisfy the condition that [math] x_i \rightarrow 0 [/math] as [math] t \rightarrow \infty [/math], and, being scalar multiples of one another, they're linearly dependent. Furthermore,
[math] x_1' = x_1 + \frac{2}{n-1} \sum_{j>1} x_j [/math]
[math] x_i' = n x_1 + \sum_{j>1} x_j [/math]


Could someone please teach me how to display math properly, so I can make my supposed solution readable?

>> No.4406347

>>4406270
I'm taking IB too! HL or SL math?

>> No.4406352

>>4406270

I am a physics grad student and I have no idea how to even approach the majority of these questions... The only people I know who can touch them took the absolute purest-math versions of every course in their undergrad... and they are at the top of their classes.

Cheers

>> No.4406386

>>4406347

Livin' the HL life, you?

>> No.4406391

>>4406386
It's only SL for me. My school is split between local curriculum and IB stuff, so we don't get to choose our SL/HL courses. In other words, I still have to take my diplomas and my SATs (I want to go to school in the states).

Have you done matrices yet? That was my favorite part of IB math.

>> No.4406430

IB student here too but I finished last year.

I took math HL and passed with 5/7. The exam is hard as hell (or at least it was for me) so be ready. In fact I don't even know how I got that score; I expected a 3/7.

Also, discrete math > matrices.

>> No.4406449

>>4406430
>2011
>Doing IB
SHIGGIDY DIGGIDY, BATMAN!

>> No.4406465

>>4406430
Oh lord, that motherfucking Oxford textbook. The answers were so fucking ridiculous; the only good part was that a lot of the IB questions they put in turn up as variations on the exam.

>> No.4407023

>>4406430
Actually, discrete maths does involve matrices. Especially graph theory, where graphs can be represented by matrices, operations on graphs can be represented by matrix products in weird algebras, and the eigenvalues of a graph's matrix are also of importance.
They are also used in (potentially discrete) dynamical systems in several variables, if the system has some good linearity properties, because if the system can be diagonalized, that makes solving it possible.
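
For instance (a minimal numpy sketch; the 4-cycle graph is just my illustration):

import numpy as np

# Adjacency matrix of the 4-cycle 0-1-2-3-0.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

# Matrix products encode graph operations: (A^k)[i, j] counts the
# walks of length k from vertex i to vertex j.
walks_of_length_2 = np.linalg.matrix_power(A, 2)

# The spectrum of the graph; for the 4-cycle it is {2, 0, 0, -2}.
eigenvalues = np.linalg.eigvalsh(A.astype(float))

print(walks_of_length_2)
print(np.round(eigenvalues, 6))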

>> No.4407089

Confused babby linear algebra student here:

what do tuples have to do with dimension? Like if I have a vector of (xyz) components vs. (xyzw) components that only affects the rowspace. Is the row space the column space of some transformation T: V---> W and hence dimension is in fact based on the tuples?

>> No.4407098

>>4407089

1/10

Nice try though. I ain't even mad.

>> No.4407117

>>4407098

I'm legitimately confused because I think there's a mistake in my course notes implying that this is the case when it's clearly not.

>> No.4407317

I just want to say the concept of this as a sticky may be one of the greater ideas in the universe.

>> No.4407328

1) Rewrite system as matrix equation

2) Find eigenvalues of coefficient matrix

3) Determine if New matrix made of eigenvalues v_1, v_2, ... v_n, has a determinant of 0.

4) if yes, then x_1,...x_n, are linearly independent.
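
In numpy terms, step 2 is where the real information is; a sketch of what the spectrum says here (my own example matrix, assuming diagonalizability and no purely imaginary eigenvalues):

import numpy as np

# Step 1: coefficient matrix of the system x' = A x (all entries > 0).
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])

# Step 2: eigenvalues of the coefficient matrix.
eigvals = np.linalg.eigvals(A)

# Positive diagonal entries force Tr(A) > 0, so at least one
# eigenvalue has positive real part and its mode blows up.
assert A.trace() > 0 and (eigvals.real > 0).any()

# Solutions that decay to 0 must therefore lie in the stable subspace
# spanned by eigenvectors with Re(lambda) < 0, of dimension < n --
# which is what forces a linear relation among the x_i.
n_stable = int((eigvals.real < 0).sum())
print("eigenvalues:", np.round(eigvals, 4))
print("stable dimension:", n_stable, "of", A.shape[0])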

>> No.4407330

>>4407328
>implying the matrix is diagonalizable

>> No.4407331

>>4405321
Please stop posting.

>> No.4407335

>>4407328
Oops, meant Eigenvectors.

>> No.4407391

>>4407328
>implying no one thought about that already