Answer:
Step-by-step explanation:
a) Consider the sequence
$$a_n = \begin{cases} 1, & n \text{ odd},\\ -1, & n \text{ even},\end{cases}$$
that is, $a_n = (-1)^{n+1}$. The sequence diverges, since it oscillates between $1$ and $-1$ and so does not approach any single number as $n$ tends to infinity; but the subsequence of even-indexed terms is constant, so it converges to $-1$.
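Writing the two subsequences out makes the divergence explicit: they settle on different limits, so the full sequence can have no limit at all:
$$a_{2k} = -1 \xrightarrow{\;k \to \infty\;} -1, \qquad a_{2k-1} = 1 \xrightarrow{\;k \to \infty\;} 1 \neq -1.$$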
b) Consider the sequence $a_n = -\frac{1}{n}$. The function $f(x) = \frac{1}{x}$ is monotonically decreasing for $x > 0$ and tends to $0$ as $x \to \infty$. Multiplying by $-1$ turns it into a monotonically increasing function that still tends to $0$. Hence the given sequence is monotonically increasing and converges to $0$.
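The monotonicity can also be checked directly from consecutive terms, without appealing to the function $f$:
$$a_{n+1} - a_n = -\frac{1}{n+1} - \left(-\frac{1}{n}\right) = \frac{1}{n} - \frac{1}{n+1} = \frac{1}{n(n+1)} > 0,$$
so each term is strictly larger than the one before it, while every term stays below the limit $0$.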
c) Suppose that the sequence $(a_n)$ converges to $a$. Then for every $\varepsilon > 0$ there is an index $N$ beyond which every term lies within $\varepsilon$ of $a$. In particular, taking $\varepsilon = 1$, all but finitely many terms satisfy $a - 1 < a_n < a + 1$. Since the finitely many remaining terms are bounded as well, the whole sequence must be bounded.
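One way to make "all but finitely many" precise: with $\varepsilon = 1$ in the definition of convergence, there is an $N$ such that $|a_n - a| < 1$ for all $n \ge N$, and then
$$|a_n| \le M := \max\{|a_1|, |a_2|, \dots, |a_{N-1}|, \, |a| + 1\} \quad \text{for every } n,$$
so $M$ bounds the entire sequence.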
d) By the Monotone Convergence Theorem, a monotonically increasing (or decreasing) sequence that is bounded must converge, so such a sequence cannot exist.
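For reference, one standard form of that theorem: if $(a_n)$ is monotonically increasing and bounded above, then
$$\lim_{n \to \infty} a_n = \sup_{n} a_n,$$
which exists as a real number by the completeness of $\mathbb{R}$; the decreasing case is symmetric, with $\inf$ in place of $\sup$.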