Sunday 10 September 2017

4-3 CLRS part (a)

Finally, something apt for my appetite!
4-3[clrs]:
Give asymptotic upper and lower bounds for $T(n)$ in each of the following recurrences. Assume that $T(n)$ is constant for sufficiently small $n$. Make your bounds as tight as possible, and justify your answers.
(a) $T(n) = 4T(n/3)+n\lg n$
Since I'm a lazy person, I'll not show the lower bounds.
For the upper bound:
First of all we'll need two summations:
$$\sum_{j=1}^{j=i}(\frac {4}{3})^j = 4[(\frac{4}{3})^i-1]$$
and
$$\sum_{j=1}^{j=i}(j \times (\frac {4}{3})^j) = M = 1 \times (4/3)+2 \times (4/3)^2+ 3 \times (4/3)^3 +\dots+i \times (4/3)^i \\ \frac {4}{3} \times M = 1 \times (4/3)^2 + 2 \times (4/3)^3 + \dots +i \times (4/3)^{i+1} \\ \frac{4}{3} \times M - M = i \times (\frac{4}{3})^{i+1}-\sum_{j=1}^{j=i}(\frac{4}{3})^j \\ \frac{M}{3} = i \times (\frac{4}{3})^{i+1}-4 \times (\frac {4}{3})^i+4 \\ M = 3i \times (\frac{4}{3})^{i+1}-12 \times(\frac{4}{3})^i+12 = (4i-12) \times (\frac{4}{3})^i+12$$

Now, unroll the recurrence, noting that level $j$ of the recursion contributes $4^{j-1}\cdot(n/3^{j-1})\lg(n/3^{j-1})$:
$$T(n) = 4T(n/3)+n\lg n \\ = 4(4T(n/3^2)+(n/3)\lg(n/3))+n\lg n \\ = 4^2T(n/3^2)+\frac{4}{3}n\lg(n/3)+n\lg n \\ = 4^3T(n/3^3)+(\frac{4}{3})^2n\lg(n/3^2)+\frac{4}{3}n\lg(n/3)+n\lg n \\ = 4^iT(n/3^i)+ n\sum_{j=1}^{j=i}(\frac{4}{3})^{j-1}\lg(\frac{n}{3^{j-1}}) \\ = 4^iT(n/3^i)+ \frac{3}{4}n\sum_{j=1}^{j=i}(\frac{4}{3})^j[\lg n - (j-1)\lg 3] \\ = 4^iT(n/3^i)+\frac{3}{4}n\lg n\sum_{j=1}^{j=i}(\frac{4}{3})^j - \frac{3}{4}n\lg 3\sum_{j=1}^{j=i}j(\frac{4}{3})^j + \frac{3}{4}n\lg 3\sum_{j=1}^{j=i}(\frac{4}{3})^j \\ = 4^iT(n/3^i)+3n\lg n[(\frac{4}{3})^i-1] -\frac{3}{4}n\lg 3[(4i-12)(\frac{4}{3})^i+12]+3n\lg 3[(\frac{4}{3})^i-1]$$
The recursion bottoms out when $n/3^i=1$, i.e. at $i=\log_3n=k$. Substituting $i=k$, and using $n(\frac{4}{3})^k=4^k=n^{\log_34}$ and $k\lg 3=\lg n$:
$$T(n) = 4^kT(1) + 3n\lg n[(\frac{4}{3})^k-1] - \frac{3}{4}n\lg 3[(4k-12)(\frac{4}{3})^k+12] + 3n\lg 3[(\frac{4}{3})^k-1] \\ = n^{\log_34}T(1) + 3\lg n \cdot n^{\log_34} - 3n\lg n - 3k\lg 3\cdot n^{\log_34} + 9\lg 3\cdot n^{\log_34} - 9n\lg 3 + 3\lg 3\cdot n^{\log_34} - 3n\lg 3 \\ = n^{\log_34}T(1) + 12\lg 3\cdot n^{\log_34} - 3n\lg n - 12n\lg 3$$
since the terms $3\lg n\cdot n^{\log_34}$ and $-3k\lg 3\cdot n^{\log_34}$ cancel ($k\lg 3=\lg n$). The largest remaining term is $n^{\log_34}$, which dominates the $n\lg n$ term because $\lg n=o(n^{\log_34-1})$. Hence the asymptotic upper bound of the given recurrence is $O(4^{\log_3n})=O(n^{\log_34})$.
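To sanity-check the bound, here's a quick numerical experiment (my own addition; the base case $T(1)=1$ is an assumption): evaluate the recurrence on powers of $3$ and watch $T(n)/n^{\log_34}$ settle toward a constant.

```python
import math

def T(n):
    """T(n) = 4*T(n/3) + n*lg(n) on powers of 3, with T(1) = 1 (assumed base case)."""
    if n <= 1:
        return 1
    return 4 * T(n // 3) + n * math.log2(n)

# If T(n) = Theta(n^(log_3 4)), this ratio approaches a constant.
exponent = math.log(4, 3)
ratios = [T(3 ** i) / (3 ** i) ** exponent for i in range(5, 15)]
```

The ratios creep upward toward a constant (consistent with the closed form above), rather than blowing up or vanishing, which is what a tight $\Theta(n^{\log_34})$ bound predicts.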

4.3-1,2,3 CLRS

Again, all three of these questions were easy, but were marked. So, alas, I'll put their solutions here too.
4.3-1[clrs]:
Show that the solution of $T(n)=T(n-1)+n$ is $O(n^2)$.
Assume inductively that $T(n-1) \leq c(n-1)^2$, for some constant $c$. Then we'll have, $$T(n)\leq c(n-1)^2+n \\ =c(n^2+1-2n)+n\\=cn^2+c-2cn+n\\ \leq cn^2$$
The last step holds only if $c+n \leq 2cn$. Now, assume $c \geq 1$ and $n \geq 1$; then $2cn = cn+cn \geq c+n$, since $cn \geq c$ (as $n \geq 1$) and $cn \geq n$ (as $c \geq 1$). Therefore the inequality holds under these assumptions.
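A tiny brute-force check of this bound (my own sketch; the base case $T(1)=1$ is an assumption):

```python
def T(n):
    """T(n) = T(n-1) + n, with T(1) = 1 (assumed base case)."""
    t = 1
    for i in range(2, n + 1):
        t += i
    return t

# With c = 1: T(n) <= c*n^2 for every n checked.
holds = all(T(n) <= n * n for n in range(1, 300))
```

Since $T(n)=n(n+1)/2$ here, the bound $T(n)\leq n^2$ holds already at $c=1$.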

4.3-2[clrs]:
Show that the solution of $T(n)=T(\lceil n/2 \rceil)+1$ is $O(\lg n)$.
Assume inductively that $T(m) \leq c\lg m$ for all $m < n$, for some constant $c$. Then we'll have, $$T(n)  \leq c \lg(\lceil n/2 \rceil) +1 \\ \leq c \lg(\frac{n}{2}+1)+1 \\ \leq c \lg(\frac{3n}{4})+1 \\ = c\lg n - c\lg\frac{4}{3}+1 \\ \leq  c\lg n$$
The last step holds whenever $c \geq 1/\lg\frac{4}{3} \approx 2.41$. Note that the third line used $\frac{n}{2}+1 \leq \frac{3n}{4}$. Its proof (backwards):
$$\frac{n}{2}+1 \leq \frac{3n}{4} \\ 1 \leq \frac{3n}{4}-\frac{n}{2} \\ 1 \leq \frac{n}{4} \\ 4 \leq n$$
Therefore, $n_0 = 4$ for our inequality to hold; the remaining small cases $n=2,3$ can be checked directly, since $T(2)=T(1)+1$ and $T(3)=T(2)+1$ are constants.
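The induction can be checked numerically (my own sketch; the base case $T(1)=1$ is an assumption):

```python
import math

def T(n):
    """T(n) = T(ceil(n/2)) + 1, with T(1) = 1 (assumed base case)."""
    if n <= 1:
        return 1
    return T(math.ceil(n / 2)) + 1

# c = 1/lg(4/3) is the constant from the induction above (~2.41).
c = 1 / math.log2(4 / 3)
holds = all(T(n) <= c * math.log2(n) for n in range(2, 3000))
```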

4.3-3[clrs]:
We saw that the solution of $T(n)=2T(\lfloor n/2 \rfloor)+n$ is $O(n\lg n)$. Show that the solution of this recurrence is also $\Omega(n\lg n)$. Conclude that the solution is $\Theta(n\lg n)$.
Assume, this time, that $T(m) \geq cm\lg m$ for all $m < n$, for some constant $c > 0$. Then we'll have, $$ T(n) \geq 2 c \lfloor \frac {n}{2} \rfloor \lg \lfloor \frac{n}{2} \rfloor +n \\ \geq 2(c(\frac{n}{2}-1)\lg(\frac{n}{2}-1))+n \\ \geq 2 (\frac {cn}{2}-c)(\lg(\frac{n}{2})-1)+n \\ = (cn -2c)(\lg(\frac{n}{2})-1 )+n \\ = (cn-2c)(\lg n -2) +n \\ = cn\lg n -2cn -2c\lg n + 4c +n\\ \geq cn\lg n$$
The last step holds as long as $n + 4c \geq 2cn + 2c\lg n$, which is true for, e.g., $c = \frac{1}{4}$ and all $n \geq 2$.

Notice that here too we needed $\lg(\frac{n}{2})-1 \leq \lg(\frac{n}{2}-1)$ (used in the third line). It follows directly: $\lg(\frac{n}{2})-1 = \lg(\frac{n}{4})$, and $\frac{n}{4} \leq \frac{n}{2}-1$ if and only if $n \geq 4$.
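And the lower bound itself can be checked numerically (my own sketch; the base case $T(1)=1$ is an assumption):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2*T(floor(n/2)) + n, with T(1) = 1 (assumed base case)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# Lower bound with c = 1/4, as in the induction above.
holds = all(T(n) >= 0.25 * n * math.log2(n) for n in range(2, 4000))
```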

4.2-6 CLRS

4.2-6[clrs]:
How quickly can you multiply a $kn \times n$ matrix by an $n \times kn$ matrix, using Strassen's algorithm as a subroutine? Answer the same question with the order of the input matrices reversed.
Well, this one was easy. I don't know why I couldn't do it earlier. So,
Case-1:
We want to multiply a $kn \times n$ matrix by an $n \times kn$ matrix. Now we can see that the $kn \times n$ matrix is composed of $k$ $n \times n$ matrices stacked vertically, and the $n \times kn$ matrix of $k$ such matrices placed side by side.

Therefore we can view the two matrices as $k \times 1$ and $1 \times k$ block matrices, by treating each $n \times n$ block as a single element.

Now, multiplying a $k \times 1$ block matrix by a $1 \times k$ block matrix takes $k^2$ block multiplications, along with no block additions. So if $T(kn \times n,n \times kn)$ represents the running time of the multiplication of these two matrices, then:$$T(kn \times n,n \times kn) = k^2 \times T(n \times n,n \times n)$$

And, by Strassen's algorithm, we know that $T(n \times n,n \times n)= \Theta(n^{\log_2 7})$. Therefore, $$T(kn \times n, n \times kn)=\Theta(k^2 \times n^{\log_2 7})$$

Case-2:
Now, in this case, we want to multiply an $n \times kn$ matrix by a $kn \times n$ matrix. Again we view them as $1 \times k$ and $k \times 1$ block matrices. Notice that this multiplication requires $k$ multiplications of $n \times n$ matrices and $k-1$ additions of $n \times n$ matrices, each addition costing $\Theta(n^2)$. Therefore, $$T(n \times kn, kn \times n)= k \times T(n \times n, n \times n) +\Theta(kn^2)$$
Which equals:
$$T(n \times kn,kn \times n) = k \times \Theta(n^{\log_2 7})+\Theta(kn^2) = \Theta(k \times n^{\log_2 7})$$
since $n^{\log_2 7}$ dominates $n^2$.
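The block counts in the two cases can be sketched with a small helper of my own (`p`, `q`, `r` are hypothetical names for the block dimensions):

```python
def block_mult_cost(p, q, r):
    """Cost of multiplying a p x q block matrix by a q x r block matrix:
    p*q*r block multiplications and p*r*(q-1) block additions."""
    return p * q * r, p * r * (q - 1)

k = 5
# Case 1: (kn x n)(n x kn), i.e. (k x 1)(1 x k) blocks: k^2 products, no additions.
case1 = block_mult_cost(k, 1, k)
# Case 2: (n x kn)(kn x n), i.e. (1 x k)(k x 1) blocks: k products, k-1 additions.
case2 = block_mult_cost(1, k, 1)
```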

Sunday 3 September 2017

4.2-5 CLRS

4.2-5[clrs] states:
V. Pan has discovered a way of multiplying $68 \times 68$ matrices using $132,464$ multiplications, a way of multiplying $70 \times 70$ matrices using $143,640$ multiplications, and a way of multiplying $72 \times 72$ matrices using $155,424$ multiplications. Which method yields the best asymptotic running time when used in a divide and conquer matrix multiplication algorithm ?  How does it compare to Strassen's algorithm?


ANSWER:
Here is my attempt at this question:
We are provided with three options:
  1.  Use $68 \times 68$ matrices in order to find the multiplications of larger matrices.
  2.  Use $70 \times 70$ matrices.
  3.  Use $72 \times 72$ matrices instead.
But which one to choose? How to compare?
Well, if we want to compare these three, we need a (bigger) matrix size that works for all three of them. Consider the LCM of $68$, $70$ and $72$: $42840$. So, if we get to multiply two $42840 \times 42840$ matrices, then we can test all three of the methods!

So, let's do that.

I'll start with a smaller example in order to clarify things. Consider multiplying two $9 \times 9$ matrices. If we divide each $9 \times 9$ matrix into nine $3 \times 3$ matrices, then one level of divide and conquer performs $3^3=27$ multiplications of $3 \times 3$ matrices. And if each $3 \times 3$ matrix multiplication takes $k$ scalar multiplications, then we can multiply two $9 \times 9$ matrices in $27 \times k$ multiplications.

Here, the per-method multiplication counts are given in the question. All we have to find is the total number of multiplications.

CASE-1:
Treating the $42840 \times 42840$ matrix as a $630 \times 630$ block matrix of $68 \times 68$ blocks, we get $630^{3} \times 132464 \approx 3.3122 \times 10^{13}$ multiplications.

CASE-2:

Treating it as a $612 \times 612$ block matrix of $70 \times 70$ blocks, we get $612^{3} \times 143640 \approx 3.2925 \times 10^{13}$ multiplications.

CASE-3:

Treating it as a $595 \times 595$ block matrix of $72 \times 72$ blocks, we get $595^{3} \times 155424 \approx 3.2739 \times 10^{13}$ multiplications. So the least number of multiplications occurs in case 3, when we use $72 \times 72$ matrices in a divide-and-conquer matrix-multiplication algorithm.
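The three totals are easy to reproduce (a quick script of my own; it just redoes the arithmetic above):

```python
# One level of blocking on two 42840 x 42840 matrices:
# (N/b)^3 block multiplications, each costing m scalar multiplications.
N = 42840
methods = {68: 132464, 70: 143640, 72: 155424}
totals = {b: (N // b) ** 3 * m for b, m in methods.items()}
```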

Now, guess what? I'm wrong.
 REAL ANSWER: 

What I was calculating counts only the multiplications, not the additions that are done.
Well, I'd seen this blog-post upon this question. I didn't really understand it at first, but later I found my mistake. In question number 4.2-4 of the same blog, we can find a recurrence of the form: $$T(n) = k\times T(n/3)+\Theta(n^2)$$

which, by case 1 of the master theorem, gives $\Theta(n^{\log_3k})$; the $\Theta(n^2)$ term accounts for the additions.
Applied here, the three methods run in $\Theta(n^{\log_{68} 132464})$, $\Theta(n^{\log_{70} 143640})$ and $\Theta(n^{\log_{72} 155424})$ respectively, i.e. with exponents of roughly $2.79513$, $2.79512$ and $2.79514$. So, this blog gives the second way as the answer, and that seems legit: $\log_{70} 143640$ is (just barely) the smallest of the three exponents, while the third method loses out once its extra additions are counted. Therefore, IMHO, a $70 \times 70$ matrix should be used for the divide-and-conquer matrix multiplication. Note that all three exponents beat Strassen's $\log_2 7 \approx 2.807$.
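The exponents are quick to check numerically (my own verification sketch):

```python
import math

# T(n) = m * T(n/b) + Theta(n^2) gives Theta(n^(log_b m)) by master theorem case 1.
methods = {68: 132464, 70: 143640, 72: 155424}
exponents = {b: math.log(m, b) for b, m in methods.items()}
best = min(exponents, key=exponents.get)   # method with the smallest exponent
strassen = math.log2(7)                    # ~2.807
```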
 
