Message from @ThisIsChris

Discord ID: 506667933188227074


2018-10-30 03:09:52 UTC  

so for the positive case you have that f(n) involves a check on n, which is constant time, a println, which is constant time, and then two calls to f(n-1)

2018-10-30 03:10:30 UTC  

so in this n > 0 case
f(n) = the constant stuff + 2\*f(n-1) = O(1) + 2\*f(n-1)
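
Concretely, that description suggests something like this (a hypothetical reconstruction, not the original listing):
```java
// hypothetical reconstruction of two(), based only on the description above:
// a check on n, one println, then two recursive calls on n-1
public static void two(int n) {
    if (n > 0) {                 // the constant-time check
        System.out.println(n);   // the constant-time println
        two(n - 1);              // first recursive call
        two(n - 1);              // second recursive call
    }
}
```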

2018-10-30 03:11:23 UTC  

But don't the calls to f(n-1) create even more calls?

2018-10-30 03:11:28 UTC  

since it's recursive

2018-10-30 03:11:32 UTC  

Yep

2018-10-30 03:11:55 UTC  

Do I need some kind of formula to determine how many calls it will create?

2018-10-30 03:12:12 UTC  

so by using the same formula, f(n-1) = 2\*f(n-2) + O(1)

2018-10-30 03:13:19 UTC  

plugging that into what you had before:
f(n) = 2\*(2\*f(n-2) + O(1)) + O(1) = 2\*2\*f(n-2) + O(1)

2018-10-30 03:13:48 UTC  

ah, so it just simplifies to another logarithmic function?

2018-10-30 03:13:53 UTC  

yes

2018-10-30 03:14:16 UTC  

it's exponential (the inverse of a logarithm)
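
Written out, the unrolling looks like this (a sketch of the algebra, assuming the base case is f(0) = O(1)):
```latex
\begin{aligned}
f(n) &= 2f(n-1) + O(1) \\
     &= 4f(n-2) + 3\,O(1) \\
     &= 8f(n-3) + 7\,O(1) \\
     &\;\;\vdots \\
     &= 2^{k} f(n-k) + \big(2^{k}-1\big)\,O(1)
\end{aligned}
```
so at k = n the base case is reached and f(n) = (2^(n+1) - 1)\*O(1) = O(2^n)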

2018-10-30 03:15:01 UTC  

Okay. That should be enough to write up a logical explanation, I think. Apparently the other two functions are easier to analyze.

2018-10-30 03:15:03 UTC  

Thanks a lot

2018-10-30 03:15:28 UTC  

```java
public void three(int n)
{
    int i, j, k;
    for (i = n/2; i > 0; i = i/2)
        for (j = 0; j < n; j++)
            for (k = 0; k < n; k++)
                System.out.println("i: " + i + " j: " + j + " k: " + k);
}
```

2018-10-30 03:15:47 UTC  

This is the next one. Should be easier for me to figure this one out since it's just for loops, right?

2018-10-30 03:16:30 UTC  

yes

2018-10-30 03:16:40 UTC  

```java
public static void four(int n)
{
    if (n > 1)
    {
        System.out.println(n);
        four(n-1);
    }
    for (int i = 0; i < n; i++)
        System.out.println(i);
}
```

2018-10-30 03:16:56 UTC  

I'm pretty sure this one is just Nlog(N) but I might be wrong

2018-10-30 03:17:15 UTC  

idk I'll go try to figure this stuff out myself and come back if there's a problem

2018-10-30 03:17:16 UTC  

thanks

2018-10-30 03:17:28 UTC  

you're welcome

2018-10-30 03:47:39 UTC  

@ThisIsChris If there was only one call to two() rather than two calls, would that make it an O(n) function?

2018-10-30 03:48:00 UTC  

Since it would just call itself n times until it reaches the base case, rather than calling itself twice each time

2018-10-30 05:57:25 UTC  

@Jacob yes
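
For illustration, a hypothetical one-call version of two() and the recurrence it would give (the name and body are made up to match the description, not the original code):
```java
// hypothetical one-call variant of two(), only to illustrate the O(n) case
public static void twoOneCall(int n) {
    if (n > 0) {                 // constant-time check
        System.out.println(n);   // constant-time println
        twoOneCall(n - 1);       // a single recursive call instead of two
    }
}
// recurrence: f(n) = f(n-1) + O(1), which unrolls to n * O(1) = O(n)
```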

2018-10-30 06:18:18 UTC  

Thanks

2018-10-30 06:18:48 UTC  

so I guess the last function is O(n^2 log(n))

2018-10-30 06:19:59 UTC  

wait no

2018-10-30 06:20:05 UTC  

last one is O(n^2)

2018-10-30 06:20:24 UTC  

second one is O(n^2 log(n))

2018-10-30 06:20:39 UTC  

maybe that simplifies to something idk
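
For three(), a rough sketch of where the n^2 log(n) comes from (it doesn't simplify further):
```latex
\underbrace{\log_2 n}_{\text{outer loop: } i = n/2,\ n/4,\ \dots,\ 1}
\times \underbrace{n}_{j\text{-loop}}
\times \underbrace{n}_{k\text{-loop}}
= O\!\big(n^2 \log n\big)
```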

2018-10-30 16:20:16 UTC  

@Jacob base case is f(1) = O(1)

2018-10-30 16:21:04 UTC  

for n > 1:
f(n) = f(n-1) + O(n)

2018-10-30 16:21:36 UTC  

so f(n) = sum over i from 1 to n of O(i)

2018-10-30 16:22:05 UTC  

sum over i from 1 to n of i = n*(n+1)/2

2018-10-30 16:22:16 UTC  

so f(n) = O(n^2)
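
Spelled out, the unrolling looks like this (a sketch, using the base case f(1) = O(1) from above):
```latex
\begin{aligned}
f(n) &= f(n-1) + O(n) \\
     &= f(n-2) + O(n-1) + O(n) \\
     &\;\;\vdots \\
     &= f(1) + \sum_{i=2}^{n} O(i)
      = O\!\left(\frac{n(n+1)}{2}\right)
      = O\!\big(n^2\big)
\end{aligned}
```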

2018-10-31 06:30:30 UTC  

^

2018-10-31 06:31:00 UTC  

you could also expand and see how much work is being done in each step

2018-10-31 06:31:26 UTC  

four(n) makes a recursive call (O(1) on its own, not accounting for the work inside the recursion) and then does n print statements in its loop

2018-10-31 06:31:53 UTC  

if you expand that all the way, you'll see that it does n+(n-1)+(n-2)+...+2+1 print statements, which is O(n^2) as @ThisIsChris pointed out
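
A quick way to sanity-check that count (printsInFour is a made-up helper that just mirrors four()'s structure):
```java
public class PrintCount {
    // counts the println calls four(n) would make, mirroring its structure
    static long printsInFour(int n) {
        if (n <= 0) return 0;                // the loop body never runs
        if (n == 1) return 1;                // no recursion, the loop prints once
        return 1 + printsInFour(n - 1) + n;  // println(n) + the recursive call + the n-iteration loop
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 5; n++)
            System.out.println("four(" + n + ") prints " + printsInFour(n) + " lines");
        // the output grows roughly like n^2/2, i.e. O(n^2)
    }
}
```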

2018-10-31 06:32:12 UTC  

that's another perspective that I use sometimes to calculate big Os

2018-11-09 06:22:51 UTC  

<@&435155896780324864> Lads, it's that time of week again