First, let me say thank you, xkcd forum members, for allowing this abuse of your knowledge.
With that out of the way, on to my homework question.
(I'm going to say "int" to mean "indefinite integral")
Suppose f and g are any integrable functions (with fg also integrable). For any real numbers a and b, prove that int(fg)(x)dx =/= a*g(x)*(int f(x)dx) + b*f(x)*(int g(x)dx).
Now I can't just pick some functions and find their antiderivatives; we haven't done that yet. I can write int f(x)dx = F(x) and so forth, but that's about it.
The problem comes from a section of a textbook that talks about integration by parts, substitution, etc., for indefinite integrals.
Indefinite integral question
Re: Indefinite integral question
Well, you need some more restrictions, as g(x) = C, a = 1, b = 0 satisfies the equation you want to show is false. I believe f = e^(root(i)*x) and g = e^(x/root(i)) work with a = b = 0.5 as well. (Taking the real parts should work too if you don't like complex numbers.)
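The constant-g case is easy to confirm with a computer algebra system. A quick sympy sketch (f = sin(x) is just an arbitrary test function I picked; any integrable f behaves the same, and sympy drops the constant of integration):

```python
import sympy as sp

x, C = sp.symbols('x C')
f = sp.sin(x)   # arbitrary test function (my choice, not from the problem)
g = C           # constant function g(x) = C

# The claimed identity with a = 1, b = 0:  int(fg) = 1*g*int(f) + 0*f*int(g)
lhs = sp.integrate(f * g, x)
rhs = g * sp.integrate(f, x)
print(sp.simplify(lhs - rhs))   # 0, so the equation holds for this pair
```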
What is the definition of the indefinite integral you are using if not an antiderivative? Or do you mean you know it is an antiderivative but haven't learned how to find them yet?

 Posts: 6
 Joined: Sun Mar 16, 2014 5:44 am UTC
Re: Indefinite integral question
Yeah, there are cases where such an a and b and such functions satisfy the equation, but the question seems to be asking you to show that there is no single pair (a, b) that works for all functions (which seems quite obvious, doesn't it?).
Yes, I mean that although I know they are antiderivatives, I cannot do int (cos(x) dx)=sin(x) + C. I can do int (some continuous function)= F(x)+C. That is, I can choose some general function that I know has an antiderivative, but not specify what that antiderivative is exactly.
 Schrollini
 Posts: 515
 Joined: Sat Sep 29, 2012 5:20 pm UTC
Re: Indefinite integral question
Since there are many pairs of functions that do satisfy that equation (that is, don't satisfy the inequality), presumably the question is asking you to show that not all pairs of functions satisfy the equation. Most of the time, the easiest way to show that something isn't always true is to find a counterexample.
That said, taking the derivative of both sides may lead to some insight.
For your convenience: a LaTeX to BBCode converter
Re: Indefinite integral question
This might be a spoiler, but I haven't actually done anything; it's just what comes to mind for me and may not get anywhere. Just in case, though, I'll stick it in spoilers.
edit:Or y'know, schrollini could beat me to that suggestion. (edit2: and noticing the 7 minute post difference, I must be either blind or super slow.)
Spoiler:
Re: Indefinite integral question
Yes, I think that must be what it's asking.
I found that if I let f(x) = g(x) = 1, I can show that a + b must equal 1. And then, if I let f = a and g = f', I can show that 0 = b*a^2. Finally, if I let a = 0 (which would imply b = 1 from a + b = 1), then I get int(fg) = f*int(g). Now, that last result looks rather promising, but I'm not quite sure how to pick some functions to show that int(fg) = f*int(g) can't be true.
I'll have to try your derivative suggestion; I haven't thought of doing that yet.
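The f = g = 1 computation can be sanity-checked with sympy (a sketch; note sympy's indefinite integrals drop the constant of integration):

```python
import sympy as sp

x, a, b = sp.symbols('x a b')
one = sp.Integer(1)   # f = g = 1

lhs = sp.integrate(one * one, x)   # x
rhs = a * one * sp.integrate(one, x) + b * one * sp.integrate(one, x)  # (a + b)*x
# Solving x = (a + b)*x for a shows the identity forces a + b = 1
print(sp.solve(sp.Eq(lhs, rhs), a))
```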
Re: Indefinite integral question
Trying that derivative approach (assuming I'm doing it right), I let f and g be two differentiable functions. Then, from
int(fg)=a*f*int(g)+b*g*int(f)
I take the derivative of both sides and get:
fg=a(fg+int(g)*f')+b(fg+int(f)*g')
That's assuming I remember my product differentiation rule correctly. Now, if I rearrange, and then take my a+b=1 result from earlier, I get
fg=a(fg+int(g)*f')+b(fg+int(f)*g')
=a*int(g)*f'+b*int(f)*g'+(a+b)(fg)
=a*int(g)*f'+b*int(f)*g'+1*fg
0=a*int(g)*f'+b*int(f)*g'
Then, if I let a=0 and b=1, as seen from earlier:
0=int(f)*g'
But now what?
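If it helps, that differentiation step can be double-checked symbolically with generic functions in sympy (a sketch; int(f) and int(g) are kept as unevaluated Integral objects):

```python
import sympy as sp

x, a, b = sp.symbols('x a b')
f = sp.Function('f')(x)
g = sp.Function('g')(x)
F = sp.Integral(f, x)   # int(f), left unevaluated
G = sp.Integral(g, x)   # int(g), left unevaluated

# Differentiate the right-hand side a*f*int(g) + b*g*int(f)
rhs = a * f * G + b * g * F
drhs = sp.diff(rhs, x)

# Product rule prediction from the post: a*(fg + int(g)*f') + b*(fg + int(f)*g')
expected = a * (f * g + G * sp.diff(f, x)) + b * (f * g + F * sp.diff(g, x))
print(sp.simplify(drhs - expected))   # 0, so the product-rule step checks out
```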
Re: Indefinite integral question
Aonedayaccount wrote: int(fg) = f*int(g). Now, that last result looks rather promising, but I'm not quite sure how to pick some functions to show that int(fg) = f*int(g) can't be true.
f = x, g = 1/x (x > 0)?
Spoiler:
Re: Indefinite integral question
lalop wrote:
Aonedayaccount wrote: int(fg) = f*int(g). Now, that last result looks rather promising, but I'm not quite sure how to pick some functions to show that int(fg) = f*int(g) can't be true.
f = x, g = 1/x (x > 0)?
Spoiler:
Hmm, that does look promising. But as you point out, it relies on letting int(1) = x, which, despite its obviousness, might be pushing it.
But that's the best idea so far.
 Schrollini
Re: Indefinite integral question
lalop wrote:f = x, g = 1/x (x > 0)?
But this works with a = 0, b = 2.
I believe the question is not asking you to show that there is no (a, b) that satisfy that equation for all (f, g). Rather, it's asking you to show that there are functions (f, g) that will not satisfy the equation for any (a, b). (And even if it's not, showing the latter automatically shows the former.)
So start playing around with some functions, and see what values of a and b are needed for each pair. This may lead you to a pair where you can't find a and b to satisfy the equation.
Spoiler:
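One way to play around systematically: a small sympy helper that solves for the (a, b) a given pair of functions would need. This is only a sketch under my own assumptions — `required_ab` is a made-up name, and it matches the identity at two sample points, so you would still want to verify any candidate (a, b) for all x:

```python
import sympy as sp

x = sp.symbols('x', positive=True)   # positive so int(1/x) = log(x) is safe
a, b = sp.symbols('a b')

def required_ab(f, g, samples=(2, 3)):
    """Solve int(fg) = a*f*int(g) + b*g*int(f) for (a, b) at sample x-values."""
    lhs = sp.integrate(f * g, x)
    rhs = a * f * sp.integrate(g, x) + b * g * sp.integrate(f, x)
    residual = lhs - rhs
    return sp.solve([residual.subs(x, s) for s in samples], [a, b], dict=True)

# lalop's pair: matching at the sample points forces a = 0, b = 2
print(required_ab(x, 1 / x))
```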

 Posts: 6
 Joined: Sun Mar 16, 2014 5:44 am UTC
Re: Indefinite integral question
Schrollini wrote:
lalop wrote: f = x, g = 1/x (x > 0)?
But this works with a = 0, b = 2.
I believe the question is not asking you to show that there is no (a, b) that satisfy that equation for all (f, g). Rather, it's asking you to show that there are functions (f, g) that will not satisfy the equation for any (a, b). (And even if it's not, showing the latter automatically shows the former.)
So start playing around with some functions, and see what values of a and b are needed for each pair. This may lead you to a pair where you can't find a and b to satisfy the equation.
Spoiler:
While it does work with a = 0 and b = 2, there was another case earlier that forced a + b = 1, which (0, 2) does not satisfy. That would solve the problem (apart from the possible objection of computing int(1) = x).
I'll try power functions now.
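For what it's worth, the two constraints already found in the thread seem to combine into the finish. A sympy sketch of that bookkeeping (taking the a + b = 1 constraint from f = g = 1, and a = 0, b = 2 from f = x, g = 1/x, as computed above):

```python
import sympy as sp

a, b = sp.symbols('a b')

constraints = [
    sp.Eq(a + b, 1),   # forced by the pair f = g = 1
    sp.Eq(a, 0),       # forced by the pair f = x, g = 1/x (x > 0)
    sp.Eq(b, 2),
]
# An empty solution set means no single (a, b) satisfies both pairs
print(sp.solve(constraints, [a, b]))
```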