How can I try it?
Symbolic Mathematics Finally Yields to Neural Networks
I quote:
"Lample and Charton's program could produce precise solutions to
complicated integrals and differential equations - including some
that stumped popular math software packages with explicit
problem-solving rules built in."
"The new program exploits one of the major advantages of neural
networks: They develop their own implicit rules. As a result,
"there's no separation between the rules and the exceptions," said
Jay McClelland, a psychologist at Stanford University who uses neural
nets to model how people learn math. In practice, this means that the
program didn't stumble over the hardest integrals. In theory, this
kind of approach could derive unconventional "rules" that could make
headway on problems that are currently unsolvable."
<https://www.quantamagazine.org/symbolic-mathematics-finally-yields-to-neural-networks-20200520/>
My sceptical attitude is borne out by results of experiments with the Lample-Charton code that were posted by Qian Yun on the <fricas-devel> newsgroup in a thread started on November 16, 2020 and named "the 'deep learning' 'neural network' symbolic integrator":
<https://www.mail-archive.com/fricas-devel@googlegroups.com/msg13743.html>
Qian's conclusions are (DL = Deep Learning):
1. It doesn't handle large numbers very well. [...]
2. DL may give a correct result that contains a strange constant. [...]
3. DL doesn't understand multiplication very well. [...]
4. DL doesn't handle long expressions very well. [...]
5. For the FWD test set with 9986 integrals (generated by creating
random expressions, then trying to solve them with sympy and
discarding failures), FriCAS can solve 9980 out of 9986 in 71
seconds; of the remaining 6 integrals, FriCAS can solve another 2
under 100 seconds, [...] The DL system can solve 95.6%; by
comparison, FriCAS is over 99.94%.
6. The DL system is slow. To solve the FWD test set, the DL system
may use around 100 hours of CPU time.
7. For the BWD test set (generated by creating random expressions,
then taking their derivatives as integrands), FriCAS can solve
roughly 95%, compared with DL's claimed 99.5%. [...]
8. DL doesn't handle rational function integration very well. It can
handle '(x+1)^2/((x+1)^6+1)' but not its expanded form. [...]
9. DL doesn't handle algebraic function integration very well. I have
a list of algebraic functions that FriCAS can solve while other CASs
can't; DL can't solve them either.
10. For the harder mixed-case integrations, I have a list of
integrals that FriCAS can't handle; DL can't solve them either.
Martin.
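For reference, the integrand in item 8 does have a compact elementary antiderivative: the substitution u = (x+1)^3 reduces it to arctan((x+1)^3)/3. A minimal Python sanity check of that antiderivative by central differences (my own sketch, not part of the thread):

```python
import math

# Item 8's integrand (x+1)^2/((x+1)^6+1): with u = (x + 1)**3,
# du = 3*(x + 1)**2 dx, the integral becomes (1/3)*arctan(u),
# i.e. arctan((x + 1)**3) / 3.

def integrand(x):
    return (x + 1)**2 / ((x + 1)**6 + 1)

def antiderivative(x):
    return math.atan((x + 1)**3) / 3

# Check: the central-difference derivative of the antiderivative
# matches the integrand at a few sample points.
h = 1e-6
for x in [-2.0, -0.5, 0.0, 0.7, 3.0]:
    deriv = (antiderivative(x + h) - antiderivative(x - h)) / (2 * h)
    assert abs(deriv - integrand(x)) < 1e-6
```

Since expanding the rational function changes nothing mathematically, the same antiderivative of course also covers the expanded form that the DL system reportedly fails on.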
Who needs neural networks? One last item. From the evidence posted on fricas-devel, it is apparent that DL can't do arithmetic. Given n, it appears that it cannot compute n+1 in general.
On Tuesday, December 1, 2020 at 11:27:50 AM UTC-8, Richard Fateman wrote:
Who needs neural networks? One last item. From the evidence posted on fricas-devel, it is apparent that DL can't do arithmetic. Given n, it appears that it cannot compute n+1 in general.
I don't know if this is susceptible to a proof. I could ask around...
RJF
On 11/22/2020 2:30 PM, clicliclic@freenet.de wrote:
[...]
FYI,
Well, deep learning/AI just solved the 50-year-old grand challenge in biology, the "protein folding problem".
So I am sure one day, it will be able to fully solve integration as
well?
https://deepmind.com/blog/article/alphafold-a-solution-to-a-50-year-old-grand-challenge-in-biology
The AI system which did this is called AlphaFold.
"Nasser M. Abbasi" wrote:
[...]
You cannot use numerical evaluation to tell if a symbolic indefinite integral is correct, since there are arbitrarily many correct solutions that differ by a constant. Maybe you first differentiate the answer.
I looked at the AlphaFold article at <deepmind.com>, but haven't dug
deeper. Apparently, protein-folding theorists are unable to estimate a
folded protein's configuration energy sufficiently quickly; otherwise
simulated annealing (as used to solve the Travelling Salesman problem
effectively) would make good folding predictions possible. But
training a neural network on a library of 1.7*10^5 experimental
protein structures has now been found to yield a good folding
predictor. Trying to replicate this feat in one researcher's head
would presumably need more than a lifetime of experience with the
library data.
According to Lample and Charton's paper on arXiv, symbolic parameters
were excluded from their FWD and BWD integration test sets - might
there be particular problems with them? (By the way, Maple's algebraic
Risch integrator also appears to reject symbolic parameters.) In this
situation, however, some mean-square numerical deviation of a trial
solution's derivative from the integrand could perhaps be used for a
simulated-annealing approach to symbolic integration. But could
reliable deviation estimates be computed sufficiently quickly?
Something for Nasser to try!
Martin.
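Martin's mean-square-deviation objective can be sketched directly: score a candidate antiderivative by how far its numerical derivative strays from the integrand, which is the quantity a simulated-annealing search would minimize. The function names, interval, sample count, and step size below are all illustrative assumptions of mine:

```python
import math
import random

def ms_deviation(trial_F, f, a=-1.0, b=1.0, n=200, h=1e-5):
    """Mean-square deviation of trial_F' from the integrand f over
    random sample points in [a, b]; trial_F' is estimated by a
    central difference."""
    rng = random.Random(0)
    total = 0.0
    for _ in range(n):
        x = rng.uniform(a, b)
        dF = (trial_F(x + h) - trial_F(x - h)) / (2 * h)
        total += (dF - f(x)) ** 2
    return total / n

# For the integrand cos(x): the true antiderivative sin(x) scores
# essentially zero, while a wrong trial (x**2) scores large.
good = ms_deviation(math.sin, math.cos)
bad = ms_deviation(lambda x: x ** 2, math.cos)
assert good < 1e-8 < bad
```

Martin's speed worry is visible even here: each evaluation of the objective costs 2n calls to the trial solution, so an annealing loop over many candidates multiplies that cost quickly.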
"Facebook AI has built the first AI system that can solve advanced mathematics equations using symbolic reasoning."
https://ai.facebook.com/blog/using-neural-networks-to-solve-advanced-mathematics-equations/
On Wednesday, January 15, 2020 at 4:01:39 AM UTC-6, peter....@gmail.com wrote:
"Facebook AI has built the first AI system that can solve advanced mathematics equations using symbolic reasoning."
https://ai.facebook.com/blog/using-neural-networks-to-solve-advanced-mathematics-equations/
FYI,
They are now working on using AI to solve PDEs:
https://www.infoq.com/news/2020/12/caltech-ai-pde/
"Caltech Open-Sources AI for Solving Partial Differential Equations"
"The Caltech team's approach is to build a neural network that can
learn a solution operator; that is, it learns the mapping between a
PDE and its solution."
--Nasser
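The "solution operator" idea quoted above can be illustrated with a drastically simplified linear stand-in (my own toy sketch, not Caltech's method): for a discretized linear PDE the solution operator f -> u is just a matrix, so it can be recovered from input/solution training pairs and then applied to unseen data. Here, for the 1-D Poisson problem, all names and sizes are illustrative assumptions:

```python
import random

# Toy "solution operator" demo for -u'' = f on (0, 1) with
# u(0) = u(1) = 0, discretized with M interior grid points.

M = 6
H = 1.0 / (M + 1)

def solve_poisson(f):
    """Direct solve of the second-difference system via the Thomas
    algorithm (sub/super-diagonals -1, diagonal 2)."""
    b = [2.0] * M
    d = [H * H * fi for fi in f]
    for i in range(1, M):
        b[i] = 2.0 - 1.0 / b[i - 1]
        d[i] = d[i] + d[i - 1] / b[i - 1]
    u = [0.0] * M
    u[M - 1] = d[M - 1] / b[M - 1]
    for i in range(M - 2, -1, -1):
        u[i] = (d[i] + u[i + 1]) / b[i]
    return u

# "Training": probe the unknown operator with basis inputs; by
# linearity the responses are the columns of the operator matrix G.
columns = []
for k in range(M):
    e = [0.0] * M
    e[k] = 1.0
    columns.append(solve_poisson(e))

def predict(f):
    """Apply the learned operator: u = G f."""
    return [sum(columns[k][j] * f[k] for k in range(M)) for j in range(M)]

# The learned operator reproduces the direct solver on unseen input.
rng = random.Random(0)
f_new = [rng.uniform(-1.0, 1.0) for _ in range(M)]
u_direct = solve_poisson(f_new)
u_learned = predict(f_new)
assert max(abs(a - b) for a, b in zip(u_direct, u_learned)) < 1e-12
```

The neural-operator work replaces this exact linear recovery with a network trained on (f, u) pairs, which is what lets it also target nonlinear PDEs; the toy only shows why "learning the mapping between a PDE and its solution" is a well-posed regression problem at all.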