Hello, informatics student here. I had a course on numerical methods this semester (heading into the finals just now, actually), and I kept encountering methods named after Gauss (like Gaussian quadrature, or approximating the solution of large systems of linear equations with Gauss-Seidel) and other (old) mathematicians over and over again. A lot of what we deal with is approximations. My question: naively, I thought of numerics as a field of mathematics that only became relevant in the era of computers, so I didn't expect so many of these methods to have been devised that long ago. Why am I wrong? What was the historical context in which these methods came to be?
Side note: any book recommendations on the history of numerics would be appreciated.
from math https://ift.tt/2TaO1Bi