Theory of Neural Information Processing Systems

9780198530244
645.84 zł
581.26 zł Save 64.58 zł Tax included
Lowest price within 30 days before promotion: 581.26 zł
Available in 4-6 weeks


Description
Theory of Neural Information Processing Systems provides an explicit, coherent, and up-to-date account of the modern theory of neural information processing systems. It has been carefully developed for graduate students from any quantitative discipline, including mathematics, computer science, physics, engineering, or biology, and has been thoroughly class-tested by the authors over a period of some eight years. Exercises are presented throughout the text, and notes on historical background and further reading guide the student into the literature. All mathematical details are included, and appendices provide further background material, including probability theory, linear algebra, and stochastic processes, making this textbook accessible to a wide audience.
Product Details
Publisher: OUP Oxford
Reference: 85532
ISBN: 9780198530244

Data sheet

Publication date: 2005
Issue number: 1
Cover: paperback
Page count: 586
Dimensions (mm): 172 × 245
Weight (g): 1021
Table of contents
  • I. Introduction to Neural Networks: General introduction; Layered networks; Recurrent networks with binary neurons
  • II. Advanced Neural Networks: Competitive unsupervised learning processes; Bayesian techniques in supervised learning; Gaussian processes; Support vector machines for binary classification
  • III. Information Theory and Neural Networks: Measuring information; Identification of entropy as an information measure; Building blocks of Shannon's information theory; Information theory and statistical inference; Applications to neural networks
  • IV. Macroscopic Analysis of Dynamics: Network operation: macroscopic dynamics; Dynamics of online learning in binary perceptrons; Dynamics of online gradient descent learning
  • V. Equilibrium Statistical Mechanics of Neural Networks: Basics of equilibrium statistical mechanics; Network operation: equilibrium analysis; Gardner theory of task realizability
  • Appendices: Historical and bibliographical notes; Probability theory in a nutshell; Conditions for central limit theorem to apply; Some simple summation identities; Gaussian integrals and probability distributions; Matrix identities; The delta-distribution; Inequalities based on convexity; Metrics for parametrized probability distributions; Saddle-point integration
  • References