
Chainer

Chainer is an open source deep learning framework written purely in Python on top of NumPy and CuPy Python libraries. The development is led by Japanese venture company Preferred Networks in partnership with IBM, Intel, Microsoft, and Nvidia.[4][5][6][7]

Chainer
Original author(s): Seiya Tokui
Developer(s): Community, Preferred Networks, Inc.
Initial release: June 9, 2015[1][2]
Stable release: 7.8.1[3] / January 5, 2022
Repository: github.com/chainer/chainer
Written in: Python
Platform: Cross-platform
Available in: Python
Type: Deep learning library
License: MIT
Website: chainer.org

Chainer is notable for its early adoption of the "define-by-run" scheme, as well as its performance on large-scale systems.[1] The first version was released in June 2015, and the framework has since gained significant popularity in Japan.[1][2] Furthermore, in 2017, it was listed by KDnuggets among the top 10 open source machine learning Python projects.[8]

In December 2019, Preferred Networks announced the transition of its development effort from Chainer to PyTorch, stating that it would provide only maintenance patches after releasing v7.[9]

Define-by-run

Chainer was the first deep learning framework to introduce the define-by-run approach.[10][11] The traditional procedure for training a network had two phases: first, define the fixed connections between mathematical operations (such as matrix multiplication and nonlinear activations) in the network; then, run the actual training calculation. This is called the define-and-run or static-graph approach. Theano and TensorFlow are among the notable frameworks that took this approach. In contrast, in the define-by-run or dynamic-graph approach, the connections in a network are not determined when training starts. The network is determined during training, as the actual calculation is performed.

One advantage of this approach is that it is intuitive and flexible.[12] If the network has complicated control flows, such as conditionals and loops, the define-and-run approach requires specially designed operations for such constructs. In the define-by-run approach, by contrast, the programming language's native constructs, such as if statements and for loops, can be used to describe such flow. This flexibility is especially useful for implementing recurrent neural networks.[13][14]
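The idea can be illustrated with a minimal, framework-agnostic sketch in pure Python (the `Var` class and its methods below are illustrative, not Chainer's actual API): the computation graph is recorded as operations execute, so an ordinary Python loop decides the graph's shape on every run.

```python
# Minimal sketch of define-by-run: each arithmetic operation records
# its inputs and a gradient rule at the moment it runs, so native
# control flow (if/for) determines the graph dynamically.

class Var:
    def __init__(self, value, grad_fn=None):
        self.value = value
        self.grad_fn = grad_fn   # how to propagate gradients backward
        self.grad = 0.0

    def __add__(self, other):
        out = Var(self.value + other.value)
        out.grad_fn = lambda g: [(self, g), (other, g)]
        return out

    def __mul__(self, other):
        out = Var(self.value * other.value)
        out.grad_fn = lambda g: [(self, g * other.value),
                                 (other, g * self.value)]
        return out

    def backward(self, g=1.0):
        self.grad += g
        if self.grad_fn:
            for node, gn in self.grad_fn(g):
                node.backward(gn)

def model(x, n):
    # An ordinary Python for loop sets the graph's depth per call.
    y = x
    for _ in range(n):
        y = y * x
    return y

x = Var(2.0)
y = model(x, 3)          # graph built during this call: y = x**4
y.backward()
print(y.value, x.grad)   # 16.0 32.0  (d/dx x**4 = 4*x**3 at x = 2)
```

In a define-and-run framework, the loop in `model` would instead require a dedicated graph operation (such as a symbolic scan or loop node), because the graph must be fixed before any values flow through it.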

Another advantage is ease of debugging.[12] In the define-and-run approach, if an error (such as a numeric error) occurs during the training calculation, it is often difficult to inspect the fault, because the code written to define the network and the actual place of the error are separated. In the define-by-run approach, the calculation can simply be suspended with the language's built-in debugger, and the data flowing through the network code can be inspected directly.
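The debugging benefit can be shown with another illustrative sketch (plain Python, not Chainer code): because a define-by-run model is ordinary code, a failure's traceback points at the exact offending operation, and a standard `breakpoint()` could suspend execution at that line.

```python
import math

def forward(x):
    # An ordinary Python function acts as the model; if an operation
    # fails, the exception's traceback names this exact line.
    h = math.log(x)   # raises ValueError for x <= 0
    return h * h      # a breakpoint() here would expose h directly

try:
    forward(-1.0)
except ValueError as err:
    print("error raised inside the model code itself:", err)
```

In a static-graph framework, the same numeric fault would surface later, inside the graph executor's runtime, far from the line of user code that defined the faulty operation.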

Define-by-run has gained popularity since its introduction by Chainer and is now implemented in many other frameworks, including PyTorch[15] and TensorFlow.[12]

Extension libraries

Chainer has four extension libraries: ChainerMN, ChainerRL, ChainerCV and ChainerUI. ChainerMN enables Chainer to be used on multiple GPUs with performance significantly faster than other deep learning frameworks.[1] A supercomputer running Chainer on 1024 GPUs processed 90 epochs of the ImageNet dataset on a ResNet-50 network in 15 minutes, four times faster than the previous record, held by Facebook.[16][17] ChainerRL adds state-of-the-art deep reinforcement learning algorithms, ChainerCV provides utilities for computer vision, and ChainerUI is a management and visualization tool.

Applications

Chainer is used as the framework for PaintsChainer, a service that performs automatic colorization of black-and-white, line-only draft drawings with minimal user input.[18][19]

See also

  • Comparison of deep learning software
  • Machine learning
  • Artificial neural network

References

  1. ^ a b c d "Big-in-Japan AI code 'Chainer' shows how Intel will gun for GPUs". The Register. 2017-04-07. Retrieved 2017-12-24.
  2. ^ a b "Deep Learning のフレームワーク Chainer を公開しました" (in Japanese). 2015-06-09. Retrieved 2017-12-24.
  3. ^ "Release 7.8.1". 5 January 2022. Retrieved 3 October 2022.
  4. ^ "Chainer Homepage". Retrieved 2017-12-24.
  5. ^ "IBM Wants to be "Red Hat" of Deep Learning". HPCwire. 2017-01-26. Retrieved 2017-09-08.
  6. ^ "Intel Collaborating with Preferred Networks in Japan on Deep Learning". 2017-04-06. Retrieved 2017-12-24.
  7. ^ "Microsoft partners with Preferred Networks to bring Chainer deep learning technology to Azure - MSPoweruser". MSPoweruser. 2017-05-23. Retrieved 2017-09-08.
  8. ^ "Top 20 Python Machine Learning Open Source Projects". KDnuggets. 2017-11-24.
  9. ^ "Preferred Networks Migrates its Deep Learning Research Platform to PyTorch". Preferred Networks, Inc. 2019-12-05. Retrieved 2019-12-27.
  10. ^ Tokui, Seiya; et al. (2015). "Chainer: a next-generation open source framework for deep learning". 29th Annual Conference on Neural Information Processing Systems (NIPS). 5.
  11. ^ Shimada, Naoki (September 14, 2017). Deep Learning with Chainer. Gijutsu-Hyohron. p. 61. ISBN 4774191868.
  12. ^ a b c "Eager Execution: An imperative, define-by-run interface to TensorFlow". Google Research Blog.
  13. ^ "Deep Learning With Dynamic Computation Graphs (ICLR 2017)". Metadata.
  14. ^ Hido, Shohei (8 November 2016). "Complex neural networks made easy by Chainer". O'Reilly Media. Retrieved 26 June 2018.
  15. ^ Perez, Carlos E. (20 January 2017). "PyTorch, Dynamic Computational Graphs and Modular Deep Learning". Medium.
  16. ^ "Extremely Large Minibatch SGD: Training ResNet-50 on ImageNet in 15 Minutes" (pdf). Retrieved 2017-12-24.
  17. ^ Greene, Tristan (20 November 2017). "Facebook's nerds bested by Japan's in the race to train AI". The Next Web. Retrieved 24 November 2017.
  18. ^ Know, Now You (2017-02-15). "This neural network-based software will add colour to your drawings for free". Techly. Retrieved 2017-09-08.
  19. ^ "Drawing app "pixiv Sketch" and automatic coloring service "PaintsChainer" collaborate to provide a new function for automatic coloring of illustrations!". 2017-05-24. Retrieved 2017-12-24.

External links

  • Official website
