Hardware for artificial intelligence

Specialized computer hardware is often used to execute artificial intelligence (AI) programs faster and with less energy; examples include Lisp machines, neuromorphic engineering, event cameras, and physical neural networks. As of 2023, the market for AI hardware is dominated by GPUs.[1]

Lisp machines

Lisp machines were developed in the late 1970s and early 1980s to make artificial intelligence programs written in the programming language Lisp run faster.

Dataflow architecture

Dataflow architecture processors used for AI serve various purposes, with varied implementations like the polymorphic dataflow[2] Convolution Engine[3] by Kinara (formerly Deep Vision), structure-driven dataflow by Hailo,[4] and dataflow scheduling by Cerebras.[5]

Component hardware

AI accelerators

Since the 2010s, advances in computer hardware have led to more efficient methods for training deep neural networks that contain many layers of non-linear hidden units and a very large output layer.[6] By 2019, graphics processing units (GPUs), often with AI-specific enhancements, had displaced central processing units (CPUs) as the dominant means to train large-scale commercial cloud AI.[7] OpenAI estimated the hardware compute used in the largest deep learning projects from AlexNet (2012) to AlphaZero (2017), and found a 300,000-fold increase in the amount of compute needed, with a doubling-time trend of 3.4 months.[8][9]
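
As a rough sanity check of those two figures (a back-of-the-envelope sketch, not part of the cited analysis), about 18 doublings at 3.4 months each span the roughly five years between the two projects:

```python
import math

# Back-of-the-envelope check of the figures cited above: a 300,000-fold
# growth in training compute with a 3.4-month doubling time.
growth = 300_000
doubling_time_months = 3.4

doublings = math.log2(growth)                 # about 18.2 doublings
months = doublings * doubling_time_months

print(f"{doublings:.1f} doublings -> about {months:.0f} months "
      f"({months / 12:.1f} years)")           # roughly 5 years, i.e. 2012-2017
```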

Artificial Intelligence Hardware Components

Central Processing Units (CPUs)

Every computer system is built on a central processing unit (CPU). CPUs manage tasks, perform computations, and execute instructions. Although specialized hardware handles AI workloads more efficiently, CPUs remain essential for the general-purpose computing tasks in AI systems.

Graphics Processing Units (GPUs)

Graphics processing units (GPUs) have driven a dramatic transformation in AI. Their parallel design allows them to run many computations at once, which makes them well suited to AI workloads that involve large quantities of data and intensive mathematical operations.[10]
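
A minimal sketch of the idea (assuming the PyTorch library is available; any array framework with GPU support would do): the same matrix multiplication is dispatched to a GPU, where many cores execute the arithmetic in parallel, simply by placing the tensors on the GPU device.

```python
import torch

# Illustrative sketch (assumes PyTorch is installed). The same large matrix
# multiplication runs on the CPU or, when one is available, on a GPU, where
# thousands of cores perform the arithmetic in parallel.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = a @ b          # executed in parallel on the GPU when device == "cuda"
print(c.device, c.shape)
```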

Tensor Processing Units (TPUs)

Google created Tensor Processing Units (TPUs) to accelerate and optimize machine learning workloads. They are designed to handle both training and inference efficiently and perform well on neural network tasks.
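
As a hedged illustration of how such accelerators are typically programmed (assuming the JAX library, which can target TPU backends as well as CPUs and GPUs; this is not TPU-specific code from Google), the same compiled function runs unchanged on whichever device is attached:

```python
import jax
import jax.numpy as jnp

# Illustrative sketch (assumes JAX is installed). jax.jit compiles the
# function once for the available backend -- a TPU if one is attached,
# otherwise a GPU or CPU -- without changing the source.
@jax.jit
def dense_layer(x, w, b):
    return jnp.maximum(x @ w + b, 0.0)   # matmul + ReLU, a typical neural-network step

key = jax.random.PRNGKey(0)
kx, kw = jax.random.split(key)
x = jax.random.normal(kx, (128, 512))
w = jax.random.normal(kw, (512, 256))
b = jnp.zeros(256)

print(dense_layer(x, w, b).shape, jax.devices()[0].platform)
```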

Field-Programmable Gate Arrays (FPGAs)

Field-programmable gate arrays (FPGAs) are highly adaptable hardware devices that can be configured to carry out specific functions. Their versatility makes them suitable for a variety of AI applications, including real-time image recognition and natural language processing.

Memory Systems

AI requires efficient memory systems to store and retrieve the data needed for processing. Fast interconnects and large-capacity memory are crucial to avoid bottlenecks in data access.

Storage Solutions

Artificial intelligence applications generate and consume vast amounts of data. High-speed storage options such as SSDs and NVMe drives provide fast data retrieval, improving the overall performance of an AI system.

Quantum Computing

Although still in its early stages, quantum computing holds enormous potential for artificial intelligence. Qubits (quantum bits) can represent many states at once, which could revolutionize AI tasks that require complex simulations and optimizations.
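
The "many states at once" property can be illustrated with a tiny state-vector simulation in plain NumPy (an educational sketch only, not code for real quantum hardware): applying a Hadamard gate to each of n qubits places the register in an equal superposition of all 2^n basis states.

```python
import numpy as np

# Educational sketch: an n-qubit register simulated as a state vector of
# length 2**n, with a Hadamard gate applied to every qubit.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                      # start in the basis state |000>

gate = H
for _ in range(n - 1):              # build H (x) H (x) ... (x) H
    gate = np.kron(gate, H)

state = gate @ state
print(state)                        # 2**n equal amplitudes of 1 / sqrt(2**n)
```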

Edge AI Hardware

Edge AI refers to AI operations performed locally on a device, removing the need for constant internet access. Edge AI hardware, which includes specialized chips and processors, enables immediate results for tasks such as speech recognition and object detection on smartphones and Internet of Things (IoT) devices.
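
A hedged sketch of on-device inference, assuming the TensorFlow Lite runtime and a model that has already been converted to the TFLite format (the file name model.tflite is a placeholder): the whole forward pass runs on the local device, with no network round trip.

```python
import numpy as np
import tensorflow as tf

# Illustrative sketch (assumes TensorFlow is installed; "model.tflite" is a
# placeholder for a previously converted model file).
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input with the model's expected shape; on a phone or IoT board this
# would come from the camera or microphone instead.
x = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()                # inference runs locally, offline
print(interpreter.get_tensor(output_details[0]["index"]).shape)
```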

Networking Capabilities

AI systems frequently rely on data from several sources. Effective data exchange depends on responsive and reliable networking. High-speed data transfer enables real-time decision-making and seamless communication between AI components.

Sources

  1. "Nvidia: The chip maker that became an AI superpower". BBC News. 25 May 2023. Retrieved 18 June 2023.
  2. Maxfield, Max (24 December 2020). "Say Hello to Deep Vision's Polymorphic Dataflow Architecture". Electronic Engineering Journal. Techfocus media.
  3. "Kinara (formerly Deep Vision)". Kinara. 2022. Retrieved 11 December 2022.
  4. "Hailo". Hailo. Retrieved 11 December 2022.
  5. Lie, Sean (29 August 2022). Cerebras Architecture Deep Dive: First Look Inside the HW/SW Co-Design for Deep Learning. Cerebras (Report).
  6. Research, AI (23 October 2015). "Deep Neural Networks for Acoustic Modeling in Speech Recognition". AIresearch.com. Retrieved 23 October 2015.
  7. Kobielus, James (27 November 2019). "GPUs Continue to Dominate the AI Accelerator Market for Now". InformationWeek. Retrieved 11 June 2020.
  8. Tiernan, Ray (2019). "AI is changing the entire nature of compute". ZDNet. Retrieved 11 June 2020.
  9. "AI and Compute". OpenAI. 16 May 2018. Retrieved 11 June 2020.
  10. "Bridging Intelligence and Technology: Artificial Intelligence Hardware Requirements". Sabuj Basinda. 22 August 2023. Retrieved 23 August 2023.
