{"id":1353,"date":"2019-02-15T10:32:05","date_gmt":"2019-02-15T10:32:05","guid":{"rendered":"http:\/\/kusuaks7\/?p=958"},"modified":"2023-09-19T12:07:59","modified_gmt":"2023-09-19T12:07:59","slug":"coding-deep-learning-for-beginners-start","status":"publish","type":"post","link":"https:\/\/www.experfy.com\/blog\/ai-ml\/coding-deep-learning-for-beginners-start\/","title":{"rendered":"Coding Deep Learning for Beginners\u200a\u2014\u200aStart!"},"content":{"rendered":"<p><strong><em>Ready to learn Machine Learning? <a href=\"https:\/\/www.experfy.com\/training\/courses\">Browse courses<\/a>\u00a0like\u00a0<a href=\"https:\/\/www.experfy.com\/training\/courses\/machine-learning-foundations-supervised-learning\">Machine Learning Foundations: Supervised Learning<\/a> developed by industry thought leaders and Experfy in Harvard Innovation Lab.<\/em><\/strong><\/p>\n<section>\n<blockquote>\n<h5 id=\"6aa1\">Intuition based series of articles about Neural Networks dedicated to programmers who want to understand basic math behind the code and non-programmers who want to know how to turn math into\u00a0code.<\/h5>\n<\/blockquote>\n<p id=\"8a00\">This is the 1st article of series \u201c<strong>Coding Deep Learning for Beginners<\/strong>\u201d.<\/p>\n<\/section>\n<section>\n<p id=\"0150\">If you read this article I assume you\u00a0<em>want to learn<\/em>\u00a0about one of the most promising technologies \u2014 <strong>Deep Learning<\/strong>. Statement\u00a0<a href=\"https:\/\/medium.com\/@Synced\/artificial-intelligence-is-the-new-electricity-andrew-ng-cc132ea6264\" target=\"_blank\" rel=\"noopener noreferrer\" class=\"broken_link\"><strong>AI is a new electricity\u00a0<\/strong><\/a>becomes more and more popular lately. 
Scientists believe that just as the\u00a0<em>Steam Engine<\/em>, later\u00a0<em>Electricity<\/em>\u00a0and finally\u00a0<em>Electronics<\/em>\u00a0totally changed industry, <em>Artificial Intelligence<\/em>\u00a0is next in line to transform it again. In a few years, the basics of Machine Learning will become must-have skills for any developer. Even now, we can observe the\u00a0<a href=\"https:\/\/research.hackerrank.com\/developer-skills\/2018\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/research.hackerrank.com\/developer-skills\/2018\/\" data->increased popularity of programming languages<\/a>\u00a0used mainly in Machine Learning, such as Python and R.<\/p>\n<h4 id=\"0db5\"><strong>Technology that is capable of\u00a0magic<\/strong><\/h4>\n<p id=\"3a6a\">In recent years, applications of Deep Learning have made huge advances in many domains, astonishing people who didn\u2019t expect the technology and the world to change so fast.<\/p>\n<p id=\"3263\">Let\u2019s start with the historic match in March 2016 between the\u00a0supercomputer AlphaGo\u00a0and one of the strongest Go players, 18-time world champion Lee Sedol. The AI ended up victorious with a\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/AlphaGo_versus_Lee_Sedol\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/AlphaGo_versus_Lee_Sedol\" data->result<\/a>\u00a0of 4 to 1. This match had a huge influence on the Go community, as AlphaGo invented completely new moves that people tried to understand and reproduce, creating a totally new perspective on how to play the game. But that\u2019s not all: in 2017,\u00a0DeepMind introduced AlphaGo Zero. The newer version of the already unbeatable machine\u00a0<strong>was able to learn everything without any starting data or human help, and all that with 4 times less computational power\u00a0<\/strong>than its predecessor!<\/p>\n<figure id=\"bc10\"><img decoding=\"async\" style=\"width: 700px; height: 525px;\" src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*8UWGFy271ovrfl3cjMKvQw.jpeg\" data-src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*8UWGFy271ovrfl3cjMKvQw.jpeg\" \/><\/figure>\n<p id=\"0efd\" style=\"text-align: center;\">AlphaGo versus Ke Jie in May 2017 (source: The Independent)<\/p>\n<p>Probably many of you have already heard about the self-driving car projects that have been in development for a few years at companies like\u00a0<a href=\"https:\/\/waymo.com\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/waymo.com\/\" data->Waymo<\/a>\u00a0(Google),\u00a0<a href=\"https:\/\/www.tesla.com\/autopilot\" target=\"_blank\" rel=\"noopener noreferrer\" class=\"broken_link\">Tesla<\/a>,\u00a0<a href=\"http:\/\/www.thedrive.com\/sheetmetal\/17440\/toyota-to-unveil-semi-autonomous-platform-3-0-at-the-2018-consumer-electronics-show\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"http:\/\/www.thedrive.com\/sheetmetal\/17440\/toyota-to-unveil-semi-autonomous-platform-3-0-at-the-2018-consumer-electronics-show\" data->Toyota<\/a>,\u00a0<a href=\"https:\/\/www.volvocars.com\/au\/about\/innovations\/intellisafe\/autopilot\" target=\"_blank\" rel=\"noopener noreferrer\" class=\"broken_link\">Volvo<\/a>\u00a0and more. There are also\u00a0self-driving trucks\u00a0that are\u00a0<a href=\"https:\/\/www.wired.com\/story\/embark-self-driving-truck-deliveries\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/www.wired.com\/story\/embark-self-driving-truck-deliveries\/\" data->already used on some highways in the US<\/a>. 
Many countries are slowly preparing for the introduction of autonomous cars on their roads, though adoption is not expected to peak until the next decade.<\/p>\n<p id=\"1123\">But how about an autonomous flying car? Just recently, Udacity announced a new Nanodegree programme where developers can learn how to become Flying Car Engineers and create autonomous flying cars!<\/p>\n<p><a title=\"https:\/\/www.udacity.com\/course\/flying-car-nanodegree--nd787\" href=\"https:\/\/www.udacity.com\/course\/flying-car-nanodegree--nd787\" data-href=\"https:\/\/www.udacity.com\/course\/flying-car-nanodegree--nd787\" data- rel=\"noopener\"><strong>Flying Cars and Autonomous Flight | Udacity<\/strong><br \/>\n<em>Master job-ready autonomous flight software engineering skills as you tackle advanced challenges, write real code for\u2026<\/em>www.udacity.com<\/a><\/p>\n<p id=\"5196\">Lately, thanks to improvements in AI speech recognition, voice interfaces like Google Home and Google Assistant have become a totally new branch of development.<\/p>\n<p id=\"2119\" style=\"text-align: center;\">Google advertisement for the Google Assistant product.\u00a0<a href=\"https:\/\/youtu.be\/-qCanuYrR0g\" target=\"_blank\" rel=\"noopener noreferrer\" aria-label=\"Share link https:\/\/youtu.be\/-qCanuYrR0g\">https:\/\/youtu.be\/-qCanuYrR0g<\/a><\/p>\n<p>A future in which AI tells you to leave home earlier because of traffic, buys your cinema tickets, reschedules calendar meetings, controls your home and more is closer than you think.<\/p>\n<p id=\"888b\">And of course this list could be longer:\u00a0AI reproducing human speech in many dialects,\u00a0<a href=\"https:\/\/www.newyorker.com\/magazine\/2017\/04\/03\/ai-versus-md\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/www.newyorker.com\/magazine\/2017\/04\/03\/ai-versus-md\" data->AI being better at diagnosing cancer than humans<\/a>,\u00a0AI generating a new chapter of Harry Potter\u2026<\/p>\n<p id=\"05dc\">The 
key point of mentioning all of this is to make you understand that each of those inventions relies on Deep Learning technology. To summarise, Deep Learning currently excels in tasks like:<\/p>\n<ol>\n<li id=\"e546\">Image recognition<\/li>\n<li id=\"1171\">Autonomous Vehicles<\/li>\n<li id=\"a444\">Games like Go,\u00a0<a href=\"https:\/\/www.theguardian.com\/technology\/2017\/dec\/07\/alphazero-google-deepmind-ai-beats-champion-program-teaching-itself-to-play-four-hours\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/www.theguardian.com\/technology\/2017\/dec\/07\/alphazero-google-deepmind-ai-beats-champion-program-teaching-itself-to-play-four-hours\" data->Chess<\/a>,\u00a0<a href=\"http:\/\/www.zdnet.com\/article\/researchers-reveal-how-poker-playing-ai-beat-the-worlds-top-players\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"http:\/\/www.zdnet.com\/article\/researchers-reveal-how-poker-playing-ai-beat-the-worlds-top-players\/\" data->Poker<\/a>, but lately also\u00a0<a href=\"https:\/\/www.theverge.com\/2017\/8\/11\/16137388\/dota-2-dendi-open-ai-elon-musk\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/www.theverge.com\/2017\/8\/11\/16137388\/dota-2-dendi-open-ai-elon-musk\" data->computer games<\/a><\/li>\n<li id=\"797c\"><a href=\"https:\/\/www.theverge.com\/2016\/9\/27\/13078138\/google-translate-ai-machine-learning-gnmt\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/www.theverge.com\/2016\/9\/27\/13078138\/google-translate-ai-machine-learning-gnmt\" data->Language Translation<\/a>\u00a0(though only for a few languages)<\/li>\n<li id=\"9d8c\"><a href=\"https:\/\/9to5google.com\/2017\/06\/01\/google-speech-recognition-humans\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/9to5google.com\/2017\/06\/01\/google-speech-recognition-humans\/\" data->Speech recognition<\/a><\/li>\n<li id=\"9ba9\">Analysis of handwritten texts<\/li>\n<\/ol>\n<p id=\"f8ef\">And 
this is only the beginning, because the technology gets democratized every day: as more people become capable of using it, more research is done and more simple ideas get tested.<\/p>\n<h4 id=\"c1a5\"><strong>So what is Deep Learning?<\/strong><\/h4>\n<p id=\"26e1\">It\u2019s a subset of Machine Learning algorithms, based on learning data representations, called\u00a0<strong>Neural Networks<\/strong>. The basic idea is that\u00a0<strong>such an algorithm is shown<\/strong>\u00a0a partial representation of reality in the form of\u00a0<strong>numerical data<\/strong>. During this process,\u00a0<strong>it gains experience and tries to create its own understanding<\/strong>\u00a0of the given data. That understanding has a\u00a0<strong>hierarchical structure<\/strong>, as the algorithm has\u00a0<strong>layers<\/strong>. The first layer learns the simplest facts and is connected to the next layer, which uses the experience of the previous one to learn more complicated facts.\u00a0<strong>The number of layers is called the depth of the model<\/strong>. The more layers, the more complicated data representations the model can learn.<\/p>\n<figure id=\"2840\"><img decoding=\"async\" style=\"width: 700px; height: 360px;\" src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*XDm7O6tNYy9CBesXIa2SqQ.png\" data-src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*XDm7O6tNYy9CBesXIa2SqQ.png\" \/><figcaption>\u00a0<\/figcaption><\/figure>\n<p id=\"7fa9\" style=\"text-align: center;\">A Neural Network used for face detection. It learns a hierarchy of representations: corners in the first layer, eyes and ears in the second layer, and faces in the third layer (source: strong.io)<\/p>\n<h4><strong>Is Deep Learning really a new technology?<\/strong><\/h4>\n<p id=\"1a57\">Some of you might think that Deep Learning is a technology that was developed only recently. That\u2019s not entirely true. 
Deep Learning has a very rich history and has gone by various names depending on the philosophical viewpoint of the era. People were\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Ada_Lovelace\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/Ada_Lovelace\" data->dreaming about intelligent machines over a hundred years ago, before the first mathematical concepts were developed<\/a>. There have been three waves of development.<\/p>\n<p id=\"ff8a\">During the first wave, Deep Learning went by the name\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Cybernetics\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/Cybernetics\" data->Cybernetics<\/a>. The first predecessors of modern deep learning were linear models inspired by the study of the nervous system \u2014\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Neuroscience\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/Neuroscience\" data->Neuroscience<\/a>. The first\u00a0<a href=\"https:\/\/pdfs.semanticscholar.org\/5272\/8a99829792c3272043842455f3a110e841b1.pdf\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/pdfs.semanticscholar.org\/5272\/8a99829792c3272043842455f3a110e841b1.pdf\" data->concept of the neuron (1943)<\/a>, the smallest piece of a Neural Network, was proposed by McCulloch and Pitts as an attempt to model brain function. 
A few years later, Frank Rosenblatt turned that concept into the first trainable model \u2014 the\u00a0<a href=\"http:\/\/citeseerx.ist.psu.edu\/viewdoc\/download?doi=10.1.1.335.3398&amp;rep=rep1&amp;type=pdf\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"http:\/\/citeseerx.ist.psu.edu\/viewdoc\/download?doi=10.1.1.335.3398&amp;rep=rep1&amp;type=pdf\" data->Mark 1 Perceptron<\/a>.<\/p>\n<p id=\"4da9\" style=\"text-align: center;\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*lC7w-cbJ26_FEGjnYskahw.jpeg\" \/><\/p>\n<p style=\"text-align: center;\">Mark 1 Perceptron (source: Wikipedia)<\/p>\n<p>But the theories available at the time could not adequately describe brain behaviors, so interest in these models declined for the next 20 years.<\/p>\n<p id=\"7420\">The second wave started in the 80s and went by the name\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Connectionism\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/Connectionism\" data->Connectionism<\/a>, though the term Neural Networks also started to be used more often. The main idea was that neurons could achieve more intelligent behaviors when grouped together in large numbers. This concept was introduced by Hinton and is called\u00a0<a href=\"https:\/\/www.cs.toronto.edu\/~hinton\/absps\/families.pdf\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/www.cs.toronto.edu\/~hinton\/absps\/families.pdf\" data->distributed representation (1986)<\/a>. It\u2019s still very central to today\u2019s Deep Learning. Another great accomplishment of the second wave was\u00a0back-propagation (popularized by Rumelhart, Hinton and Williams in 1986, with related work by Yann LeCun in 1987) \u2014 the core algorithm that is used to this day for training Neural Network parameters. 
Also, in 1982,\u00a0<a title=\"John Hopfield\" href=\"https:\/\/en.wikipedia.org\/wiki\/John_Hopfield\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/John_Hopfield\" data->John Hopfield<\/a>\u00a0invented Recurrent Neural Networks, which, after the later introduction of the\u00a0<a href=\"http:\/\/www.bioinf.jku.at\/publications\/older\/2604.pdf\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"http:\/\/www.bioinf.jku.at\/publications\/older\/2604.pdf\" data->LSTM in 1997, are used today for language translation<\/a>. Those few years of big hype around Neural Networks ended because the expectations of the many investors hoping to see AI in products were not fulfilled.<\/p>\n<figure id=\"52db\"><img decoding=\"async\" style=\"width: 700px; height: 263px;\" src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*J5W8FrASMi93Z81NlAui4w.png\" data-src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*J5W8FrASMi93Z81NlAui4w.png\" \/><\/figure>\n<p id=\"778d\" style=\"text-align: center;\">An LSTM-cell-based Recurrent Neural Network (source:\u00a0<a href=\"http:\/\/colah.github.io\/posts\/2015-08-Understanding-LSTMs\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" data-href=\"http:\/\/colah.github.io\/posts\/2015-08-Understanding-LSTMs\/\" data->http:\/\/colah.github.io\/posts\/2015-08-Understanding-LSTMs\/<\/a>)<\/p>\n<p>The third wave started in 2006. By that time, computers had become common machines that everyone could afford. Thanks to various groups, e.g. gamers, the market for powerful GPUs had grown. The Internet was available to everyone. Companies started paying more attention to analytics \u2014 gathering data in digital form. As a side effect, researchers had more data, and more computational power, to perform experiments and validate theories. 
Consequently, there was another huge advancement, thanks to\u00a0<a href=\"http:\/\/www.cs.toronto.edu\/~fritz\/absps\/ncfast.pdf\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"http:\/\/www.cs.toronto.edu\/~fritz\/absps\/ncfast.pdf\" data->Geoffrey E. Hinton, who managed to train a Neural Network with many layers<\/a>. From that moment, many different proposals for Neural Network architectures with many layers started to appear. Scientists referred to the number of layers in a Neural Network as its \u201cdepth\u201d \u2014 the more layers it had, the deeper it was. A very important milestone was the victory of the Convolutional Neural Network\u00a0<a href=\"https:\/\/papers.nips.cc\/paper\/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/papers.nips.cc\/paper\/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf\" data->AlexNet<\/a>\u00a0in the ILSVRC-2012 image classification contest. It revolutionized many industries by providing them with a reliable image recognition mechanism \u2014 allowing many machines, e.g. autonomous cars, to see.<\/p>\n<figure id=\"9a45\"><img decoding=\"async\" style=\"width: 700px; height: 218px;\" src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*YXAvY6cemqDsPXrV1c5nxw.jpeg\" data-src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*YXAvY6cemqDsPXrV1c5nxw.jpeg\" \/><\/figure>\n<p id=\"b4cc\" style=\"text-align: center;\">Structure of the AlexNet CNN (source: Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton, \u201cImageNet Classification with Deep Convolutional Neural Networks\u201d, 2012)<\/p>\n<p>In 2014, Ian Goodfellow introduced a new type of Neural Network called\u00a0<a href=\"https:\/\/arxiv.org\/abs\/1406.2661\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/arxiv.org\/abs\/1406.2661\" data->Generative Adversarial Networks<\/a>. 
In this architecture, two Neural Networks compete against each other. The first network tries to mimic some distribution of data, while the role of the second network is to tell whether the data it receives is fake or real. The goal of the first network is to trick the second. This competition leads to an increase in the performance of the first network, and makes it possible to generate any kind of data \u2014 images, music, text, speech.<\/p>\n<figure id=\"834d\"><img decoding=\"async\" style=\"width: 700px; height: 350px;\" src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*6MTr2nbXymtlZGqXwgYBCw.gif\" data-src=\"https:\/\/cdn-images-1.medium.com\/max\/800\/1*6MTr2nbXymtlZGqXwgYBCw.gif\" \/><\/figure>\n<p id=\"c10b\">A GAN used to transfer the style of one image onto another (source:\u00a0<a href=\"https:\/\/github.com\/lengstrom\/fast-style-transfer\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" data-href=\"https:\/\/github.com\/lengstrom\/fast-style-transfer\" data->https:\/\/github.com\/lengstrom\/fast-style-transfer<\/a>)<\/p>\n<p>And that brings us to today. The third wave continues, and it depends on us how far it can go!<\/p>\n<h4 id=\"3314\"><strong>Why am I creating this series of articles?<\/strong><\/h4>\n<p id=\"cf09\">I am really passionate about Machine Learning and especially Deep Learning. My dream is to become a Machine Learning Expert \u2014 a person who works with people to solve problems and democratize knowledge. I am working hard every day to reach that goal, and this blog is a part of it. So study with me!<\/p>\n<p id=\"e013\">In my opinion, the biggest problem with access to this technology is that it was developed at universities and in laboratories by highly qualified Ph.D. scientists, and it still partially stays there. 
It\u2019s understandable, as everything is strongly based on\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Linear_algebra\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/Linear_algebra\" data->Linear Algebra<\/a>,\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Probability\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/Probability\" data->Probability<\/a>,\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Information_theory\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/Information_theory\" data->Information Theory<\/a>\u00a0and\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Numerical_analysis\" target=\"_blank\" rel=\"noopener noreferrer\" data-href=\"https:\/\/en.wikipedia.org\/wiki\/Numerical_analysis\" data->Numerical Computing<\/a>. But\u00a0<strong>in order to become a driver, you don\u2019t need to understand the engine<\/strong>, right? There is still a conviction that in order to work in this field you need a Ph.D., but\u00a0<a href=\"https:\/\/www.quora.com\/Is-a-PhD-necessary-for-a-job-in-machine-learning-or-can-I-work-in-the-industry-without-one-Would-I-still-be-able-to-work-on-the-cutting-edge-Is-a-PhD-worth-it-if-I-have-no-intentions-of-entering-academia\" target=\"_blank\" rel=\"noopener noreferrer\" class=\"broken_link\">that is starting to change, as it already has in Software Engineering<\/a>.<\/p>\n<p id=\"e3e0\">Demand for people with these skills will become so big that it will simply be impossible for everyone to hold a Ph.D. That\u2019s why, to put the technology in people\u2019s hands, there must be someone who can translate it for others \u2014 skipping complicated proofs and scientific notation, and adding more intuition.<\/p>\n<h4 id=\"f2c2\"><strong>What I hope to show to\u00a0you<\/strong><\/h4>\n<p id=\"09a6\">My goal is to provide a strong understanding of the most popular topics related to Deep Learning. 
I don\u2019t want to be overprotective when it comes to picking content \u2014 I want to show you even the more complicated stuff and, at the same time, do my best to provide you with the intuition to grasp it. My main priority is to help you understand how those algorithms work and to teach you how to code them from scratch. As Mark Daoust (Developer Programs Engineer for TensorFlow) once said to me:<\/p>\n<blockquote id=\"f0a9\"><p>Everyone should code a Neural Network from scratch once\u2026 but only once\u2026<\/p><\/blockquote>\n<p id=\"90f9\">So there will be a lot of code, which I plan to explain carefully. Along the way, you can expect mini-projects where I will show you how to use what we\u2019ve learned.\u00a0<strong>It\u2019s really important for knowledge to be followed by practice<\/strong>.<\/p>\n<p id=\"ab33\">The approach will therefore be bottom-up:<\/p>\n<ul>\n<li id=\"7380\">low-level \u2014 basic (and explained) math turned into Python NumPy code,<\/li>\n<li id=\"7b5f\">mid-level \u2014 TensorFlow (both the tf.nn and tf.layers modules), where most of what I\u2019ve already shown you can be automated in a single line of code,<\/li>\n<li id=\"15c8\">high-level \u2014 Keras, a very popular framework that allows you to create Neural Networks really fast.<\/li>\n<\/ul>\n<p id=\"da69\">This project will focus only on Multilayer Perceptrons. That is already a lot of work. If it succeeds, I might consider doing an extension for Convolutional Neural Networks, Recurrent Neural Networks, and Generative Adversarial Networks.<\/p>\n<\/section>\n","protected":false},"excerpt":{"rendered":"<p>This is an article about Neural Networks dedicated to programmers who want to understand the basic math behind the code and non-programmers who want to know how to turn math into&nbsp;code. 
In recent years, applications of Deep Learning have made huge advances in many domains, astonishing people who didn&rsquo;t expect the technology and the world to change so fast. Deep Learning currently excels in tasks like image recognition, Autonomous Vehicles, games like Chess,&nbsp;and also&nbsp;computer games, Language Translation, and speech recognition.<\/p>\n","protected":false},"author":321,"featured_media":3074,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"content-type":"","footnotes":""},"categories":[183],"tags":[97],"ppma_author":[2905],"class_list":["post-1353","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-ml","tag-artificial-intelligence"],"authors":[{"term_id":2905,"user_id":321,"is_guest":0,"slug":"kamil-krzyk","display_name":"Kamil Krzyk","avatar_url":"https:\/\/secure.gravatar.com\/avatar\/?s=96&d=mm&r=g","user_url":"","last_name":"Krzyk","first_name":"Kamil","job_title":"","description":"Kamil Krzyk is a Data Scientist at <a href=\"http:\/\/www.azimo.com\/\">Azimo<\/a>. In the past, he was a Full Stack Engineer on the mobile team. 
Passionate about Machine Learning technology, he focuses on building software components which use data and math at their core."}],"_links":{"self":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/1353","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/users\/321"}],"replies":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/comments?post=1353"}],"version-history":[{"count":5,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/1353\/revisions"}],"predecessor-version":[{"id":33022,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/1353\/revisions\/33022"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/media\/3074"}],"wp:attachment":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/media?parent=1353"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/categories?post=1353"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/tags?post=1353"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/ppma_author?post=1353"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}