Hands-On Python Deep Learning for the Web

Demystifying Artificial Intelligence and Fundamentals of Machine Learning

"Just as electricity transformed almost everything 100 years ago, today I actually have a hard time thinking of an industry that I don't think AI will transform in the next several years."
- Andrew Ng

This quote may sound familiar, and it resonates strongly with the technological disruption we are witnessing today. In recent years, Artificial Intelligence (AI) has become an area of great interest to almost every industry. Whether it is an educational company, a telecommunications firm, or an organization working in healthcare, all of them have incorporated AI to enhance their businesses. This integration of AI with so many industries only promises to deepen with time and to solve critical real-world problems in intelligent ways. Today, our phones can make clinical appointments for us on instruction, our phone cameras can tell us several human-perceived attributes of the images they capture, and our car alarm systems can detect our driving patterns and save us from possible accidents. Such examples will only multiply and grow more intelligent with advancements in research, technology, and the democratization of computing power.

As we step into the era of Software 2.0, it is important to understand why a technology that has existed since the 1950s is making headlines only now. Yes, artificial intelligence was born in the 1950s, when a handful of computer scientists and mathematicians, such as Alan Turing, began to ask whether machines could think and whether they could be endowed with intelligence so that they could answer questions on their own, without being explicitly programmed.

Soon after this inception, the term artificial intelligence was coined by John McCarthy in 1956 at an academic conference. From the question "Can machines think?" (posed by Turing in his 1950 paper, Computing Machinery and Intelligence) to the present day, the world of AI has produced results that we could never have imagined.

Today, it is almost impossible to imagine a day without using the web; it has become one of our fundamental necessities. Our favorite search engines can answer our questions directly rather than merely returning a list of relevant links. They can analyze online text, detect its intent, and summarize its content. All of this is possible because of AI.

This book aims to be a hands-on guide on how to use AI techniques such as deep learning to build intelligent web applications for computer vision, natural language processing, security, and much more. This chapter provides a quick refresher on AI, its different types, and the basic concepts of ML, and it introduces some of the biggest names in the industry and what they are doing by fusing AI with web technologies. We will cover the following topics:

  • Introduction to AI and its different types
  • Machine Learning (ML): the most popular form of AI
  • A brief introduction to Deep Learning (DL)
  • The relationship between AI, ML, and DL
  • Fundamentals of ML
  • The web before and after AI
  • The biggest web-AI players and what they are doing