
History of the IoT

The term IoT can most likely be attributed to Kevin Ashton and his 1997 work at Procter & Gamble using RFID tags to manage supply chains. That work brought him to MIT in 1999, where he and a group of like-minded individuals started the Auto-ID Center research consortium (for more information, visit http://www.smithsonianmag.com/innovation/kevin-ashton-describes-the-internet-of-things-180953749/). Since then, the IoT has grown from simple RFID tags into an ecosystem and industry that by 2020 will cannibalize, create, or displace five trillion out of one hundred trillion global GDP dollars, or roughly 5% of the world's GDP. Up through 2012, the things connected to the Internet were primarily smartphones, tablets, PCs, and laptops; essentially, things that functioned in all respects as computers. When the Internet had its humble beginnings with ARPANET in 1969, most of the technologies that now surround the IoT did not exist, and up to the year 2000, most devices associated with the Internet were, as stated, computers of various sizes. The following timeline shows the slow progress in connecting things to the Internet:

 

Certainly, the term IoT has generated a lot of interest and hype. From a buzzword standpoint, one can easily see that the number of patents issued (https://www.uspto.gov) has grown exponentially since 2010, while the number of Google searches (https://trends.google.com/trends/) and IEEE peer-reviewed paper publications hit the knee of the curve in 2013:

Analysis of keyword searches for IoT, patents, and technical publications