Well, first off, the Internet of Things is a buzzword. And like many technical buzzwords, the underlying concept it purports to describe existed long before the media-friendly “IoT” moniker was coined.
One way to look at it is that the first wave of “settlers” on the Internet was people, along with the servers offering up the human-consumable content those people came looking for.
The next wave is turning out to be machines exchanging data with other machines - data which, on its own, is likely of little interest to the average human. It’s an Internet of Things, superimposed over the traditional Internet that we all know and love. And although the data traversing this network is usually not human-readable, it can be analyzed (especially in aggregate) to yield insights that are most definitely of interest to us.
The reality is that the IoT is nothing new; machines of all sorts have been connected to the Internet for as long as the Internet has existed. The difference now is that connectivity (particularly wireless) has become so pervasive and inexpensive that it’s now feasible to connect many more devices - devices which, in the past, were too remote, too power-hungry, or too insignificant to justify a connection to the Internet.
The other innovation driving the IoT is the combination of “Big Data” analytics and machine learning. On one side you have an explosion of small, inexpensive, connected devices; on the other, incredibly powerful capabilities for categorizing and analyzing the vast amounts of data those devices produce.
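To make the device side of that picture concrete, here’s a minimal sketch of what one of those machine-to-machine exchanges might look like: a sensor node posting a compact JSON reading to a cloud ingestion endpoint. Everything here is hypothetical - the URL, the device ID, and the sensor stub are invented for illustration, and a real device might well use MQTT or another lightweight protocol instead of plain HTTP.

```python
import json
import random
import time
import urllib.request

# Hypothetical ingestion endpoint -- stands in for whatever cloud
# service would collect and aggregate telemetry from many devices.
INGEST_URL = "https://example.com/telemetry"

def read_temperature():
    # Placeholder for a real sensor driver; returns degrees Celsius.
    return round(20.0 + random.uniform(-2.0, 2.0), 2)

def publish_reading(device_id):
    # Machine-to-machine payload: terse JSON that no human would want
    # to read directly, but that a backend can easily aggregate.
    payload = json.dumps({
        "device_id": device_id,
        "timestamp": int(time.time()),
        "temperature_c": read_temperature(),
    }).encode("utf-8")

    req = urllib.request.Request(
        INGEST_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    publish_reading("sensor-0042")
```

One reading like this is nearly worthless on its own; the value appears when thousands of devices report like this around the clock and the data is analyzed in aggregate.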
So, what is IoT?
The Internet of Things, or IoT, is just a generalized way of referring to the vast collection of devices - from the very large and complex to the very small and seemingly insignificant - that collect data from every imaginable facet of human life, and to the practice of using the insights we mine from that data to make smarter decisions.