Over the last 2-3 years, a lot has been written about data growing faster than hardware capabilities. Computing power has largely progressed along the trajectory Gordon Moore predicted half a century ago, and that was enough to handle the increase in data volumes. But with the advent of the Internet of Things (IoT), that has changed. The numerous clicks, touches and sensors on pretty much every "thing" have been pumping out data at a pace that puts Moore's law on notice. Frequently cited examples include the size of Facebook's daily log (~60 TB) and the data generated by a single transatlantic flight (~650 TB).
The supply chain has not been immune to this trend. Omni-channel sales and service, sensors that not only track goods in transit but also monitor their condition, and forecasts based on social media feeds have increased not only the quantum of data but also its velocity.
But more data does not automatically mean better decisions. If anything, it becomes a stark pointer to the inadequacy of existing tools, and not just the hardware: the software traditionally used to analyze the data for decision making falls short too.
To start with, the analysis itself becomes more complex. Where decisions were once made based on 3-4 variables, examining another 3 is the new norm. For example, to decide which container to prioritize and clear at a congested port, the traditional analysis would involve inventory levels at the warehouse, sell-through rates at retail stores and retail inventory. But today you also have to factor in what the twitterati is saying about the goods in each container. Throw in cargo conditions (for example, excess humidity might have damaged the goods) and the relative location of alternative containers, and you have a potent mix for a complexity explosion.
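To make the complexity concrete, here is a minimal sketch of how such a multi-factor prioritization might be scored as a weighted combination of signals. This is purely illustrative: every field name, weight and formula below is a hypothetical assumption, not OpsVeda's actual logic.

```python
from dataclasses import dataclass

@dataclass
class Container:
    # All fields below are hypothetical, for illustration only.
    warehouse_stock_days: float  # days of cover remaining at the warehouse
    retail_stock_days: float     # days of cover remaining at retail stores
    sell_through_rate: float     # units sold per day at retail
    sentiment: float             # social media buzz, -1 (negative) to +1 (positive)
    humidity_ok: bool            # cargo condition within tolerance
    reroute_days: float          # days until an alternative container could arrive

def priority_score(c: Container) -> float:
    """Higher score = clear this container first (illustrative weights)."""
    if not c.humidity_ok:
        return float("-inf")  # possibly damaged goods: never prioritize
    # Fast-selling goods with little stock cover are the most urgent.
    urgency = c.sell_through_rate / max(c.warehouse_stock_days + c.retail_stock_days, 0.1)
    # Positive social buzz raises priority, negative buzz lowers it.
    buzz = 1.0 + 0.5 * c.sentiment
    # The longer it would take to reroute an alternative, the more this one matters.
    scarcity = 1.0 + 0.1 * c.reroute_days
    return urgency * buzz * scarcity
```

Even this toy version shows why spreadsheet-era analysis strains: each new signal multiplies the combinations a planner would otherwise have to weigh by hand.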
Next comes the pace at which decisions have to be made. Data that is generated fast also becomes obsolete fast. So it is no longer about collecting data, but about connecting with the data on the fly. This is where current hardware can become the limiting factor. Traditional methods of batch reports imported into gigantic spreadsheets mean that your competitors are making better decisions simply by being more current with their data. Newer technologies like in-memory computing and cluster computing are needed to handle the data in real time without spending a fortune on specialized hardware.
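The difference between batch and on-the-fly processing can be sketched with a trivial example: a metric maintained incrementally as each event arrives, instead of re-scanning the whole history in a nightly batch job. The class below is a generic illustration, not any specific product's implementation.

```python
class RunningMean:
    """Maintain a mean incrementally: O(1) work per event,
    versus re-reading the entire history in a batch recompute."""

    def __init__(self) -> None:
        self.count = 0
        self.mean = 0.0

    def update(self, value: float) -> float:
        """Fold one new reading (e.g. a sensor event) into the metric."""
        self.count += 1
        self.mean += (value - self.mean) / self.count
        return self.mean
```

A stream of sensor readings can be folded in as it arrives, so the metric is always current; a spreadsheet refreshed from last night's batch export is, by construction, always hours behind.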
But being able to handle the data in real time does not by itself amount to decision support, especially when the analysis needed is complex. That is where an operational intelligence tool like OpsVeda plays a critical role. The platform not only acquires the data in real time but also interprets it according to predictive and prescriptive rules framed by the operations user. These rules can be very complex and industry- or function-specific. But for the operations team it is as simple as peering at a screen where the apparel container to prioritize and release is highlighted, the order that needs immediate confirmation is bubbled up, or the batches of drugs that cannot be shipped are traced.
Still thinking about how to manage your operations in the new IoT world? Give us a call to learn how you can optimize revenue opportunities and mitigate risks... and still keep things simple.