OnPrem is now Qvest! Providing technology and business solutions across five continents, we are proud to be the world’s largest media- and entertainment-focused consulting firm.
Just as the rise of digital assets fundamentally changed the way the industry had to think about its supply chain, we believe the next big disruptor will come from connecting data at a much higher fidelity than is done today. The ability to exploit existing micro-level data, such as transactional data, to make real-time decisions used to be difficult if not impossible. Often, the data you were looking for was never captured, or by the time it became available it was obsolete. That has all changed with the emergence of API-driven services, data and analytics tools, and cloud capabilities. Companies can now leverage the transactional data from services and audit logs to quickly get near real-time information at a granular level. This enables organizations to do things they previously could only imagine, from making real-time, data-driven decisions, to making more accurate forecasts, to allowing for more precise internal or external chargeback models.
In the past, reporting and data were limited by the storage expense of logging detailed transactional data, as well as the computational cost of querying and merging data from separate databases. Business executives would request an analysis that took an analyst days or weeks to run, and often the data was not even captured at the level of detail needed to make accurate decisions. Real-time decision making was limited in two ways within systems: first, by the complexity of the data relationships, such that further analysis was needed just to understand what data had to be pulled in order to answer the business question; and second, by the time and effort required to develop code instrumented to gather the required data, work that was usually deprioritized behind immediate business needs.
Building relationships between data was further challenged by the way systems and services have traditionally been architected. Data has been copied between systems, but the relationships between the data have not been preserved. Legacy systems tended to be isolated and siloed, with limited or superficial integration points. The exchange of data often included only a status and a bare UUID (Universally Unique Identifier), with no information on what was completed or how. To complicate things further, when valuable data was stored in the system database, it often resided in unrelated schemas that failed to decouple meaningful business data from internal operational status data. With such limited access to granular transactional data, most integrations operated only at a surface level.
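To make the contrast concrete, here is a minimal sketch of the two handoff styles. All field names and values are illustrative assumptions, not drawn from any specific system:

```python
# Legacy handoff: a status and a bare UUID. The receiving system
# learns that something finished, but not what was done or how.
legacy_message = {
    "asset_id": "0b6f3c1e-8d2a-4f1b-9c7e-2a5d4e8f1a3b",
    "status": "COMPLETE",
}

# A relationship-aware event: the same handoff, carrying enough
# transactional detail to link systems and analyses downstream.
enriched_event = {
    "asset_id": "0b6f3c1e-8d2a-4f1b-9c7e-2a5d4e8f1a3b",
    "title_id": "TTL-48213",           # ties the asset back to a title
    "status": "COMPLETE",
    "step": "transcode",               # what was completed
    "profile": "HEVC-4K-HDR10",        # how it was completed
    "vendor": "vendor-a",              # who performed the work
    "started_at": "2023-05-01T12:00:00Z",
    "finished_at": "2023-05-01T12:42:10Z",
    "cost_usd": 0.87,                  # metered cost, where available
}
```

The enriched form is what enables the analyses described later in this piece: every downstream question about bottlenecks, vendors, or cost depends on these fields being present.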
Current Advances
Today, the growing adoption of microservices and APIs has done much to improve the ability to quickly access system data for decision-making purposes. Additionally, cloud data providers make it easier to move data between the products and services under their umbrella. The philosophy of microservices is to architect cohesive service units, where each unit accomplishes a bounded, focused business goal and exposes an API to that capability. How does this help? Part of the goal of a microservice architecture is to construct your system in such a way that bigger things can be built by combining microservices in a composable way, similar to how houses and cars can be built from Lego blocks. Requests to microservices get funneled through an API Gateway, at which point client data gets extracted, transformed, and pushed into a data lake: a centralized pool of structured and unstructured data that supports big data applications. An API Gateway in front of the microservices also protects backward compatibility for existing clients by supporting versioned releases, and it shortens development time for new microservices in organizations already practicing continuous delivery. Finally, with an abstraction layer between each platform and a streamlined path for future integrations, managers of legacy systems can begin to evaluate the costs and impacts of swapping additional back-end systems out for cloud-born alternatives.
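The sketch below shows this pattern in miniature: one gateway choke point that routes versioned requests to services and emits a transactional event for each call. Everything here, including the class, the route names, and the in-memory list standing in for the data lake, is an illustrative assumption rather than any particular product’s API:

```python
import json
import time
import uuid

DATA_LAKE = []  # stand-in for an object store or streaming sink

def emit_event(record: dict) -> None:
    """Extract-transform-push: in production this would land in a
    cloud data lake; here we append JSON to an in-memory list."""
    DATA_LAKE.append(json.dumps(record))

class ApiGateway:
    """Routes versioned requests to services and logs every call."""

    def __init__(self):
        self.routes = {}  # (version, path) -> handler

    def register(self, version: str, path: str, handler) -> None:
        self.routes[(version, path)] = handler

    def handle(self, version: str, path: str, payload: dict):
        started = time.time()
        handler = self.routes[(version, path)]  # old versions keep working
        response = handler(payload)
        emit_event({
            "request_id": str(uuid.uuid4()),
            "service": path,
            "version": version,
            "latency_ms": round((time.time() - started) * 1000, 2),
            "payload": payload,
        })
        return response

# Two versions of the same service coexist, protecting existing
# clients while new clients move forward.
gateway = ApiGateway()
gateway.register("v1", "/transcode", lambda p: {"status": "queued"})
gateway.register("v2", "/transcode", lambda p: {"status": "queued", "eta_s": 90})
print(gateway.handle("v1", "/transcode", {"asset": "a1"}))
print(len(DATA_LAKE), "event(s) captured for analytics")
```

Because every request crosses the gateway, the data lake accumulates a complete transactional record as a side effect of normal operations, with no per-service instrumentation work.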
The use cases below give high-level examples of the powerful insights and decisions that can be made using a data-driven, integrated platform of microservices. First, we illustrate the tactical benefits that digital supply chain operations can realize. Second, we look at a more strategic example that brings financial systems into consideration.
Applications
As mentioned earlier, today’s integrated digital supply chain systems are often monolithic, with complex integrations and limited reporting data. Fully evolved to take advantage of this new paradigm, microservices would be used across the supply chain and its vendors, both on premises and on cloud platforms, on a service-by-service level. The transaction-level data pushed to the data lake would provide a wealth of information about the lifecycle of assets flowing through the ecosystem. On the surface, end-to-end delivery status, supply chain bottlenecks, and defects can be easily discovered, and more efficient processes can be developed. On a deeper level, consumer and financial data could be tied to operational data to discover new and interesting relationships. Imagine taking consumer usage data from services such as Netflix, Amazon, and Hulu, and using it to prioritize delivery and discover trends in your back catalog. Furthermore, because of the low overhead of integration and the ability to target a small, modular piece of the workflow, new vendors could be tried out on a single-service basis, with data such as performance, defects, and invoicing cost used to make short- or long-term decisions. These are just a few concepts for how, on a tactical level, understanding the transactional data can create real impact on operations, as the sketch below illustrates.
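As a minimal sketch of that tactical analysis, assume lifecycle events in the data lake carry the step, vendor, and timestamps from the enriched event shape shown earlier (the sample records are fabricated for illustration). Average dwell time per step and vendor surfaces bottleneck candidates and supports single-service vendor trials:

```python
from collections import defaultdict
from datetime import datetime

# Fabricated lifecycle events, as they might land in the data lake.
events = [
    {"asset": "a1", "step": "qc",        "vendor": "vendor-a",
     "started_at": "2023-05-01T10:00:00", "finished_at": "2023-05-01T10:20:00"},
    {"asset": "a2", "step": "qc",        "vendor": "vendor-b",
     "started_at": "2023-05-01T10:00:00", "finished_at": "2023-05-01T12:00:00"},
    {"asset": "a1", "step": "transcode", "vendor": "vendor-a",
     "started_at": "2023-05-01T10:25:00", "finished_at": "2023-05-01T10:55:00"},
]

def minutes(event: dict) -> float:
    """Dwell time of one event, in minutes."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    start = datetime.strptime(event["started_at"], fmt)
    finish = datetime.strptime(event["finished_at"], fmt)
    return (finish - start).total_seconds() / 60

# Average dwell time per (step, vendor): slow cells are bottleneck
# candidates; per-vendor splits support single-service trials.
totals = defaultdict(lambda: [0.0, 0])
for event in events:
    key = (event["step"], event["vendor"])
    totals[key][0] += minutes(event)
    totals[key][1] += 1

for (step, vendor), (total, count) in sorted(totals.items()):
    print(f"{step:10s} {vendor:10s} avg {total / count:6.1f} min over {count} job(s)")
```

The same aggregation, sliced by defect counts or invoiced cost instead of dwell time, would drive the vendor-trial decisions described above.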
From a Studio perspective, understanding each transaction could also have a deeper financial impact. One of the constant challenges of centralized operations is accurately charging back the costs and overhead of shared services. As operations are pushed into the cloud (and to SaaS vendors who provide services via the cloud), understanding the cost of operations becomes much easier. Content in the cloud is represented by its URI, or Uniform Resource Identifier, allowing cost data to be captured in real time, down to a fraction of a cent, for those cloud services that are metered. Linking title information to the URI could paint a picture of the true cost of development and distribution, a figure that today is often fraught with unreported costs and errors. When every cost can be billed back to the title, the impact for the studio is substantial. How could this information be used to aid in forecasting? Estimated costs, layered with information such as historic theatrical sales, home distribution royalties, current consumer viewing trends, and competitor data, could be used to create complex and accurate forecasting models. The possibilities for advancing digital supply chains are endless.
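A hedged sketch of the per-title chargeback roll-up follows. It assumes a metering feed of (URI, service, cost) records and a naming convention that embeds the title ID in the URI path; both assumptions are illustrative rather than standard practice:

```python
from collections import defaultdict

# Fractional-cent metering records, as emitted by metered cloud services.
metered_usage = [
    ("s3://studio-masters/TTL-48213/feature.mxf", "storage",   0.0042),
    ("s3://studio-masters/TTL-48213/feature.mxf", "transcode", 0.8700),
    ("s3://studio-masters/TTL-51077/trailer.mov", "storage",   0.0007),
    ("s3://studio-masters/TTL-51077/trailer.mov", "delivery",  0.0310),
]

def title_of(uri: str) -> str:
    """Derive the title ID from the URI: here it is assumed to be the
    first path segment under the bucket."""
    return uri.split("/")[3]

# Roll metered costs up to the title and service: the basis for a
# precise chargeback model, and an input to forecasting when layered
# with sales history and viewing trends.
costs = defaultdict(float)
for uri, service, usd in metered_usage:
    costs[(title_of(uri), service)] += usd

for (title, service), usd in sorted(costs.items()):
    print(f"{title}  {service:10s} ${usd:.4f}")
```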
Conclusions
Combining microservices with a data and analytics toolset offers substantial benefits to both operations and strategy. While the trend is taking off, there is still a lot of work to be done. Deep-rooted, monolithic systems need to be broken up, or integrated with microservice connectors, to capture the data necessary to make these advances. Change is required not only in the technology, but also in the way data is conceptualized and analyzed. Analysts need to think outside the box and build relationships across tiers of data in what could be seen as unrelated systems. Only then can this new ecosystem take shape and its true benefits be realized.