Organizations are looking to deliver more business value from their AI investments, a hot topic at Big Data & AI World Asia. At the well-attended data science event, a DataRobot customer panel highlighted innovation with AI that challenges the status quo. A packed keynote session showed how repeatable workflows and flexible technology get more models into production. Our in-booth theater attracted a crowd in Singapore with practical workshops, including "Using AI & Time Series Models to Improve Demand Forecasting" and a technical demonstration of the DataRobot AI Cloud platform.
Automate with Rapid Iteration to Get to Scale and Compliance
Financial services leaders understand the importance of both speed and safety. At the event, a financial services panel explored why iteration and experimentation are critical in an AI-driven data science environment.
Sara Venturina, VP Head of Data at GCash, the Philippines’ leading e-wallet, and Trevor Laight, Chief Risk Officer at CIMB, a leading ASEAN universal bank, took part in a panel discussion with Jay Schuren, DataRobot Chief Customer Officer.
The panel discussion focused on Boyd’s Law of Iteration, a principle drawn from dogfighting (military aviation strategy) which holds that speed of iteration beats quality of iteration.
Trevor explained how this mindset of rapid iteration has been critical to keeping pace with the evolving needs of the business. He reinforced that the ability to use automation within the experimentation and iteration phases has allowed CIMB to continue to scale, even in the midst of unique data challenges and a complex regulatory environment.
With DataRobot AI Cloud, Trevor is able to combine his team’s expertise with the power of automation to drive repeatable experimentation at scale and ensure that the best possible model makes it into production.
Sara added that driving digital transformation was not just a technology initiative—but rather an all-encompassing change management exercise. While GCash has been growing exponentially as a disruptor in the financial market, the importance of being able to bring everyone along on the journey—even non-technical stakeholders—is crucial.
With DataRobot, Sara has the ability to explain the models that her Data Science team is creating and can automatically generate the required compliance documentation. This allows GCash to maintain the pace of innovation and iteration without exposing the business to significant risk.
Closing the Value Gap: Reducing AI Cycle Time
What happens when you try to solve complex problems in silos—without the alignment of critical stakeholders? You spawn the dreaded AI value creation gap. Ted Kwartler, VP of Trusted AI, DataRobot, shared a keynote address that put this creation gap under the microscope—and showed how AI governance can lead to faster value creation.
Data scientists in many organizations are under undue pressure to narrow this value gap. Ted explained that—by operating in silos—most businesses are getting models from their Data Science team that then need to be rewritten by IT before finally moving into production. These models don’t allow for monitoring over time, have little or no documentation, and don’t meet the fundamental needs of the business.
Closing the value gap and reducing the overall AI cycle time means addressing the individual needs of each stakeholder group within the machine learning lifecycle. Ted highlighted four key stakeholder needs:
- AI Innovators have a strategic lens and are looking at the overall ROI of the AI project while assessing critical elements like trust and risk
- AI Creators look through a technical lens and focus on defining and building the right model
- AI Implementers focus on deploying, maintaining, and monitoring the model over time and are responsible for overall system health
- AI Consumers ensure that a model fits with organizational values, compliance, legal, and regulatory requirements
In order to meet these needs, Ted enumerated the standard governance questions that organizations need to address:
- Is the code easy to read and understand?
- Is the model explainable, traceable, and auditable?
- Is the model reproducible?
- Can we be confident that it will meet regulatory requirements?
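One way to make the reproducibility and auditability questions concrete is to pin random seeds and record a hash of the training configuration alongside each run. The sketch below is a generic, platform-agnostic illustration (the function and record fields are invented for this example), not DataRobot's implementation:

```python
import hashlib
import json
import random

def train_run_record(config: dict, seed: int = 42) -> dict:
    """Run a (stubbed) training job with a fixed seed and return an
    auditable record of the run.

    The config hash lets a reviewer verify that two runs used identical
    settings; the pinned seed makes the run replayable bit-for-bit.
    """
    random.seed(seed)  # pin randomness so the run can be replayed
    config_hash = hashlib.sha256(
        json.dumps(config, sort_keys=True).encode()
    ).hexdigest()
    # Stand-in for actual model training: a deterministic "score"
    score = round(random.random(), 6)
    return {"seed": seed, "config_sha256": config_hash, "score": score}

# Two runs with the same config and seed produce identical records,
# which is exactly what an auditor needs to confirm reproducibility.
record_a = train_run_record({"model": "gbm", "depth": 4})
record_b = train_run_record({"model": "gbm", "depth": 4})
assert record_a == record_b
```

Sorting the JSON keys before hashing matters: without it, two semantically identical configs could hash differently and falsely fail an audit.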
Explainability spans across the entire DataRobot platform to support users at each step. Global explanation techniques allow stakeholders to understand the behavior of models and how features affect them. Local explanations provide row-level explanations for why a model made a prediction. Prediction explanations share which features and values contributed to an individual prediction and their impact.
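For intuition on what a prediction explanation contains, consider the simplest explainable case, a linear model, where each feature's contribution to one prediction is just its weight times its deviation from a baseline. This is a minimal hand-rolled sketch for illustration only (the feature names and numbers are made up), not the platform's explanation technique:

```python
def explain_prediction(weights: dict, baseline: dict, row: dict) -> dict:
    """Per-feature contributions for a single linear-model prediction.

    contribution[f] = weight[f] * (row[f] - baseline[f]), so the
    contributions sum exactly to (prediction - baseline prediction).
    """
    return {f: weights[f] * (row[f] - baseline[f]) for f in weights}

# Hypothetical churn-style example: weights from a fitted linear model,
# baseline set to the training-data feature means.
weights = {"income": 0.002, "age": -0.5, "tenure": 1.2}
baseline = {"income": 50000, "age": 40, "tenure": 3}
row = {"income": 60000, "age": 30, "tenure": 5}

contribs = explain_prediction(weights, baseline, row)
# income: 0.002 * 10000 = 20.0, age: -0.5 * -10 = 5.0, tenure: 1.2 * 2 = 2.4
```

A local explanation like this answers "why did the model score *this* row the way it did," while a global explanation aggregates such effects across many rows to describe the model's overall behavior.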
DataRobot offers automated documentation that speeds up the documentation process, generating deployment reports and compliance reports that outline model methodologies and performance.
Simplify Your Tech Stack with Interoperable, Flexible Tools
At Big Data & AI World Asia, DataRobot teams also discussed how flexibility and interoperability in the machine learning technology stack can help derive value from AI initiatives. Machine learning stacks are commonly fragmented and hard to manage across departments, creating complexity and cost that can inhibit scale and slow down progress. Organizations that are simplifying their stacks, with a bias toward tools that are flexible across the storage, development, and consumption layers, are better placed to capture value.
DataRobot offers flexibility and interoperability with the broadest multi-cloud and hybrid deployment options, allowing teams to leverage the infrastructure they already have in place. Broad ecosystem integrations also enable teams to work with the data where it resides, minimizing complexity and allowing for easy consumption.
Learn How to Accelerate Business Results with DataRobot AI Cloud
Learn more about the DataRobot AI Cloud platform and how it can accelerate experimentation and production timelines. Explore the DataRobot platform today.