This business came to Teradata for guidance. They were aware they couldn’t just give a data scientist every toy on the market — they needed a vision for how to navigate this difficult journey.
The bank first set out to acquire an enterprise big data platform, with a technology framework that was based on an open-source analytical environment and was capable, scalable and flexible. They wanted a framework that gave their data scientists as much freedom as possible so they could innovate and develop new insights. They also needed to set up a governance structure so that employees had appropriate access to the data.
Initially, Teradata helped the client batch-ingest data into their enterprise data analytics platform, which enabled them to identify thousands of derived variables. They were also able to stand up near-real-time analytics by implementing multiple data ingestion pipelines. This all fed into what the bank termed its Data Lab 2.0.
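To make the "derived variables" step concrete, here is a minimal sketch of how a batch pipeline can turn raw ingested records into per-customer derived variables. The data, column names, and aggregations are illustrative assumptions, not the bank's actual schema:

```python
import pandas as pd

# Hypothetical sample of ingested transaction records; columns are illustrative.
transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2],
    "amount": [120.0, 80.0, 35.0, 220.0, 45.0],
})

# Derive per-customer variables from the raw batch -- the kind of
# "derived variables" a batch ingest pipeline can produce at scale.
derived = transactions.groupby("customer_id")["amount"].agg(
    txn_count="count",
    avg_amount="mean",
    total_amount="sum",
).reset_index()

print(derived)
```

At production scale the same groupby-and-aggregate pattern would run on a distributed engine rather than a single pandas process, but the shape of the transformation is the same.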
This new architecture allowed the company to use web interfaces and Python libraries distributed across a cluster and to integrate them with the company's development tools, which were deployed in both preproduction and production.
According to the customer, the lab enabled the bank to create new algorithms for the first time and then push their output through an analytics visualization tool, creating transparency and the ability to respond quickly to the needs of its environment.
To build its models, the bank didn't want a manual process in which a model would go through review and business approvals and then be handed to IT for manual deployment to production, a cycle that could take six to eight months. Instead, they wanted a lifecycle process that allowed them to make a model production-ready as quickly as possible.
The bank opted to develop its models with feature engineering at the front end, along with a set of parameters to measure each model's effectiveness, and it used Stash (now Bitbucket Server) for version control. That allowed the bank to trigger a continuous integration process that automatically ran tests on the model. Its data scientists still design the models, but anyone in the business can productionalize them once they are built. Through this method, it takes the company less than a day to retrain its models.
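The automated testing step above can be sketched as a simple CI "promotion gate": when a retrained model is pushed, the pipeline evaluates it on held-out data and blocks promotion if it falls below a quality floor. Everything here is a hypothetical illustration (the `evaluate_model` helper, the threshold, and the sample data are assumptions, not the bank's actual checks):

```python
# Minimal sketch of a CI quality gate for a retrained model.
# The threshold and data are illustrative assumptions.

def evaluate_model(predictions, labels):
    """Held-out accuracy of a candidate model's predictions."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

ACCURACY_FLOOR = 0.80  # promotion gate; illustrative threshold

# Hypothetical held-out labels and the candidate model's predictions.
labels      = [0, 1, 1, 0, 1, 0, 1, 1, 0, 1]
predictions = [0, 1, 1, 0, 1, 0, 0, 1, 0, 1]

score = evaluate_model(predictions, labels)
assert score >= ACCURACY_FLOOR, f"blocked: accuracy {score:.2f} below floor"
print(f"accuracy {score:.2f} passes gate; model can be promoted")
```

In a real pipeline this check would run automatically on every commit to version control, which is what lets a model move from review to production in hours instead of months.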
In the course of a year, the bank went from standing up a data analytics architecture for the first time to putting 200 models into production in a single month. And its future plans are even more aggressive: They are standing up an artificial intelligence platform, including a customer-facing robot at a handful of the bank's 2,000 branches. The bank plans to use AI for theft detection, geolocated offers, customer segmentation, preventive maintenance, client-facing chatbots, virtual assistants and fraud detection.
The bank proved it's possible for any business to operationalize analytics in the enterprise, and, in fact, starting from scratch can help a business go from zero to 60 while the competition is left in the dust.