Use Cases

The project will validate its development and research results in the scope of real-life use cases undertaken by the project partners in three different sectors: Banking, Telco and Retail.

To this end, the use case providers will integrate the CloudDBAppliance platform with their use case applications. The targeted use cases will cover the following:


Overview of CloudDBAppliance Validating Use Cases

 *BUCP: Bull Banking Use Case Partner. Bull will continue discussions with Nordea, which has proposed the following use case, in order to get it realized (Nordea failed to join the project due to delays in its internal administrative process). Should Nordea ultimately fail to join the project, Bull shall propose an equivalent use case to be validated by the project, and this use case will be either commissioned or discarded in a future project amendment.

Banking Use Case: Real-Time Risk Monitoring in Banking

The way Investment Banks monitor Credit Risk and Market Risk is rapidly changing, both to comply with regulatory requirements and to enhance competitiveness. Risk applications used to rely on batch processing to produce aggregated reports that business users would then browse with traditional BI solutions, but this approach is showing its limits.

First, the financial regulators emphasize that pre-aggregation destroys information. They explain that risk analysts should always be able to freely select and filter the detailed data, which implies doing aggregations and calculations on the fly. Second, the banks themselves want to do more than daily reporting with their risk systems. Their goal is to support the traders and sales staff who negotiate with their customers and to let them perform what-if analysis on the fly, over the live operational data.

Bull-BUCP is well advanced in this transition and has chosen the ActivePivot In-Memory platform to power its new generation of risk applications. They plan to expand this new architecture, where all analysis is done in-memory, and this raises what Bull-BUCP calls the “new requirements for operational analysis”:

  • how to deal with large datasets that do not fit in the memory of a commodity server (about 1 TB today)
  • how to maintain lean, flexible operations on large scale applications
  • how to reduce the latency to propagate operational data into the in-memory analytics application
  • how to combine real-time analysis with historical datasets

If we consider a risk application in general, the vast majority of the data consists of pricing simulations that compute the value and the sensitivities of financial trades. These simulations are calculated on farms of hundreds of servers and can produce terabytes of data per day. This data must be persisted and then loaded in-memory for interactive aggregations and calculations. When new trades are captured during the day, or when market data changes significantly, the servers must recalculate the prices for a fraction of the trades, and the refreshed simulations should be available for analysis as soon as possible.

A solution to these new requirements and technical challenges would be to combine ActivePivot with an operational database capable of persisting large amounts of data written in parallel and of transferring this data into ActivePivot at very high speed, on demand or continuously. CloudDBAppliance will develop a database that integrates the analytics features of ActivePivot with the LeanXcale operational database in a large-memory node able to run the analytics on the most recent operational data.

The success of the solution would be measured with the following KPIs:

  • Number of write IOPS (Input/Output Operations per Second) supported by the operational database
  • Speed of bulk data transfers between the operational database and the fast analytics engine
  • Latency to forward changes in the operational database into the in-memory store
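KPIs like these can be estimated with simple micro-benchmarks. The sketch below, which uses a local SQLite store purely as a stand-in for the operational database (the table name and row counts are illustrative, not part of the project), shows how write throughput might be measured:

```python
import sqlite3
import time


def measure_write_ops(n_rows=10_000):
    """Toy micro-benchmark: measures insert throughput (rows/second)
    against an in-memory SQLite table standing in for the operational
    database that persists the pricing simulations."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE simulations (trade_id INTEGER, value REAL)")
    start = time.perf_counter()
    conn.executemany(
        "INSERT INTO simulations VALUES (?, ?)",
        ((i, i * 0.1) for i in range(n_rows)),
    )
    conn.commit()
    elapsed = time.perf_counter() - start
    conn.close()
    return n_rows / elapsed  # rows written per second


if __name__ == "__main__":
    print(f"{measure_write_ops():,.0f} writes/s")
```

A real benchmark against the CloudDBAppliance platform would of course use parallel writers and durable storage, which this single-threaded sketch does not capture.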


Telco Use Case: Immediate Mobile Number Portability

Mobile Number Portability (MNP) is a key process in the IT system of a mobile network operator. The MNP process starts when a customer signs a contract with a different network operator (the Recipient); the Recipient then sends the MNP request to the current network operator (the Donor).

The process from the customer point of view is:

  • A customer signs a new contract with the Recipient, asking for number portability.
  • In order to obtain number portability, the customer receives a Subscriber Identity Module (SIM) card from the Recipient.
  • Typically, customers subscribing to the portability contract already have a SIM card. The new SIM card is activated with a “new” number, so the customer can immediately use the new card while continuing to use the old one, managed by the Donor.
  • After three working days, in the early morning, the portability is carried out with a maximum of two hours of service loss.
  • The old SIM then no longer receives service from the Donor and is replaced by the new one provided by the Recipient.

The high complexity of the process and the lack, in Italy, of a central database holding all the phone numbers require strong integration between the ICT platforms of the different players involved in the process in order to fulfil the time constraints. In particular, all deactivation (by the Donor operator) and activation (by the Recipient operator) activities on the telephone networks must take place in a short time frame, in the very early morning, in order to minimise the interruption of telephone services for customers. It is also necessary to inform all other operators (excluding the Donor) of the completion of porting activities and to receive notifications of mobile number portability activations from other operators during that period.

The ICT systems involved must run reliably and handle peak demand, so that customer telephone service is not interrupted. In particular, the effectiveness of the process must be ensured daily, in the few hours in which thousands of requests must be managed. For this reason, predictive analysis applied to malfunctions of the ICT systems involved in the MNP process is very useful and interesting for a mobile network operator. The considerable number of systems to be monitored and the control of their performance require the analysis of thousands of events related to the performance of the underlying systems (CPU usage, memory usage, disk I/O) and also of all the involved network platforms (switches and routers).

The targeted application, a centralized database for phone number portability, is not feasible with today’s technology, which is why it does not exist today. CloudDBAppliance promises to deliver a platform able to provide the portability service nation-wide, with real-time response guarantees, in countries of any size. The database will enable switching the operator in a transactional manner between two operators. Additionally, it will enable analytics on the database, such as analysing the churn of the different operators. It should scale to the number of ported phones in the largest countries, with hundreds of millions of users (e.g. China, the USA, India).
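The transactional porting step can be sketched as a single atomic update in which deactivation at the Donor and activation at the Recipient commit together or not at all. The sketch below uses SQLite as a stand-in for the operational database; the `numbers` table, the MSISDN and the operator names are hypothetical:

```python
import sqlite3


def port_number(conn, msisdn, donor, recipient):
    """Atomically move a phone number from the Donor to the Recipient:
    the deactivation and activation either both commit or both roll back."""
    with conn:  # a single transaction
        cur = conn.execute(
            "UPDATE numbers SET operator = ? WHERE msisdn = ? AND operator = ?",
            (recipient, msisdn, donor),
        )
        if cur.rowcount != 1:
            # The number is not (or no longer) held by the Donor:
            # raising aborts the whole transaction.
            raise ValueError(f"{msisdn} is not held by {donor}")


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE numbers (msisdn TEXT PRIMARY KEY, operator TEXT)")
    conn.execute("INSERT INTO numbers VALUES ('+390001', 'DonorTel')")
    port_number(conn, "+390001", "DonorTel", "RecipientCom")
```

The `WHERE ... AND operator = ?` clause also guards against two operators porting the same number concurrently: whichever transaction commits second sees zero affected rows and rolls back.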


Retail Use Case: Proximity Marketing

Shopping centres and malls can benefit (e.g. in terms of revenue) from analysing their customers’ data and applying the concept of proximity marketing: exploiting cellular technology to send marketing messages to users’ mobile devices that are in close proximity to a specific area of one of their stores. With more than six billion mobile phones in the hands of consumers today, just about every consumer with a smartphone is potentially reachable by a proximity marketing campaign.

IKEA is a major retailer that participates in CloudDBAppliance as a partner, interested in trying a cloud-based proximity marketing solution to be developed by CloudBiz, a company specialized in cloud customer solutions for large enterprises, especially in the retail sector.

Thus, the proposed scenario takes place in IKEA stores with the aim of analysing customer data derived from multiple sources (e.g. Bluetooth- and Wi-Fi-enabled devices) in order to offer personalized content, encourage specific behaviours, enhance the shopping experience, facilitate the purchase decision and predict customers’ needs. The goal is a platform able to gather insights from all IKEA stores, segment the clients, and personalize the offers based on the insights extracted globally from all stores in real time (e.g. the reaction to a particular real-time coupon).

IKEA has several ways of gathering and storing data from its customers in order to use it for its benefit. To begin with, in this specific scenario, IKEA customers will be equipped, through their smartphones (or through a specially designed mobile device provided to customers as they enter the store), with a mobile application in which they note the purchases they intend to make. These “what-to-buy” notes will thus be digitalized, allowing IKEA to know what its customers will probably buy and what their itinerary in the store will be. Furthermore, most IKEA customers are registered in the IKEA loyalty programme, in which they are equipped with plastic member cards and collect points on their purchases. Once a customer collects a certain number of points, these points can be converted into discount vouchers on the customer’s receipt. Through this loyalty programme, however, IKEA is not only offering discounts to its customers but also gathering data about their consumer behaviour, which is further analysed and used for marketing purposes. Currently the loyalty programme is reactive. In CloudDBAppliance it will be transformed into a proactive, mobile loyalty programme: to gather further customer data, it will leverage beacons installed in different areas of the IKEA stores, collecting information in real time about each customer’s location as (s)he passes through the area monitored by the corresponding beacon.

In this scenario, through these ways of collecting and managing customers’ data, specially designed mechanisms will process and analyse the information in real time, aiming to predict customers’ needs and suggest additional purchases, offers and coupons. More specifically, when a customer with a balanced consumer behaviour adds to her/his shopping cart a product of a different nature than the products (s)he usually purchases (e.g. a product of higher price), real-time analytics will be performed on: 1) data concerning the customer’s consumer behaviour according to his/her previous purchases, and 2) other customers’ consumer behaviour according to what kind of similar items they have purchased in combination with that product. Similar consumer patterns will thus be identified and forecast among customers with similar behaviours, and suggestions about products that other customers purchased along with the selected product will be delivered to the customer’s device through predictive real-time analytics mechanisms. Moreover, through the installed beacon devices, IKEA will track the “geographic location” of all its customers in real time. As a result, real-time analytics algorithms will combine data derived from: 1) a customer’s current location in the store, 2) a customer’s current shopping cart, 3) other customers’ current shopping carts, and 4) the past consumer behaviour of customers who purchased the same or similar products. Real-time predictions will be made over these vast amounts of data (i.e. Big Data) in such a way that customers are given suggestions, through their devices, about items that match the items already in their shopping cart and are located just a few steps away from them in the store.
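The co-purchase part of this analysis can be illustrated with a simple co-occurrence counter over past shopping baskets. This is only a minimal sketch of the idea; the basket format and product names are hypothetical, and the project’s actual mechanisms would use far richer models:

```python
from collections import Counter
from itertools import combinations


def co_purchase_recommendations(baskets, cart, top_n=3):
    """Count how often each pair of products appears together in past
    baskets, then rank products by their co-occurrence with the items
    in the current shopping cart."""
    pair_counts = Counter()
    for basket in baskets:
        # canonical ordering so (a, b) and (b, a) count as the same pair
        for a, b in combinations(sorted(set(basket)), 2):
            pair_counts[(a, b)] += 1

    scores = Counter()
    for item in cart:
        for (a, b), count in pair_counts.items():
            other = b if a == item else a if b == item else None
            if other is not None and other not in cart:
                scores[other] += count
    return [product for product, _ in scores.most_common(top_n)]


if __name__ == "__main__":
    past = [["sofa", "lamp"], ["sofa", "lamp", "rug"], ["lamp", "rug"]]
    print(co_purchase_recommendations(past, ["sofa"]))  # ['lamp', 'rug']
```

In the full scenario, the candidate list produced here would then be filtered by the customer’s beacon-tracked location, so that only items a few steps away are suggested.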

CloudDBAppliance will provide the real-time big data capabilities to store operational data about client location, itinerary and purchase history, among others, and to perform the data mining and machine learning analytics needed to discover insights into consumer behaviour. The real-time analytical query capabilities will be leveraged to support the marketing analysts, who will perform what-if analyses, by means of online analytical queries, about the impact of a particular offer, coupon or suggestion in real time.

Banking Use Case: Optimizing ATM Networks

This use case will assess the capabilities of the envisioned CloudDBAppliance platform when considering large interconnected ATM channel networks. During the last few years, in their attempt to achieve mobility and better serve their customers, banks across the globe have been investing in next-generation ATMs, featuring advanced capabilities and offering customers the ability to perform more types of financial transactions than those currently handled by customer service representatives. This will create a whole new perspective on the ATM user experience, also increasing the spatiotemporal heterogeneity of the cash demand and of the data produced by the ATMs. Additionally, as user mobility increases, so does the complexity of predicting the expected volumes of user withdrawal and deposit transactions.

Currently, banks mostly use stationary algorithms to predict the amount of cash that ATMs should contain every day, generally providing extra deposits to avoid ATM cash drain. In the context of CloudDBAppliance, automated, optimal, combined forecasting of cash transactions and required deposits will be performed in order to coordinate the cash available on the ATM channels with users and cash centres, and also to minimize the amount of money contained in the ATMs and reduce cash logistics procedures, while assuring cash availability. The CloudDBAppliance platform will gather information from a variety of data sources and will perform multi-level analysis and optimization techniques in order to 1) accurately model ATM user behaviour, based also on external factors of variable behaviour such as weather forecasts or social events, 2) predict the expected cash flows to and from the ATMs, and 3) come up with a global cash deposit allocation scheme that considers not only historical data related to deposits and withdrawals, but also near- and distant-future events and temporary imbalances. The (simplified) use case depiction is given in the following figure:

Simplified ATMs use case
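As an illustration of the forecasting step, the following sketch implements a naive weekday-seasonal baseline with a safety buffer. The history format and the margin value are assumptions for illustration, not the project’s actual forecasting model:

```python
from statistics import mean


def forecast_withdrawals(history, weekday, safety_margin=0.15):
    """Naive seasonal baseline: forecast an ATM's withdrawals for a given
    weekday as the mean of past withdrawals that fell on that weekday,
    scaled up by a safety buffer to reduce the risk of cash drain.

    history: list of (weekday, amount) pairs, e.g. [("mon", 12000.0), ...]
    """
    same_day = [amount for day, amount in history if day == weekday]
    if same_day:
        base = mean(same_day)
    else:
        # no observations for this weekday yet: fall back to the overall mean
        base = mean(amount for _, amount in history)
    return base * (1 + safety_margin)


if __name__ == "__main__":
    history = [("mon", 100.0), ("mon", 200.0), ("tue", 50.0)]
    print(forecast_withdrawals(history, "mon"))
```

A real deployment would replace the per-weekday mean with models trained on the external signals described below (weather, events, currency rates), but the interface — past transactions in, a recommended cash load out — stays the same.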

More specifically, multiple sources of information will be exploited to predict user behaviour. These sources may be classified into several categories, based on the exploitation perspective. First, all the transactions on the supported ATM channel networks will be constantly monitored in order to maintain a detailed view of the ATM channel network. Next, flows of information from a set of general-use and mass data aggregation services will be integrated, each aimed at increasing the platform’s awareness of a different aspect of user behaviour. Indicatively, fluctuations in currency rates will be retrieved from public and private financial/monetary services and assessed in order to evaluate their potential influence on cash deposit/withdrawal activity over large geographical areas. Moreover, crowd sensing exploiting location and social services such as Facebook and Foursquare will be used in order to, first, identify future social events and, second, track the evolution of these events in real time. This will help to better train the intelligent optimization modules of the CloudDBAppliance platform, augmenting its ability to predict user behaviour.

Further, data aggregation sites such as Google Trends and Twitter Trends will be leveraged in order to derive trending behaviours of people at large and small time scales. Last, as ATMs are usually collocated with popular, crowded shopping hubs, interconnection with a number of cooperating retailers (see the Retail use case) will be sought in order to achieve context awareness with respect to the cash-driven activities of the users, so that such information can be incorporated.

All this information will be continuously collected in the CloudDBAppliance and will be subject to data mining and pattern recognition and then to statistical analysis and ATM cash deposits optimisation procedures in order to derive insights related to:

1) The people’s response to external factors directly or indirectly affecting the amount of cash made available through the ATMs

This information accurately models people’s behaviour as a result of their exposure to different types of stimuli, including weather, social, international monetary or political events and disturbances in the prices of users’ essential products, also taking into consideration specific cultural, regional and ethical aspects of the users at both aggregate and region-specific level.

2) The optimal cash allocation to the ATMs of an ATM channel network

Combining the aforementioned information, this optimal cash allocation plan will enable timely coordination of the cash centre and cash logistics departments, making sure that cash availability plans comply with the actual user cash demands. The goal of the optimization procedures will be to ensure cash availability at the lowest feasible levels, avoiding cash stagnation in the ATMs.

This information will be evaluated by the banks’ experts and then used in cash centres by the cash logistics departments, finally reaching the ATM channels.

In summary, CloudDBAppliance will serve as an operational database with the strong data integrity guarantees required by banking applications for the ATMs, will provide the analytical query support to decide in real time how to provision the ATMs and react to unplanned ATM activity, and will provide the data mining and machine learning analytics support to extract insights about customer behaviour.


Retail Use Case: Real-Time Personalised Pricing

This use case will evaluate the CloudDBAppliance platform on fostering customer intimacy in retail marketplaces of variable customer size, such as supermarkets and large malls/shopping centres. Acknowledging that proper product pricing is the key element of success in most retail cases, the CloudDBAppliance view is to exploit modern techniques for harvesting data related to the shopping behaviour of the customers, analyse it, and come up with a product pricing strategy that best matches the spatiotemporal shopping trends of the customers. This pricing strategy will not be myopic, targeting the short-run benefit of the retailers; instead, pricing will be used as a means to build trust and, ultimately, loyalty relationships with the customers. In this manner, the CloudDBAppliance platform will provide reciprocal benefits to both customers (enjoying competitive prices) and retailers (securing a large, loyal customer base).

Currently, most retail businesses are equipped with purchasing-activity tracking infrastructure, collecting real-time data related to the actual demand fluctuation of the products available for purchase. This data is currently used to produce near-future predictions of the expected product demand, so that a preliminary product pricing strategy can be drawn up.

The objective of this use case is to apply CloudDBAppliance to enrich the current product pricing decision strategies so that they capture very-near-future customer purchasing trends more accurately. Specifically, the harvested real-time product demand data will be securely transferred to the CloudDBAppliance cloud infrastructure, where a series of big data analysis procedures will take place: starting from data mining and pattern recognition in order to reduce the dimensionality of the data received, proceeding to advanced statistical modelling to extrapolate the near-future product demand, and finally feeding (personalized) pricing optimization techniques. Additionally, this data processing chain is going to be fed by data originating from external, well-known data aggregation services, such as Google Trends, Twitter Trends and Foursquare Trending Venues. These services will allow the captured data to be enriched with information not directly related to the products per se, but to the psychology of the consumers. In this manner, modelling and predicting aggregate customer behaviour at regional levels will be achieved. Given customer approval, personal information will also be gathered, exploiting non-sensitive data exposed through popular social services such as Facebook and Instagram, so as to allow modelling of the evolution of individual customers’ purchasing habits in the long run. Finally, external information, such as weather conditions or specific cultural conditions indicating a shift in the consumers’ needs, will also be assessed. All this information will be used in order to update, coordinate and optimize the OLTP and OLAP processes of the retailers. Consequently, given this portfolio of information sources and data management and analysis techniques, in addition to monitoring customer behaviour, CloudDBAppliance will be able to provide the retailers with insights related to:

  • The customers’ perception of a product’s price in comparison to other retailers.
    This will allow the retailers to shape product prices in a way that customers always feel they are getting the most out of their purchases. It is important to underline that CloudDBAppliance will not strive to assure the lowest prices in the market, but to assure that customers feel able to get the product they want without being manipulated into purchasing products they do not approve of.
  • The customers’ perception of a product’s price in comparison to other products.
    This asset will allow retailers to adjust product prices in order to i) match customer demand, and ii) leverage the dynamics of the warehouse management systems in order to guarantee that stagnant merchandise in the storerooms is kept at minimal levels.
  • The predicted fluctuation of the demand for a product in the near future.
    This knowledge will enable the coordination of producers and retailers with the actual needs of the customers. This, in turn, will enable optimal orchestration of the logistics departments, allowing for better inventory management. Finally, this prediction will feed the pricing strategy optimization engine that will calculate the optimal pricing strategy for the product.
  • The optimal pricing strategy for a product.

Combining the aforementioned information, the optimization core of CloudDBAppliance will come up with an optimal pricing strategy for a product and present it in a variety of contexts, e.g. by lowering/raising the product price, putting it into a promotion campaign, or even issuing coupons for special discounts on the specific product. The simplified Retail use case architecture is depicted in the next figure:


The simplified Retail use case


Finally, to further strengthen, where applicable, the efficacy of the big data analysis results, Electronic Shelf Labels (ESLs) for fast, accurate, consistent price shaping will be employed in order to enable real-time price updates at large scale (in terms of both location and range of products) and boost the retailers’ pricing efficiency.
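The price-selection step at the core of this strategy can be sketched as a search over candidate price points under an assumed demand model. The linear demand function and the price candidates below are purely illustrative; the project’s optimization engine would estimate demand from the harvested real-time data:

```python
def optimal_price(candidates, demand):
    """Among candidate price points, pick the one maximizing expected
    revenue under a given demand model (price -> expected units sold)."""
    return max(candidates, key=lambda price: price * demand(price))


if __name__ == "__main__":
    # Hypothetical linear demand model for a single product:
    # demand falls by 2 units for every unit of price increase.
    demand = lambda price: max(0.0, 100 - 2 * price)
    print(optimal_price([10, 20, 25, 30, 40], demand))  # 25
```

Loyalty-oriented objectives such as perceived fairness or inventory pressure would enter this sketch as extra terms in the scoring function rather than as revenue alone.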

The ATM optimization and real-time pricing use cases, although of different discipline and focus, exhibit the versatility of the approach adopted by CloudDBAppliance. Although the previous figures look similar, the control flow is radically different. In the ATM use case, the users are mostly agnostic to the changes performed by the banks and their behaviour is static; the CloudDBAppliance platform will perform the operational optimization a priori, using the bank’s resources as control objects and considering a wide variety of heterogeneous information. On the contrary, in the Retail use case, the users react actively to the decisions of the platform, transforming it from an offline decision-making mechanism into an online one, particularly where ESLs are adopted and the platform’s reaction time must be minimal. In this framework, and in contrast to the ATM use case, the control objects are in fact the customers themselves. In any case, CloudDBAppliance aims to bridge the gap between core operational processes and big data analytics by providing a combined real-time big data analytics platform.