Digital Assets Issuance Leadership

The race for regulated digital issuance is intensifying and expanding. With a global valuation of $2.1 trillion, this market is currently almost as big as Apple Inc.

First, the Swiss Stock Exchange’s SDX was approved by FINMA, then Deutsche Börse launched D7 for German equities, and now the Singapore stock exchange SGX is initiating Marketnode for bond issuance.

This is just the beginning of the digitalisation of the $250 trillion financial services market. Which regulated market infrastructure will become the global market leader by combining a unique, profitable digital business model, global asset liquidity, market connectivity, instant settlement of tradable assets and digital asset servicing using an ICSD model?

The Future of Data

The future of data will be strengthened and secured in the coming years by integrated analytical AI decision-making platforms driven by cloud computing and maturing blockchain technology tailored to new use cases.

The main issue these innovation hubs will accelerate, promote and address is the verification and authenticity of the critical reference information from which so much else is derived: global and international indices, voting mechanisms, economic models, asset price discovery, incentive policies, tradable financial indices, and the global, country, regional, industry and ESG benchmarks built on them.

These innovative technologies will enable dynamic knowledge transfer across market segments using APIs, straight-through processing, and automation by design and by default.

Optimising Data-Driven Decision Making Processes – A Retail Use Case

Recently, global retail had supply issues. Could an integrated, holistic data-driven culture in an organisation, or a country, have solved this problem? More than one retailer having a supply issue should trigger some form of action from the country’s leadership.

First, retail organisations should have integrated internal data assets.

Let us look at how a fully optimised data process can help.

An ERP system may already exist in the organisation, along with the means of getting meaningful insight from it in areas such as:

  • Inventory management – manage inventory and track goods
  • Demand forecasting – forecast consumer demands, maximise sales and minimise costs
  • Vendor management – manage vendors, their lead times and performance

The organisation should have monitoring systems for inventory, demand and vendor supply timelines. Outlier detection algorithms will flag when a surge in demand is coming through for specific products or for multiple products, as sketched below. You may have suppliers who can meet unusual volumes of demand; the demand spike should alert the vendors, who can then try to match product requests.
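As a minimal illustration of such a spike alert, here is a z-score rule in Python; the product, sales figures and threshold are all invented for this example:

```python
import pandas as pd

# Illustrative daily demand for one product; figures are made up.
units = pd.Series([120, 118, 125, 122, 119, 121, 410, 405],
                  name="units_sold")

# Baseline statistics from a normal trading period (first six days here).
baseline = units.iloc[:6]
mu, sigma = baseline.mean(), baseline.std()

# Simple z-score rule: flag any day more than three standard deviations
# above the baseline as a demand spike worth alerting vendors about.
z_scores = (units - mu) / sigma
spikes = units[z_scores > 3]
print(spikes)  # the last two days are flagged
```

In production this would run against live inventory feeds and push alerts into the vendor-management workflow rather than print to a console.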

Are you using analytics to optimise your business decisions, making the right decision at the right time?

Simple economics reminds us of the concept of demand and supply, where a contraction in supply can lead to higher prices, scarcity and a reduction in consumer confidence, which in turn leads to panic buying.

BECOMING DATA-DRIVEN PART 4 – Analytics

An architect’s dream is an engineer’s nightmare! As you already know, having a vision for your analytics can be someone else’s nightmare: some of the problems you will be looking to solve may be very challenging. Once you have your data in production, you must set up monitoring. Delivering an analytics platform used to be challenging; now, with cloud technology, many previously impossible tasks are attainable.

Becoming data-driven enables you to gain insight into people, businesses and things by combining your organisation’s data (customer records, transactional systems, operational systems, expert knowledge) with external events. Examples of external data are social media, image data, weather and other event data. One of the driving forces for creating analytics is enabling your organisation to make informed decisions based on data to drive growth and productivity.

In the previous write-ups, part 1, part 2 and part 3, we saw that we had to take the following steps: identify your analytics requirements, source the data, and choose and create your data platform. The great thing about having your data in place is that you can explore it and decide how to use it.

Your data is ready for the next step. You have all your internal and external data, but you still want more meaningful insight into your business, to enable cost-saving and revenue-generating decisions.

Examples of problems that analytics address:

Who are your top customers, and what are they buying? What is our customer attrition rate? What is our share of the market? How well are our products doing compared to the competition? How long until the next machine component breaks down? There is terrible weather forecast for tomorrow; how will that affect your staff and your flights? Which planes will be affected?

Using analytics, we can choose what system to put in place for your business operations.

There are four main types of analytical models, which I talk about next.

Descriptive Analytics

So what is descriptive analytics? These are your time-bound standard reports that explain what events have happened and why. They also answer questions like when or how many times an event happened. Dashboards, standard reports, ad-hoc reports or any simple report are all forms of descriptive analytics. Reports can contain numbers, text and charts, and can include drill-downs and alerts. A simple report most users have is a bank statement, which shows your incoming and outgoing cash, interest and fees. Other examples are your household bills, for example electricity, which tells you the amount of energy you have used over a given period. Most organisations are at this stage, with data warehouses and business intelligence solutions.

The example reports above are reports that a company sends out to its customers. Each customer sees information that relates to them and them only. The companies sending out these reports, on the other hand, have a bigger picture: an aggregated view across all clients, which they use to make more efficient business decisions.

Examples

Let us take an airline company as an example. We want to decide whether we should continue to fly from a remote airport. We need a report that gives us the passenger numbers over the last year, the revenue generated, and the costs associated with the trips, both stopovers and direct flights.
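As a small illustration, a descriptive report like this can be produced with a simple aggregation; the route labels and figures below are made up:

```python
import pandas as pd

# Hypothetical flight records for the remote airport (illustrative only).
flights = pd.DataFrame({
    "route":      ["direct", "stopover", "direct", "stopover"],
    "passengers": [120, 45, 130, 50],
    "revenue":    [24000, 9000, 26000, 10000],
    "cost":       [21000, 11000, 22000, 11500],
})

# A descriptive report: totals per route type over the period.
report = flights.groupby("route").agg(
    passengers=("passengers", "sum"),
    revenue=("revenue", "sum"),
    cost=("cost", "sum"),
)
report["margin"] = report["revenue"] - report["cost"]
print(report)  # stopovers lose money in this toy data
```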

Predictive Analytics

Predictive analytics is a branch of advanced analytics that uses historical data and statistical algorithms to predict future trends. We train models on historical data; to predict a value from new data, you need to prepare the data (data preprocessing), build the model and validate it. Let us look at this briefly. Here we answer the questions, “What will happen, and why? What is the best action to take?”

Data Preprocessing

  • Data cleaning and transformation
  • Outlier detection and treatment
  • Missing values
  • Dimension reduction
  • Data reduction
  • Feature selection

Model

Several models are available for predicting and classifying your dependent variables. Before building and selecting your model, you need to choose features relevant to your model. The model you select will depend on the type of dependent variable you wish to predict. For example, if your dependent variable is binary (1 or 0), you may want to use a logistic regression model, as in the sketch below.
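A minimal sketch with scikit-learn; the churn framing, features and figures are all invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative data: two features (tenure in months, monthly spend) and a
# binary dependent variable (1 = customer churned, 0 = stayed).
X = np.array([[2, 50], [3, 60], [30, 20], [40, 25], [5, 55], [36, 30]])
y = np.array([1, 1, 0, 0, 1, 0])

model = LogisticRegression()
model.fit(X, y)

# Predict the churn probability for a new customer.
print(model.predict_proba([[4, 58]])[0][1])
```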

Types of Model

1. Regression

  • Linear Regression
  • Multiple Regression
  • Logistic Regression
  • Multiple Logistic Regression

2. Trees

  • Classification trees
  • Regression trees
  • Bagging, Boosting, Random Forest

3. Neural Networks

  • Recurrent neural networks
  • Artificial neural networks
  • Convolutional neural networks

Model Validation

Choosing a valid model is an essential step in your analytics. Here is where you may use out-of-sample data to check that the model gives good results. For example, you may wish to check for model overfitting when you get excellent training results but poor test results.
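For instance, here is a small sketch of that overfitting check, assuming scikit-learn and its built-in breast cancer dataset, with a deliberately unconstrained decision tree so that the train/test gap shows:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Hold out-of-sample data back from training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# An unconstrained tree will often memorise the training set.
model = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# A large gap between these two scores is the classic sign of overfitting.
print("train accuracy:", model.score(X_train, y_train))  # ~1.00
print("test accuracy: ", model.score(X_test, y_test))    # noticeably lower
```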

Prescriptive Analytics

Prescriptive analytics, as the name suggests, prescribes a course of action. Your organisation can build and optimise models, subject to constraints, to enable you to make a decision. Some problems require yes/no decisions; these are binary problems.

The models used are similar to predictive models.

What is Cluster Analysis? Cluster analysis is a way of grouping elements so that the elements within a group are similar. Clustering is used in market analysis to find customer segments to which a company can market its products, and it is also used for data reduction.

Example of Cluster Analysis: An example of segmentation is generation segments, often used for marketing products based on the customer’s segment (a small clustering sketch follows the list below).

  • The Greatest Generation: Born between 1901 and 1924
  • The Silent Generation: Born between 1924 and 1945
  • Baby Boomers: Born between 1946 and 1964
  • Generation X: Born between 1965 and 1980
  • Generation Y or Millennials: Born between 1981 and 2000
  • Generation Z: Born after 1995
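Here is a minimal clustering sketch with scikit-learn’s KMeans; the customer ages and spend figures are invented, and in practice you would cluster on richer behavioural features:

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative customer records: age and annual spend (made-up numbers).
customers = np.array([
    [22, 300], [25, 350], [24, 320],   # younger, lower spend
    [48, 900], [52, 950], [50, 870],   # older, higher spend
])

# Group customers into two segments by similarity.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print(kmeans.labels_)           # segment assigned to each customer
print(kmeans.cluster_centers_)  # the "typical" customer per segment
```

The fitted centres give you a typical customer per segment to market against.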

What is Principal Component Analysis (PCA)? – Principal component analysis is an unsupervised learning technique, often used to reduce the dimensionality of your data.

Example: If you have one hundred variables in your dataset, principal component analysis can find a smaller number of components that represent 90% of the variance in your data, as sketched below.
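A small sketch of that idea with scikit-learn, using synthetic data driven by a handful of hidden factors so that a few components really do capture 90% of the variance:

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic dataset: 100 observed variables driven by 5 hidden factors.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 5))
mixing = rng.normal(size=(5, 100))
X = latent @ mixing + 0.1 * rng.normal(size=(200, 100))

# Keep the smallest number of components explaining 90% of the variance.
pca = PCA(n_components=0.90)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)                      # far fewer than 100 columns
print(pca.explained_variance_ratio_.sum())  # at least 0.90
```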

What is Optimisation? In general, optimisation tends to maximise or minimise a specific output. For example, an organisation wishes to invest £1,000,000 across 20 stocks. The stocks can be chosen in any combination, but the total cannot exceed the investment; the objective is to maximise profit. Optimisations are usually nonlinear and multimodal, with complex constraints.

Constraints: With our models, we can specify constraints. For example, I want to invest £100,000, with 40% of the money in technology stocks and 10% in industrial stocks.
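As a simplified sketch (linear, rather than the nonlinear case described above), here is that constrained allocation expressed with SciPy’s linprog; the expected returns and the tech/industrial groupings are entirely made up:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative expected annual returns for five stocks; the first two are
# "technology", the third is "industrial" (all figures invented).
returns = np.array([0.12, 0.10, 0.07, 0.05, 0.06])

# linprog minimises, so negate the returns to maximise expected profit.
c = -returns

# Equality constraints, with weights as fractions of the £100,000 budget:
A_eq = [
    [1, 1, 1, 1, 1],  # all of the money is invested
    [1, 1, 0, 0, 0],  # technology share
    [0, 0, 1, 0, 0],  # industrial share
]
b_eq = [1.0, 0.40, 0.10]

# No short selling: each weight stays between 0 and 1.
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 5)
print(res.x * 100_000)  # pounds allocated to each stock
```

linprog only handles linear objectives; the nonlinear, multimodal problems mentioned above need solvers such as scipy.optimize.minimize or dedicated optimisers.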

Cognitive Analytics

With cognitive analytics, we are looking at reasoning, learning and natural language processing. Here the system is constantly learning and refining its analysis.

Models

Latent Dirichlet Allocation (LDA) – An unsupervised model which classifies documents into different topics (see the sketch after this list).

Sentiment Analysis – Used for a classification problem or a regression problem.

Attention model – Uses an encoder (RNN), all hidden states and an attention decoder.

Sequence-to-sequence model – Uses an encoder (RNN), a context vector and a decoder (RNN).

Long short-term memory (LSTM) – Generally used for classification and prediction models.

Gated recurrent unit (GRU) – A type of RNN.

Acoustic model – Helps turn a sound signal into a type of phonetic representation.

Language model – Grammar, words and sentence structure for a language.

Hidden Markov Model – Uses transition probabilities and emission probabilities.
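As an example of the first model above, here is a minimal LDA sketch using scikit-learn on a toy corpus; a real topic model needs far more documents:

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# A toy corpus with two obvious themes: markets and weather.
docs = [
    "stocks bonds markets trading equity",
    "trading markets price equity dividend",
    "rain storm weather forecast wind",
    "weather wind temperature forecast rain",
]

counts = CountVectorizer().fit_transform(docs)  # bag-of-words matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# Each row shows how strongly a document belongs to each of the two topics.
print(lda.transform(counts).round(2))
```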

Example

Algorithmic trading allows you to take real-time data from social media, live broadcasts, economic news and other signals and take a position in the market.

Let us look at what happens: sentiment analysis is used to determine the nature of tweets from Twitter. Here we want to determine whether the sentiment is positive or negative for a market instrument (e.g. Tesla).

A live broadcast is converted from speech to text and similarly analysed for positive or negative sentiment. The output from the sentiment analysis, together with other inputs, is consumed by your model; the final step executes an action automatically by taking a position in the market. This is cognitive analytics.
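As a rough sketch of the first step, here is tweet-level sentiment scoring using NLTK’s VADER analyser (one choice among many; any sentiment model could fill this role), with a toy trading signal on top. The tweets and thresholds are invented:

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-off lexicon download
analyser = SentimentIntensityAnalyzer()

# Hypothetical tweets about an instrument; a real pipeline would stream
# these from the Twitter API.
tweets = [
    "Record deliveries this quarter, great news for the stock!",
    "Recall announced, production halted. Terrible outlook.",
]

for tweet in tweets:
    # Compound score runs from -1 (very negative) to +1 (very positive).
    score = analyser.polarity_scores(tweet)["compound"]
    signal = "BUY" if score > 0.3 else "SELL" if score < -0.3 else "HOLD"
    print(f"{signal:4} {score:+.2f} {tweet}")
```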

Conclusion

In conclusion, there are various excellent reasons why an organisation should become data-driven. Whichever analytics you choose, they will all have to be deployed into the production environment and made available to users as self-service, alerts, reports or other outputs.

Having the platform and all the analytics is not enough for your organisation to be data-driven. You also need a data-driven culture, from how data is collected to how it is stored, and that culture has to be driven by the senior leadership of the organisation.

References

Blei, D., Ng, A. and Jordan, M., 2003. Latent Dirichlet Allocation. Journal of Machine Learning Research, 3, pp.993–1022. Available at: <https://www.jmlr.org/papers/volume3/blei03a/blei03a.pdf> [Accessed 26 September 2021].

Cho, K. et al., 2014. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation. Available at: <https://arxiv.org/pdf/1406.1078.pdf> [Accessed 26 September 2021].

Becoming Data-Driven Part 3 – Data Tools

In the previous part, I wrote about the sources of the data we will need for our analytics. These data exist in different systems, internal or external to your organisation. It is advisable to have a dedicated platform for your analytics data. This platform may have one or more data tools; I will refer to this as your data tool ecosystem. Analytics can consume a lot of system resources, and you want to avoid impacting the performance of operational systems. System resources are memory, hard disk and central processing units (CPU).

So you have the vision for your company, and you have your data. What is your technology vision? The vision is about what capabilities the technologies you need must have. You need to understand your data and when it becomes stale; this will enable you to decide whether data needs to be received and processed from source in real time, hourly, daily, weekly, monthly and so on. Is your data going to grow very large (“big data”)? It is essential to note the critical elements of your data to help you choose the right tool.

Choosing and reviewing your tools should be done periodically, so set up a process to do this. ThoughtWorks [1] has a Technology Radar that an architectural team can use to drive the data technology vision. It has four stages: hold, assess, trial and adopt. There are three further states within the stages: new, moved in/out, and no change.

DATA STORAGE

Before choosing your data tools, you need to decide on your data storage, and this depends on where your source data is stored and the type of computation required for your analytics. 

Data can be moved from one data system to another using a process known as Extract, Transform and Load (ETL). Moving and transforming data introduces points of potential failure and adds latency to the availability of the data. With memory becoming cheaper and new technologies emerging and improving, organisations are looking to reduce or eliminate this problem.

Virtualization

Data virtualization integrates data from disparate sources without moving the data. This can provide a single customer view without users knowing where the data is stored. It eliminates the need for projects to create new systems, and the data is available as soon as it is in the source systems. One disadvantage is that the data sets are not related to each other.

Federation

Data federation is like data virtualization, but a standard data model shows the relationships between entities.

In-memory computing

As the name suggests, in-memory computing uses random access memory (RAM) to store all the data required for analysis, and processing is performed in memory.

In-database analytics

In-database analytics allows the processing of data within the database.


DATA TOOLS ECOSYSTEM

When creating a data platform, you may need one or more tools to help achieve your goal. We will discuss the tools on the market in this section. Many of these tools use a microservice architecture, which I will discuss in a future article.

When you get data from different sources, there are different ways to store your data. One of the most common and widely used ways is through relational databases. Always look at the use case of the database you choose to ensure that it is the right choice.

Types of databases

Relational database

Relational databases store data in two-dimensional structures called tables. Tables can be related to other tables using inbuilt relational concepts.

Properties of relational databases: 

  • Data is structured
  • Table structure must exist before attempting to load data
  • ACID properties: atomicity, consistency, isolation and durability

Examples of relational databases: PostgreSQL, MySQL, Oracle Database and Microsoft SQL Server.

Graph database

A graph database uses graph structures for semantic queries with nodes, edges and properties to represent and store data.[2]

Examples of graph databases: Neo4j, Amazon Neptune and JanusGraph.

Document Stores

A document store database is a non-relational database that stores its data in JSON-like documents [3] [4].

Properties of document stores:

  • High-performance data persistence
  • Replication & Failover
  • Outward scaling
  • Multiple storage engines

Examples of document store databases: MongoDB, Couchbase and Amazon DocumentDB.

Columnar database

The columnar database is optimised for retrieving columns of data quickly and is excellent for analytical workloads. Columnar storage reduces the overall disk I/O requirements because less data is loaded from disk.

Examples of columnar databases [5]: Amazon Redshift, ClickHouse and Vertica.

Key-Value database

A key-value database is a non-relational database that uses the key-value method to store data.

Examples of key-value databases: Redis, Amazon DynamoDB and Riak.

ETL TOOLS

Now that you have a new database, how do you extract your data from the different internal and external sources into your database periodically?

As discussed above, we use the ETL process. Let us look at some tools that will allow you to extract data from source systems or files, transform and normalise it, and load it into your datastore, ready for use by internal and external consumers.
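Before reaching for a dedicated tool, it helps to see the pattern itself. Below is a minimal hand-rolled ETL sketch in Python; the file names and schema are invented for illustration:

```python
import csv
import sqlite3

def etl(csv_path: str, db_path: str) -> None:
    # Extract: read rows from a CSV export.
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalise names and cast types.
    cleaned = [
        (r["product"].strip().lower(), float(r["price"]))
        for r in rows
    ]

    # Load: write the cleaned rows into the target datastore.
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS products (name TEXT, price REAL)")
        conn.executemany("INSERT INTO products VALUES (?, ?)", cleaned)

etl("daily_export.csv", "warehouse.db")  # hypothetical file names
```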

Example ETL tools: Informatica PowerCenter, Talend, Apache NiFi and AWS Glue.

Organisations use ETL processing patterns when the following conditions are met. 

  1. The business does not require the data in real-time.
  2. Batch processing is the best fit for the workload.

EVENT STREAMING

What if you want your data in near real time or real time? When an event happens, you want to predict the impact. Some use cases include:

  • Stock prices
  • Economic news
  • Weather news
  • Recommender systems 
  • Fraud detection
  • Decouple and scale microservices

Example event streaming tools: Apache Kafka, Apache Pulsar and Amazon Kinesis.
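As a minimal sketch, here is what publishing one such event might look like with the kafka-python client; the broker address and topic name are assumptions for illustration:

```python
import json
from kafka import KafkaProducer  # pip install kafka-python

# Assumes a Kafka broker running locally; the topic name is made up.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a price tick as an event; downstream consumers (fraud checks,
# recommenders, dashboards) each read the stream independently.
producer.send("stock-prices", {"symbol": "TSLA", "price": 912.50})
producer.flush()
```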

MASTER DATA MANAGEMENT SYSTEMS

Master data management systems are useful when you need to standardise your data. A good example is collecting data from different organisations, where each may use a different name for the same product. This can be challenging to manage in code: when a change is required, you have to go through your organisation’s change process to get it into production.
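To make the idea concrete, here is a small sketch of standardising supplier feeds against a master record using pandas; all names and figures are made up:

```python
import pandas as pd

# Supplier feeds that name the same product differently (invented data).
feed = pd.DataFrame({
    "supplier_name": ["Cola 330ml", "COLA CAN 33CL", "Cola Can"],
    "units": [100, 250, 80],
})

# A master data table owned by the business, editable without code changes.
master = pd.DataFrame({
    "supplier_name": ["Cola 330ml", "COLA CAN 33CL", "Cola Can"],
    "master_id": ["SKU-001", "SKU-001", "SKU-001"],
})

# Standardise the feed against the master record, then aggregate.
merged = feed.merge(master, on="supplier_name")
print(merged.groupby("master_id")["units"].sum())  # 430 units of SKU-001
```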

Master data management allows business owners to change their data as and when needed, with a history of changes available.

Master data management tools also allow you to build, govern and manage data quality, data hierarchies and a business glossary.

Examples of master data management systems [6]: Informatica MDM, SAP Master Data Governance and IBM InfoSphere MDM.

ON-PREMISE OR IN THE CLOUD

On-Premise

On-premise means your service is hosted in house, physically within your organisation. You are therefore responsible for updates, licences, servers and database software.

In the cloud

The hosting company handles cloud services, which means it is responsible for the hardware, software and updates. Your organisation pays for the services it has requested, and also by usage.

Choosing a data platform is not a small task, but you can mix and match tools and microservices depending on your data types and future data requirements.

In part four, I will discuss analytics and its use cases.

Further reading

  1. ThoughtWorks, 2021. Technology Radar Vol. 24. [online] Available at: <https://assets.thoughtworks.com/assets/technology-radar-vol-24-en.pdf> [Accessed 11 July 2021].
  2. En.wikipedia.org. 2021. Graph database – Wikipedia. [online] Available at: <https://en.wikipedia.org/wiki/Graph_database> [Accessed 31 July 2021].
  3. Json.org. 2021. JSON. [online] Available at: <https://www.json.org/json-en.html> [Accessed 1 August 2021].
  4. Sciencedirect.com. 2021. Document Database – an overview | ScienceDirect Topics. [online] Available at: <https://www.sciencedirect.com/topics/computer-science/document-database> [Accessed 1 August 2021].
  5. En.wikipedia.org. 2021. Column-oriented DBMS – Wikipedia. [online] Available at: <https://en.wikipedia.org/wiki/Column-oriented_DBMS> [Accessed 1 August 2021].
  6. Gartner. Master Data Management Solutions. [online] Available at: <https://www.gartner.com/reviews/market/master-data-management-solutions/>.