What is Machine Power?


Analytics has its origins several centuries ago. For example, Logistic Regression, often used in machine learning and predictive analytics, can be traced back to the 19th century. At the time much of this analytics technology was developed, modern computing power was not available. Hence there was an emphasis on formulae and functions that could be calculated by hand with a set of tables. Most of today’s analytics has remained in that time capsule, without real change. Considerable academic research has certainly been performed, but it has made little difference, with few practical solutions being offered.

Attempts have been made in recent years to augment analytics, especially predictive analytics, with Machine Learning and Artificial Intelligence. Although there has been, and still is, considerable over-hype, as will be discussed in another article, this trend is in the right direction. The time has come to move out of the time capsule that analytics has fallen into. Machine Learning and Artificial Intelligence both rely on modern computing power, aka Machine Power.

There is nothing magical about the Machine Power on which Machine Learning and Artificial Intelligence rely. Machine Power is no more than the number-crunching prowess of the computer. All computer applications rely on Machine Power; Machine Learning and Artificial Intelligence are just some of them. To move out of the time capsule, however, Machine Power must be harnessed. It is important to understand that the real power comes not from the machine alone, nor from human intelligence alone, but from a combination of both. It is human intelligence that creates the complex algorithms for the computer to follow. Without the speed of the computer the algorithms would be ineffective, and without the algorithms the computer would be ineffective.

Although all computer applications require Machine Power, modern society has not yet fully harnessed the combination of human intelligence and the power of computers, especially in the field of analytics. Computing power is not the constraint; human intelligence is. Developing highly sophisticated algorithms that mimic human intelligence to the level where a layman believes the computer can consciously think requires a rare level of intelligence, beyond simply applying formulae and functions.

Although the intelligence that the early pioneers put into these formulae and functions cannot and will not be disputed, today’s problems are far more complicated than those for which the formulae and functions were first developed. Analysis today often requires an additional layer of intelligence in algorithmic development, utilizing Machine Power, that goes beyond the mere application of formulae. It is fair to say, at the risk of overgeneralizing, that the future of analytics will rely less on mathematics and statistics, as we appear to have reached a threshold beyond which advances in these fields yield no practical gains. The future lies in algorithmic development harnessing the power of the computer, i.e. Machine Power.

Examples of Applications

The following examples show that the combination of human intelligence and computing power can achieve results otherwise not possible. In the first instance, savings of several million dollars per annum were achieved.

The application involved chocolate viscosity adjustment using a combination of Cocoa Butter and the surfactants Lecithin and Crester. Viscosity was characterized by Yield Value and Plastic Viscosity. Lecithin reduces Plastic Viscosity and Crester reduces Yield Value; Cocoa Butter reduces both. There were also interaction effects.

Different approaches were used to find the combination of raw materials that brings both Yield Value and Plastic Viscosity inside specification at least cost. Conventional approaches such as Linear, Integer and Dynamic Programming were tried but failed, because the main and interaction effects varied from batch to batch as processing conditions varied. A further complication was that effects changed after each addition, i.e. they were dependent on previous additions.
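The shape of the problem can be illustrated with a deliberately simplified sketch. The effect coefficients, costs and specification limits below are entirely hypothetical, and the model assumes fixed linear effects; that is precisely the assumption that failed in practice, since the real effects varied from batch to batch and depended on the order of additions.

```python
# Hypothetical least-cost formulation search. Effects, costs and specs
# are invented for illustration; they are NOT the real process values.
from itertools import product

# Assumed linear effect per 0.1% addition: (yield value, plastic viscosity)
EFFECTS = {
    "cocoa_butter": (-2.0, -1.5),   # reduces both
    "lecithin":     (-0.2, -3.0),   # mainly reduces plastic viscosity
    "crester":      (-3.5, -0.1),   # mainly reduces yield value
}
COST = {"cocoa_butter": 5.0, "lecithin": 1.0, "crester": 2.5}  # per 0.1% step

def cheapest_mix(yv0, pv0, yv_spec, pv_spec, max_steps=20):
    """Grid-search additions (in 0.1% steps) that bring yield value and
    plastic viscosity inside specification at least cost."""
    names = list(EFFECTS)
    best = None
    for steps in product(range(max_steps + 1), repeat=len(names)):
        yv = yv0 + sum(n * EFFECTS[m][0] for n, m in zip(steps, names))
        pv = pv0 + sum(n * EFFECTS[m][1] for n, m in zip(steps, names))
        if yv_spec[0] <= yv <= yv_spec[1] and pv_spec[0] <= pv <= pv_spec[1]:
            cost = sum(n * COST[m] for n, m in zip(steps, names))
            if best is None or cost < best[0]:
                best = (cost, dict(zip(names, steps)))
    return best

result = cheapest_mix(yv0=60, pv0=40, yv_spec=(10, 30), pv_spec=(10, 25))
```

A static grid search like this is exactly what batch-to-batch variation defeats: the table of effects would need to be re-estimated, and the search re-run, after every addition.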

The final solution was a combination of established algorithms, such as non-linear regression, and in-house-developed analytical and optimization procedures, some of which involved modifying classical optimization algorithms. The key breakthrough was the development of dynamic switching analysis to cope with each batch on its own merit.

The second example concerns Change Analysis, about which several articles have been published in the Knowledge Center.

Change Analysis is about detecting changes in chronologically ordered data. It estimates the onset and duration of changes and has countless applications. It is a great market-research tool for identifying factors that affect sales.

Relying on Machine Power, multiple algorithms with switching abilities were developed, and multiple sensitivity-optimizing parameters were added and combined with robust statistical analysis to detect changes. The algorithm is able to detect changes in the average, variability and slope whilst being robust to outliers and a wide range of probability distributions.
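To give a flavour of the simplest ingredient of such an approach, the sketch below locates a single shift in the average using a CUSUM-style statistic. It is an illustrative stand-in only, not the BIS.Net Analyst algorithm, which layers switching algorithms, sensitivity parameters and robust statistics on top of ideas like this.

```python
# Minimal mean-shift detection via cumulative sums of deviations from
# the overall mean. Illustrative only; the threshold is arbitrary here.

def cusum_changepoint(data, threshold=5.0):
    """Return the index of the most likely mean shift, or None.

    The cumulative sum of deviations from the overall mean drifts away
    from zero before a shift and back afterwards; the point of maximum
    excursion marks the candidate change."""
    n = len(data)
    mean = sum(data) / n
    s = s_max = s_min = 0.0
    k_max = k_min = 0
    for i, x in enumerate(data):
        s += x - mean
        if s > s_max:
            s_max, k_max = s, i
        if s < s_min:
            s_min, k_min = s, i
    if max(s_max, -s_min) < threshold:
        return None          # no convincing shift
    return k_max if s_max >= -s_min else k_min
```

For example, a series of twenty values at 10 followed by twenty at 15 flags the last index before the shift, while a flat series returns None. Detecting changes in variability and slope, handling multiple changes, and staying robust to outliers all require considerably more machinery.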

One of the most recent applications has been to detect leaks and changes in domestic water consumption. The image below shows changes in consumption over a three-year period. The pink overlay shows the previous year’s pattern.

The alternative was to use Machine Learning to identify past patterns and then detect changes by comparing the predicted pattern with the current pattern. Machine Learning failed. Change Analysis demonstrated that Machine Learning cannot be applied here, because past patterns are unpredictable and inconsistent, as is to be expected with human water consumption.

Machine Power and BIS.Net Analyst

The BIS.Net Analyst team has developed algorithms and decision rules over three decades to obtain, through Machine Power, far more effective analysis than classical mainstream analysis can provide. Change Analysis is one example. Other areas include:

  • Distribution fitting. Curve fitting has classically been achieved through methods such as regression and maximum-likelihood fitting. Machine-powered algorithms can obtain better fits when closed-form equations are not possible, such as for sigmoidal Logistic Curve fitting.
  • Advanced Process Control for Stream Processes. Stream processes are common such as in the confectionery industry. Classical alternatives are too cumbersome and insensitive.
  • Advanced Process Performance for Stream Processes. Process performance/capability analysis can be performed on multimodal data from different streams, each with its own probability distribution.
  • Realistic Process Capability Analysis. Classical Process Capability Analysis is based on within-subgroup variation, on the basis that between-subgroup variation is assignable-cause variation that can and should be removed. This is often not realistic, and it is not possible to apply such an approach to non-normal data. Machine-powered analysis can be applied to non-normal data and to individual data without relying on a moving-range estimate of variability.
  • More comprehensive analysis in areas such as Measurement System Analysis and Categorical Analysis.
  • More informative, perspective-giving visual presentation of data, especially in Categorical Analysis, often used in research surveys.
  • Non-normal X-bar charts. Depending on the subgroup size, degree of skewness and other factors, X-bar charts are not as robust to non-normality as is commonly believed.
  • More reliable confidence intervals, especially for discrete variables.
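As a small illustration of the curve-fitting point above: when no closed-form least-squares solution exists, a sigmoidal logistic curve can still be fitted by brute numerical search. The coarse-to-fine grid search below is a deliberately simple stand-in written for clarity, not the BIS.Net Analyst method, and the data are synthetic.

```python
# Fit y = L / (1 + exp(-k * (x - x0))) by shrinking-window grid search.
# Illustrative only; a production fitter would use a proper optimizer
# and a robust loss.
import math

def logistic(x, L, k, x0):
    return L / (1.0 + math.exp(-k * (x - x0)))

def sse(params, xs, ys):
    L, k, x0 = params
    return sum((logistic(x, L, k, x0) - y) ** 2 for x, y in zip(xs, ys))

def fit_logistic(xs, ys, rounds=4):
    """Coarse-to-fine grid search over (L, k, x0); each round keeps the
    best grid point and shrinks the search window around it."""
    best = (max(ys), 1.0, sum(xs) / len(xs))        # crude starting guess
    widths = (max(ys), 2.0, (max(xs) - min(xs)) / 2.0)
    for _ in range(rounds):
        grid = [(best[0] + widths[0] * i / 5,
                 best[1] + widths[1] * j / 5,
                 best[2] + widths[2] * m / 5)
                for i in range(-5, 6)
                for j in range(-5, 6)
                for m in range(-5, 6)]
        best = min(grid, key=lambda p: sse(p, xs, ys))
        widths = tuple(w / 5.0 for w in widths)
    return best

# Synthetic, noise-free data from known parameters L=10, k=0.8, x0=1.
xs = [float(x) for x in range(-10, 11)]
ys = [logistic(x, 10.0, 0.8, 1.0) for x in xs]
L_fit, k_fit, x0_fit = fit_logistic(xs, ys)
```

The shrinking window trades mathematical elegance for raw evaluations (a few thousand here), which is exactly the kind of exchange that Machine Power makes practical.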


Sometimes the augmented technology makes little difference compared to standard technology. Sometimes it makes a big difference, as with Change Analysis, Hybrid Charts and distribution-optimized control charts.