FEATURED ARTICLE
# Machine-Power

#### Harnessing Machine Power

#### Examples of Applications

#### Machine Power and BIS.Net Analyst

Machine power is no more than the number-crunching prowess of the computer, combined with the capacity to store large amounts of data. All computer applications rely on Machine Power; Machine Learning and Artificial Intelligence are just two examples. It is important to understand that the real power comes not from the machine alone, nor from human intelligence alone, but from the combination of both. It is human intelligence that creates the complex algorithms for the computer to follow. Without the speed of the computer the algorithms would be ineffective, and without the algorithms the computer would be ineffective.

Modern society has not fully harnessed the combination of human intelligence and the power of computers. Computing power is not the constraint; human intelligence is. Developing highly sophisticated algorithms that mimic human intelligence to the level where a layman believes the computer can consciously think requires a rare level of intelligence, beyond simply applying formulae and functions.

It is recognized that in today's resource-scarce, competitive environment, large amounts of data must be converted into information through data analysis. At the risk of over-generalizing, historically analysis has been little more than the application of formulae and rules. Some of these formulae and functions were first developed up to 300 years ago. Although the intelligence the early pioneers invested in these formulae and functions cannot and will not be disputed, today's problems are far more complicated than those for which the formulae and functions were first developed. Today, analysis often requires an additional layer of intelligence in algorithmic development that goes beyond the mere application of formulae. Algorithms alone are also not enough. Considerable thought is required to adapt the algorithms dynamically and to switch between alternative algorithms depending on the situation.

Fully harnessing machine power is the way forward. Analytical applications are becoming too complex for the recipe approach, which is one reason Machine Learning has risen in popularity over the years. However, there are also caveats to be aware of. Unlike functions, formulae, and many algorithms such as Linear Programming, intellectual out-of-the-box thinking provides a competitive edge and hence is proprietary. This means the solutions cannot be scrutinized on a theoretical basis in the way formulae, functions, and closed-form algorithms can. This is not a problem, however, provided the end user can test the applications before committing to their use.

The following example shows that even with little computing power, the combination of human intelligence and computing power can achieve results otherwise not possible. In this instance, savings of several million dollars were achieved in the 1980s using a mere Commodore 64 computer.

The application involved chocolate viscosity adjustment with a combination of Cocoa Butter, Lecithin, and Crester. In those days viscosity was characterized by Yield Value and Plastic Viscosity. Lecithin reduces Plastic Viscosity and Crester reduces Yield Value; Cocoa Butter reduces both. There were also interaction effects.

Different approaches were used to find the combination of raw materials that brings both Yield Value and Plastic Viscosity inside specification at least cost. Conventional approaches such as Linear, Integer, and Dynamic Programming were tried but failed, because the main and interaction effects varied from batch to batch as processing conditions varied. A further complication was that the effects changed after each addition, i.e. the effects depended on previous additions.

The final solution was a combination of established algorithms, such as non-linear regression, and in-house-developed analytical and optimization procedures, some of which involved modifying classical optimization algorithms. The key breakthrough was the development of dynamic switching analysis to cope with each batch on its own merits.
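To make the shape of such a problem concrete, here is a deliberately simplified sketch: two per-batch response models with main and interaction effects, and a least-cost search for additions that bring both responses inside specification. Every coefficient, cost, and specification limit below is invented for illustration, and a plain grid search stands in for the proprietary modified optimization algorithms.

```python
# Hypothetical sketch only: models, coefficients, costs, and limits are
# invented; a grid search stands in for the proprietary optimization.
import itertools

def yield_value(cb, le, em):
    # Illustrative per-batch model: main effects plus a cb*em interaction.
    return 12.0 - 1.5 * cb - 0.2 * le - 2.0 * em + 0.1 * cb * em

def plastic_visc(cb, le, em):
    # Illustrative per-batch model: main effects plus a cb*le interaction.
    return 30.0 - 2.0 * cb - 3.0 * le - 0.3 * em + 0.15 * cb * le

YV_MAX, PV_MAX = 8.0, 22.0                 # upper specification limits
COST = {"cb": 5.0, "le": 8.0, "em": 6.0}   # cost per unit of each additive

grid = [i * 0.25 for i in range(21)]       # candidate additions, 0 to 5 units
best = None
for cb, le, em in itertools.product(grid, repeat=3):
    if yield_value(cb, le, em) <= YV_MAX and plastic_visc(cb, le, em) <= PV_MAX:
        cost = COST["cb"] * cb + COST["le"] * le + COST["em"] * em
        if best is None or cost < best[0]:
            best = (cost, (cb, le, em))

print(best)  # least-cost additions that bring both responses into spec
```

In the real application the model coefficients had to be re-estimated for every batch, and the effects changed after each addition, which is precisely why a fixed recipe of this kind was not enough.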

The second example is about Change Analysis, about which several articles have been published in the Knowledge Center.

Change Analysis is about detecting changes in chronologically ordered data. As explained in other articles it is a much more effective alternative to Shewhart Charts, which do little more than detect unusual patterns in data. Change Analysis estimates the onset and duration of changes.

Change Analysis is not a new concept; it was first introduced in the 1960s by Woodward and Goldsmith. The original algorithm was based on detecting turning points in Cumulative Sum (CUSUM) Charts. Owing to the lack of computing power in those days, it lacked the sophistication to be robust across different types of data; the algorithm failed in many instances, and there were issues with false alarm rates.
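The original turning-point idea can be sketched in a few lines: the change onset is estimated as the extreme of the cumulative sum of deviations from the overall mean. The data below are invented, and this toy version has none of the robustness of the modern, proprietary algorithms described in this article.

```python
# Minimal sketch of the classical CUSUM turning-point idea (after
# Woodward and Goldsmith). Data are invented for illustration.

def cusum(data):
    # Cumulative sums of deviations from the overall mean.
    mean = sum(data) / len(data)
    total, path = 0.0, []
    for x in data:
        total += x - mean
        path.append(total)
    return path

def change_point(data):
    # The turning point (extreme) of the CUSUM path estimates the onset.
    path = cusum(data)
    return max(range(len(path)), key=lambda i: abs(path[i]))

# A series whose average shifts upward after index 9.
series = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0,
          12.1, 11.8, 12.2, 12.0, 11.9, 12.1, 12.3, 11.7, 12.0, 12.2]
print(change_point(series))  # 9: the last index before the shift
```

On clean data like this the turning point is obvious; the difficulty, and the reason Machine Power is needed, lies in making such detection robust to outliers, trends, and varying distributions.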

Relying on Machine Power, multiple algorithms with switching abilities were developed, and multiple sensitivity-optimizing parameters were added and combined with robust statistical analysis to detect changes. The algorithm is able to detect changes in the average, variability, and slope while remaining robust to outliers and a wide range of probability distributions.

One of the most recent applications has been to detect leaks and changes in domestic water consumption. The image below shows changes in consumption over a three-year period; the pink overlay shows the previous year's pattern.

Figure 1: Machine-Powered Change Analysis showing changes in water consumption over 3 years.

The alternative was to use Machine Learning to identify past patterns and then detect changes by comparing the predicted pattern with the current pattern. Change Analysis demonstrated that Machine Learning cannot be applied here, because past patterns are unpredictable and inconsistent, as is to be expected with human water consumption. Heat maps such as the one shown below (not related to the above data) were used to visually identify changes.

Figure 2: Heat Maps (not related to the above data) used to visually identify changes

Change Analysis was able to show changes far more effectively than heat maps.

Over three decades, the BIS.Net Analyst team has developed algorithms and decision rules that, through Machine Power, deliver far more effective analysis than classical mainstream methods can. Change Analysis is one example. Other areas include:

- Distribution fitting. Curve fitting has classically been achieved through methods such as regression and maximum likelihood estimation. The machine-powered algorithms can obtain better fits when closed-form equations are not possible, such as for sigmoidal Logistic Curve fitting.
- Advanced Process Control for Stream Processes. Stream processes are common such as in the confectionery industry. Classical alternatives are too cumbersome and insensitive.
- Advanced Process Performance for Stream Processes. Process performance/capability analysis can be performed on multimodal data from different streams, each with their own probability distribution.
- Realistic Process Capability Analysis. Classical Process Capability Analysis is based on within-subgroup variation, on the basis that between-subgroup variation is assignable-cause variation that can and should be removed. This is often not realistic, and such an approach cannot be applied to non-normal data. Machine-powered analysis can be applied to non-normal data and to individual data without relying on a moving-range estimate of the variability.
- More comprehensive analysis in areas such as Measurement System Analysis and Categorical Analysis
- More informative, perspective-giving visual presentation of data, especially in Categorical Analysis, often used in research surveys. Many conclusions drawn in the past using classical analysis appear dubious.
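As a toy illustration of the distribution-fitting point above, the sketch below fits a sigmoidal logistic curve, for which no closed-form least-squares solution exists, by brute-force search over a parameter grid. The data, the grid, and the fixed capacity parameter are all invented; the proprietary machine-powered fitting algorithms are not reproduced here.

```python
# Toy brute-force least-squares fit of a logistic (sigmoidal) curve.
# All data and parameter ranges are invented for illustration.
import math

def logistic(x, L, k, x0):
    # Logistic curve with capacity L, slope k, and midpoint x0.
    return L / (1.0 + math.exp(-k * (x - x0)))

xs = [i * 0.5 for i in range(13)]                 # x values 0.0 .. 6.0
ys = [logistic(x, 1.0, 2.0, 3.0) for x in xs]     # synthetic observations

def sse(k, x0):
    # Sum of squared errors for candidate k and x0 (L fixed at 1.0).
    return sum((y - logistic(x, 1.0, k, x0)) ** 2 for x, y in zip(xs, ys))

# Search k in 0.5..4.0 and x0 in 1.0..5.0, both in steps of 0.1.
best = min(((sse(k / 10, x0 / 10), k / 10, x0 / 10)
            for k in range(5, 41) for x0 in range(10, 51)),
           key=lambda t: t[0])
print(best)  # (0.0, 2.0, 3.0): the generating parameters are recovered
```

A grid search like this is only workable because the machine does the number crunching; refining it with adaptive search ranges and robust error measures is where the algorithmic intelligence comes in.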

Improve product quality through smart data analysis using new-age, machine-powered analytics for quality assurance, available to download as a suite of Apps!

- Apps for SPC, MSA, Process Performance, Inferences, Visualization, and much more
- No licences or subscriptions! Pay ONLY per analysis, billed monthly! Don't Use, Don't Pay!
- Always up-to-date with the latest tools and features

TRY FIRST WITH A 30-DAY FREE TRIAL!