Defining the UX of the Modern Enterprise

How to design for autonomous performance management

This week our team at Trufa received the wonderful news that, together with the design agency NONOBJECT, we've received the Red Dot Design Award for our efforts to build a new type of application for the enterprise.

A break from the past

Before IT conquered the enterprise, offices were decorated with large, gray cabinets. Desks were laced with manila envelopes, and strict processes ruled the business world. Then ERP systems came along. Cabinets became servers, and people used mice and keyboards instead of pens. Companies like SAP established new standards in business software. The next revolution came with Business Intelligence (BI). It was no longer enough to simply do digitally what was previously done with pen and paper. Every part and product, supplier and customer, purchase and sale was recorded somewhere in an ERP system. This wealth of information was untapped, and BI systems were there to tap it. Armies of consultants created long-running batch jobs that spat out reports to provide insights into the business.

Enter 2017. The computing world has made the move to the cloud, democratizing scalability. Modern frameworks have enabled more rapid development. Mobile has eaten the world. We're on the cusp of an AI revolution. It's time to unlock the true potential of what enterprise software can be. Trufa shows what this future looks like, today.

The future is autonomous

We like to say that we sell autonomous performance management. A Business Intelligence system is at heart descriptive. It can take some ERP data, reshape it, and provide it in the form of a sorted table or...
Can Finance & Ops Ever Talk To Each Other?

Up to now, Finance has had its own language, revolving around gross margin, cash, revenue, costs, etc. And Operations has its own language, revolving around customer satisfaction, product quality, just-in-time, lean, flexibility, lot size 1, etc. There is no common language, so it is no surprise that people cannot talk to each other. But they have to talk to each other, especially when it comes to managing the performance of a company. Finance has certain demands on gross margin and free cash, for example (actually, the company has those demands, but they are expressed via Finance). And Operations needs to understand how to meet these requirements. So far, this sync process between Finance and Operations has been trial and error. Nothing less, nothing more.

There was a similar problem in the communication between doctors and patients. Doctors wanted to convince patients of a certain diet, and patients wanted to understand the rationale for those recommendations. This was the reason for the introduction of the so-called Body Mass Index (BMI). Meanwhile, the BMI is so widely accepted that nobody thinks of it as a common language between doctors and patients anymore.

We have applied the same principle to enterprise performance. Our Trufa Performance Index (TPI) translates Finance language into Operations language. The TPI shows, for example, that improving perfect fulfillment by 2% yields 5% more cash or 0.5% more profitability. It shows that you need to increase your customer satisfaction by 2.5% in order to achieve the required 5% cash improvement. It shows that increasing your electronic orders by 10% saves you 3% free...
No Process Harmonization, No Performance Comparability?

In the days of "lot size 1" and "customer 1," process harmonization becomes more and more of a pipe dream, whether we like it or not. And with process harmonization, performance comparability goes down the tube as well. Or does it?

One rationale for driving process harmonization is to establish comparability between the various business entities. Why do you want comparability? Because you want to establish some sort of benchmark in order to understand where you perform better and where you perform worse. Comparability is the prerequisite for identifying best practices. Fortunately, financial performance is easily comparable, because it measures everything in dollars and cents. And it is naturally clear that effectiveness (profitability) and efficiency (cash) are the key finance metrics. Unfortunately, there is no straightforward way to translate operational performance figures into financial ones. What is the financial impact of improving your customer satisfaction? Of improving your product quality? Of shipping on time? Of an increased number of electronic orders? And so on.

The first question is whether there is a relationship between a certain operational activity and the financial outcome at all. Does it matter, for example, whether you are doing manual or automatic planning? This can be found out with statistical correlation analysis. If there is a relationship at all, the ensuing question is what this relationship looks like. If you improve your dispatching quality by 2%, how much does it improve your free cash position? Such relationships can be determined with statistical predictive modeling. As a side effect, such models also yield the...
Isn’t S/4HANA the same as Trufa?

The answer is simple: yes and no! "Yes," because Trufa and S/4HANA apply the same philosophy of "all details" and "no aggregates." "No," because Trufa solves a totally different problem than S/4HANA. S/4HANA is about supporting and automating the operations of an enterprise; Trufa is about supporting and automating the management decisions of an enterprise. This means that S/4HANA and Trufa are complementary by nature. You cannot do with S/4HANA what you can do with Trufa, and vice versa. But both solutions are synergistic. Let's illustrate these synergies with a few exemplary use cases of Trufa for S/4HANA:

- Familiarizing yourself with the foundational S/4HANA characteristics: experiencing the different way of working with ad-hoc dynamic hierarchies of all sorts. No need for a running S/4HANA system.
- Preparing your S/4HANA introduction: taking stock of your current ERP usage patterns. No need for a running S/4HANA system.
- Comparing your ERP and S/4HANA complexity: gauging your degree of achieved simplification. S/4HANA test system required.
- Weighing your ERP- and S/4HANA-based business performance: qualitative and quantitative performance driver assessment. Parallel run of ERP and S/4HANA required.
- Accompanying the S/4HANA rollout: from S/4HANA Finance to S/4HANA. S/4HANA in production mode required.

Technically, Trufa is an "S/4HANA application." Trufa uses the same technology stack as S/4HANA, the same raw tables as S/4HANA, and the same built-in financial functions of HANA as S/4HANA. Have we piqued your interest? Then don't hesitate to talk to...
Does Cutting Down Days of Duration Improve Performance? Hint: Maybe Yes! Maybe No!

Many people believe that driving down the duration of business processes is an improvement in itself, i.e., that three (3) days are better than five (5) days. How come? It is alluring because it looks so intuitive and simple, and it is possible to measure. Measuring process durations in free cash or profitability, on the other hand, seems impossible. Or is it? In such situations, people tend to take the path of least resistance, as they have repeatedly throughout the history of performance management. Though everybody knows as well: "Performance indicators that measure activity rather than performance will provide less useful data and information overload" (https://smartkpis.kpiinstitute.org/kpi-101/pitfalls-and-how-to-avoid-them). Actually, there is common sense about what it takes to make an effective performance indicator. We have figured out how to measure operational activities in free cash or profitability. Hence, Trufa performance indicators:

- can be linked to objectives (enterprise efficiency or effectiveness)
- are actionable (because drivers are clearly identified)
- are simple (it's either cash or profit)
- are credible (they can be looked up in the P&L or in the balance sheet)
- are integrated (financial outcome can be translated into operational measures)
- are measurable (that's what Trufa is all about).

Do you want to take us up on...
The Profile of a Digital Controller

New technology such as Digital Performance Management gives birth to a new type of controller (financial analyst). Proven tools and routines will be complemented with new ones; trusted habits and approaches will be recharged with new ways of doing things. How long this takes depends on our openness and willingness to learn from each other and to adjust to this change. We at Trufa are observing the following changes of behavior in our customer base:

1. Leaving the ivory tower. Digital Performance Management enables the automatic detection and timely management of operational input factors (operational performance drivers). Digital Controllers collaborate closely with operational managers. They are genuine Business Controllers: they sit in the business, no longer in a central location. The input factors get managed early enough to influence the output factors (focusing on leading rather than lagging performance indicators).

2. Substituting reporting. Digital Performance Management enables unbiased ad-hoc big data queries. Digital Controllers resort to experimentation to verify hunches and to cover special aspects. Experimentation saves time and money.

3. Ignoring outliers. Digital Performance Management enables the management of whole distributions. Digital Controllers focus on bulk improvements instead of single ones. Managing outliers can be very hard, and the results are questionable in terms of time and money.

4. Overcoming silos. Digital Performance Management enables the analysis of end-to-end processes. Digital Controllers think in processes rather than departments. Competing and conflicting sub-optimizations are avoided.

5. Applying statistical thinking. Digital Performance Management enables the application of robust statistics and machine learning. Digital Controllers accept statistical errors instead of insisting on deterministic results all the time. Statistics make...
Ignorance Is a Blessing – If Done Consciously

Today's business challenge is to decide what not to do rather than what to do! To take these decisions, we keep adding data, tooling, technology, people and time. We are making everything even more complicated than our inherent complexity already demands. More often than not, we end up doing nothing: either because we are stuck in "analysis paralysis" (see below), or because we continue doing it like we always have, struggling through crunch times. Our customers tell us that they have overcome this stalemate, because they can:

- test their hunches in seconds instead of spending three person-months
- use freed-up cash or profitability as uniform decision criteria instead of incomparable KPIs
- be sure that nothing has been overlooked instead of relying on biased reports

For our customers, "conscious ignorance" has arrived.

P.S. Analysis paralysis, or paralysis by analysis, is the state of over-analyzing (or over-thinking) a situation so that a decision or action is never taken, in effect paralyzing the outcome. A decision can be treated as over-complicated, with too many detailed options, so that a choice is never made, rather than trying something and changing course if a major problem arises. A person might be seeking the optimal or "perfect" solution upfront, and fear making any decision which could lead to erroneous results, while on the way to a better solution.[1] The phrase describes a situation in which the opportunity cost of decision analysis exceeds the benefits that could be gained by enacting some decision, or an informal or non-deterministic situation where the sheer quantity of analysis overwhelms the decision-making process itself, thus...
10 Principles of Attention Allocation

Attending to the things that matter is the foremost task of any manager. This means deciding which things matter and how much time to spend on them. And as we all know, the time to decide is shrinking dramatically while the number of decisions to be taken is growing by orders of magnitude. We at Trufa set out to help business managers with these decisions. We are trying to mimic human reasoning with machine learning, applying the following principles in our algorithms:

1. Attention allocation algorithms must be comprehensible by human beings. Algorithms people don't trust will not be accepted.
2. Attention allocation algorithms must produce dependable results. Algorithms never produce 100% precise results, but their margin of error must be negligible.
3. Attention allocation algorithms must produce reproducible results. Algorithms must be reliable.
4. Attention allocation algorithms must be robust against outliers. Human beings tend to turn excessively to outliers, though these outliers are often unmanageable.
5. Attention allocation algorithms must be unbiased. Among others, multivariate regressions make assumptions about the applicable variates.
6. Attention allocation algorithms must be diligently applied. In the 80s, expert systems were the cure for everything; nowadays, machine learning is overrated.
7. Attention allocation algorithms must work in real time. Decisions are to be taken ever faster these days.
8. Attention allocation algorithms must work on very large data sets. ERP systems are growing day by day, and the digital transformation is accelerating this.
9. Attention allocation algorithms must work for various businesses in various industries. Trustworthiness beats individualism.
10. Attention allocation algorithms must work across multiple ERP systems. Non-matching document IDs and master data must not throw off the...
Process Mining on Shaky Ground – or – The Fallacy of (SAP) ERP Event Logs

Process Mining is highly popular, because customers want to learn about their business in detail. Unfortunately, this high level of customer interest has lured vendors into taking technical shortcuts which are questionable, to say the least. The typical approach in process mining is to look for so-called event logs: "Event logs: To be able to apply process mining techniques it is essential to extract event logs from data sources (e.g., databases, transaction logs, audit trails, etc.)." (http://www.processmining.org/logs/start)

In the case of technical systems such as database or transaction management systems, event logs are the basis for ensuring transactional integrity: "Examples from double-entry accounting systems often illustrate the concept of transactions. In double-entry accounting every debit requires the recording of an associated credit. If one writes a check for $100 to buy groceries, a transactional double-entry accounting system must record the following two entries to cover the single transaction: 1. Debit $100 to Groceries Expense Account 2. Credit $100 to Checking Account. A transactional system would make both entries pass or both entries would fail. By treating the recording of multiple entries as an atomic transactional unit of work the system maintains the integrity of the data recorded. In other words, nobody ends up with a situation in which a debit is recorded but no associated credit is recorded, or vice versa." (https://en.wikipedia.org/wiki/Database_transaction)

Transactional integrity can be preserved if and only if the respective event log is 100% accurate and complete. Hence, event logs in transactional systems are highly reliable for subsequent process mining, as performed, for example, by Splunk. In the case of business systems like SAP ERP, such...
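The "both entries pass or both entries fail" behavior from the quoted example can be sketched in a few lines. The `Ledger` class below is a hypothetical toy, not any real accounting API; the account names and the $100 amount come directly from the quote:

```python
class Ledger:
    """Toy double-entry ledger: a posting commits fully or not at all."""

    def __init__(self):
        self.entries = []  # committed (kind, account, amount) tuples

    def post(self, debit_account, credit_account, amount):
        # Validate before touching committed state, so a bad posting
        # leaves the ledger untouched (one atomic unit of work).
        if amount <= 0:
            raise ValueError("amount must be positive")
        self.entries.extend([
            ("debit", debit_account, amount),
            ("credit", credit_account, amount),
        ])

ledger = Ledger()
ledger.post("Groceries Expense Account", "Checking Account", 100)

# Every debit has a matching credit, so the ledger always balances.
debits = sum(a for kind, _, a in ledger.entries if kind == "debit")
credits = sum(a for kind, _, a in ledger.entries if kind == "credit")
print(debits, credits)  # both 100
```

An event log written by such a system is reliable precisely because the system itself depends on it; the article's point is that this guarantee does not carry over automatically to business systems.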
The Death of Process Modeling as We Know It

We are all too familiar with the indispensable task of business process modeling: "Business process modeling (BPM) in systems engineering is the activity of representing processes of an enterprise, so that the current process may be analyzed or improved. BPM is typically performed by business analysts, who provide expertise in the modeling discipline; by subject matter experts, who have specialized knowledge of the processes being modeled; or more commonly by a team comprising both. Alternatively, the process model can be derived directly from events' logs using process mining tools. The business objective is often to increase process speed or reduce cycle time; to increase quality; or to reduce costs, such as labor, materials, scrap, or capital costs. In practice, a management decision to invest in business process modeling is often motivated by the need to document requirements for an information technology project." (https://en.wikipedia.org/wiki/Business_process_modeling)

And we are all too familiar with the daunting shortcomings of this approach. Even considering that modern process mining might speed up the detection of the as-is processes, the modeling of the to-be processes remains an unsolved problem: you still have to have endless discussions about which process variants are to be discarded and which ones are to be improved, for example, since you have no decision criteria for why one process scenario is preferable over another. Using the navigation system as an analogy, process mining gives you a track of which way you have gone and how fast. But process mining cannot help you decide which route is the fastest, nor does it remind you to start your journey early...
Multiple ERP Systems Divided by a Common Goal

It is not uncommon for companies to operate on multiple ERP systems (the average is 4 to 5). And it is highly questionable whether overcoming this situation is feasible at all, so it will not change for the time being. Unfortunately, this turns out to be a huge obstacle for any end-to-end process optimization, because, to make matters worse, these multiple ERP systems have little in common. Master data are not complete in each system, and where they overlap, their keys are not in sync. Processes are cut in half, and where they overlap, document numbers and timestamps are not in sync. All these idiosyncrasies contribute to the Sisyphean task of optimizing processes which span multiple systems.

Let's assume you are running your logistics in one ERP system, and your finance people have a different one. Of course you can perfectly manage your inventory in the logistics system, and of course you can perfectly pay your suppliers on time. Or can you? Say your goods arrive on March 15. You transfer your goods receipt note from your logistics to your finance system within five days, e.g., on March 20, and then you pay your supplier on March 23. So far, so good. But what if your supplier introduced a new early-payment discount, applicable if the invoice is paid within five days? Then you'd be unable to leverage this advantage. Too bad. What's the real problem here? This opportunity would probably have gone unnoticed, which is even worse. Still, not everything is lost nowadays, because in essence we...
Is Your Customer Satisfaction Related To Your Profit Or Not?

The Net Promoter Score (NPS) measures customer satisfaction as a key driver of a company's financial success. NPS transformed the business world and remains the core measurement for customer experience management worldwide. But while NPS has improved customer satisfaction, its results do not guarantee business growth. As many critics of NPS point out, a customer can report being satisfied or very satisfied and still defect or skip repurchasing. In many cases, we see companies with great NPS scores yield poor results at the end of the quarter.

Trufa offers an innovative way for business users to include NPS in operational decisions by leveraging their SAP ERP data. Unlike NPS alone, Trufa explores the correlation between customer satisfaction and financial gains. By weighing customer satisfaction in terms of monetary value, business users can effectively use their NPS to gain better insights. Business users get a complete view of customer satisfaction across all of the organization's surveys; Trufa offers a complete view of the score, enabling business users to easily and quickly determine areas of improvement in their operational processes. Business users can also leverage the most up-to-date results of their NPS: questions such as "How do my recent operational changes affect customer satisfaction?" and "Where do I trend?" are easily explored and quickly answered in the application.

As a business, wouldn't you like to know how your customer satisfaction correlates with your financial success? Is it worth your time and money to ensure that customer service levels are consistent throughout the process? The Trufa Performance Management Machine offers state-of-the-art enterprise application...
The Need for Stochastic Controlling or The End of FP&A as We Know it Today

Classical FP&A (Controlling) approaches are on a dead-end street. We are witnessing the materialization of three fundamental facts of life which are forcing us to change:

Incomplete data – 1: The avalanche of newly created data is growing faster than ever before. IoT, or Industry 4.0, is real.

Incomplete data – 2: The number of deployed enterprise systems has grown for more than 30 years now. The knowledge about how these systems interact is fading away, day by day.

Incomplete data – 3: Today, it is not uncommon for vendors to not fully understand how their ERP systems function in detail. The developers who conceived these products are all above 60 years old.

As a result, classical deterministic approaches are doomed to fail, sooner rather than later. Therefore, deterministic approaches need to be complemented with stochastic, statistical approaches, better sooner than later. Do these proven approaches, long used by physicists, work in enterprise finance? Yes, they do. We don't know how "stochastic controlling" will unfold in the future. Our vision is that it will revolve around performance scoring.

FP&A stands for Financial Planning & Analysis. Planning is largely determined by aspirations and ambitions rather than the capability to execute. Analysis is more or less confined to Excel massaging. In order to improve planning, advancing it from once a year to continuous, one must cognize business drivers. This means FP&A needs to understand which of their financial indicators depend on which operational process steps. Moreover, FP&A must know about each and every operational process step. Provided FP&A has access to all detailed money and goods movements during a...
Too Little, Too Late? The Future of Controlling

Controlling is constantly fighting for its place within the corporation. Today, there is an opportunity for controlling to break out of its "bean counter" jail. The essence of controlling is planning and reporting. Planning is challenging because, as a Danish proverb says, "Prediction is difficult, especially when dealing with the future." In the same vein, reporting is tough because it answers only those questions where the potential answer is already known. As a consequence, controlling contributions are perceived more or less as "too little, too late." This is the reason why controlling has been at a standstill for more than 25 years (1990 saw the birth of the Balanced Scorecard).

Now, the "business of predictions" is experiencing its second wind: artificial intelligence is back, and machine learning is the buzzword of the day. In our daily lives, we have become reliant on predictions, be it Google or Amazon proposals of all sorts, or the car navigation system which brings us safely and quickly to our destination. Actually, I must admit that I sometimes find it nearly scary what systems know about my intentions and behavior.

We abound in algorithms. Algorithms have evolved over a long time: machine learning was first mentioned in 1959, artificial intelligence in 1956. I personally lived through the first hype cycle of Expert Systems in the 80s. Those systems worked in highly specialized niches, but they never conquered broader ground. How come? Computers were just not fast enough; data could not be processed fast enough. This situation caused the "fall of mankind": the introduction of OLAP technologies which, in essence, attempted...
A Radically New Approach to Complexity Cost

A new way to reduce complexity cost in your organization.

One of the questions we frequently receive from customers is whether we can help with their complexity cost: complexity cost caused by the number of suppliers, of products, of process variants, etc. We absolutely can, with a radically new approach: simulating the effect of complexity on free cash and profitability to derive decision criteria for taking action, supplier by supplier, product by product, process variant by process variant, etc.

Complexity costs grow quadratically with the number of moving parts such as SKUs, process variants, payment terms and the like. An analysis by ATKearney found that "EBIT reserves of more than €30 billion are just waiting to be tapped," and that "companies can increase their EBIT by 3 to 5 percentage points on average." Yet why do we have a difficult time unraveling complexity cost issues? It begins with the term "complexity costs." While the term correctly describes the phenomenon that complexity drives costs, it is misleading because it suggests that we have to analyze costs in order to be able to do something about them. This is wrong.

Complexity is not a bad thing per se. There is "good" as well as "bad" complexity: "good" complexity is value-adding, whereas "bad" complexity is value-diminishing. Hence, the challenge is to differentiate between the two. Our approach starts by defining "value." Though value targets vary per enterprise, they typically revolve around the capability to generate cash and profits, and thus are composed of cash (working capital) and profitability goals. That is the basis for our new idea....
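The quadratic-growth claim can be made concrete with a back-of-the-envelope sketch: if every pair of moving parts can potentially interact (a simplifying assumption of ours, not ATKearney's cost model), the number of potential interactions grows as n(n-1)/2:

```python
def potential_interactions(n_parts: int) -> int:
    """Pairwise interactions among n moving parts: n * (n - 1) / 2."""
    return n_parts * (n_parts - 1) // 2

# Doubling the number of SKUs roughly quadruples the interaction count.
for n in (100, 200, 400):
    print(f"{n:>4} parts -> {potential_interactions(n):>6} potential interactions")
```

This is why trimming even a modest share of "bad" complexity can pay off disproportionately: the interaction count falls quadratically as parts are removed.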
12 Rules for True Finance Applications

The Oxford Dictionary defines "principle" as "A fundamental truth or proposition that serves as the foundation for a system of belief or behavior or for a chain of reasoning." Principles are the prerequisite for growing a well-behaved software system, too. The most famous principles in computing history are probably the "IBM System/360 Principles of Operation" (POO). Those principles have been the foundation of a whole industry and created the software world we are living in. At Trufa, we strongly believe in principles as well. These principles have determined, and still determine, our daily decisions. We have even gone so far as to abstract them into a set of 12 rules in order to share them with our customers, partners and employees:

1. True Numbers: All numbers are calculated rather than approximated. No bias due to aggregates or cleansing.
2. All Process Details: No relevant information gets lost due to process gaps or omissions.
3. No Semantic Gap: The same language throughout the whole world of data.
4. Transcending Silos: All business functions look at the same source of truth.
5. Forward-looking: Simulating the future by leveraging true relationship insights.
6. Intuitive Usage: Deciders understand their discipline without intermediaries (e.g. data scientists).
7. Instant Live: No data preparation (especially modeling) beyond access to raw data.
8. "Viral" Implementation: All potential users can study the data without further ado.
9. Collective Intelligence: Cloud-centrism enables collaborative decision making.
10. Ready-to-run Apps: No IT (ERP/BI) projects in order to install, configure and customize the software.
11. Always Up-to-Date: Frictionless and latency-free data provisioning.
12. Simple Technology: Robust and reliable due to a significantly shortened technology stack.

These 12 rules are built on Edsger Dijkstra's insight...

Coinvention indicates the maturity of an idea (here: advanced statistics for corporate Finance)

The other day we came across the inspiring article "CFOs Frustrated with Return on FP&A Investments." It reminded us of the wise saying that coinvention (independent people inventing the same thing at the same time) is a sign of the maturity of an idea. Some three years ago, we came to the conclusion that new technology advancements such as in-memory columnar databases, robust statistics and cloud computing open the way for a new approach to solving "unsolvable" corporate finance problems like those alluded to in this article. So we developed an application that enables sensitivity analyses of vital financial ratios with respect to the operational efficiency and complexity (portfolio and process) of medium to large enterprises. For this, we calculate arbitrary operational and financial KPIs ad hoc from every detailed business transaction and correlate them in order to determine opportunity and risk spans, as well as to automatically detect virtually all corrective action opportunities. Key to this is our proprietary business performance score, the TPI (True Performance Index), which relates operational to financial performance by leveraging robust statistical simulation methodologies.

On the one hand, our customers benefit from our customer-specific, sustainable free cash optimization recommendations (instead of just tinkering with the symptoms of too-high working capital). On the other hand, they benefit from our customer-specific pricing optimization recommendations (assessing their price elasticity patterns). And now we have introduced profitability into the picture (focusing on real profit rather than an allocation game). The results so far are amazing. Truly a new dimension of business controlling and corporate finance.

P.S. By the way, did we already mention that our customers went live in...

Error margin of aggregate-based analysis and reporting

We are so used to basing our daily analysis and reporting on aggregated data rather than detailed data that we have forgotten to ask ourselves what kind of mistake we are making by doing so. And indeed, there is a price for this sort of simplification. Albert Einstein already stated, "Everything should be made as simple as possible, but not simpler." Building data aggregates turns out to be such an over-simplification in many, if not most, cases. In fact, the error margin attributable to neglecting the detail by relying on aggregates can easily be in the magnitude of 10%. So far, so good? Not really. Because with an error margin of 10%, all improvement conclusions of less than 10%, say 5% for example, are unfounded. Those findings might be real or a fallacy. What does this mean for our current management practice? How come we have been so ignorant about this issue? We simply couldn't do better with traditional data analysis technology. But now we can. So why don't we?

Guenther Tolkmit / Chief Delivery Officer and Co-Founder at Trufa,...
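A small, self-contained illustration of how aggregation alone can distort a figure. The order data are invented; the point is only that a margin computed from aggregated totals can differ from the detail-level view by far more than 10 percentage points:

```python
# Hypothetical orders as (revenue, profit): several small high-margin orders
# and one large low-margin order, a common real-world mix.
orders = [(100, 30), (100, 30), (100, 30), (10_000, 500)]

# Detail-level view: average of the per-order margins.
detail_margin = sum(p / r for r, p in orders) / len(orders)

# Aggregate view: margin computed from the summed totals.
total_revenue = sum(r for r, _ in orders)
total_profit = sum(p for _, p in orders)
aggregate_margin = total_profit / total_revenue

print(f"detail-level average margin: {detail_margin:.1%}")
print(f"aggregate margin:            {aggregate_margin:.1%}")
```

Neither number is wrong, but they answer different questions; an aggregate-only report silently picks one of them and hides the distribution behind it.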
Paying late honored by steep discounts?

One of our customers found a surprising fact the other day, and it took her only a few hours. Here's the scoop. Our customer did some pricing sensitivity analyses. Among other things, she checked into the realized prices for customers with a 180-day payment target. And she learned that those customers are paying about 23% below average prices. Astounding, isn't it?

Was this insight just an accident? You may think so, since with reporting you wouldn't have found this top-line improvement opportunity: reporting presupposes that you have a certain suspicion in order to parametrize the report accordingly. In other words, all reporting is biased by nature, so you cannot find surprising facts with reporting at all. Furthermore, such findings require access to all detailed transactional records, in order to be able to correlate arbitrary business events in unusual ways. And this needs to be doable ad hoc.

We at Trufa have specialized in top- and bottom-line sensitivity analyses. We apply advanced statistics to all business details as recorded in SAP ERP systems. We quantify opportunities and risks in dollars: in impact on working capital, realized pricing levels and achieved operating margin. We relate financial and operational performance indicators. We foster product portfolio and process complexity reductions. And we set our customers live in less than one (1) week. That's all.

Guenther Tolkmit / Chief Delivery Officer and Co-Founder at Trufa,...
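The kind of check described here, comparing realized prices per payment-term group against the overall average, can be sketched with invented invoice data. The 23% figure above is the customer's; the numbers below are ours, purely for illustration:

```python
from collections import defaultdict

# Hypothetical invoice records: (payment_target_days, realized_price).
# In practice these would come from detailed SAP ERP line items.
invoices = [
    (30, 102), (30, 98), (30, 101), (30, 99),
    (60, 100), (60, 97),
    (180, 77), (180, 78), (180, 76),
]

overall_avg = sum(p for _, p in invoices) / len(invoices)

by_terms = defaultdict(list)
for days, price in invoices:
    by_terms[days].append(price)

for days in sorted(by_terms):
    avg = sum(by_terms[days]) / len(by_terms[days])
    delta = (avg - overall_avg) / overall_avg
    print(f"{days:>3} days: avg price {avg:.1f} ({delta:+.0%} vs overall)")
```

The query itself is trivial; the point of the anecdote is that nobody thinks to run it until the data can be sliced ad hoc, along any dimension, without pre-building a report.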

What makes the life of process owners so complicated?

If you want to get end-to-end business processes under control, you are inclined to appoint process owners. Sounds logical, doesn't it? Nevertheless, there are relatively few process owners in companies. How come? Typically, a process owner needs to argue with the line managers whose areas he is traversing. And typically this means a lot of infighting, because the hard facts needed to decide such trade-offs are normally hard to come by. That is a real-life problem which we incidentally solved while creating our TPI (True Performance Index).

The TPI gives meaning to KPIs. The main disadvantage of KPIs is that they are not comparable. Let's say that your KPI for shipping on time is 45% in Italy and 65% in Mexico. The question at hand is: where should I invest, in Italy or in Mexico? The TPI converts the KPI into the associated value, weighted with the company-specific likelihood of achieving the identified value gain. This way you can take an educated decision. By the same token, a TPI can be calculated for a whole business process and for each of its individual steps. And the beauty is that this "process" TPI can be compared with the "org unit" TPI, for example. Thus you can decide whether you gain more by investing in high-frequency business process scenarios like "rush order" versus investing in "shipping on time in Russia," for example. Sounds cool? If you want to learn the details, talk to us. Right away.

Guenther Tolkmit / Chief Delivery Officer and Co-Founder at Trufa,...
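The Italy-versus-Mexico decision can be sketched as an expected-value calculation. The 45% and 65% KPIs come from the example above; the target level, the value per KPI point, the likelihoods, and the weighting formula itself are illustrative assumptions of ours, not the actual TPI methodology:

```python
def expected_gain(current_kpi, target_kpi, value_per_point, likelihood):
    """Hypothetical TPI-style score: value of closing the KPI gap,
    weighted by the company-specific likelihood of achieving it."""
    gap = target_kpi - current_kpi
    return gap * value_per_point * likelihood

# Shipping-on-time KPIs from the example: Italy 45%, Mexico 65%.
# Targets, dollar values per point, and likelihoods are invented.
italy = expected_gain(45, 80, value_per_point=120_000, likelihood=0.4)
mexico = expected_gain(65, 80, value_per_point=200_000, likelihood=0.7)

print(f"Italy:  ${italy:,.0f}")
print(f"Mexico: ${mexico:,.0f}")
```

Note that the larger KPI gap (Italy) does not automatically win: once value and likelihood are factored in, the weighted figure can point to Mexico instead, which is exactly the kind of educated decision the text describes.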