Process Mining on Shaky Ground – or – The Fallacy of (SAP) ERP Event Logs

Process Mining is highly popular, because customers want to learn about their business. In detail. Unfortunately, this high level of customer interest has lured vendors into taking technical shortcuts which are questionable, to say the least. The typical approach in process mining is to look out for so-called event logs. “Event logs: To be able to apply process mining techniques it is essential to extract event logs from data sources (e.g., databases, transaction logs, audit trails, etc.).” (http://www.processmining.org/logs/start) In the case of technical systems such as database or transaction management systems, event logs are the basis for ensuring transactional integrity. “Examples from double-entry accounting systems often illustrate the concept of transactions. In double-entry accounting every debit requires the recording of an associated credit. If one writes a check for $100 to buy groceries, a transactional double-entry accounting system must record the following two entries to cover the single transaction: 1. Debit $100 to Groceries Expense Account 2. Credit $100 to Checking Account A transactional system would make both entries pass or both entries would fail. By treating the recording of multiple entries as an atomic transactional unit of work the system maintains the integrity of the data recorded. In other words, nobody ends up with a situation in which a debit is recorded but no associated credit is recorded, or vice versa.” (https://en.wikipedia.org/wiki/Database_transaction) Transactional integrity is preserved if and only if the respective event log is 100% accurate and complete. Hence event logs in transactional systems are highly reliable for subsequent process mining, as performed, for example, by Splunk. In the case of business systems like SAP ERP such...
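To make the atomicity point concrete, here is a minimal sketch of the quoted $100 double-entry example, assuming Python with an in-memory SQLite database (chosen purely for illustration, not how any particular ERP or log-analysis tool records its events): both ledger entries commit together or not at all.

```python
import sqlite3

# Minimal sketch of the atomic double-entry transaction quoted above,
# using an in-memory SQLite database purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ledger (account TEXT, debit REAL, credit REAL)")

try:
    with conn:  # everything inside commits together or rolls back together
        conn.execute("INSERT INTO ledger VALUES ('Groceries Expense', 100.0, 0.0)")
        conn.execute("INSERT INTO ledger VALUES ('Checking Account', 0.0, 100.0)")
except sqlite3.Error:
    # If either insert fails, neither entry is recorded: no lone debit or credit.
    pass

print(conn.execute("SELECT * FROM ledger").fetchall())
```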
The Death of Process Modeling as We Know It

We are all too familiar with the indispensable task of business process modeling. “Business process modeling (BPM) in systems engineering is the activity of representing processes of an enterprise, so that the current process may be analyzed or improved. BPM is typically performed by business analysts, who provide expertise in the modeling discipline; by subject matter experts, who have specialized knowledge of the processes being modeled; or more commonly by a team comprising both. Alternatively, the process model can be derived directly from events’ logs using process mining tools. The business objective is often to increase process speed or reduce cycle time; to increase quality; or to reduce costs, such as labor, materials, scrap, or capital costs. In practice, a management decision to invest in business process modeling is often motivated by the need to document requirements for an information technology project.” (https://en.wikipedia.org/wiki/Business_process_modeling) And we are all too familiar with the daunting shortcomings of this approach. Even considering that modern process mining might speed up the detection of the as-is processes, the modeling of the to-be processes continues to be an unsolved problem. You still have to have the endless discussions about which process variants are to be discarded and which ones are to be improved, for example, since you have no decision criteria for why one process scenario is preferable to another. Using the navigation system as an analogy: process mining gives you a track of which way you have gone and how fast. But process mining cannot help you decide which route is the fastest. Nor does it remind you to start your journey early...
Multiple ERP Systems Divided by a Common Goal

It is not uncommon for companies to operate on multiple ERP systems (the average is 4 to 5). And it is highly questionable whether it would even be feasible to overcome this situation. Hence this situation will not change for some time to come. Unfortunately, it turns out to be a huge obstacle for any end-to-end process optimization, because, to make matters worse, these multiple ERP systems have little in common. Master data are not complete in each system, and where they overlap, their keys are not in sync. Processes are cut in half, and where they overlap, document numbers and timestamps are not in sync. All these idiosyncrasies contribute to the Sisyphean task of optimizing processes which span multiple systems. Let’s assume you are running your logistics in one ERP system and your finance people have a different ERP system. Of course you can perfectly manage your inventory in the logistics system. And of course you can perfectly pay your suppliers on time. Or can you? Let’s assume your goods arrive on March 15. You transfer your goods receipt note from your logistics to your finance system within five days, say on March 20, and then you pay your supplier on March 23. So far so good. But what if your supplier introduced a new early-payment discount, applicable if the invoice is paid within five days? Then you’d be unable to take advantage of it. Too bad. What’s the real problem here? The opportunity would probably have gone unnoticed. Even worse. Yet not everything is lost nowadays. Because in essence we...
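A minimal sketch of the date arithmetic behind this scenario, assuming the five-day discount window is counted from goods receipt and picking an arbitrary year (both are assumptions, not stated in the post):

```python
from datetime import date, timedelta

# Hypothetical recreation of the scenario above (year chosen arbitrarily):
goods_receipt   = date(2017, 3, 15)   # goods arrive in the logistics system
transfer_to_fin = date(2017, 3, 20)   # goods receipt note reaches finance
payment_date    = date(2017, 3, 23)   # supplier actually gets paid

# Assumed discount rule: invoice must be paid within five days of goods receipt.
deadline = goods_receipt + timedelta(days=5)

print("discount deadline :", deadline)
print("actually paid on  :", payment_date)
print("discount captured :", payment_date <= deadline)  # False
# The five-day cross-system transfer lag alone already consumes the whole window.
```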
Is Your Customer Satisfaction Related To Your Profit Or Not?

The Net Promoter Score (NPS) measures customer satisfaction as a key driver of the financial success of a company. NPS transformed the business world and remains the core measurement for customer experience management worldwide. While NPS has improved customer satisfaction, the results do not guarantee business growth. As many critics of NPS point out, a customer can give a score of satisfied or very satisfied but still defect or skip repurchasing. In many cases, we see companies with great NPS scores yield poor results at the end of the quarter. Trufa offers an innovative way for business users to include NPS in operational decisions by leveraging their SAP ERP data: unlike NPS alone, Trufa explores the correlation between customer satisfaction and financial gains. By weighing customer satisfaction as it relates to monetary value, business users can effectively use their NPS to gain better insights. Business users get a complete view of customer satisfaction across all of the organization’s surveys. Trufa offers a complete view of the score, thereby enabling business users to easily and quickly determine areas of improvement in their operational processes. Business users can leverage the most up-to-date results of their NPS. Questions such as “How do my recent operational changes affect customer satisfaction?” and “Where am I trending?” are easily explored and quickly answered in the Trufa application. As a business, wouldn’t you like to know how your customer satisfaction correlates with your financial success? Is it worth your time and money to ensure that customer service levels are consistent throughout the process? The Trufa Performance Management Machine offers state-of-the-art enterprise application...
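A minimal sketch of the kind of satisfaction-versus-money correlation described above, using made-up per-customer figures (this is not Trufa's actual methodology, just the basic idea):

```python
import numpy as np

# Made-up example data: per-customer NPS-style scores (0-10) and the revenue
# change observed for the same customers over the following quarter.
nps_score      = np.array([9, 10, 7, 6, 8, 3, 9, 5, 10, 4])
revenue_change = np.array([0.12, 0.08, 0.02, -0.01, 0.05, -0.07, 0.10, 0.00, 0.09, -0.04])

# Pearson correlation: does higher stated satisfaction actually go along with
# higher financial gain for this customer set?
r = np.corrcoef(nps_score, revenue_change)[0, 1]
print(f"correlation between satisfaction and revenue change: {r:.2f}")
# A weak correlation would support the critics: a good score alone
# does not guarantee repurchasing or growth.
```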
The Need for Stochastic Controlling or The End of FP&A as We Know it Today

Classical FP&A (Controlling) approaches are on a dead-end street. We are witnessing the materialization of three fundamental facts of life which are forcing us to change:
* Incomplete data – 1: The avalanche of newly created data is growing faster than ever before. IoT or Industry 4.0 is real.
* Incomplete data – 2: The number of deployed enterprise systems has grown for more than 30 years now. The knowledge about how these systems interact is fading away, day by day.
* Incomplete data – 3: Today, it is not uncommon for vendors to not fully understand how their ERP systems function in detail. The developers who conceived these products are all above 60 years old.
As a result, classical deterministic approaches are doomed to fail – sooner rather than later. Therefore, deterministic approaches need to be complemented with stochastic statistical approaches – better sooner than later. Do these proven approaches—used by physicists—work in enterprise finance? Yes, they do. We don’t know how “stochastic controlling” will unfold in the future. Our vision is that “stochastic controlling” will revolve around performance scoring. FP&A stands for Financial Planning & Analysis. Planning is largely determined by aspirations and ambitions rather than the capability to execute. Analysis is more or less confined to Excel massaging. In order to improve planning – advancing it from once a year to continuously – one must understand the business drivers. This means FP&A needs to understand which of their financial indicators depend on which operational process steps. Moreover, FP&A must know about each and every operational process step. Provided FP&A has access to all detailed money and goods movements during a...
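As a hedged illustration of what a stochastic, simulation-based view of a financial indicator could look like (made-up driver distributions and a toy formula, not Trufa's performance scoring):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of simulated scenarios

# Made-up operational drivers, each a distribution instead of a single plan number.
orders        = rng.normal(10_000, 1_500, n)   # units sold per period
price         = rng.normal(25.0, 2.0, n)       # realized price per unit
payment_delay = rng.gamma(2.0, 15.0, n)        # days until customers pay

# Toy financial indicator: cash inflow, penalized a little for late payment.
cash_in = orders * price * (1 - 0.001 * payment_delay)

# Instead of one deterministic plan figure, report a range with likelihoods.
p5, p50, p95 = np.percentile(cash_in, [5, 50, 95])
print(f"cash inflow: median {p50:,.0f}, 90% interval [{p5:,.0f}, {p95:,.0f}]")
```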
Too Little, Too Late? The Future of Controlling

Controlling is constantly fighting for its place within the corporation. Today, there is an opportunity for controlling to break out of its “bean counter” jail. The essence of controlling is planning and reporting. Planning is challenging because, as a Danish proverb says, “Prediction is difficult, especially when dealing with the future.” In the same vein, reporting is tough because it answers only those questions where the potential answer is already known. As a consequence, controlling contributions are perceived more or less as “too little, too late.” This is the reason why controlling has been at a standstill for more than 25 years (1990 saw the birth of the Balanced Scorecard). Now, the “business of predictions” is experiencing its second wind: artificial intelligence is back. Machine learning is the buzzword of the day. In our daily lives, we have become reliant on predictions, be it Google or Amazon recommendations of all sorts, or the car navigation system that brings us safely and quickly to our destination. Actually, I must admit that I sometimes find it nearly scary what systems know about my intentions and behavior. We abound in algorithms. Algorithms have been evolving for a long time. Machine learning was first mentioned in 1959, and artificial intelligence in 1956. I personally lived through the first hype cycle of expert systems in the ’80s. Those systems worked in highly specialized niches but they never conquered broader ground. How come? Computers were just not fast enough. Data could not be processed fast enough. This situation caused the “fall of mankind”—the introduction of OLAP technologies which, in essence, attempted...
Surfing the Wave of Big Data Analytics

Approaching big data analytics is no different than learning how to surf. Ever wonder why surfers have broad shoulders? When surfers ride a wave, the effort does not seem to rely on their shoulders, or even their legs. The true reason for a surfer’s broad upper body is that it is used to prepare for the next wave: paddling out to sea, paddling towards the peak and paddling to catch the ultimate wave. Along with upper body strength, surfers rely on balance, agility and the ability to read an incoming swell. Big data analytics is no different. Business users know big data analytics for its “cool” visualizations and “cool” findings. However, we forget all the paddling that needs to happen to get to that point. Getting to the visualizations and findings requires data integration, data collection, data cleansing, and data organization. The barriers can be daunting. How are we going to collect billions of rows? How are we going to find our way across the tens of thousands of tables and fields where our data is located? Even after data collection, how will we connect the dots between the billions of data points to expose them in a business sense? These tasks, which often represent about 80 percent of a big data project, are difficult to tackle as they require the perfect mix of processing firepower, data sources, knowledge and business expertise to weave data together in a meaningful way. In terms of effort and time, data integration is usually where the bulk of the work lies when starting big data analytics. One of the key game changers that application-based big data solution...
Reaching the efficient frontier in our business feels like having our cake and eating it too

Many of our choices are constrained by technicalities that are, more often than not, out of our control. For example, I love photography. A challenge in photography is finding the right balance to capture the perfect photo. The usual suspects behind these constraints are the laws of physics. For instance, if I want more light I have the option to extend the exposure time, but I may get blur. If I increase the aperture, I will lose depth of field. If I push up the ISO to gain sensitivity, I may get grain. In photography, success depends on finding the perfect balance of parameters for any given type of subject. Thankfully, modern cameras offer backscreens. I don’t have to second-guess my photo’s results—I can see them instantly and reshoot with new parameters if needed. Much like photography, business choices are loaded with constraints; we constantly search for the efficient business frontier to find the right balance between operational effort and financial yield. We are often faced with questions such as simplifying processes for more revenue or, for instance, improving delivery quality to speed up invoice payments. Moreover, there seems to be a constant stream of operational processes that could use further improvement: can the increase of electronic transactions make up for the IT set-up cost? Can I reduce my suppliers to lower my purchase prices without hurting procurement? The biggest constraint facing businesses today is that there are no easy ways to predict the results of our decisions. What appears to improve customer satisfaction can easily backfire and affect the entire chain with sky-high costs in logistics and production. Back then, addressing possible areas...
Finding Elegant Simplicity in Enterprise Applications

My fondest childhood memory is riding in my Dad’s convertible Peugeot 405. I was not much of a car fanatic then, and even now, but what stood out most to me was the car’s elegant design. Elegance. I heard the word every time my Dad took me around the city in his beloved old brown car. As I grew older and newer, faster cars flooded the market, I found myself never tiring of the Peugeot 405’s simple, elegant look. Not surprisingly, so did many others. During these rides, my Dad often talked about his admiration for Pininfarina—the Italian design house behind the 405, the Fiat 130 Coupe, the Alfa Romeo GTV, some of the 20th century’s greatest Ferraris and so many other classic vehicles. For him, beyond speed and power, elegance was everything: every element necessary, intentional, thoughtfully designed not only in concept, but also in the materials used to engineer the vehicle. As an engineer and statistician, he shared the same passion for elegant mathematical solutions and pushed me to solve problems the same way; putting in the extra work to solve a problem in an elegant and simple manner that makes sense not only to me, but to others who are reading it, is extremely gratifying. If you look over Pininfarina’s website, you’ll find that their design approach has not changed since 1930. Their philosophy consistently revolves around three essential requirements: elegance, purity and innovation. This has enabled them to expand their business into many industries without sacrificing their core beliefs. The 405, in my humble opinion, is the physical manifestation of this design philosophy: elegant simplicity....
A Radically New Approach to Complexity Cost

A new way to reduce complexity cost in your organization. One of the questions we frequently receive from customers is whether we can help with their complexity cost: cost caused by the number of suppliers, of products, of process variants, etc. We absolutely can: with a radically new approach that simulates the effect of complexity on free cash and profitability to derive decision criteria for taking action, supplier by supplier, product by product, process variant by process variant, etc. Complexity costs grow quadratically with the number of moving parts such as SKUs, process variants, payment terms and the like. An analysis by A.T. Kearney found that “EBIT reserves of more than €30 billion are just waiting to be tapped,” and that “companies can increase their EBIT by 3 to 5 percentage points on average.” Yet why do we have such a difficult time unraveling complexity cost issues? It begins with the term “complexity costs.” While the term correctly describes the phenomenon that complexity drives costs, it is misleading because it suggests that we have to analyze costs in order to be able to do something about them. This is wrong. Complexity is not a bad thing per se. There is “good” as well as “bad” complexity. “Good” complexity is value-adding whereas “bad” complexity is value-diminishing. Hence, the challenge is to differentiate between “good” and “bad” complexity. Our approach starts by defining “value.” Though value targets vary per enterprise, they typically revolve around the capability to generate cash and profits and thus are composed of cash (working capital) and profitability goals. That is the basis for our new idea....
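To illustrate the quadratic-growth claim with a toy calculation (under the simplifying assumption that every pair of moving parts can interact):

```python
# Illustrative only: one simple way to see why complexity cost can grow
# quadratically. If every pair of "moving parts" (SKUs, process variants,
# payment terms, ...) can interact, the number of potential interactions is
# n * (n - 1) / 2, i.e. roughly proportional to n**2.
def pairwise_interactions(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 100, 1_000):
    print(f"{n:>5} parts -> {pairwise_interactions(n):>8} potential interactions")
# 10x more parts means roughly 100x more interactions to manage.
```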
12 Rules for True Finance Applications

The Oxford Dictionary defines “principle” as “A fundamental truth or proposition that serves as the foundation for a system of belief or behavior or for a chain of reasoning.” Principles are the prerequisite for growing a well-behaved software system, too. The most famous principles in computing history are probably the “IBM System/360 Principles of Operation” (POO). Those principles have been the foundation of a whole industry and created the software world we are living in. At Trufa, we strongly believe in principles as well. Those principles have determined, and continue to determine, our daily decisions. We have even gone so far as to abstract them into a set of 12 rules in order to share them with our customers, partners and employees:
* True Numbers – All numbers are calculated rather than approximated. No bias due to aggregates or cleansing.
* All Process Details – No relevant information gets lost due to process gaps or omissions.
* No Semantic Gap – The same language throughout the whole world of data.
* Transcending Silos – All business functions look at the same source of truth.
* Forward-looking – Simulating the future by leveraging true relationship insights.
* Intuitive Usage – Deciders understand their discipline without intermediaries (e.g. data scientists).
* Instant Live – No data preparation (especially modeling) beyond access to raw data.
* “Viral” Implementation – All potential users can study the data without further ado.
* Collective Intelligence – Cloud-centrism enables collaborative decision making.
* Ready-to-run Apps – No IT (ERP/BI) projects in order to install, configure and customize the software.
* Always Up-to-Date – Frictionless and latency-free data provisioning.
* Simple Technology – Robust and reliable due to a significantly shortened technology stack.
These 12 rules are built on Edsger Dijkstra’s insight...

Practical Treasury Examples Where Predictive Analytics Can be Very Beneficial

I enjoyed contributing ideas to Amber’s blog about leveraging predictive analytics to improve business process decisions and therefore operational and financial performance. From a treasury point of view, she compares the common ratio-based analysis with the new approaches. Please review this blog post by Amber Christian from Ace Treasury Consulting at: http://blog.consultace.biz/2016_02_01_archive.html ...

Coinvention indicates the maturity of an idea (here: advanced statistics for corporate finance)

The other day we came across the following inspiring article: CFOs Frustrated with Return on FP&A Investments. It reminded us of the wise saying that coinvention – independent people inventing the same thing at the same time – is a sign of the maturity of an idea. Some three years ago we came to the conclusion that new technology advancements such as in-memory columnar databases, robust statistics and cloud computing open the way for a new approach to solving “unsolvable” corporate finance problems like those alluded to in that article. Accordingly, we developed an application solution which enables sensitivity analyses of vital financial ratios with respect to the operational efficiency and complexity (portfolio and process) of medium to large enterprises. For this we calculate arbitrary operational and financial KPIs ad hoc from every detailed business transaction and correlate them in order to determine opportunity and risk spans, as well as to automatically detect virtually all corrective-action opportunities. Key to this is our proprietary business performance score, the TPI (True Performance Index), which relates operational to financial performance by leveraging robust statistical simulation methodologies. On the one hand, our customers benefit from our customer-specific, sustainable free cash optimization recommendations (instead of just tinkering with the symptoms of too-high working capital). On the other hand, they benefit from our customer-specific pricing optimization recommendations (assessing their price elasticity patterns). And now we have introduced profitability into the picture (focusing on real profit rather than an allocation game). The results so far are amazing. Truly a new dimension of business controlling and corporate finance. P.S. By the way, did we mention that our customers went live in...

With Big Data, companies don’t have to get unnecessarily penalized for having high receivables at the end of the year.

One of the critical financial metrics on a company’s balance sheet is the accounts receivable (AR) it carries. Financial analysts and investors use this metric to determine the effectiveness of the company’s collection department and the health of its customers. Instead of using the actual value of AR, however, analysts use a key performance indicator (KPI) called Days Sales Outstanding (DSO) that is derived from the value of AR. The DSO calculation, also called the average collection period, measures the average number of days it takes a company to collect cash from its customers. The DSO metric is traditionally calculated by taking the value of the receivables at the end of a period and dividing it by the average daily sales over the period. In other words, DSO = (receivables / annual revenue) x 365. For a company whose sales are uniformly distributed throughout the year, this DSO calculation does reflect the average amount of time its customers take to pay their invoices. However, for a company that has uneven sales throughout the year, especially a company that follows a hockey stick sales pattern, this measure does not reflect the true average amount of time its customers take to pay their invoices. Before we go into an example to show why the traditional DSO measure fails to capture the real collection efficiency of a company, let’s define another metric called the “True DSO”. The True DSO number is calculated by averaging the actual length of time it takes for all the invoices of the company to get paid. Now let’s look at these two calculations in two different but...
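A minimal sketch contrasting the two definitions, with three made-up invoices for a hockey-stick year (the amounts and dates are invented purely to show the mechanics):

```python
from datetime import date

# Hypothetical invoices for a company with a "hockey stick" sales pattern:
# most revenue is booked late in the year. Amounts in currency units.
invoices = [
    # (invoice_date, amount, payment_date)
    (date(2016, 2, 10),  100, date(2016, 3, 11)),   # paid after 30 days
    (date(2016, 6, 15),  100, date(2016, 7, 15)),   # paid after 30 days
    (date(2016, 12, 20), 800, date(2017, 1, 19)),   # paid after 30 days
]

annual_revenue = sum(amount for _, amount, _ in invoices)

# Traditional DSO: receivables still open at year end / annual revenue * 365.
year_end = date(2016, 12, 31)
open_receivables = sum(amount for _, amount, paid in invoices if paid > year_end)
traditional_dso = open_receivables / annual_revenue * 365

# "True DSO" as defined above: average actual time to payment per invoice
# (unweighted here; weighting by amount would be another reasonable choice).
true_dso = sum((paid - inv).days for inv, _, paid in invoices) / len(invoices)

print(f"traditional DSO: {traditional_dso:.0f} days")   # ~292 days
print(f"true DSO:        {true_dso:.0f} days")          # 30 days
```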

Error margin of aggregate-based analysis and reporting

We are so used to basing our daily analysis and reporting on aggregated data rather than detailed data that we have forgotten to ask ourselves what kind of mistake we are making by doing so. And indeed there is a price for this sort of simplification. Albert Einstein already stated, “Everything should be made as simple as possible, but not simpler.” Building data aggregates turns out to be exactly such an over-simplification in many, if not most, cases. In fact, the error margin attributed to neglecting the detail by relying on aggregates can easily be on the order of 10%. So far so good? Not really. Because with an error margin of 10%, all improvement conclusions of less than 10% – let’s say 5%, for example – are unfounded. Those findings might be real or they might be a fallacy. What does this mean for our current management practice? How come we have been so ignorant about this issue? We just couldn’t do better with traditional data analysis technology. But now we can. So why don’t we?   Guenther Tolkmit / Chief Delivery Officer and Co-Founder at Trufa,...
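A small, made-up illustration of how an aggregate-level figure can drift away from the transaction-level truth (the numbers are invented; only the mechanism matters):

```python
# Made-up illustration of aggregation error: average payment delay per region
# (the aggregate an analyst typically sees) vs. the figure computed from the
# underlying transactions.
transactions = {
    # region: list of (invoice_amount, days_to_payment)
    "North": [(1_000, 20), (1_200, 22), (900, 25)],
    "South": [(50_000, 60), (800, 15), (700, 18)],
}

# Aggregate view: simple average of per-region averages.
region_avgs = [sum(d for _, d in rows) / len(rows) for rows in transactions.values()]
aggregate_view = sum(region_avgs) / len(region_avgs)

# Detail view: amount-weighted average over every single transaction.
all_rows = [row for rows in transactions.values() for row in rows]
detail_view = sum(a * d for a, d in all_rows) / sum(a for a, _ in all_rows)

print(f"aggregate-based figure:   {aggregate_view:.1f} days")
print(f"transaction-based figure: {detail_view:.1f} days")
# The gap between the two is the kind of error margin the post talks about.
```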
Paying late honored by steep discounts?

One of our customers found a surprising fact the other day. And it actually took her only a few hours. Here’s the scoop. Our customer did some pricing sensitivity analyses. Among other things, she checked the realized prices for customers with a 180-day payment target. And she learned that those customers are paying about 23% below the average prices. Astounding, isn’t it? Did this insight come about just by accident? You may think so. But with reporting you wouldn’t have found this top-line improvement opportunity, because reporting presupposes that you already have a certain suspicion in order to parametrize the report accordingly. In other words, all reporting is biased by nature. Hence you cannot find surprising facts with reporting at all. Furthermore, such findings require access to all detailed transactional records, in order to be able to correlate arbitrary business events in unusual ways. And this needs to be doable ad hoc. We at Trufa have specialized in top- and bottom-line sensitivity analyses. We apply advanced statistics to all business details as recorded in SAP ERP systems. We quantify opportunities and risks in dollars: in impact on working capital, realized pricing levels and achieved operating margin. We relate financial and operational performance indicators. We foster product portfolio and process complexity reductions. And we set our customers live in less than one (1) week. That’s all.   Guenther Tolkmit / Chief Delivery Officer and Co-Founder at Trufa,...
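For readers who want to picture the shape of such an analysis, here is a minimal pandas sketch over made-up sales line items (it does not reproduce the customer's actual 23% finding):

```python
import pandas as pd

# Made-up sales line items; only the shape of the analysis matters here.
items = pd.DataFrame({
    "payment_term_days": [30, 30, 60, 90, 180, 180, 180],
    "list_price":        [100, 100, 100, 100, 100, 100, 100],
    "realized_price":    [ 98,  97,  95,  93,  76,  78,  75],
})

# Average realized price (as % of list) per payment term: the kind of slice
# where a "customers on 180-day terms also get the steepest discounts"
# pattern would show up.
summary = (items.assign(realized_pct=items.realized_price / items.list_price * 100)
                .groupby("payment_term_days")["realized_pct"]
                .mean())
print(summary)
```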

What makes the life of process owners so complicated?

If you want to get end-to-end business processes under control, you are inclined to appoint process owners. Sounds logical, doesn’t it? Nevertheless, there are only relatively few process owners in a company. How come? Typically a process owner needs to argue with the line managers whose areas his process traverses. And typically this means a lot of infighting, because hard facts to decide such trade-offs are normally hard to come by. That’s a real-life problem which we incidentally solved while creating our TPI (True Performance Index). The TPI gives meaning to KPIs. The main disadvantage of KPIs is that they are not comparable. Let’s say that your KPI for shipping on time in Italy is 45% and in Mexico it is 65%. The question at hand is: where should I invest? In Italy or in Mexico? The TPI converts the KPI into the associated value, weighted by the company-specific likelihood of achieving the identified value gain. This way you can make an educated decision. By the same token, a TPI can be calculated for a whole business process and for each of its individual steps. And the beauty is that this “process” TPI can be compared with the “org unit” TPI, for example. Thus you can decide whether you gain more by investing in high-frequency business process scenarios like “rush order” versus investing in “shipping on time in Russia”, for example. Sounds cool? If you want to learn about the details, talk to us. Right away.   Guenther Tolkmit / Chief Delivery Officer and Co-Founder at Trufa,...
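The TPI itself is proprietary, so the following is only a toy sketch of the general idea described above (a KPI gap converted into money and weighted by an assumed likelihood of capturing it); every figure is invented:

```python
# Toy illustration only: the actual True Performance Index is proprietary.
# General idea from the text: convert a KPI gap into a monetary value and
# weight it by the (company-specific) likelihood of actually capturing it.
def expected_value_gain(kpi_now: float, kpi_target: float,
                        value_per_point: float, likelihood: float) -> float:
    """Expected monetary gain of closing the KPI gap."""
    return (kpi_target - kpi_now) * value_per_point * likelihood

# Shipping on time, with made-up target, value and likelihood figures:
italy  = expected_value_gain(0.45, 0.90, value_per_point=2_000_000, likelihood=0.3)
mexico = expected_value_gain(0.65, 0.90, value_per_point=3_000_000, likelihood=0.7)

print(f"Italy : expected gain {italy:,.0f}")
print(f"Mexico: expected gain {mexico:,.0f}")
# The comparison now happens in money, not in raw KPI percentages.
```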

Of Data and Gut Feelings

Weekend, football. What does the coach of the favored team do when his side unexpectedly falls behind? He brings on more attacking players. That seems plausible. That is what the stands demand. From a study in behavioral economics, based on the results of 8,200 matches from two major leagues over 12 years, we learn that this strategy does not promise success. For behavioral economists, the takeaway is that people are not always capable of maximizing their own utility. But perhaps coaches are in the same position as many other decision makers: until now, nobody simply knew that this strategy does not lead anywhere. And so the recommendation is: verify your gut feeling against the data more often. Then things will work out, both in the company and on the football pitch. Link to the press release of Johannes Gutenberg University Mainz: http://idw-online.de/de/news617519   Ralph Treitz / Chief Operation Officer and Co-Founder at Trufa,...

What is wrong with Business Intelligence? Do we need Enterprise Data Lakes?

Everybody knows that you have to perform the following vital tasks in order to solve business intelligence problems:
* You have to involve your IT.
* Your IT has to model your data (build a data mart by specifying how your data are to be aggregated).
* Your IT might have to harmonize your master data.
* Your IT has to cleanse your source data.
* Your IT has to populate your data mart with a batch run.
* Your IT has to train you how to use your data mart.
In essence, you need to know what you want to know before you get your answers. There is something wrong with this picture. What if you could start the other way round? What if you don’t know the answers before asking your questions? Would a data lake be the better approach? What is a data lake? “A massive, easily accessible data repository built on (relatively) inexpensive computer hardware for storing ‘big data’. Unlike data marts, which are optimized for data analysis by storing only some attributes and dropping data below the level aggregation, a data lake is designed to retain all attributes, especially so when you do not yet know what the scope of data or its use will be.” (http://en.wiktionary.org/wiki/data_lake) An enterprise data lake could shift the upfront IT effort to later, or sometimes forgo it altogether. You could experiment with your data right away, and determine later where and when your IT gets involved. Sounds like a pipe dream, doesn’t it? Talk to us if you are curious to learn how our customers are leveraging...

“Unbiased Reporting” is an Oxymoron

Most enterprises continue to rely on reporting as their source of business intelligence. In this context, reporting is nothing more than sifting through the records of ordinary business activity on a more or less regular schedule. The result of reporting is a list with graphics, and/or a dashboard combining both. That’s our daily reality when it comes to understanding what we are doing in our business. What does “unbiased” mean in this context? Google defines “unbiased” as “showing no prejudice for or against something; impartial,” with synonyms such as impartial, unprejudiced, neutral, nonpartisan, disinterested, detached, dispassionate, objective, value-free, open-minded, equitable, even-handed and fair (“we need an unbiased opinion”). Taking this definition into account, it becomes obvious that you require “unbiased reporting” if you want to understand what is really going on in your business. Now the crux is that “unbiased reporting” is impossible with our current business intelligence approaches. How come? Whenever you design a report you are making assumptions about your business reality. You are making assumptions about how your business is structured. You are making assumptions about how your business processes flow. You are making assumptions about critical thresholds of your business, regarding money as well as duration, because IT technically requires these parameters for setting up your lists and dashboards. In essence, you are creating a model of your business in order to apply current business intelligence technologies. The unavoidable and thus unfortunate dilemma with this approach is that there is no way to prove that your model indeed correctly reflects your business reality. This might sound a bit theoretical, but we have encountered such phenomena multiple times in our customer base. For example there was a...