Friday, October 9, 2015

Keep Your Factory Humming with Tableau

Download my TC15 Demo Worksheet ... Sample Data  

How well does your factory run? 

Do you ever have to pause production while you wait for parts to arrive?  Unexpected material shortages are the bane of a supply chain planner's life.  These planners need tools to forecast whether they have enough supplies ordered to meet future demand.  While inventory and supply may be tracked in a corporate ERP system, the planning work often falls to an analyst using spreadsheets and manual processes.  There is a huge need for systems that predict and react to gaps in future supply, to maintain peak manufacturing efficiency.

Tableau is a powerful analytics application that helps users better understand their data.  Analysts and developers can use special functions called Table Calculations to build forward-looking models of supply and demand, spot shortages, and suggest remedies.  Used in this way, Tableau can serve a vital role in helping supply chain departments self-correct as the demand for finished goods changes.
Supply Events

As a Business Intelligence developer, I have prepared a demo dashboard called "Can Build" that demonstrates the power of Tableau Table Calculations to be both descriptive and prescriptive.

  • We need to predict inventory and best possible production rates, both for today (Week 0) and in future weeks (Week N).
  • Our factory builds one type of finished good product, with a defined Bill of Materials.  The BOM lists the parts and quantities required to build one unit.  For the demo, we are building Breakfast Sandwiches, using bread, egg, and cheese parts.
  • Supply is a listing of part names, quantities, and dates.  Supply for week 0 is inventory currently on hand, while supply for week 3 is an expected delivery from a vendor, due three weeks from now.

To arrive at a good understanding of our factory capacity, a series of aggregations and calculations are needed.  Starting with a dataset of supply events, we stack table calcs together to project inventory, find constraints and bottlenecks, incorporate assumptions about the factory's maximum weekly rate, and display a visualization of production by week.

Factory Performance Prediction

Supply events plotted by week:

Projected part balances using RUNNING_SUM table calc:
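The RUNNING_SUM step can be sketched outside Tableau as a cumulative sum over each part's supply events.  A minimal Python analog, using made-up supply numbers (not the actual demo data) and the sandwich parts from the BOM:

```python
from itertools import accumulate

# Hypothetical supply events per part, indexed by week 0..7
# (quantities are illustrative, not the actual demo data)
supply = {
    "Bread":  [10, 0, 0, 8, 0, 0, 0, 28],
    "Egg":    [12, 0, 6, 0, 0, 10, 0, 0],
    "Cheese": [15, 0, 0, 0, 9, 0, 0, 0],
}

# RUNNING_SUM analog: projected cumulative balance of each part by week
balance = {part: list(accumulate(events)) for part, events in supply.items()}

print(balance["Bread"])  # [10, 10, 10, 18, 18, 18, 18, 46]
```

Each part's line in the chart is simply this running total plotted against the week axis.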

Removing the discrete supply events, we combine the part supply (part capacity) projections onto a single page:

Time for our second table calc!  Find the lowest of these lines and call it the Bottleneck, or Capacity Min.  This is the grey line that traces the lowest projected part each week.

This bottleneck, called CapMin, on its own:
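The bottleneck line is just the week-by-week minimum across the part projections.  A sketch in Python, continuing the illustrative balances above and assuming a BOM of one of each part per unit (the real demo's BOM quantities may differ):

```python
# Cumulative part balances by week (illustrative; BOM assumed 1 of each part per unit)
balance = {
    "Bread":  [10, 10, 10, 18, 18, 18, 18, 46],
    "Egg":    [12, 12, 18, 18, 18, 28, 28, 28],
    "Cheese": [15, 15, 15, 15, 24, 24, 24, 24],
}

# CapMin: the lowest projected part balance each week (the bottleneck line)
cap_min = [min(week) for week in zip(*balance.values())]
print(cap_min)  # [10, 10, 10, 15, 18, 18, 18, 24]
```

With a BOM quantity greater than one, each balance would first be divided by the per-unit quantity before taking the minimum.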

Next, a nominal "Best Case" rate of 5 units per week.  This is the factory's maximum output, and we want to run the factory as close as possible to this rate.  It is adjustable via an Input Parameter.

Plotting CapMin and BestCase together, we get an idea of where our supply is ahead of or behind our best production rate.  For example, in Week 2 we will be running well below peak.

In order to better understand these competing capacity lines, let's flatten the BestCase goal to zero by subtracting it from the CapMin.  The purple line clearly shows where our CapMin goes above or below our BestCase goal.

When our CapMin line goes below our goal (shown as negative values here) our factory will be running at less than 100%.  The 4 units of capacity lost in week 2 are lost forever, never to be regained.  But how can we isolate these lost capacity levels moving forward in time?  Time for another table calc!  Use RUNNING_MIN to get the lowest value of this line.  Throw out any values greater than zero.
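The flatten-and-RUNNING_MIN steps can be sketched as follows, again with illustrative numbers; the cumulative goal is the 5-per-week rate summed over the weeks:

```python
from itertools import accumulate

cap_min   = [10, 10, 10, 15, 18, 18, 18, 24]  # bottleneck balance by week (illustrative)
best_case = list(accumulate([5] * 8))          # cumulative goal at 5 units/week

# Flatten the goal to zero: gap between the bottleneck and the goal
gap = [c - b for c, b in zip(cap_min, best_case)]

# RUNNING_MIN analog over the gap, discarding values above zero:
# capacity lost so far, never to be regained
lost = [min(m, 0) for m in accumulate(gap, min)]
print(lost)  # [0, 0, -5, -5, -7, -12, -17, -17]
```

Once the running minimum dips negative, it can only stay flat or fall further, which is exactly why lost capacity is lost forever.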

Call this grey line "Lost Capacity".  Compare it with our nominal goal of 5 per week, then combine the two.

The red line is called Actual. This is what our factory can build.  Plot it with our original Best Case goal line.

Analyze the weekly changes in the red "Actual" line to get the discrete build quantity for each week, plotted via the squares:
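The Actual line and its weekly deltas can be sketched like so, using the illustrative numbers from the earlier steps; the delta step mirrors Tableau's calc - LOOKUP(calc, -1):

```python
best_case = [5, 10, 15, 20, 25, 30, 35, 40]     # cumulative goal (5/week)
lost      = [0, 0, -5, -5, -7, -12, -17, -17]   # running lost capacity (illustrative)

# Actual cumulative output = goal plus (negative) lost capacity
actual = [b + l for b, l in zip(best_case, lost)]

# calc - LOOKUP(calc, -1) analog: weekly delta of the Actual line
# gives the discrete build quantity for each week
builds = [actual[0]] + [actual[w] - actual[w - 1] for w in range(1, len(actual))]
print(builds)  # [5, 5, 0, 5, 3, 0, 0, 5]
```

The builds sum back to the final value of the Actual line, which is a handy sanity check on the model.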

Wow, we now have a slick dashboard that predicts our factory output.

We can model any desired factory rate (via an input parameter) and see how the factory will perform, given the state of supply.  Our dashboard was built in several steps, or layers.  I call this the "Pancake" approach.  Just like a stack of pancakes, our model gets better and better the higher we stack it.  Many of the layers use simple techniques such as subtraction, multiplication, and taking the lower of two values.  The Tableau table calculations we used are summarized here in blue.  These calculations are available in Tableau 8 and 9, and do not use Tableau 9 Level of Detail (LOD) calculations, although those are very cool too.

What, my factory is down again?
The viz works great, but the news is not good.  The assembly line team will have idle time, as we will not be able to run at full capacity.  But, all is not lost.  There is still time to do something.  The problems do not hit us until week two.

What Ifs

We can manually review the bottleneck parts and either place new orders or perhaps expedite existing orders.  For example, the blue line for Bread could use a bump up in week 3 to lift it out of a long stretch of being the #1 shortest part.  This could be accomplished by requesting an earlier delivery of the 28 pieces due in week 7.  Even if the vendor requested a small expedite fee, it may well be less than the potential benefit of boosting overall production.

Modeling potential updates to supply can be done by hand, but can Tableau automatically calculate the exact list of orders needed to run at our desired rate?

Recommendation Engine

How can we react to expected part shortages?  Below we have our original part supply lines showing expected inventory.  The second chart shows these same lines after subtracting all the parts needed to build at our Best Case rate of 5 units per week.  Observe that some part balances dip below zero and turn negative (the shaded area).

To zoom in on these negative dips, let's ignore any positive balances and examine only the negatives.  Let's further find the lowest low of each part going forward: we want to measure each time a part projection hits a new low.  This can be drawn using RUNNING_MIN.

Assume that each time we hit a new low, we want to place a corresponding new order to fix the shortage and bring that part balance back up to zero.  The expression calc - LOOKUP(calc, -1) allows us to break out these weekly deltas as discrete data points (the circles below):

These discrete new part lows can be flipped positive to provide a list of recommended orders.  This can be displayed in a tabular form.  We now have a list of Parts, Quantities, and Due Dates.  If executed, these orders will fix all the shortages and ensure our factory runs at 100%.
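The whole recommendation step can be sketched as: clip the net balance's running minimum at zero, then turn each new low into an order for the delta.  A Python analog with an illustrative net-balance series for one part (BOM assumed 1 per unit, goal 5/week):

```python
from itertools import accumulate

# Net projected balance per part after subtracting Best Case demand
# (illustrative numbers, not the actual demo data)
net = {
    "Bread": [5, 0, -5, -2, -7, -12, -17, 6],
}

orders = []  # (part, quantity, due week)
for part, series in net.items():
    # RUNNING_MIN analog clipped at zero: each new low is a shortage to cover
    lows = [min(m, 0) for m in accumulate(series, min)]
    prev = 0
    for week, low in enumerate(lows):
        if low < prev:                        # a new low: order the delta, flipped positive
            orders.append((part, prev - low, week))
            prev = low

print(orders)  # [('Bread', 5, 2), ('Bread', 2, 4), ('Bread', 5, 5), ('Bread', 5, 6)]
```

Placing these orders lifts every future balance back to zero or above, which is exactly the condition for the factory to run at 100%.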

Finally, we can craft a SQL code string based on our recommendations, such that we could copy and paste the order lines directly into our ERP system database.  Wow, Tableau can write code!  What can't it do?
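Generating the order lines amounts to simple string concatenation.  A sketch, using a hypothetical PurchaseOrders table and column names (the real ERP schema would differ):

```python
orders = [("Bread", 5, 2), ("Bread", 2, 4)]  # (part, quantity, due week) — illustrative

# Build INSERT statements for a hypothetical ERP order table
lines = [
    f"INSERT INTO PurchaseOrders (Part, Qty, DueWeek) VALUES ('{part}', {qty}, {week});"
    for part, qty, week in orders
]
sql = "\n".join(lines)
print(sql)
```

In Tableau itself this is a calculated field that concatenates the part, quantity, and date columns into one text column for copy-and-paste.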

Uses for Can Build Forecasting

Many business scenarios can benefit from a supply-versus-demand forecasting solution.  It can be used any time a sustained future campaign requires adequate supply to meet anticipated demand.

  • Manufacturing supply chain planning
  • Catering planning
  • Restaurant planning
  • Military logistics planning
  • Retail planning and buying

Value of the solution:

In the context of a factory, the benefits of this solution are many.  The primary benefit is to reduce the downtime of a factory due to poor planning or an inability to react to changing demand.  Secondary benefits include reducing the need for an analyst to build manual forecasting models in Excel, which are cumbersome and error prone.  Another benefit is to reduce the need to purchase an expensive APS (Advanced Planning and Scheduling) module from your ERP vendor.  Finally, with the recommended reorders and SQL code generation, the need for buyers to execute orders is reduced.

In summary, a good forecasting solution can save a large factory millions of dollars per year by eliminating preventable downtime.  Keep your Factory Humming with Tableau!

by Robert McCauley
BI Engineer

Friday, September 26, 2014

"See the Future with Running Totals" Presentation Download

Hi!  Here are the links to download my PASS BI presentation from Google Drive: 

Script install instructions:
Run CreateDemoEnvironment.sql to create a new database called PASS_DEMO, with tables and views.

Table Name  Rows   
RT          1005967
DimDate     18629
POItem      324
COItem      39
BOM         18
Item        13
Inventory   11
PO          6
CO          5

Saturday, July 26, 2014

See the Future with Running Totals

Hi, it's me, checking in again with an update! I'm preparing a talk on Running Totals.  Did you know that T-SQL and SSRS both have functions to help calculate running totals?  I use these in my day job as a data warehouse developer for a manufacturing operation.  We compare all the planned Demand and Supply for the upcoming few months, and need to predict whether we will have all the parts in time to build what is needed.
Stay tuned to this space to learn more about this topic, including links to a presentation I am preparing for an upcoming SQL Saturday.

Friday, October 28, 2011

Code Camp 16

I'll be presenting a talk titled "New TSQL Features in SQL Server 2008 and 2012" at Code Camp #16 at Microsoft's Waltham office, Oct 29th 2011.

Here are links to my PowerPoint slides and example code.

It's been quite a year for me as a database software engineer, phew!

In October 2010, the previous database engineer quit, and I inherited a ton of design and development work, as well as supporting a continuous integration build, deploy, and release environment. Several different projects were started within the company that required new SQL Server databases, schemas, and procedures. A reporting solution was desired, which gave me the opportunity to design my first data warehouse from scratch. Referring to the Ralph Kimball school of warehouse design, I created a star schema of dimension and fact tables, and associated SSIS packages to load and refresh everything. An existing legacy custom .NET application was in place to create and view reports. I proposed we switch to SQL Server Reporting Services to get out of the "report builder" business. I built a set of SSRS reports to surface key business metrics from the warehouse.

During the year I made a point of attending the New England SQL Server User Group meetings in Waltham, and attended the local SQL Saturday events as well.

Along the way, I spent my free time pursuing a SQL Server certification. After much studying, I took and passed the 70-433 (TSQL) exam in June.

This October I attended the SQL PASS Summit in Seattle where I "drank from the firehose" of SQL Server knowledge. There were many great presentations and the chance to learn from luminaries in the field such as Itzik Ben-Gan, Adam Machanic, Dr. David DeWitt, Kalen Delaney, Paul Randal, Kimberly Tripp, Brent Ozar, and many others.

Back at work, I've been shifted from the Data Warehouse project into a new project to design and build a high-volume OLTP application database.

Oh, and along the way my division, Iron Mountain's Digital division, was sold to Autonomy. Autonomy itself was then soon acquired by HP.

So it's been a whirlwind 12 months. I can't wait to find out what 2012 has in store!

Thursday, October 7, 2010

SQL Server Query Statistics

Before issuing a query in SQL Server Management Studio, you can apply settings to show statistics about how the query is executed.

SET SHOWPLAN_TEXT ON shows a text-based estimated execution plan instead of running the query.

SET STATISTICS IO ON to see:
  • Scan count
  • Logical reads
  • Physical reads
  • Read-ahead reads
  • LOB read stats

SET STATISTICS TIME ON to see:

Execution Times:
CPU time, elapsed time
Parse and compile time:
CPU time, elapsed time