Silicon Valley State of Mind

Tips, thoughts, and advice based on the consulting work of John Weathington, "The Science of Success."

Blog posts tagged in six-sigma

Posted in Operational Excellence

Because of my experience and credentials as a Six Sigma Black Belt, I’m often called into companies to help them improve a process that shouldn’t be improved. I was recently on a Six Sigma effort where the process was so broken we couldn’t even establish a baseline. That’s a good clue that you’re heading down the wrong path. You cannot improve a defective process—you need to replace it.

Most people identify Six Sigma with process improvement (i.e., DMAIC[1]); however, there is another part of Six Sigma that deals with process development (i.e., DMADV[2]), called Design for Six Sigma, or DFSS[3] for short. Although the two look similar side by side, the execution is very different. For instance, both have a measure phase following their define phase, but with DMAIC a key goal of the measure phase is a baseline, and with DMADV that goal doesn’t exist. Instead, you’ll focus more on obtaining a crisper understanding of how the new process will be measured.

To decide which path to pursue, ask yourself whether you have an efficiency problem or an effectiveness problem. If the process works but the results are coming out too slowly, you have an efficiency problem that calls for DMAIC. If, however, the process doesn’t work at all, you have an effectiveness problem that calls for DMADV, which is more along the lines of process innovation.

Trying to improve a dysfunctional process is like changing the oil in a blown engine. It doesn’t make any sense. Before you start a process improvement effort, make sure you first have a process to improve. If you don’t, it’s best to just start over with a new process.


  1. DMAIC stands for Define, Measure, Analyze, Improve, Control, and represents the major phases of a Six Sigma process improvement effort.

  2. DMADV stands for Define, Measure, Analyze, Design, Verify, and represents the major phases of a Six Sigma process development effort.

  3. For most intents and purposes, DMADV and DFSS can be used interchangeably to represent process development using Six Sigma techniques. For those who care, DFSS is more of an objective-based characterization, and DMADV is more of a process-based characterization.


Posted in Operational Excellence

Have you ever tried to drive in white-out conditions? I remember driving to Reno once, and the falling snow was so blinding that the only things I could see were the red taillights of the semi-trailer truck in front of me. If that truck had gone over the side of the mountain, I would have followed right behind. We all know it’s important to be clear about your objectives and goals; however, it’s also important to be clear about your progress. To clearly define your key performance indicators (KPIs), you need operational definitions.

Operational definitions are about means as opposed to ends; they’re used to clarify key performance indicators, which are used to gauge progress. A strategic vision is a desired strategic outcome, and it’s good to have a clear vision, but you also need to know whether you’re moving in the right direction. I just wrote an article for TechRepublic on how to define Big Data to build a competitive advantage. In it, I dissect what it means to be strategically competitive and overlay that with how Big Data can foster those objectives. In going through this exercise, I help you realize that competitive Big Data must be valuable to one of your target markets. That’s all well and good, but how can you tell if your Big Data is valuable? To properly answer this question, you must create an operational definition.

The key to creating an operational definition is to be precise. Operational definitions come from the world of Total Quality Management, and they’re a required component in any Six Sigma project. As a Black Belt, when I’m building the data collection plan for a Six Sigma project, I spend a considerable amount of time precisely defining what each measurement means. The result is a set of operational definitions that are used to collect, analyze, and monitor the metrics that are critical to quality (CTQ). The same must be done for your important efforts.

When building an operational definition, consider the acronym ACT: accuracy, completeness, and time. An operational definition will usually fall into one of these three categories. Accuracy deals with how closely your measurement comes to a desired target. This is extremely common; in our Big Data example, you may consider creating a value index that’s based on customer feedback. Completeness deals with coverage. An example is testing software, where you want to measure how much of the code has been tested. Since time is such a common measurement, it has its own category. And, since time is on a continuum, operational definitions that deal with time always have an upper and a lower bound.
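
To make ACT concrete, here’s a minimal sketch in Python; the names, targets, and thresholds are invented for illustration, not prescribed by Six Sigma:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class OperationalDefinition:
    """A precise, testable definition of one CTQ measurement."""
    name: str
    category: str  # "accuracy", "completeness", or "time"
    unit: str
    meets_expectation: Callable[[float], bool]

# Accuracy: how close a measurement comes to a desired target.
value_index = OperationalDefinition(
    name="customer value index",
    category="accuracy",
    unit="score (1-10)",
    meets_expectation=lambda x: abs(x - 8.0) <= 1.0,  # hypothetical target of 8, plus or minus 1
)

# Completeness: coverage, e.g. how much of the code has been tested.
test_coverage = OperationalDefinition(
    name="test coverage",
    category="completeness",
    unit="percent",
    meets_expectation=lambda x: x >= 85.0,  # hypothetical coverage floor
)

# Time: bounded on both sides, since time is on a continuum.
cycle_time = OperationalDefinition(
    name="order cycle time",
    category="time",
    unit="hours",
    meets_expectation=lambda x: 2.0 <= x <= 48.0,  # hypothetical lower and upper bounds
)

print(value_index.meets_expectation(7.5))   # True: within a point of the target
print(test_coverage.meets_expectation(60))  # False: below the coverage floor
print(cycle_time.meets_expectation(36))     # True: inside both time bounds
```

However you encode them, the point is the same: anyone collecting the data should arrive at the same number, measured the same way, every time.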

Knowing where you’re going is great, but knowing you’re headed in the right direction is just as important. Don’t run your strategy in white-out conditions; you might head over a cliff.


Posted in Program Management

Happy Friday, everyone!

As we’re wrapping up another week, I thought I’d share a few thoughts on—wrapping up. There are lots of things you could wrap up: a conversation, meeting, client engagement, project, program, or even a large-scale strategic implementation. In all cases, it pays to intentionally spend some time reviewing what happened. In Deming’s classic Plan, Do, Check, Act management method, this would be the Check phase. Six Sigma, in true form, has formalized this process into something called a Plus Delta.

Although more structured than an ad-hoc review, the Plus Delta is really a loosely defined construct. The Plus part attempts to discover what went right, and the Delta part attempts to discover what could be improved. Notice that there’s no Minus; this is intentional.

Concerning yourself with what went wrong has three problems. First, it’s negative. Call me a typical California bliss-head if you must, but negative talk is just not empowering. Second, it encourages ratholes. For some reason, people love to complain about what went wrong, and it’s easy for the discussion to devolve into a huge moan-and-groan session if you start down this path. Third, and most importantly, there’s very little value in what went wrong. It’s like telling a restaurant server who’s taking your order, “Well, I don’t like lima beans or liver.” Okay, but that doesn’t say much about what you would like to eat.

Exploring a Delta, or what could be improved, is a much more constructive conversation. However, don’t let yourself or your collaborators spin this into a Minus conversation. After the Plus, it’s easy for people to assume it’s time to talk about the Minus, even if you call it a Delta. This is not clever management-speak; Deltas and Minuses are two different topics.

Finally, if this session has a title, please don’t call it a post-mortem. As you probably know, that’s Latin for “after death.” What kind of cynic came up with this crazy term? I’d rather call the session a post-victoria!

So, the next time you hold a post-victoria, try the Plus Delta format. You’ll find it sets you up nicely for the next time around.

Have a great weekend, everyone!


Posted in Innovation

People love the tried and trued. There’s great comfort in knowing a process is in place, and it works. “If it ain’t broke, don’t fix it,” right? I agree, as long as you’re sure it ain’t broke.

I went to Lawrence’s Meat Company in Alamo the other day, the best place to buy meat in the area. It’s the closest thing you’ll find to a real butcher shop around here—a concept that never should have been superseded by the huge supermarkets. Amazingly enough, they’ve been around since 1887, and they’ve probably done an outstanding job since then. I can certainly attest to the quality of their meat today; this is my only consideration when planning any serious meal.

While there, I noticed some BBQ sauce they were selling on the counter. Apparently, it was created in 1849, so it’s even older than Lawrence’s. For some reason, I was lured by the age of the BBQ sauce, assuming that the “old west” had great BBQ sauce and that a flavor that’s been around so long must be good.

I was wrong.

I know everybody has their own taste, but this one isn’t mine. Maybe “old west” BBQ sauce isn’t all that great, or maybe this company just got it wrong, but after 163 years I would hope they could create something better tasting than this. I’m not sure how in touch they are with their customers, but a controlled approach to their customer experience might be in order.

In Six Sigma, there’s a tool called a control plan. A control plan is an operational tool that measures how your process is performing. You can overlay this information with your process expectations (when you do this it is not a control plan anymore, but that’s a different topic) to see if your process is consistently meeting expectations.
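
To illustrate the overlay idea, here’s a rough sketch in Python. The readings and limits are invented, and a real control plan involves far more than this; the point is simply to compare the voice of the process (limits computed from the data) with the voice of the customer (the expectations you set):

```python
import statistics

# Hypothetical weekly customer-satisfaction scores collected per the control plan.
readings = [7.9, 8.1, 8.0, 7.7, 8.2, 7.8, 8.3, 8.0, 7.6, 8.1]

# Voice of the process: control limits computed from the data itself.
mean = statistics.mean(readings)
sigma = statistics.stdev(readings)
ucl = mean + 3 * sigma  # upper control limit
lcl = mean - 3 * sigma  # lower control limit

# Voice of the customer: expectation limits (also invented for illustration).
usl, lsl = 10.0, 7.5

print(f"process runs between {lcl:.2f} and {ucl:.2f} (mean {mean:.2f})")
print(f"customers expect between {lsl} and {usl}")

# The overlay: is what the process does inside what the customer expects?
if lsl <= lcl and ucl <= usl:
    print("process is consistently meeting expectations")
else:
    print("process can drift outside expectations; time to investigate")
```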

Of course, your expectations should be a reflection of what your customers expect. So, to ensure you’re meeting or exceeding your customers’ expectations, I advise that you collect some data from your customers on a periodic basis, and juxtapose it with your control plan.

This simple sensor should drive your entire organization. You must actively manage your markets’ expectations, and you can’t manage what you can’t measure. This also becomes the foundation for a good customer relationship strategy.

So, always challenge how trued your tried process is, especially when it comes to your customers. If you aren’t meeting or exceeding their expectations, what category would you guess remains?


Posted in Operational Excellence

A few days ago, I completed an experiment to see if it was safe to shower with my iPhone; here’s an update. Anxious to start my new morning routine in stereo, I jumped in the shower, cranked up the Wall Street Journal, and went about my business. In very short order, I realized there was a slight glitch in the production rollout of this process, and I needed a hot-fix. I found myself in a situation similar to my blog rollout, only a little less serious.

Everything I tested worked fine, so no worries about the condition of the phone; however, I didn’t test for sound quality. The sound played fine through the iPhone; it just wasn’t loud enough to overcome the noise of the shower. In retrospect, even if I had tagged this as a CTQ (critical to quality—it’s a Six Sigma term), there’s no reasonable way I could have tested for it. So, I had to make an in-flight adjustment (aka a hot-fix, and based on the temperature of the shower, this was literally a hot-fix).

I quickly searched Amazon for a Bluetooth shower speaker, and I came up with the Hipe Waterproof Bluetooth Stereo Shower Speaker. With a product name like this, how could I go wrong?!

I was right: problem solved. This thing is awesome. I’ve already tried it a few times, and this speaker cranks; it’s much louder than I need for the shower.

When I posted the news of my original experiment, Olaf, a good friend of mine from grade school, suggested that I post a picture from the shower to see what the quality’s like, so this one’s for Olaf! This is the Hipe hanging on the wall, a few seconds after I finished my shower. As you can see, the water’s not really an issue for picture quality; however, the steam fogs up the lens. If I had tried to take this picture during my shower, it would have been a big blur. Fortunately (for everyone involved), I don’t anticipate taking many pictures while in the shower!

So, there you go: test, release, hot-fix, success! All is good.


Posted in Operational Excellence

For those of you wondering whether or not it’s safe to shower with your iPhone, I have the answer—well, at least a validated thesis. The thought originally crossed my mind as a way to be efficient in the morning. I listen to the Wall Street Journal podcast every morning, and of course I shower every morning, so I always felt it would be great to multi-task these two processes.

Of course, I could stream the audio with A2DP over Bluetooth; however, I wanted to have the iPhone close enough to check emails or jot notes while I’m in the shower. The shower is where all my greatest ideas come to me, and the worst place for them to show up!

So the first challenge was to make the phone waterproof, which is where the LifeProof Case comes in. Since I have full faith and confidence in my LifeProof case to keep the phone dry, the next challenge was heat—can the phone withstand the heat of the shower?

A quick call to LifeProof support offered zero help. The customer service lady was really nice, but her answer was, “I’m sure the case can withstand the heat, but I really can’t speak for the phone inside the case.”

Really?

My reply was, “Did you think I was going in the shower with just the case and no phone?”

Okay, no help from LifeProof, so I had to set up my own experiment. The one good thing LifeProof support did tell me is Apple’s stated upper threshold for heat: 95 degrees Fahrenheit. As a safety valve, I know the iPhone will try to shut itself off before it overheats, but I didn’t want to take any chances.

I started by ordering a 3M shower caddy and two very inexpensive Acu Rite thermometers (actually, they’re sold as humidity monitors, but I was just interested in the temperature readings). I’ll explain why I bought two in a minute. I could have gone with a sauna thermometer, but those were much more expensive. By the way, the cheap thermometers are not waterproof, but I didn’t care: if the shower splash and steam ruined both of them, I’d be out less than half the money it would cost to buy a sauna thermometer. I was prepared for that—no worries. Fortunately, they’re still working fine today, so the risk was well taken.

For the last seven days, I’ve taken a shower with both thermometers in the caddy. I would periodically check on the readings to satisfy my curiosity, but at the end of the shower I would record both temperature readings (and quickly dry off the devices). Today was the last day of my experiment, and my working theory is: yes, you can safely take a shower with your iPhone.

Here’s how I know.

Knowing I’m a Six Sigma Black Belt, a lot of people ask me how many data points make a trend. They know it’s not one or two—so what is the right answer? To be honest, it depends; however, a good rule of thumb I use is seven. In an experiment like this, I don’t expect the variation to be wide, so seven is a good number.

So why two thermometers instead of one? The answer is measurement error. This is something many people forget, which is why Six Sigma is one of the few methodologies that formalizes the process of uncovering measurement error. There’s no way to know the true temperature; you can only read what’s on your measuring devices. All measuring methods and devices have some degree of error, so it’s important to always understand how much measurement error is in your approach. On some days, the two thermometers would read exactly the same; however, on days like today, the readings were different—as much as two degrees apart. Which one is right? There’s no way to know. Both could be wrong. The best you can do is take an average and observe the spread (the difference between the two readings). Since the two readings were never more than two degrees apart, I’m comfortable using that as a measurement tolerance.

To summarize the results of the experiment: the average temperature over the seven days was about 85 degrees, and the highest reading was 88 degrees. In the worst-case scenario based on my data, factoring in my 2 degrees of measurement error (88 + 2 = 90), I’m still a good 5 degrees below Apple’s upper threshold of 95 degrees (which I assume is a conservative limit). So, my conclusion is: shower with confidence!
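
For anyone who wants the arithmetic in one place, here’s a minimal sketch in Python. The paired readings are hypothetical; the summary numbers are the ones reported above:

```python
def combine(reading_a: float, reading_b: float) -> tuple[float, float]:
    """Best estimate and spread from two imperfect thermometers."""
    return (reading_a + reading_b) / 2, abs(reading_a - reading_b)

# A hypothetical day where the two thermometers disagree by two degrees.
best, spread = combine(87.0, 89.0)
print(f"best estimate: {best:.1f} deg F, spread: {spread:.1f} deg F")

# Summary numbers from the seven-day experiment.
max_reading = 88.0   # hottest single reading observed
tolerance = 2.0      # worst disagreement between the two thermometers
apple_limit = 95.0   # Apple's stated upper operating threshold

worst_case = max_reading + tolerance  # 88 + 2 = 90
margin = apple_limit - worst_case     # 95 - 90 = 5
print(f"worst case: {worst_case:.0f} deg F; margin below the limit: {margin:.0f} deg F")
```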

I ran a pre-test earlier with the iPhone in the caddy, and everything checked out okay! Tomorrow I get to enjoy my shower with the Wall Street Journal blasting through the speaker, with full confidence that my precious iPhone won’t drown or fry as a result. I just hope my wife doesn’t object to an early newscast coming through the walls!
