

A Career Change at 70

When I was in high school (1962 – 1966; it took me three and a half years to escape), there were no computer classes. Although there seems to be some disagreement about when the first personal computer was invented, even the earliest claim puts the date six years after I graduated. There was no such thing as a computer class, let alone a programming or coding class. I did not attend university and, to my recollection, nobody I knew at the time was interested in computers or information technology. Frankly, I was a terrible student and wasn't interested in much of anything by the time I finished high school.

The IBM Memory 50 Typewriter

Fast forward to 1974. Despite having no undergraduate education, I had managed to secure admission to an accredited law school not far from my home in the San Fernando Valley, beginning in the fall of 1973. The following year I got a job as a legal secretary/law clerk in the office of a sole practitioner in Beverly Hills. Shortly after I began, the lawyer I worked for purchased an IBM Memory 50 Typewriter, and I attended a one-day class to learn how to use it. This was my first introduction to anything resembling "computing."

The office later upgraded to an Artec Display 2000, which had an LED readout of approximately 30 characters; there was no CRT display. It used two 8″ floppy disks and had a document/data merge capability that made it perfect for boilerplate documents, e.g., pleadings, interrogatories, and summonses. It was a great leap forward in word processing.

Edward Ladd & Sons, the family's wholesale food business

Shortly after graduating from law school I decided, for numerous reasons, that spending the rest of my working life around the judicial system was not something I had my heart in. After much gnashing of teeth and weighing of alternatives, I decided to join my family's wholesale food distribution business. One large factor in that decision was my father suffering his second major heart attack. The business was supporting my mother and my sister, who was only 10 years old at the time. I felt the need to help the business grow, ensuring they would be taken care of if my father were to die . . . which he did eight years later.

Our company jackets. Logo design by me, jackets created by Cat's Pyjamas.

After a couple of years, the business had grown substantially and, given my desire for another type of challenge, I once again struck out on my own. I dabbled in a few things, then joined forces with a couple of CPAs to form a royalty auditing business serving some very high-end artists. The company first purchased an Apple computer (I can't recall whether it was a II or a IIe but, based on the release dates of the two, I'm inclined to think it was a II). We later purchased a Northstar Advantage, which ran the CP/M operating system and used two 160 KB, 5.25″ floppy disks. We also purchased a dot matrix printer and, in anticipation of taking the system out on the road, had Anvil make a hardened case for the two, with room for cabling, paper, and instructions to be packed inside.

At that point our audits required us to visit the artists’ recording companies, and my first visit was to RCA records in the Meadowlands of New Jersey. Standard procedure for the record company was to stick us somewhere that was relatively uncomfortable, then bring us stacks of paper, which we then transferred to ledger pages. Upon returning to our office in Playa del Rey, we would then have to transfer all the data to a spreadsheet; we were using SuperCalc on the Northstar Advantage, though we had started with VisiCalc on the Apple.

I suggested taking the computer with us when we performed audits, so the people who went out on the road could enter the numbers they received directly into an electronic spreadsheet, thereby saving a huge amount of time and stress. We were also using WordStar at the time for writing the narratives that would accompany our audit analysis.

My first experience with programming came when we were contemplating taking the system out on a European tour with Neil Young. I sat with my friend and partner, who had performed many a box office reconciliation, and we sketched out the different scenarios that were used to close out the night’s receipts. Doing so required the use of nested “if” statements, which determined the precise equation to use for each venue. Unfortunately, that same friend who had worked so diligently with me to create the formulae that would power the spreadsheet never felt comfortable with using it by himself and it never went out on the road.
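For the curious, the logic looked something like the following sketch, rendered here in Python rather than in spreadsheet formulas. The deal terms (guarantee, percentage split, house fee) are invented for illustration; the actual formulae varied from venue to venue.

```python
# Hypothetical box office settlement logic, sketched in Python rather than
# as nested spreadsheet IF statements. The deal terms below are invented;
# each venue's contract dictated the precise equation.

def settle(gross, guarantee, split=0.85, house_fee=2500.0):
    """Return the artist's share of a night's receipts."""
    net = gross - house_fee
    if net <= guarantee:
        # The artist is always owed at least the guarantee.
        return guarantee
    else:
        # Over the guarantee, the artist takes a percentage of net receipts,
        # but never less than the guarantee itself.
        return max(guarantee, net * split)

print(settle(gross=120_000.0, guarantee=50_000.0))  # percentage deal applies
print(settle(gross=40_000.0, guarantee=50_000.0))   # guarantee applies
```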

My Very First Computer, the Sinclair ZX81

It was also around this time I purchased a Sinclair ZX81, a small computer that had a membrane keyboard and saved its programs to a cassette recorder. It had its own OS, as well as its own version of BASIC, which I endeavored to learn. The first program I wrote, which took me all night to complete, counted down from 10 to 0 in the center of the screen. It then plotted a single pixel at a time (resolution was 64 x 48), starting from the bottom and, after reaching a height of six pixels, began plotting another pixel above the previous six while erasing a pixel from the bottom of the stack, until the column left the screen at the top. This required me to learn how to use either nested "if" statements or "do while" commands (I don't recall exactly which; it's only been a little over thirty-five years).
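The original was written in Sinclair BASIC, which I no longer have; what follows is a rough Python reconstruction of the program's logic, with the screen modeled as a simple set of lit pixels rather than actual PLOT/UNPLOT calls.

```python
# A rough reconstruction of that first ZX81 program's logic. The original
# used Sinclair BASIC's PLOT/UNPLOT on a 64 x 48 grid; here the screen is
# just a set of lit pixel coordinates.

import time

WIDTH, HEIGHT = 64, 48   # ZX81 PLOT resolution
COLUMN = WIDTH // 2      # plot in the center of the screen

lit = set()

# Count down from 10 to 0 in the center of the screen.
for n in range(10, -1, -1):
    print(f"\r{n:2d}", end="", flush=True)
    time.sleep(0.2)
print()

# Grow a column of six pixels from the bottom, then march it up the screen:
# light one pixel above the stack and erase one from the bottom each step.
for step in range(HEIGHT + 6):
    y = HEIGHT - 1 - step            # next pixel above the stack
    if y >= 0:
        lit.add((COLUMN, y))         # PLOT
    tail = HEIGHT - 1 - (step - 6)   # pixel falling off the bottom
    if step >= 6 and tail >= 0:
        lit.discard((COLUMN, tail))  # UNPLOT

print(f"pixels still lit: {len(lit)}")  # zero once the column exits the top
```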

Fast forward to 1984, the year my father died. Shortly afterward, I returned to help my brother keep the business going. We purchased a more advanced Northstar Advantage, which had a huge hard disk that could store 5 MB of data! At the time, we also purchased a copy of dBase II, one of the first database systems written for microcomputers. I taught myself how to write systems using its programming language, composing the code in WordStar, and wrote an entire accounting system for the business. My favorite component was the preparation of a deposit ticket, for which I laboriously emulated the workings of a calculator. Allowing for numerous methods of inputting dollars and cents (whether or not a decimal point was included) was the real differentiator and sticking point for me but, after much trial and error, I figured it out.
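As a sketch of the sticking point: the routine had to accept an amount whether or not the user typed a decimal point. Here is a minimal Python version, assuming a ten-key convention (no decimal point means the digits are cents); the original dBase II code may well have used different rules.

```python
# A minimal sketch of the deposit-ticket input problem: accept dollars and
# cents whether or not a decimal point is included. The no-decimal-means-
# cents convention is an assumption borrowed from ten-key adding machines.

def parse_amount(entry: str) -> int:
    """Return the amount in cents for a calculator-style entry."""
    entry = entry.strip().replace(",", "")
    if "." in entry:
        dollars, _, cents = entry.partition(".")
        cents = (cents + "00")[:2]          # pad or truncate to two digits
        return int(dollars or "0") * 100 + int(cents)
    # No decimal point: treat the digits as cents, ten-key style.
    return int(entry)

for e in ["1234", "12.34", "12.3", "12.", ".34", "1,234.56"]:
    total = parse_amount(e)
    print(f"{e!r:>12} -> ${total // 100}.{total % 100:02d}")
```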

Unfortunately, my brother and I didn't see eye-to-eye on the direction the business should go in and, after a while, I left again, this time taking temporary jobs to keep me afloat. During this period I worked for a while at a litigation support firm that used a DEC minicomputer and several of the earliest versions of the Macintosh. All of my work with computers was self-taught; with the exception of that one-day class on the IBM Memory 50 typewriter, I never took a computer class. I taught myself how to program by reading and doing, sometimes taking dozens of iterations to get a bit of code correct.

In 1987, I was working for a company that made hard drives, Micropolis. Their business was highly seasonal and, on one particular Friday, all the temps were summarily laid off. I was using Apple One at the time to send me out on engagements and, thanks to my willingness to show up wherever and whenever they offered work, I got a call from them that very Friday telling me to report to Rocketdyne the following Monday.

By this time I had been shifting my focus from working under the hood to figuring out how best to use the systems and tools that were rapidly evolving for business use. I was beginning to focus more on business results with whatever was available. My first responsibility at Rocketdyne was to enter text I received from engineers into a document called a Failure Modes and Effects Analysis / Critical Items List (FMEA/CIL). It was in direct support of recertifying the Space Shuttle Main Engine (SSME) for eventual return to flight after the Challenger disaster.

SSME Hotfire Test

It was a strange task, as the document was clearly text-based, yet we were using a spreadsheet to create it. I suppose it made some sort of sense, as the company was an engineering company and that's kind of how engineers see the world: in numbers, rather than words.

I also worked with a stress engineer on creating an app (we didn’t use the term back then, but that’s what it was) that could be used to predict crack propagation and its effects. I was unfamiliar with the equations behind the program, but my job was to use dBase II to create an interface that allowed for data input and crunched the numbers. It was fun and was successfully used for some time after that.

One year after joining as a temp (referred to as a “job shopper”) I hired in full-time and began working with the Flight Ops team. It was exciting and I spent much of my time massaging telemetry data from hot fire tests of the SSME. I received flat files from a Perkin-Elmer mainframe and eventually ported the data to Microsoft Access, which allowed for further massaging and reporting.

In October of 1988, a little over eight months after hiring in, the U.S. Space Program returned to flight with the successful launch of Discovery. At a celebratory event that evening I met one of the managers of the Program Office. As we talked and he discovered my background, he offered me a job. I did some research and talked to my managers at the time, who advised me to take it, which I did. As time went on, I moved further away from anything resembling coding and eventually wound up concentrating on the use of software and computing tools to increase the effectiveness of my colleagues and me.

Not quite 22 years later, I took an early severance package (which was offered to everyone over 60) and retired. I would turn 63 less than a month after leaving the company. In 2015, I returned as a contractor doing something I had done nearly 20 years previously. I spent the next two years (until February 17 of 2017, to be exact) providing program scheduling for two small rocket engine programs.

Last month I turned 70. I recently signed a referral partnership agreement with an organization I worked with a few years ago. They specialize in machine learning (ML), though I was unaware of that back then. My primary responsibility will be selling their services and, when possible, any products they may create. To be effective, I am now studying statistics and ML, partly to better understand what it is I'm selling and partly because I'm fascinated by the algorithms that power these efforts.

I do worry that my comprehension is somewhat hampered by, if not the years, the considerable mileage I’ve managed to accumulate. There’s also a minor problem with my “just don’t give a shit” attitude about a lot of things. Nevertheless, I will persist. I intend to share what I’m learning but, as with most things these days, it may be sporadic and somewhat unfocused.

I do believe machine learning is going to drastically change just about everything humans do, and I'm well aware of the disruption it might entail. I don't, however, believe that to be a showstopper. We've opened Pandora's box and there is no closing it. Let's roll!


Empathy: The Core of Complex Decisions

Having worked with Dr. Pratt and her company, Quantellia, I have long been convinced their approach to decision making is one of, if not THE, best methodologies I’ve encountered. After what I consider to be one of the most disastrous general elections in my lifetime, it would seem we need help in navigating the complexities of the world and our place in it. Lorien’s work can, I believe, help us understand the consequences of our decisions, before we make them. I urge you to watch this video and become more conversant in the issues Dr. Pratt raises. What follows below the video are some of the “liner notes” that go with her TEDxLivermore talk.

Making decisions based on invisible inputs is like building a skyscraper without a blueprint. Yet that is the norm, even for very complex problems. Contrary to how most of us think about making a decision as being the act of choosing, a decision is the last piece of a long, almost completely invisible, process. The good news: it is possible to make the invisible part of decisions visible.

In working with the Community Justice Advisor Program in Liberia, Africa, Lorien and colleagues helped The Carter Center (founded by Jimmy and Rosalynn Carter) use decision models to increase positive outcomes in the domain of civil justice, by identifying the most effective levers for change.

Using deep learning artificial intelligence, the interconnections between inputs become visible, and unintended consequences can be identified before implementation. Vicious cycles can be reversed, and virtuous cycles of improvement can be built in place and nurtured through intelligent decision metrics.

As co-founder of Quantellia, Dr. Lorien Pratt co-created the decision intelligence methodology and the company’s award-winning World Modeler™ software. She consults and speaks worldwide, and is known for her neural network research and the book Learning to Learn. A former college professor, Pratt is widely known as the former global director of telecommunications research for Stratecast, a division of Frost & Sullivan. A graduate of Dartmouth College and Rutgers University, Pratt holds three degrees in computer science. She received the CAREER award from the National Science Foundation, an innovation award from Microsoft, and is author of dozens of technical papers and articles.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx


Program Management By Ouija Board


Going back to work after nearly five years of “retirement” has been both interesting and instructive. When I was asked if I would be willing to do scheduling, which is something I had done many years ago, I happily said “yes”. I would have probably agreed to almost anything they wanted me to do, as I was anxious to supplement my meager retirement income. Actually, I first learned scheduling software using a mainframe tool called Artemis. Shortly afterward, we were introduced to a PC version of Artemis which, if memory serves, was called Schedule Publisher and, within another very short period, it was spun off into a product from Advanced Management Solutions, called AMS REALTIME Projects.

This was somewhere around 1994 and, at the time, Microsoft Project was comparatively bare bones and nowhere near as useful (in my opinion at the time) as REALTIME Projects. Having long been very much a visual person, I find the visualization provided by Gantt charts to be particularly useful when looking to see how the logic in a schedule affects downstream activities as time, and the work contemplated in the schedule, moves forward. Until Project introduced the Timeline view, which allows quick zooming and panning, I was not terribly happy with it compared to the AMS product, which offered a useful timeline capability.

So . . . since I had done scheduling for a few years during the 90s, I readily accepted the challenge and, upon my return on January 19, 2015, I was amused to see the company was still using Project 2002 which, although newer than the version I had struggled with, was still well over a decade old. The main reason for this, I was told, was because a set of macros had been developed over the years that allowed schedules to be matched up with the organization’s earned value management system, which is Deltek MPM.

Unfortunately, using such an old piece of software presented some interesting problems. One of the most egregious, from my point of view, was its inability to run in any of the conference rooms in my building. This was, and still is, due to an IT rule that blocks software from running in conference rooms if it is more than two versions older than the most current one available. In the case of MS Project, the latest version available when I returned was 2013, and Microsoft had released 2007 and 2010 versions in between, which put the one in widespread use more than two versions behind. As a result, clicking on the tool (which was installed in all the conference rooms) invoked Project but, instead of the tabular data alongside a Gantt chart, all one got was an empty box with a small red "x" in the upper left-hand corner.

In my experience, scheduling is an activity that absolutely must be done collaboratively. A good, useful schedule requires (at the very least) a great deal of understanding of not only the work to be done, but the ways in which the logic of its progression needs to be modeled in order to accurately reflect how downstream activities are impacted by small changes as work progresses . . . and changes are absolutely unavoidable, especially in large, complex projects such as rocket engine design, manufacture, and test.

Since it was impossible to use the tool in a conference room, where I could sit with the Program Manager, one or more Control Account Managers, and various engineers (design, quality, manufacturing, etc.), developing schedules became somewhat difficult and inordinately iterative, requiring dozens of communications back and forth between me, the Program Manager, and others whose input we needed. As work progressed, I was able to get IT to agree to let me log into my computer remotely from any of the conference rooms, which made working on the schedule much easier. However, the display resolution in the conference rooms was far lower than what I have at my desk on my Dell all-in-one, whose screen is 23″ diagonally, plus an extension display that gives me another 19″ off to the side. What I see on screen in conference rooms is not as inclusive as what I normally work with, and the adjustment cuts into the speed with which I can get things done.

As I both refamiliarize myself with the scheduling process and learn how the tools have advanced, I’m learning a lot about how best to do it. Perhaps more importantly, I’m also learning how little most people know of the power of a good piece of scheduling software. There are people here who still use Excel spreadsheets and date functions to create schedules. Maybe I’m missing something, but MS Project and other similar tools provide not only calendaring functionality, but also the kind of logic necessary to accurately model the interplay between design, quality, procurement, operations, testing, and numerous other ancillary and important processes that make up the entirety of a program.
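To make the point concrete, here is a minimal sketch of the forward-pass logic such tools apply, with invented task names and durations. Each task's start is driven by its predecessors, so slipping one activity ripples downstream automatically, which is exactly what a date-formula spreadsheet cannot do.

```python
# A minimal forward-pass scheduler with finish-to-start dependencies.
# Task names and durations are invented; real tools like MS Project add
# calendars, lags, resource leveling, and additional link types.

tasks = {
    # name: (duration_in_days, [predecessors])
    "design":      (10, []),
    "procurement": (15, ["design"]),
    "manufacture": (20, ["design", "procurement"]),
    "quality":     (5,  ["manufacture"]),
    "test":        (10, ["quality"]),
}

def schedule(tasks):
    """Compute early start/finish for each task (day 0 = project start)."""
    start, finish = {}, {}
    for name in tasks:                  # assumes dict is in dependency order
        duration, preds = tasks[name]
        start[name] = max((finish[p] for p in preds), default=0)
        finish[name] = start[name] + duration
    return start, finish

start, finish = schedule(tasks)
for name in tasks:
    print(f"{name:12s} start day {start[name]:3d}, finish day {finish[name]:3d}")

# Slip design by 5 days and the whole downstream chain moves with it.
tasks["design"] = (15, [])
_, finish2 = schedule(tasks)
print("test now finishes on day", finish2["test"])
```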

Inasmuch as Project also provides for highly detailed resource loading (quite literally down to the gnat's ass, if one is so inclined), I'm unclear as to why we don't use it for at least first-cut proposal activity. Were we to do so, I'm convinced it would not only speed up the initial process of pricing a decent proposal but, when completed, there would be no need to then create a schedule from scratch, which is generally the way it's done now. I suspect there are some people out there who actually do what I'm suggesting but, for all I know at this point, my perception could be wildly inaccurate.

So . . . I’m kind of hedging my bets and, while I’m agitating for people to consider using MS Project more widely and for deeper resource planning, I’m mostly looking to understand the tool a little more each day. It, like many tools available to organizations of all kinds and sizes, is far more powerful than most individuals understand or are interested in learning. I’m constantly finding myself believing we are crippling ourselves by not using it far more extensively but, as many have pointed out, changing direction in a reasonably large organization, especially one which depends largely on government contracts and oversight, is like turning an aircraft carrier with a canoe paddle. On the bright side, it could keep me working for another decade, the prospect of which does not bother me in the slightest.


Making Sense of All That Data

Transforming Big Data Information into Deep Data Insights

Yesterday I posted a question to several of the groups I belong to on LinkedIn. It was related to several of the things I'm interested in and involved with: Systems Thinking, Knowledge Management, and Decision Modeling. It was somewhat informed, as well, by an article appearing in the Huffington Post, in which Otto Scharmer, a Senior Lecturer at MIT and founder of the Presencing Institute, talks about the need to make sense of the huge and growing amounts of data we have available to us. He argues for the importance of turning from "Big" data, where we mainly look outward in our attempt to understand what it is telling us about markets and our external influence, to "Deep" data, where we begin looking inward to understand what it's telling us about ourselves, our organizations, and how we get things done.

The question I asked was designed to seek out capabilities and functionality that people would like to have but that are currently unavailable. My interest is in working with others to understand and, if possible, provide for those needs. I thought I would present the question here as well, where it will remain a part of my online presence and, hopefully, might elicit some useful responses. Here it is:

With the growing proliferation and importance of data — a development at least one author and MIT Lecturer has suggested is moving us from the information technology era to the data technology era — what tools would you like to see become available for handling, understanding, and sharing the new types of information and knowledge this development will bring?

In other words, what would you need that you don’t have today? What types of technology do you think would offer you, your colleagues, and your organizations a greater ability to make use of data to bring about a transformation from primarily siloed, outward looking data to collaborative, inward looking data as well?

I would love to hear of any ideas you might have regarding the kinds of tools or apps you could use to better deal with data by turning it into useful information and knowledge . . . perhaps even a smidgen of understanding and wisdom.


Chasing Earned Value

Recently, I was given the task of writing a short (4 – 5 page) paper on the basics of Earned Value Management (EVM), and why it’s useful for medium to large organizations in managing their projects. The idea was to deal with the “why”, not the “how”. I worked in a large aerospace organization for over two decades and we used EVM extensively. It is, after all, a requirement for all government contractors.

Earned Value Terminology: A Plethora of Acronyms Revealed

Having retired from that industry a little over four years ago, I was a bit rusty. However, you can't have that stuff drummed into your head without it engraving itself fairly deeply on your consciousness. It didn't take me long to come back up to speed. In fact, the biggest problem I had was knowing where to stop. EVM is full of acronyms and formulae (BCWS, BCWP, ACWP, SPI, CPI, etc., etc., etc.), all of which I'm fairly certain are useful . . . when used intelligently. As with most things, how valuable they are depends a great deal on what you're trying to accomplish, how prepared and disciplined you are, and how well you execute over time.
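For anyone who hasn't had those acronyms drummed into them, the core quantities and indices reduce to a few simple formulas. The dollar figures below are invented purely for illustration:

```python
# The core earned value quantities, using the classic acronyms:
#   BCWS - Budgeted Cost of Work Scheduled (planned value)
#   BCWP - Budgeted Cost of Work Performed (earned value)
#   ACWP - Actual Cost of Work Performed   (actual cost)
# The figures are invented for illustration.

BCWS = 100_000.0   # work we planned to have done by now, in dollars
BCWP = 90_000.0    # budgeted value of the work actually done
ACWP = 110_000.0   # what that work actually cost

SV = BCWP - BCWS   # schedule variance: negative means behind schedule
CV = BCWP - ACWP   # cost variance: negative means over budget
SPI = BCWP / BCWS  # schedule performance index: < 1.0 is behind
CPI = BCWP / ACWP  # cost performance index: < 1.0 is over budget

print(f"SV = {SV:+,.0f}  CV = {CV:+,.0f}  SPI = {SPI:.2f}  CPI = {CPI:.2f}")
# SV = -10,000  CV = -20,000  SPI = 0.90  CPI = 0.82
```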

Now this brings me to a somewhat vexing problem and the reason I’m sharing this. I could swear there’s a good argument somewhere as to why EV is not a very good method for managing a project. However, when I searched for problems or reasons not to use EV, all I could find were lists of where organizations go wrong because they don’t plan properly, they don’t pay attention to detail, or they don’t use tools as they’re designed to be used.

So I have a question, which I am now going to throw out into the aether. Assuming some who read this actually know about, and have experience with, Earned Value Management and maybe one or more of the systems used to facilitate its proper application, are you aware of any reasons NOT to use EVM and, if so, could you point me to a resource or school me on the subject? Thanks.


Simplicity is the Ultimate Sophistication

 

In an effort to improve my “working out loud” chops, I’m learning from a friend who has begun sharing the text of (not links to) his blog posts on Facebook and LinkedIn, as well as on the blog he’s had for a very long time. <Light Bulb!> This one’s a kind of reverse emulation, as this is something I shared on Facebook first.

I have found an interesting difference of opinion on the subject of simplicity versus complexity, but it seems to hang on the dimension of endeavor we're looking from. From an engineering design perspective, especially with respect to products for the consumer market, there's evidence complexity (think shiny objects) actually sells better than simplicity.

It seems to me, however, that da Vinci was looking a little deeper than marketing prospects and was more interested in the aesthetics of design . . . all kinds of design.

So . . . I'm thinking of it in terms of this software tool I am now representing, called World Modeler, which is used to model the elements required to make important, and quite likely expensive, organizational decisions. What we (Quantellia, LLC and I) can do is take highly complex decision models (involving numerous decision levers, external factors, intermediate effects, interconnections, and even qualitative assumptions) and show, graphically and quite simply, how they will play out over time given certain values. The goal is to render the complex simple, not to simplify that which is complex.


What Is Decision Intelligence?

World Modeler adds a Systems approach to Project Management

In my last post I took a stab at defining, and explaining, the concept of Decision Intelligence. I’m willing to bet you’re going to be hearing a lot about it in the not-too-distant future. So you don’t have to click back and forth, I’ll copy over what I wrote about it in that post:

This is the term Quantellia now uses to describe what it is we do. NB – The term is not “Decision Analytics”; there’s a reason for this. Perhaps it is best understood when one looks at a part of how decision modeling is accomplished. Part of the raw material available today for making decisions is what we call “big data”. There’s an awful lot of attention being paid to the field of predictive analytics, which uses big data as its raw material. We at Quantellia prefer the term predictive intelligence. This is because predictive analytics uses past performance (data) to project trends into the future. We like to think we take the concept a bit further.

While we believe analytics are useful and important, they lack the dimensions of human knowledge and understanding that can more completely predict how the past will play out in the future. A subtle distinction? Perhaps, but I find it a valuable one. Unless we’re talking about the future activity of a machine designed to perform a very limited set of instructions or actions, our activities involve human understanding, emotion, and interpretation. There are times when these attributes can dramatically change the course of an organizational effort, rendering previous decisions moot or, at best, only partially useful or correct.

By providing a method whereby human understanding, intuition, and wisdom can be incorporated into the decision model itself, we believe we can more intelligently predict the future. We are well aware there is no such thing as infallibility. However, we also know the more useful and actionable information and knowledge we have available to understand what has happened − and is likely to happen − the better our decisions will be.

Now, having had some time to think about it (it's been over a month since that post) and having discussed it a bit with Quantellia's Chief Scientist, Dr. Lorien Pratt (@LorienPratt), I'd like to add a little something to both the definition and the description of what World Modeler has to offer. Keep in mind, as with many things, perhaps even more so with something truly emergent and reasonably new to my experience, both my understanding and my ability to explain are evolving, developing structure and nuance as I learn more theory and encounter more examples of real-world situations.

I consider systems thinking, the ability to see systems — and systems of systems — to be the most effective way to understand what is happening within any of them, and the best chance we have of affecting the outcomes of the ones designed to produce value. The more elements of a system you can model, the more likely you are to understand the downstream effects of your decisions, and the more likely you are to see the unintended consequences of actions before you take them.

Here's where Quantellia's World Modeler™ excels as a decision modeling — and making — tool and enabler. Consider predictive analytics, the practice of extracting information from existing data sets in order to determine patterns and predict future outcomes and trends. Predictive analytics usually returns fairly simple, pairwise relationships: customers in this demographic, with this amount of revenue, are likely (or not) to churn; devoting a certain amount of energy to customer retention is likely (or not) to affect churn.

World Modeler, on the other hand, allows you to create a highly complex systems model. This means you can look at numerous elements and their interrelationships to see how they work together, e.g., customer characteristics, customer retention efforts, likelihood to churn, total customers, revenues, and even business rules that might have to be taken into consideration if certain levels of activity are reached. Furthermore, when you don't have data for one or more of these elements, you can use human expertise, the tacit knowledge of your employees or the group, to fill in the gaps. If you are later able to gather real data, you can plug it into the model and keep going.
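To give a flavor of the difference, here is a toy systems model of that churn example, sketched in Python. This is emphatically not World Modeler's engine, and every coefficient is invented; the point is the structure: several interconnected levers and effects simulated over time, rather than a single pairwise prediction.

```python
# A toy systems model of the churn example. All relationships and numbers
# are invented assumptions for illustration only.

def simulate(months=24, customers=10_000.0,
             retention_spend=50_000.0,   # decision lever, dollars per month
             base_churn=0.04):           # assumed monthly churn with no spend
    revenue_per_customer = 30.0          # assumed, dollars per month
    for month in range(1, months + 1):
        # Spend reduces churn with diminishing returns (assumed relationship).
        churn = base_churn / (1.0 + retention_spend / 100_000.0)
        new = 0.02 * customers           # assumed word-of-mouth growth
        customers = customers * (1.0 - churn) + new
        revenue = customers * revenue_per_customer - retention_spend
    return customers, revenue

for spend in (0.0, 50_000.0, 150_000.0):
    customers, revenue = simulate(retention_spend=spend)
    print(f"spend ${spend:>9,.0f}/mo -> {customers:8,.0f} customers, "
          f"net ${revenue:10,.0f}/mo after 24 months")
```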

One more thing. World Modeler is a highly flexible, iterative navigation mechanism. It allows you to predict without complete or perfect knowledge, then pivot and change the model as new and/or different knowledge, information, and data are gathered or encountered. You can do this repeatedly over the course of months or years, whatever's necessary to help you make the best decisions for achieving your desired outcomes. So success doesn't depend on long-term predictions. Rather, it depends on navigation and alignment between the organization's systems, processes, and the humans who employ them.

Now . . . having learned all that, aren't you interested in seeing how this tool works? You can get a free evaluation copy, and all you're giving up is a little contact information. There's no obligation. Click on the link to download a fully-functional two-week evaluation copy of World Modeler. Give her a Whirl(d)!


What Did I Say I Did?

You Read it on the Internet, so it Must be True!

Anyone who knows me, knows I am quite the stickler for clarity and correctness in communication. I have proudly held myself out as a Senior Inspector in the U.S. Grammar Police, as evidenced by this card I created only halfway in jest. Actually, the card’s creation (I shared the process publicly) led to a couple of paid editing gigs. I’ve also been called a Grammar Nazi, which has caused me to momentarily flash a slightly sheepish smile, accompanied by a sparklingly demure blush.

Recently, I began a new engagement with a company I've wanted to work with for some time, Quantellia, LLC. As of the beginning of the year, I am what they call a referral partner. As such, I am contracted to Quantellia to sell their product, World Modeler™, and their services, which include training, workshops, and the like, designed to help organizations make better decisions. In learning about my new venture, I have come across a few phrases that are similar, yet different enough to cause me to dig a little deeper in search of clarity as to their meaning. I want to very briefly share my understanding of four separate phrases, each of which begins with the word "decision".

At first, I thought one of the terms was kind of a catch-all; an umbrella term that encompassed the others. However, I no longer believe that to be the case, at least not fully. Keep in mind, all four of these phrases are relevant to what it is Quantellia and I are doing. At the same time, my understanding is quite likely imperfect and incomplete. As I gain a foothold in the discipline, and become more proficient, I have no doubt my definitions and my understanding will need refinement. 

Decision Science – At first I thought this term was one into which the others neatly folded. However, having done a bit of research, I can no longer say that's the case. As I currently understand it, Decision Science concerns itself not so much with the process of making business decisions, but with the psychology of making any kind of decision. In other words: why do people make the decisions they do; what factors do they take into consideration; how do they weigh them; how emotional are people in reaching decisions?

Originally constituted in late 1968 as the American Institute for Decision Sciences, and later named the Decision Sciences Institute, this organization had its first annual meeting on October 30 – 31, 1969 in New Orleans. If interested, here’s a history of the organization written in July of 1989 by the then President, Bernard W. Taylor III. According to Wikipedia, the Institute is a “professional association of university professors, graduate students, and practitioners whose interest lies in the application of quantitative and qualitative research to the decision problems of individuals, organizations, and society. Many of the members of this academic organization are faculty members in business schools.”

It seems that Decision Science is a relatively new discipline. This conclusion is backed up by the history of its presence in some of the universities and colleges of the United States. For instance, Carnegie Mellon University's Department of Social and Decision Sciences finds its roots in 1976, as part of what is now the Marianna Brown Dietrich College of Humanities and Social Sciences. The Harvard Decision Science Laboratory opened its doors much more recently; according to their website, they've only been around since January of 2009. I couldn't find the date George Washington University's business school opened its Department of Decision Sciences, but my hunch is it was sometime in the last decade, at most. The Columbia Business School's Center for Decision Sciences, formerly part of the Institute for Social and Economic Research and Policy, appears to be fairly young as a separate discipline as well.

Decision Modeling – Although at best an inexact science, decision modeling can be a highly effective tool in helping an organization better predict the outcomes of its decisions. This is made more likely if the model is comprehensive, based not merely on data and analytics but also on the knowledge of the people in the organization for which the decision is being made, and if the model is iterative and capable of incorporating newly discovered information and relationships. Furthermore, the model becomes more and more effective as it accurately captures the complex relationships it seeks to help understand. World Modeler™ is capable, despite its seemingly simple interface, of modeling highly complex relationships. I'll post more in the future about its capabilities, including embedding some excellent videos showing what it can do.

Decision Engineering – This is a term I don’t believe we are using any longer to explain what Quantellia does. Frankly, as someone who spent over two decades working with aerospace engineers and rocket scientists (quite literally, on the Space Shuttle Main Engine, Delta, and Atlas engine programs), I’m kind of partial to engineering. I can, however, understand how it may sound a bit intimidating or dweeby to people without my background, so I won’t dwell on it here.

Decision Intelligence – This is the term Quantellia now uses to describe what it is we do. NB – The term is not "Decision Analytics"; there's a reason for this. Perhaps it is best understood when one looks at a part of how decision modeling is accomplished. Part of the raw material available today for making decisions is what we call "big data". There's an awful lot of attention being paid to the field of predictive analytics, which uses big data as its raw material. We at Quantellia prefer the term predictive intelligence. This is because predictive analytics uses past performance (data) to project trends into the future. We like to think we take the concept a bit further.

While we believe analytics are useful and important, they lack the dimensions of human knowledge and understanding that can more completely predict how the past will play out in the future. A subtle distinction? Perhaps, but I find it a valuable one. Unless we’re talking about the future activity of a machine designed to perform a very limited set of instructions or actions, our activities involve human understanding, emotion, and interpretation. There are times when these attributes can dramatically change the course of an organizational effort, rendering previous decisions moot or, at best, only partially useful or correct.

By providing a method whereby human understanding, intuition, and wisdom can be incorporated into the decision model itself, we believe we can more intelligently predict the future. We are well aware there is no such thing as infallibility. However, we also know the more useful and actionable information and knowledge we have available to understand what has happened − and is likely to happen − the better our decisions will be.


 

Mild Disclaimer – I hope I didn’t rattle anyone’s cages too much with these definitions/explanations. They represent my current thinking and, being somewhat of a newbie to the science/craft of decision making as a discipline, my understanding is necessarily incomplete and in a state of flux. Nonetheless, this is a first attempt at explaining some of the concepts that are informing my work with Quantellia and World Modeler™. Consider it an ongoing process.

At the same time, I would like to make it clear that Quantellia has been doing this for approximately eight years and has a track record of helping organizations both large and small to make decisions and manage programs successfully. Dr. Lorien Pratt, co-founder and Chief Scientist, as well as her team, know their stuff and are the biggest part of my team. We are looking for people who have “wicked problems” they need to solve; people who are facing highly complex decisions involving lots of time and money, and for whom the wrong decision could be very costly. If you fit that description, or know of someone who does, you could do a lot worse than contact us for an initial discussion of your needs. We begin with the end in mind and believe we can help. Only by understanding what it is you face can we determine whether or not that’s possible.


Changing My Game

While I have written a little bit about one of the new endeavors I have set out to pursue (here and here), I haven’t really done much to explain what it is I’m doing with decision modeling and my work with Quantellia LLC. I am in the process of writing a post about some of the concepts I’ve been looking into and learning about, but it won’t be ready for a while, as I have more studying and research to do.

I do, however, have the ability to share some of the material I'm learning from, as Quantellia has produced a significant number of videos and recorded webinars. This one is the one I usually send to prospects. While it is the oldest, it's also one of the shortest and still conveys the essence of what Quantellia, and its product World Modeler, can do for a business or organization facing complex decision-making.

So . . . I’m not sure if I actually announced it here on my blog, but as of the beginning of this year I have become a referral partner for Quantellia. In my opinion they have not only a superior product, but a superior mindset regarding how decisions are made. As a systems thinker I am keenly aware of the value in a long-range, strategic, informed approach to deciding how to proceed and to keeping track of what’s happening, always being prepared to take a different path if circumstances warrant it. I believe the people of Quantellia do exactly that and that World Modeler is a tool that makes it much easier to accomplish.

If you have an important, complex decision to make you need to understand how decision modeling works. As Dr. Pratt says on the video, you can model many decisions using paper and pencil, but you can’t do a good job of it without understanding how to “engineer” the decision using more than just analytics and predictions based on them. You need to use “Decision Intelligence”. Quantellia can help, which means so can I. Please let me know if you’re interested in discussing your specific needs. I’d be happy to set up a teleconference to see if we can help. Thanks.

PS – I'm going to share more of these videos here, but you can see them all for yourself at Quantellia's YouTube channel, located here.

