Category Archives: Decision Intelligence

RAIDI

Robot and Human hands touching

I have no doubt I am a very lucky person. Although I do not have an education in any science, I was able to spend approximately two decades working on the Space Shuttle Main Engine (SSME) program at Rocketdyne (through four major aerospace corporations). I spent a lot of time working with some of the brightest rocket scientists (for realz) as well as world-class engineers and scientists in literally dozens of disciplines.

Since my retirement from (what was then) Pratt & Whitney Rocketdyne, I have worked intermittently with Quantellia, LLC, an artificial intelligence / machine learning software development firm. Needless to say, I have no formal education in any computer field, with the exception of two Visual Basic classes I took at a nearby junior college. I was introduced to one of the co-founders of Quantellia shortly after my retirement. She showed me a tool they had been developing called “World Modeler”. It was the most exciting thing I’d seen in a long time, and I was especially impressed with how it brought a highly systemic approach to modeling and forecasting in complex situations. I ended up writing several papers and a bunch of case studies for them.

In 2015 I returned to work at what was then Aerojet Rocketdyne (still is, for now), where I worked on a small rocket engine program for a little over two years. After leaving, I started doing some selling for Quantellia and, beginning in March of 2018, I became the company’s Business Manager, a position I still hold.

Last year we held a summit, in conjunction with SAP Global Services, at their Labs in Palo Alto. It was called the “Responsible AI/DI Summit.” In this context AI stands for “Artificial Intelligence” and DI stands for “Decision Intelligence.” One of the main purposes of the summit was to discuss how we can develop artificial and decision intelligence so that they are focused on solving humanity’s most “wicked” problems, rather than merely on apps whose main purpose is to make money for the developers, investors, and entrepreneurs behind them.

Below are some of the folks who worked on the Summit, including me (the long-haired guy in the middle of the back row). Also, here’s a link to this year’s second Summit – Responsible AI/DI Summit 2019, as well as a link to the RAIDI Blog.

Quantellia and SAP folks who worked on putting it all together

As I learn more about machine learning, artificial intelligence, and decision intelligence, I will work at sharing my knowledge and understanding of these tools, and the issues they raise. I know the people I’m working with are dedicated to serving humanity, not merely milking it for profit. That pleases me and I hope we’ll be able to prove we’re doing the right things to ensure such service continues to exist and grow.


Preserving My Past

The time has come for me to simplify . . . to apply some feng shui to my collection of old (ancient?) paperwork, some of it several decades old. Paper is the one thing I seem to be a bit of a hoarder with; that and old clothing, I guess.

I am coming across papers, letters, and notes I’ve written over the years, many of them from the more than two decades I spent at Rocketdyne, where I was privileged to work on the Space Shuttle Main Engine program. In that time I worked for (without changing desks) Rockwell International, The Boeing Company, and the Pratt & Whitney Division of United Technologies. After I accepted an early retirement package in 2010, I returned as a contractor to work for Aerojet Rocketdyne in 2015, where I worked for a bit over two years.

Recently, I purchased a small, portable Brother scanner and I am slowly scanning old papers I’m finding. Inasmuch as I’m now publishing far more frequently to this blog, I’ve decided to save some of these things so I can throw the paper away and still have a record. It’s been over nine years since I retired and I find I’m forgetting what working in a large organization was like. Reading some of the documents I created helps me to remember what I did, as well as to feel reasonably confident I wasn’t just spinning my wheels.

What follows should be somewhat self-evident. It’s a letter I wrote to my manager in 1994, now over 25 years ago. I think I sound pretty reasonable, and I’m gratified to know I was pushing—pretty hard, I think—for positive change back then. I’m not an IT person; never went to undergrad and, besides, the earliest PCs didn’t come into existence until I was nearing my thirties. However, I did recognize the value such tools brought to managing and operating a business and I have always been a big promoter of technology in the office. At any rate, this is more for me than my readers, but some may find it “amusing.”

PS – I scanned the original “memo” in .jpg format and the accompanying Lotus presentation materials in .pdf, which you’ll have to click on if you’re interested in what Lotus was doing 25 years ago, before its acquisition by IBM.


Will Someone Stand Up?

In the 2020 General Election, coming up waaaaay sooner than you think, time being what it is, there are eight (count ’em, eight) Republican Senate seats with no Democratic challenger. Two of the eight incumbents are retiring but, in all cases, whether it’s a replacement or the incumbent, the Republican is running unopposed. This is an intolerable situation, IMO.

Every Republican, save Justin Amash, has shown themselves to be a hapless sycophant, bowing to the whims of the most destructive and inhumane President in modern history. Allowing any of them to run without Democratic opposition is something we should avoid at all costs.

  • Bill Cassidy, Louisiana (In 2014 he beat three-term incumbent Democrat Mary Landrieu, 56 percent to 44 percent. I don’t know if there are any Democrats in the running at present.)
  • Mike Enzi, Wyoming (Retiring. This seat is considered safe by most people.)
  • Cindy Hyde-Smith, Mississippi (Hyde-Smith defeated Mike Espy last November in a racially charged campaign.)
  • James Inhofe, Oklahoma (This is the schmuck who brought a snowball into the Senate chamber to argue that global warming can’t be real because it’s still cold somewhere.)
  • Pat Roberts, Kansas (Retiring. Maybe a lost cause, as he ran unopposed last time and Kansas is a deep red state.)
  • Mike Rounds, South Dakota (The entire state has approximately a quarter of a million voters. Unknown whether there are enough Democrats to matter.)
  • Ben Sasse, Nebraska (In the 2014 election, there were a little over half a million voters; Sasse won every county in the state, 64% to 31%.)
  • Dan Sullivan, Alaska (In the 2014 election, Sullivan won by 2.2% with a total of only a little over a quarter million voters. This state could be ripe for a flip.)

After the 2016 General Election, I worked with a group of people who were creating a canvassing tool that was designed to use AI to better prepare people who were out knocking on doors. It would have used demographics and historical voting data to train a machine learning algorithm on the patterns to be found in the data. Unfortunately, our primary investor kept adding requirements and ultimately squeezed the value right out of the app.
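To give a flavor of what that looked like, here’s a minimal sketch, in Python, of the kind of model we had in mind. It is not the code we wrote; the field names and the scikit-learn pipeline are my own illustrative assumptions.

```python
# A hypothetical sketch of the canvassing model's core idea: train a classifier
# on demographics and vote history, then score voters so canvassers can
# prioritize doors. All column names here are invented for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

voters = pd.read_csv("voter_file.csv")  # hypothetical voter file

X = pd.get_dummies(
    voters[["age", "median_income", "party_reg", "voted_2012", "voted_2014"]],
    columns=["party_reg"])              # one-hot encode party registration
y = voters["voted_2016"]                # did they turn out last cycle?

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Score everyone; a canvassing plan might target the uncertain middle band.
voters["turnout_score"] = model.predict_proba(X)[:, 1]
```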

The original concept we had discussed went further: to use machine learning to help political organizations make the most effective (not merely efficient) use of their various resources (e.g., time, money, people, connections), as well as to understand the political environment based on polls and overall news coverage.

Frankly, nobody I know of has sat down and begun to develop such a decision model, though I would dearly love to see it happen. It’s what we envisioned after Trump “won” and I still think it’s a viable approach. The challenge is somewhat daunting, however, given how expensive it would be to gather all the data we’d need access to, as well as to develop the algorithms that would analyze and correlate that data.

Regardless, it seems a shame so many Republicans might run without any Democratic opposition. You’d think the least we can do is make them fight for their seats, which would include forcing them to shift resources around as well. It should be part of the overall pattern of the elections, which I’m unconvinced the Democratic Party really understands.


Hey! Long Time, No See.

I know it’s been quite a while since last I posted here. I’ve been continuously active on Facebook and have begun tweeting quite a bit as well, but that’s not why I haven’t posted to this blog in nearly three months. As of March 1, I began a new career, probably not the kind of thing you hear about 70-year-olds doing all that often. Since then I have been working as the Business Manager for Quantellia, LLC. You may recall I’ve done work for and with Quantellia on and off for the past six years.

Quantellia is a small AI/ML software development house and, until now, one of the co-founders has been running the business. Inasmuch as she is also the organization’s Chief Scientist, and a well-known pioneer in machine learning, this was not exactly the optimal use of her time. I had been broaching the subject for a while and, since she was having such a hard time finding someone competent to run the business, I pressed my offer to do so. She finally relented and things have been going swimmingly, although there have been times I was swimming against the current. I’m definitely climbing a steep learning curve, which sometimes has me questioning whether I’m losing my edge.

Actually, at times I can’t quite tell if my intellect is slipping a little bit, or if I just don’t care as much as I used to and I’m not quite as arrogantly sure of myself. My memory seems to be intact, along with my ability to learn and adapt. I’m going to go with “I just don’t care as much about things as I used to; I’m more sanguine about life, work, and the need to control everything.”

At any rate, I’m having a lot of fun. I was once partnered with two CPAs, doing royalty accounting for some big acts: Jackson Browne, Joni Mitchell, The Cars, Dolly Parton, Ronnie Milsap, The Commodores, even Jimi Hendrix’s estate. I learned a fair amount about accounting back then, and now I’m getting the opportunity to revisit what I learned, applying it in different circumstances. I’m also learning about artificial intelligence and machine learning, and hope to convey some of what’s going on in these fields. Although not a data scientist, I am quite capable of seeing where AI can be applied in business to assist with all kinds of issues. I’m sure you can as well.


A Career Change at 70

When I was in High School (1962 – 1966; it took me three and a half years to escape) there were no computer classes. Although there seems to be some disagreement on when the first personal computer was invented, even the earliest claim puts the date six years after I graduated. There was no such thing as a computer class, let alone a programming or coding class. Also, I did not attend university and, to my recollection, nobody I knew at the time was interested in computers or information technology. Actually, I was a terrible student and wasn’t interested in much of anything by the time I finished High School.

IBM Memory 50

The IBM Memory 50 Typewriter

Fast forward to 1974. Despite having no undergraduate education, I had managed to secure admission to an accredited law school located not far from my home in the San Fernando Valley. I began in the fall of 1973 and the following year I got a job in the law office of a sole practitioner in Beverly Hills as a legal secretary/law clerk. Shortly after I began, the lawyer I worked for purchased an IBM Memory 50 Typewriter. I attended a one-day class where I learned how to use it. This was my first introduction to anything resembling “computing.”

The office later upgraded to an Artec Display 2000, which had an LED readout of approximately 30 characters. There was no CRT display. It used two 8″ floppy disks and had a document/data merge capability that made it perfect for boilerplate documents, e.g. Pleadings, Interrogatories, Summonses, etc. It was a great leap forward in word processing.

Edward Ladd & Sons

The Family’s Wholesale Food Business

Shortly after graduating from law school I decided, for numerous reasons, that my heart really wasn’t in spending the rest of my working life around the judicial system and, after much gnashing of teeth and going over my alternatives, I chose to join my family’s wholesale food distribution business. One large factor in making this determination was my father suffering his second major heart attack. The business was supporting my mother and my sister, who was only 10 years old at the time. I felt the need to help the business grow, ensuring they would be taken care of if my father were to die . . . which he did eight years later.

Edward Ladd & Sons Jacket

Our company jackets. Logo design by me, jackets created by Cat’s Pyjamas.

After a couple of years, the business had grown substantially and, given my desire for another type of challenge, I once again struck out on my own. I dabbled in a few things, then joined forces with a couple of CPAs and formed a royalty auditing business, serving some very high-end artists. The company first purchased an Apple computer (I can’t recall if it was a II or a IIe but, based on the release dates of the two, I’m inclined to think it was a II). We later purchased a Northstar Advantage, which used the CP/M OS and two 160 KB, 5.25″ floppy disks. We also purchased a dot matrix printer and, in anticipation of taking the system out on the road, we had Anvil make a hardened case for the two, with room for cabling, paper, and instructions to be packed inside.

At that point our audits required us to visit the artists’ recording companies, and my first visit was to RCA records in the Meadowlands of New Jersey. Standard procedure for the record company was to stick us somewhere that was relatively uncomfortable, then bring us stacks of paper, which we then transferred to ledger pages. Upon returning to our office in Playa del Rey, we would then have to transfer all the data to a spreadsheet; we were using SuperCalc on the Northstar Advantage, though we had started with VisiCalc on the Apple.

I suggested taking the computer with us when we performed audits, so the people who went out on the road could enter the numbers they received directly into an electronic spreadsheet, thereby saving a huge amount of time and stress. We were also using WordStar at the time for writing the narratives that would accompany our audit analysis.

My first experience with programming came when we were contemplating taking the system out on a European tour with Neil Young. I sat with my friend and partner, who had performed many a box office reconciliation, and we sketched out the different scenarios used to close out the night’s receipts. Doing so required the use of nested “if” statements, which determined the precise equation to use for each venue. Unfortunately, that same friend who had worked so diligently with me to create the formulae that would power the spreadsheet never felt comfortable using it by himself, and it never went out on the road.
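The actual formulae are long gone, but the shape of that logic is easy to reconstruct. Here’s a hedged sketch in Python (the deal structures and numbers are invented) of how nested conditionals selected the settlement equation for a venue:

```python
# A sketch of box-office settlement logic like what went into the spreadsheet:
# nested conditionals pick the right equation for each venue's deal.
# The deal structures and numbers are invented for illustration.
def artist_share(gross, expenses, deal):
    if deal["type"] == "flat_guarantee":
        return deal["guarantee"]
    elif deal["type"] == "versus":
        # Guarantee versus a percentage of the gross, whichever is greater.
        return max(deal["guarantee"], deal["pct"] * gross)
    else:
        # Guarantee plus a split of the net above an agreed split point.
        net = gross - expenses
        if net > deal["split_point"]:
            return deal["guarantee"] + deal["pct"] * (net - deal["split_point"])
        return deal["guarantee"]

deal = {"type": "versus", "guarantee": 25_000, "pct": 0.6}
print(artist_share(100_000, 30_000, deal))  # 60000.0
```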

Sinclair ZX81

My Very First Computer, the Sinclair ZX81

It was also around this time I purchased a Sinclair ZX81, a small computer that had a membrane keyboard and used a cassette recorder to save programs on. It also had its own OS, as well as its own version of BASIC, which I endeavored to learn. The first program I wrote, which took me all night to complete, counted down from 10 to 0 in the center of the screen. It then plotted a single pixel at a time (resolution was 64 x 48), starting from the bottom and, after reaching a height of six pixels, began plotting another pixel above the previous six while erasing a pixel from the bottom of the stack, until the column left the screen at the top. This required me to learn how to use either nested “if” statements or “do while” commands (I don’t recall the exact commands; it’s only been a little over thirty-five years).
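The original BASIC listing is long gone, but the algorithm stuck with me. Below is a rough reconstruction of its logic in Python; the print loop is a crude stand-in for the ZX81’s PLOT and UNPLOT commands, and the details are a best guess.

```python
# A Python reconstruction of that first ZX81 program's logic: a six-pixel
# column climbing a 64 x 48 "screen" until it exits at the top.
import time

WIDTH, HEIGHT, COL, BAR = 64, 48, 32, 6   # ZX81 graphics were 64 x 48 pixels

for n in range(10, -1, -1):               # the countdown, centered on screen
    print(str(n).center(WIDTH))
    time.sleep(0.2)

lit = set()                               # (x, y) pixels currently "plotted"
for step in range(HEIGHT + BAR):
    if step < HEIGHT:
        lit.add((COL, HEIGHT - 1 - step))              # PLOT one pixel on top
    if step >= BAR:
        lit.discard((COL, HEIGHT - 1 - (step - BAR)))  # UNPLOT the bottom one
    # Crude stand-in for the display: the column shown sideways, one row/frame.
    print("".join("#" if (COL, y) in lit else "." for y in range(HEIGHT)))
    time.sleep(0.05)
```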

Fast forward to 1984, the year my father died. Shortly afterward, I returned to help my brother keep the business going. We purchased a more advanced Northstar Advantage, which had a huge hard disk that could store 5 MB of data! At the time, we also purchased a copy of dBase II, which was one of the first database systems written for microcomputers. I taught myself how to write systems using its programming language, writing the code in WordStar. I wrote an entire accounting system for the business. My favorite component was the preparation of a deposit ticket, where I laboriously emulated the workings of a calculator. Allowing for numerous methods of inputting dollars and cents (whether or not a decimal point was included) was the real differentiator and sticking point for me but, after much trial and error, I figured it out.
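The dBase II source is long gone too, but the essence of the input problem is easy to show. Here’s a minimal sketch in Python, assuming ten-key-style entry where a missing decimal point means the last two digits are cents:

```python
# A sketch (Python, not dBase II) of the calculator-style entry problem from
# the deposit-ticket program: accept amounts with or without a decimal point.
def parse_amount(entry: str) -> float:
    entry = entry.strip().replace(",", "")
    if "." in entry:
        return float(entry)           # explicit decimal point: take as written
    return int(entry) / 100           # no point: last two digits are cents

assert parse_amount("12.34") == 12.34
assert parse_amount("1234") == 12.34
assert parse_amount("5") == 0.05      # a lone "5" is five cents, not $5
```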

Unfortunately, my brother and I didn’t see eye-to-eye on the direction the business should go in and, after a while I left again, this time taking temporary jobs to keep me afloat. It was during this time I worked for a while at a litigation support firm that used a DEC minicomputer and several of the earliest versions of the Macintosh. All of my work with computers was novel for me, as I never took any classes — with the exception of that class I took to learn how to use the IBM Memory 50 typewriter. I taught myself how to program through reading and doing, sometimes taking dozens of iterations to get a bit of code correct.

In 1987, I was working for a company that made hard drives (Micropolis). Their business was highly seasonal and, on one particular Friday, all the temps got summarily laid off. I was registered with Apple One at the time, which sent me out on engagements and, thanks to my willingness to show up wherever, and whenever, they would offer me a job, I got a call from them on that very Friday, telling me to report to Rocketdyne the following Monday.

By this time I had been shifting my focus from working under the hood to figuring out how best to use the systems and tools that were rapidly evolving as business tools. I was beginning to focus more on business results with whatever was available. My first responsibility at Rocketdyne was to enter text I received from engineers into a document called a Failure Modes and Effects Analysis / Critical Items List (FMEA/CIL). It was in direct support of recertifying the Space Shuttle Main Engine (SSME) for eventual return to flight after the Challenger disaster.

SSME

SSME Hotfire Test

It was a strange task, as the document was clearly text-based, yet we were using a spreadsheet to create it. I suppose it made some sort of sense, as the company was an engineering company and that’s kind of how engineers see the world: in numbers, rather than words.

I also worked with a stress engineer on creating an app (we didn’t use the term back then, but that’s what it was) that could be used to predict crack propagation and its effects. I was unfamiliar with the equations behind the program, but my job was to use dBase II to create an interface that allowed for data input and crunched the numbers. It was fun and was successfully used for some time after that.

One year after joining as a temp (referred to as a “job shopper”) I hired in full-time and began working with the Flight Ops team. It was exciting and I spent much of my time massaging telemetry data from hot fire tests of the SSME. I received flat files from a Perkin-Elmer mainframe and eventually ported the data to Microsoft Access, which allowed for further massaging and reporting.

In October of 1988, a little over eight months after hiring in, the U.S. Space Program returned to flight with the successful launch of Discovery. At a celebratory event that evening I met one of the managers of the Program Office. As we talked and he discovered my background, he offered me a job. I did some research and talked to my then-current managers, who advised me to take it, which I did. As time went on, I moved further away from anything resembling coding and eventually wound up concentrating on the use of software and computing tools to increase my own effectiveness and that of my colleagues.

Not quite 22 years later, I took an early severance package (which was offered to everyone over 60) and retired. I would turn 63 less than a month after leaving the company. In 2015, I returned as a contractor doing something I had done nearly 20 years previously. I spent the next two years (until February 17 of 2017, to be exact) providing program scheduling for two small rocket engine programs.

Last month I turned 70. I recently signed a referral partnership agreement with an organization I worked with a few years ago. They specialize in machine learning (ML), though I was unaware of that back then. My primary responsibility will be selling their services and, when possible, any product they may create. In order to be effective, I am now studying statistics and ML, partly to better understand what it is I’m selling and partly because I’m fascinated by the algorithms that power these efforts.

I do worry that my comprehension is somewhat hampered by, if not the years, the considerable mileage I’ve managed to accumulate. There’s also a minor problem with my “just don’t give a shit” attitude about a lot of things. Nevertheless, I will persist. I intend to share what I’m learning but, as with most things these days, it may be sporadic and somewhat unfocused.

I do believe machine learning is going to drastically change just about everything humans do, and I’m well aware of the disruption it might entail. I don’t, however, believe that to be a showstopper. We’ve opened Pandora’s box and there is no closing it. Let’s roll!


Empathy: The Core of Complex Decisions

Having worked with Dr. Pratt and her company, Quantellia, I have long been convinced their approach to decision making is one of, if not THE, best methodologies I’ve encountered. After what I consider to be one of the most disastrous general elections in my lifetime, it would seem we need help in navigating the complexities of the world and our place in it. Lorien’s work can, I believe, help us understand the consequences of our decisions, before we make them. I urge you to watch this video and become more conversant in the issues Dr. Pratt raises. What follows below the video are some of the “liner notes” that go with her TEDxLivermore talk.

Making decisions based on invisible inputs is like building a skyscraper without a blueprint. Yet that is the norm, even for very complex problems. Contrary to how most of us think about making a decision as being the act of choosing, a decision is the last piece of a long, almost completely invisible, process. The good news: it is possible to make the invisible part of decisions visible.

In working with the Community Justice Advisor Program in Liberia, Africa, Lorien and colleagues helped The Carter Center (founded by Jimmy and Rosalynn Carter) use decision models to increase positive outcomes in the domain of civil justice, by identifying the most effective levers for change.

Using deep learning artificial intelligence, the interconnections between inputs become visible, and unintended consequences can be identified before implementation. Vicious cycles can be reversed, and virtuous cycles of improvement can be built in place and nurtured through intelligent decision metrics.

As co-founder of Quantellia, Dr. Lorien Pratt co-created the decision intelligence methodology and the company’s award-winning World Modeler™ software. She consults and speaks worldwide, and is known for her neural network research and the book Learning to Learn. A former college professor, Pratt is widely known as the former global director of telecommunications research for Stratecast, a division of Frost & Sullivan. A graduate of Dartmouth College and Rutgers University, Pratt holds three degrees in computer science. She received the CAREER award from the National Science Foundation, an innovation award from Microsoft, and is author of dozens of technical papers and articles.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx


Program Management By Ouija Board


Going back to work after nearly five years of “retirement” has been both interesting and instructive. When I was asked if I would be willing to do scheduling, which is something I had done many years ago, I happily said “yes”. I would have probably agreed to almost anything they wanted me to do, as I was anxious to supplement my meager retirement income. Actually, I first learned scheduling software using a mainframe tool called Artemis. Shortly afterward, we were introduced to a PC version of Artemis which, if memory serves, was called Schedule Publisher and, within another very short period, it was spun off into a product from Advanced Management Solutions, called AMS REALTIME Projects.

This was somewhere around 1994 and, at the time, Microsoft Project was comparatively bare-bones and nowhere near as useful (in my opinion at the time) as REALTIME Projects. Having long been a very visual person, I find the visualization provided by Gantt charts particularly useful for seeing how the logic in a schedule affects downstream activities as time, and the work contemplated in the schedule, move forward. Until Project introduced the Timeline view, which allows quick zooming and panning, I was not terribly happy with it compared to the AMS product, which offered a useful timeline capability.

So . . . since I had done scheduling for a few years during the 90s, I readily accepted the challenge and, upon my return on January 19, 2015, I was amused to see the company was still using Project 2002 which, although newer than the version I had struggled with, was still well over a decade old. The main reason for this, I was told, was because a set of macros had been developed over the years that allowed schedules to be matched up with the organization’s earned value management system, which is Deltek MPM.

Unfortunately, using such an old piece of software presented some interesting problems. One of the most egregious, from my point of view, was its inability to run in any of the conference rooms in my building. This was — and still is — due to an IT rule that bars conference room machines from running software more than two versions older than the most current one available. In the case of MS Project, the latest version available when I returned was 2013, and MS had released 2007 and 2010 versions in between, which put the one in widespread use more than two versions behind. As a result, clicking on the tool (which was installed in all the conference rooms) invoked Project but, instead of the tabular data alongside a Gantt chart, all one got was an empty box with a small red “x” in the upper lefthand corner.

In my experience, scheduling is an activity that absolutely must be done collaboratively. A good, useful schedule requires (at the very least) a great deal of understanding of not only the work to be done, but the ways in which the logic of its progression needs to be modeled in order to accurately reflect how downstream activities are impacted by small changes as work progresses . . . and changes are absolutely unavoidable, especially in large, complex projects such as rocket engine design, manufacture, and test.

Since it was impossible to use the tool in a conference room, where I could sit with the Program Manager, one or more Control Account Managers, and various Engineers (Design, Quality, Manufacturing, etc.) developing schedules became somewhat difficult and inordinately iterative, requiring dozens of communications back and forth between me and the Program Manager, as well as others who we needed input from. As work progressed, I was able to get IT to agree to allow me to log into my computer remotely from any one of the conference rooms, which made working on the schedule much easier. However, the resolution in the conference rooms was far less than that available to me on my Dell all-in-one. Its screen is 23″ diagonally, plus I have an extension display that gives me another 19″ off to the side. What I see on screen in conference rooms is not as inclusive as what I normally work with and it takes a bit of adjusting, which cuts into the speed with which I can get things done.

As I both refamiliarize myself with the scheduling process and learn how the tools have advanced, I’m learning a lot about how best to do it. Perhaps more importantly, I’m also learning how little most people know of the power of a good piece of scheduling software. There are people here who still use Excel spreadsheets and date functions to create schedules. Maybe I’m missing something, but MS Project and other similar tools provide not only calendaring functionality, but also the kind of logic necessary to accurately model the interplay between design, quality, procurement, operations, testing, and numerous other ancillary and important processes that make up the entirety of a program.
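For anyone wondering what that “logic” buys you over a date-filled spreadsheet, here’s a toy illustration in Python of a forward pass over finish-to-start dependencies, the kind of computation a scheduling tool performs automatically. The tasks, durations, and links are invented:

```python
# A toy forward pass over finish-to-start dependencies: the kind of logic a
# scheduling tool computes for you, and a date-based spreadsheet cannot.
# Durations are in working days; all values are invented.
durations = {"design": 10, "procure": 15, "build": 20, "test": 8}
predecessors = {"design": [], "procure": ["design"],
                "build": ["design", "procure"], "test": ["build"]}

start, finish = {}, {}
for task in ["design", "procure", "build", "test"]:  # topological order
    start[task] = max((finish[p] for p in predecessors[task]), default=0)
    finish[task] = start[task] + durations[task]

print(finish["test"])  # 53; stretch "procure" to 20 days and this becomes 58
```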

Inasmuch as Project also provides for highly detailed resource loading (quite literally down to the gnat’s ass, if one is so inclined), I’m unclear as to why we don’t use it for at least first-cut proposal activity. Were we to do so, I’m convinced it would not only speed up the initial process of pricing a decent proposal but, when completed, there would be no need to then create a schedule from scratch, which is generally the way it’s done now. I suspect there are some people out there who actually do what I’m suggesting but, for all I know at this point, my perception could be wildly inaccurate.

So . . . I’m kind of hedging my bets and, while I’m agitating for people to consider using MS Project more widely and for deeper resource planning, I’m mostly looking to understand the tool a little more each day. It, like many tools available to organizations of all kinds and sizes, is far more powerful than most individuals understand or are interested in learning. I’m constantly finding myself believing we are crippling ourselves by not using it far more extensively but, as many have pointed out, changing direction in a reasonably large organization, especially one which depends largely on government contracts and oversight, is like turning an aircraft carrier with a canoe paddle. On the bright side, it could keep me working for another decade, the prospect of which does not bother me in the slightest.


Making Sense of All That Data

Deep Data

Transforming Big Data Information into Deep Data Insights

Yesterday I posted a question to several of the groups I belong to on LinkedIn. It was related to several of the things I’m interested and involved in: Systems Thinking, Knowledge Management, and Decision Modeling. It was somewhat informed, as well, by an article appearing in the Huffington Post, where Otto Scharmer, a Senior Lecturer at MIT and founder of the Presencing Institute, talks about the need to make sense of the huge and growing amounts of data we have available to us. He argues the importance of turning from “Big” data, where we mainly look outward in our attempt to understand what it is telling us about markets and our external influence, to “Deep” data, where we begin looking inward to understand what it’s telling us about ourselves and our organizations and how we get things done.

The question I asked was designed to seek out capabilities and functionality people would like to have but that are currently unavailable. My interest is in working with others to understand and, if possible, provide for those needs. I thought I would present the question here as well, where it will remain a part of my online presence and, hopefully, might elicit some useful responses. Here it is:

With the growing proliferation and importance of data — a development at least one author and MIT Lecturer has suggested is moving us from the information technology era to the data technology era — what tools would you like to see become available for handling, understanding, and sharing the new types of information and knowledge this development will bring?

In other words, what would you need that you don’t have today? What types of technology do you think would offer you, your colleagues, and your organizations a greater ability to make use of data to bring about a transformation from primarily siloed, outward looking data to collaborative, inward looking data as well?

I would love to hear of any ideas you might have regarding the kinds of tools or apps you could use to better deal with data by turning it into useful information and knowledge . . . perhaps even a smidgen of understanding and wisdom.


Chasing Earned Value

Recently, I was given the task of writing a short (4 – 5 page) paper on the basics of Earned Value Management (EVM), and why it’s useful for medium to large organizations in managing their projects. The idea was to deal with the “why”, not the “how”. I worked in a large aerospace organization for over two decades and we used EVM extensively. It is, after all, a requirement for all government contractors.

Earned Value Terminology

A Plethora of Acronyms Revealed

Having retired from that industry a little over four years ago, I was a bit rusty. However, you can’t have that stuff drummed into your head without it engraving itself fairly deeply on your consciousness. It didn’t take me long to come back up to speed. In fact, the biggest problem I had was knowing where to stop. EVM is full of acronyms and formulae (BCWS, BCWP, ACWP, SPI, CPI, etc., etc., etc.), all of which I’m fairly certain are useful . . . when used intelligently. As with most things, how valuable they are depends a great deal on what you’re trying to accomplish, how prepared and disciplined you are, and how well you execute over time.
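Since I rattled off the acronyms, a tiny worked example may help. The numbers here are invented, but the formulas are the standard ones:

```python
# The core EVM arithmetic behind those acronyms, with invented numbers.
# BCWS = planned value, BCWP = earned value, ACWP = actual cost of work.
bcws, bcwp, acwp = 100_000, 90_000, 110_000   # dollars, cumulative to date

spi = bcwp / bcws      # Schedule Performance Index: 0.90 -> behind schedule
cpi = bcwp / acwp      # Cost Performance Index: ~0.82 -> over budget
sv = bcwp - bcws       # Schedule Variance: -10,000
cv = bcwp - acwp       # Cost Variance: -20,000

bac = 500_000          # Budget At Completion
eac = bac / cpi        # a common Estimate At Completion: ~611,111
print(f"SPI={spi:.2f} CPI={cpi:.2f} SV={sv} CV={cv} EAC={eac:,.0f}")
```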

Now this brings me to a somewhat vexing problem and the reason I’m sharing this. I could swear there’s a good argument somewhere as to why EV is not a very good method for managing a project. However, when I searched for problems or reasons not to use EV, all I could find were lists of where organizations go wrong because they don’t plan properly, they don’t pay attention to detail, or they don’t use tools as they’re designed to be used.

So I have a question, which I am now going to throw out into the aether. Assuming some who read this actually know about, and have experience with, Earned Value Management and maybe one or more of the systems used to facilitate its proper application, are you aware of any reasons NOT to use EVM and, if so, could you point me to a resource or school me on the subject? Thanks.


Simplicity is the Ultimate Sophistication

 

In an effort to improve my “working out loud” chops, I’m learning from a friend who has begun sharing the text of (not links to) his blog posts on Facebook and LinkedIn, as well as on the blog he’s had for a very long time. <Light Bulb!> This one’s a kind of reverse emulation, as this is something I shared on Facebook first.

I have found an interesting difference of opinion on the subject of simplicity versus complexity, but it seems to hang on which dimension of endeavor we’re looking from. From an engineering design perspective, especially wrt products for the consumer market, there’s evidence complexity (think shiny objects) actually sells better than simplicity.

It seems to me, however, that da Vinci was looking a little deeper than marketing prospects and was more interested in the aesthetics of design . . . all kinds of design.

So . . . I’m thinking of it in terms of this software tool I am now representing, called World Modeler, which is used to model the elements required to make important, and quite likely expensive, organizational decisions. What we (Quantellia, LLC and I) can do is take highly complex decision models (involving numerous decision levers, external factors, intermediate effects, interconnections, and even qualitative assumptions) and graphically (and quite simply) show how they will play out over time given certain values. The goal is to render the complex simple, not to simplify that which is complex.
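By way of illustration only (this is not World Modeler’s actual machinery, just a toy in Python), a decision model of this kind connects levers and external factors through intermediate effects and plays the result forward in time:

```python
# A toy decision model in the spirit of (but not the internals of) World
# Modeler: levers and externals feed intermediate effects, which feed an
# outcome, simulated month by month. All variables and coefficients invented.
def simulate(months, price_lever, marketing_lever, market_growth):
    customers, outcomes = 1000.0, []
    for _ in range(months):
        churn = 0.05 + 0.02 * price_lever            # higher price, more churn
        acquisition = 0.04 + 0.03 * marketing_lever  # intermediate effect
        customers *= (1 - churn) * (1 + acquisition + market_growth)
        outcomes.append(customers * (10 + 5 * price_lever))  # monthly revenue
    return outcomes

# Play two lever settings forward 24 months and compare the trajectories.
low = simulate(24, price_lever=0.0, marketing_lever=1.0, market_growth=0.01)
high = simulate(24, price_lever=1.0, marketing_lever=0.0, market_growth=0.01)
print(round(low[-1]), round(high[-1]))
```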

