As I’ve noted previously, I am working on a couple of memoirs and my autobiography. In doing so, I’ve been conducting a bit of archaeological research on my two current computers’ contents. I have a PC laptop and an iMac. The laptop is going on three years old and the Mac was purchased around June of 2010, right after I retired from Rocketdyne, though it crapped out while it was still under warranty, and the CPU and most of the other components were replaced with those of a newer model.
Something I hadn’t been thinking about much was that I had moved onto these machines all of my personal files from my years at Rocketdyne, as well as a lot of writing I did while I was there that isn’t worth the company’s energy to claim as protected IP. At any rate, I’m encountering things I had long forgotten existed, and I’d like to share some of them.
This is a press release I’m pretty sure I wrote tongue-in-cheek, but I’m not sure what happened with it. It was, if the file metadata is correct, written in early February of 2006, a little over 14.5 years ago. I can’t recall the last time I read a physical copy of the L.A. Times.
For Immediate Release
In an amazing display of ineptness and communications failure, and for the third time in almost as many weeks, the Los Angeles Times’ home delivery department, Ventura County division, on Sunday, February 5, completely mismanaged the delivery of the Times Sunday edition to the home of a Simi Valley family.
For years, this weekend edition, complete with both the opinion section and numerous advertisements and coupons, has been delivered to the Ladd family double wrapped in plastic and sealed to protect it from being soaked by the sprinkler system which, unfortunately, drains water in the exact location where the paper seems to be most conveniently placed by the L.A. Times’ intrepid delivery person.
Approximately four to five weeks ago, and without any explanation or reason which would be immediately apparent to the Ladds, the paper started being delivered in a single, unsealed plastic bag. This difference, however, was not matched by a change in the location where the paper was placed and, the laws of physics and water being what they are, the paper wicked up enough liquid to add several pounds to its weight. As a side effect, it made reading the articles and advertisements contained in the Times virtually impossible.
Up until the 5th of February, subsequent to calls to the Times’ Customer Service automated telephone number, a new paper has twice been delivered within the promised 90 minutes. The last time brought an apology and a promise to see the paper was sufficiently wrapped and it was, in fact, delivered dry on January 29th. However, the following week, on February 5, the paper was once again single wrapped, and soaking wet by the time it was retrieved.
Richard Ladd immediately called the Times’ Customer Service automated telephone number, once again pressing the button to inform the electronic system that there was, indeed, a delivery problem involving an automatic sprinkler system and a wet newspaper. He then entered his phone number and street address, and was informed a new paper would be delivered within 90 minutes.
As of midnight, at the beginning of a new week, the Sunday edition of the Los Angeles Times had not been delivered to the Ladd family, causing them to wonder if they shouldn’t just throw in the towel and cancel their subscription, opting instead to read the paper (assuming they even care any longer) on the Internet, and either celebrating or bemoaning (they are currently not quite sure which it should be) the continuing slide of print media into oblivion.
When I was in High School (1962 – 1966; it took me three and a half years to escape) there were no computer classes. Although there seems to be some disagreement on when the first personal computer was invented, even the earliest claim puts the date six years after I graduated. There was no such thing as a computer class, let alone a programming or coding class. Also, I did not attend university and, to my recollection, nobody I knew at the time was interested in computers or information technology. Actually, I was a terrible student and wasn’t interested in much of anything by the time I finished High School.
The IBM Memory 50 Typewriter
Fast forward to 1974. Despite having no undergraduate education, I was able to secure admission to an accredited Law School, located not far from my home in the San Fernando Valley. I began in the fall of 1973 and the following year I managed to get a job in the law office of a sole practitioner in Beverly Hills as a legal secretary/law clerk. Shortly after I began, the lawyer I worked for purchased an IBM Memory 50 Typewriter. I attended a one-day class where I learned how to use it. This was my first introduction to anything resembling “computing.”
The office later upgraded to an Artec Display 2000, which had an LED readout of approximately 30 characters. There was no CRT display. It used two 8″ floppy disks and had a document/data merge capability that made it perfect for boilerplate documents such as pleadings, interrogatories, and summonses. It was a great leap forward in word processing.
The Family’s Wholesale Food Business
Shortly after graduating from law school I concluded, for numerous reasons, that spending the rest of my working life around the judicial system was not something I really had my heart in and, after much gnashing of teeth and going over my alternatives, I decided to join my family’s wholesale food distribution business. One large factor in making this determination was my father suffering his second major heart attack. The business was supporting my mother and my sister, who was only 10 years old at the time. I felt the need to help the business grow, ensuring they would be taken care of if my father were to die . . . which he did eight years later.
Our company jackets. Logo design by me, jackets created by Cat’s Pyjamas.
After a couple of years, the business had grown substantially and, given my desire for another type of challenge, I once again struck off on my own. I dabbled in a few things, then joined forces with a couple of CPAs and formed a royalty auditing business, serving some very high-end artists. The company first purchased an Apple computer (I can’t recall if it was a II or a IIe but, based on the release dates of the two, I’m inclined to think it was a II). We later purchased a Northstar Advantage, which used the CP/M OS and two 160 KB, 5.25″ floppy disks. We also purchased a dot matrix printer and, in anticipation of taking the system out on the road, we had Anvil make a hardened case for the two, with room for cabling, paper, and instructions to be packed inside.
At that point our audits required us to visit the artists’ recording companies, and my first visit was to RCA records in the Meadowlands of New Jersey. Standard procedure for the record company was to stick us somewhere that was relatively uncomfortable, then bring us stacks of paper, which we then transferred to ledger pages. Upon returning to our office in Playa del Rey, we would then have to transfer all the data to a spreadsheet; we were using SuperCalc on the Northstar Advantage, though we had started with VisiCalc on the Apple.
I suggested taking the computer with us when we performed audits, so the people who went out on the road could enter the numbers they received directly into an electronic spreadsheet, thereby saving a huge amount of time and stress. We were also using WordStar at the time for writing the narratives that would accompany our audit analysis.
My first experience with programming came when we were contemplating taking the system out on a European tour with Neil Young. I sat with my friend and partner, who had performed many a box office reconciliation, and we sketched out the different scenarios that were used to close out the night’s receipts. Doing so required the use of nested “if” statements, which determined the precise equation to use for each venue. Unfortunately, that same friend who had worked so diligently with me to create the formulae that would power the spreadsheet never felt comfortable with using it by himself and it never went out on the road.
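Those box office formulas are long gone, but the nested-if idea can be sketched in modern terms. Everything in this Python sketch – the guarantee, the percentage split, the expenses – is a hypothetical stand-in, not the actual terms of any tour we audited:

```python
def settle_night(gross, guarantee, split, expenses):
    """Pick the payout formula for one night's receipts.

    Sketch of a common "versus" deal: the artist receives the larger
    of a flat guarantee or a percentage of net receipts (gross minus
    documented expenses). All deal terms here are hypothetical.
    """
    net = gross - expenses
    if net <= 0:
        # Nothing left after expenses; the guarantee still applies.
        payout = guarantee
    else:
        share = net * split
        if share > guarantee:
            payout = share
        else:
            payout = guarantee
    return round(payout, 2)
```

A spreadsheet expresses the same thing as one nested IF per venue; each venue’s deal simply changed which branch fired.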
My Very First Computer, the Sinclair ZX81
It was also around this time I purchased a Sinclair ZX81, which was a small computer that had a membrane keyboard and used a cassette recorder to save programs on. It also had its own OS, as well as its own version of Basic, which I endeavored to learn. The first program I wrote, which took me all night to complete, counted down from 10 to 0, in the center of the screen. It then plotted a single pixel (resolution was 64 x 48) at a time, starting from the bottom and, after reaching a height of six pixels, began plotting another pixel above the previous six and erasing a pixel from the bottom of the stack, until it left the screen at the top. This required me to learn how to use either (I don’t recall the exact commands; it’s only been a little over thirty-five years) nested “if” statements or “do while” commands.
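The original ZX81 BASIC is long gone, but the logic of that all-nighter can be sketched in Python. The 48-row height comes from the ZX81’s 64 x 48 plot grid mentioned above; the rest – a six-pixel column that climbs by plotting one pixel on top while erasing one from the bottom – is reconstructed from memory:

```python
HEIGHT, COLUMN = 48, 6  # ZX81 plot grid is 64 x 48; the stack is 6 pixels tall

def frames(height=HEIGHT, column=COLUMN):
    """Yield the lit rows (0 = bottom of screen) after each step.

    The stack grows from the bottom; once it is `column` pixels tall,
    each step plots one pixel above it and erases one below, so the
    column climbs until it leaves the screen at the top.
    """
    for step in range(height + column):
        low = max(0, step - column + 1)   # lowest still-lit row
        high = min(step, height - 1)      # newest plotted row
        yield list(range(low, high + 1))  # empty once off-screen
```

The 10-to-0 countdown that preceded the animation is just a loop with a print statement, so I’ve left it out.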
Fast forward to 1984, the year my father died. Shortly afterward, I returned to help my brother keep the business going. We purchased a more advanced Northstar Advantage, which had a huge hard disk that could store 5 MB of data! At the time, we also purchased a copy of dBase II, which was one of the first database systems written for microcomputers. I taught myself how to write systems using its programming language, composing the code in WordStar. I wrote an entire accounting system for the business. My favorite component was the preparation of a deposit ticket, where I laboriously emulated the workings of a calculator. Allowing for numerous methods of inputting dollars and cents (whether or not a decimal point was included) was the real differentiator and sticking point for me but, after much trial and error, I figured it out.
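The dBase II source is long gone, but the input rule that gave me so much trouble can be sketched in Python. I’m assuming the adding-machine convention here: an entry without a decimal point implies two decimal places, so keying 1234 means $12.34:

```python
from decimal import Decimal

def parse_amount(entry):
    """Parse a keyed dollar amount, calculator style (a sketch of the
    rule, not the original dBase II code).

    An entry containing a decimal point is taken literally; one
    without a decimal point implies two decimal places, as on a
    ten-key adding machine in "add mode".
    """
    entry = entry.strip().replace(",", "").lstrip("$")
    if "." in entry:
        return Decimal(entry)
    return Decimal(entry) / 100
```

So `parse_amount("1234")` and `parse_amount("12.34")` both come out to $12.34, which is the kind of flexibility the deposit ticket screen needed.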
Unfortunately, my brother and I didn’t see eye-to-eye on the direction the business should go in and, after a while I left again, this time taking temporary jobs to keep me afloat. It was during this time I worked for a while at a litigation support firm that used a DEC minicomputer and several of the earliest versions of the Macintosh. All of my work with computers was novel for me, as I never took any classes — with the exception of that class I took to learn how to use the IBM Memory 50 typewriter. I taught myself how to program through reading and doing, sometimes taking dozens of iterations to get a bit of code correct.
In 1987, I had been working for a company that made hard drives (Micropolis). Their business was highly seasonal and, on one particular Friday, all the temps got summarily laid off. I was registered with the Apple One temp agency at the time, which sent me out on engagements and, thanks to my willingness to show up wherever, and whenever, they would offer me a job, I got a call from them on that very Friday, telling me to report to Rocketdyne the following Monday.
By this time I had been shifting my focus from working under the hood, to figuring out how to best use the systems and tools that were rapidly evolving as business tools. I was beginning to focus more on business results with whatever was available. My first responsibility at Rocketdyne was to enter text I received from Engineers into a document called a Failure Modes and Effects Analysis / Critical Items List (FMEA/CIL). It was in direct support of recertifying the Space Shuttle Main Engine (SSME) for eventual return to flight after the Challenger disaster.
SSME Hotfire Test
It was a strange task, as the document was clearly text-based, yet we were using a spreadsheet to create it. I suppose it made some sort of sense, as the company was an engineering company and that’s kind of how engineers see the world; in numbers, rather than words.
I also worked with a stress engineer on creating an app (we didn’t use the term back then, but that’s what it was) that could be used to predict crack propagation and its effects. I was unfamiliar with the equations behind the program, but my job was to use dBase II to create an interface that allowed for data input and crunched the numbers. It was fun and was successfully used for some time after that.
One year after joining as a temp (referred to as a “job shopper”) I hired in full-time and began working with the Flight Ops team. It was exciting and I spent much of my time massaging telemetry data from hot fire tests of the SSME. I received flat files from a Perkin-Elmer mainframe and eventually ported the data to Microsoft Access, which allowed for further massaging and reporting.
In October of 1988, a little over eight months after hiring in, the U.S. Space Program returned to flight with the successful launch of Discovery. At a celebratory event that evening I met one of the managers of the Program Office. As we talked and he discovered my background, he offered me a job. I did some research and talked to my current managers, who advised me to take it, which I did. As time went on, I moved further away from anything resembling coding and, eventually, wound up concentrating on the use of software and computing tools to increase the effectiveness of my colleagues and myself.
Not quite 22 years later, I took an early severance package (which was offered to everyone over 60) and retired. I would turn 63 less than a month after leaving the company. In 2015, I returned as a contractor doing something I had done nearly 20 years previously. I spent the next two years (until February 17 of 2017, to be exact) providing program scheduling for two small rocket engine programs.
Last month I turned 70. I recently signed a referral partnership agreement with an organization I worked with a few years ago. They specialize in machine learning (ML) though I was unaware of that back then. My primary responsibility will be selling their services and, when possible, any product they may create. In order to be effective, I am now studying statistics and ML, partly to better understand what it is I’m selling and partly because I’m fascinated by the algorithms that power these efforts.
I do worry that my comprehension is somewhat hampered by, if not the years, the considerable mileage I’ve managed to accumulate. There’s also a minor problem with my “just don’t give a shit” attitude about a lot of things. Nevertheless, I will persist. I intend to share what I’m learning but, as with most things these days, it may be sporadic and somewhat unfocused.
I do believe machine learning is going to drastically change just about everything humans do and I’m well aware of the disruption it might entail. I don’t, however, believe that to be a show stopper. We’ve opened Pandora’s box and there is no closing it. Let’s roll!
Do you remember the postscript? You know, that extra thought preceded by a PS, usually appearing after the signature in a letter. I’ve come to the realization postscripts are a thing of the past, a relic of the days in which we would actually write letters, cards, and notes and send them to others. When using pen and ink, one had no choice but to put an afterthought in a postscript. The computer has put an end to that. Regardless of the medium, any afterthought you have can easily be inserted in the body of the main message prior to sending. Even when instant messaging or texting, there’s no longer a need for what used to be the fairly ubiquitous PS (sometimes even a PPS). Just keep adding to the thread.
This came to me the other day when, after posting something to Facebook, I realized I wanted to add another thought. Of course, it was too late to edit the original post, but I was able to comment on my own post, which is exactly what I did. In fact, I even preceded the comment with a “PS”. It dawned on me this wasn’t quite the same usage as those of us who can remember actual written communication were used to. In those days, if you didn’t include the PS, you were forever barred from adding the afterthought – and let’s not forget that commenting, texting, etc. are virtually instantaneous.
I have no clear idea how this affects our ability to communicate, though I suspect it’s an improvement in clarity of thought. Given some of the lamentations I’ve read over the decline of the English language and proper grammar, spelling, and punctuation in today’s rapid-fire communications, I assume there are those who would disagree with me. Nevertheless, that’s my story.
Now that I’ve had a little while to work with my new iMac, I’m beginning to come down from the techno-induced stupor I’ve been in and am thinking about what this all means to me. I’ve also been thinking about what it should mean for many people who work in corporate America, where I have been laboring for the past two decades and more.
Let me explain what I’m getting at. From the first day I started working at what was then Rockwell International’s Rocketdyne division (formerly North American Rockwell), I was stuck using technology that was already a little behind the eight-ball. Back then (1987) there wasn’t much in the way of personal computers, but they were developing rapidly. I went from an IBM 8086 to an 8088 to an AT and, finally, to Windows and on and on. As time wore on, the state-of-the-artiness of the technology I had available at work, unfortunately, fell further and further behind.
Now, this isn’t about the battle that took place between IT (formerly MIS) and Engineering for many years, and how it affected the development of the first LAN in the company (hint – it wasn’t pretty), but rather about the level of security and, perhaps, paranoia that built up over the years with respect to the use of computing resources.
Part of the problem for my line of work was the very real issue of the International Traffic in Arms Regulations (ITAR) which, sometime after we were purchased by the Boeing Company, was painfully and expensively learned after an inadvertent and ignorant violation of the Regs (another story this really isn’t about). This lesson required some education and was fairly easily addressed once understood.
I think I need to throw in a caveat here. I am not an IT person. I have absolutely no formal IT education. I am merely a business person who has worked with (mostly) micro-computers – now called PCs – for close to thirty-five years. I have participated in or led efforts in knowledge management and Enterprise 2.0 for Pratt & Whitney Rocketdyne, and I was instrumental in bringing in our first web-based social system over 7.5 years ago. I have also been the project manager for that terribly under-used application all this time as well. My point here is I may not use language that’s accurate, but I know the kinds of functionality available and I know all of it is – from a corporate point of view – there to serve the business.
What I’m concerned with is the application of a one-size-fits-all mentality to the provision of information technology to a company’s workforce, as well as the imposition of blanket security regulations that serve to cripple an organization’s ability to keep abreast of developments in that same technology. This becomes increasingly important as more capability moves out into the cloud (this includes micro-cloud environments, i.e., inside-the-firewall capability that utilizes cloud-like architecture).
I have tried to argue, to no avail – I’m sure others will recognize this particular kind of frustration – for the identification of power users who could be provided with, for lack of a better term, beta capabilities they would exercise and learn about. These people would provide a cadre of workers who are constantly looking at new ways to improve communication, collaboration, and findability – people whose job, in part, is to find newer and better ways to get things done. In my eyes, this is a no-brainer, and I have to say, with the speed things are changing nowadays, I think this kind of approach is even more important.
I recognize it is difficult to get large organizations to move rapidly. One doesn’t turn a battleship on a dime. Nevertheless, it is conceivable to me (much more so now than a decade ago) that a small group of people could help any organization understand – at the very least – how work gets done, how workers are communicating and collaborating with each other across various boundaries, and how knowledge is being shared in a timely and useful fashion. I also think, daring as it may seem to some, that paying attention to – and preparing to learn from – the processes that are changing the way we do these things can position a company competitively to be a player, rather than an also-ran. I’m quite certain failing to do so leaves you with the situation I grew used to: a company with computing resources and experience years behind the state of the art. In marketplaces where this can change dramatically in under a year, I think that’s unconscionable.
Have any of you experienced this situation? Does it resonate at all? Am I totally off-base or do you think this would be a viable approach for large organizations to engage in?
Since my retirement from Pratt & Whitney Rocketdyne in 2010, I have spent quite a bit of energy developing work as a social media marketer for small businesses, a business manager for an AI software development firm, and an editor/proofreader for a number of business books and a couple of novels, along with a two-year return engagement at Rocketdyne from 2015 to 2017.
I have decided to stop actively pursuing business in these fields and am now positioning myself to be a writer. I have done quite a bit of writing over the years, but I’ve never really attempted to make any money at it; at least not specifically. I’m starting out with a couple of memoirs and, currently, I’m studying the craft, creating a detailed outline and timeline, and honing my skills as a storyteller. Pretty sure I’ll be writing some fiction as well.
The views expressed herein are those of the author. Any opinions regarding the value or worth of particular business processes, tools, or procedures, whether at his former place of employment, at a current client's enterprise, or in general, are his responsibility alone.