What I Learned About SDLC, and What I Didn’t
William Sundwick
In the early ‘80s, I resolved to prepare myself for a second
career, beyond librarianship. I enrolled in a 36-hour master’s degree program
at American University which allegedly would qualify me to enter the growing
field of systems management. The degree was called a Master of Science in the
Technology of Management (MSTM). It came in 1984. I had talked my employer (the
Library of Congress) into granting me a leave of absence, a sabbatical, to pursue this academic endeavor.
The program taught me much about the nature of ADP (Automated Data Processing, an
archaic acronym) and systems management in a variety of economic sectors,
including government. Among the topics covered was software engineering. I
learned that there was something called the System Development Life Cycle (SDLC), which prescribed a series of rules and formulae for the successful development of large software projects; indeed, all
software projects in those days were large, centered around mainframe data
processing shops.
The SDLC had five basic phases: 1) requirements analysis, 2)
design, 3) implementation, 4) testing, and 5) production. Not too different
from any other engineering methodology. The difference with software
development, as we were beginning to see in the early ‘80s, was that technology
was racing ahead so fast that the entire cycle needed to be compressed into something of much shorter duration if a systems-dependent organization was to remain agile.
Alas, my newly minted MSTM degree was insufficient to escape
ten years of vested library experience. There would be no start of a new career
for me in the “beltway bandit” sector, among the software development firms popping up, especially in Northern Virginia. I returned to my old job at LoC, gradually becoming
an advocate for relevant “user-centered” systems development. This role did
bring me into contact with other agency minds working in similar directions,
and eventually with decision makers, both in my user shop and the central ADP
directorate of the Library.
We made a strong case in those days for greater user
participation in both requirements analysis and design of new systems.
Technology aided this approach – the Library deployed networked desktop
computers widely in the ‘90s. Such technology empowered users and facilitated
the implementation phase of new systems. Testing was carried out via incremental
implementation – first one group of users, iron out the bugs, then another
group, etc.
Problems arose mostly in the production phase. Things broke.
And, soon, user requirements changed. Systems were retired, and users were forced to migrate to something new. Eventually, we discovered that “off-the-shelf”
software worked better than anything the local DP shop could create. That SDLC transformed itself into: 1) shopping for the best commercial product, 2) mastering its user interface (UI), 3) buying enough copies for all potential users, 4) providing “user support” to get the most from the product, and 5) rinse and repeat, until it was time to migrate to a new product. This worked through the nineties and into the aughts.
Then, technology changed again. The proliferation of commercial packages brought security vulnerabilities – a serious matter in the federal government. Internet connectivity was the villain. But, everybody needed Internet connectivity.
Enter the world of cybersecurity. What used to be empowering
was now constraining. Users started complaining. FIPS regs (Federal Information Processing Standards) added security to the body of rules required for all federal agencies. New rules affected the design of software, and access to it. Users became unhappy with new restrictions on their activity. I became a policeman.
There was still “high level” requirements analysis and design, but only at an abstract level. I lost much of the practical UI expertise
I had spent the last several years accumulating. I responded by inventing
situations where I could try to convince folks that a new design was needed,
hoping for some systems analysis opportunity. Not much success there, despite finally landing a new job (and a promotion) with a position description for an official IT specialist (USCS 2210 series) rather than a librarian (USCS 1410 series).
My job became boring. Deployment (i.e., buying stuff) began
to consume more of my time. As old users retired, new (younger) users came on
board. They needed less “user support,” except for the policeman variety, telling
them why they couldn’t do what they wanted!
I retired in 2015. The
good news is that I was quickly replaced. Good news, because it signaled that
my boss, at least, prioritized my role. But, only since retirement have I
learned that there is a move afoot to modify that SDLC methodology. The new
thing is called Agile System
Development (ASD), and the design of UI is now known as designing “user
experience” (UX). Much of this is commercial hype and may not spread
very fast in the federal government. But it is interesting, nevertheless.
ASD emphasizes iterative requirements analysis. The
design team meets with users many times over a series of “sprints” and “scrums.” Design occurs simultaneously with the requirements phase – and focuses on UX, which cascades into UI design through both implementation
and testing. User acceptance testing, rather than coming at the end, as in
traditional SDLC, now comes at the beginning. It is UX testing. Prototyping
has become much more routine, thanks to wireframe
and mock-up tools for developers (yes, off-the-shelf products – I
even downloaded a free one, Adobe XD).
Also, mobile app development (iOS and Android) is now the dominant part of the consumer market. It’s cheap and easy, and it’s where many developers have landed.
What does this new methodology say about career prospects
for young systems developers? Nothing good, I’m afraid. The old-line software
engineers maintain their barriers to entry, earning their salaries mostly by
dealing with security threats. The graphics
designer (UX/UI) is relegated mostly to a subordinate role –
possibly being hired by a start-up consultancy doing design work for business
clients who don’t want the overhead of a dedicated design staff. But, job
hopping is a fact of life for younger workers these days, and there are
certainly enough of the new UX/UI consultancies
around to choose from.
My question as a software consumer now, schooled in the “old
ways,” is this: isn’t the traditional software engineering SDLC still the best
paradigm for all engineering projects? Even with new bells and whistles
like “sprints” and “scrums” and multiple iterations? It’s still the way things
get done in real time – even as real time becomes more condensed. Nobody ever
argued, even in my day, that requirements analysis was not the key to
successful design, deployment, and QA (Quality Assurance) testing. And having designers (or analysts) work closely with users was always standard procedure on every project I remember.
Conclusion: it’s the skills of those designers and analysts that make the difference, not the methodology!
I have received a comment from a young professional trained in ASD – it underlines my totally obsolete view of software development. That may have to do mostly with my long career in the federal government. Apparently, the private sector, at least its fast-moving parts, is much better now! User acceptance testing in these fields has completely REPLACED QA testing, says this source. "If it breaks, who cares?" ... sounds very radical to me!