For two years at Bell Labs, I was the Supervisor of the NETS Requirements Group. The Nationwide Emergency Telecommunications System (NETS) was a research activity contracted by the US Federal Government to evaluate the survivability of national telecommunications assets against various threats, primarily nuclear war. I supervised a number of activities: network data consolidation and analysis, network design and simulation, and the development of hardening and routing requirements for network equipment. In my first year, we wrote a lot of software to process and validate data, to map the physical and logical structure of the network, to estimate damage, and to simulate the performance of the network in its damaged state. We also developed new design rules and routing tables to extract the maximum performance from the surviving assets.
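To give a flavor of that kind of work, here is a minimal sketch in C of the general idea behind a damage simulation: remove nodes from a network and measure how much point-to-point connectivity survives. This is purely illustrative; the topology, node count, and metric here are my own hypothetical choices, not the actual NETS code or data.

/* Illustrative sketch only: knock out nodes and count how many
 * surviving node pairs can still reach each other. */
#include <stdio.h>

#define N 6                     /* hypothetical node count */

static int adj[N][N];           /* adjacency matrix */
static int dead[N];             /* 1 if the node was destroyed */

/* breadth-first search restricted to surviving nodes */
static int connected(int src, int dst) {
    int queue[N], head = 0, tail = 0, seen[N] = {0};
    if (dead[src] || dead[dst]) return 0;
    queue[tail++] = src; seen[src] = 1;
    while (head < tail) {
        int u = queue[head++];
        if (u == dst) return 1;
        for (int v = 0; v < N; v++)
            if (adj[u][v] && !dead[v] && !seen[v]) {
                seen[v] = 1;
                queue[tail++] = v;
            }
    }
    return 0;
}

int main(void) {
    /* hypothetical ring topology with one chord for diversity */
    int edges[][2] = { {0,1},{1,2},{2,3},{3,4},{4,5},{5,0},{1,4} };
    for (size_t i = 0; i < sizeof edges / sizeof edges[0]; i++) {
        adj[edges[i][0]][edges[i][1]] = 1;
        adj[edges[i][1]][edges[i][0]] = 1;
    }
    dead[2] = 1;                /* simulate the loss of node 2 */

    int pairs = 0, ok = 0;
    for (int i = 0; i < N; i++)
        for (int j = i + 1; j < N; j++)
            if (!dead[i] && !dead[j]) {
                pairs++;
                ok += connected(i, j);
            }
    printf("surviving pairs still connected: %d of %d\n", ok, pairs);
    return 0;
}

Running a sweep of such scenarios over candidate topologies and routing rules is one way design rules like the ones mentioned above could be evaluated; the real analysis was, of course, far richer than this toy.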
When the contract came up for renewal, I negotiated with the government to deliver all the design, simulation, and visualization software as a stable product that non-research users could use to evaluate and enhance the survivability of other networks. I later learned that the final development cost was about 10x what I had estimated. Part of this was due to creeping features, negotiated when the professional software people took over; part was due to the difference in effectiveness between the average software developer and my best people (e.g., a PhD in theoretical physics who was brilliant at both the concepts and the execution of network design); and part was my own ignorance of what it takes to create a software product versus a collection of research algorithms. Also, everything we had done was on a dedicated Unix platform using shell scripts and C programs. We had built our own data structures from scratch; there were no stacks, frameworks, or significant third-party software we could leverage, and no structured programming to maximize reuse. We did a lot of our testing and debugging the old-fashioned way... bit by bit and byte by byte.
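"Bit by bit and byte by byte" meant writing your own inspection tools. Something like the following hex-dump routine is the kind of ad hoc helper one wrote in C in that era when there was no framework to lean on; this is my own illustrative reconstruction, not code from the project.

/* Illustrative sketch: dump a buffer byte by byte, hex plus
 * printable characters, the way we inspected data by hand. */
#include <stdio.h>
#include <ctype.h>

static void hexdump(const unsigned char *buf, size_t len) {
    for (size_t i = 0; i < len; i += 16) {
        printf("%06zx  ", i);
        for (size_t j = i; j < i + 16; j++)
            if (j < len) printf("%02x ", buf[j]);
            else         printf("   ");
        printf(" ");
        for (size_t j = i; j < i + 16 && j < len; j++)
            putchar(isprint(buf[j]) ? buf[j] : '.');
        putchar('\n');
    }
}

int main(void) {
    const char sample[] = "network record\x01\x02\x03";  /* hypothetical data */
    hexdump((const unsigned char *)sample, sizeof sample);
    return 0;
}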
After I left Bell Labs, I became an individual contributor again at Ameritech Science and Technology, where I evaluated technology and designed access and transport networks. Most of my work was done in spreadsheets on Macintosh computers. I became a power user of Wingz, which was for a time more powerful and graphically sophisticated than Excel. I was able to characterize the technology, design the network, run all the economics and sensitivities, and develop the management presentation on one platform. The efficiency and effectiveness of this platform, which combined database, scripting, and graphical capabilities, had an impact on my attitude toward third-party software.
Later, still at Ameritech, I became very interested in R&D focused on improving software productivity. I saw the technology trends in software productivity as one of the primary challenges of technology management, especially for the telephone operating companies, which were stuck in a rat's maze of interlocking, inflexible, and expensive operational support systems. At that time, structured programming showed promise but had not yet delivered on it. Development frameworks were in their infancy, and almost all protocol stacks had to be laboriously built and maintained for each development project.
This takes me up to the mid-1990s. What are your memories of software processes and structures from this period?