Earlier, one of the veterans from the company's technical department mentioned that the programmers the company used to hire were very good. They could churn out programs in the shortest possible time, sometimes in a day, other times in a week, depending on the scope. And I don't question that. It's her own subjective perspective, that of someone responsible for getting the program to the client as soon as possible and, if possible, beating the deadline. I, on the other hand, have seen their code, and I'm saying it's a mess. Of course, this is my own subjective perspective, that of a systems developer who has to plow through their thousands of lines of code and make sense of what I'm reading. I don't mean that the code is unreadable and therefore impossible to understand, just that it's ugly to look at and takes some time before one understands how it works. What I can't fathom is where they found the patience to read and maintain programs thousands of lines long, without functions, read from top to bottom, without getting exasperated. Oh yeah, I forgot: they've already left. And the programs were written in Perl, a language I know nothing about and, by the looks of it, one I should be studying.
So anyway, this led me to compare the software development life cycle I learned (and touted) in school with the life cycles actually applied in developing systems in companies (in my current company, anyway). It made me think about planned development versus speedy, "instantaneous" development. I majored in Software Engineering in college, so I took courses such as advanced software engineering, where I learned UML and the need to document. We looked for a prospective client, developed a system according to their needs and requirements, tested each module, integrated the modules, performed integration testing, and produced the necessary paperwork: overview, background, diagrams, testing results, and a user manual. The biggest difference between school and real-world development is the timeline. In school, when one starts on a project, before anything else, the professor asks for a timeline. This timeline is broken down into the specific stages of software development: requirements definition, design and planning, implementation, testing, modification, integration, testing again, modification again, testing again... it's a cycle, and it just recurses. Usually, the recursive process of testing and modifying until one deems the software passable takes at the very least a week. The requirements definition phase also needs a significant amount of time, unless one is already familiar with the system to be developed. Design and planning take a week too, at the very least (unless, of course, the program is really very simple). Implementation requires a considerable amount of time, and so do testing and modification. And even when the program has already been rolled out, bugs are bound to surface, and it's back to the developer's table for fine-tuning. A different version is rolled out, and the process repeats.
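That test-modify-test loop can be sketched as a toy model. Every name here (run_tests, apply_fixes, stabilize) is a hypothetical illustration for this post, not any real framework's API:

```python
# A toy model of the test -> modify -> test cycle, assuming (unrealistically)
# that each round of modification fixes exactly one reported defect.

def run_tests(build):
    """Pretend test runner: report the known defects not yet fixed."""
    return [d for d in build["defects"] if d not in build["fixed"]]

def apply_fixes(build, failures):
    """Pretend modification step: fix one reported defect per round."""
    build["fixed"].add(failures[0])

def stabilize(build):
    """Loop test -> modify -> test until the test run comes back clean."""
    rounds = 0
    while (failures := run_tests(build)):
        apply_fixes(build, failures)
        rounds += 1
    return rounds

build = {"defects": {"off-by-one", "null-deref", "bad-format"}, "fixed": set()}
print(stabilize(build))  # prints 3: one round per defect in this toy model
```

The point of the sketch is only that the number of rounds, and therefore the time the cycle takes, depends on how many defects surface, which is exactly why this phase can't be compressed on demand.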
Working in the real world, specifically in the third world, SDLCs are but a dream*. It's an effective framework, but I pretty much have no use for it for now. Requirements definition? A new requirement gets added every now and then, even before the initial release of the program has been rolled out. Planning and design? One does that by oneself, while coding. So now we have done away with the two initial stages. Of course, no one skips implementation; it's actually the first step. Testing and debugging? Not enough time for that. The program gets rolled out, and errors are bound to crop up and accumulate. What can you expect from a spur-of-the-moment, not-even-well-thought-out program? The output? I don't even want to know. To begin with, I was given a program to finish that would generate a report for different users on different servers, and there was no timeline, just a deadline. On top of that, there wasn't any decent documentation. The documentation forwarded to me for this specific report didn't indicate the data source(s). I asked the system analyst about this, and apparently the system analyst didn't have any idea where the developers were sourcing their data from either. There was no mention of design, no one asked me how I was going to go about generating it, I didn't know the particulars of the business process, and I had no idea how the users were going to use it. I didn't know a lot of things. The only thing I knew, aside from some formulas to use, was the deadline.
In effect, this got me wondering how much the company was spending on maintenance. Every day, my inbox is flooded with patches. These patches are the company's way of correcting the bugs we never seem to run out of. They call it fire-fighting. It's a boring, exhausting maintenance job. How much would it cost a company to invest in a requirements definition phase, a design and planning phase, and a testing phase, as opposed to going straight to implementation and roll-out and then dealing with all the bugs that are sure to crop up afterwards? Of course, maintenance is here to stay. But how much less maintenance would there be if those initial stages were actually followed? Questions like these make me want to measure and monitor the company's spending on maintenance versus the cost of initial deployment. Questions like these make me want to go back to data integration and business analytics. But more than these questions, realities like this make me want to abandon software development altogether.
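The comparison I'd want to measure can be written as back-of-envelope arithmetic. All the numbers below are made up for illustration (say, person-hours); the real figures would have to come from the company's own records:

```python
# Hypothetical cost comparison: pay for the initial SDLC phases up front,
# or skip them and pay per patch afterwards. Invented numbers throughout.

def total_cost(upfront_phases, bugs_shipped, cost_per_patch):
    """Total = upfront phase work + fire-fighting after roll-out."""
    return sum(upfront_phases) + bugs_shipped * cost_per_patch

# "Instantaneous" development: no requirements, no design, just coding.
rushed = total_cost(upfront_phases=[0, 0, 40], bugs_shipped=120, cost_per_patch=5)

# Planned development: requirements + design + implementation, fewer defects shipped.
planned = total_cost(upfront_phases=[20, 20, 60], bugs_shipped=25, cost_per_patch=5)

print(rushed, planned)  # prints 640 225: under these assumed numbers, planning wins
```

The model is crude on purpose: the whole question is what bugs_shipped and cost_per_patch actually are, and nobody is measuring them.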
* I'm sure outsourcing firms are in (much, much, much, way, way, way) better shape than ours. I am not currently working for an IT company, but one still wishes that certain standards and processes were followed.