Modern Algorithm Analysis




Jerry Potter

Department of Computer Science

Kent State University

Kent, OH 44242



Algorithm analysis as developed in the 1970s needs to be updated. To remain manageable, it stripped the analysis to a minimum, reducing a complex human-computer activity to a model of a single processing element and a memory.


Today, with the very low cost of CPUs and memory, it is becoming clear that the theory needs refinement to accommodate other factors, such as the practical cost of processing very large data sets. While the theoretical distinction between polynomial and exponential growth is important, the constant factor that is often ignored can be critical. Accomplishing a task in 15 minutes is a long but manageable time; an algorithm that takes only 1,000 times as long results in a roughly 10-day wait, which is probably not practical.
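The constant-factor arithmetic above can be checked directly; this small sketch (using the paper's own 15-minute and 1,000x figures) shows how a manageable run time becomes an impractical one:

```python
# A "manageable" 15-minute task, slowed by a constant factor of 1000,
# becomes a wait of roughly 10 days -- same asymptotic class, very
# different practical cost.
base_minutes = 15
constant_factor = 1000

total_minutes = base_minutes * constant_factor   # 15,000 minutes
total_days = total_minutes / (60 * 24)           # minutes -> days

print(f"{total_days:.1f} days")                  # prints "10.4 days"
```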


Another factor that traditional analysis does not take into account is the human cost: that is, how difficult an algorithm is to implement. Many factors enter this analysis, such as which language to use, which operating system to target, and how to compare one algorithm written in C++ with another written in, say, Lisp. Execution time alone is not sufficient for actual industrial applications.


Assume, for example, that algorithm A executes 100 times faster than algorithm B but takes 10 times the effort to implement. That sounds like a winner until you put actual numbers into the equation. If the algorithms execute in 0.0001 and 0.01 seconds but require 400 versus 40 hours to implement, respectively, the 400-hour cost is hard to justify under most circumstances. The high cost of software is a major reason for outsourcing, yet it is not factored into algorithm analysis, programming languages, hardware design, or our educational system.
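One way to make this trade-off concrete is a simple break-even calculation, sketched below with the paragraph's own numbers (0.0001 s vs. 0.01 s per run, 400 vs. 40 implementation hours); the idea of counting executions until the speed advantage repays the extra effort is an illustration, not part of the original analysis:

```python
# How many executions must algorithm A perform before its per-run speed
# advantage pays back its extra implementation cost over algorithm B?
time_a, time_b = 0.0001, 0.01   # seconds per execution
effort_a, effort_b = 400, 40    # implementation effort in hours

extra_seconds = (effort_a - effort_b) * 3600   # 360 extra hours, in seconds
saved_per_run = time_b - time_a                # 0.0099 s saved per run

break_even_runs = extra_seconds / saved_per_run
print(f"{break_even_runs:,.0f} runs")          # roughly 131 million runs
```

Unless the program will be executed hundreds of millions of times, the "faster" algorithm never recovers its implementation cost, which is exactly the point the paragraph makes.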


What factors should be included?

Hardware based components

1) Divide the traditional CPU analysis into instruction-stream and computation components.


Software based components

1) Explicit dynamic memory allocation

2) Overhead activities, e.g., sorting before searching

3) Data organization, often cited as 90% of the programming work

4) Cost of debugging

5) Cost of documentation

6) Ease of readability