Programming Tutorials and Interview Questions

An algorithm is defined as a step-by-step procedure or method for solving a problem by a computer in a finite number of steps. The steps of an algorithm may include branching or repetition, depending on the problem the algorithm is being developed for. When defining an algorithm, the steps are written in human-understandable language, independent of any programming language; we can then implement it in any programming language of our choice.
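To make this concrete, here is a minimal sketch (the function name and example values are ours, chosen for illustration): a step-by-step procedure for finding the largest number in a list, first described in plain language, then implemented in Python.

```python
def find_max(numbers):
    """Return the largest value in a non-empty list.

    Plain-language steps:
      1. Take the first element as the current largest.
      2. Compare each remaining element against it.
      3. Whenever an element is larger, make it the current largest.
      4. After the last element, the current largest is the answer.
    """
    largest = numbers[0]
    for n in numbers[1:]:
        if n > largest:
            largest = n
    return largest

print(find_max([3, 41, 11, 7]))  # → 41
```

The same four steps could just as easily be implemented in C, Java, or any other language — the algorithm itself is independent of the implementation.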

Besides merely being a finite set of rules that gives a sequence of operations for solving a specific type of problem, a well-defined algorithm has five important features:

- **Finiteness.** An algorithm must always terminate after a finite number of steps.
- **Definiteness.** Each step of an algorithm must be precisely defined; the actions to be carried out must be rigorously and unambiguously specified for each case.
- **Input.** An algorithm has zero or more inputs, i.e., quantities which are given to it initially before the algorithm begins.
- **Output.** An algorithm has one or more outputs, i.e., quantities which have a specified relation to the inputs.
- **Effectiveness.** An algorithm is also generally expected to be effective. This means that all of the operations to be performed in the algorithm must be sufficiently basic that they can in principle be done exactly and in a finite length of time.
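A classic algorithm that exhibits all five features is Euclid's algorithm for the greatest common divisor. The sketch below (a standard formulation, not specific to this tutorial) annotates where each feature shows up:

```python
def gcd(m, n):
    # Input: two non-negative integers m and n (not both zero).
    # Each pass uses only basic arithmetic, so every operation is
    # exactly doable (effectiveness), and each step is precisely
    # specified (definiteness).
    while n != 0:
        # The remainder m % n is strictly smaller than n, so n
        # decreases every pass and the loop must stop (finiteness).
        m, n = n, m % n
    return m  # Output: the greatest common divisor of the inputs.

print(gcd(544, 119))  # → 17
```

Working through one run by hand: 544 and 119 give remainder 68, then 119 and 68 give 51, then 17, then 0 — at which point the loop terminates and 17 is returned.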

In practice we not only want algorithm definitions, we want *good* algorithm definitions in some loosely-defined aesthetic sense. One criterion of goodness is the length of time taken to perform the algorithm. Other criteria are the adaptability of the algorithm to computers, its simplicity and elegance, etc.
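The running-time criterion is easy to observe directly. As a hedged illustration (the data sizes and helpers here are our own choices, not from the text), the sketch below times a linear scan against a binary search over the same sorted list, using only the standard library:

```python
import bisect
import timeit

data = list(range(100_000))  # already sorted
target = 99_999              # worst case for the linear scan

def linear_search(seq, x):
    # Examine every element in turn: up to len(seq) comparisons.
    for i, v in enumerate(seq):
        if v == x:
            return i
    return -1

def binary_search(seq, x):
    # Halve the search range each step: about log2(len(seq)) comparisons.
    i = bisect.bisect_left(seq, x)
    return i if i < len(seq) and seq[i] == x else -1

# Both algorithms solve the same problem and agree on the answer...
assert linear_search(data, target) == binary_search(data, target)

# ...but one of them is dramatically faster on large inputs.
print("linear:", timeit.timeit(lambda: linear_search(data, target), number=10))
print("binary:", timeit.timeit(lambda: binary_search(data, target), number=10))
```

Both definitions are correct algorithms in the sense above; the second is simply a *better* one by the running-time criterion, at the cost of requiring sorted input.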
