Differential privacy rests on the idea that preserving privacy typically requires adding noise to a data set so that records corresponding to specific individuals become harder to identify. Adding noise generally reduces the accuracy of data analysis, and differential privacy provides a framework for quantifying this accuracy-privacy trade-off. Injecting random noise makes it harder to distinguish between analyses performed on slightly differing data sets, but it also reduces the usefulness of the results. If enough noise is added to a very small data set to guarantee privacy, analyses may become practically useless; the trade-off between utility and privacy should, however, become more manageable as the size of the data set increases. Along these lines, this paper presents the fundamental notions of sensitivity and privacy budget in differential privacy, the noise mechanisms used as part of differential privacy, its composition properties, the ways in which it can be achieved, and the developments in this field to date.
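As a concrete illustration of this trade-off, the following minimal Python sketch applies the standard Laplace mechanism to a mean query (the function name and parameters here are hypothetical, chosen only for this example). Because the sensitivity of a mean over values clipped to a known range is (upper - lower)/n, the noise scale required for a fixed privacy budget epsilon shrinks as the data set grows, which is why larger data sets make the utility-privacy trade-off more manageable.

```python
import numpy as np

def laplace_mean(data, lower, upper, epsilon, rng):
    """Release a differentially private mean via the Laplace mechanism.

    Values are clipped to [lower, upper], so changing one record moves
    the mean by at most (upper - lower) / n; adding Laplace noise with
    scale sensitivity / epsilon yields epsilon-differential privacy.
    """
    clipped = np.clip(data, lower, upper)
    n = len(clipped)
    sensitivity = (upper - lower) / n   # shrinks as the data set grows
    scale = sensitivity / epsilon       # smaller epsilon => more noise
    return clipped.mean() + rng.laplace(loc=0.0, scale=scale)

rng = np.random.default_rng(0)
for n in (100, 10_000):
    data = rng.uniform(0, 100, size=n)  # toy data in a known range
    noisy = laplace_mean(data, 0, 100, epsilon=0.1, rng=rng)
    print(f"n={n:6d}  true mean={data.mean():6.2f}  private mean={noisy:6.2f}")
```

Running this sketch, the private mean for n = 10,000 stays much closer to the true mean than for n = 100 under the same privacy budget, matching the intuition stated above.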