User demand on server resources in a cloud computing environment fluctuates constantly, and monitoring it is essential for efficient service delivery. Allocating too many resources wastes capacity, while allocating too few slows the system down.
Researchers at MIT have developed an algorithm and accompanying software called DBSeer, which uses machine learning techniques to correlate user demand with system performance and allocate server resources on the fly as efficiently as possible. It is estimated to reduce hardware requirements by as much as 95%.
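The article does not describe DBSeer's internals, but the general idea it relies on, learning the relationship between workload metrics and resource consumption from monitoring history and then provisioning just ahead of predicted demand, can be sketched briefly. The Python example below is a minimal illustration under stated assumptions, not DBSeer's actual code: the synthetic data, the linear model, and the `cores_to_provision` helper are all hypothetical.

```python
# Minimal sketch of demand-driven provisioning (NOT DBSeer's implementation).
# Assumption: CPU usage is roughly linear in request rate, learned from
# historical monitoring data and used to pre-provision capacity with headroom.

import numpy as np
from sklearn.linear_model import LinearRegression

# --- Historical monitoring data (synthetic, for illustration only) ---
rng = np.random.default_rng(0)
requests_per_sec = rng.uniform(100, 2000, size=200)                    # observed demand
cpu_cores_used = 0.004 * requests_per_sec + rng.normal(0, 0.2, 200)    # observed usage

# Learn the demand -> resource-consumption relationship.
model = LinearRegression()
model.fit(requests_per_sec.reshape(-1, 1), cpu_cores_used)

def cores_to_provision(predicted_rps: float, headroom: float = 1.2) -> int:
    """Predict CPU cores needed for a forecast request rate, plus a safety margin."""
    predicted = model.predict(np.array([[predicted_rps]]))[0]
    return max(1, int(np.ceil(predicted * headroom)))

# Example: a demand forecast expects 1500 req/s in the next window.
print(cores_to_provision(1500))  # provisions for predicted load, not the worst case
```

Provisioning against a prediction rather than a static worst-case pool is what allows this style of system to cut idle hardware so sharply.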
This is certainly a very promising technology, and with cloud computing predicted to grow at a compound annual rate of nearly 27% over the next few years, it would be a boon for the industry.