A way of measuring the historical risk-adjusted return on an investment. It is the average past return minus the risk-free return, divided by the standard deviation of returns (a measure of risk that looks at the deviation of actual returns from expected returns).
Measuring the historical performance of investment managers, such as mutual fund and hedge fund managers, has always been a challenge. The frequently cited problems are the lack of consistency and repeatability of performance (that is, whether past good performance will be repeated), short track records, and the difficulty of adjusting for risk. Nevertheless, academia and the industry have attempted to surmount these problems by devising succinct, risk-adjusted measures that encapsulate an investment manager's performance in a single metric.
One such metric is the Sharpe ratio, a measure of the excess return earned per unit of risk. It is a risk-adjusted measure because it adjusts investment performance for the risk taken by the investment manager.
This is used to measure the historical performance of absolute return funds.
To calculate it, take the after-fee return of the absolute return fund and subtract a "riskless" rate such as Libor. Then divide the result by the standard deviation of the fund's returns. The higher the number, the better the result.
For instance, suppose a hedge fund underperformed Libor by 4 per cent per annum over the past three years, with a corresponding standard deviation of returns of 8 per cent per annum. The Sharpe ratio of the fund is -4/8 = -0.5.
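The arithmetic above can be sketched in a few lines of Python. The function name and the illustrative Libor figure of 5 per cent are assumptions for the example, not figures from the text; only the 4-point underperformance and the 8 per cent standard deviation come from the example.

```python
def sharpe_ratio(fund_return, riskless_rate, std_dev):
    """Excess return per unit of risk: (return - riskless rate) / std dev."""
    return (fund_return - riskless_rate) / std_dev

# Illustrative figures: an assumed Libor of 5%, a fund that returned
# 4 percentage points less than Libor, and an 8% per annum standard
# deviation of returns, as in the worked example.
libor = 0.05
fund_return = 0.01  # 4 points below the assumed Libor
ratio = sharpe_ratio(fund_return, libor, 0.08)
print(round(ratio, 4))  # -0.5
```

Note that only the size of the underperformance matters: any assumed Libor level gives the same -0.5, since the riskless rate cancels out of the excess return.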
However, as the Financial Times astutely points out in its "Lessons from past performance" article, these ratios are all based on a manager's historical record, and past performance is no guide to future performance. With that caveat in mind, many fund rating companies, such as Morningstar and Lipper, provide comprehensive fund management rankings, commentary, risk-adjusted performance figures and suitability assessments in their reports and on their websites.