In the context of portfolio management, understanding various risk measures is crucial for evaluating the performance and stability of investments. One of the most commonly used risk measures is standard deviation, which quantifies the amount of variation or dispersion in a set of values. Consider the following scenario:
A portfolio manager is analyzing two different portfolios: Portfolio A, which has an expected return of 8% and a standard deviation of 4%, and Portfolio B, which has an expected return of 10% and a standard deviation of 6%. Both portfolios are equally weighted and have similar correlations with the market.
Based on the provided data, which portfolio demonstrates a higher risk-adjusted return when measured by the Sharpe Ratio, assuming a risk-free rate of 2%?
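The comparison above can be worked through directly with the standard Sharpe Ratio formula, (expected return − risk-free rate) / standard deviation. The sketch below applies it to the stated figures; the function name `sharpe_ratio` is just an illustrative helper, not part of any particular library.

```python
def sharpe_ratio(expected_return, std_dev, risk_free_rate):
    """Sharpe Ratio = (E[R] - Rf) / sigma, using returns as decimals."""
    return (expected_return - risk_free_rate) / std_dev

risk_free = 0.02  # the 2% risk-free rate given in the scenario

# Portfolio A: (0.08 - 0.02) / 0.04 = 1.50
sharpe_a = sharpe_ratio(0.08, 0.04, risk_free)

# Portfolio B: (0.10 - 0.02) / 0.06 ≈ 1.33
sharpe_b = sharpe_ratio(0.10, 0.06, risk_free)

print(f"Portfolio A: {sharpe_a:.2f}")
print(f"Portfolio B: {sharpe_b:.2f}")
```

Despite Portfolio B's higher expected return, Portfolio A delivers more excess return per unit of volatility (1.50 vs. roughly 1.33), so it has the higher risk-adjusted return by this measure.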