An investment manager is evaluating two portfolios over a one-year period. Portfolio A had a return of 15% with a standard deviation of 8%, while Portfolio B had a return of 10% with a standard deviation of 5%. The risk-free rate over this period was 2%. Using the Sharpe Ratio as a performance measure, which portfolio exhibited the better risk-adjusted return?
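As a reference for working through the comparison, a minimal sketch of the standard Sharpe Ratio formula and the resulting figures, computed directly from the returns and standard deviations stated above, is:

\[
\text{Sharpe Ratio} = \frac{R_p - R_f}{\sigma_p}
\]

\[
\text{Portfolio A: } \frac{0.15 - 0.02}{0.08} = 1.625
\qquad
\text{Portfolio B: } \frac{0.10 - 0.02}{0.05} = 1.60
\]

On this measure, Portfolio A earns slightly more excess return per unit of total risk than Portfolio B.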