0.0 showing up as double when return type is decimal
I have a value of 0.0 that is showing up as a double, but the return type I am expecting is decimal. I can easily cast it to decimal and fix the issue, but I am curious why 0.0 defaults to double. I would have thought the compiler would be smart enough to see that my return type is decimal and convert 0.0 for me. Why does 0.0 default to double? Is there a different way to write a decimal value for the return type? Below is a snippet of my code along with the error.
I came across this question, which is very helpful for understanding the difference between decimal and double, but it does not address my question.
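For illustration, here is a minimal sketch of the situation (the `GetBalance` method name is hypothetical, not from my code). In C#, a bare literal like `0.0` has type `double`, and there is no implicit conversion from `double` to `decimal`, so the compiler reports error CS0664. Adding the `m` (or `M`) suffix makes the literal a `decimal` and avoids the cast:

```csharp
using System;

class Program
{
    // Hypothetical example: a method whose declared return type is decimal.
    static decimal GetBalance(bool isEmpty)
    {
        if (isEmpty)
        {
            // return 0.0;  // error CS0664: Literal of type double cannot be
                            // implicitly converted to type 'decimal'; use an
                            // 'M' suffix to create a literal of this type
            return 0.0m;    // the 'm' suffix makes this a decimal literal
        }
        return 100.25m;
    }

    static void Main()
    {
        Console.WriteLine(GetBalance(true));
    }
}
```

So the suffix is the idiomatic alternative to casting: `0.0m` instead of `(decimal)0.0`.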