In SQL Server 2000 and 2005 you have the money data type and the smallmoney data type. (Oracle has no equivalent data type.)
You wouldn’t think there is anything smaller than a penny or a cent in money matters, so why did they design these data types with four decimal places? To help hackers put all the partial pennies in a separate bank account? :-p Or to keep compatibility with Excel and allow easier data import from Excel spreadsheets?
Apparently, the only real perk is that they let developers type currency symbols in their literals, symbols that won’t even be preserved:
CREATE TABLE TestMoney (cola INT PRIMARY KEY, colb MONEY);
SET NOCOUNT ON;
-- The following three INSERT statements work.
INSERT INTO TestMoney VALUES (1, $123.45);
INSERT INTO TestMoney VALUES (2, $123123.45);
INSERT INTO TestMoney VALUES (3, CAST('$444,123.45' AS MONEY) );
-- This INSERT statement gets an error because of the comma
-- separator in the money string.
INSERT INTO TestMoney VALUES (4, $555,123.45);
SET NOCOUNT OFF;
SELECT * FROM TestMoney;
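On SQL Server 2005 the final SELECT should come back looking something like this (results paraphrased, run it yourself to confirm): money values are displayed with four decimal places and every trace of the dollar signs and commas is gone.

```sql
-- Approximate result set of SELECT * FROM TestMoney:
-- cola        colb
-- ----------- ------------
-- 1           123.4500
-- 2           123123.4500
-- 3           444123.4500
```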
DECLARE @dollars AS money;
SET @dollars = $100;
SELECT @dollars;
The value returned is 100.0000, without a currency symbol.
If you’re a developer, these data types force you to round your values to two decimal places before sending them to your UI, and they will also give you a couple of headaches with calculated data.
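Here is a sketch of the "calculated data" headache (my example, not from the docs; verify the exact results on your own server): money keeps only four decimal places in intermediate results, so chained arithmetic silently loses precision.

```sql
-- money truncates the intermediate quotient to four decimal places.
DECLARE @m money;
SET @m = $1;
SELECT @m / 3 * 3 AS money_result;    -- returns 0.9999, not 1:
                                      -- 1/3 was stored as 0.3333 mid-calculation

-- decimal widens the scale of the intermediate result, so much less is lost.
DECLARE @d decimal(19, 4);
SET @d = 1;
SELECT @d / 3 * 3 AS decimal_result;  -- far closer to 1
```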
Stay away from the money type; use decimal instead.
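A money column has a fixed scale of 4 and roughly the range of decimal(19, 4), so swapping it out is straightforward (my suggestion, adjust the precision and scale to your data):

```sql
CREATE TABLE TestDecimal (
    cola INT PRIMARY KEY,
    colb DECIMAL(19, 4) NOT NULL  -- or DECIMAL(19, 2) if you only ever store cents
);

-- Note: decimal literals take no currency symbols; $123.45 would fail here.
INSERT INTO TestDecimal VALUES (1, 123.45);
```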