Very similar to something I'm working on, except in my case I have an interface that consumers implement to talk to their service/database/whatever. It's designed for time series data (e.g., CPU usage, room temperature, that kind of thing) and is meant to allow only numeric values and strings. But because there is no base type or interface for numeric types, I unfortunately can't do something like:
public Result WriteValue<T>(T value) where T : INumeric/Numeric { }
My favorite taunt from the .NET team: in the .NET Reference Source, the built-in numeric types implement an IArithmetic<T> interface... and then they comment it out.
Here's an ad-hoc polymorphic way of dealing with numeric types in C# without the boxing that would be caused by using an IArithmetic interface:
using System.Numerics;

// A dictionary of arithmetic operations over A; concrete instances are the structs below.
public interface Num<A>
{
    A Add(A x, A y);
    A Subtract(A x, A y);
    A Multiply(A x, A y);
    A Divide(A x, A y);
    A FromInteger(int value);
}
// Instance for int. It's a struct so it can be passed as a generic argument
// and the calls below resolve as constrained calls with no boxing.
public struct NumInt : Num<int>
{
    public int Add(int x, int y) => x + y;
    public int Subtract(int x, int y) => x - y;
    public int Multiply(int x, int y) => x * y;
    public int Divide(int x, int y) => x / y;
    public int FromInteger(int value) => value;
}
public struct NumDouble : Num<double>
{
    public double Add(double x, double y) => x + y;
    public double Subtract(double x, double y) => x - y;
    public double Multiply(double x, double y) => x * y;
    public double Divide(double x, double y) => x / y;
    public double FromInteger(int value) => (double)value;
}
public struct NumBigInt : Num<BigInteger>
{
    public BigInteger Add(BigInteger x, BigInteger y) => x + y;
    public BigInteger Subtract(BigInteger x, BigInteger y) => x - y;
    public BigInteger Multiply(BigInteger x, BigInteger y) => x * y;
    public BigInteger Divide(BigInteger x, BigInteger y) => x / y;
    public BigInteger FromInteger(int value) => (BigInteger)value;
}
public static class TestGenericNums
{
    public static void Test()
    {
        var a = DoubleIt<NumInt, int>(10);           // 20
        var b = SquareIt<NumInt, int>(10);           // 100
        var c = NegateIt<NumInt, int>(10);           // -10

        var t = DoubleIt<NumDouble, double>(10);     // 20
        var u = SquareIt<NumDouble, double>(10);     // 100
        var v = NegateIt<NumDouble, double>(10);     // -10

        var x = DoubleIt<NumBigInt, BigInteger>(10); // 20
        var y = SquareIt<NumBigInt, BigInteger>(10); // 100
        var z = NegateIt<NumBigInt, BigInteger>(10); // -10
    }

    // default(NumA) materializes the operations struct; because NumA is
    // constrained to struct, Num<A>, the calls are non-boxing constrained calls.
    public static A DoubleIt<NumA, A>(A value) where NumA : struct, Num<A> =>
        default(NumA).Add(value, value);

    public static A SquareIt<NumA, A>(A value) where NumA : struct, Num<A> =>
        default(NumA).Multiply(value, value);

    public static A NegateIt<NumA, A>(A value) where NumA : struct, Num<A> =>
        default(NumA).Subtract(default(NumA).FromInteger(0), value);
}
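The same trick extends to aggregating time-series samples. A minimal sketch, assuming a hypothetical Sum extension (the class name, method name, and sample values are made up for illustration, not part of the original interface):

using System.Collections.Generic;

// Hypothetical helper (illustration only): folds a sequence of samples with the
// Num<A> dictionary, so the same code sums ints, doubles, or BigIntegers.
public static class NumExtensions
{
    public static A Sum<NumA, A>(this IEnumerable<A> values) where NumA : struct, Num<A>
    {
        var num = default(NumA);
        var total = num.FromInteger(0); // additive identity for A
        foreach (var v in values)
            total = num.Add(total, v);
        return total;
    }
}

// Usage with made-up sample values:
// var cpuSamples = new[] { 12.5, 40.0, 33.25 };
// var total = cpuSamples.Sum<NumDouble, double>(); // 85.75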
:(
So instead the interface has an identical method for ints, doubles, floats, decimals, strings...
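For what it's worth, that overload-per-type shape looks roughly like this (a sketch only; the interface name is made up, and Result mirrors the snippet above rather than any actual API):

// Illustrative only: one WriteValue overload per supported type,
// since there's no common numeric constraint to unify them.
public interface ITimeSeriesWriter
{
    Result WriteValue(int value);
    Result WriteValue(float value);
    Result WriteValue(double value);
    Result WriteValue(decimal value);
    Result WriteValue(string value);
}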