I use Radzen to infer the data model from an SQL Server database schema. This works great. One of my tables has a column with a "bit" data type and the default value set to "1".
When I create a new entity for this table in code and set the field to "false", the value is not respected and ends up as "true" after the object is created and persisted.
The reason for this is that EF treats a value of "false" as "unset" (false is the CLR default for bool) and therefore falls back to the default value defined in the database.
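To illustrate what I mean (this is just a hand-written sketch with made-up names, not the exact Radzen output):

```csharp
using Microsoft.EntityFrameworkCore;

// Sketch of what the inferred model roughly looks like for a bit column
// with DEFAULT 1. "Order", "IsActive" and "AppDbContext" are placeholder names.
public partial class Order
{
    public int Id { get; set; }
    public bool IsActive { get; set; } // non-nullable: the CLR default is false
}

public partial class AppDbContext : DbContext
{
    public DbSet<Order> Orders => Set<Order>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // The database default is mapped onto the property.
        modelBuilder.Entity<Order>()
            .Property(e => e.IsActive)
            .HasDefaultValueSql("((1))");
        // Because the CLR default of bool is also false, EF cannot tell
        // "explicitly set to false" apart from "never set", so it leaves the
        // column out of the INSERT and the database default (1) wins.
    }
}
```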
A simple workaround is to change the data type in the model from bool to bool? (nullable). That way both true and false are respected as given, and only when no value (null) is provided does EF apply the database default, which is "true" here.
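In code the workaround is just this one change (again a sketch with placeholder names):

```csharp
// Make the property nullable so that false is a real value
// and only null means "not set".
public partial class Order
{
    public int Id { get; set; }
    public bool? IsActive { get; set; }
}

// With the nullable property an explicit false is now sent to the database:
// var order = new Order { IsActive = false };
// db.Orders.Add(order);
// db.SaveChanges(); // IsActive is persisted as 0, not replaced by the default
```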
So far this works as well, but my problem is that this customization of the model gets lost whenever I (re)infer the schema with Radzen. How can I flag or exclude this entity from the process so that it does not get overwritten?
Hello @enchev - yes, that works. But where does Radzen store this information? I noticed that if I uncheck the entity and do the infer and then close Radzen and restart it, the entity remains unchecked for all future infers.
This is generally good, but I was unable to find where this information is stored. It would be bad if I installed Radzen on a new computer, for example, and the entity was suddenly checked again, so that I overwrite my model just because I overlooked that setting.
Ah, ok - got it. But that's also kind of a problem, since the entity is then no longer integrated into the model. This requires further customization and extra partial classes (see the sketch below). So for now, there is no other way than manually changing the data type from bool to bool? every time the schema is inferred again.
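For reference, this is roughly what that extra customization would look like when the entity is excluded from the infer. I'm assuming the generated data context exposes a partial OnModelBuilding method; the class and property names are placeholders, so please check against the actual generated code:

```csharp
using Microsoft.EntityFrameworkCore;

// Rough sketch only: re-attaching an excluded entity through partial classes.
// "MyDataContext" and "Order" are placeholders, and the OnModelBuilding
// partial method is an assumption about the generated context.
public partial class MyDataContext
{
    partial void OnModelBuilding(ModelBuilder builder)
    {
        builder.Entity<Order>(entity =>
        {
            entity.ToTable("Order");
            entity.HasKey(e => e.Id);
            entity.Property(e => e.IsActive); // declared as bool? so false is respected
        });
    }
}
```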
In general, more customization options on the data model would be nice, similar to EF Core Power Tools, which can exclude or customize certain files if needed.