I need to consume a service that sends JSON responses containing JSON-serialized nested structures, which I would like to deserialize and store in my database - my application uses Django.
Business rules are the following:

- The query returns objects which always have an id property (a unique integer), often a createdAt property and an updatedAt property, both holding datetime data, plus several other properties of primitive types (int, float, str, datetime, etc.) and several properties whose value is another object or an array of objects.
- In case a property value is an object, the parent relates to it through a 'foreign key'. In case it's an array of objects, there are two scenarios: either the objects in the array relate to the parent through a 'foreign key', or the parent and each member of the array are related through a 'many-to-many' relation.
- I need to mirror each of those objects in my database, so each model has an id field which is the primary key, but it's not autogenerated, because the real ids are provided with the imported data.
- The relations between all those entities are already mirrored in my model schema. I adopted this approach (mirroring the data structure) because if I flattened the received data into a single table there would be horrendous replication, defying all data normalization rules.
- For every root object, I need to do the following (a sketch of this upsert logic appears right after this list):
  - check whether there is already a record in the database for that id;
  - create a new record in case there isn't;
  - update the existing record in case there already is one (the update may be skipped if the updatedAt values are the same for both the record and the incoming data);
  - recursively repeat these same steps for each nested object that is the provided value of one of its parent's properties.
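Conceptually, what I'm hoping the ORM (or some DRF machinery) can do for me is something like the generic upsert walker below. This is only a sketch to convey the intent: the helper name import_object is mine, it covers only the nested-object/foreign-key case (not the array/many-to-many scenario), it assumes every primary key is called id as in my schema, and it leaves out the updatedAt short-circuit.

from django.core.exceptions import FieldDoesNotExist
from django.db import models


def import_object(model, payload):
    """Depth-first upsert of one received object and everything nested in it."""
    data = {}
    for name, value in payload.items():
        try:
            field = model._meta.get_field(name)
        except FieldDoesNotExist:
            continue  # property with no mirrored column: ignore it
        if isinstance(field, models.ForeignKey) and isinstance(value, dict):
            # Persist the child first so the parent row can point at it.
            data[name] = import_object(field.related_model, value)
        elif isinstance(field, models.DateTimeField) and isinstance(value, dict):
            data[name] = value.get('date')  # the service wraps some datetimes as {"date": "..."}
        else:
            data[name] = value
    obj_id = data.pop('id')
    # The "skip the update when updatedAt is unchanged" rule is omitted here;
    # it would be a comparison against the existing row before this call.
    obj, _created = model.objects.update_or_create(id=obj_id, defaults=data)
    return obj

Driving it would then be a loop over the received records, calling import_object(EntityRoot, record) for each one, EntityRoot being the root model shown further down.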
Below I'm reproducing a very simplified sample of the data I receive from the service and of the models in which I want to store it. The real thing is much, much bulkier and more complex than that, which is why I'm so keen to learn a way of letting the ORM take care of the problem, if it is able to. Hard-coding the whole thing is taking forever, aside from being pretty error-prone and creating a maintenance hell should the data schema change in the future.
JSON sample:
{
  "records": [
    {
      "id": 25228371,
      "createdAt": "2018-07-08 23:00:00",
      "updatedAt": "2019-03-08 09:45:52",
      "field1": "foo",
      "field2": 2,
      "field3": {
        "date": "2019-03-08 09:45:52"
      },
      "entityA": {
        "id": 174,
        "createdAt": "2018-07-08 23:00:00",
        "updatedAt": "2019-03-08 09:45:52",
        "field4": "bar",
        "field5": 1
      },
      "entityB": {
        "id": 6059889,
        "field6": "zoot",
        "field7": {
          "date": "2015-05-11 00:00:00"
        },
        "entityC": {
          "id": 3,
          "field8": "ni"
        },
        "entityD": {
          "id": 20879,
          "createdAt": "2018-07-08 23:00:00",
          "updatedAt": "2019-03-08 09:45:52",
          "field9": "aah",
          "entityE": {
            "id": 7,
            "field10": 4
          },
          "entityE_id": 7
        },
        "entityC_id": 3,
        "entityD_id": 20879
      },
      "entityFvinculations": [
        {
          "id": 3423557,
          "field11": "a newt",
          "field12": {
            "date": "2019-03-08 10:29:19"
          },
          "entityG": {
            "id": 416038854,
            "field13": 0,
            "field14": {
              "date": "2019-03-07 14:45:53"
            }
          }
        },
        {
          "id": 3423579,
          "field11": "a witch",
          "field12": {
            "date": "2019-03-08 10:29:19"
          },
          "entityG": {
            "id": 4160521578,
            "field13": 0,
            "field14": {
              "date": "2019-03-12 11:24:07"
            }
          }
        }
      ],
      "entityA_id": 174,
      "entityB_id": 6059889
    }
  ]
}
Models.py sample:
from django.db.models import *


class EntityRoot(Model):
    id = PositiveIntegerField(primary_key=True)
    createdAt = DateTimeField(null=True)
    updatedAt = DateTimeField(null=True)
    field1 = CharField(max_length=100, null=True)
    field2 = PositiveIntegerField(null=True)
    field3 = DateTimeField(null=True)
    entityA = ForeignKey('EntityA', null=True, on_delete=SET_NULL)
    entityB = ForeignKey('EntityB', null=True, on_delete=SET_NULL)


class EntityA(Model):
    id = PositiveIntegerField(primary_key=True)
    createdAt = DateTimeField(null=True)
    updatedAt = DateTimeField(null=True)
    field4 = CharField(max_length=100, null=True)
    field5 = PositiveIntegerField(null=True)


class EntityB(Model):
    id = PositiveIntegerField(primary_key=True)
    field6 = CharField(max_length=100, null=True)
    field7 = DateTimeField(null=True)
    entityC = ForeignKey('EntityC', null=True, on_delete=SET_NULL)
    entityD = ForeignKey('EntityD', null=True, on_delete=SET_NULL)


class EntityC(Model):
    id = PositiveIntegerField(primary_key=True)
    field8 = CharField(max_length=100, null=True)


class EntityD(Model):
    id = PositiveIntegerField(primary_key=True)
    createdAt = DateTimeField(null=True)
    updatedAt = DateTimeField(null=True)
    field9 = CharField(max_length=100, null=True)
    entityE = ForeignKey('EntityE', null=True, on_delete=SET_NULL)


class EntityE(Model):
    id = PositiveIntegerField(primary_key=True)
    field10 = PositiveIntegerField(null=True)


class VinculationEntitiesRootAndF(Model):
    entityRoot = ForeignKey('EntityRoot', on_delete=CASCADE)
    entityF = ForeignKey('EntityF', on_delete=CASCADE)


class EntityF(Model):
    id = PositiveIntegerField(primary_key=True)
    field11 = CharField(max_length=100, null=True)
    field12 = DateTimeField(null=True)
    entityG = ForeignKey('EntityG', null=True, on_delete=SET_NULL)


class EntityG(Model):
    id = PositiveIntegerField(primary_key=True)
    field13 = PositiveIntegerField(null=True)
    field14 = DateTimeField(null=True)
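Just to make the maintenance concern concrete: this is roughly what one hand-written branch of the import looks like, for the entityD/entityE corner of the sample alone. It is a simplified sketch (no datetime parsing, no error handling, hypothetical import path); the real code has dozens of blocks like this, which is exactly what I want the ORM or a serializer to spare me from writing.

from myapp.models import EntityD, EntityE  # app path is hypothetical


def import_entity_d(payload):
    # Upsert the nested entityE first, then the entityD row that points at it.
    entity_e = None
    if 'entityE' in payload:
        entity_e, _ = EntityE.objects.update_or_create(
            id=payload['entityE']['id'],
            defaults={'field10': payload['entityE'].get('field10')},
        )
    entity_d, _ = EntityD.objects.update_or_create(
        id=payload['id'],
        defaults={
            'createdAt': payload.get('createdAt'),
            'updatedAt': payload.get('updatedAt'),
            'field9': payload.get('field9'),
            'entityE': entity_e,
        },
    )
    return entity_d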
EDIT (ref. DRF Serializers):
I'm trying to follow Max Malysh I Reinstate Monica's suggestion, and I started to work on a recursive serializer:
from django.db.models import Manager, Model, Field, DateTimeField, ForeignKey
from rest_framework.serializers import ModelSerializer


class RecursiveSerializer(ModelSerializer):
    manager: Manager
    field_dict: dict

    def __init__(self, target_manager: Manager, data: dict, **kwargs):
        self.manager = target_manager
        self.Meta.model = self.manager.model
        self.field_dict = {f.name: f for f in self.manager.model._meta.fields}
        instance = None
        data = self.process_data(data)
        pk_name = self.manager.model._meta.pk.name
        if pk_name in data:
            try:
                instance = target_manager.get(pk=data[pk_name])
            except target_manager.model.DoesNotExist:
                pass
        super().__init__(instance, data, **kwargs)

    def process_data(self, data: dict):
        processed_data = {}
        for name, value in data.items():
            field: Field = self.field_dict.get(name)
            if isinstance(value, dict):
                if isinstance(field, ForeignKey):
                    processed_data[name] = self.__class__(field.related_model.objects, data=value)
                    continue
                elif len(value) == 1 and 'date' in value and isinstance(field, DateTimeField):
                    processed_data[name] = value['date']
                    continue
            processed_data[name] = value
        return processed_data

    class Meta:
        model: Model = None
        fields = '__all__'
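The serializer is meant to be driven through the standard DRF validate-then-save cycle, roughly like the snippet below (illustrative only, not my exact call site; response_json stands for the parsed service reply):

# Illustrative driver, assuming response_json is the parsed JSON shown above.
for record in response_json['records']:
    serializer = RecursiveSerializer(EntityRoot.objects, data=record)
    serializer.is_valid(raise_exception=True)
    serializer.save()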
However, it does something weird: on the first run, against an empty database, it only creates the last and most deeply nested object. On the second run it does nothing and returns a code='unique' validation error saying that such an object already exists.
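If I understand the DRF documentation on writable nested representations correctly, the expected pattern is to declare each nested serializer explicitly and override create() (and update()) by hand, as in the sketch below for a single level of nesting (create only, hypothetical import path). That is precisely the per-entity boilerplate I was hoping to avoid, but I reproduce it here for comparison.

from rest_framework import serializers

from myapp.models import EntityD, EntityE  # app path is hypothetical


class EntityESerializer(serializers.ModelSerializer):
    class Meta:
        model = EntityE
        fields = '__all__'


class EntityDSerializer(serializers.ModelSerializer):
    entityE = EntityESerializer()

    class Meta:
        model = EntityD
        fields = '__all__'

    def create(self, validated_data):
        # DRF does not write nested objects on its own: the child has to be
        # persisted explicitly before the parent row that references it.
        entity_e_data = validated_data.pop('entityE')
        entity_e = EntityE.objects.create(**entity_e_data)
        return EntityD.objects.create(entityE=entity_e, **validated_data)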
Now, I must say I'm quite new to Python and Django (I come from .NET development), and the difficulty I'm having with this task is starting to feel very awkward to me. I've been reading the Django and DRF docs, which helped me less than I expected. Yet I refuse to believe that the aforementioned language and framework lack the resources to perform such a trivial operation. So, if I'm missing something very obvious, as it seems, for lack of knowledge on my part, I'll be grateful if someone teaches me what I seem not to know here.
from Deserialize nested JSON structures to Django model objects