Package xdynamo
ORM Dynamo Overview
The intended use of this library is as a quick way to get/retrieve objects from Dynamo tables.
Things that it can help with:
- If a table does not exist, it will create it automatically.
  - This helps when unit-testing with the @moto.mock_dynamodb decorator, since the mock expects you to create the table before using it. Tables are automatically/lazily created as needed (with no special effort on the part of the unit test).
  - When running any code locally, it can automatically create the table for you without any special effort.
  - Normally, lambdas are not given permission to create tables, as we want cloud-formation/serverless to manage them. During deployment into AWS, serverless/cloud-formation should be the one creating the table if needed.
- Works like the other standard xmodel.remote.model.RemoteModel's: DynModel and the other related classes work very similarly and support the same basic/common methods.
- Easy to use a single 'string' to identify any dynamo object via DynModel.id - it will synthesize a string representing the full primary key, for use with other BaseModel's when using child objects (ie: they know how to look each other up).
- Easy/Standard way to Paginate
  - When you use the standard DynApi.get method, it will return a generator giving you one object at a time while bulk-getting up to 100 per request to dynamo, depending on how it had to query the data for you (the goal is for it to figure out the most efficient way to query automatically).
  - You don't have to worry about what page of results you are on or how it works; you just loop/run the generator and it will eventually give you back all of the objects.
- Auto Prefetch Children
  - TODO (the rest-client xmodel_rest.RestClient has this; it still needs to be done for DynClient). You can still use methods in xmodel.children to bulk-grab them.
  - If you're curious, see Auto Prefetch Children for more info about how to use the auto-pre-fetch feature with the standard xmodel.base.model.BaseModel.
- Simple way to insert/update/delete objects individually or in bulk.
- Full power of the JSON to/from BaseModel's infrastructure.
  - Including automated conversion of types to/from, such as dates.
  - Default values.
  - Read-only fields, etc.
- Central spot to put future high-level dynamo code to share among our projects.
Quick Start
If you don't know much about the ORM, have a look at ORM Library Overview first. It's an overview of the basic concepts.
Index/Summary of the main classes you'll be interacting with:
- DynModel: Represents an object in a table.
- DynApi: Basically represents the table; it's the central 'hub' class that lets you get to the other pieces (client, structure, etc.).
- DynStructure: List of fields and other class-level info about the DynModel.
- DynClient: Wraps boto, figures out the request to use with boto and executes it.
- DynKey: Contains the hash + range keys, along with ways to put them together into a DynModel.id string and to split them apart again.
- DynField: Represents a field on a DynModel. Automatically created in DynStructure if it's not user-allocated on an attribute/field of the DynModel.
- HashField: Special DynField object; indicates the field is the Hash key of the table.
- RangeField: Indicates the Range field of the table (if there is one).
- DynBatch: Context manager (ie: with object). Allows you to batch non-transaction puts (so the system will just use straight puts) and deletes.
Example Data Models
Examples are probably the best way to get a 'quick start'; here are some below.
First, I'll get a few BaseModel's defined. After that, I'll show examples of using them.
This first one shows a table with a Hash + Range key, a list of dicts, and some basic data fields (str/bool).
Note: You can see models very similar to these in action in some unit tests.
Look at tests/test_dynamo.py in the xdynamo source if you're interested.
>>> from typing import Dict, List
>>> from xdynamo import DynModel, HashField, RangeField
>>> class ModelWithRangeKey(
... DynModel,
... # ---> used for end of table name:
... dyn_name="modelWithRangeKey"
... ):
... my_hash: str = HashField()
... my_range: str = RangeField()
... name: str
... a_number: int
... hello: bool
... items: List[Dict[str, str]]
Here is a normal non-dynamo model. We will be using this as a way to automatically parse a sub-dict into a regular model object.
We enable Field.include_in_repr in the example below; it makes sub_name print out when the object is converted to a string (such as when logging the object).
>>> from xmodel.fields import Field
>>> from xmodel.base.model import BaseModel
>>> class ModelAsSubJsonDict(BaseModel, has_id_field=False):
...
... # It puts 'sub-name' into the object description when converting object
... # to a string
... # (ie: such as when you log out a object of type 'ModelAsSubJsonDict')
... sub_name: str = Field(include_in_repr=True)
... queue: bool
Here is a second DynModel for a separate table. It has a relationship to a ModelWithRangeKey.
>>> class ModelOnlyHash(
... DynModel,
... dyn_name="visibleShipConfirm",
... dyn_service="experimental"
... ):
... hash_only: str = HashField()
... name: str
... items: List[Dict[str, str]]
... a_number: int
... test_item_id: str
... test_item: ModelWithRangeKey
... sub_item: ModelAsSubJsonDict
In the real dynamo table, ModelOnlyHash.test_item would be stored as the test_item_id attribute, by grabbing the DynModel.id from the related ModelWithRangeKey object.
It would lazily look up the object if you try to access ModelOnlyHash.test_item, just like you would expect.
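To illustrate that relationship, here is a minimal sketch (assuming the models above are defined and the related item has already been sent to Dynamo; the exact output is illustrative only):

>>> child = ModelWithRangeKey(my_hash="my-h1", my_range="my-r1", name="A")
>>> child.api.send()
>>> parent = ModelOnlyHash(hash_only="parent-1")
>>> parent.test_item_id = child.id   # stores the synthesized id, ie: "my-h1|my-r1"
>>> parent.api.send()
>>> fetched = ModelOnlyHash.api.get_via_id('parent-1')
>>> fetched.test_item.name   # lazily looks the child back up via its id
'A'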
TODO
The rest-client xmodel_rest.RestClient has an ability to auto pre-fetch children; this still needs to be done for DynClient. You can still use the methods in xmodel.children to bulk-grab them.
If you're curious, see Auto Prefetch Children for more info about how to use the auto-pre-fetch feature with the standard xmodel.base.model.BaseModel.
Create a few items in a table (for illustrative purposes, for the following examples):
>>> ModelWithRangeKey(my_hash="my-h1", my_range="my-r1", name="A").api.send()
>>> ModelWithRangeKey(my_hash="my-h2", my_range="my-r2", name="B").api.send()
Basics of Getting Items
Quick example of getting an item; this gets the item by a "my-h1" hash-key and "my-r1" range-key:
>>> ModelWithRangeKey.api.get_via_id({'my_hash': "my-h1", 'my_range': "my-r1"})
ModelWithRangeKey(my_hash: "my-h1", my_range: "my-r1")
Various ways to get items are shown below. In general, you need a hash-key to do a dynamo query and to generally get items. If you don't have a hash-key value, then you must scan the table.
Right now, you can grab all items in a table via DynApi.get.
This will pass a blank query to DynClient.get, which will make it do a full-table scan and return all items.
Right now, DynClient won't scan and will instead raise an xmodel.remote.errors.XRemoteError if you pass in a non-blank query without including a hash-key value.
TODO
Support scanning the entire table with a non-blank query to filter it with. This will be supported in the future; right now it's unimplemented. Most of the time you'll really, really want to query the table in any case: querying the table (with a hash-key) is MUCH faster than scanning it.
I have various examples below, but in general we support querying in these ways; you can use DynApi.get or DynClient.get.
These methods will figure out the best way to execute the get request/query, generally as detailed below:
- If the provided query is blank, it scans and returns all items/objects in the table.
- Generally, if you have both (and only) hash and range keys, it will do a batch-get automatically, which lets us query for 100 objects at a time. Otherwise it will fall back to a query. When doing a query, dynamo only supports one query-request per hash.
  - If the table structure has only a hash-key (and no range key), then it's a list of only hashes.
- If you only have a:
  - hash-key: Will return all objects with that hash; this could be multiple objects if the table structure has a range-key.
    - If the table only supports a hash-key, it will be either a single object or an empty list.
  - single hash + range key: Returns a single object if it exists, or an empty list.
    - Consider using DynApi.get_via_id; it won't return a list if you provide the id value as a single string/dict as the first argument.
  - multiple hash + range keys: Will query every combination of hash + range keys automatically. It can use a batch-get for this and attempt to look up to 100 objects per request. If you have more than 100, we will split up the requests for you automatically (you'll just see a single stream of objects come back).
  - id: If you have the DynModel.id for an object, it contains one or both keys and can be used to query for the object via DynApi.get_via_id.
  - DynKey: You can use this via DynApi.get_via_id to query for the object. DynKey's represent one or both components of a DynamoDB primary key.
- If you have other attributes besides just the range/hash key:
  - We need to fall back to a query in this case, to support filtering by non-key attributes.
  - One query per hash/range key combination in the query.
    - If the range-key is using the between operator, we still only do one query, since dynamo supports this operator in a query. (We'll talk more about operators in the Advanced Queries section further on.)
    - If you use a list of values with the range-key, it will have to use multiple queries, one per hash/range key combination provided.
First there is DynApi.get_via_id, which can take a list of id strings, or a list of dicts with a hash-key and (optionally) a range-key:
>>> ModelWithRangeKey.api.get_via_id({'my_hash': "my-h1", 'my_range': "my-r1"})
ModelWithRangeKey(my_hash: "my-h1", my_range: "my-r1")
This will produce an error; you must provide all parts of the key to use get_via_id, so it needs the range-key part:
>>> ModelWithRangeKey.api.get_via_id({'my_hash': "my-h1"})
Raises XRemoteError
If you want all the objects for a particular hash regardless of the range-key, you can use the DynApi.get method instead. We wrap it in a list (to showcase all output) because a generator is normally returned. You can put the returned generator in a for loop instead if you want. The generator will correctly paginate all results for you automatically. Here is the example:
>>> list(ModelWithRangeKey.api.get({'my_hash': "my-h1"}))
[ModelWithRangeKey(..., my_range="my-r1"), ModelWithRangeKey(..., my_range="my-r2")]
You can also query on other non-key attributes; for example, adding a filter on the name field:
>>> list(ModelWithRangeKey.api.get({'my_hash': "my-h1", 'name': "A"}))
You can also use a generic object that represents a dynamo key with DynKey.
You can pass DynKey objects directly into DynApi.get_via_id.
Other methods also let you directly pass DynKey's, such as DynClient.delete_objs.
>>> key = DynKey(hash_key='my-h1', range_key='my-r1')
>>> ModelWithRangeKey.api.get_via_id(key)
You can also grab a list of them - a list of dicts or a list of keys. When you provide a list, you get a generator back that will paginate through the results correctly.
>>> list(ModelWithRangeKey.api.get_via_id([key]))
[ModelWithRangeKey(....)]
Also, if your table only has a hash-key, you can just directly provide its string/int value, or a list of them:
>>> ModelOnlyHash.api.get_via_id('a-hash-key')
This is the same value as you get from the model's DynModel.id
attribute.
>>> obj = ModelOnlyHash(hash_only = 'a-hash-key')
>>> assert obj.id == 'a-hash-key'
Objects with a range-key by default have it joined together with the hash-key via a pipe (|) delimiter, like so:
>>> obj = ModelWithRangeKey(my_hash="my-h1", my_range="my-r1")
>>> assert obj.id == 'my-h1|my-r1'
You can change the joining/delimiter string via DynStructure.dyn_id_delimiter.
Just like with the normal xmodel.base.model.BaseModel, you can set these attributes via the class arguments on DynModel subclasses.
(For more details on class arguments, see xmodel.base.model.BaseModel.__init_subclass__, and DynStructure.configure_for_model_type for the DynModel-specific ones available.)
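For illustration only, here is a hedged sketch of what overriding the delimiter via a class argument could look like; the dyn_id_delimiter argument name is an assumption mirroring DynStructure.dyn_id_delimiter, so verify it against DynStructure.configure_for_model_type:

>>> class ModelWithCustomDelimiter(
...     DynModel,
...     dyn_name="modelWithCustomDelimiter",
...     dyn_id_delimiter="#",  # hypothetical argument name; check DynStructure
... ):
...     my_hash: str = HashField()
...     my_range: str = RangeField()

>>> ModelWithCustomDelimiter(my_hash="h1", my_range="r1").id
'h1#r1'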
Every DynModel has this 'virtual' DynModel.id value that is the primary key of the object. This is the hash-key value, plus the range-key value (if the object has one).
This single string uniquely identifies the object.
The rest of xmodel.remote can use this just like any id from other xmodel.base.model.BaseModel objects. Meaning, this virtual id value can be stored in other places to form relationships (as you can see with the earlier example on ModelOnlyHash.test_item).
Updating Items
Just like other xmodel.base.model.BaseModel objects, a DynModel can be changed and then the changes sent to Dynamo like so:
>>> obj = ModelOnlyHash(hash_only="a-hash-key")
>>> obj.name = "new-name"
>>> obj.api.send()
You can also mass-update/create objects via DynClient.send_objs:
>>> list_of_objs: List[ModelOnlyHash]
>>> ModelOnlyHash.api.client.send_objs(list_of_objs)
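For instance, a concrete sketch of the above using the ModelOnlyHash model from earlier (the field values are illustrative only):

>>> objs = [
...     ModelOnlyHash(hash_only="key-1", name="first", a_number=1),
...     ModelOnlyHash(hash_only="key-2", name="second", a_number=2),
... ]
>>> ModelOnlyHash.api.client.send_objs(objs)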
Right now we only support "Putting" objects into dynamo (ie: not patching them). It will replace the entire item in dynamo, no matter what you change or update on the object.
TODO
In the future, there will be an option to patch object(s) via DynTransaction
.
The feature has not been finished yet.
It will use a Dynamo Transaction to do it.
See Todo/Future: Batch via Transaction
Deleting Items
>>> obj.api.delete()
You can mass delete via:
>>> key = DynKey(hash_key='my-h1', range_key='my-r1')
>>> objs = list(ModelWithRangeKey.api.get_via_id(key))
>>> ModelWithRangeKey.api.client.delete_objs(objs)
You can also give the DynClient.delete_objs
a list of DynKey
's.
>>> ModelWithRangeKey.api.client.delete_objs([key])
This allows you to delete objects without having to create full models or figure out which field is the hash/range key.
The DynKey always accepts the values the same way.
Batch Updating / Deleting
You can send a one-off list of multiple objects to update/delete all at once (see examples above).
If you want a section of code to batch-delete/update, you can directly use the DynBatch class. It's a context manager and can apply batch Puts and Deletes via a with statement; the batching continues to apply no matter how deep the call stack is inside the with statement. See DynBatch for more details.
Quick Example:
>>> # DynModel objects of some sort....
>>> obj1: ModelOnlyHash
>>> obj2: ModelOnlyHash
>>> obj3: ModelOnlyHash
>>> with DynBatch():
...     obj1.name = "changed"
... obj1.api.send()
...     obj2.name = "changed"
... obj2.api.send()
... obj3.api.delete()
This would end up sending both updates and the delete in the same request.
It works by 'batching' a number of objects at a time and sending them.
If there are still objects to send by the time the with statement exits, the remaining unsent objects are sent to dynamo.
Todo/Future: Batch via Transaction
TODO
In the future, there will be a class called DynTransaction that you can use to batch transactions together. It would also allow bulk partial updating/patching of objects instead of using 'puts' that replace the entire object, and would tie a set of objects together that must all be written or rolled back in one go.
Mocking Dynamo with moto
You can use the moto 3rd-party dependency and its @moto.mock_dynamodb method decorator to 'mock' dynamo. There are a few things to be aware of, but they are easy to get correct:

- The mocking library expects you to first create your dynamo tables via boto3 calls before they are used. xdynamo.DynClient will check and ensure the tables are created automatically in a lazy fashion (ie: the first time an attempt is made to send/get a DynModel from Dynamo).
- In an effort to reuse already-opened connections to dynamo, we use a shared resource, xdynamo.dyn_connections.DynamoDB. It's important that the connection is created while moto is active for moto to mock the connection/session.
  - Note: The xinject.fixtures.context fixture is automatically used when xinject is installed as a dependency. This ensures that when a xdynamo.dyn_connections.DynamoDB is asked for, a brand new one will be created for the unit test, and therefore a new connection/session will be created.
  - It's relatively expensive to open a new connection vs using an already-existing connection to Dynamo, since Dynamo has to set up an encryption key context in itself.
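Putting that together, here is a minimal unit-test sketch (assuming the ModelWithRangeKey model from earlier and that moto is installed; the test name, data, and assertion are illustrative only):

import moto

@moto.mock_dynamodb
def test_send_and_get_round_trip():
    # No boto3 create_table call needed here: DynClient lazily creates the
    # (mocked) table the first time we send/get, while moto is active.
    ModelWithRangeKey(my_hash="h1", my_range="r1", name="A").api.send()
    got = ModelWithRangeKey.api.get_via_id({'my_hash': "h1", 'my_range': "r1"})
    assert got.name == "A"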
Advanced Queries
Ensure you've read the earlier segment about Basics of Getting Items before reading this more advanced section.
DynApi.get and DynClient.get can accept more than just the hash-key and range-key for the query param; other attributes are allowed to filter the results even more!
Dynamo has the concepts of a 'query' and a 'scan' to get data. A query needs to search by at least the hash-key (the range-key is optional). A scan lets you search with any attribute but has to look at all records, and so is slow.
If you at least provide the hash-key, you can filter by other attributes and still get decent performance (as it only has to scan records with that hash). If you also provide the range-key, that's even better, as it narrows it down to one record and is fast since Dynamo can use an index.
You can also provide a list of hash-keys, with other attributes to filter by.
DynClient will automatically break it up into multiple queries, one per hash-key, and paginate all the results together for you.
Operators
It also supports operators; for example, some_field__gte: 3 would look for things that are greater-than-or-equal to 3. A hash-key must be exact, or a list (ie: __in).
When querying, the hash-field has to be an exact-equals value (no other special operators) or a list value and/or the __in operator.
When it's a list (which maps itself automatically to the __in operator by default), we simulate it by doing multiple queries, one per item in the list for the hash, and then concatenating the results together via the generator that gets returned (ie: lazily execute each query as the generator runs).
To get the full list of operators, see the operator method names on boto3.dynamodb.conditions.Attr (and its AttributeBase superclass). There is also a mapping at operator_alias_map that maps some of our standard xyngular-api operators to how they're named in Dynamo.
I also have a list right below:
non-hash query operators you can use:
Operator List
- eq
- lt
- lte
- gt
- gte
- begins_with
- between
- is_in
- exists
- not_exists
- contains
- size
- attribute_type
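If you want to see what boto3 itself exposes, you can enumerate the condition methods on Attr directly (this only inspects boto3, not xdynamo; the listing also includes the attribute's name alongside the operator methods):

>>> from boto3.dynamodb.conditions import Attr
>>> [m for m in dir(Attr('some_field')) if not m.startswith('_')]
['attribute_type', 'begins_with', 'between', 'contains', 'eq', ...]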
See examples below to see how operators can be used.
For more info, take a look at some of the following examples and also look at
DynClient.get
.
Examples
ModelOnlyHash.api.get(
    query={
        'hash_only': ['vis-track-id-2', 'vis-track-id-1'],
        'name': 'c2'
    }
)
This will do two queries in Dynamo, one for each:
- 'hash_only: vis-track-id-2' + 'name: c2'
- 'hash_only: vis-track-id-1' + 'name: c2'
It will combine the results of both together into a single generator and return the generator. You can iterate on the generator to get all the results, all paginated for you.
You can use operators after the keyword, just like you can do for the Xyngular APIs:
ModelOnlyHash.api.get(
query={
'hash_only':['vis-track-id-2', 'vis-track-id-1'],
'a_number__gte': 2
}
)
This looks for the two different hash_only's that have an a_number attribute that is greater than or equal to 2.
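Similarly, here is a hedged sketch of using the between operator on a range-key; the my_range__between spelling and the two-value list form follow the operator pattern above but are assumptions, so verify against DynClient.get:

>>> results = ModelWithRangeKey.api.get(
...     query={
...         'my_hash': "my-h1",
...         'my_range__between': ["my-r1", "my-r5"],
...     }
... )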
>>> my_objs = ModelWithRangeKey.api.get(
... query={
... 'my_hash':['hash-1', 'hash-2'],
... 'my_range':['range-1', 'range-2'],
... 'a_number__gte': 2
... }
... )
It looks for all of the following combinations of hash/range with four separate queries (internally):
- hash-1, range-1
- hash-1, range-2
- hash-2, range-1
- hash-2, range-2
Each of these queries will also have a filter where the a_number attribute needs to be greater than or equal to 2.
DynApi.get will return a generator that yields all objects returned from the four internal queries it does for you.
All you have to do is run the generator to see all the objects, like so:
>>> for o in my_objs:
... print(o.a_number)
--- outputs one number per object in the table that matches the query ---
If the above example query only had my_hash and my_range, and NOT a_number__gte, it would have executed a very fast batch-get on dynamo with the hash/range key combinations.
Doing four queries/requests is still pretty fast, but a batch-get can be executed in a single request (100 key combinations per request)!
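For reference, that faster batch-get variant is just the same query with the non-key filter dropped:

>>> fast_objs = ModelWithRangeKey.api.get(
...     query={
...         'my_hash': ['hash-1', 'hash-2'],
...         'my_range': ['range-1', 'range-2'],
...     }
... )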
Like I said earlier, the nice thing about DynApi.get / DynClient.get is that they can figure out the best way to query the objects for you based on what you give them, so you don't have to figure it out yourself.
Sub-modules
xdynamo.api
xdynamo.client
xdynamo.common_types
xdynamo.const
xdynamo.db
-
Used to keep track of a shared connection and Dynamo table resources …
xdynamo.errors
xdynamo.fields
xdynamo.model
xdynamo.resources
xdynamo.structure
xdynamo.utils
Classes
class DynBatch
-
Allows one to batch bulk updates/deletes (via dynamo put/delete-item) with a context manager. You can bulk-delete/update currently via:
DynClient.delete_objs
DynClient.update_objs
But if you want to combine a number of separate update/delete object calls (including with other calls to DynClient.delete_objs / DynClient.update_objs) into the same request(s), this class allows you to do that.
For example code, see Batch Updating Deleting.
Expand source code
class DynBatch(_DynBatcher):
    """
    Allows one to batch bulk updates/deletes (via dynamo put/delete-item)
    with a context manager.

    You can bulk-delete/update currently via:

    - `DynClient.delete_objs`
    - `DynClient.update_objs`

    But if you want to combine a number of separate update/delete object calls
    (including with other calls to `DynClient.delete_objs` / `DynClient.update_objs`)
    into the same request(s), this class allows you to do that.

    For example code, see [Batch Updating Deleting](#batch-updating-deleting).
    """
    pass
Ancestors
- xdynamo.resources._DynBatcher
class DynField (name: str = Default, type_hint: Type = <property object>, nullable: bool = Default, read_only: bool = Default, exclude: bool = Default, default: Any = Default, post_filter: Optional[xmodel.base.fields.Filter] = Default, converter: Optional[xmodel.base.fields.Converter] = Default, fget: Optional[Callable[[M], Any]] = Default, fset: Optional[Callable[[BaseModel, Any], None]] = Default, include_with_fields: Set[str] = Default, json_path: str = Default, json_path_separator: str = Default, include_in_repr: bool = Default, related_type: Optional[Type[BaseModel]] = Default, related_field_name_for_id: Optional[str] = Default, related_to_many: bool = Default, model: BaseModel = Default)
-
If this is not used on a model field/attribute, the field will get the default set of options automatically if the field has a type-hint; see topic BaseModel Fields.
Preferred way going forward to provide additional options/configuration to BaseModel fields.
If you don't specify a value for a particular attribute, it will have the xsentinels.default.Default value. When a Default value is encountered while constructing a xmodel.base.model.BaseModel, it will resolve these Default values and assign the final value for the field.
To resolve these Defaults, it will look at the field on the parent BaseModel class. If a non-Default value is defined there, it will use that for the child. If not, then it looks at the next parent. If no non-Default value is found we then use a value that makes sense. You can see what this is in the first line of each doc-comment. In the future, when we start using Python 3.9 we can use type annotations (typing.Annotated) to annotate a specific value to the Default type generically. For now it's hard-coded.
Side Notes
Keep in mind that after the xdynamo.api is accessed for the first time on a particular model class, the sdk will construct the rest of the class (lazily)… it will read and then remove/delete from the BaseModel class any type-hinted json fields with a Field object assigned to the class. It moves these Field objects into a special internal structure. The class gets None values set on all fields after this is done.
Details on why we remove them:
Doing this helps with getattr, as it will still be executed for fields without a value when we create an object instance. getattr is used to support lazy lookups [via API] of related objects. Using getattr is much faster than using the getattribute version. So I want to keep using the getattr version if possible.
Expand source code
class DynField(Field):
    dyn_key: Optional[DynKeyType] = Default

    def resolve_defaults(
        self,
        name,
        type_hint: Type,
        default_converter_map: Optional[Dict[Type, Converter]] = None,
        parent_field: "DynField" = None
    ):
        # pydoc3 will copy the parent-class doc-comment if left empty here;
        # that's exactly what I want so leaving doc-comment blank.
        super().resolve_defaults(
            name=name,
            type_hint=type_hint,
            default_converter_map=default_converter_map,
            parent_field=parent_field
        )

        if self.dyn_key:
            if not self.was_option_explicitly_set_by_user('include_in_repr'):
                self.include_in_repr = True
Ancestors
- xmodel.base.fields.Field
Subclasses
Class variables
var dyn_key : Optional[DynKeyType]
Methods
def resolve_defaults(self, name, type_hint: Type, default_converter_map: Optional[Dict[Type, xmodel.base.fields.Converter]] = None, parent_field: DynField = None)
-
Resolves all dataclass attributes/fields on self that are still set to Default. The only exception is type_hint. We will always use what is passed in, regardless of if there is a parent-field with one set. This allows one on a BaseModel to easily override the type-hint without having to create a field with an explicitly set type_hint on it (ie: let the normal python annotated type-hint override any parent type).
This includes ones on subclasses [dataclass will generically tell us about all of them]. The system calls this when a BaseModel class is being lazily constructed [ie: when something gets the xmodel.base.model.BaseModel.api attribute for the first time or attempts to create an instance of the BaseModel for the first time].
When the BaseModel class is being constructed, this method is called to resolve all the Default values still on the instance. We do this by:
- We first look at the parent_field object if one has been given.
  - We ask that parent field which options were explicitly set by the user and which ones were set by resolving a xsentinels.default.Default. Field objects have an internal/private var that keeps track of this.
- Next, figure out the standard default value for an option if the option's current value is still xsentinels.default.Default (a default sentinel value, used to detect which values were left unset by the user).
More Details
I have Field objects keep track of which fields were not at Default when they are resolved. This allows child Field objects to know which values to copy into themselves and which ones should be resolved normally via Default.
The goal here is to avoid copying value from Parent that were originally resolved via Default mechanism (and were not set explicitly by user).
An example of why this is handy:
If we have a parent model with a field of a different type vs the one on the child. Unless the converter was explicitly set by the user we want to just use the default converter for the different type on the child (and not use the wrong converter by default).
class DynKey (api: DynApi, id: str = None, hash_key: Any = None, range_key: Optional[Any] = None, range_operator: str = None, require_full_key: bool = True)
-
DynKey(api: 'DynApi', id: str = None, hash_key: Any = None, range_key: Optional[Any] = None, range_operator: str = None, require_full_key: bool = True)
Expand source code
@dataclasses.dataclass(frozen=True, eq=True)
class DynKey:
    api: 'DynApi' = dataclasses.field(compare=False)

    # We only compare with `id`, this should represent our identity sufficiently.
    id: str = None

    hash_key: Union[Any] = dataclasses.field(default=None, compare=False)
    range_key: Optional[Any] = (
        dataclasses.field(default=None, compare=False)
    )
    range_operator: str = dataclasses.field(default=None, compare=False)
    require_full_key: bool = dataclasses.field(default=True, compare=False)

    def __str__(self):
        return self.id or ''

    @classmethod
    def via_obj(cls, obj: 'DynModel') -> 'DynKey':
        structure = obj.api.structure
        hash_name = structure.dyn_hash_key_name
        if not hash_name:
            raise XModelDynamoNoHashKeyDefinedError(
                f"While constructing {structure.model_cls}, found no hash-key field. "
                f"You must have at least one hash-key field."
            )

        hash_value = getattr(obj, hash_name)
        if hash_value is None:
            raise XModelDynamoError(
                f"Unable to get DynKey due to `None` for dynamo hash-key ({hash_value}) "
                f"on object {obj}."
            )

        range_name = structure.dyn_range_key_name
        range_value = None
        if range_name:
            range_value = getattr(obj, range_name)
            if range_value is None:
                raise XModelDynamoError(
                    f"Unable to get DynKey due to `None` for dynamo range-key ({range_name}) "
                    f"on object {obj}."
                )

        return DynKey(api=obj.api, hash_key=hash_value, range_key=range_value)

    def key_as_dict(self):
        structure = self.api.structure
        hash_field = structure.dyn_hash_field
        range_field = structure.dyn_range_field

        def run_converter(field: 'DynField', value) -> Any:
            converter = field.converter
            if not converter:
                return value
            return converter(
                self.api, Converter.Direction.to_json, field, value
            )

        # Append the keys for the items we want into what we will request.
        item_request = {hash_field.name: run_converter(hash_field, self.hash_key)}
        if range_field:
            item_request[range_field.name] = run_converter(range_field, self.range_key)
        return item_request

    def __post_init__(self):
        structure = self.api.structure
        delimiter = structure.dyn_id_delimiter
        range_name = structure.dyn_range_key_name
        need_range_key = bool(range_name)
        hash_key = self.hash_key
        range_key = self.range_key
        api = self.api

        _id = self.id
        if _id is not None and not isinstance(_id, str):
            # `self.id` must always be a string.
            # todo: Must check for standard converter method
            _id = str(_id)
            object.__setattr__(self, 'id', _id)

        require_full_key = self.require_full_key

        # First, figure out `self.id` if not provided.
        if not _id:
            if not hash_key:
                raise XRemoteError(
                    f"Tried to create DynKey with no id ({_id}) or no hash key ({hash_key})."
                )
            if require_full_key and need_range_key and not range_key:
                raise XRemoteError(
                    f"Tried to create DynKey with no id ({_id}) or no range key ({range_key})."
                )

            key_names = [(structure.dyn_hash_key_name, hash_key)]

            # Generate ID without delimiter to represent an entire hash-page (ie: any range value)
            if need_range_key and range_key is not None:
                key_names.append((range_name, range_key))

            keys = []
            for key_name, key_value in key_names:
                field = structure.get_field(key_name)
                converter = field.converter
                final_value = key_value
                if converter:
                    if self.range_operator == 'between' and isinstance(key_value, list):
                        sub_v_result = []
                        for sub_v in key_value:
                            sub_v = converter(
                                api, Converter.Direction.to_json, field, sub_v
                            )
                            sub_v_result.append(str(sub_v))
                        final_value = ",".join(sub_v_result)
                    else:
                        final_value = converter(
                            api, Converter.Direction.to_json, field, key_value
                        )
                keys.append(final_value)

            _id = delimiter.join([str(x) for x in keys])
            object.__setattr__(self, 'id', _id)
        elif need_range_key and delimiter not in _id:
            raise XRemoteError(
                f"Tried to create DynKey with an `id` ({_id}) "
                f"that did not have delimiter ({delimiter}) in it. "
                f"This means we are missing the range-key part for field ({range_name}) "
                f"in the `id` that was provided. Trying providing the id like this: "
                f'"{_id}{delimiter}'
                r'{range-key-value-goes-here}".'  # <-- want to directly-output the `{` part.
            )

        # If we got provided a hash-key directly, no need to continue any farther.
        if hash_key:
            if require_full_key and need_range_key and not range_key:
                raise XRemoteError(
                    f"Have hash_key ({hash_key}) without needed range_key while creating DynKey."
                )
            # We were provided the hash/range key already, as an optimization I don't use time
            # checking to see if they passed in the same values that they may have passed in `id`.
            return

        # They did not pass in hash_key, so we must parse the `id` they provided
        # and then set them on self.
        if not need_range_key:
            # If we don't need range key, there is no delimiter to look for.
            hash_key = _id
        else:
            split_id = _id.split(delimiter)
            if len(split_id) != 2:
                raise XRemoteError(
                    f"For dynamo table ({self.api.model_type}): Have id ({_id}) but delimiter "
                    f"({delimiter}) is either not present, or is in it more than once. "
                    f"'id' needs to contain exactly one hash and range key combined together "
                    f"with the delimiter, ie: 'hash-key-value{delimiter}range-key-value'. "
                    f"See xdynamo.dyn_connections documentation for more details on how "
                    f"this works."
                )
            # todo: Consider converting these `from_json`, like we convert `to_json`
            # when we put the keys into the `id` (see above, where we generate `id` if needed)?
            hash_key = split_id[0]
            range_key = split_id[1]

        object.__setattr__(self, 'hash_key', hash_key)
        object.__setattr__(self, 'range_key', range_key)
Class variables
var api : DynApi
var hash_key : Any
var id : str
var range_key : Optional[Any]
var range_operator : str
var require_full_key : bool
Static methods
def via_obj(obj: DynModel)
Methods
def key_as_dict(self)
class DynModel (*args, id=Default, **initial_values)
-
Used to easily parse/generate JSON from xyn_sdk model's for use in Dynamo. So it will take advantage of all the other features of the xyn_sdk models. This includes automatically converting dates to/from strings, converting strings to numbers, looking up child-objects from other tables automatically, etc. It also will modify the results JSON so remove blank values; to easily prevent these sorts of errors in dynamo boto3 library.
We pass in None for name/service to indicate we don't have an associated table, that we are more of an abstract class.
Creates a new model object. The first/second params need to be passed as positional arguments. The rest must be sent as key-word arguments. Everything is optional.
Args
id
- Specify the BaseModel.id attribute, if you know it. If left as Default, nothing will be set on it. It could be set to something via args[0] (ie: a JSON dict). If you do provide a value, it will be set last, after everything else has been set.
*args
- I don't want to take names from what you could put into 'initial_values', so I keep it as position-only *args. Once Python 3.8 comes out, we can use a new feature where you can specify some arguments as positional-only and not keyword-able.
  First arg - if Dict: a raw dictionary parsed from a JSON string. It just calls self.api.update_from_json(args[0]) for you.
  First arg - if BaseModel: will copy over fields that have the same name. You can use this to duplicate a Model object, if you want to copy it. Or it can be used to copy fields from one model type into another, on fields that have the same name. Fields that are present on one but not the other are ignored; only fields that are on both model types are copied.
**initial_values
- Lets you specify other attribute values for convenience. They will be set into the object the same way you would normally do it; ie: model_obj.some_attr = v is the same as ModelClass(some_attr=v).
Expand source code
class DynModel(
    RemoteModel,
    dyn_name=None,
    dyn_service=Default,
    dyn_environment=Default,
    lazy_loader=lazy_load_types_for_dyn_api
):
    """
    Used to easily parse/generate JSON from xyn_sdk model's for use in Dynamo.
    So it will take advantage of all the other features of the xyn_sdk models.
    This includes automatically converting dates to/from strings, converting strings
    to numbers, looking up child-objects from other tables automatically, etc.
    It also will modify the results JSON so remove blank values; to easily prevent
    these sorts of errors in dynamo boto3 library.

    We pass in None for name/service to indicate we don't have an associated table,
    that we are more of an abstract class.
    """
    api: DynApi[Self]
    id: str

    @property
    def id(self) -> Optional[str]:
        # We could do some intelligent caching, but for now just calculate each time.
        try:
            return DynKey.via_obj(self).id
        except XModelDynamoNoHashKeyDefinedError:
            # There is something wrong with class structure, there is no hash-key defined!
            raise
        except XRemoteError:
            # Any other error, we simply don't have a full `id` value assigned to object.
            return None

    @id.setter
    def id(self, value):
        structure = self.api.structure
        if type(value) is str:
            parsed_value = value.split('|')
            hash_value = parsed_value[0]
            range_value = parsed_value[1] if len(parsed_value) == 2 else None
            self.__setattr__(structure.dyn_hash_field.name, hash_value)
            if range_value:
                self.__setattr__(structure.dyn_range_field.name, range_value)
            return

        raise NotImplementedError(
            "Read-only for now, but want to support it. "
            "Supporting it would involve parsing ID with DynKey, and taking hash/range key "
            "components and setting them on the proper attributes."
            "\n\n"
            "Also, want to eventually support for using 'id' as a HashField "
            "(ie: a single/only key called 'id' in dynamo-db)"
        )
Ancestors
- xmodel.remote.model.RemoteModel
- xmodel.base.model.BaseModel
- abc.ABC
Class variables
var api : DynApi[xmodel.base.model.BaseModel]
Instance variables
prop id : str
-
Expand source code
@property
def id(self) -> Optional[str]:
    # We could do some intelligent caching, but for now just calculate each time.
    try:
        return DynKey.via_obj(self).id
    except XModelDynamoNoHashKeyDefinedError:
        # There is something wrong with class structure, there is no hash-key defined!
        raise
    except XRemoteError:
        # Any other error, we simply don't have a full `id` value assigned to object.
        return None
class HashField (name: str = Default, type_hint: Type = <property object>, nullable: bool = Default, read_only: bool = Default, exclude: bool = Default, default: Any = Default, post_filter: Optional[xmodel.base.fields.Filter] = Default, converter: Optional[xmodel.base.fields.Converter] = Default, fget: Optional[Callable[[M], Any]] = Default, fset: Optional[Callable[[BaseModel, Any], None]] = Default, include_with_fields: Set[str] = Default, json_path: str = Default, json_path_separator: str = Default, include_in_repr: bool = Default, related_type: Optional[Type[BaseModel]] = Default, related_field_name_for_id: Optional[str] = Default, related_to_many: bool = Default, model: BaseModel = Default)
-
If this is not used on a model field/attribute, the field will get the default set of options automatically if the field has a type-hint; see topic BaseModel Fields.
Preferred way going forward to provide additional options/configuration to BaseModel fields.
If you don't specify a value for a particular attribute, it will have the xsentinels.default.Default value. When a Default value is encountered while constructing a xmodel.base.model.BaseModel, it will resolve these Default values and assign the final value for the field.
To resolve these Defaults, it will look at the field on the parent BaseModel class. If a non-Default value is defined there, it will use that for the child. If not, then it looks at the next parent. If no non-Default value is found we then use a value that makes sense. You can see what this is in the first line of each doc-comment. In the future, when we start using Python 3.9 we can use type annotations (typing.Annotated) to annotate a specific value to the Default type generically. For now it's hard-coded.
Side Notes
Keep in mind that after the xdynamo.api is accessed for the first time on a particular model class, the sdk will construct the rest of the class (lazily)… it will read and then remove/delete from the BaseModel class any type-hinted json fields with a Field object assigned to the class. It moves these Field objects into a special internal structure. The class gets None values set on all fields after this is done.
Details on why we remove them:
Doing this helps with getattr, as it will still be executed for fields without a value when we create an object instance. getattr is used to support lazy lookups [via API] of related objects. Using getattr is much faster than using the getattribute version. So I want to keep using the getattr version if possible.
Expand source code
class HashField(DynField):
    dyn_key = DynKeyType.hash
Ancestors
- DynField
- xmodel.base.fields.Field
Class variables
var dyn_key
Methods
def resolve_defaults(self, name, type_hint: Type, default_converter_map: Optional[Dict[Type, xmodel.base.fields.Converter]] = None, parent_field: DynField = None)
-
Inherited from:
DynField
.resolve_defaults
Resolves all dataclass attributes/fields on self that are still set to Default. The only exception is type_hint. We will always use what is passed …
class RangeField (name: str = Default, type_hint: Type = <property object>, nullable: bool = Default, read_only: bool = Default, exclude: bool = Default, default: Any = Default, post_filter: Optional[xmodel.base.fields.Filter] = Default, converter: Optional[xmodel.base.fields.Converter] = Default, fget: Optional[Callable[[M], Any]] = Default, fset: Optional[Callable[[BaseModel, Any], None]] = Default, include_with_fields: Set[str] = Default, json_path: str = Default, json_path_separator: str = Default, include_in_repr: bool = Default, related_type: Optional[Type[BaseModel]] = Default, related_field_name_for_id: Optional[str] = Default, related_to_many: bool = Default, model: BaseModel = Default)
-
If this is not used on a model field/attribute, the field will get the default set of options automatically if the field has a type-hint; see topic BaseModel Fields.
Preferred way going forward to provide additional options/configuration to BaseModel fields.
If you don't specify a value for a particular attribute, it will have the xsentinels.default.Default value. When a Default value is encountered while constructing a xmodel.base.model.BaseModel, it will resolve these Default values and assign the final value for the field.
To resolve these Defaults, it will look at the field on the parent BaseModel class. If a non-Default value is defined there, it will use that for the child. If not, then it looks at the next parent. If no non-Default value is found we then use a value that makes sense. You can see what this is in the first line of each doc-comment. In the future, when we start using Python 3.9 we can use type annotations (typing.Annotated) to annotate a specific value to the Default type generically. For now it's hard-coded.
Side Notes
Keep in mind that after the xdynamo.api is accessed for the first time on a particular model class, the sdk will construct the rest of the class (lazily)… it will read and then remove/delete from the BaseModel class any type-hinted json fields with a Field object assigned to the class. It moves these Field objects into a special internal structure. The class gets None values set on all fields after this is done.
Details on why we remove them:
Doing this helps with getattr, as it will still be executed for fields without a value when we create an object instance. getattr is used to support lazy lookups [via API] of related objects. Using getattr is much faster than using the getattribute version. So I want to keep using the getattr version if possible.
Expand source code
class RangeField(DynField):
    dyn_key = DynKeyType.range
Ancestors
- DynField
- xmodel.base.fields.Field
Class variables
var dyn_key
Methods
def resolve_defaults(self, name, type_hint: Type, default_converter_map: Optional[Dict[Type, xmodel.base.fields.Converter]] = None, parent_field: DynField = None)
-
Inherited from:
DynField
.resolve_defaults
Resolves all dataclass attributes/fields on self that are still set to Default. The only exception is type_hint. We will always use what is passed …