Creating a unique id in a Python dataclass

2024/10/14 21:23:17

I need a unique (unsigned int) id for my Python data class. This is very similar to this SO post, but without explicit constructors.

import attr
from attrs import field
from itertools import count

@attr.s(auto_attribs=True)
class Person:
    #: each Person has a unique id
    _counter: count[int] = field(init=False, default=count())
    _unique_id: int = field(init=False)

    @_unique_id.default
    def _initialize_unique_id(self) -> int:
        return next(self._counter)

Is there a more "pythonic" solution?

Answer

Use a default factory instead of just a default. This lets you define a callable that is invoked on each instantiation to produce the next id.
A simple way to get a callable that counts up is count().__next__, the equivalent of calling next(...) on a count instance.¹

The common "no explicit constructor" libraries attr and dataclasses both support this:

from itertools import count
from dataclasses import dataclass, field

@dataclass
class C:
    identifier: int = field(default_factory=count().__next__)

import attr

@attr.s
class C:
    identifier: int = attr.field(factory=count().__next__)
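Either version behaves the same way: each instantiation draws the next value from the shared count. A quick check (a minimal sketch, assuming the dataclass version of C above has just been defined):

a, b, c = C(), C(), C()
print(a.identifier, b.identifier, c.identifier)  # 0 1 2 when these are the first instances created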

To always use the automatically generated value and prevent passing one in as a constructor argument, use init=False.

@dataclass
class C:
    identifier: int = field(default_factory=count().__next__, init=False)
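With init=False the field is no longer a parameter of the generated __init__, so an explicit id is rejected (a small illustration, reusing the class above):

C()               # fine, identifier is assigned automatically
C(identifier=42)  # TypeError: __init__() got an unexpected keyword argument 'identifier'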

¹ If one wants to avoid addressing magic methods explicitly, a closure over a count works just as well, for example factory=lambda counter=count(): next(counter).
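Spelled out for the attrs variant (a minimal sketch; the lambda's default argument binds a single count instance when the class body is evaluated, so all instances share one counter):

import attr
from itertools import count

@attr.s
class C:
    # the default argument captures one count() shared by every instance
    identifier: int = attr.field(factory=lambda counter=count(): next(counter))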

https://en.xdnf.cn/q/69367.html
