ButlerLogRecord¶
- class lsst.daf.butler.logging.ButlerLogRecord(*, name: str, asctime: datetime, message: str, levelno: int, levelname: str, filename: str, pathname: str, lineno: int, funcName: str | None = None, process: int, processName: str, exc_info: str | None = None, MDC: dict[str, str])¶
- Bases: - _BaseModelCompat- A model representing a - logging.LogRecord.- A - LogRecordalways uses the current time in its record when recreated and that makes it impossible to use it as a serialization format. Instead have a local representation of a- LogRecordthat matches Butler needs.- Attributes Summary - Methods Summary - construct([_fields_set])- Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. - copy(*[, include, exclude, update, deep])- Duplicate a model, optionally choose which fields to include, exclude and change. - dict(*[, include, exclude, by_alias, ...])- Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. - format([log_format])- Format this record. - from_orm(obj)- from_record(record)- Create a new instance from a - LogRecord.- json(*[, include, exclude, by_alias, ...])- Generate a JSON representation of the model, - includeand- excludearguments as per- dict().- model_construct([_fields_set])- model_dump(*[, mode, include, exclude, ...])- model_dump_json(*[, indent, include, ...])- model_rebuild(*[, force, raise_errors, ...])- model_validate(obj, *[, strict, ...])- model_validate_json(json_data, *[, strict, ...])- parse_file(path, *[, content_type, ...])- parse_obj(obj)- parse_raw(b, *[, content_type, encoding, ...])- schema([by_alias, ref_template])- schema_json(*[, by_alias, ref_template])- update_forward_refs(**localns)- Try to update ForwardRefs on fields based on this Model, globalns and localns. - validate(value)- Attributes Documentation - model_fields = {'MDC': ModelField(name='MDC', type=Mapping[str, str], required=True), 'asctime': ModelField(name='asctime', type=datetime, required=True), 'exc_info': ModelField(name='exc_info', type=Optional[str], required=False, default=None), 'filename': ModelField(name='filename', type=str, required=True), 'funcName': ModelField(name='funcName', type=Optional[str], required=False, default=None), 'levelname': ModelField(name='levelname', type=str, required=True), 'levelno': ModelField(name='levelno', type=int, required=True), 'lineno': ModelField(name='lineno', type=int, required=True), 'message': ModelField(name='message', type=str, required=True), 'name': ModelField(name='name', type=str, required=True), 'pathname': ModelField(name='pathname', type=str, required=True), 'process': ModelField(name='process', type=int, required=True), 'processName': ModelField(name='processName', type=str, required=True)}¶
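Before the individual methods documented below, a brief usage sketch: a standard logging.LogRecord is converted and serialized without losing its original timestamp (a minimal, illustrative example; it assumes from_record supplies defaults for Butler-specific fields such as MDC when the standard record does not carry them).

```python
import logging

from lsst.daf.butler.logging import ButlerLogRecord

# An ordinary logging.LogRecord; in practice this is produced by a Logger.
record = logging.LogRecord(
    name="lsst.example",
    level=logging.INFO,
    pathname="/path/to/module.py",
    lineno=42,
    msg="Processing dataset %s",
    args=("visit_1234",),
    exc_info=None,
)

# Convert to the serializable Butler representation.
butler_record = ButlerLogRecord.from_record(record)

# Render as text, or serialize to JSON for storage.
print(butler_record.format())
payload = butler_record.json()
```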
Methods Documentation

- classmethod construct(_fields_set: SetStr | None = None, **values: Any) Model¶
- Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
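An illustrative sketch (it assumes butler_record is an existing, already-validated ButlerLogRecord such as the one built above; construct() itself performs no validation):

```python
# Values taken from a previously validated instance.
trusted_values = butler_record.dict()

# Rebuild an instance directly from trusted data; validation is skipped.
rebuilt = ButlerLogRecord.construct(**trusted_values)
```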
 - copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: DictStrAny | None = None, deep: bool = False) Model¶
- Duplicate a model, optionally choosing which fields to include, exclude and change.

  Parameters:
  - include – fields to include in the new model
  - exclude – fields to exclude from the new model; as with values, this takes precedence over include
  - update – values to change/add in the new model. Note: the data is not validated before creating the new model; you should trust this data
  - deep – set to True to make a deep copy of the model

  Returns:
  - new model instance
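A short sketch of copying a record (the replacement message is illustrative; updated values are not validated):

```python
# Copy the record with a redacted message, leaving the original untouched.
redacted = butler_record.copy(update={"message": "<redacted>"})

# A deep copy also duplicates nested containers such as the MDC mapping.
independent = butler_record.copy(deep=True)
```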
 - dict(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny¶
- Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. 
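For example (the selected field names come from the model_fields listed above; the choice itself is illustrative):

```python
# A compact summary containing only a few fields.
summary = butler_record.dict(include={"asctime", "levelname", "name", "message"})

# Drop optional fields, such as exc_info, when they are None.
compact = butler_record.dict(exclude_none=True)
```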
 - classmethod from_record(record: LogRecord) ButlerLogRecord¶
- Create a new instance from a LogRecord.

  Parameters:
  - record : logging.LogRecord
    The record from which to extract the relevant information.
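As a sketch of typical use (the CapturingHandler class is hypothetical and not part of the Butler API), from_record is a natural hook inside a logging handler:

```python
import logging

from lsst.daf.butler.logging import ButlerLogRecord


class CapturingHandler(logging.Handler):
    """Hypothetical handler that stores serializable Butler records."""

    def __init__(self) -> None:
        super().__init__()
        self.records: list[ButlerLogRecord] = []

    def emit(self, record: logging.LogRecord) -> None:
        # Convert each standard record into its Butler representation.
        self.records.append(ButlerLogRecord.from_record(record))


log = logging.getLogger("lsst.example")
handler = CapturingHandler()
log.addHandler(handler)
log.warning("something noteworthy happened")
```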
 - json(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode¶
- Generate a JSON representation of the model, include and exclude arguments as per dict(). encoder is an optional function to supply as default to json.dumps(); other arguments as per json.dumps().
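For instance (a minimal sketch; indent is simply forwarded to json.dumps() via the extra keyword arguments):

```python
# Serialize to JSON, omitting None-valued fields and pretty-printing.
payload = butler_record.json(exclude_none=True, indent=2)

# The text can be turned back into a record with parse_raw().
restored = ButlerLogRecord.parse_raw(payload)
```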
 - model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) dict[str, Any]¶
 - model_dump_json(*, indent: int | None = None, include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) str¶
 - classmethod model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: dict[str, Any] | None = None) bool | None¶
 - classmethod model_validate(obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None) Self¶
 - classmethod model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, context: dict[str, Any] | None = None) Self¶
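These model_* methods mirror the pydantic v2 interface; a minimal round-trip sketch, assuming they behave like their pydantic counterparts:

```python
# Serialize with the pydantic-v2-style API ...
payload = butler_record.model_dump_json()

# ... and validate the JSON back into a ButlerLogRecord instance.
restored = ButlerLogRecord.model_validate_json(payload)
assert restored.message == butler_record.message
```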
 - classmethod parse_file(path: str | Path, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model¶
 - classmethod parse_raw(b: str | bytes, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model¶
 - classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny¶
 - classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode¶
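Finally, the generated JSON schema can be inspected (a sketch; the exact layout of the schema depends on the pydantic version in use):

```python
# The schema as a dictionary; list the documented field names.
schema = ButlerLogRecord.schema()
print(sorted(schema.get("properties", {})))

# Or obtain the schema directly as a JSON document.
print(ButlerLogRecord.schema_json(indent=2))
```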