ButlerLogRecords

class lsst.daf.butler.ButlerLogRecords

Bases: pydantic.main.BaseModel

Class representing a collection of ButlerLogRecord.
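A minimal usage sketch, assuming an empty iterable is a valid argument to from_records and that plain logging.LogRecord instances are converted on append (as the append signature below indicates):

>>> import logging
>>> from lsst.daf.butler import ButlerLogRecords
>>> records = ButlerLogRecords.from_records([])
>>> record = logging.LogRecord(
...     "example", logging.INFO, __file__, 1, "Processing started", None, None)
>>> records.append(record)
>>> records.pop()  # returns the entry as a ButlerLogRecord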

Attributes Summary

copy Duplicate a model, optionally choosing which fields to include, exclude and change.
dict Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
json Generate a JSON representation of the model, include and exclude arguments as per dict().
log_format

Methods Summary

append(value)
clear()
construct(_fields_set, **values) Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
extend(records)
from_file(filename) Read records from file.
from_orm(obj)
from_raw(serialized) Parse raw serialized form and return records.
from_records(records) Create collection from iterable.
from_stream(stream) Read records from I/O stream.
insert(index, value)
parse_file(path, *, …)
parse_obj(obj)
parse_raw(b, *, content_type, …)
pop(index)
reverse()
schema(by_alias, ref_template)
schema_json(*, by_alias, ref_template, …)
set_log_format(format) Set the log format string for these records.
update_forward_refs(**localns) Try to update ForwardRefs on fields based on this Model, globalns and localns.
validate(value)

Attributes Documentation

copy

Duplicate a model, optionally choosing which fields to include, exclude and change.

Parameters:
  • include – fields to include in the new model
  • exclude – fields to exclude from the new model; this takes precedence over include
  • update – values to change or add in the new model. Note: the data is not validated before creating the new model, so you should trust this data
  • deep – set to True to make a deep copy of the model
Returns:

new model instance
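For example, a deep, independent duplicate of an existing collection (a sketch using standard pydantic copy semantics, where records is assumed to be a ButlerLogRecords instance):

>>> duplicate = records.copy(deep=True)
>>> duplicate is records
False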

dict

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

json

Generate a JSON representation of the model, include and exclude arguments as per dict().

encoder is an optional function to supply as the default to json.dumps(); other arguments are passed as per json.dumps().

log_format

Methods Documentation

append(value: Union[logging.LogRecord, lsst.daf.butler.core.logging.ButlerLogRecord]) → None
clear() → None
classmethod construct(_fields_set: Optional[SetStr] = None, **values) → Model

Creates a new model, setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.

extend(records: Iterable[Union[logging.LogRecord, lsst.daf.butler.core.logging.ButlerLogRecord]]) → None
classmethod from_file(filename: str) → lsst.daf.butler.core.logging.ButlerLogRecords

Read records from file.

Parameters:
filename : str

Name of file containing the JSON records.

Notes

Works with one-record-per-line format JSON files and a direct serialization of the Pydantic model.
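A sketch of reading saved records; the file name is hypothetical and either of the formats noted above is accepted:

>>> from lsst.daf.butler import ButlerLogRecords
>>> records = ButlerLogRecords.from_file("pipeline_log.json")  # hypothetical path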

classmethod from_orm(obj: Any) → Model
classmethod from_raw(serialized: Union[str, bytes]) → lsst.daf.butler.core.logging.ButlerLogRecords

Parse raw serialized form and return records.

Parameters:
serialized : bytes or str

Either the serialized JSON of the model created using .json(), or a streaming format with one JSON ButlerLogRecord per line. A zero-length string is also supported.
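A round-trip sketch, assuming json() produces the direct serialization accepted here and that records is an existing ButlerLogRecords instance:

>>> serialized = records.json()
>>> restored = ButlerLogRecords.from_raw(serialized)
>>> empty = ButlerLogRecords.from_raw("")  # a zero-length string is accepted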

classmethod from_records(records: Iterable[lsst.daf.butler.core.logging.ButlerLogRecord]) → lsst.daf.butler.core.logging.ButlerLogRecords

Create collection from iterable.

Parameters:
records : iterable of ButlerLogRecord

The records to seed this class with.
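A sketch of seeding a new collection from records already in hand (existing_records is assumed to be a list of ButlerLogRecord objects):

>>> collection = ButlerLogRecords.from_records(existing_records)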

classmethod from_stream(stream: IO) → lsst.daf.butler.core.logging.ButlerLogRecords

Read records from I/O stream.

Parameters:
stream : typing.IO

Stream from which to read JSON records.

Notes

Works with one-record-per-line format JSON files and a direct serialization of the Pydantic model.
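A sketch of reading from an already-open text stream; the file name is hypothetical:

>>> with open("pipeline_log.json") as stream:
...     records = ButlerLogRecords.from_stream(stream)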

insert(index: int, value: Union[logging.LogRecord, lsst.daf.butler.core.logging.ButlerLogRecord]) → None
classmethod parse_file(path: Union[str, pathlib.Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: pydantic.parse.Protocol = None, allow_pickle: bool = False) → Model
classmethod parse_obj(obj: Any) → Model
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: pydantic.parse.Protocol = None, allow_pickle: bool = False) → Model
pop(index: int = -1) → lsst.daf.butler.core.logging.ButlerLogRecord
reverse() → None
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs) → unicode
set_log_format(format: Optional[str]) → Optional[str]

Set the log format string for these records.

Parameters:
format : str, optional

The new format string to use for converting this collection of records into a string. If None the default format will be used.

Returns:
old_format : str, optional

The previous log format.
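A sketch of switching the format and later restoring the previous one; the field names used in the format string are an assumption, and records is assumed to be a ButlerLogRecords instance:

>>> previous = records.set_log_format("{levelname}: {message}")
>>> records.set_log_format(previous)  # restore, or pass None for the default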

classmethod update_forward_refs(**localns) → None

Try to update ForwardRefs on fields based on this Model, globalns and localns.

classmethod validate(value: Any) → Model