ButlerLogRecords¶
class lsst.daf.butler.ButlerLogRecords¶
Bases: pydantic.main.BaseModel

Class representing a collection of ButlerLogRecord.
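Example
A minimal sketch of building a collection in memory, using only the methods documented below; the record contents are illustrative, and append()/extend() accept plain logging.LogRecord values as shown in their signatures.

    import logging

    from lsst.daf.butler import ButlerLogRecords

    # Start from an empty collection; append() and extend() accept both
    # logging.LogRecord and ButlerLogRecord values.
    records = ButlerLogRecords.from_records([])
    records.append(
        logging.makeLogRecord({"name": "example", "levelno": logging.INFO,
                               "levelname": "INFO", "msg": "processing started"})
    )
    records.extend([
        logging.makeLogRecord({"name": "example", "levelno": logging.WARNING,
                               "levelname": "WARNING", "msg": "something unexpected"}),
    ])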
Attributes Summary

- copy: Duplicate a model, optionally choosing which fields to include, exclude and change.
- dict: Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- json: Generate a JSON representation of the model; include and exclude arguments as per dict().
- log_format

Methods Summary
- append(value, …)
- clear()
- construct(_fields_set, **values): Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
- extend(records, …)
- from_file(filename): Read records from file.
- from_orm(obj)
- from_raw(serialized): Parse raw serialized form and return records.
- from_records(records): Create collection from iterable.
- from_stream(stream): Read records from I/O stream.
- insert(index, value, …)
- parse_file(path, *, …)
- parse_obj(obj)
- parse_raw(b, *, content_type, …)
- pop(index)
- reverse()
- schema(by_alias, ref_template)
- schema_json(*, by_alias, ref_template, …)
- set_log_format(format): Set the log format string for these records.
- update_forward_refs(**localns): Try to update ForwardRefs on fields based on this Model, globalns and localns.
- validate(value)

Attributes Documentation
copy¶
Duplicate a model, optionally choosing which fields to include, exclude and change.

Parameters:
- include – fields to include in the new model
- exclude – fields to exclude from the new model; as with values, this takes precedence over include
- update – values to change or add in the new model. Note: the data is not validated before creating the new model; you should trust this data
- deep – set to True to make a deep copy of the model

Returns: new model instance
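Example
A minimal sketch; any populated ButlerLogRecords works in place of the empty one created here.

    from lsst.daf.butler import ButlerLogRecords

    records = ButlerLogRecords.from_records([])  # or any populated collection
    # Deep copy: later changes to `records` do not affect `snapshot`.
    snapshot = records.copy(deep=True)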
dict¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
json¶
Generate a JSON representation of the model. The include and exclude arguments are as per dict(). encoder is an optional function to supply as default to json.dumps(); other arguments are as per json.dumps().
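Example
A sketch of persisting the collection to disk; the file name is illustrative, and from_file() below can read this serialization back.

    from pathlib import Path

    from lsst.daf.butler import ButlerLogRecords

    records = ButlerLogRecords.from_records([])  # or any populated collection
    # Write the pydantic JSON serialization of the whole collection.
    Path("butler_logs.json").write_text(records.json())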
log_format¶
Methods Documentation
append(value: Union[logging.LogRecord, lsst.daf.butler.core.logging.ButlerLogRecord]) → None¶
clear() → None¶
classmethod construct(_fields_set: Optional[SetStr] = None, **values) → Model¶
Creates a new model, setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
extend(records: Iterable[Union[logging.LogRecord, lsst.daf.butler.core.logging.ButlerLogRecord]]) → None¶
classmethod from_file(filename: str) → lsst.daf.butler.core.logging.ButlerLogRecords¶
Read records from file.

Parameters:
- filename : str
  Name of file containing the JSON records.

Notes
Works with one-record-per-line format JSON files and a direct serialization of the Pydantic model.
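Example
A sketch assuming butler_logs.json holds records written earlier (for example with json()); the file name is illustrative.

    from lsst.daf.butler import ButlerLogRecords

    # Accepts either one JSON record per line or a full model serialization.
    records = ButlerLogRecords.from_file("butler_logs.json")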
classmethod from_orm(obj: Any) → Model¶
classmethod from_raw(serialized: Union[str, bytes]) → lsst.daf.butler.core.logging.ButlerLogRecords¶
Parse raw serialized form and return records.

Parameters:
- serialized : str or bytes
  The serialized records.
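Example
A sketch in which the serialized form is read back from a file as bytes; it could equally be a str, and the payload might instead come from a database blob or a service response.

    from pathlib import Path

    from lsst.daf.butler import ButlerLogRecords

    payload = Path("butler_logs.json").read_bytes()  # illustrative source
    records = ButlerLogRecords.from_raw(payload)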
classmethod from_records(records: Iterable[lsst.daf.butler.core.logging.ButlerLogRecord]) → lsst.daf.butler.core.logging.ButlerLogRecords¶
Create collection from iterable.

Parameters:
- records : iterable of ButlerLogRecord
  The records to seed this class with.
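Example
A sketch of seeding a new collection from part of an existing one; treating the collection as iterable is an assumption consistent with its list-like API rather than something documented above.

    from lsst.daf.butler import ButlerLogRecords

    existing = ButlerLogRecords.from_file("butler_logs.json")
    # Assumes iterating `existing` yields ButlerLogRecord instances.
    subset = ButlerLogRecords.from_records(list(existing)[:5])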
classmethod from_stream(stream: IO) → lsst.daf.butler.core.logging.ButlerLogRecords¶
Read records from I/O stream.

Parameters:
- stream : typing.IO
  Stream from which to read JSON records.

Notes
Works with one-record-per-line format JSON files and a direct serialization of the Pydantic model.
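Example
A sketch using an ordinary file object; any readable stream of the same JSON content would work the same way.

    from lsst.daf.butler import ButlerLogRecords

    with open("butler_logs.json") as stream:
        records = ButlerLogRecords.from_stream(stream)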
insert(index: int, value: Union[logging.LogRecord, lsst.daf.butler.core.logging.ButlerLogRecord]) → None¶
classmethod parse_file(path: Union[str, pathlib.Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: pydantic.parse.Protocol = None, allow_pickle: bool = False) → Model¶
classmethod parse_obj(obj: Any) → Model¶
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: pydantic.parse.Protocol = None, allow_pickle: bool = False) → Model¶
pop(index: int = -1) → lsst.daf.butler.core.logging.ButlerLogRecord¶
reverse() → None¶
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶
classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs) → unicode¶
set_log_format(format: Optional[str]) → Optional[str]¶
Set the log format string for these records.

Parameters:
- format : str, optional
  The new log format string.

Returns:
- old_format : str, optional
  The previous log format.
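Example
A sketch of switching the format and restoring it afterwards; the str.format-style template with fields such as levelname and message is an assumption, not part of the documented signature.

    from lsst.daf.butler import ButlerLogRecords

    records = ButlerLogRecords.from_file("butler_logs.json")
    # Returns the previous format so it can be restored later.
    previous = records.set_log_format("{levelname}: {message}")  # assumed field names
    # ... use the reformatted records ...
    records.set_log_format(previous)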
classmethod update_forward_refs(**localns) → None¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
classmethod validate(value: Any) → Model¶