SIwave log parser – pyedb.workflows.utilities.siwave_log_parser#

siwave_log_parser.py: Parse Ansys SIwave batch logs into dataclasses.

Usage#

>>> from pyedb.workflows.utilities.siwave_log_parser import SiwaveLogParser
>>> parser = SiwaveLogParser(r"C:\path\to\siwave.log")
>>> log = parser.parse()
>>> log.summary()
>>> log.to_json("siwave.json")

Overview#

The SIwave log parser extracts structured data from Ansys SIwave batch solve logs, including version information, batch execution metadata, simulation settings, warnings, profiling data, and completion status.

Key Features:

  • Automatic detection of simulation completion status (Normal Completion vs. Aborted)

  • Extraction of electrical short warnings with precise coordinates

  • Performance profiling data (timing, memory usage)

  • Support for multiple timestamp formats

  • JSON export capability

Quick Start#

Basic usage:

from pyedb.workflows.utilities.siwave_log_parser import SiwaveLogParser

# Parse a SIwave log file
parser = SiwaveLogParser("path/to/siwave.log")
log = parser.parse()

# Check simulation status
if log.is_completed():
    print("✓ Simulation completed successfully")
else:
    print("✗ Simulation failed or was aborted")

# Display summary
log.summary()

# Export to JSON
log.to_json("results.json", indent=2)

Top-level façade#

class pyedb.workflows.utilities.siwave_log_parser.SiwaveLogParser(log_path: str | Path)#

Bases: object

High-level façade that orchestrates all block parsers.

Typical usage:

>>> parser = SiwaveLogParser("/tmp/siwave.log")
>>> log = parser.parse()
>>> log.summary()
>>> log.to_json("output.json")

parse() → ParsedSiwaveLog#

Execute all sub-parsers and return a unified object.

Returns:

Structured representation of the entire log.

Return type:

ParsedSiwaveLog

Aggregated result object#

class pyedb.workflows.utilities.siwave_log_parser.ParsedSiwaveLog(aedt: AEDTVersion, batch: BatchInfo, settings: SimSettings, warnings: List[WarningEntry] = <factory>, profile: List[ProfileEntry] = <factory>)#

Bases: object

Root container returned by SiwaveLogParser.parse().

is_aborted() → bool#

Check if the simulation was aborted.

Returns:

True if simulation did not complete normally, False otherwise.

Return type:

bool

is_completed() → bool#

Check if the simulation completed normally.

Returns:

True if status is “Normal Completion”, False otherwise.

Return type:

bool

summary() → None#

Print a summary of the parsed log.

to_dict() → dict#

Deep-convert the entire object to JSON-serialisable primitives.

Returns:

Plain dict / list / scalar structure.

Return type:

dict[str, Any]

to_json(fp: str, **kw) → None#

Serialise to JSON (datetime→ISO).

Parameters:
  • fp (str) – File path to write JSON to.

  • kw – Additional keyword arguments for json.dumps.

Data containers#

Version information#

class pyedb.workflows.utilities.siwave_log_parser.AEDTVersion(version: 'str', build: 'str', location: 'str')#

Bases: object

Batch execution metadata#

class pyedb.workflows.utilities.siwave_log_parser.BatchInfo(path: 'str', started: 'datetime', stopped: 'datetime', run_by: 'str', temp_dir: 'str', project_dir: 'str', status: 'str' = '')#

Bases: object

Simulation settings#

class pyedb.workflows.utilities.siwave_log_parser.SimSettings(design_type: 'str', allow_off_core: 'bool', manual_settings: 'bool', two_level: 'bool', distribution_types: 'List[str]', machines: 'List[str]')#

Bases: object

Warning entries#

class pyedb.workflows.utilities.siwave_log_parser.WarningEntry(timestamp: 'datetime', category: 'str', net1: 'str', net2: 'str', layer: 'str', x: 'float', y: 'float', message: 'str')#

Bases: object

Profile entries#

class pyedb.workflows.utilities.siwave_log_parser.ProfileEntry(timestamp: 'datetime', task: 'str', real_time: 'Optional[str]' = None, cpu_time: 'Optional[str]' = None, memory: 'Optional[str]' = None, extra: 'Dict[str, str]' = <factory>)#

Bases: object

Block parsers (advanced usage)#

These parsers handle specific sections of the log file. Most users will not need to use them directly, as they are orchestrated by SiwaveLogParser.

class pyedb.workflows.utilities.siwave_log_parser.HeaderBlockParser(lines: List[str])#

Bases: BlockParser

Extract AEDT version information from the log header.

parse() → AEDTVersion#

Parse the stored lines and return an AEDTVersion instance.

Returns:

Populated data object.

Return type:

AEDTVersion

class pyedb.workflows.utilities.siwave_log_parser.BatchSettingsBlockParser(lines: List[str])#

Bases: BlockParser

Extract batch info and simulation settings.

parse() → tuple[BatchInfo, SimSettings]#

Parse batch information and simulation settings.

Returns:

Tuple of (BatchInfo, SimSettings).

Return type:

tuple[BatchInfo, SimSettings]

class pyedb.workflows.utilities.siwave_log_parser.WarningsBlockParser(lines: List[str])#

Bases: BlockParser

Extract warning entries from the log.

parse() → List[WarningEntry]#

Parse warning messages.

Returns:

List of warning entries.

Return type:

list[WarningEntry]

class pyedb.workflows.utilities.siwave_log_parser.ProfileBlockParser(lines: List[str])#

Bases: BlockParser

Extract profile entries from the log.

parse() → List[ProfileEntry]#

Parse profile entries showing task timing and resource usage.

Returns:

List of profile entries.

Return type:

list[ProfileEntry]

Base class#

class pyedb.workflows.utilities.siwave_log_parser.BlockParser(lines: List[str])#

Bases: object

Base class for a single block parser.

Utility helpers#

The functions below are private by convention (leading underscore) but are exposed in the documentation for contributors and advanced users.

pyedb.workflows.utilities.siwave_log_parser._parse_ts(txt: str) → datetime#

Convert timestamp strings to datetime.

Supports two formats:

  • ‘11/10/2025 05:46:09 PM’ (date first)

  • ‘11:55:29 AM Oct 12, 2025’ (time first)
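
A minimal sketch of such a two-format parser (the name `parse_ts` and the exact strptime patterns are illustrative assumptions; the module's private helper may differ):

```python
from datetime import datetime

# strptime patterns matching the two timestamp styles found in SIwave logs.
_TS_FORMATS = (
    "%m/%d/%Y %I:%M:%S %p",   # '11/10/2025 05:46:09 PM' (date first)
    "%I:%M:%S %p %b %d, %Y",  # '11:55:29 AM Oct 12, 2025' (time first)
)

def parse_ts(txt: str) -> datetime:
    # Try each known format in turn; fail loudly on anything else.
    for fmt in _TS_FORMATS:
        try:
            return datetime.strptime(txt.strip(), fmt)
        except ValueError:
            continue
    raise ValueError(f"Unrecognized timestamp: {txt!r}")

print(parse_ts("11/10/2025 05:46:09 PM"))   # 2025-11-10 17:46:09
print(parse_ts("11:55:29 AM Oct 12, 2025"))  # 2025-10-12 11:55:29
```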

pyedb.workflows.utilities.siwave_log_parser._split_kv(line: str, sep: str = ':') → tuple[str, str]#

Return (key, value) from ‘key: value’.
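
A sketch of the key/value split, assuming the helper splits on the first separator only (the public name `split_kv` below is a stand-in for the private function):

```python
def split_kv(line: str, sep: str = ":") -> tuple[str, str]:
    # Split on the FIRST separator only, so values that themselves
    # contain ':' (e.g. timestamps) are preserved intact.
    key, _, value = line.partition(sep)
    return key.strip(), value.strip()

print(split_kv("Started: 11/10/2025 05:46:09 PM"))
# ('Started', '11/10/2025 05:46:09 PM')
```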

pyedb.workflows.utilities.siwave_log_parser._as_dict(obj: Any) → Any#

Recursively convert dataclasses / lists / primitives to plain Python types.
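
The recursion can be sketched as follows (an illustrative `as_dict` with a hypothetical `Point` dataclass; the real helper may handle additional types such as datetimes):

```python
from dataclasses import dataclass, fields, is_dataclass
from typing import Any

def as_dict(obj: Any) -> Any:
    # Recurse into dataclass instances, lists, and dicts; return scalars unchanged.
    if is_dataclass(obj):
        return {f.name: as_dict(getattr(obj, f.name)) for f in fields(obj)}
    if isinstance(obj, list):
        return [as_dict(item) for item in obj]
    if isinstance(obj, dict):
        return {k: as_dict(v) for k, v in obj.items()}
    return obj

@dataclass
class Point:  # hypothetical stand-in for the module's dataclasses
    x: float
    y: float

print(as_dict([Point(1.0, 2.0), Point(3.0, 4.0)]))
# [{'x': 1.0, 'y': 2.0}, {'x': 3.0, 'y': 4.0}]
```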

Examples#

Check simulation completion status#

from pyedb.workflows.utilities.siwave_log_parser import SiwaveLogParser

parser = SiwaveLogParser("siwave.log")
log = parser.parse()

if log.is_completed():
    print("Status:", log.batch.status)
    print("Runtime:", log.batch.stopped - log.batch.started)
else:
    print("Simulation was aborted or failed")

Extract warnings#

parser = SiwaveLogParser("siwave.log")
log = parser.parse()

# Get all electrical short warnings
shorts = [w for w in log.warnings if w.category == "SHORT"]

for warning in shorts:
    print(f"Short between {warning.net1} and {warning.net2}")
    print(f"  Layer: {warning.layer}")
    print(f"  Location: ({warning.x:.3f}, {warning.y:.3f}) mm")

Analyze performance#

parser = SiwaveLogParser("siwave.log")
log = parser.parse()

# Display profiling information
for entry in log.profile:
    print(f"Task: {entry.task}")
    print(f"  Real Time: {entry.real_time}")
    print(f"  CPU Time: {entry.cpu_time}")
    print(f"  Memory: {entry.memory}")

Batch processing#

from pathlib import Path
from pyedb.workflows.utilities.siwave_log_parser import SiwaveLogParser

# Process multiple log files
log_dir = Path("simulation_results")
completed = []
failed = []

for log_file in log_dir.glob("**/*.log"):
    try:
        parser = SiwaveLogParser(log_file)
        log = parser.parse()

        if log.is_completed():
            completed.append(
                {
                    "project": log.batch.path,
                    "warnings": len(log.warnings),
                    "runtime": (log.batch.stopped - log.batch.started).total_seconds(),
                }
            )
        else:
            failed.append({"project": log.batch.path, "status": log.batch.status})
    except Exception as e:
        print(f"Failed to parse {log_file}: {e}")

print(f"Completed: {len(completed)}, Failed: {len(failed)}")

Export to JSON#

parser = SiwaveLogParser("siwave.log")
log = parser.parse()

# Export with pretty formatting
log.to_json("results.json", indent=2)

# Convert to dictionary for further processing
data = log.to_dict()

See Also#