HFSS log parser – pyedb.workflows.utilities.hfss_log_parser#

Top-level façade#

class pyedb.workflows.utilities.hfss_log_parser.HFSSLogParser(log_path: str | Path)#

Bases: object

High-level façade that orchestrates all block parsers.

Typical usage:

>>> log = HFSSLogParser("/tmp/project.aedt.batchinfo.1234/hfss.log")
>>> data = log.parse()
>>> data.is_converged()
True

parse() → ParsedLog#

Execute all sub-parsers and return a unified object.

Returns:

Structured representation of the entire log.

Return type:

ParsedLog

Aggregated result object#

class pyedb.workflows.utilities.hfss_log_parser.ParsedLog(project: ProjectInfo, init_mesh: InitMesh, adaptive: List[AdaptivePass], sweep: Sweep | None)#

Bases: object

Root container returned by HFSSLogParser.parse().

Variables:
  • project (ProjectInfo) – Project meta-data.

  • init_mesh (InitMesh) – Initial-mesh metrics.

  • adaptive (list[AdaptivePass]) – Adaptive passes in chronological order.

  • sweep (Sweep | None) – Frequency-sweep summary (None if absent).
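
Once a log has been parsed, these attributes can be inspected directly; a minimal sketch (path and values are illustrative, not real output):

>>> data = HFSSLogParser("hfss.log").parse()
>>> data.project.name
'Patch_Antenna'
>>> data.init_mesh.tetrahedra
12840
>>> len(data.adaptive)
6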

adaptive_passes() → List[AdaptivePass]#

Return the adaptive passes; an explicit alias for the adaptive attribute.

errors() → List[str]#

Extract only error lines (warnings are ignored).

ANSYS marks errors with [error] or *** ERROR ***.

Returns:

List of stripped error lines (empty if none).

Return type:

list[str]
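
A hedged sketch of surfacing only the error lines after a solve (the log path is illustrative):

>>> data = HFSSLogParser("hfss.log").parse()
>>> for line in data.errors():
...     print(line)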

is_completed() → bool#

Heuristic indicating a successful end-to-end solve.

A simulation is considered complete when both of the following conditions are satisfied:

  1. At least one adaptive pass converged.

  2. A frequency-sweep block exists with elapsed time greater than zero.

Return type:

bool
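
A simple post-run guard built on this check might look as follows (a sketch, not part of the module):

>>> data = HFSSLogParser("hfss.log").parse()
>>> if not data.is_completed():
...     raise RuntimeError("HFSS solve did not complete: " + "; ".join(data.errors()))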

is_converged() → bool#

Return True if the adaptive solver declared convergence.

Return type:

bool

memory_on_convergence() → float#

Memory (MB) consumed by the last converged adaptive pass.

Returns:

Megabytes, or math.nan if no pass converged.

Return type:

float
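
Since math.nan never compares equal to itself, test the result with math.isnan rather than ==; a minimal sketch continuing the usage example above (the printed value is illustrative):

>>> import math
>>> mem = data.memory_on_convergence()
>>> "n/a" if math.isnan(mem) else f"{mem:.1f} MB"
'412.3 MB'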

to_dict() → dict#

Deep-convert the entire object to JSON-serialisable primitives.

Returns:

Plain dict / list / scalar structure.

Return type:

dict[str, Any]
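
A typical use is dumping the structure to JSON, for example (output file name is illustrative):

>>> import json
>>> with open("hfss_log.json", "w") as fh:
...     json.dump(data.to_dict(), fh, indent=2)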

Data containers#

class pyedb.workflows.utilities.hfss_log_parser.ProjectInfo(name: str, file: Path, design: str = '', user: str = '', cmd_line: str = '')#

Bases: object

Basic meta-data extracted from the header of an HFSS batch log.

Variables:
  • name (str) – Project name (without extension).

  • file (Path) – Full path to the project file.

  • design (str) – Active design name (may be empty).

  • user (str) – OS user that launched the solve.

  • cmd_line (str) – Exact command line used for the run.

class pyedb.workflows.utilities.hfss_log_parser.InitMesh(tetrahedra: int, memory_mb: float, real_time_sec: int, cpu_time_sec: int)#

Bases: object

Statistics reported during the initial tetrahedral meshing phase.

Variables:
  • tetrahedra (int) – Number of tetrahedra created.

  • memory_mb (float) – Peak memory consumption in megabytes.

  • real_time_sec (int) – Wall clock time in seconds.

  • cpu_time_sec (int) – CPU time in seconds.

class pyedb.workflows.utilities.hfss_log_parser.AdaptivePass(pass_nr: int, freq_hz: float, tetrahedra: int, matrix_size: int, memory_mb: float, delta_s: float | None, converged: bool, elapsed_sec: int)#

Bases: object

Single adaptive solution pass (frequency, delta-S, memory, …).

Variables:
  • pass_nr (int) – 1-based pass index.

  • freq_hz (float) – Target frequency in hertz.

  • tetrahedra (int) – Number of tetrahedra at end of pass.

  • matrix_size (int) – Order of the FEM matrix.

  • memory_mb (float) – Memory used in megabytes.

  • delta_s (float | None) – Maximum |ΔS| observed (None until reported).

  • converged (bool) – True if this pass triggered convergence.

  • elapsed_sec (int) – Wall time spent in this pass.
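
For example, a convergence history can be printed from the parsed passes (a sketch, with data obtained from HFSSLogParser.parse() as in the examples above):

>>> for p in data.adaptive:
...     print(p.pass_nr, p.freq_hz, p.delta_s, p.converged)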

class pyedb.workflows.utilities.hfss_log_parser.Sweep(type: str, frequencies: int, solved: List[float], elapsed_sec: int)#

Bases: object

Frequency-sweep summary block.

Variables:
  • type (str) – Sweep algorithm: Interpolating, Discrete or Fast.

  • frequencies (int) – Total number of frequency points requested.

  • solved (list[float]) – List of frequencies (Hz) actually solved.

  • elapsed_sec (int) – Wall clock time for the entire sweep.
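
Because this attribute is None when the log contains no sweep block, guard before accessing it; a minimal sketch:

>>> if data.sweep is not None:
...     print(data.sweep.type, len(data.sweep.solved), data.sweep.elapsed_sec)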

Block parsers (advanced usage)#

class pyedb.workflows.utilities.hfss_log_parser.ProjectBlockParser(lines: List[str])#

Bases: BlockParser

Extract project meta-data from the log header.

Example:

>>> block = ProjectBlockParser(lines)
>>> info = block.parse()
>>> info.name
'Patch_Antenna'

parse() → ProjectInfo#

Parse the stored lines and return a ProjectInfo instance.

Returns:

Populated data object.

Return type:

ProjectInfo

Raises:

ValueError – If mandatory fields (project name or file path) cannot be located.

class pyedb.workflows.utilities.hfss_log_parser.InitMeshBlockParser(lines: List[str])#

Bases: BlockParser

Extract initial-mesh statistics (InitMesh) from the meshing section.

class pyedb.workflows.utilities.hfss_log_parser.AdaptiveBlockParser(lines: List[str])#

Bases: BlockParser

Build a list of AdaptivePass objects from the adaptive section.

parse() → List[AdaptivePass]#

Parse every adaptive pass and determine which one triggered convergence.

Returns:

Ordered list of passes (pass_nr always increases).

Return type:

list[AdaptivePass]
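
For advanced use, the block parsers can be fed the raw log lines directly; a hedged sketch (log path and pass numbers are illustrative):

>>> from pathlib import Path
>>> lines = Path("hfss.log").read_text().splitlines()
>>> passes = AdaptiveBlockParser(lines).parse()
>>> [p.pass_nr for p in passes]
[1, 2, 3, 4]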

class pyedb.workflows.utilities.hfss_log_parser.SweepBlockParser(lines: List[str])#

Bases: BlockParser

Extract frequency-sweep summary (if present).

parse() → Sweep | None#

Return sweep information or None if the log contains no sweep block.

Returns:

Sweep summary object.

Return type:

Sweep | None

Base class#

class pyedb.workflows.utilities.hfss_log_parser.BlockParser(lines: List[str])#

Bases: object

Base class for a single block parser.

Utility helpers#

The functions below are private by convention (leading underscore) but are exposed in the documentation for contributors and advanced users.

pyedb.workflows.utilities.hfss_log_parser._to_hz(text: str) → float#

Convert a human-readable frequency string to hertz.

Parameters:

text (str) – Frequency expression such as '3 GHz', '100 kHz', '10MHz'.

Returns:

Numerical value in Hz. Returns math.nan if the string cannot be parsed.

Return type:

float
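
Illustrative values that follow the documented conversion (unparsable input yields math.nan):

>>> _to_hz("3 GHz")
3000000000.0
>>> _to_hz("10MHz")
10000000.0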

pyedb.workflows.utilities.hfss_log_parser._to_sec(mm_ss: str) → int#

Convert an ANSYS time stamp to seconds.

Accepts MM:SS, H:MM:SS or HH:MM:SS.

Parameters:

mm_ss (str) – Time stamp extracted from the log.

Returns:

Total elapsed seconds.

Return type:

int
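
Illustrative conversions following the accepted formats:

>>> _to_sec("01:30")
90
>>> _to_sec("1:02:03")
3723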

pyedb.workflows.utilities.hfss_log_parser._as_dict(obj: Any) → Any#

Recursively convert dataclasses / lists / primitives to plain Python types.