HFSS log parser – pyedb.workflows.utilities.hfss_log_parser#
HFSS log file parser for extracting simulation results and metrics.
Top-level façade#
- class pyedb.workflows.utilities.hfss_log_parser.HFSSLogParser(log_path: str | Path)#
Bases: object

High-level parser that orchestrates all block parsers.
This class provides the main interface for parsing HFSS log files. It coordinates multiple specialized parsers to extract project info, mesh statistics, adaptive passes, and sweep data.
- Parameters:
  - log_path : str or pathlib.Path
    Path to the HFSS log file to parse.
Examples
>>> from pathlib import Path
>>> log = HFSSLogParser("/tmp/project.aedt.batchinfo.1234/hfss.log")
>>> data = log.parse()
>>> data.is_converged()
True
Parse and check for errors:
>>> log = HFSSLogParser("simulation.log")
>>> result = log.parse()
>>> if result.errors():
...     print("Errors found:", result.errors())
... else:
...     print("No errors detected")
- parse() ParsedLog#
Execute all sub-parsers and return a unified object.
- Returns:
- ParsedLog
  Structured representation of the entire log, including project info, mesh statistics, adaptive passes, and sweep data.
Examples
>>> log = HFSSLogParser("hfss.log")
>>> result = log.parse()
>>> print(f"Converged: {result.is_converged()}")
>>> print(f"Passes: {len(result.adaptive)}")
Aggregated result object#
- class pyedb.workflows.utilities.hfss_log_parser.ParsedLog(project: ProjectInfo, init_mesh: InitMesh, adaptive: list[AdaptivePass], sweep: Sweep | None)#
Bases: object

Root container returned by HFSSLogParser.parse().
This class holds all parsed information from an HFSS log file and provides convenience methods for checking convergence, completion status, and extracting specific metrics.
- Attributes:
  - project : ProjectInfo
    Project meta-data including name, file path, and design information.
  - init_mesh : InitMesh
    Initial mesh statistics.
  - adaptive : list of AdaptivePass
    Adaptive passes in chronological order.
  - sweep : Sweep or None
    Frequency-sweep summary, or None if no sweep was performed.
Examples
>>> from pathlib import Path
>>> parsed = ParsedLog(
...     project=ProjectInfo(name="Test", file=Path("/tmp/test.aedt")),
...     init_mesh=InitMesh(tetrahedra=5000, memory_mb=100, real_time_sec=30, cpu_time_sec=28),
...     adaptive=[],
...     sweep=None,
... )
>>> parsed.project.name
'Test'
- adaptive_passes() list[AdaptivePass]#
Return the list of adaptive passes.
- Returns:
- list of AdaptivePass
  All adaptive passes in chronological order.
Examples
>>> parsed = ParsedLog(project=..., init_mesh=..., adaptive=[pass1, pass2], sweep=None)
>>> passes = parsed.adaptive_passes()
>>> len(passes)
2
- errors() list[str]#
Extract error lines from the log file.
Searches the log for lines containing error markers such as [error] or *** ERROR ***. Warnings are ignored.
Examples
>>> parsed = ParsedLog(project=..., init_mesh=..., adaptive=[], sweep=None)
>>> errs = parsed.errors()
>>> len(errs)
0
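The marker matching described above can be sketched as an illustrative filter (this is not the module's actual implementation; the `find_error_lines` helper is hypothetical and handles only the two documented markers):

```python
# Illustrative sketch of the documented error-marker filter.
# Assumption: only "[error]" and "*** ERROR ***" count; warnings pass through.
ERROR_MARKERS = ("[error]", "*** ERROR ***")


def find_error_lines(lines: list[str]) -> list[str]:
    """Return the log lines that contain a known error marker."""
    return [line for line in lines if any(marker in line for marker in ERROR_MARKERS)]


log_lines = [
    "[warning] mesh quality low",
    "[error] solver aborted",
    "*** ERROR *** license not found",
    "Adaptive Passes converged",
]
print(find_error_lines(log_lines))
```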
- is_completed() bool#
Check if the simulation completed successfully.
A simulation is considered complete when both adaptive convergence occurred and a frequency sweep was executed.
- Returns:
- bool
True if converged and the sweep completed, False otherwise.
Examples
>>> parsed = ParsedLog(
...     project=ProjectInfo(name="T", file=Path("/t")),
...     init_mesh=InitMesh(tetrahedra=100, memory_mb=10, real_time_sec=5, cpu_time_sec=5),
...     adaptive=[
...         AdaptivePass(
...             pass_nr=1,
...             freq_hz=1e9,
...             tetrahedra=100,
...             matrix_size=50,
...             memory_mb=256,
...             delta_s=0.01,
...             converged=True,
...             elapsed_sec=10,
...         )
...     ],
...     sweep=Sweep(type="Interpolating", frequencies=11, solved=[1e9], elapsed_sec=30),
... )
>>> parsed.is_completed()
True
- is_converged() bool#
Check if the adaptive solver declared convergence.
- Returns:
- bool
True if at least one adaptive pass converged, False otherwise.
Examples
>>> parsed = ParsedLog(
...     project=ProjectInfo(name="T", file=Path("/t")),
...     init_mesh=InitMesh(tetrahedra=100, memory_mb=10, real_time_sec=5, cpu_time_sec=5),
...     adaptive=[
...         AdaptivePass(
...             pass_nr=1,
...             freq_hz=1e9,
...             tetrahedra=100,
...             matrix_size=50,
...             memory_mb=10,
...             delta_s=0.01,
...             converged=True,
...             elapsed_sec=10,
...         )
...     ],
...     sweep=None,
... )
>>> parsed.is_converged()
True
- memory_on_convergence() float#
Memory consumed by the last converged adaptive pass.
- Returns:
- float
  Memory in megabytes, or math.nan if no pass converged.
Examples
>>> parsed = ParsedLog(
...     project=ProjectInfo(name="T", file=Path("/t")),
...     init_mesh=InitMesh(tetrahedra=100, memory_mb=10, real_time_sec=5, cpu_time_sec=5),
...     adaptive=[
...         AdaptivePass(
...             pass_nr=1,
...             freq_hz=1e9,
...             tetrahedra=100,
...             matrix_size=50,
...             memory_mb=256.5,
...             delta_s=0.01,
...             converged=True,
...             elapsed_sec=10,
...         )
...     ],
...     sweep=None,
... )
>>> parsed.memory_on_convergence()
256.5
- to_dict() dict#
Deep-convert the entire object to JSON-serializable primitives.
- Returns:
- dict
  Plain dict/list/scalar structure suitable for JSON serialization.
Examples
>>> parsed = ParsedLog(project=..., init_mesh=..., adaptive=[], sweep=None)
>>> data_dict = parsed.to_dict()
>>> isinstance(data_dict, dict)
True
Data containers#
- class pyedb.workflows.utilities.hfss_log_parser.ProjectInfo(name: str, file: Path, design: str = '', user: str = '', cmd_line: str = '')#
Bases: object

Basic meta-data extracted from the header of an HFSS batch log.
- Attributes:
  - name : str
    Project name (without extension).
  - file : pathlib.Path
    Full path to the project file.
  - design : str, optional
    Active design name. The default is "".
  - user : str, optional
    OS user that launched the solve. The default is "".
  - cmd_line : str, optional
    Exact command line used for the run. The default is "".
Examples
>>> from pathlib import Path
>>> info = ProjectInfo(
...     name="Patch_Antenna", file=Path("/project/antenna.aedt"), design="HFSSDesign1", user="engineer"
... )
>>> info.name
'Patch_Antenna'
- class pyedb.workflows.utilities.hfss_log_parser.InitMesh(tetrahedra: int, memory_mb: float, real_time_sec: int, cpu_time_sec: int)#
Bases: object

Statistics reported during the initial tetrahedral meshing phase.
- Attributes:
  - tetrahedra : int
    Number of tetrahedra in the initial mesh.
  - memory_mb : float
    Memory used in megabytes.
  - real_time_sec : int
    Wall-clock time of the meshing phase in seconds.
  - cpu_time_sec : int
    CPU time of the meshing phase in seconds.
Examples
>>> mesh = InitMesh(tetrahedra=5000, memory_mb=128.5, real_time_sec=45, cpu_time_sec=42)
>>> mesh.tetrahedra
5000
>>> mesh.memory_mb
128.5
- class pyedb.workflows.utilities.hfss_log_parser.AdaptivePass(pass_nr: int, freq_hz: float, tetrahedra: int, matrix_size: int, memory_mb: float, delta_s: float | None, converged: bool, elapsed_sec: int)#
Bases: object

Single adaptive solution pass with convergence metrics.
- Attributes:
  - pass_nr : int
    1-based pass index.
  - freq_hz : float
    Target frequency in hertz.
  - tetrahedra : int
    Number of tetrahedra at the end of the pass.
  - matrix_size : int
    Order of the FEM matrix.
  - memory_mb : float
    Memory used in megabytes.
  - delta_s : float, optional
    Maximum |ΔS| observed. The default is None until reported.
  - converged : bool
    True if this pass triggered convergence.
  - elapsed_sec : int
    Wall time spent in this pass.
Examples
>>> pass1 = AdaptivePass(
...     pass_nr=1,
...     freq_hz=3e9,
...     tetrahedra=10000,
...     matrix_size=5000,
...     memory_mb=256.0,
...     delta_s=0.02,
...     converged=False,
...     elapsed_sec=120,
... )
>>> pass1.pass_nr
1
>>> pass1.converged
False
- class pyedb.workflows.utilities.hfss_log_parser.Sweep(type: str, frequencies: int, solved: list[float], elapsed_sec: int)#
Bases: object

Frequency-sweep summary block.
- Attributes:
  - type : str
    Sweep type, for example "Interpolating".
  - frequencies : int
    Number of frequency points in the sweep.
  - solved : list of float
    Frequencies actually solved, in hertz.
  - elapsed_sec : int
    Wall time spent in the sweep, in seconds.
Examples
>>> sweep = Sweep(type="Interpolating", frequencies=101, solved=[1e9, 2e9, 3e9], elapsed_sec=300)
>>> sweep.type
'Interpolating'
>>> len(sweep.solved)
3
Block parsers (advanced usage)#
- class pyedb.workflows.utilities.hfss_log_parser.ProjectBlockParser(lines: list[str])#
Bases: BlockParser

Extract project meta-data from the log header.
This parser searches for project name, design name, user information, and command line arguments in the log file header section.
Examples
>>> lines = [
...     "Project: MyProject, Design: HFSSDesign1",
...     "Running as user: engineer",
...     'Using command line: "ansysedt.exe"',
...     "Batch Solve/Save: /path/to/project.aedt",
... ]
>>> parser = ProjectBlockParser(lines)
>>> info = parser.parse()
>>> info.name
'MyProject'
- parse() ProjectInfo#
Parse the stored lines and return a ProjectInfo instance.
- Returns:
- ProjectInfo
  Populated project meta-data object.
Examples
>>> lines = ["Project: Antenna, Design: HFSS1", "Batch Solve/Save: /tmp/antenna.aedt"]
>>> parser = ProjectBlockParser(lines)
>>> info = parser.parse()
>>> info.name
'Antenna'
- class pyedb.workflows.utilities.hfss_log_parser.InitMeshBlockParser(lines: list[str])#
Bases: BlockParser

Extract initial mesh statistics from the log.
This parser searches for the initial meshing profile section and extracts tetrahedra count, memory usage, and timing information.
Examples
>>> lines = [
...     "[PROFILE] Initial Meshing",
...     "Tetrahedra: 5000",
...     "Memory 128.5 MB",
...     "Real Time 00:45",
...     "CPU Time 00:42",
... ]
>>> parser = InitMeshBlockParser(lines)
>>> mesh = parser.parse()
>>> mesh.tetrahedra
5000
- parse() InitMesh#
Parse initial mesh statistics from log lines.
- Returns:
- InitMesh
  Initial mesh metrics including tetrahedra count, memory, and timing.
Examples
>>> lines = ["[PROFILE] Initial Meshing", "Tetrahedra: 10000"]
>>> parser = InitMeshBlockParser(lines)
>>> mesh = parser.parse()
>>> mesh.tetrahedra
10000
- class pyedb.workflows.utilities.hfss_log_parser.AdaptiveBlockParser(lines: list[str])#
Bases: BlockParser

Build a list of AdaptivePass objects from the adaptive section.
This parser extracts all adaptive pass information including convergence status, frequency, mesh statistics, and delta-S values.
Examples
>>> lines = [
...     "Adaptive Pass 1 at Frequency: 3 GHz",
...     "Tetrahedra: 10000",
...     "Matrix size: 5000",
...     "Memory 256.0 MB",
...     "Max Mag. Delta S: 0.02",
...     "[CONVERGE] Solution has converged at pass number 1",
...     "Adaptive Passes converged",
... ]
>>> parser = AdaptiveBlockParser(lines)
>>> passes = parser.parse()
>>> passes[0].pass_nr
1
>>> passes[0].converged
True
- parse() list[AdaptivePass]#
Parse every adaptive pass and determine which one triggered convergence.
- Returns:
- list of AdaptivePass
  Ordered list of passes (pass_nr always increases).
Examples
>>> lines = ["Adaptive Pass 1 at Frequency: 2 GHz", "Tetrahedra: 8000"]
>>> parser = AdaptiveBlockParser(lines)
>>> passes = parser.parse()
>>> len(passes)
0
- class pyedb.workflows.utilities.hfss_log_parser.SweepBlockParser(lines: list[str])#
Bases: BlockParser

Extract frequency-sweep summary from the log.
This parser searches for frequency sweep information including sweep type, number of frequencies, and elapsed time.
Examples
>>> lines = [
...     "Interpolating Sweep",
...     "101 Frequencies",
...     "Frequency - 1 GHz",
...     "Frequency - 2 GHz",
...     "Elapsed time: 00:05:00",
... ]
>>> parser = SweepBlockParser(lines)
>>> sweep = parser.parse()
>>> sweep.type
'Interpolating'
>>> sweep.frequencies
101
Base class#
Utility helpers#
The functions below are private by convention (leading underscore) but are exposed in the documentation for contributors and advanced users.
- pyedb.workflows.utilities.hfss_log_parser._to_hz(text: str) float#
Convert a human-readable frequency string to hertz.
Parse frequency expressions with standard SI prefixes (k, M, G) and convert them to numerical values in hertz.
- Parameters:
- text : str
  Frequency expression such as '3 GHz', '100 kHz', or '10MHz'. Spaces between the value and unit are optional.
- Returns:
- float
  Numerical value in Hz. Returns math.nan if the string cannot be parsed.
Examples
>>> _to_hz("3 GHz")
3000000000.0
>>> _to_hz("100 kHz")
100000.0
>>> _to_hz("10MHz")
10000000.0
>>> _to_hz("2.4GHz")
2400000000.0
>>> import math
>>> math.isnan(_to_hz("invalid"))
True
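A minimal sketch of the conversion, consistent with the documented behavior (the regular expression and prefix table here are assumptions, not the module's actual source):

```python
import math
import re

# SI prefix multipliers; assumption: only k/M/G appear in HFSS frequency strings.
_SI_PREFIX = {"": 1.0, "k": 1e3, "m": 1e6, "g": 1e9}


def to_hz(text: str) -> float:
    """Convert strings like '3 GHz' or '10MHz' to hertz; math.nan on failure."""
    match = re.fullmatch(r"\s*([0-9.]+)\s*([kKmMgG]?)[Hh][Zz]\s*", text)
    if match is None:
        return math.nan
    return float(match.group(1)) * _SI_PREFIX[match.group(2).lower()]


print(to_hz("2.4GHz"))
```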
- pyedb.workflows.utilities.hfss_log_parser._to_sec(mm_ss: str) int#
Convert an Ansys time stamp to seconds.
Parse time stamps in various formats (MM:SS, H:MM:SS, or HH:MM:SS) and convert them to total elapsed seconds.
- Parameters:
- mm_ss : str
  Time stamp extracted from the log in the format MM:SS, H:MM:SS, or HH:MM:SS.
- Returns:
- int
  Total elapsed seconds.
Examples
>>> _to_sec("02:30")
150
>>> _to_sec("1:15:45")
4545
>>> _to_sec("12:30:00")
45000
>>> _to_sec("00:05")
5
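Since each colon-separated field is worth 60 times the one to its right, the conversion reduces to a base-60 fold; a sketch consistent with the documented examples (not necessarily the module's actual source):

```python
def to_sec(stamp: str) -> int:
    """Convert 'MM:SS', 'H:MM:SS', or 'HH:MM:SS' to total elapsed seconds."""
    total = 0
    for field in stamp.split(":"):
        # Each step shifts the running total one base-60 place and adds the field.
        total = total * 60 + int(field)
    return total


print(to_sec("1:15:45"))
```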
- pyedb.workflows.utilities.hfss_log_parser._as_dict(obj: Any) Any#
Recursively convert dataclasses to plain Python types.
Convert dataclass instances, lists, and Path objects to JSON-serializable primitive types (dict, list, str, etc.).
- Parameters:
- obj : Any
  Object to convert. Can be a dataclass instance, list, Path, or primitive type.
- Returns:
- Any
  Plain Python type representation. Dataclasses become dicts, Paths become strings, lists are recursively processed, and primitives pass through unchanged.
Examples
>>> from dataclasses import dataclass
>>> from pathlib import Path
>>> @dataclass
... class Point:
...     x: int
...     y: int
>>> _as_dict(Point(1, 2))
{'x': 1, 'y': 2}
>>> _as_dict(Path("/tmp/file.txt"))
'/tmp/file.txt'
>>> _as_dict([1, 2, Path("/test")])
[1, 2, '/test']
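The recursion can be sketched as follows (a minimal version matching the documented examples; the real helper may handle additional container types):

```python
from dataclasses import dataclass, fields, is_dataclass
from pathlib import Path
from typing import Any


def as_dict(obj: Any) -> Any:
    """Recursively convert dataclasses to dicts, Paths to strings, lists element-wise."""
    if is_dataclass(obj) and not isinstance(obj, type):
        # Dataclass instance (not the class itself): map each field name to its converted value.
        return {f.name: as_dict(getattr(obj, f.name)) for f in fields(obj)}
    if isinstance(obj, Path):
        return str(obj)
    if isinstance(obj, list):
        return [as_dict(item) for item in obj]
    # Primitives (int, float, str, bool, None) pass through unchanged.
    return obj


@dataclass
class Point:
    x: int
    y: int


print(as_dict([Point(1, 2), Path("/test")]))
```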