Interop reader authoring
========================

How to add a new card / keyword to a vendor reader. The three production
readers — NASTRAN ``from_bdf``, Abaqus ``from_inp``, Ansys MAPDL
``from_dat`` / ``from_cdb`` — share a common shape but different parser
idioms; this page maps the work end-to-end so adding the next card is a
mechanical edit.

Authoring driver: most reader gaps are surfaced by a VM in the
verification corpus that hits a card the parser doesn't yet handle. When
that happens, file the gap as a child issue (label ``[interop]`` +
``verify-blocked``), tick the registry row XFAIL, land the card support
in a focused PR, then flip the row. The VM spec at
:doc:`verification_manual_spec` is the upstream consumer of this guide.

.. contents:: Page contents
   :local:
   :depth: 2

Reader landscape
----------------

.. list-table::
   :header-rows: 1
   :widths: 22 22 56

   * - Reader
     - Module
     - Parser idiom
   * - NASTRAN BDF
     - ``femorph_solver.interop.nastran._bdf``
     - **Dispatch table.** ``_CARD_DISPATCH`` maps card name (``"GRID"``,
       ``"CBAR"``, ``"PBAR"``, …) to a per-card
       ``_parse_<card>(fields, data)`` function. Cards are line-oriented;
       the parser splits on commas / fixed-width fields and accumulates
       into a ``_BdfData`` collector.
   * - Abaqus INP
     - ``femorph_solver.interop.abaqus._inp``
     - **Block parser.** Keywords are ``*`` lines that open a block; data
       lines follow until the next ``*`` line. Per-keyword handlers
       ``_parse_<keyword>_block(data, params, lines)`` consume the
       block. Unrecognised keywords are skipped with a debug log;
       recognised-but-unsupported options raise ``NotImplementedError``.
   * - Ansys MAPDL ``.dat``
     - ``femorph_solver.interop.mapdl._apdl_dialect`` (and the per-verb
       dialect under ``femorph_solver.interop.mapdl``)
     - **Verb interpreter.** APDL is procedural; each verb (``ET``,
       ``MP``, ``N``, ``E``, ``D``, ``F``, …) is a side effect on a
       stateful ``Model``. ``from_dat`` walks the deck a verb at a time
       and dispatches to the appropriate ``APDL`` shim method. The shim
       is reused by build-path tests under ``tests/validation/_vm/``
       (see :ref:`vm-spec-reader-pending`).
   * - Ansys MAPDL ``.cdb``
     - ``femorph_solver.interop.mapdl`` ``from_cdb``
     - Archive reader; covers the same surface as ``from_dat`` but
       parses the structured ``.cdb`` format.

Each reader's public API is the ``from_<fmt>(path) -> Model`` entry
point. All file I/O, string parsing, dispatch, and materialisation into
a ``Model`` are internal.

The materialisation contract
----------------------------

Cards / blocks accumulate into a per-vendor data structure
(``_BdfData``, ``_InpData``, etc.). When the parser is done, a
materialisation pass converts that structure into a ``Model`` — node
coordinates, element connectivity, materials, real constants, boundary
conditions, loads. The materialisation contract is the **neutral**
representation the rest of the solver consumes; per-vendor parsers map
onto it, and per-element kernels read from it.

Real-constant slot conventions (excerpted from the codebase):

.. list-table::
   :header-rows: 1
   :widths: 22 78

   * - Element family
     - ``real`` layout
   * - Shell (``QUAD4_SHELL``, ``QUAD8_SHELL``, ``QUAD4_PLANE``)
     - ``real[0] = thickness``
   * - Rod / truss (``TRUSS2``)
     - ``real[0] = cross-section area``
   * - Beam (``BEAM2`` and friends)
     - ``real = (A, IZZ, IYY, J)`` — note the **reader-side I1/I2 swap**
       (NASTRAN PBAR ``I1`` / Abaqus ``*BEAM SECTION`` ``I11`` →
       ``IYY`` → ``real[2]``; ``I2`` / ``I22`` → ``IZZ`` → ``real[1]``).
       See #509 / #573 for the audit; :doc:`fixtures_and_decks` codifies
       the immutable-deck rule that came out of it.
   * - Concentrated mass (``POINT_MASS``)
     - ``real[0] = mass``
   * - Solid (``HEX8``, ``HEX20``, ``TET10``, ``WEDGE6``)
     - ``real`` unused; the kernel reads geometry directly.

Materials are dictionaries keyed by property name (``EX``, ``PRXY``,
``DENS``, ``ALPX``, …) — *not* a vendor-specific struct. Every reader
normalises into the same key set so kernels don't care where the
material came from.
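
A minimal sketch of what that normalisation looks like for a NASTRAN
``MAT1`` card — the ``normalize_mat1`` helper, its field positions, and
the bare ``float`` conversion are illustrative stand-ins, not the
production ``_bdf`` parser:

.. code-block:: python

   def normalize_mat1(fields: list[str]) -> dict[str, float]:
       """Map MAT1 (MID, E, G, NU, RHO, A, ...) onto the neutral key set.

       Illustrative only: field positions follow the MAT1 card layout,
       blank fields are simply omitted from the dictionary.
       """
       slots = {2: "EX", 4: "PRXY", 5: "DENS", 6: "ALPX"}  # field -> neutral key
       material: dict[str, float] = {}
       for idx, key in slots.items():
           if idx < len(fields) and fields[idx].strip():
               material[key] = float(fields[idx])
       return material

   # "MAT1, 1, 2.0e11, , 0.3, 7800.0, 1.2e-5" (G left blank)
   mat = normalize_mat1(["MAT1", "1", "2.0e11", "", "0.3", "7800.0", "1.2e-5"])
   # mat == {"EX": 2.0e11, "PRXY": 0.3, "DENS": 7800.0, "ALPX": 1.2e-5}

The point is the shape, not the helper: whatever the vendor calls its
fields, only the neutral keys survive materialisation.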

Boundary conditions and loads land on the ``Model`` via the same per-DOF
API regardless of vendor (``model.point_data_to_dirichlet``,
``model.add_force_dof``, etc.).

Adding a card to ``from_bdf``
-----------------------------

Worked example: adding a hypothetical ``CBUSH`` card (a 6-DOF
spring-damper element). The BDF reader is the simplest pattern, so this
is the canonical walkthrough.

Step 1 — extend the data collector
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

``_BdfData`` already holds element / property / material / BC / load
tables. Most cards land in an existing table. When in doubt, mirror the
closest existing card.

Step 2 — write the per-card parser
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Per-card parsers all have signature
``(fields: list[str], data: _BdfData) -> None`` and follow this pattern:

.. code-block:: python

   def _parse_cbush(fields: list[str], data: _BdfData) -> None:
       # CBUSH eid pid ga gb [orient1 orient2 orient3] [pa pb [s ot]]
       if len(fields) < 4:
           return
       eid = _as_int(fields[1])
       pid = _as_int(fields[2])
       ga = _as_int(fields[3])
       gb = _as_int(fields[4]) if len(fields) > 4 else None
       if eid is None or pid is None or ga is None:
           return
       data.elements[eid] = _Element(
           eid=eid,
           pid=pid,
           kind="bush",
           nodes=(ga, gb) if gb is not None else (ga,),
       )

Conventions:

* Use ``_as_int`` / ``_as_float`` for field parsing — they handle blank
  fields and BDF's signed/scientific number quirks.
* Validate before accumulating. A card with a missing required field is
  a parse-warning, not a partial accumulation.
* Don't raise on optional cards we don't yet support — log debug and
  skip. Raise only when the card was understood but the *option* isn't
  supported (e.g. PBAR with a non-zero ``I12``); surface that as a
  kernel-side gap.

Step 3 — register in ``_CARD_DISPATCH``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Add a row mirroring the existing entries:

.. code-block:: python

   _CARD_DISPATCH = {
       # ... existing entries unchanged ...
       "CBUSH": _parse_cbush,
   }

The phase comments (``# Phase 1``, ``# Phase 2b``, etc.) are authoring
scars from earlier rounds; new cards drop in next to their family.

Step 4 — extend materialisation
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Cards that introduce a new element kind require a route in
``_ELEMENT_MAP`` and a real-constant packaging branch. The pattern in
``_bdf.py`` is:

.. code-block:: python

   _ELEMENT_MAP = {
       # ... existing entries unchanged ...
       ("CBUSH", 2): "BUSH6",  # 6-DOF spring kernel
   }

…and a new branch in the materialisation loop (mirrors the existing
``shell`` / ``rod`` / ``beam`` arms):

.. code-block:: python

   if prop.kind == "bush":
       real = (prop.k_translational, prop.k_rotational)

If the kernel doesn't exist yet (most common case for a new card), this
is a coordinated card+kernel PR — see
`Coordinated card+kernel landings`_.

Step 5 — unit test
~~~~~~~~~~~~~~~~~~

Every new card lands with a focused unit test under
``tests/interop/nastran/test_bdf_reader_phase<N>.py``. The test authors
a minimal fixture under ``tests/interop/nastran/fixtures/<name>.bdf``
and asserts:

* Card parsed (element exists in the registry).
* Real-constant slots land at the right positions and values.
* Material / BC / load fields propagate correctly.

Don't assert solve results in the interop unit test — that's the
cross-solver harness's job (see :doc:`testing` for the test-category
boundary).

A worked example pattern (from
``tests/interop/nastran/test_bdf_reader_phase2b.py``):


.. code-block:: python

   def test_cbar_beam_pbar_reals_land_correctly():
       model = from_bdf(_FIXTURES / "cbar_beam.bdf")
       reals = np.asarray(model._real_constants[1])
       assert reals[0] == pytest.approx(1.0e-4)  # A
       assert reals[1] == pytest.approx(2.0e-8)  # IZZ ← I2
       assert reals[2] == pytest.approx(1.0e-8)  # IYY ← I1
       assert reals[3] == pytest.approx(5.0e-9)  # J

Adding a keyword block to ``from_inp``
--------------------------------------

Abaqus is more forgiving — keywords are blocks delimited by ``*`` lines,
and the dispatch is on the keyword name plus its parameters dict. Same
five-step pattern, slightly different plumbing.

Step 1 — write the block parser
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Block parsers have signature
``(data, params: dict[str, str], lines: list[str]) -> None``. ``params``
is the comma-separated key=value list on the keyword line; ``lines`` is
the data block until the next ``*``.

.. code-block:: python

   def _parse_spring_block(data, params, lines):
       # *SPRING, ELSET=<name>
       # eid, k
       elset_name = params.get("ELSET")
       for line in lines:
           tokens = line.split(",")
           eid = int(tokens[0])
           k = float(tokens[1])
           data.elements[eid] = _Element(eid=eid, kind="spring", k=k)

Step 2 — register in the keyword dispatch
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The INP reader has a ``_KEYWORD_DISPATCH`` analogous to BDF's
``_CARD_DISPATCH``:

.. code-block:: python

   _KEYWORD_DISPATCH = {
       # ... existing entries unchanged ...
       "SPRING": _parse_spring_block,
   }

Step 3 — extend materialisation
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Same as BDF: route to the appropriate kernel via
``_ELEMENT_TYPE_MAP``, populate real constants in the materialisation
pass.

Step 4 — handle the params dict's options
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Abaqus keywords frequently carry options (``SECTION=GENERAL``,
``MATERIAL=<name>``, etc.).
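
A hedged sketch of how a block parser can gate on such an option — the
``_parse_beam_section_block`` handler, the option values, and the plain
list standing in for the collector are all illustrative, not the
production ``_inp`` code:

.. code-block:: python

   def _parse_beam_section_block(data, params, lines):
       # Illustrative only. `data` is a plain list here; the real reader
       # accumulates into its _InpData collector.
       section = params.get("SECTION", "RECT")
       if section == "GENERAL":
           # Keyword recognised, option unsupported: fail loudly so the
           # gap is visible instead of silently mis-parsing the deck.
           raise NotImplementedError(
               "*BEAM SECTION, SECTION=GENERAL is not supported yet"
           )
       data.append(("beam_section", section, params.get("MATERIAL")))

The distinction matters: an *unrecognised keyword* is skipped with a
debug log, but a recognised keyword with an unsupported option must
raise.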
Handle them inline in the block parser; raise ``NotImplementedError``
with a clear message when an option exists but isn't supported yet — the
corpus authoring agent sees that and opens a child issue.

Step 5 — unit test
~~~~~~~~~~~~~~~~~~

Mirrors the BDF pattern but lives under
``tests/interop/abaqus/test_inp_reader_phase<N>.py``.

The MAPDL ``from_dat`` shim
---------------------------

MAPDL's ``.dat`` deck is procedural — each verb is an instruction to the
solver. The current reader (#522 Phase 1) sits on top of the ``APDL``
shim under ``femorph_solver.interop.mapdl._apdl_dialect``: ``from_dat``
walks the deck a verb at a time and translates to ``apdl.<verb>(...)``
calls.

Adding a new APDL verb is two edits:

#. **Shim method** — add ``def <verb>(self, *args)`` to the ``APDL``
   class with the side effect on the bound ``Model``.
#. **Dat dialect** — add the verb name to the dispatch in
   ``_apdl_dialect.py`` so ``from_dat`` recognises it.

The build-path tests under ``tests/validation/_vm/`` use the shim
directly — that's how a VM gets verified before the ``.dat`` parser
handles every required verb. When the verb lands, the harness picks up
the row automatically (see :ref:`vm-spec-reader-pending`).

Coordinated card+kernel landings
--------------------------------

When a new card requires an element kernel that doesn't exist yet, the
work is one PR (or two PRs in lock-step) covering both:

1. **Kernel** — a new element under ``src/femorph_solver/elements/``
   with ``ke``, ``me``, real-constant layout, and registration. See
   ``kernel_authoring`` (planned) for the full walkthrough.
2. **Reader** — the parser changes above, plus the ``_ELEMENT_MAP`` /
   ``_ELEMENT_TYPE_MAP`` route into the new kernel.
3. **VM round-trip** — flip the registry XFAIL once the harness reads
   the deck successfully.

Recent worked examples:

* **#549 — PLANE182 EAS (Wilson Q6)** — kernel-side ``QUAD4_PLANE``
  ``tech="enhanced"`` formulation + reader change to route
  ``KEYOPT(1)=2`` decks through it.
* **#622 — SHELL281 (Quad8Shell)** — full new kernel + INP / BDF route
  + VM6 round-trip.
* **#580 — PIPE family** — circular hollow beam + internal / external
  pressure load card on the BDF side.
* **#515 — SECTYPE,1,BEAM,I** — derived I-section reals from the
  vendor's section-shape namespace.

In each case the PR title is ``feat(elements|interop): <title>``, the
body cites the registry row(s) it unblocks, and the merge flips the
corresponding ☐ → ☑ on the detail tracker (#345 / #511 / #322).

Common pitfalls
---------------

* **Side-effecting in the parser pass.** Per-card parsers should *only*
  accumulate. Materialisation is a separate pass; mixing them makes
  "what's actually in this deck" hard to inspect at the data-collector
  level.
* **Silent-skip on a recognised card.** If you recognise a card, parse
  it. Don't ``pass`` and hope the test catches it — the test won't,
  because nothing else in the deck depends on the field you skipped.
* **Embedding vendor convention into the kernel.** The MSC PBAR ``I1``
  / Abaqus ``I11`` semantics are *vendor convention*, not kernel
  convention. Translate at the reader boundary
  (``real = (A, I2, I1, J)`` for BDF, ``(A, I22, I11, J)`` for INP) —
  the kernel always sees ``(A, IZZ, IYY, J)``. See #509 / #573 for the
  prior reference case where this was botched.
* **Editing a fixture to make a parser test pass.** Fixtures are
  immutable. See :doc:`fixtures_and_decks` for the rule and the two
  narrow exceptions.
* **Coupling reader changes to a kernel that hasn't merged yet.** Land
  them together in one PR or in adjacent PRs that gate on each other.
  Don't merge a reader edit that points at a kernel the next ``main``
  lacks.
* **Forgetting the materialisation comment block.** Each property
  kind's real-constant packaging in the materialisation loop carries an
  inline comment (``# real[1] = IZZ ← I2 …``). Future readers depend on
  those comments to audit cross-vendor parity; don't strip them.

Where things live
-----------------

.. list-table::
   :header-rows: 1
   :widths: 32 68

   * - Concern
     - Path
   * - Per-vendor reader source
     - ``src/femorph_solver/interop/<vendor>/``
   * - Per-vendor unit tests
     - ``tests/interop/<vendor>/test_<fmt>_reader_phase<N>.py``
   * - Per-vendor fixtures
     - ``tests/interop/<vendor>/fixtures/``
   * - Cross-solver harness (closed-form assertions)
     - ``tests/cross_solver/test_verification_round_trip.py``
   * - Registry rows
     - ``tests/cross_solver/_verification_registry.py``
   * - Build-path fallback (reader-pending VMs)
     - ``tests/validation/_vm/test_<vm>.py``
   * - Element kernels
     - ``src/femorph_solver/elements/``
   * - Element registration
     - ``src/femorph_solver/elements/_registry.py``
   * - Per-element specs (with KEYOPT / formulation kwargs)
     - ``src/femorph_solver/elements/_specs.py``

Companion pages: :doc:`verification_manual_spec` (consumer of this
guide — VM ingest is what surfaces most reader gaps),
:doc:`fixtures_and_decks` (immutable-deck rule + provenance),
:doc:`testing` (where each kind of test lives).
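
As a closing illustration, the two-edit APDL verb recipe described above
can be sketched end to end. Everything here — the toy ``Model``, the
``APDL`` stand-in, the ``run_deck`` walker — is an illustrative shape,
not the production classes under ``femorph_solver.interop.mapdl``:

.. code-block:: python

   class Model:
       """Toy stand-in for the real Model: just a node table."""
       def __init__(self):
           self.nodes = {}

   class APDL:
       def __init__(self, model):
           self.model = model

       # Edit 1: the shim method -- the verb's side effect on the Model.
       def n(self, nid, x=0.0, y=0.0, z=0.0):
           self.model.nodes[int(nid)] = (float(x), float(y), float(z))

   # Edit 2: the dialect dispatch -- verb name -> shim method.
   _VERB_DISPATCH = {"N": APDL.n}

   def run_deck(apdl, deck):
       """Walk a deck one verb at a time, dispatching known verbs."""
       for line in deck.splitlines():
           fields = [f.strip() for f in line.split(",")]
           handler = _VERB_DISPATCH.get(fields[0].upper())
           if handler is not None:
               handler(apdl, *fields[1:])

   model = Model()
   run_deck(APDL(model), "N,1,0.0,0.0,0.0\nN,2,1.0,0.0,0.0")
   # model.nodes == {1: (0.0, 0.0, 0.0), 2: (1.0, 0.0, 0.0)}

Unknown verbs fall through silently in this sketch; the production
dialect decides per verb whether to skip, warn, or raise.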