MarginWise Pre-read
Working session pre-read

Republic aircraft acquisition workflow.

A short read before we get on the call. What we think we understand, what we still need to confirm, and three early shapes for how the agent could work — to react to, not approve.

01 · Problem understanding

Where we think the work lives today.

Edmund or Mike reads a seller pack (unstructured, no standard format) and manually re-keys data into Republic’s proprietary Excel model to determine whether to acquire the aircraft. The data spans:

What’s in the pack
  • Aircraft condition
  • Maintenance schedules
  • Lease terms
  • Contract clauses
  • Some of it embedded as images inside Excel
  • No standard column structure or naming convention across sellers — every pack is different
A few things we want to confirm
  • Does this take roughly the same amount of time per deal, or is there meaningful variance?
  • A few weeks ago, Edmund mentioned wanting to handle multiple deals in a single pack. Would each deal go into its own model? Our assumption is yes, pending confirmation.
  • Is external research ever required outside the seller pack (web research, reaching back out to the seller, validating parts, etc.)?
02 · Process walkthrough

Show us a live pack.

  • We’d like Mike or Edmund to walk us through a live seller pack in real time.
  • Does anyone besides Mike and Edmund touch the pack at any stage?
03 · The Excel model

Five tabs — how do they actually flow?

  • Of the five tabs, which are filled in manually and which are formula-driven?
  • How much does data flow between tabs — if one cell is wrong, does it cascade?
  • We noticed some formulas appear broken. Intentional, or a known issue?
  • Can we get 5–10 closed deals with both the original seller pack and the final model inputs?
04 · Seller pack variability

How different is different?

  • How different are packs across sellers — minor formatting differences or fundamentally different structures? Is there enough commonality to generalize, or do we need a separate process per vendor?
  • Does the full lease ever come as a standalone PDF, or is it always embedded as images in Excel?
05 · Scope and expectations

What does “good enough” look like?

  • How automated do you picture this in the ideal state for phase one?
  • Is the goal to keep the Excel model as-is, or is there openness to replacing it down the line with something more robust (e.g., a web tool)?
  • What would make you confident enough in the output to run it on a live deal?
06 · Proposed solutions

Three shapes to react to.

A few ways we see this working. We are sharing these as options to react to, not as recommendations.

01
Portal & email handoff

Upload and walk away.

Mike or Edmund uploads the seller pack to a simple web portal. The agent processes the data in the background, then emails them when complete with:

  • A populated Excel model
  • Findings and flags (anything unusual or worth a second look)
  • Questions it couldn’t resolve on its own (missing data, ambiguous clauses, etc.)
  • The original seller pack and the completed model, also dropped into SharePoint or OneDrive
The agent works in the background while you do other things.
02
Interactive review session

Watch the work happen.

Same upload step, but instead of an async email, the agent works through the pack in a chat-style interface. Mike or Edmund can:

  • Watch it extract data live
  • Answer clarifying questions in real time (“the lease term shows two different end dates — which one is the renewal?”)
  • Confirm or correct fields as the model fills in
  • Final outputs still land in SharePoint
Slower per deal, but a human checks work as it happens rather than reviewing a finished draft.
03
Email-in, email-out

Zero new tools to learn.

Mike or Edmund forwards the seller pack to a dedicated address (e.g., deals@…). The agent processes it and replies with:

  • Populated model attached
  • Findings and flags inline
  • Questions it couldn’t resolve on its own
  • No portal, no new tool to learn
Zero workflow change, at the cost of less control over file handling and versioning.