Esteban Garzon
Code Lab

Selective code access, architecture context, step-by-step usage, and sanitized engineering proof.

This page intentionally exposes curated slices of the work rather than a full repository dump. The goal is to show engineering thinking, integration breadth, execution flow, and practical usage guidance without over-sharing credentials, internal identifiers, or raw business data.

Public Samples: 13

Curated excerpts across ETL, APIs, telemetry, Graph, Excel automation, schema bridges, and batching flows.

Integration Families: 5

Moraware, RFMS, Microsoft Graph, Samsara, and spreadsheet-driven shop-floor pipelines.

Download Bundles: 2

Sanitized public utilities and source packages that are safe to share in a portfolio context.

Usage Guides: 2

Step-by-step instructions, prerequisites, and expected outputs for the public downloads.

Operational Ingestion

Patterns for turning shop-floor logs, ERP signals, and structured exports into cleaner analytical layers.

Spreadsheet normalization and sheet harmonization.
Controlled joins between job, slab, and activity records.
Incremental exports that preserve business context without leaking raw internals.
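The controlled-join pattern above can be sketched with pandas. This is a minimal illustration, not the real schema: `job_id`, `slab_id`, and `material` are assumed column names, and `validate="many_to_one"` is one way to keep a join "controlled" by failing fast on unexpected duplicates.

```python
import pandas as pd

def join_jobs_to_slabs(jobs: pd.DataFrame, slabs: pd.DataFrame) -> pd.DataFrame:
    """Left-join slab attributes onto job records, keeping every job row
    and flagging jobs that matched no slab record."""
    merged = jobs.merge(
        slabs,
        on="slab_id",
        how="left",
        validate="many_to_one",  # raise if slab_id is duplicated on the slab side
        indicator=True,          # adds a _merge column showing match provenance
    )
    merged["has_slab"] = merged.pop("_merge").eq("both")
    return merged

# Illustrative data only.
jobs = pd.DataFrame({"job_id": [1, 2, 3], "slab_id": ["A", "B", "Z"]})
slabs = pd.DataFrame({"slab_id": ["A", "B"], "material": ["granite", "quartz"]})
result = join_jobs_to_slabs(jobs, slabs)
```

Unmatched jobs survive the join with `has_slab` set to `False`, so downstream reporting can see the gap instead of silently losing rows.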

API Orchestration

Session handling, credential flows, pagination, and defensive parsing across vendor APIs and cloud services.

RFMS session bootstrap and estimate hydration.
Microsoft Graph client-credentials token exchange.
Samsara polling windows built for historical backfills.
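The Graph client-credentials exchange mentioned above can be sketched with only the standard library, posting to Microsoft's documented v2.0 token endpoint. The tenant, client ID, and secret values here are placeholders you would supply yourself; the bundle's actual client code may differ.

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
GRAPH_SCOPE = "https://graph.microsoft.com/.default"

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the URL and form-encoded body for a client-credentials exchange."""
    url = TOKEN_URL.format(tenant=tenant_id)
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": GRAPH_SCOPE,
    }).encode()
    return url, body

def fetch_token(tenant_id, client_id, client_secret, timeout=30):
    """POST the exchange and return the bearer token for Graph calls."""
    url, body = build_token_request(tenant_id, client_id, client_secret)
    req = urllib.request.Request(url, data=body)
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["access_token"]

# Placeholder values; no network call is made until fetch_token is invoked.
url, body = build_token_request("your-tenant-id", "your-client-id", "your-client-secret")
```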

Reliability Layers

The code emphasizes safe writes, deduplication, placeholder handling, and shape checks before records hit storage.

Completeness scoring before duplicate removal.
Batch inserts and time-window chunking for long runs.
Selective updates instead of blunt full-table churn where possible.
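The time-window chunking used for long backfill runs can be sketched as a small generator; the three-day step below is illustrative, not the production setting.

```python
from datetime import datetime, timedelta

def time_windows(start: datetime, end: datetime, step: timedelta):
    """Yield (window_start, window_end) pairs covering [start, end),
    so a long backfill becomes a series of bounded API calls."""
    cursor = start
    while cursor < end:
        window_end = min(cursor + step, end)
        yield cursor, window_end
        cursor = window_end

# Ten days in three-day steps: three full windows plus a one-day remainder.
windows = list(time_windows(
    datetime(2024, 1, 1), datetime(2024, 1, 11), timedelta(days=3)
))
```

Clamping the final window to `end` keeps the remainder exact, which matters when a vendor API rejects ranges that extend into the future.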

Desktop + Internal Tooling

Not everything lives in a web app. Some of the strongest leverage comes from automating the awkward middle ground.

PythonNET bridges into .NET-based vendor tooling.
Excel automation with `xlwings` and downstream dataframe cleanup.
Utilities packaged for non-technical operators when needed.
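The downstream dataframe cleanup that follows an `xlwings` read can be sketched like this. `clean_sheet` and its sample columns are illustrative helpers, not the actual utilities; the raw frame stands in for what a range read with `.options(pd.DataFrame)` might return.

```python
import pandas as pd

def clean_sheet(df: pd.DataFrame) -> pd.DataFrame:
    """Normalize headers, trim string cells, and drop fully empty rows
    after pulling a worksheet into a dataframe."""
    out = df.copy()
    out.columns = [str(c).strip().lower().replace(" ", "_") for c in out.columns]
    for col in out.columns:
        if out[col].dtype == object:
            out[col] = out[col].map(lambda v: v.strip() if isinstance(v, str) else v)
    return out.dropna(how="all")

# Illustrative raw sheet: padded headers, padded cells, one empty row.
raw = pd.DataFrame({" Job ID ": [" J-101 ", None], "Slab Name": ["  Calacatta ", None]})
cleaned = clean_sheet(raw)
```
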

Requirements

What you need before using the public bundles.

Python 3.11+

Most samples assume a modern Python runtime and common data tooling such as `requests`, `pandas`, and `psycopg2-binary`.

Windows-first utilities

The Outlook tool and some spreadsheet or desktop flows are designed around Windows environments and local desktop automation.

Service credentials

Public packages replace all secrets with placeholders. To run them, you must provide your own API keys, database credentials, and tenant identifiers.

External systems

Some samples expect access to third-party systems such as Moraware, Microsoft Graph, RFMS, SQL Server, PostgreSQL, or Samsara.

Workflow

How to use this material step by step.

01
Choose a bundle

Pick the public package that matches your need: Outlook extraction for email attachments or Live Data Integration Lab for ETL and API examples.

02
Read the guide first

Each bundle is sanitized and intentionally selective. Start with the README to understand the expected environment and what has been redacted.

03
Install prerequisites

Create a clean environment, install the listed packages, and replace placeholders like `***` with your own safe local configuration.
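One way to replace the `***` placeholders safely is to read each value from an environment variable and refuse to run while a placeholder remains. `require_env` is an illustrative helper, not part of the bundles.

```python
import os

def require_env(name: str, placeholder: str = "***") -> str:
    """Read a required setting from the environment, refusing to run
    with the sanitized placeholder still in place."""
    value = os.environ.get(name, placeholder)
    if not value or value == placeholder:
        raise RuntimeError(f"Set {name} before running this script")
    return value

# Stand-in for a real export such as `export DEMO_API_KEY=...` in your shell.
os.environ["DEMO_API_KEY"] = "example-key"
api_key = require_env("DEMO_API_KEY")
```

Failing loudly at startup beats a half-finished run that dies midway through an API sync because one credential was still `***`.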

04
Run a narrow test

Begin with a single script or a reduced date range so you can validate connectivity, schema behavior, and output shape before scaling up.

05
Adapt to your stack

The samples are meant to be templates. Rename tables, adjust schemas, and route outputs to your own storage and monitoring setup.

Playbooks

Bundle-specific instructions with prerequisites, steps, and expected outputs.

For engineers exploring ETL, API sync, and data-platform patterns

How to use the Live Data Integration Lab

This package is best used as a reference implementation. It shows structure, flow control, and integration techniques rather than pretending to be a plug-and-play app.

Requirements
Python 3.11 or newer
A virtual environment
Your own PostgreSQL or SQL-compatible destination
Your own vendor credentials for whichever script you want to test
Step by step
Download the zip and extract it into a clean folder.
Create a virtual environment and install `requirements.txt`.
Open the script you want to study first, such as the RFMS bridge or the Microsoft Graph sync.
Replace the placeholder values with your own local configuration.
Run a small test scope first, then expand the date range or dataset size once the shape looks correct.
Expected outputs
Understanding of auth flows, polling windows, and ETL control patterns
Reusable code structure for ingestion jobs
A safe starting point for adapting to your own systems
For teams or operators who need a practical attachment-export utility

How to use the Outlook Attachment Extractor

This package is intended as a usable Windows utility sample. It demonstrates both packaging discipline and a real-world local workflow.

Requirements
Windows environment
Python installed locally if running from source
Access to an OST file you are authorized to inspect
Enough local disk space for exported attachments
Step by step
Download the source package and extract it locally.
Review the README and install the dependencies if you plan to run it from source.
Launch the GUI or CLI entrypoint depending on your workflow.
Point the tool to your OST file, filter by the criteria you need, and choose an export folder.
Verify a small sample export first before running a broader extraction.
Expected outputs
Structured attachment exports
A reusable local utility for mailbox recovery or audit workflows
A reference implementation for GUI-driven Python tooling
Source

Inventory Deduplication Logic

A practical deduplication pass that protects downstream merges from repeated inventory identifiers.

slab_project/etl_slabsmith.py
Ranks rows by completeness before removing duplicates.
Preserves rows without a usable key instead of discarding them.
Makes the ETL safer for reporting and enrichment joins.
Read-only excerpt
import pandas as pd  # module-level import in slab_project/etl_slabsmith.py

def deduplicate_by_inventory_id(df, key_col):
    # normalize_inventory_key is a helper defined elsewhere in the same module.
    if df.empty or key_col not in df.columns:
        return df.copy(), 0

    deduped = df.copy()
    deduped[key_col] = deduped[key_col].apply(normalize_inventory_key)

    mask_has_key = deduped[key_col].notna()
    with_key = deduped[mask_has_key].copy()
    without_key = deduped[~mask_has_key].copy()

    if with_key.empty:
        return deduped, 0

    completeness_score = pd.Series(0, index=with_key.index, dtype="int64")
    for col in [col for col in with_key.columns if col != key_col]:
        if pd.api.types.is_numeric_dtype(with_key[col]):
            completeness_score += with_key[col].notna().astype("int64")
        else:
            completeness_score += (
                with_key[col].fillna("").astype(str).str.strip().ne("").astype("int64")
            )

    with_key["_dedupe_score"] = completeness_score
    with_key = with_key.sort_values([key_col, "_dedupe_score"], ascending=[True, False])
    duplicate_rows_removed = len(with_key) - with_key[key_col].nunique()
    with_key = with_key.drop_duplicates(subset=[key_col], keep="first")

    deduped = pd.concat([with_key.drop(columns=["_dedupe_score"]), without_key], ignore_index=True)
    return deduped, duplicate_rows_removed
Public Bundles

Downloads that stay useful without exposing internals.

Free Utility

Outlook Attachment Extractor

A Windows-based OST attachment extraction utility with GUI and CLI workflows, designed to browse Outlook data files, filter messages, and export attachments in a structured way.

Sanitized public package with source code only.
No client data, extracted mail content, temporary mailboxes, or build artifacts included.
Useful as a practical download for teams that need OST attachment extraction workflows.
Public Source Bundle

Live Data Integration Lab

A curated package of sanitized integration scripts covering Moraware, RFMS, Microsoft Graph, Samsara, and Excel-based ingestion patterns.

Built from real operational automations, rewritten to remove secrets, internal IDs, private paths, and raw business data.
Useful for demonstrating ETL, API authentication, batch backfills, desktop automation, and schema-shaping patterns.
Safe to share as a public engineering sample without exposing the original `livedataproj` workspace.