<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>mdcbowen</title><link>http://mdcbowen.info/projects/</link><description>Recent content on mdcbowen</description><generator>Hugo</generator><language>en-us</language><atom:link href="http://mdcbowen.info/projects/index.xml" rel="self" type="application/rss+xml"/><item><title/><link>http://mdcbowen.info/projects/ai_skills/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/ai_skills/</guid><description>&lt;h3 id="abstract"&gt;
 Abstract
 &lt;a class="heading-link" href="#abstract"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;I am actively engaged in extending my data engineering and BI development with emerging AI protocols.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Warp integration (Sonnet 4, OpenAI o3-mini planning)&lt;/li&gt;
&lt;li&gt;Anthropic MCP Inspector (Python FastMCP)&lt;/li&gt;
&lt;li&gt;Local Ollama (0.6.8) (gemma3:27b, gemma3:latest)&lt;/li&gt;
&lt;li&gt;Zed Agent (primary)&lt;/li&gt;
&lt;li&gt;React &amp;amp; Vue Web development (Sonnet 4)&lt;/li&gt;
&lt;li&gt;DuckDB prototyping (DeltaLake)&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="de-aim"&gt;
 DE Aim
 &lt;a class="heading-link" href="#de-aim"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h4&gt;
&lt;p&gt;My aim is to use OpenMetadata (or Collate) to clarify how traditional data pipelines integrate with new agentic ones.&lt;/p&gt;</description></item><item><title/><link>http://mdcbowen.info/projects/best/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/best/</guid><description>&lt;h2 id="the-best-clients--why"&gt;
 the best clients &amp;amp; why
 &lt;a class="heading-link" href="#the-best-clients--why"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;Philip Morris, NYC&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The most well-managed, highly talented, no-nonsense, business-oriented customer I have ever worked with.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Full360, Cornelius NC&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Sharpest CTO, fabulous work environment, forward-looking.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Commonwealth Edison (ComEd), Chicago&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Most technically proficient IT staff.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Qantas, Sydney&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Forward-looking, risk-taking in software, highly secure &amp;amp; redundant implementations.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Gap, NorCal&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Most tactically motivated, well-understood problem definitions.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Nielsen, Dunedin&lt;/p&gt;</description></item><item><title/><link>http://mdcbowen.info/projects/boeing/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/boeing/</guid><description>&lt;h2 id="boeing-cps-rehost-proposal"&gt;
 boeing cps rehost proposal
 &lt;a class="heading-link" href="#boeing-cps-rehost-proposal"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;h3 id="boeing-cps"&gt;
 Boeing CPS
 &lt;a class="heading-link" href="#boeing-cps"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;Bowen was the technical lead for a proposal to Boeing. He designed the architecture and RFP response for a proposed system that would replace the existing cost planning system for all of Boeing’s commercial and classified projects. The system was scoped for 5,000 users worldwide and on the order of 17,000 active multidimensional models annually. Bowen worked with Answerthink’s sales team in conjunction with Hyperion and another vendor he brought into the solution. Answerthink ultimately won the selection, which took place over 14 months and originally included 12 vendors. Boeing, however, decided not to build the system immediately because of budget constraints.&lt;/p&gt;</description></item><item><title/><link>http://mdcbowen.info/projects/jobber/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/jobber/</guid><description>&lt;h1 id="jobber-an-agentic-job-search-platform"&gt;
 Jobber: An Agentic Job Search Platform
 &lt;a class="heading-link" href="#jobber-an-agentic-job-search-platform"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h1&gt;
&lt;p&gt;The modern job search is a grinding, repetitive process — sifting through alert emails, cross-referencing requirements against your own experience, tailoring resumes, and making go/no-go decisions dozens of times a week. Jobber is an agentic AI platform that automates the mechanical parts of this workflow while preserving human judgment where it matters most. It also serves as a demonstration of end-to-end system design: structured AI agent orchestration, idempotent data pipelines, and a purpose-built review interface — all written in &lt;strong&gt;Go&lt;/strong&gt; against PostgreSQL.&lt;/p&gt;</description></item><item><title/><link>http://mdcbowen.info/projects/jobber_old/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/jobber_old/</guid><description>&lt;h1 id="jobber-an-agentic-job-search-platform"&gt;
 Jobber: An Agentic Job Search Platform
 &lt;a class="heading-link" href="#jobber-an-agentic-job-search-platform"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h1&gt;
&lt;h2 id="project-summary"&gt;
 Project Summary
 &lt;a class="heading-link" href="#project-summary"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;Jobber is a multi-stage, AI-agent-driven platform that automates the full lifecycle of a professional job search — from harvesting listings out of email alerts, through intelligent triage and application packaging, to human-in-the-loop curation via a custom web console. The system is composed of three tightly integrated subsystems, each purpose-built to eliminate manual overhead while preserving human judgment where it matters most.&lt;/p&gt;</description></item><item><title/><link>http://mdcbowen.info/projects/metro_abstract/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/metro_abstract/</guid><description>&lt;h1 id="metro-practice-engineering-and-data-governance-tooling"&gt;
 METRO: Practice Engineering and Data Governance Tooling
 &lt;a class="heading-link" href="#metro-practice-engineering-and-data-governance-tooling"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h1&gt;
&lt;h2 id="executive-summary"&gt;
 Executive Summary
 &lt;a class="heading-link" href="#executive-summary"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;METRO is a practice engineering initiative that codifies data engineering methodology, AI-assisted development frameworks, and web design standards into executable tools and auditable documentation. Its most significant technical artifact is &lt;strong&gt;md-util&lt;/strong&gt;, a Go CLI that transforms raw data assets from disparate sources into SHACL 1.2 ontology representations and emits dialect-specific DDL for Postgres, BigQuery, Databricks, Parquet/Arrow, and Vortex — solving the schema synchronization problem across heterogeneous data environments with a single canonical model. Together, METRO and md-util demonstrate end-to-end system design from strategic framing through specification to working software.&lt;/p&gt;</description></item><item><title/><link>http://mdcbowen.info/projects/tiingo/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/tiingo/</guid><description>&lt;h2 id="tiingo-portfolio-tracker"&gt;
 tiingo portfolio tracker
 &lt;a class="heading-link" href="#tiingo-portfolio-tracker"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;A Go-based application that automatically fetches market data from the &lt;a href="https://www.tiingo.com/" class="external-link" target="_blank" rel="noopener"&gt;Tiingo API&lt;/a&gt; and stores it in a local DuckDB database. Track stocks and cryptocurrencies with automated daily syncing via a Kafka-based messaging architecture.&lt;/p&gt;
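&lt;p&gt;The idempotency provided by the Kafka consumer can be shown in miniature. The sketch below is illustrative Python (the project itself is Go), with an in-memory dict standing in for the DuckDB table: each price message carries a natural key, and replaying the same message is a no-op.&lt;/p&gt;

```python
# Illustrative sketch of the idempotent persistence step; names are
# hypothetical, and an in-memory dict stands in for DuckDB.

def apply_message(store: dict, msg: dict) -> bool:
    """Persist one end-of-day price row; return True only if it was new."""
    key = (msg["ticker"], msg["date"])   # natural key for one price row
    if key in store:
        return False                     # redelivered message: no-op
    store[key] = msg["close"]
    return True

store = {}
msg = {"ticker": "AAPL", "date": "2024-01-02", "close": 185.64}
assert apply_message(store, msg)         # first delivery persists the row
assert not apply_message(store, msg)     # replay leaves the store unchanged
assert len(store) == 1
```

&lt;p&gt;In the real pipeline the same property comes from inserting on a key of (ticker, date), so replayed messages cannot create duplicate rows.&lt;/p&gt;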
&lt;h2 id="overview"&gt;
 Overview
 &lt;a class="heading-link" href="#overview"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;This application reads a portfolio of ticker symbols from a text file, fetches historical and current market data from Tiingo, and maintains a local DuckDB database for fast querying and analysis. The application uses a &lt;strong&gt;Kafka-based architecture&lt;/strong&gt; for decoupling API fetching from database persistence, providing idempotency, auditability, and scalability.&lt;/p&gt;</description></item><item><title>bondcliq</title><link>http://mdcbowen.info/projects/bondcliq/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/bondcliq/</guid><description>&lt;p&gt;&lt;strong&gt;bondcliq&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;This is currently a private project.&lt;/p&gt;
&lt;p&gt;But in essence, I am building a set of Go-based libraries of custom metrics for the analysis of corporate bonds.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Macaulay Duration&lt;/li&gt;
&lt;li&gt;Macaulay Convexity&lt;/li&gt;
&lt;li&gt;Yield to Maturity&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>Cymer/ASML</title><link>http://mdcbowen.info/projects/cymer/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/cymer/</guid><description>&lt;h2 id="cymerasml"&gt;
 Cymer/ASML
 &lt;a class="heading-link" href="#cymerasml"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;As the lead data engineer for a high-visibility engagement with Cymer (a division of ASML), I was tasked with executing a structured technology evaluation of &lt;strong&gt;Amazon Redshift&lt;/strong&gt; as a candidate to replace a legacy Microsoft SQL Server (MSSQL) data warehouse. The client provided an extensive test plan, including performance benchmarks and reproducibility criteria, for which I was responsible for both implementation and interpretation.&lt;/p&gt;</description></item><item><title>Full 360 Major Projects</title><link>http://mdcbowen.info/projects/full360_cv/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/full360_cv/</guid><description>&lt;p&gt;Full360 cv&lt;/p&gt;
&lt;h1 id="full-360-major-projects"&gt;
 Full 360 Major Projects
 &lt;a class="heading-link" href="#full-360-major-projects"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h1&gt;
&lt;p&gt;Shortlist&lt;/p&gt;
&lt;table&gt;
 &lt;thead&gt;
 &lt;tr&gt;
 &lt;th&gt;&lt;strong&gt;Customer&lt;/strong&gt;&lt;/th&gt;
 &lt;th&gt;&lt;strong&gt;What I Did&lt;/strong&gt;&lt;/th&gt;
 &lt;th&gt;&lt;strong&gt;What Was Delivered&lt;/strong&gt;&lt;/th&gt;
 &lt;/tr&gt;
 &lt;/thead&gt;
 &lt;tbody&gt;
 &lt;tr&gt;
 &lt;td&gt;3Com: Santa Clara, CA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Salesforce Automation &amp;ndash; Trial&lt;/strong&gt; Weekly worldwide product sales analysis. SFA for 1200 users worldwide. Multiple front-end designs &amp;ndash; Excel, Lotus, Analyzer, AvantGo Server / Palm integration using Hyperion Spiderman. Extended POC management and design win over 2 competitors&lt;/td&gt;
 &lt;td&gt;Won 8 week trial for $1.5 million deal.&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;SLC Construction&lt;/td&gt;
 &lt;td&gt;Design &amp;amp; development of cash-sensitive planning &amp;amp; reporting model for construction &amp;amp; real-estate contractor. Solomon integration. Large number of driver based metrics and KPIs&lt;/td&gt;
 &lt;td&gt;Reverse engineered Solomon ERP&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Cingular Wireless: Alpharetta, GA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Store Level P&amp;amp;L &amp;ndash; Multiple ERP&lt;/strong&gt; Built Store &amp;amp; Channel profitability on a 5000-member COA. Integrated store attribute dimensions. Sourced from Oracle Financials - One World. Oracle - Teradata backends. Extensive scripting and automation of replicated partition maintenance.&lt;/td&gt;
 &lt;td&gt;Implemented in record time, &amp;lt; 6 weeks.&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Cleveland, OH&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Large Scale Migration &amp;ndash; Systems Inventory&lt;/strong&gt; High-level coordination of a large upgrade to System 9 and migration of 100+ Essbase databases. Custom scripting for transfer of native security to multiple LDAP external authentications via Hyperion Shared Services CLI tools.&lt;/td&gt;
 &lt;td&gt;Rubric Based Mgmt&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;I2 Technologies Las Colinas, TX&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Supply Chain Analytics &amp;ndash; Custom Models&lt;/strong&gt; Custom data integration with Rhythm suite for SCM analytics product prototype. Reading of supply chain BOM. Investigation of CORBA interfaces.&lt;/td&gt;
 &lt;td&gt;API Level analysis, PhD customers&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Men&amp;rsquo;s Wearhouse Houston TX&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;System Performance Review &amp;ndash; Strategic Technology&lt;/strong&gt; Extensive assessment for strategic upgrade &amp;amp; cross-platform migration of 25+ Essbase databases and associated reporting systems. Established use case and produced weighted recommendations for performance &amp;amp; reliability.&lt;/td&gt;
 &lt;td&gt;All analysis, end to end. Systems review.&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Boeing Federal Way, WA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Global Project Planning &amp;ndash; Massive Scale&lt;/strong&gt; Global engineering project-based cost management system. Custom built Essbase designs for 64bit Superdome application with full .NET smart client custom design. RFP winning design over 15 competitors. Designed for 5000 concurrent users on 17,000 financial models. Extended POC coordinated with multiple partners including HP Cupertino &amp;amp; Applied OLAP.&lt;/td&gt;
 &lt;td&gt;Las Vegas Presentation, Applied OLAP&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Nissan: Los Angeles, CA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;North American Regional DW Planning &amp;ndash; ETL Coordination&lt;/strong&gt; Total 34 Essbase databases, 17 under Hyperion Planning. Near-realtime asynchronous DW ETL interface for multiple Essbase datamarts. Five timezones. P/L, B/S, others independent. Intercube data transfer plus rerouting of Essbase data back to the DW. IBM Regatta / DB2 UDB plus Informatica backend with custom ksh &amp;amp; perl integration with MAXL.&lt;/td&gt;
 &lt;td&gt;Complex SLA, workflow&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Qantas: Sydney AUS&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Flight Operations - Transaction Re-engineering&lt;/strong&gt; Ingested legacy transaction &amp;amp; VLDB systems to support generation of complex metrics associated with fuel consumption and pilot fatigue. Regulatory compliance with work-rules design established in the new system. Columnar DB, AWS data migration. Near-realtime processes.&lt;/td&gt;
 &lt;td&gt;Realtime &amp;amp; wide data&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Coca Cola: Atlanta GA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;High Dimensionality &amp;ndash; Customer Sampling&lt;/strong&gt; 16-dimensional model of consumer buying and consumption patterns.&lt;/td&gt;
 &lt;td&gt;Deep analytics&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Rockwell Rocketdyne: Canoga Park, CA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;BOM Inventory &amp;ndash;&lt;/strong&gt; Custom built hierarchical model for ISS launch package inventory. Weight calculations for all subsystems.&lt;/td&gt;
 &lt;td&gt;Space Shuttle&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Hot Topic: Industry, CA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Retail Loyalty &amp;ndash; VB Integration&lt;/strong&gt; Built integrated POS reporting system distributed daily from all store locations to managers.&lt;/td&gt;
 &lt;td&gt;Custom VB API&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Qantas: Sydney&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;FRMS &amp;ndash; Waypoint Navigation&lt;/strong&gt;&lt;/td&gt;
 &lt;td&gt;&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Toyota: Los Angeles, CA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Essbase - OracleDB Integration&lt;/strong&gt; OLAP - DW drilldown integration. Spec&amp;rsquo;d ODS for OLAP drillthrough. Added new security for DW detail. Resolved timeout issues &lt;strong&gt;across network &amp;amp; queries.&lt;/strong&gt;&lt;/td&gt;
 &lt;td&gt;Network timeout troubleshooting, security limitations, ODS Optimization&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;BCBS: Providence, RI&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Capitation Analytics&lt;/strong&gt; Essbase datamart design to track diagnosis, doctor, prescription.&lt;/td&gt;
 &lt;td&gt;First such system in BCBS&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Hines: Houston, TX&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;REIT -&lt;/strong&gt; Rescued broken database missing backups. Reorganized functional staff procedures. Implemented new backup &amp;amp; security procedures.&lt;/td&gt;
 &lt;td&gt;Excel-based spaghetti, no security.&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Guess: Los Angeles, CA&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Cluster Availability&lt;/strong&gt; Increased reliability by adding a Raft-based service-discovery mesh. Instrumented observability, performed database optimization and load metrics. Eliminated NFS backup.&lt;/td&gt;
 &lt;td&gt;Five Vertica clusters, US - Japan.&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Teva Pharma NYC&lt;/td&gt;
 &lt;td&gt;&lt;strong&gt;Database Performance &amp;amp; Migration&lt;/strong&gt; Performed all DBA functions for multi-schema datamodel including Veeva, S3 based customer submissions, query performance monitoring &amp;amp; tuning, QA -&amp;gt; PROD, daily multistage ingestion &amp;amp; transformation, salesforce integration.&lt;/td&gt;
 &lt;td&gt;Massive complexity&lt;/td&gt;
 &lt;/tr&gt;
 &lt;/tbody&gt;
&lt;/table&gt;</description></item><item><title>Full Technical Employment History</title><link>http://mdcbowen.info/projects/full/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/full/</guid><description>&lt;h3 id="full-technical-employment-history"&gt;
 Full Technical Employment History
 &lt;a class="heading-link" href="#full-technical-employment-history"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;updated 2025-08&lt;/li&gt;
&lt;/ul&gt;
&lt;table&gt;
 &lt;thead&gt;
 &lt;tr&gt;
 &lt;th&gt;Company&lt;/th&gt;
 &lt;th&gt;HQ&lt;/th&gt;
 &lt;th&gt;Position&lt;/th&gt;
 &lt;th&gt;Start&lt;/th&gt;
 &lt;th&gt;End&lt;/th&gt;
 &lt;/tr&gt;
 &lt;/thead&gt;
 &lt;tbody&gt;
 &lt;tr&gt;
 &lt;td&gt;Bachelor Chemical&lt;/td&gt;
 &lt;td&gt;Whittier, CA&lt;/td&gt;
 &lt;td&gt;High School Intern&lt;/td&gt;
 &lt;td&gt;1978&lt;/td&gt;
 &lt;td&gt;1978&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Health Department&lt;/td&gt;
 &lt;td&gt;Los Angeles, CA&lt;/td&gt;
 &lt;td&gt;Intern&lt;/td&gt;
 &lt;td&gt;1982-06&lt;/td&gt;
 &lt;td&gt;1982-12&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Xerox Corp&lt;/td&gt;
 &lt;td&gt;El Segundo, CA&lt;/td&gt;
 &lt;td&gt;Summer Intern x3&lt;/td&gt;
 &lt;td&gt;1984&lt;/td&gt;
 &lt;td&gt;1986&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Xerox Corp&lt;/td&gt;
 &lt;td&gt;El Segundo, CA&lt;/td&gt;
 &lt;td&gt;Programmer / Analyst&lt;/td&gt;
 &lt;td&gt;1987-07&lt;/td&gt;
 &lt;td&gt;1990-12&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Pilot Software&lt;/td&gt;
 &lt;td&gt;Cambridge, MA&lt;/td&gt;
 &lt;td&gt;Sr Consultant / Support Mgr.&lt;/td&gt;
 &lt;td&gt;1990-12&lt;/td&gt;
 &lt;td&gt;1994-06&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Metro Decisions&lt;/td&gt;
 &lt;td&gt;Atlanta, GA&lt;/td&gt;
 &lt;td&gt;President&lt;/td&gt;
 &lt;td&gt;1994-06&lt;/td&gt;
 &lt;td&gt;1996-10&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Arbor / Hyperion&lt;/td&gt;
 &lt;td&gt;Sunnyvale, CA&lt;/td&gt;
 &lt;td&gt;Sr. Consultant / BizDev&lt;/td&gt;
 &lt;td&gt;1996-09&lt;/td&gt;
 &lt;td&gt;2001-06&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Metro Decisions&lt;/td&gt;
 &lt;td&gt;Redondo Beach, CA&lt;/td&gt;
 &lt;td&gt;President&lt;/td&gt;
 &lt;td&gt;2001-06&lt;/td&gt;
 &lt;td&gt;2005-06&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Hackett Group&lt;/td&gt;
 &lt;td&gt;Miami, FL&lt;/td&gt;
 &lt;td&gt;Managing Consultant&lt;/td&gt;
 &lt;td&gt;2005-04&lt;/td&gt;
 &lt;td&gt;2007-10&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Solver&lt;/td&gt;
 &lt;td&gt;Los Angeles, CA&lt;/td&gt;
 &lt;td&gt;VP Services&lt;/td&gt;
 &lt;td&gt;2007-10&lt;/td&gt;
 &lt;td&gt;2008-06&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;WhitmanHart / Rolta&lt;/td&gt;
 &lt;td&gt;Alpharetta, GA&lt;/td&gt;
 &lt;td&gt;Sr. Mgmt. Consultant&lt;/td&gt;
 &lt;td&gt;2008-06&lt;/td&gt;
 &lt;td&gt;2010-11&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Full 360&lt;/td&gt;
 &lt;td&gt;Charlotte, NC&lt;/td&gt;
 &lt;td&gt;Lead Data Architect&lt;/td&gt;
 &lt;td&gt;2011-01&lt;/td&gt;
 &lt;td&gt;2021-06&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;OpenText&lt;/td&gt;
 &lt;td&gt;Waterloo, ON&lt;/td&gt;
 &lt;td&gt;Sr. Solutions Architect&lt;/td&gt;
 &lt;td&gt;2021-06&lt;/td&gt;
 &lt;td&gt;2024-05&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Darwin Analytics&lt;/td&gt;
 &lt;td&gt;Phoenix, AZ&lt;/td&gt;
 &lt;td&gt;Lead Data Engineer&lt;/td&gt;
 &lt;td&gt;2024-05&lt;/td&gt;
 &lt;td&gt;2024-10&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Metro Decisions&lt;/td&gt;
 &lt;td&gt;Los Angeles, CA&lt;/td&gt;
 &lt;td&gt;President&lt;/td&gt;
 &lt;td&gt;2024-11&lt;/td&gt;
 &lt;td&gt;Present&lt;/td&gt;
 &lt;/tr&gt;
 &lt;/tbody&gt;
&lt;/table&gt;</description></item><item><title>Git Repositories Inventory</title><link>http://mdcbowen.info/projects/repos/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/repos/</guid><description>&lt;h1 id="git-repositories-inventory"&gt;
 Git Repositories Inventory
 &lt;a class="heading-link" href="#git-repositories-inventory"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h1&gt;
&lt;h3 id="abstract"&gt;
 Abstract
 &lt;a class="heading-link" href="#abstract"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;Sometimes people ask me for a selection of GitHub repos. That has been a problem: I am restricted by NDA, and I never
got in the habit of paying GitHub to host all of this work. It wasn&amp;rsquo;t until I got a 4TB workstation that I could keep all of my work
in one development library. I consolidated it all over the past year, and I now host my own GitLab server, so at least you can
see that I&amp;rsquo;ve been busy over the years. (VEGA is Vertica stuff.)&lt;/p&gt;</description></item><item><title>golfbag</title><link>http://mdcbowen.info/projects/golfbag/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/golfbag/</guid><description>&lt;h2 id="golfbag"&gt;
 golfbag
 &lt;a class="heading-link" href="#golfbag"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;status&lt;/strong&gt;
&amp;ndash; finished linear regression (2025-09-17)&lt;/p&gt;
&lt;h1 id="-12-week-ml--etl-curriculum-golf-bag-training"&gt;
 🏌️ 12-Week ML + ETL Curriculum (Golf Bag Training)
 &lt;a class="heading-link" href="#-12-week-ml--etl-curriculum-golf-bag-training"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h1&gt;
&lt;p&gt;This curriculum covers 12 canonical ML categories (the “golf bag”) with ETL practice.&lt;br&gt;
Each week introduces a dataset, ETL tasks, and ML goals. Use this as a &lt;strong&gt;checklist&lt;/strong&gt; in Obsidian.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="-week-1-linear-regression"&gt;
 📅 Week 1: Linear Regression
 &lt;a class="heading-link" href="#-week-1-linear-regression"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://scikit-learn.org/stable/datasets/real_world.html#california-housing-dataset" class="external-link" target="_blank" rel="noopener"&gt;California Housing (scikit-learn)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Load CSV, validate schema&lt;/li&gt;
&lt;li&gt;Handle missing values&lt;/li&gt;
&lt;li&gt;Normalize numeric fields&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Train/test split&lt;/li&gt;
&lt;li&gt;Fit linear regression&lt;/li&gt;
&lt;li&gt;Interpret coefficients&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
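&lt;p&gt;A minimal sketch of the Week 1 loop, using synthetic stand-in data so it runs offline (the California Housing loader downloads on first use; the workflow is the same):&lt;/p&gt;

```python
# Week 1 sketch: normalize, split, fit, read the coefficients.
# Synthetic stand-in data; swap in fetch_california_housing() for the real set.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                          # three numeric features
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.1, size=500)

X = StandardScaler().fit_transform(X)                  # ETL: normalize fields

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)

print("coefficients:", model.coef_.round(2))           # roughly [2, -1, 0]
print("R^2:", round(model.score(X_te, y_te), 3))
```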
&lt;hr&gt;
&lt;h2 id="-week-2-logistic-regression"&gt;
 📅 Week 2: Logistic Regression
 &lt;a class="heading-link" href="#-week-2-logistic-regression"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://scikit-learn.org/stable/datasets/toy_dataset.html#breast-cancer-dataset" class="external-link" target="_blank" rel="noopener"&gt;Breast Cancer Wisconsin&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Encode categorical features&lt;/li&gt;
&lt;li&gt;Scale inputs&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Train binary classifier&lt;/li&gt;
&lt;li&gt;Evaluate with ROC, precision, recall&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
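&lt;p&gt;A compact sketch of the week; the Wisconsin breast cancer set ships with scikit-learn, so this runs offline:&lt;/p&gt;

```python
# Week 2 sketch: scale inputs, train a binary classifier, evaluate it.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

scaler = StandardScaler().fit(X_tr)                    # ETL: scale inputs
clf = LogisticRegression(max_iter=1000).fit(scaler.transform(X_tr), y_tr)

proba = clf.predict_proba(scaler.transform(X_te))[:, 1]
pred = clf.predict(scaler.transform(X_te))
print("ROC AUC:", round(roc_auc_score(y_te, proba), 3))
print("precision:", round(precision_score(y_te, pred), 3))
print("recall:", round(recall_score(y_te, pred), 3))
```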
&lt;hr&gt;
&lt;h2 id="-week-3-decision-trees"&gt;
 📅 Week 3: Decision Trees
 &lt;a class="heading-link" href="#-week-3-decision-trees"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://scikit-learn.org/stable/datasets/toy_dataset.html#iris-plants-dataset" class="external-link" target="_blank" rel="noopener"&gt;Iris Dataset&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Basic EDA (histograms, scatter plots)&lt;/li&gt;
&lt;li&gt;Verify balanced classes&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Build decision tree&lt;/li&gt;
&lt;li&gt;Visualize splits&lt;/li&gt;
&lt;li&gt;Observe overfitting&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
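&lt;p&gt;A short sketch for the week; the unpruned tree memorizes the training data, which makes the overfitting point concrete:&lt;/p&gt;

```python
# Week 3 sketch: fit an unpruned tree, print its splits, compare accuracies.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
X_tr, X_te, y_tr, y_te = train_test_split(
    data.data, data.target, random_state=0, stratify=data.target)

tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)  # no pruning
print(export_text(tree, feature_names=list(data.feature_names)))  # the splits

print("train accuracy:", tree.score(X_tr, y_tr))   # near-perfect: overfit
print("test accuracy:", round(tree.score(X_te, y_te), 3))
```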
&lt;hr&gt;
&lt;h2 id="-week-4-random-forests"&gt;
 📅 Week 4: Random Forests
 &lt;a class="heading-link" href="#-week-4-random-forests"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://archive.ics.uci.edu/dataset/2/adult" class="external-link" target="_blank" rel="noopener"&gt;UCI Adult / Census Income&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Encode categorical features&lt;/li&gt;
&lt;li&gt;Impute missing values&lt;/li&gt;
&lt;li&gt;Balance classes&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Train random forest&lt;/li&gt;
&lt;li&gt;Measure feature importance&lt;/li&gt;
&lt;li&gt;Compare to logistic regression&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
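&lt;p&gt;A sketch of the week using a synthetic, class-imbalanced stand-in (the UCI Adult CSV requires a download; the workflow is identical):&lt;/p&gt;

```python
# Week 4 sketch: random forest vs. logistic regression, plus importances.
# Synthetic imbalanced data stands in for the Adult / Census Income set.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=10, n_informative=4,
                           weights=[0.75, 0.25], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("forest accuracy:", round(rf.score(X_te, y_te), 3))
print("logistic accuracy:", round(lr.score(X_te, y_te), 3))
print("most important feature index:", int(rf.feature_importances_.argmax()))
```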
&lt;hr&gt;
&lt;h2 id="-week-5-k-means-clustering"&gt;
 📅 Week 5: K-Means Clustering
 &lt;a class="heading-link" href="#-week-5-k-means-clustering"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="http://yann.lecun.com/exdb/mnist/" class="external-link" target="_blank" rel="noopener"&gt;MNIST Digits (scikit-learn or full MNIST)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Flatten images → tabular form&lt;/li&gt;
&lt;li&gt;Apply PCA for preprocessing&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Run K-Means (k=10)&lt;/li&gt;
&lt;li&gt;Visualize clusters&lt;/li&gt;
&lt;li&gt;Evaluate with silhouette score&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
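&lt;p&gt;A sketch using the 8x8 digits bundled with scikit-learn, which arrive already flattened, so the tabular step reduces to a no-op:&lt;/p&gt;

```python
# Week 5 sketch: PCA preprocessing, K-Means with k=10, silhouette score.
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.metrics import silhouette_score

X, _ = load_digits(return_X_y=True)            # (1797, 64): already tabular
X_pca = PCA(n_components=20, random_state=0).fit_transform(X)

km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X_pca)
print("clusters found:", len(set(km.labels_)))
print("silhouette:", round(silhouette_score(X_pca, km.labels_), 3))
```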
&lt;hr&gt;
&lt;h2 id="-week-6-pca-dimensionality-reduction"&gt;
 📅 Week 6: PCA (Dimensionality Reduction)
 &lt;a class="heading-link" href="#-week-6-pca-dimensionality-reduction"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://archive.ics.uci.edu/dataset/109/wine" class="external-link" target="_blank" rel="noopener"&gt;UCI Wine Dataset&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Normalize features&lt;/li&gt;
&lt;li&gt;Check correlations&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Run PCA&lt;/li&gt;
&lt;li&gt;Plot explained variance&lt;/li&gt;
&lt;li&gt;Create 2D scatter plot&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
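&lt;p&gt;The Week 6 flow is a few lines in scikit-learn; standardizing first matters because PCA is driven by raw feature variance:&lt;/p&gt;

```python
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# ETL: normalize features so no single scale dominates the components
X, y = load_wine(return_X_y=True)
X_scaled = StandardScaler().fit_transform(X)

# ML: full PCA; the explained-variance ratios are what the scree plot shows
pca = PCA().fit(X_scaled)
print(pca.explained_variance_ratio_.round(3))

# 2-D projection for the scatter plot (color the points by class y)
X_2d = PCA(n_components=2).fit_transform(X_scaled)
print(X_2d.shape)
```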
&lt;hr&gt;
&lt;h2 id="-week-7-neural-networks-mlp"&gt;
 📅 Week 7: Neural Networks (MLP)
 &lt;a class="heading-link" href="#-week-7-neural-networks-mlp"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="http://yann.lecun.com/exdb/mnist/" class="external-link" target="_blank" rel="noopener"&gt;MNIST Digits&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Normalize pixel intensities&lt;/li&gt;
&lt;li&gt;Train/val/test split&lt;/li&gt;
&lt;li&gt;Manage batch sizes&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Build 2-layer MLP&lt;/li&gt;
&lt;li&gt;Compare accuracy to logistic regression&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
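&lt;p&gt;A minimal sketch of Week 7 with scikit-learn&amp;rsquo;s MLPClassifier, again using the built-in digits set so it runs in seconds:&lt;/p&gt;

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# ETL: normalize pixel intensities (digits pixels range 0..16) and split
X, y = load_digits(return_X_y=True)
X = X / 16.0
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# ML: a 2-hidden-layer MLP vs. a logistic-regression baseline
mlp = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=300,
                    random_state=0).fit(X_tr, y_tr)
logreg = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("MLP:", mlp.score(X_te, y_te), "LogReg:", logreg.score(X_te, y_te))
```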
&lt;hr&gt;
&lt;h2 id="-week-8-cnn-convolutions"&gt;
 📅 Week 8: CNN (Convolutions)
 &lt;a class="heading-link" href="#-week-8-cnn-convolutions"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://www.cs.toronto.edu/~kriz/cifar.html" class="external-link" target="_blank" rel="noopener"&gt;CIFAR-10&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Resize and augment images&lt;/li&gt;
&lt;li&gt;Split train/val/test&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Build CNN with conv + pooling&lt;/li&gt;
&lt;li&gt;Apply dropout&lt;/li&gt;
&lt;li&gt;Compare to transfer learning&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
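&lt;p&gt;A minimal PyTorch sketch of the Week 8 architecture (conv + pooling + dropout); the forward pass is checked on a random CIFAR-shaped batch rather than the downloaded dataset:&lt;/p&gt;

```python
import torch
import torch.nn as nn

# A small CNN for 32x32 RGB CIFAR-10 images: two conv/pool stages,
# dropout for regularization, then a linear head over the 10 classes.
class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # halves 32x32 to 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # halves 16x16 to 8x8
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Dropout(0.5), nn.Linear(64 * 8 * 8, 10))

    def forward(self, x):
        return self.head(self.features(x))

# Shape check on a random batch laid out like CIFAR-10: (N, C, H, W)
logits = SmallCNN()(torch.randn(4, 3, 32, 32))
print(logits.shape)
```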
&lt;hr&gt;
&lt;h2 id="-week-9-rnn--lstm-sequences"&gt;
 📅 Week 9: RNN / LSTM (Sequences)
 &lt;a class="heading-link" href="#-week-9-rnn--lstm-sequences"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://raw.githubusercontent.com/karpathy/char-rnn/master/data/tinyshakespeare/input.txt" class="external-link" target="_blank" rel="noopener"&gt;Shakespeare Corpus (tiny)&lt;/a&gt; or &lt;a href="https://ai.stanford.edu/~amaas/data/sentiment/" class="external-link" target="_blank" rel="noopener"&gt;IMDB Reviews&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Tokenize sequences&lt;/li&gt;
&lt;li&gt;Pad to fixed length&lt;/li&gt;
&lt;li&gt;Manage vocabulary size&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Train LSTM&lt;/li&gt;
&lt;li&gt;Predict next char / classify sentiment&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
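&lt;p&gt;The Week 9 ETL steps (tokenize, pad, cap the vocabulary) and a tiny LSTM classifier can be sketched in PyTorch; the two-line corpus here is a hypothetical stand-in:&lt;/p&gt;

```python
import torch
import torch.nn as nn

# ETL sketch: char-level vocabulary (0 reserved for padding), encode, pad
texts = ["to be or not to be", "all the world is a stage"]  # hypothetical corpus
vocab = {ch: i + 1 for i, ch in enumerate(sorted(set("".join(texts))))}
max_len = max(len(t) for t in texts)
encoded = [[vocab[ch] for ch in t] + [0] * (max_len - len(t)) for t in texts]
batch = torch.tensor(encoded)

# ML sketch: embed, run the LSTM, classify from the final hidden state
class CharLSTM(nn.Module):
    def __init__(self, vocab_size, classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 32, padding_idx=0)
        self.lstm = nn.LSTM(32, 64, batch_first=True)
        self.head = nn.Linear(64, classes)

    def forward(self, x):
        _, (h, _) = self.lstm(self.embed(x))
        return self.head(h[-1])

logits = CharLSTM(len(vocab) + 1)(batch)
print(logits.shape)
```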
&lt;hr&gt;
&lt;h2 id="-week-10-transformers-bert"&gt;
 📅 Week 10: Transformers (BERT)
 &lt;a class="heading-link" href="#-week-10-transformers-bert"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://gluebenchmark.com/tasks" class="external-link" target="_blank" rel="noopener"&gt;SST-2 Sentiment Treebank&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Tokenize with HuggingFace&lt;/li&gt;
&lt;li&gt;Create train/dev/test splits&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Fine-tune BERT&lt;/li&gt;
&lt;li&gt;Evaluate F1 score&lt;/li&gt;
&lt;li&gt;Compare to LSTM baseline&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
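&lt;p&gt;Fine-tuning BERT itself needs the HuggingFace transformers stack, but the Week 10 evaluation step is worth pinning down: F1 balances precision and recall on the positive class. The predictions below are hypothetical, just to show the comparison:&lt;/p&gt;

```python
from sklearn.metrics import f1_score

# Hypothetical dev-set predictions from the fine-tuned model and the
# Week 9 LSTM baseline (1 = positive sentiment, 0 = negative)
y_true    = [1, 0, 1, 1, 0, 1, 0, 0]
bert_pred = [1, 0, 1, 1, 0, 1, 1, 0]
lstm_pred = [1, 0, 0, 1, 0, 0, 1, 0]

# F1 is the harmonic mean of precision and recall on the positive class
print("BERT F1:", f1_score(y_true, bert_pred))
print("LSTM F1:", f1_score(y_true, lstm_pred))
```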
&lt;hr&gt;
&lt;h2 id="-week-11-reinforcement-learning-q-learning"&gt;
 📅 Week 11: Reinforcement Learning (Q-Learning)
 &lt;a class="heading-link" href="#-week-11-reinforcement-learning-q-learning"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset/Env&lt;/strong&gt;: &lt;a href="https://gymnasium.farama.org/environments/classic_control/cart_pole/" class="external-link" target="_blank" rel="noopener"&gt;Gymnasium CartPole (formerly OpenAI Gym)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Log state/action/reward data&lt;/li&gt;
&lt;li&gt;Define schema for episodes&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Train Q-learning agent&lt;/li&gt;
&lt;li&gt;Plot reward curve&lt;/li&gt;
&lt;li&gt;Test convergence&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
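&lt;p&gt;Before wiring up CartPole, the Week 11 update rule is easy to verify on a toy problem. This sketch uses a hypothetical 5-state corridor instead of Gym so it is fully self-contained:&lt;/p&gt;

```python
import random
import numpy as np

# Hypothetical 5-state corridor: the agent earns reward 1 for reaching
# state 4. Actions: 0 moves left, 1 moves right.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.3
random.seed(0)
returns = []  # per-episode totals, i.e. the data behind a reward curve

for episode in range(2000):
    s, total = random.randrange(n_states - 1), 0.0
    for step in range(20):
        # epsilon-greedy: mostly exploit the table, sometimes explore
        a = random.choices([int(np.argmax(Q[s])), random.randrange(n_actions)],
                           weights=[1 - eps, eps])[0]
        s_next = min(s + 1, 4) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == 4 else 0.0
        # Q-learning update: bootstrap from the best next-state value
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s, total = s_next, total + r
        if s == 4:
            break
    returns.append(total)

print(Q.round(2))
```

&lt;p&gt;Convergence shows up as a learned table whose greedy action is &amp;ldquo;right&amp;rdquo; in every non-terminal state, and a reward curve (plot of &lt;code&gt;returns&lt;/code&gt;) that climbs toward 1.&lt;/p&gt;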
&lt;hr&gt;
&lt;h2 id="-week-12-anomaly-detection"&gt;
 📅 Week 12: Anomaly Detection
 &lt;a class="heading-link" href="#-week-12-anomaly-detection"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;Dataset&lt;/strong&gt;: &lt;a href="https://www.kaggle.com/datasets/mlg-ulb/creditcardfraud" class="external-link" target="_blank" rel="noopener"&gt;Kaggle Credit Card Fraud Detection&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ETL Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Handle extreme class imbalance&lt;/li&gt;
&lt;li&gt;Scale features&lt;/li&gt;
&lt;li&gt;Stratified train/test split&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;input disabled="" type="checkbox"&gt; &lt;strong&gt;ML Tasks&lt;/strong&gt;:
&lt;ul&gt;
&lt;li&gt;Train Isolation Forest&lt;/li&gt;
&lt;li&gt;Compare to Autoencoder&lt;/li&gt;
&lt;li&gt;Evaluate precision@k&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
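&lt;p&gt;A minimal sketch of Week 12 on synthetic data (a stand-in for the fraud table, which must be downloaded from Kaggle), including the precision@k evaluation:&lt;/p&gt;

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic stand-in for the fraud table: 990 normal rows, 10 anomalies
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, size=(990, 2)),
               rng.normal(6, 1, size=(10, 2))])
y = np.array([0] * 990 + [1] * 10)  # 1 = anomaly, about 1% of rows

# Isolation Forest ranks points by how few random splits isolate them
iso = IsolationForest(random_state=0).fit(X)
scores = -iso.score_samples(X)  # flip so that higher means more anomalous

# precision@k: of the k highest-scoring rows, how many are true anomalies?
k = 10
top_k = np.argsort(scores)[-k:]
print(f"precision@{k}:", y[top_k].mean())
```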
&lt;hr&gt;
&lt;h2 id="-notes"&gt;
 📝 Notes
 &lt;a class="heading-link" href="#-notes"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Each week: spend equal time on &lt;strong&gt;ETL&lt;/strong&gt; and &lt;strong&gt;ML&lt;/strong&gt; — good pipelines make models shine.&lt;/li&gt;
&lt;li&gt;Keep notebooks + scripts versioned (Git).&lt;/li&gt;
&lt;li&gt;Optional stretch goal: orchestrate pipelines (Airflow, Dagster, Prefect).&lt;/li&gt;
&lt;li&gt;Track experiments: MLflow, Weights &amp;amp; Biases, or simple CSV logs.&lt;/li&gt;
&lt;/ul&gt;
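&lt;p&gt;The &amp;ldquo;simple CSV logs&amp;rdquo; option really can be this simple; one append-only row per run is enough to compare weeks later:&lt;/p&gt;

```python
import csv
from datetime import datetime, timezone

# One append-only CSV row per training run: timestamp, week, model, metric
def log_run(path, week, model, metric, value):
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), week, model, metric, value])

log_run("experiments.csv", 5, "kmeans_k10", "silhouette", 0.18)
```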
&lt;hr&gt;</description></item><item><title>Guess Vertica Infrastructure Project</title><link>http://mdcbowen.info/projects/guess/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/guess/</guid><description>&lt;h2 id="guess-vertica-infrastructure-project"&gt;
 Guess Vertica Infrastructure Project
 &lt;a class="heading-link" href="#guess-vertica-infrastructure-project"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;This engagement involved comprehensive Vertica database infrastructure work for Guess Inc., spanning multiple areas including system monitoring, backup strategies, disaster recovery planning, and operational optimization. The work included both hands-on implementation and strategic documentation.&lt;/p&gt;
&lt;h3 id="key-engagement-areas"&gt;
 Key Engagement Areas
 &lt;a class="heading-link" href="#key-engagement-areas"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;h4 id="1-vertica-cluster-management--upgrades"&gt;
 1. &lt;strong&gt;Vertica Cluster Management &amp;amp; Upgrades&lt;/strong&gt;
 &lt;a class="heading-link" href="#1-vertica-cluster-management--upgrades"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Cluster Inventory&lt;/strong&gt;: Multiple Vertica clusters were managed including:&lt;/p&gt;</description></item><item><title>HEXACO</title><link>http://mdcbowen.info/projects/soft/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/soft/</guid><description>&lt;h3 id="hexaco"&gt;
 HEXACO
 &lt;a class="heading-link" href="#hexaco"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;table&gt;
 &lt;thead&gt;
 &lt;tr&gt;
 &lt;th&gt;Item&lt;/th&gt;
 &lt;th&gt;Score&lt;/th&gt;
 &lt;/tr&gt;
 &lt;/thead&gt;
 &lt;tbody&gt;
 &lt;tr&gt;
 &lt;td&gt;Honesty-Humility&lt;/td&gt;
 &lt;td&gt;5.41&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Emotionality&lt;/td&gt;
 &lt;td&gt;2.51&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Extraversion&lt;/td&gt;
 &lt;td&gt;5.47&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Agreeableness&lt;/td&gt;
 &lt;td&gt;4.07&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Conscientiousness&lt;/td&gt;
 &lt;td&gt;6.65&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Openness&lt;/td&gt;
 &lt;td&gt;6.0&lt;/td&gt;
 &lt;/tr&gt;
 &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;Facet extremes&lt;/p&gt;
&lt;table&gt;
 &lt;thead&gt;
 &lt;tr&gt;
 &lt;th&gt;Item&lt;/th&gt;
 &lt;th&gt;Score&lt;/th&gt;
 &lt;/tr&gt;
 &lt;/thead&gt;
 &lt;tbody&gt;
 &lt;tr&gt;
 &lt;td&gt;Prudence&lt;/td&gt;
 &lt;td&gt;7.44&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Inquisitiveness&lt;/td&gt;
 &lt;td&gt;6.81&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Sincerity&lt;/td&gt;
 &lt;td&gt;7.01&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Organization&lt;/td&gt;
 &lt;td&gt;6.35&lt;/td&gt;
 &lt;/tr&gt;
 &lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="ocean"&gt;
 &lt;strong&gt;OCEAN&lt;/strong&gt;
 &lt;a class="heading-link" href="#ocean"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Openness: Very High&lt;/li&gt;
&lt;li&gt;Conscientiousness: Very High&lt;/li&gt;
&lt;li&gt;Extraversion: Higher than Average&lt;/li&gt;
&lt;li&gt;Agreeableness: Lower than Average&lt;/li&gt;
&lt;li&gt;Neuroticism: Very Low&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="myers-briggs"&gt;
 &lt;strong&gt;Myers Briggs&lt;/strong&gt;
 &lt;a class="heading-link" href="#myers-briggs"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;2017-05-21:
&lt;strong&gt;ESTJ-A ‘Executive’&lt;/strong&gt;&lt;/p&gt;
&lt;h3 id="abstract"&gt;
 abstract
 &lt;a class="heading-link" href="#abstract"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;Occasionally, a recruiter (or recruiter-bot) asks me if I have &amp;lsquo;a particular set of skills&amp;rsquo; that they are looking for. Here&amp;rsquo;s the answer: Probably. What does &amp;lsquo;probably&amp;rsquo; mean?&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;I have done most of the things these tools do, before a reliable tool for the job existed.&lt;/li&gt;
&lt;li&gt;OR it is a tool I last used more than 10 years ago.&lt;/li&gt;
&lt;li&gt;OR my employer decided that having me work in that tool was a waste of their money, in which case the work went to a junior or was cut from the budget.&lt;/li&gt;
&lt;li&gt;OR I have used a similar tool, though not that exact one, and it was never my focus.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;So here is a list of tools that I have used but never mastered, because I never needed to. In any case, my attitude is: if the tool doesn&amp;rsquo;t actually suck, how hard can it be?&lt;/p&gt;
 men&amp;rsquo;s warehouse, houston tx
 &lt;a class="heading-link" href="#mens-warehouse-houston-tx"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;Background:&lt;/p&gt;
&lt;p&gt;TMW systems staff are spending too much time on maintenance, leaving little time for anything else. They have inherited a number of systems and were not entirely aware of the reasons for many design decisions. Because these systems demand so much maintenance, they have not had time for a comprehensive review.&lt;/p&gt;
&lt;p&gt;Hyperion systems have not been upgraded since their installation. TMW is using applications which are 2-3 years out of date.&lt;/p&gt;</description></item><item><title>meta prompt</title><link>http://mdcbowen.info/projects/meta_prompt/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/meta_prompt/</guid><description>&lt;p&gt;&lt;strong&gt;meta prompt&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Following the models of &lt;strong&gt;Nate B. Jones&lt;/strong&gt;, I&amp;rsquo;m making a library of prompts and meta prompts like the following. I will be making several that follow my design and build methodologies, like Design For Analysis. This will speed up my design work, specific to SOWs.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;You are helping me [PRIMARY OUTCOME]. This is a structured exercise, not a casual brainstorming session.&lt;/p&gt;
&lt;p&gt;CONTEXT:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Primary subject: [SUBJECT/ASSET/DECISION]&lt;/li&gt;
&lt;li&gt;Scope: [BOUNDS OF WORK]&lt;/li&gt;
&lt;li&gt;Constraints: [KEY CONSTRAINTS]&lt;/li&gt;
&lt;li&gt;Success definition: [HOW WE WILL JUDGE SUCCESS]&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;FRAMEWORK:&lt;/p&gt;</description></item><item><title>Michael Bowen – Curriculum Vitae</title><link>http://mdcbowen.info/projects/essbase_cv/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/essbase_cv/</guid><description>&lt;h1 id="michael-bowen--curriculum-vitae"&gt;
 Michael Bowen – Curriculum Vitae
 &lt;a class="heading-link" href="#michael-bowen--curriculum-vitae"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h1&gt;
&lt;h2 id="essbase--bi-details"&gt;
 Essbase + BI Details
 &lt;a class="heading-link" href="#essbase--bi-details"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;&lt;em&gt;3Q2002&lt;/em&gt;&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="background"&gt;
 Background
 &lt;a class="heading-link" href="#background"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;I would like to elaborate on my Essbase &amp;amp; BI experience.&lt;/p&gt;
&lt;p&gt;I first began using Essbase in 1996 when I hired on to Arbor Software as a post-sales consultant. I handled engagements for the Southeastern US for Arbor&amp;rsquo;s own consulting group. Previous to this work, I had been doing design and implementation of DB2 and Oracle star schemas for 6 years and working with Pilot Software&amp;rsquo;s multidimensional database, Time Server. So before I even laid eyes on Essbase, I had years of multidimensional database design experience. I instantly realized the potential of this database.&lt;/p&gt;</description></item><item><title>names</title><link>http://mdcbowen.info/projects/names/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/names/</guid><description>&lt;h3 id="names"&gt;
 names
 &lt;a class="heading-link" href="#names"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;Believe it or not, the most interesting thing for me in programming is solving a business problem. Other than sudoku and sushi selections, I&amp;rsquo;m really not into solving puzzles for the sake of the puzzle. And I got into the bad habit of not trusting machines. So while it has always been fascinating for me to see my code process correctly, I actually prefer kudos from the people whose decision work I assist. Thus it&amp;rsquo;s easier for me to remember the places I&amp;rsquo;ve worked as a consultant and what those people were all about, than the code and documents I&amp;rsquo;ve created along the way.&lt;/p&gt;
 nissan cdp
 &lt;a class="heading-link" href="#nissan-cdp"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;Semi-concurrently with the &lt;a href="http://mdcbowen.info/projects/boeing/" &gt;Boeing&lt;/a&gt; deal, Bowen served as lead architect for the expansion of Nissan’s Worldwide Financial Planning system. This entailed the design (separate from the re-bid and implementation) of a system that would integrate with the North American Data Warehouse for all budgeting and planning against Nissan North America’s 18-dimensional financial codeblock. Bowen built the backend in ksh and Perl to supply asynchronous feeds to 37 synchronized Essbase cubes. He also supervised the transition of this system to the offshore support team.&lt;/p&gt;
&lt;h3 id="abstract"&gt;
 abstract
 &lt;a class="heading-link" href="#abstract"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;I&amp;rsquo;m looking at the world of PostgreSQL: how it can be used to solve problems in the financial industry, and also its integration with DuckDB, which could be a replacement for SingleStore. I also want to understand the alleged weakness of its data layer.&lt;/p&gt;
&lt;p&gt;Why? Because I am hosting it at home and want to see if pg_lakehouse works with MinIO. Also I might want to store documents for AI integrations.&lt;/p&gt;</description></item><item><title>philip morris</title><link>http://mdcbowen.info/projects/pm/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/pm/</guid><description>&lt;h2 id="philip-morris"&gt;
 philip morris
 &lt;a class="heading-link" href="#philip-morris"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;Bowen led the design and implementation of a market share system covering all industry brands of cigarettes. This included the database (DB2) and a custom, scratch-built front-end. The front-end generated dynamic SQL driven by the user&amp;rsquo;s navigation of the star schema, which allowed for 50 concurrent users on Macintosh. Working with an economist, Bowen integrated custom confidence intervals, which led to Philip Morris&amp;rsquo;s significant decision to discount Marlboro in order to gain market share.&lt;/p&gt;
 qantas frms
 &lt;a class="heading-link" href="#qantas-frms"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;h3 id="abstract"&gt;
 abstract
 &lt;a class="heading-link" href="#abstract"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;Bowen served as lead architect for the migration, redesign and build of Qantas&amp;rsquo; Flight Risk Management System (FRMS). This required real-time integration of Jeppesen subsystems with the Teradata DW, and implementation of complex business rules based on the guidelines established by the Australian Civil Aviation Safety Authority (CASA) and the Australian and International Pilots Association (AIPA). This system provided data from all shorthaul and longhaul flights 24/7.&lt;/p&gt;
 rolta cv - dec 2010
 &lt;a class="heading-link" href="#rolta-cv---dec-2010"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;h3 id="qualifications"&gt;
 Qualifications
 &lt;a class="heading-link" href="#qualifications"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;Michael Bowen is a Sr. Management Consultant with TUSC. He is an IT professional with over 20 years of experience in building and architecting DW and EPM solutions. He has significant experience with the Hyperion product line beginning with their core technologies in 1996. His strengths include custom &amp;amp; financial applications of Essbase and Hyperion Planning. Working in the Enterprise Performance Management practice at TUSC, his focus has been on technical leadership in architecture and hands-on delivery.&lt;/p&gt;</description></item><item><title>southwall homelab</title><link>http://mdcbowen.info/projects/southwall/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/southwall/</guid><description>&lt;h2 id="southwall-homelab"&gt;
 southwall homelab
 &lt;a class="heading-link" href="#southwall-homelab"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;h3 id="southwall"&gt;
 Southwall
 &lt;a class="heading-link" href="#southwall"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;p&gt;Southwall is my homelab. It consists of four Linux servers, two Mac servers, two Synology NAS devices, Ubiquiti network devices, and about 50TB of storage.&lt;/p&gt;
&lt;p&gt;My aim is to create and maintain a comprehensive development environment connected to major cloud providers and essentially do all DevOps on my own without the assistance of SREs. So I&amp;rsquo;ll be doing my own site reliability with AI assistance. I am also running S3 services on AWS, websites hosted on Dreamhost, and probably a bunch of other stuff I forgot.&lt;/p&gt;</description></item><item><title>Teva Project Work Summary</title><link>http://mdcbowen.info/projects/teva/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/teva/</guid><description>&lt;h1 id="teva-project-work-summary"&gt;
 Teva Project Work Summary
 &lt;a class="heading-link" href="#teva-project-work-summary"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h1&gt;
&lt;h2 id="executive-summary"&gt;
 Executive Summary
 &lt;a class="heading-link" href="#executive-summary"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;The Teva project involved extensive work on a comprehensive data analytics platform built primarily on Vertica database infrastructure with cloud-based ETL processes, performance benchmarking, and automated data collection systems. The work spanned multiple domains including database infrastructure, data migration, stream processing, monitoring systems, and performance optimization.&lt;/p&gt;
&lt;h2 id="major-project-components"&gt;
 Major Project Components
 &lt;a class="heading-link" href="#major-project-components"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;h3 id="1-database-infrastructure--migration"&gt;
 1. Database Infrastructure &amp;amp; Migration
 &lt;a class="heading-link" href="#1-database-infrastructure--migration"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Vertica Cluster Management&lt;/strong&gt;: Implemented and managed EON mode Vertica clusters across QA and Production environments&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Schema Migration&lt;/strong&gt;: Executed multiple phases of data migration including &amp;ldquo;first-migration&amp;rdquo;, &amp;ldquo;march-migration&amp;rdquo;, and &amp;ldquo;third-migration&amp;rdquo; projects&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;DDL Management&lt;/strong&gt;: Comprehensive database schema management through the &lt;code&gt;vertica-ddl&lt;/code&gt; and &lt;code&gt;duck-ddl&lt;/code&gt; projects&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Performance Testing&lt;/strong&gt;: Extensive TPC-DS benchmark testing with S3 object storage compatibility testing&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="2-etl--data-processing-infrastructure"&gt;
 2. ETL &amp;amp; Data Processing Infrastructure
 &lt;a class="heading-link" href="#2-etl--data-processing-infrastructure"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Stream Processing&lt;/strong&gt;: Built multiple Ruby-based streaming applications:
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;ftp-file-producer&lt;/code&gt;: FTP file processing and ingestion&lt;/li&gt;
&lt;li&gt;&lt;code&gt;s3-file-producer&lt;/code&gt;: S3-based file processing&lt;/li&gt;
&lt;li&gt;&lt;code&gt;salesforce-extract&lt;/code&gt;: Salesforce data extraction pipeline&lt;/li&gt;
&lt;li&gt;&lt;code&gt;sneaql-transform&lt;/code&gt;: Data transformation workflows&lt;/li&gt;
&lt;li&gt;&lt;code&gt;ingestion-consumer&lt;/code&gt;: Data ingestion processing&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="3-data-collection--monitoring-systems"&gt;
 3. Data Collection &amp;amp; Monitoring Systems
 &lt;a class="heading-link" href="#3-data-collection--monitoring-systems"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Data Collector Tables (&lt;code&gt;dc_tables&lt;/code&gt;)&lt;/strong&gt;: Python-based system for processing Vertica data collector metrics
&lt;ul&gt;
&lt;li&gt;Automated S3 data retrieval and processing&lt;/li&gt;
&lt;li&gt;Focus on 5 key metrics: RequestsIssued, RequestsCompleted, ResourceReleases, Errors, ResourceAcquisitions&lt;/li&gt;
&lt;li&gt;Template-based SQL generation and execution&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;CloudWatch Integration&lt;/strong&gt;: Metrics collection and monitoring setup&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Control Tables&lt;/strong&gt;: HPS control table management system&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="4-performance--benchmarking"&gt;
 4. Performance &amp;amp; Benchmarking
 &lt;a class="heading-link" href="#4-performance--benchmarking"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;TPC-DS Benchmarking&lt;/strong&gt;: Comprehensive performance testing framework
&lt;ul&gt;
&lt;li&gt;3-node Vertica EON clusters with 32 CPUs, 256GB memory per node&lt;/li&gt;
&lt;li&gt;S3 object storage compatibility testing&lt;/li&gt;
&lt;li&gt;Multiple data size configurations (10GB to 5TB)&lt;/li&gt;
&lt;li&gt;Concurrent user load testing (1-30 users)&lt;/li&gt;
&lt;li&gt;Depot on/off performance comparisons&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Performance Analysis&lt;/strong&gt;: Detailed performance metrics collection and analysis&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Resource Pool Management&lt;/strong&gt;: Database resource optimization&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="5-cloud-infrastructure--devops"&gt;
 5. Cloud Infrastructure &amp;amp; DevOps
 &lt;a class="heading-link" href="#5-cloud-infrastructure--devops"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Terraform Infrastructure&lt;/strong&gt;: Complete infrastructure as code implementation&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AWS Integration&lt;/strong&gt;: Extensive use of AWS services (S3, ECS, DynamoDB, CloudWatch)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Container Orchestration&lt;/strong&gt;: Docker-based application deployment&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Security Management&lt;/strong&gt;: Encrypted secrets management using Biscuit&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Environment Management&lt;/strong&gt;: Multi-environment setup (DEV, QA, PROD)&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="6-support--maintenance-systems"&gt;
 6. Support &amp;amp; Maintenance Systems
 &lt;a class="heading-link" href="#6-support--maintenance-systems"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Automated Workflows&lt;/strong&gt;: Various automation scripts and utilities&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Troubleshooting Tools&lt;/strong&gt;: Comprehensive diagnostic and monitoring tools&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Backup &amp;amp; Recovery&lt;/strong&gt;: S3-based backup and restore procedures&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Database Maintenance&lt;/strong&gt;: Automated maintenance tasks and monitoring&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="key-technical-achievements"&gt;
 Key Technical Achievements
 &lt;a class="heading-link" href="#key-technical-achievements"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;h3 id="database-performance"&gt;
 Database Performance
 &lt;a class="heading-link" href="#database-performance"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Successfully implemented and tested Vertica EON mode with S3 object storage&lt;/li&gt;
&lt;li&gt;Achieved performance benchmarks with TPC-DS queries across multiple concurrency levels&lt;/li&gt;
&lt;li&gt;Implemented automated backup/restore procedures with S3 integration&lt;/li&gt;
&lt;li&gt;Implemented database revive functionality for disaster recovery scenarios&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="data-processing-pipeline"&gt;
 Data Processing Pipeline
 &lt;a class="heading-link" href="#data-processing-pipeline"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Built scalable ETL infrastructure capable of processing various data sources&lt;/li&gt;
&lt;li&gt;Implemented streaming data ingestion from FTP, S3, and Salesforce&lt;/li&gt;
&lt;li&gt;Created automated data validation and error handling systems&lt;/li&gt;
&lt;li&gt;Developed template-based SQL generation for dynamic data processing&lt;/li&gt;
&lt;/ul&gt;
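&lt;p&gt;As a minimal illustration of the automated validation and error handling above: valid rows pass through, bad rows are diverted to a reject list for later review. The required fields reuse the metric names from the ingestion section but the schema itself is illustrative:&lt;/p&gt;

```python
# Row-level validation with error routing: rows missing a required field
# are rejected rather than aborting the whole load.
REQUIRED_FIELDS = ("RequestsIssued", "RequestsCompleted", "Errors")

def validate(rows):
    """Split rows into (good, rejects) based on required-field presence."""
    good, rejects = [], []
    for row in rows:
        missing = [f for f in REQUIRED_FIELDS if row.get(f) is None]
        (rejects if missing else good).append(row)
    return good, rejects

good, rejects = validate([
    {"RequestsIssued": 10, "RequestsCompleted": 9, "Errors": 1},
    {"RequestsIssued": 5, "RequestsCompleted": None, "Errors": 0},
])
print(len(good), len(rejects))  # prints "1 1"
```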
&lt;h3 id="infrastructure-automation"&gt;
 Infrastructure Automation
 &lt;a class="heading-link" href="#infrastructure-automation"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Complete infrastructure automation using Terraform&lt;/li&gt;
&lt;li&gt;Multi-environment deployment capabilities&lt;/li&gt;
&lt;li&gt;Automated scaling and resource management&lt;/li&gt;
&lt;li&gt;Comprehensive monitoring and alerting systems&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="data-migration-success"&gt;
 Data Migration Success
 &lt;a class="heading-link" href="#data-migration-success"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Successfully migrated data across multiple phases&lt;/li&gt;
&lt;li&gt;Implemented parallel processing for large-scale data operations&lt;/li&gt;
&lt;li&gt;Created tools for schema comparison and validation between environments&lt;/li&gt;
&lt;li&gt;Developed automated DDL generation and deployment processes&lt;/li&gt;
&lt;/ul&gt;
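&lt;p&gt;The schema comparison between environments can be sketched as a column-by-column diff of table definitions. The QA/PROD schemas below are toy examples, not the project's actual DDL:&lt;/p&gt;

```python
# Cross-environment schema drift detection: report tables whose column
# sets differ between two environments.
def diff_schemas(env_a: dict, env_b: dict) -> dict:
    """Return tables whose columns differ, with the columns unique to each side."""
    drift = {}
    for table in set(env_a) | set(env_b):
        cols_a = set(env_a.get(table, []))
        cols_b = set(env_b.get(table, []))
        if cols_a != cols_b:
            drift[table] = {"only_a": sorted(cols_a - cols_b),
                            "only_b": sorted(cols_b - cols_a)}
    return drift

qa = {"orders": ["id", "ts", "amount"]}
prod = {"orders": ["id", "ts"]}
print(diff_schemas(qa, prod))
```

&lt;p&gt;In practice the column sets would be pulled from each environment's system catalog, and the drift report would drive the automated DDL generation step.&lt;/p&gt;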
&lt;h2 id="project-files-and-structure"&gt;
 Project Files and Structure
 &lt;a class="heading-link" href="#project-files-and-structure"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;The project comprises over 40 major components including:&lt;/p&gt;</description></item><item><title>The Fundamentals</title><link>http://mdcbowen.info/projects/the_case/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>http://mdcbowen.info/projects/the_case/</guid><description>&lt;h2 id="the-fundamentals"&gt;
 The Fundamentals
 &lt;a class="heading-link" href="#the-fundamentals"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;Mike Bowen is a data architect &amp;amp; hands-on master practitioner in the decision-support space. His experience is both broad and deep.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Analysis, Design &amp;amp; Development&lt;/li&gt;
&lt;li&gt;Project &amp;amp; Trials Management&lt;/li&gt;
&lt;li&gt;Technical Sales &amp;amp; Business Development&lt;/li&gt;
&lt;li&gt;Product Marketing &amp;amp; Product Management&lt;/li&gt;
&lt;li&gt;Training &amp;amp; Technical Support&lt;/li&gt;
&lt;li&gt;Communications &amp;amp; Presentations&lt;/li&gt;
&lt;li&gt;Vision &amp;amp; Leadership&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="career-outline"&gt;
 Career Outline
 &lt;a class="heading-link" href="#career-outline"&gt;
 &lt;i class="fa-solid fa-link" aria-hidden="true" title="Link to heading"&gt;&lt;/i&gt;
 &lt;span class="sr-only"&gt;Link to heading&lt;/span&gt;
 &lt;/a&gt;
&lt;/h2&gt;
&lt;p&gt;Most of my career is best described as project delivery to Fortune 1000 customers in the US, across multiple generations of database and visualization software.&lt;/p&gt;