Business Analyst/Data Scientist for HSE Compliance Software Integration


Upwork


Remote

9 hours ago


About

Company: WorkSync
Type: Contract (6–12 weeks, extension likely) · Remote (North America time zones preferred)

WorkSync builds tools that help industrial teams move faster and safer. We’re looking for a hands-on Business Analyst or Data Scientist to lead an HSE data integration and analytics initiative: ingesting safety data from KPA Flex into a Microsoft Fabric environment and turning it into trustworthy models, metrics, and dashboards.

What you’ll do
- Own the integration: Map KPA Flex domains (incidents, actions, inspections, training/compliance, audits, observations) and stand up robust pipelines into Microsoft Fabric (OneLake/Lakehouse, Dataflows Gen2, Notebooks, Pipelines).
- Design the data model: Normalize, document, and version HSE entities; define conformed dimensions and semantic models for self-serve analytics in Power BI.
- Automate the flow: Build scheduled and/or event-driven ingestion using REST APIs/webhooks, OAuth, and incremental loads; implement data quality checks, lineage, and monitoring.
- Deliver insight: Build or guide Power BI dashboards/KPIs (TRIR, LTIR, near misses, CAPA cycle times, audit closure rates, training expiry risk, leading indicators).
- Partner with stakeholders: Run discovery; write clear user stories, acceptance criteria, and a lightweight data dictionary; create go-live runbooks and handover docs.
- Harden and govern: Apply row-level security, PII handling, and auditability aligned to enterprise standards; set SLAs for refresh and data health.

You’re a great fit if you have
- HSE/EHS integration experience (preferred) with platforms like KPA Flex (ideal), Intelex, Enablon, Cority, Sphera, or similar.
- Microsoft Fabric expertise across OneLake/Lakehouse, Dataflows Gen2, Pipelines, Notebooks, and Power BI semantic models.
- Strong SQL and Python (pandas/PySpark); comfort with REST APIs (OAuth2), JSON, pagination, rate limiting, and webhook/event patterns.
- Solid data modeling fundamentals (star/snowflake schemas, SCDs, incremental processing, Delta/Parquet).
- Proven BA chops: requirements elicitation, process mapping (BPMN/swimlanes), and crisp documentation (data dictionary, ERDs, sequence diagrams).
- A security and governance mindset: RLS/OLS, data classification, audit trails, quality rules, observability.

Nice to have
- Experience with Databricks on Azure, dbt, or Fabric Warehouse.
- Power Platform familiarity (Power Automate, Power Apps) and SharePoint integrations.
- Industrial domain experience (oil & gas, utilities, manufacturing).
- KPI design for safety programs (lagging/leading indicators, exposure-based metrics).

What success looks like (30/60/90)
- 30 days: Discovery complete; integration scope and data contracts documented; PoC pipeline pulling at least one KPA Flex entity into the Lakehouse with data quality checks.
- 60 days: Core entities landed and modeled; Power BI semantic model and first HSE KPI dashboard live; refresh schedule and monitoring in place.
- 90 days: Full pipeline hardening; documentation/runbooks delivered; knowledge transfer complete; backlog and roadmap defined for advanced analytics (predictive risk, trend detection).

Engagement & logistics
- Contract: 20–40 hrs/week to start; extensions possible based on scope.
- Location: Remote (workable overlap with Pacific Time preferred).
- Compensation: Competitive; commensurate with experience.

How to apply
Send your résumé/LinkedIn and a short note about a past HSE integration you delivered (tools, data model, and outcomes). Include links or screenshots of relevant Power BI work (scrub sensitive data).
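To illustrate the ingestion pattern the role calls for (REST APIs with pagination and incremental loads), here is a minimal Python sketch. The page shape (`items`, `has_more`), the `updated_since` watermark parameter, and the `/incidents` endpoint are hypothetical placeholders, not the actual KPA Flex API.

```python
# Sketch of incremental, paginated REST ingestion. Endpoint path,
# parameter names, and pagination fields are hypothetical placeholders.
import json
import urllib.request
from typing import Callable


def http_fetch_page(base_url: str, token: str, since: str, page: int) -> dict:
    """Fetch one page of records updated since an incremental watermark."""
    url = f"{base_url}/incidents?updated_since={since}&page={page}"  # hypothetical endpoint
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def incremental_load(fetch_page: Callable[[int], dict]) -> list[dict]:
    """Walk pages until the API reports no more, collecting all records.

    Takes a page-fetching callable so the loop can be tested without a
    live API (e.g. a lambda wrapping http_fetch_page, or a stub).
    """
    records: list[dict] = []
    page = 1
    while True:
        body = fetch_page(page)
        records.extend(body.get("items", []))
        if not body.get("has_more"):  # hypothetical pagination flag
            return records
        page += 1
```

In a production pipeline you would also persist the highest `updated_at` value seen as the next run's watermark, and back off when rate-limit responses or headers indicate it.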
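For the KPI dashboards named above, the standard OSHA-style incident rates (TRIR, LTIR) normalize incident counts per 200,000 hours worked (roughly 100 full-time workers for a year). A small Python sketch, with hypothetical sample figures:

```python
# Standard OSHA-style safety KPI formulas (TRIR, LTIR), normalized per
# 200,000 hours worked. The sample figures below are hypothetical.

OSHA_HOURS_BASE = 200_000


def trir(recordable_incidents: int, hours_worked: float) -> float:
    """Total Recordable Incident Rate."""
    return recordable_incidents * OSHA_HOURS_BASE / hours_worked


def ltir(lost_time_incidents: int, hours_worked: float) -> float:
    """Lost Time Incident Rate."""
    return lost_time_incidents * OSHA_HOURS_BASE / hours_worked


# Hypothetical site: 4 recordables, 1 lost-time case, 500,000 hours worked.
print(trir(4, 500_000))  # 1.6
print(ltir(1, 500_000))  # 0.4
```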