
SENIOR TECHNICAL LEAD


Coforge Ltd.

Oaks, PA (In Person)

Full-Time

Posted 3 weeks ago (Updated 2 weeks ago) • Actively hiring

Expires 5/28/2026




Job Description

Key Responsibilities

Data Warehouse Modeling (End-to-End with dbt)

  • Write complex SQL queries joining multiple tables and views to extract the relevant data; this requires strong SQL skills.
  • Design and build end-to-end data warehouse layers using dbt (raw/staging → intermediate → marts), ensuring clean separation of concerns and reusable model patterns.
  • Develop dbt models using strong SQL and sound modeling principles (e.g., star schema, SCDs, conformed dimensions, fact tables, auditability).
  • Manage dbt model dependencies and orchestration design (sources, refs, exposures, selection syntax, tagging, and modular project structure).
  • Implement dbt incremental models using appropriate strategies (e.g., merge/upsert patterns where required, rebuild/overwrite patterns where appropriate), balancing correctness and cost-efficiency.
  • Use dbt pre-hooks and post-hooks for operational tasks such as auditing, schema/grant management, session configuration, cleanup, and metadata capture.
  • Establish data quality with dbt tests (generic and custom), freshness checks, and controlled release processes.

Snowflake Performance Tuning and Optimization

  • Tune SQL and Snowflake workloads for performance and cost, including:
      • Query optimization (pruning, join strategy, aggregations, window functions, CTE usage)
      • Efficient incremental processing and minimizing full-table scans
      • Warehousing strategy (right-sizing compute, concurrency management, workload patterns)
      • Table design decisions (clustering strategy where applicable, micro-partition awareness)
  • Diagnose slow queries using query profiling techniques and implement improvements to reduce runtime and warehouse spend.
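The merge/upsert incremental pattern named in the responsibilities can be sketched in plain Python. This is an illustrative analogy, not dbt code — dbt's `incremental_strategy='merge'` compiles to a warehouse MERGE statement — but the effect on the target table is the same: only rows newer than the last run's high-water mark are processed, and existing rows are updated in place by unique key. The function and field names here are hypothetical.

```python
def incremental_merge(target, source_batch, unique_key, updated_at):
    """Merge new/changed rows into a target table (dict keyed by unique_key).

    Illustrates the merge/upsert incremental pattern: skip rows at or below
    the current high-water mark (idempotent re-runs), upsert the rest.
    Returns the new high-water mark.
    """
    # High-water mark: the latest timestamp already loaded into the target.
    watermark = max((r[updated_at] for r in target.values()), default=None)
    for row in source_batch:
        if watermark is not None and row[updated_at] <= watermark:
            continue  # already loaded on a previous run; safe to re-run
        target[row[unique_key]] = row  # insert new key or overwrite (upsert)
    return max((r[updated_at] for r in target.values()), default=None)
```

The rebuild/overwrite alternative mentioned alongside it would simply replace the whole target from source each run — simpler, but costlier on large tables, which is why strategy selection appears in the qualifications.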
Orchestration and Production Operations (Airflow)

  • Build and maintain Airflow DAGs for robust production orchestration, including:
      • Parameterized deployments across environments
      • Backfills, retries, SLAs, alerting, dependencies, and failure recovery patterns
      • Clean DAG structure (TaskGroups, modular design; dynamic task patterns preferred)
  • Coordinate dbt runs through Airflow with environment-aware configs, robust logging, and observability.

Reliability, Governance, and Collaboration

  • Ensure pipeline reliability through monitoring, alerting, idempotent design, and resilient recovery mechanisms.
  • Apply strong data governance practices, including access controls, role-based permissions, and handling of sensitive financial data.
  • Collaborate with analysts, business stakeholders, and platform teams to translate complex requirements into well-modeled, trusted datasets.

Required Qualifications

1. Expert SQL skills, including the ability to write production-grade transformations and create dbt models using advanced SQL (joins, window functions, complex aggregations, CTEs, performance-aware design).
2. Strong experience with dbt, specifically:
   a. Incremental models and strategy selection
   b. Pre-hooks and post-hooks
   c. Model dependency management and scalable project structuring
   d. Deployments across environments (dev/test/prod) and dbt execution patterns
3. Solid experience with Snowflake, including performance tuning and optimization (query performance, cost control, efficient loading patterns, warehouse sizing strategies).
4. Strong hands-on experience with Apache Airflow for orchestration and production operations.
5. Strong data warehousing fundamentals, including end-to-end warehouse design principles (dimensional modeling, SCDs, fact/dimension design, conformed dimensions, data lineage).
6. Proven analytical and problem-solving skills; ability to debug data issues across the SQL/dbt/Snowflake/Airflow layers.
7. Experience working with large-scale, complex datasets in production environments.
8. Preference for candidates who have worked on Financial Services / Wealth Management projects.
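The retry and failure-recovery patterns listed under the Airflow responsibilities can be sketched in plain Python. This is not Airflow's API — in Airflow these behaviors come from task arguments such as `retries`, `retry_delay`, and `retry_exponential_backoff` — and `run_with_retries` is a hypothetical name for illustration only.

```python
import time


def run_with_retries(task, retries=2, retry_delay=0.0):
    """Run an idempotent task, rerunning it on failure.

    Sketches the recovery behavior an orchestrator provides: one initial
    attempt plus `retries` reruns, with exponential backoff between attempts
    (an illustrative choice), surfacing the last error if every attempt fails.
    """
    attempts = retries + 1  # first try plus `retries` reruns
    last_error = None
    for attempt in range(attempts):
        try:
            return task()
        except Exception as exc:  # task failed; record the error, maybe retry
            last_error = exc
            if attempt < attempts - 1:
                time.sleep(retry_delay * (2 ** attempt))  # back off, then rerun
    raise last_error
```

In a real DAG these settings live in `default_args` or on the operator itself; the key requirement the posting calls out is that tasks be idempotent, so retries and backfills can safely rerun them.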
