
Data Scientist

Skysoft Inc

Chicago, IL (In Person)

$135,200 Salary, Full-Time

Posted 3 days ago (Updated 1 day ago) • Actively hiring

Expires 6/14/2026


Job Description

Client:
S&P Global Market Intelligence (Cognizant)
The Role:
Quantitative Research Analyst (Data Modeling & Imputation)
Location:
Chicago, IL or Boston, MA (5 days Onsite)
Rate:
$70/hr on C2C or $65/hr on W2

The Opportunity

This role focuses on building high-quality, research-ready datasets from incomplete, inconsistent, and fragmented financial data. You will design and implement imputation frameworks, data transformations, and modeling pipelines that convert raw inputs into reliable signals. This is not a pure modeling role: the majority of the value comes from data construction, validation, and methodological rigor upstream of the model.

The Impact

You will directly influence the quality and credibility of investment research by:
  • Constructing datasets where ground truth is partial or noisy
  • Designing imputation methodologies that are economically and statistically sound
  • Ensuring outputs are stable, explainable, and usable in production research

Your work underpins signal generation, backtesting, and ultimately client-facing insights.

Core Responsibilities
  • Build and maintain end-to-end data pipelines across structured and unstructured datasets
  • Develop imputation frameworks for missing or sparsely reported financial data (e.g., segment-level estimates, coverage gaps, timing mismatches)
  • Design and implement data normalization and reconciliation logic across overlapping hierarchies (e.g., segments, geographies, entities)
  • Perform data quality diagnostics, including coverage analysis, bias detection, and stability testing
  • Partner with researchers to translate raw data into model-ready features
  • Write efficient, reproducible code in Python and SQL for large-scale data processing
  • Document methodologies clearly to ensure transparency and repeatability

Required Experience
  • 5-7 years of experience in quantitative research, data science, or financial data engineering
  • Strong expertise in data wrangling and transformation at scale (this is the core skill)
  • Proven experience with missing data techniques and imputation methods (e.g., cross-sectional inference, time-series interpolation, model-based approaches)
  • Advanced Python skills (pandas, numpy); strong SQL required
  • Experience working with messy, real-world datasets (not just clean academic data)
  • Solid grounding in statistics and econometrics
  • Familiarity with equity markets and financial statements preferred

What We're Actually Looking For
  • You are an expert data wrangler: you know that 80% of the work is getting the data right
  • You are skeptical of inputs and instinctively test for edge cases, leakage, and bias
  • You understand that imputation is modeling, not a preprocessing afterthought
  • You can balance statistical rigor with practical constraints (coverage vs. precision trade-offs)
  • You write code that others can read, audit, and reuse

Nice to Have (Not Required)
  • Experience with NLP or unstructured data pipelines
  • Exposure to alternative datasets (supply chain, transcripts, etc.)
  • Familiarity with distributed compute environments (Databricks, Spark)
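To make the imputation work described above concrete, here is a minimal sketch of the layered approach the posting alludes to: time-series interpolation within an entity first, then a cross-sectional fallback across entities, with an audit flag so imputed values stay explainable. The toy data, column names, and two-step strategy are illustrative assumptions, not the client's actual methodology.

```python
import numpy as np
import pandas as pd

# Hypothetical quarterly revenue panel with coverage gaps (illustrative data only).
df = pd.DataFrame({
    "entity": ["A", "A", "A", "A", "B", "B", "B", "B"],
    "quarter": pd.PeriodIndex(
        ["2024Q1", "2024Q2", "2024Q3", "2024Q4"] * 2, freq="Q"
    ),
    "revenue": [100.0, np.nan, 120.0, 130.0, np.nan, 80.0, np.nan, 90.0],
})

# Step 1: time-series interpolation within each entity.
# limit_area="inside" fills only gaps between observed values,
# so leading/trailing gaps are not extrapolated.
df["rev_interp"] = (
    df.sort_values("quarter")
      .groupby("entity")["revenue"]
      .transform(lambda s: s.interpolate(limit_area="inside"))
)

# Step 2: cross-sectional fallback — fill remaining gaps with the
# per-quarter median across entities.
df["rev_imputed"] = df["rev_interp"].fillna(
    df.groupby("quarter")["rev_interp"].transform("median")
)

# Flag imputed values so downstream research can audit and exclude them.
df["was_imputed"] = df["revenue"].isna()
```

Treating imputation as modeling, as the posting stresses, means the choice of fallback (median vs. a regression on peers, say) and the `was_imputed` flag are part of the methodology to be documented and stress-tested, not a silent preprocessing step.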
