
Senior Data Engineer - Contract
- On-site
- United States
- $60 - $80 per hour
In this role, you'll build scalable data solutions using SQL and Python, working with both structured and unstructured data in a cloud-focused, collaborative environment.
Job description
** This is a 6+ month contract opportunity with our client located in Minneapolis, MN. On-site requirement is 4 days per week. Candidates must be authorized to work in the United States without the need for sponsorship. **
We are seeking a Senior Data Engineer to join a fast-paced, growing data engineering team within a forward-thinking investment firm. This role is ideal for a technically skilled data professional with strong experience in SQL and Python and a passion for building data pipelines, data modeling, and delivering modern data solutions. You must be comfortable working with both structured and unstructured data and have a strong interest in cloud technologies. While experience with AWS, DBT, and Snowflake is a plus, the team is open to training the right candidate in these tools.
Primary Duties
- Build and maintain scalable data pipelines using SQL and Python.
- Work with both structured and unstructured data to support analytical and operational use cases.
- Support cloud data engineering projects, especially those leveraging AWS (training available).
- Collaborate on data modeling and transformation efforts across teams.
- Assist with documentation, QA, and code reviews to ensure data quality and system reliability.
- Communicate effectively with engineers, analysts, and business partners to deliver timely data solutions.
- Continuously identify opportunities to optimize and improve data workflows.
Required Qualifications
- Strong experience with SQL and Python.
- Ability to work confidently with structured and unstructured data.
- Strong analytical thinking and problem-solving skills.
- Excellent written and verbal communication skills.
- Self-starter who is eager to learn and grow in a fast-moving environment.
Nice-to-Have Skills
- Experience with DBT (Data Build Tool).
- Familiarity with Snowflake or similar cloud data platforms.
- Exposure to AWS services (e.g., S3, Redshift, Glue, Lambda).
- Experience with data visualization tools such as Looker (training provided).
- Knowledge of version control (Git), CI/CD, and workflow orchestration tools like Airflow.
- Background in financial services or data-intensive business environments.
#EmergentStaffing