Sr. Data Engineer (Python) - Discovery and Evaluation



Software Engineering, Data Science
Warsaw, Poland
Posted on Thursday, April 20, 2023
About Craft:
Craft is a supply chain resilience company helping organizations accelerate data-informed business decisions. Our unique, proprietary data platform tracks thousands of real-time signals across millions of companies globally, delivering best-in-class monitoring and insight into global supply chains, among other company cohorts.
Our clients (including Fortune 100 companies, government and military agencies, SMEs, asset management groups, and others) are some of the largest enterprises in the world, and use our technology for supply chain resilience, market intelligence, and related use cases. Our modular, secure, end-to-end platform offers high-scale data, analytics, and workflows through which our clients can monitor any company they work with and drive critical actions in real time.
We are a well-funded technology company with leading investors including Greycroft, Uncork, High Alpha, ServiceNow Ventures, Sam Palmisano, and Freddy Kerrest, but we are not your typical data or SaaS startup. Our CEO is a seasoned entrepreneur and Juilliard-trained cellist. The Craft team is globally distributed, with headquarters in San Francisco, an office in London, and team members across North America and Europe. We are looking for innovative, driven people who are passionate about building delightful software to join our rapidly growing team!
A Note to Candidates:
We are an equal opportunity employer that values and encourages diversity, equity, and belonging at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, caste, or disability status.
About the role:
Craft is looking for an experienced and motivated Data Engineer to join our Discovery and Evaluation team. The team's main focus is transforming Craft data (first-, second-, and third-party; structured and unstructured) into usable insights using a range of data engineering and data science techniques. In this role, you will be responsible for a key product within the organization!
As a core member of this team, you will have a real say in how solutions are engineered and delivered. Craft gives engineers a lot of responsibility, matched by our investment in their growth and development. We are growing quickly, and the only limits to your future growth at Craft are your interests and abilities.

What You'll Do:

  • Build and optimize data pipelines (batch and streaming) for big data systems.
  • Extract, analyze, and model rich, diverse data sets.
  • Apply data-mining techniques such as anomaly detection, clustering, regression, classification, and summarization to extract value from our data sets.
  • Design software that is easily testable and maintainable.
  • Ensure data analysis complies with company policies and regulations.
  • Solve problems rigorously and communicate ideas effectively.
  • Keep track of emerging technologies and trends in the data engineering world, incorporating modern tooling and best practices at Craft.
  • Work on an extensible data-processing system that lets pipelines be added and scaled with a low-code approach.
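To give a flavor of the anomaly-detection work mentioned above, here is a deliberately tiny sketch (not Craft code; the signal values and threshold are invented) using a simple z-score rule:

```python
import statistics

def zscore_anomalies(values, threshold=2.0):
    """Return the values lying more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []  # all points identical; nothing is anomalous
    return [v for v in values if abs(v - mean) / stdev > threshold]

# A hypothetical stream of company signals with one obvious outlier:
signals = [10.2, 9.8, 10.1, 10.0, 55.0, 9.9, 10.3]
print(zscore_anomalies(signals))  # -> [55.0]
```

In production, of course, this kind of logic would run inside a pipeline over far richer data; the point is only to illustrate the category of technique.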

Who You Are:

  • You show curiosity by asking questions, digging into new technologies, and always trying to grow.
  • 5+ years of experience in data engineering.
  • 5+ years of hands-on experience with Python.
  • Knowledge of and experience with Amazon Web Services (AWS).
  • Familiarity with the infrastructure-as-code approach (Terraform).
  • Self-starter: independent and eager to take initiative.
  • Experience mentoring data engineers and building a technical team.
  • Deep, fundamental knowledge of data engineering techniques: ETL (batch and streaming), data warehouses (DWH), data lakes, and MPP.
  • Currently reside in Poland.
  • An analytical and statistical background is a plus.
Our current technology stack:

  • AWS services: S3, Batch, Athena, Lambda, Kinesis, Kafka, Redshift, RDS / Aurora Postgres, DynamoDB
  • Data pipelines: Python, SQL, Pandas, Dask, aws-data-wrangler, Papermill, Airflow, Prefect
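As an illustration of the Python/Pandas side of this stack, a minimal batch-transform step might look like the sketch below. The column names, cleaning rules, and data are invented for illustration and are not part of Craft's actual pipelines:

```python
import pandas as pd

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    # Hypothetical cleaning step: drop rows missing a company id,
    # normalize names, then aggregate signal counts per company.
    clean = raw.dropna(subset=["company_id"]).copy()
    clean["name"] = clean["name"].str.strip().str.lower()
    return clean.groupby("company_id", as_index=False)["signal"].sum()

raw = pd.DataFrame({
    "company_id": [1, 1, 2, None],
    "name": [" Acme ", "acme", "Globex", "Orphan"],
    "signal": [3, 4, 5, 9],
})
print(transform(raw))
```

A step like this would typically be wrapped in an Airflow or Prefect task so it can be scheduled, retried, and monitored alongside the rest of a pipeline.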

What We Offer:

  • Competitive salary based on level of expertise, location, cost of living, taxes, market experience, etc. (can be paid in USD, EUR, or PLN)
  • 28 Days of Annual Leave so you can take the time you need to refresh!
  • Uncapped Sick Leave so you can focus on your health when you need it
  • 100% Remote work!
  • 400 zł Monthly Wellness/Learning Stipend (gym memberships, meals, snacks, books, classes, conferences, etc.)
  • 2,000 zł Workstation Allowance (standing desk, chair, monitor, etc.)
Interview Process
We want you to have a great interviewing experience and see this as a conversation. To help you prepare, the typical outline (subject to change) looks like this:
Recruiter (20 min) - Non-technical phone call to discuss Craft, your background and answer questions.
Take-Home Assignment - A project designed to evaluate your coding skills and problem-solving abilities.
Hiring Manager (45 min) - Video call for a deeper dive into your background, the role, and your technical skills, plus a small live-coding task.
Technical (90 min) - Video call taking a deeper dive into your technical skills, including case studies, a live-coding activity, design and architecture, and other data engineering questions.
Engineering Manager + Product (60 min) - Video call with one of our Engineering Managers and a team member from Product to discuss culture fit.