- Create and maintain data pipelines and ETL jobs using Python and SQL
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Collaborate with business leaders, Executives, Data Scientists, BI Analytics, Product, Engineering, and other operational departments to ensure successful delivery of data integration and BI projects
- Perform data analysis to troubleshoot data-related issues and assist in their resolution
Note: work-from-home (WFH) options available
Benefits & Compensation
Acima understands that employment is the sum of many parts. Our compensation is highly competitive, and our total benefits round out what we believe is a complete package.
- Unlimited discretionary time off (DTO)
- We offer a traditional insurance plan AND three QHDHP plans with a company contribution to an HSA: $750 per year for individuals and $1,500 per family (2+)
- Medical insurance in the IHC/United network, Acima pays 85% of the employee premium
- Dental and Vision Insurance
- Health Savings Account (HSA) with company contribution
- Supplemental Insurance (long-term/short-term disability, life insurance, etc.)
- Paid Parental Leave
- Company Paid Holidays
- 401K with company match
- Paid Employee Assistance Program (EAP) with fully covered mental health benefits and more
- Hybrid work-from-home and in-office options
- Flexible schedules available
- Onsite Gym & Bike Lockers
- College Tuition Reimbursement Program
- Training, education, and recertification reimbursement
- Competitive Pay - Our posted pay range for this position starts at $85,000+. However, we understand that each candidate brings their own skills and experience. We evaluate each candidate and review salary requirements to provide the most appropriate and competitive offer.
Skills, Experience & Qualifications:
- 2+ years of experience in a Data Engineer role
- Bachelor's degree or equivalent experience in Computer Science or related technical field
- Strong Python experience
- Strong SQL experience
- Strong REST API experience
- Experience using Linux
- Experience with data warehousing: Redshift and Snowflake
- Warehouse data modeling experience is a plus
- Ability and motivation to learn new technologies quickly with minimal support and guidance
- Strong communication skills
- Experience supporting and working with cross-functional teams in a dynamic environment