Job Description
Role: Platform Engineer - DataOps
Location: Hybrid, Arden Hills, MN - 3 days/week onsite
Rate: $43.25-$45.68/hour W2, dependent on skills and qualifications
Summary
The Platform (DataOps) Engineer understands that data is vital for information-driven environments and crucial for business success. They will collaborate with Data Architects, Data Engineers, and Data Scientists to design solutions that deliver consistent, reliable, and efficient data using best practices and standards. These efforts result in accurate and trusted data assets for the Enterprise Data Platform and the organization. They will also support the tooling that builds the data platform through DevOps processes, enhancing platform utilities to improve existing data pipelines and engineer new ones.
Key Responsibilities

Technical Platform Operations
- Automate administrative tasks with Python and SQL against platform APIs and CLIs to enhance the data platform.
- Collaborate with technical staff to optimize their environments.
- Manage cloud platforms like Snowflake, Databricks, Qlik, Event Hubs, Power BI, and other Azure services.
- Create notification solutions to ensure cloud platforms are operational.
- Install patches and upgrades.
- Support issue resolution and escalation.
- Report on business metrics of the platform (adoption, goals alignment, product team needs).
- Develop custom DevOps capabilities when not built-in.
- Engage with vendors to learn new platform capabilities.
- Set platform standards.
- Measure data pipeline compliance and platform uptime.
- Participate in MVP and PoC activities to explore new tech capabilities.
- Perform root cause analysis for issues.
- Implement data quality checks and monitoring systems based on source data gaps and business rules.
- Redesign compute clusters to optimize costs and ensure workload isolation.
- Provide cost monitoring and visibility tools to track and report on cloud platform compute expenses.
Product Development Operations
- Communicate with technical staff.
- Automate release processes for data applications.
- Implement frameworks for varied data ingestion.
- Resolve technical and process issues.
- Document processes and onboard new engineers.
- Measure and report cloud platform usage.
Improve Interoperability
- Communicate platform status to technical product team members.
- Develop regression testing solutions to ensure alignment across various teams and prevent conflicting implementations on the platform.
- Collaborate effectively with cross-functional IT stakeholders to develop DevOps solutions.
- Proactively communicate potential risks across functions.
Required Experience/Education
- College or vocational training in Computer Science, Programming, or a similar technical area,
- OR 1–2 years of experience with data processing, programming languages (SQL, Python, SAS, R, etc.), infrastructure, or DevOps. This experience may come from school or a work setting.
Required Competencies/Skills
- Data lifecycle knowledge
- Python
- CI/CD
- Snowflake (priority over Databricks)
- Data engineering experience with a DevSecOps background
Nice-to-Have
- Experience with Fivetran (replication tool being introduced in the coming months).
- Proficiency in modern data platforms (Snowflake, Databricks, Power BI, Qlik Data Integration).
- Basic technical knowledge and understanding of data processing.
- Understanding and applying data quality principles.
- Identifying potential technical issues and risks.
- Collaborating across all levels and areas of the data platform team and analytics teams.
- Participating in PoC efforts to research emerging technologies.
- Working closely with technical personnel, including hands-on implementation.
- Proficiency in Python and SQL.
- Enthusiasm for a technology-focused career.
- Experience with CI/CD.
Preferred Experience/Education
- Post-high school college or technical training.
Preferred Competencies/Skills
- Knowledge of infrastructure as code.
- Adaptability to a fast-paced environment.