Job Description
Role: Data Engineer
Location: Remote
Duration: 1 year
Rate: $52.85/hour (W2)
Job Summary: The Data Engineer gains access to data across the organization and provides ongoing analysis by monitoring, profiling, and analyzing databases. The role requires a mix of functional, data, and technical skills. The right candidate must be able to understand business requirements, translate them into information needs, and implement those requirements using the data available. The hire will be responsible for expanding and optimizing the data architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that an optimal data delivery architecture is maintained consistently across ongoing projects.
Job Responsibilities:
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Strong SQL knowledge is required, including writing and troubleshooting SQL queries for data mining; the ability to identify sets and subsets of information across multiple joins or unions of tables is preferred.
- Perform complex data analysis and investigation for customer requests to explain results and to make appropriate recommendations.
- Strong understanding of data modeling concepts
- Problem solver with the initiative to think critically to identify improvement opportunities (error detection, error correction, root cause analysis)
- Understand ETL processes to aid in the verification and testing of data.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Analyze business objectives and develop data solutions to meet customer needs.
- Demonstrated ability to effectively participate in multiple, concurrent projects
- Improve and customize current data solutions to meet business functional and non-functional requirements.
- Research new and existing data sources in order to contribute to new development, improve data management processes, and make recommendations for data quality initiatives.
- Perform periodic data quality reviews for internal and external data.
- Ensure timely resolution of queries and data issues.
- Look for new ways to find and collect data by researching potential new sources of information.
- Work with data and analytics experts to strive for greater functionality in our data systems.
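The SQL skills called out above (identifying sets and subsets across joins or unions) can be sketched with a minimal example; the table names and rows below are hypothetical, and SQLite stands in for whatever source-system database is actually in use:

```python
import sqlite3

# Hypothetical schema: member records from two separate source systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE members_a (member_id INTEGER, name TEXT);
    CREATE TABLE members_b (member_id INTEGER, plan TEXT);
    INSERT INTO members_a VALUES (1, 'Ada'), (2, 'Ben'), (3, 'Cal');
    INSERT INTO members_b VALUES (2, 'PPO'), (3, 'HMO'), (4, 'EPO');
""")

# JOIN: the subset of members present in both systems.
both = conn.execute("""
    SELECT a.member_id, a.name, b.plan
    FROM members_a a
    JOIN members_b b ON a.member_id = b.member_id
    ORDER BY a.member_id
""").fetchall()

# UNION: the full set of member IDs known to either system
# (UNION, unlike UNION ALL, de-duplicates the combined rows).
all_ids = conn.execute("""
    SELECT member_id FROM members_a
    UNION
    SELECT member_id FROM members_b
    ORDER BY member_id
""").fetchall()
```

Here `both` holds only the overlap of the two systems, while `all_ids` is the de-duplicated union of every known member ID.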
Required Skills & Experience:
- Demonstrated ability to analyze and profile data to address business problems, leveraging advanced data modeling, source-system databases, or data-mining techniques.
- May provide consultative services to departments/divisions and committees.
- Demonstrated application of problem-solving methodologies, planning techniques, continuous-improvement methods, and analytical tools (e.g., data analysis, data profiling, and modeling).
- The incumbent must be able to manage a varied workload of projects with multiple priorities and stay current on healthcare trends and enterprise changes.
- Strong interpersonal and time-management skills.
- Requires strong analytical skills and the ability to identify and recommend solutions, advanced computer application skills and a commitment to customer service.
- Experience with data analysis, quality, and profiling, using data exploration tools such as Rapid SQL, AQT, Information Analyzer, and Informatica.
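The data-profiling and data-quality work described above can be sketched with standard-library Python alone; the records, column names, and checks here are hypothetical stand-ins for a real source-system extract:

```python
from collections import Counter

# Hypothetical extract: rows pulled from a source-system table.
rows = [
    {"member_id": 1, "state": "OH", "dob": "1980-01-02"},
    {"member_id": 2, "state": "oh", "dob": None},
    {"member_id": 3, "state": "PA", "dob": "1975-06-30"},
    {"member_id": 3, "state": "PA", "dob": "1975-06-30"},  # duplicate key
]

def profile(rows, column):
    """Basic profile of one column: null count and distinct-value frequencies."""
    values = [r.get(column) for r in rows]
    return {
        "nulls": sum(v is None for v in values),
        "distinct": Counter(v for v in values if v is not None),
    }

state_profile = profile(rows, "state")  # surfaces 'OH' vs 'oh' casing drift
dob_profile = profile(rows, "dob")      # surfaces the missing date of birth

# Duplicate-key check: member IDs that appear more than once.
dupes = [k for k, n in Counter(r["member_id"] for r in rows).items() if n > 1]
```

Distinct-value frequencies like `state_profile["distinct"]` are where inconsistent coding (casing, abbreviations, typos) typically shows up first during a periodic data-quality review.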
Required Education:
- Bachelor's degree in Computer Science or Engineering from an accredited University or College.
- OR
- Associate's degree in Computer Science or Engineering from an accredited University or College with two (2) years of experience.
Languages:
- English (Read, Write, Speak)
Required Skills & Responsibilities:
- Contribute to the design, configuration, and support of data, analytics, and AI environments across Google Cloud Platform (GCP) and Microsoft Azure, including Microsoft Fabric
- Build and maintain data pipelines to ingest, cleanse, transform, and curate structured and unstructured data
- Support batch and near real-time data ingestion and transformation workflows
- Use Infrastructure as Code (IaC) tools (e.g., Terraform) to help automate cloud environment provisioning and configuration
- Configure and support cloud services related to data ingestion, integration, messaging, CI/CD, and data processing
- Assist with data modeling and performance optimization in cloud data warehouses (e.g., partitioning and clustering in BigQuery)
- Support the setup and tuning of operational databases or data-serving layers based on defined use cases
- Implement and maintain monitoring, logging, and alerting for data pipelines and platforms
- Write and maintain data transformations using SQL and Python
- Collaborate with engineers, analysts, and product teams in an iterative, product-focused environment
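The ingest-cleanse-transform pipeline work listed above can be sketched in standard-library Python; the CSV feed, field names, and cleansing rules are hypothetical, standing in for whatever curated layer the platform actually serves:

```python
import csv
import io
from datetime import datetime

# Hypothetical raw feed: a CSV extract with messy fields.
raw = io.StringIO(
    "member_id,visit_date,charge\n"
    "1, 2024-01-05 ,100.50\n"
    "2,2024-01-06,\n"      # missing charge -> defaulted to 0.0 below
    "1,2024-01-07,49.50\n"
)

def transform(fh):
    """Ingest, cleanse, and curate: trim whitespace, parse types, default nulls."""
    out = []
    for row in csv.DictReader(fh):
        out.append({
            "member_id": int(row["member_id"]),
            "visit_date": datetime.strptime(
                row["visit_date"].strip(), "%Y-%m-%d"
            ).date(),
            "charge": float(row["charge"]) if row["charge"].strip() else 0.0,
        })
    return out

curated = transform(raw)

# Simple curation step: aggregate charges per member.
total_by_member = {}
for r in curated:
    total_by_member[r["member_id"]] = (
        total_by_member.get(r["member_id"], 0.0) + r["charge"]
    )
```

In practice the same cleanse/type/default steps would run as SQL or Python inside the cloud warehouse (e.g., over a partitioned and clustered BigQuery table), but the shape of the transformation is the same.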