National Information Solutions Cooperative (NISC)
Data Engineer (AWS, Databricks)
National Information Solutions Cooperative (NISC), Saint Louis, Missouri, United States, 63146
For more than 50 years, NISC has worked to develop technology solutions for our customers, whom we call our “Members”. Our Members consist primarily of 950+ utility and broadband companies across the country and abroad, and we strive to provide services and technology that help them operate efficiently and better serve their end users. Our Members serve over 16 million end customers (residential and business consumers of power, internet, television, and/or telephone services), and our enterprise software solution enables our Members to compete effectively in the industry while excelling in customer service.
We are seeking an experienced Data Engineer to join our growing team of data analytics experts. This role is responsible for curating and optimizing our data and data pipeline architecture, as well as data flow and collection for various application teams. The Data Engineer will support our application experts, software developers, database architects, and data analysts on our Data Roadmap strategy and will ensure that an optimal data delivery architecture is applied consistently across ongoing projects. They must be comfortable supporting the data needs of multiple teams, systems, and products.
Work Schedule:
Hybrid from one of our office locations:
Cedar Rapids, IA
Lake Saint Louis, MO
Mandan, ND
Hybrid Schedule: A minimum of 3 days per week working from an office location, with the option to work from an office up to all 5 days a week.
Required Days from an Office Location: Tuesday and Wednesday; the third required day is chosen by the candidate and their supervisor.
Essential Duties and Responsibilities:
Assemble large, complex data sets that meet functional / non-functional business requirements.
Apply an understanding of Data Warehouse and Data Lakehouse paradigms.
Design and build optimal data pipelines from a wide variety of data sources using AWS technologies (see the sketch following this list).
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Create data tools that help analytics and data science team members build and optimize a unified data stream.
Work with other data engineering experts to strive for greater functionality while making data more discoverable, addressable, trustworthy, and secure.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Create and maintain a culture of engagement that is conducive to NISC’s Statement of Shared Values.
Commitment to NISC’s Statement of Shared Values.
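To give candidates a concrete sense of the pipeline work described above, below is a minimal sketch of a batch job that reads raw files from S3 and writes a curated Delta table. It is illustrative only: the bucket paths, column names, and schema are hypothetical, and it assumes a Spark environment (such as Databricks) with Delta Lake and S3 access already configured.

```python
# Minimal, illustrative batch pipeline: S3 -> cleanup -> Delta table.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("member-usage-etl").getOrCreate()

# Extract: read raw usage records landed in S3.
raw = spark.read.json("s3://example-raw-bucket/usage/")

# Transform: deduplicate, derive a date column, and drop invalid readings.
cleaned = (
    raw.dropDuplicates(["record_id"])
    .withColumn("reading_date", F.to_date("reading_ts"))
    .filter(F.col("kwh") >= 0)
)

# Load: append to a Delta table partitioned by date for downstream analytics.
(
    cleaned.write.format("delta")
    .mode("append")
    .partitionBy("reading_date")
    .save("s3://example-curated-bucket/usage_delta/")
)
```

Partitioning by date is a common choice here because most downstream analytics filter on a date range.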
Knowledge, Skills & Abilities Preferred:
Experience building and optimizing data pipelines, architectures, and data sets.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills related to working with unstructured datasets.
Experience building ETL processes supporting data transformation, data structures, metadata, dependency management, and workload management.
Working knowledge of message queuing, stream processing, and highly scalable data stores.
Experience supporting and working with cross-functional teams in a dynamic environment.
Experience in a Data Engineer role, plus a BS or MS degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
Experience with AWS: Lambda, S3, SQS, SNS, CloudWatch, etc.
Experience with Databricks and Delta Lake.
Experience with big data tools: Hadoop, Spark, Kafka, etc.
Experience with relational SQL and NoSQL databases, including Oracle, Postgres, Cassandra, and DynamoDB.
Experience with data pipeline and workflow management tools: Hevo Data, Airflow, etc.
Experience with AWS cloud services: EC2, Databricks, EMR.
Experience with stream-processing systems: Apache Spark, Kafka Streams, Spring Cloud, etc. (see the sketch following this list).
Experience with object-oriented languages: Java, Scala.
Nice-to-have: Experience with scripting languages: Python, JavaScript, Bash, etc.
Strong verbal and written communication skills.
Ability to demonstrate composure and think analytically in high-pressure situations.
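As an illustration of how the Databricks/Delta Lake and stream-processing skills above fit together, the sketch below reads events from a Kafka topic with Spark Structured Streaming and continuously appends them to a Delta table. The broker address, topic name, and storage paths are hypothetical, and it assumes the Kafka connector and Delta Lake are available on the cluster; a production job would add schema enforcement and error handling.

```python
# Minimal, illustrative streaming job: Kafka -> Spark Structured Streaming -> Delta.
# Broker, topic, and paths below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("usage-event-stream").getOrCreate()

# Read a stream of events from a Kafka topic.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "usage-events")
    .load()
)

# Kafka delivers values as bytes; cast to a string payload for downstream parsing.
parsed = events.select(F.col("value").cast("string").alias("payload"))

# Continuously append parsed events to a Delta table; the checkpoint lets the
# stream resume from where it left off after a restart.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/usage-events/")
    .start("s3://example-bucket/streams/usage_delta/")
)
query.awaitTermination()
```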
NISC’s Shared Values & Competencies:
Integrity
– We are committed to doing the right thing – always.
Relationships
– We are committed to building and preserving lasting relationships.
Innovation
– We promote the spirit of creativity and champion new ideas.
Teamwork
– We exemplify the cooperative spirit by working together.
Empowerment
– We believe individuals have the power to make a difference.
Personal Development
– We believe the free exchange of knowledge and information is absolutely necessary to the success of each individual and the organization.
Benefits:
Medical, Dental and Vision Insurance.
Health Savings Account (HSA) with $100 monthly contributions from NISC.
Dependent Care Flexible Spending Account (FSA) through Paylocity.
Fully covered life insurance up to 3x annual base salary.
Fully covered short- and long-term disability.
401(k), traditional or Roth, with a match on employee contributions of up to 6% and a 4% base salary contribution from NISC.
PTO accrual levels dependent on years of service, 120 Life Leave Event hours, 9 paid holidays and an annual holiday week.
$2,500 Interest-FREE technology loan program.
$25,000 employee educational assistance program.
Volunteer, wellness, and family events, plus other employee fun, organized by our committees.
Employee Assistance Program, assisting employees and dependents with virtually any life event.
Benevolence Committee to support employees facing financial hardships such as unexpected medical bills, funerals, and other unforeseen expenses.
Education Preferred:
Bachelor’s degree in Computer Science, Statistics, Informatics, Information Systems or similar discipline, preferred.
Certification in Database Administration, along with relevant experience, may be considered in lieu of a four-year degree.
Minimum Physical Requirements:
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this position. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. While performing the essential functions of this position, employees must be able to see and communicate. Employees are regularly required to maintain a stationary position, move, and operate computer keyboards or office equipment.
Disclaimer:
Management may modify this job description by assigning or reassigning duties and responsibilities at any time.
Key Words:
SQL | Data | Big Data | Databricks | ETL | Spark | Scala | DBA | Lakehouse | Postgres | Python | AWS