At NativeML, our proven success has rapidly increased demand for our services, driving sustained growth and an expanded presence at our company headquarters in the North Loop of downtown Minneapolis (Industrious), as well as across the US. Seasoned professionals with proven experience can work remotely from their location, with limited travel for project kick-offs.

In addition to phenomenal growth and learning opportunities, we offer a competitive compensation package including excellent perks, an annual bonus, extensive training, paid Snowflake and Databricks certifications, generous PTO, and a long-term incentive program.

As a Solutions Architect at NativeML, your responsibilities will include:  

  • Design, develop, and deliver innovative Snowflake or Databricks solutions; partner with our internal Infrastructure Architects and Data Engineers to build creative solutions to tough data problems
  • Integrate data from a variety of sources (data warehouses, data marts) utilizing on-prem or cloud-based data structures (AWS); evaluate new and existing data sources
  • Design and implement streaming, data lake, and analytical data solutions 
  • Create and direct testing strategies including unit, integration, and full end-to-end tests of data pipelines 
  • Utilize ETL processes to build data repositories; integrate data into Snowflake or Databricks Delta 
  • Select the best tools to ensure optimized data performance; perform data analysis utilizing Spark and SQL
  • Mentor and coach Developers and Data Engineers; provide guidance on project creation, application structure, automation, code style, testing, and code reviews

Required Skills & Experience

  • 5+ years of experience as a Software Engineer, Data Engineer, or Data Analyst
  • University degree in computer science, engineering, mathematics, or a related field, or equivalent experience
  • Proficiency in writing production-ready Java, Scala, or Python applications
  • Experience with Big Data Technologies such as Spark, Hadoop, and Kafka
  • Excellent skills presenting to both technical and executive audiences, whether impromptu at a whiteboard or using prepared presentations and demos
  • Solid skills in databases, data warehouses, and data processing
  • Extensive hands-on expertise with SQL and SQL analytics
  • Experience with and understanding of large-scale infrastructure-as-a-service platforms such as Amazon Web Services (AWS) and Microsoft Azure, including provisioning with Terraform
  • Strong Linux/Unix administration skills (preferred)
  • Experience implementing ETL pipelines using custom and packaged tools
  • Experience using AWS services such as Lambda, S3, Kinesis, Glue
  • Experience using Azure Data Factory to connect to source systems and copy data to Azure Blob Storage
