Codal is a user experience design and development agency with a focus on blending an Agile process with the latest technologies. Our clientele has ranged from small businesses to the Fortune 100, but our philosophy has always remained the same: to empower brand visibility and deliver the most elegant web and mobile solutions possible.
Roles & Responsibilities
Codal is searching for a highly skilled data engineer to optimize data management, flow, and architecture for the digital platforms Codal designs and develops. With proven experience in both re-architecting and building data systems from scratch, our ideal candidate will ensure that our clients’ data pipelines are as robust and efficient as the digital experiences that house them.
Candidates must have proven experience with both database technologies (SQL and NoSQL, including Postgres), as well as AWS platforms (including EC2, EMR, RDS, & Redshift). Other responsibilities include:
- Crafting & managing data pipelines that meet project requirements and client expectations
- Identifying pain points and inefficiencies in current data processes and implementing optimized solutions to correct them
- Constructing the infrastructure required to extract, transform, and load data from a wide variety of sources via SQL and AWS technologies
- Building analytics tools that parse data systems to provide actionable insights and inform clients’ larger business strategies
- Collaborating with both internal and external product and design teams to offer input and expertise on data-related needs
Requirements / Must Haves
- 3+ years of verifiable experience in a data engineering role
- Bachelor’s or Master’s degree in Computer Science, Statistics, Informatics, Information Systems, or similar field
- Mastery of SQL, including relational database design and query authoring
- Expertise in constructing and optimizing ‘big data’ architectures
- Experience performing root cause analysis on internal and external data and processes
- Experience building processes that support data transformation, data structures, metadata, dependency, and workload management
- Experience manipulating, processing, and extracting value from large, disconnected datasets
- Proficiency with Hadoop, Spark, Kafka, and other data tools
- Proficiency with relational databases such as Postgres, as well as NoSQL databases
- Proficiency with Azkaban, Luigi, Airflow, and other data pipeline management tools
- Excellent project management and organizational skills
- Experience supporting and working with cross-functional teams in a dynamic environment
Why Work For Codal
As a Codal employee, you're a member of a dedicated & driven organization, composed of industry visionaries and innovators. We take the utmost pride in our work and are truly passionate about the services we provide. As a world-class agency, we offer all of the benefits of an enterprise company in a friendlier, tight-knit office community.