Job Description: Data Engineer

Hours: Monday to Friday, up to 45 hours per week (additional hours as required)

Shift timing: 6:00 p.m. to 3:00 a.m. IST

Location: Cerebrum IT Park, Kalyani Nagar, Pune

Position: Permanent position, with a three-month probation period

Salary: As per industry standards

About the Company

Valasys Media is a global integrated marketing and sales process outsourcing company that helps companies build a sales pipeline of qualified opportunities and shorten the sales cycle for their products and services. We also help clients create market visibility, build awareness, and establish business relationships in new markets.

Job Brief

We are looking for a Data Engineer to expand and optimize our data and data pipeline architecture and to improve data flow across our systems. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

Data Engineer responsibilities include creating ETL pipelines, so exceptional SQL and programming skills are essential. If you also have knowledge of data science and software engineering, we’d like to meet you. Your ultimate goal will be to shape and build efficient, self-learning applications.

Key Skills: Data Engineering, Python/R, SQL, Pandas, NumPy, Big Data, AWS Cloud

Key Management Areas of Responsibility

  • Analyze and organize raw data.
  • Automate manual processes, optimize data delivery, and re-design infrastructure for greater scalability.
  • Prepare data for prescriptive and predictive modelling.
  • Design, build, and support data pipelines for ingestion, transformation, deployment, conversion, and validation.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a variety of sources using AWS and SQL technologies.
  • Conduct data assessments, perform data quality and performance checks, and transform and load raw data using SQL and ETL tools.

Professional Skills & Qualification

  • Proven experience as a Data Engineer, DevOps Engineer, or in a similar role.
  • Understanding of data structures, data modeling, and software architecture.
  • Hands-on experience with SQL/NoSQL databases (MySQL, PostgreSQL, Cassandra, MongoDB).
  • Strong coding experience in one or more programming languages such as Python, Java, or JavaScript.
  • Knowledge of web scraping techniques and tools.
  • Knowledge of object-oriented programming.
  • Experience with AWS cloud services (e.g., EC2).
  • Experience deploying machine learning/deep learning models on the cloud.
  • Experience with big data tools such as Hadoop and Spark.
  • Knowledge of data analytics pipelines.
  • Strong foundation in mathematics, probability, and statistics.
  • Outstanding analytical and problem-solving skills.
  • 2-4 years’ experience and a Bachelor’s degree in Computer Science, Engineering, Technology, or a related field (required).



