About us
Renova Cloud is a leading provider of Strategic Cloud Consulting in Vietnam, specializing in Cloud Computing Solutions, DevOps, and Managed Services. As the AWS Partner of the Year for 2023 and 2024 in Vietnam, we have a skilled team of engineers, architects, and DevOps professionals dedicated to accelerating customers’ success through cloud transformation and modernization processes.
Position Summary / Primary Purpose of the Position
As Cloud Data Head, you will work at the leading edge of cloud technology and collaborate closely with cloud technology leaders. Join us now and be part of the next generation of cloud technology.
Key Duties & Responsibilities (but not limited to)
Designing, coding, and testing software modules/applications for data collection;
Identifying, creating, and preparing the data required for modern Data Analytics and Data Science;
Building modern data architecture solutions (Data Lake, Data Warehouse, and Data Lakehouse) using existing cloud-provided services (a brief sketch of this kind of work follows this list);
Creating and documenting the tests to meet requirements;
Deploying data services into cloud environments and integrating them with other components of the application;
Maintaining, tuning, and adapting applications to keep them performing to specification;
Leading and managing the Data team;
Other tasks as assigned by the CTO or CEO.
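For illustration only, below is a minimal PySpark sketch of the kind of raw-to-curated data lake step described above. The bucket paths, dataset name, and column names are hypothetical placeholders, not a prescribed implementation.

# Minimal PySpark sketch of a raw-to-curated data lake step.
# The S3 paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw_to_curated_orders").getOrCreate()

# Read raw CSV files landed by an upstream data collection job.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-data-lake/raw/orders/")
)

# Basic cleaning and typing before the data is exposed to analytics.
curated = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount_usd", F.col("amount").cast("double"))
)

# Write to the curated zone as Parquet, partitioned by date for query engines.
(
    curated.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-data-lake/curated/orders/")
)

spark.stop()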
Key Working Relationships
Internal: Team members
External: Clients, Partners
Qualifications and Experience
Qualifications/ Memberships
Fluency in SQL and Python is a must; knowledge of other languages such as Scala and Java is preferable
Experienced with ETL/ELT and data integration tools such as Airflow, AWS Glue, Talend (optional), Airbyte or Fivetran, and dbt
Query engine / data processing: PrestoDB, Flink, PySpark
Database engine / streaming: AWS Redshift (optional), MongoDB, PostgreSQL, Kafka, SQL, MySQL, Oracle (optional)
Experienced in real-time data processing with technologies such as Kafka, Apache Flink, and Apache Beam
Experienced in using data orchestration tools such as Apache Airflow, Dagster, or Luigi (see the sketch after this list)
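For orientation, here is a minimal Apache Airflow sketch of the kind of daily orchestration referenced above. The DAG id, schedule, and task bodies are hypothetical placeholders, assuming Airflow 2.4 or later.

# Minimal Apache Airflow sketch of a daily ELT orchestration.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull raw data from a source system into the raw zone.
    print("extracting orders")


def load_to_warehouse(**context):
    # Placeholder: load curated data into the warehouse (e.g. Redshift).
    print("loading to warehouse")


with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load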
Experience
4-5 years of experience in data engineering and cloud data processing services (AWS, Azure, and/or Google Cloud Platform)
Personal Requirements
Interest or experience in Big Data technologies (Hadoop, Spark, Databricks, Snowflake)
Open mindset, with the ability to quickly adapt to new technologies and learn new practices.