Our client is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of over 360,000 team members globally in more than 50 countries. With its strong 55-year heritage and deep industry expertise, the company is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms.
We offer:
- Practical benefits: permanent employment contract from the first day; hybrid, flexible working model; equipment package for home office; private medical care with Medicover; life insurance; Capgemini Helpline; NAIS benefit platform
- Access to 70+ training tracks with certification opportunities; a platform with free access to Pluralsight, TED Talks, Coursera, Udemy Business and SAP Learning HUB
- Community Hub that lets you choose from over 20 professional communities gathering people interested in, among other topics, Salesforce, Java, Cloud, IoT, Agile, and AI.
Your tasks:
- design, develop, and maintain Snowflake data pipelines to support various business functions (a minimal pipeline sketch follows this list);
- collaborate with cross-functional teams to understand data requirements and implement scalable solutions;
- optimize data models and schemas for performance and efficiency;
- ensure data integrity, quality, and security throughout the data lifecycle;
- implement monitoring and alerting systems to proactively identify and address issues;
- plan and execute migration from on-prem data warehouses to Snowflake;
- develop AI, ML, and Generative AI solutions;
- stay updated on Snowflake best practices and emerging technologies to drive continuous improvement.
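For illustration only, here is a minimal sketch of one such incremental pipeline in Snowflake SQL, built on a stream plus a scheduled task; the object names (raw_events, events_curated, raw_events_stream, etl_wh) are hypothetical placeholders, not anything specific to this role:

  -- Capture inserts on the landing table.
  CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events;

  -- Run every 15 minutes, but only when the stream has new rows.
  CREATE OR REPLACE TASK load_events_curated
    WAREHOUSE = etl_wh
    SCHEDULE = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
  AS
    INSERT INTO events_curated (event_id, event_ts, payload)
    SELECT event_id, event_ts, payload
    FROM raw_events_stream
    WHERE METADATA$ACTION = 'INSERT';

  -- Tasks are created suspended; resume to activate.
  ALTER TASK load_events_curated RESUME;

Because the task's WHEN clause checks SYSTEM$STREAM_HAS_DATA, the warehouse only spins up (and bills credits) when there is actually something to load.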
Our requirements:
- at least 3 years of experience in Big Data or Cloud projects involving the processing and visualization of large and/or unstructured datasets (including at least 1 year of hands-on Snowflake experience);
- understanding of Snowflake's pricing model and cost optimization strategies for managing resources efficiently (see the cost-control sketch after this list);
- experience in designing and implementing data transformation pipelines natively in Snowflake or with its service partners;
- familiarity with Snowflake’s security model;
- practical knowledge of at least one public cloud platform across Storage, Compute (including Serverless), Networking, and DevOps, backed by commercial project experience;
- at least basic knowledge of SQL and one of: Python/Scala/Java/bash;
- very good command of English.
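As a sketch of the cost-control point above (assuming nothing about this role's actual setup): two standard Snowflake levers are warehouse auto-suspend and a resource monitor with a credit quota. The warehouse and monitor names below are hypothetical placeholders:

  -- Auto-suspend an idle warehouse after 60 seconds so it stops
  -- consuming credits; auto-resume on the next query.
  CREATE OR REPLACE WAREHOUSE etl_wh
    WAREHOUSE_SIZE = 'XSMALL'
    AUTO_SUSPEND = 60
    AUTO_RESUME = TRUE
    INITIALLY_SUSPENDED = TRUE;

  -- Cap monthly spend: notify at 80% of the credit quota,
  -- suspend the warehouse at 100%.
  CREATE OR REPLACE RESOURCE MONITOR etl_monthly_quota
    WITH CREDIT_QUOTA = 100
    FREQUENCY = MONTHLY
    START_TIMESTAMP = IMMEDIATELY
    TRIGGERS
      ON 80 PERCENT DO NOTIFY
      ON 100 PERCENT DO SUSPEND;

  ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monthly_quota;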
Employment agency – register entry no. 47.
This job offer is intended for persons over 18 years of age.
...