Data Modernization Program
The Data Stability and Modernization program is a critical foundation for achieving our “Fix Data” objective. We will build upon existing technology investments and modernize our Data & Analytics solutions to deliver on critical business and IT outcomes. The program objectives are: migrating and redesigning Teradata data onto cloud technology, rebuilding operational integration outside of the data analytics platforms, and rebuilding business rules and processes so that legacy data repositories can be decommissioned.
The business outcomes are:
- Single sources of truth with integrity controls and monitoring;
- Timely and reliable reporting;
- Leverage of the rich data sources provided by PSS and Loyalty;
- Separation of operational data flows to improve safe, secure, and reliable operations;
- Reduced overall operating costs by running cloud-based systems instead of separate on-premises infrastructure.
The IT outcomes are:
- Simplified tech footprint;
- Improved security and controls;
- Reduced risk and cost;
- Readiness to decommission legacy systems.
The key responsibilities are:
- Analyze business objectives and develop data solutions to meet customer needs.
- Determine root causes of data integrity gaps and provide appropriate data resolutions and process remedies.
- Develop innovative and effective approaches to solving business problems through data analytics, and communicate results and methodologies.
- Support the development of the organization's data management standards, policies, and procedures.
- Design and administer data collection tools; clean, merge, and manage existing data sets; and establish and maintain data quality procedures to ensure the accuracy and timeliness of data.
- Develop data visualization solutions derived from multiple data sources, using the right tools to enable insight and decision-making at various levels of the organization.
- Develop data visualizations for key business metrics on an ongoing basis.
- Define and document data dependencies and data flow diagrams for the solution developed, e.g. between sources and the systems consuming the data, as well as the logical processes and interfaces with other systems.
- Initiate data analysis and identify strategic opportunities that create a positive impact on the business.
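The data quality responsibility above (establishing procedures to ensure accuracy and timeliness) can be sketched as a minimal rule-based check. This is an illustrative sketch only; the record layout, field names, and staleness threshold are hypothetical assumptions, not part of the program.

```python
from datetime import date, timedelta

# Hypothetical booking records; field names are illustrative only.
records = [
    {"id": "B1", "fare": 420.0, "updated": date(2024, 1, 10)},
    {"id": "B2", "fare": None,  "updated": date(2024, 1, 11)},
    {"id": "B2", "fare": 380.0, "updated": date(2023, 6, 1)},
]

def quality_report(rows, as_of, max_age_days=180):
    """Flag missing values, duplicate keys, and stale rows."""
    seen, issues = set(), []
    for row in rows:
        if row["fare"] is None:
            issues.append((row["id"], "missing fare"))
        if row["id"] in seen:
            issues.append((row["id"], "duplicate key"))
        seen.add(row["id"])
        if as_of - row["updated"] > timedelta(days=max_age_days):
            issues.append((row["id"], "stale record"))
    return issues

print(quality_report(records, as_of=date(2024, 1, 15)))
```

In practice a check like this would run inside the pipeline (e.g. as a Databricks job step) and feed the integrity monitoring named in the business outcomes, rather than printing to stdout.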
Required skills and experience:
- Spark / Python development.
- Azure (Data Lake Gen2, Data Factory, Event Hub, Azure Data Warehouse).
- Databricks and/or Snowflake.
- ETL / Data experience.
- Understanding of, and experience working with, DevOps processes and pipelines (we use Azure DevOps).
- Important tech knowledge: Informatica and Teradata.
- Must have experience implementing security controls.
- 3-5 years of experience.
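As a purely illustrative sketch of the ETL work the stack above implies, the following uses plain Python rather than Spark or Data Factory so it stays self-contained; the file layout and field names are hypothetical assumptions.

```python
import csv
import io

# Hypothetical source extract; in a real pipeline this would land from
# Teradata or an Azure Data Lake zone (all names here are illustrative).
raw = io.StringIO("booking_id,fare,currency\nB1,420.00,CAD\nB2,,CAD\n")

def transform(rows):
    """Drop rows with missing fares and cast fare amounts to float."""
    for row in rows:
        if row["fare"]:  # quality filter: reject empty fare values
            yield {
                "booking_id": row["booking_id"],
                "fare": float(row["fare"]),
                "currency": row["currency"],
            }

# Extract -> transform -> load into an in-memory target keyed by id.
target = {r["booking_id"]: r for r in transform(csv.DictReader(raw))}
print(target)
```

The same extract/filter/cast/load shape carries over to PySpark or a Data Factory mapping, only with distributed DataFrames in place of the in-memory dict.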
IT Recruitment Consultant