Tri-global Solutions Group Inc.
SENIOR DATA ENGINEER (Cloud Data Integration)
Requisition #: R26-3499 (GOAPRDJP00000910)
Location: Remote (within Canada)
Engagement Type: Contract
Number of Resources required: 1
Rate (CAD): Up to $110.00 per hour / Commensurate with related experience and market competitiveness
Term: 2026-05-25 to 2027-04-30 with two 12-month extensions available (up to 35-month contract)
Hours per day: 7.25
Security Screening: Standard (Criminal Record Check)
————————————————————————
Tri-global Solutions Group Inc. is seeking one (1) Senior Data Engineer (Cloud Data Integration) to join our talented Service Delivery team at Digital Design and Delivery (a division of the Government of Alberta) to support projects across various ministries.
WORK MODEL: The successful candidate will work remotely; however, they may be required to attend meetings or work sessions in Edmonton, AB, on reasonable notice from the Client (onsite attendance is unlikely for out-of-province contractors). Work must be performed from within Canada due to network and data security policies. Applicants must be authorized to work in Canada (Canadian citizen or permanent resident). Standard hours of work are 08:00–16:30 Mountain Time, Monday through Friday, excluding observed holidays.
Please review the project overview and requirements below. If you meet the requirements and are interested in submitting for this role, please reply to this job posting.
If you know other consultants who may be interested in this opportunity kindly share this job posting.
Thank you.
Tri-global Solutions Group Inc.
Website: https://tri-global.com
————————————————————————
PROJECT OVERVIEW
The Government of Alberta (GoA) has embarked on transforming the work of government to deliver simpler, more efficient, and better services for the citizens of Alberta, thereby ensuring that the needs of Albertans are effectively met in the digital age. The Province has a strategic role within government to drive efficiencies, innovation and modernization. The Digital Design and Delivery Division (DDD) is the Province’s new centre for digital delivery. It was established to maximize capability and confidence in modern digital practice by ensuring service quality and value through standards and controls. This includes utilizing human-centred design approaches together with agile methodology and modern data practices.
DDD is currently working with Ministries across the GoA, establishing working relationships with partner Ministries throughout this engagement.
Tri-global requires one (1) Data Engineer to work with our delivery team within DDD on service innovation, program review, and digital transformation projects across the GoA. Data Engineers will work as part of cross-functional program review or product delivery teams. These teams, led by GoA product owners and DDD, work collaboratively and participate in a full range of activities, including field research; backlog definition and refinement; and sprint planning and execution. Digital transformation projects review the current state of services, identify future opportunities, and then deliver new services that are efficient, effective, and affordable.
We are seeking talented and versatile Data Engineer(s) to join our dynamic team. The ideal candidate(s) will have a strong foundation in data engineering practices, combined with the analytical skills necessary to derive actionable insights from data. This role involves designing, implementing, and maintaining robust data pipelines and architectures, as well as performing detailed data analysis to support business decisions.
DESCRIPTION OF SERVICES
Services and project deliverables should evolve as the work progresses, in response to emerging user and business needs, as well as design and technical opportunities. However, the following must be delivered (iteratively) over the course of the project:
Data Engineering:
• Design, build, and maintain data pipelines on-premises and in the cloud (Azure, GCP, AWS) to ingest, transform, and store large datasets. Ensure pipelines are reliable and support multiple business use cases.
• Create and optimize dimensional models (star/snowflake) to improve query performance and reporting. Ensure models are consistent, scalable, and easy for analysts to use.
• Integrate data from SQL, NoSQL, APIs, and files while maintaining accuracy and completeness. Apply validation checks and monitoring to ensure high-quality data.
• Improve ETL/ELT processes for efficiency and scalability. Redesign workflows to remove bottlenecks and handle large, disconnected datasets.
• Build and maintain end-to-end ETL/ELT pipelines with SSIS and Azure Data Factory. Implement error handling, logging, and scheduling for dependable operations.
• Automate deployment, testing, and monitoring of ETL workflows through CI/CD pipelines. Integrate releases into regular deployment cycles for faster, safer updates.
• Manage data lakes and warehouses with proper governance. Apply security best practices, including access controls and encryption.
• Partner with engineers, analysts, and stakeholders to translate requirements into solutions. Prepare curated data marts and fact/dimension tables to support self-service analytics.
Data Analytics:
• Analyze datasets to identify trends, patterns, and anomalies. Use statistical methods, DAX, Python, and R to generate insights that inform business strategies.
• Develop interactive dashboards and reports in Power BI using DAX for calculated columns and measures. Track key performance metrics, share service dashboards, and present results effectively.
• Build predictive or descriptive models using statistical, Python, or R-based machine learning methods. Design and integrate data models to improve service delivery.
• Present findings to non-technical audiences in clear, actionable terms. Translate complex data into business-focused insights and recommendations.
• Deliver analytics solutions iteratively in an Agile environment. Mentor teams to enhance analytics fluency and support self-service capabilities.
• Provide data-driven evidence to guide corporate priorities. Ensure strategies and initiatives are backed by strong analysis, visualizations, and models.
MANDATORY SKILLS
– Bachelor's degree in Computer Science, IT, or a related field of study. (Yes or No)
– Ensuring data quality, security, and governance. (3 years+)
– Experience as a Data Engineer and/or Data Analyst. (5 years+)
– Experience designing efficient dimensional models (star and snowflake schemas) for warehousing and analytics. (3 years+)
– Experience developing and maintaining reports, dashboards, and visualizations using Power BI, DAX, Tableau, or Python libraries. (3 years+)
– Experience manipulating and extracting data from diverse on-premises and cloud-based sources. (5 years+)
– Experience performing migrations across on-premises, cloud, and cross-database environments. (3 years+)
– Experience using Git, collaborative workflows, CI/CD pipelines, containerization (Docker/Kubernetes), and Infrastructure as Code (Terraform, ARM, CloudFormation) to deploy and migrate data solutions. (2 years+)
– Experience with SSIS, Azure Data Factory (ADF), and using APIs for extracting and integrating data across multiple platforms and applications. (3 years+)
DESIRABLE / NICE TO HAVE SKILLS
– Experience in application development, with knowledge of object-oriented and functional programming/scripting languages. (2 years+)
– Experience in the Government of Alberta environment or an environment of equivalent size and complexity. (1 year+)
– Experience with databases and data integration, including PostgreSQL, MongoDB, and Azure Cosmos DB, and data integration tools such as Synapse pipelines, Fabric Data Factory, Informatica, Talend, dbt, and Airbyte. (2 years+)
– Exposure to AI/ML tools and workflows relevant to data engineering, such as integrating AI-driven analytics or automation within cloud platforms like Databricks and Azure. (1 year+)
PROJECT EXAMPLES (MUST PROVIDE 2 PROJECT/ASSIGNMENT EXAMPLES)
Two (2) project/assignment examples must be provided that demonstrate your expertise in the selected service area. Each example is composed of the following five (5) questions.
1. Provide an overview of the project/assignment the proposed resource or the proposed resource’s team was engaged in that demonstrates expertise in the selected service area and role. The overview should clearly describe the data problem being addressed and the proposed resource’s responsibilities from a Data Engineering perspective.
2. Describe the sector(s) (i.e. public, private or other) the project/assignment served, including any data sensitivity, regulatory, or privacy considerations relevant to the work.
3. Identify the project/assignment size in dollar value (i.e. less than $100,000, less than $500,000, less than $1,000,000 or greater than $1,000,000).
4. Describe the approach for the design, development, mitigation of risk, and delivery of the project/assignment, including any special considerations with respect to methodology or processes. In the context of Data Engineering, include how data pipelines, data quality, performance, reliability, and operational considerations were addressed. In providing a response, consider quality assurance and communication across the cross-functional team.
5. Provide a list of specific skills, tools and/or technology used within the project/assignment, particularly those related to Data Engineering. Clearly identify the tools and technologies the proposed resource personally worked with.
NOT FOR YOU?
Check out our other opportunities at https://tri-global.com or follow us on LinkedIn. We thank all candidates in advance. Only candidates selected for an interview will be contacted.
WHY WORK WITH TRI-GLOBAL?
– Empower positive change by enabling our clients to revolutionize innovation and technology, elevating them to a higher level of excellence and efficiency.
– Join an exceptional and committed team that redefines the landscape, forging a distinctive path towards success.
– Engage in stimulating and captivating projects that push boundaries and keep you constantly motivated.
To apply for this job email your details to application-intake+3499@tri-global.com
