ABOUT KV CAPITAL
At KV Capital, we have established ourselves as industry leaders in real estate debt and investment, driven by our unwavering commitment to excellence, trusted partnerships, and innovative solutions.
Since our founding in 2006, we have expanded our business across three connected areas of service: real estate private equity, real estate lending, and private equity investments in real estate supply chain businesses.
Managing a robust portfolio of more than $1 billion in funded investments and more than $500 million in assets under management, our dedicated team of finance experts is driven by a shared goal: to cultivate prosperity for our clients, partners, and communities.
POSITION Data Engineer
LOCATION Edmonton, Alberta or Remote
YOUR OPPORTUNITY
The ideal candidate is a self-starter: a driven individual who is eager to advance their career within a growing business.
KV Capital has grown rapidly over the past several years, and that growth is expected to continue, presenting many opportunities for the Data Engineer to scale the data and BI architecture alongside the organization.
The successful candidate should be well versed in the fundamentals of data engineering and business intelligence, a strong communicator and problem solver, and an all-around team player, all in alignment with KV Capital's growth-oriented agenda.
YOUR KEY RESPONSIBILITIES
In this role, your key responsibilities will include:
- Implement, maintain, audit, and govern enterprise-wide data warehousing in Azure and Microsoft Fabric.
- Perform dimensional modelling according to Kimball star-schema principles.
- Develop and maintain ETL processes using Data Factory, Dataflows Gen 2, and Notebooks in Microsoft Fabric.
- Develop and maintain Power BI reports for the business units.
- Optimize SQL and DAX performance within the data warehouse, ETL, and BI systems.
- Administer the data platforms.
- Lay the foundational infrastructure to allow for future state data science, analytics, and AI implementations.
YOUR CAPABILITIES AND CREDENTIALS
To excel in this role, you must have experience and knowledge in the following areas:
- Azure data engineering technologies, such as ADLS Gen 2, Azure Data Factory, Azure SQL Database, etc.
- SQL querying.
- Kimball enterprise data warehousing and dimensional modelling principles.
- ETL system development using Data Factory, Dataflows Gen 2, and Power Query.
- Analysis Services tabular model development using Power BI.
- Advanced DAX.
- Power BI reports and dashboards.
- Documentation.
- Proposing solutions informed by Microsoft MVP and SQLBI articles, Microsoft documentation, whitepapers, and community publications.
Additional experience or knowledge in other areas would be beneficial, such as :
- Azure administration, service principals, API permissions, etc.
- SQL optimization.
- Advanced Power Query development using M, custom connectors, custom columns, Excel CUBE functions, Excel PowerPivot, etc.
- ETL system development using Fabric Notebooks or Jupyter Notebooks.
- Analysis Services tabular model development using Tabular Editor and Visual Studio / VS Code.
- DAX performance optimization using DAX Studio.
- Power BI paginated reports.
- Power BI Service / Fabric administration and pricing.
- Power BI Service / Fabric artifact deployment using Azure DevOps, Git, Tabular Editor, Azure Data Studio, etc.
- APIs and Postman.
- Python data science and visualization libraries.
APPLY NOW!
Join a dynamic and highly effective team with a continuous stream of challenging and transformative projects. KV Capital is in a phase of growth and expansion, offering competitive total compensation, a collaborative work environment, recognition for outstanding performance, and abundant opportunities for both personal and professional advancement.
For further insights into our company, please visit our website at www.kvcapital.ca.
To apply, please forward your resume and cover letter to:
Michael Crawley, VP Operations Excellence at [email protected]
Please use the subject line Data Engineer in your email correspondence.