Data Architect
Noida
About Us:
One third of the UK working-age population is unable to access affordable credit. We at Amplifi want to rectify this. We aim to improve the nation’s financial health through our state-of-the-art FinTech ecosystem, making ethical lending via credit unions accessible to everyone in the UK.
Amplifi Capital is one of the top five lenders in the UK’s near-prime unsecured personal loans market. Through our innovative work we have grown our new customer volumes fivefold in the last two years alone, and we aim to increase that number even further as we progress.
We are the biggest name in the UK credit union market, with the two largest credit unions in the UK on the Amplifi platform. We want to go beyond that, aiming to be the biggest player in the near-prime segment of the UK personal loan market, standing out against our competitors at the forefront of the personal loans space.
And we don’t want to stop there: with the launch of Reevo Money and our anticipated credit card product in early 2024, we are expanding our footprint in the near-prime lending space.
People always come first at Amplifi Capital, from how we engage with our customers to our thorough recruitment process. Our journey is just getting started: the business has attracted amazing talent so far, and we don’t plan on stopping!
The Role:
This is an exciting opportunity for a hands-on Data Architect with strong data architecture skills and a comprehensive understanding of data engineering, BI, and reporting to take responsibility for the data platform of a growing FinTech business.
This position is ideal for someone skilled in data modelling (relational, dimensional, industry-specific models such as FSLDM) and who enjoys tackling complex data challenges by developing innovative solutions using cloud technologies. The role will involve overseeing the enterprise data lakehouse built on Databricks and handling data modelling for all layers in a medallion architecture. Proficiency in building a data lakehouse on Databricks or Snowflake, as well as experience in data enrichment, database performance management, data caching techniques, data quality initiatives, data engineering techniques, job orchestration, and CI/CD pipelines, will be required.
As a business-facing role, successful candidates must possess excellent stakeholder management and communication skills, along with the ability to rapidly acquire new skills and take ownership of a complex, multi-layered data platform. Collaborating closely with business analysts, data engineers, and data scientists, you will play a pivotal role in constructing and sustaining a sophisticated data platform that is essential to the company's success.
Responsibilities:
Own the data architecture across multiple data platforms (AWS and Databricks)
Own the data lakehouse platform and maintain the data models for all layers of FSLDM
Lead the data modelling of diverse datasets and the hands-on development of the database
Build an understanding of the business process and be able to translate this into the data models
Translate business requirements into technical requirements and architectural diagrams
Lead initiatives through the build process with various engineering teams
Act as subject matter expert for all company data, master data, and reference data
Be part of the operational support of the data platform to ensure a reliable service
Track and communicate issues with the data platform to the technology leadership team
Document the delivered solutions at the technical and business level
Requirements:
Experience with performing database modeling and deploying changes to database schema
A sound knowledge of cloud platforms and serverless computing technologies such as AWS and Databricks
Utilizing complex SQL queries for logic, analysis, and performance tuning
Conducting data analytics on cloud platforms like AWS, Azure, or GCP
A sound understanding of, and experience in, using complex SQL queries to interrogate data and join datasets
Hands-on experience managing and organizing large sets of messy data efficiently
Background with modern BI tools and semantic layers, such as Tableau and Power BI
Experience in working effectively in virtual teams and maintaining constant collaboration
Ability to work independently and take ownership of key services
Nice to Have:
Experience with ETL architectures and tools, including integration with APIs and coding Python data pipelines
Team leadership skills for managing tasks and conducting stand-ups
Knowledge of reference/master data management
Understanding of data governance initiatives such as lineage, masking, retention policy, and data quality
Benefits:
Competitive salary
25 days annual leave
Gratuity
Subsidized Transport
Discount shopping
Private health insurance
Sociable company
Hybrid working (2 days from home)
Commitment:
We are committed to equality of opportunity for all staff, and applications from individuals are encouraged regardless of age, disability, sex, gender, race, or social background.