interactive investor is an award-winning investment platform that puts its customers in control of their financial future. We’ve been helping investors for nearly 30 years. We’ve seen market highs and lows and been resilient throughout. We’re now the UK’s number one flat-fee investment platform, with assets under administration approaching £75 billion and over 450,000 customers. For a simple, flat monthly fee we provide a secure home for your pensions, ISAs and investments. We offer a wide choice of over 20,000 UK and international investment options, including shares, funds, trusts and ETFs. We also provide impartial, expert content from our award-winning financial journalists and our highly engaged community of investors, along with daily newsletters and insights.
The Data Engineer role will help ii design, build, and continually improve the firm's Data Platform, consolidating datasets such as customer, transaction, marketing, web analytics, and market data into a trusted, comprehensive source for analytics and Data Science/ML/AI. You will design, build, and run robust Python/SQL pipelines, orchestrated with Dagster, to deliver and transform data in our Snowflake Data Warehouse. The role also partners with the wider Data and Innovation team and business stakeholders on Intelligent Automation—embedding AI agents within data workflows to replace manual, data‑heavy processes—while maintaining high standards of data quality, security, and governance. While our Data Analysts primarily build and maintain reporting outputs, you should be comfortable presenting data via Streamlit and occasionally Power BI. Work may span Microsoft Azure, Amazon Web Services, and Google Cloud, leveraging their AI agent feature sets where appropriate. We are seeking candidates with a range of experience levels, from Junior to Senior and Lead positions.
Key responsibilities:
- Support and monitor the daily Data Platform build; investigate and resolve issues from overnight jobs
- Orchestrate reliable ELT/ETL pipelines using Dagster (assets, jobs, schedules, sensors), implementing dependency management, retries, backfills, SLAs, and alerting to populate the warehouse (star schemas, snapshot tables, slowly changing dimensions)
- Provide reusable SQL queries and data extracts to Data Analysts and business users; promote self-service patterns
- Monitor and triage Data Request tickets for ad-hoc data needs across business functions, including legacy transaction record requests
- Maintain clear data lineage and up-to-date data catalogues and dictionaries; advise on the most appropriate fields for specific use cases
- Partner with business stakeholders to identify manual, data-heavy processes and redesign them as automated pipelines, incorporating AI agents into data workflows (e.g., classification, enrichment, reconciliation, document processing, alerting)
- Integrate pipelines with business systems, APIs, and webhooks; implement scheduling, retries, and alerting through orchestration
- Apply data quality checks, validation rules, and observability (e.g., automated tests, SLAs, anomaly detection)
- Tune performance and cost (query optimisation, partitioning/clustering in Snowflake, indexing, caching, efficient storage formats such as Parquet/Delta)
- Practise strong DataOps: Git-based version control, pull requests/code reviews, CI/CD, environment promotion
- Ensure compliance with data privacy, security, and regulatory requirements; uphold access controls, encryption, and auditability
- Maintain Data Warehouse documentation, including design documentation for new scripts/processes and the corresponding Data Dictionaries
- As required, develop and support new and existing data outputs via Streamlit and occasionally Power BI dashboards/reports for operational MI and ad-hoc analysis
Skills and experience:
- In-depth knowledge of fundamental database concepts (design and querying) and strong knowledge of advanced topics (management and optimisation)
- Advanced SQL: can write new, and interpret existing, complex multi-join aggregation queries
- Advanced understanding of data mart concepts: star/snowflake schemas, snapshot tables, slowly changing dimensions
- Strong Python, with a focus on scripting data collection and transformation
- Experience with Dagster (preferred) for orchestration: assets, jobs, schedules, and sensors for event-based and scheduled pipelines; familiarity with comparable cloud services is a plus
- Experience with Snowflake (or similar cloud data warehouses)
- Strong DataOps practices: Git-based version control, pull requests, code reviews, CI/CD for data pipelines and infrastructure, environment promotion
- Experience developing data outputs using Streamlit (primary) and BI visualisation tools such as Power BI (occasional)
- Understanding of data quality frameworks, observability, and operational monitoring/alerting
- Exposure to Intelligent Automation in data workflows, including safe use of AI agents for enrichment, classification, or document processing; familiarity with cloud AI agent feature sets (e.g., Azure AI/Agents including Azure OpenAI, AWS Agents for Bedrock, Google Vertex AI Agents)
- Familiarity with security, governance, and privacy concepts
- Able to translate high-level business requirements into clear data requirements and robust technical designs

Benefits:
- Group Personal Pension Plan: 8% employer contribution and 4% employee contribution
- Life Assurance and Group Income Protection
- Private Medical Insurance, provided by Bupa
- 25 days annual leave, plus bank holidays
- Staff discounts on our investment products
- Personal & Well-being Fund, supporting your physical and mental wellness
- Retail discounts: savings at a wide range of high street and online retailers
- Voluntary Flexible Benefits: tailor your benefits to suit your lifestyle

Please note: we will make every effort to respond to all applicants; however, due to the high volume of applications we are currently receiving, if you have not been contacted within 30 days of applying, please consider your application unsuccessful. interactive investor operates in accordance with the UK Equality Act 2010. We welcome applications from individuals of all ages, disabilities, gender identities, marital statuses, pregnancy/maternity, races, religions or beliefs, sexes, and sexual orientations. We are committed to treating all applicants fairly and making reasonable adjustments where needed to support disabled applicants. We actively prevent all forms of discrimination, harassment, and victimisation, whether direct, indirect, associative, or perceptive.