
Senior Data Engineer

Negotiable Salary

Plum Inc

San Francisco, CA, USA


Description

PLUM is a fintech company empowering financial institutions to grow their business through a cutting-edge suite of AI-driven software, purpose-built for lenders and their partners across the financial ecosystem. We are a boutique firm where each person's contributions and ideas are critical to the growth of the company. This is a fully remote position, open to candidates anywhere in the U.S. with a reliable internet connection. While we gather in person a few times a year, this role is designed to remain remote long-term. You will have autonomy and flexibility in a flat corporate structure that gives you the opportunity for your direct input to be realized and put into action. You'll collaborate with a high-performing team, including sales, marketing, and financial services experts, who stay connected through Slack, video calls, and regular team and company-wide meetings. We're a team that knows how to work hard, have fun, and make a meaningful impact, both together and individually.

Job Summary
We are seeking a Senior Data Engineer to lead the design and implementation of scalable data pipelines that ingest and process data from a variety of external client systems. This role is critical in building the data infrastructure that powers Plum's next-generation AI-driven products. You will work with a modern data stack including Python, Databricks, AWS, Delta Lake, and more. As a senior member of the team, you'll take ownership of architectural decisions, system design, and production readiness, working with team members to ensure data is reliable, accessible, and impactful.

Key Responsibilities
• Design and architect end-to-end data processing pipelines: ingestion, transformation, and delivery to the Delta Lakehouse.
• Integrate with external systems (e.g., CRMs, file systems, APIs) to automate ingestion of diverse data sources.
• Develop robust data workflows using Python and Databricks Workflows.
• Implement modular, maintainable ETL processes following SDLC best practices and Git-based version control.
• Contribute to the evolution of our Lakehouse architecture to support downstream analytics and machine learning use cases.
• Monitor, troubleshoot, and optimize data workflows in production.
• Collaborate with cross-functional teams to translate data needs into scalable solutions.

Requirements
• Master's degree in Computer Science, Engineering, Physics, or a related technical field, or equivalent work experience.
• 3+ years of experience building and maintaining production-grade data pipelines.
• Proven expertise in Python and SQL for data engineering tasks.
• Strong understanding of lakehouse architecture and data modeling concepts.
• Experience working with Databricks, Delta Lake, and Apache Spark.
• Hands-on experience with AWS cloud infrastructure.
• Track record of integrating data from external systems, APIs, and databases.
• Strong problem-solving skills and ability to lead through ambiguity.
• Excellent communication and documentation habits.

Preferred Qualifications
• Experience building data solutions in Fintech, Sales Tech, or Marketing Tech domains.
• Familiarity with CRM platforms (e.g., Salesforce, HubSpot) and CRM data models.
• Experience using ETL tools such as Fivetran or Airbyte.
• Understanding of data governance, security, and compliance best practices.

Benefits
• A fast-paced, collaborative startup culture with high visibility.
• Autonomy, flexibility, and a flat corporate structure that gives you the opportunity for your direct input to be realized and put into action.
• Opportunity to make a meaningful impact in building a company and culture.
• Equity in a financial technology startup.
• Generous health, dental, and vision coverage for employees and family members, plus 401(k).
• Eleven paid holidays and unlimited discretionary vacation days.
• Competitive compensation and bonus potential.

Source: Workable

Location
San Francisco, CA, USA


You may also like

Workable
Power BI Developer
Tiger Analytics is a fast-growing advanced analytics consulting firm. Our consultants bring deep expertise in Data Science, Machine Learning, and AI. We are the trusted analytics partner for multiple Fortune 500 companies, enabling them to generate business value from data. Our business value and leadership have been recognized by various market research firms, including Forrester and Gartner. We are looking for top-notch talent as we continue to build the best global analytics consulting team in the world. The right candidate will have broad skills in database design, be comfortable dealing with large and complex data sets, have experience building self-service dashboards, be comfortable using visualization tools, and be able to apply these skills to generate insights that help solve business challenges. We are looking for someone who can bring their vision to the table and implement positive change in taking the company's data analytics to the next level.

Requirements
• Bachelor's degree in Computer Science or a related field
• 6+ years of experience in the Business Intelligence space
• Knowledge of the Power BI business intelligence reporting tool is a must
• Knowledge of SQL is a must
• Knowledge of Snowflake is a must
• Experience with the Data Warehouse ETL process using SSIS packages is preferred
• Experience with tools and concepts related to data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data
• Consulting experience is highly preferred

Benefits
This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.
Tempe, AZ, USA
Negotiable Salary
Workable
Senior Cloud Engineer - Kubernetes
ABOUT US
Headquartered in the United States, TP-Link Systems Inc. is a global provider of reliable networking devices and smart home products, consistently ranked as the world's top provider of Wi-Fi devices. The company is committed to delivering innovative products that enhance people's lives through faster, more reliable connectivity. With a commitment to excellence, TP-Link serves customers in over 170 countries and continues to grow its global footprint.

We believe technology changes the world for the better! At TP-Link Systems Inc., we are committed to crafting dependable, high-performance products to connect users worldwide with the wonders of technology. Embracing professionalism, innovation, excellence, and simplicity, we aim to assist our clients in achieving remarkable global performance and enable consumers to enjoy a seamless, effortless lifestyle.

OVERVIEW
We are looking for an experienced Senior Cloud Engineer specializing in Kubernetes development and in enhancing underlying system capabilities to join our team. In this role, you will focus on Kubernetes ecosystem development: optimizing and enhancing the underlying architecture, integrating open-source solutions, and improving Kubernetes cluster capabilities. You will directly contribute to the development of the underlying Kubernetes cluster architecture, driving system performance improvements and scalability enhancements.

KEY RESPONSIBILITIES
• Design and develop Kubernetes functionalities and modules to enhance cluster scalability, performance, and stability.
• Integrate and develop open-source solutions within the Kubernetes ecosystem to enhance features and build a more robust platform.
• Provide a more efficient platform for development teams, supporting automated deployment, monitoring, and failure-recovery features.
• Assist development teams in solving technical issues related to Kubernetes' underlying functions and provide development support.
• Participate in Kubernetes cluster lifecycle management, including setup, maintenance, scaling, and optimization.
• Collaborate with operations teams to ensure the stable operation of the clusters and assist in solving complex configuration and troubleshooting issues.
• Design and implement Kubernetes management solutions across multiple cloud platforms, supporting seamless multi-cloud architecture integration.
• Continuously drive improvements in the cluster architecture, enhance automation, and reduce operational costs.

REQUIRED QUALIFICATIONS
• At least 3 years of Kubernetes development experience, with a solid understanding of Kubernetes architecture, components, and operational principles, and the ability to develop and optimize the underlying technology stack.
• Extensive experience in Kubernetes cluster development, able to solve deep technical issues related to the underlying architecture.
• Proficiency in containerization technologies (such as Docker and Helm), and experience integrating and developing open-source tools.
• Experience with large-scale Kubernetes clusters and performance optimization, with hands-on experience in underlying system development.
• Experience managing Kubernetes clusters in multi-cloud environments and designing cross-cloud solutions.
• Proficiency in automation and CI/CD processes, and in promoting efficient automated deployment and integration within Kubernetes clusters.
• Strong problem-solving skills, with the ability to address technical challenges and improve system architecture.
• Good communication skills, able to collaborate clearly with teams and other departments to drive technical improvements.

PREFERRED QUALIFICATIONS
• Familiarity with the development and integration of service meshes (e.g., Istio) and monitoring systems (e.g., Prometheus, Grafana).
• Certifications such as CKA (Certified Kubernetes Administrator) or CKAD (Certified Kubernetes Application Developer) are a plus.

BENEFITS
• Salary range: $150,000 - $180,000
• Free snacks and drinks, and provided lunch on Fridays
• Fully paid medical, dental, and vision insurance (partial coverage for dependents)
• Contributions to 401(k) funds
• Bi-annual reviews and annual pay increases
• Health and wellness benefits, including free gym membership
• Quarterly team-building events

At TP-Link Systems Inc., we are continually searching for ambitious individuals who are passionate about their work. We believe that diversity fuels innovation, collaboration, and drives our entrepreneurial spirit. As a global company, we highly value diverse perspectives and are committed to cultivating an environment where all voices are heard, respected, and valued. We are dedicated to providing equal employment opportunities to all employees and applicants, and we prohibit discrimination and harassment of any kind based on race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Beyond compliance, we strive to create a supportive and growth-oriented workplace for everyone. If you share our passion and connection to this mission, we welcome you to apply and join us in building a vibrant and inclusive team at TP-Link Systems Inc. Please, no third-party agency inquiries, and we are unable to offer visa sponsorships at this time.
Irvine, CA, USA
$150,000-180,000/year
Workable
Oracle Developer with ETL (OBIA)
Location: Louisville, KY (Remote)

Requirements
Note: The customer is looking for OBIA upgrade/implementation, OBIEE, and ETL experience.
• Developer with a strong understanding of Oracle Business Analytics Warehouse and ODI coding. This person should have experience with BIACM (Oracle BI Applications Configuration Manager) and overall very good Oracle PL/SQL writing skills.
• Manage, support, and optimize all ETL processes to perform data extraction, transformation, and loading (ETL) using the Oracle Data Integrator tool in Oracle BI Applications (OBIA 12c).
• Design, develop, and test any new ETL processes required to populate the Reporting/Data Analytics Platform from source systems.
• Manage the development of new, and the support of existing, data warehouse reports, dashboards, analytics, and other data warehouse output requirements using the Oracle Business Intelligence tools.
• Responsible for end-to-end design, coding, testing, review, and implementation using OBIEE 12c, Oracle Data Integrator, and Oracle OBIA applications.
• Perform data queries for analysis by understanding existing warehouse structures.
• Additional responsibilities include troubleshooting, maintenance, optimization, and performance tuning of ETL jobs to meet SLAs.
• Strong knowledge of Oracle Business Analytics Warehouse and the Oracle BI Applications Configuration Manager tool.
• Working experience with OBIEE/OBIA upgrades.
Louisville, KY, USA
Negotiable Salary
© 2025 Servanan International Pte. Ltd.