
Senior Data Engineer

Negotiable Salary

Plum Inc

San Francisco, CA, USA


Description

PLUM is a fintech company empowering financial institutions to grow their business through a cutting-edge suite of AI-driven software, purpose-built for lenders and their partners across the financial ecosystem. We are a boutique firm where each person's contributions and ideas are critical to the growth of the company.

This is a fully remote position, open to candidates anywhere in the U.S. with a reliable internet connection. While we gather in person a few times a year, this role is designed to remain remote long-term. You will have autonomy and flexibility in a flat corporate structure that gives you the opportunity for your direct input to be realized and put into action. You'll collaborate with a high-performing team, including sales, marketing, and financial services experts, who stay connected through Slack, video calls, and regular team and company-wide meetings. We're a team that knows how to work hard, have fun, and make a meaningful impact, both together and individually.

Job Summary

We are seeking a Senior Data Engineer to lead the design and implementation of scalable data pipelines that ingest and process data from a variety of external client systems. This role is critical in building the data infrastructure that powers Plum's next-generation AI-driven products. You will work with a modern data stack including Python, Databricks, AWS, Delta Lake, and more. As a senior member of the team, you'll take ownership of architectural decisions, system design, and production readiness, working with team members to ensure data is reliable, accessible, and impactful.

Key Responsibilities

- Design and architect end-to-end data processing pipelines: ingestion, transformation, and delivery to the Delta Lakehouse.
- Integrate with external systems (e.g., CRMs, file systems, APIs) to automate ingestion of diverse data sources.
- Develop robust data workflows using Python and Databricks Workflows.
- Implement modular, maintainable ETL processes following SDLC best practices and Git-based version control.
- Contribute to the evolution of our Lakehouse architecture to support downstream analytics and machine learning use cases.
- Monitor, troubleshoot, and optimize data workflows in production.
- Collaborate with cross-functional teams to translate data needs into scalable solutions.

Requirements

- Master's degree in Computer Science, Engineering, Physics, or a related technical field, or equivalent work experience.
- 3+ years of experience building and maintaining production-grade data pipelines.
- Proven expertise in Python and SQL for data engineering tasks.
- Strong understanding of lakehouse architecture and data modeling concepts.
- Experience working with Databricks, Delta Lake, and Apache Spark.
- Hands-on experience with AWS cloud infrastructure.
- Track record of integrating data from external systems, APIs, and databases.
- Strong problem-solving skills and ability to lead through ambiguity.
- Excellent communication and documentation habits.

Preferred Qualifications

- Experience building data solutions in Fintech, Sales Tech, or Marketing Tech domains.
- Familiarity with CRM platforms (e.g., Salesforce, HubSpot) and CRM data models.
- Experience using ETL tools such as Fivetran or Airbyte.
- Understanding of data governance, security, and compliance best practices.

Benefits

- A fast-paced, collaborative startup culture with high visibility.
- Autonomy, flexibility, and a flat corporate structure where your direct input can be realized and put into action.
- Opportunity to make a meaningful impact in building a company and culture.
- Equity in a financial technology startup.
- Generous health, dental, and vision coverage for employees and family members, plus 401(k).
- Eleven paid holidays and unlimited discretionary vacation days.
- Competitive compensation and bonus potential.

Source: Workable

Location
San Francisco, CA, USA

© 2025 Servanan International Pte. Ltd.