
Senior Data Engineer

Negotiable Salary

Plum Inc

San Francisco, CA, USA


Description

PLUM is a fintech company empowering financial institutions to grow their business through a cutting-edge suite of AI-driven software, purpose-built for lenders and their partners across the financial ecosystem. We are a boutique firm where each person's contributions and ideas are critical to the growth of the company.

This is a fully remote position, open to candidates anywhere in the U.S. with a reliable internet connection. While we gather in person a few times a year, this role is designed to remain remote long-term. You will have autonomy and flexibility in a flat corporate structure that gives you the opportunity for your direct input to be realized and put into action. You'll collaborate with a high-performing team, including sales, marketing, and financial services experts, who stay connected through Slack, video calls, and regular team and company-wide meetings. We're a team that knows how to work hard, have fun, and make a meaningful impact, both together and individually.

Job Summary

We are seeking a Senior Data Engineer to lead the design and implementation of scalable data pipelines that ingest and process data from a variety of external client systems. This role is critical in building the data infrastructure that powers Plum's next-generation AI-driven products. You will work with a modern data stack including Python, Databricks, AWS, Delta Lake, and more. As a senior member of the team, you'll take ownership of architectural decisions, system design, and production readiness, working with team members to ensure data is reliable, accessible, and impactful.

Key Responsibilities

* Design and architect end-to-end data processing pipelines: ingestion, transformation, and delivery to the Delta Lakehouse.
* Integrate with external systems (e.g., CRMs, file systems, APIs) to automate ingestion of diverse data sources.
* Develop robust data workflows using Python and Databricks Workflows.
* Implement modular, maintainable ETL processes following SDLC best practices and Git-based version control.
* Contribute to the evolution of our Lakehouse architecture to support downstream analytics and machine learning use cases.
* Monitor, troubleshoot, and optimize data workflows in production.
* Collaborate with cross-functional teams to translate data needs into scalable solutions.

Requirements

* Master's degree in Computer Science, Engineering, Physics, or a related technical field, or equivalent work experience.
* 3+ years of experience building and maintaining production-grade data pipelines.
* Proven expertise in Python and SQL for data engineering tasks.
* Strong understanding of lakehouse architecture and data modeling concepts.
* Experience working with Databricks, Delta Lake, and Apache Spark.
* Hands-on experience with AWS cloud infrastructure.
* Track record of integrating data from external systems, APIs, and databases.
* Strong problem-solving skills and ability to lead through ambiguity.
* Excellent communication and documentation habits.

Preferred Qualifications

* Experience building data solutions in Fintech, Sales Tech, or Marketing Tech domains.
* Familiarity with CRM platforms (e.g., Salesforce, HubSpot) and CRM data models.
* Experience using ETL tools such as Fivetran or Airbyte.
* Understanding of data governance, security, and compliance best practices.

Benefits

* A fast-paced, collaborative startup culture with high visibility.
* Autonomy, flexibility, and a flat corporate structure that gives you the opportunity for your direct input to be realized and put into action.
* Opportunity to make a meaningful impact in building a company and culture.
* Equity in a financial technology startup.
* Generous health, dental, and vision coverage for employees and family members, plus 401(k).
* Eleven paid holidays and unlimited discretionary vacation days.
* Competitive compensation and bonus potential.

Source: Workable

Location
San Francisco, CA, USA

You may also like

Workable
Low-Latency Developer
Atto Trading, a dynamic quantitative trading firm founded in 2010 and a leader in global high-frequency strategies, is looking for a Low-Latency Developer to join our team. We are expanding an international, diverse team with experts in trading, statistics, engineering, and technology. Our disciplined approach, combined with rapid market feedback, allows us to quickly turn ideas into profit. Our environment of learning and collaboration allows us to solve some of the world's hardest problems together. As a small firm, we remain nimble and hold ourselves to the highest standards of integrity, ingenuity, and effort.

Position Highlights

We are modernizing our trading and research platform to scale our alpha trading business. This platform will enable researchers to explore, test, and deploy sophisticated signals, models, and strategies across asset classes in a robust, fully automated manner while maintaining competitive latency targets. As a Low-Latency Developer, you will be responsible for designing, optimizing, and maintaining high-performance trading systems to minimize latency.

Your Mission and Goals

* Analyze and optimize the performance of low-latency trading systems by identifying bottlenecks and inefficiencies in the code and implementing effective solutions.
* Develop and adapt the platform to support the demands of a fast-paced trading environment while effectively managing technical debt.

Requirements

* Over 5 years of experience as a low-latency developer with a focus on performance optimization in a high-frequency trading (HFT) environment.
* Experience with multiple components of an HFT platform or system, particularly those on the critical path.
* Experience working at an HFT firm during its startup phase and/or on a trading team is a significant plus.

Technical Skills

* Deep knowledge of HFT platforms: networking, kernel bypass, market data, order entry, threading, inter-process communication, and strategy APIs.
* Proven low-latency development and performance optimization in HFT.
* Strong proficiency in C++.
* Excellent understanding of CPU caches and cache efficiency.
* Experience with multithreaded and multi-process synchronization.
* Good understanding of networking protocols.
* Skilled in performance profiling and optimization tools.
* Advanced knowledge of Linux operating systems, including kernel-level device mechanisms.

About You

* Practical decision-making skills.
* Excellent communication skills.
* Strong analytical and problem-solving skills.
* Passion for trading.
* Ability to work independently and as part of a team.

Benefits

* Competitive rates of pay.
* Paid time off (5 weeks).
* Coverage of health insurance costs.
* Office lunches.
* Discretionary bonus system.
* Annual base salary range of $150,000 to $300,000. Pay (base and bonus) may vary depending on job-related skills and experience.

Our Motivation

We are a company committed to staying at the forefront of technology. Our team is passionate about continual learning and improvement. With no external investors or customers, we are the primary users of the products we create, giving you the opportunity to make a real impact on our company's growth. Ready to advance your career? Join our innovative team and help shape the future of trading on a global scale. Apply now and let's create the future together!
New York, NY, USA
$150,000-300,000/year
Workable
Senior Software Developer (Gateway/Market Data)
Eagle Seven is seeking a Senior Software Developer focused on exchange connectivity and market data. The individual will be responsible for analyzing exchange protocols, proposing design solutions, and implementing connectivity to trading venues across the world. The role is part of the platform development team and will provide the individual with exposure to traders and strategy developers. The successful candidate will be a self-starter, have a strong sense of ownership, and be driven to provide technical and intellectual solutions to business problems.

Primary Responsibilities

* Architecting and implementing low-latency market access solutions
* Understanding, interpreting, and interfacing with global exchanges and their protocols
* Designing, developing, and supporting market data feed handlers and exchange order routers
* Diagnosing latency issues and resolving them with appropriate tuning and optimizations
* Working with traders to source, evaluate, and facilitate access to new data sources
* Working with the extended team to capture, house, and provide historical access to market data
* Liaising with vendors on data and technical issues as needed to deliver rapid solutions to the business

Requirements

Skills and Experience:
* Bachelor's degree in Computer Science or a related field
* Proven track record of understanding and working with global exchange protocols
* Experience writing parsers for exchange protocols such as FIX and ITCH
* Strong background in C++ and C++ template metaprogramming, with demonstrated experience using C++14/C++20
* Expertise with TCP/IP, UDP multicast, sockets, and network protocols, particularly on Linux/Unix systems
* Experience using network tools such as Wireshark and tcpdump to monitor and debug behavior
* Ability to work in a collaborative environment
* Excellent written and verbal communication skills

Benefits

Eagle Seven offers a competitive and comprehensive benefits package to all full-time employees:
* Medical PPO and HMO coverage through BlueCross BlueShield
* Company contributions to a Health Savings Account (with enrollment in a High Deductible Health Plan)
* Dental coverage through Principal
* Vision coverage through VSP
* 401k Retirement Savings Plan with employer match
* Company-paid life insurance
* Company-paid disability insurance
* Paid time off
* Flexible Spending Account
* Pre-tax transit benefits
* Complimentary lunch and beverages

The minimum base salary for this role starts at $150,000. This role is eligible for a discretionary performance bonus as part of the total compensation package, in addition to the benefits listed above. Exact compensation offered may vary based on factors including, but not limited to, the candidate's experience, qualifications, and skill set.
Chicago, IL, USA
$150,000/year
Workable
LDAP Software Engineer - Peru, IN - Full-Time Role
LDAP Software Engineer
Location: Peru, IN
Duration: Permanent role

Responsibilities:
* Work with upstream data sources and downstream clients to resolve data-related issues.
* Assist in designing, developing, and implementing Data as a Service (DaaS) and support for microservices.
* Perform data and log analysis.
* Assist in designing, developing, and implementing flexible LDAP architecture for PaaS and SaaS applications.
* Perform periodic maintenance as/when required.

Requirements:
* Advanced knowledge of LDAP directories based on OpenDJ, Oracle (Sun), or ForgeRock directories.
* Expert in maintaining and troubleshooting issues in production environments.
* Experience installing and configuring Oracle DSEE 11g, OpenDJ, and ForgeRock DS.

Required Software Skills:
* Directory Services (LDAP, MS Active Directory, MS Entra ID), including Group Policy, DNS, and DHCP
* Exchange / M365
* VMware vSphere/vCenter
* Server and PC operating systems, including Windows and Linux
* Citrix DaaS and related components
* Microsoft SQL Server and/or Oracle SQL, along with database management utilities

Recommended Working Knowledge:
* PowerShell
* Git (GitHub / GitLab)
* TCP/IP
* Networking / OSI model
Peru, IN, USA
Negotiable Salary
© 2025 Servanan International Pte. Ltd.