
Senior Data Engineer

Negotiable Salary

Plum Inc

San Francisco, CA, USA


Description

PLUM is a fintech company empowering financial institutions to grow their business through a cutting-edge suite of AI-driven software, purpose-built for lenders and their partners across the financial ecosystem. We are a boutique firm where each person's contributions and ideas are critical to the growth of the company. This is a fully remote position, open to candidates anywhere in the U.S. with a reliable internet connection. While we gather in person a few times a year, this role is designed to remain remote long-term. You will have autonomy and flexibility in a flat corporate structure that gives you the opportunity for your direct input to be realized and put into action. You'll collaborate with a high-performing team, including sales, marketing, and financial services experts, who stay connected through Slack, video calls, and regular team and company-wide meetings. We're a team that knows how to work hard, have fun, and make a meaningful impact, both together and individually.

Job Summary
We are seeking a Senior Data Engineer to lead the design and implementation of scalable data pipelines that ingest and process data from a variety of external client systems. This role is critical in building the data infrastructure that powers Plum's next-generation AI-driven products. You will work with a modern data stack including Python, Databricks, AWS, Delta Lake, and more. As a senior member of the team, you'll take ownership of architectural decisions, system design, and production readiness, working with team members to ensure data is reliable, accessible, and impactful.

Key Responsibilities
- Design and architect end-to-end data processing pipelines: ingestion, transformation, and delivery to the Delta Lakehouse.
- Integrate with external systems (e.g., CRMs, file systems, APIs) to automate ingestion of diverse data sources.
- Develop robust data workflows using Python and Databricks Workflows.
- Implement modular, maintainable ETL processes following SDLC best practices and Git-based version control.
- Contribute to the evolution of our Lakehouse architecture to support downstream analytics and machine learning use cases.
- Monitor, troubleshoot, and optimize data workflows in production.
- Collaborate with cross-functional teams to translate data needs into scalable solutions.

Requirements
- Master's degree in Computer Science, Engineering, Physics, or a related technical field, or equivalent work experience.
- 3+ years of experience building and maintaining production-grade data pipelines.
- Proven expertise in Python and SQL for data engineering tasks.
- Strong understanding of lakehouse architecture and data modeling concepts.
- Experience working with Databricks, Delta Lake, and Apache Spark.
- Hands-on experience with AWS cloud infrastructure.
- Track record of integrating data from external systems, APIs, and databases.
- Strong problem-solving skills and ability to lead through ambiguity.
- Excellent communication and documentation habits.

Preferred Qualifications
- Experience building data solutions in Fintech, Sales Tech, or Marketing Tech domains.
- Familiarity with CRM platforms (e.g., Salesforce, HubSpot) and CRM data models.
- Experience using ETL tools such as Fivetran or Airbyte.
- Understanding of data governance, security, and compliance best practices.

Benefits
- A fast-paced, collaborative startup culture with high visibility.
- Autonomy, flexibility, and a flat corporate structure that gives you the opportunity for your direct input to be realized and put into action.
- Opportunity to make a meaningful impact in building a company and culture.
- Equity in a financial technology startup.
- Generous health, dental, and vision coverage for employees and family members, plus 401K.
- Eleven paid holidays and unlimited discretionary vacation days.
- Competitive compensation and bonus potential.

Source: workable

Location
San Francisco, CA, USA


You may also like

Workable
Data & BI Senior Data Engineer
Job Description:
We are seeking a highly skilled and experienced Senior Data Engineer to join our team. The ideal candidate will have a strong background in data engineering, with a specialization in Matillion, SSIS, Azure DevOps, and ETL processes. This role will involve designing, developing, testing, and deploying ETL jobs, collaborating with cross-functional teams, and ensuring efficient data processing.

Key Responsibilities:
- Design, develop, test, and deploy Matillion ETL jobs in accordance with project requirements.
- Collaborate with the Data and BI team to understand data integration needs and translate them into Matillion ETL solutions.
- Create and modify Python code/components in Matillion jobs.
- Identify opportunities for performance optimization and implement enhancements to ensure efficient data processing.
- Collaborate with cross-functional teams, including database administrators, data engineers, and business analysts, to ensure seamless integration of ETL processes.
- Create and maintain comprehensive documentation for Matillion ETL jobs, ensuring knowledge transfer within the team.
- Create, test, and deploy SQL Server Integration Services (SSIS) packages and schedule them via the ActiveBatch scheduling tool.
- Create Matillion deployment builds using the Azure DevOps CI/CD pipeline and perform release manager activities.
- Review code of other developers (L2, L3-BI/DI) to ensure code standards and provide approval as part of code review activities.
- Resolve escalation tickets from the L2 team as part of the on-call schedule.
- Working knowledge of APIs and the Postman tool is an added advantage.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a focus on ETL processes.
- Proficiency in Matillion, SSIS, Azure DevOps, and ETL.
- Strong knowledge of SQL, Python, and data integration techniques.
- Experience with performance optimization and data processing enhancements.
- Excellent collaboration and communication skills.
- Ability to work in a fast-paced, dynamic environment.

Preferred Skills:
- Experience with cloud platforms such as AWS or Azure.
- Knowledge of data warehousing and data modeling.
- Familiarity with DevOps practices and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
Atlanta, GA, USA
Negotiable Salary
Software Engineer, Skill Level 2
At Avalore, we are a mission-driven, veteran-owned small business that helps government agencies harness the power of data and emerging technologies to solve complex problems. Our team combines deep technical expertise with a passion for public service, delivering innovative, responsible solutions in AI, data governance, cybersecurity, and enterprise transformation. Joining Avalore means working alongside experts who have successfully led high-impact initiatives across the DoD and Intelligence Community, and being part of a company that values integrity, agility, and purpose.

Your responsibilities will include:
- Develop, maintain, and enhance complex and diverse software systems (e.g., processing-intensive analytics, novel algorithm development, manipulation of extremely large data sets, real-time systems, and business management information systems) based upon documented requirements.
- Analyze user requirements to derive software design and performance requirements.
- Design and code new software or modify existing software to add new features.
- Debug existing software and correct defects.
- Integrate existing software into new or modified systems or operating environments.
- Develop simple to complex data queries for existing or proposed databases or data repositories.
- Provide recommendations for improving documentation and software development process standards.
- Work individually or as part of a team.
- Review and test software components for adherence to the design requirements, and document test results.
- Utilize software development and software design methodologies appropriate to the development environment.
- Provide specific input to the software components of system design, including hardware/software trade-offs, software reuse, use of Commercial Off-the-Shelf (COTS)/Government Off-the-Shelf (GOTS) products in place of new development, and requirements analysis and synthesis from the system level to individual software components.

Requirements
- Bachelor's degree in a technical discipline from an accredited college or university, plus fourteen (14) years' experience as a SWE in programs and contracts of similar scope, type, and complexity.
- Four (4) years of relevant SWE experience may be substituted for the degree.
- Ability to work independently and manage multiple priorities.
- TS/SCI clearance with Special Security Accesses and polygraph required.
- Applicants must be currently authorized to work in the United States on a full-time basis. Avalore will not sponsor applicants for work visas for this position.

Desired: Cloud, CNO, DevOps, Data Analytics, Machine Learning & AI

Benefits
Eligibility requirements apply.
- Health Care Plan (Medical, Dental & Vision)
- Retirement Plan (401k, IRA)
- Life Insurance (Basic, Voluntary & AD&D)
- Paid Time Off (Vacation, Sick & Public Holidays)
- Short-Term & Long-Term Disability
- Training & Development
- Employee Assistance Program
Fort Meade, MD, USA
Negotiable Salary
Numerical Algorithm Software Engineer
SciTec is a dynamic small business with the mission to deliver advanced sensor data processing technologies and scientific instrumentation capabilities in support of National Security and Defense, and we are growing our creative team! We support customers throughout the Department of Defense and U.S. Government in building innovative new tools to deliver unique world-class data exploitation capabilities.

Important Notice: SciTec exclusively works on U.S. government contracts that require U.S. citizenship for all employees. SciTec cannot sponsor or assume sponsorship of employee work visas of any type. Further, U.S. citizenship is a requirement to obtain and keep a security clearance. Applicants who do not meet these requirements will not be considered.

SciTec has immediate opportunities for talented software and algorithm developers and engineers to support programs focusing on low-latency data processing, fusion, and tracking algorithms for exploitation of remote sensing systems. Our ideal candidate will work well in multiple software languages as part of a rapid-pace, collaborative, small-team environment consisting of scientists, engineers, and developers, and will be able to prototype and develop advanced algorithms leading to eventual integration in C++ on Linux operating systems as part of government frameworks.

Responsibilities
- Research new algorithms and analysis techniques for remote sensor data exploitation
- Demonstrate fluent, idiomatic mastery of Python and/or C++, and comfort with software design and architecture
- Develop proof-of-concept signal processing, image processing, and data exploitation tools in Python
- Characterize quality/performance of algorithms and sensor systems
- Work as part of an Agile team and contribute to shared tools
- Other duties as assigned

Requirements
Colorado Residents: In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information.
- A bachelor's degree in the physical sciences, mathematics, engineering, or computer science
- At least five years of ongoing professional experience in defense and/or defense-related technological fields (additional years of education may be substituted for years of experience)
- Professional experience with state estimation, tracking, or Guidance, Navigation, and Control (GNC)
- Professional experience and fluency in the following languages: C++, Python
- Fluency with Linux operating systems
- Ability to work full-time in person at the Boulder, CO office location
- Detail oriented with good verbal and written communication skills

Candidates who have any of the following skills will be preferred:
- A current active DoD SECRET security clearance or higher
- An advanced degree in the physical sciences, mathematics, engineering, or computer science
- Professional experience with application orchestration and/or deployment to the cloud
- Professional experience with Agile software development
- Professional experience with the exploitation and analysis of OPIR, E/O, SAR, Spectral, RF, or other remotely sensed data

Benefits
SciTec offers a highly competitive salary and benefits package, including:
- Employee Stock Ownership Plan (ESOP)
- 3% fully vested company 401K contribution (no employee contribution required)
- 100% company-paid HSA Medical insurance, with a choice of 2 buy-up options
- 80% company-paid Dental insurance
- 100% company-paid Vision insurance
- 100% company-paid Life insurance
- 100% company-paid Long-Term Disability insurance
- 100% company-paid Hospital Indemnity insurance
- Voluntary Accident and Critical Illness insurance
- Short-Term Disability insurance
- Annual Profit-Sharing Plan
- Discretionary Performance Bonus
- Paid Parental Leave
- Generous Paid Time Off, including Holiday, Vacation, and Sick Pay
- Flexible Work Hours

The pay range for this position is $117,000 - $168,000 / year. SciTec considers several factors when extending an offer of employment, including but not limited to the role and associated responsibilities, a candidate's work experience, education/training, and key skills. This is not a guarantee of compensation. SciTec is proud to be an Equal Opportunity employer. VETS/Disabled.
Boulder, CO, USA
$117,000-168,000/year
Full Stack Developer - Need Only Locals to GA
Job Description:
We are seeking a talented Full Stack Developer to join our team at iSoftTek Solutions Inc. As a Full Stack Developer, you will be responsible for designing, developing, and maintaining our web applications and software solutions. You will collaborate with cross-functional teams to deliver high-quality and scalable software products.

Responsibilities:
- Design, develop, and maintain web applications using modern technologies and frameworks
- Collaborate with product owners, designers, and other stakeholders to gather requirements and translate them into technical specifications
- Write clean and efficient code following industry best practices
- Perform code reviews for team members and provide constructive feedback
- Optimize application performance and ensure scalability
- Troubleshoot and debug issues reported by clients and users
- Stay up-to-date with the latest trends and technologies in web development

Requirements:
- Bachelor's degree in Computer Science, Software Engineering, or a related field
- Minimum of 3 years of experience as a Full Stack Developer
- Proficient in front-end technologies such as HTML, CSS, JavaScript, and JavaScript frameworks like React or Angular
- Strong knowledge of back-end technologies such as Java, Python, or Node.js
- Experience with databases such as MySQL or MongoDB
- Knowledge of version control systems like Git
- Excellent problem-solving and communication skills
- Ability to work independently and in a team environment
- Must be a local resident of Georgia, USA
Atlanta, GA, USA
Negotiable Salary
Data Engineer III
Position: Data Engineer III
Location: Redmond, WA 98052
Duration: 12 Months
Job Type: Contract
Work Type: Onsite

Job Description:
Responsibilities
- Build statistical models to identify patterns and trends in production testing data (telemetry, time-series) to generate correlations and predictions.
- Develop an understanding of the manufacturing execution system, with its multiple sources of data and their relationships.
- Analyze the production system data and derive insights into current bottlenecks and areas of improvement, along with specific actionable recommendations.
- Collect, format, and report periodic snapshots of the production and test data for consumption by other program managers and engineers.
- Synthesize analysis and communicate insights and recommendations to various audiences.
- Work closely with technical program managers and engineers to provide data insights that help explain anecdotes from the manufacturing floor, and provide actionable resolution steps.
- Create templates and scripts to help scale and automate our ability to ingest, analyze, and enable quick data reviews.

Required Skills & Experience
- 7+ years of related experience (data science, data engineering, data analysis, ML engineering)
- 4+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab, Minitab)
- 2+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance
- Very strong development experience with notable BI reporting tools (Oracle BI Enterprise Edition (OBIEE)); should have experience developing a variety of complex reports
- Strong analytical skills and enjoys working with large, complex data sets
- Experience applying theoretical models in an applied environment
- Able to partner with business owners directly to understand their requirements and provide data that helps them observe patterns and spot anomalies

Preferred
- Experience in Python, Perl, or another scripting language
- Master's degree in a quantitative field such as statistics, mathematics, data science, engineering, or computer science
- Experience in aerospace, manufacturing, test processes, and/or test systems development

Typical Day in the Role:
- Accelerate test data collection and reviews to help reduce testing requirements across the OISL LRUs
- Simplify and automate (with scripts and standard schemas) the continuous collection and formatting of such data
- Develop dashboards and visualizations to help the test triage process
- Develop a continuous monitoring and alarming framework that can help inform engineers of areas needing deeper analysis
- Partner with the continuous improvement and OISL teams to accelerate the creation of analysis and tooling to establish manufacturing statistical process controls of key LRUs and test processes

Candidate Requirements:
Years of Experience: 5 years
Leadership Principles: Customer Obsession, Learn and Be Curious, Dive Deep, Clear Communication
Top 3 must-have hard skills:
- 5+ years of related experience (data science, data engineering, data analysis, ML engineering)
- Scripting languages (e.g., Python)
- Statistical modeling data analysis tools
Redmond, WA, USA
Negotiable Salary
© 2025 Servanan International Pte. Ltd.