Senior Data Engineer

Negotiable Salary

Pickle

New York, NY, USA

Description

Must be located in NYC (we are in office Monday - Thursday). Please apply by emailing recruiting@shoponpickle.com with “Senior Data Engineer” in the subject line and include the following:
- 1-3 (max) bullets on why you think you’re a standout applicant for this role
- 1-2 (max) bullets summarizing an initiative you’re most proud of and the impact it drove (we love metrics!)
- Your resume (attached)

Pickle is a rental marketplace that aims to monetize the billions of underutilized assets sitting in consumers’ closets and brands’ inventory. Users can easily tap into shared closets within their community through flexible and/or on-demand delivery options. Our goal is to provide affordable and convenient access to quality items exactly when our users need them. We are starting with P2P clothing/accessories and expanding to other categories.

Role Overview:
As our first dedicated Senior Data Engineer, you will architect and scale the data hub that fuels every insight, experiment, and decision across the company. You’ll own the full lifecycle of our data platform—from ingestion and modeling to governance and observability—while partnering closely with Product, Growth, and Engineering to unlock self-serve analytics and advanced use cases.

Requirements
- 6+ years of professional data-engineering experience, designing and running production-grade ELT/ETL pipelines
- Deep expertise with Snowflake or a similar warehouse (warehouse design, performance tuning, security, and cost management)
- Strong SQL plus proficiency in Python (or TypeScript/Node.js) for data transformation and orchestration
- Hands-on experience with modern data-stack tooling (e.g., dbt, Airflow/Prefect, Fivetran/Hevo, Kafka/Kinesis)
- Solid grasp of data-governance concepts (catalogs, lineage, quality testing, RBAC, PII handling)
- Cloud infrastructure skills, AWS preferred (S3, ECS/Lambda, IAM, Secrets Manager, CloudWatch)
- Proven track record of setting up data infrastructure and analytics processes from the ground up in a high-growth or startup environment
- Excellent communication skills and the ability to translate business questions into scalable datasets and metrics
- Bonus: experience supporting experimentation frameworks, real-time/event-driven pipelines, or ML feature stores

Key Responsibilities
- Own the data platform roadmap: design, build, and scale the pipelines and orchestration that feed Pickle’s central data hub
- Stand up analytics infrastructure: establish best-practice processes for data modeling, testing, CI/CD, and observability to ensure reliable, timely data
- Implement and optimize Snowflake (or a similar tool): tune performance, manage costs, and leverage advanced features (e.g., zero-copy clones, data sharing)
- Champion data governance & security: implement standards for documentation, data quality, lineage, and access control
- Enable self-service analytics: partner with analysts and stakeholders to expose trusted datasets and metrics in BI tools and experimentation platforms
- Collaborate across functions: work with Product, Growth, Operations, and Engineering to translate business requirements into data assets that drive conversion, retention, and liquidity
- Mentor and elevate the team: share best practices, review code, and influence architecture and tooling decisions as we scale
- Measure and monitor: define SLAs/SLOs for data freshness and accuracy; set up dashboards and alerts to maintain platform health

Benefits
- Competitive compensation and equity
- Healthcare (medical, dental, vision)
- Take-what-you-need paid time off
- MealPal credits to cover the cost of lunch
- Stipend to help set up your desk and office environment
- Work directly with the founders and executive team
- Professional coaching, training, and development
- Grow with the company
- Pickle credits for our employees (we love when the team uses Pickle!)
- Fun team events and company parties
- Company offsites
- Office space in NYC

Source: Workable

You may also like

Workable
Finance Senior Applications Engineer (Law firm exp. required)
Los Angeles, CA, USA
Responsibilities:
- Support and drive forward enterprise applications and platforms such as BigSquare, Carpe Diem, Chrome River, eBillingHub, Elite 3E (including Design Gallery), IntApp Time, Proforma, and Star Collect for the finance department
- Work on application implementations, upgrades, enhancements, troubleshooting, and roadmaps
- Develop in 3E IDE and Design Gallery
- Conduct development activities including analysis and solution design along with hands-on development across application layers
- Provide direct oversight and management of various solutions as assigned; act as subject matter expert for those systems and provide technical support as needed
- Ensure the health and stability of all solutions owned; monitor, track, and report on performance and capacity; responsible for capacity planning
- Under direction of the Senior Manager, Enterprise Applications, engage with the business as needed in supporting the above
- Work with the relevant IT teams in developing and delivering enhancements and new capabilities; drive and manage those efforts
- Deliver recommendations to the business and IT leadership based on an understanding of the platform, roadmap, and related business drivers
- Maintain current and accurate architecture and support documentation for use by both engineering teams and support staff
- Adhere to and participate in the Firm’s Change Management process
- Provide regular updates and reporting to both the Senior Manager, Enterprise Applications and the Director of Applications
- Collaborate in the creation and use of training and QA materials and activities for supported applications
- Provide coaching to junior analysts on troubleshooting and best practices
- Know common legal enterprise solutions; advanced knowledge of enterprise applications on premise, in the cloud, and hybrid
- Work with product vendors and procurement
Requirements:
- At least 7 years of information technology experience
- At least 5 years of experience working with Thomson Reuters Elite 3E
- 5+ years of law firm experience
Negotiable Salary
Craigslist
👨‍💻 AI Developer Training and Work – From Home
1920 E Maish Ave, Des Moines, IA 50320, USA
Looking for a career in AI? We’re hiring passionate individuals ready to grow, collaborate, and innovate at the edge of artificial intelligence development. We’re a well-established tech company inviting driven individuals to grow their skills. Want real experience in AI and software? This structured remote path is designed to be flexible—full- or part-time—and includes over 600 hours of guided work using real industry tools. You’ll be prepared to pursue work as an AI Developer.
What you’ll gain:
🖥️ Computer & Software Fundamentals
- Understand computer and networking systems
- Study algorithms, hardware logic, and digital security
- Learn Python programming from scratch
💻 Web & Application Development
- Develop websites using HTML, CSS, and JavaScript
- Utilize Bootstrap and React.js frameworks
- Collaborate using Git and manage team projects
🧠 AI and Machine Learning Tools
- Dive into machine learning, neural networks, and data modeling
- Use OpenAI, TensorFlow, Pandas, and more
- Train and deploy models for real use cases like automation and chatbots
- Visualize data and learn Docker for project setup
🗄️ Database and Backend Skills
- Design and query SQL databases
- Use backend logic and CRUD integration
🧪 Capstone Project
- Build and launch a full application
- Use professional tools and workflows
- Simulate a team environment with Agile/Scrum
🧰 Career Preparation
- Prepare for tech interviews and assessments
- Write a compelling technical résumé
- Transition into software or AI development roles
No experience needed. 100% remote. Apply here: https://aitraining.compare
$30/hour
Workable
Data Engineer - Job ID: DE2025
Princeton, NJ, USA
Build the data foundation that powers patient‑centric decisions.
Ascendis Pharma is a growth‑stage biopharmaceutical company guided by the core values Patients · Science · Passion. To strengthen our cloud‑native commercial and medical data platform, we are looking for a Data Engineer who combines solid data‑pipeline craftsmanship with a collaborative, product‑oriented mindset. You will join the same agile product team as our analysts, working in rapid product‑discovery cycles to deliver trustworthy data that drives access to life‑changing therapies.
Your four focus areas
1. Data Engineering: Design, build, and maintain scalable ELT pipelines that ingest Patient Support Program, Specialty Pharmacy, Claims, CRM, Marketing, Medical, and Financial data. Implement automated data‑quality checks, monitoring, and alerting.
2. Data Modelling: Develop canonical data models and dimensional marts in our upcoming Databricks Lakehouse to enable self‑service analytics. Apply best‑practice naming, documentation, and version control.
3. Collaboration & Facilitation: Work side‑by‑side with analysts and product owners to translate business questions into robust technical solutions. Lead whiteboard sessions and spike explorations during product discovery.
4. DevOps & Continuous Improvement: Configure CI/CD pipelines for data code, automate testing, and promote reusable patterns. Keep an eye on performance, cost, and security, driving iterative enhancements.
Key responsibilities
- Engineer reliable, end‑to‑end data flows using Spark SQL and PySpark notebooks in Databricks (nice to have)
- Orchestrate and schedule batch and streaming jobs in Azure Data Factory or similar tools
- Develop and maintain dbt models (nice to have) to standardise transformations and documentation
- Implement data‑cataloguing, lineage, and governance practices that meet pharma‑compliance requirements
- Drive alignment with analysts to optimise queries, refine schemas, and keep metrics consistent across dashboards, so stakeholders have a single source of truth for decision-making
- Contribute to agile ceremonies (backlog refinement, sprint reviews, retros) and actively champion a culture of data craftsmanship
Why Ascendis
- Mission that matters – Our pipelines accelerate insight generation that expands patient access to vital therapies.
- Modern stack – Azure, Databricks, dbt, GitHub Actions – with room to propose new tools.
- End‑to‑end ownership – Influence everything from ingestion to visualisation in a small, empowered team.
- Hybrid flexibility & growth – Work three days onsite, enjoy generous benefits and great snacks, and pursue clear technical or leadership paths.
Requirements
What you bring
- Experience – 3–7 years in Data Engineering or Analytics Engineering, ideally in pharma, biotech, or another regulated, data‑rich environment.
- Core toolkit – Advanced SQL and Python. Solid experience in dimensional modelling.
- Nice‑to‑have – Hands‑on with Databricks notebooks/Lakehouse and dbt for transformation and testing.
- Product mindset – Comfortable iterating fast, demoing early, and measuring impact.
- Communication & teamwork – Able to explain trade‑offs, write clear documentation, and collaborate closely with analysts, product managers, and other business stakeholders.
- Quality focus – Passion for clean, maintainable code, automated testing, and robust data governance.
Benefits
- 401(k) plan with company match
- Medical, dental, and vision plans
- Company-offered Life and Accidental Death & Dismemberment (AD&D) insurance
- Company-provided short- and long-term disability benefits
- Unique offerings of Pet Insurance and Legal Insurance
- Employee Assistance Program
- Employee Discounts
- Professional Development
- Health Savings Account (HSA)
- Flexible Spending Accounts
- Various incentive compensation plans
- Accident, Critical Illness, and Hospital Indemnity Insurance
- Mental Health resources
- Paid leave benefits for new parents
A note to recruiters: We do not allow external search party solicitation. Presentation of candidates without written permission from the Ascendis Pharma Inc Human Resources team (specifically from: Talent Acquisition Partner or Human Resources Director) is not allowed. If this occurs, your ownership of these candidates will not be acknowledged.
Negotiable Salary
Craigslist
DAS TECHNICIAN
Little Jetties, 3554-3598 Jimmy Buffett Mem Hwy, Jacksonville, FL 32233, USA
HardHat National Accounts is hiring experienced DAS Technicians/Installers to oversee and manage in-building wireless (IBW) DAS installations and low-voltage projects. The ideal candidate will have hands-on experience leading crews, coordinating job sites, and ensuring the successful completion of projects on time and within budget.
Locations: Sandston, Virginia; South Hill, Virginia; Atlanta, Georgia
Pay is up to $35/hour. Per diem is provided if you meet requirements and is paid for all 7 days. We can be reached at 713.487.6911 by call or text for an immediate interview.
Job scope (not limited to the following):
- Installation, maintenance, and troubleshooting of Distributed Antenna Systems (DAS) to enhance telecommunications coverage
- Reporting to the Technical Manager, your core skills in computer networking, cabling, and telecommunication will be essential in ensuring system reliability
- Your premium skills in network installation, customer service, and CCTV will enhance client satisfaction and operational efficiency
Minimum experience required:
- Basic understanding of RF principles, wireless communication technology, and DAS systems
- Testing and commissioning
- Troubleshooting and problem solving to address installation issues
We offer a competitive benefits plan, weekly pay, and long-term work. Please call/text 713.487.6911 for an immediate phone interview.
All project estimates are mere estimates and not representative of a guaranteed period of employment. HardHat is always subject to our clients’ termination of a project or assignment.
$35/hour