Freelance Software Developer (Rust) - Quality Assurance (AI Trainer)

Up to $50/hour

Mindrift

Colorado, USA


Description

This opportunity is only for candidates currently residing in the specified country. Your location may affect eligibility and rates.

At Mindrift, innovation meets opportunity. We believe in using the power of collective intelligence to ethically shape the future of AI.

What we do
The Mindrift platform connects specialists with AI projects from major tech innovators. Our mission is to unlock the potential of Generative AI by tapping into real-world expertise from across the globe.

About the Role
GenAI models are improving very quickly, and one of our goals is to make them capable of addressing specialized questions and achieving complex reasoning skills. If you join the platform as an AI Tutor in Coding, you'll have the opportunity to collaborate on these projects. Although every project is unique, your tasks might typically include:
- Code generation and code review
- Prompt evaluation and complex data annotation
- Training and evaluation of large language models
- Benchmarking and agent-based code execution in sandboxed environments
- Working across multiple programming languages (Python, JavaScript/TypeScript, Rust, SQL, etc.)
- Adapting guidelines for new domains and use cases
- Following project-specific rubrics and requirements
- Collaborating with project leads, solution engineers, and supply managers on complex or experimental projects

How to get started
Simply apply to this post, qualify, and get the chance to contribute to projects aligned with your skills, on your own schedule. From creating training prompts to refining model responses, you'll help shape the future of AI while ensuring technology benefits everyone.

Requirements
- A Bachelor's degree in Computer Science, Software Engineering, or a related field.
- At least 3 years of professional experience with Rust and hands-on experience with Rust testing tools (e.g., cargo test, property-based testing with quickcheck; a brief sketch follows this description).
- Familiarity with Rust tooling (Cargo, Clippy, rustfmt) and with module and crate management.
- Experience with code review, quality analysis, and identifying and fixing code smells and type issues.
- Familiarity with test integration in CI/CD environments.
- Experience using, integrating, or tutoring others with AI solutions or LLMs in Rust/test-automation contexts.
- Advanced English (C1) or above.
- Readiness to learn new methods, the ability to switch between tasks and topics quickly, and the willingness to sometimes work with challenging, complex guidelines.
This freelance role is fully remote, so you just need a laptop, an internet connection, available time, and the enthusiasm to take on a challenge.

Benefits
Why this freelance opportunity might be a great fit for you:
- Get paid for your expertise, with rates of up to $50/hour depending on your skills, experience, and project needs.
- Take part in a part-time, remote, freelance project that fits around your primary professional or academic commitments.
- Work on advanced AI projects and gain valuable experience that enhances your portfolio.
- Influence how future AI models understand and communicate in your field of expertise.
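The Rust testing tools named in the requirements refer to workflows like the one sketched below: a unit test run by cargo test alongside a property-based test written with the quickcheck crate. This is a minimal, illustrative sketch only; the function and the property being checked are assumptions made for demonstration and are not part of the job post, and quickcheck is assumed to be declared as a dev-dependency in Cargo.toml.

```rust
// Illustrative example only: a small function with an ordinary unit test
// (run by `cargo test`) and a property-based test using the `quickcheck`
// crate (assumed dev-dependency). Function and property are hypothetical.

/// Collapses runs of whitespace in `input` into single spaces.
pub fn collapse_spaces(input: &str) -> String {
    input.split_whitespace().collect::<Vec<_>>().join(" ")
}

#[cfg(test)]
mod tests {
    use super::*;
    use quickcheck::quickcheck;

    // Ordinary unit test, executed by `cargo test`.
    #[test]
    fn collapses_consecutive_spaces() {
        assert_eq!(collapse_spaces("a  b   c"), "a b c");
    }

    // Property-based test: quickcheck generates many random strings and
    // checks that the output never contains two consecutive spaces.
    quickcheck! {
        fn never_contains_double_spaces(s: String) -> bool {
            !collapse_spaces(&s).contains("  ")
        }
    }
}
```

Running cargo test executes both the unit test and the randomized property test, while cargo clippy and cargo fmt cover the linting and formatting tools also named in the requirements.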

Source: Workable

Location
Colorado, USA


You may also like

Craigslist
CLASS "A" CDL INTERMODAL Willing to get HAZ MAT- company driver (Chicago)
Metro Intermodal - A MEDLOG Company. We are busy and getting busier, and we have immediate openings for SAFE, professional, and dependable Class A owner-operators and Class A drivers who know the importance of safety, communication, BEING ON TIME, and excellent customer service. In the current market, efficient owner-operators with intermodal experience and strong work ethics can earn between $1,500 - $2,500 after deductions. They get to go home every night and do not have to deal with forced dispatch. We try our best to pre-pull and terminate your load and empties as much as possible to minimize your time spent in the rail and container yards, allowing you to spend more time on the road.
* Home Every Night!
* We move 20's / 40's / 45's / Reefers and 53' Domestic, Regionally and Locally.
* 24/7 Pleasant/Professional Operations support staff; we are there for you if you need help!
* No Forced Dispatch!
* We offer Direct Deposit with only ONE week's holdback!
* Health insurance for you and the family.
At Metro Intermodal, we know our Drivers are the most critical piece. We ensure that our Drivers are well-supported and have a workplace that makes them feel comfortable and motivated! Honesty is the key to any relationship. You need to be able to count on us, and we need to rely on you. Simply put, we will always tell you the truth. It's why we have one of the highest retention rates in the trucking industry! Honesty is one of our guiding principles. Call John at 1-866-370-1617 or (773) 458-6075 for more information.
6512 W 65th St, Chicago, IL 60638, USA
$1,500-2,500/month
Workable
Data Engineer - Job ID: DE2025
Build the data foundation that powers patient-centric decisions.

Ascendis Pharma is a growth-stage biopharmaceutical company guided by the core values Patients · Science · Passion. To strengthen our cloud-native commercial and medical data platform, we are looking for a Data Engineer who combines solid data-pipeline craftsmanship with a collaborative, product-oriented mindset. You will join the same agile product team as our analysts, working in rapid product-discovery cycles to deliver trustworthy data that drives access to life-changing therapies.

Your four focus areas
1. Data Engineering – Design, build and maintain scalable ELT pipelines that ingest Patient Support Program, Specialty Pharmacy, Claims, CRM, Marketing, Medical and Financial data. Implement automated data-quality checks, monitoring and alerting.
2. Data Modelling – Develop canonical data models and dimensional marts in our upcoming Databricks Lakehouse to enable self-service analytics. Apply best-practice naming, documentation and version control.
3. Collaboration & Facilitation – Work side-by-side with analysts and product owners to translate business questions into robust technical solutions. Lead whiteboard sessions and spike explorations during product discovery.
4. DevOps & Continuous Improvement – Configure CI/CD pipelines for data code, automate testing and promote reusable patterns. Keep an eye on performance, cost and security, driving iterative enhancements.

Key responsibilities
- Engineer reliable, end-to-end data flows using Spark SQL and PySpark notebooks in Databricks (nice to have).
- Orchestrate and schedule batch & streaming jobs in Azure Data Factory or similar tools.
- Develop and maintain dbt models (nice to have) to standardise transformations and documentation.
- Implement data-cataloguing, lineage and governance practices that meet pharma-compliance requirements.
- Drive alignment with analysts to optimise queries, refine schemas and keep metrics consistent across dashboards, so stakeholders have a single source of truth for decision-making.
- Contribute to agile ceremonies (backlog refinement, sprint reviews, retros) and actively champion a culture of data craftsmanship.

Why Ascendis
- Mission that matters – Our pipelines accelerate insight generation that expands patient access to vital therapies.
- Modern stack – Azure, Databricks, dbt, GitHub Actions – with room to propose new tools.
- End-to-end ownership – Influence everything from ingestion to visualisation in a small, empowered team.
- Hybrid flexibility & growth – Work three days onsite, enjoy generous benefits and great snacks, and pursue clear technical or leadership paths.

Requirements
What you bring
- Experience – 3–7 years in Data Engineering or Analytics Engineering, ideally in pharma, biotech or another regulated, data-rich environment.
- Core toolkit – Advanced SQL and Python. Solid experience in dimensional modelling.
- Nice-to-have – Hands-on with Databricks notebooks/Lakehouse and dbt for transformation & testing.
- Product mindset – Comfortable iterating fast, demoing early and measuring impact.
- Communication & teamwork – Able to explain trade-offs, write clear documentation and collaborate closely with analysts, product managers and other business stakeholders.
- Quality focus – Passion for clean, maintainable code, automated testing and robust data governance.
Benefits
- 401(k) plan with company match
- Medical, dental, and vision plans
- Company-offered Life and Accidental Death & Dismemberment (AD&D) insurance
- Company-provided short- and long-term disability benefits
- Unique offerings of Pet Insurance and Legal Insurance
- Employee Assistance Program
- Employee Discounts
- Professional Development
- Health Savings Account (HSA)
- Flexible Spending Accounts
- Various incentive compensation plans
- Accident, Critical Illness, and Hospital Indemnity Insurance
- Mental Health resources
- Paid leave benefits for new parents

A note to recruiters: We do not allow external search party solicitation. Presentation of candidates without written permission from the Ascendis Pharma Inc Human Resources team (specifically from a Talent Acquisition Partner or Human Resources Director) is not allowed. If this occurs, your ownership of these candidates will not be acknowledged.
Princeton, NJ, USA
Negotiable Salary
Workable
Data Engineer III
Position: Data Engineer III
Location: Seattle, WA 98121
Duration: 17 Months
Job Type: Contract
Work Type: Onsite

Job Description:
The Infrastructure Automation team is responsible for delivering the software that powers our infrastructure.

Responsibilities
- As a Data Engineer you will be working in one of the world's largest and most complex data warehouse environments.
- You will develop and support the analytic technologies that give our customers timely, flexible and structured access to their data.
- Design, build, and maintain scalable, reliable, and reusable data pipelines and infrastructure that support analytics, reporting, and strategic decision-making.
- You will be responsible for designing and implementing a platform using third-party and in-house reporting tools, modeling metadata, and building reports and dashboards.
- You will work with business customers to understand business requirements and implement solutions that support analytical and reporting needs.
- Explore source systems, data flows, and business processes to uncover opportunities, ensure data accuracy and completeness, and drive improvements in data quality and usability.

Required Skills & Experience
- 7+ years of related experience.
- Experience with data modeling, warehousing, and building ETL pipelines.
- Strong experience with SQL.
- Experience in at least one modern scripting or programming language, such as Python or Java.
- Strong analytical skills, with the ability to translate business requirements into technical data solutions.
- Excellent communication skills, with the ability to collaborate across technical and business teams.
- A good candidate can partner with business owners directly to understand their requirements and provide data that helps them observe patterns and spot anomalies.

Preferred
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions.
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases).
- 3+ years of experience preparing data for direct use in visualization tools like Tableau.

KPI: Meeting requirements, how solutions are actioned, etc.

Leadership Principles
- Deliver Results
- Dive Deep

Top 3 must-have hard skills
- Strong experience with SQL
- Experience in at least one modern scripting or programming language, such as Python or Java
- Experience with data modeling, warehousing, and building ETL pipelines
Seattle, WA, USA
Negotiable Salary