
Platform Engineer AI Tool & Integration

Negotiable Salary

Axiom Software Solutions Limited

Coral Gables, FL, USA


Description

Position Overview:
The Data Analytics & AI department at RCG is seeking a highly skilled and experienced Software & Platform Engineer to join our team. This pivotal role requires a strong technical background in AI tooling, data platform architecture, cloud computing, and big data technologies. The successful candidate will be responsible for all tooling and integration with GenAI LLMs, maintaining our Azure platform with OpenAI, and leveraging Databricks Mosaic AI. This role will be instrumental in driving innovation, ensuring seamless integration, and optimizing our AI and data platforms to meet the evolving needs of the business.

Key Responsibilities:
• Design, develop, and maintain tooling and integration for GenAI LLMs and other AI models.
• Manage and optimize our Azure platform, ensuring seamless integration with OpenAI and Databricks Mosaic AI.
• Collaborate with cross-functional teams to identify and implement innovative AI solutions that enhance our platform capabilities.
• Stay up to date with the latest advancements in AI and machine learning technologies to drive continuous improvement and innovation.
• Develop and implement best practices for AI model deployment and scaling.
• Design and execute integration strategies for incorporating large language models (LLMs) and Copilot technologies with existing business platforms.
• Assess and recommend suitable LLM and Copilot technologies that align with business needs and technical requirements.
• Conduct feasibility studies and proofs of concept to validate the integration of new tools and technologies.
• Keep abreast of the latest advancements in LLM, Copilot, and related technologies to identify opportunities for further innovation and improvement.
• Understand and leverage Microsoft Copilot and Copilot Studio for enhanced productivity and collaboration within the development team.
• Integrate Microsoft Copilot tools into existing workflows and ensure seamless integration with other systems and applications used by the team.
• Work closely with product managers, data scientists, and other stakeholders to gather requirements and ensure successful integration of LLM and Copilot technologies.

Requirements:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Proven experience in software/system/platform engineering, with a focus on AI tooling and integration.
• Strong expertise in working with Azure, including managing and integrating AI services such as OpenAI and Databricks Mosaic AI.
• Proficiency in programming languages such as Python, Java, or C++.
• Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch).
• Solid understanding of software development methodologies, including Agile and DevOps practices and tools for continuous integration and deployment.
• Understanding of data security best practices and compliance requirements in software development and integration.
• Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment.
• Strong communication and collaboration skills.

Preferred Qualifications:
• Strong proficiency in one or more programming languages such as Python, Java, C#, or JavaScript.
• Experience with large language models (LLMs) and familiarity with tools like OpenAI GPT, Google BERT, or similar.
• Hands-on experience with Databricks for data engineering, data science, or machine learning workflows.
• Proficiency in using Databricks for building and managing data pipelines, ETL processes, and real-time data processing.
• Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch).
• Hands-on experience with Microsoft Copilot and Copilot Studio.
• Proficiency in working with APIs, microservices architecture, and web services (REST/SOAP).
• Familiarity with cloud platforms such as AWS, Azure, or Google Cloud, and their integration services.
• Knowledge of database systems, both SQL and NoSQL (e.g., MySQL, MongoDB).
• Knowledge of natural language processing (NLP) and large language models (LLMs).
• Previous experience in a similar role within a technology-driven organization.
• Certifications in Azure, AI, or related areas.
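A core responsibility above is building tooling around GenAI LLM calls so they survive the transient failures (rate limits, timeouts) that hosted services like Azure OpenAI routinely return. A minimal sketch of that pattern follows; the `complete` callable and `TransientLLMError` are hypothetical stand-ins for a real provider client and its error type, not part of any actual SDK:

```python
import time

class TransientLLMError(Exception):
    """Stand-in for a provider's rate-limit / timeout error."""

def call_with_retry(complete, prompt, retries=3, backoff=0.5):
    """Call an LLM completion function, retrying transient failures
    with exponential backoff. `complete` is any callable taking a
    prompt string and returning a completion string."""
    for attempt in range(retries):
        try:
            return complete(prompt)
        except TransientLLMError:
            if attempt == retries - 1:
                raise  # out of retries: surface the error
            time.sleep(backoff * (2 ** attempt))

# Demo with a stub client that fails once, then succeeds.
calls = {"n": 0}
def flaky_complete(prompt):
    calls["n"] += 1
    if calls["n"] == 1:
        raise TransientLLMError
    return f"echo: {prompt}"

print(call_with_retry(flaky_complete, "hello", backoff=0.01))  # → echo: hello
```

Because the wrapper takes any callable, the same tooling can front different model providers without changing call sites — one reason this kind of thin integration layer is common around LLM platforms.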

Source: Workable

Location
Coral Gables, FL, USA


You may also like

Workable
R&D AI Software Engineer / End-to-End Machine Learning Engineer / RAG and LLM
About Pathway
Pathway is building LiveAI™ systems that think and learn in real time, as humans do. Our mission is to deeply understand how and why LLMs work, fundamentally changing the way models think. The team is made up of AI luminaries. Pathway's CTO, Jan Chorowski, co-authored papers with Geoff Hinton and Yoshua Bengio and was one of the first people to apply attention to speech. Our CSO, Adrian Kosowski, received his PhD in Theoretical Computer Science at the age of 20 and has made significant contributions across numerous scientific fields, including AI and quantum information. He also served as a professor and a coach for competitive programmers at École Polytechnique. The team also includes many of the world's top scientists and competitive programmers, alongside seasoned Silicon Valley executives. Pathway has strong investor backing: to date, we have raised over $15M; our latest reported round was our seed. Our offices are located in Palo Alto, CA, as well as Paris, France, and Wroclaw, Poland.

The Opportunity
We are currently searching for two ML/AI Engineers with a solid software engineering backbone who are able to prototype, evaluate, improve, and productionize end-to-end machine learning projects with enterprise data. For representative examples of the type of projects you would be expected to create or deliver, see our AI pipeline templates. If you think it would be fun to create a hybrid vector/graph index that beats the state of the art on a RAG benchmark, to deliver a working AI pipeline to a client in a critical industry, or to pre-process datasets in a way that boosts LLM accuracy in inference and training — this is the job for you!
You Will
- Help design experimental end-to-end ML/AI pipelines.
- Contribute to addressing new use cases, beyond the state of the art.
- Improve and adapt AI pipelines for production, working directly with client data (often live data streams).
- Invent ways to pre-process data sources and perform tweaks (reranking, model parameter configuration, ...) for optimal performance of AI pipelines.
- Design benchmark tasks and perform experiments.
- Build unit tests and implement model monitoring.
- Contribute high-quality production code to our developer frameworks, used by thousands of developers.
- Help to pre-process data sets for LLM training.

The results of your work will play a crucial role in the success of both our developer offering and client delivery.

Requirements
Cover letter: It's always a pleasure to say hi! If you could leave us 2-3 lines, we'd really appreciate it.

You Are
- A graduate of a 4+-year university degree in Computer Science, with A-grades in both foundational courses (Algorithms, Computational Complexity, Graph Theory, ...) and Machine Learning courses.
- Passionate about delivering high-quality code and solutions that work.
- Good with data and engineering innovation in practice: you know how to put things together so that they don't blow up.
- Experienced at hands-on Machine Learning / Data Science work in the Python stack (notebooks, etc.).
- Experienced with model monitoring, git, build systems, and CI/CD.
- Respectful of others.
- Curious about new technology: an avid reader of HN, arXiv feeds, AI digests, ...
- Fluent in English.

Bonus Points
- A successful track record in algorithms (ICPC / IOI), data science contests (Kaggle), or a HuggingFace leaderboard.
- A code portfolio you can show.
- A PhD in Computer Science.
- A paper authored at a major Machine Learning conference.
- You like playing and tinkering with new tools in the LLM stack.
- You are already part of the Pathway community, or have been recognized for your work in one of our bootcamps.
Why You Should Apply
- Join an intellectually stimulating work environment.
- Be a technology innovator who makes a difference: your code gets delivered to a community of thousands of developers, and to clients processing billions of records of data.
- Work in one of the hottest AI startups — one that believes in impactful research and foundational changes.

Benefits
- Type of contract: full-time, permanent.
- Preferred joining date: January 2025. The positions are open until filled; please apply immediately.
- Compensation: competitive base salary (80th to 99th percentile) based on profile and location, plus an Employee Stock Option Plan and possible bonuses when working on client projects. The stated lower band of EUR 72k / USD 75k for the base salary concerns senior candidates; mid-senior or mid-level candidates may be considered with adapted salary bands.
- Location: remote work, with the possibility to work or meet with other team members in one of our offices: Palo Alto, CA; Paris, France; or Wroclaw, Poland. As a general rule, permanent residence in the EU, UK, US, or Canada will be required. (Note for candidates based in India: we are proud to be part of the current Inter-IIT as well as a partner of ICPC India. Top IIT/IIIT graduates are more than welcome to apply regardless of current location.)

If you meet our broad requirements but are missing some experience, don't hesitate to reach out to us.
Palo Alto, CA, USA
Negotiable Salary
Craigslist
Software Development Training Program 🧑‍💻
We are inviting motivated individuals ready to grow their careers in technology. If you want to gain practical coding skills, complete professional projects, and prepare for developer jobs, this training path is for you. This is a remote program available part-time or full-time. With nearly 900 hours of guided learning and project tasks, you'll use modern languages and tools to create a strong résumé and portfolio, preparing you for employment as a Software Developer.

🖥️ Technology & Programming Fundamentals
- Learn how computers, browsers, networks, and the internet operate
- Study algorithms, data structures, number systems, and security essentials
- Practice Python scripting, command-line basics, and flowchart logic

💻 Web & Front-End Development
- Build and style websites with HTML5, CSS3, and Bootstrap
- Create interactive applications with JavaScript, jQuery, and React.js
- Practice responsive layouts and design best practices

🗄️ Back-End & Database Development
- Design and query databases with SQL and SQL Server
- Perform CRUD operations and study relational models
- Develop back-end systems in Python (Django) and C# (.NET Core)

🧑‍💻 Programming Languages & Tools
- Work with seven major programming languages, including C#, Python, JavaScript, HTML, CSS, and SQL
- Use developer tools like Git, GitHub, Visual Studio, and Team Foundation Server
- Practice collaborative workflows and version control

🧪 Capstone Projects
- Complete two in-depth projects (Python + C#) applying your skills
- Build a portfolio using Agile, Scrum, and DevOps practices
- Gain debugging, problem-solving, and teamwork experience

🧰 Career Preparation
- Learn résumé and cover letter writing
- Practice technical interviews and whiteboard coding
- Enter the job market prepared for entry-level developer positions

🚀 No previous training required. Remote participation encouraged. Start your career path in software today.
👉 Apply here: https://softwaredevpros.online/
4J7J+VV Tampa, FL, USA
$30/hour
Workable
Informatica Engineer - Fort Worth, TX (locals only)
Job Title: Informatica Engineer
Location: Fort Worth, TX (locals only)
Mode of Work: Hybrid
Years of Experience: 6+
Visa: Any

Description:
This role is responsible for designing, coding, and modifying ETL processes using Informatica, based on mappings from source to target applications. Experience with migration to the cloud is required.

Job Duties
- Participates in ETL design of new or changing mappings and workflows with the team and prepares technical specifications.
- Creates ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepares corresponding documentation.
- Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Performs source system analysis as required.
- Works with DBAs and data architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
- Implements versioning of the ETL repository and supporting code as necessary.
- Develops stored procedures, database triggers, and SQL queries where needed.
- Implements best practices and tunes SQL code for optimization.
- Loads data from SF PowerExchange to a MongoDB database using Informatica.
- Works with XML, XML parsers, Java, and HTTP transformations within Informatica.
- Works with Informatica Data Quality (Analyst and Developer).
- Primary skill is Informatica PowerCenter, with 5-7 years of experience.

Skills and Qualifications
Informatica PowerCenter, Informatica PowerExchange, MongoDB, IMS, relational databases

Education/Experience
• Bachelor's degree in computer science or equivalent training required
• 5+ years' experience required
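The "type-2 dimensions" named in the duties above are slowly changing dimensions that preserve history: when an attribute changes, the current row is closed out (its end date set) and a new current row is inserted, rather than overwriting in place. A minimal sketch of that pattern using Python's built-in sqlite3 — table and column names here are illustrative, not from the posting, and real Informatica mappings would express the same logic declaratively:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT      -- NULL means 'current row'
    )
""")

def upsert_type2(conn, customer_id, city, as_of):
    """Apply a type-2 change: close the current row if the
    tracked attribute changed, then insert a new current row."""
    cur = conn.execute(
        "SELECT city FROM dim_customer "
        "WHERE customer_id = ? AND valid_to IS NULL", (customer_id,))
    row = cur.fetchone()
    if row and row[0] == city:
        return  # no change: keep the existing current row
    if row:
        # Close out the previously current row.
        conn.execute(
            "UPDATE dim_customer SET valid_to = ? "
            "WHERE customer_id = ? AND valid_to IS NULL",
            (as_of, customer_id))
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL)",
        (customer_id, city, as_of))

upsert_type2(conn, 1, "Dallas", "2024-01-01")
upsert_type2(conn, 1, "Fort Worth", "2024-06-01")
# Both rows survive: the Dallas row is closed, Fort Worth is current.
print(conn.execute("SELECT COUNT(*) FROM dim_customer").fetchone()[0])  # → 2
```

Keeping history this way is what lets warehouse fact tables join to the dimension row that was valid at the time of each transaction.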
Fort Worth, TX, USA
Negotiable Salary
© 2025 Servanan International Pte. Ltd.